Patents by Inventor Forest Jay Handford

Forest Jay Handford has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11067405
    Abstract: Image-based analysis techniques are used for cognitive state vehicle navigation, including an autonomous or a semi-autonomous vehicle. Images including facial data of a vehicle occupant are obtained using an in-vehicle imaging device. The vehicle occupant can be an operator of or a passenger within the vehicle. A first computing device is used to analyze the images to determine occupant cognitive state data. The analysis can occur at various times along a vehicle travel route. The cognitive state data is mapped to location data along the vehicle travel route. Information about the vehicle travel route is updated based on the cognitive state data. The updated information is provided for vehicle control. The updated information is rendered on a second computing device. The updated information includes road ratings for segments of the vehicle travel route. The updated information includes an emotion metric for vehicle travel route segments.
    Type: Grant
    Filed: January 30, 2019
    Date of Patent: July 20, 2021
    Assignee: Affectiva, Inc.
    Inventors: Maha Amr Mohamed Fouad, Chilton Lyons Cabot, Rana el Kaliouby, Forest Jay Handford
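
The abstract above describes mapping cognitive state data to locations along a travel route and aggregating it into per-segment emotion metrics. A minimal sketch of that aggregation step, assuming hypothetical names (`CognitiveSample`, `emotion_metric_by_segment`) and a simple fixed-length segmenting scheme not taken from the patent:

```python
# Illustrative sketch only -- not the patented implementation. It shows one
# way cognitive state samples collected along a route could be bucketed into
# route segments and averaged into a per-segment emotion metric.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class CognitiveSample:
    mile_marker: float   # location along the vehicle travel route
    valence: float       # -1.0 (negative) .. 1.0 (positive), from image analysis

def emotion_metric_by_segment(samples, segment_length=1.0):
    """Bucket samples into fixed-length route segments and average valence."""
    buckets = defaultdict(list)
    for s in samples:
        buckets[int(s.mile_marker // segment_length)].append(s.valence)
    return {seg: sum(v) / len(v) for seg, v in buckets.items()}

samples = [
    CognitiveSample(0.2, 0.5),
    CognitiveSample(0.8, 0.25),
    CognitiveSample(1.4, -0.5),
]
print(emotion_metric_by_segment(samples))  # → {0: 0.375, 1: -0.5}
```

A real system would derive valence from in-vehicle facial image analysis and attach GPS coordinates rather than mile markers; the bucketing and averaging shown here stand in for that mapping step.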
  • Patent number: 11056225
    Abstract: Analytics are used for live streaming based on image analysis within a shared digital environment. A group of images is obtained from a group of participants involved in an interactive digital environment. The interactive digital environment can be a shared digital environment. The interactive digital environment can be a gaming environment. Emotional content within the group of images is analyzed for a set of participants within the group of participants. Results of the analyzing of the emotional content within the group of images are provided to a second set of participants within the group of participants. The analyzing emotional content includes identifying an image of an individual, identifying a face of the individual, determining facial regions, and performing content evaluation based on applying image classifiers.
    Type: Grant
    Filed: February 28, 2017
    Date of Patent: July 6, 2021
    Assignee: Affectiva, Inc.
    Inventors: Rana el Kaliouby, James Henry Deal, Jr., Forest Jay Handford, Panu James Turcot, Gabriele Zijderveld
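
The pipeline in the abstract above (identify an image of an individual, find the face, classify its emotional content, and provide results to a second set of participants) can be sketched roughly as follows. All functions here are hypothetical stand-ins, not Affectiva's classifiers:

```python
# Hypothetical sketch of the described analysis pipeline: detect a face in
# each participant's image, evaluate emotional content with an image
# classifier, and deliver the results to a second set of participants
# (viewers). The detector and classifier are toy stubs.
def find_face(image):
    # Stand-in for a real face detector: returns a face region or None.
    return image.get("face")

def classify_emotion(face_region):
    # Stand-in emotion classifier applied to the detected facial region.
    return {"joy": 0.9} if face_region == "smiling" else {"joy": 0.1}

def analyze_and_share(images_by_participant, viewers):
    results = {}
    for participant, image in images_by_participant.items():
        face = find_face(image)
        if face is not None:
            results[participant] = classify_emotion(face)
    # "Provide results to a second set of participants": fan out to viewers.
    return {viewer: results for viewer in viewers}

feed = {"alice": {"face": "smiling"}, "bob": {"face": "neutral"}}
shared = analyze_and_share(feed, viewers=["carol"])
print(shared["carol"]["alice"])  # → {'joy': 0.9}
```

In a live-streaming setting the feed would be video frames and the viewer set would be the streaming audience; the structure of the loop is what the abstract's pipeline implies.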
  • Patent number: 10843078
    Abstract: Mental state data is collected as a person interacts with a game played on a machine. The mental state data includes facial data, where the facial data includes facial regions or facial landmarks. The mental state data can include physiological data and actigraphy data. The mental state data is analyzed to produce mental state information. Mental state data and/or mental state information can be shared across a social network or a gaming community. The affect of the person interacting with the game can be represented to the social network or gaming community in the form of an avatar. Recommendations based on the affect resulting from the analysis can be made to the person interacting with the game. Mental states are analyzed locally or via a web service. Based on the results of the analysis, the game with which the person is interacting is modified.
    Type: Grant
    Filed: February 1, 2016
    Date of Patent: November 24, 2020
    Assignee: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Panu James Turcot, Forest Jay Handford, Daniel Bender, Rosalind Wright Picard, Richard Scott Sadowsky, Oliver Orion Wilder-Smith
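
The abstract above ends with the game being modified based on the results of the mental state analysis. A minimal control-loop sketch of that idea, with an illustrative `Game` class and thresholds that are assumptions, not taken from the patent:

```python
# Minimal sketch, assuming a simple feedback loop: derive an engagement
# score from collected mental state data, then adjust the running game.
# The class, thresholds, and multipliers are all hypothetical.
class Game:
    def __init__(self):
        self.difficulty = 1.0

    def adjust(self, engagement):
        # If the player appears bored, raise difficulty; if overwhelmed, lower it.
        if engagement < 0.3:
            self.difficulty *= 1.5
        elif engagement > 0.8:
            self.difficulty *= 0.75

def engagement_from_mental_state(facial_samples):
    # Stand-in analysis: average a per-frame engagement score derived
    # from facial data (and, in a fuller system, physiological data).
    return sum(facial_samples) / len(facial_samples)

game = Game()
game.adjust(engagement_from_mental_state([0.1, 0.2, 0.3]))  # bored player
print(game.difficulty)  # → 1.5
```

The same engagement score could also drive the avatar representation of the player's affect that the abstract describes sharing with a social network or gaming community.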
  • Publication number: 20190162549
    Abstract: Image-based analysis techniques are used for cognitive state vehicle navigation, including an autonomous or a semi-autonomous vehicle. Images including facial data of a vehicle occupant are obtained using an in-vehicle imaging device. The vehicle occupant can be an operator of or a passenger within the vehicle. A first computing device is used to analyze the images to determine occupant cognitive state data. The analysis can occur at various times along a vehicle travel route. The cognitive state data is mapped to location data along the vehicle travel route. Information about the vehicle travel route is updated based on the cognitive state data. The updated information is provided for vehicle control. The updated information is rendered on a second computing device. The updated information includes road ratings for segments of the vehicle travel route. The updated information includes an emotion metric for vehicle travel route segments.
    Type: Application
    Filed: January 30, 2019
    Publication date: May 30, 2019
    Applicant: Affectiva, Inc.
    Inventors: Maha Amr Mohamed Fouad, Chilton Lyons Cabot, Rana el Kaliouby, Forest Jay Handford
  • Publication number: 20170171614
    Abstract: Analytics are used for live streaming based on image analysis within a shared digital environment. A group of images is obtained from a group of participants involved in an interactive digital environment. The interactive digital environment can be a shared digital environment. The interactive digital environment can be a gaming environment. Emotional content within the group of images is analyzed for a set of participants within the group of participants. Results of the analyzing of the emotional content within the group of images are provided to a second set of participants within the group of participants. The analyzing emotional content includes identifying an image of an individual, identifying a face of the individual, determining facial regions, and performing content evaluation based on applying image classifiers.
    Type: Application
    Filed: February 28, 2017
    Publication date: June 15, 2017
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, James Henry Deal, Jr., Forest Jay Handford, Panu James Turcot, Gabriele Zijderveld
  • Publication number: 20160144278
    Abstract: Mental state data is collected as a person interacts with a game played on a machine. The mental state data includes facial data, where the facial data includes facial regions or facial landmarks. The mental state data can include physiological data and actigraphy data. The mental state data is analyzed to produce mental state information. Mental state data and/or mental state information can be shared across a social network or a gaming community. The affect of the person interacting with the game can be represented to the social network or gaming community in the form of an avatar. Recommendations based on the affect resulting from the analysis can be made to the person interacting with the game. Mental states are analyzed locally or via a web service. Based on the results of the analysis, the game with which the person is interacting is modified.
    Type: Application
    Filed: February 1, 2016
    Publication date: May 26, 2016
    Inventors: Rana el Kaliouby, Panu James Turcot, Forest Jay Handford, Daniel Bender, Rosalind Wright Picard, Richard Scott Sadowsky, Oliver Orion Wilder-Smith