Patents by Inventor Rana el Kaliouby

Rana el Kaliouby has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20170330029
    Abstract: Disclosed embodiments provide for deep convolutional computing image analysis. The convolutional computing is accomplished using a multilayered analysis engine. The multilayered analysis engine includes a deep learning network using a convolutional neural network (CNN). The multilayered analysis engine is used to analyze multiple images in a supervised or unsupervised learning process. The multilayered analysis engine is provided with multiple images and is trained with those images. A subject image is then evaluated by the multilayered analysis engine by analyzing pixels within the subject image to identify a facial portion and identifying a facial expression based on the facial portion. Mental states are inferred using the deep convolutional multilayered analysis engine based on the facial expression.
    Type: Application
    Filed: August 1, 2017
    Publication date: November 16, 2017
    Inventors: Panu James Turcot, Rana el Kaliouby, Daniel McDuff
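The application above describes a multilayered CNN engine that is trained on many images and then applied to a subject image. As a rough sketch of that kind of pipeline (the layer sizes, 48x48 grayscale input, and seven-class expression set are assumptions, not the patented design), a minimal PyTorch version might look like this:

```python
# Minimal sketch of a CNN facial-expression classifier. Illustrative only:
# the architecture and class count are assumptions, not the patented engine.
import torch
import torch.nn as nn

class ExpressionCNN(nn.Module):
    def __init__(self, num_expressions: int = 7):
        super().__init__()
        # Stacked convolutional layers form the multilayered analysis engine.
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                              # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                              # 24x24 -> 12x12
        )
        self.classifier = nn.Linear(64 * 12 * 12, num_expressions)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of grayscale face crops, shape (N, 1, 48, 48)
        h = self.features(x)
        return self.classifier(h.flatten(start_dim=1))

model = ExpressionCNN()
faces = torch.randn(4, 1, 48, 48)     # stand-in for cropped face pixels
expression_logits = model(faces)      # one score per expression class
```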
  • Publication number: 20170238859
    Abstract: Mental state data useful for determining mental state information on an individual, such as video of an individual's face, is captured. Additional data that is helpful in determining the mental state information, such as contextual information, is also determined. Intermittent mental state data is interpolated. The data and additional data allow interpretation of individual mental state information. The additional data is tagged to the mental state data. At least some of the mental state data, along with the tagged data, is analyzed to produce further mental state information. A mood measurement is a result of the analysis.
    Type: Application
    Filed: May 8, 2017
    Publication date: August 24, 2017
    Applicant: Affectiva, Inc.
    Inventors: Richard Scott Sadowsky, Rana el Kaliouby
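The abstract above mentions interpolating intermittent mental state data. A minimal sketch of one way to fill such gaps, assuming linear interpolation over a hypothetical valence score (the patent does not specify the interpolation method):

```python
# Sketch: filling gaps in intermittently captured mental state data by
# linear interpolation. np.interp is one simple choice among many.
import numpy as np

sample_times = np.array([0.0, 2.0, 5.0, 9.0])   # seconds with valid captures
valence = np.array([0.1, 0.4, 0.35, 0.8])       # hypothetical valence scores

query_times = np.arange(0.0, 9.5, 0.5)          # uniform timeline to fill in
valence_filled = np.interp(query_times, sample_times, valence)
```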
  • Publication number: 20170238860
    Abstract: Video of one or more people is obtained and analyzed. Heart rate information is determined from the video. The heart rate information is used in mental state analysis. The heart rate information and resulting mental state analysis are correlated to stimuli, such as digital media, which is consumed or with which a person interacts. The heart rate information is used to infer mental states. The inferred mental states are used to output a mood measurement. The mental state analysis, based on the heart rate information, is used to optimize digital media or modify a digital game. Training is employed in the analysis. Machine learning is engaged to facilitate the training.
    Type: Application
    Filed: May 8, 2017
    Publication date: August 24, 2017
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Viprali Bhatkar, Niels Haering, Youssef Kashef, Ahmed Adel Osman
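Recovering heart rate from ordinary video is commonly done with remote photoplethysmography: subtle color changes in facial skin track the pulse. As a hedged sketch of that general technique (not necessarily the patented method; the precomputed `green_trace` input and the 0.7-4 Hz pulse band are assumptions):

```python
# Sketch: estimate heart rate from the mean green-channel intensity of a
# face region over time, taking the dominant frequency in the plausible
# pulse band (0.7-4 Hz, i.e. 42-240 bpm).
import numpy as np

def estimate_heart_rate(green_trace: np.ndarray, fps: float) -> float:
    signal = green_trace - green_trace.mean()          # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)             # plausible pulse band
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                            # beats per minute
```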
  • Patent number: 9723992
    Abstract: Mental state analysis is performed by obtaining video of an individual as the individual interacts with a computer, either by performing various operations or by consuming a media presentation. The video is analyzed to determine eye-blink information on the individual, such as eye-blink rate or eye-blink duration. A mental state of the individual is then inferred based on the eye-blink information. The blink-rate information and associated mental states can be used to modify an advertisement, a media presentation, or a digital game.
    Type: Grant
    Filed: March 15, 2014
    Date of Patent: August 8, 2017
    Assignee: Affectiva, Inc.
    Inventors: Thibaud Senechal, Rana el Kaliouby, Niels Haering
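One widely used heuristic for extracting blink rate and duration from face video, offered here as an illustrative stand-in rather than the patented method, is the eye aspect ratio (EAR) computed from per-frame eye landmarks:

```python
# Sketch: blink detection with the eye aspect ratio (EAR). The
# six-landmark eye layout and 0.2 threshold follow common practice and
# are assumptions, not taken from the patent.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    # eye: (6, 2) array of landmarks ordered around the eye contour
    v1 = np.linalg.norm(eye[1] - eye[5])    # vertical distances
    v2 = np.linalg.norm(eye[2] - eye[4])
    h = np.linalg.norm(eye[0] - eye[3])     # horizontal distance
    return (v1 + v2) / (2.0 * h)

def count_blinks(ear_per_frame, threshold=0.2):
    blinks, closed = 0, False
    for ear in ear_per_frame:
        if ear < threshold and not closed:
            blinks, closed = blinks + 1, True   # eye just closed
        elif ear >= threshold:
            closed = False                      # eye reopened
    return blinks
```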
  • Publication number: 20170171614
    Abstract: Analytics are used for live streaming based on image analysis within a shared digital environment. A group of images is obtained from a group of participants involved in an interactive digital environment. The interactive digital environment can be a shared digital environment. The interactive digital environment can be a gaming environment. Emotional content within the group of images is analyzed for a set of participants within the group of participants. Results of the analyzing of the emotional content within the group of images are provided to a second set of participants within the group of participants. Analyzing the emotional content includes identifying an image of an individual, identifying a face of the individual, determining facial regions, and performing content evaluation based on applying image classifiers.
    Type: Application
    Filed: February 28, 2017
    Publication date: June 15, 2017
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, James Henry Deal, Jr., Forest Jay Handford, Panu James Turcot, Gabriele Zijderveld
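A minimal sketch of the aggregation step described above: emotion scores analyzed for one set of participants are summarized and routed to a second set. All names and score fields here are invented for illustration:

```python
# Sketch: aggregate per-participant emotion scores in a shared digital
# environment and deliver the summary to a second set of participants.
from statistics import mean

frame_scores = {
    "player_1": {"joy": 0.8, "surprise": 0.1},
    "player_2": {"joy": 0.3, "surprise": 0.6},
    "spectator_1": {"joy": 0.5, "surprise": 0.2},
}
analyzed = {"player_1", "player_2"}    # first set: participants analyzed
viewers = {"spectator_1"}              # second set: receives the results

summary = {
    emotion: mean(frame_scores[p][emotion] for p in analyzed)
    for emotion in ("joy", "surprise")
}
for viewer in viewers:
    print(f"to {viewer}: group emotion summary {summary}")
```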
  • Patent number: 9646046
    Abstract: Mental state data useful for determining mental state information on an individual, such as video of an individual's face, is captured. Additional data that is helpful in determining the mental state information, such as contextual information, is also determined. The data and additional data allow interpretation of individual mental state information. The additional data is tagged to the mental state data and at least some of the mental state data, along with the tagged data, can be sent to a web service where it is used to produce further mental state information.
    Type: Grant
    Filed: March 15, 2014
    Date of Patent: May 9, 2017
    Assignee: Affectiva, Inc.
    Inventors: Richard Scott Sadowsky, Rana el Kaliouby
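The tagging-and-upload flow in the patent above might look like the following sketch, where the endpoint URL and payload fields are hypothetical stand-ins:

```python
# Sketch: tag contextual data onto captured mental state data and post it
# to a web service for further analysis. Endpoint and fields are invented.
import json
import urllib.request

payload = {
    "mental_state_data": {"frame_id": 1042, "smile": 0.7},
    "tags": {"context": "watching_ad", "device": "laptop", "time_s": 1.5},
}
req = urllib.request.Request(
    "https://example.com/api/analyze",   # hypothetical endpoint
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# response = urllib.request.urlopen(req)  # enable against a real endpoint
```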
  • Patent number: 9642536
    Abstract: Video of one or more people is obtained and analyzed. Heart rate information is determined from the video and the heart rate information is used in mental state analysis. The heart rate information and resulting mental state analysis are correlated to stimuli, such as digital media, which is consumed or with which a person interacts. The heart rate information is used to infer mental states. The mental state analysis, based on the heart rate information, can be used to optimize digital media or modify a digital game.
    Type: Grant
    Filed: March 15, 2014
    Date of Patent: May 9, 2017
    Assignee: Affectiva, Inc.
    Inventors: Youssef Kashef, Rana el Kaliouby, Ahmed Adel Osman, Niels Haering, Viprali Bhatkar
  • Publication number: 20170109571
    Abstract: Images are analyzed using sub-sectional component evaluation in order to augment classifier usage. An image of an individual is obtained. The face of the individual is identified, and regions within the face are determined. The individual is evaluated to be within a sub-sectional component of a population based on a demographic or based on an activity. An evaluation of content of the face is performed based on the individual being within a sub-sectional component of a population. The sub-sectional component of a population is used for disambiguating among content types for the content of the face. A Bayesian framework that includes a conditional probability is used to perform the evaluation of the content of the face, and the evaluation is further based on a prior event that occurred.
    Type: Application
    Filed: December 30, 2016
    Publication date: April 20, 2017
    Applicant: Affectiva, Inc.
    Inventors: Daniel McDuff, Rana el Kaliouby
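The Bayesian disambiguation described above can be illustrated with a two-hypothesis update in which the demographic sub-population supplies the prior. All probabilities below are invented for the example:

```python
# Sketch: Bayes' rule disambiguating two content types for an observed
# smile, conditioned on a sub-population prior. Numbers are illustrative.
priors = {"amusement": 0.30, "politeness": 0.70}      # P(type | sub-population)
likelihood = {"amusement": 0.85, "politeness": 0.40}  # P(smile | type)

evidence = sum(priors[t] * likelihood[t] for t in priors)
posterior = {t: priors[t] * likelihood[t] / evidence for t in priors}
print(posterior)   # P(type | smile, sub-population)
```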
  • Publication number: 20170105668
    Abstract: Image analysis is performed on collected data from a person who interacts with a rendering such as a website or video. The images are collected through video capture. Physiological data is captured from the images. Classifiers are used for analyzing images. Information is uploaded to a server and compared against a plurality of mental state event temporal signatures. Aggregated mental state information from other people who interact with the rendering is received, including video facial data analysis of the other people. The received information is displayed along with the rendering through a visual representation such as an avatar.
    Type: Application
    Filed: December 29, 2016
    Publication date: April 20, 2017
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Rosalind Wright Picard, Richard Scott Sadowsky
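Comparing uploaded data against stored mental state event temporal signatures, as the entry above describes, could be done with normalized cross-correlation; the signature shapes below are made up for illustration:

```python
# Sketch: match an expression-intensity trace against stored temporal
# signatures via normalized cross-correlation. Signatures are invented.
import numpy as np

def match_score(trace: np.ndarray, signature: np.ndarray) -> float:
    t = (trace - trace.mean()) / (trace.std() + 1e-9)
    s = (signature - signature.mean()) / (signature.std() + 1e-9)
    return float(np.correlate(t, s, mode="valid").max() / len(s))

signatures = {
    "surprise_spike": np.array([0.0, 0.2, 1.0, 0.3, 0.0]),
    "slow_smile": np.array([0.0, 0.1, 0.3, 0.6, 0.9]),
}
trace = np.array([0.0, 0.1, 0.2, 1.0, 0.4, 0.1, 0.0, 0.0])
best = max(signatures, key=lambda k: match_score(trace, signatures[k]))
```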
  • Publication number: 20170098122
    Abstract: Image content is analyzed in order to present an associated representation expression. Images of one or more individuals are obtained, and one or more processors are used to identify the faces of the one or more individuals in the images. Facial features are extracted from the identified faces and facial landmark detection is performed. Classifiers are used to map the facial landmarks to various emotional content. The identified facial landmarks are translated into a representative icon, where the translation is based on classifiers. A set of emoji can be imported and the representative icon is selected from the set of emoji. The emoji selection is based on emotion content analysis of the face. The selected emoji can be static, animated, or cartoon representations of emotion. The individuals can share the selected emoji through insertion into email, texts, and social sharing websites.
    Type: Application
    Filed: December 9, 2016
    Publication date: April 6, 2017
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, May Amr Fouad, Abdelrahman Mahmoud, Seyedmohammad Mavadati, Daniel McDuff
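The emoji selection step above reduces, at its simplest, to picking an icon for the strongest detected emotion. A minimal sketch, with an assumed emotion-to-emoji mapping:

```python
# Sketch: translate classifier emotion scores for a face into a
# representative emoji. The mapping table is an assumption.
emoji_map = {"joy": "😀", "sadness": "😢", "surprise": "😮", "anger": "😠"}

def select_emoji(emotion_scores: dict) -> str:
    dominant = max(emotion_scores, key=emotion_scores.get)
    return emoji_map.get(dominant, "🙂")

print(select_emoji({"joy": 0.7, "sadness": 0.1, "surprise": 0.15, "anger": 0.05}))
```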
  • Publication number: 20170095192
    Abstract: Analysis of mental states is provided using web servers to enable data analysis. Data is captured for an individual where the data includes facial information and physiological information. Data that was captured for the individual is compared against a plurality of mental state event temporal signatures. Analysis is performed on a web service and the results of the analysis are received. The mental states of other people are correlated to the mental state for the individual. Other sources of information are aggregated, where the information is used to analyze the mental state of the individual. Analysis of the mental state of the individual or group of individuals is rendered for display.
    Type: Application
    Filed: December 16, 2016
    Publication date: April 6, 2017
    Applicant: Affectiva, Inc.
    Inventors: Richard Scott Sadowsky, Rana el Kaliouby, Rosalind Wright Picard, Oliver Orion Wilder-Smith, Panu James Turcot, Zhihong Zeng
  • Publication number: 20170068847
    Abstract: Analysis of mental state data is provided to enable video recommendations via affect. Analysis and recommendations are made for socially shared live-stream video. Video response is evaluated based on viewing and sampling various videos. Data is captured for viewers of a video, where the data includes facial information and/or physiological data. Facial and physiological information is gathered for a group of viewers. In some embodiments, demographic information is collected and used as a criterion for visualization of affect responses to videos. In some embodiments, data captured from an individual viewer or group of viewers is used to rank videos.
    Type: Application
    Filed: November 21, 2016
    Publication date: March 9, 2017
    Inventors: Rana el Kaliouby, Abdelrahman Mahmoud, Panu James Turcot
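Ranking videos from captured viewer data, as described above, might reduce to sorting by an aggregate affect score. The scoring scheme here (mean smile intensity across viewers) is an assumption:

```python
# Sketch: rank videos by viewers' aggregated affect responses.
from statistics import mean

affect_by_video = {
    "video_a": [0.8, 0.6, 0.9],   # per-viewer smile intensities (invented)
    "video_b": [0.2, 0.4, 0.3],
    "video_c": [0.5, 0.7, 0.6],
}
ranked = sorted(affect_by_video, key=lambda v: mean(affect_by_video[v]),
                reverse=True)
print(ranked)   # ['video_a', 'video_c', 'video_b']
```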
  • Publication number: 20170011258
    Abstract: Facial expressions are evaluated for control of robots. One or more images of a face are captured. The images are analyzed for mental state data. The images are analyzed to determine a facial expression of the face within an identified region of interest. Mental state information is generated. A context for the robot operation is determined. A context for the individual is determined. The actions of a robot are then controlled based on the facial expressions and the mental state information that was generated. Displays, color, sound, motion, and voice response for the robot are controlled based on the facial expressions of one or more people.
    Type: Application
    Filed: September 23, 2016
    Publication date: January 12, 2017
    Inventors: Boisy G. Pitre, Rana el Kaliouby, Abdelrahman Mahmoud, Seyedmohammad Mavadati, Daniel McDuff, Panu James Turcot, Gabriele Zijderveld
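Controlling a robot from a facial expression plus context, as described above, can be sketched as a lookup from (expression, context) to response parameters; the mapping table is purely illustrative:

```python
# Sketch: map an inferred facial expression and context to robot responses
# (color, voice, etc.). The table entries are invented for illustration.
RESPONSES = {
    ("smile", "greeting"): {"color": "green", "voice": "Hello there!"},
    ("frown", "greeting"): {"color": "blue", "voice": "Is something wrong?"},
    ("neutral", "idle"):   {"color": "white", "voice": None},
}

def control_robot(expression: str, context: str) -> dict:
    # Fall back to a neutral idle response for unseen combinations.
    return RESPONSES.get((expression, context), RESPONSES[("neutral", "idle")])

print(control_robot("smile", "greeting"))
```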
  • Publication number: 20160379505
    Abstract: Mental state event signatures are used to assess how members of a specific social group react to various stimuli such as video advertisements. The likelihood that a video will go viral is computed based on mental state event signatures. Automated facial expression analysis is utilized to determine an emotional response curve for viewers of a video. The emotional response curve is used to derive a virality probability index for the video. The virality probability index is an indicator of the propensity to go viral for a given video. The emotional response curves are processed according to various demographic criteria in order to account for cultural differences amongst various demographic groups and geographic regions.
    Type: Application
    Filed: September 12, 2016
    Publication date: December 29, 2016
    Inventors: Rana el Kaliouby, Evan Kodra, Daniel McDuff, Thomas James Vandal
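Deriving a virality probability index from an emotional response curve could, in the simplest case, blend curve statistics into a bounded score. The blend and weights below are invented stand-ins for the patented computation:

```python
# Sketch: a virality probability index from an emotional response curve,
# blending peak and average response and squashing to (0, 1). The weights
# are assumptions, not the patented formula.
import numpy as np

def virality_index(response_curve: np.ndarray) -> float:
    peak = float(response_curve.max())
    avg = float(response_curve.mean())   # crude stand-in for curve area
    raw = 0.6 * peak + 0.4 * avg         # weighted blend (assumed weights)
    return 1.0 / (1.0 + np.exp(-4.0 * (raw - 0.5)))   # map to (0, 1)

curve = np.array([0.1, 0.2, 0.5, 0.9, 0.7, 0.3])   # hypothetical curve
print(virality_index(curve))
```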
  • Patent number: 9503786
    Abstract: Analysis of mental states is provided to enable data analysis pertaining to video recommendation based on affect. Analysis and recommendation can be for socially shared live-stream video. Video response may be evaluated based on viewing and sampling various videos. Data is captured for viewers of a video where the data includes facial information and/or physiological data. Facial and physiological information may be gathered for a group of viewers. In some embodiments, demographic information is collected and used as a criterion for visualization of affect responses to videos. In some embodiments, data captured from an individual viewer or group of viewers is used to rank videos.
    Type: Grant
    Filed: August 10, 2015
    Date of Patent: November 22, 2016
    Assignee: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Abdelrahman Mahmoud, Panu James Turcot
  • Publication number: 20160191995
    Abstract: Facial evaluation is performed on one or more videos captured from an individual viewing a display. The images are evaluated to determine whether the display was viewed by the individual. The individual views a media presentation that includes incorporated tags and is rendered on the display. Based on the tags, video of the individual is captured and evaluated using a classifier. The evaluating includes determining whether the individual is in front of the screen, facing the screen, and gazing at the screen. An engagement score and emotional responses are determined for media and images provided on the display.
    Type: Application
    Filed: March 4, 2016
    Publication date: June 30, 2016
    Inventors: Rana el Kaliouby, Nicholas Langeveld, Daniel McDuff, Seyedmohammad Mavadati
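An engagement score built from the three checks named above (in front of the screen, facing it, gazing at it) might be a simple weighted sum; the weights are assumptions:

```python
# Sketch: combine presence, facing, and gaze checks into an engagement
# score in [0, 1]. The 0.2/0.3/0.5 weights are invented.
def engagement_score(in_front: bool, facing: bool, gazing: bool) -> float:
    return 0.2 * in_front + 0.3 * facing + 0.5 * gazing

print(engagement_score(True, True, False))   # 0.5
```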
  • Publication number: 20160144278
    Abstract: Mental state data is collected as a person interacts with a game played on a machine. The mental state data includes facial data, where the facial data includes facial regions or facial landmarks. The mental state data can include physiological data and actigraphy data. The mental state data is analyzed to produce mental state information. Mental state data and/or mental state information can be shared across a social network or a gaming community. The affect of the person interacting with the game can be represented to the social network or gaming community in the form of an avatar. Recommendations based on the affect resulting from the analysis can be made to the person interacting with the game. Mental states are analyzed locally or via web services. Based on the results of the analysis, the game with which the person is interacting is modified.
    Type: Application
    Filed: February 1, 2016
    Publication date: May 26, 2016
    Inventors: Rana el Kaliouby, Panu James Turcot, Forest Jay Handford, Daniel Bender, Rosalind Wright Picard, Richard Scott Sadowsky, Oliver Orion Wilder-Smith
  • Publication number: 20160081607
    Abstract: An individual can exhibit one or more mental states when reacting to a stimulus. A camera or other monitoring device can be used to collect, on an intermittent basis, mental state data including facial data. The mental state data can be interpolated between the intermittent collecting. The facial data can be obtained from a series of images of the individual where the images are captured intermittently. A second face can be identified, and the first face and the second face can be tracked.
    Type: Application
    Filed: December 7, 2015
    Publication date: March 24, 2016
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Daniel Bender, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky
  • Publication number: 20160078279
    Abstract: Image analysis for facial evaluation is performed using logic encoded in a semiconductor processor. The semiconductor chip analyzes video images that are captured using one or more cameras and evaluates the videos to identify one or more persons in the videos. When a person is identified, the semiconductor chip locates the face of the evaluated person in the video. Facial regions of interest are extracted and differences in the regions of interest in the face are identified. The semiconductor chip uses classifiers to map facial regions for emotional response content and evaluate the emotional response content to produce an emotion score. The classifiers provide gender, age, or ethnicity with an associated probability. Localization logic within the chip is used to localize a second face when one is evaluated in the video. The one or more faces are tracked, and identifiers for the faces are provided.
    Type: Application
    Filed: November 20, 2015
    Publication date: March 17, 2016
    Applicant: Affectiva, Inc.
    Inventors: Boisy G. Pitre, Rana el Kaliouby, Panu James Turcot
  • Patent number: 9247903
    Abstract: Mental state data is collected as a person interacts with a game machine. Analysis is performed on this data and mental state information and affect are shared across a social network. The affect of a person can be represented to the social network or gaming community in the form of an avatar. Recommendations can be based on the affect of the person. Mental states can be analyzed by web services, which may, in turn, modify the game.
    Type: Grant
    Filed: February 6, 2012
    Date of Patent: February 2, 2016
    Assignee: Affectiva, Inc.
    Inventors: Daniel Bender, Rana el Kaliouby, Rosalind Wright Picard, Richard Scott Sadowsky, Panu James Turcot, Oliver Orion Wilder-Smith