Patents by Inventor Rana el Kaliouby

Rana el Kaliouby has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20160004904
    Abstract: Concepts for facial tracking with classifiers are disclosed. One or more faces are detected and tracked in a series of video frames that include at least one face. Video is captured and partitioned into the series of frames. A first video frame is analyzed using classifiers trained to detect the presence of at least one face in the frame. The classifiers are used to initialize locations for a first set of facial landmarks for the first face. The locations of the facial landmarks are refined using localized information around the landmarks, and a rough bounding box that contains the facial landmarks is estimated. The future locations for the facial landmarks detected in the first video frame are estimated for a future video frame. The detection of the facial landmarks and estimation of future locations of the landmarks are insensitive to rotation, orientation, scaling, or mirroring of the face.
    Type: Application
    Filed: September 8, 2015
    Publication date: January 7, 2016
    Inventors: Thibaud Senechal, Rana el Kaliouby, Panu James Turcot
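    Two steps from the abstract above can be sketched in a few lines: estimating a rough bounding box around a set of landmarks, and predicting future landmark locations for the next frame. This is an illustrative sketch only, assuming a simple constant-velocity motion model; the function names and the model choice are not taken from the patent itself.

    ```python
    import numpy as np

    def bounding_box(landmarks):
        """Rough bounding box (x_min, y_min, x_max, y_max) that contains
        a set of (x, y) facial landmarks."""
        pts = np.asarray(landmarks, dtype=float)
        x_min, y_min = pts.min(axis=0)
        x_max, y_max = pts.max(axis=0)
        return x_min, y_min, x_max, y_max

    def predict_next(prev_landmarks, curr_landmarks):
        """Estimate landmark locations in a future frame by advancing each
        point by its displacement between the last two frames
        (a constant-velocity assumption)."""
        prev = np.asarray(prev_landmarks, dtype=float)
        curr = np.asarray(curr_landmarks, dtype=float)
        return curr + (curr - prev)
    ```

    In practice the predicted locations would seed the localized refinement in the next frame, so the classifiers need not re-detect the face from scratch.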
  • Patent number: 9204836
    Abstract: A user may react to an interaction by exhibiting a mental state. A camera or other monitoring device can be used to capture one or more manifestations of the user's mental state, such as facial expressions, electrodermal activity, or movements. However, there may be conditions where the monitoring device is not able to detect the manifestation continually. Thus, various capabilities and implementations are described where the mental state data is collected on an intermittent basis, analyzed, and an output rendered based on the analysis of the mental state data.
    Type: Grant
    Filed: October 26, 2013
    Date of Patent: December 8, 2015
    Assignee: Affectiva, Inc.
    Inventors: Daniel Bender, Rana el Kaliouby, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky
  • Publication number: 20150350730
    Abstract: Analysis of mental states is provided to enable data analysis pertaining to video recommendation based on affect. Analysis and recommendation can be for socially shared livestream video. Video response may be evaluated based on viewing and sampling various videos. Data is captured for viewers of a video where the data includes facial information and/or physiological data. Facial and physiological information may be gathered for a group of viewers. In some embodiments, demographics information is collected and used as a criterion for visualization of affect responses to videos. In some embodiments, data captured from an individual viewer or group of viewers is used to rank videos.
    Type: Application
    Filed: August 10, 2015
    Publication date: December 3, 2015
    Inventors: Rana el Kaliouby, Abdelrahman Mahmoud, Panu James Turcot
  • Publication number: 20150313530
    Abstract: Analysis of mental states is provided based on videos of a plurality of people experiencing various situations such as media presentations. Videos of the plurality of people are captured and analyzed using classifiers. Facial expressions of the people in the captured video are clustered based on set criteria. A unique signature for the situation to which the people are being exposed is then determined based on the expression clustering. In certain scenarios, the clustering is augmented by self-report data from the people. In embodiments, the expression clustering is based on a combination of multiple facial expressions.
    Type: Application
    Filed: July 10, 2015
    Publication date: November 5, 2015
    Inventors: Evan Kodra, Rana el Kaliouby, Thomas James Vandal
  • Patent number: 9106958
    Abstract: Analysis of mental states is provided to enable data analysis pertaining to video recommendation based on affect. Video response may be evaluated based on viewing and sampling various videos. Data is captured for viewers of a video where the data includes facial information and/or physiological data. Facial and physiological information may be gathered for a group of viewers. In some embodiments, demographics information is collected and used as a criterion for visualization of affect responses to videos. In some embodiments, data captured from an individual viewer or group of viewers is used to rank videos.
    Type: Grant
    Filed: February 27, 2012
    Date of Patent: August 11, 2015
    Assignee: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Richard Scott Sadowsky, Rosalind Wright Picard, Oliver Orion Wilder-Smith, May Bahgat
  • Publication number: 20150206000
    Abstract: Expression analysis is performed via a background process and provided to foreground applications that register for emotion services. The foreground services are provided notification when a particular, previously determined emotional state is detected. The emotional state can be identified using facial feature analysis and/or gesture analysis. Upon receiving the notification of the state from the background process, the foreground services perform an emotion response action. The emotion response action can include sending a reply message indicating that a desired emotional response has been detected, providing a reward, and/or generating an automatic like on a social media system.
    Type: Application
    Filed: March 30, 2015
    Publication date: July 23, 2015
    Inventors: Rana el Kaliouby, Timothy Peacock, Gregory Poulin
  • Publication number: 20150186912
    Abstract: Expression analysis is performed in response to a request for an expression. The expression is related to one or more mental states. The mental states include happiness, joy, satisfaction, and pleasure, among others. Images from one or more cameras capturing a user's attempt to provide the requested expression are received and analyzed. The analyzed images serve to gauge the response of the person to the request. Based on the response of the person to the request, the person can be rewarded for the effectiveness of his or her mental state expression. The intensity of the expression can also be used as a factor in determining the reward. The reward can include, but is not limited to, a coupon, digital coupon, currency, or virtual currency.
    Type: Application
    Filed: March 16, 2015
    Publication date: July 2, 2015
    Inventors: Rana el Kaliouby, Timothy Peacock, Gregory Poulin
  • Publication number: 20150142553
    Abstract: Mental state data is gathered from a plurality of people and analyzed in order to determine mental state information. Metrics are generated based on the mental state information gathered as the people view media presentations. Norms, defined as the quantitative measures of the mental states of a plurality of people as they view the media presentation, are determined based on the mental state information metrics. The norms can be determined based on various viewer criteria including country of residence, demographic group, or device type on which the media presentation is viewed. Responses to new media are then compared against norms to determine the effectiveness of the new media presentations.
    Type: Application
    Filed: January 15, 2015
    Publication date: May 21, 2015
    Inventors: Evan Kodra, Rana el Kaliouby, Timothy Peacock, Gregory Poulin
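    The norms described above are quantitative baselines against which responses to new media are compared. A minimal sketch of that idea, assuming a norm is simply the mean and standard deviation of one mental state metric across a panel of viewers (the metric and function names are illustrative, not the patent's own):

    ```python
    import statistics

    def build_norm(metric_values):
        """Norm for one metric (e.g. mean smile intensity) across a panel
        of viewers: the mean and standard deviation of the panel."""
        mean = statistics.fmean(metric_values)
        std = statistics.stdev(metric_values)
        return mean, std

    def score_against_norm(value, norm):
        """Standardized (z) score of a new media presentation's metric
        against the norm; positive means above the norm."""
        mean, std = norm
        return (value - mean) / std
    ```

    Separate norms could be built per viewer criterion (country, demographic group, device type), so a new presentation is always scored against a comparable population.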
  • Publication number: 20150099987
    Abstract: A system and method for evaluating heart rate variability for mental state analysis is disclosed. Video of an individual is captured while the individual consumes and interacts with media. The video is analyzed to determine heart rate information, and heart rate variability (HRV) is calculated, understood to be in response to stimuli from the media. The analysis of heart rate variability is based upon a sympathovagal balance derived from a ratio of low frequency heart rate values to high frequency heart rate values. Heart rate variability is analyzed to determine changes in an individual's mental state related to the stimuli, and mental state analysis is thereby performed to evaluate media.
    Type: Application
    Filed: December 13, 2014
    Publication date: April 9, 2015
    Inventors: Viprali Bhatkar, Rana el Kaliouby, Youssef Kashef, Ahmed Adel Osman
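    The sympathovagal balance in the abstract above is a ratio of low-frequency to high-frequency spectral power in the heart-rate signal. A minimal sketch, assuming the conventional LF (0.04–0.15 Hz) and HF (0.15–0.40 Hz) bands and an evenly resampled heart-rate series; the function name and the FFT-based spectrum estimate are illustrative assumptions, not the patent's implementation:

    ```python
    import numpy as np

    def lf_hf_ratio(hr_series, fs=4.0):
        """Sympathovagal balance: ratio of low-frequency (0.04-0.15 Hz)
        to high-frequency (0.15-0.40 Hz) power in a heart-rate signal.
        `hr_series` is assumed to be evenly sampled at `fs` Hz."""
        x = np.asarray(hr_series, dtype=float)
        x = x - x.mean()                       # remove the DC component
        psd = np.abs(np.fft.rfft(x)) ** 2      # one-sided power spectrum
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()
        hf = psd[(freqs >= 0.15) & (freqs <= 0.40)].sum()
        return lf / hf
    ```

    A ratio well above 1 indicates LF dominance (often associated with sympathetic activation); tracking the ratio over time against media timestamps gives the per-stimulus HRV response the abstract describes.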
  • Publication number: 20140357976
    Abstract: A mobile device is emotionally enabled using an application programming interface (API) in order to infer a user's emotions and make the emotions available for sharing. Images of an individual or individuals are captured and sent through the API. The images are evaluated to determine the individual's mental state. Mental state analysis is output to an app running on the device on which the API resides for further sharing, analysis, or transmission. A software development kit (SDK) can be used to generate the API or to otherwise facilitate the emotional enablement of a mobile device and the apps that run on the device.
    Type: Application
    Filed: August 15, 2014
    Publication date: December 4, 2014
    Inventors: Boisy G. Pitre, Rana el Kaliouby, Youssef Kashef
  • Publication number: 20140323817
    Abstract: The mental state of an individual is obtained in order to generate an emotional profile for the individual. The individual's mental state is derived from an analysis of the individual's facial and physiological information. The emotional profiles of other individuals are correlated with that of the first individual for comparison. Various categories of emotional profiles are defined based upon the correlation. The emotional profile of the individual or group of individuals is rendered for display, used to provide feedback, to recommend activities for the individual, or to provide information about the individual.
    Type: Application
    Filed: July 10, 2014
    Publication date: October 30, 2014
    Inventors: Rana el Kaliouby, Avril England
  • Publication number: 20140201207
    Abstract: Mental state data useful for determining mental state information on an individual, such as video of an individual's face, is captured. Additional data that is helpful in determining the mental state information, such as contextual information, is also determined. The data and additional data allows interpretation of individual mental state information. The additional data is tagged to the mental state data and at least some of the mental state data, along with the tagged data, can be sent to a web service where it is used to produce further mental state information.
    Type: Application
    Filed: March 15, 2014
    Publication date: July 17, 2014
    Applicant: Affectiva, Inc.
    Inventors: Richard Scott Sadowsky, Rana el Kaliouby
  • Publication number: 20140200416
    Abstract: Video of one or more people is obtained and analyzed. Heart rate information is determined from the video and the heart rate information is used in mental state analysis. The heart rate information and resulting mental state analysis are correlated to stimuli, such as digital media which is consumed or with which a person interacts. The heart rate information is used to infer mental states. The mental state analysis, based on the heart rate information, can be used to optimize digital media or modify a digital game.
    Type: Application
    Filed: March 15, 2014
    Publication date: July 17, 2014
    Applicant: Affectiva, Inc.
    Inventors: Youssef Kashef, Rana el Kaliouby, Ahmed Adel Osman, Niels Haering, Viprali Bhatkar
  • Publication number: 20140200463
    Abstract: The mental state of an individual is obtained to determine their well-being status. The mental state is derived from an analysis of facial information and physiological information of an individual. The well-being status of other individuals is correlated to the well-being status of the first individual. The well-being status of the individual or group of individuals is rendered for display. The well-being status of an individual is used to provide feedback and to recommend activities for the individual.
    Type: Application
    Filed: March 15, 2014
    Publication date: July 17, 2014
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Daniel Abraham Bender
  • Publication number: 20140200417
    Abstract: Mental state analysis is performed by obtaining video of an individual as the individual interacts with a computer, either by performing various operations or by consuming a media presentation. The video is analyzed to determine eye-blink information on the individual, such as eye-blink rate or eye-blink duration. A mental state of the individual is then inferred based on the eye blink information. The blink-rate information and associated mental states can be used to modify an advertisement, a media presentation, or a digital game.
    Type: Application
    Filed: March 15, 2014
    Publication date: July 17, 2014
    Applicant: Affectiva, Inc.
    Inventors: Thibaud Senechal, Rana el Kaliouby, Niels Haering
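    Eye-blink rate, as used in the abstract above, can be derived from a per-frame eye-openness measurement by counting downward threshold crossings. The sketch below assumes an eye-aspect-ratio (EAR) style signal and an illustrative threshold; neither the signal choice nor the threshold value comes from the patent:

    ```python
    def count_blinks(ear_series, threshold=0.2):
        """Count blinks in a per-frame eye-aspect-ratio (EAR) series:
        a blink is counted each time the EAR drops below `threshold`
        after having been at or above it."""
        blinks = 0
        below = False
        for ear in ear_series:
            if ear < threshold and not below:
                blinks += 1
                below = True
            elif ear >= threshold:
                below = False
        return blinks

    def blink_rate_per_minute(ear_series, fps, threshold=0.2):
        """Blink rate given the camera frame rate (frames per second)."""
        duration_min = len(ear_series) / fps / 60.0
        return count_blinks(ear_series, threshold) / duration_min
    ```

    Elevated blink rates are commonly read as markers of stress or fatigue, which is the kind of inference the abstract feeds into advertisement or game modification.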
  • Publication number: 20140112540
    Abstract: A user interacts with various pieces of technology to perform numerous tasks and activities. Reactions can be observed and mental states inferred from these performances. Multiple devices, including mobile devices, can observe and record or transmit a user's mental state data. The mental state data collected from the multiple devices can be used to analyze the mental states of the user. The mental state data can be in the form of facial expressions, electrodermal activity, movements, or other detectable manifestations. Multiple cameras on the multiple devices can be usefully employed to collect facial data. An output can be rendered based on an analysis of the mental state data.
    Type: Application
    Filed: December 30, 2013
    Publication date: April 24, 2014
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Daniel Abraham Bender, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky, Thibaud Senechal, Panu James Turcot
  • Publication number: 20140058828
    Abstract: Mental state data is collected from a group of people as they view a media presentation, such as an advertisement, a television show, or a movie. The mental state data is analyzed to produce mental state information, such as inferred mental states, facial expressions, or valence. The mental state information is used to automatically optimize the previously viewed media presentation. The optimization may change various aspects of the media presentation including the length of different portions of the media presentation, the overall length of the media presentation, character selection, music selection, advertisement placement, and brand reveal time.
    Type: Application
    Filed: October 31, 2013
    Publication date: February 27, 2014
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Melissa Sue Burke, Andrew Edwin Dreisch, Panu James Turcot, Evan Kodra
  • Publication number: 20140051047
    Abstract: A user may react to an interaction by exhibiting a mental state. A camera or other monitoring device can be used to capture one or more manifestations of the user's mental state, such as facial expressions, electrodermal activity, or movements. However, there may be conditions where the monitoring device is not able to detect the manifestation continually. Thus, various capabilities and implementations are described where the mental state data is collected on an intermittent basis, analyzed, and an output rendered based on the analysis of the mental state data.
    Type: Application
    Filed: October 26, 2013
    Publication date: February 20, 2014
    Applicant: Affectiva, Inc.
    Inventors: Daniel Bender, Rana el Kaliouby, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky
  • Publication number: 20140016860
    Abstract: A system and method for facial analysis to detect asymmetric expressions is disclosed. A series of facial images is collected, and an image from the series of images is evaluated with a classifier. The image is then flipped to create a flipped image. Then, the flipped image is evaluated with the classifier. The results of the evaluation of the original image and the flipped image are compared. Asymmetric features such as a wink, a raised eyebrow, a smirk, or a wince are identified. These asymmetric features are associated with mental states such as skepticism, contempt, condescension, repugnance, disgust, disbelief, cynicism, pessimism, doubt, suspicion, and distrust.
    Type: Application
    Filed: September 19, 2013
    Publication date: January 16, 2014
    Applicant: Affectiva, Inc.
    Inventors: Thibaud Senechal, Rana el Kaliouby
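    The flip-and-compare procedure above is compact enough to sketch directly. In this illustrative version, `classifier` stands in for any callable mapping an image array to a scalar expression score, and the threshold is an assumed value, not one from the patent:

    ```python
    import numpy as np

    def asymmetry_score(image, classifier, threshold=0.3):
        """Evaluate an expression classifier on an image and on its
        horizontal mirror; a large score difference suggests an asymmetric
        expression such as a smirk or a raised eyebrow."""
        score = classifier(image)
        flipped_score = classifier(np.fliplr(image))
        diff = abs(score - flipped_score)
        return diff, diff > threshold
    ```

    A perfectly symmetric face yields identical scores on both images, so the difference isolates exactly the left/right asymmetry the classifier responds to.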
  • Publication number: 20130262182
    Abstract: Analysis of mental states is provided to evaluate purchase intent. Purchase intent may be determined based on viewing and sampling various products. Data is captured for viewers of a product where the data includes facial information, physiological data, and the like. Facial and physiological information may be gathered for a group of viewers. In some embodiments, demographics information is collected and used as a criterion for evaluating product or service purchase intent. In some embodiments, data captured from an individual viewer or group of viewers is used to optimize product purchase intent.
    Type: Application
    Filed: February 15, 2013
    Publication date: October 3, 2013
    Applicant: Affectiva, Inc.
    Inventors: Evan Kodra, Daniel Bender, Rana el Kaliouby, Mohamed Nada