Patents by Inventor Richard Scott Sadowsky

Richard Scott Sadowsky has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11430561
    Abstract: Remote computing analysis for cognitive state data metrics is performed. Cognitive state data from a plurality of people is collected as they interact with a rendering. The cognitive state data includes video facial data collected on one or more local devices from the plurality of people. Information is uploaded to a remote server. The information includes the cognitive state data. A facial expression metric based on a plurality of image classifiers is calculated for each individual within the plurality of people. Cognitive state information is generated for each individual, based on the facial expression metric for each individual. The cognitive state information for each individual within the plurality of people who interacted with the rendering is aggregated. The aggregation is based on the facial expression metric for each individual. The cognitive state information that was aggregated is displayed on at least one of the one or more local devices.
    Type: Grant
    Filed: July 21, 2020
    Date of Patent: August 30, 2022
    Assignee: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Rosalind Wright Picard, Richard Scott Sadowsky
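The abstract above describes computing a facial expression metric per individual from image classifier outputs, then aggregating cognitive state information across everyone who interacted with the rendering. A minimal sketch of that aggregation step, where the mean-based metric formula and classifier score inputs are illustrative assumptions rather than the patented method:

```python
# Hypothetical sketch: combine per-image classifier scores into a facial
# expression metric for each individual, then aggregate across individuals.

def facial_expression_metric(classifier_scores):
    """Combine per-image classifier outputs (e.g. smile, brow raise)
    into a single metric for one individual (here, a simple mean)."""
    return sum(classifier_scores) / len(classifier_scores)

def aggregate_cognitive_state(individuals):
    """individuals: list of per-individual classifier score lists.
    Returns per-individual metrics and their group-level aggregate."""
    metrics = [facial_expression_metric(scores) for scores in individuals]
    return metrics, sum(metrics) / len(metrics)

metrics, group = aggregate_cognitive_state([[0.8, 0.6], [0.2, 0.4]])
```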
  • Patent number: 10843078
Abstract: Mental state data is collected as a person interacts with a game played on a machine. The mental state data includes facial data, where the facial data includes facial regions or facial landmarks. The mental state data can include physiological data and actigraphy data. The mental state data is analyzed to produce mental state information. Mental state data and/or mental state information can be shared across a social network or a gaming community. The affect of the person interacting with the game can be represented to the social network or gaming community in the form of an avatar. Recommendations based on the affect resulting from the analysis can be made to the person interacting with the game. Mental states are analyzed locally or via a web service. Based on the results of the analysis, the game with which the person is interacting is modified.
    Type: Grant
    Filed: February 1, 2016
    Date of Patent: November 24, 2020
    Assignee: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Panu James Turcot, Forest Jay Handford, Daniel Bender, Rosalind Wright Picard, Richard Scott Sadowsky, Oliver Orion Wilder-Smith
  • Publication number: 20200350057
    Abstract: Remote computing analysis for cognitive state data metrics is performed. Cognitive state data from a plurality of people is collected as they interact with a rendering. The cognitive state data includes video facial data collected on one or more local devices from the plurality of people. Information is uploaded to a remote server. The information includes the cognitive state data. A facial expression metric based on a plurality of image classifiers is calculated for each individual within the plurality of people. Cognitive state information is generated for each individual, based on the facial expression metric for each individual. The cognitive state information for each individual within the plurality of people who interacted with the rendering is aggregated. The aggregation is based on the facial expression metric for each individual. The cognitive state information that was aggregated is displayed on at least one of the one or more local devices.
    Type: Application
    Filed: July 21, 2020
    Publication date: November 5, 2020
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Rosalind Wright Picard, Richard Scott Sadowsky
  • Publication number: 20200342979
    Abstract: Distributed analysis for cognitive state metrics is performed. Data for an individual is captured into a computing device. The data provides information for evaluating a cognitive state of the individual. The data for the individual is uploaded to a web server. A cognitive state metric for the individual is calculated. The cognitive state metric is based on the data that was uploaded. Analysis from the web server is received by the computing device. The analysis is based on the data for the individual and the cognitive state metric for the individual. An output that describes a cognitive state of the individual is rendered at the computing device. The output is based on the analysis that was received. The cognitive states of other individuals are correlated to the cognitive state of the individual. Other sources of information are aggregated. The information is used to analyze the cognitive state of the individual.
    Type: Application
    Filed: July 14, 2020
    Publication date: October 29, 2020
    Applicant: Affectiva, Inc.
Inventors: Richard Scott Sadowsky, Rana el Kaliouby, Rosalind Wright Picard, Oliver Orion Wilder-Smith, Panu James Turcot, Zhihong Zeng
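The distributed-analysis abstract above involves computing a cognitive state metric from uploaded data and correlating one individual's cognitive state with those of other individuals. An illustrative sketch of those two steps, assuming scalar sample series per individual and Pearson correlation (the actual metric and correlation method are not specified in the abstract):

```python
# Illustrative sketch, not the patented implementation: derive a cognitive
# state metric from uploaded samples and correlate two individuals' series.
from statistics import mean

def cognitive_state_metric(samples):
    """samples: numeric readings uploaded to the server for one individual."""
    return mean(samples)

def correlate(a, b):
    """Pearson correlation between two individuals' sample series."""
    ma, mb = mean(a), mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sd_a = sum((x - ma) ** 2 for x in a) ** 0.5
    sd_b = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sd_a * sd_b)
```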
  • Patent number: 10799168
Abstract: Facial image data of an individual is collected to provide mental state data using a first web-enabled computing device. The mental state data is analyzed to produce mental state information using a second web-enabled computing device. The mental state information is shared across a social network using a third web-enabled computing device. The mental state data is also collected from the individual through capture of sensor information. The mental state data is also collected from the individual through capture of audio data. The individual elects to share the mental state information across the social network. The mental state data may be collected over a period of time and analyzed to determine a mood of the individual. The mental state information is translated into a representative icon for sharing, which may include an emoji. An image of the individual is shared along with the mental state information.
    Type: Grant
    Filed: September 29, 2017
    Date of Patent: October 13, 2020
    Assignee: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Richard Scott Sadowsky, Oliver Orion Wilder-Smith
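The abstract above mentions translating mental state information into a representative icon, such as an emoji, for sharing. A minimal sketch of that translation, where the valence score, thresholds, and chosen emoji are all assumptions for illustration:

```python
# Hedged sketch: map a mood score to a shareable emoji icon.
# The [-1, 1] valence scale and the 0.3 thresholds are assumptions.

def mood_icon(valence):
    """Map a valence score in [-1, 1] to a representative emoji."""
    if valence > 0.3:
        return "😀"   # clearly positive mood
    if valence < -0.3:
        return "😞"   # clearly negative mood
    return "😐"       # neutral or ambiguous
```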
  • Patent number: 10143414
    Abstract: An individual can exhibit one or more mental states when reacting to a stimulus. A camera or other monitoring device can be used to collect, on an intermittent basis, mental state data including facial data. The mental state data can be interpolated between the intermittent collecting. The facial data can be obtained from a series of images of the individual where the images are captured intermittently. A second face can be identified, and the first face and the second face can be tracked.
    Type: Grant
    Filed: December 7, 2015
    Date of Patent: December 4, 2018
    Assignee: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Daniel Bender, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky
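The abstract above describes collecting mental state data on an intermittent basis and interpolating between the intermittent captures. A sketch of one plausible reading, assuming a scalar metric sampled at known timestamps and linear interpolation (the patent does not specify the interpolation scheme):

```python
# Sketch of interpolating mental state data between intermittent captures.
# Linear interpolation over (timestamp, value) samples is an assumption.

def interpolate(samples, t):
    """samples: time-sorted list of (timestamp, value) pairs captured
    intermittently; t: the query time to estimate a value for."""
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)
            return v0 + frac * (v1 - v0)
    raise ValueError("t is outside the sampled range")
```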
  • Patent number: 9934425
    Abstract: A user interacts with various pieces of technology to perform numerous tasks and activities. Reactions can be observed and mental states inferred from these performances. Multiple devices, including mobile devices, can observe and record or transmit a user's mental state data. The mental state data collected from the multiple devices can be used to analyze the mental states of the user. The mental state data can be in the form of facial expressions, electrodermal activity, movements, or other detectable manifestations. Multiple cameras on the multiple devices can be usefully employed to collect facial data. An output can be rendered based on an analysis of the mental state data.
    Type: Grant
    Filed: December 30, 2013
    Date of Patent: April 3, 2018
    Assignee: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Daniel Abraham Bender, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky, Thibaud Senechal, Panu James Turcot
  • Publication number: 20180035938
    Abstract: Facial image data of an individual is collected of the individual to provide mental state data using a first web-enabled computing device. The mental state data is analyzed to produce mental state information using a second web-enabled computing device. The mental state information is shared across a social network using a third web-enabled computing device. The mental state data is also collected from the individual through capture of sensor information. The mental state data is also collected from the individual through capture of audio data. The individual elects to share the mental state information across the social network. The mental state data may be collected over a period of time and analyzed to determine a mood of the individual. The mental state information is translated into a representative icon for sharing, which may include an emoji. An image of the individual is shared along with the mental state information.
    Type: Application
    Filed: September 29, 2017
    Publication date: February 8, 2018
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Richard Scott Sadowsky, Oliver Orion Wilder-Smith
  • Publication number: 20170238859
    Abstract: Mental state data useful for determining mental state information on an individual, such as video of an individual's face, is captured. Additional data that is helpful in determining the mental state information, such as contextual information, is also determined. Intermittent mental state data is interpolated. The data and additional data allow interpretation of individual mental state information. The additional data is tagged to the mental state data. At least some of the mental state data, along with the tagged data, is analyzed to produce further mental state information. A mood measurement is a result of the analysis.
    Type: Application
    Filed: May 8, 2017
    Publication date: August 24, 2017
    Applicant: Affectiva, Inc.
    Inventors: Richard Scott Sadowsky, Rana el Kaliouby
  • Patent number: 9646046
Abstract: Mental state data useful for determining mental state information on an individual, such as video of an individual's face, is captured. Additional data that is helpful in determining the mental state information, such as contextual information, is also determined. The data and additional data allow interpretation of individual mental state information. The additional data is tagged to the mental state data and at least some of the mental state data, along with the tagged data, can be sent to a web service where it is used to produce further mental state information.
    Type: Grant
    Filed: March 15, 2014
    Date of Patent: May 9, 2017
    Assignee: Affectiva, Inc.
    Inventors: Richard Scott Sadowsky, Rana el Kaliouby
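The abstract above describes tagging contextual information onto captured mental state data before sending it to a web service. A minimal sketch of that tagging step, where the JSON payload shape and field names are illustrative assumptions:

```python
# Hedged sketch: attach contextual tags to captured mental state data
# and serialize the result as a payload for a web service.
import json

def tag_mental_state(frame_data, context):
    """frame_data: captured mental state data (e.g. per-frame metrics);
    context: contextual tags such as device, activity, or timestamp."""
    record = {"mental_state_data": frame_data, "tags": context}
    return json.dumps(record)  # serialized payload to send upstream
```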
  • Publication number: 20170105668
    Abstract: Image analysis is performed on collected data from a person who interacts with a rendering such as a website or video. The images are collected through video capture. Physiological data is captured from the images. Classifiers are used for analyzing images. Information is uploaded to a server and compared against a plurality of mental state event temporal signatures. Aggregated mental state information from other people who interact with the rendering is received, including video facial data analysis of the other people. The received information is displayed along with the rendering through a visual representation such as an avatar.
    Type: Application
    Filed: December 29, 2016
    Publication date: April 20, 2017
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Rosalind Wright Picard, Richard Scott Sadowsky
  • Publication number: 20170095192
    Abstract: Analysis of mental states is provided using web servers to enable data analysis. Data is captured for an individual where the data includes facial information and physiological information. Data that was captured for the individual is compared against a plurality of mental state event temporal signatures. Analysis is performed on a web service and the analysis is received. The mental states of other people are correlated to the mental state for the individual. Other sources of information are aggregated, where the information is used to analyze the mental state of the individual. Analysis of the mental state of the individual or group of individuals is rendered for display.
    Type: Application
    Filed: December 16, 2016
    Publication date: April 6, 2017
    Applicant: Affectiva, Inc.
    Inventors: Richard Scott Sadowsky, Rana el Kaliouby, Rosalind Wright Picard, Oliver Orion Wilder-Smith, Panu James Turcot, Zhihong Zeng
  • Publication number: 20160144278
Abstract: Mental state data is collected as a person interacts with a game played on a machine. The mental state data includes facial data, where the facial data includes facial regions or facial landmarks. The mental state data can include physiological data and actigraphy data. The mental state data is analyzed to produce mental state information. Mental state data and/or mental state information can be shared across a social network or a gaming community. The affect of the person interacting with the game can be represented to the social network or gaming community in the form of an avatar. Recommendations based on the affect resulting from the analysis can be made to the person interacting with the game. Mental states are analyzed locally or via a web service. Based on the results of the analysis, the game with which the person is interacting is modified.
    Type: Application
    Filed: February 1, 2016
    Publication date: May 26, 2016
    Inventors: Rana el Kaliouby, Panu James Turcot, Forest Jay Handford, Daniel Bender, Rosalind Wright Picard, Richard Scott Sadowsky, Oliver Orion Wilder-Smith
  • Publication number: 20160081607
    Abstract: An individual can exhibit one or more mental states when reacting to a stimulus. A camera or other monitoring device can be used to collect, on an intermittent basis, mental state data including facial data. The mental state data can be interpolated between the intermittent collecting. The facial data can be obtained from a series of images of the individual where the images are captured intermittently. A second face can be identified, and the first face and the second face can be tracked.
    Type: Application
    Filed: December 7, 2015
    Publication date: March 24, 2016
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Daniel Bender, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky
  • Patent number: 9247903
    Abstract: Mental state data is collected as a person interacts with a game machine. Analysis is performed on this data and mental state information and affect are shared across a social network. The affect of a person can be represented to the social network or gaming community in the form of an avatar. Recommendations can be based on the affect of the person. Mental states can be analyzed by web services which may, in turn, modify the game.
    Type: Grant
    Filed: February 6, 2012
    Date of Patent: February 2, 2016
    Assignee: Affectiva, Inc.
    Inventors: Daniel Bender, Rana el Kaliouby, Rosalind Wright Picard, Richard Scott Sadowsky, Panu James Turcot, Oliver Orion Wilder-Smith
  • Patent number: 9204836
    Abstract: A user may react to an interaction by exhibiting a mental state. A camera or other monitoring device can be used to capture one or more manifestations of the user's mental state, such as facial expressions, electrodermal activity, or movements. However, there may be conditions where the monitoring device is not able to detect the manifestation continually. Thus, various capabilities and implementations are described where the mental state data is collected on an intermittent basis, analyzed, and an output rendered based on the analysis of the mental state data.
    Type: Grant
    Filed: October 26, 2013
    Date of Patent: December 8, 2015
    Assignee: Affectiva, Inc.
    Inventors: Daniel Bender, Rana el Kaliouby, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky
  • Patent number: 9106958
    Abstract: Analysis of mental states is provided to enable data analysis pertaining to video recommendation based on affect. Video response may be evaluated based on viewing and sampling various videos. Data is captured for viewers of a video where the data includes facial information and/or physiological data. Facial and physiological information may be gathered for a group of viewers. In some embodiments, demographics information is collected and used as a criterion for visualization of affect responses to videos. In some embodiments, data captured from an individual viewer or group of viewers is used to rank videos.
    Type: Grant
    Filed: February 27, 2012
    Date of Patent: August 11, 2015
    Assignee: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Richard Scott Sadowsky, Rosalind Wright Picard, Oliver Orion Wilder-Smith, May Bahgat
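The abstract above mentions using data captured from a viewer or group of viewers to rank videos for affect-based recommendation. A sketch of one way such a ranking could work, assuming per-viewer affect scores per video and mean-based scoring (the abstract does not specify the scoring rule):

```python
# Illustrative sketch of ranking videos by viewers' affect responses.
# Scoring each video by its mean viewer response is an assumption.

def rank_videos(responses):
    """responses: {video_id: [per-viewer affect scores]}.
    Returns video ids ordered from strongest to weakest response."""
    scores = {vid: sum(vals) / len(vals) for vid, vals in responses.items()}
    return sorted(scores, key=scores.get, reverse=True)
```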
  • Publication number: 20140201207
Abstract: Mental state data useful for determining mental state information on an individual, such as video of an individual's face, is captured. Additional data that is helpful in determining the mental state information, such as contextual information, is also determined. The data and additional data allow interpretation of individual mental state information. The additional data is tagged to the mental state data and at least some of the mental state data, along with the tagged data, can be sent to a web service where it is used to produce further mental state information.
    Type: Application
    Filed: March 15, 2014
    Publication date: July 17, 2014
    Applicant: Affectiva, Inc.
    Inventors: Richard Scott Sadowsky, Rana el Kaliouby
  • Publication number: 20140112540
    Abstract: A user interacts with various pieces of technology to perform numerous tasks and activities. Reactions can be observed and mental states inferred from these performances. Multiple devices, including mobile devices, can observe and record or transmit a user's mental state data. The mental state data collected from the multiple devices can be used to analyze the mental states of the user. The mental state data can be in the form of facial expressions, electrodermal activity, movements, or other detectable manifestations. Multiple cameras on the multiple devices can be usefully employed to collect facial data. An output can be rendered based on an analysis of the mental state data.
    Type: Application
    Filed: December 30, 2013
    Publication date: April 24, 2014
    Applicant: Affectiva, Inc.
    Inventors: Rana el Kaliouby, Daniel Abraham Bender, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky, Thibaud Senechal, Panu James Turcot
  • Publication number: 20140051047
    Abstract: A user may react to an interaction by exhibiting a mental state. A camera or other monitoring device can be used to capture one or more manifestations of the user's mental state, such as facial expressions, electrodermal activity, or movements. However, there may be conditions where the monitoring device is not able to detect the manifestation continually. Thus, various capabilities and implementations are described where the mental state data is collected on an intermittent basis, analyzed, and an output rendered based on the analysis of the mental state data.
    Type: Application
    Filed: October 26, 2013
    Publication date: February 20, 2014
    Applicant: Affectiva, Inc.
    Inventors: Daniel Bender, Rana el Kaliouby, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky