Patents by Inventor Richard Scott Sadowsky
Richard Scott Sadowsky has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11430561
Abstract: Remote computing analysis for cognitive state data metrics is performed. Cognitive state data from a plurality of people is collected as they interact with a rendering. The cognitive state data includes video facial data collected on one or more local devices from the plurality of people. Information is uploaded to a remote server. The information includes the cognitive state data. A facial expression metric based on a plurality of image classifiers is calculated for each individual within the plurality of people. Cognitive state information is generated for each individual, based on the facial expression metric for each individual. The cognitive state information for each individual within the plurality of people who interacted with the rendering is aggregated. The aggregation is based on the facial expression metric for each individual. The cognitive state information that was aggregated is displayed on at least one of the one or more local devices.
Type: Grant
Filed: July 21, 2020
Date of Patent: August 30, 2022
Assignee: Affectiva, Inc.
Inventors: Rana el Kaliouby, Rosalind Wright Picard, Richard Scott Sadowsky
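The metric-and-aggregation flow this abstract describes can be sketched in a few lines. This is an illustrative sketch only, not Affectiva's implementation: the function names, the use of a simple mean for both the per-person metric and the aggregate, and the classifier scores are all assumptions.

```python
# Hypothetical sketch: per-individual facial expression metric, then
# aggregation across everyone who interacted with the rendering.

def facial_expression_metric(classifier_scores):
    """Combine per-frame image-classifier outputs into one metric (mean here)."""
    return sum(classifier_scores) / len(classifier_scores)

def aggregate_cognitive_state(per_person_scores):
    """Compute each person's metric, then aggregate across the group."""
    metrics = {person: facial_expression_metric(scores)
               for person, scores in per_person_scores.items()}
    overall = sum(metrics.values()) / len(metrics)
    return metrics, overall

# Made-up classifier scores (e.g. a smile classifier, one score per frame).
per_person = {
    "viewer_a": [0.2, 0.6, 0.7],
    "viewer_b": [0.1, 0.3],
}
metrics, overall = aggregate_cognitive_state(per_person)
```

The aggregate is what the claim describes being displayed back on one of the local devices; a real system would weight or window the scores rather than take a flat mean.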
-
Patent number: 10843078
Abstract: Mental state data is collected as a person interacts with a game played on a machine. The mental state data includes facial data, where the facial data includes facial regions or facial landmarks. The mental state data can include physiological data and actigraphy data. The mental state data is analyzed to produce mental state information. Mental state data and/or mental state information can be shared across a social network or a gaming community. The affect of the person interacting with the game can be represented to the social network or gaming community in the form of an avatar. Recommendations based on the affect resulting from the analysis can be made to the person interacting with the game. Mental states are analyzed locally or via a web service. Based on the results of the analysis, the game with which the person is interacting is modified.
Type: Grant
Filed: February 1, 2016
Date of Patent: November 24, 2020
Assignee: Affectiva, Inc.
Inventors: Rana el Kaliouby, Panu James Turcot, Forest Jay Handford, Daniel Bender, Rosalind Wright Picard, Richard Scott Sadowsky, Oliver Orion Wilder-Smith
-
Publication number: 20200350057
Abstract: Remote computing analysis for cognitive state data metrics is performed. Cognitive state data from a plurality of people is collected as they interact with a rendering. The cognitive state data includes video facial data collected on one or more local devices from the plurality of people. Information is uploaded to a remote server. The information includes the cognitive state data. A facial expression metric based on a plurality of image classifiers is calculated for each individual within the plurality of people. Cognitive state information is generated for each individual, based on the facial expression metric for each individual. The cognitive state information for each individual within the plurality of people who interacted with the rendering is aggregated. The aggregation is based on the facial expression metric for each individual. The cognitive state information that was aggregated is displayed on at least one of the one or more local devices.
Type: Application
Filed: July 21, 2020
Publication date: November 5, 2020
Applicant: Affectiva, Inc.
Inventors: Rana el Kaliouby, Rosalind Wright Picard, Richard Scott Sadowsky
-
Publication number: 20200342979
Abstract: Distributed analysis for cognitive state metrics is performed. Data for an individual is captured into a computing device. The data provides information for evaluating a cognitive state of the individual. The data for the individual is uploaded to a web server. A cognitive state metric for the individual is calculated. The cognitive state metric is based on the data that was uploaded. Analysis from the web server is received by the computing device. The analysis is based on the data for the individual and the cognitive state metric for the individual. An output that describes a cognitive state of the individual is rendered at the computing device. The output is based on the analysis that was received. The cognitive states of other individuals are correlated to the cognitive state of the individual. Other sources of information are aggregated. The information is used to analyze the cognitive state of the individual.
Type: Application
Filed: July 14, 2020
Publication date: October 29, 2020
Applicant: Affectiva, Inc.
Inventors: Richard Scott Sadowsky, Rana el Kaliouby, Rosalind Wright Picard, Oliver Orion Wilder-Smith, Panu James Turcot, Zhihong Zheng
-
Patent number: 10799168
Abstract: Facial image data is collected of an individual to provide mental state data using a first web-enabled computing device. The mental state data is analyzed to produce mental state information using a second web-enabled computing device. The mental state information is shared across a social network using a third web-enabled computing device. The mental state data is also collected from the individual through capture of sensor information. The mental state data is also collected from the individual through capture of audio data. The individual elects to share the mental state information across the social network. The mental state data may be collected over a period of time and analyzed to determine a mood of the individual. The mental state information is translated into a representative icon for sharing, which may include an emoji. An image of the individual is shared along with the mental state information.
Type: Grant
Filed: September 29, 2017
Date of Patent: October 13, 2020
Assignee: Affectiva, Inc.
Inventors: Rana el Kaliouby, Richard Scott Sadowsky, Oliver Orion Wilder-Smith
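The translation of a mood into a representative icon, as this abstract describes, amounts to a small mapping step. The sketch below is an assumption-laden illustration: the valence scale, the thresholds, and the specific emoji are invented for the example and are not taken from the patent.

```python
# Hypothetical sketch: derive a mood from mental state data collected over
# time, then translate it into a representative emoji for sharing.

def mood_to_icon(valence):
    """Map an averaged valence score in [-1, 1] to a shareable emoji."""
    if valence > 0.3:
        return "😄"
    if valence < -0.3:
        return "😟"
    return "😐"

# Made-up per-capture valence scores from facial image data.
samples = [0.6, 0.4, 0.5]
mood = sum(samples) / len(samples)
icon = mood_to_icon(mood)
```

A real system would derive valence from classifier outputs rather than take it as given, but the icon translation is this kind of threshold mapping.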
-
Patent number: 10143414
Abstract: An individual can exhibit one or more mental states when reacting to a stimulus. A camera or other monitoring device can be used to collect, on an intermittent basis, mental state data including facial data. The mental state data can be interpolated between the intermittent collecting. The facial data can be obtained from a series of images of the individual where the images are captured intermittently. A second face can be identified, and the first face and the second face can be tracked.
Type: Grant
Filed: December 7, 2015
Date of Patent: December 4, 2018
Assignee: Affectiva, Inc.
Inventors: Rana el Kaliouby, Daniel Bender, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky
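Interpolating mental state data between intermittent captures can be pictured with a simple linear interpolation between timestamped samples. This is a sketch under assumptions: linear interpolation is one obvious choice, but the patent does not specify the method, and the capture times and metric values here are made up.

```python
# Hypothetical sketch: estimate a mental state metric at time t from
# sparse (time, value) captures via linear interpolation.

def interpolate(t, samples):
    """Estimate the metric at time t between intermittent captures."""
    samples = sorted(samples)
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    raise ValueError("t is outside the captured interval")

# Made-up facial-data metric values at two capture times (seconds).
captures = [(0.0, 0.2), (10.0, 0.8)]
estimate = interpolate(5.0, captures)
```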
-
Patent number: 9934425
Abstract: A user interacts with various pieces of technology to perform numerous tasks and activities. Reactions can be observed and mental states inferred from these performances. Multiple devices, including mobile devices, can observe and record or transmit a user's mental state data. The mental state data collected from the multiple devices can be used to analyze the mental states of the user. The mental state data can be in the form of facial expressions, electrodermal activity, movements, or other detectable manifestations. Multiple cameras on the multiple devices can be usefully employed to collect facial data. An output can be rendered based on an analysis of the mental state data.
Type: Grant
Filed: December 30, 2013
Date of Patent: April 3, 2018
Assignee: Affectiva, Inc.
Inventors: Rana el Kaliouby, Daniel Abraham Bender, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky, Thibaud Senechal, Panu James Turcot
-
Publication number: 20180035938
Abstract: Facial image data is collected of an individual to provide mental state data using a first web-enabled computing device. The mental state data is analyzed to produce mental state information using a second web-enabled computing device. The mental state information is shared across a social network using a third web-enabled computing device. The mental state data is also collected from the individual through capture of sensor information. The mental state data is also collected from the individual through capture of audio data. The individual elects to share the mental state information across the social network. The mental state data may be collected over a period of time and analyzed to determine a mood of the individual. The mental state information is translated into a representative icon for sharing, which may include an emoji. An image of the individual is shared along with the mental state information.
Type: Application
Filed: September 29, 2017
Publication date: February 8, 2018
Applicant: Affectiva, Inc.
Inventors: Rana el Kaliouby, Richard Scott Sadowsky, Oliver Orion Wilder-Smith
-
Publication number: 20170238859
Abstract: Mental state data useful for determining mental state information on an individual, such as video of an individual's face, is captured. Additional data that is helpful in determining the mental state information, such as contextual information, is also determined. Intermittent mental state data is interpolated. The data and additional data allow interpretation of individual mental state information. The additional data is tagged to the mental state data. At least some of the mental state data, along with the tagged data, is analyzed to produce further mental state information. A mood measurement is a result of the analysis.
Type: Application
Filed: May 8, 2017
Publication date: August 24, 2017
Applicant: Affectiva, Inc.
Inventors: Richard Scott Sadowsky, Rana el Kaliouby
-
Patent number: 9646046
Abstract: Mental state data useful for determining mental state information on an individual, such as video of an individual's face, is captured. Additional data that is helpful in determining the mental state information, such as contextual information, is also determined. The data and additional data allow interpretation of individual mental state information. The additional data is tagged to the mental state data, and at least some of the mental state data, along with the tagged data, can be sent to a web service where it is used to produce further mental state information.
Type: Grant
Filed: March 15, 2014
Date of Patent: May 9, 2017
Assignee: Affectiva, Inc.
Inventors: Richard Scott Sadowsky, Rana el Kaliouby
-
Publication number: 20170105668
Abstract: Image analysis is performed on collected data from a person who interacts with a rendering such as a website or video. The images are collected through video capture. Physiological data is captured from the images. Classifiers are used for analyzing images. Information is uploaded to a server and compared against a plurality of mental state event temporal signatures. Aggregated mental state information from other people who interact with the rendering is received, including video facial data analysis of the other people. The received information is displayed along with the rendering through a visual representation such as an avatar.
Type: Application
Filed: December 29, 2016
Publication date: April 20, 2017
Applicant: Affectiva, Inc.
Inventors: Rana el Kaliouby, Rosalind Wright Picard, Richard Scott Sadowsky
-
Publication number: 20170095192
Abstract: Analysis of mental states is provided using web servers to enable data analysis. Data is captured for an individual where the data includes facial information and physiological information. Data that was captured for the individual is compared against a plurality of mental state event temporal signatures. Analysis is performed on a web service and the analysis is received. The mental states of other people are correlated to the mental state for the individual. Other sources of information are aggregated, where the information is used to analyze the mental state of the individual. Analysis of the mental state of the individual or group of individuals is rendered for display.
Type: Application
Filed: December 16, 2016
Publication date: April 6, 2017
Applicant: Affectiva, Inc.
Inventors: Richard Scott Sadowsky, Rana el Kaliouby, Rosalind Wright Picard, Oliver Orion Wilder-Smith, Panu James Turcot, Zhihong Zeng
-
Publication number: 20160144278
Abstract: Mental state data is collected as a person interacts with a game played on a machine. The mental state data includes facial data, where the facial data includes facial regions or facial landmarks. The mental state data can include physiological data and actigraphy data. The mental state data is analyzed to produce mental state information. Mental state data and/or mental state information can be shared across a social network or a gaming community. The affect of the person interacting with the game can be represented to the social network or gaming community in the form of an avatar. Recommendations based on the affect resulting from the analysis can be made to the person interacting with the game. Mental states are analyzed locally or via a web service. Based on the results of the analysis, the game with which the person is interacting is modified.
Type: Application
Filed: February 1, 2016
Publication date: May 26, 2016
Inventors: Rana el Kaliouby, Panu James Turcot, Forest Jay Handford, Daniel Bender, Rosalind Wright Picard, Richard Scott Sadowsky, Oliver Orion Wilder-Smith
-
Publication number: 20160081607
Abstract: An individual can exhibit one or more mental states when reacting to a stimulus. A camera or other monitoring device can be used to collect, on an intermittent basis, mental state data including facial data. The mental state data can be interpolated between the intermittent collecting. The facial data can be obtained from a series of images of the individual where the images are captured intermittently. A second face can be identified, and the first face and the second face can be tracked.
Type: Application
Filed: December 7, 2015
Publication date: March 24, 2016
Applicant: Affectiva, Inc.
Inventors: Rana el Kaliouby, Daniel Bender, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky
-
Patent number: 9247903
Abstract: Mental state data is collected as a person interacts with a game machine. Analysis is performed on this data, and mental state information and affect are shared across a social network. The affect of a person can be represented to the social network or gaming community in the form of an avatar. Recommendations can be based on the affect of the person. Mental states can be analyzed by web services which may, in turn, modify the game.
Type: Grant
Filed: February 6, 2012
Date of Patent: February 2, 2016
Assignee: Affectiva, Inc.
Inventors: Daniel Bender, Rana el Kaliouby, Rosalind Wright Picard, Richard Scott Sadowsky, Panu James Turcot, Oliver Orion Wilder-Smith
-
Patent number: 9204836
Abstract: A user may react to an interaction by exhibiting a mental state. A camera or other monitoring device can be used to capture one or more manifestations of the user's mental state, such as facial expressions, electrodermal activity, or movements. However, there may be conditions where the monitoring device is not able to detect the manifestation continually. Thus, various capabilities and implementations are described where the mental state data is collected on an intermittent basis, analyzed, and an output rendered based on the analysis of the mental state data.
Type: Grant
Filed: October 26, 2013
Date of Patent: December 8, 2015
Assignee: Affectiva, Inc.
Inventors: Daniel Bender, Rana el Kaliouby, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky
-
Patent number: 9106958
Abstract: Analysis of mental states is provided to enable data analysis pertaining to video recommendation based on affect. Video response may be evaluated based on viewing and sampling various videos. Data is captured for viewers of a video where the data includes facial information and/or physiological data. Facial and physiological information may be gathered for a group of viewers. In some embodiments, demographics information is collected and used as a criterion for visualization of affect responses to videos. In some embodiments, data captured from an individual viewer or group of viewers is used to rank videos.
Type: Grant
Filed: February 27, 2012
Date of Patent: August 11, 2015
Assignee: Affectiva, Inc.
Inventors: Rana el Kaliouby, Richard Scott Sadowsky, Rosalind Wright Picard, Oliver Orion Wilder-Smith, May Bahgat
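Ranking videos from captured viewer data, as the last sentence of this abstract describes, can be sketched as ordering videos by their aggregated affect response. The data shape, the use of a mean, and all scores below are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch: rank videos by mean affect response across viewers.

def rank_videos(responses):
    """Order video ids by mean per-viewer affect score, best first."""
    means = {vid: sum(scores) / len(scores)
             for vid, scores in responses.items()}
    return sorted(means, key=means.get, reverse=True)

# Made-up per-viewer affect scores for three videos.
responses = {
    "video_1": [0.4, 0.6],
    "video_2": [0.9, 0.7, 0.8],
    "video_3": [0.1, 0.2],
}
ranking = rank_videos(responses)
```

The abstract also mentions demographics as a visualization criterion; a fuller sketch would group `responses` by demographic segment before ranking.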
-
Publication number: 20140201207
Abstract: Mental state data useful for determining mental state information on an individual, such as video of an individual's face, is captured. Additional data that is helpful in determining the mental state information, such as contextual information, is also determined. The data and additional data allow interpretation of individual mental state information. The additional data is tagged to the mental state data, and at least some of the mental state data, along with the tagged data, can be sent to a web service where it is used to produce further mental state information.
Type: Application
Filed: March 15, 2014
Publication date: July 17, 2014
Applicant: Affectiva, Inc.
Inventors: Richard Scott Sadowsky, Rana el Kaliouby
-
Publication number: 20140112540
Abstract: A user interacts with various pieces of technology to perform numerous tasks and activities. Reactions can be observed and mental states inferred from these performances. Multiple devices, including mobile devices, can observe and record or transmit a user's mental state data. The mental state data collected from the multiple devices can be used to analyze the mental states of the user. The mental state data can be in the form of facial expressions, electrodermal activity, movements, or other detectable manifestations. Multiple cameras on the multiple devices can be usefully employed to collect facial data. An output can be rendered based on an analysis of the mental state data.
Type: Application
Filed: December 30, 2013
Publication date: April 24, 2014
Applicant: Affectiva, Inc.
Inventors: Rana el Kaliouby, Daniel Abraham Bender, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky, Thibaud Senechal, Panu James Turcot
-
Publication number: 20140051047
Abstract: A user may react to an interaction by exhibiting a mental state. A camera or other monitoring device can be used to capture one or more manifestations of the user's mental state, such as facial expressions, electrodermal activity, or movements. However, there may be conditions where the monitoring device is not able to detect the manifestation continually. Thus, various capabilities and implementations are described where the mental state data is collected on an intermittent basis, analyzed, and an output rendered based on the analysis of the mental state data.
Type: Application
Filed: October 26, 2013
Publication date: February 20, 2014
Applicant: Affectiva, Inc.
Inventors: Daniel Bender, Rana el Kaliouby, Evan Kodra, Oliver Ernst Nowak, Richard Scott Sadowsky