Patents by Inventor Jason Krupat
Jason Krupat has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230186734
Abstract: Techniques are provided for an electronic skill based platform. In an embodiment, a gaming platform may receive a predetermined sale price for a product/service from a merchant. The gaming platform may determine a price for each electronic entry to participate in a skill based game where skill is the predominant factor and chance is not the predominant factor. The gaming platform may generate the skill based game for each participant that purchases an electronic entry. The skill based game may be generated utilizing a clustering technique to determine a skill level for each participant. The gaming platform may determine a score for each participant based on participation in the skill based games. The gaming platform may determine a winner, wherein the winning participant wins the product/service for a cost that is equal to a selected number of the plurality of electronic entries purchased by the winning participant.
Type: Application
Filed: December 14, 2022
Publication date: June 15, 2023
Inventors: Jason Krupat, Bret Siarkowski, Steve Curran, Jake Curran
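The abstract above says only that "a clustering technique" determines each participant's skill level. As an illustration, a minimal sketch of one common choice, a one-dimensional k-means over a scalar skill score such as historical win rate (all names and the choice of algorithm are assumptions, not taken from the patent):

```python
# Hypothetical sketch: group participants into skill tiers with 1-D k-means.
# The patent only says "a clustering technique"; k-means is one common choice.

def kmeans_1d(scores, k=2, iters=20):
    """Cluster scalar skill scores into k tiers; returns (centroids, labels)."""
    # Seed centroids with evenly spaced values from the sorted scores.
    centroids = sorted(scores)[:: max(1, len(scores) // k)][:k]
    for _ in range(iters):
        # Assign each score to its nearest centroid.
        labels = [min(range(k), key=lambda c: abs(s - centroids[c])) for s in scores]
        # Move each centroid to the mean of its assigned scores.
        for c in range(k):
            members = [s for s, lab in zip(scores, labels) if lab == c]
            if members:
                centroids[c] = sum(members) / len(members)
    return centroids, labels

# Example: historical win rates for six entrants; two skill tiers emerge,
# so each entrant can be matched into a game against similarly skilled players.
scores = [0.10, 0.15, 0.12, 0.80, 0.85, 0.90]
centroids, labels = kmeans_1d(scores, k=2)
print(labels)  # → [0, 0, 0, 1, 1, 1]
```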
-
Patent number: 11484685
Abstract: Techniques for robotic control using profiles are disclosed. Cognitive state data for an individual is obtained. A cognitive state profile for the individual is learned using the cognitive state data that was obtained. Further cognitive state data for the individual is collected. The further cognitive state data is compared with the cognitive state profile. Stimuli are provided by a robot to the individual based on the comparing. The robot can be a smart toy. The cognitive state data can include facial image data for the individual. The further cognitive state data can include audio data for the individual. The audio data can be voice data. The voice data augments the cognitive state data. Cognitive state data for the individual is obtained using another robot. The cognitive state profile is updated based on input from either of the robots.
Type: Grant
Filed: June 29, 2020
Date of Patent: November 1, 2022
Assignee: Affectiva, Inc.
Inventors: Rana el Kaliouby, Jason Krupat
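The learn-profile-then-compare loop described above can be sketched in a few lines. The patent does not specify a representation or distance measure; the feature names, the mean-vector profile, the Euclidean distance, and the threshold below are all illustrative assumptions:

```python
# Hypothetical sketch: compare incoming cognitive state features against a
# learned per-individual profile (a per-feature mean), then pick a robot
# stimulus based on how far the new sample deviates from the usual state.

import math

def learn_profile(samples):
    """Average per-feature values across observed samples."""
    keys = samples[0].keys()
    return {k: sum(s[k] for s in samples) / len(samples) for k in keys}

def deviation(profile, sample):
    """Euclidean distance between a new sample and the learned profile."""
    return math.sqrt(sum((sample[k] - profile[k]) ** 2 for k in profile))

def choose_stimulus(profile, sample, threshold=0.3):
    """A large deviation from the usual state triggers a calming stimulus."""
    return "calming" if deviation(profile, sample) > threshold else "neutral"

history = [{"smile": 0.6, "brow_furrow": 0.1}, {"smile": 0.7, "brow_furrow": 0.2}]
profile = learn_profile(history)   # per-feature means: smile ≈ 0.65, brow_furrow ≈ 0.15
print(choose_stimulus(profile, {"smile": 0.1, "brow_furrow": 0.8}))  # → calming
```

Updating the profile "based on input from either of the robots" would amount to folding each robot's new samples back into the running per-feature means.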
-
Patent number: 10869626
Abstract: Techniques are described for image analysis and representation for emotional metric threshold generation. A client device is used to collect image data of a user interacting with a media presentation, where the image data includes facial images of the user. One or more processors are used to analyze the image data to extract emotional content of the facial images. One or more emotional intensity metrics are determined based on the emotional content. The one or more emotional intensity metrics are stored into a digital storage component. The one or more emotional intensity metrics, obtained from the digital storage component, are coalesced into a summary emotional intensity metric. The summary emotional intensity metric is represented.
Type: Grant
Filed: June 25, 2018
Date of Patent: December 22, 2020
Assignee: Affectiva, Inc.
Inventors: Jason Krupat, Rana el Kaliouby, Jason Radice, Chilton Lyons Cabot
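The abstract leaves the coalescing rule unspecified. One plausible reading, offered purely as a sketch, is a weighted blend of the mean and peak of the stored frame-level intensities (the weighting and the function name are assumptions, not the patent's method):

```python
# Hypothetical sketch: coalesce frame-level emotional intensity metrics into a
# single summary value. A blend of mean and peak intensity is one plausible
# rule; the patent abstract does not specify the actual coalescing function.

def summary_intensity(frames, peak_weight=0.5):
    """frames: per-frame intensity values in [0, 1] for one emotion metric."""
    mean = sum(frames) / len(frames)
    peak = max(frames)
    return peak_weight * peak + (1 - peak_weight) * mean

joy = [0.1, 0.2, 0.9, 0.4]   # per-frame joy intensity from facial analysis
print(summary_intensity(joy))  # → 0.65
```

The resulting summary value could then be compared against a threshold to decide whether the media presentation elicited a meaningful emotional response.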
-
Publication number: 20200324072
Abstract: Techniques for robotic control using profiles are disclosed. Cognitive state data for an individual is obtained. A cognitive state profile for the individual is learned using the cognitive state data that was obtained. Further cognitive state data for the individual is collected. The further cognitive state data is compared with the cognitive state profile. Stimuli are provided by a robot to the individual based on the comparing. The robot can be a smart toy. The cognitive state data can include facial image data for the individual. The further cognitive state data can include audio data for the individual. The audio data can be voice data. The voice data augments the cognitive state data. Cognitive state data for the individual is obtained using another robot. The cognitive state profile is updated based on input from either of the robots.
Type: Application
Filed: June 29, 2020
Publication date: October 15, 2020
Applicant: Affectiva, Inc.
Inventors: Rana el Kaliouby, Jason Krupat
-
Patent number: 10401860
Abstract: Image analysis is performed for a two-sided data hub. Data reception on a first computing device is enabled by an individual and a content provider. Cognitive state data including facial data on the individual is collected on a second computing device. The cognitive state data is analyzed on a third computing device and the analysis is provided to the individual. The cognitive state data is evaluated and the evaluation is provided to the content provider. A mood dashboard is displayed to the individual based on the analyzing. The individual opts in to enable data reception for the individual. The content provider provides content via a website.
Type: Grant
Filed: March 12, 2018
Date of Patent: September 3, 2019
Assignee: Affectiva, Inc.
Inventors: Jason Krupat, Rana el Kaliouby, Jason Radice, Gabriele Zijderveld, Chilton Lyons Cabot
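The two-sided flow above (opt-in gate, analysis for the individual, evaluation for the content provider) can be sketched minimally. The class, method names, and valence-based scoring are all illustrative assumptions rather than the patented design:

```python
# Hypothetical sketch of the two-sided hub's opt-in gate: cognitive state data
# is processed only for individuals who have opted in, and the individual and
# the content provider each receive a different view of the same analysis.

class DataHub:
    def __init__(self):
        self.opted_in = set()

    def opt_in(self, user_id):
        self.opted_in.add(user_id)

    def ingest(self, user_id, valence_scores):
        """Returns (mood_for_individual, evaluation_for_provider) or None."""
        if user_id not in self.opted_in:
            return None                      # no data reception without opt-in
        mood = sum(valence_scores) / len(valence_scores)   # feeds the mood dashboard
        evaluation = "positive" if mood > 0 else "negative"  # provider's view
        return mood, evaluation

hub = DataHub()
print(hub.ingest("u1", [0.2, 0.4]))   # → None: u1 has not opted in
hub.opt_in("u1")
print(hub.ingest("u1", [0.2, 0.4]))   # mood ≈ 0.3, evaluation "positive"
```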
-
Publication number: 20180303397
Abstract: Techniques are described for image analysis and representation for emotional metric threshold generation. A client device is used to collect image data of a user interacting with a media presentation, where the image data includes facial images of the user. One or more processors are used to analyze the image data to extract emotional content of the facial images. One or more emotional intensity metrics are determined based on the emotional content. The one or more emotional intensity metrics are stored into a digital storage component. The one or more emotional intensity metrics, obtained from the digital storage component, are coalesced into a summary emotional intensity metric. The summary emotional intensity metric is represented.
Type: Application
Filed: June 25, 2018
Publication date: October 25, 2018
Applicant: Affectiva, Inc.
Inventors: Jason Krupat, Rana el Kaliouby, Jason Radice, Chilton Lyons Cabot
-
Publication number: 20180196432
Abstract: Image analysis is performed for a two-sided data hub. Data reception on a first computing device is enabled by an individual and a content provider. Cognitive state data including facial data on the individual is collected on a second computing device. The cognitive state data is analyzed on a third computing device and the analysis is provided to the individual. The cognitive state data is evaluated and the evaluation is provided to the content provider. A mood dashboard is displayed to the individual based on the analyzing. The individual opts in to enable data reception for the individual. The content provider provides content via a website.
Type: Application
Filed: March 12, 2018
Publication date: July 12, 2018
Applicant: Affectiva, Inc.
Inventors: Jason Krupat, Rana el Kaliouby, Jason Radice, Gabriele Zijderveld, Chilton Lyons Cabot
-
Publication number: 20180144649
Abstract: Techniques are disclosed for smart toy interaction based on using image analysis. Cognitive state data, including facial data, for an individual is obtained, using a first computing device. A cognitive state profile for the individual is learned, using a second computing device based on the cognitive state data that was obtained. Further cognitive state data is collected for the individual. The further cognitive state data is compared with the cognitive state profile. Stimuli are provided by a first smart toy to the individual based on the comparing. The further cognitive state data includes audio data for the individual. Voice data is collected. The voice data augments the cognitive state data. Cognitive state data for the individual is obtained using a second smart toy. The cognitive state profile is updated based on input from the first smart toy or the second smart toy.
Type: Application
Filed: January 4, 2018
Publication date: May 24, 2018
Applicant: Affectiva, Inc.
Inventors: Rana el Kaliouby, Jason Krupat