Patents by Inventor Youssef Kashef
Youssef Kashef has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12204958
Abstract: File system manipulation using machine learning is described. Access to a machine learning system is obtained. A connection between a file system and an application is structured. The connection is managed through an application programming interface (API). The connection provides two-way data transfer through the API between the application and the file system. The connection provides distribution of one or more data files through the API. The connection provides enablement of processing of the one or more data files. The processing uses classifiers running on the machine learning system. Data files are retrieved from the file system connected through the interface. The file system is network-connected to the application through the interface. The data files comprise image data of one or more people. Cognitive state analysis is performed by the machine learning system. The application programming interface is generated by a software development kit (SDK).
Type: Grant
Filed: March 24, 2020
Date of Patent: January 21, 2025
Assignee: Affectiva, Inc.
Inventors: Boisy G. Pitre, Rana el Kaliouby, Youssef Kashef
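The entry above describes an API-mediated connection between an application and a network-connected file system, with classifiers on a machine learning system processing the retrieved image files. Below is a minimal sketch of such a connection layer; the class, method, and path names are hypothetical and the classifier is a stand-in, not the patented implementation.

```python
# Minimal sketch (hypothetical names throughout): an API layer connecting an
# application to a network-mounted file system, retrieving image files, and
# running them through a machine-learning classifier for cognitive state analysis.
from pathlib import Path
from typing import Callable, Iterator


class FileSystemAPI:
    """Two-way connection between an application and a file system."""

    def __init__(self, root: str, classify: Callable[[bytes], dict]):
        self.root = Path(root)    # network-mounted file system root (assumed)
        self.classify = classify  # classifier running on the ML system (stand-in)

    def retrieve(self, pattern: str = "*.jpg") -> Iterator[Path]:
        # Distribute data files from the file system to the application.
        return self.root.glob(pattern)

    def store(self, name: str, data: bytes) -> None:
        # Upload direction of the two-way data transfer.
        (self.root / name).write_bytes(data)

    def analyze(self, pattern: str = "*.jpg") -> dict:
        # Run cognitive state classifiers over each retrieved image file.
        return {p.name: self.classify(p.read_bytes()) for p in self.retrieve(pattern)}


if __name__ == "__main__":
    # Usage with a placeholder path and a trivial stand-in classifier.
    api = FileSystemAPI("/mnt/shared_images", classify=lambda img: {"valence": 0.0})
    results = api.analyze()
```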
-
Patent number: 11151610
Abstract: Video of one or more vehicle occupants is obtained and analyzed. Heart rate information is determined from the video. The heart rate information is used in cognitive state analysis. The heart rate information and resulting cognitive state analysis are correlated to stimuli, such as digital media, which is consumed or with which a vehicle occupant interacts. The heart rate information is used to infer cognitive states. The inferred cognitive states are used to output a mood measurement. The cognitive states are used to modify the behavior of a vehicle. The vehicle is an autonomous or semi-autonomous vehicle. Training is employed in the analysis. Machine learning is engaged to facilitate the training. Near-infrared image processing is used to obtain the video. The analysis is augmented by audio information obtained from the vehicle occupant.
Type: Grant
Filed: December 30, 2019
Date of Patent: October 19, 2021
Assignee: Affectiva, Inc.
Inventors: Rana el Kaliouby, Viprali Bhatkar, Niels Haering, Youssef Kashef, Ahmed Adel Osman
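Several of the entries in this listing determine heart rate from video before inferring cognitive or mental states. The abstracts do not disclose a specific algorithm; the sketch below assumes a conventional remote-photoplethysmography approach (mean green-channel intensity over a face region, band-pass filtering, spectral peak detection) purely for illustration.

```python
# A plausible sketch of recovering heart rate from face video (remote
# photoplethysmography). The ROI, filter order, and band limits here are
# illustrative assumptions, not details taken from the patent.
import numpy as np
from scipy.signal import butter, filtfilt


def heart_rate_from_video(frames: np.ndarray, fps: float) -> float:
    """frames: (T, H, W, 3) array of RGB face crops; returns beats per minute."""
    # Mean green-channel intensity per frame carries the blood-volume pulse.
    signal = frames[:, :, :, 1].mean(axis=(1, 2))
    signal = signal - signal.mean()

    # Band-pass to the physiological range (0.75-4 Hz, roughly 45-240 bpm).
    b, a = butter(3, [0.75, 4.0], btype="band", fs=fps)
    filtered = filtfilt(b, a, signal)

    # The dominant spectral peak gives the pulse rate.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    return float(freqs[spectrum.argmax()] * 60.0)
```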
-
Publication number: 20200226012
Abstract: File system manipulation using machine learning is described. Access to a machine learning system is obtained. A connection between a file system and an application is structured. The connection is managed through an application programming interface (API). The connection provides two-way data transfer through the API between the application and the file system. The connection provides distribution of one or more data files through the API. The connection provides enablement of processing of the one or more data files. The processing uses classifiers running on the machine learning system. Data files are retrieved from the file system connected through the interface. The file system is network-connected to the application through the interface. The data files comprise image data of one or more people. Cognitive state analysis is performed by the machine learning system. The application programming interface is generated by a software development kit (SDK).
Type: Application
Filed: March 24, 2020
Publication date: July 16, 2020
Applicant: Affectiva, Inc.
Inventors: Boisy G. Pitre, Rana el Kaliouby, Youssef Kashef
-
Publication number: 20200134672
Abstract: Video of one or more vehicle occupants is obtained and analyzed. Heart rate information is determined from the video. The heart rate information is used in cognitive state analysis. The heart rate information and resulting cognitive state analysis are correlated to stimuli, such as digital media, which is consumed or with which a vehicle occupant interacts. The heart rate information is used to infer cognitive states. The inferred cognitive states are used to output a mood measurement. The cognitive states are used to modify the behavior of a vehicle. The vehicle is an autonomous or semi-autonomous vehicle. Training is employed in the analysis. Machine learning is engaged to facilitate the training. Near-infrared image processing is used to obtain the video. The analysis is augmented by audio information obtained from the vehicle occupant.
Type: Application
Filed: December 30, 2019
Publication date: April 30, 2020
Applicant: Affectiva, Inc.
Inventors: Rana el Kaliouby, Viprali Bhatkar, Niels Haering, Youssef Kashef, Ahmed Adel Osman
-
Patent number: 10517521
Abstract: Video of one or more people is obtained and analyzed. Heart rate information is determined from the video. The heart rate information is used in mental state analysis. The heart rate information and resulting mental state analysis are correlated to stimuli, such as digital media, which is consumed or with which a person interacts. The heart rate information is used to infer mental states. The inferred mental states are used to output a mood measurement. The mental state analysis, based on the heart rate information, is used to optimize digital media or modify a digital game. Training is employed in the analysis. Machine learning is engaged to facilitate the training.
Type: Grant
Filed: May 8, 2017
Date of Patent: December 31, 2019
Assignee: Affectiva, Inc.
Inventors: Rana el Kaliouby, Viprali Bhatkar, Niels Haering, Youssef Kashef, Ahmed Adel Osman
-
Publication number: 20170238860
Abstract: Video of one or more people is obtained and analyzed. Heart rate information is determined from the video. The heart rate information is used in mental state analysis. The heart rate information and resulting mental state analysis are correlated to stimuli, such as digital media, which is consumed or with which a person interacts. The heart rate information is used to infer mental states. The inferred mental states are used to output a mood measurement. The mental state analysis, based on the heart rate information, is used to optimize digital media or modify a digital game. Training is employed in the analysis. Machine learning is engaged to facilitate the training.
Type: Application
Filed: May 8, 2017
Publication date: August 24, 2017
Applicant: Affectiva, Inc.
Inventors: Rana el Kaliouby, Viprali Bhatkar, Niels Haering, Youssef Kashef, Ahmed Adel Osman
-
Patent number: 9642536
Abstract: Video of one or more people is obtained and analyzed. Heart rate information is determined from the video and the heart rate information is used in mental state analysis. The heart rate information and resulting mental state analysis are correlated to stimuli, such as digital media which is consumed or with which a person interacts. The heart rate information is used to infer mental states. The mental state analysis, based on the heart rate information, can be used to optimize digital media or modify a digital game.
Type: Grant
Filed: March 15, 2014
Date of Patent: May 9, 2017
Assignee: Affectiva, Inc.
Inventors: Youssef Kashef, Rana el Kaliouby, Ahmed Adel Osman, Niels Haering, Viprali Bhatkar
-
Publication number: 20150099987
Abstract: A system and method for evaluating heart rate variability for mental state analysis is disclosed. Video of an individual is captured while the individual consumes and interacts with media. The video is analyzed to determine heart rate information, and heart rate variability (HRV) is calculated and understood to be in response to stimuli from the media. The analysis of heart rate variability is based upon a sympathovagal balance derived from a ratio of low frequency heart rate values to high frequency heart rate values. Heart rate variability is analyzed to determine changes in the individual's mental state related to the stimuli. Mental state analysis based on the determined heart rate variability is thereby performed to evaluate the media.
Type: Application
Filed: December 13, 2014
Publication date: April 9, 2015
Inventors: Viprali Bhatkar, Rana el Kaliouby, Youssef Kashef, Ahmed Adel Osman
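This abstract defines the sympathovagal balance as the ratio of low-frequency to high-frequency heart rate spectral power. The sketch below computes that ratio from beat-to-beat (RR) intervals; the standard HRV band limits (0.04-0.15 Hz and 0.15-0.4 Hz) and the Welch spectral estimate are conventional choices assumed here, not details taken from the publication.

```python
# Worked sketch of an LF/HF sympathovagal balance computed from RR intervals.
# Band limits and resampling rate are conventional HRV choices (assumptions).
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch


def sympathovagal_balance(rr_intervals_s: np.ndarray, fs: float = 4.0) -> float:
    """rr_intervals_s: successive beat-to-beat intervals in seconds."""
    beat_times = np.cumsum(rr_intervals_s)
    # Resample the irregular RR series onto an even grid for spectral analysis.
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    rr_even = interp1d(beat_times, rr_intervals_s)(grid)

    # Power spectral density of the detrended RR series.
    freqs, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(rr_even)))
    df = freqs[1] - freqs[0]

    # Integrate power in the low- and high-frequency bands.
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum() * df
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum() * df
    return lf / hf  # sympathovagal balance (LF/HF ratio)
```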
-
Publication number: 20140357976
Abstract: A mobile device is emotionally enabled using an application programming interface (API) in order to infer a user's emotions and make the emotions available for sharing. Images of an individual or individuals are captured and sent through the API. The images are evaluated to determine the individual's mental state. Mental state analysis is output to an app running on the device on which the API resides for further sharing, analysis, or transmission. A software development kit (SDK) can be used to generate the API or to otherwise facilitate the emotional enablement of a mobile device and the apps that run on the device.
Type: Application
Filed: August 15, 2014
Publication date: December 4, 2014
Inventors: Boisy G. Pitre, Rana el Kaliouby, Youssef Kashef
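The abstract above describes capturing images on a mobile device and sending them through an API that returns mental state analysis to an app. A hypothetical client-side call might look like the following; the endpoint URL, request fields, and response schema are illustrative assumptions, not the SDK-generated interface.

```python
# Hypothetical client-side sketch of an emotion-enablement API call.
# The URL, headers, field names, and response keys are placeholders.
import requests

API_URL = "https://api.example.com/v1/analyze"  # placeholder endpoint


def analyze_image(image_path: str, api_key: str) -> dict:
    """Send a captured image through the API and return the mental state estimate."""
    with open(image_path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {api_key}"},
            files={"image": f},
        )
    response.raise_for_status()
    return response.json()  # e.g. {"joy": 0.82, "surprise": 0.10, ...} (assumed schema)


# An app could then share, display, or transmit the returned metrics:
# result = analyze_image("selfie.jpg", api_key="...")
```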
-
Publication number: 20140200416
Abstract: Video of one or more people is obtained and analyzed. Heart rate information is determined from the video and the heart rate information is used in mental state analysis. The heart rate information and resulting mental state analysis are correlated to stimuli, such as digital media which is consumed or with which a person interacts. The heart rate information is used to infer mental states. The mental state analysis, based on the heart rate information, can be used to optimize digital media or modify a digital game.
Type: Application
Filed: March 15, 2014
Publication date: July 17, 2014
Applicant: Affectiva, Inc.
Inventors: Youssef Kashef, Rana el Kaliouby, Ahmed Adel Osman, Niels Haering, Viprali Bhatkar
-
Publication number: 20110263946
Abstract: A digital computer and method are provided for processing data indicative of images of facial and head movements of a subject, recognizing at least one of said movements, and determining at least one mental state of said subject. Instructions are output for providing to a user information relating to at least one said mental state. Data reflective of input from the user is further processed and, based at least in part on said input, said determination is confirmed or modified, and a transducer generates an output of humanly perceptible stimuli indicative of said at least one mental state.
Type: Application
Filed: April 22, 2010
Publication date: October 27, 2011
Inventors: Rana el Kaliouby, Rosalind W. Picard, Abdelrahman N. Mahmoud, Youssef Kashef, Miriam Anna Rimm Madsen, Mina Mikhail
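The abstract above outlines a loop in which a mental state determination is presented to a user, who can confirm or modify it before a perceptible output is generated. A minimal sketch of that confirm-or-modify loop, with stand-in recognition logic rather than the patented classifiers, might look like this:

```python
# Minimal sketch of a confirm-or-modify loop. The movement recognizer and the
# movement-to-mental-state mapping are stand-ins; a real system would use
# trained classifiers over facial and head movement features.
def infer_mental_state(movement_features: dict) -> str:
    # Stand-in rule mapping a recognized movement to a candidate mental state.
    return "concentrating" if movement_features.get("brow_furrow", 0.0) > 0.5 else "neutral"


def run_session(movement_features: dict) -> str:
    state = infer_mental_state(movement_features)
    print(f"Detected mental state: {state}")  # humanly perceptible output to the user
    answer = input("Is this correct? (y/n) ")  # user input reflective of the determination
    if answer.strip().lower() != "y":
        state = input("Enter the corrected mental state: ")  # modify the determination
    return state
```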