Patents by Inventor Cynthia Breazeal

Cynthia Breazeal has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11148296
    Abstract: A persistent companion robot detects human interaction cues through analysis of a range of sensory inputs. Based on the detected cue, the robot expresses a skill that involves interacting with a human through verbal and non-verbal means to determine a second interaction cue, in response to which the robot performs a second skill such as facilitating social interactions between humans, performing utilitarian tasks, informing humans, or entertaining humans.
    Type: Grant
    Filed: March 18, 2016
    Date of Patent: October 19, 2021
    Assignee: NTT DISRUPTION US, INC.
    Inventor: Cynthia Breazeal
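    The abstract for patent 11148296 describes a control flow in which a detected interaction cue triggers a skill, and that skill's interaction with a person can surface a second cue and a second skill. Below is a minimal Python sketch of such a cue-to-skill dispatch loop; the cue names, skills, and the StubRobot class are invented for illustration and are not part of the patent.

    ```python
    from typing import Callable, Dict, Optional


    class StubRobot:
        """Stand-in for the companion device's speech I/O (hypothetical)."""

        def say(self, text: str) -> None:
            print(f"[robot] {text}")

        def listen(self) -> str:
            return "yes"  # canned reply, just for the sketch


    def detect_cue(sensor_frame: dict) -> Optional[str]:
        """Map one frame of multi-modal sensor input to a named interaction cue."""
        if sensor_frame.get("speech") == "hello":
            return "greeting"
        if sensor_frame.get("face_detected"):
            return "person_present"
        return None


    def greet_skill(robot: StubRobot) -> Optional[str]:
        """First skill: verbal/non-verbal interaction that may surface a second cue."""
        robot.say("Hi! Want to hear today's schedule?")
        return "wants_information" if robot.listen() == "yes" else None


    SKILLS: Dict[str, Callable] = {
        "greeting": greet_skill,
        "wants_information": lambda robot: robot.say("Here is your schedule..."),
    }


    def interaction_loop(robot: StubRobot, sensor_frames) -> None:
        for frame in sensor_frames:
            cue = detect_cue(frame)
            while cue in SKILLS:
                # A skill can return a second cue, which triggers a second skill.
                cue = SKILLS[cue](robot)


    if __name__ == "__main__":
        interaction_loop(StubRobot(), [{"speech": "hello"}, {"face_detected": True}])
    ```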
  • Patent number: 10391636
    Abstract: A method includes providing a telecommunications-enabled robotic device adapted to persist in an environment of a user, receiving an instruction to photograph one or more persons in the environment according to a time parameter, and photographing the one or more persons in accordance with the time parameter, resulting in one or more photographs.
    Type: Grant
    Filed: March 13, 2014
    Date of Patent: August 27, 2019
    Assignee: SQN VENTURE INCOME FUND, L.P.
    Inventor: Cynthia Breazeal
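    The method in patent 10391636 amounts to receiving a photograph instruction with a time parameter and capturing the photos once that parameter is satisfied. The sketch below shows one way to read that flow; the delay-based schedule and the StubCamera class are assumptions, since the abstract does not say how the time parameter is expressed.

    ```python
    import time
    from datetime import datetime, timedelta


    class StubCamera:
        """Stand-in for the robot's camera (hypothetical)."""

        def capture(self) -> str:
            filename = f"photo_{datetime.now():%Y%m%d_%H%M%S}.jpg"
            print(f"[camera] captured {filename}")
            return filename


    def photograph_on_schedule(camera: StubCamera, delay_seconds: float,
                               count: int = 1) -> list:
        """Wait out the time parameter, then take the requested photographs."""
        target = datetime.now() + timedelta(seconds=delay_seconds)
        while datetime.now() < target:
            time.sleep(0.1)
        return [camera.capture() for _ in range(count)]


    if __name__ == "__main__":
        photos = photograph_on_schedule(StubCamera(), delay_seconds=1.0, count=2)
        print(photos)
    ```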
  • Patent number: 10357881
    Abstract: A multi-segment robot for emotive expression includes a first segment having a generally planar surface; a second segment having a first axis of rotation, the second segment in rotational contact with the first segment about the first axis of rotation; and a third segment in rotational contact with the second segment about a second axis of rotation not parallel to the first axis of rotation, the third segment having a display screen adapted to facilitate social interaction with a user.
    Type: Grant
    Filed: February 3, 2016
    Date of Patent: July 23, 2019
    Assignee: SQN VENTURE INCOME FUND, L.P.
    Inventors: Fardad Faridi, Cynthia Breazeal
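    The claim language for patent 10357881 pins down a kinematic layout: a planar base, a second segment rotating about a first axis, and a screen-bearing third segment rotating about a second, non-parallel axis. A small data-structure sketch of that layout follows; the axis vectors are illustrative values, not dimensions from the patent.

    ```python
    import math
    from dataclasses import dataclass
    from typing import Tuple


    @dataclass
    class Joint:
        child: str
        axis: Tuple[float, float, float]  # rotation axis in the parent segment's frame
        angle_rad: float = 0.0


    def axes_parallel(a: Tuple[float, float, float], b: Tuple[float, float, float],
                      tol: float = 1e-9) -> bool:
        cross = (a[1] * b[2] - a[2] * b[1],
                 a[2] * b[0] - a[0] * b[2],
                 a[0] * b[1] - a[1] * b[0])
        return math.sqrt(sum(c * c for c in cross)) < tol


    # First segment: the planar base. The joints below attach the second segment
    # to the base and the display-bearing third segment to the second segment.
    joints = [
        Joint(child="middle_segment", axis=(0.0, 0.0, 1.0)),  # first axis of rotation
        Joint(child="screen_segment", axis=(1.0, 0.0, 0.0)),  # second axis of rotation
    ]

    # The claim requires the second axis not to be parallel to the first.
    assert not axes_parallel(joints[0].axis, joints[1].axis)
    print("segment chain:", " -> ".join(["planar_base"] + [j.child for j in joints]))
    ```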
  • Publication number: 20180229372
    Abstract: A socio-emotive-cognitive architecture for a social robot includes at least two of the following: an attention system that determines at least one of the subject on which and the direction to which the robot focuses at least one of its resources in real time; an embodied speech system that facilitates intent-based variation of utterances combined with multi-segment body movement; a motivation system that adjusts at least one of the degree of interaction and a mode of interaction for engaging a human user; and an emotion system that partially determines how the attention system, the embodied speech system, and the motivation system perform for any given interaction with a human.
    Type: Application
    Filed: February 7, 2018
    Publication date: August 16, 2018
    Inventors: Cynthia Breazeal, Fardad Faridi, Sigurdur Orn Adalgeirsson, Samuel Lee Spaulding, Andrew Paul Burt Stout, Thomas James Donahue, Matthew R. Berlin, Jesse V. Gray
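    Publication 20180229372 names four cooperating subsystems (attention, embodied speech, motivation, emotion), with the emotion system partially shaping how the other three behave. The skeleton below sketches that relationship; the class names, the scalar "arousal" signal, and the specific biasing rules are assumptions made only for illustration.

    ```python
    from dataclasses import dataclass


    @dataclass
    class EmotionSystem:
        arousal: float = 0.5  # 0 = calm, 1 = excited (illustrative scale)


    class AttentionSystem:
        def choose_focus(self, percepts: dict, emotion: EmotionSystem) -> str:
            # Higher arousal biases attention toward the most salient percept.
            criterion = "salience" if emotion.arousal > 0.6 else "recency"
            return max(percepts, key=lambda name: percepts[name][criterion])


    class EmbodiedSpeechSystem:
        def utter(self, text: str, emotion: EmotionSystem) -> dict:
            # Pair the utterance with a body movement whose size tracks arousal.
            return {"speech": text, "gesture_amplitude": emotion.arousal}


    class MotivationSystem:
        def engagement_level(self, emotion: EmotionSystem) -> str:
            return "proactive" if emotion.arousal > 0.5 else "responsive"


    if __name__ == "__main__":
        emotion = EmotionSystem(arousal=0.8)
        percepts = {"person": {"salience": 0.9, "recency": 0.2},
                    "toy": {"salience": 0.3, "recency": 0.9}}
        focus = AttentionSystem().choose_focus(percepts, emotion)
        print(focus,
              EmbodiedSpeechSystem().utter("Hello!", emotion),
              MotivationSystem().engagement_level(emotion))
    ```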
  • Publication number: 20180133900
    Abstract: A social robot provides more believable, spontaneous, and understandable expressive communication via embodied communication capabilities through which the robot can express one or more of: paralinguistic audio expressions, sound effects or audio/vocal filters, expressive synthetic or pre-recorded speech, body movements and expressive gestures, body postures, lighting effects, aromas, and on-screen content such as graphics, animations, photos, and videos. These channels are coordinated with produced speech to enhance the expressiveness of both the spoken communication and the non-verbal communication that accompanies it.
    Type: Application
    Filed: November 14, 2017
    Publication date: May 17, 2018
    Inventors: Cynthia Breazeal, Fardad Faridi, Sigurdur Orn Adalgeirsson, Thomas James Donahue, Sridhar Raghavan, Adam Shonkoff
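    Publication 20180133900 lists several expression channels (speech, sound, movement, lighting, on-screen content) that are coordinated with produced speech. The sketch below schedules a few such channel events on a shared timeline; the channel names, timings, and the words-per-second estimate are illustrative assumptions.

    ```python
    from typing import List, Tuple

    Event = Tuple[float, str, str]  # (start_time_s, channel, payload)


    def plan_expression(utterance: str, words_per_second: float = 2.5) -> List[Event]:
        """Lay non-verbal channel events onto the timeline of the spoken utterance."""
        events: List[Event] = [(0.0, "speech", utterance)]
        duration = len(utterance.split()) / words_per_second
        # Non-verbal channels are scheduled relative to the speech timeline.
        events.append((0.0, "lighting", "warm_glow"))
        events.append((duration * 0.5, "body", "nod"))
        events.append((duration, "screen", "show_smile_animation"))
        return sorted(events)


    if __name__ == "__main__":
        for t, channel, payload in plan_expression("Nice to see you again"):
            print(f"{t:5.2f}s  {channel:8s} {payload}")
    ```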
  • Publication number: 20170206064
    Abstract: A development platform for developing a skill for a persistent companion device (PCD) includes an asset development library having an application programming interface (API) configured to enable a developer to find, create, edit, or access one or more content assets utilizable for creating a skill; an expression tool suite having one or more APIs via which are received one or more expressions associated with the skill as specified by the developer, wherein the skill is executable by the PCD in response to at least one defined input; a behavior editor for specifying one or more behavioral sequences of the PCD for the skill; and a skill deployment facility having an API for deploying the skill to an execution engine of the PCD.
    Type: Application
    Filed: March 30, 2017
    Publication date: July 20, 2017
    Inventors: Cynthia Breazeal, Avida Michaud, Francois Laberge, Jonathan Louis Ross, Carolyn Marothy Saund, Fardad Faridi
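    Publication 20170206064 describes a developer-facing platform with four pieces: an asset development library, an expression tool suite, a behavior editor, and a skill deployment facility that hands a finished skill to the device's execution engine. The sketch below mirrors that structure in miniature; every class and method name is hypothetical, since the publication describes the platform functionally rather than as a concrete API.

    ```python
    from dataclasses import dataclass, field
    from typing import Dict, List


    @dataclass
    class Skill:
        name: str
        trigger: str                          # the "defined input" that launches the skill
        assets: List[str] = field(default_factory=list)
        expressions: List[str] = field(default_factory=list)
        behavior_sequence: List[str] = field(default_factory=list)


    class AssetLibrary:
        """Toy asset development library: create and find content assets by name."""

        def __init__(self) -> None:
            self._assets: Dict[str, bytes] = {}

        def create(self, name: str, data: bytes) -> None:
            self._assets[name] = data

        def find(self, name: str) -> bytes:
            return self._assets[name]


    class ExecutionEngine:
        """Stand-in for the PCD-side runtime that receives deployed skills."""

        def install(self, skill: Skill) -> None:
            print(f"[pcd] installed skill '{skill.name}' triggered by '{skill.trigger}'")


    def deploy(skill: Skill, engine: ExecutionEngine) -> None:
        engine.install(skill)


    if __name__ == "__main__":
        library = AssetLibrary()
        library.create("wave.anim", b"...")
        skill = Skill(name="morning_greeting", trigger="user_says_good_morning",
                      assets=["wave.anim"], expressions=["cheerful_voice"],
                      behavior_sequence=["turn_to_user", "play wave.anim", "speak greeting"])
        deploy(skill, ExecutionEngine())
    ```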
  • Publication number: 20160199977
    Abstract: A persistent companion robot detects human interaction cues through analysis of a range of sensory inputs. Based on the detected cue, the robot expresses a skill that involves interacting with a human through verbal and non-verbal means to determine a second interaction cue, in response to which the robot performs a second skill such as facilitating social interactions between humans, performing utilitarian tasks, informing humans, or entertaining humans.
    Type: Application
    Filed: March 18, 2016
    Publication date: July 14, 2016
    Inventor: Cynthia Breazeal
  • Publication number: 20160193732
    Abstract: A persistent companion robot supports both one-on-one interaction with a human and group interaction with more than one human. The interaction can be directed to a human in detectable proximity, such as a human who is near to the robot, one who is farther away from the robot, or any combination of near and far humans. The interaction incorporates multi-modal human input detection (e.g., seeing, hearing, tactile sensing) with multi-modal expression (e.g., movement, speech, non-speech sound, lighting, electronic imagery, and the like).
    Type: Application
    Filed: March 15, 2016
    Publication date: July 7, 2016
    Inventors: Cynthia Breazeal, Robert Todd Pack, Seppo Andrew Rapo, Roberto Pieraccini, Maxim Makachev
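    Publication 20160193732 distinguishes one-on-one interaction from group interaction and from addressing humans who are farther away. The small sketch below picks an interaction mode from detected people and their distances; the Person fields and the 2-meter threshold are assumptions, not values from the publication.

    ```python
    from dataclasses import dataclass
    from typing import List


    @dataclass
    class Person:
        name: str
        distance_m: float


    def choose_interaction(people: List[Person], near_threshold_m: float = 2.0) -> str:
        """Pick an interaction mode based on who is in detectable proximity."""
        if not people:
            return "idle"
        near = [p for p in people if p.distance_m <= near_threshold_m]
        if len(near) == 1:
            return f"one-on-one with {near[0].name}"
        if len(near) > 1:
            return f"group interaction with {', '.join(p.name for p in near)}"
        return "address far-away person(s) with louder speech and larger gestures"


    if __name__ == "__main__":
        print(choose_interaction([Person("Ada", 1.2), Person("Lin", 1.8), Person("Sam", 5.0)]))
    ```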
  • Publication number: 20160171979
    Abstract: Tiled grammar-based phrase spotting includes orienting at least one segment of a multi-segment robot to facilitate capturing speech of a user arriving at the robot, and configuring a plurality of processing threads of a multi-threaded processing environment into distinct tiles, wherein at least a portion of the plurality of processing threads operate simultaneously on the captured speech to recognize a phrase type using a speech recognition grammar that is associated with the corresponding tile, wherein at least two of the tiles employ different speech recognition grammars to recognize different content in the captured speech.
    Type: Application
    Filed: February 12, 2016
    Publication date: June 16, 2016
    Inventors: Cynthia Breazeal, Roberto Pieraccini, Maxim Makachev
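    Publication 20160171979 describes running multiple processing threads ("tiles") on the same captured speech at once, each tile with its own recognition grammar. The sketch below imitates that arrangement with a thread pool and toy keyword grammars standing in for real speech-recognition grammars; the tile names and phrases are invented.

    ```python
    from concurrent.futures import ThreadPoolExecutor
    from typing import Dict, List

    # Each "tile" pairs a phrase type with a (toy) grammar: here, a keyword list.
    TILES: Dict[str, List[str]] = {
        "greeting": ["hello", "hi", "good morning"],
        "command": ["play music", "set a timer", "take a photo"],
        "question": ["what time is it", "what's the weather"],
    }


    def spot_phrases(tile_name: str, grammar: List[str], transcript: str) -> List[str]:
        """One tile's recognizer: match only the phrases its own grammar knows."""
        return [phrase for phrase in grammar if phrase in transcript.lower()]


    def tiled_phrase_spotting(transcript: str) -> Dict[str, List[str]]:
        # All tiles run simultaneously on the same captured speech.
        with ThreadPoolExecutor(max_workers=len(TILES)) as pool:
            futures = {name: pool.submit(spot_phrases, name, grammar, transcript)
                       for name, grammar in TILES.items()}
            return {name: f.result() for name, f in futures.items() if f.result()}


    if __name__ == "__main__":
        print(tiled_phrase_spotting("Hello robot, please take a photo"))
    ```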
  • Publication number: 20160151917
    Abstract: A multi-segment robot for emotive expression includes a first segment having a generally planar surface; a second segment having a first axis of rotation, the second segment in rotational contact with the first segment about the first axis of rotation; and a third segment in rotational contact with the second segment about a second axis of rotation not parallel to the first axis of rotation, the third segment having a display screen adapted to facilitate social interaction with a user.
    Type: Application
    Filed: February 3, 2016
    Publication date: June 2, 2016
    Inventors: Fardad Faridi, Cynthia Breazeal
  • Publication number: 20150314454
    Abstract: A development platform for developing a skill for a persistent companion device (PCD) includes an asset development library having an application programming interface (API) configured to enable a developer to find, create, edit, or access one or more content assets utilizable for creating a skill; an expression tool suite having one or more APIs via which are received one or more expressions associated with the skill as specified by the developer, wherein the skill is executable by the PCD in response to at least one defined input; a behavior editor for specifying one or more behavioral sequences of the PCD for the skill; and a skill deployment facility having an API for deploying the skill to an execution engine of the PCD.
    Type: Application
    Filed: July 15, 2015
    Publication date: November 5, 2015
    Inventors: Cynthia Breazeal, Avida Michaud, Francois Laberge, Jonathan Louis Ross, Carolyn Marothy Saund, Elio Dante Querze, III, Fardad Faridi
  • Patent number: 8909370
    Abstract: An interactive system for interacting with a sentient being. The system includes a robotic companion, of which the sentient being may be a user, and an entity that employs the robot as a participant in an activity involving the user. The robotic companion responds to inputs from an environment that includes the user during the activity. The robotic companion is capable of social and affective behavior either under control of the entity or in response to the environment. The entity may provide an interface by which an operator may control the robotic companion. Example applications for the interactive system include a system for communicating with patients who have difficulty communicating verbally, a system for teaching remotely located students or students with communication difficulties, a system for facilitating social interaction between a remotely located relative and a child, and systems in which the user and the robot interact with an entity such as a smart book.
    Type: Grant
    Filed: May 8, 2008
    Date of Patent: December 9, 2014
    Assignee: Massachusetts Institute of Technology
    Inventors: Walter Dan Stiehl, Cynthia Breazeal, Jun Ki Lee, Allan Z Maymin, Heather Knight, Robert L. Toscano, Iris M. Cheung
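    Patent 8909370 covers a robotic companion that can behave socially on its own in response to its environment or be driven through an operator interface (for example, by a remote teacher or clinician). The sketch below shows that priority between operator commands and autonomous reactions; the command strings and class names are hypothetical.

    ```python
    from typing import Optional


    class RobotCompanion:
        def perform(self, behavior: str) -> None:
            print(f"[robot] performing: {behavior}")

        def autonomous_response(self, observation: str) -> str:
            # Simple affective reaction used when no operator command is pending.
            return "lean toward speaker" if observation == "child_speaking" else "idle_breathing"


    def control_step(robot: RobotCompanion, observation: str,
                     operator_command: Optional[str] = None) -> None:
        # Operator commands take priority; otherwise the robot reacts on its own.
        robot.perform(operator_command or robot.autonomous_response(observation))


    if __name__ == "__main__":
        robot = RobotCompanion()
        control_step(robot, observation="child_speaking")                       # autonomous
        control_step(robot, observation="child_speaking", operator_command="wave hello")
    ```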
  • Publication number: 20140277735
    Abstract: A method includes providing a telecommunications-enabled robotic device adapted to persist in an environment of a user, receiving an instruction to photograph one or more persons in the environment according to a time parameter, and photographing the one or more persons in accordance with the time parameter, resulting in one or more photographs.
    Type: Application
    Filed: March 13, 2014
    Publication date: September 18, 2014
    Applicant: JIBO, Inc.
    Inventor: Cynthia Breazeal
  • Patent number: 8751042
    Abstract: A method of generating a behavior of a robot includes measuring input data associated with a plurality of user responses, applying an algorithm to the input data of the plurality of user responses to generate a plurality of user character classes, storing the plurality of user character classes in a database, classifying an individual user into a selected one of the plurality of user character classes by generating user preference data, selecting a robot behavior based on the selected user character class, and controlling the actions of the robot in accordance with the selected robot behavior during a user-robot interaction session. The selected user character class and the user preference data are based at least in part on input data associated with the individual user.
    Type: Grant
    Filed: December 14, 2011
    Date of Patent: June 10, 2014
    Assignees: Toyota Motor Engineering & Manufacturing North America, Inc., Massachusetts Institute of Technology
    Inventors: Haeyeon Lee, Yasuhiro Ota, Cynthia Breazeal, Jun Ki Lee
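    Patent 8751042 describes a pipeline: derive user character classes from many users' response data, assign an individual user to one class, and select a robot behavior for that class. The sketch below illustrates the assignment and selection steps with a nearest-center rule; the two-feature user representation, the class centers, and the behavior table are illustrative assumptions, and the patent does not fix a particular clustering algorithm.

    ```python
    from typing import Dict, Tuple

    Vector = Tuple[float, float]  # e.g., (talkativeness, preferred interaction pace)


    def nearest_class(user: Vector, class_centers: Dict[str, Vector]) -> str:
        def dist2(a: Vector, b: Vector) -> float:
            return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
        return min(class_centers, key=lambda name: dist2(user, class_centers[name]))


    # Centers that would normally come from clustering many users' response data.
    CLASS_CENTERS: Dict[str, Vector] = {
        "reserved": (0.2, 0.3),
        "chatty": (0.8, 0.7),
    }

    BEHAVIOR_FOR_CLASS: Dict[str, str] = {
        "reserved": "short prompts, slow pace, minimal gestures",
        "chatty": "open-ended questions, fast pace, expressive gestures",
    }


    def select_behavior(user_measurements: Vector) -> str:
        """Classify the individual user, then pick the behavior for that class."""
        user_class = nearest_class(user_measurements, CLASS_CENTERS)
        return BEHAVIOR_FOR_CLASS[user_class]


    if __name__ == "__main__":
        print(select_behavior((0.75, 0.6)))   # -> behavior for the "chatty" class
    ```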
  • Patent number: 8475172
    Abstract: There is disclosed a process for motor learning, for teaching motion, or for rehabilitation of a student, the student having a motor sensory system. Plural transducers may be coupled around at least one joint of the student, the transducers providing kinesthetic sensations to the student through the student's motor sensory system. Tactile control signals may be provided to control operation of the transducers to guide motions of the student.
    Type: Grant
    Filed: July 19, 2007
    Date of Patent: July 2, 2013
    Assignee: Massachusetts Institute of Technology
    Inventors: Jeff Lieberman, Cynthia Breazeal
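    Patent 8475172 describes transducers placed around a student's joints that deliver kinesthetic cues to guide motion. One natural reading is a proportional feedback rule, sketched below; the gain, the flexor/extensor layout, and the angle values are assumptions, not parameters from the patent.

    ```python
    def tactile_guidance(target_angle_deg: float, measured_angle_deg: float,
                         gain: float = 0.02, max_intensity: float = 1.0) -> dict:
        """Return drive intensities (0..1) for a flexor-side and extensor-side transducer."""
        error = target_angle_deg - measured_angle_deg
        intensity = min(abs(error) * gain, max_intensity)
        # Vibrate on the side the student should move toward.
        return {"flexor": intensity if error > 0 else 0.0,
                "extensor": intensity if error < 0 else 0.0}


    if __name__ == "__main__":
        # Elbow is at 40 degrees but the demonstrated motion calls for 75 degrees.
        print(tactile_guidance(target_angle_deg=75, measured_angle_deg=40))
    ```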
  • Publication number: 20130158707
    Abstract: A method of generating a behavior of a robot includes measuring input data associated with a plurality of user responses, applying an algorithm to the input data of the plurality of user responses to generate a plurality of user character classes, storing the plurality of user character classes in a database, classifying an individual user into a selected one of the plurality of user character classes by generating user preference data, selecting a robot behavior based on the selected user character class, and controlling the actions of the robot in accordance with the selected robot behavior during a user-robot interaction session. The selected user character class and the user preference data are based at least in part on input data associated with the individual user.
    Type: Application
    Filed: December 14, 2011
    Publication date: June 20, 2013
    Applicant: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventors: Haeyeon Lee, Yasuhiro Ota, Cynthia Breazeal, Jun Ki Lee
  • Publication number: 20090276288
    Abstract: The present invention provides a new and unique platform for authoring and deploying interactive characters that are powered by artificial intelligence. The platform permits the creation of a virtual world populated by multiple characters and objects that interact with one another so as to create a life-like virtual world and that interact with a user so as to provide a more interesting and powerful experience for the user. The system can be used for entertainment, commercial, and educational purposes, among others.
    Type: Application
    Filed: November 14, 2008
    Publication date: November 5, 2009
    Inventors: Michal Hlavac, Senia Maymin, Cynthia Breazeal, Milos Hlavac, Juraj Hlavac, Dennis Bromley
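    Publications 20090276288 and 20090106171 (the following entry) share an abstract describing a platform in which multiple AI-driven characters and objects inhabit a virtual world and react to one another and to the user. The sketch below is a toy version of that idea; the character behaviors and class names are invented for illustration.

    ```python
    from dataclasses import dataclass, field
    from typing import List


    @dataclass
    class Character:
        name: str
        mood: str = "neutral"

        def react(self, event: str) -> str:
            # Stand-in for the authored, AI-driven behavior a real character would run.
            self.mood = "happy" if "wave" in event else self.mood
            return f"{self.name} ({self.mood}) reacts to: {event}"


    @dataclass
    class VirtualWorld:
        characters: List[Character] = field(default_factory=list)

        def broadcast(self, event: str) -> List[str]:
            # Every character, not only the one addressed, can respond, which is
            # what makes the world feel inhabited.
            return [c.react(event) for c in self.characters]


    if __name__ == "__main__":
        world = VirtualWorld([Character("Fox"), Character("Owl")])
        for line in world.broadcast("user waves at Fox"):
            print(line)
    ```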
  • Publication number: 20090106171
    Abstract: The present invention provides a new and unique platform for authoring and deploying interactive characters that are powered by artificial intelligence. The platform permits the creation of a virtual world populated by multiple characters and objects that interact with one another so as to create a life-like virtual world and that interact with a user so as to provide a more interesting and powerful experience for the user. The system can be used for entertainment, commercial, and educational purposes, among others.
    Type: Application
    Filed: April 30, 2008
    Publication date: April 23, 2009
    Inventors: Michal Hlavac, Senia Maymin, Cynthia Breazeal, Milos Hlavac, Juraj Hlavac, Dennis Bromley
  • Patent number: D746886
    Type: Grant
    Filed: May 23, 2014
    Date of Patent: January 5, 2016
    Assignee: JIBO, INC.
    Inventors: Cynthia Breazeal, Fardad Faridi
  • Patent number: D761895
    Type: Grant
    Filed: November 24, 2015
    Date of Patent: July 19, 2016
    Assignee: JIBO, INC.
    Inventors: Cynthia Breazeal, Fardad Faridi