Patents by Inventor Shinichi Oonaka

Shinichi Oonaka has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20130123658
    Abstract: A child-care robot for use in a nursery school associates child behavior patterns with corresponding robot action patterns, and acquires a child behavior pattern when a child behaves in a specified pattern. The robot selects the robot action pattern that is associated with the acquired child behavior pattern, and performs the selected robot action pattern. Additionally, the robot associates child identifiers with parent identifiers, and receives from a remote terminal an inquiry message indicating a parent identifier. The robot detects the child identifier that is associated with the parent identifier of the inquiry message, acquires an image or a voice of the child identified by the detected child identifier, and transmits the acquired image or voice to the remote terminal. The robot further moves in search of a child, measures the temperature of the child, and associates the temperature with the time of day at which it was measured.
    Type: Application
    Filed: January 9, 2013
    Publication date: May 16, 2013
    Inventor: Shinichi Oonaka
  • Patent number: 8376803
    Abstract: A child-care robot for use in a nursery school associates child behavior patterns with corresponding robot action patterns, and acquires a child behavior pattern when a child behaves in a specified pattern. The robot selects the robot action pattern that is associated with the acquired child behavior pattern, and performs the selected robot action pattern. Additionally, the robot associates child identifiers with parent identifiers, and receives from a remote terminal an inquiry message indicating a parent identifier. The robot detects the child identifier that is associated with the parent identifier of the inquiry message, acquires an image or a voice of the child identified by the detected child identifier, and transmits the acquired image or voice to the remote terminal. The robot further moves in search of a child, measures the temperature of the child, and associates the temperature with the time of day at which it was measured.
    Type: Grant
    Filed: March 25, 2005
    Date of Patent: February 19, 2013
    Assignee: NEC Corporation
    Inventor: Shinichi Oonaka
  • Publication number: 20090182453
    Abstract: For a joint performance of a dialogue between a human partner and a robot, the robot analyzes the phrases and actions of the partner to detect a recognized behavior of the partner, and analyzes the state of the audience listening to utterances from the partner and the robot to detect a recognized state of the audience. A scenario describing the dialogue is stored in entries of a memory. The memory is successively referenced entry by entry, and a check is made for a match between an utterance by the partner or the robot and a reaction from the audience. Responsive to a currently detected audience state, a corresponding robot behavior is determined. Preferably, possible partner behaviors and expected audience states are mapped in a database to specified robot behaviors. The database is searched for a specified robot behavior corresponding to a currently sensed partner behavior or a currently sensed audience state.
    Type: Application
    Filed: March 11, 2009
    Publication date: July 16, 2009
    Inventor: Shinichi Oonaka
  • Publication number: 20090177321
    Abstract: For a joint performance of a dialogue between a human partner and a robot, the robot analyzes the phrases and actions of the partner to detect a recognized behavior of the partner, and analyzes the state of the audience listening to utterances from the partner and the robot to detect a recognized state of the audience. A scenario describing the dialogue is stored in entries of a memory. The memory is successively referenced entry by entry, and a check is made for a match between an utterance by the partner or the robot and a reaction from the audience. Responsive to a currently detected audience state, a corresponding robot behavior is determined. Preferably, possible partner behaviors and expected audience states are mapped in a database to specified robot behaviors. The database is searched for a specified robot behavior corresponding to a currently sensed partner behavior or a currently sensed audience state.
    Type: Application
    Filed: March 11, 2009
    Publication date: July 9, 2009
    Inventor: Shinichi Oonaka
  • Patent number: 7526363
    Abstract: For a joint performance of a dialogue between a human partner and a robot, the robot analyzes the phrases and actions of the partner to detect a recognized behavior of the partner, and analyzes the state of the audience listening to utterances from the partner and the robot to detect a recognized state of the audience. A scenario describing the dialogue is stored in entries of a memory. The memory is successively referenced entry by entry, and a check is made for a match between an utterance by the partner or the robot and a reaction from the audience. Responsive to a currently detected audience state, a corresponding robot behavior is determined. Preferably, possible partner behaviors and expected audience states are mapped in a database to specified robot behaviors. The database is searched for a specified robot behavior corresponding to a currently sensed partner behavior or a currently sensed audience state.
    Type: Grant
    Filed: April 28, 2005
    Date of Patent: April 28, 2009
    Assignee: NEC Corporation
    Inventor: Shinichi Oonaka
  • Publication number: 20050246063
    Abstract: For a joint performance of a dialogue between a human partner and a robot, the robot analyzes the phrases and actions of the partner to detect a recognized behavior of the partner, and analyzes the state of the audience listening to utterances from the partner and the robot to detect a recognized state of the audience. A scenario describing the dialogue is stored in entries of a memory. The memory is successively referenced entry by entry, and a check is made for a match between an utterance by the partner or the robot and a reaction from the audience. Responsive to a currently detected audience state, a corresponding robot behavior is determined. Preferably, possible partner behaviors and expected audience states are mapped in a database to specified robot behaviors. The database is searched for a specified robot behavior corresponding to a currently sensed partner behavior or a currently sensed audience state.
    Type: Application
    Filed: April 28, 2005
    Publication date: November 3, 2005
    Inventor: Shinichi Oonaka
  • Publication number: 20050215171
    Abstract: A child-care robot for use in a nursery school associates child behavior patterns with corresponding robot action patterns, and acquires a child behavior pattern when a child behaves in a specified pattern. The robot selects the robot action pattern that is associated with the acquired child behavior pattern, and performs the selected robot action pattern. Additionally, the robot associates child identifiers with parent identifiers, and receives from a remote terminal an inquiry message indicating a parent identifier. The robot detects the child identifier that is associated with the parent identifier of the inquiry message, acquires an image or a voice of the child identified by the detected child identifier, and transmits the acquired image or voice to the remote terminal. The robot further moves in search of a child, measures the temperature of the child, and associates the temperature with the time of day at which it was measured.
    Type: Application
    Filed: March 25, 2005
    Publication date: September 29, 2005
    Inventor: Shinichi Oonaka
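The dialogue-robot abstracts above describe a database that maps possible partner behaviors and expected audience states to specified robot behaviors, which the robot searches against currently sensed inputs. A minimal sketch of that lookup is below; all behavior names and the fallback behavior are hypothetical illustrations, not taken from the patents themselves.

```python
# Hypothetical sketch of the mapping described in the dialogue-robot
# abstracts: (partner behavior, audience state) pairs map to specified
# robot behaviors, and the mapping is searched at run time.

# (sensed partner behavior, sensed audience state) -> robot behavior
BEHAVIOR_DB = {
    ("tells_joke", "laughing"): "bow_and_thank",
    ("tells_joke", "silent"): "deliver_followup_line",
    ("asks_question", "attentive"): "answer_question",
    ("pauses", "restless"): "advance_to_next_scenario_entry",
}

def select_robot_behavior(partner_behavior, audience_state, default="wait"):
    """Search the database for the robot behavior corresponding to the
    currently sensed partner behavior and audience state; fall back to a
    neutral behavior when no entry matches."""
    return BEHAVIOR_DB.get((partner_behavior, audience_state), default)

print(select_robot_behavior("tells_joke", "silent"))
print(select_robot_behavior("sings", "silent"))
```

The same table-driven pattern covers the child-care robot's mechanism as well, with child behavior patterns as keys and robot action patterns as values.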