Patents by Inventor Rika Hasegawa

Rika Hasegawa has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 7117190
    Abstract: In a robot apparatus and a control method therefor, firstly, partial or whole state space of a behavioral model is expanded or reduced; secondly, transition to a predetermined node in the behavioral model is described as transition to a virtual node, and the node group allotted to the virtual node is sequentially changed; thirdly, the number of emotions and/or desires used for generating actions is gradually increased; and fourthly, the environment is evaluated and each sensitivity corresponding to each emotion and desire is updated on the basis of the evaluation result. In the robot apparatus and the character discriminating method for the robot apparatus, a pet robot is provided with: detecting means for detecting outputs from other pet robots; character discriminating means for discriminating the characters of those pet robots on the basis of the result detected by the detecting means; and character changing means for changing the character on the basis of the result judged by the character discriminating means.
    Type: Grant
    Filed: November 30, 2000
    Date of Patent: October 3, 2006
    Assignee: Sony Corporation
    Inventors: Kohtaro Sabe, Rika Hasegawa, Makoto Inoue
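A minimal illustrative sketch of the "virtual node" idea from the abstract above (not Sony's implementation; stage names and node groups are invented for illustration): transitions in the behavioral model point at a virtual node, and the concrete node group allotted to that virtual node is swapped as the robot "grows", so the state space expands without rewriting every transition.

```python
class BehavioralModel:
    """Toy behavioral model with one virtual node whose contents grow."""

    def __init__(self):
        # Hypothetical growth stages, each mapping the virtual node to a
        # different concrete node group (the allotted group is swapped,
        # not the transitions that point at the virtual node).
        self.stages = {
            "infant": ["sleep", "cry"],
            "child": ["sleep", "walk", "play"],
        }
        self.stage = "infant"

    def grow(self, stage):
        # Sequentially change the node group allotted to the virtual node.
        self.stage = stage

    def resolve_virtual_node(self):
        # A transition "to the virtual node" lands in the current group.
        return self.stages[self.stage]


model = BehavioralModel()
model.resolve_virtual_node()  # only the infant-stage nodes are reachable
model.grow("child")           # state space expands; transitions are untouched
```

The point of the indirection is that every other node can keep a single transition "to the virtual node" while the reachable behavior set changes underneath it.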
  • Publication number: 20060041332
    Abstract: In a robot apparatus and a control method therefor, firstly, partial or whole state space of a behavioral model is expanded or reduced; secondly, transition to a predetermined node in the behavioral model is described as transition to a virtual node, and the node group allotted to the virtual node is sequentially changed; thirdly, the number of emotions and/or desires used for generating actions is gradually increased; and fourthly, the environment is evaluated and each sensitivity corresponding to each emotion and desire is updated on the basis of the evaluation result. In the robot apparatus and the character discriminating method for the robot apparatus, a pet robot is provided with: detecting means for detecting outputs from other pet robots; character discriminating means for discriminating the characters of those pet robots on the basis of the result detected by the detecting means; and character changing means for changing the character on the basis of the result judged by the character discriminating means.
    Type: Application
    Filed: October 5, 2005
    Publication date: February 23, 2006
    Inventors: Kohtaro Sabe, Rika Hasegawa, Makoto Inoue
  • Patent number: 6754560
    Abstract: A robot is proposed which has a speech recognition unit to detect information supplied simultaneously with, or just before or after, detection of a touch by a touch sensor; an associative memory/recall memory to store the action made in response to the touch and the input information (speech signal) detected by the speech recognition unit in association with each other; and an action generator to control the robot to perform the action recalled by the associative memory/recall memory based on newly acquired input information (speech signal). The robot also has a sensor data processor that allows it to act in response to touch detection by the touch sensor. Thus, the robot can learn an action in association with an input signal such as a speech signal.
    Type: Grant
    Filed: March 14, 2002
    Date of Patent: June 22, 2004
    Assignee: Sony Corporation
    Inventors: Masahiro Fujita, Tsuyoshi Takagi, Rika Hasegawa, Osamu Hanagata, Jun Yokono, Gabriel Costa, Hideki Shimomura
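The associative learning loop this abstract describes can be sketched as follows. This is a hedged illustration, not the patented code: a touch-triggered action is stored against the speech heard around the same time, so that later the speech alone recalls the action. All names here are invented for the example.

```python
class AssociativeMemory:
    """Toy associative memory: speech input -> previously performed action."""

    def __init__(self):
        self.assoc = {}  # speech phrase -> action name

    def store(self, phrase, action):
        # Associate the input information (speech) detected near the touch
        # with the action the robot performed in response to that touch.
        self.assoc[phrase] = action

    def recall(self, phrase):
        # Recall the action from newly acquired speech input alone;
        # None means no association has been learned yet.
        return self.assoc.get(phrase)


memory = AssociativeMemory()
# A touch on the head triggers "sit" while "sit down" is heard:
memory.store("sit down", "sit")
memory.recall("sit down")   # -> "sit": the speech alone evokes the action
memory.recall("roll over")  # -> None: nothing learned for this phrase
```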
  • Patent number: 6650965
    Abstract: A behavior decision system (70) includes a perceptual information acquisition unit (90) which acquires a cause factor being external or internal information acquired by a CCD camera (20), distance sensor (22), microphone (23) or the like and which influences a behavior and a motivational information acquisition unit (81) which acquires an occurrence tendency of a behavior influenced by the cause factor based on the cause factor from the perceptual information acquisition unit (90), a behavior selecting processor (82) which compares occurrence tendencies corresponding to two or more behaviors, acquired by the perceptual information acquisition unit (90) and motivational information acquisition unit (81) and belonging to the same group, to thereby select one of the behaviors, and an output semantics converter module (68) which controls moving parts based on the behavior selected by the behavior selecting processor (82) for expressing the selected behavior.
    Type: Grant
    Filed: March 13, 2002
    Date of Patent: November 18, 2003
    Assignee: Sony Corporation
    Inventors: Tsuyoshi Takagi, Masahiro Fujita, Rika Hasegawa, Kotaro Sabe, Craig Ronald Arkin
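A rough sketch of the selection rule the behavior decision system describes, under the simplifying assumption (mine, not the patent's) that an occurrence tendency is the product of a perceptual cause-factor strength and a motivational drive, and that the strongest tendency within a group wins:

```python
def occurrence_tendency(perception, motivation):
    # Assumed combination rule: the tendency grows with both the external
    # cause factor (perceptual information) and the internal drive
    # (motivational information).
    return perception * motivation


def select_behavior(group):
    """Compare occurrence tendencies of behaviors in the same group
    and select the one with the strongest tendency.

    group: dict mapping behavior name -> (perception, motivation),
           each in [0, 1].
    """
    tendencies = {
        behavior: occurrence_tendency(p, m)
        for behavior, (p, m) in group.items()
    }
    return max(tendencies, key=tendencies.get)


# Example: food is visible and hunger is high, so "eat" wins its group.
group = {"eat": (0.9, 0.8), "drink": (0.4, 0.9), "sleep": (0.1, 0.5)}
select_behavior(group)  # -> "eat"
```

The selected behavior would then be handed to the output side (the semantics converter in the abstract) to drive the moving parts.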
  • Publication number: 20030045203
    Abstract: In a robot apparatus and a control method therefor, firstly, partial or whole state space of a behavioral model is expanded or reduced; secondly, transition to a predetermined node in the behavioral model is described as transition to a virtual node, and the node group allotted to the virtual node is sequentially changed; thirdly, the number of emotions and/or desires used for generating actions is gradually increased; and fourthly, the environment is evaluated and each sensitivity corresponding to each emotion and desire is updated on the basis of the evaluation result. In the robot apparatus and the character discriminating method for the robot apparatus, a pet robot is provided with: detecting means for detecting outputs from other pet robots; character discriminating means for discriminating the characters of those pet robots on the basis of the result detected by the detecting means; and character changing means for changing the character on the basis of the result judged by the character discriminating means.
    Type: Application
    Filed: July 27, 2001
    Publication date: March 6, 2003
    Inventors: Kohtaro Sabe, Rika Hasegawa, Makoto Inoue
  • Patent number: 6490503
    Abstract: There is provided a control apparatus adapted to easily and securely control plural robots in an individual manner. More practically, when one robot unit is to be controlled, a slide button 52 is switched to the module A side; when the other robot unit is to be controlled, the button is switched to the module B side. When a button 51 corresponding to a sound name is operated, the sound for that sound name is output at a sound pitch corresponding to the switching position of the slide button 52, and the robot is controlled by that sound. Each robot takes in only audio signals of the sound pitch assigned to it.
    Type: Grant
    Filed: January 9, 2001
    Date of Patent: December 3, 2002
    Assignee: Sony Corporation
    Inventor: Rika Hasegawa
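The pitch-multiplexing idea above can be sketched in a few lines. This is a hedged illustration under my own assumptions (the pitch values and command names are made up; the patent specifies no concrete frequencies): the slide button selects module A or B, which fixes the sound pitch, and each robot accepts only commands carried at its own pitch.

```python
# Hypothetical pitches for the two slide-button positions (illustrative only).
PITCH = {"A": 440, "B": 880}


class PetRobot:
    """Toy robot that accepts only commands at its assigned pitch."""

    def __init__(self, pitch):
        self.pitch = pitch
        self.last_command = None

    def hear(self, pitch, command):
        # Take in only audio signals of the pitch assigned to this robot;
        # sounds at other pitches are ignored entirely.
        if pitch == self.pitch:
            self.last_command = command


robot_a = PetRobot(PITCH["A"])
robot_b = PetRobot(PITCH["B"])

# Operator slides the button to module A and presses the "sit" sound name:
# both robots hear the sound, but only robot A acts on it.
for robot in (robot_a, robot_b):
    robot.hear(PITCH["A"], "sit")
```

This is why one remote can address several robots individually over a shared audio channel: the pitch acts as an address.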
  • Publication number: 20020158599
    Abstract: A robot (1) is proposed which includes a speech recognition unit (101) to detect information supplied simultaneously with, or just before or after, detection of a touch by a touch sensor; an associative memory/recall memory (104) to store the action made in response to the touch and the input information (speech signal) detected by the speech recognition unit (101) in association with each other; and an action generator (105) to control the robot (1) to perform the action recalled by the associative memory/recall memory (104) based on newly acquired input information (speech signal). The robot (1) also includes a sensor data processor (102) that allows the robot (1) to act in response to touch detection by the touch sensor. Thus, the robot (1) can learn an action in association with an input signal such as a speech signal.
    Type: Application
    Filed: March 14, 2002
    Publication date: October 31, 2002
    Inventors: Masahiro Fujita, Tsuyoshi Takagi, Rika Hasegawa, Osamu Hanagata, Jun Yokono, Gabriel Costa, Hideki Shimomura
  • Publication number: 20020156751
    Abstract: A behavior decision system (70) includes a perceptual information acquisition unit (90) which acquires a cause factor being external or internal information acquired by a CCD camera (20), distance sensor (22), microphone (23) or the like and which influences a behavior and a motivational information acquisition unit (81) which acquires an occurrence tendency of a behavior influenced by the cause factor based on the cause factor from the perceptual information acquisition unit (90), a behavior selecting processor (82) which compares occurrence tendencies corresponding to two or more behaviors, acquired by the perceptual information acquisition unit (90) and motivational information acquisition unit (81) and belonging to the same group, to thereby select one of the behaviors, and an output semantics converter module (68) which controls moving parts based on the behavior selected by the behavior selecting processor (82) for expressing the selected behavior.
    Type: Application
    Filed: March 13, 2002
    Publication date: October 24, 2002
    Inventors: Tsuyoshi Takagi, Masahiro Fujita, Rika Hasegawa, Kotaro Sabe, Craig Ronald Arkin
  • Patent number: 6385506
    Abstract: A motion deciding means is provided to decide a motion based on information transmitted from another robot apparatus, so that it is possible to realize a robot apparatus which operates in cooperation with other robot apparatuses regardless of an operator's instructions. Thus, such robot apparatuses can constitute a group of robot apparatuses that move autonomously in cooperation.
    Type: Grant
    Filed: November 22, 2000
    Date of Patent: May 7, 2002
    Assignee: Sony Corporation
    Inventors: Rika Hasegawa, Makoto Inoue
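The cooperative motion rule in this last abstract amounts to mapping a peer's transmitted state to this robot's own motion, with no operator input. A minimal sketch under an assumed message protocol (the state names and motion table are mine, not the patent's):

```python
def decide_motion(received_state):
    """Choose this robot's motion from information transmitted by a peer.

    received_state: a state string broadcast by another robot apparatus
    (protocol assumed for illustration). Unknown states fall back to "idle".
    """
    cooperation_table = {
        "leader_walking": "follow",  # move in formation with the leader
        "leader_stopped": "stop",    # hold position alongside it
    }
    return cooperation_table.get(received_state, "idle")


decide_motion("leader_walking")  # -> "follow", chosen with no operator input
decide_motion("unknown")         # -> "idle"
```

A group of robots each running such a rule moves in cooperation autonomously, which is the behavior the abstract claims.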