Patents by Inventor Yoshiaki Sakagami
Yoshiaki Sakagami has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 7467026
Abstract: An autonomous robot is controlled by the local robot information controller, which is connected to a robot application network to which the transceiver that communicates with the autonomous robot is attached. The robot application network, a user LAN adaptive controller, an information distribution manager, and the third-party information provider subsystem are linked over a public network. The information distribution manager acquires information from the third-party information provider subsystem on a schedule set by the user LAN adaptive controller. The local robot information controller receives the information from the information distribution manager and converts it into data that generates robot gestures. The robot performs actions in accordance with the gesture data received from the local robot information controller.
Type: Grant
Filed: August 13, 2004
Date of Patent: December 16, 2008
Assignee: Honda Motor Co., Ltd.
Inventors: Yoshiaki Sakagami, Shinichi Matsunaga, Naoaki Sumida
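The conversion step described in the abstract — distributed information arriving at the local controller and being turned into gesture data for the robot — can be pictured with a minimal sketch. All names, the mapping table, and the data shapes below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the local robot information controller: it takes
# an information item delivered by the distribution manager and converts
# it into gesture commands the robot can perform. The category-to-gesture
# mapping is an invented example.

GESTURE_MAP = {
    "greeting": ["raise_right_arm", "wave_hand"],
    "news": ["tilt_head", "point_forward"],
    "weather": ["look_up", "spread_arms"],
}

def to_gesture_data(item):
    """Convert a distributed information item into gesture data."""
    category = item.get("category", "news")
    return {
        "text": item.get("text", ""),
        "gestures": GESTURE_MAP.get(category, ["tilt_head"]),
    }

item = {"category": "weather", "text": "Sunny this afternoon."}
print(to_gesture_data(item))
```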
-
Patent number: 7373218
Abstract: An information distribution system capable of presenting an image photographed by a moving body, such as a robot or vehicle, selected by a user. The information distribution system includes moving bodies, local managers, and a global manager. When a user requests a connection with one of the moving bodies from the global manager through a portable terminal such as a portable phone, a menu generation section of the global manager generates a menu screen and sends it to the portable terminal. The user selects one of the moving bodies through the menu screen, so that the portable terminal of the user connects with the selected moving body. An image photographed by a camera of the moving body is sent to the local manager, and information related to the image is superimposed on the image. Consequently, the superimposed image is sent to the portable terminal.
Type: Grant
Filed: August 30, 2004
Date of Patent: May 13, 2008
Assignee: Honda Motor Co., Ltd.
Inventor: Yoshiaki Sakagami
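The menu-driven selection flow in this abstract — the global manager lists the available moving bodies, the user picks one, and the terminal is connected to it — can be sketched as follows. The body names, IDs, and return shape are all assumptions for illustration.

```python
# Minimal sketch (assumed names) of the global manager's menu flow:
# generate a menu of moving bodies, then connect the user's terminal
# to the one they choose.

MOVING_BODIES = {1: "lobby robot", 2: "garden robot", 3: "patrol vehicle"}

def generate_menu(bodies):
    """Render the menu screen as one line per selectable moving body."""
    return "\n".join(f"{bid}: {name}" for bid, name in sorted(bodies.items()))

def connect(bodies, choice):
    """Connect the portable terminal to the selected moving body."""
    if choice not in bodies:
        raise KeyError(f"no moving body {choice}")
    return {"connected_to": bodies[choice]}

print(generate_menu(MOVING_BODIES))
print(connect(MOVING_BODIES, 2))
```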
-
Patent number: 7340100
Abstract: A posture recognition apparatus recognizes instructions signified by postures of persons present in the surroundings, from images obtained with an image capture device. The posture recognition apparatus includes an outline extraction device that extracts an outline of a body which is a candidate for a person from the images; a distance calculation device that calculates a distance to the body being the candidate, from distance information of each pixel within the outline in the image; a search device that searches for a candidate for a hand of a person based on the outline and the distance to the body represented by the outline; and a posture determination device that determines an instruction corresponding to the relative position of the candidate for a hand and the outline, and outputs this determination result as a posture determination result.
Type: Grant
Filed: August 7, 2003
Date of Patent: March 4, 2008
Assignee: Honda Giken Kogyo Kabushiki Kaisha
Inventors: Nobuo Higaki, Yoshiaki Sakagami, Naoaki Sumida
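The final stage of the pipeline — classifying an instruction from the relative position of the hand candidate and the body outline — can be sketched in a few lines. The bounding-box representation, thresholds, and posture labels here are illustrative assumptions, not the patent's actual method.

```python
# Hedged sketch of the posture-determination step: given the body
# outline's bounding box and a hand-candidate position, classify the
# instruction from their relative position. Labels are invented examples.

def determine_posture(body_box, hand):
    """body_box = (x_min, y_min, x_max, y_max); hand = (x, y).
    Image origin is top-left, so smaller y means higher in the image."""
    x_min, y_min, x_max, y_max = body_box
    hx, hy = hand
    if hy < y_min:                       # hand raised above the head
        return "hands up"
    if hx < x_min or hx > x_max:         # hand extended to the side
        return "pointing sideways"
    return "neutral"

print(determine_posture((40, 20, 80, 200), (60, 10)))   # "hands up"
print(determine_posture((40, 20, 80, 200), (100, 90)))  # "pointing sideways"
```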
-
Publication number: 20070296950
Abstract: An optical device ensuring projection of light over a wide range as well as reduction in size, and a mobile apparatus mounted with the optical device, are provided. The optical device (100) includes a light-projecting unit (110) and a light-receiving unit (120). The light-projecting unit (110) has a projector (114) and a lenticular sheet (112) arranged in layers. First and second cylindrical lens arrays having their generatrices orthogonal to each other are formed on the respective surfaces of the sheet (112). The light-receiving unit (120) has a light-receiver (124). The light-projecting unit (110) and the light-receiving unit (120) are arranged adjacent to each other in an integrated manner so that the light-receiver (124) can sense the light emitted from the projector (114) via the sheet (112) and then reflected from an object.
Type: Application
Filed: June 7, 2007
Publication date: December 27, 2007
Inventors: Yoshiaki Sakagami, Yoichi Nishimura, Kenji Kadowaki
-
Patent number: 7239339
Abstract: A position detection apparatus that enables easy detection of self-position during autonomous movement by a humanoid robot that moves on legs, or by an automobile.
Type: Grant
Filed: May 26, 2001
Date of Patent: July 3, 2007
Assignee: Honda Giken Kogyo Kabushiki Kaisha
Inventors: Takaaki Nagai, Shinichi Matsunaga, Yoshiaki Sakagami
-
Publication number: 20060224301
Abstract: In a communication system between vehicles of the present invention, each of the vehicles is equipped with an image taking device for taking an image around the vehicle; a moving body detection unit for detecting a moving body from the image taken by the image taking device; a display unit for displaying the image; an image data generation unit for generating image data output to the display unit of each vehicle; and a transmitting/receiving device for transmitting and receiving data, wherein the image data generation unit generates image data for displaying the existence of a moving body, of which an image is taken by the image taking device of one of the vehicles, and outputs the image data to the display unit of the other vehicle.
Type: Application
Filed: March 2, 2006
Publication date: October 5, 2006
Inventors: Yoshiaki Sakagami, Takamichi Shimada
-
Publication number: 20050065652
Abstract: An autonomous robot is controlled by the local robot information controller, which is connected to a robot application network to which the transceiver that communicates with the autonomous robot is attached. The robot application network, a user LAN adaptive controller, an information distribution manager, and the third-party information provider subsystem are linked over a public network. The information distribution manager acquires information from the third-party information provider subsystem on a schedule set by the user LAN adaptive controller. The local robot information controller receives the information from the information distribution manager and converts it into data that generates robot gestures. The robot performs actions in accordance with the gesture data received from the local robot information controller.
Type: Application
Filed: August 13, 2004
Publication date: March 24, 2005
Inventors: Yoshiaki Sakagami, Shinichi Matsunaga, Naoaki Sumida
-
Publication number: 20050057689
Abstract: Disclosed is an information distribution system capable of presenting an image photographed by a moving body, such as a robot or vehicle, selected by a user. The information distribution system includes moving bodies, local managers, and a global manager. When a user requests a connection with one of the moving bodies from the global manager through a portable terminal such as a portable phone, a menu generation section of the global manager generates a menu screen and sends it to the portable terminal. The user selects one of the moving bodies through the menu screen, so that the portable terminal of the user connects with the selected moving body. An image photographed by a camera of the moving body is sent to the local manager, and information related to the image is superimposed on the image. Consequently, the superimposed image is sent to the portable terminal.
Type: Application
Filed: August 30, 2004
Publication date: March 17, 2005
Inventor: Yoshiaki Sakagami
-
Publication number: 20050054332
Abstract: In an event site or the like, a visitor may waste time and effort trying to find a spot where an event of interest may be taking place because the visitor is unable to look through the entire site from any particular spot. An information gathering robot roams in such an event site, typically along a prescribed route, and notes spots of interest to transmit this information to a data server. The visitor can access the data server to find a spot of interest of his or her choice substantially on a real time basis.
Type: Application
Filed: August 11, 2004
Publication date: March 10, 2005
Inventors: Yoshiaki Sakagami, Yoko Saito, Koji Kawabe, Takamichi Shimada, Nobuo Higaki
-
Publication number: 20050041839
Abstract: A mobile robot that is fitted with a camera and can be controlled from a remote terminal such as a mobile phone moves about in an event site to detect and track a moving object such as a visitor or entertainer. Because the camera along with the robot itself can change its position freely, autonomously and/or according to a command from the mobile terminal, a desired frame layout can be accomplished by moving the position of the robot. Therefore, the operator is not required to execute a complex image trimming process or other adjustment of the obtained picture image, so that a desired picture can be obtained quickly and without any difficulty. If the user is allowed to access the managing server, the user can download the desired picture images and have them printed out at will. Also, because the selected picture images can be transmitted to the managing server, the robot is prevented from running out of memory for storing the picture images.
Type: Application
Filed: August 10, 2004
Publication date: February 24, 2005
Inventors: Youko Saitou, Koji Kawabe, Yoshiaki Sakagami, Tomonobu Gotou, Takamichi Shimada
-
Patent number: 6853880
Abstract: An autonomous action robot which can turn its line of sight to face a person who calls out, can recognize the face of a person, and can perform various actions in response to commands. First, a sound emitted from a person or other sound source is detected by a sound detector, and the direction of the sound source is specified based on the detected sound. Then, a robot head section is controlled, and the imaging direction of the robot head section is moved to face the specified direction of the sound source. Next, an image is captured in the direction of the sound source, and a target image of a specific shape is extracted from the captured image. Then, the imaging direction of the robot head section is controlled and moved to face in the direction of the extracted target image.
Type: Grant
Filed: August 22, 2002
Date of Patent: February 8, 2005
Assignee: Honda Giken Kogyo Kabushiki Kaisha
Inventors: Yoshiaki Sakagami, Nobuo Higaki, Naoaki Sumida, Yuichi Yoshida
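The first step of the sequence — specifying the sound-source direction from the detected sound — is commonly done from the arrival-time difference between a pair of microphones. The sketch below illustrates that standard geometry; the microphone spacing and the two-microphone setup are assumptions, not details from the patent.

```python
# Sketch (assumed geometry) of estimating a sound-source azimuth from
# the arrival-time difference between two microphones, as a robot might
# do before turning its head toward the speaker.

import math

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degrees C
MIC_SPACING = 0.2       # metres between the two microphones (assumption)

def source_azimuth(time_delta):
    """Azimuth in degrees; positive time_delta means the sound reached
    the right microphone first. The ratio is clamped to [-1, 1] so
    measurement noise cannot push asin out of its domain."""
    ratio = max(-1.0, min(1.0, time_delta * SPEED_OF_SOUND / MIC_SPACING))
    return math.degrees(math.asin(ratio))

# A sound arriving 0.2 ms earlier at the right microphone:
print(round(source_azimuth(0.0002), 1))  # about 20 degrees to the right
```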
-
Publication number: 20040199292
Abstract: An apparatus 1 for controlling a movable robot recognizes the person to be followed from an image taken by a camera C, using an image processing portion 20; controls leg portions R1 of a movable robot A so as to keep a prescribed interval between the movable robot A and the person, by a portion 50 for detecting an action; and notifies the person of the distance between the movable robot A and the person by a voice from a voice outputting portion 62.
Type: Application
Filed: April 1, 2004
Publication date: October 7, 2004
Inventors: Yoshiaki Sakagami, Shinichi Matsunaga, Nobuo Higaki, Naoaki Sumida, Takahiro Oohashi
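The keep-a-prescribed-interval behaviour described above can be pictured as a simple proportional controller plus a spoken notice about the interval. The gains, thresholds, and phrases below are illustrative assumptions; the patent does not specify a control law.

```python
# Minimal sketch of follow-at-a-distance control: command forward speed
# proportional to the distance error, and produce the voice notice about
# the interval. All numbers and phrases are invented examples.

TARGET_DISTANCE = 1.5   # metres to keep between robot and person
GAIN = 0.8              # proportional gain (assumption)
MAX_SPEED = 1.0         # m/s speed limit (assumption)

def follow_speed(measured_distance):
    """Positive = move forward (person too far), negative = back off."""
    error = measured_distance - TARGET_DISTANCE
    return max(-MAX_SPEED, min(MAX_SPEED, GAIN * error))

def distance_notice(measured_distance):
    """Spoken notice about the interval, as the abstract describes."""
    if measured_distance > TARGET_DISTANCE + 0.5:
        return "Please wait, you are too far ahead."
    if measured_distance < TARGET_DISTANCE - 0.5:
        return "You are very close to me."
    return "I am following you."

print(follow_speed(3.0), distance_notice(3.0))
```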
-
Publication number: 20040190753
Abstract: In an image transmission system for a mobile robot that can move about and look for persons such as children separated from their parents in places where a large number of people congregate, a human is detected from the captured image and/or sound. An image of the detected human is cut out from a captured image, and the cut-out image is transmitted to a remote terminal or a large screen. By thus cutting out the image of the detected human, even when the image signal is transmitted to a remote terminal having a small screen, the image of the detected human can be shown in a clearly recognizable manner. Also, when the image is shown on a large screen, the viewer can identify the person even from a great distance. The transmitted image may be accompanied by various pieces of information such as the current location of the robot.
Type: Application
Filed: April 1, 2004
Publication date: September 30, 2004
Applicant: Honda Motor Co., Ltd.
Inventors: Yoshiaki Sakagami, Koji Kawabe, Nobuo Higaki, Naoaki Sumida, Youko Saitou, Tomonobu Gotou
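The cut-out step itself is a plain image crop: keep only the detected person's region of the frame before transmitting it. A minimal sketch, with a nested list standing in for pixel data and an invented bounding-box format:

```python
# Sketch of the cut-out step: crop the detected person's region from a
# captured frame so only that region is transmitted. The frame here is a
# nested list standing in for pixel data; names are assumptions.

def cut_out(frame, box):
    """Crop box = (x, y, width, height) from a row-major frame."""
    x, y, w, h = box
    return [row[x:x + w] for row in frame[y:y + h]]

frame = [[(r, c) for c in range(8)] for r in range(6)]  # 8x6 dummy image
person = cut_out(frame, (2, 1, 3, 4))
print(len(person), len(person[0]))  # 4 rows x 3 columns
```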
-
Publication number: 20040190754
Abstract: In an image transmission system for a mobile robot that can move about and look for persons such as children separated from their parents in places where a large number of people congregate, a face image is cut out from a captured image of a detected human, and the cut-out face image is transmitted to a remote terminal or a large screen. By thus cutting out the image of the face, even when the image signal is transmitted to a remote terminal having a small screen, the face image can be shown in a clearly recognizable manner. Also, when the image is shown on a large screen, the viewer can identify the person even from a great distance. The transmitted image may be accompanied by various pieces of information such as the current location of the robot.
Type: Application
Filed: April 1, 2004
Publication date: September 30, 2004
Applicant: Honda Motor Co., Ltd.
Inventors: Yoshiaki Sakagami, Koji Kawabe, Nobuo Higaki, Naoaki Sumida, Takahiro Oohashi, Youko Saitou
-
Publication number: 20040111273
Abstract: To automate the work of recognizing a guest, checking the appointment, notifying a host of the arrival of the guest, and conducting the guest to a designated place, a robot having a function to autonomously travel is equipped with a camera/microphone for recognizing a guest at least according to image information. The system comprises a management database adapted to communicate with the robot and equipped with an information database for identifying the recognized guest, and identifies the guest according to the information obtained from the camera/microphone and the management database. The robot recognizes the guest from the image thereof, and the recognized guest is identified and verified by comparing the image of the visitor with the information contained in the database, so that the robot can automatically conduct the guest to the designated meeting room according to the information of the appointment stored in the database.
Type: Application
Filed: September 23, 2003
Publication date: June 10, 2004
Inventors: Yoshiaki Sakagami, Shinichi Matsunaga, Nobuo Higaki, Kazunori Kanai, Naoaki Sumida, Takahiro Oohashi, Sachie Hashimoto
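The reception flow — identify the recognized guest against the appointment database, then guide them and notify the host — can be sketched as a lookup. The records and the matching-by-name step are illustrative assumptions (the patent matches on image information, not names).

```python
# Hedged sketch of the reception flow: look the recognized guest up in
# an appointment database and return the guidance action. Records and
# the name-based matching are invented stand-ins for the image-based
# identification described in the abstract.

APPOINTMENTS = [
    {"guest": "A. Tanaka", "host": "K. Suzuki", "room": "Meeting Room 3"},
    {"guest": "B. Satou", "host": "M. Itou", "room": "Meeting Room 1"},
]

def identify_and_guide(recognized_name):
    for record in APPOINTMENTS:
        if record["guest"] == recognized_name:
            return (f"Guiding {record['guest']} to {record['room']}; "
                    f"notifying {record['host']}.")
    return "No appointment found; asking reception staff for help."

print(identify_and_guide("A. Tanaka"))
```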
-
Publication number: 20040028260
Abstract: A posture recognition apparatus recognizes instructions signified by postures of persons present in the surroundings, from images obtained with an image capture device. The posture recognition apparatus includes an outline extraction device that extracts an outline of a body which is a candidate for a person from the images; a distance calculation device that calculates a distance to the body being the candidate, from distance information of each pixel within the outline in the image; a search device that searches for a candidate for a hand of a person based on the outline and the distance to the body represented by the outline; and a posture determination device that determines an instruction corresponding to the relative position of the candidate for a hand and the outline, and outputs this determination result as a posture determination result.
Type: Application
Filed: August 7, 2003
Publication date: February 12, 2004
Applicant: Honda Giken Kogyo Kabushiki Kaisha
Inventors: Nobuo Higaki, Yoshiaki Sakagami, Naoaki Sumida
-
Patent number: 6636827
Abstract: A foreign-matter detector for detecting foreign matter contained in an article P transported between a transmitter (17) and a receiver (18). The detector (1) includes a signal extractor (2) for extracting a signal (e) having a predetermined phase difference relative to a transmitted signal (a) from the transmitter (17), based on a phase of the transmitted signal (a) from the transmitter (17) and the received signal (b) from the receiver (18); a specific value setting unit (25) for setting, through an external input operation, a specific value for the level of the extracted signal (e) which is outputted from the signal extractor (2) and which is attributable to the article (P) free from foreign matter to be detected; and a level adjusting circuit (24) for adjusting the signal extractor (2) or the receiver (18) to cause the level of the extracted signal (e) attributable to the article itself to assume the specific value inputted.
Type: Grant
Filed: April 1, 1999
Date of Patent: October 21, 2003
Assignee: Ishida Co., Ltd.
Inventor: Yoshiaki Sakagami
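Extracting the signal component at a predetermined phase difference relative to the transmitted signal is, in essence, a lock-in-style demodulation: multiply the received signal by a phase-shifted reference and average the product over full periods. The sketch below illustrates that general technique only; it is not the circuit of the patent, and all parameters are assumptions.

```python
# Sketch of phase-selective extraction: correlating the received signal
# with a reference shifted by the chosen phase isolates the component at
# that phase difference. A standard lock-in technique, shown here purely
# as an illustration of the idea in the abstract.

import math

def extract_level(received, freq, sample_rate, phase):
    """Amplitude of the component of `received` at `phase` (radians)
    relative to a cosine reference at `freq`."""
    n = len(received)
    acc = 0.0
    for i in range(n):
        t = i / sample_rate
        acc += received[i] * math.cos(2 * math.pi * freq * t - phase)
    return 2 * acc / n  # factor 2 recovers the amplitude from the mean

# A received signal carrying a 90-degree component of amplitude 3:
fs, f = 1000.0, 50.0
signal = [3.0 * math.cos(2 * math.pi * f * i / fs - math.pi / 2)
          for i in range(1000)]
print(round(extract_level(signal, f, fs, math.pi / 2), 6))  # 3.0
```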
-
Publication number: 20030144820
Abstract: A foreign-matter detector for detecting foreign matter contained in an article P transported between a transmitter (17) and a receiver (18). The detector (1) includes a signal extractor (2) for extracting a signal (e) having a predetermined phase difference relative to a transmitted signal (a) from the transmitter (17), based on a phase of the transmitted signal (a) from the transmitter (17) and the received signal (b) from the receiver (18); a specific value setting unit (25) for setting, through an external input operation, a specific value for the level of the extracted signal (e) which is outputted from the signal extractor (2) and which is attributable to the article (P) free from foreign matter to be detected; and a level adjusting circuit (24) for adjusting the signal extractor (2) or the receiver (18) to cause the level of the extracted signal (e) attributable to the article itself to assume the specific value inputted.
Type: Application
Filed: April 1, 1999
Publication date: July 31, 2003
Inventor: Yoshiaki Sakagami
-
Patent number: 6553331
Abstract: A weight checking apparatus is provided to prevent items that have not been properly checked from slipping through the weight checker. A weight checker includes a weighing conveyor, a loading sensor, a weighing portion, a weighing conveyor weight change monitoring means, an item weight calculation means, and a controller. The loading sensor detects the loading of an item on the weighing conveyor. The weighing portion detects the weight of the weighing conveyor. The controller switches between the weighing conveyor weight change monitoring process and the item weight calculation process based on a detection signal from the loading sensor. The weighing conveyor weight change monitoring process monitors a change in the weight value of the weighing conveyor. The item weight calculation process calculates the weight of the item from the weight value detected by the weighing portion.
Type: Grant
Filed: July 23, 2001
Date of Patent: April 22, 2003
Assignee: Ishida Co., Ltd.
Inventor: Yoshiaki Sakagami
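The controller's switching behaviour — monitor conveyor weight drift while empty, calculate item weight once the loading sensor fires — amounts to a small two-state machine. A minimal sketch, with all names and numbers as illustrative assumptions:

```python
# Minimal two-state sketch of the controller described above: in
# "monitor" mode it tracks drift in the empty conveyor's weight; when
# the loading sensor detects an item it switches to "weigh" mode and
# computes item weight as the reading minus the tracked conveyor weight.

class WeightChecker:
    def __init__(self, empty_conveyor_weight):
        self.empty = empty_conveyor_weight
        self.mode = "monitor"

    def loading_sensor(self, item_present):
        """Switch process based on the loading sensor's detection signal."""
        self.mode = "weigh" if item_present else "monitor"

    def process(self, measured_weight):
        if self.mode == "monitor":
            # Weight change monitoring: absorb empty-conveyor drift.
            self.empty = measured_weight
            return None
        # Item weight calculation: subtract the tracked conveyor weight.
        return measured_weight - self.empty

checker = WeightChecker(12.0)
checker.process(12.5)            # empty-conveyor drift is absorbed
checker.loading_sensor(True)
print(checker.process(262.5))    # 262.5 - 12.5 = 250.0 g item weight
```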
-
Publication number: 20030055532
Abstract: The present invention provides an autonomous action robot which can turn its line of sight to face a person who calls out, can recognize the face of a person, and can perform various actions in response to commands. First, a sound emitted from a person or other sound source is detected by a sound detector, and the direction of the sound source is specified based on the detected sound. Then, a robot head section is controlled, and the imaging direction of the robot head section is moved to face the specified direction of the sound source. Next, an image is captured in the direction of the sound source, and a target image of a specific shape is extracted from the captured image. Then, the imaging direction of the robot head section is controlled and moved to face in the direction of the extracted target image.
Type: Application
Filed: August 22, 2002
Publication date: March 20, 2003
Inventors: Yoshiaki Sakagami, Nobuo Higaki, Naoaki Sumida, Yuichi Yoshida