Patents by Inventor Ikushi Yoda
Ikushi Yoda has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11893161
Abstract: A gesture recognition apparatus by which a gesture for an interface operation is performed includes an image capturer that captures distance image data, a closest point detector that detects a closest point from the imaging apparatus, a gesture measurer that calculates an input switching border for switching ON and OFF of interface input, a gesture recognizer that determines whether the input is ON or OFF, and an interface controller that performs the interface control associated with the gesture if it is determined that the input has been turned ON.
Type: Grant
Filed: April 27, 2021
Date of Patent: February 6, 2024
Assignee: NATIONAL INSTITUTE OF ADVANCED INDUSTRIAL SCIENCE AND TECHNOLOGY
Inventor: Ikushi Yoda
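The ON/OFF scheme the abstract describes can be sketched in a few lines: find the depth-image pixel nearest the camera, derive a switching border from the closest point's observed trajectory, and treat input as ON once the point moves nearer than the border. This is an illustrative reconstruction, not the patented implementation; the midpoint border rule and all function names are assumptions.

```python
import numpy as np

def closest_point(depth):
    """Return (row, col, depth) of the pixel nearest the camera,
    ignoring zero (invalid) depth readings."""
    d = np.where(depth > 0, depth, np.inf)
    idx = np.unravel_index(np.argmin(d), d.shape)
    return idx[0], idx[1], d[idx]

def switching_border(trajectory):
    """Place the ON/OFF border halfway along the observed depth
    trajectory of the closest point (assumed calibration rule)."""
    return (min(trajectory) + max(trajectory)) / 2.0

def is_input_on(current_depth, border):
    """Input is ON when the closest point is nearer than the border."""
    return current_depth < border
```

For a trajectory spanning 600–900 mm the border lands at 750 mm, so a hand pushed to 620 mm registers as ON and one withdrawn to 800 mm as OFF.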
-
Publication number: 20230168745
Abstract: The present invention provides a gesture recognition apparatus in which a gesture intended for an interface operation is performed at the point closest to the camera, allowing the gesture to achieve interface control. A gesture recognition apparatus according to the present invention includes: an image capturer that sequentially captures distance image data taken by an imaging apparatus; a closest point detector that detects, from the distance image data, the point closest to the imaging apparatus; a gesture measurer that calculates an input switching border for switching ON and OFF of interface input, based on the trajectory of the closest point (the user's gesture) detected from a plurality of distance image data items; a gesture recognizer that determines whether the input is ON or OFF in accordance with the input switching border; and an interface controller that performs the interface control associated with the gesture if it is determined that the input has been turned ON.
Type: Application
Filed: April 27, 2021
Publication date: June 1, 2023
Inventor: Ikushi YODA
-
Publication number: 20150255005
Abstract: To provide an action exhibiting apparatus capable of effectively exhibiting an exemplary action to a user. An action evaluating apparatus according to the present invention includes a part coordinate calculating section that calculates part coordinates of a user's body based on image data on the user, a user model generating section that generates a geometric model of the user based on the part coordinates and generates moving image data of an instructor action represented by the geometric model of the user based on an instructor action parameter, an action evaluating section that evaluates an action of the user based on the part coordinates, and an output controlling section that displays the instructor action represented by the geometric model of the user and the action of the user in a superimposed manner and outputs an evaluation result.
Type: Application
Filed: September 9, 2013
Publication date: September 10, 2015
Inventors: Ikushi Yoda, Masaki Onishi, Tetsuo Yukioka, Shoichi Ohta, Shiro Mishima, Jun Oda
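The evaluation step above (comparing the user's part coordinates against the instructor action) can be illustrated with a toy scoring rule: take the mean Euclidean distance between corresponding body-part coordinates and accept when it falls under a tolerance. The tolerance value and the pass/fail labels are assumptions for illustration only.

```python
import math

def pose_error(user_parts, instructor_parts):
    """Mean Euclidean distance between corresponding body-part
    coordinates of the user and the instructor model."""
    dists = [math.dist(u, t) for u, t in zip(user_parts, instructor_parts)]
    return sum(dists) / len(dists)

def evaluate(user_parts, instructor_parts, tolerance=0.1):
    """Toy evaluation rule: 'good' when the mean part error is
    within the tolerance (threshold is an assumption)."""
    err = pose_error(user_parts, instructor_parts)
    return "good" if err <= tolerance else "adjust"
```

A real system would compare full joint trajectories over time rather than a single frame; the per-frame error here is the simplest building block of such a comparison.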
-
Patent number: 8107679
Abstract: The purpose of the present invention is, in a horse race or a motorboat race, to display the progress (trail) of each horse by obtaining analysis data from a plurality of patrol images provided around the race course, acquiring position information of each horse at each moment, and tracking a specific horse by judging similarity between consecutive pictures of said patrol images. The position information analyzing and displaying method of the present invention identifies each horse, boat, or the like in continuously captured race images by similarity analysis, continuously tracks its trail in said racing images, and analyzes its position information using the positional relationship with fixed position information in said images, in order to display the trail of each horse, boat, or the like.
Type: Grant
Filed: September 29, 2006
Date of Patent: January 31, 2012
Assignee: Yamaguchi Cinema Co., Ltd.
Inventor: Ikushi Yoda
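Similarity-based tracking between consecutive frames, as the abstract describes, is commonly done by matching a small template of the tracked subject within a search window around its previous position. The sketch below uses sum-of-absolute-differences as the similarity measure; the patent does not specify this metric, so the choice, the window size, and the function names are illustrative assumptions.

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences: lower means more similar."""
    return np.abs(a.astype(int) - b.astype(int)).sum()

def track(frame, template, prev_pos, search=2):
    """Find the template's new top-left position by minimising SAD
    within a small window around prev_pos (assumed search radius)."""
    h, w = template.shape
    r0, c0 = prev_pos
    best, best_pos = None, prev_pos
    for r in range(max(0, r0 - search), min(frame.shape[0] - h, r0 + search) + 1):
        for c in range(max(0, c0 - search), min(frame.shape[1] - w, c0 + search) + 1):
            score = sad(frame[r:r+h, c:c+w], template)
            if best is None or score < best:
                best, best_pos = score, (r, c)
    return best_pos
```

Repeating this frame after frame yields the trail of positions that the invention then maps to course coordinates via the fixed reference points in each image.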
-
Patent number: 7821531
Abstract: To provide an interface apparatus capable of achieving noncontact and unrestricted arm pointing actions of multiple users in an indoor space, and facilitating the recognition of all typical arm pointing actions in standing, sitting, and lying postures and the operation of indoor units in the indoor space by the arm pointing actions. The interface apparatus includes: image processing means for picking up images of the interior of an indoor space (5) with a plurality of stereo cameras (1-1 to 1-n), and producing a distance image based on the picked-up images within the visual field on a camera-by-camera basis and the coordinate system of the indoor space (5); means for extracting the posture and arm pointing of a user (4) from the distance information from the stereo cameras (1-1 to 1-n); and means for determining, when the arm pointing has been identified, whether the arm pointing is an intended signal, from the direction pointed by the arm and the motion of the arm.
Type: Grant
Filed: December 17, 2003
Date of Patent: October 26, 2010
Assignee: National Institute of Advanced Industrial Science and Technology
Inventors: Ikushi Yoda, Katsuhiko Sakaue
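Determining "the direction pointed by the arm" from 3-D body coordinates can be pictured as extending a ray through two arm landmarks until it meets a room surface. The sketch below intersects a shoulder-to-hand ray with the floor plane; the choice of landmarks and the planar-floor assumption are illustrative, not taken from the patent.

```python
def pointing_target(shoulder, hand, floor_z=0.0):
    """Extend the shoulder-to-hand ray until it meets the floor
    plane z = floor_z; returns the (x, y) location pointed at,
    or None if the arm points upward or parallel to the floor."""
    sx, sy, sz = shoulder
    hx, hy, hz = hand
    dz = hz - sz
    if dz >= 0:          # ray does not descend toward the floor
        return None
    t = (floor_z - sz) / dz
    return (sx + t * (hx - sx), sy + t * (hy - sy))
```

The same ray test against known device positions (walls, appliances) would tell the system which indoor unit the user is addressing.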
-
Patent number: 7680295
Abstract: An interface is provided that corresponds to an individual person without being restricted to a particular place within a room, by performing gesture recognition while identifying the individual. A stereo camera (1) picks up an image of a user (4), and based on the image pickup output, an image processor (2) transmits a color image within a visual field and a distance image to an information integrated recognition device (3). The information integrated recognition device (3) identifies the individual by the face of the user (4), senses the position, and recognizes a significant gesture based on a hand sign of the user (4). The information integrated recognition device (3) executes a command corresponding to the identified user (4) and performs operations of all devices (6) to be operated in the room (such as a TV set, an air conditioner, an electric fan, illumination, acoustic equipment, and window opening/closing).
Type: Grant
Filed: September 17, 2002
Date of Patent: March 16, 2010
Assignee: National Institute of Advanced Industrial Science and Technology
Inventors: Ikushi Yoda, Katsuhiko Sakaue
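The "command corresponding to the identified user" step amounts to a per-user lookup: the same hand sign can drive different devices depending on who performs it. A minimal sketch, with all user names, gesture labels, and device commands being invented placeholders:

```python
# Hypothetical (user, gesture) -> (device, command) registrations.
COMMANDS = {
    ("alice", "open_palm"): ("tv", "power_on"),
    ("alice", "fist"):      ("tv", "power_off"),
    ("bob",   "open_palm"): ("light", "toggle"),
}

def dispatch(user, gesture, table=COMMANDS):
    """Look up the device command registered for this user's gesture;
    unknown (user, gesture) pairs are ignored (None)."""
    return table.get((user, gesture))
```

Keying the table on the identified individual, not the gesture alone, is what lets one room-wide interface behave differently for each occupant.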
-
Publication number: 20090042628
Abstract: The purpose of the present invention is, in a horse race or a motorboat race, to display the progress (trail) of each horse by obtaining analysis data from a plurality of patrol images provided around the race course, acquiring position information of each horse at each moment, and tracking a specific horse by judging similarity between consecutive pictures of said patrol images. The position information analyzing and displaying method of the present invention identifies each horse, boat, or the like in continuously captured race images by similarity analysis, continuously tracks its trail in said racing images, and analyzes its position information using the positional relationship with fixed position information in said images, in order to display the trail of each horse, boat, or the like.
Type: Application
Filed: September 29, 2006
Publication date: February 12, 2009
Applicant: PLUSMIC CORPORATION
Inventor: Ikushi Yoda
-
Patent number: 7460686
Abstract: The present invention provides a more reliable safety monitoring device for a station platform, which reliably detects a person falling from the track-side platform edge onto the railroad track, recognizes at least two persons, and obtains an entire action log thereof. The present invention recognizes a person at the platform edge by distance information and texture information and determines the position on the platform edge. It also reliably detects the case where the person falls onto the railroad track and automatically transmits a stop signal or the like, together with an image from the corresponding camera. Further, the present invention records the entire action of all persons moving on the platform edge.
Type: Grant
Filed: July 24, 2003
Date of Patent: December 2, 2008
Assignee: National Institute of Advanced Industrial Science and Technology
Inventors: Ikushi Yoda, Katsuhiko Sakaue
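A fall event of the kind described above can be caricatured as two geometric conditions on a tracked person's 3-D centroid: it has crossed the platform edge toward the track, and its height has dropped sharply below standing level. The thresholds, coordinate convention, and rule itself are illustrative assumptions, not the patent's actual detection logic.

```python
def fall_detected(centroid_x, centroid_z, edge_x, standing_z, drop=0.8):
    """Flag a fall when the person's centroid has moved past the
    platform edge (x > edge_x, track side) AND its height has dropped
    well below standing level (drop threshold in metres, assumed)."""
    off_platform = centroid_x > edge_x
    collapsed = centroid_z < standing_z - drop
    return off_platform and collapsed
```

In a deployed system this predicate would be evaluated per frame from the stereo distance data, with the positive case triggering the stop signal and camera-image transmission that the abstract describes.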
-
Publication number: 20060182346
Abstract: An interface is provided that corresponds to an individual person without being restricted to a particular place within a room, by performing gesture recognition while identifying the individual. A stereo camera (1) picks up an image of a user (4), and based on the image pickup output, an image processor (2) transmits a color image within a visual field and a distance image to an information integrated recognition device (3). The information integrated recognition device (3) identifies the individual by the face of the user (4), senses the position, and recognizes a significant gesture based on a hand sign of the user (4). The information integrated recognition device (3) executes a command corresponding to the identified user (4) and performs operations of all devices (6) to be operated in the room (such as a TV set, an air conditioner, an electric fan, illumination, acoustic equipment, and window opening/closing).
Type: Application
Filed: September 17, 2002
Publication date: August 17, 2006
Applicant: National Institute of Advanced Industrial Science and Technology
Inventors: Ikushi Yoda, Katsuhiko Sakaue
-
Publication number: 20060168523
Abstract: To provide an interface apparatus capable of achieving noncontact and unrestricted arm pointing actions of multiple users in an indoor space, and facilitating the recognition of all typical arm pointing actions in standing, sitting, and lying postures and the operation of indoor units in the indoor space by the arm pointing actions. The interface apparatus includes: image processing means for picking up images of the interior of an indoor space (5) with a plurality of stereo cameras (1-1 to 1-n), and producing a distance image based on the picked-up images within the visual field on a camera-by-camera basis and the coordinate system of the indoor space (5); means for extracting the posture and arm pointing of a user (4) from the distance information from the stereo cameras (1-1 to 1-n); and means for determining, when the arm pointing has been identified, whether the arm pointing is an intended signal, from the direction pointed by the arm and the motion of the arm.
Type: Application
Filed: December 17, 2003
Publication date: July 27, 2006
Applicant: National Institute of Advanced Industrial Science and Technology
Inventors: Ikushi Yoda, Katsuhiko Sakaue
-
Publication number: 20060056654
Abstract: The present invention provides a more reliable safety monitoring device for a station platform, which reliably detects a person falling from the track-side platform edge onto the railroad track, recognizes at least two persons, and obtains an entire action log thereof. The present invention recognizes a person at the platform edge by distance information and texture information and determines the position on the platform edge. It also reliably detects the case where the person falls onto the railroad track and automatically transmits a stop signal or the like, together with an image from the corresponding camera. Further, the present invention records the entire action of all persons moving on the platform edge.
Type: Application
Filed: July 24, 2003
Publication date: March 16, 2006
Applicant: National Institute of Advanced Industrial Science and Technology
Inventors: Ikushi Yoda, Katsuhiko Sakaue