Patents by Inventor Yudai NAKAMURA

Yudai NAKAMURA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9904371
    Abstract: A hand region is detected in a captured image. For each part of the background area, a light source presence degree, indicating the probability that a light source is present, is determined according to the luminance or color of that part. On the basis of the light source presence degree, a region in which the captured image is affected by a light source is estimated. If the captured image includes a region estimated to be affected by a light source, it is decided whether or not a dropout has occurred in the hand region of the captured image, and an action is determined on the basis of the result of this decision. Gesture determinations can thus be made correctly even when the hand region in the captured image is affected by a light source at the time of gesture manipulation input.
    Type: Grant
    Filed: May 15, 2014
    Date of Patent: February 27, 2018
    Assignee: Mitsubishi Electric Corporation
    Inventors: Nobuhiko Yamagishi, Yudai Nakamura, Tomonori Fukuta
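The pipeline in the abstract above can be sketched roughly as follows. This is a minimal illustration, not the patented method: the luminance-to-degree mapping, all thresholds, and the overlap-based dropout test are assumptions made for the example.

```python
def light_source_presence(background, bright_thresh=200):
    """Per-pixel degree (0..1) that a light source is present,
    scaled from background luminance above a brightness threshold."""
    h, w = len(background), len(background[0])
    degree = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            lum = background[y][x]
            if lum >= bright_thresh:
                degree[y][x] = min(1.0, (lum - bright_thresh) / (255 - bright_thresh))
    return degree

def affected_region(degree, presence_thresh=0.5):
    """Pixels whose presence degree suggests light-source influence."""
    return {(y, x)
            for y, row in enumerate(degree)
            for x, d in enumerate(row)
            if d >= presence_thresh}

def dropout_suspected(hand_region, affected):
    """Suspect a dropout when the detected hand region overlaps
    the light-affected region."""
    return bool(hand_region & affected)
```

Downstream, the action decision would treat a suspected dropout differently from a clean detection, which is what lets gesture determination stay correct under strong light.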
  • Publication number: 20180052521
    Abstract: Provided are a gesture recognition device, a gesture recognition method, and an information processing device that make it possible to quickly recognize a gesture of a user. The gesture recognition device includes: a motion information generator that generates body part motion information by detecting and tracking the body part; a prediction processor that makes a first comparison between the generated body part motion information and previously stored pre-gesture motion model information, and generates a prediction result regarding a pre-gesture motion on the basis of the first comparison; and a recognition processor that makes a second comparison between the generated body part motion information and previously stored gesture model information, and generates a result of recognition of the gesture represented by the motion of the detected body part on the basis of the prediction result and the second comparison.
    Type: Application
    Filed: April 15, 2016
    Publication date: February 22, 2018
    Applicant: MITSUBISHI ELECTRIC CORPORATION
    Inventors: Masashi KAMIYA, Yudai NAKAMURA
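The two-stage flow described above can be sketched like this. The model names, the mean-distance trajectory metric, and the idea of narrowing the second comparison by the prediction are illustrative assumptions; the patent does not specify these details here.

```python
def trajectory_distance(a, b):
    """Mean pointwise distance between two motion trajectories (1-D here)."""
    return sum(abs(p - q) for p, q in zip(a, b)) / min(len(a), len(b))

def predict_pre_gesture(motion, pre_gesture_models, thresh=1.0):
    """First comparison: match early motion against pre-gesture models."""
    best = min(pre_gesture_models,
               key=lambda name: trajectory_distance(motion, pre_gesture_models[name]))
    if trajectory_distance(motion, pre_gesture_models[best]) <= thresh:
        return best
    return None

def recognize(motion, prediction, gesture_models, thresh=1.0):
    """Second comparison, narrowed by the prediction when one is available."""
    if prediction in gesture_models:
        candidates = {prediction: gesture_models[prediction]}
    else:
        candidates = gesture_models
    best = min(candidates, key=lambda name: trajectory_distance(motion, candidates[name]))
    if trajectory_distance(motion, candidates[best]) <= thresh:
        return best
    return None
```

The speed-up in the abstract comes from the prediction step: once the pre-gesture motion is matched, the recognizer can commit to a candidate before the full gesture completes.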
  • Publication number: 20180046254
    Abstract: An information display device includes a display control unit, a gesture detection unit, a gesture identification unit that identifies an operator's gesture based on gesture information and outputs a signal based on the identification result, a distance estimation unit that estimates the distance between the operator and a display unit, and an identification function setting unit. The identification function setting unit sets the gestures identifiable by the gesture identification unit so that fewer gestures are identifiable when the estimated distance exceeds a first set distance than when the estimated distance is less than or equal to the first set distance.
    Type: Application
    Filed: March 14, 2016
    Publication date: February 15, 2018
    Applicant: MITSUBISHI ELECTRIC CORPORATION
    Inventors: Aki TAKAYANAGI, Masashi KAMIYA, Yudai NAKAMURA, Masahiro NAITO
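The identification-function setting above reduces to a simple rule. A minimal sketch, in which the gesture names and the 2-meter set distance are invented for illustration:

```python
# All gestures the identification unit supports.
ALL_GESTURES = ["swipe", "point", "pinch", "rotate"]
# Coarse gestures that remain reliable at long range (assumed subset).
FAR_GESTURES = ["swipe", "point"]

def identifiable_gestures(distance_m, first_set_distance=2.0):
    """Fewer gestures are identifiable beyond the first set distance,
    matching the rule stated in the abstract."""
    if distance_m > first_set_distance:
        return FAR_GESTURES
    return ALL_GESTURES
```

Restricting the set at range is a precision/recall trade: fine-grained gestures are too ambiguous in a distant, low-resolution view, so excluding them avoids misidentification.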
  • Publication number: 20180039852
    Abstract: A luminance variation (dI) for each pixel is calculated (21) using a plurality of captured images obtained under different illumination conditions, and a texture variation (dF) for each pixel is calculated (22) using a plurality of captured images obtained at different time points; a subject region is then extracted (23) based on the luminance variation (dI) and the texture variation (dF). The variation in the texture feature (F) for each pixel between the images is calculated as the texture variation (dF). The subject can thus be extracted with high accuracy even when there are changes in the ambient light or the background.
    Type: Application
    Filed: March 27, 2015
    Publication date: February 8, 2018
    Applicant: MITSUBISHI ELECTRIC CORPORATION
    Inventors: Yudai NAKAMURA, Tomonori FUKUTA, Masashi KAMIYA, Masahiro NAITO
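The extraction rule above can be sketched as a per-pixel AND of the two variations. The local-mean texture feature and both thresholds below are stand-in assumptions; the patent does not define the feature F this way.

```python
def local_mean(img, y, x):
    """3x3 neighborhood mean, clamped at borders; a toy texture feature F."""
    h, w = len(img), len(img[0])
    vals = [img[j][i]
            for j in range(max(0, y - 1), min(h, y + 2))
            for i in range(max(0, x - 1), min(w, x + 2))]
    return sum(vals) / len(vals)

def extract_subject(img_lit, img_unlit, img_t0, img_t1, dI_thresh=30, dF_thresh=5):
    """dI from images under different illumination, dF from images at
    different times; keep pixels where both variations are large."""
    h, w = len(img_lit), len(img_lit[0])
    mask = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dI = abs(img_lit[y][x] - img_unlit[y][x])
            dF = abs(local_mean(img_t1, y, x) - local_mean(img_t0, y, x))
            if dI >= dI_thresh and dF >= dF_thresh:
                mask[y][x] = 1
    return mask
```

Requiring both signals is what buys robustness: ambient-light changes move dI but not dF for the background, while background motion moves dF but not the illumination-driven dI.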
  • Publication number: 20160357690
    Abstract: The device includes a storage section (101) configured to store data, a DMA section (106) configured to read video data from the storage section (101) by specifying an address and to write the data into the storage section (101), an address conversion rule storage section (104) configured to store an address conversion rule for converting the address specified by the DMA section (106), and an address conversion section (105) configured to convert the address specified by the DMA section (106) in accordance with the address conversion rule. The address conversion rule converts addresses of a series of areas to addresses of the video data stored in a plurality of areas in the storage section (101). The address conversion section (105) includes an address conversion done-or-not determination section, which determines whether the address conversion is done or not done by comparing the address with a third area assigned to address conversion.
    Type: Application
    Filed: December 4, 2014
    Publication date: December 8, 2016
    Applicant: MITSUBISHI ELECTRIC CORPORATION
    Inventors: Masahide KOIKE, Kiyoyasu MARUYAMA, Satoshi MICHIHATA, Keiji UEMURA, Yudai NAKAMURA
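The conversion above can be sketched as a window check followed by a remap. The concrete area layout below is invented; only the shape of the logic (contiguous window remapped onto scattered areas, pass-through otherwise) follows the abstract.

```python
# (start, length) of the scattered storage areas holding the video data.
PHYSICAL_AREAS = [(0x1000, 0x100), (0x3000, 0x100)]
# Contiguous window the DMA section addresses (the "third area").
THIRD_AREA = (0x8000, 0x200)

def convert_address(addr):
    """Remap an address inside the third area onto the scattered areas;
    the window check is the done-or-not determination."""
    base, size = THIRD_AREA
    if not (base <= addr < base + size):
        return addr                      # conversion not done: pass through
    offset = addr - base
    for start, length in PHYSICAL_AREAS: # walk areas until the offset fits
        if offset < length:
            return start + offset
        offset -= length
    raise ValueError("offset beyond mapped areas")
```

This lets the DMA section stream what looks like one contiguous buffer while the video data actually lives in non-adjacent regions of storage.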
  • Publication number: 20160209927
    Abstract: A hand region is detected in a captured image. For each part of the background area, a light source presence degree, indicating the probability that a light source is present, is determined according to the luminance or color of that part. On the basis of the light source presence degree, a region in which the captured image is affected by a light source is estimated. If the captured image includes a region estimated to be affected by a light source, it is decided whether or not a dropout has occurred in the hand region of the captured image, and an action is determined on the basis of the result of this decision. Gesture determinations can thus be made correctly even when the hand region in the captured image is affected by a light source at the time of gesture manipulation input.
    Type: Application
    Filed: May 15, 2014
    Publication date: July 21, 2016
    Applicant: Mitsubishi Electric Corporation
    Inventors: Nobuhiko YAMAGISHI, Yudai NAKAMURA, Tomonori FUKUTA
  • Publication number: 20160132124
    Abstract: A hand region (Rh) of an operator is detected from a captured image, the positions of a palm center (Po) and a wrist center (Wo) are determined, and the origin coordinates (Cho) and the direction of a coordinate axis (Chu) of a hand coordinate system are calculated. The shape of the hand in the hand region (Rh) is detected using the hand coordinate system, and a shape feature quantity (D14) of the hand is calculated. Further, a movement feature quantity (D15f) of a finger is calculated based on the shape feature quantity (D14) of the hand. Gesture determination is made based on the calculated feature quantities. Since the gesture determination takes into consideration differences in the angle of the hand placed in the operation region and the direction in which the hand is moved, misrecognition of the operation can be reduced.
    Type: Application
    Filed: April 10, 2014
    Publication date: May 12, 2016
    Applicant: MITSUBISHI ELECTRIC CORPORATION
    Inventors: Yudai NAKAMURA, Nobuhiko YAMAGISHI, Tomonori FUKUTA, Yoshiaki KUSUNOKI
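The hand coordinate system above can be sketched as follows: the origin sits at the palm center and one axis points from the wrist center toward the palm center, so features expressed in this frame do not change when the whole hand is rotated in the image. The origin/axis choice follows the abstract; the 2-D point representation is an assumption for the example.

```python
import math

def hand_coordinate_system(palm_center, wrist_center):
    """Origin at the palm center; unit axis from wrist toward palm."""
    dx = palm_center[0] - wrist_center[0]
    dy = palm_center[1] - wrist_center[1]
    norm = math.hypot(dx, dy)
    return palm_center, (dx / norm, dy / norm)

def to_hand_coords(point, origin, axis):
    """Express an image point in the hand frame: a component along the
    wrist->palm axis and a perpendicular component."""
    vx, vy = point[0] - origin[0], point[1] - origin[1]
    ax, ay = axis
    u = vx * ax + vy * ay    # along the axis
    v = -vx * ay + vy * ax   # perpendicular to the axis
    return u, v
```

For example, a fingertip three units beyond the palm maps to the same hand-frame coordinates whether the hand points up or sideways in the image, which is exactly the angle-invariance the abstract credits for reducing misrecognition.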