Patents by Inventor Makoto Yoshimoto

Makoto Yoshimoto is named as an inventor on the patent filings listed below. The listing includes applications still pending before the United States Patent and Trademark Office (USPTO) as well as patents the USPTO has already granted. Brief, illustrative code sketches of several of the software-related inventions appear after the listing.

  • Publication number: 20230372232
    Abstract: An oral care composition is provided. The composition contains a sweet potato-derived potato syrup or a supernatant of the potato syrup.
    Type: Application
    Filed: October 4, 2021
    Publication date: November 23, 2023
    Inventors: Osamu YAMAKAWA, Makoto YOSHIMOTO, Tadashi MIURA, Megumi KIDO, Yoshiaki HARA
  • Patent number: 10860826
    Abstract: An information processing apparatus (2000) includes a conversion unit (2020) and a computation unit (2040). The conversion unit (2020) detects a plurality of markers from a camera image (10). The conversion unit (2020) converts the detected markers into corresponding sub-identifier information. The computation unit (2040) computes an identifier using a plurality of sub-identifiers converted by the conversion unit (2020).
    Type: Grant
    Filed: June 7, 2017
    Date of Patent: December 8, 2020
    Assignee: NEC CORPORATION
    Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida, Hiroki Kosuge, Yoshiaki Aoyagi
  • Patent number: 10713488
    Abstract: An image acquisition unit (2020) acquires a captured image containing an inspection target instrument. An inspection information acquisition unit (2040) acquires inspection information regarding the instrument contained in the captured image; the inspection information indicates an inspection item of the instrument. A first display control unit (2060) displays, on a display device (10), an indication representing the inspection spot corresponding to the inspection item indicated by the inspection information. For example, the first display control unit (2060) displays the indication so that it is superimposed on the inspection spot, or displays it at the inspection spot or near the instrument, on the display device (10).
    Type: Grant
    Filed: September 7, 2016
    Date of Patent: July 14, 2020
    Assignee: NEC CORPORATION
    Inventors: Yoshinori Saida, Shin Norieda, Makoto Yoshimoto, Kota Iwamoto, Takami Sato, Ruihan Bao
  • Patent number: 10698535
    Abstract: An interface control system (2000) includes a first marker detection unit (2020) and a display control unit (2040). The first marker detection unit (2020) detects a first marker (50) attached to the body of a user, based on a captured image obtained from a camera (20) worn by the user. The display control unit (2040) displays an operation image on a display screen (32) of a head mounted display (30) worn by the user, based on the first marker detected by the first marker detection unit (2020). The display control unit (2040) displays the operation image so that, from the user's point of view, it appears superimposed on the user's arm.
    Type: Grant
    Filed: April 18, 2016
    Date of Patent: June 30, 2020
    Assignee: NEC CORPORATION
    Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida
  • Publication number: 20190347458
    Abstract: An information processing apparatus (2000) includes a conversion unit (2020) and a computation unit (2040). The conversion unit (2020) detects a plurality of markers from a camera image (10). The conversion unit (2020) converts the detected markers into corresponding sub-identifier information. The computation unit (2040) computes an identifier using a plurality of sub-identifiers converted by the conversion unit (2020).
    Type: Application
    Filed: June 7, 2017
    Publication date: November 14, 2019
    Applicant: NEC Corporation
    Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA, Hiroki KOSUGE, Yoshiaki AOYAGI
  • Patent number: 10372229
    Abstract: An information processing system (3000) includes a marker (3020). The marker (3020) is any part of the body of a user of the information processing system (3000), or is any object attached to the user of the information processing system (3000). An information processing apparatus (2000) includes an operation region extraction unit (2020) and a recognition unit (2040). The operation region extraction unit (2020) extracts an operation region from a captured image on the basis of a position of the marker (3020). The recognition unit (2040) calculates a position or motion of an operation body in the operation region on a captured image. The recognition unit (2040) recognizes an input operation on the basis of the calculated position or motion of the operation body.
    Type: Grant
    Filed: September 20, 2016
    Date of Patent: August 6, 2019
    Assignee: NEC CORPORATION
    Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida, Hiroki Kosuge
  • Publication number: 20190156118
    Abstract: An image acquisition unit (2020) acquires a captured image containing an inspection target instrument. An inspection information acquisition unit (2040) acquires inspection information regarding the instrument contained in the captured image; the inspection information indicates an inspection item of the instrument. A first display control unit (2060) displays, on a display device (10), an indication representing the inspection spot corresponding to the inspection item indicated by the inspection information. For example, the first display control unit (2060) displays the indication so that it is superimposed on the inspection spot, or displays it at the inspection spot or near the instrument, on the display device (10).
    Type: Application
    Filed: September 7, 2016
    Publication date: May 23, 2019
    Applicant: NEC Corporation
    Inventors: Yoshinori SAIDA, Shin NORIEDA, Makoto YOSHIMOTO, Kota IWAMOTO, Takami SATO, Ruihan BAO
  • Patent number: 10296101
    Abstract: A marker (3020) is any part of a user's body or is any mark attached to the user. A sensor (3040) is attached to the user. An operation region calculation unit (2020) calculates an operation region included in a captured image on the basis of a position of the marker (3020) included in the captured image generated by a camera. A recognition unit (2040) calculates a position or motion of an operation body captured in the operation region, and recognizes an input operation on the basis of the calculated position or motion of the operation body. Note that the recognition unit (2040) calculates a position of the operation body captured in the operation region at a timing based on a result of detection by the sensor (3040). The recognition unit (2040) calculates motion of the operation body captured in the operation region in a period including a timing based on a result of detection by the sensor (3040).
    Type: Grant
    Filed: February 8, 2017
    Date of Patent: May 21, 2019
    Assignee: NEC CORPORATION
    Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida, Hiroki Kosuge
  • Patent number: 10234955
    Abstract: An input apparatus (2000) includes a position calculation unit (2020) and an input recognition unit (2040). The position calculation unit (2020) calculates a position of a marker included in a captured image. The marker is attached to the body of a user or is a part of the body of the user. The captured image is generated by a camera at a timing based on a result of detection by a sensor attached to the body of the user. The input recognition unit (2040) recognizes input specifying a location separated from the marker on the captured image on the basis of the calculated position of the marker. The location specified by the input is a location separated from the marker.
    Type: Grant
    Filed: September 20, 2016
    Date of Patent: March 19, 2019
    Assignee: NEC CORPORATION
    Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida
  • Patent number: 10168769
    Abstract: An input apparatus (2000) includes a motion detection unit (2020) and an input recognition unit (2040). The motion detection unit (2020) detects motion of an object by using a captured image including the object. Here, the detected motion of the object is motion of the object in a period defined based on a result of detection by a sensor attached to the body of a user of the input apparatus (2000). The input recognition unit (2040) recognizes input to an information processing apparatus based on the detected motion of the object.
    Type: Grant
    Filed: September 20, 2016
    Date of Patent: January 1, 2019
    Assignee: NEC CORPORATION
    Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida
  • Publication number: 20180267619
    Abstract: A marker (3020) is any part of a user's body or is any mark attached to the user. A sensor (3040) is attached to the user. An operation region calculation unit (2020) calculates an operation region included in a captured image on the basis of a position of the marker (3020) included in the captured image generated by a camera. A recognition unit (2040) calculates a position or motion of an operation body captured in the operation region, and recognizes an input operation on the basis of the calculated position or motion of the operation body. Note that the recognition unit (2040) calculates a position of the operation body captured in the operation region at a timing based on a result of detection by the sensor (3040). The recognition unit (2040) calculates motion of the operation body captured in the operation region in a period including a timing based on a result of detection by the sensor (3040).
    Type: Application
    Filed: February 8, 2017
    Publication date: September 20, 2018
    Applicant: NEC Corporation
    Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA, Hiroki KOSUGE
  • Publication number: 20180260032
    Abstract: An input apparatus (2000) includes a position calculation unit (2020) and an input recognition unit (2040). The position calculation unit (2020) calculates a position of a marker included in a captured image. The marker is attached to the body of a user or is a part of the body of the user. The captured image is generated by a camera at a timing based on a result of detection by a sensor attached to the body of the user. The input recognition unit (2040) recognizes input specifying a location separated from the marker on the captured image on the basis of the calculated position of the marker. The location specified by the input is a location separated from the marker.
    Type: Application
    Filed: September 20, 2016
    Publication date: September 13, 2018
    Applicant: NEC Corporation
    Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA
  • Publication number: 20180260033
    Abstract: An input apparatus (2000) includes a motion detection unit (2020) and an input recognition unit (2040). The motion detection unit (2020) detects motion of an object by using a captured image including the object. Here, the detected motion of the object is motion of the object in a period defined based on a result of detection by a sensor attached to the body of a user of the input apparatus (2000). The input recognition unit (2040) recognizes input to an information processing apparatus based on the detected motion of the object.
    Type: Application
    Filed: September 20, 2016
    Publication date: September 13, 2018
    Applicant: NEC Corporation
    Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA
  • Publication number: 20180253149
    Abstract: An information processing system (3000) includes a marker (3020). The marker (3020) is any part of the body of a user of the information processing system (3000), or is any object attached to the user of the information processing system (3000). An information processing apparatus (2000) includes an operation region extraction unit (2020) and a recognition unit (2040). The operation region extraction unit (2020) extracts an operation region from a captured image on the basis of a position of the marker (3020). The recognition unit (2040) calculates a position or motion of an operation body in the operation region on a captured image. The recognition unit (2040) recognizes an input operation on the basis of the calculated position or motion of the operation body.
    Type: Application
    Filed: September 20, 2016
    Publication date: September 6, 2018
    Applicant: NEC Corporation
    Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA, Hiroki KOSUGE
  • Publication number: 20180150186
    Abstract: An interface control system (2000) includes a first marker detection unit (2020) and a display control unit (2040). The first marker detection unit (2020) detects a first marker (50) attached to the body of a user, based on a captured image obtained from a camera (20) worn by the user. The display control unit (2040) displays an operation image on a display screen (32) of a head mounted display (30) worn by the user, based on the first marker detected by the first marker detection unit (2020). The display control unit (2040) displays the operation image so that, from the user's point of view, it appears superimposed on the user's arm.
    Type: Application
    Filed: April 18, 2016
    Publication date: May 31, 2018
    Applicant: NEC CORPORATION
    Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA
  • Publication number: 20170255838
    Abstract: According to an embodiment, an information processing system includes a display device and an information processing apparatus. The information processing apparatus includes a product recognition unit, a display control unit, and an operation recognition unit. The product recognition unit recognizes a product. The display control unit acquires product information regarding the recognized product and displays an operation screen including the acquired product information on the display device. The operation recognition unit recognizes an input operation with respect to the operation screen on the basis of the position or movement of an operation body in a captured image.
    Type: Application
    Filed: September 27, 2016
    Publication date: September 7, 2017
    Applicant: NEC Corporation
    Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA, Hiroki KOSUGE
  • Patent number: 7550649
    Abstract: An object of the present invention is to provide a model animal of Parkinson's disease comprising an α-synuclein gene introduced therein. The present invention provides a transgenic non-human mammal, or a portion thereof, in which an α-synuclein gene is introduced and expressed in neurons, and in which the number of dopamine-producing neurons in the substantia nigra is significantly decreased compared with that of a wild-type animal.
    Type: Grant
    Filed: October 28, 2004
    Date of Patent: June 23, 2009
    Assignee: Taisho Pharmaceutical Co., Ltd.
    Inventors: Makoto Yoshimoto, Masaki Wakamatsu, Aiko Ishii
  • Publication number: 20070192879
    Abstract: An object of the present invention is to provide a model animal of Parkinson's disease comprising an α-synuclein gene introduced therein. The present invention provides a transgenic non-human mammal, or a portion thereof, in which an α-synuclein gene is introduced and expressed in neurons, and in which the number of dopamine-producing neurons in the substantia nigra is significantly decreased compared with that of a wild-type animal.
    Type: Application
    Filed: October 28, 2004
    Publication date: August 16, 2007
    Inventors: Makoto Yoshimoto, Masaki Wakamatsu, Aiko Ishii
  • Patent number: 6723521
    Abstract: A gene hucep-8 coding for a novel protein HUCEP-8 having sugar-transporting activity can be obtained by cloning from a cDNA library derived from human cerebral cortex.
    Type: Grant
    Filed: February 28, 2001
    Date of Patent: April 20, 2004
    Inventors: Makoto Yoshimoto, Madoka Yazaki, Kayo Matsumoto, Kiyoshi Takayama, Katsuki Tsuritani
  • Patent number: 5523391
    Abstract: DNA fragments encoding tumor cell growth inhibitors and having the nucleotide sequence shown by formula (1) of the patent (sequence formula not reproduced here), wherein X represents TTC TTT CTA or TTC (SEQ ID NO:1 or SEQ ID NO:2). The fragments are produced by preparing a cDNA library from mRNA of an established 3T3 cell-derived cell line, amplifying DNA fragments considered to encode the tumor cell growth inhibitors by PCR, analyzing the nucleotide sequences of these fragments, and determining the nucleotide sequences of the fragments encoding the inhibitors.
    Type: Grant
    Filed: May 25, 1994
    Date of Patent: June 4, 1996
    Assignee: Taisho Pharmaceutical Co., Ltd.
    Inventors: Toshi Komurasaki, Hitoshi Toyoda, Makoto Yoshimoto, Kazunori Hanada
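
The following sketch relates to patent 10860826 / publication 20190347458 (markers converted to sub-identifiers, then combined into an identifier). It is a minimal sketch, not the patented implementation: the marker detector is stubbed out, and the sub-identifier table, left-to-right ordering rule, and concatenation scheme are illustrative assumptions.

```python
# Sketch: combine sub-identifiers decoded from several detected markers
# into a single identifier. Detection is stubbed; the mapping and the
# combination rule are assumptions for illustration only.
from dataclasses import dataclass
from typing import List


@dataclass
class Marker:
    marker_id: int       # marker code detected in the camera image
    x: float             # horizontal position in the image, used here for ordering
    y: float


# Assumed lookup table: each detectable marker code maps to a short sub-identifier.
SUB_IDENTIFIERS = {101: "A7", 102: "3F", 103: "C0"}


def detect_markers(camera_image) -> List[Marker]:
    """Stand-in for the conversion unit's marker detection (2020)."""
    # A real system would run an image-processing detector here.
    return [Marker(102, x=310.0, y=120.0), Marker(101, x=40.0, y=118.0)]


def compute_identifier(markers: List[Marker]) -> str:
    """Stand-in for the computation unit (2040): combine sub-identifiers."""
    ordered = sorted(markers, key=lambda m: m.x)           # assumed: left-to-right order
    subs = [SUB_IDENTIFIERS[m.marker_id] for m in ordered]
    return "-".join(subs)                                   # assumed concatenation rule


if __name__ == "__main__":
    identifier = compute_identifier(detect_markers(camera_image=None))
    print(identifier)  # -> "A7-3F"
```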
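
The next sketch relates to patent 10713488 / publication 20190156118 (overlaying indications of inspection spots for a recognized instrument). The instrument recognizer, the inspection-information table, and the drawing call are placeholders; the patent does not prescribe these data structures.

```python
# Sketch: look up inspection items for a recognized instrument and overlay an
# indication at each inspection spot. All detection and rendering is stubbed.
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class InspectionItem:
    name: str                 # e.g. "check handle position"
    spot: Tuple[int, int]     # assumed: pixel coordinates of the spot in the image


# Assumed inspection-information table keyed by instrument type.
INSPECTION_INFO: Dict[str, List[InspectionItem]] = {
    "valve_unit": [InspectionItem("check handle position", (120, 80)),
                   InspectionItem("check seal for leaks", (150, 210))],
}


def recognize_instrument(captured_image) -> str:
    """Stand-in for image acquisition/recognition; returns an instrument type."""
    return "valve_unit"


def draw_indication(display: str, spot: Tuple[int, int], label: str) -> None:
    """Stand-in for the first display control unit (2060)."""
    print(f"overlay '{label}' at pixel {spot} on {display}")


def show_inspection_spots(captured_image, display: str = "display-10") -> None:
    instrument = recognize_instrument(captured_image)
    for item in INSPECTION_INFO.get(instrument, []):
        # Superimpose the indication on (or near) the corresponding spot.
        draw_indication(display, item.spot, item.name)


if __name__ == "__main__":
    show_inspection_spots(captured_image=None)
```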
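
This sketch relates to patent 10698535 / publication 20180150186 (an operation image drawn on a head mounted display so it appears on the user's arm, anchored to a body-worn marker). The marker detector and HMD rendering are stubs, and the assumption that camera and screen share coordinates, as well as the fixed offset, are illustrative only.

```python
# Sketch: place an operation image on the HMD screen relative to a detected
# body-worn marker. Detection, rendering, and the offset are assumptions.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class DetectedMarker:
    center: Tuple[int, int]   # marker centre in the head-camera image
    angle_deg: float          # marker rotation, used to align the overlay


def detect_first_marker(camera_image) -> Optional[DetectedMarker]:
    """Stand-in for the first marker detection unit (2020)."""
    return DetectedMarker(center=(420, 260), angle_deg=12.0)


def render_on_hmd(screen_pos: Tuple[int, int], angle_deg: float) -> None:
    """Stand-in for drawing the operation image on the HMD screen (32)."""
    print(f"draw operation image at {screen_pos}, rotated {angle_deg:.1f} deg")


def display_operation_image(camera_image) -> None:
    marker = detect_first_marker(camera_image)
    if marker is None:
        return                                   # no marker, nothing to overlay
    # Assumed: the camera image and HMD screen share coordinates, and the
    # operation image sits a fixed offset from the marker along the arm.
    offset = (0, 40)
    screen_pos = (marker.center[0] + offset[0], marker.center[1] + offset[1])
    render_on_hmd(screen_pos, marker.angle_deg)


if __name__ == "__main__":
    display_operation_image(camera_image=None)
```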
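
This sketch relates to patent 10372229 / publication 20180253149 (an operation region extracted around a marker, with input recognized from the position or motion of an operation body in that region). The marker and fingertip locators are stubs, and the region size and tap-versus-swipe rule are assumptions, not details from the patent.

```python
# Sketch: extract a marker-anchored operation region and classify the motion
# of an operation body (a fingertip) inside it across a few frames.
from typing import List, Optional, Tuple

Region = Tuple[int, int, int, int]  # x, y, width, height


def locate_marker(image) -> Tuple[int, int]:
    """Stand-in: position of the marker on the user's body in the image."""
    return (300, 200)


def extract_operation_region(image, half_size: int = 100) -> Region:
    """Operation region extraction unit (2020): a box anchored to the marker."""
    mx, my = locate_marker(image)
    return (mx - half_size, my - half_size, 2 * half_size, 2 * half_size)


def locate_fingertip(image, region: Region) -> Optional[Tuple[int, int]]:
    """Stand-in for finding the operation body inside the region."""
    return (330, 215)


def recognize_input(images: List[object]) -> str:
    """Recognition unit (2040): classify the motion across the frames."""
    positions = []
    for image in images:
        region = extract_operation_region(image)
        tip = locate_fingertip(image, region)
        if tip is not None:
            positions.append(tip)
    if len(positions) < 2:
        return "no-input"
    dx = positions[-1][0] - positions[0][0]
    return "swipe" if abs(dx) > 30 else "tap"   # assumed threshold


if __name__ == "__main__":
    print(recognize_input(images=[None, None, None]))
```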
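
This sketch relates to patent 10296101 / publication 20180267619, where a body-worn sensor decides when the operation body's position is read from the captured images. The sensor event rule, the frame timestamps, and the fingertip locator are fabricated placeholders for illustration only.

```python
# Sketch: sample the operation body's position only at timings derived from a
# worn sensor's detection result (e.g. an acceleration spike, assumed here).
from typing import Dict, List, Optional, Tuple


def sensor_event_times(sensor_samples: List[Tuple[float, float]],
                       threshold: float = 1.5) -> List[float]:
    """Timestamps at which the worn sensor (3040) detects an event (assumed rule)."""
    return [t for t, value in sensor_samples if abs(value) > threshold]


def frame_at(frames: Dict[float, object], t: float) -> object:
    """Pick the captured frame whose timestamp is closest to t."""
    return frames[min(frames, key=lambda ft: abs(ft - t))]


def fingertip_position(frame) -> Optional[Tuple[int, int]]:
    """Stand-in for locating the operation body in the operation region."""
    return (325, 212)


def recognize_inputs(frames: Dict[float, object],
                     sensor_samples: List[Tuple[float, float]]) -> List[Tuple[int, int]]:
    """Recognition unit (2040): read positions only at sensor-based timings."""
    positions = []
    for t in sensor_event_times(sensor_samples):
        tip = fingertip_position(frame_at(frames, t))
        if tip is not None:
            positions.append(tip)
    return positions


if __name__ == "__main__":
    frames = {0.00: None, 0.03: None, 0.06: None}
    samples = [(0.00, 0.1), (0.03, 2.4), (0.06, 0.2)]   # spike at t = 0.03 s
    print(recognize_inputs(frames, samples))            # -> [(325, 212)]
```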
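
This sketch relates to patent 10234955 / publication 20180260032 (input specifying a location separated from a body-worn marker). The marker locator and the fixed offset vector are assumptions; the patent ties the image capture to a sensor detection, which is only noted in a comment here.

```python
# Sketch: recognize an input that points at a location offset from the marker.
# The detection result and the offset are placeholders, not patent values.
from typing import Tuple


def marker_position(captured_image) -> Tuple[int, int]:
    """Position calculation unit (2020): locate the marker in the image."""
    return (280, 190)                     # stubbed detection result


def recognize_pointed_location(captured_image,
                               offset: Tuple[int, int] = (80, -40)) -> Tuple[int, int]:
    """Input recognition unit (2040): the specified location is separated
    from the marker by a (here, assumed constant) offset in image space."""
    mx, my = marker_position(captured_image)
    return (mx + offset[0], my + offset[1])


if __name__ == "__main__":
    # In the described system the image would be captured at a timing based on
    # a body-worn sensor's detection; here we simply pass None.
    print(recognize_pointed_location(captured_image=None))   # -> (360, 150)
```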
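
This sketch relates to patent 10168769 / publication 20180260033 (object motion detected only within a period defined by a body-worn sensor's detection, then mapped to an input). The window length, the trivial left/right gesture rule, and the tracking stub are illustrative assumptions.

```python
# Sketch: keep only the motion samples inside a sensor-defined time window and
# map that motion to an input. Tracking and thresholds are placeholders.
from typing import List, Tuple

TimedPosition = Tuple[float, Tuple[int, int]]   # (timestamp, (x, y))


def track_object(frames: List[Tuple[float, object]]) -> List[TimedPosition]:
    """Stand-in for per-frame object localisation in the captured images."""
    return [(t, (100 + 40 * i, 200)) for i, (t, _frame) in enumerate(frames)]


def motion_in_window(track: List[TimedPosition],
                     sensor_time: float,
                     window: float = 0.5) -> List[TimedPosition]:
    """Motion detection unit (2020): keep samples in the sensor-defined period."""
    return [(t, p) for t, p in track if sensor_time <= t <= sensor_time + window]


def recognize_input(motion: List[TimedPosition]) -> str:
    """Input recognition unit (2040): trivial left/right gesture rule (assumed)."""
    if len(motion) < 2:
        return "none"
    dx = motion[-1][1][0] - motion[0][1][0]
    return "swipe-right" if dx > 0 else "swipe-left"


if __name__ == "__main__":
    frames = [(0.0, None), (0.1, None), (0.2, None), (0.9, None)]
    track = track_object(frames)
    print(recognize_input(motion_in_window(track, sensor_time=0.0)))  # swipe-right
```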
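
The final sketch relates to publication 20170255838 (a recognized product, an operation screen showing its product information, and operations recognized from the operation body's position in a captured image). The product table, screen layout, and hit-testing rule are invented for illustration and are not taken from the publication.

```python
# Sketch: recognize a product, show an operation screen with its information,
# and hit-test the operation body's position against on-screen buttons.
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

Rect = Tuple[int, int, int, int]   # x, y, width, height


@dataclass
class Product:
    name: str
    price: int


PRODUCTS: Dict[str, Product] = {"4901234567894": Product("Green tea 500 ml", 150)}

# Assumed operation-screen layout: button name -> rectangle on the display.
BUTTONS: Dict[str, Rect] = {"add_to_cart": (50, 300, 120, 40),
                            "cancel": (200, 300, 120, 40)}


def recognize_product(captured_image) -> Optional[Product]:
    """Product recognition unit: stubbed barcode/image recognition."""
    return PRODUCTS.get("4901234567894")


def show_operation_screen(product: Product) -> None:
    """Display control unit: render product info plus the buttons (stub)."""
    print(f"screen: {product.name} / {product.price} yen, buttons: {list(BUTTONS)}")


def recognize_operation(fingertip: Tuple[int, int]) -> Optional[str]:
    """Operation recognition unit: hit-test the operation body's position."""
    fx, fy = fingertip
    for name, (x, y, w, h) in BUTTONS.items():
        if x <= fx <= x + w and y <= fy <= y + h:
            return name
    return None


if __name__ == "__main__":
    product = recognize_product(captured_image=None)
    if product is not None:
        show_operation_screen(product)
        print(recognize_operation(fingertip=(90, 320)))   # -> "add_to_cart"
```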