Patents by Inventor Makoto Yoshimoto
Makoto Yoshimoto has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230372232
Abstract: An oral care composition is provided. The composition contains a sweet potato-derived potato syrup or a supernatant of such a syrup.
Type: Application
Filed: October 4, 2021
Publication date: November 23, 2023
Inventors: Osamu YAMAKAWA, Makoto YOSHIMOTO, Tadashi MIURA, Megumi KIDO, Yoshiaki HARA
-
Patent number: 10860826
Abstract: An information processing apparatus (2000) includes a conversion unit (2020) and a computation unit (2040). The conversion unit (2020) detects a plurality of markers from a camera image (10). The conversion unit (2020) converts the detected markers into corresponding sub-identifier information. The computation unit (2040) computes an identifier using a plurality of sub-identifiers converted by the conversion unit (2020).
Type: Grant
Filed: June 7, 2017
Date of Patent: December 8, 2020
Assignee: NEC CORPORATION
Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida, Hiroki Kosuge, Yoshiaki Aoyagi
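The pipeline this abstract describes (detect markers, map each to a sub-identifier, combine sub-identifiers into one identifier) can be sketched as follows. This is an illustrative toy only, not the patented method: the marker labels, the lookup table, and the byte-concatenation combining rule are all invented for the example.

```python
# Hypothetical sketch of the marker -> sub-identifier -> identifier flow.
# The "camera image" is modeled as a list of visible marker labels.

MARKER_TO_SUB_ID = {"marker_a": 0x12, "marker_b": 0x34, "marker_c": 0x56}

def detect_markers(camera_image):
    """Stand-in detector: report which known markers appear in the image."""
    return [m for m in MARKER_TO_SUB_ID if m in camera_image]

def compute_identifier(sub_ids):
    """Combine sub-identifiers into one identifier by byte concatenation."""
    ident = 0
    for sub in sub_ids:
        ident = (ident << 8) | sub
    return ident

markers = detect_markers(["marker_a", "marker_b"])
identifier = compute_identifier([MARKER_TO_SUB_ID[m] for m in markers])
print(hex(identifier))  # -> 0x1234
```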
-
Patent number: 10713488
Abstract: An image acquisition unit (2020) acquires a captured image containing an inspection target instrument. An inspection information acquisition unit (2040) acquires inspection information regarding the instrument contained in the captured image. The inspection information indicates an inspection item of the instrument. A first display control unit (2060) displays, on a display device (10), an indication representing the inspection spot corresponding to the inspection item indicated by the inspection information. For example, the first display control unit (2060) displays the indication so that it is superimposed on the inspection spot on the display device (10), or displays it at the inspection spot or near the instrument.
Type: Grant
Filed: September 7, 2016
Date of Patent: July 14, 2020
Assignee: NEC CORPORATION
Inventors: Yoshinori Saida, Shin Norieda, Makoto Yoshimoto, Kota Iwamoto, Takami Sato, Ruihan Bao
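The display step above (map each inspection item to its spot and superimpose an indication there) can be sketched minimally. The data model, coordinates, and annotation shape here are invented for illustration and are not drawn from the patent.

```python
# Illustrative sketch: turn inspection information (item -> spot
# coordinates in the captured image) into overlay annotations that a
# renderer could superimpose on the inspection spots.

def build_overlay(inspection_info):
    """Return one annotation per inspection item, anchored at its spot."""
    return [
        {"label": item, "x": x, "y": y, "shape": "circle"}
        for item, (x, y) in inspection_info.items()
    ]

info = {"pressure gauge": (120, 80), "valve seal": (300, 210)}
for a in build_overlay(info):
    print(f'{a["label"]} at ({a["x"]}, {a["y"]})')
```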
-
Patent number: 10698535
Abstract: An interface control system (2000) includes a first marker detection unit (2020) and a display control unit (2040). The first marker detection unit (2020) detects a first marker (50) attached to the body of a user, based on a captured image obtained from a camera (20) worn by the user. The display control unit (2040) displays an operation image on a display screen (32) of a head mounted display (30) mounted on the user, based on the detected first marker. The display control unit (2040) displays the operation image so that it appears superimposed on the user's arm when viewed from the user's viewpoint.
Type: Grant
Filed: April 18, 2016
Date of Patent: June 30, 2020
Assignee: NEC CORPORATION
Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida
-
Publication number: 20190347458
Abstract: An information processing apparatus (2000) includes a conversion unit (2020) and a computation unit (2040). The conversion unit (2020) detects a plurality of markers from a camera image (10). The conversion unit (2020) converts the detected markers into corresponding sub-identifier information. The computation unit (2040) computes an identifier using a plurality of sub-identifiers converted by the conversion unit (2020).
Type: Application
Filed: June 7, 2017
Publication date: November 14, 2019
Applicant: NEC Corporation
Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA, Hiroki KOSUGE, Yoshiaki AOYAGI
-
Patent number: 10372229
Abstract: An information processing system (3000) includes a marker (3020). The marker (3020) is any part of the body of a user of the information processing system (3000), or is any object attached to that user. An information processing apparatus (2000) includes an operation region extraction unit (2020) and a recognition unit (2040). The operation region extraction unit (2020) extracts an operation region from a captured image on the basis of a position of the marker (3020). The recognition unit (2040) calculates a position or motion of an operation body in the operation region of the captured image, and recognizes an input operation on the basis of the calculated position or motion.
Type: Grant
Filed: September 20, 2016
Date of Patent: August 6, 2019
Assignee: NEC CORPORATION
Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida, Hiroki Kosuge
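The extraction step above (define an operation region relative to the marker position, then test whether the operation body falls inside it) can be sketched as follows. The fixed region size and its placement relative to the marker are assumptions made for the example, not details from the patent.

```python
# Illustrative sketch: derive an operation region from a marker position
# and check whether an operation body (e.g. a fingertip) lies inside it.

REGION_W, REGION_H = 200, 120  # assumed region size in pixels

def operation_region(marker_x, marker_y):
    """Fixed-size region whose top-left corner sits at the marker."""
    return (marker_x, marker_y, marker_x + REGION_W, marker_y + REGION_H)

def in_region(point, region):
    """True if the point falls within the half-open region bounds."""
    x, y = point
    x0, y0, x1, y1 = region
    return x0 <= x < x1 and y0 <= y < y1

region = operation_region(50, 40)
print(in_region((120, 90), region))  # -> True (fingertip inside region)
```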
-
Publication number: 20190156118
Abstract: An image acquisition unit (2020) acquires a captured image containing an inspection target instrument. An inspection information acquisition unit (2040) acquires inspection information regarding the instrument contained in the captured image. The inspection information indicates an inspection item of the instrument. A first display control unit (2060) displays, on a display device (10), an indication representing the inspection spot corresponding to the inspection item indicated by the inspection information. For example, the first display control unit (2060) displays the indication so that it is superimposed on the inspection spot on the display device (10), or displays it at the inspection spot or near the instrument.
Type: Application
Filed: September 7, 2016
Publication date: May 23, 2019
Applicant: NEC Corporation
Inventors: Yoshinori SAIDA, Shin NORIEDA, Makoto YOSHIMOTO, Kota IWAMOTO, Takami SATO, Ruihan BAO
-
Patent number: 10296101
Abstract: A marker (3020) is any part of a user's body or is any mark attached to the user. A sensor (3040) is attached to the user. An operation region calculation unit (2020) calculates an operation region included in a captured image on the basis of a position of the marker (3020) included in the captured image generated by a camera. A recognition unit (2040) calculates a position or motion of an operation body captured in the operation region, and recognizes an input operation on the basis of the calculated position or motion of the operation body. Note that the recognition unit (2040) calculates a position of the operation body captured in the operation region at a timing based on a result of detection by the sensor (3040). The recognition unit (2040) calculates motion of the operation body captured in the operation region in a period including a timing based on a result of detection by the sensor (3040).
Type: Grant
Filed: February 8, 2017
Date of Patent: May 21, 2019
Assignee: NEC CORPORATION
Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida, Hiroki Kosuge
-
Patent number: 10234955
Abstract: An input apparatus (2000) includes a position calculation unit (2020) and an input recognition unit (2040). The position calculation unit (2020) calculates a position of a marker included in a captured image. The marker is attached to the body of a user or is a part of the body of the user. The captured image is generated by a camera at a timing based on a result of detection by a sensor attached to the body of the user. The input recognition unit (2040) recognizes input specifying a location on the captured image on the basis of the calculated position of the marker. The location specified by the input is a location separated from the marker.
Type: Grant
Filed: September 20, 2016
Date of Patent: March 19, 2019
Assignee: NEC CORPORATION
Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida
-
Patent number: 10168769
Abstract: An input apparatus (2000) includes a motion detection unit (2020) and an input recognition unit (2040). The motion detection unit (2020) detects motion of an object by using a captured image including the object. Here, the detected motion of the object is motion of the object in a period defined based on a result of detection by a sensor attached to the body of a user of the input apparatus (2000). The input recognition unit (2040) recognizes input to an information processing apparatus based on the detected motion of the object.
Type: Grant
Filed: September 20, 2016
Date of Patent: January 1, 2019
Assignee: NEC CORPORATION
Inventors: Shin Norieda, Makoto Yoshimoto, Yoshinori Saida
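The sensor-gated recognition described above (track the object, use a sensor-defined time period to bound the motion, and map that motion to an input) can be sketched as follows. The gesture rule, thresholds, and track format are invented for the example; they are not the patented recognition logic.

```python
# Illustrative sketch: an object's positions are tracked over time as
# (timestamp, x, y) tuples, a body-worn sensor event defines the period
# of interest, and net motion within that period is mapped to an input.

def motion_in_period(track, start, end):
    """Net displacement of the object between timestamps start and end."""
    pts = [(t, x, y) for t, x, y in track if start <= t <= end]
    (_, x0, y0), (_, x1, y1) = pts[0], pts[-1]
    return x1 - x0, y1 - y0

def recognize(track, start, end, threshold=30):
    """Map horizontal displacement in the period to a swipe input."""
    dx, _ = motion_in_period(track, start, end)
    if dx > threshold:
        return "swipe_right"
    if dx < -threshold:
        return "swipe_left"
    return "no_input"

track = [(0, 10, 50), (1, 40, 52), (2, 90, 51), (3, 95, 50)]
print(recognize(track, start=0, end=2))  # -> swipe_right
```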
-
Publication number: 20180267619
Abstract: A marker (3020) is any part of a user's body or is any mark attached to the user. A sensor (3040) is attached to the user. An operation region calculation unit (2020) calculates an operation region included in a captured image on the basis of a position of the marker (3020) included in the captured image generated by a camera. A recognition unit (2040) calculates a position or motion of an operation body captured in the operation region, and recognizes an input operation on the basis of the calculated position or motion of the operation body. Note that the recognition unit (2040) calculates a position of the operation body captured in the operation region at a timing based on a result of detection by the sensor (3040). The recognition unit (2040) calculates motion of the operation body captured in the operation region in a period including a timing based on a result of detection by the sensor (3040).
Type: Application
Filed: February 8, 2017
Publication date: September 20, 2018
Applicant: NEC Corporation
Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA, Hiroki KOSUGE
-
Publication number: 20180260032
Abstract: An input apparatus (2000) includes a position calculation unit (2020) and an input recognition unit (2040). The position calculation unit (2020) calculates a position of a marker included in a captured image. The marker is attached to the body of a user or is a part of the body of the user. The captured image is generated by a camera at a timing based on a result of detection by a sensor attached to the body of the user. The input recognition unit (2040) recognizes input specifying a location on the captured image on the basis of the calculated position of the marker. The location specified by the input is a location separated from the marker.
Type: Application
Filed: September 20, 2016
Publication date: September 13, 2018
Applicant: NEC Corporation
Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA
-
Publication number: 20180260033
Abstract: An input apparatus (2000) includes a motion detection unit (2020) and an input recognition unit (2040). The motion detection unit (2020) detects motion of an object by using a captured image including the object. Here, the detected motion of the object is motion of the object in a period defined based on a result of detection by a sensor attached to the body of a user of the input apparatus (2000). The input recognition unit (2040) recognizes input to an information processing apparatus based on the detected motion of the object.
Type: Application
Filed: September 20, 2016
Publication date: September 13, 2018
Applicant: NEC Corporation
Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA
-
Publication number: 20180253149
Abstract: An information processing system (3000) includes a marker (3020). The marker (3020) is any part of the body of a user of the information processing system (3000), or is any object attached to that user. An information processing apparatus (2000) includes an operation region extraction unit (2020) and a recognition unit (2040). The operation region extraction unit (2020) extracts an operation region from a captured image on the basis of a position of the marker (3020). The recognition unit (2040) calculates a position or motion of an operation body in the operation region of the captured image, and recognizes an input operation on the basis of the calculated position or motion.
Type: Application
Filed: September 20, 2016
Publication date: September 6, 2018
Applicant: NEC Corporation
Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA, Hiroki KOSUGE
-
Publication number: 20180150186
Abstract: An interface control system (2000) includes a first marker detection unit (2020) and a display control unit (2040). The first marker detection unit (2020) detects a first marker (50) attached to the body of a user, based on a captured image obtained from a camera (20) worn by the user. The display control unit (2040) displays an operation image on a display screen (32) of a head mounted display (30) mounted on the user, based on the detected first marker. The display control unit (2040) displays the operation image so that it appears superimposed on the user's arm when viewed from the user's viewpoint.
Type: Application
Filed: April 18, 2016
Publication date: May 31, 2018
Applicant: NEC CORPORATION
Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA
-
Publication number: 20170255838
Abstract: According to an embodiment, an information processing system includes a display device and an information processing apparatus. The information processing apparatus includes a product recognition unit, a display control unit, and an operation recognition unit. The product recognition unit recognizes a product. The display control unit acquires product information regarding the product recognized by the product recognition unit. Further, the display control unit displays an operation screen including the acquired product information on the display device. The operation recognition unit recognizes an input operation with respect to the operation screen on the basis of the position or movement of an operation body in a captured image.
Type: Application
Filed: September 27, 2016
Publication date: September 7, 2017
Applicant: NEC Corporation
Inventors: Shin NORIEDA, Makoto YOSHIMOTO, Yoshinori SAIDA, Hiroki KOSUGE
-
Patent number: 7550649
Abstract: An object of the present invention is to provide a model animal of Parkinson's disease comprising an α-synuclein gene introduced therein. The present invention provides a transgenic non-human mammal or a portion thereof, wherein an α-synuclein gene is introduced and expressed in neurons, and the number of dopamine-producing neurons in the substantia nigra is significantly decreased as compared with that of a wild-type animal.
Type: Grant
Filed: October 28, 2004
Date of Patent: June 23, 2009
Assignee: Taisho Pharmaceutical Co., Ltd.
Inventors: Makoto Yoshimoto, Masaki Wakamatsu, Aiko Ishii
-
Publication number: 20070192879
Abstract: An object of the present invention is to provide a model animal of Parkinson's disease comprising an α-synuclein gene introduced therein. The present invention provides a transgenic non-human mammal or a portion thereof, wherein an α-synuclein gene is introduced and expressed in neurons, and the number of dopamine-producing neurons in the substantia nigra is significantly decreased as compared with that of a wild-type animal.
Type: Application
Filed: October 28, 2004
Publication date: August 16, 2007
Inventors: Makoto Yoshimoto, Masaki Wakamatsu, Aiko Ishii
-
Patent number: 6723521
Abstract: A gene hucep-8 coding for a novel protein HUCEP-8 having sugar-transporting activity can be obtained by cloning from a cDNA library derived from human cerebral cortex.
Type: Grant
Filed: February 28, 2001
Date of Patent: April 20, 2004
Inventors: Makoto Yoshimoto, Madoka Yazaki, Kayo Matsumoto, Kiyoshi Takayama, Katsuki Tsuritani
-
Patent number: 5523391
Abstract: DNA fragments encoding tumor cell growth inhibitors and having a nucleotide sequence shown by formula (1) below, produced by preparing a cDNA library from mRNA of an established 3T3 cell-derived cell line, amplifying DNA fragments considered to encode the tumor cell growth inhibitors by the PCR method, analyzing the nucleotide sequences of these fragments, and determining the nucleotide sequences of the fragments encoding the inhibitors: ##STR1## wherein X represents TTC TTT CTA or TTC (SEQ ID NO:1 or SEQ ID NO:2).
Type: Grant
Filed: May 25, 1994
Date of Patent: June 4, 1996
Assignee: Taisho Pharmaceutical Co., Ltd.
Inventors: Toshi Komurasaki, Hitoshi Toyoda, Makoto Yoshimoto, Kazunori Hanada