Patents by Inventor Shinya Satoh

Shinya Satoh has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20210304731
    Abstract: A control device controls an electronic apparatus that is capable of communicating with an external server and receiving an input of voice information. The control device includes a voice recognition unit and a voice recognition control unit. The voice recognition unit is configured to perform voice recognition on the inputted voice information. The voice recognition control unit is configured to transmit to the server the voice information together with a request that the server perform voice recognition on it, and to determine whether a recognition error has occurred in the recognition result produced by the server. When more recognition errors than a prescribed number have occurred, the voice recognition control unit suspends transmission of recognition requests to the server.
    Type: Application
    Filed: March 19, 2021
    Publication date: September 30, 2021
    Inventors: Shinya Satoh, Kaiko Kuwamura, Hiroshi Wada
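A minimal sketch of the fallback behavior described in the abstract above (publication 20210304731): server-side recognition errors are counted, and once they exceed a prescribed number, further recognition requests to the server are suspended in favor of on-device recognition. All names and the error threshold below are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class RecognitionResult:
    text: str
    is_error: bool = False

class VoiceRecognitionController:
    """Counts server-side recognition errors and suspends cloud requests
    once the count exceeds a prescribed number (hypothetical sketch)."""

    def __init__(self, server_recognize, local_recognize, max_errors=3):
        self.server_recognize = server_recognize  # callable: voice_info -> RecognitionResult
        self.local_recognize = local_recognize    # callable: voice_info -> RecognitionResult
        self.max_errors = max_errors              # the "prescribed number" of tolerated errors
        self.error_count = 0
        self.server_suspended = False

    def recognize(self, voice_info):
        if self.server_suspended:
            # Transmission of recognition requests to the server is suspended;
            # use the on-device voice recognition unit instead.
            return self.local_recognize(voice_info)
        result = self.server_recognize(voice_info)  # send voice info plus recognition request
        if result.is_error:
            self.error_count += 1
            if self.error_count > self.max_errors:
                self.server_suspended = True        # too many errors: stop asking the server
            return self.local_recognize(voice_info)
        return result
```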
  • Patent number: 11062701
    Abstract: A response device forms a response sentence that is satisfactory to the user. The response device (1) includes: a related term searching section (13) configured to, in a case where an input sentence contains an unknown word, detect a related term, i.e., a known word that shares at least one word with the unknown word; and a response sentence forming section (16) configured to form a response sentence whose content is related to the related term detected by the related term searching section (13).
    Type: Grant
    Filed: August 29, 2017
    Date of Patent: July 13, 2021
    Assignee: SHARP KABUSHIKI KAISHA
    Inventors: Shinya Satoh, Kazunori Morishita, Hiroyasu Igami, Naoki Esumi
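The related-term lookup described in patent 11062701 above can be illustrated with a short sketch: when the input contains an unknown multi-word term, a known term sharing at least one word with it is located, and the response is built around that known term. The function names, the word-overlap heuristic, and the example data are assumptions for illustration only.

```python
def find_related_term(unknown_word, known_terms):
    # A "related term" here is a known term sharing at least one word
    # with the unknown word (illustrative reading of the abstract).
    unknown_parts = set(unknown_word.lower().split())
    for term in known_terms:
        if unknown_parts & set(term.lower().split()):
            return term
    return None

def form_response(unknown_word, known_facts):
    """known_facts: hypothetical mapping of known term -> a sentence about it."""
    related = find_related_term(unknown_word, known_facts)
    if related is None:
        return "I don't know anything about that yet."
    # Respond with content related to the detected related term instead of giving up.
    return f"I'm not sure about '{unknown_word}', but about '{related}': {known_facts[related]}"

# Hypothetical usage
facts = {"green tea": "Green tea is brewed at around 80 degrees Celsius."}
print(form_response("green tea latte", facts))
```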
  • Publication number: 20200090645
    Abstract: A response device forms a response sentence that is satisfactory to the user. The response device (1) includes: a related term searching section (13) configured to, in a case where an input sentence contains an unknown word, detect a related term, i.e., a known word that shares at least one word with the unknown word; and a response sentence forming section (16) configured to form a response sentence whose content is related to the related term detected by the related term searching section (13).
    Type: Application
    Filed: August 29, 2017
    Publication date: March 19, 2020
    Inventors: Shinya Satoh, Kazunori Morishita, Hiroyasu Igami, Naoki Esumi
  • Publication number: 20190311716
    Abstract: A completion processing section (23) is configured to complete a user's speech when the speech inputted to an interactive device (1) omits a phrase. A speech storing section (25) stores user speech that has no omitted or incorrect phrases in a speech database (50) for use in generating the interactive device's own speech. The user's previous speech data thus stored are put to effective use when the interactive device generates its speech.
    Type: Application
    Filed: August 24, 2017
    Publication date: October 10, 2019
    Applicant: SHARP KABUSHIKI KAISHA
    Inventors: Kazunori Morishita, Shinya Satoh, Hiroyasu Igami, Naoki Esumi
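One plausible reading of the completion-and-storage flow in publication 20190311716 is sketched below: phrases omitted from the current utterance are filled in from earlier speech, and only utterances with no omitted phrases are kept in the speech database for later speech generation. The slot-based model, class name, and field names are assumptions, not the patent's actual design.

```python
class DialogueMemory:
    """Hypothetical sketch: complete omitted phrases from previous speech and
    store only complete utterances for later speech generation."""

    def __init__(self, required_slots=("topic", "predicate")):
        self.required_slots = required_slots
        self.speech_database = []   # plays the role of speech database (50)
        self.previous = {}          # most recent complete utterance, by slot

    def process(self, utterance_slots):
        # utterance_slots: e.g. {"topic": "the weather", "predicate": "is sunny"}
        completed = dict(utterance_slots)
        for slot in self.required_slots:
            if not completed.get(slot):
                # Completion processing: fill the omitted phrase from the
                # user's previous speech.
                completed[slot] = self.previous.get(slot)
        if all(completed.get(s) for s in self.required_slots):
            # Speech storing: keep only utterances with no omitted phrases.
            self.speech_database.append(completed)
            self.previous = completed
        return completed

# Hypothetical usage
memory = DialogueMemory()
memory.process({"topic": "tomorrow's weather", "predicate": "will it rain"})
print(memory.process({"topic": None, "predicate": "what about the day after"}))
```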
  • Patent number: 9824086
    Abstract: A condition determining section (24) determines whether or not two consecutive lines in an image meet a joining condition that is based on a characteristic of a language of a character string, the two consecutive lines being extracted from the character string composed of a plurality of lines. In a case where the joining condition is met, an extracted line joining section (25) and a translation section (26) join and then translate the two consecutive lines.
    Type: Grant
    Filed: August 20, 2014
    Date of Patent: November 21, 2017
    Assignee: SHARP KABUSHIKI KAISHA
    Inventors: Shinya Satoh, Tatsuo Kishimoto, Tadao Nagasawa
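The joining condition of patent 9824086 above depends on a characteristic of the language of the extracted text; a minimal sketch is shown below, using end-of-sentence punctuation as the (assumed) language-specific characteristic. The heuristics, function names, and separator handling are illustrative only.

```python
import re

def should_join(line, language="en"):
    # Illustrative joining condition based on a characteristic of the language.
    if language == "en":
        # English: join unless the first line already ends a sentence.
        return not re.search(r"[.!?:]\s*$", line)
    if language == "ja":
        # Japanese: join unless the line ends with a sentence-final mark.
        return not line.rstrip().endswith(("。", "！", "？"))
    return False

def join_lines(lines, language="en"):
    """Join consecutive extracted lines that meet the condition, before translation."""
    separator = " " if language == "en" else ""   # no space between joined Japanese lines
    joined, buffer = [], lines[0]
    for nxt in lines[1:]:
        if should_join(buffer, language):
            buffer = buffer + separator + nxt
        else:
            joined.append(buffer)
            buffer = nxt
    joined.append(buffer)
    return joined   # each element would then be handed to the translation section

print(join_lines(["The quick brown fox jumps", "over the lazy dog.", "A new sentence."]))
```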
  • Publication number: 20160381439
    Abstract: An indoor environment management apparatus includes a receiver, a display controller, an operation receiver, and a control unit. The receiver is configured to receive a measurement result from a measurement device that measures data concerning an indoor environment. The display controller is configured to display, in an operation terminal of a user, an indoor environment map in which the measurement result received by the receiver is displayed on a floor plan of an indoor space. The operation receiver is configured to receive a user operation with respect to a control device that adjusts the indoor environment from the operation terminal. The control unit is configured to generate control information by which the indoor environment is changed depending on the user operation, and output the generated control information to the control device.
    Type: Application
    Filed: June 17, 2016
    Publication date: December 29, 2016
    Inventors: Shinya Satoh, Kojiro Furuya, Ruiko Kikkawa, Tatsuya Ishikawa, Iyo Kudoh, Naoya Motomura
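As a rough illustration of the apparatus in publication 20160381439 above, the sketch below receives measurements, overlays them on a floor plan for display in the user's terminal, and turns a user operation into control information for a control device such as an air conditioner. Every name, the map representation, and the control-information format are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Measurement:
    room: str
    temperature_c: float
    humidity_pct: float

@dataclass
class IndoorEnvironmentManager:
    """Hypothetical sketch of the receiver, display controller, operation
    receiver, and control unit from the abstract above."""
    floor_plan: dict                            # room -> (x, y) position on the plan
    latest: dict = field(default_factory=dict)

    def receive(self, m: Measurement):
        # Receiver: keep the latest measurement per room.
        self.latest[m.room] = m

    def render_map(self):
        # Display controller: overlay measurements on the floor plan for the
        # user's operation terminal.
        return {room: (self.floor_plan[room], f"{m.temperature_c:.1f}°C / {m.humidity_pct:.0f}%")
                for room, m in self.latest.items()}

    def handle_user_operation(self, room, target_temperature_c):
        # Operation receiver + control unit: convert a user operation into
        # control information and output it to the control device.
        return {"device": f"aircon/{room}", "set_temperature_c": target_temperature_c}

# Hypothetical usage
manager = IndoorEnvironmentManager(floor_plan={"living": (0, 0), "bedroom": (1, 0)})
manager.receive(Measurement("living", 27.5, 60))
print(manager.render_map())
print(manager.handle_user_operation("living", 24.0))
```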
  • Patent number: 9495090
    Abstract: A touch control section (5) includes: an operation accepting section (51) which accepts a touch operation conducted with respect to an icon or the like; and an accepting operation control section (52) which controls the operation accepting section (51) so as to forbid accepting operations when screen scrolling is started. Once screen scrolling has started, the accepting operation control section (52) controls the operation accepting section (51) so as to lift that prohibition before the scrolling automatically stops.
    Type: Grant
    Filed: February 15, 2013
    Date of Patent: November 15, 2016
    Assignee: SHARP KABUSHIKI KAISHA
    Inventors: Kiyofumi Ohtsuka, Hirokazu Ishikawa, Megumi Yokogawa, Shinya Satoh, Tatsuo Kishimoto, Yuhichi Yabuki, Tadao Nagasawa
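A compact sketch of the scroll-time tap handling in patent 9495090 above: tap acceptance is forbidden when a scroll starts and is re-enabled before the scroll stops on its own, here once the scroll speed drops below a threshold. The class name, the speed-based trigger, and the threshold value are assumptions rather than the patented mechanism.

```python
class TouchController:
    """Hypothetical sketch: ignore taps during a fast fling scroll, but accept
    them again shortly before the scroll would stop by itself."""

    REACCEPT_SPEED = 50.0   # px/s below which taps are accepted again (assumed value)

    def __init__(self):
        self.accepting = True

    def on_scroll_started(self):
        # Accepting operation control: forbid accepting touch operations.
        self.accepting = False

    def on_scroll_frame(self, scroll_speed):
        # Lift the prohibition before the scroll automatically stops, so the
        # user can tap an icon as soon as the list is slow enough to read.
        if not self.accepting and scroll_speed < self.REACCEPT_SPEED:
            self.accepting = True

    def on_tap(self, icon):
        if self.accepting:
            return f"opened {icon}"
        return None             # tap ignored while scrolling fast
```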
  • Publication number: 20160321246
    Abstract: A condition determining section (24) determines whether or not two consecutive lines in an image meet a joining condition that is based on a characteristic of a language of a character string, the two consecutive lines being extracted from the character string composed of a plurality of lines. In a case where the joining condition is met, an extracted line joining section (25) and a translation section (26) join and then translate the two consecutive lines.
    Type: Application
    Filed: August 20, 2014
    Publication date: November 3, 2016
    Inventors: Shinya Satoh, Tatsuo Kishimoto, Tadao Nagasawa
  • Publication number: 20140317634
    Abstract: In a case where an application execution procedure consists of a process generation step, in which resources necessary for application execution are allocated, followed by display of an application execution screen in a display panel, an application execution processing section (102) of the present invention carries out the process generation step in the background during application execution in the foreground.
    Type: Application
    Filed: February 15, 2013
    Publication date: October 23, 2014
    Inventors: Hirokazu Ishikawa, Kiyofumi Ohtsuka, Megumi Yokogawa, Shinya Satoh, Yuhichi Yabuki, Tadao Nagasawa
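The background process generation of publication 20140317634 above is sketched below: the resource-allocating step for an application runs in a background thread while something else executes in the foreground, so only displaying the execution screen remains when the application is actually opened. The class, its methods, and the use of a Python thread are assumptions made purely for illustration.

```python
import threading

class AppLauncher:
    """Hypothetical sketch: prewarm an application's resources in the background
    so its execution screen can be shown quickly on launch."""

    def __init__(self, allocate_resources):
        self.allocate_resources = allocate_resources   # callable doing the heavy setup
        self._prewarm = None

    def prewarm(self, app_id):
        # Process generation step carried out in the background while another
        # application runs in the foreground.
        self._prewarm = threading.Thread(target=self.allocate_resources, args=(app_id,))
        self._prewarm.start()

    def launch(self, app_id):
        # On launch, wait briefly for any prewarmed resources, then display
        # the application execution screen in the display panel.
        if self._prewarm is not None:
            self._prewarm.join()
        return f"showing execution screen for {app_id}"

# Hypothetical usage
launcher = AppLauncher(lambda app_id: print(f"allocating resources for {app_id}"))
launcher.prewarm("browser")
print(launcher.launch("browser"))
```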
  • Publication number: 20140298251
    Abstract: A touch control section (5) includes: an operation accepting section (51) which accepts a touch operation conducted with respect to an icon or the like; and an accepting operation control section (52) which controls the operation accepting section (51) so as to forbid accepting operations when screen scrolling is started. Once screen scrolling has started, the accepting operation control section (52) controls the operation accepting section (51) so as to lift that prohibition before the scrolling automatically stops.
    Type: Application
    Filed: February 15, 2013
    Publication date: October 2, 2014
    Inventors: Kiyofumi Ohtsuka, Hirokazu Ishikawa, Megumi Yokogawa, Shinya Satoh, Tatsuo Kishimoto, Yuhichi Yabuki, Tadao Nagasawa