Patents by Inventor Kyohei TOMITA

Kyohei TOMITA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11200506
    Abstract: A method provides information to a user as a function of derived user intent. The method includes receiving input from a user, generating an intent vector by processing the received input through an artificial intelligence model that has been trained with data representative of the user's intention, wherein the intent vector comprises a probability for each intent in a known set of possible intents, executing a trigger control model to determine whether to respond to the user as a function of the input from the user and the intent vector, utilizing the trigger control model, received input, and intent vector input to generate a response via a trained chatbot, and providing the response via an output device.
    Type: Grant
    Filed: December 15, 2017
    Date of Patent: December 14, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Xianchao Wu, Kyohei Tomita, Keizo Fujiwara
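    The pipeline described in the abstract above can be sketched as follows. This is a minimal illustration, not the patented implementation: the intent names, scoring, and threshold are all hypothetical stand-ins for the trained models the patent describes.

    ```python
    from typing import Dict, Optional

    INTENTS = ["greeting", "weather_query", "smalltalk"]

    def intent_vector(user_input: str) -> Dict[str, float]:
        """Stand-in for the trained intent model: score each known intent,
        then normalize so the vector holds a probability per intent."""
        scores = {intent: 1.0 if intent in user_input else 0.1 for intent in INTENTS}
        total = sum(scores.values())
        return {k: v / total for k, v in scores.items()}

    def should_respond(vector: Dict[str, float], threshold: float = 0.5) -> bool:
        """Trigger control: respond only when some intent is confident enough."""
        return max(vector.values()) >= threshold

    def respond(user_input: str) -> Optional[str]:
        """Combine intent vector and trigger control to gate a chatbot reply."""
        vec = intent_vector(user_input)
        if not should_respond(vec):
            return None  # stay silent rather than guess
        top = max(vec, key=vec.get)
        return f"(reply generated for intent: {top})"
    ```

    The point of the trigger-control step is that the system may choose not to answer at all when no intent in the vector is sufficiently probable.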
  • Patent number: 10671181
    Abstract: Receiving user input. A method includes displaying a set of characters arranged sequentially in a curvilinear or linear fashion such that any one of the characters in the set of characters can be identified for selection by continuous and uniform user input from a user. The method further comprises displaying a plurality of the characters in the set of characters in a fashion where each given character in the plurality of characters is displayed at a level of prominence determined by a probability that the given character is a next character in a string of characters selected by a user. The method further includes receiving user input in a continuous and uniform fashion from a user to identify a character in the set of characters. The method further includes receiving user input selecting the identified character. The method further includes adding the identified character to the string of characters.
    Type: Grant
    Filed: April 3, 2017
    Date of Patent: June 2, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Yaming Xu, Kyohei Tomita
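    The prominence idea in the abstract above (each candidate character rendered according to the probability that it comes next) can be sketched with toy probabilities; the level mapping and values here are illustrative, not from the patent.

    ```python
    from typing import Dict

    def prominence_levels(probs: Dict[str, float], levels: int = 3) -> Dict[str, int]:
        """Map next-character probabilities to discrete prominence levels
        (1 = least prominent), scaled relative to the most likely character."""
        if not probs:
            return {}
        top = max(probs.values())
        return {ch: 1 + round((levels - 1) * p / top) for ch, p in probs.items()}

    # Toy language-model output: 'a' is the most likely next character.
    probs = {"a": 0.5, "b": 0.3, "c": 0.1}
    print(prominence_levels(probs))  # → {'a': 3, 'b': 2, 'c': 1}
    ```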
  • Patent number: 10573009
    Abstract: Provided is an in vivo movement tracking apparatus configured to track a portion of interest that moves in vivo, in which accuracy and robustness of tracking are improved. The apparatus is configured to determine an estimated position of an organ in a biological image based on the past movement of the organ and search for contour points corresponding to a plurality of control points, respectively, representing a contour shape of the organ in a region corresponding to the estimated position, to thereby determine an estimated contour of the organ based on the contour points. The in vivo movement tracking apparatus is configured to determine a position of a portion of interest, which moves in association with the organ, based on the estimated contour with reference to previously acquired sample data regarding a positional relationship between a contour of the organ and the portion of interest.
    Type: Grant
    Filed: January 22, 2018
    Date of Patent: February 25, 2020
    Assignees: THE UNIVERSITY OF TOKYO, THE UNIVERSITY OF ELECTRO-COMMUNICATIONS, PUBLIC UNIVERSITY CORPORATION YOKOHAMA CITY UNIVERSITY
    Inventors: Norihiro Koizumi, Atsushi Kayasuga, Kyohei Tomita, Izumu Hosoi, Yu Nishiyama, Hiroyuki Tsukihara, Hideyo Miyazaki, Hiroyuki Fukuda, Kazushi Numata, Kiyoshi Yoshinaka, Takashi Azuma, Naohiko Sugita, Yukio Homma, Yoichiro Matsumoto, Mamoru Mitsuishi
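    The tracking loop in the abstract above, predicting the organ position from past motion, refining contour control points, then placing the portion of interest relative to the contour, can be sketched roughly as below. The motion model, the contour "search" (reduced here to a displacement shift), and the learned offset are simplified stand-ins for the image-based processing the patent describes.

    ```python
    from typing import List, Tuple

    Point = Tuple[float, float]

    def predict_position(history: List[Point]) -> Point:
        """Constant-velocity estimate from the last two observed positions."""
        (x0, y0), (x1, y1) = history[-2], history[-1]
        return (2 * x1 - x0, 2 * y1 - y0)

    def refine_contour(control_points: List[Point], offset: Point) -> List[Point]:
        """Shift each control point by the predicted displacement — a stand-in
        for searching the image for matching contour points."""
        dx, dy = offset
        return [(x + dx, y + dy) for x, y in control_points]

    def portion_of_interest(contour: List[Point], relative: Point) -> Point:
        """Place the tracked portion from the contour centroid plus a
        previously learned relative offset (the 'sample data' in the abstract)."""
        cx = sum(x for x, _ in contour) / len(contour)
        cy = sum(y for _, y in contour) / len(contour)
        return (cx + relative[0], cy + relative[1])
    ```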
  • Patent number: 10535159
    Abstract: An in vivo motion tracking device tracking an in vivo motion that is a tracking target included in an ultrasonic image includes an image acquiring unit that is configured to acquire an ultrasonic image, an advance learning unit that is configured to perform advance learning using the ultrasonic image as learning data, and a tracking unit that is configured to track a position of the tracking target in an ultrasonic image including the tracking target after the advance learning performed by the advance learning unit.
    Type: Grant
    Filed: January 10, 2018
    Date of Patent: January 14, 2020
    Assignees: The University of Electro-Communications, PUBLIC UNIVERSITY CORPORATION YOKOHAMA CITY UNIVERSITY
    Inventors: Norihiro Koizumi, Yu Nishiyama, Ryosuke Kondo, Kyohei Tomita, Fumio Eura, Kazushi Numata
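    The two-phase structure in the abstract above, an advance-learning phase on training images followed by a tracking phase, can be sketched with toy 1-D "images" and simple template matching; the learning and matching steps here are illustrative stand-ins for the patent's actual method.

    ```python
    from typing import List

    def learn_template(frames: List[List[float]], start: int, width: int) -> List[float]:
        """Advance learning: average the target patch over training frames."""
        patches = [f[start:start + width] for f in frames]
        return [sum(col) / len(patches) for col in zip(*patches)]

    def track(frame: List[float], template: List[float]) -> int:
        """Tracking: return the offset in a new frame that minimizes the
        sum-of-squared-differences against the learned template."""
        w = len(template)
        best, best_err = 0, float("inf")
        for i in range(len(frame) - w + 1):
            err = sum((frame[i + j] - template[j]) ** 2 for j in range(w))
            if err < best_err:
                best, best_err = i, err
        return best
    ```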
  • Publication number: 20190188590
    Abstract: A method provides information to a user as a function of derived user intent. The method includes receiving input from a user, generating an intent vector by processing the received input through an artificial intelligence model that has been trained with data representative of the user's intention, wherein the intent vector comprises a probability for each intent in a known set of possible intents, executing a trigger control model to determine whether to respond to the user as a function of the input from the user and the intent vector, utilizing the trigger control model, received input, and intent vector input to generate a response via a trained chatbot, and providing the response via an output device.
    Type: Application
    Filed: December 15, 2017
    Publication date: June 20, 2019
    Inventors: Xianchao Wu, Kyohei Tomita, Keizo Fujiwara
  • Publication number: 20190057517
    Abstract: An in vivo motion tracking device tracking an in vivo motion that is a tracking target included in an ultrasonic image includes an image acquiring unit that is configured to acquire an ultrasonic image, an advance learning unit that is configured to perform advance learning using the ultrasonic image as learning data, and a tracking unit that is configured to track a position of the tracking target in an ultrasonic image including the tracking target after the advance learning performed by the advance learning unit.
    Type: Application
    Filed: January 10, 2018
    Publication date: February 21, 2019
    Inventors: Norihiro Koizumi, Yu Nishiyama, Ryosuke Kondo, Kyohei Tomita, Fumio Eura, Kazushi Numata
  • Publication number: 20180285335
    Abstract: Receiving user input. A method includes displaying a set of characters arranged sequentially in a curvilinear or linear fashion such that any one of the characters in the set of characters can be identified for selection by continuous and uniform user input from a user. The method further comprises displaying a plurality of the characters in the set of characters in a fashion where each given character in the plurality of characters is displayed at a level of prominence determined by a probability that the given character is a next character in a string of characters selected by a user. The method further includes receiving user input in a continuous and uniform fashion from a user to identify a character in the set of characters. The method further includes receiving user input selecting the identified character. The method further includes adding the identified character to the string of characters.
    Type: Application
    Filed: April 3, 2017
    Publication date: October 4, 2018
    Inventors: Yaming Xu, Kyohei Tomita
  • Publication number: 20180253855
    Abstract: Provided is an in vivo movement tracking apparatus configured to track a portion of interest that moves in vivo, in which accuracy and robustness of tracking are improved. The apparatus is configured to determine an estimated position of an organ in a biological image based on the past movement of the organ and search for contour points corresponding to a plurality of control points, respectively, representing a contour shape of the organ in a region corresponding to the estimated position, to thereby determine an estimated contour of the organ based on the contour points. The in vivo movement tracking apparatus is configured to determine a position of a portion of interest, which moves in association with the organ, based on the estimated contour with reference to previously acquired sample data regarding a positional relationship between a contour of the organ and the portion of interest.
    Type: Application
    Filed: January 22, 2018
    Publication date: September 6, 2018
    Inventors: Norihiro Koizumi, Atsushi Kayasuga, Kyohei Tomita, Izumu Hosoi, Yu Nishiyama, Hiroyuki Tsukihara, Hideyo Miyazaki, Hiroyuki Fukuda, Kazushi Numata, Kiyoshi Yoshinaka, Takashi Azuma, Naohiko Sugita, Yukio Homma, Yoichiro Matsumoto, Mamoru Mitsuishi