Patents by Inventor Yoshihiko ASAO

Yoshihiko ASAO has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11970220
    Abstract: There is provided an electric power steering apparatus in which a motor housing that contains motor constituent members and has a waterproofing function and a control unit case that contains control apparatus constituent members and has a waterproofing function are provided, in which an output axle of a motor extends to the outside of the motor housing while maintaining the waterproofing function, and in which a respiratory apparatus that performs a respiratory action, based on an inner pressure change, is provided in the motor housing.
    Type: Grant
    Filed: May 22, 2018
    Date of Patent: April 30, 2024
    Assignee: Mitsubishi Electric Corporation
    Inventors: Yoshihiko Onishi, Yoshihito Asao
  • Patent number: 11952054
    Abstract: There is provided an electric power steering apparatus in which at least one of a motor housing and a control unit has a respiratory apparatus that performs a respiratory action, based on an inner pressure change in at least one of the motor housing and the control unit.
    Type: Grant
    Filed: May 22, 2018
    Date of Patent: April 9, 2024
    Assignee: Mitsubishi Electric Corporation
    Inventors: Yoshihito Asao, Yoshihiko Onishi, Toyoaki Udo
  • Patent number: 11861307
    Abstract: A request paraphrasing system 120, which allows a dialogue system to flexibly address requests expressed in various manners, includes: a pre-processing unit 130 converting a user input 56 to a word vector sequence; and a neural paraphrasing model 94 trained in advance by machine learning to receive the word vector sequence as an input and to paraphrase the request represented by the word vector sequence into a request having a higher probability of obtaining an answer from a question-answering device 122 than the request before paraphrasing. As pre-processing, whether the user input 56 is a request or not may be determined, and the input may be paraphrased only when it is determined to be a request. Further, a classification model 98 may classify the input request to determine to which request class it belongs, and the classification may be input as one feature to the neural paraphrasing model 94.
    Type: Grant
    Filed: March 5, 2019
    Date of Patent: January 2, 2024
    Assignee: National Institute of Information and Communications Technology
    Inventors: Yoshihiko Asao, Ryu Iida, Canasai Kruengkrai, Noriyuki Abe, Kanako Onishi, Kentaro Torisawa, Yutaka Kidawara
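The pipeline this abstract describes (vectorize the input, decide whether it is a request, and paraphrase only requests, feeding the predicted request class in as an extra feature) can be sketched as below. The toy vectorizer, classifier, and paraphraser are illustrative stand-ins, not the patent's trained neural models, and all function names are assumptions for this sketch.

```python
# Hypothetical sketch of the request-paraphrasing pipeline from the
# abstract above. The "models" here are trivial stand-ins.

def to_word_vectors(text):
    # Stand-in for the pre-processing unit: one toy 2-d vector per word.
    return [(len(w), sum(map(ord, w)) % 10) for w in text.split()]

def classify_request(vectors):
    # Stand-in for the classification model: returns a request class,
    # or None when the input is not judged to be a request.
    if not vectors:
        return None
    return "where-question" if vectors[0][0] >= 5 else "what-question"

def paraphrase(vectors, request_class):
    # Stand-in for the neural paraphrasing model: consumes the word
    # vector sequence plus the class feature, emits a reworded request.
    return f"[{request_class}] paraphrased request ({len(vectors)} tokens)"

def handle_input(text):
    vectors = to_word_vectors(text)
    request_class = classify_request(vectors)
    if request_class is None:  # not a request: pass through unchanged
        return text
    return paraphrase(vectors, request_class)
```

The key structural point matches the abstract: classification happens before paraphrasing, and its output is passed to the paraphraser as a feature rather than discarded.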
  • Publication number: 20210326675
    Abstract: A memory for a question-answering device that reduces influence of noise on answer generation and is capable of generating highly accurate answers includes: a memory configured to normalize vector expressions of answers included in a set of answers extracted from a prescribed background knowledge source for each of a plurality of mutually different questions and to store the results as normalized vectors; and a key-value memory access unit responsive to application of a question vector derived from a question for accessing the memory and for updating the question vector by using a degree of relatedness between the question vector and the plurality of questions and using the normalized vectors corresponding to respective ones of the plurality of questions.
    Type: Application
    Filed: June 18, 2019
    Publication date: October 21, 2021
    Inventors: Jonghoon OH, Kentaro TORISAWA, Canasai KRUENGKRAI, Julien KLOETZER, Ryu IIDA, Ryo ISHIDA, Yoshihiko ASAO
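The memory access this abstract describes (store L2-normalized answer vectors as values keyed by question vectors, then update an incoming question vector using its degree of relatedness to the stored questions) can be sketched in a few lines. The 2-d vectors, softmax relatedness, and additive update below are assumptions for illustration, not the patent's exact formulation.

```python
import math

# Toy sketch of the key-value memory access from the abstract above:
# keys are question vectors, values are normalized answer vectors, and
# the question vector is updated with a relatedness-weighted read.

def normalize(v):
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def memory_update(question, keys, values):
    # Degree of relatedness between the question and each stored question.
    weights = softmax([dot(question, k) for k in keys])
    # Relatedness-weighted sum of the normalized answer vectors.
    read = [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(question))]
    # Update the question vector with what was read from memory.
    return [q + r for q, r in zip(question, read)]

keys = [[1.0, 0.0], [0.0, 1.0]]
values = [normalize([3.0, 4.0]), normalize([0.0, 2.0])]
updated = memory_update([1.0, 0.0], keys, values)
```

Normalizing the stored answer vectors is what the abstract credits with reducing the influence of noise: no single raw answer vector can dominate the read purely by magnitude.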
  • Patent number: 11106714
    Abstract: A summary generating apparatus includes a text storage device storing text with information indicating a portion to be focused on; word vector converters vectorizing each word of the text and adding an element indicating whether the word is focused on or not to the vector and thereby converting the text to a word vector sequence; an LSTM implemented by a neural network performing sequence-to-sequence type conversion, pre-trained by machine learning to output, in response to each of the word vectors of the word vector sequence input in a prescribed order, a summary of the text consisting of the words represented by the word sequence; and input units inputting each of the word vectors of the word vector sequence in the prescribed order to the neural network.
    Type: Grant
    Filed: May 7, 2018
    Date of Patent: August 31, 2021
    Assignee: National Institute of Information and Communications Technology
    Inventors: Ryu Iida, Kentaro Torisawa, Jonghoon Oh, Canasai Kruengkrai, Yoshihiko Asao, Noriyuki Abe, Junta Mizuno, Julien Kloetzer
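The word-vector conversion this abstract describes (each word vector gains an extra element marking whether the word lies in the focused-on portion, and the resulting sequence is fed in order to the sequence-to-sequence LSTM) can be sketched as follows. The toy embedding and the span-based focus marking are assumptions for illustration; the patent's converters and LSTM are trained models.

```python
# Illustrative sketch of the focus-aware word-vector conversion from
# the abstract above. The embedding is a trivial stand-in.

def embed(word):
    # Stand-in for a real word embedding (here, 2 dimensions).
    return [len(word) / 10.0, (ord(word[0]) % 10) / 10.0]

def to_focused_sequence(words, focus_span):
    """Vectorize words, appending a 1.0/0.0 focus element to each vector."""
    start, end = focus_span
    sequence = []
    for i, word in enumerate(words):
        focused = 1.0 if start <= i < end else 0.0
        sequence.append(embed(word) + [focused])  # extra focus element
    return sequence

words = "the cat sat on the mat".split()
seq = to_focused_sequence(words, (1, 3))  # focus on "cat sat"
```

Each element of `seq` would then be input, in order, to the pre-trained sequence-to-sequence network, which emits the summary conditioned on which words were marked as focused.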
  • Publication number: 20210034817
    Abstract: A request paraphrasing system 120, which allows a dialogue system to flexibly address requests expressed in various manners, includes: a pre-processing unit 130 converting a user input 56 to a word vector sequence; and a neural paraphrasing model 94 trained in advance by machine learning to receive the word vector sequence as an input and to paraphrase the request represented by the word vector sequence into a request having a higher probability of obtaining an answer from a question-answering device 122 than the request before paraphrasing. As pre-processing, whether the user input 56 is a request or not may be determined, and the input may be paraphrased only when it is determined to be a request. Further, a classification model 98 may classify the input request to determine to which request class it belongs, and the classification may be input as one feature to the neural paraphrasing model 94.
    Type: Application
    Filed: March 5, 2019
    Publication date: February 4, 2021
    Inventors: Yoshihiko ASAO, Ryu IIDA, Canasai KRUENGKRAI, Noriyuki ABE, Kanako ONISHI, Kentaro TORISAWA, Yutaka KIDAWARA
  • Publication number: 20200159755
    Abstract: A summary generating apparatus includes: a text storage device storing text with information indicating a portion to be focused on; word vector converters vectorizing each word of the text and adding an element indicating whether the word is focused on or not to the vector and thereby converting the text to a word vector sequence; an LSTM implemented by a neural network performing sequence-to-sequence type conversion, pre-trained by machine learning to output, in response to each of the word vectors of the word vector sequence input in a prescribed order, a summary of the text consisting of the words represented by the word sequence; and input units inputting each of the word vectors of the word vector sequence in the prescribed order to the neural network.
    Type: Application
    Filed: May 7, 2018
    Publication date: May 21, 2020
    Inventors: Ryu IIDA, Kentaro TORISAWA, Jonghoon OH, Canasai KRUENGKRAI, Yoshihiko ASAO, Noriyuki ABE, Junta MIZUNO, Julien KLOETZER