Patents by Inventor Hisako ASANO

Hisako ASANO has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20210042472
    Abstract: To make it possible to accurately generate a word vector even when the vocabulary of a word vector data set is not limited. In a vector generating device 10 that generates a series of vectors representing an input sentence P from vectors corresponding to the words included in P, a definition-sentence-considered-context encoding unit 280 draws on a dictionary DB 230 that stores sets of headwords y and definition sentences Dy, i.e. sentences defining the headwords y. For each word in the input sentence P that is a headword stored in the dictionary DB, the unit generates the series of vectors representing P using the definition sentence Dy of that headword y. A minimal illustrative sketch follows this entry.
    Type: Application
    Filed: March 4, 2019
    Publication date: February 11, 2021
    Applicant: Nippon Telegraph and Telephone Corporation
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
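
The entry above (publication 20210042472) falls back on dictionary definition sentences when building vectors for the words of an input sentence. The sketch below illustrates only that fallback idea, not the patent's learned encoder: the definition sentence is represented by a simple average of the vectors of its known words, and the word-vector table and the tiny dictionary are invented for illustration.

```python
import numpy as np

# Hypothetical pre-trained word-vector table (invented values); words missing
# from it are the ones the dictionary definitions are meant to cover.
word_vectors = {
    "device": np.array([0.1, 0.6]),
    "voice": np.array([0.9, 0.3]),
    "transmits": np.array([0.4, 0.4]),
}

# Hypothetical dictionary of headwords and definition sentences.
dictionary = {"telephony": "device that transmits voice"}

def vector_for(word):
    """Return a vector for `word`, falling back to its definition sentence."""
    if word in word_vectors:
        return word_vectors[word]
    if word in dictionary:
        # Stand-in for the definition-sentence encoder: average the vectors of
        # the definition's known words (the patent uses a learned encoder).
        defs = [word_vectors[w] for w in dictionary[word].split() if w in word_vectors]
        return np.mean(defs, axis=0)
    return np.zeros(2)  # unknown word with no dictionary entry

print(vector_for("telephony"))  # vector built from the definition sentence
```
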
  • Publication number: 20210042469
    Abstract: The present disclosure relates to concurrently learning a relationship estimation model and a phrase generation model. The relationship estimation model estimates the relationship between phrases. The phrase generation model generates a phrase related to an input phrase and includes an encoder and a decoder. Using a 3-tuple as learning data, the encoder converts a phrase into a vector, and the decoder generates, from the converted vector and a connection expression or a relationship label, a phrase that stands in the relationship expressed by that connection expression or label to the input phrase. The relationship estimation model computes a relationship score from the converted vectors of the phrases in a phrase pair and a vector representing the connection expression and the relationship label. A minimal illustrative sketch follows this entry.
    Type: Application
    Filed: March 1, 2019
    Publication date: February 11, 2021
    Applicant: Nippon Telegraph and Telephone Corporation
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
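
Publication 20210042469 above derives a relationship score from the converted phrase vectors and a vector for the connection expression and relationship label. The snippet below is only a stand-in scorer under assumed dimensions: a single linear layer followed by a sigmoid over concatenated vectors. The patent does not fix this architecture, and every vector and weight here is randomly generated for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vectors produced by the encoder (assumed 8-dimensional).
phrase_a = rng.normal(size=8)   # vector for the first phrase
phrase_b = rng.normal(size=8)   # vector for the second phrase
label_vec = rng.normal(size=8)  # vector for the connection expression / relationship label

# Stand-in scorer weights: one linear layer over the concatenated vectors.
W = rng.normal(size=3 * 8)

def relationship_score(a, b, r):
    """Score how well the relationship r holds between phrases a and b."""
    x = np.concatenate([a, b, r])
    return 1.0 / (1.0 + np.exp(-W @ x))  # sigmoid of a linear projection

print(relationship_score(phrase_a, phrase_b, label_vec))
```
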
  • Publication number: 20210034822
    Abstract: To arrange all words so that the distance between the words of a given pair is appropriate. The method takes as input a concept base 22, which is a set of pairs of a word and a vector representing the concept of the word, and a dictionary 24, which is a set of semantically distant or close word pairs. For a word pair C consisting of given words A and B in the concept base 22, let V′ be the difference vector between the converted vector of word A and the converted vector of word B, and let V be the difference vector between the vector of word A and the vector of word B in the concept base 22. When the word pair C is present in the dictionary 24, conversion means 30 associates with C the magnitude D of the difference vector between V′ and the vector kV obtained by multiplying V by a scalar value k. When the word pair C is not present in the dictionary 24, the conversion means 30 associates with C the magnitude D of the difference vector between V′ and V. A minimal illustrative sketch follows this entry.
    Type: Application
    Filed: April 4, 2019
    Publication date: February 4, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Katsuji BESSHO, Hisako ASANO, Junji TOMITA
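
Publication 20210034822 above associates a magnitude D with each word pair: the norm of the difference between V′ and kV when the pair is in the dictionary, and between V′ and V otherwise. A minimal sketch of that computation follows; the vectors and the value of k are invented for illustration.

```python
import numpy as np

def pair_distance(conv_a, conv_b, orig_a, orig_b, in_dictionary, k=2.0):
    """Magnitude D associated with a word pair (A, B).

    conv_a, conv_b : converted vectors of words A and B
    orig_a, orig_b : original concept-base vectors of A and B
    in_dictionary  : whether (A, B) appears in the dictionary of
                     semantically distant/close word pairs
    k              : scalar applied to the original difference vector
                     (the value here is an arbitrary example)
    """
    v_prime = conv_a - conv_b          # V': difference of converted vectors
    v = orig_a - orig_b                # V : difference of original vectors
    target = k * v if in_dictionary else v
    return np.linalg.norm(v_prime - target)

# Toy usage with made-up 3-dimensional vectors.
a, b = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
print(pair_distance(a + 0.1, b, a, b, in_dictionary=True, k=0.5))
```
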
  • Publication number: 20210004541
    Abstract: A learning device for a phrase generation model includes a memory and a processor configured to train the phrase generation model, which includes an encoder and a decoder, using 3-tuples as training data. Each 3-tuple includes a combination of phrases and at least one of a conjunctive expression representing a relationship between the phrases and a relational label indicating the relationship represented by the conjunctive expression. The encoder is configured to convert into a vector a 2-tuple that includes a phrase and at least one of the conjunctive expression and the relational label. The decoder is configured to generate, from the converted vector and the conjunctive expression or the relational label, a phrase having the relationship represented by the conjunctive expression or the relational label with respect to the phrase. An illustrative example of this data layout follows this entry.
    Type: Application
    Filed: February 22, 2019
    Publication date: January 7, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
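
Publication 20210004541 above trains the encoder-decoder on 3-tuples and feeds the encoder 2-tuples. The snippet below shows one way such a training instance could be laid out; the phrases, the conjunctive expression, and the label are all invented for illustration.

```python
# Illustrative (made-up) training example for the phrase generation model.
# A 3-tuple: a pair of phrases plus a conjunctive expression and/or relational label.
example_3tuple = {
    "phrase_a": "it started to rain",
    "conjunction": "so",          # conjunctive expression
    "label": "CAUSE-EFFECT",      # relational label represented by "so"
    "phrase_b": "the game was cancelled",
}

# The encoder consumes a 2-tuple: a phrase with the conjunction and/or label ...
encoder_input = (example_3tuple["phrase_a"],
                 example_3tuple["conjunction"],
                 example_3tuple["label"])

# ... and the decoder is trained to emit the phrase that stands in that
# relationship to the input phrase.
decoder_target = example_3tuple["phrase_b"]

print(encoder_input, "->", decoder_target)
```
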
  • Publication number: 20190278812
    Abstract: Taking as input a set of learning text pairs, each consisting of a first learning text and a second learning text that serves as an answer when the first text is posed as a question, a query expansion model is learned so as to generate a text serving as an expanded query from a text serving as a query. A minimal illustrative sketch follows this entry.
    Type: Application
    Filed: November 20, 2017
    Publication date: September 12, 2019
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi OTSUKA, Katsuji BESSHO, Kyosuke NISHIDA, Hisako ASANO, Yoshihiro MATSUO
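
Publication 20190278812 above learns the query expansion model from pairs of a query-like text and a text that answers it. The pair below is invented to illustrate the shape of the learning data; a sequence-to-sequence learner fitted on such pairs is one plausible realization, which is an assumption rather than the patent's stated architecture.

```python
# Illustrative (made-up) learning pair for the query-expansion model:
# a first text (the query) and a second text that serves as its answer.
training_pairs = [
    ("reset router password",
     "Hold the reset button for ten seconds, then log in with the default "
     "credentials printed on the label and set a new password."),
]

# The model is trained on such pairs so that, given a text used as a query,
# it generates a text serving as an expanded query (assumed seq2seq-style fit).
for query, answer in training_pairs:
    print("query:", query)
    print("answer text:", answer)
```
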