Patents by Inventor Hisako ASANO

Hisako ASANO has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220269856
    Abstract: A structured text processing learning apparatus includes a memory and a processor. The processor is configured to: analyze the structure of structured text; extract from the structured text, based on that structure, information related to a predetermined structure to be extracted; generate a converted document containing text data in which each string indicating the structure of the extracted information has been converted in accordance with that structure; and train a neural network that performs predetermined processing on the converted document, using as input the generated converted document and correct answer information for the predetermined processing. This facilitates the application of neural networks to structured text.
    Type: Application
    Filed: August 1, 2019
    Publication date: August 25, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Narichika NOMOTO, Hisako ASANO, Junji TOMITA
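    A minimal sketch of the conversion step described in this abstract, assuming the structured text is XML and using an invented tag-to-marker mapping (the patent does not specify one):
    ```python
    # Structural tags are rewritten as plain-text markers so that an ordinary
    # text model can consume the result as a "converted document".
    import xml.etree.ElementTree as ET

    MARKERS = {"title": "[HEADING]", "item": "[ITEM]"}  # assumed mapping

    def convert(structured_text: str) -> str:
        root = ET.fromstring(structured_text)
        lines = []
        for node in root.iter():
            text = (node.text or "").strip()
            if text:
                lines.append(f"{MARKERS.get(node.tag, '[TEXT]')} {text}")
        return "\n".join(lines)

    doc = "<doc><title>Fees</title><item>Basic plan: 500 yen</item></doc>"
    print(convert(doc))
    # [HEADING] Fees
    # [ITEM] Basic plan: 500 yen
    ```
    The converted document, paired with correct answer information, would then be used to train the network exactly as with ordinary flat text.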
  • Publication number: 20220253591
    Abstract: A structured text processing apparatus includes one or more computers, each with a memory and a processor, configured to: analyze the tree structure of a structured text; identify, for each leaf node in the tree structure, the path from that leaf node to the root node; and generate a converted text containing text data in which the strings associated with the nodes along each path, from root to leaf, are concatenated.
    Type: Application
    Filed: August 1, 2019
    Publication date: August 11, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Narichika NOMOTO, Hisako ASANO, Junji TOMITA
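    A minimal sketch of the root-to-leaf conversion, assuming XML input; the separator and sample markup are invented:
    ```python
    # For every leaf of the parsed tree, concatenate the strings associated
    # with the nodes on the path from root to leaf into one flat line.
    import xml.etree.ElementTree as ET

    def leaf_paths(elem, path=()):
        label = elem.tag + (f" {elem.text.strip()}" if elem.text and elem.text.strip() else "")
        children = list(elem)
        if not children:
            yield " / ".join(path + (label,))
        for child in children:
            yield from leaf_paths(child, path + (label,))

    root = ET.fromstring("<plan><name>Basic</name><fee><monthly>500 yen</monthly></fee></plan>")
    for line in leaf_paths(root):
        print(line)
    # plan / name Basic
    # plan / fee / monthly 500 yen
    ```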
  • Patent number: 11403469
    Abstract: The present invention makes it possible to generate a paraphrastic sentence that retains the meaning of the original sentence despite a local word or phrase difference, or a non-paraphrastic sentence that, although similar to the original sentence as a whole, is not a paraphrase of it. An estimation unit 22 estimates a word deletion probability for each word of an input sentence, using either a positive example model, trained on positive examples each pairing a sentence with a paraphrastic sentence of it and used to generate paraphrastic sentences by deleting words, or a negative example model, trained on negative examples each pairing a sentence with a non-paraphrastic sentence of it and used to generate non-paraphrastic sentences by deleting words.
    Type: Grant
    Filed: July 23, 2019
    Date of Patent: August 2, 2022
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi Saito, Kyosuke Nishida, Hisako Asano, Junji Tomita
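    A minimal sketch of the deletion step; the per-word probabilities would come from the trained positive- or negative-example model, so the stub values below are placeholders:
    ```python
    import random

    def delete_words(words, deletion_probs, rng=random.Random(0)):
        # Keep each word with probability (1 - p_delete): a candidate sentence
        # is generated by probabilistic word deletion.
        return [w for w, p in zip(words, deletion_probs) if rng.random() >= p]

    words = "the service is available only on weekdays".split()
    p_positive = [0.1, 0.05, 0.05, 0.1, 0.6, 0.1, 0.1]  # placeholder model output
    print(" ".join(delete_words(words, p_positive)))
    ```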
  • Publication number: 20220229997
    Abstract: Provided is a generation unit that takes as input a question Qi (a word sequence representing the current question in a dialogue), a document P used to generate an answer Ai to the question Qi, a question history {Qi-1, ..., Qi-k} (a set of word sequences representing the k past questions), and an answer history {Ai-1, ..., Ai-k} (a set of word sequences representing the answers to those k questions), and generates the answer Ai by machine reading comprehension in an extractive mode or a generative mode using pre-trained model parameters.
    Type: Application
    Filed: May 28, 2019
    Publication date: July 21, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
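    A hedged sketch of how the inputs named in the abstract (Qi, P, and the two histories) could be packed into a single encoder sequence; the separator tokens are assumptions, not the patent's:
    ```python
    def build_input(question, document, q_history, a_history, k=2):
        # Interleave the k most recent question/answer turns, then append the
        # current question and the document.
        turns = []
        for q, a in list(zip(q_history, a_history))[-k:]:
            turns += ["<q>", q, "<a>", a]
        return " ".join(turns + ["<q>", question, "<p>", document])

    print(build_input(
        "How much is it?", "The basic plan costs 500 yen per month.",
        ["What plans are there?"], ["Basic and premium."]))
    ```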
  • Publication number: 20220138239
    Abstract: A sentence generation device has an estimation unit that receives a first sentence and an output length as input and estimates the importance of each word of the first sentence using a pre-trained model, and a generation unit that generates a second sentence based on that importance. The device can thus evaluate the importance of each constituent element of an input sentence with respect to a designated output length.
    Type: Application
    Filed: February 25, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
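    A minimal sketch of length-conditioned extraction: a stub importance model scores each word, and the designated output length caps how many words survive into the second sentence. The scores are placeholders:
    ```python
    def compress(words, importance, output_len):
        # Keep the output_len highest-scoring words, in their original order.
        keep = sorted(sorted(range(len(words)), key=lambda i: -importance[i])[:output_len])
        return " ".join(words[i] for i in keep)

    words = "the new museum opens to the public next Saturday".split()
    scores = [0.1, 0.4, 0.9, 0.8, 0.1, 0.1, 0.7, 0.6, 0.9]
    print(compress(words, scores, output_len=5))  # museum opens public next Saturday
    ```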
  • Publication number: 20220138267
    Abstract: A generation apparatus includes a generation unit configured to take a document as input and, using a pre-trained machine learning model, generate a question representation for a range of the document containing an answer; when generating a word of the question representation by copying from the document, the generation unit adjusts the probability that a word included in the range is copied.
    Type: Application
    Filed: February 12, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
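    A sketch of the copy-probability adjustment under one plausible reading: in a pointer-style generator the attention over document tokens doubles as a copy distribution, and tokens inside the answer range are penalized before normalizing so the question does not leak the answer. The penalty value is an assumption:
    ```python
    import numpy as np

    def copy_distribution(attn_scores, answer_span, penalty=4.0):
        scores = np.array(attn_scores, dtype=float)
        start, end = answer_span
        scores[start:end] -= penalty          # discourage copying answer words
        exp = np.exp(scores - scores.max())
        return exp / exp.sum()

    attn = [1.0, 2.5, 0.5, 3.0, 1.5]          # stub attention over 5 document tokens
    print(copy_distribution(attn, answer_span=(3, 5)).round(3))
    ```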
  • Publication number: 20220138438
    Abstract: A sentence generation device has an estimation unit that receives a first sentence and a focus point relevant to the second sentence to be generated from it, and estimates the importance of each word of the first sentence using a pre-trained model, and a generation unit that generates the second sentence based on that importance. The device can thus evaluate the importance of each constituent element of an input sentence with respect to a designated focus point.
    Type: Application
    Filed: February 21, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
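    A companion sketch to the previous entry's, with importance conditioned on a designated focus point instead of an output length; simple token overlap stands in for the pre-trained model:
    ```python
    def focused_importance(words, focus):
        # Words matching the focus point get high importance, others low.
        focus_tokens = set(focus.lower().split())
        return [1.0 if w.lower() in focus_tokens else 0.2 for w in words]

    words = "ticket prices rise in April while hours stay unchanged".split()
    print(focused_importance(words, focus="ticket prices"))
    ```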
  • Publication number: 20220138601
    Abstract: A question-answering apparatus includes answer generating means that accepts as input a document set made up of one or more documents, a question sentence, and a style for the answer sentence, and generates an answer sentence for the question using a learned model based on the document set; when generating the answer sentence, the learned model determines the generation probability of each word in the answer sentence according to the designated style.
    Type: Application
    Filed: February 10, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kyosuke NISHIDA, Itsumi SAITO, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
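    A sketch of style-conditioned word probabilities: a per-style bias is added to the vocabulary logits before softmax, so the same model can produce, say, terse or polite answers. The styles, vocabulary, and bias values are invented:
    ```python
    import numpy as np

    VOCAB = ["yes", "certainly", "500", "yen", "."]
    STYLE_BIAS = {"terse":  np.array([1.0, -2.0, 0.5, 0.5, 0.0]),
                  "polite": np.array([-1.0, 2.0, 0.0, 0.0, 0.0])}

    def word_probs(logits, style):
        # Bias the logits by style, then normalize into word probabilities.
        z = np.array(logits) + STYLE_BIAS[style]
        e = np.exp(z - z.max())
        return dict(zip(VOCAB, (e / e.sum()).round(3).tolist()))

    print(word_probs([0.2, 0.1, 1.0, 0.9, 0.3], "polite"))
    ```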
  • Publication number: 20220043972
    Abstract: An encoding unit takes as input a question and a piece of text divided into a plurality of spans (subdivided units of the text), and transforms them into a vector representation sequence representing the meaning of each span and the question, using a pre-trained encoding model that maps input text to such a sequence. For each span, an evidence extraction unit estimates an evidence score, indicating how suitable the span is as evidence for extracting the answer, using a pre-trained extraction model that computes the score from the vector representation sequence.
    Type: Application
    Filed: December 17, 2019
    Publication date: February 10, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Atsushi OTSUKA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA, Itsumi SAITO
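    A minimal sketch of the evidence-scoring step, with bag-of-words overlap standing in for the pre-trained encoding and extraction models:
    ```python
    def toks(s):
        return {w.strip(".,?!").lower() for w in s.split()}

    def evidence_scores(spans, question):
        # Score each span by how well it covers the question terms.
        q = toks(question)
        return [(span, len(q & toks(span)) / len(q)) for span in spans]

    spans = ["The museum opened in 1998.", "Admission is free on Mondays.",
             "It closes at 6 pm."]
    for span, score in evidence_scores(spans, "When is admission free?"):
        print(f"{score:.2f}  {span}")
    ```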
  • Publication number: 20210383257
    Abstract: Provided is a learning device for training the neural network used to search external knowledge, in order to increase the accuracy with which the external knowledge required for arithmetic processing is retrieved. Given an input sentence Q, an external knowledge search unit 22 uses a neural network to select pieces of external knowledge from an external knowledge database 2, based on their similarity to the input sentence Q, and outputs them as search results R2. A processing unit 14 derives a response sentence A to the input sentence Q by arithmetic processing over the input sentence Q and the selected pieces of external knowledge. A consideration calculation unit 23 calculates a consideration v from an index of the correctness of the response sentence A, judged against a true output T given in advance for the input sentence Q, and an index of the quality of the selected pieces of external knowledge.
    Type: Application
    Filed: November 8, 2019
    Publication date: December 9, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
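    A sketch of the consideration v under stated assumptions: the score mixes an answer-correctness term with a term for the quality of the selected knowledge. Both component metrics and the mixing weight are invented:
    ```python
    def consideration(answer, true_output, selected, relevant, alpha=0.5):
        # Correctness of the response plus quality of the retrieved knowledge.
        correctness = 1.0 if answer == true_output else 0.0
        quality = len(set(selected) & set(relevant)) / max(len(selected), 1)
        return alpha * correctness + (1 - alpha) * quality

    print(consideration("500 yen", "500 yen",
                        selected=["fee table", "holiday note"],
                        relevant=["fee table"]))  # 0.75
    ```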
  • Publication number: 20210374350
    Abstract: An information processing device includes a processing unit that receives a document and a question as input and, using neural networks, outputs either an answer range (the range of a string in the document that can answer the question) or an answer suitability (how well the document can answer the question). The processing unit includes a first neural network that calculates the answer range and a second neural network that calculates the answer suitability, and some of the layers constituting the two networks are shared between them.
    Type: Application
    Filed: October 5, 2018
    Publication date: December 2, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kyosuke NISHIDA, Itsumi SAITO, Atsushi OTSUKA, Hisako ASANO, Junji TOMITA
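    A minimal PyTorch sketch of the shared-layer arrangement: one trunk feeds both an answer-range head and an answer-suitability head. Sizes are arbitrary, and a real model would encode the document and question jointly:
    ```python
    import torch
    import torch.nn as nn

    class SharedMRC(nn.Module):
        def __init__(self, hidden=64):
            super().__init__()
            self.shared = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())
            self.span_head = nn.Linear(hidden, 2)   # start/end logits per token
            self.suit_head = nn.Linear(hidden, 1)   # answer-suitability score

        def forward(self, token_states):
            h = self.shared(token_states)           # shared layers for both tasks
            return self.span_head(h), self.suit_head(h.mean(dim=0))

    model = SharedMRC()
    span_logits, suitability = model(torch.randn(30, 64))
    print(span_logits.shape, suitability.shape)     # [30, 2] and [1]
    ```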
  • Patent number: 11182435
    Abstract: Taking as input a group of learning text pairs, each consisting of a first text and a second text that serves as the answer when the first text is posed as a question, a query expansion model is trained to generate, from a text serving as a query, a text serving as its expanded query.
    Type: Grant
    Filed: November 20, 2017
    Date of Patent: November 23, 2021
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi Otsuka, Katsuji Bessho, Kyosuke Nishida, Hisako Asano, Yoshihiro Matsuo
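    A sketch of the training setup: each learning pair couples a first text (posed as a question) with the second text that answers it, and a toy stand-in plays the role of the learned query expansion model:
    ```python
    train_pairs = [
        ("reset password", "You can reset your password from the account page."),
        ("opening hours",  "The store is open from 9 am to 8 pm on weekdays."),
    ]

    def expand_query(query, pairs):
        # Toy stand-in: borrow long content words from the answer of a
        # training question that shares a word with the query.
        for q, a in pairs:
            if set(q.split()) & set(query.split()):
                extra = [w for w in a.lower().split() if len(w) > 6][:2]
                return query + " " + " ".join(extra)
        return query

    print(expand_query("password help", train_pairs))
    ```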
  • Publication number: 20210319330
    Abstract: Provided is a processing device for natural language processing that achieves high search accuracy while suppressing time and space complexity. Given an input sentence Q, a first external knowledge search unit 11 retrieves, as first search results R1, pieces of external knowledge from an external knowledge database 2 ranked by first scores based on their similarity to the input sentence. A second external knowledge search unit 12 uses a neural network to compute second scores from the similarity between the input sentence and the pieces of external knowledge in the first search results R1, and searches within R1 to obtain second search results R2. A processing unit 14 derives an output for the input sentence by arithmetic processing over the input sentence and the pieces of external knowledge in the second search results R2.
    Type: Application
    Filed: November 8, 2019
    Publication date: October 14, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
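    A minimal sketch of the two-stage search: a cheap first score prunes the knowledge base and only the survivors are re-scored by the expensive stage (a stub here stands in for the neural re-ranker), which is how the abstract keeps time and space complexity down:
    ```python
    def overlap(a, b):
        return len(set(a.lower().split()) & set(b.lower().split()))

    def two_stage_search(query, knowledge, first_k=3, second_k=1):
        # First stage: cheap similarity score over the whole knowledge base.
        r1 = sorted(knowledge, key=lambda k: -overlap(query, k))[:first_k]
        # Second stage: stub re-ranker in place of the neural network.
        return sorted(r1, key=lambda k: overlap(query, k), reverse=True)[:second_k]

    kb = ["fees rise in April", "the April schedule", "parking is free", "fees table"]
    print(two_stage_search("when do fees rise", kb))  # ['fees rise in April']
    ```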
  • Publication number: 20210294985
    Abstract: The present invention makes it possible to generate a paraphrastic sentence that retains the meaning of the original sentence despite a local word or phrase difference, or a non-paraphrastic sentence that, although similar to the original sentence as a whole, is not a paraphrase of it. An estimation unit 22 estimates a word deletion probability for each word of an input sentence, using either a positive example model, trained on positive examples each pairing a sentence with a paraphrastic sentence of it and used to generate paraphrastic sentences by deleting words, or a negative example model, trained on negative examples each pairing a sentence with a non-paraphrastic sentence of it and used to generate non-paraphrastic sentences by deleting words.
    Type: Application
    Filed: July 23, 2019
    Publication date: September 23, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
  • Publication number: 20210256018
    Abstract: Provided is an answer generation unit configured to receive a document and a question as input and to generate an answer sentence for the question with a learned model, using words drawn from the union of a predetermined first vocabulary and a second vocabulary composed of the words appearing in the document and the question. The learned model includes a neural network trained in advance to judge whether each word of the answer sentence belongs to the second vocabulary, and it raises or lowers the probability that a word from the second vocabulary is selected when generating the answer sentence.
    Type: Application
    Filed: March 27, 2019
    Publication date: August 19, 2021
    Applicant: Nippon Telegraph and Telephone Corporation
    Inventors: Kyosuke NISHIDA, Atsushi OTSUKA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
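    A sketch of generation over the union vocabulary: the output distribution mixes a fixed-vocabulary distribution with a copy distribution over in-context words, and a gate (a constant here, learned in practice) raises or lowers how much the second vocabulary is favored. All numbers are placeholders:
    ```python
    def union_probs(gen_probs, copy_probs, copy_gate=0.6):
        # Mix the two distributions over the union of both vocabularies.
        vocab = set(gen_probs) | set(copy_probs)
        return {w: (1 - copy_gate) * gen_probs.get(w, 0.0)
                   + copy_gate * copy_probs.get(w, 0.0) for w in vocab}

    gen  = {"yes": 0.5, "no": 0.3, "maybe": 0.2}   # first (fixed) vocabulary
    copy = {"500": 0.7, "yen": 0.3}                # second (in-context) vocabulary
    print(union_probs(gen, copy))
    ```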
  • Publication number: 20210232948
    Abstract: A question generation device includes generating means that takes as input a query and a relevant document containing the answer to the query and, using a pre-trained machine learning model, generates a revised query in which a potentially defective portion of the query is supplemented with a word from a prescribed lexical set.
    Type: Application
    Filed: April 25, 2019
    Publication date: July 29, 2021
    Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
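    A sketch of the revision idea under stated assumptions: a word from a prescribed lexical set, chosen here by overlap with the relevant document, supplements the potentially defective portion of the query. The lexical set and selection rule stand in for the learned model:
    ```python
    LEXICON = {"fee", "schedule", "location"}  # assumed prescribed lexical set

    def revise(query, relevant_doc):
        # Pick a lexicon word supported by the relevant document.
        hints = sorted(LEXICON & set(relevant_doc.lower().split()))
        return f"{query} ({hints[0]})" if hints else query

    print(revise("how much for the basic plan",
                 "The fee for the basic plan is 500 yen."))
    # how much for the basic plan (fee)
    ```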
  • Publication number: 20210125516
    Abstract: A question that can be answered with a polarity (such as yes or no) can be answered accurately. A machine comprehension unit 210 estimates the start and end of the range of the text that serves as the basis for the answer, using a reading comprehension model trained in advance to estimate that range from the input text and question. A determination unit 220 determines the polarity of the answer using a determination model trained in advance to judge, from the output of the machine comprehension unit 210, whether the polarity of the answer is positive.
    Type: Application
    Filed: June 14, 2019
    Publication date: April 29, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Atsushi OTSUKA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
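    A minimal sketch of the two-stage pipeline: a comprehension step picks the basis span and a determination step maps that evidence to a positive or negative answer. Keyword stubs replace both trained models:
    ```python
    def find_basis(text, question):
        # Stub comprehension model: sentence sharing the most question words.
        q = set(question.lower().split())
        return max(text.split("."), key=lambda s: len(q & set(s.lower().split())))

    def polarity(basis):
        # Stub determination model: simple negation check.
        return "yes" if "not" not in basis.lower().split() else "no"

    text = "Parking is available. Pets are not allowed."
    print(polarity(find_basis(text, "Are pets allowed?")))  # no
    ```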
  • Publication number: 20210081612
    Abstract: A relationship between phrases can be estimated accurately without incurring the cost of manually creating learning data. Based on a dependency analysis of input text, a learning data generation unit 62 extracts a pair of phrases that each have a dependency relationship with a segment containing a predetermined connection expression representing a relationship between phrases, and generates a triple consisting of the extracted phrase pair and either the connection expression or a relation label indicating the relationship it represents. A learning unit 63 trains a relationship estimation model for estimating the relationship between phrases from the triples generated by the learning data generation unit.
    Type: Application
    Filed: February 15, 2019
    Publication date: March 18, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Junji TOMITA, Hisako ASANO
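    A minimal sketch of the automatic learning-data step: when a sentence contains a known connection expression, the two phrases it links are paired with a relation label, yielding triples with no manual annotation. The connective table and the naive split (in place of real dependency analysis) are assumptions:
    ```python
    CONNECTIVES = {"because": "CAUSE", "although": "CONCESSION"}

    def make_triples(sentence):
        words = sentence.lower().rstrip(".").split()
        for i, w in enumerate(words):
            if w in CONNECTIVES:
                yield (" ".join(words[:i]), " ".join(words[i + 1:]), CONNECTIVES[w])

    print(list(make_triples("The game was cancelled because it rained heavily.")))
    # [('the game was cancelled', 'it rained heavily', 'CAUSE')]
    ```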
  • Publication number: 20210064966
    Abstract: An appropriate vector can be generated for any phrase. A lattice construction unit 212 constructs a lattice structure in which adjacent word or phrase candidates are bound by links, based on a morphological analysis and a dependency analysis of the input text. A first learning unit 213 trains a neural network A to estimate nearby word or phrase candidates from a given candidate, based on the lattice structure. A vector generation unit 214 obtains a vector for each candidate from neural network A and uses these vectors as learning data. A second learning unit trains a neural network B that vectorizes word or phrase candidates, based on that learning data.
    Type: Application
    Filed: February 15, 2019
    Publication date: March 4, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
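    A minimal sketch of the lattice structure: word and phrase candidates over the same sentence are linked when their spans are adjacent, and a candidate's lattice neighbours become its prediction targets, skip-gram style. Candidates and spans are hand-written; training networks A and B is omitted:
    ```python
    candidates = [("New", (0, 1)), ("York", (1, 2)),
                  ("New York", (0, 2)), ("City", (2, 3))]

    def lattice_neighbors(cands):
        # Link candidate w1 to w2 whenever w1's span ends where w2's begins.
        return [(w1, w2) for w1, (_, e1) in cands
                         for w2, (s2, _) in cands if e1 == s2]

    print(lattice_neighbors(candidates))
    # [('New', 'York'), ('York', 'City'), ('New York', 'City')]
    ```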
  • Publication number: 20210049210
    Abstract: Appropriate information can be provided for a user query even when there are multiple information provision modules that differ in their answer generation processing. A query sending unit 212 sends the user query to each of a plurality of information provision module units 220, which differ in answer generation processing and each generate an answer candidate for the query. An output control unit 214 controls display so that the answer candidate obtained from each information provision module unit 220 appears on a display unit 300 on a per-agent basis, together with information on the agent associated with that module unit 220.
    Type: Application
    Filed: February 13, 2019
    Publication date: February 18, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Narichika NOMOTO, Hisako ASANO
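    A minimal sketch of the dispatch-and-display flow, with simple lambdas faking the information provision modules:
    ```python
    MODULES = {
        "FAQ agent":    lambda q: "See the FAQ page for billing questions.",
        "Search agent": lambda q: f"Top document matching '{q}'.",
    }

    def answer_all(query):
        # Each module's candidate answer is shown per agent, with the agent
        # information attached, as the abstract describes.
        for agent, module in MODULES.items():
            print(f"[{agent}] {module(query)}")

    answer_all("billing problem")
    ```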