Patents by Inventor Kosuke NISHIDA

Kosuke NISHIDA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11972365
    Abstract: A question generation device includes generating means that takes as input a query and a relevant document containing an answer to the query and, using a machine learning model trained in advance, generates a revised query in which a potentially defective portion of the query is supplemented with a word from a prescribed lexical set.
    Type: Grant
    Filed: April 25, 2019
    Date of Patent: April 30, 2024
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi Otsuka, Kyosuke Nishida, Itsumi Saito, Kosuke Nishida, Hisako Asano, Junji Tomita
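
The patent's revision step is performed by a pre-trained machine learning model. As a rough illustration only (not the patented method), the toy sketch below fills a hypothetical defective slot, marked "[MASK]", with the lexicon word that occurs most often in the relevant document; all names here are assumptions for illustration.

```python
from collections import Counter

def revise_query(query_tokens, document_tokens, lexicon):
    """Fill each defective slot ("[MASK]") in the query with the lexicon
    word that occurs most often in the relevant document."""
    doc_counts = Counter(document_tokens)
    revised = []
    for tok in query_tokens:
        if tok == "[MASK]":
            revised.append(max(lexicon, key=lambda w: doc_counts[w]))
        else:
            revised.append(tok)
    return revised
```

A learned model would instead score candidate words in context; the frequency heuristic only stands in for that scoring.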
  • Patent number: 11954435
    Abstract: A text generation apparatus includes a memory and a processor configured to execute acquiring a reference text based on an input text and information different from the input text; and generating a text based on the input text and the reference text, wherein the acquiring and the generating are implemented as neural networks based on learned parameters.
    Type: Grant
    Filed: March 3, 2020
    Date of Patent: April 9, 2024
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi Saito, Kyosuke Nishida, Kosuke Nishida, Hisako Asano, Junji Tomita, Atsushi Otsuka
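
The acquiring step above retrieves a reference text related to the input. As a minimal sketch (assuming a simple token-overlap similarity, not the patent's learned neural retriever), one could select the most similar sentence from a candidate set like this:

```python
def acquire_reference(input_text, corpus):
    """Toy retrieval: return the corpus sentence with the highest
    token-overlap (Jaccard) similarity to the input text."""
    def jaccard(a, b):
        sa, sb = set(a.split()), set(b.split())
        return len(sa & sb) / max(1, len(sa | sb))
    return max(corpus, key=lambda s: jaccard(input_text, s))
```

The generation step would then condition on both the input text and the retrieved reference.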
  • Publication number: 20240054295
    Abstract: A learning apparatus includes a memory and at least one processor connected to the memory, wherein the processor is configured to: convert input text data into a feature amount sequence based on a language model; and update parameters of the language model based on the text data, the feature amount sequence, and a word vector learned in advance.
    Type: Application
    Filed: March 8, 2021
    Publication date: February 15, 2024
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Sen YOSHIDA
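
One plausible reading of updating the language model "based on the feature amount sequence and a word vector learned in advance" is an alignment objective between the model's features and pretrained embeddings. This is only a hedged sketch of such a loss, not the claimed training procedure:

```python
def alignment_loss(feature_seq, word_vectors, tokens):
    """Mean squared distance between each token's LM feature and its
    pre-learned word vector; minimizing this ties the language model's
    feature space to the pretrained embedding space."""
    total = 0.0
    for tok, feat in zip(tokens, feature_seq):
        vec = word_vectors[tok]
        total += sum((f - v) ** 2 for f, v in zip(feat, vec))
    return total / len(tokens)
```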
  • Patent number: 11893353
    Abstract: To accurately generate word vectors even when the vocabulary of a word vector data set is not limited: in a vector generating device 10 that generates a series of vectors representing an input sentence P from vectors corresponding to the words included in P, a definition-sentence-considered-context encode unit 280 draws on a dictionary DB 230 that stores sets of headwords y and definition sentences Dy (sentences defining the headwords y). For each word of the input sentence P that is a headword stored in the dictionary DB, the unit generates the series of vectors representing P using the definition sentence Dy of that headword.
    Type: Grant
    Filed: March 4, 2019
    Date of Patent: February 6, 2024
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke Nishida, Kyosuke Nishida, Hisako Asano, Junji Tomita
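
The core idea above is to represent a word through its dictionary definition sentence rather than a fixed embedding. A toy sketch, assuming simple vector averaging in place of the patent's learned encoder:

```python
def vector_for(word, dictionary, base_vectors, dim=2):
    """Represent an in-dictionary word by the average of the base
    vectors of its definition words; fall back to the word's own base
    vector, or a zero vector if it is out of vocabulary."""
    if word in dictionary:
        vecs = [base_vectors[w] for w in dictionary[word].split()
                if w in base_vectors]
        if vecs:
            return [sum(c) / len(vecs) for c in zip(*vecs)]
    return base_vectors.get(word, [0.0] * dim)
```

This shows how a rare word like a dictionary headword can inherit a usable vector from common words in its definition.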
  • Publication number: 20230130902
    Abstract: A text generation apparatus includes a processor and a memory storing program instructions that cause the processor to receive an input sentence including one or more words and generate a sentence by: estimating an importance of each word included in the input sentence and encoding the input sentence; and taking the importance and the encoding result as inputs to generate the sentence based on the input sentence. The processor uses a neural network based on learned parameters. This improves the accuracy of sentence generation.
    Type: Application
    Filed: March 3, 2020
    Publication date: April 27, 2023
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
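
The estimate-importance-then-generate pipeline above can be caricatured with a frequency-based importance score and extractive compression. This is a toy stand-in for the patent's neural estimator and decoder, nothing more:

```python
from collections import Counter

def estimate_importance(words):
    """Toy importance: each word's relative frequency in the sentence."""
    counts = Counter(words)
    total = sum(counts.values())
    return {w: counts[w] / total for w in counts}

def compress(words, importance, keep=2):
    """Keep the `keep` highest-importance distinct words, preserving
    their original order of appearance."""
    top = set(sorted(importance, key=importance.get, reverse=True)[:keep])
    return [w for w in words if w in top]
```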
  • Publication number: 20230028376
    Abstract: The efficiency of summary learning that requires an additional input parameter is improved by causing a computer to execute two learning steps. In the first learning step, a first model for calculating an importance value of each component in source text is learned from a first training data group (source text, a query related to a summary of the source text, and summary data related to the query in the source text) and a second training data group (source text and summary data generated based on the source text). In the second learning step, a second model for generating summary data from the source text of a piece of training data is learned using each piece of training data in the second training data group together with a plurality of components extracted from it based on the importance values that the first model calculates for the components of its source text.
    Type: Application
    Filed: December 18, 2019
    Publication date: January 26, 2023
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
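
A common way to supervise the first-stage importance model described above (a sketch of one plausible labeling scheme, not the patent's exact procedure) is to mark a source word as important when it also appears in the reference summary:

```python
def importance_labels(source_words, summary_words):
    """First-stage supervision: mark a source word 1 ("important") if
    it also appears in the reference summary, else 0."""
    summary_set = set(summary_words)
    return [1 if w in summary_set else 0 for w in source_words]
```

The second-stage summarizer would then be trained on components selected by the importance model's scores.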
  • Publication number: 20220405639
    Abstract: An information processing apparatus includes a training unit configured to share encoding layers from a first layer to an (N-n)-th layer, whose parameters are trained in advance, between a first model and a second model, and to train parameters of a third model through multi-task training that includes training of the first model and retraining of the second model for a predetermined task, where N and n are integers equal to or greater than 1 that satisfy N > n. In the third model, the encoding layers from the ((N-n)+1)-th layer to the N-th layer, whose parameters are trained in advance, are divided between the first model and the second model.
    Type: Application
    Filed: November 21, 2019
    Publication date: December 22, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
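
The layer-sharing arrangement above (bottom N-n layers shared, top n layers per model) can be sketched as a data structure; the layer objects here are hypothetical placeholders, not the patent's encoder:

```python
def build_models(num_layers, num_top, make_layer):
    """Share the bottom N-n encoder layers between two models and give
    each model its own copy of the top n layers."""
    shared = [make_layer(i) for i in range(num_layers - num_top)]
    top_a = [make_layer(i) for i in range(num_layers - num_top, num_layers)]
    top_b = [make_layer(i) for i in range(num_layers - num_top, num_layers)]
    return shared + top_a, shared + top_b
```

Because the bottom layers are the same objects in both lists, gradient updates to them during multi-task training would affect both models.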
  • Publication number: 20220366140
    Abstract: A text generation apparatus includes a memory and a processor configured to execute acquiring a reference text based on an input text and information different from the input text; and generating a text based on the input text and the reference text, wherein the acquiring and the generating are implemented as neural networks based on learned parameters.
    Type: Application
    Filed: March 3, 2020
    Publication date: November 17, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA, Atsushi OTSUKA
  • Publication number: 20220358361
    Abstract: A generation apparatus includes a generation unit configured to use a machine learning model learned in advance, with a document as an input, to extract one or more ranges that are likely to be answers in the document and generate a question representation whose answer is each of the ranges that are extracted.
    Type: Application
    Filed: February 12, 2020
    Publication date: November 10, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
  • Publication number: 20220343076
    Abstract: A text generation apparatus includes a memory and a processor configured to, based on learned parameters of neural networks, acquire, as a reference text, a predetermined number of two or more sentences having a relatively high relevance to an input sentence from a set of sentences different from the input sentence and generate text based on the input sentence and the reference text, such that information to be considered when generating text can be added as text.
    Type: Application
    Filed: October 2, 2019
    Publication date: October 27, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
  • Publication number: 20220138267
    Abstract: A generation apparatus includes a generation unit configured to use a machine learning model learned in advance, with a document as an input, to generate a question representation for a range of an answer in the document, wherein when generating a word of the question representation by performing a copy from the document, the generation unit adjusts a probability that a word included in the range is copied.
    Type: Application
    Filed: February 12, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
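
The copy-probability adjustment described above can be illustrated with a toy sketch: down-weight copying of words inside the answer range so the generated question does not leak its own answer. The penalty factor and renormalization here are assumptions for illustration:

```python
def adjust_copy_probs(copy_probs, answer_words, penalty=0.1):
    """Down-weight the probability of copying words that lie inside the
    answer span (so the question does not leak its answer), then
    renormalize to a distribution."""
    adjusted = {w: p * penalty if w in answer_words else p
                for w, p in copy_probs.items()}
    z = sum(adjusted.values())
    return {w: p / z for w, p in adjusted.items()}
```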
  • Publication number: 20220138239
    Abstract: A sentence generation device has: an estimation unit for receiving input of a first sentence and an output length, and estimating importance of each word constituting the first sentence using a pre-trained model; and a generation unit for generating a second sentence based on the importance, and thus makes it possible to evaluate importance of a constituent element of an input sentence, in correspondence with a designated output length.
    Type: Application
    Filed: February 25, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
  • Publication number: 20220138438
    Abstract: A sentence generation device has: an estimation unit for receiving input of a first sentence and a focus point related to generation of a second sentence to be generated based on the first sentence, and estimating importance of each word constituting the first sentence using a pre-trained model; and a generation unit for generating the second sentence based on the importance, and thus makes it possible to evaluate importance of a constituent element of an input sentence in correspondence with a designated focus point.
    Type: Application
    Filed: February 21, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
  • Publication number: 20220138601
    Abstract: A question-answering apparatus includes answer generating means that accepts as input a document set made up of one or more documents, a question sentence, and a style for the answer sentence, and runs a process of generating an answer sentence for the question sentence using a learned model based on the document set. When generating the answer sentence, the learned model determines the probability of generating each word contained in the answer sentence according to the style.
    Type: Application
    Filed: February 10, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kyosuke NISHIDA, Itsumi SAITO, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
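
Style-dependent word probabilities, as described above, can be caricatured by scaling a decoder's word distribution with style-specific factors and renormalizing. The boost table is a hypothetical stand-in for what the learned model would compute:

```python
def apply_style(word_probs, style, style_boost):
    """Scale each word's generation probability by a style-specific
    factor and renormalize, so wording follows the requested style."""
    boosted = {w: p * style_boost.get(style, {}).get(w, 1.0)
               for w, p in word_probs.items()}
    z = sum(boosted.values())
    return {w: p / z for w, p in boosted.items()}
```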
  • Publication number: 20220043972
    Abstract: An encoding unit transforms an input question and an input piece of text, divided into a plurality of spans (subdivided units of the text), into a vector representation sequence representing the meaning of each span and of the question, using a pre-trained encoding model that transforms input text into a vector representation sequence representing its meaning. For each span, an evidence extraction unit estimates an evidence score, indicating the degree to which the span is suitable as evidence for extracting the answer, using a pre-trained extraction model that calculates the evidence score from the vector representation sequence.
    Type: Application
    Filed: December 17, 2019
    Publication date: February 10, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Atsushi OTSUKA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA, Itsumi SAITO
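
As a rough illustration of per-span evidence scoring (the patent uses a learned extraction model over encoder representations; this toy uses token overlap instead):

```python
def evidence_scores(spans, question):
    """Toy evidence score per span: fraction of question tokens that
    the span covers."""
    q = set(question.split())
    return [len(q & set(span.split())) / max(1, len(q)) for span in spans]
```

Spans with the highest scores would be kept as evidence for answer extraction.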
  • Publication number: 20210383257
    Abstract: There is provided a learning device for learning a neural network used to search external knowledge, in order to increase the search accuracy of the external knowledge required for arithmetic processing. With an input sentence Q as input, an external knowledge search unit 22 uses a neural network to select pieces of external knowledge from an external knowledge database 2 based on their similarity degrees to the input sentence Q, and outputs the selected pieces as search results R2. A processing unit 14 acquires a response sentence A to the input sentence Q by arithmetic processing that takes the input sentence Q and the selected pieces of external knowledge as input. A consideration calculation unit 23 calculates a consideration (reward) v from an index indicating the correctness of the response sentence A, based on a true output T given for the input sentence Q in advance, and an index indicating the quality of the selected pieces of external knowledge.
    Type: Application
    Filed: November 8, 2019
    Publication date: December 9, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
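
The consideration v described above combines two indices. A minimal sketch, assuming exact-match correctness and relevance-fraction quality (the patent does not fix these particular indices):

```python
def consideration(predicted, true_output, selected, relevant):
    """Reward combining answer correctness (exact match with the true
    output T) and the fraction of selected knowledge pieces that are
    actually relevant."""
    correctness = 1.0 if predicted == true_output else 0.0
    quality = len(set(selected) & set(relevant)) / max(1, len(selected))
    return correctness + quality
```

Such a scalar reward is what would drive the search unit's neural network during learning.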
  • Publication number: 20210319330
    Abstract: There is provided a processing device for natural language processing with high search accuracy while time complexity and space complexity are suppressed. With an input sentence Q as input, a first external knowledge search unit 11 acquires, as first search results R1, pieces of external knowledge retrieved by first scores obtained from the similarity degrees between pieces of external knowledge in an external knowledge database 2 and the input sentence. A second external knowledge search unit 12 uses a neural network to determine second scores from the similarity degrees between the pieces of external knowledge in the first search results R1 and the input sentence, and searches the first search results R1 to acquire second search results R2. A processing unit 14 acquires an output for the input sentence by arithmetic processing that takes the input sentence and the pieces of external knowledge in the second search results R2 as input.
    Type: Application
    Filed: November 8, 2019
    Publication date: October 14, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
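
The two-stage search above, a cheap filter over the whole knowledge base followed by an expensive neural rerank over the survivors, is why time and space complexity stay low. A generic sketch with the scoring functions passed in as parameters (both toys here, not the patent's models):

```python
def two_stage_search(query, knowledge, cheap_score, neural_score, k1=3, k2=1):
    """Stage 1: rank the whole knowledge base with a cheap score and
    keep the top k1. Stage 2: rerank only those k1 survivors with the
    expensive (e.g. neural) score and keep the top k2."""
    stage1 = sorted(knowledge, key=lambda d: cheap_score(query, d),
                    reverse=True)[:k1]
    return sorted(stage1, key=lambda d: neural_score(query, d),
                  reverse=True)[:k2]
```

The expensive scorer runs only k1 times instead of once per database entry.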
  • Publication number: 20210232948
    Abstract: A question generation device includes generating means that takes as input a query and a relevant document containing an answer to the query and, using a machine learning model trained in advance, generates a revised query in which a potentially defective portion of the query is supplemented with a word from a prescribed lexical set.
    Type: Application
    Filed: April 25, 2019
    Publication date: July 29, 2021
    Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
  • Publication number: 20210125516
    Abstract: A question that calls for a polar (yes/no) answer can be answered accurately with its polarity. A machine comprehension unit 210 estimates the start and the end of a range in the text that serves as a basis for the answer to the question, using a reading comprehension model trained in advance to estimate the range from the input text and question. A determination unit 220 determines the polarity of the answer using a determination model trained in advance to determine whether the polarity of the answer is positive or not, based on information obtained from the processing of the machine comprehension unit 210.
    Type: Application
    Filed: June 14, 2019
    Publication date: April 29, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Atsushi OTSUKA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
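
The second stage above, deciding the answer's polarity from the extracted evidence range, can be caricatured with a negation-cue heuristic in place of the patent's trained determination model:

```python
def answer_polarity(evidence_span):
    """Toy polarity decision over the extracted evidence range:
    answer "no" if a negation cue appears, else "yes"."""
    negations = {"not", "no", "never", "cannot"}
    return "no" if set(evidence_span.lower().split()) & negations else "yes"
```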
  • Publication number: 20210042472
    Abstract: To accurately generate word vectors even when the vocabulary of a word vector data set is not limited: in a vector generating device 10 that generates a series of vectors representing an input sentence P from vectors corresponding to the words included in P, a definition-sentence-considered-context encode unit 280 draws on a dictionary DB 230 that stores sets of headwords y and definition sentences Dy (sentences defining the headwords y). For each word of the input sentence P that is a headword stored in the dictionary DB, the unit generates the series of vectors representing P using the definition sentence Dy of that headword.
    Type: Application
    Filed: March 4, 2019
    Publication date: February 11, 2021
    Applicant: Nippon Telegraph and Telephone Corporation
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA