Patents by Inventor Kyosuke NISHIDA

Kyosuke NISHIDA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230072537
    Abstract: A learning apparatus according to an embodiment has a feature generation means that takes as input a search query, a first document related to the search query, and a second document not related to the search query, and generates a feature of the search query, a feature of the first document, and a feature of the second document by using model parameters of a neural network. An update means takes the three features as input and updates the model parameters by using an error function that includes a cost function, a differentiable approximation function of the L0 norm (a minimal sketch follows this entry).
    Type: Application
    Filed: January 29, 2020
    Publication date: March 9, 2023
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Taku HASEGAWA, Kyosuke NISHIDA, Junji TOMITA, Hisako ASANO
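    The abstract fixes only the loss structure. Below is a minimal PyTorch sketch of such an error function, assuming a toy linear encoder, a margin ranking term, and 1 - exp(-alpha*x^2) as the differentiable L0 surrogate; all three choices are illustrative assumptions, not the patented method.

    ```python
    import torch
    import torch.nn as nn

    def l0_surrogate(x, alpha=10.0):
        # Smooth stand-in for the L0 norm: 1 - exp(-alpha * x^2) is near 0 at
        # zero and saturates toward 1 for clearly nonzero entries, so its sum
        # approximates the number of active feature dimensions.
        return (1.0 - torch.exp(-alpha * x.pow(2))).sum(dim=-1)

    encoder = nn.Linear(32, 32)  # toy stand-in for the feature generation means
    q = encoder(torch.randn(4, 32))      # search query features
    d_pos = encoder(torch.randn(4, 32))  # features of the related document
    d_neg = encoder(torch.randn(4, 32))  # features of the unrelated document

    # Ranking term: score the related document above the unrelated one.
    rank_loss = torch.relu(1.0 - (q * d_pos).sum(-1) + (q * d_neg).sum(-1)).mean()

    # Sparsity term: the differentiable L0 approximation from the abstract.
    sparsity = (l0_surrogate(q) + l0_surrogate(d_pos) + l0_surrogate(d_neg)).mean()

    loss = rank_loss + 1e-3 * sparsity  # the 1e-3 weight is a hypothetical choice
    loss.backward()                     # gradients update the model parameters
    ```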
  • Patent number: 11593436
    Abstract: To enable appropriate information to be provided for a user query even when there are multiple information provision modules that differ in answer generation processing, a query sending unit 212 sends a user query to each of a plurality of information provision module units 220 that differ in answer generation processing and that each generate an answer candidate for the user query. An output control unit 214 performs control such that the answer candidate acquired from each of the plurality of information provision module units 220 is displayed on a display unit 300 on a per-agent basis, together with information on the agent associated with that information provision module unit 220 (see the sketch after this entry).
    Type: Grant
    Filed: February 13, 2019
    Date of Patent: February 28, 2023
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi Otsuka, Kyosuke Nishida, Narichika Nomoto, Hisako Asano
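    A minimal sketch of the dispatch-and-display flow, with hypothetical module names and plain functions standing in for the information provision module units:

    ```python
    from dataclasses import dataclass
    from typing import Callable, List, Tuple

    @dataclass
    class AnswerCandidate:
        agent_name: str  # identifies the agent tied to the producing module
        text: str

    def dispatch(query: str,
                 modules: List[Tuple[str, Callable[[str], str]]]) -> List[AnswerCandidate]:
        # Send the user query to every module and collect one answer
        # candidate per module, tagged with its agent information.
        return [AnswerCandidate(name, answer(query)) for name, answer in modules]

    # Hypothetical modules that differ in answer generation processing.
    modules = [
        ("faq-agent",    lambda q: f"FAQ answer to: {q}"),
        ("search-agent", lambda q: f"Search snippet for: {q}"),
    ]

    for cand in dispatch("How do I reset my password?", modules):
        print(f"[{cand.agent_name}] {cand.text}")  # per-agent display
    ```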
  • Publication number: 20230034414
    Abstract: A dialogue processing apparatus includes one or more computers, each including a memory and a processor, configured to receive a question Qi (a word string representing the current question in an interactive machine reading comprehension task), a question history {Q1, . . . , Qi-1} (a set of word strings representing previous questions), and an answer history {A1, . . . , Ai-1} (a set of word strings representing previous answers to those questions), and to use a pre-learned first model parameter to generate an encoded context vector reflecting an attribute or an importance degree of each of the previous questions and answers; and to receive a document P, to be used to generate an answer Ai to the question Qi, together with the encoded context vector, and to use a pre-learned second model parameter to perform matching between the document and the previous questions and answers, to generate the answer (a minimal sketch follows this entry).
    Type: Application
    Filed: December 12, 2019
    Publication date: February 2, 2023
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
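    A minimal sketch of the history-encoding idea, assuming a bag-of-words turn encoder and a learned per-turn importance weight; the abstract's actual first and second model parameters are not specified:

    ```python
    import torch
    import torch.nn as nn

    class HistoryEncoder(nn.Module):
        # Encodes the current question plus previous Q/A turns into one
        # context vector, weighting each turn by a learned importance score.
        def __init__(self, vocab=1000, dim=64):
            super().__init__()
            self.embed = nn.EmbeddingBag(vocab, dim)  # bag-of-words turn encoder
            self.score = nn.Linear(dim, 1)            # importance of each turn

        def forward(self, turns):  # turns: list of 1-D LongTensors
            vecs = torch.stack([self.embed(t.unsqueeze(0)).squeeze(0) for t in turns])
            weights = torch.softmax(self.score(vecs).squeeze(-1), dim=0)
            return (weights.unsqueeze(-1) * vecs).sum(dim=0)  # encoded context

    enc = HistoryEncoder()
    history = [torch.randint(0, 1000, (5,)) for _ in range(3)]  # Q1, A1, Q2
    current = torch.randint(0, 1000, (4,))                      # Qi
    context = enc(history + [current])
    print(context.shape)  # torch.Size([64]); matched against document P next
    ```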
  • Patent number: 11568132
    Abstract: The present disclosure relates to concurrent learning of a relationship estimation model and a phrase generation model. The relationship estimation model estimates the relationship between phrases; the phrase generation model generates a phrase related to an input phrase. The phrase generation model includes an encoder and a decoder. The encoder converts a phrase into a vector, using a three-piece set as learning data. The decoder generates, from the converted vector and a connection expression or a relationship label, a phrase that has the relationship expressed by that connection expression or relationship label. The relationship estimation model generates a relationship score from the converted vectors, which represent each phrase included in a combination of phrases, and a vector representing the connection expression and the relationship label (see the sketch after this entry).
    Type: Grant
    Filed: March 1, 2019
    Date of Patent: January 31, 2023
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi Saito, Kyosuke Nishida, Hisako Asano, Junji Tomita
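    A minimal sketch of the two models over a shared phrase encoder, with toy dimensions and a bilinear scoring head as illustrative assumptions:

    ```python
    import torch
    import torch.nn as nn

    DIM, VOCAB, NUM_RELATIONS = 64, 1000, 5

    phrase_encoder = nn.EmbeddingBag(VOCAB, DIM)       # phrase -> vector
    relation_embed = nn.Embedding(NUM_RELATIONS, DIM)  # connection expression / label
    decoder = nn.GRU(DIM, DIM, batch_first=True)       # generates the related phrase
    scorer = nn.Bilinear(DIM, DIM, 1)                  # relationship score head

    phrase = torch.randint(0, VOCAB, (1, 4))  # toy input phrase
    relation = torch.tensor([2])              # e.g. a "cause" relationship label

    # Phrase generation: decode conditioned on phrase vector + relation label.
    h0 = (phrase_encoder(phrase) + relation_embed(relation)).unsqueeze(0)
    out, _ = decoder(torch.zeros(1, 3, DIM), h0)  # 3 toy decoding steps

    # Relationship estimation: score a candidate pair under the relation.
    other = phrase_encoder(torch.randint(0, VOCAB, (1, 4)))
    score = scorer(phrase_encoder(phrase) + relation_embed(relation), other)
    print(out.shape, score.shape)
    ```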
  • Publication number: 20230028376
    Abstract: The efficiency of summary learning that requires an additional input parameter is improved by causing a computer to execute two steps. In a first learning step, a first model for calculating an importance value of each component in source text is learned using a first training data group and a second training data group; the first training data group includes source text, a query related to a summary of the source text, and summary data related to the query in the source text, while the second training data group includes source text and summary data generated based on the source text. In a second learning step, a second model for generating summary data from the source text of training data is learned using each piece of training data in the second training data group, together with a plurality of components extracted for each piece of training data based on the importance values the first model calculates for the components of its source text (a minimal sketch follows this entry).
    Type: Application
    Filed: December 18, 2019
    Publication date: January 26, 2023
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
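    A minimal sketch of the two-stage pipeline, with a toy token-level importance model and a placeholder summarizer; the architectures and the top-k extraction rule are assumptions:

    ```python
    import torch
    import torch.nn as nn

    # First learning step (sketch): a model scoring each source component.
    class ImportanceModel(nn.Module):
        def __init__(self, vocab=1000, dim=64):
            super().__init__()
            self.embed = nn.Embedding(vocab, dim)
            self.head = nn.Linear(dim, 1)
        def forward(self, tokens):  # (seq,) -> importance value per token
            return torch.sigmoid(self.head(self.embed(tokens))).squeeze(-1)

    importance = ImportanceModel()
    source = torch.randint(0, 1000, (30,))  # toy source text

    # Second learning step (sketch): extract the highest-importance
    # components and feed them to the summarization model being trained.
    scores = importance(source)
    keep = scores.topk(k=10).indices.sort().values  # preserve source order
    extracted = source[keep]
    summary_model = nn.Embedding(1000, 64)  # placeholder for any seq2seq model
    print(summary_model(extracted).shape)
    ```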
  • Publication number: 20230026110
    Abstract: In a training data generation method, a computer executes: a generation step that generates partial data of a summary sentence created for text data; an extraction step that extracts from the text data a sentence set, a portion of the text data, based on its similarity to the partial data; and a determination step that determines, based on the similarity between the partial data and the sentence set, whether or not the partial data is to be used as training data for a neural network that generates summary sentences. This streamlines the collection of training data for a neural summarization model (see the sketch after this entry).
    Type: Application
    Filed: December 18, 2019
    Publication date: January 26, 2023
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
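    A minimal, fully concrete sketch of the generation / extraction / determination steps, using word overlap as a stand-in for the unspecified similarity measure:

    ```python
    def similarity(a: str, b: str) -> float:
        # Jaccard word overlap: a stand-in for the abstract's similarity.
        wa, wb = set(a.split()), set(b.split())
        return len(wa & wb) / max(len(wa | wb), 1)

    def make_training_pairs(text_sentences, summary, threshold=0.2):
        pairs = []
        for partial in summary.split(". "):  # generation step: partial data
            # extraction step: the source sentence most similar to it
            best = max(text_sentences, key=lambda s: similarity(s, partial))
            if similarity(best, partial) >= threshold:  # determination step
                pairs.append((best, partial))
        return pairs

    doc = ["the model is trained on pairs", "results improve with more data",
           "we release the code"]
    print(make_training_pairs(doc, "the model is trained. code is released"))
    # only the first partial survives the similarity check
    ```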
  • Publication number: 20220405639
    Abstract: An information processing apparatus includes a training unit configured to share encoding layers from a first layer to an (N-n)-th layer, whose parameters were trained in advance, between a first model and a second model, and to train the parameters of a third model through multi-task training that includes training of the first model and retraining of the second model for a predetermined task, where N and n are integers equal to or greater than 1 that satisfy N>n. In the third model, the encoding layers from the ((N-n)+1)-th layer to the N-th layer, whose parameters were trained in advance, are divided between the first model and the second model (a minimal sketch follows this entry).
    Type: Application
    Filed: November 21, 2019
    Publication date: December 22, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
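    A minimal sketch of the layer layout, assuming standard Transformer encoder layers; N, n, and the dimensions are toy values:

    ```python
    import torch
    import torch.nn as nn

    N, n, DIM = 4, 1, 64  # N encoder layers; the top n are model-specific

    make_layer = lambda: nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
    shared = nn.ModuleList([make_layer() for _ in range(N - n)])  # layers 1..N-n
    head_a = nn.ModuleList([make_layer() for _ in range(n)])      # first model
    head_b = nn.ModuleList([make_layer() for _ in range(n)])      # second model

    def encode(x, branch):
        for layer in shared:
            x = layer(x)  # parameters shared by both models
        for layer in branch:
            x = layer(x)  # the divided upper layers, (N-n)+1 .. N
        return x

    x = torch.randn(2, 8, DIM)
    # Multi-task training would sum the losses of both branches (sketch).
    ya, yb = encode(x, head_a), encode(x, head_b)
    print(ya.shape, yb.shape)
    ```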
  • Publication number: 20220366140
    Abstract: A text generation apparatus includes a memory and a processor configured to execute: acquiring a reference text based on an input text and on information different from the input text; and generating a text based on the input text and the reference text. The acquiring and the generating are implemented as neural networks based on learned parameters (see the sketch after this entry).
    Type: Application
    Filed: March 3, 2020
    Publication date: November 17, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA, Atsushi OTSUKA
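    A minimal sketch of the acquire-then-generate flow, with word overlap standing in for the learned acquisition network and a string template standing in for the generator:

    ```python
    def retrieve_reference(input_text: str, corpus: list) -> str:
        # Stand-in for the acquiring network: pick the corpus entry that
        # shares the most words with the input text.
        overlap = lambda a, b: len(set(a.split()) & set(b.split()))
        return max(corpus, key=lambda doc: overlap(doc, input_text))

    def generate(input_text: str, reference: str) -> str:
        # Placeholder for the generating network (a seq2seq model over the
        # concatenated input and reference in a real system).
        return f"text for [{input_text}] grounded in [{reference}]"

    corpus = ["solar panels convert sunlight", "rivers flow to the sea"]
    ref = retrieve_reference("how do solar panels work", corpus)
    print(generate("how do solar panels work", ref))
    ```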
  • Publication number: 20220358361
    Abstract: A generation apparatus includes a generation unit configured to use a machine learning model learned in advance, with a document as input, to extract one or more ranges in the document that are likely to be answers, and to generate, for each extracted range, a question representation whose answer is that range (a minimal sketch follows this entry).
    Type: Application
    Filed: February 12, 2020
    Publication date: November 10, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
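    A minimal sketch of span extraction ahead of question generation, with a toy scorer; the real model and the question generator itself are not specified here:

    ```python
    import torch
    import torch.nn as nn

    DIM, VOCAB = 64, 1000
    embed = nn.Embedding(VOCAB, DIM)
    span_scorer = nn.Linear(DIM, 2)  # start / end logits per token

    doc = torch.randint(0, VOCAB, (20,))  # toy document
    logits = span_scorer(embed(doc))      # (20, 2)
    start = logits[:, 0].argmax().item()
    end = start + logits[start:, 1].argmax().item()  # end at or after start

    answer_span = doc[start:end + 1]  # a range likely to be an answer
    # A real system would now generate a question whose answer is this span.
    print(f"answer span tokens: {answer_span.tolist()} -> generate question")
    ```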
  • Publication number: 20220343076
    Abstract: A text generation apparatus includes a memory and a processor configured to, based on learned parameters of neural networks, acquire as a reference text a predetermined number (two or more) of sentences having relatively high relevance to an input sentence, drawn from a set of sentences different from the input sentence, and to generate text based on the input sentence and the reference text, so that information to be considered during generation can be added as text (see the sketch after this entry).
    Type: Application
    Filed: October 2, 2019
    Publication date: October 27, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
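    A minimal sketch of acquiring two or more reference sentences, again with word overlap as a stand-in for the learned relevance model:

    ```python
    def top_k_references(input_sentence: str, sentence_set: list, k: int = 2):
        # Return the k (two or more) sentences most relevant to the input.
        overlap = lambda a, b: len(set(a.split()) & set(b.split()))
        ranked = sorted(sentence_set,
                        key=lambda s: overlap(s, input_sentence), reverse=True)
        return ranked[:k]

    sentences = ["cats sleep a lot", "dogs bark at night", "cats chase mice"]
    refs = top_k_references("why do cats sleep", sentences, k=2)
    # The generator then conditions on the input sentence plus these
    # references, adding the extra information "as text".
    print(refs)
    ```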
  • Patent number: 11481560
    Abstract: An information processing device includes a processing unit configured to receive a document and a question as input and to output, using neural networks, either an answer range (a range of a string in the document that can be an answer to the question) or an answer suitability of the document with respect to the question. The processing unit includes a first neural network that calculates the answer range and a second neural network that calculates the answer suitability, and some of the layers constituting the two neural networks are shared between them (a minimal sketch follows this entry).
    Type: Grant
    Filed: October 5, 2018
    Date of Patent: October 25, 2022
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kyosuke Nishida, Itsumi Saito, Atsushi Otsuka, Hisako Asano, Junji Tomita
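    A minimal sketch of the shared-layer arrangement, assuming a toy shared encoder and two lightweight heads:

    ```python
    import torch
    import torch.nn as nn

    DIM, VOCAB = 64, 1000

    shared = nn.Sequential(               # layers shared by both networks
        nn.Embedding(VOCAB, DIM), nn.Linear(DIM, DIM), nn.ReLU())
    range_head = nn.Linear(DIM, 2)  # first network: start / end logits
    suit_head = nn.Linear(DIM, 1)   # second network: answer suitability

    tokens = torch.randint(0, VOCAB, (1, 16))  # toy question + document
    h = shared(tokens)                         # shared representation

    span_logits = range_head(h)                            # (1, 16, 2)
    suitability = torch.sigmoid(suit_head(h.mean(dim=1)))  # (1, 1)
    print(span_logits.shape, suitability.shape)
    ```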
  • Patent number: 11404063
    Abstract: A nonverbal information generation apparatus includes a nonverbal information generation unit that generates time-information-stamped nonverbal information corresponding to time-information-stamped text feature quantities, and an expression unit that expresses the nonverbal information, on the basis of the time-information-stamped text feature quantities and a learned nonverbal information generation model. The time-information-stamped text feature quantities comprise feature quantities extracted from text and time information representing the times assigned to predetermined units of the text. The nonverbal information controls the expression unit so that it expresses behavior corresponding to the text (see the sketch after this entry).
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: August 2, 2022
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Ryo Ishii, Ryuichiro Higashinaka, Taichi Katayama, Junji Tomita, Nozomi Kobayashi, Kyosuke Nishida
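    A minimal sketch of the data flow only, with a toy rule in place of the learned nonverbal information generation model:

    ```python
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class TimedFeature:
        word: str
        start: float  # time assigned to this unit of text (seconds)
        end: float

    def generate_nonverbal(features: List[TimedFeature]) -> List[Tuple[float, float, str]]:
        # Toy rule standing in for the learned model: one time-stamped
        # behavior label per text unit.
        return [(f.start, f.end, "nod" if f.word.endswith("?") else "beat-gesture")
                for f in features]

    utterance = [TimedFeature("hello", 0.0, 0.4), TimedFeature("ready?", 0.5, 1.0)]
    for start, end, label in generate_nonverbal(utterance):
        print(f"{start:.1f}-{end:.1f}s: {label}")  # drives the expression unit
    ```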
  • Patent number: 11403469
    Abstract: The present invention makes it possible to generate a paraphrastic sentence that has a similar meaning to the original sentence despite a local word or phrase difference, or a non-paraphrastic sentence that is not a paraphrase despite having a similar overall meaning to the original sentence. An estimation unit 22 estimates a word deletion probability for each word of an input sentence, using either a positive example model, trained on positive examples each consisting of a sentence and a paraphrastic sentence of it and used to generate a paraphrastic sentence by deleting words, or a negative example model, trained on negative examples each consisting of a sentence and a non-paraphrastic sentence of it and used to generate a non-paraphrastic sentence by deleting words (a minimal sketch follows this entry).
    Type: Grant
    Filed: July 23, 2019
    Date of Patent: August 2, 2022
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi Saito, Kyosuke Nishida, Hisako Asano, Junji Tomita
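    A minimal sketch of deletion-probability estimation, with a toy per-word classifier standing in for the positive example model:

    ```python
    import torch
    import torch.nn as nn

    VOCAB, DIM = 1000, 64

    class DeletionModel(nn.Module):
        # Per-word deletion probability; one instance would be trained on
        # positive (paraphrase) pairs, another on negative pairs.
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(VOCAB, DIM)
            self.head = nn.Linear(DIM, 1)
        def forward(self, tokens):
            return torch.sigmoid(self.head(self.embed(tokens))).squeeze(-1)

    positive_model = DeletionModel()  # trained on paraphrase pairs (sketch)
    sentence = torch.randint(0, VOCAB, (10,))
    p_delete = positive_model(sentence)
    kept = sentence[p_delete < 0.5]  # delete the words judged removable
    print([round(p, 2) for p in p_delete.tolist()], kept.tolist())
    ```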
  • Publication number: 20220229997
    Abstract: Provided is a generation unit that takes as inputs a question Qi (a word sequence representing the current question in a dialogue), a document P used to generate an answer Ai to the question Qi, a question history {Qi-1, . . . , Qi-k} (a set of word sequences representing the k past questions), and an answer history {Ai-1, . . . , Ai-k} (a set of word sequences representing the answers to those k questions), and that generates the answer Ai by machine reading comprehension in an extractive mode or a generative mode using pre-trained model parameters (a minimal sketch follows this entry).
    Type: Application
    Filed: May 28, 2019
    Publication date: July 21, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
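    A minimal sketch of switching between the extractive and generative modes, with toy heads over one encoded sequence; the mode selector is an assumption:

    ```python
    import torch
    import torch.nn as nn

    VOCAB, DIM = 1000, 64
    embed = nn.Embedding(VOCAB, DIM)
    mode_head = nn.Linear(DIM, 2)     # choose extractive vs generative
    span_head = nn.Linear(DIM, 2)     # start / end logits (extractive mode)
    gen_head = nn.Linear(DIM, VOCAB)  # next-word logits (generative mode)

    # Toy encoding of [question; history; document] as one token sequence.
    tokens = torch.randint(0, VOCAB, (24,))
    h = embed(tokens)

    if mode_head(h.mean(0)).argmax().item() == 0:  # extractive mode
        start = span_head(h)[:, 0].argmax().item()
        print("extract the answer starting at position", start)
    else:                                          # generative mode
        print("generate the answer; first word id:",
              gen_head(h.mean(0)).argmax().item())
    ```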
  • Publication number: 20220138239
    Abstract: A sentence generation device has an estimation unit that receives a first sentence and an output length as input and estimates the importance of each word constituting the first sentence using a pre-trained model, and a generation unit that generates a second sentence based on the importance. This makes it possible to evaluate the importance of the constituent elements of an input sentence in correspondence with a designated output length (see the sketch after this entry).
    Type: Application
    Filed: February 25, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
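    A minimal sketch of length-conditioned importance estimation, assuming the output length enters through an embedding added to each word representation:

    ```python
    import torch
    import torch.nn as nn

    VOCAB, DIM = 1000, 64
    embed = nn.Embedding(VOCAB, DIM)
    length_embed = nn.Embedding(100, DIM)  # designated output length bucket
    head = nn.Linear(DIM, 1)

    def importance(tokens, out_len):
        # Word importance conditioned on the designated output length.
        h = embed(tokens) + length_embed(torch.tensor([out_len]))  # broadcast
        return torch.sigmoid(head(h)).squeeze(-1)

    first_sentence = torch.randint(0, VOCAB, (12,))
    scores = importance(first_sentence, out_len=5)
    keep = scores.topk(5).indices.sort().values   # budgeted pick, source order
    second_sentence_input = first_sentence[keep]  # fed to the generation unit
    print(second_sentence_input.tolist())
    ```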
  • Publication number: 20220138438
    Abstract: A sentence generation device has an estimation unit that receives a first sentence and a focus point related to the generation of a second sentence to be generated from the first sentence, and estimates the importance of each word constituting the first sentence using a pre-trained model, and a generation unit that generates the second sentence based on the importance. This makes it possible to evaluate the importance of the constituent elements of an input sentence in correspondence with a designated focus point (see the sketch after this entry).
    Type: Application
    Filed: February 21, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
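    A minimal variation of the previous sketch where a focus point, rather than an output length, conditions the importance scores:

    ```python
    import torch
    import torch.nn as nn

    VOCAB, DIM = 1000, 64
    embed = nn.Embedding(VOCAB, DIM)
    head = nn.Linear(DIM, 1)

    def importance_with_focus(sentence_tokens, focus_tokens):
        # Importance of each word in the first sentence, conditioned on the
        # designated focus point (here the mean embedding of its words).
        focus_vec = embed(focus_tokens).mean(dim=0)
        h = embed(sentence_tokens) + focus_vec  # broadcast over words
        return torch.sigmoid(head(h)).squeeze(-1)

    first = torch.randint(0, VOCAB, (12,))
    focus = torch.randint(0, VOCAB, (3,))  # e.g. an entity of interest
    scores = importance_with_focus(first, focus)
    print([round(s, 2) for s in scores.tolist()])  # drives the generation unit
    ```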
  • Publication number: 20220138267
    Abstract: A generation apparatus includes a generation unit configured to use a machine learning model learned in advance, with a document as input, to generate a question representation for a range of an answer in the document. When generating a word of the question representation by copying from the document, the generation unit adjusts the probability that a word included in the range is copied (a minimal sketch follows this entry).
    Type: Application
    Filed: February 12, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
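    A minimal numeric sketch of the copy-probability adjustment, assuming a pointer-generator-style copy distribution and a hypothetical down-weighting factor:

    ```python
    import torch

    # Attention over the document gives each document token a copy
    # probability; down-weighting tokens inside the answer range keeps the
    # generated question from leaking its own answer.
    doc_len, ans_start, ans_end = 10, 3, 5
    attention = torch.softmax(torch.randn(doc_len), dim=0)  # copy distribution

    penalty = torch.ones(doc_len)
    penalty[ans_start:ans_end + 1] = 0.1  # hypothetical down-weighting factor
    adjusted = attention * penalty
    adjusted = adjusted / adjusted.sum()  # renormalize to a distribution

    print([round(p, 3) for p in attention.tolist()])
    print([round(p, 3) for p in adjusted.tolist()])  # answer tokens rarely copied
    ```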
  • Publication number: 20220138601
    Abstract: A question-answering apparatus includes answer generating means that accepts as input a document set made up of one or more documents, a question sentence, and a style for the answer sentence, and that runs a process of generating an answer sentence for the question sentence using a learned model based on the document set. When generating the answer sentence, the learned model determines the generation probability of the words it contains according to the style (a minimal sketch follows this entry).
    Type: Application
    Filed: February 10, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kyosuke NISHIDA, Itsumi SAITO, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
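    A minimal sketch of style-conditioned word probabilities, assuming the style enters as an embedding added to the decoder state:

    ```python
    import torch
    import torch.nn as nn

    VOCAB, DIM, NUM_STYLES = 1000, 64, 3  # e.g. concise / polite / verbose
    embed = nn.Embedding(VOCAB, DIM)
    style_embed = nn.Embedding(NUM_STYLES, DIM)
    out = nn.Linear(DIM, VOCAB)

    def next_word_logits(context_tokens, style_id):
        # Word generation probabilities conditioned on the requested style.
        h = embed(context_tokens).mean(dim=0) + style_embed(torch.tensor(style_id))
        return out(h)  # (VOCAB,) logits for the next answer word

    context = torch.randint(0, VOCAB, (8,))  # toy question + document set
    for style in range(NUM_STYLES):
        probs = torch.softmax(next_word_logits(context, style), dim=-1)
        print(style, probs.argmax().item())  # the style shifts the distribution
    ```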
  • Publication number: 20220043972
    Abstract: An encoding unit transforms an input piece of text, divided into a plurality of spans (subdivided units of the text), and an input question into a vector representation sequence representing the meaning of each span and of the question, using a pre-trained encoding model that transforms input text into a vector representation sequence representing its meaning. For each span, an evidence extraction unit estimates an evidence score, indicating the degree to which the span is suitable as evidence for extracting the answer, using a pre-trained extraction model that calculates the evidence score from the vector representation sequence (see the sketch after this entry).
    Type: Application
    Filed: December 17, 2019
    Publication date: February 10, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Atsushi OTSUKA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA, Itsumi SAITO
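    A minimal sketch of span-level evidence scoring, with a bag-of-words encoder standing in for the pre-trained encoding model:

    ```python
    import torch
    import torch.nn as nn

    VOCAB, DIM = 1000, 64
    encoder = nn.EmbeddingBag(VOCAB, DIM)  # stand-in for the encoding model
    evidence_head = nn.Linear(2 * DIM, 1)  # extraction model: span + question

    question = torch.randint(0, VOCAB, (1, 6))
    spans = [torch.randint(0, VOCAB, (1, 8)) for _ in range(4)]  # text spans

    q_vec = encoder(question)
    scores = torch.cat([
        evidence_head(torch.cat([encoder(s), q_vec], dim=-1)) for s in spans
    ]).squeeze(-1)
    best = scores.argmax().item()
    print(f"span {best} is the best evidence (score {scores[best].item():.2f})")
    ```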
  • Publication number: 20210383257
    Abstract: There is provided a learning device for learning a neural network used to search external knowledge, in order to increase the search accuracy of the external knowledge required for arithmetic processing. With an input sentence Q as input, an external knowledge search unit 22 uses a neural network to select pieces of external knowledge from an external knowledge database 2, based on their degrees of similarity to the input sentence Q, and outputs the selected pieces as search results R2. A processing unit 14 acquires a response sentence A to the input sentence Q by arithmetic processing, with the input sentence Q and the selected pieces of external knowledge as input. A reward calculation unit 23 calculates a reward v determined from an index indicating the correctness of the response sentence A, based on a true output T given for the input sentence Q in advance, and an index indicating the quality of the selected pieces of external knowledge (a minimal sketch follows this entry).
    Type: Application
    Filed: November 8, 2019
    Publication date: December 9, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
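    The reward-driven selection suggests a policy-gradient flavor of training. Below is a minimal sketch assuming a REINFORCE-style update, which the abstract itself does not name, with the reward stubbed by a random scalar:

    ```python
    import torch
    import torch.nn as nn

    DIM, K = 64, 8                   # K pieces of external knowledge
    retriever = nn.Linear(DIM, DIM)  # neural similarity model (sketch)

    query = torch.randn(1, DIM)      # encoded input sentence Q
    knowledge = torch.randn(K, DIM)  # encoded external knowledge database

    sims = retriever(query) @ knowledge.T  # similarity degrees
    probs = torch.softmax(sims, dim=-1)
    pick = torch.multinomial(probs, 1).item()  # select a piece of knowledge

    # Reward v: correctness of the response sentence A against the true
    # output T, plus the quality of the selected knowledge (stubbed here).
    v = torch.randn(())
    # No gradient flows through the downstream processing unit, so the
    # retriever learns from the scalar reward alone.
    loss = -v * torch.log(probs[0, pick])
    loss.backward()
    print(loss.item())
    ```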