Patents by Inventor Itsumi SAITO
Itsumi SAITO has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12210847
Abstract: A sentence generation device has: an estimation unit for receiving input of a first sentence and a focus point related to generation of a second sentence to be generated based on the first sentence, and estimating the importance of each word constituting the first sentence using a pre-trained model; and a generation unit for generating the second sentence based on the importance. This makes it possible to evaluate the importance of a constituent element of an input sentence in correspondence with a designated focus point.
Type: Grant
Filed: February 21, 2020
Date of Patent: January 28, 2025
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi Saito, Kyosuke Nishida, Atsushi Otsuka, Kosuke Nishida, Hisako Asano, Junji Tomita
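For illustration, a minimal PyTorch sketch of the claimed importance-estimation step, with a bidirectional LSTM standing in for the pre-trained model; the class name, dimensions, and input layout are all assumptions of this sketch:

```python
import torch
import torch.nn as nn

class FocusImportanceEstimator(nn.Module):
    """Scores each word of the first sentence, conditioned on a focus point."""

    def __init__(self, vocab_size: int, d_model: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.LSTM(d_model, d_model, batch_first=True, bidirectional=True)
        self.scorer = nn.Linear(2 * d_model, 1)

    def forward(self, sentence_ids: torch.Tensor, focus_ids: torch.Tensor) -> torch.Tensor:
        # Encode the focus point and the sentence as one joint sequence.
        joint = torch.cat([focus_ids, sentence_ids], dim=1)
        hidden, _ = self.encoder(self.embed(joint))
        scores = self.scorer(hidden).squeeze(-1)
        # Keep only the positions aligned with the sentence tokens:
        # one importance value in [0, 1] per word of the first sentence.
        return torch.sigmoid(scores[:, focus_ids.size(1):])
```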
-
Publication number: 20250021762
Abstract: A language processing device includes circuitry configured to generate an error sentence corresponding to an original sentence based on pronunciation corresponding to text data indicating the original sentence; use a language model based on a neural network model to generate a prediction sentence from the error sentence based on a language model parameter of the language model; and update the language model parameter based on a difference between the original sentence and the prediction sentence.
Type: Application
Filed: December 1, 2021
Publication date: January 16, 2025
Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Sen YOSHIDA
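A rough sketch of the training loop this abstract describes, under loose assumptions: a toy homophone table stands in for pronunciation-based error generation, and a small GRU model stands in for the neural language model.

```python
import torch
import torch.nn as nn

# Toy pronunciation confusions; a real system would derive these from
# phonetic data rather than a hand-written table.
HOMOPHONES = {"write": "right", "two": "too"}

class TinyLM(nn.Module):
    """Placeholder language model that predicts the original sentence."""

    def __init__(self, vocab_size: int, d: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d)
        self.rnn = nn.GRU(d, d, batch_first=True)
        self.out = nn.Linear(d, vocab_size)

    def forward(self, ids: torch.Tensor) -> torch.Tensor:
        hidden, _ = self.rnn(self.embed(ids))
        return self.out(hidden)

def training_step(model, vocab, original, optimizer):
    # vocab must map every word, including the homophone replacements.
    error = [HOMOPHONES.get(w, w) for w in original]   # error sentence
    src = torch.tensor([[vocab[w] for w in error]])
    tgt = torch.tensor([vocab[w] for w in original])
    logits = model(src).squeeze(0)                      # prediction sentence
    loss = nn.functional.cross_entropy(logits, tgt)     # original vs. prediction
    optimizer.zero_grad(); loss.backward(); optimizer.step()
    return loss.item()
```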
-
Publication number: 20240320440
Abstract: Provided is a generation unit that takes as inputs a question Qi that is a word sequence representing a current question in a dialogue, a document P used to generate an answer Ai to the question Qi, a question history {Qi-1, . . . , Qi-k} that is a set of word sequences representing k past questions, and an answer history {Ai-1, . . . , Ai-k} that is a set of word sequences representing answers to the k questions, and generates the answer Ai by machine reading comprehension in an extractive mode or a generative mode using pre-trained model parameters.
Type: Application
Filed: May 22, 2024
Publication date: September 26, 2024
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
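The input packing this abstract implies might look like the following plain-Python sketch; the separator tokens and the truncation to the last k turns are conventions of the sketch, not of the application:

```python
def build_input(question: str, document: str,
                q_history: list, a_history: list, k: int = 3) -> str:
    """Packs Qi, P, and the k most recent (question, answer) turns into one
    sequence for a reading-comprehension model; <q>/<a>/<doc> are this
    sketch's own separator tokens."""
    turns = [f"<q> {q} <a> {a}" for q, a in zip(q_history[-k:], a_history[-k:])]
    return " ".join(turns + [f"<q> {question} <doc> {document}"])
```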
-
Patent number: 12026472
Abstract: Provided is a generation unit that takes as inputs a question Qi that is a word sequence representing a current question in a dialogue, a document P used to generate an answer Ai to the question Qi, a question history {Qi-1, . . . , Qi-k} that is a set of word sequences representing k past questions, and an answer history {Ai-1, . . . , Ai-k} that is a set of word sequences representing answers to the k questions, and generates the answer Ai by machine reading comprehension in an extractive mode or a generative mode using pre-trained model parameters.
Type: Grant
Filed: May 28, 2019
Date of Patent: July 2, 2024
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Yasuhito Osugi, Itsumi Saito, Kyosuke Nishida, Hisako Asano, Junji Tomita
-
Publication number: 20240202495
Abstract: A learning apparatus executes: receiving a text and a question associated with the text, and calculating an evidence score expressing the likelihood of a character string included in the text being evidence for an answer to the question, by using a model parameter of a first neural network; extracting, by sampling from a predetermined distribution having the evidence score as a parameter, a first set indicating a set of the character strings serving as the evidence for the answer from the text; receiving the question and the first set and extracting the answer from the first set by using a model parameter of a second neural network; and learning the model parameters of the first and second neural networks by calculating a gradient through error back propagation using a continuous relaxation and a first loss between the answer and a true answer to the question.
Type: Application
Filed: March 6, 2020
Publication date: June 20, 2024
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
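The continuous relaxation named in the abstract can be sketched with Gumbel-Softmax sampling, one standard relaxation of discrete sampling; whether the application uses exactly this form is an assumption of the sketch:

```python
import torch
import torch.nn.functional as F

def sample_evidence(evidence_scores: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """Relaxed keep/drop decision per candidate evidence string.

    evidence_scores: (num_spans,) unnormalized scores from the first network.
    Returns soft selection weights the answer extractor can consume, while
    gradients flow back through the sampling step via the relaxation.
    """
    # Two logits per span: keep vs. drop.
    logits = torch.stack([evidence_scores, -evidence_scores], dim=-1)
    soft = F.gumbel_softmax(logits, tau=tau, hard=False)
    return soft[..., 0]  # probability-like weight of keeping each span
```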
-
Publication number: 20240202442
Abstract: A text generation apparatus includes a content selection unit that acquires a reference text based on an input text and information different from the input text, and a generation unit that generates a text based on the input text and the reference text, wherein the content selection unit and the generation unit are neural networks based on learned parameters, so that information to be considered when generating a text can be added as text.
Type: Application
Filed: March 4, 2024
Publication date: June 20, 2024
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA, Atsushi OTSUKA
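A hedged end-to-end sketch of the two units (the same apparatus also appears under patent 11954435 and publication 20220366140 below), with simple word overlap standing in for the learned content-selection network and a caller-supplied `generate` function standing in for the generation unit:

```python
def generate_with_reference(input_text: str, knowledge: list, generate) -> str:
    """Content selection followed by generation; `generate` is a placeholder
    for the learned generation unit, not an API from the publication."""
    def overlap(a: str, b: str) -> int:
        return len(set(a.split()) & set(b.split()))

    # Content selection: pick the outside information most similar to the input.
    reference = max(knowledge, key=lambda snippet: overlap(input_text, snippet))
    # Generation: condition on both the input text and the reference text.
    return generate(input_text, reference)
```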
-
Patent number: 11972365
Abstract: A question generation device includes generating means which takes as input a query and a relevant document including an answer to the query and, using a machine learning model trained in advance, generates a revised query in which a potentially defective portion of the query is supplemented with a word included in a prescribed lexical set.
Type: Grant
Filed: April 25, 2019
Date of Patent: April 30, 2024
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Atsushi Otsuka, Kyosuke Nishida, Itsumi Saito, Kosuke Nishida, Hisako Asano, Junji Tomita
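A toy sketch of the supplementation idea; the defect heuristic here (append lexical-set words from the relevant document that the query lacks) replaces the learned model and is purely illustrative:

```python
def revise_query(query: str, relevant_document: str, lexicon: set) -> str:
    """Supplements the query with prescribed-lexicon words found in the
    relevant document but missing from the query."""
    query_words = set(query.split())
    missing = []
    for w in relevant_document.split():
        if w in lexicon and w not in query_words and w not in missing:
            missing.append(w)
    return query if not missing else query + " " + " ".join(missing[:2])
```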
-
Patent number: 11954435
Abstract: A text generation apparatus includes a memory and a processor configured to execute acquiring a reference text based on an input text and information different from the input text; and generating a text based on the input text and the reference text, wherein the acquiring and the generating are implemented as neural networks based on learned parameters.
Type: Grant
Filed: March 3, 2020
Date of Patent: April 9, 2024
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi Saito, Kyosuke Nishida, Kosuke Nishida, Hisako Asano, Junji Tomita, Atsushi Otsuka
-
Publication number: 20230306202
Abstract: A language processing apparatus includes: a preprocessing unit that splits an input text into a plurality of short texts; a language processing unit that calculates a first feature and a second feature using a trained model for each of the plurality of short texts; and an external storage unit configured to store a third feature for one or more short texts. The language processing unit uses the trained model to calculate the second feature for a given short text from the first feature of that short text and the third feature stored in the external storage unit.
Type: Application
Filed: August 20, 2020
Publication date: September 28, 2023
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
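One plausible reading of this arrangement in PyTorch, with a GRU cell standing in for the trained model and a running tensor standing in for the external storage unit; all names are this sketch's own:

```python
import torch
import torch.nn as nn

class ChunkedEncoder(nn.Module):
    """Processes short texts in order, reusing features cached for earlier
    chunks when encoding the current one."""

    def __init__(self, d_model: int = 128):
        super().__init__()
        self.d_model = d_model
        self.combine = nn.GRUCell(d_model, d_model)

    def forward(self, first_features: list) -> list:
        # `store` plays the role of the external storage unit.
        store = torch.zeros(1, self.d_model)
        second_features = []
        for f in first_features:            # one (1, d_model) tensor per short text
            store = self.combine(f, store)  # first feature + cached third feature
            second_features.append(store)   # cached for the following short texts
        return second_features
```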
-
Patent number: 11704485
Abstract: An appropriate vector of any phrase can be generated. A lattice construction unit 212 constructs a lattice structure formed by links binding adjacent word or phrase candidates, based on a morphological analysis result and a dependency analysis result of input text. A first learning unit 213 performs learning of a neural network A for estimating nearby word or phrase candidates from word or phrase candidates based on the lattice structure. A vector generation unit 214 acquires a vector of each of the word or phrase candidates from the neural network A and sets the vector as learning data. A second learning unit performs learning of a neural network B for vectorizing the word or phrase candidates based on the learning data.
Type: Grant
Filed: February 15, 2019
Date of Patent: July 18, 2023
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi Saito, Kyosuke Nishida, Hisako Asano, Junji Tomita
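A plain-Python sketch of the lattice construction step; the span-based candidate enumeration below replaces real morphological and dependency analysis:

```python
def build_lattice(words: list, max_phrase_len: int = 2):
    """Candidates are all spans of up to max_phrase_len consecutive words;
    links bind candidates that are adjacent in the sentence."""
    candidates = [(i, j, " ".join(words[i:j]))
                  for i in range(len(words))
                  for j in range(i + 1, min(i + max_phrase_len, len(words)) + 1)]
    links = [(a[2], b[2]) for a in candidates for b in candidates if a[1] == b[0]]
    return candidates, links

# e.g. build_lattice(["deep", "learning", "works"]) links "deep learning" to
# "works" as well as "deep" to "learning", giving network A its
# (candidate, nearby candidate) training pairs.
```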
-
Patent number: 11693854
Abstract: Provided is an answer generation unit configured to receive a document and a question as inputs and execute processing of generating an answer sentence for the question by a learned model, using words included in the union of a predetermined first vocabulary and a second vocabulary composed of words included in the document and the question. The learned model includes a neural network that has learned in advance whether a word included in the answer sentence is included in the second vocabulary, and that increases or decreases the probability at which a word included in the second vocabulary is selected as a word of the answer sentence at the time of generating the answer sentence.
Type: Grant
Filed: March 27, 2019
Date of Patent: July 4, 2023
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Kyosuke Nishida, Atsushi Otsuka, Itsumi Saito, Hisako Asano, Junji Tomita
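The gating idea can be sketched as reweighting the output distribution over the union vocabulary; the shapes and the specific gating formula are assumptions of this sketch, not the patented architecture:

```python
import torch

def gated_output_distribution(vocab_logits: torch.Tensor,
                              in_second_vocab: torch.Tensor,
                              gate: float) -> torch.Tensor:
    """vocab_logits: (V,) scores over the union vocabulary.
    in_second_vocab: (V,) with 1.0 where the word occurs in the document
    or question, 0.0 elsewhere.
    gate: value in (0, 1) predicted by the network for this decode step;
    values above 0.5 boost second-vocabulary words, below 0.5 suppress them.
    """
    probs = torch.softmax(vocab_logits, dim=-1)
    boosted = probs * (gate * in_second_vocab + (1 - gate) * (1 - in_second_vocab))
    return boosted / boosted.sum()  # renormalize to a distribution
```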
-
Patent number: 11651166
Abstract: A learning device of a phrase generation model includes a memory; and a processor configured to execute learning the phrase generation model including an encoder and a decoder, by using, as training data, a 3-tuple. The 3-tuple includes a combination of phrases and at least one of a conjunctive expression representing a relationship between the phrases, and a relational label indicating the relationship represented by the conjunctive expression. The encoder is configured to convert a phrase into a vector from a 2-tuple. The 2-tuple includes a phrase and at least one of the conjunctive expression and the relational label. The decoder is configured to generate, from the converted vector and the conjunctive expression or the relational label, a phrase having the relationship represented by the conjunctive expression or the relational label with respect to the phrase.
Type: Grant
Filed: February 22, 2019
Date of Patent: May 16, 2023
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi Saito, Kyosuke Nishida, Hisako Asano, Junji Tomita
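Shaping one 3-tuple of training data into an encoder input and a decoder target might look like the following; the `<rel>` separator is invented here:

```python
def make_training_pair(phrase_a, phrase_b, connective=None, label=None):
    """Turns a 3-tuple (phrase pair plus conjunctive expression and/or
    relational label) into an (encoder input, decoder target) pair."""
    condition = connective if connective is not None else (label or "")
    return (f"{phrase_a} <rel> {condition}", phrase_b)

# e.g. make_training_pair("it rained hard", "the game was called off", "so")
# asks the decoder to produce the consequent phrase given the antecedent
# phrase and the conjunctive expression "so".
```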
-
Publication number: 20230130902
Abstract: A text generation apparatus includes a processor and a memory storing program instructions that cause the processor to: receive an input sentence including one or more words; estimate the importance of each word included in the input sentence and encode the input sentence; and take the importance and the result of encoding the input sentence as inputs to generate a sentence based on the input sentence. The processor uses a neural network based on learned parameters. This improves the accuracy of sentence generation.
Type: Application
Filed: March 3, 2020
Publication date: April 27, 2023
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
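One simple way to let the generation step consume the estimated importance is to reweight the encoder memory before decoding, sketched below; the publication may realize this coupling differently:

```python
import torch

def importance_weighted_memory(encoder_states: torch.Tensor,
                               importance: torch.Tensor) -> torch.Tensor:
    """encoder_states: (seq_len, d_model); importance: (seq_len,) in [0, 1].
    The decoder then attends over the reweighted states, so highly
    important words contribute more to generation."""
    return encoder_states * importance.unsqueeze(-1)
```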
-
Publication number: 20230034414
Abstract: A dialogue processing apparatus includes one or more computers each including a memory and a processor configured to: receive a question Qi as a word string representing a current question in an interactive machine reading comprehension task, a question history {Q1, . . . , Qi-1} as a set of word strings representing previous questions, and an answer history {A1, . . . , Ai-1} as a set of word strings representing previous answers to the previous questions, and use a pre-learned first model parameter to generate an encoded context vector reflecting an attribute or an importance degree of each of the previous questions and the previous answers; and receive a document P to be used to generate an answer Ai to the question Qi together with the encoded context vector, and use a pre-learned second model parameter to perform matching between the document and the previous questions and answers, to generate the answer.
Type: Application
Filed: December 12, 2019
Publication date: February 2, 2023
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
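A skeleton of the first stage, where per-turn importance weights produce the encoded context vector; the module layout and dimensions are illustrative, not taken from the publication:

```python
import torch
import torch.nn as nn

class HistoryEncoder(nn.Module):
    """Encodes previous questions and answers into one context vector,
    weighting each turn by an estimated importance."""

    def __init__(self, d_model: int = 128):
        super().__init__()
        self.importance = nn.Linear(d_model, 1)

    def forward(self, turn_vectors: torch.Tensor) -> torch.Tensor:
        """turn_vectors: (num_turns, d_model), one vector per previous Q or A."""
        weights = torch.softmax(self.importance(turn_vectors), dim=0)
        return (weights * turn_vectors).sum(dim=0)  # encoded context vector
```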
-
Patent number: 11568132
Abstract: The present disclosure relates to concurrent learning of a relationship estimation model and a phrase generation model. The relationship estimation model estimates a relationship between phrases. The phrase generation model generates a phrase that relates to an input phrase. The phrase generation model includes an encoder and a decoder. The encoder converts a phrase into a vector using a three-piece set as learning data. The decoder generates, based on the converted vector and a connection expression or a relationship label, a phrase having a relationship expressed by the connection expression or the relationship label for the phrase. The relationship estimation model generates a relationship score from the converted vector, which indicates each phrase included in a combination of the phrases, and a vector indicating the connection expression and the relationship label.
Type: Grant
Filed: March 1, 2019
Date of Patent: January 31, 2023
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi Saito, Kyosuke Nishida, Hisako Asano, Junji Tomita
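The relationship-score head might be sketched as a bilinear scorer over the two phrase vectors and a label embedding; the module layout is an assumption of this sketch:

```python
import torch
import torch.nn as nn

class RelationScorer(nn.Module):
    """Scores how well a connection expression / relationship label fits a
    pair of phrase vectors produced by the shared encoder."""

    def __init__(self, d: int = 128, num_labels: int = 10):
        super().__init__()
        self.label_embed = nn.Embedding(num_labels, d)
        self.score = nn.Bilinear(2 * d, d, 1)

    def forward(self, vec_a: torch.Tensor, vec_b: torch.Tensor,
                label_id: torch.Tensor) -> torch.Tensor:
        pair = torch.cat([vec_a, vec_b], dim=-1)              # both phrase vectors
        return self.score(pair, self.label_embed(label_id))  # relationship score
```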
-
Publication number: 20230028376
Abstract: The efficiency of summary learning that requires an additional input parameter is improved by causing a computer to execute: a first learning step of learning a first model for calculating an importance value of each component in source text, with use of a first training data group and a second training data group, where the first training data group includes source text, a query related to a summary of the source text, and summary data related to the query in the source text, and the second training data group includes source text and summary data generated based on the source text; and a second learning step of learning a second model for generating summary data from the source text of training data, with use of each piece of training data in the second training data group and a plurality of components extracted for each such piece based on the importance values calculated by the first model for the components of its source text.
Type: Application
Filed: December 18, 2019
Publication date: January 26, 2023
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
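The two learning steps reduce to the following skeleton; `fit_step` and `top_components` are placeholder methods invented for the sketch, not an API from the publication:

```python
def train_summarizer(first_model, second_model, first_group, second_group):
    """first_group: (source, query, summary) triples;
    second_group: (source, summary) pairs."""
    # Step 1: learn component importance from both training data groups.
    for source, query, summary in first_group:
        first_model.fit_step(source, query, summary)
    for source, summary in second_group:
        first_model.fit_step(source, None, summary)
    # Step 2: learn to summarize from the components the first model ranks
    # highest, rather than from the full source text.
    for source, summary in second_group:
        components = first_model.top_components(source)
        second_model.fit_step(components, summary)
```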
-
Publication number: 20230026110
Abstract: In a training data generation method, a computer executes: a generation step for generating partial data of a summary sentence created for text data; an extraction step for extracting, from the text data, a sentence set that is a portion of the text data, based on a similarity with the partial data; and a determination step for determining whether or not the partial data is to be used as training data for a neural network that generates a summary sentence, based on the similarity between the partial data and the sentence set. Thus, it is possible to streamline the collection of training data for a neural summarization model.
Type: Application
Filed: December 18, 2019
Publication date: January 26, 2023
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
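A self-contained sketch of the generate/extract/determine recipe, with Jaccard word overlap standing in for the similarity measure and a fixed sentence-set size as a simplifying assumption:

```python
def make_training_examples(summary_parts: list, document: list,
                           threshold: float = 0.5, set_size: int = 2):
    """summary_parts: partial data of the summary sentence;
    document: the text data, split into sentences."""
    def sim(a: str, b: str) -> float:
        wa, wb = set(a.split()), set(b.split())
        return len(wa & wb) / max(len(wa | wb), 1)

    examples = []
    for part in summary_parts:
        # Extraction step: the sentence set most similar to the partial data.
        ranked = sorted(document, key=lambda s: sim(part, s), reverse=True)
        sentence_set = " ".join(ranked[:set_size])
        # Determination step: keep the pair only if it is similar enough.
        if sim(part, sentence_set) >= threshold:
            examples.append((sentence_set, part))
    return examples
```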
-
Publication number: 20220405639
Abstract: An information processing apparatus includes a training unit configured to share encoding layers from a first layer to an (N-n)-th layer, having parameters trained in advance, between a first model and a second model, and to train parameters of a third model through multi-task training including training of the first model and retraining of the second model for a predetermined task, where N and n are integers equal to or greater than 1 satisfying N > n. In the third model, the encoding layers from the ((N-n)+1)-th layer to the N-th layer, having parameters trained in advance, are divided between the first model and the second model.
Type: Application
Filed: November 21, 2019
Publication date: December 22, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
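The sharing arrangement can be sketched in PyTorch as one shared stack plus two task-specific heads; Linear layers stand in for whatever encoder blocks the apparatus actually uses:

```python
import torch.nn as nn

def build_shared_stack(N: int = 6, n: int = 2, d: int = 128):
    """Layers 1..(N-n) form one shared module list; each model keeps its own
    copy of the top n layers. Multi-task training updates `shared` through
    both heads at once."""
    shared = nn.ModuleList([nn.Linear(d, d) for _ in range(N - n)])
    first_head = nn.ModuleList([nn.Linear(d, d) for _ in range(n)])   # first model
    second_head = nn.ModuleList([nn.Linear(d, d) for _ in range(n)])  # second model
    return shared, first_head, second_head
```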
-
Publication number: 20220366140
Abstract: A text generation apparatus includes a memory and a processor configured to execute acquiring a reference text based on an input text and information different from the input text; and generating a text based on the input text and the reference text, wherein the acquiring and the generating are implemented as neural networks based on learned parameters.
Type: Application
Filed: March 3, 2020
Publication date: November 17, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA, Atsushi OTSUKA
-
Publication number: 20220358361
Abstract: A generation apparatus includes a generation unit configured to use a machine learning model learned in advance, with a document as an input, to extract one or more ranges that are likely to be answers in the document and generate a question representation whose answer is each of the ranges that are extracted.
Type: Application
Filed: February 12, 2020
Publication date: November 10, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
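A toy sketch of the two-step flow, with a capitalization heuristic standing in for the learned span extractor and a caller-supplied question generator; both are assumptions of the sketch:

```python
def generate_questions(document: str, question_generator, max_spans: int = 3):
    """Extracts answer-like ranges, then generates one question per range.
    `question_generator(document, answer=...)` is a placeholder interface."""
    words = document.split()
    # Stand-in for the learned extractor: take capitalized words as spans.
    spans = [w for w in words if w[:1].isupper()][:max_spans]
    return [(span, question_generator(document, answer=span)) for span in spans]
```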