Patents by Inventor Itsumi SAITO
Itsumi SAITO has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20220343076
Abstract: A text generation apparatus includes a memory and a processor configured to, based on learned parameters of neural networks, acquire, as a reference text, a predetermined number of two or more sentences having a relatively high relevance to an input sentence from a set of sentences different from the input sentence, and generate text based on the input sentence and the reference text, such that information to be considered when generating text can be added as text.
Type: Application
Filed: October 2, 2019
Publication date: October 27, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
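A minimal sketch of the retrieval step this abstract describes: score candidate sentences against the input and take the top k as the reference text for generation. Plain bag-of-words cosine stands in for the patent's learned relevance model, and all names are illustrative.

```python
import math
from collections import Counter

def bow_cosine(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words vectors of two sentences."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_reference_text(input_sentence: str, corpus: list[str], k: int = 2) -> str:
    """Return the k sentences most relevant to the input, joined as reference text."""
    ranked = sorted(corpus, key=lambda s: bow_cosine(input_sentence, s), reverse=True)
    return " ".join(ranked[:k])

corpus = [
    "The model reads a reference text before generating.",
    "Stock prices fell sharply on Tuesday.",
    "Generation can be conditioned on retrieved sentences.",
]
ref = retrieve_reference_text("conditioning generation on a reference text", corpus)
print(ref)  # the two topically related sentences, joined as reference text
```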
-
Patent number: 11481560
Abstract: An information processing device includes a processing unit configured to receive as input a document and a question, and to execute processing to output an answer range as a range of a string that can be an answer to the question in the document, or an answer suitability of the document with respect to the question, by using neural networks, wherein the processing unit includes a first neural network configured to calculate the answer range, and a second neural network configured to calculate the answer suitability, and between the first neural network and the second neural network, part of the layers constituting both neural networks is shared.
Type: Grant
Filed: October 5, 2018
Date of Patent: October 25, 2022
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Kyosuke Nishida, Itsumi Saito, Atsushi Otsuka, Hisako Asano, Junji Tomita
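A structural sketch of the layer sharing in this claim: one encoder's layers are referenced by both a span-extraction head and an answer-suitability head, so in training, gradients from both tasks would update the same parameters. Toy fixed weights stand in for trained networks; nothing here reproduces the patented model.

```python
import math
import random

random.seed(0)

class SharedEncoder:
    """Stands in for the layers shared between the two networks."""
    def __init__(self, dim: int = 4):
        self.w = [random.uniform(-1, 1) for _ in range(dim)]
    def encode(self, tokens: list[str]) -> list[list[float]]:
        # Toy hash-based embedding (varies across runs); a real model
        # would apply trained shared layers here.
        return [[(hash(t) % 97) / 97.0 * wi for wi in self.w] for t in tokens]

def span_head(states):
    """First network: score each token as a candidate answer position."""
    scores = [sum(h) for h in states]
    start = max(range(len(scores)), key=scores.__getitem__)
    return start, start  # toy single-token span

def suitability_head(states):
    """Second network: one scalar, 'does this document suit the question?'"""
    pooled = sum(sum(h) for h in states) / len(states)
    return 1 / (1 + math.exp(-pooled))  # sigmoid

enc = SharedEncoder()                      # the shared part
states = enc.encode("the answer is forty two".split())
print(span_head(states), suitability_head(states))  # both heads read the same states
```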
-
Patent number: 11403469
Abstract: The present invention makes it possible to generate a paraphrastic sentence that has a similar meaning to the original sentence despite a local word/phrase difference, or a non-paraphrastic sentence that is not a paraphrase despite having a similar meaning to the original sentence in terms of the entire sentence. An estimation unit 22 estimates a word deletion probability for each of the words constituting an input sentence, by using a positive example model that has been trained based on a positive example constituted by a sentence and a paraphrastic sentence of the sentence, and is used to generate a paraphrastic sentence by deleting a word, or by using a negative example model that has been trained based on a negative example constituted by the sentence and a non-paraphrastic sentence of the sentence, and is used to generate a non-paraphrastic sentence by deleting a word.
Type: Grant
Filed: July 23, 2019
Date of Patent: August 2, 2022
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi Saito, Kyosuke Nishida, Hisako Asano, Junji Tomita
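A toy version of the deletion-probability idea: from (sentence, paraphrase) positive pairs, estimate how often each word is dropped while meaning is preserved, then delete high-probability words from a new input. The patent trains neural positive/negative models; this count-based stand-in only illustrates the data flow, and the example pairs and threshold are invented.

```python
from collections import defaultdict

pairs = [  # (original, paraphrase) positive examples, illustrative only
    ("the movie was really great", "the movie was great"),
    ("she is really very kind", "she is kind"),
]

seen, deleted = defaultdict(int), defaultdict(int)
for src, para in pairs:
    kept = set(para.split())
    for w in src.split():
        seen[w] += 1
        deleted[w] += w not in kept  # True counts as 1

def deletion_prob(word: str) -> float:
    """Fraction of positive pairs in which this word was dropped."""
    return deleted[word] / seen[word] if seen[word] else 0.0

sentence = "the food was really very tasty"
paraphrase = " ".join(w for w in sentence.split() if deletion_prob(w) < 0.5)
print(paraphrase)  # drops "really" and "very", keeping the core meaning
```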
-
Publication number: 20220229997
Abstract: Provided is a generation unit that takes as inputs a question Q_i that is a word sequence representing the current question in a dialogue, a document P used to generate an answer A_i to the question Q_i, a question history {Q_{i-1}, ..., Q_{i-k}} that is a set of word sequences representing the k past questions, and an answer history {A_{i-1}, ..., A_{i-k}} that is a set of word sequences representing the answers to those k questions, and that generates the answer A_i by machine reading comprehension in an extractive mode or a generative mode using pre-trained model parameters.
Type: Application
Filed: May 28, 2019
Publication date: July 21, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
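A sketch of the input assembly the abstract describes: the current question Q_i, the document P, and the k most recent past questions and answers are packed into one sequence for the reading-comprehension model. The separator tokens and the truncation rule are assumptions, not the patent's encoding.

```python
def build_input(question: str, document: str,
                q_history: list[str], a_history: list[str], k: int = 2) -> str:
    """Concatenate the k most recent (question, answer) turns, the current
    question, and the document into one model input string."""
    turns = []
    for q, a in list(zip(q_history, a_history))[-k:]:  # keep k latest turns
        turns += [f"<q> {q}", f"<a> {a}"]
    return " ".join(turns + [f"<q> {question}", f"<p> {document}"])

print(build_input(
    "How tall is it?",
    "The Tokyo Skytree is 634 m tall and opened in 2012.",
    ["What is the tallest tower in Japan?"],
    ["The Tokyo Skytree."],
))
```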
-
Publication number: 20220138267
Abstract: A generation apparatus includes a generation unit configured to use a machine learning model learned in advance, with a document as an input, to generate a question representation for a range of an answer in the document, wherein, when generating a word of the question representation by performing a copy from the document, the generation unit adjusts a probability that a word included in the range is copied.
Type: Application
Filed: February 12, 2020
Publication date: May 5, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
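A sketch of the adjustment in the abstract: when the question generator can copy words from the document, the probability of copying words that lie inside the answer range is damped, so the generated question does not give the answer away. The damping factor and the renormalization are assumptions.

```python
def adjust_copy_probs(copy_probs: dict[str, float],
                      answer_range: set[str],
                      damp: float = 0.1) -> dict[str, float]:
    """Scale down copy probabilities for words inside the answer range."""
    adjusted = {w: p * (damp if w in answer_range else 1.0)
                for w, p in copy_probs.items()}
    total = sum(adjusted.values())
    return {w: p / total for w, p in adjusted.items()}  # renormalize to sum to 1

copy_probs = {"Edison": 0.4, "invented": 0.2, "phonograph": 0.3, "1877": 0.1}
print(adjust_copy_probs(copy_probs, answer_range={"Edison"}))
# "Edison" (the answer) is now far less likely to be copied into the question.
```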
-
Publication number: 20220138239
Abstract: A sentence generation device has: an estimation unit for receiving input of a first sentence and an output length, and estimating importance of each word constituting the first sentence using a pre-trained model; and a generation unit for generating a second sentence based on the importance, and thus makes it possible to evaluate importance of a constituent element of an input sentence in correspondence with a designated output length.
Type: Application
Filed: February 25, 2020
Publication date: May 5, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
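A toy rendering of length-conditioned compression: each word gets an importance score, and the second sentence keeps the highest-scoring words, in their original order, until the designated output length is reached. The scores here are hand-set; the patent estimates them with a pre-trained model.

```python
def compress(sentence: str, importance: dict[str, float], out_len: int) -> str:
    """Keep the out_len most important words of the sentence, in order."""
    words = sentence.split()
    # Rank positions by importance, keep the top out_len, restore word order.
    ranked = sorted(range(len(words)),
                    key=lambda i: importance.get(words[i], 0.0),
                    reverse=True)[:out_len]
    return " ".join(words[i] for i in sorted(ranked))

sent = "the committee finally approved the long delayed budget proposal"
scores = {"committee": 0.9, "approved": 0.95, "budget": 0.9, "proposal": 0.8}
print(compress(sent, scores, out_len=4))  # "committee approved budget proposal"
```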
-
Publication number: 20220138601
Abstract: A question-answering apparatus includes answer generating means for accepting as input a document set made up of one or more documents, a question sentence, and a style for the answer sentence to the question sentence, and for running a process of generating an answer sentence for the question sentence using a learned model based on the document set, wherein the learned model determines the probability of generating the words contained in the answer sentence according to the style when generating the answer sentence.
Type: Application
Filed: February 10, 2020
Publication date: May 5, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Kyosuke NISHIDA, Itsumi SAITO, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
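A sketch of style-conditioned decoding as the abstract frames it: the model's word probabilities are reweighted by a per-style prior before a word is chosen, so the same question and documents can yield, say, a terse or a polite answer. The style tables and weights are invented for illustration.

```python
STYLE_PRIOR = {  # per-style multiplicative weights, illustrative only
    "terse":  {"therefore": 0.2, "yes": 2.0, "approximately": 0.3},
    "polite": {"certainly": 2.0, "yes": 1.2, "approximately": 1.5},
}

def rescore(word_probs: dict[str, float], style: str) -> dict[str, float]:
    """Reweight generation probabilities according to the requested style."""
    prior = STYLE_PRIOR[style]
    scored = {w: p * prior.get(w, 1.0) for w, p in word_probs.items()}
    z = sum(scored.values())
    return {w: p / z for w, p in scored.items()}

probs = {"yes": 0.5, "certainly": 0.2, "approximately": 0.3}
print(rescore(probs, "terse"))   # boosts "yes", damps hedging words
print(rescore(probs, "polite"))  # boosts "certainly"
```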
-
Publication number: 20220138438
Abstract: A sentence generation device has: an estimation unit for receiving input of a first sentence and a focus point related to generation of a second sentence to be generated based on the first sentence, and estimating importance of each word constituting the first sentence using a pre-trained model; and a generation unit for generating the second sentence based on the importance, and thus makes it possible to evaluate importance of a constituent element of an input sentence in correspondence with a designated focus point.
Type: Application
Filed: February 21, 2020
Publication date: May 5, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
-
Publication number: 20220043972
Abstract: An encoding unit transforms an input piece of text, divided into a plurality of spans that are subdivided units of the piece of text, and an input question into a vector representation sequence representing the meaning of each span and of the question, using a pre-trained encoding model for transforming input text into a vector representation sequence representing the meaning of that text. For each of the spans, an evidence extraction unit estimates an evidence score indicating the degree to which the span is suitable as the evidence for extracting the answer, using a pre-trained extraction model for calculating the evidence score based on the vector representation sequence.
Type: Application
Filed: December 17, 2019
Publication date: February 10, 2022
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Kosuke NISHIDA, Atsushi OTSUKA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA, Itsumi SAITO
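A sketch of the evidence-extraction step: the text is divided into spans (sentences here), and each span receives an evidence score saying how suitable it is as the basis for the answer. Word overlap with the question stands in for the learned encoding and extraction models.

```python
def evidence_scores(text: str, question: str) -> list[tuple[float, str]]:
    """Score each sentence-level span of the text as evidence for the question."""
    q_words = set(question.lower().split())
    spans = [s.strip() for s in text.split(".") if s.strip()]
    scored = []
    for span in spans:
        s_words = set(span.lower().split())
        score = len(q_words & s_words) / len(q_words)  # toy evidence score
        scored.append((score, span))
    return sorted(scored, reverse=True)

text = ("The plant opened in 1998. It produces lithium batteries. "
        "Its output doubled after the 2015 expansion.")
for score, span in evidence_scores(text, "When did the plant open"):
    print(f"{score:.2f}  {span}")  # the opening-date span ranks first
```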
-
Publication number: 20210374350
Abstract: An information processing device includes a processing unit configured to receive as input a document and a question, and to execute processing to output an answer range as a range of a string that can be an answer to the question in the document, or an answer suitability of the document with respect to the question, by using neural networks, wherein the processing unit includes a first neural network configured to calculate the answer range, and a second neural network configured to calculate the answer suitability, and between the first neural network and the second neural network, part of the layers constituting both neural networks is shared.
Type: Application
Filed: October 5, 2018
Publication date: December 2, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Kyosuke NISHIDA, Itsumi SAITO, Atsushi OTSUKA, Hisako ASANO, Junji TOMITA
-
Publication number: 20210294985
Abstract: The present invention makes it possible to generate a paraphrastic sentence that has a similar meaning to the original sentence despite a local word/phrase difference, or a non-paraphrastic sentence that is not a paraphrase despite having a similar meaning to the original sentence in terms of the entire sentence. An estimation unit 22 estimates a word deletion probability for each of the words constituting an input sentence, by using a positive example model that has been trained based on a positive example constituted by a sentence and a paraphrastic sentence of the sentence, and is used to generate a paraphrastic sentence by deleting a word, or by using a negative example model that has been trained based on a negative example constituted by the sentence and a non-paraphrastic sentence of the sentence, and is used to generate a non-paraphrastic sentence by deleting a word.
Type: Application
Filed: July 23, 2019
Publication date: September 23, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
-
Publication number: 20210256018
Abstract: This disclosure provides an answer generation unit configured to receive a document and a question as inputs, and to execute processing of generating an answer sentence for the question by a learned model, using words included in the union of a predetermined first vocabulary and a second vocabulary composed of words included in the document and the question. The learned model includes a learned neural network that has been trained in advance to determine whether a word to be included in the answer sentence is contained in the second vocabulary, and that increases or decreases the probability that a word included in the second vocabulary is selected as a word of the answer sentence at the time of generating the answer sentence.
Type: Application
Filed: March 27, 2019
Publication date: August 19, 2021
Applicant: Nippon Telegraph and Telephone Corporation
Inventors: Kyosuke NISHIDA, Atsushi OTSUKA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
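A toy version of the two-vocabulary decoding the abstract describes: the generator chooses words from the union of a fixed vocabulary and the words of the document and question, with a gate that raises or lowers the chance of picking from the latter. The probabilities and the gate value are invented.

```python
def mix_distributions(p_fixed: dict[str, float],
                      p_source: dict[str, float],
                      gate: float) -> dict[str, float]:
    """gate in [0, 1]: weight given to the document/question vocabulary."""
    union = set(p_fixed) | set(p_source)
    return {w: (1 - gate) * p_fixed.get(w, 0.0) + gate * p_source.get(w, 0.0)
            for w in union}

p_fixed = {"the": 0.5, "answer": 0.3, "is": 0.2}   # first (fixed) vocabulary
p_source = {"Skytree": 0.6, "634": 0.4}            # second: from document+question
print(mix_distributions(p_fixed, p_source, gate=0.7))  # favors source words
```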
-
Publication number: 20210232948
Abstract: A question generation device includes: generating means which uses a query and a relevant document including an answer to the query as input and, using a machine learning model having been learned in advance, generates a revised query in which a potentially defective portion of the query is supplemented with a word included in a prescribed lexical set.
Type: Application
Filed: April 25, 2019
Publication date: July 29, 2021
Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
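A toy reading of the query-revision idea: if the query is potentially defective (here: it names no domain term), supplement it with a word from a prescribed lexical set that also appears in the relevant document. The defect test and the lexicon are invented stand-ins for the learned model.

```python
LEXICON = {"battery", "screen", "keyboard", "warranty"}  # prescribed lexical set

def revise_query(query: str, relevant_doc: str) -> str:
    """Append a lexicon word found in the document if the query lacks one."""
    q_words = set(query.lower().split())
    if q_words & LEXICON:
        return query  # query already names a domain term; no revision needed
    doc_words = set(relevant_doc.lower().split())
    candidates = sorted(LEXICON & doc_words)
    return f"{query} {candidates[0]}" if candidates else query

doc = "The battery lasts ten hours and is covered by the warranty."
print(revise_query("how long does it last", doc))  # appends "battery"
```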
-
Publication number: 20210125516
Abstract: A question whose answer is a polarity (yes or no) can be answered with the correct polarity. A machine comprehension unit 210 estimates the start and the end of a range serving as a basis for an answer to the question in text, by using a reading comprehension model trained in advance to estimate the range based on the input text and question. A determination unit 220 determines the polarity of the answer to the question by using a determination model trained in advance to determine whether the polarity of the answer to the question is positive or not, based on information obtained by the processing of the machine comprehension unit 210.
Type: Application
Filed: June 14, 2019
Publication date: April 29, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Atsushi OTSUKA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
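A sketch of the two-stage design in the abstract: a comprehension step picks the text range that supports the answer, and a determination step maps that range to a positive or negative answer. Both stages are keyword stand-ins for the trained models.

```python
NEGATORS = {"not", "no", "never", "cannot"}

def find_basis(text: str, question: str) -> str:
    """Toy 'machine comprehension': the sentence sharing most question words."""
    q = set(question.lower().split())
    spans = [s.strip() for s in text.split(".") if s.strip()]
    return max(spans, key=lambda s: len(q & set(s.lower().split())))

def polarity(basis: str) -> str:
    """Toy determination model: negation words flip the answer to 'no'."""
    return "no" if set(basis.lower().split()) & NEGATORS else "yes"

text = "The pool is heated. Pets are not allowed in the rooms."
basis = find_basis(text, "Are pets allowed in the rooms?")
print(basis, "->", polarity(basis))  # -> "no"
```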
-
Publication number: 20210081612
Abstract: A relationship between phrases can be accurately estimated without incurring the cost of generating learning data. A learning data generation unit 62 extracts a pair of phrases having a dependency relationship with a segment containing a predetermined connection expression representing a relationship between phrases, based on a dependency analysis result for input text, and generates a triple consisting of the extracted pair of phrases and the connection expression or a relation label indicating a relationship represented by the connection expression. A learning unit 63 learns the relationship estimation model for estimating a relationship between phrases based on the triple generated by the learning data generation unit.
Type: Application
Filed: February 15, 2019
Publication date: March 18, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Junji TOMITA, Hisako ASANO
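A sketch of the learning-data generation step: sentences containing a known connection expression are split around it, yielding (phrase, relation label, phrase) triples with no manual annotation. A real system would use the dependency analysis the abstract names; plain string matching is used here, and the connective table is illustrative.

```python
CONNECTIVES = {"because": "cause", "so": "result", "although": "concession"}

def extract_triples(sentences: list[str]) -> list[tuple[str, str, str]]:
    """Turn sentences with a connective into (left phrase, label, right phrase)."""
    triples = []
    for sent in sentences:
        words = sent.rstrip(".").split()
        for i, w in enumerate(words):
            if w in CONNECTIVES and 0 < i < len(words) - 1:
                left, right = " ".join(words[:i]), " ".join(words[i + 1:])
                triples.append((left, CONNECTIVES[w], right))
    return triples

sents = ["the road was closed because heavy snow fell",
         "he kept training although his leg hurt"]
for t in extract_triples(sents):
    print(t)
# ('the road was closed', 'cause', 'heavy snow fell'), ...
```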
-
Publication number: 20210064966
Abstract: An appropriate vector of any phrase can be generated. A lattice construction unit 212 constructs a lattice structure formed by links binding adjacent word or phrase candidates, based on a morphological analysis result and a dependency analysis result of input text. A first learning unit 213 performs learning of a neural network A for estimating nearby word or phrase candidates from word or phrase candidates based on the lattice structure. A vector generation unit 214 acquires a vector of each of the word or phrase candidates from the neural network A and sets the vector as learning data. A second learning unit performs learning of a neural network B for vectorizing the word or phrase candidates based on the learning data.
Type: Application
Filed: February 15, 2019
Publication date: March 4, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
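A sketch of the lattice-construction step only: word and phrase candidates over one sentence, linked when one candidate ends where the next begins. The two learning stages (neural network A over the lattice, then network B trained on its vectors) are left out; this shows the data structure, with whitespace tokenization standing in for morphological and dependency analysis.

```python
def build_lattice(sentence: str, max_phrase_len: int = 2):
    """Build (start, end, surface) candidate nodes and adjacency links."""
    words = sentence.split()
    # Nodes: every span of up to max_phrase_len words.
    nodes = [(i, j, " ".join(words[i:j]))
             for i in range(len(words))
             for j in range(i + 1, min(i + max_phrase_len, len(words)) + 1)]
    # Links bind adjacent candidates: one node ends where the other starts.
    links = [(a, b) for a in nodes for b in nodes if a[1] == b[0]]
    return nodes, links

nodes, links = build_lattice("deep learning changes search")
print(len(nodes), "candidates,", len(links), "links")
for a, b in links[:3]:
    print(a[2], "->", b[2])
```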
-
Publication number: 20210042469
Abstract: The present disclosure relates to concurrent learning of a relationship estimation model and a phrase generation model. The relationship estimation model estimates a relationship between phrases. The phrase generation model generates a phrase that relates to an input phrase. The phrase generation model includes an encoder and a decoder. The encoder converts a phrase into a vector using a three-piece set as learning data. The decoder generates, based on the converted vector and a connection expression or a relationship label, a phrase having a relationship expressed by the connection expression or the relationship label for the phrase. The relationship estimation model generates a relationship score from the converted vector, which indicates each phrase included in a combination of the phrases, and a vector indicating the connection expression and the relationship label.
Type: Application
Filed: March 1, 2019
Publication date: February 11, 2021
Applicant: Nippon Telegraph and Telephone Corporation
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
-
Publication number: 20210004541
Abstract: A learning device of a phrase generation model includes a memory and a processor configured to execute learning of the phrase generation model, which includes an encoder and a decoder, by using a 3-tuple as training data. The 3-tuple includes a combination of phrases and at least one of a conjunctive expression representing a relationship between the phrases and a relational label indicating the relationship represented by the conjunctive expression. The encoder is configured to convert a phrase into a vector from a 2-tuple. The 2-tuple includes a phrase and at least one of the conjunctive expression and the relational label. The decoder is configured to generate, from the converted vector and the conjunctive expression or the relational label, a phrase having the relationship represented by the conjunctive expression or the relational label with respect to the phrase.
Type: Application
Filed: February 22, 2019
Publication date: January 7, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
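A sketch of how the 3-tuple training data would feed an encoder-decoder like the one this abstract (and publication 20210042469 above) describes: each (phrase_a, relation label, phrase_b) triple becomes a source 2-tuple (phrase_a plus label) and a target phrase_b. Packing the 2-tuple as a single sequence with a label token is an assumption; no model is trained here.

```python
triples = [  # (phrase_a, relational label, phrase_b), illustrative only
    ("it rained hard", "result", "the game was called off"),
    ("he studied daily", "result", "he passed the exam"),
]

def to_training_pairs(triples):
    """Convert 3-tuples into (encoder input, decoder target) pairs."""
    pairs = []
    for phrase_a, label, phrase_b in triples:
        source = f"{phrase_a} <{label}>"   # 2-tuple packed as one sequence
        target = phrase_b                  # phrase the decoder should produce
        pairs.append((source, target))
    return pairs

for src, tgt in to_training_pairs(triples):
    print(src, "=>", tgt)
```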