Patents by Inventor Kyosuke NISHIDA
Kyosuke NISHIDA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20210374350
Abstract: An information processing device includes a processing unit configured to receive a document and a question as input and to output, using neural networks, an answer range as the range of a string in the document that can answer the question, or an answer suitability of the document with respect to the question. The processing unit includes a first neural network configured to calculate the answer range and a second neural network configured to calculate the answer suitability, and some of the layers constituting the two neural networks are shared between them.
Type: Application
Filed: October 5, 2018
Publication date: December 2, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Kyosuke NISHIDA, Itsumi SAITO, Atsushi OTSUKA, Hisako ASANO, Junji TOMITA
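A minimal sketch of the shared-layer idea in the abstract above: two task heads (answer-range extraction and answer-suitability scoring) consume one shared encoding. The toy transform and head logic below are hypothetical placeholders, not the patent's actual architecture.

```python
def shared_encoder(features):
    # Shared lower layers: a toy transform computed once and reused by both heads.
    return [2.0 * x + 1.0 for x in features]

def answer_range_head(encoded):
    # Head 1: pick start/end as the (ordered) indices of the two largest activations.
    ranked = sorted(range(len(encoded)), key=lambda i: encoded[i], reverse=True)
    start, end = sorted(ranked[:2])
    return (start, end)

def answer_suitability_head(encoded):
    # Head 2: a single suitability score pooled over the same shared encoding.
    return sum(encoded) / len(encoded)

features = [0.1, 0.9, 0.3, 0.7]       # toy per-token features
enc = shared_encoder(features)         # computed once, shared by both heads
span = answer_range_head(enc)
score = answer_suitability_head(enc)
```

Sharing the encoder is what lets the two tasks regularize each other, which is the point of the shared layers in the claim.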
-
Publication number: 20210370519
Abstract: A nonverbal information generation apparatus includes a nonverbal information generation unit that generates nonverbal information corresponding to feature quantities of voice or text, on the basis of those feature quantities and a learned nonverbal information generation model. The nonverbal information is information for controlling an expression unit that expresses behavior so that at least one of the number of times the behavior is performed and the magnitude of the behavior corresponds to the feature quantities.
Type: Application
Filed: February 15, 2019
Publication date: December 2, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Ryo ISHII, Ryuichiro HIGASHINAKA, Taichi KATAYAMA, Junji TOMITA, Nozomi KOBAYASHI, Kyosuke NISHIDA
-
Patent number: 11182435
Abstract: Taking as input a group of text pairs for learning, in which each pair consists of a first text for learning and a second text for learning that serves as an answer when the first text is posed as a question, a query expansion model is learned so as to generate, for a text serving as a query, a text serving as an expanded query.
Type: Grant
Filed: November 20, 2017
Date of Patent: November 23, 2021
Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Atsushi Otsuka, Katsuji Bessho, Kyosuke Nishida, Hisako Asano, Yoshihiro Matsuo
-
Publication number: 20210319330
Abstract: There is provided a processing device for natural language processing that achieves high search accuracy while keeping time complexity and space complexity low. Taking an input sentence Q as input, a first external knowledge search unit 11 acquires, as first search results R1, pieces of external knowledge retrieved based on first scores obtained from degrees of similarity between the input sentence and pieces of external knowledge included in an external knowledge database 2. A second external knowledge search unit 12 determines, using a neural network, second scores obtained from degrees of similarity between the input sentence and the pieces of external knowledge included in the first search results R1, and searches the first search results R1 to acquire second search results R2. A processing unit 14 acquires an output for the input sentence by arithmetic processing that takes as input the input sentence and the pieces of external knowledge included in the second search results R2.
Type: Application
Filed: November 8, 2019
Publication date: October 14, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
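A toy sketch of the two-stage search described above: a cheap first score narrows the external-knowledge candidates, and a costlier second score (a neural network in the patent) re-ranks only that shortlist. The overlap-based scorers and the example database below are hypothetical stand-ins, not the patent's scoring functions.

```python
def first_score(entry, query):
    # Stage 1: cheap lexical overlap, applied to the whole database.
    return len(set(entry.split()) & set(query.split()))

def second_score(entry, query):
    # Stage 2: a costlier similarity; here overlap dampened by length
    # mismatch, standing in for the neural scorer.
    overlap = len(set(entry.split()) & set(query.split()))
    return overlap / (1 + abs(len(entry.split()) - len(query.split())))

def search(database, query, k1=2, k2=1):
    # Only k1 shortlisted entries ever reach the expensive stage, which is
    # how the scheme keeps time and space complexity down.
    shortlist = sorted(database, key=lambda e: first_score(e, query), reverse=True)[:k1]
    return sorted(shortlist, key=lambda e: second_score(e, query), reverse=True)[:k2]

db = ["cats chase mice", "dogs chase cats", "planes fly high"]
results = search(db, "which animals chase cats")
```

The irrelevant entry never reaches stage 2, illustrating why the coarse filter controls cost.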
-
Publication number: 20210294985
Abstract: The present invention makes it possible to generate a paraphrastic sentence that has a similar meaning to the original sentence despite a local word or phrase difference, or a non-paraphrastic sentence that is not a paraphrase despite having a similar meaning to the original sentence as a whole. An estimation unit 22 estimates a word deletion probability for each of the words constituting an input sentence, using either a positive example model, trained on positive examples each consisting of a sentence and a paraphrastic sentence of it and used to generate a paraphrastic sentence by deleting words, or a negative example model, trained on negative examples each consisting of a sentence and a non-paraphrastic sentence of it and used to generate a non-paraphrastic sentence by deleting words.
Type: Application
Filed: July 23, 2019
Publication date: September 23, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
-
Publication number: 20210256018
Abstract: This disclosure provides an answer generation unit configured to receive a document and a question as inputs and to generate an answer sentence for the question with a learned model, using words included in the union of a predetermined first vocabulary and a second vocabulary composed of words included in the document and the question. The learned model includes a neural network trained in advance to determine whether each word of the answer sentence is included in the second vocabulary, and it increases or decreases the probability that a word included in the second vocabulary is selected as a word of the answer sentence when the answer sentence is generated.
Type: Application
Filed: March 27, 2019
Publication date: August 19, 2021
Applicant: Nippon Telegraph and Telephone Corporation
Inventors: Kyosuke NISHIDA, Atsushi OTSUKA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
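A small sketch of the vocabulary-gating idea above: when generating an answer word, the scores of candidates that appear in the document or question (the "second vocabulary") are boosted or suppressed by a gate value, which the patent's model predicts with a neural network. The fixed gate, base scores, and function name here are hypothetical placeholders.

```python
def gated_distribution(scores, second_vocab, gate):
    # gate in [0, 1]: likelihood that the next answer word should come
    # from the document/question vocabulary rather than the general one.
    adjusted = {w: s * (gate if w in second_vocab else 1.0 - gate)
                for w, s in scores.items()}
    total = sum(adjusted.values())
    return {w: s / total for w, s in adjusted.items()}   # renormalize

scores = {"tokyo": 0.5, "paris": 0.5}   # equal base scores from the decoder
second_vocab = {"tokyo"}                 # "tokyo" occurs in the document
dist = gated_distribution(scores, second_vocab, gate=0.8)
```

With a high gate, the in-document word dominates even though the base scores were tied, which is the copy-style bias the abstract describes.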
-
Publication number: 20210232948
Abstract: A question generation device includes generating means which takes as input a query and a relevant document including an answer to the query and, using a machine learning model trained in advance, generates a revised query in which a potentially defective portion of the query is supplemented with a word included in a prescribed lexical set.
Type: Application
Filed: April 25, 2019
Publication date: July 29, 2021
Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
-
Publication number: 20210125516
Abstract: A question that calls for a polar (yes/no) answer can be answered accurately. A machine comprehension unit 210 estimates the start and end of a range in the text that serves as the basis for an answer to the question, using a reading comprehension model trained in advance to estimate that range from the input text and question. A determination unit 220 determines the polarity of the answer to the question using a determination model trained in advance to determine whether the polarity of the answer is positive or not, based on information obtained from the processing of the machine comprehension unit 210.
Type: Application
Filed: June 14, 2019
Publication date: April 29, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Atsushi OTSUKA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
-
Publication number: 20210081612
Abstract: A relationship between phrases can be accurately estimated without incurring the cost of generating learning data. Based on a dependency analysis result for input text, a learning data generation unit 62 extracts a pair of phrases each having a dependency relationship with a segment containing a predetermined connection expression representing a relationship between phrases, and generates a triple consisting of the extracted pair of phrases and either the connection expression or a relation label indicating the relationship it represents. A learning unit 63 learns a relationship estimation model for estimating the relationship between phrases based on the triples generated by the learning data generation unit.
Type: Application
Filed: February 15, 2019
Publication date: March 18, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Junji TOMITA, Hisako ASANO
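A toy version of the learning-data generation step above: given a sentence whose two clauses are linked by a connection expression, emit a (phrase, relation label, phrase) triple. Real dependency parsing is replaced here by a naive split on a known connective, and the connective-to-label table is an illustrative assumption, not the patent's inventory.

```python
# Hypothetical mapping from connection expressions to relation labels.
RELATION_LABELS = {"because": "CAUSE", "so": "RESULT"}

def extract_triple(sentence):
    # Find the first known connective and split the sentence around it;
    # the two sides stand in for the dependency-linked phrase pair.
    for conn, label in RELATION_LABELS.items():
        marker = f" {conn} "
        if marker in sentence:
            left, right = sentence.split(marker, 1)
            return (left.strip(), label, right.strip())
    return None   # no known connective: no training triple from this sentence

triple = extract_triple("the road is wet because it rained")
```

Because the triples come from raw text plus a connective table, no manual annotation is needed, which is the cost saving the abstract claims.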
-
Publication number: 20210064966
Abstract: An appropriate vector of any phrase can be generated. A lattice construction unit 212 constructs a lattice structure formed by links binding adjacent word or phrase candidates, based on a morphological analysis result and a dependency analysis result of input text. A first learning unit 213 trains a neural network A for estimating nearby word or phrase candidates from a word or phrase candidate, based on the lattice structure. A vector generation unit 214 acquires a vector of each word or phrase candidate from the neural network A and sets the vectors as learning data. A second learning unit trains a neural network B for vectorizing the word or phrase candidates based on the learning data.
Type: Application
Filed: February 15, 2019
Publication date: March 4, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
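A minimal sketch of the lattice idea above: both single words and adjacent multi-word spans become candidate nodes, and pairs of candidates linked by adjacency in the lattice become training examples for estimating nearby candidates. The span-window construction below is a simplified hypothetical stand-in for the morphology- and dependency-informed lattice in the abstract.

```python
def lattice_candidates(tokens, max_len=2):
    # Nodes: every span of 1..max_len consecutive tokens,
    # recorded as (text, start, end).
    return [(" ".join(tokens[i:i + n]), i, i + n)
            for n in range(1, max_len + 1)
            for i in range(len(tokens) - n + 1)]

def training_pairs(tokens, max_len=2):
    cands = lattice_candidates(tokens, max_len)
    # Adjacency link: candidate b starts exactly where candidate a ends,
    # mirroring the links binding adjacent candidates in the lattice.
    return [(a, b) for a, sa, ea in cands for b, sb, eb in cands if sb == ea]

pairs = training_pairs(["new", "york", "city"])
```

Note that the phrase "new york" gets its own training pairs alongside the individual words, which is what lets the downstream network assign vectors to phrases, not just words.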
-
Publication number: 20210049210
Abstract: To enable provision of appropriate information for a user query even in a case where there are multiple information provision modules that differ in answer generation processing. A query sending unit 212 sends a user query to each of a plurality of information provision module units 220 that differ in answer generation processing and that each generate an answer candidate for the user query. An output control unit 214 performs control such that the answer candidate acquired from each of the plurality of information provision module units 220 is displayed on a display unit 300 on a per-agent basis, together with information on the agent associated with that information provision module unit 220.
Type: Application
Filed: February 13, 2019
Publication date: February 18, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Narichika NOMOTO, Hisako ASANO
-
Publication number: 20210042469
Abstract: The present disclosure relates to concurrent learning of a relationship estimation model and a phrase generation model. The relationship estimation model estimates a relationship between phrases. The phrase generation model generates a phrase that relates to an input phrase and includes an encoder and a decoder. The encoder converts a phrase into a vector, using three-piece sets as learning data. The decoder generates, from the converted vector and a connection expression or a relationship label, a phrase having the relationship expressed by that connection expression or relationship label with respect to the input phrase. The relationship estimation model generates a relationship score from the converted vectors indicating each phrase included in a combination of the phrases and a vector indicating the connection expression and the relationship label.
Type: Application
Filed: March 1, 2019
Publication date: February 11, 2021
Applicant: Nippon Telegraph and Telephone Corporation
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
-
Publication number: 20210042472
Abstract: To make it possible to accurately generate word vectors without limiting the vocabulary of a word vector data set. In a vector generating device 10 that generates vectors representing an input sentence P, when generating the series of vectors representing the input sentence P based on vectors corresponding to the words included in it, a definition-sentence-considered-context encoding unit 280 uses a dictionary DB 230 storing sets of headwords y and definition sentences Dy, which are sentences defining the headwords y: for each word of the input sentence P that is a headword stored in the dictionary DB, the unit generates the series of vectors representing the input sentence P using the definition sentence Dy of that headword y.
Type: Application
Filed: March 4, 2019
Publication date: February 11, 2021
Applicant: Nippon Telegraph and Telephone Corporation
Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
-
Publication number: 20210005218
Abstract: A nonverbal information generation apparatus includes a display unit that partitions text into predetermined units and displays the partitioned text. In association with those predetermined units, the display unit makes visible nonverbal information representing the behavior of a verbal output agent, or of a receiver of the agent's verbal information, that corresponds to the text when the verbal output agent outputs the verbal information.
Type: Application
Filed: February 15, 2019
Publication date: January 7, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Ryo ISHII, Ryuichiro HIGASHINAKA, Taichi KATAYAMA, Junji TOMITA, Nozomi KOBAYASHI, Kyosuke NISHIDA
-
Publication number: 20210004541
Abstract: A learning device of a phrase generation model includes a memory and a processor configured to learn the phrase generation model, which includes an encoder and a decoder, using 3-tuples as training data. Each 3-tuple includes a combination of phrases and at least one of a conjunctive expression representing a relationship between the phrases and a relational label indicating the relationship represented by the conjunctive expression. The encoder is configured to convert a phrase into a vector from a 2-tuple, which includes a phrase and at least one of the conjunctive expression and the relational label. The decoder is configured to generate, from the converted vector and the conjunctive expression or the relational label, a phrase having the relationship represented by the conjunctive expression or the relational label with respect to the input phrase.
Type: Application
Filed: February 22, 2019
Publication date: January 7, 2021
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
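A toy sketch of the 3-tuple data flow above: an "encoder" maps a (phrase, relational label) 2-tuple to a representation, and a "decoder" produces a phrase bearing that relationship. Both are hypothetical lookup-based stand-ins for the neural encoder and decoder, and the training triples are invented examples, included only to make the data flow concrete.

```python
# Illustrative 3-tuples: (source phrase, relational label, target phrase).
TRAINING_TRIPLES = [
    ("it rained", "CAUSE", "the road is wet"),
    ("she studied", "RESULT", "she passed"),
]

def encode(phrase, label):
    # Stand-in encoder: a hashable "vector" for the 2-tuple.
    return (phrase, label)

def decode(vector):
    # Stand-in decoder: recall the target phrase seen for this 2-tuple.
    phrase, label = vector
    for src, lab, tgt in TRAINING_TRIPLES:
        if src == phrase and lab == label:
            return tgt
    return None

generated = decode(encode("it rained", "CAUSE"))
```

In the patent the decoder generalizes beyond seen pairs; the lookup table here only mimics the interface, not that generalization.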
-
Publication number: 20200401794
Abstract: A nonverbal information generation apparatus includes a nonverbal information generation unit that generates time-information-stamped nonverbal information corresponding to time-information-stamped text feature quantities, on the basis of those feature quantities and a learned nonverbal information generation model. The time-information-stamped text feature quantities include feature quantities extracted from text together with time information representing the times assigned to predetermined units of the text. The nonverbal information is information for controlling an expression unit that expresses behavior corresponding to the text.
Type: Application
Filed: February 15, 2019
Publication date: December 24, 2020
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Ryo ISHII, Ryuichiro HIGASHINAKA, Taichi KATAYAMA, Junji TOMITA, Nozomi KOBAYASHI, Kyosuke NISHIDA
-
Publication number: 20200372915
Abstract: A nonverbal information generation apparatus includes a nonverbal information generation unit that generates time-information-stamped nonverbal information corresponding to time-information-stamped text feature quantities, and an expression unit that expresses the nonverbal information, on the basis of those feature quantities and a learned nonverbal information generation model. The time-information-stamped text feature quantities include feature quantities extracted from text together with time information representing the times assigned to predetermined units of the text. The nonverbal information is information for controlling the expression unit so as to express behavior corresponding to the text.
Type: Application
Filed: February 15, 2019
Publication date: November 26, 2020
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Ryo ISHII, Ryuichiro HIGASHINAKA, Taichi KATAYAMA, Junji TOMITA, Nozomi KOBAYASHI, Kyosuke NISHIDA
-
Publication number: 20190278812
Abstract: Taking as input a group of text pairs for learning, in which each pair consists of a first text for learning and a second text for learning that serves as an answer when the first text is posed as a question, a query expansion model is learned so as to generate, for a text serving as a query, a text serving as an expanded query.
Type: Application
Filed: November 20, 2017
Publication date: September 12, 2019
Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
Inventors: Atsushi OTSUKA, Katsuji BESSHO, Kyosuke NISHIDA, Hisako ASANO, Yoshihiro MATSUO