Patents by Inventor Ryu IIDA
Ryu IIDA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11861307
Abstract: A request paraphrasing system 120, which allows a dialogue system to flexibly address requests expressed in various different manners, includes: a pre-processing unit 130 converting a user input 56 to a word vector sequence; and a neural paraphrasing model 94 trained in advance by machine learning to receive the word vector sequence as an input and paraphrase the request represented by the word vector sequence into a request having a higher probability of obtaining an answer from a question-answering device 122 than the request before paraphrasing. As pre-processing, whether the user input 56 is a request or not may be determined, and it may be paraphrased only when it is determined to be a request. Further, a classification model 98 may classify the input request to determine to which request class it belongs, and the classification may be input as one feature to the neural paraphrasing model 94.
Type: Grant
Filed: March 5, 2019
Date of Patent: January 2, 2024
Assignee: National Institute of Information and Communications Technology
Inventors: Yoshihiko Asao, Ryu Iida, Canasai Kruengkrai, Noriyuki Abe, Kanako Onishi, Kentaro Torisawa, Yutaka Kidawara
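The control flow this abstract describes — paraphrase only confirmed requests, feeding the request class in as an extra feature — can be sketched as follows. All function names and the toy lambdas below are hypothetical stand-ins for the trained models, not the patented implementation:

```python
def paraphrase_request(user_input, is_request, classify, paraphrase):
    """Sketch of the request-paraphrasing flow: inputs that are not
    requests pass through unchanged; requests are paraphrased, with
    the predicted request class supplied as an extra feature."""
    if not is_request(user_input):
        return user_input                    # non-requests are left as-is
    request_class = classify(user_input)     # e.g. "where-to-eat" (hypothetical label)
    return paraphrase(user_input, request_class)

# Toy stand-ins for the trained models (hypothetical):
result = paraphrase_request(
    "any good ramen near the station?",
    is_request=lambda s: s.endswith("?"),
    classify=lambda s: "where-to-eat",
    paraphrase=lambda s, c: f"[{c}] Where can I eat good ramen near the station?",
)
```

The sketch only shows where the classification output enters the pipeline; in the patent, the paraphrasing model itself is a neural sequence model trained so its outputs are more answerable by the downstream question-answering device.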
-
Publication number: 20230385558
Abstract: A text classifier 90 for answer identification is capable of highly accurate identification of an answer candidate to a question by effectively using background knowledge related to the question. To extract an answer candidate to the question, the text classifier includes: a BERT (Bidirectional Encoder Representations from Transformers) receiving a question and an answer candidate as inputs; a knowledge integration transformer receiving the output of BERT as an input; a background knowledge representation generator receiving a question and an answer as inputs and generating a group of background knowledge representation vectors for the question; and a vector converter respectively converting the question and the answer candidate to embedded vectors and inputting them to the background knowledge representation generator.
Type: Application
Filed: October 13, 2021
Publication date: November 30, 2023
Inventors: Jonghoon OH, Kentaro TORISAWA, Julien KLOETZER, Ryu IIDA
-
Publication number: 20220253599
Abstract: A program for training a representation generator that generates a representation of an answer part included in a passage, in order to classify whether the passage is related to an answer or not. The program causes a computer to operate as: a fake representation generator responsive to a question and a passage for outputting a fake representation representing an answer part of the passage; a real representation generator for outputting, for the question and a core answer, a real representation representing the core answer in the same format as the fake representation; a discriminator for discriminating whether the fake representation and the real representation are real or fake; and a generative adversarial network unit training the discriminator and the fake representation generator through a generative adversarial network such that erroneous determination of the fake representation is maximized and erroneous determination of the real representation is minimized.
Type: Application
Filed: July 6, 2020
Publication date: August 11, 2022
Inventors: Jonghoon OH, Kazuma KADOWAKI, Julien KLOETZER, Ryu IIDA, Kentaro TORISAWA
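The adversarial objective sketched in this abstract follows the usual generative-adversarial pattern. Below is the standard GAN loss pair for reference, not necessarily the patent's exact formulation: the discriminator is pushed to score real representations near 1 and fake ones near 0, while the fake-representation generator is pushed to fool it:

```python
import numpy as np

def discriminator_loss(d_real, d_fake):
    # Standard GAN objective: penalize the discriminator when it scores
    # real representations low or fake (generated) representations high.
    return -np.log(d_real) - np.log(1.0 - d_fake)

def generator_loss(d_fake):
    # The fake-representation generator is trained to fool the
    # discriminator, i.e. to push D(fake) toward 1.
    return -np.log(d_fake)
```

A confident, correct discriminator (e.g. `discriminator_loss(0.9, 0.1)`) incurs a much smaller loss than an undecided one (`discriminator_loss(0.5, 0.5)`), which is what drives the alternating training the abstract describes.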
-
Patent number: 11176328
Abstract: A question answering device includes: a general word vector converter converting a question and an answer to semantic vectors in accordance with general context; a general sentence level CNN 214, responsive to similarities of semantic vectors between words in the question and answer and to the strength of causality between the words, for weighting each semantic vector to calculate sentence level representations of the question and the answer; a general passage level CNN 218, responsive to similarity between the sentence level representations of the question and answer, and to the strength of relation of vectors in the sentence level representations viewed from causality, for weighting the sentence level representations to calculate a passage level representation for the question and answer passage; and a classifier determining whether or not an answer is a correct answer, based on the similarities between outputs from CNNs 214 and 218.
Type: Grant
Filed: June 14, 2018
Date of Patent: November 16, 2021
Assignee: National Institute of Information and Communications Technology
Inventors: Jonghoon Oh, Kentaro Torisawa, Canasai Kruengkrai, Ryu Iida, Julien Kloetzer
-
Publication number: 20210326675
Abstract: A memory for a question-answering device that reduces the influence of noise on answer generation and is capable of generating highly accurate answers includes: a memory configured to normalize vector expressions of answers included in a set of answers extracted from a prescribed background knowledge source for each of a plurality of mutually different questions, and to store the results as normalized vectors; and a key-value memory access unit responsive to application of a question vector derived from a question for accessing the memory and updating the question vector by using a degree of relatedness between the question vector and the plurality of questions and using the normalized vectors corresponding to respective ones of the plurality of questions.
Type: Application
Filed: June 18, 2019
Publication date: October 21, 2021
Inventors: Jonghoon OH, Kentaro TORISAWA, Canasai KRUENGKRAI, Julien KLOETZER, Ryu IIDA, Ryo ISHIDA, Yoshihiko ASAO
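One key-value memory hop of the kind this abstract describes can be sketched with NumPy. The softmax weighting and the additive update below are a common key-value memory formulation, assumed here for illustration; the stored questions act as keys and the normalized answer vectors as values:

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def update_question_vector(q, keys, values):
    """Sketch of one key-value memory hop: weight each stored question
    (key) by its relatedness to q, then mix the corresponding
    normalized answer vectors (values) back into q."""
    scores = keys @ q                                  # relatedness to each stored question
    weights = np.exp(scores) / np.exp(scores).sum()    # softmax over stored questions
    return normalize(q + weights @ values)             # updated question vector

# Toy 3-d memory with two stored question/answer pairs (hypothetical):
q = normalize(np.array([1.0, 0.0, 0.0]))
keys = np.stack([normalize(np.array(k)) for k in ([1.0, 0.1, 0.0], [0.0, 1.0, 0.0])])
values = np.stack([normalize(np.array(v)) for v in ([0.0, 0.0, 1.0], [0.0, 1.0, 1.0])])
q2 = update_question_vector(q, keys, values)
```

Normalizing the stored answer vectors, as the abstract emphasizes, keeps any single noisy answer from dominating the weighted mixture.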
-
Patent number: 11106714
Abstract: A summary generating apparatus includes: a text storage device storing text with information indicating a portion to be focused on; word vector converters vectorizing each word of the text and adding to each vector an element indicating whether the word is focused on or not, thereby converting the text to a word vector sequence; an LSTM implemented by a neural network performing sequence-to-sequence type conversion, pre-trained by machine learning to output, in response to each of the word vectors of the word vector sequence input in a prescribed order, a summary of the text consisting of the words represented by the word sequence; and input units inputting each of the word vectors of the word vector sequence in the prescribed order to the neural network.
Type: Grant
Filed: May 7, 2018
Date of Patent: August 31, 2021
Assignee: National Institute of Information and Communications Technology
Inventors: Ryu Iida, Kentaro Torisawa, Jonghoon Oh, Canasai Kruengkrai, Yoshihiko Asao, Noriyuki Abe, Junta Mizuno, Julien Kloetzer
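The input encoding this abstract describes — each word vector extended with one element flagging whether the word lies in the focused-on portion — can be sketched as follows. The toy embedding and all names are hypothetical; only the appended focus flag mirrors the abstract:

```python
def to_focused_vectors(words, focus, embed):
    """Sketch: embed each word, then append one element flagging
    whether the word is in the portion to be focused on. The resulting
    sequence is what would feed the sequence-to-sequence LSTM."""
    return [embed(w) + [1.0 if w in focus else 0.0] for w in words]

# Toy 2-d embedding standing in for a real word-vector model (hypothetical):
embed = lambda w: [float(len(w)), float(w[0] == w[0].upper())]
vecs = to_focused_vectors(["Tokyo", "rain", "expected"], focus={"rain"}, embed=embed)
```

Because the flag is part of every input vector, the pre-trained LSTM can learn to prefer summary words drawn from the flagged portion without any change to the seq2seq architecture itself.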
-
Patent number: 10936664
Abstract: A dialogue system includes: a question generating unit receiving an input sentence from a user and generating a question using an expression included in the input sentence, by using a dependency relation; an answer obtaining unit inputting the question generated by the question generating unit to a question-answering system and obtaining an answer to the question from the question-answering system; and an utterance generating unit generating an output sentence responding to the input sentence, based on the answer obtained by the answer obtaining unit.
Type: Grant
Filed: July 26, 2017
Date of Patent: March 2, 2021
Assignee: National Institute of Information and Communications Technology
Inventors: Noriyuki Abe, Kanako Onishi, Kentaro Torisawa, Canasai Kruengkrai, Jonghoon Oh, Ryu Iida, Yutaka Kidawara
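The three-stage flow in this abstract (generate a question from the user's sentence, query a question-answering system, turn the answer into a reply) can be sketched as a simple pipeline. Every function and string below is a hypothetical stand-in, not the patented components:

```python
def respond(user_sentence, generate_question, ask_qa, generate_utterance):
    """Sketch of the dialogue flow: build a question from an expression
    in the user's sentence, send it to a question-answering system, and
    turn the returned answer into a system utterance."""
    question = generate_question(user_sentence)   # uses a dependency relation in the patent
    answer = ask_qa(question)
    return generate_utterance(user_sentence, answer)

# Toy stand-ins for the three units (hypothetical):
reply = respond(
    "I started jogging in the mornings.",
    generate_question=lambda s: "Why is jogging good?",
    ask_qa=lambda q: "it improves cardiovascular health",
    generate_utterance=lambda s, a: f"Nice! I hear {a}.",
)
```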
-
Publication number: 20210034817
Abstract: A request paraphrasing system 120, which allows a dialogue system to flexibly address requests expressed in various different manners, includes: a pre-processing unit 130 converting a user input 56 to a word vector sequence; and a neural paraphrasing model 94 trained in advance by machine learning to receive the word vector sequence as an input and paraphrase the request represented by the word vector sequence into a request having a higher probability of obtaining an answer from a question-answering device 122 than the request before paraphrasing. As pre-processing, whether the user input 56 is a request or not may be determined, and it may be paraphrased only when it is determined to be a request. Further, a classification model 98 may classify the input request to determine to which request class it belongs, and the classification may be input as one feature to the neural paraphrasing model 94.
Type: Application
Filed: March 5, 2019
Publication date: February 4, 2021
Inventors: Yoshihiko ASAO, Ryu IIDA, Canasai KRUENGKRAI, Noriyuki ABE, Kanako ONISHI, Kentaro TORISAWA, Yutaka KIDAWARA
-
Publication number: 20200183983
Abstract: A dialogue system includes: a question generating unit receiving an input sentence from a user and generating a question using an expression included in the input sentence, by using a dependency relation; an answer obtaining unit inputting the question generated by the question generating unit to a question-answering system and obtaining an answer to the question from the question-answering system; and an utterance generating unit generating an output sentence responding to the input sentence, based on the answer obtained by the answer obtaining unit.
Type: Application
Filed: July 26, 2017
Publication date: June 11, 2020
Inventors: Noriyuki ABE, Kanako ONISHI, Kentaro TORISAWA, Canasai KRUENGKRAI, Jonghoon OH, Ryu IIDA, Yutaka KIDAWARA
-
Publication number: 20200159755
Abstract: A summary generating apparatus includes: a text storage device storing text with information indicating a portion to be focused on; word vector converters vectorizing each word of the text and adding to each vector an element indicating whether the word is focused on or not, thereby converting the text to a word vector sequence; an LSTM implemented by a neural network performing sequence-to-sequence type conversion, pre-trained by machine learning to output, in response to each of the word vectors of the word vector sequence input in a prescribed order, a summary of the text consisting of the words represented by the word sequence; and input units inputting each of the word vectors of the word vector sequence in the prescribed order to the neural network.
Type: Application
Filed: May 7, 2018
Publication date: May 21, 2020
Inventors: Ryu IIDA, Kentaro TORISAWA, Jonghoon OH, Canasai KRUENGKRAI, Yoshihiko ASAO, Noriyuki ABE, Junta MIZUNO, Julien KLOETZER
-
Publication number: 20200134263
Abstract: A question answering device includes: a general word vector converter converting a question and an answer to semantic vectors in accordance with general context; a general sentence level CNN 214, responsive to similarities of semantic vectors between words in the question and answer and to the strength of causality between the words, for weighting each semantic vector to calculate sentence level representations of the question and the answer; a general passage level CNN 218, responsive to similarity between the sentence level representations of the question and answer, and to the strength of relation of vectors in the sentence level representations viewed from causality, for weighting the sentence level representations to calculate a passage level representation for the question and answer passage; and a classifier determining whether or not an answer is a correct answer, based on the similarities between outputs from CNNs 214 and 218.
Type: Application
Filed: June 14, 2018
Publication date: April 30, 2020
Inventors: Jonghoon OH, Kentaro TORISAWA, Canasai KRUENGKRAI, Ryu IIDA, Julien KLOETZER
-
Publication number: 20200034722
Abstract: A question-answering system includes: a storage unit storing expressions representing causality; an answer receiving unit receiving a question and answer passages each including an answer candidate to the question; a causality expression extracting unit extracting a causality expression from each of the answer passages; a relevant causality expression extracting unit selecting, for a combination of the question and an answer passage, an expression most relevant to the combination, from the storage unit; and a neural network receiving the question, the answer passages, semantic relation expressions related to the answer passages, and one of the relevant expressions for the combination of the question and the answer passages, and selecting an answer to the question from the answer passages.
Type: Application
Filed: October 2, 2017
Publication date: January 30, 2020
Inventors: Jonghoon OH, Kentaro TORISAWA, Canasai KRUENGKRAI, Ryu IIDA, Julien KLOETZER
-
Publication number: 20190188257
Abstract: A context analysis apparatus includes an analysis control unit for detecting a predicate whose subject is omitted and the antecedent candidates thereof, and an anaphora/ellipsis analysis unit determining the word to be identified. The anaphora/ellipsis analysis unit includes: word vector generating units generating a plurality of different types of word vectors from sentences for the antecedent candidates; a convolutional neural network receiving a word vector as an input and trained to output a score indicating the probability of each antecedent candidate being the omitted word; and a list storage unit and an identification unit determining the antecedent candidate having the highest score. The word vectors include a plurality of word vectors each extracted at least by using the object of analysis and character sequences of the entire sentences other than the candidates. Similar processing is also possible on other words, such as a referring expression.
Type: Application
Filed: August 30, 2017
Publication date: June 20, 2019
Inventors: Ryu IIDA, Kentaro TORISAWA, Canasai KRUENGKRAI, Jonghoon OH, Julien KLOETZER
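The selection step this abstract ends with — score every antecedent candidate, then pick the highest-scoring one — reduces to an argmax over candidate scores. The scores below are hand-set stand-ins for the convolutional network's outputs:

```python
def resolve_omitted_subject(candidates, score):
    """Sketch: score every antecedent candidate for a subject-less
    predicate and return the highest-scoring one, as the identification
    unit in the abstract does with the CNN's scores."""
    return max(candidates, key=score)

# Hypothetical candidate scores standing in for CNN outputs:
scores = {"Tanaka": 0.82, "the report": 0.11, "the meeting": 0.05}
antecedent = resolve_omitted_subject(list(scores), score=scores.get)
```

The substance of the patent lies in how those scores are produced (multiple word-vector types, including character sequences of the surrounding sentences); the final decision rule itself is this simple argmax.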
-
Patent number: 10157171
Abstract: An annotation data generation assisting system includes: an input/output device receiving an input through an interactive process; a morphological analysis system 380 and a dependency parsing system performing morphological and dependency parsing on text data in a text archive; first to fourth candidate generating units detecting a zero anaphor or a referring expression in the dependency relation of a predicate in a sequence of morphemes, identifying a position as an object of annotation, and estimating candidate expressions to be inserted by using language knowledge; a candidate DB storing the estimated candidates; and an interactive annotation device reading candidates of annotation from the candidate DB and annotating a candidate selected through an interactive process by the input/output device.
Type: Grant
Filed: January 20, 2016
Date of Patent: December 18, 2018
Assignee: National Institute of Information and Communications Technology
Inventors: Ryu Iida, Kentaro Torisawa, Chikara Hashimoto, Jonghoon Oh, Kiyonori Ootake, Yutaka Kidawara
-
Publication number: 20180246953
Abstract: A training device includes: a question issuing unit issuing a question stored in a question and expected answer storage unit to a question answering system; an answer candidate filtering unit, an answer candidate determining unit, a training data generating/labeling unit, and a training data selecting unit, generating and adding to a training data storage unit training data for a ranking unit of the question answering system, from pairs of a question and each of a plurality of answer candidates output with scores from the why-question answering system; and an iteration control unit controlling the question issuing unit, answer candidate filtering unit, answer candidate determining unit, training data generating/labeling unit, and training data selecting unit such that training of the training device, issuance of questions, and addition of training data are repeated until an end condition is satisfied.
Type: Application
Filed: August 26, 2016
Publication date: August 30, 2018
Inventors: Jonghoon OH, Kentaro TORISAWA, Chikara HASHIMOTO, Ryu IIDA, Masahiro TANAKA, Julien KLOETZER
-
Publication number: 20180011830
Abstract: An annotation data generation assisting system includes: an input/output device receiving an input through an interactive process; a morphological analysis system and a dependency parsing system performing morphological and dependency parsing on text data in a text archive; first to fourth candidate generating units detecting a zero anaphor or a referring expression in the dependency relation of a predicate in a sequence of morphemes, identifying a position as an object of annotation, and estimating candidate expressions to be inserted by using language knowledge; a candidate DB storing the estimated candidates; and an interactive annotation device reading candidates of annotation from the candidate DB and annotating a candidate selected through an interactive process by the input/output device.
Type: Application
Filed: January 20, 2016
Publication date: January 11, 2018
Inventors: Ryu IIDA, Kentaro TORISAWA, Chikara HASHIMOTO, Jonghoon OH, Kiyonori OOTAKE, Yutaka KIDAWARA
-
Patent number: 8868407
Abstract: A referring expression processor which uses a probabilistic model and in which referring expressions including descriptive, anaphoric and deictic expressions are understood and generated in the course of dialogue is provided. The referring expression processor according to the present invention includes: a referring expression processing section which performs at least one of understanding and generation of referring expressions using a probabilistic model constructed with referring expression Bayesian networks, each referring expression Bayesian network representing relationships between a reference domain (D) which is a set of possible referents, a referent (X) in the reference domain, a concept (C) concerning the referent, and a word (W) which represents the concept; and a memory which stores data necessary for constructing the referring expression Bayesian network.
Type: Grant
Filed: June 25, 2012
Date of Patent: October 21, 2014
Assignee: Honda Motor Co., Ltd.
Inventors: Kotaro Funakoshi, Mikio Nakano, Takenobu Tokunaga, Ryu Iida
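Understanding a referring expression with the D-X-C-W chain in this abstract amounts to scoring each referent X in the domain by P(X | D) weighted by how well X's concepts C explain the observed word W. The factorization P(X | W) ∝ P(X | D) · Σ_C P(C | X) · P(W | C) below is a plain Bayesian-network reading of the abstract, with hand-set toy probabilities; the patent's actual network structure and parameters may differ:

```python
def referent_posterior(priors, p_concept_given_x, p_word_given_concept, word):
    """Sketch of understanding a referring expression via the D-X-C-W
    chain: P(X | W) is proportional to P(X | D) * sum_C P(C | X) * P(W | C)."""
    scores = {}
    for x, prior in priors.items():
        likelihood = sum(pc * p_word_given_concept[c].get(word, 0.0)
                         for c, pc in p_concept_given_x[x].items())
        scores[x] = prior * likelihood
    z = sum(scores.values())
    return {x: s / z for x, s in scores.items()}   # normalize to a distribution

# Toy domain with two referents and hand-set probabilities (hypothetical):
post = referent_posterior(
    priors={"ball": 0.5, "box": 0.5},              # P(X | D), uniform here
    p_concept_given_x={"ball": {"round": 0.9, "red": 0.1},
                       "box":  {"round": 0.1, "red": 0.9}},
    p_word_given_concept={"round": {"round one": 1.0}, "red": {"red one": 1.0}},
    word="round one",
)
```

With these numbers, hearing "round one" shifts the posterior sharply toward the ball, which is exactly the disambiguation behavior the network is meant to capture.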
-
Publication number: 20130013290
Abstract: A referring expression processor which uses a probabilistic model and in which referring expressions including descriptive, anaphoric and deictic expressions are understood and generated in the course of dialogue is provided. The referring expression processor according to the present invention includes: a referring expression processing section which performs at least one of understanding and generation of referring expressions using a probabilistic model constructed with referring expression Bayesian networks, each referring expression Bayesian network representing relationships between a reference domain (D) which is a set of possible referents, a referent (X) in the reference domain, a concept (C) concerning the referent, and a word (W) which represents the concept; and a memory which stores data necessary for constructing the referring expression Bayesian network.
Type: Application
Filed: June 25, 2012
Publication date: January 10, 2013
Applicant: HONDA MOTOR CO., LTD.
Inventors: Kotaro FUNAKOSHI, Mikio NAKANO, Takenobu TOKUNAGA, Ryu IIDA