Patents by Inventor Junji Tomita

Junji Tomita has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11537790
    Abstract: To arrange all words so that the distance of a given word pair will be appropriate, the invention uses as input a concept base 22, which is a set of pairs of a word and a vector representing a concept of the word, and a dictionary 24, which is a set of semantically distant or close word pairs. When a word pair C, being a pair of given words A, B in the concept base 22, is present in the dictionary 24, conversion means 30 associates with the word pair C a magnitude D of the difference between a difference vector V′ (between the converted vector of word A and the converted vector of word B) and a vector kV obtained by multiplying a difference vector V (between the vector of word A in the concept base 22 and the vector of word B in the concept base 22) by a scalar value k. When the word pair C is not present in the dictionary 24, the conversion means 30 associates with the word pair C the magnitude D of the difference between the difference vector V′ and the difference vector V.
    Type: Grant
    Filed: April 4, 2019
    Date of Patent: December 27, 2022
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Katsuji Bessho, Hisako Asano, Junji Tomita
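The distance term D described in the abstract above can be read as a simple per-pair loss. The following is a minimal sketch, not the patented implementation; the vectors, the dictionary-membership flag, and the value of k are all placeholders.

```python
from math import sqrt

def pair_distance(conv_a, conv_b, base_a, base_b, in_dictionary, k=2.0):
    """Magnitude D for a word pair (A, B), as read from the abstract.

    conv_a, conv_b: converted vectors of words A and B
    base_a, base_b: original concept-base vectors of A and B
    in_dictionary:  True if (A, B) appears in the dictionary of
                    semantically close/distant pairs
    k:              scalar applied to V for dictionary pairs
                    (2.0 is an assumed illustrative value)
    """
    v_prime = [x - y for x, y in zip(conv_a, conv_b)]  # V' = conv(A) - conv(B)
    v = [x - y for x, y in zip(base_a, base_b)]        # V  = base(A) - base(B)
    scale = k if in_dictionary else 1.0                # compare V' to kV or to V
    return sqrt(sum((p - scale * q) ** 2 for p, q in zip(v_prime, v)))
```

Minimizing D over many pairs would pull converted difference vectors toward kV for dictionary pairs and toward the original V otherwise, which matches the arrangement goal stated in the abstract.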
  • Publication number: 20220405639
    Abstract: An information processing apparatus includes a training unit configured to share, between a first model and a second model, encoding layers from a first layer to an (N-n)-th layer having parameters trained in advance, and to train parameters of a third model through multi-task training including training of the first model and retraining of the second model for a predetermined task, wherein N and n are integers equal to or greater than 1 that satisfy N > n, and in the third model, encoding layers from an ((N-n)+1)-th layer to an N-th layer having parameters trained in advance are divided between the first model and the second model.
    Type: Application
    Filed: November 21, 2019
    Publication date: December 22, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
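The sharing scheme in the abstract above (layers 1..(N-n) shared, layers (N-n)+1..N kept separate per model) can be sketched abstractly. This is an illustration only, with layers modeled as plain callables rather than actual neural-network layers:

```python
def build_third_model(shared_layers, branch_a_layers, branch_b_layers):
    """Combine a shared encoder trunk with two model-specific branches.

    shared_layers:   layers 1..(N-n), used by both models
    branch_*_layers: layers (N-n)+1..N, separate for each model
    Returns the two task heads of the combined (third) model.
    """
    def run(branch_layers, x):
        # input flows through the shared trunk, then the branch
        for layer in shared_layers + branch_layers:
            x = layer(x)
        return x

    return (lambda x: run(branch_a_layers, x),
            lambda x: run(branch_b_layers, x))
```

In multi-task training, gradients from both task heads would update the shared trunk, which is what lets the two tasks regularize each other.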
  • Publication number: 20220366140
    Abstract: A text generation apparatus includes a memory and a processor configured to execute acquiring a reference text based on an input text and information different from the input text; and generating a text based on the input text and the reference text, wherein the acquiring and the generating are implemented as neural networks based on learned parameters.
    Type: Application
    Filed: March 3, 2020
    Publication date: November 17, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA, Atsushi OTSUKA
  • Publication number: 20220358361
    Abstract: A generation apparatus includes a generation unit configured to use a machine learning model learned in advance, with a document as an input, to extract one or more ranges in the document that are likely to be answers and to generate, for each extracted range, a question representation whose answer is that range.
    Type: Application
    Filed: February 12, 2020
    Publication date: November 10, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
  • Publication number: 20220343076
    Abstract: A text generation apparatus includes a memory and a processor configured to, based on learned parameters of neural networks, acquire, as a reference text, a predetermined number of two or more sentences having a relatively high relevance to an input sentence from a set of sentences different from the input sentence and generate text based on the input sentence and the reference text, such that information to be considered when generating text can be added as text.
    Type: Application
    Filed: October 2, 2019
    Publication date: October 27, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
  • Patent number: 11481560
    Abstract: An information processing device includes a processing unit configured to receive as input a document and a question, and to execute processing to output an answer range as a range of a string that can be an answer to the question in the document, or an answer suitability of the document with respect to the question, by using neural networks, wherein the processing unit includes a first neural network configured to calculate the answer range, and a second neural network configured to calculate the answer suitability, and between the first neural network and the second neural network, part of layers constituting both neural networks is shared.
    Type: Grant
    Filed: October 5, 2018
    Date of Patent: October 25, 2022
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kyosuke Nishida, Itsumi Saito, Atsushi Otsuka, Hisako Asano, Junji Tomita
  • Patent number: 11475911
    Abstract: In communication performed among multiple participants, at least one of a participant who will start speaking next and a timing thereof is estimated. An estimation apparatus includes a head motion information generation unit that acquires head motion information representing head motions of communication participants in a time segment corresponding to an end time of an utterance segment and synchronization information for head motions between the communication participants, and an estimation unit that estimates at least one of the speaker of the next utterance segment following the utterance segment and the next utterance start timing following the utterance segment based on the head motion information and the synchronization information for the head motions between the communication participants.
    Type: Grant
    Filed: February 5, 2019
    Date of Patent: October 18, 2022
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Ryo Ishii, Ryuichiro Higashinaka, Junji Tomita, Shiro Kumano, Kazuhiro Otsuka
  • Patent number: 11429784
    Abstract: To make it possible to generate a response sentence with respect to an input speech sentence without preparing a large amount of data. A response-type determining unit 117 determines, based on an analysis result of a speech sentence analyzed by a speech-content analyzing unit 112, a speech type indicating a type of the speech sentence and determines a response type with respect to the determined speech type based on the speech type and a type conversion rule prescribing, for each speech type, a rule for a response type with respect to a speech of the speech type. A response-sentence generating unit 119 generates the response sentence based on the response type and a response sentence database.
    Type: Grant
    Filed: March 27, 2019
    Date of Patent: August 30, 2022
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Nozomi Kobayashi, Kuniko Saito, Junji Tomita
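The type conversion rule and response sentence database in the abstract above can be pictured as two lookup tables. The speech types, rules, and canned responses below are invented for illustration; the patent's actual rule set is not given in the abstract.

```python
# Hypothetical type conversion rule: speech type -> response type
TYPE_CONVERSION_RULE = {
    "question": "answer",
    "greeting": "greeting",
    "opinion": "agreement",
}

# Hypothetical response sentence database: response type -> sentence
RESPONSE_SENTENCE_DB = {
    "answer": "I see. Let me answer that.",
    "greeting": "Hello!",
    "agreement": "I agree with you.",
}

def generate_response(speech_type):
    # determine the response type from the rule, then look up a sentence
    response_type = TYPE_CONVERSION_RULE.get(speech_type, "greeting")
    return RESPONSE_SENTENCE_DB[response_type]
```

Because both steps are table lookups, such a system needs no large training corpus, which matches the stated aim of generating responses without preparing a large amount of data.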
  • Publication number: 20220269856
    Abstract: A structured text processing learning apparatus includes a memory, and a processor configured to: analyze a structure of structured text, extract, from the structured text, information related to a predetermined structure to be extracted based on the structure, generate a converted document including text data in which each string indicating a structure of the extracted information has been converted, in accordance with the structure related to the extracted information, and train a neural network for executing predetermined processing for the converted document, using, as input, the generated converted document and correct answer information for performing the predetermined processing. Thus, application of a neural network to structured text is facilitated.
    Type: Application
    Filed: August 1, 2019
    Publication date: August 25, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Narichika NOMOTO, Hisako ASANO, Junji TOMITA
  • Publication number: 20220261536
    Abstract: An expanded utterance that is used to output a more appropriate output utterance for an utterance can be generated. An utterance sentence expansion device includes an expansion unit that, for an utterance to be expanded that includes a noun and has been morphologically analyzed in advance, uses an expansion dictionary containing higher-level categories of nouns to insert one or more higher-level categories corresponding to the noun into the position immediately before that noun, thereby generating an expanded utterance.
    Type: Application
    Filed: April 10, 2020
    Publication date: August 18, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Ko MITSUDA, Ryuichiro HIGASHINAKA, Junji TOMITA
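The expansion step in the abstract above reduces to inserting dictionary categories before each noun of a tokenized utterance. A minimal sketch, with an assumed toy expansion dictionary (the real dictionary's contents are not given in the abstract):

```python
# Hypothetical expansion dictionary: noun -> higher-level categories
EXPANSION_DICT = {
    "ramen": ["food", "noodles"],
    "kyoto": ["city"],
}

def expand_utterance(tokens):
    """Insert each known noun's higher-level categories before the noun."""
    expanded = []
    for token in tokens:
        expanded.extend(EXPANSION_DICT.get(token.lower(), []))  # categories first
        expanded.append(token)                                  # then the token
    return expanded
```

The expanded utterance carries extra category words, so a downstream retrieval or generation module can match it against responses that mention the category rather than the specific noun.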
  • Publication number: 20220261556
    Abstract: It is possible to ask an appropriate question for digging deeper into an utterance of the other party. An interrogative search unit estimates, with a text that is an utterance sentence as an input, an estimated used interrogative related to the text, by using a predetermined rule or a previously trained estimator. A candidate utterance sentence generation unit generates candidate utterance sentences for the utterance sentence with the text as an input through automatic utterance generation. Based on each of the candidate utterance sentences and the estimation result of the estimated used interrogative, a ranking unit scores and ranks the candidate utterance sentences.
    Type: Application
    Filed: October 9, 2019
    Publication date: August 18, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Taichi KATAYAMA, Atsushi OTSUKA, Ko MITSUDA, Ryuichiro HIGASHINAKA, Junji TOMITA
  • Publication number: 20220253591
    Abstract: A structured text processing apparatus includes one or more computers each including a memory and a processor configured to analyze a tree structure of a structured text; specify, for each leaf node in the tree structure, a path from the leaf node to a root node; and generate a converted text including text data in which strings associated with respective nodes from the root node to the leaf node of each path are connected to each other.
    Type: Application
    Filed: August 1, 2019
    Publication date: August 11, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Narichika NOMOTO, Hisako ASANO, Junji TOMITA
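The path-based conversion in the abstract above can be sketched as a recursive walk that emits, for each leaf, the strings along its root-to-leaf path joined into one line of text. The `(label, children)` node shape and the `" / "` separator are assumptions for illustration:

```python
def leaf_paths(node, prefix=()):
    """Yield one converted text line per leaf of a structured document.

    node: a (label, children) pair, where children is a list of nodes
    """
    label, children = node
    path = prefix + (label,)
    if not children:
        # leaf reached: emit the full root-to-leaf path as one line
        yield " / ".join(path)
    else:
        for child in children:
            yield from leaf_paths(child, path)
```

Each emitted line is flat text that still encodes the leaf's structural context, which is what lets a text-oriented neural network be applied to the structured input.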
  • Patent number: 11403469
    Abstract: The present invention makes it possible to generate a paraphrastic sentence that has a similar meaning to the original sentence despite a local word/phrase difference, or a non-paraphrastic sentence that is not a paraphrase despite having a similar meaning to the original sentence in terms of the entire sentence. An estimation unit 22 estimates a word deletion probability for each of words constituting an input sentence, by using a positive example model that has been trained based on a positive example constituted by a sentence and a paraphrastic sentence of the sentence, and is used to generate a paraphrastic sentence by deleting a word, or by using a negative example model that has been trained based on a negative example constituted by the sentence and a non-paraphrastic sentence of the sentence, and is used to generate a non-paraphrastic sentence by deleting a word.
    Type: Grant
    Filed: July 23, 2019
    Date of Patent: August 2, 2022
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi Saito, Kyosuke Nishida, Hisako Asano, Junji Tomita
  • Patent number: 11404063
    Abstract: A nonverbal information generation apparatus includes a nonverbal information generation unit that generates time-information-stamped nonverbal information that corresponds to time-information-stamped text feature quantities and an expression unit that expresses the nonverbal information on the basis of the time-information-stamped text feature quantities and a learned nonverbal information generation model. The time-information-stamped text feature quantities are configured to include feature quantities that have been extracted from text and time information representing times assigned to predetermined units of the text. The nonverbal information is information for controlling the expression unit so as to express behavior corresponding to the text.
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: August 2, 2022
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Ryo Ishii, Ryuichiro Higashinaka, Taichi Katayama, Junji Tomita, Nozomi Kobayashi, Kyosuke Nishida
  • Publication number: 20220229997
    Abstract: A generation unit that takes a question Qi that is a word sequence representing a current question in a dialogue, a document P used to generate an answer Ai to the question Qi, a question history {Qi-1, . . . , Qi-k} that is a set of word sequences representing k past questions, and an answer history {Ai-1, . . . , Ai-k} that is a set of word sequences representing answers to the k questions as inputs, and generates the answer Ai by machine reading comprehension in an extractive mode or a generative mode using pre-trained model parameters is provided.
    Type: Application
    Filed: May 28, 2019
    Publication date: July 21, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
  • Publication number: 20220207239
    Abstract: An utterance pair for expansion, necessary for outputting an appropriate output utterance for an input utterance, can be acquired.
    Type: Application
    Filed: April 10, 2020
    Publication date: June 30, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Ko MITSUDA, Ryuichiro HIGASHINAKA, Taichi KATAYAMA, Junji TOMITA
  • Publication number: 20220164545
    Abstract: To enable accurate estimation of a dialogue act type taking utterance subject into account. A feature value extraction unit 130 extracts feature values including an utterance subject feature value which is a feature value related to an utterance subject of an utterance sentence for each of a first utterance sentence and a second utterance sentence, the second utterance sentence being an utterance sentence preceding the first utterance sentence, including at least the utterance sentence immediately preceding the first utterance sentence. A dialogue act estimation unit 260 estimates a dialogue act type of the first utterance sentence using the aggregate feature value generated by aggregating the extracted feature values for each of the first utterance sentence and the second utterance sentence and a previously learned dialogue act estimation model for estimating the dialogue act type indicating a kind of dialogue act taking into account the utterance subject of an utterance sentence.
    Type: Application
    Filed: March 25, 2020
    Publication date: May 26, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Nozomi KOBAYASHI, Kuniko SAITO, Junji TOMITA
  • Publication number: 20220138239
    Abstract: A sentence generation device has: an estimation unit for receiving input of a first sentence and an output length, and estimating importance of each word constituting the first sentence using a pre-trained model; and a generation unit for generating a second sentence based on the importance, and thus makes it possible to evaluate importance of a constituent element of an input sentence, in correspondence with a designated output length.
    Type: Application
    Filed: February 25, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
  • Publication number: 20220138267
    Abstract: A generation apparatus includes a generation unit configured to use a machine learning model learned in advance, with a document as an input, to generate a question representation for a range of an answer in the document, wherein when generating a word of the question representation by performing a copy from the document, the generation unit adjusts a probability that a word included in the range is copied.
    Type: Application
    Filed: February 12, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi OTSUKA, Kyosuke NISHIDA, Itsumi SAITO, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA
  • Publication number: 20220138438
    Abstract: A sentence generation device has: an estimation unit for receiving input of a first sentence and a focus point related to generation of a second sentence to be generated based on the first sentence, and estimating importance of each word constituting the first sentence using a pre-trained model; and a generation unit for generating the second sentence based on the importance, and thus makes it possible to evaluate importance of a constituent element of an input sentence in correspondence with a designated focus point.
    Type: Application
    Filed: February 21, 2020
    Publication date: May 5, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Atsushi OTSUKA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA