Patents by Inventor Kyosuke NISHIDA

Kyosuke NISHIDA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12292904
    Abstract: A sentence generation device has: an estimation unit that receives a first sentence and an output length as input and estimates the importance of each word constituting the first sentence using a pre-trained model; and a generation unit that generates a second sentence based on the importance. The device thus makes it possible to evaluate the importance of the constituent elements of an input sentence in correspondence with a designated output length.
    Type: Grant
    Filed: February 25, 2020
    Date of Patent: May 6, 2025
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi Saito, Kyosuke Nishida, Atsushi Otsuka, Kosuke Nishida, Hisako Asano, Junji Tomita
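
A minimal sketch of the length-conditioned importance idea in the entry above (patent 12292904). It is not the patented model: the vocabulary, dimensions, and the way the output length is bucketed and concatenated are all illustrative assumptions, and "generation" is reduced to keeping the top-scoring words.

```python
import torch
import torch.nn as nn

class LengthConditionedImportance(nn.Module):
    """Toy estimator: score each word of a first sentence given a target output length."""
    def __init__(self, vocab_size=1000, dim=64, max_len_bucket=10):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)
        self.len_emb = nn.Embedding(max_len_bucket, dim)     # bucketed output length (assumption)
        self.scorer = nn.Linear(2 * dim, 1)

    def forward(self, word_ids, length_bucket):
        w = self.word_emb(word_ids)                          # (seq, dim)
        l = self.len_emb(length_bucket).expand_as(w)         # broadcast the length embedding
        scores = self.scorer(torch.cat([w, l], dim=-1))      # (seq, 1)
        return torch.sigmoid(scores).squeeze(-1)             # per-word importance in [0, 1]

# Usage: keep the words judged most important for a short target length,
# as raw material for the second sentence.
model = LengthConditionedImportance()
word_ids = torch.tensor([12, 7, 99, 3, 58])                  # ids of the first sentence's words
importance = model(word_ids, torch.tensor(2))                # bucket 2 = "short output"
keep = torch.topk(importance, k=3).indices
print(importance, keep)
```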
  • Patent number: 12210847
    Abstract: A sentence generation device has: an estimation unit that receives as input a first sentence and a focus point related to the generation of a second sentence based on the first sentence, and estimates the importance of each word constituting the first sentence using a pre-trained model; and a generation unit that generates the second sentence based on the importance. The device thus makes it possible to evaluate the importance of the constituent elements of an input sentence in correspondence with a designated focus point.
    Type: Grant
    Filed: February 21, 2020
    Date of Patent: January 28, 2025
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi Saito, Kyosuke Nishida, Atsushi Otsuka, Kosuke Nishida, Hisako Asano, Junji Tomita
  • Publication number: 20250028906
    Abstract: A language processing device includes a hardware processor configured to: extract a feature amount from text data based on model parameters of a neural network; output an answer start point score, an answer end point score, and an answer possibility score using the feature amount as an input, based on model parameters of the neural network; extract a predetermined number n of answer suitability scores based on the answer start point score and the answer end point score; obtain n adjusted answer suitability scores from the n answer suitability scores and obtain an adjusted answer possibility score from the answer possibility score, based on model parameters of the neural network; and learn the model parameters based on the n adjusted answer suitability scores, the adjusted answer possibility score, a correct answer section, and a correct answer possibility.
    Type: Application
    Filed: December 6, 2021
    Publication date: January 23, 2025
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Sen YOSHIDA
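
A toy sketch of the scoring scheme in the publication above (20250028906): start, end, and answerability scores over token features, top-n span "suitability" scores, and a training step against a gold span. The adjustment networks described in the abstract are not reproduced; all names and sizes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpanQAHead(nn.Module):
    """Toy head over token features: start/end scores per token plus an answer possibility score."""
    def __init__(self, dim=64):
        super().__init__()
        self.start = nn.Linear(dim, 1)
        self.end = nn.Linear(dim, 1)
        self.possible = nn.Linear(dim, 1)     # uses the first token as a pooled summary (assumption)

    def forward(self, feats):                 # feats: (seq, dim)
        s = self.start(feats).squeeze(-1)     # (seq,)
        e = self.end(feats).squeeze(-1)       # (seq,)
        p = self.possible(feats[0])           # (1,)
        return s, e, p

def top_n_spans(start, end, n=5, max_len=10):
    """Score spans by start+end logits and keep the n best (a stand-in for answer suitability)."""
    spans = []
    for i in range(len(start)):
        for j in range(i, min(i + max_len, len(start))):
            spans.append((float(start[i] + end[j]), i, j))
    return sorted(spans, reverse=True)[:n]

# One toy training step against a correct answer section and a correct answer possibility.
head = SpanQAHead()
feats = torch.randn(20, 64)                   # stand-in for encoder output over 20 tokens
s, e, p = head(feats)
gold_start, gold_end, answerable = 3, 6, torch.tensor([1.0])
loss = (F.cross_entropy(s.unsqueeze(0), torch.tensor([gold_start]))
        + F.cross_entropy(e.unsqueeze(0), torch.tensor([gold_end]))
        + F.binary_cross_entropy_with_logits(p, answerable))
loss.backward()
print(top_n_spans(s.detach(), e.detach()))
```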
  • Publication number: 20250021762
    Abstract: A language processing device includes circuitry configured to: generate an error sentence corresponding to an original sentence, based on the pronunciation of text data representing the original sentence; use a language model based on a neural network to generate a prediction sentence from the error sentence, based on a language model parameter of the language model; and update the language model parameter based on a difference between the original sentence and the prediction sentence.
    Type: Application
    Filed: December 1, 2021
    Publication date: January 16, 2025
    Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Sen YOSHIDA
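
A minimal sketch of the training loop shape in the publication above (20250021762). The pronunciation-based error generation is faked with a hand-written single-character confusion table, and the language model is a tiny character GRU; both are assumptions for illustration only.

```python
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical confusion table; the real device derives errors from pronunciation data.
# Single-character swaps keep the error sentence the same length as the original.
CONFUSIONS = {"b": "v", "v": "b", "l": "r", "r": "l", "m": "n", "n": "m"}

def make_error_sentence(original: str, rate: float = 0.3) -> str:
    """Replace characters with pronunciation-confusable ones to simulate an error sentence."""
    return "".join(CONFUSIONS[c] if c in CONFUSIONS and random.random() < rate else c
                   for c in original)

class TinyCharModel(nn.Module):
    """Toy character-level model that predicts the original character at each position."""
    def __init__(self, n_chars=128, dim=64):
        super().__init__()
        self.emb = nn.Embedding(n_chars, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, n_chars)

    def forward(self, ids):                       # ids: (1, seq)
        h, _ = self.rnn(self.emb(ids))
        return self.out(h)                        # (1, seq, n_chars)

original = "the ball is small"
error = make_error_sentence(original)
enc = lambda s: torch.tensor([[min(ord(c), 127) for c in s]])

model = TinyCharModel()
logits = model(enc(error))                                            # prediction sentence (logits)
loss = F.cross_entropy(logits.squeeze(0), enc(original).squeeze(0))   # original vs. prediction difference
loss.backward()                                                       # drives the parameter update
print(error, float(loss))
```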
  • Publication number: 20250005913
    Abstract: An image processing apparatus according to the present disclosure extracts a feature amount of image data and includes: an image understanding unit 41 that vectorizes an image pattern of the image data to extract an image feature amount; a text understanding unit 43 that vectorizes a text pattern of text data attached to the image data to extract a text feature amount; and a feature amount mixing unit 44 that generates a mixed feature amount by projecting the image feature amount extracted by the image understanding unit 41 and the text feature amount extracted by the text understanding unit 43 onto the same vector space and mixing them.
    Type: Application
    Filed: June 24, 2022
    Publication date: January 2, 2025
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Shuichi NISHIOKA
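
A minimal sketch of the projection-and-mixing step in the publication above (20250005913). The input dimensions and the concatenation-plus-linear mixing layer are assumptions; the actual understanding units are not modeled here.

```python
import torch
import torch.nn as nn

class FeatureMixer(nn.Module):
    """Toy version of projecting an image feature and a text feature into one space and mixing them."""
    def __init__(self, image_dim=2048, text_dim=768, shared_dim=256):
        super().__init__()
        self.image_proj = nn.Linear(image_dim, shared_dim)   # image feature amount -> shared space
        self.text_proj = nn.Linear(text_dim, shared_dim)     # text feature amount  -> shared space
        self.mix = nn.Linear(2 * shared_dim, shared_dim)     # simple mixing layer (assumption)

    def forward(self, image_feat, text_feat):
        img = self.image_proj(image_feat)
        txt = self.text_proj(text_feat)
        return self.mix(torch.cat([img, txt], dim=-1))       # mixed feature amount

mixer = FeatureMixer()
image_feat = torch.randn(1, 2048)       # stand-in for a vectorized image pattern
text_feat = torch.randn(1, 768)         # stand-in for a vectorized attached-text pattern
mixed = mixer(image_feat, text_feat)
print(mixed.shape)                      # torch.Size([1, 256])
```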
  • Patent number: 12165672
    Abstract: A nonverbal information generation apparatus includes a display unit that partitions text into predetermined units, displays the partitioned text, and, in association with those units, makes visible nonverbal information that represents the behavior of a verbal output agent, or of a receiver of the agent's verbal information, when the verbal output agent outputs the verbal information corresponding to the text.
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: December 10, 2024
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Ryo Ishii, Ryuichiro Higashinaka, Taichi Katayama, Junji Tomita, Nozomi Kobayashi, Kyosuke Nishida
  • Patent number: 12112275
    Abstract: There is provided a learning device for learning a neural network used to search external knowledge, in order to increase the search accuracy of the external knowledge required for arithmetic processing. With an input sentence Q as input, an external knowledge search unit 22 uses the neural network to select pieces of external knowledge from an external knowledge database 2 based on their degrees of similarity to the input sentence Q, and outputs the selected pieces as search results R2. A processing unit 14 acquires a response sentence A to the input sentence Q by arithmetic processing that takes the input sentence Q and the selected pieces of external knowledge as input. A consideration calculation unit 23 calculates a consideration v determined from an index indicating the correctness of the response sentence A, based on a true output T given for the input sentence Q in advance, and an index indicating the quality of the selected pieces of external knowledge.
    Type: Grant
    Filed: November 8, 2019
    Date of Patent: October 8, 2024
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke Nishida, Kyosuke Nishida, Hisako Asano, Junji Tomita
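
A toy sketch loosely following the patent above (12112275): similarity-based selection of knowledge pieces and a "consideration" (reward) built from answer correctness and knowledge quality. The REINFORCE-style surrogate loss and the reward composition are my own illustrative assumptions, not the patented formulation.

```python
import torch
import torch.nn as nn

class KnowledgeScorer(nn.Module):
    """Toy scorer: similarity between an input-sentence vector and each knowledge vector."""
    def __init__(self, dim=64):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)

    def forward(self, q_vec, knowledge_vecs):           # (dim,), (num_knowledge, dim)
        q = self.q_proj(q_vec)
        k = self.k_proj(knowledge_vecs)
        return k @ q                                    # similarity score per knowledge piece

scorer = KnowledgeScorer()
q_vec = torch.randn(64)                                 # encoded input sentence Q
knowledge_vecs = torch.randn(100, 64)                   # encoded external knowledge database
scores = scorer(q_vec, knowledge_vecs)

# Sample knowledge pieces (search results R2) and reward the choice.
probs = torch.softmax(scores, dim=-1)
picked = torch.multinomial(probs, num_samples=3)        # selected external knowledge
answer_correctness = 0.8                                # e.g. overlap between answer A and true output T
knowledge_quality = 0.5                                 # e.g. fraction of picked pieces actually useful
reward = answer_correctness + knowledge_quality         # stand-in for the consideration v

# Surrogate loss: raise the log-probability of the picks in proportion to the reward.
loss = -(reward * torch.log(probs[picked] + 1e-9)).sum()
loss.backward()
print(picked.tolist(), float(reward))
```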
  • Publication number: 20240320440
    Abstract: Provided is a generation unit that takes as inputs a question Qi, which is a word sequence representing the current question in a dialogue, a document P used to generate an answer Ai to the question Qi, a question history {Qi-1, . . . , Qi-k}, which is a set of word sequences representing the k past questions, and an answer history {Ai-1, . . . , Ai-k}, which is a set of word sequences representing the answers to those k questions, and generates the answer Ai by machine reading comprehension in an extractive mode or a generative mode using pre-trained model parameters.
    Type: Application
    Filed: May 22, 2024
    Publication date: September 26, 2024
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
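
A small illustration of the input side of the publication above (20240320440): packing the question history, answer history, current question, and document into one reading-comprehension input. The separator strings and truncation policy are assumptions; the generation unit itself is not modeled.

```python
# Build a single reader input from a dialogue history, the current question, and the document.
def build_reader_input(question, document, question_history, answer_history, k=2):
    """Interleave the k most recent past questions/answers before the current question."""
    turns = []
    for past_q, past_a in list(zip(question_history, answer_history))[-k:]:
        turns.append(f"[Q] {past_q} [A] {past_a}")
    return " ".join(turns + [f"[Q] {question}", f"[P] {document}"])

question_history = ["Who wrote the report?", "When was it published?"]
answer_history = ["The NLP group.", "In 2019."]
current_question = "What did it conclude?"
document = "The 2019 report by the NLP group concluded that span extraction improves accuracy."

reader_input = build_reader_input(current_question, document, question_history, answer_history)
print(reader_input)
# A reading-comprehension model (extractive or generative) would consume this string
# and either extract a span from the document or generate the answer Ai.
```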
  • Publication number: 20240321011
    Abstract: A nonverbal information generation apparatus includes a nonverbal information generation unit that generates time-information-stamped nonverbal information that corresponds to time-information-stamped text feature quantities on the basis of the time-information-stamped text feature quantities and a learned nonverbal information generation model. The time-information-stamped text feature quantities are configured to include feature quantities that have been extracted from text and time information representing times assigned to predetermined units of the text. The nonverbal information is information for controlling an expression unit that expresses behavior that corresponds to the text.
    Type: Application
    Filed: April 10, 2024
    Publication date: September 26, 2024
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Ryo ISHII, Ryuichiro Higashinaka, Taichi Katayama, Junji Tomita, Nozomi Kobayashi, Kyosuke Nishida
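
A toy sketch of the mapping described in the publication above (20240321011): time-information-stamped text feature quantities go in, a nonverbal behavior label per text unit comes out. The gesture label set, the recurrent model, and the way the time stamp is appended to each feature are illustrative assumptions.

```python
import torch
import torch.nn as nn

GESTURES = ["none", "nod", "head_shake", "hand_gesture"]   # illustrative behavior labels

class NonverbalGenerator(nn.Module):
    """Toy model: per text unit, map a (feature, time) pair to a behavior label."""
    def __init__(self, feat_dim=32, hidden=64, n_labels=len(GESTURES)):
        super().__init__()
        self.rnn = nn.GRU(feat_dim + 1, hidden, batch_first=True)   # +1 for the time stamp
        self.out = nn.Linear(hidden, n_labels)

    def forward(self, feats, times):                  # (1, units, feat_dim), (1, units)
        x = torch.cat([feats, times.unsqueeze(-1)], dim=-1)
        h, _ = self.rnn(x)
        return self.out(h)                            # (1, units, n_labels)

model = NonverbalGenerator()
feats = torch.randn(1, 5, 32)                         # text feature per predetermined unit
times = torch.tensor([[0.0, 0.4, 0.9, 1.3, 1.8]])     # time assigned to each unit (seconds)
labels = model(feats, times).argmax(dim=-1)
print([GESTURES[i] for i in labels[0].tolist()])      # behavior for the expression unit, per text unit
```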
  • Patent number: 12056168
    Abstract: A learning apparatus according to an embodiment has a feature generation means configured to take a search query, a first document related to the search query, and a second document that is not related to the search query as input, and generate a feature of the search query, a feature of the first document, and a feature of the second document, by using model parameters of a neural network, and an update means configured to take the feature of the search query, the feature of the first document, and the feature of the second document as input, and update the model parameters by using an error function including a cost function that is a differentiable approximation function of an L0 norm.
    Type: Grant
    Filed: January 29, 2020
    Date of Patent: August 6, 2024
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Taku Hasegawa, Kyosuke Nishida, Junji Tomita, Hisako Asano
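
A sketch of the training objective shape in the patent above (12056168): features for a search query, a related document, and an unrelated document, plus an error function that adds a differentiable approximation of the L0 norm. The particular smooth surrogate (1 − exp(−αx) over non-negative features) and the margin ranking term are illustrative choices, not the patent's exact functions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseFeatureEncoder(nn.Module):
    """Toy encoder producing non-negative (ideally sparse) features for queries and documents."""
    def __init__(self, in_dim=128, out_dim=512):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x):
        return F.relu(self.proj(x))

def approx_l0(x, alpha=10.0):
    """One differentiable surrogate for counting non-zero entries of non-negative features."""
    return (1.0 - torch.exp(-alpha * x)).sum()

encoder = SparseFeatureEncoder()
query = torch.randn(1, 128)
doc_pos = torch.randn(1, 128)     # first document: related to the query
doc_neg = torch.randn(1, 128)     # second document: not related

q, dp, dn = encoder(query), encoder(doc_pos), encoder(doc_neg)
rank_loss = F.relu(1.0 - (q * dp).sum() + (q * dn).sum())      # relevant doc should score higher
sparsity = 1e-3 * (approx_l0(q) + approx_l0(dp) + approx_l0(dn))
loss = rank_loss + sparsity                                     # error = ranking cost + L0 surrogate
loss.backward()
print(float(rank_loss), float(sparsity))
```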
  • Patent number: 12026472
    Abstract: Provided is a generation unit that takes as inputs a question Qi, which is a word sequence representing the current question in a dialogue, a document P used to generate an answer Ai to the question Qi, a question history {Qi-1, . . . , Qi-k}, which is a set of word sequences representing the k past questions, and an answer history {Ai-1, . . . , Ai-k}, which is a set of word sequences representing the answers to those k questions, and generates the answer Ai by machine reading comprehension in an extractive mode or a generative mode using pre-trained model parameters.
    Type: Grant
    Filed: May 28, 2019
    Date of Patent: July 2, 2024
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Yasuhito Osugi, Itsumi Saito, Kyosuke Nishida, Hisako Asano, Junji Tomita
  • Publication number: 20240202442
    Abstract: A text generation apparatus includes a content selection unit that acquires a reference text based on an input text and information different from the input text, and a generation unit that generates a text based on the input text and the reference text. The content selection unit and the generation unit are neural networks based on learned parameters, so that information to be considered when generating a text can be added as text.
    Type: Application
    Filed: March 4, 2024
    Publication date: June 20, 2024
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi SAITO, Kyosuke NISHIDA, Kosuke NISHIDA, Hisako ASANO, Junji TOMITA, Atsushi OTSUKA
  • Publication number: 20240202495
    Abstract: A learning apparatus executes receiving a text and a question associated with the text, and calculating an evidence score expressing a likelihood of a character string included in the text as evidence for an answer to the question by using a model parameter of a first neural network; extracting, by sampling from a predetermined distribution having the evidence score as a parameter, a first set indicating a set of the character strings as the evidence for the answer from the text; receiving the question and the first set and extracting the answer from the first set by using a model parameter of a second neural network; and learning the model parameters of the first and second neural networks by calculating a gradient through error back propagation by using a continuous relaxation and a first loss between the answer and a true answer to the question.
    Type: Application
    Filed: March 6, 2020
    Publication date: June 20, 2024
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Itsumi SAITO, Hisako ASANO, Junji TOMITA
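
A toy sketch of the two-network setup in the publication above (20240202495): a first network scores candidate evidence, a relaxed sampling step selects an evidence set, and a second network answers from it so that gradients can flow back through the relaxation. Gumbel-Softmax is used here as one possible continuous relaxation; the encoders and pooling are stand-ins.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EvidenceScorer(nn.Module):
    """First toy network: a drop/keep score per candidate sentence."""
    def __init__(self, dim=64):
        super().__init__()
        self.score = nn.Linear(dim, 2)            # logits for (drop, keep)

    def forward(self, sent_feats):                # (n_sents, dim)
        return self.score(sent_feats)

class Reader(nn.Module):
    """Second toy network: answers from the (soft) evidence set by weighted pooling."""
    def __init__(self, dim=64, n_answers=4):
        super().__init__()
        self.out = nn.Linear(dim, n_answers)

    def forward(self, sent_feats, keep_weights):  # keep_weights: (n_sents,)
        pooled = (keep_weights.unsqueeze(-1) * sent_feats).sum(dim=0)
        return self.out(pooled)

scorer, reader = EvidenceScorer(), Reader()
sent_feats = torch.randn(6, 64)                   # stand-in features of 6 candidate sentences
logits = scorer(sent_feats)                       # evidence scores

# Continuous relaxation of sampling the evidence set, so gradients reach the scorer.
keep = F.gumbel_softmax(logits, tau=0.5, hard=True)[:, 1]
answer_logits = reader(sent_feats, keep)
loss = F.cross_entropy(answer_logits.unsqueeze(0), torch.tensor([2]))   # 2 = toy true answer id
loss.backward()
print(keep.tolist())
```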
  • Patent number: 11989976
    Abstract: A nonverbal information generation apparatus includes a nonverbal information generation unit that generates time-information-stamped nonverbal information that corresponds to time-information-stamped text feature quantities on the basis of the time-information-stamped text feature quantities and a learned nonverbal information generation model. The time-information-stamped text feature quantities are configured to include feature quantities that have been extracted from text and time information representing times assigned to predetermined units of the text. The nonverbal information is information for controlling an expression unit that expresses behavior that corresponds to the text.
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: May 21, 2024
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Ryo Ishii, Ryuichiro Higashinaka, Taichi Katayama, Junji Tomita, Nozomi Kobayashi, Kyosuke Nishida
  • Patent number: 11972365
    Abstract: A question generation device includes generating means which takes a query and a relevant document including an answer to the query as input and, using a machine learning model trained in advance, generates a revised query in which a potentially defective portion of the query is supplemented with a word included in a prescribed lexical set.
    Type: Grant
    Filed: April 25, 2019
    Date of Patent: April 30, 2024
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Atsushi Otsuka, Kyosuke Nishida, Itsumi Saito, Kosuke Nishida, Hisako Asano, Junji Tomita
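
A small illustration of the revision idea in the patent above (11972365): supplement an under-specified query with a word from a prescribed lexical set, chosen by how strongly the relevant document supports it. The patented device uses a trained model for this; the count heuristic below is only a stand-in.

```python
# Supplement a potentially defective (too vague) query with a word from a prescribed lexical set.
LEXICAL_SET = ["fee", "deadline", "location", "schedule", "refund"]

def revise_query(query: str, relevant_document: str) -> str:
    doc = relevant_document.lower()
    scores = {w: doc.count(w) for w in LEXICAL_SET}      # a trained scorer in the real device
    best = max(scores, key=scores.get)
    return f"{query} {best}" if scores[best] > 0 else query

query = "application"
document = ("The application fee is 30 dollars. The fee must be paid before the deadline. "
            "Fee waivers are available for students.")
print(revise_query(query, document))   # -> "application fee"
```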
  • Patent number: 11954435
    Abstract: A text generation apparatus includes a memory and a processor configured to execute acquiring a reference text based on an input text and information different from the input text; and generating a text based on the input text and the reference text, wherein the acquiring and the generating are implemented as neural networks based on learned parameters.
    Type: Grant
    Filed: March 3, 2020
    Date of Patent: April 9, 2024
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi Saito, Kyosuke Nishida, Kosuke Nishida, Hisako Asano, Junji Tomita, Atsushi Otsuka
  • Publication number: 20240054295
    Abstract: A learning apparatus includes a memory and at least one processor connected to the memory, wherein the processor is configured to: convert input text data into a feature amount sequence based on a language model; and update parameters of the language model based on the text data, the feature amount sequence, and a word vector learned in advance.
    Type: Application
    Filed: March 8, 2021
    Publication date: February 15, 2024
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke NISHIDA, Kyosuke NISHIDA, Sen YOSHIDA
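
A toy sketch of one plausible reading of the publication above (20240054295): update language-model parameters using the text, its feature amount sequence, and pre-learned word vectors. The cosine alignment term between per-token features and frozen word vectors is my own assumption, clearly not the publication's stated method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyLM(nn.Module):
    """Toy language model that also exposes its per-token feature amount sequence."""
    def __init__(self, vocab=1000, dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab)

    def forward(self, ids):                    # (1, seq)
        feats, _ = self.rnn(self.emb(ids))     # feature amount sequence
        return feats, self.head(feats)

lm = TinyLM()
pretrained_vecs = torch.randn(1000, 64)        # word vectors learned in advance (frozen)
ids = torch.randint(0, 1000, (1, 12))

feats, logits = lm(ids)
# Next-token loss on the text itself ...
lm_loss = F.cross_entropy(logits[:, :-1].reshape(-1, 1000), ids[:, 1:].reshape(-1))
# ... plus an alignment term pulling each token's feature toward its pre-learned word vector.
align_loss = 1.0 - F.cosine_similarity(feats, pretrained_vecs[ids], dim=-1).mean()
loss = lm_loss + 0.1 * align_loss
loss.backward()
print(float(lm_loss), float(align_loss))
```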
  • Patent number: 11893353
    Abstract: The aim is to make it possible to accurately generate a word vector even if the vocabulary of a word vector data set is not limited. In a vector generating device 10 that generates vectors representing an input sentence P based on vectors corresponding to the words included in the input sentence P, a definition-sentence-considered-context encode unit 280 generates the series of vectors representing the input sentence P by using, for each word of the input sentence P that is a headword stored in a dictionary DB 230, the definition sentence Dy of that headword y, where the dictionary DB 230 stores sets of headwords y and definition sentences Dy, which are sentences defining the headwords y.
    Type: Grant
    Filed: March 4, 2019
    Date of Patent: February 6, 2024
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Kosuke Nishida, Kyosuke Nishida, Hisako Asano, Junji Tomita
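
A minimal sketch of the core idea in the patent above (11893353): build a vector for a headword by encoding its dictionary definition sentence, and use it when encoding a sentence that contains that word. The dictionary entries, character-level embedding, and averaging are illustrative assumptions, not the patented encoder.

```python
import torch
import torch.nn as nn

# Tiny dictionary DB: headword -> definition sentence (illustrative entries).
DICTIONARY = {
    "quokka": "a small wallaby that lives in western australia",
    "perceptron": "a simple model of a neuron used in machine learning",
}

class DefinitionEncoder(nn.Module):
    """Build a vector for a headword by averaging encoded words of its definition sentence."""
    def __init__(self, dim=64):
        super().__init__()
        self.char_emb = nn.Embedding(128, dim)     # character-level, so any word can be embedded

    def word_vec(self, word):
        ids = torch.tensor([min(ord(c), 127) for c in word])
        return self.char_emb(ids).mean(dim=0)

    def forward(self, headword):
        definition = DICTIONARY[headword]
        vecs = torch.stack([self.word_vec(w) for w in definition.split()])
        return vecs.mean(dim=0)                    # vector for the headword via its definition sentence

encoder = DefinitionEncoder()
# For a sentence containing a dictionary headword, use the definition-based vector for that word.
sentence = "the quokka smiled"
vectors = [encoder(w) if w in DICTIONARY else encoder.word_vec(w) for w in sentence.split()]
print(torch.stack(vectors).shape)                  # one vector per word of the input sentence P
```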
  • Publication number: 20230306202
    Abstract: A language processing apparatus includes: a preprocessing unit that splits an input text into a plurality of short texts; a language processing unit that calculates a first feature and a second feature, using a trained model, for each of the plurality of short texts; and an external storage unit configured to store a third feature for one or more of the short texts. The language processing unit uses the trained model to calculate the second feature for a given short text from the first feature of that short text and the third feature stored in the external storage unit.
    Type: Application
    Filed: August 20, 2020
    Publication date: September 28, 2023
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Yasuhito OSUGI, Itsumi SAITO, Kyosuke NISHIDA, Hisako ASANO, Junji TOMITA
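
A toy sketch of the data flow in the publication above (20230306202): encode an input text chunk by chunk, and let each chunk's second feature also see a stored feature accumulated from earlier chunks. The running-mean update of the stored "third feature" and the character-level chunk encoder are assumptions.

```python
import torch
import torch.nn as nn

class ChunkEncoder(nn.Module):
    """Toy encoder: combines the current chunk's feature with a feature stored from earlier chunks."""
    def __init__(self, dim=64):
        super().__init__()
        self.emb = nn.Embedding(128, dim)
        self.combine = nn.Linear(2 * dim, dim)

    def first_feature(self, chunk):                       # feature of the chunk alone
        ids = torch.tensor([min(ord(c), 127) for c in chunk])
        return self.emb(ids).mean(dim=0)

    def second_feature(self, first_feat, stored_feat):    # feature that also sees the external storage
        return self.combine(torch.cat([first_feat, stored_feat]))

encoder = ChunkEncoder()
chunks = ["the first short text.", "a second short text.", "and a third one."]   # split input text
storage = torch.zeros(64)                                 # external storage for the "third feature"

outputs = []
for i, chunk in enumerate(chunks):
    f1 = encoder.first_feature(chunk)
    f2 = encoder.second_feature(f1, storage)
    outputs.append(f2)
    storage = (storage * i + f1.detach()) / (i + 1)       # running mean as an assumed update rule
print(torch.stack(outputs).shape)
```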
  • Patent number: 11704485
    Abstract: An appropriate vector of any phrase can be generated. A lattice construction unit 212 constructs a lattice structure formed by links binding adjacent word or phrase candidates based on a morphological analysis result and a dependency analysis result of input text. A first learning unit 213 performs learning of a neural network A for estimating nearby word or phrase candidates from word or phrase candidates based on the lattice structure. A vector generation unit 214 acquires a vector of each of the word or phrase candidates from the neural network A and sets the vector as learning data. A second learning unit performs learning of a neural network B for vectorizing the word or phrase candidates based on the learning data.
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: July 18, 2023
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Itsumi Saito, Kyosuke Nishida, Hisako Asano, Junji Tomita
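
A toy sketch of the lattice idea in the patent above (11704485): word and phrase candidates become lattice nodes, adjacency edges come from analysis of the text, and a network (A) is trained to predict each candidate's neighbors so that its embedding rows become candidate vectors. The hand-made lattice and the skip-gram-style objective are illustrative; the second network (B) is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A tiny hand-made lattice: nodes are word/phrase candidates, edges link adjacent candidates.
CANDIDATES = ["new", "york", "new york", "city", "new york city", "is", "big"]
EDGES = [("new", "york"), ("new york", "city"), ("york", "city"),
         ("new york city", "is"), ("city", "is"), ("is", "big")]
idx = {c: i for i, c in enumerate(CANDIDATES)}

class NeighborPredictor(nn.Module):
    """Network A: given a candidate, predict its adjacent candidates in the lattice."""
    def __init__(self, n, dim=32):
        super().__init__()
        self.emb = nn.Embedding(n, dim)        # these rows become the candidate vectors
        self.out = nn.Linear(dim, n)

    def forward(self, ids):
        return self.out(self.emb(ids))

model = NeighborPredictor(len(CANDIDATES))
optim = torch.optim.Adam(model.parameters(), lr=0.05)

# Skip-gram-style training on lattice adjacency (both directions).
pairs = EDGES + [(b, a) for a, b in EDGES]
src = torch.tensor([idx[a] for a, _ in pairs])
tgt = torch.tensor([idx[b] for _, b in pairs])
for _ in range(100):
    loss = F.cross_entropy(model(src), tgt)
    optim.zero_grad()
    loss.backward()
    optim.step()

# The learned rows serve as word/phrase vectors; the patent then trains a second network B on them.
print(model.emb.weight[idx["new york"]].shape)
```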