Patents by Inventor Tomoya Iwakura

Tomoya Iwakura has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240037329
    Abstract: A non-transitory computer-readable recording medium stores a generation program for causing a computer to execute processing including: generating a feature vector of each of a plurality of words based on document data that includes the plurality of words; and generating a feature vector of a compound word obtained by combining two or more words based on the generated feature vector of each of the plurality of words. The feature vector of each of the plurality of words and the feature vector of the compound word are used to predict a word that follows one word in the document data.
    Type: Application
    Filed: May 25, 2023
    Publication date: February 1, 2024
    Applicant: Fujitsu Limited
    Inventors: Taiki WATANABE, Tomoya IWAKURA
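The combination step described in the abstract above can be sketched as follows. Averaging the constituent word vectors is one simple way to form a compound-word vector; the abstract does not fix a specific combination formula, so the averaging and the toy three-dimensional vectors here are illustrative only.

```python
# Illustrative sketch: derive a feature vector for a compound word
# ("machine learning") from the vectors of its constituent words.

def combine_vectors(vectors):
    """Element-wise average of the constituent word vectors."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

# Hypothetical word vectors learned from document data.
word_vecs = {
    "machine": [0.2, 0.8, 0.1],
    "learning": [0.6, 0.4, 0.3],
}

compound_vec = combine_vectors([word_vecs["machine"], word_vecs["learning"]])
print([round(v, 2) for v in compound_vec])  # [0.4, 0.6, 0.2]
```

Both the per-word vectors and the derived compound vector could then feed a next-word prediction model, as the abstract describes.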
  • Publication number: 20230281331
    Abstract: A non-transitory computer-readable recording medium stores a program for causing a computer to execute a process, the process includes extracting a plurality of candidates for an entity in a knowledge graph based on a word in text, collecting related images related to the extracted candidates, generating image clusters of the collected related images for the respective candidates, calculating degrees of similarity between the generated image clusters, and determining, as the entity, the candidate whose image cluster indicates the highest degree of similarity among the candidates.
    Type: Application
    Filed: December 15, 2022
    Publication date: September 7, 2023
    Applicant: Fujitsu Limited
    Inventors: Chunpeng Ma, Tomoya Iwakura, Yuzi Kanazawa, Tetsuro Takahashi
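The disambiguation step above can be sketched with cluster centroids and cosine similarity. The image-feature vectors, the "Jaguar" candidates, and the centroid-plus-cosine comparison are illustrative assumptions; the patent does not commit to a particular similarity measure.

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def centroid(vectors):
    """Mean vector of an image cluster."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

# Hypothetical image-feature clusters for two candidate entities, plus a
# cluster of images collected for the mention's context.
candidates = {
    "Jaguar (animal)": [[0.9, 0.1], [0.8, 0.2]],
    "Jaguar (car)":    [[0.1, 0.9], [0.2, 0.8]],
}
context_cluster = [[0.85, 0.15], [0.9, 0.1]]

ctx = centroid(context_cluster)
best = max(candidates, key=lambda name: cosine(centroid(candidates[name]), ctx))
print(best)  # Jaguar (animal)
```

The candidate whose image cluster is most similar to the context cluster is selected as the entity, mirroring the final determining step in the abstract.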
  • Patent number: 11669695
    Abstract: A translation method, implemented by a computer, includes: converting a text written in a first language into a replacement text in which a named entity in the text is replaced with a predetermined character string; translating the replacement text into a second language by using a text translation model which is a neural network; and translating a named entity corresponding to the predetermined character string in the replacement text into the second language by using a named entity translation model which is a neural network.
    Type: Grant
    Filed: March 17, 2020
    Date of Patent: June 6, 2023
    Assignee: FUJITSU LIMITED
    Inventors: Akiba Miura, Tomoya Iwakura
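The placeholder conversion described in the abstract can be sketched as below. The entity list, the `_NE_` token, and the simple string replacement are illustrative; in the patented method the masked text and the extracted entities are each translated by their own neural model.

```python
# Sketch of the first step: replace named entities with a fixed
# placeholder before the text goes through the main translation model.

def mask_entities(text, entities, placeholder="_NE_"):
    """Replace each listed entity with a placeholder; return masked text
    and the entities in the order they were replaced."""
    mapping = []
    for ent in entities:
        if ent in text:
            text = text.replace(ent, placeholder, 1)
            mapping.append(ent)
    return text, mapping

masked, found = mask_entities(
    "Tomoya Iwakura works at Fujitsu.", ["Tomoya Iwakura", "Fujitsu"]
)
print(masked)  # _NE_ works at _NE_.
print(found)   # ['Tomoya Iwakura', 'Fujitsu']
```

After translation, each placeholder would be filled with the output of the separate named-entity translation model.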
  • Publication number: 20220180198
    Abstract: A training method for a computer to execute a process includes acquiring a model that includes an input layer and an intermediate layer, in which the intermediate layer is coupled to a first output layer and a second output layer; training the first output layer, the intermediate layer, and the input layer based on an output result from the first output layer when first training data is input into the input layer; and training the second output layer, the intermediate layer, and the input layer based on an output result from the second output layer when second training data is input into the input layer.
    Type: Application
    Filed: February 24, 2022
    Publication date: June 9, 2022
    Applicant: FUJITSU LIMITED
    Inventors: Akiba Miura, Tomoya Iwakura
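The shared-trunk architecture in this abstract (and in the companion application 20220180197 below) can be sketched as a toy model: one shared weight plays the role of the input and intermediate layers, and two scalar "heads" play the role of the two output layers. Each training example updates its own head together with the shared weight. The one-dimensional linear layers and learning rate are illustrative, not the patent's.

```python
class SharedModel:
    def __init__(self):
        self.w_shared = 0.5                    # input -> intermediate weight
        self.w_head = {"a": 0.1, "b": -0.1}    # intermediate -> output weights

    def forward(self, x, head):
        h = self.w_shared * x                  # shared intermediate value
        return self.w_head[head] * h

    def train_step(self, x, y, head, lr=0.1):
        """Squared-error gradient step on one head plus the shared layer."""
        h = self.w_shared * x
        err = self.w_head[head] * h - y
        g_head = err * h                         # gradient for the chosen head
        g_shared = err * self.w_head[head] * x   # gradient for the shared layer
        self.w_head[head] -= lr * g_head
        self.w_shared -= lr * g_shared

m = SharedModel()
m.train_step(1.0, 1.0, "a")    # first training data updates head "a"
m.train_step(1.0, -1.0, "b")   # second training data updates head "b"
```

Note that both steps move `w_shared`, so the intermediate representation is shaped by both tasks, which is what makes attaching a new output layer (as in 20220180197) useful.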
  • Publication number: 20220180197
    Abstract: A training method for a computer to execute a process includes acquiring a trained model that is trained by using training data that belongs to a first field, and that includes an input layer and an intermediate layer that is coupled to each of a plurality of output layers; generating an objective model in which a new output layer is coupled to the intermediate layer; and training the objective model by using training data that belongs to a second field.
    Type: Application
    Filed: February 23, 2022
    Publication date: June 9, 2022
    Applicant: FUJITSU LIMITED
    Inventors: Akiba Miura, Tomoya Iwakura
  • Publication number: 20220171926
    Abstract: An information processing method for a computer to execute a process includes extracting, from a first document, a word not included in a second document; registering the word in a first dictionary; acquiring an intermediate representation vector by inputting a word included in the second document to a recursion-type encoder; acquiring a first probability distribution based on a result of inputting the intermediate representation vector to a recursion-type decoder that calculates a probability distribution of each word registered in the first dictionary; acquiring a second probability distribution of a second dictionary of a word included in the second document based on a hidden state vector calculated by inputting each word included in the second document to the recursion-type encoder and a hidden state vector output from the recursion-type decoder; and generating a word included in the first document based on the first probability distribution and the second probability distribution.
    Type: Application
    Filed: February 14, 2022
    Publication date: June 2, 2022
    Applicant: FUJITSU LIMITED
    Inventors: Tomoya Iwakura, Takuya Makino
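The final generation step above combines two distributions: one over the extended dictionary (decoder output) and one over the source-document words (a copy distribution). A gated mixture, as sketched here, is one common way to combine them; the gate value and the toy distributions are illustrative assumptions, not the patent's formula.

```python
# Sketch: mix a vocabulary distribution and a copy distribution, then
# pick the highest-probability word as the next generated word.

def mix_distributions(p_vocab, p_copy, gate=0.7):
    """Weighted mixture of two word distributions; `gate` weights p_vocab."""
    words = set(p_vocab) | set(p_copy)
    return {w: gate * p_vocab.get(w, 0.0) + (1 - gate) * p_copy.get(w, 0.0)
            for w in words}

p_vocab = {"the": 0.6, "report": 0.4}        # from the decoder
p_copy = {"Iwakura": 0.9, "report": 0.1}     # from source-word attention

mixed = mix_distributions(p_vocab, p_copy)
best = max(mixed, key=mixed.get)
print(best)  # the
```

The mixture lets the generator emit out-of-dictionary source words (here, "Iwakura") when the copy distribution dominates.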
  • Publication number: 20220171928
    Abstract: An information processing method in which a computer executes processing includes: acquiring a sentence; specifying a word that appears immediately before or immediately after a first word in the acquired sentence by using a prediction model that predicts a word that appears immediately before or immediately after an input word; determining whether or not an estimated relationship between the first word and a second word in the sentence is appropriate on the basis of the specified word and a rule regarding a unit that corresponds to a relationship between words stored in a storage; and outputting information regarding the estimated relationship between the first word and the second word in a case where it is determined that the relationship is appropriate.
    Type: Application
    Filed: February 15, 2022
    Publication date: June 2, 2022
    Applicant: FUJITSU LIMITED
    Inventors: Tomoya Iwakura, Taiki Watanabe
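The validation step above can be sketched as follows: a prediction model proposes words adjacent to the first word, and a stored unit rule checks whether the proposed neighbors are consistent with the estimated relation. The toy neighbor model, the `dosage_of` relation, and the rule table are all illustrative assumptions.

```python
# Sketch: validate an estimated word relation using predicted neighbor
# words and a unit rule associated with that relation.

neighbor_model = {"dose": ["mg", "ml"]}    # word -> words likely adjacent to it
unit_rules = {"dosage_of": {"mg", "ml"}}   # relation -> accepted unit words

def relation_is_appropriate(first_word, relation):
    """True if any predicted neighbor matches the relation's unit rule."""
    predicted = neighbor_model.get(first_word, [])
    return any(w in unit_rules.get(relation, set()) for w in predicted)

print(relation_is_appropriate("dose", "dosage_of"))  # True
```

Only relations that pass this check would be output, matching the final step in the abstract.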
  • Patent number: 11144729
    Abstract: A computer-implemented summary generation method includes obtaining input text; generating an initial lattice including serially coupled nodes corresponding to words within the input text; generating a node corresponding to an expression within the initial lattice; adding the generated node to the initial lattice to provide an extended lattice corresponding to the input text; calculating a generation probability of each word within the input text using a dictionary and a machine learning model (model); calculating a generation probability for each node included in the extended lattice based on a hidden state output by a cell corresponding to the node among cells in an encoder of the model and a hidden state updated by a cell in a decoder of the model; and generating an element of a summary of the input text based on the generation probability of each word and the generation probability of each node of the extended lattice.
    Type: Grant
    Filed: November 25, 2019
    Date of Patent: October 12, 2021
    Assignee: FUJITSU LIMITED
    Inventors: Tomoya Iwakura, Takuya Makino
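The lattice-extension step in this abstract (also published as application 20200175229 below) can be sketched with a simple edge-list representation: the initial lattice is a chain of word nodes, and an extra node spanning a multi-word expression is added as an alternative path. The edge-list encoding is our illustrative choice, not the patent's data structure.

```python
# Sketch: build a word lattice for the input text and extend it with a
# node covering a multi-word expression.

def build_extended_lattice(words, expression_span):
    """Return edges (start, end, label); positions are gaps between words."""
    i, j = expression_span                       # expression covers words[i:j]
    edges = [(k, k + 1, words[k]) for k in range(len(words))]
    edges.append((i, j, " ".join(words[i:j])))   # alternative multi-word node
    return edges

lattice = build_extended_lattice(["machine", "learning", "model"], (0, 2))
print(lattice)
# [(0, 1, 'machine'), (1, 2, 'learning'), (2, 3, 'model'),
#  (0, 2, 'machine learning')]
```

Generation probabilities would then be computed per node of this extended lattice, so the summarizer can emit either single words or the whole expression.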
  • Patent number: 10936948
    Abstract: An apparatus acquires learning-data, including feature-elements, to which a label is assigned. The apparatus generates a first-set of expanded feature-elements by expanding the feature-elements. With reference to a model where a confidence value is stored in association with each of a second-set of expanded feature-elements, the apparatus updates confidence values associated with expanded feature-elements common between the first- and second-sets of expanded feature-elements, based on the label.
    Type: Grant
    Filed: September 11, 2017
    Date of Patent: March 2, 2021
    Assignee: FUJITSU LIMITED
    Inventor: Tomoya Iwakura
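The expansion-and-update step in this abstract (also published as application 20180075351 below) can be sketched with feature conjunctions and an additive confidence update. The pairwise `&`-joined conjunctions and the simple `+label` update are illustrative stand-ins for the patent's expansion and update rules.

```python
import itertools

def expand(features):
    """Base features plus pairwise conjunctions ('expanded feature-elements')."""
    combos = itertools.combinations(sorted(features), 2)
    return set(features) | {"&".join(c) for c in combos}

def update_model(model, features, label, lr=1.0):
    """Nudge confidences of expanded features shared with the model."""
    expanded = expand(features)
    for f in expanded & set(model):
        model[f] += lr * label
    return model

model = {"a": 0.0, "a&b": 0.0, "c": 0.0}   # second set, with confidences
update_model(model, ["a", "b"], +1)        # learning data with label +1
print(model)  # {'a': 1.0, 'a&b': 1.0, 'c': 0.0}
```

Only the expanded features common to both sets ("a" and "a&b") are updated, as the abstract specifies; "c" is untouched.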
  • Publication number: 20200311352
    Abstract: A translation method, implemented by a computer, includes: converting a text written in a first language into a replacement text in which a named entity in the text is replaced with a predetermined character string; translating the replacement text into a second language by using a text translation model which is a neural network; and translating a named entity corresponding to the predetermined character string in the replacement text into the second language by using a named entity translation model which is a neural network.
    Type: Application
    Filed: March 17, 2020
    Publication date: October 1, 2020
    Applicant: FUJITSU LIMITED
    Inventors: Akiba Miura, Tomoya Iwakura
  • Publication number: 20200279159
    Abstract: A learning method to be executed by a computer, the learning method includes when a first input sentence in which a predetermined target is represented by a first named entity is input to a first machine learning model, learning a first parameter of the first machine learning model such that a value output from the first machine learning model approaches correct answer information corresponding to the first input sentence; and when an intermediate representation generated when the first input sentence is input to the first machine learning model and a second input sentence in which the predetermined target is represented by a second named entity are input to a second machine learning model, learning the first parameter and a second parameter of the second machine learning model such that a value output from the second machine learning model approaches correct answer information corresponding to the second input sentence.
    Type: Application
    Filed: February 26, 2020
    Publication date: September 3, 2020
    Applicant: FUJITSU LIMITED
    Inventor: Tomoya Iwakura
  • Publication number: 20200175229
    Abstract: A computer-implemented summary generation method includes obtaining input text; generating an initial lattice including serially coupled nodes corresponding to words within the input text; generating a node corresponding to an expression within the initial lattice; adding the generated node to the initial lattice to provide an extended lattice corresponding to the input text; calculating a generation probability of each word within the input text using a dictionary and a machine learning model (model); calculating a generation probability for each node included in the extended lattice based on a hidden state output by a cell corresponding to the node among cells in an encoder of the model and a hidden state updated by a cell in a decoder of the model; and generating an element of a summary of the input text based on the generation probability of each word and the generation probability of each node of the extended lattice.
    Type: Application
    Filed: November 25, 2019
    Publication date: June 4, 2020
    Applicant: FUJITSU LIMITED
    Inventors: Tomoya Iwakura, Takuya Makino
  • Publication number: 20180330279
    Abstract: A non-transitory computer-readable recording medium stores a learning program that causes a computer to execute a process including: acquiring learning data that is a learning object for a model in which data and confidence of the data are associated with each other; determining whether learning of the learning data is needed by comparing a predetermined condition with a decision result related to updating of the model accumulated for the learning data acquired at the acquiring; and excluding, from a learning object, the learning data of which learning is determined to be unneeded at the determining.
    Type: Application
    Filed: May 7, 2018
    Publication date: November 15, 2018
    Applicant: FUJITSU LIMITED
    Inventor: Tomoya IWAKURA
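The exclusion step above can be sketched as a filter over accumulated decision results. The idea of counting "idle rounds" (passes in which an example caused no model update) and the threshold of three rounds are illustrative stand-ins for the patent's predetermined condition.

```python
# Sketch: drop learning data whose accumulated update history suggests
# further learning on it is unneeded.

def filter_learning_data(examples, update_history, max_idle_rounds=3):
    """Keep examples that recently still caused model updates."""
    kept = []
    for ex in examples:
        idle = update_history.get(ex, 0)   # rounds with no model update
        if idle < max_idle_rounds:
            kept.append(ex)
    return kept

history = {"ex1": 0, "ex2": 5, "ex3": 2}   # accumulated decision results
kept = filter_learning_data(["ex1", "ex2", "ex3"], history)
print(kept)  # ['ex1', 'ex3']
```

Skipping such examples reduces training cost on later passes without touching the data that still drives updates.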
  • Publication number: 20180075351
    Abstract: An apparatus acquires learning-data, including feature-elements, to which a label is assigned. The apparatus generates a first-set of expanded feature-elements by expanding the feature-elements. With reference to a model where a confidence value is stored in association with each of a second-set of expanded feature-elements, the apparatus updates confidence values associated with expanded feature-elements common between the first- and second-sets of expanded feature-elements, based on the label.
    Type: Application
    Filed: September 11, 2017
    Publication date: March 15, 2018
    Applicant: FUJITSU LIMITED
    Inventor: Tomoya Iwakura
  • Publication number: 20170212896
    Abstract: A method for extracting character string candidate includes: receiving, by a computer, an input character or an input character string, and input identity information of an input source of the input character or the input character string; referencing a memory that stores character string candidates in association with pronunciation and identification information or the identification information; extracting, from among the character string candidates, a character string candidate that is associated with the input identity information and the pronunciation including the input character or the input character string, or a character string candidate that is associated with the input identity information and a character or a character string including the input character or the input character string; and outputting an extracted character string candidate as a selection candidate.
    Type: Application
    Filed: January 9, 2017
    Publication date: July 27, 2017
    Applicant: FUJITSU LIMITED
    Inventor: Tomoya Iwakura
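The lookup described above can be sketched with a small candidate store: each candidate carries a pronunciation (reading) and the identity of its input source, and a reading prefix plus a source identity selects matching candidates. The record layout and the example entries are illustrative.

```python
# Sketch: extract character-string candidates matching both the input
# prefix and the identity information of the input source.

candidates = [
    {"text": "Fujitsu Limited", "reading": "fujitsu", "source": "work_pc"},
    {"text": "Fuji apples",     "reading": "fuji",    "source": "home_pc"},
]

def extract_candidates(prefix, source_id):
    """Candidates whose source matches and whose reading starts with prefix."""
    return [c["text"] for c in candidates
            if c["source"] == source_id and c["reading"].startswith(prefix)]

print(extract_candidates("fu", "work_pc"))  # ['Fujitsu Limited']
```

The same prefix yields different selection candidates per source, which is the point of keying the store on identity information.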
  • Publication number: 20160246775
    Abstract: A learning apparatus includes a memory and a processor to generate, based on a first example sentence containing a target word having a plurality of meanings belonging to different types, a first rule containing a first meaning of the target word in the first example sentence, and another word providing a clue for determining the first meaning, acquire a second example sentence, determine a second meaning of the target word in the second example sentence based on a word contained in the second example sentence and the first rule, generate a second rule pertaining to a correlation between the second meaning and the type, acquire a third example sentence, determine the third meaning of the target word in the third example sentence, and learn a third rule for determining a type of the target word based on the second rule, the third meaning, and the third example sentence.
    Type: Application
    Filed: January 20, 2016
    Publication date: August 25, 2016
    Applicant: FUJITSU LIMITED
    Inventor: Tomoya IWAKURA
  • Patent number: 9348810
    Abstract: A present method includes first updating, based on a weight of each training sample (TS), a first score for each of features, which is a cue when extracting a correct structure from each TS, to calculate a model defined by first scores; performing, for each TS, a processing including identifying a maximum score among second scores, each of which is assigned, by the model, to either of candidate structures other than the correct structure among candidate structures derived from the TS; and first calculating a difference between the identified maximum score and a second score assigned by the model to the correct structure; and second calculating a confidence degree based on an upper limit value of errors, which is defined by the differences; second updating the weight of each TS based on the confidence degree and the differences; and repeating the first updating, performing, second calculating and second updating.
    Type: Grant
    Filed: June 24, 2014
    Date of Patent: May 24, 2016
    Assignee: FUJITSU LIMITED
    Inventor: Tomoya Iwakura
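The sample-reweighting step in this abstract (also published as application 20150006151 below) can be sketched with an exponential update: a sample's weight grows when the best wrong candidate structure scores close to, or above, the correct one. The exponential form with a confidence factor follows the general boosting pattern; the patent's exact formulas may differ.

```python
import math

def reweight(samples, confidence):
    """Boosting-style update: upweight samples with small or negative margin."""
    total = 0.0
    for s in samples:
        margin = s["correct_score"] - s["max_wrong_score"]
        s["weight"] *= math.exp(-confidence * margin)
        total += s["weight"]
    for s in samples:                      # renormalize weights to sum to 1
        s["weight"] /= total
    return samples

samples = [
    {"weight": 0.5, "correct_score": 2.0, "max_wrong_score": 1.0},  # easy
    {"weight": 0.5, "correct_score": 0.5, "max_wrong_score": 1.5},  # hard
]
reweight(samples, confidence=1.0)
print(samples[1]["weight"] > samples[0]["weight"])  # True
```

The hard sample (negative margin) ends up with more weight, so the next round's feature scores concentrate on it.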
  • Publication number: 20150006151
    Abstract: A present method includes first updating, based on a weight of each training sample (TS), a first score for each of features, which is a cue when extracting a correct structure from each TS, to calculate a model defined by first scores; performing, for each TS, a processing including identifying a maximum score among second scores, each of which is assigned, by the model, to either of candidate structures other than the correct structure among candidate structures derived from the TS; and first calculating a difference between the identified maximum score and a second score assigned by the model to the correct structure; and second calculating a confidence degree based on an upper limit value of errors, which is defined by the differences; second updating the weight of each TS based on the confidence degree and the differences; and repeating the first updating, performing, second calculating and second updating.
    Type: Application
    Filed: June 24, 2014
    Publication date: January 1, 2015
    Applicant: FUJITSU LIMITED
    Inventor: Tomoya IWAKURA
  • Patent number: 8370276
    Abstract: A rule learning method in machine learning includes distributing features to a given number of buckets based on a weight of the features which are correlated with a training example; specifying a feature with a maximum gain value as a rule based on a weight of the training example from each of the buckets; calculating a confidence value of the specified rule based on the weight of the training example; storing the specified rule and the confidence value in a rule data storage unit; updating the weights of the training examples based on the specified rule, the confidence value of the specified rule, data of the training example, and the weight of the training example; and repeating the distributing, the specifying, the calculating, the storing, and the updating, when the rule and the confidence value are to be further generated.
    Type: Grant
    Filed: July 22, 2009
    Date of Patent: February 5, 2013
    Assignee: Fujitsu Limited
    Inventors: Tomoya Iwakura, Seishi Okamoto
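One round of the bucketed rule learning above can be sketched as follows: the feature with the best gain is picked from a bucket, and a confidence value is computed from the weighted positive/negative counts. The gain criterion and the log-odds confidence are standard boosting formulas used here for illustration; the patent's definitions may differ.

```python
import math

def rule_confidence(w_pos, w_neg, eps=1e-9):
    """Log-odds confidence from weighted positive/negative example mass."""
    return 0.5 * math.log((w_pos + eps) / (w_neg + eps))

def best_feature(bucket, weighted_counts):
    """Pick the bucket's max-gain feature; gain ~ |sqrt(W+) - sqrt(W-)|."""
    def gain(f):
        wp, wn = weighted_counts[f]
        return abs(math.sqrt(wp) - math.sqrt(wn))
    return max(bucket, key=gain)

# Hypothetical weighted (positive, negative) mass per feature in one bucket.
counts = {"f1": (0.9, 0.1), "f2": (0.5, 0.5)}
f = best_feature(["f1", "f2"], counts)
print(f)  # f1
```

The selected rule and its confidence would be stored, example weights updated, and the loop repeated, as the abstract's final steps describe.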
  • Patent number: 8296249
    Abstract: A rule learning method for making a computer perform rule learning processing in machine learning includes firstly calculating an evaluation value of respective features in a training example by using data and weights of the training examples; selecting a given number of features in descending order of the evaluation values; secondly calculating a confidence value for one of the given number of selected features; updating the weights of training example, by using the data and weights of the training examples, and the confidence value corresponding to the one feature; firstly repeating the updating for the remaining features of the given number of features; and secondly repeating, for a given number of times, the firstly calculating, the selecting, the secondly calculating, the updating, and the firstly repeating.
    Type: Grant
    Filed: July 20, 2009
    Date of Patent: October 23, 2012
    Assignee: Fujitsu Limited
    Inventors: Tomoya Iwakura, Seishi Okamoto
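The selection step above can be sketched as a top-k sort over per-feature evaluation values computed from the weighted training examples. The evaluation values themselves are assumed given here; how they are computed from data and weights is the patent's first step.

```python
# Sketch: keep the given number of features with the highest evaluation
# values, in descending order, for the subsequent confidence computation.

def top_k_features(eval_values, k):
    """Feature names sorted by evaluation value, descending; first k kept."""
    return sorted(eval_values, key=eval_values.get, reverse=True)[:k]

evals = {"f1": 0.2, "f2": 0.9, "f3": 0.5}   # hypothetical evaluation values
print(top_k_features(evals, 2))  # ['f2', 'f3']
```

Each kept feature would then get a confidence value, and the example weights would be updated per feature before the whole cycle repeats.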