Patents by Inventor Tomoya Iwakura
Tomoya Iwakura has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240037329
Abstract: A non-transitory computer-readable recording medium stores a generation program for causing a computer to execute processing including: generating a feature vector of each of a plurality of words based on document data that includes the plurality of words; and generating a feature vector of a compound word obtained by combining two or more words based on the generated feature vector of each of the plurality of words. The feature vector of each of the plurality of words and the feature vector of the compound word are used to predict a word that follows one word in the document data.
Type: Application
Filed: May 25, 2023
Publication date: February 1, 2024
Applicant: Fujitsu Limited
Inventors: Taiki Watanabe, Tomoya Iwakura
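The combination step claimed above can be sketched as follows. Averaging the component-word vectors is an assumption for illustration; the filing only states that the compound-word vector is generated from the word vectors, without fixing the combination function.

```python
import numpy as np

def compound_vector(word_vectors, words):
    # Average the component-word vectors to form the compound-word vector.
    # (Averaging is an illustrative choice, not specified in the filing.)
    return np.mean([word_vectors[w] for w in words], axis=0)

# Toy 3-dimensional vectors for two component words.
word_vectors = {
    "machine":  np.array([1.0, 0.0, 2.0]),
    "learning": np.array([3.0, 4.0, 0.0]),
}
vec = compound_vector(word_vectors, ["machine", "learning"])
```

The resulting compound vector can then be added to the vocabulary used by the next-word prediction model, alongside the single-word vectors.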
-
Publication number: 20230281331
Abstract: A non-transitory computer-readable recording medium stores a program for causing a computer to execute a process, the process includes extracting a plurality of candidates for an entity in a knowledge graph based on a word in text, collecting related images related to the extracted candidates, generating image clusters of the collected related images for the respective candidates, calculating degrees of similarity between the generated image clusters, and determining, as the entity, a candidate of which the image cluster indicates a higher degree of similarity among the candidates.
Type: Application
Filed: December 15, 2022
Publication date: September 7, 2023
Applicant: Fujitsu Limited
Inventors: Chunpeng Ma, Tomoya Iwakura, Yuzi Kanazawa, Tetsuro Takahashi
-
Patent number: 11669695
Abstract: A translation method, implemented by a computer, includes: converting a text written in a first language into a replacement text in which a named entity in the text is replaced with a predetermined character string; translating the replacement text into a second language by using a text translation model which is a neural network; and translating a named entity corresponding to the predetermined character string in the replacement text into the second language by using a named entity translation model which is a neural network.
Type: Grant
Filed: March 17, 2020
Date of Patent: June 6, 2023
Assignee: Fujitsu Limited
Inventors: Akiba Miura, Tomoya Iwakura
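The two-model pipeline claimed here can be sketched with stub translators. The "__NE__" placeholder string, the helper names, and the assumption that entities are listed in order of appearance are all illustrative; the patent fixes none of them.

```python
def translate_with_placeholders(text, entities, translate_text, translate_entity):
    # Mask each named entity with a placeholder, translate the masked
    # text with the text model, then translate each entity with the
    # entity model and substitute it back in order of appearance.
    masked = text
    for e in entities:
        masked = masked.replace(e, "__NE__")
    translated = translate_text(masked)          # text translation model
    for e in entities:
        translated = translated.replace("__NE__", translate_entity(e), 1)
    return translated

# Toy dictionary stand-ins for the two neural models.
text_model = {"__NE__ lives in __NE__ .": "__NE__ habite à __NE__ ."}
entity_model = {"Alice": "Alice", "Kyoto": "Kyoto"}
out = translate_with_placeholders(
    "Alice lives in Kyoto .", ["Alice", "Kyoto"],
    lambda t: text_model[t], lambda e: entity_model[e])
```

Masking keeps rare entity strings out of the text model's vocabulary, which is the apparent motivation for splitting the work across two networks.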
-
Publication number: 20220180198
Abstract: A training method for a computer to execute a process includes acquiring a model that includes an input layer and an intermediate layer, in which the intermediate layer is coupled to a first output layer and a second output layer; training the first output layer, the intermediate layer, and the input layer based on an output result from the first output layer when first training data is input into the input layer; and training the second output layer, the intermediate layer, and the input layer based on an output result from the second output layer when second training data is input into the input layer.
Type: Application
Filed: February 24, 2022
Publication date: June 9, 2022
Applicant: Fujitsu Limited
Inventors: Akiba Miura, Tomoya Iwakura
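The claimed architecture is a shared input and intermediate layer with two task-specific output layers. A minimal forward-pass sketch, with all dimensions and the tanh activation chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared input and intermediate layers, plus two output layers ("heads")
# attached to the same intermediate layer. Dimensions are illustrative.
W_in   = rng.normal(size=(8, 4))   # input layer
W_mid  = rng.normal(size=(4, 4))   # intermediate layer, shared
W_out1 = rng.normal(size=(4, 3))   # first output layer (task 1)
W_out2 = rng.normal(size=(4, 5))   # second output layer (task 2)

def forward(x, head):
    # Run the shared layers, then the selected head. During training,
    # gradients would flow through the chosen head plus the shared
    # layers, depending on which task's data was fed, as the claim states.
    h = np.tanh(np.tanh(x @ W_in) @ W_mid)
    return h @ (W_out1 if head == 1 else W_out2)

y1 = forward(rng.normal(size=8), head=1)
y2 = forward(rng.normal(size=8), head=2)
```

Alternating updates of this kind let the two tasks regularize the shared intermediate representation.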
-
Publication number: 20220180197
Abstract: A training method for a computer to execute a process includes acquiring a trained model that is trained by using training data that belongs to a first field, and that includes an input layer and an intermediate layer that is coupled to each of a plurality of output layers; generating an objective model in which a new output layer is coupled to the intermediate layer; and training the objective model by using training data that belongs to a second field.
Type: Application
Filed: February 23, 2022
Publication date: June 9, 2022
Applicant: Fujitsu Limited
Inventors: Akiba Miura, Tomoya Iwakura
-
Publication number: 20220171926
Abstract: An information processing method for a computer to execute a process includes extracting, from a first document, a word not included in a second document; registering the word in a first dictionary; acquiring an intermediate representation vector by inputting a word included in the second document to a recursion-type encoder; acquiring a first probability distribution based on a result of inputting the intermediate representation vector to a recursion-type decoder that calculates a probability distribution of each word registered in the first dictionary; acquiring a second probability distribution of a second dictionary of a word included in the second document based on a hidden state vector calculated by inputting each word included in the second document to the recursion-type encoder and a hidden state vector output from the recursion-type decoder; and generating a word included in the first document based on the first probability distribution and the second probability distribution.
Type: Application
Filed: February 14, 2022
Publication date: June 2, 2022
Applicant: Fujitsu Limited
Inventors: Tomoya Iwakura, Takuya Makino
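The final generation step combines two probability distributions, one over the first dictionary and one over words of the second document. A sketch of that combination; the fixed interpolation weight `mix` is an assumption, since the filing only states that both distributions are used:

```python
def next_word(vocab_dist, copy_dist, mix=0.5):
    # Interpolate the decoder's distribution over the first dictionary
    # with the distribution over source-document words, then emit the
    # highest-probability word. (`mix` is illustrative, not claimed.)
    merged = {}
    for w, p in vocab_dist.items():
        merged[w] = merged.get(w, 0.0) + (1 - mix) * p
    for w, p in copy_dist.items():
        merged[w] = merged.get(w, 0.0) + mix * p
    return max(merged, key=merged.get)

# "new" appears in both distributions, so its mass accumulates.
word = next_word({"novel": 0.6, "new": 0.4}, {"new": 0.9, "method": 0.1})
```

This resembles copy-mechanism decoders, where the source-side distribution lets the model emit words missing from the output vocabulary.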
-
Publication number: 20220171928
Abstract: An information processing method in which a computer executes processing includes: acquiring a sentence; specifying a word that appears immediately before or immediately after a first word in the acquired sentence by using a prediction model that predicts a word that appears immediately before or immediately after an input word; determining whether or not an estimated relationship between the first word and a second word in the sentence is appropriate on the basis of the specified word and a rule regarding a unit that corresponds to a relationship between words stored in a storage; and outputting information regarding the estimated relationship between the first word and the second word in a case where it is determined that the relationship is appropriate.
Type: Application
Filed: February 15, 2022
Publication date: June 2, 2022
Applicant: Fujitsu Limited
Inventors: Tomoya Iwakura, Taiki Watanabe
-
Patent number: 11144729
Abstract: A computer-implemented summary generation method includes obtaining input text; generating an initial lattice including serially coupled nodes corresponding to words within the input text; generating a node corresponding to an expression within the initial lattice; adding the generated node to the initial lattice to provide an extended lattice corresponding to the input text; calculating a generation probability of each word within the input text using a dictionary and a machine learning model (model); calculating a generation probability for each node included in the extended lattice based on a hidden state output by a cell corresponding to the node among cells in an encoder of the model and a hidden state updated by a cell in a decoder of the model; and generating an element of a summary of the input text based on the generation probability of each word and the generation probability of each node of the extended lattice.
Type: Grant
Filed: November 25, 2019
Date of Patent: October 12, 2021
Assignee: Fujitsu Limited
Inventors: Tomoya Iwakura, Takuya Makino
-
Patent number: 10936948
Abstract: An apparatus acquires learning-data, including feature-elements, to which a label is assigned. The apparatus generates a first-set of expanded feature-elements by expanding the feature-elements. With reference to a model where a confidence value is stored in association with each of a second-set of expanded feature-elements, the apparatus updates confidence values associated with expanded feature-elements common between the first- and second-sets of expanded feature-elements, based on the label.
Type: Grant
Filed: September 11, 2017
Date of Patent: March 2, 2021
Assignee: Fujitsu Limited
Inventor: Tomoya Iwakura
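A minimal sketch of this update. Expanding base features into pairwise conjunctions, the ±1 label encoding, and the additive confidence update are all assumptions; the patent does not fix the expansion function or the update rule.

```python
from itertools import combinations

def expand(features):
    # Expand base features into the base set plus all pairwise
    # conjunctions. (One plausible expansion; not specified in the patent.)
    return set(features) | {"&".join(p) for p in combinations(sorted(features), 2)}

def update(model, features, label, lr=1.0):
    # Update confidence values only for expanded features shared between
    # this example and the model, nudging them toward the label (+1/-1).
    for f in expand(features) & set(model):
        model[f] += lr * label
    return model

model = {"w=bank": 0.0, "w=bank&w=river": 0.0, "w=loan": 2.0}
update(model, ["w=bank", "w=river"], label=1)
```

Only the two features shared with the model move; `w=loan`, absent from the example's expanded set, is untouched.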
-
Publication number: 20200311352
Abstract: A translation method, implemented by a computer, includes: converting a text written in a first language into a replacement text in which a named entity in the text is replaced with a predetermined character string; translating the replacement text into a second language by using a text translation model which is a neural network; and translating a named entity corresponding to the predetermined character string in the replacement text into the second language by using a named entity translation model which is a neural network.
Type: Application
Filed: March 17, 2020
Publication date: October 1, 2020
Applicant: Fujitsu Limited
Inventors: Akiba Miura, Tomoya Iwakura
-
Publication number: 20200279159
Abstract: A learning method to be executed by a computer, the learning method includes: when a first input sentence in which a predetermined target is represented by a first named entity is input to a first machine learning model, learning a first parameter of the first machine learning model such that a value output from the first machine learning model approaches correct answer information corresponding to the first input sentence; and when an intermediate representation generated when the first input sentence is input to the first machine learning model and a second input sentence in which the predetermined target is represented by a second named entity are input to a second machine learning model, learning the first parameter and a second parameter of the second machine learning model such that a value output from the second machine learning model approaches correct answer information corresponding to the second input sentence.
Type: Application
Filed: February 26, 2020
Publication date: September 3, 2020
Applicant: Fujitsu Limited
Inventor: Tomoya Iwakura
-
Publication number: 20200175229
Abstract: A computer-implemented summary generation method includes obtaining input text; generating an initial lattice including serially coupled nodes corresponding to words within the input text; generating a node corresponding to an expression within the initial lattice; adding the generated node to the initial lattice to provide an extended lattice corresponding to the input text; calculating a generation probability of each word within the input text using a dictionary and a machine learning model (model); calculating a generation probability for each node included in the extended lattice based on a hidden state output by a cell corresponding to the node among cells in an encoder of the model and a hidden state updated by a cell in a decoder of the model; and generating an element of a summary of the input text based on the generation probability of each word and the generation probability of each node of the extended lattice.
Type: Application
Filed: November 25, 2019
Publication date: June 4, 2020
Applicant: Fujitsu Limited
Inventors: Tomoya Iwakura, Takuya Makino
-
Publication number: 20180330279
Abstract: A non-transitory computer-readable recording medium stores a learning program that causes a computer to execute a process including: acquiring learning data that is a learning object for a model in which data and confidence of the data are associated with each other; determining whether learning of the learning data is needed by comparing a predetermined condition with a decision result related to updating of the model accumulated for the learning data acquired at the acquiring; and excluding, from a learning object, the learning data of which learning is determined to be unneeded at the determining.
Type: Application
Filed: May 7, 2018
Publication date: November 15, 2018
Applicant: Fujitsu Limited
Inventor: Tomoya Iwakura
-
Publication number: 20180075351
Abstract: An apparatus acquires learning-data, including feature-elements, to which a label is assigned. The apparatus generates a first-set of expanded feature-elements by expanding the feature-elements. With reference to a model where a confidence value is stored in association with each of a second-set of expanded feature-elements, the apparatus updates confidence values associated with expanded feature-elements common between the first- and second-sets of expanded feature-elements, based on the label.
Type: Application
Filed: September 11, 2017
Publication date: March 15, 2018
Applicant: Fujitsu Limited
Inventor: Tomoya Iwakura
-
Publication number: 20170212896
Abstract: A method for extracting character string candidates includes: receiving, by a computer, an input character or an input character string, and input identity information of an input source of the input character or the input character string; referencing a memory that stores character string candidates in association with pronunciation and identification information or the identification information; extracting, from among the character string candidates, a character string candidate that is associated with the input identity information and the pronunciation including the input character or the input character string, or a character string candidate that is associated with the input identity information and a character or a character string including the input character or the input character string; and outputting an extracted character string candidate as a selection candidate.
Type: Application
Filed: January 9, 2017
Publication date: July 27, 2017
Applicant: Fujitsu Limited
Inventor: Tomoya Iwakura
-
Publication number: 20160246775
Abstract: A learning apparatus includes a memory and a processor to generate, based on a first example sentence containing a target word having a plurality of meanings belonging to different types, a first rule containing a first meaning of the target word in the first example sentence, and another word providing a clue for determining the first meaning, acquire a second example sentence, determine a second meaning of the target word in the second example sentence based on a word contained in the second example sentence and the first rule, generate a second rule pertaining to a correlation between the second meaning and the type, acquire a third example sentence, determine a third meaning of the target word in the third example sentence, and learn a third rule for determining a type of the target word based on the second rule, the third meaning, and the third example sentence.
Type: Application
Filed: January 20, 2016
Publication date: August 25, 2016
Applicant: Fujitsu Limited
Inventor: Tomoya Iwakura
-
Patent number: 9348810
Abstract: A present method includes first updating, based on a weight of each training sample (TS), a first score for each of features, which is a cue when extracting a correct structure from each TS, to calculate a model defined by first scores; performing, for each TS, a processing including identifying a maximum score among second scores, each of which is assigned, by the model, to one of the candidate structures other than the correct structure among the candidate structures derived from the TS, and first calculating a difference between the identified maximum score and a second score assigned by the model to the correct structure; second calculating a confidence degree based on an upper limit value of errors, which is defined by the differences; second updating the weight of each TS based on the confidence degree and the differences; and repeating the first updating, performing, second calculating and second updating.
Type: Grant
Filed: June 24, 2014
Date of Patent: May 24, 2016
Assignee: Fujitsu Limited
Inventor: Tomoya Iwakura
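The reweighting step can be sketched as follows. The exponential update and the step size `eta` are assumptions in the boosting style the claim describes; the patent defines the update through an upper-limit-of-errors confidence degree rather than this exact formula.

```python
import math

def reweight(samples, model_score, eta=1.0):
    # One pass: each training sample's weight grows with the margin
    # violation between its best incorrect candidate structure and its
    # correct structure. (Exponential update and `eta` are illustrative.)
    new_weights = []
    for correct, candidates, weight in samples:
        wrong_best = max(model_score(c) for c in candidates if c != correct)
        diff = wrong_best - model_score(correct)   # > 0 means a model error
        new_weights.append(weight * math.exp(eta * diff))
    total = sum(new_weights)
    return [w / total for w in new_weights]

# Toy scorer over three candidate structures.
score = lambda s: {"A": 2.0, "B": 1.0, "C": 3.0}[s]
samples = [("A", ["A", "B"], 0.5),   # model already prefers the correct "A"
           ("A", ["A", "C"], 0.5)]   # model wrongly prefers "C"
weights = reweight(samples, score)
```

Samples the model gets wrong receive more weight, focusing the next round of score updates on them.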
-
Publication number: 20150006151
Abstract: A present method includes first updating, based on a weight of each training sample (TS), a first score for each of features, which is a cue when extracting a correct structure from each TS, to calculate a model defined by first scores; performing, for each TS, a processing including identifying a maximum score among second scores, each of which is assigned, by the model, to one of the candidate structures other than the correct structure among the candidate structures derived from the TS, and first calculating a difference between the identified maximum score and a second score assigned by the model to the correct structure; second calculating a confidence degree based on an upper limit value of errors, which is defined by the differences; second updating the weight of each TS based on the confidence degree and the differences; and repeating the first updating, performing, second calculating and second updating.
Type: Application
Filed: June 24, 2014
Publication date: January 1, 2015
Applicant: Fujitsu Limited
Inventor: Tomoya Iwakura
-
Patent number: 8370276
Abstract: A rule learning method in machine learning includes distributing features to a given number of buckets based on a weight of the features which are correlated with a training example; specifying a feature with a maximum gain value as a rule based on a weight of the training example from each of the buckets; calculating a confidence value of the specified rule based on the weight of the training example; storing the specified rule and the confidence value in a rule data storage unit; updating the weights of the training examples based on the specified rule, the confidence value of the specified rule, data of the training example, and the weight of the training example; and repeating the distributing, the specifying, the calculating, the storing, and the updating, when the rule and the confidence value are to be further generated.
Type: Grant
Filed: July 22, 2009
Date of Patent: February 5, 2013
Assignee: Fujitsu Limited
Inventors: Tomoya Iwakura, Seishi Okamoto
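The first step, distributing features to buckets by weight, might look like the sketch below. The greedy load-balancing scheme is an assumption; the claim only requires that the distribution be based on the feature weights.

```python
def distribute(features, weights, n_buckets):
    # Greedily place features into buckets, always adding the
    # next-heaviest feature to the currently lightest bucket.
    # (A simple balancing heuristic, not the patented scheme itself.)
    buckets = [[] for _ in range(n_buckets)]
    loads = [0.0] * n_buckets
    for f in sorted(features, key=lambda f: -weights[f]):
        i = loads.index(min(loads))
        buckets[i].append(f)
        loads[i] += weights[f]
    return buckets

buckets = distribute(["a", "b", "c", "d"],
                     {"a": 4.0, "b": 3.0, "c": 2.0, "d": 1.0}, 2)
```

Bucketing lets each round scan only one bucket for the maximum-gain feature, trading exactness for speed in the boosting loop the abstract describes.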
-
Patent number: 8296249
Abstract: A rule learning method for making a computer perform rule learning processing in machine learning includes firstly calculating an evaluation value of respective features in a training example by using data and weights of the training examples; selecting a given number of features in descending order of the evaluation values; secondly calculating a confidence value for one of the given number of selected features; updating the weights of the training examples by using the data and weights of the training examples, and the confidence value corresponding to the one feature; firstly repeating the updating for the remaining features of the given number of features; and secondly repeating, for a given number of times, the firstly calculating, the selecting, the secondly calculating, the updating, and the firstly repeating.
Type: Grant
Filed: July 20, 2009
Date of Patent: October 23, 2012
Assignee: Fujitsu Limited
Inventors: Tomoya Iwakura, Seishi Okamoto