Patents by Inventor Suguru YASUTOMI

Suguru YASUTOMI has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240119739
    Abstract: A non-transitory computer-readable recording medium storing a machine learning program for causing a computer to execute a process, the process includes inputting moving image data that includes at least a first frame image and a second frame image to a first machine learning model trained by using training data, and training an encoder by detecting a first object and a second object from the first frame image and the second frame image, respectively, based on an inference result by the first machine learning model, determining identity between the first object and the second object that have been detected, and inputting, to the encoder, first data in a first image area that includes the first object and second data in a second image area that includes the second object, the first object and the second object having been determined to have the identity.
    Type: Application
    Filed: July 13, 2023
    Publication date: April 11, 2024
    Applicant: Fujitsu Limited
    Inventors: Suguru YASUTOMI, Masayuki HIROMOTO
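
The entry above describes self-supervised training of an encoder from crops of an object that a detector tracks across two video frames. The toy sketch below is illustrative only: the detector output format, the IoU-based identity test, the fixed crop size, and the cosine-similarity loss are all assumptions, not the claimed method.

```python
# Minimal sketch (not the patented method): train an encoder on pairs of crops of
# the *same* object detected in two video frames. Boxes are (x1, y1, x2, y2) ints.
import torch
import torch.nn as nn
import torch.nn.functional as F

def iou(a, b):
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = (a[2]-a[0])*(a[3]-a[1]) + (b[2]-b[0])*(b[3]-b[1]) - inter
    return inter / union if union > 0 else 0.0

encoder = nn.Sequential(nn.Conv2d(3, 16, 3, 2, 1), nn.ReLU(),
                        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 64))
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

def train_step(frame1, frame2, box1, box2):
    # Identity check between the two detections (assumed: IoU of their boxes).
    if iou(box1, box2) < 0.5:
        return None
    crop1 = frame1[:, :, box1[1]:box1[3], box1[0]:box1[2]]
    crop2 = frame2[:, :, box2[1]:box2[3], box2[0]:box2[2]]
    z1 = encoder(F.interpolate(crop1, size=(32, 32)))
    z2 = encoder(F.interpolate(crop2, size=(32, 32)))
    loss = 1 - F.cosine_similarity(z1, z2).mean()  # pull same-object crops together
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```
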
  • Publication number: 20230306306
    Abstract: A non-transitory computer-readable storage medium storing a machine learning program that causes at least one computer to execute a process, the process includes estimating a first label distribution of unlabeled training data based on a classification model and an initial value of a label distribution of a transfer target domain, the classification model being trained by using labeled training data which corresponds to a transfer source domain and unlabeled training data which corresponds to the transfer target domain; acquiring a second label distribution based on the labeled training data; acquiring a weight of each label included in the labeled training data and the unlabeled training data based on a difference between the first label distribution and the second label distribution; and re-training the classification model with the labeled training data and the unlabeled training data in which the weight of each label is reflected.
    Type: Application
    Filed: January 25, 2023
    Publication date: September 28, 2023
    Applicant: Fujitsu Limited
    Inventors: Takashi KATOH, Kento UEMURA, Suguru YASUTOMI
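
The abstract above amounts to label-shift reweighting: estimate the target-domain label distribution from the model's predictions on unlabeled data, compare it with the source-domain label distribution, and turn the difference into per-label weights for re-training. A minimal sketch, with the simple ratio weighting chosen purely for illustration:

```python
# Minimal sketch: per-label weights from the gap between an estimated target-domain
# label distribution and the source-domain label distribution.
import numpy as np

def estimate_target_distribution(probs):          # probs: (N, C) softmax outputs
    return probs.mean(axis=0)                     # "first label distribution"

def source_distribution(labels, num_classes):     # "second label distribution"
    return np.bincount(labels, minlength=num_classes) / len(labels)

def label_weights(target_dist, source_dist, eps=1e-8):
    return target_dist / (source_dist + eps)      # weight from the difference

# usage sketch with stand-in data
probs = np.random.dirichlet(np.ones(3), size=100)   # model outputs on unlabeled data
labels = np.random.randint(0, 3, size=500)          # labeled source data
w = label_weights(estimate_target_distribution(probs),
                  source_distribution(labels, 3))
# w[c] would then scale the loss of every example with label c during re-training.
```
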
  • Publication number: 20230289624
    Abstract: A non-transitory computer-readable storage medium storing an information processing program that causes at least one computer to execute a process, the process includes acquiring an update amount of a classification criterion of a classification model in retraining, the classification model being trained by using a first dataset, the classification model classifying input data into one of a plurality of classes, the retraining being performed by using a second dataset; and detecting the data with the largest change amount in the second dataset when changing each piece of data included in the second dataset so as to decrease the update amount.
    Type: Application
    Filed: December 27, 2022
    Publication date: September 14, 2023
    Applicant: Fujitsu Limited
    Inventors: Takashi KATOH, Kento UEMURA, Suguru YASUTOMI
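
One way to read the detection step above is as a gradient question: how much would each piece of the second dataset have to change in order to shrink the re-training update? The sketch below is a rough proxy under assumptions (a linear classifier head, and the last-layer gradient norm standing in for the update amount), not the claimed procedure.

```python
# Illustrative proxy: treat each sample's last-layer gradient norm as its
# contribution to the re-training update, differentiate that norm with respect
# to the input, and report the sample whose input would change the most.
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Linear(10, 3)             # stands in for a trained classifier head
x2 = torch.randn(32, 10)             # second dataset (inputs)
y2 = torch.randint(0, 3, (32,))

change_amounts = []
for i in range(len(x2)):
    xi = x2[i:i+1].detach().requires_grad_(True)
    loss = F.cross_entropy(model(xi), y2[i:i+1])
    g = torch.autograd.grad(loss, model.weight, create_graph=True)[0]
    update_amount = g.norm()                       # proxy for the update amount
    (d_input,) = torch.autograd.grad(update_amount, xi)
    change_amounts.append(d_input.norm().item())

suspect = int(torch.tensor(change_amounts).argmax())
print("sample with the largest change amount:", suspect)
```
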
  • Publication number: 20230289659
    Abstract: An information processing method comprising: for a classification model that classifies input data into one of plural classes and that was trained using a first data set, identifying, in a second data set that is different from the first data set, one or more items of data having a specific datum whose degree of contribution to a change in a classification criterion is greater than a predetermined threshold, the classification criterion being the classification criterion of the classification model during re-training based on the second data set; and, from among the one or more items of data, detecting an item of data for which the loss of the classification model is reduced by the change to the classification criterion caused by re-training based on the second data set, as an item of data of an unknown class not contained in the plural classes.
    Type: Application
    Filed: February 26, 2023
    Publication date: September 14, 2023
    Applicant: Fujitsu Limited
    Inventors: Takashi KATOH, Kento UEMURA, Suguru YASUTOMI
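
A rough, assumption-laden proxy for the two-step detection described above: restrict attention to second-dataset samples whose gradient on an assumed linear classifier head is large (high contribution to the criterion change), then flag those whose loss decreased after re-training as candidates for an unknown class. The gradient threshold and the simple before/after loss comparison are illustrative choices.

```python
# Illustrative proxy, not the claimed procedure. `model` is assumed to be an
# nn.Linear head so that model.weight can serve as the classification criterion.
import copy
import torch
import torch.nn.functional as F

def detect_unknown(model, x2, y2, steps=100, lr=1e-2, grad_quantile=0.8):
    before = copy.deepcopy(model)
    # contribution of each sample to the criterion change (per-sample grad norm)
    contrib = []
    for i in range(len(x2)):
        loss = F.cross_entropy(model(x2[i:i+1]), y2[i:i+1])
        (g,) = torch.autograd.grad(loss, model.weight)
        contrib.append(g.norm())
    contrib = torch.stack(contrib)
    # re-train on the second data set
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        loss = F.cross_entropy(model(x2), y2)
        opt.zero_grad(); loss.backward(); opt.step()
    with torch.no_grad():
        loss_before = F.cross_entropy(before(x2), y2, reduction="none")
        loss_after = F.cross_entropy(model(x2), y2, reduction="none")
    high_contrib = contrib >= contrib.quantile(grad_quantile)
    return torch.nonzero(high_contrib & (loss_after < loss_before)).flatten()
```
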
  • Publication number: 20230289406
    Abstract: A non-transitory computer-readable recording medium stores a determination program for causing a computer to execute processing including: re-training a classification model that has been trained by using a first data set and that classifies input data into any one of a plurality of classes, by using a loss calculable based on a second data set that is different from the first data set; and determining, in a case where a change in a classification standard of the classification model based on the loss is equal to or larger than a predetermined standard before and after the re-training, that unknown data that is not classified into any one of the plurality of classes is included in the second data set.
    Type: Application
    Filed: March 2, 2023
    Publication date: September 14, 2023
    Applicant: Fujitsu Limited
    Inventors: Takashi KATOH, Kento UEMURA, Suguru YASUTOMI
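
A minimal sketch of the dataset-level test described above, assuming the model's weights stand in for the "classification standard": re-train on the second data set and report unknown data when the weights move by more than a threshold. The threshold value, the training loop, and the weight-norm measure are illustrative.

```python
# Minimal sketch: flag the second data set when re-training moves the model's
# parameters more than a threshold. `loader` is assumed to yield (x, y) batches.
import copy
import torch
import torch.nn.functional as F

def contains_unknown(model, loader, threshold=1.0, steps=100, lr=1e-2):
    before = copy.deepcopy(model.state_dict())
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    it = iter(loader)
    for _ in range(steps):
        try:
            x, y = next(it)
        except StopIteration:
            it = iter(loader); x, y = next(it)
        loss = F.cross_entropy(model(x), y)
        opt.zero_grad(); loss.backward(); opt.step()
    change = sum((model.state_dict()[k] - before[k]).norm() for k in before)
    return change.item() > threshold
```
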
  • Publication number: 20230281845
    Abstract: An information processing device acquires output image data that is acquired by inputting image data indicating a pseudo-shadow area to an auto-encoder that is generated by machine learning using label image data contained in training data, the label image data indicating a shadow area in ultrasound image data of a captured target. The information processing device generates augmented data corresponding to the training data by combining the acquired output image data with the ultrasound image data.
    Type: Application
    Filed: November 30, 2022
    Publication date: September 7, 2023
    Applicant: Fujitsu Limited
    Inventors: Suguru YASUTOMI, Akira SAKAI, Takashi KATOH, Kento UEMURA
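
A minimal sketch of the augmentation idea above, under assumptions: an auto-encoder trained on shadow label images maps a hand-drawn pseudo-shadow mask to a shadow image, which is then combined with an ultrasound image to form augmented training data. The network shape and the multiplicative darkening formula are illustrative choices, not the patented combination rule.

```python
# Minimal sketch: pseudo-shadow mask -> auto-encoder -> shadow map -> combine
# with an ultrasound image to produce augmented data plus its shadow label.
import torch
import torch.nn as nn

autoencoder = nn.Sequential(                 # assumed trained on shadow label images
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
    nn.Conv2d(8, 1, 3, padding=1), nn.Sigmoid())

def augment(ultrasound, pseudo_shadow_mask):
    # ultrasound, pseudo_shadow_mask: (1, 1, H, W) tensors in [0, 1]
    shadow = autoencoder(pseudo_shadow_mask)         # "output image data"
    augmented = ultrasound * (1.0 - 0.8 * shadow)    # darken the shadowed area
    return augmented, shadow                         # (augmented image, shadow label)

us = torch.rand(1, 1, 64, 64)
mask = (torch.rand(1, 1, 64, 64) > 0.9).float()
aug, label = augment(us, mask)
```
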
  • Patent number: 11741363
    Abstract: A learning device executes learning of a discriminator that discriminates whether object data belongs to a known class included in training data or to an unknown class not included in the training data, using the training data. The learning device then generates a feature value of the unknown class from feature values that a plurality of layers of the discriminator compute for at least a part of the training data. The learning device then executes the learning of the discriminator so that a feature value of the known class and the generated feature value of the unknown class are separated.
    Type: Grant
    Filed: February 27, 2019
    Date of Patent: August 29, 2023
    Assignee: FUJITSU LIMITED
    Inventors: Takashi Katoh, Kento Uemura, Suguru Yasutomi
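
A toy open-set sketch in the spirit of the abstract above (not the patented construction): fabricate "unknown class" features by mixing intermediate features of different classes, then push known features away from the fabricated ones with a margin loss. The mixing rule, the margin, and the network sizes are assumptions.

```python
# Minimal sketch: classification loss plus a separation loss between known
# features and features fabricated by averaging two different-class samples.
import torch
import torch.nn as nn
import torch.nn.functional as F

backbone = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 16))
head = nn.Linear(16, 4)                              # 4 known classes
opt = torch.optim.Adam(list(backbone.parameters()) + list(head.parameters()), lr=1e-3)

def train_step(x, y, margin=1.0):
    feat = backbone(x)
    cls_loss = F.cross_entropy(head(feat), y)
    # Fabricate unknown features by averaging features of two different classes.
    perm = torch.randperm(len(x))
    mask = y != y[perm]
    unknown = 0.5 * (feat[mask] + feat[perm][mask])
    if len(unknown) > 0:
        dists = torch.cdist(feat, unknown)
        sep_loss = F.relu(margin - dists).mean()     # keep known and unknown apart
    else:
        sep_loss = torch.zeros(())
    loss = cls_loss + sep_loss
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```
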
  • Publication number: 20230259827
    Abstract: A non-transitory computer-readable recording medium stores a generation program for causing a computer to execute a process including: training, with data included in each of a plurality of data sets, a feature space in which the distance between pieces of data from the same domain is shorter and the distance between pieces of data from different domains is longer; and generating labeled data sets by integrating labeled data that fall within a predetermined range of one another in the trained feature space, among a plurality of pieces of the labeled data.
    Type: Application
    Filed: April 17, 2023
    Publication date: August 17, 2023
    Applicant: FUJITSU LIMITED
    Inventors: Takashi KATOH, Kento UEMURA, Suguru YASUTOMI, Tomohiro HAYASE
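
A minimal sketch of the two steps above, with an assumed contrastive-style domain loss and a greedy merge rule: learn an embedding that keeps same-domain data close and different-domain data apart, then integrate labeled points that fall within a fixed radius of each other in that space.

```python
# Minimal sketch: domain-aware metric learning followed by radius-based merging
# of labeled points. The margin, radius, and greedy merge are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

embed = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 8))
opt = torch.optim.Adam(embed.parameters(), lr=1e-3)

def metric_step(x, domain_id, margin=2.0):
    z = embed(x)
    d = torch.cdist(z, z)
    same = (domain_id[:, None] == domain_id[None, :]).float()
    loss = (same * d + (1 - same) * F.relu(margin - d)).mean()
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

def merge_labeled(x, radius=0.5):
    # Greedy merge: attach each labeled point to the first kept point within `radius`.
    z = embed(x).detach()
    kept, merged = [], []
    for i in range(len(z)):
        for j in kept:
            if (z[i] - z[j]).norm() < radius:
                merged.append((j, i)); break
        else:
            kept.append(i)
    return kept, merged
```
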
  • Publication number: 20230186118
    Abstract: A program for causing a computer to execute processing including: acquiring a plurality of datasets, each of which includes data values associated with a label, the data values having properties different for each dataset; calculating an index indicating a degree of a difference between first and second datasets by using a data value in the second dataset; calculating accuracy of a prediction result for the second dataset, predicted by a prediction model trained using the first dataset; specifying a relationship between the index and the accuracy of the prediction result from the prediction model, based on the index and the accuracy calculated for each of a plurality of combinations of the first and second datasets; and estimating accuracy of the prediction result from the prediction model for a third dataset including data values without labels based on the specified relationship and the index between the first and third datasets.
    Type: Application
    Filed: January 20, 2023
    Publication date: June 15, 2023
    Applicant: FUJITSU LIMITED
    Inventors: Tomohiro HAYASE, Takashi KATOH, Suguru YASUTOMI, Kento UEMURA
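
The estimation step above can be pictured as fitting a curve between a dataset-difference index and measured accuracy, then reading the curve at the index of the unlabeled dataset. The sketch below assumes a specific index (distance between feature means) and a linear fit; both are illustrative stand-ins for whatever index and relationship the application actually uses.

```python
# Minimal sketch: fit index-vs-accuracy over several dataset pairs, then
# estimate accuracy for an unlabeled third dataset from its index alone.
import numpy as np

def difference_index(a, b):
    # assumed index: distance between the feature means of two datasets
    return np.linalg.norm(a.mean(axis=0) - b.mean(axis=0))

# (index, accuracy) pairs measured on several (first, second) dataset combinations
pairs = [(0.1, 0.92), (0.4, 0.85), (0.9, 0.71), (1.3, 0.60)]   # stand-in numbers
idx, acc = np.array(pairs).T
slope, intercept = np.polyfit(idx, acc, deg=1)     # the "specified relationship"

third_index = 0.7                                  # index(first, third) from features
estimated_accuracy = slope * third_index + intercept
print(round(float(estimated_accuracy), 3))
```
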
  • Patent number: 11676030
    Abstract: A learning method executed by a computer, the learning method including augmenting original training data based on non-stored target information included in the original training data to generate a plurality of augmented training data, generating a plurality of intermediate feature values by inputting the plurality of augmented training data to a learning model, and learning a parameter of the learning model such that, with regard to the plurality of intermediate feature values, each of the plurality of intermediate feature values generated from a plurality of augmented training data, augmented from reference training data, becomes similar to a reference feature value.
    Type: Grant
    Filed: January 14, 2020
    Date of Patent: June 13, 2023
    Assignee: FUJITSU LIMITED
    Inventors: Takashi Katoh, Kento Uemura, Suguru Yasutomi
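
A minimal training-loop sketch of the feature-consistency idea shared by this patent family (illustrative only): augment each original sample several times, pull the intermediate features of those augmentations toward a shared reference feature, and keep the usual classification loss. The toy noise augmentation, the mean-feature reference, and the layer sizes are assumptions.

```python
# Minimal sketch: augmentation-consistent intermediate features plus a
# classification loss on all augmented views.
import torch
import torch.nn as nn
import torch.nn.functional as F

backbone = nn.Sequential(nn.Linear(10, 32), nn.ReLU())
head = nn.Linear(32, 3)
opt = torch.optim.Adam(list(backbone.parameters()) + list(head.parameters()), lr=1e-3)

def augment(x, n=4, noise=0.1):
    return x.unsqueeze(0) + noise * torch.randn(n, *x.shape)   # toy augmentation

def train_step(x, y):
    views = augment(x)                                   # (n_views, batch, dim)
    feats = backbone(views.reshape(-1, x.shape[-1])).reshape(views.shape[0], x.shape[0], -1)
    reference = feats.mean(dim=0).detach()               # reference feature per sample
    consistency = F.mse_loss(feats, reference.expand_as(feats))
    logits = head(feats.reshape(-1, feats.shape[-1]))
    cls = F.cross_entropy(logits, y.repeat(views.shape[0]))
    loss = cls + consistency
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```
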
  • Patent number: 11620530
    Abstract: A learning method executed by a computer, the learning method includes: learning parameters of a machine learning model having intermediate feature values by inputting a plurality of augmented training data, which is generated by augmenting original training data, to the machine learning model so that specific intermediate feature values, which are calculated from specific augmented training data augmented from a same original training data, become similar to each other.
    Type: Grant
    Filed: January 14, 2020
    Date of Patent: April 4, 2023
    Assignee: FUJITSU LIMITED
    Inventors: Takashi Katoh, Kento Uemura, Suguru Yasutomi, Takeshi Osoekawa
  • Patent number: 11562233
    Abstract: A learning device generates a first feature value and a second feature value by inputting original training data to a first neural network included in a learning model. The learning device learns at least one parameter of the learning model and a parameter of a decoder, which reconstructs data inputted to the first neural network, such that reconstruction data outputted from the decoder by inputting the first feature value and the second feature value to the decoder becomes close to the original training data, and such that output data, which is outputted from a second neural network included in the learning model by inputting the second feature value to the second neural network, becomes close to correct data of the original training data.
    Type: Grant
    Filed: January 14, 2020
    Date of Patent: January 24, 2023
    Assignee: FUJITSU LIMITED
    Inventors: Takashi Katoh, Kento Uemura, Suguru Yasutomi
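
A minimal sketch of the two-feature arrangement as read from the abstract above, with all layer sizes assumed: the first network emits two feature values, a decoder must reconstruct the input from both, and a second network must predict the label from the second feature alone.

```python
# Minimal sketch: reconstruction from both features, classification from the
# second feature only. Shapes and losses are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

first_net  = nn.Linear(10, 16)         # outputs both feature values (8 + 8)
decoder    = nn.Linear(16, 10)         # reconstructs the input from both features
second_net = nn.Linear(8, 3)           # classifies from the second feature only
params = (list(first_net.parameters()) + list(decoder.parameters())
          + list(second_net.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)

def train_step(x, y):
    h = first_net(x)
    f1, f2 = h[:, :8], h[:, 8:]
    recon = decoder(torch.cat([f1, f2], dim=1))
    logits = second_net(f2)
    loss = F.mse_loss(recon, x) + F.cross_entropy(logits, y)
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```
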
  • Patent number: 11494696
    Abstract: A non-transitory computer-readable recording medium stores therein a learning program that causes a computer to execute a process including: generating a shadow image including a shadow according to a state of ultrasound reflection in an ultrasound image; generating a combined image by combining the ultrasound image and the shadow image; inputting, into a first decoder and a second decoder, an output acquired from an encoder in response to inputting the combined image into the encoder; and executing training of the encoder, the first decoder, and the second decoder, based on: a reconstruction error between an output image of a coupling function and the combined image, the coupling function being configured to combine a first image output from the first decoder with a second image output from the second decoder, and an error function between an area in the first image and the shadow in the shadow image.
    Type: Grant
    Filed: December 17, 2019
    Date of Patent: November 8, 2022
    Assignee: FUJITSU LIMITED
    Inventors: Suguru Yasutomi, Kento Uemura, Takashi Katoh
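
A minimal sketch of the encoder/two-decoder setup above, with the coupling function assumed to be multiplicative shading: one decoder is supervised toward the synthetic shadow image, and the recombination of both decoder outputs must reproduce the combined input. The architecture and the shading formula are illustrative.

```python
# Minimal sketch: synthetic shadow + ultrasound -> combined image; the encoder
# feeds a shadow decoder and an image decoder whose coupled output must match it.
import torch
import torch.nn as nn
import torch.nn.functional as F

enc        = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU())
dec_shadow = nn.Conv2d(8, 1, 3, padding=1)     # first decoder (shadow image)
dec_image  = nn.Conv2d(8, 1, 3, padding=1)     # second decoder (shadow-free image)
params = (list(enc.parameters()) + list(dec_shadow.parameters())
          + list(dec_image.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)

def train_step(ultrasound, shadow):
    combined = ultrasound * (1.0 - shadow)               # combined image
    h = enc(combined)
    s_hat, img_hat = torch.sigmoid(dec_shadow(h)), torch.sigmoid(dec_image(h))
    coupled = img_hat * (1.0 - s_hat)                    # assumed coupling function
    loss = F.mse_loss(coupled, combined) + F.mse_loss(s_hat, shadow)
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```
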
  • Patent number: 11449715
    Abstract: An apparatus receives, at a discriminator within a generative adversarial network, first generation data from a first generator within the generative adversarial network, where the first generator has performed learning using a first data group. The apparatus receives, at the discriminator, a second data group, and performs learning of a second generator based on the first generation data and the second data group where the first generation data is handled as false data by the discriminator.
    Type: Grant
    Filed: November 12, 2019
    Date of Patent: September 20, 2022
    Assignee: FUJITSU LIMITED
    Inventors: Hiroya Inakoshi, Takashi Katoh, Kento Uemura, Suguru Yasutomi
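
A toy GAN-style sketch of the idea above (network shapes and the binary cross-entropy losses are assumed): while the second generator learns the second data group, samples from the frozen first generator are shown to the shared discriminator with a "fake" label.

```python
# Minimal sketch: discriminator sees second-group data as real and both
# generators' outputs as fake; only the second generator is being trained.
import torch
import torch.nn as nn
import torch.nn.functional as F

G1 = nn.Linear(4, 10)   # first generator, already trained on the first data group
G2 = nn.Linear(4, 10)   # second generator, being trained on the second data group
D  = nn.Linear(10, 1)   # shared discriminator
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(G2.parameters(), lr=1e-3)

def step(real2):                                    # real2: second data group batch
    z = torch.randn(len(real2), 4)
    with torch.no_grad():
        fake1 = G1(z)                               # "first generation data"
    fake2 = G2(z)
    ones, zeros = torch.ones(len(real2), 1), torch.zeros(len(real2), 1)
    # Discriminator: second data real, G1 and G2 outputs both handled as false.
    d_loss = (F.binary_cross_entropy_with_logits(D(real2), ones)
              + F.binary_cross_entropy_with_logits(D(fake1), zeros)
              + F.binary_cross_entropy_with_logits(D(fake2.detach()), zeros))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Second generator: make its output look real to the discriminator.
    g_loss = F.binary_cross_entropy_with_logits(D(fake2), ones)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```
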
  • Publication number: 20220261690
    Abstract: A computer-implemented method of determination processing, the method including: calculating, in response to deterioration of a classification model having occurred, a similarity between a first determination result and each of a plurality of second determination results, the first determination result being a determination result output from the classification model by inputting first input data after the deterioration has occurred to the classification model, and the plurality of second determination results being determination results output from the classification model by inputting, to the classification model, a plurality of pieces of post-conversion data converted by inputting second input data before the deterioration occurs to a plurality of data converters; selecting a data converter from the plurality of data converters on the basis of the similarity; and preprocessing data input to the classification model by using the selected data converter.
    Type: Application
    Filed: December 5, 2021
    Publication date: August 18, 2022
    Applicant: FUJITSU LIMITED
    Inventors: Takashi KATOH, Kento UEMURA, Suguru YASUTOMI, Tomohiro HAYASE
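
A minimal sketch of the converter-selection step described above, with the similarity measure assumed to be cosine similarity between averaged softmax outputs; the model and the converters themselves are placeholders.

```python
# Minimal sketch: pick the converter whose converted pre-drift data produces
# model outputs most similar to the post-drift outputs, then preprocess with it.
import torch
import torch.nn.functional as F

def select_converter(model, converters, post_drift_x, pre_drift_x):
    with torch.no_grad():
        first = F.softmax(model(post_drift_x), dim=1).mean(dim=0)   # first result
        sims = []
        for conv in converters:
            second = F.softmax(model(conv(pre_drift_x)), dim=1).mean(dim=0)
            sims.append(F.cosine_similarity(first, second, dim=0))
    best = int(torch.stack(sims).argmax())
    return converters[best]

# usage sketch: preprocess new inputs with the selected converter
# conv = select_converter(model, [lambda x: x, lambda x: 0.5 * x], x_new, x_old)
# y = model(conv(x_new))
```
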
  • Patent number: 11409988
    Abstract: A learning device learns at least one parameter of a learning model such that each intermediate feature quantity becomes similar to a reference feature quantity, each intermediate feature quantity being calculated as a result of inputting a plurality of sets of augmentation training data to a first neural network in the learning model, the plurality of sets of augmentation training data being generated by performing data augmentation based on the same first original training data. The learning device learns at least one parameter of a second network in the learning model using second original training data, which is different from the first original training data, and using the reference feature quantity.
    Type: Grant
    Filed: January 8, 2020
    Date of Patent: August 9, 2022
    Assignee: FUJITSU LIMITED
    Inventors: Takashi Katoh, Kento Uemura, Suguru Yasutomi
  • Publication number: 20220245405
    Abstract: A deterioration suppression device generates a plurality of trained machine learning models having different characteristics on the basis of each training data included in a first training data set and assigned with a label indicating correct answer information. In a case where estimation accuracy of label estimation with respect to input data to be estimated by any trained machine learning model among the plurality of trained machine learning models becomes lower than a predetermined standard, the deterioration suppression device generates a second training data set including a plurality of pieces of training data using an estimation result by a trained machine learning model with the estimation accuracy equal to or higher than the predetermined standard. The deterioration suppression device executes re-learning of the trained machine learning model with the estimation accuracy lower than the predetermined standard using the second training data set.
    Type: Application
    Filed: April 25, 2022
    Publication date: August 4, 2022
    Applicant: FUJITSU LIMITED
    Inventors: Takashi KATOH, Kento UEMURA, Suguru YASUTOMI, Tomohiro HAYASE, Yuhei UMEDA
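
A minimal sketch of the repair step above, assuming interchangeable classifiers, a majority vote over the still-accurate ensemble members as the pseudo-labeling rule, and plain cross-entropy re-training of the degraded member; the accuracy check itself is left outside the sketch.

```python
# Minimal sketch: label fresh data with the healthy ensemble members, then
# re-train the degraded member on those pseudo-labels.
import torch
import torch.nn.functional as F

def pseudo_label(good_models, x):
    with torch.no_grad():
        votes = torch.stack([m(x).argmax(dim=1) for m in good_models])
    return votes.mode(dim=0).values            # majority vote as the label

def repair(degraded_model, good_models, x, steps=100, lr=1e-3):
    y = pseudo_label(good_models, x)           # the "second training data set"
    opt = torch.optim.Adam(degraded_model.parameters(), lr=lr)
    for _ in range(steps):
        loss = F.cross_entropy(degraded_model(x), y)
        opt.zero_grad(); loss.backward(); opt.step()
    return degraded_model
```
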
  • Patent number: 11367003
    Abstract: A non-transitory computer-readable storage medium storing a program that causes a computer to execute a process, the process including obtaining a feature quantity of input data by using a feature generator, generating a first output based on the feature quantity by using a supervised learner for labeled data, generating a second output based on the feature quantity by using an unsupervised learning processing for unlabeled data, and changing a contribution ratio between a first error and a second error in a learning by the feature generator, the first error being generated from the labeled data and the first output, the second error being generated from the unlabeled data and the second output.
    Type: Grant
    Filed: April 6, 2018
    Date of Patent: June 21, 2022
    Assignee: Fujitsu Limited
    Inventors: Takashi Katoh, Kento Uemura, Suguru Yasutomi, Toshio Endoh
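
A minimal sketch of the shared-feature setup above, with assumed heads and an assumed linear schedule for the contribution ratio between the supervised error on labeled data and the unsupervised reconstruction error on unlabeled data.

```python
# Minimal sketch: one feature generator, a supervised head for labeled data,
# a reconstruction head for unlabeled data, and a shifting contribution ratio.
import torch
import torch.nn as nn
import torch.nn.functional as F

feature_gen = nn.Sequential(nn.Linear(10, 32), nn.ReLU())
sup_head    = nn.Linear(32, 3)        # supervised learner
recon_head  = nn.Linear(32, 10)       # unsupervised learner (autoencoder-style)
params = (list(feature_gen.parameters()) + list(sup_head.parameters())
          + list(recon_head.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)

def train_step(x_lab, y_lab, x_unlab, epoch, total_epochs):
    ratio = epoch / total_epochs                       # contribution ratio schedule
    first_error  = F.cross_entropy(sup_head(feature_gen(x_lab)), y_lab)
    second_error = F.mse_loss(recon_head(feature_gen(x_unlab)), x_unlab)
    loss = ratio * first_error + (1 - ratio) * second_error
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```
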
  • Publication number: 20220147764
    Abstract: A non-transitory computer-readable storage medium storing a data generation program that causes at least one computer to execute a process, the process includes, acquiring a data generation model that is trained by using a first dataset corresponding to a first domain and a second dataset corresponding to a second domain, and that includes an identification loss by an identification model in a parameter; inputting first data corresponding to the first domain to the identification model to acquire a first identification loss, and inputting second data corresponding to the second domain to the identification model to acquire a second identification loss; generating data in which the second identification loss approximates the first identification loss, by using the data generation model; and outputting the data that is generated.
    Type: Application
    Filed: September 13, 2021
    Publication date: May 12, 2022
    Applicant: FUJITSU LIMITED
    Inventors: Takashi KATOH, Kento UEMURA, Suguru YASUTOMI, Tomohiro HAYASE
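
A rough sketch of the generation step above under strong assumptions: with a frozen generator for the second domain and a frozen identification (domain) model, the generator's latent input is optimized so that the identification loss on the generated second-domain data approaches the loss measured on first-domain data. All shapes, the latent optimization, and the squared-gap objective are illustrative.

```python
# Illustrative sketch, not the claimed method: optimize latent codes so the
# second identification loss approximates the first identification loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

G = nn.Linear(4, 10)            # data generation model (second domain), frozen
D = nn.Linear(10, 2)            # identification model: domain 0 vs domain 1, frozen
for p in list(G.parameters()) + list(D.parameters()):
    p.requires_grad_(False)

def generate_matched(first_data, n=16, steps=200, lr=0.05):
    with torch.no_grad():
        target = F.cross_entropy(D(first_data),
                                 torch.zeros(len(first_data), dtype=torch.long))
    z = torch.randn(n, 4, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        loss_2 = F.cross_entropy(D(G(z)), torch.ones(n, dtype=torch.long))
        gap = (loss_2 - target) ** 2          # make the second loss approximate the first
        opt.zero_grad(); gap.backward(); opt.step()
    return G(z).detach()
```
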
  • Publication number: 20220101124
    Abstract: A non-transitory computer-readable storage medium storing an information processing program that causes at least one computer to execute a process, the process includes acquiring a first machine learning model trained by using a training data set including first data and a second machine learning model not trained with the first data; and retraining the first machine learning model so that an output of the first machine learning model and an output of the second machine learning model, when second data corresponding to the first data is input, get close to each other.
    Type: Application
    Filed: July 7, 2021
    Publication date: March 31, 2022
    Applicant: FUJITSU LIMITED
    Inventors: Suguru YASUTOMI, Tomohiro HAYASE, Takashi KATOH
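
A minimal sketch of the retraining step above, with the distillation loss assumed to be a mean-squared error between outputs: the first model is nudged so that, on data corresponding to what should be excluded, it behaves like the second model that was never trained on that data.

```python
# Minimal sketch: pull the first model's outputs toward those of a model that
# never saw the data in question. The loss choice and step count are assumptions.
import torch
import torch.nn.functional as F

def forget(first_model, second_model, related_x, steps=200, lr=1e-3):
    opt = torch.optim.Adam(first_model.parameters(), lr=lr)
    for _ in range(steps):
        with torch.no_grad():
            target = second_model(related_x)
        loss = F.mse_loss(first_model(related_x), target)   # pull outputs together
        opt.zero_grad(); loss.backward(); opt.step()
    return first_model
```
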