Patents by Inventor Sekitoshi KANAI

Sekitoshi KANAI has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230359904
    Abstract: The conversion unit (132) converts first data into a first frequency component, and converts second data, generated by a generator that is part of an adversarial learning model, into a second frequency component. The calculation unit (133) calculates a loss function that simultaneously optimizes the generator, a first discriminator that is part of the adversarial learning model and discriminates between the first data and the second data, and a second discriminator that is part of the adversarial learning model and discriminates between the first frequency component and the second frequency component. The update unit (134) updates the parameters of the generator, the first discriminator, and the second discriminator so that the loss function calculated by the calculation unit (133) is optimized. (An illustrative sketch follows this entry.)
    Type: Application
    Filed: September 30, 2020
    Publication date: November 9, 2023
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Shinya YAMAGUCHI, Sekitoshi KANAI
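
The abstract above describes adversarial training with two discriminators: one on the raw data and one on its frequency components. Below is a minimal PyTorch-style sketch of such an objective; the use of a log-amplitude 2-D FFT as the frequency conversion, the binary cross-entropy GAN loss, and the function names are illustrative assumptions, not the patented implementation.

```python
# Hedged sketch: GAN losses with an extra frequency-domain discriminator.
import torch
import torch.nn as nn

bce = nn.BCEWithLogitsLoss()

def to_frequency(x: torch.Tensor) -> torch.Tensor:
    """'Conversion unit': map data to a log-amplitude 2-D spectrum (assumed choice)."""
    return torch.log1p(torch.abs(torch.fft.fft2(x)))

def discriminator_loss(d_pixel: nn.Module, d_freq: nn.Module,
                       real: torch.Tensor, fake: torch.Tensor) -> torch.Tensor:
    """Joint loss for the data-space and frequency-space discriminators."""
    fake = fake.detach()  # only the discriminators are updated by this loss
    loss = torch.zeros(())
    for d, r, f in ((d_pixel, real, fake), (d_freq, to_frequency(real), to_frequency(fake))):
        real_logits, fake_logits = d(r), d(f)
        loss = loss + bce(real_logits, torch.ones_like(real_logits)) \
                    + bce(fake_logits, torch.zeros_like(fake_logits))
    return loss

def generator_loss(d_pixel: nn.Module, d_freq: nn.Module, fake: torch.Tensor) -> torch.Tensor:
    """The generator tries to fool both discriminators at once."""
    pixel_logits, freq_logits = d_pixel(fake), d_freq(to_frequency(fake))
    return bce(pixel_logits, torch.ones_like(pixel_logits)) \
         + bce(freq_logits, torch.ones_like(freq_logits))
```
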
  • Publication number: 20230267316
    Abstract: In a deep neural network having an intermediate layer and a final layer, an inference device executes a first conversion step of converting the output of the intermediate layer using a bounded nonlinear function in the final layer. The inference device then executes a second conversion step of converting the value obtained in the first conversion step using an activation function. (An illustrative sketch follows this entry.)
    Type: Application
    Filed: August 5, 2020
    Publication date: August 24, 2023
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Sekitoshi KANAI, Masanori YAMADA
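
The abstract above specifies a final layer that first bounds its value with a bounded nonlinear function and then applies an activation function. A minimal sketch of that two-step conversion follows; the linear projection, the choice of tanh as the bounded function, and softmax as the activation are assumptions for illustration.

```python
# Hedged sketch: final layer that bounds its pre-activation before the usual activation.
import torch
import torch.nn as nn

class BoundedFinalLayer(nn.Module):
    def __init__(self, in_features: int, num_classes: int, bound: float = 10.0):
        super().__init__()
        self.linear = nn.Linear(in_features, num_classes)  # assumed projection to class scores
        self.bound = bound                                  # step-1 output stays in [-bound, bound]

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        z = self.bound * torch.tanh(self.linear(h))  # first conversion: bounded nonlinear function
        return torch.softmax(z, dim=-1)               # second conversion: activation function
```

Bounding the value in this way keeps the softmax inputs within a fixed range regardless of how large the intermediate activations grow.
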
  • Publication number: 20220405624
    Abstract: An acquisition unit 15a acquires data in a task. The learning unit 15b learns a generative model representing the probability distribution of the data in the task so that the mutual information between a latent variable and an observed variable in the model is minimized. (An illustrative sketch follows this entry.)
    Type: Application
    Filed: November 21, 2019
    Publication date: December 22, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Hiroshi TAKAHASHI, Tomoharu IWATA, Sekitoshi KANAI, Atsutoshi KUMAGAI, Yuki YAMANAKA, Masanori YAMADA, Satoshi YAGI
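
One way to realize the objective in the abstract above is a VAE-style model in which the KL term, an upper bound on the mutual information between the observation x and the latent variable z, is weighted up so that this mutual information is driven toward zero. The sketch below takes that reading; the architecture, the beta weight, and the Bernoulli likelihood are assumptions, not the patented method.

```python
# Hedged sketch: VAE whose up-weighted KL term bounds and suppresses the x-z mutual information.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MIMinimizingVAE(nn.Module):
    def __init__(self, x_dim: int = 784, z_dim: int = 16, beta: float = 4.0):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim, 256), nn.ReLU(), nn.Linear(256, 2 * z_dim))
        self.decoder = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(), nn.Linear(256, x_dim))
        self.beta = beta  # beta > 1 pushes the KL term, and hence I(x; z), toward zero

    def loss(self, x: torch.Tensor) -> torch.Tensor:
        mu, logvar = self.encoder(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)          # reparameterization
        recon = F.binary_cross_entropy_with_logits(self.decoder(z), x, reduction="sum")
        # KL(q(z|x) || N(0, I)): an upper bound on the latent-observed mutual information.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + self.beta * kl
```
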
  • Publication number: 20220164604
    Abstract: A classification device (10) includes: a classification unit (12) that performs classification by using a model (121), which is a deep learning model for classification; and a preprocessing unit (11) that is provided prior to the classification unit (12) and selects the input to the model (121) by using a mask model (111) that minimizes the sum of a loss function and the magnitude of the input to the classification unit (12), the loss function evaluating the relationship between the label of an input taken from teaching data and the output of the model (121). (An illustrative sketch follows this entry.)
    Type: Application
    Filed: March 26, 2020
    Publication date: May 26, 2022
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Sekitoshi KANAI, Hiroshi TAKAHASHI
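
Below is a hedged PyTorch sketch of the mask-model objective in the abstract above: a mask network gates the input before it reaches a fixed classifier and is trained to minimize the classification loss plus the magnitude of the gated input. The sigmoid gating, the L1 measure of magnitude, and the weight lam are illustrative assumptions.

```python
# Hedged sketch: train a mask model that selects the classifier's input while keeping it small.
import torch
import torch.nn as nn
import torch.nn.functional as F

def masked_classification_loss(mask_model: nn.Module, classifier: nn.Module,
                               x: torch.Tensor, y: torch.Tensor,
                               lam: float = 1e-3) -> torch.Tensor:
    mask = torch.sigmoid(mask_model(x))      # mask values in (0, 1), same shape as the input
    x_selected = mask * x                    # "selected" input that is fed to the classifier
    task_loss = F.cross_entropy(classifier(x_selected), y)    # label vs. model output
    magnitude = x_selected.abs().flatten(1).sum(dim=1).mean() # size of the classifier's input
    return task_loss + lam * magnitude
```
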
  • Publication number: 20210326705
    Abstract: A learning apparatus (10) includes: a generation unit (11) having a mathematical model that generates data by inputting a random number used for deep learning into a nonlinear function; and a prior learning unit (13) configured to cause the generation unit (11) to execute prior learning of a variance and an average using the unscented transform (UT). The prior learning unit (13) uses UT to estimate the variance and the average of the data generated by the generation unit (11), and updates the parameters of the generation unit (11) to minimize an evaluation function evaluating the similarity between the estimated variance and average and the variance and average of the true data calculated in advance. (An illustrative sketch follows this entry.)
    Type: Application
    Filed: August 13, 2019
    Publication date: October 21, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventor: Sekitoshi KANAI
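
The sketch below illustrates the prior-learning step in the abstract above: sigma points of the noise distribution are propagated through the generator, the unscented transform estimates the mean and variance of the generated data, and the generator is updated to match the precomputed mean and variance of the real data. The standard-normal noise, the sigma-point scaling, and the squared-error evaluation function are assumptions.

```python
# Hedged sketch: unscented-transform (UT) pretraining of a generator's output statistics.
import torch
import torch.nn as nn

def ut_pretrain_loss(generator: nn.Module, noise_dim: int,
                     data_mean: torch.Tensor, data_var: torch.Tensor,
                     kappa: float = 0.0) -> torch.Tensor:
    n = noise_dim
    # Sigma points of a standard-normal noise input: the origin plus +/- scaled unit vectors.
    scale = (n + kappa) ** 0.5
    eye = torch.eye(n)
    sigma_points = torch.cat([torch.zeros(1, n), scale * eye, -scale * eye], dim=0)
    weights = torch.cat([torch.tensor([kappa / (n + kappa)]),
                         torch.full((2 * n,), 1.0 / (2 * (n + kappa)))])

    outputs = generator(sigma_points)                        # propagate through the nonlinearity
    est_mean = (weights[:, None] * outputs).sum(dim=0)       # UT estimate of the output average
    est_var = (weights[:, None] * (outputs - est_mean) ** 2).sum(dim=0)  # UT variance estimate

    # Evaluation function: squared distance between estimated and true statistics.
    return ((est_mean - data_mean) ** 2).sum() + ((est_var - data_var) ** 2).sum()
```

In use, this loss would be minimized over the generator's parameters with a standard optimizer before the main training.
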
  • Publication number: 20210056418
    Abstract: A calculation unit (121) calculates, for the output signal of the output layer in a neural network, an output function obtained by replacing the exponential function included in softmax with the product of that exponential function and a predetermined function having no parameter, so that the output function has a non-linear log-likelihood function. An update unit (122) updates the parameters of the neural network on the basis of the output signal such that the log-likelihood function of the output function is optimized. (An illustrative sketch follows this entry.)
    Type: Application
    Filed: April 22, 2019
    Publication date: February 25, 2021
    Applicant: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Sekitoshi KANAI, Yasuhiro FUJIWARA, Yuki YAMANAKA
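
A small PyTorch sketch of the modified output function described in the abstract above: the exponential inside softmax is multiplied by a parameter-free function of the same logit, which makes the log-likelihood non-linear in the logits. Using the sigmoid as that parameter-free function is an illustrative choice, not necessarily the one claimed.

```python
# Hedged sketch: softmax with exp(z) replaced by exp(z) * g(z) for a parameter-free g (sigmoid here).
import torch

def modified_softmax(logits: torch.Tensor) -> torch.Tensor:
    shifted = logits - logits.max(dim=-1, keepdim=True).values   # shift for numerical stability
    weights = torch.exp(shifted) * torch.sigmoid(logits)          # exp times a parameter-free function
    return weights / weights.sum(dim=-1, keepdim=True)

def negative_log_likelihood(logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    # The log of this output is no longer linear in the logits, unlike plain log-softmax.
    probs = modified_softmax(logits)
    picked = probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    return -torch.log(picked + 1e-12).mean()
```

The network's parameters would then be updated by minimizing this negative log-likelihood with any gradient-based optimizer.
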