Patents by Inventor Shingo Kida

Shingo Kida has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12267570
    Abstract: A visible light image generation model learning unit generates a trained visible light image generation model that generates a visible light image in a second time zone from a far-infrared image in a first time zone. The visible light image generation model learning unit includes a first learning unit that machine-learns the far-infrared image in the first time zone and a far-infrared image in the second time zone as teacher data and generates a trained first generation model that generates the far-infrared image in the second time zone from the far-infrared image in the first time zone, and a second learning unit that machine-learns the far-infrared image in the second time zone and the visible light image in the second time zone as teacher data and generates a trained second generation model that generates the visible light image in the second time zone from the far-infrared image in the second time zone.
    Type: Grant
    Filed: February 24, 2023
    Date of Patent: April 1, 2025
    Assignee: JVCKENWOOD Corporation
    Inventors: Yincheng Yang, Shingo Kida, Hideki Takehara
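The two-stage pipeline the abstract describes (far-infrared time zone 1 → far-infrared time zone 2 → visible light time zone 2) can be sketched as a composition of two trained generators. Everything here is a hypothetical stand-in for illustration, not the patent's actual models:

```python
def make_visible_light_generator(first_gen, second_gen):
    """Compose the two trained generation models described in the abstract:
    first_gen:  far-infrared (time zone 1) -> far-infrared (time zone 2)
    second_gen: far-infrared (time zone 2) -> visible light (time zone 2)
    """
    def generate(fir_image_t1):
        fir_image_t2 = second_stage_input = first_gen(fir_image_t1)  # stage 1
        return second_gen(second_stage_input)                        # stage 2
    return generate

# Toy scalar "generators" standing in for the trained neural networks.
gen = make_visible_light_generator(lambda x: x + 1, lambda x: x * 2)
```

In a real system both generators would be neural networks trained as the abstract describes; the composition itself is the only point this sketch makes.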
  • Publication number: 20240338605
    Abstract: A machine learning apparatus that continually learns a novel class with fewer samples than a base class is provided. A base class feature extraction unit extracts a feature vector of the base class. A novel class feature extraction unit extracts a feature vector of the novel class. A merged feature calculation unit merges the feature vector of the base class and the feature vector of the novel class to calculate a merged feature vector that merges the base class and the novel class. A learning unit classifies, on a projected space, a query sample of a query set based on a distance between a position of the merged feature vector of the query sample of the query set and a position of a classification weight vector of each class, and learns a classification weight vector of the novel class to minimize a loss incurred in classification.
    Type: Application
    Filed: June 18, 2024
    Publication date: October 10, 2024
    Inventor: Shingo KIDA
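The classification rule in the abstract (assign a query sample to the class whose classification weight vector is nearest in the projected space) can be sketched as follows. The function and data names are hypothetical; the real system operates on learned feature vectors:

```python
import math

def classify_by_weight_distance(query_vec, class_weights):
    """Assign the query sample to the class whose classification
    weight vector is nearest in the projected feature space.
    class_weights maps class label -> weight vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(class_weights, key=lambda c: dist(query_vec, class_weights[c]))

# Toy 2-D weight vectors for one base class and one novel class.
weights = {"base": [0.0, 0.0], "novel": [1.0, 1.0]}
```

Learning then adjusts the novel-class weight vectors so that this nearest-weight classification incurs minimal loss on the query set.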
  • Publication number: 20240330703
Abstract: A pre-trained feature extraction unit extracts feature vectors of samples in a base class using a pre-trained model. A base class classification unit classifies the samples in the base class using the classification weight of the base class while using the feature vectors of the samples in the base class as input. A feature optimization unit performs meta-learning of an optimization module that is based on the pre-trained model and optimizes feature vectors of samples in a novel class. A novel class feature averaging unit averages the feature vectors of the samples in the novel class for each class and calculates the classification weight of the novel class. A graph neural network uses the classification weights of the base class and novel class as input, performs meta-learning of the dependence relationship between the base and novel classes, and outputs a reconstruction classification weight.
    Type: Application
    Filed: June 11, 2024
    Publication date: October 3, 2024
    Inventor: Shingo KIDA
  • Publication number: 20240312186
Abstract: A feature extraction unit extracts a feature vector from input data. A semantic prediction unit is a module that has been trained in advance in a meta-learning process and that generates a semantic vector from the feature vector of the input data. A mapping unit is a module that has learned a base class and that generates a semantic vector from the feature vector of the input data. An optimization unit optimizes parameters of the mapping unit using the semantic vector generated by the semantic prediction unit as a correct answer semantic vector such that a distance between the semantic vector generated by the mapping unit and the correct answer semantic vector is minimized when semantic information is not added to input data of a novel class at the time of learning the novel class.
    Type: Application
    Filed: May 21, 2024
    Publication date: September 19, 2024
    Inventor: Shingo KIDA
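The optimization the abstract describes (minimize the distance between the mapping unit's semantic vector and the correct-answer semantic vector) can be illustrated with a single gradient step under a squared-distance loss. This toy version updates the vector directly; the real system updates the mapping unit's parameters. All names are hypothetical:

```python
def optimize_mapping_step(mapped_vec, target_vec, lr=0.25):
    """One gradient-descent step pulling the mapping unit's semantic
    vector toward the correct-answer semantic vector under a
    squared-distance loss:  v <- v - lr * d/dv ||v - t||^2."""
    return [m - lr * 2 * (m - t) for m, t in zip(mapped_vec, target_vec)]
```

Repeating such steps drives the generated semantic vector toward the one predicted by the meta-learned semantic prediction unit.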
  • Publication number: 20240265257
    Abstract: A machine learning device is provided that performs continual learning of a fewer number of novel classes than the number of base classes. A base class feature extraction unit extracts feature vectors of the base classes. A novel class feature extraction unit extracts feature vectors of the novel classes. A mixture feature calculation unit mixes the feature vectors of the base classes and the feature vectors of the novel classes and calculates a mixture feature vector of the base classes and the novel classes. A learning unit classifies a query sample of a query set based on the distance between the position of a mixture feature vector of the query sample of the query set and the position of a classification weight vector of each class in a projection space and learns classification weight vectors of the novel classes so as to minimize classification loss.
    Type: Application
    Filed: March 27, 2024
    Publication date: August 8, 2024
    Inventors: Shingo KIDA, Hideki TAKEHARA, Yincheng YANG, Maki TAKAMI
  • Publication number: 20240212323
Abstract: A base class selection unit selects, in response to input data, a base class based on an embedding vector output by a basic neural network that has learned the base class and a centroid vector of the base class. A continual learning unit continually learns an additional class by using an additional neural network that has learned the base class. An additional class selection unit selects, in response to the input data, an additional class based on an embedding vector output by the additional neural network subjected to continual learning and centroid vectors of the base class and the additional class. A classification determination unit classifies the input data based on the base class selected by the base class selection unit and the additional class selected by the additional class selection unit.
    Type: Application
    Filed: February 27, 2024
    Publication date: June 27, 2024
    Inventors: Hideki TAKEHARA, Shingo KIDA, Yincheng YANG, Maki TAKAMI
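The centroid-based selection step described above can be sketched as nearest-centroid classification over an embedding. The names and toy vectors are hypothetical stand-ins for the network's learned embeddings:

```python
import math

def nearest_class(embedding, centroids):
    """Select the class whose centroid vector is closest to the
    embedding vector output by the network.
    centroids maps class label -> centroid vector."""
    return min(centroids, key=lambda c: math.dist(embedding, centroids[c]))

# Toy 2-D centroids: one base class and one additional class.
centroids = {"base_cat": [0.0, 0.0], "additional_dog": [2.0, 2.0]}
```

The classification determination unit would then combine the base-class selection and the additional-class selection made this way.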
  • Publication number: 20230409912
    Abstract: An initialization rate determination unit determines, in accordance with a depth of a layer in a neural network model, a first initialization rate for initializing weights in the neural network model on a first task. A machine learning execution unit generates a neural network model trained on a first task by training on the first task by machine learning. An initialization unit initializes weights in the neural network model trained on the first task, based on the first initialization rate, to generate an initialized neural network model trained on the first task, the initialized neural network trained on the first task being used in a second task.
    Type: Application
    Filed: September 1, 2023
    Publication date: December 21, 2023
    Inventors: Hideki TAKEHARA, Shingo KIDA, Yincheng YANG
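The depth-dependent initialization the abstract describes can be sketched as follows. The linear rate schedule and all names are assumptions for illustration, not the patent's actual rule:

```python
import random

def init_rate_for_depth(depth, num_layers):
    """Hypothetical schedule: deeper (more task-specific) layers get a
    higher re-initialization rate than shallow, general-purpose ones."""
    return depth / num_layers

def partially_initialize(weights, rate, rng=random.Random(0)):
    """Reset a `rate` fraction of a layer's weights to fresh random
    values, keeping the rest for reuse on the second task."""
    n_reset = int(len(weights) * rate)
    reset_idx = rng.sample(range(len(weights)), n_reset)
    out = list(weights)
    for i in reset_idx:
        out[i] = rng.uniform(-0.1, 0.1)
    return out
```

Applying `partially_initialize` per layer with its depth-dependent rate yields the initialized model used as the starting point for the second task.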
  • Publication number: 20230385705
    Abstract: A domain adaptation data richness determination unit determines, when a first model trained by using training data of a first domain is trained by transfer learning by using training data of a second domain, a domain adaptation data richness based on the number of items of training data of the second domain, the first model being a neural network. A learning layer determining unit determines a layer in the second model, which is a duplicate of the first model, targeted for training, based on the domain adaptation data richness. A transfer learning unit applies transfer learning to the layer in the second model targeted for training, by using the training data of the second domain.
    Type: Application
    Filed: August 10, 2023
    Publication date: November 30, 2023
    Inventors: Hideki TAKEHARA, Shingo KIDA, Yincheng YANG
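The idea of choosing how much of the network to fine-tune from the amount of second-domain data can be sketched with a simple rule. The threshold and the "count layers from the output side" convention are assumptions, not the patent's actual determination logic:

```python
def layers_to_train(num_target_samples, num_layers, samples_per_layer=1000):
    """Hypothetical rule mapping domain adaptation data richness to a
    training scope: richer target-domain data unfreezes more layers
    (counted from the output side); with scarce data only the final
    layer is trained."""
    k = 1 + num_target_samples // samples_per_layer
    return min(k, num_layers)
```

Transfer learning would then update only the selected layers with the second-domain training data, leaving the rest of the duplicated model frozen.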
  • Publication number: 20230376763
    Abstract: A weight storage unit stores weights of a plurality of filters used to detect a feature of a task. A continual learning unit trains the weights of the plurality of filters in response to an input task in continual learning. A filter control unit compares, after a predetermined epoch number has been learned in continual learning, the weight of a filter that has learned the task with the weight of a filter that is learning the task, extracts overlap filters having a similarity in weight equal to or greater than a predetermined threshold value as shared filters shared by tasks, and leaves one of the overlap filters as the shared filter and initializes the weights of filters other than the shared filter.
    Type: Application
    Filed: July 10, 2023
    Publication date: November 23, 2023
    Inventors: Shingo KIDA, Hideki TAKEHARA, Yincheng YANG
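The overlap-filter extraction described above can be sketched with cosine similarity between flattened filter weights. The threshold value and zero-initialization are illustrative assumptions:

```python
import math

def cosine(a, b):
    """Cosine similarity between two flattened filter weight vectors."""
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

def merge_overlap_filters(old_filters, new_filters, threshold=0.95):
    """If a newly learned filter overlaps (similarity >= threshold)
    with a filter from a previously learned task, keep the old one as
    the shared filter and initialize the new one (zeros here) so it is
    freed up for future tasks."""
    merged = []
    for f in new_filters:
        if any(cosine(f, g) >= threshold for g in old_filters):
            merged.append([0.0] * len(f))  # initialized, reusable filter
        else:
            merged.append(f)               # distinct filter, kept as-is
    return merged
```

Freeing near-duplicate filters this way is what lets the fixed filter budget stretch across a sequence of tasks in continual learning.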
  • Publication number: 20230351266
    Abstract: A weight storage unit stores weights of a plurality of filters used to detect a feature of a task. A continual learning unit trains the weights of the filters in response to an input task in continual learning. A filter processing unit locks, of a plurality of filters that have learned one task, the weights of a proportion of the filters to prevent the proportion of the filters from being used to learn a further task and initializes the weights of other filters to use the other filters to learn a further task. A comparison unit compares the weights of a plurality of filters that have learned two or more tasks, extracts overlap filters having a similarity in weight over a threshold value as shared filters shared by tasks, leaves one of the overlap filters as the shared filter, and initializes the weights of filters other than the shared filter.
    Type: Application
    Filed: July 10, 2023
    Publication date: November 2, 2023
    Inventors: Yincheng YANG, Hideki TAKEHARA, Shingo KIDA
  • Publication number: 20230298366
    Abstract: An object recognition unit recognizes an object in an input image by using an object recognition model. A recognition precision determination unit determines a precision of recognition of the object in the input image. A supervised image conversion unit converts the input image for which the precision of recognition of the object is lower than a predetermined threshold value into a supervised image by labeling the input image based on a feature amount of the input image. A transfer learning unit applies transfer learning to the object recognition model by using the supervised image as training data to update the object recognition model.
    Type: Application
    Filed: May 26, 2023
    Publication date: September 21, 2023
    Inventors: Shingo KIDA, Hideki TAKEHARA, Yincheng YANG
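The loop in the abstract (find poorly recognized images, label them from their features, feed them back as training data) can be sketched as follows. `recognize` and `label_by_feature` are hypothetical placeholders for the object recognition model and the feature-based labeling step:

```python
def build_supervised_set(images, recognize, label_by_feature, threshold=0.8):
    """Collect training pairs from images the current model recognizes
    poorly: each image whose recognition precision falls below the
    threshold is labeled from its feature amount and queued as
    supervised data for transfer learning."""
    supervised = []
    for img in images:
        _label, precision = recognize(img)
        if precision < threshold:
            supervised.append((img, label_by_feature(img)))
    return supervised
```

The transfer learning unit would then fine-tune the object recognition model on the returned pairs, closing the loop on exactly the inputs it handles worst.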
  • Publication number: 20230289614
Abstract: A domain adaptability determination unit determines a domain adaptability based on a precision of inference from images of a second domain using a first model trained by using images of a first domain as training data, the first model being a neural network. A learning layer determining unit determines a layer in the second model, which is a duplicate of the first model, targeted for training, based on the domain adaptability. A transfer learning execution unit applies transfer learning to the layer in the second model targeted for training, by using images of the second domain as training data.
    Type: Application
    Filed: May 19, 2023
    Publication date: September 14, 2023
    Inventors: Hideki TAKEHARA, Shingo KIDA, Yincheng YANG
  • Publication number: 20230199280
    Abstract: A far-infrared image training data acquisition unit acquires a far-infrared image in a first predetermined time zone. A visible light image training data acquisition unit acquires a visible light image in a second predetermined time zone. A visible light image generation model training unit machine-learns the far-infrared image in the first predetermined time zone and the visible light image in the second predetermined time zone as training data by a generative adversarial network, and generates a trained generation model, which generates the visible light image in the second predetermined time zone from the far-infrared image in the first predetermined time zone. Through machine learning by a generative adversarial network, the visible light image generation model training unit further generates a trained identification model, which identifies whether or not the far-infrared image is a far-infrared image captured in the first predetermined time zone.
    Type: Application
    Filed: February 24, 2023
    Publication date: June 22, 2023
    Inventors: Hideki TAKEHARA, Shingo KIDA, Yincheng YANG
  • Publication number: 20230196739
    Abstract: A far-infrared image acquisition unit acquires a far-infrared image. An image conversion unit converts the acquired far-infrared image into a visible light image. A visible light image trained model storage unit stores a first visible light image trained model having performed learning using the visible light image as training data. A transfer learning unit performs transfer learning on a first visible light image trained model by using the visible light image obtained by conversion as training data to generate a second visible light image trained model.
    Type: Application
    Filed: February 24, 2023
    Publication date: June 22, 2023
    Inventors: Shingo KIDA, Hideki TAKEHARA, Yincheng YANG
  • Publication number: 20230199281
    Abstract: A visible light image generation model learning unit generates a trained visible light image generation model that generates a visible light image in a second time zone from a far-infrared image in a first time zone. The visible light image generation model learning unit includes a first learning unit that machine-learns the far-infrared image in the first time zone and a far-infrared image in the second time zone as teacher data and generates a trained first generation model that generates the far-infrared image in the second time zone from the far-infrared image in the first time zone, and a second learning unit that machine-learns the far-infrared image in the second time zone and the visible light image in the second time zone as teacher data and generates a trained second generation model that generates the visible light image in the second time zone from the far-infrared image in the second time zone.
    Type: Application
    Filed: February 24, 2023
    Publication date: June 22, 2023
    Inventors: Yincheng YANG, Shingo KIDA, Hideki TAKEHARA
  • Patent number: 11511195
    Abstract: The game device includes a reception unit that is configured to receive instruction information created when a user performs an operation on an input device triggered by a sound output while a game progresses and a derivation unit that is configured to start measuring the degree of fatigue on the basis of the instruction information received by the reception unit and derive the degree of fatigue of the user on the basis of an operation performed by the user during the started measurement of the degree of fatigue.
    Type: Grant
    Filed: March 24, 2021
    Date of Patent: November 29, 2022
    Assignee: JVCKENWOOD Corporation
    Inventors: Shingo Kida, Hideki Aiba, Ryouji Hoshi, Hisashi Oka, Yuya Takehara, Yincheng Yang, Hideya Tsujii, Daisuke Hachiri, Ryotaro Futamura
  • Patent number: 11471779
    Abstract: A map data analysis unit refers to map data for a game in which a plurality of players compete in a three-dimensional space to extract positional information on each player. A feature parameter extraction unit extracts a feature parameter related to the game. A spectating area analysis unit analyzes one or more areas in a map that should be viewed by spectators, based on the positional information on each player and the feature parameter related to the game. A map data generation unit generates spectating map data by associating information indicating the area that should be viewed by spectators with the map data.
    Type: Grant
    Filed: March 9, 2021
    Date of Patent: October 18, 2022
    Assignee: JVCKENWOOD CORPORATION
    Inventors: Hisashi Oka, Hideki Aiba, Yuya Takehara, Ryouji Hoshi, Shingo Kida, Yincheng Yang, Hideya Tsujii, Daisuke Hachiri, Ryotaro Futamura
  • Publication number: 20210299570
    Abstract: The game device includes a reception unit that is configured to receive instruction information created when a user performs an operation on an input device triggered by a sound output while a game progresses and a derivation unit that is configured to start measuring the degree of fatigue on the basis of the instruction information received by the reception unit and derive the degree of fatigue of the user on the basis of an operation performed by the user during the started measurement of the degree of fatigue.
    Type: Application
    Filed: March 24, 2021
    Publication date: September 30, 2021
    Inventors: Shingo KIDA, Hideki AIBA, Ryouji HOSHI, Hisashi OKA, Yuya TAKEHARA, Yincheng YANG, Hideya TSUJII, Daisuke HACHIRI, Ryotaro FUTAMURA
  • Publication number: 20210299558
    Abstract: A game device includes a reception unit that is configured to receive operation information created when a user performs an operation on an input device while a game progresses, a determination unit that is configured to determine whether an operation of starting measurement of the degree of concentration has been performed on the basis of the operation information received by the reception unit, and a derivation unit that is configured to start measuring the degree of concentration if the determination unit determines that the operation of starting measurement of the degree of concentration has been performed and is configured to derive the degree of concentration of the user on the basis of an operation performed by the user during the started measurement of the degree of concentration.
    Type: Application
    Filed: March 24, 2021
    Publication date: September 30, 2021
    Inventors: Yuya TAKEHARA, Hideki AIBA, Hisashi OKA, Ryouji HOSHI, Shingo KIDA, Yincheng YANG, Hideya TSUJII, Daisuke HACHIRI, Ryotaro FUTAMURA
  • Publication number: 20210275930
    Abstract: A map data analysis unit refers to map data for a game in which a plurality of players compete in a three-dimensional space to extract positional information on each player. A feature parameter extraction unit extracts a feature parameter related to the game. A spectating area analysis unit analyzes one or more areas in a map that should be viewed by spectators, based on the positional information on each player and the feature parameter related to the game. A map data generation unit generates spectating map data by associating information indicating the area that should be viewed by spectators with the map data.
    Type: Application
    Filed: March 9, 2021
    Publication date: September 9, 2021
    Inventors: Hisashi OKA, Hideki AIBA, Yuya TAKEHARA, Ryouji HOSHI, Shingo KIDA, Yincheng YANG, Hideya TSUJII, Daisuke HACHIRI, Ryotaro FUTAMURA