Patents by Inventor Masataro Asai

Masataro Asai has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11526729
    Abstract: A method is provided for detecting a higher-level action from one or more trajectories of real states. The trajectories are based on an expert's action demonstration. The method trains predictors to predict future states, each predictor corresponding to a different duration of the higher-level action to be detected. Using the predictors, the method predicts the future states from past real states in the one or more trajectories. The method then determines whether any of the predicted future states matches the real future state at the corresponding duration in the one or more trajectories. Responsive to a match, the method outputs a pair that includes the matching predicted future state as a prediction input and the real future state at that duration as the corresponding higher-level action. (A hedged code sketch of this detection loop appears after this listing.)
    Type: Grant
    Filed: May 22, 2019
    Date of Patent: December 13, 2022
    Assignee: International Business Machines Corporation
    Inventors: Michiaki Tatsubori, Roland Everett Fall, III, Don Joven R. Agravante, Masataro Asai, Asim Munawar
  • Publication number: 20220198324
    Abstract: Techniques for generating and/or training one or more symbolic models are provided. For example, one or more embodiments described herein can comprise a system with a memory that can store computer-executable components and a processor, operably coupled to the memory, that can execute the computer-executable components stored in the memory. The computer-executable components can comprise a training component that can train a symbolic model via active machine learning. The symbolic model can characterize a formal planning language for a planning domain as a plurality of digital image sequences. (A hedged sketch of such an active-learning loop appears after this listing.)
    Type: Application
    Filed: December 23, 2020
    Publication date: June 23, 2022
    Inventors: Akihiro Kishimoto, Masataro Asai, Yufang Hou, Hiroshi Kajino, Radu Marinescu
  • Patent number: 11244227
    Abstract: A discrete neural network is trained by training a neural network having an output layer so as to output discrete values. The output layer includes a plurality of nodes, each node corresponding to one of a plurality of classes. The training includes activating the nodes by priority according to the corresponding class. (A hedged sketch of one reading of this priority scheme appears after this listing.)
    Type: Grant
    Filed: March 1, 2019
    Date of Patent: February 8, 2022
    Assignee: International Business Machines Corporation
    Inventor: Masataro Asai
  • Publication number: 20200372323
    Abstract: A method is provided for detecting a higher-level action from one or more trajectories of real states. The trajectories are based on an expert's action demonstration. The method trains predictors to predict future states, each predictor corresponding to a different duration of the higher-level action to be detected. Using the predictors, the method predicts the future states from past real states in the one or more trajectories. The method then determines whether any of the predicted future states matches the real future state at the corresponding duration in the one or more trajectories. Responsive to a match, the method outputs a pair that includes the matching predicted future state as a prediction input and the real future state at that duration as the corresponding higher-level action.
    Type: Application
    Filed: May 22, 2019
    Publication date: November 26, 2020
    Inventors: Michiaki Tatsubori, Roland Everett Fall, III, Don Joven R. Agravante, Masataro Asai, Asim Munawar
  • Publication number: 20200311554
    Abstract: Permutation-invariant neural networks are trained as follows: a pairwise distance is calculated between each element of a first data set and each element of a second data set; each pairwise distance is normalized with a normalizing function to obtain a corresponding normalized value; for each element of the second data set, the summation of the normalized values of all pairwise distances between that element and each element of the first data set is de-normalized with a de-normalizing function to obtain a first value; a summation of the first values over all elements of the second data set is estimated; and a neural network is trained using at least that summation as an optimization metric. (A hedged sketch of this set distance appears after this listing.)
    Type: Application
    Filed: March 27, 2019
    Publication date: October 1, 2020
    Inventor: Masataro Asai
  • Publication number: 20200279164
    Abstract: A discrete neural network is trained by training a neural network having an output layer so as to output discrete values. The output layer includes a plurality of nodes, each node corresponding to one of a plurality of classes. The training includes activating the nodes by priority according to the corresponding class.
    Type: Application
    Filed: March 1, 2019
    Publication date: September 3, 2020
    Inventor: Masataro Asai
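
Code sketches

The abstract of patent 11526729 describes duration-specific predictors whose successful predictions reveal higher-level actions. Below is a minimal sketch of that detection loop, assuming numeric state vectors, a hypothetical `predictors` mapping from duration to prediction function, and a simple distance threshold for matching; none of these concrete choices come from the patent itself.

```python
import numpy as np

def detect_higher_level_actions(trajectory, predictors, tolerance=1e-3):
    """Scan a trajectory of real states with duration-specific predictors.

    trajectory : list of state vectors (np.ndarray)
    predictors : dict mapping a duration d (in steps) to a callable that
                 predicts the state d steps ahead from a past state

    Whenever a prediction matches the real state d steps later, the pair
    (prediction input, real future state) is emitted as a detected
    higher-level action of duration d.
    """
    detections = []
    for t, past_state in enumerate(trajectory):
        for duration, predictor in predictors.items():
            if t + duration >= len(trajectory):
                continue  # no real future state to compare against
            predicted = predictor(past_state)
            real_future = trajectory[t + duration]
            if np.linalg.norm(predicted - real_future) < tolerance:
                detections.append((predicted, real_future, duration))
    return detections
```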
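
Publication 20220198324 describes training a symbolic model via active machine learning over digital image sequences. The sketch below shows a generic uncertainty-sampling loop under that reading; the helper interfaces `fit`, `uncertainty`, and the labeling `oracle` are hypothetical stand-ins, not components named in the publication.

```python
def active_train(symbolic_model, labeled, unlabeled, oracle, rounds=10):
    """Generic active-learning loop for a symbolic model.

    labeled   : list of (image_sequence, label) pairs
    unlabeled : list of image sequences not yet labeled
    oracle    : callable returning the ground-truth label for a sequence
    """
    for _ in range(rounds):
        symbolic_model.fit(labeled)                 # retrain on the current labels
        if not unlabeled:
            break
        # query the image sequence the current model is least certain about
        query = max(unlabeled, key=symbolic_model.uncertainty)
        unlabeled.remove(query)
        labeled.append((query, oracle(query)))      # ask the oracle and grow the set
    symbolic_model.fit(labeled)
    return symbolic_model
```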
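
Patent 11244227 covers a discrete output layer whose nodes are activated by priority according to their class. One possible reading, sketched below in PyTorch, adds a priority-weighted penalty so that low-priority nodes are activated only when needed; the linear head, softmax relaxation, and penalty form are assumptions for illustration, not the patented construction.

```python
import torch
import torch.nn as nn

class PrioritizedDiscreteHead(nn.Module):
    """Output layer with one node per class and a per-class priority cost."""

    def __init__(self, in_features, num_classes):
        super().__init__()
        self.linear = nn.Linear(in_features, num_classes)
        # lower class index = higher priority = cheaper to activate
        self.register_buffer(
            "priority_cost", torch.arange(num_classes, dtype=torch.float32))

    def forward(self, x):
        logits = self.linear(x)
        probs = torch.softmax(logits, dim=-1)   # relaxed discrete activation
        # expected priority cost of the activated node; add this to the task loss
        penalty = (probs * self.priority_cost).sum(dim=-1).mean()
        return probs, penalty
```

At inference time the discrete value can be read off with `probs.argmax(dim=-1)`, while the penalty term only influences training.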
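
Publication 20200311554 specifies a permutation-invariant training metric built from normalized pairwise distances. The sketch below instantiates it with exp(-d) as the normalizing function and -log(.) as the de-normalizing function, which yields a smooth soft-minimum set distance; these concrete function choices are assumptions, since the abstract leaves them open.

```python
import torch

def soft_set_distance(first, second):
    """Permutation-invariant distance between two sets of D-dimensional elements.

    first  : (N, D) tensor, elements of the first data set
    second : (M, D) tensor, elements of the second data set
    """
    pairwise = torch.cdist(second, first)             # (M, N) pairwise distances
    normalized = torch.exp(-pairwise)                 # normalize each distance
    first_values = -torch.log(normalized.sum(dim=1))  # de-normalize per-element sums
    return first_values.sum()                         # summation used as the optimization metric
```

Because the metric depends on the two sets only through sums over their elements, reordering either set leaves it unchanged, which is what makes the training permutation-invariant.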