Patents Assigned to Preferred Networks, Inc.
  • Publication number: 20240136028
    Abstract: An information processing system includes a first information processing device and a second information processing device. The second information processing device is configured to transmit atomic information to the first information processing device. The first information processing device is configured to receive the atomic information from the second information processing device, calculate a processing result corresponding to the atomic information by inputting the atomic information into a neural network, and transmit the processing result to the second information processing device.
    Type: Application
    Filed: December 8, 2023
    Publication date: April 25, 2024
    Applicants: Preferred Networks, Inc., ENEOS Corporation
    Inventors: Kosuke NAKAGO, Daisuke TANIWAKI, Motoki ABE, Marc Alan ONG, So TAKAMOTO, Takao KUDO, Yusuke ASANO
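The abstract above describes a split in which one device holds the neural network while another supplies atomic information and receives the result. The following is a minimal in-process sketch of that split, assuming a toy PyTorch network and hypothetical class names; an actual system would exchange the data over a network rather than through a direct function call.

```python
# Minimal sketch: a "second device" sends atomic information, a "first device"
# runs it through a neural network and returns the processing result.
# The class names and the toy network are illustrative only.
import torch
import torch.nn as nn

class FirstDevice:
    """Holds the neural network and computes results for received atomic info."""
    def __init__(self, n_features: int):
        self.model = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(), nn.Linear(64, 1))

    def handle_request(self, atomic_info: torch.Tensor) -> torch.Tensor:
        with torch.no_grad():
            return self.model(atomic_info)        # processing result

class SecondDevice:
    """Produces atomic information and consumes the returned result."""
    def __init__(self, server: FirstDevice):
        self.server = server

    def run(self, positions: torch.Tensor) -> torch.Tensor:
        # "transmit" the atomic information and receive the processing result
        return self.server.handle_request(positions)

server = FirstDevice(n_features=3)
client = SecondDevice(server)
print(client.run(torch.rand(5, 3)))               # 5 atoms, xyz features
```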
  • Publication number: 20240127533
    Abstract: An inferring device includes one or more memories and one or more processors. The one or more processors are configured to obtain three-dimensional structures of a plurality of molecules; and input the three-dimensional structures of the plurality of molecules into a neural network model and infer one or more physical properties of the plurality of molecules.
    Type: Application
    Filed: December 8, 2023
    Publication date: April 18, 2024
    Applicants: Preferred Networks, Inc., ENEOS Corporation
    Inventors: So TAKAMOTO, Wenwen LI, Yusuke ASANO, Takafumi ISHII
  • Publication number: 20240127028
    Abstract: An information processing device includes one or more memories and one or more processors. The one or more processors are configured to receive information on a plurality of graphs from one or more second information processing devices; select a plurality of graphs which are simultaneously processable using a graph neural network model among the plurality of graphs; input information on the plurality of graphs which are simultaneously processable into the graph neural network model and simultaneously process the information on the plurality of graphs which are simultaneously processable to acquire a processing result for each of the plurality of graphs which are simultaneously processable; and transmit the processing result to the second information processing device which has transmitted the corresponding information on the graph.
    Type: Application
    Filed: December 8, 2023
    Publication date: April 18, 2024
    Applicant: Preferred Networks, Inc.
    Inventor: Daisuke TANIWAKI
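The device above batches only those graphs that a graph neural network can process together. Below is a minimal sketch of one possible grouping rule (equal node counts, chosen purely so the graphs stack into one tensor) and a single batched pass; the grouping criterion and the toy one-layer GNN are assumptions, not the claimed method.

```python
# Minimal sketch: among incoming graphs, pick those that stack into one tensor
# (same node count as a stand-in for "simultaneously processable"), run them
# through a toy one-layer GNN in a single pass, and return each sender's result.
import numpy as np

rng = np.random.default_rng(0)

def make_graph(n_nodes, n_feat=4):
    return {"adj": rng.integers(0, 2, (n_nodes, n_nodes)).astype(float),
            "feat": rng.normal(size=(n_nodes, n_feat))}

def select_batchable(graphs):
    """Group graph indices by node count so each group stacks into one batch."""
    groups = {}
    for i, g in enumerate(graphs):
        groups.setdefault(g["feat"].shape[0], []).append(i)
    return max(groups.values(), key=len)          # largest simultaneously processable set

def gnn_batch(graphs, weight):
    adj = np.stack([g["adj"] for g in graphs])    # (B, N, N)
    feat = np.stack([g["feat"] for g in graphs])  # (B, N, F)
    hidden = np.maximum(adj @ feat @ weight, 0.0) # one message-passing layer + ReLU
    return hidden.mean(axis=(1, 2))               # one scalar result per graph

incoming = [make_graph(n) for n in (5, 7, 5, 5, 7)]
batch_idx = select_batchable(incoming)
weight = rng.normal(size=(4, 8))
results = gnn_batch([incoming[i] for i in batch_idx], weight)
for i, r in zip(batch_idx, results):              # "transmit" result back to each sender
    print(f"graph from sender {i}: result {r:.3f}")
```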
  • Publication number: 20240129368
    Abstract: A server device configured to communicate, via a communication network, with at least one device including a learner configured to perform processing by using a learned model, includes a processor, a transmitter, and a storage configured to store a plurality of shared models pre-learned in accordance with environments and conditions of various devices. The processor is configured to acquire device data including information on an environment and conditions from the at least one device, and select an optimum shared model for the at least one device based on the acquired device data. The transmitter is configured to transmit the selected shared model to the at least one device.
    Type: Application
    Filed: December 7, 2023
    Publication date: April 18, 2024
    Applicant: Preferred Networks, Inc.
    Inventors: Keigo Kawaai, Shohei Hido, Nobuyuki Kubota, Daisuke Tanaka
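The server above selects a pre-learned shared model from stored candidates using device data about the device's environment and conditions. A minimal sketch of such a selection step follows, assuming a nearest-match rule over a small hypothetical registry; the matching rule is an illustration, not the patented selection logic.

```python
# Minimal sketch of selecting a pre-learned shared model from device metadata.
from dataclasses import dataclass

@dataclass
class SharedModel:
    name: str
    environment: dict          # conditions the model was pre-learned for

REGISTRY = [
    SharedModel("indoor_low_vibration", {"temperature": 22.0, "vibration": 0.1}),
    SharedModel("outdoor_high_vibration", {"temperature": 35.0, "vibration": 0.8}),
]

def select_shared_model(device_data: dict) -> SharedModel:
    """Pick the stored model whose environment descriptor is closest to the device's."""
    def distance(model):
        return sum((model.environment[k] - device_data[k]) ** 2 for k in model.environment)
    return min(REGISTRY, key=distance)

def transmit(model: SharedModel, device_id: str) -> None:
    print(f"sending {model.name} to device {device_id}")    # stand-in for the transmitter

device_report = {"temperature": 33.0, "vibration": 0.7}      # acquired device data
transmit(select_shared_model(device_report), device_id="edge-01")
```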
  • Publication number: 20240127121
    Abstract: A training device includes a processor. The processor inputs a first atomic structure including a surface and an adsorbed molecule close to the surface into a model to obtain an energy outputted from the model in response to the input, and obtains a first error based on the outputted energy of the first atomic structure and a ground truth value of the energy of the first atomic structure; inputs a fourth atomic structure including a cluster and an adsorbed molecule close to the cluster into the model to obtain an energy outputted from the model in response to the input, and obtains a fourth error based on the outputted energy of the fourth atomic structure and a ground truth value of the energy of the fourth atomic structure; and updates a parameter of the model based on the first and fourth errors. The surface and the cluster include the same atom.
    Type: Application
    Filed: December 8, 2023
    Publication date: April 18, 2024
    Applicant: Preferred Networks, Inc.
    Inventors: Chikashi SHINAGAWA, So TAKAMOTO, Iori KURATA
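The training step above pairs an error from a surface-plus-adsorbate structure with an error from a cluster-plus-adsorbate structure and updates the model with both. A minimal PyTorch sketch of one such step follows; the per-atom feature encoding and the network are illustrative assumptions.

```python
# Minimal training-step sketch: score a surface+adsorbate structure and a
# cluster+adsorbate structure with the same energy model, form an error against
# each ground-truth energy, and update the model with both errors.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 32), nn.SiLU(), nn.Linear(32, 1))  # energy model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(surface_feat, surface_energy, cluster_feat, cluster_energy):
    optimizer.zero_grad()
    first_error = loss_fn(model(surface_feat).sum(), surface_energy)    # surface + adsorbate
    fourth_error = loss_fn(model(cluster_feat).sum(), cluster_energy)   # cluster + adsorbate
    (first_error + fourth_error).backward()                             # update by both errors
    optimizer.step()
    return first_error.item(), fourth_error.item()

# toy structures: per-atom feature rows, summed into a total energy
surface = torch.randn(20, 8); cluster = torch.randn(12, 8)
print(train_step(surface, torch.tensor(-150.0), cluster, torch.tensor(-90.0)))
```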
  • Publication number: 20240112764
    Abstract: An information processing device includes one or more processors. The one or more processors are configured to optimize, for a specific elementary reaction in a reaction using a catalyst including a plurality of elementary reactions, an arrangement of a promoter element in the catalyst based on activation energy acquired using a trained model, and search for the promoter element based on the activation energy acquired using the trained model for each type of the promoter element.
    Type: Application
    Filed: December 8, 2023
    Publication date: April 4, 2024
    Applicants: ENEOS Corporation, Preferred Networks, Inc.
    Inventors: Yoshihiro YAYAMA, Yusuke ASANO, Takafumi ISHII, Takao KUDO, Taku WATANABE, Ryohto SAWADA
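The search above has two levels: optimize the promoter arrangement for a fixed element, then compare elements by the activation energy a trained model predicts. Below is a minimal sketch of that loop with a placeholder predictor standing in for the trained model; the site layout and candidate elements are hypothetical.

```python
# Minimal sketch of the two-level search: for each candidate promoter element,
# optimize its arrangement against an activation-energy predictor, then rank
# the elements by their best predicted activation energy.
import itertools
import random

SITES = range(6)                                   # possible promoter positions in the catalyst

def predicted_activation_energy(element: str, arrangement: tuple) -> float:
    """Placeholder for a trained model's prediction (deterministic pseudo-values)."""
    random.seed(hash((element, arrangement)) % (2**32))
    return random.uniform(0.5, 2.0)                # eV, illustrative

def best_arrangement(element: str):
    candidates = itertools.combinations(SITES, 2)  # place the promoter on two sites
    return min((predicted_activation_energy(element, a), a) for a in candidates)

results = {el: best_arrangement(el) for el in ("K", "Cs", "Ba")}
best_element = min(results, key=lambda el: results[el][0])
print(results, "->", best_element)
```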
  • Publication number: 20240111998
    Abstract: An inferring device includes one or more memories and one or more processors. The one or more processors are configured to input information on each atom in an atomic system into a second model to infer a difference between energy based on a first-principles calculation corresponding to the atomic system and energy of an interatomic potential function corresponding to the atomic system.
    Type: Application
    Filed: December 8, 2023
    Publication date: April 4, 2024
    Applicant: Preferred Networks, Inc.
    Inventors: Daisuke MOTOKI, Chikashi SHINAGAWA, So TAKAMOTO, Hiroki IRIGUCHI
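The abstract above describes a delta-learning setup: a network learns only the gap between a first-principles energy and a cheaper interatomic potential. A minimal sketch follows, with a toy pair potential and network standing in for the real ones.

```python
# Minimal sketch of delta learning: a cheap interatomic potential gives a
# baseline energy, and a network is trained to predict only the difference to
# the first-principles (e.g. DFT) value.
import torch
import torch.nn as nn

def pair_potential_energy(pos):                       # toy Lennard-Jones-like baseline
    d = torch.cdist(pos, pos) + torch.eye(len(pos))   # pad diagonal to avoid division by zero
    e = 4.0 * ((1.0 / d) ** 12 - (1.0 / d) ** 6)
    return (e.sum() - e.diag().sum()) / 2.0           # sum over distinct pairs

correction_net = nn.Sequential(nn.Linear(3, 32), nn.Tanh(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(correction_net.parameters(), lr=1e-3)

def train_step(pos, dft_energy):
    optimizer.zero_grad()
    target_delta = dft_energy - pair_potential_energy(pos)     # what the net should learn
    predicted_delta = correction_net(pos).sum()                # per-atom corrections, summed
    loss = (predicted_delta - target_delta) ** 2
    loss.backward()
    optimizer.step()
    return loss.item()

positions = torch.rand(8, 3) * 3.0
print(train_step(positions, dft_energy=torch.tensor(-42.0)))
```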
  • Publication number: 20240105288
    Abstract: An inferring device includes one or more processors. The one or more processors are configured to acquire an output from a neural network model based on information related to an atomic structure and label information related to an atomic simulation, wherein the neural network model is trained to infer a simulation result with respect to the atomic structure generated by the atomic simulation corresponding to the label information.
    Type: Application
    Filed: December 8, 2023
    Publication date: March 28, 2024
    Applicant: Preferred Networks, Inc.
    Inventors: So TAKAMOTO, Chikashi SHINAGAWA
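The model above is conditioned on label information identifying the atomic simulation whose result it should reproduce. A minimal sketch of such conditioning follows, assuming a one-hot simulation label concatenated to the structure features; the label set and feature size are hypothetical.

```python
# Minimal sketch: structure features plus a one-hot label of the target atomic
# simulation are fed to one network, which returns the inferred result for
# that simulation.
import torch
import torch.nn as nn

LABELS = ["molecular_dynamics", "geometry_optimization", "vibrational_analysis"]

model = nn.Sequential(nn.Linear(16 + len(LABELS), 64), nn.ReLU(), nn.Linear(64, 1))

def infer(structure_features: torch.Tensor, label: str) -> torch.Tensor:
    one_hot = torch.zeros(len(LABELS))
    one_hot[LABELS.index(label)] = 1.0
    x = torch.cat([structure_features, one_hot])      # structure info + simulation label
    with torch.no_grad():
        return model(x)

print(infer(torch.rand(16), "geometry_optimization"))
```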
  • Publication number: 20240079099
    Abstract: An inferring device comprises one or more memories and one or more processors. The one or more processors decide an action based on a tree representation including a node and an edge of a molecular graph and on a trained model trained through reinforcement learning, and generate a state including information on the molecular graph based on the action, wherein the edge has connection information on the nodes.
    Type: Application
    Filed: November 10, 2023
    Publication date: March 7, 2024
    Applicant: Preferred Networks, Inc.
    Inventors: Ryuichiro ISHITANI, Toshiki KATAOKA
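The device above alternates between choosing an action with a reinforcement-learned model and generating a new molecular-graph state from that action. A minimal sketch of that loop follows, with a random policy standing in for the trained model and a hypothetical action format.

```python
# Minimal sketch of the action/state loop: a policy (stand-in for the trained
# model) picks an action on a molecular-graph tree, and the state is regenerated
# with the new node and its connection edge.
import random

random.seed(0)
ATOMS = ["C", "N", "O"]

def policy(state):
    """Stand-in for the trained model: choose which atom to attach and where."""
    parent = random.randrange(len(state["nodes"]))
    return {"attach_to": parent, "atom": random.choice(ATOMS)}

def apply_action(state, action):
    """Generate the next state: add a node and an edge carrying connection info."""
    new_index = len(state["nodes"])
    return {"nodes": state["nodes"] + [action["atom"]],
            "edges": state["edges"] + [(action["attach_to"], new_index)]}

state = {"nodes": ["C"], "edges": []}
for _ in range(4):
    state = apply_action(state, policy(state))
print(state)
```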
  • Patent number: 11921566
    Abstract: A method and system that efficiently select sensors without requiring advanced expertise or extensive experience, even in the case of new machines and unknown failures. An abnormality detection system includes a storage unit for storing a latent variable model and a joint probability model, an acquisition unit for acquiring sensor data output by a sensor, a measurement unit for measuring the probability of the sensor data acquired by the acquisition unit based on the latent variable model and the joint probability model stored in the storage unit, a determination unit for determining whether the sensor data is normal or abnormal based on the probability measured by the measurement unit, and a learning unit for learning the latent variable model and the joint probability model based on the sensor data output by the sensor.
    Type: Grant
    Filed: April 8, 2022
    Date of Patent: March 5, 2024
    Assignee: PREFERRED NETWORKS, INC.
    Inventors: Daisuke Okanohara, Kenta Oono
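The system above learns a probabilistic model of normal sensor data, measures the probability of new readings, and flags low-probability readings as abnormal. A minimal sketch follows, with a single multivariate Gaussian standing in for the latent variable and joint probability models; the threshold rule is an assumption.

```python
# Minimal sketch of the probability-based abnormality check: learn a simple
# probabilistic model of normal sensor data, measure the probability of new
# readings, and flag low-probability readings as abnormal.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
normal_data = rng.normal(loc=[20.0, 1.0, 50.0], scale=[1.0, 0.05, 2.0], size=(500, 3))

# "learning unit": fit the model from sensor data
model = multivariate_normal(mean=normal_data.mean(axis=0), cov=np.cov(normal_data.T))
threshold = np.quantile(model.logpdf(normal_data), 0.01)   # lowest 1% of training data flagged

def is_abnormal(reading):
    """'measurement unit' + 'determination unit': score the reading, compare to threshold."""
    return model.logpdf(reading) < threshold

print(is_abnormal([20.2, 1.01, 49.5]))   # typical reading -> False
print(is_abnormal([27.0, 1.50, 40.0]))   # far from normal operation -> True
```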
  • Patent number: 11922307
    Abstract: With respect to an inference method performed by at least one processor, the method includes inputting, by the at least one processor, into a learned model, non-processed object image data of a second object and data related to a second process for the second object, and inferring, by the at least one processor using the learned model, processed object image data of the second object on which the second process has been performed. The learned model has been trained so that an output obtained in response to non-processed object image data of a first object and data related to a first process for the first object being input approaches processed object image data of the first object on which the first process has been performed.
    Type: Grant
    Filed: March 2, 2021
    Date of Patent: March 5, 2024
    Assignees: Preferred Networks, Inc., Tokyo Electron Limited
    Inventors: Kosuke Nakago, Daisuke Motoki, Masaki Watanabe, Tomoki Komatsu, Hironori Moki, Masanobu Honda, Takahiko Kato, Tomohiko Niizeki
  • Patent number: 11915146
    Abstract: There is provided an information processing device which efficiently executes machine learning. The information processing device according to one embodiment includes: an obtaining unit which obtains a source code including a code which defines Forward processing of each layer constituting a neural network; a storage unit which stores an association relationship between each Forward processing and Backward processing associated with each Forward processing; and an executing unit which successively executes each code included in the source code, and which calculates an output value of the Forward processing defined by the code based on an input value at a time of execution of each code, and generates a reference structure for Backward processing in a layer associated with the code based on the association relationship stored in the storage unit.
    Type: Grant
    Filed: November 11, 2022
    Date of Patent: February 27, 2024
    Assignee: PREFERRED NETWORKS, INC.
    Inventors: Seiya Tokui, Yuya Unno, Kenta Oono, Ryosuke Okuta
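The device above executes forward code as ordinary program statements while recording, from a stored forward-to-backward association, the reference structure that backward computation later walks. The toy scalar autodiff below sketches that define-by-run idea; it is not the patented implementation.

```python
# Minimal sketch of define-by-run: forward code is ordinary Python, and each
# executed operation looks up its associated backward rule and links itself
# into a reference structure that backward() can later walk.
class Var:
    def __init__(self, value, parents=(), backward_rule=None):
        self.value, self.parents, self.backward_rule, self.grad = value, parents, backward_rule, 0.0

# stored association between each Forward processing and its Backward processing
BACKWARD = {
    "add": lambda out, a, b: [(a, out.grad), (b, out.grad)],
    "mul": lambda out, a, b: [(a, out.grad * b.value), (b, out.grad * a.value)],
}

def apply(op, a, b):
    value = a.value + b.value if op == "add" else a.value * b.value     # Forward processing
    return Var(value, parents=(a, b), backward_rule=BACKWARD[op])       # reference structure

def backward(out):
    out.grad = 1.0
    stack = [out]
    while stack:
        v = stack.pop()
        if v.backward_rule is None:
            continue
        for parent, g in v.backward_rule(v, *v.parents):
            parent.grad += g
            stack.append(parent)

x, w = Var(3.0), Var(2.0)
y = apply("add", apply("mul", x, w), w)      # y = x*w + w, graph built while the code runs
backward(y)
print(x.grad, w.grad)                        # 2.0, 4.0  (dy/dx = w, dy/dw = x + 1)
```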
  • Patent number: 11915344
    Abstract: An apparatus and a method for coloring a line drawing are disclosed for: acquiring line drawing data; reducing the line drawing data to a predetermined size to obtain reduced line drawing data; coloring the reduced line drawing data based on a first learned model trained in advance using sample data; and coloring the original line drawing data, with the colored reduced data and the original line drawing data as inputs, based on a second learned model trained in advance.
    Type: Grant
    Filed: April 23, 2021
    Date of Patent: February 27, 2024
    Assignee: Preferred Networks, Inc.
    Inventor: Taizan Yonetsuji
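The pipeline above colors a reduced copy of the line drawing first and then colors the full-resolution drawing using both the original lines and the rough coloring. A minimal sketch follows, with placeholder functions standing in for the two learned models.

```python
# Minimal sketch of the two-stage flow: color a reduced copy of the line art
# first, then color the full-resolution line art using the original lines and
# the upscaled rough coloring as inputs.
import numpy as np

def reduce_image(line_art, factor=4):
    h, w = line_art.shape
    return line_art[:h - h % factor, :w - w % factor] \
        .reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def first_model(reduced_line_art):
    """Stand-in for the first learned model: rough colors for the reduced image."""
    return np.stack([reduced_line_art, reduced_line_art * 0.5, 1 - reduced_line_art], axis=-1)

def second_model(line_art, rough_color):
    """Stand-in for the second learned model: refine colors at full resolution."""
    upscaled = rough_color.repeat(4, axis=0).repeat(4, axis=1)
    return 0.7 * upscaled + 0.3 * line_art[..., None]          # blend the lines back in

line_art = np.random.rand(256, 256)                            # grayscale line drawing
colored = second_model(line_art, first_model(reduce_image(line_art)))
print(colored.shape)                                           # (256, 256, 3)
```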
  • Patent number: 11904469
    Abstract: A machine learning device for a robot that allows a human and the robot to work cooperatively, the machine learning device including a state observation unit that observes a state variable representing a state of the robot during a period in which the human and the robot work cooperatively; a determination data obtaining unit that obtains determination data for at least one of a level of burden on the human and a working efficiency; and a learning unit that learns a training data set for setting an action of the robot, based on the state variable and the determination data.
    Type: Grant
    Filed: September 17, 2020
    Date of Patent: February 20, 2024
    Assignees: FANUC CORPORATION, PREFERRED NETWORKS, INC.
    Inventors: Taketsugu Tsuda, Daisuke Okanohara, Ryosuke Okuta, Eiichi Matsumoto, Keigo Kawaai
  • Patent number: 11900225
    Abstract: A computer system for generating information regarding a chemical compound includes one or more memories and one or more processors configured to generate information regarding a chemical compound based on a latent variable, and to evaluate the generated information regarding the chemical compound based on desired characteristics, wherein generation of the information regarding the chemical compound is restricted by the desired characteristics.
    Type: Grant
    Filed: August 24, 2020
    Date of Patent: February 13, 2024
    Assignee: Preferred Networks, Inc.
    Inventors: Kenta Oono, Justin Clayton, Nobuyuki Ota
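The system above generates compound information from a latent variable and restricts generation by the desired characteristics. A minimal sketch of a generate-then-filter loop follows; the decoder and property predictor are placeholders, not the claimed models.

```python
# Minimal sketch: draw latent variables, decode them into candidate compound
# descriptors, and keep only candidates whose predicted characteristics satisfy
# the desired range.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))                     # fixed stand-in decoder weights

def decode(latent):
    """Stand-in decoder: latent vector -> compound descriptor."""
    return np.tanh(latent @ W)

def predict_properties(descriptor):
    """Stand-in property model: descriptor -> (solubility, toxicity)."""
    return descriptor[:2] * [5.0, 1.0] + [0.0, 0.5]

def generate(desired, n_samples=200):
    accepted = []
    for _ in range(n_samples):
        candidate = decode(rng.normal(size=8))
        solubility, toxicity = predict_properties(candidate)
        if solubility > desired["min_solubility"] and toxicity < desired["max_toxicity"]:
            accepted.append(candidate)          # generation restricted by characteristics
    return accepted

hits = generate({"min_solubility": 2.0, "max_toxicity": 0.6})
print(f"{len(hits)} candidates satisfied the desired characteristics")
```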
  • Patent number: 11902369
    Abstract: An autoencoder includes a memory configured to store data including an encode network and a decode network, and processing circuitry coupled to the memory. The processing circuitry is configured to cause the encode network to convert inputted data to a plurality of values and output the plurality of values, batch-normalize values indicated by at least two or more layers of the encode network, out of the plurality of output values, such that the batch-normalized values have a predetermined average value and a predetermined variance value, quantize each of the batch-normalized values, and cause the decode network to decode each of the quantized values.
    Type: Grant
    Filed: February 8, 2019
    Date of Patent: February 13, 2024
    Assignee: Preferred Networks, Inc.
    Inventors: Ken Nakanishi, Shinichi Maeda
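The pipeline above batch-normalizes latent values to a fixed mean and variance, quantizes them, and decodes the quantized values. A minimal inference-time PyTorch sketch follows; the layer sizes and the rounding step are assumptions.

```python
# Minimal inference-time sketch: encode, batch-normalize the latent values,
# quantize them, and decode the quantized values.
import torch
import torch.nn as nn

class QuantizingAutoencoder(nn.Module):
    def __init__(self, dim=16, latent=4):
        super().__init__()
        self.encode = nn.Sequential(nn.Linear(dim, 32), nn.ReLU(), nn.Linear(32, latent))
        self.norm = nn.BatchNorm1d(latent)             # fixed target mean/variance
        self.decode = nn.Sequential(nn.Linear(latent, 32), nn.ReLU(), nn.Linear(32, dim))

    def forward(self, x):
        z = self.norm(self.encode(x))                  # batch-normalized latent values
        z_q = torch.round(z * 4) / 4                   # quantize to steps of 0.25
        return self.decode(z_q), z_q

model = QuantizingAutoencoder().eval()
with torch.no_grad():
    reconstruction, codes = model(torch.rand(8, 16))
print(codes)
```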
  • Publication number: 20240046104
    Abstract: There is provided an information processing device which efficiently executes machine learning. The information processing device according to one embodiment includes: an obtaining unit which obtains a source code including a code which defines Forward processing of each layer constituting a neural network; a storage unit which stores an association relationship between each Forward processing and Backward processing associated with each Forward processing; and an executing unit which successively executes each code included in the source code, and which calculates an output value of the Forward processing defined by the code based on an input value at a time of execution of each code, and generates a reference structure for Backward processing in a layer associated with the code based on the association relationship stored in the storage unit.
    Type: Application
    Filed: October 16, 2023
    Publication date: February 8, 2024
    Applicant: Preferred Networks, Inc.
    Inventors: Seiya TOKUI, Yuya UNNO, Kenta OONO, Ryosuke OKUTA
  • Patent number: 11874723
    Abstract: A method and system that efficiently select sensors without requiring advanced expertise or extensive experience, even in the case of new machines and unknown failures. An abnormality detection system includes a storage unit for storing a latent variable model and a joint probability model, an acquisition unit for acquiring sensor data output by a sensor, a measurement unit for measuring the probability of the sensor data acquired by the acquisition unit based on the latent variable model and the joint probability model stored in the storage unit, a determination unit for determining whether the sensor data is normal or abnormal based on the probability measured by the measurement unit, and a learning unit for learning the latent variable model and the joint probability model based on the sensor data output by the sensor.
    Type: Grant
    Filed: April 8, 2022
    Date of Patent: January 16, 2024
    Assignee: PREFERRED NETWORKS, INC.
    Inventors: Daisuke Okanohara, Kenta Oono
  • Publication number: 20240005070
    Abstract: According to one embodiment, an inference device includes at least one memory and at least one processor. The at least one processor performs a computation for geometry optimization of a substance by a first algorithm. After a predetermined condition is satisfied, the at least one processor performs, based on a result of the computation by the first algorithm, a geometry optimization of the substance by a second algorithm different from the first algorithm.
    Type: Application
    Filed: June 28, 2023
    Publication date: January 4, 2024
    Applicant: Preferred Networks, Inc.
    Inventor: Akihide HAYASHI
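The device above runs one geometry-optimization algorithm until a condition is met and then continues with a different algorithm from that intermediate result. A minimal SciPy sketch follows; the toy potential and the CG-then-BFGS pairing are illustrative assumptions, not the patented combination.

```python
# Minimal sketch of two-stage geometry optimization: run one optimizer to a
# loose tolerance, then continue from its result with a different algorithm.
import numpy as np
from scipy.optimize import minimize

def energy(flat_positions):
    """Toy pair potential over atomic positions (N x 3, flattened)."""
    pos = flat_positions.reshape(-1, 3)
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1) + np.eye(len(pos)))     # pad diagonal to avoid 0
    return np.sum(np.triu((1.0 / dist) ** 12 - 2.0 * (1.0 / dist) ** 6, k=1))

x0 = np.random.default_rng(0).uniform(0, 2, size=12)           # 4 atoms

stage1 = minimize(energy, x0, method="CG", options={"gtol": 1e-2})          # first algorithm
stage2 = minimize(energy, stage1.x, method="BFGS", options={"gtol": 1e-8})  # second algorithm
print(stage1.fun, "->", stage2.fun)
```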
  • Patent number: 11845194
    Abstract: To select a picking position of a workpiece by a simpler method. A robot system includes a three-dimensional measuring device for generating a range image of a plurality of workpieces, a robot having a hand for picking up at least one of the plurality of workpieces, a display part for displaying the range image generated by the three-dimensional measuring device, and a reception part for receiving a teaching of a picking position for picking-up by the hand on the displayed range image. The robot picks up at least one of the plurality of workpieces by the hand on the basis of the taught picking position.
    Type: Grant
    Filed: August 30, 2018
    Date of Patent: December 19, 2023
    Assignees: FANUC CORPORATION, PREFERRED NETWORKS, INC.
    Inventors: Takashi Yamazaki, Daisuke Okanohara, Eiichi Matsumoto