Patents Assigned to Preferred Networks, Inc.
  • Publication number: 20210165374
    Abstract: An inference apparatus includes one or more memories; and one or more processors. The one or more processors are configured to acquire a latent state from input data regarding a control target; acquire a future latent state from the latent state and control data; infer, from the future latent state, a time series of a task to be executed by the control target to be controlled based on the control data; calculate a loss between the time series of the task and data indicating a target state; and update the control data based on the loss.
    Type: Application
    Filed: December 2, 2020
    Publication date: June 3, 2021
    Applicant: Preferred Networks, Inc.
    Inventors: Kento KAWAHARAZUKA, Toru OGAWA
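    Example sketch: a rough reading of 20210165374 is that control data is optimized by encoding the current observation into a latent state, rolling it forward under candidate control data, decoding a predicted task time series, and updating the control data against a loss to a target state. The Python below illustrates that loop with hypothetical encoder/dynamics/decoder stand-ins and a finite-difference update; none of these function names or the gradient scheme come from the patent.
    ```python
    import numpy as np

    # Hypothetical stand-ins for the learned models described in the abstract.
    def encode_latent(observation):               # input data -> latent state
        return np.tanh(observation)

    def predict_future_latent(latent, control):   # latent state + control data -> future latent state
        return np.tanh(latent + control)

    def infer_task_series(future_latent, steps=5):  # future latent state -> time series of the task
        return np.stack([future_latent * (t + 1) / steps for t in range(steps)])

    def loss(task_series, target_state):
        return float(np.mean((task_series - target_state) ** 2))

    def optimize_control(observation, target_state, iters=100, lr=0.1, eps=1e-4):
        """Update the control data so the predicted task time series approaches the target state."""
        control = np.zeros_like(observation)
        latent = encode_latent(observation)
        for _ in range(iters):
            base = loss(infer_task_series(predict_future_latent(latent, control)), target_state)
            grad = np.zeros_like(control)
            for i in range(control.size):          # numerical gradient of the loss w.r.t. control data
                bumped = control.copy()
                bumped[i] += eps
                grad[i] = (loss(infer_task_series(predict_future_latent(latent, bumped)),
                                target_state) - base) / eps
            control -= lr * grad                   # "update the control data based on the loss"
        return control

    print(optimize_control(np.array([0.2, -0.1]), target_state=np.array([0.5, 0.5])))
    ```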
  • Publication number: 20210149726
    Abstract: A scheduling device includes at least one storage and at least one processor. The at least one storage stores information regarding jobs in execution. The at least one processor is configured to accept a job, select, when an execution resource for the accepted job is not secured, at least one job with a lower priority than the accepted job from the jobs in execution as a stop candidate based on the information regarding the jobs in execution, and issue a stop instruction to the stop candidate.
    Type: Application
    Filed: January 27, 2021
    Publication date: May 20, 2021
    Applicant: Preferred Networks, Inc.
    Inventors: Ryota ARAI, Shingo OMURA, Taisuke TANIWAKI
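    Example sketch: the scheduling logic in 20210149726 amounts to: when a newly accepted job cannot secure execution resources, pick at least one lower-priority running job as a stop candidate and issue a stop instruction to it. A minimal sketch under assumed Job fields and a single-resource cluster; this is not the patented implementation.
    ```python
    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        priority: int      # larger value = higher priority (an assumption)
        resources: int     # resource units the job occupies

    def accept_job(new_job, running, capacity):
        """Return the stop candidates needed to make room for new_job, or [] if it already fits."""
        free = capacity - sum(j.resources for j in running)
        if new_job.resources <= free:
            return []                                   # execution resource already secured
        # Consider only jobs with a lower priority than the accepted job, lowest priority first.
        candidates = sorted((j for j in running if j.priority < new_job.priority),
                            key=lambda j: j.priority)
        stop, reclaimed = [], free
        for job in candidates:
            if reclaimed >= new_job.resources:
                break
            stop.append(job)                            # issue a stop instruction to this candidate
            reclaimed += job.resources
        return stop if reclaimed >= new_job.resources else []

    running = [Job("batch-a", priority=1, resources=4), Job("batch-b", priority=2, resources=2)]
    print(accept_job(Job("urgent", priority=5, resources=5), running, capacity=8))
    ```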
  • Publication number: 20210150353
    Abstract: A device is proposed that can perform processing for deciding optimal loading positions even when not all packages to be loaded are known beforehand. One embodiment of the present invention includes: at least one memory; and at least one processing circuitry. The at least one processing circuitry is configured to execute: generating loading state information regarding an object loading state of a predetermined space where a plurality of objects will be loaded, under the assumption that a first object to be loaded is loaded at a loading position candidate in the predetermined space; and inputting the loading state information into a loading state evaluation model that outputs an evaluation value for the object loading state of the predetermined space when the loading state information is input, and acquiring the evaluation value.
    Type: Application
    Filed: December 29, 2020
    Publication date: May 20, 2021
    Applicant: Preferred Networks, Inc.
    Inventors: Masaki WATANABE, Tomoki KOMATSU
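    Example sketch: 20210150353 evaluates candidate loading positions one object at a time: hypothesize placing the next object at a candidate position, encode the resulting loading state, and score it with a learned evaluation model. The sketch below uses a 2D occupancy grid and a trivial scoring function as stand-ins; the actual state representation and evaluation model are not specified here.
    ```python
    import numpy as np

    def place(grid, obj_size, position):
        """Return a copy of the occupancy grid assuming the object is loaded at `position`."""
        h, w = obj_size
        r, c = position
        if r + h > grid.shape[0] or c + w > grid.shape[1] or grid[r:r+h, c:c+w].any():
            return None                       # collision or out of bounds
        new = grid.copy()
        new[r:r+h, c:c+w] = 1
        return new

    def evaluate(loading_state):
        """Stand-in for the learned evaluation model: prefer dense, low placements."""
        filled = loading_state.sum()
        top_rows_used = loading_state[: loading_state.shape[0] // 2].sum()
        return filled - 2.0 * top_rows_used

    def best_candidate(grid, obj_size, candidates):
        scored = []
        for pos in candidates:
            state = place(grid, obj_size, pos)         # loading state information under the assumption
            if state is not None:
                scored.append((evaluate(state), pos))  # acquire the evaluation value
        return max(scored)[1] if scored else None

    grid = np.zeros((4, 6), dtype=int)
    print(best_candidate(grid, obj_size=(2, 2), candidates=[(0, 0), (2, 0), (2, 4)]))
    ```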
  • Publication number: 20210151128
    Abstract: A learning method for predicting the mixing ratio of elements, comprising causing a machine learning model to learn to output, in response to input of group expression level data indicating an expression level of each element in a group to be predicted, the mixing ratio of an element contained in the group. In the training, a virtual mixing ratio that differs among the plurality of pieces of learning data is set as desired, and a learning dataset is used that includes, for each piece of learning data, data generated by obtaining a virtual expression level corresponding to the virtual mixing ratio based on original data indicating the expression level of each element.
    Type: Application
    Filed: December 28, 2020
    Publication date: May 20, 2021
    Applicant: Preferred Networks, Inc.
    Inventors: Motoki Abe, Daisuke Okanohara, Kenta Oono, Mizuki Takemoto
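    Example sketch: the training trick in 20210151128 is to synthesize labeled examples: pick a virtual mixing ratio for each training example, mix the per-element expression profiles accordingly, and train a model to recover the ratio from the mixed profile. A minimal data-generation sketch under those assumptions; the profiles and ratios below are made up.
    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Original data: an expression profile for each element (e.g. cell type), genes x elements.
    element_profiles = rng.random((100, 3))

    def make_training_example(profiles, rng):
        """Draw a virtual mixing ratio and the corresponding virtual group expression level."""
        ratio = rng.dirichlet(np.ones(profiles.shape[1]))   # virtual mixing ratio, differs per example
        group_expression = profiles @ ratio                 # virtual expression level of the mixture
        return group_expression, ratio                      # (model input, training target)

    dataset = [make_training_example(element_profiles, rng) for _ in range(1000)]
    x, y = dataset[0]
    print(x.shape, y, y.sum())   # 100-gene mixed profile, 3-way mixing ratio summing to 1
    ```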
  • Patent number: 11010932
    Abstract: An apparatus and a method for coloring a line drawing are disclosed, comprising: acquiring line drawing data; performing reduction processing on the line drawing data to a predetermined reduced size to obtain reduced line drawing data; coloring the reduced line drawing data based on a first learned model which is learned in advance using sample data; and coloring the original line drawing data, with the colored reduced data and the original line drawing data as inputs, based on a second learned model which is learned in advance.
    Type: Grant
    Filed: May 22, 2018
    Date of Patent: May 18, 2021
    Assignee: PREFERRED NETWORKS, INC.
    Inventor: Taizan Yonetsuji
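    Example sketch: patent 11010932 describes a two-stage pipeline: downscale the line drawing, colorize the small version with a first model, then feed both the colorized small image and the original line drawing to a second model that produces the full-resolution result. A structural sketch with placeholder "models" (no real networks), assuming images as numpy arrays.
    ```python
    import numpy as np

    def reduce_image(line_drawing, size=(64, 64)):
        """Reduction processing: naive nearest-neighbour downscale to the predetermined size."""
        h, w = line_drawing.shape[:2]
        rows = np.arange(size[0]) * h // size[0]
        cols = np.arange(size[1]) * w // size[1]
        return line_drawing[rows][:, cols]

    def first_model(reduced_line):
        """Stand-in for the first learned model: colors the reduced line drawing."""
        return np.repeat(reduced_line[..., None], 3, axis=-1) * [1.0, 0.8, 0.6]

    def second_model(original_line, colored_reduced):
        """Stand-in for the second learned model: colors the original using both inputs."""
        up_rows = (np.arange(original_line.shape[0]) * colored_reduced.shape[0]
                   // original_line.shape[0])
        up_cols = (np.arange(original_line.shape[1]) * colored_reduced.shape[1]
                   // original_line.shape[1])
        upscaled = colored_reduced[up_rows][:, up_cols]
        return upscaled * original_line[..., None]       # keep line structure from the original

    line = np.random.rand(512, 512)
    colored = second_model(line, first_model(reduce_image(line)))
    print(colored.shape)   # (512, 512, 3)
    ```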
  • Publication number: 20210074059
    Abstract: A rendering device includes at least one memory and at least one processor. The at least one processor acquires information about projection data in a 2D space from information about a region of a 3D model; stores the information about the projection data in association with the information about the region in the at least one memory; and generates the projection data based on the information about the projection data. The associated information includes information associating an identifier given to a part of regions obtained by dividing the 3D model with information about a position where the part of the regions is projected in the 2D space.
    Type: Application
    Filed: November 20, 2020
    Publication date: March 11, 2021
    Applicant: Preferred Networks, Inc.
    Inventor: Takahiro ANDO
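    Example sketch: 20210074059 keeps, for each region of the divided 3D model, an identifier together with the 2D position where that region projects, so projection data can be regenerated from the stored association. A minimal sketch of such an association table, assuming a pinhole-style projection and made-up region data; the actual rendering pipeline is not described here.
    ```python
    import numpy as np

    def project(point, focal=1.0):
        """Project a 3D point onto the 2D image plane (simple pinhole model)."""
        x, y, z = point
        return np.array([focal * x / z, focal * y / z])

    # Regions obtained by dividing the 3D model: identifier -> a representative 3D position.
    regions = {0: np.array([0.0, 0.0, 2.0]),
               1: np.array([0.5, 0.2, 3.0]),
               2: np.array([-0.4, 0.1, 2.5])}

    # Associated information: region identifier -> position where the region projects in 2D space.
    association = {region_id: project(p) for region_id, p in regions.items()}

    def render(association, resolution=8):
        """Regenerate projection data (here, a coarse ID buffer) from the stored association."""
        buffer = -np.ones((resolution, resolution), dtype=int)
        for region_id, (u, v) in association.items():
            col = int((u + 1) / 2 * (resolution - 1))
            row = int((v + 1) / 2 * (resolution - 1))
            buffer[row, col] = region_id
        return buffer

    print(render(association))
    ```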
  • Publication number: 20210011791
    Abstract: A method and system that efficiently select sensors without requiring advanced expertise or extensive experience, even in the case of new machines and unknown failures. An abnormality detection system includes a storage unit for storing a latent variable model and a joint probability model, an acquisition unit for acquiring sensor data that is output by a sensor, a measurement unit for measuring the probability of the sensor data acquired by the acquisition unit based on the latent variable model and the joint probability model stored by the storage unit, a determination unit for determining whether the sensor data is normal or abnormal based on the probability of the sensor data measured by the measurement unit, and a learning unit for learning the latent variable model and the joint probability model based on the sensor data output by the sensor.
    Type: Application
    Filed: September 30, 2020
    Publication date: January 14, 2021
    Applicant: Preferred Networks, Inc.
    Inventors: Daisuke OKANOHARA, Kenta OONO
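    Example sketch: the detection step in 20210011791 (and in the granted parent, 10831577, below) reduces to measuring the probability of incoming sensor data under a learned latent variable / joint probability model and flagging data whose probability falls below a threshold. The sketch below substitutes a diagonal Gaussian fitted to normal data for the learned models; it only illustrates the measure-then-threshold flow, not the patented models.
    ```python
    import numpy as np

    class SimpleAnomalyDetector:
        """Gaussian stand-in for the learned latent variable / joint probability model."""

        def fit(self, normal_sensor_data):                     # "learning unit"
            self.mean = normal_sensor_data.mean(axis=0)
            self.var = normal_sensor_data.var(axis=0) + 1e-6
            return self

        def log_probability(self, sensor_data):                # "measurement unit"
            return float(-0.5 * np.sum(np.log(2 * np.pi * self.var)
                                       + (sensor_data - self.mean) ** 2 / self.var))

        def is_abnormal(self, sensor_data, threshold):         # "determination unit"
            return self.log_probability(sensor_data) < threshold

    rng = np.random.default_rng(0)
    detector = SimpleAnomalyDetector().fit(rng.normal(0.0, 1.0, size=(1000, 4)))
    print(detector.is_abnormal(np.array([0.1, -0.2, 0.0, 0.3]), threshold=-10.0))   # False
    print(detector.is_abnormal(np.array([8.0, -7.5, 6.0, 9.0]), threshold=-10.0))   # True
    ```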
  • Publication number: 20200364031
    Abstract: A computation device includes: a data multiplexer configured to output first high-order data as first output data and fifth output data, output first low-order data as third output data and seventh output data, output second high-order data as second output data, output second low-order data as fourth output data, output third high-order data, which is high-order data having a second bit number out of third input data, as sixth output data, and output third low-order data, which is low-order data having the second bit number out of the third input data, as eighth output data when a mode signal indicates a second computation mode; and first to fourth multipliers, each of which multiplies two of the output data.
    Type: Application
    Filed: May 11, 2018
    Publication date: November 19, 2020
    Applicants: Preferred Networks, Inc., RIKEN
    Inventors: Junichiro MAKINO, Takayuki MURANUSHI, Miyuki TSUBOUCHI, Takoshi NAMURA
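    Example sketch: the data multiplexer in 20200364031 routes high-order and low-order halves of the inputs to four multipliers depending on a mode signal, which is the standard way a wide multiplication is assembled from narrower partial products. The sketch below shows only that arithmetic identity (a*b from four half-width products), not the patented routing, and the bit widths are arbitrary.
    ```python
    def split(value, bits):
        """Split a value into (high-order, low-order) halves of `bits` bits each."""
        mask = (1 << bits) - 1
        return value >> bits, value & mask

    def wide_multiply(a, b, bits=16):
        """Compute a*b for 2*bits-wide operands using four bits-wide multipliers."""
        a_hi, a_lo = split(a, bits)
        b_hi, b_lo = split(b, bits)
        p1 = a_hi * b_hi          # first multiplier
        p2 = a_hi * b_lo          # second multiplier
        p3 = a_lo * b_hi          # third multiplier
        p4 = a_lo * b_lo          # fourth multiplier
        return (p1 << (2 * bits)) + ((p2 + p3) << bits) + p4

    a, b = 0x12345678, 0x9ABCDEF0
    assert wide_multiply(a, b) == a * b
    print(hex(wide_multiply(a, b)))
    ```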
  • Patent number: 10831577
    Abstract: A method and system that efficiently select sensors without requiring advanced expertise or extensive experience, even in the case of new machines and unknown failures. An abnormality detection system includes a storage unit for storing a latent variable model and a joint probability model, an acquisition unit for acquiring sensor data that is output by a sensor, a measurement unit for measuring the probability of the sensor data acquired by the acquisition unit based on the latent variable model and the joint probability model stored by the storage unit, a determination unit for determining whether the sensor data is normal or abnormal based on the probability of the sensor data measured by the measurement unit, and a learning unit for learning the latent variable model and the joint probability model based on the sensor data output by the sensor.
    Type: Grant
    Filed: December 1, 2016
    Date of Patent: November 10, 2020
    Assignee: PREFERRED NETWORKS, INC.
    Inventors: Daisuke Okanohara, Kenta Oono
  • Patent number: 10807235
    Abstract: A machine learning device for a robot that allows a human and the robot to work cooperatively, the machine learning device including a state observation unit that observes a state variable representing a state of the robot during a period in which the human and the robot work cooperatively; a determination data obtaining unit that obtains determination data for at least one of a level of burden on the human and a working efficiency; and a learning unit that learns a training data set for setting an action of the robot, based on the state variable and the determination data.
    Type: Grant
    Filed: April 1, 2019
    Date of Patent: October 20, 2020
    Assignees: FANUC CORPORATION, PREFERRED NETWORKS, INC.
    Inventors: Taketsugu Tsuda, Daisuke Okanohara, Ryosuke Okuta, Eiichi Matsumoto, Keigo Kawaai
  • Publication number: 20200310364
    Abstract: System control with a high degree of real-time performance is realized in a combination of diverse devices. A control device includes: a control indication acceptor receiving a control target indication to be a target of control with respect to plural controlled devices; and a control processing generator generating a control signal for each of the plural controlled devices based on the control target indication, in which operations of the plural controlled devices are controlled so that the plural controlled devices cooperate to achieve the target, based on at least one of a communication delay time, a phase shift time, and an operation cycle time of the plural controlled devices.
    Type: Application
    Filed: June 11, 2020
    Publication date: October 1, 2020
    Applicant: Preferred Networks, Inc.
    Inventors: Hirochika Asai, Yusuke Doi, Yuzo Tamada
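    Example sketch: 20200310364 coordinates heterogeneous devices by compensating for each device's communication delay, phase shift, and operation cycle when generating its control signal, so that their actions land together. The sketch below computes per-device send offsets under made-up timing figures; it is an illustration of the idea, not the claimed controller.
    ```python
    from dataclasses import dataclass

    @dataclass
    class ControlledDevice:
        name: str
        communication_delay: float   # seconds from sending a command to the device receiving it
        phase_shift: float           # seconds between receiving a command and starting to act
        operation_cycle: float       # seconds between the device's internal control ticks

    def schedule_commands(devices, target_time):
        """Return (device, send_time) pairs so every device acts at `target_time` together."""
        plan = []
        for d in devices:
            latency = d.communication_delay + d.phase_shift
            send_time = target_time - latency
            # Align the send to the device's own operation cycle (round down to a tick).
            send_time -= send_time % d.operation_cycle
            plan.append((d.name, round(send_time, 3)))
        return plan

    devices = [ControlledDevice("arm", 0.020, 0.005, 0.010),
               ControlledDevice("conveyor", 0.150, 0.030, 0.050),
               ControlledDevice("camera", 0.002, 0.001, 0.001)]
    print(schedule_commands(devices, target_time=1.000))
    ```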
  • Patent number: 10776712
    Abstract: In various embodiments, the systems and methods described herein relate to generative models. The generative models may be trained using machine learning approaches, with training sets comprising chemical compounds and biological or chemical information that relate to the chemical compounds. Deep learning architectures may be used. In various embodiments, the generative models are used to generate chemical compounds that have desired characteristics, e.g. activity against a selected target. The generative models may be used to generate chemical compounds that satisfy multiple requirements.
    Type: Grant
    Filed: February 3, 2016
    Date of Patent: September 15, 2020
    Assignee: Preferred Networks, Inc.
    Inventors: Kenta Oono, Justin Clayton, Nobuyuki Ota
  • Publication number: 20200272901
    Abstract: An optimization apparatus includes one or more memories and one or more processors. For an operation node constituting a representation of an operation of a neural network, the one or more processors are configured to calculate a time consumption for recomputing an operation result of a focused operation node, from another operation node whose operation result has been stored, and acquire data on the operation node whose operation result is to be stored, based on the time consumption.
    Type: Application
    Filed: February 24, 2020
    Publication date: August 27, 2020
    Applicant: Preferred Networks, Inc.
    Inventors: Takuya INOUE, Mitsuru KUSUMOTO, Gentaro WATANABE
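    Example sketch: the optimization in 20200272901 is essentially gradient checkpointing: for each operation node, weigh the time it would take to recompute its result from the nearest stored ancestor, and choose which results to keep in memory based on that time consumption. A toy sketch over a linear chain of operations, with hypothetical costs and a simple greedy rule; the patent does not specify this policy.
    ```python
    def recompute_time(costs, stored, node):
        """Time to recompute `node` starting from the nearest earlier node whose result is stored."""
        start = max((i for i in stored if i <= node), default=0)
        return sum(costs[start + 1: node + 1])

    def choose_nodes_to_store(costs, budget, threshold):
        """Greedily store nodes whose recomputation time would exceed `threshold`, within a budget."""
        stored = {0}                                   # the input is always available
        for node in range(1, len(costs)):
            if len(stored) >= budget:
                break
            if recompute_time(costs, stored, node) > threshold:
                stored.add(node)                       # keep this operation result in memory
        return sorted(stored)

    # Per-node computation time (seconds) along a chain of operations in the network.
    costs = [0.0, 0.5, 0.1, 0.1, 2.0, 0.2, 0.3, 1.5]
    print(choose_nodes_to_store(costs, budget=3, threshold=1.0))   # e.g. [0, 4, 7]
    ```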
  • Publication number: 20200269420
    Abstract: A method of selecting a kinematics computation library implementation includes a first process of compiling, by a first processor, each of a plurality of types of implementations of kinematics computation libraries using different rigid transformation representation formats by a plurality of types of compilers, a second process of performing, by the first processor, each of predetermined kinematics computations by using the plurality of types of implementations of the kinematics computation libraries, which are compiled in the first process, and a third process of comparing, by the first processor, results of the kinematics computations performed in the second process to select an optimum implementation for the predetermined kinematics computation from the plurality of types of implementations of the kinematics computation libraries.
    Type: Application
    Filed: February 22, 2019
    Publication date: August 27, 2020
    Applicant: Preferred Networks, Inc.
    Inventors: Ryo Miyajima, Wilson Ko
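    Example sketch: the selection method in 20200269420 is compile-benchmark-compare: build each kinematics library implementation (using different rigid transformation representations) with each compiler, run the same kinematics computations with every build, and keep the fastest. A generic benchmarking sketch; the "implementations" here are placeholder Python callables rather than compiled libraries.
    ```python
    import math
    import time

    # Placeholder "implementations": the same planar forward kinematics written two ways.
    def fk_angles(joint_angles):
        x, y, theta = 0.0, 0.0, 0.0
        for a in joint_angles:              # accumulate joint angles, unit link lengths
            theta += a
            x += math.cos(theta)
            y += math.sin(theta)
        return x, y

    def fk_complex(joint_angles):
        p, z = 0 + 0j, 1 + 0j
        for a in joint_angles:              # rotations represented as unit complex numbers
            z *= complex(math.cos(a), math.sin(a))
            p += z
        return p.real, p.imag

    def select_fastest(implementations, workload, repeats=200):
        """Run the predetermined computation with every implementation and pick the quickest."""
        timings = {}
        for name, fn in implementations.items():
            start = time.perf_counter()
            for angles in workload * repeats:
                fn(angles)
            timings[name] = time.perf_counter() - start
        return min(timings, key=timings.get), timings

    workload = [[0.1, 0.2, 0.3], [1.0, -0.5, 0.25], [0.0, 0.0, 1.57]]
    print(select_fastest({"angles": fk_angles, "complex": fk_complex}, workload))
    ```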
  • Patent number: 10717196
    Abstract: A machine learning device that learns an operation of a robot for picking up, by a hand unit, any of a plurality of workpieces placed in a random fashion, including a bulk-loaded state, includes a state variable observation unit that observes a state variable representing a state of the robot, including data output from a three-dimensional measuring device that obtains a three-dimensional map for each workpiece, an operation result obtaining unit that obtains a result of a picking operation of the robot for picking up the workpiece by the hand unit, and a learning unit that learns a manipulated variable including command data for commanding the robot to perform the picking operation of the workpiece, in association with the state variable of the robot and the result of the picking operation, upon receiving output from the state variable observation unit and output from the operation result obtaining unit.
    Type: Grant
    Filed: July 29, 2016
    Date of Patent: July 21, 2020
    Assignees: FANUC CORPORATION, PREFERRED NETWORKS, INC.
    Inventors: Takashi Yamazaki, Takumi Oyama, Shun Suyama, Kazutaka Nakayama, Hidetoshi Kumiya, Hiroshi Nakagawa, Daisuke Okanohara, Ryosuke Okuta, Eiichi Matsumoto, Keigo Kawaai
  • Publication number: 20200167657
    Abstract: A training apparatus includes one or more memories and one or more processors. The one or more processors are configured to generate a graph based on a path of an error backward propagation, assign an identifier to each node based on the path of the error backward propagation in the graph, and execute the error backward propagation based on the graph and on the identifier.
    Type: Application
    Filed: November 25, 2019
    Publication date: May 28, 2020
    Applicant: Preferred Networks, Inc.
    Inventors: Seiya TOKUI, Daisuke NISHINO, Hiroyuki Vincent YAMAZAKI, Naotoshi SEO, Akifumi IMANISHI
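    Example sketch: 20200167657 builds a graph from the backward-propagation path, assigns each node an identifier based on that path, and then executes backpropagation over the graph using those identifiers. The sketch below does this for a tiny scalar expression graph; the node and identifier scheme is an illustration, not the one claimed.
    ```python
    def build_backward_graph(outputs):
        """Walk the backward path from the outputs and assign each node an identifier."""
        order, ids = [], {}
        def visit(node):
            if id(node) in ids:
                return
            for parent, _ in node.parents:
                visit(parent)
            ids[id(node)] = len(order)       # identifier based on the backward-propagation path
            order.append(node)
        for out in outputs:
            visit(out)
        return order, ids

    class Node:
        def __init__(self, value, parents=()):
            self.value, self.parents, self.grad = value, parents, 0.0

    def backprop(output):
        order, ids = build_backward_graph([output])
        output.grad = 1.0
        for node in reversed(order):                       # execute error backward propagation
            for parent, local_grad in node.parents:
                parent.grad += local_grad * node.grad
        return {ids[id(n)]: n.grad for n in order}

    # y = (a * b) + a, so dy/da = b + 1 and dy/db = a
    a, b = Node(2.0), Node(3.0)
    prod = Node(a.value * b.value, parents=[(a, b.value), (b, a.value)])
    y = Node(prod.value + a.value, parents=[(prod, 1.0), (a, 1.0)])
    print(backprop(y))   # gradients keyed by node identifier
    ```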
  • Patent number: 10652167
    Abstract: A packet switch device for a message exchange among a plurality of computing devices, including a message transceiver transmitting and receiving a message, a transmission table storage unit storing a transmission table for determining a computing device to which the message is transmitted, a transmission processor determining the computing device to which the message is transmitted based on a topic address included in the message, a reference processing information receiver receiving a usage status that indicates information used in a calculation process in the computing device, a transmission table compatibility calculation unit calculating compatibility of the transmission table based on the received reference processing information, and a transmission table update unit that updates the transmission table based on the calculated compatibility.
    Type: Grant
    Filed: January 24, 2018
    Date of Patent: May 12, 2020
    Assignee: PREFERRED NETWORKS, INC.
    Inventor: Yusuke Doi
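    Example sketch: patent 10652167 describes topic-based forwarding: the switch looks up the destination computing device from a topic address in the message, and periodically re-scores ("compatibility") and updates the transmission table from usage information reported by the computing devices. A minimal sketch of that lookup-and-update cycle, with a made-up scoring rule; it is not the patented compatibility calculation.
    ```python
    def forward(transmission_table, message):
        """Determine the computing device to which the message is transmitted."""
        return transmission_table.get(message["topic"])

    def compatibility(transmission_table, usage_status):
        """Score the table: fraction of topics routed to a device that actually uses them."""
        hits = sum(1 for topic, device in transmission_table.items()
                   if topic in usage_status.get(device, set()))
        return hits / max(len(transmission_table), 1)

    def update_table(transmission_table, usage_status):
        """Re-route each topic to a device that reports using it, if the current score is poor."""
        if compatibility(transmission_table, usage_status) >= 0.8:
            return transmission_table
        new_table = dict(transmission_table)
        for topic in transmission_table:
            for device, topics in usage_status.items():
                if topic in topics:
                    new_table[topic] = device
                    break
        return new_table

    table = {"sensor/temp": "node-a", "sensor/camera": "node-a"}
    usage = {"node-a": {"sensor/temp"}, "node-b": {"sensor/camera"}}
    print(forward(table, {"topic": "sensor/camera"}))       # node-a (before update)
    table = update_table(table, usage)
    print(forward(table, {"topic": "sensor/camera"}))       # node-b (after update)
    ```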
  • Publication number: 20200134473
    Abstract: A model generation method includes updating, by at least one processor, a weight matrix of a first neural network model at least based on a first inference result obtained by inputting, to the first neural network model which discriminates between first data and second data generated by using a second neural network model, the first data, a second inference result obtained by inputting the second data to the first neural network model, and a singular value based on the weight matrix of the first neural network model. The model generation method also includes at least based on the second inference result, updating a parameter of the second neural network model.
    Type: Application
    Filed: December 23, 2019
    Publication date: April 30, 2020
    Applicant: PREFERRED NETWORKS, INC.
    Inventor: Takeru MIYATO
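    Example sketch: 20200134473 updates the discriminator using not only its inference results on real and generated data but also a singular value based on its weight matrix, which matches the general idea of spectral normalization, where the largest singular value (often estimated by power iteration) constrains the discriminator. A toy numpy sketch of the singular-value estimate and its use to normalize the weight before computing the two inference results; the actual GAN loss and gradient step are omitted, and none of this code is the patented method.
    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(4, 8))          # weight matrix of the first (discriminator) model

    def largest_singular_value(W, iters=20):
        """Estimate the largest singular value of W by power iteration."""
        v = rng.normal(size=W.shape[1])
        for _ in range(iters):
            u = W @ v
            u /= np.linalg.norm(u)
            v = W.T @ u
            v /= np.linalg.norm(v)
        return float(u @ W @ v)

    def discriminator(W, x):
        return 1.0 / (1.0 + np.exp(-np.sum(W @ x)))       # crude scalar "real/fake" score

    real = rng.normal(size=8)            # first data
    fake = rng.normal(size=8)            # second data, produced by a generator in practice

    sigma = largest_singular_value(W)
    W_normalized = W / sigma             # constrain the discriminator via its largest singular value
    real_score = discriminator(W_normalized, real)        # first inference result
    fake_score = discriminator(W_normalized, fake)        # second inference result
    # A full training step would move W along the gradient of a GAN loss built from these two
    # scores (and update the generator's parameters from the fake score); that part is omitted.
    print(round(sigma, 3), round(real_score, 3), round(fake_score, 3))
    ```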
  • Publication number: 20200134453
    Abstract: A device for shortening the time required for learning curve prediction includes a sampler, a learning curve predictor, a learning executor, and a learning curve calculator. The sampler samples a weight parameter of a parameter model which outputs a parameter of a learning curve model of a neural network (NNW) on the basis of a set value of a hyperparameter of the NNW. The learning curve predictor calculates a prediction learning curve of the NNW on the basis of the sampled weight parameter and an actual learning curve of the NNW. The learning executor advances learning in the NNW. The learning curve calculator calculates an actual learning curve resulting from the advance of the learning in the NNW. The learning curve predictor updates the prediction learning curve of the NNW on the basis of the weight parameter sampled before the learning advances and the actual learning curve calculated after the learning advances.
    Type: Application
    Filed: October 24, 2019
    Publication date: April 30, 2020
    Applicant: Preferred Networks, Inc.
    Inventors: Yutaka KITAMURA, Shin-ichi MAEDA
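    Example sketch: the loop in 20200134453 is: sampled parameters of a learning-curve model, combined with the actually observed partial curve, give a predicted curve, which is refined as training advances. The sketch below uses a power-law curve model and naive random sampling of its parameters filtered by the observed prefix; it illustrates only the predict-advance-update cycle, not the parameter model described in the patent.
    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def curve_model(step, params):
        a, b, c = params
        return a - b * step ** (-c)          # a simple power-law learning curve (an assumption)

    def sample_parameters(n):
        """Stand-in for sampling weight parameters of the parameter model."""
        return np.column_stack([rng.uniform(0.7, 1.0, n),    # asymptote a
                                rng.uniform(0.1, 0.9, n),    # scale b
                                rng.uniform(0.3, 1.5, n)])   # decay c

    def predict_curve(actual_curve, horizon, n_samples=2000):
        """Keep the sampled parameters that best fit the actual curve so far, then extrapolate."""
        steps = np.arange(1, len(actual_curve) + 1)
        params = sample_parameters(n_samples)
        errors = [np.mean((curve_model(steps, p) - actual_curve) ** 2) for p in params]
        best = params[np.argsort(errors)[:20]]               # crude posterior-like selection
        future = np.arange(1, horizon + 1)
        return np.mean([curve_model(future, p) for p in best], axis=0)

    true_params = (0.92, 0.6, 0.8)
    actual = curve_model(np.arange(1, 11), true_params)       # accuracy after 10 epochs so far
    prediction = predict_curve(actual, horizon=50)            # updated as more epochs arrive
    print(round(prediction[-1], 3), "vs eventual", round(curve_model(50, true_params), 3))
    ```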
  • Publication number: 20200130193
    Abstract: According to some embodiments, a tactile information estimation apparatus may include one or more memories and one or more processors. The one or more processors are configured to input at least first visual information of an object acquired by a visual sensor to a model. The model is generated based on visual information and tactile information linked to the visual information. The one or more processors are configured to extract, based on the model, a feature amount relating to tactile information of the object.
    Type: Application
    Filed: December 6, 2019
    Publication date: April 30, 2020
    Applicant: Preferred Networks, Inc.
    Inventors: Kuniyuki TAKAHASHI, Jethro Eliezer Tanuwijaya TAN
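    Example sketch: 20200130193 trains a model on visual information linked to tactile information so that, at run time, a tactile feature amount can be extracted from visual input alone. The sketch below stands in for that model with ridge regression from a flattened image patch to a tactile feature vector; the real apparatus would use a learned deep model and a visual sensor, and all data here is synthetic.
    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Training data: visual information (flattened 8x8 patches) linked to tactile information
    # (e.g. a 3-dimensional roughness/hardness/friction feature). All values are synthetic.
    visual = rng.normal(size=(500, 64))
    true_map = rng.normal(size=(64, 3))
    tactile = visual @ true_map + 0.01 * rng.normal(size=(500, 3))

    def fit_visual_to_tactile(visual, tactile, ridge=1e-3):
        """Learn a linear map from visual features to tactile features (stand-in for the model)."""
        d = visual.shape[1]
        return np.linalg.solve(visual.T @ visual + ridge * np.eye(d), visual.T @ tactile)

    def estimate_tactile(model, visual_observation):
        """Extract a tactile feature amount from first visual information of an object."""
        return visual_observation @ model

    model = fit_visual_to_tactile(visual, tactile)
    new_patch = rng.normal(size=64)
    print(np.round(estimate_tactile(model, new_patch), 3))
    ```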