Patents Assigned to Preferred Networks, Inc.
  • Publication number: 20200125958
    Abstract: A training apparatus for training a network that includes a first network and a second network and is configured to infer a feature of an input graph includes memory and a processor. The processor is configured to: merge, by the first network, first hidden vectors of first nodes of the input graph and a second hidden vector of a second node coupled to each of the first nodes, based on the first hidden vectors, the second hidden vector, and information on coupling between the first nodes. The processor is further configured to: update the first hidden vectors and the second hidden vector based on a result of the merging; extract, from the second network, the feature of the input graph based on the updated first hidden vectors and the updated second hidden vector; calculate a loss of the feature of the input graph; and update the first network or the second network.
    Type: Application
    Filed: October 18, 2019
    Publication date: April 23, 2020
    Applicant: PREFERRED NETWORKS, INC.
    Inventors: Katsuhiko ISHIGURO, Shinichi MAEDA
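
A minimal sketch of the merge/update step described in the abstract above: per-node hidden vectors exchange messages over the adjacency information and with a graph-level "second node" (supernode) vector. The class name, layer sizes, and single-step structure are illustrative assumptions, not taken from the patent.

import torch
import torch.nn as nn

class SupernodeMerge(nn.Module):
    """One merge/update step: node vectors are combined with aggregated
    neighbor vectors and a graph-level supernode vector."""
    def __init__(self, dim):
        super().__init__()
        self.node_update = nn.Linear(3 * dim, dim)   # node + neighbors + supernode
        self.super_update = nn.Linear(2 * dim, dim)  # supernode + pooled nodes

    def forward(self, h_nodes, h_super, adj):
        # h_nodes: (N, d) first hidden vectors, h_super: (d,), adj: (N, N) coupling
        neighbor_msg = adj @ h_nodes                          # aggregate coupled first nodes
        super_msg = h_super.expand_as(h_nodes)                # broadcast the second node
        h_nodes_new = torch.relu(
            self.node_update(torch.cat([h_nodes, neighbor_msg, super_msg], dim=-1)))
        pooled = h_nodes_new.mean(dim=0)                      # summarize updated nodes
        h_super_new = torch.relu(
            self.super_update(torch.cat([h_super, pooled], dim=-1)))
        return h_nodes_new, h_super_new

# usage: 5 nodes, 16-dimensional hidden vectors, identity adjacency as a placeholder
layer = SupernodeMerge(16)
h, s = layer(torch.randn(5, 16), torch.randn(16), torch.eye(5))
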
  • Publication number: 20200118305
    Abstract: A computer is caused to realize: a line drawing data acquisition function to acquire line drawing data to be colored; a size-reducing process function to perform a size-reducing process on the acquired line drawing data, reducing it to a predetermined size so as to obtain size-reduced line drawing data; a first coloring process function to perform a coloring process on the size-reduced line drawing data based on a first learned model that has previously learned the coloring process on size-reduced line drawing data by using sample data; and a second coloring process function to perform a coloring process on the original line drawing data by receiving an input of the original line drawing data and the colored, size-reduced line drawing data on which the first coloring process function has performed the coloring, based on a second learned model that has previously learned the coloring process on the sample data by receiving an input of the sample data and colored, size-reduced sample data.
    Type: Application
    Filed: May 1, 2017
    Publication date: April 16, 2020
    Applicant: Preferred Networks, Inc.
    Inventor: Taizan YONETSUJI
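
A minimal sketch of the two-stage coloring pipeline in the abstract above, assuming placeholder single-convolution models for the two learned models and a hypothetical reduced size; the real models are not described here.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-ins for the two learned models: `draft_net` colors the
# size-reduced line drawing, `refine_net` colors the full-resolution drawing
# given the drawing plus the upscaled draft colors.
draft_net = nn.Conv2d(1, 3, kernel_size=3, padding=1)    # placeholder first model
refine_net = nn.Conv2d(4, 3, kernel_size=3, padding=1)   # placeholder second model

def two_stage_colorize(line_art, reduced_size=(128, 128)):
    # line_art: (1, 1, H, W) grayscale line drawing
    small = F.interpolate(line_art, size=reduced_size, mode="bilinear",
                          align_corners=False)             # size-reducing process
    draft = draft_net(small)                               # first coloring process
    draft_up = F.interpolate(draft, size=line_art.shape[-2:], mode="bilinear",
                             align_corners=False)
    # second coloring process: original drawing plus upscaled draft colors
    return refine_net(torch.cat([line_art, draft_up], dim=1))

colored = two_stage_colorize(torch.rand(1, 1, 512, 512))
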
  • Publication number: 20200094406
    Abstract: An estimation device includes a memory and at least one processor. The at least one processor is configured to acquire information regarding a target object. The at least one processor is configured to estimate information regarding a location and a posture of a gripper relating to where the gripper is able to grasp the target object. The estimation is based on an output of a neural model having as an input the information regarding the target object. The estimated information regarding the posture includes information capable of expressing a rotation angle around a plurality of axes.
    Type: Application
    Filed: November 27, 2019
    Publication date: March 26, 2020
    Applicant: Preferred Networks, Inc.
    Inventors: Hitoshi KUSANO, Ayaka KUME, Eiichi MATSUMOTO
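
A minimal sketch of the kind of network the abstract above describes: object information goes in, and the outputs are a gripper position plus a rotation expressed around several axes. The architecture, feature dimension, and choice of Euler angles are assumptions for illustration.

import torch
import torch.nn as nn

class GraspPoseNet(nn.Module):
    def __init__(self, in_dim=256):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU())
        self.position_head = nn.Linear(128, 3)   # x, y, z of the gripper
        self.rotation_head = nn.Linear(128, 3)   # rotation angles around three axes

    def forward(self, object_features):
        h = self.backbone(object_features)
        return self.position_head(h), self.rotation_head(h)

net = GraspPoseNet()
pos, rot = net(torch.randn(1, 256))   # estimated location and multi-axis rotation
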
  • Publication number: 20200019877
    Abstract: Apparatus, methods, and systems for cross-domain time series data conversion are disclosed. In an example embodiment, a first time series of a first type of data is received and stored. The first time series of the first type of data is encoded as a first distributed representation for the first type of data. The first distributed representation is converted to a second distributed representation for a second type of data which is different from the first type of data. The second distributed representation for the second type of data is decoded as a second time series of the second type of data.
    Type: Application
    Filed: September 23, 2019
    Publication date: January 16, 2020
    Applicant: Preferred Networks, Inc.
    Inventors: Daisuke OKANOHARA, Justin B. CLAYTON
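
A minimal encode-convert-decode sketch of the cross-domain conversion described above (the same abstract appears under granted patent 10460251 later in this list). Recurrent encoders/decoders and the dimensions are illustrative assumptions; the patent does not specify the model types.

import torch
import torch.nn as nn

class CrossDomainConverter(nn.Module):
    def __init__(self, dim_a=8, dim_b=4, latent=32):
        super().__init__()
        self.encoder = nn.GRU(dim_a, latent, batch_first=True)   # first type -> distributed representation
        self.converter = nn.Linear(latent, latent)                # first representation -> second representation
        self.decoder = nn.GRU(latent, dim_b, batch_first=True)    # second representation -> second type

    def forward(self, series_a):
        # series_a: (batch, time, dim_a) time series of the first type of data
        encoded, _ = self.encoder(series_a)
        converted = self.converter(encoded)
        series_b, _ = self.decoder(converted)
        return series_b

model = CrossDomainConverter()
out = model(torch.randn(2, 50, 8))   # (2, 50, 4) time series of the second type
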
  • Publication number: 20200014761
    Abstract: A server device configured to communicate, via a communication network, with at least one device including a learner configured to perform processing by using a learned model, includes a processor, a transmitter, and a storage configured to store a plurality of shared models pre-learned in accordance with environments and conditions of various devices. The processor is configured to acquire device data including information on an environment and conditions from the at least one device, and select an optimum shared model for the at least one device based on the acquired device data. The transmitter is configured to transmit the selected shared model to the at least one device.
    Type: Application
    Filed: September 20, 2019
    Publication date: January 9, 2020
    Applicant: Preferred Networks, Inc.
    Inventors: Keigo Kawaai, Shohei Hido, Nobuyuki Kubota, Daisuke Tanaka
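
A toy sketch of the selection step described above: shared models are keyed by the environment/condition metadata they were pre-learned for, each is scored against the device data the server receives, and the best match is returned for transmission. The model names, metadata keys, and scoring rule are all hypothetical.

SHARED_MODELS = {
    "model_factory_arm": {"environment": "factory", "sensor": "camera"},
    "model_outdoor_drone": {"environment": "outdoor", "sensor": "lidar"},
    "model_indoor_robot": {"environment": "indoor", "sensor": "camera"},
}

def select_shared_model(device_data):
    # count how many environment/condition fields match each shared model's profile
    def score(profile):
        return sum(device_data.get(k) == v for k, v in profile.items())
    return max(SHARED_MODELS, key=lambda name: score(SHARED_MODELS[name]))

# the device reports its environment and conditions; the server picks a model to transmit
print(select_shared_model({"environment": "indoor", "sensor": "camera"}))
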
  • Publication number: 20190378018
    Abstract: There is provided an information processing device which efficiently executes machine learning. The information processing device according to one embodiment includes: an obtaining unit which obtains a source code including a code which defines Forward processing of each layer constituting a neural network; a storage unit which stores an association relationship between each Forward processing and Backward processing associated with each Forward processing; and an executing unit which successively executes each code included in the source code, and which calculates an output value of the Forward processing defined by the code based on an input value at a time of execution of each code, and generates a reference structure for Backward processing in a layer associated with the code based on the association relationship stored in the storage unit.
    Type: Application
    Filed: August 26, 2019
    Publication date: December 12, 2019
    Applicant: Preferred Networks, Inc.
    Inventors: Seiya TOKUI, Yuya UNNO, Kenta OONO, Ryosuke OKUTA
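
A toy define-by-run sketch of the idea in the abstract above: executing the Forward code both computes output values and records, per operation, a reference to its associated Backward rule, so the backward reference structure mirrors whatever forward code actually ran. This is a deliberately tiny scalar example, not the patented implementation.

class Var:
    def __init__(self, value, parents=(), backward=None):
        self.value, self.parents, self.backward, self.grad = value, parents, backward, 0.0

    def __mul__(self, other):
        out = Var(self.value * other.value, parents=(self, other))
        out.backward = lambda g: (other.value * g, self.value * g)  # gradients w.r.t. (self, other)
        return out

    def __add__(self, other):
        out = Var(self.value + other.value, parents=(self, other))
        out.backward = lambda g: (g, g)
        return out

def backprop(out, grad=1.0):
    out.grad += grad
    if out.backward:
        for parent, g in zip(out.parents, out.backward(grad)):
            backprop(parent, g)

x, w = Var(3.0), Var(2.0)
y = x * w + x          # executing the forward code builds the backward reference structure
backprop(y)
print(x.grad, w.grad)  # 3.0 (= w + 1), 3.0 (= x)
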
  • Patent number: 10460251
    Abstract: Apparatus, methods, and systems for cross-domain time series data conversion are disclosed. In an example embodiment, a first time series of a first type of data is received and stored. The first time series of the first type of data is encoded as a first distributed representation for the first type of data. The first distributed representation is converted to a second distributed representation for a second type of data which is different from the first type of data. The second distributed representation for the second type of data is decoded as a second time series of the second type of data.
    Type: Grant
    Filed: June 19, 2015
    Date of Patent: October 29, 2019
    Assignee: PREFERRED NETWORKS, INC.
    Inventors: Daisuke Okanohara, Justin B. Clayton
  • Publication number: 20190325346
    Abstract: Machine learning with model filtering and model mixing for edge devices in a heterogeneous environment is disclosed. In an example embodiment, an edge device includes a communication module, a data collection device, a memory, a machine learning module, and a model mixing module. The edge device analyzes collected data with a model for a first task, outputs a result, and updates the model to create a local model. The edge device communicates with other edge devices in a heterogeneous group, transmits a request for local models to the heterogeneous group, and receives local models from the heterogeneous group. The edge device filters the local models by structure metadata, including second local models, which relate to a second task. The edge device performs a mix operation of the second local models to generate a mixed model which relates to the second task, and transmits the mixed model to the heterogeneous group.
    Type: Application
    Filed: June 28, 2019
    Publication date: October 24, 2019
    Applicant: Preferred Networks, Inc.
    Inventors: Daisuke OKANOHARA, Justin B. CLAYTON, Toru NISHIKAWA, Shohei HIDO, Nobuyuki KUBOTA, Nobuyuki OTA, Seiya TOKUI
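
A toy sketch of the filter-and-mix step described above (the same abstract appears under granted patent 10387794 later in this list): received local models carry structure metadata, only those matching the second task and the expected structure are kept, and their weights are averaged into a mixed model. The metadata fields and averaging rule are illustrative assumptions.

import numpy as np

def mix_models(received, task="task2", structure="cnn_v1"):
    # filter the received local models by their structure metadata
    candidates = [m["weights"] for m in received
                  if m["task"] == task and m["structure"] == structure]
    # mix operation: average the surviving models' weights
    return np.mean(candidates, axis=0) if candidates else None

received_local_models = [
    {"task": "task2", "structure": "cnn_v1", "weights": np.ones(4)},
    {"task": "task1", "structure": "cnn_v1", "weights": np.zeros(4)},   # filtered out
    {"task": "task2", "structure": "cnn_v1", "weights": np.full(4, 3.0)},
]
mixed = mix_models(received_local_models)   # averaged task2 weights: [2. 2. 2. 2.]
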
  • Publication number: 20190317986
    Abstract: The present disclosure provides an annotated text data expanding method capable of obtaining a large amount of annotated text data that is neither inconsistent with an annotation label nor unnatural as text, by mechanically expanding a small amount of annotated text data through natural language processing. The annotated text data expanding method includes inputting, by an input device, the annotated text data including a first text appended with a first annotation label to a prediction complementary model. New annotated text data is created by one or more processors by the prediction complementary model, with reference to the first annotation label and the context of the first text.
    Type: Application
    Filed: April 12, 2019
    Publication date: October 17, 2019
    Applicant: PREFERRED NETWORKS, INC.
    Inventor: Sosuke KOBAYASHI
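
A toy sketch of label-aware text expansion in the spirit of the abstract above. The function predict_replacements stands in for the prediction complementary model and is purely hypothetical: it proposes context-compatible words that do not contradict the annotation label, and the expander swaps them in to create new labeled examples.

import random

def predict_replacements(tokens, position, label):
    # hypothetical stand-in for the prediction complementary model
    compatible = {"positive": ["great", "wonderful", "excellent"],
                  "negative": ["awful", "terrible", "poor"]}
    return compatible[label]

def expand(text, label, n=3):
    tokens = text.split()
    new_examples = []
    for _ in range(n):
        pos = random.randrange(len(tokens))
        candidates = predict_replacements(tokens, pos, label)
        augmented = tokens[:pos] + [random.choice(candidates)] + tokens[pos + 1:]
        new_examples.append((" ".join(augmented), label))   # new annotated text data
    return new_examples

print(expand("the movie was good", "positive"))
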
  • Publication number: 20190304136
    Abstract: A gaze point estimation processing apparatus in an embodiment includes a storage configured to store a neural network as a gaze point estimation model and one or more processors. The storage stores a gaze point estimation model generated through learning based on an image for learning and information relating to a first gaze point for the image for learning. The one or more processors estimate information relating to a second gaze point with respect to an image for estimation from the image for estimation using the gaze point estimation model.
    Type: Application
    Filed: March 29, 2019
    Publication date: October 3, 2019
    Applicant: Preferred Networks, Inc.
    Inventor: Masaaki FUKUDA
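
A minimal sketch of a gaze point estimation model as described above: a small convolutional network, trained on (image, gaze point) pairs, regresses normalized (x, y) coordinates for a new image. The layer sizes and output normalization are illustrative assumptions.

import torch
import torch.nn as nn

gaze_model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(8, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 2), nn.Sigmoid(),          # gaze point in [0, 1] x [0, 1]
)

image_for_estimation = torch.rand(1, 3, 224, 224)
estimated_gaze = gaze_model(image_for_estimation)   # second gaze point for this image
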
  • Publication number: 20190303658
    Abstract: A motion generating apparatus includes memory and processing circuitry coupled to the memory. The memory is configured to store a learned model. The learned model outputs, when path information is input, motion information of an object which moves according to the path information. The processing circuitry accepts input of parameters regarding a plurality of objects, and generates path information of the plurality of objects based on the parameters according to predetermined rules. The processing circuitry inputs the generated path information of the plurality of objects into the learned model, and causes the learned model to generate motion information with respect to the path information of the plurality of objects.
    Type: Application
    Filed: March 28, 2019
    Publication date: October 3, 2019
    Applicant: Preferred Networks, Inc.
    Inventors: Takahiro ANDO, Shimpei SAWADA, Toru MATSUOKA
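
A minimal sketch of the generation flow described above: path information is generated from per-object parameters by a predetermined rule (straight-line interpolation here), then fed to a learned model, for which a placeholder recurrent network stands in. All names and sizes are illustrative.

import torch
import torch.nn as nn

motion_model = nn.GRU(input_size=2, hidden_size=4, batch_first=True)  # stand-in for the learned model

def generate_path(start, goal, steps=20):
    # predetermined rule: interpolate 2-D waypoints between start and goal
    t = torch.linspace(0, 1, steps).unsqueeze(1)
    return (1 - t) * torch.tensor(start) + t * torch.tensor(goal)

object_params = [{"start": (0.0, 0.0), "goal": (1.0, 2.0)},
                 {"start": (2.0, 1.0), "goal": (0.0, 0.0)}]
paths = torch.stack([generate_path(p["start"], p["goal"]) for p in object_params])
motion, _ = motion_model(paths)   # motion information per object along its path
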
  • Patent number: 10410113
    Abstract: Systems, methods, and apparatus for time series data adaptation, including sensor fusion, are disclosed. For example, a system includes a variational inference machine, a sequential data forecast machine including a hidden state, and a machine learning model. The sequential data forecast machine exports a version of the hidden state. The variational inference machine receives as inputs time series data and the version of the hidden state, and outputs a time dependency infused latent distribution. The sequential data forecast machine obtains the version of the hidden state, receives as inputs the time series data and the time dependency infused latent distribution, and updates the hidden state based on the time series data, the time dependency infused latent distribution, and the version of the hidden state to generate a second version of the hidden state. The time dependency infused latent distribution is input into the machine learning model, which outputs a result.
    Type: Grant
    Filed: January 14, 2016
    Date of Patent: September 10, 2019
    Assignee: PREFERRED NETWORKS, INC.
    Inventors: Justin B. Clayton, Daisuke Okanohara, Shohei Hido
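
A minimal variational-RNN-style sketch of the loop described above: the inference network sees the current observation and the exported hidden state, emits a time-dependency-infused latent distribution, and the forecast cell updates its hidden state from the observation and a sample of that latent, which also feeds a downstream model. Dimensions and the Gaussian parameterization are illustrative assumptions.

import torch
import torch.nn as nn

obs_dim, latent_dim, hidden_dim = 6, 3, 8
inference_net = nn.Linear(obs_dim + hidden_dim, 2 * latent_dim)   # outputs mean and log-variance
forecast_cell = nn.GRUCell(obs_dim + latent_dim, hidden_dim)      # sequential data forecast machine
downstream_model = nn.Linear(latent_dim, 1)                       # machine learning model consuming the latent

hidden = torch.zeros(1, hidden_dim)
for observation in torch.randn(5, 1, obs_dim):                    # 5 time steps of time series data
    stats = inference_net(torch.cat([observation, hidden], dim=-1))
    mean, log_var = stats.chunk(2, dim=-1)
    latent = mean + torch.randn_like(mean) * (0.5 * log_var).exp()  # reparameterized sample
    hidden = forecast_cell(torch.cat([observation, latent], dim=-1), hidden)
    result = downstream_model(latent)
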
  • Patent number: 10397260
    Abstract: A control apparatus performs analysis by using partial information and determines whether or not communication is abnormal. If the communication is determined to be abnormal, the control apparatus controls a communication route for a communication control device such that the communication is transmitted from a communication apparatus to the control apparatus. Further, the control apparatus determines whether or not the communication transmitted by the control of the communication route is malicious communication. As a result, if the communication is determined to be malicious communication, the control apparatus controls the communication control device to restrict the malicious communication.
    Type: Grant
    Filed: April 26, 2017
    Date of Patent: August 27, 2019
    Assignees: NIPPON TELEGRAPH AND TELEPHONE CORPORATION, Preferred Networks, Inc.
    Inventors: Takahiro Hamada, Yuminobu Igarashi, Shohei Hido
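
A toy sketch of the two-stage control flow described above: a quick check on partial information, rerouting of suspicious traffic to the control apparatus for full inspection, and a restriction order when the traffic is judged malicious. The predicates and the controller interface are hypothetical placeholders, not the patented method.

def looks_abnormal(partial_info):
    return partial_info.get("bytes_per_sec", 0) > 1_000_000

def is_malicious(full_traffic):
    return b"exploit" in full_traffic

def handle_flow(partial_info, controller):
    if not looks_abnormal(partial_info):
        return "pass"
    controller.reroute_to_control_apparatus(partial_info["flow_id"])    # change the communication route
    full_traffic = controller.capture(partial_info["flow_id"])          # inspect the rerouted communication
    if is_malicious(full_traffic):
        controller.restrict(partial_info["flow_id"])                    # restrict malicious communication
        return "restricted"
    return "pass"

class FakeController:
    """Stand-in for the communication control device."""
    def reroute_to_control_apparatus(self, flow_id): pass
    def capture(self, flow_id): return b"GET /exploit HTTP/1.1"
    def restrict(self, flow_id): print(f"flow {flow_id} restricted")

print(handle_flow({"flow_id": 7, "bytes_per_sec": 5_000_000}, FakeController()))
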
  • Patent number: 10387794
    Abstract: Machine learning with model filtering and model mixing for edge devices in a heterogeneous environment is disclosed. In an example embodiment, an edge device includes a communication module, a data collection device, a memory, a machine learning module, and a model mixing module. The edge device analyzes collected data with a model for a first task, outputs a result, and updates the model to create a local model. The edge device communicates with other edge devices in a heterogeneous group, transmits a request for local models to the heterogeneous group, and receives local models from the heterogeneous group. The edge device filters the local models by structure metadata, including second local models, which relate to a second task. The edge device performs a mix operation of the second local models to generate a mixed model which relates to the second task, and transmits the mixed model to the heterogeneous group.
    Type: Grant
    Filed: January 22, 2015
    Date of Patent: August 20, 2019
    Assignee: PREFERRED NETWORKS, INC.
    Inventors: Daisuke Okanohara, Justin B. Clayton, Toru Nishikawa, Shohei Hido, Nobuyuki Kubota, Nobuyuki Ota, Seiya Tokui
  • Publication number: 20190251418
    Abstract: An autoencoder includes memory configured to store data including an encode network and a decode network, and processing circuitry coupled to the memory. The processing circuitry is configured to cause the encode network to convert inputted data to a plurality of values and output the plurality of values, batch-normalize, out of the plurality of output values, the values indicated by at least two layers of the encode network so that the batch-normalized values have a predetermined average value and a predetermined variance value, quantize each of the batch-normalized values, and cause the decode network to decode each of the quantized values.
    Type: Application
    Filed: February 8, 2019
    Publication date: August 15, 2019
    Applicant: Preferred Networks, Inc.
    Inventors: Ken NAKANISHI, Shinichi MAEDA
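
A minimal sketch of the encode, batch-normalize, quantize, decode flow described above. The sizes and the rounding-based quantizer are illustrative choices; a trainable version would also need something like a straight-through estimator, since rounding has no useful gradient.

import torch
import torch.nn as nn

class QuantizingAutoencoder(nn.Module):
    def __init__(self, in_dim=32, code_dim=8):
        super().__init__()
        self.encode = nn.Linear(in_dim, code_dim)
        self.bn = nn.BatchNorm1d(code_dim)        # fixes the mean/variance of the code values
        self.decode = nn.Linear(code_dim, in_dim)

    def forward(self, x):
        code = self.bn(self.encode(x))            # batch-normalized latent values
        quantized = torch.round(code)             # quantize each normalized value
        return self.decode(quantized)             # decode the quantized values

model = QuantizingAutoencoder()
reconstruction = model(torch.randn(16, 32))
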
  • Publication number: 20190236268
    Abstract: A behavior determining method includes: causing a program to operate on a virtual environment including a virtual memory; while the program is operating on the virtual environment, generating access information of the virtual memory for determining a behavior of the program, based on information of at least one of a first flag or a second flag, the first flag indicating whether or not the program has read from a location in a virtual address space, and the second flag indicating whether or not the program has written to the location in the virtual address space; and inferring whether the behavior of the program is normal or abnormal, based on the access information.
    Type: Application
    Filed: January 30, 2019
    Publication date: August 1, 2019
    Applicant: Preferred Networks, Inc.
    Inventors: Ren KIMURA, Hirochika ASAI, Yusuke DOI
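
A toy sketch of the read/write flags described above: while the program runs in the virtual environment, each virtual page gets a "read" and a "written" flag, the flags are summarized into access information, and a simple placeholder check stands in for the normal/abnormal inference.

from collections import defaultdict

access_flags = defaultdict(lambda: {"read": False, "written": False})

def on_read(page):
    access_flags[page]["read"] = True        # first flag: the program read this location

def on_write(page):
    access_flags[page]["written"] = True     # second flag: the program wrote this location

def access_information():
    # summarize the flags into a small feature dictionary
    reads = sum(f["read"] for f in access_flags.values())
    writes = sum(f["written"] for f in access_flags.values())
    return {"pages_read": reads, "pages_written": writes}

# simulate a short run, then infer normal/abnormal from the access information
for page in (0x1000, 0x2000, 0x2000, 0x3000):
    on_read(page)
on_write(0x2000)
info = access_information()
print("abnormal" if info["pages_written"] > info["pages_read"] else "normal")
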
  • Publication number: 20190236813
    Abstract: An information processing apparatus includes a memory and processing circuitry coupled to the memory. The processing circuitry is configured to acquire target image data to be subjected to coloring, designate an area to be subjected to coloring by using reference information in the target image data, determine reference information to be used for the designated area, and perform a coloring process on the designated area by using the determined reference information, based on a learned model for coloring which has been previously learned in the coloring process using the reference information.
    Type: Application
    Filed: January 29, 2019
    Publication date: August 1, 2019
    Applicant: Preferred Networks, Inc.
    Inventor: Taizan YONETSUJI
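
A minimal sketch of area-wise coloring with reference information as described above: a mask designates the area, a reference color is chosen for it, and a placeholder learned model colors the image conditioned on the mask and the reference. The single-convolution model and RGB reference are assumptions for illustration.

import torch
import torch.nn as nn

coloring_model = nn.Conv2d(1 + 1 + 3, 3, kernel_size=3, padding=1)  # image + area mask + reference

def color_area(gray_image, area_mask, reference_rgb):
    ref_planes = reference_rgb.view(1, 3, 1, 1).expand(-1, -1, *gray_image.shape[-2:])
    inputs = torch.cat([gray_image, area_mask, ref_planes], dim=1)
    return coloring_model(inputs)

colored = color_area(torch.rand(1, 1, 64, 64),           # target image data
                     torch.zeros(1, 1, 64, 64),          # designated area
                     torch.tensor([0.9, 0.2, 0.2]))      # determined reference information
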
  • Publication number: 20190197353
    Abstract: An information processing device includes a memory, and processing circuitry coupled to the memory. The processing circuitry is configured to acquire gradation processing target image data, and perform gradation processing on the gradation processing target image data based on a learned model learned in advance.
    Type: Application
    Filed: December 26, 2018
    Publication date: June 27, 2019
    Applicant: Preferred Networks, Inc.
    Inventor: Taizan YONETSUJI
  • Patent number: 10317853
    Abstract: A fault prediction system includes a machine learning device that learns conditions associated with a fault of an industrial machine. The machine learning device includes: a state observation unit that, while the industrial machine is in operation or at rest, observes a state variable including, e.g., data output from a sensor, internal data of control software, or computational data obtained based on these data; a determination data obtaining unit that obtains determination data used to determine whether a fault has occurred in the industrial machine or the degree of fault; and a learning unit that learns the conditions associated with the fault of the industrial machine in accordance with a training data set generated based on a combination of the state variable and the determination data.
    Type: Grant
    Filed: July 27, 2016
    Date of Patent: June 11, 2019
    Assignees: FANUC CORPORATION, PREFERRED NETWORKS, INC.
    Inventors: Shougo Inagaki, Hiroshi Nakagawa, Daisuke Okanohara, Ryosuke Okuta, Eiichi Matsumoto, Keigo Kawaai
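
A toy sketch of the training-set construction described above: each example pairs a state variable vector (sensor readings, internal control data, derived values) with determination data saying whether a fault occurred, and a learner is fit on that combination. The synthetic data and the choice of a random-forest learner are assumptions for illustration.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
state_variables = rng.normal(size=(200, 5))                  # observed while in operation or at rest
fault_occurred = (state_variables[:, 0] + state_variables[:, 3] > 1.5).astype(int)  # determination data

learner = RandomForestClassifier(n_estimators=50, random_state=0)
learner.fit(state_variables, fault_occurred)                 # learn the fault conditions

new_observation = rng.normal(size=(1, 5))
print(learner.predict(new_observation))                      # predicted fault / no fault
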
  • Publication number: 20190156544
    Abstract: A data augmentation apparatus includes a memory and processing circuitry coupled to the memory. The processing circuitry is configured to input a first data set including first image data and first text data related to the first image data, perform first image processing on the first image data to obtain second image data, edit the first text data based on contents of the first image processing to obtain the edited first text data as second text data, and output an augmented data set including the second image data and the second text data.
    Type: Application
    Filed: November 21, 2018
    Publication date: May 23, 2019
    Applicant: Preferred Networks, Inc.
    Inventors: Yuta TSUBOI, Yuya UNNO, Jun HATORI, Sosuke KOBAYASHI, Yuta KIKUCHI
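
A toy sketch of the paired augmentation described above: the image transform and the text edit are chosen together so the text stays consistent with the transformed image, here a horizontal flip paired with swapping "left" and "right". The specific transform/edit pair is an illustrative assumption.

import numpy as np

def flip_and_edit(image, caption):
    flipped = np.flip(image, axis=1)                          # first image processing
    swapped = (caption.replace("left", "\0")
                      .replace("right", "left")
                      .replace("\0", "right"))                # edit the text to match the flip
    return flipped, swapped                                   # augmented data set entry

image = np.zeros((4, 4, 3))
augmented_image, augmented_caption = flip_and_edit(image, "a cup on the left of the plate")
print(augmented_caption)   # "a cup on the right of the plate"
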