Patents by Inventor James K. Baker

James K. Baker has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200293897
    Abstract: A machine learning system includes a coach machine learning system that uses machine learning to help a student machine learning system learn its system. By monitoring the student learning system, the coach machine learning system can learn (through machine learning techniques) “hyperparameters” for the student learning system that control the machine learning process for the student learning system. The machine learning coach could also determine structural modifications for the student learning system architecture. The learning coach can also control data flow to the student learning system.
    Type: Application
    Filed: June 3, 2020
    Publication date: September 17, 2020
    Inventor: James K. Baker
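The coach-and-student arrangement in this abstract can be illustrated with a toy sketch. This is not the patented method: here the "student" is a single weight minimizing a quadratic loss, and the coach tunes one hyperparameter (the learning rate) by watching the student's loss; all names are invented for the example.

```python
# Toy sketch of a learning coach adapting a student's hyperparameter.
# Illustrative only -- not the patented system.

def student_step(lr, w, grad):
    """One gradient-descent update of the student's single weight."""
    return w - lr * grad

class LearningCoach:
    """Monitors the student's loss history and adapts its learning rate."""

    def __init__(self, lr):
        self.lr = lr
        self.prev_loss = None

    def observe(self, loss):
        # If the student's loss rose, halve the learning rate; if it
        # improved, nudge the rate upward to speed learning.
        if self.prev_loss is not None:
            if loss > self.prev_loss:
                self.lr *= 0.5
            else:
                self.lr *= 1.05
        self.prev_loss = loss
        return self.lr

# Student task: minimize f(w) = (w - 3)^2 starting from w = 0.
coach = LearningCoach(lr=0.9)
w = 0.0
for _ in range(50):
    loss = (w - 3.0) ** 2
    lr = coach.observe(loss)
    w = student_step(lr, w, 2.0 * (w - 3.0))
```

Because the coach only observes the student's progress, the same pattern extends to other hyperparameters (minibatch size, regularization strength) or, as the abstract notes, to structural modifications and data-flow control.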
  • Publication number: 20200293890
    Abstract: Systems and methods to improve the robustness of a network that has been trained to convergence, particularly with respect to small or imperceptible changes to the input data. Various techniques, which can be utilized either individually or in various combinations, can include adding biases to the input nodes of the network, increasing the minibatch size of the training data, adding special nodes to the network that have activations that do not necessarily change with each data example of the training data, splitting the training data based upon the gradient direction, and making other intentionally adversarial changes to the input of the neural network. In more robust networks, a correct classification is less likely to be disturbed by random or even intentionally adversarial changes in the input values.
    Type: Application
    Filed: May 28, 2020
    Publication date: September 17, 2020
    Inventor: James K. Baker
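A small numeric illustration of the fragility these techniques target (this is the problem setting, not the patented remedies): for a toy linear classifier, a worst-case input perturbation of size eps shifts the decision score by exactly eps times the L1 norm of the weights, so imperceptible input changes can move the score substantially.

```python
import numpy as np

# Toy demonstration of adversarial sensitivity for a linear classifier.
# Weights and input are made-up values for illustration.
w = np.array([1.0, -2.0, 0.5])    # classifier weights
x = np.array([0.3, -0.4, 1.2])    # an input example
score = float(w @ x)              # decision score; its sign is the class

eps = 0.05                        # perturbation budget per input component
# FGSM-style worst case: push every component against the current decision.
x_adv = x - eps * np.sign(w) * np.sign(score)
score_adv = float(w @ x_adv)      # score drops by eps * ||w||_1
```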
  • Publication number: 20200285948
    Abstract: Computer systems and computer-implemented methods recursively train a content-addressable auto-associative memory such that: (i) the content-addressable auto-associative memory system is trained to produce an output pattern for each of the input examples; and (ii) the quantity of learned parameters for the content-addressable auto-associative memory is equal to the number of input variables times a quantity that is independent of the number of input variables. The quantity of learned parameters for the content-addressable auto-associative memory system can be varied based on the number of input examples to be learned.
    Type: Application
    Filed: September 19, 2018
    Publication date: September 10, 2020
    Inventor: James K. Baker
  • Publication number: 20200285939
    Abstract: Various systems and methods are described herein for improving the aggressive development of machine learning systems. In machine learning, there is always a trade-off between allowing a machine learning system to learn as much as it can from training data and overfitting on the training data. This trade-off is important because overfitting usually causes performance on new data to be worse. However, various systems and methods can be utilized to separate the process of detailed learning and knowledge acquisition and the process of imposing restrictions and smoothing estimates, thereby allowing machine learning systems to aggressively learn from training data, while mitigating the effects of overfitting on the training data.
    Type: Application
    Filed: September 28, 2018
    Publication date: September 10, 2020
    Inventor: James K. Baker
  • Publication number: 20200279188
    Abstract: Computer systems and computer-implemented methods train a machine-learning regression system. The method comprises the steps of generating, with a machine-learning generator, output patterns; distorting the output patterns of the generator by a scale factor to generate distorted output patterns; and training the machine-learning regression system to predict the scale factor, where the regression system receives the distorted output patterns as input and the scale factor is a target value for the regression system. The method may further comprise, after training the machine-learning regression system, training a second machine-learning generator by back-propagating partial derivatives of an error cost function from the regression system to the second machine-learning generator and training the second machine-learning generator using stochastic gradient descent.
    Type: Application
    Filed: September 17, 2018
    Publication date: September 3, 2020
    Inventors: James K. Baker, Bradley J. Baker
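The training setup in this abstract can be sketched in a few lines. This is an illustrative toy, not the patented system: the "generator" emits unit-norm random patterns, the distortion is a random scale factor, and the regressor is a one-feature linear fit that recovers the factor.

```python
import numpy as np

# Toy sketch: train a regression to predict the scale factor that
# distorted a generator's output patterns. All details are stand-ins.
rng = np.random.default_rng(1)

def generator(n):
    """Toy 'generator': unit-norm random patterns."""
    g = rng.normal(size=(n, 8))
    return g / np.linalg.norm(g, axis=1, keepdims=True)

n = 200
patterns = generator(n)
scales = rng.uniform(0.5, 2.0, size=n)      # distortion scale factors
distorted = patterns * scales[:, None]      # distorted output patterns

# Regression target is the scale factor. Because the patterns are
# unit-norm, the Euclidean norm of a distorted pattern equals its scale
# factor, so a degree-1 least-squares fit recovers it almost exactly.
feature = np.linalg.norm(distorted, axis=1)
coef = np.polyfit(feature, scales, 1)
pred = np.polyval(coef, feature)
```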
  • Publication number: 20200279165
    Abstract: Computer systems and computer-implemented methods train and/or operate, once trained, a machine-learning system that comprises a plurality of generator-detector pairs. The machine-learning computer system comprises a set of processor cores and computer memory that stores software. When executed by the set of processor cores, the software causes the set of processor cores to implement a plurality of generator-detector pairs, in which: (i) each generator-detector pair comprises a machine-learning data generator and a machine-learning data detector; and (ii) each generator-detector pair is for a corresponding cluster of data examples respectively, such that, for each generator-detector pair, the generator is for generating data examples in the corresponding cluster and the detector is for detecting whether data examples are within the corresponding cluster.
    Type: Application
    Filed: September 14, 2018
    Publication date: September 3, 2020
    Inventor: James K. Baker
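The generator-detector pairing can be sketched with simple geometric clusters standing in for the learned models. This is illustrative only: in the patent both halves are machine-learning systems, whereas here each pair is a Gaussian sampler plus a distance test, and all names are invented.

```python
import numpy as np

# Toy sketch: one generator-detector pair per cluster. The generator
# draws samples from its cluster; the detector tests membership.
rng = np.random.default_rng(3)

class GeneratorDetectorPair:
    def __init__(self, center, radius):
        self.center = np.asarray(center, dtype=float)
        self.radius = radius

    def generate(self):
        """Generator: draw a sample from this pair's cluster."""
        return self.center + rng.normal(scale=self.radius / 3,
                                        size=self.center.shape)

    def detect(self, x):
        """Detector: is x within this pair's cluster?"""
        return np.linalg.norm(np.asarray(x) - self.center) <= self.radius

pairs = [GeneratorDetectorPair([0.0, 0.0], 1.0),
         GeneratorDetectorPair([5.0, 5.0], 1.0)]
```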
  • Publication number: 20200265320
    Abstract: Computer systems and methods generate a stochastic categorical autoencoder learning network (SCAN). The SCAN is trained to have an encoder network that outputs, subject to one or more constraints, parameters for parametric probability distributions of sample random variables from input data. The parameters comprise measures of central tendency and measures of dispersion. The one or more constraints comprise a first constraint that constrains a measure of a magnitude of a vector of the measures of central tendency as compared to a measure of a magnitude of a vector of the measures of dispersion. Thereafter, the sample random variables are generated from the parameters and a decoder is trained to output the input data from the sample random variables.
    Type: Application
    Filed: May 6, 2020
    Publication date: August 20, 2020
    Inventor: James K. Baker
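The constraint this abstract describes, tying the magnitude of the mean vector to the magnitude of the dispersion vector, can be shown numerically. This is a minimal sketch under invented names, not the SCAN training procedure: we simply rescale the means to satisfy the constraint, then draw the sample random variables by reparameterization.

```python
import numpy as np

# Toy sketch of a SCAN-style constraint plus reparameterized sampling.
rng = np.random.default_rng(2)

def constrain(mu, sigma, ratio=1.0):
    """Rescale mu so that ||mu|| equals ratio * ||sigma||."""
    target = ratio * np.linalg.norm(sigma)
    return mu * (target / np.linalg.norm(mu))

mu = np.array([3.0, 4.0])       # encoder means (central tendency), ||mu|| = 5
sigma = np.array([0.6, 0.8])    # encoder dispersions, ||sigma|| = 1
mu_c = constrain(mu, sigma)     # constrained means, now ||mu_c|| = 1

# Reparameterized sample random variables: z = mu + sigma * eps
eps = rng.standard_normal(2)
z = mu_c + sigma * eps
```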
  • Publication number: 20200210812
    Abstract: Computer-implemented, machine-learning systems and methods relate to a combination of neural networks. The systems and methods train the respective member networks both (i) to be diverse and yet (ii) according to a common, overall objective. Each member network is trained or retrained jointly with all the other member networks, including member networks that may not have been present in the ensemble when a member is first trained.
    Type: Application
    Filed: September 26, 2018
    Publication date: July 2, 2020
    Inventor: James K. Baker
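The twin goals in this abstract, a common overall objective plus diversity among members, can be sketched with a combined loss. This is illustrative only and borrows the flavor of negative-correlation learning rather than the patented training method; all names and numbers are invented.

```python
import numpy as np

# Toy sketch: ensemble loss = shared objective + diversity term that
# penalizes members whose residual errors agree with each other.
def ensemble_loss(outputs, target, diversity_weight=0.1):
    """outputs: (members, samples) array of member predictions."""
    ensemble_pred = outputs.mean(axis=0)
    error = np.mean((ensemble_pred - target) ** 2)   # common objective
    residuals = outputs - target
    m, n = outputs.shape
    agree = 0.0
    for i in range(m):
        for j in range(i + 1, m):
            agree += residuals[i] @ residuals[j] / n  # pairwise agreement
    return error + diversity_weight * agree

target = np.array([0.0, 1.0, 2.0, 3.0])
identical = np.stack([target + 0.1, target + 0.1])   # members err the same way
diverse = np.stack([target + 0.1, target - 0.1])     # members' errors cancel
```

The diverse pair scores strictly lower: its errors cancel in the ensemble average, and its agreement term is negative.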
  • Publication number: 20200210842
    Abstract: Machine-learning data generators use an additional objective to avoid generating data that is too similar to any previously known data example. This prevents plagiarism or simple copying of existing data examples, enhancing the ability of a generator to usefully generate novel data. A formulation of generative adversarial network (GAN) learning as the mixed-strategy minimax solution of a zero-sum game solves the convergence and stability problem of GAN learning, without suffering mode collapse.
    Type: Application
    Filed: September 28, 2018
    Publication date: July 2, 2020
    Inventor: James K. Baker
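The anti-copying objective can be illustrated with a hinge-style penalty on the distance to the nearest known example. This is a toy form chosen for clarity, not the objective claimed in the publication; all names are invented.

```python
import numpy as np

# Toy sketch: penalize generated samples that sit too close to any
# previously known data example, rewarding genuine novelty.
known = np.array([[0.0, 0.0], [1.0, 1.0]])   # previously known data examples

def novelty_penalty(sample, known, margin=0.5):
    """Positive when the nearest known example is closer than margin."""
    d = np.min(np.linalg.norm(known - sample, axis=1))
    return max(0.0, margin - d)

copy_like = np.array([1.0, 1.05])   # nearly a copy of a known example
novel = np.array([3.0, -2.0])       # far from all known examples
```

In a full generator objective this term would be added to the usual adversarial loss, so gradient descent pushes samples away from near-duplicates.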
  • Publication number: 20200184337
    Abstract: A machine learning system includes a coach machine learning system that uses machine learning to help a student machine learning system learn its system. By monitoring the student learning system, the coach machine learning system can learn (through machine learning techniques) “hyperparameters” for the student learning system that control the machine learning process for the student learning system. The machine learning coach could also determine structural modifications for the student learning system architecture. The learning coach can also control data flow to the student learning system.
    Type: Application
    Filed: September 18, 2017
    Publication date: June 11, 2020
    Inventor: James K. Baker
  • Patent number: 10679129
    Abstract: Computer systems and methods generate a stochastic categorical autoencoder learning network (SCAN). The SCAN is trained to have an encoder network that outputs, subject to one or more constraints, parameters for parametric probability distributions of sample random variables from input data. The parameters comprise measures of central tendency and measures of dispersion. The one or more constraints comprise a first constraint that constrains a measure of a magnitude of a vector of the measures of central tendency as compared to a measure of a magnitude of a vector of the measures of dispersion. Thereafter, the sample random variables are generated from the parameters and a decoder is trained to output the input data from the sample random variables.
    Type: Grant
    Filed: September 7, 2018
    Date of Patent: June 9, 2020
    Assignee: D5AI LLC
    Inventor: James K. Baker
  • Publication number: 20200143240
    Abstract: Systems and methods to improve the robustness of a network that has been trained to convergence, particularly with respect to small or imperceptible changes to the input data. Various techniques, which can be utilized either individually or in various combinations, can include adding biases to the input nodes of the network, increasing the minibatch size of the training data, adding special nodes to the network that have activations that do not necessarily change with each data example of the training data, splitting the training data based upon the gradient direction, and making other intentionally adversarial changes to the input of the neural network. In more robust networks, a correct classification is less likely to be disturbed by random or even intentionally adversarial changes in the input values.
    Type: Application
    Filed: June 11, 2018
    Publication date: May 7, 2020
    Inventor: James K. Baker
  • Publication number: 20200134451
    Abstract: Systems and methods improve the performance of a network that has converged such that the gradient of the network and all the partial derivatives are zero (or close to zero) by splitting the training data such that, on each subset of the split training data, some nodes or arcs (i.e., connections between a node and previous or subsequent layers of the network) have individual partial derivative values that are different from zero on the split subsets of the data, although their partial derivatives averaged over the whole set of training data are close to zero. The present system and method can create a new network by splitting the candidate nodes or arcs that diverge from zero and then train the resulting network with each selected node trained on the corresponding cluster of the data. Because the direction of the gradient is different for each of the nodes or arcs that are split, the nodes and their arcs in the new network will train to be different. Therefore, the new network is not at a stationary point.
    Type: Application
    Filed: June 1, 2018
    Publication date: April 30, 2020
    Inventor: James K. Baker
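The situation this abstract exploits is easy to show numerically: a node's partial derivative can average to (near) zero over the full training set while being clearly nonzero on subsets. A minimal sketch, with made-up per-example derivatives, splits the data by gradient sign to expose the two groups on which split copies of the node would train differently.

```python
import numpy as np

# Toy per-example partial derivatives for one node: they cancel on
# average, so the node looks converged over the whole training set.
partials = np.array([0.8, 0.7, 0.9, -0.8, -0.7, -0.9])

avg = partials.mean()                    # ~0: apparent stationary point
pos_cluster = partials[partials > 0]     # examples pushing one way
neg_cluster = partials[partials < 0]     # examples pushing the other way
```

Training one split copy on each cluster gives each copy a strong, consistent gradient, so the enlarged network is no longer at a stationary point.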
  • Publication number: 20200090045
    Abstract: Methods and computer systems improve a trained base deep neural network by structurally changing the base deep neural network to create an updated deep neural network, such that the updated deep neural network has no degradation in performance relative to the base deep neural network on the training data. The updated deep neural network is subsequently trained. Also, an asynchronous agent for use in a machine learning system comprises a second machine learning system ML2 that is to be trained to perform some machine learning task. The asynchronous agent further comprises a learning coach LC and an optional data selector machine learning system DS. The purpose of the data selector machine learning system DS is to make the second machine learning system ML2 more efficient in its learning (by selecting a set of training data that is smaller but sufficient) and/or more effective (by selecting a set of training data that is focused on an important task).
    Type: Application
    Filed: May 31, 2018
    Publication date: March 19, 2020
    Inventor: James K. Baker
  • Publication number: 20200051550
    Abstract: A multi-stage machine learning and recognition system comprises multiple individual machine learning systems arranged in multiple stages, where data is passed from a machine learning system in one stage to one or more machine learning systems in a subsequent, higher-level stage of the structure according to the logic of the machine learning system. The multi-stage machine learning system can be arranged in a final stage and one or more non-final stages, where the one or more non-final stages direct data generally towards a selected one or more machine learning systems within the final stage, but less than all of the machine learning systems in the final stage. The multi-stage machine learning system can additionally include a learning coach and data management system, which is configured to control the distribution of data throughout the multi-stage structure of machine learning systems by observing the internal state of the structure.
    Type: Application
    Filed: April 16, 2018
    Publication date: February 13, 2020
    Inventor: James K. Baker
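The routing idea in this abstract, where a non-final stage forwards each input to only a subset of final-stage systems, can be sketched with a trivial router and two specialists. Everything here is an invented stand-in; the patented stages are themselves machine learning systems.

```python
# Toy sketch: a non-final stage routes each input to one final-stage
# specialist rather than broadcasting to all of them.

def router(x):
    """Non-final stage: choose which final-stage specialist sees x."""
    return "negatives" if x < 0 else "positives"

# Final stage: each specialist handles only its share of the data.
specialists = {
    "negatives": lambda x: ("neg", abs(x)),
    "positives": lambda x: ("pos", x),
}

def classify(x):
    """Route x through the stages and return the specialist's output."""
    return specialists[router(x)](x)
```

In the patented arrangement a learning coach and data management system would additionally observe the internal state of the stages to control this distribution of data.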
  • Patent number: 10529892
    Abstract: A method for growth and fabrication of semipolar (Ga,Al,In,B)N thin films, heterostructures, and devices, comprising identifying desired material properties for a particular device application, selecting a semipolar growth orientation based on the desired material properties, selecting a suitable substrate for growth of the selected semipolar growth orientation, growing a planar semipolar (Ga,Al,In,B)N template or nucleation layer on the substrate, and growing the semipolar (Ga,Al,In,B)N thin films, heterostructures or devices on the planar semipolar (Ga,Al,In,B)N template or nucleation layer. The method results in a large area of the semipolar (Ga,Al,In,B)N thin films, heterostructures, and devices being parallel to the substrate surface.
    Type: Grant
    Filed: September 7, 2017
    Date of Patent: January 7, 2020
    Assignees: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA, Japan Science and Technology Agency
    Inventors: Robert M. Farrell, Jr., Troy J. Baker, Arpan Chakraborty, Benjamin A. Haskell, P. Morgan Pattison, Rajat Sharma, Umesh K. Mishra, Steven P. DenBaars, James S. Speck, Shuji Nakamura
  • Patent number: 10515512
    Abstract: An electronic gaming machine with a primary bill validator and a separate, auxiliary bill validator in case the primary bill validator either fails or performs less than optimally. The primary bill validator operates as the default bill validator to accept bills (e.g., paper currency and/or cashless ticket vouchers) to fund wagering activities of the EGM and the separate, auxiliary bill validator selectively operates to accept bills (e.g., paper currency and/or cashless ticket vouchers) to fund wagering activities of the EGM upon an occurrence of a bill validator switch event.
    Type: Grant
    Filed: September 25, 2018
    Date of Patent: December 24, 2019
    Assignee: IGT
    Inventors: Stewart Thoeni, James Vasquez, Steven P. McGahn, Brian K. Baker
  • Publication number: 20190291067
    Abstract: Fabrication of functional polymer-based particles by crosslinking UV-curable polymer drops in mid-air and collecting the crosslinked particles in a solid container, a liquid suspension, or an air flow. The particles can contain different phases in the form of layered structures that contain one to multiple cores, or structures that are blended with dissolved or emulsified smaller domains. A curing system produces ultraviolet rays that are directed onto the particles in the jet stream from one side. A reflector positioned on the other side of the jet stream reflects the ultraviolet rays back onto the particles in the jet stream.
    Type: Application
    Filed: March 22, 2018
    Publication date: September 26, 2019
    Applicants: LAWRENCE LIVERMORE NATIONAL SECURITY, LLC, PURDUE RESEARCH FOUNDATION
    Inventors: Congwang Ye, Roger D. Aines, Sarah E. Baker, Caitlyn Christian Cook, Eric B. Duoss, Joshua D. Kuntz, Elaine Lee, James S. Oakdale, Andrew J. Pascall, Joshuah K. Stolaroff, Marcus A. Worsley, Carlos J. Martinez
  • Patent number: D878473
    Type: Grant
    Filed: September 25, 2018
    Date of Patent: March 17, 2020
    Assignee: IGT
    Inventors: Stewart Thoeni, James Vasquez, Steven P. McGahn, Brian K. Baker
  • Patent number: D893628
    Type: Grant
    Filed: September 25, 2018
    Date of Patent: August 18, 2020
    Assignee: IGT
    Inventors: Stewart Thoeni, James Vasquez, Steven P. McGahn, Brian K. Baker