Patents by Inventor Jonathan James Hunt

Jonathan James Hunt is named as an inventor on the following patent filings. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11803750
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training an actor neural network used to select actions to be performed by an agent interacting with an environment. One of the methods includes obtaining a minibatch of experience tuples; and updating current values of the parameters of the actor neural network, comprising: for each experience tuple in the minibatch: processing the training observation and the training action in the experience tuple using a critic neural network to determine a neural network output for the experience tuple, and determining a target neural network output for the experience tuple; updating current values of the parameters of the critic neural network using errors between the target neural network outputs and the neural network outputs; and updating the current values of the parameters of the actor neural network using the critic neural network.
    Type: Grant
    Filed: September 14, 2020
    Date of Patent: October 31, 2023
    Assignee: DeepMind Technologies Limited
    Inventors: Timothy Paul Lillicrap, Jonathan James Hunt, Alexander Pritzel, Nicolas Manfred Otto Heess, Tom Erez, Yuval Tassa, David Silver, Daniel Pieter Wierstra
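    A minimal Python/PyTorch sketch of the actor-critic update this abstract describes (the same abstract also appears in the related application and grant entries below). The network sizes, learning rates, discount factor and the (observation, action, reward, next observation) minibatch layout are illustrative assumptions, not details taken from the patent:
        import copy
        import torch
        import torch.nn as nn

        obs_dim, act_dim, gamma = 8, 2, 0.99   # illustrative dimensions and discount
        actor  = nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(), nn.Linear(64, act_dim), nn.Tanh())
        critic = nn.Sequential(nn.Linear(obs_dim + act_dim, 64), nn.ReLU(), nn.Linear(64, 1))
        target_actor, target_critic = copy.deepcopy(actor), copy.deepcopy(critic)
        actor_opt  = torch.optim.Adam(actor.parameters(),  lr=1e-4)
        critic_opt = torch.optim.Adam(critic.parameters(), lr=1e-3)

        def update(obs, act, rew, next_obs):
            """One update from a minibatch of experience tuples (tensors of shape [batch, ...])."""
            # Target network output for each tuple: r + gamma * Q'(s', mu'(s')).
            with torch.no_grad():
                next_q = target_critic(torch.cat([next_obs, target_actor(next_obs)], dim=-1))
                target = rew.unsqueeze(-1) + gamma * next_q
            # Update the critic's parameters using the errors between target outputs and critic outputs.
            q = critic(torch.cat([obs, act], dim=-1))
            critic_loss = ((q - target) ** 2).mean()
            critic_opt.zero_grad(); critic_loss.backward(); critic_opt.step()
            # Update the actor's parameters using the critic: ascend Q(s, actor(s)).
            actor_loss = -critic(torch.cat([obs, actor(obs)], dim=-1)).mean()
            actor_opt.zero_grad(); actor_loss.backward(); actor_opt.step()
    The target networks are held fixed during the update and would typically be nudged toward the trained networks between minibatches (e.g. by Polyak averaging); that step is omitted here.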
  • Patent number: 11151443
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a sparse memory access subsystem that is configured to perform operations comprising generating a sparse set of reading weights that includes a respective reading weight for each of the plurality of locations in the external memory using the read key, reading data from the plurality of locations in the external memory in accordance with the sparse set of reading weights, generating a set of writing weights that includes a respective writing weight for each of the plurality of locations in the external memory, and writing the write vector to the plurality of locations in the external memory in accordance with the writing weights.
    Type: Grant
    Filed: February 3, 2017
    Date of Patent: October 19, 2021
    Assignee: DeepMind Technologies Limited
    Inventors: Ivo Danihelka, Gregory Duncan Wayne, Fu-min Wang, Edward Thomas Grefenstette, Jack William Rae, Alexander Benjamin Graves, Timothy Paul Lillicrap, Timothy James Alexander Harley, Jonathan James Hunt
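    A rough Python/NumPy sketch of the sparse read and write operations this abstract describes (the same abstract appears again in the application entry below). The cosine-similarity addressing, the top-k cut-off and the erase vector are assumptions about one way such a subsystem could work, not details quoted from the patent:
        import numpy as np

        def sparse_read(memory, read_key, k=4):
            """Sparse reading weights: nonzero only at the k locations best matching the read key."""
            sims = memory @ read_key / (
                np.linalg.norm(memory, axis=1) * np.linalg.norm(read_key) + 1e-8)
            weights = np.zeros(len(memory))
            top_k = np.argsort(sims)[-k:]            # the k most similar memory locations
            w = np.exp(sims[top_k])
            weights[top_k] = w / w.sum()             # normalize over the sparse subset only
            return weights @ memory, weights         # read vector and sparse set of reading weights

        def sparse_write(memory, writing_weights, erase_vec, write_vec):
            """Write the write vector to the memory locations in accordance with the writing weights."""
            memory = memory * (1.0 - np.outer(writing_weights, erase_vec))   # erase step
            return memory + np.outer(writing_weights, write_vec)             # additive write

        mem = np.random.randn(128, 16)               # 128 locations of width 16 (illustrative)
        read_vec, read_w = sparse_read(mem, np.random.randn(16))
        mem = sparse_write(mem, read_w, np.full(16, 0.1), np.random.randn(16))
    Because only k weights are nonzero, each read touches a fixed number of locations regardless of memory size, which is the practical appeal of sparse memory access.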
  • Publication number: 20200410351
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training an actor neural network used to select actions to be performed by an agent interacting with an environment. One of the methods includes obtaining a minibatch of experience tuples; and updating current values of the parameters of the actor neural network, comprising: for each experience tuple in the minibatch: processing the training observation and the training action in the experience tuple using a critic neural network to determine a neural network output for the experience tuple, and determining a target neural network output for the experience tuple; updating current values of the parameters of the critic neural network using errors between the target neural network outputs and the neural network outputs; and updating the current values of the parameters of the actor neural network using the critic neural network.
    Type: Application
    Filed: September 14, 2020
    Publication date: December 31, 2020
    Inventors: Timothy Paul Lillicrap, Jonathan James Hunt, Alexander Pritzel, Nicolas Manfred Otto Heess, Tom Erez, Yuval Tassa, David Silver, Daniel Pieter Wierstra
  • Patent number: 10776692
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training an actor neural network used to select actions to be performed by an agent interacting with an environment. One of the methods includes obtaining a minibatch of experience tuples; and updating current values of the parameters of the actor neural network, comprising: for each experience tuple in the minibatch: processing the training observation and the training action in the experience tuple using a critic neural network to determine a neural network output for the experience tuple, and determining a target neural network output for the experience tuple; updating current values of the parameters of the critic neural network using errors between the target neural network outputs and the neural network outputs; and updating the current values of the parameters of the actor neural network using the critic neural network.
    Type: Grant
    Filed: July 22, 2016
    Date of Patent: September 15, 2020
    Assignee: DeepMind Technologies Limited
    Inventors: Timothy Paul Lillicrap, Jonathan James Hunt, Alexander Pritzel, Nicolas Manfred Otto Heess, Tom Erez, Yuval Tassa, David Silver, Daniel Pieter Wierstra
  • Publication number: 20170228638
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a sparse memory access subsystem that is configured to perform operations comprising generating a sparse set of reading weights that includes a respective reading weight for each of the plurality of locations in the external memory using the read key, reading data from the plurality of locations in the external memory in accordance with the sparse set of reading weights, generating a set of writing weights that includes a respective writing weight for each of the plurality of locations in the external memory, and writing the write vector to the plurality of locations in the external memory in accordance with the writing weights.
    Type: Application
    Filed: February 3, 2017
    Publication date: August 10, 2017
    Inventors: Ivo Danihelka, Gregory Duncan Wayne, Fu-min Wang, Edward Thomas Grefenstette, Jack William Rae, Alexander Benjamin Graves, Timothy Paul Lillicrap, Timothy James Alexander Harley, Jonathan James Hunt
  • Patent number: 9652713
    Abstract: Apparatus and methods for developing parallel networks. A parallel network design may comprise a general purpose language code (GPC) portion and a network description (ND) portion. General purpose language (GPL) tools may be utilized in designing the network. The GPL tools may be configured to produce a network specification language (NSL) engine adapted to generate hardware-optimized machine executable code corresponding to the network description. The developer may be enabled to describe a parameter of the network. The GPC portion may be automatically updated consistent with the network parameter value. The GPC byte code may be introspected by the NSL engine to provide the underlying source code, which may be automatically reinterpreted to produce the hardware-optimized machine code. The optimized machine code may be executed in parallel.
    Type: Grant
    Filed: April 4, 2016
    Date of Patent: May 16, 2017
    Assignee: QUALCOMM Technologies, Inc.
    Inventors: Jonathan James Hunt, Oleg Sinyavskiy, Robert Howard Kimball, Eric Martin Hall, Jeffrey Alexander Levin, Paul Bender, Michael-David Nakayoshi Canoy
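    This entry and the related publication and grant entries below share an abstract about introspecting network-description code written in a general purpose language and regenerating it as optimized code. A toy Python sketch of just the introspection step; the function and the idea that an NSL-style engine would consume its source or byte code are illustrative assumptions rather than the patented mechanism:
        import dis
        import inspect

        def neuron_update(v, i_syn, tau=20.0):
            """Network-description snippet written in the general purpose language (plain Python here)."""
            return v + (-v + i_syn) / tau

        # A hypothetical NSL-style engine could introspect the byte code of the description
        # to recover the underlying source, then reinterpret it to emit hardware-optimized,
        # parallel machine code. Only the introspection is shown; the backend is not.
        source = inspect.getsource(neuron_update)
        bytecode = [instr.opname for instr in dis.Bytecode(neuron_update)]
        print(source)
        print(bytecode)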
  • Publication number: 20170024643
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training an actor neural network used to select actions to be performed by an agent interacting with an environment. One of the methods includes obtaining a minibatch of experience tuples; and updating current values of the parameters of the actor neural network, comprising: for each experience tuple in the minibatch: processing the training observation and the training action in the experience tuple using a critic neural network to determine a neural network output for the experience tuple, and determining a target neural network output for the experience tuple; updating current values of the parameters of the critic neural network using errors between the target neural network outputs and the neural network outputs; and updating the current values of the parameters of the actor neural network using the critic neural network.
    Type: Application
    Filed: July 22, 2016
    Publication date: January 26, 2017
    Inventors: Timothy Paul Lillicrap, Jonathan James Hunt, Alexander Pritzel, Nicolas Manfred Otto Heess, Tom Erez, Yuval Tassa, David Silver, Daniel Pieter Wierstra
  • Publication number: 20160217370
    Abstract: Apparatus and methods for developing parallel networks. A parallel network design may comprise a general purpose language code (GPC) portion and a network description (ND) portion. General purpose language (GPL) tools may be utilized in designing the network. The GPL tools may be configured to produce a network specification language (NSL) engine adapted to generate hardware-optimized machine executable code corresponding to the network description. The developer may be enabled to describe a parameter of the network. The GPC portion may be automatically updated consistent with the network parameter value. The GPC byte code may be introspected by the NSL engine to provide the underlying source code, which may be automatically reinterpreted to produce the hardware-optimized machine code. The optimized machine code may be executed in parallel.
    Type: Application
    Filed: April 4, 2016
    Publication date: July 28, 2016
    Inventors: Jonathan James Hunt, Oleg Sinyavskiy, Robert Howard Kimball, Eric Martin Hall, Jeffrey Alexander Levin, Paul Bender, Michael-David Nakayoshi Canoy
  • Patent number: 9390369
    Abstract: Apparatus and methods for developing parallel networks. In some implementations, a network may be partitioned into multiple portions, with individual portions executed by respective threads running in parallel. Individual portions may comprise multiple neurons and synapses. In order to reduce cross-thread traffic and/or the number of synchronization locks, the network may be partitioned such that, for a given network portion, the neurons and the input synapses into those neurons are executed within the same thread. Synapse update rules may be configured to allow memory access for postsynaptic neurons and forbid memory access to presynaptic neurons. Individual threads may be afforded pairs of memory buffers configured to effectuate asynchronous data input/output to/from the thread. During an even iteration of network operation, the even buffer may be utilized to store data generated by the thread during that iteration.
    Type: Grant
    Filed: May 15, 2013
    Date of Patent: July 12, 2016
    Assignee: Brain Corporation
    Inventors: Oleg Sinyavskiy, Jonathan James Hunt
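    A small Python sketch of the partitioning and double-buffering scheme in this abstract. The Partition class, the spike representation and the threshold used as stand-in neuron dynamics are assumptions made purely for illustration:
        class Partition:
            """One network portion: its neurons plus all synapses *into* those neurons,
            so synapse updates only touch postsynaptic (thread-local) state."""

            def __init__(self, neuron_ids, in_synapses):
                self.neuron_ids = set(neuron_ids)      # neurons owned by this thread
                self.in_synapses = in_synapses         # (pre_id, post_id, weight); post_id is owned here
                self.buffers = [[], []]                # even/odd buffers for asynchronous spike I/O

            def step(self, iteration, spikes_from_other_threads):
                write_buf = self.buffers[iteration % 2]        # even iteration -> even buffer
                read_buf = self.buffers[(iteration + 1) % 2]   # data generated on the previous iteration
                write_buf.clear()
                active = set(read_buf) | set(spikes_from_other_threads)
                for pre, post, weight in self.in_synapses:
                    if pre in active and weight > 0.5:         # stand-in for real neuron dynamics
                        write_buf.append(post)
                return list(write_buf)

        part = Partition([0, 1], [(7, 0, 0.9), (8, 1, 0.2)])
        print(part.step(0, spikes_from_other_threads=[7]))     # -> [0]
    Each Partition would run in its own thread; only the returned spike lists cross thread boundaries, which is what keeps cross-thread traffic and locking low.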
  • Patent number: 9330356
    Abstract: Apparatus and methods for developing parallel networks. A parallel network design may comprise a general purpose language code (GPC) portion and a network description (ND) portion. General purpose language (GPL) tools may be utilized in designing the network. The GPL tools may be configured to produce a network specification language (NSL) engine adapted to generate hardware-optimized machine executable code corresponding to the network description. The developer may be enabled to describe a parameter of the network. The GPC portion may be automatically updated consistent with the network parameter value. The GPC byte code may be introspected by the NSL engine to provide the underlying source code, which may be automatically reinterpreted to produce the hardware-optimized machine code. The optimized machine code may be executed in parallel.
    Type: Grant
    Filed: May 1, 2013
    Date of Patent: May 3, 2016
    Assignee: QUALCOMM Technologies, Inc.
    Inventors: Jonathan James Hunt, Oleg Sinyavskiy
  • Patent number: 9195934
    Abstract: Spiking neuron network conditionally independent subset classifier apparatus and methods. In some implementations, the network may comprise one or more subset neuron layers configured to determine the presence of one or more features in a subset of a plurality of conditionally independent features. The output of the subset layer may be coupled to an aggregation layer. The state of the subset layer may be configured during training based on a training input and a reference signal. During operation, the spiking output of the subset layer may be combined by the aggregation layer to produce the classifier output. Subset layer output and/or classifier output may be encoded using spike rate, latency, and/or base-n encoding.
    Type: Grant
    Filed: January 31, 2013
    Date of Patent: November 24, 2015
    Assignee: Brain Corporation
    Inventors: Jonathan James Hunt, Oleg Sinyavskiy
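    A rate-coded Python/NumPy approximation of the subset-layer and aggregation-layer structure this abstract describes. Actual spiking dynamics, the training procedure and the latency/base-n encodings are omitted, and every weight, size and function name here is illustrative rather than taken from the patent:
        import numpy as np

        rng = np.random.default_rng(0)

        def subset_layer_rate(features, weights):
            """Spike rate of one subset layer responding to its conditionally independent feature subset."""
            return 1.0 / (1.0 + np.exp(-features @ weights))   # firing rate in [0, 1]

        def classify(feature_subsets, subset_weights, agg_weights):
            """Aggregation layer combines the subset layers' rate-coded outputs into the classifier output."""
            rates = np.array([subset_layer_rate(f, w)
                              for f, w in zip(feature_subsets, subset_weights)])
            return agg_weights @ rates                         # one score per class

        subsets = [rng.normal(size=3), rng.normal(size=2)]     # two conditionally independent feature subsets
        subset_w = [rng.normal(size=3), rng.normal(size=2)]
        agg_w = rng.normal(size=(4, 2))                        # 4 classes x 2 subset layers
        print(classify(subsets, subset_w, agg_w))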
  • Publication number: 20140330763
    Abstract: Apparatus and methods for developing parallel networks. A parallel network design may comprise a general purpose language code (GPC) portion and a network description (ND) portion. General purpose language (GPL) tools may be utilized in designing the network. The GPL tools may be configured to produce a network specification language (NSL) engine adapted to generate hardware-optimized machine executable code corresponding to the network description. The developer may be enabled to describe a parameter of the network. The GPC portion may be automatically updated consistent with the network parameter value. The GPC byte code may be introspected by the NSL engine to provide the underlying source code, which may be automatically reinterpreted to produce the hardware-optimized machine code. The optimized machine code may be executed in parallel.
    Type: Application
    Filed: May 1, 2013
    Publication date: November 6, 2014
    Inventors: Jonathan James Hunt, Oleg Sinyavskiy