Patents by Inventor Jonathan James Hunt
Jonathan James Hunt has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11803750
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training an actor neural network used to select actions to be performed by an agent interacting with an environment. One of the methods includes obtaining a minibatch of experience tuples; and updating current values of the parameters of the actor neural network, comprising: for each experience tuple in the minibatch: processing the training observation and the training action in the experience tuple using a critic neural network to determine a neural network output for the experience tuple, and determining a target neural network output for the experience tuple; updating current values of the parameters of the critic neural network using errors between the target neural network outputs and the neural network outputs; and updating the current values of the parameters of the actor neural network using the critic neural network.
Type: Grant
Filed: September 14, 2020
Date of Patent: October 31, 2023
Assignee: DeepMind Technologies Limited
Inventors: Timothy Paul Lillicrap, Jonathan James Hunt, Alexander Pritzel, Nicolas Manfred Otto Heess, Tom Erez, Yuval Tassa, David Silver, Daniel Pieter Wierstra
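The training loop in the abstract follows the deterministic actor-critic pattern: a critic is fit to target outputs computed from next-step transitions, then the actor is updated through the critic. A minimal NumPy sketch of one minibatch update, assuming linear actor and critic networks, arbitrary small dimensions, and a soft target-network update (details the abstract does not pin down):

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 0.99
state_dim, action_dim = 3, 1

# Linear actor mu(s) = s @ W_actor and linear critic Q(s, a) = [s, a] @ w_critic.
W_actor = rng.normal(scale=0.1, size=(state_dim, action_dim))
w_critic = rng.normal(scale=0.1, size=(state_dim + action_dim,))
# Target networks: slowly tracking copies used to compute training targets.
W_actor_t, w_critic_t = W_actor.copy(), w_critic.copy()

def critic(w, s, a):
    return np.concatenate([s, a], axis=1) @ w

def update(minibatch, lr=1e-2, tau=0.01):
    global W_actor, w_critic, W_actor_t, w_critic_t
    s, a, r, s2 = minibatch
    # Target output: r + gamma * Q'(s', mu'(s')) from the target networks.
    y = r + gamma * critic(w_critic_t, s2, s2 @ W_actor_t)
    q = critic(w_critic, s, a)
    # Critic step: gradient of the mean squared error between q and y.
    feats = np.concatenate([s, a], axis=1)
    w_critic -= lr * feats.T @ (q - y) / len(s)
    # Actor step: ascend Q via dQ/da * dmu/dtheta (constant dQ/da for a linear critic).
    dq_da = w_critic[state_dim:]
    W_actor += lr * (s.T @ np.tile(dq_da, (len(s), 1))) / len(s)
    # Soft-update the target networks toward the online networks.
    W_actor_t = tau * W_actor + (1 - tau) * W_actor_t
    w_critic_t = tau * w_critic + (1 - tau) * w_critic_t

# One synthetic minibatch of (observation, action, reward, next observation) tuples.
batch = (rng.normal(size=(8, state_dim)), rng.normal(size=(8, action_dim)),
         rng.normal(size=(8,)), rng.normal(size=(8, state_dim)))
update(batch)
```

In practice both networks would be deep and the minibatch would come from a replay buffer; the sketch only shows the order of the three updates the abstract enumerates.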
-
Patent number: 11151443
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a sparse memory access subsystem that is configured to perform operations comprising generating a sparse set of reading weights that includes a respective reading weight for each of the plurality of locations in the external memory using the read key, reading data from the plurality of locations in the external memory in accordance with the sparse set of reading weights, generating a set of writing weights that includes a respective writing weight for each of the plurality of locations in the external memory, and writing the write vector to the plurality of locations in the external memory in accordance with the writing weights.
Type: Grant
Filed: February 3, 2017
Date of Patent: October 19, 2021
Assignee: DeepMind Technologies Limited
Inventors: Ivo Danihelka, Gregory Duncan Wayne, Fu-min Wang, Edward Thomas Grefenstette, Jack William Rae, Alexander Benjamin Graves, Timothy Paul Lillicrap, Timothy James Alexander Harley, Jonathan James Hunt
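The read/write cycle described above can be illustrated with a toy content-addressed memory. This sketch assumes cosine-similarity addressing and top-k sparsification of the reading weights, which are illustrative choices rather than the claimed mechanism:

```python
import numpy as np

def sparse_read(memory, read_key, k=2):
    """Generate a sparse set of reading weights from a read key: only the
    top-k locations by key similarity get nonzero weight (an assumed
    sparsification rule), then read memory under those weights."""
    # Cosine similarity between the read key and each memory location.
    sims = memory @ read_key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(read_key) + 1e-8)
    # Keep only the k most similar locations; all others get exactly zero.
    weights = np.zeros_like(sims)
    top = np.argsort(sims)[-k:]
    weights[top] = np.exp(sims[top]) / np.exp(sims[top]).sum()
    return weights @ memory, weights

def write(memory, write_weights, write_vector):
    # Write the vector to every location, scaled by that location's weight.
    return memory + np.outer(write_weights, write_vector)

memory = np.eye(4)                                  # 4 locations, width 4
value, w = sparse_read(memory, np.array([1.0, 0.0, 0.0, 0.0]), k=2)
memory = write(memory, w, np.ones(4))
```

The point of the sparsity is that reads and writes touch only k locations, so the cost per step does not grow with the full memory size.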
-
Publication number: 20200410351
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training an actor neural network used to select actions to be performed by an agent interacting with an environment. One of the methods includes obtaining a minibatch of experience tuples; and updating current values of the parameters of the actor neural network, comprising: for each experience tuple in the minibatch: processing the training observation and the training action in the experience tuple using a critic neural network to determine a neural network output for the experience tuple, and determining a target neural network output for the experience tuple; updating current values of the parameters of the critic neural network using errors between the target neural network outputs and the neural network outputs; and updating the current values of the parameters of the actor neural network using the critic neural network.
Type: Application
Filed: September 14, 2020
Publication date: December 31, 2020
Inventors: Timothy Paul Lillicrap, Jonathan James Hunt, Alexander Pritzel, Nicolas Manfred Otto Heess, Tom Erez, Yuval Tassa, David Silver, Daniel Pieter Wierstra
-
Patent number: 10776692
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training an actor neural network used to select actions to be performed by an agent interacting with an environment. One of the methods includes obtaining a minibatch of experience tuples; and updating current values of the parameters of the actor neural network, comprising: for each experience tuple in the minibatch: processing the training observation and the training action in the experience tuple using a critic neural network to determine a neural network output for the experience tuple, and determining a target neural network output for the experience tuple; updating current values of the parameters of the critic neural network using errors between the target neural network outputs and the neural network outputs; and updating the current values of the parameters of the actor neural network using the critic neural network.
Type: Grant
Filed: July 22, 2016
Date of Patent: September 15, 2020
Assignee: DeepMind Technologies Limited
Inventors: Timothy Paul Lillicrap, Jonathan James Hunt, Alexander Pritzel, Nicolas Manfred Otto Heess, Tom Erez, Yuval Tassa, David Silver, Daniel Pieter Wierstra
-
Publication number: 20170228638
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a sparse memory access subsystem that is configured to perform operations comprising generating a sparse set of reading weights that includes a respective reading weight for each of the plurality of locations in the external memory using the read key, reading data from the plurality of locations in the external memory in accordance with the sparse set of reading weights, generating a set of writing weights that includes a respective writing weight for each of the plurality of locations in the external memory, and writing the write vector to the plurality of locations in the external memory in accordance with the writing weights.
Type: Application
Filed: February 3, 2017
Publication date: August 10, 2017
Inventors: Ivo Danihelka, Gregory Duncan Wayne, Fu-min Wang, Edward Thomas Grefenstette, Jack William Rae, Alexander Benjamin Graves, Timothy Paul Lillicrap, Timothy James Alexander Harley, Jonathan James Hunt
-
Patent number: 9652713
Abstract: Apparatus and methods for developing parallel networks. A parallel network design may comprise a general purpose code (GPC) portion and a network description (ND) portion. General purpose language (GPL) tools may be utilized in designing the network. The GPL tools may be configured to produce a network specification language (NSL) engine adapted to generate hardware-optimized machine executable code corresponding to the network description. The developer may be enabled to describe a parameter of the network. The GPC portion may be automatically updated consistent with the network parameter value. The GPC byte code may be introspected by the NSL engine to provide the underlying source code, which may be automatically reinterpreted to produce the hardware-optimized machine code. The optimized machine code may be executed in parallel.
Type: Grant
Filed: April 4, 2016
Date of Patent: May 16, 2017
Assignee: QUALCOMM Technologies, Inc.
Inventors: Jonathan James Hunt, Oleg Sinyavskiy, Robert Howard Kimball, Eric Martin Hall, Jeffrey Alexander Levin, Paul Bender, Michael-David Nakayoshi Canoy
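The introspect-then-regenerate step can be illustrated with Python's standard ast module. Everything here (the neuron_update function and the C-like kernel string it emits) is a hypothetical stand-in; it shows only the shape of recovering source from an ordinary-language description and re-emitting it for a backend compiler:

```python
import ast
import textwrap

# The "GPC" portion: ordinary Python source describing one network update rule.
gpc_source = textwrap.dedent("""
    def neuron_update(v, i_syn, tau=20.0):
        return v + (i_syn - v) / tau
""")

# A minimal stand-in for the NSL engine's introspection step: parse the
# recovered source, read off the function signature, and emit a toy
# C-style kernel declaration a hardware backend could compile.
tree = ast.parse(gpc_source)
fn = tree.body[0]
params = [a.arg for a in fn.args.args]
kernel = f"void {fn.name}(float {', float '.join(params)});"
```

A real engine would introspect compiled byte code and generate a full optimized kernel body rather than a declaration; the parse-and-reemit structure is the same.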
-
Publication number: 20170024643
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training an actor neural network used to select actions to be performed by an agent interacting with an environment. One of the methods includes obtaining a minibatch of experience tuples; and updating current values of the parameters of the actor neural network, comprising: for each experience tuple in the minibatch: processing the training observation and the training action in the experience tuple using a critic neural network to determine a neural network output for the experience tuple, and determining a target neural network output for the experience tuple; updating current values of the parameters of the critic neural network using errors between the target neural network outputs and the neural network outputs; and updating the current values of the parameters of the actor neural network using the critic neural network.
Type: Application
Filed: July 22, 2016
Publication date: January 26, 2017
Inventors: Timothy Paul Lillicrap, Jonathan James Hunt, Alexander Pritzel, Nicolas Manfred Otto Heess, Tom Erez, Yuval Tassa, David Silver, Daniel Pieter Wierstra
-
Publication number: 20160217370
Abstract: Apparatus and methods for developing parallel networks. A parallel network design may comprise a general purpose code (GPC) portion and a network description (ND) portion. General purpose language (GPL) tools may be utilized in designing the network. The GPL tools may be configured to produce a network specification language (NSL) engine adapted to generate hardware-optimized machine executable code corresponding to the network description. The developer may be enabled to describe a parameter of the network. The GPC portion may be automatically updated consistent with the network parameter value. The GPC byte code may be introspected by the NSL engine to provide the underlying source code, which may be automatically reinterpreted to produce the hardware-optimized machine code. The optimized machine code may be executed in parallel.
Type: Application
Filed: April 4, 2016
Publication date: July 28, 2016
Inventors: Jonathan James Hunt, Oleg Sinyavskiy, Robert Howard Kimball, Eric Martin Hall, Jeffrey Alexander Levin, Paul Bender, Michael-David Nakayoshi Canoy
-
Patent number: 9390369
Abstract: Apparatus and methods for developing parallel networks. In some implementations, a network may be partitioned into multiple portions, wherein individual portions are executed by respective threads running in parallel. Individual portions may comprise multiple neurons and synapses. In order to reduce cross-thread traffic and/or the number of synchronization locks, the network may be partitioned such that, for a given network portion, the neurons and the input synapses into neurons within the portion are executed within the same thread. Synapse update rules may be configured to allow memory access for postsynaptic neurons and forbid memory access to presynaptic neurons. Individual threads may be afforded pairs of memory buffers configured to effectuate asynchronous data input/output to/from the thread. During an even iteration of network operation, the even buffer may be utilized to store data generated by the thread during that iteration.
Type: Grant
Filed: May 15, 2013
Date of Patent: July 12, 2016
Assignee: Brain Corporation
Inventors: Oleg Sinyavskiy, Jonathan James Hunt
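The per-thread pair of memory buffers is classic double buffering: each thread writes one buffer on the current iteration while peers read the other buffer, which holds the previous iteration's output, so no per-value locks are needed. A minimal sketch; the Partition class, its toy update rule, and the 0.5 mixing factor are illustrative inventions, not the patented design:

```python
import threading

class Partition:
    """One network portion run by one thread. The thread owns a pair of
    buffers: on iteration t it writes buffers[t % 2] while buffers[(t+1) % 2]
    still holds the previous iteration's output for others to read."""
    def __init__(self, n_neurons):
        self.buffers = [[0.0] * n_neurons, [0.0] * n_neurons]

    def step(self, iteration, inputs):
        out = self.buffers[iteration % 2]          # buffer written this step
        prev = self.buffers[(iteration + 1) % 2]   # last step's output (read-only)
        for i in range(len(out)):
            # Toy neuron update: external input plus a decayed previous value.
            out[i] = inputs[i] + 0.5 * prev[i]

# Two portions, each stepped by its own thread; iterations are barrier-synced
# by joining the threads, with no locks inside the update itself.
parts = [Partition(2) for _ in range(2)]
for it in range(2):
    threads = [threading.Thread(target=p.step, args=(it, [1.0, 1.0]))
               for p in parts]
    for t in threads: t.start()
    for t in threads: t.join()
```

Because each thread writes only the buffer it owns for that iteration, cross-thread reads always see a complete, immutable snapshot of the previous step.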
-
Patent number: 9330356
Abstract: Apparatus and methods for developing parallel networks. A parallel network design may comprise a general purpose code (GPC) portion and a network description (ND) portion. General purpose language (GPL) tools may be utilized in designing the network. The GPL tools may be configured to produce a network specification language (NSL) engine adapted to generate hardware-optimized machine executable code corresponding to the network description. The developer may be enabled to describe a parameter of the network. The GPC portion may be automatically updated consistent with the network parameter value. The GPC byte code may be introspected by the NSL engine to provide the underlying source code, which may be automatically reinterpreted to produce the hardware-optimized machine code. The optimized machine code may be executed in parallel.
Type: Grant
Filed: May 1, 2013
Date of Patent: May 3, 2016
Assignee: QUALCOMM Technologies, Inc.
Inventors: Jonathan James Hunt, Oleg Sinyavskiy
-
Patent number: 9195934
Abstract: Spiking neuron network conditionally independent subset classifier apparatus and methods. In some implementations, the network may comprise one or more subset neuron layers configured to determine the presence of one or more features in a subset of a plurality of conditionally independent features. The output of the subset layer may be coupled to an aggregation layer. The state of the subset layer may be configured during training based on a training input and a reference signal. During operation, the spiking output of the subset layer may be combined by the aggregation layer to produce the classifier output. Subset layer output and/or classifier output may be encoded using spike rate, latency, and/or base-n encoding.
Type: Grant
Filed: January 31, 2013
Date of Patent: November 24, 2015
Assignee: Brain Corporation
Inventors: Jonathan James Hunt, Oleg Sinyavskiy
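The subset-layer/aggregation-layer structure might be sketched as follows, with a sigmoid rate code standing in for the spiking dynamics and randomly chosen weights; both are assumptions for illustration, and the real network would learn its subset-layer state from training input and a reference signal:

```python
import numpy as np

rng = np.random.default_rng(1)

def subset_layer(features, weights, threshold=0.5):
    # One subset layer reports the presence of its feature subset as a
    # firing rate in (0, 1); a sigmoid rate code stands in for spiking here.
    return 1.0 / (1.0 + np.exp(-(features @ weights - threshold)))

def aggregate(subset_rates):
    # Aggregation layer: combine the subset outputs into one classifier score.
    return subset_rates.mean()

x = rng.normal(size=6)
# Two conditionally independent feature subsets, each with its own layer.
subsets = [x[:3], x[3:]]
ws = [rng.normal(size=3), rng.normal(size=3)]
rates = np.array([subset_layer(s, w) for s, w in zip(subsets, ws)])
label = int(aggregate(rates) > 0.5)
```

Conditional independence is what licenses scoring each subset separately and combining the results; the abstract's rate, latency, or base-n encodings are alternative ways to carry the same per-subset output.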
-
Publication number: 20140330763
Abstract: Apparatus and methods for developing parallel networks. A parallel network design may comprise a general purpose code (GPC) portion and a network description (ND) portion. General purpose language (GPL) tools may be utilized in designing the network. The GPL tools may be configured to produce a network specification language (NSL) engine adapted to generate hardware-optimized machine executable code corresponding to the network description. The developer may be enabled to describe a parameter of the network. The GPC portion may be automatically updated consistent with the network parameter value. The GPC byte code may be introspected by the NSL engine to provide the underlying source code, which may be automatically reinterpreted to produce the hardware-optimized machine code. The optimized machine code may be executed in parallel.
Type: Application
Filed: May 1, 2013
Publication date: November 6, 2014
Inventors: Jonathan James Hunt, Oleg Sinyavskiy