Patents by Inventor Gregory Duncan
Gregory Duncan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11977967
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating sequences of predicted observations, for example images. In one aspect, a system comprises a controller recurrent neural network and a decoder neural network to process a set of latent variables to generate an observation. An external memory and a memory interface subsystem are configured to, for each of a plurality of time steps, receive an updated hidden state from the controller, generate a memory context vector by reading data from the external memory using the updated hidden state, determine a set of latent variables from the memory context vector, generate a predicted observation by providing the set of latent variables to the decoder neural network, write data to the external memory using the latent variables, the updated hidden state, or both, and generate a controller input for a subsequent time step from the latent variables.
Type: Grant
Filed: December 7, 2020
Date of Patent: May 7, 2024
Assignee: DeepMind Technologies Limited
Inventors: Gregory Duncan Wayne, Chia-Chun Hung, Mevlana Celaleddin Gemici, Adam Anthony Santoro
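The abstract above describes one time step of a memory interface loop: the controller's updated hidden state is used to read a memory context vector from external memory, latent variables are derived from that context, and the latents are written back. A minimal sketch of that loop follows; the softmax attention read, the tanh stand-in for latent sampling, and the single-row write rule are all illustrative assumptions, not the patented implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def read_memory(memory, hidden_state):
    # Content-based read: similarity between the hidden state (query)
    # and each memory row, turned into read weights over locations.
    scores = memory @ hidden_state
    weights = softmax(scores)
    # Memory context vector: weighted sum of memory rows.
    return weights @ memory

def write_memory(memory, latents, write_row):
    # Simplified write: overwrite one row with the latent variables.
    memory = memory.copy()
    memory[write_row] = latents
    return memory

rng = np.random.default_rng(0)
memory = rng.normal(size=(8, 4))       # 8 memory slots, width 4
hidden = rng.normal(size=4)            # updated hidden state from the controller

context = read_memory(memory, hidden)  # memory context vector
latents = np.tanh(context)             # stand-in for determining latent variables
memory = write_memory(memory, latents, write_row=0)
```

In the patented system the latents would also feed a decoder neural network to produce the predicted observation and form the controller input for the next time step.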
-
Patent number: 11950966
Abstract: Surgical tools and assemblies are employed for use in minimally invasive surgical (MIS) procedures. A surgical tool assembly includes a handle assembly and a frame assembly. The handle assembly and frame assembly are designed and constructed to have an articulation input joint and a grounding joint between them. A pitch rotational degree of freedom and a yaw rotational degree of freedom are provided by way of the articulation input joint. The handle assembly is translationally constrained relative to the frame assembly by way of the grounding joint. Intermediate bodies and joints can be provided in certain surgical tool assemblies and architectures.
Type: Grant
Filed: June 2, 2021
Date of Patent: April 9, 2024
Assignee: FlexDex, Inc.
Inventors: Zachary R. Zimmerman, Gregory Brian Bowles, Deepak Sharma, Matthew P. Weber, Christopher Paul Huang Shu, James Duncan Geiger
-
Patent number: 11875258
Abstract: Methods, systems, and apparatus for selecting actions to be performed by an agent interacting with an environment. One system includes a high-level controller neural network, a low-level controller neural network, and a subsystem. The high-level controller neural network receives an input observation and processes the input observation to generate a high-level output defining a control signal for the low-level controller. The low-level controller neural network receives a designated component of an input observation and processes the designated component and an input control signal to generate a low-level output that defines an action to be performed by the agent in response to the input observation.
Type: Grant
Filed: December 2, 2021
Date of Patent: January 16, 2024
Assignee: DeepMind Technologies Limited
Inventors: Nicolas Manfred Otto Heess, Timothy Paul Lillicrap, Gregory Duncan Wayne, Yuval Tassa
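The control split in the abstract above can be sketched in a few lines: the high-level controller maps the full observation to a control signal, and the low-level controller sees only a designated component of the observation plus that signal and emits the action. The linear-plus-tanh "networks", the sizes, and the choice of the first four observation entries as the designated component are illustrative assumptions standing in for trained neural networks.

```python
import numpy as np

rng = np.random.default_rng(1)
W_high = rng.normal(size=(3, 10))    # high-level: full observation -> control signal
W_low = rng.normal(size=(2, 4 + 3))  # low-level: [component, control] -> action

def high_level(observation):
    # High-level output defining a control signal for the low-level controller.
    return np.tanh(W_high @ observation)

def low_level(component, control):
    # Low-level output defining the action, from the designated observation
    # component and the input control signal.
    return np.tanh(W_low @ np.concatenate([component, control]))

observation = rng.normal(size=10)
component = observation[:4]          # designated component of the observation
control = high_level(observation)
action = low_level(component, control)
```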
-
Patent number: 11769049
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network system used to control an agent interacting with an environment to perform a specified task. One of the methods includes causing the agent to perform a task episode in which the agent attempts to perform the specified task; for each of one or more particular time steps in the sequence: generating a modified reward for the particular time step from (i) the actual reward at the time step and (ii) value predictions at one or more time steps that are more than a threshold number of time steps after the particular time step in the sequence; and training, through reinforcement learning, the neural network system using at least the modified rewards for the particular time steps.
Type: Grant
Filed: September 28, 2020
Date of Patent: September 26, 2023
Assignee: DeepMind Technologies Limited
Inventors: Gregory Duncan Wayne, Timothy Paul Lillicrap, Chia-Chun Hung, Joshua Simon Abramson
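The reward-modification rule described above can be sketched directly: the modified reward at step t combines the actual reward at t with value predictions from steps more than a threshold number of steps later in the episode. The specific mixing rule here (adding the mean of the eligible future value predictions) is an assumption for illustration; the abstract only specifies which quantities are combined.

```python
def modified_rewards(rewards, value_predictions, threshold):
    """Modified reward per step from the actual reward and value
    predictions more than `threshold` steps later in the sequence."""
    out = []
    for t, r in enumerate(rewards):
        # Value predictions strictly more than `threshold` steps after t.
        future = value_predictions[t + threshold + 1:]
        bonus = sum(future) / len(future) if future else 0.0
        out.append(r + bonus)
    return out

rewards = [0.0, 0.0, 1.0, 0.0]
values = [0.1, 0.2, 0.5, 0.9]
mods = modified_rewards(rewards, values, threshold=1)
```

The modified rewards would then be used as training targets in an ordinary reinforcement learning update.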
-
Publication number: 20230178076
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for controlling agents. In particular, an interactive agent can be controlled based on multi-modal inputs that include both an observation image and a natural language text sequence.
Type: Application
Filed: December 7, 2022
Publication date: June 8, 2023
Inventors: Joshua Simon Abramson, Arun Ahuja, Federico Javier Carnevale, Petko Ivanov Georgiev, Chia-Chun Hung, Timothy Paul Lillicrap, Alistair Michael Muldal, Adam Anthony Santoro, Tamara Louise von Glehn, Jessica Paige Landon, Gregory Duncan Wayne, Chen Yan, Rui Zhu
-
Patent number: 11210585
Abstract: Methods, systems, and apparatus for selecting actions to be performed by an agent interacting with an environment. One system includes a high-level controller neural network, a low-level controller neural network, and a subsystem. The high-level controller neural network receives an input observation and processes the input observation to generate a high-level output defining a control signal for the low-level controller. The low-level controller neural network receives a designated component of an input observation and processes the designated component and an input control signal to generate a low-level output that defines an action to be performed by the agent in response to the input observation.
Type: Grant
Filed: May 12, 2017
Date of Patent: December 28, 2021
Assignee: DeepMind Technologies Limited
Inventors: Nicolas Manfred Otto Heess, Timothy Paul Lillicrap, Gregory Duncan Wayne, Yuval Tassa
-
Patent number: 11210579
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the methods includes providing an output derived from a first portion of a neural network output as a system output; determining one or more sets of writing weights for each of a plurality of locations in an external memory; writing data defined by a third portion of the neural network output to the external memory in accordance with the sets of writing weights; determining one or more sets of reading weights for each of the plurality of locations in the external memory from a fourth portion of the neural network output; reading data from the external memory in accordance with the sets of reading weights; and combining the data read from the external memory with a next system input to generate the next neural network input.
Type: Grant
Filed: March 26, 2020
Date of Patent: December 28, 2021
Assignee: DeepMind Technologies Limited
Inventors: Alexander Benjamin Graves, Ivo Danihelka, Gregory Duncan Wayne
-
Patent number: 11151443
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a sparse memory access subsystem that is configured to perform operations comprising generating a sparse set of reading weights that includes a respective reading weight for each of the plurality of locations in the external memory using the read key, reading data from the plurality of locations in the external memory in accordance with the sparse set of reading weights, generating a set of writing weights that includes a respective writing weight for each of the plurality of locations in the external memory, and writing the write vector to the plurality of locations in the external memory in accordance with the writing weights.
Type: Grant
Filed: February 3, 2017
Date of Patent: October 19, 2021
Assignee: DeepMind Technologies Limited
Inventors: Ivo Danihelka, Gregory Duncan Wayne, Fu-min Wang, Edward Thomas Grefenstette, Jack William Rae, Alexander Benjamin Graves, Timothy Paul Lillicrap, Timothy James Alexander Harley, Jonathan James Hunt
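The sparse memory access idea above can be sketched as read weights computed from a read key in which only a few locations receive non-zero weight, so a read touches only those locations. Truncating a softmax to its top-k entries and renormalizing is an assumption for illustration; the patent covers the general sparse-access mechanism, not this exact sparsification rule.

```python
import numpy as np

def sparse_read_weights(memory, read_key, k=2):
    # Dense content-based weights from the read key.
    scores = memory @ read_key
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Zero out all but the k largest weights, then renormalize so the
    # sparse set of reading weights still sums to one.
    idx = np.argsort(weights)[:-k]
    sparse = weights.copy()
    sparse[idx] = 0.0
    return sparse / sparse.sum()

rng = np.random.default_rng(2)
memory = rng.normal(size=(6, 4))
key = rng.normal(size=4)
w = sparse_read_weights(memory, key, k=2)
read_vector = w @ memory   # the read only involves the k selected locations
```

Sparse weights matter at scale: with k fixed, reads and writes cost O(k) per step instead of O(number of locations).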
-
Patent number: 11080594
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory using reinforcement learning. One of the methods includes providing an output derived from the system output portion of the neural network output as a system output in the sequence of system outputs; selecting a memory access process from a predetermined set of memory access processes for accessing the external memory from the reinforcement learning portion of the neural network output; writing and reading data from locations in the external memory in accordance with the selected memory access process using the differentiable portion of the neural network output; and combining the data read from the external memory with a next system input in the sequence of system inputs to generate a next neural network input in the sequence of neural network inputs.
Type: Grant
Filed: December 30, 2016
Date of Patent: August 3, 2021
Assignee: DeepMind Technologies Limited
Inventors: Ilya Sutskever, Ivo Danihelka, Alexander Benjamin Graves, Gregory Duncan Wayne, Wojciech Zaremba
-
Patent number: 11010663
Abstract: Systems, methods, and apparatus, including computer programs encoded on a computer storage medium, related to associative long short-term memory (LSTM) neural network layers configured to maintain N copies of an internal state for the associative LSTM layer, N being an integer greater than one. In one aspect, a system includes a recurrent neural network including an associative LSTM layer, wherein the associative LSTM layer is configured to, for each time step, receive a layer input, update each of the N copies of the internal state using the layer input for the time step and a layer output generated by the associative LSTM layer for a preceding time step, and generate a layer output for the time step using the N updated copies of the internal state.
Type: Grant
Filed: December 30, 2016
Date of Patent: May 18, 2021
Assignee: DeepMind Technologies Limited
Inventors: Ivo Danihelka, Nal Emmerich Kalchbrenner, Gregory Duncan Wayne, Benigno Uría-Martínez, Alexander Benjamin Graves
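The "N copies of the internal state" structure above can be sketched very loosely: each time step, all N copies are updated from the layer input and the previous layer output, and the layer output is formed from the N updated copies. Everything below is a toy assumption; the real associative LSTM uses complex-valued associative binding with per-copy key permutations, whereas this sketch only mirrors the copy-update-combine shape with a tanh update and a mean.

```python
import numpy as np

N, D = 3, 4
rng = np.random.default_rng(3)
states = [np.zeros(D) for _ in range(N)]
perms = [rng.permutation(D) for _ in range(N)]  # one fixed permutation per copy

def step(states, layer_input, prev_output):
    # Update each of the N copies of the internal state from the layer
    # input (viewed through that copy's permutation) and the previous
    # layer output, then combine the updated copies into the layer output.
    new_states = []
    for s, p in zip(states, perms):
        new_states.append(np.tanh(s + layer_input[p] + prev_output))
    output = np.mean(new_states, axis=0)
    return new_states, output

x = rng.normal(size=D)
states, out = step(states, x, prev_output=np.zeros(D))
```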
-
Publication number: 20210117801
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a memory interface subsystem that is configured to perform operations comprising determining a respective content-based weight for each of a plurality of locations in an external memory; determining a respective allocation weight for each of the plurality of locations in the external memory; determining a respective final writing weight for each of the plurality of locations in the external memory from the respective content-based weight for the location and the respective allocation weight for the location; and writing data defined by the write vector to the external memory in accordance with the final writing weights.
Type: Application
Filed: November 9, 2020
Publication date: April 22, 2021
Inventors: Alexander Benjamin Graves, Ivo Danihelka, Timothy James Alexander Harley, Malcolm Kevin Campbell Reynolds, Gregory Duncan Wayne
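The write rule above combines, per memory location, a content-based weight with an allocation weight into a final writing weight. A small sketch follows; the convex combination controlled by a gate g, and the allocation rule that simply prefers the least-used location, are assumptions for illustration of how the two weightings can be merged.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def final_write_weights(memory, write_key, usage, g=0.5):
    # Content-based weight per location: similarity to the write key.
    content_w = softmax(memory @ write_key)
    # Allocation weight per location: here, all mass on the least-used slot.
    alloc_w = np.zeros(len(usage))
    alloc_w[np.argmin(usage)] = 1.0
    # Final writing weight per location from its content-based weight
    # and its allocation weight (gated convex combination).
    return g * content_w + (1.0 - g) * alloc_w

rng = np.random.default_rng(4)
memory = rng.normal(size=(5, 3))
usage = np.array([0.9, 0.1, 0.8, 0.7, 0.6])
w = final_write_weights(memory, rng.normal(size=3), usage)
write_vector = rng.normal(size=3)
memory = memory + np.outer(w, write_vector)   # write in accordance with w
```

Blending the two weightings lets the same write step either update content that matches the key or claim fresh, unused memory.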
-
Publication number: 20210089968
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating sequences of predicted observations, for example images. In one aspect, a system comprises a controller recurrent neural network and a decoder neural network to process a set of latent variables to generate an observation. An external memory and a memory interface subsystem are configured to, for each of a plurality of time steps, receive an updated hidden state from the controller, generate a memory context vector by reading data from the external memory using the updated hidden state, determine a set of latent variables from the memory context vector, generate a predicted observation by providing the set of latent variables to the decoder neural network, write data to the external memory using the latent variables, the updated hidden state, or both, and generate a controller input for a subsequent time step from the latent variables.
Type: Application
Filed: December 7, 2020
Publication date: March 25, 2021
Inventors: Gregory Duncan Wayne, Chia-Chun Hung, Mevlana Celaleddin Gemici, Adam Anthony Santoro
-
Publication number: 20210081723
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network system used to control an agent interacting with an environment to perform a specified task. One of the methods includes causing the agent to perform a task episode in which the agent attempts to perform the specified task; for each of one or more particular time steps in the sequence: generating a modified reward for the particular time step from (i) the actual reward at the time step and (ii) value predictions at one or more time steps that are more than a threshold number of time steps after the particular time step in the sequence; and training, through reinforcement learning, the neural network system using at least the modified rewards for the particular time steps.
Type: Application
Filed: September 28, 2020
Publication date: March 18, 2021
Inventors: Gregory Duncan Wayne, Timothy Paul Lillicrap, Chia-Chun Hung, Joshua Simon Abramson
-
Publication number: 20210034969
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for training a memory-based prediction system configured to receive an input observation characterizing a state of an environment interacted with by an agent and to process the input observation and data read from a memory to update data stored in the memory and to generate a latent representation of the state of the environment. The method comprises: for each of a plurality of time steps: processing an observation for the time step and data read from the memory to: (i) update the data stored in the memory, and (ii) generate a latent representation of the current state of the environment as of the time step; and generating a predicted return that will be received by the agent as a result of interactions with the environment after the observation for the time step is received.
Type: Application
Filed: March 11, 2019
Publication date: February 4, 2021
Inventors: Gregory Duncan Wayne, Chia-Chun Hung, David Antony Amos, Mehdi Mirza Mohammadi, Arun Ahuja, Timothy Paul Lillicrap
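One step of the memory-based prediction system above can be sketched as: combine the observation with data read from memory into a latent representation, update the stored data, and generate a predicted return from the latent. The mean-pooled read, the linear maps standing in for trained networks, and the round-robin slot update are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
memory = np.zeros((4, 3))              # simple slot memory
W_latent = rng.normal(size=(3, 3 + 3)) # [observation, read] -> latent
w_return = rng.normal(size=3)          # latent -> predicted return

def predictor_step(memory, observation, t):
    read = memory.mean(axis=0)                     # data read from memory
    # Latent representation of the environment state at this time step.
    latent = np.tanh(W_latent @ np.concatenate([observation, read]))
    memory = memory.copy()
    memory[t % len(memory)] = latent               # update data stored in memory
    predicted_return = float(w_return @ latent)    # predicted future return
    return memory, latent, predicted_return

obs = rng.normal(size=3)
memory, latent, pred = predictor_step(memory, obs, t=0)
```

In training, the predicted return would be regressed against the return the agent actually receives after the observation.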
-
Patent number: 10872299
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating sequences of predicted observations, for example images. In one aspect, a system comprises a controller recurrent neural network and a decoder neural network to process a set of latent variables to generate an observation. An external memory and a memory interface subsystem are configured to, for each of a plurality of time steps, receive an updated hidden state from the controller, generate a memory context vector by reading data from the external memory using the updated hidden state, determine a set of latent variables from the memory context vector, generate a predicted observation by providing the set of latent variables to the decoder neural network, write data to the external memory using the latent variables, the updated hidden state, or both, and generate a controller input for a subsequent time step from the latent variables.
Type: Grant
Filed: July 1, 2019
Date of Patent: December 22, 2020
Assignee: DeepMind Technologies Limited
Inventors: Gregory Duncan Wayne, Chia-Chun Hung, Mevlana Celaleddin Gemici, Adam Anthony Santoro
-
Patent number: 10832134
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a memory interface subsystem that is configured to perform operations comprising determining a respective content-based weight for each of a plurality of locations in an external memory; determining a respective allocation weight for each of the plurality of locations in the external memory; determining a respective final writing weight for each of the plurality of locations in the external memory from the respective content-based weight for the location and the respective allocation weight for the location; and writing data defined by the write vector to the external memory in accordance with the final writing weights.
Type: Grant
Filed: December 9, 2016
Date of Patent: November 10, 2020
Assignee: DeepMind Technologies Limited
Inventors: Alexander Benjamin Graves, Ivo Danihelka, Timothy James Alexander Harley, Malcolm Kevin Campbell Reynolds, Gregory Duncan Wayne
-
Patent number: 10789511
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network system used to control an agent interacting with an environment to perform a specified task. One of the methods includes causing the agent to perform a task episode in which the agent attempts to perform the specified task; for each of one or more particular time steps in the sequence: generating a modified reward for the particular time step from (i) the actual reward at the time step and (ii) value predictions at one or more time steps that are more than a threshold number of time steps after the particular time step in the sequence; and training, through reinforcement learning, the neural network system using at least the modified rewards for the particular time steps.
Type: Grant
Filed: October 14, 2019
Date of Patent: September 29, 2020
Assignee: DeepMind Technologies Limited
Inventors: Gregory Duncan Wayne, Timothy Paul Lillicrap, Chia-Chun Hung, Joshua Simon Abramson
-
Publication number: 20200226446
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the methods includes providing an output derived from a first portion of a neural network output as a system output; determining one or more sets of writing weights for each of a plurality of locations in an external memory; writing data defined by a third portion of the neural network output to the external memory in accordance with the sets of writing weights; determining one or more sets of reading weights for each of the plurality of locations in the external memory from a fourth portion of the neural network output; reading data from the external memory in accordance with the sets of reading weights; and combining the data read from the external memory with a next system input to generate the next neural network input.
Type: Application
Filed: March 26, 2020
Publication date: July 16, 2020
Applicant: DeepMind Technologies Limited
Inventors: Alexander Benjamin Graves, Ivo Danihelka, Gregory Duncan Wayne
-
Patent number: 10691997
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks to generate additional outputs. One of the systems includes a neural network and a sequence processing subsystem, wherein the sequence processing subsystem is configured to perform operations comprising, for each of the system inputs in a sequence of system inputs: receiving the system input; generating an initial neural network input from the system input; causing the neural network to process the initial neural network input to generate an initial neural network output for the system input; and determining, from a first portion of the initial neural network output for the system input, whether or not to cause the neural network to generate one or more additional neural network outputs for the system input.
Type: Grant
Filed: December 21, 2015
Date of Patent: June 23, 2020
Assignee: DeepMind Technologies Limited
Inventors: Alexander Benjamin Graves, Ivo Danihelka, Gregory Duncan Wayne
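The control loop above reads one portion of the network output as a continue/stop signal that decides whether to generate additional outputs for the same system input. A minimal sketch follows; the fake network that emits a decreasing signal, and the fixed threshold, are assumptions standing in for a trained network and its learned stopping behavior.

```python
def fake_network(x, step):
    # First portion of the output: a continue signal that decays with
    # each additional pass over the same system input.
    signal = 1.0 / (step + 1)
    # Remaining portion: the actual output for this pass.
    output = x + step
    return signal, output

def process_input(x, threshold=0.4, max_steps=10):
    # Keep generating additional outputs for the same system input until
    # the continue signal falls below the threshold.
    outputs = []
    for step in range(max_steps):
        signal, out = fake_network(x, step)
        outputs.append(out)
        if signal < threshold:
            break
    return outputs

outs = process_input(10)
```

This lets the network spend a variable number of computation steps per input, rather than a fixed one output per input.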
-
Patent number: 10650302
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the methods includes providing an output derived from a first portion of a neural network output as a system output; determining one or more sets of writing weights for each of a plurality of locations in an external memory; writing data defined by a third portion of the neural network output to the external memory in accordance with the sets of writing weights; determining one or more sets of reading weights for each of the plurality of locations in the external memory from a fourth portion of the neural network output; reading data from the external memory in accordance with the sets of reading weights; and combining the data read from the external memory with a next system input to generate the next neural network input.
Type: Grant
Filed: October 16, 2015
Date of Patent: May 12, 2020
Assignee: DeepMind Technologies Limited
Inventors: Alexander Benjamin Graves, Ivo Danihelka, Gregory Duncan Wayne