Patents by Inventor David Nunes Raposo

David Nunes Raposo has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240086703
    Abstract: A computer-implemented reinforcement learning neural network system that learns a model of rewards in order to relate actions by an agent in an environment to their long-term consequences. The model learns to decompose the rewards into components explainable by different past states. That is, the model learns to recognize when being in a particular state of the environment is predictive of a reward in a later state, even when that later state, and its reward, are reached only after a very long time delay. (An illustrative sketch of this idea appears after this listing.)
    Type: Application
    Filed: February 4, 2022
    Publication date: March 14, 2024
    Inventors: Samuel Ritter, David Nunes Raposo
  • Patent number: 11836596
    Abstract: A system including one or more computers and one or more storage devices storing instructions that, when executed by the one or more computers, cause the one or more computers to implement a memory and a memory-based neural network is described. The memory is configured to store a respective memory vector at each of a plurality of memory locations in the memory. The memory-based neural network is configured, at each of a plurality of time steps, to: receive an input; determine an update to the memory, wherein determining the update comprises applying an attention mechanism over the memory vectors in the memory and the received input; update the memory using the determined update; and generate an output for the current time step using the updated memory. (An illustrative sketch of this attention-based memory update appears after this listing.)
    Type: Grant
    Filed: November 30, 2020
    Date of Patent: December 5, 2023
    Assignee: DeepMind Technologies Limited
    Inventors: Mike Chrzanowski, Jack William Rae, Ryan Faulkner, Theophane Guillaume Weber, David Nunes Raposo, Adam Anthony Santoro
  • Publication number: 20230196146
    Abstract: A neural network system is proposed, including an input network for extracting, from state data, respective entity data for each of a plurality of entities which are present, or at least potentially present, in an environment. The entity data describes the entity. The neural network contains a relational network for parsing this data, which includes one or more attention blocks that may be stacked to perform successive operations on the entity data. Each attention block includes a respective transform network for each of the entities. The transform network for each entity transforms the data it receives for that entity into modified entity data for that entity, based on data for a plurality of the other entities. An output network is arranged to receive data output by the relational network and to use the received data to select a respective action. (An illustrative sketch of these stacked attention blocks appears after this listing.)
    Type: Application
    Filed: February 13, 2023
    Publication date: June 22, 2023
    Inventors: Yujia Li, Victor Constant Bapst, Vinicius Zambaldi, David Nunes Raposo, Adam Anthony Santoro
  • Publication number: 20230101930
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for selecting actions to be performed by an agent interacting with an environment to accomplish a goal. In one aspect, a method comprises: generating a respective planning embedding corresponding to each of multiple experience tuples in an external memory, wherein each experience tuple characterizes interaction of the agent with the environment at a respective previous time step; processing the planning embeddings using a planning neural network to generate an implicit plan for accomplishing the goal; and selecting the action to be performed by the agent at the time step using the implicit plan. (An illustrative sketch of this planning pipeline appears after this listing.)
    Type: Application
    Filed: February 8, 2021
    Publication date: March 30, 2023
    Inventors: Samuel Ritter, Ryan Faulkner, David Nunes Raposo
  • Patent number: 11580429
    Abstract: A neural network system is proposed, including an input network for extracting, from state data, respective entity data for each of a plurality of entities which are present, or at least potentially present, in an environment. The entity data describes the entity. The neural network contains a relational network for parsing this data, which includes one or more attention blocks that may be stacked to perform successive operations on the entity data. Each attention block includes a respective transform network for each of the entities. The transform network for each entity transforms the data it receives for that entity into modified entity data for that entity, based on data for a plurality of the other entities. An output network is arranged to receive data output by the relational network and to use the received data to select a respective action.
    Type: Grant
    Filed: May 20, 2019
    Date of Patent: February 14, 2023
    Assignee: DeepMind Technologies Limited
    Inventors: Yujia Li, Victor Constant Bapst, Vinicius Zambaldi, David Nunes Raposo, Adam Anthony Santoro
  • Publication number: 20210081795
    Abstract: A system including one or more computers and one or more storage devices storing instructions that, when executed by the one or more computers, cause the one or more computers to implement a memory and a memory-based neural network is described. The memory is configured to store a respective memory vector at each of a plurality of memory locations in the memory. The memory-based neural network is configured, at each of a plurality of time steps, to: receive an input; determine an update to the memory, wherein determining the update comprises applying an attention mechanism over the memory vectors in the memory and the received input; update the memory using the determined update; and generate an output for the current time step using the updated memory.
    Type: Application
    Filed: November 30, 2020
    Publication date: March 18, 2021
    Inventors: Mike Chrzanowski, Jack William Rae, Ryan Faulkner, Theophane Guillaume Weber, David Nunes Raposo, Adam Anthony Santoro
  • Patent number: 10853725
    Abstract: A system including one or more computers and one or more storage devices storing instructions that, when executed by the one or more computers, cause the one or more computers to implement a memory and a memory-based neural network is described. The memory is configured to store a respective memory vector at each of a plurality of memory locations in the memory. The memory-based neural network is configured, at each of a plurality of time steps, to: receive an input; determine an update to the memory, wherein determining the update comprises applying an attention mechanism over the memory vectors in the memory and the received input; update the memory using the determined update; and generate an output for the current time step using the updated memory.
    Type: Grant
    Filed: May 17, 2019
    Date of Patent: December 1, 2020
    Assignee: DeepMind Technologies Limited
    Inventors: Mike Chrzanowski, Jack William Rae, Ryan Faulkner, Theophane Guillaume Weber, David Nunes Raposo, Adam Anthony Santoro
  • Publication number: 20190354885
    Abstract: A neural network system is proposed, including an input network for extracting, from state data, respective entity data for each of a plurality of entities which are present, or at least potentially present, in an environment. The entity data describes the entity. The neural network contains a relational network for parsing this data, which includes one or more attention blocks that may be stacked to perform successive operations on the entity data. Each attention block includes a respective transform network for each of the entities. The transform network for each entity transforms the data it receives for that entity into modified entity data for that entity, based on data for a plurality of the other entities. An output network is arranged to receive data output by the relational network and to use the received data to select a respective action.
    Type: Application
    Filed: May 20, 2019
    Publication date: November 21, 2019
    Inventors: Yujia Li, Victor Constant Bapst, Vinicius Zambaldi, David Nunes Raposo, Adam Anthony Santoro
  • Publication number: 20190354858
    Abstract: A system including one or more computers and one or more storage devices storing instructions that, when executed by the one or more computers, cause the one or more computers to implement a memory and a memory-based neural network is described. The memory is configured to store a respective memory vector at each of a plurality of memory locations in the memory. The memory-based neural network is configured, at each of a plurality of time steps, to: receive an input; determine an update to the memory, wherein determining the update comprises applying an attention mechanism over the memory vectors in the memory and the received input; update the memory using the determined update; and generate an output for the current time step using the updated memory.
    Type: Application
    Filed: May 17, 2019
    Publication date: November 21, 2019
    Inventors: Mike Chrzanowski, Jack William Rae, Ryan Faulkner, Theophane Guillaume Weber, David Nunes Raposo, Adam Anthony Santoro
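
The abstracts above describe four distinct techniques, sketched below in minimal NumPy form. First, the reward-decomposition idea of publication 20240086703: a "key" state visited early in an episode explains a reward that is only paid out many steps later at a "door" state. This is a deliberately simplified illustration, not the patented method: the toy states, the episode generator, and the use of ordinary least squares in place of a learned neural model of rewards are all assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_STATES, KEY, DOOR, EPISODE_LEN = 8, 6, 7, 50

def episode():
    """Random walk over states 0-5; a 'key' visit at step 1 causes a reward 49 steps later."""
    has_key = rng.random() < 0.5
    states = list(rng.integers(0, 6, size=EPISODE_LEN - 1)) + [DOOR]
    if has_key:
        states[1] = KEY
    delayed_reward = 1.0 if has_key else 0.0      # only paid at the final 'door' state
    return states, delayed_reward

# Decompose the delayed reward into components explainable by different past states:
#   reward  ≈  sum over visited states s of c[s],
# where c[s] is the component credited to having been in state s earlier in the episode.
# Ordinary least squares stands in for the learned model of rewards.
X, y = [], []
for _ in range(2000):
    states, reward = episode()
    X.append(np.bincount(states, minlength=NUM_STATES))
    y.append(reward)
c, *_ = np.linalg.lstsq(np.array(X, dtype=float), np.array(y), rcond=None)

print("component credited to the key state  :", round(c[KEY], 2))   # ~1.0
print("component credited to a neutral state:", round(c[0], 2))     # ~0.0
```

The point of the toy is that the contribution learned for the key state recovers the delayed reward, associating an early state with a consequence that only materialises much later.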
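
Second, patent 11836596 and its related entries (publications 20210081795 and 20190354858, and patent 10853725) describe a memory-based neural network whose per-step update is computed by attending over the stored memory vectors together with the new input. The sketch below is a minimal single-head rendering of that update loop; the slot count, random projection matrices, tanh residual update, and linear readout are illustrative assumptions rather than the patented architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

class AttentionMemory:
    """A memory of fixed slots updated each step by attention over the slots and the input."""

    def __init__(self, num_slots=4, dim=8):
        self.dim = dim
        self.memory = rng.normal(size=(num_slots, dim))          # one memory vector per location
        # Hypothetical projections; a trained system would learn these end to end.
        self.w_q = rng.normal(size=(dim, dim)) / np.sqrt(dim)
        self.w_k = rng.normal(size=(dim, dim)) / np.sqrt(dim)
        self.w_v = rng.normal(size=(dim, dim)) / np.sqrt(dim)
        self.w_out = rng.normal(size=(num_slots * dim, dim)) / np.sqrt(num_slots * dim)

    def step(self, x):
        """One time step: receive an input, attend, update the memory, emit an output."""
        mem_and_input = np.vstack([self.memory, x[None, :]])     # keys/values cover memory + input
        q = self.memory @ self.w_q                               # queries come from the memory slots
        k = mem_and_input @ self.w_k
        v = mem_and_input @ self.w_v
        attn = softmax(q @ k.T / np.sqrt(self.dim), axis=-1)     # (slots, slots + 1)
        update = attn @ v                                        # attention-derived memory update
        self.memory = np.tanh(self.memory + update)              # apply the update to the memory
        return self.memory.reshape(-1) @ self.w_out              # output for the current time step

net = AttentionMemory()
for t in range(5):
    out = net.step(rng.normal(size=8))
print("output at the final step:", np.round(out, 3))
```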
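
Third, patent 11580429 and publications 20230196146 and 20190354885 describe stacked attention blocks that transform each entity's data using the data of the other entities before an output network selects an action. The following sketch shows the shape of that computation with random, untrained weights; the two-block stack, shared (rather than per-entity) transform weights, mean pooling, and argmax action selection are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_block(entities, w_q, w_k, w_v):
    """Transform each entity's data using the data of all the other entities."""
    q, k, v = entities @ w_q, entities @ w_k, entities @ w_v
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]), axis=-1)      # pairwise entity interactions
    return np.tanh(entities + attn @ v)                          # modified entity data (residual)

dim, n_entities, n_actions = 8, 6, 4

# "Input network" stand-in: entity data extracted from the state is just random here.
entities = rng.normal(size=(n_entities, dim))

# Relational network: two stacked attention blocks acting successively on the entity data.
blocks = [tuple(rng.normal(size=(dim, dim)) / np.sqrt(dim) for _ in range(3)) for _ in range(2)]
for w_q, w_k, w_v in blocks:
    entities = attention_block(entities, w_q, w_k, w_v)

# Output network: pool the relational features and score the available actions.
w_pi = rng.normal(size=(dim, n_actions)) / np.sqrt(dim)
action_logits = entities.mean(axis=0) @ w_pi
print("selected action:", int(np.argmax(action_logits)))
```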
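
Fourth, publication 20230101930 describes computing a planning embedding for each experience tuple held in an external memory, processing those embeddings with a planning neural network into an implicit plan, and selecting an action from that plan. The sketch below mirrors those three stages with untrained weights; the tuple featurization, the attention-style planning network, and the concatenation-based policy head are illustrative assumptions, not the claimed method.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

dim, n_tuples, n_actions = 8, 32, 4

# External memory of experience tuples (state, action, reward, next_state) from previous steps.
memory = [
    {"state": rng.normal(size=dim), "action": int(rng.integers(n_actions)),
     "reward": float(rng.normal()), "next_state": rng.normal(size=dim)}
    for _ in range(n_tuples)
]

goal = rng.normal(size=dim)            # embedding of the goal to accomplish
current_state = rng.normal(size=dim)   # embedding of the current observation

# Hypothetical projections; a trained system would learn all of these jointly.
w_embed = rng.normal(size=(2 * dim + 2, dim)) / np.sqrt(2 * dim + 2)
w_query = rng.normal(size=(2 * dim, dim)) / np.sqrt(2 * dim)
w_pi = rng.normal(size=(2 * dim, n_actions)) / np.sqrt(2 * dim)

# 1) A planning embedding for each experience tuple in the external memory.
tuple_feats = np.stack([
    np.concatenate([t["state"], t["next_state"], [t["reward"], t["action"]]]) for t in memory
])                                                        # (n_tuples, 2*dim + 2)
planning_embeddings = np.tanh(tuple_feats @ w_embed)      # (n_tuples, dim)

# 2) Planning network: attend over the embeddings, queried by the current state and goal,
#    to produce an implicit plan for accomplishing the goal.
query = np.tanh(np.concatenate([current_state, goal]) @ w_query)
weights = softmax(planning_embeddings @ query / np.sqrt(dim))
implicit_plan = weights @ planning_embeddings             # (dim,)

# 3) Select the action for this time step using the implicit plan.
logits = np.concatenate([current_state, implicit_plan]) @ w_pi
print("selected action:", int(np.argmax(logits)))
```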