Patents by Inventor Sergey Bartunov

Sergey Bartunov has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11983617
    Abstract: A system for compressed data storage using a neural network. The system comprises a memory comprising a plurality of memory locations configured to store data; a query neural network configured to process a representation of an input data item to generate a query; an immutable key data store comprising key data for indexing the plurality of memory locations; an addressing system configured to process the key data and the query to generate a weighting associated with the plurality of memory locations; a memory read system configured to generate output memory data from the memory based upon the generated weighting associated with the plurality of memory locations and the data stored at the plurality of memory locations; and a memory write system configured to write received write data to the memory based upon the generated weighting associated with the plurality of memory locations. (A minimal illustrative sketch of this addressing and read/write mechanism appears after this listing.)
    Type: Grant
    Filed: November 23, 2020
    Date of Patent: May 14, 2024
    Assignee: DeepMind Technologies Limited
    Inventors: Jack William Rae, Timothy Paul Lillicrap, Sergey Bartunov
  • Publication number: 20240062060
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for solving mixed integer programs (MIPs) using neural networks. One of the methods includes obtaining data specifying parameters of a MIP; generating, from the parameters of the MIP, an input representation; processing the input representation using an encoder neural network to generate a respective embedding for each of the integer variables; generating a plurality of partial assignments by, for each partial assignment, selecting a respective second, proper subset of the integer variables and, for each of the variables in the respective second subset, generating, using at least the respective embedding for the variable, a respective additional constraint on the value of the variable; generating, for each of the partial assignments, a corresponding candidate final assignment that assigns a respective value to each of the plurality of variables; and selecting, as a final assignment for the MIP, one of the candidate final assignments. (A minimal illustrative sketch of the partial-assignment step appears after this listing.)
    Type: Application
    Filed: December 20, 2021
    Publication date: February 22, 2024
    Inventors: Sergey Bartunov, Felix Axel Gimeno Gil, Ingrid Karin von Glehn, Pawel Lichocki, Ivan Lobov, Vinod Nair, Brendan Timothy O'Donoghue, Nicolas Sonnerat, Christian Tjandraatmadja, Pengming Wang
  • Publication number: 20220180147
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for implementing associative memory. In one aspect, a system comprises an associative memory neural network to process an input to generate an output that defines an energy corresponding to the input. A reading subsystem retrieves stored information from the associative memory neural network. The reading subsystem performs operations including receiving a given (query) input and retrieving a data element from the associative memory neural network that is associated with the given input. The retrieving is performed by iteratively adjusting the given input using the associative memory neural network. (A minimal illustrative sketch of this iterative retrieval appears after this listing.)
    Type: Application
    Filed: May 19, 2020
    Publication date: June 9, 2022
    Inventors: Sergey Bartunov, Jack William Rae, Timothy Paul Lillicrap, Simon Osindero
  • Publication number: 20210150314
    Abstract: A system for compressed data storage using a neural network. The system comprises a memory comprising a plurality of memory locations configured to store data; a query neural network configured to process a representation of an input data item to generate a query; an immutable key data store comprising key data for indexing the plurality of memory locations; an addressing system configured to process the key data and the query to generate a weighting associated with the plurality of memory locations; a memory read system configured to generate output memory data from the memory based upon the generated weighting associated with the plurality of memory locations and the data stored at the plurality of memory locations; and a memory write system configured to write received write data to the memory based upon the generated weighting associated with the plurality of memory locations.
    Type: Application
    Filed: November 23, 2020
    Publication date: May 20, 2021
    Inventors: Jack William Rae, Timothy Paul Lillicrap, Sergey Bartunov
  • Patent number: 10885426
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a controller neural network that includes a Least Recently Used Access (LRUA) subsystem configured to: maintain a respective usage weight for each of a plurality of locations in the external memory, and for each of a plurality of time steps: generate a respective reading weight for each location using a read key, read data from the locations in accordance with the reading weights, generate a respective writing weight for each of the locations from a respective reading weight from a preceding time step and the respective usage weight for the location, write a write vector to the locations in accordance with the writing weights, and update the respective usage weight from the respective reading weight and the respective writing weight. (A minimal illustrative sketch of this access scheme appears after this listing.)
    Type: Grant
    Filed: December 30, 2016
    Date of Patent: January 5, 2021
    Assignee: DeepMind Technologies Limited
    Inventors: Adam Anthony Santoro, Daniel Pieter Wierstra, Timothy Paul Lillicrap, Sergey Bartunov, Ivo Danihelka
  • Patent number: 10846588
    Abstract: A system for compressed data storage using a neural network. The system comprises a memory comprising a plurality of memory locations configured to store data; a query neural network configured to process a representation of an input data item to generate a query; an immutable key data store comprising key data for indexing the plurality of memory locations; an addressing system configured to process the key data and the query to generate a weighting associated with the plurality of memory locations; a memory read system configured to generate output memory data from the memory based upon the generated weighting associated with the plurality of memory locations and the data stored at the plurality of memory locations; and a memory write system configured to write received write data to the memory based upon the generated weighting associated with the plurality of memory locations.
    Type: Grant
    Filed: September 27, 2019
    Date of Patent: November 24, 2020
    Assignee: DeepMind Technologies Limited
    Inventors: Jack William Rae, Timothy Paul Lillicrap, Sergey Bartunov
  • Publication number: 20200104677
    Abstract: A system for compressed data storage using a neural network. The system comprises a memory comprising a plurality of memory locations configured to store data; a query neural network configured to process a representation of an input data item to generate a query; an immutable key data store comprising key data for indexing the plurality of memory locations; an addressing system configured to process the key data and the query to generate a weighting associated with the plurality of memory locations; a memory read system configured to generate output memory data from the memory based upon the generated weighting associated with the plurality of memory locations and the data stored at the plurality of memory locations; and a memory write system configured to write received write data to the memory based upon the generated weighting associated with the plurality of memory locations.
    Type: Application
    Filed: September 27, 2019
    Publication date: April 2, 2020
    Inventors: Jack William Rae, Timothy Paul Lillicrap, Sergey Bartunov
  • Publication number: 20170228637
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a controller neural network that includes a Least Recently Used Access (LRUA) subsystem configured to: maintain a respective usage weight for each of a plurality of locations in the external memory, and for each of a plurality of time steps: generate a respective reading weight for each location using a read key, read data from the locations in accordance with the reading weights, generate a respective writing weight for each of the locations from a respective reading weight from a preceding time step and the respective usage weight for the location, write a write vector to the locations in accordance with the writing weights, and update the respective usage weight from the respective reading weight and the respective writing weight.
    Type: Application
    Filed: December 30, 2016
    Publication date: August 10, 2017
    Inventors: Adam Anthony Santoro, Daniel Pieter Wierstra, Timothy Paul Lillicrap, Sergey Bartunov, Ivo Danihelka
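
The compressed data storage system covered by patent 11983617 above (and by the related entries 20210150314, 10846588, and 20200104677) is built around three pieces: a query neural network that turns an input representation into a query, an immutable key data store that indexes the memory locations, and an addressing system that compares the query against the keys to produce a weighting over locations, which then governs both reads and writes. The sketch below illustrates that flow in plain NumPy; the linear stand-in for the query network, the softmax addressing, and the additive write rule are assumptions made for illustration, not details taken from the patent text.

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    class KeyValueMemory:
        """Sketch of query/key addressing over a slot memory (illustrative only)."""

        def __init__(self, num_slots, key_dim, value_dim, input_dim, seed=0):
            rng = np.random.default_rng(seed)
            self.keys = rng.normal(size=(num_slots, key_dim))     # immutable key data store
            self.memory = np.zeros((num_slots, value_dim))        # memory locations
            self.w_query = rng.normal(size=(input_dim, key_dim))  # stand-in for the query neural network

        def _weights(self, x):
            query = x @ self.w_query     # query generated from the input representation
            scores = self.keys @ query   # addressing: compare the query with the key data
            return softmax(scores)       # weighting over the memory locations

        def read(self, x):
            w = self._weights(x)
            return w @ self.memory       # output memory data under the weighting

        def write(self, x, data):
            w = self._weights(x)
            self.memory += np.outer(w, data)  # write received data under the same weighting

    mem = KeyValueMemory(num_slots=8, key_dim=4, value_dim=3, input_dim=5)
    x = np.ones(5)
    mem.write(x, np.array([1.0, 2.0, 3.0]))
    print(mem.read(x).round(3))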
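
Publication 20240062060 describes generating partial assignments for a mixed integer program by constraining a proper subset of the integer variables to values derived from their learned embeddings, completing each partial assignment into a candidate final assignment, and keeping the best candidate. The sketch below shows only the partial-assignment step; the predict_value function, the random choice of which variables to fix, and the downstream solve of the reduced problem are hypothetical placeholders, not the method claimed in the filing.

    import numpy as np

    def propose_partial_assignments(embeddings, predict_value, num_proposals=4,
                                    fix_fraction=0.5, rng=None):
        """Sketch of the partial-assignment step: for each proposal, a proper subset
        of the integer variables is constrained to a value derived from its embedding;
        the remaining variables would be left to a conventional MIP solver.
        predict_value and the random subset choice are illustrative placeholders."""
        if rng is None:
            rng = np.random.default_rng(0)
        n = len(embeddings)
        proposals = []
        for _ in range(num_proposals):
            k = max(1, int(fix_fraction * n))              # size of the proper subset
            chosen = rng.choice(n, size=k, replace=False)  # which variables to fix
            constraints = {int(i): predict_value(embeddings[i]) for i in chosen}
            proposals.append(constraints)                  # additional constraints x_i == value
        return proposals

    # Toy usage with random embeddings and a stand-in (hypothetical) value predictor.
    emb = np.random.default_rng(1).normal(size=(6, 8))
    predict = lambda e: int(e.sum() > 0)
    for assignment in propose_partial_assignments(emb, predict):
        print(assignment)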
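
Publication 20220180147 describes retrieval from an associative memory neural network that assigns an energy to its input: a query is answered by iteratively adjusting the given input so that the energy decreases. The sketch below uses a log-sum-exp energy over a fixed set of stored patterns (in the style of modern Hopfield networks) as a stand-in for the associative memory neural network; that energy, the gradient-descent update, and the step size are assumptions for illustration only.

    import numpy as np

    def retrieve(query, patterns, steps=50, lr=0.1, beta=4.0):
        """Sketch of energy-based associative retrieval: the given (query) input is
        iteratively adjusted to lower an energy defined over stored patterns. The
        log-sum-exp energy here stands in for the associative memory neural network."""
        x = query.copy()
        for _ in range(steps):
            scores = beta * patterns @ x            # similarity to each stored pattern
            weights = np.exp(scores - scores.max())
            weights /= weights.sum()
            grad = x - patterns.T @ weights         # gradient of E(x) = 0.5*||x||^2 - lse/beta
            x -= lr * grad                          # adjust the input to decrease the energy
        return x

    patterns = np.eye(4)                            # toy stored data elements
    noisy = np.array([0.9, 0.2, -0.1, 0.0])         # corrupted version of the first pattern
    print(retrieve(noisy, patterns).round(2))       # moves toward the first stored pattern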
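
Patent 10885426 and publication 20170228637 describe a Least Recently Used Access (LRUA) subsystem for an external memory: reading weights are generated from a read key, writing weights are formed from the previous time step's reading weights and the usage weights, and the usage weights are then updated from both. The sketch below follows that outline; the cosine-similarity read, the fixed interpolation gate toward the least-used location, and the usage decay factor are conventions borrowed from the memory-augmented neural network literature rather than details taken from the patent text.

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    class LRUAMemory:
        """Sketch of Least Recently Used Access over an external memory (illustrative).
        Reading weights come from a read key, writing weights blend the previous
        reading weights with the least-used location, and usage weights are updated
        from both, following the outline in the abstract."""

        def __init__(self, num_slots, width, decay=0.95, seed=0):
            rng = np.random.default_rng(seed)
            self.memory = rng.normal(scale=0.1, size=(num_slots, width))
            self.usage = np.zeros(num_slots)   # respective usage weight per location
            self.prev_read = np.zeros(num_slots)
            self.decay = decay                 # assumed usage decay factor

        def step(self, read_key, write_vec, gate=0.5):
            # Reading weights for each location, generated from the read key.
            sims = self.memory @ read_key / (
                np.linalg.norm(self.memory, axis=1) * np.linalg.norm(read_key) + 1e-8)
            read_w = softmax(sims)
            read_out = read_w @ self.memory    # read data in accordance with the reading weights

            # Writing weights from the previous reading weights and the usage weights.
            least_used = np.zeros_like(self.usage)
            least_used[np.argmin(self.usage)] = 1.0
            write_w = gate * self.prev_read + (1.0 - gate) * least_used
            self.memory += np.outer(write_w, write_vec)  # write in accordance with the writing weights

            # Usage weights updated from the reading and writing weights.
            self.usage = self.decay * self.usage + read_w + write_w
            self.prev_read = read_w
            return read_out

    mem = LRUAMemory(num_slots=6, width=4)
    out = mem.step(read_key=np.ones(4), write_vec=np.array([1.0, 0.0, -1.0, 0.5]))
    print(out.round(3))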