Patents by Inventor Sergey Bartunov
Sergey Bartunov has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11983617
Abstract: A system for compressed data storage using a neural network. The system comprises a memory comprising a plurality of memory locations configured to store data; a query neural network configured to process a representation of an input data item to generate a query; an immutable key data store comprising key data for indexing the plurality of memory locations; an addressing system configured to process the key data and the query to generate a weighting associated with the plurality of memory locations; a memory read system configured to generate output memory data from the memory based upon the generated weighting associated with the plurality of memory locations and the data stored at the plurality of memory locations; and a memory write system configured to write received write data to the memory based upon the generated weighting associated with the plurality of memory locations.
Type: Grant
Filed: November 23, 2020
Date of Patent: May 14, 2024
Assignee: DeepMind Technologies Limited
Inventors: Jack William Rae, Timothy Paul Lillicrap, Sergey Bartunov
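The addressing scheme this abstract describes can be sketched in a few lines: a query is scored against an immutable set of keys, the resulting weighting drives both a weighted read from memory and a distributed write back to it. This is a minimal illustrative sketch; the softmax weighting, dimensions, and additive write rule are assumptions, not details taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

num_slots, key_dim, value_dim = 8, 4, 6
keys = rng.normal(size=(num_slots, key_dim))   # immutable key data store
memory = np.zeros((num_slots, value_dim))      # mutable memory locations

def addressing(query):
    """Process the key data and a query into a weighting over locations."""
    scores = keys @ query                      # similarity to each key
    scores -= scores.max()                     # numerical stability
    w = np.exp(scores)
    return w / w.sum()                         # softmax weighting, sums to 1

def read(query):
    """Memory read system: weighted sum of the stored data."""
    return addressing(query) @ memory

def write(query, data):
    """Memory write system: distribute write data by the same weighting."""
    global memory
    memory = memory + np.outer(addressing(query), data)

query = rng.normal(size=key_dim)
write(query, np.ones(value_dim))
out = read(query)   # reading with the same query recovers the written data
```

Because read and write share one weighting, a later read with the same query concentrates on the locations the write touched most.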
-
Publication number: 20240062060
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for solving mixed integer programs (MIPs) using neural networks. One of the methods includes obtaining data specifying parameters of a MIP; generating, from the parameters of the MIP, an input representation; processing the input representation using an encoder neural network to generate a respective embedding for each of the integer variables; generating a plurality of partial assignments by selecting a respective second, proper subset of the integer variables; and for each of the variables in the respective second subset, generating, using at least the respective embedding for the variable, a respective additional constraint on the value of the variable; generating, for each of the partial assignments, a corresponding candidate final assignment that assigns a respective value to each of the plurality of variables; and selecting, as a final assignment for the MIP, one of the candidate final assignments.
Type: Application
Filed: December 20, 2021
Publication date: February 22, 2024
Inventors: Sergey Bartunov, Felix Axel Gimeno Gil, Ingrid Karin von Glehn, Pawel Lichocki, Ivan Lobov, Vinod Nair, Brendan Timothy O'Donoghue, Nicolas Sonnerat, Christian Tjandraatmadja, Pengming Wang
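The "partial assignment" step above can be illustrated concretely. Suppose the encoder network has already produced, for each binary integer variable, a probability that it takes the value 1; a proper subset of the most confident variables is then fixed via additional constraints, shrinking the MIP handed to the solver. The probabilities, the confidence measure, and the selection fraction below are invented for illustration and are not taken from the patent.

```python
import numpy as np

def partial_assignment(probs, fraction=0.5):
    """Fix the most confident `fraction` of binary variables.

    Returns {variable index: fixed 0/1 value}, to be added as extra
    equality constraints before solving the reduced MIP.
    """
    probs = np.asarray(probs, dtype=float)
    confidence = np.abs(probs - 0.5)          # distance from "unsure"
    k = int(len(probs) * fraction)            # size of the proper subset
    chosen = np.argsort(-confidence)[:k]      # most confident first
    return {int(i): int(round(probs[i])) for i in chosen}

# Variables 2, 0 and 4 are the most confident, so they get fixed
# to 0, 1 and 1 respectively; variables 1 and 3 stay free.
fixed = partial_assignment([0.95, 0.40, 0.02, 0.55, 0.88], fraction=0.6)
```

Generating several such partial assignments (e.g. by sampling rather than rounding) yields multiple reduced problems, and the best resulting candidate becomes the final assignment.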
-
Publication number: 20220180147
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for implementing associative memory. In one aspect a system comprises an associative memory neural network that processes an input to generate an output defining an energy corresponding to the input. A reading subsystem retrieves stored information from the associative memory neural network. The reading subsystem performs operations including receiving a given (query) input and retrieving a data element from the associative memory neural network that is associated with the given input. The retrieving is performed by iteratively adjusting the given input using the associative memory neural network.
Type: Application
Filed: May 19, 2020
Publication date: June 9, 2022
Inventors: Sergey Bartunov, Jack William Rae, Timothy Paul Lillicrap, Simon Osindero
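Retrieval by "iteratively adjusting the given input" can be sketched with a simple energy-based associative memory. The concrete choice of energy here (a log-sum-exp over similarities to stored patterns, as in modern Hopfield networks) and its fixed-point update are assumptions made for illustration; the patent does not specify this form.

```python
import numpy as np

rng = np.random.default_rng(1)
patterns = rng.normal(size=(5, 16))       # stored data elements

def energy(x, beta=4.0):
    """Scalar energy of an input: low near stored patterns."""
    s = beta * patterns @ x
    return -(np.log(np.exp(s - s.max()).sum()) + s.max()) / beta

def retrieve(x, steps=10, beta=4.0):
    """Iteratively adjust the query input toward a stored pattern."""
    for _ in range(steps):
        s = beta * patterns @ x
        w = np.exp(s - s.max())
        x = (w / w.sum()) @ patterns      # softmax fixed-point update
    return x

noisy = patterns[2] + 0.3 * rng.normal(size=16)   # corrupted query input
clean = retrieve(noisy)   # ends up near the stored patterns[2]
```

The update repeatedly replaces the input with a similarity-weighted blend of the stored patterns, so a corrupted query settles onto the stored element it is closest to.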
-
Publication number: 20210150314
Abstract: A system for compressed data storage using a neural network. The system comprises a memory comprising a plurality of memory locations configured to store data; a query neural network configured to process a representation of an input data item to generate a query; an immutable key data store comprising key data for indexing the plurality of memory locations; an addressing system configured to process the key data and the query to generate a weighting associated with the plurality of memory locations; a memory read system configured to generate output memory data from the memory based upon the generated weighting associated with the plurality of memory locations and the data stored at the plurality of memory locations; and a memory write system configured to write received write data to the memory based upon the generated weighting associated with the plurality of memory locations.
Type: Application
Filed: November 23, 2020
Publication date: May 20, 2021
Inventors: Jack William Rae, Timothy Paul Lillicrap, Sergey Bartunov
-
Patent number: 10885426
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a controller neural network that includes a Least Recently Used Access (LRUA) subsystem configured to: maintain a respective usage weight for each of a plurality of locations in the external memory, and for each of the plurality of time steps: generate a respective reading weight for each location using a read key, read data from the locations in accordance with the reading weights, generate a respective writing weight for each of the locations from a respective reading weight from a preceding time step and the respective usage weight for the location, write a write vector to the locations in accordance with the writing weights, and update the respective usage weight from the respective reading weight and the respective writing weight.
Type: Grant
Filed: December 30, 2016
Date of Patent: January 5, 2021
Assignee: DeepMind Technologies Limited
Inventors: Adam Anthony Santoro, Daniel Pieter Wierstra, Timothy Paul Lillicrap, Sergey Bartunov, Ivo Danihelka
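The LRUA bookkeeping the abstract walks through can be sketched per time step: read by content, write to a blend of the previously read locations and the least-used location, then decay and update the usage weights. The cosine-similarity read, the interpolation gate `alpha`, and the decay factor `gamma` are illustrative assumptions, not values from the patent.

```python
import numpy as np

rng = np.random.default_rng(2)
num_slots, width = 6, 8
memory = rng.normal(size=(num_slots, width))
usage = np.zeros(num_slots)        # respective usage weight per location
prev_read = np.zeros(num_slots)    # reading weights from the preceding step

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def step(read_key, write_vec, alpha=0.5, gamma=0.95):
    global memory, usage, prev_read
    # Reading weight for each location from the read key (cosine similarity).
    sims = memory @ read_key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(read_key) + 1e-8)
    read_w = softmax(sims)
    read_out = read_w @ memory                 # read in accordance with weights
    # Writing weight: interpolate preceding read weights and least-used slot.
    least_used = np.zeros(num_slots)
    least_used[usage.argmin()] = 1.0
    write_w = alpha * prev_read + (1 - alpha) * least_used
    memory = memory + np.outer(write_w, write_vec)
    # Usage update: decayed usage plus the current read and write weights.
    usage = gamma * usage + read_w + write_w
    prev_read = read_w
    return read_out

out = step(rng.normal(size=width), rng.normal(size=width))
```

Writing into the least-used slot is what makes the scheme friendly to one-shot storage: new items land in stale locations instead of overwriting recently accessed ones.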
-
Patent number: 10846588
Abstract: A system for compressed data storage using a neural network. The system comprises a memory comprising a plurality of memory locations configured to store data; a query neural network configured to process a representation of an input data item to generate a query; an immutable key data store comprising key data for indexing the plurality of memory locations; an addressing system configured to process the key data and the query to generate a weighting associated with the plurality of memory locations; a memory read system configured to generate output memory data from the memory based upon the generated weighting associated with the plurality of memory locations and the data stored at the plurality of memory locations; and a memory write system configured to write received write data to the memory based upon the generated weighting associated with the plurality of memory locations.
Type: Grant
Filed: September 27, 2019
Date of Patent: November 24, 2020
Assignee: DeepMind Technologies Limited
Inventors: Jack William Rae, Timothy Paul Lillicrap, Sergey Bartunov
-
Publication number: 20200104677
Abstract: A system for compressed data storage using a neural network. The system comprises a memory comprising a plurality of memory locations configured to store data; a query neural network configured to process a representation of an input data item to generate a query; an immutable key data store comprising key data for indexing the plurality of memory locations; an addressing system configured to process the key data and the query to generate a weighting associated with the plurality of memory locations; a memory read system configured to generate output memory data from the memory based upon the generated weighting associated with the plurality of memory locations and the data stored at the plurality of memory locations; and a memory write system configured to write received write data to the memory based upon the generated weighting associated with the plurality of memory locations.
Type: Application
Filed: September 27, 2019
Publication date: April 2, 2020
Inventors: Jack William Rae, Timothy Paul Lillicrap, Sergey Bartunov
-
Publication number: 20170228637
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a controller neural network that includes a Least Recently Used Access (LRUA) subsystem configured to: maintain a respective usage weight for each of a plurality of locations in the external memory, and for each of the plurality of time steps: generate a respective reading weight for each location using a read key, read data from the locations in accordance with the reading weights, generate a respective writing weight for each of the locations from a respective reading weight from a preceding time step and the respective usage weight for the location, write a write vector to the locations in accordance with the writing weights, and update the respective usage weight from the respective reading weight and the respective writing weight.
Type: Application
Filed: December 30, 2016
Publication date: August 10, 2017
Inventors: Adam Anthony Santoro, Daniel Pieter Wierstra, Timothy Paul Lillicrap, Sergey Bartunov, Ivo Danihelka