Patents by Inventor Ivo Danihelka

Ivo Danihelka has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230419076
    Abstract: Methods and systems, including computer programs encoded on computer storage media, for generating data items. A method includes reading a glimpse from a data item using a decoder hidden state vector of a decoder for a preceding time step, providing, as input to an encoder, the glimpse and decoder hidden state vector for the preceding time step for processing, receiving, as output from the encoder, a generated encoder hidden state vector for the time step, generating a decoder input from the generated encoder hidden state vector, providing the decoder input to the decoder for processing, receiving, as output from the decoder, a generated decoder hidden state vector for the time step, generating a neural network output update from the decoder hidden state vector for the time step, and combining the neural network output update with a current neural network output to generate an updated neural network output.
    Type: Application
    Filed: September 12, 2023
    Publication date: December 28, 2023
    Inventors: Karol Gregor, Ivo Danihelka
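
The application above (and the related grants later in this list) describes a read–encode–decode–write loop for generating data items. The Python sketch below shows one plausible shape of that loop; the tanh cells standing in for LSTMs, the error-based glimpse, and every size constant are assumptions for illustration, not the claimed mechanism.

```python
# Illustrative loop: read a glimpse, encode it together with the previous
# decoder state, decode, and accumulate an update into the running output.
import numpy as np

rng = np.random.default_rng(0)
D, H = 16, 8                      # data-item and hidden sizes (assumed)

def rnn_step(x, h, W):            # tanh stand-in for an LSTM cell
    return np.tanh(W @ np.concatenate([x, h]))

W_enc = rng.normal(0, 0.1, (H, D + 2 * H))
W_dec = rng.normal(0, 0.1, (H, 2 * H))
W_out = rng.normal(0, 0.1, (D, H))

x = rng.normal(size=D)            # the data item being read
h_enc, h_dec = np.zeros(H), np.zeros(H)
output = np.zeros(D)              # the "current neural network output"

for t in range(10):
    glimpse = x - output                        # simple error-based glimpse
    # encoder sees the glimpse and the decoder state from the preceding step
    h_enc = rnn_step(np.concatenate([glimpse, h_dec]), h_enc, W_enc)
    decoder_input = h_enc                       # derived from the encoder state
    h_dec = rnn_step(decoder_input, h_dec, W_dec)
    output += W_out @ h_dec                     # combine update with current output
print("updated output:", output[:4])
```
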
  • Patent number: 11790209
    Abstract: Methods and systems, including computer programs encoded on computer storage media, for generating data items. A method includes reading a glimpse from a data item using a decoder hidden state vector of a decoder for a preceding time step, providing, as input to an encoder, the glimpse and decoder hidden state vector for the preceding time step for processing, receiving, as output from the encoder, a generated encoder hidden state vector for the time step, generating a decoder input from the generated encoder hidden state vector, providing the decoder input to the decoder for processing, receiving, as output from the decoder, a generated decoder hidden state vector for the time step, generating a neural network output update from the decoder hidden state vector for the time step, and combining the neural network output update with a current neural network output to generate an updated neural network output.
    Type: Grant
    Filed: July 23, 2021
    Date of Patent: October 17, 2023
    Assignee: DeepMind Technologies Limited
    Inventors: Karol Gregor, Ivo Danihelka
  • Publication number: 20220366246
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using an environment model to simulate state transitions of an environment being interacted with by an agent that is controlled using a policy neural network. One of the methods includes initializing an internal representation of a state of the environment at a current time point; repeatedly performing the following operations: receiving an action to be performed by the agent; generating, based on the internal representation, a predicted latent representation that is a prediction of a latent representation that would have been generated by the policy neural network by processing an observation characterizing the state of the environment corresponding to the internal representation; and updating the internal representation to simulate a state transition caused by the agent performing the received action by processing the predicted latent representation and the received action using the environment model.
    Type: Application
    Filed: September 24, 2020
    Publication date: November 17, 2022
    Inventors: Ivo Danihelka, Danilo Jimenez Rezende, Karol Gregor, Georgios Papamakarios, Theophane Guillaume Weber
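
The rollout described in the entry above can be written as a tiny loop: the environment model never sees real observations while simulating; it predicts the policy network's latent representation from its own internal state and feeds that prediction back in. In the assumed sketch below, `W_pred` and `W_trans` are placeholder linear maps, and all sizes are invented.

```python
# Simulate state transitions without real observations: predict the latent
# representation from the internal state, then update the state from that
# prediction and the received action.
import numpy as np

rng = np.random.default_rng(0)
S, Z, A = 8, 4, 2                              # state/latent/action sizes (assumed)
W_pred = rng.normal(0, 0.1, (Z, S))
W_trans = rng.normal(0, 0.1, (S, S + Z + A))

s = np.zeros(S)                                # initialized internal representation
for step in range(5):
    action = rng.normal(size=A)                # action to be performed by the agent
    z_hat = np.tanh(W_pred @ s)                # predicted latent representation
    s = np.tanh(W_trans @ np.concatenate([s, z_hat, action]))  # simulated transition
print("internal representation after rollout:", s[:3])
```
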
  • Patent number: 11256990
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for training a recurrent neural network on training sequences using backpropagation through time. In one aspect, a method includes receiving a training sequence including a respective input at each of a number of time steps; obtaining data defining an amount of memory allocated to storing forward propagation information for use during backpropagation; determining, from the number of time steps in the training sequence and from the amount of memory allocated to storing the forward propagation information, a training policy for processing the training sequence, wherein the training policy defines when to store forward propagation information during forward propagation of the training sequence; and training the recurrent neural network on the training sequence in accordance with the training policy.
    Type: Grant
    Filed: May 19, 2017
    Date of Patent: February 22, 2022
    Assignee: DeepMind Technologies Limited
    Inventors: Marc Lanctot, Audrunas Gruslys, Ivo Danihelka, Remi Munos
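
The patent above concerns choosing when to store forward-propagation results under a memory budget. The recursive sketch below illustrates the general checkpointing idea with a simple halving policy; the actual method derives an optimal storage policy by dynamic programming, so the splitting rule, the toy cell, and the budget accounting here are all assumptions.

```python
# Checkpointed backpropagation through time: store at most `budget`
# intermediate states per span, recomputing the rest during the backward pass.
def backprop_span(h0, xs, g, f, b, budget):
    """Return the gradient at h0, given gradient g after consuming xs."""
    if len(xs) == 1:
        return b(h0, xs[0], g)
    if budget == 0:
        # No checkpoints allowed: recompute each needed state from h0.
        for i in reversed(range(len(xs))):
            h = h0
            for x in xs[:i]:
                h = f(h, x)
            g = b(h, xs[i], g)
        return g
    mid = len(xs) // 2
    h_mid = h0
    for x in xs[:mid]:
        h_mid = f(h_mid, x)        # store a single checkpoint at the midpoint
    g = backprop_span(h_mid, xs[mid:], g, f, b, budget - 1)
    return backprop_span(h0, xs[:mid], g, f, b, budget)  # checkpoint slot freed

# Toy linear "RNN": h' = w*h + x, so dh'/dh = w at every step.
w = 0.9
f = lambda h, x: w * h + x
b = lambda h_prev, x, g: g * w
print(backprop_span(1.0, [0.1] * 8, 1.0, f, b, budget=2))  # 0.9**8, about 0.4305
```
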
  • Patent number: 11210579
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the methods includes providing an output derived from a first portion of a neural network output as a system output; determining one or more sets of writing weights for each of a plurality of locations in an external memory; writing data defined by a third portion of the neural network output to the external memory in accordance with the sets of writing weights; determining one or more sets of reading weights for each of the plurality of locations in the external memory from a fourth portion of the neural network output; reading data from the external memory in accordance with the sets of reading weights; and combining the data read from the external memory with a next system input to generate the next neural network input.
    Type: Grant
    Filed: March 26, 2020
    Date of Patent: December 28, 2021
    Assignee: DeepMind Technologies Limited
    Inventors: Alexander Benjamin Graves, Ivo Danihelka, Gregory Duncan Wayne
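
The split of a single network output into portions that drive the system output, the write, and the read can be shown concretely. In the assumed sketch below, softmax addressing replaces the patent's weighting mechanisms, and the portion layout is invented for illustration.

```python
# One interface step: slice the controller output into portions, write to
# memory with the writing weights, read with the reading weights, and build
# the next neural network input from the read data and the next system input.
import numpy as np

rng = np.random.default_rng(0)
N, W = 6, 4                        # memory rows and row width (assumed)
memory = np.zeros((N, W))

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def step(nn_output, next_system_input):
    out, write_logits, write_vec, read_logits = np.split(
        nn_output, [W, W + N, 2 * W + N])
    w_write = softmax(write_logits)                # writing weight per location
    memory[:] += np.outer(w_write, write_vec)      # write the third portion
    w_read = softmax(read_logits)                  # reading weight per location
    read = w_read @ memory                         # read data from memory
    next_nn_input = np.concatenate([next_system_input, read])
    return out, next_nn_input

system_out, nn_in = step(rng.normal(size=2 * W + 2 * N), rng.normal(size=3))
print(system_out.shape, nn_in.shape)
```
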
  • Publication number: 20210350207
    Abstract: Methods and systems, including computer programs encoded on computer storage media, for generating data items. A method includes reading a glimpse from a data item using a decoder hidden state vector of a decoder for a preceding time step, providing, as input to an encoder, the glimpse and decoder hidden state vector for the preceding time step for processing, receiving, as output from the encoder, a generated encoder hidden state vector for the time step, generating a decoder input from the generated encoder hidden state vector, providing the decoder input to the decoder for processing, receiving, as output from the decoder, a generated decoder hidden state vector for the time step, generating a neural network output update from the decoder hidden state vector for the time step, and combining the neural network output update with a current neural network output to generate an updated neural network output.
    Type: Application
    Filed: July 23, 2021
    Publication date: November 11, 2021
    Inventors: Karol Gregor, Ivo Danihelka
  • Patent number: 11151443
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a sparse memory access subsystem that is configured to perform operations comprising generating a sparse set of reading weights that includes a respective reading weight for each of the plurality of locations in the external memory using the read key, reading data from the plurality of locations in the external memory in accordance with the sparse set of reading weights, generating a set of writing weights that includes a respective writing weight for each of the plurality of locations in the external memory, and writing the write vector to the plurality of locations in the external memory in accordance with the writing weights.
    Type: Grant
    Filed: February 3, 2017
    Date of Patent: October 19, 2021
    Assignee: DeepMind Technologies Limited
    Inventors: Ivo Danihelka, Gregory Duncan Wayne, Fu-min Wang, Edward Thomas Grefenstette, Jack William Rae, Alexander Benjamin Graves, Timothy Paul Lillicrap, Timothy James Alexander Harley, Jonathan James Hunt
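
A sparse set of reading weights simply means most locations get weight zero. The sketch below makes that concrete with a top-K softmax over key similarities; K, the dot-product similarity, and the sizes are assumptions.

```python
# Sparse read: only the K best-matching locations receive nonzero weight,
# so each access touches K rows of memory instead of all N.
import numpy as np

rng = np.random.default_rng(0)
N, W, K = 64, 8, 4
memory = rng.normal(size=(N, W))

def sparse_read(read_key, memory, k=K):
    scores = memory @ read_key                    # similarity to each location
    top = np.argpartition(scores, -k)[-k:]        # indices of the k best matches
    weights = np.zeros(N)
    e = np.exp(scores[top] - scores[top].max())
    weights[top] = e / e.sum()                    # sparse set of reading weights
    return weights @ memory, weights

read, w = sparse_read(rng.normal(size=W), memory)
print("nonzero reading weights:", np.count_nonzero(w))   # -> 4
```
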
  • Patent number: 11080594
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory using reinforcement learning. One of the methods includes providing an output derived from the system output portion of the neural network output as a system output in the sequence of system outputs; selecting a memory access process from a predetermined set of memory access processes for accessing the external memory from the reinforcement learning portion of the neural network output; writing and reading data from locations in the external memory in accordance with the selected memory access process using the differentiable portion of the neural network output; and combining the data read from the external memory with a next system input in the sequence of system inputs to generate a next neural network input in the sequence of neural network inputs.
    Type: Grant
    Filed: December 30, 2016
    Date of Patent: August 3, 2021
    Assignee: DeepMind Technologies Limited
    Inventors: Ilya Sutskever, Ivo Danihelka, Alexander Benjamin Graves, Gregory Duncan Wayne, Wojciech Zaremba
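
The distinctive step above is that a discrete choice of memory-access process is made by a reinforcement-learned portion of the output. The sketch below shows only that control flow; the two access processes and the sampling scheme are invented placeholders, not the patented set.

```python
# Select a memory-access process from a predetermined set by sampling from
# the RL portion of the network output, then run it on the write vector.
import numpy as np

rng = np.random.default_rng(0)
memory = np.zeros((4, 3))

def append_process(vec):                 # placeholder access process 1
    memory[:-1] = memory[1:]
    memory[-1] = vec

def overwrite_process(vec):              # placeholder access process 2
    memory[0] = vec

PROCESSES = [append_process, overwrite_process]

def step(rl_logits, write_vec):
    probs = np.exp(rl_logits) / np.exp(rl_logits).sum()
    choice = rng.choice(len(PROCESSES), p=probs)   # sampled action; trained by RL
    PROCESSES[choice](write_vec)
    return memory.mean(axis=0)                     # data read back for next input

read = step(rng.normal(size=2), rng.normal(size=3))
print(read)
```
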
  • Patent number: 11080587
    Abstract: Methods and systems, including computer programs encoded on computer storage media, for generating data items. A method includes reading a glimpse from a data item using a decoder hidden state vector of a decoder for a preceding time step, providing, as input to an encoder, the glimpse and decoder hidden state vector for the preceding time step for processing, receiving, as output from the encoder, a generated encoder hidden state vector for the time step, generating a decoder input from the generated encoder hidden state vector, providing the decoder input to the decoder for processing, receiving, as output from the decoder, a generated decoder hidden state vector for the time step, generating a neural network output update from the decoder hidden state vector for the time step, and combining the neural network output update with a current neural network output to generate an updated neural network output.
    Type: Grant
    Filed: February 4, 2016
    Date of Patent: August 3, 2021
    Assignee: DeepMind Technologies Limited
    Inventors: Karol Gregor, Ivo Danihelka
  • Patent number: 11010663
    Abstract: Systems, methods, and apparatus, including computer programs encoded on a computer storage medium, related to associative long short-term memory (LSTM) neural network layers configured to maintain N copies of an internal state for the associative LSTM layer, N being an integer greater than one. In one aspect, a system includes a recurrent neural network including an associative LSTM layer, wherein the associative LSTM layer is configured to, for each time step, receive a layer input, update each of the N copies of the internal state using the layer input for the time step and a layer output generated by the associative LSTM layer for a preceding time step, and generate a layer output for the time step using the N updated copies of the internal state.
    Type: Grant
    Filed: December 30, 2016
    Date of Patent: May 18, 2021
    Assignee: DeepMind Technologies Limited
    Inventors: Ivo Danihelka, Nal Emmerich Kalchbrenner, Gregory Duncan Wayne, Benigno Uría-Martínez, Alexander Benjamin Graves
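
The bookkeeping of N redundant state copies can be sketched as below. Note that the real associative LSTM binds keys and values with complex-valued holographic representations; the fixed permutations and tanh update here are merely stand-ins to show the per-step update of all N copies and the layer output formed from them.

```python
# Keep N copies of the internal state; update each copy from the layer
# input and the preceding layer output, then form the output from all N.
import numpy as np

rng = np.random.default_rng(0)
N, H = 4, 6                                      # copies and state size (assumed)
perms = [rng.permutation(H) for _ in range(N)]   # one fixed "binding" per copy
W = rng.normal(0, 0.1, (H, 2 * H))

states = [np.zeros(H) for _ in range(N)]         # N copies of the internal state
y_prev = np.zeros(H)                             # layer output of preceding step

def layer_step(x):
    global y_prev
    for i in range(N):
        update = np.tanh(W @ np.concatenate([x, y_prev]))
        states[i] = states[i] + update[perms[i]]   # each copy stores a rotated update
    y_prev = np.tanh(sum(states) / N)              # output from the N updated copies
    return y_prev

print(layer_step(rng.normal(size=H))[:3])
```
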
  • Publication number: 20210117801
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a memory interface subsystem that is configured to perform operations comprising determining a respective content-based weight for each of a plurality of locations in an external memory; determining a respective allocation weight for each of the plurality of locations in the external memory; determining a respective final writing weight for each of the plurality of locations in the external memory from the respective content-based weight for the location and the respective allocation weight for the location; and writing data defined by the write vector to the external memory in accordance with the final writing weights.
    Type: Application
    Filed: November 9, 2020
    Publication date: April 22, 2021
    Inventors: Alexander Benjamin Graves, Ivo Danihelka, Timothy James Alexander Harley, Malcolm Kevin Campbell Reynolds, Gregory Duncan Wayne
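
The final writing weight above combines two signals per location. The sketch below interpolates a content-based softmax with a usage-based allocation weighting; the usage tracking and the interpolation gate are simplified assumptions, not the patented scheme.

```python
# Write step: mix a content-based weight (key similarity) with an allocation
# weight (preference for unused locations) into a final writing weight.
import numpy as np

rng = np.random.default_rng(0)
N, W = 8, 5
memory = rng.normal(size=(N, W))
usage = rng.uniform(size=N)                  # how "used" each location is

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def write(write_key, write_vec, gate):
    content_w = softmax(memory @ write_key)      # content-based weight per location
    free = 1.0 - usage
    alloc_w = free / free.sum()                  # allocation weight per location
    final_w = gate * content_w + (1 - gate) * alloc_w
    memory[:] += np.outer(final_w, write_vec)    # write with the final weights
    usage[:] = np.clip(usage + final_w, 0, 1)

write(rng.normal(size=W), rng.normal(size=W), gate=0.3)
print(usage)
```
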
  • Patent number: 10885426
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a controller neural network that includes a Least Recently Used Access (LRUA) subsystem configured to: maintain a respective usage weight for each of a plurality of locations in the external memory, and for each of the plurality of time steps: generate a respective reading weight for each location using a read key, read data from the locations in accordance with the reading weights, generate a respective writing weight for each of the locations from a respective reading weight from a preceding time step and the respective usage weight for the location, write a write vector to the locations in accordance with the writing weights, and update the respective usage weight from the respective reading weight and the respective writing weight.
    Type: Grant
    Filed: December 30, 2016
    Date of Patent: January 5, 2021
    Assignee: DeepMind Technologies Limited
    Inventors: Adam Anthony Santoro, Daniel Pieter Wierstra, Timothy Paul Lillicrap, Sergey Bartunov, Ivo Danihelka
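
The LRUA recipe above maps quite directly onto code: read content-based, write to an interpolation of the previously read locations and the least-used location, then refresh the usage weights. The decay constant `gamma` and gate `alpha` below are assumed free parameters.

```python
# One LRUA-style time step: content-based read, write steered toward the
# previous read location or the least-used location, then usage update.
import numpy as np

rng = np.random.default_rng(0)
N, W, gamma = 8, 5, 0.9
memory = rng.normal(size=(N, W))
usage = np.zeros(N)                              # usage weight per location
prev_read_w = np.zeros(N)

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def step(read_key, write_vec, alpha=0.5):
    global usage, prev_read_w
    read_w = softmax(memory @ read_key)          # reading weight per location
    lu = np.zeros(N)
    lu[np.argmin(usage)] = 1.0                   # least-used location
    write_w = alpha * prev_read_w + (1 - alpha) * lu
    memory[:] += np.outer(write_w, write_vec)    # write with the writing weights
    usage = gamma * usage + read_w + write_w     # update usage from read and write
    prev_read_w = read_w
    return read_w @ memory

print(step(rng.normal(size=W), rng.normal(size=W))[:3])
```
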
  • Patent number: 10832134
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the systems includes a memory interface subsystem that is configured to perform operations comprising determining a respective content-based weight for each of a plurality of locations in an external memory; determining a respective allocation weight for each of the plurality of locations in the external memory; determining a respective final writing weight for each of the plurality of locations in the external memory from the respective content-based weight for the location and the respective allocation weight for the location; and writing data defined by the write vector to the external memory in accordance with the final writing weights.
    Type: Grant
    Filed: December 9, 2016
    Date of Patent: November 10, 2020
    Assignee: DeepMind Technologies Limited
    Inventors: Alexander Benjamin Graves, Ivo Danihelka, Timothy James Alexander Harley, Malcolm Kevin Campbell Reynolds, Gregory Duncan Wayne
  • Publication number: 20200226446
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the methods includes providing an output derived from a first portion of a neural network output as a system output; determining one or more sets of writing weights for each of a plurality of locations in an external memory; writing data defined by a third portion of the neural network output to the external memory in accordance with the sets of writing weights; determining one or more sets of reading weights for each of the plurality of locations in the external memory from a fourth portion of the neural network output; reading data from the external memory in accordance with the sets of reading weights; and combining the data read from the external memory with a next system input to generate the next neural network input.
    Type: Application
    Filed: March 26, 2020
    Publication date: July 16, 2020
    Applicant: DeepMind Technologies Limited
    Inventors: Alexander Benjamin Graves, Ivo Danihelka, Gregory Duncan Wayne
  • Patent number: 10691997
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks to generate additional outputs. One of the systems includes a neural network and a sequence processing subsystem, wherein the sequence processing subsystem is configured to perform operations comprising, for each of the system inputs in a sequence of system inputs: receiving the system input; generating an initial neural network input from the system input; causing the neural network to process the initial neural network input to generate an initial neural network output for the system input; and determining, from a first portion of the initial neural network output for the system input, whether or not to cause the neural network to generate one or more additional neural network outputs for the system input.
    Type: Grant
    Filed: December 21, 2015
    Date of Patent: June 23, 2020
    Assignee: DeepMind Technologies Limited
    Inventors: Alexander Benjamin Graves, Ivo Danihelka, Gregory Duncan Wayne
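
The mechanism above is an adaptive-computation loop: one portion of each network output is read as a flag deciding whether the network runs again on the same system input. The sketch below shows that control flow with a toy network; the threshold, the input packing, and the cap on extra outputs are assumptions.

```python
# For each system input, keep generating outputs until the flag portion of
# the network output says to stop (or a cap is reached).
import numpy as np

rng = np.random.default_rng(0)
W_net = rng.normal(0, 0.5, (5, 4))

def network(x):
    return np.tanh(W_net @ x)

def process(system_input, max_extra=3):
    outputs = []
    x = np.concatenate([system_input, [1.0]])      # initial neural network input
    for _ in range(1 + max_extra):
        y = network(x)
        flag, out = y[0], y[1:]                    # first portion: continue flag
        outputs.append(out)
        if flag < 0.5:                             # decide whether to emit more
            break
        x = np.concatenate([out, [flag]])[:4]      # next input for the same system input
    return outputs

outs = process(rng.normal(size=3))
print(len(outs), "output(s) for this system input")
```
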
  • Patent number: 10657436
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for a neural network system. In one aspect, a neural network system includes a recurrent neural network that is configured to, for each time step of a predetermined number of time steps, receive a set of latent variables for the time step and process the latent variables to update a hidden state of the recurrent neural network; and a generative subsystem that is configured to, for each time step, generate the set of latent variables for the time step and provide the set of latent variables as input to the recurrent neural network; update a hidden canvas using the updated hidden state of the recurrent neural network; and, for a last time step, generate an output image using the updated hidden canvas for the last time step.
    Type: Grant
    Filed: January 7, 2019
    Date of Patent: May 19, 2020
    Assignee: DeepMind Technologies Limited
    Inventors: Ivo Danihelka, Danilo Jimenez Rezende, Shakir Mohamed
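
The generative subsystem above alternates latent sampling, a recurrent state update, and a hidden-canvas update, emitting an image only at the last step. The sketch below mirrors that schedule; the linear maps, sigmoid readout, and all sizes are illustrative assumptions.

```python
# Hidden-canvas generation: per step, sample latents, update the recurrent
# state from them, update the canvas from the state; decode only at the end.
import numpy as np

rng = np.random.default_rng(0)
Z, H, C, D = 4, 8, 8, 16          # latent, hidden, canvas, image sizes (assumed)
W_h = rng.normal(0, 0.1, (H, Z + H))
W_c = rng.normal(0, 0.1, (C, H))
W_img = rng.normal(0, 0.1, (D, C))

h = np.zeros(H)
canvas = np.zeros(C)              # the hidden canvas
for t in range(10):
    z = rng.normal(size=Z)                        # latent variables for this step
    h = np.tanh(W_h @ np.concatenate([z, h]))     # update recurrent hidden state
    canvas += W_c @ h                             # update the hidden canvas
image = 1 / (1 + np.exp(-(W_img @ canvas)))       # output image from final canvas
print(image[:4])
```
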
  • Patent number: 10650302
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the methods includes providing an output derived from a first portion of a neural network output as a system output; determining one or more sets of writing weights for each of a plurality of locations in an external memory; writing data defined by a third portion of the neural network output to the external memory in accordance with the sets of writing weights; determining one or more sets of reading weights for each of the plurality of locations in the external memory from a fourth portion of the neural network output; reading data from the external memory in accordance with the sets of reading weights; and combining the data read from the external memory with a next system input to generate the next neural network input.
    Type: Grant
    Filed: October 16, 2015
    Date of Patent: May 12, 2020
    Assignee: DeepMind Technologies Limited
    Inventors: Alexander Benjamin Graves, Ivo Danihelka, Gregory Duncan Wayne
  • Patent number: 10482373
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for implementing grid Long Short-Term Memory (LSTM) neural networks that include a plurality of N-LSTM blocks arranged in an N-dimensional grid. Each N-LSTM block is configured to: receive N input hidden vectors, the N input hidden vectors each corresponding to a respective one of the N dimensions; receive N input memory vectors, the N input memory vectors each corresponding to a respective one of the N dimensions; and, for each of the dimensions, apply a respective transform for the dimension to the input memory vector corresponding to the dimension and the input hidden vector corresponding to the dimension to generate a new hidden vector corresponding to the dimension and a new memory vector corresponding to the dimension.
    Type: Grant
    Filed: June 6, 2016
    Date of Patent: November 19, 2019
    Assignee: DeepMind Technologies Limited
    Inventors: Nal Emmerich Kalchbrenner, Ivo Danihelka, Alexander Benjamin Graves
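
A 2-dimensional instance of the N-LSTM block above can be sketched compactly. For brevity, the per-dimension LSTM transform is replaced by a tanh layer mapping all input hidden vectors plus that dimension's memory vector to a new (hidden, memory) pair; the cell internals and sizes are assumptions.

```python
# One grid block with N_DIM = 2: per dimension, a transform produces a new
# hidden vector and a new memory vector for that dimension.
import numpy as np

rng = np.random.default_rng(0)
N_DIM, H = 2, 6
W = [rng.normal(0, 0.1, (2 * H, N_DIM * H + H)) for _ in range(N_DIM)]

def grid_block(hiddens, memories):
    h_all = np.concatenate(hiddens)               # hidden vectors from all dimensions
    new_h, new_m = [], []
    for d in range(N_DIM):                        # one transform per dimension
        out = np.tanh(W[d] @ np.concatenate([h_all, memories[d]]))
        new_h.append(out[:H])                     # new hidden vector for dimension d
        new_m.append(out[H:])                     # new memory vector for dimension d
    return new_h, new_m

h0 = [rng.normal(size=H) for _ in range(N_DIM)]
m0 = [np.zeros(H) for _ in range(N_DIM)]
h1, m1 = grid_block(h0, m0)
print(h1[0][:3])
```
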
  • Publication number: 20190213469
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for a neural network system. In one aspect, a neural network system includes a recurrent neural network that is configured to, for each time step of a predetermined number of time steps, receive a set of latent variables for the time step and process the latent variables to update a hidden state of the recurrent neural network; and a generative subsystem that is configured to, for each time step, generate the set of latent variables for the time step and provide the set of latent variables as input to the recurrent neural network; update a hidden canvas using the updated hidden state of the recurrent neural network; and, for a last time step, generate an output image using the updated hidden canvas for the last time step.
    Type: Application
    Filed: January 7, 2019
    Publication date: July 11, 2019
    Inventors: Ivo Danihelka, Danilo Jimenez Rezende, Shakir Mohamed
  • Publication number: 20190188572
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for training a recurrent neural network on training sequences using backpropagation through time. In one aspect, a method includes receiving a training sequence including a respective input at each of a number of time steps; obtaining data defining an amount of memory allocated to storing forward propagation information for use during backpropagation; determining, from the number of time steps in the training sequence and from the amount of memory allocated to storing the forward propagation information, a training policy for processing the training sequence, wherein the training policy defines when to store forward propagation information during forward propagation of the training sequence; and training the recurrent neural network on the training sequence in accordance with the training policy.
    Type: Application
    Filed: May 19, 2017
    Publication date: June 20, 2019
    Applicant: DeepMind Technologies Limited
    Inventors: Marc Lanctot, Audrunas Gruslys, Ivo Danihelka, Remi Munos