Patents by Inventor Philip Blunsom

Philip Blunsom is a named inventor on the patent filings listed below. The listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230289598
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the methods includes providing an output derived from the neural network output for the time step as a system output for the time step; maintaining a current state of the external memory; determining, from the neural network output for the time step, memory state parameters for the time step; updating the current state of the external memory using the memory state parameters for the time step; reading data from the external memory in accordance with the updated state of the external memory; and combining the data read from the external memory with a system input for the next time step to generate the neural network input for the next time step.
    Type: Application
    Filed: February 24, 2023
    Publication date: September 14, 2023
    Inventors: Edward Thomas Grefenstette, Karl Moritz Hermann, Mustafa Suleyman, Philip Blunsom
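
The entry above (publication 20230289598; the same abstract appears on granted patent 11593640 and related filings below) describes augmenting a neural network with an external memory. As a rough illustration of the loop the abstract walks through, the sketch below couples a toy recurrent controller to a continuous stack-like memory; the parameter names, sizes, and the stack-style update rule are illustrative assumptions, not details taken from the filing.

```python
import numpy as np

# Toy controller + external memory loop, mirroring the steps in the abstract:
# the controller output yields memory-state parameters, the memory is updated,
# a read vector is produced, and the read is combined with the next input.
rng = np.random.default_rng(0)
HIDDEN, INPUT, MEM_WIDTH, STEPS = 16, 8, 8, 5

W_h = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))            # recurrent weights
W_x = rng.normal(scale=0.1, size=(HIDDEN, INPUT + MEM_WIDTH)) # input weights
W_out = rng.normal(scale=0.1, size=(MEM_WIDTH + 2, HIDDEN))   # write vector + push/pop strengths

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def controller_step(h, x):
    """One recurrent controller step; returns the new hidden state and raw outputs."""
    h_new = np.tanh(W_h @ h + W_x @ x)
    return h_new, W_out @ h_new

memory = []                      # external memory state: (vector, strength) pairs
h = np.zeros(HIDDEN)
read_vec = np.zeros(MEM_WIDTH)

for t in range(STEPS):
    system_input = rng.normal(size=INPUT)                 # stand-in for the real system input
    net_input = np.concatenate([system_input, read_vec])  # combine read with the next input

    h, out = controller_step(h, net_input)
    write_vec = out[:MEM_WIDTH]                           # memory-state parameters
    push, pop = sigmoid(out[-2]), sigmoid(out[-1])

    # Update the memory state: weaken existing entries by the pop strength,
    # then store the new write vector with the push strength.
    memory = [(v, s - pop) for v, s in memory if s - pop > 0.0]
    memory.append((write_vec, push))

    # Read from the updated memory: strength-weighted sum of stored vectors.
    read_vec = sum(s * v for v, s in memory)

    system_output = h                                     # output derived from the controller output
    print(f"step {t}: |memory|={len(memory)}, output-norm={np.linalg.norm(system_output):.3f}")
```
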
  • Patent number: 11593640
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the methods includes providing an output derived from the neural network output for the time step as a system output for the time step; maintaining a current state of the external memory; determining, from the neural network output for the time step, memory state parameters for the time step; updating the current state of the external memory using the memory state parameters for the time step; reading data from the external memory in accordance with the updated state of the external memory; and combining the data read from the external memory with a system input for the next time step to generate the neural network input for the next time step.
    Type: Grant
    Filed: September 9, 2019
    Date of Patent: February 28, 2023
    Assignee: DeepMind Technologies Limited
    Inventors: Edward Thomas Grefenstette, Karl Moritz Hermann, Mustafa Suleyman, Philip Blunsom
  • Publication number: 20220318516
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for selecting actions to be performed by an agent interacting with an environment. In one aspect, a system includes a language encoder model that is configured to receive a text string in a particular natural language, and process the text string to generate a text embedding of the text string. The system includes an observation encoder neural network that is configured to receive an observation characterizing a state of the environment, and process the observation to generate an observation embedding of the observation. The system includes a subsystem that is configured to obtain a current text embedding of a current text string and a current observation embedding of a current observation. The subsystem is configured to select an action to be performed by the agent in response to the current observation.
    Type: Application
    Filed: May 16, 2022
    Publication date: October 6, 2022
    Inventors: Karl Moritz Hermann, Philip Blunsom, Felix George Hill
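
Publication 20220318516 above (and granted patent 11354509 further down the list) describes conditioning an agent's action selection on both a natural-language text embedding and an observation embedding. The sketch below is a hedged toy version using a bag-of-words language encoder, a single-layer observation encoder, and a linear action scorer; these specific encoder choices and all names are assumptions made for illustration, not details from the filing.

```python
import numpy as np

rng = np.random.default_rng(0)
EMB, OBS_DIM, N_ACTIONS = 16, 10, 4
VOCAB = {w: i for i, w in enumerate("go to the red key green door".split())}

# Illustrative parameters (randomly initialised; a real system would learn these).
word_emb = rng.normal(scale=0.1, size=(len(VOCAB), EMB))   # language encoder table
W_obs = rng.normal(scale=0.1, size=(EMB, OBS_DIM))         # observation encoder
W_act = rng.normal(scale=0.1, size=(N_ACTIONS, 2 * EMB))   # action-selection head

def encode_text(text):
    """Language encoder: mean of word embeddings (bag-of-words stand-in)."""
    ids = [VOCAB[w] for w in text.lower().split() if w in VOCAB]
    return word_emb[ids].mean(axis=0)

def encode_observation(obs):
    """Observation encoder: a single linear layer with a tanh nonlinearity."""
    return np.tanh(W_obs @ obs)

def select_action(text_embedding, obs_embedding):
    """Combine the two embeddings and pick the highest-scoring action."""
    joint = np.concatenate([text_embedding, obs_embedding])
    scores = W_act @ joint
    return int(np.argmax(scores)), scores

instruction = "go to the red key"
observation = rng.normal(size=OBS_DIM)   # stand-in for a real environment observation

action, scores = select_action(encode_text(instruction), encode_observation(observation))
print("chosen action:", action, "scores:", np.round(scores, 3))
```
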
  • Patent number: 11423237
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating a target sequence from an input sequence. In one aspect, a method comprises maintaining a set of current hypotheses, wherein each current hypothesis comprises an input prefix and an output prefix. For each possible combination of input and output prefix length, the method extends any current hypothesis that could reach the possible combination to generate respective extended hypotheses for each such current hypothesis; determines a respective direct score for each extended hypothesis using a direct model; determines a first number of highest-scoring hypotheses according to the direct scores; rescores the first number of highest-scoring hypotheses using a noisy channel model to generate a reduced number of hypotheses; and adds the reduced number of hypotheses to the set of current hypotheses.
    Type: Grant
    Filed: January 17, 2020
    Date of Patent: August 23, 2022
    Assignee: DeepMind Technologies Limited
    Inventors: Lei Yu, Christopher James Dyer, Tomas Kocisky, Philip Blunsom
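
Patent 11423237 above (with related publications 20200151398 and 20190258718 later in the list) covers a decoding procedure that maintains hypotheses over input/output prefix pairs, scores extensions with a direct model, and rescores the best candidates with a noisy channel model. The sketch below reproduces only that control flow; the two scoring functions are placeholder stubs, not the models described in the patent.

```python
from dataclasses import dataclass

# Control flow of the hypothesis-maintenance loop in the abstract: extend
# hypotheses over (input prefix, output prefix) lengths, keep the k1 best under
# a direct model, rescore those with a noisy channel model, and keep k2 of them.
# Both scoring functions below are illustrative stubs, not real models.

@dataclass
class Hypothesis:
    input_prefix: tuple
    output_prefix: tuple
    direct_score: float = 0.0
    channel_score: float = 0.0

def direct_logprob(hyp, next_token):
    """Stub for log p(next_token | input prefix, output prefix)."""
    return -0.5 * len(next_token)

def channel_logprob(hyp):
    """Stub for log p(input | output) + log p(output) under a noisy channel model."""
    overlap = len(set(hyp.input_prefix) & set(hyp.output_prefix))
    return overlap - 0.1 * len(hyp.output_prefix)

def decode(source, vocab, k1=4, k2=2, max_len=3):
    hypotheses = [Hypothesis(input_prefix=(), output_prefix=())]
    for j in range(1, max_len + 1):              # output prefix length
        for i in range(1, len(source) + 1):      # input prefix length
            # Extend every current hypothesis that could reach the (i, j) combination.
            extended = [
                Hypothesis(source[:i], h.output_prefix + (tok,),
                           direct_score=h.direct_score + direct_logprob(h, tok))
                for h in hypotheses
                if len(h.input_prefix) <= i and len(h.output_prefix) == j - 1
                for tok in vocab
            ]
            # Keep the k1 highest-scoring extensions under the direct model ...
            best_direct = sorted(extended, key=lambda h: h.direct_score, reverse=True)[:k1]
            # ... then rescore with the noisy channel model and keep a reduced set of k2.
            for h in best_direct:
                h.channel_score = channel_logprob(h)
            reduced = sorted(best_direct, key=lambda h: h.channel_score, reverse=True)[:k2]
            hypotheses.extend(reduced)
    return max((h for h in hypotheses if h.output_prefix),
               key=lambda h: h.channel_score)

best = decode(source=("a", "b", "c"), vocab=("a", "b", "c", "d"))
print("best output prefix:", best.output_prefix)
```
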
  • Patent number: 11354509
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for selecting actions to be performed by an agent interacting with an environment. In one aspect, a system includes a language encoder model that is configured to receive a text string in a particular natural language, and process the text string to generate a text embedding of the text string. The system includes an observation encoder neural network that is configured to receive an observation characterizing a state of the environment, and process the observation to generate an observation embedding of the observation. The system includes a subsystem that is configured to obtain a current text embedding of a current text string and a current observation embedding of a current observation. The subsystem is configured to select an action to be performed by the agent in response to the current observation.
    Type: Grant
    Filed: June 5, 2018
    Date of Patent: June 7, 2022
    Assignee: DeepMind Technologies Limited
    Inventors: Karl Moritz Hermann, Philip Blunsom, Felix George Hill
  • Publication number: 20210110115
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for selecting actions to be performed by an agent interacting with an environment. In one aspect, a system includes a language encoder model that is configured to receive a text string in a particular natural language, and process the text string to generate a text embedding of the text string. The system includes an observation encoder neural network that is configured to receive an observation characterizing a state of the environment, and process the observation to generate an observation embedding of the observation. The system includes a subsystem that is configured to obtain a current text embedding of a current text string and a current observation embedding of a current observation. The subsystem is configured to select an action to be performed by the agent in response to the current observation.
    Type: Application
    Filed: June 5, 2018
    Publication date: April 15, 2021
    Inventors: Karl Moritz Hermann, Philip Blunsom, Felix George Hill
  • Publication number: 20200151398
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating a target sequence from an input sequence. In one aspect, a method comprises maintaining a set of current hypotheses, wherein each current hypothesis comprises an input prefix and an output prefix. For each possible combination of input and output prefix length, the method extends any current hypothesis that could reach the possible combination to generate respective extended hypotheses for each such current hypothesis; determines a respective direct score for each extended hypothesis using a direct model; determines a first number of highest-scoring hypotheses according to the direct scores; rescores the first number of highest-scoring hypotheses using a noisy channel model to generate a reduced number of hypotheses; and adds the reduced number of hypotheses to the set of current hypotheses.
    Type: Application
    Filed: January 17, 2020
    Publication date: May 14, 2020
    Inventors: Lei Yu, Christopher James Dyer, Tomas Kocisky, Philip Blunsom
  • Patent number: 10628735
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for selecting answers to questions about documents. One of the methods includes receiving a document comprising a plurality of document tokens; receiving a question associated with the document, the question comprising a plurality of question tokens; processing the document tokens and the question tokens using a reader neural network to generate a joint numeric representation of the document and the question; and selecting, from the plurality of document tokens, an answer to the question using the joint numeric representation of the document and the question.
    Type: Grant
    Filed: June 2, 2016
    Date of Patent: April 21, 2020
    Assignee: DeepMind Technologies Limited
    Inventors: Karl Moritz Hermann, Tomas Kocisky, Edward Thomas Grefenstette, Lasse Espeholt, William Thomas Kay, Mustafa Suleyman, Philip Blunsom
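
Patent 10628735 above (published earlier as 20160358072, near the end of this list) covers selecting an answer to a question from a document's own tokens via a reader neural network. The sketch below is an attention-style toy reader: it embeds document and question tokens, forms a joint representation from question-over-document attention, and returns the highest-scoring document token as the answer. The embedding table and scoring scheme are illustrative assumptions, not the patented architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
EMB = 16

def embed(tokens, table):
    """Look up (lazily creating) a random embedding for each token."""
    for tok in tokens:
        if tok not in table:
            table[tok] = rng.normal(scale=0.1, size=EMB)
    return np.stack([table[tok] for tok in tokens])

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def answer_question(document_tokens, question_tokens):
    """Toy reader: attend the question over the document, then pick an answer token."""
    table = {}
    doc = embed(document_tokens, table)                  # (num_doc_tokens, EMB)
    query = embed(question_tokens, table).mean(axis=0)   # single question vector

    # Joint numeric representation of document and question: attention of the
    # question over every document token, plus the attention-weighted summary.
    attention = softmax(doc @ query)
    joint = attention @ doc

    # Select the answer from the document tokens using that joint representation.
    best = int(np.argmax(doc @ joint))
    return document_tokens[best], attention

document = "the key is under the red mat".split()
question = "where is the key".split()
answer, weights = answer_question(document, question)
print("answer token:", answer)
print("attention weights:", list(zip(document, np.round(weights, 3))))
```
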
  • Patent number: 10572603
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating a target sequence from an input sequence. In one aspect, a method comprises maintaining a set of current hypotheses, wherein each current hypothesis comprises an input prefix and an output prefix. For each possible combination of input and output prefix length, the method extends any current hypothesis that could reach the possible combination to generate respective extended hypotheses for each such current hypothesis; determines a respective direct score for each extended hypothesis using a direct model; determines a first number of highest-scoring hypotheses according to the direct scores; rescores the first number of highest-scoring hypotheses using a noisy channel model to generate a reduced number of hypotheses; and adds the reduced number of hypotheses to the set of current hypotheses.
    Type: Grant
    Filed: May 3, 2019
    Date of Patent: February 25, 2020
    Assignee: DeepMind Technologies Limited
    Inventors: Lei Yu, Christopher James Dyer, Tomas Kocisky, Philip Blunsom
  • Publication number: 20200005147
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the methods includes providing an output derived from the neural network output for the time step as a system output for the time step; maintaining a current state of the external memory; determining, from the neural network output for the time step, memory state parameters for the time step; updating the current state of the external memory using the memory state parameters for the time step; reading data from the external memory in accordance with the updated state of the external memory; and combining the data read from the external memory with a system input for the next time step to generate the neural network input for the next time step.
    Type: Application
    Filed: September 9, 2019
    Publication date: January 2, 2020
    Inventors: Edward Thomas Grefenstette, Karl Moritz Hermann, Mustafa Suleyman, Philip Blunsom
  • Patent number: 10410119
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the methods includes providing an output derived from the neural network output for the time step as a system output for the time step; maintaining a current state of the external memory; determining, from the neural network output for the time step, memory state parameters for the time step; updating the current state of the external memory using the memory state parameters for the time step; reading data from the external memory in accordance with the updated state of the external memory; and combining the data read from the external memory with a system input for the next time step to generate the neural network input for the next time step.
    Type: Grant
    Filed: June 2, 2016
    Date of Patent: September 10, 2019
    Assignee: DeepMind Technologies Limited
    Inventors: Edward Thomas Grefenstette, Karl Moritz Hermann, Mustafa Suleyman, Philip Blunsom
  • Publication number: 20190258718
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating a target sequence from an input sequence. In one aspect, a method comprises maintaining a set of current hypotheses, wherein each current hypothesis comprises an input prefix and an output prefix. For each possible combination of input and output prefix length, the method extends any current hypothesis that could reach the possible combination to generate respective extended hypotheses for each such current hypothesis; determines a respective direct score for each extended hypothesis using a direct model; determines a first number of highest-scoring hypotheses according to the direct scores; rescores the first number of highest-scoring hypotheses using a noisy channel model to generate a reduced number of hypotheses; and adds the reduced number of hypotheses to the set of current hypotheses.
    Type: Application
    Filed: May 3, 2019
    Publication date: August 22, 2019
    Inventors: Lei Yu, Christopher James Dyer, Tomas Kocisky, Philip Blunsom
  • Publication number: 20160358072
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for selecting answers to questions about documents. One of the methods includes receiving a document comprising a plurality of document tokens; receiving a question associated with the document, the question comprising a plurality of question tokens; processing the document tokens and the question tokens using a reader neural network to generate a joint numeric representation of the document and the question; and selecting, from the plurality of document tokens, an answer to the question using the joint numeric representation of the document and the question.
    Type: Application
    Filed: June 2, 2016
    Publication date: December 8, 2016
    Inventors: Karl Moritz Hermann, Tomas Kocisky, Edward Thomas Grefenstette, Lasse Espeholt, William Thomas Kay, Mustafa Suleyman, Philip Blunsom
  • Publication number: 20160358071
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for augmenting neural networks with an external memory. One of the methods includes providing an output derived from the neural network output for the time step as a system output for the time step; maintaining a current state of the external memory; determining, from the neural network output for the time step, memory state parameters for the time step; updating the current state of the external memory using the memory state parameters for the time step; reading data from the external memory in accordance with the updated state of the external memory; and combining the data read from the external memory with a system input for the next time step to generate the neural network input for the next time step.
    Type: Application
    Filed: June 2, 2016
    Publication date: December 8, 2016
    Inventors: Edward Thomas Grefenstette, Karl Moritz Hermann, Mustafa Suleyman, Philip Blunsom