Patents by Inventor Hasuk Song

Hasuk Song is named as an inventor on the patent filings listed below. The listing includes both pending patent applications and patents that have been granted by the United States Patent and Trademark Office (USPTO). Illustrative code sketches of the two described mechanisms, the gated attention block and the relational forward model, follow the listing.

  • Publication number: 20240320469
    Abstract: A system including an attention neural network that is configured to receive an input sequence and to process the input sequence to generate an output is described. The attention neural network includes: an attention block configured to receive a query input, a key input, and a value input that are derived from an attention block input. The attention block includes an attention neural network layer configured to: receive an attention layer input derived from the query input, the key input, and the value input, and apply an attention mechanism to the query input, the key input, and the value input to generate an attention layer output for the attention neural network layer; and a gating neural network layer configured to apply a gating mechanism to the attention block input and the attention layer output of the attention neural network layer to generate a gated attention output.
    Type: Application
    Filed: May 30, 2024
    Publication date: September 26, 2024
    Inventors: Emilio Parisotto, Hasuk Song, Jack William Rae, Siddhant Madhu Jayakumar, Maxwell Elliot Jaderberg, Razvan Pascanu, Caglar Gulcehre
  • Patent number: 12033055
    Abstract: A system including an attention neural network that is configured to receive an input sequence and to process the input sequence to generate an output is described. The attention neural network includes: an attention block configured to receive a query input, a key input, and a value input that are derived from an attention block input. The attention block includes an attention neural network layer configured to: receive an attention layer input derived from the query input, the key input, and the value input, and apply an attention mechanism to the query input, the key input, and the value input to generate an attention layer output for the attention neural network layer; and a gating neural network layer configured to apply a gating mechanism to the attention block input and the attention layer output of the attention neural network layer to generate a gated attention output.
    Type: Grant
    Filed: September 7, 2020
    Date of Patent: July 9, 2024
    Assignee: DeepMind Technologies Limited
    Inventors: Emilio Parisotto, Hasuk Song, Jack William Rae, Siddhant Madhu Jayakumar, Maxwell Elliot Jaderberg, Razvan Pascanu, Caglar Gulcehre
  • Publication number: 20220366218
    Abstract: A system including an attention neural network that is configured to receive an input sequence and to process the input sequence to generate an output is described. The attention neural network includes: an attention block configured to receive a query input, a key input, and a value input that are derived from an attention block input. The attention block includes an attention neural network layer configured to: receive an attention layer input derived from the query input, the key input, and the value input, and apply an attention mechanism to the query input, the key input, and the value input to generate an attention layer output for the attention neural network layer; and a gating neural network layer configured to apply a gating mechanism to the attention block input and the attention layer output of the attention neural network layer to generate a gated attention output.
    Type: Application
    Filed: September 7, 2020
    Publication date: November 17, 2022
    Inventors: Emilio Parisotto, Hasuk Song, Jack William Rae, Siddhant Madhu Jayakumar, Maxwell Elliot Jaderberg, Razvan Pascanu, Caglar Gulcehre
  • Publication number: 20210192358
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for predicting the actions of, or influences on, agents in environments with multiple agents, in particular for reinforcement learning. In one aspect, a relational forward model (RFM) system receives agent data representing agent actions for each of multiple agents and implements: an encoder graph neural network subsystem to process the agent data as graph data to provide encoded graph data, a recurrent graph neural network subsystem to process the encoded graph data to provide processed graph data, a decoder graph neural network subsystem to decode the processed graph data to provide decoded graph data, and an output to provide representation data for node and/or edge attributes of the decoded graph data relating to a predicted action of one or more of the agents. A reinforcement learning system includes the RFM system.
    Type: Application
    Filed: May 20, 2019
    Publication date: June 24, 2021
    Inventors: Hasuk Song, Andrea Tacchetti, Peter William Battaglia, Vinicius Zambaldi
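
The first three entries above (publications 20240320469 and 20220366218, and patent 12033055) share one abstract describing a gated attention block. The following is a minimal, framework-free sketch of that idea, assuming single-head scaled dot-product attention and a simple sigmoid (highway-style) gate over the block input and the attention output; the function and parameter names are illustrative, and the patent claims the gating mechanism in general terms rather than this particular implementation.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize features to zero mean / unit variance (no learned scale or shift here).
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def scaled_dot_product_attention(q, k, v):
    # Standard attention: softmax(q k^T / sqrt(d)) v.
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def gated_attention_block(x, params):
    # x: [seq_len, d_model] attention block input.
    # Query, key, and value are derived from the block input via learned projections.
    h = layer_norm(x)
    q, k, v = h @ params["wq"], h @ params["wk"], h @ params["wv"]
    attn_out = scaled_dot_product_attention(q, k, v)

    # Gating mechanism: a sigmoid gate mixes the attention block input with the
    # attention layer output instead of using a plain residual addition.
    gate = 1.0 / (1.0 + np.exp(-(x @ params["wg"] + params["bg"])))
    return gate * attn_out + (1.0 - gate) * x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 16
    params = {name: rng.standard_normal((d, d)) * 0.1 for name in ("wq", "wk", "wv", "wg")}
    params["bg"] = np.zeros(d)
    x = rng.standard_normal((8, d))      # an 8-token input sequence
    y = gated_attention_block(x, params)
    print(y.shape)                       # (8, 16)
```

When the gate saturates toward zero the block simply passes its input through, which is one plausible motivation for gating over a fixed residual connection; the claims, however, do not restrict the gate to this sigmoid form.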
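
Publication 20210192358 describes a relational forward model (RFM): an encoder graph network, a recurrent graph core, and a decoder graph network whose node and edge outputs predict agent actions and pairwise influences. The sketch below is a loose illustration of that pipeline, assuming a fully connected directed graph over the agents, a single tanh message-passing step per stage, and a simple additive recurrent update; every name and update rule here is an assumption made for illustration, not the patented method.

```python
import numpy as np

def gn_block(nodes, senders, receivers, w_edge, w_node):
    # One message-passing step: edge messages from (sender, receiver) node
    # features, aggregated at each receiver, followed by a node update.
    d = nodes.shape[-1]
    msgs = np.tanh(np.concatenate([nodes[senders], nodes[receivers]], axis=-1) @ w_edge)
    agg = np.zeros((nodes.shape[0], d))
    np.add.at(agg, receivers, msgs)
    new_nodes = np.tanh(np.concatenate([nodes, agg], axis=-1) @ w_node)
    return new_nodes, msgs

def relational_forward_model(agent_obs, senders, receivers, params):
    # agent_obs: [time, num_agents, obs_dim] per-agent features over time.
    # Encoder GN -> recurrent core over time -> decoder GN whose node outputs
    # give per-agent action logits and whose edge outputs give influence scores.
    num_agents = agent_obs.shape[1]
    d = params["w_rec"].shape[0]
    state = np.zeros((num_agents, d))
    for t in range(agent_obs.shape[0]):
        enc, _ = gn_block(agent_obs[t] @ params["w_in"], senders, receivers,
                          params["w_enc_edge"], params["w_enc_node"])
        # Recurrent graph core: blend the new encoding into the running state.
        state = np.tanh(enc + state @ params["w_rec"])
    dec_nodes, dec_edges = gn_block(state, senders, receivers,
                                    params["w_dec_edge"], params["w_dec_node"])
    action_logits = dec_nodes @ params["w_action"]   # predicted next action per agent
    edge_scores = dec_edges @ params["w_edge_out"]   # predicted pairwise influence
    return action_logits, edge_scores

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    num_agents, d, obs_dim, num_actions, steps = 4, 8, 6, 5, 3
    # Fully connected directed graph over the agents (no self-edges).
    pairs = [(i, j) for i in range(num_agents) for j in range(num_agents) if i != j]
    senders = np.array([i for i, _ in pairs])
    receivers = np.array([j for _, j in pairs])
    params = {
        "w_in": rng.standard_normal((obs_dim, d)) * 0.1,
        "w_enc_edge": rng.standard_normal((2 * d, d)) * 0.1,
        "w_enc_node": rng.standard_normal((2 * d, d)) * 0.1,
        "w_rec": rng.standard_normal((d, d)) * 0.1,
        "w_dec_edge": rng.standard_normal((2 * d, d)) * 0.1,
        "w_dec_node": rng.standard_normal((2 * d, d)) * 0.1,
        "w_action": rng.standard_normal((d, num_actions)) * 0.1,
        "w_edge_out": rng.standard_normal((d, 1)) * 0.1,
    }
    obs = rng.standard_normal((steps, num_agents, obs_dim))
    logits, influence = relational_forward_model(obs, senders, receivers, params)
    print(logits.shape, influence.shape)   # (4, 5) (12, 1)
```

The edge outputs are what make such a model "relational": each predicted action can be read alongside scores on the edges feeding into that agent, which is how a system of this kind could expose which other agents influenced the prediction.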