Patents by Inventor Alexander Pritzel
Alexander Pritzel has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240153577
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for predicting a structure of a protein that comprises a plurality of amino acid chains.
Type: Application
Filed: November 23, 2021
Publication date: May 9, 2024
Inventors: Richard Andrew Evans, Alexander Pritzel, Russell James Bates, John Jumper
-
Publication number: 20240120022
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing protein design. In one aspect, a method comprises: processing an input characterizing a target protein structure of a target protein using an embedding neural network having a plurality of embedding neural network parameters to generate an embedding of the target protein structure of the target protein; determining a predicted amino acid sequence of the target protein based on the embedding of the target protein structure, comprising: conditioning a generative neural network having a plurality of generative neural network parameters on the embedding of the target protein structure; and generating, by the generative neural network conditioned on the embedding of the target protein structure, a representation of the predicted amino acid sequence of the target protein.
Type: Application
Filed: January 27, 2022
Publication date: April 11, 2024
Inventors: Andrew W. Senior, Simon Kohl, Jason Yim, Russell James Bates, Catalin-Dumitru Ionescu, Charlie Thomas Curtis Nash, Ali Razavi-Nematollahi, Alexander Pritzel, John Jumper
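The two-stage flow this abstract describes — embed the target structure, then condition a generative model on that embedding to emit a sequence — can be sketched with toy stand-ins. Everything below (the distance-based "embedding", the biased sampler, the function names) is a hypothetical illustration, not the claimed networks:

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def embed_structure(backbone_coords):
    """Toy stand-in for the embedding network: summarize each residue's
    3-D position into a single scalar feature (distance from the origin)."""
    return [sum(c * c for c in xyz) ** 0.5 for xyz in backbone_coords]

def generate_sequence(embedding, rng):
    """Toy stand-in for the generative network: sample one amino acid per
    residue, with the per-residue embedding biasing the choice."""
    sequence = []
    for feature in embedding:
        # Condition the sampling distribution on the embedding value.
        index = int(feature * 7 + rng.random() * len(AMINO_ACIDS)) % len(AMINO_ACIDS)
        sequence.append(AMINO_ACIDS[index])
    return "".join(sequence)

rng = random.Random(0)
coords = [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.0, 0.0, 3.0)]
designed = generate_sequence(embed_structure(coords), rng)
```

The essential point is the conditioning: the generative sampler never sees the coordinates directly, only the embedding produced from them.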
-
Publication number: 20240087686
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for unmasking a masked representation of a protein using a protein reconstruction neural network. In one aspect, a method comprises: receiving the masked representation of the protein; and processing the masked representation of the protein using the protein reconstruction neural network to generate a respective predicted embedding corresponding to one or more masked embeddings that are included in the masked representation of the protein, wherein a predicted embedding corresponding to a masked embedding in a representation of the amino acid sequence of the protein defines a prediction for an identity of an amino acid at a corresponding position in the amino acid sequence, wherein a predicted embedding corresponding to a masked embedding in a representation of the structure of the protein defines a prediction for a corresponding structural feature of the protein.
Type: Application
Filed: January 27, 2022
Publication date: March 14, 2024
Inventors: Alexander Pritzel, Catalin-Dumitru Ionescu, Simon Kohl
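The masked-reconstruction pattern above (fill in hidden positions from learned context, as in masked language modeling) can be illustrated with a toy lookup in place of the neural network. The `context_counts` table and `reconstruct` function are invented for this sketch and assume the prediction reduces to picking the most likely amino acid per masked slot:

```python
MASK = "?"

def reconstruct(masked_sequence, context_counts):
    """Toy stand-in for the reconstruction network: replace each masked
    position with the amino acid most frequent in an assumed context table."""
    out = []
    for i, aa in enumerate(masked_sequence):
        if aa == MASK:
            # "Prediction": the highest-count amino acid for this position.
            out.append(max(context_counts[i], key=context_counts[i].get))
        else:
            out.append(aa)
    return "".join(out)

# Hypothetical per-position frequency tables standing in for learned context.
counts = {1: {"A": 3, "G": 1}, 3: {"L": 5, "V": 2}}
restored = reconstruct("M?K?", counts)  # -> "MAKL"
```

In the actual method the same idea extends beyond sequence identity: a masked embedding in the structure representation yields a prediction for a structural feature, not just a residue label.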
-
Publication number: 20240046106
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using multi-task neural networks. One of the methods includes receiving a first network input and data identifying a first machine learning task to be performed on the first network input; selecting a path through the plurality of layers in a super neural network that is specific to the first machine learning task, the path specifying, for each of the layers, a proper subset of the modular neural networks in the layer that are designated as active when performing the first machine learning task; and causing the super neural network to process the first network input using (i) for each layer, the modular neural networks in the layer that are designated as active by the selected path and (ii) the set of one or more output layers corresponding to the identified first machine learning task.
Type: Application
Filed: October 16, 2023
Publication date: February 8, 2024
Inventors: Daniel Pieter Wierstra, Chrisantha Thomas Fernando, Alexander Pritzel, Dylan Sunil Banarse, Charles Blundell, Andrei-Alexandru Rusu, Yori Zwols, David Ha
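The routing described here — a task-specific path activating a proper subset of modules per layer, plus a task-specific output head — can be sketched with scalar functions standing in for the modular sub-networks. The summation of active-module outputs is one assumed way to combine them; names and structure below are illustrative only:

```python
def run_super_network(x, layers, path, output_heads, task):
    """Run only the modules designated active by the task-specific path,
    summing module outputs within each layer, then apply the task's head."""
    for layer_modules, active_indices in zip(layers, path):
        x = sum(layer_modules[i](x) for i in active_indices)
    return output_heads[task](x)

# Toy modules: simple scalar functions standing in for modular sub-networks.
layers = [
    [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3],
    [lambda x: x * x, lambda x: x + 10, lambda x: -x],
]
output_heads = {"taskA": lambda x: x, "taskB": lambda x: x % 5}
path_for_taskA = [(0, 1), (1,)]  # a proper subset of modules in each layer

result = run_super_network(3, layers, path_for_taskA, output_heads, "taskA")
# layer 1: (3+1) + (3*2) = 10; layer 2: 10 + 10 = 20; head: 20
```

Inactive modules never execute, which is what makes one shared "super" network serve many tasks cheaply.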
-
Publication number: 20230410938
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for determining a predicted structure of a protein. According to one aspect, there is provided a method comprising maintaining graph data representing a graph of the protein; obtaining a respective pair embedding for each edge in the graph; processing the pair embeddings using a sequence of update blocks, wherein each update block performs operations comprising, for each edge in the graph: generating a respective representation of each of a plurality of cycles in the graph that include the edge by, for each cycle, processing embeddings for edges in the cycle in accordance with the values of the update block parameters of the update block to generate the representation of the cycle; and updating the pair embedding for the edge using the representations of the cycles in the graph that include the edge.
Type: Application
Filed: November 23, 2021
Publication date: December 21, 2023
Inventors: Alexander Pritzel, Mikhail Figurnov, John Jumper
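One update block as described — for each edge, build a representation of every cycle containing it, then fold those representations back into the edge's embedding — can be sketched on 3-cycles of a complete graph. Using the product of the two flanking edge embeddings as the "cycle representation" is an assumption made for this toy:

```python
def triangles_through(edge, nodes):
    """All 3-cycles (i, k, j) containing edge (i, j) in a complete graph."""
    i, j = edge
    return [(i, k, j) for k in nodes if k not in (i, j)]

def update_pair_embeddings(pair, nodes):
    """One toy update block: refresh each edge embedding using a summary
    (here, the product of the two flanking edge embeddings) of every
    triangle that passes through it."""
    new_pair = {}
    for (i, j), value in pair.items():
        cycle_terms = [
            pair[(i, k)] * pair[(k, j)]  # representation of cycle i-k-j
            for (_, k, _) in triangles_through((i, j), nodes)
        ]
        new_pair[(i, j)] = value + sum(cycle_terms)
    return new_pair

nodes = [0, 1, 2]
pair = {(a, b): 1.0 for a in nodes for b in nodes if a != b}
updated = update_pair_embeddings(pair, nodes)
```

Stacking such blocks in sequence, as the abstract specifies, lets information propagate around longer cycles over successive updates.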
-
Publication number: 20230395186
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for training a structure prediction neural network that comprises an embedding neural network and a main folding neural network. According to one aspect, a method comprises: obtaining a training network input characterizing a training protein; processing the training network input using the embedding neural network and the main folding neural network to generate a main structure prediction; for each auxiliary folding neural network in a set of one or more auxiliary folding neural networks, processing at least a corresponding intermediate output of the embedding neural network to generate an auxiliary structure prediction; determining a gradient of an objective function that includes a respective auxiliary structure loss term for each of the auxiliary folding neural networks; and updating the current values of the embedding network parameters and the main folding parameters based on the gradient.
Type: Application
Filed: November 23, 2021
Publication date: December 7, 2023
Inventors: Simon Kohl, Olaf Ronneberger, Mikhail Figurnov, Alexander Pritzel
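The objective in this abstract pairs a main structure loss with one auxiliary structure loss term per auxiliary folding head. A weighted sum is the usual (assumed) form of such an objective; the weights and values here are invented:

```python
def total_loss(main_loss, aux_losses, aux_weights):
    """Objective combining the main structure loss with a weighted auxiliary
    structure loss term for each auxiliary folding head."""
    return main_loss + sum(w * l for w, l in zip(aux_weights, aux_losses))

loss = total_loss(2.0, aux_losses=[0.5, 1.0], aux_weights=[0.1, 0.2])
# 2.0 + 0.1*0.5 + 0.2*1.0 = 2.25
```

Because the auxiliary heads read intermediate outputs of the embedding network, their loss terms push gradient signal into earlier layers than the main prediction alone would reach.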
-
Patent number: 11803750
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training an actor neural network used to select actions to be performed by an agent interacting with an environment. One of the methods includes obtaining a minibatch of experience tuples; and updating current values of the parameters of the actor neural network, comprising: for each experience tuple in the minibatch: processing the training observation and the training action in the experience tuple using a critic neural network to determine a neural network output for the experience tuple, and determining a target neural network output for the experience tuple; updating current values of the parameters of the critic neural network using errors between the target neural network outputs and the neural network outputs; and updating the current values of the parameters of the actor neural network using the critic neural network.
Type: Grant
Filed: September 14, 2020
Date of Patent: October 31, 2023
Assignee: DeepMind Technologies Limited
Inventors: Timothy Paul Lillicrap, Jonathan James Hunt, Alexander Pritzel, Nicolas Manfred Otto Heess, Tom Erez, Yuval Tassa, David Silver, Daniel Pieter Wierstra
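The critic-update step this abstract walks through — per-tuple target outputs, then errors between targets and critic outputs — can be sketched with linear toy functions in place of the networks. The discounted-bootstrap target form below is a standard assumption for this family of actor-critic methods, not a quote from the claims:

```python
GAMMA = 0.99  # assumed discount factor

def td_targets(minibatch, target_critic, target_actor):
    """Target network output per tuple: reward plus discounted target-critic
    value of the next state under the target actor's action."""
    return [
        r + GAMMA * target_critic(s_next, target_actor(s_next))
        for (s, a, r, s_next) in minibatch
    ]

def critic_errors(minibatch, critic, targets):
    """Errors between target outputs and critic outputs; these drive the
    critic's parameter update (e.g. gradient descent on squared error)."""
    return [y - critic(s, a) for (s, a, r, s_next), y in zip(minibatch, targets)]

# Toy linear critic/actor standing in for the neural networks.
critic = lambda s, a: 0.5 * s + a
actor = lambda s: 2.0 * s
minibatch = [(1.0, 0.5, 1.0, 2.0), (0.0, 1.0, 0.0, 1.0)]  # (s, a, r, s')

targets = td_targets(minibatch, critic, actor)
errors = critic_errors(minibatch, critic, targets)
```

The actor is then improved "using the critic neural network", i.e. by following the critic's gradient with respect to the action, which the toy omits.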
-
Patent number: 11790238
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using multi-task neural networks. One of the methods includes receiving a first network input and data identifying a first machine learning task to be performed on the first network input; selecting a path through the plurality of layers in a super neural network that is specific to the first machine learning task, the path specifying, for each of the layers, a proper subset of the modular neural networks in the layer that are designated as active when performing the first machine learning task; and causing the super neural network to process the first network input using (i) for each layer, the modular neural networks in the layer that are designated as active by the selected path and (ii) the set of one or more output layers corresponding to the identified first machine learning task.
Type: Grant
Filed: August 17, 2020
Date of Patent: October 17, 2023
Assignee: DeepMind Technologies Limited
Inventors: Daniel Pieter Wierstra, Chrisantha Thomas Fernando, Alexander Pritzel, Dylan Sunil Banarse, Charles Blundell, Andrei-Alexandru Rusu, Yori Zwols, David Ha
-
Publication number: 20230298687
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for predicting a structure of a protein comprising one or more chains. In one aspect, a method comprises: obtaining an initial multiple sequence alignment (MSA) representation; obtaining a respective initial pair embedding for each pair of amino acids in the protein; processing an input comprising the initial MSA representation and the initial pair embeddings using an embedding neural network to generate an output that comprises a final MSA representation and a respective final pair embedding for each pair of amino acids in the protein; and determining a predicted structure of the protein using the final MSA representation, the final pair embeddings, or both.
Type: Application
Filed: November 23, 2021
Publication date: September 21, 2023
Inventors: Mikhail Figurnov, Alexander Pritzel, Richard Andrew Evans, Russell James Bates, Olaf Ronneberger, Simon Kohl, John Jumper
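The pipeline in this abstract — initial MSA representation plus per-pair embeddings, jointly refined by an embedding network, then a structure read off the final embeddings — can be sketched with small matrices. The particular mixing rules (MSA column averages feeding the pair matrix, diagonal pair values feeding back into MSA rows) are invented for this toy:

```python
def embedding_network(msa_rep, pair, num_blocks=2):
    """Toy stand-in: each block mixes MSA information into the pair
    embeddings, then mixes pair information back into the MSA rows."""
    n = len(pair)
    for _ in range(num_blocks):
        # Per-residue average over aligned sequences (a crude MSA profile).
        profile = [sum(col) / len(col) for col in zip(*msa_rep)]
        pair = [[pair[i][j] + profile[i] * profile[j] for j in range(n)]
                for i in range(n)]
        msa_rep = [[row[i] + pair[i][i] for i in range(n)] for row in msa_rep]
    return msa_rep, pair

def predicted_distances(pair):
    """Read a symmetric 'distance' per residue pair off the final embeddings."""
    n = len(pair)
    return [[(pair[i][j] + pair[j][i]) / 2 for j in range(n)] for i in range(n)]

msa = [[1.0, 0.0], [1.0, 2.0]]    # 2 aligned sequences, 2 residues
pair0 = [[0.0, 0.0], [0.0, 0.0]]  # initial pair embeddings
final_msa, final_pair = embedding_network(msa, pair0)
dist = predicted_distances(final_pair)
```

The key structural fact preserved by the toy is that both representations are updated jointly and either (or both) can feed the final structure prediction.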
-
Patent number: 11720796
Abstract: A method includes maintaining respective episodic memory data for each of multiple actions; receiving a current observation characterizing a current state of an environment being interacted with by an agent; processing the current observation using an embedding neural network in accordance with current values of parameters of the embedding neural network to generate a current key embedding for the current observation; for each action of the plurality of actions: determining the p nearest key embeddings in the episodic memory data for the action to the current key embedding according to a distance measure, and determining a Q value for the action from the return estimates mapped to by the p nearest key embeddings in the episodic memory data for the action; and selecting, using the Q values for the actions, an action from the multiple actions as the action to be performed by the agent.
Type: Grant
Filed: April 23, 2020
Date of Patent: August 8, 2023
Assignee: DeepMind Technologies Limited
Inventors: Benigno Uria-Martínez, Alexander Pritzel, Charles Blundell, Adrià Puigdomènech Badia
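The selection loop this abstract describes — per-action episodic memory of (key embedding, return estimate) pairs, Q value from the p nearest keys, then act on the Q values — can be sketched with 1-D keys. Averaging the nearest returns and acting greedily are simplifying assumptions for this toy:

```python
def q_value(memory, query_key, p):
    """Q estimate: combine the return estimates mapped to by the p nearest
    stored key embeddings (absolute distance on 1-D keys, plain average)."""
    nearest = sorted(memory, key=lambda kv: abs(kv[0] - query_key))[:p]
    return sum(ret for _, ret in nearest) / len(nearest)

def select_action(episodic_memory, query_key, p):
    """Compute a Q value per action from its episodic memory, then act greedily."""
    q = {a: q_value(mem, query_key, p) for a, mem in episodic_memory.items()}
    return max(q, key=q.get), q

# Per-action episodic memories of (key embedding, return estimate) pairs.
episodic_memory = {
    "left":  [(0.1, 1.0), (0.2, 0.0), (0.9, 5.0)],
    "right": [(0.15, 2.0), (0.8, 0.5), (0.85, 0.5)],
}
action, q_values = select_action(episodic_memory, query_key=0.12, p=2)
```

In the full method the query key comes from the embedding neural network applied to the current observation; here it is just given.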
-
Publication number: 20210166779
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for determining a predicted structure of a protein that is specified by an amino acid sequence. In one aspect, a method comprises: obtaining a multiple sequence alignment for the protein; determining, from the multiple sequence alignment and for each pair of amino acids in the amino acid sequence of the protein, a respective initial embedding of the pair of amino acids; processing the initial embeddings of the pairs of amino acids using a pair embedding neural network comprising a plurality of self-attention neural network layers to generate a final embedding of each pair of amino acids; and determining the predicted structure of the protein based on the final embedding of each pair of amino acids.
Type: Application
Filed: December 1, 2020
Publication date: June 3, 2021
Inventors: John Jumper, Andrew W. Senior, Richard Andrew Evans, Russell James Bates, Mikhail Figurnov, Alexander Pritzel, Timothy Frederick Goldie Green
-
Publication number: 20200410351
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training an actor neural network used to select actions to be performed by an agent interacting with an environment. One of the methods includes obtaining a minibatch of experience tuples; and updating current values of the parameters of the actor neural network, comprising: for each experience tuple in the minibatch: processing the training observation and the training action in the experience tuple using a critic neural network to determine a neural network output for the experience tuple, and determining a target neural network output for the experience tuple; updating current values of the parameters of the critic neural network using errors between the target neural network outputs and the neural network outputs; and updating the current values of the parameters of the actor neural network using the critic neural network.
Type: Application
Filed: September 14, 2020
Publication date: December 31, 2020
Inventors: Timothy Paul Lillicrap, Jonathan James Hunt, Alexander Pritzel, Nicolas Manfred Otto Heess, Tom Erez, Yuval Tassa, David Silver, Daniel Pieter Wierstra
-
Publication number: 20200380372
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using multi-task neural networks. One of the methods includes receiving a first network input and data identifying a first machine learning task to be performed on the first network input; selecting a path through the plurality of layers in a super neural network that is specific to the first machine learning task, the path specifying, for each of the layers, a proper subset of the modular neural networks in the layer that are designated as active when performing the first machine learning task; and causing the super neural network to process the first network input using (i) for each layer, the modular neural networks in the layer that are designated as active by the selected path and (ii) the set of one or more output layers corresponding to the identified first machine learning task.
Type: Application
Filed: August 17, 2020
Publication date: December 3, 2020
Inventors: Daniel Pieter Wierstra, Chrisantha Thomas Fernando, Alexander Pritzel, Dylan Sunil Banarse, Charles Blundell, Andrei-Alexandru Rusu, Yori Zwols, David Ha
-
Patent number: 10776692
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training an actor neural network used to select actions to be performed by an agent interacting with an environment. One of the methods includes obtaining a minibatch of experience tuples; and updating current values of the parameters of the actor neural network, comprising: for each experience tuple in the minibatch: processing the training observation and the training action in the experience tuple using a critic neural network to determine a neural network output for the experience tuple, and determining a target neural network output for the experience tuple; updating current values of the parameters of the critic neural network using errors between the target neural network outputs and the neural network outputs; and updating the current values of the parameters of the actor neural network using the critic neural network.
Type: Grant
Filed: July 22, 2016
Date of Patent: September 15, 2020
Assignee: DeepMind Technologies Limited
Inventors: Timothy Paul Lillicrap, Jonathan James Hunt, Alexander Pritzel, Nicolas Manfred Otto Heess, Tom Erez, Yuval Tassa, David Silver, Daniel Pieter Wierstra
-
Publication number: 20200285940
Abstract: There is described herein a computer-implemented method of processing an input data item. The method comprises processing the input data item using a parametric model to generate output data, wherein the parametric model comprises a first sub-model and a second sub-model. The processing comprises processing, by the first sub-model, the input data to generate a query data item, retrieving, from a memory storing data point-value pairs, at least one data point-value pair based upon the query data item and modifying weights of the second sub-model based upon the retrieved at least one data point-value pair. The output data is then generated based upon the modified second sub-model.
Type: Application
Filed: October 29, 2018
Publication date: September 10, 2020
Inventors: Pablo Sprechmann, Siddhant Jayakumar, Jack William Rae, Alexander Pritzel, Adrià Puigdomènech Badia, Oriol Vinyals, Razvan Pascanu, Charles Blundell
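The retrieval-then-adaptation scheme above — first sub-model produces a query, memory returns nearby point-value pairs, and those pairs modify the second sub-model's weights before the output is produced — can be sketched with a single scalar weight. The query mapping, retrieval rule, and adaptation step size below are all invented for the illustration:

```python
def retrieve(memory, query, k):
    """Nearest stored (data point, value) pairs to the query (1-D keys)."""
    return sorted(memory, key=lambda kv: abs(kv[0] - query))[:k]

def adapt_and_predict(x, memory, weight, k=2, step=0.5):
    """Toy parametric model: the first sub-model maps the input to a query;
    retrieved pairs then shift the second sub-model's weight toward the
    retrieved values before the output is generated."""
    query = x * 0.1                    # first sub-model: input -> query data item
    pairs = retrieve(memory, query, k)
    target = sum(v for _, v in pairs) / len(pairs)
    adapted = weight + step * (target - weight)  # modify second sub-model weights
    return adapted * x                 # output from the modified second sub-model

memory = [(0.1, 2.0), (0.5, 4.0), (0.9, 6.0)]
output = adapt_and_predict(1.0, memory, weight=1.0)
```

The adaptation is local and temporary in spirit: each input gets a version of the second sub-model nudged toward whatever the memory holds near its query.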
-
Publication number: 20200265317
Abstract: A method includes maintaining respective episodic memory data for each of multiple actions; receiving a current observation characterizing a current state of an environment being interacted with by an agent; processing the current observation using an embedding neural network in accordance with current values of parameters of the embedding neural network to generate a current key embedding for the current observation; for each action of the plurality of actions: determining the p nearest key embeddings in the episodic memory data for the action to the current key embedding according to a distance measure, and determining a Q value for the action from the return estimates mapped to by the p nearest key embeddings in the episodic memory data for the action; and selecting, using the Q values for the actions, an action from the multiple actions as the action to be performed by the agent.
Type: Application
Filed: April 23, 2020
Publication date: August 20, 2020
Inventors: Benigno Uria-Martínez, Alexander Pritzel, Charles Blundell, Adrià Puigdomènech Badia
-
Patent number: 10748065
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using multi-task neural networks. One of the methods includes receiving a first network input and data identifying a first machine learning task to be performed on the first network input; selecting a path through the plurality of layers in a super neural network that is specific to the first machine learning task, the path specifying, for each of the layers, a proper subset of the modular neural networks in the layer that are designated as active when performing the first machine learning task; and causing the super neural network to process the first network input using (i) for each layer, the modular neural networks in the layer that are designated as active by the selected path and (ii) the set of one or more output layers corresponding to the identified first machine learning task.
Type: Grant
Filed: July 30, 2019
Date of Patent: August 18, 2020
Assignee: DeepMind Technologies Limited
Inventors: Daniel Pieter Wierstra, Chrisantha Thomas Fernando, Alexander Pritzel, Dylan Sunil Banarse, Charles Blundell, Andrei-Alexandru Rusu, Yori Zwols, David Ha
-
Patent number: 10664753
Abstract: A method includes maintaining respective episodic memory data for each of multiple actions; receiving a current observation characterizing a current state of an environment being interacted with by an agent; processing the current observation using an embedding neural network in accordance with current values of parameters of the embedding neural network to generate a current key embedding for the current observation; for each action of the plurality of actions: determining the p nearest key embeddings in the episodic memory data for the action to the current key embedding according to a distance measure, and determining a Q value for the action from the return estimates mapped to by the p nearest key embeddings in the episodic memory data for the action; and selecting, using the Q values for the actions, an action from the multiple actions as the action to be performed by the agent.
Type: Grant
Filed: June 19, 2019
Date of Patent: May 26, 2020
Assignee: DeepMind Technologies Limited
Inventors: Benigno Uria-Martínez, Alexander Pritzel, Charles Blundell, Adria Puigdomenech Badia
-
Publication number: 20190354868
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using multi-task neural networks. One of the methods includes receiving a first network input and data identifying a first machine learning task to be performed on the first network input; selecting a path through the plurality of layers in a super neural network that is specific to the first machine learning task, the path specifying, for each of the layers, a proper subset of the modular neural networks in the layer that are designated as active when performing the first machine learning task; and causing the super neural network to process the first network input using (i) for each layer, the modular neural networks in the layer that are designated as active by the selected path and (ii) the set of one or more output layers corresponding to the identified first machine learning task.
Type: Application
Filed: July 30, 2019
Publication date: November 21, 2019
Inventors: Daniel Pieter Wierstra, Chrisantha Thomas Fernando, Alexander Pritzel, Dylan Sunil Banarse, Charles Blundell, Andrei-Alexandru Rusu, Yori Zwols, David Ha
-
Publication number: 20190303764
Abstract: A method includes maintaining respective episodic memory data for each of multiple actions; receiving a current observation characterizing a current state of an environment being interacted with by an agent; processing the current observation using an embedding neural network in accordance with current values of parameters of the embedding neural network to generate a current key embedding for the current observation; for each action of the plurality of actions: determining the p nearest key embeddings in the episodic memory data for the action to the current key embedding according to a distance measure, and determining a Q value for the action from the return estimates mapped to by the p nearest key embeddings in the episodic memory data for the action; and selecting, using the Q values for the actions, an action from the multiple actions as the action to be performed by the agent.
Type: Application
Filed: June 19, 2019
Publication date: October 3, 2019
Inventors: Benigno Uria-Martínez, Alexander Pritzel, Charles Blundell, Adria Puigdomenech Badia