Patents by Inventor Timothy Frederick Goldie Green

Timothy Frederick Goldie Green has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
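Brief illustrative code sketches of the two main techniques described in these abstracts appear after the listing.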

  • Patent number: 11941527
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. A method includes: training a neural network having a plurality of network parameters to perform a particular neural network task and to determine trained values of the network parameters using an iterative training process having a plurality of hyperparameters, the method comprising: maintaining a plurality of candidate neural networks and, for each of the candidate neural networks, data specifying: (i) respective values of the network parameters for the candidate neural network, (ii) respective values of the hyperparameters for the candidate neural network, and (iii) a quality measure that measures a performance of the candidate neural network on the particular neural network task; and for each of the plurality of candidate neural networks, repeatedly performing additional training operations.
    Type: Grant
    Filed: March 13, 2023
    Date of Patent: March 26, 2024
    Assignee: DeepMind Technologies Limited
    Inventors: Maxwell Elliot Jaderberg, Wojciech Czarnecki, Timothy Frederick Goldie Green, Valentin Clement Dalibard
  • Publication number: 20230360734
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for training neural networks to predict the structure of a protein.
    Type: Application
    Filed: August 12, 2021
    Publication date: November 9, 2023
    Inventors: Richard Andrew Evans, John Jumper, Timothy Frederick Goldie Green, David Reiman
  • Publication number: 20230281445
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. A method includes: training a neural network having a plurality of network parameters to perform a particular neural network task and to determine trained values of the network parameters using an iterative training process having a plurality of hyperparameters, the method comprising: maintaining a plurality of candidate neural networks and, for each of the candidate neural networks, data specifying: (i) respective values of the network parameters for the candidate neural network, (ii) respective values of the hyperparameters for the candidate neural network, and (iii) a quality measure that measures a performance of the candidate neural network on the particular neural network task; and for each of the plurality of candidate neural networks, repeatedly performing additional training operations.
    Type: Application
    Filed: March 13, 2023
    Publication date: September 7, 2023
    Inventors: Maxwell Elliot Jaderberg, Wojciech Czarnecki, Timothy Frederick Goldie Green, Valentin Clement Dalibard
  • Patent number: 11604985
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. A method includes: training a neural network having multiple network parameters to perform a particular neural network task and to determine trained values of the network parameters using an iterative training process having multiple hyperparameters, the method comprising: maintaining multiple candidate neural networks and, for each of the multiple candidate neural networks, data specifying: (i) respective values of network parameters for the candidate neural network, (ii) respective values of hyperparameters for the candidate neural network, and (iii) a quality measure that measures a performance of the candidate neural network on the particular neural network task; and for each of the multiple candidate neural networks, repeatedly performing additional training operations.
    Type: Grant
    Filed: November 22, 2018
    Date of Patent: March 14, 2023
    Assignee: DeepMind Technologies Limited
    Inventors: Maxwell Elliot Jaderberg, Wojciech Czarnecki, Timothy Frederick Goldie Green, Valentin Clement Dalibard
  • Publication number: 20210166779
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for determining a predicted structure of a protein that is specified by an amino acid sequence. In one aspect, a method comprises: obtaining a multiple sequence alignment for the protein; determining, from the multiple sequence alignment and for each pair of amino acids in the amino acid sequence of the protein, a respective initial embedding of the pair of amino acids; processing the initial embeddings of the pairs of amino acids using a pair embedding neural network comprising a plurality of self-attention neural network layers to generate a final embedding of each pair of amino acids; and determining the predicted structure of the protein based on the final embedding of each pair of amino acids.
    Type: Application
    Filed: December 1, 2020
    Publication date: June 3, 2021
    Inventors: John Jumper, Andrew W. Senior, Richard Andrew Evans, Russell James Bates, Mikhail Figurnov, Alexander Pritzel, Timothy Frederick Goldie Green
  • Publication number: 20210004676
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. A method includes: training a neural network having a plurality of network parameters to perform a particular neural network task and to determine trained values of the network parameters using an iterative training process having a plurality of hyperparameters, the method comprising: maintaining a plurality of candidate neural networks and, for each of the candidate neural networks, data specifying: (i) respective values of the network parameters for the candidate neural network, (ii) respective values of the hyperparameters for the candidate neural network, and (iii) a quality measure that measures a performance of the candidate neural network on the particular neural network task; and for each of the plurality of candidate neural networks, repeatedly performing additional training operations.
    Type: Application
    Filed: November 22, 2018
    Publication date: January 7, 2021
    Inventors: Maxwell Elliot Jaderberg, Wojciech Czarnecki, Timothy Frederick Goldie Green, Valentin Clement Dalibard
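
Illustrative code sketches

The population based training scheme described in patents 11941527 and 11604985 and in publications 20230281445 and 20210004676 maintains a population of candidate networks, each with its own parameter values, hyperparameter values, and quality measure, and repeatedly trains, evaluates, and updates them. The Python below is a minimal, hypothetical sketch of that loop, not code from the patents: a one-parameter quadratic objective stands in for the neural network and its task, the learning rate stands in for the hyperparameters, and the population size, interval length, and perturbation factors are arbitrary illustrative choices.

import random

POPULATION_SIZE = 8
STEPS_PER_INTERVAL = 20
INTERVALS = 25
TARGET = 3.0  # optimum of the toy objective


def loss(param):
    # Toy stand-in for the neural network's task loss.
    return (param - TARGET) ** 2


def train_step(param, learning_rate):
    # One gradient step on the toy objective: d/dp (p - TARGET)^2 = 2 (p - TARGET).
    return param - learning_rate * 2.0 * (param - TARGET)


# Each candidate carries (i) parameter values, (ii) hyperparameter values,
# and (iii) a quality measure, mirroring the data maintained in the abstracts.
population = [
    {"param": random.uniform(-10, 10),
     "lr": 10 ** random.uniform(-3, -1),
     "quality": float("-inf")}
    for _ in range(POPULATION_SIZE)
]

for _ in range(INTERVALS):
    # Train each candidate for an interval and re-measure its quality.
    for cand in population:
        for _ in range(STEPS_PER_INTERVAL):
            cand["param"] = train_step(cand["param"], cand["lr"])
        cand["quality"] = -loss(cand["param"])  # higher is better

    # Exploit: the weakest candidates copy parameters and hyperparameters
    # from the strongest; explore: perturb the copied hyperparameters.
    ranked = sorted(population, key=lambda c: c["quality"], reverse=True)
    top, bottom = ranked[:2], ranked[-2:]
    for cand in bottom:
        donor = random.choice(top)
        cand["param"] = donor["param"]
        cand["lr"] = donor["lr"] * random.choice([0.8, 1.2])

best = max(population, key=lambda c: c["quality"])
print(f"best param={best['param']:.4f}  lr={best['lr']:.4f}  "
      f"loss={loss(best['param']):.6f}")

Each candidate above carries the three pieces of data named in the abstracts (parameter values, hyperparameter values, and a quality measure), and the repeated train, evaluate, exploit, and explore steps play the role of the "additional training operations" mentioned there.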
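
Publication 20210166779 describes determining an initial embedding for every pair of amino acids from a multiple sequence alignment, refining those embeddings with a stack of self-attention layers, and deriving the predicted structure from the final pair embeddings. The sketch below is likewise only illustrative and assumes random arrays in place of MSA-derived features; the single attention head, layer count, embedding size, and final distance-map read-out are assumptions for demonstration rather than the claimed architecture.

import numpy as np

rng = np.random.default_rng(0)
seq_len, embed_dim = 8, 16   # toy protein length and pair-embedding size
num_layers = 2


def self_attention(x, wq, wk, wv):
    # Single-head self-attention over the set of pair embeddings.
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v


# Initial embedding for each pair of residues; random values stand in for
# features that would be derived from a multiple sequence alignment (MSA).
pair_embed = rng.normal(size=(seq_len * seq_len, embed_dim))

# A stack of self-attention layers refines the pair embeddings (weights are
# random here, whereas in practice they would be trained).
for _ in range(num_layers):
    wq, wk, wv = (rng.normal(scale=0.1, size=(embed_dim, embed_dim))
                  for _ in range(3))
    pair_embed = pair_embed + self_attention(pair_embed, wq, wk, wv)

# The final pair embeddings are mapped to a quantity from which a structure
# could be derived, here a toy predicted inter-residue distance map.
w_out = rng.normal(scale=0.1, size=(embed_dim,))
distances = (pair_embed @ w_out).reshape(seq_len, seq_len)
distances = (distances + distances.T) / 2   # symmetrize the i-j and j-i pairs
print(distances.shape)                       # (8, 8) predicted distance map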