Patents by Inventor Valentin Clement Dalibard

Valentin Clement Dalibard has filed for patents to protect the following inventions. The listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240346310
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. A method trains a neural network having a plurality of network parameters to perform a particular neural network task and to determine trained values of the network parameters, using an iterative training process having a plurality of hyperparameters. The method comprises: maintaining a plurality of candidate neural networks and, for each candidate neural network, data specifying (i) respective values of the network parameters for the candidate neural network, (ii) respective values of the hyperparameters for the candidate neural network, and (iii) a quality measure of the candidate neural network's performance on the particular neural network task; and, for each of the candidate neural networks, repeatedly performing additional training operations. (A minimal sketch of this population-based loop follows this entry.)
    Type: Application
    Filed: March 21, 2024
    Publication date: October 17, 2024
    Inventors: Maxwell Elliot Jaderberg, Wojciech Czarnecki, Timothy Frederick Goldie Green, Valentin Clement Dalibard
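    The abstract above describes a population-based training loop: each candidate carries its own network parameters, hyperparameters, and quality measure, and the population is repeatedly trained, evaluated, and refreshed by letting weak candidates inherit and perturb strong ones. The Python sketch below illustrates that loop under assumed specifics; the toy quadratic task, population size, round lengths, and perturbation factors are inventions of this sketch, not details from the patent.

      import random

      POP_SIZE = 8
      STEPS_PER_ROUND = 20
      ROUNDS = 15

      def quality(theta):
          """Quality measure on the task: higher is better (optimum at theta = 3)."""
          return -(theta - 3.0) ** 2

      def train_step(theta, lr):
          """One step of the iterative training process (gradient ascent on quality)."""
          grad = -2.0 * (theta - 3.0)
          return theta + lr * grad

      # Maintained data per candidate: (i) network parameters, (ii) hyperparameters,
      # (iii) a quality measure, mirroring the abstract.
      population = [
          {"theta": random.uniform(-10, 10),    # network parameters
           "lr": 10 ** random.uniform(-3, -1),  # hyperparameters
           "quality": float("-inf")}            # quality measure
          for _ in range(POP_SIZE)
      ]

      for _ in range(ROUNDS):
          # Additional training operations, performed for every candidate.
          for cand in population:
              for _ in range(STEPS_PER_ROUND):
                  cand["theta"] = train_step(cand["theta"], cand["lr"])
              cand["quality"] = quality(cand["theta"])

          # Exploit: the weakest candidates copy parameters and hyperparameters
          # from the strongest; explore: the copied hyperparameters are perturbed.
          ranked = sorted(population, key=lambda c: c["quality"], reverse=True)
          quart = POP_SIZE // 4
          for cand in ranked[-quart:]:
              donor = random.choice(ranked[:quart])
              cand["theta"] = donor["theta"]
              cand["lr"] = donor["lr"] * random.choice([0.8, 1.2])

      best = max(population, key=lambda c: c["quality"])
      print(f"best theta={best['theta']:.3f} lr={best['lr']:.4f}")

    Because losing candidates copy both parameters and hyperparameters before perturbing the latter, the population effectively searches the hyperparameter space online, within a single training run, rather than in separate outer-loop trials.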
  • Publication number: 20240242091
    Abstract: Methods, computer systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network to perform a task. The system maintains data specifying (i) a plurality of candidate neural networks and (ii) a partitioning of the candidate neural networks into a plurality of partitions. The system repeatedly performs operations including: training each of the candidate neural networks; evaluating each candidate neural network in each partition using a respective fitness function for that partition to generate a respective fitness metric; and, for each partition, updating the respective values of one or more hyperparameters for at least one of the candidate neural networks in the partition based on the respective fitness metrics of the candidate neural networks in the partition. After repeatedly performing the operations, the system selects, from the maintained data, the respective values of the network parameters of one of the candidate neural networks. (A minimal sketch of this partitioned loop follows this entry.)
    Type: Application
    Filed: May 30, 2022
    Publication date: July 18, 2024
    Inventors: Valentin Clement Dalibard, Maxwell Elliot Jaderberg
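    This abstract extends the population scheme with a partitioning: every candidate is trained, each partition scores its own members with its own fitness function, and hyperparameter updates happen within partitions. The sketch below is a minimal illustration under assumed specifics; the toy task, the two fitness functions, and all constants are illustrative rather than taken from the patent.

      import random

      def train(cand, steps=20):
          # Iterative training toward the task optimum at theta = 3, with the
          # learning rate as the tunable hyperparameter.
          for _ in range(steps):
              cand["theta"] += cand["lr"] * (-2.0 * (cand["theta"] - 3.0))

      # A respective fitness function per partition (here one rewards raw task
      # performance and the other adds a penalty on large parameter values).
      fitness_fns = [
          lambda c: -(c["theta"] - 3.0) ** 2,
          lambda c: -(c["theta"] - 3.0) ** 2 - 0.1 * abs(c["theta"]),
      ]

      # Maintained data: the candidate networks and their partitioning.
      partitions = [
          [{"theta": random.uniform(-5, 5), "lr": 10 ** random.uniform(-3, -1)}
           for _ in range(4)]
          for _ in fitness_fns
      ]

      for _ in range(10):                       # repeatedly perform the operations
          for part, fit in zip(partitions, fitness_fns):
              for cand in part:                 # train every candidate
                  train(cand)
              part.sort(key=fit, reverse=True)  # evaluate with the partition's fitness
              # Update the hyperparameters of at least one candidate in the
              # partition, based on the fitness metrics within that partition.
              worst, best = part[-1], part[0]
              worst["theta"] = best["theta"]
              worst["lr"] = best["lr"] * random.choice([0.8, 1.2])

      # Finally, select the network parameters of one candidate from the data.
      winner = max(partitions[0], key=fitness_fns[0])
      print(f"selected theta={winner['theta']:.3f}")

    Giving each partition its own fitness function lets a single population be scored against several objectives at once while still sharing the cost of training.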
  • Publication number: 20240127071
    Abstract: There is provided a computer-implemented method for updating a search distribution of an evolutionary strategies optimizer using an optimizer neural network comprising one or more attention blocks. The method comprises receiving a plurality of candidate solutions, one or more parameters defining the search distribution from which the candidate solutions are sampled, and fitness score data indicating the fitness of each candidate solution. The method further comprises processing, by the one or more attention blocks, the fitness score data using an attention mechanism to generate a respective recombination weight for each candidate solution, and updating the one or more parameters defining the search distribution based upon the recombination weights applied to the plurality of candidate solutions. (A minimal sketch of this update follows this entry.)
    Type: Application
    Filed: September 27, 2023
    Publication date: April 18, 2024
    Inventors: Robert Tjarko Lange, Tom Schaul, Yutian Chen, Tom Ben Zion Zahavy, Valentin Clement Dalibard, Christopher Yenchuan Lu, Satinder Singh Baveja, Johan Sebastian Flennerhag
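    Here the evolution-strategies update itself is computed by a neural network: an attention block maps the population's fitness scores to recombination weights, and the search-distribution parameters are updated as a weighted recombination of the sampled candidates. The numpy sketch below shows the shape of that computation under assumed specifics; the fitness featurisation, the randomly initialised attention weights (standing in for a trained optimizer network), and the Gaussian search distribution are all assumptions of this sketch.

      import numpy as np

      rng = np.random.default_rng(0)
      DIM, POP = 5, 16

      def softmax(z, axis=-1):
          z = z - z.max(axis=axis, keepdims=True)
          e = np.exp(z)
          return e / e.sum(axis=axis, keepdims=True)

      # Untrained stand-in parameters for one attention block operating on
      # per-candidate fitness features.
      D_F, D_K = 2, 8
      Wq, Wk = rng.normal(size=(D_F, D_K)), rng.normal(size=(D_F, D_K))
      Wv = rng.normal(size=(D_F, 1))

      def recombination_weights(fitness):
          # Featurise the fitness scores (z-score and within-population rank).
          z = (fitness - fitness.mean()) / (fitness.std() + 1e-8)
          rank = np.argsort(np.argsort(fitness)) / (len(fitness) - 1)
          feats = np.stack([z, rank], axis=-1)             # (POP, D_F)
          # Scaled dot-product attention across the population.
          q, k, v = feats @ Wq, feats @ Wk, feats @ Wv
          attn = softmax(q @ k.T / np.sqrt(D_K), axis=-1)  # (POP, POP)
          scores = (attn @ v).squeeze(-1)                  # one score per candidate
          return softmax(scores)                           # recombination weights

      # Search distribution: a Gaussian with parameters (mean, sigma).
      mean, sigma = np.zeros(DIM), 1.0

      for _ in range(50):
          candidates = mean + sigma * rng.normal(size=(POP, DIM))
          fitness = -np.sum((candidates - 3.0) ** 2, axis=1)  # toy objective
          w = recombination_weights(fitness)
          # Update the distribution parameters via the weighted candidates.
          mean = w @ candidates
          sigma = float(np.sqrt(w @ np.sum((candidates - mean) ** 2, axis=1) / DIM))

      print("final mean:", np.round(mean, 2))

    In a trained optimizer network the attention weights would be learned so that the induced recombination rule improves the search; the random weights here only demonstrate the data flow from fitness scores to distribution update.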
  • Patent number: 11941527
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. A method trains a neural network having a plurality of network parameters to perform a particular neural network task and to determine trained values of the network parameters, using an iterative training process having a plurality of hyperparameters. The method comprises: maintaining a plurality of candidate neural networks and, for each candidate neural network, data specifying (i) respective values of the network parameters for the candidate neural network, (ii) respective values of the hyperparameters for the candidate neural network, and (iii) a quality measure of the candidate neural network's performance on the particular neural network task; and, for each of the candidate neural networks, repeatedly performing additional training operations.
    Type: Grant
    Filed: March 13, 2023
    Date of Patent: March 26, 2024
    Assignee: DeepMind Technologies Limited
    Inventors: Maxwell Elliot Jaderberg, Wojciech Czarnecki, Timothy Frederick Goldie Green, Valentin Clement Dalibard
  • Patent number: 11907821
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a machine learning model. A method includes: maintaining a plurality of training sessions; assigning, to each of one or more workers, a respective training session; and repeatedly performing operations until one or more termination criteria are met, the operations comprising: receiving an updated training session from a worker; selecting a second training session; selecting, based on a comparison of the updated training session and the second training session using a fitness evaluation function, either the updated training session or the second training session as a parent training session; generating a child training session from the selected parent training session; and assigning the child training session to an available worker. A candidate model is then selected to be the trained model for the machine learning model. (A minimal sketch of this worker loop follows this entry.)
    Type: Grant
    Filed: September 27, 2019
    Date of Patent: February 20, 2024
    Assignee: DeepMind Technologies Limited
    Inventors: Ang Li, Valentin Clement Dalibard, David Budden, Ola Spyra, Maxwell Elliot Jaderberg, Timothy James Alexander Harley, Sagi Perel, Chenjie Gu, Pramod Gupta
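    This entry casts population-based training in terms of workers and training sessions: workers train their assigned sessions, each updated session is compared against a second session with a fitness evaluation function, and a child of the winner replaces the finished work unit. The sketch below simulates that loop sequentially under assumed specifics; the toy task, mutation rule, and termination criterion are illustrative, not from the patent.

      import random

      def fitness(sess):
          """Fitness evaluation function: quality of the session's model."""
          return -(sess["theta"] - 3.0) ** 2

      def work(sess, steps=10):
          """One unit of training performed by a worker on its session."""
          for _ in range(steps):
              sess["theta"] += sess["lr"] * (-2.0 * (sess["theta"] - 3.0))
          return sess

      def make_session():
          return {"theta": random.uniform(-10, 10),
                  "lr": 10 ** random.uniform(-3, -1)}

      NUM_WORKERS = 4
      sessions = [make_session() for _ in range(8)]  # maintained training sessions
      assignments = {w: sessions[w] for w in range(NUM_WORKERS)}

      for step in range(100):                  # until the termination criterion
          worker = step % NUM_WORKERS          # next worker to finish its unit
          updated = work(assignments[worker])  # receive the updated session
          second = random.choice(sessions)     # select a second session
          # Compare the two sessions and keep the fitter one as the parent.
          parent = updated if fitness(updated) >= fitness(second) else second
          # Generate a child session: copy the parent, perturb its hyperparameters.
          child = {"theta": parent["theta"],
                   "lr": parent["lr"] * random.choice([0.8, 1.2])}
          sessions.append(child)
          assignments[worker] = child          # assign it to the available worker

      best = max(sessions, key=fitness)        # select the trained candidate model
      print(f"best theta={best['theta']:.3f}")

    In a real deployment the workers would run asynchronously, so the comparison-and-spawn step executes whenever any worker finishes, without a global synchronisation barrier.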
  • Publication number: 20230281445
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. A method trains a neural network having a plurality of network parameters to perform a particular neural network task and to determine trained values of the network parameters, using an iterative training process having a plurality of hyperparameters. The method comprises: maintaining a plurality of candidate neural networks and, for each candidate neural network, data specifying (i) respective values of the network parameters for the candidate neural network, (ii) respective values of the hyperparameters for the candidate neural network, and (iii) a quality measure of the candidate neural network's performance on the particular neural network task; and, for each of the candidate neural networks, repeatedly performing additional training operations.
    Type: Application
    Filed: March 13, 2023
    Publication date: September 7, 2023
    Inventors: Maxwell Elliot Jaderberg, Wojciech Czarnecki, Timothy Frederick Goldie Green, Valentin Clement Dalibard
  • Patent number: 11604985
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. A method trains a neural network having multiple network parameters to perform a particular neural network task and to determine trained values of the network parameters, using an iterative training process having multiple hyperparameters. The method includes: maintaining multiple candidate neural networks and, for each candidate neural network, data specifying (i) respective values of network parameters for the candidate neural network, (ii) respective values of hyperparameters for the candidate neural network, and (iii) a quality measure that measures the performance of the candidate neural network on the particular neural network task; and, for each of the multiple candidate neural networks, repeatedly performing additional training operations.
    Type: Grant
    Filed: November 22, 2018
    Date of Patent: March 14, 2023
    Assignee: DeepMind Technologies Limited
    Inventors: Maxwell Elliot Jaderberg, Wojciech Czarnecki, Timothy Frederick Goldie Green, Valentin Clement Dalibard
  • Publication number: 20210097443
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a machine learning model. A method includes: maintaining a plurality of training sessions; assigning, to each of one or more workers, a respective training session; and repeatedly performing operations until one or more termination criteria are met, the operations comprising: receiving an updated training session from a worker; selecting a second training session; selecting, based on a comparison of the updated training session and the second training session using a fitness evaluation function, either the updated training session or the second training session as a parent training session; generating a child training session from the selected parent training session; and assigning the child training session to an available worker. A candidate model is then selected to be the trained model for the machine learning model.
    Type: Application
    Filed: September 27, 2019
    Publication date: April 1, 2021
    Inventors: Ang Li, Valentin Clement Dalibard, David Budden, Ola Spyra, Maxwell Elliot Jaderberg, Timothy James Alexander Harley, Sagi Perel, Chenjie Gu, Pramod Gupta
  • Publication number: 20210004676
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. A method trains a neural network having a plurality of network parameters to perform a particular neural network task and to determine trained values of the network parameters, using an iterative training process having a plurality of hyperparameters. The method comprises: maintaining a plurality of candidate neural networks and, for each candidate neural network, data specifying (i) respective values of the network parameters for the candidate neural network, (ii) respective values of the hyperparameters for the candidate neural network, and (iii) a quality measure of the candidate neural network's performance on the particular neural network task; and, for each of the candidate neural networks, repeatedly performing additional training operations.
    Type: Application
    Filed: November 22, 2018
    Publication date: January 7, 2021
    Inventors: Maxwell Elliot Jaderberg, Wojciech Czarnecki, Timothy Frederick Goldie Green, Valentin Clement Dalibard