Patents by Inventor Corentin Tallec

Corentin Tallec has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240119261
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating an output sequence of discrete tokens using a diffusion model. In one aspect, a method includes generating, by using the diffusion model, a final latent representation of the sequence of discrete tokens that includes a determined value for each of a plurality of latent variables; applying a de-embedding matrix to the final latent representation of the output sequence of discrete tokens to generate a de-embedded final latent representation that includes, for each of the plurality of latent variables, a respective numeric score for each discrete token in a vocabulary of multiple discrete tokens; selecting, for each of the plurality of latent variables, a discrete token from among the multiple discrete tokens in the vocabulary that has a highest numeric score; and generating the output sequence of discrete tokens that includes the selected discrete tokens.
    Type: Application
    Filed: September 28, 2023
    Publication date: April 11, 2024
    Inventors: Robin Strudel, Rémi Leblond, Laurent Sifre, Sander Etienne Lea Dieleman, Nikolay Savinov, Will S. Grathwohl, Corentin Tallec, Florent Altché, Iaroslav Ganin, Arthur Mensch, Yilun Du
  • Publication number: 20210383225
    Abstract: A computer-implemented method of training a neural network. The method comprises processing a first transformed view of a training data item, e.g. an image, with a target neural network to generate a target output, processing a second transformed view of the training data item, e.g. image, with an online neural network to generate a prediction of the target output, updating parameters of the online neural network to minimize an error between the prediction of the target output and the target output, and updating parameters of the target neural network based on the parameters of the online neural network. The method can effectively train an encoder neural network without using labelled training data items, and without using a contrastive loss, i.e. without needing “negative examples” which comprise transformed views of different data items.
    Type: Application
    Filed: June 4, 2021
    Publication date: December 9, 2021
    Inventors: Jean-Bastien François Laurent Grill, Florian Strub, Florent Altché, Corentin Tallec, Pierre Richemond, Bernardo Avila Pires, Zhaohan Guo, Mohammad Gheshlaghi Azar, Bilal Piot, Remi Munos, Michal Valko
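
The decoding step described in publication 20240119261 (applying a de-embedding matrix to the final latent representation and selecting, for each latent variable, the vocabulary token with the highest score) can be illustrated with a short sketch. This is a minimal NumPy illustration, not the patented implementation: the function name, array shapes, and random placeholder values are assumptions made for demonstration only.

```python
import numpy as np

def decode_tokens(final_latents: np.ndarray, de_embedding: np.ndarray) -> list[int]:
    """Map a sequence of latent vectors to discrete token ids.

    final_latents: (sequence_length, latent_dim) final latent representation
                   produced by the diffusion model.
    de_embedding:  (latent_dim, vocab_size) de-embedding matrix.
    Returns, for each latent variable, the id of the highest-scoring token.
    """
    # Score every vocabulary token for every position in the sequence.
    scores = final_latents @ de_embedding        # (sequence_length, vocab_size)
    # Select the highest-scoring token per latent variable.
    return scores.argmax(axis=-1).tolist()

# Illustrative usage with random placeholder values (assumed, not from the patent).
rng = np.random.default_rng(0)
latents = rng.normal(size=(5, 16))               # 5 latent variables, 16-dimensional
W_deembed = rng.normal(size=(16, 100))           # vocabulary of 100 tokens
print(decode_tokens(latents, W_deembed))
```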
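Similarly, the training scheme described in publication 20210383225 (an online network predicting a target network's output for a different transformed view of the same data item, with the target's parameters updated from the online parameters) can be sketched as follows. This is a minimal sketch under several assumptions not stated in the abstract: linear encoders, additive noise as a stand-in for view transformations, and an exponential-moving-average target update; the full method, including any projection or normalisation details, is specified in the application itself.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out = 8, 4
lr, tau = 0.05, 0.99                       # learning rate and target decay (assumed values)

# Online encoder, predictor head, and target encoder -- all linear for brevity.
W_online = rng.normal(scale=0.1, size=(d_out, d_in))
W_pred = rng.normal(scale=0.1, size=(d_out, d_out))
W_target = W_online.copy()                 # target starts as a copy of the online encoder

def augment(x):
    """Stand-in for a view transformation (e.g. an image augmentation)."""
    return x + rng.normal(scale=0.1, size=x.shape)

for step in range(100):
    x = rng.normal(size=d_in)              # one unlabelled training data item
    v1, v2 = augment(x), augment(x)        # two transformed views of the same item

    target = W_target @ v1                 # target output (treated as a constant)
    pred = W_pred @ (W_online @ v2)        # online prediction of the target output
    err = pred - target                    # squared-error loss is ||err||^2

    # Gradients of ||err||^2 with respect to the online parameters only.
    grad_pred = 2.0 * np.outer(err, W_online @ v2)
    grad_online = 2.0 * np.outer(W_pred.T @ err, v2)
    W_pred -= lr * grad_pred
    W_online -= lr * grad_online

    # Update the target parameters slowly toward the online parameters.
    W_target = tau * W_target + (1.0 - tau) * W_online
```

Note that no labels and no negative examples appear anywhere in the loop: the only training signal is the mismatch between the online prediction and the target output for two views of the same item, which matches the abstract's description of training without a contrastive loss.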