Patents by Inventor Nikita Kitaev

Nikita Kitaev has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240028893
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing sequence modeling tasks using insertions. One of the methods includes receiving a system input that includes one or more source elements from a source sequence and zero or more target elements from a target sequence, wherein each source element is selected from a vocabulary of source elements and wherein each target element is selected from a vocabulary of target elements; generating a partial concatenated sequence that includes the one or more source elements from the source sequence and the zero or more target elements from the target sequence, wherein the source and target elements are arranged in the partial concatenated sequence according to a combined order; and generating a final concatenated sequence that includes a finalized source sequence and a finalized target sequence, wherein the finalized target sequence includes one or more target elements. (An illustrative code sketch of this insertion-based approach appears after the listing.)
    Type: Application
    Filed: May 22, 2023
    Publication date: January 25, 2024
    Inventors: William Chan, Mitchell Thomas Stern, Nikita Kitaev, Kelvin Gu, Jakob D. Uszkoreit
  • Patent number: 11657277
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing sequence modeling tasks using insertions. One of the methods includes receiving a system input that includes one or more source elements from a source sequence and zero or more target elements from a target sequence, wherein each source element is selected from a vocabulary of source elements and wherein each target element is selected from a vocabulary of target elements; generating a partial concatenated sequence that includes the one or more source elements from the source sequence and the zero or more target elements from the target sequence, wherein the source and target elements are arranged in the partial concatenated sequence according to a combined order; and generating a final concatenated sequence that includes a finalized source sequence and a finalized target sequence, wherein the finalized target sequence includes one or more target elements.
    Type: Grant
    Filed: May 26, 2020
    Date of Patent: May 23, 2023
    Assignee: Google LLC
    Inventors: William Chan, Mitchell Thomas Stern, Nikita Kitaev, Kelvin Gu, Jakob D. Uszkoreit
  • Publication number: 20210350244
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing a machine learning task on a network input to generate a network output. In one aspect, one of the systems includes an attention neural network configured to perform the machine learning task, the attention neural network including one or more LSH attention layers, each LSH attention layer comprising one or more LSH attention sub-layers, each LSH sub-layer configured to: receive a sequence of queries derived from an input sequence to the LSH attention layer, the sequence of queries having a respective query at each of a plurality of input positions; determine one or more respective hash values for each of the respective queries at each of the plurality of input positions; generate a plurality of LSH groupings; and generate an attended input sequence. (An illustrative code sketch of LSH-bucketed attention appears after the listing.)
    Type: Application
    Filed: February 1, 2021
    Publication date: November 11, 2021
    Inventors: Nikita Kitaev, Lukasz Mieczyslaw Kaiser, Anselm Caelifer Levskaya
  • Patent number: 10909461
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing a machine learning task on a network input to generate a network output. In one aspect, one of the systems includes an attention neural network configured to perform the machine learning task, the attention neural network including one or more LSH attention layers, each LSH attention layer comprising one or more LSH attention sub-layers, each LSH sub-layer configured to: receive a sequence of queries derived from an input sequence to the LSH attention layer, the sequence of queries having a respective query at each of a plurality of input positions; determine one or more respective hash values for each of the respective queries at each of the plurality of input positions; generate a plurality of LSH groupings; and generate an attended input sequence.
    Type: Grant
    Filed: May 8, 2020
    Date of Patent: February 2, 2021
    Assignee: Google LLC
    Inventors: Nikita Kitaev, Lukasz Mieczyslaw Kaiser, Anselm Caelifer Levskaya
  • Publication number: 20200372356
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing sequence modeling tasks using insertions. One of the methods includes receiving a system input that includes one or more source elements from a source sequence and zero or more target elements from a target sequence, wherein each source element is selected from a vocabulary of source elements and wherein each target element is selected from a vocabulary of target elements; generating a partial concatenated sequence that includes the one or more source elements from the source sequence and the zero or more target elements from the target sequence, wherein the source and target elements are arranged in the partial concatenated sequence according to a combined order; and generating a final concatenated sequence that includes a finalized source sequence and a finalized target sequence, wherein the finalized target sequence includes one or more target elements.
    Type: Application
    Filed: May 26, 2020
    Publication date: November 26, 2020
    Inventors: William Chan, Mitchell Thomas Stern, Nikita Kitaev, Kelvin Gu, Jakob D. Uszkoreit
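
The following is a minimal, illustrative Python sketch of insertion-based sequence generation in the spirit of the abstracts for publication 20240028893, patent 11657277, and publication 20200372356. The decoder loop, the toy insertion policy, and all names used here (insertion_decode, propose, toy_policy, the <eos> marker) are hypothetical stand-ins for exposition only, not the patented implementation.

    # Illustrative sketch only: a generic insertion-style decoder, not the
    # patented method. The insertion policy below is a toy stand-in for a
    # learned model.
    from typing import Callable, List, Tuple

    Token = str

    def insertion_decode(
        source: List[Token],
        propose: Callable[[List[Token], List[Token]], Tuple[int, Token]],
        max_insertions: int = 32,
        end_token: Token = "<eos>",
    ) -> List[Token]:
        """Grow a target sequence by repeated insertions.

        The partial concatenated sequence is the source followed by the
        (initially empty) target; `propose` plays the role of the learned
        insertion model and returns (slot, token), where `slot` is the index
        in the target at which `token` should be inserted.
        """
        target: List[Token] = []
        for _ in range(max_insertions):
            slot, token = propose(source, target)
            if token == end_token:
                break
            target.insert(slot, token)
        # Final concatenated sequence: finalized source followed by finalized target.
        return source + target

    # A toy insertion policy (hypothetical): copy the source tokens into the
    # target in reverse order by always inserting at the front.
    def toy_policy(source: List[Token], target: List[Token]) -> Tuple[int, Token]:
        if len(target) == len(source):
            return 0, "<eos>"
        return 0, source[len(target)]

    if __name__ == "__main__":
        print(insertion_decode(["a", "b", "c"], toy_policy))
        # ['a', 'b', 'c', 'c', 'b', 'a']

The point the abstracts describe is that source and target elements live in one concatenated sequence arranged by a combined order, so generation reduces to choosing where to insert the next target element rather than always appending at the end.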
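
The following is a minimal, illustrative NumPy sketch of LSH-bucketed attention in the spirit of the abstracts for publication 20210350244 and patent 10909461. The single hash round, the shared query/key vectors, and all names used here (lsh_buckets, lsh_attention, n_buckets) are simplifying assumptions for exposition only, not the patented implementation.

    # Illustrative sketch only: queries are hashed into buckets and attention
    # is computed within each bucket, rather than over the full sequence.
    import numpy as np

    def lsh_buckets(queries: np.ndarray, n_buckets: int,
                    rng: np.random.Generator) -> np.ndarray:
        """Assign each query a hash value via random rotations (one hash round).

        Projects each query onto n_buckets // 2 random directions and takes the
        argmax over the concatenation [xR, -xR], so nearby queries tend to land
        in the same bucket.
        """
        d = queries.shape[-1]
        rotations = rng.standard_normal((d, n_buckets // 2))
        projected = queries @ rotations                     # (seq, n_buckets // 2)
        return np.argmax(np.concatenate([projected, -projected], axis=-1), axis=-1)

    def lsh_attention(queries: np.ndarray, values: np.ndarray,
                      n_buckets: int = 4, seed: int = 0) -> np.ndarray:
        """Attend only within LSH groupings (keys taken to be the queries)."""
        rng = np.random.default_rng(seed)
        buckets = lsh_buckets(queries, n_buckets, rng)
        out = np.zeros_like(values)
        for b in np.unique(buckets):
            idx = np.where(buckets == b)[0]
            q = queries[idx]                                # (m, d) queries in this bucket
            scores = q @ q.T / np.sqrt(q.shape[-1])         # scores within the bucket only
            weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
            weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the bucket
            out[idx] = weights @ values[idx]                # attended outputs for this bucket
        return out

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        x = rng.standard_normal((16, 8))                    # toy sequence of 16 query vectors
        v = rng.standard_normal((16, 8))
        print(lsh_attention(x, v).shape)                    # (16, 8)

Because each position attends only within its own LSH grouping, the quadratic cost of full attention is replaced by a cost that scales with the bucket sizes.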