Patents by Inventor Jakob D. Uszkoreit

Jakob D. Uszkoreit has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10740433
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for implementing a sequence-to-sequence model that is recurrent in depth while employing self-attention to combine information from different parts of sequences.
    Type: Grant
    Filed: May 20, 2019
    Date of Patent: August 11, 2020
    Assignee: Google LLC
    Inventors: Mostafa Dehghani, Stephan Gouws, Oriol Vinyals, Jakob D. Uszkoreit, Lukasz Mieczyslaw Kaiser
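
The abstract above describes a model that is recurrent in depth: one shared self-attention and transition block is applied repeatedly, rather than stacking distinct layers. Below is a minimal, illustrative NumPy sketch of that idea, assuming a single attention head; none of the names, shapes, or step counts come from the patent itself.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wq, wk, wv):
    # x: (seq_len, d_model); a single attention head, for brevity.
    q, k, v = x @ wq, x @ wk, x @ wv
    weights = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return weights @ v

def recurrent_in_depth_encoder(x, params, num_steps=4):
    # The same block (self-attention followed by a feed-forward transition)
    # is applied num_steps times, so depth comes from recurrence rather than
    # from a stack of distinct layers.
    for _ in range(num_steps):
        x = x + self_attention(x, params["wq"], params["wk"], params["wv"])
        x = x + np.tanh(x @ params["w1"]) @ params["w2"]
    return x

d = 8
rng = np.random.default_rng(0)
params = {name: rng.normal(scale=0.1, size=(d, d))
          for name in ("wq", "wk", "wv", "w1", "w2")}
print(recurrent_in_depth_encoder(rng.normal(size=(5, d)), params).shape)  # (5, 8)
```
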
  • Publication number: 20200234011
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating network outputs using insertion operations.
    Type: Application
    Filed: January 23, 2020
    Publication date: July 23, 2020
    Inventors: Jakob D. Uszkoreit, Mitchell Thomas Stern, Jamie Ryan Kiros, William Chan
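
The abstract above concerns generating outputs with insertion operations rather than purely left-to-right appending. The toy sketch below shows greedy insertion decoding under that assumption; the scoring function, vocabulary, and end-of-sequence handling are placeholders for illustration, not taken from the filing.

```python
from typing import Callable, List, Tuple

def insertion_decode(score: Callable[[List[str], int, str], float],
                     vocab: List[str], max_steps: int = 10) -> List[str]:
    """Greedily build a sequence by repeatedly inserting the best (slot, token) pair."""
    out: List[str] = []
    for _ in range(max_steps):
        # Consider inserting every token into every slot, plus an end token that stops decoding.
        best: Tuple[float, int, str] = max(
            (score(out, slot, tok), slot, tok)
            for slot in range(len(out) + 1)
            for tok in vocab + ["<eos>"]
        )
        _, slot, tok = best
        if tok == "<eos>":
            break
        out.insert(slot, tok)
    return out

# Toy scorer that rewards any insertion consistent with the target "a b c".
target = ["a", "b", "c"]
def toy_score(prefix, slot, tok):
    if tok == "<eos>":
        return 1.0 if prefix == target else -1.0
    cand = prefix[:slot] + [tok] + prefix[slot:]
    return 1.0 if cand == target[:len(cand)] or cand == target[-len(cand):] else 0.0

print(insertion_decode(toy_score, ["a", "b", "c"]))  # ['a', 'b', 'c']
```
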
  • Patent number: 10719764
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Grant
    Filed: September 3, 2019
    Date of Patent: July 21, 2020
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
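
The encoder self-attention sub-layer described in the abstract above is, in the accompanying literature, usually written in the scaled dot-product form below; the notation (Q, K, V, d_k) is the conventional one from that literature, not language taken from the patent.

$$
\mathrm{Attention}(Q, K, V) \;=\; \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
$$

In the encoder self-attention sub-layer, the queries, keys, and values are all derived from the encoder subnetwork inputs by learned linear projections, so the query for each particular input position attends over every input position.
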
  • Publication number: 20200089755
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media for training a machine learning model to perform multiple machine learning tasks from multiple machine learning domains. One system includes a machine learning model that includes multiple input modality neural networks corresponding to respective different modalities and being configured to map received data inputs of the corresponding modality to mapped data inputs from a unified representation space; an encoder neural network configured to process mapped data inputs from the unified representation space to generate respective encoder data outputs; a decoder neural network configured to process encoder data outputs to generate respective decoder data outputs from the unified representation space; and multiple output modality neural networks corresponding to respective different modalities and being configured to map decoder data outputs to data outputs of the corresponding modality.
    Type: Application
    Filed: November 19, 2019
    Publication date: March 19, 2020
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Ashish Teku Vaswani
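
The abstract above outlines an architecture with per-modality input networks mapping into one shared representation space, a shared encoder and decoder operating in that space, and per-modality output networks mapping back out. A hedged sketch of that wiring, with placeholder linear modules standing in for the real networks:

```python
import numpy as np

class ToyMultiModalModel:
    def __init__(self, modality_dims, unified_dim=16, seed=0):
        rng = np.random.default_rng(seed)
        # One input network and one output network per modality (placeholder linear maps).
        self.to_unified = {m: rng.normal(scale=0.1, size=(d, unified_dim))
                           for m, d in modality_dims.items()}
        self.from_unified = {m: rng.normal(scale=0.1, size=(unified_dim, d))
                             for m, d in modality_dims.items()}
        # A single encoder and decoder shared across all modalities.
        self.encoder = rng.normal(scale=0.1, size=(unified_dim, unified_dim))
        self.decoder = rng.normal(scale=0.1, size=(unified_dim, unified_dim))

    def forward(self, x, in_modality, out_modality):
        u = x @ self.to_unified[in_modality]          # input-modality network
        enc = np.tanh(u @ self.encoder)               # shared encoder
        dec = np.tanh(enc @ self.decoder)             # shared decoder
        return dec @ self.from_unified[out_modality]  # output-modality network

model = ToyMultiModalModel({"text": 32, "image": 64})
text_tokens = np.random.default_rng(1).normal(size=(5, 32))
print(model.forward(text_tokens, "text", "image").shape)  # (5, 64)
```
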
  • Publication number: 20200082226
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing parallel generation of output from an autoregressive sequence-to-sequence model. In one aspect, a blockwise parallel decoding method takes advantage of the fact that some architectures can score sequences in sublinear time. By generating predictions for multiple time steps at once and then backing off to the longest prefix validated by the scoring model, the methods can substantially improve the speed of greedy decoding without compromising performance.
    Type: Application
    Filed: November 13, 2019
    Publication date: March 12, 2020
    Inventors: Noam M. Shazeer, Jakob D. Uszkoreit, Mitchell Thomas Stern
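
The abstract above summarises blockwise parallel decoding: propose a block of tokens in one step, verify them with the scoring model, and keep the longest validated prefix. A hedged toy sketch of that loop, in which both the proposal and scoring functions are stand-ins rather than the patent's models:

```python
from typing import Callable, List

def blockwise_parallel_decode(propose: Callable[[List[str]], List[str]],
                              score_next: Callable[[List[str]], str],
                              max_len: int = 12, block_size: int = 4) -> List[str]:
    out: List[str] = []
    while len(out) < max_len:
        block = propose(out)[:block_size]      # predict up to k tokens at once
        accepted = 0
        for tok in block:
            # The scoring model validates each proposed token greedily.
            if score_next(out + block[:accepted]) != tok:
                break
            accepted += 1
        if accepted == 0:
            out.append(score_next(out))        # fall back to a single greedy step
        else:
            out.extend(block[:accepted])       # accept the validated prefix
        if out and out[-1] == "<eos>":
            break
    return out

# Toy proposal and scoring models that agree on the same target, so whole blocks are accepted.
target = "the cat sat on the mat <eos>".split()
propose = lambda prefix: target[len(prefix):]
score_next = lambda prefix: target[len(prefix)]
print(blockwise_parallel_decode(propose, score_next))
```
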
  • Patent number: 10521479
    Abstract: The present disclosure relates to evaluating different semantic interpretations of a search query. One example method includes obtaining a set of search results for a particular search query submitted to a search engine; obtaining a set of semantic interpretations for the particular search query; obtaining, for each semantic interpretation of the set, a canonical search query; generating a modified search query based at least in part on the particular search query and the canonical search query for the semantic interpretation; obtaining a set of search results for the modified search query for the semantic interpretation; and determining, for each semantic interpretation of the set, a degree of similarity between (i) the set of search results of the modified search query for the semantic interpretation, and (ii) the set of search results for the particular search query.
    Type: Grant
    Filed: May 20, 2019
    Date of Patent: December 31, 2019
    Assignee: Google LLC
    Inventors: Ashish Venugopal, Jakob D. Uszkoreit, John Blitzer, Edward Everett Anderson
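
The abstract above scores each semantic interpretation by how similar the results of its modified query are to the results of the original query. The sketch below uses Jaccard overlap purely as an illustrative similarity measure; the patent does not specify this function, and the queries and URLs are made up.

```python
from typing import Dict, List, Set, Tuple

def rank_interpretations(original_results: Set[str],
                         results_by_interpretation: Dict[str, Set[str]]) -> List[Tuple[float, str]]:
    def jaccard(a: Set[str], b: Set[str]) -> float:
        return len(a & b) / len(a | b) if a | b else 0.0
    scored = [(jaccard(original_results, results), interpretation)
              for interpretation, results in results_by_interpretation.items()]
    return sorted(scored, reverse=True)

original = {"url1", "url2", "url3", "url4"}          # results for the original query
candidates = {
    "jaguar the animal": {"url1", "url2", "url5"},   # results for each modified query
    "jaguar the car":    {"url6", "url7"},
}
# The animal reading shares more results with the original query, so it ranks first.
print(rank_interpretations(original, candidates))
```
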
  • Patent number: 10521701
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing parallel generation of output from an autoregressive sequence-to-sequence model. In one aspect, a blockwise parallel decoding method takes advantage of the fact that some architectures can score sequences in sublinear time. By generating predictions for multiple time steps at once and then backing off to the longest prefix validated by the scoring model, the methods can substantially improve the speed of greedy decoding without compromising performance.
    Type: Grant
    Filed: May 20, 2019
    Date of Patent: December 31, 2019
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Jakob D. Uszkoreit, Mitchell Thomas Stern
  • Publication number: 20190392319
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Application
    Filed: September 3, 2019
    Publication date: December 26, 2019
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
  • Publication number: 20190354812
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing parallel generation of output from an autoregressive sequence-to-sequence model. In one aspect, a blockwise parallel decoding method takes advantage of the fact that some architectures can score sequences in sublinear time. By generating predictions for multiple time steps at once and then backing off to the longest prefix validated by the scoring model, the methods can substantially improve the speed of greedy decoding without compromising performance.
    Type: Application
    Filed: May 20, 2019
    Publication date: November 21, 2019
    Inventors: Noam M. Shazeer, Jakob D. Uszkoreit, Mitchell Thomas Stern
  • Publication number: 20190354567
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for implementing a sequence-to-sequence model that is recurrent in depth while employing self-attention to combine information from different parts of sequences.
    Type: Application
    Filed: May 20, 2019
    Publication date: November 21, 2019
    Inventors: Mostafa Dehghani, Stephan Gouws, Oriol Vinyals, Jakob D. Uszkoreit, Lukasz Mieczyslaw Kaiser
  • Patent number: 10452978
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Grant
    Filed: June 28, 2018
    Date of Patent: October 22, 2019
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
  • Publication number: 20190278813
    Abstract: The present disclosure relates to evaluating different semantic interpretations of a search query. One example method includes obtaining a set of search results for a particular search query submitted to a search engine; obtaining a set of semantic interpretations for the particular search query; obtaining, for each semantic interpretation of the set, a canonical search query; generating a modified search query based at least in part on the particular search query and the canonical search query for the semantic interpretation; obtaining a set of search results for the modified search query for the semantic interpretation; and determining, for each semantic interpretation of the set, a degree of similarity between (i) the set of search results of the modified search query for the semantic interpretation, and (ii) the set of search results for the particular search query.
    Type: Application
    Filed: May 20, 2019
    Publication date: September 12, 2019
    Inventors: Ashish Venugopal, Jakob D. Uszkoreit, John Blitzer, Edward Everett Anderson
  • Patent number: 10353964
    Abstract: The present disclosure relates to evaluating different semantic interpretations of a search query. One example method includes obtaining a set of search results for a particular search query submitted to a search engine; obtaining a set of semantic interpretations for the particular search query; obtaining, for each semantic interpretation of the set, a canonical search query; generating a modified search query based at least in part on the particular search query and the canonical search query for the semantic interpretation; obtaining a set of search results for the modified search query for the semantic interpretation; and determining, for each semantic interpretation of the set, a degree of similarity between (i) the set of search results of the modified search query for the semantic interpretation, and (ii) the set of search results for the particular search query.
    Type: Grant
    Filed: March 11, 2015
    Date of Patent: July 16, 2019
    Assignee: Google LLC
    Inventors: Ashish Venugopal, Jakob D. Uszkoreit, John Blitzer, Edward Everett Anderson
  • Publication number: 20190130213
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output image. In one aspect, one of the methods includes generating the output image intensity value by intensity value according to a generation order of pixel-color channel pairs from the output image, comprising, for each particular generation order position in the generation order: generating a current output image representation of a current output image, processing the current output image representation using a decoder neural network to generate a probability distribution over possible intensity values for the pixel-color channel pair at the particular generation order position, wherein the decoder neural network includes one or more local masked self-attention sub-layers; and selecting an intensity value for the pixel-color channel pair at the particular generation order position using the probability distribution.
    Type: Application
    Filed: October 29, 2018
    Publication date: May 2, 2019
    Inventors: Noam M. Shazeer, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Niki Parmar, Ashish Teku Vaswani
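
The abstract above describes generating an image one intensity value at a time, following a fixed generation order over (pixel, colour-channel) pairs and sampling each value from a distribution conditioned on what has been generated so far. The sketch below shows only that outer loop; the uniform distribution is a placeholder standing in for the decoder network with local masked self-attention.

```python
import numpy as np

def generate_image(height, width, channels=3, num_intensities=256, seed=0):
    rng = np.random.default_rng(seed)
    image = np.zeros((height, width, channels), dtype=np.int64)
    # Raster-scan generation order over pixel-channel pairs.
    order = [(r, c, ch) for r in range(height) for c in range(width)
             for ch in range(channels)]
    for (r, c, ch) in order:
        # Placeholder for the decoder: it would consume the partially generated
        # image and return a probability distribution over the possible intensities.
        probs = np.full(num_intensities, 1.0 / num_intensities)
        image[r, c, ch] = rng.choice(num_intensities, p=probs)
    return image

print(generate_image(2, 2).shape)  # (2, 2, 3)
```
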
  • Publication number: 20190050450
    Abstract: Methods, systems, and apparatus for generating data describing context clusters and context cluster probabilities, wherein each context cluster includes query inputs based on the input context for each of the query inputs and the content described by each query input, and each context cluster probability indicates a probability that a query input that belongs to the context cluster will be selected by the user, receiving, from a user device, an indication of a user event that includes data indicating a context of the user device, selecting as a selected context cluster, based on the context cluster probabilities for each of the context clusters and the context of the user device, a context cluster for selection input by the user device, and providing, to the user device, data that causes the user device to display a context cluster selection input that indicates the selected context cluster for user selection.
    Type: Application
    Filed: October 17, 2018
    Publication date: February 14, 2019
    Inventor: Jakob D. Uszkoreit
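
The selection step in the abstract above picks, for the device's current context, the context cluster with the highest probability of containing the query the user would choose, and surfaces it as a suggestion. A hedged toy sketch of that step; the probability table and context labels are illustrative, not from the filing.

```python
from typing import Dict

def select_context_cluster(device_context: str,
                           cluster_probs: Dict[str, Dict[str, float]]) -> str:
    """Return the cluster most likely to be selected given the device context."""
    probs = cluster_probs.get(device_context, {})
    return max(probs, key=probs.get) if probs else "default"

# P(cluster selected | device context), e.g. estimated from historical query logs.
cluster_probs = {
    "at_airport_evening": {"flight status": 0.6, "nearby restaurants": 0.3, "weather": 0.1},
    "at_home_morning":    {"weather": 0.5, "commute traffic": 0.4, "flight status": 0.1},
}
print(select_context_cluster("at_airport_evening", cluster_probs))  # flight status
```
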
  • Patent number: 10146829
    Abstract: Methods, systems, and apparatus for generating data describing context clusters and context cluster probabilities, wherein each context cluster includes query inputs based on the input context for each of the query inputs and the content described by each query input, and each context cluster probability indicates a probability that a query input that belongs to the context cluster will be selected by the user, receiving, from a user device, an indication of a user event that includes data indicating a context of the user device, selecting as a selected context cluster, based on the context cluster probabilities for each of the context clusters and the context of the user device, a context cluster for selection input by the user device, and providing, to the user device, data that causes the user device to display a context cluster selection input that indicates the selected context cluster for user selection.
    Type: Grant
    Filed: September 28, 2015
    Date of Patent: December 4, 2018
    Assignee: Google LLC
    Inventor: Jakob D. Uszkoreit
  • Publication number: 20180341860
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Application
    Filed: June 28, 2018
    Publication date: November 29, 2018
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
  • Patent number: 9984684
    Abstract: A language processing system collects similar queries and their respective responses and aggregates the queries by response. Incorrect responses are identified and filtered out using the aggregation. The remaining responses are then used to query a high-precision system for attributes of entities specified by the queries. The attribute type is determined from the responses of the high-precision system, and corresponding parse rules are generated. The parse rules are then associated with an operation that yields a response that specifies an attribute of the attribute type.
    Type: Grant
    Filed: June 25, 2013
    Date of Patent: May 29, 2018
    Assignee: Google LLC
    Inventors: Jakob D. Uszkoreit, John Blitzer, Engin Cinar Sahin, Rahul Gupta, Dekang Lin, Fernando Pereira
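
The abstract above describes a pipeline: group similar queries by the response they received, discard responses with too little agreement, look the surviving responses up in a high-precision attribute store to identify the attribute type, and emit a parse rule for that attribute. The sketch below is a hedged illustration of that flow; the query pattern, thresholds, and lookup table are invented for the example.

```python
from collections import Counter

def induce_parse_rule(query_pattern, query_response_pairs, attribute_store, min_support=2):
    by_response = Counter(resp for _, resp in query_response_pairs)
    # Responses that too few similar queries agree on are treated as incorrect and filtered out.
    trusted = [resp for resp, n in by_response.items() if n >= min_support]
    # Ask the high-precision system which attribute type the trusted responses correspond to.
    types = {attribute_store[resp] for resp in trusted if resp in attribute_store}
    if len(types) == 1:
        # The parse rule associates the query pattern with a lookup of that attribute type.
        return (query_pattern, f"lookup(<entity>, '{types.pop()}')")
    return None

pairs = [("how tall is the eiffel tower", "330 m"),
         ("how tall is eiffel tower", "330 m"),
         ("how tall is the eiffel tower", "1889")]   # a stray incorrect response
attribute_store = {"330 m": "height"}                # stand-in high-precision system
print(induce_parse_rule("how tall is <entity>", pairs, attribute_store))
```
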
  • Patent number: 9812124
    Abstract: A language processing system identifies first command input sentences that are not successfully parsed by any parsing rule in a set of parsing rules. Each of the parsing rules is associated with an action, and a user device performs the action associated with a parsing rule in response to an input sentence being successfully parsed by the parsing rule. For each of these identified first sentences, the system determines whether the first input sentence has an underserving signal that is indicative of one or more actions being underserved. If the first sentence has the underserving signal, then the first sentence is selected as a candidate input sentence. Each candidate input sentence is provided to an action analysis process that determines whether a candidate input sentence is to be associated with one action, and upon a positive determination generates a parsing rule for the candidate input sentence.
    Type: Grant
    Filed: June 15, 2017
    Date of Patent: November 7, 2017
    Assignee: Google Inc.
    Inventors: Jakob D. Uszkoreit, Percy Liang, Daniel M. Bikel
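
The candidate-selection step in the abstract above looks for command sentences that no existing parsing rule matches and that also carry an "underserving" signal, and marks them as candidates for which a new parsing rule might be generated. The sketch below is a toy illustration of that step; the rules, the frequency-based signal, and the example sentences are assumptions, not the patent's.

```python
import re
from typing import Dict, List

parsing_rules = [re.compile(r"^set an? alarm for .+$"),
                 re.compile(r"^call .+$")]

def parses(sentence: str) -> bool:
    """True if any existing parsing rule matches the command sentence."""
    return any(rule.match(sentence) for rule in parsing_rules)

def has_underserving_signal(sentence: str, frequency: int) -> bool:
    # Toy signal: the unparsed command is issued often, so an action appears underserved.
    return frequency >= 100

def candidate_sentences(sentence_counts: Dict[str, int]) -> List[str]:
    return [s for s, n in sentence_counts.items()
            if not parses(s) and has_underserving_signal(s, n)]

counts = {"set an alarm for 7am": 500,   # already parsed, ignored
          "wake me up at 7am": 350,      # unparsed and frequent -> candidate
          "call mom": 800,               # already parsed, ignored
          "flibber the wug": 2}          # unparsed but rare, ignored
print(candidate_sentences(counts))       # ['wake me up at 7am']
```
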
  • Patent number: 9704481
    Abstract: A language processing system identifies first command input sentences that are not successfully parsed by any parsing rule in a set of parsing rules. Each of the parsing rules is associated with an action, and a user device performs the action associated with a parsing rule in response to an input sentence being successfully parsed by the parsing rule. For each of these identified first sentences, the system determines whether the first input sentence has an underserving signal that is indicative of one or more actions being underserved. If the first sentence has the underserving signal, then the first sentence is selected as a candidate input sentence. Each candidate input sentence is provided to an action analysis process that determines whether a candidate input sentence is to be associated with one action, and upon a positive determination generates a parsing rule for the candidate input sentence.
    Type: Grant
    Filed: September 30, 2015
    Date of Patent: July 11, 2017
    Assignee: Google Inc.
    Inventors: Jakob D. Uszkoreit, Percy Liang, Daniel M. Bikel