Patents by Inventor Andrew M. Dai

Andrew M. Dai has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240112027
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing neural architecture search for machine learning models. In one aspect, a method comprises receiving training data for a machine learning task; generating a plurality of candidate neural networks for performing the machine learning task, wherein each candidate neural network comprises a plurality of instances of a layer block composed of a plurality of layers; for each candidate neural network, selecting a respective type for each of the plurality of layers from a set of layer types; training the candidate neural networks and evaluating performance scores for the trained candidate neural networks as applied to the machine learning task; and determining a final neural network for performing the machine learning task based at least on the performance scores for the candidate neural networks.
    Type: Application
    Filed: September 28, 2023
    Publication date: April 4, 2024
    Inventors: Yanqi Zhou, Yanping Huang, Yifeng Lu, Andrew M. Dai, Siamak Shakeri, Zhifeng Chen, James Laudon, Quoc V. Le, Da Huang, Nan Du, David Richard So, Daiyi Peng, Yingwei Cui, Jeffrey Adgate Dean, Chang Lan
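The abstract's loop (sample candidates, pick a layer type per slot, score, keep the best) can be sketched as a random search. This is an illustration of the general idea only, not the patented method; the layer-type set and the scoring shortcut are hypothetical, and a real search would train and evaluate each network on the task.

```python
import random

LAYER_TYPES = ["attention", "conv", "mlp"]   # hypothetical layer-type set

def sample_candidate(num_layers, rng):
    """One candidate architecture: a layer type for each slot in the block."""
    return tuple(rng.choice(LAYER_TYPES) for _ in range(num_layers))

def evaluate(candidate):
    """Stand-in performance score. A real search would instantiate the
    network, train it, and score it on the machine learning task."""
    weights = {"attention": 3.0, "conv": 2.0, "mlp": 1.0}   # toy preferences
    return sum(weights[t] for t in candidate)

def architecture_search(num_layers=4, num_candidates=20, seed=0):
    rng = random.Random(seed)
    candidates = [sample_candidate(num_layers, rng)
                  for _ in range(num_candidates)]
    score, best = max((evaluate(c), c) for c in candidates)
    return best, score

best, score = architecture_search()
```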
  • Patent number: 11900235
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for processing inputs using recurrent neural networks. One of the systems includes a main recurrent neural network comprising one or more recurrent neural network layers and a respective hyper recurrent neural network corresponding to each of the one or more recurrent neural network layers, wherein each hyper recurrent neural network is configured to, at each of a plurality of time steps: process the layer input at the time step to the corresponding recurrent neural network layer, the current layer hidden state of the corresponding recurrent neural network layer, and a current hypernetwork hidden state of the hyper recurrent neural network to generate an updated hypernetwork hidden state.
    Type: Grant
    Filed: September 9, 2021
    Date of Patent: February 13, 2024
    Assignee: Google LLC
    Inventors: Andrew M. Dai, Quoc V. Le, David Ha
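The claimed structure — a small hyper-RNN that reads the main layer's input, the main layer's hidden state, and its own hidden state, then conditions the main layer — can be caricatured in a few lines. The fixed toy weights, dimensions, and the per-unit weight-scaling scheme below are illustrative simplifications, not the patented parameterization.

```python
import math

def tanh_vec(v):
    return [math.tanh(x) for x in v]

class HyperRNNCell:
    """Illustrative hyper-RNN modulating one main RNN layer."""

    def __init__(self, size, hyper_size):
        # Fixed toy weights; a real model would learn all of these.
        self.w_hyper = [[0.05] * (2 * size + hyper_size)
                        for _ in range(hyper_size)]
        self.w_scale = [[0.2] * hyper_size for _ in range(size)]
        self.w_main = [[0.1] * (2 * size) for _ in range(size)]

    def step(self, x, h, hyper_h):
        # The hyper-RNN processes the layer input at this time step, the
        # layer's current hidden state, and its own current hidden state
        # to produce an updated hypernetwork hidden state ...
        hyper_in = x + h + hyper_h
        new_hyper_h = tanh_vec(
            [sum(w * v for w, v in zip(row, hyper_in))
             for row in self.w_hyper])
        # ... which then conditions the main layer's update, here as
        # per-unit scale factors on the pre-activations.
        scales = [sum(w * v for w, v in zip(row, new_hyper_h))
                  for row in self.w_scale]
        pre = [sum(w * v for w, v in zip(row, x + h)) for row in self.w_main]
        new_h = tanh_vec([s * p for s, p in zip(scales, pre)])
        return new_h, new_hyper_h
```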
  • Patent number: 11868888
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a document classification neural network. One of the methods includes training an autoencoder neural network to autoencode input documents, wherein the autoencoder neural network comprises the one or more LSTM neural network layers and an autoencoder output layer, and wherein training the autoencoder neural network comprises determining pre-trained values of the parameters of the one or more LSTM neural network layers from initial values of the parameters of the one or more LSTM neural network layers; and training the document classification neural network on a plurality of training documents to determine trained values of the parameters of the one or more LSTM neural network layers from the pre-trained values of the parameters of the one or more LSTM neural network layers.
    Type: Grant
    Filed: December 13, 2021
    Date of Patent: January 9, 2024
    Assignee: Google LLC
    Inventors: Andrew M. Dai, Quoc V. Le
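The two-phase scheme in the abstract (pretrain LSTM layers inside an autoencoder on unlabeled documents, then start classifier training from those pre-trained values) can be sketched as a weight hand-off. `LSTMLayer` and the parameter nudges are hypothetical stand-ins for a real network and real gradient updates.

```python
class LSTMLayer:
    """Toy stand-in for an LSTM layer; only its parameter vector matters
    for illustrating the weight hand-off."""
    def __init__(self, size=8):
        self.weights = [0.0] * size

def pretrain_autoencoder(lstm, documents):
    """Phase 1: the LSTM plus an autoencoder output layer learns to
    reconstruct unlabeled documents. The nudge below is a stand-in for
    gradient updates to the LSTM parameters."""
    for _ in documents:
        lstm.weights = [w + 0.01 for w in lstm.weights]
    return lstm

def train_classifier(lstm, labeled_docs):
    """Phase 2: the document classifier starts from the pre-trained LSTM
    values rather than a random initialization, then fine-tunes."""
    start = list(lstm.weights)   # pre-trained values, not zeros
    for _doc, _label in labeled_docs:
        lstm.weights = [w + 0.001 for w in lstm.weights]
    return lstm, start

lstm = LSTMLayer()
lstm = pretrain_autoencoder(lstm, ["doc a", "doc b", "doc c"])
lstm, start = train_classifier(lstm, [("doc a", 1), ("doc b", 0)])
```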
  • Publication number: 20230334306
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for predicting future patient health using a recurrent neural network. In particular, at each time step, a network input for the time step is processed using a recurrent neural network to update a hidden state of the recurrent neural network. Specifically, the hidden state of the recurrent neural network is partitioned into a plurality of partitions and the plurality of partitions comprises a respective partition for each of a plurality of possible observational features.
    Type: Application
    Filed: February 18, 2020
    Publication date: October 19, 2023
    Inventors: Kun Zhang, Andrew M. Dai, Yuan Xue, Alvin Rishi Rajkomar, Gerardo Flores
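The key structural idea — a hidden state partitioned so that each possible observational feature owns its own partition — can be sketched directly. The feature names, partition size, and the update rule here are all illustrative, not taken from the patent.

```python
import math

FEATURES = ["heart_rate", "blood_pressure", "lab_glucose"]   # illustrative

def init_state(size_per_feature=4):
    """Hidden state partitioned per possible observational feature."""
    return {f: [0.0] * size_per_feature for f in FEATURES}

def step(state, feature, value):
    """Process one observation: only the partition owned by the observed
    feature is updated; the other partitions are carried forward."""
    new_state = {f: list(v) for f, v in state.items()}
    new_state[feature] = [math.tanh(0.5 * h + 0.5 * value)
                          for h in new_state[feature]]
    return new_state
```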
  • Publication number: 20230274151
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for searching for an architecture for a neural network that performs a multi-modal task that requires operating on inputs that each include data from multiple different modalities.
    Type: Application
    Filed: March 30, 2021
    Publication date: August 31, 2023
    Inventors: Zhen Xu, David Richard So, Andrew M. Dai
  • Patent number: 11742087
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for predicting future patient health using neural networks. One of the methods includes receiving electronic health record data for a patient; generating a respective observation embedding for each of the observations, comprising, for each clinical note: processing the sequence of tokens in the clinical note using a clinical note embedding LSTM to generate a respective token embedding for each of the tokens; and generating the observation embedding for the clinical note from the token embeddings; generating an embedded representation, comprising, for each time window: combining the observation embeddings of observations occurring during the time window to generate a patient record embedding; and processing the embedded representation of the electronic health record data using a prediction recurrent neural network to generate a neural network output that characterizes a future health status of the patient.
    Type: Grant
    Filed: August 11, 2020
    Date of Patent: August 29, 2023
    Assignee: Google LLC
    Inventors: Jonas Beachey Kemp, Andrew M. Dai, Alvin Rishi Rajkomar
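The pipeline in the abstract (token embeddings per clinical note, an observation embedding per note, a combined embedding per time window, and a recurrent network over windows) can be sketched end to end. The hash-style token embedding, mean/sum pooling, and one-line RNN are stand-ins; the patent's clinical-note-embedding LSTM and prediction RNN are learned models.

```python
import math

def embed_token(tok, dim=4):
    """Deterministic toy token embedding (a learned LSTM in the patent)."""
    base = sum(ord(c) for c in tok)
    return [((base * (i + 1)) % 7 - 3) / 3.0 for i in range(dim)]

def note_embedding(tokens, dim=4):
    """Observation embedding for one clinical note: pooled token embeddings."""
    vecs = [embed_token(t, dim) for t in tokens]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def window_embedding(notes, dim=4):
    """Combine observation embeddings within one time window (here: sum)."""
    embs = [note_embedding(n, dim) for n in notes]
    return [sum(col) for col in zip(*embs)]

def predict_risk(windows, dim=4):
    """Run a simple recurrent update over the per-window embeddings and
    squash the final state into a future-health-status score."""
    h = [0.0] * dim
    for w in windows:
        h = [math.tanh(0.5 * hi + 0.5 * wi) for hi, wi in zip(h, w)]
    return 1 / (1 + math.exp(-sum(h)))
```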
  • Patent number: 11501168
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for structuring and training a recurrent neural network. This specification describes a technique that improves the ability to capture long-term dependencies in recurrent neural networks by adding an unsupervised auxiliary loss at one or more anchor points to the original objective. This auxiliary loss forces the network to either reconstruct previous events or predict next events in a sequence, making truncated backpropagation feasible for long sequences and also improving full backpropagation through time.
    Type: Grant
    Filed: February 11, 2019
    Date of Patent: November 15, 2022
    Assignee: Google LLC
    Inventors: Andrew M. Dai, Quoc V. Le, Hoang Trieu Trinh, Thang Minh Luong
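The objective described above — the original loss plus an unsupervised reconstruction term attached at anchor points — can be written down in a few lines. The toy "decoder" (reading the reconstruction straight off the hidden state) and the squared-error scoring are illustrative stand-ins, not the patented formulation.

```python
def main_loss(prediction, target):
    """Original supervised objective (toy squared error)."""
    return (prediction - target) ** 2

def aux_loss(states, sequence, anchor):
    """Unsupervised auxiliary loss at one anchor point: score how well
    the hidden state at the anchor reconstructs the preceding events
    (here reduced to matching their mean)."""
    recon = states[anchor]                 # stand-in decoder output
    past = sequence[:anchor + 1]
    avg_past = sum(past) / len(past)
    return (recon - avg_past) ** 2

def total_loss(prediction, target, states, sequence, anchors, weight=0.1):
    """Original objective plus weighted auxiliary losses at each anchor."""
    loss = main_loss(prediction, target)
    for a in anchors:
        loss += weight * aux_loss(states, sequence, a)
    return loss
```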
  • Patent number: 11200492
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a document classification neural network. One of the methods includes training an autoencoder neural network to autoencode input documents, wherein the autoencoder neural network comprises the one or more LSTM neural network layers and an autoencoder output layer, and wherein training the autoencoder neural network comprises determining pre-trained values of the parameters of the one or more LSTM neural network layers from initial values of the parameters of the one or more LSTM neural network layers; and training the document classification neural network on a plurality of training documents to determine trained values of the parameters of the one or more LSTM neural network layers from the pre-trained values of the parameters of the one or more LSTM neural network layers.
    Type: Grant
    Filed: January 6, 2020
    Date of Patent: December 14, 2021
    Assignee: Google LLC
    Inventors: Andrew M. Dai, Quoc V. Le
  • Patent number: 11164066
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for processing inputs using recurrent neural networks. One of the systems includes a main recurrent neural network comprising one or more recurrent neural network layers and a respective hyper recurrent neural network corresponding to each of the one or more recurrent neural network layers, wherein each hyper recurrent neural network is configured to, at each of a plurality of time steps: process the layer input at the time step to the corresponding recurrent neural network layer, the current layer hidden state of the corresponding recurrent neural network layer, and a current hypernetwork hidden state of the hyper recurrent neural network to generate an updated hypernetwork hidden state.
    Type: Grant
    Filed: September 26, 2017
    Date of Patent: November 2, 2021
    Assignee: Google LLC
    Inventors: Andrew M. Dai, Quoc V. Le, David Ha
  • Publication number: 20210125721
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for predicting future patient health using neural networks. One of the methods includes receiving electronic health record data for a patient; generating a respective observation embedding for each of the observations, comprising, for each clinical note: processing the sequence of tokens in the clinical note using a clinical note embedding LSTM to generate a respective token embedding for each of the tokens; and generating the observation embedding for the clinical note from the token embeddings; generating an embedded representation, comprising, for each time window: combining the observation embeddings of observations occurring during the time window to generate a patient record embedding; and processing the embedded representation of the electronic health record data using a prediction recurrent neural network to generate a neural network output that characterizes a future health status of the patient.
    Type: Application
    Filed: August 11, 2020
    Publication date: April 29, 2021
    Inventors: Jonas Beachey Kemp, Andrew M. Dai, Alvin Rishi Rajkomar
  • Publication number: 20210034973
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network. One of the methods includes training the neural network for one or more training steps in accordance with a current learning rate; generating a training dynamics observation characterizing the training of the trainee neural network on the one or more training steps; providing the training dynamics observation as input to a controller neural network that is configured to process the training dynamics observation to generate a controller output that defines an updated learning rate; obtaining as output from the controller neural network the controller output that defines the updated learning rate; and setting the learning rate to the updated learning rate.
    Type: Application
    Filed: July 30, 2020
    Publication date: February 4, 2021
    Inventors: Zhen Xu, Andrew M. Dai, Jonas Beachey Kemp, Luke Shekerjian Metz
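The control loop in the abstract — observe training dynamics, feed them to a controller, set the learning rate to the controller's output — can be sketched with a hand-written policy. In the patent the controller is itself a neural network processing richer training-dynamics observations; the halve-on-plateau rule below is only a hypothetical placeholder.

```python
def controller(observation):
    """Toy controller policy: if the loss stopped improving, halve the
    learning rate; otherwise keep it. A learned controller network would
    replace this rule."""
    prev_loss, curr_loss, lr = observation
    if curr_loss >= prev_loss:
        return lr * 0.5
    return lr

def train(losses, lr=0.1):
    """For each training step, build the dynamics observation, query the
    controller, and adopt the updated learning rate."""
    history = [lr]
    for prev, curr in zip(losses, losses[1:]):
        lr = controller((prev, curr, lr))
        history.append(lr)
    return history
```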
  • Patent number: 10803380
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating document vector representations. One of the methods includes obtaining a new document; selecting a plurality of new document word sets; and determining a vector representation for the new document using a trained neural network system, wherein the trained neural network system comprises: a document embedding layer and a classifier, and wherein determining the vector representation for the new document using the trained neural network system comprises iteratively providing each of the plurality of new document word sets to the trained neural network system to determine the vector representation for the new document using gradient descent.
    Type: Grant
    Filed: September 12, 2016
    Date of Patent: October 13, 2020
    Assignee: Google LLC
    Inventors: Andrew M. Dai, Quoc V. Le, Gregory Sean Corrado
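The inference procedure in the abstract — hold the trained embedding layer and classifier fixed, and gradient-descend only a fresh document vector against the new document's word sets — can be sketched as follows. The pull-toward-the-mean objective is a hypothetical simplification of the trained system's real prediction loss.

```python
import random

def infer_doc_vector(word_sets, word_vecs, dim=3, steps=50, lr=0.2, seed=0):
    """Iteratively present each word set and update only the document
    vector by gradient descent; the word vectors stay frozen. Toy
    objective: 0.5 * ||doc - mean(word set)||^2 per set."""
    rng = random.Random(seed)
    doc = [rng.uniform(-0.1, 0.1) for _ in range(dim)]   # fresh doc vector
    for _ in range(steps):
        for ws in word_sets:
            target = [sum(word_vecs[w][i] for w in ws) / len(ws)
                      for i in range(dim)]
            # Gradient of the toy objective w.r.t. doc is (doc - target).
            doc = [d - lr * (d - t) for d, t in zip(doc, target)]
    return doc
```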
  • Publication number: 20200293873
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating document vector representations. One of the methods includes obtaining a new document; selecting a plurality of new document word sets; and determining a vector representation for the new document using a trained neural network system, wherein the trained neural network system comprises: a document embedding layer and a classifier, and wherein determining the vector representation for the new document using the trained neural network system comprises iteratively providing each of the plurality of new document word sets to the trained neural network system to determine the vector representation for the new document using gradient descent.
    Type: Application
    Filed: September 12, 2016
    Publication date: September 17, 2020
    Inventors: Andrew M. Dai, Quoc V. Le, Gregory Sean Corrado
  • Patent number: 10770180
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for predicting future patient health using neural networks. One of the methods includes receiving electronic health record data for a patient; generating a respective observation embedding for each of the observations, comprising, for each clinical note: processing the sequence of tokens in the clinical note using a clinical note embedding LSTM to generate a respective token embedding for each of the tokens; and generating the observation embedding for the clinical note from the token embeddings; generating an embedded representation, comprising, for each time window: combining the observation embeddings of observations occurring during the time window to generate a patient record embedding; and processing the embedded representation of the electronic health record data using a prediction recurrent neural network to generate a neural network output that characterizes a future health status of the patient.
    Type: Grant
    Filed: December 12, 2019
    Date of Patent: September 8, 2020
    Assignee: Google LLC
    Inventors: Jonas Beachey Kemp, Andrew M. Dai, Alvin Rishi Rajkomar
  • Patent number: 10528866
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a document classification neural network. One of the methods includes training an autoencoder neural network to autoencode input documents, wherein the autoencoder neural network comprises the one or more LSTM neural network layers and an autoencoder output layer, and wherein training the autoencoder neural network comprises determining pre-trained values of the parameters of the one or more LSTM neural network layers from initial values of the parameters of the one or more LSTM neural network layers; and training the document classification neural network on a plurality of training documents to determine trained values of the parameters of the one or more LSTM neural network layers from the pre-trained values of the parameters of the one or more LSTM neural network layers.
    Type: Grant
    Filed: September 6, 2016
    Date of Patent: January 7, 2020
    Assignee: Google LLC
    Inventors: Andrew M. Dai, Quoc V. Le
  • Publication number: 20190251449
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for structuring and training a recurrent neural network. This specification describes a technique that improves the ability to capture long-term dependencies in recurrent neural networks by adding an unsupervised auxiliary loss at one or more anchor points to the original objective. This auxiliary loss forces the network to either reconstruct previous events or predict next events in a sequence, making truncated backpropagation feasible for long sequences and also improving full backpropagation through time.
    Type: Application
    Filed: February 11, 2019
    Publication date: August 15, 2019
    Inventors: Andrew M. Dai, Quoc V. Le, Hoang Trieu Trinh, Thang Minh Luong
  • Patent number: 9075792
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for decompounding compound words are disclosed. In one aspect, a method includes obtaining a token that includes a sequence of characters, identifying two or more candidate sub-words that are constituents of the token, and one or more morphological operations that are required to transform the sub-words into the token, where at least one of the morphological operations involves a use of a non-dictionary word, and determining a cost associated with each sub-word and a cost associated with each morphological operation.
    Type: Grant
    Filed: February 14, 2011
    Date of Patent: July 7, 2015
    Assignee: Google Inc.
    Inventors: Andrew M. Dai, Klaus Macherey, Franz Josef Och, Ashok C. Popat, David R. Talbot
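The cost-based decomposition described above lends itself to a dynamic-programming sketch: choose the split of the token whose sub-word costs (plus join costs) are minimal. The flat per-join cost below is a stand-in for the patent's per-operation morphological costs, and the tiny lexicon is hypothetical.

```python
def decompound(token, lexicon_costs, glue_cost=0.5):
    """Minimal-cost split of a compound token. lexicon_costs maps known
    sub-words to costs; morphological operations are reduced here to a
    flat per-join cost. Returns (total_cost, sub_words)."""
    n = len(token)
    # best[i] = (cheapest cost to cover token[:i], the sub-words used)
    best = [(0.0, [])] + [(float("inf"), None)] * n
    for i in range(1, n + 1):
        for j in range(i):
            piece = token[j:i]
            if piece in lexicon_costs and best[j][1] is not None:
                join = glue_cost if j > 0 else 0.0
                cost = best[j][0] + lexicon_costs[piece] + join
                if cost < best[i][0]:
                    best[i] = (cost, best[j][1] + [piece])
    return best[n]
```

With costs `{"foot": 1.0, "ball": 1.0, "football": 3.0}`, the two-piece split wins (1.0 + 1.0 + 0.5 < 3.0), so the token is decompounded.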
  • Publication number: 20140172853
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating tasks from user observations. One of the methods includes segmenting a plurality of observations associated with a user of a user device into a plurality of tasks previously engaged in by the user; and generating a respective task presentation for each of the plurality of tasks for presentation to the user.
    Type: Application
    Filed: December 5, 2013
    Publication date: June 19, 2014
    Applicant: Google Inc.
    Inventors: Ramanathan V. Guha, Ramakrishnan Srikant, Vineet Gupta, David Martin, Mahesh Keralapura Manjunatha, Andrew M. Dai, Carolyn Au, Elena Erbiceanu, Surabhi Gupta, Matthew D. Wytock, Carl R. Lischeske, III, Vivek Raghunathan
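The first step in the abstract — segmenting a stream of user observations into tasks — can be sketched with a simple time-gap rule: a long pause starts a new task. Real segmentation would also consider the content of the observations; the threshold and events below are illustrative.

```python
def segment_into_tasks(observations, gap_threshold=1800):
    """Group time-ordered (timestamp, event) observations into tasks,
    opening a new task whenever the gap between consecutive observations
    exceeds gap_threshold seconds."""
    tasks, current, last_ts = [], [], None
    for ts, event in sorted(observations):
        if last_ts is not None and ts - last_ts > gap_threshold:
            tasks.append(current)
            current = []
        current.append(event)
        last_ts = ts
    if current:
        tasks.append(current)
    return tasks
```

Each resulting group would then be rendered as a task presentation for the user.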
  • Publication number: 20140156623
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating tasks from user observations. One of the methods includes segmenting a plurality of observations associated with a user of a user device into a plurality of tasks previously engaged in by the user; and generating a respective task presentation for each of the plurality of tasks for presentation to the user.
    Type: Application
    Filed: December 5, 2013
    Publication date: June 5, 2014
    Applicant: Google Inc.
    Inventors: Ramanathan V. Guha, Ramakrishnan Srikant, Vineet Gupta, David Martin, Mahesh Keralapura Manjunatha, Andrew M. Dai, Carolyn Au, Elena Erbiceanu, Surabhi Gupta, Matthew D. Wytock, Carl R. Lischeske, III, Vivek Raghunathan
  • Publication number: 20110202330
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for decompounding compound words are disclosed. In one aspect, a method includes obtaining a token that includes a sequence of characters, identifying two or more candidate sub-words that are constituents of the token, and one or more morphological operations that are required to transform the sub-words into the token, where at least one of the morphological operations involves a use of a non-dictionary word, and determining a cost associated with each sub-word and a cost associated with each morphological operation.
    Type: Application
    Filed: February 14, 2011
    Publication date: August 18, 2011
Applicant: Google Inc.
    Inventors: Andrew M. Dai, Klaus Macherey, Franz Josef Och, Ashok C. Popat, David R. Talbot