Patents by Inventor Le HOU

Le HOU has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250094838
    Abstract: An example technique for image analysis is provided. An example image analysis method includes obtaining an instructive sequence descriptive of an instructive query, an instructive response, and an instructive trace of intermediate states from the instructive query to the instructive response. The example image analysis method includes inputting, to a machine-learned model, the instructive sequence and an operative image processing query that comprises image data, wherein the machine-learned model is configured to process the operative query with attention over the instructive sequence. The example method can include generating, using the machine-learned model and responsive to the operative query, an operative image processing response that comprises an analysis of the image data.
    Type: Application
    Filed: December 3, 2024
    Publication date: March 20, 2025
    Inventors: Jason Weng Wei, Dengyong Zhou, Xuezhi Wang, Dale Eric Schuurmans, Quoc V. Le, Maarten Paul Bosma, Ed Huai-Hsin Chi, Olivier Jean André Bousquet, Le Hou, Charles Aloysius Sutton, Nathanael Martin Schärli, Nathan Kemp Sekiguchi Scales, Augustus Quadrozzi Odena, Sharan Ajit Narang, Guy Gur-Ari Krakover, Aakanksha Chowdhery, David Martin Dohan, Aitor Lewkowycz, Jacob Austin, Henryk Michalewski, David Luan, David J. Bieber, Anders Johan Andreassen, Maxwell Isaac Nye
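
A minimal sketch of the prompting pattern this entry describes: a worked "instructive sequence" (query, trace of intermediate states, response) is concatenated with an operative image-processing query so the model can attend over the example while answering. The prompt format, the image handle, and the commented-out model.generate call are illustrative assumptions, not the claimed implementation.

```python
# Hypothetical sketch: build an instructive sequence (query, intermediate
# trace, response) and append an operative image-processing query, so the
# model attends over the worked example when answering the new query.

def build_prompt(instructive, operative_query, image_ref):
    """Concatenate one worked example with a new image query."""
    parts = [
        f"Query: {instructive['query']}",
        f"Trace: {instructive['trace']}",        # intermediate states
        f"Response: {instructive['response']}",
        f"Query: {operative_query}",
        f"[image: {image_ref}]",                 # stand-in for the image data
        "Trace:",                                # model continues from here
    ]
    return "\n".join(parts)

instructive_example = {
    "query": "How many cells in the attached image are dividing?",
    "trace": "Locate candidate cells; check each for mitotic figures; count the matches.",
    "response": "Three cells are dividing.",
}

prompt = build_prompt(
    instructive_example,
    "Which regions of this slide show dense staining?",
    image_ref="slide_0421.png",                  # hypothetical image handle
)
print(prompt)
# operative_response = model.generate(prompt, images=["slide_0421.png"])  # hypothetical API
```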
  • Publication number: 20240256965
    Abstract: An example method for training a machine-learned sequence processing model includes obtaining a plurality of training examples for training the machine-learned sequence processing model. For each respective training example of the plurality of training examples, the example method includes: obtaining a respective query associated with the respective training example; inputting the respective query to the machine-learned sequence processing model; obtaining, from the machine-learned sequence processing model, a response to the respective query and a trace of intermediate states from the respective query to the response; evaluating the response using a ground truth response associated with the respective training example; evaluating the trace using a ground truth trace associated with the respective training example; and updating one or more parameters of the machine-learned sequence processing model based on the evaluation of the response and based on the evaluation of the trace.
    Type: Application
    Filed: January 26, 2024
    Publication date: August 1, 2024
    Inventors: Hyung Won Chung, Barret Zoph, Dengyong Zhou, Liam Fedus, Shayne Longpre, Le Hou, Yi Tay, Jason Weng Wei, Siddhartha Brahma, Quoc V. Le
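
The entry above supervises both the final response and the trace of intermediate states against their respective ground truths. Below is a small NumPy sketch of how such a combined objective could look; the split into trace and response tokens, the weighting factor, and the toy negative log-likelihood are assumptions for illustration only.

```python
import numpy as np

def token_nll(logits, targets):
    """Mean negative log-likelihood of target tokens under softmax(logits)."""
    logits = logits - logits.max(axis=-1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=-1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

def trace_and_response_loss(logits, targets, trace_len, trace_weight=1.0):
    """Evaluate the trace and the response against their ground truths and
    combine the two losses for the parameter update."""
    trace_loss = token_nll(logits[:trace_len], targets[:trace_len])
    response_loss = token_nll(logits[trace_len:], targets[trace_len:])
    return response_loss + trace_weight * trace_loss

# Toy example: 6 trace tokens followed by 2 response tokens, vocabulary of 10.
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 10))
targets = rng.integers(0, 10, size=8)
print(trace_and_response_loss(logits, targets, trace_len=6))
```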
  • Publication number: 20230394328
    Abstract: Example embodiments of aspects of the present disclosure provide an example computer-implemented method for improved prompting of a machine-learned model. The example method can include obtaining an instructive sequence descriptive of an instructive query, an instructive response, and an instructive trace of intermediate states from the instructive query to the instructive response. The example method can include inputting, to a machine-learned model, the instructive sequence and an operative query, wherein the machine-learned model is configured to process the operative query with attention over the instructive sequence. The example method can include generating, using the machine-learned model and responsive to the operative query, an operative response.
    Type: Application
    Filed: August 5, 2022
    Publication date: December 7, 2023
    Inventors: Jason Weng Wei, Dengyong Zhou, Dale Eric Schuurmans, Quoc V. Le, Maarten Paul Bosma, Ed Huai-Hsin Chi, Olivier Jean André Bousquet, Le Hou, Nathan Kemp Sekiguchi Scales, David J. Bieber, Charles Aloysius Sutton, Nathanael Martin Schärli, Augustus Quadrozzi Odena, Sharan Ajit Narang, Guy Gur-Ari Krakover, Aakanksha Chowdhery, Aitor Lewkowycz, Jiageng Luan, David Martin Dohan, Henryk Michalewski, Jacob Austin, Anders Johan Andreassen, Maxwell Isaac Nye, Xuezhi Wang
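
This entry is the text-only counterpart of the image-analysis application listed first, and the same pattern can be sketched with plain strings: an instructive query, its trace of intermediate states, and its response are placed before the operative query in a single input. The exemplar content and the commented-out model.generate call are illustrative only.

```python
# Hypothetical sketch of an instructive sequence for a text-only model:
# one exemplar (query, trace of intermediate states, response), then the
# operative query the model is expected to answer.

INSTRUCTIVE_SEQUENCE = (
    "Q: A shop has 3 boxes of 12 pens and sells 9 pens. How many remain?\n"
    "Trace: 3 boxes * 12 pens = 36 pens; 36 - 9 = 27.\n"
    "A: 27\n"
)

def make_prompt(operative_query):
    # The model processes the operative query with attention over the
    # instructive sequence simply because both appear in one input.
    return INSTRUCTIVE_SEQUENCE + f"Q: {operative_query}\nTrace:"

prompt = make_prompt("A train travels 60 km/h for 2.5 hours. How far does it go?")
print(prompt)
# operative_response = model.generate(prompt)   # hypothetical model call
```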
  • Publication number: 20230244938
    Abstract: An example method for pretraining a machine-learned model is provided. The example method includes obtaining a plurality of different combinations of configuration parameters of a pretraining objective framework. The example method includes generating, using the pretraining objective framework, a plurality of corrupted training examples from one or more training examples, wherein the plurality of corrupted training examples are respectively generated according to the plurality of different combinations. The example method includes inputting the plurality of corrupted training examples into the machine-learned model, wherein the machine-learned model is configured to generate uncorrupted subportions corresponding to corrupted subportions of the corrupted training examples. The example method includes obtaining, from the machine-learned model, a plurality of outputs respectively generated by the machine-learned model based on the plurality of corrupted training examples.
    Type: Application
    Filed: January 27, 2023
    Publication date: August 3, 2023
    Inventors: Jason Weng Wei, Dengyong Zhou, Xuezhi Wang, Dale Eric Schuurmans, Quoc V. Le, Maarten Paul Bosma, Ed Huai-Hsin Chi, Olivier Jean André Bousquet, Le Hou, Charles Aloysius Sutton, Nathanael Martin Schärli, Nathan Kemp Sekiguchi Scales, Augustus Quadrozzi Odena, Sharan Ajit Narang, Guy Gur-Ari Krakover, Aakanksha Chowdhery, David Martin Dohan, Aitor Lewkowycz, Henryk Michalewski, Jiageng Luan, David J. Bieber, Jacob Austin, Anders Johan Andreassen, Maxwell Isaac Nye, Yi Tay, Mostafa Dehghani
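
The pretraining framework above generates corrupted training examples under several different combinations of configuration parameters and trains the model to regenerate the uncorrupted sub-portions. A rough sketch of the corruption step alone follows; the span-corruption scheme, sentinel tokens, and parameter values are chosen purely for illustration and are not the claimed framework.

```python
import random

def corrupt(tokens, corruption_rate, mean_span_len, rng):
    """Mask contiguous spans; return the corrupted input and the targets
    (the uncorrupted sub-portions the model must generate)."""
    tokens = list(tokens)
    n_to_mask = max(1, int(len(tokens) * corruption_rate))
    corrupted, targets, i, sentinel = [], [], 0, 0
    while i < len(tokens):
        if n_to_mask > 0 and rng.random() < corruption_rate:
            span = min(max(1, int(rng.expovariate(1 / mean_span_len))), n_to_mask)
            targets.append((f"<extra_{sentinel}>", tokens[i:i + span]))
            corrupted.append(f"<extra_{sentinel}>")
            sentinel += 1
            n_to_mask -= span
            i += span
        else:
            corrupted.append(tokens[i])
            i += 1
    return corrupted, targets

# A few different combinations of configuration parameters (illustrative only):
# (corruption rate, mean span length).
combinations = [(0.15, 3), (0.15, 8), (0.5, 3)]
example = "the quick brown fox jumps over the lazy dog".split()
rng = random.Random(0)
for rate, span in combinations:
    print(corrupt(example, corruption_rate=rate, mean_span_len=span, rng=rng))
```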
  • Publication number: 20220108221
    Abstract: Systems and methods of the present disclosure are directed to a computer-implemented method. The method can include obtaining a machine-learned model comprising a plurality of model units, wherein each model unit comprises a plurality of parameters that are tied to a shared plurality of parameters. The method can include performing a first plurality of training iterations with the machine-learned model to adjust parameters of the shared plurality of parameters. The method can include detecting, based on the first plurality of training iterations, an occurrence of an untying condition. The method can include untying the parameters of one or more model units from the shared plurality of parameters. The method can include performing a second plurality of training iterations with the machine-learned model to adjust parameters of the one or more model units independent of the shared plurality of parameters.
    Type: Application
    Filed: October 4, 2021
    Publication date: April 7, 2022
    Inventors: Dengyong Zhou, Xiaodan Song, Shuo Yang, Qiang Liu, Le Hou
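
A small NumPy sketch of the tying-then-untying idea in this entry: several model units first reference one shared parameter array; once an untying condition is detected (assumed here to be reaching a fixed iteration count), each unit receives its own copy and is updated independently. The dummy_grad update is a placeholder for real back-propagated gradients.

```python
import numpy as np

rng = np.random.default_rng(0)
shared = rng.normal(size=(4, 4))          # shared plurality of parameters
units = [shared for _ in range(3)]        # every model unit ties to the same array

def dummy_grad(params):
    # Placeholder for a real gradient computed by back-propagation.
    return 0.01 * params

# Phase 1: training iterations adjust the shared parameters; because every
# unit references the same array, all tied units change together.
for _ in range(100):
    for params in units:
        params -= dummy_grad(params)      # in-place update of the shared array

# Untying condition detected (assumed: fixed iteration count reached).
units = [shared.copy() for _ in range(len(units))]   # each unit now owns its parameters

# Phase 2: training iterations adjust each unit independently of the shared set.
for _ in range(100):
    for i, params in enumerate(units):
        params -= (i + 1) * dummy_grad(params)       # units now diverge

print([float(u.mean()) for u in units])
```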
  • Patent number: 11164312
    Abstract: A system for quantifying the density level of tumor-infiltrating lymphocytes (TIL), based on prediction of reconstructed TIL information associated with tumoral tissue image data during pathology analysis, is disclosed. The system receives digitized diagnostic and stained whole-slide image data related to tissue of a particular tumor type. Regions of interest are defined that represent a portion of, or the full image of, the whole-slide image data. The image data is encoded into segmented data portions based on convolutional autoencoding of objects associated with the collection of image data. The density of tumor-infiltrating lymphocytes is determined for bounded segmented data portions for respective classification of the regions of interest. A classification label is assigned to each region of interest, and it is determined whether an assigned classification label is above a pre-determined threshold probability value of lymphocyte infiltration.
    Type: Grant
    Filed: November 30, 2018
    Date of Patent: November 2, 2021
    Assignees: The Research Foundation for the State University of New York, Board of Regents, The University of Texas System, Institute for Systems Biology
    Inventors: Joel Haskin Saltz, Tahsin Kurc, Rajarsi Gupta, Tianhao Zhao, Rebecca Batiste, Le Hou, Vu Nguyen, Dimitrios Samaras, Arvind Rao, John Van Arnam, Pankaj Singh, Alexander Lazar, Ashish Sharma, Ilya Shmulevich, Vesteinn Thorsson
  • Publication number: 20200388029
    Abstract: A system for quantifying the density level of tumor-infiltrating lymphocytes (TIL), based on prediction of reconstructed TIL information associated with tumoral tissue image data during pathology analysis, is disclosed. The system receives digitized diagnostic and stained whole-slide image data related to tissue of a particular tumor type. Regions of interest are defined that represent a portion of, or the full image of, the whole-slide image data. The image data is encoded into segmented data portions based on convolutional autoencoding of objects associated with the collection of image data. The density of tumor-infiltrating lymphocytes is determined for bounded segmented data portions for respective classification of the regions of interest. A classification label is assigned to each region of interest, and it is determined whether an assigned classification label is above a pre-determined threshold probability value of lymphocyte infiltration.
    Type: Application
    Filed: November 30, 2018
    Publication date: December 10, 2020
    Inventors: Joel Haskin SALTZ, Tahsin KURC, Rajarsi GUPTA, Tianhao ZHAO, Rebecca BATISTE, Le HOU, Vu NGUYEN, Dimitrios SAMARAS, Arvind RAO, John VAN ARNAM, Pankaj SINGH, Alexander LAZAR, Ashish SHARMA, Ilya SHMULEVICH, Vesteinn THORSSON
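
The last two entries describe the same TIL-quantification system (the granted patent and its earlier application). A minimal NumPy sketch of only the final classification step: each region of interest gets a label from a predicted probability of lymphocyte infiltration, compared against a pre-determined threshold, and the overall density is reported. The probabilities below are random placeholders rather than the output of the described convolutional autoencoder, and the threshold value is an assumption.

```python
import numpy as np

THRESHOLD = 0.5   # pre-determined threshold probability value (assumed)

def label_regions(probabilities, threshold=THRESHOLD):
    """Assign a classification label to each region of interest and report
    TIL density, i.e. the fraction of regions labeled lymphocyte-infiltrated."""
    labels = np.where(probabilities >= threshold, "TIL-positive", "TIL-negative")
    density = float((probabilities >= threshold).mean())
    return labels, density

# Placeholder probabilities for a 4x4 grid of regions of a whole-slide image.
rng = np.random.default_rng(0)
probs = rng.uniform(size=(4, 4))
labels, density = label_regions(probs)
print(labels)
print(f"TIL density: {density:.2f}")
```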