Patents by Inventor Subhabrata Mukherjee

Subhabrata Mukherjee has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240046087
    Abstract: This document relates to training of machine learning models. One example method involves providing a machine learning model having a first classification layer, a second classification layer, and an encoder that feeds into the first classification layer and the second classification layer. The example method also involves obtaining first training examples having explicit labels and second training examples having inferred labels. The inferred labels are based at least on actions associated with the second training examples. The example method also involves training the machine learning model using the first training examples and the second training examples using a training objective that considers first training loss of the first classification layer for the explicit labels and second training loss of the second classification layer for the inferred labels. The method also involves outputting a trained machine learning model having the encoder and the first classification layer.
    Type: Application
    Filed: October 4, 2023
    Publication date: February 8, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Subhabrata Mukherjee, Guoqing Zheng, Ahmed Awadalla, Milad Shokouhi, Susan Theresa Dumais, Kai Shu
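The two-headed training objective in the abstract above can be sketched in a few lines. This toy NumPy version is an illustration only: the layer sizes, the mixing weight `alpha`, and all variable names are assumptions, and the patent's actual architecture and loss weighting are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    return -np.log(probs[np.arange(len(labels)), labels] + 1e-12).mean()

# Shared encoder feeding two classification heads (hypothetical sizes).
W_enc = rng.normal(0, 0.1, (8, 4))    # input dim 8 -> hidden dim 4
W_head1 = rng.normal(0, 0.1, (4, 3))  # head for explicit labels (3 classes)
W_head2 = rng.normal(0, 0.1, (4, 3))  # head for inferred, action-derived labels

# Toy batches: x1 carries explicit labels, x2 carries inferred labels.
x1, y1 = rng.normal(size=(5, 8)), rng.integers(0, 3, 5)
x2, y2 = rng.normal(size=(5, 8)), rng.integers(0, 3, 5)

def joint_loss(alpha=0.5):
    # One encoder feeds both classification layers.
    h1, h2 = x1 @ W_enc, x2 @ W_enc
    loss1 = cross_entropy(softmax(h1 @ W_head1), y1)  # explicit-label loss
    loss2 = cross_entropy(softmax(h2 @ W_head2), y2)  # inferred-label loss
    return alpha * loss1 + (1 - alpha) * loss2

loss = joint_loss()
```

Per the abstract's final step, only the encoder and the first classification layer would be kept in the output model; the second head exists only to inject the weak, action-derived supervision during training.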
  • Publication number: 20230419019
    Abstract: A method for training a machine learning model with parallel annotations of source instances and while facilitating security of the source instances can be performed by a system that generates a coupled machine learning model from (1) a first machine learning model trained on a first set of training data comprising unannotated natural language and (2) a second machine learning model trained on populated target templates which are populated with a plurality of vocabulary words. Once formed, the coupled machine learning model is configured to transform unannotated natural language into annotated machine-readable text.
    Type: Application
    Filed: September 8, 2023
    Publication date: December 28, 2023
    Inventors: Hany Mohamed Hassan Awadalla, Subhabrata Mukherjee, Ahmed Awadallah
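The template-population step described in the abstract above can be illustrated roughly as follows. The templates, slot names, and vocabulary are invented for the example, and the coupled machine learning model itself is reduced to a simple lookup stand-in; this shows only the data-construction idea, not the patented training procedure.

```python
# Hypothetical target templates: each pairs a natural-language pattern
# with a machine-readable annotation pattern sharing the same slot.
TEMPLATES = [
    ("set an alarm for {t}", "set_alarm(time='{t}')"),
    ("play {s}", "play_song(title='{s}')"),
]
VOCAB = {"t": ["7 am", "noon"], "s": ["Clair de Lune"]}

def populate(templates, vocab):
    """Fill template slots with vocabulary words, yielding parallel
    (natural language, annotated) pairs without exposing any real
    source instances."""
    pairs = []
    for nl, mr in templates:
        slot = nl[nl.index("{") + 1 : nl.index("}")]
        for word in vocab[slot]:
            pairs.append((nl.replace("{" + slot + "}", word),
                          mr.replace("{" + slot + "}", word)))
    return pairs

def annotate(text, pairs):
    """Stand-in for the coupled model: map unannotated natural language
    to annotated machine-readable text seen in the populated pairs."""
    return dict(pairs).get(text)

pairs = populate(TEMPLATES, VOCAB)
print(annotate("set an alarm for noon", pairs))  # → set_alarm(time='noon')
```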
  • Patent number: 11816566
    Abstract: This document relates to training of machine learning models. One example method involves providing a machine learning model having a first classification layer, a second classification layer, and an encoder that feeds into the first classification layer and the second classification layer. The example method also involves obtaining first training examples having explicit labels and second training examples having inferred labels. The inferred labels are based at least on actions associated with the second training examples. The example method also involves training the machine learning model using the first training examples and the second training examples using a training objective that considers first training loss of the first classification layer for the explicit labels and second training loss of the second classification layer for the inferred labels. The method also involves outputting a trained machine learning model having the encoder and the first classification layer.
    Type: Grant
    Filed: May 18, 2020
    Date of Patent: November 14, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Subhabrata Mukherjee, Guoqing Zheng, Ahmed Awadalla, Milad Shokouhi, Susan Theresa Dumais, Kai Shu
  • Patent number: 11797755
    Abstract: A method for training a machine learning model with parallel annotations of source instances and while facilitating security of the source instances can be performed by a system that generates a coupled machine learning model from (1) a first machine learning model trained on a first set of training data comprising unannotated natural language and (2) a second machine learning model trained on populated target templates which are populated with a plurality of vocabulary words. Once formed, the coupled machine learning model is configured to transform unannotated natural language into annotated machine-readable text.
    Type: Grant
    Filed: August 13, 2020
    Date of Patent: October 24, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hany Mohamed Hassan Awadalla, Subhabrata Mukherjee, Ahmed Awadallah
  • Publication number: 20230214629
    Abstract: Generally discussed herein are devices, systems, and methods for improving architecture search and identification with constraints. A method can include receiving, at a compute device, a request for a transformer-based autoregressive language model (TBALM), the request specifying a maximum latency; identifying TBALM architectures that satisfy the maximum latency; identifying, among the identified TBALM architectures, the one that has the greatest number of decoder parameters, resulting in an identified TBALM architecture; and providing the identified TBALM architecture.
    Type: Application
    Filed: December 30, 2021
    Publication date: July 6, 2023
    Inventors: Debadeepta Dey, Shital Rajnikant Shah, Gustavo Henrique De Rosa, Caio César Teodoro Mendes, Sebastien Bubeck, Tomasz Lukasz Religa, Saurabh Vasant Naik, Yan He, Subhabrata Mukherjee, Mojan Javaheripi
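The constrained search in the abstract above reduces to a filter-then-argmax over candidate architectures: keep those within the latency budget, then return the one with the most decoder parameters. In this hypothetical sketch, the candidate pool, latencies, and parameter counts are made up.

```python
from dataclasses import dataclass

@dataclass
class TBALMArch:
    name: str
    latency_ms: float    # measured or estimated inference latency
    decoder_params: int  # number of decoder parameters

def select_arch(candidates, max_latency_ms):
    """Keep architectures that satisfy the latency constraint, then
    return the one with the greatest number of decoder parameters."""
    feasible = [a for a in candidates if a.latency_ms <= max_latency_ms]
    if not feasible:
        return None
    return max(feasible, key=lambda a: a.decoder_params)

candidates = [
    TBALMArch("small", latency_ms=4.0, decoder_params=30_000_000),
    TBALMArch("medium", latency_ms=9.5, decoder_params=120_000_000),
    TBALMArch("large", latency_ms=22.0, decoder_params=500_000_000),
]
best = select_arch(candidates, max_latency_ms=10.0)
print(best.name)  # → medium
```

The design choice here mirrors the claim's ordering: latency is a hard constraint, while decoder parameter count serves as the quality proxy to maximize within that constraint.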
  • Publication number: 20220050955
    Abstract: A method for training a machine learning model with parallel annotations of source instances and while facilitating security of the source instances can be performed by a system that generates a coupled machine learning model from (1) a first machine learning model trained on a first set of training data comprising unannotated natural language and (2) a second machine learning model trained on populated target templates which are populated with a plurality of vocabulary words. Once formed, the coupled machine learning model is configured to transform unannotated natural language into annotated machine-readable text.
    Type: Application
    Filed: August 13, 2020
    Publication date: February 17, 2022
    Inventors: Hany Mohamed Hassan Awadalla, Subhabrata Mukherjee, Ahmed Awadallah
  • Publication number: 20210357747
    Abstract: This document relates to training of machine learning models. One example method involves providing a machine learning model having a first classification layer, a second classification layer, and an encoder that feeds into the first classification layer and the second classification layer. The example method also involves obtaining first training examples having explicit labels and second training examples having inferred labels. The inferred labels are based at least on actions associated with the second training examples. The example method also involves training the machine learning model using the first training examples and the second training examples using a training objective that considers first training loss of the first classification layer for the explicit labels and second training loss of the second classification layer for the inferred labels. The method also involves outputting a trained machine learning model having the encoder and the first classification layer.
    Type: Application
    Filed: May 18, 2020
    Publication date: November 18, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Subhabrata Mukherjee, Guoqing Zheng, Ahmed Awadalla, Milad Shokouhi, Susan Theresa Dumais, Kai Shu
  • Patent number: 11151055
    Abstract: Systems and methods of tracking page state changes are provided. An input/output (I/O) device is communicatively coupled to a host having a memory. The I/O device receives a command from the host to monitor page state changes in a first region of the memory allocated to a process. The I/O device, bypassing a CPU of the host, modifies data stored in the first region based on a request, for example, received from a client device via a computer network. The I/O device records the modification in a bitmap by setting a bit that corresponds to the location of the data in the memory. The I/O device transfers the contents of the bitmap to the CPU, and the CPU completes a live migration by copying sections of the first region indicated by the bitmap to a second region of memory. In some implementations, the process can be a virtual machine, a user space application, or a container.
    Type: Grant
    Filed: August 26, 2019
    Date of Patent: October 19, 2021
    Assignee: Google LLC
    Inventors: Shrijeet Mukherjee, Prashant Chandra, David Alan Dillow, Joseph Raymond Michael Zbiciak, Horacio Andres Lagar Cavilla
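The dirty-page bitmap mechanism in the abstract above can be illustrated with a toy in-process model. Real implementations operate over DMA and host page tables; the class, page size, and offsets below are assumptions made for the example.

```python
PAGE_SIZE = 4096

class PageTracker:
    """Toy model of the device-side dirty-page bitmap: one bit per page
    of the monitored region. Device writes set bits; the host copies
    only the marked pages and clears the bitmap."""

    def __init__(self, region_bytes):
        self.memory = bytearray(region_bytes)
        self.bitmap = [0] * (region_bytes // PAGE_SIZE)

    def dma_write(self, offset, data):
        # Device-side write; in the real design this bypasses the host CPU.
        self.memory[offset:offset + len(data)] = data
        first = offset // PAGE_SIZE
        last = (offset + len(data) - 1) // PAGE_SIZE
        for page in range(first, last + 1):
            self.bitmap[page] = 1  # record which pages were modified

    def migrate_dirty(self, dest):
        # Host-side step: copy only the pages indicated by the bitmap.
        for page, dirty in enumerate(self.bitmap):
            if dirty:
                start = page * PAGE_SIZE
                dest[start:start + PAGE_SIZE] = self.memory[start:start + PAGE_SIZE]
        self.bitmap = [0] * len(self.bitmap)

tracker = PageTracker(8 * PAGE_SIZE)
tracker.dma_write(5000, b"hello")  # dirties only the page containing offset 5000
dest = bytearray(8 * PAGE_SIZE)
tracker.migrate_dirty(dest)
print(sum(tracker.bitmap))  # → 0
```

The point of the bitmap is that the host never has to scan or diff the whole region: the device already knows exactly which pages it touched, so the migration copy is proportional to the write set, not the region size.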
  • Publication number: 20200356493
    Abstract: Systems and methods of tracking page state changes are provided. An input/output (I/O) device is communicatively coupled to a host having a memory. The I/O device receives a command from the host to monitor page state changes in a first region of the memory allocated to a process. The I/O device, bypassing a CPU of the host, modifies data stored in the first region based on a request, for example, received from a client device via a computer network. The I/O device records the modification in a bitmap by setting a bit that corresponds to the location of the data in the memory. The I/O device transfers the contents of the bitmap to the CPU, and the CPU completes a live migration by copying sections of the first region indicated by the bitmap to a second region of memory. In some implementations, the process can be a virtual machine, a user space application, or a container.
    Type: Application
    Filed: August 26, 2019
    Publication date: November 12, 2020
    Inventors: Shrijeet Mukherjee, Prashant Chandra, David Alan Dillow, Joseph Raymond Michael Zbiciak, Horacio Andres Lagar Cavilla