Patents by Inventor Subhabrata Mukherjee
Subhabrata Mukherjee has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240046087

Abstract: This document relates to training of machine learning models. One example method involves providing a machine learning model having a first classification layer, a second classification layer, and an encoder that feeds into the first classification layer and the second classification layer. The example method also involves obtaining first training examples having explicit labels and second training examples having inferred labels. The inferred labels are based at least on actions associated with the second training examples. The example method also involves training the machine learning model using the first training examples and the second training examples using a training objective that considers first training loss of the first classification layer for the explicit labels and second training loss of the second classification layer for the inferred labels. The method also involves outputting a trained machine learning model having the encoder and the first classification layer.

Type: Application
Filed: October 4, 2023
Publication date: February 8, 2024
Applicant: Microsoft Technology Licensing, LLC
Inventors: Subhabrata Mukherjee, Guoqing Zheng, Ahmed Awadalla, Milad Shokouhi, Susan Theresa Dumais, Kai Shu
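The objective described above can be sketched in miniature: a shared encoder feeds two heads, and the total loss combines the explicit-label loss of the first head with the inferred-label loss of the second. Everything here (the toy encoder, the sigmoid heads, the weighting factor `lam`) is an illustrative assumption, not the patent's actual formulation.

```python
import math

def encoder(x):
    # Stand-in shared encoder: a fixed feature map shared by both heads.
    return [x, x * x]

def head(features, weights):
    # Linear classification head followed by a sigmoid (binary case).
    z = sum(f * w for f, w in zip(features, weights))
    return 1.0 / (1.0 + math.exp(-z))

def cross_entropy(p, y):
    # Binary cross-entropy for a predicted probability p and label y in {0, 1}.
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def joint_loss(explicit_batch, inferred_batch, w_explicit, w_inferred, lam=0.5):
    """Training objective = explicit-head loss + lam * inferred-head loss."""
    l1 = sum(cross_entropy(head(encoder(x), w_explicit), y)
             for x, y in explicit_batch) / len(explicit_batch)
    l2 = sum(cross_entropy(head(encoder(x), w_inferred), y)
             for x, y in inferred_batch) / len(inferred_batch)
    return l1 + lam * l2
```

After training against such a joint objective, only the encoder and the first (explicit-label) head would be kept for the output model, matching the abstract's final step.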
-
Publication number: 20230419019

Abstract: A method for training a machine learning model with parallel annotations of source instances and while facilitating security of the source instances can be performed by a system that generates a coupled machine learning model from (1) a first machine learning model trained on a first set of training data comprising unannotated natural language and (2) a second machine learning model trained on populated target templates which are populated with a plurality of vocabulary words. Once formed, the coupled machine learning model is configured to transform unannotated natural language into annotated machine-readable text.

Type: Application
Filed: September 8, 2023
Publication date: December 28, 2023
Inventors: Hany Mohamed Hassan Awadalla, Subhabrata Mukherjee, Ahmed Awadallah
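The template side of the idea can be illustrated with a toy sketch: slotted templates are populated from a slot vocabulary to produce training sentences, and the coupled system then tags tokens of raw text with slot labels. The template syntax, slot names, and tagging scheme below are invented for illustration only.

```python
from itertools import product

# Hypothetical slot vocabulary used to populate target templates.
SLOT_VOCAB = {"PERSON": {"alice", "bob"}, "TIME": {"noon", "5pm"}}

def populate(template, slots):
    """Fill a template's slots with every vocabulary combination,
    yielding the surface sentences a template-side model could train on."""
    for values in product(*(sorted(SLOT_VOCAB[s]) for s in slots)):
        filled = template
        for slot, value in zip(slots, values):
            filled = filled.replace("{" + slot + "}", value)
        yield filled

def annotate(sentence):
    """Map unannotated text to (token, label) pairs: a token gets a slot
    label when it appears in that slot's vocabulary, else 'O'."""
    return [(tok, next((s for s, words in SLOT_VOCAB.items() if tok in words), "O"))
            for tok in sentence.lower().split()]
```

For example, `annotate("meet Alice at noon")` tags `alice` as `PERSON` and `noon` as `TIME` without any hand-annotation of the input sentence.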
-
Patent number: 11816566

Abstract: This document relates to training of machine learning models. One example method involves providing a machine learning model having a first classification layer, a second classification layer, and an encoder that feeds into the first classification layer and the second classification layer. The example method also involves obtaining first training examples having explicit labels and second training examples having inferred labels. The inferred labels are based at least on actions associated with the second training examples. The example method also involves training the machine learning model using the first training examples and the second training examples using a training objective that considers first training loss of the first classification layer for the explicit labels and second training loss of the second classification layer for the inferred labels. The method also involves outputting a trained machine learning model having the encoder and the first classification layer.

Type: Grant
Filed: May 18, 2020
Date of Patent: November 14, 2023
Assignee: Microsoft Technology Licensing, LLC
Inventors: Subhabrata Mukherjee, Guoqing Zheng, Ahmed Awadalla, Milad Shokouhi, Susan Theresa Dumais, Kai Shu
-
Patent number: 11797755

Abstract: A method for training a machine learning model with parallel annotations of source instances and while facilitating security of the source instances can be performed by a system that generates a coupled machine learning model from (1) a first machine learning model trained on a first set of training data comprising unannotated natural language and (2) a second machine learning model trained on populated target templates which are populated with a plurality of vocabulary words. Once formed, the coupled machine learning model is configured to transform unannotated natural language into annotated machine-readable text.

Type: Grant
Filed: August 13, 2020
Date of Patent: October 24, 2023
Assignee: Microsoft Technology Licensing, LLC
Inventors: Hany Mohamed Hassan Awadalla, Subhabrata Mukherjee, Ahmed Awadallah
-
Publication number: 20230214629

Abstract: Generally discussed herein are devices, systems, and methods for improving architecture search and identification with constraints. A method can include receiving, at a compute device, a request for a transformer-based autoregressive language model (TBALM), the request specifying a maximum latency; identifying TBALM architectures that satisfy the maximum latency; identifying, among those, the TBALM architecture that has the greatest number of decoder parameters; and providing the identified TBALM architecture.

Type: Application
Filed: December 30, 2021
Publication date: July 6, 2023
Inventors: Debadeepta Dey, Shital Rajnikant Shah, Gustavo Henrique De Rosa, Caio César Teodoro Mendes, Sebastien Bubeck, Tomasz Lukasz Religa, Saurabh Vasant Naik, Yan He, Subhabrata Mukherjee, Mojan Javaheripi
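The selection step described in the abstract reduces to a constrained maximization: filter candidate architectures by the latency budget, then pick the survivor with the most decoder parameters. The candidate record format below is a hypothetical stand-in for whatever the real search space produces.

```python
def select_architecture(candidates, max_latency_ms):
    """Return the candidate with the most decoder parameters among those
    meeting the latency constraint, or None if no candidate qualifies.

    Each candidate is assumed to be a dict with 'latency_ms' and
    'decoder_params' keys (an illustrative schema, not the patent's)."""
    feasible = [c for c in candidates if c["latency_ms"] <= max_latency_ms]
    if not feasible:
        return None
    return max(feasible, key=lambda c: c["decoder_params"])
```

The design choice mirrors the abstract: latency is a hard constraint, while parameter count acts as a proxy objective for model quality among feasible architectures.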
-
Publication number: 20220050955

Abstract: A method for training a machine learning model with parallel annotations of source instances and while facilitating security of the source instances can be performed by a system that generates a coupled machine learning model from (1) a first machine learning model trained on a first set of training data comprising unannotated natural language and (2) a second machine learning model trained on populated target templates which are populated with a plurality of vocabulary words. Once formed, the coupled machine learning model is configured to transform unannotated natural language into annotated machine-readable text.

Type: Application
Filed: August 13, 2020
Publication date: February 17, 2022
Inventors: Hany Mohamed Hassan Awadalla, Subhabrata Mukherjee, Ahmed Awadallah
-
Publication number: 20210357747

Abstract: This document relates to training of machine learning models. One example method involves providing a machine learning model having a first classification layer, a second classification layer, and an encoder that feeds into the first classification layer and the second classification layer. The example method also involves obtaining first training examples having explicit labels and second training examples having inferred labels. The inferred labels are based at least on actions associated with the second training examples. The example method also involves training the machine learning model using the first training examples and the second training examples using a training objective that considers first training loss of the first classification layer for the explicit labels and second training loss of the second classification layer for the inferred labels. The method also involves outputting a trained machine learning model having the encoder and the first classification layer.

Type: Application
Filed: May 18, 2020
Publication date: November 18, 2021
Applicant: Microsoft Technology Licensing, LLC
Inventors: Subhabrata Mukherjee, Guoqing Zheng, Ahmed Awadalla, Milad Shokouhi, Susan Theresa Dumais, Kai Shu
-
Patent number: 11151055

Abstract: Systems and methods of tracking page state changes are provided. An input/output (I/O) device is communicatively coupled to a host having a memory. The I/O device receives a command from the host to monitor page state changes in a region of the memory allocated to a process. The I/O device, bypassing a CPU of the host, modifies data stored in the region based on a request, for example, received from a client device via a computer network. The I/O device records the modification to a bitmap by setting a bit in the bitmap that corresponds to a location of the data in the memory. The I/O device transfers contents of the bitmap to the CPU, and the CPU completes a live migration by copying the sections of the region indicated by the bitmap to a second region of memory. In some implementations, the process can be a virtual machine, a user space application, or a container.

Type: Grant
Filed: August 26, 2019
Date of Patent: October 19, 2021
Assignee: Google LLC
Inventors: Shrijeet Subhabrata Mukherjee, Prashant Chandra, David Alan Dillow, Joseph Raymond Michael Zbiciak, Horacio Andres Lagar Cavilla
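The dirty-page bitmap at the heart of this abstract can be sketched in a few lines: each bit covers one page of the tracked region, a write sets the bit for the page it touched, and the host later copies only the pages whose bits are set. The 4 KiB page size and the class layout are illustrative assumptions, not details from the patent.

```python
PAGE_SIZE = 4096  # illustrative page size, one bit per page

class DirtyBitmap:
    def __init__(self, region_bytes):
        # One bit per page of the monitored memory region.
        self.num_pages = (region_bytes + PAGE_SIZE - 1) // PAGE_SIZE
        self.bits = 0

    def record_write(self, offset):
        """Mark the page containing byte `offset` as modified
        (what the I/O device does on each write it performs)."""
        self.bits |= 1 << (offset // PAGE_SIZE)

    def dirty_pages(self):
        """Page indices the host must copy to the destination region
        after the bitmap is transferred to the CPU."""
        return [i for i in range(self.num_pages) if (self.bits >> i) & 1]
```

Copying only the dirty pages is what makes the scheme useful for live migration: the bulk of the region is transferred once, and the bitmap identifies the small set of pages that changed while the I/O device kept writing.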
-
Publication number: 20200356493

Abstract: Systems and methods of tracking page state changes are provided. An input/output (I/O) device is communicatively coupled to a host having a memory. The I/O device receives a command from the host to monitor page state changes in a region of the memory allocated to a process. The I/O device, bypassing a CPU of the host, modifies data stored in the region based on a request, for example, received from a client device via a computer network. The I/O device records the modification to a bitmap by setting a bit in the bitmap that corresponds to a location of the data in the memory. The I/O device transfers contents of the bitmap to the CPU, and the CPU completes a live migration by copying the sections of the region indicated by the bitmap to a second region of memory. In some implementations, the process can be a virtual machine, a user space application, or a container.

Type: Application
Filed: August 26, 2019
Publication date: November 12, 2020
Inventors: Shrijeet Subhabrata Mukherjee, Prashant Chandra, David Alan Dillow, Joseph Raymond Michael Zbiciak, Horacio Andres Lagar Cavilla