Patents by Inventor Nagaraj N. Bhat

Nagaraj N. Bhat has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240143934
    Abstract: A method includes accessing a document including sentences, the document being associated with a configuration flag indicating whether aspect-based sentiment analysis (ABSA), sentence-level sentiment analysis (SLSA), or both are to be performed; inputting the document into a language model that generates chunks of token embeddings for the document; and, based on the configuration flag, performing at least one of the ABSA and the SLSA by inputting the chunks of token embeddings into a multi-task model. When performing the SLSA, a part of the token embeddings in each of the chunks is masked, and the masked token embeddings do not belong to the particular sentence on which the SLSA is performed.
    Type: Application
    Filed: October 12, 2023
    Publication date: May 2, 2024
    Applicant: Oracle International Corporation
    Inventors: Poorya Zaremoodi, Duy Vu, Nagaraj N. Bhat, Srijon Sarkar, Varsha Kuppur Rajendra, Thanh Long Duong, Mark Edward Johnson, Pramir Sarkar, Shahid Reza
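    The sentence-level masking step described in the abstract can be sketched as follows. This is a minimal illustration, not the patent's actual encoding: the chunk layout, (start, end) sentence-span representation, and zero-vector masking are all assumptions made for the example.

    ```python
    def mask_chunk_for_slsa(chunk, sentence_spans, target_sentence, mask_value=0.0):
        """Mask token embeddings that fall outside the sentence under analysis.

        chunk: list of token embeddings (each a list of floats).
        sentence_spans: (start, end) token-index pairs, one pair per sentence.
        target_sentence: index of the sentence on which SLSA is performed.
        """
        start, end = sentence_spans[target_sentence]
        return [
            emb if start <= i < end else [mask_value] * len(emb)
            for i, emb in enumerate(chunk)
        ]

    # Usage: a 3-token chunk holding two sentences; only sentence 0 survives masking.
    chunk = [[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]
    spans = [(0, 1), (1, 3)]
    masked = mask_chunk_for_slsa(chunk, spans, target_sentence=0)
    # masked == [[1.0, 1.0], [0.0, 0.0], [0.0, 0.0]]
    ```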
  • Publication number: 20240135116
    Abstract: A computer-implemented method includes: accessing a plurality of datasets, where each dataset of the plurality of datasets includes training examples; selecting datasets that include the training examples in a source language and a target language; sampling, based on a sampling weight that is determined for each of the selected datasets, the training examples from the selected datasets to generate training batches; training an ML model for performing at least a first task using the training examples of the training batches, by inputting the training batches to the ML model in an interleaved manner; and outputting the trained ML model configured to perform the at least the first task on input utterances provided in at least one of the source language and the target language. The sampling weight is determined for each of the selected datasets based on one or more attributes common to the training examples of the selected dataset.
    Type: Application
    Filed: October 12, 2023
    Publication date: April 25, 2024
    Applicant: Oracle International Corporation
    Inventors: Duy Vu, Poorya Zaremoodi, Nagaraj N. Bhat, Srijon Sarkar, Varsha Kuppur Rajendra, Thanh Long Duong, Mark Edward Johnson, Pramir Sarkar, Shahid Reza
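    The weighted sampling of training batches can be sketched as below. The size-based temperature weighting (the `alpha` exponent) is an illustrative assumption standing in for the patent's attribute-based weight; the patent determines weights from attributes common to each dataset's examples.

    ```python
    import random

    def sampling_weights(datasets, alpha=0.5):
        """Temperature-scaled weights: alpha < 1 upsamples low-resource datasets.
        (Dataset size as the weighting attribute is an assumption for this sketch.)"""
        scaled = [len(d) ** alpha for d in datasets]
        total = sum(scaled)
        return [s / total for s in scaled]

    def sample_batches(datasets, batch_size, num_batches, seed=0):
        """Draw examples dataset-by-dataset according to the sampling weights."""
        rng = random.Random(seed)
        weights = sampling_weights(datasets)
        batches = []
        for _ in range(num_batches):
            batch = []
            for _ in range(batch_size):
                dataset = rng.choices(datasets, weights=weights, k=1)[0]
                batch.append(rng.choice(dataset))
            batches.append(batch)
        return batches

    # Usage: two datasets of unequal size; batches mix examples from both.
    ds = [["a1", "a2", "a3", "a4"], ["b1"]]
    batches = sample_batches(ds, batch_size=2, num_batches=3)
    ```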
  • Publication number: 20240103925
    Abstract: Techniques disclosed herein can include receiving an instruction to perform a stress test on one or more cloud computing resources of a cloud computing system. Worker nodes of the cloud computing system can be provisioned by a resource manager to perform the stress test on the cloud computing resources. The resource manager can instruct the one or more worker nodes of the cloud computing system to perform the stress test. Data generated by the worker nodes during the stress test can be received by the resource manager and used to train a projection framework comprising a trained machine learning model. The projection framework can generate a resource projection, which can be used to provision cloud computing resources to host a cloud service.
    Type: Application
    Filed: September 28, 2022
    Publication date: March 28, 2024
    Applicant: Oracle International Corporation
    Inventors: Nagaraj N. Bhat, Joydeb Mondal, Amritanshu Jain, Pramir Sarkar
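    A minimal stand-in for the projection framework: fit the (load, usage) measurements gathered by worker nodes during a stress test, then project the resources needed at an expected load. The least-squares line is an illustrative assumption for this sketch; the patent describes a trained machine learning model, not a specific model family.

    ```python
    def fit_projection(samples):
        """Least-squares line fit: usage ~= a * load + b.

        samples: list of (load, observed_resource_usage) pairs from the stress test.
        """
        n = len(samples)
        sx = sum(x for x, _ in samples)
        sy = sum(y for _, y in samples)
        sxx = sum(x * x for x, _ in samples)
        sxy = sum(x * y for x, y in samples)
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        b = (sy - a * sx) / n
        return a, b

    def project_resources(model, expected_load):
        """Use the fitted model to size capacity for an expected load."""
        a, b = model
        return a * expected_load + b

    # Usage: stress-test data where usage doubles with load.
    model = fit_projection([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])
    projected = project_resources(model, 4.0)  # ~= 8.0
    ```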
  • Publication number: 20230237277
    Abstract: Techniques for dynamically developing a contextual set of prompts based on relevant aspects extracted from a set of training data. One technique includes: obtaining training data comprising text examples and associated labels; extracting aspects from the training data; generating prompting templates based on the training data and the extracted aspects; concatenating each of the text examples with the respective generated prompting template to create prompting functions; and training a machine learning language model on the prompting functions to predict a solution for a task, where the training is formulated as a masked language modeling problem with blanks of the prompting templates being set as text labels and expected output for the task being set as specified solution labels, and the training learns or updates model parameters of the machine learning language model for performing the task. The machine learning language model is provided with the learned or updated model parameters.
    Type: Application
    Filed: January 25, 2022
    Publication date: July 27, 2023
    Applicant: Oracle International Corporation
    Inventors: Shahid Reza, Nagaraj N. Bhat
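    The concatenation of a text example with a prompting template can be sketched as below. The template wording, the `[MASK]` token, and the helper names are illustrative assumptions, not the patent's actual templates.

    ```python
    def build_prompt(example, aspect, template="{text} The {aspect} was [MASK]."):
        """Concatenate a text example with an aspect-aware prompting template."""
        return template.format(text=example, aspect=aspect)

    def training_pair(example, aspect, label):
        """Masked-LM training pair: the blank ([MASK]) should resolve to the label."""
        return build_prompt(example, aspect), label

    # Usage: one labeled example becomes a prompting function plus its MLM target.
    prompt, target = training_pair("The battery lasts long.", "battery", "great")
    # prompt == "The battery lasts long. The battery was [MASK]."
    ```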
  • Publication number: 20230098783
    Abstract: Techniques are disclosed herein for focused training of language models and end-to-end hypertuning of the framework. In one aspect, a method is provided that includes obtaining a machine learning model pre-trained for language modeling, and post-training the machine learning model for various tasks to generate a focused machine learning model. The post-training includes: (i) training the machine learning model on an unlabeled set of training data pertaining to a task that the machine learning model was pre-trained for as part of the language modeling, where the unlabeled set of training data is obtained with respect to a target domain, a target task, or a target language, and (ii) training the machine learning model on a labeled set of training data that pertains to another task that is an auxiliary task related to a downstream task to be performed using the machine learning model or output from the machine learning model.
    Type: Application
    Filed: September 23, 2022
    Publication date: March 30, 2023
    Applicant: Oracle International Corporation
    Inventors: Poorya Zaremoodi, Cong Duy Vu Hoang, Duy Vu, Dai Hoang Tran, Budhaditya Saha, Nagaraj N. Bhat, Thanh Tien Vu, Tuyen Quang Pham, Adam Craig Pocock, Katherine Silverstein, Srinivasa Phani Kumar Gadde, Vishal Vishnoi, Mark Edward Johnson, Thanh Long Duong
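    The two post-training stages can be sketched as an orchestration loop. The function names and the step callables are assumptions for this sketch; the patent does not prescribe this interface, only the two-stage structure (unlabeled in-domain language modeling, then a labeled auxiliary task).

    ```python
    def post_train(model, unlabeled_target_data, labeled_auxiliary_data,
                   mlm_step, supervised_step):
        """Two-stage focusing of a pre-trained language model.

        Stage (i): continue language modeling on unlabeled target-domain text.
        Stage (ii): train on a labeled auxiliary task related to the downstream task.
        """
        for text in unlabeled_target_data:
            model = mlm_step(model, text)
        for text, label in labeled_auxiliary_data:
            model = supervised_step(model, text, label)
        return model

    # Usage: stub steps that just record the order in which data is consumed.
    log = []
    final = post_train(
        "base-model",
        ["doc1", "doc2"],
        [("example", "positive")],
        mlm_step=lambda m, t: (log.append(("mlm", t)) or m),
        supervised_step=lambda m, t, y: (log.append(("sup", t, y)) or m),
    )
    # log == [("mlm", "doc1"), ("mlm", "doc2"), ("sup", "example", "positive")]
    ```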