Patents by Inventor Omid Mohamad Nezami

Omid Mohamad Nezami has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240062112
    Abstract: Techniques are disclosed herein for adaptive training data augmentation to facilitate training named entity recognition (NER) models. The adaptive augmentation techniques take into consideration the distribution of different entity types within the training data, generating an adaptive number of augmented examples (e.g., utterances) based on that distribution to ensure that sufficient examples for minority-class entities are generated during augmentation of the training data.
    Type: Application
    Filed: August 16, 2023
    Publication date: February 22, 2024
    Applicant: Oracle International Corporation
    Inventors: Omid Mohamad Nezami, Thanh Tien Vu, Budhaditya Saha, Shubham Pawankumar Shah
  • Publication number: 20230325599
    Abstract: Techniques are provided for augmenting training data using gazetteers and perturbations to facilitate training named entity recognition models. The training data can be augmented by generating additional utterances from original utterances in the training data and combining the generated additional utterances with the original utterances to form the augmented training data. The additional utterances can be generated by replacing the named entities in the original utterances with different named entities and/or perturbed versions of the named entities in the original utterances selected from a gazetteer. Gazetteers of named entities can be generated from the training data and expanded by searching a knowledge base and/or perturbing the named entities therein. The named entity recognition model can be trained using the augmented training data.
    Type: Application
    Filed: March 17, 2023
    Publication date: October 12, 2023
    Applicant: Oracle International Corporation
    Inventors: Omid Mohamad Nezami, Shivashankar Subramanian, Thanh Tien Vu, Tuyen Quang Pham, Budhaditya Saha, Aashna Devang Kanuga, Shubham Pawankumar Shah
  • Publication number: 20230115321
    Abstract: Techniques are provided for customizing or fine-tuning a pre-trained version of a machine-learning model that includes multiple layers and is configured to process audio or textual language input. Each of the multiple layers is configured with a plurality of layer-specific pre-trained parameter values corresponding to a plurality of parameters, and each of the multiple layers is configured to implement multi-head attention. An incomplete subset of the multiple layers is identified for which corresponding layer-specific pre-trained parameter values are to be fine-tuned using a client data set. The machine-learning model is fine-tuned using the client data set to generate an updated version of the machine-learning model, where the layer-specific pre-trained parameter values configured for each layer of one or more of the multiple layers not included in the incomplete subset are frozen during the fine-tuning. Use of the updated version of the machine-learning model is facilitated.
    Type: Application
    Filed: May 3, 2022
    Publication date: April 13, 2023
    Applicant: Oracle International Corporation
    Inventors: Thanh Tien Vu, Tuyen Quang Pham, Omid Mohamad Nezami, Mark Edward Johnson, Thanh Long Duong, Cong Duy Vu Hoang
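The adaptive augmentation idea in publication 20240062112 can be illustrated with a short sketch. This is not code from the patent: the function name, the `target_per_type` parameter, and the example label distribution are all illustrative assumptions. The sketch simply allocates more augmented examples to entity types that are under-represented in the training data.

```python
from collections import Counter

def adaptive_augmentation_counts(entity_labels, target_per_type=100):
    # Illustrative sketch (not from the patent): count how often each
    # entity type appears, then plan enough augmented examples to lift
    # minority-class types toward a target count. Majority types that
    # already meet the target receive no extra examples.
    counts = Counter(entity_labels)
    return {etype: max(0, target_per_type - n) for etype, n in counts.items()}

# A skewed label distribution: "DATE" dominates, "DRUG" is rare.
labels = ["DATE"] * 120 + ["CITY"] * 40 + ["DRUG"] * 5
plan = adaptive_augmentation_counts(labels, target_per_type=100)
print(plan)  # {'DATE': 0, 'CITY': 60, 'DRUG': 95}
```

The key property is that the number of generated examples adapts to the entity distribution rather than being a fixed multiplier per utterance.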
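The gazetteer-and-perturbation augmentation described in publication 20230325599 can likewise be sketched in a few lines. Again, this is an illustrative assumption, not the patented implementation: the function name and the single character-drop perturbation are placeholders for the gazetteer lookup and perturbation strategies the abstract describes.

```python
def augment_with_gazetteer(utterance, entity, gazetteer):
    # Illustrative sketch (not from the patent): generate additional
    # utterances by swapping the named entity for other gazetteer
    # entries, plus one crude perturbed variant of the entity itself.
    augmented = []
    for candidate in gazetteer:
        if candidate != entity:
            augmented.append(utterance.replace(entity, candidate))
    # A simple character-level perturbation (dropping the final
    # character) stands in for the perturbation strategies above.
    augmented.append(utterance.replace(entity, entity[:-1]))
    return augmented

cities = ["Sydney", "Melbourne", "Perth"]  # a toy gazetteer of CITY entities
print(augment_with_gazetteer("Book a flight to Sydney", "Sydney", cities))
```

The augmented utterances would then be combined with the originals to form the training set for the NER model.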
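The partial fine-tuning scheme in publication 20230115321 boils down to marking an incomplete subset of layers as trainable and freezing the rest. The sketch below is a minimal, framework-free illustration under assumed names (`freeze_plan`, `finetune_layers`); in a real framework the frozen layers would have their gradient updates disabled (e.g., `requires_grad = False` in PyTorch).

```python
def freeze_plan(num_layers, finetune_layers):
    # Illustrative sketch (not from the patent): map each layer index
    # to a trainable flag. Only layers in the selected incomplete
    # subset are fine-tuned on the client data set; every other layer
    # keeps its layer-specific pre-trained parameter values frozen.
    return {i: (i in finetune_layers) for i in range(num_layers)}

# Fine-tune only the top two layers of a 12-layer transformer.
plan = freeze_plan(12, {10, 11})
frozen = [i for i, trainable in plan.items() if not trainable]
print(frozen)  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Freezing most layers keeps the fine-tuned model close to the pre-trained one while drastically reducing the number of parameters updated per client.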