Patents by Inventor Prafulla Kumar Choubey

Prafulla Kumar Choubey has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240070394
    Abstract: Embodiments described herein provide a mechanism that ensembles trainable soft prompts to transfer knowledge from source tasks under few-shot learning settings. Specifically, a set of soft prompts may be trained on a large-scale source task training dataset using a frozen pre-trained language model (PLM). The soft prompts are then prepended to a target task input, and for each prompt the frozen PLM generates a set of logits for predicting the classification of the target task input. An attention module generates input-logit attention scores, which are used to compute a weighted linear combination of the logits. This weighted linear combination gives the final logits used to predict the classification of the target task input (see the sketch after this entry).
    Type: Application
    Filed: January 27, 2023
    Publication date: February 29, 2024
    Inventors: Xiangyu Peng, Chen Xing, Prafulla Kumar Choubey, Chien-Sheng Wu
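
The logit-weighting step described in the abstract above can be illustrated with a short sketch. This is a minimal NumPy example, assuming K source-task soft prompts and C target classes; `per_prompt_logits` and `attention_scores` are placeholder arrays standing in for the frozen PLM's outputs and the attention module's scores, not outputs of the patented system.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    z = x - np.max(x)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical setup: K source-task soft prompts, C target classes.
K, C = 4, 3
rng = np.random.default_rng(0)

# Placeholder logits from the frozen PLM for one target input,
# one row per prepended source-task soft prompt.
per_prompt_logits = rng.normal(size=(K, C))

# Placeholder input-logit attention scores from the attention module.
attention_scores = rng.normal(size=K)

# Normalize the attention scores and take the weighted linear
# combination of the per-prompt logits to obtain the final logits.
weights = softmax(attention_scores)          # shape (K,)
final_logits = weights @ per_prompt_logits   # shape (C,)

predicted_class = int(np.argmax(final_logits))
print(weights, final_logits, predicted_class)
```
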
  • Publication number: 20230419017
    Abstract: Embodiments described herein provide a method for text summarization. The method includes receiving a training dataset having at least an uncompressed text, a compressed text, and one or more information entities accompanying the compressed text. The method also includes generating, using a perturber model, a perturbed text with the one or more information entities being inserted into the compressed text. The method further includes training the perturber model based on a first training objective, and generating, using the trained perturber model, a perturbed summary in response to an input of a reference summary. The method further includes generating, via an editor model, a predicted summary by removing information from the perturbed summary conditioned on a source document of the reference summary, and training the editor model based on a second training objective.
    Type: Application
    Filed: October 6, 2022
    Publication date: December 28, 2023
    Inventors: Alexander R. Fabbri, Prafulla Kumar Choubey, Jesse Vig, Chien-Sheng Wu, Caiming Xiong
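
The perturb-then-edit data flow in the abstract above can be sketched with toy stand-ins. In this minimal Python example, `perturber` and `editor` are trivial string functions that only mimic the roles of the trained sequence-to-sequence models (insert extra entities, then remove information not supported by the source); the example texts and the token-overlap heuristic are assumptions for illustration.

```python
# Toy stand-ins for the perturber and editor models; the real system uses
# trained sequence-to-sequence models, and these string functions only
# mimic the data flow described in the abstract.

def perturber(compressed_text: str, entities: list[str]) -> str:
    """Stand-in perturber: insert the accompanying information entities into the compressed text."""
    return compressed_text + " " + " ".join(entities)

def editor(perturbed_summary: str, source_document: str) -> str:
    """Stand-in editor: keep only tokens that appear in the source document."""
    supported = [tok for tok in perturbed_summary.split() if tok in source_document]
    return " ".join(supported)

# Hypothetical example texts.
source = "the model was trained on news articles collected in 2021"
reference_summary = "the model was trained on news articles"
entities = ["in", "1999"]  # extra entities accompanying the compressed text

perturbed = perturber(reference_summary, entities)   # adds unsupported detail
predicted = editor(perturbed, source)                # drops what the source does not support
print(perturbed)
print(predicted)
```
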
  • Publication number: 20230376677
    Abstract: Embodiments described herein provide a document summarization framework that employs an ensemble of summarization models, each of which is a modified version of a base summarization model to control hallucination. For example, a base summarization model may first be trained on a full training data set. The trained base summarization model is then fine-tuned using a first filtered subset of the training data which contains noisy data, resulting in an “anti-expert” model. The parameters of the anti-expert model are subtracted from the parameters of the trained base model to produce a final summarization model which yields robust factual performance.
    Type: Application
    Filed: August 3, 2022
    Publication date: November 23, 2023
    Inventors: Prafulla Kumar Choubey, Alexander R. Fabbri, Jesse Vig, Chien-Sheng Wu, Wenhao Liu, Nazneen Rajani
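
The parameter-subtraction step in the abstract above amounts to simple arithmetic over model weights. The sketch below uses small NumPy arrays as stand-ins for the state dictionaries of the trained base model and the anti-expert model; the scaling factor `alpha` is an assumption added for illustration and is not taken from the patent.

```python
import numpy as np

def subtract_anti_expert(base_params, anti_expert_params, alpha=1.0):
    """Subtract (optionally scaled) anti-expert weights from the base model's weights."""
    return {name: base_params[name] - alpha * anti_expert_params[name]
            for name in base_params}

# Placeholder "state dicts" with a single weight tensor each.
base = {"layer.weight": np.array([[0.50, -0.20], [0.10, 0.30]])}
anti_expert = {"layer.weight": np.array([[0.10, 0.05], [-0.02, 0.08]])}

final = subtract_anti_expert(base, anti_expert, alpha=0.5)
print(final["layer.weight"])
```
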
  • Publication number: 20230334245
    Abstract: Embodiments described herein provide a Conformal Predictor (CP) that reduces the number of likely target class labels. Specifically, the CP provides a model-agnostic framework to generate a label set, instead of a single label prediction, within a pre-defined error rate. The CP employs a fast base classifier, which may be used to filter out unlikely labels from the target label set, thus restricting the number of probable target class labels while ensuring that the candidate class label set meets the pre-defined error rate (see the sketch after this entry).
    Type: Application
    Filed: August 16, 2022
    Publication date: October 19, 2023
    Inventors: Prafulla Kumar Choubey, Yu Bai, Nazneen Rajani, Wenhao Liu
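
A minimal sketch of the idea in the abstract above, using split conformal prediction with a top-k pre-filter: calibrate a score threshold at the target error rate, then keep only labels that both survive the fast filter and fall under the threshold. The random probabilities, the choice of nonconformity score, and the parameters `alpha` and `top_k` are assumptions; the same probabilities stand in for both the fast base classifier and the main model to keep the example short.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cal, n_classes = 200, 10
alpha, top_k = 0.1, 5   # target error rate; number of labels kept by the fast pre-filter

# Calibration: nonconformity score = 1 - predicted probability of the true label.
cal_probs = rng.dirichlet(np.ones(n_classes), size=n_cal)
cal_labels = rng.integers(0, n_classes, size=n_cal)
cal_scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Conformal threshold at level (1 - alpha) with the finite-sample correction.
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
q_hat = np.quantile(cal_scores, q_level)

# Prediction: keep only the fast classifier's top-k labels, then apply the
# conformal rule to those candidates to form the label set.
test_probs = rng.dirichlet(np.ones(n_classes))
fast_candidates = np.argsort(test_probs)[-top_k:]   # stand-in for the fast base classifier
label_set = [int(c) for c in fast_candidates if 1.0 - test_probs[c] <= q_hat]
print(sorted(label_set))
```
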
  • Publication number: 20230119109
    Abstract: Embodiments described herein provide a document summarization framework that controls different factual errors, referred to as the “Mixture of Factual Experts” (MoFE) framework. MoFE applies an ensemble of factual expert models to control hallucination in summarization systems. Each factual expert model is trained to generate summaries with a unique type of factual quality. Factual consistency metrics may be used to filter training data in order to adjust the training inputs for each respective expert. The overall factual quality of MoFE may be achieved by controlling the relative weight of each factual expert. The experts may be ensembled (either through logit ensembling or a weighted average of parameters) to create a combined output that shares characteristics from each expert according to its relative weight (see the sketch after this entry).
    Type: Application
    Filed: January 27, 2022
    Publication date: April 20, 2023
    Inventors: Prafulla Kumar Choubey, Nazneen Rajani
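
The abstract above mentions two ways to ensemble the factual experts: combining their logits or averaging their parameters, each weighted by the expert's relative weight. The NumPy sketch below shows both operations on placeholder arrays; the weights and values are illustrative, not taken from the patent.

```python
import numpy as np

expert_weights = np.array([0.5, 0.3, 0.2])   # relative weight of each factual expert

# (a) Logit ensembling: weighted combination of per-expert logits
#     for the same decoding step (placeholder values).
expert_logits = np.array([[2.0, 0.5, -1.0],
                          [1.5, 0.8, -0.5],
                          [2.2, 0.1, -1.2]])  # shape (n_experts, vocab_slice)
ensembled_logits = expert_weights @ expert_logits

# (b) Parameter averaging: weighted average of per-expert weight tensors
#     of the same shape (placeholder values).
expert_params = [np.full((2, 2), v) for v in (0.10, 0.20, 0.30)]
averaged_params = sum(w * p for w, p in zip(expert_weights, expert_params))

print(ensembled_logits)
print(averaged_params)
```
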
  • Publication number: 20230083512
    Abstract: Embodiments described herein provide a system and method for extracting factual information. The system transforms a query into a natural language prompt in the format of a query subject and a queried relation. The system encodes, via an embedding layer of a pre-trained language model, the natural language prompt into a first embedding. The system then encodes, via an adapter model, the first embedding into a second embedding, based on the probability that the second embedding returns the factual information when fed to the first attention layer of the pre-trained language model. The system decodes, via the first attention layer of the pre-trained language model, the second embedding into a response to the query, and extracts the factual information from the decoded response (see the sketch after this entry).
    Type: Application
    Filed: January 28, 2022
    Publication date: March 16, 2023
    Inventors: Benjamin Newman, Nazneen Rajani, Prafulla Kumar Choubey
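
The adapter step in the abstract above rewrites the prompt embedding before it reaches the frozen model's first attention layer. The sketch below uses a small residual two-layer MLP as a stand-in adapter; the architecture, dimensions, and random parameters are assumptions for illustration (in the described system the adapter would be trained so that the rewritten embedding yields the queried fact).

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model, d_hidden = 6, 16, 32

# Output of the frozen PLM's embedding layer for the prompt (placeholder values).
first_embedding = rng.normal(size=(seq_len, d_model))

# Hypothetical adapter parameters; these would be learned in practice.
W1 = rng.normal(scale=0.1, size=(d_model, d_hidden))
W2 = rng.normal(scale=0.1, size=(d_hidden, d_model))

def adapter(x):
    """Residual two-layer MLP: x -> x + relu(x @ W1) @ W2 (an assumed form)."""
    return x + np.maximum(x @ W1, 0.0) @ W2

# The second embedding is what would be fed to the first attention layer.
second_embedding = adapter(first_embedding)
print(second_embedding.shape)
```
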