Patents by Inventor Akhilesh Deepak Gotmare

Akhilesh Deepak Gotmare has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230376841
    Abstract: Embodiments described herein provide a reinforcement learning-based framework that engages pretrained language models (LMs) for program synthesis tasks. Specifically, the framework adopts a training strategy that optimizes pretrained LMs for program synthesis using an actor-critic approach.
    Type: Application
    Filed: August 26, 2022
    Publication date: November 23, 2023
    Inventors: Hung Le, Yue Wang, Akhilesh Deepak Gotmare, Chu Hong Hoi
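    The actor-critic training strategy named in the abstract can be illustrated with a toy sketch. Everything here is a hypothetical simplification, not the patented implementation: the pretrained LM is reduced to a categorical distribution over two candidate programs, `unit_test_reward` stands in for a functional-correctness signal from executing tests, and the critic is a lookup table rather than a learned model.

    ```python
    import random

    def unit_test_reward(program):
        """Stand-in correctness signal: 1.0 if the sampled program would
        pass the (hypothetical) unit test, else 0.0."""
        return 1.0 if program == "return a + b" else 0.0

    def actor_critic_step(policy, values, lr=0.1):
        # Actor: sample a candidate program from the current policy.
        programs = list(policy)
        program = random.choices(programs, weights=[policy[p] for p in programs])[0]
        # Advantage: observed reward minus the critic's baseline estimate.
        advantage = unit_test_reward(program) - values.get(program, 0.0)
        # Policy update: raise the probability of high-advantage programs.
        policy[program] = max(policy[program] + lr * advantage, 1e-6)
        total = sum(policy.values())
        for p in policy:
            policy[p] /= total
        # Critic update: move the value estimate toward the observed reward.
        values[program] = values.get(program, 0.0) + lr * advantage
        return policy, values

    random.seed(0)
    policy = {"return a + b": 0.5, "return a - b": 0.5}
    values = {}
    for _ in range(200):
        policy, values = actor_critic_step(policy, values)
    print(max(policy, key=policy.get))  # the correct program dominates
    ```

    Only the correct program ever earns a positive advantage, so its probability rises monotonically while the incorrect one is normalized down — the same feedback loop, in miniature, that the framework uses to fine-tune a pretrained LM.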
  • Publication number: 20230376840
    Abstract: Embodiments described herein provide a reinforcement learning-based framework that engages pretrained language models (LMs) for program synthesis tasks. Specifically, the framework adopts a training strategy that optimizes pretrained LMs for program synthesis using an actor-critic approach.
    Type: Application
    Filed: August 26, 2022
    Publication date: November 23, 2023
    Inventors: Hung Le, Yue Wang, Akhilesh Deepak Gotmare, Chu Hong Hoi
  • Publication number: 20230109681
    Abstract: Embodiments are directed to translating a natural language query into a code snippet in a programming language that semantically represents the query. The embodiments include a cascading neural network that includes an encoder network and a classifier network. The encoder network is faster but less accurate than the classifier network. The encoder network is trained using a contrastive learning framework to identify code candidates from a large set of code snippets. The classifier network is trained using a binary classifier to identify the code snippet that semantically represents the query from the code candidates.
    Type: Application
    Filed: January 28, 2022
    Publication date: April 13, 2023
    Inventors: Akhilesh Deepak Gotmare, Junnan Li, Chu Hong Hoi
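    The two-stage cascade in this abstract can be sketched as follows. The scoring functions are hypothetical toy stand-ins — token overlap for the contrastively trained encoder, and a crude semantic bonus for the binary classifier — chosen only to show the cheap-filter-then-expensive-rerank control flow.

    ```python
    import re

    def tokens(text):
        return set(re.findall(r"\w+", text.lower()))

    def fast_encoder_score(query, snippet):
        """Cheap proxy for the encoder's similarity: fraction of query
        tokens that also appear in the code snippet."""
        q = tokens(query)
        return len(q & tokens(snippet)) / max(len(q), 1)

    def classifier_score(query, snippet):
        """More expensive stand-in for the binary classifier that judges
        whether the snippet semantically represents the query."""
        bonus = 1.0 if "sort" in query and "sort" in snippet else 0.0
        return fast_encoder_score(query, snippet) + bonus

    def cascade_search(query, snippets, top_k=2):
        # Stage 1: the fast encoder narrows the corpus to top-k candidates.
        candidates = sorted(snippets, key=lambda s: fast_encoder_score(query, s),
                            reverse=True)[:top_k]
        # Stage 2: the slower classifier picks the single best candidate.
        return max(candidates, key=lambda s: classifier_score(query, s))

    snippets = ["def sort(xs): return sorted(xs)",
                "def read_file(path): return open(path).read()",
                "def add(a, b): return a + b"]
    print(cascade_search("sort a list", snippets))
    ```

    The design point the abstract makes survives even in this toy: the expensive model only ever scores `top_k` candidates, so the cost of accurate semantic matching no longer scales with the size of the code corpus.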
  • Publication number: 20220374595
    Abstract: Embodiments described herein provide a contrastive learning framework that leverages hard negative examples, mined globally from the entire training corpus for a given query, to improve the quality of code and natural language representations. Specifically, similar examples from the training corpus are extracted and used as hard negatives in an online manner during training while keeping the minibatch construction random.
    Type: Application
    Filed: November 19, 2021
    Publication date: November 24, 2022
    Inventors: Akhilesh Deepak Gotmare, Junnan Li, Shafiq Rayhan Joty, Chu Hong Hoi
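    The global hard-negative mining step can be sketched with plain cosine similarity over toy embedding vectors. `mine_hard_negatives` is a hypothetical helper, not the patented method: it simply ranks the whole corpus against a query embedding and returns the most similar non-positive examples, which is the role hard negatives play in the contrastive loss.

    ```python
    import math

    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        norm_u = math.sqrt(sum(a * a for a in u))
        norm_v = math.sqrt(sum(b * b for b in v))
        return dot / (norm_u * norm_v)

    def mine_hard_negatives(query_vec, corpus_vecs, positive_idx, k=1):
        """Rank the entire corpus (not just the minibatch) against the query
        and return the k most similar examples, excluding the true positive.
        These near-misses are the 'hard' negatives for the contrastive loss."""
        sims = [(i, cosine(query_vec, v))
                for i, v in enumerate(corpus_vecs) if i != positive_idx]
        sims.sort(key=lambda pair: pair[1], reverse=True)
        return [i for i, _ in sims[:k]]

    corpus = [[1.0, 0.0],   # index 0: the true positive for this query
              [0.9, 0.1],   # index 1: very similar -> a hard negative
              [0.0, 1.0]]   # index 2: dissimilar -> an easy negative
    query = [1.0, 0.05]
    print(mine_hard_negatives(query, corpus, positive_idx=0))  # [1]
    ```

    Mining against the whole corpus, rather than within a random minibatch, is what makes the negatives hard: a random batch of unrelated code rarely contains near-misses, so the loss gradient from batch-local negatives is weak.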
  • Patent number: 11481552
    Abstract: The embodiments describe a generative-discriminative (GeDi) language modeling technique for determining a next token in a text sequence. A class conditional language model and a positive control code determine a first class conditional probability for each token candidate. The class conditional language model and a negative control code determine a second class conditional probability for each token candidate. A logarithmic probability difference between the first class conditional probability and the second class conditional probability is determined for each token candidate. An unconditional language model determines an unconditional probability for each token candidate. A combined probability is determined by combining the unconditional probability and the logarithmic probability difference for each token candidate. The next token is selected from the token candidates based on the combined probabilities of the token candidates.
    Type: Grant
    Filed: September 3, 2020
    Date of Patent: October 25, 2022
    Assignee: Salesforce.com, Inc.
    Inventors: Ben Krause, Akhilesh Deepak Gotmare
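    The scoring steps in this abstract map directly onto a small arithmetic sketch. The three models are replaced here with hand-made log-probability tables over a three-token vocabulary — a minimal illustration of combining the unconditional score with the positive-minus-negative log difference, not the patented system.

    ```python
    import math

    def gedi_next_token(uncond_logprobs, pos_logprobs, neg_logprobs, weight=1.0):
        """Score each candidate by adding its unconditional log-probability to
        the weighted logarithmic probability difference between the
        positive-control and negative-control conditional models, then
        select the highest-scoring token."""
        scores = {}
        for tok in uncond_logprobs:
            log_diff = pos_logprobs[tok] - neg_logprobs[tok]
            scores[tok] = uncond_logprobs[tok] + weight * log_diff
        return max(scores, key=scores.get)

    # Toy vocabulary: the negative-control model favors "bad", so the log
    # difference steers generation away from it.
    uncond = {"good": math.log(0.5), "bad": math.log(0.3), "ok": math.log(0.2)}
    pos    = {"good": math.log(0.7), "bad": math.log(0.1), "ok": math.log(0.2)}
    neg    = {"good": math.log(0.1), "bad": math.log(0.7), "ok": math.log(0.2)}

    print(gedi_next_token(uncond, pos, neg))  # "good"
    ```

    Working in log space keeps the combination a simple addition; tokens the negative control code favors receive a large penalty even when the unconditional model rates them highly.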
  • Publication number: 20210374341
    Abstract: The embodiments describe a generative-discriminative (GeDi) language modeling technique for determining a next token in a text sequence. A class conditional language model and a positive control code determine a first class conditional probability for each token candidate. The class conditional language model and a negative control code determine a second class conditional probability for each token candidate. A logarithmic probability difference between the first class conditional probability and the second class conditional probability is determined for each token candidate. An unconditional language model determines an unconditional probability for each token candidate. A combined probability is determined by combining the unconditional probability and the logarithmic probability difference for each token candidate. The next token is selected from the token candidates based on the combined probabilities of the token candidates.
    Type: Application
    Filed: September 3, 2020
    Publication date: December 2, 2021
    Inventors: Ben Krause, Akhilesh Deepak Gotmare