Patents by Inventor Brian David Lester

Brian David Lester has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12154552
    Abstract: A natural language understanding (NLU) system generates in-place annotations for natural language utterances or other types of time-based media based on stand-off annotations. The in-place annotations are associated with particular sub-sequences of an utterance, providing richer information than stand-off annotations, which are associated only with the utterance as a whole. To generate the in-place annotations for an utterance, the NLU system applies an encoder network and a decoder network to obtain attention weights for the various tokens within the utterance. The NLU system disqualifies tokens of the utterance based on their corresponding attention weights, and selects the highest-scoring contiguous sequences of tokens between the disqualified tokens. In-place annotations are associated with the selected sequences.
    Type: Grant
    Filed: August 31, 2021
    Date of Patent: November 26, 2024
    Assignee: Interactions LLC
    Inventors: Brian David Lester, Srinivas Bangalore
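The span-selection step described in the abstract can be illustrated with a minimal sketch. The function name, the attention values, and the threshold below are all hypothetical; real attention weights would come from the encoder-decoder networks the abstract mentions.

```python
import numpy as np

def select_annotation_span(tokens, attention, threshold=0.1):
    """Pick the highest-scoring contiguous run of tokens whose attention
    weights all clear `threshold` (a hypothetical cutoff).  Tokens below
    the threshold are "disqualified" and act as span boundaries."""
    attention = np.asarray(attention, dtype=float)
    best_score, best_span = -1.0, None
    start = None
    for i, w in enumerate(attention):
        if w < threshold:                    # disqualified token: close any open span
            if start is not None:
                score = attention[start:i].sum()
                if score > best_score:
                    best_score, best_span = score, (start, i)
                start = None
        elif start is None:
            start = i
    if start is not None:                    # a span running to the end of the utterance
        score = attention[start:].sum()
        if score > best_score:
            best_score, best_span = score, (start, len(tokens))
    if best_span is None:
        return []
    return tokens[best_span[0]:best_span[1]]

# Toy utterance with made-up attention weights:
tokens = ["book", "a", "flight", "to", "boston", "tomorrow"]
weights = [0.05, 0.02, 0.08, 0.15, 0.60, 0.10]
print(select_annotation_span(tokens, weights))  # → ['to', 'boston', 'tomorrow']
```

An in-place annotation (e.g. a destination label) would then be attached to the returned sub-sequence rather than to the utterance as a whole.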
  • Publication number: 20240378196
    Abstract: Systems and methods for prompt tuning can leverage semantic searching to determine similar prompts to use for retraining. A prompt can be generated and then used as a search query to find the similar prompts. Data related to the similar prompts can then be utilized for prompt tuning. Moreover, systems and methods for prompt tuning can generate and utilize a meta-prompt to reduce the computational cost of generating prompts. The prompt tuning techniques can be implemented as part of a prompt tuning application programming interface (API).
    Type: Application
    Filed: August 20, 2021
    Publication date: November 14, 2024
    Inventors: Brian David Lester, Rami Eid Sammour Al-Rfou, Noah JG Constant
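The semantic-search step can be sketched as nearest-neighbor retrieval over prompt embeddings. The library contents, task names, and embedding vectors below are invented for illustration; the patent application does not specify a particular similarity measure, so cosine similarity is assumed here.

```python
import numpy as np

def find_similar_prompts(query_embedding, prompt_library, k=2):
    """Return the k prompts whose embeddings are most cosine-similar to the
    query prompt's embedding.  `prompt_library` maps prompt names
    (hypothetical) to embedding vectors."""
    q = np.asarray(query_embedding, dtype=float)
    q = q / np.linalg.norm(q)
    scored = []
    for name, emb in prompt_library.items():
        e = np.asarray(emb, dtype=float)
        scored.append((float(q @ (e / np.linalg.norm(e))), name))
    scored.sort(reverse=True)                # highest similarity first
    return [name for _, name in scored[:k]]

# Toy prompt library with made-up 3-d embeddings:
library = {
    "sentiment": [0.9, 0.1, 0.0],
    "translation": [0.0, 1.0, 0.1],
    "summarization": [0.1, 0.9, 0.2],
}
print(find_similar_prompts([0.0, 1.0, 0.1], library, k=2))  # → ['translation', 'summarization']
```

Data associated with the retrieved prompts (e.g. their training examples or the prompts themselves) would then feed into tuning the new prompt.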
  • Publication number: 20240020546
    Abstract: Systems and methods for prompt tuning can utilize previously learned prompts to initialize tuning of prompts for new tasks, which may differ from the task associated with the previously learned prompt. The prompt utilized for initialization can be a generic prompt and/or a prompt selected based on a determined similarity between two or more task embeddings.
    Type: Application
    Filed: July 13, 2022
    Publication date: January 18, 2024
    Inventors: Tu Thanh Vu, Daniel Matthew Cer, Noah Constant, Brian David Lester, Rami Al-Rfou
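The initialization-by-similarity idea can be sketched as follows. Everything here is illustrative: the task names, the shape of the soft prompts, and the use of cosine similarity over task embeddings are assumptions, not details from the application.

```python
import numpy as np

def init_prompt(target_task_embedding, learned_prompts, task_embeddings):
    """Initialize a new soft prompt by copying the learned prompt of the
    source task whose task embedding is most cosine-similar to the target
    task's embedding.  All names and structures are illustrative."""
    t = np.asarray(target_task_embedding, dtype=float)
    t = t / np.linalg.norm(t)
    best_task, best_score = None, -np.inf
    for task, emb in task_embeddings.items():
        e = np.asarray(emb, dtype=float)
        score = float(t @ (e / np.linalg.norm(e)))
        if score > best_score:
            best_task, best_score = task, score
    # Copy so later tuning does not overwrite the source task's prompt.
    return best_task, learned_prompts[best_task].copy()

# Toy setup: two source tasks with already-tuned prompts of shape (2 tokens, 4 dims).
learned = {"nli": np.ones((2, 4)), "qa": np.zeros((2, 4))}
tasks = {"nli": [1.0, 0.0], "qa": [0.0, 1.0]}
source, prompt0 = init_prompt([0.9, 0.2], learned, tasks)
print(source)  # → nli
```

Tuning would then proceed from `prompt0` rather than from a random initialization.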
  • Publication number: 20230325725
    Abstract: Systems and methods for natural language processing can leverage trained prompts to condition a large pre-trained machine-learned model to generate an output for a specific task. For example, a subset of parameters may be trained for the particular task and then input, together with a set of input data, into the pre-trained machine-learned model to generate the task-specific output. During the training of the prompt, the parameters of the pre-trained machine-learned model can be frozen, which can reduce the computational resources used during training while still leveraging the previously learned data from the pre-trained machine-learned model.
    Type: Application
    Filed: April 12, 2022
    Publication date: October 12, 2023
    Inventors: Brian David Lester, Rami Al-Rfou, Noah Constant
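The frozen-model aspect can be demonstrated with a deliberately tiny stand-in model. The "pre-trained model" below is just a fixed linear readout over mean-pooled token embeddings, and the finite-difference optimizer is purely for illustration; the point is only that the prompt parameters update while the model weights `W` never do.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # toy embedding dimension

# Frozen "pre-trained model": mean-pool the token sequence, then apply a
# fixed linear readout W.  W is never updated during prompt tuning.
W = rng.normal(size=(d,))

def model(prompt_tokens, input_tokens):
    seq = np.concatenate([prompt_tokens, input_tokens], axis=0)
    return float(seq.mean(axis=0) @ W)

# Only the soft prompt (2 virtual tokens) is trainable.  Tune it with
# simple finite-difference gradient descent toward a target output.
prompt = np.zeros((2, d))
x = rng.normal(size=(3, d))          # a fixed toy input sequence
target, lr, eps = 1.0, 0.5, 1e-5

def loss(p):
    return (model(p, x) - target) ** 2

for _ in range(200):
    grad = np.zeros_like(prompt)
    base = loss(prompt)
    for idx in np.ndindex(*prompt.shape):
        bumped = prompt.copy()
        bumped[idx] += eps
        grad[idx] = (loss(bumped) - base) / eps
    prompt -= lr * grad

print(loss(prompt))                  # small after tuning; W was never touched
```

Because only the prompt's 2×4 parameters are optimized, the per-step cost scales with the prompt size rather than with the (here trivially small, in practice enormous) frozen model.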