Patents by Inventor Julian Eisenschlos

Julian Eisenschlos has filed for patents covering the following inventions. The listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO). The three entries below belong to one line of filings and share a verbatim abstract; two short, hedged code sketches of the technique that abstract describes follow the listing.

  • Publication number: 20240086436
    Abstract: Systems and methods for pre-training and fine-tuning neural-network-based language models to reason directly over tables without generating logical forms. In some examples, a language model can be pre-trained using masked-language-modeling tasks synthetically generated from tables pulled from a knowledge corpus. In some examples, the language model may be further pre-trained using pairs of counterfactual statements generated from those tables, and/or one or more statements that compare selected data from those tables. The language model may then be fine-tuned using examples that include only a question, an answer, and a table, allowing fine-tuning examples to be harvested directly from existing benchmark datasets or generated synthetically.
    Type: Application
    Filed: November 20, 2023
    Publication date: March 14, 2024
    Inventors: Thomas Müller, Jonathan Herzig, Pawel Nowak, Julian Eisenschlos, Francesco Piccinno, Syrine Krichene
  • Patent number: 11868381
    Abstract: Identical to the abstract of publication number 20240086436 above.
    Type: Grant
    Filed: March 29, 2021
    Date of Patent: January 9, 2024
    Assignee: Google LLC
    Inventors: Thomas Müller, Jonathan Herzig, Pawel Nowak, Julian Eisenschlos, Francesco Piccinno, Syrine Krichene
  • Publication number: 20220309087
    Abstract: Identical to the abstract of publication number 20240086436 above.
    Type: Application
    Filed: March 29, 2021
    Publication date: September 29, 2022
    Inventors: Thomas Müller, Jonathan Herzig, Pawel Nowak, Julian Eisenschlos, Francesco Piccinno, Syrine Krichene
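
The shared abstract describes three kinds of synthetic pre-training data built from tables: masked-language-modeling statements, counterfactual true/false statement pairs, and statements comparing selected cells. The Python sketch below illustrates one plausible way such data could be generated; the templates, function names, and sampling scheme are hypothetical illustrations, not the procedure actually claimed in these filings.

    import random
    import pandas as pd

    def synth_mlm_example(table: pd.DataFrame) -> str:
        # Verbalize one random cell, then mask its value so a model must
        # recover it from the accompanying table (illustrative template).
        row = table.sample(n=1).iloc[0]
        key_col, val_col = table.columns[0], random.choice(list(table.columns[1:]))
        statement = f"The {val_col} of {row[key_col]} is {row[val_col]}."
        return statement.replace(str(row[val_col]), "[MASK]", 1)

    def synth_counterfactual_pair(table: pd.DataFrame) -> tuple:
        # Build a (true, false) pair by swapping in the value from another
        # row, yielding a counterfactual statement the table contradicts.
        rows = table.sample(n=2)
        key_col, val_col = table.columns[0], random.choice(list(table.columns[1:]))
        entity = rows.iloc[0][key_col]
        true_stmt = f"The {val_col} of {entity} is {rows.iloc[0][val_col]}."
        false_stmt = f"The {val_col} of {entity} is {rows.iloc[1][val_col]}."
        return true_stmt, false_stmt

    def synth_comparison(table: pd.DataFrame, num_col: str) -> str:
        # Compare one numeric column across two sampled rows.
        rows = table.sample(n=2)
        a, b = rows.iloc[0], rows.iloc[1]
        rel = "greater" if float(a[num_col]) > float(b[num_col]) else "smaller"
        return (f"The {num_col} of {a[table.columns[0]]} is {rel} "
                f"than that of {b[table.columns[0]]}.")

    table = pd.DataFrame({
        "City": ["Paris", "Berlin", "Madrid"],
        "Population": ["2161000", "3645000", "3223000"],
    })
    print(synth_mlm_example(table))
    print(synth_counterfactual_pair(table))
    print(synth_comparison(table, "Population"))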
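
On the fine-tuning side, the abstract's key point is that each example consists only of a question, an answer, and a table, and that the model answers by selecting (and optionally aggregating) cells rather than emitting a logical form. The public TAPAS checkpoints in the Hugging Face transformers library follow this table-direct design; treating them as representative of the method claimed in these filings is an assumption. A minimal inference sketch:

    import pandas as pd
    from transformers import TapasTokenizer, TapasForQuestionAnswering

    # Public checkpoint fine-tuned for table QA on WikiTableQuestions.
    model_name = "google/tapas-base-finetuned-wtq"
    tokenizer = TapasTokenizer.from_pretrained(model_name)
    model = TapasForQuestionAnswering.from_pretrained(model_name)

    # The model consumes the table directly; all cells must be strings.
    table = pd.DataFrame({
        "City": ["Paris", "Berlin", "Madrid"],
        "Population": ["2161000", "3645000", "3223000"],
    })
    queries = ["Which city has the largest population?"]

    inputs = tokenizer(table=table, queries=queries,
                       padding="max_length", return_tensors="pt")
    outputs = model(**inputs)

    # Map cell-selection and aggregation logits back to table coordinates;
    # the answer comes straight out of the table, with no generated program.
    coords, agg_indices = tokenizer.convert_logits_to_predictions(
        inputs, outputs.logits.detach(), outputs.logits_aggregation.detach()
    )
    print([table.iat[row, col] for row, col in coords[0]], agg_indices)

For WTQ-style checkpoints, the predicted aggregation index indicates whether the selected cells should be returned as-is or combined (e.g., summed, counted, or averaged) before answering.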