Patents by Inventor Jana N. Thompson

Jana N. Thompson is named as an inventor on the filings listed below. The listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO). A brief illustrative sketch of the annotation-handling workflow described in the abstracts follows the listing.

  • Publication number: 20200184146
    Abstract: Methods, apparatuses, and computer-readable media are presented for generating a natural language model. A method for generating a natural language model comprises: receiving more than one annotation of a document; calculating a level of agreement among the received annotations; determining that a criterion among a first criterion, a second criterion, and a third criterion is satisfied based at least in part on the level of agreement; determining an aggregated annotation representing an aggregation of information in the received annotations and training a natural language model using the aggregated annotation, when the first criterion is satisfied; generating at least one human readable prompt configured to receive additional annotations of the document, when the second criterion is satisfied; and discarding the received annotations from use in training the natural language model, when the third criterion is satisfied.
    Type: Application
    Filed: January 22, 2020
    Publication date: June 11, 2020
    Inventors: Robert J. Munro, Christopher Walker, Sarah K. Luger, Brendan D. Callahan, Gary C. King, Paul A. Tepper, Jana N. Thompson, Tyler J. Schnoebelen, Jason Brenier, Jessica D. Long
  • Publication number: 20190311024
    Abstract: Methods, apparatuses, and computer-readable media are presented for generating a natural language model. A method for generating a natural language model comprises: receiving more than one annotation of a document; calculating a level of agreement among the received annotations; determining that a criterion among a first criterion, a second criterion, and a third criterion is satisfied based at least in part on the level of agreement; determining an aggregated annotation representing an aggregation of information in the received annotations and training a natural language model using the aggregated annotation, when the first criterion is satisfied; generating at least one human readable prompt configured to receive additional annotations of the document, when the second criterion is satisfied; and discarding the received annotations from use in training the natural language model, when the third criterion is satisfied.
    Type: Application
    Filed: November 9, 2018
    Publication date: October 10, 2019
    Applicant: AIPARC HOLDINGS PTE. LTD.
    Inventors: Robert J. Munro, Christopher Walker, Sarah K. Luger, Brendan D. Callahan, Gary C. King, Paul A. Tepper, Jana N. Thompson, Tyler J. Schnoebelen, Jason Brenier, Jessica D. Long
  • Publication number: 20160162464
    Abstract: Methods, apparatuses, and computer-readable media are presented for generating a natural language model. A method for generating a natural language model comprises: receiving more than one annotation of a document; calculating a level of agreement among the received annotations; determining that a criterion among a first criterion, a second criterion, and a third criterion is satisfied based at least in part on the level of agreement; determining an aggregated annotation representing an aggregation of information in the received annotations and training a natural language model using the aggregated annotation, when the first criterion is satisfied; generating at least one human readable prompt configured to receive additional annotations of the document, when the second criterion is satisfied; and discarding the received annotations from use in training the natural language model, when the third criterion is satisfied.
    Type: Application
    Filed: December 9, 2015
    Publication date: June 9, 2016
    Applicant: Idibon, Inc.
    Inventors: Robert J. Munro, Christopher Walker, Sarah K. Luger, Brendan D. Callahan, Gary C. King, Paul A. Tepper, Jana N. Thompson, Tyler J. Schnoebelen, Jason Brenier, Jessica D. Long
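
The three publications above share one abstract, which describes a three-way routing decision for document annotations: when inter-annotator agreement is high, aggregate the annotations and train the model; when it is middling, prompt for additional annotations; when it is low, discard the annotations. The sketch below is a minimal, hypothetical illustration of that flow and not the claimed implementation: it assumes document-level label annotations, uses pairwise exact-match as the "level of agreement", majority vote as the aggregation, and arbitrary thresholds for the three criteria; all function, parameter, and threshold names are invented for illustration.

```python
# Illustrative sketch only: hypothetical thresholds and helper names, not the
# implementation claimed in the publications. Assumes each annotation is a
# single label string for the whole document.
from collections import Counter
from itertools import combinations
from typing import Optional


def pairwise_agreement(annotations: list[str]) -> float:
    """Fraction of annotator pairs giving the same label (a simple stand-in
    for the 'level of agreement' referred to in the abstract)."""
    pairs = list(combinations(annotations, 2))
    if not pairs:
        return 1.0
    return sum(a == b for a, b in pairs) / len(pairs)


def handle_document(
    annotations: list[str],
    train_threshold: float = 0.75,  # assumed "first criterion" cutoff
    review_threshold: float = 0.3,  # assumed "second criterion" cutoff
) -> tuple[str, Optional[str]]:
    """Route a document's annotations per the three-way decision in the
    abstract: aggregate and train, request more annotations, or discard."""
    agreement = pairwise_agreement(annotations)
    if agreement >= train_threshold:
        # First criterion: aggregate (here, majority vote) and use for training.
        label, _ = Counter(annotations).most_common(1)[0]
        return "train", label
    if agreement >= review_threshold:
        # Second criterion: generate a prompt asking for additional annotations.
        return "request_more_annotations", None
    # Third criterion: agreement too low; exclude from training data.
    return "discard", None


if __name__ == "__main__":
    print(handle_document(["positive", "positive", "positive"]))            # ('train', 'positive')
    print(handle_document(["positive", "positive", "positive", "negative"]))  # request more annotations
    print(handle_document(["positive", "negative", "neutral"]))             # discard
```

A production system would presumably use a chance-corrected agreement measure (for example Cohen's kappa or Krippendorff's alpha) and task-specific thresholds, but the abstracts do not specify those details.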