Patents Assigned to AIPARC HOLDINGS PTE. LTD.
  • Publication number: 20200034737
    Abstract: Systems are presented for generating a natural language model. The system may comprise a database module, an application program interface (API) module, a background processing module, and an applications module, each stored on at least one memory and executable by at least one processor. The system may be configured to generate the natural language model by: ingesting training data, generating a hierarchical data structure, selecting a plurality of documents among the training data to be annotated, generating, for each document, an annotation prompt configured to elicit an annotation about that document, receiving the annotation based on the annotation prompt, and generating the natural language model using an adaptive machine learning process configured to determine patterns among the annotations for how the documents in the training data are to be subdivided according to at least two topical nodes of the hierarchical data structure.
    Type: Application
    Filed: February 28, 2019
    Publication date: January 30, 2020
    Applicant: AIPARC HOLDINGS PTE. LTD.
    Inventors: Robert J. Munro, Schuyler D. Erle, Christopher Walker, Sarah K. Luger, Jason Brenier, Gary C. King, Paul A. Tepper, Ross Mechanic, Andrew Gilchrist-Scott, Jessica D. Long, James B. Robinson, Brendan D. Callahan, Michelle Casbon, Ujjwal Sarin, Aneesh Nair, Veena Basavaraj, Tripti Saxena, Edgar Nunez, Martha G. Hinrichs, Haley Most, Tyler J. Schnoebelen
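The abstract above describes building a hierarchical topic structure, prompting annotators per document, and training a model on the collected annotations. The sketch below is illustrative only: the names (TopicNode, annotation_prompt, train_model) and the scikit-learn classifier are assumptions, not the patented implementation.

```python
# Illustrative sketch only; hypothetical names, not the patented implementation.
from dataclasses import dataclass, field

@dataclass
class TopicNode:
    """One topical node in the hierarchical data structure."""
    name: str
    children: list["TopicNode"] = field(default_factory=list)

def annotation_prompt(doc: str, nodes: list[TopicNode]) -> str:
    """Generate a prompt configured to elicit an annotation about one document."""
    options = ", ".join(n.name for n in nodes)
    return f"Which topic best describes this document?\nOptions: {options}\nDocument: {doc[:200]}"

def train_model(docs: list[str], annotations: list[str]):
    """Fit a simple text classifier on the collected annotations (a stand-in
    for the adaptive machine-learning process named in the abstract)."""
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(docs, annotations)   # each annotation maps a document to a topical node
    return model
```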
  • Publication number: 20190384809
    Abstract: Systems, methods, and apparatuses are presented for a trained language model to be stored in an efficient manner such that the trained language model may be utilized in virtually any computing device to conduct natural language processing. Unlike other natural language processing engines that may be computationally intensive to the point that they can run only on high-performance machines, the organization of the natural language models according to the present disclosures allows natural language processing to be performed even on smaller devices, such as mobile devices.
    Type: Application
    Filed: January 11, 2019
    Publication date: December 19, 2019
    Applicant: AIPARC HOLDINGS PTE. LTD.
    Inventors: Schuyler D. Erle, Robert J. Munro, Brendan D. Callahan, Gary C. King, Jason Brenier, James B. Robinson
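The publication above concerns storing a trained language model compactly enough to run on small devices. As a loose illustration of one way to shrink a model's footprint, the sketch below quantizes float32 weight matrices to 8-bit integers before serialization; the function name pack_weights, the JSON container, and the quantization scheme are assumptions and are not drawn from the patent.

```python
# Hypothetical example of compacting model weights for on-device use; the actual
# storage format described by the publication is not reproduced here.
import json
import numpy as np

def pack_weights(weights: dict[str, np.ndarray]) -> bytes:
    """Quantize each float32 weight matrix to uint8 and serialize the result."""
    packed = {}
    for name, w in weights.items():
        lo, hi = float(w.min()), float(w.max())
        scale = (hi - lo) / 255.0 or 1.0          # guard against constant matrices
        q = np.round((w - lo) / scale).astype(np.uint8)
        packed[name] = {"low": lo, "scale": scale,
                        "shape": list(w.shape), "data": q.tobytes().hex()}
    return json.dumps(packed).encode("utf-8")
```

Reconstruction multiplies the stored bytes by scale and adds low; the quantization is lossy but gives a smaller footprint than raw float32 weights.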
  • Publication number: 20190377788
    Abstract: Methods, apparatuses, and systems are presented for generating natural language models using a novel system architecture for feature extraction. A method for extracting features for natural language processing comprises: accessing one or more tokens generated from a document to be processed; receiving one or more feature types defined by a user; receiving a selection of one or more feature types from a plurality of system-defined and user-defined feature types, wherein each feature type comprises one or more rules for generating features; receiving one or more parameters for the selected feature types, wherein the one or more rules for generating features are defined at least in part by the parameters; generating features associated with the document to be processed based on the selected feature types and the received parameters; and outputting the generated features in a format common among all feature types.
    Type: Application
    Filed: January 2, 2019
    Publication date: December 12, 2019
    Applicant: AIPARC HOLDINGS PTE. LTD.
    Inventors: Robert J. Munro, Schuyler D. Erle, Tyler J. Schnoebelen, Brendan D. Callahan, Jessica D. Long, Gary C. King, Paul A. Tepper, Jason A. Brenier, Stefan Krawczyk
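The abstract above outlines a feature-extraction architecture in which system-defined and user-defined feature types each carry rules and parameters and emit features in a common output format. Below is a minimal sketch under those assumptions; the names (FeatureType, REGISTRY, extract) and the string-based output format are hypothetical.

```python
# Illustrative sketch: system- and user-defined feature types that all emit
# features in one common format. Names here are assumptions, not from the patent.
from typing import Callable

# A feature type is a rule: (tokens, params) -> list of "name=value" feature strings.
FeatureType = Callable[[list[str], dict], list[str]]

def ngram_features(tokens: list[str], params: dict) -> list[str]:
    n = params.get("n", 2)                      # a parameter that drives the rule
    return ["ngram=" + " ".join(tokens[i:i + n])
            for i in range(len(tokens) - n + 1)]

def length_features(tokens: list[str], params: dict) -> list[str]:
    return [f"token_count={len(tokens)}"]

REGISTRY: dict[str, FeatureType] = {            # system-defined feature types;
    "ngram": ngram_features,                    # user-defined ones register the same way
    "length": length_features,
}

def extract(tokens: list[str], selected: dict[str, dict]) -> list[str]:
    """Run each selected feature type with its parameters; the output is a flat
    list of strings, the common format shared by all feature types."""
    feats: list[str] = []
    for name, params in selected.items():
        feats.extend(REGISTRY[name](tokens, params))
    return feats
```

For example, extract("the cat sat".split(), {"ngram": {"n": 2}, "length": {}}) returns the bigrams plus a token-count feature, all in the same flat string format.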
  • Publication number: 20190311024
    Abstract: Methods, apparatuses, and computer-readable media are presented for generating a natural language model. A method for generating a natural language model comprises: receiving more than one annotation of a document; calculating a level of agreement among the received annotations; determining that a criterion among a first criterion, a second criterion, and a third criterion is satisfied based at least in part on the level of agreement; determining an aggregated annotation representing an aggregation of information in the received annotations and training a natural language model using the aggregated annotation, when the first criterion is satisfied; generating at least one human-readable prompt configured to receive additional annotations of the document, when the second criterion is satisfied; and discarding the received annotations from use in training the natural language model, when the third criterion is satisfied.
    Type: Application
    Filed: November 9, 2018
    Publication date: October 10, 2019
    Applicant: AIPARC HOLDINGS PTE. LTD.
    Inventors: Robert J. Munro, Christopher Walker, Sarah K. Luger, Brendan D. Callahan, Gary C. King, Paul A. Tepper, Jana N. Thompson, Tyler J. Schnoebelen, Jason Brenier, Jessica D. Long
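The abstract above defines three outcomes driven by inter-annotator agreement: aggregate and train, prompt for more annotations, or discard. The sketch below illustrates that branching; the agreement measure (majority share) and the thresholds HIGH and LOW are assumed placeholders, since the publication does not specify them in this abstract.

```python
# A minimal sketch with assumed thresholds; not the patented criteria.
from collections import Counter

HIGH, LOW = 0.8, 0.4   # assumed agreement thresholds, not taken from the patent

def handle_annotations(annotations: list[str]):
    counts = Counter(annotations)
    label, top = counts.most_common(1)[0]
    agreement = top / len(annotations)          # level of agreement among annotations
    if agreement >= HIGH:                       # first criterion: aggregate and train
        return ("train", label)
    if agreement >= LOW:                        # second criterion: ask for more annotations
        return ("prompt", "Please review this document and add another annotation.")
    return ("discard", None)                    # third criterion: drop the annotations
```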
  • Publication number: 20190311025
    Abstract: Systems and methods are presented for the automatic placement of rules applied to topics in a logical hierarchy when conducting natural language processing. In some embodiments, a method includes: accessing, at a child node in a logical hierarchy, at least one rule associated with the child node; identifying a percolation criterion associated with a parent node to the child node, said percolation criterion indicating that the at least one rule associated with the child node is to be associated also with the parent node; associating the at least one rule with the parent node such that the at least one rule defines a second factor for determining whether a document is to also be classified into the parent node; accessing the document for natural language processing; and determining whether the document is to be classified into the parent node or the child node based on the at least one rule.
    Type: Application
    Filed: November 20, 2018
    Publication date: October 10, 2019
    Applicant: AIPARC HOLDINGS PTE. LTD.
    Inventors: Robert J. Munro, Schuyler D. Erle, Tyler J. Schnoebelen, Jason Brenier, Jessica D. Long, Brendan D. Callahan, Paul A. Tepper, Edgar Nunez
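The abstract above describes percolating a child node's rules up to its parent when a percolation criterion is set, then classifying a document against both nodes. The sketch below is a hypothetical rendering of that idea; the Node and Rule types and the classify routine are illustrative, not the patented system.

```python
# Illustrative only; hypothetical types showing rules percolating from a child
# node to its parent before a document is classified.
from __future__ import annotations
from dataclasses import dataclass, field
from typing import Callable

Rule = Callable[[str], bool]        # a rule decides whether a document matches a node

@dataclass
class Node:
    name: str
    rules: list[Rule] = field(default_factory=list)
    parent: Node | None = None
    percolate: bool = False         # percolation criterion, set on the parent node

def percolate_rules(child: Node) -> None:
    """Associate the child's rules with the parent when the parent's criterion is set."""
    if child.parent is not None and child.parent.percolate:
        child.parent.rules.extend(child.rules)

def classify(doc: str, child: Node) -> str | None:
    """Return the child node if its rules match, else the parent if its
    (possibly percolated) rules match, else None."""
    percolate_rules(child)
    if any(rule(doc) for rule in child.rules):
        return child.name
    if child.parent is not None and any(rule(doc) for rule in child.parent.rules):
        return child.parent.name
    return None
```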