Patents by Inventor Bhushan Kotnis

Bhushan Kotnis has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230385281
    Abstract: A memory for storing data for access by an application program includes an encoded sequential query representing m×n path queries of a query graph having a plurality of elements including m root nodes and n leaf nodes, and a missing element, wherein m and n are integer values. Each of the path queries includes a root node as a start position and a leaf node as an end position, wherein positional encodings of the elements include a positional order within each path query, wherein the positional encodings of the elements include counter values that are reset at a start position of each of the path queries, and wherein the missing element is masked. The invention can be used for, but is not limited to, medical uses, such as for predicting protein-drug interactions or medical conditions, as well as other applications, and significantly outperforms the optimized model.
    Type: Application
    Filed: July 20, 2023
    Publication date: November 30, 2023
    Inventors: Bhushan Kotnis, Mathias Niepert, Carolin Lawrence
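
The sequential encoding described in the abstract above (and in the closely related filings 20230359629, 20230359630, 11748356, and 20210173841 listed below) can be pictured with a short, purely illustrative Python sketch: the query graph is decomposed into m x n root-to-leaf path queries, each path is linearized into node and edge tokens, the positional counter restarts at every path query, and the missing element is masked. The toy graph, the token layout, and the [MASK] symbol are assumptions made for illustration only; the filings additionally train a transformer encoder over the resulting masked sequences, which is omitted here.

from itertools import product

# Toy query graph as directed labeled edges: (head, relation, tail).
# "?x" marks the missing element the query asks about (assumption).
EDGES = [
    ("drug_A", "inhibits", "?x"),
    ("gene_B", "regulates", "?x"),
    ("?x", "associated_with", "disease_C"),
    ("?x", "associated_with", "symptom_D"),
]
ROOTS = ["drug_A", "gene_B"]          # m = 2 root nodes
LEAVES = ["disease_C", "symptom_D"]   # n = 2 leaf nodes
MASK = "[MASK]"


def path_tokens(root, leaf):
    """Breadth-first search for one root-to-leaf path; returns node and edge tokens."""
    frontier = [(root, [root])]
    while frontier:
        node, tokens = frontier.pop(0)
        if node == leaf:
            return tokens
        for head, rel, tail in EDGES:
            if head == node and tail not in tokens:
                frontier.append((tail, tokens + [rel, tail]))
    return None


def encode_query_graph(roots, leaves):
    """Build the masked sequential query: m x n path queries, with a positional
    counter that resets at the start of every path query, so there is no
    positional ordering between different paths."""
    tokens, positions = [], []
    for root, leaf in product(roots, leaves):   # the m x n path queries
        path = path_tokens(root, leaf)
        if path is None:
            continue
        for pos, tok in enumerate(path):        # counter resets per path query
            tokens.append(MASK if tok == "?x" else tok)
            positions.append(pos)
    return tokens, positions


if __name__ == "__main__":
    toks, pos = encode_query_graph(ROOTS, LEAVES)
    for p, t in zip(pos, toks):
        print(p, t)
    # A transformer encoder would then be trained to predict the entity
    # behind every [MASK] position.

Running the sketch prints four five-token path queries, each restarting its position counter at 0.
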
  • Publication number: 20230359629
    Abstract: A method for encoding a query graph into a sequence representation includes receiving m×n path queries representing a query graph having m root nodes and n leaf nodes, each path query beginning with a root node and ending with a leaf node, and encoding positions of each node and each edge between nodes within each path query independently, wherein the encoded positions include a positional order within each path query. The positional encodings may include no positional ordering between different path queries. The query graph may include one or more missing entities and each missing entity may be masked to produce a masked sequential query, which may be fed to a transformer encoder. The invention can be used for, but is not limited to, medical uses, such as for predicting protein-drug interactions or medical conditions, as well as other applications, and significantly outperforms the optimized model.
    Type: Application
    Filed: July 20, 2023
    Publication date: November 9, 2023
    Inventors: Bhushan Kotnis, Mathias Niepert, Carolin Lawrence
  • Publication number: 20230359630
    Abstract: A method of encoding a query graph includes receiving m×n path queries representing the query graph, each starting with a root node and ending with a leaf node, the query graph including one or more missing nodes and/or missing edges between nodes. Positions of each node and edge are encoded within each of the path queries independently, wherein the encoded positions include a positional order within each path query, and wherein the encoded positions include positional counter values that are reset at a start position of each of the path queries. Each of the missing nodes and/or missing edges are masked to produce a masked query, which is fed to a transformer encoder. The invention can be used for, but is not limited to, medical uses, such as for predicting protein-drug interactions or medical conditions, as well as other applications, and significantly outperforms the optimized model.
    Type: Application
    Filed: July 20, 2023
    Publication date: November 9, 2023
    Inventors: Bhushan Kotnis, Mathias Niepert, Carolin Lawrence
  • Patent number: 11748356
    Abstract: A method for encoding a query graph into a sequence representation includes receiving a set of m×n path queries representing a query graph, the query graph having a plurality of nodes including m root nodes and n leaf nodes, wherein m and n are integer values, each of the m×n path queries beginning with a root node and ending with a leaf node, and encoding positions of each node and each edge between nodes within each of the m×n path queries independently, wherein the encoded positions include a positional order within each path query. The positional encodings may include no positional ordering between different path queries. The query graph may include one or more missing entities and each token representing one of the one or more missing entities may be masked to produce a masked sequential query, which may be fed to a transformer encoder.
    Type: Grant
    Filed: April 24, 2020
    Date of Patent: September 5, 2023
    Assignee: NEC CORPORATION
    Inventors: Bhushan Kotnis, Mathias Niepert, Carolin Lawrence
  • Patent number: 11741318
    Abstract: A method is provided for extracting machine readable data structures from unstructured, low-resource language input text. The method includes obtaining a corpus of high-resource language data structures, filtering the corpus of high-resource language data structures to obtain a filtered corpus of high-resource language data structures, obtaining entity types for each entity of each filtered high-resource language data structure, performing type substitution for each obtained entity by replacing each entity with an entity of the same type to generate type-substituted data structures, and replacing each entity with a corresponding low-resource language data structure entity to generate code-switched sentences. The method further includes generating an augmented data structure corpus, training a multi-head self-attention transformer model, and providing the unstructured low-resource language input text to the trained model to extract the machine readable data structures.
    Type: Grant
    Filed: June 9, 2021
    Date of Patent: August 29, 2023
    Assignee: NEC CORPORATION
    Inventors: Bhushan Kotnis, Kiril Gashteovski, Carolin Lawrence
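
The augmentation pipeline summarized in the abstract above (also published as application 20220309254 below), type substitution followed by code switching, is sketched in the small Python example here. The sentence, the type lexicon, and the Hindi surface forms are toy assumptions; the filing filters a full high-resource corpus and then trains a multi-head self-attention transformer on the augmented data, neither of which is reproduced.

import random

# High-resource (English) sentence with bracketed entity annotations (assumption).
SENTENCE = "[Barack Obama] was born in [Honolulu]"
ENTITY_TYPES = {"Barack Obama": "PERSON", "Honolulu": "CITY"}

# Other entities of the same type, used for type substitution (assumption).
TYPE_LEXICON = {
    "PERSON": ["Angela Merkel", "A. R. Rahman"],
    "CITY": ["Mumbai", "Berlin"],
}

# Low-resource-language (here: Hindi) surface forms for code switching (assumption).
LOW_RESOURCE_FORMS = {
    "Angela Merkel": "एंजेला मर्केल",
    "A. R. Rahman": "ए. आर. रहमान",
    "Mumbai": "मुंबई",
    "Berlin": "बर्लिन",
}


def augment(sentence, entity_types, rng):
    """Return (type-substituted sentence, code-switched sentence)."""
    substituted, switched = sentence, sentence
    for entity, etype in entity_types.items():
        replacement = rng.choice(TYPE_LEXICON[etype])      # same-type entity
        substituted = substituted.replace(f"[{entity}]", replacement)
        # Code switching: the rest of the sentence stays in the high-resource
        # language, only the entity is swapped for its low-resource form.
        switched = switched.replace(f"[{entity}]", LOW_RESOURCE_FORMS[replacement])
    return substituted, switched


if __name__ == "__main__":
    rng = random.Random(0)
    for _ in range(3):
        sub, cs = augment(SENTENCE, ENTITY_TYPES, rng)
        print("type-substituted:", sub)
        print("code-switched:   ", cs)

Sentences produced this way would be pooled into the augmented corpus on which the extraction model is trained.
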
  • Publication number: 20230267338
    Abstract: A method for automated decision making in an artificial intelligence task by fact-relevant open information extraction and knowledge graph generation includes obtaining a keyword query for performing the fact-relevant open information extraction and expanding the keyword query using keyword alias and query generation. The fact-relevant open information extraction is performed to extract triples from a text which contains the keyword or the keyword alias. The knowledge graph is generated using the extracted triples and an open knowledge graph (OpenKG) extractor that has been trained using keywords and aliases. Supervised or unsupervised classification is performed using the generated knowledge graph to make the automated decision in the artificial intelligence task.
    Type: Application
    Filed: May 2, 2022
    Publication date: August 24, 2023
    Inventors: Bhushan Kotnis, Kiril Gashteovski, Carolin Lawrence
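
A schematic version of the fact-relevant extraction pipeline from the abstract above is given below: expand the keyword query with aliases, keep only sentences that mention the keyword or an alias, extract triples, and assemble them into a small open knowledge graph. The alias table, the naive pattern-based triple extractor, and the tiny corpus are stand-in assumptions; the filing instead uses a trained OpenKG extractor and a supervised or unsupervised classifier over the resulting graph, which are only indicated here.

import re
from collections import defaultdict

# Keyword aliases used for query expansion (assumption; the filing also
# generates additional queries, which is omitted here).
ALIASES = {"aspirin": ["acetylsalicylic acid", "ASA"]}

# Toy text collection; only fact-relevant sentences should be mined.
CORPUS = [
    "Aspirin inhibits COX-1.",
    "Acetylsalicylic acid reduces fever.",
    "Ibuprofen inhibits COX-2.",   # contains neither the keyword nor an alias
]

# Stand-in extractor: a crude pattern instead of the trained OpenKG extractor.
TRIPLE_PATTERN = re.compile(
    r"(?P<subj>[\w\s.-]+?)\s+(?P<rel>inhibits|reduces)\s+(?P<obj>[\w-]+)", re.I
)


def expand(keyword):
    """Expand the keyword query with its known aliases."""
    return [keyword] + ALIASES.get(keyword.lower(), [])


def fact_relevant_triples(keyword):
    """Extract triples only from sentences that mention the keyword or an alias."""
    terms = [t.lower() for t in expand(keyword)]
    triples = []
    for sentence in CORPUS:
        if not any(t in sentence.lower() for t in terms):
            continue
        match = TRIPLE_PATTERN.search(sentence)
        if match:
            triples.append((match["subj"].strip(), match["rel"].lower(), match["obj"]))
    return triples


def build_graph(triples):
    """Assemble the extracted triples into an open knowledge graph (adjacency lists)."""
    graph = defaultdict(list)
    for subj, rel, obj in triples:
        graph[subj].append((rel, obj))
    return graph


if __name__ == "__main__":
    graph = build_graph(fact_relevant_triples("aspirin"))
    for subj, edges in graph.items():
        print(subj, "->", edges)
    # A downstream classifier (supervised or unsupervised) would consume this
    # graph to make the automated decision; that step is not shown.
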
  • Publication number: 20220309254
    Abstract: A method is provided for extracting machine readable data structures from unstructured, low-resource language input text. The method includes obtaining a corpus of high-resource language data structures, filtering the corpus of high-resource language data structures to obtain a filtered corpus of high-resource language data structures, obtaining entity types for each entity of each filtered high-resource language data structure, performing type substitution for each obtained entity by replacing each entity with an entity of the same type to generate type-substituted data structures, and replacing each entity with a corresponding low-resource language data structure entity to generate code-switched sentences. The method further includes generating an augmented data structure corpus, training a multi-head self-attention transformer model, and providing the unstructured low-resource language input text to the trained model to extract the machine readable data structures.
    Type: Application
    Filed: June 9, 2021
    Publication date: September 29, 2022
    Inventors: Bhushan Kotnis, Kiril Gashteovski, Carolin Lawrence
  • Publication number: 20220309230
    Abstract: A method for transforming an input sequence into an output sequence includes obtaining a data set of interest, the data set including input sequences and output sequences, wherein each of the sequences is decomposable into tokens. At prediction time, the input sequence is concatenated with a sequence of placeholder tokens of a configured maximum length to generate a concatenated sequence. The concatenated sequence is provided as input to a transformer encoder that is learnt at training time. A prediction strategy is applied to replace the placeholder tokens with real output tokens. The real output tokens are provided as the output sequence.
    Type: Application
    Filed: September 9, 2019
    Publication date: September 29, 2022
    Inventors: Carolin Lawrence, Bhushan Kotnis, Mathias Niepert
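
The placeholder-token layout described in the abstract above can be illustrated without any machine learning machinery. In the sketch below, dummy_encoder is a stand-in assumption for the transformer encoder learnt at training time; only the concatenation step and a simple prediction strategy that rewrites placeholders and stops at an end marker are shown.

MAX_OUTPUT_LEN = 4        # the configured maximum output length (assumption)
PLACEHOLDER = "[PLH]"     # placeholder token (assumption)
END = "[EOS]"             # end-of-sequence marker (assumption)


def concatenate_with_placeholders(input_tokens, max_len=MAX_OUTPUT_LEN):
    """Encoder input: the input sequence followed by a budget of placeholder tokens."""
    return list(input_tokens) + [PLACEHOLDER] * max_len


def dummy_encoder(tokens):
    """Stand-in for the learnt transformer encoder: proposes one output token
    per placeholder slot. This toy version upper-cases the source tokens and
    pads the remaining slots with the end marker."""
    source = [t for t in tokens if t != PLACEHOLDER]
    n_slots = tokens.count(PLACEHOLDER)
    return [source[i].upper() if i < len(source) else END for i in range(n_slots)]


def predict_output(input_tokens, encoder, max_len=MAX_OUTPUT_LEN):
    """Prediction strategy: run the encoder once over the concatenated sequence,
    replace every placeholder with the proposed token, and cut at the first END."""
    concatenated = concatenate_with_placeholders(input_tokens, max_len)
    proposals = encoder(concatenated)        # one proposal per placeholder
    output = []
    for token in proposals:
        if token == END:
            break
        output.append(token)
    return output


if __name__ == "__main__":
    print(predict_output(["hello", "world"], dummy_encoder))
    # -> ['HELLO', 'WORLD'] with this toy "model"; a real system would plug in
    # the trained transformer encoder and a more refined prediction strategy.
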
  • Publication number: 20210173841
    Abstract: A method for encoding a query graph into a sequence representation includes receiving a set of m×n path queries representing a query graph, the query graph having a plurality of nodes including m root nodes and n leaf nodes, wherein m and n are integer values, each of the m×n path queries beginning with a root node and ending with a leaf node, and encoding positions of each node and each edge between nodes within each of the m×n path queries independently, wherein the encoded positions include a positional order within each path query. The positional encodings may include no positional ordering between different path queries. The query graph may include one or more missing entities and each token representing one of the one or more missing entities may be masked to produce a masked sequential query, which may be fed to a transformer encoder.
    Type: Application
    Filed: April 24, 2020
    Publication date: June 10, 2021
    Inventors: Bhushan Kotnis, Mathias Niepert, Carolin Lawrence
  • Publication number: 20200160215
    Abstract: A method for learning numerical attributes in a knowledge graph includes learning knowledge graph embeddings based on jointly minimizing a knowledge graph loss and a number of numerical attribute prediction losses. The method also includes executing a numerical attribute propagation algorithm using an adjacency matrix of the knowledge graph and numerical values of labeled nodes of the knowledge graph to predict missing ones of the numerical attributes.
    Type: Application
    Filed: June 7, 2019
    Publication date: May 21, 2020
    Inventors: Bhushan Kotnis, Alberto Garcia Duran
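
The propagation half of this last entry lends itself to a compact numerical sketch: known attribute values on labeled nodes are spread over the knowledge graph's adjacency matrix to estimate the missing ones. The small adjacency matrix, the attribute values, and the clamped-averaging update below are assumptions chosen for illustration; the joint training that minimizes a knowledge graph loss together with several numerical attribute prediction losses is not reproduced.

import numpy as np

# Adjacency matrix of a toy five-node knowledge graph (undirected, assumption).
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 0, 1],
    [0, 1, 0, 0, 1],
    [0, 0, 1, 1, 0],
], dtype=float)

# Known numerical attribute values for labeled nodes; NaN marks missing ones.
values = np.array([10.0, np.nan, 30.0, np.nan, 50.0])


def propagate(adjacency, observed, iterations=50):
    """Clamped propagation over the adjacency matrix: unlabeled nodes repeatedly
    take the mean of their neighbors while labeled nodes keep their observed value."""
    labeled = ~np.isnan(observed)
    x = np.where(labeled, observed, np.nanmean(observed))     # neutral start
    row_norm = adjacency / adjacency.sum(axis=1, keepdims=True)
    for _ in range(iterations):
        x = row_norm @ x                  # average over graph neighbors
        x[labeled] = observed[labeled]    # clamp the labeled nodes
    return x


if __name__ == "__main__":
    print(np.round(propagate(A, values), 2))
    # The two missing entries are filled with neighborhood-consistent estimates;
    # the filing combines such propagation with jointly learnt KG embeddings.
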