Patents by Inventor Ndivhuwo Makondo

Ndivhuwo Makondo has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11934441
    Abstract: An ontology topic is selected and a pretrained predictive language model is primed to create a predictive primed model based on one or more ontological rules corresponding to the selected ontology topic. Using the predictive primed model, natural language text is generated based on the ontology topic and guidance of a prediction steering component. The predictive primed model is guided in selecting text that is predicted to be appropriate for the ontology topic and the generated natural language text. The generated natural language text is processed to generate extracted ontology rules and the extracted ontology rules are compared to one or more rules of an ontology rule database that correspond to the ontology topic. A check is performed to determine whether the performance of the ontology extractor is acceptable.
    Type: Grant
    Filed: April 29, 2020
    Date of Patent: March 19, 2024
    Assignee: International Business Machines Corporation
    Inventors: Francois Pierre Luus, Etienne Eben Vos, Ndivhuwo Makondo, Naweed Aghmad Khan, Ismail Yunus Akhalwaya
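
    A minimal sketch of the pipeline this abstract describes, under toy assumptions: the ontology, the prime_model, steer_and_generate and extract_rules helpers, and the F1-based acceptance check below are all invented for illustration and merely stand in for the pretrained language model, the prediction steering component and the ontology extractor.

      ONTOLOGY_RULES = {"birds": {("penguin", "is_a", "bird"), ("bird", "can", "fly")}}

      def prime_model(topic, rules):
          """Stand-in for priming a pretrained language model with ontological rules."""
          return {"topic": topic, "prompt": [" ".join(rule) for rule in sorted(rules)]}

      def steer_and_generate(primed_model):
          """Stand-in for steered natural-language generation from the primed model."""
          return ". ".join(primed_model["prompt"]) + "."

      def extract_rules(text):
          """Toy ontology extractor: read (subject, relation, object) triples back out."""
          triples = set()
          for sentence in filter(None, (s.strip() for s in text.split("."))):
              parts = sentence.split()
              if len(parts) == 3:
                  triples.add(tuple(parts))
          return triples

      def extractor_is_acceptable(extracted, reference, min_f1=0.9):
          """Compare extracted rules with the topic's rule database via an F1 check."""
          true_positives = len(extracted & reference)
          precision = true_positives / len(extracted) if extracted else 0.0
          recall = true_positives / len(reference) if reference else 0.0
          f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
          return f1 >= min_f1

      topic = "birds"
      generated_text = steer_and_generate(prime_model(topic, ONTOLOGY_RULES[topic]))
      extracted = extract_rules(generated_text)
      print("extractor acceptable:", extractor_is_acceptable(extracted, ONTOLOGY_RULES[topic]))  # True
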
  • Publication number: 20230409872
    Abstract: A computer-implemented method may include processors configured for receiving input data corresponding to a knowledge base comprising a plurality of propositional logic clauses, generating output data corresponding to a first set of logical rules and a first set of facts based on the plurality of propositional logic clauses, accumulating the first set of facts to generate accumulated facts, generating a model graph based on the accumulated facts and the first set of logical rules, alternating reasoning and learning passes at the model graph until convergence to generate a second set of logical rules and a second set of facts, and generating a third set of logical rules and a third set of facts, wherein the third set of logical rules and the third set of facts exceed a first predetermined threshold.
    Type: Application
    Filed: June 21, 2022
    Publication date: December 21, 2023
    Inventors: Ndivhuwo Makondo, Francois Pierre Luus, Naweed Aghmad Khan, Ismail Yunus Akhalwaya, Ryan Nelson Riegel, Oarabile Hope Moloko, Thabang Doreen Lebese
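
    A rough sketch of the alternating loop in the abstract above, with invented facts and rules: the reasoning pass is plain forward chaining and the learning pass is a toy rule-induction heuristic, not the claimed method.

      from collections import defaultdict

      # Invented knowledge: facts are (property, entity) pairs; rules map a body of
      # properties to a head property.
      facts = {("bird", "tweety"), ("flies", "tweety"), ("bird", "polly"),
               ("flies", "polly"), ("bird", "pingu"), ("penguin", "pingu")}
      rules = {(frozenset({"penguin"}), "antarctic")}

      def reasoning_pass(facts, rules):
          """Forward chaining: apply every rule whose body holds for an entity."""
          derived = set(facts)
          props_of = defaultdict(set)
          for prop, entity in facts:
              props_of[entity].add(prop)
          for body, head in rules:
              for entity, props in props_of.items():
                  if body <= props:
                      derived.add((head, entity))
          return derived

      def learning_pass(facts):
          """Toy rule induction: if every entity with property a also has b, learn a -> b."""
          entities_of = defaultdict(set)
          for prop, entity in facts:
              entities_of[prop].add(entity)
          learned = set()
          for a, ents_a in entities_of.items():
              for b, ents_b in entities_of.items():
                  if a != b and ents_a <= ents_b:
                      learned.add((frozenset({a}), b))
          return learned

      # Alternate reasoning and learning passes until neither adds anything (convergence).
      while True:
          new_facts = reasoning_pass(facts, rules)
          new_rules = rules | learning_pass(new_facts)
          if new_facts == facts and new_rules == rules:
              break
          facts, rules = new_facts, new_rules

      print(len(rules), "rules and", len(facts), "facts after convergence")   # 5 rules, 7 facts

    Because the fact and rule sets only grow and are bounded by the finite vocabulary, the alternation is guaranteed to reach a fixpoint; note the toy induction does not learn bird -> flies, since pingu never flies.
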
  • Patent number: 11803657
    Abstract: Methods and systems for generating representative data. A generator is configured to create, using a learning model, one or more generated records based on a plurality of training records obtained from a sensitive database. A discriminator is trained to identify the generated records as being generated based on the training records and a privacy adversary is trained to identify a training sample as being more similar to a distribution of the generated records than to a distribution of the reference records.
    Type: Grant
    Filed: April 23, 2020
    Date of Patent: October 31, 2023
    Assignee: International Business Machines Corporation
    Inventors: Francois Pierre Luus, Naweed Aghmad Khan, Ndivhuwo Makondo, Etienne Eben Vos, Ismail Yunus Akhalwaya
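
    The three roles in this abstract (generator, discriminator, privacy adversary) can be sketched on toy Gaussian data; the Generator class and the two scoring functions below are simple stand-ins for trained networks, and the records and seed are made up.

      import random
      import statistics

      random.seed(0)
      training_records = [random.gauss(5.0, 1.0) for _ in range(200)]    # drawn from a sensitive database
      reference_records = [random.gauss(5.0, 1.0) for _ in range(200)]   # held-out reference sample

      class Generator:
          """Learns a crude model of the training records and samples new ones from it."""
          def fit(self, records):
              self.mu = statistics.mean(records)
              self.sigma = statistics.stdev(records)
          def sample(self, n):
              return [random.gauss(self.mu, self.sigma) for _ in range(n)]

      def discriminator_gap(generated, real):
          """Crude discriminator signal: gap between the generated and real means."""
          return abs(statistics.mean(generated) - statistics.mean(real))

      def privacy_adversary_flags(sample, generated, reference):
          """Membership-style probe: is this training sample closer to the generated
          distribution than to the reference distribution?"""
          return (abs(sample - statistics.mean(generated))
                  < abs(sample - statistics.mean(reference)))

      generator = Generator()
      generator.fit(training_records)
      generated_records = generator.sample(200)

      print("discriminator gap:", round(discriminator_gap(generated_records, training_records), 3))
      flagged = sum(privacy_adversary_flags(x, generated_records, reference_records)
                    for x in training_records)
      print("privacy adversary flags", flagged, "of", len(training_records), "training samples")
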
  • Patent number: 11687724
    Abstract: Word sense disambiguation using a glossary layer embedded in a deep neural network includes receiving, by one or more processors, input sentences including a plurality of words. At least two words in the plurality of words are homonyms. The one or more processors convert the plurality of words associated with each input sentence into a first vector including possible senses for the at least two words. The first vector is then combined with a second vector including a domain-specific contextual vector associated with the at least two words. The combination of the first vector with the second vector is fed into a recurrent deep logico-neural network model to generate a third vector that includes word senses for the at least two words. A threshold is set for the third vector to generate a fourth vector including a final word sense vector for the at least two words.
    Type: Grant
    Filed: September 30, 2020
    Date of Patent: June 27, 2023
    Assignee: International Business Machines Corporation
    Inventors: Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Francois Pierre Luus, Ndivhuwo Makondo, Ryan Nelson Riegel, Alexander Gray
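
    A toy rendering of the vector flow described above, assuming a two-sense glossary and a made-up finance context vector; the squaring-and-renormalising pass merely stands in for the recurrent deep logico-neural network.

      GLOSSARY = {"bank": ["river_bank", "financial_bank"]}                      # candidate senses for a homonym
      DOMAIN_CONTEXT = {"finance": {"river_bank": 0.1, "financial_bank": 0.9}}   # made-up domain weights

      def sense_vector(word):
          """First vector: uniform scores over the word's candidate senses."""
          senses = GLOSSARY[word]
          return {s: 1.0 / len(senses) for s in senses}

      def combine(first, context):
          """Combine the sense candidates with the domain-specific contextual vector."""
          return {s: first[s] * context.get(s, 0.0) for s in first}

      def recurrent_pass(combined, steps=3):
          """Toy stand-in for the recurrent model: repeatedly square and renormalise,
          which sharpens the scores toward the best-supported sense."""
          scores = dict(combined)
          for _ in range(steps):
              scores = {s: v * v for s, v in scores.items()}
              total = sum(scores.values()) or 1.0
              scores = {s: v / total for s, v in scores.items()}
          return scores

      def threshold(third, tau=0.5):
          """Fourth vector: keep only senses whose score clears the threshold."""
          return {s: (1.0 if v >= tau else 0.0) for s, v in third.items()}

      third = recurrent_pass(combine(sense_vector("bank"), DOMAIN_CONTEXT["finance"]))
      print(threshold(third))     # -> {'river_bank': 0.0, 'financial_bank': 1.0}
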
  • Publication number: 20230100883
    Abstract: To improve the technological process of computerized answering of logical queries over incomplete knowledge bases, obtain a first order logic query; with a trained, computerized neural network, convert the first order logic query into a logic embedding; and answer the first order logic query using the logic embedding.
    Type: Application
    Filed: September 28, 2021
    Publication date: March 30, 2023
    Inventors: Francois Pierre Luus, Prithviraj Sen, Ryan Nelson Riegel, Ndivhuwo Makondo, Thabang Doreen Lebese, Naweed Aghmad Khan, Pavan Kapanipathi
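
    In the spirit of this abstract, a minimal embedding-based query answerer: entities and relations get made-up 2-D vectors, a one-hop first-order query is embedded by translating the anchor entity, and answers are read off by nearest-neighbour search, so entities missing an explicit edge in the knowledge base can still be returned. This illustrates embedding-based query answering generally, not the patented conversion.

      import math

      # Made-up embeddings: entities are 2-D points, relations are translations.
      ENTITY = {"paper_a": (0.0, 0.0), "paper_b": (1.0, 0.1),
                "paper_c": (1.1, -0.1), "author_x": (5.0, 5.0)}
      RELATION = {"cites": (1.0, 0.0)}

      def embed_query(anchor, relation):
          """Convert the one-hop query '?x : relation(anchor, x)' into a logic embedding
          by translating the anchor entity's vector by the relation vector."""
          ax, ay = ENTITY[anchor]
          rx, ry = RELATION[relation]
          return (ax + rx, ay + ry)

      def answer(query_embedding, radius=0.5):
          """Answer the query: entities whose embeddings lie within `radius` of it."""
          qx, qy = query_embedding
          return sorted(name for name, (x, y) in ENTITY.items()
                        if math.hypot(x - qx, y - qy) <= radius)

      print(answer(embed_query("paper_a", "cites")))   # -> ['paper_b', 'paper_c']
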
  • Patent number: 11494634
    Abstract: Maximum expressivity can be received representing a ratio between maximum and minimum input weights to a neuron of a neural network implementing a weighted real-valued logic gate. Operator arity can be received associated with the neuron. Logical constraints associated with the weighted real-valued logic gate can be determined in terms of weights associated with inputs to the neuron, a threshold-of-truth, and a neuron threshold for activation. The threshold-of-truth can be determined as a parameter used in an activation function of the neuron, based on solving an activation optimization formulated based on the logical constraints, the activation optimization maximizing a product of expressivity representing a distribution width of input weights to the neuron and gradient quality for the neuron given the operator arity and the maximum expressivity. The neural network of logical neurons can be trained using the activation function at the neuron, the activation function using the determined threshold-of-truth.
    Type: Grant
    Filed: May 13, 2020
    Date of Patent: November 8, 2022
    Assignee: International Business Machines Corporation
    Inventors: Francois Pierre Luus, Ryan Nelson Riegel, Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Etienne Eben Vos, Ndivhuwo Makondo
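
    A hedged toy of the weighted real-valued AND gate discussed above: a piecewise-linear activation is parameterized by a threshold-of-truth alpha, and the smallest feasible alpha grows as the input weights spread out, which is the expressivity versus gradient-quality tension the activation optimization resolves. The construction and the numbers are illustrative simplifications, not the patented optimization.

      def min_feasible_alpha(weights):
          """Smallest threshold-of-truth for which the all-true and some-false input
          regions can be separated by a monotone activation: alpha >= S / (S + w_min),
          where S is the weight sum; the bound rises as the weights spread out."""
          total, w_min = sum(weights), min(weights)
          return total / (total + w_min)

      def weighted_and(inputs, weights, alpha, beta=1.0):
          """Weighted real-valued AND: pre-activation beta - sum(w * (1 - x)) pushed
          through a three-piece linear activation so that every all-true pattern
          (inputs >= alpha) lands at >= alpha and every pattern with a false input
          (some input <= 1 - alpha) lands at <= 1 - alpha. Assumes alpha is at least
          min_feasible_alpha(weights)."""
          t = beta - sum(w * (1.0 - x) for x, w in zip(inputs, weights))
          t_true = beta - sum(weights) * (1.0 - alpha)   # lowest pre-activation with all inputs true
          t_false = beta - min(weights) * alpha          # highest pre-activation with a false input
          t_min = beta - sum(weights)                    # pre-activation at all-zero inputs
          if t >= t_true:
              return alpha + (1.0 - alpha) * (t - t_true) / (beta - t_true)
          if t <= t_false:
              return (1.0 - alpha) * (t - t_min) / (t_false - t_min)
          return (1.0 - alpha) + (2.0 * alpha - 1.0) * (t - t_false) / (t_true - t_false)

      weights = [1.0, 1.0, 2.0]                          # arity 3, max/min weight ratio (expressivity) = 2
      alpha = min_feasible_alpha(weights) + 0.05         # a little above the 0.8 minimum

      print("smallest feasible threshold-of-truth:", min_feasible_alpha(weights))   # 0.8
      print("all inputs true ->", weighted_and([0.9, 0.9, 0.9], weights, alpha))    # ~0.90 (>= alpha): true
      print("one input false ->", weighted_and([0.1, 1.0, 1.0], weights, alpha))    # ~0.148 (<= 1 - alpha): false
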
  • Publication number: 20220100962
    Abstract: Word sense disambiguation using a glossary layer embedded in a deep neural network includes receiving, by one or more processors, input sentences including a plurality of words. At least two words in the plurality of words are homonyms. The one or more processors convert the plurality of words associated with each input sentence into a first vector including possible senses for the at least two words. The first vector is then combined with a second vector including a domain-specific contextual vector associated with the at least two words. The combination of the first vector with the second vector is fed into a recurrent deep logico-neural network model to generate a third vector that includes word senses for the at least two words. A threshold is set for the third vector to generate a fourth vector including a final word sense vector for the at least two words.
    Type: Application
    Filed: September 30, 2020
    Publication date: March 31, 2022
    Inventors: Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Francois Pierre Luus, Ndivhuwo Makondo, Ryan Nelson Riegel, Alexander Gray
  • Publication number: 20210365817
    Abstract: A system for configuring and using a logical neural network including a graph syntax tree of formulae in a represented knowledge base, connected to one another via nodes representing each proposition. One neuron exists for each logical connective occurring in each formula and, additionally, one neuron for each unique proposition occurring in any formula. All neurons return pairs of values representing upper and lower bounds on truth values of their corresponding subformulae and propositions. Neurons corresponding to logical connectives accept as input the output of neurons corresponding to their operands and have activation functions configured to match the connectives' truth functions. Neurons corresponding to propositions accept as input the output of neurons established as proofs of bounds on the propositions' truth values and have activation functions configured to aggregate the tightest such bounds.
    Type: Application
    Filed: October 6, 2020
    Publication date: November 25, 2021
    Inventors: Ryan Nelson Riegel, Francois Pierre Luus, Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Ndivhuwo Makondo, Francisco Barahona, Alexander Gray
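
    The bounds-passing idea above can be illustrated with a few plain functions: every node returns a (lower, upper) truth-bound pair, connective nodes apply a truth function componentwise, and proposition nodes keep the tightest bounds among their proofs. The min/max (Goedel) truth functions and the rain/sprinkler example are stand-ins for the configurable activations and formulae.

      def and_node(*operand_bounds):
          """Conjunction neuron: componentwise minimum of the operands' bounds."""
          lowers, uppers = zip(*operand_bounds)
          return (min(lowers), min(uppers))

      def or_node(*operand_bounds):
          """Disjunction neuron: componentwise maximum of the operands' bounds."""
          lowers, uppers = zip(*operand_bounds)
          return (max(lowers), max(uppers))

      def proposition(*proof_bounds):
          """Proposition neuron: aggregate the tightest bounds offered by its proofs."""
          lowers, uppers = zip(*proof_bounds)
          return (max(lowers), min(uppers))

      # Knowledge: "raining" is known to be at least 0.9 true, "sprinkler" is unknown.
      raining = proposition((0.9, 1.0))
      sprinkler = proposition((0.0, 1.0))

      # Formula: wet_grass <- raining OR sprinkler, evaluated bottom-up as one proof.
      wet_grass = proposition(or_node(raining, sprinkler))
      print("wet_grass bounds     :", wet_grass)                       # (0.9, 1.0)
      print("raining AND sprinkler:", and_node(raining, sprinkler))    # (0.0, 1.0): unknown
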
  • Publication number: 20210357738
    Abstract: Maximum expressivity can be received representing a ratio between maximum and minimum input weights to a neuron of a neural network implementing a weighted real-valued logic gate. Operator arity can be received associated with the neuron. Logical constraints associated with the weighted real-valued logic gate can be determined in terms of weights associated with inputs to the neuron, a threshold-of-truth, and a neuron threshold for activation. The threshold-of-truth can be determined as a parameter used in an activation function of the neuron, based on solving an activation optimization formulated based on the logical constraints, the activation optimization maximizing a product of expressivity representing a distribution width of input weights to the neuron and gradient quality for the neuron given the operator arity and the maximum expressivity. The neural network of logical neurons can be trained using the activation function at the neuron, the activation function using the determined threshold-of-truth.
    Type: Application
    Filed: May 13, 2020
    Publication date: November 18, 2021
    Inventors: Francois Pierre Luus, Ryan Nelson Riegel, Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Etienne Eben Vos, Ndivhuwo Makondo
  • Publication number: 20210342380
    Abstract: An ontology topic is selected and a pretrained predictive language model is primed to create a predictive primed model based on one or more ontological rules corresponding to the selected ontology topic. Using the predictive primed model, natural language text is generated based on the ontology topic and guidance of a prediction steering component. The predictive primed model is guided in selecting text that is predicted to be appropriate for the ontology topic and the generated natural language text. The generated natural language text is processed to generate extracted ontology rules and the extracted ontology rules are compared to one or more rules of an ontology rule database that correspond to the ontology topic. A check is performed to determine whether the performance of the ontology extractor is acceptable.
    Type: Application
    Filed: April 29, 2020
    Publication date: November 4, 2021
    Inventors: Francois Pierre Luus, Etienne Eben Vos, Ndivhuwo Makondo, Naweed Aghmad Khan, Ismail Yunus Akhalwaya
  • Publication number: 20210334403
    Abstract: Methods and systems for generating representative data. A generator is configured to create, using a learning model, one or more generated records based on a plurality of training records obtained from a sensitive database. A discriminator is trained to identify the generated records as being generated based on the training records and a privacy adversary is trained to identify a training sample as being more similar to a distribution of the generated records than to a distribution of the reference records.
    Type: Application
    Filed: April 23, 2020
    Publication date: October 28, 2021
    Inventors: Francois Pierre Luus, Naweed Aghmad Khan, Ndivhuwo Makondo, Etienne Eben Vos, Ismail Yunus Akhalwaya