Patents by Inventor Ndivhuwo Makondo
Ndivhuwo Makondo has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11934441
Abstract: An ontology topic is selected and a pretrained predictive language model is primed to create a predictive primed model based on one or more ontological rules corresponding to the selected ontology topic. Using the predictive primed model, natural language text is generated based on the ontology topic and guidance of a prediction steering component. The predictive primed model is guided in selecting text that is predicted to be appropriate for the ontology topic and the generated natural language text. The generated natural language text is processed to generate extracted ontology rules and the extracted ontology rules are compared to one or more rules of an ontology rule database that correspond to the ontology topic. A check is performed to determine whether the performance of the ontology extractor is acceptable.
Type: Grant
Filed: April 29, 2020
Date of Patent: March 19, 2024
Assignee: International Business Machines Corporation
Inventors: Francois Pierre Luus, Etienne Eben Vos, Ndivhuwo Makondo, Naweed Aghmad Khan, Ismail Yunus Akhalwaya
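The loop this abstract describes (prime a language model with ontology rules, steer generation toward the topic, re-extract rules from the generated text, and score the extractor against a rule database) can be caricatured end-to-end. This is a toy sketch only: the bigram table, steering vocabulary, rule database, and `is_a` pattern are all invented stand-ins, not the patented implementation.

```python
import re
from collections import defaultdict

# Hypothetical rule database for one ontology topic.
RULE_DB = {"animal": {("cat", "is_a", "animal"), ("dog", "is_a", "animal")}}

def prime_model(rule_sentences):
    """'Priming': build a bigram next-word table from ontology-rule text."""
    bigrams = defaultdict(list)
    for sentence in rule_sentences:
        words = sentence.lower().split()
        for a, b in zip(words, words[1:]):
            bigrams[a].append(b)
    return bigrams

def generate(model, start, topic_terms, length=8):
    """Greedy generation, steered toward words belonging to the topic."""
    out = [start]
    for _ in range(length):
        cands = model.get(out[-1])
        if not cands:
            break
        # Steering component: prefer a candidate that is an on-topic term.
        out.append(max(cands, key=lambda w: w in topic_terms))
    return " ".join(out)

def extract_rules(text):
    """Extract (subject, is_a, object) triples from 'X is a Y' patterns."""
    return {(s, "is_a", o) for s, o in re.findall(r"(\w+) is an? (\w+)", text)}

def extractor_performance(extracted, topic):
    """Fraction of the topic's database rules recovered by the extractor."""
    db = RULE_DB[topic]
    return len(extracted & db) / len(db)

rules = ["a cat is a animal", "a dog is a animal"]
model = prime_model(rules)
text = generate(model, "cat", topic_terms={"animal"})
perf = extractor_performance(extract_rules(text), "animal")
```

Here the performance check recovers one of the two database rules (the `cat` rule but not the `dog` rule), giving a score of 0.5.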
-
Publication number: 20230409872
Abstract: A computer-implemented method may include processors configured for receiving input data corresponding to a knowledge base comprising a plurality of propositional logic clauses, generating output data corresponding to a first set of logical rules and a first set of facts based on the plurality of propositional logic clauses, accumulating the first set of facts to generate accumulated facts, generating a model graph based on the accumulated facts and the first set of logical rules, alternating reasoning and learning passes at the model graph until convergence to generate a second set of logical rules and a second set of facts, and generating a third set of logical rules and a third set of facts, wherein the third set of logical rules and the third set of facts exceed a first predetermined threshold.
Type: Application
Filed: June 21, 2022
Publication date: December 21, 2023
Inventors: Ndivhuwo Makondo, Francois Pierre Luus, Naweed Aghmad Khan, Ismail Yunus Akhalwaya, Ryan Nelson Riegel, Oarabile Hope Moloko, Thabang Doreen Lebese
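The "reasoning passes until convergence" step above can be caricatured as classic forward chaining over propositional Horn clauses: split the knowledge base into rules and facts, then repeatedly fire rules until the fact set reaches a fixed point. This sketch covers only the reasoning half (the learning passes and the model graph are omitted) and is not the patented method.

```python
def split_kb(clauses):
    """Separate (premises, conclusion) clauses into rules and initial facts."""
    rules = [c for c in clauses if c[0]]          # non-empty premises
    facts = {c[1] for c in clauses if not c[0]}   # premise-free clauses
    return rules, facts

def reasoning_pass(rules, facts):
    """One pass: fire every rule whose premises all hold."""
    return facts | {head for body, head in rules if set(body) <= facts}

def reason_to_convergence(rules, facts):
    """Repeat passes until no new facts are derived (a fixed point)."""
    while True:
        new = reasoning_pass(rules, facts)
        if new == facts:
            return facts
        facts = new

# Toy knowledge base: rain is a fact; two chained implication clauses.
kb = [((), "rain"), (("rain",), "wet"), (("wet",), "slippery")]
rules, facts = split_kb(kb)
closed = reason_to_convergence(rules, facts)
```

Two passes suffice here: the first derives `wet`, the second `slippery`, and a final pass detects convergence.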
-
Patent number: 11803657
Abstract: Methods and systems for generating representative data. A generator is configured to create, using a learning model, one or more generated records based on a plurality of training records obtained from a sensitive database. A discriminator is trained to identify the generated records as being generated based on the training records and a privacy adversary is trained to identify a training sample as being more similar to a distribution of the generated records than a distribution of the reference records.
Type: Grant
Filed: April 23, 2020
Date of Patent: October 31, 2023
Assignee: International Business Machines Corporation
Inventors: Francois Pierre Luus, Naweed Aghmad Khan, Ndivhuwo Makondo, Etienne Eben Vos, Ismail Yunus Akhalwaya
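The three roles named in the abstract (generator, discriminator, privacy adversary) can be illustrated with a deterministic one-dimensional toy: the "generator" fits the mean and spread of sensitive records, the "discriminator" checks whether a record looks generated, and the "privacy adversary" performs a crude membership test against a public reference distribution. All numbers and function names here are invented for illustration; this is nothing like the patented learning models.

```python
import statistics

training = [4.8, 5.1, 5.3, 4.9, 5.0, 5.2]     # sensitive records (toy)
reference = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2]  # public reference records (toy)

def fit_generator(records):
    """'Training' the generator: estimate mean and spread of the records."""
    return statistics.mean(records), statistics.stdev(records)

def generate(params, n):
    """Deterministic stand-in for sampling: spread points around the mean."""
    mu, sigma = params
    return [mu + sigma * (i / n - 0.5) for i in range(n + 1)]

def discriminator(record, generated):
    """Toy discriminator: call a record 'generated' if it lies inside the
    generated sample's range."""
    return min(generated) <= record <= max(generated)

def privacy_adversary(sample, generated, reference):
    """Crude membership test: flag the sample if it sits nearer the
    generated distribution's mean than the reference distribution's mean."""
    g_mu = statistics.mean(generated)
    r_mu = statistics.mean(reference)
    return abs(sample - g_mu) < abs(sample - r_mu)

generated = generate(fit_generator(training), 200)
flag_train = privacy_adversary(training[0], generated, reference)  # member
flag_ref = privacy_adversary(reference[0], generated, reference)   # non-member
```

A training record is flagged while a reference record is not, which is exactly the leakage signal a privacy adversary is trained to exploit.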
-
Patent number: 11687724
Abstract: Word sense disambiguation using a glossary layer embedded in a deep neural network includes receiving, by one or more processors, input sentences including a plurality of words. At least two words in the plurality of words are homonyms. The one or more processors convert the plurality of words associated with each input sentence into a first vector including possible senses for the at least two words. The first vector is then combined with a second vector including a domain-specific contextual vector associated with the at least two words. The combination of the first vector with the second vector is fed into a recurrent deep logico-neural network model to generate a third vector that includes word senses for the at least two words. A threshold is set for the third vector to generate a fourth vector including a final word sense vector for the at least two words.
Type: Grant
Filed: September 30, 2020
Date of Patent: June 27, 2023
Assignee: International Business Machines Corporation
Inventors: Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Francois Pierre Luus, Ndivhuwo Makondo, Ryan Nelson Riegel, Alexander Gray
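The four-vector pipeline in the abstract (sense vector, contextual vector, combined vector, thresholded final vector) can be traced with plain lists. The sense inventory, cue words, and combination rule below are invented placeholders, and a simple element-wise product stands in for the recurrent logico-neural network.

```python
# Hypothetical sense inventory for one homonym.
SENSES = {"bank": ["river_bank", "financial_bank"]}

def sense_vector(word):
    """First vector: uniform scores over the word's possible senses."""
    n = len(SENSES[word])
    return [1.0 / n] * n

def context_vector(word, sentence):
    """Second vector: domain-specific evidence from co-occurring words."""
    cues = {"river_bank": {"water", "shore"}, "financial_bank": {"money", "loan"}}
    words = set(sentence.lower().split())
    return [len(cues[s] & words) for s in SENSES[word]]

def combine(v1, v2):
    """Third vector: element-wise combination of sense and context scores
    (stand-in for the recurrent deep logico-neural network)."""
    return [a * (1 + b) for a, b in zip(v1, v2)]

def threshold(v, t=0.6):
    """Fourth vector: final word-sense vector after thresholding."""
    return [1 if x >= t else 0 for x in v]

sentence = "she deposited money at the bank"
v3 = combine(sense_vector("bank"), context_vector("bank", sentence))
final = threshold(v3)
```

The cue word "money" boosts the financial sense, so thresholding yields a one-hot final sense vector selecting `financial_bank`.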
-
Publication number: 20230100883
Abstract: To improve the technological process of computerized answering of logical queries over incomplete knowledge bases, obtain a first order logic query; with a trained, computerized neural network, convert the first order logic query into a logic embedding; and answer the first order logic query using the logic embedding.
Type: Application
Filed: September 28, 2021
Publication date: March 30, 2023
Inventors: Francois Pierre Luus, Prithviraj Sen, Ryan Nelson Riegel, Ndivhuwo Makondo, Thabang Doreen Lebese, Naweed Aghmad Khan, Pavan Kapanipathi
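The idea of answering a first-order query through an embedding can be caricatured by letting plain Python sets stand in for learned embeddings: existential quantification becomes a relation image and conjunction becomes intersection. The knowledge graph and relation names below are invented, and nothing here approximates the trained neural network of the application.

```python
# Hypothetical knowledge graph: relation -> set of (head, tail) edges.
EDGES = {
    "located_in": {("paris", "france"), ("lyon", "france"), ("rome", "italy")},
    "capital_of": {("paris", "france"), ("rome", "italy")},
}

def relation_image(relation, entities):
    """Existential quantification: all x with relation(x, e) for some e
    in the embedded entity set."""
    return {h for h, t in EDGES[relation] if t in entities}

def conjunction(a, b):
    """Logical AND of two embedded sets."""
    return a & b

# Query: "?x . located_in(x, france) AND capital_of(x, france)"
answer = conjunction(relation_image("located_in", {"france"}),
                     relation_image("capital_of", {"france"}))
```

Composing set operations this way mirrors how query-embedding methods compose vector-space operators, with learned geometry replacing exact set membership.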
-
Patent number: 11494634
Abstract: Maximum expressivity can be received representing a ratio between maximum and minimum input weights to a neuron of a neural network implementing a weighted real-valued logic gate. Operator arity can be received associated with the neuron. Logical constraints associated with the weighted real-valued logic gate can be determined in terms of weights associated with inputs to the neuron, a threshold-of-truth, and a neuron threshold for activation. The threshold-of-truth can be determined as a parameter used in an activation function of the neuron, based on solving an activation optimization formulated based on the logical constraints, the activation optimization maximizing a product of expressivity representing a distribution width of input weights to the neuron and gradient quality for the neuron given the operator arity and the maximum expressivity. The neural network of logical neurons can be trained using the activation function at the neuron, the activation function using the determined threshold-of-truth.
Type: Grant
Filed: May 13, 2020
Date of Patent: November 8, 2022
Assignee: International Business Machines Corporation
Inventors: Francois Pierre Luus, Ryan Nelson Riegel, Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Etienne Eben Vos, Ndivhuwo Makondo
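A weighted real-valued logic gate of the kind the abstract refers to can be sketched as a Łukasiewicz-style AND: each input's degree of falsehood is subtracted in proportion to its weight, with the result clamped to [0, 1]. The parameter names (`beta`, `ALPHA`) are illustrative choices, not taken from the patent, and the activation optimization itself is omitted.

```python
def clamp01(x):
    return max(0.0, min(1.0, x))

def weighted_and(inputs, weights, beta=1.0):
    """Real-valued AND: fully-true inputs give 1.0; each input's falsehood
    (1 - x) is subtracted in proportion to its weight."""
    return clamp01(beta - sum(w * (1.0 - x) for x, w in zip(inputs, weights)))

def expressivity(weights):
    """Ratio between the maximum and minimum input weights to the neuron."""
    return max(weights) / min(weights)

ALPHA = 0.7  # threshold-of-truth: activations at or above ALPHA count as true

def is_true(activation, alpha=ALPHA):
    return activation >= alpha

t = weighted_and([1.0, 1.0], [1.0, 2.0])  # all-true inputs -> 1.0
f = weighted_and([1.0, 0.0], [1.0, 2.0])  # one false, heavy input -> 0.0
e = expressivity([1.0, 2.0])              # weight ratio -> 2.0
```

The trade-off the patent optimizes is visible even here: wider weight ratios (higher expressivity) push more activations into the clamped regions, where gradients vanish, which is why the threshold-of-truth must be chosen jointly with the weight constraints.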
-
Publication number: 20220100962
Abstract: Word sense disambiguation using a glossary layer embedded in a deep neural network includes receiving, by one or more processors, input sentences including a plurality of words. At least two words in the plurality of words are homonyms. The one or more processors convert the plurality of words associated with each input sentence into a first vector including possible senses for the at least two words. The first vector is then combined with a second vector including a domain-specific contextual vector associated with the at least two words. The combination of the first vector with the second vector is fed into a recurrent deep logico-neural network model to generate a third vector that includes word senses for the at least two words. A threshold is set for the third vector to generate a fourth vector including a final word sense vector for the at least two words.
Type: Application
Filed: September 30, 2020
Publication date: March 31, 2022
Inventors: Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Francois Pierre Luus, Ndivhuwo Makondo, Ryan Nelson Riegel, Alexander Gray
-
Publication number: 20210365817
Abstract: A system for configuring and using a logical neural network including a graph syntax tree of formulae in a represented knowledge base connected to each other via nodes representing each proposition. One neuron exists for each logical connective occurring in each formula and, additionally, one neuron for each unique proposition occurring in any formula. All neurons return pairs of values representing upper and lower bounds on truth values of their corresponding subformulae and propositions. Neurons corresponding to logical connectives accept as input the output of neurons corresponding to their operands and have activation functions configured to match the connectives' truth functions. Neurons corresponding to propositions accept as input the output of neurons established as proofs of bounds on the propositions' truth values and have activation functions configured to aggregate the tightest such bounds.
Type: Application
Filed: October 6, 2020
Publication date: November 25, 2021
Inventors: Ryan Nelson Riegel, Francois Pierre Luus, Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Ndivhuwo Makondo, Francisco Barahona, Alexander Gray
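The bounds bookkeeping in this abstract can be sketched directly: every node carries a (lower, upper) pair of truth bounds, a connective node computes bounds from its operands' bounds, and a proposition node aggregates the tightest bounds proved for it. The AND rule below uses the min truth function for concreteness; the patent's networks support other weighted connectives.

```python
def and_bounds(a, b):
    """AND of two bounded truth values under the min truth function:
    since min is monotone, take min of lowers and min of uppers."""
    return (min(a[0], b[0]), min(a[1], b[1]))

def tighten(current, proved):
    """Aggregate a newly proved bound for a proposition: raise the lower
    bound and lower the upper bound, keeping the tightest interval."""
    return (max(current[0], proved[0]), min(current[1], proved[1]))

UNKNOWN = (0.0, 1.0)  # a proposition with no proofs yet is fully unknown

p = (0.8, 1.0)         # proposition p proved at least 0.8 true
q = (0.6, 0.9)         # proposition q bounded between 0.6 and 0.9
pq = and_bounds(p, q)  # bounds on the subformula p AND q

# Proposition r starts unknown; two proofs tighten its bounds.
r = tighten(tighten(UNKNOWN, (0.5, 1.0)), (0.7, 0.95))
```

Representing truth as an interval rather than a scalar is what lets such a network distinguish "unknown" (wide bounds) from "contradictory" (lower bound exceeding upper bound) during inference.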
-
Publication number: 20210357738
Abstract: Maximum expressivity can be received representing a ratio between maximum and minimum input weights to a neuron of a neural network implementing a weighted real-valued logic gate. Operator arity can be received associated with the neuron. Logical constraints associated with the weighted real-valued logic gate can be determined in terms of weights associated with inputs to the neuron, a threshold-of-truth, and a neuron threshold for activation. The threshold-of-truth can be determined as a parameter used in an activation function of the neuron, based on solving an activation optimization formulated based on the logical constraints, the activation optimization maximizing a product of expressivity representing a distribution width of input weights to the neuron and gradient quality for the neuron given the operator arity and the maximum expressivity. The neural network of logical neurons can be trained using the activation function at the neuron, the activation function using the determined threshold-of-truth.
Type: Application
Filed: May 13, 2020
Publication date: November 18, 2021
Inventors: Francois Pierre Luus, Ryan Nelson Riegel, Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Etienne Eben Vos, Ndivhuwo Makondo
-
Publication number: 20210342380
Abstract: An ontology topic is selected and a pretrained predictive language model is primed to create a predictive primed model based on one or more ontological rules corresponding to the selected ontology topic. Using the predictive primed model, natural language text is generated based on the ontology topic and guidance of a prediction steering component. The predictive primed model is guided in selecting text that is predicted to be appropriate for the ontology topic and the generated natural language text. The generated natural language text is processed to generate extracted ontology rules and the extracted ontology rules are compared to one or more rules of an ontology rule database that correspond to the ontology topic. A check is performed to determine whether the performance of the ontology extractor is acceptable.
Type: Application
Filed: April 29, 2020
Publication date: November 4, 2021
Inventors: Francois Pierre Luus, Etienne Eben Vos, Ndivhuwo Makondo, Naweed Aghmad Khan, Ismail Yunus Akhalwaya
-
Publication number: 20210334403
Abstract: Methods and systems for generating representative data. A generator is configured to create, using a learning model, one or more generated records based on a plurality of training records obtained from a sensitive database. A discriminator is trained to identify the generated records as being generated based on the training records and a privacy adversary is trained to identify a training sample as being more similar to a distribution of the generated records than a distribution of the reference records.
Type: Application
Filed: April 23, 2020
Publication date: October 28, 2021
Inventors: Francois Pierre Luus, Naweed Aghmad Khan, Ndivhuwo Makondo, Etienne Eben Vos, Ismail Yunus Akhalwaya