Patents by Inventor Ryan Nelson Riegel

Ryan Nelson Riegel has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240144058
    Abstract: According to one embodiment, a method, computer system, and computer program product for probabilistic inference from imprecise knowledge is provided. The embodiment may include identifying a knowledge base of one or more statements and first probability distributions corresponding to each of the one or more statements. The embodiment may also include identifying one or more queries. The embodiment may further include determining logical inferences about and second probability distributions for queries from the one or more queries or statements from the one or more statements based on information in the knowledge base.
    Type: Application
    Filed: October 28, 2022
    Publication date: May 2, 2024
    Inventors: Radu Marinescu, Haifeng Qian, Debarun Bhattacharjya, Alexander Gray, Francisco Barahona, Tian Gao, Ryan Nelson Riegel
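    The abstract above describes deriving probability distributions for queries from imprecise statement probabilities. A minimal sketch of that idea, using classical Fréchet inequalities to bound a conjunctive query from interval-valued statement probabilities (the function and knowledge-base names are illustrative, not from the patent):

    ```python
    # Bound P(A and B) given only interval bounds on P(A) and P(B),
    # via the Frechet inequalities: max(0, a+b-1) <= P(A and B) <= min(a, b).
    def query_and(p_a, p_b):
        (a_lo, a_hi), (b_lo, b_hi) = p_a, p_b
        lo = max(0.0, a_lo + b_lo - 1.0)
        hi = min(a_hi, b_hi)
        return (lo, hi)

    # Hypothetical knowledge base: statements with imprecise probabilities.
    kb = {"rain": (0.5, 0.8), "traffic": (0.75, 0.9)}
    print(query_and(kb["rain"], kb["traffic"]))  # (0.25, 0.8)
    ```

    The output interval is a second probability distribution (here, a bound) inferred for the query without assuming independence of the statements.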
  • Publication number: 20230409872
    Abstract: A computer-implemented method may include processors configured for receiving input data corresponding to a knowledge base comprising a plurality of propositional logic clauses, generating output data corresponding to a first set of logical rules and a first set of facts based on the plurality of propositional logic clauses, accumulating the first set of facts to generate accumulated facts, generating a model graph based on the accumulated facts and the first set of logical rules, alternating reasoning and learning passes at the model graph until convergence to generate a second set of logical rules and a second set of facts, and generating a third set of logical rules and a third set of facts, wherein the third set of logical rules and the third set of facts exceed a first predetermined threshold.
    Type: Application
    Filed: June 21, 2022
    Publication date: December 21, 2023
    Inventors: Ndivhuwo Makondo, Francois Pierre Luus, Naweed Aghmad Khan, Ismail Yunus Akhalwaya, Ryan Nelson Riegel, Oarabile Hope Moloko, Thabang Doreen Lebese
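    The "accumulating facts ... until convergence" step above can be sketched as forward chaining over propositional Horn rules, iterating passes until no new fact is derived. The rule and fact representation here is an assumption for illustration, not the patent's:

    ```python
    # Forward chaining: repeatedly apply rules (body_set -> head) to the
    # accumulated fact set until a pass adds nothing new (convergence).
    def forward_chain(rules, facts):
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for body, head in rules:
                if body <= facts and head not in facts:
                    facts.add(head)
                    changed = True
        return facts

    rules = [({"bird"}, "has_wings"), ({"has_wings"}, "can_fly")]
    print(sorted(forward_chain(rules, {"bird"})))
    # ['bird', 'can_fly', 'has_wings']
    ```

    The patent alternates such reasoning passes with learning passes that revise the rules themselves; only the reasoning half is shown here.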
  • Patent number: 11687724
    Abstract: Word sense disambiguation using a glossary layer embedded in a deep neural network includes receiving, by one or more processors, input sentences including a plurality of words. At least two words in the plurality of words are homonyms. The one or more processors convert the plurality of words associated with each input sentence into a first vector including possible senses for the at least two words. The first vector is then combined with a second vector including a domain-specific contextual vector associated with the at least two words. The combination of the first vector with the second vector is fed into a recurrent deep logico-neural network model to generate a third vector that includes word senses for the at least two words. A threshold is set for the third vector to generate a fourth vector including a final word sense vector for the at least two words.
    Type: Grant
    Filed: September 30, 2020
    Date of Patent: June 27, 2023
    Assignee: International Business Machines Corporation
    Inventors: Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Francois Pierre Luus, Ndivhuwo Makondo, Ryan Nelson Riegel, Alexander Gray
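    The final two steps of the abstract above (combine a sense vector with a context vector, then threshold to a final word-sense vector) can be illustrated with a toy example. The vector contents, the averaging combiner, and the 0.5 threshold are all assumptions for illustration:

    ```python
    import numpy as np

    senses = np.array([0.2, 0.9, 0.4])    # scores for candidate senses of a homonym
    context = np.array([0.1, 0.8, 0.3])   # domain-specific contextual scores
    combined = (senses + context) / 2.0   # third vector: fused sense scores
    final = (combined >= 0.5).astype(int) # fourth vector: thresholded final senses
    print(final)  # [0 1 0]
    ```

    In the patent, the combined vector is produced by a recurrent deep logico-neural network rather than simple averaging; the sketch shows only the thresholding mechanics.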
  • Publication number: 20230100883
    Abstract: To improve the technological process of computerized answering of logical queries over incomplete knowledge bases, obtain a first order logic query; with a trained, computerized neural network, convert the first order logic query into a logic embedding; and answer the first order logic query using the logic embedding.
    Type: Application
    Filed: September 28, 2021
    Publication date: March 30, 2023
    Inventors: Francois Pierre Luus, Prithviraj Sen, Ryan Nelson Riegel, Ndivhuwo Makondo, Thabang Doreen Lebese, Naweed Aghmad Khan, Pavan Kapanipathi
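    As a loose analogy for answering queries in embedding space, the sketch below uses a TransE-style translation embedding: the query answer is the entity nearest to head + relation. This is a stand-in for the patent's learned logic embeddings, and all vectors and names here are toy assumptions:

    ```python
    import numpy as np

    entities = {"Paris": np.array([1.0, 0.0]),
                "France": np.array([1.0, 1.0]),
                "Berlin": np.array([0.0, 0.0])}
    relations = {"capital_of": np.array([0.0, 1.0])}

    def answer(head, rel):
        # Translate the head entity by the relation, then return the
        # nearest entity in embedding space as the query answer.
        target = entities[head] + relations[rel]
        return min(entities, key=lambda e: np.linalg.norm(entities[e] - target))

    print(answer("Paris", "capital_of"))  # France
    ```

    The appeal of such embeddings is that they can answer queries even when the fact is absent from the knowledge base, which is the incomplete-knowledge-base setting the abstract targets.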
  • Patent number: 11494634
    Abstract: Maximum expressivity can be received representing a ratio between maximum and minimum input weights to a neuron of a neural network implementing a weighted real-valued logic gate. Operator arity can be received associated with the neuron. Logical constraints associated with the weighted real-valued logic gate can be determined in terms of weights associated with inputs to the neuron, a threshold-of-truth, and a neuron threshold for activation. The threshold-of-truth can be determined as a parameter used in an activation function of the neuron, based on solving an activation optimization formulated based on the logical constraints, the activation optimization maximizing a product of expressivity representing a distribution width of input weights to the neuron and gradient quality for the neuron given the operator arity and the maximum expressivity. The neural network of logical neurons can be trained using the activation function at the neuron, the activation function using the determined threshold-of-truth.
    Type: Grant
    Filed: May 13, 2020
    Date of Patent: November 8, 2022
    Assignee: International Business Machines Corporation
    Inventors: Francois Pierre Luus, Ryan Nelson Riegel, Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Etienne Eben Vos, Ndivhuwo Makondo
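    The weighted real-valued logic gate described above can be sketched in the Łukasiewicz style used in the logical neural network literature: with bias beta = 1 and unit weights, the gate reduces to the classical Łukasiewicz conjunction max(0, x + y - 1). Parameter names are illustrative, and the patented threshold-of-truth optimization is not reproduced:

    ```python
    # Weighted real-valued AND: clamp(beta - sum(w_i * (1 - x_i))) on [0, 1].
    def weighted_and(inputs, weights, beta=1.0):
        s = beta - sum(w * (1.0 - x) for x, w in zip(inputs, weights))
        return max(0.0, min(1.0, s))

    print(weighted_and([1.0, 1.0], [1.0, 1.0]))        # 1.0
    print(round(weighted_and([0.7, 0.6], [1.0, 1.0]), 2))  # 0.3
    ```

    Unequal weights let the gate treat some operands as more important than others; the abstract's activation optimization chooses the threshold-of-truth so that such weighted gates remain both expressive and trainable.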
  • Publication number: 20220100962
    Abstract: Word sense disambiguation using a glossary layer embedded in a deep neural network includes receiving, by one or more processors, input sentences including a plurality of words. At least two words in the plurality of words are homonyms. The one or more processors convert the plurality of words associated with each input sentence into a first vector including possible senses for the at least two words. The first vector is then combined with a second vector including a domain-specific contextual vector associated with the at least two words. The combination of the first vector with the second vector is fed into a recurrent deep logico-neural network model to generate a third vector that includes word senses for the at least two words. A threshold is set for the third vector to generate a fourth vector including a final word sense vector for the at least two words.
    Type: Application
    Filed: September 30, 2020
    Publication date: March 31, 2022
    Inventors: Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Francois Pierre Luus, Ndivhuwo Makondo, Ryan Nelson Riegel, Alexander Gray
  • Publication number: 20210365817
    Abstract: A system for configuring and using a logical neural network including a graph syntax tree of formulae in a represented knowledgebase connected to each other via nodes representing each proposition. One neuron exists for each logical connective occurring in each formula and, additionally, one neuron for each unique proposition occurring in any formula. All neurons return pairs of values representing upper and lower bounds on truth values of their corresponding subformulae and propositions. Neurons corresponding to logical connectives accept as input the output of neurons corresponding to their operands and have activation functions configured to match the connectives' truth functions. Neurons corresponding to propositions accept as input the output of neurons established as proofs of bounds on the propositions' truth values and have activation functions configured to aggregate the tightest such bounds.
    Type: Application
    Filed: October 6, 2020
    Publication date: November 25, 2021
    Inventors: Ryan Nelson Riegel, Francois Pierre Luus, Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Ndivhuwo Makondo, Francisco Barahona, Alexander Gray
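    The bounds mechanism in the abstract above (every neuron returns lower and upper bounds on a truth value, and proposition neurons aggregate the tightest bounds across proofs) can be sketched with plain tuples. The function names are illustrative, not the patent's API, and the connective shown is a Łukasiewicz conjunction:

    ```python
    # A connective neuron combines its operands' (lower, upper) truth bounds.
    def lukasiewicz_and(a, b):
        lo = max(0.0, a[0] + b[0] - 1.0)
        hi = max(0.0, a[1] + b[1] - 1.0)
        return (lo, hi)

    # A proposition neuron keeps the tightest bounds across its proofs:
    # the highest lower bound and the lowest upper bound.
    def aggregate(proofs):
        return (max(p[0] for p in proofs), min(p[1] for p in proofs))

    p, q = (0.8, 1.0), (0.9, 1.0)
    conj = lukasiewicz_and(p, q)           # bounds on p AND q
    print(aggregate([conj, (0.6, 0.95)]))  # tightest bounds from two proofs
    ```

    Carrying bounds rather than point values lets the network represent ignorance (wide bounds) and contradiction (lower bound above upper bound) explicitly.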
  • Publication number: 20210357738
    Abstract: Maximum expressivity can be received representing a ratio between maximum and minimum input weights to a neuron of a neural network implementing a weighted real-valued logic gate. Operator arity can be received associated with the neuron. Logical constraints associated with the weighted real-valued logic gate can be determined in terms of weights associated with inputs to the neuron, a threshold-of-truth, and a neuron threshold for activation. The threshold-of-truth can be determined as a parameter used in an activation function of the neuron, based on solving an activation optimization formulated based on the logical constraints, the activation optimization maximizing a product of expressivity representing a distribution width of input weights to the neuron and gradient quality for the neuron given the operator arity and the maximum expressivity. The neural network of logical neurons can be trained using the activation function at the neuron, the activation function using the determined threshold-of-truth.
    Type: Application
    Filed: May 13, 2020
    Publication date: November 18, 2021
    Inventors: Francois Pierre Luus, Ryan Nelson Riegel, Ismail Yunus Akhalwaya, Naweed Aghmad Khan, Etienne Eben Vos, Ndivhuwo Makondo