Patents by Inventor Mohammed J. Zaki

Mohammed J. Zaki has filed for patents to protect the following inventions. This listing includes published patent applications as well as patents that have been granted by the United States Patent and Trademark Office (USPTO). Brief, illustrative code sketches of several of the methods described below appear after the listing.

  • Patent number: 11816136
    Abstract: For a passage text and a corresponding answer text, perform a word-level soft alignment to obtain contextualized passage embeddings and contextualized answer embeddings, and a hidden level soft alignment on the contextualized passage embeddings and the contextualized answer embeddings to obtain a passage embedding matrix. Construct a passage graph of the passage text based on the passage embedding matrix, and apply a bidirectional gated graph neural network to the passage graph until a final state embedding is determined, during which intermediate node embeddings are fused from both incoming and outgoing edges. Obtain a graph-level embedding from the final state embedding, and decode the final state embedding to generate an output sequence word-by-word. Train a machine learning model to generate at least one question corresponding to the passage text and the answer text, by evaluating the output sequence with a hybrid evaluator combining cross-entropy evaluation and reinforcement learning evaluation.
    Type: Grant
    Filed: October 23, 2022
    Date of Patent: November 14, 2023
    Assignees: International Business Machines Corporation, Rensselaer Polytechnic Institute
    Inventors: Lingfei Wu, Yu Chen, Mohammed J. Zaki
  • Patent number: 11593672
    Abstract: Aspects described herein include a method of conversational machine reading comprehension, as well as an associated system and computer program product. The method comprises receiving a plurality of questions relating to a context, and generating a sequence of context graphs. Each of the context graphs includes encoded representations of: (i) the context, (ii) a respective question of the plurality of questions, and (iii) a respective conversation history reflecting: (a) one or more previous questions relative to the respective question, and (b) one or more previous answers to the one or more previous questions. The method further comprises identifying, using at least one graph neural network, one or more temporal dependencies between adjacent context graphs of the sequence. The method further comprises predicting, based at least on the one or more temporal dependencies, an answer for a first question of the plurality of questions.
    Type: Grant
    Filed: August 22, 2019
    Date of Patent: February 28, 2023
    Assignee: International Business Machines Corporation
    Inventors: Lingfei Wu, Mohammed J. Zaki, Yu Chen
  • Publication number: 20230055666
    Abstract: For a passage text and a corresponding answer text, perform a word-level soft alignment to obtain contextualized passage embeddings and contextualized answer embeddings, and a hidden level soft alignment on the contextualized passage embeddings and the contextualized answer embeddings to obtain a passage embedding matrix. Construct a passage graph of the passage text based on the passage embedding matrix, and apply a bidirectional gated graph neural network to the passage graph until a final state embedding is determined, during which intermediate node embeddings are fused from both incoming and outgoing edges. Obtain a graph-level embedding from the final state embedding, and decode the final state embedding to generate an output sequence word-by-word. Train a machine learning model to generate at least one question corresponding to the passage text and the answer text, by evaluating the output sequence with a hybrid evaluator combining cross-entropy evaluation and reinforcement learning evaluation.
    Type: Application
    Filed: October 23, 2022
    Publication date: February 23, 2023
    Inventors: Lingfei Wu, Yu Chen, Mohammed J. Zaki
  • Patent number: 11481418
    Abstract: For a passage text and a corresponding answer text, perform a word-level soft alignment to obtain contextualized passage embeddings and contextualized answer embeddings, and a hidden level soft alignment on the contextualized passage embeddings and the contextualized answer embeddings to obtain a passage embedding matrix. Construct a passage graph of the passage text based on the passage embedding matrix, and apply a bidirectional gated graph neural network to the passage graph until a final state embedding is determined, during which intermediate node embeddings are fused from both incoming and outgoing edges. Obtain a graph-level embedding from the final state embedding, and decode the final state embedding to generate an output sequence word-by-word. Train a machine learning model to generate at least one question corresponding to the passage text and the answer text, by evaluating the output sequence with a hybrid evaluator combining cross-entropy evaluation and reinforcement learning evaluation.
    Type: Grant
    Filed: April 9, 2020
    Date of Patent: October 25, 2022
    Assignees: International Business Machines Corporation, Rensselaer Polytechnic Institute
    Inventors: Lingfei Wu, Yu Chen, Mohammed J. Zaki
  • Publication number: 20220027707
    Abstract: A method, a computer program product, and a system for subgraph guided knowledge graph question generation. The method includes inputting a knowledge graph subgraph and a target answer into a long short-term memory encoder. The method also includes producing embeddings relating to the nodes and the edges. The method includes indicating the embeddings associated with the target answer. The method includes applying a graph neural network encoder computation in an iterative manner to the embeddings, with updated embeddings produced by the GNN encoder acting as initial values that are applied to the GNN encoder for a next iteration, until final state embeddings are produced. The method includes computing a graph-level embedding based on the final state embeddings and computing, by a recurrent neural network decoder, a question relating to the target answer and the knowledge graph subgraph using the graph-level embedding.
    Type: Application
    Filed: July 24, 2020
    Publication date: January 27, 2022
    Inventors: Lingfei Wu, Yu Chen, Mohammed J. Zaki
  • Publication number: 20210374499
    Abstract: An initial noisy graph topology is obtained and an initial adjacency matrix is generated by a similarity learning component using similarity learning and a similarity metric function. An updated adjacency matrix with node embeddings is produced from the initial adjacency matrix using a graph neural network (GNN). The node embeddings are fed back to revise the similarity learning component. The generating, producing, and feeding back operations are repeated for a plurality of iterations.
    Type: Application
    Filed: May 26, 2020
    Publication date: December 2, 2021
    Inventors: Lingfei Wu, Yu Chen, Mohammed J. Zaki
  • Publication number: 20210209139
    Abstract: For a passage text and a corresponding answer text, perform a word-level soft alignment to obtain contextualized passage embeddings and contextualized answer embeddings, and a hidden level soft alignment on the contextualized passage embeddings and the contextualized answer embeddings to obtain a passage embedding matrix. Construct a passage graph of the passage text based on the passage embedding matrix, and apply a bidirectional gated graph neural network to the passage graph until a final state embedding is determined, during which intermediate node embeddings are fused from both incoming and outgoing edges. Obtain a graph-level embedding from the final state embedding, and decode the final state embedding to generate an output sequence word-by-word. Train a machine learning model to generate at least one question corresponding to the passage text and the answer text, by evaluating the output sequence with a hybrid evaluator combining cross-entropy evaluation and reinforcement learning evaluation.
    Type: Application
    Filed: April 9, 2020
    Publication date: July 8, 2021
    Inventors: Lingfei Wu, Yu Chen, Mohammed J. Zaki
  • Patent number: 10540354
    Abstract: A method for discovering representative composite configuration item (CI) patterns in an IT system that includes a plurality of configuration items may include data mining a graph representing the IT system to extract extended frequent composite CI patterns. The method may also include clustering the extended frequent composite CI patterns into clusters based on similarity between the maximal frequent composite CI patterns. The method may further include extracting a representative composite CI pattern for each of the clusters, and using an output device, outputting the representative composite CI pattern for each of the clusters.
    Type: Grant
    Filed: October 17, 2011
    Date of Patent: January 21, 2020
    Assignee: Micro Focus LLC
    Inventors: Omer Barkol, Shahar Golan, Ruth Bergman, Yifat Felder, Arik Sityon, Mohammed J. Zaki, Pranay Anchuri
  • Publication number: 20130097138
    Abstract: A method for discovering representative composite configuration item (CI) patterns in an IT system that includes a plurality of configuration items may include data mining a graph representing the IT system to extract extended frequent composite CI patterns. The method may also include clustering the extended frequent composite CI patterns into clusters based on similarity between the maximal frequent composite CI patterns. The method may further include extracting a representative composite CI pattern for each of the clusters, and using an output device, outputting the representative composite CI pattern for each of the clusters.
    Type: Application
    Filed: October 17, 2011
    Publication date: April 18, 2013
    Inventors: Omer Barkol, Shahar Golan, Ruth Bergman, Yifat Felder, Arik Sityon, Mohammed J. Zaki, Pranay Anchuri
  • Patent number: 6230151
    Abstract: A method and system for generating a decision-tree classifier in parallel in a shared-memory multiprocessor system is disclosed. The processors first generate in the shared memory an attribute list for each record attribute. Each attribute list is assigned to a processor. The processors independently determine the best splits for their respective assigned lists, and cooperatively determine a global best split for all attribute lists. The attribute lists are reassigned to the processors and split according to the global best split into the lists for child nodes. The split attribute lists are again assigned to the processors and the process is repeated for each new child node until each attribute list for the new child nodes includes only tuples of the same record class or a fixed number of tuples.
    Type: Grant
    Filed: April 16, 1998
    Date of Patent: May 8, 2001
    Assignee: International Business Machines Corporation
    Inventors: Rakesh Agrawal, Ching-Tien Ho, Mohammed J. Zaki
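
The question-generation filings above (patent 11816136 and the related application and grant that share its abstract) describe, among other steps, a bidirectional gated graph neural network that fuses node information arriving over incoming and outgoing edges. The snippet below is a minimal NumPy sketch of that bidirectional aggregation idea only; the toy graph, dimensions, and weight matrices are invented for illustration, and none of this reproduces the patented implementation.

```python
# Minimal sketch (not the patented implementation) of bidirectional
# message passing over a passage graph: each node fuses information
# arriving over incoming and outgoing edges before a gated update.
import numpy as np

rng = np.random.default_rng(0)

num_nodes, dim = 5, 8
# Toy directed passage graph as an adjacency matrix (1 = edge i -> j).
A = np.array([[0, 1, 0, 0, 1],
              [0, 0, 1, 0, 0],
              [0, 0, 0, 1, 0],
              [0, 0, 0, 0, 1],
              [0, 0, 0, 0, 0]], dtype=float)

H = rng.normal(size=(num_nodes, dim))        # initial node embeddings
W_in = rng.normal(size=(dim, dim)) * 0.1     # transform for incoming messages
W_out = rng.normal(size=(dim, dim)) * 0.1    # transform for outgoing messages
W_z = rng.normal(size=(2 * dim, dim)) * 0.1  # update-gate parameters

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for step in range(3):                        # a few propagation steps
    m_in = A.T @ (H @ W_in)                  # aggregate over incoming edges
    m_out = A @ (H @ W_out)                  # aggregate over outgoing edges
    m = m_in + m_out                         # fuse both directions
    z = sigmoid(np.concatenate([H, m], axis=1) @ W_z)  # GRU-style gate
    H = (1 - z) * H + z * np.tanh(m)         # gated state update

graph_embedding = H.mean(axis=0)             # simple graph-level readout
print(graph_embedding.shape)                 # (8,)
```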
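
Patent 11593672 describes building one context graph per question turn and using a graph neural network to capture temporal dependencies between adjacent graphs. The sketch below illustrates that flow of information from one turn's graph to the next with toy, randomly initialized tensors; the fully connected toy graph and the node-scoring answer step are simplifying assumptions, not the claimed method.

```python
# Minimal sketch (illustrative only) of passing information between
# adjacent context graphs in a conversation, one graph per question turn.
import numpy as np

rng = np.random.default_rng(1)
num_turns, num_nodes, dim = 3, 4, 6

# One toy context graph per turn: a shared node set, with per-turn embeddings
# that already mix context, current question, and conversation history.
graphs = [rng.normal(size=(num_nodes, dim)) for _ in range(num_turns)]
A = np.ones((num_nodes, num_nodes)) / num_nodes   # toy fully connected graph
W = rng.normal(size=(dim, dim)) * 0.1
U = rng.normal(size=(dim, dim)) * 0.1             # temporal link to previous turn

states = []
prev = np.zeros((num_nodes, dim))
for t, X in enumerate(graphs):
    spatial = A @ X @ W            # message passing inside the turn's graph
    temporal = prev @ U            # dependency on the adjacent (previous) graph
    H = np.tanh(spatial + temporal)
    states.append(H)
    prev = H

# Toy answer prediction for the latest question: score each node (e.g. as a
# candidate answer anchor) and pick the best one.
w_ans = rng.normal(size=dim)
scores = states[-1] @ w_ans
print("predicted answer node:", int(np.argmax(scores)))
```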
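
Publication 20220027707 describes encoding a knowledge-graph subgraph, indicating the target answer, iterating a GNN encoder until final state embeddings are produced, pooling to a graph-level embedding, and decoding a question. The sketch below walks through that pipeline on a two-triple toy subgraph; the random stand-in encodings (in place of the LSTM encoder), the tiny vocabulary, and the single decoder step are all illustrative assumptions.

```python
# Minimal sketch (assumptions throughout) of the overall flow: embed the
# nodes of a small knowledge-graph subgraph, flag the target answer,
# propagate with a few GNN-style iterations, pool to a graph-level vector,
# and score a first decoded token over a tiny made-up vocabulary.
import numpy as np

rng = np.random.default_rng(2)
dim = 8

# Toy KG subgraph: (head, relation, tail) triples; node 2 is the target answer.
triples = [(0, "directed_by", 1), (1, "born_in", 2)]
nodes = sorted({n for h, _, t in triples for n in (h, t)})
answer_node = 2

X = rng.normal(size=(len(nodes), dim))   # stand-in for LSTM node encodings
X[answer_node] += 1.0                    # "indicate" the answer embedding

A = np.zeros((len(nodes), len(nodes)))
for h, _, t in triples:                  # symmetric toy adjacency
    A[h, t] = A[t, h] = 1.0

W = rng.normal(size=(dim, dim)) * 0.1
H = X
for _ in range(3):                       # iterative GNN encoding: each iteration
    H = np.tanh(A @ H @ W + H)           # starts from the previous iteration's output

graph_vec = H.mean(axis=0)               # graph-level embedding

# First decoder step: a distribution over a hypothetical vocabulary,
# conditioned on the graph-level embedding (illustrative only).
vocab = ["where", "was", "the", "director", "born", "?"]
W_out = rng.normal(size=(dim, len(vocab))) * 0.1
logits = graph_vec @ W_out
probs = np.exp(logits) / np.exp(logits).sum()
print("first decoded token:", vocab[int(np.argmax(probs))])
```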
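
Publication 20210374499 describes iteratively refining a noisy graph topology: a similarity metric produces an adjacency matrix, a GNN produces node embeddings, and those embeddings are fed back into the similarity-learning step. The loop below sketches that cycle with a cosine-style similarity and a fixed random GNN weight; both choices are assumptions made for brevity, not the disclosed system.

```python
# Minimal sketch (not the disclosed system) of the iterate-and-refine loop:
# learn an adjacency matrix from node features with a cosine-style similarity
# metric, run a simple GNN layer, and feed the new embeddings back into the
# similarity step for the next iteration.
import numpy as np

rng = np.random.default_rng(3)
num_nodes, dim = 6, 5

X = rng.normal(size=(num_nodes, dim))   # noisy initial node features
W = rng.normal(size=(dim, dim)) * 0.1   # GNN weight (fixed here; learned in practice)

def similarity_adjacency(H, threshold=0.3):
    """Cosine similarity metric, sparsified into an adjacency matrix."""
    norms = np.linalg.norm(H, axis=1, keepdims=True) + 1e-8
    S = (H / norms) @ (H / norms).T
    np.fill_diagonal(S, 0.0)
    return np.where(S > threshold, S, 0.0)

H = X
for iteration in range(4):
    A = similarity_adjacency(H)             # (re)learn the graph topology
    deg = A.sum(axis=1, keepdims=True) + 1e-8
    H = np.tanh((A / deg) @ H @ W + H @ W)  # one GNN layer over the learned graph
    # H is fed back: the next iteration's adjacency is computed from it.

print("final adjacency has", int((similarity_adjacency(H) > 0).sum()), "edges")
```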
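
Patent 10540354 (and its application publication 20130097138) describes mining frequent composite configuration-item patterns from a graph of an IT system, clustering them by similarity, and outputting a representative pattern per cluster. The sketch below mimics that mine-cluster-represent pipeline on a handful of labeled edges; the support counting, the Jaccard measure, and the greedy clustering are drastic simplifications chosen to keep the toy runnable, not the patented algorithm.

```python
# Minimal sketch (a simplification, not the patented miner) of the pipeline:
# count candidate composite CI patterns in a toy IT-topology graph, keep the
# frequent ones, cluster them by Jaccard similarity of their edge labels, and
# output one representative pattern per cluster.
from collections import Counter
from itertools import combinations

# Toy IT topology: edges labeled (source CI type, relation, target CI type).
edges = [
    ("server", "runs", "app"), ("server", "runs", "app"),
    ("server", "hosts", "db"), ("server", "hosts", "db"),
    ("app", "uses", "db"), ("app", "uses", "db"),
    ("router", "links", "server"),
]

# Candidate composite patterns = sets of co-occurring edge labels (size <= 2).
label_counts = Counter(edges)
candidates = [frozenset([e]) for e in label_counts]
candidates += [a | b for a, b in combinations(candidates, 2)]

def support(pattern):
    return min(label_counts[e] for e in pattern)

frequent = [p for p in candidates if support(p) >= 2]

def jaccard(p, q):
    return len(p & q) / len(p | q)

# Greedy clustering: add a pattern to the first cluster it is similar to.
clusters = []
for p in sorted(frequent, key=len, reverse=True):
    for cluster in clusters:
        if jaccard(p, cluster[0]) >= 0.5:
            cluster.append(p)
            break
    else:
        clusters.append([p])

# Representative per cluster: the largest (then most frequent) member.
for cluster in clusters:
    rep = max(cluster, key=lambda p: (len(p), support(p)))
    print("representative pattern:", sorted(rep))
```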
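
Patent 6230151 describes growing a decision tree in a shared-memory multiprocessor: each processor scans its assigned attribute lists for the best split, and a global best split is then chosen cooperatively. The sketch below imitates one such level of tree growth with a thread pool standing in for the processors; the toy records and the Gini criterion are assumptions, and the subsequent splitting of attribute lists into child nodes is omitted.

```python
# Minimal sketch (a simplification of the parallel scheme) of one level of
# tree growth: each worker independently finds the best split for its
# assigned attribute list, then the workers' results are combined into a
# single global best split. Gini impurity is used as the split criterion.
from concurrent.futures import ThreadPoolExecutor

# Toy records: two numeric attributes and a class label.
records = [
    (2.0, 1.5, "A"), (3.0, 1.0, "A"), (4.5, 3.5, "B"),
    (5.0, 2.0, "B"), (1.0, 0.5, "A"), (6.0, 4.0, "B"),
]

def gini(labels):
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split_for_attribute(attr_index):
    """Scan one attribute list (sorted by value) for its best binary split."""
    attr_list = sorted((rec[attr_index], rec[2]) for rec in records)
    best = (float("inf"), None)
    for i in range(1, len(attr_list)):
        left = [label for _, label in attr_list[:i]]
        right = [label for _, label in attr_list[i:]]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(attr_list)
        threshold = (attr_list[i - 1][0] + attr_list[i][0]) / 2.0
        best = min(best, (score, (attr_index, threshold)))
    return best

# Each attribute list is handled by its own worker; the global best split is
# then chosen cooperatively from the per-attribute results.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(best_split_for_attribute, [0, 1]))

score, (attr, threshold) = min(results)
print(f"global best split: attribute {attr} <= {threshold} (impurity {score:.3f})")
```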