Patents by Inventor Seyed Mehran KAZEMI

Seyed Mehran KAZEMI has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11694115
    Abstract: Systems and methods are provided for expanding a multi-relational data structure tunable for generating a non-linear dataset from a time-dependent query. The systems include a processor and a memory. The memory may store processor-executable instructions that, when executed, configure the processor to: receive the query of the multi-relational data structure, wherein the query includes at least one entity node at a queried time relative to the time data; obtain, based on the query, a temporal representation vector based on a diachronic embedding of the multi-relational data structure, the diachronic embedding based on a combination of a first sub-function associated with a temporal feature and a second sub-function associated with a persistent feature; determine, from the temporal representation vector, at least one time-varied score corresponding to the queried time; and generate a response dataset based on the at least one time-varied score determined from the temporal representation vector.
    Type: Grant
    Filed: May 15, 2020
    Date of Patent: July 4, 2023
    Assignee: ROYAL BANK OF CANADA
    Inventors: Seyed Mehran Kazemi, Rishab Goel
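    The diachronic embedding described in the abstract above maps an entity at a queried time to a vector by combining a time-varying (temporal) sub-function with a time-invariant (persistent) sub-function. The following is a minimal NumPy sketch of one plausible instantiation; the sine activation, the split of dimensions between the two sub-functions, and all variable names are illustrative assumptions, not the claimed implementation.

        import numpy as np

        rng = np.random.default_rng(0)

        DIM = 8          # total embedding dimension (assumed)
        TEMPORAL = 4     # dimensions reserved for the temporal sub-function (assumed)

        # Learnable parameters for a single entity (random here for illustration).
        a = rng.normal(size=DIM)           # amplitudes / persistent values
        w = rng.normal(size=TEMPORAL)      # frequencies of the temporal sub-function
        b = rng.normal(size=TEMPORAL)      # phase shifts of the temporal sub-function

        def diachronic_embedding(t: float) -> np.ndarray:
            """Entity embedding at timestamp t.

            The first TEMPORAL dimensions vary with time (temporal feature);
            the remaining dimensions are constant (persistent feature).
            """
            temporal_part = a[:TEMPORAL] * np.sin(w * t + b)   # time-varying sub-function
            persistent_part = a[TEMPORAL:]                     # time-invariant sub-function
            return np.concatenate([temporal_part, persistent_part])

        # The same entity has different representations at different queried times.
        print(diachronic_embedding(2015.0))
        print(diachronic_embedding(2020.0))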
  • Publication number: 20220237378
    Abstract: A computer-implemented system and method for learning an entity-independent representation are disclosed. The method may include: receiving an input text; identifying named entities in the input text; replacing the named entities in the input text with entity markers; parsing the input text into a plurality of tokens; generating a plurality of token embeddings based on the plurality of tokens; generating a plurality of positional embeddings based on the respective position of each of the plurality of tokens within the input text; generating a plurality of token type embeddings based on the plurality of tokens and the one or more named entities in the input text; and processing the plurality of token embeddings, the plurality of positional embeddings, and the plurality of token type embeddings using a transformer neural network model to generate a hidden state vector for each of the plurality of tokens in the input text.
    Type: Application
    Filed: January 25, 2022
    Publication date: July 28, 2022
    Inventors: Layla EL ASRI, Aishik Chakraborty, Seyed Mehran Kazemi
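    The pipeline in the abstract above resembles a transformer encoder whose input is the sum of token, positional, and token type embeddings, computed after named entities have been replaced with generic markers. The PyTorch sketch below illustrates that flow under assumed sizes, a toy vocabulary, and a hand-built example sentence; it is not the filed implementation.

        import torch
        import torch.nn as nn

        VOCAB = {"[ENT]": 0, "visited": 1, "in": 2, "2019": 3, "[PAD]": 4}
        D_MODEL, MAX_LEN, N_TYPES = 32, 16, 2   # assumed sizes

        tok_emb = nn.Embedding(len(VOCAB), D_MODEL)
        pos_emb = nn.Embedding(MAX_LEN, D_MODEL)
        typ_emb = nn.Embedding(N_TYPES, D_MODEL)   # 0 = ordinary token, 1 = entity marker
        encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=D_MODEL, nhead=4, batch_first=True),
            num_layers=2,
        )

        # "Alice visited Paris in 2019" with the named entities replaced by [ENT] markers.
        tokens = ["[ENT]", "visited", "[ENT]", "in", "2019"]
        ids = torch.tensor([[VOCAB[t] for t in tokens]])            # (1, seq)
        positions = torch.arange(len(tokens)).unsqueeze(0)          # (1, seq)
        token_types = torch.tensor([[1, 0, 1, 0, 0]])               # entity markers flagged

        x = tok_emb(ids) + pos_emb(positions) + typ_emb(token_types)   # summed input embeddings
        hidden_states = encoder(x)                                     # one hidden vector per token
        print(hidden_states.shape)   # torch.Size([1, 5, 32])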
  • Publication number: 20220101103
    Abstract: A graph structure having nodes and edges is represented as an adjacency matrix, and nodes of the graph structure have node features. A computer-implemented method and system for generating a graph structure are provided, the method comprising: generating an adjacency matrix based on a plurality of node features; generating a plurality of noisy node features based on the plurality of node features; generating a plurality of denoised node features using a neural network based on the plurality of noisy node features and the adjacency matrix; and updating the adjacency matrix based on the plurality of denoised node features.
    Type: Application
    Filed: September 24, 2021
    Publication date: March 31, 2022
    Inventors: Bahare FATEMI, Seyed Mehran KAZEMI, Layla EL ASRI
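    The method in the abstract above couples graph construction with a denoising step: an adjacency matrix is derived from node features, the features are corrupted, a neural network denoises them using that adjacency, and the adjacency is then refreshed from the denoised features. The sketch below shows one plausible version of that loop with a cosine-similarity adjacency, Gaussian noise, and a single graph-convolution-style layer as the denoiser; those specific choices are assumptions for illustration.

        import torch
        import torch.nn.functional as F

        torch.manual_seed(0)
        N_NODES, N_FEATS = 6, 4                       # assumed sizes
        x = torch.randn(N_NODES, N_FEATS)             # node features
        w = torch.nn.Linear(N_FEATS, N_FEATS)         # denoiser weights (one GCN-style layer)

        def adjacency_from_features(feats: torch.Tensor) -> torch.Tensor:
            """Dense adjacency from pairwise cosine similarity (non-negative weights kept)."""
            sim = F.cosine_similarity(feats.unsqueeze(1), feats.unsqueeze(0), dim=-1)
            return torch.relu(sim)

        adj = adjacency_from_features(x)              # 1) adjacency from node features

        for _ in range(3):                            # a few refinement rounds
            noisy = x + 0.1 * torch.randn_like(x)     # 2) noisy node features
            deg = adj.sum(dim=1, keepdim=True).clamp(min=1e-6)
            denoised = torch.relu((adj / deg) @ w(noisy))   # 3) denoise via neighbourhood averaging
            adj = adjacency_from_features(denoised)   # 4) update adjacency from denoised features

        print(adj)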
  • Publication number: 20210224690
    Abstract: Disclosed are systems, methods, and devices for out-of-sample representation learning using knowledge graphs. An embedding data structure reflective of a knowledge graph embedding model is received. A training data set is received that includes a plurality of training data entries, each reflective of a head entity, a tail entity, and a relation therebetween, wherein at least one of the head entities or the tail entities includes an out-of-sample entity. A plurality of knowledge graph embedding model processors is provided. A random number is generated and compared to at least one criterion. A knowledge graph embedding model processor is selected from among the plurality of knowledge graph embedding model processors based at least in part on the comparison. The embedding data structure is processed with the selected knowledge graph embedding model processor.
    Type: Application
    Filed: January 15, 2021
    Publication date: July 22, 2021
    Inventors: Marjan ALBOOYEH, Seyed Mehran KAZEMI
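    A key step in the abstract above is selecting, per training entry, one of several knowledge graph embedding model processors by comparing a freshly drawn random number against a criterion, so that out-of-sample (unseen) entities can be handled by an aggregation-based processor. The sketch below illustrates that control flow with two toy processors (a direct lookup and a neighbour-averaging aggregator); the 0.5 threshold and the processors themselves are assumptions, not the claimed design.

        import random
        import numpy as np

        rng = np.random.default_rng(0)
        random.seed(0)

        DIM = 4
        embeddings = {"paris": rng.normal(size=DIM), "france": rng.normal(size=DIM)}

        def lookup_processor(entity, neighbours):
            """Use the trained embedding directly (only possible for in-sample entities)."""
            return embeddings[entity]

        def aggregation_processor(entity, neighbours):
            """Build an embedding for an out-of-sample entity from its observed neighbours."""
            return np.mean([embeddings[n] for n in neighbours], axis=0)

        SELECTION_THRESHOLD = 0.5   # assumed criterion

        def embed(entity, neighbours):
            # Draw a random number and compare it to the criterion to pick a processor.
            if entity not in embeddings or random.random() < SELECTION_THRESHOLD:
                processor = aggregation_processor
            else:
                processor = lookup_processor
            return processor(entity, neighbours)

        # "berlin" is an out-of-sample entity, so it always goes through the aggregator.
        print(embed("paris", ["france"]))
        print(embed("berlin", ["paris", "france"]))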
  • Publication number: 20200364619
    Abstract: Systems and methods are provided for expanding a multi-relational data structure tunable for generating a non-linear dataset from a time-dependent query. The systems include a processor and a memory. The memory may store processor-executable instructions that, when executed, configure the processor to: receive the query of the multi-relational data structure, wherein the query includes at least one entity node at a queried time relative to the time data; obtain, based on the query, a temporal representation vector based on a diachronic embedding of the multi-relational data structure, the diachronic embedding based on a combination of a first sub-function associated with a temporal feature and a second sub-function associated with a persistent feature; determine, from the temporal representation vector, at least one time-varied score corresponding to the queried time; and generate a response dataset based on the at least one time-varied score determined from the temporal representation vector.
    Type: Application
    Filed: May 15, 2020
    Publication date: November 19, 2020
    Inventors: Seyed Mehran KAZEMI, Rishab GOEL
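    This publication shares its abstract with granted patent 11694115 above. Complementing the embedding sketch given there, the snippet below shows how a time-varied score for a query might be derived from the temporal representation vectors, using a simple bilinear (DistMult-style) product as the scoring function; that scoring choice, like the form of the embedding, is an assumption for illustration only.

        import numpy as np

        rng = np.random.default_rng(1)
        DIM = 8

        def diachronic(a, w, b, t, temporal=4):
            """Time-dependent entity embedding: temporal sine part + persistent part (assumed form)."""
            return np.concatenate([a[:temporal] * np.sin(w * t + b), a[temporal:]])

        # Toy parameters for a head entity, a candidate tail entity, and a relation.
        head = (rng.normal(size=DIM), rng.normal(size=4), rng.normal(size=4))
        tail = (rng.normal(size=DIM), rng.normal(size=4), rng.normal(size=4))
        relation = rng.normal(size=DIM)

        def time_varied_score(t: float) -> float:
            """DistMult-style score <head(t), relation, tail(t)> at the queried time t."""
            h, r, tl = diachronic(*head, t=t), relation, diachronic(*tail, t=t)
            return float(np.sum(h * r * tl))

        # The same candidate answer receives different scores at different queried times.
        print(time_varied_score(2015.0), time_varied_score(2020.0))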
  • Publication number: 20200234100
    Abstract: Described in various embodiments herein is a technical solution directed to the decomposition of time as an input for machine learning, along with various related mechanisms and data structures. In particular, specific machines, computer-readable media, computer processes, and methods are described that are utilized to improve machine learning outcomes, including improved accuracy, faster convergence (e.g., fewer epochs required for training), and reduced overall computational resource requirements. A vector representation of continuous time containing a periodic function with learnable frequency and phase-shift parameters is used to decompose time into output dimensions for improved tracking of periodic behavior of a feature. The vector representation is used to modify time inputs in machine learning architectures.
    Type: Application
    Filed: January 18, 2020
    Publication date: July 23, 2020
    Inventors: Janahan Mathuran RAMANAN, Jaspreet SAHOTA, Rishab GOEL, Sepehr EGHBALI, Seyed Mehran KAZEMI
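    The decomposition described in the abstract above maps a scalar time value to a vector whose components apply a periodic function with learnable frequency and phase-shift parameters, optionally alongside a linear component for non-periodic progression. The sketch below uses a sine as the periodic function and reserves the first dimension for the linear term; both conventions are assumptions here rather than a quotation of the claims.

        import numpy as np

        rng = np.random.default_rng(0)
        K = 8   # output dimensionality of the time representation (assumed)

        # Learnable frequency and phase-shift parameters (randomly initialised here).
        freq = rng.normal(size=K)
        phase = rng.normal(size=K)

        def time_vector(t: float) -> np.ndarray:
            """Decompose a continuous time value into K output dimensions.

            Dimension 0 is linear (captures non-periodic progression); the remaining
            dimensions apply a periodic function so a model can track periodic behavior.
            """
            out = np.sin(freq * t + phase)       # periodic components
            out[0] = freq[0] * t + phase[0]      # linear component (assumed convention)
            return out

        # The vector can replace a raw time input wherever a feature vector is expected.
        print(time_vector(1.0))
        print(time_vector(365.25))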