Patents by Inventor Yatin Chaudhary

Yatin Chaudhary has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230289532
    Abstract: Various embodiments include a computer-implemented method for language modeling (LM). In some examples, the method includes: performing topic modeling (TM) on at least one document to acquire a first type of topic representation, which represents a topic distribution for each word in the at least one document; generating a second type of topic representation based on a predefined number of key terms for each topic of the topic distribution represented by the first type of topic representation; generating a TM representation comprising the first type of topic representation, the second type of topic representation, or a combination of the two; receiving an input sentence for the LM; and performing the LM on the input sentence based on the TM representation. (See the first sketch after this listing.)
    Type: Application
    Filed: August 5, 2020
    Publication date: September 14, 2023
    Applicant: Siemens Aktiengesellschaft
    Inventors: Pankaj Gupta, Yatin Chaudhary
  • Publication number: 20230289533
    Abstract: Various embodiments of the teachings herein include a computer-implemented method for topic modeling with continuous learning. The method may include: extracting a current topic representation which represents a topic distribution over the vocabulary within a current document; adjusting the size of the vocabulary of the current topic representation based on the words used in a topic pool, wherein the topic pool includes past topic representations accumulated from past documents; regularizing the current topic representation by controlling the degree of topic imitation with past topic representations, based on a comparison of the current topic representation with each of the past topic representations; and accumulating the regularized current topic representation into the topic pool. (See the second sketch after this listing.)
    Type: Application
    Filed: August 5, 2020
    Publication date: September 14, 2023
    Applicant: Siemens Aktiengesellschaft
    Inventors: Pankaj Gupta, Yatin Chaudhary, Thomas Runkler
  • Patent number: 11416689
    Abstract: The invention relates to a natural language processing system configured for receiving an input sequence ci of input words (v1, v2, . . . , vN) representing a first sequence of words in a natural language of a first text and generating an output sequence of output words representing a second sequence of words in a natural language of a second text, modeled by a multinomial topic model, wherein the multinomial topic model is extended by incorporating language structures using a deep contextualized Long Short-Term Memory model. (See the third sketch after this listing.)
    Type: Grant
    Filed: March 28, 2019
    Date of Patent: August 16, 2022
    Assignee: SIEMENS AKTIENGESELLSCHAFT
    Inventors: Florian Büttner, Yatin Chaudhary, Pankaj Gupta
  • Patent number: 11182559
    Abstract: The invention relates to a natural language processing system configured for receiving an input sequence ci of input words representing a first sequence of words in a natural language of a first text and generating an output sequence of output words representing a second sequence of words in a natural language of a second text, modeled by a multinomial topic model, wherein the multinomial topic model is extended by incorporating full contextual information around each word vi: both the preceding words v<i and the following words v>i around each word vi are captured using bi-directional language modelling in a feed-forward fashion, wherein position-dependent forward hidden layers h→i and backward hidden layers h←i are computed for each word vi. (See the fourth sketch after this listing.)
    Type: Grant
    Filed: March 26, 2019
    Date of Patent: November 23, 2021
    Inventors: Florian Büttner, Yatin Chaudhary, Pankaj Gupta
  • Publication number: 20210004690
    Abstract: The present invention relates to a computer-implemented method of Neural Topic Modelling (NTM), a respective computer program, computer-readable medium and data processing system. Global-View Transfer (GVT) or Multi-View Transfer (MVT; GVT and Local-View Transfer (LVT) jointly applied), with or without Multi-Source Transfer (MST), are utilised in the method of NTM. For GVT, a pre-trained topic Knowledge Base (KB) of latent topic features is prepared and knowledge is transferred to a target by GVT via learning meaningful latent topic features guided by relevant latent topic features of the topic KB. This is effected by extending a loss function and minimising the extended loss function. For MVT, a pre-trained word-embeddings KB is additionally prepared and knowledge is transferred to the target by LVT via learning meaningful word embeddings guided by relevant word embeddings of the word-embeddings KB. This is effected by extending a term for calculating pre-activations. (See the fifth sketch after this listing.)
    Type: Application
    Filed: July 1, 2019
    Publication date: January 7, 2021
    Inventors: Yatin Chaudhary, Pankaj Gupta
  • Publication number: 20200311205
    Abstract: The invention relates to a natural language processing system configured for receiving an input sequence ci of input words representing a first sequence of words in a natural language of a first text and generating an output sequence of output words representing a second sequence of words in a natural language of a second text, modeled by a multinomial topic model, wherein the multinomial topic model is extended by incorporating full contextual information around each word vi: both the preceding words v<i and the following words v>i around each word vi are captured using bi-directional language modelling in a feed-forward fashion, wherein position-dependent forward hidden layers h→i and backward hidden layers h←i are computed for each word vi.
    Type: Application
    Filed: March 26, 2019
    Publication date: October 1, 2020
    Inventors: Florian Büttner, Yatin Chaudhary, Pankaj Gupta
  • Publication number: 20200311213
    Abstract: The invention relates to a natural language processing system configured for receiving an input sequence ci of input words (v1, v2, . . . , vN) representing a first sequence of words in a natural language of a first text and generating an output sequence of output words representing a second sequence of words in a natural language of a second text, modeled by a multinomial topic model, wherein the multinomial topic model is extended by incorporating language structures using a deep contextualized Long Short-Term Memory model.
    Type: Application
    Filed: March 28, 2019
    Publication date: October 1, 2020
    Inventors: Florian Büttner, Yatin Chaudhary, Pankaj Gupta
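
Illustrative sketches

First sketch (publication 20230289532): a minimal, hypothetical illustration of building a TM representation from a per-word topic distribution (first type) and topic key terms (second type), and then biasing a language model with it. The toy vocabulary, the Dirichlet-sampled topic-word matrix, the posterior formula and the unigram "LM" are illustrative assumptions, not the patent's specification.

```python
# Hypothetical sketch of the "TM representation feeds the LM" idea from
# publication 20230289532. All names and numbers are illustrative: a tiny
# topic-word matrix stands in for a trained topic model and a biased unigram
# distribution stands in for the language model.
import numpy as np

rng = np.random.default_rng(0)
vocab = ["plant", "turbine", "steam", "valve", "sensor", "fault"]
V, K = len(vocab), 3                                      # vocabulary size, topics

# Pretend these came from a trained topic model (TM).
topic_word = rng.dirichlet(np.ones(V), size=K)            # p(word | topic), shape (K, V)
doc_topic = rng.dirichlet(np.ones(K))                     # p(topic | document)

def first_type_representation(word_ids):
    """Type 1: a topic distribution for each word in the document."""
    # p(topic | word, doc) ∝ p(word | topic) * p(topic | doc)
    post = topic_word[:, word_ids].T * doc_topic           # (N, K)
    return post / post.sum(axis=1, keepdims=True)

def second_type_representation(n_key_terms=2):
    """Type 2: built from a predefined number of key terms per topic."""
    key_ids = np.argsort(-topic_word, axis=1)[:, :n_key_terms]   # (K, n_key_terms)
    bag = np.zeros(V)
    for k, ids in enumerate(key_ids):
        bag[ids] += doc_topic[k]                            # weight key terms by topic mass
    return bag / bag.sum()

def tm_representation(word_ids):
    """Combine both types into one TM representation vector."""
    t1 = first_type_representation(word_ids).mean(axis=0)   # (K,)
    t2 = second_type_representation()                       # (V,)
    return np.concatenate([t1, t2])

def lm_next_word_probs(tm_rep, alpha=0.5):
    """Toy LM step: bias a unigram model with the key-term part of the TM rep."""
    unigram = np.full(V, 1.0 / V)
    biased = (1 - alpha) * unigram + alpha * tm_rep[K:]     # mix in topic evidence
    return biased / biased.sum()

sentence = ["steam", "turbine", "valve"]
word_ids = np.array([vocab.index(w) for w in sentence])
probs = lm_next_word_probs(tm_representation(word_ids))
print({w: round(float(p), 3) for w, p in zip(vocab, probs)})
```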
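
Second sketch (publication 20230289533): a hypothetical continual topic-modelling loop with a topic pool. The vocabulary alignment, the cosine-similarity weighting and the mixing-based regulariser are assumed stand-ins; only the overall flow (extract, align the vocabulary, regularise against past topics, accumulate into the pool) follows the abstract.

```python
# Hypothetical sketch of the continual topic-modelling loop from publication
# 20230289533. The similarity-weighted mixing used for "topic imitation" is an
# assumption, not the patent's actual regulariser.
import numpy as np

class TopicPool:
    def __init__(self):
        self.vocab: list[str] = []
        self.topics: list[np.ndarray] = []     # past topic representations

    def align_vocab(self, words, dist):
        """Grow the shared vocabulary and re-index a distribution onto it."""
        for w in words:
            if w not in self.vocab:
                self.vocab.append(w)
        aligned = np.zeros(len(self.vocab))
        for w, p in zip(words, dist):
            aligned[self.vocab.index(w)] = p
        return aligned / (aligned.sum() or 1.0)

    def regularize(self, current, strength=0.3):
        """Pull the current topic towards similar past topics (topic imitation)."""
        if not self.topics:
            return current
        past = np.stack([np.pad(t, (0, len(current) - len(t))) for t in self.topics])
        sims = past @ current / (np.linalg.norm(past, axis=1)
                                 * np.linalg.norm(current) + 1e-12)
        target = (sims[:, None] * past).sum(axis=0) / (sims.sum() + 1e-12)
        mixed = (1 - strength) * current + strength * target
        return mixed / mixed.sum()

    def accumulate(self, topic):
        self.topics.append(topic)

pool = TopicPool()
for words, dist in [(["pump", "valve", "leak"], [0.5, 0.3, 0.2]),
                    (["valve", "leak", "sensor"], [0.2, 0.4, 0.4])]:
    current = pool.align_vocab(words, np.array(dist, dtype=float))
    current = pool.regularize(current)
    pool.accumulate(current)
print(pool.vocab, [t.round(2) for t in pool.topics])
```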
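
Third sketch (patent 11416689 / publication 20200311213): a hypothetical topic model whose document representation is enriched with states from a deep contextualized LSTM. The tiny autoencoder-style topic model and all layer sizes are assumptions; only the step of combining the bag-of-words view with deep LSTM features mirrors the abstract.

```python
# Hypothetical sketch of patent 11416689's idea: a multinomial topic model whose
# hidden representation is enriched with features from a deep contextualized
# LSTM. The concrete architecture below is an assumed stand-in.
import torch
import torch.nn as nn

class ContextualizedTopicModel(nn.Module):
    def __init__(self, vocab_size=100, n_topics=10, lstm_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, lstm_dim)
        self.lstm = nn.LSTM(lstm_dim, lstm_dim, num_layers=2, batch_first=True)
        self.to_topics = nn.Linear(vocab_size + lstm_dim, n_topics)
        self.decode = nn.Linear(n_topics, vocab_size)       # p(word | topics)

    def forward(self, word_ids):
        # Bag-of-words view of the document (the classic topic-model input).
        bow = torch.zeros(word_ids.size(0), self.decode.out_features)
        bow.scatter_add_(1, word_ids, torch.ones_like(word_ids, dtype=torch.float))
        # Deep contextualized LSTM states, pooled over the sequence.
        ctx, _ = self.lstm(self.embed(word_ids))             # (B, T, lstm_dim)
        ctx = ctx.mean(dim=1)
        # Topic proportions conditioned on both views.
        theta = torch.softmax(self.to_topics(torch.cat([bow, ctx], dim=-1)), -1)
        return torch.log_softmax(self.decode(theta), dim=-1)  # log p(word | doc)

model = ContextualizedTopicModel()
doc = torch.randint(0, 100, (1, 12))                          # one toy document
log_probs = model(doc)
loss = -log_probs.gather(1, doc).mean()                       # NLL of the document words
print(float(loss))
```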
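
Fourth sketch (patent 11182559 / publication 20200311205): a hypothetical encoder that computes, for every word vi, a forward hidden state h→i from the preceding words v<i and a backward hidden state h←i from the following words v>i, and combines them so the topic model sees the full context around each word. Layer dimensions and the summation used to combine the two directions are assumptions.

```python
# Hypothetical sketch of patent 11182559's bi-directional extension: for each
# position i, forward and backward hidden layers are computed and combined into
# a full-context feature for word v_i.
import torch
import torch.nn as nn

class BiContextEncoder(nn.Module):
    def __init__(self, vocab_size=100, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        # bidirectional=True yields the forward hidden layers h→_i and the
        # backward hidden layers h←_i for every position in one pass.
        self.bilstm = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)

    def forward(self, word_ids):
        states, _ = self.bilstm(self.embed(word_ids))       # (B, T, 2 * dim)
        fwd, bwd = states.chunk(2, dim=-1)                  # h→_i and h←_i
        return fwd + bwd                                    # full context per word v_i

encoder = BiContextEncoder()
doc = torch.randint(0, 100, (1, 8))
per_word_context = encoder(doc)                             # would feed the topic model
print(per_word_context.shape)                               # torch.Size([1, 8, 32])
```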
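
Fifth sketch (publication 20210004690): a hypothetical Global-View Transfer (GVT) term that extends a neural topic model's loss so its topic features are guided by relevant topics from a pre-trained topic knowledge base. The relevance weighting and the squared-error penalty are assumptions; the patent's actual extended loss may differ, and the LVT/MVT word-embedding term is not shown.

```python
# Hypothetical sketch of a GVT-style extended loss in the spirit of publication
# 20210004690: the usual NTM loss plus a penalty that pulls the target model's
# topic features towards relevant topics of a pre-trained topic KB.
import torch

def gvt_extended_loss(recon_loss, target_topics, kb_topics, lam=0.1):
    """recon_loss: usual NTM loss; target_topics: (K, V); kb_topics: (K_kb, V)."""
    sims = torch.softmax(target_topics @ kb_topics.T, dim=1)   # relevance weights (K, K_kb)
    guided = sims @ kb_topics                                   # relevant KB mixture per topic
    transfer = ((target_topics - guided) ** 2).sum()
    return recon_loss + lam * transfer                          # extended loss to minimise

target = torch.randn(10, 50, requires_grad=True)    # target model's latent topic features
kb = torch.randn(25, 50)                            # pre-trained topic knowledge base
loss = gvt_extended_loss(torch.tensor(1.5), target, kb)
loss.backward()                                     # gradients flow through the transfer term
print(float(loss))
```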