Patents by Inventor Jackie C. K. CHEUNG

Jackie C. K. CHEUNG has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11914955
    Abstract: A computer-implemented method is described for conducting text sequence machine learning, the method comprising: receiving an input sequence x = [x1, x2, …, xn] and producing a feature vector for a series of hidden states hx = [h1, h2, …, hn], wherein the feature vector for the series of hidden states hx is generated by performing pooling over a temporal dimension of all hidden states output by the encoder machine learning data architecture; extracting from the series of hidden states hx a mean and a variance parameter; and encapsulating the mean and the variance parameter as an approximate posterior data structure.
    Type: Grant
    Filed: May 21, 2020
    Date of Patent: February 27, 2024
    Assignee: ROYAL BANK OF CANADA
    Inventors: Teng Long, Yanshuai Cao, Jackie C. K. Cheung
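    The abstract above describes pooling encoder hidden states over the temporal dimension, then extracting a mean and a variance parameter as an approximate posterior. A minimal numpy sketch of that idea follows; the mean-pooling choice, the linear projections, and all shapes are illustrative assumptions, not the patented implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def approximate_posterior(hidden_states, w_mu, w_logvar):
        """Pool hidden states over the temporal axis, then project the
        pooled feature to a mean and a (log-scale) variance parameter."""
        pooled = hidden_states.mean(axis=0)      # temporal pooling: (n, d) -> (d,)
        mu = pooled @ w_mu                       # mean parameter: (k,)
        logvar = pooled @ w_logvar               # variance parameter, log scale: (k,)
        return {"mean": mu, "variance": np.exp(logvar)}

    n, d, k = 5, 8, 3                            # sequence length, state dim, latent dim
    hx = rng.normal(size=(n, d))                 # stand-in for encoder hidden states
    posterior = approximate_posterior(hx, rng.normal(size=(d, k)), rng.normal(size=(d, k)))
    ```

    The exponential keeps the variance strictly positive, which is the usual reason such models parameterize log-variance rather than variance directly.
    
    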
  • Patent number: 11763100
    Abstract: A system is provided comprising a processor and a memory storing instructions which configure the processor to process an original sentence structure through an encoder neural network to decompose the original sentence structure into an original semantics component and an original syntax component, process the original syntax component through a syntax variational autoencoder (VAE) to receive a syntax mean vector and a syntax covariance matrix, obtain a sampled syntax vector from a syntax Gaussian posterior parameterized by the syntax mean vector and the syntax covariance matrix, process the original semantics component through a semantics VAE to receive a semantics mean vector and a semantics covariance matrix, obtain a sampled semantics vector from a semantics Gaussian posterior parameterized by the semantics mean vector and the semantics covariance matrix, and process the sampled syntax vector and the sampled semantics vector through a decoder neural network to compose a new sentence.
    Type: Grant
    Filed: May 22, 2020
    Date of Patent: September 19, 2023
    Assignee: ROYAL BANK OF CANADA
    Inventors: Peng Xu, Yanshuai Cao, Jackie C. K. Cheung
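    The two-branch structure in the abstract above (a syntax VAE and a semantics VAE, each yielding a Gaussian posterior that is sampled and fed to a shared decoder) can be sketched with numpy. The diagonal covariance, the reparameterized sampling, and the toy "decomposition" of the encoding are assumptions for illustration, not the patented networks:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def vae_branch(component, w_mu, w_cov):
        """One VAE branch: map a component to a Gaussian posterior
        (mean vector, diagonal covariance) and draw a sample from it."""
        mu = component @ w_mu                    # mean vector: (k,)
        var = np.exp(component @ w_cov)          # diagonal covariance, kept positive
        eps = rng.normal(size=mu.shape)
        return mu + np.sqrt(var) * eps           # reparameterized sample: (k,)

    d, k = 6, 4                                  # component dim, latent dim
    sentence = rng.normal(size=d)                # stand-in for the encoded sentence
    syntax, semantics = sentence, sentence[::-1] # toy decomposition of the encoding
    z_syntax = vae_branch(syntax, rng.normal(size=(d, k)), rng.normal(size=(d, k)))
    z_semantics = vae_branch(semantics, rng.normal(size=(d, k)), rng.normal(size=(d, k)))
    decoder_input = np.concatenate([z_syntax, z_semantics])  # passed to the decoder
    ```

    Sampling the two latents independently is what would let such a system recombine the syntax of one sentence with the semantics of another when composing a new sentence.
    
    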
  • Patent number: 11270072
    Abstract: Systems and methods of automatically generating a coherence score for text data are provided. The approach includes receiving a plurality of string tokens representing decomposed portions of the target text data object. A trained neural network is provided that has been trained against a plurality of corpora of training text across a plurality of topics. The string tokens are arranged to extract string tokens representing adjacent sentence pairs of the target text data object. For each adjacent sentence pair, the neural network generates a local coherence score representing a coherence level of that pair; the local scores are then aggregated across the adjacent sentence pairs of the target text data object to generate a global coherence score for the target text data object.
    Type: Grant
    Filed: October 31, 2019
    Date of Patent: March 8, 2022
    Assignee: ROYAL BANK OF CANADA
    Inventors: Yanshuai Cao, Peng Z. Xu, Hamidreza Saghir, Jin Sung Kang, Teng Long, Jackie C. K. Cheung
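    The abstract above scores each adjacent sentence pair locally, then aggregates the local scores into a global coherence score. A minimal Python sketch of that aggregation follows; the word-overlap scorer is a hypothetical stand-in for the trained neural network, and mean aggregation is an assumption:

    ```python
    def local_coherence(pair):
        """Stand-in local scorer; in the patent, a trained neural network
        produces this score for each adjacent sentence pair."""
        a, b = (set(s.lower().split()) for s in pair)
        return len(a & b) / max(len(a | b), 1)    # word-overlap proxy in [0, 1]

    def global_coherence(sentences):
        pairs = list(zip(sentences, sentences[1:]))   # adjacent sentence pairs
        scores = [local_coherence(p) for p in pairs]  # one local score per pair
        return sum(scores) / len(scores)              # aggregate to a global score

    doc = ["The model reads text.", "The model scores text.", "Scores are aggregated."]
    score = global_coherence(doc)
    ```

    Any pairwise scorer can be dropped into `local_coherence`; the pairing-and-aggregation skeleton is what the abstract actually claims.
    
    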
  • Publication number: 20200372225
    Abstract: A computer system and method for machine text generation are provided. The system comprises at least one processor and a memory storing instructions which, when executed by the processor, configure the processor to perform the method.
    Type: Application
    Filed: May 22, 2020
    Publication date: November 26, 2020
    Inventors: Peng XU, Yanshuai CAO, Jackie C. K. CHEUNG
  • Publication number: 20200372214
    Abstract: A computer-implemented method is described for conducting text sequence machine learning, the method comprising: receiving an input sequence x = [x1, x2, …, xn] and producing a feature vector for a series of hidden states hx = [h1, h2, …, hn], wherein the feature vector for the series of hidden states hx is generated by performing pooling over a temporal dimension of all hidden states output by the encoder machine learning data architecture; extracting from the series of hidden states hx a mean and a variance parameter; and encapsulating the mean and the variance parameter as an approximate posterior data structure.
    Type: Application
    Filed: May 21, 2020
    Publication date: November 26, 2020
    Inventors: Teng LONG, Yanshuai CAO, Jackie C. K. CHEUNG
  • Publication number: 20200134016
    Abstract: Systems and methods of automatically generating a coherence score for text data are provided. The approach includes receiving a plurality of string tokens representing decomposed portions of the target text data object. A trained neural network is provided that has been trained against a plurality of corpora of training text across a plurality of topics. The string tokens are arranged to extract string tokens representing adjacent sentence pairs of the target text data object. For each adjacent sentence pair, the neural network generates a local coherence score representing a coherence level of that pair; the local scores are then aggregated across the adjacent sentence pairs of the target text data object to generate a global coherence score for the target text data object.
    Type: Application
    Filed: October 31, 2019
    Publication date: April 30, 2020
    Inventors: Yanshuai CAO, Peng Z. XU, Hamidreza SAGHIR, Jin Sung KANG, Leo LONG, Jackie C. K. CHEUNG