Patents by Inventor Chunyang Xiao

Chunyang Xiao has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10853724
    Abstract: Methods, systems, and devices for semantic parsing. In an example embodiment, a method for semantic parsing can include steps, operations, or instructions such as obtaining a data pair for learning, the data pair comprising logical form data and natural utterance data; acquiring a grammar for the targeted logical forms among the logical form data of the data pair; modeling data comprising other available prior knowledge utilizing a WFSA (Weighted Finite State Automaton); combining the targeted logical forms with the modeled data comprising the other available prior knowledge to form a background; and exploiting the background on the data pair. Note that it is not the background itself that is learned, but the background-RNN (Recurrent Neural Network).
    Type: Grant
    Filed: June 2, 2017
    Date of Patent: December 1, 2020
    Assignee: Xerox Corporation
    Inventors: Chunyang Xiao, Marc Dymetman
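The combination claimed here, a fixed prior-knowledge "background" paired with a trainable RNN, can be illustrated with a minimal sketch. The code below is not the patented implementation; it simply assumes the background is available as per-step log-probabilities (for example, read off a grammar or weighted automaton over logical-form symbols) and shows an RNN whose next-symbol scores are combined with that fixed prior, so that only the RNN parameters are learned.

```python
# Minimal, hypothetical sketch of a "background-RNN": the background prior is
# fixed (no gradients flow into it), and only the RNN component is trained.
import torch
import torch.nn as nn

class BackgroundRNN(nn.Module):
    def __init__(self, vocab_size, emb=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.rnn = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, tokens, background_logprobs):
        # tokens: (batch, time) symbol ids of the logical form so far.
        # background_logprobs: (batch, time, vocab) fixed prior scores,
        # e.g. derived from a grammar/WFSA; treated as constants.
        h, _ = self.rnn(self.embed(tokens))
        rnn_logits = self.out(h)
        # Product of background and RNN distributions, taken in log space.
        return torch.log_softmax(rnn_logits + background_logprobs.detach(), dim=-1)
```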
  • Publication number: 20180349767
    Abstract: Methods, systems, and devices for semantic parsing. In an example embodiment, a method for semantic parsing can include steps, operations, or instructions such as obtaining a data pair for learning, the data pair comprising logical form data and natural utterance data; acquiring a grammar for the targeted logical forms among the logical form data of the data pair; modeling data comprising other available prior knowledge utilizing a WFSA (Weighted Finite State Automaton); combining the targeted logical forms with the modeled data comprising the other available prior knowledge to form a background; and exploiting the background on the data pair. Note that it is not the background itself that is learned, but the background-RNN (Recurrent Neural Network).
    Type: Application
    Filed: June 2, 2017
    Publication date: December 6, 2018
    Inventors: Chunyang Xiao, Marc Dymetman
  • Publication number: 20180349765
    Abstract: A neural network apparatus includes a recurrent neural network having a log-linear output layer. The recurrent neural network is trained on training data and models output symbols as complex combinations of attributes without requiring that each combination among the complex combinations be directly observed in the training data. The recurrent neural network is configured to permit an inclusion of flexible prior knowledge in a form of specified modular features, wherein the recurrent neural network learns to dynamically control weights of a log-linear distribution to promote the specified modular features. The recurrent neural network can be implemented as a log-linear recurrent neural network.
    Type: Application
    Filed: May 30, 2017
    Publication date: December 6, 2018
    Inventors: Marc Dymetman, Chunyang Xiao
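A rough, illustrative reading of the log-linear output layer is sketched below: each output symbol is described by a fixed vector of modular attribute features, and the recurrent state is mapped to a weight vector over those features, so symbols are scored through the feature matrix rather than through a free per-symbol softmax. The module and parameter names are assumptions for illustration, not taken from the patent.

```python
# Hypothetical sketch of an RNN with a log-linear output layer over a fixed
# symbol-by-feature matrix F of shape (vocab_size, num_features).
import torch
import torch.nn as nn

class LogLinearRNN(nn.Module):
    def __init__(self, vocab_size, feature_matrix, emb=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.rnn = nn.GRU(emb, hidden, batch_first=True)
        self.register_buffer("F", feature_matrix)           # fixed modular features
        self.to_weights = nn.Linear(hidden, feature_matrix.shape[1])

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens))                  # (batch, time, hidden)
        w = self.to_weights(h)                               # dynamic feature weights
        scores = w @ self.F.t()                              # (batch, time, vocab)
        return torch.log_softmax(scores, dim=-1)
```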
  • Publication number: 20180060779
    Abstract: A method of generating a business process model includes receiving a description of a select business process in electronic form, processing the description of the select business process at a computerized system using a natural language processing algorithm to parse business activities from the description, searching a business concept repository using a search engine for existing business concept models related to each business activity parsed from the description, and displaying each existing business concept model resulting from the searching in a graphical representation for manipulation by a process designer using a business concept manipulation tool to form a specific business process model for the select business process. The select business process is associated with a domain-specific business type. A computerized system for designing and modeling business processes for use in business process management is also provided.
    Type: Application
    Filed: August 24, 2016
    Publication date: March 1, 2018
    Applicant: Conduent Business Services, LLC
    Inventors: Kunal Suri, Chunyang Xiao, Adrian Corneliu Mos
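At a high level the claimed workflow is a three-stage pipeline: parse activities out of a free-text process description, search a concept repository for models matching each activity, and present the hits to the designer. The sketch below is a hypothetical skeleton of that pipeline; the function names, the comma-based "parser", and the keyword-based "search" are placeholders standing in for the real NLP and search-engine components.

```python
# Hypothetical pipeline skeleton; all names and the toy parsing/search logic
# are placeholders, not the patented components.
from dataclasses import dataclass

@dataclass
class ConceptModel:
    activity: str
    model_id: str

def parse_activities(description):
    # Stand-in for the natural language processing step that extracts
    # business activities from the description.
    return [part.strip() for part in description.split(",") if part.strip()]

def search_repository(activity, repository):
    # Stand-in for the search-engine lookup in the business concept repository.
    return [ConceptModel(activity, model_id)
            for model_id, keywords in repository.items()
            if any(k in activity.lower() for k in keywords)]

def candidate_models(description, repository):
    # One list of candidate concept models per parsed activity, ready to be
    # displayed for manipulation by the process designer.
    return {a: search_repository(a, repository) for a in parse_activities(description)}
```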
  • Patent number: 9858263
    Abstract: A method for predicting a canonical form for an input text sequence includes predicting the canonical form with a neural network model. The model includes an encoder, which generates a first representation of the input text sequence based on a representation of n-grams in the text sequence and a second representation of the input text sequence generated by a first neural network. The model also includes a decoder which sequentially predicts terms of the canonical form based on the first and second representations and a predicted prefix of the canonical form. The canonical form can be used, for example, to query a knowledge base or to generate a next utterance in a discourse.
    Type: Grant
    Filed: May 5, 2016
    Date of Patent: January 2, 2018
    Assignees: Conduent Business Services, LLC, Centre National de la Recherche Scientifique
    Inventors: Chunyang Xiao, Marc Dymetman, Claire Gardent
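The encoder described here combines two representations of the input utterance. A minimal sketch of that idea is shown below, under the assumption that the first representation is a bag-of-n-grams vector and the second is the final state of a recurrent encoder; the concatenated result is what a decoder would condition on when predicting the canonical form term by term.

```python
# Hypothetical encoder sketch: n-gram representation plus recurrent
# representation, concatenated for use by a term-by-term decoder.
import torch
import torch.nn as nn

class CanonicalFormEncoder(nn.Module):
    def __init__(self, word_vocab, ngram_vocab, emb=64, hidden=128):
        super().__init__()
        self.ngram_proj = nn.Linear(ngram_vocab, hidden)     # first representation
        self.embed = nn.Embedding(word_vocab, emb)
        self.rnn = nn.GRU(emb, hidden, batch_first=True)     # second representation

    def forward(self, tokens, ngram_counts):
        # tokens: (batch, time) word ids; ngram_counts: (batch, ngram_vocab).
        first = torch.relu(self.ngram_proj(ngram_counts))
        _, last = self.rnn(self.embed(tokens))
        return torch.cat([first, last.squeeze(0)], dim=-1)   # decoder input
```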
  • Patent number: 9830315
    Abstract: A system and method are provided which employ a neural network model which has been trained to predict a sequentialized form for an input text sequence. The sequentialized form includes a sequence of symbols. The neural network model includes an encoder which generates a representation of the input text sequence based on a representation of n-grams in the text sequence and a decoder which sequentially predicts a next symbol of the sequentialized form based on the representation and a predicted prefix of the sequentialized form. Given an input text sequence, a sequentialized form is predicted with the trained neural network model. The sequentialized form is converted to a structured form and information based on the structured form is output.
    Type: Grant
    Filed: July 13, 2016
    Date of Patent: November 28, 2017
    Assignees: Xerox Corporation, Centre National de la Recherche Scientifique
    Inventors: Chunyang Xiao, Marc Dymetman, Claire Gardent
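The last step of this method, converting the predicted sequentialized form into a structured form, can be illustrated independently of the neural model. The sketch below assumes the sequentialized form is a flat symbol sequence with explicit open/close bracket symbols (an assumption made purely for illustration) and rebuilds the nested structure from it.

```python
# Hypothetical conversion of a predicted flat symbol sequence into a nested,
# structured form; the bracket convention is an illustrative assumption.
def sequence_to_structure(symbols):
    """['(', 'answer', '(', 'capital', 'france', ')', ')']
       -> ['answer', ['capital', 'france']]"""
    stack = [[]]
    for s in symbols:
        if s == "(":
            stack.append([])
        elif s == ")":
            node = stack.pop()
            stack[-1].append(node)
        else:
            stack[-1].append(s)
    return stack[0][0]
```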
  • Publication number: 20170323636
    Abstract: A method for predicting a canonical form for an input text sequence includes predicting the canonical form with a neural network model. The model includes an encoder, which generates a first representation of the input text sequence based on a representation of n-grams in the text sequence and a second representation of the input text sequence generated by a first neural network. The model also includes a decoder which sequentially predicts terms of the canonical form based on the first and second representations and a predicted prefix of the canonical form. The canonical form can be used, for example, to query a knowledge base or to generate a next utterance in a discourse.
    Type: Application
    Filed: May 5, 2016
    Publication date: November 9, 2017
    Applicant: Conduent Business Services, LLC
    Inventors: Chunyang Xiao, Marc Dymetman, Claire Gardent
  • Publication number: 20170031896
    Abstract: A system and method permit analysis and generation to be performed with the same reversible probabilistic model. The model includes a set of factors, including a canonical factor, which is a function of a logical form and a realization thereof, a similarity factor, which is a function of a canonical text string and a surface string, a language model factor, which is a static function of a surface string, a language context factor, which is a dynamic function of a surface string, and a semantic context factor, which is a dynamic function of a logical form. When performing generation, the canonical factor, similarity factor, language model factor, and language context factor are composed to receive as input a logical form and output a surface string, and when performing analysis, the similarity factor, canonical factor, and semantic context factor are composed to take as input a surface string and output a logical form.
    Type: Application
    Filed: July 28, 2015
    Publication date: February 2, 2017
    Applicant: Xerox Corporation
    Inventors: Marc Dymetman, Sriram Venkatapathy, Chunyang Xiao
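The reversibility claim rests on composing the same factors in two different orders. The sketch below is a schematic rendering of that composition, treating each factor as an opaque scoring function over a (logical form, canonical string, surface string) triple; the factor callables themselves are placeholders supplied elsewhere.

```python
# Schematic, hypothetical composition of the factors; each entry of `f` is an
# opaque scoring callable, not the patented model's actual factor.
def generation_score(logical_form, canonical, surface, f):
    # Generation: logical form -> surface string.
    return (f["canonical"](logical_form, canonical)
            * f["similarity"](canonical, surface)
            * f["language_model"](surface)
            * f["language_context"](surface))

def analysis_score(surface, canonical, logical_form, f):
    # Analysis: surface string -> logical form.
    return (f["similarity"](canonical, surface)
            * f["canonical"](logical_form, canonical)
            * f["semantic_context"](logical_form))
```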