Patents by Inventor Xingdi Yuan

Xingdi Yuan has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12236363
    Abstract: Described herein are systems and methods for providing a natural language comprehension system that employs a two-stage process for machine comprehension of text. The first stage indicates words in one or more text passages that potentially answer a question. The first stage outputs a set of candidate answers for the question, along with a first probability of correctness for each candidate answer. The second stage forms one or more hypotheses by inserting each candidate answer into the question and determines whether a semantic relationship exists between each hypothesis and each sentence in the text. The second stage generates a second probability of correctness for each candidate answer and combines the first probability with the second probability to produce a score that is used to rank the candidate answers. The candidate answer with the highest score is selected as a predicted answer.
    Type: Grant
    Filed: June 24, 2022
    Date of Patent: February 25, 2025
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Adam Trischler, Philip Bachman, Xingdi Yuan, Alessandro Sordoni, Zheng Ye
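The score-combination step described in the abstract above can be sketched in a few lines. The multiplicative combination, the function name, and the toy candidates and probabilities below are illustrative assumptions; the patent does not commit to a specific combination function.

```python
# Hypothetical sketch of the two-stage reranking: stage 1 proposes
# candidate answers with probabilities, stage 2 scores each hypothesis
# (the question with the candidate inserted), and the two probabilities
# are combined into a single ranking score.

def predict_answer(candidates, p_stage1, p_stage2):
    """Return the candidate whose combined score is highest."""
    scores = {c: p_stage1[c] * p_stage2[c] for c in candidates}
    return max(scores, key=scores.get)

candidates = ["Paris", "London"]
p1 = {"Paris": 0.6, "London": 0.4}  # stage-1 probabilities
p2 = {"Paris": 0.9, "London": 0.2}  # stage-2 probabilities
print(predict_answer(candidates, p1, p2))  # prints "Paris"
```

A product of the two probabilities is only one possible combination; a weighted sum or a learned combination layer would fit the same description.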
  • Publication number: 20240404421
    Abstract: A method, system, and storage device storing a computer program, for generating questions based on provided content, such as, for example, a document having words. The method comprises automatically estimating the probability of interesting phrases in the provided content, and generating a question in natural language based on the estimating. In one example embodiment herein, the estimating includes predicting the interesting phrases as answers, and the estimating is performed by a neural model. The method further comprises conditioning a question generation model based on the interesting phrases predicted in the predicting, the question generation model generating the question. The method can also include training the neural model. In one example, the method further comprises identifying start and end locations of the phrases in the provided content, and the identifying includes performing a dot-product attention mechanism that parameterizes a probability distribution.
    Type: Application
    Filed: August 14, 2024
    Publication date: December 5, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Xingdi Yuan, Tong Wang, Adam Peter Trischler, Sandeep Subramanian
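The start/end-location identification in the abstract above, a dot-product attention that parameterizes a probability distribution over token positions, can be sketched as follows. The hidden states, query vector, and dimensions are made-up placeholders for illustration, not the patented model.

```python
import numpy as np

def span_distribution(passage_states, query):
    # Dot-product attention: one score per passage token,
    # normalized with a softmax into a probability distribution.
    scores = passage_states @ query
    exp = np.exp(scores - scores.max())  # shift for numerical stability
    return exp / exp.sum()

rng = np.random.default_rng(0)
H = rng.normal(size=(6, 4))   # 6 passage tokens, hidden size 4 (toy values)
q_start = rng.normal(size=4)  # a learned "start position" query vector
p_start = span_distribution(H, q_start)
start = int(p_start.argmax())  # most likely start location
```

An end-location distribution would be produced the same way with a second query vector, typically conditioned on the chosen start.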
  • Patent number: 12094362
    Abstract: A method, system, and storage device storing a computer program, for generating questions based on provided content, such as, for example, a document having words. The method comprises automatically estimating the probability of interesting phrases in the provided content, and generating a question in natural language based on the estimating. In one example embodiment herein, the estimating includes predicting the interesting phrases as answers, and the estimating is performed by a neural model. The method further comprises conditioning a question generation model based on the interesting phrases predicted in the predicting, the question generation model generating the question. The method can also include training the neural model. In one example, the method further comprises identifying start and end locations of the phrases in the provided content, and the identifying includes performing a dot-product attention mechanism that parameterizes a probability distribution.
    Type: Grant
    Filed: January 7, 2021
    Date of Patent: September 17, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Xingdi Yuan, Tong Wang, Adam Peter Trischler, Sandeep Subramanian
  • Patent number: 12067490
    Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
    Type: Grant
    Filed: October 17, 2022
    Date of Patent: August 20, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Adam Trischler, Zheng Ye, Xingdi Yuan, Philip Bachman
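The abstract above compares passage, question, and answer at several granularities in parallel. That idea can be illustrated with a toy score that views the text at the word, phrase, and passage levels. The cosine similarity, two-word window, and uniform averaging below are assumptions for illustration; the patent describes learned neural networks, not these fixed operations.

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def parallel_score(word_embs, answer_emb):
    # Word-level view: best single-word match against the answer.
    word_view = max(cosine(w, answer_emb) for w in word_embs)
    # Phrase-level view: best two-word window (mean of embeddings).
    phrase_view = max(
        cosine(word_embs[i:i + 2].mean(axis=0), answer_emb)
        for i in range(len(word_embs) - 1)
    )
    # Passage-level view: mean of all word embeddings.
    passage_view = cosine(word_embs.mean(axis=0), answer_emb)
    # The parallel branches are aggregated into one score.
    return (word_view + phrase_view + passage_view) / 3.0

rng = np.random.default_rng(1)
embs = rng.normal(size=(5, 8))  # toy passage: 5 words, embedding size 8
ans = rng.normal(size=8)        # toy candidate-answer embedding
score = parallel_score(embs, ans)
```

In the patented approach each "view" is a neural network branch and the aggregation is learned, but the parallel multi-granularity structure is the same.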
  • Publication number: 20230042546
    Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
    Type: Application
    Filed: October 17, 2022
    Publication date: February 9, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Adam Trischler, Zheng Ye, Xingdi Yuan, Philip Bachman
  • Patent number: 11507834
    Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
    Type: Grant
    Filed: May 12, 2020
    Date of Patent: November 22, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Adam Trischler, Zheng Ye, Xingdi Yuan, Philip Bachman
  • Publication number: 20220327407
    Abstract: Described herein are systems and methods for providing a natural language comprehension system that employs a two-stage process for machine comprehension of text. The first stage indicates words in one or more text passages that potentially answer a question. The first stage outputs a set of candidate answers for the question, along with a first probability of correctness for each candidate answer. The second stage forms one or more hypotheses by inserting each candidate answer into the question and determines whether a semantic relationship exists between each hypothesis and each sentence in the text. The second stage generates a second probability of correctness for each candidate answer and combines the first probability with the second probability to produce a score that is used to rank the candidate answers. The candidate answer with the highest score is selected as a predicted answer.
    Type: Application
    Filed: June 24, 2022
    Publication date: October 13, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Adam Trischler, Philip Bachman, Xingdi Yuan, Alessandro Sordoni, Zheng Ye
  • Patent number: 11379736
    Abstract: Described herein are systems and methods for providing a natural language comprehension system that employs a two-stage process for machine comprehension of text. The first stage indicates words in one or more text passages that potentially answer a question. The first stage outputs a set of candidate answers for the question, along with a first probability of correctness for each candidate answer. The second stage forms one or more hypotheses by inserting each candidate answer into the question and determines whether a semantic relationship exists between each hypothesis and each sentence in the text. The second stage generates a second probability of correctness for each candidate answer and combines the first probability with the second probability to produce a score that is used to rank the candidate answers. The candidate answer with the highest score is selected as a predicted answer.
    Type: Grant
    Filed: May 17, 2017
    Date of Patent: July 5, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Adam Trischler, Philip Bachman, Xingdi Yuan, Alessandro Sordoni, Zheng Ye
  • Publication number: 20210134173
    Abstract: A method, system, and storage device storing a computer program, for generating questions based on provided content, such as, for example, a document having words. The method comprises automatically estimating the probability of interesting phrases in the provided content, and generating a question in natural language based on the estimating. In one example embodiment herein, the estimating includes predicting the interesting phrases as answers, and the estimating is performed by a neural model. The method further comprises conditioning a question generation model based on the interesting phrases predicted in the predicting, the question generation model generating the question. The method can also include training the neural model. In one example, the method further comprises identifying start and end locations of the phrases in the provided content, and the identifying includes performing a dot-product attention mechanism that parameterizes a probability distribution.
    Type: Application
    Filed: January 7, 2021
    Publication date: May 6, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Xingdi Yuan, Tong Wang, Adam Peter Trischler, Sandeep Subramanian
  • Patent number: 10902738
    Abstract: A method, system, and storage device storing a computer program, for generating questions based on provided content, such as, for example, a document having words. The method comprises automatically estimating the probability of interesting phrases in the provided content, and generating a question in natural language based on the estimating. In one example embodiment herein, the estimating includes predicting the interesting phrases as answers, and the estimating is performed by a neural model. The method further comprises conditioning a question generation model based on the interesting phrases predicted in the predicting, the question generation model generating the question. The method can also include training the neural model. In one example, the method further comprises identifying start and end locations of the phrases in the provided content, and the identifying includes performing a dot-product attention mechanism that parameterizes a probability distribution.
    Type: Grant
    Filed: August 3, 2017
    Date of Patent: January 26, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Xingdi Yuan, Tong Wang, Adam Peter Trischler, Sandeep Subramanian
  • Publication number: 20200279161
    Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
    Type: Application
    Filed: May 12, 2020
    Publication date: September 3, 2020
    Applicant: Maluuba Inc.
    Inventors: Adam Trischler, Zheng Ye, Xingdi Yuan, Philip Bachman
  • Patent number: 10691999
    Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
    Type: Grant
    Filed: March 16, 2017
    Date of Patent: June 23, 2020
    Assignee: Maluuba Inc.
    Inventors: Adam Trischler, Zheng Ye, Xingdi Yuan, Philip Bachman
  • Publication number: 20190043379
    Abstract: A method, system, and storage device storing a computer program, for generating questions based on provided content, such as, for example, a document having words. The method comprises automatically estimating the probability of interesting phrases in the provided content, and generating a question in natural language based on the estimating. In one example embodiment herein, the estimating includes predicting the interesting phrases as answers, and the estimating is performed by a neural model. The method further comprises conditioning a question generation model based on the interesting phrases predicted in the predicting, the question generation model generating the question. The method can also include training the neural model. In one example, the method further comprises identifying start and end locations of the phrases in the provided content, and the identifying includes performing a dot-product attention mechanism that parameterizes a probability distribution.
    Type: Application
    Filed: August 3, 2017
    Publication date: February 7, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Xingdi Yuan, Tong Wang, Adam Peter Trischler, Sandeep Subramanian
  • Publication number: 20170337479
    Abstract: Described herein are systems and methods for providing a natural language comprehension system that employs a two-stage process for machine comprehension of text. The first stage indicates words in one or more text passages that potentially answer a question. The first stage outputs a set of candidate answers for the question, along with a first probability of correctness for each candidate answer. The second stage forms one or more hypotheses by inserting each candidate answer into the question and determines whether a semantic relationship exists between each hypothesis and each sentence in the text. The second stage generates a second probability of correctness for each candidate answer and combines the first probability with the second probability to produce a score that is used to rank the candidate answers. The candidate answer with the highest score is selected as a predicted answer.
    Type: Application
    Filed: May 17, 2017
    Publication date: November 23, 2017
    Applicant: Maluuba Inc.
    Inventors: Adam Trischler, Philip Bachman, Xingdi Yuan, Alessandro Sordoni, Zheng Ye
  • Publication number: 20170270409
    Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
    Type: Application
    Filed: March 16, 2017
    Publication date: September 21, 2017
    Applicant: Maluuba Inc.
    Inventors: Adam Trischler, Zheng Ye, Xingdi Yuan, Philip Bachman