Patents by Inventor Xingdi Yuan
Xingdi Yuan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12236363
Abstract: Described herein are systems and methods for providing a natural language comprehension system that employs a two-stage process for machine comprehension of text. The first stage indicates words in one or more text passages that potentially answer a question. The first stage outputs a set of candidate answers for the question, along with a first probability of correctness for each candidate answer. The second stage forms one or more hypotheses by inserting each candidate answer into the question and determines whether a semantic relationship exists between each hypothesis and each sentence in the text. The second processing circuitry generates a second probability of correctness for each candidate answer and combines the first probability with the second probability to produce a score that is used to rank the candidate answers. The candidate answer with the highest score is selected as a predicted answer.
Type: Grant
Filed: June 24, 2022
Date of Patent: February 25, 2025
Assignee: Microsoft Technology Licensing, LLC
Inventors: Adam Trischler, Philip Bachman, Xingdi Yuan, Alessandro Sordoni, Zheng Ye
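The ranking scheme the abstract describes can be sketched as follows. This is a minimal illustration only: the placeholder convention for inserting a candidate into the question, the stubbed second-stage scorer, and the use of a simple product to combine the two probabilities are all assumptions, since the abstract says only that the probabilities are "combined".

```python
# Hypothetical sketch of the two-stage answer ranking: stage 1 supplies
# candidate answers with probabilities; stage 2 rescores the hypothesis
# formed by inserting each candidate into the question; the two
# probabilities are combined (here, multiplied) to rank candidates.

def form_hypothesis(question: str, candidate: str) -> str:
    """Insert a candidate answer into a cloze-style question (assumed format)."""
    return question.replace("@placeholder", candidate)

def rank_candidates(question, candidates, stage1_probs, stage2_scorer):
    scored = []
    for cand, p1 in zip(candidates, stage1_probs):
        hypothesis = form_hypothesis(question, cand)
        p2 = stage2_scorer(hypothesis)   # second-stage probability of correctness
        scored.append((cand, p1 * p2))   # combined score (product is an assumption)
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored

# Toy usage with a stubbed second-stage scorer.
cands = ["Paris", "London"]
p1 = [0.6, 0.4]
scorer = lambda h: 0.9 if "Paris" in h else 0.2
best, score = rank_candidates("@placeholder is the capital of France.", cands, p1, scorer)[0]
```

The highest combined score wins, matching the abstract's final selection step.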
-
Publication number: 20240404421
Abstract: A method, system, and storage device storing a computer program, for generating questions based on provided content, such as, for example, a document having words. The method comprises automatically estimating the probability of interesting phrases in the provided content, and generating a question in natural language based on the estimating. In one example embodiment herein, the estimating includes predicting the interesting phrases as answers, and the estimating is performed by a neural model. The method further comprises conditioning a question generation model based on the interesting phrases predicted in the predicting, the question generation model generating the question. The method also can include training the neural model. In one example, the method further comprises identifying start and end locations of the phrases in the provided content, and the identifying includes performing a dot product attention mechanism parameterizing a probability distribution.
Type: Application
Filed: August 14, 2024
Publication date: December 5, 2024
Applicant: Microsoft Technology Licensing, LLC
Inventors: Xingdi YUAN, Tong WANG, Adam Peter TRISCHLER, Sandeep SUBRAMANIAN
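The last step of the abstract, a dot-product attention mechanism parameterizing a probability distribution over phrase start and end locations, can be sketched as below. All names, shapes, and the use of two separate query vectors are illustrative assumptions; the patent text does not specify this architecture in detail.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a vector of logits."""
    e = np.exp(x - x.max())
    return e / e.sum()

def span_distributions(doc_states, start_query, end_query):
    """Dot-product attention over document token states, yielding
    probability distributions for a phrase's start and end positions."""
    start_logits = doc_states @ start_query   # one score per document token
    end_logits = doc_states @ end_query
    return softmax(start_logits), softmax(end_logits)

# Toy usage: 12 document tokens with hidden size 8.
rng = np.random.default_rng(0)
H = rng.normal(size=(12, 8))
p_start, p_end = span_distributions(H, rng.normal(size=8), rng.normal(size=8))
start, end = int(p_start.argmax()), int(p_end.argmax())
```

The argmax of each distribution gives a predicted start and end location for an interesting phrase, which could then condition the question generation model.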
-
Patent number: 12094362
Abstract: A method, system, and storage device storing a computer program, for generating questions based on provided content, such as, for example, a document having words. The method comprises automatically estimating the probability of interesting phrases in the provided content, and generating a question in natural language based on the estimating. In one example embodiment herein, the estimating includes predicting the interesting phrases as answers, and the estimating is performed by a neural model. The method further comprises conditioning a question generation model based on the interesting phrases predicted in the predicting, the question generation model generating the question. The method also can include training the neural model. In one example, the method further comprises identifying start and end locations of the phrases in the provided content, and the identifying includes performing a dot product attention mechanism parameterizing a probability distribution.
Type: Grant
Filed: January 7, 2021
Date of Patent: September 17, 2024
Assignee: Microsoft Technology Licensing, LLC
Inventors: Xingdi Yuan, Tong Wang, Adam Peter Trischler, Sandeep Subramanian
-
Patent number: 12067490
Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
Type: Grant
Filed: October 17, 2022
Date of Patent: August 20, 2024
Assignee: Microsoft Technology Licensing, LLC
Inventors: Adam Trischler, Zheng Ye, Xingdi Yuan, Philip Bachman
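The parallel-hierarchy idea, comparing passage, question, and answer at several granularities in parallel branches over word embeddings, can be caricatured as follows. The two perspectives shown (word-level matching and mean-pooled sentence-level similarity) and the averaging aggregation are assumptions for illustration; the patented networks are learned, not hand-coded similarities.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

def parallel_hierarchy_score(passage_words, question_words, answer_words):
    """Compare a passage against a question+answer pair from two parallel
    perspectives over word embeddings, then aggregate the scores."""
    qa = np.vstack([question_words, answer_words])
    # Word-level perspective: best single word-to-word match.
    word_score = max(cosine(p, q) for p in passage_words for q in qa)
    # Sentence-level perspective: similarity of mean-pooled embeddings.
    sent_score = cosine(passage_words.mean(axis=0), qa.mean(axis=0))
    return 0.5 * (word_score + sent_score)  # simple aggregation (assumed)

# Toy usage: identical one-hot "embeddings" should score near 1.
words = np.eye(4)[:3]
s = parallel_hierarchy_score(words, words, words)
```

In the patented approach each perspective would be a learned network branch, with the branches' outputs combined to score candidate answers.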
-
Publication number: 20230042546
Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
Type: Application
Filed: October 17, 2022
Publication date: February 9, 2023
Applicant: Microsoft Technology Licensing, LLC
Inventors: Adam TRISCHLER, Zheng YE, Xingdi YUAN, Philip BACHMAN
-
Patent number: 11507834
Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
Type: Grant
Filed: May 12, 2020
Date of Patent: November 22, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Adam Trischler, Zheng Ye, Xingdi Yuan, Philip Bachman
-
Publication number: 20220327407
Abstract: Described herein are systems and methods for providing a natural language comprehension system that employs a two-stage process for machine comprehension of text. The first stage indicates words in one or more text passages that potentially answer a question. The first stage outputs a set of candidate answers for the question, along with a first probability of correctness for each candidate answer. The second stage forms one or more hypotheses by inserting each candidate answer into the question and determines whether a semantic relationship exists between each hypothesis and each sentence in the text. The second processing circuitry generates a second probability of correctness for each candidate answer and combines the first probability with the second probability to produce a score that is used to rank the candidate answers. The candidate answer with the highest score is selected as a predicted answer.
Type: Application
Filed: June 24, 2022
Publication date: October 13, 2022
Applicant: Microsoft Technology Licensing, LLC
Inventors: Adam TRISCHLER, Philip BACHMAN, Xingdi YUAN, Alessandro SORDONI, Zheng YE
-
Patent number: 11379736
Abstract: Described herein are systems and methods for providing a natural language comprehension system that employs a two-stage process for machine comprehension of text. The first stage indicates words in one or more text passages that potentially answer a question. The first stage outputs a set of candidate answers for the question, along with a first probability of correctness for each candidate answer. The second stage forms one or more hypotheses by inserting each candidate answer into the question and determines whether a semantic relationship exists between each hypothesis and each sentence in the text. The second processing circuitry generates a second probability of correctness for each candidate answer and combines the first probability with the second probability to produce a score that is used to rank the candidate answers. The candidate answer with the highest score is selected as a predicted answer.
Type: Grant
Filed: May 17, 2017
Date of Patent: July 5, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Adam Trischler, Philip Bachman, Xingdi Yuan, Alessandro Sordoni, Zheng Ye
-
Publication number: 20210134173
Abstract: A method, system, and storage device storing a computer program, for generating questions based on provided content, such as, for example, a document having words. The method comprises automatically estimating the probability of interesting phrases in the provided content, and generating a question in natural language based on the estimating. In one example embodiment herein, the estimating includes predicting the interesting phrases as answers, and the estimating is performed by a neural model. The method further comprises conditioning a question generation model based on the interesting phrases predicted in the predicting, the question generation model generating the question. The method also can include training the neural model. In one example, the method further comprises identifying start and end locations of the phrases in the provided content, and the identifying includes performing a dot product attention mechanism parameterizing a probability distribution.
Type: Application
Filed: January 7, 2021
Publication date: May 6, 2021
Applicant: Microsoft Technology Licensing, LLC
Inventors: Xingdi YUAN, Tong WANG, Adam Peter TRISCHLER, Sandeep SUBRAMANIAN
-
Patent number: 10902738
Abstract: A method, system, and storage device storing a computer program, for generating questions based on provided content, such as, for example, a document having words. The method comprises automatically estimating the probability of interesting phrases in the provided content, and generating a question in natural language based on the estimating. In one example embodiment herein, the estimating includes predicting the interesting phrases as answers, and the estimating is performed by a neural model. The method further comprises conditioning a question generation model based on the interesting phrases predicted in the predicting, the question generation model generating the question. The method also can include training the neural model. In one example, the method further comprises identifying start and end locations of the phrases in the provided content, and the identifying includes performing a dot product attention mechanism parameterizing a probability distribution.
Type: Grant
Filed: August 3, 2017
Date of Patent: January 26, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Xingdi Yuan, Tong Wang, Adam Peter Trischler, Sandeep Subramanian
-
Publication number: 20200279161
Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
Type: Application
Filed: May 12, 2020
Publication date: September 3, 2020
Applicant: MALUUBA INC.
Inventors: Adam Trischler, Zheng Ye, Xingdi Yuan, Philip Bachman
-
Patent number: 10691999
Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
Type: Grant
Filed: March 16, 2017
Date of Patent: June 23, 2020
Assignee: Maluuba Inc.
Inventors: Adam Trischler, Zheng Ye, Xingdi Yuan, Philip Bachman
-
Publication number: 20190043379
Abstract: A method, system, and storage device storing a computer program, for generating questions based on provided content, such as, for example, a document having words. The method comprises automatically estimating the probability of interesting phrases in the provided content, and generating a question in natural language based on the estimating. In one example embodiment herein, the estimating includes predicting the interesting phrases as answers, and the estimating is performed by a neural model. The method further comprises conditioning a question generation model based on the interesting phrases predicted in the predicting, the question generation model generating the question. The method also can include training the neural model. In one example, the method further comprises identifying start and end locations of the phrases in the provided content, and the identifying includes performing a dot product attention mechanism parameterizing a probability distribution.
Type: Application
Filed: August 3, 2017
Publication date: February 7, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Xingdi YUAN, Tong WANG, Adam Peter TRISCHLER, Sandeep SUBRAMANIAN
-
Publication number: 20170337479
Abstract: Described herein are systems and methods for providing a natural language comprehension system that employs a two-stage process for machine comprehension of text. The first stage indicates words in one or more text passages that potentially answer a question. The first stage outputs a set of candidate answers for the question, along with a first probability of correctness for each candidate answer. The second stage forms one or more hypotheses by inserting each candidate answer into the question and determines whether a semantic relationship exists between each hypothesis and each sentence in the text. The second processing circuitry generates a second probability of correctness for each candidate answer and combines the first probability with the second probability to produce a score that is used to rank the candidate answers. The candidate answer with the highest score is selected as a predicted answer.
Type: Application
Filed: May 17, 2017
Publication date: November 23, 2017
Applicant: Maluuba Inc.
Inventors: Adam Trischler, Philip Bachman, Xingdi Yuan, Alessandro Sordoni, Zheng Ye
-
Publication number: 20170270409
Abstract: Examples of the present disclosure provide systems and methods relating to a machine comprehension test with a learning-based approach, harnessing neural networks arranged in a parallel hierarchy. This parallel hierarchy enables the model to compare the passage, question, and answer from a variety of perspectives, as opposed to using a manually designed set of features. Perspectives may range from the word level to sentence fragments to sequences of sentences, and networks operate on word-embedding representations of text. A training methodology for small data is also provided.
Type: Application
Filed: March 16, 2017
Publication date: September 21, 2017
Applicant: Maluuba Inc.
Inventors: Adam Trischler, Zheng Ye, Xingdi Yuan, Philip Bachman