Patents by Inventor Bing Xiang
Bing Xiang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20210158209
Abstract: Techniques for active learning for document querying machine learning (ML) models as a service are described. A service may perform a search of data of a user, using a machine learning model, for a search query to generate a result, generate a confidence score for the result of the search, select a proper subset of the data to be provided to the user based on the confidence score, display the proper subset of the data to the user, receive an indication from the user of one or more sections of the proper subset of the data for use in a next training iteration of the machine learning model, and perform the next training iteration of the machine learning model with the one or more sections of the proper subset of the data.
Type: Application
Filed: November 27, 2019
Publication date: May 27, 2021
Inventors: Bing XIANG, Jean-Pierre DODEL, Ramesh M. NALLAPATI
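The selection step of the active-learning loop above can be sketched as follows. This is an illustrative toy, not the patented implementation; the confidence band, its bounds, and the function name are all assumptions.

```python
def select_for_feedback(results, scores, low=0.3, high=0.7):
    """Keep results whose confidence falls in an uncertain band; these form
    the 'proper subset' shown to the user, whose feedback then seeds the
    next training iteration of the model."""
    return [r for r, s in zip(results, scores) if low <= s <= high]

# High- and low-confidence results are excluded: the model learns most
# from examples it is unsure about.
uncertain = select_for_feedback(["doc-a", "doc-b", "doc-c"], [0.1, 0.5, 0.9])
```

A confident result (0.9) and a clearly irrelevant one (0.1) are skipped; only the 0.5-confidence result is surfaced for labeling.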
-
Publication number: 20210157857
Abstract: Techniques for generation of synthetic queries from customer data for training of document querying machine learning (ML) models as a service are described. A service may receive one or more documents from a user, generate a set of question and answer pairs from the one or more documents from the user using a machine learning model trained to predict a question from an answer, and store the set of question and answer pairs generated from the one or more documents from the user. The question and answer pairs may be used to train another machine learning model, for example, a document ranking model, a passage ranking model, a question/answer model, or a frequently asked question (FAQ) model.
Type: Application
Filed: November 27, 2019
Publication date: May 27, 2021
Inventors: Cicero NOGUEIRA DOS SANTOS, Xiaofei MA, Peng XU, Ramesh M. NALLAPATI, Bing XIANG, Sudipta SENGUPTA, Zhiguo WANG, Patrick NG
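The answer-to-question generation described above can be sketched with a stand-in model. The naive sentence split for answer candidates and the `question_model` callable are assumptions for illustration; a real pipeline would use a trained sequence model.

```python
def generate_qa_pairs(documents, question_model):
    """For each candidate answer span in each document, predict a question
    with an (assumed) answer-to-question model, yielding synthetic
    question/answer training pairs."""
    pairs = []
    for doc in documents:
        for answer in doc.split(". "):  # naive answer-span candidates
            if answer:
                pairs.append((question_model(answer), answer.strip(". ")))
    return pairs

# A toy 'model' that just templates a question from the answer text.
toy_model = lambda a: "Q: " + a.strip(". ") + "?"
pairs = generate_qa_pairs(["Cats purr. Dogs bark."], toy_model)
```

The resulting pairs could then feed training of a ranking, question/answer, or FAQ model, as the abstract notes.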
-
Publication number: 20210157845
Abstract: Techniques for searching documents are described. An exemplary method includes receiving a document search query; querying at least one index based upon the document search query to identify matching data; fetching the identified matched data; determining one or more of a top ranked passage and top ranked documents from the set of documents based upon one or more invocations of one or more machine learning models based at least on the fetched identified matched data and the document search query; and returning one or more of the top ranked passage and the proper subset of documents.
Type: Application
Filed: November 27, 2019
Publication date: May 27, 2021
Inventors: Jean-Pierre DODEL, Zhiheng HUANG, Xiaofei MA, Ramesh M. NALLAPATI, Krishnakumar RAJAGOPALAN, Milan SAINI, Sudipta SENGUPTA, Saurabh Kumar SINGH, Dimitrios SOULIOS, Ankit SULTANIA, Dong WANG, Zhiguo WANG, Bing XIANG, Peng XU, Yong YUAN
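The retrieve-then-rerank flow in this abstract can be sketched in a few lines. The dict-backed index and the substring-count ranker below are stand-ins, assumed for illustration, for a real inverted index and a machine-learning ranking model.

```python
def search(query, index, ranker, k=10):
    """Sketch of the described pipeline: query the index for matching
    candidates, then rerank the fetched candidates with a scoring model
    and return the top-ranked results."""
    candidates = index.get(query, [])   # step 1: index lookup
    scored = sorted(candidates,         # step 2: model-based reranking
                    key=lambda doc: ranker(query, doc), reverse=True)
    return scored[:k]                   # step 3: top-ranked subset

index = {"q": ["apple pie", "q and a", "q q q"]}
ranker = lambda query, doc: doc.count(query)  # stand-in relevance score
top = search("q", index, ranker, k=2)
```

In the patented setting the reranker would be one or more ML model invocations over both the query and the fetched data.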
-
Publication number: 20210157854
Abstract: Techniques for displaying a search are described. An exemplary method includes receiving a search query; performing the search query on a plurality of documents, the documents including text passages, to generate a search query result; determining an aspect of the search query result that has a confidence value that exceeds a first confidence threshold with respect to its relevance to the search query; and displaying the search result including an emphasis on the aspect of the result that exceeds the first confidence threshold.
Type: Application
Filed: November 27, 2019
Publication date: May 27, 2021
Inventors: Zhiguo WANG, Zhiheng HUANG, Ramesh M. NALLAPATI, Bing XIANG
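The confidence-gated emphasis can be illustrated with a small helper. The markdown-style emphasis markers and the threshold value are assumptions; the patent describes only that high-confidence aspects are displayed with emphasis.

```python
def emphasize(aspects, threshold=0.8):
    """Given (text, confidence) aspects of a search result, wrap those
    whose confidence exceeds the threshold in emphasis markers (markers
    here are illustrative, not part of the described technique)."""
    return [f"**{text}**" if conf > threshold else text
            for text, conf in aspects]

rendered = emphasize([("answer span", 0.95), ("context", 0.4)])
```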
-
Publication number: 20210157861
Abstract: Techniques for intaking one or more documents are described.
Type: Application
Filed: November 27, 2019
Publication date: May 27, 2021
Inventors: Jared Lee KATZMAN, Nithin KUNALA, Bing XIANG, Krishnakumar RAJAGOPALAN, Andrew M. GRANT
-
Publication number: 20200288795
Abstract: Embodiments relate generally to systems and methods for providing an anti-counterfeit element within a filtering mask. A method of manufacturing a filtering mask may comprise extruding a filtering material from a source of the filtering material; melt-blowing the extruded filtering material to form a plurality of fibers; collecting the melt-blown fibers onto an outer surface of a lathe, wherein the outer surface of the lathe comprises a logo pattern; forming a fiber material about the logo pattern on the outer surface of the lathe via the collected melt-blown fibers; removing the formed fiber material to create an anti-counterfeit layer for use in the filtering mask, wherein at least one surface of the anti-counterfeit layer comprises the logo pattern within the fiber material; and assembling the anti-counterfeit layer within the filtering mask.
Type: Application
Filed: November 30, 2017
Publication date: September 17, 2020
Inventors: Hong Bing XIANG, Weifeng SHEN, Yuzheng LU, Chuang MA
-
Publication number: 20200073761
Abstract: Embodiments of the present disclosure provide a hot backup system, a hot backup method, and a computer device. The hot backup system includes a centralized management module, a master server, a slave server and a delay server. The master server is configured to receive a write instruction sent by the centralized management module, and write first data to a database of the master server based on the write instruction. The slave server is configured to perform data synchronization with the master server in real time, receive a read instruction sent by the centralized management module, and send second data read based on the read instruction to the centralized management module to cause the centralized management module to send the second data to the service server.
Type: Application
Filed: August 28, 2019
Publication date: March 5, 2020
Inventors: Bing XIANG, Xiaoliang CONG
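The master/slave/delay topology described above can be sketched as an in-process toy. Real systems replicate over the network and the delay interval is configurable; here the dicts, the single-step `tick`, and the method names are all assumptions for illustration.

```python
class HotBackup:
    """Toy sketch of the described topology: writes go to the master,
    the slave synchronizes in real time and serves reads, and the delay
    server applies changes only after a lag (one tick here)."""

    def __init__(self):
        self.master, self.slave, self.delay = {}, {}, {}
        self.pending = []  # writes awaiting delayed replication

    def write(self, key, value):
        self.master[key] = value           # write instruction hits the master
        self.slave[key] = value            # real-time synchronization
        self.pending.append((key, value))  # delay server applies later

    def read(self, key):
        return self.slave.get(key)         # reads are offloaded to the slave

    def tick(self):
        """Advance the delay server by one replication step."""
        if self.pending:
            key, value = self.pending.pop(0)
            self.delay[key] = value
```

The lagging copy gives an operator a window to recover data that a bad write has already propagated to the real-time replica.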
-
Publication number: 20190377747
Abstract: Software that generates an answer to an input question using a source document by performing the following operations: (i) receiving a question; (ii) generating a plurality of vectors including a first vector representation of a term in the question and a second vector representation of a term in a source document; (iii) providing each dimension of each of the first vector representation and the second vector representation into a respective input node of an artificial neural network; (iv) determining whether the source document is relevant to answering the question based, at least in part, on an output generated by the artificial neural network; and (v) in response to determining that the source document is relevant, generating an answer to the question utilizing the source document.
Type: Application
Filed: August 21, 2019
Publication date: December 12, 2019
Inventors: James J. Fan, Chang Wang, Bing Xiang, Bowen Zhou
-
Patent number: 10467268
Abstract: Software that compares vector representations of question terms and passage terms in question answering systems by performing the following steps: (i) receiving a question; (ii) generating a plurality of vectors including a first vector representation of a term in the question and a second vector representation of a term in a set of natural language text; (iii) generating a similarity score representing an amount of similarity between the first vector representation and the second vector representation; and (iv) determining whether the set of natural language text is relevant to the question based, at least in part, on the generated similarity score.
Type: Grant
Filed: June 2, 2015
Date of Patent: November 5, 2019
Assignee: International Business Machines Corporation
Inventors: James J. Fan, Chang Wang, Bing Xiang, Bowen Zhou
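The similarity-score step above is commonly realized with cosine similarity between term vectors; the abstract does not name a metric, so cosine and the relevance threshold below are assumptions for illustration.

```python
import math

def cosine(u, v):
    """Cosine similarity between two term vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = (math.sqrt(sum(a * a for a in u))
            * math.sqrt(sum(b * b for b in v)))
    return dot / norm if norm else 0.0

def is_relevant(question_vec, passage_vec, threshold=0.5):
    """Judge a passage relevant when its term vector is similar enough to
    the question's term vector (threshold is an assumed parameter)."""
    return cosine(question_vec, passage_vec) >= threshold
```

Parallel vectors score 1.0 and orthogonal vectors 0.0, so the threshold cleanly separates aligned from unrelated term representations.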
-
Patent number: 10467270
Abstract: Software that compares vector representations of question terms and passage terms in question answering systems by performing the following steps: (i) receiving a question; (ii) generating a plurality of vectors including a first vector representation of a term in the question and a second vector representation of a term in a set of natural language text; (iii) generating a similarity score representing an amount of similarity between the first vector representation and the second vector representation; and (iv) determining whether the set of natural language text is relevant to the question based, at least in part, on the generated similarity score.
Type: Grant
Filed: June 14, 2016
Date of Patent: November 5, 2019
Assignee: International Business Machines Corporation
Inventors: James J. Fan, Chang Wang, Bing Xiang, Bowen Zhou
-
Patent number: 10329700
Abstract: A method for producing a fluffy temperature regulating warmth retention material and the fluffy temperature regulating warmth retention material produced therefrom are disclosed. The method comprises: selecting a low melting point fiber and an additional fiber; carding to form a single web; spray coating a phase change material along at least part of the length of a surface of the single web; lapping layer by layer of the single web; and performing a heat setting reinforcement to form the warmth retention material. According to the present invention, a fluffy temperature regulating warmth retention material comprising an appropriate ratio of a phase change material may be obtained. The material exhibits a satisfactory temperature regulating effect while maintaining, to the full extent or nearly so, the original filling power and soft hand feel it has when no phase change material is incorporated.
Type: Grant
Filed: October 30, 2014
Date of Patent: June 25, 2019
Assignee: 3M Innovative Properties Company
Inventors: Feng Xu, Guo Tong Zhao, Xiaoshuan Fu, Hong Bing Xiang, Yue Ge
-
Patent number: 9947314
Abstract: Software that trains an artificial neural network for generating vector representations for natural language text, by performing the following steps: (i) receiving, by one or more processors, a set of natural language text; (ii) generating, by one or more processors, a set of first metadata for the set of natural language text, where the first metadata is generated using supervised learning method(s); (iii) generating, by one or more processors, a set of second metadata for the set of natural language text, where the second metadata is generated using unsupervised learning method(s); and (iv) training, by one or more processors, an artificial neural network adapted to generate vector representations for natural language text, where the training is based, at least in part, on the received natural language text, the generated set of first metadata, and the generated set of second metadata.
Type: Grant
Filed: February 21, 2017
Date of Patent: April 17, 2018
Assignee: International Business Machines Corporation
Inventors: Liangliang Cao, James J. Fan, Chang Wang, Bing Xiang, Bowen Zhou
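The data-preparation side of the training described above can be sketched as pairing each text with its two metadata sets. The stand-in tagger and clusterer callables, and the dict layout, are assumptions; the patent leaves the specific supervised and unsupervised methods open.

```python
def build_training_examples(texts, supervised_tagger, unsupervised_clusterer):
    """Pair each text with 'first metadata' from a supervised method and
    'second metadata' from an unsupervised method, as inputs for training
    a vector-representation network."""
    return [
        {
            "text": text,
            "first_metadata": supervised_tagger(text),       # e.g. POS tags
            "second_metadata": unsupervised_clusterer(text), # e.g. word clusters
        }
        for text in texts
    ]

# Toy stand-ins: a 'tagger' that labels every token NN, and a 'clusterer'
# that buckets tokens by word length.
tagger = lambda t: ["NN" for _ in t.split()]
clusterer = lambda t: [len(w) % 3 for w in t.split()]
examples = build_training_examples(["the cat"], tagger, clusterer)
```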
-
Patent number: 9922025
Abstract: A computer program that generates a vector representation of a set of natural language text in a natural language processing system by: (i) receiving a first set of natural language text and a set of information pertaining to the first set of natural language text, where the information includes a dependency parse tree including a root node and a plurality of nodes that depend from the root node, where the root node represents the first set of natural language text, and where the plurality of nodes that depend from the root node represent context features of the first set of natural language text; and (ii) generating, by the natural language processing system, a first vector representation of the first set of natural language text, wherein the generating includes adding vector representations for the context features represented by the plurality of nodes that depend from the root node.
Type: Grant
Filed: August 8, 2017
Date of Patent: March 20, 2018
Assignee: International Business Machines Corporation
Inventors: James H. Cross, III, James J. Fan, Bing Xiang, Bowen Zhou
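The additive composition step above, where child-node vectors are summed into the root's representation, can be sketched directly. Plain lists stand in for learned embeddings, and the dict of context features keyed by dependency label is an assumption for illustration.

```python
def compose_vector(child_vectors):
    """Build a text's vector representation by adding the vectors of the
    context features that depend from the root node of its dependency
    parse tree, as the abstract describes."""
    dim = len(next(iter(child_vectors.values())))
    total = [0.0] * dim
    for vec in child_vectors.values():  # one vector per dependent node
        total = [t + v for t, v in zip(total, vec)]
    return total

# Toy context features for a root representing "the cat sat":
root_vec = compose_vector({"nsubj:cat": [1, 0], "det:the": [0, 1]})
```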
-
Patent number: 9898458
Abstract: A computer program that uses structured information, such as syntactic and semantic information, as context for representing words and/or phrases as vectors, by performing the following steps: (i) receiving a first set of natural language text and a set of information pertaining to the first set of natural language text, where the information includes metadata and corresponding contextual information indicating a relationship between the metadata and the first set of natural language text; and (ii) generating a first vector representation for the first set of natural language text utilizing the metadata and its corresponding contextual information.
Type: Grant
Filed: May 8, 2015
Date of Patent: February 20, 2018
Assignee: International Business Machines Corporation
Inventors: James H. Cross, III, James J. Fan, Bing Xiang, Bowen Zhou
-
Patent number: 9892113
Abstract: A computer program that uses structured information, such as syntactic and semantic information, as context for representing words and/or phrases as vectors, by performing the following steps: (i) receiving a first set of natural language text and a set of information pertaining to the first set of natural language text, where the information includes metadata and corresponding contextual information indicating a relationship between the metadata and the first set of natural language text; and (ii) generating a first vector representation for the first set of natural language text utilizing the metadata and its corresponding contextual information.
Type: Grant
Filed: June 14, 2016
Date of Patent: February 13, 2018
Assignee: International Business Machines Corporation
Inventors: James H. Cross, III, James J. Fan, Bing Xiang, Bowen Zhou
-
Publication number: 20170337183
Abstract: A computer program that generates a vector representation of a set of natural language text in a natural language processing system by: (i) receiving a first set of natural language text and a set of information pertaining to the first set of natural language text, where the information includes a dependency parse tree including a root node and a plurality of nodes that depend from the root node, where the root node represents the first set of natural language text, and where the plurality of nodes that depend from the root node represent context features of the first set of natural language text; and (ii) generating, by the natural language processing system, a first vector representation of the first set of natural language text, wherein the generating includes adding vector representations for the context features represented by the plurality of nodes that depend from the root node.
Type: Application
Filed: August 8, 2017
Publication date: November 23, 2017
Inventors: JAMES H. CROSS, III, JAMES J. FAN, BING XIANG, BOWEN ZHOU
-
Publication number: 20170308790
Abstract: According to an aspect, a method includes configuring a convolutional neural network (CNN) for classifying text based on word embedding features into a predefined set of classes identified by class labels. The predefined set of classes includes a class labeled none-of-the-above for text that does not fit into any of the other classes in the predefined set of classes. The CNN is trained based on a set of training data. The training includes learning parameters of class distributed vector representations (DVRs) of each of the predefined set of classes. The learning includes minimizing a pair-wise ranking loss function over the set of training data. A class embedding matrix of the class DVRs of the predefined set of classes that excludes a class embedding for the none-of-the-above class is generated. Each column in the class embedding matrix corresponds to one of the predefined classes.
Type: Application
Filed: April 21, 2016
Publication date: October 26, 2017
Inventors: Cicero Nogueira dos Santos, Bing Xiang, Bowen Zhou
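A pair-wise ranking loss of the kind this abstract references can be sketched as below. The margins and scaling factor are assumed hyperparameters, and this is one common form of such a loss, not necessarily the exact function in the application; it pushes the score of the correct class up and the score of the best competing class down, which is what allows the none-of-the-above class to go without its own embedding column (it is learned only through the negative term).

```python
import math

def pairwise_ranking_loss(s_pos, s_neg, m_pos=2.5, m_neg=0.5, gamma=2.0):
    """Illustrative pair-wise ranking loss: penalize the correct-class
    score s_pos for falling below margin m_pos, and the best wrong-class
    score s_neg for rising above -m_neg."""
    return (math.log1p(math.exp(gamma * (m_pos - s_pos)))
            + math.log1p(math.exp(gamma * (m_neg + s_neg))))

# Well-separated scores yield a near-zero loss; ambiguous scores do not.
good = pairwise_ranking_loss(s_pos=5.0, s_neg=-5.0)
bad = pairwise_ranking_loss(s_pos=0.0, s_neg=0.0)
```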
-
Publication number: 20170231310
Abstract: Disclosed are a stuffing (4) and a method for manufacturing the same, the stuffing (4) comprising a branch (402) and trunk (401) structure formed with a microporous polymer, having a trunk (401) and branches (402) formed by splitting from the trunk (401). The stuffing (4) of the present invention can maintain a dispersed stuffing state even after multiple washings, without notable agglomeration or entanglement.
Type: Application
Filed: August 21, 2015
Publication date: August 17, 2017
Inventors: Hong Bing Xiang, Feng Xu, Guo Tong Zhao
-
Publication number: 20170162189
Abstract: Software that trains an artificial neural network for generating vector representations for natural language text, by performing the following steps: (i) receiving, by one or more processors, a set of natural language text; (ii) generating, by one or more processors, a set of first metadata for the set of natural language text, where the first metadata is generated using supervised learning method(s); (iii) generating, by one or more processors, a set of second metadata for the set of natural language text, where the second metadata is generated using unsupervised learning method(s); and (iv) training, by one or more processors, an artificial neural network adapted to generate vector representations for natural language text, where the training is based, at least in part, on the received natural language text, the generated set of first metadata, and the generated set of second metadata.
Type: Application
Filed: February 21, 2017
Publication date: June 8, 2017
Inventors: Liangliang Cao, James J. Fan, Chang Wang, Bing Xiang, Bowen Zhou
-
Patent number: 9672814
Abstract: Software that trains an artificial neural network for generating vector representations for natural language text, by performing the following steps: (i) receiving, by one or more processors, a set of natural language text; (ii) generating, by one or more processors, a set of first metadata for the set of natural language text, where the first metadata is generated using supervised learning method(s); (iii) generating, by one or more processors, a set of second metadata for the set of natural language text, where the second metadata is generated using unsupervised learning method(s); and (iv) training, by one or more processors, an artificial neural network adapted to generate vector representations for natural language text, where the training is based, at least in part, on the received natural language text, the generated set of first metadata, and the generated set of second metadata.
Type: Grant
Filed: May 8, 2015
Date of Patent: June 6, 2017
Assignee: International Business Machines Corporation
Inventors: Liangliang Cao, James J. Fan, Chang Wang, Bing Xiang, Bowen Zhou