Patents by Inventor Yanbin Zhao
Yanbin Zhao has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12373735
Abstract: A method for pre-training a language model includes: constructing a pre-training language data set, in which the pre-training language data set comprises unsupervised language data and supervised language data; generating a hierarchical multi-template and multi-task language data set based on the pre-training language data set; and pre-training the language model based on the hierarchical multi-template and multi-task language data set.
Type: Grant
Filed: March 7, 2023
Date of Patent: July 29, 2025
Assignee: BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD.
Inventors: Junyuan Shang, Shuohuan Wang, Siyu Ding, Yanbin Zhao, Chao Pang, Yu Sun, Hao Tian, Hua Wu, Haifeng Wang
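The dataset-construction step in this abstract can be pictured as expanding each supervised example through several task templates while keeping unsupervised text as plain language-modeling data. The sketch below is a minimal illustration under assumed template names; it is not the patented implementation.

```python
def build_multi_task_dataset(unsupervised, supervised):
    """Wrap raw text and labeled (task, input, target) triples in per-task
    prompt templates, yielding a multi-template, multi-task dataset."""
    examples = []
    # Unsupervised data becomes plain language-modeling examples.
    for text in unsupervised:
        examples.append({"task": "lm", "input": text, "target": text})
    # Supervised data is expanded once per (task, template) pair; the
    # template wording here is an illustrative assumption.
    templates = {
        "sentiment": ["What is the sentiment of: {x}", "Classify: {x}"],
        "qa": ["Answer the question: {x}", "Q: {x} A:"],
    }
    for task, x, y in supervised:
        for tpl in templates.get(task, ["{x}"]):
            examples.append({"task": task, "input": tpl.format(x=x), "target": y})
    return examples
```

The multi-template expansion is what gives the hierarchy: one labeled example contributes several differently-phrased training instances for the same task.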
-
Patent number: 12314677
Abstract: A method and apparatus for pre-training a model, a device, a storage medium, and a program product. An embodiment of the method includes: acquiring a sample natural language text; generating N types of prompt words based on the sample natural language text, where N is a positive integer; generating sample input data based on the sample natural language text and the N types of prompt words; and training an initial language model based on the sample input data, to obtain a pre-trained language model.
Type: Grant
Filed: August 16, 2022
Date of Patent: May 27, 2025
Assignee: BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD.
Inventors: Junyuan Shang, Shuohuan Wang, Siyu Ding, Yanbin Zhao, Chao Pang, Yu Sun
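The "N types of prompt words" step can be sketched as applying N different prompt generators to the same sample text. The prompt categories below (task, domain, style) are assumptions chosen for illustration; the abstract does not specify them.

```python
def make_sample_inputs(text, n_types=2):
    """Generate sample input data by prefixing the text with N types of
    prompt words. The specific prompt families are illustrative only."""
    prompt_generators = [
        lambda t: f"[TASK] {t}",          # task-oriented prompt
        lambda t: f"[DOMAIN] news: {t}",  # domain-oriented prompt
        lambda t: f"[STYLE] formal: {t}", # style-oriented prompt
    ]
    if not 1 <= n_types <= len(prompt_generators):
        raise ValueError("n_types must be a positive integer within range")
    return [g(text) for g in prompt_generators[:n_types]]
```

Each returned string would serve as one training input for the initial language model.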
-
Publication number: 20240412002
Abstract: A method is provided. The method includes: obtaining a first sample dataset; inputting at least one first question text corresponding to at least one piece of first sample data into a dialog model separately to obtain at least one first answer prediction result; inputting each second question text into the dialog model to obtain a second answer prediction result output by the dialog model; inputting the second answer prediction result into a reward model to obtain a score of the second answer prediction result output by the reward model; determining a comprehensive loss based on the at least one first answer prediction result, a first answer text of each of the at least one piece of first sample data, and a score corresponding to each of at least one piece of second sample data; and adjusting at least one parameter of the dialog model based on the comprehensive loss.
Type: Application
Filed: June 19, 2024
Publication date: December 12, 2024
Applicant: BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD.
Inventors: Yanbin ZHAO, Siyu DING, Shuohuan WANG, Yu SUN, Hao TIAN, Hua WU, Haifeng WANG
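The "comprehensive loss" here blends a supervised term (predictions versus reference answers) with a reward-model term (scores for the second batch of answers). A minimal numeric sketch, assuming a simple weighted combination that the abstract does not actually specify:

```python
def comprehensive_loss(supervised_losses, reward_scores, alpha=0.5):
    """Blend the mean supervised loss with the mean negative reward score.
    The weighting scheme (alpha) is an illustrative assumption."""
    sup = sum(supervised_losses) / len(supervised_losses)
    # Higher reward should lower the loss, hence the negation.
    rew = -sum(reward_scores) / len(reward_scores)
    return alpha * sup + (1 - alpha) * rew
```

The dialog model's parameters would then be adjusted by gradient descent on this combined quantity, letting reference answers and reward scores pull on the model simultaneously.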
-
Patent number: 12131728
Abstract: The present application provides a method of training a natural language processing model, which relates to a field of artificial intelligence, and in particular to a field of natural language processing. A specific implementation scheme includes: performing a semantic learning for multi-tasks on an input text, so as to obtain a semantic feature for the multi-tasks, wherein the multi-tasks include a plurality of branch tasks; performing a feature learning for each branch task based on the semantic feature, so as to obtain a first output result for each branch task; calculating a loss for each branch task according to the first output result for the branch task; and adjusting a parameter of the natural language processing model according to the loss for each branch task. The present application further provides a method of processing a natural language, an electronic device, and a storage medium.
Type: Grant
Filed: May 31, 2022
Date of Patent: October 29, 2024
Assignee: BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD.
Inventors: Siyu Ding, Chao Pang, Shuohuan Wang, Yanbin Zhao, Junyuan Shang, Yu Sun, Shikun Feng, Hao Tian, Hua Wu, Haifeng Wang
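The per-branch-task loss computation can be sketched as one loss function per branch applied to that branch's output, with the total driving a single parameter update. How the per-branch losses are combined is an assumption here; the abstract only says the parameters are adjusted "according to the loss for each branch task".

```python
def total_multitask_loss(branch_outputs, branch_targets, loss_fns):
    """Sum per-branch losses over outputs derived from a shared semantic
    feature. branch_outputs / branch_targets / loss_fns are dicts keyed by
    branch-task name; summation is an illustrative combination choice."""
    return sum(
        loss_fns[task](branch_outputs[task], branch_targets[task])
        for task in branch_outputs
    )
```

Different branches can use different loss functions (e.g. squared error for one, absolute error for another) while still feeding one backward pass through the shared encoder.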
-
Publication number: 20230252354
Abstract: A method for pre-training a language model includes: constructing a pre-training language data set, in which the pre-training language data set comprises unsupervised language data and supervised language data; generating a hierarchical multi-template and multi-task language data set based on the pre-training language data set; and pre-training the language model based on the hierarchical multi-template and multi-task language data set.
Type: Application
Filed: March 7, 2023
Publication date: August 10, 2023
Inventors: Junyuan SHANG, Shuohuan WANG, Siyu DING, Yanbin ZHAO, Chao PANG, Yu SUN, Hao TIAN, Hua WU, Haifeng WANG
-
Publication number: 20230206080
Abstract: A model training system includes at least one first cluster and a second cluster communicating with the at least one first cluster. The at least one first cluster is configured to acquire a sample data set, generate training data according to the sample data set, and send the training data to the second cluster; and the second cluster is configured to train a pre-trained model according to the training data sent by the at least one first cluster.
Type: Application
Filed: March 7, 2023
Publication date: June 29, 2023
Inventors: Shuohuan WANG, Weibao GONG, Zhihua WU, Yu SUN, Siyu DING, Yaqian HAN, Yanbin ZHAO, Yuang LIU, Dianhai YU
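The two-cluster split described here is essentially a producer/consumer pipeline: one cluster prepares training data, the other consumes it for training. A toy sketch using an in-process queue as a stand-in for the inter-cluster channel (the real system would use network transport, and the "preprocessing" shown is a placeholder):

```python
from queue import Queue

def preprocess_cluster(samples, channel):
    """First cluster: turn raw samples into training data and send them on.
    Lowercasing stands in for real feature generation."""
    for s in samples:
        channel.put(s.strip().lower())

def training_cluster(channel, n):
    """Second cluster: receive n items of training data for a training step.
    Returning the batch stands in for running the actual update."""
    return [channel.get() for _ in range(n)]
```

Decoupling data generation from training this way lets each side scale independently, which is the point of using separate clusters.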
-
Publication number: 20230040095
Abstract: A method and apparatus for pre-training a model, a device, a storage medium, and a program product. An embodiment of the method includes: acquiring a sample natural language text; generating N types of prompt words based on the sample natural language text, where N is a positive integer; generating sample input data based on the sample natural language text and the N types of prompt words; and training an initial language model based on the sample input data, to obtain a pre-trained language model.
Type: Application
Filed: August 16, 2022
Publication date: February 9, 2023
Inventors: Junyuan SHANG, Shuohuan WANG, Siyu DING, Yanbin ZHAO, Chao PANG, Yu Sun
-
Publication number: 20220398277
Abstract: A method for recommending works is provided. The method includes: receiving, from a login account of an application, a recommendation request; acquiring, in response to the recommendation request, a first candidate work set of each type of a plurality of types, wherein the first candidate work set includes multimedia works posted by an associated account of the login account in the application; screening the first candidate work sets, and aggregating screening results into a second candidate work set, wherein the second candidate work set includes multimedia works of the plurality of types; and ranking multimedia works of the plurality of types in the second candidate work set, and recommending the multimedia works to the login account based on a ranking result.
Type: Application
Filed: August 19, 2022
Publication date: December 15, 2022
Inventors: Haocheng WEN, Gang ZENG, Yanbin ZHAO
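The screen-aggregate-rank pipeline in this abstract maps onto a short generic function. The screening predicate and ranking key below are illustrative assumptions; the abstract does not say what criteria the real system uses.

```python
def recommend(candidate_sets, screen, rank, k=3):
    """candidate_sets: {work_type: [works]} of multimedia works posted by
    accounts associated with the login account. Screen each per-type set,
    aggregate the survivors, then rank across types and return the top k."""
    pool = [w for works in candidate_sets.values() for w in works if screen(w)]
    return sorted(pool, key=rank, reverse=True)[:k]
```

Keeping screening per-type but ranking over the aggregated pool is what lets works of different types compete in a single recommendation list.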
-
Publication number: 20220293092
Abstract: The present application provides a method of training a natural language processing model, which relates to a field of artificial intelligence, and in particular to a field of natural language processing. A specific implementation scheme includes: performing a semantic learning for multi-tasks on an input text, so as to obtain a semantic feature for the multi-tasks, wherein the multi-tasks include a plurality of branch tasks; performing a feature learning for each branch task based on the semantic feature, so as to obtain a first output result for each branch task; calculating a loss for each branch task according to the first output result for the branch task; and adjusting a parameter of the natural language processing model according to the loss for each branch task. The present application further provides a method of processing a natural language, an electronic device, and a storage medium.
Type: Application
Filed: May 31, 2022
Publication date: September 15, 2022
Inventors: Siyu DING, Chao PANG, Shuohuan WANG, Yanbin ZHAO, Junyuan SHANG, Yu SUN, Shikun FENG, Hao TIAN, Hua WU, Haifeng WANG
-
Virtual machine live migration method, virtual machine deployment method, server, and cluster system
Patent number: 10334034
Abstract: A virtual machine live migration method includes: acquiring load information of physical machines in a first physical machine group, where the physical machines share a same access switch; determining a source physical machine and a destination physical machine according to a first dynamic resource scheduling policy and the load information of the physical machines; and delivering a migration instruction to the source physical machine according to a second dynamic resource scheduling policy. In the foregoing method, virtual machine live migration is performed preferentially within a physical machine group so that network traffic of the migration passes through only one access switch. This shortens a length of a data transmission link, increases a migration rate, and relieves impact of migration traffic on network load in a cluster.
Type: Grant
Filed: November 3, 2014
Date of Patent: June 25, 2019
Assignee: HUAWEI TECHNOLOGIES CO., LTD.
Inventor: Yanbin Zhao
-
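The source/destination selection within a single physical-machine group can be sketched as picking an overloaded source and an underloaded destination from machines that share one access switch. The load thresholds below are illustrative assumptions standing in for the dynamic resource scheduling policies the abstract refers to.

```python
def pick_migration_pair(group_loads, high=0.8, low=0.4):
    """Pick (source, destination) within one physical-machine group so that
    migration traffic stays behind a single access switch.
    group_loads: {physical_machine_name: load_fraction}.
    Thresholds high/low are illustrative policy stand-ins."""
    source = max(group_loads, key=group_loads.get)
    dest = min(group_loads, key=group_loads.get)
    if source != dest and group_loads[source] >= high and group_loads[dest] <= low:
        return source, dest
    return None  # no migration needed, or no suitable pair in this group
```

Only when no suitable pair exists inside the group would a scheduler fall back to cross-group migration, which is what keeps the transmission link short in the common case.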
Virtual Machine Live Migration Method, Virtual Machine Deployment Method, Server, and Cluster System
Publication number: 20150052254
Abstract: A virtual machine live migration method includes: acquiring load information of physical machines in a first physical machine group, where the physical machines share a same access switch; determining a source physical machine and a destination physical machine according to a first dynamic resource scheduling policy and the load information of the physical machines; and delivering a migration instruction to the source physical machine according to a second dynamic resource scheduling policy. In the foregoing method, virtual machine live migration is performed preferentially within a physical machine group so that network traffic of the migration passes through only one access switch. This shortens a length of a data transmission link, increases a migration rate, and relieves impact of migration traffic on network load in a cluster.
Type: Application
Filed: November 3, 2014
Publication date: February 19, 2015
Inventor: Yanbin Zhao
-
Publication number: 20140082202
Abstract: The present invention discloses a method and an apparatus for integration of a virtual cluster, and a virtual cluster system. The method includes: selecting a physical machine to be integrated according to a cluster load; determining a migration time and an interrupt time; determining a migration cost according to the migration time and the interrupt time to migrate each virtual machine on the physical machine to be integrated; selecting a virtual machine to be migrated according to the migration cost; selecting a target physical machine according to the cluster load; and migrating the selected virtual machine to be migrated to the selected target physical machine. With the present invention, the migration time and the interrupt time are determined; and the migration cost to migrate the virtual machine is determined according to the migration time and the interrupt time, so that virtual cluster integration is optimized according to the migration cost.
Type: Application
Filed: November 15, 2013
Publication date: March 20, 2014
Applicant: HUAWEI TECHNOLOGIES CO., LTD.
Inventor: Yanbin Zhao
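The cost-based VM selection in this abstract reduces to computing a cost from each VM's migration time and interrupt time, then migrating the cheapest VM. The weighted-sum cost model below is an assumption for illustration; the abstract says only that the cost is determined "according to" the two times.

```python
def migration_cost(migration_time, interrupt_time, w=0.7):
    """Combine migration time and interrupt time into one cost figure.
    Interrupt time is weighted more heavily here because service downtime
    is usually costlier than background copy time; the weight w is an
    illustrative assumption."""
    return w * interrupt_time + (1 - w) * migration_time

def select_vm_to_migrate(vms):
    """vms: {vm_name: (migration_time, interrupt_time)}.
    Pick the VM whose migration cost is lowest."""
    return min(vms, key=lambda v: migration_cost(*vms[v]))
```

Under this model, a VM with a long total copy time but near-zero interruption can still be the preferred migration candidate.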
-
Patent number: D868332
Type: Grant
Filed: May 28, 2018
Date of Patent: November 26, 2019
Assignee: Hengdian Group Tospo Lighting Co., Ltd.
Inventors: Yanbin Zhao, Junchu Si, Yufei Zhao, Xuemei Guo, Meiling Jin