Patents by Inventor Liangchao Wu

Liangchao Wu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230418470
    Abstract: Disclosed in the embodiments of the present disclosure are a data processing method and apparatus, and an electronic device. A specific implementation of the method comprises: determining a target first storage region from among a preset number of first storage regions; on the basis of an interaction process with a second device, determining identical target identification information contained in both the target first storage region and a target second storage region, and determining whether the interaction process meets a target requirement; and, in response to the interaction process meeting the target requirement, storing target first data identified by the target identification information, so that the second device stores target second data identified by the same target identification information. Thus, the target first data and the target second data identified by the same target identification information in the target first storage region and the target second storage region are aligned. (An illustrative sketch of this identifier-alignment step appears after this listing.)
    Type: Application
    Filed: November 15, 2021
    Publication date: December 28, 2023
    Inventors: Liangchao WU, Junyuan XIE, Lizhe ZHANG, Di WU, Xiaobing LIU
  • Patent number: 11809429
    Abstract: Provided are a method and an apparatus for processing model parameters. The method comprises: obtaining a model parameter set to be sharded, wherein the model parameter set comprises a multi-dimensional array corresponding to a feature embedding; obtaining attribute information for a storage system used to store the model parameter set, wherein that storage system differs from the system on which the model corresponding to the parameter set runs; and storing the model parameter set in the storage system according to the attribute information. (An illustrative sharding sketch appears after this listing.)
    Type: Grant
    Filed: August 12, 2022
    Date of Patent: November 7, 2023
    Assignee: BEIJING BYTEDANCE NETWORK TECHNOLOGY CO., LTD.
    Inventors: Cheng Chen, Peng Zhao, Di Wu, Junyuan Xie, Chenliaohui Fang, Longyijia Li, Long Huang, Liangchao Wu, Long Chang, Lizhe Zhang, Yixiang Chen, Xiaobing Liu
  • Patent number: 11811864
    Abstract: A network connection method and device are provided for a training participant of a joint training model, where the training participant operates in master-worker mode. The method includes: acquiring communication state information of a worker, the communication state information indicating the communication connection phase that the worker is in; acquiring communication state information of a target worker as target communication state information, where the target worker includes a peer node corresponding to the worker, and the peer node belongs to a different training participant of the joint training model; and resetting, in response to determining that the target communication state information does not match the communication state information of the worker, the communication connection phase that the worker is in. (An illustrative phase-check sketch appears after this listing.)
    Type: Grant
    Filed: August 12, 2022
    Date of Patent: November 7, 2023
    Assignee: Douyin Vision Co., Ltd.
    Inventors: Longyijia Li, Cheng Chen, Di Wu, Chenliaohui Fang, Peng Zhao, Junyuan Xie, Yixiang Chen, Liangchao Wu, Long Chang, Xiaobing Liu
  • Patent number: 11755691
    Abstract: Disclosed are a data protection method and apparatus, a server, and a medium. A particular embodiment of the method comprises: acquiring gradient-associated information corresponding, respectively, to a target sample that belongs to a binary classification sample set with an unbalanced distribution and to a reference sample from the same batch as the target sample; generating information on the data noise to be added; correcting, according to the information on said data noise, an initial gradient transfer value corresponding to the target sample, such that the corrected gradient transfer information corresponding to samples of different types in the sample set is consistent; and sending the gradient transfer information to a passive party of a joint training model. By means of this embodiment, there is no significant difference between the corrected gradient transfer information corresponding to positive and negative samples, thereby effectively protecting data security. (An illustrative gradient-correction sketch appears after this listing.)
    Type: Grant
    Filed: July 29, 2022
    Date of Patent: September 12, 2023
    Assignees: BEIJING BYTEDANCE NETWORK TECHNOLOGY CO., LTD., BYTEDANCE INC.
    Inventors: Jiankai Sun, Weihao Gao, Hongyi Zhang, Chong Wang, Junyuan Xie, Liangchao Wu, Xiaobing Liu
  • Publication number: 20230023253
    Abstract: Provided are a method and an apparatus for processing model parameters. The method comprises: obtaining a model parameter set to be sharded, wherein the model parameter set comprises a multi-dimensional array corresponding to a feature embedding; obtaining attribute information for a storage system used to store the model parameter set, wherein that storage system differs from the system on which the model corresponding to the parameter set runs; and storing the model parameter set in the storage system according to the attribute information.
    Type: Application
    Filed: August 12, 2022
    Publication date: January 26, 2023
    Inventors: Cheng CHEN, Peng ZHAO, Di WU, Junyuan XIE, Chenliaohui FANG, Longyijia LI, Long HUANG, Liangchao WU, Long CHANG, Lizhe ZHANG, Yixiang CHEN, Xiaobing LIU
  • Publication number: 20220391642
    Abstract: Provided are a method and an apparatus for evaluating a joint training model. A specific implementation of the method comprises: receiving a model evaluation data request sent by a target device, wherein the target device comprises a participant of the joint training model; acquiring a sample set matching the model evaluation data request, wherein the matching sample set is labeled data associated with the joint training model; and generating model evaluation data of the joint training model according to the matching sample set. By means of this implementation, an effectiveness metric of the joint training model can be shared without exposing the original sample data, providing a timely and effective reference for optimizing and improving the joint training model. (An illustrative evaluation sketch appears after this listing.)
    Type: Application
    Filed: August 12, 2022
    Publication date: December 8, 2022
    Inventors: Longyijia LI, Di WU, Cheng CHEN, Chenliaohui FANG, Peng ZHAO, Junyuan XIE, Yixiang CHEN, Liangchao WU, Long CHANG, Xiaobing LIU
  • Publication number: 20220394085
    Abstract: A network connection method and device are provided for a training participant of a joint training model, where the training participant operates in master-worker mode. The method includes: acquiring communication state information of a worker, the communication state information indicating the communication connection phase that the worker is in; acquiring communication state information of a target worker as target communication state information, where the target worker includes a peer node corresponding to the worker, and the peer node belongs to a different training participant of the joint training model; and resetting, in response to determining that the target communication state information does not match the communication state information of the worker, the communication connection phase that the worker is in.
    Type: Application
    Filed: August 12, 2022
    Publication date: December 8, 2022
    Inventors: Longyijia LI, Cheng CHEN, Di WU, Chenliaohui FANG, Peng ZHAO, Junyuan XIE, Yixiang CHEN, Liangchao WU, Long CHANG, Xiaobing LIU
  • Publication number: 20220385739
    Abstract: Provided are a method and an apparatus for generating prediction information, an electronic device, and a computer-readable medium. The method includes: generating anonymous user information of a target user based on first user characteristic information of the target user; and sending the anonymous user information to a second processing end, so that the second processing end generates prediction information based on the anonymous user information and second user characteristic information of the target user. Data interaction and sharing are realized while ensuring data privacy, thereby improving the accuracy of the prediction information. (An illustrative sketch of this two-party flow appears after this listing.)
    Type: Application
    Filed: August 5, 2022
    Publication date: December 1, 2022
    Inventors: Liangchao WU, Lizhe ZHANG, Junyuan XIE, Di WU, Jun ZHANG, Cheng CHEN, Longyijia LI, Chenliaohui FANG, Kan LIU, Long CHANG, Long HUANG, Yixiang CHEN, Xiang WU, Peng ZHAO, Xiaobing LIU
  • Publication number: 20220383054
    Abstract: Disclosed are a data protection method and apparatus, a server, and a medium. A particular embodiment of the method comprises: acquiring gradient-associated information corresponding, respectively, to a target sample that belongs to a binary classification sample set with an unbalanced distribution and to a reference sample from the same batch as the target sample; generating information on the data noise to be added; correcting, according to the information on said data noise, an initial gradient transfer value corresponding to the target sample, such that the corrected gradient transfer information corresponding to samples of different types in the sample set is consistent; and sending the gradient transfer information to a passive party of a joint training model. By means of this embodiment, there is no significant difference between the corrected gradient transfer information corresponding to positive and negative samples, thereby effectively protecting data security.
    Type: Application
    Filed: July 29, 2022
    Publication date: December 1, 2022
    Inventors: Jiankai Sun, Weihao Gao, Hongyi Zhang, Chong Wang, Junyuan Xie, Liangchao Wu, Xiaobing Liu
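
The following is a minimal, illustrative sketch of the identifier-alignment idea described in Publication No. 20230418470: each party keeps only the data whose identification information appears in both storage regions, provided the interaction meets a target requirement. The function names, the overlap-based requirement check, and the dictionary-based "storage regions" are assumptions made for this example, not the patented implementation.

    # Hypothetical sketch: align two parties' records by shared identifiers.
    def meets_target_requirement(local_ids, peer_ids, min_overlap=1):
        # Stand-in for "the interaction process meets a target requirement".
        return len(local_ids & peer_ids) >= min_overlap

    def align_regions(first_region, second_region_ids):
        # Keep only entries whose identification info appears in both regions.
        local_ids = set(first_region)
        if not meets_target_requirement(local_ids, second_region_ids):
            return {}                      # requirement not met: store nothing
        common = local_ids & second_region_ids
        return {k: first_region[k] for k in common}

    if __name__ == "__main__":
        first_region = {"id_1": "row_a", "id_2": "row_b", "id_3": "row_c"}
        second_region_ids = {"id_2", "id_3", "id_9"}
        print(align_regions(first_region, second_region_ids))  # id_2 and id_3 remain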
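
A rough sketch of the parameter-sharding idea in Patent No. 11809429 (and Publication No. 20230023253) follows: a feature-embedding array is split across an external storage system according to an attribute of that system, here simply its number of nodes. The modulo placement, the in-memory stand-in for the storage system, and the function names are assumptions for illustration only.

    import numpy as np

    def shard_embedding(embedding, num_nodes):
        # Split the rows of a [vocab, dim] embedding across num_nodes shards by row id.
        shards = {node: {} for node in range(num_nodes)}
        for row_id, vector in enumerate(embedding):
            shards[row_id % num_nodes][row_id] = vector   # simple modulo placement
        return shards

    def lookup(shards, row_id, num_nodes):
        # Fetch one embedding row from the shard that owns it.
        return shards[row_id % num_nodes][row_id]

    if __name__ == "__main__":
        table = np.random.rand(10, 4)                     # toy 10 x 4 embedding table
        shards = shard_embedding(table, num_nodes=3)
        assert np.allclose(lookup(shards, 7, 3), table[7])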
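
The sketch below illustrates the connection-phase check described in Patent No. 11811864 (and its related application, Publication No. 20220394085): a worker compares its own communication connection phase with the phase reported by its peer worker at the other training participant and resets when they do not match. The phase names and the reset-to-initial policy are hypothetical.

    from enum import Enum

    class Phase(Enum):
        INIT = 0
        HANDSHAKE = 1
        CONNECTED = 2

    def reconcile(worker_phase, target_phase):
        # Return the phase the worker should be in after comparing with its peer.
        if worker_phase == target_phase:
            return worker_phase        # phases match: keep the current connection phase
        return Phase.INIT              # mismatch: reset and restart the connection

    if __name__ == "__main__":
        print(reconcile(Phase.CONNECTED, Phase.CONNECTED))  # Phase.CONNECTED
        print(reconcile(Phase.CONNECTED, Phase.HANDSHAKE))  # Phase.INIT (reset)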
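
The next sketch illustrates the gradient-protection idea in Patent No. 11755691 (and Publication No. 20220383054): before a gradient transfer value is sent to the passive party, it is corrected with noise so that gradients for positive and negative samples are no longer distinguishable by magnitude. The norm-matching rule and Gaussian noise model used here are assumptions for illustration, not the patented correction.

    import numpy as np

    rng = np.random.default_rng(0)

    def corrected_gradient(target_grad, reference_grad, noise_scale=0.05):
        # Match the target gradient's norm to the reference norm, then add noise.
        ref_norm = np.linalg.norm(reference_grad) + 1e-12
        tgt_norm = np.linalg.norm(target_grad) + 1e-12
        rescaled = target_grad * (ref_norm / tgt_norm)    # remove the class-revealing norm gap
        noise = rng.normal(0.0, noise_scale * ref_norm, size=target_grad.shape)
        return rescaled + noise                           # value sent to the passive party

    if __name__ == "__main__":
        pos_grad = np.array([0.9, -0.8, 0.7])             # large: rare positive sample
        neg_grad = np.array([0.05, 0.04, -0.03])          # small: common negative (reference) sample
        sent = corrected_gradient(pos_grad, reference_grad=neg_grad)
        print(np.linalg.norm(sent), np.linalg.norm(neg_grad))  # now comparable in size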
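
For Publication No. 20220391642, the sketch below shows a participant that holds labeled samples answering a model evaluation data request by returning aggregate metrics only, never the raw samples. The request format and the particular metrics are assumptions made for this example.

    import numpy as np

    def evaluate_request(request, labels, scores):
        # Return only aggregate evaluation metrics for the joint model.
        threshold = request.get("threshold", 0.5)
        preds = (scores >= threshold).astype(int)
        return {
            "num_samples": int(labels.size),              # no individual rows leave this party
            "accuracy": float((preds == labels).mean()),
            "positive_rate": float(preds.mean()),
        }

    if __name__ == "__main__":
        labels = np.array([0, 1, 1, 0, 1])                # labeled data held by this participant
        scores = np.array([0.2, 0.8, 0.4, 0.1, 0.9])      # joint model's outputs on those samples
        print(evaluate_request({"threshold": 0.5}, labels, scores))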
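
Finally, a sketch of the two-party flow in Publication No. 20220385739: the first processing end derives anonymous user information from its own user characteristics and sends it to the second processing end, which combines it with its local characteristics to produce a prediction. Hashing the user identifier and passing only derived features is an assumed anonymization scheme chosen for illustration, and the fixed weight vector stands in for a trained model.

    import hashlib
    import numpy as np

    def anonymize(user_id, first_features):
        # First processing end: replace the raw id with a salted hash and keep only derived features.
        token = hashlib.sha256(("salt:" + user_id).encode()).hexdigest()
        return {"token": token, "embedding": (first_features - first_features.mean()).tolist()}

    def predict(anonymous_info, second_features):
        # Second processing end: score the user from both parties' feature views.
        joint = np.concatenate([np.asarray(anonymous_info["embedding"]), second_features])
        weights = np.linspace(0.1, 0.5, joint.size)       # stand-in for a trained model
        return float(1.0 / (1.0 + np.exp(-joint @ weights)))

    if __name__ == "__main__":
        info = anonymize("user_42", np.array([1.0, 0.2, -0.5]))
        print(predict(info, np.array([0.3, 0.7])))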