Patents by Inventor Zailiang YU

Zailiang YU has filed patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11501171
    Abstract: Disclosed are an automatic compression method and platform for pre-trained language models based on multilevel knowledge distillation. The method comprises the following steps: step 1, constructing multilevel knowledge distillation and distilling the knowledge structure of a large model at three levels: the self-attention units, the hidden-layer states, and the embedding layer; step 2, training a meta-learning knowledge distillation network to generate a general compression architecture for a plurality of pre-trained language models; and step 3, searching for the optimal compression structure with an evolutionary algorithm.
    Type: Grant
    Filed: December 20, 2021
    Date of Patent: November 15, 2022
    Assignee: ZHEJIANG LAB
    Inventors: Hongsheng Wang, Enping Wang, Zailiang Yu
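    The abstract's step 1 combines distillation signals at three levels of the teacher model. As a minimal illustrative sketch (not the patented implementation), the combined objective can be written as a weighted sum of per-level losses between teacher and student tensors; the dictionary keys, shapes, and MSE choice below are assumptions for illustration only:

    ```python
    import numpy as np

    def mse(a, b):
        """Mean-squared error between two arrays of the same shape."""
        return float(np.mean((a - b) ** 2))

    def multilevel_distillation_loss(teacher, student, weights=(1.0, 1.0, 1.0)):
        """Sum of distillation losses at the three levels named in the
        abstract: self-attention maps, hidden-layer states, and embeddings."""
        w_att, w_hid, w_emb = weights
        return (w_att * mse(teacher["attention"], student["attention"])
                + w_hid * mse(teacher["hidden"], student["hidden"])
                + w_emb * mse(teacher["embedding"], student["embedding"]))

    # Toy tensors standing in for real model activations.
    rng = np.random.default_rng(0)
    teacher = {"attention": rng.normal(size=(2, 4, 4)),   # (batch, seq, seq)
               "hidden": rng.normal(size=(2, 4, 8)),      # (batch, seq, dim)
               "embedding": rng.normal(size=(2, 4, 8))}
    # A student whose activations slightly deviate from the teacher's.
    student = {k: v + 0.1 * rng.normal(size=v.shape) for k, v in teacher.items()}
    loss = multilevel_distillation_loss(teacher, student)
    ```

    A perfectly matched student yields a loss of zero; the weights let the three levels be traded off against one another.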
  • Publication number: 20220198276
    Abstract: Disclosed are an automatic compression method and platform for pre-trained language models based on multilevel knowledge distillation. The method comprises the following steps: step 1, constructing multilevel knowledge distillation and distilling the knowledge structure of a large model at three levels: the self-attention units, the hidden-layer states, and the embedding layer; step 2, training a meta-learning knowledge distillation network to generate a general compression architecture for a plurality of pre-trained language models; and step 3, searching for the optimal compression structure with an evolutionary algorithm.
    Type: Application
    Filed: December 20, 2021
    Publication date: June 23, 2022
    Inventors: Hongsheng WANG, Enping WANG, Zailiang YU
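    Step 3 of the abstract searches for an optimal compression structure with an evolutionary algorithm. The sketch below shows the generic shape of such a search over binary layer-keep masks; the fitness function, mutation rate, and population settings are placeholders (a real system would score each candidate by evaluating the corresponding compressed model), not the patented method:

    ```python
    import random

    def fitness(arch):
        """Hypothetical fitness: prefer moderate compression, with a small
        penalty per retained layer. Stands in for evaluating a student model."""
        kept = sum(arch)
        balance = -abs(kept - len(arch) // 2)
        return balance - 0.01 * kept

    def mutate(arch, rate=0.2):
        """Flip each keep/drop bit with probability `rate`."""
        return [bit ^ (random.random() < rate) for bit in arch]

    def evolutionary_search(n_layers=12, pop_size=20, generations=30, seed=0):
        """Evolve a population of layer-keep masks toward higher fitness."""
        random.seed(seed)
        pop = [[random.randint(0, 1) for _ in range(n_layers)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]           # keep the best half
            pop = parents + [mutate(random.choice(parents))
                             for _ in parents]       # refill by mutation
        return max(pop, key=fitness)

    best = evolutionary_search()
    ```

    Selection keeps the fitter half of each generation and refills the population with mutated copies, so the search converges toward masks the fitness function favors.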