Patents by Inventor Shaoxun SU

Shaoxun SU has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11941035
    Abstract: The present application provides a summary generation model training method, apparatus, electronic device, and non-transitory computer readable storage medium. The summary generation model training method includes: obtaining a first vector set, where vectors in the first vector set are original encoding vectors which have been trained; generating a second vector set based on the first vector set, where the number of vectors in the second vector set is greater than the number of the vectors in the first vector set, and each vector in the second vector set is determined according to one or more vectors in the first vector set; and taking the vectors included in the first vector set and the vectors included in the second vector set as input encoding vectors to perform model training to obtain a summary generation model.
    Type: Grant
    Filed: December 20, 2021
    Date of Patent: March 26, 2024
    Assignee: BOE Technology Group Co., Ltd.
    Inventor: Shaoxun Su
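The vector-augmentation step described in the abstract can be illustrated with a small sketch. The patent does not specify how a new vector is "determined according to one or more vectors in the first vector set"; the element-wise mean of one or two randomly chosen source vectors used below is only one plausible, hypothetical reading.

```python
import random

def augment_encoding_vectors(first_set, num_new, seed=0):
    """Derive a second, larger set of encoding vectors from a trained first set.

    Each new vector here is the element-wise mean of one or two vectors drawn
    from the first set -- an assumed construction, since the abstract only says
    each is determined from one or more vectors in the first set.
    """
    rng = random.Random(seed)
    second_set = []
    for _ in range(num_new):
        a = rng.choice(first_set)
        b = rng.choice(first_set)  # may equal a, i.e. derived from one vector
        second_set.append([(x + y) / 2.0 for x, y in zip(a, b)])
    return second_set

# Toy "trained" original encoding vectors (dimension 2 for illustration).
first_set = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
second_set = augment_encoding_vectors(first_set, num_new=5)

# Per the claim, the union of both sets serves as the input encoding
# vectors for training the summary generation model.
input_vectors = first_set + second_set
```

Note that `num_new` is chosen larger than `len(first_set)`, matching the requirement that the second vector set contain more vectors than the first.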
  • Publication number: 20230386470
    Abstract: A speech instruction recognition method, an electronic device, and a non-transitory computer readable storage medium are provided. The speech instruction recognition method comprises: acquiring a target speech; processing the target speech to obtain a target speech vector corresponding to the target speech; performing speech recognition on the target speech to obtain a target speech text of the target speech, and processing the target speech text to obtain a target text vector corresponding to the target speech text; and inputting the target speech vector and the target text vector to a pre-trained instruction recognition model to obtain an instruction category corresponding to the target speech.
    Type: Application
    Filed: January 6, 2021
    Publication date: November 30, 2023
    Inventor: Shaoxun SU
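The recognition pipeline in this abstract — fusing a speech vector with a text vector and classifying the result — can be sketched minimally as below. The abstract only states that both vectors are input to a pre-trained model; the fusion by concatenation, the linear scoring, the weights, and the category names are all assumptions for illustration.

```python
def classify_instruction(speech_vec, text_vec, weights, categories):
    """Score each instruction category from the fused speech and text vectors.

    Fusion here is simple concatenation, and the "pre-trained instruction
    recognition model" is stood in for by a toy linear layer (one weight row
    per category) -- both hypothetical choices, not specified by the patent.
    """
    fused = speech_vec + text_vec  # concatenate the two input vectors
    scores = [sum(w * x for w, x in zip(row, fused)) for row in weights]
    best = max(range(len(scores)), key=scores.__getitem__)
    return categories[best]

categories = ["volume_up", "volume_down", "play"]
weights = [  # hypothetical "pre-trained" weights, one row per category
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 1.0],
]
label = classify_instruction([0.1, 0.9], [0.2, 0.8], weights, categories)
# with these toy inputs and weights, "play" scores highest
```

In practice the speech vector would come from an acoustic encoder and the text vector from an encoder over the recognized transcript, but those components are outside the scope of this sketch.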
  • Publication number: 20220318289
    Abstract: The present application provides a summary generation model training method, apparatus, electronic device, and non-transitory computer readable storage medium. The summary generation model training method includes: obtaining a first vector set, where vectors in the first vector set are original encoding vectors which have been trained; generating a second vector set based on the first vector set, where the number of vectors in the second vector set is greater than the number of the vectors in the first vector set, and each vector in the second vector set is determined according to one or more vectors in the first vector set; and taking the vectors included in the first vector set and the vectors included in the second vector set as input encoding vectors to perform model training to obtain a summary generation model.
    Type: Application
    Filed: December 20, 2021
    Publication date: October 6, 2022
    Inventor: Shaoxun SU
  • Publication number: 20220318506
    Abstract: A method for event extraction according to the disclosure includes: processing an object text using a preset extraction model to determine event information of the object text; wherein the event information includes an event element, and an event type and a role corresponding to the event element; and the extraction model includes a classification layer and an output layer; the classification layer is configured to determine a token attribute of a token in the object text; the token attribute includes whether the token is a start token of the event element of any event type and any role, and whether the token is an end token of the event element of any event type and any role; and the output layer is configured to determine the event element according to the token attribute of the token, and determine the event type and the role corresponding to the event element.
    Type: Application
    Filed: September 28, 2020
    Publication date: October 6, 2022
    Inventors: Bingqian WANG, Shaoxun SU, Tianxin LIANG
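The decoding step this abstract describes — a classification layer tagging each token as a start or end token of an event element for some (event type, role), and an output layer assembling elements from those tags — can be sketched as follows. The pairing of each start tag with the nearest subsequent end tag of the same type and role is an assumed decoding strategy; the patent does not specify it.

```python
def extract_events(tokens, start_tags, end_tags):
    """Decode event elements from per-token start/end tags.

    start_tags[i] / end_tags[i] hold the (event_type, role) pair for which
    token i is a start / end token, or None. These lists stand in for the
    classification layer's output. The output layer is modeled here by
    pairing each start with the nearest matching end -- an assumption.
    """
    events = []
    for i, tag in enumerate(start_tags):
        if tag is None:
            continue
        for j in range(i, len(tokens)):
            if end_tags[j] == tag:  # nearest end with the same type and role
                events.append({
                    "element": " ".join(tokens[i:j + 1]),
                    "event_type": tag[0],
                    "role": tag[1],
                })
                break
    return events

# Hypothetical object text and tag sequences for illustration.
tokens = ["Acme", "Corp", "acquired", "Beta", "Inc"]
start_tags = [("acquisition", "acquirer"), None, None,
              ("acquisition", "target"), None]
end_tags = [None, ("acquisition", "acquirer"), None,
            None, ("acquisition", "target")]
events = extract_events(tokens, start_tags, end_tags)
# yields "Acme Corp" as acquirer and "Beta Inc" as target
```

Tagging both a start and an end per (event type, role) lets the output layer recover multi-token event elements and their types and roles jointly, as the abstract describes.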