Patents by Inventor Zhaopeng Tu

Zhaopeng Tu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11977851
    Abstract: Embodiments of this disclosure disclose an information processing method, an apparatus, and a non-transitory computer-readable medium. The method includes: obtaining a target text sequence corresponding to to-be-processed text information; obtaining a context vector according to the target text sequence; determining a logical similarity corresponding to the target text sequence according to the context vector and the target text sequence; and encoding the target text sequence by using the logical similarity to obtain a text encoding result. In the embodiments of this disclosure, a context vector related to a discrete sequence is used to encode the discrete sequence, to strengthen the dependence between elements in the discrete sequence, thereby enhancing the performance of a neural network model and improving the learning capability of the model.
    Type: Grant
    Filed: February 24, 2021
    Date of Patent: May 7, 2024
    Assignee: Tencent Technology (Shenzhen) Company Limited
    Inventors: Zhaopeng Tu, Baosong Yang, Xing Wang
  • Patent number: 11928439
    Abstract: A translation method is provided, including: encoding to-be-processed text information to obtain a source vector representation sequence, the to-be-processed text information belonging to a first language; obtaining a source context vector corresponding to a first instance according to the source vector representation sequence, the source context vector indicating to-be-processed source content in the to-be-processed text information at the first instance; determining a translation vector according to the source vector representation sequence and the source context vector; and decoding the translation vector and the source context vector, to obtain target information of the first instance, the target information belonging to a second language.
    Type: Grant
    Filed: January 22, 2020
    Date of Patent: March 12, 2024
    Assignee: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED
    Inventors: Zhaopeng Tu, Hao Zhou, Shuming Shi
  • Patent number: 11900069
    Abstract: A translation model training method for a computer device includes obtaining a training sample set, the training sample set including a plurality of training samples. Each training sample is a training sample pair having a training input sample in a first language and a training output sample in a second language. The method also includes determining a disturbance sample set corresponding to each training sample in the training sample set, the disturbance sample set comprising at least one disturbance sample, and a semantic similarity between the disturbance sample and the corresponding training sample being greater than a first preset value; and training an initial translation model by using the plurality of training samples and the disturbance sample set corresponding to each training sample to obtain a target translation model, such that the training output sample remains the same for the disturbance sample and the corresponding training sample.
    Type: Grant
    Filed: August 7, 2020
    Date of Patent: February 13, 2024
    Assignee: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED
    Inventors: Yong Cheng, Zhaopeng Tu, Fandong Meng, Junjie Zhai, Yang Liu
  • Patent number: 11875220
    Abstract: The present disclosure describes a method, apparatus, and storage medium for generating network representation for a neural network. The method includes obtaining a source-side vector sequence corresponding to an input sequence.
    Type: Grant
    Filed: October 13, 2020
    Date of Patent: January 16, 2024
    Assignee: Tencent Technology (Shenzhen) Company Limited
    Inventors: Zhaopeng Tu, Baosong Yang, Tong Zhang
  • Patent number: 11853709
    Abstract: This application relates to a machine translation method performed at a computer device. The method includes: obtaining an original source text and a reconstructed source text; performing semantic encoding on the original source text, to obtain a source vector sequence; sequentially decoding the source vector sequence to obtain target vectors by performing decoding on the source vector sequence at a current time according to a word vector of a candidate target word determined at a previous time, determining a candidate target word at the current time according to a target vector at the current time, and forming a target vector sequence accordingly; performing reconstruction assessment on the source vector sequence and the target vector sequence using the reconstructed source text, to obtain reconstruction scores corresponding to the candidate target words; and generating a target text according to the reconstruction scores and the candidate target words.
    Type: Grant
    Filed: October 5, 2020
    Date of Patent: December 26, 2023
    Assignee: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED
    Inventors: Zhaopeng Tu, Longyue Wang
  • Patent number: 11574203
    Abstract: A content explanation method and apparatus are provided. The method includes: identifying, by a content explanation apparatus, an emotion of a user; when a negative emotion is identified indicating that the user is confused about delivered multimedia information, obtaining, by the content explanation apparatus, a target representation manner of target content in a target intelligence type, where the target content is the content about which the user is confused in the multimedia information delivered to the user by an information delivery apparatus associated with the content explanation apparatus; and presenting, by the content explanation apparatus, the target content to the user in the target representation manner.
    Type: Grant
    Filed: September 27, 2019
    Date of Patent: February 7, 2023
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Li Qian, Xueyan Huang, Zhaopeng Tu
  • Patent number: 11288458
    Abstract: The present application relates to natural language processing and discloses a sequence conversion method. The method includes: obtaining a source sequence from an input signal; converting the source sequence into one or more source context vectors; obtaining a target context vector corresponding to each source context vector; combining the target context vectors to obtain the target sequence; and outputting the target sequence. A weight vector is applied to a source context vector and a reference context vector to obtain a target context vector, wherein the weight of one or more elements associated with notional words in the source context vector, or the weight of a function word in the target context vector, is increased. The source sequence and the target sequence are representations of natural language contents. The claimed process improves faithfulness of converting the source sequence to the target sequence.
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: March 29, 2022
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Zhaopeng Tu, Xiaohua Liu, Zhengdong Lu, Hang Li
  • Patent number: 11275907
    Abstract: A machine translation method and an apparatus are provided. The method includes: obtaining, when translating a word fragment from a first language into a second language, a source representation vector of the word fragment. The source representation vector of the word fragment represents the word fragment in the first language. The method also includes obtaining a historical translation vector of the word fragment by querying historical translation information according to the source representation vector of the word fragment. The historical translation vector of the word fragment represents a historical translation situation corresponding to the word fragment. The method further includes translating the word fragment according to the historical translation vector of the word fragment.
    Type: Grant
    Filed: August 31, 2019
    Date of Patent: March 15, 2022
    Assignee: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED
    Inventors: Zhaopeng Tu, Shuming Shi, Tong Zhang
  • Patent number: 11270079
    Abstract: A translation-model-based training method is provided for a computer device. The method includes: inputting a source sentence to a translation model, to obtain a target sentence outputted by the translation model; determining a fidelity of the target sentence to the source sentence; using the target sentence and a reference sentence as input of the discriminator model and the fidelity as output of the discriminator model, and training the discriminator model to calculate a similarity between the target sentence and the reference sentence; outputting the similarity by using the discriminator model; and using the source sentence as input of the translation model, the target sentence as output of the translation model, and the similarity as a weight coefficient, and training the translation model to output the corresponding target sentence according to the input source sentence.
    Type: Grant
    Filed: November 15, 2019
    Date of Patent: March 8, 2022
    Assignee: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED
    Inventors: Zhaopeng Tu, Xiang Kong, Shuming Shi, Tong Zhang
  • Patent number: 11244108
    Abstract: Embodiments of the present invention provide a translation method and apparatus, and relate to the field of machine translation. The method includes: obtaining a to-be-translated sentence, where the to-be-translated sentence is a sentence expressed in a first language; determining a first named entity set in the to-be-translated sentence, and an entity type of each first named entity in the first named entity set; determining, based on the first named entity set and the entity type of each first named entity, a second named entity set expressed in a second language; determining a source semantic template of the to-be-translated sentence, and obtaining a target semantic template corresponding to the source semantic template from a semantic template correspondence; and determining a target translation sentence based on the second named entity set and the target semantic template.
    Type: Grant
    Filed: June 25, 2019
    Date of Patent: February 8, 2022
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Zhaopeng Tu, Longyue Wang, Jinhua Du
  • Patent number: 11132516
    Abstract: A sequence conversion method includes receiving a source sequence, converting the source sequence into a source vector representation sequence, obtaining at least two candidate target sequences and a translation probability value of each of the at least two candidate target sequences according to the source vector representation sequence, adjusting the translation probability value of each candidate target sequence, selecting an output target sequence from the at least two candidate target sequences according to the adjusted translation probability value of each candidate target sequence, and outputting the output target sequence. Hence, the fidelity of the target sequence to the source sequence can be improved during sequence conversion.
    Type: Grant
    Filed: April 26, 2019
    Date of Patent: September 28, 2021
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Zhaopeng Tu, Lifeng Shang, Xiaohua Liu, Hang Li
  • Publication number: 20210201147
    Abstract: Embodiments of this application disclose a neural network model training method, a machine translation method, a computer device, and a storage medium. The method includes: obtaining a training sample set including a training sample and a standard tag vector corresponding to the training sample; inputting the training sample into a neural network model including a plurality of attention networks to obtain a feature fusion vector; and obtaining a predicted tag vector according to the feature fusion vector, and adjusting a model parameter of the neural network model until a convergence condition is met, to obtain a target neural network model.
    Type: Application
    Filed: March 11, 2021
    Publication date: July 1, 2021
    Inventors: Zhaopeng TU, Jian LI, Xing WANG, Longyue WANG
  • Publication number: 20210182501
    Abstract: Embodiments of this disclosure disclose an information processing method, an apparatus, and a non-transitory computer-readable medium. The method includes: obtaining a target text sequence corresponding to to-be-processed text information; obtaining a context vector according to the target text sequence; determining a logical similarity corresponding to the target text sequence according to the context vector and the target text sequence; and encoding the target text sequence by using the logical similarity to obtain a text encoding result. In the embodiments of this disclosure, a context vector related to a discrete sequence is used to encode the discrete sequence, to strengthen the dependence between elements in the discrete sequence, thereby enhancing the performance of a neural network model and improving the learning capability of the model.
    Type: Application
    Filed: February 24, 2021
    Publication date: June 17, 2021
    Applicant: Tencent Technology (Shenzhen) Company Limited
    Inventors: Zhaopeng TU, Baosong YANG, Xing WANG
  • Publication number: 20210182504
    Abstract: A text translation method includes: obtaining a to-be-translated text sequence; encoding the to-be-translated text sequence, to obtain a first hidden state sequence; obtaining a first state vector; generating a second hidden state sequence according to the first state vector and the first hidden state sequence; generating a context vector corresponding to a current word according to the second hidden state sequence and the first state vector; and determining a second target word according to the context vector, the first state vector, and a first target word. The first state vector corresponds to a predecessor word of the current word; the current word is a to-be-translated word in the source language text; and the predecessor word is a word that has already been translated in the source language text. The first target word is a translation result of the predecessor word, and the second target word is a translation result of the current word.
    Type: Application
    Filed: February 24, 2021
    Publication date: June 17, 2021
    Inventors: Zhaopeng TU, Xinwei GENG, Longyue WANG, Xing WANG
  • Publication number: 20210174170
    Abstract: Embodiments of this application disclose a sequence model processing method and apparatus, to improve the task performance of a sequence model. The method includes: inputting a source sequence into an encoder side of a sequence model, the encoder side including a self-attention encoder and a temporal encoder; encoding the source sequence by using the temporal encoder, to obtain a first encoding result, the first encoding result including time series information obtained by performing time series modeling on the source sequence; encoding the source sequence by using the self-attention encoder, to obtain a second encoding result; inputting a target sequence, the first encoding result, and the second encoding result into a decoder side of the sequence model; and decoding the target sequence, the first encoding result, and the second encoding result by using the decoder side, and outputting the decoding result.
    Type: Application
    Filed: February 22, 2021
    Publication date: June 10, 2021
    Inventors: Zhaopeng TU, Jie HAO, Xing WANG, Longyue WANG
  • Publication number: 20210042603
    Abstract: The present disclosure describes a method, apparatus, and storage medium for generating network representation for a neural network. The method includes obtaining a source-side vector sequence corresponding to an input sequence.
    Type: Application
    Filed: October 13, 2020
    Publication date: February 11, 2021
    Applicant: Tencent Technology (Shenzhen) Company Limited
    Inventors: Zhaopeng Tu, Baosong Yang, Tong Zhang
  • Patent number: 10909315
    Abstract: A syntax analysis method and apparatus are disclosed. The method includes: obtaining a source language sentence that is a translation of a target language sentence (S110); determining instances of state transition for the target language sentence according to the source language sentence and a correspondence between words of the target language sentence and words of the source language sentence (S120); and generating a syntax tree of the target language sentence according to the instances of state transition for the target language sentence (S130). The syntax analysis method and apparatus can improve efficiency of syntax analysis.
    Type: Grant
    Filed: January 17, 2018
    Date of Patent: February 2, 2021
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Zhaopeng Tu, Xiao Chen, Wenbin Jiang
  • Publication number: 20210019479
    Abstract: This application relates to a machine translation method performed at a computer device. The method includes: obtaining an original source text and a reconstructed source text; performing semantic encoding on the original source text, to obtain a source vector sequence; sequentially decoding the source vector sequence to obtain target vectors by performing decoding on the source vector sequence at a current time according to a word vector of a candidate target word determined at a previous time, determining a candidate target word at the current time according to a target vector at the current time, and forming a target vector sequence accordingly; performing reconstruction assessment on the source vector sequence and the target vector sequence using the reconstructed source text, to obtain reconstruction scores corresponding to the candidate target words; and generating a target text according to the reconstruction scores and the candidate target words.
    Type: Application
    Filed: October 5, 2020
    Publication date: January 21, 2021
    Inventors: Zhaopeng TU, Longyue WANG
  • Publication number: 20200364412
    Abstract: A translation model training method for a computer device includes obtaining a training sample set, the training sample set including a plurality of training samples. Each training sample is a training sample pair having a training input sample in a first language and a training output sample in a second language. The method also includes determining a disturbance sample set corresponding to each training sample in the training sample set, the disturbance sample set comprising at least one disturbance sample, and a semantic similarity between the disturbance sample and the corresponding training sample being greater than a first preset value; and training an initial translation model by using the plurality of training samples and the disturbance sample set corresponding to each training sample to obtain a target translation model, such that the training output sample remains the same for the disturbance sample and the corresponding training sample.
    Type: Application
    Filed: August 7, 2020
    Publication date: November 19, 2020
    Inventors: Yong CHENG, Zhaopeng TU, Fandong MENG, Junjie ZHAI, Yang LIU
  • Publication number: 20200226328
    Abstract: A translation method is provided, including: encoding to-be-processed text information to obtain a source vector representation sequence, the to-be-processed text information belonging to a first language; obtaining a source context vector corresponding to a first instance according to the source vector representation sequence, the source context vector indicating to-be-processed source content in the to-be-processed text information at the first instance; determining a translation vector according to the source vector representation sequence and the source context vector; and decoding the translation vector and the source context vector, to obtain target information of the first instance, the target information belonging to a second language.
    Type: Application
    Filed: January 22, 2020
    Publication date: July 16, 2020
    Inventors: Zhaopeng TU, Hao ZHOU, Shuming SHI
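
Several of the abstracts above (e.g., patent numbers 11928439 and 11275907) refer to computing a "source context vector" from a source vector representation sequence. As a generic illustration only — not the claimed methods of any patent listed here, and with all function and variable names hypothetical — the following sketch shows how attention weights over a source sequence can produce such a context vector:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def source_context_vector(source_vectors, decoder_state):
    """Attention-weighted source context vector (illustrative only).

    source_vectors: (n, d) source vector representation sequence
    decoder_state:  (d,)   current decoder hidden state
    """
    scores = source_vectors @ decoder_state  # dot-product attention scores, shape (n,)
    weights = softmax(scores)                # attention distribution over source positions
    return weights @ source_vectors          # weighted sum over source vectors, shape (d,)

# Toy example: three source positions, 4-dimensional vectors.
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))
s = rng.normal(size=4)
c = source_context_vector(H, s)
print(c.shape)  # (4,)
```

In an encoder-decoder translation model of the kind these abstracts describe, a vector like `c` would be combined with the decoder state to predict the next target word; the patented methods add further structure (historical translation vectors, reconstruction scoring, temporal encoding) on top of this basic pattern.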