Patents by Inventor Yuxin Ding

Yuxin Ding has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240233707
    Abstract: A method includes receiving distillation data including a plurality of out-of-domain training utterances. For each out-of-domain training utterance in the distillation data, the method includes generating a corresponding augmented out-of-domain training utterance and generating, using a teacher ASR model trained on data from a target domain, a pseudo-label for the augmented utterance. The method also includes distilling a student ASR model from the teacher ASR model by training the student on the augmented out-of-domain training utterances paired with the pseudo-labels generated by the teacher. (An illustrative sketch of this distillation loop follows this entry.)
    Type: Application
    Filed: October 17, 2023
    Publication date: July 11, 2024
    Applicant: Google LLC
    Inventors: Tien-Ju Yang, You-Chi Cheng, Shankar Kumar, Jared Lichtarge, Ehsan Amid, Yuxin Ding, Rajiv Mathews, Mingqing Chen
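A minimal, hypothetical Python sketch of the loop described in the abstract above: augment each out-of-domain utterance, have a target-domain teacher produce a pseudo-label for the augmented audio, and train the student on the resulting pair. The names `Utterance`, `augment`, `TeacherASR`, and `StudentASR` are illustrative placeholders, not anything taken from the application itself.

```python
# Hypothetical sketch of the distillation loop described in the abstract.
# Utterance, augment, TeacherASR, and StudentASR are illustrative
# placeholders, not names taken from the application.
from dataclasses import dataclass
from typing import List


@dataclass
class Utterance:
    audio: List[float]  # raw waveform samples


def augment(utt: Utterance) -> Utterance:
    """Toy augmentation (stand-in for noise or SpecAugment-style transforms)."""
    return Utterance(audio=[x + 0.01 for x in utt.audio])


class TeacherASR:
    """Stand-in for an ASR model trained on target-domain data."""

    def transcribe(self, utt: Utterance) -> str:
        return "pseudo transcript"  # placeholder pseudo-label


class StudentASR:
    """Stand-in for the student model being distilled."""

    def train_step(self, utt: Utterance, label: str) -> float:
        return 0.0  # placeholder loss


def distill(teacher: TeacherASR, student: StudentASR,
            out_of_domain: List[Utterance]) -> None:
    # For each out-of-domain utterance: augment it, have the teacher
    # produce a pseudo-label for the augmented audio, then train the
    # student on the (augmented utterance, pseudo-label) pair.
    for utt in out_of_domain:
        aug = augment(utt)
        pseudo_label = teacher.transcribe(aug)
        student.train_step(aug, pseudo_label)


distill(TeacherASR(), StudentASR(), [Utterance(audio=[0.0] * 16000)])
```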
  • Publication number: 20240194192
    Abstract: Information can be distilled from a global automatic speech recognition (ASR) model to a client ASR model. Many implementations use an RNN-T model as the ASR model, where the global ASR model includes a global encoder, a joint network, and a prediction network, and where the client ASR model includes a client encoder, the joint network, and the prediction network. Various implementations use principal component analysis (PCA) while training the global ASR model to learn a mean vector and a set of principal components corresponding to the global ASR model. Additional or alternative implementations train the client ASR model to generate one or more predicted coefficients of the global ASR model. (An illustrative sketch of the PCA step follows this entry.)
    Type: Application
    Filed: December 9, 2022
    Publication date: June 13, 2024
    Inventors: Ehsan Amid, Rajiv Mathews, Shankar Kumar, Jared Lichtarge, Mingqing Chen, Tien-Ju Yang, Yuxin Ding
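A hedged NumPy sketch of the PCA idea in the abstract above: learn a mean vector and a set of principal components from vectors gathered from the global model, then reconstruct a new vector from coefficients over that basis (in the described approach the client model would predict those coefficients). All data here is random and purely illustrative.

```python
# Hedged NumPy sketch of the PCA step: learn a mean vector and principal
# components, then reconstruct a vector from coefficients over the basis.
import numpy as np

rng = np.random.default_rng(0)

# Pretend these are vectors collected from the global ASR model
# (e.g. flattened encoder parameters or activations): (num_samples, dim).
samples = rng.normal(size=(256, 32))

# Mean vector and principal components via SVD of the centered data.
mean = samples.mean(axis=0)
_, _, components = np.linalg.svd(samples - mean, full_matrices=False)
top_k = components[:4]  # keep the top 4 principal components

# A client-side model would *predict* these coefficients; here we simply
# project a new vector onto the basis to obtain them.
new_vector = rng.normal(size=32)
coefficients = top_k @ (new_vector - mean)

# Reconstruction from the mean plus the coefficient-weighted components.
reconstruction = mean + coefficients @ top_k
print("reconstruction error:", np.linalg.norm(new_vector - reconstruction))
```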
  • Publication number: 20240135918
    Abstract: A method includes receiving distillation data including a plurality of out-of-domain training utterances. For each out-of-domain training utterance in the distillation data, the method includes generating a corresponding augmented out-of-domain training utterance and generating, using a teacher ASR model trained on data from a target domain, a pseudo-label for the augmented utterance. The method also includes distilling a student ASR model from the teacher ASR model by training the student on the augmented out-of-domain training utterances paired with the pseudo-labels generated by the teacher. (An illustrative sketch of building these training pairs follows this entry.)
    Type: Application
    Filed: October 16, 2023
    Publication date: April 25, 2024
    Applicant: Google LLC
    Inventors: Tien-Ju Yang, You-Chi Cheng, Shankar Kumar, Jared Lichtarge, Ehsan Amid, Yuxin Ding, Rajiv Mathews, Mingqing Chen
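A complementary, assumed-name sketch for the abstract above (which describes the same pairing of augmented out-of-domain utterances with teacher pseudo-labels): construct the set of (augmented utterance, pseudo-label) pairs that the student is trained on. Nothing here is taken from the application itself.

```python
# Assumed-name sketch: build (augmented out-of-domain utterance,
# teacher pseudo-label) pairs for student training.
from typing import Callable, Iterable, List, Tuple


def build_distillation_pairs(
    utterances: Iterable[List[float]],
    augment: Callable[[List[float]], List[float]],
    teacher_transcribe: Callable[[List[float]], str],
) -> List[Tuple[List[float], str]]:
    """Return (augmented utterance, pseudo-label) pairs for student training."""
    pairs = []
    for audio in utterances:
        aug = augment(audio)
        pairs.append((aug, teacher_transcribe(aug)))
    return pairs


# Toy stand-ins for the augmentation function and the teacher model.
pairs = build_distillation_pairs(
    utterances=[[0.0] * 8, [0.1] * 8],
    augment=lambda audio: [x + 0.01 for x in audio],
    teacher_transcribe=lambda audio: "pseudo transcript",
)
for aug_audio, pseudo_label in pairs:
    pass  # a student training step would consume each pair here
```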
  • Patent number: D1036920
    Type: Grant
    Filed: May 26, 2022
    Date of Patent: July 30, 2024
    Assignee: Foshan Shunde Midea Electrical Heating Appliances Manufacturing Co., Ltd.
    Inventors: Yuxin Cao, Huiming Wu, Jie Ding, Guanhong Ru
  • Patent number: D1038680
    Type: Grant
    Filed: May 5, 2022
    Date of Patent: August 13, 2024
    Assignee: Foshan Shunde Midea Electrical Heating Appliances Manufacturing Co., Ltd.
    Inventors: Jie Ding, Huiming Wu, Yuxin Cao, Guanhong Ru