Patents by Inventor Junfeng Wen

Junfeng Wen has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230385694
    Abstract: Model training systems collaborate on model training without revealing their respective private data sets. For each private data set, a set of client weights is learned over a set of computer models that are also learned during training. Inference for a particular private data set is computed as a mixture of the computer model parameters according to the client weights. During training, each iteration updates the client weights in one step based on how well the sampled models represent the private data set. In another step, gradients are determined for each sampled model and may be weighted according to the client weight for that model, relatively increasing the gradient contribution of a private data set for model parameters that correspond more closely to that private data set. (A minimal illustrative sketch follows this entry.)
    Type: Application
    Filed: May 26, 2023
    Publication date: November 30, 2023
    Inventors: Jesse Cole Cresswell, Brendan Leigh Ross, Ka Ho Yenson Lau, Junfeng Wen, Yi Sui
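    Illustrative sketch: a minimal, hedged example of the alternating update described in the abstract. This is not the patented implementation; the linear models, losses, and update rules below are assumptions chosen only to show the two steps (weight update, then weighted gradients) and mixture-based inference.
```python
# Hypothetical sketch: one round of the alternating update, using K shared
# linear models and per-client mixture ("client") weights. All names are illustrative.
import numpy as np

rng = np.random.default_rng(0)
K, d = 3, 5                                        # number of shared models, feature dimension
models = [rng.normal(size=d) for _ in range(K)]    # parameters of each shared model

def loss(theta, X, y):
    return np.mean((X @ theta - y) ** 2)

def grad(theta, X, y):
    return 2 * X.T @ (X @ theta - y) / len(y)

def update_client_weights(X, y, temperature=1.0):
    # Step 1: weight each model by how well it explains this client's private data.
    losses = np.array([loss(theta, X, y) for theta in models])
    logits = -losses / temperature
    w = np.exp(logits - logits.max())
    return w / w.sum()

def client_gradients(X, y, w, lr=0.1):
    # Step 2: each model's gradient is scaled by this client's weight for it, so
    # models that better match the client's data receive a larger contribution.
    return [lr * w[k] * grad(models[k], X, y) for k in range(K)]

# Toy private data set for one client.
X = rng.normal(size=(32, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=32)

w = update_client_weights(X, y)
for k, g in enumerate(client_gradients(X, y, w)):
    models[k] -= g

# Inference for this client mixes model parameters according to its client weights.
theta_client = sum(w[k] * models[k] for k in range(K))
prediction = X[:1] @ theta_client
```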
  • Patent number: 11755916
    Abstract: An improved computer-implemented method, along with corresponding systems and computer-readable media, for improving the performance of a deep neural network is provided to mitigate the effects of catastrophic forgetting in neural network learning. In an embodiment, the method includes storing, in memory, logits of a set of samples from a previous set of tasks (D1), and maintaining classification information from the previous set of tasks by using those logits for matching during training on a new set of tasks (D2). (See the illustrative sketch after this entry.)
    Type: Grant
    Filed: September 5, 2019
    Date of Patent: September 12, 2023
    Assignee: ROYAL BANK OF CANADA
    Inventors: Yanshuai Cao, Ruitong Huang, Junfeng Wen
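    Illustrative sketch: a hedged example of logit matching as described in the abstract, not the patent's exact method. The architecture, the MSE matching penalty, and the coefficient are assumptions; the point is that stored D1 logits constrain the model while it trains on D2.
```python
# Hypothetical sketch: store logits of a few samples from a previous task D1, then
# keep a logit-matching penalty while training on a new task D2 to limit forgetting.
import torch
import torch.nn.functional as F

model = torch.nn.Linear(20, 4)           # toy classifier; architecture is illustrative
opt = torch.optim.SGD(model.parameters(), lr=0.05)

# Memory: a small buffer of D1 inputs and the logits the model produced on them.
x_mem = torch.randn(16, 20)
with torch.no_grad():
    logits_mem = model(x_mem).clone()    # assumed to be stored after training on D1

def train_step(x_new, y_new, match_coef=1.0):
    opt.zero_grad()
    task_loss = F.cross_entropy(model(x_new), y_new)      # loss on the new task D2
    match_loss = F.mse_loss(model(x_mem), logits_mem)     # keep the stored logits matched
    (task_loss + match_coef * match_loss).backward()
    opt.step()

# One illustrative step on a batch from the new task D2.
train_step(torch.randn(8, 20), torch.randint(0, 4, (8,)))
```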
  • Publication number: 20230153461
    Abstract: A model training system protects against leakage of private data in a federated learning environment by training a private model in conjunction with a proxy model. The proxy model is trained with protections for the private data and may be shared with other participants. Proxy models from other participants are used to train the private model, enabling the private model to benefit from parameters based on other models' private data without privacy leakage. The proxy model may be trained with a differentially private algorithm that quantifies a privacy cost for the proxy model, enabling a participant to measure the potential exposure of its private data and to drop out. Iterations may include training the proxy and private models and then mixing the proxy models with other participants. The mixing may include updating and applying a bias to account for the weights of other participants in the received proxy models. (See the illustrative sketch after this entry.)
    Type: Application
    Filed: November 15, 2022
    Publication date: May 18, 2023
    Inventors: Shivam Kalra, Jesse Cole Cresswell, Junfeng Wen, Maksims Volkovs, Hamid R. Tizhoosh
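    Illustrative sketch: a hedged, structural example of one round in the style described by the abstract. It is not the patented algorithm: the toy linear models, the DP-SGD-style clipped and noised proxy update, the distillation-like pull toward received proxies, and the simple averaging mix are all assumptions for illustration.
```python
# Hypothetical sketch: each participant keeps a private model and a proxy model;
# only proxies are shared. The proxy gets a clipped, noised gradient update, and
# received proxies are mixed (here, averaged) before the next round.
import numpy as np

rng = np.random.default_rng(1)
d, n_clients = 5, 3
clip_norm, noise_mult, lr = 1.0, 1.0, 0.1

def dp_gradient(per_example_grads):
    # Clip each example's gradient, average, then add Gaussian noise.
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    mean = np.mean(clipped, axis=0)
    noise = rng.normal(scale=noise_mult * clip_norm / len(clipped), size=d)
    return mean + noise

# One round with toy linear models and synthetic private data per participant.
private = [rng.normal(size=d) for _ in range(n_clients)]
proxy = [rng.normal(size=d) for _ in range(n_clients)]
data = [(rng.normal(size=(16, d)), rng.normal(size=16)) for _ in range(n_clients)]

for i in range(n_clients):
    X, y = data[i]
    grads = [2 * (x @ proxy[i] - t) * x for x, t in zip(X, y)]  # per-example gradients
    proxy[i] -= lr * dp_gradient(grads)                         # differentially private proxy update
    # The private model trains on private data and is additionally pulled toward
    # proxies received from other participants (no raw data is exchanged).
    others = np.mean([proxy[j] for j in range(n_clients) if j != i], axis=0)
    private[i] -= lr * (2 * X.T @ (X @ private[i] - y) / len(y)
                        + 0.1 * (private[i] - others))

# Mixing step: participants combine proxies before the next round.
mixed = np.mean(proxy, axis=0)
proxy = [mixed.copy() for _ in range(n_clients)]
```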
  • Publication number: 20200074305
    Abstract: An improved computer-implemented method, along with corresponding systems and computer-readable media, for improving the performance of a deep neural network is provided to mitigate the effects of catastrophic forgetting in neural network learning. In an embodiment, the method includes storing, in memory, logits of a set of samples from a previous set of tasks (D1), and maintaining classification information from the previous set of tasks by using those logits for matching during training on a new set of tasks (D2).
    Type: Application
    Filed: September 5, 2019
    Publication date: March 5, 2020
    Inventors: Yanshuai Cao, Ruitong Huang, Junfeng Wen
  • Patent number: 8933745
    Abstract: A transconductance-enhancing passive frequency mixer comprises a transconductance amplification stage, a frequency mixing stage, and an output transresistance amplifier. The transconductance amplification stage has a pre-amplification transconductance-enhancing structure, so the transconductance is greatly enhanced and the same transconductance value is obtained at a lower bias current. A radio-frequency current is modulated by the frequency mixing stage to generate an output intermediate-frequency current signal. The intermediate-frequency current signal passes through the transresistance amplifier to form a voltage output, finally yielding an intermediate-frequency voltage signal. The transresistance amplifier has a transconductance-enhancing structure, which further reduces the input impedance and improves current utilization efficiency and port isolation. The frequency mixer offers low power consumption, high conversion gain, good port isolation, and other advantages. (A textbook conversion-gain relation is noted after this entry.)
    Type: Grant
    Filed: May 29, 2012
    Date of Patent: January 13, 2015
    Assignee: Southeast University
    Inventors: Jianhui Wu, Xiao Shi, Chao Chen, Zhilin Liu, Qiang Zhao, Junfeng Wen, Xudong Wang, Chunfeng Bai, Qian Tian
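    Note: the abstract describes a current-commutating passive mixer chain (RF transconductor, LO-driven switching stage, transimpedance output amplifier). As a hedged aside, not taken from the patent, the textbook first-order voltage conversion gain of such a chain is:
```latex
% First-order voltage conversion gain of a current-commutating passive mixer:
% an RF transconductor with transconductance g_m, an LO-driven switching stage
% contributing the fundamental commutation factor 2/pi, and a transimpedance
% amplifier with feedback resistance R_F.
\[
  A_v \approx \frac{2}{\pi}\, g_m\, R_F
\]
```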
  • Publication number: 20130285715
    Abstract: A transconductance-enhancing passive frequency mixer comprises a transconductance amplification stage, a frequency mixing stage, and an output transresistance amplifier. The transconductance amplification stage has a pre-amplification transconductance-enhancing structure, so the transconductance is greatly enhanced and the same transconductance value is obtained at a lower bias current. A radio-frequency current is modulated by the frequency mixing stage to generate an output intermediate-frequency current signal. The intermediate-frequency current signal passes through the transresistance amplifier to form a voltage output, finally yielding an intermediate-frequency voltage signal. The transresistance amplifier has a transconductance-enhancing structure, which further reduces the input impedance and improves current utilization efficiency and port isolation. The frequency mixer offers low power consumption, high conversion gain, good port isolation, and other advantages.
    Type: Application
    Filed: May 29, 2012
    Publication date: October 31, 2013
    Applicant: Southeast University
    Inventors: Jianhui Wu, Xiao Shi, Chao Chen, Zhilin Liu, Qiang Zhao, Junfeng Wen, Xudong Wang, Chunfeng Bai, Qian Tian