Patents by Inventor Saurabh Kumar Mishra

Saurabh Kumar Mishra has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11593634
    Abstract: This disclosure relates to methods, non-transitory computer readable media, and systems that asynchronously train a machine learning model across client devices that implement local versions of the model while preserving client data privacy. To train the model across devices, in some embodiments, the disclosed systems send global parameters for a global machine learning model from a server device to client devices. A subset of the client devices uses local machine learning models corresponding to the global model and client training data to modify the global parameters. Based on those modifications, the subset of client devices sends modified parameter indicators to the server device for the server device to use in adjusting the global parameters. By utilizing the modified parameter indicators (and not client training data), in certain implementations, the disclosed systems accurately train a machine learning model without exposing training data from the client devices. (An illustrative sketch of this parameter-exchange loop appears after this listing.)
    Type: Grant
    Filed: June 19, 2018
    Date of Patent: February 28, 2023
    Assignee: Adobe Inc.
    Inventors: Sunav Choudhary, Saurabh Kumar Mishra, Manoj Ghuhan A, Ankur Garg
  • Patent number: 11170320
    Abstract: Systems and techniques are described herein for updating a machine learning model on edge servers. Local parameters of the machine learning model are updated at a plurality of edge servers using fresh data on the edge servers, rather than waiting for the data to reach a global server to update the machine learning model. Hence, latency is significantly reduced, making the systems and techniques described herein suitable for real-time services that support streaming data. Moreover, by updating global parameters of the machine learning model at a global server in a deterministic manner based on parameter updates from the edge servers, rather than by including randomization steps, the global parameters converge quickly to their optimal values. The global parameters are sent from the global server to the plurality of edge servers at each iteration, thereby synchronizing the machine learning model on the edge servers. (See the second sketch after this listing for an illustrative version of this edge-synchronization loop.)
    Type: Grant
    Filed: July 19, 2018
    Date of Patent: November 9, 2021
    Assignee: Adobe Inc.
    Inventors: Ankur Garg, Sunav Choudhary, Saurabh Kumar Mishra, Manoj Ghuhan A.
  • Publication number: 20200027033
    Abstract: Systems and techniques are described herein for updating a machine learning model on edge servers. Local parameters of the machine learning model are updated at a plurality of edge servers using fresh data on the edge servers, rather than waiting for the data to reach a global server to update the machine learning model. Hence, latency is significantly reduced, making the systems and techniques described herein suitable for real-time services that support streaming data. Moreover, by updating global parameters of the machine learning model at a global server in a deterministic manner based on parameter updates from the edge servers, rather than by including randomization steps, the global parameters converge quickly to their optimal values. The global parameters are sent from the global server to the plurality of edge servers at each iteration, thereby synchronizing the machine learning model on the edge servers.
    Type: Application
    Filed: July 19, 2018
    Publication date: January 23, 2020
    Applicant: Adobe Inc.
    Inventors: Ankur Garg, Sunav Choudhary, Saurabh Kumar Mishra, Manoj Ghuhan A.
  • Publication number: 20190385043
    Abstract: This disclosure relates to methods, non-transitory computer readable media, and systems that asynchronously train a machine learning model across client devices that implement local versions of the model while preserving client data privacy. To train the model across devices, in some embodiments, the disclosed systems send global parameters for a global machine learning model from a server device to client devices. A subset of the client devices uses local machine learning models corresponding to the global model and client training data to modify the global parameters. Based on those modifications, the subset of client devices sends modified parameter indicators to the server device for the server device to use in adjusting the global parameters. By utilizing the modified parameter indicators (and not client training data), in certain implementations, the disclosed systems accurately train a machine learning model without exposing training data from the client devices.
    Type: Application
    Filed: June 19, 2018
    Publication date: December 19, 2019
    Inventors: Sunav Choudhary, Saurabh Kumar Mishra, Manoj Ghuhan A, Ankur Garg
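
The abstracts for patent 11593634 and publication 20190385043 describe an asynchronous, privacy-preserving training loop in which client devices return only modified parameter indicators rather than raw training data. The following Python sketch illustrates that general pattern only and is not the patented method; the Client and Server classes, the linear-regression objective, and all hyperparameters are assumptions invented for this example.

```python
# Illustrative sketch only: a toy parameter-aggregation loop in the spirit of
# the abstract. All names and the learning task are hypothetical, not taken
# from the patent.
import numpy as np

rng = np.random.default_rng(0)

class Client:
    """Holds private training data; never sends it to the server."""
    def __init__(self, features, labels):
        self.features = features          # stays on the device
        self.labels = labels              # stays on the device

    def compute_indicator(self, global_params, lr=0.1):
        # Run one local gradient step on private data and return only the
        # resulting parameter change (a "modified parameter indicator").
        preds = self.features @ global_params
        grad = self.features.T @ (preds - self.labels) / len(self.labels)
        local_params = global_params - lr * grad
        return local_params - global_params   # a delta, not raw data

class Server:
    """Maintains the global model; sees indicators, never client data."""
    def __init__(self, dim):
        self.global_params = np.zeros(dim)

    def apply_indicators(self, indicators):
        # Adjust the global parameters using the averaged client deltas.
        self.global_params += np.mean(indicators, axis=0)

# Simulate a few clients, each with private linear-regression data.
dim, true_w = 3, np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(5):
    X = rng.normal(size=(20, dim))
    y = X @ true_w + 0.01 * rng.normal(size=20)
    clients.append(Client(X, y))

server = Server(dim)
for _ in range(200):
    # Only a subset of clients participates in each round.
    idx = rng.choice(len(clients), size=3, replace=False)
    indicators = [clients[i].compute_indicator(server.global_params) for i in idx]
    server.apply_indicators(indicators)

print("estimated parameters:", server.global_params)
```

The property the sketch tries to show is that the server only ever receives parameter deltas; the features and labels held on each client never leave the device.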
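
The abstracts for patent 11170320 and publication 20200027033 describe updating local parameters on edge servers as fresh data arrives, combining the updates deterministically at a global server, and synchronizing the result back to the edge servers at each iteration. The sketch below is a hedged illustration of that loop under the same caveats: EdgeServer, update_on_fresh_data, the plain-average global step, and the synthetic streaming data are all invented for this example and are not drawn from the patent.

```python
# Illustrative sketch only: deterministic synchronization of a shared model
# across simulated edge servers, loosely following the abstract. Class and
# function names are invented for this example.
import numpy as np

rng = np.random.default_rng(1)
DIM, TRUE_W = 3, np.array([0.5, 1.5, -1.0])

class EdgeServer:
    def __init__(self):
        self.params = np.zeros(DIM)

    def update_on_fresh_data(self, lr=0.05):
        # Update local parameters immediately on newly arrived (streaming)
        # data instead of forwarding raw data to the global server.
        X = rng.normal(size=(10, DIM))
        y = X @ TRUE_W + 0.01 * rng.normal(size=10)
        grad = X.T @ (X @ self.params - y) / len(y)
        self.params -= lr * grad
        return self.params

edges = [EdgeServer() for _ in range(4)]
global_params = np.zeros(DIM)

for _ in range(300):
    # Each edge server refines its local copy on fresh data.
    local_updates = [edge.update_on_fresh_data() for edge in edges]
    # Deterministic global step: a plain average, no randomization.
    global_params = np.mean(local_updates, axis=0)
    # Synchronize: push the new global parameters back to every edge server.
    for edge in edges:
        edge.params = global_params.copy()

print("synchronized global parameters:", global_params)
```

Using a plain average as the global step keeps the update deterministic, which is the property the abstract associates with fast convergence of the global parameters.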