Patents by Inventor Tiffany Tuor

Tiffany Tuor has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11836576
Abstract: A training process of a machine learning model is executed at an edge node for a number of iterations to generate a model parameter based at least in part on a local dataset and a global model parameter. A resource parameter set indicative of resources available at the edge node is estimated. The model parameter and the resource parameter set are sent to a synchronization node. Updates to the global model parameter and the number of iterations are received from the synchronization node based at least in part on the model parameters and the resource parameter sets of the edge nodes. The training process of the machine learning model is repeated at the edge node to determine an update to the model parameter based at least in part on the local dataset and the updates to the global model parameter and the number of iterations from the synchronization node.
    Type: Grant
    Filed: April 13, 2018
    Date of Patent: December 5, 2023
    Assignee: International Business Machines Corporation
    Inventors: Shiqiang Wang, Tiffany Tuor, Theodoros Salonidis, Christian Makaya, Bong Jun Ko
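The abstract above describes a round-based exchange: edge nodes train locally for a set number of iterations, report their model parameters and a resource estimate, and the synchronization node returns an averaged global parameter together with an adapted iteration count. A minimal sketch of that loop, using a toy least-squares model and a hypothetical adaptation rule (scaling local work to the most constrained node — the patent does not specify this rule):

```python
import numpy as np

def local_training(global_param, local_data, num_iters, lr=0.1):
    """Run local training for num_iters iterations, starting from the
    global model parameter (toy least-squares model)."""
    w = global_param.copy()
    X, y = local_data
    for _ in range(num_iters):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def synchronization_round(global_param, nodes, num_iters):
    """One round: each edge node trains locally and reports its model
    parameter plus a resource estimate; the synchronization node averages
    the parameters and adapts the local iteration count."""
    params, resources = [], []
    for local_data, capacity in nodes:  # capacity: fraction of budget free
        params.append(local_training(global_param, local_data, num_iters))
        resources.append(capacity)
    new_global = np.mean(params, axis=0)
    # Hypothetical rule: scale local work to the most constrained node.
    new_iters = max(1, int(num_iters * min(resources)))
    return new_global, new_iters
```

The repeated round then calls `synchronization_round` again with the returned global parameter and iteration count, matching the abstract's final step.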
  • Patent number: 11574254
    Abstract: Techniques for adaptive asynchronous federated learning are described herein. An aspect includes providing a first version of a global parameter to a first client and a second client. Another aspect includes receiving, from the first client, a first gradient, wherein the first gradient was computed by the first client based on the first version of the global parameter and a respective first local dataset of the first client. Another aspect includes determining whether the first version of the global parameter matches a most recent version of the global parameter. Another aspect includes, based on determining that the first version of the global parameter does not match the most recent version of the global parameter, selecting a version of the global parameter. Another aspect includes aggregating the first gradient with the selected version of the global parameter to determine an updated version of the global parameter.
    Type: Grant
    Filed: April 29, 2020
    Date of Patent: February 7, 2023
    Assignee: International Business Machines Corporation
    Inventors: Shiqiang Wang, Tiffany Tuor, Changchang Liu, Thai Franck Le
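The asynchronous scheme above hinges on the server tracking versions of the global parameter so a gradient computed against an outdated version can still be aggregated. A minimal sketch under assumed details: the server keeps a version history and, for a stale gradient, discounts the step size by staleness (the discount rule is illustrative, not the patented method):

```python
import numpy as np

class AsyncAggregator:
    """Server-side state for asynchronous federated learning: keeps a
    history of global-parameter versions so gradients computed against
    old versions can still be merged."""
    def __init__(self, initial_param, lr=0.1):
        self.versions = [np.asarray(initial_param, dtype=float).copy()]
        self.lr = lr

    def current(self):
        """Return the latest global parameter and its version number."""
        return self.versions[-1], len(self.versions) - 1

    def submit(self, gradient, client_version):
        """Aggregate a client gradient. If the client's version is not
        the most recent, select the latest version as the merge base and
        discount the step by staleness (an illustrative rule)."""
        latest = len(self.versions) - 1
        base = self.versions[latest]
        staleness = latest - client_version
        step = self.lr / (1 + staleness)  # stale gradients count for less
        new_param = base - step * gradient
        self.versions.append(new_param)
        return new_param
```

A client calls `current()` to fetch a parameter version, computes a gradient on its local dataset, and calls `submit()`; other clients may have advanced the version in the meantime, which is the case the staleness branch handles.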
  • Patent number: 11461593
    Abstract: A method, a computer program product, and a computer system determine when to perform a federated learning process. The method includes identifying currently available contributors among contributors of a federated learning task for which the federated learning process is to be performed. The method includes determining a usefulness metric of the currently available contributors for respective datasets from each of the currently available contributors used in performing the federated learning process. The method includes, as a result of the usefulness metric of the currently available contributors being at least a usefulness threshold, generating a recommendation to perform the federated learning process with the datasets of the currently available contributors. The method includes transmitting the recommendation to a processing component configured to perform the federated learning process.
    Type: Grant
    Filed: November 26, 2019
    Date of Patent: October 4, 2022
    Assignee: International Business Machines Corporation
    Inventors: Tiffany Tuor, Shiqiang Wang, Changchang Liu, Bong Jun Ko, Wei-Han Lee
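The decision logic in this abstract reduces to a threshold test over the usefulness of the currently available contributors' datasets. A minimal sketch, assuming a simple additive usefulness metric (the patent does not specify how the metric is computed):

```python
def recommend_round(available, usefulness_threshold):
    """Decide whether to run a federated learning round now.

    `available` maps contributor id -> usefulness score for its dataset
    (e.g. data volume or label diversity; the scoring itself is assumed
    here). Returns the recommendation passed to the processing component
    that performs the training."""
    total_usefulness = sum(available.values())
    if total_usefulness >= usefulness_threshold:
        return {"run": True, "contributors": sorted(available)}
    return {"run": False, "contributors": []}
```

When the recommendation is negative, the scheduler would simply wait and re-evaluate as more contributors come online, which is the deferral behavior the method enables.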
  • Publication number: 20210342749
    Abstract: Techniques for adaptive asynchronous federated learning are described herein. An aspect includes providing a first version of a global parameter to a first client and a second client. Another aspect includes receiving, from the first client, a first gradient, wherein the first gradient was computed by the first client based on the first version of the global parameter and a respective first local dataset of the first client. Another aspect includes determining whether the first version of the global parameter matches a most recent version of the global parameter. Another aspect includes, based on determining that the first version of the global parameter does not match the most recent version of the global parameter, selecting a version of the global parameter. Another aspect includes aggregating the first gradient with the selected version of the global parameter to determine an updated version of the global parameter.
    Type: Application
    Filed: April 29, 2020
    Publication date: November 4, 2021
    Inventors: Shiqiang Wang, Tiffany Tuor, Changchang Liu, Thai Franck Le
  • Publication number: 20210158099
    Abstract: A method, a computer program product, and a computer system determine when to perform a federated learning process. The method includes identifying currently available contributors among contributors of a federated learning task for which the federated learning process is to be performed. The method includes determining a usefulness metric of the currently available contributors for respective datasets from each of the currently available contributors used in performing the federated learning process. The method includes, as a result of the usefulness metric of the currently available contributors being at least a usefulness threshold, generating a recommendation to perform the federated learning process with the datasets of the currently available contributors. The method includes transmitting the recommendation to a processing component configured to perform the federated learning process.
    Type: Application
    Filed: November 26, 2019
    Publication date: May 27, 2021
Inventors: Tiffany Tuor, Shiqiang Wang, Changchang Liu, Bong Jun Ko, Wei-Han Lee
  • Publication number: 20190318268
Abstract: A training process of a machine learning model is executed at an edge node for a number of iterations to generate a model parameter based at least in part on a local dataset and a global model parameter. A resource parameter set indicative of resources available at the edge node is estimated. The model parameter and the resource parameter set are sent to a synchronization node. Updates to the global model parameter and the number of iterations are received from the synchronization node based at least in part on the model parameters and the resource parameter sets of the edge nodes. The training process of the machine learning model is repeated at the edge node to determine an update to the model parameter based at least in part on the local dataset and the updates to the global model parameter and the number of iterations from the synchronization node.
    Type: Application
    Filed: April 13, 2018
    Publication date: October 17, 2019
Inventors: Shiqiang Wang, Tiffany Tuor, Theodoros Salonidis, Christian Makaya, Bong Jun Ko