Patents by Inventor Nathalie Baracaldo Angel
Nathalie Baracaldo Angel has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12160504
Abstract: A plurality of public encryption keys are distributed to a plurality of participants in a federated learning system, and a first plurality of responses is received from the plurality of participants, where each respective response of the first plurality of responses was generated based on training data local to a respective participant of the plurality of participants and is encrypted using a respective public encryption key of the plurality of public encryption keys. A first aggregation vector is generated based on the first plurality of responses, and a first private encryption key is retrieved using the first aggregation vector. An aggregated model is then generated based on the first private encryption key and the first plurality of responses.
Type: Grant
Filed: November 13, 2019
Date of Patent: December 3, 2024
Assignee: International Business Machines Corporation
Inventors: Runhua Xu, Nathalie Baracaldo Angel, Yi Zhou, Ali Anwar, Heiko H. Ludwig
-
Publication number: 20240362521
Abstract: A system and a computer-implemented method of training a global student model is disclosed. The global student model and a teacher model are stored on a server and each include a first layer. The method includes transmitting local student models based on the global student model, the local student models each including an embedding layer and a first layer. The method includes receiving an embedding layer output of one of the local student models. The method includes performing a forward pass on the first layer of the teacher model, with the embedding layer output as an input, to generate a teacher model first layer output. The method includes transmitting the teacher model first layer output. The method includes receiving first layer weights of the local student models. The method includes calculating first layer weights of the global student model using the received first layer weights of the local student models.
Type: Application
Filed: April 26, 2023
Publication date: October 31, 2024
Inventors: Syed Zawad, Nathalie Baracaldo Angel, Yi Zhou, Swanand Ravindra Kadhe, Wesley M. Gifford
-
Patent number: 12131231
Abstract: A method, a computer program product, and a system for training a machine learning model using federated learning with extreme gradient boosting. The method includes computing an epsilon hyperparameter using training dataset sizes from a first party and a second party. The method also includes transmitting a machine learning model and the epsilon hyperparameter to the first party and the second party and receiving a first model update and a second model update from the first party and the second party respectively. The method further includes fusing the first model update and the second model update to produce a global histogram and determining at least one split candidate in a decision tree used by the machine learning model using the global histogram. The method also includes rebuilding the machine learning model by adding the split candidate to a decision tree of the machine learning model.
Type: Grant
Filed: September 16, 2020
Date of Patent: October 29, 2024
Assignee: International Business Machines Corporation
Inventors: Yuya Jeremy Ong, Yi Zhou, Nathalie Baracaldo Angel
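The histogram-fusion step described above can be sketched in a few lines. This is an illustrative toy, not the patented implementation: each party contributes per-bin gradient statistics over a shared binning, the aggregator sums them into a global histogram, and a split candidate is chosen from that histogram. The balance-based split score and all names here are assumptions standing in for a real split-gain criterion.

```python
# Toy sketch of histogram-based federated boosting (names are hypothetical).

def fuse_histograms(hist_a, hist_b):
    """Element-wise sum of two parties' gradient histograms (same binning)."""
    return [a + b for a, b in zip(hist_a, hist_b)]

def best_split_bin(global_hist):
    """Pick the bin boundary that most evenly balances total gradient mass —
    a simple stand-in for a real split-gain criterion."""
    total = sum(global_hist)
    best_bin, best_score = 0, float("inf")
    left = 0.0
    for i, count in enumerate(global_hist[:-1]):
        left += count
        score = abs(left - (total - left))  # imbalance of left vs. right mass
        if score < best_score:
            best_bin, best_score = i, score
    return best_bin

party1 = [4, 1, 0, 2]   # per-bin gradient statistics from party 1
party2 = [0, 3, 5, 1]   # per-bin gradient statistics from party 2
global_hist = fuse_histograms(party1, party2)
split = best_split_bin(global_hist)
```

In a real system the parties would send only these histograms, never raw data, which is what makes the split search federated.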
-
Publication number: 20240330757
Abstract: A computer-implemented method of training a machine learning model to prevent data leakage from membership inference attacks. A pre-trained model and a pre-defined hyperparameter λ are received as an input. A forward pass is applied by querying the pre-trained model with private data. An initial loss distribution L_INIT of loss values is computed. A batch loss of a minibatch from the private data is computed after beginning a fine-tuning operation to transform the pre-trained model into a fine-tuned model, and a batch loss distribution L_BATCH is computed. A divergence metric is computed between L_INIT and L_BATCH, and the output of the divergence metric is multiplied by the pre-defined hyperparameter λ to obtain a result that is added to the batch loss as a regularizer. The model parameters are updated by computing backpropagation on the regularized loss. The fine-tuned model is output.
Type: Application
Filed: March 31, 2023
Publication date: October 3, 2024
Inventors: Mustafa Safa Ozdayi, Swanand Ravindra Kadhe, Yi Zhou, Nathalie Baracaldo Angel
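The regularized-loss construction above can be sketched numerically. This is a hedged illustration, not the claimed method: the abstract does not fix a particular divergence metric, so KL divergence over discretized loss distributions is an assumption here, and `lam` stands in for the hyperparameter λ.

```python
# Sketch of the divergence-penalized fine-tuning loss (divergence choice assumed).
import math

def kl_divergence(p, q, eps=1e-9):
    """KL(p || q) for two discrete distributions given as probability lists."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def regularized_loss(batch_loss, l_init, l_batch, lam):
    """batch_loss + lam * divergence(L_INIT, L_BATCH), as in the abstract."""
    return batch_loss + lam * kl_divergence(l_init, l_batch)
```

The intuition: if fine-tuning drives the minibatch loss distribution far from the pre-training loss distribution, the model is memorizing the private data, and the penalty pulls it back.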
-
Publication number: 20240249018
Abstract: One or more systems, devices, computer program products and/or computer-implemented methods of use provided herein relate to a process for privacy-enhanced machine learning and inference. A system can comprise a memory that stores computer executable components, and a processor that executes the computer executable components stored in the memory, wherein the computer executable components can comprise a processing component that generates an access rule that modifies access to first data of a graph database, wherein the first data comprises first party information identified as private, a sampling component that executes a random walk for sampling a first graph of the graph database while employing the access rule, wherein the first graph comprises the first data, and an inference component that, based on the sampling, generates a prediction in response to a query, wherein the inference component avoids directly exposing the first party information in the prediction.
Type: Application
Filed: January 23, 2023
Publication date: July 25, 2024
Inventors: Ambrish Rawat, Naoise Holohan, Heiko H. Ludwig, Ehsan Degan, Nathalie Baracaldo Angel, Alan Jonathan King, Swanand Ravindra Kadhe, Yi Zhou, Keith Coleman Houck, Mark Purcell, Giulio Zizzo, Nir Drucker, Hayim Shaul, Eyal Kushnir, Lam Minh Nguyen
-
Publication number: 20240249153
Abstract: Systems, devices, computer program products and/or computer-implemented methods of use provided herein relate to federated training and inferencing. A system can comprise a memory that stores computer executable components, and a processor that executes the computer executable components stored in the memory, wherein the computer executable components can comprise a modeling component that trains an inferential model using data from a plurality of parties and comprising horizontally partitioned data and vertically partitioned data, wherein the modeling component employs a random decision tree comprising the data to train the inferential model, and an inference component that responds to a query, employing the inferential model, by generating an inference, wherein first party private data, of the data, originating from a first passive party of the plurality of parties, is not directly shared with other passive parties of the plurality of parties to generate the inference.
Type: Application
Filed: February 8, 2023
Publication date: July 25, 2024
Inventors: Swanand Ravindra Kadhe, Heiko H. Ludwig, Nathalie Baracaldo Angel, Yi Zhou, Alan Jonathan King, Keith Coleman Houck, Ambrish Rawat, Mark Purcell, Naoise Holohan, Mikio Takeuchi, Ryo Kawahara, Nir Drucker, Hayim Shaul
-
Publication number: 20240242087
Abstract: Systems and techniques that facilitate feature selection in vertical federated learning are provided. For example, one or more embodiments described herein can comprise a system, which can comprise a memory that can store computer executable components. The system can also comprise a processor, operably coupled to the memory, that can execute the computer executable components stored in memory. The computer executable components can comprise an aggregator machine learning model that aggregates a plurality of embedding components from one or more local machine learning models and removes one or more embedding components based on minimizing weights at an input layer of the aggregator machine learning model.
Type: Application
Filed: January 18, 2023
Publication date: July 18, 2024
Inventors: Timothy John Castiglia, Yi Zhou, Nathalie Baracaldo Angel, Swanand Ravindra Kadhe, Shiqiang Wang, Stacy Elizabeth Patterson
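The embedding-removal idea can be sketched as magnitude-based pruning at the aggregator's input layer. This is a simplified illustration under assumptions the abstract does not state (a fixed keep-count and magnitude ranking rather than a learned sparsity penalty); all names are hypothetical.

```python
# Toy sketch: drop embedding components whose aggregator input-layer weights
# have the smallest magnitude (a crude proxy for the minimized weights).

def prune_embeddings(embeddings, input_weights, keep):
    """Keep the `keep` embedding components with largest-magnitude weights.
    Returns (surviving embeddings, their original indices)."""
    ranked = sorted(range(len(input_weights)),
                    key=lambda i: abs(input_weights[i]), reverse=True)
    keep_idx = sorted(ranked[:keep])
    return [embeddings[i] for i in keep_idx], keep_idx
```

In vertical federated learning each party's local model emits such embedding components, so pruning them at the aggregator effectively deselects whole party-side features.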
-
Publication number: 20240144026
Abstract: A computer-implemented method, according to one approach, includes issuing a hyperparameter optimization (HPO) query to a plurality of computing devices. HPO results are received from the plurality of computing devices, and the HPO results include a set of hyperparameter (HP)/rank value pairs. The method further includes computing, based on the set of HP/rank value pairs, a global set of HPs from the HPO results for federated learning (FL) training. An indication of the global set of HPs is output to the plurality of computing devices. A computer program product, according to another approach, includes a computer readable storage medium having program instructions embodied therewith. The program instructions are readable and/or executable by a computer to cause the computer to perform the foregoing method.
Type: Application
Filed: February 28, 2023
Publication date: May 2, 2024
Inventors: Yi Zhou, Parikshit Ram, Theodoros Salonidis, Nathalie Baracaldo Angel, Horst Cornelius Samulowitz, Heiko H. Ludwig
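One plausible way to combine HP/rank pairs into a global HP set is rank aggregation: sum each configuration's ranks across devices and pick the configuration with the lowest total. This is only a sketch of the idea; the abstract does not specify the aggregation rule, and the data shapes and names below are assumptions.

```python
# Hypothetical rank aggregation over per-device HPO results.
from collections import defaultdict

def global_hps(hpo_results):
    """hpo_results: one list per device of (config, rank) pairs, rank 1 = best.
    Returns the config with the lowest summed rank across devices."""
    totals = defaultdict(int)
    for device_pairs in hpo_results:
        for config, rank in device_pairs:
            totals[config] += rank
    return min(totals, key=totals.get)
```

Aggregating ranks rather than raw scores keeps devices comparable even when their local loss scales differ.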
-
Publication number: 20240144027
Abstract: A method, a computer program product, and a system for personalized training of a machine learning model using federated learning with gradient boosted trees. The method includes training a global machine learning model using federated learning between a plurality of parties. The method also includes distributing the global machine learning model to each of the parties and receiving personalized model updates from each of the parties. The personalized model updates are generated from updated models boosted locally and produced by each of the parties using their respective local data. The method further includes fusing the personalized model updates to produce a boosted decision tree to update the global machine learning model. The method also includes training the global machine learning model iteratively in this manner until a stopping criterion is achieved.
Type: Application
Filed: February 27, 2023
Publication date: May 2, 2024
Inventors: Yuya Jeremy Ong, Yi Zhou, Parikshit Ram, Theodoros Salonidis, Nathalie Baracaldo Angel
-
Patent number: 11948096
Abstract: Techniques for improved federated learning are provided. One or more queries are issued to a plurality of participants in a federated learning system, and one or more replies are received from the plurality of participants. A first aggregated model is generated based on the one or more replies and a first influence vector. Upon determining that a predefined criterion is satisfied, a second influence vector modifying a weight of a first participant of the plurality of participants is generated. A second aggregated model is generated based on the one or more replies and the second influence vector.
Type: Grant
Filed: March 13, 2020
Date of Patent: April 2, 2024
Assignee: International Business Machines Corporation
Inventors: Yi Zhou, Ali Anwar, Nathalie Baracaldo Angel, Heiko H. Ludwig
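The influence-vector mechanism can be pictured as a weighted average of participant replies, where changing one participant's weight yields a different aggregated model. A minimal sketch, with all names and the plain weighted-average fusion assumed for illustration:

```python
# Toy influence-weighted aggregation of participant parameter vectors.

def aggregate(replies, influence):
    """replies: list of parameter vectors (one per participant).
    influence: per-participant weights. Returns the weighted average."""
    total = sum(influence)
    dim = len(replies[0])
    return [sum(w * r[i] for w, r in zip(influence, replies)) / total
            for i in range(dim)]
```

Setting a participant's influence to zero, as a second influence vector might, removes its reply from the aggregate entirely — which is how down-weighting a suspect participant produces the second aggregated model.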
-
Publication number: 20240089081
Abstract: An example system includes a processor to compute a tensor of indicators indicating a presence of partial sums in an encrypted vector of indicators. The processor can also securely reorder an encrypted array based on the computed tensor of indicators to generate a reordered encrypted array.
Type: Application
Filed: August 25, 2022
Publication date: March 14, 2024
Inventors: Eyal Kushnir, Hayim Shaul, Omri Soceanu, Ehud Aharoni, Nathalie Baracaldo Angel, Runhua Xu, Heiko H. Ludwig
-
Publication number: 20240039692
Abstract: A second set of data identifiers, comprising identifiers of data usable in federated model training by a second data owner, is received at a first data owner from the second data owner. An intersection set of data identifiers is determined at the first data owner. At the first data owner according to the intersection set of data identifiers, the data usable in federated model training is rearranged by the first data owner to result in a first training dataset. At the first data owner using the intersection set of data identifiers, the first training dataset, and a previous iteration of an aggregated set of model weights, a first partial set of model weights is computed. An updated aggregated set of model weights, comprising the first partial set of model weights and a second partial set of model weights from the second data owner, is received from an aggregator.
Type: Application
Filed: July 28, 2022
Publication date: February 1, 2024
Applicant: International Business Machines Corporation
Inventors: Runhua Xu, Nathalie Baracaldo Angel, Hayim Shaul, Omri Soceanu
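The alignment step — intersecting identifier sets and rearranging local data into the common order — can be sketched simply. This is a hedged illustration only: the privacy-preserving computation of the intersection itself (e.g. via private set intersection) is out of scope here, and the function and variable names are assumptions.

```python
# Toy record alignment before vertical federated training (names hypothetical).

def align_records(my_ids, their_ids, my_rows):
    """Restrict `my_rows` to the ID intersection, in sorted ID order, so both
    data owners operate on identically ordered samples."""
    common = sorted(set(my_ids) & set(their_ids))
    index = {i: row for i, row in zip(my_ids, my_rows)}
    return common, [index[i] for i in common]
```

Both owners applying the same deterministic ordering to the same intersection is what lets their partial weight sets refer to the same samples without exchanging the rows themselves.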
-
Publication number: 20240012942
Abstract: A computer-implemented method, a computer program product, and a computer system for defending against adversarial attacks in federated learning. In the federated learning comprising an aggregator and parties, the aggregator receives weights sent from the respective parties. The aggregator computes values of a performance metric for weight arrays obtained by the respective parties, using a validation dataset. The aggregator ranks the values of the performance metric in a list. The aggregator recursively splits the list in half until one or more adversary updates of the weights are isolated. The aggregator excludes one or more parties that send the one or more adversary updates from participating in a current round of training in the federated learning.
Type: Application
Filed: July 7, 2022
Publication date: January 11, 2024
Inventors: Yi Zhou, Kamala Micaela Noelle Varma, Nathalie Baracaldo Angel
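The rank-and-halve idea can be sketched as follows. This is a deliberately simplified stand-in for the claimed procedure: parties are ranked by a validation metric on models built from their updates, and the ranked list is repeatedly halved toward the worst performers until a small suspect group remains. The stopping rule (`max_group_size`) and all names are assumptions.

```python
# Toy sketch of isolating suspect updates by recursive halving of a ranked list.

def isolate_adversaries(metric_by_party, max_group_size=1):
    """metric_by_party: dict party -> validation accuracy of the model using
    that party's update (higher is better). Returns the isolated suspects."""
    group = sorted(metric_by_party, key=metric_by_party.get)  # worst first
    while len(group) > max_group_size:
        group = group[: len(group) // 2]  # keep the worse-performing half
    return group
```

Halving gives logarithmically many refinement steps rather than evaluating every party individually, which is the appeal of the recursive split.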
-
Publication number: 20240005215
Abstract: A method, system, and computer program product for training models for federated learning. The method determines, by a federated learning aggregator, a set of sample ratios for a set of participant systems. Each sample ratio is associated with a distinct participant system. A set of participant epsilon values are generated for the set of participant systems with each participant epsilon value being associated with a participant system of the set of participant systems. A set of surrogate data sets are received for the set of participant systems with each surrogate data set representing a data set of a participant system. The federated learning aggregator generates a set of local models. Each local model is generated based on a first global model. The method generates a second global model based on a prediction set generated by the set of participant systems using the set of local models.
Type: Application
Filed: June 29, 2022
Publication date: January 4, 2024
Inventors: Yuya Jeremy Ong, Yi Zhou, Nathalie Baracaldo Angel
-
Patent number: 11862313
Abstract: An example operation may include one or more of connecting, by a pharmacy node, to a blockchain network configured to store patients' data on a blockchain ledger, receiving, by the pharmacy node, a request from a patient node for a prescription refill, the request contains a secret key of a patient, extracting, by the pharmacy node, the secret key from the request to verify a patient's identity, and executing, by the pharmacy node, a smart contract to: (a) decrypt a prescription data located on the ledger by an application of the secret key, (b) retrieve patient's allergy records from the ledger to check the allergy records against the prescription data, (c) determine a number of remaining refills from the prescription data, (d) check validity of the prescription data based on an expiration date, and commit a prescription refill transaction to the blockchain based on a successful execution of (b)-(d).
Type: Grant
Filed: June 10, 2019
Date of Patent: January 2, 2024
Assignee: International Business Machines Corporation
Inventors: Dulce B. Ponceleon, Nathalie Baracaldo Angel, Nitin Gaur
-
Patent number: 11856021
Abstract: Computer-implemented methods, program products, and systems for provenance-based defense against poison attacks are disclosed. In one approach, a method includes: receiving observations and corresponding provenance data from data sources; determining whether the observations are poisoned based on the corresponding provenance data; and removing the poisoned observation(s) from a final training dataset used to train a final prediction model. Another implementation involves provenance-based defense against poison attacks in a fully untrusted data environment. Untrusted data points are grouped according to provenance signature, and the groups are used to train learning algorithms and generate complete and filtered prediction models. The results of applying the prediction models to an evaluation dataset are compared, and poisoned data points identified where the performance of the filtered prediction model exceeds the performance of the complete prediction model.
Type: Grant
Filed: March 22, 2023
Date of Patent: December 26, 2023
Assignee: International Business Machines Corporation
Inventors: Nathalie Baracaldo-Angel, Bryant Chen, Evelyn Duesterwald, Heiko H. Ludwig
-
Publication number: 20230409959
Abstract: According to one embodiment, a method, computer system, and computer program product for grouped federated learning is provided. The embodiment may include initializing a plurality of aggregation groups including a plurality of parties and a plurality of local aggregators. The embodiment may also include submitting a query to a first party from the plurality of parties. The embodiment may further include submitting an initial response to the query from the first party or a second party from the plurality of parties to a first local aggregator from the plurality of local aggregators. The embodiment may also include submitting a final response from the first local aggregator or a second local aggregator from the plurality of local aggregators to a global aggregator. The embodiment may further include building a machine learning model based on the final response.
Type: Application
Filed: June 21, 2022
Publication date: December 21, 2023
Inventors: Ali Anwar, Yi Zhou, Nathalie Baracaldo Angel, Runhua Xu, Yuya Jeremy Ong, Annie K. Abay, Heiko H. Ludwig, Gegi Thomas, Jayaram Kallapalayam Radhakrishnan, Laura Wynter
-
Publication number: 20230401439
Abstract: The method provides for analyzing input and output connections of layers of a received neural network model configured for vertical federated learning. An undirected graph of nodes is generated in which a node having two or more child nodes includes an aggregation operation, based on the analysis of the model in which a model output corresponds to a node of the graph. A layer of the model is identified in which a sum of lower layer outputs is computed. The identified model layer is partitioned into a first part applied respectively to the multiple entities and a second part applied as an aggregator of the output of the first part. The aggregation operation is performed between pairs of lower layer outputs, and multiple forward and backward passes of the neural network model are performed that include secure aggregation and maintain model partitioning in forward and backward passes.
Type: Application
Filed: June 13, 2022
Publication date: December 14, 2023
Inventors: Shiqiang Wang, Timothy John Castiglia, Nathalie Baracaldo Angel, Stacy Elizabeth Patterson, Runhua Xu, Yi Zhou
-
Patent number: 11824968
Abstract: Techniques regarding privacy preservation in a federated learning environment are provided. For example, one or more embodiments described herein can comprise a system, which can comprise a memory that can store computer executable components. The system can also comprise a processor, operably coupled to the memory, and that can execute the computer executable components stored in the memory. The computer executable components can comprise a plurality of machine learning components that can execute a machine learning algorithm to generate a plurality of model parameters. The computer executable components can also comprise an aggregator component that can synthesize a machine learning model based on an aggregate of the plurality of model parameters. The aggregator component can communicate with the plurality of machine learning components via a data privacy scheme that comprises a privacy process and a homomorphic encryption process in a federated learning environment.
Type: Grant
Filed: September 13, 2021
Date of Patent: November 21, 2023
Inventors: Nathalie Baracaldo Angel, Stacey Truex, Heiko H. Ludwig, Ali Anwar, Thomas Steinke, Rui Zhang
-
Patent number: 11755954
Abstract: An indication of availability over time and resource usage is maintained for each computing device of a plurality of computing devices. An optimal combination of a subset of the plurality of computing devices is determined for each round of one or more rounds of training based on the availability over time and the resource usage for each computing device. A global model is generated utilizing the one or more optimal combinations of the plurality of computing devices and a query is performed utilizing the global model.
Type: Grant
Filed: March 11, 2021
Date of Patent: September 12, 2023
Assignee: International Business Machines Corporation
Inventors: Ali Anwar, Syed Amer Zawad, Yi Zhou, Nathalie Baracaldo Angel
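The per-round participant selection can be sketched as a small constrained search. This is an illustrative toy under assumptions the abstract does not state: a single scalar resource cost per device, a per-round budget, and brute-force subset search standing in for whatever optimization is actually claimed. All names are hypothetical.

```python
# Toy per-round selection of federated-learning participants by availability
# and resource budget (exhaustive search; names are illustrative only).
from itertools import combinations

def pick_round_participants(devices, round_idx, budget):
    """devices: dict name -> (set of rounds the device is available, cost).
    Returns the largest subset available this round that fits the budget."""
    avail = [d for d, (rounds, _) in devices.items() if round_idx in rounds]
    for size in range(len(avail), 0, -1):          # prefer larger subsets
        for combo in combinations(avail, size):
            if sum(devices[d][1] for d in combo) <= budget:
                return sorted(combo)
    return []  # no available device fits the budget this round
```

Running this once per round yields the sequence of device combinations from which the global model is then trained.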