Patents by Inventor Kanthi Sarpatwar

Kanthi Sarpatwar has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11271958
    Abstract: Aspects of the present disclosure describe techniques for detecting anomalous data in an encrypted data set. An example method generally includes receiving a data set of encrypted data points, each having a plurality of encrypted features. A tree data structure having a number of levels is generated for the data set. Each level of the tree data structure generally corresponds to one of the encrypted features, and each node in the tree data structure at a given level represents a probability distribution of the likelihood that a data point is less than or greater than a split value determined for the given feature. An encrypted data point is received for analysis, and an anomaly score is calculated based on a probability identified for each of the plurality of encrypted features. Based on determining that the calculated anomaly score exceeds a threshold value, the encrypted data point is identified as potentially anomalous.
    Type: Grant
    Filed: September 20, 2019
    Date of Patent: March 8, 2022
    Assignee: International Business Machines Corporation
    Inventors: Kanthi Sarpatwar, Venkata Sitaramagiridharganesh Ganapavarapu, Saket Sathe, Roman Vaculin
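
A rough plaintext sketch of the scoring idea in the abstract above: a small ensemble of one-split-per-feature "trees" is built on data in the clear, a point is scored by the negative log-probability of the side of each split it falls on, and scores above an empirical threshold are flagged. The tree construction, ensemble size, and threshold rule here are illustrative assumptions; the patented technique performs these computations homomorphically over encrypted data points.

```python
import math
import random

def build_tree(points, rng):
    """One "tree": for each feature, a random split value and the fraction of
    training points that fall below it (the per-node probability)."""
    splits, probs_below = [], []
    for f in range(len(points[0])):
        column = [p[f] for p in points]
        split = rng.uniform(min(column), max(column))
        splits.append(split)
        probs_below.append(sum(1 for v in column if v < split) / len(column))
    return splits, probs_below

def anomaly_score(point, forest, eps=1e-6):
    """Average over trees of the summed negative log-probability of the side
    of each split the point falls on; landing on rare sides raises the score."""
    total = 0.0
    for splits, probs_below in forest:
        for f, split in enumerate(splits):
            p = probs_below[f] if point[f] < split else 1.0 - probs_below[f]
            total -= math.log(p + eps)
    return total / len(forest)

rng = random.Random(7)
data = [[rng.gauss(0.0, 1.0) for _ in range(3)] for _ in range(500)]
forest = [build_tree(data, rng) for _ in range(20)]

# Threshold taken as the empirical 95th percentile of the training scores.
train_scores = sorted(anomaly_score(p, forest) for p in data)
threshold = train_scores[int(0.95 * len(train_scores))]

for label, point in [("inlier", [0.1, -0.2, 0.3]), ("outlier", [4.0, -4.0, 4.0])]:
    score = anomaly_score(point, forest)
    print(label, round(score, 2), "anomalous" if score > threshold else "normal")
```
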
  • Publication number: 20210397988
    Abstract: This disclosure provides a method, apparatus and computer program product to create a fully homomorphic encryption (FHE)-friendly machine learning model. The approach herein leverages a knowledge distillation framework wherein the FHE-friendly (student) ML model closely mimics the predictions of a (teacher) model that, relative to the student, is more complex and is pre-trained on large datasets. In the approach herein, the distillation framework uses the more complex teacher model to facilitate training of the FHE-friendly student model, while using synthetically-generated training data in lieu of the original datasets used to train the teacher.
    Type: Application
    Filed: June 22, 2020
    Publication date: December 23, 2021
    Applicant: International Business Machines Corporation
    Inventors: Kanthi Sarpatwar, Nalini K. Ratha, Karthikeyan Shanmugam, Karthik Nandakumar, Sharathchandra Pankanti, Roman Vaculin, James Thomas Rayfield
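
A minimal sketch of the distillation flow described above, under several assumptions: the pre-trained teacher is stood in for by a fixed nonlinear function, the FHE-friendly student is a degree-2 polynomial model (only additions and multiplications at inference time), and the training data is synthetically generated rather than drawn from the teacher's original datasets. No encryption library is involved; only the training flow is illustrated.

```python
import numpy as np

rng = np.random.default_rng(0)

def teacher_predict(x):
    """Stand-in for a complex, pre-trained teacher model (not FHE-friendly)."""
    return np.tanh(2.0 * x[:, 0] - x[:, 1])

def student_features(x):
    """Degree-2 polynomial features: inference needs only adds and multiplies,
    which keeps the student friendly to FHE evaluation."""
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), x1, x2, x1 * x1, x2 * x2, x1 * x2])

# Synthetic probing data generated in lieu of the teacher's original datasets.
X_train = rng.uniform(-1.0, 1.0, size=(5000, 2))
soft_targets = teacher_predict(X_train)

# "Distill": fit the student so its outputs track the teacher's soft predictions.
weights, *_ = np.linalg.lstsq(student_features(X_train), soft_targets, rcond=None)

# Check how closely the student mimics the teacher on fresh synthetic points.
X_test = rng.uniform(-1.0, 1.0, size=(1000, 2))
gap = np.abs(student_features(X_test) @ weights - teacher_predict(X_test))
print("mean absolute teacher/student gap:", float(gap.mean()))
```
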
  • Publication number: 20210376995
    Abstract: A technique for computationally-efficient privacy-preserving homomorphic inferencing against a decision tree. Inferencing is carried out by a server against encrypted data points provided by a client. Fully homomorphic computation is enabled with respect to the decision tree by intelligently configuring the tree and the real number-valued features that are applied to the tree. To that end, and to the extent the decision tree is unbalanced, the server first balances the tree. A cryptographic packing scheme is then applied to the balanced decision tree and, in particular, to one or more entries in at least one of: an encrypted feature set, and a threshold data set, that are to be used during the decision tree evaluation process. Upon receipt of an encrypted data point, homomorphic inferencing on the configured decision tree is performed using a highly-accurate approximation comparator, which implements a “soft” membership recursive computation on real numbers, all in an oblivious manner.
    Type: Application
    Filed: May 27, 2020
    Publication date: December 2, 2021
    Applicant: International Business Machines Corporation
    Inventors: Nalini K. Ratha, Kanthi Sarpatwar, Karthikeyan Shanmugam, Sharathchandra Pankanti, Karthik Nandakumar, Roman Vaculin
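
A plaintext sketch of the "soft" evaluation idea: each internal-node comparison is replaced by a smooth approximate comparator, and the prediction is the sum over leaves of the leaf value weighted by the product of soft branch decisions along its path, so every path is evaluated and the taken branch is never revealed. A plain sigmoid stands in here for the approximation comparator, and the balanced depth-2 tree and its thresholds are made up; the patented scheme evaluates this homomorphically over encrypted, packed features.

```python
import math

def soft_less_than(x, threshold, sharpness=20.0):
    """Smooth approximation of the comparison [x < threshold], valued in (0, 1)."""
    return 1.0 / (1.0 + math.exp(sharpness * (x - threshold)))

# A balanced depth-2 tree: (feature index, threshold) per internal node,
# and one value per leaf (LL = left-left path, LR = left-right path, ...).
internal_nodes = {"root": (0, 0.5), "left": (1, 0.3), "right": (1, 0.7)}
leaf_values = {"LL": 0.0, "LR": 1.0, "RL": 1.0, "RR": 0.0}

def soft_evaluate(features):
    """Weight every leaf by the product of soft decisions along its path."""
    f, t = internal_nodes["root"]
    root_left = soft_less_than(features[f], t)
    f, t = internal_nodes["left"]
    left_left = soft_less_than(features[f], t)
    f, t = internal_nodes["right"]
    right_left = soft_less_than(features[f], t)
    return (root_left * left_left * leaf_values["LL"]
            + root_left * (1 - left_left) * leaf_values["LR"]
            + (1 - root_left) * right_left * leaf_values["RL"]
            + (1 - root_left) * (1 - right_left) * leaf_values["RR"])

print(round(soft_evaluate([0.2, 0.9]), 3))  # near 1.0: effectively leaf LR
print(round(soft_evaluate([0.8, 0.9]), 3))  # near 0.0: effectively leaf RR
```
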
  • Publication number: 20210344478
    Abstract: A method, apparatus and computer program product for homomorphic inference on a decision tree (DT) model. In lieu of HE-based inferencing directly on the decision tree, the inferencing is performed on a neural network (NN) that acts as a surrogate. To this end, the neural network is trained to learn DT decision boundaries, preferably without using the original DT model training data points. During training, a random data set is applied to the DT, and the expected outputs are recorded. This random data set and the expected outputs are then used to train the neural network such that the outputs of the neural network match the outputs expected from applying the original data set to the DT. Preferably, the neural network has low depth, i.e., just a few layers. HE-based inferencing on the decision tree is then done using HE inferencing on the shallow neural network, which is computationally-efficient and is carried out without the need for bootstrapping.
    Type: Application
    Filed: April 30, 2020
    Publication date: November 4, 2021
    Applicant: International Business Machines Corporation
    Inventors: Kanthi Sarpatwar, Nalini K. Ratha, Karthikeyan Shanmugam, Karthik Nandakumar, Sharathchandra Pankanti, Roman Vaculin
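
A small sketch of the surrogate-training step, with a toy axis-aligned rule standing in for the original decision tree and a hand-rolled one-hidden-layer network as the shallow surrogate; the random probing data, network size, and training schedule are all assumptions for illustration, and no homomorphic encryption is involved here.

```python
import numpy as np

rng = np.random.default_rng(1)

def decision_tree_predict(x):
    """Stand-in for the original decision tree: a simple axis-aligned rule."""
    return ((x[:, 0] > 0.4) & (x[:, 1] > 0.6)).astype(float)

# Random probing data, labeled by the tree (no original training points used).
X = rng.uniform(0.0, 1.0, size=(2000, 2))
y = decision_tree_predict(X)[:, None]

# Shallow surrogate: one tanh hidden layer, sigmoid output, full-batch descent.
hidden = 8
W1 = rng.normal(0.0, 0.5, size=(2, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0.0, 0.5, size=(hidden, 1)); b2 = np.zeros(1)
lr = 1.0

for _ in range(4000):
    a1 = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(a1 @ W2 + b2)))   # predicted probabilities
    dz2 = (p - y) / len(X)                       # d(cross-entropy)/d(output logit)
    dW2, db2 = a1.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (1.0 - a1 ** 2)         # back through tanh
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Agreement between the surrogate and the tree on fresh random points.
X_test = rng.uniform(0.0, 1.0, size=(1000, 2))
nn_vote = (np.tanh(X_test @ W1 + b1) @ W2 + b2 > 0.0).ravel()
print("agreement:", float((nn_vote == (decision_tree_predict(X_test) > 0.5)).mean()))
```
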
  • Publication number: 20210319353
    Abstract: An example operation includes one or more of computing, by a data owner node, updated gradients on a loss function based on a batch of private data and previous parameters of a machine learning model associated with a blockchain, encrypting, by the data owner node, update information, recording, by the data owner node, the encrypted update information as a new transaction on the blockchain, and providing the update information for an audit.
    Type: Application
    Filed: April 9, 2020
    Publication date: October 14, 2021
    Inventors: Kanthi Sarpatwar, Karthikeyan Shanmugam, Venkata Sitaramagiridharganesh Ganapavarapu, Roman Vaculin
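
A toy sketch of the recording-and-audit flow in the abstract above. The gradient is computed for a simple linear model, the "encryption" is a keyed XOR stream that merely stands in for a real cipher, and the blockchain is a hash-linked list; all of these are illustrative placeholders rather than the patented mechanism.

```python
import hashlib
import json
import random

def gradient(weights, batch):
    """Gradient of mean squared error for the linear model y_hat = w . x."""
    grad = [0.0] * len(weights)
    for x, y in batch:
        err = sum(w * xi for w, xi in zip(weights, x)) - y
        for i, xi in enumerate(x):
            grad[i] += 2.0 * err * xi / len(batch)
    return grad

def xor_cipher(data: bytes, key: int) -> bytes:
    """Reversible keyed XOR stream -- a placeholder, NOT real encryption."""
    stream = random.Random(key)
    return bytes(b ^ stream.randrange(256) for b in data)

class Ledger:
    """Hash-linked append-only list standing in for the blockchain."""
    def __init__(self):
        self.blocks = []

    def append(self, payload: bytes):
        prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        digest = hashlib.sha256(prev.encode() + payload).hexdigest()
        self.blocks.append({"prev": prev, "payload": payload, "hash": digest})

# Data owner node: compute the update, encrypt it, record it as a transaction.
previous_params = [0.0, 0.0]
private_batch = [([1.0, 2.0], 5.0), ([2.0, 1.0], 4.0)]
update = gradient(previous_params, private_batch)
key = 42
ledger = Ledger()
ledger.append(xor_cipher(json.dumps(update).encode(), key))

# Audit: the owner reveals the key; the auditor decrypts and re-checks the math.
revealed = json.loads(xor_cipher(ledger.blocks[-1]["payload"], key).decode())
recomputed = gradient(previous_params, private_batch)
print("recorded update matches recomputation:", revealed == recomputed)
```
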
  • Publication number: 20210150266
    Abstract: In an approach for training machine-learning models using encrypted data, a processor receives a set of encrypted data from a client computing device. A processor trains a machine-learning model using a boosting algorithm. A processor performs a first classification on the set of encrypted data using the machine-learning model. A processor sends a first set of encrypted results of the first classification to the client computing device. A processor receives a first set of boosting updates from the client computing device. A processor applies the first set of boosting updates to the machine-learning model.
    Type: Application
    Filed: November 15, 2019
    Publication date: May 20, 2021
    Inventors: Kanthi Sarpatwar, Roman Vaculin
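
A plaintext sketch of the client/server round-trip described above, using AdaBoost-style stumps: the server fits a weak learner and returns its classifications, the client (which in the real protocol holds the decryption key and the labels) computes the boosting update, and the server applies it. Everything here is unencrypted and the division of responsibilities is simplified for illustration.

```python
import math

def fit_stump(xs, ys, weights):
    """Server step: best 1-D threshold stump under the current sample weights.
    (The data and labels sit at the server in the clear purely for illustration.)"""
    best = None
    for t in sorted(set(xs)):
        for sign in (1, -1):
            preds = [sign if x > t else -sign for x in xs]
            err = sum(w for w, p, y in zip(weights, preds, ys) if p != y)
            if best is None or err < best[0]:
                best = (err, t, sign, preds)
    return best  # (weighted error, threshold, orientation, predictions)

def client_boosting_update(preds, ys, weights):
    """Client step: AdaBoost learner weight and re-normalized sample weights."""
    err = max(sum(w for w, p, y in zip(weights, preds, ys) if p != y), 1e-9)
    alpha = 0.5 * math.log((1.0 - err) / err)
    new_w = [w * math.exp(-alpha * y * p) for w, p, y in zip(weights, preds, ys)]
    total = sum(new_w)
    return alpha, [w / total for w in new_w]

xs = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]            # client's data set
ys = [-1, -1, 1, 1, -1, 1, 1, 1]                         # client's labels
weights = [1.0 / len(xs)] * len(xs)
ensemble = []

for _ in range(4):                                        # four boosting rounds
    _, t, sign, preds = fit_stump(xs, ys, weights)        # server trains, classifies
    alpha, weights = client_boosting_update(preds, ys, weights)  # client's update
    ensemble.append((alpha, t, sign))                     # server applies the update

def ensemble_predict(x):
    score = sum(a * (s if x > t else -s) for a, t, s in ensemble)
    return 1 if score > 0 else -1

print([ensemble_predict(x) for x in xs])                  # should reproduce ys
```
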
  • Publication number: 20210092137
    Abstract: Aspects of the present disclosure describe techniques for detecting anomalous data in an encrypted data set. An example method generally includes receiving a data set of encrypted data points, each having a plurality of encrypted features. A tree data structure having a number of levels is generated for the data set. Each level of the tree data structure generally corresponds to one of the encrypted features, and each node in the tree data structure at a given level represents a probability distribution of the likelihood that a data point is less than or greater than a split value determined for the given feature. An encrypted data point is received for analysis, and an anomaly score is calculated based on a probability identified for each of the plurality of encrypted features. Based on determining that the calculated anomaly score exceeds a threshold value, the encrypted data point is identified as potentially anomalous.
    Type: Application
    Filed: September 20, 2019
    Publication date: March 25, 2021
    Inventors: Kanthi Sarpatwar, Venkata Sitaramagiridharganesh Ganapavarapu, Saket Sathe, Roman Vaculin
  • Publication number: 20200394470
    Abstract: An example operation may include one or more of generating, by a training participant client, a plurality of transaction proposals, each of the plurality of transaction proposals corresponding to a training iteration for machine learning model training related to stochastic gradient descent, the machine learning model training comprising a plurality of training iterations, the transaction proposals comprising a gradient calculation performed by the training participant client, transferring the plurality of transaction proposals to one or more endorser nodes or peers each comprising a verify gradient smart contract, executing, by each of the endorser nodes or peers, the verify gradient smart contract, and providing endorsements corresponding to the plurality of transaction proposals to the training participant client.
    Type: Application
    Filed: June 12, 2019
    Publication date: December 17, 2020
    Inventors: Venkata Sitaramagiridharganesh Ganapavarapu, Kanthi Sarpatwar, Karthikeyan Shanmugam, Roman Vaculin
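
The following sketch illustrates the "verify gradient" endorsement idea shared by this entry and the two that follow: an endorser re-executes the gradient computation declared in a transaction proposal (previous parameter, batch, loss) and endorses only if the claimed gradient matches. No blockchain framework or smart-contract runtime is used; the proposal is just a dictionary, and the scalar model and field names are hypothetical.

```python
def gradient(w, batch):
    """d/dw of mean squared error for the scalar model y_hat = w * x."""
    return sum(2.0 * (w * x - y) * x for x, y in batch) / len(batch)

def verify_gradient_contract(proposal, tolerance=1e-9):
    """Endorser logic: recompute the gradient and endorse only if it matches."""
    recomputed = gradient(proposal["previous_param"], proposal["batch"])
    return {"proposal_id": proposal["id"],
            "endorsed": abs(recomputed - proposal["gradient"]) <= tolerance}

# Training participant client builds one proposal per SGD iteration.
previous_param = 0.5
batch = [(1.0, 2.0), (2.0, 1.0)]          # (x, y) pairs from the private dataset
proposal = {
    "id": "iteration-1",
    "previous_param": previous_param,
    "batch": batch,
    "gradient": gradient(previous_param, batch),
}

# Each endorser node evaluates the proposal independently before endorsing.
endorsements = [verify_gradient_contract(proposal) for _ in range(2)]
print(endorsements)   # both endorsements should report endorsed=True
```
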
  • Publication number: 20200394552
    Abstract: An example operation may include one or more of generating, by a plurality of training participant clients, gradient calculations for machine learning model training, each of the training participant clients comprising a training dataset, converting, by a training aggregator coupled to the plurality of training participant clients, the gradient calculations to a plurality of transaction proposals, receiving, by one or more endorser nodes or peers of a blockchain network, the plurality of transaction proposals, executing, by each of the endorser nodes or peers, a verify gradient smart contract, and providing endorsements corresponding to the plurality of transaction proposals to the training aggregator.
    Type: Application
    Filed: June 12, 2019
    Publication date: December 17, 2020
    Inventors: Venkata Sitaramagiridharganesh Ganapavarapu, Kanthi Sarpatwar, Karthikeyan Shanmugam, Roman Vaculin
  • Publication number: 20200394471
    Abstract: An example operation may include one or more of generating, by a training participant client comprising a training dataset, a plurality of transaction proposals that each correspond to a training iteration for machine learning model training related to stochastic gradient descent, the machine learning model training comprising a plurality of training iterations, the transaction proposals comprising a gradient calculation performed by the training participant client, a batch from the private dataset, a loss function, and an original model parameter, receiving, by one or more endorser nodes or peers of a blockchain network, the plurality of transaction proposals, and evaluating each transaction proposal.
    Type: Application
    Filed: June 12, 2019
    Publication date: December 17, 2020
    Inventors: Venkata Sitaramagiridharganesh Ganapavarapu, Kanthi Sarpatwar, Karthikeyan Shanmugam, Roman Vaculin
  • Publication number: 20190287027
    Abstract: An example operation may include one or more of generating a hashed summary including hashes of one or more of a validation data set and hashes of data points chosen in previous iterations from producer nodes, and exposing the hashed summary to a plurality of producer nodes, receiving, iteratively, a plurality of requests from the plurality of producer nodes, respectively, where each request identifies a marginal value provided by a hash of a data sample available to a producer node, selecting a request received from a producer node based on a marginal value associated with the request, retrieving hashed data of the producer node associated with the selected request, and aggregating the hashed data of the producer node with the summary of hashes generated at one or more previous iterations to produce an updated summary, and storing the updated summary via a data block of a distributed ledger.
    Type: Application
    Filed: March 15, 2018
    Publication date: September 19, 2019
    Inventors: Michele M. Franceschini, Ashish Jagmohan, Kanthi Sarpatwar, Karthikeyan Shanmugam, Roman Vaculin
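
An illustrative sketch of the greedy, hash-based selection loop described above: producers advertise the marginal value of their local samples (here, simply how many of their sample hashes are not yet in the shared summary), the coordinator selects the best offer each iteration, folds those hashes into the summary, and appends the updated summary digest to a toy hash-linked ledger. The marginal-value definition and the ledger structure are assumptions for illustration.

```python
import hashlib

def h(value: str) -> str:
    return hashlib.sha256(value.encode()).hexdigest()

producers = {
    "producer_a": [h(s) for s in ["r1", "r2", "r3"]],
    "producer_b": [h(s) for s in ["r3", "r4"]],
    "producer_c": [h(s) for s in ["r1", "r5", "r6", "r7"]],
}
summary = {h(s) for s in ["v1", "v2"]}         # hashes of the validation data set
ledger = []                                     # list of (prev_digest, digest)

for _ in range(2):                              # two selection iterations
    # Each producer reports the marginal value of its data w.r.t. the summary.
    offers = {name: len(set(hashes) - summary) for name, hashes in producers.items()}
    chosen = max(offers, key=offers.get)        # select the most valuable request
    summary |= set(producers[chosen])           # aggregate that producer's hashes
    prev = ledger[-1][1] if ledger else "0" * 64
    digest = h(prev + "".join(sorted(summary)))
    ledger.append((prev, digest))               # store the updated summary
    print("chose", chosen, "summary size", len(summary))
```
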
  • Patent number: 10338970
    Abstract: A method of scheduling assignment of resources to a plurality of applications includes: determining shares of the resources assigned to each application during a first period; determining shares of the resources assigned to each application during a second period that occurs after the first period; determining an imbalance value for each application that is based on a sum of the shares assigned to the corresponding application over both periods; and considering requests of the applications for resources in an order that depends on a result of comparing the imbalance values of the applications.
    Type: Grant
    Filed: September 20, 2016
    Date of Patent: July 2, 2019
    Assignee: International Business Machines Corporation
    Inventors: Peter D. Kirchner, Krzysztof P. Onak, Robert Saccone, Kanthi Sarpatwar, Joel L. Wolf
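
A small sketch of the ordering rule in the claim above, under the assumption that requests are considered in ascending order of imbalance value (most under-served application first); the share numbers are made up.

```python
# Shares of the resources assigned to each application in the two periods.
shares_period_1 = {"app_a": 0.50, "app_b": 0.30, "app_c": 0.20}
shares_period_2 = {"app_a": 0.40, "app_b": 0.20, "app_c": 0.40}

# Imbalance value: sum of each application's shares over both periods.
imbalance = {
    app: shares_period_1[app] + shares_period_2[app] for app in shares_period_1
}

# Consider pending requests in an order determined by comparing imbalance values.
pending_requests = ["app_a", "app_b", "app_c"]
order = sorted(pending_requests, key=lambda app: imbalance[app])
print(order)  # app_b (0.5) before app_c (0.6) before app_a (0.9)
```
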
  • Publication number: 20180081722
    Abstract: A method of scheduling assignment of resources to a plurality of applications includes: determining shares of the resources assigned to each application during a first period; determining shares of the resources assigned to each application during a second period that occurs after the first period; determining an imbalance value for each application that is based on a sum of the shares assigned to the corresponding application over both periods; and considering requests of the applications for resources in an order that depends on a result of comparing the imbalance values of the applications.
    Type: Application
    Filed: September 20, 2016
    Publication date: March 22, 2018
    Inventors: Peter D. Kirchner, Krzysztof P. Onak, Robert Saccone, Kanthi Sarpatwar, Joel L. Wolf
  • Publication number: 20160292300
    Abstract: A system and method for performing network graph queries on a network graph includes a preprocessing module adapted to generate a data structure from the network graph and to store and dynamically maintain the data structure. The system and method also includes a query module adapted to receive a network query and to generate, from the data structure, a query response that answers the network query.
    Type: Application
    Filed: March 30, 2015
    Publication date: October 6, 2016
    Applicants: Alcatel Lucent USA Inc., LGS Innovation LLC
    Inventors: Randeep S. Bhatia, Bhawna Gupta, Kanthi Sarpatwar, Lloyd Greenwald
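
A minimal sketch of the preprocess / maintain / query split described above, using connectivity queries as a stand-in for the network queries: the preprocessing module builds a union-find index over the graph, edge insertions maintain it incrementally, and the query module answers from the index without re-reading the graph. The choice of query type and data structure is purely illustrative.

```python
class ConnectivityIndex:
    def __init__(self, nodes, edges):
        self.parent = {n: n for n in nodes}       # preprocessing: build the index
        for u, v in edges:
            self.add_edge(u, v)

    def _find(self, n):
        while self.parent[n] != n:
            self.parent[n] = self.parent[self.parent[n]]   # path halving
            n = self.parent[n]
        return n

    def add_edge(self, u, v):
        """Dynamic maintenance as the network graph changes."""
        self.parent.setdefault(u, u)
        self.parent.setdefault(v, v)
        self.parent[self._find(u)] = self._find(v)

    def connected(self, u, v):
        """Query module: answer from the maintained data structure."""
        return self._find(u) == self._find(v)

index = ConnectivityIndex(nodes=["a", "b", "c", "d"], edges=[("a", "b"), ("c", "d")])
print(index.connected("a", "b"), index.connected("a", "c"))  # True False
index.add_edge("b", "c")                                      # network update
print(index.connected("a", "d"))                              # True
```
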