Patents by Inventor Lam Minh Nguyen

Lam Minh Nguyen has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240169253
    Abstract: Using a first dataset of labeled data, a model is trained by adjusting a feature extractor parameter, a classifier parameter, and a discriminator parameter of the model. Using the discriminator parameter and a parametric function of the feature extractor parameter, a plurality of samples of a dataset of unlabeled data is scored. A subset of the scored plurality of samples is selected for labeling. Responsive to receiving a label of each of the selected subset of the scored plurality of samples, the first dataset of labeled data is augmented with the selected subset of the scored plurality of samples and the label of each of the selected subset of the scored plurality of samples. Using the augmented dataset of labeled data, the model is retrained. The retraining comprises further adjusting the feature extractor parameter, the classifier parameter, and the discriminator parameter of the model.
    Type: Application
    Filed: November 22, 2022
    Publication date: May 23, 2024
    Applicant: International Business Machines Corporation
    Inventors: Dzung Tien Phan, Huozhi Zhou, Lam Minh Nguyen, Chandrasekhara K. Reddy, Jayant R. Kalagnanam
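
The abstract above (20240169253) describes an active-learning loop in which a discriminator scores unlabeled samples for labeling. The following is a minimal, illustrative sketch of that idea, assuming PCA as a stand-in feature extractor, logistic regression for the classifier and discriminator, and an invented scoring rule and selection size; none of these choices are taken from the patent.

```python
# Illustrative sketch only: discriminator-scored active learning with invented stand-ins.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
labeled = rng.choice(len(X), size=50, replace=False)           # small initial labeled pool
unlabeled = np.setdiff1d(np.arange(len(X)), labeled)

for round_ in range(3):
    # "Train the model": adjust feature-extractor, classifier, and discriminator parameters.
    extractor = PCA(n_components=5).fit(X[labeled])
    Z_lab, Z_unl = extractor.transform(X[labeled]), extractor.transform(X[unlabeled])
    classifier = LogisticRegression(max_iter=1000).fit(Z_lab, y[labeled])
    # Discriminator separates labeled from unlabeled points in the feature space.
    Z_all = np.vstack([Z_lab, Z_unl])
    d_target = np.r_[np.zeros(len(Z_lab)), np.ones(len(Z_unl))]
    discriminator = LogisticRegression(max_iter=1000).fit(Z_all, d_target)

    # Score the unlabeled samples; the most "unlabeled-looking" points are selected for labeling.
    scores = discriminator.predict_proba(Z_unl)[:, 1]
    pick = unlabeled[np.argsort(scores)[-20:]]

    # Labels arrive from an oracle; augment the labeled dataset and retrain on the next round.
    labeled = np.concatenate([labeled, pick])
    unlabeled = np.setdiff1d(unlabeled, pick)
    print(f"round {round_}: labeled pool now has {len(labeled)} samples")
```
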
  • Publication number: 20240144052
    Abstract: A maintenance solution pipeline is automatically selected from a plurality of maintenance solution pipelines, based on obtained information. The maintenance solution pipeline is to be used in providing a physical asset maintenance solution for a plurality of physical assets. Code and model rendering for the maintenance solution pipeline automatically selected is initiated. Output from an artificial intelligence process is obtained. The output includes an automatically generated risk estimation relating to one or more conditions of at least one physical asset of the plurality of physical assets. Code and model rendering for the maintenance solution pipeline is re-initiated, based on the output from the artificial intelligence process. The maintenance solution pipeline automatically selected is reused.
    Type: Application
    Filed: October 31, 2022
    Publication date: May 2, 2024
    Inventors: Nianjun ZHOU, Pavankumar MURALI, Dzung Tien PHAN, Lam Minh NGUYEN
  • Publication number: 20240119298
    Abstract: In aspects of the disclosure, a method comprises training, by a computing system, a dynamics model of a cooperative multi-agent reinforcement learning (c-MARL) environment. The method further comprises processing, by the computing system, a perturbation optimizer to generate a state perturbation of the c-MARL environment, based on the dynamics model. The method further comprises selecting one or more agents of the c-MARL system as having enhanced vulnerability. The method further comprises attacking, by the computing system, the c-MARL system based on the state perturbation and the selected one or more agents.
    Type: Application
    Filed: September 23, 2022
    Publication date: April 11, 2024
    Inventors: Nhan Huu Pham, Lam Minh Nguyen, Jie Chen, Thanh Lam Hoang, Subhro Das
  • Publication number: 20240119274
    Abstract: Select an initial weight vector for a convex optimization sub-problem associated with a neural network having a non-convex network architecture loss surface. With at least one processor, approximate a solution to the convex optimization sub-problem that obtains a search direction, to learn a common classifier from training data. With the at least one processor, update the initial weight vector by subtracting the approximate solution to the convex optimization sub-problem times a first learning rate. With the at least one processor, repeat the approximating and updating steps, for a plurality of iterations, with the updated weight vector from a given one of the iterations taken as the initial weight vector for a next one of the iterations, to obtain a final weight vector for the neural network, until convergence to a global minimum is achieved, to implement the common classifier.
    Type: Application
    Filed: September 23, 2022
    Publication date: April 11, 2024
    Inventor: Lam Minh Nguyen
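
Publication 20240119274 above describes repeatedly solving a convex sub-problem to obtain a search direction and subtracting that direction scaled by a learning rate. A rough sketch of that loop follows, using a logistic-regression objective as a stand-in for the network and a quadratic sub-problem whose minimizer is simply the gradient; the step size and stopping test are assumptions, not the patented algorithm.

```python
# Illustrative sketch only: outer loop that repeatedly solves a simple convex sub-problem.
import numpy as np
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=400, n_features=10, random_state=1)
y = 2 * y - 1                                    # labels in {-1, +1}
w = np.zeros(X.shape[1])                         # initial weight vector
lr = 0.2                                         # first learning rate (assumed value)

def grad_logistic(w):
    margins = y * (X @ w)
    return -(X * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0)

for it in range(300):
    g = grad_logistic(w)
    # "Approximate solution of the convex sub-problem": the minimizer of the quadratic
    # model  min_d  g.d + 0.5*||d||^2  around w, which is simply d = g.
    direction = g
    w = w - lr * direction                       # subtract the solution times the learning rate
    if np.linalg.norm(direction) < 1e-4:         # crude convergence check
        break

print("iterations:", it + 1, "final gradient norm:", float(np.linalg.norm(grad_logistic(w))))
```
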
  • Publication number: 20240103457
    Abstract: Methods, systems, and computer program products for a decision-improvement framework are provided herein. A computer-implemented method includes obtaining regression functions that predict an output of processes of a physical system based on inputs received at each process; automatically generating one or more constraints and one or more objective functions for a model for the physical system based on the regression functions and a representation of the physical system, where the representation specifies relationships between at least a portion of the processes; identifying a set of parameter values for controlling the physical system based on the model; generating a score, for the set of parameter values, based on a predicted improvement of the physical system relative to historical performance of the physical system; and in response to the generated score satisfying a threshold, causing the physical system to be configured in accordance with the set of parameter values.
    Type: Application
    Filed: September 20, 2022
    Publication date: March 28, 2024
    Inventors: Dzung Tien Phan, Lam Minh Nguyen
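
Publication 20240103457 above describes fitting per-process regression functions, deriving constraints and objectives from them, and deploying new set-points only when a predicted-improvement score clears a threshold. The sketch below illustrates that flow on a made-up two-process system; the surrogate models, operating-range constraint, and threshold value are all assumptions.

```python
# Illustrative sketch only: regression surrogates chained per the plant layout, then a
# constrained search for better set-points, deployed only above a score threshold.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
# Historical data for two chained processes: u -> process A -> v -> process B -> output.
u_hist = rng.uniform(0, 10, size=(200, 1))
v_hist = 2.0 * u_hist + rng.normal(0, 0.5, u_hist.shape)
out_hist = -0.3 * (v_hist - 12.0) ** 2 + 50.0 + rng.normal(0, 0.5, v_hist.shape)

reg_A = LinearRegression().fit(u_hist, v_hist)               # surrogate for process A
feats = lambda v: np.hstack([v, v ** 2])                     # quadratic feature for curvature
reg_B = LinearRegression().fit(feats(v_hist), out_hist)      # surrogate for process B

# "Constraint": u must stay in its historical operating range; "objective": predicted output.
u_grid = np.linspace(u_hist.min(), u_hist.max(), 500).reshape(-1, 1)
pred_out = reg_B.predict(feats(reg_A.predict(u_grid)))
best_idx = int(np.argmax(pred_out))
u_star, pred_star = float(u_grid[best_idx]), float(pred_out[best_idx])

# Score the candidate set-point against historical performance; deploy only above a threshold.
score = pred_star - float(out_hist.mean())
THRESHOLD = 1.0
print(f"candidate u={u_star:.2f}, predicted gain={score:.2f}, deploy={score > THRESHOLD}")
```
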
  • Publication number: 20240103959
    Abstract: In example aspects of this disclosure, a method includes generating, by one or more computing devices, a parametric model that expresses condition states for each of a plurality of assets, and the probability of the assets transitioning between the condition states; generating, by the one or more computing devices, stochastic degradation predictions of a group of the assets, based on the condition states and the probability of transitioning between the condition states for at least some of the assets; and generating, by the one or more computing devices, a maintenance schedule based on: the stochastic degradation predictions of the group of the assets, costs of corrective maintenance for assets in a failed state, and costs of scheduled maintenance for the assets.
    Type: Application
    Filed: September 21, 2022
    Publication date: March 28, 2024
    Inventors: Pavankumar Murali, Dzung Tien Phan, Nianjun Zhou, Lam Minh Nguyen
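
Publication 20240103959 above describes a parametric model of condition states, stochastic degradation predictions, and a maintenance schedule that balances corrective and scheduled costs. The sketch below illustrates one way such a calculation could look, assuming a small invented Markov transition matrix and cost figures; it is not the patented method.

```python
# Illustrative sketch only: Markov condition states, Monte-Carlo degradation, and a
# brute-force search for the cheapest scheduled-maintenance interval.
import numpy as np

rng = np.random.default_rng(3)
STATES = ["good", "worn", "failed"]
P = np.array([[0.90, 0.09, 0.01],      # per-period transition probabilities between states
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])
COST_SCHEDULED, COST_CORRECTIVE = 1.0, 8.0
HORIZON, N_SIM = 60, 500

def expected_cost(interval):
    """Average cost per asset when maintenance is scheduled every `interval` periods."""
    total = 0.0
    for _ in range(N_SIM):
        state, cost = 0, 0.0
        for t in range(1, HORIZON + 1):
            state = rng.choice(3, p=P[state])
            if state == 2:                       # failed state: pay corrective maintenance
                cost += COST_CORRECTIVE
                state = 0
            elif t % interval == 0:              # planned maintenance resets the asset
                cost += COST_SCHEDULED
                state = 0
        total += cost
    return total / N_SIM

costs = {k: expected_cost(k) for k in (2, 4, 6, 8, 12)}
best = min(costs, key=costs.get)
print("expected cost by interval:", {k: round(v, 2) for k, v in costs.items()})
print("recommended maintenance interval:", best)
```
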
  • Publication number: 20240096057
    Abstract: A computer-implemented method for certifying robustness of image classification in a neural network is provided. The method includes initializing a neural network model. The neural network model includes a problem space and a decision boundary. A processor receives a data set of images, image labels, and a perturbation schedule. Images are drawn from the data set in the problem space. A distance from the decision boundary is determined for the images in the problem space. A re-weighting value is applied to the images. A modified perturbation magnitude is applied to the images. A total loss function for the images in the problem space is determined using the re-weighting value. A confidence level of the classification of the images in the data set is evaluated for certifiable robustness.
    Type: Application
    Filed: September 19, 2022
    Publication date: March 21, 2024
    Inventors: Lam Minh Nguyen, Wang Zhang, Subhro Das, Pin-Yu Chen, Alexandre Megretski, Luca Daniel
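
Publication 20240096057 above describes scoring images by their distance from the decision boundary, re-weighting them, and certifying robustness under a perturbation magnitude. The sketch below illustrates those ingredients with a linear SVM standing in for the neural network; the re-weighting schedule and the L2 certification rule are assumptions specific to the linear case.

```python
# Illustrative sketch only: distance-based re-weighting and a simple L2 robustness certificate.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=500, n_features=2, n_redundant=0,
                           n_informative=2, n_clusters_per_class=1, random_state=4)
eps = 0.3                                                    # perturbation magnitude

clf = LinearSVC(C=1.0, max_iter=5000).fit(X, y)
w, b = clf.coef_.ravel(), clf.intercept_[0]

def boundary_distance(X):
    return np.abs(X @ w + b) / np.linalg.norm(w)

# Re-weighting: samples near the boundary get larger weights (the schedule is invented).
dist = boundary_distance(X)
sample_weight = 1.0 + np.exp(-dist / eps)
clf_rw = LinearSVC(C=1.0, max_iter=5000).fit(X, y, sample_weight=sample_weight)
w, b = clf_rw.coef_.ravel(), clf_rw.intercept_[0]

# A point is certifiably robust if it is correctly classified and its distance to the
# (linear) boundary exceeds eps, so no L2 perturbation of size eps can flip its label.
correct = clf_rw.predict(X) == y
certified = correct & (boundary_distance(X) > eps)
print(f"clean accuracy: {correct.mean():.2%}, certified at eps={eps}: {certified.mean():.2%}")
```
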
  • Publication number: 20240020528
    Abstract: An index sequence specifying an index of training data corresponding to a component of a cost function is generated. A first model parameter in the set of model parameters is set to an initial value. Using the index sequence, a neural network model comprising a set of weights is trained. As part of the training, using the index sequence, a learning rate, and a set of gradients, a subset of the set of model parameters is updated. As part of the training, a momentum term is set. As part of the training, using the momentum term as the first model parameter, the updating and the setting are repeated until reaching a training completion condition. The trained neural network model is used to predict an outcome by analyzing live data.
    Type: Application
    Filed: July 14, 2022
    Publication date: January 18, 2024
    Applicant: International Business Machines Corporation
    Inventors: Lam Minh Nguyen, Huyen Trang Tran
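
Publication 20240020528 above describes training with a per-epoch index sequence, a learning rate, per-sample gradient updates, and a momentum term that seeds the next pass. The sketch below is a loose illustration on a least-squares problem; the momentum rule and step sizes are invented for the example and are not the patented scheme.

```python
# Illustrative sketch only: shuffling SGD with an invented momentum-style epoch blend.
import numpy as np

rng = np.random.default_rng(5)
n, d = 200, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.01 * rng.normal(size=n)

w = np.zeros(d)                                  # first model parameter, set to an initial value
lr, beta = 0.05, 0.5                             # learning rate and momentum weight (assumed)

for epoch in range(30):
    idx = rng.permutation(n)                     # index sequence over the cost-function components
    w_start, w_run = w.copy(), w.copy()
    for i in idx:
        g = (A[i] @ w_run - b[i]) * A[i]         # per-sample gradient
        w_run = w_run - lr * g                   # update using the index sequence and learning rate
    # Momentum-style term blending the epoch's start and end points; it becomes the
    # starting parameter of the next epoch (repeat until a completion condition).
    w = beta * w_run + (1.0 - beta) * w_start

print("distance to the true parameters:", float(np.linalg.norm(w - x_true)))
```
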
  • Publication number: 20230316150
    Abstract: A method includes training, by one or more processing devices, a plurality of machine learning predictive models, thereby generating a plurality of trained machine learning predictive models. The method further includes generating, by the one or more processing devices, a solved machine learning optimization model, based at least in part on the plurality of trained machine learning predictive models. The method further includes outputting, by the one or more processing devices, one or more control inputs and predicted outputs based at least in part on the solved machine learning optimization model.
    Type: Application
    Filed: March 30, 2022
    Publication date: October 5, 2023
    Inventors: Dzung Tien Phan, Long Vu, Lam Minh Nguyen, Dharmashankar Subramanian
  • Publication number: 20230267339
    Abstract: In unsupervised interpretable machine learning, one or more datasets having multiple features can be received. A machine can be trained to jointly cluster and interpret resulting clusters of the dataset by at least jointly clustering the dataset into clusters and generating hyperplanes in a multi-dimensional feature space of the dataset, where the hyperplanes separate pairs of the clusters, each hyperplane separating one pair of clusters. Jointly clustering the dataset into clusters and generating hyperplanes can repeat until convergence, where the clustering in a subsequent iteration uses the generated hyperplanes from a previous iteration to optimize performance of the clustering. The hyperplanes can be adjusted to further improve the performance of the clustering. The clusters and interpretation of the clusters can be provided, where a cluster's interpretation is provided based on hyperplanes that construct a polytope containing the cluster.
    Type: Application
    Filed: February 18, 2022
    Publication date: August 24, 2023
    Inventors: Dzung Tien Phan, Connor Aram Lawless, Jayant R. Kalagnanam, Lam Minh Nguyen, Chandrasekhara K. Reddy
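
Publication 20230267339 above describes jointly clustering a dataset and generating pairwise separating hyperplanes that form an interpretable polytope around each cluster. The sketch below alternates k-means-style assignments with pairwise linear SVMs as a rough illustration; the re-assignment rule and fixed iteration count are assumptions.

```python
# Illustrative sketch only: alternate clustering with pairwise separating hyperplanes.
import numpy as np
from itertools import combinations
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.svm import LinearSVC

X, _ = make_blobs(n_samples=300, centers=3, n_features=2, random_state=6)
labels = KMeans(n_clusters=3, n_init=10, random_state=6).fit_predict(X)

for _ in range(3):                                   # "repeat until convergence" (fixed here)
    # Generate a separating hyperplane for every pair of current clusters.
    planes = {}
    for a, c in combinations(range(3), 2):
        mask = np.isin(labels, [a, c])
        svm = LinearSVC(max_iter=5000).fit(X[mask], labels[mask])
        planes[(a, c)] = (svm.coef_.ravel(), svm.intercept_[0])
    # Re-assign each point to the cluster whose side of the hyperplanes it falls on most often.
    votes = np.zeros((len(X), 3))
    for (a, c), (w, b) in planes.items():
        side = X @ w + b > 0                         # for binary LinearSVC, >0 means the larger label c
        votes[:, c] += side
        votes[:, a] += ~side
    labels = votes.argmax(axis=1)

for k in range(3):
    print(f"cluster {k}: {np.sum(labels == k)} points, bounded by "
          f"{sum(1 for pair in planes if k in pair)} hyperplanes")
```
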
  • Publication number: 20230252234
    Abstract: Software that performs the following operations: (i) receiving a set of graph predictions corresponding to an input text, where graph predictions of the set of graph predictions are generated by different respective machine learning models; (ii) blending the graph predictions of the set of graph predictions to generate a plurality of candidate blended graphs, where nodes and edges of the candidate blended graphs have respective selection metric values, generated using a selection metric function, that meet a minimum threshold; and (iii) selecting as an output blended graph a candidate blended graph of the plurality of candidate blended graphs having a highest total combination of selection metric values among the plurality of candidate blended graphs.
    Type: Application
    Filed: February 8, 2022
    Publication date: August 10, 2023
    Inventors: Thanh Lam Hoang, Gabriele Picco, Yufang Hou, Young-Suk Lee, Lam Minh Nguyen, Dzung Tien Phan, Vanessa Lopez Garcia, Ramon Fernandez Astudillo
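
Publication 20230252234 above describes blending graph predictions from several models using a selection metric and a minimum threshold, then keeping the best-scoring blended graph. In the sketch below the selection metric is simply the fraction of models that predict an edge, which is an assumption made for illustration; the edges and threshold are invented.

```python
# Illustrative sketch only: blending per-model graph predictions with a support threshold.
from collections import Counter
from itertools import combinations

# Three hypothetical model predictions for the same input text, as (head, relation, tail) edges.
predictions = [
    {("ibm", "employs", "alice"), ("alice", "lives_in", "ny")},
    {("ibm", "employs", "alice"), ("alice", "lives_in", "ny"), ("alice", "works_on", "ai")},
    {("ibm", "employs", "alice"), ("alice", "works_on", "ai")},
]
MIN_SUPPORT = 2 / 3          # minimum selection-metric value for an edge to be kept

def blend(models):
    counts = Counter(e for m in models for e in m)
    support = {e: c / len(models) for e, c in counts.items()}   # selection metric per edge
    graph = {e for e, s in support.items() if s >= MIN_SUPPORT}
    return graph, sum(support[e] for e in graph)

# Candidate blended graphs from every pair of models plus all three; pick the best-scoring one.
candidates = [blend(list(c)) for r in (2, 3) for c in combinations(predictions, r)]
best_graph, best_score = max(candidates, key=lambda gs: gs[1])
print("output blended graph:", best_graph, " total metric:", round(best_score, 2))
```
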
  • Publication number: 20230251608
    Abstract: A method includes: receiving, by a computing device, data from sensors in a manufacturing environment; mapping, by the computing device, the data into a deep learning network; learning, by the computing device, correlations between inputs and outputs of the manufacturing environment using the data; pruning, by the computing device, the deep learning network; predicting, by the computing device and using the pruned network, an output of the pruned network from the inputs of the manufacturing environment; linearizing, by the computing device, the pruned network; optimizing, by the computing device, the output of the linearized pruned network to calculate predicted inputs for the manufacturing environment; and changing, by the computing device, operation inputs in the manufacturing environment to match the predicted inputs.
    Type: Application
    Filed: February 7, 2022
    Publication date: August 10, 2023
    Inventors: Dzung Tien Phan, Jayant R. Kalagnanam, Lam Minh Nguyen
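
Publication 20230251608 above describes learning input-output correlations from sensor data, pruning the network, linearizing it, and optimizing the linearized model to recommend new operation inputs. The sketch below walks through those steps with an MLPRegressor, magnitude pruning, and a finite-difference linearization; the thresholds, bounds, and data are invented.

```python
# Illustrative sketch only: learn, prune, linearize, then nudge the inputs within bounds.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(11)
X = rng.uniform(0, 1, size=(500, 3))                         # sensor inputs
y = 3 * X[:, 0] - 2 * X[:, 1] + 0.5 * X[:, 2] + 0.05 * rng.normal(size=500)

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0).fit(X, y)

# Prune: zero out small weights in place (simple magnitude pruning).
for W in net.coefs_:
    W[np.abs(W) < 0.05] = 0.0

# Linearize the pruned network around the current operating point by finite differences.
x0 = X.mean(axis=0)
eps = 1e-3
grad = np.array([(net.predict([x0 + eps * np.eye(3)[i]])[0]
                  - net.predict([x0 - eps * np.eye(3)[i]])[0]) / (2 * eps) for i in range(3)])

# Optimize the linearized model: within +/-0.1 of the operating point, the best move is to
# push each input toward its bound in the direction of the gradient sign.
step = 0.1 * np.sign(grad)
x_new = np.clip(x0 + step, 0, 1)
print("operating point:", np.round(x0, 2), "-> recommended inputs:", np.round(x_new, 2))
print("predicted output change:", round(float(net.predict([x_new])[0] - net.predict([x0])[0]), 3))
```
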
  • Publication number: 20230196081
    Abstract: An approach to federated learning of a machine learning model may be provided. The approach may include broadcasting hyperparameters of a machine learning model to one or more client computing devices from a primary device associated with an outer loop or an inner loop. A gradient for the loss function may be calculated at the client device if previous gradients have been sufficiently large. If the gradients exceed a threshold, the client can send the mini-batch of gradients or the difference of the mini-batch of gradients back to the primary device. A search direction may be calculated based on the full gradient of the loss function for an outer loop or the mini-batch of gradient differences for an inner loop. A learning rate step may be calculated from the search direction. The hyperparameters may be updated for the inner loop based on the learning rate.
    Type: Application
    Filed: December 21, 2021
    Publication date: June 22, 2023
    Inventors: Lam Minh Nguyen, Dzung Tien Phan, Jayant R. Kalagnanam
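
Publication 20230196081 above describes clients that compute gradients and communicate them to a primary device only when they are large enough, with the primary device forming a search direction from what it receives. The sketch below illustrates that communication pattern with per-client logistic-regression data; the threshold, learning rate, and stopping rule are assumptions.

```python
# Illustrative sketch only: federated rounds where clients send only "large" gradients.
import numpy as np
from sklearn.datasets import make_classification

clients = []
for seed in range(4):                                        # four clients with local data
    Xc, yc = make_classification(n_samples=150, n_features=8, random_state=seed)
    clients.append((Xc, 2 * yc - 1))

def local_grad(w, X, y):
    margins = y * (X @ w)
    return -(X * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0)

w = np.zeros(8)
lr, tau = 0.5, 1e-3                                          # learning rate and send threshold

for rnd in range(50):                                        # outer communication rounds
    received = []
    for X, y in clients:                                     # broadcast w; clients compute locally
        g = local_grad(w, X, y)
        if np.linalg.norm(g) > tau:                          # communicate only if the gradient is large
            received.append(g)
    if not received:                                         # nothing exceeded the threshold: stop
        break
    direction = np.mean(received, axis=0)                    # search direction at the primary device
    w = w - lr * direction

acc = np.mean([(np.sign(X @ w) == y).mean() for X, y in clients])
print(f"rounds used: {rnd + 1}, average client accuracy: {acc:.2%}")
```
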
  • Publication number: 20230186107
    Abstract: A system and method can be provided for constructing and training a decision tree for machine learning. A training set can be received. The decision tree can be initialized by constructing a root node and a root solver can be trained with the training set. A processor can grow the decision tree by iteratively splitting nodes of the decision tree, where at a node of the decision tree, dimension reduction is performed on features of data of the training set received at the node, and the data having reduced dimension is split based on a routing function, for routing to another node of the decision tree. The dimension reduction and the split can be performed together at the node based on solving a nonlinear optimization problem.
    Type: Application
    Filed: December 14, 2021
    Publication date: June 15, 2023
    Inventors: Dzung Tien Phan, Michael Huang, Pavankumar Murali, Lam Minh Nguyen
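
Publication 20230186107 above describes growing a decision tree in which each node performs dimension reduction and routing together. The sketch below uses PCA plus a median threshold at each node as a simple stand-in for the joint nonlinear optimization the abstract mentions; it only illustrates the structure, not the patented solver.

```python
# Illustrative sketch only: a tree whose nodes reduce dimension, then route on the reduced value.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.decomposition import PCA

X, y = make_regression(n_samples=400, n_features=10, noise=5.0, random_state=10)

def grow(X, y, depth):
    node = {"value": float(y.mean())}                  # leaf "solver": mean prediction
    if depth == 0 or len(y) < 20:
        return node
    pca = PCA(n_components=1).fit(X)                   # dimension reduction at this node
    z = pca.transform(X).ravel()
    thr = float(np.median(z))                          # routing function: threshold on the reduced value
    left, right = z <= thr, z > thr
    node.update(pca=pca, thr=thr,
                left=grow(X[left], y[left], depth - 1),
                right=grow(X[right], y[right], depth - 1))
    return node

def predict(node, x):
    while "pca" in node:                               # route x down the tree
        z = node["pca"].transform(x.reshape(1, -1)).item()
        node = node["left"] if z <= node["thr"] else node["right"]
    return node["value"]

tree = grow(X, y, depth=3)
preds = np.array([predict(tree, x) for x in X])
print("training RMSE:", round(float(np.sqrt(np.mean((preds - y) ** 2))), 2))
```
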
  • Publication number: 20230134798
    Abstract: Embodiments are provided for generating a reasonable language model learning for text data in a knowledge graph in a computing system by a processor. One or more data sources and one or more triples may be analyzed from a knowledge graph. Training data having one or more candidate labels associated with one or more of the triples may be generated. One or more reasonable language models may be trained based on the training data.
    Type: Application
    Filed: November 2, 2021
    Publication date: May 4, 2023
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Thanh Lam HOANG, Dzung Tien PHAN, Gabriele PICCO, Lam Minh NGUYEN, Vanessa LOPEZ GARCIA
  • Publication number: 20230128821
    Abstract: A computer-implemented method of generating a classifier engine for machine learning includes receiving a set of data points. A semi-supervised k-means process is applied to the set of data points from each class. The set of data points in a class is clustered into multiple clusters of data points, using the semi-supervised k-means process. Multi-polytopes are constructed for one or more of the clusters from all classes. A support vector machine (SVM) process is run on every pair of clusters from all classes. Separation hyperplanes are determined for the clustered classes. Labels are determined for each cluster based on the separation by hyperplanes.
    Type: Application
    Filed: September 30, 2021
    Publication date: April 27, 2023
    Inventors: Dzung Tien Phan, Lam Minh Nguyen, Jayant R. Kalagnanam, Chandrasekhara K. Reddy, Srideepika Jayaraman
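
Publication 20230128821 above describes clustering each class with a semi-supervised k-means, constructing multi-polytopes, and separating every pair of clusters with an SVM hyperplane. The sketch below approximates that pipeline with ordinary k-means and one-vs-one voting over linear SVMs; the cluster counts and the voting rule are assumptions.

```python
# Illustrative sketch only: per-class clustering plus pairwise SVM hyperplanes for labeling.
import numpy as np
from itertools import combinations
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=400, n_features=2, n_redundant=0,
                           n_informative=2, n_clusters_per_class=2, random_state=8)

# Cluster the points of each class; every cluster keeps its class label.
clusters, cluster_class = [], []
for cls in np.unique(y):
    Xc = X[y == cls]
    km = KMeans(n_clusters=2, n_init=10, random_state=8).fit(Xc)
    for k in range(2):
        clusters.append(Xc[km.labels_ == k])
        cluster_class.append(cls)

# One SVM (separating hyperplane) for every pair of clusters, across all classes.
hyperplanes = []
for i, j in combinations(range(len(clusters)), 2):
    Xp = np.vstack([clusters[i], clusters[j]])
    yp = np.r_[np.zeros(len(clusters[i])), np.ones(len(clusters[j]))]
    hyperplanes.append((i, j, LinearSVC(max_iter=5000).fit(Xp, yp)))

def predict(x):
    votes = np.zeros(len(clusters))
    for i, j, svm in hyperplanes:
        winner = j if svm.decision_function(x.reshape(1, -1))[0] > 0 else i
        votes[winner] += 1
    return cluster_class[int(np.argmax(votes))]              # label of the winning cluster

preds = np.array([predict(x) for x in X])
print(f"training accuracy of the multi-polytope classifier: {(preds == y).mean():.2%}")
```
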
  • Patent number: 11568171
    Abstract: A computer-implemented method for a shuffling-type gradient for training a machine learning model using stochastic gradient descent (SGD) includes the operations of uniformly randomly distributing data samples or coordinate updates of training data, and calculating the learning rates for a no-shuffling scheme and a shuffling scheme. A combined operation of the no-shuffling scheme and the shuffling scheme of the training data is performed using an SGD algorithm. The combined operation is switched from the no-shuffling scheme to performing only the shuffling scheme based on one or more predetermined criteria, and the machine learning model is trained with the training data based on the combined no-shuffling and shuffling schemes.
    Type: Grant
    Filed: December 1, 2020
    Date of Patent: January 31, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Lam Minh Nguyen, Dzung Tien Phan
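
Patent 11568171 above (and its pre-grant publication 20220171996 below) describes combining a no-shuffling scheme and a shuffling scheme of SGD, with a switch to shuffling only once a criterion is met. The sketch below shows one plausible reading on a least-squares problem; the switch criterion and the per-scheme learning rates are invented for the illustration.

```python
# Illustrative sketch only: SGD that starts with a fixed sample order and switches to
# per-epoch reshuffling once progress slows down.
import numpy as np

rng = np.random.default_rng(9)
n, d = 300, 6
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.01 * rng.normal(size=n)

w = np.zeros(d)
lr_no_shuffle, lr_shuffle = 0.05, 0.02               # per-scheme learning rates (assumed)
fixed_order = rng.permutation(n)                      # samples distributed uniformly at random once
shuffling = False

def full_grad(w):
    return A.T @ (A @ w - b) / n

prev_norm = np.linalg.norm(full_grad(w))
for epoch in range(40):
    order = rng.permutation(n) if shuffling else fixed_order
    lr = lr_shuffle if shuffling else lr_no_shuffle
    for i in order:
        w -= lr * (A[i] @ w - b[i]) * A[i]            # per-sample SGD step
    norm = np.linalg.norm(full_grad(w))
    if not shuffling and norm > 0.5 * prev_norm:      # criterion: progress slowed, switch schemes
        shuffling = True
    prev_norm = norm

print("switched to shuffling:", shuffling, " parameter error:", float(np.linalg.norm(w - x_true)))
```
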
  • Publication number: 20220383138
    Abstract: A computer-implemented method for site-wide prediction optimization includes training a plurality of a mixed type of regression models with a mixed type of control variables for identifying control set-points of a site-wide operation. A decision tree regression model is trained to predict a status of the plurality of initial set-points for non-linear regression functions. The decision tree regression model is reformulated into a mixed-integer linear program (MILP) and solved by an MILP solver to find a global solution. An MILP surrogate is determined for a nonlinear optimization problem to provide a best solution for one or more of the non-linear regression functions using the best solution as a starting point for solving non-linear regression functions, and a set-point of the mixed control variables is recommended to control a throughput of the site-wide operation by executing a decomposition operation or a federated learning algorithm.
    Type: Application
    Filed: May 25, 2021
    Publication date: December 1, 2022
    Inventors: Dzung Tien Phan, Nhan Huu Pham, Lam Minh Nguyen
  • Publication number: 20220171996
    Abstract: A computer-implemented method for a shuffling-type gradient for training a machine learning model using stochastic gradient descent (SGD) includes the operations of uniformly randomly distributing data samples or coordinate updates of training data, and calculating the learning rates for a no-shuffling scheme and a shuffling scheme. A combined operation of the no-shuffling scheme and the shuffling scheme of the training data is performed using an SGD algorithm. The combined operation is switched from the no-shuffling scheme to performing only the shuffling scheme based on one or more predetermined criteria, and the machine learning model is trained with the training data based on the combined no-shuffling and shuffling schemes.
    Type: Application
    Filed: December 1, 2020
    Publication date: June 2, 2022
    Inventors: Lam Minh Nguyen, Dzung Tien Phan