Patents by Inventor Joshua David Griffin

Joshua David Griffin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11062219
    Abstract: A computer solves a nonlinear optimization problem. An optimality check is performed for a current solution to an objective function that is a nonlinear equation with constraint functions on decision variables. When the performed optimality check indicates that the current solution is not an optimal solution, a barrier parameter value is updated, and a Lagrange multiplier value is updated for each constraint function based on a result of a complementarity slackness test. The current solution to the objective function is updated using a search direction vector determined by solving a primal-dual linear system that includes a dual variable for each constraint function and a step length value determined for each decision variable and for each dual variable. The operations are repeated until the optimality check indicates that the current solution is the optimal solution or a predefined number of iterations has been performed.
    Type: Grant
    Filed: November 30, 2020
    Date of Patent: July 13, 2021
    Assignee: SAS Institute Inc.
    Inventors: Joshua David Griffin, Riadh Omheni, Yan Xu
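The barrier-parameter loop this abstract describes can be illustrated in a much reduced form with a classic log-barrier method: the constraint enters the objective through a -mu*log(g(x)) term, an inner Newton loop solves each barrier subproblem, and mu is shrunk between outer passes. This is a one-dimensional sketch under invented data (objective (x-3)^2, constraint x <= 2, shrink factor 0.1), not the patent's primal-dual system with Lagrange-multiplier and complementarity-slackness updates:

```python
# Minimal log-barrier sketch: minimize f(x) = (x - 3)^2 subject to
# g(x) = 2 - x >= 0, whose constrained optimum is x = 2.
def barrier_solve(x=0.0, mu=1.0, shrink=0.1, outer_iters=8, newton_iters=50):
    for _ in range(outer_iters):
        # Inner Newton loop on the barrier subproblem f(x) - mu * log(g(x))
        for _ in range(newton_iters):
            gx = 2.0 - x
            grad = 2.0 * (x - 3.0) + mu / gx
            hess = 2.0 + mu / gx ** 2
            step = grad / hess
            # Damp the step so the iterate stays strictly feasible
            while 2.0 - (x - step) <= 0.0:
                step *= 0.5
            x -= step
            if abs(grad) < 1e-10:
                break
        mu *= shrink  # update the barrier parameter between outer passes
    return x
```

Each outer pass plays the role of one barrier-parameter update, and the fixed number of outer iterations mirrors the abstract's predefined iteration limit.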
  • Patent number: 11055639
    Abstract: Manufacturing processes can be optimized using machine learning models. For example, a system can execute an optimization model to identify a recommended set of values for configurable settings of a manufacturing process associated with an object. The optimization model can determine the recommended set of values by implementing an iterative process using an objective function. Each iteration of the iterative process can include selecting a current set of candidate values for the configurable settings from within a current region of a search space defined by the optimization model; providing the current set of candidate values as input to a trained machine learning model that can predict a value for a target characteristic of the object or the manufacturing process based on the current set of candidate values; and identifying a next region of the search space to use in a next iteration of the iterative process based on the value.
    Type: Grant
    Filed: October 6, 2020
    Date of Patent: July 6, 2021
    Assignee: SAS Institute Inc.
    Inventors: Pelin Cay, Nabaruna Karmakar, Natalia Summerville, Varunraj Valsaraj, Antony Nicholas Cooper, Steven Joseph Gardner, Joshua David Griffin
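The iterative region-narrowing described above can be sketched generically: sample candidate settings inside the current region, score them with the predictive model, then recenter and shrink the region around the best candidate. In this minimal sketch a plain quadratic function stands in for the trained machine-learning model, and every constant (shrink factor, sample counts) is invented rather than taken from the patent:

```python
import random

def region_search(predict, lo, hi, iters=40, samples=20, shrink=0.8, seed=0):
    """Shrink a search region around the best candidate settings found so far."""
    rng = random.Random(seed)
    center = [(l + h) / 2 for l, h in zip(lo, hi)]
    width = [h - l for l, h in zip(lo, hi)]
    best_x, best_y = None, float("inf")
    for _ in range(iters):
        # Select candidate values for the configurable settings from the
        # current region, clipped to the overall search space
        cands = [[min(h, max(l, c + (rng.random() - 0.5) * w))
                  for l, h, c, w in zip(lo, hi, center, width)]
                 for _ in range(samples)]
        for x in cands:
            y = predict(x)  # model predicts the target characteristic
            if y < best_y:
                best_x, best_y = x, y
        # Next region: recenter on the incumbent and shrink
        center = best_x
        width = [w * shrink for w in width]
    return best_x, best_y
```

The `predict` callable is the stand-in for the trained model; a real system would pass the model's scoring function here.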
  • Patent number: 10963802
    Abstract: A computing device selects decision variable values. A lower boundary value and an upper boundary value are defined for a decision variable. (A) A plurality of decision variable configurations is determined using a search method. The value for the decision variable is between the lower boundary value and the upper boundary value. (B) A decision variable configuration is selected. (C) A model of the model type is trained using the decision variable configuration. (D) The model is scored to compute an objective function value. (E) The computed objective function value and the selected decision variable configuration are stored. (F) (B) through (E) are repeated for a plurality of decision variable configurations. (G) The lower boundary value and the upper boundary value are updated using the stored objective function values and decision variable configurations. (A) through (F) are repeated with the lower boundary value and the upper boundary value updated in (G).
    Type: Grant
    Filed: December 14, 2020
    Date of Patent: March 30, 2021
    Assignee: SAS Institute Inc.
    Inventors: Steven Joseph Gardner, Joshua David Griffin, Yan Xu, Yan Gao
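Steps (A) through (G) amount to a bounds-tightening search: evaluate a batch of configurations drawn from the current boundaries, then reset the boundaries to span the best performers. A compact sketch, with an invented score function standing in for the train-and-score steps (C) and (D), and all counts chosen for illustration:

```python
import random

def bounds_search(score, lo, hi, rounds=6, per_round=30, keep=5, seed=1):
    """Sample configurations inside [lo, hi], score each, then tighten the
    boundary values around the best-scoring configurations (step (G))."""
    rng = random.Random(seed)
    best = None  # (objective function value, configuration)
    for _ in range(rounds):
        # (A)-(F): evaluate a batch of configurations inside current bounds
        batch = []
        for _ in range(per_round):
            cfg = [rng.uniform(l, h) for l, h in zip(lo, hi)]
            batch.append((score(cfg), cfg))
        batch.sort(key=lambda t: t[0])
        if best is None or batch[0][0] < best[0]:
            best = batch[0]
        # (G): new bounds span the `keep` best configurations of this round
        top = [cfg for _, cfg in batch[:keep]]
        lo = [min(c[i] for c in top) for i in range(len(lo))]
        hi = [max(c[i] for c in top) for i in range(len(hi))]
    return best
```

In the patent the score comes from training and scoring a model of the chosen model type; here any callable objective works.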
  • Patent number: 10949747
    Abstract: A computer trains a neural network model. (A) Observation vectors are randomly selected from a plurality of observation vectors. (B) A forward and backward propagation of a neural network is executed to compute a gradient vector and a weight vector. (C) A search direction vector is computed. (D) A step size value is computed. (E) An updated weight vector is computed. (F) Based on a predefined progress check frequency value, second observation vectors are randomly selected, a progress check objective function value is computed given the weight vector, the step size value, the search direction vector, and the second observation vectors, and based on an accuracy test, the mini-batch size value is updated. (G) (A) to (F) are repeated until a convergence parameter value indicates training of the neural network is complete. The weight vector for a next iteration is the computed updated weight vector.
    Type: Grant
    Filed: November 17, 2020
    Date of Patent: March 16, 2021
    Assignee: SAS Institute Inc.
    Inventors: Majid Jahani, Joshua David Griffin, Seyedalireza Yektamaram, Wenwen Zhou
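The adaptive mini-batch idea in step (F) can be sketched with ordinary SGD on a one-dimensional least-squares model: every few iterations the objective is re-estimated on freshly drawn observations, and the mini-batch grows when progress stalls. The direction and step computations of (C) through (E) are collapsed into a plain gradient step here, and the data, rates, and thresholds are all invented:

```python
import random

def loss(w, pts):
    """Mean squared error of the linear model w * x on observation pairs."""
    return sum((w * x - y) ** 2 for x, y in pts) / len(pts)

def train(data, w=0.0, lr=0.1, iters=200, batch=4, check_every=20, seed=0):
    rng = random.Random(seed)
    last_check = float("inf")
    for t in range(1, iters + 1):
        # (A) randomly select a mini-batch of observation vectors
        sample = rng.sample(data, min(batch, len(data)))
        # (B)-(E) mini-batch gradient and a plain gradient step
        grad = sum(2 * (w * x - y) * x for x, y in sample) / len(sample)
        w -= lr * grad
        # (F) periodic progress check on freshly drawn observations;
        # grow the mini-batch size if the objective has stalled
        if t % check_every == 0:
            chk = rng.sample(data, min(4 * batch, len(data)))
            cur = loss(w, chk)
            if cur > 0.99 * last_check:
                batch = min(2 * batch, len(data))
            last_check = cur
    return w, batch
```

The progress check deliberately uses observations other than the current mini-batch, echoing the abstract's second random selection.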
  • Patent number: 10769528
    Abstract: A computer trains a neural network model. (B) A neural network is executed to compute a post-iteration gradient vector and a current iteration weight vector. (C) A search direction vector is computed using a Hessian approximation matrix and the post-iteration gradient vector. (D) A step size value is initialized. (E) An objective function value is computed that indicates an error measure of the executed neural network. (F) When the computed objective function value is greater than an upper bound value, the step size value is updated using a predefined backtracking factor value. The upper bound value is computed as a sliding average of a predefined upper bound updating interval value number of previous upper bound values. (G) (E) and (F) are repeated until the computed objective function value is not greater than the upper bound value. (H) An updated weight vector is computed to describe a trained neural network model.
    Type: Grant
    Filed: October 2, 2019
    Date of Patent: September 8, 2020
    Assignee: SAS Institute Inc.
    Inventors: Ben-hao Wang, Joshua David Griffin, Seyedalireza Yektamaram, Yan Xu
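The nonmonotone backtracking of steps (E) through (G) can be sketched in one dimension: the trial objective value is compared against an upper bound formed from an average of recent objective values, and the step size is repeatedly multiplied by a backtracking factor until the trial value falls below that bound. The function, direction, and window length below are invented for illustration:

```python
from collections import deque

def nonmonotone_backtrack(f, x, direction, bound_hist, step=1.0,
                          backtrack=0.5, window=5, max_tries=30):
    """Shrink `step` until the trial objective value is no greater than the
    sliding average of recently stored objective values."""
    upper = sum(bound_hist) / len(bound_hist)  # sliding-average upper bound
    trial = f(x + step * direction)
    for _ in range(max_tries):
        if trial <= upper:
            break
        step *= backtrack  # (F) predefined backtracking factor
        trial = f(x + step * direction)
    bound_hist.append(trial)  # feeds the next iteration's upper bound
    while len(bound_hist) > window:
        bound_hist.popleft()  # keep the averaging window bounded
    return step, trial
```

Because the bound is an average of recent values rather than the last value alone, occasional increases in the objective are tolerated, which is the point of a nonmonotone line search.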
  • Patent number: 10360517
    Abstract: A computing device automatically selects hyperparameter values based on objective criteria to train a predictive model. Each session of a plurality of sessions executes training and scoring of a model type using an input dataset in parallel with other sessions of the plurality of sessions. Unique hyperparameter configurations are determined using a search method and assigned to each session. For each session of the plurality of sessions, training of a model of the model type is requested using a training dataset and the assigned hyperparameter configuration, scoring of the trained model using a validation dataset and the assigned hyperparameter configuration is requested to compute an objective function value, and the received objective function value and the assigned hyperparameter configuration are stored. A best hyperparameter configuration is identified based on an extreme value of the stored objective function values.
    Type: Grant
    Filed: November 27, 2017
    Date of Patent: July 23, 2019
    Assignee: SAS Institute Inc.
    Inventors: Patrick Nathan Koch, Brett Alan Wujek, Oleg Borisovich Golovidov, Steven Joseph Gardner, Joshua David Griffin, Scott Russell Pope, Yan Xu
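The parallel-sessions pattern can be sketched with a thread pool: each session evaluates one unique hyperparameter configuration, and the best configuration is the one with the extreme (here, minimum) objective value. A toy error surface stands in for training and scoring the model; the hyperparameter names (`lr`, `reg`), the random search method, and all counts are invented:

```python
from concurrent.futures import ThreadPoolExecutor
import random

def score_config(cfg):
    """Stand-in for train-then-validate: returns an objective function value.
    A real system would fit the model type with these hyperparameters."""
    lr, reg = cfg
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2  # toy validation error surface

def tune(n_sessions=8, seed=0):
    rng = random.Random(seed)
    # Search method: simple random sampling of unique configurations
    configs = [(rng.uniform(0, 1), rng.uniform(0, 0.1))
               for _ in range(n_sessions)]
    # Each "session" trains and scores its assigned configuration in parallel
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(score_config, configs))
    # Best configuration = extreme value of the stored objective values
    best_i = min(range(n_sessions), key=results.__getitem__)
    return configs[best_i], results[best_i]
```

In the patented system each session also owns its own training and validation requests; the shared thread pool here only mimics that concurrency.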
  • Publication number: 20180240041
    Abstract: A computing device automatically selects hyperparameter values based on objective criteria to train a predictive model. Each session of a plurality of sessions executes training and scoring of a model type using an input dataset in parallel with other sessions of the plurality of sessions. Unique hyperparameter configurations are determined using a search method and assigned to each session. For each session of the plurality of sessions, training of a model of the model type is requested using a training dataset and the assigned hyperparameter configuration, scoring of the trained model using a validation dataset and the assigned hyperparameter configuration is requested to compute an objective function value, and the received objective function value and the assigned hyperparameter configuration are stored. A best hyperparameter configuration is identified based on an extreme value of the stored objective function values.
    Type: Application
    Filed: November 27, 2017
    Publication date: August 23, 2018
    Inventors: Patrick Nathan Koch, Brett Alan Wujek, Oleg Borisovich Golovidov, Steven Joseph Gardner, Joshua David Griffin, Scott Russell Pope, Yan Xu
  • Patent number: 10049302
    Abstract: A computing device trains models for streaming classification. A baseline penalty value is computed that is inversely proportional to a square of a maximum explanatory variable value. A set of penalty values is computed based on the baseline penalty value. For each penalty value of the set of penalty values, a classification type model is trained using the respective penalty value and the observation vectors to compute parameters that define a trained model, the classification type model is validated using the respective penalty value and the observation vectors to compute a validation criterion value that quantifies a validation error, and the validation criterion value, the respective penalty value, and the parameters that define a trained model are stored to the computer-readable medium. The classification type model is trained to predict the response variable value of each observation vector based on the respective explanatory variable value of each observation vector.
    Type: Grant
    Filed: March 5, 2018
    Date of Patent: August 14, 2018
    Assignee: SAS Institute Inc.
    Inventors: Jun Liu, Yan Xu, Joshua David Griffin, Manoj Keshavmurthi Chari
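The baseline-penalty construction and the penalty sweep can be sketched with a closed-form ridge model standing in for the patent's penalized classification training; misclassification rate on the training data stands in for the validation criterion, and the penalty grid, data, and model are all illustrative:

```python
def train_penalized(data, lam):
    """Closed-form ridge fit of a 1-D linear scorer w * x to labels y in
    {0, 1}, a stand-in for the patent's penalized classification training."""
    sxy = sum(x * (1 if y else -1) for x, y in data)  # +/-1 targets
    sxx = sum(x * x for x, _ in data)
    return sxy / (sxx + lam * len(data))

def penalty_path(data, n_penalties=5):
    # Baseline penalty inversely proportional to the squared maximum
    # explanatory variable value, as in the abstract
    xmax = max(abs(x) for x, _ in data)
    base = 1.0 / (xmax * xmax)
    penalties = [base * 10.0 ** (k - n_penalties // 2)
                 for k in range(n_penalties)]
    results = []
    for lam in penalties:
        w = train_penalized(data, lam)
        # Validation stand-in: misclassification rate of the trained model
        err = sum((w * x > 0) != bool(y) for x, y in data) / len(data)
        results.append((lam, w, err))
    return results
```

Storing the penalty, the fitted parameter, and the validation criterion for each penalty value mirrors the abstract's per-penalty bookkeeping; heavier penalties shrink the fitted weight toward zero.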