Patents by Inventor Gukyeong Kwon

Gukyeong Kwon has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12079738
    Abstract: Neural networks and learning algorithms can use the variance of gradients to provide a heuristic understanding of a model. The variance of gradients can be used in active learning techniques to train a neural network. Techniques include receiving a dataset with a vector. The dataset can be annotated and a loss calculated, and the loss value can be used to update the neural network through backpropagation. An updated dataset can be used to calculate additional losses, and the gradients of these losses can be added to a pool of gradient vectors. A variance of gradients can be calculated from the pool of gradient vectors and used to update the neural network.
    Type: Grant
    Filed: February 10, 2021
    Date of Patent: September 3, 2024
    Assignee: Ford Global Technologies, LLC
    Inventors: Armin Parchami, Ghassan AlRegib, Dogancan Temel, Mohit Prabhushankar, Gukyeong Kwon
  • Publication number: 20220327389
    Abstract: In a method for determining whether a test data set is anomalous, a deep neural network is first trained with a plurality of training data sets, producing backpropagated training gradients and statistical measures thereof. The test data set is forward propagated through the deep neural network to generate test data intended labels, including at least original data, prediction labels, and segmentation maps. The intended labels are then backpropagated through the deep neural network to generate a test data backpropagated gradient. If the test data backpropagated gradient differs from one of the statistical measures of the backpropagated training gradients by a predetermined amount, an indication that the test data set is anomalous is generated. The statistical measures of the backpropagated training gradients include a quantity comprising an average of all the backpropagated training gradients.
    Type: Application
    Filed: September 4, 2020
    Publication date: October 13, 2022
    Inventors: Ghassan AlRegib, Gukyeong Kwon, Mohit Prabhushankar, Dogancan Temel
  • Publication number: 20220253724
    Abstract: Neural networks and learning algorithms can use the variance of gradients to provide a heuristic understanding of a model. The variance of gradients can be used in active learning techniques to train a neural network. Techniques include receiving a dataset with a vector. The dataset can be annotated and a loss calculated, and the loss value can be used to update the neural network through backpropagation. An updated dataset can be used to calculate additional losses, and the gradients of these losses can be added to a pool of gradient vectors. A variance of gradients can be calculated from the pool of gradient vectors and used to update the neural network.
    Type: Application
    Filed: February 10, 2021
    Publication date: August 11, 2022
    Applicant: Ford Global Technologies, LLC
    Inventors: Armin Parchami, Ghassan AlRegib, Dogancan Temel, Mohit Prabhushankar, Gukyeong Kwon
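The variance-of-gradients active learning described in the first and third entries above (patent 12079738 / publication 20220253724) can be sketched in plain NumPy. Everything here is an illustrative assumption rather than the patented implementation: a tiny linear softmax classifier stands in for the neural network, and the "pool of gradients" is interpreted as the gradients an unlabeled sample would induce under each candidate label.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def per_sample_gradients(W, X, y):
    """Gradient of the softmax cross-entropy loss w.r.t. W, one flat vector per sample."""
    probs = softmax(X @ W)                       # (n, classes)
    err = probs.copy()
    err[np.arange(len(y)), y] -= 1.0             # dL/dlogits
    # Per-sample outer product x * err, flattened into one gradient vector each.
    return (X[:, :, None] * err[:, None, :]).reshape(len(y), -1)

# Small annotated dataset used to seed the model (hypothetical data).
X_lab = rng.normal(size=(20, 5))
y_lab = rng.integers(0, 3, size=20)
W = np.zeros((5, 3))

# Loss gradients from the annotated data update the model via
# plain gradient descent (standing in for backpropagation).
W -= 0.1 * per_sample_gradients(W, X_lab, y_lab).mean(axis=0).reshape(W.shape)

# Unlabeled pool: since true labels are unknown, pool the gradient
# vector each candidate label would induce, then score the sample by
# the variance of gradients over that pool.
X_unlab = rng.normal(size=(100, 5))
scores = np.empty(len(X_unlab))
for i, x in enumerate(X_unlab):
    pool = per_sample_gradients(W, np.tile(x, (3, 1)), np.arange(3))
    scores[i] = pool.var(axis=0).sum()

# Query the highest-variance samples for annotation.
query = np.argsort(scores)[::-1][:5]
print(query)
```

In this reading, a large gradient variance marks samples the model is most uncertain about, so annotating them first yields the most informative updates.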
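The gradient-based anomaly detection of publication 20220327389 can likewise be sketched. The linear reconstruction model `W`, the `recon_gradient` helper, and the spread-based threshold are hypothetical stand-ins for the patent's deep network and "predetermined amount"; the average of all backpropagated training gradients is the statistical measure the abstract names.

```python
import numpy as np

rng = np.random.default_rng(0)

def recon_gradient(W, x):
    """Backpropagated gradient of the reconstruction loss ||W @ x - x||^2
    with respect to W, flattened into one vector (up to a constant factor)."""
    err = W @ x - x                    # forward pass + loss derivative
    return np.outer(err, x).ravel()    # dL/dW

d = 8
W = np.eye(d) + 0.01 * rng.normal(size=(d, d))   # stand-in "trained" model

# Statistical measures of the backpropagated training gradients:
# their average, plus a typical spread used to set the threshold.
X_train = rng.normal(size=(200, d))
train_grads = np.stack([recon_gradient(W, x) for x in X_train])
mean_grad = train_grads.mean(axis=0)
scale = np.linalg.norm(train_grads - mean_grad, axis=1).mean()

def is_anomalous(x, k=3.0):
    """Flag x if its backpropagated gradient differs from the average
    training gradient by more than k times the typical spread."""
    g = recon_gradient(W, x)
    return np.linalg.norm(g - mean_grad) > k * scale

in_dist = rng.normal(size=d)                     # same distribution as training
out_dist = 10.0 * rng.normal(size=d) + 5.0       # shifted, scaled outlier
print(is_anomalous(in_dist), is_anomalous(out_dist))
```

The intuition matches the abstract: data the trained model has seen produces gradients near the training-gradient average, while anomalous data requires a large corrective gradient and is flagged.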