Patents Examined by Hansol Doh
-
Patent number: 11062208
Abstract: A computer-implemented method and computer processing system are provided for update management for a neural network. The method includes performing an isotropic update process on the neural network using a Resistive Processing Unit. The isotropic update process uses a multiplicand and a multiplier from a multiplication operation. The performing step includes scaling the multiplicand and the multiplier to have a same order of magnitude.
Type: Grant
Filed: December 14, 2017
Date of Patent: July 13, 2021
Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
Inventors: Tayfun Gokmen, Oguzhan Murat Önen
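The scaling step the abstract describes can be illustrated with a short sketch. This is not the patented method itself, only a minimal NumPy rendering of the idea: before applying a rank-1 outer-product update, rebalance the multiplicand vector and the multiplier vector so both have the same order of magnitude, without changing their product. The balancing factor `s` and the square-root split of the learning rate are assumptions for illustration.

```python
import numpy as np

def isotropic_update(W, x, delta, lr=0.01, eps=1e-12):
    """Illustrative sketch: rescale the multiplicand x and the
    multiplier delta to the same order of magnitude before forming
    the rank-1 update lr * outer(delta, x). The product is unchanged;
    only the magnitudes of the two factors are balanced."""
    mx = np.max(np.abs(x)) + eps       # magnitude of the multiplicand
    md = np.max(np.abs(delta)) + eps   # magnitude of the multiplier
    s = np.sqrt(md / mx)               # balancing factor (cancels in the product)
    x_b = x * s * np.sqrt(lr)          # scaled multiplicand
    d_b = delta / s * np.sqrt(lr)      # scaled multiplier
    return W + np.outer(d_b, x_b)      # equals W + lr * outer(delta, x)
```

After scaling, `max(|x_b|)` equals `max(|d_b|)`, which is the "same order of magnitude" property; the final update is numerically identical to the unscaled one.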
-
Patent number: 11049011
Abstract: Approaches for classifying training samples with minimal error in a neural network using a low-complexity neural network classifier are described. In one example, for the neural network, an upper bound on the Vapnik-Chervonenkis (VC) dimension is determined. Thereafter, an empirical error function corresponding to the neural network is determined. A modified error function based on the upper bound on the VC dimension and the empirical error function is generated, and used for training the neural network.
Type: Grant
Filed: November 16, 2017
Date of Patent: June 29, 2021
Assignee: Indian Institute of Technology Delhi
Inventor: Jayadeva
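The combination of an empirical error function with a VC-dimension bound can be sketched as a penalized loss. This is an assumption-laden illustration, not the patent's actual bound: a squared-norm term over the weights stands in for the upper bound on the VC dimension, and `lam` is a hypothetical trade-off parameter.

```python
import numpy as np

def modified_error(y_true, y_pred, weights, lam=0.1):
    """Sketch of a modified error function: empirical error plus a
    penalty standing in for an upper bound on the VC dimension.
    The squared-norm proxy and lam are illustrative assumptions."""
    empirical = np.mean((y_true - y_pred) ** 2)       # empirical error term
    vc_proxy = sum(np.sum(w ** 2) for w in weights)   # stand-in for the VC bound
    return empirical + lam * vc_proxy                 # modified error function
```

Training then minimizes this combined objective rather than the empirical error alone, trading accuracy on the training set against classifier complexity.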
-
Patent number: 11049007
Abstract: A recognition apparatus based on a deep neural network, a training apparatus and methods thereof. The deep neural network is obtained by inputting training samples comprising positive samples and negative samples into an input layer of the deep neural network and training. The apparatus includes: a judging unit configured to judge that a sample to be recognized is a suspected abnormal sample when the confidences of the positive sample classes in a classification result outputted by an output layer of the deep neural network are all less than a predefined threshold value. Hence, the reliability of the confidences in classification results outputted by the deep neural network may be efficiently improved.
Type: Grant
Filed: May 5, 2017
Date of Patent: June 29, 2021
Assignee: FUJITSU LIMITED
Inventors: Song Wang, Wei Fan, Jun Sun
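The judging unit's rule is simple enough to state directly in code. A minimal sketch, assuming `confidences` is a mapping from class name to the network's output confidence (the data structures and names are hypothetical, not from the patent):

```python
def is_suspected_abnormal(confidences, positive_classes, threshold=0.5):
    """Flag a sample as a suspected abnormal sample when the
    confidences of all positive-sample classes fall below the
    predefined threshold. `confidences` maps class -> confidence."""
    return all(confidences[c] < threshold for c in positive_classes)
```

In other words, a sample is only trusted as a positive recognition when at least one positive class clears the threshold; otherwise it is routed for special handling rather than silently misclassified.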
-
Patent number: 10929761
Abstract: Systems and methods for automatically detecting annotation discrepancies in annotated training data samples and repairing the annotated training data samples for a machine learning-based automated dialogue system include evaluating a corpus of a plurality of distinct training data samples; identifying one or more of a slot span defect and a slot label defect of a target annotated slot span of a target training data sample of the corpus based on the evaluation; and automatically correcting one or more annotations of the target annotated slot span based on the identified one or more of the slot span defect and the slot label defect.
Type: Grant
Filed: June 8, 2020
Date of Patent: February 23, 2021
Assignee: Clinc, Inc.
Inventors: Stefan Larson, Anish Mahendran, Parker Hill, Jonathan K. Kummerfeld, Michael A. Laurenzano, Lingjia Tang, Jason Mars
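One simple way to picture slot label defect detection is cross-corpus consistency checking. The following is purely an illustrative sketch under that assumption, not the patented method: compare how the same span text is labeled elsewhere in the corpus and correct a disagreeing annotation to the majority label.

```python
from collections import Counter

def correct_slot_label(span_text, annotations):
    """Illustrative consistency check (an assumption, not the
    patent's algorithm): collect every label the corpus assigns to
    this span text and return the majority label, which can replace
    a defective annotation. `annotations` is a list of (text, label)."""
    labels = [label for text, label in annotations if text == span_text]
    if not labels:
        return None  # span text unseen elsewhere; nothing to compare against
    return Counter(labels).most_common(1)[0][0]
```

A span whose current label disagrees with the returned majority label would be flagged as a candidate slot label defect and repaired automatically.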
-
Patent number: 10909630
Abstract: Embodiments of the present invention are generally directed towards providing systems and methods for risk recommendation, mitigation and prediction. In particular, embodiments of the present invention are configured to allow data related to known hazards to be input, interpreted, and tracked, and to allow estimation of the risk present in a variety of scenarios. Further embodiments of the present invention are configured to allow for predictive modeling and analysis of risk based on data as well as predictive behavior and other modeled information.
Type: Grant
Filed: March 14, 2017
Date of Patent: February 2, 2021
Inventor: David Baxter
-
Patent number: 10783432
Abstract: A computer-implemented method and computer processing system are provided for update management for a neural network. The method includes performing an isotropic update process on the neural network using a Resistive Processing Unit. The isotropic update process uses a multiplicand and a multiplier from a multiplication operation. The performing step includes scaling the multiplicand and the multiplier to have a same order of magnitude.
Type: Grant
Filed: April 14, 2017
Date of Patent: September 22, 2020
Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
Inventors: Tayfun Gokmen, Oguzhan Murat Önen
-
Patent number: 10671908
Abstract: A differential recurrent neural network (RNN) is described that handles dependencies that go arbitrarily far in time by allowing the network system to store states using recurrent loops without adversely affecting training. The differential RNN includes a state component for storing states, and a trainable transition and differential non-linearity component which includes a neural network. The trainable transition and differential non-linearity component takes as input an output of the previous stored states from the state component along with an input vector, and produces positive and negative contribution vectors which are employed to produce a state contribution vector. The state contribution vector is input into the state component to create a set of current states. In one implementation, the current states are simply output.
Type: Grant
Filed: April 14, 2017
Date of Patent: June 2, 2020
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventor: Patrice Simard
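The data flow the abstract describes can be sketched as a single step. This is a simplified assumption, not Microsoft's implementation: the trainable transition component is rendered here as one ReLU layer per branch, producing the positive and negative contribution vectors whose difference forms the state contribution added back into the stored state.

```python
import numpy as np

def differential_rnn_step(state, x, Wp, Wn, Up, Un):
    """One differential-RNN step, under the simplifying assumption
    that each branch of the transition component is a single ReLU
    layer. The previous state and the input vector produce positive
    and negative contribution vectors; their difference is the state
    contribution, accumulated into the stored state."""
    pos = np.maximum(0.0, Wp @ x + Up @ state)  # positive contribution vector
    neg = np.maximum(0.0, Wn @ x + Un @ state)  # negative contribution vector
    contribution = pos - neg                    # state contribution vector
    return state + contribution                 # current states (also the output)
```

Because the state is updated additively through the recurrent loop rather than repeatedly squashed, the stored states can carry information arbitrarily far in time without the usual gradient decay.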
-
Patent number: 10572800
Abstract: Aspects of the present disclosure describe techniques for training a convolutional neural network using an inconsistent stochastic gradient descent (ISGD) algorithm. The training effort for each training batch used by the ISGD algorithm is dynamically adjusted according to a determined loss for that batch, by which batches are classified into two sub-states: well-trained or under-trained. The ISGD algorithm provides more iterations for under-trained batches while reducing iterations for well-trained ones.
Type: Grant
Filed: February 2, 2017
Date of Patent: February 25, 2020
Assignee: NEC Corporation
Inventors: Linnan Wang, Yi Yang, Renqiang Min, Srimat Chakradhar
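The batch scheduling idea can be sketched in a few lines. The specific classification rule below (comparing a batch's loss to the running mean, and scaling the extra iterations by their ratio) is an assumption for illustration; the patent only specifies that batches are classified by a determined loss and that under-trained batches receive more iterations.

```python
def batch_iterations(loss, mean_loss, base_iters=1, max_iters=4):
    """Sketch of ISGD-style scheduling: a batch whose loss exceeds
    the running mean loss is treated as under-trained and gets extra
    iterations; well-trained batches keep the base count. The mean
    threshold and the ratio-based scaling are illustrative assumptions."""
    if loss <= mean_loss:
        return base_iters                      # well-trained: no extra work
    ratio = loss / max(mean_loss, 1e-8)        # how far above the mean
    return min(max_iters, base_iters + int(ratio))  # capped extra iterations
```

A training loop would call this per batch, spending its compute budget where the loss indicates the model is still struggling instead of uniformly across all batches.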