Network Learning Techniques (e.g., Back Propagation) Patents (Class 382/157)
  • Patent number: 5535302
    Abstract: A method and apparatus are provided to determine image affine flow from time-varying imagery. The novel artificial neural computational system of a cortical hypercolumn, comprising a plurality of specific orientation (SO) columns and a least-square error fitting circuit, is based on a Lie group model of cortical visual motion processing. Time-varying imagery, comprising intensity imagery and time-derivative imagery, is provided to a plurality of specific orientation (SO) columns comprising simple cells and Lie germs. The cortical representations of the image time derivative and the affine Lie derivatives are extracted from the responses of the simple cells and Lie germs, respectively. The temporal-derivative and affine Lie-derivative information obtained from each specific orientation (SO) column is applied to a least-square error fitting analog circuit having a three-layer multiplicative neural architecture to determine the image affine flow components in accordance with an error minimization gradient dynamical system technique.
    Type: Grant
    Filed: December 1, 1994
    Date of Patent: July 9, 1996
    Inventor: Tien-Ren Tsao
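The least-square fitting step in this abstract can be illustrated with a toy computation. Assuming brightness constancy (I_t + ∇I · v = 0) and an affine flow field v(x) = Ax + b, the six affine parameters can be recovered by ordinary least squares over per-pixel gradients. This is only a sketch: the data are synthetic, and NumPy's closed-form solver stands in for the patent's analog gradient dynamical circuit, which descends the same squared-error surface dynamically.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200
pts = rng.uniform(-1.0, 1.0, (N, 2))                 # pixel coordinates
A_true = np.array([[0.10, -0.05], [0.02, 0.08]])     # affine flow to recover
b_true = np.array([0.30, -0.10])
v = pts @ A_true.T + b_true                          # per-pixel velocity v = Ax + b
grad = rng.normal(size=(N, 2))                       # synthetic spatial gradients ∇I
I_t = -(grad * v).sum(axis=1)                        # brightness constancy: I_t = -∇I·v

# Design matrix over the six affine parameters (a11, a12, a21, a22, b1, b2):
# ∇I·v = gx*(a11*x + a12*y + b1) + gy*(a21*x + a22*y + b2)
X = np.column_stack([
    grad[:, 0] * pts[:, 0], grad[:, 0] * pts[:, 1],
    grad[:, 1] * pts[:, 0], grad[:, 1] * pts[:, 1],
    grad[:, 0], grad[:, 1],
])
params, *_ = np.linalg.lstsq(X, -I_t, rcond=None)    # least-square error fit
```

With noise-free synthetic data, `params` recovers `A_true` and `b_true` exactly.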
  • Patent number: 5528700
    Abstract: A character recognition system based on a neural network determines activation patterns in an input layer and an output layer, increases the weights of synapses in a middle layer so that middle-layer neurons corresponding to activated neurons in the input and output layers activate at more than a certain rate, and repeats the same process for each neuron in the middle layer. The input layer and output layer possess a plurality of neurons which activate and output certain data according to a specific result, and the middle layer lies between the input layer and output layer. The middle layer also possesses a plurality of neurons, each of which is connected to every neuron in the input layer and output layer.
    Type: Grant
    Filed: April 25, 1994
    Date of Patent: June 18, 1996
    Assignee: Yozan Inc.
    Inventors: Sunao Takatori, Makoto Yamamoto
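The weight-increase rule described above resembles Hebbian reinforcement of synapses between co-active neurons. A minimal sketch, assuming binary activation patterns and a fixed increment (both hypothetical choices not specified in the abstract):

```python
import numpy as np

# four hypothetical presentations: binary input/output activation patterns
inputs  = np.array([[1, 0, 1],
                    [1, 1, 0],
                    [1, 0, 1],
                    [1, 0, 0]])
outputs = np.array([[1, 0],
                    [0, 1],
                    [1, 0],
                    [1, 0]])

W = np.zeros((3, 2))              # synapse weights, one per input/output pair
for x, y in zip(inputs, outputs):
    # strengthen synapses between co-active input/output neuron pairs
    W += 0.1 * np.outer(x, y)
```

Input neuron 0 co-activates with output neuron 0 in three of the four presentations, so that synapse ends up strongest.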
  • Patent number: 5524065
    Abstract: A pattern recognition system that provides an indication of the confidence with which a candidate is selected for an unknown pattern. The pattern recognition apparatus includes an image data input device, a host for segmenting the image data into unknown patterns, and a character recognition device for providing a candidate for each unknown pattern. The character recognition device includes a confidence level indicator for providing a confidence level indication. In one aspect, the confidence level indication is determined based on the proximity of an unknown pattern to a known pattern. In another aspect, it is determined based on the consistency with which the unknown pattern is recognized using different recognition functions. In yet another aspect, it is determined by ensuring that a candidate is not provided unless the candidate is closer than a predetermined distance to a known pattern.
    Type: Grant
    Filed: July 25, 1994
    Date of Patent: June 4, 1996
    Assignee: Canon Kabushiki Kaisha
    Inventor: Toshiaki Yagasaki
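The first and third aspects — confidence from proximity to a known pattern, and withholding a candidate beyond a predetermined distance — can be sketched as a nearest-neighbour classifier with a reject option. The linear confidence formula and the `reject_dist` threshold here are illustrative assumptions, not the patent's method:

```python
import numpy as np

# hypothetical known patterns and their labels
known = np.array([[0.0, 0.0], [1.0, 1.0]])
labels = ["A", "B"]

def classify(x, reject_dist=0.5):
    """Return (candidate, confidence); reject if no known pattern is close."""
    d = np.linalg.norm(known - np.asarray(x, dtype=float), axis=1)
    i = int(d.argmin())
    if d[i] > reject_dist:
        return None, 0.0                          # no candidate provided
    return labels[i], 1.0 - d[i] / reject_dist    # closer -> higher confidence
```

For example, `classify([0.1, 0.0])` selects "A" with high confidence, while `classify([3.0, 3.0])` is rejected outright.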
  • Patent number: 5515454
    Abstract: A self-organizing circuit that provides improved performance at reduced cost. The improvements are of two basic types: those that apply to improved circuit design and those that apply to improved "teaching" of the circuit. A method to allow the circuit elements to learn new patterns quickly is provided. Also, a mechanism by which serial or sequential information can be learned is disclosed. Finally, the invention includes mechanisms by which the circuits can be simplified by reducing the number of interconnections within the circuit. Improved teaching includes ways by which the self-organizing circuit can be quickly taught new patterns: first, by making each input to a subcircuit compete against the many other inputs to that subcircuit; second, by weighting each input according to simple Boolean functions; and lastly, by incorporating a method by which information can be added to the circuit after it has already learned some information.
    Type: Grant
    Filed: July 30, 1992
    Date of Patent: May 7, 1996
    Inventor: B. Shawn Buckley
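The competition among inputs to a subcircuit can be sketched as a winner-take-all step followed by reinforcement of the winning connection. The weight values and learning rate below are illustrative assumptions, not taken from the patent:

```python
def compete(inputs, weights):
    """Each input to a subcircuit competes against the others;
    the strongest weighted input wins and drives the output."""
    scores = [w * x for w, x in zip(weights, inputs)]
    winner = max(range(len(scores)), key=scores.__getitem__)
    return winner, scores[winner]

weights = [0.2, 0.9, 0.5]
idx, strength = compete([1, 1, 0], weights)   # input 2 is inactive
weights[idx] += 0.1 * (1.0 - weights[idx])    # reinforce the winning input
```

Here input 1 wins (weighted score 0.9) and its weight is nudged toward 1, a simple self-organizing update.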
  • Patent number: 5500905
    Abstract: A multi-layered pattern recognition neural network (30) is disclosed that comprises an input layer (50) that is operable to be mapped onto an input space that includes a scan window (32). Two hidden layers (54) and (58) map the input space to an output layer (34). The hidden layers utilize a local receptor field architecture and store representations of objects within the scan window (32) for mapping into one of a plurality of output nodes. Further, the output layer (34) is also operable to store representations of desired distances between the center of the scan window (32) and the next adjacent object thereto and also the distance between the center of the scan window (32) and the center of the current object. A scanning system can then utilize the information regarding the distance to the next adjacent object, which is stored in an output vector (40) to incrementally jump to the center of the next adjacent character rather than scan the entire distance therebetween.
    Type: Grant
    Filed: March 16, 1992
    Date of Patent: March 19, 1996
    Assignee: Microelectronics and Computer Technology Corporation
    Inventors: Gale L. Martin, James A. Pittman, Mosfeq Rashid
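The jump-scanning idea — using the network's predicted distance to the next adjacent character instead of scanning every intervening pixel — can be sketched with hypothetical character centers standing in for the trained network's output vector (40):

```python
# hypothetical character centers along a scan line
centers = [5, 12, 22, 30]

pos = centers[0]              # scan window starts centered on the first character
visited = [pos]
for nxt in centers[1:]:
    # in the patent, the output vector encodes this distance; here we
    # read it directly from the ground-truth centers for illustration
    jump = nxt - pos
    pos += jump               # jump straight to the next character's center
    visited.append(pos)

# 4 window placements instead of the 26 positions a pixel-by-pixel
# scan over the same span would require
```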
  • Patent number: 5493688
    Abstract: Pattern categorization is provided by a self-organizing analog field/layer which learns many-to-many, analog spatiotemporal mappings. The field/layer employs a set of input nodes, each having two long term memory weights, and a set of output nodes. Each input node categorizes patterns with respect to a plurality of categories; its long term memory weights encode the patterns it categorizes. Each input node generates signals as a function of its long term memory weights and the input signals to the node. Each input node is coupled to a different output node. Each output node receives the signals generated by its respective input node and selects a category of that input node. The output nodes provide a mapping between plural parts of the input pattern and plural categories of the input nodes. Category selections of the output nodes are modified such that the sum of the output signals from the output nodes is within a predefined range.
    Type: Grant
    Filed: January 11, 1993
    Date of Patent: February 20, 1996
    Assignee: Booz, Allen & Hamilton, Inc.
    Inventor: Fred S. Weingard
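One way to picture the two-weight input nodes and the output-sum constraint (both per the abstract; the concrete signal and rescaling functions below are invented for illustration):

```python
import numpy as np

# each input node i carries two long-term-memory weights (w_lo[i], w_hi[i])
w_lo = np.array([0.1, 0.4, 0.7])
w_hi = np.array([0.3, 0.6, 0.9])

def node_signals(x):
    # hypothetical signal function: maximal when the input falls between
    # a node's two weights, decaying with distance otherwise
    inside = (x >= w_lo) & (x <= w_hi)
    dist = np.minimum(np.abs(x - w_lo), np.abs(x - w_hi))
    return np.where(inside, 1.0, 1.0 / (1.0 + dist))

def normalize(signals, lo=0.9, hi=1.1):
    # rescale so the sum of output signals stays within a predefined range
    s = signals.sum()
    return signals * (np.clip(s, lo, hi) / s)

out = normalize(node_signals(0.5))
```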
  • Patent number: 5444796
    Abstract: An unsupervised back propagation method for training neural networks. For a set of inputs, target outputs are assigned 1's and 0's randomly or arbitrarily for a small number of outputs. The learning process is initiated and the convergence of outputs towards targets is monitored. At intervals, the learning is paused, the target values for those outputs which are converging at a less-than-average rate are changed (e.g., 0 → 1 or 1 → 0), and the learning is then resumed with the new targets. The process is iterated continuously and the outputs converge on a stable classification, thereby providing unsupervised back propagation. In a further embodiment, samples classified with the trained network may serve as the training sets for additional subdivisions to grow additional layers of a hierarchical classification tree which converges to indivisible branch tips. After training is completed, such a tree may be used to classify new unlabelled samples with high efficiency.
    Type: Grant
    Filed: October 18, 1993
    Date of Patent: August 22, 1995
    Assignee: Bayer Corporation
    Inventor: Leonard Ornstein
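A minimal sketch of the pause-and-flip training loop, using a single-layer sigmoid network in place of a full multi-layer back propagation network; the layer sizes, learning rate, pause interval, and flip criterion are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 5))                        # unlabelled samples
T = rng.integers(0, 2, size=(40, 3)).astype(float)  # random initial 0/1 targets

W = np.zeros((5, 3))
b = np.zeros(3)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(300):
    Y = sigmoid(X @ W + b)
    err = Y - T
    W -= 0.1 * X.T @ err / len(X)                   # gradient step toward targets
    b -= 0.1 * err.mean(axis=0)
    if epoch % 100 == 99:                           # pause and revise targets
        per_output = np.abs(err).mean(axis=0)
        slow = per_output > per_output.mean()       # below-average convergence
        flip = np.abs(err) > 0.5                    # output disagrees with target
        T[:, slow] = np.where(flip[:, slow], 1.0 - T[:, slow], T[:, slow])

final_err = np.abs(sigmoid(X @ W + b) - T).mean()
```

Flipping a target toward the side the output already favors lets the targets and outputs settle jointly on a stable classification, which is the essence of the unsupervised scheme.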