Recurrent Patents (Class 706/30)
  • Patent number: 7324980
    Abstract: This invention relates to an information processing device and method that enable generation of an unlearned new pattern. Data xt corresponding to a predetermined time-series pattern is input to an input layer (11) of a recurrent neural network (1), and a prediction value x*t+1 is acquired from an output layer (13). The difference between teacher data xt+1 and the prediction value x*t+1 is learned by a back-propagation method, and a weighting coefficient of an intermediate layer (12) is set to a predetermined value. After the recurrent neural network has learned plural time-series patterns, a parameter whose value differs from the values used in learning is input to parametric bias nodes (11-2), and an unlearned time-series pattern corresponding to the parameter is generated from the output layer (13). This invention can be applied to a robot.
    Type: Grant
    Filed: January 21, 2003
    Date of Patent: January 29, 2008
    Assignees: Sony Corporation, Riken
    Inventors: Masato Ito, Jun Tani
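The generation mechanism described above can be sketched in a few lines of NumPy. The weights below are random stand-ins for the backpropagation-trained coefficients, and the network size and closed-loop rollout are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy RNN with a dedicated "parametric bias" input node. In the patent the
# weights come from backpropagation over several taught sequences; here they
# are random, purely to illustrate the generation mechanism.
n_hidden = 8
W_h = rng.standard_normal((n_hidden, n_hidden)) * 0.5
W_x = rng.standard_normal(n_hidden) * 0.5   # weight for the data input x_t
W_p = rng.standard_normal(n_hidden) * 0.5   # weights for the parametric bias node
W_o = rng.standard_normal(n_hidden) * 0.5   # hidden -> output (prediction x*_{t+1})

def generate(p, steps=20):
    """Closed-loop generation: feed each prediction back as the next input
    while holding the parametric bias p fixed."""
    h = np.zeros(n_hidden)
    x = 0.0
    out = []
    for _ in range(steps):
        h = np.tanh(W_h @ h + W_x * x + W_p * p)
        x = float(W_o @ h)   # prediction x*_{t+1} becomes the next input
        out.append(x)
    return np.array(out)

# Bias values seen during training reproduce taught patterns; an unseen value
# (e.g. an interpolation) yields a new, "unlearned" pattern.
seq_a = generate(p=-1.0)
seq_b = generate(p=+1.0)
seq_new = generate(p=0.3)   # parameter value not used in learning
print(seq_new[:5])
```

Varying only the parametric bias while keeping all trained weights fixed is what lets a single network produce a family of time-series patterns.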
  • Patent number: 7321882
    Abstract: A method for the supervised teaching of a recurrent neural network (RNN) is disclosed. A typical embodiment of the method utilizes a large (50 units or more), randomly initialized RNN with globally stable dynamics. During the training period, the output units of this RNN are teacher-forced to follow the desired output signal. During this period, activations from all hidden units are recorded. At the end of the teaching period, these recorded data are used as input for a method which computes new weights of those connections that feed into the output units. The method is distinguished from existing training methods for RNNs through the following characteristics: (1) Only the weights of connections to output units are changed by learning—existing methods for teaching recurrent networks adjust all network weights.
    Type: Grant
    Filed: October 5, 2001
    Date of Patent: January 22, 2008
    Assignee: Fraunhofer-Gesellschaft zur Foederung der Angewandten Forschung e.V.
    Inventor: Herbert Jaeger
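The training scheme above is the echo state network idea: only the reservoir-to-output weights are learned, by linear regression on recorded hidden activations. A minimal NumPy sketch on a toy task (sizes and signals are illustrative, and the patent's output feedback during teacher forcing is omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir: large, random, fixed. Scale to spectral radius < 1 so the
# dynamics are globally stable, as the patent requires.
N = 60                                   # reservoir size (patent suggests 50+)
W = rng.standard_normal((N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (N, 1))

# Toy task: map an input sinusoid to a phase-shifted version of itself.
T = 300
u = np.sin(np.arange(T) * 0.2)[:, None]        # input signal
y_target = np.sin(np.arange(T) * 0.2 + 0.5)    # desired output

# Drive the reservoir and record ALL hidden activations.
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + (W_in @ u[t]).ravel())
    states[t] = x

# Learning changes ONLY the connections feeding the output units:
# a ridge regression from recorded states to the desired output.
washout = 50
S, y = states[washout:], y_target[washout:]
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ y)

print("train RMSE:", np.sqrt(np.mean((S @ W_out - y) ** 2)))
```

Because the recurrent weights never change, training reduces to one linear solve, which is the key efficiency claim distinguishing this method from full-gradient RNN training.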
  • Patent number: 7280989
    Abstract: A neural network computer (20) includes a weighting network (21) coupled to a plurality of phase-locked loop circuits (25_1-25_N). The weighting network (21) has a plurality of weighting circuits (C_1,1, . . . , C_N,N) having output terminals connected to a plurality of adder circuits (31_1-31_N). A single weighting element (C_k,j) typically has a plurality of output terminals coupled to a corresponding adder circuit (31_k). Each adder circuit (31_k) is coupled to a corresponding bandpass filter circuit which is in turn coupled to a corresponding phase-locked loop circuit (25_k). The weighting elements (C_1,1, . . . , C_N,N) are programmed with connection strengths, wherein the connection strengths have phase-encoded weights. The phase relationships are used to recognize an incoming pattern.
    Type: Grant
    Filed: January 26, 2001
    Date of Patent: October 9, 2007
    Assignee: Arizona Board of Regents
    Inventors: Frank C. Hoppensteadt, Eugene M. Izhikevich
  • Patent number: 7089174
    Abstract: A first model (10), such as an architectural-level model or an instruction set simulator model, makes calls to a second model (12), such as a pipeline simulator for a data processing device, which returns cycle count or energy consumption values. The calls to the second model are relatively slow. The system stores the returned behavioural characteristics from the second model (12) in a memo table (14), and when a sufficient number of these have been returned with sufficiently little variation between them, they are marked as being valid for use in place of a call to the second model (12), thus speeding up modelling.
    Type: Grant
    Filed: February 21, 2003
    Date of Patent: August 8, 2006
    Assignee: ARM Limited
    Inventors: Syed Samin Ishtiaq, Peter Neal, John Mark Burton
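The memo-table mechanism can be sketched as a small cache that keeps calling the slow second model until enough consistent results arrive. The class, validity threshold, and toy pipeline model below are illustrative assumptions, not ARM's implementation:

```python
class MemoTable:
    """Cache slow second-model calls; once n_needed results for the same key
    agree within tol (relative), mark the entry valid and skip further calls."""
    def __init__(self, slow_model, n_needed=3, tol=0.05):
        self.slow_model = slow_model
        self.n_needed, self.tol = n_needed, tol
        self.samples = {}     # key -> list of returned values
        self.valid = {}       # key -> memoized value, usable in place of a call
        self.slow_calls = 0

    def query(self, key):
        if key in self.valid:            # fast path: memoized result
            return self.valid[key]
        self.slow_calls += 1
        v = self.slow_model(key)
        s = self.samples.setdefault(key, [])
        s.append(v)
        if len(s) >= self.n_needed and \
           max(s) - min(s) <= self.tol * max(abs(x) for x in s):
            self.valid[key] = sum(s) / len(s)   # little variation: accept
        return v

# Hypothetical "pipeline simulator" returning a cycle count for an instruction.
def pipeline_model(key):
    return 10.0 + 0.01 * (hash(key) % 3)   # nearly constant per key

memo = MemoTable(pipeline_model)
for _ in range(10):
    memo.query("LDR r0,[r1]")
print(memo.slow_calls)   # only the first few queries hit the slow model
```

After the entry is marked valid, every further query for that key is a dictionary lookup, which is the speedup the patent claims.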
  • Patent number: 7089219
    Abstract: An information processing apparatus includes a first recurrent neural network (RNN) for performing processing which corresponds to a time-series and a second RNN for processing another correlated time-series. The difference between a context set output by the first RNN and a context set output by the second RNN is computed by a subtractor, and the obtained difference is used as a prediction error. Backpropagation is performed based on the prediction error, thus determining a coefficient for each neuron of an output layer, an intermediate layer, and an input layer.
    Type: Grant
    Filed: March 25, 2005
    Date of Patent: August 8, 2006
    Assignee: Sony Corporation
    Inventor: Jun Tani
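The subtractor-and-prediction-error scheme can be sketched as follows. The tiny Elman-style networks, the toy correlated series, and the numerical gradient (standing in for the patent's backpropagation) are all simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def run_rnn(W_h, W_x, xs):
    """Return the final context (hidden state) after processing sequence xs."""
    h = np.zeros(W_h.shape[0])
    for x in xs:
        h = np.tanh(W_h @ h + W_x * x)
    return h

n = 5
W_h1, W_x1 = rng.standard_normal((n, n)) * 0.3, rng.standard_normal(n)
W_h2, W_x2 = rng.standard_normal((n, n)) * 0.3, rng.standard_normal(n)

xs1 = np.sin(np.arange(30) * 0.3)   # one time series
xs2 = np.cos(np.arange(30) * 0.3)   # a correlated time series

def prediction_error(w):
    c1 = run_rnn(W_h1, W_x1, xs1)   # context set output by the first RNN
    c2 = run_rnn(W_h2, w, xs2)      # context set output by the second RNN
    d = c1 - c2                     # the subtractor's output
    return 0.5 * float(d @ d)       # squared difference as prediction error

# Gradient descent on the second RNN's input weights so its context tracks
# the first RNN's (a numerical gradient keeps this sketch short).
eps, lr = 1e-5, 0.05
e0 = prediction_error(W_x2)
for _ in range(200):
    g = np.array([(prediction_error(W_x2 + eps * np.eye(n)[i])
                   - prediction_error(W_x2 - eps * np.eye(n)[i])) / (2 * eps)
                  for i in range(n)])
    W_x2 = W_x2 - lr * g
e1 = prediction_error(W_x2)
print(f"prediction error: {e0:.4f} -> {e1:.4f}")
```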
  • Patent number: 7082421
    Abstract: An information processing apparatus includes a first recurrent neural network (RNN) for performing processing which corresponds to a time-series and a second RNN for processing another correlated time-series. The difference between a context set output by the first RNN and a context set output by the second RNN is computed by a subtractor, and the obtained difference is used as a prediction error. Backpropagation is performed based on the prediction error, thus determining a coefficient for each neuron of an output layer, an intermediate layer, and an input layer.
    Type: Grant
    Filed: March 25, 2005
    Date of Patent: July 25, 2006
    Assignee: Sony Corporation
    Inventor: Jun Tani
  • Patent number: 7072875
    Abstract: An information processing apparatus includes a first recurrent neural network (RNN) for performing processing which corresponds to a time-series and a second RNN for processing another correlated time-series. The difference between a context set output by the first RNN and a context set output by the second RNN is computed by a subtractor, and the obtained difference is used as a prediction error. Backpropagation is performed based on the prediction error, thus determining a coefficient for each neuron of an output layer, an intermediate layer, and an input layer.
    Type: Grant
    Filed: March 25, 2005
    Date of Patent: July 4, 2006
    Assignee: Sony Corporation
    Inventor: Jun Tani
  • Patent number: 7028017
    Abstract: A temporal summation device can be composed of one or more nanoconnections having an input and an output, wherein an input signal provided to the input causes one or more of the nanoconnections to experience an increase in connection strength over time. Additionally, a voltage divider is formed by the nanoconnection(s) and a resistor connected to the output of the nanoconnection(s), such that the voltage divider provides a voltage at the output of the nanoconnection(s) that is in direct proportion to the connection strength of the nanoconnection(s). An amplifier is also connected to the voltage divider; when the voltage provided by the voltage divider attains a desired threshold voltage, the amplifier attains a high voltage output, thereby providing temporal summation.
    Type: Grant
    Filed: January 31, 2005
    Date of Patent: April 11, 2006
    Assignee: Knowm Tech, LLC
    Inventor: Alex Nugent
  • Patent number: 6963862
    Abstract: A method for training a recurrent network represented by x(k+1) = f(W x(k)), where W is a weight matrix, x is the output of the network, and k is a time index, includes (a) determining the weight matrix at a first time increment, (b) incrementing the time increment associated with a received data point, and (c) determining the change in the weight matrix at the incremented time increment according to the formula: ΔW(k) = ΔW(k−1) + ε(k) xᵀ(k−1) V⁻¹(k−1) − B(k−1) [V⁻¹(k−1) x(k−1)] [V⁻¹(k−1) x(k−1)]ᵀ / (1 + xᵀ(k−1) V⁻¹(k−1) x(k−1))
    Type: Grant
    Filed: March 30, 2001
    Date of Patent: November 8, 2005
    Assignee: The Texas A&M University System
    Inventors: Alexander G. Parlos, Amir F. Atiya
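The recursion above has the Sherman–Morrison structure of recursive least squares (RLS). The sketch below is a standard RLS update for an online linear estimate, not the patent's exact algorithm: V⁻¹ is tracked as the matrix P, the B term is omitted, and the noiseless toy data is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(3)

# Recursive least squares: w estimates a fixed linear map from streaming data,
# with P maintained as the inverse correlation matrix (the role of V^{-1}).
n = 4
w_true = rng.standard_normal(n)
w = np.zeros(n)
P = np.eye(n) * 100.0   # large initial P: weak prior on the weights

for k in range(200):
    x = rng.standard_normal(n)
    y = w_true @ x                       # noiseless target for clarity
    err = y - w @ x                      # innovation, the epsilon(k) term
    Px = P @ x
    denom = 1.0 + x @ Px                 # the 1 + x^T V^{-1} x denominator
    w = w + err * Px / denom             # rank-one weight update
    P = P - np.outer(Px, Px) / denom     # Sherman-Morrison downdate of P

print(np.max(np.abs(w - w_true)))   # shrinks toward zero as w converges
```

Each step costs O(n²) instead of re-solving the full least-squares problem, which is why RLS-style recursions suit online training of recurrent networks.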
  • Patent number: 6915283
    Abstract: An information processing apparatus includes a first recurrent neural network (RNN) for performing processing which corresponds to a time-series and a second RNN for processing another correlated time-series. The difference between a context set output by the first RNN and a context set output by the second RNN is computed by a subtractor, and the obtained difference is used as a prediction error. Backpropagation is performed based on the prediction error, thus determining a coefficient for each neuron of an output layer, an intermediate layer, and an input layer.
    Type: Grant
    Filed: July 2, 2001
    Date of Patent: July 5, 2005
    Assignee: Sony Corporation
    Inventor: Jun Tani
  • Patent number: 6898582
    Abstract: A method and apparatus for processing a composite signal generated by a transient signal generation mechanism to extract a repetitive low SNR transient signal, such as an evoked potential (EP) appearing in an electroencephalogram (EEG) generated in response to sensory stimulation, by: (a) dynamically identifying, via a learning process, the major transient signal types in the composite signal; (b) decomposing the identified major transient signal types into their respective constituent components; (c) synthesizing a parametric model emulating the transient signal generation mechanism; and (d) utilizing the model and the constituent components to identify and extract the low SNR transient signal from the composite signal.
    Type: Grant
    Filed: September 26, 2001
    Date of Patent: May 24, 2005
    Assignee: Algodyne, Ltd.
    Inventor: Daniel H. Lange
  • Patent number: 6832214
    Abstract: Disclosed is a system, method, and program for generating a compiler to map a code set to object code capable of being executed on an operating system platform. At least one neural network is trained to convert the code set to object code. The at least one trained neural network can then be used to convert the code set to the object code.
    Type: Grant
    Filed: December 7, 1999
    Date of Patent: December 14, 2004
    Assignee: International Business Machines Corporation
    Inventor: Chung T. Nguyen
  • Patent number: 6826550
    Abstract: Provided is a compiler to map application program code to object code capable of being executed on an operating system platform. A first neural network module is trained to generate characteristic output based on input information describing attributes of the application program. A second neural network module is trained to receive as input the application program code and the characteristic output and, in response, generate object code. The first and second neural network modules are used to convert the application program code to object code.
    Type: Grant
    Filed: December 15, 2000
    Date of Patent: November 30, 2004
    Assignee: International Business Machines Corporation
    Inventors: Michael Wayne Brown, Chung Tien Nguyen
  • Patent number: 6799171
    Abstract: A neural network system including a plurality of tiers of interconnected computing elements. The plurality of tiers includes an input tier whereto a sequence of input speech vectors is applied at a first rate. Two of the plurality of tiers are interconnected through a decimator configured to reduce the first rate of the sequence of input vectors. Alternatively, two of the plurality of tiers are interconnected through an interpolator configured to increase the first rate of the sequence of input vectors.
    Type: Grant
    Filed: March 1, 2001
    Date of Patent: September 28, 2004
    Assignee: Swisscom AG
    Inventor: Robert Van Kommer
  • Patent number: 6792413
    Abstract: This invention provides a data processing apparatus which can store and recall more complicated time-series data than that processed by related-art technologies. In the data processing apparatus, a recurrent neural network (RNN) of a higher layer generates a long-period parameter and supplies it to the input layer of an RNN of a lower layer via a computing block. The lower-layer RNN uses this input as a parameter when computing the short-period time series.
    Type: Grant
    Filed: February 6, 2002
    Date of Patent: September 14, 2004
    Assignee: Sony Corporation
    Inventor: Jun Tani
  • Patent number: 6751601
    Abstract: The invention consists of a learning system (108) based on the Dynamical System Architecture (DSA). Every implementation of the DSA is composed of a dynamic system (112) and a static system (116). The dynamic system (112) generates an intermediate output (114), which together with an external input (100) is fed into the static system (116), which produces a generated output (106). Every time the dynamic system (112) is reset it produces the same trajectory: an intermediate output (114) that does not cross over itself during the temporal span of each of the events generated by a reference system (102) whose behavior has to be duplicated. The static system (116) can be anything that behaves like a trainable universal function approximator. Training uses the intermediate output (114), the external input (100), the observed output (104) produced by the reference system (102) whose behavior is going to be mimicked, and a self-organizing procedure that only modifies the parameters of the static system (116).
    Type: Grant
    Filed: July 20, 2001
    Date of Patent: June 15, 2004
    Inventor: Pablo Zegers
  • Patent number: 6735580
    Abstract: A neural network based universal time series prediction system for financial securities includes a pipelined recurrent ANN architecture having a plurality of identical modules. The modules first adjust internal weights and biases in response to a first training set representing a nonlinear financial time series of samples of a financial quantity and a target value, and then determine and store an estimated prediction error of the ANN in order to adjust short-term stock price predictions in accordance with the stored prediction error. The prediction system is also designed to output upper and lower prediction bounds within a confidence region.
    Type: Grant
    Filed: August 25, 2000
    Date of Patent: May 11, 2004
    Assignee: Westport Financial LLC
    Inventors: Liang Li, Yi Tang, Bin Li, Xiaohua Wu
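The stored-prediction-error adjustment and confidence bounds can be sketched directly. The ANN itself is omitted; the residual distribution, quantile level, and price value below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Prediction errors (residuals) recorded on the training set stand in for
# the network's stored estimated prediction error.
residuals = rng.normal(0.0, 0.5, 500)
bias = residuals.mean()                  # systematic error to subtract out
half_width = np.quantile(np.abs(residuals - bias), 0.95)   # 95% band

raw_prediction = 101.3                   # hypothetical next-period price output
adjusted = raw_prediction - bias         # bias-corrected short-term prediction
lower, upper = adjusted - half_width, adjusted + half_width
print(f"{lower:.2f} .. {adjusted:.2f} .. {upper:.2f}")
```

Reporting the band alongside the point prediction is what turns a raw network output into the upper and lower prediction bounds the abstract describes.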
  • Patent number: 6728691
    Abstract: Computation elements are connected to one another with a first subsystem having a first input computation element, to which time series values, which each describe one state of a system in a first state space at a time, can be supplied. The first input computation element is connected to a first intermediate computation element, by which a state of the system can be described in a second state space at a time. In a second subsystem a second intermediate computation element, by which a state of the system can be described in the second state space at a time, is connected to a first output computation element, on which a first output signal can be tapped off. In a third subsystem a third intermediate computation element, by which a state of the system can be described in the second state space at a time, is connected to a second output computation element, on which a second output signal can be tapped off.
    Type: Grant
    Filed: September 4, 2001
    Date of Patent: April 27, 2004
    Assignee: Siemens Aktiengesellschaft
    Inventors: Ralf Neuneier, Hans-Georg Zimmermann
  • Patent number: 6708160
    Abstract: A method, system and computer program product for implementing at least one of a learning-based diagnostics system and a control system (e.g., using a neural network). By using ObjectNets to model general object types, it is possible to design a control system that represents system components as relational structures rather than fixed vectors. Such an advance is possible by exploiting non-Euclidean principles of symmetry.
    Type: Grant
    Filed: April 6, 2000
    Date of Patent: March 16, 2004
    Inventor: Paul J. Werbos
  • Patent number: 6601051
    Abstract: A neural system is disclosed for processing an exogenous input process to produce a good outward output process with respect to a performance criterion, even if the range of one or both of these processes is necessarily large and/or keeps necessarily expanding during the operation of the neural system. The disclosed neural system comprises a recurrent neural network (RNN) and at least one range extender or reducer, each of which is a dynamic transformer. A range reducer transforms dynamically at least one component of the exogenous input process into inputs to at least one input neuron of said RNN. A range extender transforms dynamically outputs of at least one output neuron of said RNN into at least one component of the outward output process. There are many types of range extender and reducer, which have different degrees of effectiveness and computational costs.
    Type: Grant
    Filed: July 11, 1997
    Date of Patent: July 29, 2003
    Assignee: Maryland Technology Corporation
    Inventors: James Ting-Ho Lo, Lei Yu
  • Patent number: 6601054
    Abstract: Active vibration control (AVC) systems without online path modeling and controller adjustment are provided that are able to adapt to an uncertain operating environment. The controller (250, 280, 315, 252, 282, 317, 254, 319) of such an AVC system is an adaptive recursive neural network whose weights are determined in an offline training and are held fixed online during the operation of the system. AVC feedforward, feedback, and feedforward-feedback systems in accordance with the present invention are described. An AVC feedforward system has no error sensor and an AVC feedback system has no reference sensor. All sensor outputs of an AVC system are processed by the controller for generating control signals to drive at least one secondary source (240). While an error sensor (480, 481) must be a vibrational sensor, a reference sensor (230, 270, 295, 305, 330) may be either a vibrational or nonvibrational sensor.
    Type: Grant
    Filed: August 11, 2000
    Date of Patent: July 29, 2003
    Assignee: Maryland Technology Corporation
    Inventors: James T. Lo, Lei Yu
  • Patent number: 6463424
    Abstract: There is provided a basic association unit for creating an information processing apparatus capable of performing information processing like that which actually occurs in the central nervous systems of animals, including human beings. The association unit, which has m input terminals and n output terminals, repeats input signals as output signals. When a first input signal, a rectangular wave signal in the form of a pulse, is simultaneously input to fewer than m input terminals, an output signal having the same contents as the first input signal is output from particular output terminals which are associated with those input terminals in advance.
    Type: Grant
    Filed: April 23, 1998
    Date of Patent: October 8, 2002
    Inventors: Norio Ogata, Koji Ataka
  • Patent number: 6434541
    Abstract: An engine diagnostic system includes a bit-serial based recurrent neuroprocessor for processing data from an internal combustion engine in order to diagnose misfires in real-time and reduces the number of neurons required to perform the task by time multiplexing groups of neurons from a candidate pool of neurons to achieve the successive hidden layers of the recurrent network topology.
    Type: Grant
    Filed: April 21, 1999
    Date of Patent: August 13, 2002
    Assignee: Ford Global Technologies, Inc.
    Inventors: Raoul Tawel, Nazeeh Aranki, Lee A. Feldkamp, Gintaras V. Puskorius, Kenneth A. Marko, John V. James
  • Patent number: 6356884
    Abstract: An artificial neural network-based system and method for determining desired concepts and relationships within a predefined field of endeavor. The system includes a neural network portion comprising an artificial neural network previously trained in accordance with a set of given training exemplars; a monitor portion associated with the neural network portion to observe the data outputs produced by the previously trained artificial neural network; and a perturbation portion for perturbing the neural network portion to effect changes, subject to design constraints of the artificial neural network that remain unperturbed, in the outputs produced by the neural network portion. The perturbation portion is operable such that production of an output by the neural network portion thereafter effects a perturbation of the neural network portion by the perturbation portion, the monitor portion being responsive to detection of the data outputs produced by the previously trained neural network.
    Type: Grant
    Filed: July 2, 1999
    Date of Patent: March 12, 2002
    Inventor: Stephen L. Thaler
  • Patent number: 6199057
    Abstract: A neuroprocessor architecture employs a combination of bit-serial and serial-parallel techniques for implementing the neurons of the neuroprocessor. The neuroprocessor architecture includes a neural module containing a pool of neurons, a global controller, a sigmoid activation ROM look-up-table, a plurality of neuron state registers, and a synaptic weight RAM. The neuroprocessor reduces the number of neurons required to perform the task by time multiplexing groups of neurons from a fixed pool of neurons to achieve the successive hidden layers of a recurrent network topology.
    Type: Grant
    Filed: October 23, 1997
    Date of Patent: March 6, 2001
    Assignee: California Institute of Technology
    Inventor: Raoul Tawel
  • Patent number: 6151592
    Abstract: A recognition apparatus and method using a neural network is provided. A neuron-like element stores a value of its internal status. The neuron-like element updates the value of its internal status on the basis of an output from the neuron-like element itself, outputs from other neuron-like elements, and an external input; an output value generator converts a value of its internal status into an external output. Accordingly, the neuron-like element itself can retain the history of input data. This enables time-series data, such as speech, to be processed without providing any special devices in the neural network.
    Type: Grant
    Filed: January 20, 1998
    Date of Patent: November 21, 2000
    Assignee: Seiko Epson Corporation
    Inventor: Mitsuhiro Inazumi
  • Patent number: 6115701
    Abstract: A system and process for readily determining, for a specified knowledge domain in a given field of endeavor, perturbations applicable to an artificial neural network embodying such a specified knowledge domain that will produce a desired output, comprising a first, previously trained, artificial neural network containing training in some problem domain, which neural network is responsive to the presentment of a set of data inputs at the input portion thereof to produce a set of data outputs at the output portion thereof, a monitoring portion which constantly monitors the outputs of the first neural network to identify the desired outputs, and a network perturbation portion for effecting the application of perturbations, either externally or internally, to the first neural network to thereby effect changes in the output thereof.
    Type: Grant
    Filed: August 9, 1999
    Date of Patent: September 5, 2000
    Inventor: Stephen L. Thaler
  • Patent number: 6058206
    Abstract: A pattern recognition device having modifiable feature detectors (28) which respond to a transduced input signal (26) and communicate a feature activity signal (30) to allow classification and an appropriate output action (70). A memory (40) stores a set of comparison patterns, and is used by an assigner (66) to find likely features, or parts, in the current input signal (26). Each part is assigned to a feature detector (28[m]) judged to be responsible for it. An updater (42) modifies each responsible feature detector (28[m]) so as to make its preferred feature more similar to its assigned part. The modification embodies a strong constraint on the feature learning process, in particular an assumption that the ideal features for describing the pattern domain occur independently. This constraint allows improved learning speed and potentially improved scaling properties.
    Type: Grant
    Filed: December 1, 1997
    Date of Patent: May 2, 2000
    Inventor: Chris Alan Kortge
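The assigner/updater loop above amounts to online competitive learning under the independent-features assumption. A NumPy sketch with two hypothetical "parts" (the data-based detector initialization is an added convention, used here to avoid dead detectors):

```python
import numpy as np

rng = np.random.default_rng(5)

# Two true independent "parts"; each transduced input is one part plus noise.
parts = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])

def sample():
    return parts[rng.integers(2)] + rng.normal(0, 0.05, 3)

# Modifiable feature detectors, initialized from one early input per part.
detectors = np.stack([parts[0] + rng.normal(0, 0.05, 3),
                      parts[1] + rng.normal(0, 0.05, 3)])

for _ in range(300):
    x = sample()                                         # current input signal
    # Assigner: find the detector judged responsible for this part.
    m = int(np.argmin(((detectors - x) ** 2).sum(axis=1)))
    # Updater: move the responsible detector's preferred feature
    # toward its assigned part.
    detectors[m] += 0.1 * (x - detectors[m])

print(np.round(detectors, 2))   # each detector settles near one true part
```

Because each input is explained by exactly one responsible detector, the detectors specialize independently, which is the constraint the abstract credits for faster learning.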
  • Patent number: 6018727
    Abstract: A device for generating useful information employing a first neural network trained to produce input-output maps within a predetermined initial knowledge domain, and an apparatus for subjecting the neural network to perturbations which may produce changes in the predetermined knowledge domain, the neural network having an optional output for feeding the outputs of the first neural network to a second neural network that evaluates those outputs based on training within the second neural network. The device may also include a reciprocal feedback connection from the output of the second neural network to the first neural network to further influence and change what takes place in the aforesaid neural network.
    Type: Grant
    Filed: August 13, 1997
    Date of Patent: January 25, 2000
    Inventor: Stephen L. Thaler
  • Patent number: 5963929
    Abstract: A recursive neurofilter comprising a recursive neural network (NN) is disclosed for processing an information process to estimate a signal process with respect to an estimation error criterion. The information process either consists of a measurement process, or if the signal and measurement processes are time-variant, consists of the measurement process as well as a time variance process, that describes the time-variant properties of the signal and measurement processes. The recursive neurofilter is synthesized from exemplary realizations of the signal and information processes. No assumptions such as Gaussian distribution, linear dynamics, additive noise, and Markov property are required. The synthesis is performed essentially through training recursive NNs. The training criterion is constructed to reflect the mentioned estimation error criterion with the exemplary realizations.
    Type: Grant
    Filed: July 11, 1997
    Date of Patent: October 5, 1999
    Assignee: Maryland Technology Corporation
    Inventor: James Ting-Ho Lo
  • Patent number: 5943659
    Abstract: Based on the encoding of deterministic finite-state automata (DFA) in discrete-time, second-order recurrent neural networks, an algorithm constructs an augmented recurrent neural network that encodes a fuzzy finite-state automaton (FFA) and recognizes a given fuzzy regular language with arbitrary accuracy.
    Type: Grant
    Filed: October 3, 1995
    Date of Patent: August 24, 1999
    Assignee: NEC Research Institute, Inc.
    Inventors: C. Lee Giles, Christian Walter Omlin, Karvel Kuhn Thornber
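The underlying DFA-in-second-order-RNN encoding can be sketched concretely. The parity automaton, the gain H, and the one-hot encodings are illustrative choices; the patent's fuzzy (FFA) augmentation is not shown:

```python
import numpy as np

# Encode a 2-state DFA (odd parity of 1s) in a second-order recurrent network:
#   s'_i = sigmoid(H * (sum_{j,k} W[i,j,k] s_j x_k - 0.5))
# A large gain H drives the sigmoid nearly binary, so the analog network
# tracks the discrete automaton state.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# DFA transition table: (state, symbol) -> next state
delta = {(0, '0'): 0, (0, '1'): 1, (1, '0'): 1, (1, '1'): 0}

H = 12.0
W = np.zeros((2, 2, 2))          # W[next_state, current_state, symbol]
for (q, a), q_next in delta.items():
    W[q_next, q, int(a)] = 1.0   # strong weight programs each transition

def accepts(string, accept_state=1):
    s = np.array([1.0, 0.0])     # start in state 0 (near-one-hot analog vector)
    for ch in string:
        x = np.eye(2)[int(ch)]   # one-hot input symbol
        s = sigmoid(H * (np.einsum('ijk,j,k->i', W, s, x) - 0.5))
    return s[accept_state] > 0.5

print(accepts("1101"), accepts("110"))
```

Because the state vector stays within a small neighborhood of a one-hot corner at every step, the analog network simulates the DFA exactly on strings of any length; the patent extends the same construction to fuzzy automata with graded acceptance.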