Modifiable Weight Patents (Class 706/39)
-
Patent number: 11715031. Abstract: An information processing method includes acquiring first output data for input data of a first learning model, reference data for the input data, and second output data for the input data of a second learning model obtained by converting the first learning model; calculating first difference data corresponding to a difference between the first output data and the reference data and second difference data corresponding to a difference between the second output data and the reference data; and training the first learning model with use of the first difference data and the second difference data. Type: Grant. Filed: August 1, 2019. Date of Patent: August 1, 2023. Assignee: PANASONIC INTELLECTUAL PROPERTY CORPORATION OF AMERICA. Inventors: Yasunori Ishii, Yohei Nakata, Hiroaki Urabe.
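The two-difference training step above can be pictured as training a model against both its own error and the error of a converted (here, quantized) copy of itself. This is only a hedged sketch: the toy linear model, the straight-through gradient, and all names (`first_model`, `lr`, etc.) are illustrative assumptions, not taken from the patent.

```python
def first_model(w, x):
    # first learning model: a toy linear model with one trainable weight
    return w * x

def second_model(w, x):
    # second model "obtained by converting" the first (here: weight quantization)
    return round(w) * x

def training_step(w, x, reference, lr=0.01):
    d1 = first_model(w, x) - reference   # first difference data
    d2 = second_model(w, x) - reference  # second difference data
    # train the first model using BOTH difference terms; the converted model's
    # gradient is taken straight-through (treated as the first model's output)
    grad = (d1 + d2) * x
    return w - lr * grad
```

Using both difference terms pulls the trainable weight toward values that work well after conversion, not just before it.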
-
Patent number: 11636316. Abstract: Broadly speaking, the present techniques exploit the properties of correlated electron materials for artificial neural networks and neuromorphic computing. In particular, the present techniques provide apparatuses/devices that comprise at least one correlated electron switch (CES) element and which may be used as, or to form, an artificial neuron or an artificial synapse. Type: Grant. Filed: January 31, 2018. Date of Patent: April 25, 2023. Assignee: Cerfe Labs, Inc. Inventors: Lucian Shifren, Shidhartha Das, Naveen Suda, Carlos Alberto Paz de Araujo.
-
Patent number: 11112785. Abstract: Systems and methods for data collection and signal processing are disclosed, including a plurality of variable groups of analog sensor inputs, the analog sensors operationally coupled to an industrial environment. The inputs of the sensors may be received by an analog crosspoint switch, where the signals are monitored, data collection may be adaptively scheduled, front end signal conditioning may occur, and a noise value determined. Type: Grant. Filed: December 12, 2018. Date of Patent: September 7, 2021. Assignee: Strong Force IoT Portfolio 2016, LLC. Inventors: Charles Howard Cella, Gerald William Duffy, Jr., Jeffrey P. McGuckin, Mehul Desai.
-
Patent number: 11074493. Abstract: A Boltzmann machine includes a plurality of circuit units, each having an adder that adds weighted input signals and a comparison unit that compares an output signal of the adder with a threshold signal to output a binary output signal; and digital arithmetic units, each generating the weighted input signals by weighting the binary output signals of the circuit units with a weight. The comparison unit has a first comparator that compares a thermal noise with a reference voltage to output a binary digital random signal, a DA converter that converts the digital random signal to an analog random signal and varies a magnitude of the analog random signal, and a second comparator that compares the output signal of the adder with the analog random signal to generate the binary output signal with a predetermined probability function. Type: Grant. Filed: January 30, 2017. Date of Patent: July 27, 2021. Assignee: FUJITSU LIMITED. Inventors: Takumi Danjo, Sanroku Tsukamoto, Hirotaka Tamura.
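A software analogue of the probabilistic firing described above: the adder output is compared against a scaled random signal, so the neuron fires with a probability shaped by the noise scale. The patent describes analog circuitry (comparators, a DA converter, thermal noise); the uniform noise model and all names below are illustrative assumptions.

```python
import random

def stochastic_fire(weighted_sum, rng, noise_scale=1.0):
    # second comparator: adder output vs. a scaled random signal; the scale
    # plays the role of the DA converter varying the analog noise magnitude
    noise = rng.uniform(-noise_scale, noise_scale)
    return 1 if weighted_sum > noise else 0
```

Widening `noise_scale` flattens the firing-probability curve, analogous to a temperature parameter in a Boltzmann machine.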
-
Patent number: 11055608. Abstract: A convolutional neural network is provided comprising artificial neurons arranged in layers, each comprising output matrices. An output matrix comprises output neurons and is connected to an input matrix, comprising input neurons, by synapses associated with a convolution matrix comprising weight coefficients associated with the output neurons of an output matrix. Each synapse consists of a set of memristive devices storing a weight coefficient of the convolution matrix. In response to a change of the output value of an input neuron, the neural network dynamically associates each set of memristive devices with an output neuron connected to the input neuron. The neural network comprises accumulator(s) for each output neuron to accumulate the values of the weight coefficients stored in the sets of memristive devices dynamically associated with the output neuron, the output value of the output neuron being determined from the value accumulated in the accumulator(s). Type: Grant. Filed: August 18, 2015. Date of Patent: July 6, 2021. Assignee: COMMISSARIAT A L'ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES. Inventor: Olivier Bichler.
-
Patent number: 10628699. Abstract: Event-based image feature extraction includes reducing an accumulated magnitude of a leaky integrate and fire (LIF) neuron based on a difference between a current time and a previous time; receiving an event input from a dynamic vision sensor (DVS) pixel at the current time; weighting the received input; adding the weighted input to the reduced magnitude to form an accumulated magnitude of the LIF neuron at the current time; and, if the accumulated magnitude reaches a threshold, firing the neuron and decreasing the accumulated magnitude. Type: Grant. Filed: June 13, 2017. Date of Patent: April 21, 2020. Assignee: SAMSUNG ELECTRONICS CO., LTD. Inventors: Lior Zamir, Nathan Henri Levy.
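The event-driven LIF update above (decay by elapsed time, add the weighted event, fire and decrease on threshold) can be sketched as follows. The exponential leak kernel and the parameter names (`tau`, `threshold`) are assumptions for illustration only.

```python
import math

def lif_update(state, t_prev, t_now, event_weight, tau=10.0, threshold=1.0):
    # reduce the accumulated magnitude based on the elapsed time (leak)
    state *= math.exp(-(t_now - t_prev) / tau)
    # weight the received DVS event and add it to the reduced magnitude
    state += event_weight
    fired = state >= threshold
    if fired:
        state -= threshold  # decrease the accumulated magnitude on firing
    return state, fired
```

Because the decay is computed only when an event arrives, no clocked update loop is needed, which matches the event-based character of a DVS pixel stream.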
-
Patent number: 10565500. Abstract: A spiking neural network (SNN) is implemented on a neuromorphic computer and includes a plurality of neurons, a first subset of the plurality of synapses defining feed-forward connections from a first subset of the neurons to a second subset of the neurons, a second subset of the plurality of synapses to define recurrent connections between the second subset of neurons, and a third subset of the plurality of synapses to define feedback connections from the second subset of neurons to the first subset of neurons. A set of input vectors is provided to iteratively modify weight values of the plurality of synapses. Each iteration involves selectively enabling and disabling the third subset of synapses with a different one of the input vectors applied to the SNN. The weight values are iteratively adjusted to derive a solution to an equation comprising an unknown matrix variable and an unknown vector variable. Type: Grant. Filed: December 20, 2016. Date of Patent: February 18, 2020. Assignee: Intel Corporation. Inventor: Tsung-Han Lin.
-
Patent number: 9092736. Abstract: Certain embodiments of the present disclosure support techniques for training of synapses in biologically inspired networks. Only one device based on a memristor can be used as a synaptic connection between a pair of neurons. The training of synaptic weights can be achieved with a low current consumption. A proposed synapse training circuit may be shared by a plurality of incoming/outgoing connections, while only one digitally implemented pulse-width modulation (PWM) generator can be utilized per neuron circuit for generating synapse-training pulses. Only up to three phases of a slow clock can be used for both the neuron-to-neuron communications and synapse training. Some special control signals can be also generated for setting up synapse training events. By means of these signals, the synapse training circuit can be in a high-impedance state outside the training events, thus the synaptic resistance (i.e., the synaptic weight) is not affected outside the training process. Type: Grant. Filed: July 7, 2010. Date of Patent: July 28, 2015. Assignee: QUALCOMM Incorporated. Inventors: Vladimir Aparin, Yi Tang.
-
Patent number: 8959040. Abstract: A spike timing dependent plasticity (STDP) apparatus, neuromorphic synapse system and a method provide STDP processing of spike signals. The STDP apparatus includes a first leaky integrator to receive a first spike signal and a second leaky integrator to receive a second spike signal. An output of the first leaky integrator is gated according to the second spike signal to produce a first gated integrated signal, and an output of the second leaky integrator is gated according to the first spike signal to produce a second gated integrated signal. The STDP apparatus further includes an output integrator to integrate a difference of the first and second gated integrated signals to produce a weighted signal. The system includes a synapse core and the STDP apparatus. The method includes integrating the spike signals, gating the integrated signals and integrating a difference of the gated integrated signals. Type: Grant. Filed: March 8, 2012. Date of Patent: February 17, 2015. Assignee: HRL Laboratories, LLC. Inventors: Jose Cruz-Albrecht, Peter Petre, Narayan Srinivasa.
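In software terms, each leaky integrator keeps an exponentially decaying trace of one spike train, gating samples that trace at the other train's spike times, and the weight change is the difference of the two gated sums. A hedged sketch under an assumed exponential kernel (the patent itself describes analog integrator circuits):

```python
import math

def stdp_weight_change(pre_times, post_times, tau=20.0):
    def trace(spikes, t):
        # leaky-integrator output at time t for a given spike train
        return sum(math.exp(-(t - s) / tau) for s in spikes if s <= t)
    # gate each integrator with the other train's spikes, then difference
    potentiation = sum(trace(pre_times, t) for t in post_times)
    depression = sum(trace(post_times, t) for t in pre_times)
    return potentiation - depression
```

Pre-before-post spike pairs yield a positive change (potentiation) and post-before-pre pairs a negative one (depression), the defining signature of STDP.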
-
Patent number: 8768921. Abstract: Embodiments of the present invention relate to an approach for reusing information/knowledge. Specifically, embodiments of the present invention provide an approach for retrieving previously stored data to satisfy queries (e.g., jobs/tickets) for solutions to problems while maintaining privacy/security of the data as well as ensuring the quality of the results. In a typical embodiment, a query for a solution to a problem is received and details are extracted therefrom. Using the details, a search is performed on a set of data stored in at least one computer storage device. Based on the search, a set of results will be generated and classified into a set of categories. In any event, the quality of each of the set of results will be assessed based on the usefulness of the set of results. Type: Grant. Filed: October 20, 2011. Date of Patent: July 1, 2014. Assignee: International Business Machines Corporation. Inventors: Sugata Ghosal, Anup K. Ghosh, Nandakishore Kambhatla, Rose C. Kanjirathinkal, Asidhara Lahiri, Debapriyo Majumdar, Shajith I. Mohamed, Karthik Visweswariah.
-
Patent number: 8706624. Abstract: Disclosed is a system and method for Facilitating Credit Transactions, which may allow for the division of a given purchase or cash-withdrawal transaction amount into periodical installments by enabling the financing of said transaction. Type: Grant. Filed: October 4, 2012. Date of Patent: April 22, 2014. Assignee: Pay It Simple Ltd. Inventors: Gil Don, Alon Feit, Victoria Niel Kraine.
-
Patent number: 7904398. Abstract: Neuron component and method for use in artificial neural networks (ANNs) with input synapses (204, 204b . . . 204n), where each synapse includes multiple weights called synapse weights (206-1, 206-2, 206-3). Each synapse further includes a facility to modulate, or gate, an input signal connected to the synapses by each of the respective synapse weights within the synapse, supplying the result of each modulating operation. The neuron also sums the results of all modulating operations and subjects the result to a transfer function. Each of the multiple weights associated with a given synapse may be specified to have its own weight-adjustment facility (214, 214b, 214c), with its own error values (216, 216b, 216c) and its own specified learning rate; one aspect (1000) includes a separate sum (1018, 1018b) and transfer function (1020, 1020b) for each synapse weight. Type: Grant. Filed: March 22, 2007. Date of Patent: March 8, 2011. Inventor: Dominic John Repici.
-
Patent number: 7730086. Abstract: A method of allocating a computer to service a request for a data set in a system having a plurality of computers. The method is implemented on a neural network having only an input layer having input nodes and an output layer having output nodes, where each output node is associated with a specific computer. Connecting the input nodes to the output nodes are weights w(j,k). The method includes the steps of receiving a request for data set "I"; inputting to the input layer a vector R(I) dependent upon the number of requests for the requested data over a predetermined period of time; and selecting a computer assignment associated with one of the output nodes to service the data request, where the output node selected is associated with a specific weight selected to minimize a predetermined metric measuring the distance between the vector R(I) and the weights w(I,k). Type: Grant. Filed: February 1, 2007. Date of Patent: June 1, 2010. Assignees: Louisiana Tech University Foundation, Inc., Board of Supervisors of Louisiana State University Agricultural and Mechanical College on Behalf of the Louisiana State University Health Sciences Center. Inventors: Vir V. Phoha, Sitharama S. Iyengar, Rajgopal Kannan.
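The selection step above can be sketched as a nearest-weight-vector lookup. Squared Euclidean distance stands in for the unspecified "predetermined metric", and the function names are illustrative.

```python
def allocate_computer(request_vector, node_weights):
    # pick the output node k whose weight vector minimizes the distance
    # to the request vector R(I); each output node maps to one computer
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(node_weights)),
               key=lambda k: dist2(request_vector, node_weights[k]))
```

This is essentially a winner-take-all output layer: the "winning" node's computer is assigned to serve the data request.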
-
Patent number: 7620819. Abstract: We develop a system consisting of a neural architecture resulting in classifying regions corresponding to users' keystroke patterns. We extend the adaptation properties to the classification phase, resulting in learning of changes over time. Classification results on login attempts of 43 users (216 valid, 657 impersonation samples) show considerable improvements over existing methods. Type: Grant. Filed: September 29, 2005. Date of Patent: November 17, 2009. Assignees: The Penn State Research Foundation, Louisiana Tech University Foundation, Inc. Inventors: Vir V. Phoha, Sunil Babu, Asok Ray, Shashi P. Phoha.
-
Publication number: 20080243740. Abstract: An apparatus includes a circuit element that requires calibration, a calibration circuit for use in calibrating the circuit element, and a clamping diode electrically connectable in a first path that includes the calibration circuit and electrically connectable in a second path that excludes the calibration circuit. The first path is for electrically connecting the calibration circuit and the circuit element, and the second path is for use in protecting the apparatus from electrostatic discharge. A switching circuit is used to switch the clamping diode between the first path and the second path. Type: Application. Filed: April 2, 2007. Publication date: October 2, 2008. Inventor: Steven L. Hauptman.
-
Patent number: 7398260. Abstract: An Effector machine is a new kind of computing machine. When implemented in hardware, the Effector machine can execute multiple instructions simultaneously because every one of its computing elements is active. This greatly enhances the computing speed. By executing a meta program whose instructions change the connections in a dynamic Effector machine, the Effector machine can perform tasks that digital computers are unable to compute. Type: Grant. Filed: March 2, 2004. Date of Patent: July 8, 2008. Assignee: Fiske Software LLC. Inventor: Michael Stephen Fiske.
-
Patent number: 7222112. Abstract: A method, system and machine-readable storage medium for monitoring an engine using a cascaded neural network that includes a plurality of neural networks is disclosed. In operation, the method, system and machine-readable storage medium store data corresponding to the cascaded neural network. Signals generated by a plurality of engine sensors are then inputted into the cascaded neural network. Next, a second neural network is updated at a first rate, with an output of a first neural network, wherein the output is based on the inputted signals. In response, the second neural network outputs at a second rate, at least one engine control signal, wherein the second rate is faster than the first rate. Type: Grant. Filed: January 27, 2006. Date of Patent: May 22, 2007. Assignee: Caterpillar Inc. Inventor: Evan Earl Jacobson.
-
Patent number: 7143073. Abstract: The invention relates to generating a test suite of instructions for testing the operation of a processor. A fuzzy finite state machine with a plurality of states 2 and transitions 4 determined by weights W1, W2 . . . W10 is used to generate a sequence of instructions. The weights determine the next state as well as an instruction and operands for each state. The weights may be adapted based on the generated sequence, and further sequences are generated. Type: Grant. Filed: April 4, 2002. Date of Patent: November 28, 2006. Assignee: Broadcom Corporation. Inventor: Geoff Barrett.
-
Patent number: 7143071. Abstract: A method for changing the CPU frequency under control of a neural network. The neural network has m basis functions and n basis points that are connected together. The learning capability of the neural network is used to deduce basis weights based on dummy environmental parameters and a dummy output vector. In an application procedure, environmental parameters are input to the basis points and basis vectors are calculated based on the basis functions. By integrating the multiplication of each basis vector and its corresponding basis weight, an output vector can be generated to determine a control signal, so that the CPU can be controlled to raise or lower its operating frequency. In addition, if the user's behavior requires the parameters to change, the fast learning function of a radial neural network can be used to comply with each user's behavior. Type: Grant. Filed: March 8, 2002. Date of Patent: November 28, 2006. Assignee: Via Technologies, Inc. Inventors: I-Larn Chen, Yuh-Dar Tseng.
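The inference step described above (compute each basis value, multiply by its basis weight, and sum into an output that drives a raise/lower signal) can be sketched with assumed Gaussian radial basis functions; the threshold and all names are illustrative, not from the patent.

```python
import math

def rbf_output(x, centers, basis_weights, sigma=1.0):
    # integrate the multiplication of each basis value and its basis weight
    return sum(w * math.exp(-((x - c) ** 2) / (2 * sigma ** 2))
               for c, w in zip(centers, basis_weights))

def cpu_control(x, centers, basis_weights, threshold=0.5):
    # turn the output into a raise/lower operating-frequency control signal
    return "raise" if rbf_output(x, centers, basis_weights) > threshold else "lower"
```

An environmental reading near a trained basis center produces a strong output (raise frequency), while a distant reading decays toward zero (lower frequency).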
-
Patent number: 7080053. Abstract: A method for evolving appropriate connections among units in a neural network includes a) calculating weight changes at each existing connection and at incipient connections between units for each training example; and b) determining a K ratio using the weight changes, wherein said K ratio comprises the ratio of the weight change of existing connections to the weight change of incipient connections, and wherein if the K ratio exceeds a threshold, the method further includes b1) increasing a weight of the existing connection and b2) creating new connections at the incipient connections. The method further includes c) pruning weak connections between the units. Type: Grant. Filed: August 16, 2001. Date of Patent: July 18, 2006. Assignee: Research Foundation of State University of New York. Inventors: Paul Adams, Kingsley J. A. Cox, John D. Pinezich.
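One hedged reading of the K-ratio rule in code: compare the weight change at an existing connection to that at its incipient connections, grow when the ratio exceeds a threshold, and prune connections whose weight has fallen below a floor. The thresholds, step size, and names are illustrative assumptions.

```python
def evolve_connection(existing_dw, incipient_dw, weight,
                      k_threshold=1.5, grow_step=0.1, prune_below=0.05):
    # a) weight changes are given; b) form the K ratio
    k = existing_dw / incipient_dw if incipient_dw else float("inf")
    created = False
    if k > k_threshold:
        weight += grow_step        # b1) strengthen the existing connection
        created = True             # b2) create connections at incipient sites
    pruned = weight < prune_below  # c) prune weak connections
    return weight, created, pruned
```

Growth and pruning together let the network's wiring pattern, not just its weight values, adapt to the training data.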
-
Patent number: 7035834. Abstract: A method, system and machine-readable storage medium for monitoring an engine using a cascaded neural network that includes a plurality of neural networks is disclosed. In operation, the method, system and machine-readable storage medium store data corresponding to the cascaded neural network. Signals generated by a plurality of engine sensors are then inputted into the cascaded neural network. Next, a second neural network is updated at a first rate, with an output of a first neural network, wherein the output is based on the inputted signals. In response, the second neural network outputs at a second rate, at least one engine control signal, wherein the second rate is faster than the first rate. Type: Grant. Filed: May 15, 2002. Date of Patent: April 25, 2006. Assignee: Caterpillar Inc. Inventor: Evan Earl Jacobson.
-
Patent number: 7016886. Abstract: An artificial neuron includes inputs and dendrites, a respective one of which is associated with a respective one of the inputs. A respective dendrite includes a respective power series of weights. The weights in a given power of the power series represent a maximal projection. A respective power also may include at least one switch, to identify holes in the projections. By providing maximal projections, linear scaling may be provided for the maximal projections, and quasi-linear scaling may be provided for the artificial neuron, while allowing a lossless compression of the associations. Accordingly, hetero-associative and/or auto-associative recall may be accommodated for large numbers of inputs, without requiring geometric scaling as a function of input. Type: Grant. Filed: August 9, 2002. Date of Patent: March 21, 2006. Assignee: Saffron Technology Inc. Inventors: David R. Cabana, Manuel Aparicio, IV, James S. Fleming.
-
Patent number: 6999953. Abstract: An analog neural computing medium, neuron and neural networks are disclosed. The neural computing medium includes a phase change material that has the ability to cumulatively respond to multiple input signals. Input signals induce transformations among a plurality of accumulation states of the disclosed neural computing medium. The accumulation states are characterized by a high electrical resistance. Upon cumulative receipt of energy from one or more input signals that equals or exceeds a threshold value, the neural computing medium fires by transforming to a low resistance state. The disclosed neural computing medium may also be configured to perform a weighting function whereby it weights incoming signals. The disclosed neurons may also include activation units for further transforming signals transmitted by the accumulation units according to a mathematical operation. The artificial neurons, weighting units, accumulation units and activation units may be connected to form artificial neural networks. Type: Grant. Filed: July 3, 2002. Date of Patent: February 14, 2006. Assignee: Energy Conversion Devices, Inc. Inventor: Stanford R. Ovshinsky.
-
Patent number: 6876989. Abstract: A neural network system includes a feedforward network comprising at least one neuron circuit for producing an activation function and a first derivative of the activation function, and a weight updating circuit for producing updated weights to the feedforward network. The system also includes an error back-propagation network for receiving the first derivative of the activation function and providing weight-change information to the weight updating circuit. Type: Grant. Filed: February 13, 2002. Date of Patent: April 5, 2005. Assignee: Winbond Electronics Corporation. Inventors: Bingxue Shi, Chun Lu, Lu Chen.
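For a single sigmoid neuron, the data flow above (the neuron circuit produces an activation and its first derivative; the back-propagation network turns them into weight-change data for the updating circuit) reduces to the standard delta-rule update. The learning rate and names are illustrative assumptions.

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def weight_update(w, x, target, lr=0.5):
    out = sigmoid(w * x)             # activation function output
    d_out = out * (1.0 - out)        # first derivative of the activation
    delta = (target - out) * d_out   # error back-propagated through the neuron
    return w + lr * delta * x        # weight change applied by the updating circuit
```

Producing the derivative alongside the activation, as the neuron circuit does, is exactly what makes the backward pass cheap: `d_out` is reused rather than recomputed.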
-
Patent number: 6844582. Abstract: A learning method of a semiconductor device of the present invention comprises a neuro device having a multiplier as a synapse, in which a weight varies according to an input weight voltage, functioning as a neural network system that processes analog data. The method comprises a step A of inputting predetermined input data to the neuro device and calculating an error between a target value of the output of the neuro device with respect to the input data and the actual output; a step B of thereafter calculating the variation amount in the error by varying a weight of the multiplier; and a step C of varying the weight of the multiplier based on the variation amount in the error, wherein in steps B and C, after a reset voltage for setting the weight to a substantially constant value is input to the multiplier as the weight voltage, the weight is varied by inputting the weight voltage corresponding to the weight to be varied. Type: Grant. Filed: May 9, 2003. Date of Patent: January 18, 2005. Assignee: Matsushita Electric Industrial Co., Ltd. Inventors: Michihito Ueda, Kenji Toyoda, Takashi Ohtsuka, Kiyoyuki Morita.
-
Patent number: 6792412. Abstract: A system and method for controlling information output based on user feedback about the information that includes a plurality of information sources. At least one neural network module selects one or more of a plurality of objects to receive information from the plurality of information sources based on a plurality of inputs and a plurality of weight values during that epoch. At least one server, associated with the neural network module, provides one or more of the objects to a plurality of recipients. The recipients provide feedback during an epoch. At the conclusion of an epoch, the neural network takes the feedback that has been provided from the recipients and generates a rating value for each of the objects. Based on the rating value and selections made, the neural network redetermines the weight values. The neural network then selects the objects to receive information during a subsequent epoch. Type: Grant. Filed: February 2, 1999. Date of Patent: September 14, 2004. Inventors: Alan Sullivan, Ivan Pope.
-
Patent number: 6768927. Abstract: The invention relates to a control system to which a state vector representing the states of a controlled system is applied. The control system provides a correcting variables vector of optimized correcting variables. The relation between state vector and correcting variables vector is defined by a matrix of weights. These weights depend on the solution of the state dependent Riccati equation. An equation solving system for solving the state dependent Riccati equation in real time is provided. The state vector is applied to this equation solving system. The solution of the state dependent Riccati equation is transferred to the control system to determine the weights. Type: Grant. Filed: March 29, 2001. Date of Patent: July 27, 2004. Assignee: Bodenseewerk Geratetechnik GmbH. Inventor: Uwe Krogmann.
-
Patent number: 6735578. Abstract: A system and method for automated intelligent information mining includes receiving product-related queries and respective product-related information from various text sources; extracting multiple key-phrases from the product-related information and received queries; and generating two or more layers of contextual relation maps by mapping the extracted key-phrases to two-dimensional maps using a self-organizing map, and a technique combining a Hessian matrix and a perturbation technique to enhance the learning process and to categorize the extracted key-phrases based on contextual meaning. Further, the technique includes forming word clusters and constructing corresponding key-phrase frequency histograms for each of the generated contextual relation maps. Type: Grant. Filed: May 10, 2001. Date of Patent: May 11, 2004. Assignee: Honeywell International Inc. Inventors: Ravindra K. Shetty, Venkatesan Thyagarajan.
-
Patent number: 6665651. Abstract: A feedback control system for automatic on-line training of a controller for a plant, the system having a reinforcement learning agent connected in parallel with the controller. The learning agent comprises an actor network and a critic network operatively arranged to carry out at least one sequence of a stability phase followed by a learning phase. During the stability phase, a multi-dimensional boundary of values is determined. During the learning phase, a plurality of updated weight values is generated in connection with the on-line training, if and until one of the updated weight values reaches the boundary, at which time a next sequence is carried out to determine a next multi-dimensional boundary of values followed by a next learning phase. Type: Grant. Filed: July 18, 2002. Date of Patent: December 16, 2003. Assignee: Colorado State University Research Foundation. Inventors: Peter M. Young, Charles Anderson, Douglas C. Hittle, Matthew Kretchmar.
-
Patent number: 6665639. Abstract: A method and apparatus are described that allow inexpensive speech recognition in applications where this capability is not otherwise feasible because of cost or technical reasons, or because of inconvenience to the user. A relatively simple speaker independent recognition algorithm, capable of recognizing a limited number of utterances at any one time, is associated with the base unit of an electronics product. To function, the product requires information from an external medium, and this medium also provides the data required to recognize several sets of utterances pertinent to other information provided by the external medium. Type: Grant. Filed: January 16, 2002. Date of Patent: December 16, 2003. Assignee: Sensory, Inc. Inventors: Todd F. Mozer, Forrest S. Mozer, Thomas North.
-
Patent number: 6654730. Abstract: When neuron operations are computed in parallel using a large number of arithmetic units, arithmetic units for neuron operations and arithmetic units for error signal operations need not be provided separately, and a neural network arithmetic apparatus that consumes less bus bandwidth is provided for updating of synapse connection weights. Operation results of arithmetic units and setting information of a master node are exchanged between them through a local bus. During neuron operations, partial sums of neuron output values from the arithmetic units are accumulated by the master node to generate and output a neuron output value, and an arithmetic unit to which neuron operations of the specific neuron are assigned receives and stores the neuron output value outputted from the master node. Type: Grant. Filed: November 1, 2000. Date of Patent: November 25, 2003. Assignee: Fuji Xerox Co., Ltd. Inventors: Noriji Kato, Hirotsugu Kashimura, Hitoshi Ikeda, Nobuaki Miyakawa.
-
Patent number: 6581049. Abstract: An artificial neuron includes inputs and dendrites, a respective one of which is associated with a respective one of the inputs. Each dendrite includes a power series of weights, and each weight in a power series includes an associated count for the associated power. The power series of weights preferably is a base-two power series of weights, each weight in the base-two power series including an associated count that represents a bit position. The counts for the associated power preferably are statistical counts. More particularly, the dendrites preferably are sequentially ordered, and the power series of weights preferably includes a pair of first and second power series of weights. Each weight in the first power series includes a first count that is a function of associations of prior dendrites, and each weight of the second power series includes a second count that is a function of associations of next dendrites. Type: Grant. Filed: November 8, 1999. Date of Patent: June 17, 2003. Assignee: Saffron Technology, Inc. Inventors: Manuel Aparicio, IV, James S. Fleming, Dan Ariely.
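The base-two power series can be pictured as decomposing an integer weight into counts at bit positions of its binary expansion. This minimal sketch omits the patent's statistical-count and dual-series machinery; the function names are illustrative.

```python
def base2_power_series(weight):
    # represent a non-negative integer weight as counts at base-two powers,
    # i.e. the set bit positions of its binary expansion
    counts = {}
    power = 0
    while weight:
        if weight & 1:
            counts[power] = 1
        weight >>= 1
        power += 1
    return counts

def reconstruct(counts):
    # the weight is recovered as sum(count * 2**power)
    return sum(c * (1 << p) for p, c in counts.items())
```

Storing per-power counts rather than raw weights is what enables the compression and scaling properties the abstract claims: the representation grows with the number of set bits, not the weight's magnitude.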
-
Patent number: 6513023. Abstract: A neural network circuit is provided having a plurality of circuits capable of charge storage. Also provided is a plurality of circuits each coupled to at least one of the plurality of charge storage circuits and constructed to generate an output in accordance with a neuron transfer function. Each of a plurality of circuits is coupled to one of the plurality of neuron transfer function circuits and constructed to generate a derivative of the output. A weight update circuit updates the charge storage circuits based upon output from the plurality of transfer function circuits and output from the plurality of derivative circuits. In preferred embodiments, separate training and validation networks share the same set of charge storage circuits and may operate concurrently. Type: Grant. Filed: October 1, 1999. Date of Patent: January 28, 2003. Assignee: The United States of America as represented by the Administrator of the National Aeronautics and Space Administration. Inventor: Tuan A. Duong.
-
Patent number: 6366897. Abstract: A cortronic neural network defines connections between neurons in a number of regions using target lists, which identify the output connections of each neuron and the connection strength. Neurons are preferably sparsely interconnected between regions. Training of connection weights employs a three stage process, which involves computation of the contribution to the input intensity of each neuron by every currently active neuron, a competition process that determines the next set of active neurons based on their current input intensity, and a weight adjustment process that updates and normalizes the connection weights based on which neurons won the competition process, and their connectivity with other winning neurons. Type: Grant. Filed: July 26, 1999. Date of Patent: April 2, 2002. Assignee: HNC Software, Inc. Inventors: Robert W. Means, Richard Calmbach.
-
Patent number: 6151592. Abstract: A recognition apparatus and method using a neural network is provided. A neuron-like element stores a value of its internal status. The neuron-like element updates the value of its internal status on the basis of an output from the neuron-like element itself, outputs from other neuron-like elements, and an external input; an output value generator converts the value of its internal status into an external output. Accordingly, the neuron-like element itself can retain the history of input data. This enables time-series data, such as speech, to be processed without providing any special devices in the neural network. Type: Grant. Filed: January 20, 1998. Date of Patent: November 21, 2000. Assignee: Seiko Epson Corporation. Inventor: Mitsuhiro Inazumi.
-
Patent number: 6064997. Abstract: A family of novel multi-layer discrete-time neural net controllers is presented for the control of a multi-input multi-output (MIMO) dynamical system. No learning phase is needed. The structure of the neural net (NN) controller is derived using a filtered error/passivity approach. For guaranteed stability, the upper bound on the constant learning rate parameter for the delta rule employed in standard back propagation is shown to decrease with the number of hidden-layer neurons, so that learning must slow down. This major drawback is shown to be easily overcome by using a projection algorithm in each layer. The notion of persistency of excitation for multilayer NN is defined and explored. New on-line improved tuning algorithms for discrete-time systems are derived, which are similar to e-modification for the case of continuous-time systems, that include a modification to the learning rate parameter plus a correction term. These algorithms guarantee tracking as well as bounded NN weights. Type: Grant. Filed: March 19, 1997. Date of Patent: May 16, 2000. Assignee: University of Texas System, The Board of Regents. Inventors: Sarangapani Jagannathan, Frank Lewis.
-
Patent number: 5852815. Abstract: Constructing and simulating artificial neural networks and components thereof within a spreadsheet environment results in user friendly neural networks which do not require algorithmic based software in order to train or operate. Such neural networks can be easily cascaded to form complex neural networks and neural network systems, including neural networks capable of self-organizing so as to self-train within a spreadsheet, neural networks which train simultaneously within a spreadsheet, and neural networks capable of autonomously moving, monitoring, analyzing, and altering data within a spreadsheet. Neural networks can also be cascaded together in self training neural network form to achieve a device prototyping system. Type: Grant. Filed: May 15, 1998. Date of Patent: December 22, 1998. Inventor: Stephen L. Thaler.
-
Patent number: RE41658. Abstract: A neural network including a number of synaptic weighting elements, and a neuron stage; each of the synaptic weighting elements having a respective synaptic input connection supplied with a respective input signal; and the neuron stage having inputs connected to the synaptic weighting elements, and being connected to an output of the neural network supplying a digital output signal. The accumulated weighted inputs are represented as conductances, and a conductance-mode neuron is used to apply nonlinearity and produce an output. The synaptic weighting elements are formed by memory cells programmable to different threshold voltage levels, so that each presents a respective programmable conductance; and the neuron stage provides for measuring conductance on the basis of the current through the memory cells, and for generating a binary output signal on the basis of the total conductance of the synaptic elements. Type: Grant. Filed: July 31, 2003. Date of Patent: September 7, 2010. Assignee: STMicroelectronics S.r.l. Inventors: Vito Fabbrizio, Gianluca Colli, Alan Kramer.