Patents by Inventor Pascal Tannhof

Pascal Tannhof has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 6622135
    Abstract: To avoid the problem of category assignment in artificial neural networks (ANNs) based upon a mapping of the input space (such as ROI and KNN algorithms), the present method uses “probabilities”. Patterns memorized as prototypes thus no longer represent categories themselves but the “probabilities” of belonging to categories. After the most representative patterns have been memorized in a first step of the learning phase, the second step consists of an evaluation of these probabilities. To that end, several counters are associated with each prototype and are used to evaluate the response frequency and accuracy of each neuron of the ANN. These counters are dynamically incremented during this second step using distance evaluations (between the input vectors and the prototypes) and error criteria (for example, the differences between the desired responses and the responses given by the ANN).
    Type: Grant
    Filed: December 16, 1999
    Date of Patent: September 16, 2003
    Assignee: International Business Machines Corporation
    Inventors: Ghislain Imbert De Tremiolles, Pascal Tannhof, Erin Williams
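    Illustrative sketch: the two-step learning scheme in this abstract lends itself to a short software illustration. The Python snippet below is a minimal sketch, not the patented hardware; the counter layout, the Euclidean distance and the function names are assumptions made for illustration. Each prototype keeps a response counter and per-category hit counters that are incremented from distance evaluations and an error criterion, then normalized into category probabilities.

        import numpy as np

        def learn_probabilities(prototypes, train_vectors, train_labels, categories):
            """Second learning step: estimate, for each prototype (neuron), the
            probability of belonging to each category from its response
            frequency and accuracy."""
            hits = np.zeros((len(prototypes), len(categories)))   # per-category counters
            responses = np.zeros(len(prototypes))                 # response-frequency counters
            for x, label in zip(train_vectors, train_labels):
                dists = np.linalg.norm(prototypes - x, axis=1)    # distance evaluation
                winner = int(np.argmin(dists))                    # responding neuron
                responses[winner] += 1
                hits[winner, categories.index(label)] += 1        # desired vs. given response
            # Normalize the counters into per-prototype category probabilities.
            return hits / np.maximum(responses[:, None], 1)

        # Toy usage: two prototypes memorized during the first learning step.
        protos = np.array([[0.0, 0.0], [1.0, 1.0]])
        xs = np.array([[0.1, 0.0], [0.9, 1.1], [0.2, 0.1], [1.0, 0.8]])
        ys = ["A", "B", "B", "B"]
        print(learn_probabilities(protos, xs, ys, ["A", "B"]))
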
  • Publication number: 20030135475
    Abstract: A method is described to improve the data transfer rate between a personal computer or a host computer and a neural network implemented in hardware by merging a plurality of input patterns into a single input pattern configured to globally represent the set of input patterns. A base consolidated vector (U′*n) representing the input pattern is defined to describe all the vectors (Un, . . . , Un+6) representing the input patterns derived therefrom (U′n, . . . , U′n+6) by combining components having fixed and ‘don't care’ values. The base consolidated vector is provided only once with all the components of the vectors. An artificial neural network (ANN) is then configured as a combination of sub-networks operating in parallel. In order to compute the distances with an adequate number of components, the prototypes must also include components having a definite value and ‘don't care’ conditions.
    Type: Application
    Filed: December 10, 2002
    Publication date: July 17, 2003
    Applicant: International Business Machines Corporation
    Inventors: Pascal Tannhof, Ghislain Imbert de Tremiolles
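    Illustrative sketch: a minimal Python sketch of the consolidation idea described in this abstract (the DONT_CARE marker, the Manhattan distance and the function names are assumptions for illustration, not the patented format). Components shared by every vector of the set keep their fixed value; components that differ become ‘don't care’ and are skipped when distances are computed.

        DONT_CARE = None

        def consolidate(vectors):
            """Merge a set of input vectors into one base consolidated vector."""
            base = []
            for column in zip(*vectors):
                # A component keeps its value only if all vectors agree on it.
                base.append(column[0] if len(set(column)) == 1 else DONT_CARE)
            return base

        def distance(consolidated, prototype):
            # Manhattan distance restricted to components with a definite value.
            return sum(abs(c - p) for c, p in zip(consolidated, prototype)
                       if c is not DONT_CARE)

        patterns = [[3, 7, 1, 0], [3, 9, 1, 0], [3, 2, 1, 0]]   # differ only in component 1
        base = consolidate(patterns)
        print(base)                          # [3, None, 1, 0]
        print(distance(base, [3, 5, 2, 0]))  # 1
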
  • Publication number: 20030133605
    Abstract: An artificial neural network (ANN) based system adapted to process an input pattern and generate a related output pattern having a different number of components than the input pattern. The system (26) comprises an ANN (27) and a memory (28), such as a DRAM memory, that are serially connected. The input pattern (23) is applied to a processor (22), where it may or may not be processed (the most general case), before it is applied to the ANN and stored therein as a prototype (if learned). A category is associated with each stored prototype. The processor computes the coefficients that allow the determination of the estimated values of the output pattern; these coefficients are the components of a so-called intermediate pattern (24). Assuming the ANN has already learned a number of input patterns, when a new input pattern is presented to the ANN in the recognition phase, the category of the closest prototype is output therefrom and is used as a pointer to the memory.
    Type: Application
    Filed: December 17, 2002
    Publication date: July 17, 2003
    Inventors: Pascal Tannhof, Ghislain Imbert De Tremiolles
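    Illustrative sketch: the ANN-plus-memory arrangement of this abstract can be mimicked in a few lines. The sketch below is an assumption-laden software rephrasing (a nearest-prototype lookup stands in for the ANN, a Python dict stands in for the DRAM, and the names are invented): the category returned in the recognition phase is simply used as a pointer into the memory holding the output patterns.

        import numpy as np

        prototypes = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])   # learned input patterns
        categories = [0, 1]                                          # one category per prototype
        memory = {                                                   # plays the role of the DRAM
            0: np.array([10.0, 20.0]),        # output pattern associated with category 0
            1: np.array([30.0, 40.0, 50.0]),  # output patterns may have any number of components
        }

        def recall(input_pattern):
            dists = np.linalg.norm(prototypes - input_pattern, axis=1)
            category = categories[int(np.argmin(dists))]   # recognition phase
            return memory[category]                        # category used as a pointer

        print(recall(np.array([0.9, 1.1, 1.0])))   # -> output pattern stored for category 1
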
  • Patent number: 6523018
    Abstract: The neural semiconductor chip first includes: a global register and control logic circuit block, an R/W memory block and a plurality of neurons fed by buses transporting data such as the input vector data, set-up parameters, etc., and signals such as the feedback and control signals. The R/W memory block, typically a RAM, is common to all neurons to avoid circuit duplication, thereby increasing the number of neurons that can be integrated in the chip. The R/W memory stores the prototype components. Each neuron comprises a computation block, a register block, an evaluation block and a daisy chain block to chain the neurons. All these blocks (except the computation block) have a symmetric structure and are designed so that each neuron may operate in a dual manner, i.e. either as a single neuron (single mode) or as two independent neurons (dual mode). Each neuron generates local signals.
    Type: Grant
    Filed: December 22, 1999
    Date of Patent: February 18, 2003
    Assignee: International Business Machines Corporation
    Inventors: Didier Louis, Pascal Tannhof, André Steimle
  • Publication number: 20030023569
    Abstract: An improved Artificial Neural Network (ANN) is disclosed that comprises a conventional ANN, a database block, and a compare and update circuit. The conventional ANN is formed by a plurality of neurons, each neuron having a prototype memory dedicated to storing a prototype and a distance evaluator to evaluate the distance between the input pattern presented to the ANN and the prototype stored therein. The database block holds: all the prototypes arranged in slices, each slice capable of storing up to a maximum number of prototypes; the input patterns or queries to be presented to the ANN; and the distances resulting from the evaluation performed during the recognition/classification phase. The compare and update circuit compares each distance with the distance previously found for the same input pattern and updates (or not) the distance previously stored.
    Type: Application
    Filed: May 3, 2002
    Publication date: January 30, 2003
    Applicant: International Business Machines Corporation
    Inventors: Ghislain Imbert De Tremiolles, Pascal Tannhof
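    Illustrative sketch: a minimal software rephrasing of the slice-by-slice operation described in this abstract (the function names, the Euclidean distance and the pure-software setting are assumptions, not the patented circuit). Prototypes are arranged in slices no larger than the physical ANN, and a compare-and-update step keeps, for each query, the smallest distance seen so far.

        import numpy as np

        def classify_with_slices(queries, prototype_slices, slice_categories):
            """prototype_slices: list of arrays, each no larger than the number of
            prototypes the physical ANN can hold at once."""
            best_dist = np.full(len(queries), np.inf)
            best_cat = [None] * len(queries)
            for protos, cats in zip(prototype_slices, slice_categories):
                for i, q in enumerate(queries):
                    dists = np.linalg.norm(protos - q, axis=1)   # distance evaluators
                    j = int(np.argmin(dists))
                    # Compare-and-update: keep the previously stored distance only
                    # if it is still the smaller one.
                    if dists[j] < best_dist[i]:
                        best_dist[i] = dists[j]
                        best_cat[i] = cats[j]
            return best_cat, best_dist

        slices = [np.array([[0.0, 0.0], [5.0, 5.0]]), np.array([[1.0, 1.0]])]
        cats = [[0, 1], [2]]
        print(classify_with_slices(np.array([[0.9, 1.2]]), slices, cats))   # ([2], ...)
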
  • Patent number: 6502083
    Abstract: The improved neuron is connected to input buses which transport input data and control signals. It basically consists of a computation block, a register block, an evaluation block and a daisy chain block. All these blocks, except the computation block, have a substantially symmetric construction. Registers are used to store data: the local norm and context, the distance, the AIF value and the category. The improved neuron further needs some R/W memory capacity, which may be placed either in the neuron or outside it. The evaluation circuit is connected to an output bus to generate global signals thereon. The daisy chain block allows the improved neuron to be chained with others to form an artificial neural network (ANN). The improved neuron may work either as a single neuron (single mode) or as two independent neurons (dual mode). In the latter case, the computation block, which is common to the two dual neurons, must operate sequentially to service one neuron after the other.
    Type: Grant
    Filed: December 22, 1999
    Date of Patent: December 31, 2002
    Assignee: International Business Machines Corporation
    Inventors: Didier Louis, Pascal Tannhof, Andre Steimle
  • Publication number: 20020095394
    Abstract: Consider a plurality of input patterns having an essential characteristic in common but which differ in at least one parameter (this parameter modifies the input pattern to some extent but not this essential characteristic for a specific application). During the learning phase, each input pattern is normalized in a normalizer before it is presented to a classifier. If not recognized, it is learned, i.e. the normalized pattern is stored in the classifier as a prototype with its associated category. From a predetermined reference value of that parameter, the normalizer computes an element related to said parameter which allows the normalized pattern to be derived from the input pattern and, conversely, the input pattern to be retrieved from the normalized pattern. As a result, all these input patterns are represented by the same normalized pattern. The above method and circuits reduce the number of prototypes required in the classifier, thereby improving its response quality.
    Type: Application
    Filed: December 11, 2001
    Publication date: July 18, 2002
    Inventors: Ghislain Imbert De Tremiolles, Pascal Tannhof
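    Illustrative sketch: a minimal Python sketch of the normalizer idea in this abstract, assuming (purely for illustration) that the varying parameter is an amplitude factor and that the predetermined reference value is an amplitude of 1.0. Two input patterns that differ only in amplitude map to the same normalized pattern, and the computed element (the scale) lets the original pattern be retrieved.

        import numpy as np

        REFERENCE_AMPLITUDE = 1.0   # predetermined reference value of the parameter

        def normalize(pattern):
            amplitude = np.max(np.abs(pattern))       # parameter extracted from the pattern
            scale = REFERENCE_AMPLITUDE / amplitude   # element related to the parameter
            return pattern * scale, scale             # normalized pattern + stored element

        def denormalize(normalized, scale):
            return normalized / scale                 # retrieve the original input pattern

        # Two patterns with the same shape but different amplitudes share one prototype.
        a = np.array([0.5, 1.0, 0.5])
        b = np.array([1.0, 2.0, 1.0])
        na, sa = normalize(a)
        nb, sb = normalize(b)
        print(np.allclose(na, nb), np.allclose(denormalize(na, sa), a))   # True True
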
  • Publication number: 20020073053
    Abstract: The method and circuits of the present invention aim to associate a norm with each component of an input pattern presented to an input space mapping algorithm based artificial neural network (ANN) during the distance evaluation process. The set of norms, referred to as the “component” norms, is memorized in specific memorization means in the ANN. In a first embodiment, the ANN is provided with a global memory, common to all the neurons of the ANN, that memorizes all the component norms. For each component of the input pattern, all the neurons perform the elementary (or partial) distance calculation with the corresponding prototype components stored therein during the distance evaluation process, using the associated component norm. The elementary distance calculations are then combined using a “distance” norm to determine the final distance between the input pattern and the prototypes stored in the neurons.
    Type: Application
    Filed: September 12, 2001
    Publication date: June 13, 2002
    Applicant: International Business Machines Corporation
    Inventors: Ghislain Imbert de Tremiolles, Pascal Tannhof
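    Illustrative sketch: the per-component norms of this abstract can be written down in a few lines. The sketch below is only a software illustration with assumed norms (an absolute-difference norm, a match/no-match norm, and sum or max as the “distance” norm); the patent itself concerns how such norms are memorized and applied inside the ANN hardware.

        def abs_norm(a, b):          # L1-style elementary distance for numeric components
            return abs(a - b)

        def match_norm(a, b):        # 0 if equal, 1 otherwise (for symbolic components)
            return 0 if a == b else 1

        def distance(input_pattern, prototype, component_norms, distance_norm=sum):
            # One elementary (partial) distance per component, each computed with its own
            # norm, then combined by the "distance" norm (e.g. sum for L1, max for Lsup).
            partials = [norm(x, p) for norm, x, p
                        in zip(component_norms, input_pattern, prototype)]
            return distance_norm(partials)

        norms = [abs_norm, abs_norm, match_norm]   # memorized once, common to all neurons
        print(distance([3.0, 7.0, "red"], [1.0, 7.5, "blue"], norms))        # 3.5
        print(distance([3.0, 7.0, "red"], [1.0, 7.5, "blue"], norms, max))   # 2.0
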
  • Publication number: 20020059150
    Abstract: The method and circuits of the present invention aim to associate a norm with each component of an input pattern presented to an input space mapping algorithm based artificial neural network (ANN) during the distance evaluation process. The set of norms, referred to as the “component” norms, is memorized in specific memorization means in the ANN. In a first embodiment, the ANN is provided with a global memory, common to all the neurons of the ANN, that memorizes all the component norms. For each component of the input pattern, all the neurons perform the elementary (or partial) distance calculation with the corresponding prototype components stored therein during the distance evaluation process, using the associated component norm. The elementary distance calculations are then combined using a “distance” norm to determine the final distance between the input pattern and the prototypes stored in the neurons.
    Type: Application
    Filed: July 12, 2001
    Publication date: May 16, 2002
    Applicant: IBM
    Inventors: Ghislain Imbert de Tremiolles, Pascal Tannhof
  • Patent number: 6377941
    Abstract: A method of achieving automatic learning of an input vector presented to an artificial neural network (ANN) formed by a plurality of neurons, using the K nearest neighbor (KNN) mode. Upon providing an input vector to be learned to the ANN, a Write component operation is performed to store the input vector components in the first available free neuron of the ANN. Then, a Write category operation is performed by assigning a category defined by the user to the input vector. Next, a test is performed to determine whether this category matches the categories of the nearest prototypes, i.e. those located at the minimum distance. If it matches, this first free neuron is not engaged. Otherwise, it is engaged by assigning that category to it. As a result, the input vector becomes the new prototype with that category associated thereto. Further described is a circuit which automatically retains the first free neuron of the ANN for learning.
    Type: Grant
    Filed: June 22, 1999
    Date of Patent: April 23, 2002
    Assignee: International Business Machines Corporation
    Inventors: Andre Steimle, Pascal Tannhof
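    Illustrative sketch: a pure-software rephrasing of the automatic KNN learning step of this abstract (the class and method names and the Manhattan distance are assumptions). The candidate vector and its user-defined category are written to the first free neuron; the neuron is engaged only if the nearest existing prototypes do not already carry that category.

        import numpy as np

        class KnnAnn:
            def __init__(self):
                self.prototypes = []   # components stored in the engaged neurons
                self.categories = []   # category of each engaged neuron

            def learn(self, vector, category):
                # Write component / Write category: the vector and its category are
                # placed in the first free neuron, which is engaged only if needed.
                if self.prototypes:
                    dists = [np.abs(np.array(p) - vector).sum() for p in self.prototypes]
                    dmin = min(dists)
                    nearest = {self.categories[i] for i, d in enumerate(dists) if d == dmin}
                    if nearest == {category}:
                        return False   # nearest prototypes already carry this category
                self.prototypes.append(list(vector))
                self.categories.append(category)
                return True            # free neuron engaged: the vector becomes a prototype

        ann = KnnAnn()
        print(ann.learn(np.array([0, 0]), "A"))   # True  (first prototype is always engaged)
        print(ann.learn(np.array([0, 1]), "A"))   # False (nearest prototype already says "A")
        print(ann.learn(np.array([0, 1]), "B"))   # True  (category mismatch: engage)
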
  • Patent number: 6347309
    Abstract: The improved neural network of the present invention results from the combination of a dedicated logic block with a conventional neural network based upon a mapping of the input space, usually employed to classify input data by computing the distance between said input data and the prototypes memorized therein. The improved neural network is able to classify input data, for instance represented by a vector A, even when some of its components are noisy or unknown during either the learning or the recognition phase. To that end, influence fields of various and different shapes are created for each neuron of the conventional neural network. The logic block transforms at least some of the n components (A1, . . . , An) of the input vector A into the m components (V1, . . . , Vm) of a network input vector V according to a linear or non-linear transform function F. In turn, vector V is applied as the input data to said conventional neural network.
    Type: Grant
    Filed: December 30, 1998
    Date of Patent: February 12, 2002
    Assignee: International Business Machines Corporation
    Inventors: Ghislain Imbert De Tremiolles, Pascal Tannhof
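    Illustrative sketch: a minimal Python sketch of the front-end transform described in this abstract. The particular choice of F (appending pairwise products of the components) is an assumption made up for illustration; the patent only requires a linear or non-linear transform applied before the conventional network so that influence fields of various shapes can be obtained in the original input space.

        import numpy as np

        def F(a):
            # Hypothetical transform: keep the raw components and append their pairwise
            # products, so that spherical influence fields in V-space correspond to
            # non-spherical regions in the original A-space.
            a = np.asarray(a, dtype=float)
            products = np.outer(a, a)[np.triu_indices(len(a))]
            return np.concatenate([a, products])

        def classify(a, prototypes_v, categories, radius):
            v = F(a)                                              # network input vector V
            dists = np.linalg.norm(prototypes_v - v, axis=1)
            j = int(np.argmin(dists))
            return categories[j] if dists[j] <= radius else None  # ROI-style decision

        protos = np.vstack([F([0.0, 0.0]), F([1.0, 1.0])])        # prototypes stored in V-space
        print(classify([0.1, 0.1], protos, ["A", "B"], radius=1.0))   # -> "A"
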
  • Publication number: 20010013048
    Abstract: In the search for the minimum value among a set of p Numbers coded on q bits, each Number is split into K sub-values coded on n bits (q>=K×n). Parameter K thus assigns a rank to each sub-value, so that K slices of bits are formed wherein each slice is composed of sub-values of the same rank. Each sub-value is then encoded on m bits (m>n) using a “thermometric” coding technique. A parallel search is then performed on the first slice of encoded sub-values (MSBs) to determine the minimum sub-value of that slice. All the Numbers associated with sub-values greater than the minimum sub-value just evaluated are deselected. The evaluation process continues in the same way until the last slice (LSBs) has been processed. At the end of the evaluation process, the Number which remains selected has the minimum value. The response time (i.e. the number of processing steps) now depends only upon the number K of sub-values into which the Numbers have been split.
    Type: Application
    Filed: January 4, 2001
    Publication date: August 9, 2001
    Inventors: Ghislain Imbert de Tremiolles, Didier Louis, Pascal Tannhof
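    Illustrative sketch: a software rephrasing of the slice-by-slice minimum search of this abstract (the function names and the 16-bit example are assumptions). In the patent the per-slice minimum is found in parallel thanks to the thermometric coding; here it is simply emulated with min(), but the elimination logic is the same: after each slice, every Number whose sub-value exceeds the slice minimum is deselected.

        def split(number, k, n):
            """Split a q-bit Number into k sub-values of n bits, MSB slice first."""
            return [(number >> (n * (k - 1 - i))) & ((1 << n) - 1) for i in range(k)]

        def find_minimum(numbers, k=4, n=4):
            slices = [split(x, k, n) for x in numbers]
            selected = list(range(len(numbers)))
            for rank in range(k):                                    # MSB slice ... LSB slice
                slice_min = min(slices[i][rank] for i in selected)   # parallel in hardware
                # Deselect every Number whose sub-value exceeds the slice minimum.
                selected = [i for i in selected if slices[i][rank] == slice_min]
                if len(selected) == 1:
                    break
            return numbers[selected[0]]

        values = [0xA3F1, 0x9C02, 0x9BFF, 0xD000]
        print(hex(find_minimum(values)))   # -> 0x9bff, after at most K = 4 slice steps
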
  • Patent number: 5740326
    Abstract: In a neural network of N neuron circuits, each engaged neuron calculates a p-bit-wide distance between an input vector and the prototype vector stored in its weight memory. An aggregate search/sort circuit (517) is formed from the N engaged neurons' search/sort circuits and determines the minimum distance among the calculated distances. Each search/sort circuit (502-1) has p elementary search/sort units connected in series to form a column, such that the aggregate circuit is a matrix of elementary search/sort units. The distance bit signals of the same bit rank are applied to the search/sort units of each row. A feedback signal is generated by ORing in an OR gate (12.1) all local search/sort output signals from the elementary search/sort units of the same row. The search process is based on identifying zeroes in the distance bit signals, from the MSBs to the LSBs. As a zero is found in a row, all the columns with a one in that row are excluded from the subsequent row search.
    Type: Grant
    Filed: June 7, 1995
    Date of Patent: April 14, 1998
    Assignee: International Business Machines Corporation
    Inventors: Jean-Yves Boulet, Pascal Tannhof, Guy Paillet
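    Illustrative sketch: a software rephrasing of the MSB-to-LSB zero search performed by the aggregate search/sort circuit (the function name and the 8-bit example are assumptions; the wired-OR feedback of the hardware is emulated with an explicit test). Whenever a still-selected distance shows a zero in the current bit row, every distance showing a one in that row is excluded.

        def min_distance_columns(distances, p=8):
            """Return the index (or indices) of the minimum p-bit distance."""
            selected = list(range(len(distances)))
            for bit in range(p - 1, -1, -1):                  # MSB row first
                row = {i: (distances[i] >> bit) & 1 for i in selected}
                if any(b == 0 for b in row.values()):         # a zero was found in this row...
                    # ...so every column showing a one here cannot hold the minimum.
                    selected = [i for i in selected if row[i] == 0]
            return selected

        dists = [0b10110010, 0b01001111, 0b01001011]
        print(min_distance_columns(dists))   # -> [2]
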
  • Patent number: 5717832
    Abstract: A base neural semiconductor chip (10) including a neural network or unit (11(#)). The neural network (11(#)) has a plurality of neuron circuits fed by different buses transporting data such as the input vector data, set-up parameters, and control signals. Each neuron circuit (11) includes logic for generating local result signals of the "fire" type (F) and a local output signal (NOUT) of the distance or category type on respective buses (NR-BUS, NOUT-BUS). An OR circuit (12) performs an OR function for all corresponding local result and output signals to generate respective first global result (R*) and output (OUT*) signals on respective buses (R*-BUS, OUT*-BUS) that are merged in an on-chip common communication bus (COM*-BUS) shared by all neuron circuits of the chip.
    Type: Grant
    Filed: June 7, 1995
    Date of Patent: February 10, 1998
    Assignee: International Business Machines Corporation
    Inventors: Andre Steimle, Pascal Tannhof, Guy Paillet
  • Patent number: 5710869
    Abstract: Each daisy chain circuit is serially connected to the two adjacent neuron circuits, so that all the neuron circuits form a chain. The daisy chain circuit distinguishes between the two possible states of the neuron circuit (engaged or free) and identifies the first free, or "ready to learn", neuron circuit in the chain, based on the respective values of the input (DCI) and output (DCO) signals of the daisy chain circuit. The ready-to-learn neuron circuit is the only neuron circuit of the neural network having daisy chain input and output signals complementary to each other. The daisy chain circuit includes a 1-bit register (601) controlled by a store enable signal (ST) which is active at initialization or, during the learning phase, when a new neuron circuit is engaged. At initialization, all the Daisy registers of the chain are forced to a first logic value.
    Type: Grant
    Filed: June 7, 1995
    Date of Patent: January 20, 1998
    Assignee: International Business Machines Corporation
    Inventors: Catherine Godefroy, Andre Steimle, Pascal Tannhof, Guy Paillet
  • Patent number: 5621863
    Abstract: In a neural network comprised of a plurality of neuron circuits, an improved neuron circuit that generates local result signals, e.g. of the fire type, and a local output signal of the distance or category type. The neuron circuit is connected to buses that transport input data (e.g. the input category) and control signals. A multi-norm distance evaluation circuit calculates the distance D between the input vector and a prototype vector stored in a R/W memory circuit. A distance compare circuit compares this distance D with either the stored prototype vector's actual influence field or the lower limit thereof to generate first and second comparison signals. An identification circuit processes the comparison signals, the input category signal, the local category signal and a feedback signal to generate local result signals that represent the neuron circuit's response to the input vector.
    Type: Grant
    Filed: June 7, 1995
    Date of Patent: April 15, 1997
    Assignee: International Business Machines Corporation
    Inventors: Jean-Yves Boulet, Didier Louis, Catherine Godefroy, Andre Steimle, Pascal Tannhof, Guy Paillet
  • Patent number: 5155572
    Abstract: A vertical isolated-collector PNP transistor structure (58) comprises a P+ region (45), an N region (44) and a P- well region (46) which form the emitter, the base and the collector, respectively. The P- well region is enclosed in an N type pocket comprised of an N+ buried layer (48) and an N reach-through region (47) in contact therewith. The contact regions (46-1, 47-1) to the P- well region (46) and to the N reach-through region (47) are shorted to define a common collector contact (59). In addition, the thickness W of the P- well region (46) is minimized so as to allow transistor action of the parasitic NPN transistor formed by the N PNP base region (44), the P- well region (46) and the N+ buried layer (48), respectively as the collector, the base and the emitter of that parasitic transistor. The PNP transistor structure (67) may be combined with a conventional NPN transistor structure (61).
    Type: Grant
    Filed: April 4, 1991
    Date of Patent: October 13, 1992
    Assignee: International Business Machines Corporation
    Inventors: Dominique Bonneau, Myriam Combes, Anthony J. Dally, Pierre Mollier, Seiki Ogura, Pascal Tannhof
  • Patent number: 5089725
    Abstract: The base circuit comprises a self-referenced preamplifier (31) of the differential type connected between first and second supply voltages and a push-pull output buffer stage connected between second and third supply voltages. The push-pull output buffer stage comprises a pull-up transistor and a pull-down transistor connected in series, with the circuit output node coupled therebetween. These transistors are driven by complementary and substantially simultaneous signals (S and its complement) supplied by the preamplifier. Both branches of the preamplifier are tied at a first output node (M). The first branch comprises a logic block performing the desired logic function of the base circuit that is connected through a load resistor to the second supply voltage. The logic block consists of three parallel-connected input NPN transistors, whose emitters are coupled together at the first output node for NOR operation.
    Type: Grant
    Filed: October 26, 1990
    Date of Patent: February 18, 1992
    Assignee: IBM Corporation
    Inventors: Pierre Mollier, Jean-Paul Nuez, Pascal Tannhof
  • Patent number: 5023478
    Abstract: The present invention relates to fast complementary emitter follower drivers/buffers to be used in either a CMOS or a pure complementary bipolar environment. The output driver (22) comprises top NPN and bottom PNP output transistors (T1, T2) with a common output node (N) connected therebetween. A terminal (15) is connected to said output node (N), where the output signal (VOUT) is available. The pair of bipolar output transistors is biased between the first and second supply voltages (VH, GND). The output driver is provided with a voltage translator circuit (S) connected between the base nodes (B1, B2) of the output transistors (T1, T2). Logic signals (IN1, IN2), supplied by a preceding driving circuit (21), are applied to said base nodes. According to the invention, the voltage translator circuit (S) comprises two diodes (D1, D2) connected in series, preferably implemented with a main bipolar transistor having a junction shorted by a diode-connected transistor to form a Darlington-like configuration.
    Type: Grant
    Filed: March 13, 1990
    Date of Patent: June 11, 1991
    Assignee: International Business Machines Corporation
    Inventors: Gerard Boudon, Pierre Mollier, Seiki Ogura, Dominique Omet, Pascal Tannhof, Franck Wallart
  • Patent number: 5010257
    Abstract: According to the present invention, a CMOS interface circuit (C2), similar to a latch made of two CMOS cross-coupled inverters (INV1, INV2), is placed directly on the output node (14) of a conventional BICMOS logic circuit (11) operating alone in a partial swing mode. This latch is made of four FETs P5, P6, N8, N9 cross-coupled in a conventional way with the feedback loop connected to said output node (14). The partial voltage swing (VBE to VH-VBE) naturally given by the output bipolar transistors (T1, T2) mounted in a push-pull configuration is reinforced to full swing (GND to VH) by the latch at the end of each transition. The state of the output node is forced by the latch because of the high driving capability due to the presence of said output bipolar transistors (T1, T2). As a result, the improved BICMOS logic circuit (D2) has an output signal (S) that ranges within the desired full swing voltage at the output terminal (15).
    Type: Grant
    Filed: March 13, 1990
    Date of Patent: April 23, 1991
    Assignee: International Business Machines Corporation
    Inventors: Gerard Boudon, Pierre Mollier, Jean-Paul Nuez, Ieng Ong, Pascal Tannhof, Franck Wallart