Patents Examined by Robert W. Downs
  • Patent number: 5619615
    Abstract: A method and apparatus for identifying an agent based on a decision tree is provided. The apparatus transmits a series of queries to the agent, and the agent transmits responses to the queries back to the apparatus. Based on the responses, the apparatus determines the identity of the agent. The apparatus determines the series of queries by traversing the decision tree based on the responses sent by the agent to previous queries. When the traversal arrives at a leaf node of the decision tree, the agent associated with that leaf node identifies the responding agent. If the traversal fails before arriving at a leaf node, a weight is determined for each supported agent based on the responses sent, and the supported agent with the greatest weight identifies the responding agent.
    Type: Grant
    Filed: July 22, 1994
    Date of Patent: April 8, 1997
    Assignee: Bay Networks, Inc.
    Inventors: Balaji Pitchaikani, Chen-Yea Luo
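    A minimal sketch of the traversal-with-fallback scheme described above: queries walk a decision tree, and when a response has no matching branch, each supported agent is weighted by how many of its expected responses were observed. The tree layout, query names, and expected-response table are assumptions for illustration, not the patented implementation.

      # Sketch only: decision-tree identification with a weighted fallback.
      TREE = {
          "query": "Q1",
          "branches": {
              "A": {"query": "Q2", "branches": {"X": "agent-1", "Y": "agent-2"}},
              "B": "agent-3",
          },
      }

      # Expected responses per supported agent, consulted only if traversal fails.
      EXPECTED = {
          "agent-1": {"Q1": "A", "Q2": "X"},
          "agent-2": {"Q1": "A", "Q2": "Y"},
          "agent-3": {"Q1": "B"},
      }

      def identify(send_query):
          """Traverse the tree; on a dead end, weight agents by matching responses."""
          node, answers = TREE, {}
          while isinstance(node, dict):
              response = send_query(node["query"])      # transmit query, await response
              answers[node["query"]] = response
              child = node["branches"].get(response)
              if child is None:                         # traversal failed before a leaf
                  weights = {agent: sum(answers.get(q) == r for q, r in expected.items())
                             for agent, expected in EXPECTED.items()}
                  return max(weights, key=weights.get)  # greatest weight wins
              node = child
          return node                                   # leaf node names the agent

      print(identify(lambda q: {"Q1": "A", "Q2": "Y"}[q]))   # agent-2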
  • Patent number: 5617511
    Abstract: A neural network shell has a defined interface to an application program. By interfacing with the neural network shell, any application program becomes a neural network application program. The neural network shell contains a set of utility programs that transfers data into and out of a neural network data structure. This set of utility programs allows an application program to define a new neural network model, create a neural network data structure, train a neural network, and run a neural network. Once trained, the neural network data structure can be transported to other computer systems or to application programs written in different computing languages running on similar or different computer systems.
    Type: Grant
    Filed: June 2, 1995
    Date of Patent: April 1, 1997
    Assignee: International Business Machines Corporation
    Inventor: Joseph P. Bigus
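    The "shell" idea recurs in several of the IBM grants listed here: a fixed set of utility entry points that any application program can call, with all state kept in a portable data structure. The sketch below is a speculative reading of that interface; the function names, the dictionary-based data structure, and the toy delta-rule trainer are assumptions, not the patented design.

      # Illustrative shell utilities around a portable network data structure.
      import json
      import random

      def create_network(model, n_inputs, n_outputs):
          """Define a new model and create its neural network data structure."""
          return {"model": model, "trained": False,
                  "weights": [[random.uniform(-0.1, 0.1) for _ in range(n_inputs)]
                              for _ in range(n_outputs)]}

      def run_network(net, x):
          """Forward pass over the stored weights (linear units for brevity)."""
          return [sum(w * xi for w, xi in zip(row, x)) for row in net["weights"]]

      def train_network(net, samples, epochs=100, lr=0.01):
          """Tiny delta-rule trainer standing in for the shell's train utility."""
          for _ in range(epochs):
              for x, target in samples:
                  y = run_network(net, x)
                  for i, t in enumerate(target):
                      for j, xj in enumerate(x):
                          net["weights"][i][j] += lr * (t - y[i]) * xj
          net["trained"] = True
          return net

      def export_network(net, path):
          """Plain data, so the trained structure can move to other systems."""
          with open(path, "w") as f:
              json.dump(net, f)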
  • Patent number: 5617510
    Abstract: A method, useful in computer-aided design, of identifying possible solutions to an over-constrained system having a collection of entities and constraints. The method represents the entities in terms of degrees of freedom and incrementally assembles the system by adding entities, satisfying constraints and reducing the degrees of freedom of the entities. For an over-constrained system, the method constructs a dependency graph of the system and identifies the set of constraints which over-constrains the system. The over-constraining set includes the constraint which initiated the over-constraint and those constraints back-traced in the dependency graph from the initiating constraint. Removal of one or more constraints from the over-constraining set results in a solvable, fully constrained or under-constrained system. Intelligent selection of the removed constraint may increase computational efficiency or system stability.
    Type: Grant
    Filed: April 12, 1995
    Date of Patent: April 1, 1997
    Assignee: Schlumberger Technology Corporation
    Inventors: Walid T. Keyrouz, Glenn A. Kramer, Jahir A. Pabon
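    The back-tracing step named in this abstract, collecting the constraint that initiated the over-constraint together with every constraint reachable backwards from it in the dependency graph, amounts to a graph walk. The dictionary encoding of the dependency graph below is an assumption for the sketch.

      # Sketch: depends_on[c] lists the constraints whose earlier satisfaction
      # constraint c depended on when the system was assembled incrementally.
      from collections import deque

      def over_constraining_set(depends_on, initiating_constraint):
          """Initiating constraint plus every constraint back-traced from it."""
          found = {initiating_constraint}
          frontier = deque([initiating_constraint])
          while frontier:
              c = frontier.popleft()
              for predecessor in depends_on.get(c, ()):
                  if predecessor not in found:
                      found.add(predecessor)
                      frontier.append(predecessor)
          return found

      # C4 triggers the over-constraint; it depended on C2, which depended on C1.
      # Removing any one of {C1, C2, C4} leaves a solvable system.
      deps = {"C4": ["C2"], "C2": ["C1"], "C3": ["C1"]}
      print(over_constraining_set(deps, "C4"))   # {'C4', 'C2', 'C1'}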
  • Patent number: 5615306
    Abstract: A neural network shell has a defined interface to an application program. By interfacing with the neural network shell, any application program becomes a neural network application program. The neural network shell contains a set of utility programs that transfers data into and out of a neural network data structure. This set of utility programs allows an application program to define a new neural network model, create a neural network data structure, train a neural network, and run a neural network. Once trained, the neural network data structure can be transported to other computer systems or to application programs written in different computing languages running on similar or different computer systems.
    Type: Grant
    Filed: June 2, 1995
    Date of Patent: March 25, 1997
    Assignee: International Business Machines Corporation
    Inventor: Joseph P. Bigus
  • Patent number: 5615307
    Abstract: A neural network shell has a defined interface to an application program. By interfacing with the neural network shell, any application program becomes a neural network application program. The neural network shell contains a set of utility programs that transfers data into and out of a neural network data structure. This set of utility programs allows an application program to define a new neural network model, create a neural network data structure, train a neural network, and run a neural network. Once trained, the neural network data structure can be transported to other computer systems or to application programs written in different computing languages running on similar or different computer systems.
    Type: Grant
    Filed: June 2, 1995
    Date of Patent: March 25, 1997
    Assignee: International Business Machines Corporation
    Inventor: Joseph P. Bigus
  • Patent number: 5615308
    Abstract: In a production system in which a plurality of production rules are partitioned into a plurality of clusters for processing to effect cluster control, control expression elements are arranged in the order of execution to provide a control expression part separate from the clusters. Each control expression element describes either an instruction specifying the cluster to be invoked next for execution or an instruction for other flow control. An instruction for a return to the control expression part is described in the action part of an appropriate production rule in a cluster, in place of the cluster-specifying instruction, so that clusters can be invoked sequentially following the instructions described in the control expression part. This makes it unnecessary to specify, in the action part of each production rule, the name of the cluster to be invoked next for execution.
    Type: Grant
    Filed: June 7, 1995
    Date of Patent: March 25, 1997
    Assignee: Toyo Communication Equipment Co., Ltd.
    Inventors: Ichiro Ando, Akio Sasaki, Tomoyuki Minamiyama, Shoichi Maki, Siryo Yasui, Hiroshi Ichise
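    The separation this abstract describes, where sequencing lives in a control expression part and rule actions merely return to it, can be pictured with a short sketch. The cluster contents and working-memory shape are assumptions for illustration.

      # Sketch: clusters hold (condition, action) rules; the control expression
      # part alone decides which cluster runs next.
      CLUSTERS = {
          "classify": [(lambda wm: "input" in wm, lambda wm: wm.update(kind="numeric"))],
          "report":   [(lambda wm: "kind" in wm,  lambda wm: print("kind:", wm["kind"]))],
      }

      def run_cluster(name, working_memory):
          """Fire the matching rules of one cluster, then return to the control part."""
          for condition, action in CLUSTERS[name]:
              if condition(working_memory):
                  action(working_memory)
          # no rule names the next cluster; control returns to the expression part

      def run_control_expression(control_expression, working_memory):
          """Invoke clusters strictly in the order listed by the control expression part."""
          for cluster_name in control_expression:
              run_cluster(cluster_name, working_memory)

      run_control_expression(["classify", "report"], {"input": 3})   # prints: kind: numeric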
  • Patent number: 5615303
    Abstract: A circuit for calculating the values of membership functions in a controller operating with fuzzy logic procedures. The membership functions are of triangular or trapezoidal form and are defined in a so-called discourse universe discretized in a finite number of points. The controller includes a central control unit equipped with a memory section for storage of the membership functions, a microprocessor, and an interface. The membership functions are stored by means of a codification of the coordinate of the vertex and the slopes at the sides of the vertex. The circuit includes a calculator connected to the memory section, the microprocessor, and the interface, to determine the value of each membership function at each point of the discourse universe using the stored vertex and slopes.
    Type: Grant
    Filed: March 21, 1995
    Date of Patent: March 25, 1997
    Assignee: Consorzio per la Ricerca sulla Microelettronica nel Mezzogiorno
    Inventors: Massimo Abruzzese, Biagio Giacalone
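    Evaluating a membership function stored as a vertex coordinate plus the slopes on either side of it reduces to two line equations and a clamp. The sketch below assumes membership degrees normalized to [0, 1]; the patented circuit performs this over a discretized discourse universe in hardware.

      def membership(point, vertex, left_slope, right_slope):
          """Membership degree at one point of the discretized discourse universe."""
          if point <= vertex:
              value = 1.0 - left_slope * (vertex - point)
          else:
              value = 1.0 - right_slope * (point - vertex)
          return max(0.0, min(1.0, value))   # clamp; a zero-slope side gives a trapezoid shoulder

      # Triangular set centred at 5 over an 11-point discourse universe.
      print([round(membership(p, vertex=5, left_slope=0.5, right_slope=0.5), 2)
             for p in range(11)])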
  • Patent number: 5613039
    Abstract: Apparatus for motion detection and tracking of objects in a region for collision avoidance includes a signal transmitter which provides first and second detection signals for at least partial reflection by an object located in a spatial region. The apparatus further includes a signal receiver for receiving the reflected first and second detection signals corresponding to first and second object parameter data signals. The apparatus further includes a Fourier transform circuit for receiving the first and second object parameter data signals and providing first and second Fourier transform object parameter data signals. The apparatus further includes a probabilistic neural network for receiving and sorting the first and second Fourier transform object parameter data signals without the use of a priori training data.
    Type: Grant
    Filed: January 3, 1994
    Date of Patent: March 18, 1997
    Assignee: AIL Systems, Inc.
    Inventors: C. David Wang, James P. Thompson
  • Patent number: 5613042
    Abstract: A chaotic recurrent neural network includes N chaotic neural networks, each receiving an external input and the outputs of the other N-1 chaotic neural networks and performing an operation according to a dynamic equation in which W_ij is the synapse connection coefficient of the feedback input from the j-th neuron to the i-th neuron, X_i(t) is the output of the i-th neuron at time t, and γ_i, α and k are a time-delaying constant, a non-negative parameter and a refractory time attenuation constant, respectively, and in which Z_i(t) represents X_i(t) when i belongs to the neuron group I and represents a_i(t) when i belongs to the external input group E. Also provided is a learning algorithm that increases the learning efficiency of the chaotic recurrent neural network.
    Type: Grant
    Filed: January 27, 1995
    Date of Patent: March 18, 1997
    Assignee: Gold Star Electron Co., Ltd.
    Inventors: Ho-sun Chung, Hye-young Tak
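    The abstract's dynamic equation itself is not reproduced in this listing, so the sketch below falls back on the standard chaotic neuron update (weighted feedback minus an exponentially attenuated refractory term plus the external input, passed through a sigmoid), using the quantities the abstract does name: W_ij, X_i(t), a_i(t), α and k. It should be read as a stand-in, not the patented equation.

      import math

      def step(X, W, a, history, alpha=1.0, k=0.5):
          """One update of N chaotic neurons; history[i] holds past outputs of neuron i."""
          new_X = []
          for i in range(len(X)):
              feedback = sum(W[i][j] * X[j] for j in range(len(X)))     # sum_j W_ij X_j(t)
              refractory = alpha * sum((k ** d) * x                     # alpha * sum_d k^d X_i(t-d)
                                       for d, x in enumerate(reversed(history[i])))
              new_X.append(1.0 / (1.0 + math.exp(-(feedback - refractory + a[i]))))
              history[i].append(new_X[-1])
          return new_X

      X, history = [0.1, 0.2], [[], []]
      W = [[0.0, 0.6], [0.6, 0.0]]
      for _ in range(5):
          X = step(X, W, a=[0.3, 0.3], history=history)
      print([round(x, 3) for x in X])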
  • Patent number: 5613040
    Abstract: A neural network shell has a defined interface to an application program. By interfacing with the neural network shell, any application program becomes a neural network application program. The neural network shell contains a set of utility programs that transfers data into and out of a neural network data structure. This set of utility programs allows an application program to define a new neural network model, create a neural network data structure, train a neural network, and run a neural network. Once trained, the neural network data structure can be transported to other computer systems or to application programs written in different computing languages running on similar or different computer systems.
    Type: Grant
    Filed: June 2, 1995
    Date of Patent: March 18, 1997
    Assignee: International Business Machines Corporation
    Inventor: Joseph P. Bigus
  • Patent number: 5613044
    Abstract: A neural synapse processor apparatus having a neuron architecture for the synapse processing elements of the apparatus. The preferred apparatus has an N neuron structure with synapse processing units that contain instruction and data storage units, receive instructions and data, and execute instructions. The N neuron structure contains communicating adder trees, neuron activation function units, and an arrangement for communicating instructions, data, and the outputs of the neuron activation function units back to the input synapse processing units by means of the communicating adder trees. The apparatus can be structured as a bit-serial or word-parallel system. The preferred structure contains N² synapse processing units, each associated with a connection weight in the N neuron network to be emulated, placed in the form of an N by N matrix that has been folded along the diagonal and made up of diagonal cells and general cells.
    Type: Grant
    Filed: June 2, 1995
    Date of Patent: March 18, 1997
    Assignee: International Business Machines Corporation
    Inventors: Gerald G. Pechanek, Stamatis Vassiliadis, Jose G. Delgado-Frias
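    The "folded along the diagonal" arrangement can be pictured in software: the N by N connection-weight matrix collapses into N diagonal cells and N(N-1)/2 general cells, each general cell holding the symmetric pair of weights. A small sketch of that mapping (the data layout is illustrative only):

      def fold(W):
          """Fold an N x N weight matrix along its diagonal into cell contents."""
          N = len(W)
          diagonal_cells = [W[i][i] for i in range(N)]
          general_cells = {(i, j): (W[i][j], W[j][i]) for i in range(N) for j in range(i)}
          return diagonal_cells, general_cells

      W = [[1, 2, 3],
           [4, 5, 6],
           [7, 8, 9]]
      diag, gen = fold(W)
      print(diag)          # [1, 5, 9]
      print(gen[(1, 0)])   # (4, 2): the weight pair shared by one general cell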
  • Patent number: 5613043
    Abstract: A neural network shell has a defined interface to an application program. By interfacing with the neural network shell, any application program becomes a neural network application program. The neural network shell contains a set of utility programs that transfers data into and out of a neural network data structure. This set of utility programs allows an application program to define a new neural network model, create a neural network data structure, train a neural network, and run a neural network. Once trained, the neural network data structure can be transported to other computer systems or to application programs written in different computing languages running on similar or different computer systems.
    Type: Grant
    Filed: June 2, 1995
    Date of Patent: March 18, 1997
    Assignee: International Business Machines Corporation
    Inventor: Joseph P. Bigus
  • Patent number: 5611065
    Abstract: A base address prediction system for predicting one of a plurality of base addresses to be added to a known relative address in order to generate an absolute address. An actual base address determined from the relative address is also generated. The actual base address determination takes longer to generate than the predicted base address determination, and therefore the predicted base address is used to select a base address as long as the prediction is correct. Circuitry exists to compare the predicted base address with the actual base address, and if not equal, the predicted base address will be nullified, and the actual base address will be used. Prediction modes are dependent on whether the relative address indicates an instruction fetch or an operand fetch. Where the relative address indicates an instruction fetch, the prediction will be based on the last base address used, on the assumption that instructions will be contiguous in a single block of memory.
    Type: Grant
    Filed: September 14, 1994
    Date of Patent: March 11, 1997
    Assignee: Unisys Corporation
    Inventors: Merwin H. Alferness, Joseph P. Kerzman, John Z. Nguyen
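    The core of the scheme is that a fast prediction selects the base address while the slower actual determination completes, and a mispredict nullifies the speculative result. The sketch below follows the instruction-fetch policy the abstract states (reuse the last base address); the operand-fetch policy and the surrounding names are placeholders.

      class BasePredictor:
          def __init__(self):
              self.last_instruction_base = 0

          def predict(self, is_instruction_fetch):
              """Fast path: guess the base before the actual determination finishes."""
              return self.last_instruction_base if is_instruction_fetch else 0

          def resolve(self, predicted_base, actual_base, relative_address, is_instruction_fetch):
              """Compare predicted and actual; on a mismatch use the actual base."""
              if is_instruction_fetch:
                  self.last_instruction_base = actual_base
              base = actual_base if predicted_base != actual_base else predicted_base
              return base + relative_address             # absolute address

      p = BasePredictor()
      guess = p.predict(is_instruction_fetch=True)
      print(p.resolve(guess, actual_base=0x4000, relative_address=0x20,
                      is_instruction_fetch=True))        # corrected on first use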
  • Patent number: 5611020
    Abstract: A neural network shell has a defined interface to an application program. By interfacing with the neural network shell, any application program becomes a neural network application program. The neural network shell contains a set of utility programs that transfers data into and out of a neural network data structure. This set of utility programs allows an application program to define a new neural network model, create a neural network data structure, train a neural network, and run a neural network. Once trained, the neural network data structure can be transported to other computer systems or to application programs written in different computing languages running on similar or different computer systems.
    Type: Grant
    Filed: June 2, 1995
    Date of Patent: March 11, 1997
    Assignee: International Business Machines Corporation
    Inventor: Joseph P. Bigus
  • Patent number: 5608845
    Abstract: A method, and a corresponding apparatus, for determining the remaining lifetime of an aggregate constructed of a plurality of components and having at least one function. A first remaining lifetime of the aggregate is acquired based upon experimental aging degradation data on one characteristic of at least one component of the aggregate; a second remaining lifetime is acquired based upon experimental aging data with respect to at least one function of the aggregate; and a third remaining lifetime is acquired based on both the component degradation data and the function aging data. The shortest of the first through third remaining lifetimes is output as the remaining lifetime of the aggregate.
    Type: Grant
    Filed: November 29, 1993
    Date of Patent: March 4, 1997
    Assignee: Hitachi, Ltd.
    Inventors: Hisao Ohtsuka, Motoaki Utamura
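    The selection step at the end of the abstract is simply the most conservative of the three estimates; a one-line sketch (the numeric estimates are placeholders):

      def remaining_lifetime(component_based, function_based, combined):
          """Output the shortest of the first through third remaining lifetimes."""
          return min(component_based, function_based, combined)

      print(remaining_lifetime(12000.0, 9500.0, 10200.0))   # 9500.0 (e.g., hours)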
  • Patent number: 5608842
    Abstract: In known methods for conducting a process in an automatically controlled system, the system is preset at the beginning of each process run according to at least one process parameter. The at least one process parameter is precomputed with a model of the process which is supplied with input values. During the process, the input values and the process parameters are measured and are used after the process run to adaptively improve the precomputed value of the process parameters. To simplify and improve the precomputation when the model comprises a plurality of partial models, the computed results of the partial models are supplied to a neural network. The neural network produces the process parameters to be precomputed as its network response. The parameters of the neural network are modified after each process run to adapt the precomputed value to the actual process events.
    Type: Grant
    Filed: November 10, 1994
    Date of Patent: March 4, 1997
    Assignee: Siemens Aktiengesellschaft
    Inventors: Einar Broese, Otto Gramckow, Thomas Martinetz, Guenter Soergel
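    One way to picture the adaptation loop: the partial-model results feed a network whose response is the preset process parameter, and after each run the network is nudged toward the measured value. The single linear neuron and learning rate below are assumptions of the sketch, not the patented network.

      def precompute(partial_results, weights, bias):
          """Network response used to preset the process parameter."""
          return bias + sum(w * r for w, r in zip(weights, partial_results))

      def adapt(weights, bias, partial_results, measured, lr=0.05):
          """After the process run, adapt the network toward the measured parameter."""
          error = measured - precompute(partial_results, weights, bias)
          weights = [w + lr * error * r for w, r in zip(weights, partial_results)]
          return weights, bias + lr * error

      weights, bias = [0.5, 0.5], 0.0
      for partial_results, measured in [([1.0, 0.8], 1.1), ([0.9, 1.2], 1.3)]:
          preset = precompute(partial_results, weights, bias)   # preset for this run
          weights, bias = adapt(weights, bias, partial_results, measured)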
  • Patent number: 5608846
    Abstract: This invention relates to an apparatus and method for generating membership functions and rules for a fuzzy system whereby fuzzy rules and membership functions are synthesized by observing a sample output/input data array and by creating new fuzzy sets which closely approximate various data associations in accordance with maximum inference error calculations. Applications of this system are provided in a temperature controller, a vehicle suspension controller, and a neural network/fuzzy rule converter device.
    Type: Grant
    Filed: September 27, 1995
    Date of Patent: March 4, 1997
    Assignee: Omron Corporation
    Inventors: Keiji Mitsubuchi, Satoru Isaka
  • Patent number: 5606646
    Abstract: A recurrent, neural network-based fuzzy logic system includes neurons in a rule base layer which each have a recurrent architecture with an output-to-input feedback path including a time delay element and a neural weight. Further included is a neural network-based, fuzzy logic finite state machine wherein the neural network-based, fuzzy logic system has a recurrent architecture with an output-to-input feedback path including at least a time delay element. Still further included is a recurrent, neural network-based fuzzy logic rule generator wherein a neural network receives and fuzzifies input data and provides data corresponding to fuzzy logic membership functions and recurrent fuzzy logic rules.
    Type: Grant
    Filed: June 24, 1994
    Date of Patent: February 25, 1997
    Assignee: National Semiconductor Corporation
    Inventors: Emdadur R. Khan, Faith A. Unal
  • Patent number: 5604840
    Abstract: An information processing apparatus is composed of an input layer, a hidden layer and an output layer, and performs computation in terms of neuron models. In the information processing apparatus, a forward network comprising the input layer, the hidden layer and the output layer executes a computation on externally input data to determine its output values, and a backward network comprising the output layer and the hidden layer executes a computation on the output values expected for given inputs to determine learning signal values. The information processing apparatus transfers the output values and learning signal values between the forward network and the backward network to modify the synapse weights of the neuron models.
    Type: Grant
    Filed: March 29, 1993
    Date of Patent: February 18, 1997
    Assignee: Hitachi, Ltd.
    Inventors: Mitsuo Asai, Noboru Masuda, Moritoshi Yasunaga, Masayoshi Yagyu, Minoru Yamada, Katsunari Shibata
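    The forward/backward split can be sketched as two passes that exchange output values and learning signal values. The layer sizes, sigmoid units, and squared-error learning signal are assumptions; they stand in for whatever neuron model the apparatus implements.

      import math

      def sigmoid(u):
          return 1.0 / (1.0 + math.exp(-u))

      def forward(x, W_hidden, W_output):
          """Forward network: input -> hidden -> output values."""
          h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W_hidden]
          y = [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in W_output]
          return h, y

      def backward(expected, y, h, W_output):
          """Backward network: learning signals from expected outputs, passed back."""
          delta_out = [(t - yi) * yi * (1 - yi) for t, yi in zip(expected, y)]
          delta_hidden = [hj * (1 - hj) * sum(d * W_output[k][j] for k, d in enumerate(delta_out))
                          for j, hj in enumerate(h)]
          return delta_out, delta_hidden

      def update(W, upstream, deltas, lr=0.1):
          """Synapse weights are modified from the exchanged values."""
          return [[w + lr * d * u for w, u in zip(row, upstream)] for row, d in zip(W, deltas)]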
  • Patent number: 5604841
    Abstract: The presence of particular faults in a machine is determined using constraint suspension and Qualitative Physics by propagating values indicative of received machine signals through a subset of model variables that are either unrestricted or partially restricted. The implicants of each value assigned to a variable of the subset of variables can be determined and other variables can be restricted according to the union of the implicants. Values for fully restricted and partially restricted variables can be propagated in order to further restrict any other variables. The values of variables can be either qualitative, quantitative, or both. A Qualitative Physics model of a machine can be constructed by providing a user with a graphical user interface allowing selection and interconnection of machine components for the model such that the user can define a landmark domain and apply the landmark domain to provide definitions of qualitative value spaces of variables of the model.
    Type: Grant
    Filed: April 26, 1994
    Date of Patent: February 18, 1997
    Assignee: United Technologies Corporation
    Inventors: Thomas P. Hamilton, Robert T. Clark, Steven Gallo