Search Patents
  • Publication number: 20080301075
    Abstract: A neural network comprises trained interconnected neurons. The neural network is configured to constrain the relationship between one or more of its inputs and one or more of its outputs so that the relationships between them are consistent with expectations; and/or the neural network is trained by creating a set of data comprising input data and associated outputs that represent archetypal results, and providing real exemplary input data with associated output data, together with the created data, to the neural network. The real exemplary output data and the created associated output data are compared to the actual output of the neural network, which is adjusted to achieve a best fit to the real exemplary data and the created data.
    Type: Application
    Filed: November 7, 2007
    Publication date: December 4, 2008
    Applicant: Neural Technologies, Ltd.
    Inventors: George Bolt, John Manslow, Alan McLachlan
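
As a rough illustration of the training scheme in the abstract above, the sketch below pools real exemplar pairs with created "archetypal" pairs and fits a small network to the combined set, so its output best fits both. The data, network shape, and learning rate are illustrative assumptions, not taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Real exemplary input/output pairs (illustrative data, not from the patent).
x_real = rng.uniform(-1, 1, (64, 1))
y_real = np.sin(x_real)

# Created "archetypal" pairs encoding the expected input/output relationship,
# e.g. that the output passes through known anchor points.
x_arch = np.array([[-1.0], [0.0], [1.0]])
y_arch = np.sin(x_arch)

# Pool real and created data; the network is fit to both at once.
x = np.vstack([x_real, x_arch])
y = np.vstack([y_real, y_arch])

# One-hidden-layer network trained by plain gradient descent on MSE.
w1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
w2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

for step in range(2000):
    h = np.tanh(x @ w1 + b1)           # hidden activations
    out = h @ w2 + b2                  # actual network output
    err = out - y                      # compared to exemplary + created outputs
    # Backpropagate the mean-squared error and adjust toward a best fit.
    g_out = 2 * err / len(x)
    g_w2 = h.T @ g_out; g_b2 = g_out.sum(0)
    g_h = g_out @ w2.T * (1 - h**2)
    g_w1 = x.T @ g_h; g_b1 = g_h.sum(0)
    w1 -= 0.1 * g_w1; b1 -= 0.1 * g_b1
    w2 -= 0.1 * g_w2; b2 -= 0.1 * g_b2
```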
  • Patent number: 5832466
    Abstract: In the design and implementation of neural networks, training is determined by a series of architectural and parametric decisions. A method is disclosed that, using genetic algorithms, improves the training characteristics of a neural network. The method begins with a population and iteratively modifies one or more parameters in each generation based on the network with the best training response in the previous generation.
    Type: Grant
    Filed: August 12, 1996
    Date of Patent: November 3, 1998
    Assignee: International Neural Machines Inc.
    Inventor: Oleg Feldgajer
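
One way to picture the method in the preceding entry is a genetic-style loop that, each generation, mutates training parameters around the network with the best training response from the previous generation. The sketch below uses a toy fitness surrogate as an assumption; in practice `training_response` would train a network with the given parameters and score how well it trains.

```python
import random

def training_response(params):
    """Stand-in fitness: in practice, train a network with these
    architectural/parametric choices and measure its training response."""
    lr, hidden = params
    return -(lr - 0.05) ** 2 - 0.001 * abs(hidden - 32)  # toy surrogate

def mutate(params):
    # Modify one or more parameters by small random perturbations.
    lr, hidden = params
    return (max(1e-4, lr + random.gauss(0, 0.01)),
            max(1, hidden + random.randint(-4, 4)))

# Begin with a population, then each generation modify parameters
# based on the network with the best training response so far.
population = [(random.uniform(0.001, 0.2), random.randint(4, 64))
              for _ in range(10)]
for generation in range(20):
    best = max(population, key=training_response)
    population = [best] + [mutate(best) for _ in range(9)]

print("best parameters:", max(population, key=training_response))
```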
  • Patent number: 6067536
    Abstract: A neural network circuit for recognizing voices, images, and the like comprises a weight memory that holds a large number of initially learned weight values, one per input terminal of each of the neurons forming the neural network, and a difference value memory that stores the differences between those weight values and additionally learned weight values. The weight memory is formed by a ROM; the difference value memory is formed by an SRAM, for example. When recognizing input data, the initial weight values from the weight memory and the difference values from the difference value memory are added together, and the summed weights are used to calculate the output value of each neuron of the output layer. Because the difference value memory needs only a small capacity, the initial weight values can be additionally learned at high speed.
    Type: Grant
    Filed: May 29, 1997
    Date of Patent: May 23, 2000
    Assignee: Matsushita Electric Industrial Co., Ltd.
    Inventors: Masakatsu Maruyama, Hiroyuki Nakahira, Masaru Fukuda, Shiro Sakiyama
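
A minimal software model of the two-memory scheme above: fixed initial weights stand in for the ROM, a small table of learned differences stands in for the SRAM, and the two are summed at recognition time. Shapes and names are assumptions for illustration.

```python
import numpy as np

class RomPlusDeltaLayer:
    """Fixed initial weights (the ROM) plus learned differences (the SRAM)."""

    def __init__(self, rom_weights):
        self.rom = rom_weights                   # read-only initial weights
        self.delta = np.zeros_like(rom_weights)  # additionally learned diffs

    def effective_weights(self):
        # During recognition the initial weights and the difference
        # values are added together before computing neuron outputs.
        return self.rom + self.delta

    def forward(self, x):
        return np.maximum(0, x @ self.effective_weights())

    def additional_learning(self, grad, lr=0.01):
        # Only the small difference memory is written, so incremental
        # learning is fast and needs little writable capacity.
        self.delta -= lr * grad

rng = np.random.default_rng(1)
layer = RomPlusDeltaLayer(rng.normal(0, 1, (8, 4)))
y = layer.forward(rng.normal(0, 1, (2, 8)))
```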
  • Patent number: 10198691
    Abstract: Disclosed are various embodiments of memristive neural networks comprising neural nodes. Memristive nanofibers are used to form artificial synapses in the neural networks. Each memristive nanofiber may couple one or more neural nodes to one or more other neural nodes. In one case, a memristive neural network includes a first neural node, a second neural node, and a memristive fiber that couples the first neural node to the second neural node. The memristive fiber comprises a conductive core and a memristive shell, where the conductive core forms a communications path between the first neural node and the second neural node and the memristive shell forms a memristor synapse between the first neural node and the second neural node.
    Type: Grant
    Filed: December 19, 2016
    Date of Patent: February 5, 2019
    Assignee: UNIVERSITY OF FLORIDA RESEARCH FOUNDATION, INC.
    Inventors: Juan Claudio Nino, Jack Kendall
  • Patent number: 11657284
    Abstract: An electronic apparatus for compressing a neural network model may acquire training data pairs based on an original, trained neural network model and use the acquired training data pairs to train a compressed neural network model derived from the original.
    Type: Grant
    Filed: May 6, 2020
    Date of Patent: May 23, 2023
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Jaedeok Kim, Chiyoun Park, Youngchul Sohn, Inkwon Choi
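
A compact sketch of the compression flow described above, in the spirit of knowledge distillation: inputs are sampled, the original trained model's outputs are recorded as targets, and a smaller model is trained on the acquired pairs. The shapes and the rank-2 factorization are illustrative assumptions, not the patent's method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for the original, trained model (a fixed full-rank map here).
W_orig = rng.normal(0, 1, (16, 4))
def original_model(x):
    return np.tanh(x @ W_orig)

# Acquire training data pairs based on the original model: sample inputs
# and record the original model's outputs as targets.
x = rng.normal(0, 1, (256, 16))
y = original_model(x)

# Compressed model: the same map factored through a rank-2 bottleneck.
A = rng.normal(0, 0.1, (16, 2))
B = rng.normal(0, 0.1, (2, 4))

for _ in range(2000):
    h = x @ A
    pred = np.tanh(h @ B)
    d = (pred - y) * (1 - pred**2) / len(x)  # MSE gradient through tanh
    A -= 0.5 * (x.T @ d @ B.T)
    B -= 0.5 * (h.T @ d)
```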
  • Patent number: 11301749
    Abstract: A method for calculating the output of a neural network, including the steps of generating, by stochastic rounding, a first neural network with discrete edge weights from a neural network with precise edge weights; generating a second neural network with discrete edge weights from the same precise-weight network, again by stochastic rounding; and calculating an output by adding together the outputs of the first and second neural networks.
    Type: Grant
    Filed: November 8, 2017
    Date of Patent: April 12, 2022
    Assignee: Robert Bosch GmbH
    Inventors: Christoph Schorn, Sebastian Vogel
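
The stochastic-rounding ensemble in this abstract is straightforward to sketch: round the precise weights onto a discrete grid twice, independently, and add the two discrete networks' outputs so the independent rounding errors tend to cancel. The grid step and network below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def stochastic_round(w, step=0.25):
    """Round each weight to a discrete grid, up or down at random with
    probability proportional to proximity (stochastic rounding)."""
    scaled = w / step
    low = np.floor(scaled)
    p_up = scaled - low  # closer to the upper level -> more likely rounded up
    return (low + (rng.random(w.shape) < p_up)) * step

# Precise edge weights of a trained single-layer network (illustrative).
W_precise = rng.normal(0, 1, (8, 3))

# Generate two discrete-weight networks by independent stochastic rounding.
W1 = stochastic_round(W_precise)
W2 = stochastic_round(W_precise)

def net(x, W):
    return np.tanh(x @ W)

# The final output adds the outputs of the two discrete networks.
x = rng.normal(0, 1, (1, 8))
output = net(x, W1) + net(x, W2)
```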
  • Patent number: 11609792
    Abstract: The present disclosure relates to a method for allocating resources of an accelerator to two or more neural networks for execution. The two or more neural networks may include a first neural network and a second neural network, each of which includes multiple computational layers. The method comprises analyzing the workloads of the first and second neural networks, evaluating the computational resources of the accelerator for executing each computational layer of both networks, and scheduling the accelerator's computational resources to execute one computational layer of the first neural network and one or more computational layers of the second neural network.
    Type: Grant
    Filed: March 19, 2019
    Date of Patent: March 21, 2023
    Assignee: Alibaba Group Holding Limited
    Inventors: Lingjie Xu, Wei Wei
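
A hypothetical greedy version of the scheduling idea above: run one layer of the first network per step and pack as many layers of the second network as the remaining accelerator cores allow. The core counts are invented for illustration, and the patent does not prescribe this particular policy.

```python
# Illustrative per-layer compute demand (in accelerator core counts).
net_a_layers = [8, 4, 6]      # cores needed per layer of network A
net_b_layers = [2, 3, 2, 1]   # cores needed per layer of network B
TOTAL_CORES = 10

schedule = []
i = j = 0
while i < len(net_a_layers):
    # Execute one computational layer of network A...
    step = [("A", i)]
    free = TOTAL_CORES - net_a_layers[i]
    i += 1
    # ...and one or more layers of network B, as remaining cores allow.
    while j < len(net_b_layers) and net_b_layers[j] <= free:
        free -= net_b_layers[j]
        step.append(("B", j))
        j += 1
    schedule.append(step)
while j < len(net_b_layers):          # any layers of B left over
    schedule.append([("B", j)])
    j += 1

print(schedule)
```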
  • Patent number: 10748057
    Abstract: Methods, apparatus, and computer readable media related to combining and/or training one or more neural network modules based on version identifier(s) assigned to the neural network module(s). Some implementations are directed to using version identifiers of neural network modules in determining whether and/or how to combine multiple neural network modules to generate a combined neural network model for use by a robot and/or other apparatus. Some implementations are additionally or alternatively directed to assigning a version identifier to an endpoint of a neural network module based on one or more other neural network modules to which the neural network module is joined during training of the neural network module.
    Type: Grant
    Filed: September 21, 2016
    Date of Patent: August 18, 2020
    Assignee: X DEVELOPMENT LLC
    Inventors: Adrian Li, Mrinal Kalakrishnan
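
A toy rendering of the versioning idea in the preceding entry: each module endpoint carries a version identifier, two modules may be combined only when the joining endpoints' versions are compatible, and joint training stamps the shared endpoint with a new version. The names and the equality-based compatibility rule are assumptions.

```python
from dataclasses import dataclass

@dataclass
class NeuralModule:
    """A trainable module whose endpoints carry version identifiers."""
    name: str
    input_version: str
    output_version: str

def can_join(upstream: NeuralModule, downstream: NeuralModule) -> bool:
    # Combine two modules only when the upstream output endpoint and the
    # downstream input endpoint carry compatible version identifiers.
    return upstream.output_version == downstream.input_version

def train_jointly(a: NeuralModule, b: NeuralModule, new_version: str):
    # After joint training, the shared endpoint is stamped with a version
    # reflecting the modules it was trained against.
    a.output_version = new_version
    b.input_version = new_version

vision = NeuralModule("vision", "raw-v1", "feat-v1")
policy = NeuralModule("policy", "feat-v1", "act-v1")
assert can_join(vision, policy)
```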
  • Patent number: 11630990
    Abstract: The present disclosure provides systems, methods, and computer-readable media for optimizing neural architecture search in an automated machine learning process. In one aspect, a neural architecture search method includes selecting a neural architecture for training as part of an automated machine learning process; collecting statistical parameters on individual nodes of the neural architecture during the training; determining, based on the statistical parameters, active nodes of the neural architecture to form a candidate neural architecture; and validating the candidate neural architecture to produce a trained neural architecture to be used in implementing an application or a service.
    Type: Grant
    Filed: March 19, 2019
    Date of Patent: April 18, 2023
    Assignee: Cisco Technology, Inc.
    Inventors: Abhishek Singh, Debojyoti Dutta
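
A minimal sketch of the statistics-driven selection step above: collect a per-node statistic during training (mean absolute activation here, an assumption) and keep the nodes that clear a threshold as the active nodes of the candidate architecture.

```python
import numpy as np

rng = np.random.default_rng(4)

# Statistics collected on individual nodes during training: here, the
# mean absolute activation of each hidden node (illustrative values).
activations = np.abs(rng.normal(0, 1, (1000, 32)))  # 1000 steps, 32 nodes
node_stats = activations.mean(axis=0)

# Nodes whose statistic clears a threshold are deemed active; the
# candidate architecture keeps only those nodes for validation.
ACTIVE_THRESHOLD = 0.7
active_nodes = np.flatnonzero(node_stats > ACTIVE_THRESHOLD)
print(f"candidate keeps {active_nodes.size} of {node_stats.size} nodes")
```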
  • Patent number: 5852815
    Abstract: Constructing and simulating artificial neural networks and components thereof within a spreadsheet environment results in user-friendly neural networks which do not require algorithm-based software in order to train or operate. Such neural networks can be easily cascaded to form complex neural networks and neural network systems, including neural networks capable of self-organizing so as to self-train within a spreadsheet, neural networks which train simultaneously within a spreadsheet, and neural networks capable of autonomously moving, monitoring, analyzing, and altering data within a spreadsheet. Neural networks can also be cascaded together in self-training neural network form to achieve a device prototyping system.
    Type: Grant
    Filed: May 15, 1998
    Date of Patent: December 22, 1998
    Inventor: Stephen L. Thaler
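
The spreadsheet construction can be pictured as each neuron occupying a cell whose formula weights neighboring cells and squashes the sum, so the sheet's own recalculation does the computing. A hypothetical Python rendering of one such cell formula (cell names are illustrative):

```python
import math

# Inputs and weights held in spreadsheet cells (illustrative layout).
cells = {"A1": 0.5, "A2": -0.3, "B1": 0.8, "B2": 0.4}

def recalc_C1(cells):
    # Equivalent of the cell formula =1/(1+EXP(-(A1*B1+A2*B2))):
    # a sigmoid neuron computed by the sheet's own recalculation.
    net = cells["A1"] * cells["B1"] + cells["A2"] * cells["B2"]
    return 1 / (1 + math.exp(-net))

cells["C1"] = recalc_C1(cells)
```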
  • Patent number: 8738554
    Abstract: The present invention provides an event-driven universal neural network circuit. The circuit comprises a plurality of neural modules. Each neural module comprises multiple digital neurons such that each neuron in a neural module has a corresponding neuron in another neural module. An interconnection network comprising a plurality of digital synapses interconnects the neural modules. Each synapse interconnects a first neural module to a second neural module by interconnecting a neuron in the first neural module to a corresponding neuron in the second neural module. Corresponding neurons in the first neural module and the second neural module communicate via the synapses. Each synapse comprises a learning rule associating a neuron in the first neural module with a corresponding neuron in the second neural module. A control module generates signals which define a set of time steps for event-driven operation of the neurons and event communication via the interconnection network.
    Type: Grant
    Filed: September 16, 2011
    Date of Patent: May 27, 2014
    Assignee: International Business Machines Corporation
    Inventor: Dharmendra S. Modha
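
A toy discrete-time model of the event-driven circuit above: at each control-defined time step, events cross one synapse per pair of corresponding neurons, and each synapse applies a learning rule (a Hebbian update here, as an assumption; the patent does not fix the rule).

```python
import numpy as np

rng = np.random.default_rng(5)

N = 4                             # neurons per neural module
module_a = rng.random(N) > 0.5    # spike state of module A's neurons
weights = np.full(N, 0.5)         # one synapse per corresponding neuron pair

# The control module defines discrete time steps; at each step events
# travel across the synapses between corresponding neurons, and each
# synapse applies its learning rule.
for t in range(10):
    module_b = (module_a * weights) > 0.25    # events received in module B
    weights += 0.05 * (module_a & module_b)   # strengthen co-active pairs
    weights = np.clip(weights, 0.0, 1.0)
    module_a = rng.random(N) > 0.5            # next step's input events
```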
  • Patent number: 10204118
    Abstract: Embodiments of the invention relate to mapping neural dynamics of a neural model on to a lookup table. One embodiment comprises defining a phase plane for a neural model. The phase plane represents neural dynamics of the neural model. The phase plane is coarsely sampled to obtain state transition information for multiple neuronal states. The state transition information is mapped on to a lookup table.
    Type: Grant
    Filed: June 29, 2016
    Date of Patent: February 12, 2019
    Assignee: International Business Machines Corporation
    Inventors: Rodrigo Alvarez-Icaza Rivera, John V. Arthur, Andrew S. Cassidy, Pallab Datta, Paul A. Merolla, Dharmendra S. Modha
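
A small sketch of the lookup-table mapping, assuming an Izhikevich-style two-variable neuron model (the patent covers neural models generally): coarsely sample the (v, u) phase plane, record each sampled state's transition, then step the neuron by table lookup instead of integrating the equations.

```python
import numpy as np

# Simple two-variable neuron model (Izhikevich-style; an assumption).
def step(v, u, dt=1.0):
    dv = 0.04 * v**2 + 5 * v + 140 - u
    du = 0.02 * (0.2 * v - u)
    return v + dt * dv, u + dt * du

# Coarsely sample the (v, u) phase plane and record where each sampled
# state transitions to; this becomes the lookup table.
v_grid = np.linspace(-80, 30, 56)
u_grid = np.linspace(-20, 10, 16)
lut = {(i, j): step(v, u)
       for i, v in enumerate(v_grid)
       for j, u in enumerate(u_grid)}

def lut_step(v, u):
    # At run time, quantize the state and read the transition from the
    # table instead of evaluating the dynamics equations.
    i = np.abs(v_grid - v).argmin()
    j = np.abs(u_grid - u).argmin()
    return lut[(i, j)]

print(lut_step(-65.0, -13.0))
```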
  • Patent number: 11604941
    Abstract: A method of training an action selection neural network to perform a demonstrated task using a supervised learning technique. The action selection neural network is configured to receive demonstration data comprising actions to perform the task and rewards received for performing the actions. The action selection neural network has auxiliary prediction task neural networks on one or more of its intermediate outputs, and it is trained using multiple combined losses, concurrently with the auxiliary prediction task neural networks.
    Type: Grant
    Filed: October 29, 2018
    Date of Patent: March 14, 2023
    Assignee: DeepMind Technologies Limited
    Inventor: Todd Andrew Hester
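
A sketch of training with combined losses as described above: a shared trunk feeds an action-selection head trained on demonstrated actions and an auxiliary head trained to predict the observed rewards, with both gradients flowing through the trunk concurrently. The single auxiliary task, shapes, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Demonstration data: states, demonstrated actions, observed rewards.
s = rng.normal(0, 1, (128, 8))
a = rng.integers(0, 3, 128)      # demonstrated action indices
r = rng.normal(0, 1, (128, 1))   # rewards received for those actions

W_trunk = rng.normal(0, 0.1, (8, 16))
W_act = rng.normal(0, 0.1, (16, 3))  # action-selection head
W_aux = rng.normal(0, 0.1, (16, 1))  # auxiliary reward-prediction head

for _ in range(200):
    h = np.tanh(s @ W_trunk)                  # intermediate output
    logits = h @ W_act
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    r_hat = h @ W_aux

    # Combined losses: supervised action cross-entropy plus auxiliary
    # prediction loss, trained concurrently through the shared trunk.
    g_logits = p.copy()
    g_logits[np.arange(128), a] -= 1          # softmax cross-entropy grad
    g_logits /= 128
    g_aux = 2 * (r_hat - r) / 128             # MSE grad for reward head

    g_h = g_logits @ W_act.T + g_aux @ W_aux.T
    W_act -= 0.5 * (h.T @ g_logits)
    W_aux -= 0.5 * (h.T @ g_aux)
    W_trunk -= 0.5 * (s.T @ (g_h * (1 - h**2)))
```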
  • Patent number: 11521068
    Abstract: According to various embodiments, a method for generating one or more optimal neural network architectures is disclosed. The method includes providing an initial seed neural network architecture and utilizing sequential phases to synthesize the neural network until a desired neural network architecture is reached. The phases include a gradient-based growth phase and a magnitude-based pruning phase.
    Type: Grant
    Filed: October 25, 2018
    Date of Patent: December 6, 2022
    Assignee: THE TRUSTEES OF PRINCETON UNIVERSITY
    Inventors: Xiaoliang Dai, Hongxu Yin, Niraj K. Jha
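
The two phases in this abstract can be sketched as mask operations on a sparse weight matrix: the growth phase activates the inactive connections with the largest loss-gradient magnitudes, and the pruning phase deactivates the weakest active weights. The sizes and stand-in gradients are assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Initial seed architecture: a weight matrix with a sparse connection mask.
W = rng.normal(0, 0.3, (16, 16))
mask = rng.random((16, 16)) < 0.1

def grow(W, mask, grads, k=10):
    # Gradient-based growth: activate the k inactive connections whose
    # loss gradients have the largest magnitude.
    scores = np.where(mask, -np.inf, np.abs(grads))
    idx = np.unravel_index(np.argsort(scores, axis=None)[-k:], W.shape)
    mask[idx] = True
    return mask

def prune(W, mask, k=10):
    # Magnitude-based pruning: deactivate the k weakest active weights.
    scores = np.where(mask, np.abs(W), np.inf)
    idx = np.unravel_index(np.argsort(scores, axis=None)[:k], W.shape)
    mask[idx] = False
    return mask

grads = rng.normal(0, 1, (16, 16))  # stand-in for real loss gradients
mask = grow(W, mask, grads)         # growth phase
mask = prune(W, mask)               # pruning phase
```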
  • Patent number: 5845271
    Abstract: Constructing and simulating artificial neural networks and components thereof within a spreadsheet environment results in user-friendly neural networks which do not require algorithm-based software in order to train or operate. Such neural networks can be easily cascaded to form complex neural networks and neural network systems, including neural networks capable of self-organizing so as to self-train within a spreadsheet, neural networks which train simultaneously within a spreadsheet, and neural networks capable of autonomously moving, monitoring, analyzing, and altering data within a spreadsheet. Neural networks can also be cascaded together in self-training neural network form to achieve a device prototyping system.
    Type: Grant
    Filed: January 26, 1996
    Date of Patent: December 1, 1998
    Inventor: Stephen L. Thaler
  • Patent number: 11195094
    Abstract: A method of updating a neural network may be provided. A method may include selecting a number of neurons for a layer of a neural network such that the number of neurons in the layer is less than at least one of the number of neurons in a first layer of the neural network and the number of neurons in a second, adjacent layer. The method may further include at least one of inserting the layer between the first layer and the second layer and replacing one of the first layer and the second layer with the layer, to reduce the number of connections in the neural network.
    Type: Grant
    Filed: January 17, 2017
    Date of Patent: December 7, 2021
    Assignee: FUJITSU LIMITED
    Inventor: Michael Lee
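
The connection-count arithmetic behind this method is worth making explicit: a layer of k neurons inserted between fully connected layers of n1 and n2 neurons replaces n1*n2 direct connections with n1*k + k*n2, a reduction whenever k < n1*n2 / (n1 + n2). The layer sizes below are illustrative.

```python
# Connections between fully connected layers of n1 and n2 neurons: n1*n2.
n1, n2 = 512, 512
direct = n1 * n2                    # 262,144 connections

# Insert a layer narrower than both neighbors; connections drop to
# n1*k + k*n2, which is smaller whenever k < n1*n2 / (n1 + n2) = 256.
k = 64
with_bottleneck = n1 * k + k * n2   # 65,536 connections

print(direct, with_bottleneck)
```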
  • Publication number: 20130117210
    Abstract: Certain aspects of the present disclosure support techniques for unsupervised neural replay, learning refinement, association and memory transfer.
    Type: Application
    Filed: November 9, 2011
    Publication date: May 9, 2013
    Applicant: QUALCOMM Incorporated
    Inventors: Jason Frank Hunzinger, Victor Hokkiu Chan
  • Publication number: 20130073494
    Abstract: The present invention provides an event-driven universal neural network circuit. The circuit comprises a plurality of neural modules. Each neural module comprises multiple digital neurons such that each neuron in a neural module has a corresponding neuron in another neural module. An interconnection network comprising a plurality of digital synapses interconnects the neural modules. Each synapse interconnects a first neural module to a second neural module by interconnecting a neuron in the first neural module to a corresponding neuron in the second neural module. Corresponding neurons in the first neural module and the second neural module communicate via the synapses. Each synapse comprises a learning rule associating a neuron in the first neural module with a corresponding neuron in the second neural module. A control module generates signals which define a set of time steps for event-driven operation of the neurons and event communication via the interconnection network.
    Type: Application
    Filed: September 16, 2011
    Publication date: March 21, 2013
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventor: Dharmendra S. Modha
  • Publication number: 20130117211
    Abstract: Certain aspects of the present disclosure support techniques for unsupervised neural replay, learning refinement, association and memory transfer.
    Type: Application
    Filed: November 9, 2011
    Publication date: May 9, 2013
    Applicant: QUALCOMM Incorporated
    Inventors: Jason Frank Hunzinger, Victor Hokkiu Chan
  • Patent number: 11977916
    Abstract: A neural network processing unit (NPU) includes a processing element array, an NPU memory system configured to store at least a portion of data of an artificial neural network model processed in the processing element array, and an NPU scheduler configured to control the processing element array and the NPU memory system based on artificial neural network model structure data or artificial neural network data locality information.
    Type: Grant
    Filed: December 31, 2020
    Date of Patent: May 7, 2024
    Assignee: DEEPX CO., LTD.
    Inventor: Lok Won Kim
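
A highly simplified sketch of locality-aware scheduling for the NPU described above: given per-layer output sizes (invented numbers below), a scheduler keeps a layer's output in the NPU memory system when it fits, so the next layer reads it locally, and spills to external DRAM otherwise. The patent's scheduler uses richer structure data and data locality information than this.

```python
# Illustrative layer descriptors: (name, output bytes).
layers = [("conv1", 512_000), ("conv2", 256_000),
          ("fc1", 64_000), ("fc2", 8_000)]
NPU_MEMORY_BYTES = 300_000

plan = []
for name, out_bytes in layers:
    # Keep a layer's output on-chip when it fits, so the next layer
    # reads it locally; otherwise spill it to external DRAM.
    location = "on-chip" if out_bytes <= NPU_MEMORY_BYTES else "DRAM"
    plan.append((name, location))

print(plan)
```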