Patents by Inventor James Bradley Aimone

James Bradley Aimone has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190079729
    Abstract: A method of increasing an efficiency at which a plurality of threshold gates arranged as neuromorphic hardware is able to perform a linear algebraic calculation having a dominant size of N. The computer-implemented method includes using the plurality of threshold gates to perform the linear algebraic calculation in a manner that is simultaneously efficient and at a near constant depth. “Efficient” is defined as a calculation algorithm that uses fewer of the plurality of threshold gates than a naïve algorithm. The naïve algorithm is a straightforward algorithm for solving the linear algebraic calculation. “Constant depth” is defined as an algorithm that has an execution time that is independent of a size of an input to the linear algebraic calculation. The near constant depth comprises a computing depth equal to or between O(log(log(N))) and the constant depth.
    Type: Application
    Filed: September 8, 2017
    Publication date: March 14, 2019
    Inventors: James Bradley Aimone, Ojas D. Parekh, Cynthia A. Phillips
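    A minimal sketch of the threshold-gate model behind this abstract, in Python with NumPy. It only illustrates that a single gate's evaluation depth does not grow with the input size N; it is not the patented circuit construction, and the weights, threshold, and random input below are arbitrary choices for illustration.

      import numpy as np

      def threshold_gate(x, weights, theta):
          """Fire (output 1) iff the weighted sum of the inputs reaches the threshold."""
          return int(np.dot(weights, x) >= theta)

      # One gate decides whether at least half of N binary inputs are active.
      # The decision is a single weighted sum, so the depth of this step is
      # constant: it does not grow with N.
      rng = np.random.default_rng(0)
      N = 1_000
      x = rng.integers(0, 2, size=N)
      print(threshold_gate(x, weights=np.ones(N), theta=N // 2))

    A naive gate-level algorithm for a full linear algebraic calculation would compose many such gates; the abstract's claim concerns how few gates, and how little depth, that composition needs.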
  • Publication number: 20190034358
    Abstract: A method and system for accessing a memory for a data processing system. The method comprises sending a read request for a plurality of locations in the memory to read the plurality of locations in parallel based on an upper bound for reading the memory. The upper bound for a number of locations is based on a group of constraints for the memory. The method receives a summed value of a plurality of memory values in the plurality of locations in the memory.
    Type: Application
    Filed: July 31, 2017
    Publication date: January 31, 2019
    Inventors: Conrad D. James, Tu-Thach Quach, Sapan Agarwal, James Bradley Aimone
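    A toy model of the summed-read idea in this abstract, written in Python with NumPy. The class name, the max_locations value, and the stored values are illustrative assumptions; the patent derives the upper bound from constraints of the actual memory, which are not modeled here.

      import numpy as np

      class SummedReadMemory:
          """Memory model whose read of several locations returns one summed value."""

          def __init__(self, values, max_locations=8):
              self.values = np.asarray(values, dtype=float)
              self.max_locations = max_locations  # stand-in for the constraint-derived bound

          def summed_read(self, addresses):
              # The read targets the requested locations in parallel, subject to
              # the upper bound on how many locations one access may cover.
              if len(addresses) > self.max_locations:
                  raise ValueError("read request exceeds the upper bound for one access")
              return float(self.values[list(addresses)].sum())

      mem = SummedReadMemory(np.arange(16))
      print(mem.summed_read([1, 2, 3]))  # one access returns 6.0, the summed value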
  • Publication number: 20170177993
    Abstract: A method and computer system for managing a neural network. Data is sent into an input layer in a portion of layers of nodes in the neural network. The data moves on an encode path through the portion such that an output layer in the portion outputs encoded data. The encoded data is sent into the output layer on a decode path through the portion back to the input layer to obtain a reconstruction of the data by the input layer. A determination is made as to whether an undesired amount of error has occurred in the output layer based on the data sent into the input layer and the reconstruction of the data. A number of new nodes is added to the output layer when it is determined that the undesired amount of error has occurred, enabling the error to be reduced using the new nodes.
    Type: Application
    Filed: December 18, 2015
    Publication date: June 22, 2017
    Inventors: Timothy J. Draelos, James Bradley Aimone
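    A compact sketch of the grow-on-error idea described above, in Python with NumPy. The tied encode/decode weights, the tanh activations, and the error tolerance are arbitrary illustrative choices, not the patented training procedure; the sketch only shows the structural step of adding new nodes when the reconstruction error is undesirable.

      import numpy as np

      rng = np.random.default_rng(1)

      def reconstruct(x, W):
          """Encode with W, then decode back to the input layer with W.T (tied weights)."""
          code = np.tanh(x @ W)        # encode path
          return np.tanh(code @ W.T)   # decode path back to the input layer

      def maybe_add_nodes(W, error, tolerance, n_new=2):
          """Widen the encoding layer with freshly initialized nodes if error is too high."""
          if error <= tolerance:
              return W
          new_cols = 0.01 * rng.standard_normal((W.shape[0], n_new))
          return np.hstack([W, new_cols])

      x = rng.standard_normal((32, 10))
      W = 0.1 * rng.standard_normal((10, 4))
      error = np.mean((x - reconstruct(x, W)) ** 2)
      W = maybe_add_nodes(W, error, tolerance=0.05)
      print(W.shape)  # the layer grows from 4 to 6 nodes when the error is undesirable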
  • Patent number: 8630966
    Abstract: An apparatus, article and method containing an artificial neural network that, after training, produces new trainable nodes such that input data representative of a first event and input data representative of a second event both activate a subset of the new trainable nodes. The artificial neural network can generate an output that is influenced by the input data of both events. In various embodiments, the new trainable nodes are sequentially produced and show decreasing trainability over time such that, at a particular point in time, newer produced nodes are more trainable than earlier produced nodes. The artificial neural network can be included in various embodiments of methods, apparatus and articles for use in predicting or profiling events.
    Type: Grant
    Filed: January 27, 2010
    Date of Patent: January 14, 2014
    Assignee: Salk Institute for Biological Studies
    Inventors: Fred H. Gage, James Bradley Aimone, Janet Wiles
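    A small sketch of the decreasing-trainability idea in this patent's abstract, in Python with NumPy. The exponential decay and the learning-rate scaling are arbitrary illustrative choices; the point is only that nodes produced later (lower age) remain more trainable than nodes produced earlier.

      import numpy as np

      def plasticity(age, decay=0.5):
          """Learning-rate multiplier that shrinks as a node ages."""
          return np.exp(-decay * np.asarray(age, dtype=float))

      # Nodes produced sequentially: the youngest node (age 0) gets the largest update.
      ages = np.array([3, 2, 1, 0])
      base_lr = 0.1
      print(base_lr * plasticity(ages))  # newer nodes are more trainable than older ones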
  • Publication number: 20100235310
    Abstract: An apparatus, article and method containing an artificial neural network that, after training, produces new trainable nodes such that input data representative of a first event and input data representative of a second event both activate a subset of the new trainable nodes. The artificial neural network can generate an output that is influenced by the input data of both events. In various embodiments, the new trainable nodes are sequentially produced and show decreasing trainability over time such that, at a particular point in time, newer produced nodes are more trainable than earlier produced nodes. The artificial neural network can be included in various embodiments of methods, apparatus and articles for use in predicting or profiling events.
    Type: Application
    Filed: January 27, 2010
    Publication date: September 16, 2010
    Inventors: Fred H. Gage, James Bradley Aimone, Janet Wiles