Patents by Inventor Aaron Russell Voelker

Aaron Russell Voelker is named as an inventor on the patent filings listed below. The listing includes patent applications that are still pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11741098
    Abstract: The present invention relates to methods and systems for storing and querying database entries with neuromorphic computers. The system comprises a plurality of encoding subsystems that convert database entries and search keys into vector representations, a plurality of associative memory subsystems that match vector representations of search keys to vector representations of database entries using spike-based comparison operations, a plurality of binding subsystems that update retrieved vector representations during the execution of hierarchical queries, a plurality of unbinding subsystems that extract information from retrieved vector representations, a plurality of cleanup subsystems that remove noise from these retrieved representations, and one or more input search key representations that propagate spiking activity through the associative memory, binding, unbinding, cleanup, and readout subsystems to retrieve database entries matching the search key.
    Type: Grant
    Filed: July 15, 2020
    Date of Patent: August 29, 2023
    Assignee: APPLIED BRAIN RESEARCH INC.
    Inventors: Aaron Russell Voelker, Christopher David Eliasmith, Peter Blouw
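The binding, unbinding, and cleanup operations in the abstract above can be illustrated outside a neuromorphic setting. Below is a minimal NumPy sketch, assuming circular convolution as the binding operator and a brute-force similarity scan as a stand-in for the associative cleanup memory; the vocabulary names and dimensionality are illustrative, not taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 256  # vector dimensionality (illustrative assumption)

def bind(a, b):
    # Circular convolution: one common vector-symbolic binding operator.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(a, b):
    # Bind with the approximate inverse of b (the involution of b).
    return bind(a, np.roll(b[::-1], 1))

def unit_vector(d, rng):
    v = rng.normal(size=d)
    return v / np.linalg.norm(v)

# Vocabulary of random unit vectors for keys and values.
vocab = {name: unit_vector(D, rng) for name in
         ["NAME", "AGE", "alice", "thirty"]}

# Encode a database entry as a superposition of key-value bindings.
entry = bind(vocab["NAME"], vocab["alice"]) + bind(vocab["AGE"], vocab["thirty"])

# Query: unbind the NAME key, then "clean up" the noisy result by
# picking the most similar vocabulary item.
noisy = unbind(entry, vocab["NAME"])
best = max(vocab, key=lambda n: np.dot(noisy, vocab[n]))
```

On a spiking substrate the same operations would be carried out by the encoding, binding, unbinding, and cleanup subsystems the abstract describes; the sketch only shows the vector algebra they implement.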
  • Publication number: 20220172053
    Abstract: A method is described for designing systems that provide efficient implementations of feed-forward, recurrent, and deep networks that process dynamic signals using temporal filters and static or time-varying nonlinearities. A system design methodology is described that provides an engineered architecture. This architecture defines a core set of network components and operations for efficient computation of dynamic signals using temporal filters and static or time-varying nonlinearities. These methods apply to a wide variety of connected nonlinearities that include temporal filters in the connections. Here we apply the methods to synaptic models coupled with spiking and/or non-spiking neurons whose connection parameters are determined using a variety of methods of optimization.
    Type: Application
    Filed: December 20, 2021
    Publication date: June 2, 2022
    Applicant: Applied Brain Research Inc.
    Inventors: Aaron Russell Voelker, Christopher David Eliasmith
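As a rough illustration of processing a dynamic signal with a temporal filter followed by a static nonlinearity, here is a sketch using a discretized first-order low-pass synapse model; the time step, time constant, and rectified-linear nonlinearity are illustrative assumptions, not parameters from the application.

```python
import numpy as np

dt = 0.001   # simulation time step in seconds (assumption)
tau = 0.05   # synaptic time constant in seconds (assumption)
a = np.exp(-dt / tau)  # discretized first-order low-pass coefficient

def lowpass(signal):
    # Temporal filter on the connection: y[t] = a*y[t-1] + (1-a)*u[t]
    y, out = 0.0, []
    for u in signal:
        y = a * y + (1.0 - a) * u
        out.append(y)
    return np.array(out)

# A dynamic input (a step), filtered by the synapse model and then
# passed through a static nonlinearity (rectified linear with a bias).
t = np.arange(0, 0.5, dt)
u = np.ones_like(t)
filtered = lowpass(u)
rates = np.maximum(0.0, filtered - 0.2)
```

The design methodology in the abstract generalizes this pattern to networks of such filtered connections, with the connection parameters found by optimization.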
  • Publication number: 20220138382
    Abstract: The present invention relates to methods and systems for simulating and predicting dynamical systems with vector symbolic representations of continuous spaces. More specifically, the present invention specifies methods for simulating and predicting such dynamics through the definition of temporal fractional binding, collection, and decoding subsystems that collectively function both to create vector symbolic representations of multi-object trajectories and to decode these representations to simulate or predict the future states of these trajectories. Systems composed of one or more of these temporal fractional binding, collection, and decoding subsystems are combined to simulate or predict the behavior of at least one dynamical system that involves the motion of at least one object.
    Type: Application
    Filed: November 5, 2021
    Publication date: May 5, 2022
    Inventors: Aaron Russell Voelker, Christopher David Eliasmith, Peter Blouw, Terrance Stewart, Nicole Sandra-Yaffe Dumont
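The temporal fractional binding idea above can be sketched in NumPy: a "unitary" base vector is raised to a real-valued exponent in the Fourier domain, so a trajectory becomes a superposition of time-stamped, position-stamped bindings that a decoding step can query by similarity. The dimensionality, the one-object trajectory, and the similarity-scan decoder are illustrative assumptions, not details from the application.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 512  # dimensionality of the vector symbols (assumption)

def make_unitary(d, rng):
    # Real vector whose Fourier coefficients all have magnitude one, so
    # fractional powers are well-defined and norm-preserving.
    n = d // 2 + 1
    phases = rng.uniform(-np.pi, np.pi, size=n)
    phases[0] = 0.0   # DC term must be real
    phases[-1] = 0.0  # Nyquist term must be real (d is even)
    return np.fft.irfft(np.exp(1j * phases), n=d)

def power(v, p):
    # Fractional binding: raise a base vector to a real exponent.
    return np.fft.irfft(np.fft.rfft(v) ** p, n=len(v))

def bind(a, b):
    # Circular convolution binding.
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=len(a))

T = make_unitary(D, rng)  # base vector for time
X = make_unitary(D, rng)  # base vector for position

# Encode a one-object trajectory x(t) = 2t, sampled at three times,
# into a single superposition vector (the "collection" step).
memory = sum(bind(power(T, t), power(X, 2.0 * t)) for t in [0.0, 1.0, 2.0])

# Decode the position at t = 1.0: unbind T^1, then compare against
# candidate positions (a simple decoding subsystem).
probe = bind(memory, power(T, -1.0))
candidates = np.arange(0.0, 4.5, 0.5)
sims = [probe @ power(X, c) for c in candidates]
decoded = candidates[int(np.argmax(sims))]
```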
  • Publication number: 20220083867
    Abstract: The present invention relates to methods and systems for using neural networks to simulate dynamical systems for purposes of solving optimization problems. More specifically, the present invention defines methods and systems that perform a process of “synaptic descent”, wherein the state of a given synapse in a neural network is a variable being optimized, the input to the synapse is a gradient defined with respect to this state, and the synapse implements the computations of an optimizer that performs gradient descent over time. Synapse models regulate the dynamics of a given neural network by governing how the output of one neuron is passed as input to another, and since the process of synaptic descent performs gradient descent with respect to the state variables defining these dynamics, it can be harnessed to evolve the neural network towards a state or sequence of states that encodes the solution to an optimization problem.
    Type: Application
    Filed: September 14, 2021
    Publication date: March 17, 2022
    Inventors: Aaron Russell Voelker, Christopher David Eliasmith
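The core idea can be sketched with a toy problem: treat a single synaptic state as the variable being optimized, feed it the gradient of an objective with respect to that state, and let the synapse's update rule carry out gradient descent over time. The quadratic objective, learning rate, and step count below are illustrative assumptions.

```python
# Toy objective: minimize f(s) = (s - 3)^2, so df/ds = 2(s - 3).
def grad(s):
    return 2.0 * (s - 3.0)

# The synapse state s is the variable being optimized.  At each step
# the "input" delivered to the synapse is the gradient at its current
# state, and the synapse's dynamics implement plain gradient descent.
s = 0.0    # initial synapse state
lr = 0.1   # learning rate (assumption)
for _ in range(100):
    s -= lr * grad(s)
```

In the patent's framing, the network's synapses evolve this way in parallel, so the network's dynamical state converges toward an encoding of the optimization problem's solution.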
  • Patent number: 11238345
    Abstract: Neural network architectures, with connection weights determined using Legendre Memory Unit equations, are trained while optionally keeping the determined weights fixed. Networks may use spiking or non-spiking activation functions, may be stacked or recurrently coupled with other neural network architectures, and may be implemented in software and hardware. Embodiments of the invention provide systems for pattern classification, data representation, and signal processing that compute using orthogonal polynomial basis functions that span sliding windows of time.
    Type: Grant
    Filed: March 6, 2020
    Date of Patent: February 1, 2022
    Assignee: Applied Brain Research Inc.
    Inventors: Aaron Russell Voelker, Christopher David Eliasmith
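A minimal sketch of the Legendre Memory Unit equations referenced above, assuming the published continuous-time (A, B) matrices, forward-Euler integration, and an illustrative choice of memory order and window length. The state holds Legendre coefficients of the input over a sliding window, and shifted Legendre polynomials decode a delayed copy of the input.

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

q = 6        # memory order / number of Legendre coefficients (assumption)
theta = 1.0  # sliding window length in seconds (assumption)
dt = 0.001

# Continuous-time LMU state-space matrices: theta * m'(t) = A m(t) + B u(t).
A = np.zeros((q, q))
B = np.zeros(q)
for i in range(q):
    B[i] = (2 * i + 1) * (-1.0) ** i
    for j in range(q):
        A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))

# Euler-integrate the memory while feeding in a 0.5 Hz sine wave.
m = np.zeros(q)
for t in np.arange(0.0, 2.25, dt):
    u = np.sin(np.pi * t)
    m = m + (dt / theta) * (A @ m + B * u)

# Decode the input from theta seconds ago using shifted Legendre
# polynomials evaluated at r = delay / theta.
r = 1.0  # look back by the full window length
decode = np.array([Legendre.basis(i)(2 * r - 1) for i in range(q)])
delayed = float(decode @ m)
```

Varying r between 0 and 1 reads out the input at intermediate delays from the same state, which is the sliding-window representation the abstract refers to.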
  • Patent number: 11238337
    Abstract: A method is described for designing systems that provide efficient implementations of feed-forward, recurrent, and deep networks that process dynamic signals using temporal filters and static or time-varying nonlinearities. A system design methodology is described that provides an engineered architecture. This architecture defines a core set of network components and operations for efficient computation of dynamic signals using temporal filters and static or time-varying nonlinearities. These methods apply to a wide variety of connected nonlinearities that include temporal filters in the connections. Here we apply the methods to synaptic models coupled with spiking and/or non-spiking neurons whose connection parameters are determined using a variety of methods of optimization.
    Type: Grant
    Filed: August 22, 2016
    Date of Patent: February 1, 2022
    Assignee: Applied Brain Research Inc.
    Inventors: Aaron Russell Voelker, Christopher David Eliasmith
  • Publication number: 20210342668
    Abstract: Recurrent neural networks are efficiently mapped to hardware computation blocks specifically designed for Legendre Memory Unit (LMU) cells, Projected LSTM cells, and Feed Forward cells. Iterative resource allocation algorithms are used to partition recurrent neural networks and time-multiplex them onto a spatial distribution of computation blocks, guided by multivariable optimizations for power, performance, and accuracy. Embodiments of the invention provide systems for low-power, high-performance deployment of recurrent neural networks for battery-sensitive applications such as automatic speech recognition (ASR), keyword spotting (KWS), biomedical signal processing, and other applications that involve processing time-series data.
    Type: Application
    Filed: April 29, 2021
    Publication date: November 4, 2021
    Inventors: Gurshaant Singh Malik, Aaron Russell Voelker, Christopher David Eliasmith
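The resource-allocation step can be caricatured as a load-balancing problem: assign each cell's per-timestep workload to the compute block with the least accumulated work, so time-multiplexed blocks finish at roughly the same time. This greedy largest-first sketch and its workload numbers are illustrative assumptions; the patent's iterative algorithms also optimize for power and accuracy.

```python
def allocate(workloads, n_blocks):
    # Greedy largest-first assignment: each cell goes to the block with
    # the smallest total load so far.
    blocks = [[] for _ in range(n_blocks)]
    loads = [0] * n_blocks
    for name, work in sorted(workloads.items(), key=lambda kv: -kv[1]):
        i = loads.index(min(loads))
        blocks[i].append(name)
        loads[i] += work
    return blocks, loads

# Hypothetical per-timestep workloads (e.g. multiply-accumulates) for a
# mix of LMU, Projected LSTM, and Feed Forward cells.
workloads = {"lmu_1": 900, "lmu_2": 700, "lstm_p": 600,
             "ff_1": 400, "ff_2": 200}
blocks, loads = allocate(workloads, n_blocks=2)
```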
  • Publication number: 20210133568
    Abstract: The present invention relates to methods of sparsifying signals over time in multi-bit spiking neural networks, methods of training and converting these networks by interpolating between spiking and non-spiking regimes, and their efficient implementation in digital hardware. Four algorithms are provided that encode signals produced by nonlinear functions or spiking neuron models, signals supplied as input to the network, and any linear combination thereof, as multi-bit spikes that may be compressed and adaptively scaled in size, in order to balance metrics including the desired accuracy of the network and the available energy in hardware.
    Type: Application
    Filed: October 30, 2020
    Publication date: May 6, 2021
    Inventor: Aaron Russell Voelker
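One way to picture multi-bit spike encoding is to quantize each sample to an integer count of fixed-size spikes while carrying the rounding residual forward, so the accumulated encoding error stays bounded over time. The step size and this particular error-feedback scheme are illustrative assumptions, not one of the patent's four algorithms.

```python
import numpy as np

def to_multibit_spikes(signal, step=0.25):
    # Quantize each sample to an integer number of "step"-sized spikes,
    # carrying the rounding residual forward so the cumulative encoded
    # signal never drifts more than step/2 from the true signal.
    residual = 0.0
    spikes = []
    for x in signal:
        total = x + residual
        k = int(np.round(total / step))  # multi-bit spike count (may be 0)
        spikes.append(k)
        residual = total - k * step
    return np.array(spikes)

t = np.linspace(0, 1, 100)
signal = 0.5 * np.sin(2 * np.pi * t)
spikes = to_multibit_spikes(signal)
decoded = spikes * 0.25  # receiver scales counts back by the step size
```

Small integer counts compress well, and enlarging or shrinking the step trades accuracy against the energy spent transmitting spikes, which is the balance the abstract describes.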
  • Publication number: 20210133190
    Abstract: The present invention relates to methods and systems for storing and querying database entries with neuromorphic computers. The system comprises a plurality of encoding subsystems that convert database entries and search keys into vector representations, a plurality of associative memory subsystems that match vector representations of search keys to vector representations of database entries using spike-based comparison operations, a plurality of binding subsystems that update retrieved vector representations during the execution of hierarchical queries, a plurality of unbinding subsystems that extract information from retrieved vector representations, a plurality of cleanup subsystems that remove noise from these retrieved representations, and one or more input search key representations that propagate spiking activity through the associative memory, binding, unbinding, cleanup, and readout subsystems to retrieve database entries matching the search key.
    Type: Application
    Filed: July 15, 2020
    Publication date: May 6, 2021
    Inventors: Aaron Russell Voelker, Christopher David Eliasmith, Peter Blouw
  • Publication number: 20210089912
    Abstract: Neural network architectures, with connection weights determined using Legendre Memory Unit equations, are trained while optionally keeping the determined weights fixed. Networks may use spiking or non-spiking activation functions, may be stacked or recurrently coupled with other neural network architectures, and may be implemented in software and hardware. Embodiments of the invention provide systems for pattern classification, data representation, and signal processing that compute using orthogonal polynomial basis functions that span sliding windows of time.
    Type: Application
    Filed: March 6, 2020
    Publication date: March 25, 2021
    Inventors: Aaron Russell Voelker, Christopher David Eliasmith
  • Publication number: 20200302281
    Abstract: The present invention relates to methods and systems for encoding and processing representations that include continuous structures using vector-symbolic representations. The system comprises a plurality of binding subsystems that implement a fractional binding operation, a plurality of unbinding subsystems that implement a fractional unbinding operation, and at least one input symbol representation that propagates activity through a binding subsystem and an unbinding subsystem to produce a high-dimensional vector representation of a continuous space.
    Type: Application
    Filed: March 18, 2020
    Publication date: September 24, 2020
    Inventors: Aaron Russell Voelker, Christopher David Eliasmith, Brent Komer, Terrence Stewart
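Fractional binding can be sketched in NumPy by raising unitary base vectors to real-valued exponents in the Fourier domain; binding one base vector per coordinate yields a single high-dimensional vector representing a continuous point, and fractional unbinding recovers a coordinate by similarity. The dimensionality, the 2-D example, and the similarity-scan decoder are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
D = 512  # dimensionality of the representation (assumption)

def make_unitary(d, rng):
    # Real vector whose Fourier coefficients all have unit magnitude,
    # so real-valued powers behave like continuous "positions".
    n = d // 2 + 1
    phases = rng.uniform(-np.pi, np.pi, size=n)
    phases[0] = 0.0   # DC term must be real
    phases[-1] = 0.0  # Nyquist term must be real (d is even)
    return np.fft.irfft(np.exp(1j * phases), n=d)

def power(v, p):
    # Fractional (un)binding: raise the base vector to a real exponent.
    return np.fft.irfft(np.fft.rfft(v) ** p, n=len(v))

def bind(a, b):
    # Circular convolution binding.
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=len(a))

X, Y = make_unitary(D, rng), make_unitary(D, rng)

# Encode the continuous point (x, y) = (1.3, -0.7) as one vector.
point = bind(power(X, 1.3), power(Y, -0.7))

# Decode x: fractionally unbind the y coordinate, then scan candidate
# x values by similarity against powers of the X base vector.
probe = bind(point, power(Y, 0.7))
xs = np.arange(-2.0, 2.01, 0.1)
sims = np.array([probe @ power(X, x) for x in xs])
x_hat = xs[int(np.argmax(sims))]
```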
  • Publication number: 20180053090
    Abstract: A method is described for designing systems that provide efficient implementations of feed-forward, recurrent, and deep networks that process dynamic signals using temporal filters and static or time-varying nonlinearities. A system design methodology is described that provides an engineered architecture. This architecture defines a core set of network components and operations for efficient computation of dynamic signals using temporal filters and static or time-varying nonlinearities. These methods apply to a wide variety of connected nonlinearities that include temporal filters in the connections. Here we apply the methods to synaptic models coupled with spiking and/or non-spiking neurons whose connection parameters are determined using a variety of methods of optimization.
    Type: Application
    Filed: August 22, 2016
    Publication date: February 22, 2018
    Inventors: Aaron Russell Voelker, Christopher David Eliasmith