Patents Assigned to Green Mountain Semiconductor Inc.
  • Patent number: 11586895
    Abstract: Techniques for manipulation of a recursive neural network using random access memory are disclosed. Neural network descriptor information and weight matrices are stored in a semiconductor random-access memory device that includes neural network processing logic. The network descriptor information and weight matrices comprise a trained neural network functionality. An input matrix is obtained for processing on the memory device. The trained neural network functionality is executed on the input matrix, which includes processing data for a first layer from the neural network descriptor information to set up the processing logic; manipulating the input matrix using the processing logic and at least one of the weight matrices; caching the results of the manipulating in a storage location of the memory device; and recursively processing the cached results through the processing logic. Additional data for additional layers is processed until the neural network functionality is complete. (A brief functional sketch of this layer-by-layer flow follows this entry.)
    Type: Grant
    Filed: June 16, 2020
    Date of Patent: February 21, 2023
    Assignee: Green Mountain Semiconductor, Inc.
    Inventors: Bret Dale, David T. Kinney
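The following is a minimal, hedged software sketch of the layer-by-layer flow the abstract describes: descriptor information and weight matrices held in a single memory object, with each layer's result cached in that object and recursively fed into the next layer. The class and function names (MemoryDevice, run_network) and the activation choices are illustrative assumptions, not taken from the patent, and the sketch models the data flow only, not the in-memory processing logic itself.

```python
import numpy as np

class MemoryDevice:
    """Stand-in for a RAM device that stores the network descriptor, the weights,
    and a cache location for intermediate results (all in one place, as the
    abstract describes)."""
    def __init__(self, descriptor, weights):
        self.descriptor = descriptor   # per-layer setup info (here: activation name)
        self.weights = weights         # one weight matrix per layer
        self.cache = None              # storage location for cached layer results

def run_network(mem, input_matrix):
    """Execute the stored network on input_matrix, layer by layer."""
    data = input_matrix
    for layer_index, layer_cfg in enumerate(mem.descriptor):
        # Process descriptor data for this layer to "set up the processing logic".
        if layer_cfg["activation"] == "tanh":
            activation = np.tanh
        else:
            activation = lambda x: np.maximum(x, 0.0)   # ReLU
        # Manipulate the current matrix using this layer's weight matrix.
        result = activation(data @ mem.weights[layer_index])
        # Cache the result in the memory device...
        mem.cache = result
        # ...and recursively process the cached result as the next layer's input.
        data = mem.cache
    return data

# Tiny usage example: a hypothetical 3-layer network with random weights.
rng = np.random.default_rng(0)
descriptor = [{"activation": "tanh"}, {"activation": "tanh"}, {"activation": "relu"}]
weights = [rng.standard_normal((4, 8)),
           rng.standard_normal((8, 8)),
           rng.standard_normal((8, 2))]
memory = MemoryDevice(descriptor, weights)
print(run_network(memory, rng.standard_normal((1, 4))))
```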
  • Patent number: 11048583
    Abstract: A novel architecture provides many of the advantages of the array and datapath architecture of DRAM products that do not utilize ECC (error correction code) functionality, while simultaneously allowing flexible deployment of ECC as needed. Aspects of the disclosure minimize the write and read latency typically introduced by ECC. Sharing of circuit components between neighboring memory regions is also introduced, which reduces circuit area and reduces loading on speed-critical data bus wiring, improving overall performance. A very fast single-error-correct (SEC) and double-error-detect (DED) read-out for real-time system-level awareness is also provided. (A brief software sketch of SEC-DED behavior follows this entry.)
    Type: Grant
    Filed: September 12, 2016
    Date of Patent: June 29, 2021
    Assignee: Green Mountain Semiconductor Inc.
    Inventors: Wolfgang Hokenmaier, Ryan A. Jurasek, Donald W. Labrecque
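The abstract does not name a specific code, but the SEC/DED behavior it describes is commonly implemented with an extended Hamming code. The sketch below is a software model of Hamming(8,4) SEC-DED, offered only to illustrate the correct-one/detect-two read-out behavior; the bit layout and function names are illustrative assumptions, not the patent's circuit.

```python
def hamming84_encode(data_bits):
    """Encode 4 data bits into an 8-bit SEC-DED codeword:
    [p0, p1, p2, d0, p3, d1, d2, d3], where p0 is the overall parity bit."""
    d0, d1, d2, d3 = data_bits
    p1 = d0 ^ d1 ^ d3                     # covers positions 1, 3, 5, 7
    p2 = d0 ^ d2 ^ d3                     # covers positions 2, 3, 6, 7
    p3 = d1 ^ d2 ^ d3                     # covers positions 4, 5, 6, 7
    word = [p1, p2, d0, p3, d1, d2, d3]   # Hamming(7,4) positions 1..7
    p0 = 0
    for bit in word:
        p0 ^= bit                         # overall parity for double-error detection
    return [p0] + word

def hamming84_decode(codeword):
    """Return (data_bits, status); status is 'ok', 'corrected', or 'double_error'."""
    p0, word = codeword[0], list(codeword[1:])
    s1 = word[0] ^ word[2] ^ word[4] ^ word[6]    # check positions 1, 3, 5, 7
    s2 = word[1] ^ word[2] ^ word[5] ^ word[6]    # check positions 2, 3, 6, 7
    s3 = word[3] ^ word[4] ^ word[5] ^ word[6]    # check positions 4, 5, 6, 7
    syndrome = s1 + (s2 << 1) + (s3 << 2)         # failing position 1..7, or 0
    overall = p0
    for bit in word:
        overall ^= bit                            # 0 if overall parity still holds
    if syndrome == 0 and overall == 0:
        status = "ok"
    elif overall == 1:
        if syndrome:                              # single error inside the Hamming word
            word[syndrome - 1] ^= 1
        status = "corrected"                      # (syndrome 0 means p0 itself flipped)
    else:
        status = "double_error"                   # detected, not correctable
    return [word[2], word[4], word[5], word[6]], status

# Usage: a single flipped bit is corrected; two flipped bits are flagged.
cw = hamming84_encode([1, 0, 1, 1])
cw[5] ^= 1
print(hamming84_decode(cw))                       # -> ([1, 0, 1, 1], 'corrected')
cw[3] ^= 1
print(hamming84_decode(cw))                       # -> (..., 'double_error')
```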
  • Patent number: 10818344
    Abstract: Techniques are disclosed for artificial neural network functionality within dynamic random-access memory. A plurality of dynamic random-access cells is accessed within a memory block. Data within the plurality of dynamic random-access cells is sensed using a plurality of sense amplifiers associated with the plurality of dynamic random-access cells. A plurality of select lines coupled to the plurality of sense amplifiers is activated to facilitate the sensing of the data within the plurality of dynamic random-access cells, wherein the activating is a function of inputs to a layer within a neural network, and wherein a bit within the plurality of dynamic random-access cells is sensed by a first sense amplifier and a second sense amplifier within the plurality of sense amplifiers. Resulting data is provided based on the activating, wherein the resulting data is a function of weights within the neural network. (A brief functional sketch of this input-gated sensing follows this entry.)
    Type: Grant
    Filed: July 22, 2019
    Date of Patent: October 27, 2020
    Assignee: Green Mountain Semiconductor, Inc.
    Inventors: Wolfgang Hokenmaier, Jacob Bucci, Ryan Jurasek
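Below is a hedged functional model of the behavior described, not the circuit itself: weight bits stored in an array, binary layer inputs deciding which select lines fire (here, which rows are read), and the resulting data formed as a thresholded accumulation of the selected weight bits. The function name, the popcount-style accumulation, and the threshold are illustrative assumptions.

```python
import numpy as np

def in_memory_layer(weight_bits, input_bits, threshold):
    """weight_bits: (n_inputs, n_outputs) array of 0/1 weights held in the 'memory block'.
    input_bits:  length-n_inputs 0/1 vector that drives the select-line activation.
    Returns a 0/1 output vector from the thresholded accumulation of selected rows."""
    # Activate only the rows whose inputs are 1 ("activating as a function of inputs").
    selected = weight_bits[input_bits.astype(bool)]
    # The resulting data is a function of the stored weights: accumulate and threshold.
    accumulated = selected.sum(axis=0)
    return (accumulated >= threshold).astype(np.uint8)

# Usage: 16 inputs feeding 4 outputs, with random binary weights and inputs.
rng = np.random.default_rng(1)
weights = rng.integers(0, 2, size=(16, 4), dtype=np.uint8)   # stored in DRAM cells
inputs = rng.integers(0, 2, size=16, dtype=np.uint8)         # inputs to the layer
print(in_memory_layer(weights, inputs, threshold=4))
```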
  • Patent number: 10360971
    Abstract: Techniques are disclosed for artificial neural network functionality within dynamic random-access memory. A plurality of dynamic random-access cells is accessed within a memory block. Data within the plurality of dynamic random-access cells is sensed using a plurality of sense amplifiers associated with the plurality of dynamic random-access cells. A plurality of select lines coupled to the plurality of sense amplifiers is activated to facilitate the sensing of the data within the plurality of dynamic random-access cells, wherein the activating is a function of inputs to a layer within a neural network, and wherein a bit within the plurality of dynamic random-access cells is sensed by a first sense amplifier and a second sense amplifier within the plurality of sense amplifiers. Resulting data is provided based on the activating, wherein the resulting data is a function of weights within the neural network. (The abstract is identical to that of the related grant above; a short complementary sketch follows this entry.)
    Type: Grant
    Filed: April 24, 2018
    Date of Patent: July 23, 2019
    Assignee: Green Mountain Semiconductor, Inc.
    Inventors: Wolfgang Hokenmaier, Jacob Bucci, Ryan Jurasek
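This grant shares its abstract with the related grant above, so the sketch there already covers the input-gated accumulation. The short fragment below illustrates only the remaining detail, a single stored bit being sensed by two sense amplifiers at once, read here as the same weight bit feeding two independent partial sums in one access. This is one possible software reading, with illustrative names only.

```python
stored_weight_bits = [1, 0, 1, 1, 0, 1, 0, 1]   # one column of weight bits in the array

partial_sum_a = 0   # accumulator behind the "first" sense amplifier
partial_sum_b = 0   # accumulator behind the "second" sense amplifier

for bit in stored_weight_bits:
    sensed_by_first = bit    # both amplifiers resolve the same stored cell value
    sensed_by_second = bit
    partial_sum_a += sensed_by_first
    partial_sum_b += sensed_by_second   # one bit contributes to two results per access

print(partial_sum_a, partial_sum_b)     # -> 5 5
```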
  • Patent number: 10002658
    Abstract: A highly configurable, extremely dense, high-speed, and low-power artificial neural network is presented. The architecture may utilize DRAM cells, chosen for their density and high endurance, to store weight and bias values. A number of primary sense amplifiers, along with column select lines (CSLs), local data lines (LDLs), and sense circuitry, may together form a single neuron. Since the data in the primary sense amplifiers can be updated with a new row activation, the same hardware can be reused for many different neurons. The result is a large number of neurons that can be connected by the user. Training can be done in hardware by actively varying weights and monitoring the cost. The network can be run and trained at high speed because the processing and data transfer that must be performed are minimized. (A brief sketch of this vary-and-monitor training loop follows this entry.)
    Type: Grant
    Filed: November 2, 2016
    Date of Patent: June 19, 2018
    Assignee: Green Mountain Semiconductor Inc.
    Inventors: Wolfgang Hokenmaier, Ryan A. Jurasek, Donald W. Labrecque
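The "vary weights and monitor cost" training the abstract mentions can be pictured as a random-perturbation (hill-climbing) loop. The sketch below is a software stand-in under that assumption, using a single sigmoid neuron, a toy dataset, and an illustrative step size; none of these specifics come from the patent.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((32, 8))            # toy training inputs
y = (X.sum(axis=1) > 0).astype(float)       # toy targets

def forward(W, b, X):
    """One sigmoid 'neuron' standing in for the in-memory compute."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def cost(W, b):
    """Mean squared error monitored during training."""
    return np.mean((forward(W, b, X).ravel() - y) ** 2)

W = rng.standard_normal((8, 1)) * 0.1
b = np.zeros(1)
best = cost(W, b)
for step in range(2000):
    dW = rng.standard_normal(W.shape) * 0.05     # actively vary the weights...
    db = rng.standard_normal(b.shape) * 0.05
    trial = cost(W + dW, b + db)
    if trial < best:                             # ...keep only changes that lower the cost
        W, b, best = W + dW, b + db, trial
print(f"final cost: {best:.4f}")
```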
  • Patent number: 9899087
    Abstract: An extremely dense, high-speed, and low-power content-addressable DRAM is presented. To enable parallel searching, a data word to be searched may be driven onto column select lines (CSLs) of a DRAM array. Although two or more primary sense amplifiers are typically not connected to the same local data line at the same time during operation of a DRAM, in various embodiments presented herein some or all sense amplifiers in a DRAM can be activated simultaneously to enable maximum parallelism, with local data line sharing explicitly allowed. Using this architecture, a data word can be searched simultaneously in all banks and across multiple wordlines. Since no input/output transactions are required and no data needs to be driven from the bank during execution of a search, overall current, and thus power usage, can be reduced. (A brief functional sketch of this parallel search follows this entry.)
    Type: Grant
    Filed: November 7, 2016
    Date of Patent: February 20, 2018
    Assignee: Green Mountain Semiconductor Inc.
    Inventors: Wolfgang Hokenmaier, Ryan A. Jurasek, Donald W. Labrecque, Aaron D. Willey
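The sketch below is a purely functional stand-in for the in-array search: comparing the search word against every stored word in every bank at once mimics the effect of driving the word onto the column select lines with all sense amplifiers activated. Array sizes and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
# 4 banks, 1024 stored words per bank, 16 bits per word.
banks = rng.integers(0, 2, size=(4, 1024, 16), dtype=np.uint8)

def cam_search(banks, word):
    """Return (bank, row) indices of every stored word that matches `word`."""
    matches = np.all(banks == word, axis=2)   # compare against all banks/rows in parallel
    return np.argwhere(matches)               # no per-row data read-out is modeled

# Usage: plant a known word and search for it across all banks simultaneously.
needle = banks[2, 137].copy()
print(cam_search(banks, needle))              # includes [2, 137] (plus any chance duplicates)
```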