Patents by Inventor Vipin Tiwari

Vipin Tiwari has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230229888
    Abstract: Numerous examples of summing circuits for a neural network are disclosed. In one example, a circuit for summing current received from a plurality of synapses in a neural network comprises a voltage source; a load coupled between the voltage source and an output node; a voltage clamp coupled to the output node for maintaining a voltage at the output node; and a plurality of synapses coupled between the output node and ground; wherein an output current flows through the output node, the output current equal to a sum of currents drawn by the plurality of synapses.
    Type: Application
    Filed: March 20, 2023
    Publication date: July 20, 2023
    Inventors: Farnood Merrikh Bayat, Xinjie Guo, Dmitri Strukov, Nhan Do, Hieu Van Tran, Vipin Tiwari, Mark Reiten
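    Sketch: A minimal Python model of the summing behavior this abstract describes; with the output node held at a fixed voltage by the clamp, the output current is simply the sum of the currents drawn by the synapses. Function names and values are illustrative, not taken from the patent.
      def summed_output_current(synapse_currents):
          """Total current pulled through the clamped output node, in amps."""
          return sum(synapse_currents)

      # Example: four synapses drawing cell currents in the microamp range.
      print(summed_output_current([1.2e-6, 0.4e-6, 2.0e-6, 0.0]))  # roughly 3.6e-06 A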
  • Publication number: 20230229887
    Abstract: Numerous examples are disclosed for an output block coupled to a non-volatile memory array in a neural network and associated methods. In one example, a circuit for converting a current in a neural network into an output voltage comprises a non-volatile memory cell comprising a word line terminal, a bit line terminal, and a source line terminal, wherein the bit line terminal receives the current; and a switch for selectively coupling the word line terminal to the bit line terminal; wherein when the switch is closed, the current flows into the non-volatile memory cell and the output voltage is provided on the bit line terminal.
    Type: Application
    Filed: March 20, 2023
    Publication date: July 20, 2023
    Inventors: Farnood Merrikh Bayat, Xinjie Guo, Dmitri Strukov, Nhan Do, Hieu Van Tran, Vipin Tiwari, Mark Reiten
  • Publication number: 20230223077
    Abstract: A neural network device with synapses having memory cells each having a floating gate and a first gate over first and second portions of a channel region disposed between source and drain regions, and a second gate over the floating gate or the source region. First lines each electrically connect the first gates in one of the memory cell rows, second lines each electrically connect the second gates in one of the memory cell rows, third lines each electrically connect the source regions in one of the memory cell columns, and fourth lines each electrically connect the drain regions in one of the memory cell columns. The synapses receive a first plurality of inputs as electrical voltages on the first or second lines, and provide a first plurality of outputs as electrical currents on the third or fourth lines.
    Type: Application
    Filed: March 21, 2023
    Publication date: July 13, 2023
    Inventors: Hieu Van Tran, Steven Lemke, Vipin Tiwari, Nhan Do, Mark Reiten
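    Sketch: A behavioral Python model of the array described above, under the usual idealization that each memory cell acts as a programmable conductance: row lines carry input voltages, and each column line collects the summed cell currents. The conductance and voltage values are illustrative assumptions.
      def column_output_currents(conductances, row_voltages):
          """conductances: rows of per-column cell conductances (siemens); returns per-column currents."""
          n_cols = len(conductances[0])
          outputs = [0.0] * n_cols
          for v, row in zip(row_voltages, conductances):
              for j, g in enumerate(row):
                  outputs[j] += g * v   # each cell contributes G * V to its column line
          return outputs

      g = [[1e-6, 2e-6], [3e-6, 0.5e-6]]      # 2 rows x 2 columns of cell conductances
      v_in = [0.8, 0.3]                        # input voltages on the row lines
      print(column_output_currents(g, v_in))   # roughly [1.7e-06, 1.75e-06] A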
  • Publication number: 20230206026
    Abstract: Numerous examples are disclosed for verifying a weight programmed into a selected non-volatile memory cell in a neural memory. In one example, a circuit comprises a digital-to-analog converter to convert a target weight comprising digital bits into a target voltage, a current-to-voltage converter to convert an output current from the selected non-volatile memory cell during a verify operation into an output voltage, and a comparator to compare the output voltage to the target voltage during a verify operation.
    Type: Application
    Filed: March 10, 2023
    Publication date: June 29, 2023
    Inventors: Farnood Merrikh Bayat, Xinjie Guo, Dmitri Strukov, Nhan Do, Hieu Van Tran, Vipin Tiwari, Mark Reiten
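    Sketch: A hedged Python model of the verify flow in this abstract: a DAC turns the digital target weight into a target voltage, a current-to-voltage stage converts the cell's output current, and a comparator checks the two. The full-scale voltage, bit width, and transresistance gain are illustrative assumptions, not values from the patent.
      def dac(target_bits, v_full_scale=1.0):
          """Convert a digital target weight (bits, MSB first) into a target voltage."""
          code = int("".join(str(b) for b in target_bits), 2)
          return v_full_scale * code / (2 ** len(target_bits) - 1)

      def current_to_voltage(i_cell, r_trans=100e3):
          """Model the current-to-voltage converter as a simple transresistance stage."""
          return i_cell * r_trans

      def verify(target_bits, i_cell):
          """Comparator: does the converted cell voltage reach the DAC target voltage?"""
          return current_to_voltage(i_cell) >= dac(target_bits)

      print(verify([1, 0, 1, 1], 7.0e-6))  # False: 7 uA * 100 kOhm = 0.7 V, below the ~0.733 V target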
  • Patent number: 11682459
    Abstract: Two or more physical memory cells are grouped together to form a logical cell that stores one of N possible levels. Within each logical cell, the memory cells can be programmed using different mechanisms. For example, one or more of the memory cells in a logical cell can be programmed using a coarse programming mechanism, one or more of the memory cells can be programmed using a fine mechanism, and one or more of the memory cells can be programmed using a tuning mechanism. This achieves extreme programming accuracy and programming speed.
    Type: Grant
    Filed: October 28, 2020
    Date of Patent: June 20, 2023
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Stanley Hong, Stephen Trinh, Thuan Vu, Steven Lemke, Vipin Tiwari, Nhan Do
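    Sketch: A conceptual Python split of one target level across coarse, fine, and tuning member cells of a logical cell, as the abstract describes. The step sizes are illustrative assumptions only.
      def program_logical_cell(target, coarse_step=1.0, fine_step=0.1, tune_step=0.01):
          """Split one of N target levels across coarse / fine / tuning member cells."""
          coarse = int(target // coarse_step) * coarse_step
          fine = int((target - coarse) // fine_step) * fine_step
          tune = round(target - coarse - fine, 10)   # small residue handled by the tuning cell
          return {"coarse_cell": coarse, "fine_cell": fine, "tuning_cell": tune}

      print(program_logical_cell(3.47))  # coarse 3.0, fine 0.4, tuning ~0.07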
  • Patent number: 11683933
    Abstract: Numerous embodiments for reading a value stored in a selected memory cell in a vector-by-matrix multiplication (VMM) array in an artificial neural network are disclosed. In one embodiment, an input comprises a set of input bits that result in a series of input pulses applied to a terminal of the selected memory cell, further resulting in a series of output signals that are summed to determine the value stored in the selected memory cell. In another embodiment, an input comprises a set of input bits, where each input bit results in a single pulse or no pulse being applied to a terminal of the selected memory cell, further resulting in a series of output signals which are then weighted according to the binary bit location of the input bit, and where the weighted signals are then summed to determine the value stored in the selected memory cell.
    Type: Grant
    Filed: December 14, 2020
    Date of Patent: June 20, 2023
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Steven Lemke, Vipin Tiwari, Nhan Do, Mark Reiten
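    Sketch: A minimal Python model of the second embodiment above: each input bit either applies a single pulse or no pulse, and the per-bit outputs are weighted by the binary position of the bit before summing. The per-pulse cell output is an illustrative stand-in for real cell behavior.
      def read_cell_bitwise(input_bits, cell_current):
          """input_bits: MSB-first list of 0/1; cell_current: output per unit pulse."""
          total = 0.0
          n = len(input_bits)
          for position, bit in enumerate(input_bits):
              pulse_output = cell_current if bit else 0.0        # single pulse or no pulse
              total += pulse_output * 2 ** (n - 1 - position)    # weight by binary bit location
          return total

      print(read_cell_bitwise([1, 0, 1], 2e-6))  # about 1e-05: 2 uA * (4 + 0 + 1)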
  • Publication number: 20230178147
    Abstract: Numerous embodiments of analog neural memory arrays are disclosed. In one embodiment, a system comprises a first array of non-volatile memory cells, wherein the cells are arranged in rows and columns and the non-volatile memory cells in one or more of the columns store W+ values, and wherein one of the columns in the first array is a dummy column; and a second array of non-volatile memory cells, wherein the cells are arranged in rows and columns and the non-volatile memory cells in one or more of the columns store W− values, and wherein one of the columns in the second array is a dummy column; wherein pairs of cells from the first array and the second array store a differential weight, W, according to the formula W=(W+)−(W−).
    Type: Application
    Filed: January 30, 2023
    Publication date: June 8, 2023
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Vipin Tiwari
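    Sketch: A one-line Python decode of the differential weight scheme stated in the abstract, W = (W+) − (W−). The cell values below are illustrative read-out magnitudes, not real device data.
      def effective_weight(w_plus, w_minus):
          """Decode the signed weight stored across a W+ / W− cell pair."""
          return w_plus - w_minus

      print(effective_weight(3.2e-6, 1.7e-6))   # positive weight, about  1.5e-06
      print(effective_weight(0.5e-6, 2.0e-6))   # negative weight, about -1.5e-06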
  • Patent number: 11646075
    Abstract: A neural network device with synapses having memory cells each having a floating gate and a first gate over first and second portions of a channel region disposed between source and drain regions, and a second gate over the floating gate or the source region. First lines each electrically connect the first gates in one of the memory cell rows, second lines each electrically connect the second gates in one of the memory cell rows, third lines each electrically connect the source regions in one of the memory cell columns, and fourth lines each electrically connect the drain regions in one of the memory cell columns. The synapses receive a first plurality of inputs as electrical voltages on the first or second lines, and provide a first plurality of outputs as electrical currents on the third or fourth lines.
    Type: Grant
    Filed: September 9, 2021
    Date of Patent: May 9, 2023
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Hieu Van Tran, Steven Lemke, Vipin Tiwari, Nhan Do, Mark Reiten
  • Patent number: 11636322
    Abstract: Numerous embodiments of a precision programming algorithm and apparatus are disclosed for precisely and quickly depositing the correct amount of charge on the floating gate of a non-volatile memory cell within a vector-by-matrix multiplication (VMM) array in an artificial neural network. Selected cells thereby can be programmed with extreme precision to hold one of N different values.
    Type: Grant
    Filed: March 25, 2020
    Date of Patent: April 25, 2023
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Steven Lemke, Vipin Tiwari, Nhan Do, Mark Reiten
  • Patent number: 11626176
    Abstract: The present embodiments relate to systems and methods for implementing wear leveling in a flash memory device that emulates an EEPROM. The embodiments utilize an index array, which stores an index word for each logical address in the emulated EEPROM. The embodiments comprise a system and method for receiving an erase command and a logical address, the logical address corresponding to a sector of physical words of non-volatile memory cells in an array of non-volatile memory cells, the sector comprising a first physical word, a last physical word, and one or more physical words between the first physical word and the last physical word; when a current word, identified by an index bit, is the last physical word in the sector, erasing the sector; and when the current word is not the last physical word in the sector, changing a next index bit.
    Type: Grant
    Filed: January 7, 2022
    Date of Patent: April 11, 2023
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Guangming Lin, Xiaozhou Qian, Xiao Yan Pi, Vipin Tiwari, Zhenlin Ding
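    Sketch: A hedged Python model of the wear-leveling idea above: each logical address maps to a sector of physical words, an index tracks which physical word is currently in use, and an erase command only wipes the sector once the last physical word has been reached; otherwise it just advances the index. Data structures and names are illustrative assumptions.
      WORDS_PER_SECTOR = 8
      index_array = {}   # logical_address -> index of the physical word currently in use

      def erase(logical_address):
          current = index_array.get(logical_address, 0)
          if current == WORDS_PER_SECTOR - 1:
              index_array[logical_address] = 0          # last word reached: erase the whole sector
              return "sector erased"
          index_array[logical_address] = current + 1    # otherwise move on to the next physical word
          return "index advanced"

      for _ in range(9):
          print(erase(0x10))   # advances 7 times, erases the sector on the 8th call, then advances again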
  • Patent number: 11600321
    Abstract: Numerous embodiments of analog neural memory arrays are disclosed. In one embodiment, an analog neural memory system comprises an array of non-volatile memory cells, wherein the cells are arranged in rows and columns, the columns arranged in physically adjacent pairs of columns, wherein within each adjacent pair one column in the adjacent pair comprises cells storing W+ values and one column in the adjacent pair comprises cells storing W− values, wherein adjacent cells in the adjacent pair store a differential weight, W, according to the formula W=(W+)−(W−). In another embodiment, an analog neural memory system comprises a first array of non-volatile memory cells storing W+ values and a second array storing W− values.
    Type: Grant
    Filed: August 6, 2020
    Date of Patent: March 7, 2023
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Vipin Tiwari
  • Publication number: 20230031487
    Abstract: Numerous examples of an input function circuit block and an output neuron circuit block coupled to a vector-by-matrix multiplication (VMM) array in an artificial neural network are disclosed. In one example, an artificial neural network comprises a vector-by-matrix multiplication array comprising a plurality of non-volatile memory cells organized into rows and columns; an input function circuit block to receive digital input signals, convert the digital input signals into analog signals, and apply the analog signals to control gate terminals of non-volatile memory cells in one or more rows of the array during a programming operation; and an output neuron circuit block to receive analog currents from the columns of the array during a read operation and generate an output signal.
    Type: Application
    Filed: September 21, 2022
    Publication date: February 2, 2023
    Inventors: Hieu Van Tran, Steven Lemke, Vipin Tiwari, Nhan Do, Mark Reiten
  • Publication number: 20220405564
    Abstract: Testing circuitry and methods are disclosed for use with analog neural memory in deep learning artificial neural networks. In one example, a method comprises programming a plurality of analog neural non-volatile memory cells in an array of analog neural non-volatile memory cells to store one of N different values, where N is a number of different levels that can be stored in any of the analog neural non-volatile memory cells; measuring a current drawn by the plurality of analog neural non-volatile memory cells; comparing the measured current to a target value; and identifying the plurality of the analog neural non-volatile memory cells as bad if the difference between the measured current and the target value exceeds a threshold.
    Type: Application
    Filed: August 22, 2022
    Publication date: December 22, 2022
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Steven Lemke, Nha Nguyen, Vipin Tiwari, Nhan Do
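    Sketch: A short Python version of the screening test above: program a group of cells, measure the total current they draw, and flag the group as bad when the measurement strays too far from the target. The threshold value is an illustrative assumption.
      def group_is_bad(measured_current, target_current, threshold=0.5e-6):
          """Return True when the group of cells fails the current-draw test."""
          return abs(measured_current - target_current) > threshold

      print(group_is_bad(10.2e-6, 10.0e-6))  # False: within 0.5 uA of the target
      print(group_is_bad(11.0e-6, 10.0e-6))  # True: off by 1 uA, exceeds the threshold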
  • Patent number: 11532354
    Abstract: Numerous embodiments for performing tuning of a page or a word of non-volatile memory cells in an analog neural memory are disclosed. High voltage circuits used to generate high voltages applied to terminals of the non-volatile memory cells during the precision tuning process are also disclosed. Programming sequences for the application of the voltages to the terminals to minimize the occurrence of disturbances during tuning are also disclosed.
    Type: Grant
    Filed: September 17, 2020
    Date of Patent: December 20, 2022
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Steven Lemke, Vipin Tiwari, Nhan Do
  • Publication number: 20220398444
    Abstract: Testing circuitry and methods are disclosed for use with analog neural memory in deep learning artificial neural networks. In one example, a method comprises programming an analog neural non-volatile memory cell in an array to a target value representing one of N different values, where N is an integer; verifying that a value stored in the analog neural non-volatile memory cell is within an acceptable window of values around the target value; repeating the programming and verifying for each of the N values; and identifying the analog neural non-volatile memory cell as bad if any of the verifying indicates a value stored in the cell outside of the acceptable window of values around the target value.
    Type: Application
    Filed: August 22, 2022
    Publication date: December 15, 2022
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Steven Lemke, Nha Nguyen, Vipin Tiwari, Nhan Do
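    Sketch: A Python rendering of the per-level screening flow in this abstract: the cell is programmed to each of the N target values in turn, read back, and marked bad if any read falls outside an acceptance window around the target. The read-back models and window size below are trivial illustrative stand-ins.
      def cell_is_bad(read_back, targets, window=0.05):
          """read_back(target) -> measured value; targets: the N levels to check."""
          for target in targets:
              measured = read_back(target)
              if abs(measured - target) > window:
                  return True        # any out-of-window level condemns the cell
          return False

      levels = [i / 16 for i in range(16)]            # 16 evenly spaced target levels
      print(cell_is_bad(lambda t: t + 0.01, levels))  # False: small, uniform read error
      print(cell_is_bad(lambda t: t * 0.8, levels))   # True: error grows past the window at higher levels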
  • Publication number: 20220391682
    Abstract: Numerous embodiments are disclosed for compensating for differences in the slope of the current-voltage characteristic curve among reference transistors, reference memory cells, and flash memory cells during a read operation in an analog neural memory in a deep learning artificial neural network. In one embodiment, a method comprises receiving a first voltage, multiplying the first voltage by a coefficient to generate a second voltage, applying the first voltage to a gate of one of a reference transistor and a selected memory cell, applying the second voltage to a gate of the other of the reference transistor and the selected memory cell, and using the reference transistor in a sense operation to determine a value stored in the selected memory cell.
    Type: Application
    Filed: August 10, 2022
    Publication date: December 8, 2022
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do
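    Sketch: A behavioral Python model of the gate-voltage scaling described above: one device sees the raw voltage and the other sees that voltage scaled by a coefficient chosen so the two current-voltage slopes line up during sensing. The coefficient and the crude linear device models are illustrative assumptions.
      def sense(v_in, coefficient, g_reference, g_cell):
          """Compare cell current against reference current after scaling one gate voltage."""
          v_reference = v_in                        # first voltage drives the reference gate
          v_cell = v_in * coefficient               # second (scaled) voltage drives the cell gate
          i_reference = g_reference * v_reference   # simple linear I-V stand-ins
          i_cell = g_cell * v_cell
          return "cell reads 1" if i_cell > i_reference else "cell reads 0"

      print(sense(0.6, coefficient=1.25, g_reference=2e-6, g_cell=2e-6))  # cell reads 1: scaled gate wins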
  • Patent number: 11521683
    Abstract: Numerous embodiments are disclosed for a high voltage generation algorithm and system for generating high voltages necessary for a particular programming operation in analog neural memory used in a deep learning artificial neural network. Compensation measures can be utilized that compensate for changes in voltage or current as the number of cells being programmed changes.
    Type: Grant
    Filed: March 3, 2021
    Date of Patent: December 6, 2022
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Anh Ly, Vipin Tiwari, Nhan Do
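    Sketch: A rough Python illustration of the compensation idea above: the high voltage generated for a program operation is adjusted as the number of cells being programmed changes, so each cell still sees its intended programming condition. The base voltage and per-cell adjustment are illustrative assumptions, not figures from the patent.
      def program_high_voltage(num_cells, v_base=10.0, droop_per_cell=0.005):
          """Raise the generated HV to offset loading from the cells being programmed."""
          return v_base + droop_per_cell * num_cells

      print(program_high_voltage(1))    # about 10.005 V for a single cell
      print(program_high_voltage(128))  # about 10.64 V when 128 cells load the supply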
  • Patent number: 11521682
    Abstract: Numerous embodiments are disclosed for providing temperature compensation in an analog memory array. A method and related system are disclosed for compensating for temperature changes in an array of memory cells by measuring an operating temperature within the array of memory cells and changing a threshold voltage of a selected memory cell in the array of memory cells to compensate for a change in the operating temperature.
    Type: Grant
    Filed: November 11, 2020
    Date of Patent: December 6, 2022
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Steven Lemke, Nhan Do, Vipin Tiwari, Mark Reiten
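    Sketch: A small Python example of the temperature-compensation idea above: measure the array operating temperature and shift the selected cell's threshold voltage to cancel the expected drift. The reference temperature and drift coefficient are assumed values for illustration only.
      def compensated_threshold(vt_nominal, temp_c, temp_ref_c=25.0, tc_volts_per_c=-2e-3):
          """Adjust the cell threshold voltage for the measured operating temperature."""
          drift = tc_volts_per_c * (temp_c - temp_ref_c)   # expected Vt drift with temperature
          return vt_nominal - drift                        # shift Vt to cancel the drift

      print(compensated_threshold(1.0, temp_c=85.0))  # about 1.12 V: raises Vt by ~120 mV at 85 C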
  • Publication number: 20220383087
    Abstract: Numerous embodiments are disclosed for compensating for differences in the slope of the current-voltage characteristic curve among reference transistors, reference memory cells, and flash memory cells during a read operation in an analog neural memory in a deep learning artificial neural network. In one embodiment, a method comprises receiving an input voltage, multiplying the input voltage by a coefficient to generate an output voltage, applying the output voltage to a gate of a selected memory cell, performing a sense operation using the selected memory cell and a reference device to determine a value stored in the selected memory cell, wherein a slope of a current-voltage characteristic curve of the reference device and a slope of the current-voltage characteristic curve of the selected memory cell are approximately equal during the sense operation.
    Type: Application
    Filed: August 10, 2022
    Publication date: December 1, 2022
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do
  • Publication number: 20220383086
    Abstract: Numerous examples of a precision tuning algorithm and apparatus are disclosed for precisely and quickly depositing the correct amount of charge on the floating gate of a non-volatile memory cell within a vector-by-matrix multiplication (VMM) array in an artificial neural network. In one example, a method for performing a read or verify operation in a vector-by-matrix multiplication system comprising an input function circuit, a memory array, and an output circuit block is disclosed, the method comprising receiving, by the input function circuit, digital bit input values; converting the digital input values into an input signal; applying the input signal to control gate terminals of selected cells in the memory array; and generating, by the output circuit block, an output value in response to currents received from the memory array.
    Type: Application
    Filed: July 27, 2022
    Publication date: December 1, 2022
    Inventors: Hieu Van Tran, Steven Lemke, Vipin Tiwari, Nhan Do, Mark Reiten
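    Sketch: An end-to-end behavioral Python model of the read/verify flow in this last abstract: digital input bits are converted to an analog input signal, the signal is applied to the selected cells, and the output block turns the resulting column current into an output value. All scaling factors, conductances, and names are illustrative assumptions.
      def input_block(bits, v_full_scale=1.0):
          """Digital input bits (MSB first) -> analog control-gate signal."""
          code = int("".join(str(b) for b in bits), 2)
          return v_full_scale * code / (2 ** len(bits) - 1)

      def memory_column(v_gate, cell_conductances):
          """Apply the input signal to the selected cells; return the summed column current."""
          return sum(g * v_gate for g in cell_conductances)

      def output_block(i_column, lsb_current=1e-7):
          """Quantize the column current back into a digital output value."""
          return round(i_column / lsb_current)

      v = input_block([1, 1, 0, 0])                 # 12/15 of full scale = 0.8 V
      i = memory_column(v, [2e-6, 1e-6, 0.5e-6])    # 3.5 uS * 0.8 V = 2.8 uA
      print(output_block(i))                        # 28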