Patents by Inventor Hieu Van Tran

Hieu Van Tran has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190341118
    Abstract: Numerous embodiments of a data refresh method and apparatus for use with a vector-by-matrix multiplication (VMM) array in an artificial neural network are disclosed. Various embodiments of a data drift detector suitable for detecting data drift in flash memory cells within the VMM array are disclosed.
    Type: Application
    Filed: May 16, 2019
    Publication date: November 7, 2019
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do
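
The refresh scheme in publication 20190341118 above pairs drift detection with re-programming of the affected cells. The Python sketch below only illustrates that general loop under assumed names and an assumed drift tolerance; it is not the patent's circuit-level detector.

```python
# Minimal sketch, assuming cells' stored analog values can be read back and
# compared against their originally programmed targets. The list-based "array",
# the tolerance, and all names are hypothetical illustration only.

DRIFT_TOLERANCE = 0.05  # allowed deviation before a refresh is triggered (assumed)

def detect_drift(stored_values, target_values, tol=DRIFT_TOLERANCE):
    """Return indices of cells whose stored value drifted beyond tol."""
    return [i for i, (s, t) in enumerate(zip(stored_values, target_values))
            if abs(s - t) > tol]

def refresh(stored_values, target_values, drifted):
    """Re-program drifted cells back to their original target values."""
    for i in drifted:
        stored_values[i] = target_values[i]
    return stored_values

# Example: two of four cells have drifted and are refreshed.
targets = [0.20, 0.40, 0.60, 0.80]
stored  = [0.21, 0.33, 0.60, 0.88]
drifted = detect_drift(stored, targets)   # -> [1, 3]
print(refresh(stored, targets, drifted))  # -> [0.21, 0.4, 0.6, 0.8]
```
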
  • Publication number: 20190341110
    Abstract: Numerous embodiments are disclosed for a high voltage generation algorithm and system for generating high voltages necessary for a particular programming operation in analog neural memory used in a deep learning artificial neural network. Different calibration algorithms and systems are also disclosed. Optionally, compensation measures can be utilized that compensate for changes in voltage or current as the number of cells being programmed changes.
    Type: Application
    Filed: July 23, 2018
    Publication date: November 7, 2019
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Anh Ly, Vipin Tiwari, Nhan Do
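
Publication 20190341110 above mentions compensating the generated high voltage as the number of cells being programmed changes. A minimal sketch of that idea follows, assuming a simple linear droop model; the per-cell current and source-resistance figures are made-up assumptions, not values from the disclosure.

```python
# Minimal sketch: as more cells are programmed in parallel, the high-voltage
# supply sags under the extra load, so the requested voltage is raised to keep
# the voltage seen at the cells roughly constant. All constants are assumed.

NOMINAL_HV = 10.0         # volts required at the cells (assumed)
CURRENT_PER_CELL = 1e-6   # amps drawn by each cell being programmed (assumed)
SOURCE_RESISTANCE = 50e3  # ohms of effective droop in the HV path (assumed)

def compensated_hv(num_cells):
    """Raise the generated voltage to offset droop proportional to the load."""
    droop = num_cells * CURRENT_PER_CELL * SOURCE_RESISTANCE
    return NOMINAL_HV + droop

for n in (1, 8, 64):
    print(n, "cells ->", round(compensated_hv(n), 2), "V requested")
# 1 cells -> 10.05, 8 cells -> 10.4, 64 cells -> 13.2
```
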
  • Patent number: 10460811
    Abstract: A memory device and method of erasing same that includes a substrate of semiconductor material and a plurality of memory cells formed on the substrate and arranged in an array of rows and columns. Each of the memory cells includes spaced apart source and drain regions in the substrate, with a channel region in the substrate extending there between, a floating gate disposed over and insulated from a first portion of the channel region which is adjacent the source region, a select gate disposed over and insulated from a second portion of the channel region which is adjacent the drain region, and a program-erase gate disposed over and insulated from the source region. The program-erase gate lines alone or in combination with the select gate lines, or the source lines, are arranged in the column direction so that each memory cell can be individually programmed, read and erased.
    Type: Grant
    Filed: April 17, 2019
    Date of Patent: October 29, 2019
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do
  • Patent number: 10460810
    Abstract: An improved method and apparatus for programming advanced nanometer flash memory cells is disclosed.
    Type: Grant
    Filed: August 25, 2017
    Date of Patent: October 29, 2019
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Hieu Van Tran, Anh Ly, Thuan Vu, Hung Quoc Nguyen
  • Patent number: 10446246
    Abstract: Numerous embodiments of a data refresh method and apparatus for use with a vector-by-matrix multiplication (VMM) array in an artificial neural network are disclosed. Various embodiments of a data drift detector suitable for detecting data drift in flash memory cells within the VMM array are disclosed.
    Type: Grant
    Filed: May 25, 2018
    Date of Patent: October 15, 2019
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do
  • Patent number: 10431265
    Abstract: A system and method are disclosed for performing address fault detection in a flash memory system. An address fault detection array is used to confirm that an activated word line or bit line is the word line or bit line that was actually intended to be activated based upon the received address, which will identify a type of fault where the wrong word line or bit line is activated. The address fault detection array also is used to indicate whether more than one word line or bit line was activated, which will identify a type of fault where two or more word lines or bit lines are activated.
    Type: Grant
    Filed: March 23, 2017
    Date of Patent: October 1, 2019
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Hieu Van Tran, Xian Liu, Nhan Do
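
Patent 10431265 above checks both that the activated word line (or bit line) corresponds to the received address and that no more than one line was activated. The sketch below shows those two checks on a plain list of activation flags; the function name and the boolean representation are illustrative assumptions rather than the patented fault detection array.

```python
# Minimal sketch of the two fault conditions described in the abstract above:
# (1) more than one word line active, (2) the wrong word line active.

def check_word_lines(activated, intended_index):
    """Return (fault_detected, reason) for a word-line activation pattern."""
    active = [i for i, on in enumerate(activated) if on]
    if len(active) > 1:
        return True, "multiple word lines activated: %s" % active
    if active != [intended_index]:
        return True, "wrong word line activated: %s" % active
    return False, "ok"

print(check_word_lines([0, 1, 0, 0], intended_index=1))  # (False, 'ok')
print(check_word_lines([0, 0, 1, 0], intended_index=1))  # wrong line fault
print(check_word_lines([0, 1, 1, 0], intended_index=1))  # multiple-line fault
```
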
  • Publication number: 20190295647
    Abstract: A memory device includes rows and columns of memory cells, word lines each connected to a memory cell row, bit lines each connected to a memory cell column, a word line driver connected to the word lines, a bit line driver connected to the bit lines, word line switches each disposed on one of the word lines for selectively connecting one memory cell row to the word line driver, and bit line switches each disposed on one of the bit lines for selectively connecting one memory cell column to the bit line driver. A controller controls the word line switches to connect only some of the rows of memory cells to the word line driver at a first point in time, and controls the bit line switches to connect only some of the columns of memory cells to the bit line driver at a second point in time.
    Type: Application
    Filed: June 21, 2018
    Publication date: September 26, 2019
    Inventors: Vipin Tiwari, Hieu Van Tran, Nhan Do, Mark Reiten
  • Publication number: 20190286976
    Abstract: Numerous embodiments of decoders for use with a vector-by-matrix multiplication (VMM) array in an artificial neural network are disclosed. The decoders include bit line decoders, word line decoders, control gate decoders, source line decoders, and erase gate decoders. In certain embodiments, a high voltage version and a low voltage version of a decoder is used.
    Type: Application
    Filed: May 29, 2018
    Publication date: September 19, 2019
    Inventors: Hieu Van Tran, Stanley Hong, Anh Ly, Thuan Vu, Hien Pham, Kha Nguyen, Han Tran
  • Publication number: 20190287621
    Abstract: Numerous embodiments of programming systems and methods for use with a vector-by-matrix multiplication (VMM) array in an artificial neural network are disclosed. Selected cells thereby can be programmed with extreme precision to hold one of N different values.
    Type: Application
    Filed: May 25, 2018
    Publication date: September 19, 2019
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do, Mark Reiten
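
Publication 20190287621 above concerns programming a selected cell with high precision to one of N different values. The sketch below is a generic program-and-verify loop given only as an illustration of that goal; the step size, tolerance, and simulated cell are assumptions and not the disclosed programming system.

```python
# Minimal sketch: nudge a simulated cell in small increments until its
# read-back value sits within a tolerance of one of N evenly spaced targets.

N_LEVELS = 8
TARGETS = [i / (N_LEVELS - 1) for i in range(N_LEVELS)]  # evenly spaced levels (assumed)

def program_to_level(level, step=0.01, tol=0.005, max_pulses=200):
    target = TARGETS[level]
    cell = 0.0                        # simulated stored value, starts erased
    for pulse in range(max_pulses):
        if abs(cell - target) <= tol:
            return cell, pulse        # verified within tolerance
        cell += step                  # apply one small programming pulse
    raise RuntimeError("cell failed to verify")

value, pulses = program_to_level(5)
print(round(value, 3), "after", pulses, "pulses")  # e.g. 0.71 after 71 pulses
```
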
  • Publication number: 20190287631
    Abstract: Numerous embodiments of a data refresh method and apparatus for use with a vector-by-matrix multiplication (VMM) array in an artificial neural network are disclosed. Various embodiments of a data drift detector suitable for detecting data drift in flash memory cells within the VMM array are disclosed.
    Type: Application
    Filed: May 25, 2018
    Publication date: September 19, 2019
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do
  • Publication number: 20190272876
    Abstract: A non-volatile memory device is disclosed. The non-volatile memory device comprises an array of flash memory cells comprising a plurality of flash memory cells organized into rows and columns, wherein the array is further organized into a plurality of sectors, each sector comprising a plurality of rows of flash memory cells, and a row driver selectively coupled to a first row and a second row.
    Type: Application
    Filed: May 20, 2019
    Publication date: September 5, 2019
    Inventors: Hieu Van Tran, Anh Ly, Thuan Vu, Vipin Tiwari, Nhan Do
  • Patent number: 10388389
    Abstract: A memory device that provides individual memory cell read, write and erase. In an array of memory cells arranged in rows and columns, each column of memory cells includes a column bit line, a first column control gate line for even row cells and a second column control gate line for odd row cells. Each row of memory cells includes a row source line. In another embodiment, each column of memory cells includes a column bit line and a column source line. Each row of memory cells includes a row control gate line. In yet another embodiment, each column of memory cells includes a column bit line and a column erase gate line. Each row of memory cells includes a row source line, a row control gate line, and a row select gate line.
    Type: Grant
    Filed: February 8, 2019
    Date of Patent: August 20, 2019
    Assignees: Silicon Storage Technology, Inc., The Regents of the University of California
    Inventors: Xinjie Guo, Farnood Merrikh Bayat, Dmitri Strukov, Nhan Do, Hieu Van Tran, Vipin Tiwari
  • Publication number: 20190244669
    Abstract: A memory device and method of erasing same that includes a substrate of semiconductor material and a plurality of memory cells formed on the substrate and arranged in an array of rows and columns. Each of the memory cells includes spaced apart source and drain regions in the substrate, with a channel region in the substrate extending there between, a floating gate disposed over and insulated from a first portion of the channel region which is adjacent the source region, a select gate disposed over and insulated from a second portion of the channel region which is adjacent the drain region, and a program-erase gate disposed over and insulated from the source region. The program-erase gate lines alone or in combination with the select gate lines, or the source lines, are arranged in the column direction so that each memory cell can be individually programmed, read and erased.
    Type: Application
    Filed: April 17, 2019
    Publication date: August 8, 2019
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do
  • Patent number: 10373686
    Abstract: A three-dimensional flash memory system is disclosed. The system comprises a memory array comprising a plurality of stacked dies, where each die comprises memory cells. The system further comprises a plurality of pins, where the function of at least some of the pins can be configured using a mechanism that selects a function for those pins from a plurality of possible functions.
    Type: Grant
    Filed: July 26, 2017
    Date of Patent: August 6, 2019
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Hieu Van Tran, Hung Quoc Nguyen, Mark Reiten
  • Publication number: 20190237142
    Abstract: A neural network device with synapses having memory cells each having a floating gate and a first gate over first and second portions of a channel region, and second and third gates over the floating gate and over the source region. First lines each electrically connect the first gates in one of the memory cell rows, second lines each electrically connect the second gates in one of the memory cell rows, third lines each electrically connect the third gates in one of the memory cell rows, fourth lines each electrically connect the source regions in one of the memory cell rows, and fifth lines each electrically connect the drain regions in one of the memory cell columns. The synapses receive a first plurality of inputs as electrical voltages on the first, second or third lines, and provide a first plurality of outputs as electrical currents on the fifth lines.
    Type: Application
    Filed: April 11, 2019
    Publication date: August 1, 2019
    Inventors: Hieu Van Tran, Steven Lemke, Vipin Tiwari, Nhan Do, Mark Reiten
  • Publication number: 20190237136
    Abstract: A neural network device having a first plurality of synapses that includes a plurality of memory cells. Each memory cell includes a floating gate over a first portion of a channel region and a first gate over a second portion of the channel region. The memory cells are arranged in rows and columns. A plurality of first lines each electrically connect together the first gates in one of the memory cell rows, a plurality of second lines each electrically connect together the source regions in one of the memory cell rows, and a plurality of third lines each electrically connect together the drain regions in one of the memory cell columns. The first plurality of synapses receives a first plurality of inputs as electrical voltages on the plurality of third lines, and provides a first plurality of outputs as electrical currents on the plurality of second lines.
    Type: Application
    Filed: April 11, 2019
    Publication date: August 1, 2019
    Inventors: Hieu Van Tran, Steven Lemke, Vipin Tiwari, Nhan Do, Mark Reiten
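
The two neural network entries above (20190237142 and 20190237136) describe synapse arrays that take inputs as voltages on row lines and produce outputs as currents summed along column lines. The sketch below shows that voltage-in, summed-current-out behaviour with a made-up conductance matrix; it is a behavioural illustration only, not the disclosed cell structure.

```python
# Minimal sketch: each cell contributes I = G * V, and the contributions of a
# column are summed to form that column's output current. Values are arbitrary.

def synapse_array_outputs(weights, input_voltages):
    """weights[row][col] ~ cell conductance; returns one summed current per column."""
    n_cols = len(weights[0])
    outputs = [0.0] * n_cols
    for row, v in enumerate(input_voltages):
        for col in range(n_cols):
            outputs[col] += weights[row][col] * v
    return outputs

W = [[0.25, 0.5],
     [0.5,  1.0],
     [1.0,  0.25]]
print(synapse_array_outputs(W, [1.0, 0.5, 0.0]))  # -> [0.5, 1.0]
```
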
  • Publication number: 20190206486
    Abstract: A memory device includes memory cells each configured to produce an output current during a read operation. Circuitry is configured to, for each of the memory cells, generate a read value based on the output current of the memory cell. Circuitry is configured to, for each of the memory cells, multiply the read value for the memory cell by a multiplier to generate a multiplied read value, wherein the multiplier for each of the memory cells is different from the multipliers for any others of the memory cells. Circuitry is configured to sum the multiplied read values. The read values can be electrical currents, electrical voltages or numerical values. Alternatively, added constant values can be used instead of multipliers. The multipliers or constants can be applied to read currents from individual cells, or read currents on entire bit lines.
    Type: Application
    Filed: December 7, 2018
    Publication date: July 4, 2019
    Inventors: Vipin Tiwari, Hieu Van Tran, Nhan Do
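
Publication 20190206486 above describes multiplying each cell's read value by a multiplier unique to that cell and summing the multiplied values. A minimal numeric sketch follows; the example read values and multipliers are arbitrary illustrations.

```python
# Minimal sketch of the weighted-sum readout described in the abstract above.

def weighted_sum(read_values, multipliers):
    """Scale each cell's read value by its own multiplier, then sum the results."""
    assert len(read_values) == len(multipliers)
    return sum(r * m for r, m in zip(read_values, multipliers))

reads = [1.0, 0.5, 0.0, 2.0]   # read values from four cells (e.g. currents)
mults = [8, 4, 2, 1]           # a distinct multiplier per cell
print(weighted_sum(reads, mults))  # -> 12.0
```
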
  • Publication number: 20190205729
    Abstract: Numerous embodiments for processing the current output of a vector-by-matrix multiplication (VMM) array in an artificial neural network are disclosed. The embodiments comprise a summer circuit and an activation function circuit. The summer circuit and/or the activation function circuit comprise circuit elements that can be adjusted in response to the total possible current received from the VMM to optimize power consumption.
    Type: Application
    Filed: March 27, 2018
    Publication date: July 4, 2019
    Inventors: Hieu Van Tran, Stanley Hong, Anh Ly, Thuan Vu, Hien Pham, Kha Nguyen, Han Tran
  • Patent number: 10340010
    Abstract: In one embodiment of the present invention, one row is selected and two columns are selected for a read or programming operation, such that twice as many flash memory cells can be read from or programmed in a single operation compared to the prior art. In another embodiment of the present invention, two rows in different sectors are selected and one column is selected for a read operation, such that twice as many flash memory cells can be read in a single operation compared to the prior art.
    Type: Grant
    Filed: August 16, 2016
    Date of Patent: July 2, 2019
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Hieu Van Tran, Anh Ly, Thuan Vu, Vipin Tiwari, Nhan Do
  • Publication number: 20190189214
    Abstract: An improved programming technique for non-volatile memory cell arrays, in which memory cells to be programmed with higher programming values are programmed first, and memory cells to be programmed with lower programming values are programmed second. The technique reduces or eliminates the number of previously programmed cells from being adversely incrementally programmed by an adjacent cell being programmed to higher program levels, and reduces the magnitude of adverse incremental programming for most of the memory cells, which is caused by floating gate to floating gate coupling. The memory device includes an array of non-volatile memory cells and a controller configured to identify programming values associated with incoming data, and perform a programming operation in which the incoming data is programmed into at least some of the non-volatile memory cells in a timing order of descending value of the programming values.
    Type: Application
    Filed: December 20, 2017
    Publication date: June 20, 2019
    Inventors: Vipin Tiwari, Nhan Do, Hieu Van Tran
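
Publication 20190189214 above orders programming so that cells receiving higher programming values are programmed before cells receiving lower ones, limiting disturb of already-programmed neighbours. The sketch below shows only that ordering step on hypothetical (address, value) pairs; the actual programming work is reduced to a log entry.

```python
# Minimal sketch: sort incoming data by programming value, highest first, and
# program in that order. Data values and the log are illustrative assumptions.

def program_in_descending_order(incoming):
    """incoming: list of (cell_address, programming_value) pairs."""
    ordered = sorted(incoming, key=lambda item: item[1], reverse=True)
    log = []
    for address, value in ordered:
        # A real controller would issue programming pulses for this cell here.
        log.append((address, value))
    return log

data = [(0, 2), (1, 7), (2, 5), (3, 1)]
print(program_in_descending_order(data))
# -> [(1, 7), (2, 5), (0, 2), (3, 1)]  (higher levels programmed first)
```
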