Patents by Inventor Hieu Van Tran

Hieu Van Tran is named as an inventor on the patent filings listed below. The listing includes pending patent applications as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10607703
    Abstract: A memory device with memory cells in rows and columns, word lines connecting together the control gates for the memory cell rows, bit lines electrically connecting together the drain regions for the memory cell columns, first sub source lines each electrically connecting together the source regions in one of the memory cell rows and in a first plurality of memory cell columns, second sub source lines each electrically connecting together the source regions in one of the memory cell rows and in a second plurality of memory cell columns, first and second source lines, first select transistors each connected between one of the first sub source lines and the first source line, second select transistors each connected between one of the second sub source lines and the second source line, and select transistor lines each connected to the gates of one of the first select transistors and one of the second select transistors.
    Type: Grant
    Filed: July 23, 2018
    Date of Patent: March 31, 2020
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Hsuan Liang, Jeng-Wei Yang, Man-Tang Wu, Nhan Do, Hieu Van Tran
  • Patent number: 10607710
    Abstract: Numerous embodiments of a data refresh method and apparatus for use with a vector-by-matrix multiplication (VMM) array in an artificial neural network are disclosed. Various embodiments of a data drift detector suitable for detecting data drift in flash memory cells within the VMM array are disclosed.
    Type: Grant
    Filed: May 16, 2019
    Date of Patent: March 31, 2020
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do
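    Illustrative sketch (editor's addition, not from the patent): the refresh idea in patent 10607710 can be pictured as a periodic pass that re-reads each cell of the VMM array, compares it against its originally programmed target, and re-programs any cell that has drifted past a tolerance. The cell I/O callables, the tolerance value, and the data structures below are assumptions made purely for illustration, written in Python.

        # Rough software analogue of a drift-detect-and-refresh pass over a
        # vector-by-matrix multiplication (VMM) array. All names and values are
        # hypothetical; they are not taken from the patent.
        from typing import Callable, Dict, Tuple

        Coord = Tuple[int, int]  # (row, column) of a cell in the VMM array

        def refresh_pass(
            targets: Dict[Coord, float],                   # originally programmed analog values
            read_cell: Callable[[Coord], float],           # returns a cell's current analog value
            program_cell: Callable[[Coord, float], None],  # re-programs a cell to a target value
            tolerance: float = 0.02,                       # assumed acceptable drift (arbitrary units)
        ) -> int:
            """Re-program every cell whose stored value has drifted beyond `tolerance`."""
            refreshed = 0
            for coord, target in targets.items():
                if abs(read_cell(coord) - target) > tolerance:  # drift detector
                    program_cell(coord, target)                  # refresh
                    refreshed += 1
            return refreshed

        if __name__ == "__main__":
            cells = {(0, 0): 0.50, (0, 1): 0.46, (1, 0): 0.80}   # toy stand-in for flash cells
            targets = {(0, 0): 0.50, (0, 1): 0.50, (1, 0): 0.80}
            n = refresh_pass(targets, cells.__getitem__, cells.__setitem__)
            print(f"refreshed {n} cell(s); array now {cells}")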
  • Patent number: 10600484
    Abstract: An improved programming technique for non-volatile memory cell arrays, in which memory cells to be programmed with higher programming values are programmed first, and memory cells to be programmed with lower programming values are programmed second. The technique reduces or eliminates the number of previously programmed cells that are adversely and incrementally re-programmed when an adjacent cell is later programmed to a higher program level, and it reduces the magnitude of this adverse incremental programming, which is caused by floating-gate-to-floating-gate coupling, for most of the memory cells. The memory device includes an array of non-volatile memory cells and a controller configured to identify programming values associated with incoming data and to perform a programming operation in which the incoming data is programmed into at least some of the non-volatile memory cells in a timing order of descending programming value.
    Type: Grant
    Filed: December 20, 2017
    Date of Patent: March 24, 2020
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Vipin Tiwari, Nhan Do, Hieu Van Tran
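    Illustrative sketch (editor's addition, not from the patent): the ordering described in patent 10600484 amounts to sorting the incoming (cell, value) pairs by programming value and programming the highest values first. The `program_cell` callable below is a hypothetical stand-in for the device operation.

        # Program incoming data in descending order of programming value, so cells
        # with the highest targets are programmed first and are not later disturbed
        # by neighbors being pushed to higher levels. Names are hypothetical.
        from typing import Callable, Sequence, Tuple

        def program_descending(
            incoming: Sequence[Tuple[int, float]],       # (cell_address, programming_value)
            program_cell: Callable[[int, float], None],  # hypothetical device operation
        ) -> None:
            for address, value in sorted(incoming, key=lambda item: item[1], reverse=True):
                program_cell(address, value)

        if __name__ == "__main__":
            order = []
            program_descending([(0, 1.2), (1, 3.4), (2, 2.8)],
                               lambda addr, val: order.append((addr, val)))
            print(order)  # [(1, 3.4), (2, 2.8), (0, 1.2)] -- highest value first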
  • Patent number: 10580491
    Abstract: A memory device includes rows and columns of memory cells, word lines each connected to a memory cell row, bit lines each connected to a memory cell column, a word line driver connected to the word lines, a bit line driver connected to the bit lines, word line switches each disposed on one of the word lines for selectively connecting one memory cell row to the word line driver, and bit line switches each disposed on one of the bit lines for selectively connecting one memory cell column to the bit line driver. A controller controls the word line switches to connect only some of the rows of memory cells to the word line driver at a first point in time, and controls the bit line switches to connect only some of the columns of memory cells to the bit line driver at a second point in time.
    Type: Grant
    Filed: June 21, 2018
    Date of Patent: March 3, 2020
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Vipin Tiwari, Hieu Van Tran, Nhan Do, Mark Reiten
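    Illustrative sketch (editor's addition, not from the patent): the controller in patent 10580491 closes only some word-line switches at one point in time and only some bit-line switches at a later point in time. The toy controller below just tracks which switches are closed; all names are hypothetical.

        # Minimal model of a controller that connects subsets of rows and columns
        # to their drivers at different points in time rather than all at once.
        from typing import List, Set

        class SwitchController:
            def __init__(self) -> None:
                self.rows_connected: Set[int] = set()   # word-line switches currently closed
                self.cols_connected: Set[int] = set()   # bit-line switches currently closed

            def connect_rows(self, rows: List[int]) -> None:
                """Close word-line switches for `rows` only; open all others."""
                self.rows_connected = set(rows)

            def connect_cols(self, cols: List[int]) -> None:
                """Close bit-line switches for `cols` only; open all others."""
                self.cols_connected = set(cols)

        if __name__ == "__main__":
            ctrl = SwitchController()
            ctrl.connect_rows([0, 1, 2, 3])              # first point in time: some rows
            ctrl.connect_cols([4, 5, 6, 7])              # second point in time: some columns
            print(sorted(ctrl.rows_connected), sorted(ctrl.cols_connected))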
  • Publication number: 20200065650
    Abstract: Numerous embodiments are disclosed for an analog neuromorphic memory system for use in a deep learning neural network. The analog neuromorphic memory system comprises a plurality of vector-by-matrix multiplication arrays and various components shared by those arrays. The shared components include high voltage generation blocks, verify blocks, and testing blocks. The analog neuromorphic memory system can optionally be used within a long short-term memory system or a gated recurrent unit system.
    Type: Application
    Filed: November 6, 2018
    Publication date: February 27, 2020
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Anh Ly
  • Publication number: 20200066345
    Abstract: Numerous embodiments are disclosed for providing temperature compensation and leakage compensation for an analog neuromorphic memory system used in a deep learning neural network. The embodiments for providing temperature compensation implement discrete or continuous adaptive slope compensation and renormalization for devices, reference memory cells, or selected memory cells in the memory system. The embodiments for providing leakage compensation within a memory cell in the memory system implement adaptive erase gate coupling or the application of a negative bias on a control gate terminal, a negative bias on a word line terminal, or a bias on a source line terminal.
    Type: Application
    Filed: November 7, 2018
    Publication date: February 27, 2020
    Inventors: Hieu Van Tran, Steven Lemke, Nhan Do, Vipin Tiwari, Mark Reiten
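    Illustrative sketch (editor's addition, not from the publication): one way to picture the "adaptive slope compensation and renormalization" of publication 20200066345 is to scale a measured read current by a temperature-dependent slope factor so that it is renormalized to its value at a reference temperature. The piecewise coefficients below are invented for illustration only.

        # Discrete (piecewise) adaptive slope compensation of a read current versus
        # temperature. Coefficients and breakpoints are assumptions, not values
        # taken from the publication.
        def slope_coefficient(temp_c: float) -> float:
            """Pick a compensation slope (per deg C) from a small piecewise table."""
            if temp_c < 0:
                return 0.0030
            if temp_c < 60:
                return 0.0022
            return 0.0015

        def compensate_current(i_read: float, temp_c: float, t_ref_c: float = 25.0) -> float:
            """Renormalize a measured current to its equivalent at the reference temperature."""
            return i_read * (1.0 + slope_coefficient(temp_c) * (temp_c - t_ref_c))

        if __name__ == "__main__":
            for t in (-20.0, 25.0, 85.0):
                print(t, "degC ->", round(compensate_current(1.00e-6, t), 9), "A")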
  • Publication number: 20200065660
    Abstract: Numerous embodiments are disclosed for a configurable hardware system for use in an analog neural memory system for a deep learning neural network. The components within the configurable hardware system that are configurable can include vector-by-matrix multiplication arrays, summer circuits, activation circuits, inputs, reference devices, neurons, and testing circuits. These devices can be configured to provide various layers or vector-by-matrix multiplication arrays of various sizes, such that the same hardware can be used in analog neural memory systems with different requirements.
    Type: Application
    Filed: November 6, 2018
    Publication date: February 27, 2020
    Inventors: Hieu Van Tran, Vipin Tiwari, Mark Reiten, Nhan Do
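    Illustrative sketch (editor's addition, not from the publication): the configurability described in publication 20200065660 can be imagined as a configuration record that assigns a fixed pool of vector-by-matrix multiplication arrays, summers, and activation circuits to neural-network layers of different sizes. All field names and values are hypothetical.

        # A configuration record mapping physical VMM arrays onto layers of
        # different sizes, so the same hardware can serve different networks.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class LayerConfig:
            vmm_arrays: List[int]     # indices of physical VMM arrays assigned to this layer
            rows_used: int            # inputs actually used in each array
            cols_used: int            # outputs/neurons actually used
            activation: str = "relu"  # activation circuit selected for this layer

        @dataclass
        class SystemConfig:
            layers: List[LayerConfig] = field(default_factory=list)

        if __name__ == "__main__":
            cfg = SystemConfig(layers=[
                LayerConfig(vmm_arrays=[0, 1], rows_used=512, cols_used=256),
                LayerConfig(vmm_arrays=[2], rows_used=256, cols_used=64, activation="sigmoid"),
            ])
            print(cfg)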
  • Publication number: 20200058356
    Abstract: Numerous embodiments are disclosed for a high voltage generation algorithm and system for generating high voltages necessary for a particular programming operation in analog neural memory used in a deep learning artificial neural network. Different calibration algorithms and systems are also disclosed. Optionally, the duration of a programming voltage can change as the number of cells to be programmed changes.
    Type: Application
    Filed: August 24, 2019
    Publication date: February 20, 2020
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Anh Ly, Vipin Tiwari, Nhan Do
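    Illustrative sketch (editor's addition, not from the publication): publication 20200058356 notes that the duration of the programming voltage can change as the number of cells to be programmed changes; the next three publications describe related compensation of the high-voltage signal itself. A toy version of the duration adjustment, with an assumed linear model and invented constants, is shown below.

        # Scale the programming-pulse duration with the number of cells being
        # programmed, compensating for the extra load on the high-voltage supply.
        # The base duration and per-cell adjustment are invented for illustration.
        def program_pulse_duration_us(n_cells: int,
                                      base_us: float = 10.0,
                                      per_cell_us: float = 0.05) -> float:
            """Longer pulse when more cells share the high-voltage supply (assumed model)."""
            if n_cells < 1:
                raise ValueError("need at least one cell to program")
            return base_us + per_cell_us * (n_cells - 1)

        if __name__ == "__main__":
            for n in (1, 16, 256):
                print(n, "cells ->", program_pulse_duration_us(n), "us")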
  • Publication number: 20200058357
    Abstract: Numerous embodiments are disclosed for a high voltage generation algorithm and system for generating high voltages necessary for a particular programming operation in analog neural memory used in a deep learning artificial neural network. Different calibration algorithms and systems are also disclosed. The system can modify a high voltage signal applied to an array of cells during a programming operation as the number of cells being programmed changes.
    Type: Application
    Filed: August 25, 2019
    Publication date: February 20, 2020
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Anh Ly, Vipin Tiwari, Nhan Do
  • Publication number: 20200051635
    Abstract: Numerous embodiments are disclosed for a high voltage generation algorithm and system for generating high voltages necessary for a particular programming operation in analog neural memory used in a deep learning artificial neural network. Different calibration algorithms and systems are also disclosed. Compensation measures are utilized to compensate for changes in voltage or current as the number of cells being programmed changes.
    Type: Application
    Filed: August 25, 2019
    Publication date: February 13, 2020
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Anh Ly, Vipin Tiwari, Nhan Do
  • Publication number: 20200051636
    Abstract: Numerous embodiments are disclosed for a high voltage generation algorithm and system for generating high voltages necessary for a particular programming operation in analog neural memory used in a deep learning artificial neural network. Different calibration algorithms and systems are also disclosed. Optionally, compensation measures can be utilized that compensate for changes in voltage or current as the number of cells being programmed changes.
    Type: Application
    Filed: August 25, 2019
    Publication date: February 13, 2020
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Anh Ly, Vipin Tiwari, Nhan Do
  • Publication number: 20200035310
    Abstract: Numerous embodiments of a data refresh method and apparatus for use with a vector-by-matrix multiplication (VMM) array in an artificial neural network are disclosed. Various embodiments of a data drift detector suitable for detecting data drift in flash memory cells within the VMM array are disclosed.
    Type: Application
    Filed: October 2, 2019
    Publication date: January 30, 2020
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do
  • Publication number: 20200019849
    Abstract: Numerous embodiments are disclosed for accessing redundant non-volatile memory cells in place of one or more rows or columns containing one or more faulty non-volatile memory cells during a program, erase, read, or neural read operation in an analog neural memory system used in a deep learning artificial neural network.
    Type: Application
    Filed: October 3, 2018
    Publication date: January 16, 2020
    Inventors: Hieu Van Tran, Stanley Hong, Thuan Vu, Anh Ly, Hien Pham, Kha Nguyen, Han Tran
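    Illustrative sketch (editor's addition, not from the publication): the redundancy scheme in publication 20200019849 can be pictured as a small remapping table that redirects accesses from rows containing faulty cells to spare rows. The mapping policy below is an assumption for illustration.

        # Redirect accesses from rows with faulty cells to redundant rows during
        # program, erase, read, or neural-read operations. Names are hypothetical.
        from typing import Dict, List

        class RowRedundancyMap:
            def __init__(self, redundant_rows: List[int]) -> None:
                self._free = list(redundant_rows)   # physical addresses of spare rows
                self._remap: Dict[int, int] = {}    # faulty row -> redundant row

            def mark_faulty(self, row: int) -> int:
                """Assign a redundant row to a faulty row and return its address."""
                if row not in self._remap:
                    if not self._free:
                        raise RuntimeError("no redundant rows left")
                    self._remap[row] = self._free.pop(0)
                return self._remap[row]

            def resolve(self, row: int) -> int:
                """Translate a logical row to the row actually accessed."""
                return self._remap.get(row, row)

        if __name__ == "__main__":
            rmap = RowRedundancyMap(redundant_rows=[1024, 1025])
            rmap.mark_faulty(7)
            print(rmap.resolve(7), rmap.resolve(8))  # 1024 8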
  • Publication number: 20200019848
    Abstract: Numerous embodiments are disclosed for compensating for differences in the slope of the current-voltage characteristic curve among reference transistors, reference memory cells, and flash memory cells during a read operation in an analog neural memory in a deep learning artificial neural network. The embodiments are able to compensate for slope differences during both sub-threshold and linear operation of reference transistors.
    Type: Application
    Filed: October 3, 2018
    Publication date: January 16, 2020
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do
  • Patent number: 10534554
    Abstract: Apparatus, and an associated method, for enhancing security and preventing hacking of a flash memory device. The apparatus and method use a random number to offset the read or write address of a memory cell. The random number is generated by determining the leakage current of memory cells. In another embodiment, random data can be written or read in parallel to prevent hackers from determining the contents of the data being written or read by monitoring the sense amplifiers.
    Type: Grant
    Filed: October 13, 2017
    Date of Patent: January 14, 2020
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Hieu Van Tran, Vipin Tiwari, Nhan Do
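    Illustrative sketch (editor's addition, not from the patent): the address-offset idea in patent 10534554 shifts every logical address by a random value so that observed access patterns do not reveal physical cell locations. In the patent the randomness comes from measured cell leakage current; the sketch below substitutes a seeded software PRNG, and all names are hypothetical.

        # Offset logical addresses by a random value derived from an entropy source.
        # `seed_from_leakage` stands in for the leakage-current measurement.
        import random

        class ObfuscatedAddressMap:
            def __init__(self, array_size: int, seed_from_leakage: int) -> None:
                self.size = array_size
                self.offset = random.Random(seed_from_leakage).randrange(array_size)

            def physical(self, logical_addr: int) -> int:
                """Map a logical address to an offset physical address (wraps around)."""
                return (logical_addr + self.offset) % self.size

        if __name__ == "__main__":
            amap = ObfuscatedAddressMap(array_size=4096, seed_from_leakage=0x5A17)
            print(amap.physical(0), amap.physical(1), amap.physical(4095))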
  • Patent number: 10522226
    Abstract: Numerous embodiments are disclosed for a high voltage generation algorithm and system for generating high voltages necessary for a particular programming operation in analog neural memory used in a deep learning artificial neural network. Different calibration algorithms and systems are also disclosed. Optionally, compensation measures can be utilized that compensate for changes in voltage or current as the number of cells being programmed changes.
    Type: Grant
    Filed: July 23, 2018
    Date of Patent: December 31, 2019
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Anh Ly, Vipin Tiwari, Nhan Do
  • Patent number: 10515694
    Abstract: A method of reading a memory device having a plurality of memory cells, and a device configured for the same, by reading a first memory cell of the plurality of memory cells to generate a first read current, reading a second memory cell of the plurality of memory cells to generate a second read current, applying a first offset value to the second read current, then combining the first and second read currents to form a third read current, and then determining a program state using the third read current. Alternatively, a first voltage is generated from the first read current and a second voltage is generated from the second read current, wherein the offset value is applied to the second voltage, the first and second voltages are combined to form a third voltage, and the program state is then determined using the third voltage.
    Type: Grant
    Filed: October 1, 2018
    Date of Patent: December 24, 2019
    Assignee: Silicon Storage Technology, Inc.
    Inventors: Vipin Tiwari, Nhan Do, Hieu Van Tran
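    Illustrative sketch (editor's addition, not from the patent): the read method in patent 10515694 combines two read currents, with an offset applied to the second, and decides the program state from the combined value. The thresholds and offset below are invented for illustration.

        # Combine two read currents (offset applied to the second) into a third
        # current and pick the program state it corresponds to. Values are arbitrary.
        from typing import Sequence

        def determine_state(i_first: float, i_second: float, offset: float,
                            thresholds: Sequence[float]) -> int:
            """Return the highest state whose threshold the combined current reaches."""
            i_combined = i_first + (i_second + offset)   # third read current
            state = 0
            for level, threshold in enumerate(thresholds, start=1):
                if i_combined >= threshold:
                    state = level
            return state

        if __name__ == "__main__":
            # Three thresholds -> four possible states (0..3); currents in amperes.
            print(determine_state(1.0e-6, 0.8e-6, offset=0.1e-6,
                                  thresholds=[1.0e-6, 1.5e-6, 2.0e-6]))  # -> 2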
  • Publication number: 20190378548
    Abstract: A system and method are disclosed for performing address fault detection in a flash memory system. In one embodiment, a flash memory system comprises a memory array comprising flash memory cells arranged in rows and columns, a row decoder for receiving a row address as an input, the row decoder coupled to a plurality of word lines, wherein each word line is coupled to a row of flash memory cells in the memory array, an address fault detection array comprising a column of memory cells, wherein each of the plurality of word lines is coupled to a memory cell in the column, and an analog comparator for comparing a current drawn by the column with a reference current and for indicating a fault if the current drawn by the column exceeds the reference current.
    Type: Application
    Filed: August 26, 2019
    Publication date: December 12, 2019
    Inventors: Hieu Van Tran, Xian Liu, Nhan Do
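    Illustrative sketch (editor's addition, not from the publication): in publication 20190378548 every word line also drives one cell in a dedicated detection column, and a fault is flagged when that column draws more current than a reference allows, for example when an address fault asserts more than one word line. The current values below are arbitrary.

        # Software analogue of the address-fault check: each asserted word line adds
        # one cell's worth of current to the detection column; exceeding the
        # reference means more than one row was selected. Values are assumptions.
        from typing import Sequence

        CELL_CURRENT_UA = 10.0   # assumed current contributed per asserted word line
        REFERENCE_UA = 15.0      # assumed reference: between one and two cells' worth

        def address_fault(word_line_states: Sequence[bool]) -> bool:
            """True if the detection column draws more current than the reference."""
            return CELL_CURRENT_UA * sum(word_line_states) > REFERENCE_UA

        if __name__ == "__main__":
            print(address_fault([False, True, False, False]))  # one row selected -> False
            print(address_fault([False, True, True, False]))   # two rows selected -> True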
  • Publication number: 20190355420
    Abstract: In one aspect, the invention concerns a memory system that compensates for power level variations in sense amplifiers for multilevel memory. For example, a compensation circuit can be employed to compensate for current or voltage variations in the power supplied to multilevel memory sense amplifiers. As another example, compensation can be accomplished by applying a bias voltage to the power supply. Another example is a sense amplifier configured with an improved input common mode voltage range. Such sense amplifiers can be two-pair and three-pair sense amplifiers. Further examples of the invention include simplified sense amplifier configurations and sense amplifiers having reduced leakage current.
    Type: Application
    Filed: July 30, 2019
    Publication date: November 21, 2019
    Inventor: Hieu Van Tran
  • Publication number: 20190355424
    Abstract: A memory device with memory cells in rows and columns, word lines connecting together the control gates for the memory cell rows, bit lines electrically connecting together the drain regions for the memory cell columns, first sub source lines each electrically connecting together the source regions in one of the memory cell rows and in a first plurality of memory cell columns, second sub source lines each electrically connecting together the source regions in one of the memory cell rows and in a second plurality of memory cell columns, first and second source lines, first select transistors each connected between one of the first sub source lines and the first source line, second select transistors each connected between one of the second sub source lines and the second source line, and select transistor lines each connected to the gates of one of the first select transistors and one of the second select transistors.
    Type: Application
    Filed: July 23, 2018
    Publication date: November 21, 2019
    Inventors: Hsuan Liang, Jeng-Wei Yang, Man-Tang Wu, Nhan Do, Hieu Van Tran