Patents by Inventor Thuan Vu

Thuan Vu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11586898
    Abstract: Various embodiments of high voltage generation circuits, high voltage operational amplifiers, adaptive high voltage supplies, adjustable high voltage incrementors, adjustable reference supplies, and reference circuits are disclosed. These circuits can optionally be used for programming a non-volatile memory cell in an analog neural memory to store one of many possible values.
    Type: Grant
    Filed: March 21, 2019
    Date of Patent: February 21, 2023
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly
  • Publication number: 20230049032
    Abstract: Numerous embodiments of output circuitry for an analog neural memory in a deep learning artificial neural network are disclosed. In some embodiments, a common mode circuit is used with differential cells, W+ and W−, that together store a weight, W. The common mode circuit can utilize current sources, variable resistors, or transistors as part of the structure for introducing a common mode voltage bias.
    Type: Application
    Filed: November 8, 2021
    Publication date: February 16, 2023
    Inventors: Hieu Van Tran, Thuan Vu
  • Publication number: 20230048411
    Abstract: Numerous embodiments of input circuitry for an analog neural memory in a deep learning artificial neural network are disclosed.
    Type: Application
    Filed: November 5, 2021
    Publication date: February 16, 2023
    Inventors: Hieu Van Tran, Kha Nguyen, Thuan Vu, Hien Pham, Stanley Hong, Stephen Trinh
  • Patent number: 11568229
    Abstract: Numerous embodiments are disclosed for accessing redundant non-volatile memory cells in place of one or more rows or columns containing one or more faulty non-volatile memory cells during a program, erase, read, or neural read operation in an analog neural memory system used in a deep learning artificial neural network.
    Type: Grant
    Filed: October 3, 2018
    Date of Patent: January 31, 2023
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Stanley Hong, Thuan Vu, Anh Ly, Hien Pham, Kha Nguyen, Han Tran
  • Publication number: 20230018166
    Abstract: Various examples of decoders and physical layout designs for non-volatile flash memory arrays in an analog neural system are disclosed. In one example, a system comprises a plurality of vector-by-matrix multiplication arrays in an analog neural memory system, each vector-by-matrix multiplication array comprising an array of non-volatile memory cells organized into rows and columns, wherein each memory cell comprises a word line terminal; a plurality of read row decoders, each read row decoder coupled to one of the plurality of vector-by-matrix multiplication arrays for applying a voltage to one or more selected rows during a read operation; and a shared program row decoder coupled to all of the plurality of vector-by-matrix multiplication arrays for applying a voltage to one or more selected rows in one or more of the vector-by-matrix multiplication arrays during a program operation.
    Type: Application
    Filed: June 29, 2022
    Publication date: January 19, 2023
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Stephen Trinh, Anh Ly, Han Tran, Kha Nguyen, Hien Pham
  • Publication number: 20220405564
    Abstract: Testing circuitry and methods are disclosed for use with analog neural memory in deep learning artificial neural networks. In one example, a method comprises programming a plurality of analog neural non-volatile memory cells in an array of analog neural non-volatile memory cells to store one of N different values, where N is a number of different levels that can be stored in any of the analog neural non-volatile memory cells; measuring a current drawn by the plurality of analog neural non-volatile memory cells; comparing the measured current to a target value; and identifying the plurality of the analog neural non-volatile memory cells as bad if the difference between the measured value and the target value exceeds a threshold.
    Type: Application
    Filed: August 22, 2022
    Publication date: December 22, 2022
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Steven Lemke, Nha Nguyen, Vipin Tiwari, Nhan Do
  • Patent number: 11532354
    Abstract: Numerous embodiments for performing tuning of a page or a word of non-volatile memory cells in an analog neural memory are disclosed. High voltage circuits used to generate high voltages applied to terminals of the non-volatile memory cells during the precision tuning process are also disclosed. Programming sequences for the application of the voltages to the terminals to minimize the occurrence of disturbances during tuning are also disclosed.
    Type: Grant
    Filed: September 17, 2020
    Date of Patent: December 20, 2022
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Steven Lemke, Vipin Tiwari, Nhan Do
  • Publication number: 20220398444
    Abstract: Testing circuitry and methods are disclosed for use with analog neural memory in deep learning artificial neural networks. In one example, a method comprises programming an analog neural non-volatile memory cell in an array to a target value representing one of N different values, where N is an integer; verifying that a value stored in the analog neural non-volatile memory cell is within an acceptable window of values around the target value; repeating the programming and verifying for each of the N values; and identifying the analog neural non-volatile memory cell as bad if any of the verifying indicates a value stored in the cell outside of the acceptable window of values around the target value.
    Type: Application
    Filed: August 22, 2022
    Publication date: December 15, 2022
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Steven Lemke, Nha Nguyen, Vipin Tiwari, Nhan Do
  • Patent number: 11521683
    Abstract: Numerous embodiments are disclosed for a high voltage generation algorithm and system for generating high voltages necessary for a particular programming operation in analog neural memory used in a deep learning artificial neural network. Compensation measures can be utilized that compensate for changes in voltage or current as the number of cells being programmed changes.
    Type: Grant
    Filed: March 3, 2021
    Date of Patent: December 6, 2022
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Anh Ly, Vipin Tiwari, Nhan Do
  • Publication number: 20220374696
    Abstract: Numerous embodiments are disclosed for splitting an array of non-volatile memory cells in an analog neural memory in a deep learning artificial neural network into multiple parts. Each part of the array interacts with certain circuitry dedicated to that part and with other circuitry that is shared with one or more other parts of the array.
    Type: Application
    Filed: August 30, 2021
    Publication date: November 24, 2022
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Stephen Trinh, Anh Ly
  • Publication number: 20220374161
    Abstract: Numerous embodiments are disclosed for an output circuit for an analog neural memory in a deep learning artificial neural network. In some embodiments, an output block receives current from a W+ bit line and current from an associated W− bit line, and the output block generates an output signal that is a differential signal in certain embodiments and is a single ended signal in other embodiments.
    Type: Application
    Filed: August 31, 2021
    Publication date: November 24, 2022
    Inventors: Hieu Van Tran, Thuan Vu, Mark Reiten
  • Patent number: 11507642
    Abstract: Configurable input blocks and output blocks and physical layouts are disclosed for analog neural memory systems that utilize non-volatile memory cells. An input block can be configured to support different numbers of arrays arranged in a horizontal direction, and an output block can be configured to support different numbers of arrays arranged in a vertical direction. Adjustable components are disclosed for use in the configurable input blocks and output blocks.
    Type: Grant
    Filed: June 21, 2019
    Date of Patent: November 22, 2022
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Stephen Trinh, Thuan Vu, Stanley Hong, Vipin Tiwari, Mark Reiten, Nhan Do
  • Publication number: 20220336020
    Abstract: Examples for ultra-precise tuning of a selected memory cell are disclosed. In one example, a method of programming a first memory cell in a neural memory to a target value is disclosed, the method comprising programming a second memory cell by applying programming voltages to terminals of the second memory cell; and determining if an output of the first memory cell has reached the target value.
    Type: Application
    Filed: June 27, 2022
    Publication date: October 20, 2022
    Inventors: Steven Lemke, Hieu Van Tran, Yuri Tkachev, Louisa Schneider, Henry A. Om'mani, Thuan Vu, Nhan Do, Vipin Tiwari
  • Publication number: 20220336011
    Abstract: Numerous examples for performing tuning of a page or a word of non-volatile memory cells in an analog neural memory are disclosed. In one example, a method comprises programming a word or page of non-volatile memory cells in an analog neural memory system; and identifying any fast bits in the word or page of non-volatile memory cells.
    Type: Application
    Filed: July 4, 2022
    Publication date: October 20, 2022
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Steven Lemke, Vipin Tiwari, Nhan Do
  • Publication number: 20220336010
    Abstract: Numerous examples for performing tuning of a page or a word of non-volatile memory cells in an analog neural memory are disclosed. In one example, an analog neural memory system comprises an array of non-volatile memory cells arranged into rows and columns, each non-volatile memory cell comprising a word line terminal, a bit line terminal, and an erase gate terminal; a plurality of word lines, each word line coupled to word line terminals of a row of non-volatile memory cells; a plurality of bit lines, each bit line coupled to bit line terminals of a column of non-volatile memory cells; and a plurality of erase gate enable transistors, each erase gate enable transistor coupled to erase gate terminals of a word of non-volatile memory cells.
    Type: Application
    Filed: July 1, 2022
    Publication date: October 20, 2022
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Steven Lemke, Vipin Tiwari, Nhan Do
  • Publication number: 20220319619
    Abstract: Circuitry and methods are disclosed for compensating for leakage in analog neural memory in deep learning artificial neural networks. In one example, a method is disclosed of compensating for leakage in an array of analog neural non-volatile memory cells, wherein the array is arranged in rows and columns, wherein each row is coupled to a word line and each column is coupled to a bitline, the method comprising measuring leakage for a column of analog neural non-volatile memory cells coupled to a bitline; storing the measured leakage value; and applying the measured leakage value during a read operation of the column of analog neural non-volatile memory cells to compensate for the leakage.
    Type: Application
    Filed: June 13, 2022
    Publication date: October 6, 2022
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Steven Lemke, Nha Nguyen, Vipin Tiwari, Nhan Do
  • Publication number: 20220319620
    Abstract: Testing circuitry and methods are disclosed for use with analog neural memory in deep learning artificial neural networks. In one example, a method is disclosed of testing a plurality of non-volatile memory cells in an array of non-volatile memory cells, wherein the array is arranged in rows and columns, wherein each row is coupled to a word line and each column is coupled to a bit line, and wherein each word line is selectively coupled to a row decoder and each bit line is selectively coupled to a column decoder, the method comprising asserting, by the row decoder, all word lines in the array; asserting, by the column decoder, all bit lines in the array; performing a deep programming operation on the array of non-volatile memory cells; and measuring a total current received from the bit lines.
    Type: Application
    Filed: June 15, 2022
    Publication date: October 6, 2022
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Steven Lemke, Nha Nguyen, Vipin Tiwari, Nhan Do
  • Patent number: 11449741
    Abstract: Testing circuitry and methods are disclosed for use with analog neural memory in deep learning artificial neural networks. The analog neural memory comprises one or more arrays of non-volatile memory cells. The testing circuitry and methods can be utilized during sort tests, qualification tests, and other tests to verify programming operations of one or more cells.
    Type: Grant
    Filed: September 12, 2019
    Date of Patent: September 20, 2022
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Thuan Vu, Stephen Trinh, Stanley Hong, Anh Ly, Steven Lemke, Nha Nguyen, Vipin Tiwari, Nhan Do
  • Patent number: 11423979
    Abstract: Various embodiments of word line decoders, control gate decoders, bit line decoders, low voltage row decoders, and high voltage row decoders and various types of physical layout designs for non-volatile flash memory arrays in an analog neural system are disclosed. Shared and segmented embodiments of high voltage row decoders are disclosed.
    Type: Grant
    Filed: July 3, 2019
    Date of Patent: August 23, 2022
    Assignee: SILICON STORAGE TECHNOLOGY, INC.
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Stephen Trinh, Anh Ly, Han Tran, Kha Nguyen, Hien Pham
  • Publication number: 20220254414
    Abstract: Numerous embodiments are disclosed for a high voltage generation algorithm and system for generating high voltages necessary for a particular programming operation in analog neural memory used in a deep learning artificial neural network. In one example, a method for programming a plurality of non-volatile memory cells in an array of non-volatile memory cells, comprises generating a high voltage, and programming a plurality of non-volatile memory cells in an array using the high voltage when a programming enable signal is asserted and providing a feedback loop to maintain the high voltage while programming the plurality of non-volatile memory cells.
    Type: Application
    Filed: May 2, 2022
    Publication date: August 11, 2022
    Inventors: Hieu Van Tran, Thuan Vu, Stanley Hong, Anh Ly, Vipin Tiwari, Nhan Do
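
Illustrative sketches for selected entries

Several of the abstracts above outline step-by-step methods. The Python sketches that follow are simplified, simulated illustrations of those methods rather than implementations from the patents; every function name, constant, and pass/fail criterion in them is an editorial assumption unless the abstract states it.

Publications 20230049032 and 20220374161 describe storing a weight W in a differential pair of cells, W+ and W−, read out against a common-mode bias. A minimal sketch of that readout, treating cell currents as plain numbers:

    def differential_weight(i_w_plus, i_w_minus, i_common_mode=0.0):
        """Recover a signed weight from a W+ / W- cell pair.

        Both cells ride on the same common-mode current, so subtracting the
        pair cancels the common-mode term and leaves the signed weight.
        """
        return (i_w_plus + i_common_mode) - (i_w_minus + i_common_mode)

    if __name__ == "__main__":
        # A weight of -0.3 stored as W+ = 0.2 and W- = 0.5 (illustrative units).
        print(differential_weight(0.2, 0.5))  # -0.3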
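
Patent 11568229 describes substituting redundant rows or columns for rows or columns that contain faulty cells during program, erase, read, or neural read operations. A sketch of the bookkeeping such a scheme needs, using a hypothetical RedundancyMap with a fixed pool of spare rows (the class and its methods are illustrative, not taken from the patent):

    class RedundancyMap:
        """Redirects accesses from rows marked faulty to spare rows."""

        def __init__(self, num_spare_rows):
            self._spares = list(range(num_spare_rows))
            self._remap = {}  # faulty row index -> spare row index

        def mark_row_faulty(self, row):
            if row not in self._remap:
                if not self._spares:
                    raise RuntimeError("no spare rows left")
                self._remap[row] = self._spares.pop(0)

        def resolve(self, row):
            """Return ('spare', i) if the row was remapped, else ('main', row)."""
            if row in self._remap:
                return ("spare", self._remap[row])
            return ("main", row)

    if __name__ == "__main__":
        rmap = RedundancyMap(num_spare_rows=2)
        rmap.mark_row_faulty(7)        # row 7 holds a faulty cell
        print(rmap.resolve(7))         # ('spare', 0)
        print(rmap.resolve(3))         # ('main', 3)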
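
Publication 20220405564 tests a group of cells by programming each to one of the N storable levels, measuring the current the group draws, and flagging the group as bad when the measurement misses a target by more than a threshold. A sketch of that comparison, with the cells simulated as ideal current sources and the threshold chosen arbitrarily:

    def group_current_test(cell_currents, target_current, threshold):
        """Pass the group if its summed current is within threshold of the target."""
        measured = sum(cell_currents)     # stands in for one bit-line measurement
        return abs(measured - target_current) <= threshold

    if __name__ == "__main__":
        N = 8                             # levels a cell can store
        level_step = 1.0 / N              # illustrative current per level
        cells = [3 * level_step] * 4      # four cells programmed to level 3
        target = 4 * 3 * level_step
        print(group_current_test(cells, target, threshold=0.05))  # True
        cells[0] = 0.0                    # one cell failed to program
        print(group_current_test(cells, target, threshold=0.05))  # False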
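
Publication 20220398444 screens a single cell by programming it to each of the N levels in turn and verifying that every read-back lands inside an acceptance window around its target. A sketch with an ideal simulated cell; the window width and the fake_program/fake_read helpers are assumptions:

    def cell_is_good(program, read, levels, window):
        """Program the cell to every level and verify each one on read-back."""
        for target in levels:
            program(target)
            if abs(read() - target) > window:
                return False              # any out-of-window level marks the cell bad
        return True

    if __name__ == "__main__":
        state = {"value": 0.0}

        def fake_program(target):         # ideal cell: lands exactly on target
            state["value"] = target

        def fake_read():
            return state["value"]

        N = 8
        levels = [i / N for i in range(N)]
        print(cell_is_good(fake_program, fake_read, levels, window=0.01))  # True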
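
Publication 20220319619 compensates for bit-line leakage by measuring the leakage of a column of cells, storing it, and applying the stored value during reads of that column. In the sketch below the baseline is taken with no row selected, which is an assumption about how the measurement is made; the abstract only states that the column leakage is measured, stored, and applied:

    class LeakageCompensatedColumn:
        def __init__(self, measure_bitline_current):
            self._measure = measure_bitline_current
            self._leakage = 0.0

        def calibrate(self):
            """Measure and store the column leakage (no cell selected)."""
            self._leakage = self._measure(selected_row=None)

        def read(self, row):
            """Read one cell and subtract the stored leakage."""
            return self._measure(selected_row=row) - self._leakage

    if __name__ == "__main__":
        cell_currents = {0: 0.30, 1: 0.55}
        LEAK = 0.02                               # illustrative column leakage

        def measure(selected_row=None):
            return LEAK + cell_currents.get(selected_row, 0.0)

        column = LeakageCompensatedColumn(measure)
        column.calibrate()
        print(round(column.read(0), 3))           # 0.3, leakage removed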
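
Publication 20220319620 screens a whole array by asserting every word line and bit line, performing a deep programming operation, and measuring the total current received from the bit lines. The abstract stops at the measurement; the pass/fail rule below, that a properly deep-programmed array should draw almost no current, is an editorial assumption:

    def deep_program_screen(array, deep_program, measure_total_current, limit):
        deep_program(array)                # drive every cell far into the off state
        return measure_total_current(array) <= limit

    if __name__ == "__main__":
        array = [[0.4, 0.7], [0.2, 0.9]]   # simulated cell currents before the test

        def deep_program(cells):
            for row in cells:
                for i in range(len(row)):
                    row[i] = 0.001         # residual current after deep programming

        def total_current(cells):          # all word lines and bit lines asserted
            return sum(sum(row) for row in cells)

        print(deep_program_screen(array, deep_program, total_current, limit=0.01))  # True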
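
Publication 20220254414 (and the related patent 11521683) keeps the programming high voltage on target by closing a feedback loop while programming is enabled, compensating for the load the selected cells place on the supply. The droop model and loop gain in this sketch are invented for illustration; only the compare-and-adjust feedback idea comes from the abstract:

    def program_with_hv_feedback(num_cells, hv_target, steps=20, gain=0.5):
        droop_per_cell = 0.05              # assumed load regulation, volts per selected cell
        hv_setting = hv_target             # initial generator setting
        hv_actual = hv_setting
        for _ in range(steps):             # loop runs while programming is enabled
            hv_actual = hv_setting - droop_per_cell * num_cells
            error = hv_target - hv_actual  # feedback: compare delivered voltage to target
            hv_setting += gain * error     # nudge the generator to hold the target
        return hv_actual

    if __name__ == "__main__":
        # More selected cells means more droop for the loop to cancel.
        print(round(program_with_hv_feedback(num_cells=4, hv_target=10.0), 3))   # ~10.0
        print(round(program_with_hv_feedback(num_cells=32, hv_target=10.0), 3))  # ~10.0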