Patents by Inventor Ru-Hui LIU

Ru-Hui LIU has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11955338
    Abstract: A method includes providing a substrate having a surface such that a first hard mask layer is formed over the surface and a second hard mask layer is formed over the first hard mask layer, forming a first pattern in the second hard mask layer, where the first pattern includes a first mandrel oriented lengthwise in a first direction and a second mandrel oriented lengthwise in a second direction different from the first direction, and where the first mandrel has a top surface, a first sidewall, and a second sidewall opposite to the first sidewall, and depositing a material towards the first mandrel and the second mandrel such that a layer of the material is formed on the top surface and the first sidewall but not the second sidewall of the first mandrel.
    Type: Grant
    Filed: January 30, 2023
    Date of Patent: April 9, 2024
    Assignee: TAIWAN SEMICONDUCTOR MANUFACTURING CO., LTD.
    Inventors: Shih-Chun Huang, Ya-Wen Yeh, Chien-Wen Lai, Wei-Liang Lin, Ya Hui Chang, Yung-Sung Yen, Ru-Gun Liu, Chin-Hsiang Lin, Yu-Tien Shen
  • Patent number: 11392820
    Abstract: A transpose memory unit for a plurality of multi-bit convolutional neural network based computing-in-memory applications includes a memory cell and a transpose cell. The memory cell stores a weight. The transpose cell is connected to the memory cell and receives the weight from the memory cell. The transpose cell includes an input bit line, at least one first input word line, a first output bit line, at least one second input word line and a second output bit line. One of the at least one first input word line and the at least one second input word line transmits at least one multi-bit input value, and the transpose cell is controlled by the second word line to generate a multiply-accumulate output value on one of the first output bit line and the second output bit line according to the at least one multi-bit input value multiplied by the weight.
    Type: Grant
    Filed: January 14, 2020
    Date of Patent: July 19, 2022
    Assignee: NATIONAL TSING HUA UNIVERSITY
    Inventors: Meng-Fan Chang, Jian-Wei Su, Yen-Chi Chou, Ru-Hui Liu
  • Publication number: 20210216846
    Abstract: A transpose memory unit for a plurality of multi-bit convolutional neural network based computing-in-memory applications includes a memory cell and a transpose cell. The memory cell stores a weight. The transpose cell is connected to the memory cell and receives the weight from the memory cell. The transpose cell includes an input bit line, at least one first input word line, a first output bit line, at least one second input word line and a second output bit line. One of the at least one first input word line and the at least one second input word line transmits at least one multi-bit input value, and the transpose cell is controlled by the second word line to generate a multiply-accumulate output value on one of the first output bit line and the second output bit line according to the at least one multi-bit input value multiplied by the weight.
    Type: Application
    Filed: January 14, 2020
    Publication date: July 15, 2021
    Inventors: Meng-Fan CHANG, Jian-Wei SU, Yen-Chi CHOU, Ru-Hui LIU
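The abstract of patent 11955338 describes a patterning flow in which material is deposited directionally so that it coats a mandrel's top surface and one sidewall while the opposite sidewall stays bare. The sketch below is only an illustrative, hypothetical model of that idea: the class names, the representation of deposition as a set of coated faces, and the "shadowed sidewall" behavior are assumptions for clarity, not the patented process itself.

```python
# Hypothetical, simplified model of the flow described in the abstract of
# patent 11955338. Names and data structures are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Mandrel:
    """A mandrel patterned in the second hard mask layer."""
    name: str
    orientation: str                       # lengthwise direction, e.g. "x" or "y"
    coated_faces: set = field(default_factory=set)

def directional_deposit(mandrels, tilt_toward="first_sidewall"):
    """Model a tilted (directional) deposition: material lands on the top
    surface and on the sidewall facing the source, while the opposite
    (shadowed) sidewall stays uncoated."""
    for m in mandrels:
        m.coated_faces.add("top")
        m.coated_faces.add(tilt_toward)            # facing sidewall is coated
        m.coated_faces.discard("second_sidewall")  # shadowed sidewall stays bare
    return mandrels

# First pattern: two mandrels oriented lengthwise in different directions.
pattern = [Mandrel("mandrel_1", orientation="x"),
           Mandrel("mandrel_2", orientation="y")]

for m in directional_deposit(pattern):
    print(m.name, sorted(m.coated_faces))
# mandrel_1 ['first_sidewall', 'top']
# mandrel_2 ['first_sidewall', 'top']
```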
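The abstracts of patent 11392820 and publication 20210216846 describe a transpose computing-in-memory cell: a stored weight can be combined with multi-bit inputs driven on either of two sets of word lines, producing a multiply-accumulate result on the corresponding output bit lines. The following is a minimal numerical sketch of that multiply-accumulate behavior; the array size, bit widths, and the row/column "transpose" read-out are assumptions made for illustration, and the circuit-level word-line/bit-line control is not modeled.

```python
# Hypothetical sketch of the MAC behavior described in patent 11392820 /
# publication 20210216846. Shapes and bit widths are illustrative assumptions.

import numpy as np

class TransposeCIMArray:
    """Weights are stored once; MAC results can be produced along either
    dimension, mimicking the forward and transpose read directions."""

    def __init__(self, weights):
        self.weights = np.asarray(weights)       # each cell stores one weight

    def mac_rows(self, multi_bit_inputs):
        # Inputs driven on the first input word lines; partial products are
        # accumulated down each column (first output bit lines).
        return self.weights.T @ np.asarray(multi_bit_inputs)

    def mac_cols(self, multi_bit_inputs):
        # Inputs driven on the second input word lines; accumulation runs in
        # the transpose direction, along each row (second output bit lines).
        return self.weights @ np.asarray(multi_bit_inputs)

# 4x4 array of signed weights and four multi-bit (4-bit) input values.
rng = np.random.default_rng(0)
cim = TransposeCIMArray(rng.integers(-8, 8, size=(4, 4)))
x = rng.integers(0, 16, size=4)

print("forward MAC  :", cim.mac_rows(x))
print("transpose MAC:", cim.mac_cols(x))
```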