Patents by Inventor Chieh-Fang TENG

Chieh-Fang TENG has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250111215
    Abstract: A method can include determining which computing units in a computing-in-memory (CIM) macro are to be turned off. The CIM macro includes an array of computing units with X rows and Y columns; the X rows are organized into N row-groups, each including multiple rows of computing units, and the Y columns are organized into M column-groups, each including multiple columns of computing units. Based on the determination of which computing units are to be turned off, the method turns off at least one row-group or column-group of computing units, each row-group and column-group being separately controllable. The method then performs a computation based on kernel weights and activations of a neural network stored in the active computing units in the CIM macro that are not turned off.
    Type: Application
    Filed: September 28, 2023
    Publication date: April 3, 2025
    Applicant: MEDIATEK INC.
    Inventors: Chieh-Fang TENG, En-Jui CHANG, Chih Chung CHENG
  • Publication number: 20250007534
    Abstract: A coding apparatus and a coding method are proposed. The coding apparatus includes a memory and a processor. The processor is configured to obtain a feature map, perform lossy compression on the feature map to generate a lossy feature map, perform lossless compression on the lossy feature map to generate a resultant feature map, and store the resultant feature map in the memory.
    Type: Application
    Filed: June 30, 2023
    Publication date: January 2, 2025
    Applicant: Novatek Microelectronics Corp.
    Inventors: Cheng-Yang Chang, Chieh-Fang Teng, Yu Shan Tai, Kai-Ya Wei, An-Yu Wu, Yen-Hsi Lee
  • Publication number: 20240177019
    Abstract: Aspects of the present disclosure provide an apparatus. For example, the apparatus can include a compiler configured to compile a neural network (NN) model to generate a plurality of operations/threads and determine whether each of the operations/threads is compute bound or memory bound, and a memory coupled to the compiler and configured to store the operations/threads. The apparatus can also include a thread scheduler coupled to the memory and configured to schedule the operations/threads of the NN model. The apparatus can also include a multi-engine processing unit that includes a plurality of compute units (CUs), and an executor coupled between the thread scheduler and the multi-engine processing unit. The executor can be configured to allocate the operations/threads of the NN model and activate a number of the CUs of the multi-engine processing unit for each of the operations/threads based on whether the operation/thread is compute bound or memory bound.
    Type: Application
    Filed: May 25, 2023
    Publication date: May 30, 2024
    Applicant: MEDIATEK INC.
    Inventors: Chieh-Fang TENG, En-Jui CHANG, Chih Chung CHENG
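The row-group/column-group gating described in publication 20250111215 can be illustrated with a minimal sketch. The macro geometry (X, Y, N, M values), the mask representation, and the helper names below are illustrative assumptions, not the patented circuit: the point is only that whole row-groups and column-groups are switched off as units, and the computation runs over the units that remain active.

```python
import numpy as np

# Illustrative CIM macro geometry (assumed values): X rows organized
# into N row-groups, Y columns organized into M column-groups.
X, Y, N, M = 8, 8, 4, 4  # 2 rows per row-group, 2 columns per column-group

def active_mask(off_row_groups, off_col_groups):
    """Build a boolean mask of computing units that stay on after
    whole row-groups / column-groups are gated off."""
    mask = np.ones((X, Y), dtype=bool)
    rows_per, cols_per = X // N, Y // M
    for g in off_row_groups:          # each row-group is gated as a unit
        mask[g * rows_per:(g + 1) * rows_per, :] = False
    for g in off_col_groups:          # each column-group is gated as a unit
        mask[:, g * cols_per:(g + 1) * cols_per] = False
    return mask

def cim_compute(weights, activations, mask):
    """Multiply-accumulate only over active units; gated units
    contribute nothing to the result."""
    return np.sum(weights * activations * mask)

# Turn off row-group 3 (rows 6-7) and column-group 0 (columns 0-1):
mask = active_mask(off_row_groups=[3], off_col_groups=[0])
print(int(mask.sum()))  # 64 - 16 - 16 + 4 overlap = 36 active units
```

Because each group is separately controllable, the mask is the union of whole gated rows and columns, never individual units.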
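The two-stage pipeline in publication 20250007534 (lossy compression of a feature map, then lossless compression of the lossy result) can be sketched as follows. The abstract does not specify the codecs, so uniform quantization and zlib are stand-in assumptions here, not the claimed method.

```python
import zlib
import numpy as np

def lossy_compress(feature_map: np.ndarray, bits: int = 8):
    """Illustrative lossy step: uniform quantization to `bits` bits,
    producing the 'lossy feature map' of the abstract."""
    lo, hi = float(feature_map.min()), float(feature_map.max())
    scale = (hi - lo) / (2 ** bits - 1) or 1.0  # avoid zero scale
    q = np.round((feature_map - lo) / scale).astype(np.uint8)
    return q, lo, scale

def lossless_compress(q: np.ndarray) -> bytes:
    """Illustrative lossless step: deflate the quantized map,
    producing the 'resultant feature map' to store in memory."""
    return zlib.compress(q.tobytes())

fmap = np.random.default_rng(0).standard_normal((64, 64)).astype(np.float32)
q, lo, scale = lossy_compress(fmap)
blob = lossless_compress(q)                      # what gets stored
restored = q.astype(np.float32) * scale + lo     # lossy reconstruction
print(len(blob) < fmap.nbytes)                   # True: smaller than raw
```

Chaining the two stages is the standard trade: the lossy stage bounds the reconstruction error (here, at most one quantization step), and the lossless stage shrinks the already-reduced data with no further loss.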
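Publication 20240177019 has the compiler label each operation/thread as compute bound or memory bound, and the executor activate a number of compute units (CUs) accordingly. The abstract does not give the classification criterion or the allocation policy; the sketch below uses a roofline-style arithmetic-intensity test and a simple allocation rule as assumptions, with a made-up machine balance.

```python
from dataclasses import dataclass

@dataclass
class Op:
    name: str
    flops: float        # total arithmetic operations
    bytes_moved: float  # total memory traffic in bytes

# Hypothetical machine balance: FLOPs per byte at which the processing
# unit transitions from memory-limited to compute-limited.
MACHINE_BALANCE = 10.0

def is_compute_bound(op: Op, balance: float = MACHINE_BALANCE) -> bool:
    """Roofline-style test: arithmetic intensity above the machine
    balance means the op is limited by compute, not memory."""
    return op.flops / op.bytes_moved > balance

def cus_to_activate(op: Op, total_cus: int) -> int:
    """Sketch of an executor policy: compute-bound ops get every CU;
    memory-bound ops gain little from extra CUs, so activate fewer."""
    return total_cus if is_compute_bound(op) else max(1, total_cus // 4)

conv = Op("conv3x3", flops=1.8e9, bytes_moved=4.0e7)     # intensity 45
embed = Op("embedding", flops=2.0e6, bytes_moved=8.0e6)  # intensity 0.25
print(cus_to_activate(conv, 16), cus_to_activate(embed, 16))  # 16 4
```

The design intuition matches the abstract: activating all CUs for a memory-bound op wastes power while the memory system remains the bottleneck, so per-op classification lets the executor scale the active CU count to where the roofline actually sits.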