Patents by Inventor Jaehyuk Huh

Jaehyuk Huh has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220414503
    Abstract: Disclosed is an SLO-aware artificial intelligence inference scheduling technology for a heterogeneous-processor-based edge system. A scheduling method for a machine learning (ML) inference task, performed by a scheduling system, may include receiving inference task requests for multiple ML models against an edge system composed of heterogeneous processors, and operating the heterogeneous processor resources of the edge system under a service-level objective (SLO)-aware scheduling policy in response to the received requests.
    Type: Application
    Filed: January 5, 2022
    Publication date: December 29, 2022
    Inventors: Jongse PARK, Wonik SEO, Sanghoon CHA, Yeonjae KIM, Jaehyuk HUH
  • Publication number: 20220374513
    Abstract: An electronic device includes a System on Chip (SoC) and a memory. The SoC includes a central processing unit (CPU) and a neural processing unit (NPU). The memory includes an enclave page cache (EPC), in which a validation table is stored, and at least one NPU enclave. The NPU enclave and the EPC provide a trusted execution environment, which is isolated from the execution environment in which the system software of the CPU is executed.
    Type: Application
    Filed: May 20, 2022
    Publication date: November 24, 2022
    Applicants: Samsung Electronics Co., Ltd., KOREA ADVANCED INSTITUTE OF SCIENCE AND TECHNOLOGY
    Inventors: Jaehyuk HUH, Sunho LEE, Seonjin NA
  • Patent number: 10862876
    Abstract: A device transmits or receives a packet in a memory network including one or more processors and/or one or more memory devices. The device includes a key storage unit configured to store a one-time password (OTP) key that is shared with a target node, an encryption unit configured to encrypt a transmission packet with the OTP key stored in the key storage unit and to transmit the encrypted transmission packet to the target node, and a decryption unit configured to decrypt a received packet from the target node with the OTP key stored in the key storage unit. The device is a processor or a memory device in the memory network.
    Type: Grant
    Filed: September 14, 2017
    Date of Patent: December 8, 2020
    Assignees: SK hynix Inc., Korea Advanced Institute of Science and Technology
    Inventors: Yeonju Ro, Seongwook Jin, Jaehyuk Huh, John Dongjun Kim
  • Publication number: 20180176202
    Abstract: A device transmits or receives a packet in a memory network including one or more processors and/or one or more memory devices. The device includes a key storage unit configured to store a one-time password (OTP) key that is shared with a target node, an encryption unit configured to encrypt a transmission packet with the OTP key stored in the key storage unit and to transmit the encrypted transmission packet to the target node, and a decryption unit configured to decrypt a received packet from the target node with the OTP key stored in the key storage unit. The device is a processor or a memory device in the memory network.
    Type: Application
    Filed: September 14, 2017
    Publication date: June 21, 2018
    Inventors: Yeonju RO, Seongwook JIN, Jaehyuk HUH, John Dongjun KIM
  • Patent number: 7904657
    Abstract: The present invention proposes a novel cache residence prediction mechanism that predicts whether the requested data of a cache miss can be found in another cache. The memory controller can use the prediction result to determine whether it should immediately initiate a memory access, or initiate no memory access until a cache snoop response shows that the requested data cannot be supplied by a cache. The cache residence prediction mechanism can be implemented at the cache side, the memory side, or both. A cache-side prediction mechanism can predict that data requested by a cache miss can be found in another cache if the cache miss address matches an address tag of a cache line in the requesting cache and the cache line is in an invalid state. A memory-side prediction mechanism can make effective predictions based on observed memory and cache operations that are recorded in a prediction table.
    Type: Grant
    Filed: July 18, 2007
    Date of Patent: March 8, 2011
    Assignee: International Business Machines Corporation
    Inventors: Xiaowei Shen, Jaehyuk Huh, Balaram Sinharoy
  • Patent number: 7676637
    Abstract: In shared-memory multiprocessor systems, cache interventions from different sourcing caches can result in different cache intervention costs. With location-aware cache coherence, when a cache receives a data request, the cache can determine whether sourcing the data from itself will result in a lower cache intervention cost than sourcing the data from another cache. The decision can be made based on appropriate information maintained in the cache or collected from snoop responses from other caches. If the requested data is found in more than one cache, the cache that has, or likely has, the lowest cache intervention cost is generally responsible for supplying the data. The intervention cost can be measured by performance metrics that include, but are not limited to, communication latency, bandwidth consumption, load balance, and power consumption.
    Type: Grant
    Filed: April 27, 2004
    Date of Patent: March 9, 2010
    Assignee: International Business Machines Corporation
    Inventors: Xiaowei Shen, Jaehyuk Huh, Balaram Sinharoy
  • Publication number: 20090024797
    Abstract: The present invention proposes a novel cache residence prediction mechanism that predicts whether the requested data of a cache miss can be found in another cache. The memory controller can use the prediction result to determine whether it should immediately initiate a memory access, or initiate no memory access until a cache snoop response shows that the requested data cannot be supplied by a cache. The cache residence prediction mechanism can be implemented at the cache side, the memory side, or both. A cache-side prediction mechanism can predict that data requested by a cache miss can be found in another cache if the cache miss address matches an address tag of a cache line in the requesting cache and the cache line is in an invalid state. A memory-side prediction mechanism can make effective predictions based on observed memory and cache operations that are recorded in a prediction table.
    Type: Application
    Filed: July 18, 2007
    Publication date: January 22, 2009
    Inventors: Xiaowei Shen, Jaehyuk Huh, Balaram Sinharoy
  • Patent number: 7266642
    Abstract: The present invention proposes a novel cache residence prediction mechanism that predicts whether the requested data of a cache miss can be found in another cache. The memory controller can use the prediction result to determine whether it should immediately initiate a memory access, or initiate no memory access until a cache snoop response shows that the requested data cannot be supplied by a cache. The cache residence prediction mechanism can be implemented at the cache side, the memory side, or both. A cache-side prediction mechanism can predict that data requested by a cache miss can be found in another cache if the cache miss address matches an address tag of a cache line in the requesting cache and the cache line is in an invalid state. A memory-side prediction mechanism can make effective predictions based on observed memory and cache operations that are recorded in a prediction table.
    Type: Grant
    Filed: February 17, 2004
    Date of Patent: September 4, 2007
    Assignee: International Business Machines Corporation
    Inventors: Xiaowei Shen, Jaehyuk Huh, Balaram Sinharoy
  • Publication number: 20050240735
    Abstract: In shared-memory multiprocessor systems, cache interventions from different sourcing caches can result in different cache intervention costs. With location-aware cache coherence, when a cache receives a data request, the cache can determine whether sourcing the data from itself will result in a lower cache intervention cost than sourcing the data from another cache. The decision can be made based on appropriate information maintained in the cache or collected from snoop responses from other caches. If the requested data is found in more than one cache, the cache that has, or likely has, the lowest cache intervention cost is generally responsible for supplying the data. The intervention cost can be measured by performance metrics that include, but are not limited to, communication latency, bandwidth consumption, load balance, and power consumption.
    Type: Application
    Filed: April 27, 2004
    Publication date: October 27, 2005
    Applicant: International Business Machines Corporation
    Inventors: Xiaowei Shen, Jaehyuk Huh, Balaram Sinharoy
  • Publication number: 20050182907
    Abstract: The present invention proposes a novel cache residence prediction mechanism that predicts whether the requested data of a cache miss can be found in another cache. The memory controller can use the prediction result to determine whether it should immediately initiate a memory access, or initiate no memory access until a cache snoop response shows that the requested data cannot be supplied by a cache. The cache residence prediction mechanism can be implemented at the cache side, the memory side, or both. A cache-side prediction mechanism can predict that data requested by a cache miss can be found in another cache if the cache miss address matches an address tag of a cache line in the requesting cache and the cache line is in an invalid state. A memory-side prediction mechanism can make effective predictions based on observed memory and cache operations that are recorded in a prediction table.
    Type: Application
    Filed: February 17, 2004
    Publication date: August 18, 2005
    Applicant: International Business Machines Corporation
    Inventors: Xiaowei Shen, Jaehyuk Huh, Balaram Sinharoy
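
The OTP packet protection described in patent 10862876 (and its publication 20180176202) can be made concrete with a minimal sketch. All names here are hypothetical, and the XOR one-time pad stands in for whatever concrete cipher the patent claims; this is an illustration of the key-storage/encryption/decryption-unit structure, not the claimed hardware:

```python
# Minimal sketch (hypothetical names) of per-link OTP packet protection:
# each pair of nodes in the memory network shares a one-time password (OTP)
# key; the sender's encryption unit XORs outgoing packets with the key, and
# the receiver's decryption unit XORs again with the same key to recover them.

class MemoryNetworkNode:
    def __init__(self):
        self.keys = {}  # key storage unit: target node id -> shared OTP key

    def share_key(self, target, key):
        self.keys[target] = key

    def encrypt(self, target, packet):
        # Encryption unit: XOR each packet byte with the corresponding key
        # byte (the key must be at least as long as the packet).
        key = self.keys[target]
        return bytes(b ^ k for b, k in zip(packet, key))

    def decrypt(self, source, packet):
        # Decryption unit: XOR is self-inverse, so applying the same shared
        # key again restores the original packet.
        return self.encrypt(source, packet)

key = bytes(range(1, 17))  # 16-byte pad, assumed shared out of band
cpu, dram = MemoryNetworkNode(), MemoryNetworkNode()
cpu.share_key("dram", key)
dram.share_key("cpu", key)

plaintext = b"READ 0x1000 len8"
ciphertext = cpu.encrypt("dram", plaintext)
assert ciphertext != plaintext
assert dram.decrypt("cpu", ciphertext) == plaintext
```

Because both endpoints hold the same key, the device plays either role: a processor encrypting a request to a memory device, or a memory device encrypting a response back.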
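
The cache residence prediction idea shared by patents 7266642 and 7904657 can likewise be sketched. This is a toy software model under assumed names, not the claimed circuitry; it shows only the cache-side heuristic from the abstracts, where a tag match against an Invalid-state line predicts that a peer cache holds the data:

```python
# Hypothetical cache-side residence predictor: on a miss, if the missed
# address matches the tag of a line that is present but in the Invalid state
# (i.e. a peer cache likely invalidated our copy when it took ownership),
# predict that the data resides in a peer cache and defer the memory access
# until snooping resolves.

INVALID, SHARED, MODIFIED = "I", "S", "M"

class Cache:
    def __init__(self):
        self.lines = {}  # address tag -> coherence state

    def lookup(self, addr):
        # A hit requires the line to be present in a valid state.
        return self.lines.get(addr, INVALID) != INVALID

    def predict_in_peer_cache(self, addr):
        # Tag match + Invalid state => predict the data is in another cache.
        return addr in self.lines and self.lines[addr] == INVALID

def handle_miss(cache, addr):
    # Memory-controller policy: start the DRAM access immediately unless the
    # predictor expects a peer cache to supply the data via intervention.
    if cache.predict_in_peer_cache(addr):
        return "wait-for-snoop-response"
    return "issue-memory-access-now"

c = Cache()
c.lines[0x40] = MODIFIED   # valid line: a normal hit
c.lines[0x80] = INVALID    # invalidated by a peer's store

assert c.lookup(0x40)
assert not c.lookup(0x80)  # miss on the invalidated line
assert handle_miss(c, 0x80) == "wait-for-snoop-response"
assert handle_miss(c, 0xC0) == "issue-memory-access-now"
```

The memory-side variant in the abstracts would replace `predict_in_peer_cache` with a lookup in a prediction table maintained from observed memory and cache operations.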
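
Finally, the sourcing decision behind location-aware cache coherence (patent 7676637) reduces to a cost-minimizing choice among holders. The topology, latency numbers, and function names below are assumptions for illustration; the patent's cost metric could equally be bandwidth, load balance, or power:

```python
# Hypothetical sketch of location-aware sourcing: when several caches hold
# the requested line, the holder with the lowest estimated intervention cost
# (modeled here as communication latency alone) supplies the data.

def choose_sourcing_cache(requester, holders, latency):
    # latency[(h, r)]: estimated cost of cache h sourcing data to cache r.
    return min(holders, key=lambda h: latency[(h, requester)])

# Assumed 4-core system where caches 0/1 and 2/3 share a cluster:
# intra-cluster interventions are cheap, cross-cluster ones are not.
latency = {(h, r): 10 if h // 2 == r // 2 else 40
           for h in range(4) for r in range(4)}

# Core 3 misses; caches 0 and 2 both hold the line. Cache 2 sits in the
# same cluster as core 3, so it is the cheaper intervention source.
assert choose_sourcing_cache(3, holders=[0, 2], latency=latency) == 2
```

In the patent's terms, the `latency` table stands in for the "appropriate information maintained in the cache or collected from snoop responses" on which the decision is based.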