Patents by Inventor Chang-Hyo Yu

Chang-Hyo Yu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Short, illustrative code sketches of several of the mechanisms described in the abstracts follow the listing.

  • Publication number: 20250138872
    Abstract: A command processor determines whether a command descriptor describing a current command is in a first format or in a second format, wherein the first format includes a source memory address pointing to a memory area in a shared memory having a binary code to be accessed according to a direct memory access (DMA) scheme, and the second format includes one or more object indices, a respective one of the one or more object indices indicating an object in an object database. If the command descriptor describing the current command is in the second format, the command processor converts a format of the command descriptor to the first format, generates one or more task descriptors describing neural network model tasks based on the command descriptor in the first format, and distributes the one or more task descriptors to the one or more neural processors.
    Type: Application
    Filed: January 3, 2025
    Publication date: May 1, 2025
    Inventors: Hongyun Kim, Chang-Hyo Yu, Yoonho Boo
  • Patent number: 12248371
    Abstract: An electronic device comprises a main chiplet including a first memory and at least one sub-chiplet including a second memory, wherein the main chiplet is configured to initialize a first interface for inter-chiplet connection based on first boot firmware stored in the first memory in response to receiving a booting signal, acquire third boot firmware stored in an external memory, initialize a second interface for communication between an external device and the main chiplet based on the third boot firmware, set a configuration for interconnection between the main chiplet and the at least one sub-chiplet, initialize a third memory included in the main chiplet, and load at least one of an application firmware or an operating system to the third memory, and the at least one sub-chiplet is configured to initialize the first interface based on second boot firmware stored in the second memory in response to receiving the booting signal.
    Type: Grant
    Filed: August 19, 2024
    Date of Patent: March 11, 2025
    Assignee: REBELLIONS INC.
    Inventors: Myunghoon Choi, Chang-Hyo Yu
  • Publication number: 20250077467
    Abstract: An electronic device comprising a plurality of chiplets is disclosed. The electronic device comprises a first chiplet that generates a transaction, a second chiplet that receives the transaction, and at least one third chiplet that relays the transaction, wherein the first chiplet determines a route path for the transaction that passes through the at least one third chiplet, and transmits the transaction through the determined route path for the transaction.
    Type: Application
    Filed: August 16, 2024
    Publication date: March 6, 2025
    Inventors: Young-Jae Jin, Chang-Hyo Yu
  • Publication number: 20250077468
    Abstract: The present disclosure relates to a method for communicating between chiplets in a chiplet system. The chiplet system includes a first chiplet and a second chiplet, the first chiplet includes a controller including a protocol layer, and the method includes, by the protocol layer of the first chiplet, receiving first data, by the protocol layer of the first chiplet, receiving conversion information from the second chiplet, and, by the protocol layer of the first chiplet, generating second data based on the received first data and conversion information.
    Type: Application
    Filed: August 16, 2024
    Publication date: March 6, 2025
    Inventors: Young-Jae Jin, Chang-Hyo Yu
  • Publication number: 20250077457
    Abstract: The present disclosure relates to a method for communicating between chiplets in a chiplet system. The chiplet system includes a first chiplet and a second chiplet, and the method includes, by the first chiplet, generating a die-to-die interface flit from a first protocol type transaction based on conversion information, by the first chiplet, transmitting the die-to-die interface flit to the second chiplet, and, by the second chiplet, generating a second protocol type transaction from the die-to-die interface flit based on the conversion information.
    Type: Application
    Filed: August 16, 2024
    Publication date: March 6, 2025
    Inventors: Young-Jae Jin, Chang-Hyo Yu
  • Publication number: 20250068521
    Abstract: An electronic device comprises a main chiplet including a first memory and at least one sub-chiplet including a second memory, wherein the main chiplet is configured to initialize a first interface for inter-chiplet connection based on first boot firmware stored in the first memory in response to receiving a booting signal, acquire third boot firmware stored in an external memory, initialize a second interface for communication between an external device and the main chiplet based on the third boot firmware, set a configuration for interconnection between the main chiplet and the at least one sub-chiplet, initialize a third memory included in the main chiplet, and load at least one of an application firmware or an operating system to the third memory, and the at least one sub-chiplet is configured to initialize the first interface based on second boot firmware stored in the second memory in response to receiving the booting signal.
    Type: Application
    Filed: August 19, 2024
    Publication date: February 27, 2025
    Inventors: Myunghoon Choi, Chang-Hyo Yu
  • Patent number: 12229587
    Abstract: A command processor determines whether a command descriptor describing a current command is in a first format or in a second format, wherein the first format includes a source memory address pointing to a memory area in a shared memory having a binary code to be accessed according to a direct memory access (DMA) scheme, and the second format includes one or more object indices, a respective one of the one or more object indices indicating an object in an object database. If the command descriptor describing the current command is in the second format, the command processor converts a format of the command descriptor to the first format, generates one or more task descriptors describing neural network model tasks based on the command descriptor in the first format, and distributes the one or more task descriptors to the one or more neural processors.
    Type: Grant
    Filed: March 29, 2024
    Date of Patent: February 18, 2025
    Assignee: REBELLIONS INC.
    Inventors: Hongyun Kim, Chang-Hyo Yu, Yoonho Boo
  • Publication number: 20240378157
    Abstract: A neural processing device and a method of updating a translation lookaside buffer thereof are provided. The neural processing device includes at least one processor module each of which includes at least one micro translation lookaside buffer (TLB), a hierarchical memory that is accessed by the at least one micro TLB, and a command processor configured to update the at least one micro TLB in a push mode by generating a first update signal which indicates update of the at least one micro TLB and transmitting the first update signal to the at least one micro TLB.
    Type: Application
    Filed: June 4, 2024
    Publication date: November 14, 2024
    Inventor: Chang-Hyo Yu
  • Publication number: 20240330041
    Abstract: A command processor determines whether a command descriptor describing a current command is in a first format or in a second format, wherein the first format includes a source memory address pointing to a memory area in a shared memory having a binary code to be accessed according to a direct memory access (DMA) scheme, and the second format includes one or more object indices, a respective one of the one or more object indices indicating an object in an object database. If the command descriptor describing the current command is in the second format, the command processor converts a format of the command descriptor to the first format, generates one or more task descriptors describing neural network model tasks based on the command descriptor in the first format, and distributes the one or more task descriptors to the one or more neural processors.
    Type: Application
    Filed: March 29, 2024
    Publication date: October 3, 2024
    Inventors: Hongyun Kim, Chang-Hyo Yu, Yoonho Boo
  • Publication number: 20240311186
    Abstract: A task manager, a neural processing device, and a method for checking task dependencies thereof are provided. The task manager includes a task buffer configured to receive first and second tasks of different first and second types, a first queue configured to receive a first task descriptor for the first task from the task buffer, a second queue configured to receive a second task descriptor for the second task from the task buffer, a dependency checker configured to check dependencies of the first and second task descriptors, a third queue configured to receive the first task descriptor from the dependency checker, and a fourth queue configured to receive the second task descriptor from the dependency checker.
    Type: Application
    Filed: May 22, 2024
    Publication date: September 19, 2024
    Inventors: Wongyu Shin, Miock Chi, Hongyun Kim, Jinseok Kim, Chang-Hyo Yu
  • Publication number: 20240296245
    Abstract: A method for confidential computing is provided, which is performed by a security core including one or more processors, and includes storing first encrypted data associated with a first tenant in a first memory, in which the first encrypted data is obtained by performing encryption of the first plaintext data using a first encryption key associated with the first tenant, in response to receiving a request to access the first plaintext data, decrypting the first encrypted data using the first encryption key so as to generate the first plaintext data, and providing the first plaintext data to a main core that processes data stored in the first memory.
    Type: Application
    Filed: May 8, 2024
    Publication date: September 5, 2024
    Inventors: Myunghoon Choi, Chang-Hyo Yu
  • Publication number: 20240256714
    Abstract: A method for runtime integrity check, performed by a security core including one or more processors, includes storing a first output value, which is generated by using a one-way encryption algorithm based on first data and a first encryption key managed by an encryption key manager accessible by the security core, in a main memory that is a volatile memory in association with the first data, generating a second output value for the first data based on the first data and the first encryption key by using the one-way encryption algorithm, and checking for possible tampering of the first data stored in the main memory by comparing the first output value with the generated second output value.
    Type: Application
    Filed: November 28, 2023
    Publication date: August 1, 2024
    Inventors: Myunghoon Choi, Chang-Hyo Yu
  • Patent number: 12038850
    Abstract: A neural processing device and a method of updating a translation lookaside buffer thereof are provided. The neural processing device includes at least one processor module each of which includes at least one micro translation lookaside buffer (TLB), a hierarchical memory that is accessed by the at least one micro TLB, and a command processor configured to update the at least one micro TLB in a push mode by generating a first update signal which indicates update of the at least one micro TLB and transmitting the first update signal to the at least one micro TLB.
    Type: Grant
    Filed: November 2, 2023
    Date of Patent: July 16, 2024
    Assignee: Rebellions Inc.
    Inventor: Chang-Hyo Yu
  • Patent number: 12026548
    Abstract: A task manager, a neural processing device, and a method for checking task dependencies thereof are provided. The task manager includes a task buffer configured to receive first and second tasks of different first and second types, a first queue configured to receive a first task descriptor for the first task from the task buffer, a second queue configured to receive a second task descriptor for the second task from the task buffer, a dependency checker configured to check dependencies of the first and second task descriptors, a third queue configured to receive the first task descriptor from the dependency checker, and a fourth queue configured to receive the second task descriptor from the dependency checker.
    Type: Grant
    Filed: October 24, 2023
    Date of Patent: July 2, 2024
    Assignee: Rebellions Inc.
    Inventors: Wongyu Shin, Miock Chi, Hongyun Kim, Jinseok Kim, Chang-Hyo Yu
  • Publication number: 20240211410
    Abstract: A neural processing device and a method of updating a translation lookaside buffer thereof are provided. The neural processing device includes at least one processor module each of which includes at least one micro translation lookaside buffer (TLB), a hierarchical memory that is accessed by the at least one micro TLB, and a command processor configured to update the at least one micro TLB in a push mode by generating a first update signal which indicates update of the at least one micro TLB and transmitting the first update signal to the at least one micro TLB.
    Type: Application
    Filed: November 2, 2023
    Publication date: June 27, 2024
    Inventor: Chang-Hyo Yu
  • Patent number: 12008132
    Abstract: A method for confidential computing is provided, which is performed by a security core including one or more processors, and includes storing first encrypted data associated with a first tenant in a first memory, in which the first encrypted data is obtained by performing encryption of the first plaintext data using a first encryption key associated with the first tenant, in response to receiving a request to access the first plaintext data, decrypting the first encrypted data using the first encryption key so as to generate the first plaintext data, and providing the first plaintext data to a main core that processes data stored in the first memory.
    Type: Grant
    Filed: June 20, 2023
    Date of Patent: June 11, 2024
    Assignee: REBELLIONS INC.
    Inventors: Myunghoon Choi, Chang-Hyo Yu
  • Publication number: 20240152392
    Abstract: A task manager, a neural processing device, and a method for checking task dependencies thereof are provided. The task manager includes a task buffer configured to receive first and second tasks of different first and second types, a first queue configured to receive a first task descriptor for the first task from the task buffer, a second queue configured to receive a second task descriptor for the second task from the task buffer, a dependency checker configured to check dependencies of the first and second task descriptors, a third queue configured to receive the first task descriptor from the dependency checker, and a fourth queue configured to receive the second task descriptor from the dependency checker.
    Type: Application
    Filed: October 24, 2023
    Publication date: May 9, 2024
    Inventors: Wongyu Shin, Miock Chi, Hongyun Kim, Jinseok Kim, Chang-Hyo Yu
  • Patent number: 11874953
    Abstract: A method for runtime integrity check, performed by a security core including one or more processors, includes storing a first output value, which is generated by using a one-way encryption algorithm based on first data and a first encryption key managed by an encryption key manager accessible by the security core, in a main memory that is a volatile memory in association with the first data, generating a second output value for the first data based on the first data and the first encryption key by using the one-way encryption algorithm, and checking for possible tampering of the first data stored in the main memory by comparing the first output value with the generated second output value.
    Type: Grant
    Filed: June 20, 2023
    Date of Patent: January 16, 2024
    Assignee: REBELLIONS INC.
    Inventors: Myunghoon Choi, Chang-Hyo Yu
  • Publication number: 20230244920
    Abstract: A neural processing device is provided. The neural processing device comprises a plurality of neural processors, a shared memory shared by the plurality of neural processors, a plurality of semaphore memories, and a global interconnection. The plurality of neural processors generates a plurality of L3 sync targets, respectively. Each semaphore memory is associated with a respective one of the plurality of neural processors, and the plurality of semaphore memories receive and store the plurality of L3 sync targets, respectively. Synchronization of the plurality of neural processors is performed according to the plurality of L3 sync targets. The global interconnection connects the plurality of neural processors with the shared memory, and comprises an L3 sync channel through which an L3 synchronization signal corresponding to at least one L3 sync target is transmitted.
    Type: Application
    Filed: April 11, 2023
    Publication date: August 3, 2023
    Inventors: Jinwook Oh, Jinseok Kim, Kyeongryeol Bong, Wongyu Shin, Chang-Hyo Yu
  • Patent number: 11657261
    Abstract: A neural processing device is provided. The neural processing device comprises a plurality of neural processors, a shared memory shared by the plurality of neural processors, a plurality of semaphore memories, and a global interconnection. The plurality of neural processors generates a plurality of L3 sync targets, respectively. Each semaphore memory is associated with a respective one of the plurality of neural processors, and the plurality of semaphore memories receive and store the plurality of L3 sync targets, respectively. Synchronization of the plurality of neural processors is performed according to the plurality of L3 sync targets. The global interconnection connects the plurality of neural processors with the shared memory, and comprises an L3 sync channel through which an L3 synchronization signal corresponding to at least one L3 sync target is transmitted.
    Type: Grant
    Filed: April 29, 2022
    Date of Patent: May 23, 2023
    Assignee: Rebellions Inc.
    Inventors: Jinwook Oh, Jinseok Kim, Kyeongryeol Bong, Wongyu Shin, Chang-Hyo Yu
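
Illustrative code sketches

The command-descriptor handling described for publication 20250138872 (and the related patent 12229587 and publication 20240330041) can be pictured with the following minimal Python sketch. The descriptor classes, the toy object database, the task split, and the round-robin distribution are assumptions made purely for illustration, not the patented implementation.

```python
# Hypothetical sketch of the two command-descriptor formats and the
# second-to-first conversion step; all names are illustrative.
from dataclasses import dataclass
from itertools import cycle
from typing import List

@dataclass
class FirstFormatDescriptor:
    # Points at binary code in shared memory, to be fetched via DMA.
    source_memory_address: int

@dataclass
class SecondFormatDescriptor:
    # References objects in an object database instead of a raw address.
    object_indices: List[int]

# Toy object database: object index -> address of its binary in shared memory.
OBJECT_DATABASE = {0: 0x1000, 1: 0x2000, 2: 0x3000}

def to_first_format(desc):
    """Convert a second-format descriptor to the first format by resolving
    its first object index into a shared-memory address."""
    if isinstance(desc, FirstFormatDescriptor):
        return desc
    address = OBJECT_DATABASE[desc.object_indices[0]]
    return FirstFormatDescriptor(source_memory_address=address)

def generate_task_descriptors(desc: FirstFormatDescriptor, n_tasks: int):
    """Split the command into per-task descriptors (illustrative only)."""
    return [{"task_id": i, "code_address": desc.source_memory_address}
            for i in range(n_tasks)]

def distribute(task_descriptors, neural_processors):
    """Round-robin distribution of task descriptors to neural processors."""
    assignment = {p: [] for p in neural_processors}
    for task, proc in zip(task_descriptors, cycle(neural_processors)):
        assignment[proc].append(task)
    return assignment

cmd = SecondFormatDescriptor(object_indices=[1])
tasks = generate_task_descriptors(to_first_format(cmd), n_tasks=4)
print(distribute(tasks, ["np0", "np1"]))
```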
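
A rough, non-authoritative model of the boot sequence described for patent 12248371 and publication 20250068521 follows. The class and method names, and any ordering beyond what the abstract states, are assumptions.

```python
# Rough model of the described boot flow for a main chiplet and sub-chiplets.

class SubChiplet:
    def __init__(self, second_boot_firmware):
        self.second_boot_firmware = second_boot_firmware  # held in the second memory
        self.d2d_interface_ready = False

    def on_boot_signal(self):
        # Initialize the first (inter-chiplet) interface from local firmware.
        self.d2d_interface_ready = True

class MainChiplet:
    def __init__(self, first_boot_firmware, external_memory):
        self.first_boot_firmware = first_boot_firmware    # held in the first memory
        self.external_memory = external_memory            # holds the third boot firmware
        self.third_memory = {}                            # e.g. on-package DRAM

    def on_boot_signal(self, sub_chiplets):
        # 1. Bring up the inter-chiplet interface from the first boot firmware.
        d2d_ready = bool(self.first_boot_firmware)
        # 2. Acquire the third boot firmware from external memory.
        third_fw = self.external_memory["third_boot_firmware"]
        # 3. Initialize the host-facing interface from the third boot firmware.
        host_ready = bool(third_fw)
        # 4. Configure the interconnection to each sub-chiplet.
        links = [sub for sub in sub_chiplets if sub.d2d_interface_ready]
        # 5. Initialize the third memory and load application firmware / OS.
        self.third_memory["image"] = third_fw.get("os_image", "app_firmware")
        return d2d_ready and host_ready and len(links) == len(sub_chiplets)

subs = [SubChiplet(second_boot_firmware=b"\x01")]
for s in subs:
    s.on_boot_signal()
main = MainChiplet(first_boot_firmware=b"\x02",
                   external_memory={"third_boot_firmware": {"os_image": "linux"}})
print(main.on_boot_signal(subs))   # True once every step has completed
```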
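
For the transaction routing of publication 20250077467, the sketch below assumes the first chiplet knows the chiplet topology and picks a route path with a breadth-first search; the abstract does not specify how the route path is actually determined, so the algorithm here is only a stand-in.

```python
# Minimal sketch of route-path selection through relay chiplets.
from collections import deque

def determine_route_path(topology, src, dst):
    """Breadth-first search over chiplet links; returns the chiplet sequence
    from src to dst (relay chiplets are the interior nodes)."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for neighbor in topology.get(path[-1], []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None

# First chiplet "c0" reaches second chiplet "c3" via relay chiplets c1 and c2.
topology = {"c0": ["c1"], "c1": ["c0", "c2"], "c2": ["c1", "c3"], "c3": ["c2"]}
path = determine_route_path(topology, "c0", "c3")
transaction = {"payload": b"data", "route": path}   # transmitted hop by hop
print(transaction["route"])   # ['c0', 'c1', 'c2', 'c3']
```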
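
Publication 20250077468 describes a protocol layer that generates second data from received first data plus conversion information supplied by the second chiplet. The sketch below assumes the conversion information is an address offset and an ID remapping; those fields are invented for the example.

```python
# Illustrative protocol layer that rewrites received data using conversion
# information advertised by the peer chiplet; field names are invented.

class ProtocolLayer:
    def __init__(self):
        self.conversion_info = None

    def receive_first_data(self, first_data: dict) -> dict:
        return first_data

    def receive_conversion_info(self, info: dict) -> None:
        # e.g. address offset and ID remapping sent by the second chiplet
        self.conversion_info = info

    def generate_second_data(self, first_data: dict) -> dict:
        info = self.conversion_info
        return {
            "address": first_data["address"] + info["address_offset"],
            "id": info["id_map"].get(first_data["id"], first_data["id"]),
            "payload": first_data["payload"],
        }

layer = ProtocolLayer()
data = layer.receive_first_data({"address": 0x100, "id": 7, "payload": b"\x00"})
layer.receive_conversion_info({"address_offset": 0x8000, "id_map": {7: 1}})
print(layer.generate_second_data(data))
```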
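
Publication 20250077457 describes generating a die-to-die interface flit from a first-protocol transaction and recovering a second-protocol transaction on the receiving chiplet. The fixed flit layout and the opcode tables in this sketch are assumptions.

```python
# Sketch of packing a first-protocol transaction into a die-to-die flit and
# unpacking it as a second-protocol transaction; the layout is assumed.
import struct

CONVERSION_INFO = {"opcode_map": {"READ": 0x1, "WRITE": 0x2},     # protocol A -> flit
                   "reverse_map": {0x1: "rd", 0x2: "wr"}}          # flit -> protocol B

def to_d2d_flit(txn, info):
    """First chiplet: encode opcode, address and length into a fixed-size flit."""
    opcode = info["opcode_map"][txn["opcode"]]
    return struct.pack("<BQI", opcode, txn["address"], txn["length"])

def from_d2d_flit(flit, info):
    """Second chiplet: decode the flit into a second-protocol transaction."""
    opcode, address, length = struct.unpack("<BQI", flit)
    return {"opcode": info["reverse_map"][opcode],
            "address": address, "length": length}

flit = to_d2d_flit({"opcode": "WRITE", "address": 0x4000, "length": 64},
                   CONVERSION_INFO)
print(from_d2d_flit(flit, CONVERSION_INFO))   # {'opcode': 'wr', ...}
```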
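
The push-mode TLB update of publication 20240378157 (also patent 12038850 and publication 20240211410) might look roughly like this: the command processor generates an update signal and pushes it to every micro TLB rather than each micro TLB pulling translations on a miss. The mapping format is an assumption.

```python
# Toy model of a command processor pushing TLB updates to micro TLBs.

class MicroTLB:
    def __init__(self):
        self.entries = {}          # virtual page -> physical page

    def apply_update(self, update_signal):
        self.entries.update(update_signal["mappings"])

class CommandProcessor:
    def __init__(self, micro_tlbs):
        self.micro_tlbs = micro_tlbs

    def push_update(self, mappings):
        # First update signal: indicates which translations changed.
        update_signal = {"kind": "tlb_update", "mappings": mappings}
        for tlb in self.micro_tlbs:
            tlb.apply_update(update_signal)   # pushed, not requested

tlbs = [MicroTLB() for _ in range(4)]
CommandProcessor(tlbs).push_update({0x10: 0xA0, 0x11: 0xA1})
print(tlbs[2].entries)   # every micro TLB now holds the new mappings
```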
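
A toy data path for the task manager of publication 20240311186 (also patent 12026548 and publication 20240152392): a task buffer feeds two per-type input queues, and a dependency checker releases descriptors into two per-type output queues. The dependency rule used here (all listed predecessor IDs completed) is an assumption.

```python
# Illustrative task-manager data path; queue names and the dependency rule
# are assumptions for the sketch.
from collections import deque

task_buffer = deque()
first_queue, second_queue = deque(), deque()    # per-type input queues
third_queue, fourth_queue = deque(), deque()    # per-type output queues
completed = set()                               # IDs whose work has finished

def enqueue_from_buffer():
    while task_buffer:
        task = task_buffer.popleft()
        descriptor = {"id": task["id"], "deps": task.get("deps", [])}
        (first_queue if task["type"] == 1 else second_queue).append(descriptor)

def dependency_check():
    # Forward a descriptor only when everything it depends on has completed.
    for in_q, out_q in ((first_queue, third_queue), (second_queue, fourth_queue)):
        still_blocked = deque()
        while in_q:
            desc = in_q.popleft()
            (out_q if all(d in completed for d in desc["deps"])
             else still_blocked).append(desc)
        in_q.extend(still_blocked)

task_buffer.extend([{"id": "a", "type": 1},
                    {"id": "b", "type": 2, "deps": ["a"]}])
enqueue_from_buffer()
dependency_check()
print(list(third_queue), list(fourth_queue))   # 'a' released, 'b' still waiting
completed.add("a")
dependency_check()
print(list(fourth_queue))                      # 'b' released after 'a' completes
```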
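
For the per-tenant confidential-computing flow of publication 20240296245 and patent 12008132, the sketch below uses Fernet from the third-party `cryptography` package purely as a stand-in cipher; the key manager, class names, and storage layout are assumptions.

```python
# Sketch of the per-tenant encryption flow handled by a security core.
# Fernet is only a stand-in for whichever cipher is actually used.
from cryptography.fernet import Fernet

class SecurityCore:
    def __init__(self):
        # Encryption key manager: one key per tenant, visible only to this core.
        self._tenant_keys = {}
        self.first_memory = {}          # stores only ciphertext

    def store(self, tenant_id, plaintext: bytes):
        key = self._tenant_keys.setdefault(tenant_id, Fernet.generate_key())
        self.first_memory[tenant_id] = Fernet(key).encrypt(plaintext)

    def handle_access_request(self, tenant_id) -> bytes:
        # Decrypt with the tenant's key and hand the plaintext to the main core.
        key = self._tenant_keys[tenant_id]
        return Fernet(key).decrypt(self.first_memory[tenant_id])

core = SecurityCore()
core.store("tenant-1", b"first plaintext data")
print(core.handle_access_request("tenant-1"))   # what the main core would process
```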
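
The runtime integrity check of publication 20240256714 and patent 11874953 pairs data with the output of a one-way algorithm keyed by a first encryption key, then recomputes and compares. An HMAC is used below as a plausible stand-in for that algorithm; the surrounding structure is illustrative.

```python
# Sketch of the keyed one-way check: store a tag with the data, recompute it
# later, and compare the two values to detect tampering.
import hmac, hashlib

key_manager = {"first_encryption_key": b"key-known-only-to-the-security-core"}
main_memory = {}   # volatile memory holding the data and its first output value

def store_with_tag(name, first_data: bytes):
    tag = hmac.new(key_manager["first_encryption_key"],
                   first_data, hashlib.sha256).digest()
    main_memory[name] = {"data": first_data, "first_output_value": tag}

def check_integrity(name) -> bool:
    entry = main_memory[name]
    second_output_value = hmac.new(key_manager["first_encryption_key"],
                                   entry["data"], hashlib.sha256).digest()
    return hmac.compare_digest(entry["first_output_value"], second_output_value)

store_with_tag("fw", b"firmware image")
print(check_integrity("fw"))          # True: values match, no tampering detected
main_memory["fw"]["data"] = b"tampered image"
print(check_integrity("fw"))          # False: recomputed value no longer matches
```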
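
For the L3 synchronization of publication 20230244920 and patent 11657261, the sketch assumes a simple counter per semaphore memory: each neural processor stores its L3 sync target, and each signal on the L3 sync channel increments every counter until all targets are satisfied. The counting scheme is an assumption made for the sketch.

```python
# Toy model of L3 synchronization via per-processor semaphore memories.

class SemaphoreMemory:
    def __init__(self):
        self.sync_target = 0   # how many sync events this processor expects
        self.sync_count = 0    # how many have arrived on the L3 sync channel

    def satisfied(self):
        return self.sync_count >= self.sync_target

class L3SyncChannel:
    def __init__(self, semaphore_memories):
        self.semaphore_memories = semaphore_memories

    def broadcast_sync_signal(self):
        for mem in self.semaphore_memories:
            mem.sync_count += 1
        return all(mem.satisfied() for mem in self.semaphore_memories)

# One semaphore memory per neural processor; each processor stores its target.
memories = [SemaphoreMemory() for _ in range(3)]
for mem, target in zip(memories, (1, 2, 1)):
    mem.sync_target = target

channel = L3SyncChannel(memories)
print(channel.broadcast_sync_signal())   # False: one processor still expects more
print(channel.broadcast_sync_signal())   # True: all sync targets now satisfied
```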