Patents by Inventor Uiseok SONG

Uiseok SONG has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240143463
    Abstract: An apparatus includes a processor configured to execute instructions, and a memory storing the instructions which, when executed by the processor, configure the processor to generate system error prediction data using an error prediction neural network provided with one of a plurality of log data sequences generated by pre-processing a plurality of log data pieces of component log data of a system. The system error prediction data comprises information on a plurality of system errors occurring at a plurality of respective timepoints.
    Type: Application
    Filed: June 22, 2023
    Publication date: May 2, 2024
    Applicants: Samsung Electronics Co., Ltd., Korea University Research and Business Foundation
    Inventors: Uiseok SONG, Seoung Bum KIM, Jaehoon KIM, Byungwoo BANG, Junyeon LEE, Jiyoon LEE, Yoon Sang CHO, Hansam CHO, Minkyu KIM, Hun Seong CHOI
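
A minimal Python sketch of the error-prediction idea in the entry above (20240143463), assuming an LSTM encoder over pre-processed log feature vectors and a sigmoid head that scores several error types at several future timepoints; the class name, dimensions, and architecture are illustrative assumptions rather than the patented design.

```python
# Hypothetical sketch (not the patented implementation): a network that maps a
# pre-processed sequence of log feature vectors to predicted probabilities for
# several error types at several future timepoints.
import torch
import torch.nn as nn

class ErrorPredictionNet(nn.Module):
    def __init__(self, log_feat_dim=64, hidden_dim=128,
                 num_error_types=8, num_future_steps=4):
        super().__init__()
        self.encoder = nn.LSTM(log_feat_dim, hidden_dim, batch_first=True)
        # One score per (future timepoint, error type) pair.
        self.head = nn.Linear(hidden_dim, num_future_steps * num_error_types)
        self.num_future_steps = num_future_steps
        self.num_error_types = num_error_types

    def forward(self, log_seq):           # log_seq: (batch, seq_len, log_feat_dim)
        _, (h_n, _) = self.encoder(log_seq)
        logits = self.head(h_n[-1])       # (batch, steps * types)
        logits = logits.view(-1, self.num_future_steps, self.num_error_types)
        return torch.sigmoid(logits)      # error probability per timepoint and type

# One pre-processed log data sequence of 32 log feature vectors.
model = ErrorPredictionNet()
pred = model(torch.randn(1, 32, 64))      # shape (1, 4, 8)
```
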
  • Publication number: 20240119297
    Abstract: A processor-implemented method with checkpointing includes: performing an operation for learning of an artificial neural network (ANN) model; and performing checkpointing to store information about a state of the ANN model, simultaneously with performing the operation for the learning of the ANN model.
    Type: Application
    Filed: February 3, 2023
    Publication date: April 11, 2024
    Applicants: Samsung Electronics Co., Ltd., Seoul National University R&DB Foundation
    Inventors: Junyeon LEE, Jin-soo KIM, Seongyeop JEONG, Uiseok SONG, Byungwoo BANG, Wooseok CHANG, Hun Seong CHOI
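
A minimal sketch of checkpointing overlapped with training, in the spirit of the entry above (20240119297): the model state is snapshotted and written to storage on a background thread while subsequent training steps proceed. The file names, interval, and toy training loop are assumptions for illustration.

```python
# Hypothetical sketch: checkpointing overlapped with training. The in-memory
# state is snapshotted, then persisted on a background thread while the
# training loop keeps running. Paths and intervals are assumptions.
import copy
import pickle
import threading

def async_checkpoint(state, path):
    snapshot = copy.deepcopy(state)            # capture a consistent state
    def _write():
        with open(path, "wb") as f:
            pickle.dump(snapshot, f)           # persist while training continues
    t = threading.Thread(target=_write, daemon=True)
    t.start()
    return t

# Toy training loop: the checkpoint write overlaps with the following steps.
state = {"step": 0, "weights": [0.0] * 1000}
for step in range(1, 101):
    state["step"] = step
    state["weights"] = [w + 0.01 for w in state["weights"]]  # stand-in for a train step
    if step % 25 == 0:
        async_checkpoint(state, f"ckpt_{step}.pkl")
```
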
  • Publication number: 20230281081
    Abstract: An electronic device and method with on-demand accelerator checkpointing are provided. In one general aspect, an electronic device includes a host processor, and an accelerator configured to operate according to instructions transmitted by the host processor to the accelerator, wherein a memory of the host processor and a memory of the accelerator are checkpointed to a storage at respective different intervals, and in response to a determination that a failure has occurred in the host processor, the memory of the accelerator is checkpointed to the storage.
    Type: Application
    Filed: November 21, 2022
    Publication date: September 7, 2023
    Applicant: Samsung Electronics Co., Ltd.
    Inventors: Jaehyung AHN, Seongbeom KIM, Byungwoo BANG, Uiseok SONG, Junyeon LEE, Wooseok CHANG, Hun Seong CHOI
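
A minimal sketch of the checkpointing policy in the entry above (20230281081), assuming the host memory and accelerator memory are checkpointed on different fixed intervals and the accelerator memory is additionally checkpointed on demand when a host failure is detected; the class, file names, and intervals are illustrative assumptions.

```python
# Hypothetical sketch: host and accelerator memory are checkpointed on
# different intervals, and the accelerator memory is checkpointed on demand
# when a host failure is detected. Intervals and file names are assumptions.
import pickle
import time

class CheckpointManager:
    def __init__(self, host_interval_s=10.0, accel_interval_s=60.0):
        self.host_interval_s = host_interval_s
        self.accel_interval_s = accel_interval_s
        self.last_host_ckpt = float("-inf")
        self.last_accel_ckpt = float("-inf")

    def _write(self, path, memory):
        with open(path, "wb") as f:
            pickle.dump(memory, f)

    def tick(self, host_memory, accel_memory, host_failed=False):
        now = time.monotonic()
        if now - self.last_host_ckpt >= self.host_interval_s:
            self._write("host.ckpt", host_memory)      # periodic host checkpoint
            self.last_host_ckpt = now
        if now - self.last_accel_ckpt >= self.accel_interval_s:
            self._write("accel.ckpt", accel_memory)    # periodic accelerator checkpoint
            self.last_accel_ckpt = now
        if host_failed:
            # On-demand accelerator checkpoint triggered by a host failure.
            self._write("accel_on_demand.ckpt", accel_memory)

mgr = CheckpointManager()
mgr.tick(host_memory={"step": 1}, accel_memory={"weights": [0.0] * 4})
```
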
  • Publication number: 20230076106
    Abstract: A method and apparatus with cosmic ray fault protection are provided. A method includes obtaining cosmic ray information indicating at least one cosmic ray event, determining a soft error mitigation policy based on the cosmic ray information, accessing the soft error mitigation policy by a device, and based on the soft error mitigation policy, performing, by the device, a mitigation action that mitigates soft errors related to the cosmic ray event.
    Type: Application
    Filed: August 5, 2022
    Publication date: March 9, 2023
    Applicant: Samsung Electronics Co., Ltd.
    Inventors: Byungwoo BANG, Seongbeom KIM, Uiseok SONG, Jaehyung AHN, Junyeon LEE, Wooseok CHANG
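
A minimal sketch of the flow in the entry above (20230076106): cosmic-ray event information is mapped to a soft-error mitigation policy, and the device applies a corresponding mitigation action. The flux levels, policy names, and actions below are illustrative assumptions only.

```python
# Hypothetical sketch: cosmic-ray event information is mapped to a soft-error
# mitigation policy, which the device turns into a mitigation action. The
# flux levels, policy names, and actions are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CosmicRayInfo:
    flux_level: str                      # e.g. "low", "elevated", "severe"

def determine_policy(info: CosmicRayInfo) -> str:
    return {
        "low": "ecc_scrub_normal",
        "elevated": "ecc_scrub_aggressive",
        "severe": "checkpoint_and_redundant_execution",
    }.get(info.flux_level, "ecc_scrub_normal")

def apply_mitigation(policy: str) -> None:
    # Stand-in for device-side actions such as more frequent memory scrubbing,
    # extra checkpoints, or redundant re-execution of critical work.
    print(f"applying mitigation action for policy: {policy}")

apply_mitigation(determine_policy(CosmicRayInfo(flux_level="elevated")))
```
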
  • Publication number: 20220083838
    Abstract: A method includes predicting, for sets of input data, an input data number of a subsequent interval of a first interval using an input data number of the first interval and an input data number of a previous interval of the first interval set in a neural network inference optimization, determining the predicted input data number to be a batch size of the subsequent interval, determining whether pipelining is to be performed in a target device based on a resource state of the target device, and applying, to the target device, an inference policy including the determined batch size and a result of the determining of whether the pipelining is to be performed.
    Type: Application
    Filed: April 29, 2021
    Publication date: March 17, 2022
    Applicant: Samsung Electronics Co., Ltd.
    Inventors: Uiseok SONG, Sanggyu SHIN
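
A minimal sketch of the batching policy in the entry above (20220083838), assuming a simple linear extrapolation of the per-interval input-data count and a fixed utilization threshold for the pipelining decision; both choices, and all names below, are assumptions rather than the patented method.

```python
# Hypothetical sketch: the input-data count of the next interval is
# extrapolated from the current and previous intervals, used as the batch
# size, and combined with a resource-based pipelining decision. The linear
# extrapolation and the 50% utilization threshold are assumptions.
from dataclasses import dataclass

@dataclass
class InferencePolicy:
    batch_size: int
    use_pipelining: bool

def predict_next_count(prev_count: int, curr_count: int) -> int:
    # Simple linear extrapolation of the per-interval input data number.
    return max(1, curr_count + (curr_count - prev_count))

def build_policy(prev_count: int, curr_count: int,
                 device_utilization: float) -> InferencePolicy:
    batch_size = predict_next_count(prev_count, curr_count)
    use_pipelining = device_utilization < 0.5   # pipeline only with spare resources
    return InferencePolicy(batch_size=batch_size, use_pipelining=use_pipelining)

print(build_policy(prev_count=80, curr_count=100, device_utilization=0.35))
# InferencePolicy(batch_size=120, use_pipelining=True)
```
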
  • Publication number: 20210216864
    Abstract: A processor-implemented neural network model reconfiguration method is provided. The method calculates required resource information of each of a plurality of layers of a neural network model, determines whether a layer of the plurality of layers is a reconfiguration target layer based on the required resource information and hardware information, determines reconfiguration layers, with respect to the reconfiguration target layer, based on required resource information of the reconfiguration target layer and the hardware information, selects one of the reconfiguration target layer and the determined reconfiguration layers based on performance information of the reconfiguration target layer and performance information of each of the determined reconfiguration layers, and implements the selected layer in the neural network model.
    Type: Application
    Filed: July 9, 2020
    Publication date: July 15, 2021
    Applicant: Samsung Electronics Co., Ltd.
    Inventor: Uiseok SONG
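
A minimal sketch of the selection logic in the entry above (20210216864), assuming layers whose required memory exceeds a hardware budget are treated as reconfiguration targets, candidate replacement layers are generated for them, and the best-performing option (original or candidate) is kept; the memory figures, candidate generator, and latency-based scoring are illustrative assumptions.

```python
# Hypothetical sketch: layers whose required memory exceeds the hardware budget
# are treated as reconfiguration targets; candidate replacement layers are
# generated, and the option (original or candidate) with the best estimated
# performance is kept. All numbers and the candidate generator are assumptions.
from dataclasses import dataclass

@dataclass
class LayerSpec:
    name: str
    required_mem_mb: float
    est_latency_ms: float

def candidate_reconfigurations(layer):
    # Assumed transformation: replace the layer with a smaller split variant.
    return [LayerSpec(layer.name + "_split", layer.required_mem_mb / 2,
                      layer.est_latency_ms * 0.6)]

def reconfigure(layers, hw_mem_budget_mb):
    result = []
    for layer in layers:
        if layer.required_mem_mb <= hw_mem_budget_mb:
            result.append(layer)                  # not a reconfiguration target
            continue
        candidates = candidate_reconfigurations(layer)
        # Select among the target layer and its candidates by estimated latency.
        best = min([layer] + candidates, key=lambda l: l.est_latency_ms)
        result.append(best)
    return result

model = [LayerSpec("conv1", 128, 2.0), LayerSpec("fc_big", 1024, 9.0)]
print(reconfigure(model, hw_mem_budget_mb=512))
```
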