Patents by Inventor Ji-Hoon NAM
Ji-Hoon NAM has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240117941
Abstract: A hydrogen storage system is disclosed that includes a storage unit including a plurality of unit storage containers, which are connected to each other in parallel and each of which contains a metal hydride material in its interior, and a thermal fluid line defining a thermal fluid passage that passes through the plurality of unit storage containers in sequence and through which a thermal fluid flows to heat or cool the unit storage containers, thereby enhancing hydrogen storage performance and efficiency.
Type: Application
Filed: March 10, 2023
Publication date: April 11, 2024
Applicants: HYUNDAI MOTOR COMPANY, KIA CORPORATION
Inventors: Ji Hye Park, Won Jung Kim, Kyung Moon Lee, Dong Hoon Nam, Young Jin Cho, Byeong Soo Shin, Ji Hoon Lee, Suk Hoon Hong, Hoon Mo Park, Yong Doo Son
-
Publication number: 20240117930
Abstract: A hydrogen storage device includes a storage container having an accommodation space in its interior, a first metal hydride material provided in the interior of the storage container that stores hydrogen, and a second metal hydride material provided in the interior of the storage container that stores the hydrogen at a pressure different from that of the first metal hydride material. This restrains an excessive rise in the pressure of the storage container and enhances safety and reliability.
Type: Application
Filed: March 10, 2023
Publication date: April 11, 2024
Applicants: HYUNDAI MOTOR COMPANY, KIA CORPORATION
Inventors: Ji Hye Park, Won Jung Kim, Kyung Moon Lee, Dong Hoon Nam, Young Jin Cho, Byeong Soo Shin, Ji Hoon Lee, Suk Hoon Hong, Hoon Mo Park, Yong Doo Son
-
Publication number: 20230097363
Abstract: A data processing system may include: a controller configured to receive a neural network processing request from a host device; a processing memory including: one or more sub arrays each including memory cells coupled between row lines and column lines; multiplexers (MUXs) provided for respective column line groups, which are configured by grouping the column lines by a preset number; and analog-to-digital converters (ADCs) coupled to the respective MUXs; and a deserializer. The deserializer is configured to receive, from the controller, data to be stored in a selected sub array and a first column address at which the data is to be stored, and remap the first column address to a second column address such that the data is distributed and stored in the memory cells coupled to the column line groups, in order to store the data in the sub array.
Type: Application
Filed: April 14, 2022
Publication date: March 30, 2023
Inventors: Ji Hoon NAM, Joo Young KIM
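The column-address remapping in this abstract can be illustrated with a small interleaving sketch. The modulo scheme, group sizes, and function names below are assumptions for illustration, not the patented mapping:

```python
def remap_column(first_addr: int, num_groups: int, cols_per_group: int) -> int:
    """Map a linear (first) column address to a second address so that
    consecutive writes land in different column-line groups."""
    group = first_addr % num_groups      # rotate across MUX/ADC groups
    offset = first_addr // num_groups    # position within the chosen group
    if offset >= cols_per_group:
        raise ValueError("address out of range")
    return group * cols_per_group + offset
```

With 4 groups of 4 columns, addresses 0–3 land in four different groups, so each group's ADC sees an even share of the stored data, and the mapping remains a bijection over all 16 columns.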
-
Publication number: 20230040775
Abstract: A semiconductor memory apparatus may include: a data adjusting circuit configured to conditionally adjust a weight data value for a MAC (Multiplication and ACcumulation) operation based on comparing the weight data value to a reference data value, and to generate flag information indicating whether the weight data value has been adjusted; a memory cell array circuit configured to store the adjusted weight data value outputted from the data adjusting circuit; and a data calculation circuit configured to recover, based on the flag information, a result based on the weight data value from a result based on the adjusted weight data value in order to perform the MAC operation on an input data value and the weight data value.
Type: Application
Filed: November 23, 2021
Publication date: February 9, 2023
Inventors: Ji Hoon NAM, Sang Eun JE
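The adjust-then-recover flow described above can be sketched in a few lines: weights meeting a reference value are stored reduced by a fixed offset with a flag set, and the MAC result is corrected afterward using the flags. The reference value, offset, and function names are illustrative assumptions, not values from the application:

```python
def adjust_weights(weights, ref, offset):
    """Store flagged weights reduced by `offset`; the flags record which
    weights were adjusted so the result can be recovered later."""
    flags = [w >= ref for w in weights]
    adjusted = [w - offset if f else w for w, f in zip(weights, flags)]
    return adjusted, flags

def mac_with_recovery(inputs, adjusted, flags, offset):
    """MAC over the adjusted weights, then add back the contribution that
    the adjustment removed, guided by the flag information."""
    partial = sum(x * w for x, w in zip(inputs, adjusted))
    correction = offset * sum(x for x, f in zip(inputs, flags) if f)
    return partial + correction
```

The recovered result equals the MAC over the original weights, while the stored (adjusted) values stay within a narrower range.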
-
Publication number: 20220249240
Abstract: Proposed is a bearing component for an artificial knee joint. The bearing component includes a body part with an oval plan shape, having an indentation portion formed by depressing the posterior center to a predetermined depth toward the center of the body part; a protruding portion protruding from an upper surface of the body part and introduced into an opening of a femoral component; a coupling portion provided on a lower surface of the body part and having an engagement surface of a certain height that forms a step with the outer circumferential surface of the body part, the engagement surface being formed on the left and right sides of the coupling portion but not on the indentation surface; and a fastening portion having a plurality of coupling protrusions formed in the portion where the engagement surface of the coupling portion is not formed.
Type: Application
Filed: January 4, 2022
Publication date: August 11, 2022
Inventors: Kyoung Tak KANG, Ji Hoon NAM, Yong Gon KOH
-
Publication number: 20220076115
Abstract: Devices and methods for improving the performance of a data processing system that receives input data comprising training data for a neural network are described. An example system includes a plurality of accelerators, each of which is configured to perform a plurality of epoch segment processes, share, after performing at least one of the plurality of epoch segment processes, gradient data associated with a loss function with other accelerators, and update a weight of the neural network based on the gradient data. In some embodiments, each of the plurality of accelerators is further configured to adjust a precision of the gradient data based on at least one of a variance of the gradient data for the input data and a total number of the plurality of epoch segment processes, and transmit precision-adjusted gradient data to the other accelerators.
Type: Application
Filed: February 5, 2021
Publication date: March 10, 2022
Inventor: Ji Hoon NAM
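The precision adjustment described above can be sketched as variance-driven fake quantization before the gradients are shared. The variance threshold, bit-widths, and function names are illustrative assumptions, not values from the application:

```python
import numpy as np

def share_gradients(grad: np.ndarray, num_segments: int,
                    var_threshold: float = 1e-3,
                    low_bits: int = 8, high_bits: int = 16):
    """Pick a quantization bit-width from the gradient variance and the
    number of epoch segments, then fake-quantize before transmission."""
    high_precision = np.var(grad) > var_threshold or num_segments <= 2
    bits = high_bits if high_precision else low_bits
    # symmetric uniform quantization over the gradient's dynamic range
    scale = (2 ** (bits - 1) - 1) / (np.abs(grad).max() + 1e-12)
    quantized = np.round(grad * scale)
    return quantized / scale, bits
```

Low-variance gradients travel at reduced precision (fewer bits on the interconnect) while high-variance gradients keep their full precision.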
-
Publication number: 20220067494
Abstract: Accelerating devices, data storing devices, data processing systems, and operating methods of accelerating devices are disclosed. In one aspect, an accelerating device includes an accelerator. The accelerator processes a calculation by using a calculation method selected based on at least one of a batch size and a sequence size, and by controlling an input path of first input data and second input data to the processing element array according to the selected calculation method. The calculation method includes a first calculation method and a second calculation method: under the first calculation method, the first input data and the second input data are input to the processing element array based on the batch size; under the second calculation method, the first input data is input to the processing element array based on the sequence size.
Type: Application
Filed: January 28, 2021
Publication date: March 3, 2022
Inventor: Ji Hoon NAM
-
Publication number: 20220058157
Abstract: A data processing system including a shared memory; a host processor configured to possess an ownership of the shared memory, and process a first task by accessing the shared memory; a processor configured to possess the ownership transferred from the host processor, and process a second task by accessing the shared memory; and a memory controller coupled among the host processor, the processor, and the shared memory, and configured to allow the host processor or the processor to access the shared memory according to the ownership.
Type: Application
Filed: October 26, 2021
Publication date: February 24, 2022
Applicant: SK hynix Inc.
Inventors: Ji Hoon NAM, Eui Cheol LIM
-
Patent number: 11169953
Abstract: A data processing system including a shared memory; a host processor configured to possess an ownership of the shared memory, and process a first task by accessing the shared memory; a processor configured to possess the ownership transferred from the host processor, and process a second task by accessing the shared memory; and a memory controller coupled among the host processor, the processor, and the shared memory, and configured to allow the host processor or the processor to access the shared memory according to the ownership.
Type: Grant
Filed: December 5, 2018
Date of Patent: November 9, 2021
Assignee: SK hynix Inc.
Inventors: Ji Hoon Nam, Eui Cheol Lim
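The ownership scheme in this abstract can be modeled with a small controller that gates every access on the current owner. The class and method names are assumptions for illustration; only the host/processor/ownership roles come from the abstract:

```python
class SharedMemoryController:
    """Grants shared-memory access only to the current owner; the host
    processor starts as owner and may transfer ownership to the processor."""
    def __init__(self, size: int):
        self.mem = bytearray(size)
        self.owner = "host"

    def transfer_ownership(self, current: str, new_owner: str):
        if current != self.owner:
            raise PermissionError("only the current owner can transfer ownership")
        self.owner = new_owner

    def write(self, requester: str, addr: int, data: bytes):
        if requester != self.owner:
            raise PermissionError("requester does not own the shared memory")
        self.mem[addr:addr + len(data)] = data

    def read(self, requester: str, addr: int, length: int) -> bytes:
        if requester != self.owner:
            raise PermissionError("requester does not own the shared memory")
        return bytes(self.mem[addr:addr + length])
```

After the host finishes its task and hands over ownership, the processor can read the host's results in place, with no copy between separate memories.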
-
Patent number: 11100016
Abstract: A data processing system includes a host device and a data storage device. The host device is configured to select a speed mode of a memory bandwidth based on a network model, or a batch size, or both. The data storage device includes an accelerator configured to change a structure of a processing element (PE) array by controlling transmission paths of first input data and second input data that are input to the PE array based on the speed mode of the memory bandwidth. Computing power and memory power of the accelerator are adjusted according to the selection of the speed mode.
Type: Grant
Filed: December 5, 2019
Date of Patent: August 24, 2021
Assignee: SK hynix Inc.
Inventor: Ji Hoon Nam
-
Publication number: 20210256360
Abstract: A calculation circuit may include a plurality of calculator groups constituting a systolic array composed of a plurality of rows and columns, wherein calculator groups included in each of the rows propagate a data value set through a single data path corresponding to the row in a data propagation direction, and propagate a plurality of drain value sets through a plurality of drain paths corresponding to the row in a drain propagation direction, and wherein a calculator group of the calculator groups included in each of the rows comprises a plurality of MAC (Multiplier-Accumulator) circuits, and the MAC circuits generate drain values respectively included in the drain value sets at the same time. The calculator groups included in each column may further propagate a weight value set corresponding to the column through a plurality of weight data paths corresponding to the column.
Type: Application
Filed: July 15, 2020
Publication date: August 19, 2021
Inventor: Ji Hoon NAM
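The MAC-based systolic behavior underlying arrays like this one can be modeled with a short cycle-by-cycle simulation. This is a generic output-stationary sketch of a systolic MAC array, not the patented calculator-group circuit; the timing skew is captured by the wavefront index:

```python
import numpy as np

def systolic_matmul(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Behavioral model of an output-stationary systolic MAC array:
    PE (i, j) receives a[i, k] and b[k, j] on cycle t = i + j + k, so
    operands travel as a skewed wavefront and all PEs on one
    anti-diagonal fire in the same cycle."""
    M, K = A.shape
    K2, N = B.shape
    assert K == K2
    acc = np.zeros((M, N), dtype=A.dtype)
    for t in range(M + N + K - 2):          # total pipeline cycles
        for i in range(M):
            for j in range(N):
                k = t - i - j               # operand pair arriving at PE (i, j)
                if 0 <= k < K:
                    acc[i, j] += A[i, k] * B[k, j]
    return acc
```

After the pipeline drains, each PE holds one element of the product, matching an ordinary matrix multiplication.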
-
Patent number: 11093418
Abstract: A device includes a first memory of a first type including a plurality of memory dice that are stacked vertically, a second memory of a second type, and a controller die that transfers first data between the first memory and a first internal memory of a processor using a first interface, and that transfers second data between the second memory and a second internal memory of the processor using a second interface. The first and second memory types are different types of memories. The first and second interfaces are different from each other. The first and second internal memories are different from each other.
Type: Grant
Filed: July 11, 2019
Date of Patent: August 17, 2021
Assignee: SK hynix Inc.
Inventor: Ji Hoon Nam
-
Patent number: 11068283
Abstract: A semiconductor apparatus may include a storage device including a data area and a code area and storing program codes provided from a host device in the code area, a plurality of unit processors, each of the plurality of unit processors including an internal memory, and a main control component configured to receive an operation policy, which includes a processor ID, a code ID, and a code address, from the host device and to control the plurality of unit processors based on the operation policy. The processor ID is an identifier for each of the plurality of unit processors, the code ID is an identifier for each of the program codes, and the code address indicates a position of the code area where each of the program codes is stored.
Type: Grant
Filed: February 7, 2019
Date of Patent: July 20, 2021
Assignee: SK hynix Inc.
Inventor: Ji Hoon Nam
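The operation-policy dispatch described above can be sketched as a control component that resolves each (processor ID, code ID, code address) triple and loads the referenced program code into the target unit processor's internal memory. The policy tuple layout, class, and method names are assumptions; only the three fields come from the abstract:

```python
class MainControl:
    """Sketch of policy-driven code dispatch to unit processors."""
    def __init__(self, code_area: dict):
        self.code_area = code_area   # code_address -> program code bytes
        self.internal = {}           # processor_id -> (code_id, loaded code)

    def apply_policy(self, policy):
        for processor_id, code_id, code_address in policy:
            code = self.code_area[code_address]           # fetch from storage
            self.internal[processor_id] = (code_id, code)  # load into processor
```

The host only ships the small policy table; each unit processor then runs the code already resident in the storage device's code area.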
-
Patent number: 10949380
Abstract: A processing system may include a systolic array including a plurality of processing elements (PEs) arrayed in M rows and N columns, where M and N are natural numbers and M is not equal to N. The processing system may further include a row buffer configured to transmit row data to the systolic array in a row direction, and a column buffer configured to transmit column data to the systolic array in a column direction. When the processing system is in a first mode, the row data is the input data and the column data is the weights. When the processing system is in a second mode, the row data is the weights and the column data is the input data.
Type: Grant
Filed: November 14, 2019
Date of Patent: March 16, 2021
Assignee: SK hynix Inc.
Inventor: Ji-Hoon Nam
-
Publication number: 20210011860
Abstract: A data processing system includes a host device and a data storage device. The host device is configured to select a speed mode of a memory bandwidth based on a network model, or a batch size, or both. The data storage device includes an accelerator configured to change a structure of a processing element (PE) array by controlling transmission paths of first input data and second input data that are input to the PE array based on the speed mode of the memory bandwidth. Computing power and memory power of the accelerator are adjusted according to the selection of the speed mode.
Type: Application
Filed: December 5, 2019
Publication date: January 14, 2021
Inventor: Ji Hoon NAM
-
Publication number: 20200285605
Abstract: A processing system may include a systolic array including a plurality of processing elements (PEs) arrayed in M rows and N columns, where M and N are natural numbers and M is not equal to N. The processing system may further include a row buffer configured to transmit row data to the systolic array in a row direction, and a column buffer configured to transmit column data to the systolic array in a column direction. When the processing system is in a first mode, the row data is the input data and the column data is the weights. When the processing system is in a second mode, the row data is the weights and the column data is the input data.
Type: Application
Filed: November 14, 2019
Publication date: September 10, 2020
Inventor: Ji-Hoon NAM
-
Publication number: 20200034318
Abstract: A device includes a first memory of a first type including a plurality of memory dice that are stacked vertically, a second memory of a second type, and a controller die that transfers first data between the first memory and a first internal memory of a processor using a first interface, and that transfers second data between the second memory and a second internal memory of the processor using a second interface. The first and second memory types are different types of memories. The first and second interfaces are different from each other. The first and second internal memories are different from each other.
Type: Application
Filed: July 11, 2019
Publication date: January 30, 2020
Inventor: Ji Hoon NAM
-
Publication number: 20200004557
Abstract: A semiconductor apparatus may include a storage device including a data area and a code area and storing program codes provided from a host device in the code area, a plurality of unit processors, each of the plurality of unit processors including an internal memory, and a main control component configured to receive an operation policy, which includes a processor ID, a code ID, and a code address, from the host device and to control the plurality of unit processors based on the operation policy. The processor ID is an identifier for each of the plurality of unit processors, the code ID is an identifier for each of the program codes, and the code address indicates a position of the code area where each of the program codes is stored.
Type: Application
Filed: February 7, 2019
Publication date: January 2, 2020
Inventor: Ji Hoon NAM
-
Publication number: 20190266123
Abstract: A data processing system including a shared memory; a host processor configured to possess an ownership of the shared memory, and process a first task by accessing the shared memory; a processor configured to possess the ownership transferred from the host processor, and process a second task by accessing the shared memory; and a memory controller coupled among the host processor, the processor, and the shared memory, and configured to allow the host processor or the processor to access the shared memory according to the ownership.
Type: Application
Filed: December 5, 2018
Publication date: August 29, 2019
Applicant: SK hynix Inc.
Inventors: Ji Hoon NAM, Eui Cheol LIM
-
Patent number: 10340943
Abstract: A data conversion apparatus may include: a receiver suitable for receiving input data; and a controller suitable for selectively converting the input data based on a distribution of a preset bit value included in the input data, and outputting any one of the input data and the converted data as output data, the converted data having a smaller size than the input data.
Type: Grant
Filed: October 27, 2017
Date of Patent: July 2, 2019
Assignee: SK hynix Inc.
Inventor: Ji-Hoon Nam
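One plausible reading of this selective conversion can be sketched as follows: when the preset bit value dominates the input's bit distribution, encode only the positions of the minority bits (yielding smaller output) and signal the conversion; otherwise pass the input through. The threshold, encoding layout, and names are illustrative assumptions, not the patented scheme:

```python
def convert(data: bytes, threshold: float = 0.75):
    """Selectively convert `data` based on its 1-bit (preset value) ratio.
    Returns (converted?, payload); payload is smaller only when converted."""
    bits = "".join(f"{b:08b}" for b in data)
    ratio = bits.count("1") / len(bits)
    if ratio >= threshold or ratio <= 1 - threshold:
        minority = "0" if ratio >= threshold else "1"
        positions = [i for i, b in enumerate(bits) if b == minority]
        if len(data) < 256 and all(p < 256 for p in positions):
            # layout: [original length][minority bit value][minority positions]
            encoded = bytes([len(data), minority == "1"]) + bytes(positions)
            if len(encoded) < len(data):
                return True, encoded
    return False, data

def restore(converted: bool, payload: bytes) -> bytes:
    """Invert the conversion using the flag emitted by convert()."""
    if not converted:
        return payload
    n, minority_is_one = payload[0], payload[1]
    bits = ["0" if minority_is_one else "1"] * (n * 8)
    for p in payload[2:]:
        bits[p] = "1" if minority_is_one else "0"
    return bytes(int("".join(bits[i:i + 8]), 2) for i in range(0, n * 8, 8))
```

Heavily skewed inputs shrink to a short position list; balanced inputs are emitted unchanged, so the output is never larger than the input.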