Patents by Inventor Won-Ho Choi

Won-Ho Choi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11393517
    Abstract: An apparatus is provided that includes a memory device including a plurality of sub-arrays, and a memory controller. The memory controller is configured to determine a value of a parameter of a corresponding write pulse for each bit of a word based on a relative importance of each bit, and write each bit of the word to a corresponding one of the plurality of sub-arrays using the corresponding write pulses.
    Type: Grant
    Filed: May 13, 2021
    Date of Patent: July 19, 2022
    Assignee: Western Digital Technologies, Inc.
    Inventors: Yongjune Kim, Yoocharn Jeon, Won Ho Choi, Cyril Guyot, Yuval Cassuto
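A minimal sketch of the bit-importance write scheme described in the abstract of patent 11393517 above. The mapping from bit position to pulse duration, the parameter values, and the word width are illustrative assumptions, not taken from the patent.

```python
# Sketch: choose a write-pulse parameter per bit of a word based on bit importance.
# The pulse-duration values and the importance mapping are illustrative assumptions.

def pulse_durations_ns(word_width: int, base_ns: float = 10.0) -> list[float]:
    """Assign longer (more reliable) write pulses to more significant bits."""
    # Bit 0 is the least significant bit; importance grows with bit position.
    return [base_ns * (1 + pos) for pos in range(word_width)]

def write_word(word: int, word_width: int = 8) -> list[tuple[int, int, float]]:
    """Return (sub_array_index, bit_value, pulse_ns) for each bit of the word."""
    durations = pulse_durations_ns(word_width)
    plan = []
    for pos in range(word_width):
        bit = (word >> pos) & 1
        # Each bit goes to its own sub-array with its own write pulse.
        plan.append((pos, bit, durations[pos]))
    return plan

if __name__ == "__main__":
    for sub_array, bit, ns in write_word(0b10110010):
        print(f"sub-array {sub_array}: bit={bit}, pulse={ns:.1f} ns")
```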
  • Patent number: 11378232
    Abstract: Disclosed is a device for automatically opening or closing a gas barrel valve. The device includes: a main plate installed so as to move up and down and to align the position of a gas barrel loaded in a cabinet; a gas barrel connecting portion installed on the lower portion of the main plate, separating an end cap from the gas barrel and storing the end cap, and then automatically screw-coupling a connector holder to a gas spray nozzle; a valve handle unit installed on the main plate so as to rotate around a first shaft and rotating a valve handle of the gas barrel such that the valve handle is locked or unlocked, while encompassing the valve handle of the gas barrel; and a valve handle opening or closing unit installed on the upper portion of the main plate so as to operate the valve handle unit in a direction in which a valve of the gas barrel is opened.
    Type: Grant
    Filed: August 2, 2018
    Date of Patent: July 5, 2022
    Assignee: AMT CO., LTD.
    Inventors: Du Chul Kim, Jong Seong Lee, Won Ho Choi
  • Patent number: 11361829
    Abstract: Systems and methods for performing in-storage logic operations using one or more memory cell transistors and a programmable sense amplifier are described. The logic operations may comprise basic Boolean logic operations (e.g., OR and AND operations) or secondary Boolean logic operations (e.g., XOR and IMP operations). The one or more memory cell transistors may be used for storing user data during a first time period and then used for performing a logic operation during a second time period subsequent to the first time period. During the logic operation, a first memory cell transistor of the one or more memory cell transistors may be programmed with a threshold voltage that corresponds with a first input operand value and then a gate voltage bias may be applied to the first memory cell transistor during the logic operation that corresponds with a second input operand value.
    Type: Grant
    Filed: January 29, 2020
    Date of Patent: June 14, 2022
    Assignee: SanDisk Technologies LLC
    Inventors: Federico Nardi, Won Ho Choi
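A behavioral sketch of the in-storage logic idea in the abstract of patent 11361829 above: one operand is programmed as a threshold voltage, the other is applied as a gate bias, and the sense amplifier reads whether the cell conducts. The specific voltage levels and the mapping of conduction to OR and AND are illustrative assumptions.

```python
# Sketch: Boolean logic from a single memory-cell transistor.
# Operand A is programmed as a threshold voltage, operand B is applied as a
# gate bias, and "conducts" (gate voltage above threshold) is the result read
# by the sense amplifier.  All voltage values are illustrative assumptions.

VTH = {1: 1.0, 0: 3.0}          # programmed threshold voltage for operand A

GATE_OR = {1: 4.0, 0: 2.0}      # gate bias for operand B when computing A OR B
GATE_AND = {1: 2.0, 0: 0.0}     # gate bias for operand B when computing A AND B

def cell_conducts(a: int, b: int, gate_map: dict[int, float]) -> int:
    """1 if the cell conducts (sense amplifier trips), else 0."""
    return int(gate_map[b] > VTH[a])

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(a, b,
                  "OR ->", cell_conducts(a, b, GATE_OR),
                  "AND ->", cell_conducts(a, b, GATE_AND))
```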
  • Publication number: 20220171992
    Abstract: Exemplary methods and apparatus are disclosed that implement super-sparse image/video compression by storing image dictionary elements within a cross-bar resistive random access memory (ReRAM) array (or other suitable cross-bar NVM array). In illustrative examples, each column of the cross-bar ReRAM array stores the values for one dictionary element (such as one 4×4 dictionary element). Methods and apparatus are described for training (configuring) the cross-bar ReRAM array to generate and store the dictionary elements by sequentially applying patches from training images to the array using an unstructured Hebbian training procedure. Additionally, methods and apparatus are described for compressing an input image by applying patches from the input image to the ReRAM array to read out cross-bar column indices identifying the columns storing the various dictionary elements that best fit the image. This may be done in parallel using a set of ReRAM arrays.
    Type: Application
    Filed: February 9, 2022
    Publication date: June 2, 2022
    Inventors: Wen Ma, Minghai Qin, Won Ho Choi, Pi-Feng Chiu, Martin Lueker-Boden
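A small numerical sketch of the compression step described in the abstract of publication 20220171992 above: dictionary elements sit in the columns of a cross-bar array, and an input patch is matched against them by what amounts to a matrix-vector product. The crossbar is modeled with ordinary NumPy arithmetic; the patch size, dictionary size, and best-fit criterion (largest dot product) are illustrative assumptions.

```python
# Sketch: compressing an image patch by matching it against dictionary
# elements stored as columns of a cross-bar array.  The crossbar read-out is
# modeled as an ordinary matrix-vector product.
import numpy as np

rng = np.random.default_rng(0)
PATCH = 4                          # 4x4 patches, flattened to length 16
N_ELEMS = 64                       # one dictionary element per crossbar column

dictionary = rng.random((PATCH * PATCH, N_ELEMS))   # columns = dictionary elements
dictionary /= np.linalg.norm(dictionary, axis=0)    # normalize each column

def compress_patch(patch: np.ndarray) -> int:
    """Return the index of the crossbar column that best fits the patch."""
    scores = patch.flatten() @ dictionary            # analog of a crossbar read
    return int(np.argmax(scores))

if __name__ == "__main__":
    patch = rng.random((PATCH, PATCH))
    print("best-fit dictionary column:", compress_patch(patch))
```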
  • Patent number: 11328204
    Abstract: Use of a NAND array architecture to realize a binary neural network (BNN) allows for matrix multiplication and accumulation to be performed within the memory array. A unit synapse for storing a weight of a BNN is stored in a pair of series connected memory cells. A binary input is applied as a pattern of voltage values on a pair of word lines connected to the unit synapse to perform the multiplication of the input with the weight by determining whether or not the unit synapse conducts. The results of such multiplications are determined by a sense amplifier, with the results accumulated by a counter.
    Type: Grant
    Filed: March 28, 2019
    Date of Patent: May 10, 2022
    Assignee: SanDisk Technologies LLC
    Inventors: Won Ho Choi, Pi-Feng Chiu, Wen Ma, Minghai Qin, Gerrit Jan Hemink, Martin Lueker-Boden
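The abstract of patent 11328204 above multiplies a binary input by a binary weight by testing whether a unit synapse conducts, then accumulates the results with a counter. A minimal sketch of the equivalent match-and-count arithmetic follows; the ±1 encoding and vector length are illustrative assumptions.

```python
# Sketch: binary neural network multiply-accumulate as match counting.
# Inputs and weights are +1/-1; their product is +1 exactly when they match,
# so a counter of matches recovers the signed dot product.

def bnn_dot(inputs: list[int], weights: list[int]) -> int:
    """Signed dot product of +/-1 vectors via match counting."""
    assert len(inputs) == len(weights)
    matches = sum(1 for x, w in zip(inputs, weights) if x == w)   # counter
    return 2 * matches - len(inputs)     # convert the count to a +/-1 dot product

if __name__ == "__main__":
    x = [+1, -1, +1, +1, -1, -1, +1, -1]
    w = [+1, +1, -1, +1, -1, +1, +1, -1]
    print("dot product:", bnn_dot(x, w))
    print("check      :", sum(a * b for a, b in zip(x, w)))
```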
  • Patent number: 11320094
    Abstract: Disclosed is a system for automatically replacing a high-pressure gas tank, including: a high-pressure gas tank lift installed in a cabinet, able to be elevated, and including a die to load a high-pressure gas tank thereon; a high-pressure gas tank clamp clamping the high-pressure gas tank loaded on the die of the high-pressure gas tank lift to align the position of the high-pressure gas tank; a high-pressure gas tank connection unit removing an end cap from the high-pressure gas tank elevated by the high-pressure gas tank lift to automatically connect a connector holder to a gas injection nozzle and control the flow of gas; and a control unit installed in the cabinet to control operation of the high-pressure gas tank connection unit, the high-pressure gas tank lift, and the high-pressure gas tank clamp.
    Type: Grant
    Filed: November 14, 2018
    Date of Patent: May 3, 2022
    Assignee: AMT CO., LTD.
    Inventors: Won Ho Choi, Chan Woo Kim
  • Publication number: 20220100375
    Abstract: Methods and apparatus are disclosed for implementing principal component analysis (PCA) within a non-volatile memory (NVM) die of a solid state drive (SSD) to reduce the dimensionality of machine learning data before the data is transferred to other components of the SSD, such as to a data storage controller equipped with a machine learning engine. The machine learning data may include, for example, training images for training an image recognition system in which the SSD is installed. In some examples, the on-chip PCA components of the NVM die are configured as under-the-array or next-to-the-array components. In other examples, one or more arrays of the NVM die are configured as multiplication cores for performing PCA matrix multiplication. In still other aspects, multiple NVM dies are arranged in parallel, each with on-chip PCA components to permit parallel concurrent on-chip processing of machine learning data.
    Type: Application
    Filed: December 8, 2021
    Publication date: March 31, 2022
    Inventors: Won Ho Choi, Yongjune Kim, Martin Lueker-Boden
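A compact sketch of the dimensionality reduction that publication 20220100375 above performs on-die, modeled here with ordinary NumPy linear algebra rather than in-array multiplication. The data shapes and the number of retained components are illustrative assumptions.

```python
# Sketch: PCA dimensionality reduction of machine learning data, modeled in
# NumPy.  On the NVM die this projection would be done by multiplication
# cores; here it is a plain eigendecomposition and matrix product.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.random((1000, 64))          # e.g. 1000 flattened training patches

def pca_reduce(x: np.ndarray, n_components: int) -> np.ndarray:
    """Project data onto its top principal components."""
    centered = x - x.mean(axis=0)
    cov = centered.T @ centered / (len(x) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)           # ascending eigenvalues
    top = eigvecs[:, -n_components:][:, ::-1]        # top components first
    return centered @ top                            # reduced-dimension data

if __name__ == "__main__":
    reduced = pca_reduce(samples, n_components=8)
    print(samples.shape, "->", reduced.shape)        # (1000, 64) -> (1000, 8)
```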
  • Publication number: 20220090740
    Abstract: A method for automatically opening or closing a gas barrel valve, includes: loading and aligning a gas barrel in a cabinet; separating an end cap from the gas barrel; screw-coupling a connector holder to a gas spray nozzle, from which the end cap has been removed; winding a spring around a first shaft by enabling forward rotation of the first shaft while suppressing reverse rotation of the first shaft, which is installed in a valve handle holder so as to idle; opening a valve by enabling reverse rotation of a valve handle of the gas barrel while preventing forward rotation of the valve handle holder; and automatically closing the valve at the time of replacement of the gas barrel or when a gas leak is detected.
    Type: Application
    Filed: November 29, 2021
    Publication date: March 24, 2022
    Applicant: AMT CO., LTD.
    Inventors: Du Chul KIM, Jong Seong LEE, Won Ho CHOI
  • Patent number: 11275968
    Abstract: Exemplary methods and apparatus are disclosed that implement super-sparse image/video compression by storing image dictionary elements within a cross-bar resistive random access memory (ReRAM) array (or other suitable cross-bar NVM array). In illustrative examples, each column of the cross-bar ReRAM array stores the values for one dictionary element (such as one 4×4 dictionary element). Methods and apparatus are described for training (configuring) the cross-bar ReRAM array to generate and store the dictionary elements by sequentially applying patches from training images to the array using an unstructured Hebbian training procedure. Additionally, methods and apparatus are described for compressing an input image by applying patches from the input image to the ReRAM array to read out cross-bar column indices identifying the columns storing the various dictionary elements that best fit the image. This may be done in parallel using a set of ReRAM arrays.
    Type: Grant
    Filed: February 13, 2019
    Date of Patent: March 15, 2022
    Assignee: WESTERN DIGITAL TECHNOLOGIES, INC.
    Inventors: Wen Ma, Minghai Qin, Won Ho Choi, Pi-Feng Chiu, Martin Lueker-Boden
  • Patent number: 11248730
    Abstract: A pipe support device for a transformer is proposed. A brace having a grid shape and serving as a reinforcing member is provided on the surface of an outer housing constituting the exterior of the transformer. Supports are installed at predetermined intervals on the brace to be orthogonal thereto. A support base is positioned on each of the supports, and a pipe holder is coupled to the support base to support a pipe. An elastic supporting pad is positioned between the support base and the pipe and opposite flange portions of the pipe holder are seated on and coupled to the elastic supporting pad. An elastic close-contact pad is positioned between the pipe and an arched portion of the pipe holder and is brought into close contact with the pipe.
    Type: Grant
    Filed: December 21, 2018
    Date of Patent: February 15, 2022
    Assignee: HYOSUNG HEAVY INDUSTRIES CORPORATION
    Inventors: Chul Jun Park, Kyo Ho Lee, Do Jin Kim, Won Ho Choi
  • Patent number: 11216184
    Abstract: Methods and apparatus are disclosed for implementing principal component analysis (PCA) within a non-volatile memory (NVM) die of a solid state drive (SSD) to reduce the dimensionality of machine learning data before the data is transferred to other components of the SSD, such as to a data storage controller equipped with a machine learning engine. The machine learning data may include, for example, training images for training an image recognition system in which the SSD is installed. In some examples, the on-chip PCA components of the NVM die are configured as under-the-array or next-to-the-array components. In other examples, one or more arrays of the NVM die are configured as multiplication cores for performing PCA matrix multiplication. In still other aspects, multiple NVM dies are arranged in parallel, each with on-chip PCA components to permit parallel concurrent on-chip processing of machine learning data.
    Type: Grant
    Filed: December 6, 2019
    Date of Patent: January 4, 2022
    Assignee: WESTERN DIGITAL TECHNOLOGIES, INC.
    Inventors: Won Ho Choi, Yongjune Kim, Martin Lueker-Boden
  • Publication number: 20210406672
    Abstract: Non-volatile memory structures for performing compute in memory inferencing for neural networks are presented. To improve performance, both in terms of speed and energy consumption, weight matrices are replaced with their singular value decomposition (SVD) and low-rank approximations (LRAs). The decomposition matrices can be stored in a single array, with the resultant LRA matrices requiring fewer weight values to be stored. The reduced sizes of the LRA matrices allow for inferencing to be performed more quickly and with less power. In a high performance and energy efficiency mode, a reduced rank for the SVD matrices stored on a memory die is determined and used to increase performance and reduce power needed for an inferencing operation.
    Type: Application
    Filed: June 26, 2020
    Publication date: December 30, 2021
    Applicant: SanDisk Technologies LLC
    Inventors: Tung Thanh Hoang, Won Ho Choi, Martin Lueker-Boden
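A short numerical sketch of the low-rank-approximation idea in publication 20210406672 above: a weight matrix is replaced by a truncated SVD so that fewer values need to be stored and multiplied during inferencing. The matrix sizes and the chosen rank are illustrative assumptions.

```python
# Sketch: replacing a weight matrix with a truncated SVD (low-rank
# approximation) and running inference through the decomposed factors.
import numpy as np

rng = np.random.default_rng(0)
W = rng.random((256, 256))                    # original layer weight matrix
x = rng.random(256)                           # one input activation vector

rank = 32                                     # reduced rank (assumed)
U, S, Vt = np.linalg.svd(W, full_matrices=False)
U_r, S_r, Vt_r = U[:, :rank], S[:rank], Vt[:rank, :]

y_full = W @ x
y_lra = U_r @ (S_r * (Vt_r @ x))              # low-rank inference path

stored_full = W.size
stored_lra = U_r.size + S_r.size + Vt_r.size
print("stored weights:", stored_full, "->", stored_lra)
print("relative error:", np.linalg.norm(y_full - y_lra) / np.linalg.norm(y_full))
```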
  • Publication number: 20210397930
    Abstract: A non-volatile memory device includes an array of non-volatile memory cells that are configured to store weights of a neural network. Associated with the array is a data latch structure that includes a page buffer, which can store weights for a layer of the neural network that is read out of the array, and a transfer buffer, which can store inputs for the neural network. The memory device can perform multiply and accumulate operations between inputs and weights of the neural network within the latch structure, avoiding the need to transfer data out of the array and associated latch structure for portions of an inference operation. By using binary weights and inputs, multiplication can be performed by bit-wise XNOR operations. The results can then be summed and activation applied, all within the latch structure.
    Type: Application
    Filed: June 22, 2020
    Publication date: December 23, 2021
    Applicant: Western Digital Technologies, Inc.
    Inventors: Anand Kulkarni, Won Ho Choi, Martin Lueker-Boden
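A minimal sketch of the in-latch multiply-accumulate described in publication 20210397930 above, modeled with Python integers as bit vectors: bit-wise XNOR multiplies binary inputs by binary weights, a popcount sums the matches, and a binary activation is applied. The word width and the 0/1 to -1/+1 mapping are illustrative assumptions.

```python
# Sketch: XNOR multiply, popcount accumulate, and activation on bit vectors,
# mirroring what the latch structure performs without moving data off-chip.

WIDTH = 16

def xnor_mac(inputs: int, weights: int) -> int:
    """Popcount of XNOR(inputs, weights), converted to a signed sum."""
    xnor = ~(inputs ^ weights) & ((1 << WIDTH) - 1)
    matches = bin(xnor).count("1")
    return 2 * matches - WIDTH            # signed +/-1 dot product

def activation(value: int) -> int:
    """Binary activation: 1 if the accumulated sum is non-negative."""
    return 1 if value >= 0 else 0

if __name__ == "__main__":
    x = 0b1011001110001101
    w = 0b1001101010011101
    s = xnor_mac(x, w)
    print("sum:", s, "activation:", activation(s))
```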
  • Publication number: 20210397974
    Abstract: A non-volatile memory structure capable of storing weights for layers of a deep neural network (DNN) and performing an inferencing operation within the structure is presented. An in-array multiplication can be performed between multi-bit valued inputs, or activations, for a layer of the DNN and multi-bit valued weights of the layer. Each bit of a weight value is stored in a binary valued memory cell of the memory array and each bit of the input is applied as a binary input to a word line of the array for the multiplication of the input with the weight. To perform a multiply and accumulate operation, the results of the multiplications are accumulated by adders connected to sense amplifiers along the bit lines of the array. The adders can be configured for multiple levels of precision, so that the same structure can accommodate weights and activations of 8-bit, 4-bit, and 2-bit precision.
    Type: Application
    Filed: July 28, 2020
    Publication date: December 23, 2021
    Applicant: SanDisk Technologies LLC
    Inventors: Tung Thanh Hoang, Won Ho Choi, Martin Lueker-Boden
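A small arithmetic sketch of the bit-level scheme in publication 20210397974 above: each weight bit and each input bit contribute a binary partial product, and the adders accumulate those products with the appropriate power-of-two weighting. The 4-bit precision and unsigned values are illustrative assumptions.

```python
# Sketch: multiplying multi-bit activations by multi-bit weights using only
# binary (bit-level) products, accumulated with power-of-two weighting.

BITS = 4

def bitplane_multiply(a: int, w: int) -> int:
    """Multiply two unsigned BITS-bit values from their binary bit-planes."""
    total = 0
    for i in range(BITS):                     # input bit applied to a word line
        for j in range(BITS):                 # weight bit stored in a cell
            partial = ((a >> i) & 1) & ((w >> j) & 1)
            total += partial << (i + j)       # adder applies the bit weighting
    return total

if __name__ == "__main__":
    a, w = 11, 6
    print(bitplane_multiply(a, w), "==", a * w)
```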
  • Publication number: 20210379754
    Abstract: A layer jamming driving device is proposed, which includes an enclosure made of a variable material; and layer stacked structures having a plurality of layers stacked inside the enclosure, wherein the layer stacked structures can be coupled so as to be slidable and rotatable with respect to each other.
    Type: Application
    Filed: June 28, 2019
    Publication date: December 9, 2021
    Inventors: Dong Jun SHIN, Won Ho CHOI
  • Publication number: 20210384522
    Abstract: The present disclosure relates to a sub-nanometric particles-metal organic framework complex including a multi-shell hollow metal organic framework (MOF) and sub-nanometric particles (SNPs), and a method of preparing the same.
    Type: Application
    Filed: April 13, 2021
    Publication date: December 9, 2021
    Inventors: Jeung Ku Kang, Won Ho Choi, Byeong Cheul Moon, Dong Gyu Park, Jae Won Choi, Keon-Han Kim
  • Patent number: 11170290
    Abstract: Use of a NAND array architecture to realize a binary neural network (BNN) allows for matrix multiplication and accumulation to be performed within the memory array. A unit synapse for storing a weight of a BNN is stored in a pair of series connected memory cells. A binary input is applied as a pattern of voltage values on a pair of word lines connected to the unit synapse to perform the multiplication of the input with the weight by determining whether or not the unit synapse conducts. The results of such multiplications are determined by a sense amplifier, with the results accumulated by a counter. The arrangement can be extended to ternary inputs to realize a ternary-binary network (TBN) by adding a circuit to detect 0 input values and adjust the accumulated count accordingly.
    Type: Grant
    Filed: March 28, 2019
    Date of Patent: November 9, 2021
    Assignee: SanDisk Technologies LLC
    Inventors: Tung Thanh Hoang, Won Ho Choi, Martin Lueker-Boden
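Patent 11170290 above extends the binary unit-synapse scheme to ternary inputs by detecting 0 input values and adjusting the accumulated count. The abstract does not specify the circuit-level adjustment; the sketch below only models the arithmetic that the adjustment needs to reproduce, with ±1/0 encodings chosen for illustration.

```python
# Sketch: ternary-binary network (TBN) dot product via count adjustment.
# Inputs are -1/0/+1, weights are +/-1; zero inputs are detected and excluded
# from the count that is converted back to a signed dot product.

def tbn_dot(inputs: list[int], weights: list[int]) -> int:
    """Dot product of ternary inputs with +/-1 weights via count adjustment."""
    matches = sum(1 for x, w in zip(inputs, weights) if x != 0 and x == w)
    zeros = sum(1 for x in inputs if x == 0)          # zero-input detector
    nonzero = len(inputs) - zeros
    return 2 * matches - nonzero                      # adjusted count

if __name__ == "__main__":
    x = [+1, 0, -1, +1, 0, -1, +1, -1]
    w = [+1, +1, -1, -1, -1, +1, +1, -1]
    print(tbn_dot(x, w), "==", sum(a * b for a, b in zip(x, w)))
```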
  • Publication number: 20210334338
    Abstract: An innovative low-bit-width device may include a first digital-to-analog converter (DAC), a second DAC, a plurality of non-volatile memory (NVM) weight arrays, one or more analog-to-digital converters (ADCs), and a neural circuit. The first DAC is configured to convert a digital input signal into an analog input signal. The second DAC is configured to convert a digital previous hidden state (PHS) signal into an analog PHS signal. NVM weight arrays are configured to compute vector matrix multiplication (VMM) arrays based on the analog input signal and the analog PHS signal. The NVM weight arrays are coupled to the first DAC and the second DAC. The one or more ADCs are coupled to the plurality of NVM weight arrays and are configured to convert the VMM arrays into digital VMM values. The neural circuit is configured to process the digital VMM values into a new hidden state.
    Type: Application
    Filed: July 8, 2021
    Publication date: October 28, 2021
    Inventors: Wen Ma, Pi-Feng Chiu, Minghai Qin, Won Ho Choi, Martin Lueker-Boden
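A numerical sketch of the signal path in publication 20210334338 above: low-bit DACs quantize the input and the previous hidden state, NVM weight arrays perform the vector-matrix multiplications, ADCs quantize the results, and a neural circuit combines them into a new hidden state. The bit widths, array sizes, and the tanh recurrence used for the neural circuit are illustrative assumptions.

```python
# Sketch: DAC -> NVM vector-matrix multiply -> ADC -> neural circuit, modeled
# numerically.  Quantizer resolution and the recurrence are assumptions.
import numpy as np

rng = np.random.default_rng(0)
IN, HID, DAC_BITS, ADC_BITS = 16, 8, 4, 6

W_x = rng.standard_normal((HID, IN)) * 0.1    # NVM weight array for the input
W_h = rng.standard_normal((HID, HID)) * 0.1   # NVM weight array for the hidden state

def quantize(v: np.ndarray, bits: int) -> np.ndarray:
    """Uniform quantization of values in [-1, 1] to the given bit width."""
    levels = 2 ** bits - 1
    return np.round((np.clip(v, -1, 1) + 1) / 2 * levels) / levels * 2 - 1

def step(x: np.ndarray, h_prev: np.ndarray) -> np.ndarray:
    x_a = quantize(x, DAC_BITS)               # first DAC
    h_a = quantize(h_prev, DAC_BITS)          # second DAC
    vmm = W_x @ x_a + W_h @ h_a               # analog VMM in the NVM arrays
    vmm_d = quantize(vmm, ADC_BITS)           # ADCs produce digital VMM values
    return np.tanh(vmm_d)                     # neural circuit -> new hidden state

if __name__ == "__main__":
    h = np.zeros(HID)
    for _ in range(3):
        h = step(rng.uniform(-1, 1, IN), h)
    print("new hidden state:", np.round(h, 3))
```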
  • Publication number: 20210326110
    Abstract: Technology for reconfigurable input precision in-memory computing is disclosed herein. Reconfigurable input precision allows the bit resolution of input data to be changed to meet the requirements of in-memory computing operations. Voltage sources (that may include DACs) provide voltages that represent input data to memory cell nodes. The resolution of the voltage sources may be reconfigured to change the precision of the input data. In one parallel mode, the number of DACs in a DAC node is used to configure the resolution. In one serial mode, the number of cycles over which a DAC provides voltages is used to configure the resolution. The memory system may include relatively low resolution voltage sources, which avoids the need to have complex high resolution voltage sources (e.g., high resolution DACs). Lower resolution voltage sources can take up less area and/or use less power than higher resolution voltage sources.
    Type: Application
    Filed: April 16, 2020
    Publication date: October 21, 2021
    Applicant: SanDisk Technologies LLC
    Inventors: Wen Ma, Pi-Feng Chiu, Won Ho Choi, Martin Lueker-Boden
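A sketch of the serial mode described in publication 20210326110 above: a low-resolution voltage source drives one bit-slice of the input per cycle, and the partial results are combined with power-of-two weighting to reach a higher effective input precision. The 1-bit-per-cycle source and the dot-product workload are illustrative assumptions.

```python
# Sketch: bit-serial input precision.  Each cycle applies one bit-plane of the
# inputs (a 1-bit voltage source per input) and the results are shift-added.

def serial_dot(inputs: list[int], weights: list[float], bits: int) -> float:
    """Dot product with `bits`-bit unsigned inputs applied one bit per cycle."""
    total = 0.0
    for cycle in range(bits):                            # one cycle per input bit
        bit_slice = [(x >> cycle) & 1 for x in inputs]   # 1-bit source outputs
        partial = sum(b * w for b, w in zip(bit_slice, weights))
        total += partial * (2 ** cycle)                  # shift-and-add combination
    return total

if __name__ == "__main__":
    x = [5, 9, 12, 3]                                    # 4-bit inputs
    w = [0.5, -1.0, 0.25, 2.0]
    print(serial_dot(x, w, bits=4), "==", sum(a * b for a, b in zip(x, w)))
```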
  • Publication number: 20210325957
    Abstract: Certain aspects of the present disclosure provide a method for performing multimode inferencing, comprising: receiving machine learning model input data from a requestor; processing the machine learning model input data with a machine learning model using processing hardware at a first power level to generate first output data; selecting a second power level for the processing hardware based on comparing the first output data to a threshold value; processing the machine learning model input data with the machine learning model using the processing hardware at the second power level to generate second output data; and sending the second output data to the requestor.
    Type: Application
    Filed: April 21, 2020
    Publication date: October 21, 2021
    Inventors: Yongjune KIM, Cyril GUYOT, Won Ho CHOI
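A minimal sketch of the multimode inferencing flow in publication 20210325957 above. The confidence-style threshold, the two power levels, and the toy model are illustrative assumptions; the abstract only specifies comparing the first output to a threshold before selecting the second power level.

```python
# Sketch: two-pass multimode inferencing.  The first pass runs at a first power
# level, the comparison against a threshold selects the second power level, and
# the second pass produces the output that is returned to the requestor.

from typing import Callable

def multimode_infer(model: Callable[[list[float], str], float],
                    data: list[float],
                    threshold: float = 0.9) -> float:
    first = model(data, "low_power")                         # first power level
    second_level = "low_power" if first >= threshold else "high_power"
    second = model(data, second_level)                       # second power level
    return second                                            # sent to the requestor

if __name__ == "__main__":
    def toy_model(x: list[float], power: str) -> float:
        score = sum(x) / len(x)
        return score if power == "high_power" else score * 0.8   # noisier at low power
    print(multimode_infer(toy_model, [0.9, 0.95, 1.0]))
```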