Patents by Inventor Dong Gyu Sim

Dong Gyu Sim has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240171747
    Abstract: A method and an apparatus are configured for intra prediction coding of video data. An apparatus for decoding video data includes: a decoding unit which obtains, from a bitstream, information on a luma prediction mode and information on a chroma prediction mode of a current coding block; and an intra prediction unit which generates luma prediction samples and chroma prediction samples of the current coding block. The intra prediction unit derives a luma intra prediction type and a luma intra prediction mode of the current coding block on the basis of the information on the luma prediction mode, and determines a chroma intra prediction mode of the current coding block on the basis of the luma intra prediction type and the luma intra prediction mode of the current coding block, and the information on the chroma prediction mode.
    Type: Application
    Filed: January 16, 2024
    Publication date: May 23, 2024
    Inventors: Dong Gyu Sim, Joo Hyung Byeon, Sea Nae Park, Jun Taek Park, Seung Wook Park, Wha Pyeong Lim
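The abstract above describes deriving a chroma intra prediction mode from the luma mode plus signaled chroma-mode information. As an illustrative sketch only (not the claimed method), the VVC-style "derived mode" design works in a similar spirit; the mode numbers, candidate list, and substitute mode below are assumptions:

```python
# Illustrative sketch: derive a chroma intra mode from the co-located luma
# mode and a signaled candidate index, in the spirit of VVC's derived mode
# (DM). Mode indices and the substitute mode 66 are assumptions.

PLANAR, DC, HORIZONTAL, VERTICAL = 0, 1, 18, 50  # VVC-style mode indices

# Candidate list addressed by the signaled chroma prediction-mode information.
DEFAULT_CHROMA_MODES = [PLANAR, VERTICAL, HORIZONTAL, DC]

def derive_chroma_mode(chroma_mode_idx: int, luma_mode: int) -> int:
    """If the signaled candidate collides with the luma mode, substitute a
    replacement mode (here: 66) so all candidates stay distinct."""
    DM_IDX = 4  # index meaning "reuse the luma mode directly"
    if chroma_mode_idx == DM_IDX:
        return luma_mode
    candidate = DEFAULT_CHROMA_MODES[chroma_mode_idx]
    return 66 if candidate == luma_mode else candidate

# Luma block coded with VERTICAL; chroma signals candidate 1 (also VERTICAL).
print(derive_chroma_mode(1, VERTICAL))  # collision -> substituted mode 66
print(derive_chroma_mode(4, VERTICAL))  # DM -> reuse luma mode 50
```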
  • Publication number: 20240163444
    Abstract: A method and an apparatus are configured for intra prediction coding of video data. An apparatus for decoding video data includes: a decoding unit which obtains, from a bitstream, information on a luma prediction mode and information on a chroma prediction mode of a current coding block; and an intra prediction unit which generates luma prediction samples and chroma prediction samples of the current coding block. The intra prediction unit derives a luma intra prediction type and a luma intra prediction mode of the current coding block on the basis of the information on the luma prediction mode, and determines a chroma intra prediction mode of the current coding block on the basis of the luma intra prediction type and the luma intra prediction mode of the current coding block, and the information on the chroma prediction mode.
    Type: Application
    Filed: January 16, 2024
    Publication date: May 16, 2024
    Inventors: Dong Gyu Sim, Joo Hyung Byeon, Sea Nae Park, Jun Taek Park, Seung Wook Park, Wha Pyeong Lim
  • Patent number: 11973966
    Abstract: A video decoding method and a video decoding apparatus are configured to decode video. To efficiently code residual blocks obtained from block-based motion compensation, a video encoding apparatus and the video decoding apparatus divide a relevant residual block of a current block into two subblocks in a horizontal or vertical direction and encode one residual subblock alone out of the two residual subblocks.
    Type: Grant
    Filed: March 12, 2020
    Date of Patent: April 30, 2024
    Assignees: Hyundai Motor Company, Kia Motors Corporation, Kwangwoon University Industry-Academic Collaboration Foundation
    Inventors: Dong Gyu Sim, Jong Seok Lee, Sea Nae Park, Seung Wook Park, Wha Pyeong Lim
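The residual-splitting idea above (divide a residual block into two subblocks and code only one) can be sketched as follows; this is an assumption-laden toy, similar in spirit to a sub-block transform, not the patented method:

```python
# Illustrative sketch: split a residual block in half horizontally or
# vertically, code only one sub-block, and treat the other half as zero.

def split_residual(residual, direction, coded_half):
    """residual: 2-D list; direction: 'horizontal' or 'vertical';
    coded_half: 0 (first half) or 1 (second half).
    Returns (coded_subblock, residual_with_the_uncoded_half_zeroed)."""
    h, w = len(residual), len(residual[0])
    if direction == "horizontal":           # split into top/bottom halves
        rows = range(0, h // 2) if coded_half == 0 else range(h // 2, h)
        coded = [residual[r][:] for r in rows]
        recon = [[residual[r][c] if r in rows else 0 for c in range(w)]
                 for r in range(h)]
    else:                                    # split into left/right halves
        cols = range(0, w // 2) if coded_half == 0 else range(w // 2, w)
        coded = [[residual[r][c] for c in cols] for r in range(h)]
        recon = [[residual[r][c] if c in cols else 0 for c in range(w)]
                 for r in range(h)]
    return coded, recon

coded, recon = split_residual([[1, 2], [3, 4]], "horizontal", 0)
print(coded)  # [[1, 2]]
print(recon)  # [[1, 2], [0, 0]]
```

Only the coded half is transformed and entropy-coded; the decoder fills the other half with zeros before adding the prediction.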
  • Patent number: 11973960
    Abstract: A video encoding/decoding apparatus according to the present invention acquires motion vector refinement information, performs motion compensation on the basis of a motion vector of a current block, refines the motion vector of the current block using at least one or both of the motion vector refinement information and the output of the motion compensation, and performs motion compensation using the refined motion vector.
    Type: Grant
    Filed: June 9, 2022
    Date of Patent: April 30, 2024
    Assignee: INTELLECTUAL DISCOVERY CO., LTD.
    Inventors: Yong Jo Ahn, Dong Gyu Sim, Ho Chan Ryu, Seanae Park, Byung Tae Oh, Byung Cheol Song
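The refinement loop described above (motion compensation, then refining the motion vector from its output) can be sketched in a decoder-side, DMVR-flavored form; the search pattern and cost function here are assumptions for illustration:

```python
# Illustrative sketch: refine a motion vector by testing small offsets
# around it and keeping the candidate whose motion-compensated prediction
# has the lowest matching cost. Integer-pel 8-neighbour search is assumed.

def refine_mv(mv, cost):
    """mv: (x, y) initial motion vector; cost: function mapping a candidate
    MV to a matching cost (e.g., SAD of the compensated prediction)."""
    candidates = [(mv[0] + dx, mv[1] + dy)
                  for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    return min(candidates, key=cost)

# Toy cost: distance to the true motion (3, -2), standing in for SAD.
sad = lambda c: abs(c[0] - 3) + abs(c[1] + 2)
print(refine_mv((2, -2), sad))  # (3, -2)
```

A second motion-compensation pass then uses the refined vector, as the abstract describes.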
  • Patent number: 11962777
    Abstract: An inverse quantization method is implemented by an inverse quantization device, the method configured for acquiring quantized coefficients, estimating a quantization parameter in quantization groups or quantization parameter prediction group units, generating an inverse quantization matrix for adaptive quantization, and generating transform coefficients from the quantized coefficients using the quantization parameter and the inverse quantization matrix.
    Type: Grant
    Filed: November 16, 2022
    Date of Patent: April 16, 2024
    Assignees: Hyundai Motor Company, Kia Corporation, Kwangwoon University Industry-Academic Collaboration Foundation
    Inventors: Dong Gyu Sim, Sea Nae Park, Jong Seok Lee, Seung Wook Park, Wha Pyeong Lim
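The inverse-quantization flow above (quantization parameter plus an inverse quantization matrix applied to the quantized coefficients) can be sketched roughly in the style of HEVC/VVC dequantization; the scale table and normalization constant are assumptions, and clipping is omitted for clarity:

```python
# Illustrative sketch: dequantize coefficient levels using a QP-driven
# scale plus a per-position scaling matrix. LEVEL_SCALE is the HEVC-style
# table indexed by QP % 6, doubling every 6 QP steps.

LEVEL_SCALE = [40, 45, 51, 57, 64, 72]

def dequantize(levels, qp, scaling_matrix):
    """levels, scaling_matrix: same-shape 2-D lists of ints.
    Returns reconstructed transform coefficients."""
    scale = LEVEL_SCALE[qp % 6] << (qp // 6)
    return [[lvl * scale * m // 16  # 16 = nominal flat-matrix entry
             for lvl, m in zip(level_row, matrix_row)]
            for level_row, matrix_row in zip(levels, scaling_matrix)]

levels = [[2, 0], [1, -1]]
flat = [[16, 16], [16, 16]]          # flat matrix: plain QP-based scaling
print(dequantize(levels, 0, flat))   # [[80, 0], [40, -40]]
```

A non-flat `scaling_matrix` realizes the adaptive (frequency-dependent) quantization the abstract mentions; the QP itself would be predicted per quantization group rather than passed in directly.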
  • Patent number: 11962775
    Abstract: An inverse quantization method is implemented by an inverse quantization device, the method configured for acquiring quantized coefficients, estimating a quantization parameter in quantization groups or quantization parameter prediction group units, generating an inverse quantization matrix for adaptive quantization, and generating transform coefficients from the quantized coefficients using the quantization parameter and the inverse quantization matrix.
    Type: Grant
    Filed: November 16, 2022
    Date of Patent: April 16, 2024
    Assignees: Hyundai Motor Company, Kia Corporation, Kwangwoon University Industry-Academic Collaboration Foundation
    Inventors: Dong Gyu Sim, Sea Nae Park, Jong Seok Lee, Seung Wook Park, Wha Pyeong Lim
  • Patent number: 11962776
    Abstract: An inverse quantization method is implemented by an inverse quantization device, the method configured for acquiring quantized coefficients, estimating a quantization parameter in quantization groups or quantization parameter prediction group units, generating an inverse quantization matrix for adaptive quantization, and generating transform coefficients from the quantized coefficients using the quantization parameter and the inverse quantization matrix.
    Type: Grant
    Filed: November 16, 2022
    Date of Patent: April 16, 2024
    Assignees: Hyundai Motor Company, Kia Corporation, Kwangwoon University Industry-Academic Collaboration Foundation
    Inventors: Dong Gyu Sim, Sea Nae Park, Jong Seok Lee, Seung Wook Park, Wha Pyeong Lim
  • Patent number: 11949881
    Abstract: The present invention discloses an encoding apparatus using Discrete Cosine Transform (DCT) scanning, which includes a mode selection means for selecting an optimal mode for intra prediction; an intra prediction means for performing intra prediction onto video inputted based on the mode selected in the mode selection means; a DCT and quantization means for performing DCT and quantization onto residual coefficients of a block outputted from the intra prediction means; and an entropy encoding means for performing entropy encoding onto DCT coefficients acquired from the DCT and quantization by using a scanning mode decided based on pixel similarity of the residual coefficients.
    Type: Grant
    Filed: April 1, 2021
    Date of Patent: April 2, 2024
    Assignees: Electronics and Telecommunications Research Institute, Kwangwoon University Research Institute for Industry Cooperation, Industry-Academia Cooperation Group of Sejong University
    Inventors: Se-Yoon Jeong, Hae-Chul Choi, Jeong-Il Seo, Seung-Kwon Beack, In-Seon Jang, Jae-Gon Kim, Kyung-Ae Moon, Dae-Young Jang, Jin-Woo Hong, Jin-Woong Kim, Yung-Lyul Lee, Dong-Gyu Sim, Seoung-Jun Oh, Chang-Beom Ahn, Dae-Yeon Kim, Dong-Kyun Kim
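The scan-mode decision above hinges on measuring pixel similarity of the residual. As an illustrative sketch only (the actual similarity test and the mapping from similarity to scan pattern are assumptions), directionality can be estimated from neighbour differences and used to pick a serialization order:

```python
# Illustrative sketch: measure residual similarity along rows vs. columns,
# then serialize DCT coefficients with a matching scan order.

def smoother_direction(residual):
    """Return 'horizontal' if neighbouring samples within a row are more
    similar than within a column, else 'vertical'."""
    h, w = len(residual), len(residual[0])
    row_diff = sum(abs(residual[r][c + 1] - residual[r][c])
                   for r in range(h) for c in range(w - 1))
    col_diff = sum(abs(residual[r + 1][c] - residual[r][c])
                   for r in range(h - 1) for c in range(w))
    return "horizontal" if row_diff <= col_diff else "vertical"

def scan(coeffs, order):
    """Serialize a 2-D coefficient block row-by-row or column-by-column."""
    h, w = len(coeffs), len(coeffs[0])
    if order == "horizontal":
        return [coeffs[r][c] for r in range(h) for c in range(w)]
    return [coeffs[r][c] for c in range(w) for r in range(h)]

print(smoother_direction([[5, 5, 5], [9, 9, 9], [1, 1, 1]]))  # horizontal
print(scan([[1, 2], [3, 4]], "vertical"))                     # [1, 3, 2, 4]
```

Grouping the likely-nonzero coefficients early in the scan shortens the entropy-coded runs of zeros.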
  • Publication number: 20240107032
    Abstract: The present invention relates to an image encoding and decoding technique, and more particularly, to an image encoder and decoder using unidirectional prediction. The image encoder includes a dividing unit to divide a macro block into a plurality of sub-blocks, a unidirectional application determining unit to determine whether an identical prediction mode is applied to each of the plurality of sub-blocks, and a prediction mode determining unit to determine a prediction mode with respect to each of the plurality of sub-blocks based on a determined result of the unidirectional application determining unit.
    Type: Application
    Filed: December 7, 2023
    Publication date: March 28, 2024
    Applicants: Electronics and Telecommunications Research Institute, Kwangwoon University Industry-Academic Collaboration Foundation, University-Industry Cooperation Group of Kyung Hee University
    Inventors: Hae Chul CHOI, Se Yoon JEONG, Sung-Chang LIM, Jin Soo CHOI, Jin Woo HONG, Dong Gyu SIM, Seoung-Jun OH, Chang-Beom AHN, Gwang Hoon PARK, Seung Ryong KOOK, Sea-Nae PARK, Kwang-Su JEONG
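The unidirectional-application decision above (one prediction mode shared by all sub-blocks, or one mode per sub-block) can be sketched as follows; the flag and mode representation are assumptions for illustration:

```python
# Illustrative sketch: resolve per-sub-block intra modes from a signaled
# "unified" flag. When the flag is set, one shared mode covers all
# sub-blocks and no per-sub-block modes need to be signaled.

def resolve_subblock_modes(unified, shared_mode, per_block_modes, n_subblocks):
    """unified: decoded flag; shared_mode: the single mode when unified;
    per_block_modes: per-sub-block modes otherwise."""
    if unified:
        return [shared_mode] * n_subblocks
    return list(per_block_modes)

print(resolve_subblock_modes(True, "vertical", None, 4))
# ['vertical', 'vertical', 'vertical', 'vertical']
```

The bitrate saving comes from signaling the mode once per macroblock instead of once per sub-block.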
  • Patent number: 11943441
    Abstract: An inverse quantization method is implemented by an inverse quantization device, the method configured for acquiring quantized coefficients, estimating a quantization parameter in quantization groups or quantization parameter prediction group units, generating an inverse quantization matrix for adaptive quantization, and generating transform coefficients from the quantized coefficients using the quantization parameter and the inverse quantization matrix.
    Type: Grant
    Filed: November 16, 2022
    Date of Patent: March 26, 2024
    Assignees: Hyundai Motor Company, Kia Corporation, Kwangwoon University Industry-Academic Collaboration Foundation
    Inventors: Dong Gyu Sim, Sea Nae Park, Jong Seok Lee, Seung Wook Park, Wha Pyeong Lim
  • Publication number: 20240089452
    Abstract: A method for reconstructing chroma blocks and a video decoding apparatus are disclosed. In accordance with one aspect of the present disclosure, provided is a method for reconstructing a chroma block of a target block to be reconstructed. The method includes decoding correlation information between first residual samples and second residual samples, the first residual samples, and prediction information of the chroma block from a bitstream, wherein the first residual samples are residual samples of a first chroma component and the second residual samples are residual samples of a second chroma component. The method further includes generating predicted samples of the first chroma component and predicted samples of the second chroma component on the basis of the prediction information, and deriving the second residual samples by applying the correlation information to the first residual samples.
    Type: Application
    Filed: November 15, 2023
    Publication date: March 14, 2024
    Inventors: Dong Gyu Sim, Sea Nae Park, Seung Wook Park, Wha Pyeong Lim
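The derivation step above (second chroma residual obtained by applying correlation information to the first) resembles joint Cb-Cr residual coding; as a hedged sketch, with the correlation reduced to a signaled sign (real codecs also signal a weight or shift):

```python
# Illustrative sketch: derive the second chroma component's residual from
# the first via decoded correlation information, here simplified to a
# pure sign flip (r_Cr = sign * r_Cb).

def derive_second_residual(first_residual, weight_sign):
    """first_residual: 2-D list of first-component residual samples.
    weight_sign: +1 or -1, the decoded correlation information."""
    return [[weight_sign * s for s in row] for row in first_residual]

r_cb = [[4, -2], [0, 6]]
print(derive_second_residual(r_cb, -1))  # [[-4, 2], [0, -6]]
```

Only one residual plane plus the correlation information travels in the bitstream; the other plane is reconstructed at the decoder.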
  • Publication number: 20240089455
    Abstract: A method for reconstructing chroma blocks and a video decoding apparatus are disclosed. In accordance with one aspect of the present disclosure, provided is a method for reconstructing a chroma block of a target block to be reconstructed. The method includes decoding correlation information between first residual samples and second residual samples, the first residual samples, and prediction information of the chroma block from a bitstream, wherein the first residual samples are residual samples of a first chroma component and the second residual samples are residual samples of a second chroma component. The method further includes generating predicted samples of the first chroma component and predicted samples of the second chroma component on the basis of the prediction information, and deriving the second residual samples by applying the correlation information to the first residual samples.
    Type: Application
    Filed: November 15, 2023
    Publication date: March 14, 2024
    Inventors: Dong Gyu Sim, Sea Nae Park, Seung Wook Park, Wha Pyeong Lim
  • Publication number: 20240089454
    Abstract: A method for reconstructing chroma blocks and a video decoding apparatus are disclosed. In accordance with one aspect of the present disclosure, provided is a method for reconstructing a chroma block of a target block to be reconstructed. The method includes decoding correlation information between first residual samples and second residual samples, the first residual samples, and prediction information of the chroma block from a bitstream, wherein the first residual samples are residual samples of a first chroma component and the second residual samples are residual samples of a second chroma component. The method further includes generating predicted samples of the first chroma component and predicted samples of the second chroma component on the basis of the prediction information, and deriving the second residual samples by applying the correlation information to the first residual samples.
    Type: Application
    Filed: November 15, 2023
    Publication date: March 14, 2024
    Inventors: Dong Gyu Sim, Sea Nae Park, Seung Wook Park, Wha Pyeong Lim
  • Publication number: 20240089456
    Abstract: A method for reconstructing chroma blocks and a video decoding apparatus are disclosed. In accordance with one aspect of the present disclosure, provided is a method for reconstructing a chroma block of a target block to be reconstructed. The method includes decoding correlation information between first residual samples and second residual samples, the first residual samples, and prediction information of the chroma block from a bitstream, wherein the first residual samples are residual samples of a first chroma component and the second residual samples are residual samples of a second chroma component. The method further includes generating predicted samples of the first chroma component and predicted samples of the second chroma component on the basis of the prediction information, and deriving the second residual samples by applying the correlation information to the first residual samples.
    Type: Application
    Filed: November 15, 2023
    Publication date: March 14, 2024
    Inventors: Dong Gyu Sim, Sea Nae Park, Seung Wook Park, Wha Pyeong Lim
  • Publication number: 20240089453
    Abstract: A method for reconstructing chroma blocks and a video decoding apparatus are disclosed. In accordance with one aspect of the present disclosure, provided is a method for reconstructing a chroma block of a target block to be reconstructed. The method includes decoding correlation information between first residual samples and second residual samples, the first residual samples, and prediction information of the chroma block from a bitstream, wherein the first residual samples are residual samples of a first chroma component and the second residual samples are residual samples of a second chroma component. The method further includes generating predicted samples of the first chroma component and predicted samples of the second chroma component on the basis of the prediction information, and deriving the second residual samples by applying the correlation information to the first residual samples.
    Type: Application
    Filed: November 15, 2023
    Publication date: March 14, 2024
    Inventors: Dong Gyu Sim, Sea Nae Park, Seung Wook Park, Wha Pyeong Lim
  • Patent number: 11930180
    Abstract: A method and an apparatus are configured for intra prediction coding of video data. An apparatus for decoding video data includes: a decoding unit which obtains, from a bitstream, information on a luma prediction mode and information on a chroma prediction mode of a current coding block; and an intra prediction unit which generates luma prediction samples and chroma prediction samples of the current coding block. The intra prediction unit derives a luma intra prediction type and a luma intra prediction mode of the current coding block on the basis of the information on the luma prediction mode, and determines a chroma intra prediction mode of the current coding block on the basis of the luma intra prediction type and the luma intra prediction mode of the current coding block, and the information on the chroma prediction mode.
    Type: Grant
    Filed: August 6, 2020
    Date of Patent: March 12, 2024
    Assignees: Hyundai Motor Company, Kia Motors Corporation, Kwangwoon University Industry-Academic Collaboration Foundation
    Inventors: Dong Gyu Sim, Joo Hyung Byeon, Sea Nae Park, Jun Taek Park, Seung Wook Park, Wha Pyeong Lim
  • Publication number: 20240078710
    Abstract: Disclosed herein are a method, an apparatus and a storage medium for encoding/decoding using a transform-based feature map. An optimal basis vector is extracted from one or more feature maps, and a transform coefficient is acquired through a transform using the basis vector. The basis vector and the transform coefficient may be transmitted through a bitstream. In an embodiment, one or more feature maps are reconstructed using the basis vector and the transform coefficient, which are decoded from the bitstream.
    Type: Application
    Filed: September 1, 2023
    Publication date: March 7, 2024
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Youn-Hee KIM, Jooyoung LEE, Se-Yoon JEONG, Jin-Soo CHOI, Dong-Gyu SIM, Na-Seong KWON, Seung-Jin PARK, Min-Hun LEE, Han-Sol CHOI
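The basis-vector transform described above (extract a basis from feature maps, transmit basis plus coefficients, reconstruct from both) can be sketched as a rank-1 projection; the single-basis, per-row projection below is an assumed toy, not the disclosed codec:

```python
# Illustrative sketch: code a feature map by projecting each row onto a
# basis vector and transmitting only the basis and the coefficients;
# reconstruction is the rank-1 outer product.

def encode_feature_map(feature_map, basis):
    """Project each row onto the basis; the scalars are the transform
    coefficients placed in the bitstream alongside the basis."""
    norm_sq = sum(b * b for b in basis)
    return [sum(x * b for x, b in zip(row, basis)) / norm_sq
            for row in feature_map]

def decode_feature_map(coeffs, basis):
    """Reconstruct each row as coefficient * basis."""
    return [[c * b for b in basis] for c in coeffs]

basis = [1, 2]              # assumed basis vector extracted by the encoder
fmap = [[2, 4], [3, 6]]     # rows exactly spanned by the basis
coeffs = encode_feature_map(fmap, basis)
print(coeffs)                              # [2.0, 3.0]
print(decode_feature_map(coeffs, basis))   # [[2.0, 4.0], [3.0, 6.0]]
```

Transmitting one coefficient per row instead of the full row is where the compression comes from; an "optimal" basis minimizes the reconstruction error for real feature maps.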
  • Publication number: 20240073416
    Abstract: The present invention relates to an apparatus and method for encoding and decoding an image by skip encoding. The image-encoding method by skip encoding, which performs intra-prediction, comprises: performing a filtering operation on the signal which is reconstructed prior to an encoding object signal in an encoding object image; using the filtered reconstructed signal to generate a prediction signal for the encoding object signal; setting the generated prediction signal as a reconstruction signal for the encoding object signal; and not encoding the residual signal which can be generated on the basis of the difference between the encoding object signal and the prediction signal, thereby performing skip encoding on the encoding object signal.
    Type: Application
    Filed: November 6, 2023
    Publication date: February 29, 2024
    Applicants: Electronics and Telecommunications Research Institute, Kwangwoon University Industry-Academic Collaboration Foundation, University-Industry Cooperation Group of Kyung Hee University
    Inventors: Sung Chang LIM, Ha Hyun LEE, Se Yoon JEONG, Hui Yong KIM, Suk Hee CHO, Jong Ho KIM, Jin Ho LEE, Jin Soo CHOI, Jin Woong KIM, Chie Teuk AHN, Dong Gyu SIM, Seoung Jun OH, Gwang Hoon PARK, Sea Nae PARK, Chan Woong JEON
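The skip-encoding flow above (filter the previously reconstructed signal, use it as the prediction, set the prediction as the reconstruction, and transmit no residual) can be sketched in a heavily simplified 1-D form; the filter and neighbour layout are assumptions:

```python
# Illustrative sketch: skip coding where a prediction built from filtered,
# previously reconstructed samples is used directly as the reconstruction,
# with no residual decoded or added.

def reconstruct_skip(left_recon, top_recon):
    """1-D toy: filter the left and top reconstructed neighbours, average
    them into a prediction, and return it as the reconstruction."""
    smooth = lambda a, b: (a + b + 1) // 2   # simple 2-tap low-pass, rounded
    filtered_left = smooth(left_recon[0], left_recon[1])
    filtered_top = smooth(top_recon[0], top_recon[1])
    return (filtered_left + filtered_top + 1) // 2  # prediction == recon

print(reconstruct_skip([10, 12], [20, 22]))  # 16
```

Because the residual is never encoded, the only cost of the block is the signaling that skip mode applies.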
  • Patent number: 11910023
    Abstract: A method and an apparatus for parallel encoding and decoding of moving picture data are provided. The method includes decoding, from a bitstream, a syntax element indicating that a picture can be decoded using wavefront parallel processing and decoding encoded data of the picture. The step of decoding encoded data of the picture includes for a first coding block of a current CTU row encoded in a palette mode, predicting a palette table for the first coding block by using palette data from a first CTU of a previous CTU row and decoding the first coding block in the palette mode using the palette table predicted for the first coding block.
    Type: Grant
    Filed: December 11, 2022
    Date of Patent: February 20, 2024
    Assignees: Hyundai Motor Company, Kia Corporation, Kwangwoon University Industry-Academic Collaboration Foundation
    Inventors: Dong Gyu Sim, Sea Nae Park, Joo Hyung Byeon, Seung Wook Park, Wha Pyeong Lim
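The wavefront-specific step above (the first palette-mode block of a CTU row predicting its palette table from data saved after the first CTU of the previous row) can be sketched as follows; the predictor structure and reuse flags are assumptions:

```python
# Illustrative sketch: under wavefront parallel processing (WPP), the first
# CTU of each row initializes its palette predictor from state saved after
# the first CTU of the row above, letting rows decode with a small lag.

def predict_palette_for_row_start(saved_palette_from_above, reuse_flags,
                                  new_entries):
    """saved_palette_from_above: palette data stored after the first CTU of
    the previous CTU row; reuse_flags: which predictor entries to keep;
    new_entries: colors signaled explicitly for this block."""
    reused = [color for color, keep in
              zip(saved_palette_from_above, reuse_flags) if keep]
    return reused + new_entries

above = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
table = predict_palette_for_row_start(above, [True, False, True], [(8, 8, 8)])
print(table)  # [(255, 0, 0), (0, 0, 255), (8, 8, 8)]
```

Snapshotting the predictor after the first CTU, rather than at the end of the row, is what keeps the inter-row dependency short enough for wavefront parallelism.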
  • Publication number: 20240056587
    Abstract: The present invention relates to a method and an apparatus for interlayer prediction. The method for interlayer prediction, according to the present invention, comprises the steps of: deciding whether to apply interlayer prediction to an enhancement layer; and, when interlayer prediction is applied, performing prediction on a current block of the enhancement layer based on reference information that is generalized and generated from a decoded reference picture of a reference layer, wherein the reference information can be encoding information and residual information of a reference block of the reference layer corresponding to the current block of the enhancement layer.
    Type: Application
    Filed: October 23, 2023
    Publication date: February 15, 2024
    Applicants: Electronics and Telecommunications Research Institute, KWANGWOON UNIVERSITY INDUSTRY-ACADEMIC COLLABORATION FOUNDATION
    Inventors: Ha Hyun LEE, Jung Won KANG, Jin Soo CHOI, Jin Woong KIM, Dong Gyu SIM, Hyo Min CHOI, Jung Hak NAM
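The interlayer flow above (generalize reference information from the decoded reference layer, then predict the enhancement-layer block from it) can be sketched in a minimal 1-D form; the generalization step and upsampling filter are assumptions:

```python
# Illustrative sketch: predict an enhancement-layer block from generalized
# reference-layer information, here modeled as reconstruction + residual of
# the co-located reference block, upsampled by sample repetition.

def interlayer_predict(ref_recon_row, ref_residual_row, scale=2):
    """1-D toy: generalize the reference layer by adding its residual, then
    upsample to the enhancement-layer resolution."""
    generalized = [r + res for r, res in zip(ref_recon_row, ref_residual_row)]
    return [s for s in generalized for _ in range(scale)]

print(interlayer_predict([10, 20], [1, -2]))  # [11, 11, 18, 18]
```

The enhancement layer then codes only the difference between its source block and this interlayer prediction.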