Patents by Inventor Jun-Young Jeong

Jun-Young Jeong has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11044456
    Abstract: Disclosed are a method and a player for outputting an overlay that displays additional information on a 360-degree video. According to the present invention, an image processing method includes: decoding an overlay; and rendering the decoded overlay on a 360-degree video on the basis of overlay-related information. Here, the overlay-related information includes information indicating the number of overlays and information indicating a unique identifier assigned to each overlay, and when multiple overlays are present, the identifiers assigned to the respective overlays differ.
    Type: Grant
    Filed: May 31, 2019
    Date of Patent: June 22, 2021
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Jun Young Jeong, Kug Jin Yun
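The abstract above requires that every overlay carry a unique identifier. A minimal sketch of that constraint (illustrative only; the `Overlay` type and field names are assumptions, not taken from the patent):

```python
# Illustrative sketch of the overlay-metadata constraint described in the
# abstract: each overlay carries a unique identifier, and when multiple
# overlays are present their identifiers must all differ.
from dataclasses import dataclass

@dataclass
class Overlay:
    overlay_id: int   # unique identifier assigned to this overlay (assumed field name)
    content: bytes    # decoded overlay payload

def identifiers_are_unique(overlays):
    """Return True if all overlay identifiers differ."""
    ids = [o.overlay_id for o in overlays]
    return len(ids) == len(set(ids))
```

A conforming player could run such a check after decoding the overlay-related information and before rendering.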
  • Patent number: 11037362
    Abstract: Provided are a method and an apparatus for generating a three-dimensional (3D) virtual viewpoint image, including: segmenting a first image into a plurality of images indicating different layers based on depth information of the first image at a gaze point of a user; and inpainting an area occluded by the foreground in the plurality of images based on depth information of a reference viewpoint image.
    Type: Grant
    Filed: June 26, 2020
    Date of Patent: June 15, 2021
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Hong-Chang Shin, Gwang Soon Lee, Jun Young Jeong
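The layer-segmentation step in the abstract above can be pictured as thresholding a depth map into intervals. A minimal sketch under assumed conventions (interval boundaries and masking scheme are illustrative, not the patented method):

```python
# Illustrative sketch: split an image into depth layers by bucketing its
# depth map into intervals, echoing the abstract's "plurality of images
# indicating different layers based on depth information".
import numpy as np

def split_into_layers(image, depth, boundaries):
    """Return one masked copy of `image` per depth interval.
    `boundaries` are assumed interval edges, e.g. [0.0, 0.5, 1.0]."""
    layers = []
    for near, far in zip(boundaries[:-1], boundaries[1:]):
        mask = (depth >= near) & (depth < far)
        # keep pixels in this depth band, zero out the rest
        layers.append(np.where(mask[..., None], image, 0))
    return layers

image = np.ones((4, 4, 3), dtype=np.uint8)
depth = np.tile(np.linspace(0.0, 0.99, 4), (4, 1))
layers = split_into_layers(image, depth, [0.0, 0.5, 1.0])
```

Inpainting the regions occluded by the foreground would then operate per layer, guided by the reference viewpoint's depth, as the abstract describes.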
  • Publication number: 20210099687
    Abstract: Disclosed herein is an immersive video processing method. The immersive video processing method includes: determining a priority order of pruning for input videos; extracting patches from the input videos based on the priority order of pruning; generating at least one atlas based on the extracted patches; and encoding metadata. Herein, the metadata may include information on a priority order of pruning among input videos.
    Type: Application
    Filed: September 25, 2020
    Publication date: April 1, 2021
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Hong Chang SHIN, Gwang Soon LEE, Ho Min EUM, Jun Young JEONG, Kug Jin YUN
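The pipeline in the abstract above starts by fixing a pruning order and then records that order in the metadata. A minimal sketch (the ordering criterion, field names, and metadata layout are assumptions for illustration, not the patent's syntax):

```python
# Illustrative sketch of the first and last steps in the abstract:
# determine a priority order of pruning for the input views, then encode
# that order into the metadata.

def pruning_order(views):
    """Order views for pruning. The criterion here (distance from the
    scene center, via an assumed 'center_distance' field) is illustrative;
    the patent text does not specify one."""
    return sorted(views, key=lambda v: v["center_distance"])

def encode_metadata(ordered_views):
    # Per the abstract, the metadata includes the pruning order itself.
    return {"pruning_order": [v["id"] for v in ordered_views]}

views = [{"id": "v2", "center_distance": 3.0},
         {"id": "v0", "center_distance": 1.0},
         {"id": "v1", "center_distance": 2.0}]
meta = encode_metadata(pruning_order(views))
```

Patch extraction and atlas generation would then consume the views in this order, pruning each view against those already processed.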
  • Publication number: 20210092346
    Abstract: Disclosed herein is an immersive video processing method. The immersive video processing method includes: determining a priority order of pruning for source videos; extracting patches from the source videos based on the priority order of pruning; generating at least one atlas based on the extracted patches; and encoding metadata. Herein, the metadata may include first threshold information that becomes a criterion for distinguishing between a valid pixel and an invalid pixel in the atlas video.
    Type: Application
    Filed: September 23, 2020
    Publication date: March 25, 2021
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Gwang Soon LEE, Hong Chang SHIN, Ho Min EUM, Jun Young JEONG
  • Publication number: 20210067757
    Abstract: Disclosed herein is an immersive video processing method. The immersive video processing method includes: determining a priority order of pruning for source videos; extracting patches from the source videos based on the priority order of pruning; generating at least one atlas based on the extracted patches; and encoding metadata. Herein, a first flag indicating whether or not an atlas includes a patch including information on an entire region of a first source video may be encoded into the metadata.
    Type: Application
    Filed: August 28, 2020
    Publication date: March 4, 2021
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Kug Jin YUN, Jun Young JEONG, Gwang Soon LEE, Hong Chang SHIN, Ho Min EUM
  • Publication number: 20210006830
    Abstract: Disclosed herein is an immersive video processing method. The immersive video processing method may include classifying a multiplicity of source view videos into base view videos and additional view videos, generating residual data for the additional view videos, packing a patch, which is generated based on the residual data, into an atlas video, and generating metadata for the patch.
    Type: Application
    Filed: March 19, 2020
    Publication date: January 7, 2021
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Kug Jin YUN, Jun Young JEONG, Gwang Soon LEE, Hong Chang SHIN, Ho Min EUM, Sang Woon KWAK
  • Publication number: 20210006831
    Abstract: Disclosed herein is an image encoding/decoding method and apparatus for virtual view synthesis. The image decoding method for virtual view synthesis may include decoding texture information and depth information of one or more basic view images and one or more additional view images from a bitstream and synthesizing a virtual view on the basis of the texture information and the depth information, wherein the basic view image and the additional view image comprise a non-empty region and an empty region, and wherein the synthesizing of the virtual view comprises determining the non-empty region through a specific value in the depth information and a threshold and synthesizing the virtual view by using the determined non-empty region.
    Type: Application
    Filed: March 19, 2020
    Publication date: January 7, 2021
    Applicants: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE, Poznan University of Technology
    Inventors: Gwang Soon LEE, Jun Young JEONG, Hong Chang SHIN, Kug Jin YUN, Marek Domanski, Olgierd Stankiewicz, Dawid Mieloch, Adrian Dziembowski, Adam Grzelka, Jakub Stankowski
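The non-empty-region test in the abstract above compares depth values against a designated "empty" value and a threshold. A minimal sketch of one plausible reading (the comparison rule is an assumption; the patent only says a specific depth value and a threshold are used):

```python
# Illustrative sketch: mark a view's non-empty region by comparing depth
# values against a designated "empty" depth value and a threshold.
import numpy as np

def non_empty_mask(depth, empty_value, threshold):
    """Pixels whose depth differs from the designated empty value by more
    than `threshold` are treated as non-empty (assumed rule)."""
    return np.abs(depth - empty_value) > threshold

depth = np.array([[0.0, 0.0],
                  [0.7, 1.0]])
mask = non_empty_mask(depth, empty_value=0.0, threshold=0.05)
```

Virtual view synthesis would then warp and blend only the pixels selected by this mask.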
  • Publication number: 20210006764
    Abstract: An immersive video processing method according to the present disclosure includes determining a priority order of pruning for source view videos, generating a residual video for an additional view video based on the priority order of pruning, packing a patch generated based on the residual video into an atlas video, and encoding the atlas video.
    Type: Application
    Filed: July 6, 2020
    Publication date: January 7, 2021
    Applicants: Electronics and Telecommunications Research Institute, IUCF-HYU (INDUSTRY-UNIVERSITY COOPERATION FOUNDATION HANYANG UNIVERSITY)
    Inventors: Hong Chang SHIN, Gwang Soon LEE, Ho Min EUM, Jun Young JEONG, Jong Il PARK, Jun Young YUN
  • Publication number: 20200413094
    Abstract: Disclosed herein are an image encoding/decoding method and apparatus and a recording medium storing a bitstream.
    Type: Application
    Filed: June 12, 2020
    Publication date: December 31, 2020
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Gwang Soon LEE, Hong Chang SHIN, Kug Jin YUN, Jun Young JEONG
  • Publication number: 20200410746
    Abstract: Provided are a method and an apparatus for generating a three-dimensional (3D) virtual viewpoint image, including: segmenting a first image into a plurality of images indicating different layers based on depth information of the first image at a gaze point of a user; and inpainting an area occluded by the foreground in the plurality of images based on depth information of a reference viewpoint image.
    Type: Application
    Filed: June 26, 2020
    Publication date: December 31, 2020
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Hong-Chang SHIN, Gwang Soon LEE, Jun Young JEONG
  • Publication number: 20200396485
    Abstract: A video encoding method of encoding a multi-view image including one or more basic view images and a plurality of reference view images includes determining a pruning order of the plurality of reference view images, acquiring a plurality of residual reference view images, by pruning the plurality of reference view images based on the one or more basic view images according to the pruning order, encoding the one or more basic view images and the plurality of residual reference view images, and outputting a bitstream including encoding information of the one or more basic view images and the plurality of residual reference view images.
    Type: Application
    Filed: June 15, 2020
    Publication date: December 17, 2020
    Applicants: Electronics and Telecommunications Research Institute, IUCF-HYU (INDUSTRY-UNIVERSITY COOPERATION FOUNDATION HANYANG UNIVERSITY)
    Inventors: Hong Chang SHIN, Gwang Soon LEE, Ho Min EUM, Jun Young JEONG, Kug Jin YUN, Jun Young YUN, Jong Il PARK
  • Patent number: 10848837
    Abstract: Disclosed are an apparatus and a method of providing a high-quality 360-degree VR image. A method of decoding a 360-degree VR image according to the present disclosure includes: receiving a bitstream including 360-degree VR image information; decoding information related to a 360-degree VR service from the bitstream; detecting a region of interest based on the information related to the 360-degree VR service; and providing to a user a 360-degree VR image for the region of interest.
    Type: Grant
    Filed: April 3, 2018
    Date of Patent: November 24, 2020
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Kug Jin Yun, Jun Young Jeong
  • Publication number: 20200359000
    Abstract: Disclosed herein is an immersive video processing method. The immersive video processing method may include classifying a multiplicity of view videos into base views and additional views, generating a residual video for each view video classified as an additional view, packing a patch, which is generated based on the residual video, into an atlas video, and generating metadata for the patch.
    Type: Application
    Filed: March 20, 2020
    Publication date: November 12, 2020
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Hong Chang SHIN, Gwang Soon LEE, Sang Woon KWAK, Kug Jin YUN, Jun Young JEONG
  • Publication number: 20200336724
    Abstract: Disclosed herein are an immersive video formatting method and apparatus for supporting motion parallax. The immersive video formatting method includes acquiring a basic video at a basic position, acquiring a multiple-view video at at least one position different from the basic position, acquiring at least one residual video plus depth (RVD) video using the basic video and the multiple-view video, and generating at least one of a packed video plus depth (PVD) video or predetermined metadata using the acquired basic video and the at least one RVD video.
    Type: Application
    Filed: January 31, 2020
    Publication date: October 22, 2020
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Gwang Soon LEE, Hong Chang SHIN, Kug Jin YUN, Jun Young JEONG
  • Publication number: 20200084516
    Abstract: Disclosed are an apparatus and a method of providing a high-quality 360-degree VR image. A method of decoding a 360-degree VR image according to the present disclosure includes: receiving a bitstream including 360-degree VR image information; decoding information related to a 360-degree VR service from the bitstream; detecting a region of interest based on the information related to the 360-degree VR service; and providing to a user a 360-degree VR image for the region of interest.
    Type: Application
    Filed: April 3, 2018
    Publication date: March 12, 2020
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Kug Jin YUN, Jun Young JEONG
  • Publication number: 20190373243
    Abstract: Disclosed are a method and a player for outputting an overlay that displays additional information on a 360-degree video. According to the present invention, an image processing method includes: decoding an overlay; and rendering the decoded overlay on a 360-degree video on the basis of overlay-related information. Here, the overlay-related information includes information indicating the number of overlays and information indicating a unique identifier assigned to each overlay, and when multiple overlays are present, the identifiers assigned to the respective overlays differ.
    Type: Application
    Filed: May 31, 2019
    Publication date: December 5, 2019
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Jun Young JEONG, Kug Jin YUN
  • Publication number: 20160063748
    Abstract: An operating method of an electronic device is provided. The method includes determining whether a present point in time corresponds to a predetermined period or event, obtaining a content associated with the predetermined period or event, and providing the content as at least a part of a user interface displayed on a user interface screen of the electronic device.
    Type: Application
    Filed: September 1, 2015
    Publication date: March 3, 2016
    Inventors: Ji-Hyun KIM, Wook-Hyun Jeong, Hye-Jin Kang, Jee-Youn Kim, Hyun-Seok Kim, Seok-Hee Na, Ha-Yang Seo, Sun-Mi You, Bo-Na Lee, Dong-Hoe Lim, Yoon-Cheung Chang, Min-Woo Chong, Jun-Young Jeong, Ji-Hea Park, Yu-Jeong Jeon
  • Patent number: 9075269
    Abstract: An array substrate includes a substrate; gate lines over the substrate along a first direction; data lines over the substrate along a second direction and crossing the gate lines to define pixel regions; a thin film transistor at each crossing portion of the gate and data lines; an insulating layer covering the thin film transistor and having a flat top surface; a common electrode on the insulating layer all over the substrate; a common line on the common electrode; a passivation layer on the common line; and a pixel electrode on the passivation layer in each pixel region and connected to the thin film transistor, the pixel electrode including electrode patterns, wherein the passivation layer has a step height at a top surface of the passivation layer due to the plurality of common lines.
    Type: Grant
    Filed: November 15, 2013
    Date of Patent: July 7, 2015
    Assignee: LG Display Co., Ltd.
    Inventors: Kyung-Mo Son, Jae-Kyun Lee, Sung-Chol Yi, Taek-Jun Jung, Sun-Ju Ku, Soon-Hwan Hong, Sang-Su Jang, Jun-Young Jeong, Eun-Hye Lee
  • Publication number: 20140168554
    Abstract: An array substrate includes a substrate; gate lines over the substrate along a first direction; data lines over the substrate along a second direction and crossing the gate lines to define pixel regions; a thin film transistor at each crossing portion of the gate and data lines; an insulating layer covering the thin film transistor and having a flat top surface; a common electrode on the insulating layer all over the substrate; a common line on the common electrode; a passivation layer on the common line; and a pixel electrode on the passivation layer in each pixel region and connected to the thin film transistor, the pixel electrode including electrode patterns, wherein the passivation layer has a step height at a top surface of the passivation layer due to the plurality of common lines.
    Type: Application
    Filed: November 15, 2013
    Publication date: June 19, 2014
    Applicant: LG DISPLAY CO., LTD.
    Inventors: Kyung-Mo SON, Jae-Kyun LEE, Sung-Chol YI, Taek-Jun JUNG, Sun-Ju KU, Soon-Hwan HONG, Sang-Su JANG, Jun-Young JEONG, Eun-Hye LEE
  • Patent number: 8654915
    Abstract: A control signal receiver includes a converting circuit and a synchronization detection circuit. The converting circuit generates a complex control symbol stream including transmission configurations by converting an input signal. The synchronization detection circuit generates a first bit stream by applying a first determination criterion to the complex control symbol stream and generates a first synchronization signal by comparing the first bit stream with a reference synchronization word. The synchronization detection circuit generates a second bit stream by applying the first determination criterion and a second determination criterion to the complex control symbol stream in that order and generates a second synchronization signal by comparing the second bit stream with the reference synchronization word. The synchronization detection circuit outputs one of the first synchronization signal and the second synchronization signal as a synchronization enable signal.
    Type: Grant
    Filed: May 15, 2012
    Date of Patent: February 18, 2014
    Assignee: Samsung Electronics Co., Ltd.
    Inventor: Jun-Young Jeong