Patents by Inventor Young-Mi Cha

Young-Mi Cha has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190313082
    Abstract: An apparatus and method for measuring the position of a stereo camera. The apparatus for measuring a position of the camera according to an embodiment includes a feature point extraction unit for extracting feature points from images captured by a first camera and a second camera and generating a first feature point list based on the feature points, a feature point recognition unit for extracting feature points from images captured by the cameras after the cameras have moved, generating a second feature point list based on the feature points, and recognizing actual feature points based on the first feature point list and the second feature point list, and a position variation measurement unit for measuring variation in positions of the cameras based on variation in relative positions of the actual feature points.
    Type: Application
    Filed: June 5, 2019
    Publication date: October 10, 2019
    Inventors: Jae-Hean KIM, Hyun KANG, Soon-Chul JUNG, Young-Mi CHA, Jin-Sung CHOI
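The position-variation measurement described above can be illustrated with a deliberately simplified sketch: assuming a static scene, features already matched between the two lists, and negligible rotation, the rig's translation is the negative mean displacement of the feature points recognized in both lists. The function name and data layout are illustrative, not from the patent.

```python
def estimate_translation(first_list, second_list):
    """Estimate how the stereo rig translated between captures.

    first_list and second_list map a feature id to its (x, y, z)
    position in the camera frame before and after the move. For a
    static scene, points appear displaced by the opposite of the
    camera's own translation, so the camera motion is the negative
    mean displacement of the features recognized in both lists.
    """
    common = sorted(first_list.keys() & second_list.keys())
    if not common:
        raise ValueError("no feature points recognized in both lists")
    disp = [0.0, 0.0, 0.0]
    for fid in common:
        for k in range(3):
            disp[k] += second_list[fid][k] - first_list[fid][k]
    n = len(common)
    return tuple(-d / n for d in disp)
```

Features that appear in only one list (e.g. points that left the field of view) are simply ignored, which mirrors the abstract's notion of recognizing "actual" feature points common to both lists.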
  • Patent number: 10356394
    Abstract: An apparatus and method for measuring the position of a stereo camera. The apparatus for measuring a position of the camera according to an embodiment includes a feature point extraction unit for extracting feature points from images captured by a first camera and a second camera and generating a first feature point list based on the feature points, a feature point recognition unit for extracting feature points from images captured by the cameras after the cameras have moved, generating a second feature point list based on the feature points, and recognizing actual feature points based on the first feature point list and the second feature point list, and a position variation measurement unit for measuring variation in positions of the cameras based on variation in relative positions of the actual feature points.
    Type: Grant
    Filed: February 18, 2016
    Date of Patent: July 16, 2019
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Jae-Hean Kim, Hyun Kang, Soon-Chul Jung, Young-Mi Cha, Jin-Sung Choi
  • Patent number: 9858684
    Abstract: An image processing apparatus and method for calibrating a depth of a depth sensor. The image processing method may include obtaining a depth image of a target object captured by a depth sensor and a color image of the target object captured by a color camera; and calibrating a depth of the depth sensor by calibrating a geometrical relation between a projector and a depth camera, which are included in the depth sensor, based on the obtained depth and color images and calculating a corrected feature point on an image plane of the depth camera that corresponds to a feature point of an image plane of the projector.
    Type: Grant
    Filed: July 28, 2014
    Date of Patent: January 2, 2018
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Jae-Hean Kim, Chang-Woo Chu, Young-Mi Cha, Jin-Sung Choi, Bon-Ki Koo
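As a hedged illustration of depth calibration (the patented projector-camera geometry is considerably more involved), a linear depth-error model corrected = a * measured + b can be fitted by ordinary least squares against reference depths recovered through the color camera. All names and the linear model itself are assumptions for illustration:

```python
def fit_depth_correction(measured, reference):
    """Fit corrected_depth = a * measured_depth + b by least squares.

    measured: depths reported by the depth sensor at calibration points.
    reference: depths of the same points recovered through the color
    camera (e.g. from a calibration target of known geometry).
    """
    n = len(measured)
    mx = sum(measured) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in measured)
    sxy = sum((x - mx) * (y - my) for x, y in zip(measured, reference))
    a = sxy / sxx          # slope of the correction line
    b = my - a * mx        # intercept
    return a, b
```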
  • Patent number: 9710963
    Abstract: A primitive fitting apparatus is provided. The primitive fitting apparatus may include a selecting unit to receive, from a user, a selection of the points used to fit a primitive that the user desires to fit from a point cloud, an identifying unit to receive a selection of the primitive from the user and to identify the selected primitive, and a fitting unit to fit the primitive to the points, using both the points and the primitive.
    Type: Grant
    Filed: February 28, 2014
    Date of Patent: July 18, 2017
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Young Mi Cha, Chang Woo Chu, Jae Hean Kim, Il Kyu Park, Bon Ki Koo, Jin Sung Choi
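The fitting step can be sketched for one concrete primitive. Assuming the user selected points lying near a sphere, an algebraic least-squares fit is linear in the center and an offset term; this pure-Python sketch (names illustrative, not the patented method) solves the 4x4 normal equations directly:

```python
def fit_sphere(points):
    """Least-squares sphere fit. Each point p on the sphere satisfies
    |p|^2 = 2 c.p + k, with center c and k = r^2 - |c|^2, which is
    linear in (c, k). Accumulate the 4x4 normal equations and solve
    them by Gaussian elimination with partial pivoting."""
    M = [[0.0] * 5 for _ in range(4)]  # augmented normal equations
    for x, y, z in points:
        row = (2 * x, 2 * y, 2 * z, 1.0)
        rhs = x * x + y * y + z * z
        for i in range(4):
            for j in range(4):
                M[i][j] += row[i] * row[j]
            M[i][4] += row[i] * rhs
    for col in range(4):  # forward elimination
        piv = max(range(col, 4), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 4):
            f = M[r][col] / M[col][col]
            for c in range(col, 5):
                M[r][c] -= f * M[col][c]
    sol = [0.0] * 4
    for i in range(3, -1, -1):  # back substitution
        sol[i] = (M[i][4] - sum(M[i][j] * sol[j]
                                for j in range(i + 1, 4))) / M[i][i]
    cx, cy, cz, k = sol
    radius = (k + cx * cx + cy * cy + cz * cz) ** 0.5
    return (cx, cy, cz), radius
```

With noise-free points the fit is exact; with scanner noise the algebraic fit is a common initializer for a geometric (orthogonal-distance) refinement.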
  • Patent number: 9619892
    Abstract: Provided is an apparatus and method for extracting a movement path, the movement path extracting apparatus including an image receiver to receive an image from a camera group in which a mutual positional relationship among cameras is fixed, a geographic coordinates receiver to receive geographic coordinates of a moving object on which the camera group is fixed, and a movement path extractor to extract a movement path of the camera group based on a direction and a position of a reference camera of the camera group using the image and the geographic coordinates.
    Type: Grant
    Filed: May 6, 2014
    Date of Patent: April 11, 2017
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Chang Woo Chu, Jae Hean Kim, Il Kyu Park, Young Mi Cha, Jin Sung Choi, Bon Ki Koo
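A minimal sketch of turning the received geographic coordinates into a movement path: project the (latitude, longitude) fixes of the moving object into local planar metres using an equirectangular approximation about the first fix. This ignores the reference-camera direction and altitude, and is illustrative only:

```python
from math import radians, cos

EARTH_R = 6371000.0  # mean Earth radius, metres

def to_local_path(geo_coords):
    """Project (lat, lon) fixes into local planar metres relative to
    the first fix. Longitude spacing shrinks by cos(latitude), which
    the equirectangular approximation captures with a single factor."""
    lat0, lon0 = geo_coords[0]
    k = cos(radians(lat0))
    path = []
    for lat, lon in geo_coords:
        x = radians(lon - lon0) * EARTH_R * k   # east, metres
        y = radians(lat - lat0) * EARTH_R       # north, metres
        path.append((x, y))
    return path
```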
  • Patent number: 9613457
    Abstract: Provided are a multi-primitive fitting method that includes acquiring point cloud data by collecting data for each of the input points, obtaining a segment for the points using the point cloud data, and performing primitive fitting using the point cloud data and the data of the points included in the segment, and a multi-primitive fitting device that performs the method.
    Type: Grant
    Filed: January 28, 2015
    Date of Patent: April 4, 2017
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Young Mi Cha, Chang Woo Chu, Jae Hean Kim
  • Publication number: 20160379354
    Abstract: An apparatus and method for measuring the position of a stereo camera. The apparatus for measuring a position of the camera according to an embodiment includes a feature point extraction unit for extracting feature points from images captured by a first camera and a second camera and generating a first feature point list based on the feature points, a feature point recognition unit for extracting feature points from images captured by the cameras after the cameras have moved, generating a second feature point list based on the feature points, and recognizing actual feature points based on the first feature point list and the second feature point list, and a position variation measurement unit for measuring variation in positions of the cameras based on variation in relative positions of the actual feature points.
    Type: Application
    Filed: February 18, 2016
    Publication date: December 29, 2016
    Inventors: Jae-Hean KIM, Hyun KANG, Soon-Chul JUNG, Young-Mi CHA, Jin-Sung CHOI
  • Publication number: 20160364905
    Abstract: An apparatus and method for generating a 3D model, in which point clouds from a scanner are automatically aligned using marker information, with markers attached to an object. The apparatus includes a point cloud refinement unit for receiving point clouds and marker sets from scanning of an object, refining the point clouds, and transmitting refined point clouds and marker sets, a point cloud alignment unit for querying a point cloud and marker set database for a point cloud group of point cloud pairs and a marker set group of marker set pairs, and aligning the point cloud group, wherein the point cloud pairs and the marker set pairs are acquired by scanning a single object in different directions, and a surface generation unit for generating a 3D model by reconstructing a surface of the object based on the aligned point cloud group.
    Type: Application
    Filed: February 17, 2016
    Publication date: December 15, 2016
    Inventors: Soon-Chul JUNG, Hyun KANG, Jae-Hean KIM, Young-Mi CHA, Jin-Sung CHOI
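The marker-based alignment can be illustrated in 2D, where the rigid transform mapping one scan's markers onto the matched markers of another has a closed form: align centroids, then recover a single rotation angle. The patent operates on 3D scans; this 2D sketch only shows the centroid-and-rotation idea, with all names illustrative:

```python
from math import atan2, cos, sin

def align_2d(markers_a, markers_b):
    """Closed-form 2D rigid alignment (rotation + translation) mapping
    scan A's markers onto scan B's matched markers. Returns a function
    that applies the recovered transform to any 2D point."""
    n = len(markers_a)
    ax = sum(p[0] for p in markers_a) / n
    ay = sum(p[1] for p in markers_a) / n
    bx = sum(q[0] for q in markers_b) / n
    by = sum(q[1] for q in markers_b) / n
    s_dot = s_cross = 0.0
    for (x, y), (u, v) in zip(markers_a, markers_b):
        xc, yc, uc, vc = x - ax, y - ay, u - bx, v - by
        s_dot += xc * uc + yc * vc      # alignment of centered pairs
        s_cross += xc * vc - yc * uc    # signed rotation evidence
    th = atan2(s_cross, s_dot)          # least-squares rotation angle
    c, s = cos(th), sin(th)
    tx = bx - (c * ax - s * ay)
    ty = by - (s * ax + c * ay)
    def apply(p):
        return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)
    return apply
```

In 3D the same centroid step applies, but the rotation is usually recovered with an SVD-based solver (the Kabsch algorithm) over the matched marker sets.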
  • Patent number: 9390330
    Abstract: Disclosed herein is an apparatus and method for extracting correspondences between aerial images. The apparatus includes a line extraction unit, a line direction determination unit, a building top area extraction unit, and a correspondence extraction unit. The line extraction unit extracts lines corresponding to buildings from aerial images. The line direction determination unit defines the directions of the lines as x, y and z axis directions based on a two-dimensional (2D) coordinate system. The building top area extraction unit rotates lines in the x and y axis directions so that the lines are arranged in parallel with the horizontal and vertical directions of the 2D image, and then extracts building top areas from rectangles. The correspondence extraction unit extracts correspondences between the aerial images by comparing the locations of the building top areas extracted from the aerial images.
    Type: Grant
    Filed: December 14, 2011
    Date of Patent: July 12, 2016
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Il-Kyu Park, Chang-Woo Chu, Young-Mi Cha, Bon-Ki Koo
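One piece of this pipeline, rotating lines so they run parallel to the image axes, can be sketched as estimating a dominant angle with a circular mean of period 90 degrees (angle quadrupling). The averaging scheme here is a standard trick for rectilinear structure, not necessarily the patented procedure:

```python
from math import atan2, cos, sin

def axis_align_angle(lines):
    """Estimate the rotation that makes roughly rectilinear 2D lines
    axis-parallel. Each line is ((x1, y1), (x2, y2)). Multiplying each
    line angle by 4 folds perpendicular and opposite directions onto
    one another, so a plain circular mean over the quadrupled angles
    yields the shared grid orientation."""
    s = c = 0.0
    for (x1, y1), (x2, y2) in lines:
        a = atan2(y2 - y1, x2 - x1)
        s += sin(4 * a)
        c += cos(4 * a)
    return atan2(s, c) / 4.0  # rotate the image by -this angle
```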
  • Patent number: 9208607
    Abstract: Disclosed are an apparatus and a method of producing a 3D model in which a 3D model having a static background is produced using a point cloud and an image obtained through 3D scanning. The apparatus includes an image matching unit for producing a matched image by matching a point cloud obtained by scanning a predetermined region to a camera image obtained by photographing the predetermined region; a mesh model processing unit for producing an object positioned in the region as a mesh model; and a 3D model processing unit for producing a 3D model for the object by applying texture information obtained from the matched image to the mesh model. The disclosed apparatus and method may be used for a 3D map service.
    Type: Grant
    Filed: September 14, 2012
    Date of Patent: December 8, 2015
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Chang Woo Chu, Il Kyu Park, Young Mi Cha, Ji Hyung Lee, Bon Ki Koo
  • Publication number: 20150279016
    Abstract: An image processing apparatus and method for calibrating a depth of a depth sensor. The image processing method may include obtaining a depth image of a target object captured by a depth sensor and a color image of the target object captured by a color camera; and calibrating a depth of the depth sensor by calibrating a geometrical relation between a projector and a depth camera, which are included in the depth sensor, based on the obtained depth and color images and calculating a corrected feature point on an image plane of the depth camera that corresponds to a feature point of an image plane of the projector.
    Type: Application
    Filed: July 28, 2014
    Publication date: October 1, 2015
    Inventors: Jae-Hean KIM, Chang-Woo CHU, Young-Mi CHA, Jin-Sung CHOI, Bon-Ki KOO
  • Patent number: 9147249
    Abstract: Disclosed is a method of calibrating a depth image based on the relationship between a depth sensor and a color camera. An apparatus for calibrating a depth image may include a three-dimensional (3D) point determiner to determine a 3D point of a camera image and a 3D point of a depth image captured simultaneously with the camera image, a calibration information determiner to determine calibration information for calibrating an error of a depth image captured by the depth sensor and geometric information between the depth sensor and the color camera, using the 3D point of the camera image and the 3D point of the depth image, and a depth image calibrator to calibrate the depth image based on the calibration information and the 3D point of the depth image.
    Type: Grant
    Filed: October 22, 2013
    Date of Patent: September 29, 2015
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Jae Hean Kim, Chang Woo Chu, Il Kyu Park, Young Mi Cha, Jin Sung Choi, Bon Ki Koo
  • Publication number: 20150213644
    Abstract: Provided are a multi-primitive fitting method that includes acquiring point cloud data by collecting data for each of the input points, obtaining a segment for the points using the point cloud data, and performing primitive fitting using the point cloud data and the data of the points included in the segment, and a multi-primitive fitting device that performs the method.
    Type: Application
    Filed: January 28, 2015
    Publication date: July 30, 2015
    Inventors: Young Mi CHA, Chang Woo CHU, Jae Hean KIM
  • Publication number: 20140334675
    Abstract: Provided is an apparatus and method for extracting a movement path, the movement path extracting apparatus including an image receiver to receive an image from a camera group in which a mutual positional relationship among cameras is fixed, a geographic coordinates receiver to receive geographic coordinates of a moving object on which the camera group is fixed, and a movement path extractor to extract a movement path of the camera group based on a direction and a position of a reference camera of the camera group using the image and the geographic coordinates.
    Type: Application
    Filed: May 6, 2014
    Publication date: November 13, 2014
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Chang Woo CHU, Jae Hean KIM, Il Kyu PARK, Young Mi CHA, Jin Sung CHOI, Bon Ki KOO
  • Publication number: 20140245231
    Abstract: A primitive fitting apparatus is provided. The primitive fitting apparatus may include a selecting unit to receive, from a user, a selection of the points used to fit a primitive that the user desires to fit from a point cloud, an identifying unit to receive a selection of the primitive from the user and to identify the selected primitive, and a fitting unit to fit the primitive to the points, using both the points and the primitive.
    Type: Application
    Filed: February 28, 2014
    Publication date: August 28, 2014
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Young Mi CHA, Chang Woo CHU, Jae Hean KIM, Il Kyu PARK, Bon Ki KOO, Jin Sung CHOI
  • Publication number: 20140218354
    Abstract: A view image providing device and method are provided. The view image providing device may include a panorama image generation unit to generate a panorama image using a cube map including a margin area by obtaining an omnidirectional image, a mesh information generation unit to generate 3-dimensional (3D) mesh information that uses the panorama image as a texture by obtaining 3D data, and a user data rendering unit to render the panorama image and the mesh information into user data according to a position and direction input by a user.
    Type: Application
    Filed: December 11, 2013
    Publication date: August 7, 2014
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Il Kyu PARK, Young Mi CHA, Chang Woo CHU, Jae Hean KIM, Jin Sung CHOI, Bon Ki KOO
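The cube-map lookup underlying such a panorama can be sketched as mapping a view direction to a face and (u, v) coordinates on that face. The sign and face-naming conventions below are an illustrative choice (not the patent's layout, and not OpenGL's), and the margin-area handling is omitted:

```python
def cube_face(x, y, z):
    """Map a 3D view direction to the cube face it hits, plus (u, v)
    coordinates in [0, 1] on that face. The face is chosen by the
    component with the largest magnitude; the remaining two
    components, divided by it, locate the hit point on the face."""
    ax, ay, az = abs(x), abs(y), abs(z)
    m = max(ax, ay, az)
    if m == 0:
        raise ValueError("zero direction")
    if m == ax:
        face, a, b = ('+x' if x > 0 else '-x'), y / ax, z / ax
    elif m == ay:
        face, a, b = ('+y' if y > 0 else '-y'), x / ay, z / ay
    else:
        face, a, b = ('+z' if z > 0 else '-z'), x / az, y / az
    return face, (a + 1) / 2, (b + 1) / 2  # remap [-1, 1] to [0, 1]
```

A margin area, as the abstract mentions, would extend each face slightly past the [0, 1] range so that texture filtering near face edges can sample neighboring faces without seams.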
  • Publication number: 20140112574
    Abstract: Disclosed is a method of calibrating a depth image based on the relationship between a depth sensor and a color camera. An apparatus for calibrating a depth image may include a three-dimensional (3D) point determiner to determine a 3D point of a camera image and a 3D point of a depth image captured simultaneously with the camera image, a calibration information determiner to determine calibration information for calibrating an error of a depth image captured by the depth sensor and geometric information between the depth sensor and the color camera, using the 3D point of the camera image and the 3D point of the depth image, and a depth image calibrator to calibrate the depth image based on the calibration information and the 3D point of the depth image.
    Type: Application
    Filed: October 22, 2013
    Publication date: April 24, 2014
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Jae Hean KIM, Chang Woo CHU, Il Kyu PARK, Young Mi CHA, Jin Sung CHOI, Bon Ki KOO
  • Publication number: 20130207966
    Abstract: Disclosed are an apparatus and a method of producing a 3D model in which a 3D model having a static background is produced using a point cloud and an image obtained through 3D scanning. The apparatus includes an image matching unit for producing a matched image by matching a point cloud obtained by scanning a predetermined region to a camera image obtained by photographing the predetermined region; a mesh model processing unit for producing an object positioned in the region as a mesh model; and a 3D model processing unit for producing a 3D model for the object by applying texture information obtained from the matched image to the mesh model. The disclosed apparatus and method may be used for a 3D map service.
    Type: Application
    Filed: September 14, 2012
    Publication date: August 15, 2013
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Chang Woo CHU, IL Kyu PARK, Young Mi CHA, Ji Hyung LEE, Bon Ki KOO
  • Publication number: 20120162215
    Abstract: The present invention relates to an apparatus and method for generating a texture of a 3D reconstructed object depending on the resolution level of a 2D image. The apparatus includes a 3D object reconstruction unit for extracting, from images captured from at least two areas located at different distances, information about a 3D object and information about cameras, and then reconstructing the 3D object. A resolution calculation unit measures the size of the space area covered by one pixel of each of the images in a photorealistic image of the 3D object, and then calculates the resolutions of the images. A texture generation unit generates textures for respective levels corresponding to the classified images by using the images classified according to resolution level. A rendering unit selects a texture for the relevant level depending on the size of the 3D object on the screen, and then renders the selected texture.
    Type: Application
    Filed: December 21, 2011
    Publication date: June 28, 2012
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Young-Mi CHA, Chang-Woo Chu, Il-Kyu Park, Bon-Ki Koo
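The classification by resolution level can be sketched as bucketing source images by their measured space-area-per-pixel, with each level doubling the allowed footprint in the style of mip levels. The base footprint, the doubling rule, and the function name are assumptions for illustration:

```python
from math import ceil, log2

def classify_by_resolution(images, base=0.01):
    """Bucket source images into texture resolution levels.

    images: (name, metres_per_pixel) pairs, where metres_per_pixel is
    the measured size of the space area covered by one image pixel.
    Level 0 holds the finest images (footprint <= base metres/pixel);
    each higher level doubles the allowed footprint. Returns a dict
    mapping level -> list of image names."""
    levels = {}
    for name, mpp in images:
        # small epsilon keeps exact powers of two in the lower level
        level = max(0, ceil(log2(mpp / base) - 1e-9))
        levels.setdefault(level, []).append(name)
    return levels
```

At render time, a level would then be chosen so that one texel covers roughly one screen pixel for the object's current on-screen size.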
  • Publication number: 20120155745
    Abstract: Disclosed herein is an apparatus and method for extracting correspondences between aerial images. The apparatus includes a line extraction unit, a line direction determination unit, a building top area extraction unit, and a correspondence extraction unit. The line extraction unit extracts lines corresponding to buildings from aerial images. The line direction determination unit defines the directions of the lines as x, y and z axis directions based on a two-dimensional (2D) coordinate system. The building top area extraction unit rotates lines in the x and y axis directions so that the lines are arranged in parallel with the horizontal and vertical directions of the 2D image, and then extracts building top areas from rectangles. The correspondence extraction unit extracts correspondences between the aerial images by comparing the locations of the building top areas extracted from the aerial images.
    Type: Application
    Filed: December 14, 2011
    Publication date: June 21, 2012
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Il-Kyu PARK, Chang-Woo Chu, Young-Mi Cha, Bon-Ki Koo