Patents by Inventor Su Yong YEON

Su Yong YEON has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12025708
    Abstract: The present invention relates to a method and system of generating a three-dimensional (3D) map. The 3D map generating method of the present invention includes: collecting spatial data and image data on a specific space by using a lidar sensor and a camera sensor that are each provided on a collecting device; estimating a movement trajectory of the lidar sensor by using the spatial data; and generating 3D structure data on the specific space based on structure-from-motion (SFM), by using the image data and the movement trajectory as input data.
    Type: Grant
    Filed: March 2, 2022
    Date of Patent: July 2, 2024
    Assignee: NAVER LABS CORPORATION
    Inventors: Yong Han Lee, Su Yong Yeon, Soo Hyun Ryu, Deok Hwa Kim, Dong Hwan Lee
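The core idea of this entry — using a lidar-derived movement trajectory to supply camera poses for structure-from-motion — can be sketched with a linear (DLT) triangulation step. This is an illustrative sketch only, not the patented implementation: the intrinsics `K`, the two poses (assumed to come from the lidar trajectory), and the 3D point are made-up values.

```python
import numpy as np

def pose_to_projection(K, R, t):
    """Camera projection matrix P = K [R | t]."""
    return K @ np.hstack([R, t.reshape(3, 1)])

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point observed in two views."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null-space vector of A (homogeneous point)
    return X[:3] / X[3]

# Hypothetical setup: two camera poses taken from a lidar trajectory
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
R1, t1 = np.eye(3), np.zeros(3)
R2, t2 = np.eye(3), np.array([-1.0, 0.0, 0.0])   # camera translated 1 m along x
X_true = np.array([0.2, 0.1, 5.0])               # a scene point to reconstruct

P1 = pose_to_projection(K, R1, t1)
P2 = pose_to_projection(K, R2, t2)
# Synthesize the image observations of X_true in both views
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]

X_est = triangulate_point(P1, P2, x1, x2)        # recovers X_true (noise-free case)
```

In a full SfM pipeline this triangulation would run over many feature tracks, with the lidar trajectory constraining the camera poses instead of estimating them from images alone.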
  • Patent number: 11747477
Abstract: The data collecting method includes: collecting first and second sensor data through first and second sensors, respectively, while a data collecting apparatus moves within a target area, and tagging first and second timestamp values to the first and second sensor data, respectively; generating map data of the target area and location data at a point of time corresponding to the first timestamp value, based on the first sensor data; generating map information of the target area based on the map data, and generating moving path information on the map based on the location data; and estimating a sensing location at a point of time corresponding to the second timestamp value based on the moving path information, and tagging the sensing location to the second sensor data.
    Type: Grant
    Filed: September 25, 2019
    Date of Patent: September 5, 2023
    Assignee: NAVER LABS CORPORATION
    Inventors: Su Yong Yeon, Soo Hyun Ryu, Dong Hwan Lee, Jeong Hee Kim, Kay Park, Sang Ok Seok
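The key step of this entry — estimating a sensing location at the second sensor's timestamp from the moving-path information built at the first sensor's timestamps — amounts to interpolating along the trajectory. A minimal sketch, with a hypothetical 2D trajectory and linear interpolation between the two bracketing trajectory points:

```python
from bisect import bisect_left

def interpolate_location(traj, t_query):
    """traj: list of (timestamp, (x, y)) sorted by timestamp.
    Returns the linearly interpolated location at t_query,
    clamped to the trajectory endpoints."""
    times = [t for t, _ in traj]
    i = bisect_left(times, t_query)
    if i == 0:
        return traj[0][1]
    if i == len(traj):
        return traj[-1][1]
    t0, (x0, y0) = traj[i - 1]
    t1, (x1, y1) = traj[i]
    a = (t_query - t0) / (t1 - t0)
    return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))

# Trajectory from the first sensor's timestamped locations (made-up values)
trajectory = [(0.0, (0.0, 0.0)), (1.0, (2.0, 0.0)), (2.0, (2.0, 2.0))]
# Second sensor sampled at t = 1.5, midway along the second segment
loc = interpolate_location(trajectory, 1.5)   # → (2.0, 1.0)
```

The interpolated location would then be tagged to the second sensor's data record, as the abstract describes.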
  • Publication number: 20220358665
Abstract: A localization method includes scanning a surrounding space using a laser emitted from a reference region; processing spatial information about the surrounding space based on a reflection signal of the laser; extracting feature vectors that reflect the spatial information, using a deep learning network that takes space vectors containing the spatial information as input data; and comparing the feature vectors with preset reference map data to estimate location information about the reference region.
    Type: Application
    Filed: July 25, 2022
    Publication date: November 10, 2022
    Inventors: Min Young CHANG, Su Yong YEON, Soo Hyun RYU, Dong Hwan LEE
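The localization step described above — encoding a scan into a feature vector and matching it against reference map data — can be outlined as follows. This is a toy sketch, not the patented network: a fixed random projection with a `tanh` stands in for the deep learning encoder, and the scans and locations are made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a learned encoder: projects a 360-beam range scan to 16 features.
W = rng.standard_normal((16, 360))

def encode(scan):
    f = np.tanh(W @ scan)            # toy nonlinearity in place of a deep network
    return f / np.linalg.norm(f)     # L2-normalize for cosine comparison

# Reference map data: feature vectors tagged with known locations
ref_scans = {(0, 0): rng.random(360), (5, 0): rng.random(360), (0, 5): rng.random(360)}
ref_feats = {loc: encode(s) for loc, s in ref_scans.items()}

# Query: a slightly noisy rescan taken at the (5, 0) location
query = ref_scans[(5, 0)] + 0.001 * rng.standard_normal(360)
qf = encode(query)

# Estimate the location as the reference entry with the highest cosine similarity
best = max(ref_feats, key=lambda loc: float(qf @ ref_feats[loc]))
```

In the claimed method the encoder is a trained deep network and the reference map covers the target space densely, but the retrieval structure — nearest feature vector wins — is the same.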
  • Publication number: 20220283310
    Abstract: The present invention relates to a method and system of generating a three-dimensional (3D) map. The 3D map generating method of the present invention includes: collecting spatial data and image data on a specific space by using a lidar sensor and a camera sensor that are each provided on a collecting device; estimating a movement trajectory of the lidar sensor by using the spatial data; and generating 3D structure data on the specific space based on structure-from-motion (SFM), by using the image data and the movement trajectory as input data.
    Type: Application
    Filed: March 2, 2022
    Publication date: September 8, 2022
    Inventors: Yong Han LEE, Su Yong YEON, Soo Hyun RYU, Deok Hwa KIM, Dong Hwan LEE
  • Publication number: 20200103529
Abstract: The data collecting method includes: collecting first and second sensor data through first and second sensors, respectively, while a data collecting apparatus moves within a target area, and tagging first and second timestamp values to the first and second sensor data, respectively; generating map data of the target area and location data at a point of time corresponding to the first timestamp value, based on the first sensor data; generating map information of the target area based on the map data, and generating moving path information on the map based on the location data; and estimating a sensing location at a point of time corresponding to the second timestamp value based on the moving path information, and tagging the sensing location to the second sensor data.
    Type: Application
    Filed: September 25, 2019
    Publication date: April 2, 2020
    Inventors: Su Yong YEON, Soo Hyun RYU, Dong Hwan LEE, Jeong Hee KIM, Kay PARK, Sang Ok SEOK