Patents by Inventor Yong-Woo JO
Yong-Woo JO has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Publication number: 20250043191
  Abstract: Embodiments of the present disclosure relate to a method for producing refined hydrocarbons from waste plastics, the method including: a pretreatment process of pretreating waste plastics; a pyrolysis process of producing pyrolysis gas by introducing the waste plastics pretreated in the pretreatment process into a pyrolysis reactor; a lightening process of producing pyrolysis oil by introducing the pyrolysis gas into a hot filter; and a distillation process of distilling the pyrolysis oil to obtain refined hydrocarbons, wherein a liquid condensed in the hot filter is re-introduced into the pyrolysis reactor. Embodiments also relate to a system for producing refined hydrocarbons from waste plastics.
  Type: Application
  Filed: October 22, 2024
  Publication date: February 6, 2025
  Inventors: Sang Hwan JO, Soo Kil KANG, Yong Woon KIM, Min Gyoo PARK, Min Woo SHIN, Jin Seong JANG
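
  The abstract's key topology is a recycle loop: heavy liquid condensed in the hot filter goes back to the pyrolysis reactor instead of continuing downstream. The sketch below is a purely illustrative steady-state balance of that loop; the yield and condensed-fraction values are hypothetical and are not taken from the patent.

  ```python
  # Illustrative steady-state balance for the recycle loop described in the
  # abstract: pyrolysis gas -> hot filter -> (light fraction to distillation,
  # condensed heavy fraction back to the reactor). All fractions are hypothetical.

  def recycle_loop(fresh_feed_kg_h: float,
                   gas_yield: float = 0.85,          # hypothetical pyrolysis gas yield
                   condensed_fraction: float = 0.20, # hypothetical heavy cut condensed in the hot filter
                   tol: float = 1e-9) -> dict:
      """Iterate the recycle stream until it converges to a steady state."""
      recycle = 0.0
      for _ in range(1000):
          reactor_in = fresh_feed_kg_h + recycle            # pretreated plastics + recycled condensate
          pyrolysis_gas = gas_yield * reactor_in            # gas sent to the hot filter
          new_recycle = condensed_fraction * pyrolysis_gas  # heavy liquid condensed in the hot filter
          to_distillation = pyrolysis_gas - new_recycle     # lightened pyrolysis oil
          if abs(new_recycle - recycle) < tol:
              break
          recycle = new_recycle
      return {"reactor_in_kg_h": reactor_in,
              "recycle_kg_h": new_recycle,
              "pyrolysis_oil_to_distillation_kg_h": to_distillation}

  print(recycle_loop(1000.0))
  ```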
- Patent number: 12154033
  Abstract: Disclosed herein are a deep network learning method using an autonomous vehicle and an apparatus for the same. The deep network learning apparatus includes a processor configured to select a deep network model requiring an update in consideration of performance, assign learning amounts for respective vehicles in consideration of respective operation patterns of multiple autonomous vehicles registered through user authentication, distribute the deep network model and the learning data to the multiple autonomous vehicles based on the learning amounts for respective vehicles, and receive learning results from the multiple autonomous vehicles, and memory configured to store the deep network model and the learning data.
  Type: Grant
  Filed: August 11, 2022
  Date of Patent: November 26, 2024
  Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
  Inventors: Joo-Young Kim, Kyoung-Wook Min, Yong-Woo Jo, Doo-Seop Choi, Jeong-Dan Choi
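
  As a rough illustration of the per-vehicle learning-amount assignment the abstract mentions, the sketch below splits a total training budget across registered vehicles in proportion to a hypothetical availability score derived from their operation patterns, then aggregates the returned results weighted by samples trained. The scoring rule and the weighted averaging are assumptions, not the patented method.

  ```python
  # Hypothetical sketch: assign per-vehicle learning amounts from operation
  # patterns, then aggregate the returned updates weighted by samples trained.
  from dataclasses import dataclass

  @dataclass
  class Vehicle:
      vehicle_id: str
      idle_hours_per_day: float   # stand-in for the vehicle's "operation pattern"

  def assign_learning_amounts(vehicles, total_samples: int) -> dict:
      """Split the training budget in proportion to each vehicle's idle time."""
      total_idle = sum(v.idle_hours_per_day for v in vehicles) or 1.0
      return {v.vehicle_id: int(total_samples * v.idle_hours_per_day / total_idle)
              for v in vehicles}

  def aggregate(results: dict) -> dict:
      """Weighted average of parameter dicts returned by the vehicles."""
      total = sum(n for _, n in results.values()) or 1
      keys = next(iter(results.values()))[0].keys()
      return {k: sum(params[k] * n for params, n in results.values()) / total
              for k in keys}

  fleet = [Vehicle("car-A", 6.0), Vehicle("car-B", 2.0)]
  amounts = assign_learning_amounts(fleet, total_samples=10_000)
  # results maps vehicle_id -> (parameter dict, samples actually trained)
  results = {"car-A": ({"w": 0.9}, amounts["car-A"]),
             "car-B": ({"w": 0.5}, amounts["car-B"])}
  print(amounts, aggregate(results))
  ```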
- Patent number: 11940814
  Abstract: Disclosed herein are a cooperative driving method based on driving negotiation and an apparatus for the same. The cooperative driving method is performed by a cooperative driving apparatus for cooperative driving based on driving negotiation, and includes: determining whether cooperative driving is possible in consideration of a driving mission of a requesting vehicle that requests cooperative driving with neighboring vehicles; when it is determined that cooperative driving is possible, setting a responding vehicle from which cooperative driving is to be requested among the neighboring vehicles; performing driving negotiation between the requesting vehicle and the responding vehicle based on a driving negotiation protocol; and, when the driving negotiation is completed, performing cooperative driving by providing driving guidance information for vehicle control to at least one of the requesting vehicle and the responding vehicle.
  Type: Grant
  Filed: November 22, 2021
  Date of Patent: March 26, 2024
  Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
  Inventors: Yoo-Seung Song, Joo-Young Kim, Kyoung-Wook Min, Yong-Woo Jo, Jeong-Dan Choi
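
  The sketch below illustrates the request/response shape of a driving negotiation of the kind the abstract outlines: a requesting vehicle asks for a lane-change gap, a responding vehicle accepts or rejects, and guidance is issued only once the negotiation completes. The message fields, gap-based decision rule, and guidance strings are illustrative assumptions, not the patent's protocol.

  ```python
  # Hypothetical driving-negotiation exchange between a requesting and a
  # responding vehicle. Field names and the decision rule are illustrative.
  from dataclasses import dataclass
  from enum import Enum

  class Decision(Enum):
      ACCEPT = "accept"
      REJECT = "reject"

  @dataclass
  class NegotiationRequest:
      requester_id: str
      mission: str            # e.g. "merge-left"
      needed_gap_m: float

  @dataclass
  class NegotiationResponse:
      responder_id: str
      decision: Decision
      offered_gap_m: float

  def respond(req: NegotiationRequest, available_gap_m: float) -> NegotiationResponse:
      """Responding vehicle accepts only if it can yield the requested gap."""
      ok = available_gap_m >= req.needed_gap_m
      return NegotiationResponse("car-B", Decision.ACCEPT if ok else Decision.REJECT,
                                 available_gap_m if ok else 0.0)

  def guidance(req: NegotiationRequest, resp: NegotiationResponse) -> dict:
      """Driving guidance for both vehicles once the negotiation completes."""
      if resp.decision is Decision.ACCEPT:
          return {req.requester_id: f"begin {req.mission} into {resp.offered_gap_m:.0f} m gap",
                  resp.responder_id: "hold speed to keep the offered gap open"}
      return {req.requester_id: "abort cooperative maneuver", resp.responder_id: "no action"}

  req = NegotiationRequest("car-A", "merge-left", needed_gap_m=25.0)
  print(guidance(req, respond(req, available_gap_m=30.0)))
  ```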
- Publication number: 20230386332
  Abstract: Disclosed herein are a method and apparatus for processing a driving cooperation message. The method for processing a driving cooperation message includes receiving multiple first driving cooperation messages from neighboring autonomous vehicles, adjusting cooperation classes of the multiple first driving cooperation messages, creating driving strategies corresponding to the adjusted cooperation classes in descending order of priorities of the adjusted cooperation classes, generating second driving cooperation messages including the adjusted cooperation classes and the driving strategies corresponding to the adjusted cooperation classes, and sending the second driving cooperation messages to the neighboring autonomous vehicles requiring cooperative driving.
  Type: Application
  Filed: January 18, 2023
  Publication date: November 30, 2023
  Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
  Inventors: Shin-Kyung LEE, Yoo-Seung SONG, Kyoung-Wook MIN, Yong-Woo JO
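
  To make the priority-ordered handling concrete, here is a minimal sketch that adjusts the cooperation class of incoming messages and then builds driving strategies from the highest-priority class downward, as the abstract describes. The class names, priorities, and the toy adjustment rule are assumptions for illustration only.

  ```python
  # Illustrative only: adjust cooperation classes, then create strategies in
  # descending order of class priority. Classes and rules are placeholders.
  from dataclasses import dataclass

  PRIORITY = {"emergency": 3, "safety": 2, "efficiency": 1}  # hypothetical classes

  @dataclass
  class CooperationMessage:
      sender_id: str
      cooperation_class: str
      payload: str

  def adjust_class(msg: CooperationMessage) -> CooperationMessage:
      """Toy adjustment rule: escalate safety messages that mention a blocked lane."""
      if msg.cooperation_class == "safety" and "blocked" in msg.payload:
          return CooperationMessage(msg.sender_id, "emergency", msg.payload)
      return msg

  def build_strategies(messages):
      """Handle adjusted messages from highest to lowest priority class."""
      adjusted = [adjust_class(m) for m in messages]
      adjusted.sort(key=lambda m: PRIORITY.get(m.cooperation_class, 0), reverse=True)
      return [(m.cooperation_class, f"strategy for {m.sender_id}: respond to '{m.payload}'")
              for m in adjusted]

  incoming = [CooperationMessage("car-B", "efficiency", "platoon invite"),
              CooperationMessage("car-C", "safety", "lane blocked ahead")]
  for cls, strategy in build_strategies(incoming):
      print(cls, "->", strategy)   # these would feed the second cooperation messages
  ```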
- Publication number: 20230056581
  Abstract: Disclosed herein are a cooperative driving method based on driving negotiation and an apparatus for the same. The cooperative driving method is performed by a cooperative driving apparatus for cooperative driving based on driving negotiation, and includes: determining whether cooperative driving is possible in consideration of a driving mission of a requesting vehicle that requests cooperative driving with neighboring vehicles; when it is determined that cooperative driving is possible, setting a responding vehicle from which cooperative driving is to be requested among the neighboring vehicles; performing driving negotiation between the requesting vehicle and the responding vehicle based on a driving negotiation protocol; and, when the driving negotiation is completed, performing cooperative driving by providing driving guidance information for vehicle control to at least one of the requesting vehicle and the responding vehicle.
  Type: Application
  Filed: November 22, 2021
  Publication date: February 23, 2023
  Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
  Inventors: Yoo-Seung SONG, Joo-Young KIM, Kyoung-Wook MIN, Yong-Woo JO, Jeong-Dan CHOI
- Patent number: 11587256
  Abstract: Disclosed herein is an autonomous driving device including a communication circuit configured to communicate with an unmanned aerial vehicle, a plurality of sensors disposed in the autonomous vehicle to monitor all directions of the autonomous vehicle, and a processor, wherein the processor is configured to: control the unmanned aerial vehicle to hover at each of a plurality of waypoints of a designated flight path by controlling a relative position of the unmanned aerial vehicle through the communication circuit, change a posture angle of the unmanned aerial vehicle to a plurality of posture angles corresponding to the waypoints of the flight path, generate a plurality of images including a checkerboard and corresponding to the plurality of waypoints and the plurality of posture angles through the plurality of sensors, and calibrate the plurality of sensors on the basis of a relationship between matching points of the plurality of images.
  Type: Grant
  Filed: December 3, 2020
  Date of Patent: February 21, 2023
  Assignee: Electronics and Telecommunications Research Institute
  Inventors: Jae Hyuck Park, Yong Woo Jo, Doo Seop Choi, Kyoung Wook Min, Jeong Dan Choi
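
  For context, the per-camera step that images of a UAV-carried checkerboard could feed looks like a standard OpenCV checkerboard calibration, sketched below. This is not the patented multi-sensor procedure; the pattern size and the `camera_front/*.png` capture directory are hypothetical, and only the intrinsic calibration of a single camera is shown.

  ```python
  # Standard OpenCV checkerboard calibration for one camera, shown only to
  # illustrate the kind of step the UAV-captured checkerboard images could feed.
  import glob
  import cv2
  import numpy as np

  PATTERN = (9, 6)                      # inner corners of the checkerboard (assumed)
  objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
  objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

  obj_points, img_points = [], []
  image_size = None
  for path in glob.glob("camera_front/*.png"):   # hypothetical capture directory
      gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
      found, corners = cv2.findChessboardCorners(gray, PATTERN)
      if found:
          obj_points.append(objp)
          img_points.append(corners)
          image_size = gray.shape[::-1]

  if obj_points:
      # Intrinsics and distortion for this camera; the patent relates multiple
      # sensors to one another via matching points across such views and poses.
      rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
          obj_points, img_points, image_size, None, None)
      print("reprojection RMS:", rms)
  ```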
- Publication number: 20230053134
  Abstract: Disclosed herein are a deep network learning method using an autonomous vehicle and an apparatus for the same. The deep network learning apparatus includes a processor configured to select a deep network model requiring an update in consideration of performance, assign learning amounts for respective vehicles in consideration of respective operation patterns of multiple autonomous vehicles registered through user authentication, distribute the deep network model and the learning data to the multiple autonomous vehicles based on the learning amounts for respective vehicles, and receive learning results from the multiple autonomous vehicles, and memory configured to store the deep network model and the learning data.
  Type: Application
  Filed: August 11, 2022
  Publication date: February 16, 2023
  Inventors: Joo-Young KIM, Kyoung-Wook MIN, Yong-Woo JO, Doo-Seop CHOI, Jeong-Dan CHOI
- Patent number: 11507783
  Abstract: Disclosed herein are an object recognition apparatus of an automated driving system using error removal based on object classification and a method using the same. The object recognition method is configured to train a multi-object classification model based on deep learning using training data including a data set corresponding to a noise class, into which a false-positive object is classified, among classes classified by the types of objects, to acquire a point cloud and image data respectively using a LiDAR sensor and a camera provided in an autonomous vehicle, to extract a crop image, corresponding to at least one object recognized based on the point cloud, from the image data and input the same to the multi-object classification model, and to remove a false-positive object classified into the noise class, among the at least one object, by the multi-object classification model.
  Type: Grant
  Filed: July 20, 2021
  Date of Patent: November 22, 2022
  Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
  Inventors: Dong-Jin Lee, Do-Wook Kang, Jungyu Kang, Joo-Young Kim, Kyoung-Wook Min, Jae-Hyuck Park, Kyung-Bok Sung, Yoo-Seung Song, Taeg-Hyun An, Yong-Woo Jo, Doo-Seop Choi, Jeong-Dan Choi, Seung-Jun Han
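
  The filtering step the abstract describes can be sketched as follows: image crops of LiDAR-detected objects are passed to a classifier whose class list includes a dedicated noise class, and any detection assigned to that class is dropped as a false positive. The class list and the random-score `classify` stand-in are placeholders, not the patented model.

  ```python
  # Sketch of post-classification false-positive removal via a "noise" class.
  # The classifier here is a placeholder that returns pseudo-random scores.
  import numpy as np

  CLASSES = ["car", "pedestrian", "cyclist", "noise"]   # "noise" = false-positive class
  NOISE_INDEX = CLASSES.index("noise")

  def classify(crop: np.ndarray) -> np.ndarray:
      """Placeholder for the deep multi-object classifier (returns class scores)."""
      rng = np.random.default_rng(int(crop.sum()) % 2**32)
      scores = rng.random(len(CLASSES))
      return scores / scores.sum()

  def remove_false_positives(detections):
      """Keep only detections whose crop is NOT classified into the noise class."""
      kept = []
      for det in detections:
          scores = classify(det["crop"])
          label_index = int(np.argmax(scores))
          if label_index != NOISE_INDEX:
              kept.append(dict(det, label=CLASSES[label_index]))
      return kept

  detections = [{"box": (10, 20, 50, 80), "crop": np.ones((64, 64, 3))},
                {"box": (200, 40, 30, 30), "crop": np.zeros((64, 64, 3))}]
  print(remove_false_positives(detections))
  ```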
- Publication number: 20220164609
  Abstract: Disclosed herein are an object recognition apparatus of an automated driving system using error removal based on object classification and a method using the same. The object recognition method is configured to train a multi-object classification model based on deep learning using training data including a data set corresponding to a noise class, into which a false-positive object is classified, among classes classified by the types of objects, to acquire a point cloud and image data respectively using a LiDAR sensor and a camera provided in an autonomous vehicle, to extract a crop image, corresponding to at least one object recognized based on the point cloud, from the image data and input the same to the multi-object classification model, and to remove a false-positive object classified into the noise class, among the at least one object, by the multi-object classification model.
  Type: Application
  Filed: July 20, 2021
  Publication date: May 26, 2022
  Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
  Inventors: Dong-Jin LEE, Do-Wook KANG, Jungyu KANG, Joo-Young KIM, Kyoung-Wook MIN, Jae-Hyuck PARK, Kyung-Bok SUNG, Yoo-Seung SONG, Taeg-Hyun AN, Yong-Woo JO, Doo-Seop CHOI, Jeong-Dan CHOI, Seung-Jun HAN
- Patent number: 11181389
  Abstract: Provided is a driving guide system, and more specifically, a system for guiding a vehicle occupant in driving through linguistic description. One embodiment of the present invention is an apparatus for guiding driving with linguistic description of a destination, which is installed on a vehicle and guides driving by outputting a linguistic description of a destination building, wherein the apparatus sets a destination according to a command or input, receives appearance information of a building of the destination from a server, and represents and outputs the appearance information in a linguistic form.
  Type: Grant
  Filed: November 27, 2019
  Date of Patent: November 23, 2021
  Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
  Inventors: Jung Gyu Kang, Kyoung Wook Min, Yong Woo Jo, Doo Seop Choi, Jeong Dan Choi, Dong Jin Lee, Seung Jun Han
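
  A toy rendering of building-appearance attributes into a spoken guidance sentence, in the spirit of the abstract, is shown below. The attribute schema (`stories`, `color`, `landmark`) is invented for illustration and is not the format the patent defines.

  ```python
  # Render hypothetical appearance attributes of a destination building into a
  # guidance sentence. The attribute names are illustrative assumptions.
  def describe_destination(appearance: dict) -> str:
      parts = []
      if "stories" in appearance:
          parts.append(f"a {appearance['stories']}-story building")
      if "color" in appearance:
          parts.append(f"with a {appearance['color']} facade")
      if "landmark" in appearance:
          parts.append(f"next to {appearance['landmark']}")
      return "Your destination is " + " ".join(parts) + "."

  # Appearance information as it might be received from a map server (hypothetical).
  print(describe_destination({"stories": 5, "color": "red brick",
                              "landmark": "the bus terminal"}))
  ```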
- Publication number: 20210174547
  Abstract: Disclosed herein is an autonomous driving device including a communication circuit configured to communicate with an unmanned aerial vehicle, a plurality of sensors disposed in the autonomous vehicle to monitor all directions of the autonomous vehicle, and a processor, wherein the processor is configured to: control the unmanned aerial vehicle to hover at each of a plurality of waypoints of a designated flight path by controlling a relative position of the unmanned aerial vehicle through the communication circuit, change a posture angle of the unmanned aerial vehicle to a plurality of posture angles corresponding to the waypoints of the flight path, generate a plurality of images including a checkerboard and corresponding to the plurality of waypoints and the plurality of posture angles through the plurality of sensors, and calibrate the plurality of sensors on the basis of a relationship between matching points of the plurality of images.
  Type: Application
  Filed: December 3, 2020
  Publication date: June 10, 2021
  Applicant: Electronics and Telecommunications Research Institute
  Inventors: Jae Hyuck PARK, Yong Woo JO, Doo Seop CHOI, Kyoung Wook MIN, Jeong Dan CHOI
- Publication number: 20200182648
  Abstract: Provided is a driving guide system, and more specifically, a system for guiding a vehicle occupant in driving through linguistic description. One embodiment of the present invention is an apparatus for guiding driving with linguistic description of a destination, which is installed on a vehicle and guides driving by outputting a linguistic description of a destination building, wherein the apparatus sets a destination according to a command or input, receives appearance information of a building of the destination from a server, and represents and outputs the appearance information in a linguistic form.
  Type: Application
  Filed: November 27, 2019
  Publication date: June 11, 2020
  Inventors: Jung Gyu KANG, Kyoung Wook MIN, Yong Woo JO, Doo Seop CHOI, Jeong Dan CHOI, Dong Jin LEE, Seung Jun HAN
- Publication number: 20200174492
  Abstract: Provided is an autonomous driving technology in which the autonomous driving method includes planning global travel to acquire guidance information for global node points, determining a location of a subject vehicle, generating a first local high-definition map for at least one section in the globally planned route using at least one of a road view and an aerial view provided from a map server, planning a local route for autonomous driving using the first local high-definition map, and controlling the subject vehicle according to the planned local route to perform the autonomous driving.
  Type: Application
  Filed: November 27, 2019
  Publication date: June 4, 2020
  Applicant: Electronics and Telecommunications Research Institute
  Inventors: Dong Jin LEE, Jeong Dan CHOI, Jung Gyu KANG, Joo Young KIM, Kyoung Wook MIN, Kyung Bok SUNG, Taeg Hyun AN, Bong Jin OH, Yong Woo JO, Doo Seop CHOI, Seung Jun HAN
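
  The control flow the abstract outlines can be sketched at a high level: plan a global route, then, per section, build a first local high-definition map from road-view or aerial imagery and plan a local route on it. Every function body below is a placeholder, and the map-server interface is an assumption, not the patent's API.

  ```python
  # High-level placeholder sketch of the global-plan -> local-HD-map -> local-route flow.
  def plan_global_route(origin, destination):
      """Return guidance information for global node points (placeholder)."""
      return [{"node": "N1"}, {"node": "N2"}, {"node": "N3"}]

  def fetch_imagery(section, source="aerial"):
      """Stand-in for requesting a road view or aerial view from a map server."""
      return {"section": section["node"], "source": source, "pixels": None}

  def build_local_hd_map(imagery):
      """Extract lane-level geometry for one section (placeholder)."""
      return {"section": imagery["section"], "lanes": ["lane-1", "lane-2"]}

  def plan_local_route(local_map, vehicle_pose):
      """Choose a lane-level path through the section (placeholder)."""
      return {"section": local_map["section"], "follow": local_map["lanes"][0]}

  vehicle_pose = {"x": 0.0, "y": 0.0, "heading": 0.0}
  for section in plan_global_route("A", "B"):
      local_map = build_local_hd_map(fetch_imagery(section))
      command = plan_local_route(local_map, vehicle_pose)
      print(command)   # the vehicle controller would consume this per section
  ```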
- Publication number: 20200174475
  Abstract: Provided is an autonomous driving method. The autonomous driving method includes a global driving planning operation in which global guidance information for global node points is acquired, a host vehicle location determination operation, an information acquisition operation in which information regarding an obstacle and a road surface marking within a preset distance ahead is acquired, a local precise map generation operation in which a local precise map for a corresponding range is generated using the information acquired within the preset distance ahead, a local route planning operation in which a local route plan for autonomous driving within at least the preset distance is established using the local precise map, and an operation of controlling a host vehicle according to the local route plan to perform autonomous driving.
  Type: Application
  Filed: November 27, 2019
  Publication date: June 4, 2020
  Applicant: Electronics and Telecommunications Research Institute
  Inventors: Kyoung Wook MIN, Jeong Dan CHOI, Yong Woo JO, Seung Jun HAN
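
  As a minimal illustration of a local precise map limited to a preset distance ahead, the sketch below populates a small grid from sensed obstacles and road-surface markings in the vehicle frame. The grid representation, cell size, and 50 m range are assumptions for the example, not the patent's map format.

  ```python
  # Occupancy-style sketch of a local map restricted to a preset distance ahead.
  import numpy as np

  PRESET_DISTANCE_M = 50.0   # only map this far ahead (assumed value)
  CELL_M = 0.5
  N = int(PRESET_DISTANCE_M / CELL_M)

  FREE, OBSTACLE, MARKING = 0, 1, 2

  def to_cell(x_forward, y_left):
      """Vehicle-frame metres -> grid indices, or None if outside the mapped range."""
      row = int(x_forward / CELL_M)
      col = int((y_left + PRESET_DISTANCE_M / 2) / CELL_M)
      return (row, col) if 0 <= row < N and 0 <= col < N else None

  def build_local_map(obstacles, markings):
      grid = np.full((N, N), FREE, dtype=np.uint8)
      for x, y in markings:                      # markings first ...
          if (c := to_cell(x, y)):
              grid[c] = MARKING
      for x, y in obstacles:                     # ... so obstacles take precedence
          if (c := to_cell(x, y)):
              grid[c] = OBSTACLE
      return grid

  grid = build_local_map(obstacles=[(12.0, 1.5)],
                         markings=[(5.0, -1.75), (10.0, -1.75)])
  print("obstacle cells:", int((grid == OBSTACLE).sum()),
        "marking cells:", int((grid == MARKING).sum()))
  ```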
- Patent number: 10371534
  Abstract: Provided are an apparatus and method for sharing and learning driving environment data to improve the decision intelligence of an autonomous vehicle. The apparatus for sharing and learning driving environment data to improve the decision intelligence of an autonomous vehicle includes a sensing section which senses surrounding vehicles traveling within a preset distance from the autonomous vehicle, a communicator which transmits and receives data between the autonomous vehicle and another vehicle or a cloud server, a storage which stores precise lane-level map data, and a learning section which generates mapping data centered on the autonomous vehicle by mapping driving environment data of a sensing result of the sensing section to the precise map data, transmits the mapping data to the other vehicle or the cloud server through the communicator, and performs learning for autonomous driving using the mapping data and data received from the other vehicle or the cloud server.
  Type: Grant
  Filed: May 23, 2017
  Date of Patent: August 6, 2019
  Assignee: Electronics and Telecommunications Research Institute
  Inventors: Kyoung Wook Min, Jeong Dan Choi, Jun Gyu Kang, Sang Heon Park, Kyung Bok Sung, Joo Chan Sohn, Dong Jin Lee, Yong Woo Jo, Seung Jun Han
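
  A rough sketch of the mapping step the abstract describes is given below: sensed surrounding vehicles are attached to lane-level map context and packaged into ego-centred mapping data that could be shared with another vehicle or a cloud server. The toy lane lookup and the JSON payload format are assumptions for the example.

  ```python
  # Illustrative mapping of sensed vehicles onto lane-level context before sharing.
  import json
  from dataclasses import dataclass, asdict

  @dataclass
  class Detection:
      track_id: int
      x_m: float          # forward, vehicle frame
      y_m: float          # left, vehicle frame
      speed_mps: float

  def lane_for(y_m: float) -> str:
      """Toy lane lookup: 3.5 m lanes centred on the ego lane (assumed geometry)."""
      index = round(y_m / 3.5)
      return {0: "ego-lane", 1: "left-lane", -1: "right-lane"}.get(index, "far-lane")

  def build_mapping_data(ego_pose: dict, detections) -> str:
      """Ego-centred mapping data ready to send over the communicator."""
      records = [dict(asdict(d), lane=lane_for(d.y_m)) for d in detections]
      return json.dumps({"ego": ego_pose, "surroundings": records})

  payload = build_mapping_data({"lat": 36.37, "lon": 127.36, "heading_deg": 90.0},
                               [Detection(7, 18.0, 3.4, 13.9),
                                Detection(9, 30.0, 0.2, 15.0)])
  print(payload)   # this string would be transmitted and also used for local learning
  ```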
- Publication number: 20190172225
  Abstract: Provided is a technology for vehicle sensor calibration, in which a vehicle sensor calibration apparatus according to an embodiment includes a sensor device installed inside a calibration room that is a space in which a vehicle having a vehicle sensor mounted thereon is positioned, and configured to obtain basic position information and basic orientation information of the vehicle, a calibration execution module installed in the vehicle and configured to execute calibration on the vehicle sensor on the basis of information received from the outside, and a control module configured to generate calibration position information and calibration orientation information for calibration of the vehicle by analyzing the basic position information and the basic orientation information obtained by the sensor device.
  Type: Application
  Filed: September 17, 2018
  Publication date: June 6, 2019
  Applicant: Electronics and Telecommunications Research Institute
  Inventors: Sang Heon PARK, Jeong Dan CHOI, Seung Jun HAN, Jungyu KANG, Kyoung Wook MIN, Kyung Bok SUNG, Dong Jin LEE, Yong Woo JO
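
  The control-module step can be pictured as comparing the vehicle's measured (basic) pose in the calibration room with the room's reference pose and handing the offset to the in-vehicle calibration module. The 2-D pose model and the field names below are simplifying assumptions, not the patent's data format.

  ```python
  # Sketch: derive calibration position/orientation information from the
  # measured pose versus a reference pose in the calibration room.
  import math

  def pose_offset(basic_pose: dict, reference_pose: dict) -> dict:
      """Return the translation/rotation offset the calibration module could apply."""
      dx = reference_pose["x"] - basic_pose["x"]
      dy = reference_pose["y"] - basic_pose["y"]
      # Wrap the yaw difference into [-180, 180) degrees.
      dyaw = (reference_pose["yaw_deg"] - basic_pose["yaw_deg"] + 180.0) % 360.0 - 180.0
      return {"dx_m": dx, "dy_m": dy, "dyaw_deg": dyaw,
              "offset_m": math.hypot(dx, dy)}

  # Basic pose measured by the room's sensor device vs. the room's reference pose.
  basic = {"x": 0.12, "y": -0.05, "yaw_deg": 1.4}
  reference = {"x": 0.0, "y": 0.0, "yaw_deg": 0.0}
  calibration_info = pose_offset(basic, reference)
  print(calibration_info)   # sent to the calibration execution module in the vehicle
  ```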
- Patent number: 10082385
  Abstract: A system for measuring displacement of an accelerating tube by using a micro-alignment telescope, which includes a vacuum chamber; a hollow accelerating tube in the vacuum chamber; a sighting target attached to a surface of the accelerating tube while protruding from the surface of the accelerating tube; the micro-alignment telescope spaced apart from one side surface of the vacuum chamber; a first lens device interposed between the micro-alignment telescope and the vacuum chamber; and a second lens device spaced apart from an opposite side surface of the vacuum chamber by a distance, wherein the vacuum chamber includes first and second viewports placed on the surfaces of the vacuum chamber in correspondence with each other, and the micro-alignment telescope, the first lens device, the first viewport, the sighting target, the second viewport and the second lens device are aligned on a same axis in one direction.
  Type: Grant
  Filed: May 20, 2016
  Date of Patent: September 25, 2018
  Assignee: Institute for Basic Science
  Inventors: Min-Ki Lee, Young-Kwon Kim, Yong-Woo Jo, Jong-Wan Choi, Woo-Kang Kim, Hee-Tae Kim
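
  A purely illustrative reduction of such measurements: the lateral displacement of the sighting target (and hence the accelerating tube) is taken as the change in the telescope's micrometer readings between two states. The readings, the unity scale factor, and the cool-down scenario are hypothetical and are not taken from the patent.

  ```python
  # Convert before/after micro-alignment telescope readings of the sighting
  # target into a displacement estimate. All values are hypothetical.
  import math

  def displacement_mm(before: dict, after: dict, scale: float = 1.0) -> dict:
      """before/after: micrometer readings {'horizontal_mm', 'vertical_mm'}."""
      dh = (after["horizontal_mm"] - before["horizontal_mm"]) * scale
      dv = (after["vertical_mm"] - before["vertical_mm"]) * scale
      return {"horizontal_mm": dh, "vertical_mm": dv, "total_mm": math.hypot(dh, dv)}

  # e.g. the target sighted before and after a change of state of the apparatus
  print(displacement_mm({"horizontal_mm": 0.10, "vertical_mm": -0.02},
                        {"horizontal_mm": 0.34, "vertical_mm": 0.05}))
  ```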
- Publication number: 20180101172
  Abstract: Provided are an apparatus and method for sharing and learning driving environment data to improve the decision intelligence of an autonomous vehicle. The apparatus for sharing and learning driving environment data to improve the decision intelligence of an autonomous vehicle includes a sensing section which senses surrounding vehicles traveling within a preset distance from the autonomous vehicle, a communicator which transmits and receives data between the autonomous vehicle and another vehicle or a cloud server, a storage which stores precise lane-level map data, and a learning section which generates mapping data centered on the autonomous vehicle by mapping driving environment data of a sensing result of the sensing section to the precise map data, transmits the mapping data to the other vehicle or the cloud server through the communicator, and performs learning for autonomous driving using the mapping data and data received from the other vehicle or the cloud server.
  Type: Application
  Filed: May 23, 2017
  Publication date: April 12, 2018
  Applicant: Electronics and Telecommunications Research Institute
  Inventors: Kyoung Wook MIN, Jeong Dan CHOI, Jun Gyu KANG, Sang Heon PARK, Kyung Bok SUNG, Joo Chan SOHN, Dong Jin LEE, Yong Woo JO, Seung Jun HAN
- Publication number: 20170167851
  Abstract: A system for measuring displacement of an accelerating tube by using a micro-alignment telescope, which includes a vacuum chamber; a hollow accelerating tube in the vacuum chamber; a sighting target attached to a surface of the accelerating tube while protruding from the surface of the accelerating tube; the micro-alignment telescope spaced apart from one side surface of the vacuum chamber; a first lens device interposed between the micro-alignment telescope and the vacuum chamber; and a second lens device spaced apart from an opposite side surface of the vacuum chamber by a distance, wherein the vacuum chamber includes first and second viewports placed on the surfaces of the vacuum chamber in correspondence with each other, and the micro-alignment telescope, the first lens device, the first viewport, the sighting target, the second viewport and the second lens device are aligned on a same axis in one direction.
  Type: Application
  Filed: May 20, 2016
  Publication date: June 15, 2017
  Inventors: Min-Ki LEE, Young-Kwon KIM, Yong-Woo JO, Jong-Wan CHOI, Woo-Kang KIM, Hee-Tae KIM