Patents by Inventor Jang Hee Yoo
Jang Hee Yoo has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240127627
Abstract: Disclosed herein are an apparatus and method for detecting an emotional change through facial expression analysis. The apparatus includes a memory having at least one program recorded thereon and a processor configured to execute the program, wherein the program includes a camera image acquisition unit configured to acquire a moving image including at least one person, a preprocessing unit configured to extract a face image of a user from the moving image and preprocess the extracted face image, a facial expression analysis unit configured to extract a facial expression vector from the face image of the user and cumulatively store the facial expression vector, and an emotional change analysis unit configured to detect the temporal location of a sudden emotional change by analyzing an emotion signal extracted from the cumulatively stored facial expression vector values.
Type: Application
Filed: October 11, 2023
Publication date: April 18, 2024
Inventors: Byung-Ok HAN, Ho-Won KIM, Jang-Hee YOO, Cheol-Hwan YOO, Jae-Yoon JANG
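The abstract's cumulative-vector analysis can be illustrated with a small sketch. The change-detection rule below (comparing the mean expression vector of a window before each frame with the window after it) is an assumption for illustration; the patent only describes cumulative storage and analysis of the emotion signal.

```python
import numpy as np

def detect_sudden_changes(expression_vectors, window=5, threshold=0.5):
    """Flag frame indices where the emotion signal jumps abruptly.

    expression_vectors: (T, D) array of per-frame facial expression vectors.
    A frame t is flagged when the mean vector of the preceding `window`
    frames differs from the mean of the following `window` frames by more
    than `threshold` in Euclidean distance (an assumed rule, not the
    patent's exact method).
    """
    X = np.asarray(expression_vectors, dtype=float)
    changes = []
    for t in range(window, len(X) - window):
        before = X[t - window:t].mean(axis=0)
        after = X[t:t + window].mean(axis=0)
        if np.linalg.norm(after - before) > threshold:
            changes.append(t)
    return changes

# A flat signal followed by a jump: frames around the jump at t=10 are flagged.
signal = np.vstack([np.zeros((10, 3)), np.ones((10, 3))])
print(detect_sudden_changes(signal))
```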
-
Patent number: 11749023
Abstract: Disclosed herein are an apparatus and method for monitoring a user based on multi-view face images. The apparatus includes memory in which at least one program is recorded and a processor for executing the program. The program may include a face detection unit for extracting face area images from respective user images captured from two or more different viewpoints, a down-conversion unit for generating at least one attribute-specific 2D image by mapping information about at least one attribute in the 3D space of the face area images onto a 2D UV space, and an analysis unit for generating user monitoring information by analyzing the at least one attribute-specific 2D image.
Type: Grant
Filed: October 30, 2020
Date of Patent: September 5, 2023
Assignee: Electronics and Telecommunications Research Institute
Inventors: Ho-Won Kim, Jang-Hee Yoo, Byung-Ok Han
-
Patent number: 11709919
Abstract: Disclosed herein are a method and apparatus for active identification based on gaze path analysis. The method may include extracting the face image of a user, extracting the gaze path of the user based on the face image, verifying the identity of the user based on the gaze path, and determining whether the face image is authentic.
Type: Grant
Filed: October 27, 2020
Date of Patent: July 25, 2023
Assignee: Electronics and Telecommunications Research Institute
Inventors: Ho-Won Kim, Jang-Hee Yoo
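The gaze-path verification step can be sketched as follows. The metric (resampling both paths to a common length and thresholding the mean point-wise distance) is a hypothetical stand-in; the patent does not disclose a specific comparison rule.

```python
import numpy as np

def gaze_path_matches(observed, enrolled, threshold=0.1):
    """Verify identity by comparing an observed gaze path with an enrolled one.

    observed, enrolled: sequences of (x, y) gaze coordinates normalized to
    [0, 1]. Both paths are linearly resampled to a common length, then the
    mean point-wise Euclidean distance is thresholded. This metric is an
    assumption for illustration only.
    """
    def resample(path, n=32):
        path = np.asarray(path, dtype=float)
        t_old = np.linspace(0, 1, len(path))
        t_new = np.linspace(0, 1, n)
        return np.column_stack(
            [np.interp(t_new, t_old, path[:, d]) for d in range(2)])

    a, b = resample(observed), resample(enrolled)
    return float(np.linalg.norm(a - b, axis=1).mean()) < threshold

# A V-shaped challenge path; a near-identical observation should match.
enrolled = [(0.1, 0.1), (0.5, 0.5), (0.9, 0.1)]
print(gaze_path_matches([(0.12, 0.1), (0.5, 0.48), (0.88, 0.1)], enrolled))
```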
-
Publication number: 20230071037
Abstract: Disclosed herein are an apparatus for recognizing a user command using non-contact gaze-based head motion information and a method using the same. The method includes monitoring the gaze and the head motion of a user based on a sensor, displaying a user interface at a location corresponding to the gaze based on gaze-based head motion information acquired by combining the gaze and the head motion, and recognizing a user command selected from the user interface.
Type: Application
Filed: December 21, 2021
Publication date: March 9, 2023
Inventors: Ho-Won KIM, Cheol-Hwan YOO, Jang-Hee YOO, Jae-Yoon JANG
-
APPARATUS AND METHOD FOR SUPPORTING ATTENTION TEST BASED ON ATTENTION MAP AND ATTENTION MOVEMENT MAP
Publication number: 20220392080
Abstract: Disclosed herein is a method for supporting an attention test based on an attention map and an attention movement map. The method includes generating a score distribution for each segment area of frames satisfying preset conditions, among frames of video content produced in advance so as to be suitable for the purpose of the test; generating an attention map corresponding to the frames based on the distribution of the gaze point of a subject; generating an attention movement map corresponding to the frames based on information about the movement of the gaze point of the subject; and calculating the attention of the subject using the score distribution for each segment area, the attention map, and the attention movement map.
Type: Application
Filed: November 23, 2021
Publication date: December 8, 2022
Inventors: Jang-Hee YOO, Ho-Won KIM, Jae-Yoon JANG
-
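The per-segment scoring described above can be sketched as a small grid computation. The weighting rule (binning gaze points into segment areas, normalizing, and taking the dot product with the score distribution) is an assumed simplification; the patent combines the score distribution with both the attention map and the attention movement map.

```python
import numpy as np

def attention_score(gaze_points, score_grid, frame_size):
    """Score a subject's attention for one frame of the test video.

    gaze_points: list of (x, y) gaze coordinates on the frame.
    score_grid: (rows, cols) array assigning a score to each segment area.
    frame_size: (width, height) of the video frame.
    Each gaze point is binned into its segment area; the attention map is
    the normalized count per segment, and the returned score weights that
    map by the per-segment score distribution (an assumed rule).
    """
    rows, cols = score_grid.shape
    w, h = frame_size
    attention_map = np.zeros((rows, cols))
    for x, y in gaze_points:
        r = min(int(y / h * rows), rows - 1)
        c = min(int(x / w * cols), cols - 1)
        attention_map[r, c] += 1
    attention_map /= max(len(gaze_points), 1)
    return float((attention_map * score_grid).sum())

scores = np.array([[0.0, 1.0], [0.0, 0.0]])  # top-right segment is the target
gaze = [(150, 20), (160, 30), (20, 90)]      # two fixations on target, one off
print(attention_score(gaze, scores, (200, 100)))
```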
Publication number: 20210374402
Abstract: Disclosed herein are an apparatus and method for monitoring a user based on multi-view face images. The apparatus includes memory in which at least one program is recorded and a processor for executing the program. The program may include a face detection unit for extracting face area images from respective user images captured from two or more different viewpoints, a down-conversion unit for generating at least one attribute-specific 2D image by mapping information about at least one attribute in the 3D space of the face area images onto a 2D UV space, and an analysis unit for generating user monitoring information by analyzing the at least one attribute-specific 2D image.
Type: Application
Filed: October 30, 2020
Publication date: December 2, 2021
Applicant: Electronics and Telecommunications Research Institute
Inventors: Ho-Won KIM, Jang-Hee YOO, Byung-Ok HAN
-
Publication number: 20210294883
Abstract: Disclosed herein are a method and apparatus for active identification based on gaze path analysis. The method may include extracting the face image of a user, extracting the gaze path of the user based on the face image, verifying the identity of the user based on the gaze path, and determining whether the face image is authentic.
Type: Application
Filed: October 27, 2020
Publication date: September 23, 2021
Applicant: Electronics and Telecommunications Research Institute
Inventors: Ho-Won KIM, Jang-Hee YOO
-
Publication number: 20180157814
Abstract: Disclosed herein are a method and apparatus for authenticating a user based on a fingertip gesture. The authentication apparatus may display a pattern generated based on geometric information about the hand geometry or size of a user, and may recognize a fingertip gesture via interaction with the user with respect to the pattern. The authentication apparatus may authenticate the user using the recognized fingertip gesture. The pattern may include a gesture inducement/relation pattern and a fake pattern. Information about the fingertip gesture may include the fingertip touch locations of the gesture, the touch order of the fingertip touch locations, and the moving direction of the fingertip touch locations.
Type: Application
Filed: November 15, 2017
Publication date: June 7, 2018
Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
Inventors: Jang-Hee YOO, Ki-Young MOON, Young-Jae LIM, Kyo-Il CHUNG
-
Patent number: 9443151
Abstract: Disclosed herein are an apparatus and method of searching for a wanted vehicle, capable of interoperating with black boxes mounted in the vehicles of unspecified individuals, recognizing and searching for the registration numbers of vehicles in the proximity of each black box in real time, and identifying the location of the wanted vehicle in real time using information about the locations of the vehicles found. The method includes requesting, by an apparatus for searching for a wanted vehicle, that a black box installed in at least one vehicle search for the registration number of the wanted vehicle; receiving a response corresponding to the request; and acquiring information about the wanted vehicle and the location of the wanted vehicle corresponding to the response, using a black box that recognizes a vehicle registration number or a black box that detects a vehicle registration number region.
Type: Grant
Filed: July 30, 2014
Date of Patent: September 13, 2016
Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
Inventors: Jang-Hee Yoo, Jong-Gook Ko, Jin-Woo Choi, Ki-Young Moon
-
Patent number: 9369665
Abstract: Disclosed is a video recording apparatus for a vehicle, which includes a camera unit formed so as to change its capturing direction; a driver's viewing direction detecting unit configured to detect the driver's viewing direction; a control unit configured to control the camera unit so that the capturing direction corresponds to the detected viewing direction; and a storing unit configured to store the video obtained by the camera unit. Therefore, video in the direction in which the driver is looking is obtained and stored without mounting a plurality of cameras, making it possible to capture the circumstances of an accident involving the driver's vehicle or another driver's vehicle that occurs not only in front of the driver's vehicle but also to its side.
Type: Grant
Filed: June 19, 2013
Date of Patent: June 14, 2016
Assignee: Electronics and Telecommunications Research Institute
Inventors: Dae Sung Moon, Han Sung Lee, Jin Woo Choi, Ki Young Moon, Jang Hee Yoo
-
Publication number: 20150332476
Abstract: The present invention relates to a method of tracking an object in a multiple-camera environment. The method includes generating first feature information of the object from an image input from a first camera; detecting a second camera in which identification information for the object is recognized when the object moves out of the view angle of the first camera; and comparing second feature information of the object, generated from an image input from the second camera, with the first feature information to track the object from the image input from the second camera. According to the present invention, the object is tracked based on the image from one camera, and if the object moves out of that camera's view, identification information from a terminal carried by the object is recognized to hand tracking over to the next camera so that the same object is tracked continuously.
Type: Application
Filed: December 26, 2013
Publication date: November 19, 2015
Applicant: Electronics and Telecommunications Research Institute
Inventors: So Hee PARK, Jong Gook KO, Ki Young MOON, Jang Hee YOO
-
Patent number: 9129397
Abstract: A human tracking method using a color histogram is disclosed. The method performs human tracking more adaptively by using different target color histograms according to the human's pose, instead of applying only one target color histogram throughout the tracking of one person, so that the accuracy of human tracking can be increased. The method includes performing color space conversion of input video data; calculating a state equation of a particle based on the color-space-converted data; calculating a human pose-adaptive observation likelihood; resampling the particles using the observation likelihood and estimating the state value of the human; and updating the target color histogram.
Type: Grant
Filed: August 31, 2012
Date of Patent: September 8, 2015
Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
Inventors: Jin-Woo Choi, So Hee Park, Jong-Gook Ko, Jang-Hee Yoo
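The predict/update/resample cycle described in the abstract can be sketched as one step of a generic color-histogram particle filter. The motion model, the Bhattacharyya similarity, and the synthetic `observe` function are illustrative assumptions; the patent's contribution is swapping in a pose-adaptive target histogram, which here would mean changing `target_hist` per detected pose.

```python
import numpy as np

rng = np.random.default_rng(0)

def bhattacharyya(h1, h2):
    """Similarity of two normalized color histograms (1.0 = identical)."""
    return float(np.sum(np.sqrt(h1 * h2)))

def particle_filter_step(particles, target_hist, observe, motion_std=2.0):
    """One predict/update/resample cycle of a color-histogram tracker.

    particles: (N, 2) candidate (x, y) positions; observe(x, y) returns the
    normalized histogram of the image patch at that position. Returns the
    resampled particles and the likelihood-weighted state estimate.
    """
    # Predict: diffuse particle positions with Gaussian motion noise.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: observation likelihood from color-histogram similarity.
    weights = np.array([bhattacharyya(observe(x, y), target_hist)
                        for x, y in particles])
    weights /= weights.sum()
    # Estimate the state, then resample particles by likelihood.
    estimate = weights @ particles
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], estimate

# Synthetic scene: the tracked person's histogram appears near (50, 50).
def observe(x, y):
    near = (x - 50) ** 2 + (y - 50) ** 2 < 100
    return np.array([0.9, 0.1]) if near else np.array([0.1, 0.9])

particles = np.full((200, 2), 48.0)
particles, estimate = particle_filter_step(
    particles, np.array([0.9, 0.1]), observe)
```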
-
Patent number: 9124812
Abstract: Disclosed herein is an object image capture apparatus. The object image capture apparatus includes a first camera unit, a second camera unit, and a control unit. The first camera unit obtains a wide-area view image by capturing a wide-area view region. The second camera unit obtains a close-up view image by capturing the close-up view region of the wide-area view region. The control unit controls the second camera unit by dividing the wide-area view image into a plurality of blocks, analyzing the resulting blocks, and defining a target block, in which a moving object is detected, as the close-up view region from among the plurality of blocks.
Type: Grant
Filed: December 6, 2011
Date of Patent: September 1, 2015
Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
Inventors: Jang-Hee Yoo, Dae-Sung Moon, Yun-Su Chung
-
Publication number: 20150193662
Abstract: Disclosed herein are an apparatus and method of searching for a wanted vehicle, capable of interoperating with black boxes mounted in the vehicles of unspecified individuals, recognizing and searching for the registration numbers of vehicles in the proximity of each black box in real time, and identifying the location of the wanted vehicle in real time using information about the locations of the vehicles found. The method includes requesting, by an apparatus for searching for a wanted vehicle, that a black box installed in at least one vehicle search for the registration number of the wanted vehicle; receiving a response corresponding to the request; and acquiring information about the wanted vehicle and the location of the wanted vehicle corresponding to the response, using a black box that recognizes a vehicle registration number or a black box that detects a vehicle registration number region.
Type: Application
Filed: July 30, 2014
Publication date: July 9, 2015
Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
Inventors: Jang-Hee YOO, Jong-Gook KO, Jin-Woo CHOI, Ki-Young MOON
-
Publication number: 20150169979
Abstract: Provided are a trajectory modeling apparatus and method based on trajectory transformation, which model a trajectory and compare the trajectory of an object against the modeled trajectory. The trajectory modeling apparatus includes an image input unit configured to receive an input image; an object trajectory generating unit configured to trace an object included in the input image and generate the trajectory of the object; a trajectory model generating unit configured to generate a trajectory model according to the directionality of the trajectory of the object; and a trajectory analyzing unit configured to analyze the trajectory of a target included in a test image using the trajectory model, to determine, based on the target trajectory analysis result, whether the behavior of the target is normal.
Type: Application
Filed: July 17, 2014
Publication date: June 18, 2015
Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
Inventors: Jong Gook KO, Jin Woo CHOI, Ki Young MOON, Jang Hee YOO
-
Publication number: 20150161435
Abstract: Disclosed herein are a frontal face detection apparatus and method using a facial pose. The frontal face detection apparatus includes an image input unit for receiving an input image. A candidate extraction unit extracts a face region candidate and face element candidates from the input image. A face region verification unit verifies, based on the plurality of face element candidates extracted by the candidate extraction unit, whether the extracted face region candidate is a final face region. A face element calculation unit calculates a plurality of final face elements, in correspondence with a facial pose score, for the final face region including the extracted face element candidates generated based on a predefined average face model. A final frontal face detection unit detects a final frontal face from the final face region including the plurality of final face elements, based on the position pattern between the final face elements.
Type: Application
Filed: October 27, 2014
Publication date: June 11, 2015
Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
Inventors: Sung-Uk JUNG, Jang-Hee YOO, So-Hee PARK, Han-Sung LEE
-
Patent number: 9036039
Abstract: An apparatus for acquiring a face image using multiple cameras, so as to identify a human located at a remote site, is disclosed. The apparatus allows a Pan-Tilt-Zoom (PTZ) camera to track an object of interest from among the objects detected and tracked by a fixed camera, and obtains an optimum face image for remote human identification from the images generated by the PTZ camera. The apparatus includes a multi-camera control module for tracking, through the PTZ camera, the object of interest detected and tracked by the fixed camera and generating an image of the object of interest, and a face-image acquisition module for acquiring, from the image generated by the multi-camera control module, a face image appropriate for identifying the face of the remote human.
Type: Grant
Filed: November 2, 2012
Date of Patent: May 19, 2015
Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
Inventors: Han Sung Lee, Dae Sung Moon, Yongjin Lee, Jang-Hee Yoo
-
Publication number: 20140178030
Abstract: Disclosed is a video recording apparatus for a vehicle, which includes a camera unit formed so as to change its capturing direction; a driver's viewing direction detecting unit configured to detect the driver's viewing direction; a control unit configured to control the camera unit so that the capturing direction corresponds to the detected viewing direction; and a storing unit configured to store the video obtained by the camera unit. Therefore, video in the direction in which the driver is looking is obtained and stored without mounting a plurality of cameras, making it possible to capture the circumstances of an accident involving the driver's vehicle or another driver's vehicle that occurs not only in front of the driver's vehicle but also to its side.
Type: Application
Filed: June 19, 2013
Publication date: June 26, 2014
Applicant: Electronics and Telecommunications Research Institute
Inventors: Dae Sung MOON, Han Sung LEE, Jin Woo CHOI, Ki Young MOON, Jang Hee YOO
-
Patent number: 8699799
Abstract: A fingerprint verification apparatus adds chaff fingerprint information to the real fingerprint information of a user and then hides and stores the user's fingerprint information with a polynomial generated from unique information of the individual, thereby safely protecting the user's important fingerprint information, stored in a storing unit, from external attackers, and safely managing a private key using the fingerprint information when the private key is used as the unique information for generating the polynomial.
Type: Grant
Filed: September 14, 2010
Date of Patent: April 15, 2014
Assignee: Electronics and Telecommunications Research Institute
Inventors: Dae-Sung Moon, Ki-Young Moon, Jang-Hee Yoo, Yun-Su Chung, Woo-Yong Choi, So-Hee Park, Byung-Jun Kang, Yong-Jin Lee
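The chaff-and-polynomial construction is the classic fuzzy fingerprint vault idea, which can be sketched as follows. Everything concrete here is an assumption for illustration: minutiae are reduced to single integers, arithmetic is over the tiny field GF(97), and recovery uses plain Lagrange interpolation (a real system uses a much larger field, full minutia encodings, and error-tolerant decoding).

```python
import random

FIELD = 97  # toy prime field; real vaults use a much larger field

def poly_mul(a, b, field=FIELD):
    """Multiply two polynomials (coefficient lists, lowest degree first)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % field
    return out

def lock_vault(minutiae, coeffs, n_chaff=50, field=FIELD):
    """Hide the key polynomial `coeffs` among genuine and chaff points.

    Genuine minutiae x-values are projected onto the polynomial; chaff
    points are random pairs that do NOT lie on it, masking the genuine set.
    """
    poly = lambda x: sum(c * x ** i for i, c in enumerate(coeffs)) % field
    points = {x: poly(x) for x in minutiae}
    while len(points) < len(minutiae) + n_chaff:
        x, y = random.randrange(field), random.randrange(field)
        if x not in points and poly(x) != y:
            points[x] = y
    vault = list(points.items())
    random.shuffle(vault)
    return vault

def unlock_vault(vault, query_minutiae, degree, field=FIELD):
    """Recover the polynomial from vault points matching the query minutiae."""
    q = set(query_minutiae)
    pts = [(x, y) for x, y in vault if x in q][:degree + 1]
    # Lagrange interpolation over GF(field).
    coeffs = [0] * (degree + 1)
    for i, (xi, yi) in enumerate(pts):
        basis, denom = [1], 1
        for j, (xj, _) in enumerate(pts):
            if i != j:
                denom = denom * (xi - xj) % field
                basis = poly_mul(basis, [(-xj) % field, 1], field)
        scale = yi * pow(denom, -1, field) % field
        for k, b in enumerate(basis):
            coeffs[k] = (coeffs[k] + scale * b) % field
    return coeffs

# Lock with 5 genuine minutiae; a matching query recovers the key polynomial.
vault = lock_vault([3, 7, 12, 20, 33], [5, 2, 1])
print(unlock_vault(vault, [3, 7, 12], degree=2))  # recovers [5, 2, 1]
```

Security comes from the chaff: without knowing which points are genuine minutiae, an attacker must try many subsets before finding `degree + 1` points that lie on a common polynomial.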
-
Publication number: 20140089236
Abstract: Disclosed is a learning method using extracted data features for simplifying a learning process or improving the accuracy of estimation. The learning method includes dividing input learning data into two groups based on a predetermined reference, extracting data features for distinguishing the two divided groups, and performing learning using the extracted data features.
Type: Application
Filed: January 3, 2013
Publication date: March 27, 2014
Applicant: Electronics and Telecommunications Research Institute
Inventors: Yong Jin LEE, So Hee PARK, Jong Gook KO, Ki Young MOON, Jang Hee YOO
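The divide-then-extract step can be sketched with one concrete choice of feature extractor. The Fisher-style rule below (difference of group means scaled by the inverse pooled covariance) is an assumption; the abstract does not specify how the distinguishing features are computed.

```python
import numpy as np

def two_group_feature(X, groups):
    """Extract a 1-D projection that separates two groups of learning data.

    X: (N, D) learning data; groups: boolean array realizing the
    'predetermined reference' that splits X into two groups. Returns a
    unit projection direction computed Fisher-style from the group means
    and the pooled covariance (an assumed concrete rule).
    """
    a, b = X[groups], X[~groups]
    pooled = np.cov(a, rowvar=False) + np.cov(b, rowvar=False)
    pooled += 1e-6 * np.eye(X.shape[1])  # regularize for numerical stability
    w = np.linalg.solve(pooled, a.mean(axis=0) - b.mean(axis=0))
    return w / np.linalg.norm(w)

# Two Gaussian blobs separated along the first axis.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0, 0], 1, (100, 2)),
               rng.normal([5, 0], 1, (100, 2))])
labels = np.array([False] * 100 + [True] * 100)
w = two_group_feature(X, labels)
# Subsequent learning would then use the extracted feature X @ w.
```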