Patents by Inventor Jang Hee Yoo

Jang Hee Yoo has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12340623
    Abstract: Disclosed is a computing device, which includes a processor, a camera that captures RGB image data, a memory that stores image data including the RGB image data, and a pupil detection device that detects a pupil from the RGB image data stored in the memory in response to a request of the processor. The pupil detection device includes an image converter that receives the RGB image data, detects first eye area data including eye area information from the received RGB image data, and converts the first eye area data into second eye area data having IR (InfraRed) image characteristics by using a deep neural network, a pupil candidate detector that detects pupil candidate data from the second eye area data, and a pupil boundary detector that detects pupil boundary data from the pupil candidate data.
    Type: Grant
    Filed: October 26, 2023
    Date of Patent: June 24, 2025
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Jang-Hee Yoo, Cheolhwan Yoo, ByungOk Han
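    A minimal sketch of the pipeline this entry describes is given below, assuming numpy images and treating the eye-area detector and the RGB-to-IR conversion network as injectable callables; the class name, thresholding rule, and circle fit are illustrative assumptions rather than the patented method.

    ```python
    import numpy as np
    from typing import Callable, Tuple

    class PupilDetector:
        """Illustrative pipeline: eye-area detection -> RGB-to-IR-style conversion -> candidates -> boundary."""

        def __init__(self,
                     eye_detector: Callable[[np.ndarray], np.ndarray],
                     rgb_to_ir_net: Callable[[np.ndarray], np.ndarray]):
            self.eye_detector = eye_detector    # returns the cropped eye-area image (H, W, 3)
            self.rgb_to_ir_net = rgb_to_ir_net  # deep network mapping the RGB crop to an IR-like grayscale image (H, W)

        def detect(self, rgb_image: np.ndarray) -> Tuple[tuple, float]:
            eye_area = self.eye_detector(rgb_image)       # first eye-area data
            ir_like = self.rgb_to_ir_net(eye_area)        # second eye-area data with IR image characteristics
            candidates = self._pupil_candidates(ir_like)  # pupil candidate data
            return self._pupil_boundary(candidates)       # pupil boundary data

        @staticmethod
        def _pupil_candidates(ir_like: np.ndarray) -> np.ndarray:
            # In IR-like images the pupil appears as a dark blob; take the darkest pixels as candidates.
            return ir_like <= np.percentile(ir_like, 5)

        @staticmethod
        def _pupil_boundary(candidates: np.ndarray) -> Tuple[tuple, float]:
            # Crude boundary estimate: fit a circle (center, radius) to the candidate pixels.
            ys, xs = np.nonzero(candidates)
            if xs.size == 0:
                return (np.nan, np.nan), 0.0
            cy, cx = ys.mean(), xs.mean()
            radius = float(np.sqrt(((ys - cy) ** 2 + (xs - cx) ** 2).mean()))
            return (cx, cy), radius
    ```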
  • Publication number: 20250124598
    Abstract: Disclosed herein are a deep neural network (DNN) learning method for generalizing appearance-based gaze estimation and an apparatus for the same. The deep neural network (DNN) learning method includes creating multiple augmented images based on an original image, inputting the multiple augmented images to a DNN to output a gaze estimation value, calculating a total loss between a gaze ground truth of the original image and the gaze estimation value through gaze consistency regularization (GCR) using a spherical gaze distance (SGD), and updating parameters of the DNN by backpropagation of the total loss.
    Type: Application
    Filed: August 6, 2024
    Publication date: April 17, 2025
    Inventors: Moon-Ki BACK, Jang-Hee YOO, Cheol-Hwan YOO, Byung-Ok HAN
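    The key quantities in this abstract are the spherical gaze distance (SGD) and the gaze consistency regularization (GCR) term computed over augmented copies of one image. The sketch below shows one plausible reading, assuming gaze is parameterized as (pitch, yaw) angles and that SGD is the angle between the corresponding unit gaze vectors; the weighting and the exact form of the consistency term are assumptions, not taken from the application.

    ```python
    import numpy as np

    def gaze_to_vector(pitch_yaw: np.ndarray) -> np.ndarray:
        """Convert (pitch, yaw) angles in radians into 3D unit gaze vectors."""
        pitch, yaw = pitch_yaw[..., 0], pitch_yaw[..., 1]
        return np.stack([np.cos(pitch) * np.sin(yaw),
                         np.sin(pitch),
                         np.cos(pitch) * np.cos(yaw)], axis=-1)

    def spherical_gaze_distance(g1: np.ndarray, g2: np.ndarray) -> np.ndarray:
        """Angle (radians) between two gaze directions on the unit sphere."""
        v1, v2 = gaze_to_vector(g1), gaze_to_vector(g2)
        cos = np.clip(np.sum(v1 * v2, axis=-1), -1.0, 1.0)
        return np.arccos(cos)

    def total_loss(preds_per_aug: np.ndarray, ground_truth: np.ndarray, lam: float = 0.5) -> float:
        """preds_per_aug: (num_augmentations, 2) gaze predictions for augmented copies of one image."""
        # Supervised term: mean SGD between each augmented prediction and the original image's ground truth.
        supervised = spherical_gaze_distance(preds_per_aug, ground_truth[None, :]).mean()
        # Consistency term (GCR): predictions for different augmentations should agree with each other.
        mean_pred = preds_per_aug.mean(axis=0)
        consistency = spherical_gaze_distance(preds_per_aug, mean_pred[None, :]).mean()
        return float(supervised + lam * consistency)
    ```

    In training, this total loss would be backpropagated through the DNN to update its parameters, as the abstract states.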
  • Publication number: 20250125049
    Abstract: Disclosed herein is a method for assisting in Autism Spectrum Disorder (ASD) diagnosis. The method includes transmitting social-interaction-inducing content, receiving input images containing a response of an assessment subject to the social-interaction-inducing content, receiving an ASD diagnosis result for a preset number of input images, among the received input images, and outputting a diagnostic assistive result for the received input images using ASD diagnosis input for the preset number of input images and a pretrained global ASD diagnosis model.
    Type: Application
    Filed: August 6, 2024
    Publication date: April 17, 2025
    Inventors: Cheol-Hwan YOO, Moon-Ki BACK, Jang-Hee YOO, Byung-Ok HAN
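    The abstract describes fusing a clinician's diagnoses for a preset number of input images with a pretrained global ASD diagnosis model. A toy sketch of one such fusion rule follows; the blending weight and the function names are hypothetical, not the patented procedure.

    ```python
    import numpy as np

    def assistive_result(clinician_scores, model_scores, alpha: float = 0.5) -> np.ndarray:
        """Combine clinician ASD ratings for the first K images with global-model predictions for all N images.

        clinician_scores: scores in [0, 1] entered for the preset number (K) of input images.
        model_scores: pretrained global model scores in [0, 1] for all N received images (N >= K).
        Returns a per-image diagnostic assistive score for all N images.
        """
        clinician_scores = np.asarray(clinician_scores, dtype=float)
        model_scores = np.asarray(model_scores, dtype=float)
        k = clinician_scores.shape[0]
        out = model_scores.copy()
        # Blend clinician input with the model on the labeled prefix; rely on the model elsewhere.
        out[:k] = alpha * clinician_scores + (1.0 - alpha) * model_scores[:k]
        return out
    ```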
  • Publication number: 20250114021
    Abstract: Disclosed herein are an interaction-based artificial intelligence analysis apparatus and method. The interaction-based artificial intelligence analysis apparatus is configured to output structured video content for each evaluation index so as to elicit an interaction from an examinee at each stimulus time-frame of a preset timeline, collect response data of the examinee for each evaluation index through a camera and a microphone at each response time-frame of the preset timeline, and analyze the response data for each evaluation index using an Artificial Intelligence (AI) analysis module corresponding to the evaluation index.
    Type: Application
    Filed: September 26, 2024
    Publication date: April 10, 2025
    Inventors: Jang-Hee YOO, Cheol-Hwan YOO, Jae-Yoon JANG, Byung-Ok HAN
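    The apparatus is organized around a preset timeline of stimulus and response time-frames, with a separate AI analysis module per evaluation index. The sketch below illustrates that structure as plain data classes plus a dispatch loop; all field and index names are invented for illustration.

    ```python
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class TimeFrame:
        start_s: float   # offset into the content timeline, in seconds
        end_s: float

    @dataclass
    class EvaluationItem:
        index_name: str      # evaluation index, e.g. "eye_contact" (illustrative)
        stimulus: TimeFrame  # when the structured video stimulus is shown
        response: TimeFrame  # when camera and microphone responses are collected

    def analyze_session(recording: Dict[str, object],
                        timeline: List[EvaluationItem],
                        analyzers: Dict[str, Callable[[dict], float]]) -> Dict[str, float]:
        """Route each evaluation index's response data to the AI analysis module registered for it."""
        results = {}
        for item in timeline:
            response_data = {
                "video": recording["video"],   # frames captured by the camera
                "audio": recording["audio"],   # audio captured by the microphone
                "window": (item.response.start_s, item.response.end_s),
            }
            results[item.index_name] = analyzers[item.index_name](response_data)
        return results
    ```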
  • Publication number: 20250078568
    Abstract: Disclosed herein are an emotion recognition method and apparatus based on context information. The emotion recognition method includes detecting information corresponding to an emotion recognition subject from an input image, extracting a recognition subject feature based on the information corresponding to the emotion recognition subject, extracting a context feature based on the input image, storing the recognition subject feature and the context feature in a short-term memory, and storing the context feature in a long-term memory.
    Type: Application
    Filed: November 6, 2023
    Publication date: March 6, 2025
    Inventors: Byung-Ok HAN, Moon-Ki BACK, Jang-Hee YOO, Cheol-Hwan YOO
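    The distinctive detail here is keeping subject and context features in a short-term memory while the context feature also accumulates in a long-term memory. A minimal sketch, assuming a fixed-size buffer for the short-term store and a running mean for the long-term store (both are assumptions about unspecified details):

    ```python
    from collections import deque
    import numpy as np

    class ContextMemory:
        """Short-term buffer of (subject, context) features plus a long-term store of context features."""

        def __init__(self, short_capacity: int = 30):
            self.short_term = deque(maxlen=short_capacity)  # recent (subject_feature, context_feature) pairs
            self.long_term_sum = None                       # running sum of all context features seen
            self.long_term_count = 0

        def update(self, subject_feature: np.ndarray, context_feature: np.ndarray) -> None:
            self.short_term.append((subject_feature, context_feature))
            if self.long_term_sum is None:
                self.long_term_sum = np.zeros_like(context_feature, dtype=float)
            self.long_term_sum += context_feature
            self.long_term_count += 1

        def long_term_context(self) -> np.ndarray:
            # Summarize long-term memory as the mean context feature observed so far.
            return self.long_term_sum / max(self.long_term_count, 1)
    ```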
  • Patent number: 12211215
    Abstract: Disclosed herein is a method for supporting an attention test based on an attention map and an attention movement map. The method includes generating a score distribution for each segment area of frames satisfying preset conditions, among frames of video content (video) that is produced in advance so as to be suitable for the purpose of a test, generating an attention map corresponding to the frames based on the distribution of the gaze point of a subject, generating an attention movement map corresponding to the frames based on information about movement of the gaze point of the subject, and calculating the attention of the subject using the score distribution for each segment area, the attention map, and the attention movement map.
    Type: Grant
    Filed: November 23, 2021
    Date of Patent: January 28, 2025
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Jang-Hee Yoo, Ho-Won Kim, Jae-Yoon Jang
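    The method combines a per-segment score distribution of the test video with an attention map (where the gaze landed) and an attention movement map (how the gaze moved). One way to realize this on a coarse grid is sketched below; the grid size, normalization, and final weighting are assumptions for illustration only.

    ```python
    import numpy as np

    def attention_score(gaze_points: np.ndarray,
                        segment_scores: np.ndarray,
                        grid_shape=(9, 16)) -> float:
        """Score a subject's attention over a set of frames.

        gaze_points: (T, 2) normalized gaze coordinates (x, y) in [0, 1].
        segment_scores: per-segment score distribution with shape grid_shape (rows, cols).
        """
        rows, cols = grid_shape
        gx = np.clip((gaze_points[:, 0] * cols).astype(int), 0, cols - 1)
        gy = np.clip((gaze_points[:, 1] * rows).astype(int), 0, rows - 1)

        # Attention map: histogram of gaze fixations per segment area.
        attention_map = np.zeros(grid_shape)
        np.add.at(attention_map, (gy, gx), 1.0)
        attention_map /= max(attention_map.sum(), 1.0)

        # Attention movement map: gaze displacement accumulated per segment area.
        movement = np.linalg.norm(np.diff(gaze_points, axis=0), axis=1)
        movement_map = np.zeros(grid_shape)
        np.add.at(movement_map, (gy[1:], gx[1:]), movement)
        movement_map /= max(movement_map.sum(), 1.0)

        # Combine both maps with the content's per-segment score distribution.
        return float(np.sum(segment_scores * (attention_map + movement_map)) / 2.0)
    ```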
  • Publication number: 20240415413
    Abstract: Disclosed herein is a method for stereotyped behavior detection for supporting diagnosis of Autism Spectrum Disorder (ASD). The method includes detecting a target object to be assessed in an input video, detecting a section in which a periodic behavior occurs using an image sequence of the target object, and classifying a stereotyped behavior in the section in which the periodic behavior occurs.
    Type: Application
    Filed: May 10, 2024
    Publication date: December 19, 2024
    Inventors: Cheol-Hwan YOO, Moon-Ki BACK, Jang-Hee YOO, Byung-Ok HAN
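    Detecting the section "in which a periodic behavior occurs" suggests some form of periodicity analysis on the target's motion over time. The sketch below uses windowed autocorrelation of a per-frame motion signal as one plausible realization; the window length, threshold, and the choice of autocorrelation itself are assumptions, not details from the application.

    ```python
    import numpy as np

    def find_periodic_sections(motion_signal: np.ndarray,
                               window: int = 90,
                               min_autocorr: float = 0.6) -> list:
        """Return (start_frame, end_frame) windows whose autocorrelation shows a strong repeated peak.

        motion_signal: 1D array, e.g. mean keypoint displacement of the detected target per frame.
        """
        sections = []
        for start in range(0, len(motion_signal) - window + 1, window):
            segment = motion_signal[start:start + window] - motion_signal[start:start + window].mean()
            energy = float(np.dot(segment, segment))
            if energy == 0.0:
                continue
            # Normalized autocorrelation at positive lags; skip the first few trivially correlated lags.
            autocorr = np.correlate(segment, segment, mode="full")[window:] / energy
            if autocorr[4:].max(initial=0.0) >= min_autocorr:
                sections.append((start, start + window))
        return sections
    ```

    Each returned section would then be passed to a classifier that labels the specific stereotyped behavior.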
  • Publication number: 20240407685
    Abstract: Disclosed herein is a method for supporting Autism Spectrum Disorder (ASD) diagnosis based on Artificial Intelligence (AI). The method includes extracting a detection area and voice corresponding to an inspector from an input video, extracting a detection area and voice corresponding to an assessment subject from the input video, extracting a feature of the inspector and a feature of the assessment subject, and extracting an interaction feature using the feature of the inspector and the feature of the assessment subject.
    Type: Application
    Filed: April 1, 2024
    Publication date: December 12, 2024
    Inventors: Byung-Ok HAN, Moon-Ki BACK, Jang-Hee YOO, Cheol-Hwan YOO
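    The abstract ends with extracting an interaction feature from the inspector feature and the assessment-subject feature. A common fusion choice, assumed here purely for illustration, is to concatenate the two features with simple relational terms:

    ```python
    import numpy as np

    def interaction_feature(inspector_feat: np.ndarray, subject_feat: np.ndarray) -> np.ndarray:
        """Build a joint interaction feature from the separately extracted inspector and subject features."""
        return np.concatenate([
            inspector_feat,                  # inspector appearance/voice feature
            subject_feat,                    # assessment-subject appearance/voice feature
            inspector_feat - subject_feat,   # relational terms capturing how the two differ
            inspector_feat * subject_feat,   # and how they co-vary
        ])
    ```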
  • Publication number: 20240203162
    Abstract: Disclosed is a computing device, which includes a processor, a camera that captures RGB image data, a memory that stores image data including the RGB image data, and a pupil detection device that detects a pupil from the RGB image data stored in the memory in response to a request of the processor. The pupil detection device includes an image converter that receives the RGB image data, detects first eye area data including eye area information from the received RGB image data, and converts the first eye area data into second eye area data having IR (InfraRed) image characteristics by using a deep neural network, a pupil candidate detector that detects pupil candidate data from the second eye area data, and a pupil boundary detector that detects pupil boundary data from the pupil candidate data.
    Type: Application
    Filed: October 26, 2023
    Publication date: June 20, 2024
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Jang-Hee YOO, Cheolhwan YOO, ByungOk HAN
  • Publication number: 20240196086
    Abstract: A method and apparatus for detecting the best shot in a long-range face recognition system are provided. The method of detecting the best shot includes detecting a facial area in an image received from the outside, calculating a quality element measurement value of the face image by analyzing the facial area, and selecting a best shot face image, among the detected facial areas, based on the quality element measurement value.
    Type: Application
    Filed: October 17, 2023
    Publication date: June 13, 2024
    Inventors: Jang-Hee Yoo, Cheolhwan Yoo, ByungOk HAN
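    Best-shot selection hinges on the "quality element measurement value" computed for each detected facial area. The sketch below scores candidate face crops with a few generic quality elements (sharpness, exposure, size); these particular elements and weights are assumptions, not the measure defined in the application.

    ```python
    import numpy as np

    def sharpness(gray_face: np.ndarray) -> float:
        """Variance of a simple Laplacian response; higher means a sharper face crop."""
        lap = (-4.0 * gray_face[1:-1, 1:-1]
               + gray_face[:-2, 1:-1] + gray_face[2:, 1:-1]
               + gray_face[1:-1, :-2] + gray_face[1:-1, 2:])
        return float(lap.var())

    def select_best_shot(face_crops: list) -> int:
        """Return the index of the best face crop among the detected facial areas."""
        scores = []
        for crop in face_crops:
            gray = crop.mean(axis=2) if crop.ndim == 3 else crop
            exposure = 1.0 - abs(gray.mean() / 255.0 - 0.5) * 2.0  # prefer mid-range brightness
            size = gray.shape[0] * gray.shape[1]                   # larger faces carry more detail
            scores.append(sharpness(gray) * exposure * np.log1p(size))
        return int(np.argmax(scores))
    ```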
  • Publication number: 20240127627
    Abstract: Disclosed herein are an apparatus and method for detecting an emotional change through facial expression analysis. The apparatus for detecting an emotional change through facial expression analysis includes a memory having at least one program recorded thereon, and a processor configured to execute the program, wherein the program includes a camera image acquisition unit configured to acquire a moving image including at least one person, a preprocessing unit configured to extract a face image of a user from the moving image and preprocess the extracted face image, a facial expression analysis unit configured to extract a facial expression vector from the face image of the user and cumulatively store the facial expression vector, and an emotional change analysis unit configured to detect the temporal location of a sudden emotional change by analyzing an emotion signal extracted from the cumulatively stored facial expression vector values.
    Type: Application
    Filed: October 11, 2023
    Publication date: April 18, 2024
    Inventors: Byung-Ok HAN, Ho-Won KIM, Jang-Hee YOO, Cheol-Hwan YOO, Jae-Yoon JANG
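    The last step, locating a sudden emotional change in the cumulatively stored facial expression vectors, can be pictured as change-point detection on a smoothed signal. A minimal sketch under that assumption (the smoothing width and threshold are arbitrary):

    ```python
    import numpy as np

    def detect_sudden_changes(expression_vectors: np.ndarray,
                              smooth: int = 5,
                              threshold: float = 0.8) -> list:
        """Return frame indices where the smoothed facial-expression signal jumps abruptly.

        expression_vectors: (T, D) per-frame facial expression vectors accumulated over the video.
        """
        if expression_vectors.shape[0] <= smooth:
            return []
        kernel = np.ones(smooth) / smooth
        # Moving average over the cumulative store to suppress per-frame jitter.
        smoothed = np.stack([np.convolve(expression_vectors[:, d], kernel, mode="valid")
                             for d in range(expression_vectors.shape[1])], axis=1)
        jumps = np.linalg.norm(np.diff(smoothed, axis=0), axis=1)
        return [int(i) for i in np.nonzero(jumps > threshold)[0]]
    ```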
  • Patent number: 11749023
    Abstract: Disclosed herein are an apparatus and method for monitoring a user based on multi-view face images. The apparatus includes memory in which at least one program is recorded and a processor for executing the program. The program may include a face detection unit for extracting face area images from respective user images captured from two or more different viewpoints, a down-conversion unit for generating at least one attribute-specific 2D image by mapping information about at least one attribute in the 3D space of the face area images onto a 2D UV space, and an analysis unit for generating user monitoring information by analyzing the at least one attribute-specific 2D image.
    Type: Grant
    Filed: October 30, 2020
    Date of Patent: September 5, 2023
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Ho-Won Kim, Jang-Hee Yoo, Byung-Ok Han
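    The down-conversion unit maps 3D face-attribute information onto a 2D UV space. The sketch below scatters per-vertex attributes of a face mesh into a UV image, which is roughly what such a mapping does; the resolution, averaging rule, and the assumption of per-vertex UV coordinates are illustrative choices.

    ```python
    import numpy as np

    def attributes_to_uv_image(uv_coords: np.ndarray,
                               attributes: np.ndarray,
                               size: int = 128) -> np.ndarray:
        """Scatter per-vertex attributes of a 3D face mesh into a 2D UV-space image.

        uv_coords: (V, 2) UV coordinates in [0, 1] for each mesh vertex.
        attributes: (V, C) per-vertex attribute values gathered from the multi-view face crops.
        Returns a (size, size, C) attribute-specific 2D image, averaged where vertices collide.
        """
        image = np.zeros((size, size, attributes.shape[1]), dtype=float)
        counts = np.zeros((size, size, 1), dtype=float)
        px = np.clip((uv_coords[:, 0] * (size - 1)).astype(int), 0, size - 1)
        py = np.clip((uv_coords[:, 1] * (size - 1)).astype(int), 0, size - 1)
        np.add.at(image, (py, px), attributes)
        np.add.at(counts, (py, px), 1.0)
        return image / np.maximum(counts, 1.0)
    ```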
  • Patent number: 11709919
    Abstract: Disclosed herein are a method and apparatus for active identification based on gaze path analysis. The method may include extracting the face image of a user, extracting the gaze path of the user based on the face image, verifying the identity of the user based on the gaze path, and determining whether the face image is authentic.
    Type: Grant
    Filed: October 27, 2020
    Date of Patent: July 25, 2023
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Ho-Won Kim, Jang-Hee Yoo
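    Verifying identity and authenticity from a gaze path implies comparing the gaze the camera actually observed against an expected path. The sketch below checks liveness against a randomly chosen on-screen challenge path, which is one way to read the abstract; the tolerance and the whole challenge-response framing are assumptions.

    ```python
    import numpy as np

    def verify_gaze_path(observed_path: np.ndarray,
                         challenge_path: np.ndarray,
                         max_mean_error: float = 0.08) -> bool:
        """Check whether the user's gaze followed the requested on-screen path closely enough.

        observed_path, challenge_path: (T, 2) normalized screen coordinates sampled at the same instants.
        A static photo or a replayed video is unlikely to reproduce a freshly generated challenge path,
        so a small mean error is treated as evidence that the face image is authentic.
        """
        error = np.linalg.norm(observed_path - challenge_path, axis=1).mean()
        return bool(error <= max_mean_error)
    ```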
  • Publication number: 20230071037
    Abstract: Disclosed herein are an apparatus for recognizing a user command using non-contact gaze-based head motion information and a method using the same. The method includes monitoring the gaze and the head motion of a user based on a sensor, displaying a user interface at a location corresponding to the gaze based on gaze-based head motion information acquired by combining the gaze and the head motion, and recognizing a user command selected from the user interface.
    Type: Application
    Filed: December 21, 2021
    Publication date: March 9, 2023
    Inventors: Ho-Won KIM, Cheol-Hwan YOO, Jang-Hee YOO, Jae-Yoon JANG
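    The recognition step pairs the gaze location (where the interface is displayed) with a confirming head motion. A compact sketch of that selection logic, with invented thresholds and menu layout:

    ```python
    from typing import Dict, Optional, Tuple
    import numpy as np

    def recognize_command(gaze_point: np.ndarray,
                          head_motion: np.ndarray,
                          menu_items: Dict[str, Tuple[float, float]],
                          select_threshold: float = 0.15) -> Optional[str]:
        """Pick the menu item nearest the gaze point once head motion confirms the selection.

        gaze_point: (2,) normalized screen position where the user interface is displayed.
        head_motion: (2,) recent head displacement (e.g. a nod) used as the confirmation gesture.
        menu_items: mapping of command name to the (x, y) position of its interface element.
        """
        if np.linalg.norm(head_motion) < select_threshold:
            return None  # no deliberate head movement yet, so nothing is selected
        names = list(menu_items)
        positions = np.array([menu_items[n] for n in names])
        distances = np.linalg.norm(positions - gaze_point, axis=1)
        return names[int(np.argmin(distances))]
    ```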
  • Publication number: 20220392080
    Abstract: Disclosed herein is a method for supporting an attention test based on an attention map and an attention movement map. The method includes generating a score distribution for each segment area of frames satisfying preset conditions, among frames of video content (video) that is produced in advance so as to be suitable for the purpose of a test, generating an attention map corresponding to the frames based on the distribution of the gaze point of a subject, generating an attention movement map corresponding to the frames based on information about movement of the gaze point of the subject, and calculating the attention of the subject using the score distribution for each segment area, the attention map, and the attention movement map.
    Type: Application
    Filed: November 23, 2021
    Publication date: December 8, 2022
    Inventors: Jang-Hee YOO, Ho-Won KIM, Jae-Yoon JANG
  • Publication number: 20210374402
    Abstract: Disclosed herein are an apparatus and method for monitoring a user based on multi-view face images. The apparatus includes memory in which at least one program is recorded and a processor for executing the program. The program may include a face detection unit for extracting face area images from respective user images captured from two or more different viewpoints, a down-conversion unit for generating at least one attribute-specific 2D image by mapping information about at least one attribute in the 3D space of the face area images onto a 2D UV space, and an analysis unit for generating user monitoring information by analyzing the at least one attribute-specific 2D image.
    Type: Application
    Filed: October 30, 2020
    Publication date: December 2, 2021
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Ho-Won KIM, Jang-Hee YOO, Byung-Ok HAN
  • Publication number: 20210294883
    Abstract: Disclosed herein are a method and apparatus for active identification based on gaze path analysis. The method may include extracting the face image of a user, extracting the gaze path of the user based on the face image, verifying the identity of the user based on the gaze path, and determining whether the face image is authentic.
    Type: Application
    Filed: October 27, 2020
    Publication date: September 23, 2021
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Ho-Won KIM, Jang-Hee YOO
  • Publication number: 20180157814
    Abstract: Disclosed herein are a method and apparatus for authenticating a user based on a fingertip gesture. The authentication apparatus may display a pattern generated based on geometric information about the hand geometry or hand size of a user, and may recognize a fingertip gesture via interaction with the user with respect to the pattern. The authentication apparatus may authenticate the user using the recognized fingertip gesture. The pattern may include a gesture inducement/relation pattern and a fake pattern. Information about the fingertip gesture may include the fingertip touch locations of the gesture, the touch order of the fingertip touch locations, and the moving direction of the fingertip touch locations.
    Type: Application
    Filed: November 15, 2017
    Publication date: June 7, 2018
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Jang-Hee YOO, Ki-Young MOON, Young-Jae LIM, Kyo-Il CHUNG
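    Authentication here compares the observed fingertip gesture (touch locations, touch order, moving direction) against the enrolled one on a pattern sized to the user's hand. A simplified matcher covering those three checks is sketched below; the tolerance and the ordered-list representation are assumptions.

    ```python
    from typing import List, Tuple

    def verify_fingertip_gesture(enrolled: List[Tuple[float, float]],
                                 observed: List[Tuple[float, float]],
                                 tolerance: float = 0.05) -> bool:
        """Compare an observed fingertip gesture with the enrolled one.

        Both gestures are ordered lists of normalized touch locations on the displayed pattern.
        Comparing them position by position enforces the touch order, and comparing successive
        displacement directions enforces the moving direction; touches landing on fake-pattern
        points simply fail the location check.
        """
        if len(enrolled) != len(observed):
            return False
        for (ex, ey), (ox, oy) in zip(enrolled, observed):
            if abs(ex - ox) > tolerance or abs(ey - oy) > tolerance:
                return False
        for i in range(1, len(enrolled)):
            edx, edy = enrolled[i][0] - enrolled[i - 1][0], enrolled[i][1] - enrolled[i - 1][1]
            odx, ody = observed[i][0] - observed[i - 1][0], observed[i][1] - observed[i - 1][1]
            if edx * odx < 0 or edy * ody < 0:  # moving direction disagrees on some axis
                return False
        return True
    ```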
  • Patent number: 9443151
    Abstract: Disclosed herein are an apparatus and method for searching for a wanted vehicle, capable of interoperating with black boxes mounted in the vehicles of unspecified individuals, recognizing and searching for the registration numbers of vehicles in proximity to each black box in real time, and identifying the location of the wanted vehicle in real time using information about the locations of the searched vehicles. The method includes requesting, by the apparatus for searching for a wanted vehicle, that a black box installed in at least one vehicle search for the registration number of the wanted vehicle, receiving a response corresponding to the request, and acquiring information about the wanted vehicle and its location from the response, using either a black box that recognizes a vehicle registration number or a black box that detects a vehicle registration number region.
    Type: Grant
    Filed: July 30, 2014
    Date of Patent: September 13, 2016
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Jang-Hee Yoo, Jong-Gook Ko, Jin-Woo Choi, Ki-Young Moon
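    The method is essentially a request/response protocol between the search apparatus and in-vehicle black boxes. The data shapes involved might look like the sketch below; every field name is invented, and the server-side plate recognition for region-only black boxes is omitted.

    ```python
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class SearchRequest:
        wanted_plate: str                    # registration number of the wanted vehicle

    @dataclass
    class SearchResponse:
        black_box_id: str
        plate: Optional[str]                 # plate recognized by the black box, if it could read one
        plate_region_image: Optional[bytes]  # or just the detected plate region, for server-side recognition
        latitude: float
        longitude: float

    def locate_wanted_vehicle(request: SearchRequest,
                              responses: List[SearchResponse]) -> List[SearchResponse]:
        """Keep the responses whose recognized plate matches the wanted vehicle, giving its real-time locations."""
        return [r for r in responses if r.plate == request.wanted_plate]
    ```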
  • Patent number: 9369665
    Abstract: Disclosed is a video recording apparatus for a vehicle, which includes a camera unit formed so as to change its capturing direction; a driver's viewing direction detecting unit configured to detect the driver's viewing direction; a control unit configured to control the camera unit so that the capturing direction corresponds to the detected viewing direction; and a storing unit configured to store video obtained by the camera unit. Therefore, video in the direction in which the driver is looking is obtained and stored without mounting a plurality of cameras, making it possible to capture an accident involving the driver's vehicle or another driver's vehicle that occurs not only in front of the driver's vehicle but also to its side.
    Type: Grant
    Filed: June 19, 2013
    Date of Patent: June 14, 2016
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Dae Sung Moon, Han Sung Lee, Jin Woo Choi, Ki Young Moon, Jang Hee Yoo
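    The control unit's job reduces to steering one movable camera toward the driver's detected viewing direction. A toy sketch of that control step (the step limit is an invented smoothing choice):

    ```python
    def camera_pan_from_gaze(viewing_yaw_deg: float,
                             current_pan_deg: float,
                             max_step_deg: float = 5.0) -> float:
        """Move the single recording camera toward the driver's viewing direction.

        viewing_yaw_deg: horizontal viewing direction of the driver relative to straight ahead.
        current_pan_deg: current pan angle of the camera unit.
        The pan angle is stepped gradually so the stored video stays smooth.
        """
        error = viewing_yaw_deg - current_pan_deg
        step = max(-max_step_deg, min(max_step_deg, error))
        return current_pan_deg + step
    ```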