Patents by Inventor Ga-Gue Kim

Ga-Gue Kim has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220215932
    Abstract: Provided is a method of analyzing multimodal user experience data to provide a psychological stability service, the method including: receiving multimodal user experience data collected on the basis of a predetermined user device; analyzing associations among the collected multimodal user experience data to identify whether a user is in a stressful situation; when the stressful situation is identified, analyzing a frequency pattern of the user's stressful situations; recognizing stress context information about the stressful situation occurring to the user on the basis of the frequency pattern; and providing the user device with a preemptive psychological stability service on the basis of the stress context information corresponding to the stressful situation.
    Type: Application
    Filed: November 29, 2021
    Publication date: July 7, 2022
    Inventors: Kyoung Ju NOH, Jeong Muk LIM, Chi Yoon JEONG, Hyun Tae JEONG, Ga Gue KIM, Ji Youn LIM, Seung Eun CHUNG
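The pipeline this abstract claims (collect multimodal samples, detect stress, mine the frequency pattern, derive context, schedule a preemptive service) can be sketched roughly as below. Everything here — the `Sample` fields, the thresholds, and the `stress_context` helper — is an illustrative assumption, not the patented implementation:

```python
# Hypothetical sketch of the claimed flow: samples -> stress detection ->
# frequency pattern -> stress context. Thresholds and field names are
# invented for illustration only.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Sample:
    heart_rate: float      # bpm from a wearable (assumed modality)
    speech_pitch: float    # Hz from a microphone (assumed modality)
    hour: int              # local hour the sample was taken

def is_stressful(s: Sample) -> bool:
    # Toy association rule: elevated heart rate *and* raised pitch together.
    return s.heart_rate > 100 and s.speech_pitch > 220

def stress_context(samples: list[Sample]) -> dict:
    stressful = [s for s in samples if is_stressful(s)]
    # Frequency pattern: which hours the stressful situation recurs in.
    pattern = Counter(s.hour for s in stressful)
    peak_hour, count = pattern.most_common(1)[0] if pattern else (None, 0)
    return {"episodes": len(stressful), "peak_hour": peak_hour,
            "recurring": count >= 2}

samples = [Sample(110, 240, 9), Sample(72, 180, 12), Sample(115, 250, 9)]
ctx = stress_context(samples)
# A preemptive service would then be scheduled shortly before ctx["peak_hour"].
```

In this reading, "preemptive" simply means the service fires before the recurring peak hour rather than after stress is already observed.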
  • Publication number: 20220207382
    Abstract: Provided is an apparatus for refining data and improving the performance of a behavior recognition model by reflecting time-series characteristics of a behavior. The apparatus includes: a data pre-processing unit configured to receive training data and real-time data as input, identify a missing value of sensor data, and interpolate the sensor data; a behavior recognition unit configured to, through a behavior recognition model, generate a behavior recognition classification result for the preprocessed real-time data; a data refinement unit configured to correct the behavior recognition classification result to generate a refined dataset; a learning model update unit configured to analyze a similarity of the refined dataset and, based on a result of the analysis, perform learning to generate the behavior recognition model; and an information output unit configured to express a corrected behavior recognition result to a user.
    Type: Application
    Filed: November 30, 2021
    Publication date: June 30, 2022
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Seung Eun CHUNG, Chi Yoon JEONG, Hyun Tae JEONG, Ga Gue KIM, Kyoung Ju NOH, Jeong Muk LIM, Ji Youn LIM
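The pre-processing unit's first job — identify missing sensor values and interpolate them — is the most concrete step in this abstract. A minimal sketch under assumptions (linear interpolation, `None` marking a missing value, first and last samples present):

```python
# Illustrative sketch, not the patented method: find each missing value in a
# sensor stream and fill it linearly between its nearest known neighbors.
def interpolate_missing(stream: list) -> list:
    out = list(stream)
    for i, v in enumerate(out):
        if v is None:
            # Nearest known neighbor on each side (assumes the stream's
            # first and last samples are present).
            lo = next(j for j in range(i - 1, -1, -1) if out[j] is not None)
            hi = next(j for j in range(i + 1, len(out)) if out[j] is not None)
            frac = (i - lo) / (hi - lo)
            out[i] = out[lo] + frac * (out[hi] - out[lo])
    return out

print(interpolate_missing([1.0, None, 3.0, None, None, 9.0]))
# Gaps are filled in place; a time-series-aware variant would also weight by
# sample timestamps rather than by index.
```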
  • Publication number: 20220122727
    Abstract: Provided are an apparatus and method for generating a narrative for lifestyle recognition. The apparatus for generating a narrative for lifestyle recognition includes an input unit configured to collect basic data of narrative generation for the lifestyle recognition, a memory configured to store a program for generating a user's lifestyle narrative using the basic data, and a processor configured to execute the program, in which the processor performs a causal relationship analysis on a cause of a mind state and a result of the mind state using the lifestyle narrative.
    Type: Application
    Filed: October 15, 2021
    Publication date: April 21, 2022
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Jiyoun LIM, Hyuntae JEONG, Ga Gue KIM, Kyoung Ju NOH, Jeong Mook LIM, Seungeun CHUNG, Chiyoon JEONG
  • Patent number: 10983808
    Abstract: The present invention relates to a method and apparatus for providing an emotion-adaptive user interface (UI) on the basis of an affective computing service, in which the provided service is configured with at least one of a service operation condition, a service end condition, and an emotion-adaptive UI service type on the basis of purpose information of the service, and the detailed pattern is changed and provided on the basis of the purpose information and the usefulness information of the service.
    Type: Grant
    Filed: August 22, 2019
    Date of Patent: April 20, 2021
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Kyoung Ju Noh, Hyun Tae Jeong, Ga Gue Kim, Ji Youn Lim, Seung Eun Chung
  • Patent number: 10789961
    Abstract: Disclosed is technology for providing a proper UI/UX through various devices or services when the occurrence of a registered concerned context is predicted or recognized, in order to predict or recognize circumstances that require attention or emotion control with regard to a change in the user's biological information. With this, a user designates his/her own biological information range or emotional state with regard to circumstances which catch his/her attention, and registers a concerned context by selectively designating attributes of circumstantial elements. Further, a user registers feedback desired to be given and an external device/service desired to interface with when the occurrence of the concerned context is predicted or recognized. According to the attributes of the circumstances designated in the registered concerned context, points in time for collecting and managing UX data are automatically determined, thereby processing and managing the UX data as useful information.
    Type: Grant
    Filed: December 13, 2018
    Date of Patent: September 29, 2020
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Kyoung Ju Noh, Ji Youn Lim, Ga Gue Kim, Seung Eun Chung, Hyun Tae Jeong
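The registration model in this abstract — a biological-information range plus the feedback and external device to trigger when the context is recognized — can be captured in a small data structure. All field names and values here are assumptions for illustration:

```python
# Minimal sketch of a "concerned context" registry, assuming heart rate as
# the registered biological-information range. Not the patented data model.
from dataclasses import dataclass

@dataclass
class ConcernedContext:
    name: str
    hr_range: tuple          # (low, high) heart-rate range defining the context
    feedback: str            # feedback the user registered to receive
    target_device: str       # external device/service to interface with

registry = [
    ConcernedContext("pre-meeting anxiety", (95, 130),
                     "breathing prompt", "smartwatch"),
]

def recognize(hr: int) -> list:
    # Occurrence is recognized when the measured value falls in a range.
    return [c for c in registry if c.hr_range[0] <= hr <= c.hr_range[1]]

for c in recognize(104):
    # The claimed system would now start collecting UX data for this context
    # and route c.feedback to c.target_device.
    print(c.name, "->", c.target_device)
```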
  • Publication number: 20200225963
    Abstract: The present invention relates to a method and apparatus for providing an emotion-adaptive user interface (UI) on the basis of an affective computing service, in which the provided service is configured with at least one of a service operation condition, a service end condition, and an emotion-adaptive UI service type on the basis of purpose information of the service, and the detailed pattern is changed and provided on the basis of the purpose information and the usefulness information of the service.
    Type: Application
    Filed: August 22, 2019
    Publication date: July 16, 2020
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Kyoung Ju NOH, Hyun Tae JEONG, Ga Gue KIM, Ji Youn LIM, Seung Eun CHUNG
  • Publication number: 20200111569
    Abstract: Provided are a system, computing device, and method for analyzing a sleep-related activity pattern using multimodal sensor data. The system includes a user device configured to be attached to a user's body and collect data through a sensor module and a computing device configured to recognize actions of the user from the data, generate action sequence sets on the basis of chronological order of the recognized actions of the user, cluster activities of the user as sleep activities and non-sleep activities on the basis of the action sequence sets, extract non-sleep activity patterns associated with the sleep activities through correlation analysis between sequences including time information of the clustered non-sleep activities and sleep activities, and provide the extracted non-sleep activity patterns to the user device.
    Type: Application
    Filed: August 27, 2019
    Publication date: April 9, 2020
    Inventors: Ji Youn LIM, Hyun Tae JEONG, Ga Gue KIM, Kyoung Ju NOH, Seung Eun CHUNG
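The core of this abstract is ordering recognized actions chronologically and separating them into sleep and non-sleep activities before correlation analysis. A hedged sketch — the action labels and the label set are assumptions, not taken from the patent:

```python
# Illustrative sketch: build a chronological action sequence from timestamped
# recognition events, then split it into sleep vs. non-sleep activities.
SLEEP_ACTIONS = {"lying_still", "sleeping"}   # assumed label set

def cluster(events: list) -> tuple:
    # events: (timestamp_hour, action_label) pairs from the recognizer
    ordered = [action for _, action in sorted(events)]   # chronological order
    sleep = [a for a in ordered if a in SLEEP_ACTIONS]
    non_sleep = [a for a in ordered if a not in SLEEP_ACTIONS]
    return sleep, non_sleep

events = [(23, "sleeping"), (21, "phone_use"), (22, "lying_still"),
          (20, "caffeine_intake")]
sleep, non_sleep = cluster(events)
# Correlation analysis would then test whether non-sleep sequences such as
# ["caffeine_intake", "phone_use"] systematically precede the sleep sequence.
```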
  • Publication number: 20190371344
    Abstract: Disclosed is technology for providing a proper UI/UX through various devices or services when the occurrence of a registered concerned context is predicted or recognized, in order to predict or recognize circumstances that require attention or emotion control with regard to a change in the user's biological information. With this, a user designates his/her own biological information range or emotional state with regard to circumstances which catch his/her attention, and registers a concerned context by selectively designating attributes of circumstantial elements. Further, a user registers feedback desired to be given and an external device/service desired to interface with when the occurrence of the concerned context is predicted or recognized. According to the attributes of the circumstances designated in the registered concerned context, points in time for collecting and managing UX data are automatically determined, thereby processing and managing the UX data as useful information.
    Type: Application
    Filed: December 13, 2018
    Publication date: December 5, 2019
    Inventors: Kyoung Ju NOH, Ji Youn LIM, Ga Gue KIM, Seung Eun CHUNG, Hyun Tae JEONG
  • Patent number: 9835878
    Abstract: Disclosed herein is a wearable eyeglass device including: an optical communication module receiving a first optical signal and transmitting a second optical signal; a display module displaying information corresponding to at least one of the first and second optical signals; and a control module controlling the display module to display first information corresponding to the first optical signal at the time of receiving the first optical signal and controlling the optical communication module to transmit the second optical signal corresponding to second information.
    Type: Grant
    Filed: January 30, 2015
    Date of Patent: December 5, 2017
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Dong-Woo Lee, Hyung-Cheol Shin, Ga-Gue Kim, Sung-Yong Shin, Gi-Su Heo, Yong-Ki Son, Hyun-Tae Jeong
  • Patent number: 9342751
    Abstract: Technology for a method of detecting a user hand by a user hand detecting device. The method according to an aspect of the present invention includes extracting a first mask image from a depth image in which the user hand is imaged; extracting a second mask image having a preset skin color value among regions corresponding to the first mask image in a color image in which the user hand is imaged; generating a skin color value histogram model in a color space different from a region of the color image corresponding to a color region of the second mask image; generating a skin color probability image of the different color space from the color image using the skin color value histogram model and an algorithm for detecting a skin color region; and combining the skin color probability image with the second mask image and detecting the user's hand region.
    Type: Grant
    Filed: May 14, 2015
    Date of Patent: May 17, 2016
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Gi Su Heo, Dong Woo Lee, Sung Yong Shin, Ga Gue Kim, Hyung Cheol Shin
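The histogram step claimed here — build a color histogram from pixels inside the second mask, then score every pixel by its bin's normalized frequency to get a skin-color probability image — can be sketched in a few lines. This is a rough stand-in using a single hue channel and pure Python lists; the bin count, channel choice, and function names are assumptions, not the patented algorithm:

```python
# Rough sketch of a skin-color histogram model and back-projection.
def hue_histogram(hues, mask, bins=8):
    # Accumulate hue values (0-255) of pixels inside the skin mask.
    hist = [0] * bins
    for h, m in zip(hues, mask):
        if m:
            hist[h * bins // 256] += 1
    total = sum(hist) or 1
    return [v / total for v in hist]             # normalized frequencies

def probability_image(hues, hist, bins=8):
    # Back-project: each pixel's score is its hue bin's frequency.
    return [hist[h * bins // 256] for h in hues]

hues = [10, 12, 200, 15, 210]    # flattened hue channel of the color image
mask = [1, 1, 0, 1, 0]           # second mask (preset skin-color range)
prob = probability_image(hues, hue_histogram(hues, mask))
# High values mark likely skin; combining prob with the second mask and
# thresholding would recover the hand region, per the claim.
```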
  • Publication number: 20150332471
    Abstract: Technology for a method of detecting a user hand by a user hand detecting device. The method according to an aspect of the present invention includes extracting a first mask image from a depth image in which the user hand is imaged; extracting a second mask image having a preset skin color value among regions corresponding to the first mask image in a color image in which the user hand is imaged; generating a skin color value histogram model in a color space different from a region of the color image corresponding to a color region of the second mask image; generating a skin color probability image of the different color space from the color image using the skin color value histogram model and an algorithm for detecting a skin color region; and combining the skin color probability image with the second mask image and detecting the user's hand region.
    Type: Application
    Filed: May 14, 2015
    Publication date: November 19, 2015
    Inventors: Gi Su HEO, Dong Woo LEE, Sung Yong SHIN, Ga Gue KIM, Hyung Cheol SHIN
  • Publication number: 20150219933
    Abstract: Disclosed herein is a wearable eyeglass device including: an optical communication module receiving a first optical signal and transmitting a second optical signal; a display module displaying information corresponding to at least one of the first and second optical signals; and a control module controlling the display module to display first information corresponding to the first optical signal at the time of receiving the first optical signal and controlling the optical communication module to transmit the second optical signal corresponding to second information.
    Type: Application
    Filed: January 30, 2015
    Publication date: August 6, 2015
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Dong-Woo LEE, Hyung-Cheol SHIN, Ga-Gue KIM, Sung-Yong SHIN, Gi-Su HEO, Yong-Ki SON, Hyun-Tae JEONG
  • Publication number: 20140274001
    Abstract: A method of connecting a call by a recipient terminal includes the recipient terminal receiving a call from a caller terminal, and displaying the purpose of the call, input by a caller and included in the call, on a call reception indication screen.
    Type: Application
    Filed: June 7, 2013
    Publication date: September 18, 2014
    Inventors: Ji Yong KIM, Jong Ho WON, Ga Gue KIM, Kyoung PARK
  • Publication number: 20100245118
    Abstract: There are provided a multimodal fusion apparatus capable of remotely controlling a number of electronic devices, and a method for remotely controlling a number of electronic devices in the multimodal fusion apparatus. In accordance with the present invention, instead of a single input device such as a remote control, multimodal commands, such as user-familiar voice, gestures and the like, are used to remotely control a number of electronic devices installed at home or within a specific space. That is, diverse electronic devices are controlled in the same manner by the multimodal commands. When a new electronic device is added, its control commands are automatically configured.
    Type: Application
    Filed: July 8, 2008
    Publication date: September 30, 2010
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Dong Woo Lee, Il Yeon Cho, Ga Gue Kim, Ji Eun Kim, Jeong Mook Lim, John Sunwoo
  • Publication number: 20090241171
    Abstract: A wearable system and a method for transferring and controlling information/service based on biologically generated information from a user are provided. In the method, an intuitive bio signal generated by a user is sensed, and a device pointed to by the sensed bio signal is selected. Then, bio signal information is created using the sensed bio signal, and the created bio signal information is transmitted to the selected device. Finally, the information/service is transferred to the selected device once it is confirmed that the selected device that received the bio signal information is activated.
    Type: Application
    Filed: August 29, 2007
    Publication date: September 24, 2009
    Inventors: John Sunwoo, Yong-Ki Son, Myoung-Hwan Oh, Hyun-Tae Jeong, Ji-Young Choi, Ji-Eun Kim, Hee-Joong Ahn, Jin-Ho Yoo, Bae-Sun Kim, Dong-Woo Lee, Ga-Gue Kim, Il-Yeon Cho, Dong-Won Han