Patents by Inventor Cheonshu Park

Cheonshu Park has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10596708
    Abstract: The present disclosure relates to an interaction device capable of interacting with a human and, more particularly, with a plurality of participants. The interaction device includes a role classifying unit configured to classify a role for each of a plurality of participants based on an external stimulus signal for each participant, and an action adjusting unit configured to perform a different interaction operation for each participant based on that participant's role. An interaction device according to an embodiment of the present application classifies the roles of a plurality of participants according to a participation degree and/or an action state and provides a customized interaction operation for each participant according to the classified role. It is therefore possible to perform a natural interaction with a plurality of participants.
    Type: Grant
    Filed: March 22, 2017
    Date of Patent: March 24, 2020
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Cheonshu Park, Jae Hong Kim, Daeha Lee, Min Su Jang
  • Publication number: 20170274535
    Abstract: The present disclosure relates to an interaction device capable of interacting with a human and, more particularly, with a plurality of participants. The interaction device includes a role classifying unit configured to classify a role for each of a plurality of participants based on an external stimulus signal for each participant, and an action adjusting unit configured to perform a different interaction operation for each participant based on that participant's role. An interaction device according to an embodiment of the present application classifies the roles of a plurality of participants according to a participation degree and/or an action state and provides a customized interaction operation for each participant according to the classified role. It is therefore possible to perform a natural interaction with a plurality of participants.
    Type: Application
    Filed: March 22, 2017
    Publication date: September 28, 2017
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Cheonshu PARK, Jae Hong KIM, Daeha LEE, Min Su JANG
  • Publication number: 20140172909
    Abstract: An apparatus for providing a service application using a robot includes a sensing unit configured to generate environmental sensing information on the surroundings of the robot's moving path as well as user state information; and a user circumstance determination unit configured to determine the circumstance and intention of a user to generate user recognition information, and to search for and download a service application. Further, the apparatus includes a service provision unit configured to search for service representation devices around the moving path of the robot and migrate a service corresponding to the service application to at least one of the found devices; and a user feedback management unit configured to manage feedback information corresponding to the interaction between the user and the robot.
    Type: Application
    Filed: June 10, 2013
    Publication date: June 19, 2014
    Inventors: Cheonshu PARK, Min Su JANG, Daeha LEE, Jae Hong KIM, Young-Jo CHO, Jong-Hyun PARK
  • Publication number: 20120162411
    Abstract: An apparatus for operating a moving object in an unstructured environment includes a sensor unit configured to sense vibration of the moving object to produce a sensing signal; an image capturing unit configured to capture an image of the surrounding environment through which the moving object travels; a signal synchronization unit configured to synchronize the sensing signal from the sensor unit with the image captured by the image capturing unit through mapping between them; an amplitude extraction unit configured to extract a tilt of the moving object based on the sensing signal and to calculate a viewing angle and a moving distance of the moving object based on the extracted tilt; and an image correction unit configured to perform a coordinate transformation on the captured image using the viewing angle and the moving distance to generate a corrected image.
    Type: Application
    Filed: December 22, 2011
    Publication date: June 28, 2012
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Min Su JANG, Young-Jo CHO, Hyeonsung CHO, Cheonshu PARK, Jae Hong KIM, Joo Chan SOHN
  • Patent number: 8099372
    Abstract: A method of modeling a composite emotion in a multidimensional vector space includes creating an emotion vector space by defining the dimensions of the vector space in consideration of the stimuli affecting emotions, and dividing the defined multidimensional vector space into emotion regions. The method further includes creating a composite emotion by calculating a fuzzy partition matrix between a current state vector and the respective representative vectors in the created emotion vector space.
    Type: Grant
    Filed: June 27, 2008
    Date of Patent: January 17, 2012
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Joung Woo Ryu, Cheonshu Park, Joo Chan Sohn, Hyun Kyu Cho, Young-Jo Cho
  • Publication number: 20090248372
    Abstract: A method of modeling a composite emotion in a multidimensional vector space includes creating an emotion vector space by defining the dimensions of the vector space in consideration of the stimuli affecting emotions, and dividing the defined multidimensional vector space into emotion regions. The method further includes creating a composite emotion by calculating a fuzzy partition matrix between a current state vector and the respective representative vectors in the created emotion vector space.
    Type: Application
    Filed: June 27, 2008
    Publication date: October 1, 2009
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Joung Woo Ryu, Cheonshu Park, Joo Chan Sohn, Hyun Kyu Cho, Young-Jo Cho
  • Publication number: 20080119959
    Abstract: A method and apparatus for expressing an emotion of a robot, applicable across different emotion robot platforms, are provided. The method includes: collecting emotion information through at least one internal or external sensor; generating an emotion and determining a behavior based on the collected information; determining an emotion expression, an emotional intensity, and an action unit according to the generated emotion; generating an emotion expression document according to the determined emotion expression, emotional intensity, and action unit; analyzing the emotion expression document; and controlling the robot based on the robot's initial status information and the generated emotion expression document.
    Type: Application
    Filed: October 31, 2007
    Publication date: May 22, 2008
    Inventors: Cheonshu PARK, Joung Woo RYU, Joo Chan SOHN, Young Jo CHO
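
The role-classification scheme of patent 10596708 can be illustrated with a small sketch. The role names, thresholds, and the 0-to-1 participation scale below are illustrative assumptions; the abstract only specifies that roles are derived from a participation degree and/or an action state, and that each role receives a different interaction operation.

```python
from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    participation: float  # participation degree in [0, 1] (hypothetical scale)
    speaking: bool        # action state derived from external stimulus signals

def classify_role(p: Participant) -> str:
    # Illustrative thresholds; the patent does not specify concrete values.
    if p.speaking:
        return "speaker"
    if p.participation >= 0.5:
        return "addressee"
    return "bystander"

# One customized interaction operation per role (hypothetical behaviors).
ACTIONS = {
    "speaker": "maintain eye contact and nod",
    "addressee": "turn head toward participant",
    "bystander": "occasional glance",
}

def interaction_plan(participants):
    """Map each participant to the interaction operation for their role."""
    return {p.name: ACTIONS[classify_role(p)] for p in participants}
```

A group of three participants with different participation degrees thus receives three different interaction operations rather than one shared behavior.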
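
The service-migration step of publication 20140172909, in which a service is handed over to a suitable service representation device found along the robot's moving path, could be sketched as a capability-matching selection. The device records and capability names below are illustrative assumptions; the abstract does not specify a selection rule.

```python
def select_migration_target(devices, required):
    """Return the name of the first discovered service-representation device
    whose capability set covers everything the service application requires
    (illustrative policy; the patent does not fix a selection rule)."""
    for device in devices:
        if required <= device["capabilities"]:
            return device["name"]
    return None  # no suitable device found along the moving path
```
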
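
The tilt-compensating coordinate transformation of publication 20120162411 can be sketched as a plane rotation of image coordinates. This is a simplified stand-in: the abstract describes a correction driven by a viewing angle and a moving distance, and the rotation-only version below is an assumption for illustration.

```python
import math

def correct_coordinates(points, tilt_deg):
    """Rotate image coordinates by -tilt to compensate for the sensed tilt
    (a simplified stand-in for the patent's viewing-angle correction)."""
    t = math.radians(-tilt_deg)
    c, s = math.cos(t), math.sin(t)
    # Standard 2-D rotation applied to each (x, y) pixel coordinate.
    return [(x * c - y * s, x * s + y * c) for x, y in points]
```
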
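
The fuzzy partition matrix of patent 8099372 (and publication 20090248372) corresponds to the membership computation used in fuzzy c-means: one row of the matrix holds the memberships of the current state vector with respect to each representative emotion vector, and the composite emotion is the blend those memberships describe. The fuzzifier m = 2 and the use of Euclidean distance below are assumptions, since the abstract does not fix them.

```python
import math

def fuzzy_memberships(state, representatives, m=2.0):
    """One row of a fuzzy partition matrix: membership of the current state
    vector in each emotion region, via the fuzzy c-means membership formula
    u_i = 1 / sum_j (d_i / d_j)^(2/(m-1)), where d_i is the distance to
    representative vector i and m > 1 is the fuzzifier."""
    dists = [math.dist(state, r) for r in representatives]
    # If the state coincides with a representative, assign it full membership.
    if any(d == 0.0 for d in dists):
        return [1.0 if d == 0.0 else 0.0 for d in dists]
    e = 2.0 / (m - 1.0)
    return [1.0 / sum((di / dj) ** e for dj in dists) for di in dists]
```

Memberships always sum to one, so nearby representative vectors dominate the composite emotion without fully excluding the others.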
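
The platform-neutral emotion expression document of publication 20080119959 can be sketched as a small XML payload carrying the emotion, its intensity, and the action units; the element names and schema below are purely hypothetical, since the abstract does not specify a format.

```python
import xml.etree.ElementTree as ET

def build_emotion_document(emotion, intensity, action_units):
    """Serialize an emotion expression into a platform-neutral XML document
    (hypothetical schema) that any robot platform could parse and enact."""
    root = ET.Element("emotionExpression")
    ET.SubElement(root, "emotion").text = emotion
    ET.SubElement(root, "intensity").text = str(intensity)
    units = ET.SubElement(root, "actionUnits")
    for au in action_units:
        ET.SubElement(units, "unit").text = au
    return ET.tostring(root, encoding="unicode")
```

A platform-specific controller would then parse this document and map each action unit onto its own actuators, which is what makes the expression portable across robot platforms.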