Patents Assigned to CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
  • Patent number: 9927869
    Abstract: Disclosed is an apparatus for outputting a virtual keyboard, the apparatus including: a virtual keyboard image output unit determining coordinates of a virtual keyboard image by using hand information of a user and outputting the virtual keyboard image; a contact recognition unit determining a contact state by using collision information between a virtual physical collider associated with an end point of a user's finger and a virtual physical collider associated with each virtual key of the virtual keyboard image; a keyboard input unit providing multiple input values for a single virtual key; and a feedback output unit outputting respective feedback for the multiple input values. Accordingly, input convenience and efficiency may be provided by outputting the virtual keyboard in a three-dimensional virtual space and reproducing an input method using a keyboard form that is similar to the real world. (A minimal code sketch of this contact check follows this entry.)
    Type: Grant
    Filed: June 12, 2017
    Date of Patent: March 27, 2018
    Assignee: Center Of Human-Centered Interaction for Coexistence
    Inventors: Joung Huem Kwon, Yongho Lee, Bum Jae You
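    A minimal Python sketch, assuming spherical colliders and invented press-depth thresholds, of how a contact state could be derived from the overlap between a fingertip collider and a key collider and then mapped to one of several input values for a single key. All names and thresholds are illustrative, not the patent's implementation.
    ```python
    # Hypothetical collision-based contact check for one virtual key.
    from dataclasses import dataclass
    import math

    @dataclass
    class SphereCollider:
        x: float
        y: float
        z: float
        radius: float

    def penetration_depth(a: SphereCollider, b: SphereCollider) -> float:
        """Overlap between two spherical colliders; values <= 0 mean no contact."""
        distance = math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))
        return (a.radius + b.radius) - distance

    def key_input(fingertip: SphereCollider, key: SphereCollider,
                  light_press: float = 0.002, deep_press: float = 0.006) -> str:
        """Map the press depth to one of several input values for a single key."""
        depth = penetration_depth(fingertip, key)
        if depth <= 0:
            return "none"            # no collision reported
        if depth < light_press:
            return "hover"           # contact, but no key value yet
        if depth < deep_press:
            return "primary_input"   # e.g. the key's base character
        return "secondary_input"     # e.g. an alternate character, with distinct feedback
    ```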
  • Publication number: 20180055367
    Abstract: Provided is an in vivo bioimaging method including irradiating near-infrared (NIR) light onto a living body, converting the NIR light that has passed through the living body into visible light using upconversion nanoparticles (UCNPs), and generating a bioimage of the living body by receiving the visible light with a complementary metal-oxide-semiconductor (CMOS) image sensor.
    Type: Application
    Filed: July 31, 2017
    Publication date: March 1, 2018
    Applicants: KOREA INSTITUTE OF SCIENCE AND TECHNOLOGY, CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Hwa Sup LIM, Seok Joon KWON, Sang Chul AHN, Bum Jae YOU
  • Publication number: 20180046738
    Abstract: A method of controlling a virtual model by performing physics simulation on the virtual model in a virtual space includes: generating a first virtual model having a first object physics field, which is a range with respect to a first field parameter; generating a second virtual model having a second object physics field, which is a range with respect to a second field parameter; when the field parameters are capable of corresponding to each other, checking whether there is a portion where the object physics fields correspond to each other; and when there is such a portion, generating an interaction between the virtual models. (A minimal code sketch of this field check follows this entry.)
    Type: Application
    Filed: June 7, 2017
    Publication date: February 15, 2018
    Applicants: KOREA INSTITUTE OF SCIENCE AND TECHNOLOGY, CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Jung Min PARK, Joongjae Lee, Jisu Kim
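    A minimal Python sketch of the field check described above, assuming spherical object physics fields identified by a named field parameter; the parameter names, the spherical-range approximation, and the boolean interaction trigger are assumptions for illustration.
    ```python
    # Hypothetical object-physics-field overlap test between two virtual models.
    from dataclasses import dataclass

    @dataclass
    class ObjectPhysicsField:
        parameter: str               # e.g. "temperature" or "magnetism"
        center: tuple                # field origin (x, y, z) in the virtual space
        radius: float                # spatial range of the field

    def fields_correspond(a: ObjectPhysicsField, b: ObjectPhysicsField) -> bool:
        """Fields can interact only when their field parameters correspond."""
        return a.parameter == b.parameter

    def fields_overlap(a: ObjectPhysicsField, b: ObjectPhysicsField) -> bool:
        """Check for a portion where the two ranges correspond (sphere overlap)."""
        dx, dy, dz = (a.center[i] - b.center[i] for i in range(3))
        return (dx * dx + dy * dy + dz * dz) ** 0.5 < a.radius + b.radius

    def should_interact(a: ObjectPhysicsField, b: ObjectPhysicsField) -> bool:
        """Generate an interaction only if the parameters match and the fields overlap."""
        return fields_correspond(a, b) and fields_overlap(a, b)
    ```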
  • Publication number: 20170371405
    Abstract: Disclosed is an apparatus for outputting a virtual keyboard, the apparatus including: a virtual keyboard image output unit determining coordinates of a virtual keyboard image by using hand information of a user and outputting the virtual keyboard image; a contact recognition unit determining a contact state by using collision information between a virtual physical collider associated with an end point of a user's finger and a virtual physical collider associated with each virtual key of the virtual keyboard image; a keyboard input unit providing multiple input values for a single virtual key; and a feedback output unit outputting respective feedback for the multiple input values. Accordingly, input convenience and efficiency may be provided by outputting the virtual keyboard in a three-dimensional virtual space and reproducing an input method using a keyboard form that is similar to the real world.
    Type: Application
    Filed: June 12, 2017
    Publication date: December 28, 2017
    Applicant: Center of Human-Centered Interaction For Coexistence
    Inventors: Joung Huem Kwon, Yongho Lee, Bum Jae You
  • Publication number: 20170354353
    Abstract: A motion capture system includes a motion sensor having a flexible body and a fiber Bragg grating (FBG) sensor inserted into the body, a fixture configured to fix the motion sensor to the body of a user, a light source configured to irradiate light onto the motion sensor, and a measurer configured to analyze reflected light output from the motion sensor, wherein the FBG sensor includes an optical fiber extending along a longitudinal direction of the body and a sensing unit formed in a partial region of the optical fiber and having a plurality of gratings, and wherein a change in the wavelength spectrum of the reflected light, caused by a change in the interval of the gratings due to a motion of the user, is detected to measure a motion state of the user. (A wavelength-to-strain sketch follows this entry.)
    Type: Application
    Filed: June 2, 2017
    Publication date: December 14, 2017
    Applicants: KOREA INSTITUTE OF SCIENCE AND TECHNOLOGY, CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Jinseok KIM, Hyun Joon SHIN, Bum-Jae YOU, Sungwook YANG
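    The wavelength-shift measurement can be illustrated with the standard FBG strain relation Δλ_B / λ_B ≈ (1 − p_e)·ε for a silica fiber. The sketch below assumes a typical photo-elastic coefficient of about 0.22 and ignores temperature effects; it is not the patent's calibration procedure.
    ```python
    # Illustrative strain estimate from a measured Bragg wavelength shift.
    def fbg_strain(lambda_bragg_nm: float, delta_lambda_nm: float,
                   photo_elastic_coeff: float = 0.22) -> float:
        """Axial strain inferred from the shift of the reflected Bragg wavelength."""
        return delta_lambda_nm / (lambda_bragg_nm * (1.0 - photo_elastic_coeff))

    # Example: a 1550 nm grating whose reflection peak shifts by 0.12 nm
    # corresponds to roughly 1e-4 strain (about 100 microstrain).
    print(f"strain = {fbg_strain(1550.0, 0.12):.2e}")
    ```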
  • Patent number: 9843772
    Abstract: A method for providing telepresence by employing avatars is provided. The method includes steps of: (a) a corresponding location searching part determining a location in a first space where an avatar Y′ corresponding to a human Y in a second space will be placed, if a change of a location of the human Y in the second space is detected from an initial state, by referring to (i) information on the first space and the second space and (ii) information on locations of the humans X and Y, and the avatar X′ in the first and the second spaces; and (b) an avatar motion creating part creating a motion of the avatar Y′ by referring to information on the determined location where the avatar Y′ will be placed.
    Type: Grant
    Filed: November 30, 2015
    Date of Patent: December 12, 2017
    Assignees: Korea Advanced Institute Of Science And Technology, Center Of Human-Centered Interaction For Coexistence
    Inventors: Sung Hee Lee, Bum Jae You
  • Patent number: 9838587
    Abstract: A space registration system includes a display apparatus, an image sensor, a processor, and a plane mirror. The display apparatus has an inherent display coordinate system which defines a coordinate of the virtual space. The image sensor has an inherent sensor coordinate system which defines a coordinate of the real space. The processor analogizes a transformation equation of the sensor coordinate system and the display coordinate system by means of symmetry of an incidence angle and a reflection angle of light with respect to the mirror surface, compares a coordinate of the reflection image with respect to the display coordinate system with a known coordinate of the reflection image with respect to the sensor coordinate system, and adjusts the transformation equation of the sensor coordinate system and the display coordinate system. (A reflection-geometry sketch follows this entry.)
    Type: Grant
    Filed: June 13, 2016
    Date of Patent: December 5, 2017
    Assignee: Center of Human-centered Interaction for Coexistence
    Inventors: Young-Yong Kim, Junsik Kim, Jung Min Park
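    The geometric identity behind the "symmetry of an incidence angle and a reflection angle" is the plane-mirror (Householder) reflection. The sketch below reflects a displayed point across an assumed mirror plane; the calibration loop that compares the observed reflection image and adjusts the sensor-to-display transform is omitted.
    ```python
    # Reflection of a 3-D point across a plane mirror (Householder reflection).
    import numpy as np

    def reflect_across_mirror(p: np.ndarray, mirror_point: np.ndarray,
                              mirror_normal: np.ndarray) -> np.ndarray:
        """Mirror image of point p about the plane through mirror_point
        with normal mirror_normal."""
        n = mirror_normal / np.linalg.norm(mirror_normal)
        return p - 2.0 * np.dot(p - mirror_point, n) * n

    # The reflection image seen by the image sensor should coincide with the
    # mirror image of the displayed point; any residual between the two drives
    # the adjustment of the sensor-to-display transformation.
    p_display = np.array([0.1, 0.2, 1.0])
    p_mirror_image = reflect_across_mirror(p_display,
                                           mirror_point=np.array([0.0, 0.0, 0.5]),
                                           mirror_normal=np.array([0.0, 0.0, 1.0]))
    print(p_mirror_image)   # [0.1, 0.2, 0.0]
    ```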
  • Publication number: 20170333724
    Abstract: A bio-stimulation robot includes a stationary platform, a plurality of drive modules coupled to the stationary platform, and a motion platform coupled to the drive modules, which operate to change a position of the motion platform. Each of the drive modules includes a first guide member having an arc shape, a motion member coupled to the first guide member, and a leg member having a first end coupled to the motion member and a second end fixed to the motion platform. The motion member slides along the first guide member, and the second end of the leg member is rotatably connected to the motion platform.
    Type: Application
    Filed: May 17, 2016
    Publication date: November 23, 2017
    Applicants: Center of Human-Centered Interaction for Coexistence, Industry-University Cooperation Foundation Hanyang University Erica Campus
    Inventors: Sungon Lee, Jun-Woo Kim, Woo-Seok Ryu, Sung-Teak Cho, Hyung-Min Kim, Bum-Jae You
  • Patent number: 9804677
    Abstract: An apparatus for creating virtual joint sensation is provided. The apparatus includes: a controlling part for creating control signals for controlling the respective user's joints by referring to information on torques to be applied to the respective user's joints, wherein the information on the torques is acquired by analyzing information on forces to be applied to the user's body contacting a virtual object; and a torque-applying part, including one or more torque-applying units worn on the respective user's joints, for giving the torques to the respective user's joints by using the control signals. (A Jacobian-transpose sketch follows this entry.)
    Type: Grant
    Filed: February 4, 2016
    Date of Patent: October 31, 2017
    Assignees: Korea Institute of Science and Technology, Center Of Human-Centered Interaction For Coexistence
    Inventors: Do Ik Kim, Jung Min Park, Joong-Jae Lee
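    One standard way to obtain joint torques from a force applied at a contact point on the body is the Jacobian-transpose relation τ = Jᵀ F. The sketch below uses placeholder numbers and a generic robotics identity, not the patent's analysis of the virtual contact.
    ```python
    # Joint torques from a contact force via the Jacobian transpose (tau = J^T F).
    import numpy as np

    def joint_torques(jacobian: np.ndarray, contact_force: np.ndarray) -> np.ndarray:
        """jacobian: 3 x n linear Jacobian of the contact point w.r.t. the n joints;
        contact_force: 3-vector of force from the virtual object (newtons)."""
        return jacobian.T @ contact_force

    J = np.array([[0.0, -0.3, -0.15],   # placeholder 3x3 Jacobian (3 joints)
                  [0.4,  0.0,  0.0],
                  [0.0,  0.2,  0.1]])
    F = np.array([0.0, 5.0, -2.0])      # force the virtual object exerts on the hand
    print(joint_torques(J, F))          # torque command per joint (N·m)
    ```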
  • Publication number: 20170049375
    Abstract: A social intimacy determining method includes detecting electrocardiogram (ECG) data from at least two subjects; detecting heart rhythm pattern (HRP) data from the ECG signals of the two subjects; and determining a relationship (intimacy) between the two subjects by comparing the HRP data of the two subjects. (A minimal similarity sketch follows this entry.)
    Type: Application
    Filed: June 19, 2014
    Publication date: February 23, 2017
    Applicants: SANGMYUNG UNIVERSITY SEOUL INDUSTRY-ACADEMY COOPERATION FOUNDATION, CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Sang In PARK, Min Cheol WHANG, Myoung Ju WON, Sung Teac HWANG
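    The abstract does not specify the comparison measure, so the sketch below uses a plain Pearson correlation between the two subjects' heart-rhythm-pattern series as a stand-in; the example R-R interval values are invented.
    ```python
    # Stand-in HRP similarity: Pearson correlation of two R-R interval series.
    import numpy as np

    def hrp_similarity(hrp_a: np.ndarray, hrp_b: np.ndarray) -> float:
        """Correlation in [-1, 1]; here, higher is read as greater intimacy."""
        return float(np.corrcoef(hrp_a, hrp_b)[0, 1])

    subject_a = np.array([0.82, 0.80, 0.85, 0.83, 0.81])   # R-R intervals (s)
    subject_b = np.array([0.79, 0.78, 0.84, 0.82, 0.80])
    print(hrp_similarity(subject_a, subject_b))
    ```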
  • Publication number: 20170026490
    Abstract: In accordance with an aspect, there is provided a method for supporting provision of service so that a client terminal is provided with desired service by adaptively modifying a network topology depending on service properties, including (a) when service type information indicating a type of desired service is acquired from the client terminal, and status information indicating status of one or more service provision servers is acquired, acquiring, by a management server, network configuration information as information corresponding to the service type information and the status information with reference to a DB, wherein the network configuration information is required by the client terminal to be provided with the service from a specific service provision server; and (b) transmitting, by the management server, the acquired network configuration information to the client terminal, thus supporting network configuration such that the client terminal configures a network based on the network configuration information. (A minimal lookup sketch follows this entry.)
    Type: Application
    Filed: July 14, 2016
    Publication date: January 26, 2017
    Applicant: Center of Human-Centered Interaction for Coexistence
    Inventors: Joong Jae Lee, Eun Mi Lee, Sang Hun Nam, Bum Jae You
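    A toy sketch of the management-server lookup: the pair (service type, server status) keys into a configuration table whose entry tells the client how to configure its network. Table contents and key names are invented for illustration.
    ```python
    # Hypothetical DB lookup performed by the management server.
    NETWORK_CONFIG_DB = {
        ("video_stream", "normal"):     {"topology": "client-server", "server": "srv-1"},
        ("video_stream", "overloaded"): {"topology": "p2p-relay",     "server": "srv-2"},
        ("haptic",       "normal"):     {"topology": "direct-udp",    "server": "srv-3"},
    }

    def get_network_configuration(service_type: str, server_status: str) -> dict:
        """Return the configuration the client terminal needs to reach a suitable server."""
        return NETWORK_CONFIG_DB.get(
            (service_type, server_status),
            {"topology": "client-server", "server": "default"},
        )

    print(get_network_configuration("video_stream", "overloaded"))
    ```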
  • Publication number: 20160371889
    Abstract: A space registration system includes a display apparatus, an image sensor, a processor, and a plane mirror. The display apparatus has an inherent display coordinate system which defines a coordinate of the virtual space. The image sensor has an inherent sensor coordinate system which defines a coordinate of the real space. The processor analogizes a transformation equation of the sensor coordinate system and the display coordinate system by means of symmetry of an incidence angle and a reflection angle of light with respect to the mirror surface, compares a coordinate of the reflection image with respect to the display coordinate system with a known coordinate of the reflection image with respect to the sensor coordinate system, and adjusts the transformation equation of the sensor coordinate system and the display coordinate system.
    Type: Application
    Filed: June 13, 2016
    Publication date: December 22, 2016
    Applicant: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Young-Yong KIM, Junsik KIM, Jung Min PARK
  • Patent number: 9524031
    Abstract: The present invention relates to an apparatus for recognizing a gesture in a space. In accordance with an embodiment, a spatial gesture recognition apparatus includes a pattern formation unit for radiating light onto a surface of an object required to input a gesture in a virtual air bounce, and forming a predetermined pattern on the surface of the object, an image acquisition unit for acquiring a motion image of the object, and a processing unit for recognizing a gesture input by the object based on the pattern formed on the surface of the object using the acquired image. In this way, depending on the depths of an object required to input a gesture in a space, haptic feedbacks having different intensities are provided, and thus a user can precisely input his or her desired gesture.
    Type: Grant
    Filed: November 18, 2013
    Date of Patent: December 20, 2016
    Assignee: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Kiwon Yeom, Hyejin Han, Bumjae You
  • Publication number: 20160358380
    Abstract: A head-mounted device (HMD) for enabling a 3D drawing interaction in a mixed-reality space is provided. The HMD includes a frame section, a rendering unit providing a specified image, a camera unit attached to the frame section to pick up an image for rendering, and a control unit configured to, when the camera unit picks up an image of a specified marker, perform a calibration process based on position information of the image of the marker displayed on a screen of the HMD and to, when there is a motion of an input device for interaction with a virtual whiteboard, obtain position information of an image of the input device displayed on a virtual camera screen based on position information of the whiteboard. (A marker-pose sketch follows this entry.)
    Type: Application
    Filed: June 2, 2016
    Publication date: December 8, 2016
    Applicant: Center of Human-Centered Interaction for Coexistence
    Inventors: Ki Won Yeom, Joung Huem Kwon, Ji Yong Lee, Bum Jae You
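    The marker-based calibration step can be illustrated with a standard perspective-n-point pose estimate. The sketch below assumes the four marker corners have already been detected in the HMD camera image and uses OpenCV's solvePnP; the marker size and camera intrinsics are placeholders, and the follow-up mapping to the virtual whiteboard is omitted.
    ```python
    # Camera pose relative to a known square marker via solvePnP (OpenCV).
    import numpy as np
    import cv2

    MARKER_SIZE = 0.10  # metres; assumed side length of the printed marker
    OBJECT_POINTS = np.array([[0, 0, 0],
                              [MARKER_SIZE, 0, 0],
                              [MARKER_SIZE, MARKER_SIZE, 0],
                              [0, MARKER_SIZE, 0]], dtype=np.float32)

    def marker_pose(image_corners: np.ndarray, camera_matrix: np.ndarray,
                    dist_coeffs: np.ndarray):
        """image_corners: 4x2 pixel coordinates of the marker corners (float32)."""
        ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS, image_corners,
                                      camera_matrix, dist_coeffs)
        if not ok:
            raise RuntimeError("marker pose estimation failed")
        return rvec, tvec   # marker pose in the camera frame (Rodrigues rotation, translation)
    ```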
  • Publication number: 20160357258
    Abstract: An apparatus for providing haptic force feedback to a user interacting with a virtual object in a virtual space is provided. The apparatus includes a force-feedback provision unit providing a sensation of touch arising from an interaction with the virtual object to at least a portion of the user's body. A processor unit performs (i) receiving at least one of first force-feedback control data generated based on vector information on a motion of the at least a portion of the user's body and information on the virtual object, and second force-feedback control data generated based on image information and the information on the virtual object, from an external terminal, or (ii) generating at least one of the first and second force-feedback control data. (A minimal penalty-force sketch follows this entry.)
    Type: Application
    Filed: June 2, 2016
    Publication date: December 8, 2016
    Applicant: Center of Human-Centered Interaction for Coexistence
    Inventors: Ki Won Yeom, Joung Huem Kwon, Ji Yong Lee, Bum Jae You
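    A minimal penalty-model sketch of turning an overlap between the user's body part and the virtual object into a force-feedback vector; the stiffness value and the contact model are assumptions, not the patent's control law.
    ```python
    # Penalty-based force from penetration depth and contact normal.
    import numpy as np

    def penalty_force(penetration_depth: float, contact_normal: np.ndarray,
                      stiffness: float = 400.0) -> np.ndarray:
        """Force (N) pushing the user's body part out of the virtual object."""
        if penetration_depth <= 0.0:
            return np.zeros(3)
        n = contact_normal / np.linalg.norm(contact_normal)
        return stiffness * penetration_depth * n

    print(penalty_force(0.004, np.array([0.0, 0.0, 1.0])))   # ~1.6 N along +z
    ```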
  • Patent number: 9332219
    Abstract: The present invention relates to a telepresence device that is capable of enabling various types of verbal or non-verbal interaction between a remote user and a local user. In accordance with an embodiment, the telepresence device may include a camera combined with direction control means; a projector provided on a top of the camera, and configured such that a direction of the projector is controlled by the direction control means along with a direction of the camera; and a control unit configured to control the direction of the camera by operating the direction control means, to extract an object, at which a remote user gazes, from an image acquired by the camera whose direction has been controlled, to generate a projection image related to the object, and to project the projection image around the object by controlling the projector.
    Type: Grant
    Filed: November 18, 2013
    Date of Patent: May 3, 2016
    Assignee: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Jounghuem Kwon, Bumjae You, Shinyoung Kim, Kwangkyu Lee
  • Patent number: 9280846
    Abstract: A method for performing occlusion queries is disclosed. The method includes steps of: (a) a graphics processing unit (GPU) using a first depth buffer of a first frame to predict a second depth buffer of a second frame; and (b) the GPU performing occlusion queries for the second frame by using the predicted second depth buffer, wherein the first frame is a frame predating the second frame. In accordance with the present invention, a configuration for classifying objects into occluders and occludees is not required, and the occlusion queries for the predicted second frame are acquired in advance at the end of the first frame or the beginning of the second frame. (A depth-reprojection sketch follows this entry.)
    Type: Grant
    Filed: July 2, 2015
    Date of Patent: March 8, 2016
    Assignees: Center Of Human-Centered Interaction For Coexistence, Research & Business Foundation Sungkyunkwan University
    Inventors: Sung Kil Lee, Young Uk Kim
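    A simplified stand-in for the depth-buffer prediction: reproject the first frame's depth buffer into the second frame using the two view-projection matrices, then test an occludee's conservative nearest depth against the prediction. The NDC-depth convention, the nearest-pixel splat, and all names are assumptions, not the patented method.
    ```python
    # Predicting the next frame's depth buffer by reprojecting the current one.
    import numpy as np

    def predict_depth_buffer(depth1, inv_viewproj1, viewproj2, w, h):
        """depth1: (h, w) NDC depths of frame 1; returns an (h, w) prediction for frame 2."""
        predicted = np.full((h, w), np.inf)
        ys, xs = np.mgrid[0:h, 0:w]
        ndc1 = np.stack([2 * xs / w - 1, 2 * ys / h - 1, depth1,
                         np.ones_like(depth1)], axis=-1)
        world = ndc1 @ inv_viewproj1.T          # unproject frame-1 pixels
        world = world / world[..., 3:4]
        clip2 = world @ viewproj2.T             # project into frame 2
        clip2 = clip2 / clip2[..., 3:4]
        px = ((clip2[..., 0] + 1) * 0.5 * w).astype(int).clip(0, w - 1)
        py = ((clip2[..., 1] + 1) * 0.5 * h).astype(int).clip(0, h - 1)
        np.minimum.at(predicted, (py, px), clip2[..., 2])   # keep nearest depth per pixel
        return predicted

    def may_be_visible(occludee_min_depth: float, predicted_patch: np.ndarray) -> bool:
        """Conservative test: the object passes the query if it can be nearer than the
        predicted depth anywhere inside its screen-space footprint."""
        return occludee_min_depth < predicted_patch.max()
    ```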
  • Patent number: 9265974
    Abstract: The present invention relates to an apparatus for creating a tactile sensation through non-invasive brain stimulation by using ultrasonic waves. The apparatus includes: an ultrasonic transducer module for inputting the ultrasonic waves to stimulate a specific part of the brain of a specified user non-invasively through at least one ultrasonic transducer unit; a compensating module for acquiring information on a range of tactile perception areas in the brain of the specified user and compensating properties of ultrasonic waves to be inputted to the specified user through the ultrasonic transducer unit by referring to the acquired information thereon; and an ultrasonic waves generating module for generating ultrasonic waves to be inputted to the specified user through the ultrasonic transducer unit by referring to a compensating value decided by the compensating module.
    Type: Grant
    Filed: May 23, 2014
    Date of Patent: February 23, 2016
    Assignees: Center Of Human-Centered Interaction For Coexistence, Korea Institute of Science and Technology, Catholic University Industry Academic Cooperation Foundation
    Inventors: Bum Jae You, Sung On Lee, Yong An Chung
  • Patent number: 9189008
    Abstract: An apparatus interacting with an external device by using a pedal module is provided. The apparatus includes: a pedal module; a parallel position-measuring sensor for sensing a degree of a parallel motion; a rotary position-measuring sensor for sensing a degree of a rotary motion; and a control part for ordering the external device to be driven by referring to at least either of the degree of the parallel motion sensed by the parallel position-measuring sensor or that of the rotary motion sensed by the rotary position-measuring sensor or for receiving a control signal from the external device and driving a motor group including at least one motor to apply force feedback to the pedal module by referring to the control signal.
    Type: Grant
    Filed: December 17, 2014
    Date of Patent: November 17, 2015
    Assignee: Center Of Human-Centered Interaction For Coexistence
    Inventors: Dae Keun Yoon, Kwang Kyu Lee, Shin Young Kim, Jai Hi Cho, Bum Jae You
  • Publication number: 20150281640
    Abstract: The present invention relates to a telepresence device that is capable of enabling various types of verbal or non-verbal interaction between a remote user and a local user. In accordance with an embodiment, the telepresence device may include a camera combined with direction control means; a projector provided on a top of the camera, and configured such that a direction of the projector is controlled by the direction control means along with a direction of the camera; and a control unit configured to control the direction of the camera by operating the direction control means, to extract an object, at which a remote user gazes, from an image acquired by the camera whose direction has been controlled, to generate a projection image related to the object, and to project the projection image around the object by controlling the projector.
    Type: Application
    Filed: November 18, 2013
    Publication date: October 1, 2015
    Applicant: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Jounghuem Kwon, Bumjae You, Shinyoung Kim, Kwangkyu Lee