Patents Assigned to CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
-
Publication number: 20150251023
Abstract: The present invention relates to an apparatus for creating a tactile sensation through non-invasive brain stimulation by using ultrasonic waves. The apparatus includes: an ultrasonic transducer module for inputting the ultrasonic waves to stimulate a specific part of the brain of a specified user non-invasively through at least one ultrasonic transducer unit; a compensating module for acquiring information on a range of tactile perception areas in the brain of the specified user and compensating properties of ultrasonic waves to be inputted to the specified user through the ultrasonic transducer unit by referring to the acquired information thereon; and an ultrasonic waves generating module for generating ultrasonic waves to be inputted to the specified user through the ultrasonic transducer unit by referring to a compensating value decided by the compensating module.
Type: Application
Filed: May 23, 2014
Publication date: September 10, 2015
Applicants: Center Of Human-Centered Interaction For Coexistence, Korea Institute of Science and Technology, Catholic University Industry Academic Cooperation Foundation
Inventors: Bum Jae You, Sung On Lee, Yong An Chung
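A minimal sketch of the compensation idea, not the patented method (the parameter names and the linear gain model are assumptions): per-user calibration of the tactile-perception region is applied to the base wave parameters before the generating module drives the transducer.

```python
# Illustrative sketch only: apply a user-specific gain and focal offset,
# obtained from a prior calibration, to the base ultrasound parameters.

def compensate(base, calibration):
    """Return drive parameters adjusted by a per-user calibration."""
    return {
        "freq_hz": base["freq_hz"],
        "intensity": base["intensity"] * calibration["gain"],
        "focus_mm": [b + o for b, o in
                     zip(base["focus_mm"], calibration["focus_offset_mm"])],
    }

base = {"freq_hz": 500_000, "intensity": 1.0, "focus_mm": [0, 0, 50]}
cal = {"gain": 0.8, "focus_offset_mm": [1, 0, -2]}
params = compensate(base, cal)
```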
-
Patent number: 9123180
Abstract: A method for displaying a shadow of a 3D virtual object, includes steps of: (a) acquiring information on a viewpoint of a user looking at a 3D virtual object displayed in a specific location in 3D space by a wall display device; (b) determining a location and a shape of a shadow of the 3D virtual object to be displayed by referring to information on the viewpoint of the user and the information on a shape of the 3D virtual object; and (c) allowing the shadow of the 3D virtual object to be displayed by at least one of the wall display device and a floor display device by referring to the determined location and the determined shape of the shadow of the 3D virtual object. Accordingly, the user is allowed to feel the accurate sense of depth or distance regarding the 3D virtual object.
Type: Grant
Filed: April 15, 2014
Date of Patent: September 1, 2015
Assignee: Center Of Human-Centered Interaction For Coexistence
Inventors: Joung Huem Kwon, Romain Destenay, Jai Hi Cho, Bum Jae You
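A viewpoint-dependent shadow of this kind can be sketched with a simple planar projection, assuming a floor plane at y = 0 (this is an illustration of the geometry, not the claimed method): each vertex is projected along the ray from the user's eye through the vertex until it meets the floor.

```python
# Illustrative sketch: project a 3D vertex onto the floor plane (y = 0)
# along the ray from the viewer's eye through the vertex, so the shadow's
# location shifts with the user's viewpoint as the abstract describes.

def floor_shadow(eye, vertex):
    """Return the point where the ray eye->vertex meets the plane y = 0."""
    ex, ey, ez = eye
    vx, vy, vz = vertex
    t = ey / (ey - vy)          # ray parameter where y reaches 0
    return (ex + t * (vx - ex), 0.0, ez + t * (vz - ez))

# A vertex 1 m above the floor, seen from an eye 1.7 m above the floor:
print(floor_shadow((0.0, 1.7, 0.0), (1.0, 1.0, 2.0)))
```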
-
Patent number: 9077974
Abstract: Disclosed herein is a 3D teleconferencing apparatus and method enabling eye contact. The 3D teleconferencing apparatus enabling eye contact according to the present invention includes an image acquisition unit for acquiring depth images and color images by manipulating cameras in real time in consideration of images obtained by capturing a subject that is a teleconference participant and images received over a network and corresponding to a counterpart involved in the teleconference; a full face generation unit for generating a final depth image and a final color image corresponding to a full face of the participant for eye contact using the depth images and the color images; and a 3D image generation unit for generating a 3D image corresponding to the counterpart and displaying the 3D image on a display device.
Type: Grant
Filed: July 6, 2012
Date of Patent: July 7, 2015
Assignee: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
Inventors: Bum-Jae You, Eun-Kyung Lee, Ji-Yong Lee, Jai-Hi Cho, Shin-Young Kim
-
Patent number: 9069354
Abstract: The present invention provides a method for planning a path for an autonomous walking humanoid robot that takes an autonomous walking step using environment map information, the method comprising: an initialization step of initializing path input information of the autonomous walking humanoid robot using origin information, destination information, and the environment map information; an input information conversion step of forming a virtual robot including information on the virtual robot obtained by considering the radius and the radius of gyration of the autonomous walking humanoid robot based on the initialized path input information; a path generation step of generating a path of the virtual robot using the virtual robot information, the origin information S, the destination information G, and the environment map information; and an output information conversion step of converting the path of the autonomous walking humanoid robot based on the virtual robot path generated in the path generation step.
Type: Grant
Filed: December 31, 2012
Date of Patent: June 30, 2015
Assignee: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
Inventors: Soo Hyun Ryu, Nak Ju Doh, Yeon Sik Kang, Bum Jae You
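The "virtual robot" conversion resembles classic configuration-space inflation, and can be sketched as follows (the grid, radius, and search are illustrative assumptions, not the patented algorithm): obstacle cells are grown by the robot's radius so the robot can be planned as a point, and a search over the inflated map yields the path from origin S to destination G.

```python
from collections import deque

def inflate(grid, r):
    """Mark every cell within Chebyshev distance r of an obstacle cell."""
    n, m = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for i in range(n):
        for j in range(m):
            if grid[i][j]:
                for di in range(-r, r + 1):
                    for dj in range(-r, r + 1):
                        if 0 <= i + di < n and 0 <= j + dj < m:
                            out[i + di][j + dj] = 1
    return out

def bfs_path(grid, s, g):
    """Shortest 4-connected path over free cells, or None if unreachable."""
    n, m = len(grid), len(grid[0])
    prev = {s: None}
    queue = deque([s])
    while queue:
        i, j = queue.popleft()
        if (i, j) == g:                     # reconstruct path back to s
            path, cur = [], g
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= ni < n and 0 <= nj < m and not grid[ni][nj] \
                    and (ni, nj) not in prev:
                prev[(ni, nj)] = (i, j)
                queue.append((ni, nj))
    return None

# A 5x5 map with one obstacle cell; a robot of radius 1 must detour wider.
grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1
path = bfs_path(inflate(grid, 1), (0, 0), (4, 4))
```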
-
Patent number: 9030307
Abstract: An apparatus for generating haptic feedback, includes: multiple haptic units placed on a first portion of a body; and a control unit placed on a second portion, near the first portion, of the body, wherein the control unit includes: a first module for acquiring information on relative position (i) among the respective multiple haptic units and (ii) between the respective haptic units and the control unit, a second module for acquiring information on absolute position of the control unit by measuring a position of the control unit in reference to an external reference point, and a haptic command module for creating a command signal by referring to at least one piece of the information on relative position acquired by the first module and the information on absolute position acquired by the second module and delivering the created command signal to a corresponding haptic unit among all the multiple haptic units.
Type: Grant
Filed: December 17, 2014
Date of Patent: May 12, 2015
Assignee: Center Of Human-Centered Interaction For Coexistence
Inventors: Kwang Kyu Lee, Shin Young Kim, Dae Keun Yoon, Jai Hi Cho, Bum Jae You
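Combining the two kinds of position information can be sketched as below (function names and the nearest-unit selection rule are assumptions for illustration): each haptic unit's absolute position is the control unit's absolute position plus that unit's relative offset, and the command is routed to the unit nearest the desired stimulus point.

```python
# Illustrative sketch only: fuse the control unit's absolute position
# (measured against an external reference point) with each haptic unit's
# relative offset, then pick the unit that should receive the command.

def unit_positions(control_abs, offsets):
    """Absolute position of each haptic unit = control position + offset."""
    cx, cy, cz = control_abs
    return [(cx + ox, cy + oy, cz + oz) for ox, oy, oz in offsets]

def nearest_unit(control_abs, offsets, target):
    """Index of the haptic unit closest to the desired stimulus point."""
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    positions = unit_positions(control_abs, offsets)
    return min(range(len(positions)), key=lambda i: d2(positions[i], target))
```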
-
Patent number: 9007422
Abstract: A method enables a first and a second device to support interactions with respect to a 3D object. The method includes steps of: (a) allowing the first device to acquire information on a physical 3D object and information on images of a user; (b) allowing the second device to receive the information relating to the physical 3D object and the information on images of the user of the first device, then display a virtual 3D object corresponding to the physical 3D object and display a 3D avatar of the user of the first device; (c) allowing the first device to transmit information on manipulation of the physical 3D object by the user of the first device and information on images of the user of the first device who is manipulating the physical 3D object, and then allowing the second device to display the 3D avatar of the user of the first device.
Type: Grant
Filed: December 22, 2014
Date of Patent: April 14, 2015
Assignee: Center of Human-Centered Interaction for Coexistence
Inventors: Joung Heum Kwon, Ji Yong Lee, Bum Jae You
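One possible shape for the first device's transmission in step (c) is sketched below (the field names are assumptions, not from the patent): the message carries the object state after manipulation plus images of the manipulating user, so the second device can update both the virtual 3D object and the 3D avatar.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a device-to-device update message.

@dataclass
class InteractionUpdate:
    object_id: str                 # identifies the physical 3D object
    object_pose: tuple             # object position/orientation after manipulation
    user_image_frames: list = field(default_factory=list)  # captured user images
```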
-
Patent number: 8938131
Abstract: The present invention relates to an auxiliary registration apparatus for registering a display device and an image sensor. The apparatus includes a camera; a panel interoperated with the camera and on which a first pattern is displayed; and a control part which allows the first pattern to be shot with the image sensor and a second pattern displayed on a screen of the display device to be shot with the camera; wherein the control part allows information on a transformation relationship between a coordinate system of the display device and that of the image sensor to be acquired by referring to information on a transformation relationship between a coordinate system of the panel and that of the image sensor and information on a transformation relationship between a coordinate system of the camera and that of the display device.
Type: Grant
Filed: June 2, 2014
Date of Patent: January 20, 2015
Assignee: Center of Human-Centered Interaction for Coexistence
Inventors: Jun Sik Kim, Jung Min Park
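The transformation chaining in the abstract can be sketched with homogeneous matrices, assuming the panel is rigidly coupled to the camera so the panel-to-camera transform is known (the matrix names are illustrative assumptions): the display-to-sensor relationship follows by composing the sensor-to-panel, panel-to-camera, and display-to-camera transforms.

```python
import numpy as np

def translation(t):
    """4x4 homogeneous transform for a pure translation t."""
    M = np.eye(4)
    M[:3, 3] = t
    return M

def display_from_sensor(T_panel_from_sensor,
                        T_camera_from_panel,
                        T_camera_from_display):
    """T_display<-sensor = inv(T_camera<-display) @ T_camera<-panel @ T_panel<-sensor."""
    return (np.linalg.inv(T_camera_from_display)
            @ T_camera_from_panel
            @ T_panel_from_sensor)

# Toy example with pure translations:
M = display_from_sensor(translation([1, 0, 0]),
                        translation([0, 2, 0]),
                        translation([0, 0, 3]))
```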
-
Publication number: 20140375639
Abstract: A method for displaying a shadow of a 3D virtual object, includes steps of: (a) acquiring information on a viewpoint of a user looking at a 3D virtual object displayed in a specific location in 3D space by a wall display device; (b) determining a location and a shape of a shadow of the 3D virtual object to be displayed by referring to information on the viewpoint of the user and the information on a shape of the 3D virtual object; and (c) allowing the shadow of the 3D virtual object to be displayed by at least one of the wall display device and a floor display device by referring to the determined location and the determined shape of the shadow of the 3D virtual object. Accordingly, the user is allowed to feel the accurate sense of depth or distance regarding the 3D virtual object.
Type: Application
Filed: April 15, 2014
Publication date: December 25, 2014
Applicant: Center Of Human-Centered Interaction For Coexistence
Inventors: Joung Huem Kwon, Romain Destenay, Jai Hi Cho, Bum Jae You
-
Patent number: 8884746
Abstract: An apparatus for generating tactile sensation by using a magnetic field, includes: a first magnet and a second magnet placed on opposite sides of a target that is the subject to be provided with tactile sensation; and a first electromagnet placed between the target and the first magnet; wherein the strength of the whole magnetic field between the first and the second magnets can be adjusted by adjusting at least one of the direction and the strength of the magnetic field arising from the first electromagnet, to control the strength of the force with which the target is pressed by the first and the second magnets. Because a magnet itself generates a magnetic field and can act as an actuator producing tactile sensation without any complicated mechanical components, the apparatus for generating tactile sensation may become simple, light and compact.
Type: Grant
Filed: April 15, 2014
Date of Patent: November 11, 2014
Assignee: Center Of Human-Centered Interaction For Coexistence
Inventors: Jai Hi Cho, Joung Huem Kwon, Shin Young Kim, Bum Jae You
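A deliberately simplified toy model of the control idea, not the patented physics (the linear relation and names are assumptions): the pressing force is treated as proportional to the combined field of the permanent magnets and the electromagnet between them, so scaling or reversing the coil current strengthens or weakens how hard the target is pressed.

```python
# Toy model only: force ~ k * (permanent-magnet field + coil contribution).

def net_force(b_permanent, coil_gain, coil_current, k=1.0):
    """Pressing force under a linear superposition of field contributions."""
    return k * (b_permanent + coil_gain * coil_current)

# Positive current strengthens the net field; negative current weakens it.
stronger = net_force(0.5, 0.1, 2.0)
weaker = net_force(0.5, 0.1, -2.0)
```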
-
Publication number: 20140218467
Abstract: Disclosed herein is a 3D teleconferencing apparatus and method enabling eye contact. The 3D teleconferencing apparatus enabling eye contact according to the present invention includes an image acquisition unit for acquiring depth images and color images by manipulating cameras in real time in consideration of images obtained by capturing a subject that is a teleconference participant and images received over a network and corresponding to a counterpart involved in the teleconference; a full face generation unit for generating a final depth image and a final color image corresponding to a full face of the participant for eye contact using the depth images and the color images; and a 3D image generation unit for generating a 3D image corresponding to the counterpart and displaying the 3D image on a display device.
Type: Application
Filed: July 6, 2012
Publication date: August 7, 2014
Applicant: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
Inventors: Bum-Jae You, Eun-Kyung Lee, Ji-Yong Lee, Jai-Hi Cho, Shin-Young Kim
-
Publication number: 20140025201
Abstract: The present invention provides a method for planning a path for an autonomous walking humanoid robot that takes an autonomous walking step using environment map information, the method comprising: an initialization step of initializing path input information of the autonomous walking humanoid robot using origin information, destination information, and the environment map information; an input information conversion step of forming a virtual robot including information on the virtual robot obtained by considering the radius and the radius of gyration of the autonomous walking humanoid robot based on the initialized path input information; a path generation step of generating a path of the virtual robot using the virtual robot information, the origin information S, the destination information G, and the environment map information; and an output information conversion step of converting the path of the autonomous walking humanoid robot based on the virtual robot path generated in the path generation step.
Type: Application
Filed: December 31, 2012
Publication date: January 23, 2014
Applicants: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE, KOREA UNIVERSITY RESEARCH AND BUSINESS FOUNDATION
Inventors: Soo Hyun Ryu, Nak Ju Doh, Yeon Sik Kang, Bum Jae You