Patents by Inventor Bum Jae You

Bum Jae You has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10524701
    Abstract: A motion capture system includes a motion sensor having a flexible body and a fiber Bragg grating (FBG) sensor inserted into the body, a fixture configured to fix the motion sensor to a human body of a user, a light source configured to irradiate light to the motion sensor, and a measurer configured to analyze a reflected light output from the motion sensor, wherein the FBG sensor includes an optical fiber extending along a longitudinal direction of the body and a sensing unit formed in a partial region of the optical fiber and having a plurality of gratings, and wherein a change of a wavelength spectrum of the reflected light, caused by the change of an interval of the gratings due to a motion of the user, is detected to measure a motion state of the user.
    Type: Grant
    Filed: June 2, 2017
    Date of Patent: January 7, 2020
    Assignees: Korea Institute of Science and Technology, Center of Human-Centered Interaction for Coexistence
    Inventors: Jinseok Kim, Hyun Joon Shin, Bum-Jae You, Sungwook Yang, Minsu Jang, Jun Sik Kim
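
The FBG principle in the abstract above lends itself to a short worked example: the reflected Bragg peak shifts in proportion to the strain induced by the user's motion, and a calibration factor turns that strain into a joint angle. The sketch below is only illustrative; the photo-elastic coefficient, base wavelength, and strain-to-angle factor are assumed values, not figures from the patent.

```python
# A minimal sketch, assuming typical silica-fiber constants: the reflected
# Bragg peak shifts with strain, and a per-sensor calibration factor maps
# that strain to a joint angle. None of the numbers below come from the patent.

PHOTOELASTIC_COEFF = 0.22      # typical effective photo-elastic constant (assumed)
BASE_WAVELENGTH_NM = 1550.0    # unstrained Bragg wavelength (assumed)

def strain_from_shift(measured_wavelength_nm: float) -> float:
    """Axial strain inferred from the shift of the reflected Bragg peak."""
    shift = measured_wavelength_nm - BASE_WAVELENGTH_NM
    return (shift / BASE_WAVELENGTH_NM) / (1.0 - PHOTOELASTIC_COEFF)

def joint_angle_deg(measured_wavelength_nm: float,
                    deg_per_unit_strain: float = 3.0e4) -> float:
    """Map strain to a bend angle with a hypothetical calibration factor."""
    return strain_from_shift(measured_wavelength_nm) * deg_per_unit_strain

# A 1.2 nm red-shift of the reflected peak reads as roughly 30 degrees of flexion.
print(round(joint_angle_deg(1551.2), 1))
```
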
  • Publication number: 20190332172
    Abstract: Provided is a force conveyance system that is configured to have 6 degrees of freedom, thereby allowing freedom of movement such as opening/closing of a hand and adduction/abduction of a finger and reflecting a desired force to a fingertip without obstructing the movement of a finger. Also, the force conveyance system may estimate a fingertip position and a finger joint angle, measure a finger movement, and convey a more accurate force accordingly.
    Type: Application
    Filed: June 13, 2017
    Publication date: October 31, 2019
    Inventors: Joon Bum BAE, Jeong Soo LEE, In Seong JO, Yeon Gyu PARK, Bum Jae YOU
  • Patent number: 10434322
    Abstract: A bio-stimulation robot includes a stationary platform, a plurality of drive modules coupled to the stationary platform, and a motion platform coupled to the drive modules, which operate to change a position of the motion platform. Each of the drive modules includes a first guide member having an arc shape, a motion member coupled to the first guide member, and a leg member having a first end coupled to the motion member and a second end fixed to the motion platform. The motion member slides along the first guide member. The second end of the leg member is rotatably connected to the motion platform.
    Type: Grant
    Filed: May 17, 2016
    Date of Patent: October 8, 2019
    Assignees: Center of Human-Centered Interaction for Coexistence, Industry University-Cooperation Foundation Hanyang University Erica Campus
    Inventors: Sungon Lee, Jun-Woo Kim, Woo-Seok Ryu, Sung-Teak Cho, Hyung-Min Kim, Bum-Jae You
  • Publication number: 20190206119
    Abstract: A mixed reality display device according to one embodiment of the present invention comprises: a virtual environment rendering unit for generating a virtual object by using information on a scene in a virtual reality, and then generating a color map and a depth map for the virtual object; a depth rendering unit for generating a depth map for a real object by using information on a real environment; an occlusion processing unit for performing occlusion processing by using the color map and the depth map, for the virtual object, received from the virtual environment rendering unit, the depth map, for a real object, received from the depth rendering unit, and a color map, for the real object, received from a see-through camera; and a display unit for outputting a color image by using a color map for the virtual object and a color map for the real object, which are received from the occlusion processing unit.
    Type: Application
    Filed: June 12, 2017
    Publication date: July 4, 2019
    Applicant: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Sang Hun Nam, Joung Huem Kwon, Younguk Kim, Bum Jae You
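
The occlusion step in this publication reduces, per pixel, to comparing the real-object depth map against the virtual-object depth map and keeping whichever color lies closer to the camera. The sketch below illustrates that comparison under assumed array shapes and function names; it is not the claimed implementation.

```python
# Minimal sketch of the per-pixel occlusion step: wherever the real object
# is closer to the camera than the virtual object, the see-through camera's
# color is kept; otherwise the rendered virtual color is shown.

import numpy as np

def composite(virtual_color, virtual_depth, real_color, real_depth):
    """virtual_color/real_color: (H, W, 3) uint8; *_depth: (H, W) float meters."""
    real_in_front = real_depth < virtual_depth          # boolean occlusion mask
    mask = real_in_front[..., None]                     # broadcast over RGB
    return np.where(mask, real_color, virtual_color)

# Example: a 2x2 frame in which the real object occludes one pixel.
vc = np.full((2, 2, 3), 255, np.uint8)   # virtual object rendered white
rc = np.zeros((2, 2, 3), np.uint8)       # camera image is black
vd = np.full((2, 2), 1.0)                # virtual object 1 m away
rd = np.array([[0.5, 2.0], [2.0, 2.0]])  # real surface closer at one pixel
print(composite(vc, vd, rc, rd)[0, 0])   # -> [0 0 0], the real pixel wins
```
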
  • Patent number: 10331217
    Abstract: Disclosed is an actuator including a support member, an actuating unit rotatably installed in the support member and having a first electrode installed on one side and a stimulation providing unit installed on the other side to provide stimulation by rotation, and an attraction force providing unit having a second electrode to provide an attraction force to the first electrode, wherein when an electrostatic attraction force is provided to the first electrode through the second electrode, the actuating unit pivots to enable the stimulation providing unit to apply stimulation to a sensing unit.
    Type: Grant
    Filed: March 13, 2018
    Date of Patent: June 25, 2019
    Assignees: Korea Institute of Science and Technology, Center of Human-Centered Interaction for Coexistence
    Inventors: Youngsu Cha, Kahye Song, Jung Min Park, Bum-Jae You
  • Publication number: 20190171290
    Abstract: Provided is a tactile transmission device, which includes a base unit forming one surface of the tactile transmission device, a tip-tilt elastic member stacked on the base unit and configured to transmit a tactile feel to a finger of a user in a first direction oriented upward from a bottom surface of the finger and a second direction intersecting the first direction at a predetermined angle, and a cover disposed at an upper side of the tip-tilt elastic member to form another surface of the tactile transmission device.
    Type: Application
    Filed: November 26, 2018
    Publication date: June 6, 2019
    Applicants: KOREA INSTITUTE OF SCIENCE AND TECHNOLOGY, CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Donghyun HWANG, Keehoon Kim, Byeongkyu Lim, Bum-Jae You
  • Publication number: 20190121434
    Abstract: Disclosed is an actuator including a support member, an actuating unit rotatably installed in the support member and having a first electrode installed on one side and a stimulation providing unit installed on the other side to provide stimulation by rotation, and an attraction force providing unit having a second electrode to provide an attraction force to the first electrode, wherein when an electrostatic attraction force is provided to the first electrode through the second electrode, the actuating unit pivots to enable the stimulation providing unit to apply stimulation to a sensing unit.
    Type: Application
    Filed: March 13, 2018
    Publication date: April 25, 2019
    Applicants: KOREA INSTITUTE OF SCIENCE AND TECHNOLOGY, CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Youngsu Cha, Kahye Song, Jung Min Park, Bum-Jae You
  • Publication number: 20190124153
    Abstract: The present invention relates to a data processing device, method, and computer program for data sharing among multiple users. The device includes a sensor module collecting data by using at least one of a camera sensor, a distance sensor, a microphone array, a motion capture sensor, an environment scanner, and a haptic device; a memory module storing and controlling the data collected by the sensor module; and a network module transmitting the data stored in the memory module to a remote location, or receiving predetermined data from the remote location, wherein the memory module stores the stored data and the predetermined data as data in a standardized format according to data features.
    Type: Application
    Filed: April 24, 2017
    Publication date: April 25, 2019
    Applicant: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Sang Hun Nam, Joung Huem Kwon, Eunseok Choi, Bum Jae You
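
The "standardized format according to data features" idea can be pictured as one record schema shared by all sensor modalities before the network module ships the data to a remote location. The sketch below assumes illustrative field names and a JSON wire encoding, neither of which comes from the publication.

```python
# Minimal sketch of storing heterogeneous sensor data in one standardized
# record format before transmission. Field names and the JSON encoding are
# assumptions for illustration only.

import json
import time
from dataclasses import dataclass, asdict
from typing import Any

@dataclass
class SensorRecord:
    sensor_type: str        # "camera", "distance", "microphone_array", ...
    timestamp: float        # acquisition time (seconds since epoch)
    frame_id: int
    payload: Any            # sensor-specific data, already in a portable form

def to_wire(record: SensorRecord) -> bytes:
    """Encode a record so the network module can send it to a remote peer."""
    return json.dumps(asdict(record)).encode("utf-8")

record = SensorRecord("distance", time.time(), 42, {"range_m": [1.2, 1.3, 1.25]})
print(to_wire(record)[:60])
```
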
  • Patent number: 10251556
    Abstract: Provided is an in vivo bioimaging method including irradiating near-infrared (NIR) light onto a living body, converting the NIR light passed through the living body, into visible light using upconversion nanoparticles (UCNPs), and generating a bioimage of the living body by receiving the visible light using a complementary metal-oxide-semiconductor (CMOS) image sensor.
    Type: Grant
    Filed: July 31, 2017
    Date of Patent: April 9, 2019
    Assignees: Korea Institute of Science and Technology, Center of Human-Centered Interaction for Coexistence
    Inventors: Hwa Sup Lim, Seok Joon Kwon, Sang Chul Ahn, Bum Jae You
  • Publication number: 20190099782
    Abstract: Disclosed is an actuator generating haptic sensations, the actuator having a spherical rotor driven by a magnetic force vector created around the same, a stator having a space corresponding in shape to the spherical rotor defined therein to allow the spherical rotor to be positioned in the space and having a portion of an upper part of the spherical rotor exposed, at least three rotation-driving coils formed in the stator at a given distance from each other to provide the magnetic force vector to the spherical rotor, and a driving unit independently controlling electric current supplied to each of the rotation-driving coils to create the magnetic force vector.
    Type: Application
    Filed: March 23, 2017
    Publication date: April 4, 2019
    Applicant: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Jai Hi Cho, Shin Young Kim, Dae Keun Yoon, Joong Jae Lee, Bum Jae You
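
One way to read the "driving unit independently controlling electric current supplied to each of the rotation-driving coils" is as an inverse problem: choose per-coil currents whose superposed contributions approximate a desired magnetic force vector. The least-squares sketch below, including the coil directions, is an assumption for illustration, not the patented control law.

```python
# Minimal sketch: with three or more rotation-driving coils whose field
# directions at the rotor are known, solve a least-squares problem for the
# per-coil currents that best reproduce a desired force vector.

import numpy as np

coil_axes = np.array([        # assumed unit direction contributed by each coil
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [0.577, 0.577, 0.577],
]).T                          # shape (3, n_coils)

def coil_currents(desired_vector):
    """Currents (arbitrary units) whose combined effect approximates the vector."""
    currents, *_ = np.linalg.lstsq(coil_axes, np.asarray(desired_vector), rcond=None)
    return currents

print(coil_currents([0.2, -0.1, 0.4]))
```
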
  • Publication number: 20190087011
    Abstract: A virtual model control system and method for controlling a virtual model formed in virtual space are provided. The virtual model control system and method according to an embodiment of the present disclosure increases the accuracy of implementation by independently controlling two virtual objects combined with each other, and in the event of movement, performs location correction of the object so that their combination is maintained, thereby achieving more accurate control of the two virtual objects combined with each other.
    Type: Application
    Filed: September 12, 2018
    Publication date: March 21, 2019
    Applicant: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Junsik KIM, Jung Min PARK, Bum-Jae YOU
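
The "location correction of the object so that their combination is maintained" can be illustrated by re-imposing a fixed offset between two independently moved objects after each update. The data layout and correction rule in the sketch below are assumptions, not the disclosed method.

```python
# Minimal sketch: two combined virtual objects are moved independently, then
# the second one is re-positioned so the fixed offset that represents their
# combination is restored.

import numpy as np

class VirtualObject:
    def __init__(self, position):
        self.position = np.asarray(position, dtype=float)

def correct_combination(parent: VirtualObject, child: VirtualObject,
                        offset: np.ndarray) -> None:
    """Re-impose the rigid offset between two combined objects after motion."""
    child.position = parent.position + offset

cup = VirtualObject([0.0, 0.0, 0.0])
lid = VirtualObject([0.0, 0.0, 0.05])
lid_offset = lid.position - cup.position   # captured when they are combined

cup.position += np.array([0.1, 0.0, 0.0])  # user moves the first object
lid.position += np.array([0.12, 0.0, 0.0]) # second object drifts slightly off
correct_combination(cup, lid, lid_offset)  # correction restores the combination
print(lid.position)                        # -> [0.1  0.   0.05]
```
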
  • Publication number: 20190080517
    Abstract: A three-dimensional information augmented video see-through display device according to an exemplary embodiment of the present disclosure includes a camera interface module which obtains at least two real images from at least two camera modules, a rectification module which performs rectification on the at least two real images, a lens distortion correction module which corrects at least two composite images obtained by combining a virtual image with the at least two real images, based on a lens distortion compensation value indicating a value for compensating for a distortion of a wide angle lens for the at least two real images, and an image generation module which performs side-by-side image processing on the at least two composite images to generate a three-dimensional image for virtual reality (VR) or augmented reality (AR).
    Type: Application
    Filed: April 12, 2017
    Publication date: March 14, 2019
    Inventors: Bum-Jae YOU, Juseong LEE
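
The final "side-by-side image processing" stage of this display pipeline is straightforward to picture: two per-eye composite images are packed into one stereo frame for the display. The sketch below shows only that stage, with assumed resolutions; rectification and lens-distortion correction are left out.

```python
# Minimal sketch of the output stage only: two already-corrected composite
# images, one per eye, are packed side by side into a single stereo frame.

import numpy as np

def side_by_side(left_eye: np.ndarray, right_eye: np.ndarray) -> np.ndarray:
    """left_eye/right_eye: (H, W, 3) composite (virtual + real) images."""
    if left_eye.shape != right_eye.shape:
        raise ValueError("per-eye images must share one resolution")
    return np.hstack([left_eye, right_eye])   # (H, 2W, 3) stereo frame

left = np.zeros((720, 640, 3), np.uint8)
right = np.full((720, 640, 3), 255, np.uint8)
print(side_by_side(left, right).shape)        # -> (720, 1280, 3)
```
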
  • Publication number: 20180351476
    Abstract: Disclosed is a flexible printed circuit board (FPCB) actuator including an FPCB core having a first surface and a second surface, wherein the first surface and the second surface are parallel to each other, a first electrode installed on the first surface and having first parts, wherein the first parts are spaced apart from each other in a first direction at least in part, and a second electrode installed covering at least a portion of the second surface, wherein as control voltage is applied to the first and second electrodes, an electrostatic force generated between the first electrode and the second electrode in a second direction perpendicular to the first direction allows the FPCB core to make a bending motion.
    Type: Application
    Filed: December 20, 2017
    Publication date: December 6, 2018
    Applicants: KOREA INSTITUTE OF SCIENCE AND TECHNOLOGY, CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Youngsu CHA, Junseok Lee, Jung Min Park, Bum-Jae You
  • Patent number: 10129076
    Abstract: Provided is a parallel fieldbus network-based motor control system including one or more slave modules each including a basic processor and an auxiliary processor that control one or more motors, and a master module including at least one master controller that generates command data for controlling each of the one or more motors. The master module further includes a basic network master controller, an auxiliary network master controller and a wireless network master controller, and the slave module includes a basic network slave controller, an auxiliary network slave controller and a wireless network module.
    Type: Grant
    Filed: August 10, 2016
    Date of Patent: November 13, 2018
    Assignee: Center of Human-centered Interaction for Coexistence
    Inventors: Bum Jae You, Dae Keun Yoon, Shin Young Kim, Jai Hi Cho
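
The master-side data path in this system can be pictured as framing command data for each motor and pushing it over the basic network, with the auxiliary and wireless channels available as fallbacks. The frame layout and fallback order in the sketch below are assumptions for illustration, not the claimed protocol.

```python
# Minimal sketch of a master controller framing per-motor command data and
# trying redundant network channels in priority order.

import struct

def frame_command(motor_id: int, position: float, velocity: float) -> bytes:
    """Pack command data for one motor into a fixed-size frame."""
    return struct.pack("<Bff", motor_id, position, velocity)

def failing_send(frame: bytes) -> None:
    raise IOError("basic network down (simulated fault)")

def ok_send(frame: bytes) -> None:
    pass  # frame accepted by this channel

def send_with_fallback(frame: bytes, channels) -> str:
    """Try each network channel in priority order; return the one that worked."""
    for name, send in channels:
        try:
            send(frame)
            return name
        except IOError:
            continue
    raise RuntimeError("all network channels failed")

channels = [("basic", failing_send), ("auxiliary", ok_send), ("wireless", ok_send)]
print(send_with_fallback(frame_command(3, 1.57, 0.5), channels))  # -> auxiliary
```
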
  • Patent number: 10038759
    Abstract: In accordance with an aspect, there is provided a method for supporting provision of service so that a client terminal is provided with desired service by adaptively modifying a network topology depending on service properties, including (a) when service type information indicating a type of desired service is acquired from the client terminal, and status information indicating status of one or more service provision servers is acquired, acquiring, by a management server, network configuration information as information corresponding to the service type information and the status information with reference to a DB, wherein the network configuration information is required by the client terminal to be provided with the service from a specific service provision server; and (b) transmitting, by the management server, acquired network configuration information to the client terminal, thus supporting network configuration such that the client terminal configures a network based on the network configuration information.
    Type: Grant
    Filed: July 14, 2016
    Date of Patent: July 31, 2018
    Assignee: CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Joong Jae Lee, Eun Mi Lee, Sang Hun Nam, Bum Jae You
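
Step (a) of the claimed method is essentially a keyed lookup: the management server maps the service type and server status to the network configuration the client terminal should apply. The key format and configuration fields in the sketch below are assumed for illustration, not taken from the patent.

```python
# Minimal sketch of the management server's DB lookup: (service type,
# server status) -> network configuration returned to the client terminal.

CONFIG_DB = {
    ("video_streaming", "normal"):     {"server": "10.0.0.5", "transport": "udp"},
    ("video_streaming", "overloaded"): {"server": "10.0.0.7", "transport": "udp"},
    ("file_sync",       "normal"):     {"server": "10.0.0.9", "transport": "tcp"},
}

def network_configuration(service_type: str, server_status: str) -> dict:
    """Return the configuration the client terminal needs to reach a server."""
    try:
        return CONFIG_DB[(service_type, server_status)]
    except KeyError:
        raise LookupError(f"no configuration for {service_type!r}/{server_status!r}")

print(network_configuration("video_streaming", "overloaded"))
```
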
  • Patent number: 10032313
    Abstract: A head-mounted device (HMD) for enabling a 3D drawing interaction in a mixed-reality space is provided. The HMD includes a frame section, a rendering unit providing a specified image, a camera unit attached to the frame section to pick up an image for rendering, and a control unit configured to, when the camera unit picks up an image of a specified marker, perform a calibration process based on position information of the image of the marker displayed on a screen of the HMD and to, when there is a motion of an input device for interaction with a virtual whiteboard, obtain position information of an image of the input device displayed on a virtual camera screen based on position information of the whiteboard.
    Type: Grant
    Filed: June 2, 2016
    Date of Patent: July 24, 2018
    Assignee: Center of Human-Centered Interaction for Coexistence
    Inventors: Ki Won Yeom, Joung Huem Kwon, Ji Yong Lee, Bum Jae You
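
Once the marker-based calibration fixes the whiteboard's pose, mapping the input device's tracked position to drawing coordinates amounts to projecting a 3D point into the whiteboard plane. The sketch below illustrates that projection under assumed coordinate conventions; it is not the patented calibration procedure.

```python
# Minimal sketch: project a tracked pen-tip position onto a virtual
# whiteboard plane defined by an origin and two in-plane unit axes.

import numpy as np

def to_whiteboard_coords(point_3d, board_origin, board_x_axis, board_y_axis):
    """Return (u, v) drawing coordinates of a 3D point on the whiteboard."""
    rel = np.asarray(point_3d, float) - np.asarray(board_origin, float)
    u = float(np.dot(rel, board_x_axis))   # horizontal position on the board
    v = float(np.dot(rel, board_y_axis))   # vertical position on the board
    return u, v

origin = [0.0, 1.0, 2.0]                  # board corner in world coordinates
x_axis, y_axis = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]   # unit vectors in the plane
print(to_whiteboard_coords([0.3, 1.4, 2.0], origin, x_axis, y_axis))  # ~(0.3, 0.4)
```
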
  • Patent number: 10019064
    Abstract: Provided is a hand-held type user interface device for providing force feedback to a user according to an interaction with a virtual object or a user at a remote place, which remarkably increases the sense of reality in the interaction with the virtual object without limiting an action of the user by sensing the three-dimensional (3D) position and direction of the device and diversifying a force feedback provided to the user according to an action of the user holding the device.
    Type: Grant
    Filed: August 10, 2016
    Date of Patent: July 10, 2018
    Assignee: Center of Human-centered Interaction for Coexistence
    Inventors: Kwang Kyu Lee, Shin Young Kim, Dae Keun Yoon, Bum Jae You
  • Patent number: 9927869
    Abstract: Disclosed is an apparatus for outputting a virtual keyboard, the apparatus including: a virtual keyboard image output unit determining coordinates of a virtual keyboard image by using hand information of a user and outputting the virtual keyboard image; a contact recognition unit determining a contact state by using collision information between a virtual physical collider associated with an end point of a user's finger and a virtual physical collider associated with each virtual key of the virtual keyboard image; a keyboard input unit providing multiple input values for a single virtual key; and a feedback output unit outputting respective feedback for the multiple input values. Accordingly, input convenience and efficiency may be provided by outputting the virtual keyboard in a three dimensional virtual space and reproducing an input method using a keyboard form that is similar to the real world.
    Type: Grant
    Filed: June 12, 2017
    Date of Patent: March 27, 2018
    Assignee: Center Of Human-Centered Interaction for Coexistence
    Inventors: Joung Huem Kwon, Yongho Lee, Bum Jae You
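
The contact test in this patent can be pictured as two overlapping colliders, and the "multiple input values for a single virtual key" as, for example, a press-duration rule. Both the spherical colliders and the duration threshold in the sketch below are assumptions for illustration, not the claimed mechanism.

```python
# Minimal sketch: a key is "touched" when the fingertip collider overlaps the
# key collider, and one virtual key yields different input values depending
# on how long it is pressed.

import numpy as np

def colliders_overlap(finger_center, finger_radius, key_center, key_radius) -> bool:
    """True when the two spherical colliders intersect."""
    distance = np.linalg.norm(np.asarray(finger_center) - np.asarray(key_center))
    return distance <= finger_radius + key_radius

def key_value(base_char: str, press_duration_s: float) -> str:
    """Multiple input values for one virtual key, selected by press duration."""
    return base_char.upper() if press_duration_s > 0.5 else base_char

if colliders_overlap([0.01, 0.0, 0.0], 0.008, [0.0, 0.0, 0.0], 0.01):
    print(key_value("a", 0.7))   # -> "A"
```
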
  • Publication number: 20180055367
    Abstract: Provided is an in vivo bioimaging method including irradiating near-infrared (NIR) light onto a living body, converting the NIR light passed through the living body, into visible light using upconversion nanoparticles (UCNPs), and generating a bioimage of the living body by receiving the visible light using a complementary metal-oxide-semiconductor (CMOS) image sensor.
    Type: Application
    Filed: July 31, 2017
    Publication date: March 1, 2018
    Applicants: KOREA INSTITUTE OF SCIENCE AND TECHNOLOGY, CENTER OF HUMAN-CENTERED INTERACTION FOR COEXISTENCE
    Inventors: Hwa Sup LIM, Seok Joon KWON, Sang Chul AHN, Bum Jae YOU
  • Publication number: 20170371405
    Abstract: Disclosed is an apparatus for outputting a virtual keyboard, the apparatus including: a virtual keyboard image output unit determining coordinates of a virtual keyboard image by using hand information of a user and outputting the virtual keyboard image; a contact recognition unit determining a contact state by using collision information between a virtual physical collider associated with an end point of a user's finger and a virtual physical collider associated with each virtual key of the virtual keyboard image; a keyboard input unit providing multiple input values for a single virtual key; and a feedback output unit outputting respective feedback for the multiple input values. Accordingly, input convenience and efficiency may be provided by outputting the virtual keyboard in a three dimensional virtual space and reproducing an input method using a keyboard form that is similar to the real world.
    Type: Application
    Filed: June 12, 2017
    Publication date: December 28, 2017
    Applicant: Center of Human-Centered Interaction For Coexistence
    Inventors: Joung Huem Kwon, Yongho Lee, Bum Jae You