Patents by Inventor Wookho Son
Wookho Son has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12148096
Abstract: Provided are a method and device for outputting a large-capacity 3D model for an augmented reality (AR) device. The method includes generating a multi-texture and a 3D mesh based on a multi-view image, generating a 3D model using the multi-texture and the 3D mesh, and transmitting, to the AR device, an image of the 3D model in the view toward which the camera of the AR device is directed, according to the camera movement and rotation information of the AR device; the AR device then outputs the image in that view.
Type: Grant
Filed: October 26, 2022
Date of Patent: November 19, 2024
Assignee: Electronics and Telecommunications Research Institute
Inventors: Jeung Chul Park, Wookho Son, Beom Ryeol Lee, Yongho Lee
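As a rough illustration of the view-dependent step described in this abstract, the sketch below projects a 3D model's vertices using the pose reported by an AR client. The pinhole camera model, the placeholder mesh, and all function names are assumptions for illustration, not the method claimed in the patent.

```python
# Minimal sketch: render-side projection of model vertices for the AR device's
# reported camera pose. Names and the pinhole model are illustrative assumptions.
import numpy as np

def project_vertices(vertices, R, t, fx, fy, cx, cy):
    """Project Nx3 world-space vertices into pixel coordinates.

    vertices: (N, 3) array of mesh vertices
    R, t:     camera rotation (3x3) and translation (3,) reported by the device
    fx, fy, cx, cy: pinhole intrinsics of the client camera
    """
    cam = (R @ vertices.T).T + t          # world -> camera coordinates
    cam = cam[cam[:, 2] > 0]              # keep points in front of the camera
    u = fx * cam[:, 0] / cam[:, 2] + cx   # perspective divide
    v = fy * cam[:, 1] / cam[:, 2] + cy
    return np.stack([u, v], axis=1)

# Example request driven by the device's movement/rotation information.
mesh = np.random.rand(1000, 3)                 # placeholder 3D model vertices
R = np.eye(3)                                  # device camera rotation
t = np.array([0.0, 0.0, 2.0])                  # device camera translation
pixels = project_vertices(mesh, R, t, 800, 800, 320, 240)
print(pixels.shape)  # image-plane positions for the device's current view
```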
-
Publication number: 20230237741
Abstract: Provided are a method and device for outputting a large-capacity 3D model for an augmented reality (AR) device. The method includes generating a multi-texture and a 3D mesh based on a multi-view image, generating a 3D model using the multi-texture and the 3D mesh, and transmitting, to the AR device, an image of the 3D model in the view toward which the camera of the AR device is directed, according to the camera movement and rotation information of the AR device; the AR device then outputs the image in that view.
Type: Application
Filed: October 26, 2022
Publication date: July 27, 2023
Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
Inventors: Jeung Chul PARK, Wookho SON, Beom Ryeol LEE, Yongho LEE
-
Publication number: 20220221981
Abstract: A computing device adapts an interface for extended reality. The computing device collects user information and external environment information when a user loads a virtual interface to experience extended reality content, and selects the highest interaction accuracy from among one or more interaction accuracies mapped to the collected user information and external environment information. The computing device determines content information mapped to the highest interaction accuracy, and reloads the virtual interface based on a state of the virtual interface that is determined from the determined content information.
Type: Application
Filed: October 13, 2021
Publication date: July 14, 2022
Inventors: Yongho LEE, Wookho SON, Beom Ryeol LEE
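The selection step in this abstract amounts to picking the candidate interface state with the highest mapped interaction accuracy. The sketch below shows that step under assumed field names and an assumed accuracy table; neither is specified in the publication.

```python
# Sketch of choosing the interface state with the highest interaction accuracy.
# The candidate table and field names are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class InterfaceCandidate:
    content_info: str      # content settings mapped to this accuracy
    accuracy: float        # interaction accuracy for the observed context

def select_interface(candidates):
    """Return the candidate with the highest interaction accuracy."""
    return max(candidates, key=lambda c: c.accuracy)

observed = [
    InterfaceCandidate("large buttons, voice input", 0.92),
    InterfaceCandidate("hand tracking, small panel", 0.81),
]
best = select_interface(observed)
print(best.content_info)  # reload the virtual interface with this state
```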
-
Patent number: 9760795
Abstract: A method for extracting features from an image for use in a computing device, the method comprising: producing Gaussian Scale Space (GSS) images in the form of a pyramid from the image input to the computing device; performing Scale Normalized Laplacian Filtering on the GSS images; detecting interest points from the filtered images; and extracting features of the image using the detected interest points.
Type: Grant
Filed: April 29, 2014
Date of Patent: September 12, 2017
Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
Inventors: Seungjae Lee, Sangil Na, Keun Dong Lee, Sungkwan Je, Da-Un Jung, Weon Geun Oh, Young Ho Suh, Wookho Son
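The steps named in this abstract (Gaussian scale space, scale-normalized Laplacian filtering, interest-point detection) can be sketched with standard SciPy filters. The parameter values and the simple per-scale non-maximum suppression below are assumptions; the patent's exact pipeline may differ.

```python
# Sketch of a Gaussian scale-space pyramid with scale-normalized Laplacian
# filtering and local-extremum interest-point detection (illustrative only).
import numpy as np
from scipy import ndimage

def detect_interest_points(image, sigmas=(1.6, 2.3, 3.2, 4.5), threshold=0.02):
    image = image.astype(np.float64)
    responses = []
    for sigma in sigmas:
        # sigma^2 * LoG gives the scale-normalized Laplacian response
        log = sigma ** 2 * ndimage.gaussian_laplace(image, sigma)
        responses.append(log)
    stack = np.stack(responses)                      # (scales, H, W)
    points = []
    for s, resp in enumerate(stack):
        # local maxima of |response| within each scale slice
        maxima = ndimage.maximum_filter(np.abs(resp), size=3)
        mask = (np.abs(resp) == maxima) & (np.abs(resp) > threshold)
        ys, xs = np.nonzero(mask)
        points.extend((x, y, sigmas[s]) for x, y in zip(xs, ys))
    return points

img = np.random.rand(128, 128)                       # placeholder input image
print(len(detect_interest_points(img)))              # detected (x, y, sigma) points
```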
-
Patent number: 9224241
Abstract: Disclosed is a system for producing digital holographic (DH) content, which includes: a 3D image information acquiring unit configured to acquire 3D information on real and virtual objects; a computer-generated hologram (CGH) processing unit configured to generate the digital holographic content by mathematical modeling from the acquired 3D information; a DH content editing unit configured to edit the generated digital holographic content; a DH image restoring unit configured to visualize the generated digital holographic content as a 3D image; and a digital holographic content process managing unit configured to manage a parameter and a processing time for each stage handled by the 3D image information acquiring unit, the CGH processing unit, the DH content editing unit, and the DH image restoring unit.
Type: Grant
Filed: December 14, 2012
Date of Patent: December 29, 2015
Assignee: Electronics and Telecommunications Research Institute
Inventor: Wookho Son
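For context on the CGH processing step, the sketch below shows a generic point-cloud computer-generated hologram computation, in which each object point contributes a spherical wave to the hologram plane. The wavelength, pixel pitch, and toy point cloud are illustrative values, and this is a textbook technique rather than the specific mathematical modeling of the patent.

```python
# Generic point-cloud CGH sketch: sum spherical-wave contributions of 3D points
# on the hologram plane and keep the phase pattern. Values are illustrative.
import numpy as np

def point_cloud_cgh(points, amplitudes, res=(256, 256), pitch=8e-6, wavelength=532e-9):
    """Accumulate spherical waves from 3D object points on the hologram plane."""
    k = 2 * np.pi / wavelength
    ys, xs = np.indices(res)
    u = (xs - res[1] / 2) * pitch            # hologram-plane x coordinates (m)
    v = (ys - res[0] / 2) * pitch            # hologram-plane y coordinates (m)
    field = np.zeros(res, dtype=np.complex128)
    for (x, y, z), a in zip(points, amplitudes):
        r = np.sqrt((u - x) ** 2 + (v - y) ** 2 + z ** 2)
        field += a * np.exp(1j * k * r) / r  # spherical wave from this point
    return np.angle(field)                   # phase-only hologram pattern

pts = [(0.0, 0.0, 0.1), (2e-4, -1e-4, 0.12)]   # toy object points (m)
hologram = point_cloud_cgh(pts, amplitudes=[1.0, 0.8])
print(hologram.shape)
```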
-
Publication number: 20150242703
Abstract: A method for extracting features from an image for use in a computing device, the method comprising: producing Gaussian Scale Space (GSS) images in the form of a pyramid from the image input to the computing device; performing Scale Normalized Laplacian Filtering on the GSS images; detecting interest points from the filtered images; and extracting features of the image using the detected interest points.
Type: Application
Filed: April 29, 2014
Publication date: August 27, 2015
Applicant: Electronics and Telecommunications Research Institute
Inventors: Seungjae LEE, Sangil NA, Keun Dong LEE, Sungkwan JE, Da-Un JUNG, Weon Geun OH, Young Ho SUH, Wookho SON
-
Patent number: 8059091
Abstract: Provided are an apparatus and method for inputting characters by head motion. In the method, a predetermined numeral group is selected from among a plurality of numeral groups by shifting a center-point key, set on a keyboard displayed on a screen, in a predetermined direction according to a sensed head motion. Then, one of the numeral keys in the selected group is selected by sensing the head motion, and the selected numeral key is input.
Type: Grant
Filed: November 29, 2007
Date of Patent: November 15, 2011
Assignee: Electronics and Telecommunications Research Institute
Inventors: Hyuk Jeong, Jong-Sung Kim, Wookho Son
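The two-step selection described in this abstract can be pictured as: one motion direction picks a numeral group relative to the center-point key, a second motion picks a key inside that group. The group layout and direction-to-key mapping below are hypothetical examples, not the patent's keyboard.

```python
# Two-step head-motion selection sketch with an assumed numeral-group layout.
GROUPS = {
    "up":    ["1", "2", "3"],
    "left":  ["4", "5", "6"],
    "right": ["7", "8", "9"],
    "down":  ["0", "*", "#"],
}
KEY_IN_GROUP = {"left": 0, "up": 1, "right": 2}   # second motion -> index in group

def input_numeral(first_motion, second_motion):
    """Map two sensed head-motion directions to one numeral key."""
    group = GROUPS[first_motion]           # motion 1: pick the numeral group
    return group[KEY_IN_GROUP[second_motion]]  # motion 2: pick the key

print(input_numeral("up", "right"))        # selects "3" in this example layout
```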
-
Patent number: 7856343
Abstract: A design evaluation system for mobile devices using virtual-reality-based prototypes is proposed. The system includes: a storage unit for storing 3-dimensional computer-aided design data of a virtual mobile device; a component selection unit for performing a selection among the 3-dimensional computer-aided design data stored in the storage unit and fetching the selected data; a design parameter setting unit for setting design parameters of the virtual mobile device; a 3-dimensional model visualization unit for visualizing the virtual mobile device as a 3-dimensional model using the design parameters and the fetched data; a product motion control unit for simulating motions and functions of the visualized virtual mobile device; and a design preference display unit for displaying user design preference for the visualized and simulated virtual mobile device.
Type: Grant
Filed: December 12, 2007
Date of Patent: December 21, 2010
Assignee: Electronics and Telecommunications Research Institute
Inventors: Dong-Sik Jo, Ung-Yeon Yang, WookHo Son
-
Patent number: 7804507
Abstract: A display apparatus for a mixed reality environment includes an image processor for mixing an actual image of an object around a user with artificial stereo images to produce multiple external image signals; a user information extractor for extracting the user's sight-line information, including the user's position, eye position, sight-line direction, and focal distance; an image creator for creating a stereo image signal based on the extracted sight-line information; an image mixer for synchronously mixing the multiple external image signals and the stereo image signal; and an image output unit for outputting the mixed image signal to the user.
Type: Grant
Filed: December 14, 2006
Date of Patent: September 28, 2010
Assignee: Electronics and Telecommunications Research Institute
Inventors: Ung-Yeon Yang, Gun Adven Lee, Sun Yu Hwang, Wookho Son
-
Patent number: 7783391
Abstract: An apparatus and method for controlling a vehicle by teeth clenching are provided. The apparatus includes: an electromyogram signal obtaining unit, comprising electromyogram sensors disposed at both sides for generating an electromyogram signal according to a predetermined muscle moved when a disabled person clenches teeth, and a ground electrode connected to the disabled person's body for providing a reference voltage; and a vehicle driving unit, comprising a control command generating unit for generating a vehicle driving command by classifying the electromyogram signal based on the side of the teeth clenched, the duration of the clenching, and the sequence of clenching motions made by the disabled person, a control command interface for generating a predetermined level of voltage according to the generated vehicle driving command, and a vehicle driving unit for driving the vehicle according to the generated voltage.
Type: Grant
Filed: January 4, 2006
Date of Patent: August 24, 2010
Assignee: Electronics and Telecommunications Research Institute
Inventors: Hyuk Jeong, Jong Sung Kim, Wookho Son
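The classification described here maps the side, duration, and sequence of clenching motions to a driving command. The feature names and the command table in the sketch below are illustrative assumptions; the abstract only states that these attributes are classified.

```python
# Sketch of mapping a classified teeth-clenching pattern to a driving command.
# The (side, duration, repetitions) features and command table are hypothetical.
def classify_command(side, duration_s, repetitions):
    """Translate a clenching pattern into a vehicle driving command."""
    if side == "both" and repetitions >= 2:
        return "stop"
    if side == "left":
        return "turn_left" if duration_s < 0.5 else "accelerate"
    if side == "right":
        return "turn_right" if duration_s < 0.5 else "decelerate"
    return "no_op"

print(classify_command("left", 0.3, 1))    # short left clench -> turn_left
print(classify_command("both", 0.4, 2))    # double clench on both sides -> stop
```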
-
Patent number: 7580028
Abstract: An apparatus and method for selecting and outputting characters through a teeth-clenching motion, intended for a disabled person, are disclosed. The apparatus includes: an electromyogram signal obtaining unit, comprising electromyogram sensors disposed at both sides for generating an electromyogram signal according to a predetermined muscle activated when the disabled person clenches teeth, and a ground electrode connected to the disabled person's body for providing a reference voltage; and an electromyogram signal processing unit for outputting a character by identifying the teeth-clenching motion pattern of each block according to the side of the teeth clenched, the duration of the clenching, and consecutive clenching motions through analysis of the obtained electromyogram signal, and selecting characters according to the identified teeth-clenching motion pattern.
Type: Grant
Filed: April 20, 2006
Date of Patent: August 25, 2009
Assignee: Electronics and Telecommunications Research Institute
Inventors: Hyuk Jeong, Jong Sung Kim, Wookho Son
-
Publication number: 20090157478
Abstract: A system for evaluating the usability of a virtual mobile information appliance unites: a virtual reality technology that integrates the various digital data created in planning and designing a product to operate a virtual product and visualize it at a photo-realistic level; an affective technology that organizes a customer's emotional evaluation of a product's design in engineering terms; an ergonomic technology that quantitatively measures and analyzes, in a biomechanical manner, the body force activity involved in operating the product; and a mixed reality technology that supports both a tangible interface capable of directly touching the digital data and photo-realistic visualization. The system thereby finds usability problems early, obtains improvements such as a refined product design, efficiently improves the overall quality of the product, and supports product life cycle management in the company manufacturing the product.
Type: Application
Filed: May 8, 2008
Publication date: June 18, 2009
Applicant: Electronics and Telecommunications Research Institute
Inventors: Ung-Yeon YANG, Dong-Sik JO, Wookho SON
-
Publication number: 20090153587
Abstract: A mixed reality system includes: a camera for providing captured image information in an arbitrary work environment; a sensing unit for providing sensed information based on operation of the camera; a process simulation unit for performing simulation on part/facility/process data of the arbitrary work environment, which is stored in a process information database (DB); a process allocation unit for handling allocation status between the data and simulation information; a mixed reality visualization unit for receiving the captured information and the sensed information, determining a location of the process allocation unit, combining the captured and sensed information with the simulation information, and then outputting the resulting information; and a display-based input/output unit for displaying mixed reality output information from the mixed reality visualization unit and inputting information requested by a user. Further, there is provided a method of implementing the same.
Type: Application
Filed: December 12, 2008
Publication date: June 18, 2009
Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
Inventors: Hyun KANG, Gun Lee, Wookho Son
-
Publication number: 20090063104
Abstract: A design evaluation system for mobile devices using virtual-reality-based prototypes is proposed. The system includes: a storage unit for storing 3-dimensional computer-aided design data of a virtual mobile device; a component selection unit for performing a selection among the 3-dimensional computer-aided design data stored in the storage unit and fetching the selected data; a design parameter setting unit for setting design parameters of the virtual mobile device; a 3-dimensional model visualization unit for visualizing the virtual mobile device as a 3-dimensional model using the design parameters and the fetched data; a product motion control unit for simulating motions and functions of the visualized virtual mobile device; and a design preference display unit for displaying user design preference for the visualized and simulated virtual mobile device.
Type: Application
Filed: December 12, 2007
Publication date: March 5, 2009
Applicant: Electronics and Telecommunications Research Institute
Inventors: Dong-Sik JO, Ung-Yeon Yang, WookHo Son
-
Patent number: 7460125
Abstract: Provided are an apparatus and method for immediately creating and controlling a virtual reality interactive human body model for a user-centric interface. The apparatus and method can transform a whole-body three-dimensional (3D) default model into a model close to the user's body by collecting data from the user's hands, generate a user-coincident 3D model by selecting a skin texture map and applying it to the transformed model, and control the user-coincident 3D model in a deformable polygon mesh structure to naturally visualize its motion. The apparatus and method allow users to participate in a virtual reality system conveniently and to interact naturally, in the same manner as in reality. They provide ordinary users with easy access to virtual reality systems and overcome the shortcomings of existing virtual reality systems, which simply provide visual effects.
Type: Grant
Filed: May 23, 2006
Date of Patent: December 2, 2008
Assignee: Electronics and Telecommunications Research Institute
Inventors: Ung Yeon Yang, Wookho Son
-
Publication number: 20080136681
Abstract: Provided are an apparatus and method for inputting characters by head motion. In the method, a predetermined numeral group is selected from among a plurality of numeral groups by shifting a center-point key, set on a keyboard displayed on a screen, in a predetermined direction according to a sensed head motion. Then, one of the numeral keys in the selected group is selected by sensing the head motion, and the selected numeral key is input.
Type: Application
Filed: November 29, 2007
Publication date: June 12, 2008
Applicant: Electronics and Telecommunications Research Institute
Inventors: Hyuk JEONG, Jong Sung Kim, Wookho Son
-
Publication number: 20080024597
Abstract: A display apparatus for a mixed reality environment includes an image processor for mixing an actual image of an object around a user with artificial stereo images to produce multiple external image signals; a user information extractor for extracting the user's sight-line information, including the user's position, eye position, sight-line direction, and focal distance; an image creator for creating a stereo image signal based on the extracted sight-line information; an image mixer for synchronously mixing the multiple external image signals and the stereo image signal; and an image output unit for outputting the mixed image signal to the user.
Type: Application
Filed: December 14, 2006
Publication date: January 31, 2008
Inventors: Ung-Yeon Yang, Gun Adven Lee, Sun Yu Hwang, Wookho Son
-
Publication number: 20070164985
Abstract: An apparatus and method for selecting and outputting characters through a teeth-clenching motion, intended for a disabled person, are disclosed. The apparatus includes: an electromyogram signal obtaining unit, comprising electromyogram sensors disposed at both sides for generating an electromyogram signal according to a predetermined muscle activated when the disabled person clenches teeth, and a ground electrode connected to the disabled person's body for providing a reference voltage; and an electromyogram signal processing unit for outputting a character by identifying the teeth-clenching motion pattern of each block according to the side of the teeth clenched, the duration of the clenching, and consecutive clenching motions through analysis of the obtained electromyogram signal, and selecting characters according to the identified teeth-clenching motion pattern.
Type: Application
Filed: April 20, 2006
Publication date: July 19, 2007
Inventors: Hyuk Jeong, Jong Kim, Wookho Son
-
Publication number: 20070132722
Abstract: A hand interface glove using miniaturized absolute position sensors, and a hand interface system using the same, are provided to allow a user to interact naturally with a virtual environment. The hand interface glove includes: a glove unit formed in the shape of a hand to be worn on the hand; a sensor unit for sensing analog signals representing the absolute positions of the finger joints, which change according to the motions of the finger joints, using a plurality of miniaturized absolute position sensors disposed at predetermined positions of the glove unit corresponding to the finger joints; and a data collecting unit for receiving the sensed analog signals from the sensor unit, transforming them into digital signals by amplifying and filtering the received analog signals, and outputting the digital signals.
Type: Application
Filed: October 27, 2006
Publication date: June 14, 2007
Inventors: Yong Kim, Yongseok Jang, Wookho Son
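A rough picture of the data-collection step in this abstract: sample each joint sensor, smooth the noisy analog reading, and quantize it to a digital value. The sensor model, the moving-average filter, and the 10-bit range below are assumptions for illustration only, not the publication's hardware.

```python
# Sketch of glove data collection: filter a noisy analog joint reading, then
# convert it to a digital sample. All values here are illustrative assumptions.
import random

def moving_average(samples, window=5):
    """Simple low-pass filter over the most recent analog samples."""
    recent = samples[-window:]
    return sum(recent) / len(recent)

def quantize(value, vmax=3.3, bits=10):
    """Convert a filtered analog voltage to an ADC code."""
    return round(min(max(value, 0.0), vmax) / vmax * (2 ** bits - 1))

raw = [1.65 + random.gauss(0, 0.05) for _ in range(50)]   # one finger-joint sensor
filtered = moving_average(raw)
print(quantize(filtered))    # digital joint-position sample sent to the host
```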
-
Publication number: 20070126733
Abstract: Provided are an apparatus and method for immediately creating and controlling a virtual reality interactive human body model for a user-centric interface. The apparatus and method can transform a whole-body three-dimensional (3D) default model into a model close to the user's body by collecting data from the user's hands, generate a user-coincident 3D model by selecting a skin texture map and applying it to the transformed model, and control the user-coincident 3D model in a deformable polygon mesh structure to naturally visualize its motion. The apparatus and method allow users to participate in a virtual reality system conveniently and to interact naturally, in the same manner as in reality. They provide ordinary users with easy access to virtual reality systems and overcome the shortcomings of existing virtual reality systems, which simply provide visual effects.
Type: Application
Filed: May 23, 2006
Publication date: June 7, 2007
Inventors: Ung Yang, Wookho Son