Patents by Inventor Young Jo Cho

Young Jo Cho has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 7584284
    Abstract: Provided is a path-token-based web service caching method including: determining whether stored cache data exists when a web service call request arrives; when the cache data does not exist, creating a predetermined path-token set and a predetermined tag data set based on a message schema of Web Services Description Language (WSDL) and creating a request Simple Object Access Protocol (SOAP) message; creating a request SOAP message template by using a path-token for the created request SOAP message and calling the web service; and creating cache data including the tag data set, an input value set, the request SOAP message template, the request SOAP message, and SOAP binding information. Accordingly, the method solves the problems of conventional web service caching, which cannot cope with a change in the number of inputs and cannot locate the exact input position when an input value changes.
    Type: Grant
    Filed: December 7, 2006
    Date of Patent: September 1, 2009
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Daeha Lee, Byoung Youl Song, Rockwon Kim, Jin Young Moon, Yeon Jun Kim, Moonyoung Chung, Kyung Il Kim, Seung Woo Jung, Hyeonsung Cho, Young Jo Cho
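A minimal sketch of the path-token caching idea the abstract describes: a cached request-message template keeps placeholders at the token paths, so a changed input value can be substituted at its exact position instead of rebuilding the message, and a changed input count invalidates the cache entry. All names (`PathTokenCache`, the token syntax) are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch of path-token-based caching of SOAP request messages.
# Names and the "{path}" placeholder syntax are hypothetical.

class PathTokenCache:
    def __init__(self):
        self._entries = {}  # operation name -> (template, path-token set)

    def put(self, operation, template, path_tokens):
        """Store a request message template with its path-token set."""
        self._entries[operation] = (template, tuple(path_tokens))

    def build_request(self, operation, input_values):
        """Fill the cached template; return None on a cache miss or when the
        number of inputs no longer matches the cached path-token set."""
        entry = self._entries.get(operation)
        if entry is None:
            return None
        template, path_tokens = entry
        if len(input_values) != len(path_tokens):
            return None  # input count changed -> cached template unusable
        message = template
        for token, value in zip(path_tokens, input_values):
            # Each input lands at its exact path-token position.
            message = message.replace("{" + token + "}", str(value))
        return message

cache = PathTokenCache()
cache.put("GetQuote",
          "<symbol>{Body/GetQuote/symbol}</symbol>",
          ["Body/GetQuote/symbol"])
request = cache.build_request("GetQuote", ["ETRI"])
```

The cache-miss path (returning `None`) is where the abstract's template-creation step would run before re-populating the cache.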
  • Publication number: 20090153499
    Abstract: A system for recognizing a touch action that a person has performed upon an object, is provided with a sensor component for detecting a sensor value corresponding to the touch action through an inertial sensor attached to part of a body of the person; and a signal processing component for recognizing the touch action from the sensor value detected by the sensor component and transferring the recognized touch action to the object. Further, a method for recognizing a touch action that a person has performed upon an object, is provided with: detecting a sensor value corresponding to the touch action through an inertial sensor attached to part of a body of the person; and recognizing the touch action from the detected sensor value and transferring the recognized touch action to the object.
    Type: Application
    Filed: August 13, 2008
    Publication date: June 18, 2009
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Jae Hong KIM, Sang Seung Kang, Joo Chan Sohn, Hyun Kyu Cho, Young-Jo Cho
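The touch-recognition entry above hinges on mapping inertial-sensor values to a touch action. A hypothetical sketch of the simplest such mapping: a tap shows up as a short spike in acceleration magnitude above the resting level of about 1 g. The threshold value and function names are assumptions, not the patent's signal-processing method.

```python
import math

# Hypothetical tap detection from body-worn inertial sensor samples.
# A tap spikes the acceleration magnitude above the ~1 g resting level.

def detect_taps(samples, threshold=1.5):
    """Return indices of samples whose acceleration magnitude (in g)
    exceeds the tap threshold. Threshold is an illustrative choice."""
    taps = []
    for i, (ax, ay, az) in enumerate(samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold:
            taps.append(i)
    return taps

# Resting samples near 1 g, with a spike at index 2.
samples = [(0.0, 0.0, 1.0), (0.0, 0.1, 1.0), (1.2, 0.5, 1.8), (0.0, 0.0, 1.0)]
```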
  • Publication number: 20090099693
    Abstract: A system for controlling emotional action expression includes an emotion engine that creates an emotion according to information provided from a plurality of sensors, and an emotional action expression/actuation control unit that detects an emotion platform profile and an emotion property from the created emotion and determines the action expression corresponding to the created emotion to control a target actuator. A control unit controls the motion of the target actuator under the control of the emotional action expression/actuation control unit.
    Type: Application
    Filed: September 10, 2008
    Publication date: April 16, 2009
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Sang Seung Kang, Jae Hong Kim, Joo Chan Sohn, Hyun Kyu Cho, Young Jo Cho
  • Publication number: 20090018698
    Abstract: The present invention relates to a network-based robot system and an executing method thereof. According to an exemplary embodiment of the present invention, predefined environment information is expressed in a universal data model (UDM) described by linkages that show relationships among nodes, each node being an object of a virtual space abstracted from a real physical space. The universal data model is updated based on the context information, event occurrence information is transmitted to a task engine when a context information data value is changed, and the task engine executes a corresponding task through reasoning and invokes an external service. The robot can better recognize the context information by utilizing the external sensing function and external processing function. In addition, the robot system can provide an active service by reasoning over the recognized context information and obtaining high-level information.
    Type: Application
    Filed: April 27, 2005
    Publication date: January 15, 2009
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Hyun Kim, Kang-Woo Lee, Joo-Haeng Lee, Tae-Gun Kang, Ae-Kyeung Moon, Young-Ho Suh, Joon-Myun Cho, Young-Jo Cho
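The update-then-notify flow in the entry above (UDM changes fire events to a task engine) can be sketched as a tiny observer pattern. Class names, the node path, and the one-rule "reasoning" are all hypothetical stand-ins for the patent's UDM and task engine.

```python
# Sketch of the UDM -> event -> task engine flow. Names are hypothetical.

class UniversalDataModel:
    def __init__(self):
        self._nodes = {}      # node path -> current value
        self._listeners = []  # callbacks invoked on a value change

    def subscribe(self, listener):
        self._listeners.append(listener)

    def update(self, node, value):
        """Set a node's value; notify listeners only when it changes."""
        if self._nodes.get(node) != value:
            self._nodes[node] = value
            for listener in self._listeners:
                listener(node, value)

class TaskEngine:
    def __init__(self):
        self.invoked = []

    def on_event(self, node, value):
        # Trivial stand-in for the reasoning step: one context rule that
        # invokes an external service when the light goes off.
        if node == "livingroom/light" and value == "off":
            self.invoked.append("turn_on_light_service")

udm = UniversalDataModel()
engine = TaskEngine()
udm.subscribe(engine.on_event)
udm.update("livingroom/light", "off")
udm.update("livingroom/light", "off")  # unchanged value -> no second event
```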
  • Publication number: 20080310682
    Abstract: Provided are a system and method for providing location information of a robot in real time using artificial marks. The system includes: an image processing module for obtaining an image signal by photographing artificial marks installed at a predetermined space with a space coordinate, and detecting an image coordinate of an artificial mark from the obtained image signal; a location calculating module for calculating a current location by comparing the image coordinate of the detected artificial mark with a pre-stored space coordinate of the artificial mark; and an artificial mark identifying module for updating current location information by selectively using one of an artificial mark tracing process and an image coordinate estimating process.
    Type: Application
    Filed: December 14, 2005
    Publication date: December 18, 2008
    Inventors: Jae Yeong Lee, Heesung Chae, Won Pil Yu, Young Jo Cho
  • Publication number: 20080137950
    Abstract: A system and method for analyzing the motions of an object based on the silhouettes of the object are provided. The system includes a foreground detector, a contour extractor, a model generator, a corner histogram generator, and a similarity measuring unit. The foreground detector detects a moving foreground object from an input image. The contour extractor extracts the silhouette contour of the detected foreground object, and the model generator generates mean-value histogram models as references to determine motions of the object.
    Type: Application
    Filed: November 6, 2007
    Publication date: June 12, 2008
    Inventors: Chan Kyu Park, Joo Chan Sohn, Hyun Kyu Cho, Young Jo Cho
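The silhouette-analysis entry above compares an observed corner histogram against mean-value histogram models. A sketch of that comparison, using histogram intersection as the similarity measure; the motion labels, bin counts, and choice of intersection are illustrative assumptions, not the patent's exact measure.

```python
# Sketch of matching an observed corner histogram to mean-value models.
# Histogram intersection is an assumed similarity measure.

def histogram_intersection(h1, h2):
    """Similarity in [0, 1] between two normalized histograms."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def classify_motion(observed, models):
    """Return the motion label whose model histogram is most similar."""
    return max(models,
               key=lambda label: histogram_intersection(observed, models[label]))

# Illustrative 4-bin mean-value models for two motions.
models = {
    "walking": [0.1, 0.4, 0.4, 0.1],
    "waving":  [0.4, 0.1, 0.1, 0.4],
}
observed = [0.15, 0.35, 0.40, 0.10]
```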
  • Publication number: 20080140809
    Abstract: Provided are a system and method for providing a contents service. A service storing apparatus stores service providing information and service request information. A service requesting apparatus composes a service search inquiry according to a contents service request, receives the inquiry result, and calls a corresponding service based on the received result to provide a corresponding contents service. A service relaying apparatus searches related service providing information from the service storing apparatus to provide information necessary for calling the service when the service search inquiry is received. A service providing apparatus provides service proxy information of a contents service and provides a corresponding contents service when a service is called by a service requesting apparatus.
    Type: Application
    Filed: October 26, 2007
    Publication date: June 12, 2008
    Inventors: Rock Won KIM, Yeon Jun KIM, Hyun KIM, Young Jo CHO
  • Publication number: 20080119959
    Abstract: A method and apparatus for expressing an emotion of a robot, which are applicable to different emotion robot platforms, are provided. The method includes: collecting emotion information by at least one internal or external sensor; generating an emotion and determining a behavior based on the collected information; determining an emotion expression, emotional intensity, and an action unit according to the generated emotion; generating an emotion expression document according to the determined emotion expression, emotional intensity, and action unit; analyzing the emotion expression document; and controlling the robot based on the initial status information of the robot and the generated emotion expression document.
    Type: Application
    Filed: October 31, 2007
    Publication date: May 22, 2008
    Inventors: Cheonshu PARK, Joung Woo RYU, Joo Chan SOHN, Young Jo CHO
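The entry above centers on a platform-independent "emotion expression document". A hypothetical sketch of serializing one to XML so different robot platforms can parse it; the element and attribute names are assumptions, not the patent's schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical emotion expression document: emotion, intensity, and action
# units serialized to XML. Element/attribute names are illustrative.

def build_emotion_document(emotion, intensity, action_units):
    root = ET.Element("emotionExpression")
    ET.SubElement(root, "emotion", name=emotion, intensity=str(intensity))
    for unit in action_units:
        ET.SubElement(root, "actionUnit", name=unit)
    return ET.tostring(root, encoding="unicode")

doc = build_emotion_document("joy", 0.8, ["smile", "nod"])
```

A platform-specific controller would parse this document and map each `actionUnit` onto its own actuators, which is what makes the representation portable across robot platforms.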
  • Publication number: 20080082209
    Abstract: A robot actuator and a robot actuating method. In the robot actuator, when an input part detects an external stimulus signal according to a user's contact, a control part receives the detected external stimulus signal to create sensor data. The control part determines an output reaction and an output actuator based on the created sensor data and controls the output actuator according to the determined output reaction. Thus, an axial skeletal unit of an output part is moved according to the operation of the output actuator to express the output reaction. Accordingly, a natural, lively reaction of the robot actuator to an external stimulus can be achieved.
    Type: Application
    Filed: September 26, 2007
    Publication date: April 3, 2008
    Inventors: Sang Seung KANG, Jae Hong KIM, Joo Chan SOHN, Young Jo CHO
  • Publication number: 20080082342
    Abstract: An apparatus for providing a content-information service comprises: a user-content interface for receiving content-provision request information collected by several user I/O interfaces, which include a voice recognition interface, and providing content data corresponding to the provision request information to users; a content-provision relay for requesting the content data using content-associated information corresponding to the content-provision request information, and transmitting the content data to the user-content interface; a content-information manager for registering and managing the content-associated information associated with the content data; and a content-storage unit for storing and managing a plurality of providable content data.
    Type: Application
    Filed: September 18, 2007
    Publication date: April 3, 2008
    Inventors: Rock Won Kim, Kang Woo Lee, Young Ho Suh, Min Young Kim, Yeon Jun Kim, Hyun Kim, Young Jo Cho
  • Publication number: 20080077277
    Abstract: An apparatus and method for expressing emotions in an intelligent robot. In the apparatus, a plurality of different sensors sense information about internal/external stimuli. A state information collector processes the detected information about the internal/external stimuli in a hierarchical structure to collect external state information. An emotion motive determiner determines an emotion need motive on the basis of the external state information and the degree of change in an emotion need parameter corresponding to internal state information. An emotion state manager extracts available means information for satisfaction of the generated emotion need motive and state information corresponding to the available means information. An emotion generator generates emotion information on the basis of a feature value of the extracted state information.
    Type: Application
    Filed: August 28, 2007
    Publication date: March 27, 2008
    Inventors: Cheon Shu PARK, Joung Woo RYU, Joo Chan SOHN, Young Jo CHO
  • Publication number: 20070150098
    Abstract: An apparatus for controlling a robot and a method thereof are provided. The apparatus includes: a state interpretation unit determining whether a current situation belongs to a preset unstable state by evaluating the current situation based on a plurality of perception information items; and a target generation unit setting a target action of the robot by comparing the current situation and the determination result with a predetermined value system, and then modifying the target action by receiving feedback of the robot's action performance result as perception information. According to the method and apparatus, by providing a processing procedure and a value system for resolving the variety of unstable states that can occur in the user's situation and the circumstances surrounding the robot, the robot can actively respond with actions.
    Type: Application
    Filed: November 1, 2006
    Publication date: June 28, 2007
    Inventors: Min Su Jang, Joo Chan Sohn, Young Cheol Go, Sang Seung Kang, Young Jo Cho
  • Publication number: 20070150097
    Abstract: A localization system and method for a mobile robot using a camera and artificial landmarks in a home or general office environment (or working zone) is provided. The localization system includes artificial landmarks having an LED flash function in an invisible wavelength band, a camera with a wide-angle lens, a module flashing landmarks attached to the ceiling and identifying positions and IDs of the landmarks from an image photographed by the camera having a filter, a module calculating the position and orientation of the robot using two landmarks of the image in a stop state, a module calculating a position of the robot when a ceiling to which the landmarks are attached has different heights, and a module calculating, when a new landmark is attached in the working zone, a position of the new landmark on an absolute coordinate.
    Type: Application
    Filed: August 23, 2006
    Publication date: June 28, 2007
    Inventors: Heesung Chae, Won Pil Yu, Jae Young Lee, Young Jo Cho
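The localization entry above recovers the robot's position and orientation from two ceiling landmarks. A sketch of why two landmarks suffice in 2D: the pair fixes both heading (from the bearing difference) and position (standard rigid registration). This is the generic geometry, not the patent's exact procedure.

```python
import math

# 2D pose from two landmarks known in world coordinates and observed in the
# robot frame. Standard rigid registration, shown as an assumed sketch.

def pose_from_two_landmarks(world_a, world_b, robot_a, robot_b):
    """Return (x, y, theta): robot position and heading in the world frame."""
    # Rotation: difference between the landmark pair's bearing in each frame.
    theta = (math.atan2(world_b[1] - world_a[1], world_b[0] - world_a[0])
             - math.atan2(robot_b[1] - robot_a[1], robot_b[0] - robot_a[0]))
    c, s = math.cos(theta), math.sin(theta)
    # Translation: world position of the robot's origin.
    x = world_a[0] - (c * robot_a[0] - s * robot_a[1])
    y = world_a[1] - (s * robot_a[0] + c * robot_a[1])
    return x, y, theta

# Robot at world (1, 2) facing +x (theta = 0): observations are world - (1, 2).
x, y, theta = pose_from_two_landmarks((3.0, 4.0), (5.0, 4.0),
                                      (2.0, 2.0), (4.0, 2.0))
```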
  • Publication number: 20070150102
    Abstract: A method of supporting robot application programming and a programming tool for the same are provided. In the method of supporting robot application programming, behaviors that constitute operations to be performed by a robot are assembled in the programming tool for programming an application program for the operations to be performed by the robot. The method includes: (a) classifying the operations to be performed by the robot by functions of the robot; (b) displaying the behaviors of the robot which can constitute one of the functions on a display device in a graphical form, with block shapes that can be connected to one another visually with a plurality of conditions; and (c) converting a set of the blocks into an XML file when the function of the robot is constructed as a robot task by the set of blocks in the graphical form.
    Type: Application
    Filed: December 7, 2006
    Publication date: June 28, 2007
    Inventors: Joong Ki Park, Joong Bae Kim, Woo Young Kwon, Kyeong Ho Lee, Young Jo Cho
  • Publication number: 20060204058
    Abstract: In realizing a user recognition system and a method thereof, a user feature vector is extracted from an input facial image, at least one cluster is generated, and a user feature template is enrolled. The cluster includes the feature vector as a member. When a user facial image is input, a user feature vector is extracted from the image, and the similarity between the feature vector and the user feature template is calculated. When the similarity is greater than a predetermined threshold value, the user of the user feature template is recognized as the user of the input image.
    Type: Application
    Filed: December 1, 2005
    Publication date: September 14, 2006
    Inventors: Do-Hyung Kim, Jae-Yeon Lee, Ho-Sub Yoon, Young-Jo Cho
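The threshold test in the entry above (accept the best-matching enrolled template only if its similarity clears a threshold) can be sketched as follows. Cosine similarity and the 0.8 threshold are illustrative choices, not values from the patent.

```python
import math

# Sketch of threshold-gated template matching for user recognition.
# The similarity measure and threshold are assumptions.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = (math.sqrt(sum(x * x for x in a))
            * math.sqrt(sum(y * y for y in b)))
    return dot / norm

def recognize(query, templates, threshold=0.8):
    """Return the enrolled user ID with the highest similarity to the query,
    or None when no template passes the threshold."""
    best_user, best_score = None, threshold
    for user, template in templates.items():
        score = cosine_similarity(query, template)
        if score > best_score:
            best_user, best_score = user, score
    return best_user

# Hypothetical enrolled feature templates.
templates = {"alice": [1.0, 0.0, 0.0], "bob": [0.0, 1.0, 0.0]}
```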
  • Publication number: 20060147001
    Abstract: The present invention relates to a remote control system, a remote server, a remote control agent, and a remote control method. According to the present invention, when controlling the remote apparatus to perform a target service, remote apparatus information and service information are searched, and a service plan is generated based on the searched remote apparatus information and service information for execution of the target service. In addition, a remote apparatus control process is generated by the service plan, and the process is performed. Therefore, users can easily control the remote apparatus and can be provided with desired services.
    Type: Application
    Filed: November 22, 2005
    Publication date: July 6, 2006
    Inventors: Young-Guk Ha, Joo-Chan Sohn, Young-Jo Cho
  • Publication number: 20060120569
    Abstract: A user recognizing system and method are provided. A user ID and predetermined user feature information are stored. First and second user feature information are extracted from the user image data transmitted from the image input unit, and first and second probabilities that the extracted first and second user feature information identify the predetermined user are respectively generated based on the information stored in the user information database, the first user feature information being absolutely unique biometric information and the second user feature information being unique semi-biometric information under a predetermined condition. The ID of the input image is finally determined by combining the first probability and the second probability. Accordingly, a user's identity can be authenticated even when the user moves freely.
    Type: Application
    Filed: November 30, 2005
    Publication date: June 8, 2006
    Inventors: Do-Hyung Kim, Jae-Yeon Lee, Su-Young Chi, Ho-Sub Yoon, Kye-Kyung Kim, Soo-Hyun Cho, Hye-Jin Kim, Young-Jo Cho
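The entry above combines a biometric probability (e.g. face) with a semi-biometric one (e.g. clothing color, valid only under a condition such as a single session) into a final identity decision. A sketch of one possible fusion; the weighted-product combination and the example numbers are assumptions, not the patent's formula.

```python
# Sketch of fusing biometric and semi-biometric per-user probabilities.
# The weighted geometric mean is an assumed combination rule.

def combine(p_biometric, p_semibiometric, weight=0.7):
    """Weighted geometric mean of the two per-user probabilities."""
    return (p_biometric ** weight) * (p_semibiometric ** (1.0 - weight))

def identify(biometric, semibiometric):
    """Return the user whose combined score is highest."""
    return max(biometric,
               key=lambda u: combine(biometric[u], semibiometric[u]))

# Face cue is ambiguous (user turned away) but the clothing cue disambiguates.
biometric = {"alice": 0.5, "bob": 0.5}
semibiometric = {"alice": 0.9, "bob": 0.2}
```

This shows the point of the second cue: it resolves identities the biometric cue alone cannot, which is what lets recognition keep working while the user moves freely.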
  • Patent number: 6731089
    Abstract: The present invention relates to a flexible and compact motor control module based on the Controller Area Network (CAN) communication network. More specifically, the present invention relates to a motor control module which is based on the ISO 11898 standard Controller Area Network (CAN), recognized for its suitability in the communication of intelligent sensors and actuators, and which is capable of obtaining the position, speed, and torque control commands of a motor, executing digital control functions irrespective of the motor's type and power consumption, and transmitting feedback data.
    Type: Grant
    Filed: November 30, 2001
    Date of Patent: May 4, 2004
    Assignee: Korea Institute of Science and Technology
    Inventors: Young Jo Cho, Sung On Lee, Bum Jae You, Sang Rok Oh
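The CAN-based motor control entry above sends position/speed/torque commands over the network. A hypothetical sketch of packing such a command into a CAN data frame (CAN 2.0 frames carry at most 8 data bytes); the field layout and command codes are illustrative, not the module's actual protocol.

```python
import struct

# Hypothetical 8-byte CAN payload layout for a motor command:
# 1 byte command, 1 byte node id, signed 16-bit target, 4 bytes reserved.

CMD_POSITION, CMD_SPEED, CMD_TORQUE = 0x01, 0x02, 0x03

def pack_command(command, node_id, target):
    """Pack a control command into the 8 data bytes of a CAN frame."""
    return struct.pack(">BBhI", command, node_id, target, 0)

def unpack_command(frame):
    """Inverse of pack_command; ignores the reserved bytes."""
    command, node_id, target, _ = struct.unpack(">BBhI", frame)
    return command, node_id, target

frame = pack_command(CMD_SPEED, 5, -1200)
```

Keeping the whole command inside one frame is what makes per-cycle command/feedback exchange cheap on the bus, regardless of the motor type behind each node.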
  • Publication number: 20020091469
    Abstract: The present invention relates to a flexible and compact motor control module based on the Controller Area Network (CAN) communication network. More specifically, the present invention relates to a motor control module which is based on the ISO 11898 standard Controller Area Network (CAN), recognized for its suitability in the communication of intelligent sensors and actuators, and which is capable of obtaining the position, speed, and torque control commands of a motor, executing digital control functions irrespective of the motor's type and power consumption, and transmitting feedback data.
    Type: Application
    Filed: November 30, 2001
    Publication date: July 11, 2002
    Applicant: KOREA INSTITUTE OF SCIENCE AND TECHNOLOGY
    Inventors: Young Jo Cho, Sung On Lee, Bum Jae You, Sang Rok Oh
  • Patent number: 6173208
    Abstract: Structured control codes which control a subject system are generated in a process control system having a user interface of a function block diagram editor including basic function block menus which represent a plurality of predetermined basic function blocks. The structured control codes are generated by the steps of: providing input/output data of the subject system in the form of a database; retrieving the input/output data of the subject system; providing the input/output data as input/output block menus on the user interface of the function block diagram editor; building a control algorithm by using the input/output block menus and the basic function block menus on the user interface; and converting the control algorithm to structured control codes.
    Type: Grant
    Filed: April 13, 1998
    Date of Patent: January 9, 2001
    Assignee: Korea Institute of Science and Technology
    Inventors: Jung Min Park, Young Jo Cho, Woo Jung Huh, Eung Seok Kim