Patents by Inventor Young Woo Yoon

Young Woo Yoon has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240123827
    Abstract: A high-voltage battery control apparatus includes a plurality of high-voltage battery controllers configured to respectively monitor the states of a plurality of high-voltage batteries included in one high-voltage battery pack, according to a predetermined period for monitoring the high-voltage battery pack. The high-voltage battery controllers are configured to transition their states together to an end state: a controller that completes its monitoring operation first waits until the controllers that have not yet completed their monitoring operations finish, and the transition occurs only once all monitoring operations are complete.
    Type: Application
    Filed: April 14, 2023
    Publication date: April 18, 2024
    Inventors: Geon Woo Park, Young Tae Ko, Hyeon Jun Kim, Byung Mo Kang, Jong Seo Yoon
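The synchronized end-state transition described in this abstract is, in effect, a barrier pattern: the first controller to finish waits for the rest, and all transition together. A minimal Python sketch of that pattern (the controller count, state names, and all identifiers are hypothetical illustrations, not from the patent):

```python
import threading

NUM_CONTROLLERS = 4
barrier = threading.Barrier(NUM_CONTROLLERS)
states = ["monitoring"] * NUM_CONTROLLERS  # hypothetical per-controller state

def controller(idx: int) -> None:
    # ... this controller's monitoring work would run here ...
    # A controller that finishes first blocks at the barrier until every
    # other controller has also completed its monitoring pass.
    barrier.wait()
    # Only after all monitoring operations finish do the controllers
    # transition to the end state together.
    states[idx] = "end"

threads = [threading.Thread(target=controller, args=(i,))
           for i in range(NUM_CONTROLLERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

After `join()` returns, every entry of `states` is `"end"`; no controller can reach the end state before the others have finished monitoring.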
  • Publication number: 20240128572
    Abstract: An eco-friendly power source such as a battery module is provided for a transportation vehicle, including: a first sub-module and a second sub-module respectively including a cell stack in which a plurality of battery cells are stacked; and a central wall disposed between the first sub-module and the second sub-module, wherein the central wall includes a first central wall facing the first sub-module and a second central wall facing the second sub-module, wherein the first central wall has a shape rotationally symmetrical to the second central wall about a first axis.
    Type: Application
    Filed: January 17, 2023
    Publication date: April 18, 2024
    Inventors: Ho Yeon KIM, Sang Tae AN, Hwa Kyoo YOON, Gang U LEE, Young Sun CHOI, Jeong Woo HAN
  • Publication number: 20240094601
    Abstract: A camera module includes a housing; a movable body configured to move in a direction of an optical axis of the housing; a reinforcing member formed integrally on one surface of the movable body and configured to increase the rigidity of the movable body; and a first buffer member formed in the reinforcing member and configured to reduce the impact force between the housing and the movable body.
    Type: Application
    Filed: November 29, 2023
    Publication date: March 21, 2024
    Applicant: SAMSUNG ELECTRO-MECHANICS CO., LTD.
    Inventors: Jae Hyuk LEE, Soo Cheol LIM, Byung Woo KANG, Young Bok YOON, Jong Woo HONG, Jung Seok LEE
  • Publication number: 20240079547
    Abstract: The present invention relates to an electrode slurry coating apparatus and method that increase process efficiency and reduce the error rate when double-layer structured active material layers are formed, by temporally adjusting the height of first and second discharge outlets through which active material is discharged.
    Type: Application
    Filed: November 7, 2023
    Publication date: March 7, 2024
    Applicant: LG Energy Solution, Ltd.
    Inventors: Taek Soo Lee, Young Joon Jo, Sang Hoon Choy, Ki Tae Kim, Ji Hee Yoon, Cheol Woo Kim
  • Publication number: 20240076182
    Abstract: An operating method is disclosed for a dehydrogenation reaction system. The method includes providing a system having: an acid aqueous solution tank including an acid aqueous solution; a dehydrogenation reactor including a chemical hydride of a solid state and receiving an acid aqueous solution from the acid aqueous solution tank to react the chemical hydride with the acid aqueous solution to generate hydrogen; and a fuel cell stack receiving hydrogen generated from the dehydrogenation reactor to be reacted with oxygen to generate water and simultaneously generate electrical energy. The method also includes recycling the water generated from the fuel cell stack to one or all of the acid aqueous solution tank, the dehydrogenation reactor, and a separate water tank. The acid is formic acid and, in the dehydrogenation reactor, the temperature is in a range of 10° C. to 400° C. and the pressure is in a range of 1 bar to 100 bar.
    Type: Application
    Filed: November 7, 2023
    Publication date: March 7, 2024
    Applicants: HYUNDAI MOTOR COMPANY, KIA CORPORATION, KOREA INSTITUTE OF SCIENCE AND TECHNOLOGY
    Inventors: Pyung Soon Kim, Jin Woo Choung, Jihui Seo, Suk Woo Nam, Young Suk Jo, Hyangsoo Jeong, Jaewon Kirk, Chang Won Yoon, Yongmin Kim
  • Patent number: 11740184
    Abstract: Provided is a fiber web for a gas sensor. In one exemplary embodiment of the present invention, there is provided a fiber web for a gas sensor including nanofibers that include a fiber-forming material and a sensing material for reacting with a target substance in a test gas. According to the exemplary embodiment, the fiber web for a gas sensor is capable of identifying the presence or absence of a target substance in a test gas and quantitatively determining the concentration of a target substance, and exhibits improved sensitivity due to having an increased area of contact and reaction with a target substance contained in a test gas.
    Type: Grant
    Filed: January 26, 2018
    Date of Patent: August 29, 2023
    Assignee: Amogreentech Co., Ltd.
    Inventor: Young Woo Yoon
  • Patent number: 11691291
    Abstract: Disclosed herein are an apparatus and method for generating robot interaction behavior. The method for generating robot interaction behavior includes generating a co-speech gesture of a robot corresponding to utterance input of a user, generating a nonverbal behavior of the robot, that is, a sequence of next joint positions of the robot, which are estimated from joint positions of the user and current joint positions of the robot based on a pre-trained neural network model for robot pose estimation, and generating a final behavior using at least one of the co-speech gesture and the nonverbal behavior.
    Type: Grant
    Filed: November 27, 2020
    Date of Patent: July 4, 2023
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Woo-Ri Ko, Do-Hyung Kim, Jae-Hong Kim, Young-Woo Yoon, Jae-Yeon Lee, Min-Su Jang
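As a rough illustration of the nonverbal-behavior step, the sketch below generates a sequence of next joint positions from the user's joints and the robot's current joints. The patent's method uses a pre-trained neural network; here a simple blending rule stands in for that model, and all names and values are hypothetical:

```python
def estimate_next_joints(user_joints, robot_joints, alpha=0.3):
    # Stand-in for the pre-trained pose-estimation network in the abstract:
    # this toy "model" moves each robot joint a fraction of the way toward
    # the corresponding user joint.
    return [(1 - alpha) * r + alpha * u for u, r in zip(user_joints, robot_joints)]

def generate_behavior(user_joints, robot_joints, steps=5):
    # The nonverbal behavior: a sequence of next joint positions.
    sequence = []
    current = robot_joints
    for _ in range(steps):
        current = estimate_next_joints(user_joints, current)
        sequence.append(current)
    return sequence

user = [0.8, -0.2, 0.5]   # hypothetical user joint angles (radians)
robot = [0.0, 0.0, 0.0]   # robot's current joint angles
behavior = generate_behavior(user, robot)
```

Each of the five generated poses moves the robot's joints closer to the user's pose, mimicking how an estimated sequence of next joint positions would unfold over time.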
  • Publication number: 20220055221
    Abstract: Disclosed herein are an apparatus and method for generating robot interaction behavior. The method for generating robot interaction behavior includes generating a co-speech gesture of a robot corresponding to utterance input of a user, generating a nonverbal behavior of the robot, that is, a sequence of next joint positions of the robot, which are estimated from joint positions of the user and current joint positions of the robot based on a pre-trained neural network model for robot pose estimation, and generating a final behavior using at least one of the co-speech gesture and the nonverbal behavior.
    Type: Application
    Filed: November 27, 2020
    Publication date: February 24, 2022
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Woo-Ri KO, Do-Hyung KIM, Jae-Hong KIM, Young-Woo YOON, Jae-Yeon LEE, Min-Su JANG
  • Publication number: 20210394021
    Abstract: Disclosed herein are an apparatus and method for evaluating a human motion using a mobile robot. The method may include identifying the exercise motion of a user by analyzing an image of the entire body of the user captured using a camera installed in the mobile robot, evaluating the pose of the user by comparing the standard pose of the identified exercise motion with images of the entire body of the user captured by the camera of the mobile robot from two or more target locations, and comprehensively evaluating the exercise motion of the user based on the pose evaluation information of the user from each of the two or more target locations.
    Type: Application
    Filed: November 30, 2020
    Publication date: December 23, 2021
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Do-Hyung KIM, Jae-Hong KIM, Young-Woo YOON, Jae-Yeon LEE, Min-Su JANG, Jeong-Dan CHOI
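The two-stage evaluation in this abstract (per-viewpoint pose scoring, then a comprehensive score across viewpoints) can be sketched as follows; the scoring metric, tolerance, and joint values are illustrative assumptions, not the patent's actual method:

```python
def pose_score(observed, standard):
    # Per-viewpoint pose score: mean absolute joint-angle deviation mapped
    # to [0, 1], where 1.0 is a perfect match (illustrative metric only).
    dev = sum(abs(o - s) for o, s in zip(observed, standard)) / len(standard)
    return max(0.0, 1.0 - dev)

def evaluate_exercise(per_location_poses, standard):
    # Comprehensive evaluation: average the pose scores obtained from the
    # two or more target locations the robot captured the user from.
    scores = [pose_score(p, standard) for p in per_location_poses]
    return sum(scores) / len(scores)

standard = [0.5, 1.0, 0.2]                  # standard pose (hypothetical)
views = [[0.6, 0.9, 0.2], [0.5, 1.1, 0.3]]  # captures from two locations
overall = evaluate_exercise(views, standard)
```

Averaging over viewpoints, as here, reduces the impact of a single bad camera angle, which is the motivation for evaluating from two or more target locations.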
  • Patent number: 11113988
    Abstract: Disclosed herein are an apparatus for writing a motion script and an apparatus and method for self-teaching of a motion. The method for self-teaching of a motion, in which the apparatus for writing a motion script and the apparatus for self-teaching of a motion are used, includes creating, by the apparatus for writing a motion script, a motion script based on expert motion of a first user; analyzing, by the apparatus for self-teaching of a motion, a motion of a second user, who learns the expert motion, based on the motion script; and outputting, by the apparatus for self-teaching of a motion, a result of analysis of the motion of the second user.
    Type: Grant
    Filed: July 1, 2020
    Date of Patent: September 7, 2021
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Do-Hyung Kim, Min-Su Jang, Jae-Hong Kim, Young-Woo Yoon, Jae-Il Cho
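The workflow in this abstract (write a motion script from expert motion, analyze a learner's motion against it, output the analysis result) can be sketched as below; the script format, tolerance, and joint values are all hypothetical stand-ins:

```python
def write_motion_script(expert_frames):
    # The "motion script": here simply the expert's pose per key frame
    # (a stand-in for whatever format the patent's script actually uses).
    return list(expert_frames)

def analyze_learner(script, learner_frames, tol=0.15):
    # Compare each learner frame against the corresponding script frame
    # and report which steps deviate beyond a tolerance.
    report = []
    for i, (ref, obs) in enumerate(zip(script, learner_frames)):
        err = max(abs(r - o) for r, o in zip(ref, obs))
        report.append((i, "ok" if err <= tol else "adjust"))
    return report

expert = [(0.0, 0.5), (0.2, 0.7)]    # hypothetical expert joint angles
learner = [(0.05, 0.5), (0.2, 1.0)]  # learner's attempt at the same motion
script = write_motion_script(expert)
feedback = analyze_learner(script, learner)
```

The output flags the second frame, where the learner's joint angle deviates by 0.3, as needing adjustment, illustrating the "output a result of analysis" step.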
  • Publication number: 20200335007
    Abstract: Disclosed herein are an apparatus for writing a motion script and an apparatus and method for self-teaching of a motion. The method for self-teaching of a motion, in which the apparatus for writing a motion script and the apparatus for self-teaching of a motion are used, includes creating, by the apparatus for writing a motion script, a motion script based on expert motion of a first user; analyzing, by the apparatus for self-teaching of a motion, a motion of a second user, who learns the expert motion, based on the motion script; and outputting, by the apparatus for self-teaching of a motion, a result of analysis of the motion of the second user.
    Type: Application
    Filed: July 1, 2020
    Publication date: October 22, 2020
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Do-Hyung KIM, Min-Su JANG, Jae-Hong KIM, Young-Woo YOON, Jae-Il CHO
  • Patent number: 10777198
    Abstract: Disclosed herein are an apparatus and method for determining the speech and motion properties of an interactive robot. The method for determining the speech and motion properties of an interactive robot includes receiving interlocutor conversation information including at least one of voice information and image information about an interlocutor that interacts with an interactive robot, extracting at least one of a verbal property and a nonverbal property of the interlocutor by analyzing the interlocutor conversation information, determining at least one of a speech property and a motion property of the interactive robot based on at least one of the verbal property, the nonverbal property, and context information inferred from a conversation between the interactive robot and the interlocutor, and controlling the operation of the interactive robot based on at least one of the determined speech property and motion property of the interactive robot.
    Type: Grant
    Filed: August 13, 2018
    Date of Patent: September 15, 2020
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Young-Woo Yoon, Jae-Hong Kim, Jae-Yeon Lee, Min-Su Jang
  • Patent number: 10748444
    Abstract: Disclosed herein are an apparatus for writing a motion script and an apparatus and method for self-teaching of a motion. The method for self-teaching of a motion, in which the apparatus for writing a motion script and the apparatus for self-teaching of a motion are used, includes creating, by the apparatus for writing a motion script, a motion script based on expert motion of a first user; analyzing, by the apparatus for self-teaching of a motion, a motion of a second user, who learns the expert motion, based on the motion script; and outputting, by the apparatus for self-teaching of a motion, a result of analysis of the motion of the second user.
    Type: Grant
    Filed: March 6, 2017
    Date of Patent: August 18, 2020
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Do-Hyung Kim, Min-Su Jang, Jae-Hong Kim, Young-Woo Yoon, Jae-Il Cho
  • Publication number: 20190391082
    Abstract: Provided is a fiber web for a gas sensor. In one exemplary embodiment of the present invention, there is provided a fiber web for a gas sensor including nanofibers that include a fiber-forming material and a sensing material for reacting with a target substance in a test gas. According to the exemplary embodiment, the fiber web for a gas sensor is capable of identifying the presence or absence of a target substance in a test gas and quantitatively determining the concentration of a target substance, and exhibits improved sensitivity due to having an increased area of contact and reaction with a target substance contained in a test gas.
    Type: Application
    Filed: January 26, 2018
    Publication date: December 26, 2019
    Applicant: AMOGREENTECH CO., LTD.
    Inventor: Young Woo YOON
  • Publication number: 20190164548
    Abstract: Disclosed herein are an apparatus and method for determining the speech and motion properties of an interactive robot. The method for determining the speech and motion properties of an interactive robot includes receiving interlocutor conversation information including at least one of voice information and image information about an interlocutor that interacts with an interactive robot, extracting at least one of a verbal property and a nonverbal property of the interlocutor by analyzing the interlocutor conversation information, determining at least one of a speech property and a motion property of the interactive robot based on at least one of the verbal property, the nonverbal property, and context information inferred from a conversation between the interactive robot and the interlocutor, and controlling the operation of the interactive robot based on at least one of the determined speech property and motion property of the interactive robot.
    Type: Application
    Filed: August 13, 2018
    Publication date: May 30, 2019
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Young-Woo YOON, Jae-Hong KIM, Jae-Yeon LEE, Min-Su JANG
  • Patent number: 9990538
    Abstract: Disclosed is a face recognition technology using physiognomic feature information that can improve the accuracy of face recognition. To this end, the face recognition method using physiognomic feature information includes defining standard physiognomic types for respective facial elements, capturing a facial image of a user, detecting information about facial elements from the facial image, and calculating similarity scores relative to the standard physiognomic types for respective facial elements of the user based on the facial element information.
    Type: Grant
    Filed: April 4, 2016
    Date of Patent: June 5, 2018
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Ho-Sub Yoon, Kyu-Dae Ban, Young-Woo Yoon, Jae-Hong Kim
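The similarity-scoring step can be illustrated with cosine similarity between detected facial-element features and standard physiognomic types. The feature vectors, type names, and the choice of metric below are invented for the example; the patent does not specify this particular formulation:

```python
import math

# Hypothetical standard physiognomic types for one facial element (eyes):
# each type is a small feature vector, e.g. normalized width/height ratios.
STANDARD_EYE_TYPES = {
    "round": [0.9, 0.5],
    "narrow": [0.4, 0.9],
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def score_facial_element(features, standard_types):
    # Similarity score of the detected facial-element features against
    # each defined standard physiognomic type.
    return {name: cosine_similarity(features, ref)
            for name, ref in standard_types.items()}

scores = score_facial_element([0.85, 0.55], STANDARD_EYE_TYPES)
best = max(scores, key=scores.get)
```

Here the detected eye features score highest against the "round" type; repeating this per facial element yields the per-element similarity scores the abstract describes.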
  • Publication number: 20170358243
    Abstract: Disclosed herein are an apparatus for writing a motion script and an apparatus and method for self-teaching of a motion. The method for self-teaching of a motion, in which the apparatus for writing a motion script and the apparatus for self-teaching of a motion are used, includes creating, by the apparatus for writing a motion script, a motion script based on expert motion of a first user; analyzing, by the apparatus for self-teaching of a motion, a motion of a second user, who learns the expert motion, based on the motion script; and outputting, by the apparatus for self-teaching of a motion, a result of analysis of the motion of the second user.
    Type: Application
    Filed: March 6, 2017
    Publication date: December 14, 2017
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Do-Hyung KIM, Min-Su JANG, Jae-Hong KIM, Young-Woo YOON, Jae-Il CHO
  • Publication number: 20170193284
    Abstract: Disclosed is a face recognition technology using physiognomic feature information that can improve the accuracy of face recognition. To this end, the face recognition method using physiognomic feature information includes defining standard physiognomic types for respective facial elements, capturing a facial image of a user, detecting information about facial elements from the facial image, and calculating similarity scores relative to the standard physiognomic types for respective facial elements of the user based on the facial element information.
    Type: Application
    Filed: April 4, 2016
    Publication date: July 6, 2017
    Inventors: Ho-Sub YOON, Kyu-Dae BAN, Young-Woo YOON, Jae-Hong KIM
  • Publication number: 20170076629
    Abstract: Disclosed herein are an apparatus and method for supporting choreography, which can easily and systematically search for existing dances through various interfaces and can check the simulation of the found dances. For this, the apparatus includes a dance motion DB for storing pieces of motion capture data about respective multiple dance motions, a dance attribute DB for storing pieces of biomechanical information about respective multiple dance motions, a search unit for receiving a search target dance from a user using a method corresponding to at least one of a sectional motion search and a dance attribute search, and searching the dance motion DB and the dance attribute DB for choreographic data based on similarity determination, and a display unit for displaying choreographic data of the dance motion DB and the dance attribute DB, found as a result of the search based on similarity determined by the search unit.
    Type: Application
    Filed: March 3, 2016
    Publication date: March 16, 2017
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Do-Hyung KIM, Jae-Hong KIM, Young-Woo YOON, Min-Su JANG, Cheon-Shu PARK, Sung-Woong SHIN
  • Patent number: 9201425
    Abstract: Provided are a human-tracking method and a robot apparatus. The human-tracking method includes receiving an image frame including a color image and a depth image, determining whether user tracking was successful in a previous image frame, and determining a location of a user and a goal position to which a robot apparatus is to move based on the color image and the depth image in the image frame, when user tracking was successful in the previous frame. Accordingly, a current location of the user can be predicted from the depth image, user tracking can be quickly performed, and the user can be re-detected and tracked using user information acquired in user tracking when detection of the user fails due to obstacles or the like.
    Type: Grant
    Filed: September 9, 2013
    Date of Patent: December 1, 2015
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Young Woo Yoon, Do Hyung Kim, Woo Han Yun, Ho Sub Yoon, Jae Yeon Lee, Jae Hong Kim, Jong Hyun Park
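The control flow of the tracking loop in this last abstract (predict the user's location from the depth image when the previous frame's tracking succeeded; fall back to full re-detection otherwise) can be sketched as below. The frame keys and values are hypothetical, and the actual prediction and detection algorithms are elided:

```python
def track_user(frames):
    # Each frame is a dict standing in for a color + depth image pair:
    # "depth_hint" models a fast depth-based location prediction,
    # "detection" models a full (slower) user re-detection.
    tracked = False
    location = None
    log = []
    for frame in frames:
        if tracked and frame.get("depth_hint") is not None:
            location = frame["depth_hint"]   # fast path: predict from depth
            log.append("predicted")
        else:
            location = frame.get("detection")  # slow path: re-detect the user
            log.append("re-detected" if location is not None else "lost")
        tracked = location is not None
    return location, log

frames = [
    {"detection": (1.0, 2.0)},   # initial detection
    {"depth_hint": (1.1, 2.0)},  # fast depth-based update
    {},                          # occluded: tracking lost this frame
    {"detection": (1.3, 2.1)},   # re-detected after the occlusion
]
final_location, log = track_user(frames)
```

The log shows the abstract's claimed behavior: quick depth-based tracking while it succeeds, and recovery by re-detection when the user is lost behind an obstacle.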