Patents by Inventor Min Su Jang

Min Su Jang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200390371
    Abstract: Disclosed herein are an apparatus and method for evaluating a physical activity ability. The apparatus includes one or more processors and executable memory for storing at least one program executed by the one or more processors. The at least one program recognizes the position of a human by analyzing an image input through a camera, identifies the motion of the human by analyzing the sequence of the image, and evaluates the physical activity ability of the human from the motion of the human based on the body segment of the human.
    Type: Application
    Filed: December 11, 2019
    Publication date: December 17, 2020
    Inventors: Do-Hyung KIM, Jae-Hong KIM, Jae-Yeon LEE, Min-Su JANG, Sung-Woong SHIN
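The publication itself does not include source code; the sketch below is only a rough illustration of the kind of per-segment motion analysis the abstract above describes, assuming a pose estimator has already extracted 2D joint keypoints from the camera frames. The joint names, frame layout, and the range-of-motion score are hypothetical choices for the example, not the patented evaluation method.

```python
# Illustrative sketch only -- not the patented implementation.
# Assumes a pose estimator has already produced 2D keypoints per frame.
import math

# Hypothetical keypoint layout: each frame maps joint name -> (x, y) pixels.
FRAMES = [
    {"hip": (320, 240), "knee": (320, 300), "ankle": (320, 360)},
    {"hip": (320, 240), "knee": (330, 295), "ankle": (360, 330)},
]

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def knee_range_of_motion(frames):
    """One simple per-segment score: range of knee flexion over the sequence."""
    angles = [joint_angle(f["hip"], f["knee"], f["ankle"]) for f in frames]
    return max(angles) - min(angles)

print(f"knee range of motion: {knee_range_of_motion(FRAMES):.1f} degrees")
```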
  • Publication number: 20200335007
    Abstract: Disclosed herein are an apparatus for writing a motion script and an apparatus and method for self-teaching of a motion. The method for self-teaching of a motion, in which the apparatus for writing a motion script and the apparatus for self-teaching of a motion are used, includes creating, by the apparatus for writing a motion script, a motion script based on expert motion of a first user; analyzing, by the apparatus for self-teaching of a motion, a motion of a second user, who learns the expert motion, based on the motion script; and outputting, by the apparatus for self-teaching of a motion, a result of analysis of the motion of the second user.
    Type: Application
    Filed: July 1, 2020
    Publication date: October 22, 2020
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Do-Hyung KIM, Min-Su JANG, Jae-Hong KIM, Young-Woo YOON, Jae-Il CHO
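As a rough illustration of comparing a learner's motion against an expert's motion script (and not the patented script format or scoring), the sketch below aligns two hypothetical joint-angle sequences with classic dynamic time warping and reports a length-normalized alignment cost.

```python
# Illustrative sketch only -- not the patented motion-script format or scoring.
# Compares an expert joint-angle sequence to a learner's with dynamic time warping.

def dtw_distance(expert, learner):
    """Classic DTW over two 1-D sequences of joint angles (degrees)."""
    n, m = len(expert), len(learner)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(expert[i - 1] - learner[j - 1])
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m] / (n + m)  # length-normalized alignment cost

expert_elbow = [170, 150, 120, 90, 70, 90, 120, 150, 170]    # expert motion script
learner_elbow = [168, 155, 130, 105, 95, 110, 135, 158, 169]  # learner attempt
score = dtw_distance(expert_elbow, learner_elbow)
print(f"normalized alignment cost: {score:.2f} degrees per step")
```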
  • Patent number: 10800043
    Abstract: Disclosed herein are an interaction apparatus and method. The interaction apparatus includes an input unit for receiving multimodal information including an image and a voice of a target to allow the interaction apparatus to interact with the target, a recognition unit for recognizing turn-taking behavior of the target using the multimodal information, and an execution unit for taking an activity for interacting with the target based on results of recognition of the turn-taking behavior.
    Type: Grant
    Filed: November 30, 2018
    Date of Patent: October 13, 2020
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Cheon-Shu Park, Jae-Hong Kim, Jae-Yeon Lee, Min-Su Jang
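The sketch below is a simple rule-based stand-in, not the patented recognizer: it assumes per-frame voice-activity and gaze flags have already been extracted from the audio and image streams, and it flags a likely turn-yield when speech is followed by a short silence while the interlocutor looks at the robot. The `Frame` fields and the five-frame silence threshold are hypothetical.

```python
# Illustrative sketch only -- a rule-based stand-in for the patented recognizer.
# Fuses per-frame voice activity and gaze cues to flag a turn-yielding moment.
from dataclasses import dataclass

@dataclass
class Frame:
    speaking: bool         # voice activity detected for the interlocutor
    gazing_at_robot: bool  # face/eye cue extracted from the image

def detect_turn_yield(frames, silence_frames=5):
    """Return the index where the interlocutor seems to hand the turn over:
    a run of silence after speech, while looking at the robot."""
    silent_run = 0
    has_spoken = False
    for i, f in enumerate(frames):
        if f.speaking:
            has_spoken, silent_run = True, 0
        else:
            silent_run += 1
            if has_spoken and silent_run >= silence_frames and f.gazing_at_robot:
                return i
    return None

frames = [Frame(True, False)] * 10 + [Frame(False, True)] * 6
print("turn yielded at frame:", detect_turn_yield(frames))
```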
  • Patent number: 10789458
    Abstract: Disclosed herein are a human behavior recognition apparatus and method. The human behavior recognition apparatus includes a multimodal sensor unit for generating at least one of image information, sound information, location information, and Internet-of-Things (IoT) information of a person using a multimodal sensor, a contextual information extraction unit for extracting contextual information for recognizing actions of the person from the at least one piece of generated information, a human behavior recognition unit for generating behavior recognition information by recognizing the actions of the person using the contextual information and recognizing a final action of the person using the behavior recognition information and behavior intention information, and a behavior intention inference unit for generating the behavior intention information based on context of action occurrence related to each of the actions of the person included in the behavior recognition information.
    Type: Grant
    Filed: December 7, 2018
    Date of Patent: September 29, 2020
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Do-Hyung Kim, Jin-Hyeok Jang, Jae-Hong Kim, Sung-Woong Shin, Jae-Yeon Lee, Min-Su Jang
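As a loose illustration of the fusion idea in the abstract above (not the patented recognizer), the sketch below combines per-modality action scores by late fusion and then biases the result with an intention prior; the action labels, score values, and the 0.3 weighting are invented for the example.

```python
# Illustrative sketch only -- not the patented recognizer.
# Late fusion: per-modality action scores are combined with an intention prior
# inferred from the context in which earlier actions occurred.

ACTIONS = ["cooking", "eating", "washing_dishes"]

def fuse(modality_scores, intention_prior, weight=0.3):
    """Average normalized per-modality scores, then bias by the intention prior."""
    fused = {}
    for action in ACTIONS:
        avg = sum(scores[action] for scores in modality_scores) / len(modality_scores)
        fused[action] = (1 - weight) * avg + weight * intention_prior[action]
    return max(fused, key=fused.get), fused

vision = {"cooking": 0.5, "eating": 0.3, "washing_dishes": 0.2}
sound  = {"cooking": 0.4, "eating": 0.2, "washing_dishes": 0.4}
iot    = {"cooking": 0.6, "eating": 0.1, "washing_dishes": 0.3}  # e.g. stove sensor on
prior  = {"cooking": 0.2, "eating": 0.6, "washing_dishes": 0.2}  # e.g. it is lunchtime

final_action, scores = fuse([vision, sound, iot], prior)
print(final_action, scores)
```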
  • Patent number: 10777198
    Abstract: Disclosed herein are an apparatus and method for determining the speech and motion properties of an interactive robot. The method for determining the speech and motion properties of an interactive robot includes receiving interlocutor conversation information including at least one of voice information and image information about an interlocutor that interacts with an interactive robot, extracting at least one of a verbal property and a nonverbal property of the interlocutor by analyzing the interlocutor conversation information, determining at least one of a speech property and a motion property of the interactive robot based on at least one of the verbal property, the nonverbal property, and context information inferred from a conversation between the interactive robot and the interlocutor, and controlling the operation of the interactive robot based on at least one of the determined speech property and motion property of the interactive robot.
    Type: Grant
    Filed: August 13, 2018
    Date of Patent: September 15, 2020
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Young-Woo Yoon, Jae-Hong Kim, Jae-Yeon Lee, Min-Su Jang
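The sketch below is only a hedged illustration of mapping measured interlocutor properties to robot speech and motion parameters; the property names, thresholds, and output ranges are invented for the example and are not taken from the patent.

```python
# Illustrative sketch only -- simple rules standing in for the patented method.
# Maps measured interlocutor properties to robot speech and motion parameters.

def robot_properties(interlocutor):
    """interlocutor: dict with hypothetical keys measured from audio/video."""
    speech_rate = interlocutor.get("speech_rate_wpm", 120)
    loudness_db = interlocutor.get("loudness_db", 60)
    gesture_freq = interlocutor.get("gestures_per_min", 5)

    return {
        # roughly mirror the interlocutor's pace and volume, within safe bounds
        "robot_speech_rate_wpm": max(90, min(180, speech_rate)),
        "robot_volume_db": max(55, min(75, loudness_db)),
        # move more expressively with an animated interlocutor
        "robot_gesture_amplitude": "large" if gesture_freq > 8 else "moderate",
    }

print(robot_properties({"speech_rate_wpm": 150, "loudness_db": 68, "gestures_per_min": 10}))
```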
  • Patent number: 10748444
    Abstract: Disclosed herein are an apparatus for writing a motion script and an apparatus and method for self-teaching of a motion. The method for self-teaching of a motion, in which the apparatus for writing a motion script and the apparatus for self-teaching of a motion are used, includes creating, by the apparatus for writing a motion script, a motion script based on expert motion of a first user; analyzing, by the apparatus for self-teaching of a motion, a motion of a second user, who learns the expert motion, based on the motion script; and outputting, by the apparatus for self-teaching of a motion, a result of analysis of the motion of the second user.
    Type: Grant
    Filed: March 6, 2017
    Date of Patent: August 18, 2020
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Do-Hyung Kim, Min-Su Jang, Jae-Hong Kim, Young-Woo Yoon, Jae-Il Cho
  • Publication number: 20200094416
    Abstract: Disclosed herein are an interaction apparatus and method. The interaction apparatus includes an input unit for receiving multimodal information including an image and a voice of a target to allow the interaction apparatus to interact with the target, a recognition unit for recognizing turn-taking behavior of the target using the multimodal information, and an execution unit for taking an activity for interacting with the target based on results of recognition of the turn-taking behavior.
    Type: Application
    Filed: November 30, 2018
    Publication date: March 26, 2020
    Inventors: Cheon-Shu PARK, Jae-Hong KIM, Jae-Yeon LEE, Min-Su JANG
  • Patent number: 10596708
    Abstract: The present disclosure herein relates to an interaction device capable of performing an interaction with a human, and more particularly, to an interaction device capable of performing an interaction with a plurality of participants. The interaction device includes a role classifying unit configured to classify a role for each of a plurality of participants based on an external stimulus signal for each of the plurality of participants and an action adjusting unit configured to perform different interaction operations for each of the plurality of participants based on the role for each of the plurality of participants. An interaction device according to an embodiment of the present application classifies the roles of a plurality of participants according to a participation degree and/or an action state and provides a customized interaction operation for each of the participants according to the classified roles. Therefore, it is possible to perform a natural interaction operation with a plurality of participants.
    Type: Grant
    Filed: March 22, 2017
    Date of Patent: March 24, 2020
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Cheonshu Park, Jae Hong Kim, Daeha Lee, Min Su Jang
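As a rough stand-in for the role classification described above (not the patented classifier), the sketch below scores each participant's participation degree from hypothetical speaking-time and gaze measurements and maps the scores to coarse roles.

```python
# Illustrative sketch only -- not the patented classifier.
# Assigns conversational roles from a crude participation score per participant.

def classify_roles(participants):
    """participants: name -> dict with hypothetical 'speaking_sec' and
    'gaze_at_device_sec' measured over a time window."""
    scores = {
        name: 0.7 * p["speaking_sec"] + 0.3 * p["gaze_at_device_sec"]
        for name, p in participants.items()
    }
    main = max(scores, key=scores.get)
    roles = {}
    for name, s in scores.items():
        if name == main:
            roles[name] = "main speaker"
        elif s > 0.25 * scores[main]:
            roles[name] = "active participant"
        else:
            roles[name] = "bystander"
    return roles

print(classify_roles({
    "A": {"speaking_sec": 40, "gaze_at_device_sec": 20},
    "B": {"speaking_sec": 15, "gaze_at_device_sec": 30},
    "C": {"speaking_sec": 1,  "gaze_at_device_sec": 5},
}))
```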
  • Publication number: 20200087951
    Abstract: The present disclosure relates to a bicycle locking apparatus having an abnormal locking prevention function. The bicycle locking apparatus includes a sensor unit which measures a driving condition of a bicycle to be locked by using a plurality of sensors and outputs sensing data about the driving condition, a locking operation unit which performs a locking operation or an unlocking operation on the bicycle to be locked in response to input of a lock signal or an unlock signal, and a control unit which controls the overall operation of the bicycle locking apparatus. The control unit includes a first driving state determination unit which determines a first state of the bicycle to be locked by using sensing data of a first sensor group, a second driving state determination unit which determines a second state of the bicycle to be locked by using sensing data of a second sensor group.
    Type: Application
    Filed: November 22, 2019
    Publication date: March 19, 2020
    Applicant: BISECU INC.
    Inventors: Jong Hyun LEE, Jung Ho HYUN, Ji Soon KANG, Min Su JANG
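The sketch below is a hypothetical illustration, not BISECU's firmware: it assumes a first sensor group reporting accelerometer samples and a second reporting wheel speed, and it allows locking only when both groups agree the bicycle is stationary, which is the abnormal-locking-prevention idea the abstract describes.

```python
# Illustrative sketch only -- hypothetical sensor groups, not the product firmware.
# Two sensor groups vote on whether the bicycle is moving; the lock is engaged
# only when both agree the bicycle is stationary.

def first_group_state(accel_samples, threshold=0.15):
    """First sensor group: variance of accelerometer magnitude (g)."""
    mean = sum(accel_samples) / len(accel_samples)
    var = sum((a - mean) ** 2 for a in accel_samples) / len(accel_samples)
    return "moving" if var > threshold else "stationary"

def second_group_state(wheel_rpm, threshold=5):
    """Second sensor group: wheel rotation speed."""
    return "moving" if wheel_rpm > threshold else "stationary"

def safe_to_lock(accel_samples, wheel_rpm):
    """Prevent abnormal locking: lock only when both groups report 'stationary'."""
    return (first_group_state(accel_samples) == "stationary"
            and second_group_state(wheel_rpm) == "stationary")

print(safe_to_lock([1.0, 1.01, 0.99, 1.0], wheel_rpm=0))  # True  -> lock allowed
print(safe_to_lock([0.4, 1.8, 0.6, 1.5], wheel_rpm=90))   # False -> riding, don't lock
```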
  • Publication number: 20200074158
    Abstract: Disclosed herein are a human behavior recognition apparatus and method. The human behavior recognition apparatus includes a multimodal sensor unit for generating at least one of image information, sound information, location information, and Internet-of-Things (IoT) information of a person using a multimodal sensor, a contextual information extraction unit for extracting contextual information for recognizing actions of the person from the at least one piece of generated information, a human behavior recognition unit for generating behavior recognition information by recognizing the actions of the person using the contextual information and recognizing a final action of the person using the behavior recognition information and behavior intention information, and a behavior intention inference unit for generating the behavior intention information based on context of action occurrence related to each of the actions of the person included in the behavior recognition information.
    Type: Application
    Filed: December 7, 2018
    Publication date: March 5, 2020
    Inventors: Do-Hyung KIM, Jin-Hyeok JANG, Jae-Hong KIM, Sung-Woong SHIN, Jae-Yeon LEE, Min-Su JANG
  • Publication number: 20190164548
    Abstract: Disclosed herein are an apparatus and method for determining the speech and motion properties of an interactive robot. The method for determining the speech and motion properties of an interactive robot includes receiving interlocutor conversation information including at least one of voice information and image information about an interlocutor that interacts with an interactive robot, extracting at least one of a verbal property and a nonverbal property of the interlocutor by analyzing the interlocutor conversation information, determining at least one of a speech property and a motion property of the interactive robot based on at least one of the verbal property, the nonverbal property, and context information inferred from a conversation between the interactive robot and the interlocutor, and controlling the operation of the interactive robot based on at least one of the determined speech property and motion property of the interactive robot.
    Type: Application
    Filed: August 13, 2018
    Publication date: May 30, 2019
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Young-Woo YOON, Jae-Hong KIM, Jae-Yeon LEE, Min-Su JANG
  • Patent number: 10029238
    Abstract: A method is disclosed for preparing a catalyst in which catalytically active materials are selectively impregnated or supported only in the surface region of the catalyst particle, using the mutual repulsive force between a hydrophobic solution and a hydrophilic solution and the difference in solubility of a metal salt precursor between the hydrophobic and hydrophilic solutions. The hydrophobic solvent is a C2-C6 alcohol. The hydrophobic solvent is introduced into the catalyst support and is then removed, by drying under appropriate conditions, from part of the pores connected to the outer part of the catalyst particle. A hydrophilic solution containing a metal salt is then introduced to occupy the void spaces from which the hydrophobic solvent was removed, and the catalyst particle is dried at a low rate so that the catalytically active material, or its precursor, is selectively supported or impregnated only in the outer part of the catalyst particle.
    Type: Grant
    Filed: November 4, 2016
    Date of Patent: July 24, 2018
    Assignee: INDUSTRY FOUNDATION OF CHONNAM NATIONAL UNIVERSITY
    Inventors: Chang Hyun Ko, Gyeong Ju Seo, Min Su Jang, Seong Mi Ahn
  • Patent number: 9931892
    Abstract: A tread kerf of a snow tire is capable of securing the performance of the tire on a dry road, a wet road, a snowy road, and an icy road, maintaining uniform rigidity of the block, and improving wear resistance. The tread kerf is formed in the outer surface of a tread block of the snow tire in a depth direction of the block in a shape of a polygonal wave extending along the outer surface of the block, wherein the entrance distance of the polygonal wave is less than the bottom distance of the polygonal wave such that side surfaces between the entrance and the bottom of the polygonal wave are formed as inclined surfaces, the distance between which decreases toward the entrance. The polygonal wave is moved in an advancing direction of the polygonal wave while being twisted in the depth direction of the block.
    Type: Grant
    Filed: August 17, 2015
    Date of Patent: April 3, 2018
    Assignee: HANKOOK TIRE CO., LTD.
    Inventors: Min Su Jang, Sang Tak Joo, Sung Hee Youn
  • Patent number: 9898681
    Abstract: An apparatus and method for detecting an object using a multi-directional integral image are disclosed. The apparatus includes an area segmentation unit, an integral image calculation unit, and an object detection unit. The area segmentation unit places windows having a size of x*y on a full image having w*h pixels so that they overlap each other at their edges, thereby segmenting the full image into a single area, a double area and a quadruple area. The integral image calculation unit calculates a single directional integral image for the single area, and calculates multi-directional integral images for the double and quadruple areas. The object detection unit detects an object for the full image using the single directional integral image and the multi-directional integral images.
    Type: Grant
    Filed: October 31, 2014
    Date of Patent: February 20, 2018
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Dae-Ha Lee, Cheon-Shu Park, Min-Su Jang, Jae-Hong Kim, Jong-Hyun Park
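The patent's multi-directional integral images for the overlapping double and quadruple areas are not reproduced here; the sketch below only shows the standard single-directional integral image (summed-area table) and the O(1) rectangle-sum query that such detection schemes build on.

```python
# Illustrative sketch of the standard single-directional integral image
# (summed-area table); the patent's multi-directional variants for the
# overlapping double/quadruple areas are not reproduced here.
import numpy as np

def integral_image(img):
    """ii[y, x] = sum of img[0:y, 0:x]; padded with a zero row and column."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def rect_sum(ii, top, left, height, width):
    """Sum of any axis-aligned rectangle in O(1) using four lookups."""
    b, r = top + height, left + width
    return ii[b, r] - ii[top, r] - ii[b, left] + ii[top, left]

img = np.arange(1, 26, dtype=np.int64).reshape(5, 5)    # 5x5 test image
ii = integral_image(img)
print(rect_sum(ii, top=1, left=1, height=2, width=3))   # 7+8+9+12+13+14 = 63
print(img[1:3, 1:4].sum())                              # same value, computed directly
```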
  • Patent number: 9861962
    Abstract: A method is disclosed for preparing a catalyst in which catalytically active materials are selectively impregnated or supported only in the surface region of the catalyst particle, using the mutual repulsive force between a hydrophobic solution and a hydrophilic solution and the difference in solubility of a metal salt precursor between the hydrophobic and hydrophilic solutions. The hydrophobic solvent is a C2-C6 alcohol. The hydrophobic solvent is introduced into the catalyst support and is then removed, by drying under appropriate conditions, from part of the pores connected to the outer part of the catalyst particle. A hydrophilic solution containing a metal salt is then introduced to occupy the void spaces from which the hydrophobic solvent was removed, and the catalyst particle is dried at a low rate so that the catalytically active material, or its precursor, is selectively supported or impregnated only in the outer part of the catalyst particle.
    Type: Grant
    Filed: August 23, 2014
    Date of Patent: January 9, 2018
    Assignee: INDUSTRY FOUNDATION OF CHONNAM NATIONAL UNIVERSITY
    Inventors: Chang Hyun Ko, Gyeong Ju Seo, Min Su Jang, Seong Mi Ahn
  • Publication number: 20170358243
    Abstract: Disclosed herein are an apparatus for writing a motion script and an apparatus and method for self-teaching of a motion. The method for self-teaching of a motion, in which the apparatus for writing a motion script and the apparatus for self-teaching of a motion are used, includes creating, by the apparatus for writing a motion script, a motion script based on expert motion of a first user; analyzing, by the apparatus for self-teaching of a motion, a motion of a second user, who learns the expert motion, based on the motion script; and outputting, by the apparatus for self-teaching of a motion, a result of analysis of the motion of the second user.
    Type: Application
    Filed: March 6, 2017
    Publication date: December 14, 2017
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Do-Hyung KIM, Min-Su JANG, Jae-Hong KIM, Young-Woo YOON, Jae-Il CHO
  • Publication number: 20170274535
    Abstract: The present disclosure herein relates to an interaction device capable of performing an interaction with a human, and more particularly, to an interaction device capable of performing an interaction with a plurality of participants. The interaction device includes a role classifying unit configured to classify a role for each of a plurality of participants based on an external stimulus signal for each of the plurality of participants and an action adjusting unit configured to perform different interaction operations for each of the plurality of participants based on the role for each of the plurality of participants. An interaction device according to an embodiment of the present application classifies the roles of a plurality of participants according to a participation degree and/or an action state and provides a customized interaction operation for each of the participants according to the classified roles. Therefore, it is possible to perform a natural interaction operation with a plurality of participants.
    Type: Application
    Filed: March 22, 2017
    Publication date: September 28, 2017
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Cheonshu PARK, Jae Hong KIM, Daeha LEE, Min Su JANG
  • Patent number: 9606543
    Abstract: Provided are a method and a control apparatus for cooperative cleaning using multiple cleaning robots. The method includes monitoring the overall cleaning condition of an extensive space and automatically assigning multiple cleaning robots to the areas that require cleaning. When a cleaning area is fixed based on the cooperative cleaning method, data on the amount of garbage generated in the cleaning area, or on the cleaning condition of the area, may be accumulated to facilitate easier management of the cleaning.
    Type: Grant
    Filed: April 16, 2014
    Date of Patent: March 28, 2017
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Seo Hyun Jeon, Min Su Jang, Dae Ha Lee, Chang Eun Lee, Hyun Ja Im, Young Jo Cho, Jae Hong Kim, Jong Hyun Park
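As an illustration of assigning robots to areas that need cleaning (a greedy stand-in, not the patented control method), the sketch below sends the nearest idle robot to each area, dirtiest first; the robot positions, area names, and dirt scores are hypothetical.

```python
# Illustrative sketch only -- a greedy stand-in for the patented assignment logic.
# The controller monitors which areas need cleaning and sends the nearest idle
# robot to each, handling the dirtiest area first.

def assign_robots(robots, dirty_areas):
    """robots: name -> (x, y); dirty_areas: name -> {'pos': (x, y), 'dirt': float}."""
    idle = dict(robots)
    plan = {}
    for area, info in sorted(dirty_areas.items(), key=lambda kv: -kv[1]["dirt"]):
        if not idle:
            break
        ax, ay = info["pos"]
        nearest = min(idle, key=lambda r: (idle[r][0] - ax) ** 2 + (idle[r][1] - ay) ** 2)
        plan[area] = nearest
        del idle[nearest]
    return plan

robots = {"R1": (0, 0), "R2": (10, 0), "R3": (5, 8)}
dirty = {
    "lobby":   {"pos": (1, 1), "dirt": 0.9},
    "hallway": {"pos": (9, 1), "dirt": 0.6},
    "office":  {"pos": (5, 7), "dirt": 0.2},
}
print(assign_robots(robots, dirty))  # e.g. {'lobby': 'R1', 'hallway': 'R2', 'office': 'R3'}
```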
  • Publication number: 20170072385
    Abstract: A method is disclosed for preparing a catalyst in which catalytically active materials are selectively impregnated or supported only in the surface region of the catalyst particle, using the mutual repulsive force between a hydrophobic solution and a hydrophilic solution and the difference in solubility of a metal salt precursor between the hydrophobic and hydrophilic solutions. The hydrophobic solvent is a C2-C6 alcohol. The hydrophobic solvent is introduced into the catalyst support and is then removed, by drying under appropriate conditions, from part of the pores connected to the outer part of the catalyst particle. A hydrophilic solution containing a metal salt is then introduced to occupy the void spaces from which the hydrophobic solvent was removed, and the catalyst particle is dried at a low rate so that the catalytically active material, or its precursor, is selectively supported or impregnated only in the outer part of the catalyst particle.
    Type: Application
    Filed: November 4, 2016
    Publication date: March 16, 2017
    Inventors: Chang Hyun KO, Gyeong Ju SEO, Min Su JANG, Seong Mi AHN
  • Patent number: D782904
    Type: Grant
    Filed: December 9, 2015
    Date of Patent: April 4, 2017
    Assignee: Daelim Industrial Co., Ltd.
    Inventors: Tae Heoui Choi, Min Su Jang