Patents by Inventor Jin-Xia Huang

Jin-Xia Huang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150227510
    Abstract: The present invention relates to a translation function and discloses an automatic translation operating device, including: at least one of a voice input device, which collects voice signals input by a plurality of speakers, and a communication module, which receives the voice signals; and a control unit, which classifies the voice signals by speaker, clusters the classified speaker-based voice signals in accordance with a predefined condition, and then performs voice recognition and translation. A corresponding method and a system including the device are also disclosed.
    Type: Application
    Filed: January 28, 2015
    Publication date: August 13, 2015
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Jong Hun SHIN, Ki Young LEE, Young Ae SEO, Jin Xia HUANG, Sung Kwon CHOI, Yun JIN, Chang Hyun KIM, Seung Hoon NA, Yoon Hyung ROH, Oh Woog KWON, Sang Keun JUNG, Eun Jin PARK, Kang Il KIM, Young Kil KIM, Sang Kyu PARK
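The classify-cluster-translate pipeline this abstract describes can be sketched roughly as follows. This is a minimal illustration, not the patented implementation: the `(speaker_id, signal)` segment format and the `recognize`/`translate` callables are assumptions standing in for the device's diarization, recognition, and translation components.

```python
from collections import defaultdict

def cluster_by_speaker(segments):
    """Group (speaker_id, signal) pairs into per-speaker clusters."""
    clusters = defaultdict(list)
    for speaker_id, signal in segments:
        clusters[speaker_id].append(signal)
    return dict(clusters)

def recognize_and_translate(clusters, recognize, translate):
    """Run recognition, then translation, on each speaker's clustered signals."""
    results = {}
    for speaker_id, signals in clusters.items():
        text = " ".join(recognize(s) for s in signals)
        results[speaker_id] = translate(text)
    return results
```

Clustering before recognition lets each speaker's utterances be recognized and translated as one coherent stream rather than as interleaved fragments.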
  • Publication number: 20150199340
    Abstract: The present invention relates to a system for translating a language based on a user's reaction. The system includes an interface unit, which inputs uttered sentences of a first user and a second user and outputs the translated result; a translating unit, which translates the uttered sentences of the first and second users; a conversation intention recognizing unit, which determines the conversation intention of the second user from the second user's reply to the translation of the first user's utterance; a translation result evaluating unit, which evaluates the translation of the first user's uttered sentence based on the conversation intention determined by the conversation intention recognizing unit; and a translation result evaluation storing unit, which stores the translation result and its evaluation.
    Type: Application
    Filed: January 12, 2015
    Publication date: July 16, 2015
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Oh Woog KWON, Young Kil KIM, Chang Hyun KIM, Seung Hoon NA, Yoon Hyung ROH, Ki Young LEE, Sang Keun JUNG, Sung Kwon CHOI, Yun JIN, Eun Jin PARK, Jong Hun SHIN, Jin Xia HUANG, Kang Il KIM, Young Ae SEO, Sang Kyu PARK
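The reaction-based evaluation loop can be sketched as below. The intention labels and the mapping from intention to score are purely illustrative assumptions; the patent does not disclose a specific label set or scoring scheme.

```python
# Hypothetical coarse conversation-intention labels.
POSITIVE_INTENTIONS = {"answer", "confirm", "acknowledge"}
NEGATIVE_INTENTIONS = {"clarify_request", "repeat_request"}

def evaluate_translation(reply_intention):
    """Score a translation from the listener's conversational reaction."""
    if reply_intention in POSITIVE_INTENTIONS:
        return "good"   # reply engages the content: translation likely understood
    if reply_intention in NEGATIVE_INTENTIONS:
        return "poor"   # listener asked for a repeat: translation likely failed
    return "unknown"

translation_log = []

def store_evaluation(source, translation, reply_intention):
    """Persist the translation together with its reaction-based evaluation."""
    entry = (source, translation, evaluate_translation(reply_intention))
    translation_log.append(entry)
    return entry
```

The stored (translation, evaluation) pairs are what would let such a system learn which translations succeed in real conversations.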
  • Publication number: 20150193410
    Abstract: The present invention relates to a system for editing a text of a portable terminal and a method thereof, and more particularly to a technology which edits a text which is input into a portable terminal based on a touch interface. An exemplary embodiment of the present invention provides a text editing system of a portable terminal, including: an interface unit which inputs or outputs a text or voice; a text generating unit which generates the input text or voice as a text; a control unit which provides a keyboard based editing screen or a character recognition based editing screen for the generated text through the interface unit; and a text editing unit which performs an editing command which is input from a user through the keyboard based editing screen or the character recognition based editing screen under the control of the control unit.
    Type: Application
    Filed: September 12, 2014
    Publication date: July 9, 2015
    Inventors: Yun JIN, Chang Hyun KIM, Young Ae SEO, Jin Xia HUANG, Oh Woog KWON, Seung Hoon NA, Yoon Hyung ROH, Ki Young LEE, Sang Keun JUNG, Sung Kwon CHOI, Jong Hun SHIN, Eun Jin PARK, Kang Il KIM, Young Kil KIM, Sang Kyu PARK
  • Publication number: 20150127361
    Abstract: Disclosed are an automatic translation apparatus and method capable of optimizing limited translation knowledge in a database mounted in a portable mobile communication terminal, obtaining translation knowledge from external servers in order to provide translation knowledge appropriate for respective users, and effectively updating the database mounted in the terminal.
    Type: Application
    Filed: October 2, 2014
    Publication date: May 7, 2015
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Jong-Hun SHIN, Chang-Hyun KIM, Oh-Woog KWON, Ki-Young LEE, Young-Ae SEO, Sung-Kwon CHOI, Yun JIN, Eun-Jin PARK, Jin-Xia HUANG, Seung-Hoon NA, Yoon-Hyung ROH, Sang-Keun JUNG, Young-Kil KIM, Sang-Kyu PARK
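One plausible reading of "optimizing limited translation knowledge in a database mounted in a terminal" is a bounded on-device cache that back-fills misses from a server and evicts stale entries. The least-recently-used policy below is an assumption for illustration; the patent does not specify the eviction strategy.

```python
from collections import OrderedDict

class OnDeviceTranslationDB:
    """Bounded phrase dictionary: misses are fetched from an external
    knowledge source, and the least recently used entry is evicted."""
    def __init__(self, capacity, fetch_from_server):
        self.capacity = capacity
        self.fetch = fetch_from_server   # callable: source phrase -> translation
        self.entries = OrderedDict()

    def lookup(self, phrase):
        if phrase in self.entries:
            self.entries.move_to_end(phrase)    # mark as recently used
            return self.entries[phrase]
        translation = self.fetch(phrase)        # obtain knowledge externally
        self.entries[phrase] = translation
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)    # evict least recently used
        return translation
```

Keeping only recently used entries is one way a small on-device database can stay appropriate for its particular user.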
  • Publication number: 20140297257
    Abstract: Disclosed herein is a motion sensor-based portable automatic interpretation apparatus and control method thereof, which can precisely detect the start time and the end time of utterance of a user in a portable automatic interpretation system, thus improving the quality of the automatic interpretation system. The motion sensor-based portable automatic interpretation apparatus includes a motion sensing unit for sensing a motion of the portable automatic interpretation apparatus. An utterance start time detection unit detects an utterance start time based on an output signal of the motion sensing unit. An utterance end time detection unit detects an utterance end time based on an output signal of the motion sensing unit after the utterance start time has been detected.
    Type: Application
    Filed: October 29, 2013
    Publication date: October 2, 2014
    Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Jong-Hun SHIN, Young-Kil KIM, Chang-Hyun KIM, Young-Ae SEO, Seong-Il YANG, Jin-Xia HUANG, Seung-Hoon NA, Oh-Woog KWON, Ki-Young LEE, Yoon-Hyung ROH, Sung-Kwon CHOI, Sang-Keun JUNG, Yun JIN, Eun-Jin PARK, Sang-Kyu PARK
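The start/end detection from motion can be sketched as a simple threshold crossing on a stream of motion magnitudes (e.g. the phone raised toward the mouth, then lowered). The fixed threshold and the scalar-magnitude input are illustrative assumptions.

```python
def detect_utterance_span(motion_samples, threshold=0.5):
    """Return (start_index, end_index) of the first span where motion
    magnitude rises above, then falls back below, the threshold.
    Returns None if no utterance gesture is detected."""
    start = end = None
    for i, magnitude in enumerate(motion_samples):
        if start is None and magnitude >= threshold:
            start = i                     # utterance start: motion begins
        elif start is not None and magnitude < threshold:
            end = i                       # utterance end: motion subsides
            break
    if start is None:
        return None
    if end is None:
        end = len(motion_samples)         # still moving at end of buffer
    return (start, end)
```

Gating recognition on this span is what spares the user from pressing start/stop buttons around each utterance.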
  • Publication number: 20130346060
    Abstract: Disclosed herein are a translation interfacing apparatus and method using vision tracking. The translation interfacing apparatus includes a vision tracking unit, a comparison unit, a sentence detection unit, a sentence translation unit, and a sentence output unit. The vision tracking unit tracks a user's eyes based on one or more images input via the camera of a portable terminal, and extracts time information about a period for which the user's eyes have been fixed and location information about a location on which the user's eyes are focused. The comparison unit compares the time information with a preset eye fixation period. The sentence detection unit detects a sentence corresponding to the location information if the time information is equal to or longer than the eye fixation period. The sentence translation unit translates the detected sentence.
    Type: Application
    Filed: June 6, 2013
    Publication date: December 26, 2013
    Inventors: Jong-Hun Shin, Young-Ae Seo, Seong-Il Yang, Jin-Xia Huang, Chang-Hyun Kim, Young-Kil Kim
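The core gating logic — translate a sentence only when the eyes have stayed fixed on it long enough — can be sketched as below. The rectangular sentence regions and the millisecond threshold are illustrative assumptions, not the patent's actual representation.

```python
def select_sentence(fixation_ms, location, sentence_regions, min_fixation_ms=800):
    """Return the sentence whose on-screen region contains the gaze
    location, but only if the eyes stayed fixed at least min_fixation_ms."""
    if fixation_ms < min_fixation_ms:
        return None                       # fixation too short: no intent to read
    x, y = location
    for (x0, y0, x1, y1), sentence in sentence_regions:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return sentence               # gaze falls inside this sentence's box
    return None
```

The detected sentence would then be handed to the translation unit and its result shown by the output unit.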
  • Publication number: 20120156660
    Abstract: A dialogue system includes a learning initiation unit, which receives a conversation education domain and a target completion condition in the conversation education domain and receives the user's utterance; a voice recognition unit, which converts the user's utterance into utterance text based on utterance information; a language understanding unit, which determines the user's dialogue act from the converted utterance text and generates a logical expression using slot expressions corresponding to the determined dialogue act and slot expressions defined in the conversation education domain; a dialogue/progress management unit, which selects the utterance vertex whose logical expression is most similar to the utterance patterns of the utterance vertices connected to the system's final utterance vertex in a dynamic dialogue graph and determines the utterance vertices connected to the selected vertex as the next utterance; and a system dialogue generation unit, which retrieves the utterance patterns connected to the utterance vertex corresponding to the determined next utterance …
    Type: Application
    Filed: December 15, 2011
    Publication date: June 21, 2012
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Oh Woog KWON, Sung Kwon CHOI, Ki Young LEE, Yoon Hyung ROH, Young Kil KIM, Eun Jin PARK, Yun JIN, Chang Hyun KIM, Young Ae SEO, Seong Il YANG, Jin Xia HUANG, Jong Hun SHIN, Yun Keun LEE, Sang Kyu PARK
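The next-utterance selection over the dialogue graph can be sketched as a similarity search over candidate vertices. Jaccard overlap on slot expressions is a simplified stand-in for the patent's logical-expression matching; the `(pattern_slots, system_reply)` vertex format is an assumption.

```python
def jaccard(a, b):
    """Set-overlap similarity between two collections of slot expressions."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def next_system_utterance(user_slots, candidate_vertices):
    """Pick the candidate vertex whose slot pattern best matches the
    user's logical expression; candidates are (pattern_slots, reply)
    pairs connected to the system's last utterance vertex."""
    best = max(candidate_vertices, key=lambda v: jaccard(user_slots, v[0]))
    return best[1]
```

Because the graph is dynamic, the candidate set changes as the conversation advances along the education domain.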
  • Publication number: 20120150529
    Abstract: A method and an apparatus for generating a translation knowledge server based on translation knowledge collected in real time are provided. The apparatus may include: a data collector, which collects initial translation knowledge data; a data analyzer, which performs morphological analysis and syntactic analysis on the initial translation knowledge data received from the data collector and outputs the analyzed data; and a translation knowledge learning unit, which learns real-time translation knowledge by determining a target word for each domain from the analyzed data based on predetermined domain information, or by determining a domain through automatic clustering. According to the present invention, it is possible to obtain translation knowledge by analyzing, in real time, documents present on the web or provided by a user, and to improve the quality of translation by applying the obtained translation knowledge to a translation engine.
    Type: Application
    Filed: December 9, 2011
    Publication date: June 14, 2012
    Applicant: Electronics and Telecommunications Research Institute
    Inventors: Chang Hyun KIM, Young Ae Seo, Seong Il Yang, Jin Xia Huang, Sung Kwon Choi, Yoon Hyung Roh, Ki Young Lee, Oh Woog Kwon, Yun Jin, Eun Jin Park, Jong Hun Shin, Young Kil Kim, Sang Kyu Park
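The "target word for each domain" learning step can be sketched with simple frequency counting: for each (domain, source word) pair observed in analyzed documents, keep the most frequent target word. This is a crude stand-in for the patent's learning unit, shown only to make the idea concrete.

```python
from collections import defaultdict, Counter

def learn_target_words(analyzed_observations):
    """From (domain, source_word, target_word) observations, keep the
    most frequent target word per (domain, source_word) key."""
    counts = defaultdict(Counter)
    for domain, src, tgt in analyzed_observations:
        counts[(domain, src)][tgt] += 1
    return {key: ctr.most_common(1)[0][0] for key, ctr in counts.items()}
```

The payoff of domain conditioning is that the same source word can map to different target words in different domains.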
  • Publication number: 20080133218
    Abstract: The present invention performs machine translation by matching fragments of a source language sentence to be translated to source language portions of an example in example base. When all relevant examples have been identified in the example base, the examples are subjected to phrase alignment in which fragments of the target language sentence in each example are aligned against the matched fragments of the source language sentence in the same example. A translation component then substitutes the aligned target language phrases from the matched examples for the matched fragments in the source language sentence.
    Type: Application
    Filed: November 6, 2007
    Publication date: June 5, 2008
    Applicant: Microsoft Corporation
    Inventors: Ming Zhou, Jin-Xia Huang, Chang Ning (Tom) Huang, Wei Wang
  • Patent number: 7353165
    Abstract: The present invention performs machine translation by matching fragments of a source language sentence to be translated to source language portions of an example in example base. When all relevant examples have been identified in the example base, the examples are subjected to phrase alignment in which fragments of the target language sentence in each example are aligned against the matched fragments of the source language sentence in the same example. A translation component then substitutes the aligned target language phrases from the matched examples for the matched fragments in the source language sentence.
    Type: Grant
    Filed: June 28, 2002
    Date of Patent: April 1, 2008
    Assignee: Microsoft Corporation
    Inventors: Ming Zhou, Jin-Xia Huang, Chang Ning (Tom) Huang, Wei Wang
  • Publication number: 20040002848
    Abstract: The present invention performs machine translation by matching fragments of a source language sentence to be translated to source language portions of an example in example base. When all relevant examples have been identified in the example base, the examples are subjected to phrase alignment in which fragments of the target language sentence in each example are aligned against the matched fragments of the source language sentence in the same example. A translation component then substitutes the aligned target language phrases from the matched examples for the matched fragments in the source language sentence.
    Type: Application
    Filed: June 28, 2002
    Publication date: January 1, 2004
    Inventors: Ming Zhou, Jin-Xia Huang, Chang Ning Huang, Wei Wang
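The example-based approach shared by the three Microsoft entries above — match source fragments against an example base, then substitute the aligned target phrases — can be sketched as a greedy longest-match scan. The pre-aligned fragment dictionary is a simplification; the actual patents describe matching full bilingual examples and aligning phrases within them.

```python
def translate_ebmt(source_tokens, example_base):
    """Greedy example-based translation: scan left to right, match the
    longest source fragment present in the example base, and emit its
    aligned target phrase; unmatched tokens pass through unchanged."""
    output, i = [], 0
    while i < len(source_tokens):
        match = None
        # Try the longest fragment starting at position i first.
        for j in range(len(source_tokens), i, -1):
            fragment = tuple(source_tokens[i:j])
            if fragment in example_base:
                match = (fragment, example_base[fragment])
                break
        if match:
            output.append(match[1])
            i += len(match[0])
        else:
            output.append(source_tokens[i])   # no example covers this token
            i += 1
    return " ".join(output)
```

Preferring longer fragments keeps more of each example's context intact, which is the main quality lever in example-based machine translation.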