Patents by Inventor Bu Ru CHANG

Bu Ru CHANG has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240087294
    Abstract: Systems and methods for domain generalization configured in accordance with some embodiments of the invention are illustrated. One embodiment includes a method for domain generalization of a machine learning model. The method sets a parameter of a first model and a parameter of a second model based on a pre-trained model. The method learns the second model by performing a predetermined task on a source domain. The method estimates an unobservable gradient for model updates on an unseen domain based on: the parameter of the first model, and the parameter of the second model. The method updates the first model based on the estimated unobservable gradient.
    Type: Application
    Filed: September 13, 2023
    Publication date: March 14, 2024
    Applicant: Hyperconnect LLC
    Inventors: Bu Ru Chang, Byoung Gyu Lew, Dong Hyun Son
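The abstract above describes estimating a gradient for an unseen domain from the parameters of two models that share a pre-trained initialization. A minimal toy sketch of that idea, assuming the estimate is a scaled parameter difference (the function names, scaling, and numbers are ours, not the patent's):

```python
# Hypothetical sketch: both models start from the same pre-trained weights;
# the second is fine-tuned on the source domain, and the gap between the two
# parameter sets serves as an estimated "unobservable" gradient for the first.

def estimate_unobservable_gradient(first_params, second_params, scale=1.0):
    """Estimate a gradient as a scaled difference between the two models."""
    return [scale * (p1 - p2) for p1, p2 in zip(first_params, second_params)]

def update_first_model(first_params, est_grad, lr=0.1):
    """Apply one gradient-descent step to the first model's parameters."""
    return [p - lr * g for p, g in zip(first_params, est_grad)]

pretrained = [0.5, -0.2, 1.0]
first = list(pretrained)      # first model keeps the pre-trained weights
second = [0.7, -0.1, 0.8]     # second model after source-domain training

grad = estimate_unobservable_gradient(first, second)
first = update_first_model(first, grad, lr=0.1)
```

With this choice of estimate, the update moves the first model a small step toward the source-trained second model.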
  • Publication number: 20230281457
    Abstract: A method for training a training dynamics prediction model comprising acquiring classification information on training data included in a first dataset based on a classification model, acquiring target training dynamics information based on the classification information and a set of one or more classification information acquired based on the classification model in one or more previous epochs, acquiring predictive training dynamics information on the training data based on the training dynamics prediction model, and training the training dynamics prediction model based on the target training dynamics information and the predictive training dynamics information is disclosed.
    Type: Application
    Filed: February 23, 2023
    Publication date: September 7, 2023
    Applicant: Hyperconnect LLC
    Inventors: Seong Min Kye, Kwang Hee Choi, Bu Ru Chang
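The abstract above builds a training target from classification information accumulated over previous epochs and scores a prediction model against it. A toy sketch, assuming (as one simple choice, not the patent's) that the target is the mean per-epoch confidence and the loss is squared error:

```python
# Illustrative sketch: aggregate a training example's per-epoch classification
# confidences into target training dynamics, then score a predicted value.

def target_dynamics(confidence_history):
    """Mean of confidences from previous epochs plus the current one."""
    return sum(confidence_history) / len(confidence_history)

def prediction_loss(predicted, target):
    """Squared error between predictive and target training dynamics."""
    return (predicted - target) ** 2

history = [0.2, 0.5, 0.8]        # confidences for one example over three epochs
target = target_dynamics(history)
loss = prediction_loss(0.4, target)
```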
  • Publication number: 20230229864
    Abstract: Provided is an apparatus for evaluating and improving responses, and a method and a computer readable recording medium thereof. The apparatus for evaluating responses according to the present disclosure obtains cluster classification information for training responses and, based on the distribution of the clusters to which test responses output from the dialogue generation model are classified, evaluates the semantic diversity of the responses output from the dialogue generation model.
    Type: Application
    Filed: October 31, 2022
    Publication date: July 20, 2023
    Applicant: Hyperconnect Inc.
    Inventors: Beom Su Kim, Bu Ru Chang, Seung Ju Han
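This abstract (and the near-identical one in the next entry) evaluates semantic diversity from the distribution of clusters that test responses fall into. A minimal sketch, assuming cluster assignments are already given and using entropy as one plausible diversity score (the patent does not specify entropy; that choice is ours):

```python
import math

# Sketch: responses spread evenly across clusters score as more semantically
# diverse (higher entropy) than responses concentrated in one cluster.

def semantic_diversity(cluster_labels):
    """Entropy of the cluster distribution over test responses."""
    counts = {}
    for label in cluster_labels:
        counts[label] = counts.get(label, 0) + 1
    total = len(cluster_labels)
    return -sum((n / total) * math.log(n / total) for n in counts.values())

low = semantic_diversity(["greet"] * 4)                     # one cluster only
high = semantic_diversity(["greet", "ask", "joke", "bye"])  # four clusters
```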
  • Publication number: 20230229964
    Abstract: Provided is an apparatus for evaluating and improving responses, and a method and a computer readable recording medium thereof. The apparatus for evaluating responses according to the present disclosure obtains cluster classification information for training responses and, based on the distribution of the clusters to which test responses output from the dialogue generation model are classified, evaluates the semantic diversity of the responses output from the dialogue generation model.
    Type: Application
    Filed: October 31, 2022
    Publication date: July 20, 2023
    Applicant: Hyperconnect Inc.
    Inventors: Beom Su Kim, Bu Ru Chang, Seung Ju Han
  • Publication number: 20230154453
    Abstract: Systems and techniques to generate imitative responses are illustrated. A response generation method performed in an electronic apparatus of the present disclosure includes acquiring at least one piece of utterance data, acquiring a first context corresponding to the utterance data from a context candidate set, generating one or more dialogue sets including the first context and the utterance data, receiving a second context from a user, and acquiring a response corresponding to the second context using a language model based on the one or more dialogue sets.
    Type: Application
    Filed: October 13, 2022
    Publication date: May 18, 2023
    Applicant: Hyperconnect Inc.
    Inventors: Enkhbayar Erdenee, Beom Su Kim, Sang Bum Kim, Seok Jun Seo, Jin Yong Yoo, Bu Ru Chang, Seung Ju Han
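The abstract above pairs an utterance with a context retrieved from a candidate set to build dialogue sets. A toy sketch of the retrieval step, assuming word overlap as the matching criterion (a stand-in for whatever retriever the patent contemplates):

```python
# Hypothetical sketch: pick the candidate context with the highest word
# overlap with the utterance, then form a (context, utterance) dialogue pair
# that could seed a language-model prompt.

def best_context(utterance, candidates):
    words = set(utterance.lower().split())
    return max(
        candidates,
        key=lambda c: len(words & {w.strip("?.!,") for w in c.lower().split()}),
    )

utterance = "I love hiking in the mountains"
candidates = ["Do you like the mountains?", "What did you eat today?"]
context = best_context(utterance, candidates)
dialogue_set = [(context, utterance)]
```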
  • Publication number: 20230135163
    Abstract: Provided is a method for training a model, including generating a plurality of attention maps by inputting training data into a previously trained teacher model, generating a set of attention weights of the teacher model based on the plurality of attention maps, generating a set of attention weights of a student model by inputting the training data into the student model, calculating a value of a first loss function based on the set of attention weights of the teacher model and the set of attention weights of the student model, calculating a value of a second loss function according to an inference of the student model with respect to the training data, and training the student model based on the value of the first loss function and the value of the second loss function.
    Type: Application
    Filed: September 20, 2022
    Publication date: May 4, 2023
    Applicant: Hyperconnect Inc.
    Inventors: Jacob Richard Morton, Martin Kersner, Sang Il Ahn, Bu Ru Chang, Kwang Hee Choi
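The abstract above combines two losses: one matching the student's attention weights to the teacher's, and one scoring the student's own inference. A minimal numeric sketch, assuming mean-squared differences and an equal weighting (the function names and the weighting scheme are ours):

```python
# Illustrative sketch of the two-part distillation loss.

def attention_loss(teacher_attn, student_attn):
    """Mean squared difference between teacher and student attention weights."""
    return sum((t - s) ** 2 for t, s in zip(teacher_attn, student_attn)) / len(teacher_attn)

def task_loss(prediction, label):
    """Loss from the student's own inference on the training data."""
    return (prediction - label) ** 2

def total_loss(teacher_attn, student_attn, prediction, label, alpha=0.5):
    """Train the student on a weighted sum of the two loss values."""
    return alpha * attention_loss(teacher_attn, student_attn) + (1 - alpha) * task_loss(prediction, label)

loss = total_loss([0.7, 0.2, 0.1], [0.5, 0.3, 0.2], prediction=0.8, label=1.0)
```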
  • Publication number: 20230080930
    Abstract: Disclosed is a method of training a dialogue model in an electronic device, the method including selecting a first context from a first dialogue data set including at least one pair of a context and a response corresponding to the context, generating a first response corresponding to the first context through a first dialogue model, generating an augmented dialogue dataset by incorporating a pair of the first context and the first response corresponding to the first context into the first dialogue data set, and training a second dialogue model based on the augmented dialogue dataset.
    Type: Application
    Filed: June 17, 2022
    Publication date: March 16, 2023
    Applicant: Hyperconnect Inc.
    Inventors: Seok Jun Seo, Seung Ju Han, Beom Su Kim, Bu Ru Chang, Enkhbayar Erdenee
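The abstract above grows a dialogue dataset by generating responses with one model and folding the new context-response pairs back in before training a second model. A toy sketch of the augmentation loop, with a dictionary lookup standing in for the first dialogue model:

```python
# Hypothetical sketch: the "first dialogue model" here is just a lookup table
# with a fallback; real models would be neural response generators.

def generate_response(model, context):
    return model.get(context, "I see.")

def augment(dialogue_set, first_model, contexts):
    """Add (context, generated response) pairs to the original dataset."""
    augmented = list(dialogue_set)
    for context in contexts:
        augmented.append((context, generate_response(first_model, context)))
    return augmented

first_model = {"How are you?": "Doing well, thanks!"}
original = [("Hello!", "Hi there!")]
augmented = augment(original, first_model, ["How are you?"])
```

A second dialogue model would then be trained on `augmented` rather than `original`.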
  • Publication number: 20230077528
    Abstract: A training method of a conversation model according to various example embodiments of the present disclosure may include identifying a first context, identifying a first response set corresponding to the first context based on a first model, identifying a response subset selected from the first response set based on a gold response corresponding to the first context, and training a second model based on the first context and the response subset.
    Type: Application
    Filed: June 2, 2022
    Publication date: March 16, 2023
    Applicant: Hyperconnect Inc.
    Inventors: Enkhbayar Erdenee, Beom Su Kim, Seok Jun Seo, Sang Il Ahn, Bu Ru Chang, Seung Ju Han
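The abstract above selects a response subset from a candidate set using a gold response. A toy sketch of the selection step, assuming word overlap with the gold response as the filter (the threshold and similarity measure are ours, not the patent's):

```python
# Hypothetical sketch: keep candidate responses whose word overlap with the
# gold response exceeds a threshold; a second model would train on this subset.

def select_subset(candidates, gold, threshold=0.5):
    gold_words = set(gold.lower().split())
    subset = []
    for response in candidates:
        overlap = len(set(response.lower().split()) & gold_words) / len(gold_words)
        if overlap >= threshold:
            subset.append(response)
    return subset

gold = "sure see you at noon"
candidates = ["sure see you then", "no idea what you mean", "see you at noon"]
subset = select_subset(candidates, gold)
```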
  • Publication number: 20220207368
    Abstract: A method of training a neural network model for predicting a click-through rate (CTR) of a user in an electronic device includes normalizing an embedding vector on the basis of a feature-wise linear transformation parameter, and inputting the normalized embedding vector into a neural network layer, wherein the feature-wise linear transformation parameter is defined such that the same value is applied to all elements of the embedding vector.
    Type: Application
    Filed: December 29, 2021
    Publication date: June 30, 2022
    Applicant: Hyperconnect, Inc.
    Inventors: Sang Il Ahn, Joon Young Yi, Beom Su Kim, Bu Ru Chang
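The abstract above normalizes an embedding vector with a feature-wise linear transformation whose parameter applies the same value to every element of the vector. A minimal sketch of that uniform scale-and-shift (the parameter values are illustrative):

```python
# Illustrative sketch: one scalar (scale, shift) pair per feature is applied
# uniformly to all elements of that feature's embedding vector before it is
# fed into the neural network layer.

def normalize_embedding(embedding, scale, shift):
    """Apply the same scalar scale and shift to every element."""
    return [scale * x + shift for x in embedding]

embedding = [0.4, -0.2, 1.0]     # embedding vector for one feature
normalized = normalize_embedding(embedding, scale=2.0, shift=0.1)
```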
  • Publication number: 20210202047
    Abstract: The present disclosure provides a new drug candidate material output apparatus, including: a communication module; a memory in which a new drug candidate material output program is stored; and a processor executing the new drug candidate material output program. The new drug candidate material output program provides a drug learning model in which an embedding vector for a chemical structure of a chemical compound and an embedding vector for change information on an amount of a transcriptome induced by each chemical compound are located in a same vector space, outputs a result of the change information on the amount of the transcriptome that matches the embedding vector for the chemical structure of the new material input to the drug learning model, or outputs information on one or more drugs that match the change information on the amount of the transcriptome that is a target input to the drug learning model.
    Type: Application
    Filed: December 31, 2020
    Publication date: July 1, 2021
    Applicant: Korea University Research and Business Foundation
    Inventors: Jaewoo KANG, Min Ji JEON, Bu Ru CHANG, Jung Soo PARK, Sung Joon PARK, Sun Kyu KIM
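The abstract above places chemical-structure embeddings and transcriptome-change embeddings in the same vector space so one can be matched against the other. A toy sketch of the matching step, assuming nearest-neighbor search by Euclidean distance over made-up two-dimensional embeddings:

```python
import math

# Hypothetical sketch: a new compound's structure embedding is matched to the
# closest transcriptome-change embedding in the shared vector space.

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_match(query_embedding, transcriptome_embeddings):
    """Return the label of the transcriptome profile closest to the query."""
    return min(
        transcriptome_embeddings,
        key=lambda name: euclidean(query_embedding, transcriptome_embeddings[name]),
    )

transcriptome_embeddings = {
    "profile_A": [1.0, 0.0],
    "profile_B": [0.0, 1.0],
}
new_material = [0.9, 0.1]        # embedding of the new compound's structure
match = nearest_match(new_material, transcriptome_embeddings)
```

The reverse query in the abstract (target transcriptome change in, matching drugs out) would run the same search with the roles of the two embedding sets swapped.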