Patents by Inventor Kyogu Lee

Kyogu Lee has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230129425
    Abstract: An example method involves comparing a primary element of a first piece of audio data to a primary element of a second piece of audio data; based on the comparing of the primary elements, determining that the first and second pieces of audio data have the same predominant mood category; in response to determining that the first and second pieces of audio data have the same predominant mood category, comparing a first mood score of the primary element of the first piece of audio data to a second mood score of the primary element of the second piece of audio data; determining that an output of the comparison of the two mood scores exceeds a threshold value; and in response to determining that the output of the comparison of the two mood scores exceeds the threshold value, providing an indicator to an application. (An illustrative code sketch follows this entry.)
    Type: Application
    Filed: December 22, 2022
    Publication date: April 27, 2023
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
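    The abstract above reduces to a thresholded comparison: confirm that two pieces of audio share a predominant mood category, then compare their mood scores and signal an application when that comparison clears a threshold. A minimal sketch of that flow follows; the data layout, the ratio-based comparison, and the 0.8 threshold are assumptions, not details taken from the patent.

    ```python
    # Minimal sketch, assuming normalized scores and a ratio-based comparison;
    # the dataclass, the 0.8 threshold, and the print-out are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class PrimaryElement:
        mood_category: str   # predominant mood category, e.g. "melancholy"
        mood_score: float    # strength of that category, assumed to lie in [0, 1]

    def moods_congruent(a: PrimaryElement, b: PrimaryElement, threshold: float = 0.8) -> bool:
        """Compare two primary elements and report whether an indicator should be provided."""
        if a.mood_category != b.mood_category:
            return False                                   # not the same predominant mood category
        # Assumed comparison output: ratio of the smaller mood score to the larger one.
        ratio = min(a.mood_score, b.mood_score) / (max(a.mood_score, b.mood_score) or 1.0)
        return ratio > threshold                           # exceeds threshold -> signal congruency

    if moods_congruent(PrimaryElement("melancholy", 0.85), PrimaryElement("melancholy", 0.80)):
        print("providing an indicator to the application")  # placeholder for the real hand-off
    ```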
  • Patent number: 11580120
    Abstract: Examples described herein may perform various operations based on mood congruency. An example method involves accessing, by a processor, from a database, a score that represents a degree of congruency between a first mood vector that describes first media data and a second mood vector that describes second media data, wherein the score is generated based on (i) a first value that the first mood vector associates with a first mood, (ii) a second value that the second mood vector associates with a second mood, and (iii) a degree of congruency between the first and second moods; based on the score, comparing, by the processor, a first characteristic of the first media data, other than the first mood, with a second characteristic of the second media data, other than the second mood; and, based at least in part on an output of the comparing, providing an indicator to a module. (An illustrative code sketch follows this entry.)
    Type: Grant
    Filed: November 12, 2021
    Date of Patent: February 14, 2023
    Assignee: Gracenote, Inc.
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
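    The abstract above follows a simple flow: look up a stored mood-congruency score for two pieces of media and, only when that score is high enough, compare a characteristic other than mood before signaling a module. The sketch below uses tempo as the non-mood characteristic, with a dict standing in for the database; the lookup tables and both cutoffs are assumptions, not the patent's details.

    ```python
    # Hedged sketch of the flow in the abstract above; the dict-backed "database",
    # the use of tempo as the non-mood characteristic, and both cutoffs are assumptions.
    CONGRUENCY_DB = {("track_a", "track_b"): 0.72}    # stored mood-vector congruency scores
    TEMPO_BPM = {"track_a": 118.0, "track_b": 124.0}  # a characteristic other than mood

    def check_and_indicate(first: str, second: str) -> bool:
        score = CONGRUENCY_DB.get((first, second), 0.0)        # access the score from the "database"
        if score < 0.5:
            return False                                       # mood congruency too low to proceed
        tempo_gap = abs(TEMPO_BPM[first] - TEMPO_BPM[second])  # compare the non-mood characteristic
        if tempo_gap <= 10.0:
            print(f"indicator: {first}/{second} congruent (mood {score:.2f}, tempo gap {tempo_gap:.0f} BPM)")
            return True
        return False

    check_and_indicate("track_a", "track_b")
    ```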
  • Publication number: 20220067057
    Abstract: Examples described herein may perform various operations based on mood congruency. An example method involves accessing, by a processor, from a database, a score that represents a degree of congruency between a first mood vector that describes first media data and a second mood vector that describes second media data, wherein the score is generated based on (i) a first value that the first mood vector associates with a first mood, (ii) a second value that the second mood vector associates with a second mood, and (iii) a degree of congruency between the first and second moods; based on the score, comparing, by the processor, a first characteristic of the first media data, other than the first mood, with a second characteristic of the second media data, other than the second mood; and, based at least in part on an output of the comparing, providing an indicator to a module.
    Type: Application
    Filed: November 12, 2021
    Publication date: March 3, 2022
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
  • Patent number: 11204930
    Abstract: Examples described herein may perform various operations based on mood congruency. An example method involves accessing, from a database, a first mood vector that describes first media data and specifies an association between a first value and a first mood; accessing, from the database, a second mood vector that describes second media data and specifies an association between a second value and a second mood; retrieving a first score that represents congruency between the first and second moods, the first score being retrieved from a data structure that correlates the first and second moods; using a processor, generating a second score that represents congruency between the first and second mood vectors, the generating of the second score being based on the accessed first and second values and the retrieved first score; and, based on at least the generated second score, providing an indicator to a module. (An illustrative code sketch follows this entry.)
    Type: Grant
    Filed: November 15, 2019
    Date of Patent: December 21, 2021
    Assignee: Gracenote, Inc.
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
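    The abstract above describes generating a vector-level congruency score from per-mood values and a pairwise mood-congruency lookup. One plausible reading, sketched below, weights each cross-mood "first score" by the values the two mood vectors assign; the mood names, table entries, and multiplicative weighting are assumptions rather than the patent's exact formulation.

    ```python
    # "Data structure that correlates the first and second moods": an assumed symmetric table.
    MOOD_CONGRUENCY = {
        ("joyful", "peaceful"): 0.6,
        ("joyful", "somber"): 0.1,
        ("peaceful", "somber"): 0.4,
    }

    def pairwise_congruency(mood_a: str, mood_b: str) -> float:
        """Retrieve the 'first score' for a pair of moods (identical moods count as fully congruent)."""
        if mood_a == mood_b:
            return 1.0
        return MOOD_CONGRUENCY.get((mood_a, mood_b), MOOD_CONGRUENCY.get((mood_b, mood_a), 0.0))

    def vector_congruency(vec_a: dict[str, float], vec_b: dict[str, float]) -> float:
        """Generate the 'second score' by weighting each pairwise congruency with the vectors' values."""
        return sum(value_a * value_b * pairwise_congruency(mood_a, mood_b)
                   for mood_a, value_a in vec_a.items()
                   for mood_b, value_b in vec_b.items())

    # Example: two pieces of media described by small mood vectors.
    score = vector_congruency({"joyful": 0.7, "peaceful": 0.3},
                              {"peaceful": 0.8, "somber": 0.2})
    ```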
  • Publication number: 20200081897
    Abstract: Examples described herein may perform various operations based on mood congruency. An example method involves accessing, from a database, a first mood vector that describes first media data and specifies an association between a first value and a first mood; accessing, from the database, a second mood vector that describes second media data and specifies an association between a second value and a second mood; retrieving a first score that represents congruency between the first and second moods, the first score being retrieved from a data structure that correlates the first and second moods; using a processor, generating a second score that represents congruency between the first and second mood vectors, the generating of the second score being based on the accessed first and second values and the retrieved first score; and, based on at least the generated second score, providing an indicator to a module.
    Type: Application
    Filed: November 15, 2019
    Publication date: March 12, 2020
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
  • Patent number: 10558674
    Abstract: Examples described herein may perform various operations based on mood congruency. An example implementation accesses (i) a first mood vector that describes first media data and specifies a first mood-value pair, the first mood-value pair assigning a first value to a first mood, and (ii) a second mood vector that describes second media data and specifies a second mood-value pair, the second mood-value pair assigning a second value to a second mood. The implementation retrieves a first score that quantifies congruency between the first and second moods and generates a second score that quantifies congruency between the first and second mood vectors, the generating of the second score being based on the accessed first and second values and the retrieved first score. Based on at least the generated second score, the implementation provides an indicator to an application.
    Type: Grant
    Filed: November 8, 2017
    Date of Patent: February 11, 2020
    Assignee: Gracenote, Inc.
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
  • Publication number: 20180075039
    Abstract: Examples described herein may perform various operations based on mood congruency. An example implementation accesses (i) a first mood vector that describes first media data and specifies a first mood-value pair, the first mood-value pair assigning a first value to a first mood, and (ii) a second mood vector that describes second media data and specifies a second mood-value pair, the second mood-value pair assigning a second value to a second mood. The implementation retrieves a first score that quantifies congruency between the first and second moods and generates a second score that quantifies congruency between the first and second mood vectors, the generating of the second score being based on the accessed first and second values and the retrieved first score. Based on at least the generated second score, the implementation provides an indicator to an application.
    Type: Application
    Filed: November 8, 2017
    Publication date: March 15, 2018
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
  • Patent number: 9842146
    Abstract: In an embodiment, a method is provided for determining a mood profile of media data. In this method, mood is determined across multiple elements of mood for the media data to create a mood profile associated with the media data. In some embodiments, the mood profile is then used to determine congruencies between one or more pieces of media data. (An illustrative code sketch follows this entry.)
    Type: Grant
    Filed: June 23, 2014
    Date of Patent: December 12, 2017
    Assignee: Gracenote, Inc.
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
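    The mood-profile abstract above is high-level, but a workable mental model is a score distributed across several mood elements, with congruency between two pieces of media measured over those profiles. The element names and the cosine-similarity measure in the sketch below are assumptions used only for illustration.

    ```python
    # Illustrative only: the mood elements and the cosine measure are assumptions, not the patent's.
    import math

    MOOD_ELEMENTS = ["joyful", "peaceful", "somber", "aggressive", "romantic"]

    def mood_profile(raw_scores: dict[str, float]) -> dict[str, float]:
        """Normalize per-element mood scores into a profile that sums to 1."""
        total = sum(raw_scores.get(m, 0.0) for m in MOOD_ELEMENTS) or 1.0
        return {m: raw_scores.get(m, 0.0) / total for m in MOOD_ELEMENTS}

    def profile_congruency(p: dict[str, float], q: dict[str, float]) -> float:
        """Cosine similarity between two mood profiles (1.0 means an identical mood mix)."""
        dot = sum(p[m] * q[m] for m in MOOD_ELEMENTS)
        norm = math.sqrt(sum(v * v for v in p.values())) * math.sqrt(sum(v * v for v in q.values()))
        return dot / norm if norm else 0.0

    a = mood_profile({"joyful": 3.0, "peaceful": 1.0})
    b = mood_profile({"peaceful": 2.0, "somber": 2.0})
    print(round(profile_congruency(a, b), 2))
    ```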
  • Patent number: 9632607
    Abstract: A mobile terminal and a method of controlling the mobile terminal are disclosed. According to an embodiment of the present invention, when a touch event for a touch input to a touch screen is generated, a writing sound source corresponding to the generated touch event is selected from a memory, and a feedback sound is output in real time, generated by changing a sound property of the selected writing sound source according to the pressure of the touch applied to the touch screen and the moving speed of the touch. This enables the user to feel as if he or she were performing the writing input in an actual analog environment, and provides the user with both a yearning for the analog environment and the pleasure of experiencing it. (An illustrative code sketch follows this entry.)
    Type: Grant
    Filed: January 16, 2014
    Date of Patent: April 25, 2017
    Assignee: LG ELECTRONICS INC.
    Inventors: Byoungzoo Jeong, Hoon Heo, Jaeho Choi, Kyogu Lee, Hanwook Chung, Dooyong Sung
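    The entry above maps touch input to real-time audio feedback: a stored writing sound is selected and one of its sound properties is altered by the touch pressure and the stroke speed. The sketch below maps pressure to gain and speed to playback rate; those mappings, the file names, and the returned parameter dict are assumptions, and a real implementation would hand the result to the platform's low-latency audio path.

    ```python
    # Hedged sketch: the pressure->gain and speed->playback-rate mappings are illustrative.
    from dataclasses import dataclass

    @dataclass
    class TouchEvent:
        pressure: float   # normalized touch pressure, assumed in [0, 1]
        speed: float      # stroke speed in pixels per second

    WRITING_SOUNDS = {"pencil": "pencil_on_paper.wav", "pen": "pen_on_paper.wav"}  # "sound sources in memory"

    def feedback_parameters(event: TouchEvent, tool: str = "pencil") -> dict:
        """Select a writing sound source and derive playback parameters from the touch."""
        source = WRITING_SOUNDS[tool]
        gain = 0.2 + 0.8 * min(event.pressure, 1.0)   # harder press  -> louder feedback
        rate = 0.8 + min(event.speed / 2000.0, 0.6)   # faster stroke -> faster playback
        return {"source": source, "gain": gain, "playback_rate": rate}

    params = feedback_parameters(TouchEvent(pressure=0.6, speed=900.0))
    ```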
  • Patent number: 9189802
    Abstract: A method for music recommendation is provided that uses collaborative filtering while still managing to produce novel yet relevant items, by exploiting the long-tailed distribution of users' listening behavior, in which playlists are biased toward a few songs while the rest, those in the long tail, have relatively low play counts. A link analysis method is also applied to users and the links between them to compute increasingly fine-grained weights for the recommended items. Results show that the method manages to include novel recommendations that are still relevant and demonstrate the potential for a new way of generating novel recommendations. (An illustrative code sketch follows this entry.)
    Type: Grant
    Filed: August 23, 2012
    Date of Patent: November 17, 2015
    Assignee: SEOUL NATIONAL UNIVERSITY R&DB FOUNDATION
    Inventors: Kibeom Lee, Kyogu Lee
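    The recommendation entry above combines user-to-user collaborative filtering with a deliberate boost for long-tail, low-play-count songs. A minimal sketch under assumptions: cosine similarity between users' play-count vectors stands in for the link analysis, and an inverse-log popularity factor stands in for the long-tail weighting; neither is the patent's exact formulation.

    ```python
    # Sketch only: the similarity measure and the long-tail boost are assumed, not the patent's.
    import math
    from collections import defaultdict

    def cosine(u: dict[str, int], v: dict[str, int]) -> float:
        dot = sum(u[s] * v[s] for s in set(u) & set(v))
        norm = math.sqrt(sum(c * c for c in u.values())) * math.sqrt(sum(c * c for c in v.values()))
        return dot / norm if norm else 0.0

    def recommend(target: str, playcounts: dict[str, dict[str, int]], top_n: int = 5) -> list[str]:
        """playcounts maps user -> {song: play count}; returns songs the target has not heard."""
        popularity = defaultdict(int)                         # total plays per song across all users
        for counts in playcounts.values():
            for song, plays in counts.items():
                popularity[song] += plays
        scores = defaultdict(float)
        for user, counts in playcounts.items():
            if user == target:
                continue
            similarity = cosine(playcounts[target], counts)   # the "link" between two users
            for song, plays in counts.items():
                if song not in playcounts[target]:
                    scores[song] += similarity * plays / math.log(2 + popularity[song])  # long-tail boost
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

    plays = {"u1": {"hit": 40, "indie": 3}, "u2": {"hit": 35, "indie": 5, "deep_cut": 4}, "me": {"hit": 50}}
    print(recommend("me", plays))   # low-play-count songs surface despite the popularity bias
    ```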
  • Publication number: 20140330848
    Abstract: In an embodiment, a method is provided for determining a mood profile of media data. In this method, mood is determined across multiple elements of mood for the media data to create a mood profile associated with the media data. In some embodiments, the mood profile is then used to determine congruencies between one or more pieces of media data.
    Type: Application
    Filed: June 23, 2014
    Publication date: November 6, 2014
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
  • Publication number: 20140253478
    Abstract: A mobile terminal and a method of controlling the mobile terminal are disclosed. According to an embodiment of the present invention, when a touch event for a touch input to a touch screen is generated, a writing sound source corresponding to the generated touch event is selected from a memory, and a feedback sound is output in real time, generated by changing a sound property of the selected writing sound source according to the pressure of the touch applied to the touch screen and the moving speed of the touch. This enables the user to feel as if he or she were performing the writing input in an actual analog environment, and provides the user with both a yearning for the analog environment and the pleasure of experiencing it.
    Type: Application
    Filed: January 16, 2014
    Publication date: September 11, 2014
    Inventors: Byoungzoo Jeong, Hoon Heo, Jaeho Choi, Kyogu Lee, Hanwook Chung, Dooyong Sung
  • Patent number: 8805854
    Abstract: In an embodiment, a method is provided for determining a mood profile of media data. In this method, mood is determined across multiple elements of mood for the media data to create a mood profile associated with the media data. In some embodiments, the mood profile is then used to determine congruencies between one or more pieces of media data.
    Type: Grant
    Filed: June 23, 2009
    Date of Patent: August 12, 2014
    Assignee: Gracenote, Inc.
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
  • Publication number: 20140059128
    Abstract: A method for music recommendation is provided that uses collaborative filtering while still managing to produce novel yet relevant items, by exploiting the long-tailed distribution of users' listening behavior, in which playlists are biased toward a few songs while the rest, those in the long tail, have relatively low play counts. A link analysis method is also applied to users and the links between them to compute increasingly fine-grained weights for the recommended items. Results show that the method manages to include novel recommendations that are still relevant and demonstrate the potential for a new way of generating novel recommendations.
    Type: Application
    Filed: August 23, 2012
    Publication date: February 27, 2014
    Inventors: Kibeom Lee, Kyogu Lee
  • Patent number: 8071869
    Abstract: The prominent tempo of audio data is determined by detecting a plurality of beat rates in the audio data. One or more characteristics of the audio data are used to filter the beat rates and determine the prominent tempo. Systems, methods, and apparatuses for determining the prominent tempo are discussed herein. (An illustrative code sketch follows this entry.)
    Type: Grant
    Filed: May 6, 2009
    Date of Patent: December 6, 2011
    Assignee: Gracenote, Inc.
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
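    The tempo entry above detects several candidate beat rates (a track at 120 BPM typically also shows energy at 60 and 240 BPM) and uses other audio characteristics to filter them down to one prominent tempo. In the sketch below the filtering characteristic is an assumed genre-dependent range of plausible tempi; the table and the fallback rule are illustrative, not taken from the patent.

    ```python
    # Assumed characteristic: a genre-dependent range of plausible tempi.
    GENRE_TEMPO_RANGE = {
        "ballad": (60, 100),
        "dance": (110, 140),
        "drum_and_bass": (160, 180),
    }

    def prominent_tempo(candidates: list[tuple[float, float]], genre: str) -> float:
        """candidates is a list of (beat rate in BPM, salience) pairs from a beat detector."""
        low, high = GENRE_TEMPO_RANGE.get(genre, (40, 220))
        plausible = [(bpm, salience) for bpm, salience in candidates if low <= bpm <= high]
        if not plausible:
            plausible = candidates                      # fall back to the strongest candidate overall
        return max(plausible, key=lambda c: c[1])[0]    # strongest surviving beat rate wins

    # Half- and double-rate aliases of 120 BPM are filtered out for a dance track.
    tempo = prominent_tempo([(60.0, 0.4), (120.0, 0.9), (240.0, 0.3)], "dance")
    ```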
  • Publication number: 20100325135
    Abstract: In an embodiment, a method is provided for determining a mood profile of media data. In this method, mood is determined across multiple elements of mood for the media data to create a mood profile associated with the media data. In some embodiments, the mood profile is then used to determine congruencies between one or more pieces of media data.
    Type: Application
    Filed: June 23, 2009
    Publication date: December 23, 2010
    Applicant: Gracenote, Inc.
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
  • Publication number: 20100282045
    Abstract: The prominent tempo of audio data is determined by detecting a plurality of beat rates in the audio data. One or more characteristics of the audio data are used to filter the beat rates and determine the prominent tempo. Systems, methods, and apparatuses for determining the prominent tempo are discussed herein.
    Type: Application
    Filed: May 6, 2009
    Publication date: November 11, 2010
    Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer