Patents by Inventor Kyogu Lee
Kyogu Lee has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230129425
Abstract: An example method involves comparing a primary element of a first piece of audio data to a primary element of a second piece of audio data; based on the comparing of the primary elements, determining that the first and second pieces of audio data have the same predominant mood category; in response to determining that the first and second pieces of audio data have the same predominant mood category, comparing a first mood score of the primary element of the first piece of audio data to a second mood score of the primary element of the second piece of audio data; determining that an output of the comparison of the two mood scores exceeds a threshold value; and in response to determining that the output of the comparison of the two mood scores exceeds the threshold value, providing an indicator to an application.
Type: Application
Filed: December 22, 2022
Publication date: April 27, 2023
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
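The flow this abstract describes — match predominant mood categories first, and only then compare mood scores against a threshold — can be sketched in a few lines. The function name, the dict-based track representation, and the threshold value below are illustrative assumptions, not details from the patent:

```python
def mood_indicator(track_a, track_b, threshold=0.2):
    """Return an indicator when two tracks share a predominant mood
    category and their mood scores differ by more than a threshold."""
    if track_a["mood"] != track_b["mood"]:
        # Different predominant mood categories: no score comparison is made.
        return None
    gap = abs(track_a["score"] - track_b["score"])  # output of the score comparison
    if gap > threshold:
        # Comparison output exceeds the threshold: provide an indicator.
        return {"mood": track_a["mood"], "score_gap": gap}
    return None
```

Under this reading, an application could use the indicator to flag tracks that share a mood but express it with very different intensity.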
-
Patent number: 11580120
Abstract: Examples described herein may perform various operations based on mood congruency. An example method involves accessing, by a processor, from a database, a score that represents a degree of congruency between a first mood vector that describes first media data and a second mood vector that describes second media data, wherein the score is generated based on (i) a first value that the first mood vector associates with a first mood, (ii) a second value that the second mood vector associates with a second mood, and (iii) a degree of congruency between the first and second moods; based on the score, comparing, by the processor, a first characteristic of the first media data, other than the first mood, with a second characteristic of the second media data, other than the second mood; and, based at least in part on an output of the comparing, providing an indicator to a module.
Type: Grant
Filed: November 12, 2021
Date of Patent: February 14, 2023
Assignee: Gracenote, Inc.
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
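One natural reading of steps (i)–(iii) is a cross-term sum: every pair of mood values from the two vectors is weighted by the degree of congruency between those two moods. The patent does not fix a formula, so the dict-based representation and the weighting below are assumptions for illustration only:

```python
def congruency_score(vec_a, vec_b, mood_congruency):
    """Score congruency between two mood vectors: each cross-term
    multiplies the two mood values by the per-mood-pair congruency
    looked up in a table (0.0 when a pair is absent)."""
    return sum(
        va * vb * mood_congruency.get((ma, mb), 0.0)
        for ma, va in vec_a.items()
        for mb, vb in vec_b.items()
    )
```

With a table where ("happy", "joyful") maps to a high congruency and ("happy", "sad") to a low one, two tracks tagged with related but non-identical moods still receive a meaningful score.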
-
Publication number: 20220067057
Abstract: Examples described herein may perform various operations based on mood congruency. An example method involves accessing, by a processor, from a database, a score that represents a degree of congruency between a first mood vector that describes first media data and a second mood vector that describes second media data, wherein the score is generated based on (i) a first value that the first mood vector associates with a first mood, (ii) a second value that the second mood vector associates with a second mood, and (iii) a degree of congruency between the first and second moods; based on the score, comparing, by the processor, a first characteristic of the first media data, other than the first mood, with a second characteristic of the second media data, other than the second mood; and, based at least in part on an output of the comparing, providing an indicator to a module.
Type: Application
Filed: November 12, 2021
Publication date: March 3, 2022
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
-
Patent number: 11204930
Abstract: Examples described herein may perform various operations based on mood congruency. An example method involves accessing, from a database, a first mood vector that describes first media data and specifies an association between a first value and a first mood; accessing, from the database, a second mood vector that describes second media data and specifies an association between a second value and a second mood; retrieving a first score that represents congruency between the first and second moods, the first score being retrieved from a data structure that correlates the first and second moods; using a processor, generating a second score that represents congruency between the first and second mood vectors, the generating of the second score being based on the accessed first and second values and the retrieved first score; and, based on at least the generated second score, providing an indicator to a module.
Type: Grant
Filed: November 15, 2019
Date of Patent: December 21, 2021
Assignee: Gracenote, Inc.
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
-
Publication number: 20200081897
Abstract: Examples described herein may perform various operations based on mood congruency. An example method involves accessing, from a database, a first mood vector that describes first media data and specifies an association between a first value and a first mood; accessing, from the database, a second mood vector that describes second media data and specifies an association between a second value and a second mood; retrieving a first score that represents congruency between the first and second moods, the first score being retrieved from a data structure that correlates the first and second moods; using a processor, generating a second score that represents congruency between the first and second mood vectors, the generating of the second score being based on the accessed first and second values and the retrieved first score; and, based on at least the generated second score, providing an indicator to a module.
Type: Application
Filed: November 15, 2019
Publication date: March 12, 2020
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
-
Patent number: 10558674
Abstract: Examples described herein may perform various operations based on mood congruency. An example implementation accesses (i) a first mood vector that describes first media data and specifies a first mood-value pair, the first mood-value pair assigning a first value to a first mood, and (ii) a second mood vector that describes second media data and specifies a second mood-value pair, the second mood-value pair assigning a second value to a second mood. The implementation retrieves a first score that quantifies congruency between the first and second moods and generates a second score that quantifies congruency between the first and second mood vectors, the generating of the second score being based on the accessed first and second values and the retrieved first score. Based on at least the generated second score, the implementation provides an indicator to an application.
Type: Grant
Filed: November 8, 2017
Date of Patent: February 11, 2020
Assignee: Gracenote, Inc.
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
-
Publication number: 20180075039
Abstract: Examples described herein may perform various operations based on mood congruency. An example implementation accesses (i) a first mood vector that describes first media data and specifies a first mood-value pair, the first mood-value pair assigning a first value to a first mood, and (ii) a second mood vector that describes second media data and specifies a second mood-value pair, the second mood-value pair assigning a second value to a second mood. The implementation retrieves a first score that quantifies congruency between the first and second moods and generates a second score that quantifies congruency between the first and second mood vectors, the generating of the second score being based on the accessed first and second values and the retrieved first score. Based on at least the generated second score, the implementation provides an indicator to an application.
Type: Application
Filed: November 8, 2017
Publication date: March 15, 2018
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
-
Patent number: 9842146
Abstract: In an embodiment, a method is provided for determining a mood profile of media data. In this method, mood is determined across multiple elements of mood for the media data to create a mood profile associated with the media data. In some embodiments, the mood profile is then used to determine congruencies between one or more pieces of media data.
Type: Grant
Filed: June 23, 2014
Date of Patent: December 12, 2017
Assignee: Gracenote, Inc.
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
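A mood profile "across multiple elements of mood" can be pictured as one score per mood element, aggregated over the piece of media; cosine similarity is then one plausible congruency measure between two such profiles. The mood element names, the per-segment averaging, and the similarity choice below are illustrative assumptions, not details from the patent:

```python
import math

MOODS = ["happy", "sad", "calm", "tense"]  # illustrative mood elements

def mood_profile(segment_scores):
    """Average per-segment mood-element scores into one profile
    for the whole piece of media."""
    n = len(segment_scores)
    return {m: sum(s[m] for s in segment_scores) / n for m in MOODS}

def profile_congruency(p, q):
    """Cosine similarity between two mood profiles: 1.0 for
    identical mood distributions, near 0.0 for unrelated ones."""
    dot = sum(p[m] * q[m] for m in MOODS)
    norm_p = math.sqrt(sum(p[m] ** 2 for m in MOODS))
    norm_q = math.sqrt(sum(q[m] ** 2 for m in MOODS))
    return dot / (norm_p * norm_q) if norm_p and norm_q else 0.0
```

Any per-segment mood classifier could feed `mood_profile`; the congruency score could then drive matching of music to other media, as the later entries in this family describe.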
-
Patent number: 9632607
Abstract: A mobile terminal and a method of controlling the mobile terminal are disclosed. According to an embodiment of the present invention, when a touch event for a touch input to a touch screen is generated, a writing sound source corresponding to the generated touch event is selected from a memory, and a feedback sound is output in real time; the feedback sound is generated by changing a sound property of the selected writing sound source according to the pressure of the touch applied to the touch screen and the moving speed of the touch. This enables the user to feel as if he or she had performed the writing input in an actual analog environment, and provides the user with a yearning for the analog environment and the pleasure of experiencing it.
Type: Grant
Filed: January 16, 2014
Date of Patent: April 25, 2017
Assignee: LG ELECTRONICS INC.
Inventors: Byoungzoo Jeong, Hoon Heo, Jaeho Choi, Kyogu Lee, Hanwook Chung, Dooyong Sung
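The core idea — modulate a stored writing sound's properties by touch pressure and stroke speed — can be sketched as a small mapping function. The specific mappings (pressure to loudness, speed to pitch) and all constants below are hypothetical; the patent only says that a sound property is changed according to pressure and moving speed:

```python
def writing_feedback(base_pitch_hz, base_gain, pressure, speed):
    """Derive real-time playback parameters for a selected writing
    sound source from the current touch pressure and moving speed."""
    # Harder press -> louder output, clamped to a maximum gain of 1.0.
    gain = min(1.0, base_gain * (0.5 + pressure))
    # Faster stroke -> higher pitch, scaled linearly from the base pitch.
    pitch = base_pitch_hz * (1.0 + 0.1 * speed)
    return {"pitch_hz": pitch, "gain": gain}
```

A touch handler would call this on every move event and feed the result to the audio engine, so the feedback tracks the stroke continuously rather than replaying a fixed sample.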
-
Patent number: 9189802
Abstract: A method for music recommendation is provided that uses collaborative filtering while still managing to produce novel yet relevant items, by exploiting the long-tailed distribution of users' listening behavior, in which playlists are biased toward a few songs while the rest, those in the long tail, have relatively low play counts. A link analysis method is also applied to users and the links between them to create an increasingly fine-grained approach to calculating weights for the recommended items. Results show that the method manages to include novel recommendations that are still relevant, and demonstrates the potential of a new way of generating novel recommendations.
Type: Grant
Filed: August 23, 2012
Date of Patent: November 17, 2015
Assignee: SEOUL NATIONAL UNIVERSITY R&DB FOUNDATION
Inventors: Kibeom Lee, Kyogu Lee
-
Publication number: 20140330848
Abstract: In an embodiment, a method is provided for determining a mood profile of media data. In this method, mood is determined across multiple elements of mood for the media data to create a mood profile associated with the media data. In some embodiments, the mood profile is then used to determine congruencies between one or more pieces of media data.
Type: Application
Filed: June 23, 2014
Publication date: November 6, 2014
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
-
Publication number: 20140253478
Abstract: A mobile terminal and a method of controlling the mobile terminal are disclosed. According to an embodiment of the present invention, when a touch event for a touch input to a touch screen is generated, a writing sound source corresponding to the generated touch event is selected from a memory, and a feedback sound is output in real time; the feedback sound is generated by changing a sound property of the selected writing sound source according to the pressure of the touch applied to the touch screen and the moving speed of the touch. This enables the user to feel as if he or she had performed the writing input in an actual analog environment, and provides the user with a yearning for the analog environment and the pleasure of experiencing it.
Type: Application
Filed: January 16, 2014
Publication date: September 11, 2014
Inventors: Byoungzoo Jeong, Hoon Heo, Jaeho Choi, Kyogu Lee, Hanwook Chung, Dooyong Sung
-
Patent number: 8805854
Abstract: In an embodiment, a method is provided for determining a mood profile of media data. In this method, mood is determined across multiple elements of mood for the media data to create a mood profile associated with the media data. In some embodiments, the mood profile is then used to determine congruencies between one or more pieces of media data.
Type: Grant
Filed: June 23, 2009
Date of Patent: August 12, 2014
Assignee: Gracenote, Inc.
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
-
Publication number: 20140059128
Abstract: A method for music recommendation is provided that uses collaborative filtering while still managing to produce novel yet relevant items, by exploiting the long-tailed distribution of users' listening behavior, in which playlists are biased toward a few songs while the rest, those in the long tail, have relatively low play counts. A link analysis method is also applied to users and the links between them to create an increasingly fine-grained approach to calculating weights for the recommended items. Results show that the method manages to include novel recommendations that are still relevant, and demonstrates the potential of a new way of generating novel recommendations.
Type: Application
Filed: August 23, 2012
Publication date: February 27, 2014
Inventors: Kibeom Lee, Kyogu Lee
-
Patent number: 8071869
Abstract: The prominent tempo of audio data is determined by detecting a plurality of beat rates of the audio data. One or more audio data characteristics are used to filter the beat rates and determine the prominent tempo. Systems, methods, and apparatuses to determine the prominent tempo are discussed herein.
Type: Grant
Filed: May 6, 2009
Date of Patent: December 6, 2011
Assignee: Gracenote, Inc.
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
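Beat trackers typically return several candidate rates (the true tempo plus its half and double), so the abstract's filtering step can be sketched as choosing among candidates using an audio characteristic. Using a genre-plausible BPM range as that characteristic is an illustrative assumption; the patent only says one or more characteristics filter the beat rates:

```python
def prominent_tempo(beat_rates, genre_range):
    """Pick the prominent tempo from (bpm, strength) candidates,
    filtered to a plausible BPM range for the track's genre."""
    lo, hi = genre_range
    candidates = [(bpm, strength) for bpm, strength in beat_rates
                  if lo <= bpm <= hi]
    if not candidates:
        # No candidate survives the filter: fall back to the strongest rate.
        return max(beat_rates, key=lambda r: r[1])[0]
    return max(candidates, key=lambda c: c[1])[0]
```

For a track with detected rates at 60, 120, and 240 BPM, a 90–180 BPM range resolves the octave ambiguity in favor of 120.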
-
Publication number: 20100325135
Abstract: In an embodiment, a method is provided for determining a mood profile of media data. In this method, mood is determined across multiple elements of mood for the media data to create a mood profile associated with the media data. In some embodiments, the mood profile is then used to determine congruencies between one or more pieces of media data.
Type: Application
Filed: June 23, 2009
Publication date: December 23, 2010
Applicant: Gracenote, Inc.
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer
-
Publication number: 20100282045
Abstract: The prominent tempo of audio data is determined by detecting a plurality of beat rates of the audio data. One or more audio data characteristics are used to filter the beat rates and determine the prominent tempo. Systems, methods, and apparatuses to determine the prominent tempo are discussed herein.
Type: Application
Filed: May 6, 2009
Publication date: November 11, 2010
Inventors: Ching-Wei Chen, Kyogu Lee, Peter C. DiMaria, Markus K. Cremer