Abstract: A method for the real-time metatagging and captioning of an event. The method may include embedding metatag information in a caption file provided by a captioner. The embedded metatag information may allow a user to access additional information via the text of the captioned event. The metatag information may be embedded using a captioning device that both creates the text code and embeds the metatag code.
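A minimal sketch of what embedding metatag information in a caption file might look like; the cue format, the `META` line convention, and the keyword-to-URL mapping are all illustrative assumptions, not the encoding the patent actually claims.

```python
def embed_metatags(cues, metatags):
    """Render caption cues, attaching metatag lines to matching cues.

    cues     -- list of (start_sec, end_sec, text) tuples
    metatags -- dict mapping a keyword in the caption text to a URL
    """
    lines = []
    for start, end, text in cues:
        lines.append(f"{start:.3f} --> {end:.3f}")
        lines.append(text)
        for keyword, url in metatags.items():
            if keyword in text:
                # Embed the metatag alongside the caption text so a
                # viewer could expose it as additional information.
                lines.append(f"META {keyword} {url}")
        lines.append("")
    return "\n".join(lines)

caption_file = embed_metatags(
    [(0.0, 2.5, "Welcome to the evening news"),
     (2.5, 5.0, "Stocks rose sharply today")],
    {"Stocks": "https://example.com/markets"},
)
```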
Abstract: A synchronization process between captioning data and/or corresponding metatags and the associated media file parses the media file, correlates the caption information and/or metatags with segments of the media file, and provides a capability for textual search and selection of particular segments. A time-synchronized version of the captions is created that is synchronized to the moment that the speech is uttered in the recorded media. The caption data is leveraged to enable search engines to index not merely the title of a video, but the entirety of what was said during the video as well as any associated metatags relating to contents of the video. Further, because the entire media file is indexed, a search can request a particular scene or occurrence within the event recorded by the media file, and the exact moment within the media relevant to the search can be accessed and played for the requester.
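The search side of such a synchronization process can be sketched as follows: caption cues already time-aligned to the media are indexed word by word, so a text query resolves to the playback offsets of the matching segments. The function names and data layout here are assumptions for illustration, not the patent's actual implementation.

```python
def build_index(cues):
    """Map each caption word to the start times of cues containing it."""
    index = {}
    for start, _end, text in cues:
        for word in text.lower().split():
            index.setdefault(word, []).append(start)
    return index

def search(index, query):
    """Return start times of segments whose captions mention the query."""
    return index.get(query.lower(), [])

cues = [(0.0, 4.0, "The rocket is on the launch pad"),
        (4.0, 9.0, "Liftoff of the mission"),
        (9.0, 14.0, "The rocket has cleared the tower")]
index = build_index(cues)
hits = search(index, "liftoff")  # a player could seek to these offsets
```

Because every word of every cue is indexed, a query matches the entirety of what was said, not merely a title, and each hit carries the exact moment within the media to begin playback.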
Type:
Grant
Filed:
September 21, 2010
Date of Patent:
April 22, 2014
Assignee:
Caption Colorado L.L.C.
Inventors:
Richard T. Polumbus, Michael W. Homyack
Abstract: A captioning evaluation system. The system accepts captioning data and determines a number of errors in the captioning data, as well as the number of words per minute both across the entirety of an event corresponding to the captioning data and across time intervals of the event. The errors may be used to determine the accuracy of the captioning, and the words per minute, for the entire event and for the time intervals, may be used to determine a cadence and/or rhythm for the captioning. The accuracy and cadence may be used to score the captioning data and captioner.
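A calculation along the lines the abstract describes might look like the sketch below: count errors for accuracy, compute words per minute overall and per interval, and derive a cadence figure from how evenly the captioner keeps pace. The cadence formula and all constants are invented for illustration and are not the patent's scoring method.

```python
def score_captioning(words_per_interval, errors, interval_minutes=1.0):
    """Score caption data from per-interval word counts and an error count."""
    total_words = sum(words_per_interval)
    total_minutes = interval_minutes * len(words_per_interval)
    accuracy = 1.0 - errors / total_words           # fraction of correct words
    overall_wpm = total_words / total_minutes
    interval_wpm = [w / interval_minutes for w in words_per_interval]
    # Cadence: steadier pacing (lower variance across intervals) scores
    # closer to 1.0; erratic pacing drives the score toward 0.
    variance = sum((w - overall_wpm) ** 2 for w in interval_wpm) / len(interval_wpm)
    cadence = 1.0 / (1.0 + variance ** 0.5 / overall_wpm)
    return {"accuracy": accuracy, "wpm": overall_wpm, "cadence": cadence}

# Four one-minute intervals with 14 errors across 700 words.
result = score_captioning([180, 175, 185, 160], errors=14)
```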
Abstract: A synchronization process between captioning data and/or corresponding metatags and the associated media file parses the media file, correlates the caption information and/or metatags with segments of the media file, and provides a capability for textual search and selection of particular segments. A time-synchronized version of the captions is created that is synchronized to the moment that the speech is uttered in the recorded media. The caption data is leveraged to enable search engines to index not merely the title of a video, but the entirety of what was said during the video as well as any associated metatags relating to contents of the video. Further, because the entire media file is indexed, a search can request a particular scene or occurrence within the event recorded by the media file, and the exact moment within the media relevant to the search can be accessed and played for the requester.
Type:
Application
Filed:
September 21, 2010
Publication date:
March 24, 2011
Applicant:
Caption Colorado L.L.C.
Inventors:
Richard T. Polumbus, Michael W. Homyack