Methods and apparatus for identifying a digital audio signal

Method and apparatus for identifying broadcast digital audio signals include structure and/or function whereby the digital audio signal is provided to processing structure which is configured to (i) identify a program-identifying code in the received digital audio signal, (ii) identify a program-identifying code in a decompressed received digital audio signal, (iii) identify a feature signature in the received digital audio signal, and (iv) identify a feature signature in the decompressed received digital audio signal. Preferably, such processing structure is disposed in a dwelling or a monitoring site in an audience measurement system, such as the Nielsen TV ratings system.

Description
FIELD OF THE INVENTION

This application is a national stage application under 35 U.S.C. §371 of PCT Application Ser. No. PCT/US02/32043, which was filed on Oct. 9, 2002, and which claims priority under 35 U.S.C. § 119 to U.S. application Ser. No. 09/973,893, filed on Jan. 8, 2002. The present invention relates to methods and apparatus for identifying a program signal that is broadcast to members of an audience. More particularly, the present invention relates to methods and apparatus for identifying a program signal having a digital audio component. Preferably, such methods and apparatus will find use in audience measurement and/or broadcast monitoring services.

RELATED ART

Third party measurements are typically used in the broadcasting industry to verify that program elements (e.g., commercials) are disseminated in accordance with contractual arrangements, and to estimate the size and composition of the audience. One measurement technique involves reading an ancillary encoded identification label or signal that is transmitted with the program. Another measurement technique involves extracting characteristic features (commonly called “signatures”) from the program, and then comparing the extracted features with a library of features from known program elements.

There is some degree of overlap in the code and signal processing between the above two techniques. If a broadcast program is encoded with a label associating it with a final distributor of the program (e.g., a local news broadcast labeled as such by the originating station), that label may be essentially self-attesting, or may be interpreted by recourse to a master look-up table. On the other hand, if the encoded label only identifies an originator or intermediate distributor (e.g., a syndicated program labeled only with the program name and episode number), then it may be necessary to compare that label with a library of labels collected from a number of local stations, in order to identify the station that transmitted the program in question.

U.S. Pat. No. 5,481,294 to Thomas et al. (incorporated herein by reference) discloses apparatus and method whereby a program signal that is to be identified is initially processed to extract an ancillary identifying code. If the code is found, it is stored along with the time at which it was received or otherwise selected by the user (known as the “read time”) in a memory for subsequent transmission to a central data collection and processing facility. If no code is found, a signature is extracted from the program signal, stored as a time-stamped record in the memory, and subsequently communicated to the central facility where it is compared with similar signatures extracted from known programs at monitoring facilities that may be remote from the central facility.

A television measurement system of the sort taught by Thomas et al. may employ codes written into either the video or audio components of the composite broadcast signal, and also may extract signatures from either the video or audio portion of that signal. Moreover, one may configure a system of this sort to extract signatures from a program signal even when an identifying label or code is read. An arrangement of this sort can use signature data to provide "fill-in" identification at times between sequential transmissions of an audio code.

Also of note is the identification tag reading system disclosed in U.S. Pat. No. 6,202,210 to Ludtke (incorporated herein by reference). The disclosed measurement system is embedded in an in-home entertainment network having consumer electronic equipment adapted to communicate with other such equipment by means of an IEEE 1394 serial interface. Ludtke discloses an arrangement in which a program-identifying label, received with a broadcast data stream used in the household, is read, interpreted, and forwarded to a remote data collection entity.

The advent of digitally-transmitted television signals has had a profound impact on systems for both verifying broadcasts and for determining audience viewing preferences (in dwellings statistically selected to participate in a television audience measurement). For example, some video encoding arrangements that work well with analog video signals are incompatible with digital transmission because such codes, if present on an original analog signal, do not survive the data compression that is part of the process of converting the analog video signal to a digital one. Moreover, there are a variety of applicable digital transmission standards and a variation in the extent to which broadcasters adhere to those standards. For example, in 2001, the U.S. market faced several digital transmission standards: one for over-the-air terrestrial transmission, one for cable distribution, and one for satellite-to-end-user transmission.

Although the audio component of the overall program signal generally utilizes far less valuable bandwidth than does the video component, there is variation among digital audio standards. The ATSC (Advanced Television Systems Committee) standard, for example, mandates the use of what is called AC-3 audio, which could also be carried by direct satellite and cable systems. In 2001, however, the majority of direct broadcast satellite (DBS) and digital cable systems were not using AC-3 sound. For example, some U.S. DBS systems used a standard referred to as Musicam or MPEG1 Layer 2 audio. Other standards, such as the Japanese AAC (Advanced Audio Coding) standard using MPEG2, Layers 1-3, are also known. Of course, many programs are still in a linear PCM format.

Generally speaking, the compressed audio formats call for each audio signal stream to be formatted into frames, where each frame can be configured as a string of packets that can be broadcast at a single frequency, or in a single channel with other audio streams by means of time domain multiplexing. For example, MPEG breaks each frame into a set of fixed-duration packets (where each packet has a header) for multiplexing audio and video bitstreams in one stream with the necessary information to keep the streams synchronized when decoding. Each audio frame is autonomous and contains all the information necessary for decoding so that it can be processed independently of previous or subsequently transmitted frames. Although the length of a frame may vary, depending on the bit rate and sampling frequency, there is a maximum allowed frame length, and smaller frames (i.e., those arising from lower fidelity audio signals) may be padded with dummy data in order to provide a fixed interval between frame headers.

Many of the compressed audio standards set aside portions of selected packets for the transmission of auxiliary data (e.g., signal identification data) that is not part of the audio signal. Moreover, almost all such standards provide for the use of padding bytes in order to provide a fixed interval between frame headers. These padding bytes can, in some cases, be used for the purpose of adding an ancillary program identification label even if a defined auxiliary data field is not provided for in a given standard. Thus, it is expected that in at least the great majority of packetized digital audio broadcasting systems, a program-identifying label can be added to a predetermined portion of a packet or frame.
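
By way of illustration only, the following sketch shows how a short program-identifying label might be written into, and later recovered from, otherwise-unused padding bytes of a fixed-length audio frame. The frame length, padding offset, and label format used here are illustrative assumptions and do not correspond to any particular broadcasting standard.

    FRAME_LEN = 192        # assumed fixed frame length in bytes
    PAD_OFFSET = 180       # assumed start of the padding region
    PAD_LEN = FRAME_LEN - PAD_OFFSET

    def insert_label(frame: bytes, label: bytes) -> bytes:
        """Return a copy of the frame with the label written into its padding."""
        if len(frame) != FRAME_LEN:
            raise ValueError("unexpected frame length")
        if len(label) > PAD_LEN:
            raise ValueError("label does not fit in the padding region")
        padded = label + bytes(PAD_LEN - len(label))   # zero-fill the remainder
        return frame[:PAD_OFFSET] + padded

    def read_label(frame: bytes, label_len: int) -> bytes:
        """Recover a label of known length from the padding region."""
        return frame[PAD_OFFSET:PAD_OFFSET + label_len]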

Consumer electronic equipment for receiving digital broadcasts typically has a standard digital audio output from the receiver to the consumer's digital audio equipment. The receiver may comprise a digital tuner in a set-top box that provides an analog output to an NTSC receiver, a digital television receiver and display, or any of a number of other known audio receivers. Moreover, the use of digital audio equipment has made it common for the consumer digital receiver to supply a digital audio output even if the input to the receiver is an analog broadcast signal or an input from an analog VCR. That is, the consumer's receiving apparatus (that is to be monitored) may be used not only for receiving both analog and digital broadcast program signals, but also for selecting a program signal source from a number of possible local sources such as DVD or CD players.

The industry-standard design for providing a digital audio output signal for use by digital audio equipment is known as the Sony-Philips Digital Interface (SP/DIF). The signal available at an SP/DIF connector may be either an uncompressed linear PCM (Pulse Code Modulated) digital signal having a bit rate of no more than 64 kbit/sec, or a non-linear PCM encoded audio bitstream, such as one in the AC-3 format having a bit rate of 384 kbit/sec.

Thus, what is needed is a digital signal recognition system which is capable of accurately and reliably recognizing digital audio signals in all of the various configurations and implementations described above.

SUMMARY OF THE INVENTION

It is an object of the present invention to provide method and apparatus for processing received digital audio signals, transmitted through a wide variety of media, to ensure accurate recognition.

According to a first aspect of the present invention, digital audio signal recognition apparatus includes an input connector for direct connection to a standard SP/DIF output connector on consumer digital television receiving equipment. The apparatus includes structure for processing the input digital audio signal to obtain one or more of: (i) an identifying label encoded in a portion of a digital audio frame; (ii) an identifying label code embedded in a decompressed audio signal; (iii) a copy of a selected portion of a frame (e.g., a checksum portion); and (iv) a copy of a selected portion of the decompressed digital audio signal. Preferably, the apparatus also has an SP/DIF output connector and provides a repeated copy of the input digital audio signal at that output, so that the user can use the audio and video equipment without interference by the signal recognition process.

According to another aspect of the present invention, a method of collecting tuning data from digitally transmitted program signals comprises an initial step of obtaining the digital audio signal associated with the program, using an SP/DIF connection. If that signal is a non-linear PCM encoded audio bitstream signal, an attempt may be made to read a first type of program-identifying label from an auxiliary message portion of the digital signal or to select a predetermined portion of the signal frame as a first candidate signature. In addition, the input signal may be decompressed and the measuring equipment may attempt to read a second type of program-identifying label distributed as a code embedded in the audio signal or to generate a second candidate signature from a predetermined part of the decompressed audio signal. Of course, if the signal at the SP/DIF connection is an uncompressed, linear PCM, or already-decompressed digital audio signal, the second label and the second candidate signature can be collected without an intermediate decompression step. On the other hand, if the first label and the first candidate signature are present, they can be collected too.

According to yet another aspect of the present invention, a method of collecting tuning data includes the steps of obtaining an input digital audio signal at an SP/DIF connection, and processing the input signal in four parallel paths to obtain up to four identification data. The identification data may comprise (i) any program-identifying label that is present in the compressed audio signals, (ii) any program-identifying label that is present in the decompressed audio signals, (iii) a first candidate signature from the input signal (if it is in a compressed audio format), and (iv) a second candidate signature from the decompressed audio. All of the four possible identification data that are collected at each measurement time are assembled to form a time-stamped record that may be communicated to a store-and-forward apparatus. The store-and-forward apparatus stores in a memory at least some of the time-stamped records sent to it, and subsequently forwards the stored records to a central data collection facility. It will be clear to those skilled in the audience-measurement arts that not all of the collected data (e.g., data collected during a time when the television is turned off, or data that supports a temporal resolution finer than what is called for in the measurement) is of value, and that the triage operation performed by the store-and-forward apparatus, or by some other portion of the measurement system installed in a user dwelling, reduces the cost of storing and communicating data.
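
The assembly and triage just described may be pictured, purely as an illustrative sketch under assumed field names and a simple one-record-per-interval filtering rule, as follows:

    import time
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class TuningRecord:
        read_time: float                     # time stamp from the local clock
        compressed_label: Optional[bytes]    # label found in the compressed frame
        decompressed_label: Optional[bytes]  # label embedded in the decoded audio
        frame_signature: Optional[bytes]     # e.g., a copy of a CRC field
        audio_signature: Optional[bytes]     # signature from the decoded audio

    def assemble_record(compressed_label, decompressed_label,
                        frame_signature, audio_signature) -> TuningRecord:
        """Assemble whatever identification data are available into one record."""
        return TuningRecord(time.time(), compressed_label, decompressed_label,
                            frame_signature, audio_signature)

    def triage(records: List[TuningRecord],
               resolution_s: float) -> List[TuningRecord]:
        """Keep roughly one record per reporting interval to reduce storage cost."""
        kept: List[TuningRecord] = []
        last_kept = None
        for rec in records:                  # records assumed ordered by read_time
            if last_kept is None or rec.read_time - last_kept >= resolution_s:
                kept.append(rec)
                last_kept = rec.read_time
        return kept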

According to a further aspect of the present invention, a digital signal recognition system comprises monitoring equipment tuned to all the broadcast signal sources that can be viewed in statistically selected dwellings. Preferably, the monitoring equipment collects and stores all of the first and second program-identifying labels that are present in the signal, and also extracts reference versions of both the first and second signatures. These data are stored as time-stamped records in which the interval between data collection times is the smallest acceptable interval in any of the measurements for which the data are to be used. For example, if data from a monitoring site are to be used both in an audience measurement having a minimum reportable viewing interval of ten seconds and in a commercial broadcast verification study having a contractual resolution of one half second, the monitoring site will collect data with a one-half-second resolution. Regardless of the resolution involved, a central facility periodically retrieves data both from one or more monitoring sites and from some predetermined number of selected dwellings, and compares the various signatures and intermediate codes in order to identify the program signals selected by the sampled audience members.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a data structure diagram of an SP/DIF bitstream, showing where AC-3 audio data, an ID label, and a signature for audience research are located or extracted.

FIG. 2 is a system-level block diagram of a preferred measurement system according to an embodiment of the present invention.

FIG. 3 is a block diagram of the measurement equipment depicted in FIG. 2.

FIG. 4 is a logic diagram depicting a main loop of a program executed by a tuning measurement apparatus of the invention.

DETAILED DESCRIPTION OF THE PRESENTLY PREFERRED EXEMPLARY EMBODIMENTS

1. Introduction

While the present invention will be described with respect to television audience monitoring (e.g., Nielsen television rating) systems, it should be understood that the present invention applies equally well to radio audience monitoring, Internet audience monitoring, radio/TV commercial verification, copyright royalty collection, etc. As used herein, the term “program signal” refers to segments of various lengths such as all or parts of programs, commercials, promotional messages, public service announcements, and the like, as well as signals generated from consumer program signal storage equipment such as video cassette recorders (analog or digital), CD players, VCD players, DVD players and the like.

Briefly, the preferred embodiment receives a digital program signal and analyzes it in one or more of four different ways. First, the preferred embodiment can identify any program-identifying label that is present in compressed digital audio signals. Second, the preferred embodiment can identify any program-identifying label that is present in decompressed audio signals. Third, the preferred embodiment can identify a first candidate signature from the input signal (preferably when it is in a frame format). And fourth, the preferred embodiment can identify a second candidate signature from the decompressed audio signals. One or more of these identifications are stored in a time-stamped record which may be used immediately, or at a later time, to verify transmitted information or measure audience participation.

2. The Signal Format

FIG. 1 depicts how AC-3 and Nielsen data fit within an SP/DIF bitstream. As depicted in FIG. 1, an SP/DIF bitstream signal 10 (in this example, the audio is compressed in the AC-3 format) is formatted according to the IEC 60958 standard of the International Electrotechnical Commission. It comprises a sequence of fixed-length frames 12. Each frame contains two sub-frames 13. Each sub-frame comprises a Preamble field 14, an Auxiliary data field 15, a data field 16 whose contents comply with the IEC 61937 standard (discussed below), and a Status field 17. According to IEC 61937, a data burst comprises many IEC 60958 data fields 16. Data bursts are separated by Stuffing 20, and the length of a data burst is variable. Each data burst comprises a Header 21, Burst Information 22, a Length field 23, and a Payload 24 whose length is indicated by the Length field 23. Here the payload carries the AC-3 data. The Payload comprises SI 25, BSI 26, a number of Audio Blocks 27, an Aux data field 28, and, finally, CRC2 29, which is the CRC for the entire AC-3 frame. The last two fields are employed according to the present invention in order to conduct media research that includes measuring audience viewing information and monitoring programs.
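
Purely by way of example, the following sketch locates a data burst (Header 21, Burst Information 22, Length 23, and Payload 24) within a byte stream. It assumes the commonly used layout of two 16-bit sync words followed by a 16-bit burst-information word and a 16-bit length code expressed in bits; a production decoder would follow the IEC 61937 text exactly.

    import struct

    SYNC_A, SYNC_B = 0xF872, 0x4E1F          # assumed data-burst sync words

    def find_burst(stream: bytes, start: int = 0):
        """Return (burst_info, payload) for the first data burst found, or None."""
        i = start
        while i + 8 <= len(stream):
            pa, pb = struct.unpack_from(">HH", stream, i)
            if (pa, pb) == (SYNC_A, SYNC_B):
                burst_info, length_bits = struct.unpack_from(">HH", stream, i + 4)
                payload_len = (length_bits + 7) // 8          # round bits up to bytes
                payload = stream[i + 8:i + 8 + payload_len]   # e.g., an AC-3 frame
                return burst_info, payload
            i += 2                                            # bursts are word aligned
        return None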

The Stuffing 20 between data bursts is all zeros; however, it must be noted that the IEC 60958 frames still contain the preamble, channel status bits, validity bits, etc., and that bits 12-27 are zero. The stuffing between data bursts is used to maintain proper synchronization of the audio output, since the data channel has a higher capacity than is necessary to convey the compressed audio content. In the "worst case" scenario, where that capacity is fully utilized, there would be no spare Stuffing space left. According to statistical data from real audio content, however, Stuffing is typically available that can be altered and used for delivering additional viewing activity information from the receiver to the outside by a resident software meter preinstalled inside the receiver. The viewing activity information can be a detailed description of what the audience (viewer) is doing with the receiver and what the receiver is doing accordingly.

In the case of digital television, video and audio data in compressed form are carried in a bit-stream using a format specified by the Advanced Television Systems Committee (ATSC). The audio data uses Dolby's AC-3 compression algorithm, and the AC-3 bit-stream contains, besides the actual audio data, headers containing additional information such as synchronization, timing and sampling rate. The AC-3 stream is organized into frames, and each frame contains sufficient data to reconstruct audio corresponding approximately to the duration of a video frame. Each frame has a fixed size in terms of the total number of 16-bit words. At the end of a frame, in addition to a Cyclic Redundancy Check word (CRC2 in AC-3 terminology) designed to detect errors in the reception of the frame, there is a reserved field for inserting auxiliary data (AUXDATA). Use may be made of the AUXDATA field to carry program and station information relevant to TV audience metering. Preceding AUXDATA are two fields: AUXDATAE, a 1-bit flag which indicates that valid AUXDATA is present, and AUXDATAL, a 14-bit field which indicates how many bits of auxiliary data are present.
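
By way of example only, the following sketch reads the AUXDATAE flag, the AUXDATAL count, and the AUXDATA bits in the order described above (flag, then length, then data). Locating the starting bit position of these fields within a real AC-3 frame requires the full frame syntax and is assumed to be done elsewhere; the helper names are hypothetical.

    def read_bits(frame: bytes, start_bit: int, nbits: int) -> int:
        """Return nbits from the frame, most significant bit first."""
        value = 0
        for i in range(start_bit, start_bit + nbits):
            byte_index, bit_offset = divmod(i, 8)
            value = (value << 1) | ((frame[byte_index] >> (7 - bit_offset)) & 1)
        return value

    def read_auxdata(frame: bytes, start_bit: int):
        """Return the auxiliary data bits as an integer, or None if AUXDATAE is 0."""
        auxdatae = read_bits(frame, start_bit, 1)           # 1-bit validity flag
        if not auxdatae:
            return None
        auxdatal = read_bits(frame, start_bit + 1, 14)      # number of AUXDATA bits
        return read_bits(frame, start_bit + 15, auxdatal)   # the AUXDATA payload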

In the case of 48 kHz-sampled audio with 16 bits per sample, each AC-3 frame represents, in a compressed form, 6 "blocks" of audio. Each block is derived from 256 samples per channel. The number of channels can vary from 1 in the case of monophonic audio to 6 in the case of "5.1 channel surround sound". A multi-pass algorithm attempts to compress the data from each 256-sample block of audio in order to minimize the number of bits required to represent it. Since the frame size is fixed, at the end of the optimization process several bytes are usually available as "surplus"; in the bit stream these are indicated by SKIPLE, and if this bit is a "1" it means there are dummy bytes packed into the stream. Following this there is a 9-bit number, SKIPL, which defines the number of bytes to skip at this point in the stream.

Most current AC-3 bit-stream generators do not make use of the auxiliary data field, and as a result AUXDATAE is set to 0. In such cases, in order to utilize the AUXDATA feature, the stream may be modified by first examining each frame to determine the total number of SKIP bytes present in the frame. These bytes occur at the end of each block and are used to create the necessary space for AUXDATA at the end of the frame. By appropriately modifying the SKIPL values at the end of each block and repacking the bits, a desired amount of space can be created for AUXDATA. In the event that adequate bits to meet the AUXDATA requirements are not present, no change to the frame is made and AUXDATAE is set to 0. It may be noted that, depending on the nature of the audio, not every frame will be capable of carrying auxiliary data. In addition, the Cyclic Redundancy Check words CRC1 and CRC2 have to be recomputed after these changes are made.
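
As an illustrative sketch of the space-accounting decision described above (the bit-level repacking and the recomputation of CRC1 and CRC2 are omitted), one might proceed as follows, where the per-block SKIPL counts are assumed to have been obtained by a separate bit-stream parser:

    from typing import List, Optional

    def plan_auxdata(skip_bytes_per_block: List[int],
                     label: bytes) -> Optional[int]:
        """Return the number of surplus bytes to reclaim for the label, or None
        if the label does not fit (in which case the frame is left unchanged
        and AUXDATAE remains 0, as described above)."""
        surplus = sum(skip_bytes_per_block)   # SKIP bytes at the end of each block
        needed = len(label)
        return needed if surplus >= needed else None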

3. The Signal Identification Techniques

In view of the above discussion, a first technique for identifying a broadcast in a monitoring or audience-measurement system having a clock or other time keeping means operatively associated therewith includes the steps of reading a program-identifying label from an auxiliary data field of a digital audio signal frame 40, and associating the label with the time at which it was read as a time-stamped record that can be stored in a memory for subsequent communication to a central data processing facility. The second technique is similar to the first, but obtains the program-identifying label after decompressing a compressed digital audio signal.

In the first and second techniques, the number of bytes required for a program-identifying label can be relatively small. Thus, it may be possible to add a code to a signal frame even if no auxiliary data field is provided, or if the provided field is pre-empted for some other use. One could encode a program signal by writing the code in unused portions of the frame, e.g., in padding bytes that are otherwise ignored by the ordinary audio data processing operations.

A third technique, applicable to a tuned program signal, is to extract a characteristic feature, or signature, from the signal at both a statistically selected tuning site and at one or more monitoring site(s) arranged so as to monitor all broadcast signals that can be received at the tuning site. The candidate signature from the tuning site can then be compared to reference signatures from the monitoring site or sites in order to identify the tuned program by matching the signatures. Correspondingly, the broadcast of repeated program elements can be identified by comparing candidate signatures from a monitoring site with a library of reference signatures. The fourth technique is similar to the third technique, but extracts the candidate signature after decompressing a compressed digital audio signal, or from an originally uncompressed digital signal.

In the third and fourth techniques, the well-defined data formats used for the transmission of digital signals facilitate the identification of broadcast programs by comparing signatures. One way to compare signatures is to extract a predetermined field from a frame 10 of a digital audio signal at both a measurement site (tuning site and/or monitoring site) and a reference site. In a preferred embodiment of the invention, a checksum field (which is commonly a CRC checksum) is read from each digital audio frame and is associated with a read time output from a clock or other time keeping means in a step that forms all or part of a time-stamped record. The CRC is a desirable signature because it comprises a relatively small data field that is variable enough to yield a unique signature. It will be recognized, however, that many other portions of a data frame 10 (e.g., the non-program-labeling contents of an auxiliary data field) could equally well be used for this purpose.
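
By way of illustration, and assuming for simplicity that the chosen field is the final two bytes of each frame, the extraction of such a candidate signature and its pairing with a read time might be sketched as follows:

    import time
    from typing import Optional, Tuple

    def frame_signature(frame: bytes) -> bytes:
        """Use a fixed field of the frame as the candidate signature."""
        return frame[-2:]                     # e.g., a 16-bit checksum word

    def signature_record(frame: bytes,
                         read_time: Optional[float] = None) -> Tuple[float, bytes]:
        """Pair the extracted signature with the read time from the local clock."""
        if read_time is None:
            read_time = time.time()
        return (read_time, frame_signature(frame))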

The comparison of signatures extracted from analog signals usually requires circuitry or signal processing for handling both (i) temporal errors or drifts, and (ii) changes in the magnitudes of the signals acquired at two different locations. In the case of digital signals, however, the recognition or matching process is considerably simpler. Although the identifying algorithms must provide for "sliding" data blocks relative to each other along a time axis in order to accommodate temporal drifts or other time-keeping errors, there is no corresponding signal amplitude problem inasmuch as two matching signals will be substantially identical bit for bit.
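
A minimal sketch of such a matching step, with an illustrative window length and drift allowance, might take the following form: because digital signatures agree bit for bit, the comparison reduces to sliding one sequence of per-frame signatures over the other and looking for an exact run of agreement.

    from typing import Optional, Sequence

    def find_offset(candidate: Sequence[bytes], reference: Sequence[bytes],
                    window: int = 8, max_drift: int = 50) -> Optional[int]:
        """Return the frame offset at which a window of candidate signatures
        exactly matches the reference sequence, or None if no match is found."""
        probe = list(candidate[:window])
        if len(probe) < window:
            return None
        limit = min(max_drift, len(reference) - window)
        for offset in range(limit + 1):
            if list(reference[offset:offset + window]) == probe:
                return offset
        return None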

The preferred embodiment of the invention combines all four techniques: reading a program-identifying label from an auxiliary data field of a digital audio signal frame; reading a program-identifying label embedded as a code in a decompressed or uncompressed audio signal; reading a predetermined portion of the signal frame; and extracting a signature from a decompressed or uncompressed audio signal. The labels, if found, the predetermined portion of the signal frame, and the signature are associated with the local time at which the frame was read in order to generate a time-stamped tuning record for each frame of digital audio signal that is received. During operation, all four identification data may not be available at the same time. For example, the original audio signal may be an uncompressed (linear) digital audio signal, in which case only the second and fourth techniques may be employed. Even in the case of a compressed digital signal, a program label may be absent from the auxiliary data field for various reasons. As long as the system obtains at least one label or one signature, that datum is associated with the local time at which the frame was read in order to generate a time-stamped tuning record for each frame of digital audio signal that is received.

4. The Structure

In a preferred embodiment depicted in FIG. 2, a tuning or verification site 36 comprises a receiver 34, a tuning measurement apparatus 48, a clock or timing device 38, and a storage and forwarding apparatus 52. The tuning measurement apparatus 48 receives a digital audio signal output by a consumer's receiving hardware 34 at an industry-standard SP/DIF connector 50. The tuning measurement apparatus 48 decodes the signal to read a program-identifying label, if any, and to collect a predetermined portion of each signal frame for signature analysis (to be described below). The tuning measurement apparatus 48, the clock or timing device 38, and the storage and forwarding apparatus 52 (FIG. 2) may be embodied in a single computer, in a plurality of processors, or in hard-wired circuitry. These circuits may also be incorporated into the dwelling's set-top box, or provided as a stand-alone device.

The time-stamped records that are generated by the tuning measurement apparatus 48 are sent to a data storage and forwarding apparatus 52 that stores all or some subset of these records in a memory 44, for subsequent transmission over a public switched telephone network 54 to a central data collection facility 46 by means of a modem 53. It will be recognized that instead of a dial-up modem, other suitable communication means such as a cable modem, or a wireless data link could be used for this purpose.

In more detail, the tuning measurement apparatus 48 is configured to be connected to an SP/DIF connector 50 that is part of a consumer-owned receiving apparatus 34, which may be a digital television receiver, a set-top box feeding an analog signal to an NTSC receiver, or any other such receiving apparatus. As is provided for in industry standards, the signal at the SP/DIF connector may be either a linear PCM (uncompressed) digital audio signal at 64 kbit/sec or less, or a non-linear PCM encoded audio bitstream signal in the AC-3 format and having a bit rate of 384 kbit/sec. Although the standard allows for multiplexed AC-3 streams, the equipment that was initially placed on the market transmitted only one stream. If two or more streams were transmitted, the apparatus of the invention may be configured to select one of the AC-3 streams for decoding. It may be noted that, for any signal appearing at the SP/DIF connector, the apparatus 48 does not need to deal with the full DTV bitstream, which has a much higher bit rate of 19.2 Mbit/sec. Hence, the preferred embodiment is expected to be both less expensive and more reliable than alternate apparatus that acquires the full DTV signal from within the consumer's receiving equipment.

The records not stored in the memory 44 may be discarded in the interest of using the memory 44 efficiently while still providing the measurement's specified temporal signal granularity. For example, if an audience measurement is made with a guaranteed resolution of fifteen seconds from a digital broadcast signal having an interval of eight milliseconds between frame headers, the memory 44 need only retain one out of every 1875 records taken while the receiving equipment is in active use. Of course, no data need be collected when the equipment is not in use. Hence if the monitored SP/DIF connector always provides an output (e.g., as might be the case if an always-on set-top box digital receiver is used to provide a signal to an NTSC television), a separate on/off sensor 56 can provide an input to the storage and forwarding unit to indicate the time periods during which data are to be collected.
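
The thinning arithmetic of the example above may be sketched as follows; the fifteen-second resolution, eight-millisecond frame interval, and on/off gating are the values and behavior described in this paragraph, and the function name is hypothetical.

    def keep_record(frame_index: int, resolution_s: float = 15.0,
                    frame_interval_s: float = 0.008,
                    equipment_on: bool = True) -> bool:
        """Decide whether the record for this frame should be retained."""
        if not equipment_on:                 # gated by the on/off sensor 56
            return False
        decimation = int(round(resolution_s / frame_interval_s))   # 1875 here
        return frame_index % decimation == 0

With these values, keep_record returns True for one record out of every 1875 while the equipment is in use, and for none at all while it is off.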

The store-and-forward apparatus 52 in the preferred embodiment assembles time-stamped records from the inputs. As noted previously, the amount of data sent from the measurement apparatus 48, which is preferably taken from each frame of the audio signal, is far in excess of what is required for an audience measurement. Hence, the store-and-forward apparatus 52 may filter the inputs and generate time-stamped records from only some of them. A monitoring site used for measurement, on the other hand, may use essentially the same processes to collect and save all the available data for subsequent comparison with data from a plurality of statistically selected dwellings.

As shown in FIG. 3, in order to avoid attenuating the signal at the receiver output, the apparatus 48 preferably comprises a repeater circuit 64 that provides a repeated signal to a second SP/DIF connector 66 that can be used to furnish signals to other consumer-owned equipment. The signal input to the apparatus 48 is fed to a bit rate detector 68 used to determine whether the signal is an uncompressed (linear) PCM signal or an AC-3 signal, to route AC-3 signals to a frame decoder 70, and to route uncompressed PCM signals to both an embedded code reader circuit 72 (designed to extract embedded audio codes from the signal) and a signature extractor circuit 74. If the signal is in the AC-3 format, the frame decoder 70 provides the contents of an auxiliary data field (if one is present) and a copy of a predetermined portion of the frame (e.g., the CRC field) as outputs to the store-and-forward apparatus 52. The frame decoder 70 also has an output to a decompression circuit 76 that supplies a digital signal to the embedded code reader 72 and the signature extractor 74. The signature extractor 74 may supply the candidate signature from the decompressed signal, or the candidate signature from the uncompressed signal, to the storage and forwarding apparatus 52.
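
The routing just described may be sketched as follows, with the individual blocks of FIG. 3 represented by hypothetical callables supplied by the caller; only the dispatch logic follows the description above.

    def process_audio(signal, is_ac3, decode_frame, decompress,
                      read_embedded_code, extract_signature):
        """Dispatch the input signal to the appropriate processing paths."""
        results = {}
        if is_ac3(signal):
            aux_label, frame_sig, frames = decode_frame(signal)  # frame decoder 70
            results["aux_label"] = aux_label
            results["frame_signature"] = frame_sig
            pcm = decompress(frames)                             # decompressor 76
        else:
            pcm = signal                                         # already linear PCM
        results["embedded_code"] = read_embedded_code(pcm)       # code reader 72
        results["audio_signature"] = extract_signature(pcm)      # extractor 74
        return results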

The preferred embodiment may collect data from a linear PCM audio signal, which may be originally uncompressed, or obtained from the transmitted non-linear PCM encoded audio bitstream signal by a decompression process that is part of the standard operation of recovering the signal that was compressed prior to transmission. The third and fourth techniques mentioned previously can be employed in these kinds of situations.

U.S. patent application Ser. No. 09/116,397, filed Jul. 16, 1998 and assigned to the assignee of this application, U.S. patent application Ser. No. 09/428,425, filed Oct. 27, 1999, and U.S. patent application Ser. No. 09/543,480, filed Apr. 6, 2000 (each of which is incorporated herein by reference) disclose methods and apparatus for encoding audio signals by spectral modulation. These coding arrangements are selected so that the code survives subsequent compression and decompression and is hence compatible with various digital signal transmission standards. It will be recognized that other coding arrangements that have been (and will be) developed may also satisfy this requirement. Hence, the preferred embodiment of the invention attempts to recover an encoded program label from the PCM audio signal, which may be a decompressed audio signal. In other arrangements, of course, audio codes may be recovered from an analog audio signal, such as one recovered from a microphone adjacent a speaker.

As discussed earlier, it is known in the broadcast measurement arts to extract signatures from video and/or audio signals and to compare these with reference signatures extracted by similar means from known program signals. In the preferred embodiment of the invention, a signature is extracted from a linear PCM audio signal and is stored, with other identification data, as a time-stamped tuning record. Although the preferred arrangement calls for extracting this signature from a PCM digital audio data stream, those skilled in the art will recognize that one could also elect to convert the digital audio signal to an analog signal and then extract the signature by methods such as those described in U.S. Pat. Nos. 4,697,209 and 4,677,466 (both of which are incorporated herein by reference).

5. The Process

The preferred embodiment of the invention runs four identification processes on a received signal and generally operates in a parallel fashion, as depicted in the flow chart of FIG. 4. It will be recognized that inasmuch as any one of those processes can yield a positive identification of a program signal, fewer identification processes can also be used. Moreover, instead of collecting all of the identification data all of the time, one could elect to set up a hierarchical collection scheme, such as the one disclosed by Thomas et al. in U.S. Pat. No. 5,481,294 (incorporated herein by reference), that initially looks for a preferred identification datum and collects an alternate datum only if the first one is not available. For example, one could configure a system in which a program label was initially sought in an AUX field 31 of a digital signal frame and in which an embedded code was sought in the decompressed digital audio signal only if the attempt to read the AUX field failed.
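
Such a hierarchical scheme might be sketched as follows, with the reader functions treated as hypothetical placeholders: the preferred datum (the label in the AUX field) is sought first, and the embedded code in the decompressed audio is sought only if that attempt fails.

    def hierarchical_identify(frame, pcm, read_aux_label, read_embedded_code):
        """Seek the preferred identification datum first; fall back if absent."""
        label = read_aux_label(frame)          # attempt to read the AUX field
        if label is not None:
            return ("aux_label", label)
        code = read_embedded_code(pcm)         # only tried if the first attempt fails
        if code is not None:
            return ("embedded_code", code)
        return (None, None)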

Moreover, one could also elect to run multiple identification processes and to then select from among the available data at the time a time-stamped record was assembled (step 42). For example, if a positive identification label was read from an AUX field, that code and the CRC signature could be saved as the time-stamped record. To continue with the example, if no AUX code was found but an embedded code was read, the CRC signature and the embedded code could be saved for retrieval by the central data collection facility.

In FIG. 4, the digital audio signal associated with the program is obtained at the SP/DIF connection in step S1. The program-identifying label is read from the auxiliary data field of a digital audio signal frame in step S2. A checksum field (which is commonly a CRC checksum) is read from each digital audio frame and is associated with a read time output from a clock or other time keeping means 38 in a step S3 that forms all or part of a time-stamped record. If the received digital audio signal is a compressed signal, it is decompressed at step S4. In step S5, an encoded program label is recovered from the decompressed (PCM) audio signal. In step S6, a signature is extracted from the linear PCM audio signal, and it is stored, with any other identification data, as a time-stamped tuning record in step S7. While the preferred embodiment conducts four signal identification procedures in parallel, any combination of two or more of these procedures could be conducted in parallel or in series. For example, the process could first determine whether the received signal was compressed, and then invoke the appropriate program-identifying and signature extraction steps. One could also elect to run any combination of these identification procedures, and then select from among the available data at the time the time-stamped record was assembled.
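
The main loop of FIG. 4 might be sketched, in simplified sequential form and with each step delegated to a hypothetical helper, as follows:

    import time

    def main_loop(get_frame, is_compressed, read_aux_label, read_checksum,
                  decompress, read_embedded_code, extract_signature, store):
        """Simplified sequential rendering of steps S1 through S7."""
        while True:
            frame = get_frame()                      # S1: obtain signal at SP/DIF
            if frame is None:
                break
            label = read_aux_label(frame)            # S2: label from the aux field
            read_time = time.time()
            checksum = read_checksum(frame)          # S3: CRC field plus read time
            pcm = decompress(frame) if is_compressed(frame) else frame   # S4
            code = read_embedded_code(pcm)           # S5: embedded program label
            signature = extract_signature(pcm)       # S6: signature from PCM audio
            store((read_time, label, checksum, code, signature))         # S7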

As mentioned previously, the program label (e.g., identifying label) in the aux data field may be encoded by the broadcasters and/or program producers. As an alternative program identifying code, the broadcasters and/or any other participant in the distribution system may also insert PSIP (Program System Information Protocol) data, and/or Content Identification (Content ID) data, and/or other useful data in the aux data field. As yet another alternative program identifying code, the broadcasters and/or other participants in the distribution system may also copy program-related data from bitstream data areas outside the aux data field into the aux data field. The statistical availability of Stuffing can also be utilized to deliver viewing activity information. Furthermore, the SP/DIF standard allows for non-audio data instead of, or in addition to, the non-linear PCM encoded audio bitstreams. The digital signatures and auxiliary codes may be effectively applied to identify or monitor non-audio data that includes Internet and other data transmission applications. All such alternatives are within the scope of the appended claims.

6. Conclusion

Thus, what has been described is a digital audio signal recognition system and method which accurately and reliably detects digital audio signals. The system according to the present invention will find use in verification sites, reference library sites, audience monitoring dwellings, and in any site where the monitoring, storing, and/or comparison of digital audio signals is required.

The individual components shown in block or schematic form in the Drawings are all well-known in the signal processing arts or are described in the documents incorporated herein by reference, and their specific construction and/or operation are not critical to the operation or best mode for carrying out the invention.

While the present invention has been described with respect to what is presently considered to be the preferred embodiments, it is to be understood that the invention is not limited to the disclosed embodiments. To the contrary, the invention is intended to cover various modifications and equivalent structures and functions included within the spirit and scope of the appended claims.

Claims

1. Apparatus for identifying a digital audio signal, comprising:

an input which receives the digital audio signal, wherein the digital audio signal contains data from a broadcast or stored digital program signal; and
structure configured to (i) identify a program-identifying code in a compressed portion of the received digital audio signal, (ii) identify a program-identifying code in a decompressed or uncompressed received digital audio signal, (iii) identify a feature signature in a compressed portion of the received digital audio signal, and (iv) identify a feature signature in the decompressed or uncompressed received digital audio signal.

2. Apparatus according to claim 1, wherein said structure time-stamps the identified signals.

3. Apparatus according to claim 2, further comprising a clock which supplies a clock signal to said structure.

4. Apparatus according to claim 2, further comprising a memory for storing the time-stamped identified signals.

5. Apparatus according to claim 4, further comprising transmission structure for transmitting the stored, time-stamped identified signals to a collection site.

6. Apparatus according to claim 1, wherein said structure is disposed in a dwelling.

7. Apparatus according to claim 1, wherein said structure is disposed in a verification site.

8. Apparatus according to claim 1, wherein said structure is disposed in a monitoring site.

9. Apparatus according to claim 1, further comprising a tuner which receives a broadcast program signal and outputs the digital audio signal to said input.

10. Apparatus according to claim 1, wherein the program-identifying code is placed in an aux data field in the digital audio signal by a broadcaster and/or a program producer.

11. Apparatus according to claim 10, wherein the program identifying code is Program System Information Protocol (PSIP) data, Content ID data, and/or other useful data which are placed in the aux data field by the broadcaster and/or any other participant in the distribution system.

12. Apparatus according to claim 10, wherein the program identifying code is program related data which are copied from bitstream data areas outside the aux data field into the aux data field by the broadcaster and/or other participant in the distribution system.

13. Apparatus according to claim 1, wherein stuffing data in the digital audio signal is altered with data which corresponds to viewing activity information.

14. Apparatus according to claim 1, wherein the program-identifying code and the feature signature correspond to an Internet transmission.

15. A method for identifying a digital audio signal, the method comprising:

receiving the digital audio signal at an input, wherein the digital audio signal contains data from a broadcast or stored digital program signal; and
configuring processing structure to (i) identify a program-identifying code in a compressed portion of the received digital audio signal, (ii) identify a program-identifying code in a decompressed received digital audio signal, (iii) identify a feature signature in a compressed portion of the received digital audio signal, and (iv) identify a feature signature in the decompressed received digital audio signal.

16. A method according to claim 15, wherein said processing structure time-stamps the identified signals.

17. A method according to claim 16, further comprising supplying a clock signal to said processing structure.

18. A method according to claim 16, further comprising storing the time-stamped identified signals.

19. A method according to claim 18, further comprising transmitting the stored, time-stamped identified signals to a collection site.

20. A method according to claim 15, wherein said processing structure is disposed in a dwelling.

21. A method according to claim 15, wherein said processing structure is disposed in a verification site.

22. A method according to claim 15, wherein said processing structure is disposed in a monitoring site.

23. A method according to claim 15, further comprising using a tuner to receive a broadcast program signal and output the digital audio signal to said input.

24. A computer readable storage medium which causes one or more computers to carry out a method for identifying a digital audio signal, the computer readable storage medium causing the one or more computers to:

input the digital audio signal, wherein the digital audio signal contains data from a broadcast or stored digital program signal; and
configure processing structure to (i) identify a program-identifying code in a compressed portion of the received digital audio signal, (ii) identify a program-identifying code in a decompressed received digital audio signal, (iii) identify a feature signature in a compressed portion of the received digital audio signal, and (iv) identify a feature signature in the decompressed received digital audio signal.

25. A computer readable storage medium according to claim 24, wherein said processing structure is configured to time-stamp the identified signals.

26. A computer readable storage medium according to claim 25, further comprising supplying a clock signal to said processing structure.

27. A computer readable storage medium according to claim 25, further comprising storing the time-stamped identified signals.

28. A computer readable storage medium according to claim 27, further comprising transmitting the stored, time-stamped identified signals to a collection site.

29. A computer readable storage medium according to claim 24, wherein said processing structure is disposed in a dwelling.

30. A computer readable storage medium according to claim 24, wherein said processing structure is disposed in a verification site.

31. A computer readable storage medium according to claim 24, wherein said processing structure is disposed in a monitoring site.

32. A computer readable storage medium according to claim 24, further comprising using a tuner to receive a broadcast program signal and output the digital audio signal to said processing structure.

Referenced Cited
U.S. Patent Documents
3372233 March 1968 Currey
3845391 October 1974 Crosby
3919479 November 1975 Moon et al.
4025851 May 24, 1977 Haselwood et al.
4027332 May 31, 1977 Wu et al.
4230990 October 28, 1980 Lert, Jr. et al.
4367488 January 4, 1983 Leventer et al.
4379947 April 12, 1983 Warner
4547804 October 15, 1985 Greenberg
4599644 July 8, 1986 Fischer
4622583 November 11, 1986 Watanabe et al.
4633302 December 30, 1986 Damoci
4639779 January 27, 1987 Greenberg
4647974 March 3, 1987 Butler et al.
4677466 June 30, 1987 Lert, Jr. et al.
4697209 September 29, 1987 Kiewit et al.
4703476 October 27, 1987 Howard
4718106 January 5, 1988 Weinblatt
4739398 April 19, 1988 Thomas et al.
4764808 August 16, 1988 Solar
4816904 March 28, 1989 McKenna et al.
4843562 June 27, 1989 Kenyon et al.
4847685 July 11, 1989 Gall
4858000 August 15, 1989 Lu
4885632 December 5, 1989 Mabey et al.
4907079 March 6, 1990 Turner et al.
4912552 March 27, 1990 Allison, III et al.
4931871 June 5, 1990 Kramer
4939515 July 3, 1990 Adelson
4943973 July 24, 1990 Werner
4945412 July 31, 1990 Kramer
4955070 September 4, 1990 Welsh et al.
4963998 October 16, 1990 Maufe
4965825 October 23, 1990 Harvey et al.
4969041 November 6, 1990 O'Grady et al.
4972503 November 20, 1990 Zurlinden
5003591 March 26, 1991 Kauffman et al.
5019899 May 28, 1991 Boles et al.
5079648 January 7, 1992 Maufe
5113437 May 12, 1992 Best et al.
5113518 May 12, 1992 Durst, Jr. et al.
5155762 October 13, 1992 Croquet et al.
5165069 November 17, 1992 Vitt et al.
5200822 April 6, 1993 Bronfin et al.
5251324 October 5, 1993 McMullan, Jr.
5278988 January 11, 1994 Dejean et al.
5294977 March 15, 1994 Fisher et al.
5319453 June 7, 1994 Copriviza et al.
5319735 June 7, 1994 Preuss et al.
5373315 December 13, 1994 Dufresne et al.
5379345 January 3, 1995 Greenberg
5404160 April 4, 1995 Schober et al.
5404161 April 4, 1995 Douglass et al.
5404377 April 4, 1995 Moses
5425100 June 13, 1995 Thomas et al.
5432558 July 11, 1995 Kim
5436653 July 25, 1995 Ellis et al.
5437050 July 25, 1995 Lamb et al.
5450122 September 12, 1995 Keene
5450490 September 12, 1995 Jensen et al.
5455630 October 3, 1995 McFarland et al.
5457807 October 10, 1995 Weinblatt
5463423 October 31, 1995 Tults
5473631 December 5, 1995 Moses
5481294 January 2, 1996 Thomas et al.
5481370 January 2, 1996 Kim
5485518 January 16, 1996 Hunter et al.
5488408 January 30, 1996 Maduzia et al.
5490060 February 6, 1996 Malec et al.
5495282 February 27, 1996 Mostafa et al.
5497185 March 5, 1996 Dufresne et al.
5504518 April 2, 1996 Ellis et al.
5512933 April 30, 1996 Wheatley et al.
5526427 June 11, 1996 Thomas et al.
5532732 July 2, 1996 Yuen et al.
5559549 September 24, 1996 Hendricks et al.
5572246 November 5, 1996 Ellis et al.
5574962 November 12, 1996 Fardeau et al.
5579124 November 26, 1996 Aijala et al.
5581658 December 3, 1996 O'Hagan et al.
5581800 December 3, 1996 Fardeau et al.
5585858 December 17, 1996 Harper et al.
5594934 January 14, 1997 Lu et al.
5600364 February 4, 1997 Hendricks et al.
5608445 March 4, 1997 Mischler
5612729 March 18, 1997 Ellis et al.
5612943 March 18, 1997 Moses et al.
5621454 April 15, 1997 Ellis et al.
5629739 May 13, 1997 Dougherty
5630203 May 13, 1997 Weinblatt
5646675 July 8, 1997 Copriviza et al.
5650825 July 22, 1997 Naimpally et al.
5652615 July 29, 1997 Bryant et al.
5654748 August 5, 1997 Matthews, III
5659350 August 19, 1997 Hendricks et al.
5676282 October 14, 1997 Satterfield
5682463 October 28, 1997 Allen et al.
5689509 November 18, 1997 Gaytan et al.
5701582 December 23, 1997 DeBey
5719634 February 17, 1998 Keery et al.
5726701 March 10, 1998 Needham
5734444 March 31, 1998 Yoshinobu
5737025 April 7, 1998 Dougherty et al.
5757414 May 26, 1998 Thorne
5764763 June 9, 1998 Jensen et al.
5768426 June 16, 1998 Rhoads
5771307 June 23, 1998 Lu et al.
5774452 June 30, 1998 Wolosewicz
5787334 July 28, 1998 Fardeau et al.
5793414 August 11, 1998 Shaffer
5793759 August 11, 1998 Rakib et al.
5798785 August 25, 1998 Hendricks et al.
5809160 September 15, 1998 Powell et al.
5815297 September 29, 1998 Ciciora
5815671 September 29, 1998 Morrison
5819156 October 6, 1998 Belmont
5822360 October 13, 1998 Lee et al.
5822436 October 13, 1998 Rhoads
5825976 October 20, 1998 Dorward et al.
5826164 October 20, 1998 Weinblatt
5842010 November 24, 1998 Jain et al.
5848155 December 8, 1998 Cox
5848396 December 8, 1998 Gerace
5850249 December 15, 1998 Massetti et al.
5857190 January 5, 1999 Brown
5872588 February 16, 1999 Aras et al.
5880789 March 9, 1999 Inaba
5881360 March 9, 1999 Fong
5889548 March 30, 1999 Chan
5898836 April 27, 1999 Freivald et al.
5905865 May 18, 1999 Palmer et al.
5910988 June 8, 1999 Ballard
5914746 June 22, 1999 Matthews, III et al.
5915027 June 22, 1999 Cox et al.
5930369 July 27, 1999 Cox et al.
5940135 August 17, 1999 Petrovic et al.
5966120 October 12, 1999 Arazi et al.
5973625 October 26, 1999 Nam
5973750 October 26, 1999 Ogawa et al.
5974299 October 26, 1999 Massetti
5978842 November 2, 1999 Noble et al.
5999689 December 7, 1999 Iggulden
6002443 December 14, 1999 Iggulden
6029045 February 22, 2000 Picco et al.
6031577 February 29, 2000 Ozkan et al.
6035177 March 7, 2000 Moses et al.
6061793 May 9, 2000 Tewfik et al.
6064748 May 16, 2000 Hogan
6069914 May 30, 2000 Cox
6072888 June 6, 2000 Powell et al.
6081783 June 27, 2000 Divine et al.
6085066 July 4, 2000 Fong
6112053 August 29, 2000 Dunki-Jacobs et al.
6118873 September 12, 2000 Lotspiech et al.
6124877 September 26, 2000 Schmidt
6128736 October 3, 2000 Miller
6130726 October 10, 2000 Darbee et al.
6151578 November 21, 2000 Bourcet et al.
6154484 November 28, 2000 Lee et al.
6154571 November 28, 2000 Cox et al.
6157719 December 5, 2000 Wasilewski et al.
6160570 December 12, 2000 Sitnik
6169843 January 2, 2001 Lenihan et al.
6175639 January 16, 2001 Satoh et al.
6184918 February 6, 2001 Goldschmidt Iki et al.
6192427 February 20, 2001 Li et al.
6202210 March 13, 2001 Ludtke
6208735 March 27, 2001 Cox et al.
6215526 April 10, 2001 Barton et al.
6219634 April 17, 2001 Levine
6229924 May 8, 2001 Rhoads et al.
6243481 June 5, 2001 Tao
6252631 June 26, 2001 Lakhami
6259443 July 10, 2001 Williams, Jr.
6259801 July 10, 2001 Wakasu
6263087 July 17, 2001 Miller
6266096 July 24, 2001 Gutsmann et al.
6268866 July 31, 2001 Shibata
6272176 August 7, 2001 Srinivasan
6278792 August 21, 2001 Cox et al.
6286140 September 4, 2001 Ivanyi
6289514 September 11, 2001 Link et al.
6298142 October 2, 2001 Nakano et al.
6301369 October 9, 2001 Powell et al.
6307950 October 23, 2001 Powell et al.
6332194 December 18, 2001 Bloom et al.
6338037 January 8, 2002 Todd et al.
6343138 January 29, 2002 Rhoads
6353929 March 5, 2002 Houston
6359573 March 19, 2002 Taruguchi et al.
6373960 April 16, 2002 Conover et al.
6385330 May 7, 2002 Powell et al.
6400827 June 4, 2002 Rhoads
6404898 June 11, 2002 Rhoads
6415041 July 2, 2002 Oami et al.
6421445 July 16, 2002 Jensen et al.
6421450 July 16, 2002 Nakano
6424726 July 23, 2002 Nakano et al.
6424816 July 23, 2002 Stephens
6438252 August 20, 2002 Miller
6453053 September 17, 2002 Wakasu
6456724 September 24, 2002 Watanabe
6459803 October 1, 2002 Powell et al.
6467089 October 15, 2002 Aust et al.
6469749 October 22, 2002 Dimitrova et al.
6470090 October 22, 2002 Oami et al.
6473903 October 29, 2002 Balakrishnan et al.
6477707 November 5, 2002 King et al.
6493457 December 10, 2002 Quackenbush et al.
6496591 December 17, 2002 Rhoads
6505223 January 7, 2003 Haitsma et al.
6507299 January 14, 2003 Nuijten
6510233 January 21, 2003 Nakano
6510557 January 21, 2003 Thrift
6512796 January 28, 2003 Sherwood
6513161 January 28, 2003 Wheeler et al.
6519769 February 11, 2003 Hopple et al.
6523175 February 18, 2003 Chan
6530082 March 4, 2003 Del Sesto et al.
6539095 March 25, 2003 Rhoads
6542620 April 1, 2003 Rhoads
6553070 April 22, 2003 Hashimoto
6560350 May 6, 2003 Rhoads
6567780 May 20, 2003 Rhoads
6574350 June 3, 2003 Rhoads
6574594 June 3, 2003 Pitman et al.
6577346 June 10, 2003 Perlman
6584138 June 24, 2003 Neubauer et al.
6590998 July 8, 2003 Rhoads
6597405 July 22, 2003 Iggulden
6604072 August 5, 2003 Pitman et al.
6611607 August 26, 2003 Davis et al.
6614915 September 2, 2003 Powell et al.
6633651 October 14, 2003 Hirzalla et al.
6647129 November 11, 2003 Rhoads
6647130 November 11, 2003 Rhoads
6647548 November 11, 2003 Lu et al.
6664976 December 16, 2003 Lofgren et al.
6665419 December 16, 2003 Oami
6668068 December 23, 2003 Hashimoto
6675383 January 6, 2004 Wheeler et al.
6678392 January 13, 2004 Powell et al.
6678740 January 13, 2004 Rakib et al.
6697499 February 24, 2004 Oami
6700993 March 2, 2004 Minematsu
6724911 April 20, 2004 Cox et al.
6735325 May 11, 2004 Wakasu
6735775 May 11, 2004 Massetti
6738493 May 18, 2004 Cox et al.
6738744 May 18, 2004 Kirovski et al.
6738978 May 18, 2004 Hendricks et al.
6751221 June 15, 2004 Saito et al.
6754377 June 22, 2004 Rhoads
6785399 August 31, 2004 Fujihara
6788800 September 7, 2004 Carr et al.
6798893 September 28, 2004 Tanaka
6807528 October 19, 2004 Truman et al.
6826289 November 30, 2004 Hashimoto
6829368 December 7, 2004 Meyer et al.
6834345 December 21, 2004 Bloom et al.
6853737 February 8, 2005 Watanabe
6856693 February 15, 2005 Miller
6915000 July 5, 2005 Tanaka
6928165 August 9, 2005 Takai
6947562 September 20, 2005 Hasimoto
6947572 September 20, 2005 Terasaki
6959386 October 25, 2005 Rhoads
6968564 November 22, 2005 Srinivasan
6996249 February 7, 2006 Miller et al.
7007167 February 28, 2006 Kurahashi
7027611 April 11, 2006 Hashimoto
7050604 May 23, 2006 Fujihara et al.
7051207 May 23, 2006 Watanabe
7092546 August 15, 2006 Tanaka
7114073 September 26, 2006 Watanabe
7146501 December 5, 2006 Tanaka
7149324 December 12, 2006 Tanaka
7159117 January 2, 2007 Tanaka
7316025 January 1, 2008 Aijala et al.
7587601 September 8, 2009 Levy et al.
20010005823 June 28, 2001 Fischer et al.
20010026616 October 4, 2001 Tanaka
20010031064 October 18, 2001 Donescu
20010047478 November 29, 2001 Mase
20010053235 December 20, 2001 Sato
20010053237 December 20, 2001 Hashimoto
20020006203 January 17, 2002 Tachibana
20020010919 January 24, 2002 Lu et al.
20020026635 February 28, 2002 Wheeler et al.
20020059577 May 16, 2002 Lu et al.
20020059632 May 16, 2002 Link et al.
20020085736 July 4, 2002 Kalker
20020085737 July 4, 2002 Kitamura
20020097891 July 25, 2002 Hinishi
20020106106 August 8, 2002 Sato
20020114299 August 22, 2002 Lu et al.
20020120925 August 29, 2002 Logan
20020129253 September 12, 2002 Langelaar
20020147990 October 10, 2002 Lu et al.
20020178410 November 28, 2002 Haitsma et al.
20020186768 December 12, 2002 Dimitrova et al.
20030005430 January 2, 2003 Kolessar
20030023967 January 30, 2003 Kim
20030037333 February 20, 2003 Ghashghai et al.
20030051249 March 13, 2003 Hoang
20030066070 April 3, 2003 Houston
20030081781 May 1, 2003 Jensen et al.
20030086341 May 8, 2003 Wells et al.
20030131350 July 10, 2003 Peiffer et al.
20030133590 July 17, 2003 Miller et al.
20030169879 September 11, 2003 Akins et al.
20040059918 March 25, 2004 Xu
20040122679 June 24, 2004 Neuhauser et al.
20040181799 September 16, 2004 Lu et al.
20040210922 October 21, 2004 Peiffer et al.
20050020223 January 27, 2005 Ellis et al.
20050177361 August 11, 2005 Srinivasan
20050196151 September 8, 2005 Takagi et al.
20060171474 August 3, 2006 Ramaswamy et al.
20070040934 February 22, 2007 Ramaswamy et al.
Foreign Patent Documents
678163 November 1993 AU
747044 September 2000 AU
2004201423 April 2004 AU
718227 November 2006 AU
0161512 April 1985 EP
255383 May 1985 EP
0161512 November 1985 EP
0195639 March 1986 EP
0210609 July 1986 EP
0687083 December 1995 EP
0703683 March 1996 EP
0840511 May 1998 EP
0986047 March 2000 EP
0913952 July 2000 EP
1041767 October 2000 EP
1043854 October 2000 EP
2717025 September 2005 FR
2 063 018 May 1981 GB
2063018 October 1981 GB
2196167 April 1988 GB
2260246 April 1993 GB
2292506 February 1996 GB
2338869 December 1999 GB
01-177796 July 1989 JP
3173291 July 1991 JP
09-083470 March 1997 JP
10-145823 May 1998 JP
8-508617 October 1999 JP
11-275032 October 1999 JP
2000517141 December 2000 JP
1993-006665 July 1993 KR
8810540 December 1988 WO
9111062 July 1991 WO
WO 93/07689 April 1993 WO
9322875 November 1993 WO
9411989 May 1994 WO
9512278 April 1995 WO
9515653 June 1995 WO
9527349 October 1995 WO
9832251 July 1998 WO
9959275 November 1999 WO
9962260 December 1999 WO
9963443 December 1999 WO
WO 01/61892 August 2001 WO
0237316 May 2002 WO
02091361 November 2002 WO
WO 02/091361 November 2002 WO
03007235 January 2003 WO
03060630 July 2003 WO
WO 2004/038538 May 2004 WO
2005002200 January 2005 WO
2005006768 January 2005 WO
2005008582 January 2005 WO
2005099385 October 2005 WO
Other references
  • International Preliminary Examination Report for international application PCT/US02/32043.
  • International Search Report for international application PCT/US02/32043.
  • English translation of Information Statement of Video Research filed in counterpart Japanese application No. 2000-548980.
  • Advanced Television Systems Committee. “Program and System Information Protocol For Terrestrial Broadcast and Cable (Revision A) and Amendment No. 1. Doc. A/65A” May 31, 2000.
  • Advanced Television Systems Committee T3/S8 Transport Specialist Group. “Annex G—An Overview of PSIP For Cable.” ATSC T3/S8 Doc. 268. Apr. 3, 1998.
  • Advanced Television Systems Committee. “Guide to the Use of the ATSC Digital Television Standard.” pp. 1-148 (2003).
  • Advanced Television Systems Committee. “Program/Episode/Version Identification. ATSC Standard.” Doc. A/57. Aug. 30, 1996.
  • European Patent Office, Communication Pursuant to Article 115(2) EPC for European Patent Application No. 00114272.8, with attached Observation of third party (Aug. 14, 2003).
  • European Patent Office, European Search Report, date of completion Oct. 3, 2000, Application No. EP 00 11 4271.
  • European Patent Office, European Search Report, date of completion Oct. 5, 2000, Application No. EP 00 11 4272.
  • International Search Report, dated Jan. 7, 1999, Application No. PCT/US98/14286.
  • V. Balabanian et al. An Introduction To Digital Storage Media—Command and Control (DSM-CC). IEEE Communications Magazine, Nov. 1996.
  • International Search Report for PCT/US97/09218, dated Oct. 17, 1997.
  • International Search Report for PCT/US94/11795, dated Feb. 24, 1995.
  • Watson, M.; Buettner, P. “Design and Implementation of AAC Decoders.” Institute of Electrical and Electronics Engineers, Inc. 2000. pp. 408-409.
  • Allison, A. “PSIP 101: What You Need to Know.” Broadcastengineering.com. Jun. 2001. pp. 1, 142, 144.
  • Advanced Television Systems Committee. “All About PSIP: Roadmap to the DTV Program and System Information Protocol.” Broadcast Engineering. Jun. 2001. pp. 1-4.
  • International Search Report for PCT/US02/32043, issued Aug. 21, 2003, 4 pages.
  • International Preliminary Examination Report for international application PCT/US02/32043, mailed Feb. 26, 2004, 4 pages.
  • “International Search Report.” International Searching Authority for PCT/US2002/03705, Sep. 11, 2002. pp. 1-2.
  • Official Action issued in U.S. Appl. No. 11/535,269, Jan. 9, 2009, 11 pages.
  • Official Action issued in U.S. Appl. No. 11/535,269, Jun. 25, 2009, 9 pages.
  • Intellectual Property Corporation of Malaysia, Substantive/Modified Substantive Examination Adverse Report for MY Application No. PI20051568, mailed May 22, 2009, 5 pages.
  • Patent Cooperation Treaty, Written Opinion for PCT/US2004/018953, Apr. 29, 2005, 7 pages.
  • Patent Cooperation Treaty, International Search Report for PCT/US2004/018953, Apr. 29, 2005, 2 pages.
  • Official Action issued in U.S. Appl. No. 09/973,893, Jul. 24, 2008, 8 pages.
  • Official Action issued in U.S. Appl. No. 09/973,893, Jan. 7, 2008, 9 pages.
  • Schneider et al., “A Robust Content Based Digital Signature for Image Authentication”, 1996.
  • “The Growth of Dolby Digital (AC-3) in Consumer Products”, May 23, 2002, http://www.dolby.com/digital/growth.html.
  • “The Selection of Audio Coding Technologies for Digital Delivery Systems”, May 23, 2002, http://www.dolby.com/dvd/sel-code.html.
  • “The AC-3 Multichannel Coder”, May 23, 2002, http://www.dolby.com/tech/ac-3mult.html.
  • International Search Report for PCT/US05/05271, dated Jan. 22, 2004, 3 sheets.
  • Patent Cooperation Treaty, International Preliminary Report on Patentability PCT/US03/22562, Oct. 30, 2007 (3 Pages).
  • Canadian Intellectual Property Office, first office action issued May 25, 2009, in connection with related Canadian Application No. 2,530,012 (5 pages).
  • Mexican Institute of Industrial Property, second office action issued Dec. 2, 2008, in connection with related Mexican Application No. PA/a/2005/014162 (3 pages).
  • Official Action, issued in U.S. Appl. No. 11/312,789, Nov. 17, 2009, 18 pages.
  • Official Action, issued in U.S. Appl. No. 11/312,789, Apr. 29, 2009, 18 pages.
  • Official Action, issued in U.S. Appl. No. 11/312,789, Oct. 21, 2008, 17 pages.
  • Chinese language and English Translation of “Notice of Completion of Formalities for Patent Registration”, issued by the State Intellectual Property Office of China on Aug. 1, 2008, 5 pages.
  • Office Action, issued by the Instituto Mexicano de la Propiedad Industrial (Mexican Institute of Industrial Property) on Feb. 5, 2009, 2 pages.
  • CiteSeer, “Content-Based Digital Watermarking for Compressed Audio (2000)”, 1995-2000, citeseer.ist.psu.edu, 1 page.
  • “Digital Audio Watermarking,” Audio Media, Jan./Feb. 1998. pp. 56, 57, 59, and 61.
  • Fraunhofer IIS, “Audio & Multimedia Watermarking”, http://www.iis.fraunhofer.de/amm/techin/water/, 1998-2006, 7 pages.
  • Official Action issued in U.S. Appl. No. 11/535,269, Jul. 30, 2008, 10 pages.
  • Herre, Juergen, “Audio Watermarking in the Bitstream Domain”, IEEE, Jun. 12-13, 2000, 23 pages.
  • Patent Cooperation Treaty, International Preliminary Examination Report, for PCT/US03/28037, Mar. 15, 2007, 4 pages.
  • Patent Cooperation Treaty, International Search Report for PCT/US2003/28037, Feb. 28, 2005, 3 pages.
  • Patent Cooperation Treaty, Written Opinion for PCT/US2003/28037, Oct. 12, 2006, 4 pages.
  • The Arbitron Company, Arbitron Patent Specification ZA 92-7317; Method and Apparatus for Automatically Identifying a Program Including Sound Signal, Sep. 24, 1992, 30 pages.
  • Official Action issued in U.S. Appl. No. 11/237,251, Nov. 24, 2009, 9 pages.
  • DeCarmo, Linden, “Pirates on the Airwaves,” Sep. 1999, 7 pages.
  • Cheung, W.N., “Digital Image Watermarking in Spatial and Transform Domains,” Sep. 3, 2000, 5 pages.
  • Abdulaziz, Nidhal and Pang, K. Khee, “Wavelet Transform and Channel Coding for Hiding in Video,” Oct. 29, 2001-Nov. 1, 2001, 6 pages.
  • Liang, Jianxun and Hu, Bo, “Video Watermarking Combining with Hybrid Coding Scheme,” Apr. 19, 2005, 4 pages.
  • Patent Cooperation Treaty, International Search Report for PCT/US2004/18645, Apr. 19, 2005, 4 pages.
  • Patent Cooperation Treaty, Written Opinion for PCT/US2004/18645, Apr. 19, 2005, 5 pages.
  • Official Action issued in U.S. Appl. No. 11/312,789, Sep. 3, 2009, 2 pages.
Patent History
Patent number: 7742737
Type: Grant
Filed: Oct 9, 2002
Date of Patent: Jun 22, 2010
Patent Publication Number: 20040210922
Assignee: The Nielsen Company (US), LLC (Schaumburg, IL)
Inventors: John C. Peiffer (New Port Richey, FL), Michael A. Hicks (Clearwater, FL), David H. Wright (Safety Harbor, FL), Paul M. Mears (Safety Harbor, FL), Venugopal Srinivasan (Palm Harbor, FL), Daozheng Lu (Dunedin, FL), Paul C. Kempter (Palm Harbor, FL)
Primary Examiner: Tan Trinh
Attorney: Hanley, Flight & Zimmerman, LLC
Application Number: 10/492,471