System and method for providing information of selectable objects in a television program

- Broadcom Corporation

A system and method for providing information of selectable objects in a television program as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS/INCORPORATION BY REFERENCE

This patent application is related to and claims priority from provisional patent application Ser. No. 61/242,234 filed Sep. 14, 2009, and titled “TELEVISION SYSTEM,” the contents of which are hereby incorporated herein by reference in their entirety. This patent application is also related to U.S. patent application Ser. No. 12/881,004, filed concurrently herewith, titled “SYSTEM AND METHOD FOR PROVIDING INFORMATION OF SELECTABLE OBJECTS IN A TELEVISION PROGRAM IN AN INFORMATION STREAM INDEPENDENT OF THE TELEVISION PROGRAM”; and U.S. patent application Ser. No. 12/881,031, filed concurrently herewith, titled “SYSTEM AND METHOD FOR PROVIDING INFORMATION OF SELECTABLE OBJECTS IN A STILL IMAGE FILE AND/OR DATA STREAM”. This patent application is further related to U.S. patent application Ser. No. 12/774,380, filed May 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,832, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A DISTRIBUTED SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,866, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION RECEIVER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,911, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,945, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/851,036, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/851,075, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A PARALLEL TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”. The contents of each of the above-mentioned applications are hereby incorporated herein by reference in their entirety.

FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

[Not Applicable]

SEQUENCE LISTING

[Not Applicable]

MICROFICHE/COPYRIGHT REFERENCE

[Not Applicable]

BACKGROUND OF THE INVENTION

Present television systems are incapable of providing for and/or conveniently providing for user-selection of objects in a television program. Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.

BRIEF SUMMARY OF THE INVENTION

Various aspects of the present invention provide a system and method for providing information of selectable objects in a television program, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims. These and other advantages, aspects and novel features of the present invention, as well as details of illustrative aspects thereof, will be more fully understood from the following description and drawings.

BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS

FIG. 1 is a diagram illustrating an exemplary television system, in accordance with various aspects of the present invention.

FIG. 2 is a flow diagram illustrating an exemplary method for providing embedded information of selectable objects in a television program, in accordance with various aspects of the present invention.

FIG. 3 is a flow diagram illustrating an exemplary method for providing embedded information of selectable objects in a television program, in accordance with various aspects of the present invention.

FIG. 4 is a diagram illustrating an exemplary television system, in accordance with various aspects of the present invention.

FIG. 5 is a diagram illustrating exemplary modules and/or sub-modules for a television system, in accordance with various aspects of the present invention.

DETAILED DESCRIPTION OF VARIOUS ASPECTS OF THE INVENTION

The following discussion will refer to various communication modules, components or circuits. Such modules, components or circuits may generally comprise hardware and/or a combination of hardware and software (e.g., including firmware). Such modules may also, for example, comprise a computer readable medium (e.g., a non-transitory medium) comprising instructions (e.g., software instructions) that, when executed by a processor, cause the processor to perform various functional aspects of the present invention. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular hardware and/or software implementations of a module, component or circuit unless explicitly claimed as such. For example and without limitation, various aspects of the present invention may be implemented by one or more processors (e.g., a microprocessor, digital signal processor, baseband processor, microcontroller, etc.) executing software instructions (e.g., stored in volatile and/or non-volatile memory). Also for example, various aspects of the present invention may be implemented by an application-specific integrated circuit (“ASIC”) and/or other hardware components.

Additionally, the following discussion will refer to various television system modules (e.g., television modules, television receiver modules, television controller modules, modules of a user's local television system, modules of a geographically distributed television system, etc.). It should be noted that the following discussion of such various modules is segmented into such modules for the sake of illustrative clarity. However, in actual implementation, the boundaries between various modules may be blurred. For example, any or all of the functional modules discussed herein may share various hardware and/or software components. For example, any or all of the functional modules discussed herein may be implemented wholly or in-part by a shared processor executing software instructions. Additionally, various software sub-modules that may be executed by one or more processors may be shared between various software modules. Accordingly, the scope of various aspects of the present invention should not be limited by arbitrary boundaries between various hardware and/or software components, unless explicitly claimed.

The following discussion may also refer to communication networks and various aspects thereof. For the following discussion, a communication network is generally the communication infrastructure through which a communication device (e.g., a portable communication device, television, television control device, television provider, television programming provider, television receiver, video recording device, etc.) may communicate with other systems. For example and without limitation, a communication network may comprise a cable and/or satellite television communication network, a cellular communication network, a wireless metropolitan area network (WMAN), a wireless local area network (WLAN), a wireless personal area network (WPAN), a general data communication network (e.g., the Internet), any home or premises communication network, etc. A particular communication network may, for example, generally have a corresponding communication protocol according to which a communication device may communicate with the communication network. Unless so claimed, the scope of various aspects of the present invention should not be limited by characteristics of a particular type of communication network.

The following discussion may at times refer to an on-screen pointing location. Such a pointing location refers to a location on the television screen (e.g., a primary television screen, a secondary television screen, etc.) to which a user (either directly or with a pointing device) is pointing. Such a pointing location is to be distinguished from other types of on-screen location identification, such as, for example, using arrow keys and/or a mouse to move a cursor or to traverse blocks (e.g., on an on-screen program guide) without pointing. Various aspects of the present invention, while referring to on-screen pointing location, are also readily extensible to such other forms of on-screen location identification.

Additionally, the following discussion will at times refer to television programming. Such television programming generally includes various types of television programming (e.g., television programs, news programs, sports programs, music television, movies, television series programs and/or associated advertisements, educational programs, live or recorded television programming, broadcast/multicast/unicast television programming, etc.). Such television programming may, for example, comprise real-time television broadcast programming (or multicast or unicast television programming) and/or user-stored television programming that is stored in a user device (e.g., a VCR, PVR, etc.). Such television programming video content is to be distinguished from other non-programming video content that may be displayed on a television screen (e.g., an electronic program guide, user interface menu, a television set-up menu, a typical web page, a document, a graphical video game, etc.). Various aspects of the present invention may, for example in a television program source system and/or television program distribution system, comprise embedding information in a television program, where such information describes various aspects of user-selectable objects in the television program. Various aspects of the present invention may also, for example in a television, comprise receiving television programming, presenting such received television programming to a user, determining an on-screen pointing location pointed to by the user and processing information of user-selectable objects embedded in the received television programming to identify a user-selected object in the television programming and/or associated actions.

Also, the following discussion will at times refer to user-selectable objects in television programming. Such user-selectable objects include both animate (i.e., living) and inanimate (i.e., non-living) objects, both still and moving. Such objects may, for example, comprise characteristics of any of a variety of objects present in television programming. Such objects may, for example and without limitation, comprise inanimate objects, such as consumer good objects (e.g., clothing, automobiles, shoes, jewelry, furniture, food, beverages, appliances, electronics, toys, artwork, cosmetics, recreational vehicles, sports equipment, safety equipment, computer equipment, communication devices, books, etc.), premises objects (e.g., business locations, stores, hotels, signs, doors, buildings, landmarks, historical sites, entertainment venues, hospitals, government buildings, etc.), objects related to services (e.g., objects related to transportation, objects related to emergency services, objects related to general government services, objects related to entertainment services, objects related to food and/or drink services, etc.), objects related to location (e.g., parks, landmarks, streets, signs, road signs, etc.), etc. Such objects may, for example, comprise animate objects, such as people (e.g., actors/actresses, athletes, musicians, salespeople, commentators, reporters, analysts, hosts/hostesses, entertainers, etc.), animals (e.g., pets, zoo animals, wild animals, etc.) and plants (e.g., flowers, trees, shrubs, fruits, vegetables, cacti, etc.).

Turning first to FIG. 1, such figure is a diagram illustrating a non-limiting exemplary television system 100 in accordance with various aspects of the present invention. The exemplary system 100 includes a television provider 110. The television provider 110 may, for example, comprise a television network company, a cable company, a movie-providing company, a news company, an educational institution, etc. The television provider 110 may, for example, be an original source of television programming (or related information). Also for example, the television provider 110 may be a communication company that provides television programming distribution services (e.g., a cable television company, a satellite television company, a telecommunication company, a data network provider, etc.). The television provider 110 may, for example, provide television programming and non-programming information and/or video content. The television provider 110 may, for example, provide information related to a television program (e.g., information describing or otherwise related to selectable objects in programming, etc.). As will be discussed below in more detail, the television provider 110 may operate to create a television program (or television program data set, television program data stream, etc.) that includes embedded information of user-selectable objects in the television program. For example and without limitation, such a television provider 110 may operate to receive a completed television program (e.g., a data file, a data stream, etc.), for example via a communication network and/or on a physical media, and embed information of user-selectable objects in the completed television program. Also for example, such a television provider 110 may operate to form the original television program and embed information of user-selectable objects in the original television program during such formation (e.g., in the studio).

The exemplary television system 100 may also include a third party program information provider 120. Such a provider may, for example, provide information related to a television program. Such information may, for example, comprise information describing user-selectable objects in programming, program guide information, etc. As will be discussed below in more detail, such a third party program information provider (e.g., a party independent of a television program source, television program network operator, etc.) may operate to create a television program (or television program data set, television program data stream, etc.) that includes embedded information of user-selectable objects in the television program. For example and without limitation, such a third party program information provider 120 may operate to receive a completed television program (e.g., a data file, a data stream, etc.), for example via a communication network and/or on a physical media, and embed information of user-selectable objects in the completed television program.

The exemplary television system 100 may include one or more communication networks (e.g., the communication network(s) 130). The exemplary communication network 130 may comprise characteristics of any of a variety of types of communication networks over which television programming and/or information related to television programming may be communicated. For example and without limitation, the communication network 130 may comprise characteristics of any one or more of: a cable television network, a satellite television network, a telecommunication network, the Internet, a local area network (LAN), a personal area network (PAN), a metropolitan area network (MAN), any of a variety of different types of home networks, etc.

The exemplary television system 100 may include a first television 140. Such a first television 140 may, for example, comprise networking capability enabling such television 140 to communicate directly with the communication network 130. For example, the first television 140 may comprise one or more embedded television receivers or transceivers (e.g., a cable television receiver, satellite television transceiver, Internet modem, etc.). Also for example, the first television 140 may comprise one or more recording devices (e.g., for recording and/or playing back video content, television programming, etc.). The first television 140 may, for example, operate to (which includes “operate when enabled to”) perform any or all of the functionality discussed herein. The first television 140 may, for example, operate to receive and process television program information (e.g., via a communication network, stored on a physical medium or computer readable medium, etc.), where such television program information comprises embedded information of user-selectable objects.

The exemplary television system 100 may include a first television controller 160. Such a first television controller 160 may, for example, operate to (e.g., which may include “operate when enabled to”) control operation of the first television 140. The first television controller 160 may comprise characteristics of any of a variety of television controlling devices. For example and without limitation, the first television controller 160 may comprise characteristics of a dedicated television control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.

The first television controller 160 (or television control device) may, for example, transmit signals directly to the first television 140 to control operation of the first television 140. The first television controller 160 may also, for example, operate to transmit signals (e.g., via the communication network 130) to the television provider 110 to control television programming (or related information) being provided to the first television 140, or to conduct other transactions (e.g., business transactions, etc.).

As will be discussed in more detail later, the first television controller 160 may operate to communicate screen pointing information with the first television 140 and/or other devices. Also, as will be discussed in more detail later, various aspects of the present invention include a user pointing to a location on a television screen (e.g., pointing to an animate or inanimate object presented in television programming). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a television control device. The first television controller 160 provides a non-limiting example of a device that a user may utilize to point to an on-screen location.

Additionally, for example in a scenario in which the first television controller 160 comprises an on-board display, the first television controller 160 may operate to receive and process television program information (e.g., via a communication network, stored on a physical medium or computer readable medium, etc.), where such television program information comprises embedded information of user-selectable objects.

As will be mentioned throughout the following discussion, various aspects of the invention will be performed by one or more devices, components and/or modules of a user's local television system. The first television 140 and first television controller 160 provide a non-limiting example of a user's local television system. Such a user's local television system, for example, generally refers to the television-related devices that are local to the television system currently being utilized by the user. For example, when a user is utilizing a television system located at the user's home, the user's local television system generally refers to the television-related devices that make up the user's home television system. Also for example, when a user is utilizing a television system at a premises away from the user's home (e.g., at another home, at a hotel, at an office, etc.), the user's local television system generally refers to the television-related devices that make up the premises television system. Such a user's local television system does not, for example, comprise television network infrastructure devices that are generally outside of the user's current premises (e.g., cable and/or satellite head-end apparatus, cable and/or satellite intermediate communication network nodes) and/or programming source devices that are generally managed by television enterprises and generally exist outside of the user's home. Such entities, which may be communicatively coupled to the user's local television system, may be considered to be entities remote from the user's local television system (or “remote entities”).

The exemplary television system 100 may also include a television receiver 151. The television receiver 151 may, for example, operate to (e.g., which may include “operate when enabled to”) provide a communication link between a television and/or television controller and a communication network and/or information provider. For example, the television receiver 151 may operate to provide a communication link between the second television 141 and the communication network 130, or between the second television 141 and the television provider 110 (and/or third party program information provider 120) via the communication network 130.

The television receiver 151 may comprise characteristics of any of a variety of types of television receivers. For example and without limitation, the television receiver 151 may comprise characteristics of a cable television receiver, a satellite television receiver, etc. Also for example, the television receiver 151 may comprise a data communication network modem for data network communications (e.g., with the Internet, a LAN, PAN, MAN, telecommunication network, etc.). The television receiver 151 may also, for example, comprise recording capability (e.g., programming recording and playback, etc.).

Additionally, for example in a scenario in which the television receiver 151 comprises an on-board display and/or provides audio/video information to a television communicatively coupled thereto, the television receiver 151 may operate to receive and process television program information (e.g., via a communication network, stored on a physical medium or computer readable medium, etc.), where such television program information comprises embedded information of user-selectable objects.

The exemplary television system 100 may include a second television controller 161. Such a second television controller 161 may, for example, operate to (e.g., which may include “operate when enabled to”) control operation of the second television 141 and the television receiver 151. The second television controller 161 may comprise characteristics of any of a variety of television controlling devices. For example and without limitation, the second television controller 161 may comprise characteristics of a dedicated television control device, a dedicated television receiver control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.

The second television controller 161 may, for example, operate to transmit signals directly to the second television 141 to control operation of the second television 141. The second television controller 161 may, for example, operate to transmit signals directly to the television receiver 151 to control operation of the television receiver 151. The second television controller 161 may additionally, for example, operate to transmit signals (e.g., via the television receiver 151 and the communication network 130) to the television provider 110 to control television programming (or related information) being provided to the television receiver 151, or to conduct other transactions (e.g., business transactions, etc.).

As will be discussed in more detail later, various aspects of the present invention include a user selecting a user-selectable object in programming. Such selection may, for example, comprise the user pointing to a location on a television screen (e.g., pointing to an animate or inanimate object presented in television programming). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a television control device. The second television controller 161 provides one non-limiting example of a device that a user may utilize to point to an on-screen location. Also, in a scenario in which the second television controller 161 comprises a touch screen, a user may touch a location of such touch screen to point to an on-screen location (e.g., to select a user-selectable object).

As will be mentioned throughout the following discussion, and as mentioned previously in the discussion of the first television 140 and television controller 160, various aspects of the invention will be performed by one or more devices, components and/or modules of a user's local television system. The second television 141, television receiver 151 and second television controller 161 provide another non-limiting example of a user's local television system.

Additionally, for example in a scenario in which the second television controller 161 comprises an on-board display, the second television controller 161 may operate to receive and process television program information (e.g., via a communication network, stored on a physical medium or computer readable medium, etc.), where such television program information comprises embedded information of user-selectable objects.

The exemplary television system 100 was presented to provide a non-limiting illustrative foundation for discussion of various aspects of the present invention. Thus, the scope of various aspects of the present invention should not be limited by any characteristics of the exemplary television system 100 unless explicitly claimed.

FIG. 2 is a flow diagram illustrating an exemplary method 200 for providing embedded information of selectable objects in a television program, in accordance with various aspects of the present invention. Any or all aspects of the exemplary method 200 may, for example, be implemented in a television system component (e.g., the television provider 110, third party program information provider 120, a component of a communication network 130, first television 140, first television controller 160, second television 141, television receiver 151, second television controller 161, shown in FIG. 1 and discussed previously) and/or a plurality of such television system components operating in conjunction. For example, any or all aspects of the exemplary method 200 may be implemented in one or more television system components remote from the user's local television system. Also for example, any or all aspects of the exemplary method 200 may be implemented in one or more components of the user's local television system.

The exemplary method 200 may, for example, begin executing at step 205. The exemplary method 200 may begin executing in response to any of a variety of causes and/or conditions, non-limiting examples of which will now be provided. For example, the exemplary method 200 may begin executing in response to a user command to begin (e.g., a user at a television program source, a user at a television production studio, a user at a television distribution enterprise, etc.), in response to television program information and/or information of user-selectable objects in a television program arriving at a system entity implementing the method 200, in response to an electronic request communicated from an external entity to a system entity implementing the method 200, in response to a timer, in response to a request from an end user and/or a component of a user's local television system for a television program including information of user-selectable objects, in response to a request from a user for a television program where such user is associated in a database with television programming comprising user-selectable objects, upon reset and/or power-up of a system component implementing the exemplary method 200, in response to identification of a user and/or user equipment for which object selection capability is to be provided, in response to user payment of a fee, etc.

The exemplary method 200 may, for example at step 210, comprise receiving moving picture information for a television program. Many non-limiting examples of such television programs were provided above. Note that, depending on the particular implementation, such moving picture information may also, for example, be received with corresponding audio information.

Step 210 may comprise receiving the moving picture information from any of a variety of sources, non-limiting examples of which will now be provided. For example and without limitation, step 210 may comprise receiving the moving picture information from a television broadcasting company, from a movie streaming company, from a television studio, from a television program database or server, from a video camera or other video recording device, an Internet television programming provider, etc.

Step 210 may comprise receiving the moving picture information via any of a variety of types of communication networks. Such networks may, for example, comprise a wireless television network (e.g., terrestrial and/or satellite) and/or cable television network. Such networks may, for example, comprise any of a variety of general data communication networks (e.g., the Internet, a local area network, a personal area network, a metropolitan area network, etc.).

Step 210 may comprise receiving the moving picture information from any of a variety of types of hard media (e.g., optical storage media, magnetic storage media, etc.). Such hard media may, for example, comprise characteristics of optical storage media (e.g., compact disc, digital versatile disc, Blu-ray®, laser disc, etc.), magnetic storage media (e.g., hard disc, diskette, magnetic tape, etc.), and/or computer memory devices (e.g., flash memory, one-time-programmable memory, read-only memory, random access memory, thumb drive, etc.). Such memory may, for example, be a temporary and/or permanent component of the system entity implementing the method 200. For example, in a scenario including the utilization of such hard media, step 210 may comprise receiving the moving picture information from such a device and/or from a reader of such a device (e.g., directly via an end-to-end conductor or via a communication network).

In an exemplary scenario, step 210 may comprise receiving a completed moving picture data set for the television program, the completed moving picture data set formatted for communicating the television program without information describing user-selectable objects in the television program. For example, the received completed moving picture data set may be in conformance with a moving picture standard (e.g., MPEG, MPEG-2, MPEG-4, MPEG-4 AVC, DVD, way, etc.). For example, such a data set may be a data file (or set of logically linked data files) formatted in an MPEG or DVD format for normal presentation on a user's local television system. Such a data set of a television program, when received at step 210, might not have information of user-selectable objects in the television program. Such information of user-selectable objects may then, for example, be added, as will be explained below.

In another exemplary scenario, step 210 may comprise receiving moving picture information for the television program prior to the moving picture information being formatted into a completed moving picture data set for communicating the television program. In an exemplary implementation, step 210 may comprise receiving moving picture information (e.g., frame-by-frame bitmaps, partially encoded moving picture information, etc.) that will be formatted in accordance with a moving picture standard, but which has not yet been so formatted. Such a data set of a television program, when received at step 210, might not have information of user-selectable objects in the television program. Such information of user-selectable objects may then, for example, be added, as will be explained below.

In yet another exemplary scenario, step 210 may comprise receiving a completed moving picture data set for the television program, the completed moving picture data set formatted for communicating the television program with information describing user-selectable objects in the television program. For example, the received completed moving picture data set may be in conformance with a moving picture standard (e.g., MPEG, MPEG-2, MPEG-4, MPEG-4 AVC, DVD, way, etc.), or a variant thereof, that specifically accommodates information of user-selectable objects in the television program. Also for example, the received completed moving picture data set may be in conformance with a moving picture standard (e.g., MPEG, MPEG-2, MPEG-4, MPEG-4 AVC, DVD, way, etc.), or a variant thereof, that while not specifically accommodating information of user-selectable objects in the television program, allows for the incorporation of such information in unassigned data fields. For example, such a data set may be a data file (or set of logically linked data files) formatted in an MPEG or DVD format for normal presentation on a user's local television system. Such a data set of a television program, when received at step 210, might comprise information of user-selectable objects in the television program. Such information of user-selectable objects may then, for example, be deleted, modified and/or appended, as will be explained below.
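
For purely illustrative purposes, the following Python sketch shows how a system entity implementing step 210 might distinguish among the three receiving scenarios described above. It assumes the received television program has already been parsed into a hypothetical dictionary-based container in which formatted moving picture data appears under a "frames" field and user-selectable object information, when present, appears under a "selectable_objects" field; these field names and the container layout are assumptions for illustration only and do not correspond to any particular moving picture standard.

def classify_received_program(program):
    """Classify a received television program data set (hypothetical
    dictionary-based container) according to the scenarios of step 210."""
    has_frames = "frames" in program            # formatted moving picture data present
    has_objects = bool(program.get("selectable_objects"))

    if has_frames and has_objects:
        # Completed data set already carrying user-selectable object information;
        # such information may later be deleted, modified and/or appended (step 230).
        return "completed_with_object_info"
    if has_frames:
        # Completed data set without object information; such information may be added later.
        return "completed_without_object_info"
    # Raw moving picture information (e.g., frame-by-frame bitmaps) that has not
    # yet been formatted into a completed moving picture data set.
    return "unformatted_moving_picture_info"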

Step 210 may, for example, comprise receiving the moving picture information in digital and/or analog signals. Though the examples provided above generally concerned the receipt of digital data, such examples are readily extendible to the receipt of analog moving picture information (e.g., the receipt of composite and/or component video signals, etc.).

In general, step 210 may comprise receiving moving picture information for a television program. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of moving picture information or by any particular manner of receiving moving picture information unless explicitly claimed.

The exemplary method 200 may, at step 220, comprise receiving object information corresponding to a user-selectable object in the television program. Many non-limiting examples of receiving such object information will now be provided.

Step 220 may comprise receiving the user-selectable object information from any of a variety of sources, non-limiting examples of which will now be provided. For example and without limitation, step 220 may comprise receiving the user-selectable object information from a television broadcasting company, from a movie streaming company, from a television studio, from a television program database or server, from an advertising company, from a commercial enterprise associated with a user-selectable object in a television program, from a person or organization associated with a user-selectable object in a television program, from an Internet television programming provider, from a third party television program information source, etc.

Step 220 may comprise receiving the user-selectable object information from a plurality of independent sources. For example, in an exemplary scenario in which a television program includes user-selectable objects corresponding to a plurality of respective interested parties (e.g., respective product sponsors, respective leagues or other associations, respective people, etc.), step 220 may comprise receiving the user-selectable object information from each of such respective interested parties. For example, step 220 may comprise receiving user-selectable object information corresponding to a user-selectable consumer good in a television program from a provider of such consumer good, receiving user-selectable object information corresponding to an entertainer in the television program from the entertainer's management company, receiving user-selectable object information corresponding to a user-selectable historical landmark in the television program from a society associated with the historical landmark, receiving user-selectable object information corresponding to a user-selectable object in the television program associated with a service from a provider of such service, etc. In such a multiple-source scenario, step 220 may comprise aggregating the user-selectable object information received from the plurality of sources (e.g., into a single user-selectable object data set) for ultimate combination of such user-selectable object information with received moving picture information.
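
As a minimal sketch of such aggregation (and not a definitive implementation), the following Python fragment collects user-selectable object information received from a plurality of hypothetical sources into a single data set, tagging each record with its provenance. The record layout, field names, and source names are assumptions for illustration only.

def aggregate_object_information(source_feeds):
    """Aggregate user-selectable object information received from a plurality of
    independent sources into a single data set (step 220).

    `source_feeds` is assumed to be an iterable of (source_name, records) pairs,
    where each record is a dictionary describing one user-selectable object; the
    record layout is hypothetical and for illustration only."""
    combined = []
    for source_name, records in source_feeds:
        for record in records:
            entry = dict(record)
            entry["provided_by"] = source_name   # retain the provenance of each record
            combined.append(entry)
    return {"selectable_objects": combined}

# Example usage with hypothetical sources:
aggregated = aggregate_object_information([
    ("consumer_good_provider", [{"object_id": "shoe-42", "label": "running shoe"}]),
    ("entertainer_management", [{"object_id": "host-1", "label": "program host"}]),
])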

Step 220 may, for example, comprise receiving the user-selectable object information from a same source as that from which the moving picture information was received at step 210 or may comprise receiving the user-selectable object information from a different source. For example and without limitation, step 220 may comprise receiving the user-selectable object information from an advertising company, while step 210 comprises receiving the moving picture information from a television studio. In another example, step 220 may comprise receiving the user-selectable object information from a commercial enterprise associated with a consumer good object presented in the television program, while step 210 comprises receiving the moving picture information from a head-end server of a sports network.

In yet another example, step 220 may comprise receiving the user-selectable object information directly from a computer process that generates such information. For example, an operator may play a moving picture (e.g., at a normal rate, a slower-than-normal rate, frame-by-frame, etc.) and utilize graphical tools (e.g., boxes or other polygons, edge detection routines, etc.) to define and track movement of a user-selectable object in the moving picture. Such a computer process may then output information describing the object and/or movement thereof in the moving picture. Step 220 may comprise receiving the information output from such process.
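
The following Python fragment is a minimal sketch, under assumed inputs, of the output stage of such a computer process: it converts rectangles that an operator has drawn around an object while stepping through the moving picture into a per-frame track describing the object and its movement. The coordinate convention, field names, and the assumption that the operator's rectangles are supplied as a mapping from frame number to (x, y, width, height) are illustrative only.

def record_object_track(object_id, operator_boxes):
    """Produce user-selectable object information from an operator-driven
    annotation pass (a sketch of the computer process described above, not an
    actual authoring tool).

    `operator_boxes` is assumed to map a frame number to the rectangle
    (x, y, width, height), in frame coordinates, drawn around the object."""
    track = []
    previous = None
    for frame_number in sorted(operator_boxes):
        x, y, w, h = operator_boxes[frame_number]
        entry = {"frame": frame_number, "box": (x, y, w, h)}
        if previous is not None:
            # Movement relative to the previous annotated frame.
            entry["delta"] = (x - previous[0], y - previous[1])
        previous = (x, y)
        track.append(entry)
    return {"object_id": object_id, "track": track}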

Step 220 may comprise receiving the user-selectable object information via any of a variety of types of communication networks. Such networks may, for example, comprise a wireless television network (e.g., terrestrial and/or satellite) and/or cable television network. Such networks may, for example, comprise any of variety of general data communication networks (e.g., the Internet, a local area network, a personal area network, a metropolitan area network, etc.).

Step 220 may, for example, comprise receiving the user-selectable object information via a same communication network as that via which the moving picture information was received at step 210 or may comprise receiving the user-selectable object information from a different communication network. For example and without limitation, step 220 may comprise receiving the user-selectable object information via a general data communication network (e.g., the Internet), while step 210 comprises receiving the moving picture information via a television network. In another example, step 220 may comprise receiving the user-selectable object information via a general data network, while step 210 comprises receiving the moving picture information from a computer readable medium.

Step 220 may comprise receiving the user-selectable object information from any of a variety of types of hard media (e.g., optical storage media, magnetic storage media, etc.). Such hard media may, for example, comprise characteristics of optical storage media (e.g., compact disc, digital versatile disc, Blueray®, laser disc, etc.), magnetic storage media (e.g., hard disc, diskette, magnetic tape, etc.), computer memory device (e.g., flash memory, one-time-programmable memory, read-only memory, random access memory, thumb drive, etc.). Such memory may, for example, be a temporary and/or permanent component of the system entity implementing the method 200. For example, in a scenario including the utilization of such hard media, step 220 may comprise receiving the user-selectable object information from such a device and/or from a reader of such a device (e.g., directly via an end-to-end conductor or via a communication network).

The object information corresponding to one or more user-selectable objects that is received at step 220 may comprise any of a variety of characteristics, non-limiting examples of which will now be provided.

For example, such user-selectable object information may comprise information describing and/or defining the user-selectable object that is shown in the television program. Such information may, for example, be processed by a recipient of such information to identify an object that is being selected by a user. Such information may, for example, comprise information describing boundaries associated with a user-selectable object in the television program (e.g., actual object boundaries (e.g., an object outline), areas generally coinciding with a user-selectable object (e.g., a description of one or more geometric shapes that generally correspond to a user-selectable object), selection areas that when selected indicate user-selection of a user-selectable object (e.g., a superset and/or subset of a user-selectable object in the television program), etc.). Such information may, for example, describe and/or define the user-selectable object in a television program frame coordinate system.

Such information describing and/or defining the user-selectable object that is shown in the television program may comprise information describing movement of a user-selectable object in the television program. For example, such information may comprise information describing the location of the object on a frame-by-frame basis, information describing movement of a user-selectable object in television screen coordinates as a function of time and/or frame, information describing location of a user-selectable object in a video frame relative to a previous object location in a previous video frame, etc.
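
One hypothetical way to represent the boundary and movement information discussed above, in a television program frame coordinate system, is sketched below in Python. The record layout and field names are assumptions for illustration and are not part of any defined format.

# Hypothetical per-object record combining the boundary descriptions and the
# frame-by-frame movement information discussed above (illustrative only).
selectable_object_record = {
    "object_id": "landmark-7",
    # Actual object boundary: an outline polygon in frame coordinates.
    "outline": [(312, 140), (398, 140), (398, 260), (312, 260)],
    # Area generally coinciding with the object: one or more simple geometric shapes.
    "approximate_shapes": [{"type": "rectangle", "x": 310, "y": 138, "w": 90, "h": 125}],
    # Selection area (here a superset of the object) that, when pointed to,
    # indicates user-selection of the object.
    "selection_area": {"type": "rectangle", "x": 300, "y": 130, "w": 110, "h": 145},
    # Movement described on a frame-by-frame basis (frame number -> offset of the
    # object relative to its location in the previous video frame).
    "movement": {1201: (0, 0), 1202: (4, -1), 1203: (5, -1)},
}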

Many examples of such object description information are provided in a variety of related U.S. patent applications. For example, as mentioned previously, U.S. patent application Ser. No. 12/774,380, filed May 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,832, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A DISTRIBUTED SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,866, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION RECEIVER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,911, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,945, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/851,036, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; and U.S. patent application Ser. No. 12/851,075, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A PARALLEL TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; which are hereby incorporated herein by reference in their entirety, provide many examples of information describing (or otherwise related to) user-selectable objects in television programming.

Also for example, such user-selectable object information may comprise information describing the object, where such information may be presented to the user upon user-selection of a user selectable object. For example, such object information may comprise information describing physical characteristics of a user-selectable object, background information, historical information, general information of interest, location information, financial information, travel information, commerce information, personal information, etc.

Additionally for example, such user-selectable object information may comprise information describing and/or defining actions that may be taken upon user-selection of a user-selectable object. Non-limiting examples of such actions and/or related information corresponding to a respective user-selectable object will now be presented.

For example, such user-selectable object information may comprise information describing one or more manners of determining information to present to the user (e.g., retrieving such information from a known location, conducting a search for such information, etc.), establishing a communication session by which a user may interact with networked entities associated with a user-selected object, interacting with a user regarding display of a user-selected object and/or associated information, etc.

For example, such user-selectable object information may comprise information describing one or more manners of obtaining one or more sets of information, where such information may then, for example, be presented to the user. For example, such information may comprise a memory address (or data storage address) and/or a communication network address (e.g., an address of a networked data server, a URL, etc.), where such address may correspond to a location at which information corresponding to the identified object may be obtained. Such information may, for example, comprise a network address of a component with which a communication session may be initiated and/or conducted (e.g., to obtain information regarding the user-selected object, to interact with the user regarding the selected object, etc.).

In an exemplary scenario in which the user-selectable object information comprises information to present to a user upon user-selection of a selectable object in a television program, such information may comprise any of a variety of different types of information related to the user-selected object. For example and without limitation, such information may comprise information describing the user-selectable object (e.g., information describing aspects of the object, history of the object, design of the object, source of the object, price of the object, critiques of the object, information provided by commercial enterprises producing and/or providing such object, etc.), information indicating to the user how the user may obtain the selected object, information indicating how the user may utilize the selected object, etc. The information may, for example, comprise information of one or more non-commercial organizations associated with, and/or having information pertaining to, the identified user-selected object (e.g., non-profit and/or government organization contact information, web site address information, etc.).

In another exemplary scenario, the information corresponding to a user-selectable object in the television program may comprise information related to conducting a search for information corresponding to the user-selectable object. Such information may, for example, comprise network search terms that may be utilized in a search engine to search for information corresponding to the user-selected object. Such information may also comprise information describing the network boundaries of such a search, for example, identifying particular search networks, particular servers, particular addresses, particular databases, etc.

In an exemplary scenario, the information corresponding to a user-selectable object may describe a manner in which a system is to interact with a user to more clearly identify information desired by the user. For example, such information may comprise information specifying user interaction that should take place when an amount of information available and corresponding to a user-selectable object exceeds a particular threshold. Such user interaction may, for example, help to reduce the amount of information that may ultimately be presented to the user. For example, such information may comprise information describing a user interface that provides a list (or menu) of types of information available to the user and solicits information from the user regarding the selection of one or more of the listed types of information.
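
As a minimal sketch (assuming a console-style interaction and a simple list-of-dictionaries representation of the available information), the following Python fragment illustrates such a user interaction: when the amount of available information exceeds a threshold, a menu of information types is presented and the result is narrowed to the types the user selects. The threshold value, field names, and prompt wording are illustrative assumptions.

def select_information_types(available_items, threshold=10):
    """Present a menu of information types when the amount of information
    corresponding to a selected object exceeds a threshold (illustrative sketch)."""
    if len(available_items) <= threshold:
        return available_items
    types = sorted({item["type"] for item in available_items})
    print("Available information types:")
    for index, info_type in enumerate(types, start=1):
        print(f"  {index}. {info_type}")
    chosen = input("Enter the numbers of the desired types, separated by commas: ")
    wanted = {types[int(n) - 1] for n in chosen.split(",")
              if n.strip().isdigit() and 0 < int(n) <= len(types)}
    # Narrow the information ultimately presented to the user-selected types.
    return [item for item in available_items if item["type"] in wanted]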

In yet another exemplary scenario, in which an action associated with a user-selectable object comprises the establishment and/or management of a communication session between the user and one or more networked entities, the user-selectable object information may comprise information describing the manner in which a communication session may be established and/or managed.

In still another exemplary scenario, in which an action associated with a user-selectable object comprises providing a user interface by which a user may initiate and perform a commercial transaction regarding a user-selectable object, the user-selectable object information may comprise information describing the manner in which the commercial transaction is to be performed (e.g., order forms, financial information exchange, order tracking, etc.).

As shown above, various user-selectable objects (or types of objects) may, for example, be associated with any of a variety of respective actions that may be taken upon selection of a respective user-selectable object by a user. Such actions (e.g., information retrieval, information searching, communication session management, commercial transaction management, etc.) may, for example, be included in a table or other data structure indexed by the identity of a respective user-selectable object.
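
The following Python sketch illustrates one hypothetical form of such a table: a data structure indexed by the identity (or type) of a user-selectable object, with each entry naming an associated action and the parameters needed to carry it out. All object identities, actions, addresses, and URLs shown are placeholders for illustration only.

# Hypothetical action table indexed by the identity (or type) of a user-selectable
# object; entries and their parameters are illustrative placeholders.
OBJECT_ACTION_TABLE = {
    "consumer-good:shoe-42": {
        "action": "retrieve_information",
        "source": "http://example.com/objects/shoe-42",   # placeholder URL
    },
    "person:host-1": {
        "action": "search",
        "search_terms": ["program host", "biography"],
    },
    "landmark:landmark-7": {
        "action": "establish_communication_session",
        "network_address": "198.51.100.23",               # documentation-range address
    },
    "consumer-good:*": {
        "action": "commercial_transaction",
        "workflow": ["order_form", "payment", "order_tracking"],
    },
}

def lookup_action(object_identity):
    """Return the action descriptor for a selected object, falling back to a
    per-type default entry (e.g., "consumer-good:*") when no exact match exists."""
    if object_identity in OBJECT_ACTION_TABLE:
        return OBJECT_ACTION_TABLE[object_identity]
    object_type = object_identity.split(":", 1)[0]
    return OBJECT_ACTION_TABLE.get(object_type + ":*")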

Other non-limiting examples of object information corresponding to user-selectable objects in a television program may comprise: athlete information (e.g., statistics, personal information, professional information, history, etc.), entertainer information (e.g., personal information, discography and/or filmography information, information of related organizations, fan club information, photograph and/or video information, etc.), landmark information (e.g., historical information, visitation information, location information, mapping information, photo album information, visitation diary, charitable donation information, etc.), political figure information (e.g., party affiliation, stances on particular issues, history, financial information, voting record, attendance record, etc.), information regarding general types of objects (e.g., information describing actions to take upon user-selection of a person object, of a consumer good object, of a landmark object, etc.) and/or specific objects (e.g., information describing actions to take when a particular person object is selected, when a particular consumer good object is selected, when a particular landmark object is selected, etc.).

For additional non-limiting examples of actions that may be performed related to user selectable objects in television programming, and related user-selectable object information that may be combined with television program moving picture information, the reader is directed to U.S. patent application Ser. No. 12/880,530, filed concurrently herewith, titled “SYSTEM AND METHOD IN A DISTRIBUTED SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/880,594, filed concurrently herewith, titled “SYSTEM AND METHOD IN A LOCAL TELEVISION SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/880,668, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM BASED ON USER LOCATION”, U.S. patent application Ser. No. 12/881,067, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PRESENTING INFORMATION ASSOCIATED WITH A USER-SELECTED OBJECT IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/881,096, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PRESENTING INFORMATION ASSOCIATED WITH A USER-SELECTED OBJECT IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/880,749, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM UTILIZING AN ALTERNATIVE COMMUNICATION NETWORK”; U.S. patent application Ser. No. 12/880,851, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING ADVERTISING INFORMATION ASSOCIATED WITH A USER-SELECTED OBJECT IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/880,888, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING INFORMATION ASSOCIATED WITH A USER-SELECTED PERSON IN A TELEVISION PROGRAM”; and U.S. patent application Ser. No. 12/881,110, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING INFORMATION ASSOCIATED WITH A USER-SELECTED INFORMATION ELEMENT IN A TELEVISION PROGRAM”. The entire contents of each of such applications are hereby incorporated herein by reference in their entirety.

In general, the above-mentioned types of information corresponding to user-selectable objects in television programming may be general to all eventual viewers of the television program, but may also be customized to a particular target user and/or end user. For example, such information may be customized to a particular user (e.g., based on income level, demographics, age, employment status and/or type, education level and/or type, family characteristics, religion, purchasing history, neighborhood characteristics, home characteristics, health characteristics, etc.). Also for example, such information may be customized to a particular geographical location or region.

In general, step 220 may comprise receiving object information corresponding to a user-selectable object in the television program. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of such user-selectable object information or by any particular manner of receiving such user-selectable object information unless explicitly claimed.

The exemplary method 200 may, at step 230, comprise combining the received moving picture information (e.g., as received at step 210) and the received user-selectable object information (e.g., as received at step 220) in a combined data set. Many non-limiting examples of such combining will now be provided.

As mentioned previously, step 210 may comprise receiving moving picture information for a television program by, at least in part, receiving a completed moving picture data set for the television program, where the completed moving picture data set is formatted for communicating the television program without information describing user-selectable objects in the television program. In such an exemplary scenario, step 230 may comprise combining the received moving picture information and the received user-selectable object information by, at least in part, inserting the received user-selectable object information in the completed moving picture data set to create a combined data set comprising the received moving picture data set and the received user-selectable object information.

For example, in an exemplary scenario in which the received completed moving picture data set, as received, is formatted in accordance with a moving picture standard (e.g., an MPEG standard), step 230 may comprise inserting the received user-selectable object information in data fields of the completed moving picture data set that are not assigned by the moving picture standard for any specific type of information (e.g., inserting such information into unassigned data fields provided by the moving picture standard, adding new data fields to the moving picture standard, etc.).

Such inserting may, for example, comprise inserting the received user-selectable object information in data fields of the completed moving picture data set that are interleaved with data fields carrying moving picture data. For example, such inserting may be performed in accordance with a format alternating moving picture data and user-selectable object information on a frame-by-frame basis (e.g., frame 1 moving picture data, frame 1 user-selectable object information, frame 2 moving picture data, frame 2 user-selectable object information, etc.), by groups of frames (e.g., frames 1-A moving picture data, frames 1-A user-selectable object information, frames A-N moving picture data, frames A-N user-selectable object information, etc.), by sub-frames, etc. Also for example, when time information is utilized, user-selectable object information need not be strictly placed with the moving picture data for the frame(s) in which the user-selectable object appears. For example, information of user-selectable objects in frame N+1 may be communicated with frame N moving picture information.
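
For illustration only, the following Python sketch combines per-frame moving picture payloads and user-selectable object information on a frame-by-frame basis, in the spirit of the interleaved format described above. The containers are hypothetical and are not tied to any particular moving picture standard; in this sketch the object information is simply placed with the frame in which the corresponding objects appear, although, as noted above, it could instead be carried ahead of time.

def interleave_program_and_object_info(frames, object_info_by_frame):
    """Combine moving picture data and user-selectable object information on a
    frame-by-frame basis (one of the interleaving formats described above).

    `frames` is assumed to be a list of per-frame moving picture payloads and
    `object_info_by_frame` a mapping from frame number to the object information
    applicable to that frame; both are hypothetical containers."""
    combined = []
    for frame_number, frame_payload in enumerate(frames, start=1):
        combined.append({"type": "moving_picture", "frame": frame_number,
                         "payload": frame_payload})
        info = object_info_by_frame.get(frame_number)
        if info is not None:
            # Place the object information with the frame in which the
            # corresponding user-selectable objects appear.
            combined.append({"type": "selectable_objects", "frame": frame_number,
                             "payload": info})
    return combined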

Also for example, in another exemplary scenario in which the received completed moving picture data set, as received, is formatted in accordance with a moving picture data standard that specifically assigns data fields to information of user-selectable objects, step 230 may comprise inserting the received user-selectable object information in the data fields of the completed moving picture data set that are specifically assigned by the moving picture standard to contain information of user-selectable objects.

Also as mentioned previously, step 210 may comprise receiving moving picture information for a television program by, at least in part, receiving moving picture information for the television program prior to the moving picture information being formatted into a completed moving picture data set for communicating the television program. For example, such a scenario may comprise receiving information describing the television program moving picture that has yet to be formatted into a data set that conforms to a particular moving picture standard (e.g., bitmap information, still frame information, movement vector information, etc., which has yet to be placed into a self-contained MPEG data set for communicating the television program). In such an exemplary scenario, step 230 may comprise combining the received moving picture information and the received user-selectable object information into a completed moving picture data set that is formatted for communicating the television program with information describing user-selectable objects in the television program (e.g., into a single cohesive data set, for example, a single data file or other data structure, into a plurality of logically linked data files or other data structures, etc.).
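
As a non-limiting sketch of forming such a single cohesive data set from not-yet-formatted moving picture information and object information, the example below writes both into one JSON file. The schema (keys such as "moving_picture" and "user_selectable_objects") is an assumption for illustration and is not an MPEG or other standardized structure.

```python
import json

# Illustrative sketch: combining not-yet-formatted moving picture information
# and user-selectable object information into one cohesive data set (here, a
# single JSON file). The schema is an assumption, not a standardized format.

def build_combined_data_set(moving_picture_info, object_info, path):
    combined = {
        "moving_picture": moving_picture_info,   # e.g., frame and motion data
        "user_selectable_objects": object_info,  # e.g., per-object descriptions
    }
    with open(path, "w", encoding="utf-8") as f:
        json.dump(combined, f)
    return combined

moving_picture = {"frames": [{"index": 0, "bitmap": "<bitmap data>"}], "motion_vectors": []}
objects = [{"object_id": 1, "frames": [0], "bounds": [0, 0, 64, 64]}]
build_combined_data_set(moving_picture, objects, "combined_program.json")
```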

In an exemplary scenario, such a completed moving picture data set may be formatted in accordance with a moving picture standard that specifically assigns respective data fields (or elements) to moving picture information and user-selectable object information. In another exemplary scenario, such a completed moving picture data set may be formatted in accordance with a moving picture standard that specifically assigns data fields to moving picture information, but does not specifically assign data fields to user-selectable object information (e.g., utilizing general-purpose unassigned data fields, adding new data fields to the standard, etc.).

Also as mentioned previously, step 210 may comprise receiving moving picture information for a television program by, at least in part, receiving an initial combined television program data set that comprises initial moving picture information and initial user-selectable object information corresponding to user-selectable objects in the television program. For example, prior to being received, the received initial combined television program data set may have already been formed into a single cohesive data set that comprises the moving picture information for the television program and information of user-selectable objects in the television program.

In such an exemplary scenario, step 230 may comprise modifying the initial user-selectable object information of the initial combined television program data set in accordance with the received user-selectable object information (e.g., as received at step 220). Such modifying may, for example and without limitation, comprise adding the received object information to the initial object information in the initial combined television program data set (e.g., in unused unassigned data fields and/or in unused data fields that have been specifically assigned to contain user-selectable object information, etc.).

Also, such modifying may comprise changing at least a portion of the initial object information of the initial combined television program data set in accordance with the received user-selectable object information (e.g., changing information defining a user-selectable object in a presented television program, changing information about a user-selectable object to be presented to a user, changing information regarding any action that may be performed upon user-selection of a user-selectable object, etc.). Additionally, such modifying may comprise deleting at least a portion of the initial object information in accordance with the received user-selectable object information (e.g., in a scenario in which the received user-selectable object information includes a command or directive to remove a portion of, or all of, the information corresponding to a particular user-selectable object).
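
The following sketch illustrates, in a non-limiting way, applying such add/change/delete modifications to initial object information. The directive format ("add", "change", "delete") and the keying of objects by identifier are assumptions made only for this example.

```python
# Sketch of modifying initial user-selectable object information in accordance
# with received object information. The directive format ("add", "change",
# "delete") is an illustrative assumption.

def apply_object_info_updates(initial_objects, received_updates):
    """Apply add/change/delete directives to a dict of objects keyed by id."""
    objects = dict(initial_objects)
    for update in received_updates:
        action, object_id = update["action"], update["object_id"]
        if action == "add":
            objects[object_id] = update["info"]
        elif action == "change":
            objects[object_id] = {**objects.get(object_id, {}), **update["info"]}
        elif action == "delete":
            objects.pop(object_id, None)
    return objects

initial = {1: {"label": "car"}, 2: {"label": "watch"}}
updates = [
    {"action": "change", "object_id": 1, "info": {"label": "sports car"}},
    {"action": "delete", "object_id": 2},
]
print(apply_object_info_updates(initial, updates))  # {1: {'label': 'sports car'}}
```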

In the previously provided examples of combining the received moving picture information and the received user-selectable object information, step 230 may comprise performing such operations automatically (i.e., without real-time interaction with a user while such operations are being performed) or with user interaction. For example, the received moving picture information and the received user-selectable object information may each be time-stamped to assist in merging such information. Step 230 may, for example, comprise analyzing such respective time-stamps to determine the location in a serial stream of moving picture information at which the user-selectable object information is to be inserted. Also for example, the user-selectable object information for a particular user-selectable object may comprise information of the time and/or frame numbers at which the user-selectable object appears in the television program. Such information may be utilized at step 230 to determine the appropriate location in the moving picture data set at which to place the user-selectable object information.
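
As a brief, non-limiting sketch of such time-stamp-based placement, the example below uses a binary search over frame presentation times to find the position in a serial stream at which object information should be inserted. The stream representation and time values are assumptions for illustration.

```python
import bisect

# Sketch of using time stamps to locate where, in a serial stream of moving
# picture information, a piece of object information should be inserted.
# The stream representation is an assumption for illustration.

def insertion_index(frame_timestamps, object_timestamp):
    """Return the index of the first frame whose time stamp is at or after
    the object information's time stamp."""
    return bisect.bisect_left(frame_timestamps, object_timestamp)

frame_timestamps = [0.0, 0.033, 0.066, 0.100]    # presentation times, in seconds
print(insertion_index(frame_timestamps, 0.066))  # 2: place with the third frame
```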

In another example, step 230 may comprise presenting an operator with a view of the moving picture of a television program and a view of a user-selectable object in such moving picture for which information is being added to a combined data set. Step 230 may then comprise interacting with the operator to obtain permission and/or directions for combining the moving picture and user-selectable object information.

Note that step 230 may comprise encrypting the user-selectable object information or otherwise restricting access to such information. Such information protection may be beneficial, for example, in a scenario in which access to such information is provided on a subscription basis, in a scenario in which providers of such information desire to protect such information from undesired access and/or manipulation, etc.
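
One possible way (among many) to implement such protection is symmetric encryption of the object information; the text does not prescribe any particular scheme, so the sketch below, which uses the third-party Python "cryptography" package's Fernet recipe, is only an assumption chosen for illustration, and the key-handling shown is deliberately simplified.

```python
# Illustrative only: scheme choice and key handling are assumptions.
# Requires the third-party "cryptography" package.

import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, distributed only to subscribers
cipher = Fernet(key)

object_info = {"object_id": 9, "action": "open product page"}
token = cipher.encrypt(json.dumps(object_info).encode("utf-8"))

# A subscriber device holding the key can recover the information.
recovered = json.loads(cipher.decrypt(token).decode("utf-8"))
print(recovered == object_info)  # True
```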

In general, step 230 may comprise combining the received moving picture information (e.g., as received at step 210) and the received user-selectable object information (e.g., as received at step 220) in a combined data set. Accordingly, the scope of various aspects of the present invention should not be limited by any particular manner of performing such combining and/or any particular format in which such a combined data set may be placed unless specifically claimed.

The exemplary method 200 may, at step 240, comprise communicating the combined data set(s) (e.g., as formed at step 230) to one or more recipient systems or devices. Such communication may comprise characteristics of any of a variety of types of communication, non-limiting examples of which will now be presented.

Step 240 may, for example, comprise communicating the combined data set(s) via a communication network (e.g., a television communication network, a telecommunication network, a general data communication network (e.g., the Internet, a LAN, etc.), etc.). Many non-limiting examples of such communication networks were provided previously. Step 240 may, for example, comprise broadcasting, multi-casting and/or uni-casting the combined data set over one or more communication networks. Step 240 may also, for example, comprise communicating the combined data set(s) to another system and/or device via a direct conductive path (e.g., via a wire, circuit board trace, conductive trace on a die, etc.).

Additionally for example, step 240 may comprise storing the combined data set(s) on a computer readable medium (e.g., a DVD, a CD, a Blu-ray® disc, a laser disc, a magnetic tape, a hard drive, a diskette, etc.). Such a computer readable medium may then, for example, be shipped to a distributor and/or ultimate recipient of the computer readable medium. Further for example, step 240 may comprise storing the combined data set(s) in a volatile and/or non-volatile memory device (e.g., a flash memory device, a one-time-programmable memory device, an EEPROM, a RAM, etc.).

Further for example, step 240 may comprise storing (or causing or otherwise participating in the storage of) the combined data set(s) in a television system component (e.g., a component or device of the user's local television system, a component or device of a television program provider, and/or a component or device of any television program source). For example and without limitation, step 240 may comprise storing the combined data set(s), or otherwise participating in the storage of the combined data set(s), in a component of the user's local television system (e.g., in a digital video recorder, a television receiver, a television, a television controller, a personal communication device, a local networked database, a local networked personal computer, etc.).

Step 240 may, for example, comprise communicating the combined data set in serial fashion. For example, step 240 may comprise communicating the combined data set (comprising interleaved moving picture information and user-selectable object information) in a single data stream (e.g., via a television network, via a general data network, stored on a hard medium in such serial fashion, etc.). Also for example, step 240 may comprise communicating the combined data set in parallel data streams, each of which comprises interleaved moving picture information and user-selectable object information (e.g., as opposed to separate distinct respective data streams for each of moving picture information and user-selectable object information).
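
A short, non-limiting sketch contrasting the single serial stream and the parallel stream cases follows; in the parallel case, whole per-frame groups are distributed so that each stream still carries interleaved moving picture data and object information. The record tuples and grouping are assumptions for illustration only.

```python
# Sketch contrasting a single serial stream with parallel streams, where each
# parallel stream still carries interleaved moving picture and object records.
# The record tuples and grouping are illustrative assumptions.

def as_single_stream(combined_records):
    """Serial case: one stream carrying every interleaved record in order."""
    return list(combined_records)

def as_parallel_streams(frame_groups, stream_count):
    """Parallel case: distribute whole frame groups (moving picture data plus
    the object information for that frame) round-robin, so that each stream
    still carries interleaved moving picture and object records."""
    streams = [[] for _ in range(stream_count)]
    for i, group in enumerate(frame_groups):
        streams[i % stream_count].extend(group)
    return streams

frame_groups = [
    [("picture", 0), ("object_info", 0)],
    [("picture", 1), ("object_info", 1)],
    [("picture", 2), ("object_info", 2)],
]
print(as_single_stream(record for group in frame_groups for record in group))
print(as_parallel_streams(frame_groups, 2))
```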

In general, step 240 may comprise communicating the combined data set(s) (e.g., as formed at step 230) to one or more recipient systems or devices (e.g., an end user or associated system, television programming provider or associated system, an advertiser or associated system, a television program producer or associated system, a television program database, a television program server, etc.). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such communicating or by any particular recipient of such communication unless explicitly claimed.

The exemplary method 200 may, for example at step 295, comprise performing continued operations. Step 295 may comprise performing any of a variety of continued operations; non-limiting examples of such continued operation(s) are presented below. For example, step 295 may comprise returning execution flow to any of the previously discussed method steps. For example, step 295 may comprise returning execution flow of the exemplary method 200 to step 220 for receiving additional user-selectable object information to combine with television program information. Also for example, step 295 may comprise returning execution flow of the exemplary method 200 to step 210 for receiving additional television program moving picture information and user-selectable object information to combine with such received television program information. Additionally for example, step 295 may comprise returning execution flow of the exemplary method 200 to step 240 for additional communication of the combined information to additional recipients.

In general, step 295 may comprise performing continued operations (e.g., performing additional operations corresponding to combining television program information and information of user-selectable objects in such programming, etc.). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of continued processing unless explicitly claimed.

Turning next to FIG. 3, such figure is a flow diagram illustrating an exemplary method 300 for providing embedded information of selectable objects in a television program, in accordance with various aspects of the present invention. The exemplary method 300 may, for example, share any or all characteristics with the exemplary method 200 illustrated in FIG. 2 and discussed previously. Any or all aspects of the exemplary method 300 may, for example, be implemented in a television system component (e.g., the television provider 110, third party program information provider 120, a component of a communication network 130, first television 140, first television controller 160, second television 141, television receiver 151, second television controller 161, shown in FIG. 1 and discussed previously) and/or a plurality of such television system components operating in conjunction. For example, any or all aspects of the exemplary method 300 may be implemented in one or more television system components remote from the user's local television system. Also for example, any or all aspects of the exemplary method 300 may be implemented in one or more components of the user's local television system.

The exemplary method 300 may, for example, begin executing at step 305. The exemplary method 300 may begin executing in response to any of a variety of causes or conditions. Step 305 may, for example, share any or all characteristics with step 205 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.

The exemplary method 300 may, for example at step 310, comprise receiving moving picture information for a television program. Step 310 may, for example, share any or all characteristics with step 210 of the exemplary method 200 illustrated in FIG. 2 and discussed previously. For example, step 310 may comprise receiving any of the various types of moving picture information from any of the various sources of moving picture information via any of the various communication media discussed previously with regard to the method 200 of FIG. 2 and the system 100 of FIG. 1 and elsewhere herein.

For example, step 310 may comprise, for example at sub-step 312, receiving a completed moving picture data set for the television program, the completed moving picture data set formatted for communicating the television program without information describing user-selectable objects in the television program. Alternatively for example, step 310 may comprise, for example at sub-step 314, receiving moving picture information for the television program prior to the moving picture information being formatted into a completed moving picture data set for communicating the television program. Alternatively for example, step 310 may comprise, for example at sub-step 316, receiving a completed moving picture data set for the television program, the completed moving picture data set formatted for communicating the television program with information describing user-selectable objects in the television program.

The exemplary method 300 may, for example at step 320, comprise receiving object information corresponding to a user-selectable object in the television program. Step 320 may, for example, share any or all characteristics with step 220 of the exemplary method 200 illustrated in FIG. 2 and discussed previously. For example, step 320 may comprise receiving any of the various types of user-selectable object information from any of the various sources of user-selectable object information via any of the various types of media discussed previously with regard to the method 200 of FIG. 2 and the system 100 of FIG. 1 and elsewhere herein.

For example, step 320 may comprise, for example at sub-step 322, receiving user-selectable object information comprising information describing and/or defining the user-selectable object that is shown in the television program (e.g., object dimension information, object movement information, etc.). Also for example, step 320 may comprise, for example at sub-step 324, receiving user-selectable object information comprising information regarding the user-selectable object that may be presented to the user upon user-selection of such object in a television program.

Additionally for example, step 320 may comprise, for example at sub-step 326, receiving user-selectable object information comprising information describing and/or defining actions that may be taken upon user-selection of a user-selectable object (e.g., retrieving and/or obtaining and/or searching for information about a user-selectable object, information specifying a manner in which a system is to interact with a user regarding a user-selected object, establishing and/or maintaining communication sessions, information describing the manner in which a commercial transaction is to be performed, etc.).
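
As a non-limiting illustration of how the three kinds of object information discussed above (defining information, presentation information, and action information) might be grouped, the sketch below defines a simple data structure. The field names are assumptions, not a standardized schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Illustrative data structure grouping the kinds of user-selectable object
# information discussed above. Field names are assumptions, not a standard.

@dataclass
class UserSelectableObjectInfo:
    object_id: int
    # Information describing/defining the object as shown in the program
    # (e.g., per-frame bounds and movement).
    bounds_by_frame: Dict[int, List[int]] = field(default_factory=dict)
    # Information that may be presented to the user upon selection.
    description: str = ""
    # Actions that may be taken upon selection (e.g., search, open a
    # communication session, start a transaction).
    actions: List[str] = field(default_factory=list)

info = UserSelectableObjectInfo(
    object_id=42,
    bounds_by_frame={120: [100, 80, 180, 160]},
    description="Wristwatch worn by the lead actor",
    actions=["retrieve_product_info", "initiate_purchase"],
)
print(info.actions)
```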

The exemplary method 300 may, for example at step 330, comprise combining the received moving picture information (e.g., as received at step 310) and the received user-selectable object information (e.g., as received at step 320) in a combined data set. Step 330 may, for example, share any or all characteristics with step 230 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.

For example, step 330 may comprise, for example at sub-step 332, inserting the received user-selectable object information in a completed moving picture data set that was received at step 310 (e.g., inserting such user-selectable object information in fields of the moving picture data set that are specified by a standard for carrying such user-selectable object information, inserting such user-selectable object information in fields of the moving picture data set that are not specifically allocated for a particular type of data, etc.).

Also for example, step 330 may comprise, for example at sub-step 334, combining received moving picture data and received user-selectable object information into a completed moving picture data set that is formatted for communicating the television program with information describing user-selectable objects in the television program. Additionally for example, step 330 may comprise, for example at sub-step 336, modifying initial user-selectable object information of an initial combined television program data set in accordance with received user-selectable object information.

The exemplary method 300 may, for example at step 340, comprise communicating the combined data set(s) (e.g., as formed at step 330) to one or more recipient systems or devices. Step 340 may, for example, share any or all characteristics with step 240 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.

For example, step 340 may comprise, for example at sub-step 342, communicating the combined data set(s) via a communication network (e.g., any of a variety of communication networks discussed herein, etc.). Also for example, step 340 may comprise, for example, at sub-step 344, communicating the combined data set(s) by storing the combined data set(s) on a computer readable medium and/or by transmitting the combined data set(s) to another device or system to perform such storage. Additionally for example, step 340 may comprise, for example, at sub-step 346, communicating the combined data set in a single serial stream (e.g., comprising interleaved moving picture data and user-selectable object information). Further for example, step 340 may comprise, for example, at sub-step 348, communicating the combined data set in a plurality of parallel serial streams (e.g., each of such streams comprising interleaved moving picture data and user-selectable object information).

The exemplary method 300 may, for example at step 395, comprise performing continued operations. Step 395 may, for example, share any or all characteristics with step 295 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.

Turning next to FIG. 4, such figure is a diagram illustrating an exemplary television system (e.g., a single television system component and/or a plurality of television system components) 400, in accordance with various aspects of the present invention. The exemplary television system 400 may, for example, share any or all characteristics with one or more of the television system components illustrated in FIG. 1 and discussed previously. For example, the exemplary television system 400 may correspond to any of the television system components illustrated in FIG. 1 (or the like) or any group of the television system components illustrated in FIG. 1 (or the like). Also, the exemplary television system 400 may comprise characteristics of a computing system (e.g., a personal computer, a mainframe computer, a digital signal processor, etc.). The exemplary television system 400 (e.g., various modules thereof) may operate to perform any or all of the functionality discussed previously with regard to the exemplary methods 200 and 300 illustrated in FIGS. 2-3.

The exemplary television system 400 includes a first communication interface module 410. The first communication interface module 410 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, though the first communication interface module 410 is illustrated coupled to a wireless RF antenna via a wireless port 412, the wireless medium is merely illustrative and non-limiting. The first communication interface module 410 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, general data communication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television-related information (e.g., moving picture information, information of user-selectable objects, television programming with and without embedded information of user-selectable objects) and/or other data is communicated. Also for example, the first communication interface module 410 may operate to communicate with local sources of television-related content or other data (e.g., disc drives, computer-readable medium readers, video recorders, video cameras, computers, receivers, etc.). Additionally, for example, the first communication interface module 410 may operate to communicate with a remote controller (e.g., directly or via one or more intermediate communication networks).

The exemplary television system 400 includes a second communication interface module 420. The second communication interface module 420 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, the second communication interface module 420 may communicate via a wireless RF communication port 422 and antenna, or may communicate via a non-tethered optical communication port 424 (e.g., utilizing laser diodes, photodiodes, etc.). Also for example, the second communication interface module 420 may communicate via a tethered optical communication port 426 (e.g., utilizing a fiber optic cable), or may communicate via a wired communication port 428 (e.g., utilizing coaxial cable, twisted pair, HDMI cable, Ethernet cable, any of a variety of wired component and/or composite video connections, etc.). The second communication interface module 420 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, general data communication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television-related information (e.g., moving picture information, information of user-selectable objects, television programming with and without embedded information of user-selectable objects) and/or other data is communicated. Also for example, the second communication module 420 may operate to communicate with local sources of television-related information (e.g., disc drives, computer-readable medium readers, video recorders, video cameras, computers, receivers, etc.). Additionally, for example, the second communication module 420 may operate to communicate with a remote controller (e.g., directly or via one or more intervening communication networks).

The exemplary television system 400 may also comprise additional communication interface modules, which are not illustrated (some of which may also be shown in FIG. 5). Such additional communication interface modules may, for example, share any or all aspects with the first 410 and second 420 communication interface modules discussed above.

The exemplary television system 400 may also comprise a communication module 430. The communication module 430 may, for example, operate to control and/or coordinate operation of the first communication interface module 410 and the second communication interface module 420 (and/or additional communication interface modules as needed). The communication module 430 may, for example, provide a convenient communication interface by which other components of the television system 400 may utilize the first 410 and second 420 communication interface modules. Additionally, for example, in an exemplary scenario where a plurality of communication interface modules are sharing a medium and/or network, the communication module 430 may coordinate communications to reduce collisions and/or other interference between the communication interface modules.

The exemplary television system 400 may additionally comprise one or more user interface modules 440. The user interface module 440 may generally operate to provide user interface functionality to a user of the television system 400. For example, and without limitation, the user interface module 440 may operate to provide for user control of any or all standard television system commands (e.g., channel control, volume control, on/off, screen settings, input selection, etc.). The user interface module 440 may, for example, operate and/or respond to user commands utilizing user interface features disposed on the television system (e.g., buttons, etc.) and may also utilize the communication module 430 (and/or first 410 and second 420 communication interface modules) to communicate with other systems and/or components thereof (e.g., a television system controller, such as a dedicated television system remote control, a universal remote control, a cellular telephone, a personal computing device, a gaming controller, etc.) regarding television-related information, user interaction that occurs during the formation of combined data set(s), etc. In various exemplary scenarios, the user interface module(s) 440 may operate to utilize the optional display 470 to communicate with a user regarding user-selectable object information and/or to present television programming to a user.

The user interface module 440 may also comprise one or more sensor modules that operate to interface with and/or control operation of any of a variety of sensors that may be utilized during the formation of the combined data set(s). For example, the one or more sensor modules may be utilized to ascertain an on-screen pointing location, which may, for example, be utilized to input and/or receive user-selectable object information (e.g., to indicate and/or define user-selectable objects in a moving picture). For example and without limitation, the user interface module 440 (or sensor module(s) thereof) may operate to receive signals associated with respective sensors (e.g., raw or processed signals directly from the sensors, through intermediate devices, via the communication interface modules 410, 420, etc.). Also for example, in scenarios in which such sensors are active sensors (as opposed to purely passive sensors), the user interface module 440 (or sensor module(s) thereof) may operate to control the transmission of signals (e.g., RF signals, optical signals, acoustic signals, etc.) from such sensors. Additionally, the user interface module 440 may perform any of a variety of video output functions (e.g., presenting moving picture information to a user, presenting user-selectable object information to a user, presenting television programming to a user, providing visual feedback to a user regarding an identified user-selected object in a presented moving picture, etc.).

The exemplary television system 400 may comprise one or more processors 450. The processor 450 may, for example, comprise a general purpose processor, digital signal processor, application-specific processor, microcontroller, microprocessor, etc. For example, the processor 450 may operate in accordance with software (or firmware) instructions. As mentioned previously, any or all functionality discussed herein may be performed by a processor executing instructions. For example, though various modules are illustrated as separate blocks or modules in FIG. 4, such illustrative modules, or a portion thereof, may be implemented by the processor 450.

The exemplary television system 400 may comprise one or more memories 460. As discussed above, various aspects may be performed by one or more processors executing instructions. Such instructions may, for example, be stored in the one or more memories 460. Such memory 460 may, for example, comprise characteristics of any of a variety of types of memory. For example and without limitation, such memory 460 may comprise one or more memory chips (e.g., ROM, RAM, EPROM, EEPROM, flash memory, one-time-programmable (OTP) memory, etc.), hard drive memory, CD memory, DVD memory, etc.

The exemplary television system 400 may comprise one or more modules 452 (e.g., moving picture information receiving module(s)) that operate to receive moving picture information for a television program. Such one or more modules 452 may, for example, operate to utilize the communication module 430 (e.g., and at least one of the communication interface modules 410, 420) to receive such television program moving picture information. For example, such one or more modules 452 may operate to perform step 210 of the exemplary method 200 discussed previously and/or step 310 of the exemplary method 300 discussed previously.

The exemplary television system 400 may comprise one or more module(s) 454 (e.g., user-selectable object information receiving module(s)) that operate to receive object information corresponding to one or more user-selectable objects in a television program. Such one or more modules 454 may, for example, operate to utilize the communication module 430 (e.g., and at least one of the communication interface modules 410, 420) to receive such television program user-selectable object information. For example, such one or more modules 454 may operate to perform step 220 of the exemplary method 200 discussed previously and/or step 320 of the exemplary method 300 discussed previously.

The exemplary television system 400 may comprise one or more modules 456 (e.g., moving picture and user-selectable object combining module(s)) that operate to combine received moving picture information (e.g., as received by the module(s) 452) and received user-selectable object information (e.g., as received by the module(s) 454) into a combined data set. Such one or more modules 456 may, for example, operate to receive moving picture information from the module(s) 452, receive user-selectable object information from the module(s) 454, combine such received moving picture information and user-selectable object information into a combined data set, and output such combined data set. Such one or more modules 456 may operate to perform step 230 of the exemplary method 200 discussed previously and/or step 330 of the exemplary method 300 discussed previously.

The exemplary television system 400 may comprise one or more modules 458 (e.g., combined data set communication module(s)) that operate to communicate the combined data set to at least one recipient system and/or device. For example, such module(s) 458 may operate to utilize the communication module(s) 430 (and, for example, one or both of the first communication interface module(s) 410 and second communication interface module(s) 420) to communicate the combined data set. Also for example, such module(s) 458 may operate to communicate the combined data set to one or more system devices that store the combined data set on a physical medium (e.g., a computer-readable medium). Such one or more modules 458 may operate to perform step 240 of the exemplary method 200 discussed previously and/or step 340 of the exemplary method 300 discussed previously.
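
A minimal, non-limiting sketch of how software modules analogous to modules 452, 454, 456 and 458 might be wired together follows. The class and method names are illustrative assumptions; the figures do not prescribe any particular software interface.

```python
# Minimal sketch of how modules analogous to 452/454/456/458 might be wired
# together in software. Class and method names are illustrative assumptions.

class MovingPictureReceiver:            # cf. module(s) 452
    def receive(self):
        return [{"frame": 0, "data": "<frame bytes>"}]

class ObjectInfoReceiver:               # cf. module(s) 454
    def receive(self):
        return [{"object_id": 1, "frame": 0}]

class Combiner:                         # cf. module(s) 456
    def combine(self, moving_picture, object_info):
        return {"moving_picture": moving_picture, "objects": object_info}

class CombinedSetCommunicator:          # cf. module(s) 458
    def communicate(self, combined):
        print("sending combined data set:", combined)

pictures = MovingPictureReceiver().receive()
objects = ObjectInfoReceiver().receive()
combined = Combiner().combine(pictures, objects)
CombinedSetCommunicator().communicate(combined)
```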

Though not illustrated, the exemplary television system 400 may, for example, comprise one or more modules that operate to perform any or all of the continued processing discussed previously with regard to step 295 of the exemplary method 200 and step 395 of the exemplary method 300. Such modules (e.g., as with the one or more modules 452, 454, 456 and 458) may, for example, be implemented by the processor(s) 450 executing instructions stored in the memory 460.

Turning next to FIG. 5, such figure is a diagram illustrating exemplary modules and/or sub-modules for a television system 500, in accordance with various aspects of the present invention. The exemplary television system 500 may share any or all aspects with the television system 400 illustrated in FIG. 4 and discussed previously. The exemplary television system 500 may also, for example, share any or all characteristics with one or more of the television system components illustrated in FIG. 1 and discussed previously. For example, the exemplary television system 500 may correspond to any of the television system components illustrated in FIG. 1 (or the like) or any group of the television system components illustrated in FIG. 1 (or the like). For example, the exemplary television system 500 (or various modules thereof) may operate to perform any or all functionality discussed herein with regard to the exemplary method 200 illustrated in FIG. 2 and the exemplary method 300 illustrated in FIG. 3.

For example, the television system 500 comprises a processor 530. Such a processor 530 may, for example, share any or all characteristics with the processor 450 discussed with regard to FIG. 4. Also for example, the television system 500 comprises a memory 540. Such memory 540 may, for example, share any or all characteristics with the memory 460 discussed with regard to FIG. 4.

Also for example, the television system 500 may comprise any of a variety of user interface module(s) 550. Such user interface module(s) 550 may, for example, share any or all characteristics with the user interface module(s) 440 discussed previously with regard to FIG. 4. For example and without limitation, the user interface module(s) 550 may comprise: a display device, a camera (for still or moving picture acquisition), a speaker, an earphone (e.g., wired or wireless), a microphone, a video screen (e.g., a touch screen), a vibrating mechanism, a keypad, and/or any of a variety of other user interface devices (e.g., a mouse, a trackball, a touch pad, touch screen, light pen, game controlling device, etc.).

The exemplary television system 500 may also, for example, comprise any of a variety of communication modules (505, 506, and 510). Such communication module(s) may, for example, share any or all characteristics with the communication interface module(s) 410, 420 discussed previously with regard to FIG. 4. For example and without limitation, the communication interface module(s) 510 may comprise: a Bluetooth interface module; an IEEE 802.11, 802.15, 802.16 and/or 802.20 module; any of a variety of cellular telecommunication interface modules (e.g., GSM/GPRS/EDGE, CDMA/CDMA2000/1x-EV-DO, WCDMA/HSDPA/HSUPA, TDMA/PDC, WiMAX, etc.); any of a variety of position-related communication interface modules (e.g., GPS, A-GPS, etc.); any of a variety of wired/tethered communication interface modules (e.g., USB, FireWire, RS-232, HDMI, Ethernet, wireline and/or cable modem, etc.); any of a variety of communication interface modules related to communicating with external memory devices; etc. The exemplary television system 500 is also illustrated as comprising various wired 506 and/or wireless 505 front-end modules that may, for example, be included in the communication interface modules and/or utilized thereby.

The exemplary television system 500 may also comprise any of a variety of signal processing module(s) 590. Such signal processing module(s) 590 may share any or all characteristics with modules of the exemplary television system 400 that perform signal processing. Such signal processing module(s) 590 may, for example, be utilized to assist in processing various types of information discussed previously (e.g., with regard to sensor processing, position determination, video processing, image processing, audio processing, general user interface information data processing, etc.). For example and without limitation, the signal processing module(s) 590 may comprise: video/graphics processing modules (e.g., MPEG-2, MPEG-4, H.263, H.264, JPEG, TIFF, 3-D, 2-D, MDDI, etc.); audio processing modules (e.g., MP3, AAC, MIDI, QCELP, AMR, CMX, etc.); and/or tactile processing modules (e.g., Keypad I/O, touch screen processing, motor control, etc.).

In summary, various aspects of the present invention provide a system and method for providing information of selectable objects in a television program. While the invention has been described with reference to certain aspects and embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims

1. A method for communicating television program information, the method comprising:

by a television or television receiver: receiving, by the television or television receiver, moving picture information for a television program; receiving, by the television or television receiver, user-selectable object information corresponding to a user-selectable object in the television program; and combining, by the television or television receiver, the received moving picture information and the received user-selectable object information into a completed moving picture data set that is formatted for communicating the television program with information describing user-selectable objects in the television program, the completed moving picture data set is formatted in accordance with a moving picture standard;
wherein: said receiving moving picture information for the television program comprises receiving an initial combined television program data set comprising initial moving picture information and initial user-selectable object information corresponding to user-selectable objects in the television program; and said combining comprises modifying the initial user-selectable object information of the initial combined television program data set in accordance with the received user-selectable object information by inserting the received user-selectable object information in data fields of the completed moving picture data set that are not assigned by the moving picture standard.

2. The method of claim 1, comprising communicating the combined data set in at least one serial data stream over a communication network to at least one recipient, the at least one serial data stream comprising a serial data stream that comprises moving picture information and user-selectable object information.

3. The method of claim 1, comprising storing the combined data set on a computer readable medium, the combined data set comprising user-selectable object information interleaved with moving picture information.

4. The method of claim 1, wherein the moving picture information for the television program is formatted for communicating the television program without information describing user-selectable objects in the television program.

5. The method of claim 1, wherein said modifying comprises changing at least a portion of the initial object information in accordance with the received user-selectable object information.

6. The method of claim 1, wherein the received user-selectable object information corresponding to the user-selectable object in the television program comprises customized user-selectable object information that is customized to a particular set of one or more users.

7. The method of claim 1, wherein the received user-selectable object information corresponding to the user-selectable object in the television program comprises information describing location of the user-selectable object in a frame of the television program.

8. A television receiver comprising:

at least one processor in the television receiver operable to, at least: receive moving picture information for a television program; receive user-selectable object information corresponding to a user-selectable object in the television program; combine the received moving picture information and the received user-selectable object information into a combined data set, the combined data set is formatted in accordance with a moving picture standard; and communicate the combined data set comprising interleaved moving picture information and user-selectable object information; wherein: the at least one processor is operable to receive the moving picture information for the television program by, at least in part, operating to receive an initial combined television program data set comprising initial moving picture information and initial user-selectable object information corresponding to user-selectable objects in the television program; and the at least one processor is operable to combine the received moving picture information and the received user-selectable object information into the combined data set by, at least in part, operating to modify the initial user-selectable object information of the initial combined television program data set in accordance with the received user-selectable object information by inserting the received user-selectable object information in data fields of the combined data set that are not assigned by the moving picture standard.

9. The television receiver of claim 8, wherein the at least one processor is operable to communicate the combined data set in at least one serial data stream over a communication network to at least one recipient, the at least one serial data stream comprising a serial data stream that comprises moving picture information and user-selectable object information.

10. The television receiver of claim 8, wherein the at least one processor is operable to store the combined data set on a computer readable medium, the combined data set comprising user-selectable object information interleaved with moving picture information.

11. The television receiver of claim 8, wherein the moving picture information for the television program is formatted for communicating the television program without information describing user-selectable objects in the television program.

12. The television receiver of claim 11, wherein the at least one processor is operable to combine the received moving picture information and the received user-selectable object information in the combined data set by, at least in part, operating to insert the received user-selectable object information in the completed moving picture data set to create the combined data set comprising a moving picture data set and the received user-selectable object information.

13. The television receiver of claim 8, wherein said at least one processor is operable to receive moving picture information for the television program by, at least in part, operating to receive moving picture information for the television program prior to the moving picture information being formatted into a completed moving picture data set for communicating the television program.

14. The television receiver of claim 13, wherein said at least one processor is operable to combine the received moving picture information and the received user-selectable object information into the combined data set by, at least in part, operating to combine the received moving picture information and the received user-selectable object information into the completed moving picture data set that is formatted for communicating the television program with information describing user-selectable objects in the television program.

15. The television receiver of claim 8, wherein the at least one processor is operable to modify the initial user-selectable object information of the initial combined television program data set in accordance with the received user-selectable object information by, at least in part, operating to change at least a portion of the initial object information in accordance with the received user-selectable object information.

16. The television receiver of claim 8, where the received user-selectable object information corresponding to the user-selectable object in the television program comprises customized user-selectable object information that is customized to a particular set of one or more users.

17. The television receiver of claim 8, where the received user-selectable object information corresponding to the user-selectable object in the television program comprises information describing location of the user-selectable object in a frame of the television program.

18. The television receiver of claim 8, where the received user-selectable object information corresponding to the user-selectable object in the television program comprises information identifying at least one action to be performed upon user-selection of the user-selectable object.

19. The method of claim 1, further comprising communicating the combined data set in parallel data streams, each of the parallel data streams comprising interleaved moving picture information and user-selectable object information.

20. The method of claim 1, further comprising aggregating the user-selectable object information received from a plurality of data sources into a single user-selectable object data set prior to the combining.

21. A method for communicating television program information, the method comprising:

by a television or television receiver system: receiving, by the television or television receiver, moving picture information for a television program; receiving, by the television or television receiver, user-selectable object information corresponding to a user-selectable object in the television program; combining, by the television or television receiver, the received moving picture information and the received user-selectable object information into a combined data set, the combined data set is formatted in accordance with a moving picture standard; and communicating, by the television or television receiver, the combined data set in parallel data streams, each of the parallel data streams comprising interleaved moving picture information and user-selectable object information;
wherein: said receiving moving picture information for the television program comprises receiving an initial combined television program data set comprising initial moving picture information and initial user-selectable object information corresponding to user-selectable objects in the television program; and said combining comprises modifying the initial user-selectable object information of the initial combined television program data set in accordance with the received user-selectable object information by inserting the received user-selectable object information in data fields of the combined data set that are not assigned by the moving picture standard.

22. The method according to claim 1, wherein modifying the initial user-selectable object information comprises changing information defining the user-selectable object presented in the television program.

23. The method according to claim 1, wherein modifying the initial user-selectable object information comprises changing information regarding an action performed upon selection of the user-selectable object.

24. The method according to claim 1, wherein modifying the initial user-selectable object information comprises deleting information regarding the user-selectable object.

25. The method according to claim 1, wherein modifying the initial user-selectable object information comprises encrypting information regarding the user-selectable object.

26. The method according to claim 1, wherein the initial combined television program data set comprising initial moving picture information and initial user-selectable object information corresponding to user-selectable objects in the television program is received in a single serial data stream.

Referenced Cited
U.S. Patent Documents
5111511 May 5, 1992 Ishii et al.
5408258 April 18, 1995 Kolessar
5543851 August 6, 1996 Chang
5602568 February 11, 1997 Kim
5708845 January 13, 1998 Wistendahl
5718845 February 17, 1998 Drost
5721584 February 24, 1998 Yoshinobu et al.
5727141 March 10, 1998 Hoddie
5793361 August 11, 1998 Kahn et al.
5929849 July 27, 1999 Kikinis
6097441 August 1, 2000 Allport
6122660 September 19, 2000 Baransky et al.
6133911 October 17, 2000 Kim
6255961 July 3, 2001 Van Ryzin et al.
6256785 July 3, 2001 Klappert et al.
6282713 August 28, 2001 Kitsukawa et al.
6314569 November 6, 2001 Chernock et al.
6317714 November 13, 2001 Del Castillo et al.
6349410 February 19, 2002 Lortz
6407779 June 18, 2002 Herz
6532592 March 11, 2003 Shintani et al.
6538672 March 25, 2003 Dobbelaar
6567984 May 20, 2003 Allport
6931660 August 16, 2005 Kalluri et al.
7053965 May 30, 2006 Fan
7057670 June 6, 2006 Kikinis
7102616 September 5, 2006 Sleator
7158676 January 2, 2007 Rainsford
7207053 April 17, 2007 Asmussen
7301530 November 27, 2007 Lee et al.
7344084 March 18, 2008 DeCosta
7360232 April 15, 2008 Mitchell
7409437 August 5, 2008 Ullman et al.
7535456 May 19, 2009 Liberty
7536706 May 19, 2009 Sezan et al.
7612748 November 3, 2009 Tateuchi
7631338 December 8, 2009 Del Sesto et al.
7805747 September 28, 2010 Klappert
7827577 November 2, 2010 Pack
7864159 January 4, 2011 Sweetser et al.
7889175 February 15, 2011 Kryze et al.
7890380 February 15, 2011 Stefanik
7987478 July 26, 2011 Minor
8068781 November 29, 2011 Ilan et al.
8095423 January 10, 2012 Nichols
8181212 May 15, 2012 Sigal
8223136 July 17, 2012 Hu et al.
8269746 September 18, 2012 Hodges et al.
8290513 October 16, 2012 Forstall et al.
8359628 January 22, 2013 Kitaru et al.
8421746 April 16, 2013 Igoe
8436809 May 7, 2013 Sohn et al.
8451223 May 28, 2013 Choi et al.
8608535 December 17, 2013 Weston
8760401 June 24, 2014 Kimmel et al.
20010019368 September 6, 2001 Holme et al.
20010023436 September 20, 2001 Srinivasan
20010047298 November 29, 2001 Moore
20020016965 February 7, 2002 Tomsen
20020040482 April 4, 2002 Sextro
20020042925 April 11, 2002 Ebisu
20020056136 May 9, 2002 Wistendahl
20020069405 June 6, 2002 Chapin et al.
20020078446 June 20, 2002 Dakss et al.
20020090114 July 11, 2002 Rhoads
20020120934 August 29, 2002 Abrahams
20020136432 September 26, 2002 Koike et al.
20020162120 October 31, 2002 Mitchell
20030005445 January 2, 2003 Schein
20030023981 January 30, 2003 Lemmons
20030028873 February 6, 2003 Lemmons
20030035075 February 20, 2003 Butler et al.
20030051253 March 13, 2003 Barone, Jr.
20030054878 March 20, 2003 Benoy et al.
20030079224 April 24, 2003 Komar et al.
20030115602 June 19, 2003 Knee
20030145326 July 31, 2003 Gutta et al.
20030212996 November 13, 2003 Wolzien
20030217360 November 20, 2003 Gordon et al.
20030236752 December 25, 2003 Dawson et al.
20040003412 January 1, 2004 Halbert
20040078814 April 22, 2004 Allen
20040109087 June 10, 2004 Robinson et al.
20040119701 June 24, 2004 Mulligan et al.
20040167855 August 26, 2004 Cambridge
20040221025 November 4, 2004 Johnson et al.
20040236865 November 25, 2004 Ullman
20040268401 December 30, 2004 Gray et al.
20050028208 February 3, 2005 Ellis
20050086690 April 21, 2005 Gilfix et al.
20050132420 June 16, 2005 Howard et al.
20050137958 June 23, 2005 Huber et al.
20050138668 June 23, 2005 Gray et al.
20050153687 July 14, 2005 Niemenmaa et al.
20050177861 August 11, 2005 Ma et al.
20050193425 September 1, 2005 Sull et al.
20050229227 October 13, 2005 Rogers
20050234782 October 20, 2005 Schackne et al.
20050251835 November 10, 2005 Scott
20050262542 November 24, 2005 DeWeese et al.
20060037044 February 16, 2006 Daniels
20060064734 March 23, 2006 Ma
20060099964 May 11, 2006 Barrese et al.
20060152489 July 13, 2006 Sweetser et al.
20060174273 August 3, 2006 Park
20060195878 August 31, 2006 Pack et al.
20060241864 October 26, 2006 Rosenberg
20060259930 November 16, 2006 Rothschild
20060268895 November 30, 2006 Kotzin
20060282847 December 14, 2006 Gupte
20070097275 May 3, 2007 Dresti et al.
20070130581 June 7, 2007 Del Sesto et al.
20070137611 June 21, 2007 Contin et al.
20070156521 July 5, 2007 Yates
20070157260 July 5, 2007 Walker
20070195205 August 23, 2007 Lowe
20070199014 August 23, 2007 Clark et al.
20070250901 October 25, 2007 McIntire et al.
20070261079 November 8, 2007 Pack et al.
20070266406 November 15, 2007 Aravamudan
20070277201 November 29, 2007 Wong
20070300263 December 27, 2007 Barton
20080016526 January 17, 2008 Asmussen
20080052750 February 28, 2008 Grunnet-Jepsen
20080066097 March 13, 2008 Park et al.
20080066129 March 13, 2008 Katcher et al.
20080071750 March 20, 2008 Schloter
20080089551 April 17, 2008 Heather et al.
20080109851 May 8, 2008 Heather
20080132163 June 5, 2008 Ilan et al.
20080134342 June 5, 2008 Shamoon et al.
20080136754 June 12, 2008 Tsuzaki et al.
20080172693 July 17, 2008 Ludvig
20080177570 July 24, 2008 Craine
20080184132 July 31, 2008 Zato
20080204603 August 28, 2008 Hattori
20080204605 August 28, 2008 Tsai
20080209480 August 28, 2008 Eide
20090006211 January 1, 2009 Perry et al.
20090021473 January 22, 2009 Grant et al.
20090034784 February 5, 2009 McQuaide, Jr.
20090037947 February 5, 2009 Patil
20090077394 March 19, 2009 Tsai et al.
20090083815 March 26, 2009 McMaster et al.
20090113475 April 30, 2009 Li
20090165041 June 25, 2009 Penberthy et al.
20090165048 June 25, 2009 Nishimura
20090187862 July 23, 2009 DaCosta
20090199259 August 6, 2009 Alao et al.
20090217317 August 27, 2009 White
20090235312 September 17, 2009 Morad
20090237572 September 24, 2009 Kishimoto
20090256811 October 15, 2009 Pasquariello
20090271815 October 29, 2009 Contin et al.
20090296686 December 3, 2009 Pirani et al.
20090327894 December 31, 2009 Rakib et al.
20100005488 January 7, 2010 Rakib et al.
20100064320 March 11, 2010 Angiolillo et al.
20100097348 April 22, 2010 Park
20100098074 April 22, 2010 Kokemak
20100157152 June 24, 2010 Weitbruch et al.
20100162303 June 24, 2010 Cassanova
20100218228 August 26, 2010 Walter
20100257448 October 7, 2010 Squires
20110032191 February 10, 2011 Cooke et al.
20110063523 March 17, 2011 Karaoguz et al.
20110066929 March 17, 2011 Karaoguz et al.
20110067062 March 17, 2011 Karaoguz et al.
20110067063 March 17, 2011 Karaoguz et al.
20110067064 March 17, 2011 Karaoguz et al.
20110067069 March 17, 2011 Karaoguz et al.
20110141013 June 16, 2011 Matthews
20110179435 July 21, 2011 Cordray
20120079525 March 29, 2012 Ellis
20120154268 June 21, 2012 Alten
20120163776 June 28, 2012 Hassell et al.
20140101690 April 10, 2014 Boncyk et al.
Foreign Patent Documents
1193869 September 1998 CN
1300501 June 2001 CN
1329796 January 2002 CN
WO 99/04559 January 1999 WO
WO 2007/137611 December 2007 WO
WO 2009/033500 March 2009 WO
Other references
  • Office Action from related U.S. Appl. No. 12/880,530 dated Aug. 2, 2012.
  • Office Action from related U.S. Appl. No. 12/880,594 dated Jun. 19, 2012.
  • Office Action from related U.S. Appl. No. 12/880,668 dated Jul. 2, 2012.
  • Office Action from related U.S. Appl. No. 12/881,067 dated Jun. 27, 2012.
  • Office Action from related U.S. Appl. No. 12/881,096 dated Jun. 19, 2012.
  • Office Action from related U.S. Appl. No. 12/880,749 dated Aug. 30, 2012.
  • Office Action from related U.S. Appl. No. 12/851,036 dated Aug. 22, 2012.
  • Office Action from related U.S. Appl. No. 12/880,851 dated Jun. 20, 2012.
  • Office Action from related U.S. Appl. No. 12/880,888 dated Jul. 2, 2012.
  • Office Action from related U.S. Appl. No. 12/881,110 dated May 29, 2012.
  • Office Action from related U.S. Appl. No. 12/774,380 dated Jul. 9, 2012.
  • Office Action from related U.S. Appl. No. 12/850,832 dated Aug. 15, 2012.
  • Office Action from related U.S. Appl. No. 12/850,866 dated Jun. 20, 2012.
  • Office Action from related U.S. Appl. No. 12/850,911 dated Jun. 20, 2012.
  • Office Action from related U.S. Appl. No. 12/850,945 dated Aug. 2, 2012.
  • Office Action from related U.S. Appl. No. 12/881,004 dated Nov. 1, 2012.
  • Final Office Action from related U.S. Appl. No. 12/881,067 dated Oct. 9, 2012.
  • Office Action from related U.S. Appl. No. 12/851,075 dated Sep. 5, 2012.
  • Office Action from related U.S. Appl. No. 12/774,221 dated Aug. 29, 2012.
  • Final Office Action from related U.S. Appl. No. 12/881,110 dated Oct. 17, 2012.
  • Office Action from related U.S. Appl. No. 12/850,866 dated Oct. 4, 2012.
  • Final Office Action from related U.S. Appl. No. 12/850,911 dated Oct. 5, 2012.
  • Final Office Action from related U.S. Appl. No. 12/880,851 dated Nov. 14, 2012.
  • Office Action from related U.S. Appl. No. 12/774,321 dated Nov. 14, 2012.
  • Office Action from related U.S. Appl. No. 12/774,154 dated Dec. 5, 2012.
  • Final Office Action from related U.S. Appl. No. 12/880,530 dated Jan. 14, 2013.
  • Final Office Action from related U.S. Appl. No. 12/880,594 dated Nov. 28, 2012.
  • Office Action from related U.S. Appl. No. 12/880,668 dated Jan. 2, 2013.
  • Final Office Action from related U.S. Appl. No. 12/881,096 dated Jan. 23, 2013.
  • Final Office Action from related U.S. Appl. No. 12/880,749 dated Feb. 1, 2013.
  • Final Office Action from related U.S. Appl. No. 12/880,888 dated Dec. 6, 2012.
  • Office Action from related U.S. Appl. No. 12/774,380 dated Jan. 8, 2013.
  • Final Office Action from related U.S. Appl. No. 12/774,154 dated Apr. 10, 2013.
  • Final Office Action from related U.S. Appl. No. 12/881,004 dated Mar. 7, 2013.
  • Final Office Action from related U.S. Appl. No. 12/851,036 dated Feb. 26, 2013.
  • Final Office Action from related U.S. Appl. No. 12/851,075 dated Mar. 5, 2013.
  • Final Office Action from related U.S. Appl. No. 12/774,221 dated Feb. 26, 2013.
  • Final Office Action from related U.S. Appl. No. 12/850,832 dated Feb. 25, 2013.
  • Final Office Action from related U.S. Appl. No. 12/850,866 dated Mar. 29, 2013.
  • Final Office Action from related U.S. Appl. No. 12/850,945 dated Apr. 26, 2013.
  • Final Office Action from related U.S. Appl. No. 12/774,321 dated Jun. 27, 2013.
  • Non-Final Office Action from related U.S. Appl. No. 12/774,154 dated Aug. 14, 2013.
  • Non-Final Office Action from related U.S. Appl. No. 12/850,945 dated Aug. 27, 2013.
  • Non-Final Office Action from related U.S. Appl. No. 12/880,668 dated Jun. 10, 2013.
  • Final Office Action from related U.S. Appl. No. 12/774,380 dated Jun. 11, 2013.
  • Final Office Action from related U.S. Appl. No. 12/880,851 dated Sep. 10, 2013.
  • Non-Final Office Action from related U.S. Appl. No. 12/881,031 dated Sep. 10, 2013.
  • Non-Final Office Action from related U.S. Appl. No. 12/881,110 dated Sep. 17, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/881,096 dated Sep. 22, 2014.
  • Final Office Action from related U.S. Appl. No. 12/851,075 dated Oct. 14, 2014.
  • Final Office Action from related U.S. Appl. No. 12/850,832 dated Oct. 7, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 14/457,451 dated Nov. 20, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/774,154 dated Nov. 13, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/880,530 dated Mar. 31, 2015.
  • Final Office Action from related U.S. Appl. No. 12/881,096 dated Apr. 27, 2015.
  • Final Office Action from related U.S. Appl. No. 12/850,911 dated Mar. 20, 2015.
  • Final Office Action from related U.S. Appl. No. 14/457,451 dated Apr. 29, 2015.
  • Final Office Action from related U.S. Appl. No. 14/480,020 dated May 8, 2015.
  • Final Office Action from related U.S. Appl. No. 14/467,408 dated May 7, 2015.
  • Final Office Action from related U.S. Appl. No. 14/488,778 dated May 19, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 12/850,832 dated Jun. 3, 2015.
  • Final Office Action from related U.S. Appl. No. 14/479,670 dated Jun. 9, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 14/625,810 dated Jun. 11, 2015.
  • Final Office Action from related U.S. Appl. No. 12/851,075 dated Jun. 8, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 12/881,031 dated Jul. 1, 2015.
  • Final Office Action from related U.S. Appl. No. 12/774,221 dated Jul. 1, 2015.
  • Final Office Action from related U.S. Appl. No. 12/880,530 dated Sep. 16, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 14/457,451 dated Sep. 22, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 12/851,075 dated Apr. 4, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/881,110 dated Apr. 7, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/880,530 dated Apr. 9, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/774,380 dated Apr. 15, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/881,067 dated May 9, 2014.
  • Final Office Action from related U.S. Appl. No. 12/774,321 dated Jun. 2, 2014.
  • Intel, “Intel Ethernet Switch Converged Enhanced Ethernet (CEE) and Datacenter Bridging (DCB)”, White Paper, Feb. 2009, pp. 1-14.
  • Non-Final Office Action from related U.S. Appl. No. 12/881,031 dated Jul. 25, 2014.
  • Final Office Action from related U.S. Appl. No. 12/880,530 dated Aug. 18, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/880,749 dated Jul. 30, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/850,866 dated Aug. 14, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/850,911 dated Aug. 14, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/850,945 dated Jul. 25, 2014.
  • Final Office Action from related U.S. Appl. No. 12/774,154 dated Feb. 27, 2014.
  • Final Office Action from related U.S. Appl. No. 12/881,031 dated Mar. 6, 2014.
  • Final Office Action from related U.S. Appl. No. 12/880,749 dated Mar. 13, 2014.
  • Final Office Action from related U.S. Appl. No. 12/774,221 dated Jan. 29, 2014.
  • Final Office Action from related U.S. Appl. No. 12/880,851 dated Feb. 12, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/774,321 dated Feb. 7, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/850,832 dated Mar. 24, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 12/774,221 dated Sep. 20, 2013.
  • Non-Final Office Action from related U.S. Appl. No. 12/880,749 dated Oct. 4, 2013.
  • Non-Final Office Action from related U.S. Appl. No. 12/881,004 dated Oct. 30, 2013.
  • Non-Final Office Action from related U.S. Appl. No. 12/880,594 dated Oct. 22, 2013.
  • Final Office Action from related U.S. Appl. No. 12/880,668 dated Nov. 26, 2013.
  • Non-Final Office Action from related U.S. Appl. No. 12/880,888 dated Nov. 4, 2013.
  • Final Office Action from related U.S. Appl. No. 12/850,945 dated Dec. 16, 2013.
  • Final Office Action from related U.S. Appl. No. 12/880,749 dated Jan. 13, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 12/774,221 dated Jan. 28, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 14/488,778 dated Jan. 2, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 14/480,020 dated Dec. 31, 2014.
  • Non-Final Office Action from related U.S. Appl. No. 14/479,670 dated Dec. 19, 2014.
  • Final Office Action from related U.S. Appl. No. 12/881,031 dated Feb. 12, 2015.
  • Final Office Action from related U.S. Appl. No. 12/881,110 dated Feb. 18, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 14/467,408 dated Dec. 17, 2014.
  • Final Office Action from related U.S. Appl. No. 12/881,031 dated Dec. 15, 2015.
  • Final Office Action from related U.S. Appl. No. 12/850,832 dated Sep. 24, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 14/488,778 dated Oct. 7, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 14/480,020 dated Sep. 30, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 14/479,670 dated Oct. 15, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 14/467,408 dated Oct. 26, 2015.
  • Final Office Action from related U.S. Appl. No. 14/625,810 dated Nov. 16, 2015.
  • Non-Final Office Action from related U.S. Appl. No. 14/753,183 dated Nov. 6, 2015.
Patent History
Patent number: 9271044
Type: Grant
Filed: Sep 13, 2010
Date of Patent: Feb 23, 2016
Patent Publication Number: 20110067062
Assignee: Broadcom Corporation (Irvine, CA)
Inventors: Jeyhan Karaoguz (Irvine, CA), Nambirajan Seshadri (Irvine, CA)
Primary Examiner: Kyu Chae
Application Number: 12/880,965
Classifications
Current U.S. Class: Receiver (e.g., Set-top Box) (725/131)
International Classification: H04N 21/482 (20110101); G06F 3/03 (20060101); G06F 3/0346 (20130101); G06F 3/042 (20060101); H04N 5/76 (20060101); H04N 9/82 (20060101); H04N 21/2343 (20110101); H04N 21/2389 (20110101); H04N 21/422 (20110101); H04N 21/432 (20110101); H04N 21/433 (20110101); H04N 21/4725 (20110101); H04N 21/4728 (20110101); H04N 21/845 (20110101); G06F 3/038 (20130101); H04N 5/445 (20110101); H04N 21/436 (20110101); H04N 21/258 (20110101); H04N 21/45 (20110101); H04N 21/2668 (20110101); H04N 21/472 (20110101); H04N 21/8545 (20110101); H04N 21/858 (20110101); H04N 21/24 (20110101); H04N 21/438 (20110101); H04N 21/81 (20110101); G06F 3/041 (20060101); H04N 5/44 (20110101);