Advertising impression determination
Systems and methods for verifying an advertisement impression in a digital environment are provided. In some aspects, methods of the subject technology include operations for defining a portion of the digital environment as an impression area, wherein the impression area is associated with a tagged advertisement area, providing a stream of an advertisement to the tagged advertisement area, and updating advertising impression information stored in memory regarding the advertisement, wherein an advertising impression is based on the identification of a character within the impression area and the availability of an unobstructed line-of-sight between the character and the tagged advertisement area. In some aspects, computer readable media are also provided.
The present application is a continuation and claims the priority benefit of U.S. application Ser. No. 14/336,452, filed Jul. 21, 2014, which is a continuation of U.S. application Ser. No. 13/939,178, filed Jul. 10, 2013, now U.S. Pat. No. 8,795,076, which is a continuation and claims the priority benefit of U.S. patent application Ser. No. 11/241,229, filed Sep. 30, 2005, now U.S. Pat. No. 8,574,074, the disclosures of which are incorporated herein by reference.
The present application is related to U.S. patent application Ser. No. 09/780,995 filed Feb. 9, 2001 and entitled “In-Contents Advertising Method, In-Content Advertising Server, and Program-Transferring Medium for Realizing In-Contents Advertising,” which claims the priority benefit of Japanese patent application number 2000-241861 filed Jul. 4, 2000 and Japanese patent application number 2000-375096 filed Dec. 8, 2000. The present application is also related to U.S. patent application Ser. No. 10/268,495 filed Oct. 9, 2002 and entitled “System and Method for Camera Navigation,” which claims the priority benefit of U.S. provisional patent application No. 60/328,488 filed Oct. 10, 2001. This application is further related to U.S. patent application Ser. No. 11/240,655 filed Sep. 30, 2005 and entitled “Targeted Advertising.” The disclosure of all the aforementioned applications is incorporated by reference.
BACKGROUND OF THE INVENTION

Field of the Invention
The present invention generally relates to targeted advertising, such as an advertising system and method for dynamically displaying advertisements in the context of video games (i.e., in-game advertising). More specifically, the present invention provides for the determination and tracking of advertising impressions in response to users interacting with video games having in-game advertising functionality.
Description of the Related Art
One of the many ways the advertising industry gauges the success of advertising campaigns is through impressions. Impressions refer to the exposure a user has had to an ad or ad campaign. Impressions are typically indexed to the number of times a potential consumer views a particular advertisement. For example, a print advertisement located in a kiosk in a shopping center might be viewed by 1,000 shoppers over the course of an afternoon. It could be said that the particular advertisement enjoyed 1,000 impressions as each shopper walked past the kiosk and viewed the goods or services advertised therein.
High-traffic areas offer the opportunity for additional impressions. For example, an advertising kiosk located near the entrance of a popular store in a shopping center might enjoy 10,000 impressions due to high shopper traffic, whereas an advertising kiosk located near an unsuccessful store (e.g., a store going out of business) may enjoy significantly fewer advertising impressions. As advertisers seek to have their goods and services viewed by as many persons as possible, there is obviously a demand for advertisement placement in high traffic areas.
The same theory applies to other advertising media. For example, newspapers and magazines with high circulation enjoy increased advertising revenue because those publications offer the possibility of additional impressions, whereas unpopular or unsuccessful newspapers or magazines, such as those circulated in small towns or with a niche (i.e., limited) readership, do not. An advertisement on a billboard in Times Square in New York City will similarly offer more impressions (and demand higher revenue) than a billboard located adjacent to a service road in rural Nebraska.
High traffic areas or high impression opportunities thus become a valuable asset in the advertising community. Assigning value to those assets offers a challenge as it is difficult to accurately measure how many impressions a particular advertisement or advertising opportunity might offer.
For example, television relies on the Nielsen TV ratings system whereby an estimate of the number of people watching any particular television program at any particular time is provided. Based on statistical information provided by these ratings, a determination of which programs are the most watched or the most popular can be made. These programs often demand higher advertising fees as the increased popularity offers the opportunity for additional impressions. For example, the Super Bowl is one of the most watched television events; Super Bowl XVI between the San Francisco 49ers and the Cincinnati Bengals in 1982 saw almost 50% of the televisions in the United States ‘tuned-in’ to the game. As such, advertising ‘spots’ for the Super Bowl commonly demand exorbitant costs; a 30-second spot for Super Bowl XXXIV between the St. Louis Rams and the Tennessee Titans in 2000 demanded close to $2M. A similar supply-and-demand theory applies to regularly scheduled programs (e.g., sitcoms). Popular television shows will demand more advertising dollars due to the increased opportunity for advertising impressions whereas less popular shows will demand considerably less.
Other methodologies exist for measuring advertising impressions. For example, and as previously noted, newspapers and magazines traditionally look to circulation to determine advertising rates, whereas physical real estate (e.g., billboards) will look at a number of factors including location, size of the billboard and general traffic in the area. Radio and other audio media have similar ratings services, for example, those offered by Arbitron Inc. in conjunction with comScore Media Metrix.
Certain advertising models have significant shortcomings. For example, pop-up Internet advertisements that appear in a Web browser are generally viewed as annoying; while they usually gain the attention of the individual ‘surfing’ the web, they also garner disdain for interrupting the browsing session, often leading to the immediate closure of the pop-up window or, as is often the case today, the use of a pop-up blocker whereby pop-up advertisements are prevented from appearing altogether.
Other technological innovations continue to offer additional advertising challenges. For example, digital video recorders (DVRs) like those offered by TiVo® provide the ability to ‘skip’ over advertisements by fast forwarding through the advertisement. Due to the digital nature of the television program stored on a TiVo® DVR, ‘skipping’ over advertisements is simple and does not involve the jerky fast-forwarding and rewinding that accompanied VHS tape recorders and video tapes.
The advertising industry is, therefore, increasingly faced with the inability to target its advertisements to individuals due to the decrease in readership in print media, unrefined advertising methodologies on the Internet and the inability to keep audiences ‘captive’ whereby there is a certain degree of assurance that a consumer views a particular advertisement (e.g., users can now ‘skip’ over ads while remaining in front of their television during a television program).
The video game industry is quickly becoming one of the last bastions of captive audience advertising. That is, the player of a video game often offers their undivided attention to the video game environment so that they may remain aware of actions taking place in the game (e.g., being attacked by an enemy, discovering a cache of weapons or treasure trove, identifying a ‘lane’ through which to navigate a running back in a football game). Video games, therefore, offer the opportunity for placing ads before a captive and extremely attentive audience.
There have been—and continue to be—numerous cases wherein actual advertisements of advertisers are deployed and displayed within a video game environment. A classic example is in a driving game, wherein advertisements are pasted onto billboards around a driving course as illustrated in U.S. Pat. Nos. 5,946,664 and 6,539,544, the disclosures of which are incorporated herein by reference. With such in-game advertising, the software publishing company that creates the video game identifies an advertiser, creates texture data based on ad copy provided by the advertiser and places this texture data representative of an advertisement in the video game environment (i.e., posting the advertisement on the billboard).
Online and networked gaming is increasing in popularity throughout the world. With this increase in popularity, there is an expectation that gaming networks will assemble standards and evolve into an advertising channel comparable to television and radio. As a part of this increase and evolution, there is a need for a framework and system for advertisers and media providers to manage and track advertising in video games and other digital environments.
SUMMARY OF THE INVENTION

The present invention may be configured to provide a system and method for deploying and tracking advertisements across a video game network.
The present invention may be configured to provide a method for determining an impression area in a video game environment relative an advertisement in the environment.
The present invention may also be configured to provide a method for identifying an obstruction in the impression area and redefining the impression area based on the presence of the obstruction.
The present invention may also be configured to provide a method for verifying an advertising impression in a video game environment when a video game character is present in an impression area with an unobstructed view of a related advertisement.
The present invention may also be configured to further provide a method for determining the time a video game character is present in an impression area with an unobstructed view of an advertisement.
The present invention may also be configured to provide a system for determining when an advertising impression has been made in a video game environment.
The present invention may also be configured to provide a system for processing a payment based on the presence of a video game character in an impression area in a video game environment.
The present invention may also be configured to allow advertisers to identify popular games and/or effective advertisements to allow for the establishment of proper pricing models, receive feedback on their products, market to various segments and deploy custom programming relating to advertising campaigns in an adaptable in-game advertising network.
As shown in the accompanying figures, an exemplary in-game advertising system 100 may comprise a content server 120, an advertising server 130 and associated advertising database 140, content authors 1501 . . . 150N, advertisers 1601 . . . 160N, end-user client devices 1701 . . . 170N, an optional payment processing center 180 and an advertising content creator 190.
The content server 120 may distribute digital content. Content may be requested from networked devices operating in a gaming network. In one embodiment, the content is requested by end-user client devices 1701 . . . 170N. The content distributed by content server 120 may comprise video game content (e.g., actual video games, or portions thereof, accessed by end-user client devices 1701 . . . 170N) as well as other forms of digital media (e.g., music and video). The content server 120 may further provide for the storage of digital content. The content server 120 may store such content locally (e.g., as part of a storage area network) or at a location physically remote from the content server 120 but otherwise communicatively coupled to the server 120 thereby allowing for retrieval and transmission of the content to end-user client devices 1701 . . . 170N. Content served by the content server 120 may be served as the result of a push or pull operation.
The advertising server 130, as previously noted, may be managed by an advertising agency providing for the distribution of advertising content to larger audiences (e.g., end-users). The advertising server 130 may serve audio, video, audio/video and still image content. Content served by the advertising server 130 may be served as the result of a push or pull transaction. The advertising database 140 is a storage mechanism for advertising content such as the aforementioned video and audio content. While advertising images are the most prevalent type of advertising content, advertising content may further comprise element types such as programs, objects, state data, control data, textures, bitmap images, compressed images, sequencing data, authentication data, and public and private keys. Advertising database 140 may be integrated with advertising server 130 or may be physically remote from the advertising server but otherwise communicatively coupled, thereby allowing for the retrieval of content from the database 140 for subsequent transmission to end-user client devices 1701 . . . 170N.
Content authors 1501 . . . 150N are those entities that develop content for distribution to end-users, for example, video games. Content authors 1501 . . . 150N may also develop audio, video and/or audio/video content. Content developed by content authors 1501 . . . 150N may be generated in any form of media. For example, content may be developed in an optical disk format or in non-volatile memory such as a flash card. Content may also be provided in a pure data format to be transmitted and hosted by another party. For example, a content author 1501 . . . 150N may develop a video game but never commercially distribute the content in a physical form of media. Instead, the content may be FTP'd or otherwise transmitted to content server 120 and stored in an appropriate storage means for subsequent delivery to end-user client devices 1701 . . . 170N.
An advertiser 1601 . . . 160N is any entity seeking to place an advertisement in the digital content created by a content author 1501 . . . 150N. Advertisers may be from any field of endeavor and need not necessarily be in the entertainment or video game industry.
End-user client devices 1701 . . . 170N are those devices allowing an end-user to access digital content. For example, in the case of a video game, the appropriate end-user client device 1701 . . . 170N may be a home entertainment video game system such as a PlayStation3 from Sony Computer Entertainment Inc. In the instance of digital content being, for example, an on-demand movie or other video program, the end-user client device 1701 . . . 170N may be a set-top cable box. End-user client device 1701 . . . 170N may, in other instances, be a portable device that may be temporarily coupled to a more permanent device (e.g., a desktop computer) to allow for the transfer or updating of digital content via a USB cable as would be the case in, for example, a portable music device such as an MP3 player.
Optional payment processing center 180 allows for the execution of various payment and/or monetary transfer transactions. These payments may be achieved, for example, through direct deposit, automatic funds transfers or wire transfers as is appropriate and/or available. Payment processing center 180 may, for example, be a bank offering these services. In another example, payment processing center 180 may be an on-line escrow agent communicatively coupled to a variety of banks wherein the escrow agent instructs and/or receives notice of various monetary transactions on behalf of various entities in the exemplary in-game advertising system 100 (e.g., advertisers 1601 . . . 160N and content authors 1501 . . . 150N).
Advertising content creator 190 is an entity that authors and/or develops advertisements on behalf of advertisers 1601 . . . 160N for placement into digital content. In some instances, advertising content creator 190 may only digitally author content. For example, certain advertising copy (be it audio, video, print or any combination of the three) may have already been created in a non-digital format. In those instances, advertising content creator 190 would manipulate (e.g., digitize) the advertising copy so that it may be placed into the greater context of digital content that is offered by the content server 120. In other instances, advertising content creator 190 may take a script for an advertisement and create the same (e.g., film video, record audio and then combine the two with various special effects). Advertising content creator 190 may also utilize program objects and program scripts including commands related to special effects, program elements, control signals, messaging and various protocols. In still other instances, advertising content creator 190 may develop advertisement campaigns from scratch (e.g., the advertising concept for a campaign) and subsequently create the ad content to correspond to that campaign.
The advertiser 1601 . . . 160N can access the advertisement server 130 and can view the advertisement information in viewing step 215 and further apply for an advertisement buy from, for example, a web-browser screen in application step 220. Once the advertisers 1601 . . . 160N have been established, advertiser-specified information such as advertiser name, time slot, and time period of an advertisement is provided to an appropriate content author 1501 . . . 150N from the advertisement server 130 in notification step 225. Notification may occur by traditional mail, electronic mail, listservs, SMS, instant messenger, chat or any other available communication medium.
Advertiser specified information and advertisement structure information are also supplied to the advertisement content creator 190 via the advertisement server 130 in ordering step 230. The advertisement content creator 190 creates advertisement content (e.g., the advertisement) based on the advertiser specified information and advertisement structure information. The completed advertisement information such as bitmap data or other graphic, audio and/or video data is delivered by the advertisement content creator 190 to the advertisement server 130 in delivery step 235.
Notification of the receipt of the completed advertisement is communicated by the advertisement server 130 to the advertiser 1601 . . . 160N in completion step 240 by traditional mail, electronic mail, listservs, SMS, instant messenger, chat or any other available communication medium.
The advertiser 1601 . . . 160N can view the completed advertisement information on the advertisement server 130 in viewing/approval step 245. If the advertiser 1601 . . . 160N approves of the completed advertisement content (e.g., by pressing an ‘OK’ button in a web-interface), the advertisement content is confirmed and an itinerary is delivered by the advertisement server 130 to the content author 150 by traditional mail, electronic mail, listservs, SMS, instant messenger, chat or any other available communication medium in delivery detail confirmation step 250. The itinerary delivered in step 250 may comprise information related to the advertiser, time slot, period, advertising fees and so forth.
In registration step 255, the content provider 120 correlates certain advertisement information and advertisement content with the digital contents to be delivered. That is, the content provider 120 recognizes that particular advertisements are to be delivered with particular portions of digital content and so forth. This correlation of information may comprise authoring new derivative files reflecting both advertisement information and digital content (e.g., digital content/advertising programs), the embedding of metadata in the digital contents or the implementation of object oriented programming wherein certain data files (e.g., digital contents/advertising programs) call upon other distinct files (e.g., advertising information). The metadata may also comprise information as it pertains to advertising information such as how long a game character must be present within an impression area defined within the video game. The metadata may further provide information defining the parameters of the impression area and certain quality factors as are discussed herein. Tracking parameters and feedback information and/or instructions may further be embedded in the metadata of the advertisement. Such information may also be contemporaneously downloaded with the advertising information as a separate file whereby the advertising information calls upon certain information related to impressions, reporting and so forth.
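By way of illustration only, the following Python sketch shows one way such advertisement metadata might be represented; the field names and values are hypothetical assumptions made for this sketch and are not drawn from the specification.

```python
# Illustrative only: a hypothetical metadata record that a content server might
# correlate with a tagged advertisement. Field names are assumptions, not the
# specification's actual format.
from dataclasses import dataclass, field


@dataclass
class AdImpressionMetadata:
    ad_id: str                        # identifier of the advertisement content
    asset_tag: str                    # tagged in-game asset (e.g., a billboard)
    min_dwell_seconds: float          # how long a character must remain in the impression area
    impression_depth: float           # distance the impression area extends from the ad surface
    impression_half_angle_deg: float  # half-angle of the impression area about the surface normal
    quality_factors: dict = field(default_factory=dict)  # e.g., partial-view thresholds
    tracking_endpoint: str = ""       # where impression feedback should be reported


example = AdImpressionMetadata(
    ad_id="AD1",
    asset_tag="billboard_A",
    min_dwell_seconds=30.0,
    impression_depth=20.0,
    impression_half_angle_deg=30.0,
    quality_factors={"min_visible_fraction": 0.5},
    tracking_endpoint="ad-server.example/impressions",
)
```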
A user accesses and/or requests digital content (e.g., a driving simulation video game) using end-user client devices 1701 . . . 170N in content application step 260. As a result of the application for content, the user may start to download the content in download step 265. Alternatively, if the user already has a particular portion of the digital content, this step may involve presenting the user with an update as to that content. This step may also comprise unlocking digital content that is already in the possession of the user. Step 265 may also involve authenticating removable media, accommodating registration with a game network or a game ‘lobby’ or ‘waiting room’ and so forth.
In some embodiments of the present invention, the user may access digital content using permanent physical media (e.g., an optical disc). The physical media may have embodied thereon instructions for accessing the present in-game advertising system 100 as it pertains specifically to advertising content versus actual video game data. In additional embodiments of the present invention, the user may access a combination of advertising content and actual game data via the exemplary in-game system 100 (e.g., new advertising content and new game levels published after the initial release of the game on physical media). Such embodiments are discussed in further detail below.
During the download of content and/or advertisements in step 265, the content provider 120 notifies the advertisement server 130 of the download request as it relates to particular advertising material in step 270, such correlations between content and advertising having previously been made in registration step 255. The advertisement server 130 then transmits the necessary advertisement data corresponding to the user download to the content provider 120 in step 275. If necessary, the advertising data provided to content provider 120 can be updated over time relative the content being downloaded (e.g., new ad copy).
After downloading the digital contents (including advertisement information or content in ad information delivery step 275), the user (through end-user client device 170) renders the advertisement information within the game contents, the advertisement information having been provided via advertisement server 130. As noted above, some embodiments of the present invention may access solely advertising information or a combination of new game content and advertising information rather than an entire game.
The state of the advertisement, such as the number of distributions or impressions made, may be provided to the advertisement server 130 and, if necessary or desired, to the advertiser 1601 . . . 160N in advertisement status notification step 280 so that certain determinations may be made regarding, for example, the success of an ad campaign with regard to the number of impressions made.
As a result of the notification in step 270, the advertisement server 130 can track the advertisements that have been or are being downloaded to an end-user client device 170. Utilizing certain ad impression and tracking methodologies as discussed herein, the advertisement server 130 can receive feedback in connection with advertisement impressions. Information concerning impressions or other advertisement feedback may be generated at the end-user client device 170, which has been configured with the necessary software to either directly or indirectly implement impression tracking.
Direct impression tracking may be based on software configured at the end-user client device 170 that operates in conjunction with a game kernel and is further configured to participate in network communications such that textures and objects or indexes to textures and objects related to an advertising campaign may be received. The tracking software may directly monitor the angle and position of various advertising assets with respect to changing camera perspectives presented to the user who controls the camera perspective utilizing a game controller. Indirect impression tracking may occur through a server or a session master client in a peer-to-peer network participating in, facilitating, arbitrating or interrogating functions associated with the campaign program (e.g., extraction of data necessary to yield the determination of an ad impression). Hence, ad impression determinations may occur at, for example, ad server 130 or advertiser 160 in response to information generated or signals sent from the end-user client device 170.
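The following Python sketch illustrates, in simplified form, the kind of geometric test a direct impression tracker might perform each frame: comparing the camera position and facing direction against the position of a tagged advertisement. The function names, vector representation and thresholds are assumptions made for this sketch, not the specification's actual tracking software.

```python
# Illustrative sketch (not from the specification): a per-frame check of whether
# a tagged advertisement lies within the camera's viewing cone.
import math


def _normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v) if length else v


def ad_in_view(camera_pos, camera_forward, ad_pos, max_distance=20.0, max_angle_deg=30.0):
    """Return True when the advertisement is within range and within the view cone."""
    to_ad = tuple(a - c for a, c in zip(ad_pos, camera_pos))
    distance = math.sqrt(sum(c * c for c in to_ad))
    if distance == 0.0:
        return True
    if distance > max_distance:
        return False
    cos_angle = sum(f * t for f, t in zip(_normalize(camera_forward), _normalize(to_ad)))
    return cos_angle >= math.cos(math.radians(max_angle_deg))


# Example: camera at the origin facing +x, billboard ten units ahead and slightly offset.
print(ad_in_view((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (10.0, 1.0, 0.0)))  # True
```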
Ad impression data may be batched or transmitted over the network at periodic intervals. Transmission of impression data may occur in accordance with a schedule or in conjunction with other processes or transmissions used to facilitate game play. Impression data may also be streamed or pulled during an inquiry received over the network. Any network element of the advertising system 100 may facilitate or influence the transmission of impression data.
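A minimal sketch of such batched reporting follows; the record layout, batch size and flush interval are hypothetical, and the transport callable stands in for whatever network mechanism a given embodiment would use.

```python
# Illustrative sketch: impression events are queued locally and flushed to the
# network when the batch fills or a periodic interval elapses.
import time


class ImpressionBatcher:
    def __init__(self, send, max_batch=50, flush_interval=60.0):
        self.send = send                  # callable taking a list of event records
        self.max_batch = max_batch
        self.flush_interval = flush_interval
        self.events = []
        self.last_flush = time.monotonic()

    def record(self, ad_id, duration):
        self.events.append({"ad_id": ad_id, "duration": duration, "ts": time.time()})
        if len(self.events) >= self.max_batch:
            self.flush()

    def maybe_flush(self):
        # Call periodically (e.g., once per frame or alongside other game-play traffic).
        if self.events and time.monotonic() - self.last_flush >= self.flush_interval:
            self.flush()

    def flush(self):
        self.send(self.events)
        self.events = []
        self.last_flush = time.monotonic()


batcher = ImpressionBatcher(send=lambda batch: print(f"sending {len(batch)} impression(s)"))
batcher.record("AD1", 32.5)
batcher.flush()
```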
Advertisement impressions may be calculated in various ways. For example, an advertisement located in a virtual kiosk in a virtual shopping center might be viewed by 1,000 gamers over the course of an afternoon. It could be said that the particular advertisement enjoyed 1,000 impressions as each gamer walked their gaming character past the kiosk and viewed the goods or services advertised therein. Impressions may also be calculated through a time threshold index. For example, an impression may be earned, triggered, counted or computed after a user has been exposed to the advertisement for a particular period of time. In this example, an impression may occur after 30 seconds of exposure by the user to an advertisement. The impression may also be tracked and computed based on one or more user's continuous or distributed exposures to the advertisement on the virtual kiosk or as part of an overall ad campaign.
The advertisement content receiving, impression tracking and impression data feedback transmission systems of the present invention may reside in a single software element or in multiple software elements. Software elements may be distributed in whole, or part, on one or more processors or across a local or wide area network.
Impression tracking software may be provided as a result of downloading a necessary software module during download step 265 or the software having been installed directly on physical media (e.g. an optical disk) read by the end-user client device 170 or, alternatively, installed directly in the end-user client device 170. Tracking software or various components of the software may also be installed in the various other components of the advertising system 100 dependent upon the particular configuration of an embodiment.
Similar or identical advertisement state information may be provided to content author 1501 . . . 150N. This notification is made so that the advertiser 1601 . . . 160N may be properly invoiced by the content author 1501 . . . 150N in accordance with any number of payment plans as are discussed herein. The advertisement server 130 may further provide this information to payment processing center 180 to allow for automatic billing and payment in step 285. These payments may be achieved, for example, through direct deposit, automatic funds transfers or wire transfers or any other money transfer methodology as is appropriate and/or available.
The foregoing describes the advertising system 100 and methodology of the present invention in an exemplary embodiment.
It should be noted that in some embodiments of the present invention, certain elements of the in-game advertising system 100 may be combined or removed from the system 100 entirely without compromising the operations of the system 100. For example, an embodiment of the in-game advertising system 100 as described herein may function without the need for a payment processing center 180 as proper remuneration of parties in the system 100 may have been established beforehand or subject to analysis of certain information after advertisement delivery. Similarly, the ad server 130 and related database 140 may be operated in conjunction with the advertisement content creator 190 or with content provider 120. Various approval and notification steps may also be omitted.
While the foregoing generally describes the delivery of digital content to end-user client devices over a communications network, game content may also be distributed on physical media such as an optical disc.
In such an embodiment, advertising content may be embodied on the physical media as well. As has been previously noted, however, such advertising schemes may be ineffective if the popularity of a game turns out to be overrated (wherein an ad buy was likely overpriced) or underrated (wherein an ad buy was likely underpriced). Similarly, the relevance of certain advertisements may expire over the course of time (e.g., an advertised event occurs, the advertiser stops selling the product or goes out of business or the advertisements are mock advertisements pertaining to a fictitious product but remain relevant in the context of the game despite the passage of time).
In such physical media, a software client may be embodied in the physical media, the client comprising the operating routines, resources, instructions and so forth that allow an end-user client device 170 reading the optical media or other physical media to access the in-game advertising system 100 like that described in the present invention. Although the user may not necessarily be receiving video game content (e.g., the user does not download or directly access code and other information related to the actual game), the user may still receive advertising content as the client pertains to the instructions and operations necessary to access in-game advertising system 100 and for advertising content to be provided to the system 100.
Through the provision of such an advertising client on physical media, it becomes possible for a variety of parties that develop games that operate on a particular end-user client device (e.g., the PS3 game console) to interact with the in-game advertising system 100. Access to the advertising client code may be subject to a fee charged by the in-game advertising system 100 operator, the costs of which may be recouped by the third-party game developer who passes those costs onto advertisers 1601 . . . 160N that might wish to place content in a particular video game as the popularity and advertising value of that game is assessed.
In order to enjoy the advertising opportunities offered by a system like that described in the present invention, certain objects in a video game may be ‘tagged’ as subject to advertising.
Alternatively, advertising content may be loaded into a game during development and specific advertising campaigns may be purchased after the release of the game. In such an embodiment, advertisement purchases trigger signaling events that index specific ads embedded in a game and unlock the advertisements. The ads are then associated with one or more tagged advertising assets. Preloaded advertisements may be replaceable in whole or in part over the network by a server or via a peer-to-peer arrangement. The replacement of advertisements may occur based upon a user profile, user interaction with an advertisement or ad campaign, geographic location of the user or control signals, messages or communications in connection with the advertisement.
Tags reflect not only the space where an advertisement may be placed but may also reflect information such as size limitations, coloring and shading requirements, pointers to variables that track state and impression data, functions and programs associated with the advertisement, hyperlinks and mini-games associated with the advertisement, user-profile filters and, in some embodiments, even advertising relevance. It should be noted that such functions and programs associated with the advertisement may access variables that track state and impression data. Tags may be numbered to reflect individual assets wherein advertisements may be imposed or grouped to reflect that one particular advertiser 1601 . . . 160N should have one or more of their advertisements placed in these tagged groups (e.g., all billboards on a city block). The tagging of assets and rules related to the tagging of assets may be imposed by content author 150. Rules embedded in the tagged assets (e.g., ad size) may be recognized by advertisement server 130 to ensure that the proper advertising content is delivered to these tagged areas when called upon in the in-game advertising network 100.
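As a non-authoritative illustration, a tagged advertising asset might be represented along the following lines; the field names (size limits, grouping, counter references, profile filters) mirror the description above but are assumptions of this sketch rather than the specification's actual data format.

```python
# Illustrative only: a hypothetical representation of a tagged advertising asset.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class AdAssetTag:
    tag_id: int                          # numbered to identify the individual asset
    group: Optional[str] = None          # e.g., "city_block_billboards" for grouped ad buys
    max_width_px: int = 512              # size limitation recognized by the ad server
    max_height_px: int = 256
    shading: str = "unlit"               # coloring/shading requirement
    impression_counter_ref: str = ""     # pointer to the variable tracking impression data
    linked_programs: list = field(default_factory=list)   # functions/mini-games tied to the ad
    profile_filters: dict = field(default_factory=dict)   # user-profile filters


billboard_tag = AdAssetTag(tag_id=7, group="downtown_billboards",
                           impression_counter_ref="impressions.billboard_7")
```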
Impressions of or exposure to advertising asset tags are capable of being tracked independently or as a group. Additionally, impressions of or exposure to advertising asset tags are capable of being aggregated against a particular end-user device, against a particular game or across a network in general. For example, an advertising server 130 may receive ad impression information relating to impressions of specifically tagged assets (e.g., billboard A, billboard B, billboard C, etc.) or assets as they apply to a particular advertiser (e.g., Coke, Pepsi or Sprite) at an end-user device or a plurality of devices. Impressions and exposure may also be aggregated as they pertain to a particular game whereby the number of impressions generated for a particular advertiser in a particular title is determined regardless of the particular asset on which the advertisements were placed. Similar aggregation of impression data can occur across an entire network regardless of the particular game title whereby the total impressions for a particular advertiser are determined against all end-user client devices communicating with the advertising network/system 100. Other aggregation parameters may be utilized according to the particular needs of an advertiser.
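A minimal sketch of such aggregation is shown below, assuming a hypothetical flat record per impression; the roll-ups per asset, per advertiser, per title and network-wide follow the groupings described above.

```python
# Illustrative sketch: rolling up raw impression records along several axes.
from collections import Counter

impressions = [
    {"device": "dev1", "title": "racing_game", "asset": "billboard_A", "advertiser": "Coke"},
    {"device": "dev2", "title": "racing_game", "asset": "billboard_B", "advertiser": "Coke"},
    {"device": "dev1", "title": "sports_game", "asset": "scoreboard", "advertiser": "Pepsi"},
]

per_asset = Counter(i["asset"] for i in impressions)                       # per tagged asset
per_advertiser = Counter(i["advertiser"] for i in impressions)             # per advertiser
per_title_advertiser = Counter((i["title"], i["advertiser"]) for i in impressions)  # per title
network_total = len(impressions)                                           # across the network

print(per_asset, per_advertiser, per_title_advertiser, network_total)
```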
Tagging is not limited to ‘print-like’ advertisements (e.g., a billboard). Tagging can also be related to other formats such as audio and video. For example, a television in a video game may be tagged to reflect that the user tuning the television to a particular channel will cause a full motion video advertisement to be streamed. Various other advertisements could be streamed or rendered on additional channels subject to the whim of the game designer and the extent of tagging of assets for advertisement introduction.
Similarly, audio may be tagged for advertising purposes. For example, if a user plays a video game with a radio (e.g., while driving an automobile), the game designer can create different channels whereby actual music from actual artists is played interspersed with various advertisements that one might hear over the radio. Similarly, the actual music a user listens to may be Top 40 hits or other popular music rather than a one-time generated, static soundtrack. In that way, the user may play the game today or five years from now and be able to listen to not only relevant advertisements but also relevant music that is current and popular on the day the user plays the game. Similar attributes may be reserved for providing real-time television programs and the like (e.g., short films, movie previews and so forth).
As a result of tagging and the delivery of relevant advertisements into the user's game environment, tracking of advertisements may take place. That is, through in-game advertising system 100, it may be determined exactly how many times a particular advertisement was introduced to an end-user client device 170 subject to any variety of conditions (e.g., nationwide ad buys over the gaming network or geographic or targeted advertisements). Additionally, and as described in further detail below, it may be determined whether those advertisements actually resulted in advertising impressions.
It should be noted that while most networks and computing devices can provide nearly instant rendering of dynamic advertising information, identifying a particular portion of an environment where such dynamic content may be rendered (e.g., identifying a tag), sending relevant information to the advertising server 130 and retrieving the relevant advertising information may take several seconds. If a user has a slow or congested communications network or a computing device with slower processing power, rendering of that dynamic information may take even longer. If rendering is subject to extended delays or, worse, game play stagnates while waiting for the dynamic advertising information to render, users may lose interest in the video game or seek to deactivate the dynamic advertising aspects of the game.
As such, it is necessary for video games to identify reference points in the video game environment (e.g., physical points in the video game, or points subject to the accomplishment of certain tasks or the reaching of a certain level) to determine when the in-game advertising system 100 should begin to be accessed to acquire the necessary advertising information. For example, while a user may not have yet reached a billboard tagged to render advertising information, the user may have surpassed a reference point earlier in the game such that the content begins to load in the background to provide for instant rendering when the user finally does reach the billboard. An example of such dynamic loading methodology is described in U.S. Pat. No. 6,764,403, which is incorporated herein by reference.
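The following sketch illustrates one way such a reference point might trigger background loading; the trigger radius, fetch callable and coordinate handling are assumptions made for illustration and are not the dynamic loading methodology of the cited patent.

```python
# Illustrative sketch: when the player crosses a reference point placed ahead of a
# tagged billboard, the client begins fetching the advertisement in the background
# so it can be rendered without delay once the billboard is reached.
def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5


class AdPreloader:
    def __init__(self, reference_point, fetch, trigger_radius=50.0):
        self.reference_point = reference_point
        self.fetch = fetch              # would begin an asynchronous download in a real client
        self.trigger_radius = trigger_radius
        self.requested = False

    def update(self, player_pos):
        if not self.requested and distance(player_pos, self.reference_point) <= self.trigger_radius:
            self.requested = True
            self.fetch()                # content loads before the billboard is reached


preloader = AdPreloader((100.0, 0.0, 0.0), fetch=lambda: print("requesting ad content"))
preloader.update((140.0, 0.0, 0.0))     # within the trigger radius: download begins
```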
Certain embodiments of advertisement structure information 400 may be abbreviated; that is, they may comprise demonstrations or short previews of larger portions of content (namely movie content 420 and game program data 430). In other embodiments, only game program data 430 may be present, for example, for game downloads in the context of the present in-game advertising system 100. In even further embodiments, for example, wherein physical media comprises game program information, only header 410 and certain ad information may be present.
Object data 440 comprises coordinate values of objects in a game environment, those objects made up of polygon apex data or the like. Texture data 450 and 460 comprise pattern data of the object data converted from three-dimensional data to two-dimensional data through various conversion techniques known in the art. For example, object data 440 may relate to a race car, while texture data 450 and 460 would relate to color patterns and logo advertisements on the race car.
In an embodiment of the present invention utilizing advertisement structure information 400, advertisement information AD1 . . . AD4 is embedded in the structure information 400. AD1 may comprise information such as a code indicating that advertisements may be inserted, the nature of the advertisement to be inserted, or information pertaining to frames, resolution and so forth. This data may be linked to the advertisement server 130 whereby advertisements are inserted into a game environment. Advertisement information AD1 . . . AD4 may also be embedded in physical media should it be necessary to download content as is described in certain embodiments of the present invention.
In optional step 610, a determination may be made whether a game program comprising tagged assets has been activated. Once the digital contents have been downloaded or accessed on physical media via end-user client devices 1701 . . . 170N, an advertisement delivery program on advertisement server 130 may be contacted to request advertising content. If an end-user client device 170 operating a game program with tagged assets has not been activated, the server 130 can await a request or an indication that delivery of content is or will be necessary. If such a program has been activated, the advertising program may access the advertising server 130 in step 620 and, in step 630, make notification of the title of the digital contents activated on the end-user client device 170 and the user address (e.g., a network address). Depending on the nature of the advertising content to be delivered (e.g., targeted advertising as discussed herein), the advertisement server 130 reads out the advertisement data in advertisement database 140 and transfers this content to the user's address. In step 640, the end-user client device 170 receives the advertising data under control of the advertising program and records this in the main data storage 530.
Once the game starts or action in the game continues in step 670, a determination is made in step 680 with regard to whether or not the tagged asset has been reached in the game environment; that is, has the user reached the position for inserting advertisement information. If the determination is YES, in step 690 the corresponding advertisement data is positioned at the corresponding position in the memory (i.e., the tagged asset). In some embodiments, it may not be necessary for advertisements to be inserted during game play as tags may be associated with advertising content upon game commencement, upon a level change or in response to a control signal relating to an in-game advertising event.
As has been previously noted, advertising information may be dynamically loaded prior to the content being needed. Larger advertising data—for example, full motion video or audio—may be stored in main memory 530 or a graphics engine buffer (not shown) before action in the game commences. Other embodiments may place object data and/or texture data in the main memory 530 either immediately before the game action starts or before the data is used.
The presently described in-game advertising system 100 may also be utilized to provide for the targeting of advertisements. Providing information over a communications network requires proper addressing of that information to an end-user. For example, a network address (e.g., an Internet Protocol address) may be static and assigned to a particular user. Identifying the actual user assigned to this address may be achieved through the network service provider (e.g., an ISP) that is assigned the network address and aware of the address of that user. Alternatively, a user may register with a content provider (e.g., an on-line gaming network), which may require providing specific information (e.g., name, e-mail, billing address and so forth).
In the case of acquiring end-user information from a network provider (e.g., through a commercial information sharing agreement), the acquired information may reflect billing information (i.e., certain geographic information). Similarly, registering with the content provider may reflect certain geographic information of the user (e.g., billing information). As a result of this geographic information, an advertiser may target geographic or region-specific advertisements.
For example, an end-user that resides in Boston may have little interest in receiving information concerning New York Yankee season ticket sales. Similarly, there would likely be little value in advertising a regional product or service such as a restaurant in New York to someone who resides in San Francisco where that product or service is not offered. National advertising campaigns concerning a regional product or service would likely be ineffective relative a return on the advertising investment and may annoy the user receiving those advertisements, because the user may have no interest in or access to the product or service being advertised. In contrast, a user in San Francisco might have interest in receiving advertisements related to San Francisco Giants season ticket sales or a concert in the area; that is, geographically relevant advertisements.
By acquiring geographic information of a user (either through direct registration or a service provider), advertisements can be targeted so that the appropriate advertisement is directed to the user. In this way, advertising dollars are ensured a greater return on investment. For example, products localized to Boston are advertised to persons living in the Boston area and products specific to San Francisco are advertised to persons residing in the San Francisco area.
Geographic information may also be inferred from other available information. For example, an IP address may identify a particular region of a country through geo-location. While geo-location via an IP address is not as accurate as explicit registration with a service provider, it provides a greater degree of accuracy than would blind advertisement campaigns. Thus, even dynamic IP addresses that are not consistently associated with any particular user (but instead a service provider who may recycle the address amongst a group of users) may have some advertising value due to geo-location techniques. Geographically-specific advertisements may be provided to the geo-located user, although there remains the possibility that such advertisements may be less accurately targeted than an advertisement with a specific geographic affiliation.
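As a simplified illustration of geo-targeted selection, the sketch below maps a network address to a region via a stand-in lookup table and falls back to a national campaign when no region can be inferred; the prefixes, regions and advertisements are placeholder data, not an actual geo-location service.

```python
# Illustrative sketch: selecting a regionally relevant advertisement from a
# (hypothetical) geo-location lookup, with a national fallback.
REGION_BY_PREFIX = {"203.0.113.": "san_francisco", "198.51.100.": "boston"}  # stand-in data

ADS_BY_REGION = {
    "san_francisco": "Giants season tickets",
    "boston": "Red Sox memorabilia",
}


def geolocate(ip_address):
    for prefix, region in REGION_BY_PREFIX.items():
        if ip_address.startswith(prefix):
            return region
    return None          # dynamic or unknown address: region cannot be inferred


def select_ad(ip_address, default_ad="national campaign"):
    region = geolocate(ip_address)
    return ADS_BY_REGION.get(region, default_ad)


print(select_ad("203.0.113.42"))   # regionally targeted advertisement
print(select_ad("192.0.2.10"))     # falls back to the national campaign
```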
Advertisements may also be more accurately tracked with regard to actual impressions thereby allowing for more accurate determinations of advertising campaign value or proper remuneration to a provider of the advertisement relative those impressions. For example, an advertisement may have 1,000,000 impressions over a 2 week period. While this number may be impressive in a vacuum, when it is learned that 75% of those impressions occurred in a geographic region where the product or service is unavailable, the number of impressions becomes much less valuable. Many of the impressions were wasted on portions of the consuming public that will not or, perhaps even worse, cannot purchase the service or product. Thus, an advertiser can purchase a particular number of impressions with the caveat that those impressions be within a particular geographic region to count against a total overall ad buy.
Direct targeting of users may also take place using variations of the aforementioned identification methodologies. For example, in the registration scenario, a user may provide certain ‘likes’ or ‘dislikes’ in a user profile generated during the registration process. A user may indicate favorite sports teams, favorite hobbies, and the like. As a result of the user profile reflecting that a user is a Boston Red Sox fan, the user may be presented with certain advertisements that relate to World Series Memorabilia from the Red Sox 2004 World Series victory, and not a compilation of the New York Yankees World Series victories. Similarly, a user that identifies an affiliation with the San Francisco 49ers may receive 49er related advertisements instead of advertisements related to the Oakland Raiders. Alternative or more generic profile factors may also be implemented and/or utilized as are available and/or relevant to a particular advertiser.
This type of targeted advertising may be extremely useful when a product or service is available nationwide but has limited popularity or sales in particular regions. For example, a product may be available over the Internet (e.g., through Amazon.com) but also available at a number of brick and mortar stores in one particular region of the country (e.g., the West Coast). A user on the East Coast might purchase these products if he was aware of particular sale opportunities or new product releases. If that user does not live on the West Coast where an advertising campaign is in effect, however, they may never receive advertisements related to that product as advertising dollars have been allocated to the locale where the brick and mortar stores are located. If the East Coast user indicated an affinity for a particular product in a profile, advertisements can be presented to this user via the in-game advertising system 100 even though the user lives in a region where product sales are otherwise low and advertising (in traditional media forms) is low or entirely non-existent. Through such targeted advertising, not only are impressions generated amongst able buyers, but also amongst willing and highly interested buyers, making each impression all the more valuable.
Certain learning intelligence may also be implemented to aid in the direct or geographic targeting advertising process. For example, a game user may participate in an on-line baseball league. Registration for that league may be limited solely to a user name and billing information. If the user resides in Southern California, it would be (as a broad-based assumption) unlikely for this game user to be a fan of the Florida Marlins and (as another assumption) probably a fan of the Los Angeles Dodgers or the Anaheim Angels. Such assumptions may prove to be false.
But if the same user, via the on-line baseball league, continually selects the Florida Marlins as his team of choice, the in-game advertising system 100 may recognize the repetitive behavior (e.g., the selection of a particular team, or a particular character in a game). Based on the repetitive behavior of the user, an assumptive profile of a user may be generated.
Further, if the user plays the networked/on-line baseball league fifteen times and elects to play with the Marlins fourteen of those times, it would be an intelligent assumption that the user is a Marlins fan even though the user lives in Southern California. As a result, certain advertisements in the game environment may be directed toward fan merchandise for the Florida Marlins, instead of for the Dodgers or a random advertisement.
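A minimal sketch of such an assumptive inference follows; the session threshold and share threshold are arbitrary assumptions used only to illustrate inferring a favorite from repeated selections.

```python
# Illustrative sketch: inferring a favorite team from repetitive behavior.
from collections import Counter


def infer_favorite(team_selections, min_sessions=10, min_share=0.8):
    """Return the most-selected team if it dominates the user's sessions, else None."""
    if len(team_selections) < min_sessions:
        return None
    team, count = Counter(team_selections).most_common(1)[0]
    return team if count / len(team_selections) >= min_share else None


selections = ["Marlins"] * 14 + ["Dodgers"]      # 14 of 15 sessions with the Marlins
print(infer_favorite(selections))                # "Marlins", despite a Southern California address
```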
Such targeted advertising is not limited to favorites or affiliations of the user. Direct targeting may also utilize demographics such as gender, age, and the nature of the game itself. Gender may be specifically identified or presumed based on the content of a video game. Age may be based on a specific identification or a presumption related to the maturity of a particular game. The nature of the game itself may indicate demographic information of the user or relevant advertising content. For example, a sports game may generate sports advertisements whereas role playing games may generate advertisements specific to the nature of the game such as combat or fantasy. Various combinations or subsets of targeted advertising may also be utilized (e.g., age and gender relative a particular genre of video game).
These intelligent determinations or analyses based on various demographics may take place at the advertisement server 130 via an appropriate software module providing for such deductive or intelligent determinations.
It should be noted that the present disclosure describes numerous inventive components that may operate individually or with other inventive components outlined herein. One such inventive component—tracking what advertisements the user sees—involves monitoring the view perspective of the user (e.g., the point-of-view of the game character or of the actual user via a game camera) and calculating when the user has experienced an ad impression. One embodiment of this method further allows for object occlusion detection. Such impression information may then be returned to an ad server 130 or other component of the system 100 as described above.
In an ideal advertising environment, game character 810 and advertisement 820 would be separated by an unbroken line-of-sight. That is, obstacles 830-880 would not break line-of-sight 890 whereby game character 810 (and its controlling user through, for example, a first-person game view) would have a full and uninterrupted view of and exposure to advertisement 820. Such an uninterrupted view of and exposure to advertisement 820 is desirable in that it provides for an advertising impression most like that which would be encountered in the real-world (e.g., reading a newspaper advertisement, viewing a billboard or attentively viewing a television commercial). That is, persons in the real-world are generally able to view an advertisement (or at least position themselves) such that other objects in the environment do not obscure a view of that advertisement.
In practice, however, one or more obstacles (e.g., obstacles 830, 840 and 850) may interrupt line-of-sight 890.
The interruption of the line-of-sight 890 as caused by obstacles 830, 840 and 850 may partially (or wholly) prevent the character 810 from viewing the advertisement 820. Depending on the exact angle of obstacles 830, 840 and 850, the character 810 may be able to see certain portions of advertisement 820, but those portions may be minimal compared to the greater portion of the advertisement 820 obscured by obstacles 830, 840 and 850. In some cases, an advertiser may have paid significant sums of money for the placement of advertisement 820 in game environment 800. However, the advertisement 820 may never be viewed as was intended by the advertiser (e.g., a full-frontal observation of the advertisement 820 for a given period of time in order to allow the game user controlling game character 810 to review and comprehend the advertisement 820). The advertiser may, therefore, have expended certain sums of money with absolutely no end benefit as the user of the game (via character 810) did not view the advertisement 820. This lack of an advertisement impression results even though character 810 is actually standing directly in front of advertisement 820 and has their line-of-sight 890 oriented in the same direction.
The game character 920 may enter this particular portion of the game environment 900 (the record store) through, for example, an entryway 960. Upon entering, the game character 920 may be positioned or oriented such that the advertisement 910 cannot be viewed.
The game character's 920 ability to view the advertisement 910 may be determined with reference to an impression area 930 defined relative the advertisement 910, as described below.
The surface vector 940 comprising a unit length (e.g., a distance from the advertisement) further defines the impression area 930 for a predetermined distance from the surface of the advertisement 910. The surface vector 940 relative the advertisement 910 is defined, for example, as being 20 feet. Absent any obstructions in the impression area 930, if the game character 920 is within 20 feet of the advertisement 910 and within the angles defined by first ray 970 and second ray 980 (i.e., +/−30° relative the surface normal 990), then the game character 920 is within the impression area 930.
A user controlling the game character 920 within the impression area and facing the advertisement 910 will be able to view the advertisement 910. That is, an impression will be established for the advertisement 910 as would normally occur in the real world (e.g., while the user is standing in front of a billboard). Alternatively, if the game character 920 is not within the impression area 930 as defined by first ray 970, second ray 980 and surface vector 940, then no impression is generated.
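The impression-area test described above can be sketched as follows, here in two dimensions for brevity; the 20-foot distance and 30-degree half-angle follow the example above, while the function and vector handling are assumptions of this sketch rather than the specification's implementation.

```python
# Illustrative sketch: the character is inside the impression area when it is
# within a set distance of the advertisement surface and within a set angle of
# the advertisement's surface normal.
import math


def in_impression_area(char_pos, ad_pos, surface_normal,
                       max_distance=20.0, half_angle_deg=30.0):
    to_char = (char_pos[0] - ad_pos[0], char_pos[1] - ad_pos[1])
    dist = math.hypot(*to_char)
    if dist == 0.0:
        return True               # the character is at the advertisement surface
    if dist > max_distance:
        return False              # beyond the surface vector length (e.g., 20 feet)
    norm_len = math.hypot(*surface_normal)
    cos_angle = (to_char[0] * surface_normal[0] + to_char[1] * surface_normal[1]) / (dist * norm_len)
    return cos_angle >= math.cos(math.radians(half_angle_deg))


# Advertisement at the origin facing +y; character 10 feet away, 15 degrees off the normal.
offset = (10 * math.sin(math.radians(15)), 10 * math.cos(math.radians(15)))
print(in_impression_area(offset, (0.0, 0.0), (0.0, 1.0)))        # True: inside the impression area
print(in_impression_area((25.0, 0.0), (0.0, 0.0), (0.0, 1.0)))   # False: too far and off-angle
```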
An impression area 1030 is defined in a manner similar to that of the impression area 930 described above.
With the increased graphic complexity of many video games, placement of objects about a game environment increasingly provides a challenge to creating advertising impressions. For example, objects placed between a game character and an advertisement may obstruct the character's view of that advertisement even when the character is within an impression area.
Determining whether a game character falls within an obstructed area can be accomplished by using a line-of-sight determination.
In some embodiments, obstruction probe 1225 is a spherical object with a predetermined radius r. Obstruction probe 1225 travels along the line of sight 1220 (or 1250) between game character 1200 (1200′) and advertisement 1210. If obstruction probe 1225 does not collide with any obstacles, then the line-of-sight between game character 1200 (1200′) and advertisement 1210 is unobstructed. If game character 1200 is located in an impression area and oriented toward the advertisement 1210, an impression of the advertisement 1210 is generated. Unobstructed line-of-sight 1250 illustrates the absence of object obstruction between game character 1200′ and advertisement 1210, which allows for an advertising impression.
Alternatively, line-of-sight 1220 is obstructed as a result of one or more objects 1230a . . . 1230c, preventing an advertising impression. If the obstruction probe 1225, while traveling along (obstructed) line-of-sight 1220, intersects one or more polygonal sides 1240a . . . 1240c of one or more objects 1230a . . . 1230c, where each of the one or more objects 1230a . . . 1230c is typically constructed from multiple polygonal sides 1240a . . . 1240c, then an unobstructed view of the advertisement 1210 relative the game character 1200 is not possible and no advertising impression is generated notwithstanding the presence of the game character 1200 in an impression area. Such a scenario—an obstructed line-of-sight and the absence of an advertising impression despite presence in an impression area—was described above.
In some embodiments of the present invention, partial viewing of and exposure to an advertisement may be sufficient to establish an advertising impression. For example, certain trademarks or logos have established a certain degree of notoriety within the purchasing public. For these famous or easily recognizable trademarks or logos, viewing even a portion of the trademark or logo may be sufficient to establish an advertising impression. Similar ‘partial viewing impressions’ may be acceptable with regard to slogans, celebrities, famous spokespersons, and so forth. In these instances, even though the obstruction probe 1225 may intersect with an object, if the intersection involves only a small percentage of the probe 1225, then a partial impression may be generated. If the object obscures the advertisement in its entirety—100% of the probe 1225 intersects with the object—then no impression is generated.
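A minimal sketch of such a partial-impression rule follows (Python; the names and the 50% threshold are illustrative, not taken from the disclosure).

```python
def classify_impression(obscured_fraction, content_is_recognizable,
                        partial_threshold=0.5):
    """Classify exposure given the fraction of the probe blocked by obstacles.
    A fully blocked probe yields no impression; a partially blocked probe may
    still yield a 'partial' impression for well-known marks or logos."""
    if obscured_fraction >= 1.0:
        return "none"        # advertisement obscured in its entirety
    if obscured_fraction == 0.0:
        return "full"        # unobstructed view
    if content_is_recognizable and obscured_fraction <= partial_threshold:
        return "partial"     # enough of a famous mark is visible
    return "none"
```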
The radius r of the obstruction probe 1225 may be reduced whereby a collision with the polygonal sides 1240A . . . 1240C of one or more objects 1230A . . . 1230C may be avoided, thus allowing for an unobstructed line-of-sight and, subject to presence in an impression area, establishing an advertising impression. In that regard, the radius r of obstruction probe 1225 may be set relative to the advertisement to be viewed. Information relative the setting of radius r may be part of advertising data pushed to a video game environment by the advertising server 130.
In some embodiments of the present invention, especially those involving third-person points-of-view, it may be possible to overcome obstructed lines-of-sight in an effort to create an unobstructed line-of-sight. For example, in an in-game advertising system 100 (
It should be noted that impression counter 1360 is not necessarily a stopwatch or other timing device as depicted in
By measuring the length of exposure to an advertisement via impression counter 1360, an advertiser can determine the value of an ad impression or whether an impression has actually been made if the existence of an impression is tied to the duration of presence in the ad impression area (e.g., the time of exposure to the advertisement). For example, a pricing model may be established wherein an advertiser is charged based on the duration of the advertisement impression. The duration of the advertisement impression is reflected by the impression counter 1360. In another pricing model, an advertiser may pay a fee for a certain number of advertisement impressions. An ad impression may be defined as unobstructed exposure to an advertisement for a certain period of time. For example, and as evidenced in
The various ad impression determinations may be implemented utilizing software downloads as discussed in the context of
As noted, various pricing models may be based upon the existence of advertising impressions or the quality thereof. For example, an advertiser may be satisfied knowing that their advertising content has made it into a video game. Another advertiser may be more demanding and require information related to actual impressions. Using the methodology described in
More specifically, it may be determined how long the user viewed the advertisement. For example, if a user is merely scanning around the room for an exit or a particular object, their line-of-sight may intersect with the advertisement, but the scanning of the room may be too quick to allow for any meaningful consideration or understanding of what the advertisement portrayed. In this scenario, a timer may be implemented as was described in
On an even more detailed level, it may be possible to determine the quality of the impression. For example, a user may view an advertisement as a result of being in an impression area. That user may, however, be on the very far edge of the impression area and have slight difficulty viewing the advertisement. This might be the case if a user is utilizing an older model television or computer monitor or is utilizing a computing device that has lower graphics processing power. Notwithstanding graphics output considerations, it is possible to further delineate the impression area into quality impression areas whereby the advertisement is viewed in every instance, but better or worse depending on the exact placement of the game character when viewing the advertisement.
A game character may be face-to-face with an advertisement. The character, while clearly within the impression area, may be so close to the advertisement that he cannot fully view the advertisement or the copy that he can view is blurred because of the close proximity of the character relative the advertisement in the gaming environment. Similarly, a user may be too far away to fully appreciate the advertisement. Through delineating quality impressions, advertisers can appreciate a minimal impression (e.g., up close or almost too far away) but also have certain assurances with regard to quality impressions as may be subject to the particular whims of the advertiser.
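One simple reading of such quality delineation, sketched in Python with illustrative distance thresholds, buckets an in-area impression by how close the character stands to the advertisement.

```python
def impression_quality(distance, min_quality_distance=3.0,
                       max_quality_distance=15.0, max_distance=20.0):
    """Bucket an in-area impression into quality tiers by distance: too close
    or near the edge of the impression area yields a 'minimal' impression,
    otherwise a 'quality' impression. Thresholds are illustrative."""
    if distance > max_distance:
        return None          # outside the impression area entirely
    if distance < min_quality_distance or distance > max_quality_distance:
        return "minimal"     # face-to-face or almost too far away
    return "quality"
```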
Various pricing schemes may be based upon these various levels or quality of impression whereby a general impression is charged at one rate while a higher quality impression is charged at a different rate. Similarly, the length of time a user is in an impression area can be correlated to a pricing model. For example, if a user is in an impression area for 2-seconds, an impression may have been made but possibly a minimal one due to the complexity of the advertisement. If the user is in the impression area for 10-seconds, a greater impression has been made and has greater value to the advertiser. Limits may be imposed on such an impression counter such that an advertiser is not charged for a 30-minute impression when a user happens to position his game character in front of an advertisement and then leave to attend to another task for half-an-hour. Notwithstanding the presence in the impression area for that period of time, a thirty-minute impression has not truly been made as the user of the game (the controller of the character) has not been subjected to that advertising copy.
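A duration-based pricing rule of this kind might be sketched as follows (Python; the rate, minimum dwell, and cap are hypothetical).

```python
def bill_for_impression(seconds_in_area, rate_per_second=0.002,
                        minimum_seconds=2.0, cap_seconds=30.0):
    """Charge per second of exposure, ignoring dwell below a minimum and
    capping runaway durations (e.g., a character parked in front of an
    advertisement for half an hour). Rates and thresholds are illustrative."""
    if seconds_in_area < minimum_seconds:
        return 0.0
    return min(seconds_in_area, cap_seconds) * rate_per_second

# A 10-second exposure bills more than a 2-second one; a 30-minute exposure
# bills no more than the 30-second cap.
print(bill_for_impression(2.0), bill_for_impression(10.0), bill_for_impression(1800.0))
```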
Traditional economic aspects of supply and demand may also come into play with various pricing models. For example, if a game is released with great fanfare and is a ‘must buy for the holiday season,’ ad buys in the game may be more expensive. If the game later proves to be unpopular for a variety of reasons, the pricing levels may decrease to reflect the demand of the game. These determinations as to supply and demand may be made, in part, based on the location and intrinsic value definition of specific tags and the demand for a tag as driven by the number of times tagged assets in a video game are identified during average, peak and off-peak game play, thereby resulting in various requests to the advertising server 130, which may be counted as has been previously described in the context of
Similarly, certain video games may have indicia identifying a distinct owner as a result of a user profile, information embedded in the game, or, for example, a network address. An advertiser may also determine that, while 1,000 impressions may have occurred for their advertisement on a particular day, almost half of those were related to a small group of users who continually entered a gaming environment where the advertisement was rendered time-and-again, versus 1,000 impressions distributed more equally amongst 800 different, unique game players. The impressions in the latter example are more valuable than the repeated impressions amongst a small group of users in the former example.
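For illustration, a small Python sketch (with hypothetical names) that separates total impressions from unique viewers so that repeated impressions by the same small group can be discounted.

```python
from collections import Counter

def summarize_impressions(impression_events):
    """Given (user_id, ad_id) impression events, report total impressions,
    unique viewers, and repeat impressions per advertisement."""
    totals, uniques = Counter(), {}
    for user_id, ad_id in impression_events:
        totals[ad_id] += 1
        uniques.setdefault(ad_id, set()).add(user_id)
    return {ad_id: {"total": totals[ad_id],
                    "unique_viewers": len(viewers),
                    "repeats": totals[ad_id] - len(viewers)}
            for ad_id, viewers in uniques.items()}

# Three impressions, but only two unique viewers of the same advertisement.
print(summarize_impressions([("u1", "ad1"), ("u1", "ad1"), ("u2", "ad1")]))
```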
Certain embodiments of the exemplary in-game advertising system 100 described in the present invention may also allow for certain interactions with the products advertised in a video game. For example, a game player might approach a vending machine in which a variety of beverages are available and effectively advertised via their labeling, trademarks or other visual indicia. A user might select a particular beverage for his character to enjoy in the course of the video game by pressing a button on his keypad that corresponds to purchasing a particular beverage from the vending machine. These ‘virtual purchases’ may be conveyed to advertisers in that the advertisement has not only made an advertising impression (a user has seen the product or related advertising) but the user has also taken some sort of positive interaction with the product (e.g., purchasing the product in the video game).
Similar game metrics may be implemented with regard to negative connotations. For example, if four beverages are available, the user's selection of one beverage may reflect negatively as to the other three. By further example, a user may be presented with vending machines for two competing beverages; if the user takes some action relative one of the vending machines (e.g., destroying it with a weapon), that act too may reflect negatively in the advertising metric information.
Through tracking user interactions with advertisements in a video game environment, the video game effectively becomes a user feedback service similar to an advertising focus group. Feedback may also be registered through explicit interactions wherein a user may expressly provide their opinion of a product or service relative the game environment. For example, the user may be prompted as to their opinion of a particular product; the user may then press ‘up’ for a positive reflection or ‘down’ for a negative reflection.
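Such interaction and feedback metrics might be accumulated as sketched below (Python; the event names are hypothetical).

```python
def record_interaction(metrics, ad_id, event):
    """Accumulate positive and negative interaction counts per advertisement
    (e.g., a virtual purchase or an 'up' vote counts as positive; destroying
    the vending machine or a 'down' vote counts as negative)."""
    positive = {"purchase", "vote_up"}
    negative = {"destroy", "vote_down"}
    bucket = metrics.setdefault(ad_id, {"positive": 0, "negative": 0})
    if event in positive:
        bucket["positive"] += 1
    elif event in negative:
        bucket["negative"] += 1
    return metrics

# Example: one virtual purchase and one down-vote for the same advertisement.
m = {}
record_interaction(m, "soda_brand_a", "purchase")
record_interaction(m, "soda_brand_a", "vote_down")
print(m)
```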
Interactions via, for example, a microphone input are also possible wherein the user provides feedback in connection with the advertisement. Such feedback may be transmitted to an agent on the other end of the communication channel or passed through speech recognition software wherein certain keywords as they relate to a product are recognized and categorized.
To address the feedback features, the impression tracking system may include functions or may interact with functions capable of soliciting or recording user reaction to an advertising campaign. For example, an advertiser may deploy an advertising campaign defined by advertising content that is loaded into a tag with a program or pointer to program(s). Such programs may signal the user to perform actions. Other programs may monitor user reaction in and about the advertisement or in response thereto. For example, one such program that may be invoked when accessing a loaded advertisement tag includes a reference to speech input requirements and definitions.
Advertising beyond the traditional flat, print advertisement may also be implemented utilizing the presently described in-game advertising system 100. For example, in addition to billboards or single page ad copy, rotating billboards may be utilized wherein triangular panels in the billboard rotate relative to one another thereby effectively providing three-billboards-in-one. The billboard panels then rotate every few seconds to reflect a new advertisement on each panel as occurs in the real-world. In this way, a single game asset can be tagged for multiple advertisement introductions.
Other ads in a game environment may be movable. For example, advertisements may be located on the sides of buses that traverse city streets or a series of flyers that might blow down an alley. Television ads with full motion video and audio ads as might be emitted over a radio or a telephone in a gaming environment are also envisioned as being implemented in the present invention.
Just as certain advertisements have higher demand in the real-world (e.g., high traffic areas), certain advertisements in a video game environment may enjoy higher pricing as a result of high traffic areas. For example, advertisements that appear in the beginning of a video game or a level wherein every user will view the advertisement inherently have more value than an advertisement located in a ‘secret Easter Egg’ level or extremely difficult level that many users may never reach.
Other embodiments of the present invention may include rewards based on user interaction with particular advertisements. For example, if a user provides actual feedback in a video game environment, the user may receive merchandise, points or coupon rewards from the producer of the product as an appreciation for their opinion. Such information concerning where to send a reward may be expressly provided during a feedback session or as a result of an association with a user profile. This latter case would be valuable where points or rewards are offered for less explicit interactions (e.g., not in response to an advertiser/feedback query), such as casual interactions or favorable behavior relative an advertised product, with points or rewards accumulating over time.
Advertisements, especially those ads that are audible in nature or are full motion video, may be subject to real-time limitations. For example, a user in a video game may be changing the channels of a television in the video game environment. If the user only watches two seconds of the advertisement, an impression may or may not be generated. Such limitations in the case of real-time advertising may be subject not only to an impression area but also an impression time and even an impression time relative particular portions of the advertisement.
For example, an advertisement may be thirty-seconds in length but the first five-seconds do not indicate the nature of the product and the last five-seconds concern legal boilerplate required by the particular advertisement. If an impression time is identified as five-seconds, watching the first or last five-seconds of this particular advertisement would technically constitute an impression notwithstanding the fact that the user knows nothing more about the product after those five-seconds than he did prior. In these cases, limitations as to impressions of particular portions of an advertisement may be implemented. For example, for an advertiser to consider there to have been an impression, the user must not only view five-seconds of the advertisement but those five-seconds must be within the middle 20-seconds of the 30-second advertisement.
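One way to express such a portion-specific impression rule is sketched below (Python; the 5-second requirement within the middle 20 seconds of a 30-second spot follows the example above, and the bookkeeping assumes non-overlapping viewing intervals).

```python
def portion_impression(viewed_intervals, window=(5.0, 25.0), required_seconds=5.0):
    """Check whether enough viewing time falls inside a meaningful portion of
    the advertisement (e.g., at least 5 s within the middle 20 s of a 30-s
    spot). `viewed_intervals` is a list of (start, end) playback times seen."""
    w_start, w_end = window
    covered = 0.0
    for start, end in viewed_intervals:
        covered += max(0.0, min(end, w_end) - max(start, w_start))
    return covered >= required_seconds

# Viewing only the first five seconds (0-5 s) does not qualify; 10-16 s does.
print(portion_impression([(0.0, 5.0)]))    # False
print(portion_impression([(10.0, 16.0)]))  # True
```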
Video or audio ads may also be subject to start-stop loops. That is, the advertisement starts when the user accesses the advertisement (e.g., tunes to a radio station playing the advertisement) and then stops when the user leaves the advertisement (e.g., changes the radio to another station). If the user then changes back to the original station with the advertisement, the ad may commence where it left off as if no time has passed. Such a methodology better ensures an impression but does so at the risk of compromising reality (i.e., real-time passage of time is not in effect). The tag object may track the state of the advertisement impression, such as an index into the location in the video file at which to start the next sequence for the one or more tags associated with the advertising video loop.
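A minimal sketch of such start-stop state tracking follows (Python; the class and method names are illustrative, not the tag object's actual interface).

```python
class AdPlaybackTag:
    """Start-stop loop: the tag remembers where playback left off so the
    advertisement resumes from that index when the user returns."""

    def __init__(self, duration):
        self.duration = duration
        self.resume_index = 0.0   # seconds into the advertisement

    def on_leave(self, position):
        # User changed the channel: remember the playback position.
        self.resume_index = min(position, self.duration)

    def on_return(self):
        # User came back: resume exactly where the advertisement stopped.
        return self.resume_index

tag = AdPlaybackTag(duration=30.0)
tag.on_leave(12.5)
print(tag.on_return())  # 12.5, as if no time has passed
```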
Video games, radios and televisions that offer the user the ability to change channels may be associated with features to track multiple advertisement impressions and campaigns. When a user changes a channel or directs a virtual character in the game environment to change a channel, new advertisements may be provided. Such advertisement changes may be transitioned with white noise or a familiar blur associated with changing a channel according to the nature of the device. Radio or television devices may be configured with channels that access traditional programming, advertisement content, or other content. Other content may include chat wherein the device facilitates communications. Other content may also include other information in connection with the game. Generally, the mixing of advertisement and other content in such devices may have the benefit of catalyzing user exposure to advertisements since the use of the device and the changing of channels may be necessary to facilitate game play.
Other advertisements may be rendered or emitted in true real-time. For example, if a television advertisement in a video game is two minutes in length and the user changes the channel in the video game after thirty-seconds of viewing the advertisement but comes back to the same channel thirty-seconds later, the advertisement will now be at the 60 second point and not the 30 second point as in a start-stop embodiment.
While real-time advertisements may be more realistic, ensuring an impression becomes more difficult relative the portion of the advertisement the user viewed, as has been previously noted. Certain impressions, especially in the real-time video and audio sense, may be subject to ongoing impression limitations. For example, an impression may constitute viewing 30 seconds of a one-minute advertisement. The user may, at one point in the game, view a first 10-seconds of the advertisement, view a second 10-seconds at a different point in the game and view yet another 10-seconds at another point in the game. In this instance, the user, albeit piecemeal, may have viewed enough of the ad over the course of time to constitute an impression.
Other advertisements may limit an impression opportunity to consecutive viewing time or to such piecemeal viewing/listening within an overall time frame. For example, viewing the advertisement in 10-second snippets may suffice as an impression, but the snippets must occur within 15 minutes of one another. Other advertisements may require the thirty-seconds to occur consecutively or an impression has not been established.
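One reading of this piecemeal-within-a-window rule is sketched below (Python; the 30-second requirement and 15-minute gap follow the example above, and the reset-on-gap behavior is an assumption).

```python
def piecemeal_impression(snippets, required_seconds=30.0, max_gap_seconds=900.0):
    """Accumulate viewing snippets toward an impression, resetting the running
    total whenever more than `max_gap_seconds` (here 15 minutes) elapses
    between consecutive snippets. Each snippet is (game_time, seconds_viewed)."""
    accumulated, last_time = 0.0, None
    for game_time, seconds_viewed in sorted(snippets):
        if last_time is not None and game_time - last_time > max_gap_seconds:
            accumulated = 0.0          # snippets too far apart: start over
        accumulated += seconds_viewed
        last_time = game_time
        if accumulated >= required_seconds:
            return True
    return False

# Three 10-second snippets five minutes apart qualify; an hour apart they do not.
print(piecemeal_impression([(0, 10), (300, 10), (600, 10)]))    # True
print(piecemeal_impression([(0, 10), (3600, 10), (7200, 10)]))  # False
```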
Some of these real-time/consecutive impression implications addressed above are reflected in
For example, as the game character 1420 moves past object 1440A, the impression counter would begin to measure the existence of an ad impression as provided by unobstructed line-of-sight 1450A. As the game character passes behind object 1440A, the impression counter would stop measuring the existence of an advertisement impression as a result of the now obstructed line-of-sight. Once the game character 1420 emerges from behind object 1440A, an unobstructed line-of-sight (1450B) once again exists and the impression counter again would begin to measure the existence of an advertisement impression from the stop point of the previous impression. The measurement of an advertisement impression would continue in a similar fashion as the game character 1420 passes between the remaining objects 1440B . . . 1440D.
In the present embodiment, as the impression counter starts and stops, any one segment of time correlating to an advertisement impression may not constitute a single advertisement impression. The ongoing exposure to the advertisement 1410, albeit in an interrupted fashion, may over the course of time constitute an ad impression. For example, when the impression counter reaches a certain time period (e.g., from the start point to a point three seconds later), that time period may (as a whole) constitute an ad impression. Such a measurement methodology would be desirable in instances where a game character passes by, for example, a number of pillars, a wrought iron fence, a series of windows, or a crowded room.
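Such a start-and-stop counter might be sketched as follows (Python; the three-second threshold mirrors the example above, and the per-frame tick interface is an assumption).

```python
class ImpressionCounter:
    """Accumulate exposure time only while the line-of-sight is unobstructed,
    pausing as the character passes behind pillars or other objects and
    resuming from the previous stop point."""

    def __init__(self, required_seconds=3.0):
        self.required_seconds = required_seconds
        self.accumulated = 0.0

    def tick(self, delta_seconds, line_of_sight_clear):
        """Call once per frame; returns True once the accumulated, possibly
        interrupted, exposure time constitutes an ad impression."""
        if line_of_sight_clear:
            self.accumulated += delta_seconds
        return self.accumulated >= self.required_seconds

counter = ImpressionCounter()
for clear in [True, True, False, True, True, True, True]:  # occluded mid-way
    made = counter.tick(0.5, clear)
print(made, counter.accumulated)  # True 3.0
```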
The above description is illustrative and not restrictive. Many variations of the invention will become apparent to those of skill in the art upon review of this disclosure. The scope of the invention should, therefore, be determined not with reference to the above description, but instead should be determined with reference to the appended claims along with their full scope of equivalents.
For example, the embedded advertising ‘tags’ as described in the present invention may be further applied to digital video and audio signals (television and audio broadcasts, for example) as well as movies filmed in a digital format, whereby advertisements or other content may be inserted into previously generated audio and/or video content. On-line media such as on-line magazines, newspapers and blogs may also benefit from the implementation of tagging advertising assets (e.g., particular column inches or streaming news broadcasts) as present-day advertising methodologies such as pop-up ads become less effective and/or less popular. Advertising content may be offered by network and/or content providers (e.g., cable providers) whereby advertising content is offered on-demand.
Additionally, the various impression area and occlusion concepts disclosed herein may be applied to audio advertisements or other audible emissions. For example, a radio or other audio emitting object may be defined, in part, by an impression area. Such an impression area would be determined in a manner similar to an impression area as it concerns a visual advertisement. An impression area in the context of audio would be representative of where an audio advertisement or other audio emission may be heard by the character in a game environment; the volume of the audio emission decreases as the character moves further away from the advertisement in three-dimensional space, and if the character is located behind an object, the occlusion determination concepts become applicable (e.g., does a wall separate the character and the audio signal). The quality of audio impressions may also be determined in a manner similar to quality determinations with visual advertisements with regard to not only distance but also the extent to which an intermediate object might absorb the sound, for example, a pane of soundproof glass versus a thinly constructed wall.
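For illustration, a Python sketch (with hypothetical attenuation and threshold values) that estimates audio-impression quality from distance and the absorption of an intervening object.

```python
def audio_impression_quality(distance, wall_absorption=0.0,
                             reference_distance=1.0, audible_threshold=0.05):
    """Estimate audio-impression quality as perceived loudness: inverse-square
    fall-off with distance, scaled down by how much an intervening object
    absorbs sound (0.0 = open air, 1.0 = fully soundproof). Purely a sketch."""
    loudness = (reference_distance / max(distance, reference_distance)) ** 2
    loudness *= (1.0 - wall_absorption)
    if loudness < audible_threshold:
        return None          # effectively inaudible: no audio impression
    return loudness

# A character 3 units away behind a thin wall (30% absorption) still hears the ad.
print(audio_impression_quality(3.0, wall_absorption=0.3))
```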
The asset tagged to receive an advertisement may be movable and rotatable and may be programmed to dynamically orient towards the user camera as the user maneuvers around the game environment. Ad campaigns may be interleaved with special programming. Special programming may influence ad campaigns or variables in the tags relating to the ad campaigns, or may relate to the game environment. Special programming may influence (e.g., terminate or replace) an ad campaign or modify variables or functions contained in an ad campaign or tag. Special programming may accommodate dynamic reconfiguration and reuse of an advertising asset. For example, special programming may be used to communicate special messages, game messages or forum messages, facilitate chat, and so forth. Special programming may also be used to transfer control of the advertising asset to the game environment so that the advertising real estate can be used to convey game information and other information.
Notwithstanding the providing of detailed descriptions of exemplary embodiments, it is to be understood that the present invention may be embodied in various forms. Therefore, specific details disclosed herein are not to be interpreted as limiting, but rather as a basis for the claims and as a representative basis for teaching one skilled in the art to employ the present invention in virtually any appropriately detailed system, structure, method, process, or manner.
Claims
1. A method for verifying an impression in a digital environment, the method comprising:
- defining a portion of the digital environment as an impression area, wherein the impression area is associated with a tagged content area;
- providing a stream of digital content to the tagged content area;
- executing instructions stored in memory, wherein execution of the instructions by a processor: renders the content stream in accordance with one or more graphic rendering techniques specific to the tagged content area; identifies that a user view perspective is positioned within the impression area; identifies availability of an unobstructed line-of-sight between the user view perspective and the tagged content area; and
- updating impression information stored in memory regarding the streamed digital content, wherein an identified impression is based on the identification of the user view perspective within the impression area and the availability of an unobstructed line-of-sight between the user view perspective and the tagged content area.
2. The method of claim 1, wherein the availability of the unobstructed line-of-sight is determined using an obstruction probe.
3. The method of claim 1, further comprising updating the content stream in response to the updated impression information.
4. The method of claim 1, wherein the identified impression is further based on a viewing angle between the user view perspective and the tagged content area.
5. The method of claim 1, wherein the digital content provided in the content stream is selected based on a location of the user view perspective in the digital environment.
6. The method of claim 1, wherein the digital content provided in the content stream is selected based on a current geographic location of a user associated with the user view perspective.
7. The method of claim 1, further comprising updating price information associated with the digital content in real-time based on the updated impression information.
8. The method of claim 1, wherein the identified impression is further based on the line-of-sight between the user view perspective and the tagged content area remaining unobstructed for a predetermined period of time.
9. The method of claim 1, wherein the line-of-sight between the user view perspective and the tagged content area is partially obstructed, and wherein the identified impression is further based on a level of fame associated with one or more elements in the digital content.
10. A system for verifying an impression in a digital environment, the system comprising:
- non-transitory memory that stores information regarding a portion of the digital environment defined as an impression area, wherein the impression area is associated with a tagged content area in the digital environment;
- a communication interface that provides a stream of digital content to the tagged content area; and
- one or more processors coupled to the memory, the processors configured to execute instructions stored in the memory, wherein execution of the instructions by the processors: renders the content stream in accordance with one or more graphic rendering techniques specific to the tagged content area; identifies that a user view perspective is within the impression area; identifies an availability of an unobstructed line-of-sight between the user view perspective and the tagged content area; and updates impression information stored in memory regarding the streamed digital content, wherein an identified impression is based on the identification of the user view perspective within the impression area and the availability of an unobstructed line-of-sight between the user view perspective and the tagged content area.
11. The system of claim 10, wherein the availability of the unobstructed line-of-sight is determined using an obstruction probe.
12. The system of claim 10, wherein the processors execute further instructions to update the content stream in response to updating the impression information.
13. The system of claim 10, wherein the identified impression is further based on a viewing angle between the user view perspective and the tagged content area.
14. The system of claim 10, wherein the digital content provided in the content stream is selected based on a location of the user view perspective in the digital environment.
15. The system of claim 10, wherein the digital content provided in the content stream is selected based on a current geographic location of a user associated with the user view perspective.
16. The system of claim 10, wherein the identified impression is further based on the line-of-sight between the user view perspective and the tagged content area remaining unobstructed for a predetermined period of time.
17. The system of claim 10, wherein the line-of-sight between the user view perspective and the tagged content area is partially obstructed, and wherein the identified impression is further based on a level of fame associated with one or more elements in the digital content.
18. A non-transitory computer-readable storage medium, having embodied thereon a program executable by a processor to perform operations for registering an advertising impression in a digital environment, the operations comprising:
- defining a portion of the digital environment as an impression area, wherein the impression area is associated with a tagged content area;
- providing a stream of digital content to the tagged content area;
- rendering the content stream in accordance with one or more graphic rendering techniques specific to the tagged content area;
- identifying that a user view perspective is positioned within the impression area;
- identifying availability of an unobstructed line-of-sight between the user view perspective and the tagged content area; and
- updating impression information stored in memory regarding the streamed digital content, wherein an identified impression is based on the identification of the user view perspective within the impression area and the availability of an unobstructed line-of-sight between the user view perspective and the tagged content area.
19. The non-transitory computer-readable storage medium of claim 18, wherein the availability of the unobstructed line-of-sight is determined using an obstruction probe.
20. The non-transitory computer-readable storage medium of claim 18, the operations further comprising updating the content stream in response to the updated impression information.
20050216581 | September 29, 2005 | Blumenau et al. |
20050216932 | September 29, 2005 | Danker |
20050222908 | October 6, 2005 | Altberg et al. |
20050227749 | October 13, 2005 | Bender et al. |
20050228797 | October 13, 2005 | Koningstein et al. |
20050235030 | October 20, 2005 | Lauckhart et al. |
20050235199 | October 20, 2005 | Adams |
20050235310 | October 20, 2005 | Bies |
20050235318 | October 20, 2005 | Grauch et al. |
20050240476 | October 27, 2005 | Bigott |
20050240961 | October 27, 2005 | Jerding et al. |
20050246736 | November 3, 2005 | Beyda et al. |
20050247769 | November 10, 2005 | Potter et al. |
20050251539 | November 10, 2005 | Parekh et al. |
20050254366 | November 17, 2005 | Amar |
20050255914 | November 17, 2005 | McHale et al. |
20050256768 | November 17, 2005 | Robinson |
20050261062 | November 24, 2005 | Lewin et al. |
20050261962 | November 24, 2005 | Chuah |
20050266906 | December 1, 2005 | Stevens |
20050266907 | December 1, 2005 | Weston et al. |
20050270537 | December 8, 2005 | Mian et al. |
20050273618 | December 8, 2005 | Takemura et al. |
20050283395 | December 22, 2005 | Lesandrini et al. |
20050283401 | December 22, 2005 | Swix et al. |
20050283797 | December 22, 2005 | Eldering et al. |
20050286860 | December 29, 2005 | Conklin |
20050288999 | December 29, 2005 | Lerner et al. |
20060007312 | January 12, 2006 | James |
20060031405 | February 9, 2006 | Goldman et al. |
20060031551 | February 9, 2006 | Agresta et al. |
20060080702 | April 13, 2006 | Diez et al. |
20060085517 | April 20, 2006 | Kaurila |
20060085816 | April 20, 2006 | Funk et al. |
20060090186 | April 27, 2006 | Santangelo et al. |
20060130095 | June 15, 2006 | Willis et al. |
20060135232 | June 22, 2006 | Willis |
20060143650 | June 29, 2006 | Tanikawa et al. |
20060150249 | July 6, 2006 | Gassen et al. |
20060167747 | July 27, 2006 | Goodman |
20060193471 | August 31, 2006 | Stehle |
20060195859 | August 31, 2006 | Konig et al. |
20060195860 | August 31, 2006 | Eldering et al. |
20060195902 | August 31, 2006 | King et al. |
20060206912 | September 14, 2006 | Klarfeld et al. |
20060212347 | September 21, 2006 | Fang et al. |
20060212350 | September 21, 2006 | Ellis et al. |
20060230141 | October 12, 2006 | Willis |
20060242667 | October 26, 2006 | Peterson et al. |
20060242703 | October 26, 2006 | Abeni |
20060248209 | November 2, 2006 | Chiu |
20060248569 | November 2, 2006 | Lienhart et al. |
20060248595 | November 2, 2006 | Kelly et al. |
20060253323 | November 9, 2006 | Phan et al. |
20060253330 | November 9, 2006 | Maggio et al. |
20060265503 | November 23, 2006 | Jones et al. |
20060268667 | November 30, 2006 | Jellison, Jr. et al. |
20060294566 | December 28, 2006 | Zlattner |
20070027771 | February 1, 2007 | Collins et al. |
20070038508 | February 15, 2007 | Jain et al. |
20070038516 | February 15, 2007 | Apple et al. |
20070038931 | February 15, 2007 | Allaire et al. |
20070050254 | March 1, 2007 | Driscoll |
20070050256 | March 1, 2007 | Walker et al. |
20070055980 | March 8, 2007 | Mageid et al. |
20070061204 | March 15, 2007 | Ellis et al. |
20070061838 | March 15, 2007 | Grubbs et al. |
20070066287 | March 22, 2007 | Papulov |
20070072676 | March 29, 2007 | Baluja |
20070073756 | March 29, 2007 | Manhas et al. |
20070078706 | April 5, 2007 | Datta et al. |
20070078712 | April 5, 2007 | Ott et al. |
20070078714 | April 5, 2007 | Ott |
20070078989 | April 5, 2007 | van Datta |
20070079326 | April 5, 2007 | Datta et al. |
20070079331 | April 5, 2007 | Datta et al. |
20070079335 | April 5, 2007 | McDonough |
20070083611 | April 12, 2007 | Farago et al. |
20070089151 | April 19, 2007 | Moore et al. |
20070094081 | April 26, 2007 | Yruski |
20070094082 | April 26, 2007 | Yruski |
20070094083 | April 26, 2007 | Yruski |
20070094363 | April 26, 2007 | Yruski |
20070101360 | May 3, 2007 | Gutta et al. |
20070118425 | May 24, 2007 | Yruski |
20070130012 | June 7, 2007 | Yruski |
20070130594 | June 7, 2007 | Hidary et al. |
20070146812 | June 28, 2007 | Lawton |
20070150919 | June 28, 2007 | Morishita |
20070157220 | July 5, 2007 | Cordray et al. |
20070162945 | July 12, 2007 | Mills |
20070168288 | July 19, 2007 | Bozeman |
20070174471 | July 26, 2007 | Van Rossum |
20070244760 | October 18, 2007 | Bodnar et al. |
20070294740 | December 20, 2007 | Drake et al. |
20070294773 | December 20, 2007 | Hydrie et al. |
20070299935 | December 27, 2007 | Plastina et al. |
20080046948 | February 21, 2008 | Verosub |
20080097872 | April 24, 2008 | Peckover |
20080102947 | May 1, 2008 | Hays et al. |
20080104106 | May 1, 2008 | Rosenberg et al. |
20080109844 | May 8, 2008 | Baldeschwieler et al. |
20080114861 | May 15, 2008 | Gildred |
20080120407 | May 22, 2008 | Chen et al. |
20080127244 | May 29, 2008 | Zhang |
20080137645 | June 12, 2008 | Skog |
20080140239 | June 12, 2008 | Rosenberg et al. |
20080140717 | June 12, 2008 | Rosenberg et al. |
20080141372 | June 12, 2008 | Massey et al. |
20080207137 | August 28, 2008 | Maharajh et al. |
20090083788 | March 26, 2009 | Russell |
20090183081 | July 16, 2009 | Rodriguez |
20090204481 | August 13, 2009 | Navar |
20090254430 | October 8, 2009 | Cherenson |
20100022310 | January 28, 2010 | van Datta |
20100030640 | February 4, 2010 | van Datta |
20100043022 | February 18, 2010 | Kaftan |
20100169467 | July 1, 2010 | Shukla et al. |
20100169910 | July 1, 2010 | Collins et al. |
20100269138 | October 21, 2010 | Krikorian et al. |
20110004669 | January 6, 2011 | Navar |
20110010545 | January 13, 2011 | Kill et al. |
20110015975 | January 20, 2011 | Yruski et al. |
20110029383 | February 3, 2011 | Engel et al. |
20110041161 | February 17, 2011 | Capati |
20110125582 | May 26, 2011 | Datta et al. |
20110138058 | June 9, 2011 | Ishida |
20110307339 | December 15, 2011 | Russell |
20130232000 | September 5, 2013 | van Datta |
20130232001 | September 5, 2013 | van Datta |
20130297411 | November 7, 2013 | van Datta |
20140019229 | January 16, 2014 | van Datta |
20140019249 | January 16, 2014 | Nicholas et al. |
20140089081 | March 27, 2014 | Yruski |
20140215224 | July 31, 2014 | Navar |
20140304328 | October 9, 2014 | Capati |
20140324576 | October 30, 2014 | van Datta |
20140337882 | November 13, 2014 | Navar |
20150294368 | October 15, 2015 | Russell |
20160027053 | January 28, 2016 | van Datta |
20160292736 | October 6, 2016 | Yruski |
20170164030 | June 8, 2017 | Navar |
20170206341 | July 20, 2017 | Navar |
20170208145 | July 20, 2017 | Capati |
9959097 | November 1999 | AU |
2106122 | March 1994 | CA |
2250680 | April 2000 | CA |
1 653 819 | August 2005 | CN |
103279874 | September 2013 | CN |
0 337 539 | October 1989 | EP |
0 405 776 | January 1991 | EP |
0 620 688 | October 1994 | EP |
0 625 760 | November 1994 | EP |
0 743 595 | October 1996 | EP |
0 905 928 | March 1999 | EP |
2 141 907 | January 1985 | GB |
2 194 369 | March 1988 | GB |
1-220925 | September 1989 | JP |
6-335569 | December 1994 | JP |
8-117445 | May 1996 | JP |
8-173634 | July 1996 | JP |
8-280934 | October 1996 | JP |
2001-111921 | April 2001 | JP |
2001-321556 | November 2001 | JP |
2002-259433 | September 2002 | JP |
2002-358455 | December 2002 | JP |
2002-366971 | December 2002 | JP |
2003-248844 | September 2003 | JP |
2004-102475 | April 2004 | JP |
2004-298469 | October 2004 | JP |
WO 1993/14462 | July 1993 | WO |
WO 1993/19427 | September 1993 | WO |
WO 1993/22017 | November 1993 | WO |
WO 1993/23125 | November 1993 | WO |
WO 1995/12442 | May 1995 | WO |
WO 1995/12853 | May 1995 | WO |
WO 98/51384 | November 1998 | WO |
WO 2003/032127 | April 2003 | WO |
WO 2004/100010 | November 2004 | WO |
WO 2005/086969 | September 2005 | WO |
WO 2007/041022 | April 2007 | WO |
WO 2007/041028 | April 2007 | WO |
WO 2007/130681 | November 2007 | WO |
- Andreaux, J.-P.; Copy Protection System for Digital Home Networks; Mar. 2004; IEEE, vol. 21, Issue 2; pp. 100-108.
- Business Wire, “Juno launches America's first free Internet e-mail service; Initial advertisers include Land's End, Miramax and Snapple,” Apr. 19, 1996.
- Business Wire, “RTIME Announces First 100-Person Twitch Game for Internet; ‘RTIME Rocks!’ Demonstrates the Power of the RTIME Interactive Networking Engine to Support Large Scale, High Performance, Internet Game Play,” Apr. 14, 1997.
- Cohen, Josh, “A General Overview of Two New Technologies for Playing Protected Content on Portable or Networked Devices,” Microsoft Windows Media, Jun. 2004, 1-8.
- Courtois, N. et al., “An Algebraic Masking Method to Protect AES Against Power Attacks,” [Online], XP002344150, retrieved from the Internet: URL: eprint.iacr.org/2005/204.pdf, retrieved on Sep. 8, 2005.
- Fontijn, Willem; AmbientDB: P2P Data Management Middleware for Ambient Intelligence; Year: 2004; IEEE; pp. 1-5.
- Microsoft Corporation, “A Technical Overview of Windows Media DRM 10 for Devices,” Microsoft Windows Media, Sep. 2004, 1-16.
- Microsoft Corporation, “Architecture of Windows Media Rights Manager,” www.microsoft.com/windows/windowsmedia/howto/articles/drmarchitecture.aspx, May 2004.
- PricewaterhouseCoopers, “Lab Online Ad Measurement Study,” Dec. 2001.
- Recording Industry Association of America, “Frequently Asked Questions—Webcasting,” www.riaa.com/issues/licensing/webcasting_faq.asp. (acc. 2004).
- Statement in Accordance with the Notice from the European Patent Office dated Oct. 1, 2007 Concerning Business Methods Nov. 1, 2007, XP002456252.
- U.S. Copyright Office, “The Digital Millennium Copyright Act of 1998,” Oct. 1998, 1-18.
- What TV Ratings Really Mean (and Other Frequently-Asked Questions). Nielsen Media Research. Web. <http://documents.chelmsford.k12.ma.us/dsweb/Get/Document-14983/nielsenmedia.htm>, Jun. 2005.
- PCT/US06/037018, International Search Report and Written Opinion dated Aug. 7, 2007.
- PCT/US06/036958, International Search Report and Written Opinion dated Apr. 27, 2007.
- PCT/US07/11059, International Search Report and Written Opinion dated May 30, 2008.
- EP 06815173.7, First Examination Report dated Feb. 23, 2016.
- EP 06815173.7, Extended European Search Report dated Oct. 5, 2011.
- JP 2009-509786, Decision of Refusal dated Oct. 30, 2012.
- JP 2009-509786, Decision of Refusal dated Aug. 2, 2011.
- JP 2009-509786, Decision of Refusal dated Jul. 28, 2011.
- JP 2013-039681, Decision of Refusal dated Feb. 2, 2015.
- JP 2013-039681, Notification of Reason for Refusal dated Feb. 12, 2014.
- CN 200780016268.2, First Office Action dated Jan. 4, 2012.
- CN 201310051520.0, First Office Action dated Sep. 1, 2015.
- EP 07776856.2, Extended European Search Report dated Jun. 9, 2011.
- U.S. Appl. No. 11/241,229 Final Office Action dated Apr. 23, 2010.
- U.S. Appl. No. 11/241,229 Office Action dated Nov. 19, 2009.
- U.S. Appl. No. 13/939,178 Office Action dated Oct. 10, 2013.
- U.S. Appl. No. 14/336,452 Office Action dated Jan. 8, 2016.
- U.S. Appl. No. 12/571,204 Office Action dated Feb. 28, 2012.
- U.S. Appl. No. 12/571,225 Office Action dated Feb. 2, 2012.
- U.S. Appl. No. 11/240,655 Final Office Action dated Nov. 14, 2013.
- U.S. Appl. No. 11/240,655 Office Action dated Aug. 5, 2013.
- U.S. Appl. No. 11/240,655 Final Office Action dated Jan. 27, 2010.
- U.S. Appl. No. 11/240,655 Office Action dated Apr. 16, 2009.
- U.S. Appl. No. 13/857,080 Office Action dated Aug. 2, 2016.
- U.S. Appl. No. 13/857,080 Final Office Action dated Aug. 19, 2015.
- U.S. Appl. No. 13/857,080 Office Action dated Apr. 29, 2015.
- U.S. Appl. No. 13/857,082 Office Action dated Aug. 18, 2016.
- U.S. Appl. No. 13/857,082 Final Office Action dated Aug. 11, 2015.
- U.S. Appl. No. 13/857,082 Office Action dated Apr. 16, 2015.
- U.S. Appl. No. 12/190,323 Final Office Action dated Feb. 25, 2013.
- U.S. Appl. No. 12/190,323 Office Action dated May 7, 2012.
- U.S. Appl. No. 12/190,323 Office Action dated Jun. 8, 2011.
- U.S. Appl. No. 12/190,323 Final Office Action dated Nov. 14, 2011.
- U.S. Appl. No. 14/691,404 Final Office Action dated Mar. 25, 2016.
- U.S. Appl. No. 14/691,404 Office Action dated Nov. 13, 2015.
- U.S. Appl. No. 13/191,398 Final Office Action dated Jun. 10, 2014.
- U.S. Appl. No. 13/191,398 Office Action dated Dec. 3, 2013.
- U.S. Appl. No. 13/191,398 Final Office Action dated Jun. 7, 2013.
- U.S. Appl. No. 13/191,398 Office Action dated Mar. 22, 2012.
- U.S. Appl. No. 11/535,370 Final Office Action dated Jun. 8, 2010.
- U.S. Appl. No. 11/535,307 Office Action dated Dec. 10, 2009.
- U.S. Appl. No. 11/535,307 Final Office Action dated Sep. 8, 2009.
- U.S. Appl. No. 11/535,307 Office Action dated Apr. 16, 2009.
- U.S. Appl. No. 13/013,789 Final Office Action dated Jun. 17, 2016.
- U.S. Appl. No. 13/013,789 Office Action dated Feb. 12, 2016.
- U.S. Appl. No. 13/013,789 Final Office Action dated Jul. 28, 2014.
- U.S. Appl. No. 13/013,789 Office Action dated Dec. 20, 2013.
- U.S. Appl. No. 13/013,789 Final Office Action dated Feb. 27, 2013.
- U.S. Appl. No. 13/013,789 Office Action dated Oct. 9, 2012.
- U.S. Appl. No. 11/452,848 Final Office Action dated Apr. 7, 2015.
- U.S. Appl. No. 11/452,848 Office Action dated Oct. 23, 2014.
- U.S. Appl. No. 11/452,848 Final Office Action dated Jun. 5, 2014.
- U.S. Appl. No. 11/452,848 Office Action dated Nov. 18, 2013.
- U.S. Appl. No. 11/452,848 Final Office Action dated Feb. 15, 2011.
- U.S. Appl. No. 11/452,848 Office Action dated Sep. 15, 2010.
- U.S. Appl. No. 11/452,848 Final Office Action dated Apr. 21, 2010.
- U.S. Appl. No. 11/452,848 Office Action dated Oct. 20, 2009.
- U.S. Appl. No. 11/452,848 Final Office Action dated Jul. 9, 2009.
- U.S. Appl. No. 11/452,848 Office Action dated Jan. 27, 2009.
- U.S. Appl. No. 14/028,327 Final Office Action dated Mar. 19, 2015.
- U.S. Appl. No. 14/028,327 Office Action dated Oct. 8, 2014.
- U.S. Appl. No. 14/028,327 Final Office Action dated Jun. 9, 2014.
- U.S. Appl. No. 14/028,327 Office Action dated Nov. 7, 2013.
- U.S. Appl. No. 14/875,682 Final Office Action dated Jul. 18, 2016.
- U.S. Appl. No. 14/875,682 Office Action dated Jan. 29, 2016.
- U.S. Appl. No. 12/782,678 Final Office Action dated Jul. 31, 2013.
- U.S. Appl. No. 12/782,678 Office Action dated Jan. 7, 2013.
- U.S. Appl. No. 12/782,678 Office Action dated Oct. 4, 2012.
- U.S. Appl. No. 14/308,313 Final Office Action dated Oct. 23, 2015.
- U.S. Appl. No. 14/308,313 Office Action dated Apr. 27, 2015.
- U.S. Appl. No. 11/586,990 Office Action dated Mar. 18, 2016.
- U.S. Appl. No. 11/586,990 Final Office Action dated Dec. 8, 2014.
- U.S. Appl. No. 11/586,990 Office Action dated Aug. 12, 2014.
- U.S. Appl. No. 11/586,990 Final Office Action dated Apr. 7, 2014.
- U.S. Appl. No. 11/586,990 Office Action dated Nov. 20, 2013.
- U.S. Appl. No. 11/586,990 Final Office Action dated Apr. 10, 2013.
- U.S. Appl. No. 11/586,990 Office Action dated Nov. 23, 2012.
- U.S. Appl. No. 11/586,990 Final Office Action dated Feb. 14, 2011.
- U.S. Appl. No. 11/586,990 Office Action dated Sep. 15, 2010.
- U.S. Appl. No. 11/588,036 Office Action dated Aug. 31, 2016.
- U.S. Appl. No. 11/588,036 Final Office Action dated Aug. 4, 2015.
- U.S. Appl. No. 11/588,036 Office Action dated Jan. 15, 2015.
- U.S. Appl. No. 11/588,036 Final Office Action dated Apr. 15, 2014.
- U.S. Appl. No. 11/588,036 Office Action dated Jan. 6, 2014.
- U.S. Appl. No. 11/588,036 Final Office Action dated Oct. 4, 2012.
- U.S. Appl. No. 11/588,036 Office Action dated Apr. 27, 2012.
- U.S. Appl. No. 11/588,036 Final Office Action dated Feb. 17, 2011.
- U.S. Appl. No. 11/588,036 Office Action dated Sep. 14, 2010.
- U.S. Appl. No. 11/586,958 Office Action dated Jun. 23, 2016.
- U.S. Appl. No. 11/586,958 Final Office Action dated Aug. 4, 2015.
- U.S. Appl. No. 11/586,958 Office Action dated Jan. 14, 2015.
- U.S. Appl. No. 11/586,958 Final Office Action dated Mar. 12, 2014.
- U.S. Appl. No. 11/586,958 Office Action dated Nov. 6, 2013.
- U.S. Appl. No. 11/586,958 Final Office Action dated Feb. 14, 2011.
- U.S. Appl. No. 11/586,958 Office Action dated Sep. 30, 2010.
- U.S. Appl. No. 11/586,989 Final Office Action dated Dec. 9, 2010.
- U.S. Appl. No. 11/586,989 Office Action dated May 11, 2010.
- U.S. Appl. No. 11/586,989 Office Action dated Mar. 20, 2009.
- U.S. Appl. No. 14/091,327 Office Action dated Mar. 12, 2015.
- U.S. Appl. No. 11/586,959 Final Office Action dated Jan. 29, 2016.
- U.S. Appl. No. 11/586,959 Office Action dated Jul. 9, 2015.
- U.S. Appl. No. 11/586,959 Final Office Action dated Dec. 8, 2014.
- U.S. Appl. No. 11/586,959 Office Action dated Jul. 31, 2014.
- U.S. Appl. No. 11/586,959 Office Action dated Feb. 12, 2014.
- U.S. Appl. No. 11/586,959 Final Office Action dated Aug. 30, 2013.
- U.S. Appl. No. 11/586,959 Office Action dated May 8, 2013.
- U.S. Appl. No. 11/586,959 Final Office Action dated Oct. 5, 2012.
- U.S. Appl. No. 11/586,959 Office Action dated Apr. 27, 2012.
- U.S. Appl. No. 11/586,959 Final Office Action dated Feb. 14, 2011.
- U.S. Appl. No. 11/586,959 Office Action dated Oct. 1, 2010.
- U.S. Appl. No. 12/370,531 Office Action dated Aug. 1, 2013.
- U.S. Appl. No. 12/370,531 Final Office Action dated Aug. 3, 2011.
- U.S. Appl. No. 12/370,531 Office Action dated Nov. 16, 2011.
- U.S. Appl. No. 12/370,531 Final Office Action dated Aug. 1, 2011.
- U.S. Appl. No. 12/370,531 Office Action dated Feb. 2, 2011.
- U.S. Appl. No. 14/315,694 Office Action dated Mar. 25, 2016.
- U.S. Appl. No. 14/315,694 Final Office Action dated Oct. 27, 2015.
- U.S. Appl. No. 14/315,694 Office Action dated Apr. 10, 2015.
- U.S. Appl. No. 11/588,236 Office Action dated Sep. 9, 2009.
- U.S. Appl. No. 11/588,236 Office Action dated Mar. 5, 2009.
- U.S. Appl. No. 12/703,188 Final Office Action dated Sep. 7, 2016.
- U.S. Appl. No. 12/703,188 Office Action dated Apr. 1, 2016.
- U.S. Appl. No. 12/703,188 Office Action dated Apr. 1, 2015.
- U.S. Appl. No. 12/703,188 Final Office Action dated Jul. 14, 2014.
- U.S. Appl. No. 12/703,188 Office Action dated Nov. 21, 2013.
- U.S. Appl. No. 12/703,188 Final Office Action dated Oct. 12, 2012.
- U.S. Appl. No. 12/703,188 Office Action dated Apr. 6, 2012.
- U.S. Appl. No. 10/924,009 Supplemental Final Office Action dated Feb. 4, 2009.
- U.S. Appl. No. 10/924,009 Final Office Action dated Dec. 5, 2008.
- U.S. Appl. No. 10/924,009 Office Action dated Jun. 30, 2008.
- U.S. Appl. No. 12/717,108 Final Office Action dated Jan. 31, 2012.
- U.S. Appl. No. 12/717,108 Final Office Action dated Jul. 20, 2011.
- U.S. Appl. No. 12/717,108 Office Action dated Feb. 9, 2011.
- U.S. Appl. No. 14/242,664 Office Action dated Feb. 29, 2016.
- U.S. Appl. No. 14/242,664 Office Action dated Aug. 31, 2015.
- U.S. Appl. No. 14/691,404 Office Action dated Oct. 27, 2016.
- U.S. Appl. No. 15/180,615 Office Action dated Nov. 2, 2016.
- U.S. Appl. No. 15/333,932, Allister Capati, Management of Ancillary Content Delivery and Presentation, filed Oct. 25, 2016.
- U.S. Appl. No. 15/385,688, Murgesh Navar, Discovery and Analytics for Episodic Downloaded Media, filed Dec. 20, 2016.
- U.S. Appl. No. 15/391,522, Murgesh Navar, Statutory License Restricted Digital Media Playback on Portable Devices, Not Yet Assigned.
- U.S. Appl. No. 13/857,080 Final Office Action dated Feb. 24, 2017.
- U.S. Appl. No. 13/857,082 Final Office Action dated Feb. 28, 2017.
- U.S. Appl. No. 14/691,404 Final Office Action dated Mar. 22, 2017.
- U.S. Appl. No. 14/875,682 Office Action dated Jan. 26, 2017.
- U.S. Appl. No. 11/588,036 Final Office Action dated Mar. 15, 2017.
- U.S. Appl. No. 11/586,958 Final Office Action dated Jan. 18, 2017.
- U.S. Appl. No. 13/013,789 Office Action dated May 4, 2017.
- U.S. Appl. No. 15/180,615 Final Office Action dated May 19, 2017.
- U.S. Appl. No. 12/703,188 Office Action dated Apr. 5, 2017.
- U.S. Appl. No. 13/857,080 Office Action dated Jul. 28, 2017.
- U.S. Appl. No. 13/013,789 Final Office Action dated Aug. 25, 2017.
- U.S. Appl. No. 15/333,932 Office Action dated Aug. 14, 2017.
- U.S. Appl. No. 14/691,404 Office Action dated Sep. 21, 2017.
- U.S. Appl. No. 15/391,522 Office Action dated Nov. 27, 2017.
- U.S. Appl. No. 14/691,404 Final Office Action dated Jan. 30, 2018.
- U.S. Appl. No. 14/875,682 Office Action dated Feb. 22, 2018.
- U.S. Appl. No. 11/588,036 Office Action dated Mar. 16, 2018.
- U.S. Appl. No. 15/385,688 Final Office Action dated Jan. 18, 2018.
- U.S. Appl. No. 15/866,308, filed Jan. 9, 2018, Andrey Yruski, Asynchronous Advertising.
Type: Grant
Filed: Oct 5, 2016
Date of Patent: May 29, 2018
Patent Publication Number: 20170091804
Assignee: SONY INTERACTIVE ENTERTAINMENT AMERICA LLC (San Mateo, CA)
Inventors: Glen van Datta (San Diego, CA), Gary Zalewski (Oakland, CA)
Primary Examiner: Kevin Y Kim
Application Number: 15/285,928
International Classification: A63F 9/24 (20060101); A63F 13/00 (20140101); G06F 17/00 (20060101); G06F 19/00 (20180101); G06Q 30/02 (20120101); A63F 13/5258 (20140101); A63F 13/61 (20140101); H04N 21/81 (20110101); H04N 21/478 (20110101); H04N 21/61 (20110101); H04N 21/258 (20110101);