System for Real Time Internet Protocol Content Integration, Prioritization and Distribution

- Spotible Labs LLC

A system and method to integrate, prioritize, and distribute multimedia assets into Internet Protocol (IP) media in real time. Embodiments include dynamic media integration based on permissible content elements/locations and app events, and asset formatting based on user exposure, key performance indicators, and contextual relevancy. Consumer engagement is enhanced through gesture-based interaction on the user's native device and intelligent routing from the user's device when performing, for example, a quick response (QR) or embedded code scan; routing is prioritized based on user geolocation and other criteria as well as landing page and other performance parameters. The most suitable assets are dynamically selected, integrated, and distributed into the content environment.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application Number 63/059484, filed Jul. 31, 2020, which is hereby incorporated by reference herein in its entirety.

FIELD OF THE INVENTION

The present invention relates generally to computer-based methods and apparatuses, including computer program products, for a system and method of real time Internet Protocol (IP) content integration, prioritization and distribution.

BACKGROUND OF THE DISCLOSURE

Digital media promotions can be targeted and displayed in a multitude of ways. By and large, however, IP connected media and app promotions are preset in placement.

Conventionally, digital media ad placements are based on predetermined locations positioned inside a website/webpage by location-specific ad tags, and rotated by ad systems. Beyond webpage content, there are increasing ways consumers are accessing digital content, such as through native smartphone apps and voice-based smart speakers. Native platforms pose even greater challenges in the integration of ad placements, which need to be developed and released in new application versions, a process that may take days, weeks, or sometimes several months before being completed, approved, and updated on an app store.

Consumer in-app engagement, for example on a connected TV (CTV), or engagement by means of a device such as a smartphone with, for example, a quick response (QR) code presented in digital or non-digital media, is encumbered by cumbersome remote-control-based response mechanisms and a lack of targeted embedded Uniform Resource Locators (URLs) for personalized landing pages.

Additionally, digital media ad delivery is predominantly fixed in orientation, unable to adapt its placement to particular circumstances.

The exemplary disclosed system and method of the present disclosure is directed to overcoming one or more of the shortcomings set forth above and/or other deficiencies in existing technology.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a schematic overview of a computing device, in accordance with an embodiment of the present invention;

FIG. 2 illustrates a network schematic of a system, in accordance with an embodiment of the present invention;

FIG. 3A is a block diagram of an exemplary embodiment of the dynamic creative asset controller of the present invention;

FIG. 3B is a block diagram showing additional detail of interaction between an application and the multimedia server shown in FIG. 3A;

FIG. 4 is a flow diagram of the dynamic integration of creative assets process from the point of view of an application receiving the creative assets;

FIG. 5 is a block diagram of a computer system for practicing embodiments of a multimedia server portion of the dynamic creative asset controller;

FIG. 6 is a flow diagram of the dynamic integration of creative assets process from the point of view of a multimedia server;

FIG. 7-FIG. 11 are screenshot views of a digital stream view through a multimedia device display in accordance with a dynamic streaming media integration embodiment of the present invention;

FIG. 12 is a diagrammatic view of a format optimization scheme for creative asset presentation based on dynamic user exposure rules in accordance with an embodiment of the present invention;

FIG. 13-FIG. 17 are diagrammatic views of an auto-optimization scheme in dynamic display of creative asset presentation based on viewability rate in accordance with an embodiment of the present invention;

FIG. 18-FIG. 23 are diagrammatic views of an auto-optimization scheme based on maximum view through of a creative asset in accordance with an embodiment of the present invention;

FIG. 24-FIG. 27 are diagrammatic views of a process for dynamic rendering of relevant audio creative asset integration in accordance with an embodiment of the present invention; and

FIG. 28 is a diagrammatic view of an intelligent based routing process in accordance with an embodiment of the present invention.

SUMMARY OF THE INVENTION

In one aspect of the subject disclosure, a method for dynamically integrating ads into IP content, performed by a computer processor, is disclosed. The method includes extracting structural metadata and contextual features from a digital media file. The digital media file is configured for performance in an Internet Protocol (IP) based medium. The structural metadata and contextual features are analyzed for an understanding of characteristics associated with the digital media file. A digital media content environment is identified from the characteristics associated with the digital media file. An advertisement asset is selected from a storage of assets. The selection is based on: content included in the digital media file, wherein the selected advertisement asset includes a correlation to the structural metadata and contextual features of the digital media file; the digital media content environment; and rules with respect to a native content integration correlating with the performance of the selected advertisement asset in the digital media content environment. Dynamic rules for asset formatting of the selected advertisement asset are processed. The advertisement asset is integrated and distributed into the digital media content environment.

In another aspect, a computer program product for dynamically integrating ads into IP content is disclosed. The computer program product comprises one or more computer readable storage media, and program instructions collectively stored on the one or more computer readable storage media. The program instructions include: extracting structural metadata and contextual features from a digital media file, wherein the digital media file is configured for performance in an Internet protocol (IP) based medium; analyzing the structural metadata and contextual features for an understanding of characteristics associated with the digital media file; identifying a digital media content environment from the characteristics associated with the digital media file; selecting an advertisement asset from a storage of assets, wherein the selection is based on: content included in the digital media file, wherein the selected advertisement asset includes a correlation to the structural metadata and contextual features of the digital media file, the digital media content environment, and rules with respect to a native content integration correlating with the performance of the selected advertisement asset in the digital media content environment; processing dynamic rules for asset formatting of the selected advertisement asset; and integrating and distributing the advertisement asset into the digital media content environment.

The techniques described herein may be implemented in a number of ways. Example implementations are provided below with reference to the following figures.

DETAILED SPECIFICATION

The subject technology generally relates to dynamically integrating creative assets into software applications (apps) or other digital media. In an exemplary embodiment, a method dynamically integrates creative assets and content into web-based (IP-connected) media and apps at any time within a content's play/replay and at any dynamically determined location in the IP-based digital media while a user interacts with the content or application. Creative assets may be, for example, advertising, promotional, or informative multimedia content.

Embodiments provide computer-based methods and systems for providing a minimally intrusive mechanism that allows a media entity such as a content publisher to dynamically incorporate creative assets, such as advertisements, into digital media. Digital media may constitute, for example, IP connected websites/webpages, email clients, native applications, over-the-top (OTT) streaming platforms, CTVs, smart speakers, gaming consoles, out-of-home (OOH) billboards/signs, augmented reality (AR) and virtual reality (VR) headsets/displays or other environments (collectively, an “app”). Example embodiments provide real-time creative asset or content targeting for enabling a viewer or listener to experience promotions or other relevant content dynamically integrated into the app as the user consumes the digital media. These promotions can be dynamically updated as the user navigates the digital media and may be triggered by predetermined events or behaviors of the user with respect to the app.

Referring now to FIGS. 7-11, some embodiments may include real-time event listeners and triggers for content or activity-based events integrated anywhere in the app at any time. An "event listener" as used herein may refer to a function in a computer program that waits for an event to occur (also known as an "event handler"), based on various unique or combined event detections, for example cue-points (timestamped markers), ID3 tags (metadata containers), HTTP Live Streaming (HLS) computer vision analysis and/or real-time data sources. For example, during live or on-demand video streaming, such as watching a professional sports game, a dynamic picture-in-picture frame may be triggered by the system on an event basis to display in the app a nonlinear commercial during streaming content playback, allowing a user to view and pan the commercial window while continuing to view the streaming content window in the app on the user's device. FIGS. 7-11 show representations of a digital video stream, for example live TV. The figures depict a video stream being viewed in an app on a device (for example, a CTV, smartphone, or desktop) with a dynamic commercial break which dynamically resizes the video stream player window for display of a non-linear multimedia commercial without interruption of the content video stream. This feature may be well received during a live event such as a professional sports game, so that display of the live event is not obstructed or otherwise interfered with. Embodiments may include user options to interact with or enlarge the dynamically integrated commercial frame upon interest in the commercial content, or to close or zoom out from the commercial break window before completion of the commercial content. FIG. 8 shows a commercial break window being dynamically integrated into the display area for the main streaming content. In some embodiments, the commercial break window may be integrated gradually into the display or may instantly pop into the display screen. The dynamic commercial break window may wrap the streaming content window in a variety of ways in accompaniment of the commercial content, including additional landscape or portrait display of multimedia creative assets surrounding the streaming content player window, a background and/or imagery related to the creative asset, and/or additional information related to either the asset or the streaming content based on real-time data sources or related information/statistics. FIG. 9 shows an adjunct window, at full size, for the commercial juxtaposed with the main content streaming window. FIGS. 10 and 11 represent the adjunct window for the dynamic commercial break gradually being removed from display until the main streaming content occupies the principal areas of the display. Event-based triggering of dynamic commercial breaks can be programmed to avoid rendering during detection of linear commercials playing during the content stream. In another embodiment, displaying a commercial on-screen may be triggered by prolonged user inactivity or content pause. In this case, the trigger for displaying or activating the commercial is a predetermined period of inactivity, which can be based on implicit actions, for example a lack of user input for a certain period of time, or explicit actions, for example pressing pause by mouse click, touch screen, or remote control, depending on the viewer device type.
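
The following TypeScript sketch illustrates one possible way such an event-triggered, non-linear commercial break could be wired up in a browser-based player, using in-band timed metadata exposed as a "metadata" text track as the cue source. The element classes, the fetchCommercial() helper, and the double-click dismissal gesture are hypothetical and are shown only to make the flow concrete.

```typescript
// Minimal sketch: a cue-triggered, non-linear "dynamic commercial break".
// Class names and fetchCommercial() are assumptions, not a defined interface.

interface CommercialAsset {
  videoUrl: string;
  durationSec: number;
}

declare function fetchCommercial(): Promise<CommercialAsset>; // assumed server call

function watchForCuePoints(video: HTMLVideoElement): void {
  // In-band timed metadata (e.g., ID3 in HLS) surfaces as a "metadata" text track.
  for (const track of Array.from(video.textTracks)) {
    if (track.kind !== "metadata") continue;
    track.mode = "hidden"; // receive cuechange events without rendering cues
    track.addEventListener("cuechange", async () => {
      const cue = track.activeCues?.[0];
      if (cue) await openCommercialBreak(video);
    });
  }
}

async function openCommercialBreak(content: HTMLVideoElement): Promise<void> {
  const asset = await fetchCommercial();

  // Shrink the live stream instead of interrupting it.
  content.classList.add("pip-shrunk"); // e.g., CSS scales the player to ~60% width

  const frame = document.createElement("video");
  frame.src = asset.videoUrl;
  frame.className = "commercial-frame";
  frame.autoplay = true;
  frame.muted = true; // respect autoplay policies; the user may unmute

  // Allow the viewer to dismiss early; otherwise close when the spot ends.
  const close = () => {
    frame.remove();
    content.classList.remove("pip-shrunk");
  };
  frame.addEventListener("ended", close);
  frame.addEventListener("dblclick", close); // gesture-based early dismissal

  document.body.appendChild(frame);
}
```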

In one embodiment, a media entity is provided a location-based events control panel to insert creative assets, promotional content or other media in IP connected digital media. The media entity dynamically permits locations within the digital media where creative assets may be rendered. Each permitted location-based event is associated with a content element, which is preferably further associated with one or more criteria, which are used by the system to select the most appropriate creative assets, promotional content or other media. In one example, the media entity may decide to permit the insertion of creative assets into a certain part of a page section, or above/below a certain content element, or within a certain paragraph number depending on the length of the webpage or article. In another example, if the subject matter of the content being viewed by a user relates to a certain category such as vehicles, the media entity could enable advertisers to dynamically integrate a car commercial at an optimal location in the content that is not hard-coded in advance but instead determined on-the-fly based on location placement logic and rules, to create the highest level of viewability and contextual adjacency and the lowest dilution by maintaining a defined distance from competing promotions or content.
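
As a non-limiting sketch, the permitted locations and the on-the-fly placement logic described above might be expressed as declarative rules that are scored at render time. The rule fields, selectors, and equal weighting of the viewability and contextual scores are illustrative assumptions, not a prescribed implementation.

```typescript
// Hypothetical rule shape for media-entity-permitted insertion locations.
interface PermittedLocation {
  selector: string;            // e.g. "article p:nth-of-type(3)" or "section.reviews"
  allowedCategories: string[]; // e.g. ["vehicles", "travel"]
  minDistancePx: number;       // required distance from competing promotions
}

interface Candidate {
  location: PermittedLocation;
  element: Element;
  viewabilityScore: number;    // 0..1, estimated from position in the viewport
  contextScore: number;        // 0..1, contextual adjacency to the page content
  nearestCompetitorPx: number;
}

// Pick the location with the best combined score that satisfies the distance rule.
function chooseLocation(candidates: Candidate[], category: string): Candidate | null {
  const eligible = candidates.filter(
    (c) =>
      c.location.allowedCategories.includes(category) &&
      c.nearestCompetitorPx >= c.location.minDistancePx
  );
  eligible.sort(
    (a, b) =>
      b.viewabilityScore + b.contextScore - (a.viewabilityScore + a.contextScore)
  );
  return eligible[0] ?? null;
}
```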

In some embodiments, user activity-based event listeners are programmed to detect certain actions when a user interacts with an app. These actions can include logging into an app, scrolling through content, hovering over an object, selecting a category of content, or some other in-app activity. Promotional integration and exposure are thus achieved throughout the viewer's entire content journey in the app, instead of just a specific section. For example, FIG. 12 shows a depiction of a scheme for variation and optimization of promotional experiences based on user exposure rules. Embodiments of the variation and optimization processes may present different creative asset formats at different times throughout an app. Today, ad placements are predominantly limited to showing, for example, a video commercial when a viewer is watching a video channel in an app (for example on an OTT platform). With the subject system and event method, a video commercial can be shown to the user when they first start/open the app as depicted in FIG. 12, and as further depicted an additional opportunity to show a commercial can occur while the user is navigating in the app between different views, screens or rows until the viewer has decided which content to watch. Some embodiments of background processes may include rules which determine what creative asset format to select and use for subsequent commercials. For example, a rule may dictate that a second commercial shown to the user should be an image/display message instead of video if, for instance, a rule is set to show the user only one video promotion per app session/journey. In addition, commercials can be displayed in different formats depending on different user exposure rules such as total time of an app session, creative asset format exposure (no more than one of a certain type in a session due to intrusiveness), creative asset type exposure (for example, show a video commercial on first view and an image-based commercial on subsequent views) and device type (desktop, smartphone, CTV). In one embodiment, customizable ad types are used to encourage behaviors of users in the app, such as purchasing a video or subscribing to a podcast. Incentives or rewards can be given to viewers who perform the intended actions of marketers, such as providing a free pass to view subscriber content or a discount coupon for a product or service upon purchasing, watching a video, listening in whole or in part, or subscribing to a podcast or other form of content. This provides a mechanism for entities to encourage user engagement based upon the demonstrated interests and in-app activities of the content viewer.
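
A per-session exposure rule of the kind described above might be represented roughly as follows; the field names and the one-video-per-session policy are assumptions used only to make the idea concrete.

```typescript
// Hypothetical per-session exposure state and format-selection rule.
type AssetFormat = "video" | "image" | "audio";

interface SessionExposure {
  videoPromosShown: number;
  sessionSeconds: number;
  deviceType: "desktop" | "smartphone" | "ctv";
}

function nextFormat(s: SessionExposure): AssetFormat {
  // Example rule set: at most one video promotion per app session,
  // and no video at all in very short sessions.
  if (s.videoPromosShown === 0 && s.sessionSeconds > 30) return "video";
  return "image";
}
```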

According to an embodiment, monetization and inventory management optimization can be achieved based on ad exposure time and effective value per user visit. For instance, commercial lengths and formats shown on-the-fly can be optimized based on a total value per visit goal. By way of example, when selecting commercials of various lengths (for example, 15 and 30 seconds) or types (for example, video or image), a dynamic selection can be made based on the expected length of a content viewer's app visit, calculated from the average time spent during previous visits, effectively optimizing ad frequency and creative asset format exposure to achieve a revenue goal per viewer app session.
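
One minimal way to express such a selection, assuming a simple visit-length threshold and per-spot CPM values (both of which are illustrative assumptions), is sketched below.

```typescript
// Illustrative on-the-fly selection of spot length toward a per-visit revenue goal.
// The 120-second threshold and field names are assumptions.
interface Spot {
  lengthSec: 15 | 30;
  cpmUsd: number; // revenue per thousand impressions
}

function pickSpot(expectedVisitSec: number, spots: Spot[]): Spot {
  // Short expected visits favor 15-second spots so enough impressions still fit
  // in the session; longer visits can absorb 30-second spots at a higher CPM.
  const wantedLength = expectedVisitSec < 120 ? 15 : 30;
  const matching = spots.filter((s) => s.lengthSec === wantedLength);
  const pool = matching.length > 0 ? matching : spots;
  return pool.reduce((best, s) => (s.cpmUsd > best.cpmUsd ? s : best));
}
```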

In other embodiments, the time a content viewer is subjected to the creative asset is also measured. Additional qualitative factors can also yield subjective measurement values, such as the activity the viewer was engaged in at the time of exposure to the creative asset. Any one or more factors can be used alone, or in combination, or weighted in some manner, to achieve optimal creative asset integration, location, and orientation. Accordingly, embodiments may programmatically adapt functionality of creative asset formats to meet certain performance goals which can be defined for each type of user or device. For example, a key performance indicator (KPI) rule may be programmed for a creative asset format to optimize rendering in accordance with the largest viewable video size versus the highest video completion rate. For example, in one instance a video or multimedia creative asset is dynamically displayed at the top of an app in the largest viewable size possible for at least two seconds to meet the Media Rating Council (MRC) accredited viewability standard. Alternatively, the same video or multimedia creative asset may be dynamically formatted to display at the top of the app and additionally persist in view when a viewer scrolls down the app in a customizable manner (e.g., persisting the video in a certain orientation, size and amount of time) to achieve a certain multimedia or video quartile/completion rate. FIGS. 13-17 illustrate a process of auto-optimization in dynamic display of a creative asset to achieve an advertiser viewability goal (e.g., at least three seconds in user view whether stationary or scrolling down the content). The creative asset is dynamically integrated to persist in the user's view on the app until the KPI is achieved and, once achieved, the creative asset may be snapped into a certain place in the app, potentially out of view of the user.
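
A browser-side sketch of this persist-until-viewable behavior could rely on an IntersectionObserver to accumulate in-view time and then release the pinned creative. The "pinned" class, the 50% visibility threshold, and the 3-second goal are assumptions for illustration.

```typescript
// Sketch: keep a creative pinned in view until a viewability KPI
// (here, 3 cumulative seconds at >=50% visibility) is met, then snap it back.
function persistUntilViewable(creative: HTMLElement, goalMs = 3000): void {
  let accumulatedMs = 0;
  let visibleSince: number | null = null;
  let timer: number | undefined;

  creative.classList.add("pinned"); // e.g. position: sticky; top: 0

  const achieve = () => {
    creative.classList.remove("pinned"); // snap back into normal document flow
    observer.disconnect();
  };

  const observer = new IntersectionObserver(
    ([entry]) => {
      const now = performance.now();
      if (entry.isIntersecting) {
        visibleSince = now;
        // Fire once the remaining in-view time has elapsed, unless scrolled away.
        timer = window.setTimeout(achieve, goalMs - accumulatedMs);
      } else if (visibleSince !== null) {
        accumulatedMs += now - visibleSince;
        visibleSince = null;
        window.clearTimeout(timer);
      }
    },
    { threshold: 0.5 } // count time only while at least half the creative is visible
  );
  observer.observe(creative);
}
```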

In another embodiment, KPI optimization may also be programmed for the viewability of imagery or multimedia with reactive persistence. FIGS. 18-23 illustrate a process of auto-optimization in reactive persistence of a KPI for maximum view through of a creative asset, in this case, a carousel of imagery. In this example, the creative asset will automatically persist within the viewable area of the user until completion of the media presentation (for example, the completion of a video or, in this case, the completion of the image assets shown in the carousel). As the user scrolls down the app content, the carousel will reactively persist (e.g., auto-flip through the carousel on each user scroll or time event, which can be configured by a step count, an amount of content scrolled by the user, or a time interval). Upon reaching the last image or multimedia creative of the carousel, the KPI is achieved and the creative asset will be snapped into position, potentially out of view of the user (in this example, after the third slide of the 3-slide carousel).
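
The reactive persistence of the carousel might be sketched as follows, advancing one slide per scroll step and releasing the element once the final slide has been shown. The step size, class name, and CSS custom property used to translate the carousel track are assumptions.

```typescript
// Sketch: a pinned carousel that advances one slide per scroll "step" and
// releases (snaps out of view) once the final slide has been shown.
function reactiveCarousel(container: HTMLElement, slideCount: number, stepPx = 400): void {
  let slide = 0;
  let lastY = window.scrollY;

  container.classList.add("pinned");

  const onScroll = () => {
    const delta = window.scrollY - lastY;
    if (delta >= stepPx && slide < slideCount - 1) {
      slide += 1;
      lastY = window.scrollY;
      container.style.setProperty("--slide", String(slide)); // CSS translates the track
      if (slide === slideCount - 1) {
        // KPI met: every slide was viewed; release the carousel.
        container.classList.remove("pinned");
        window.removeEventListener("scroll", onScroll);
      }
    }
  };
  window.addEventListener("scroll", onScroll, { passive: true });
}
```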

Additionally, phases/break points can be programmed to adapt creative elements, or different creative sizes/aspect ratios, across different device/screen types. For example, a horizontal 16:9 aspect ratio video can be auto-formatted on larger desktop/laptop, tablet or TV screens, and a vertical 9:16 aspect ratio video can be auto-formatted in accordance with a KPI rule for smartphone device screens.
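
A minimal break-point check of this kind, assuming a hypothetical 600-pixel portrait break point, might look like the following.

```typescript
// Sketch: choose a creative rendition by screen class (assumed break point).
function pickAspectRatio(): "16:9" | "9:16" {
  // Portrait phone screens get vertical video; larger/landscape screens get horizontal.
  return window.matchMedia("(max-width: 600px) and (orientation: portrait)").matches
    ? "9:16"
    : "16:9";
}
```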

In other embodiments, event notifications are sent to the media entity from the app to provide information on the creative assets integrated, dynamic creative asset locations available, and/or other events. These event notifications may be used by the multimedia server to schedule creative assets. Additionally, signals can be detected in a video stream by programmed cue points, metadata, or via real-time multimedia machine analysis to determine a certain point in time or content event as a trigger to dynamically integrate a creative asset by either linear or non-linear insertion. For example, upon identification of a certain cue event or context, such as a change in score in a live streaming game, a creative asset can be overlaid on the screen of the app promoting a certain brand. Alternatively, silence can be detected, such as the point at which a podcast is transitioning between topics in an episode, as a dynamic point in time at which to introduce a sponsor message or commercial break on a user's device (for example, a smartphone or smart speaker).
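
By way of illustration only, silence detection of this kind could be approximated in a browser with the Web Audio API by monitoring the RMS level of the playing audio; the 1.5-second window and RMS threshold below are assumptions.

```typescript
// Sketch: detect a stretch of near-silence in an audio stream (e.g., a podcast
// topic transition) as a trigger point for a sponsor message.
function detectSilence(audio: HTMLAudioElement, onSilence: () => void): void {
  // An AudioContext typically must be created or resumed after a user gesture.
  const ctx = new AudioContext();
  const source = ctx.createMediaElementSource(audio);
  const analyser = ctx.createAnalyser();
  source.connect(analyser);
  analyser.connect(ctx.destination);

  const samples = new Float32Array(analyser.fftSize);
  let quietSince: number | null = null;

  const tick = () => {
    analyser.getFloatTimeDomainData(samples);
    const rms = Math.sqrt(samples.reduce((s, v) => s + v * v, 0) / samples.length);
    const now = performance.now();
    if (rms < 0.01) {
      quietSince = quietSince ?? now;
      if (now - quietSince > 1500) { // ~1.5 s of near-silence
        onSilence();                 // e.g., cue a sponsor message
        quietSince = null;
      }
    } else {
      quietSince = null;
    }
    requestAnimationFrame(tick);
  };
  requestAnimationFrame(tick);
}
```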

In one embodiment, the real-time content asset integration, prioritization and distribution system (CAIPD hereinafter, sometimes referred to simply as the “system”) comprises a multimedia server, a creative asset client, and a creative asset server, each of which is connected to a communication network, such as the Internet. For purposes of this application, the term “multimedia server” shall be regarded as equivalent to “creative asset server,” since either may serve creative assets and/or other types of messaging or content. In some embodiments, the system further incorporates a creative asset client for managing communication with the multimedia server. In some other embodiments, a first communication link is made between the creative asset client and the multimedia server, which is used to provide a second communication link between the system (and/or the creative asset client associated therewith) and the multimedia server. In one embodiment, the second communication link is a connection for continual delivery of creative assets, promotions, events, and other communication.

According to an embodiment of the invention, the system and method provide dynamic formatting of text (or imagery) or micro-segmented audio content (or video) for presentation and engagement with viewers in the highest performing way. For example, multimedia can be formatted in an inline article format versus a persistent footer format, where the creative asset format that drives the most interactions, click-throughs, audio (or video) tune-ins/unmutes and/or download and subscription engagement will be shown the most. In one example, a transcription of audio or video assets may be displayed to maximize engagement with viewers without their needing to actually listen to (or watch) the multimedia asset with audio. In other examples, various animations can be applied to entice reading of what is being transcribed, with optimization toward the highest performing animation effect or creative format. FIGS. 24-27 illustrate rendering of general or contextually relevant audio content according to an exemplary embodiment. The system player displays dynamically transcribed or provided transcription of multimedia audio (or video) assets to be read, or optionally tuned in to or downloaded to be listened to now or later. The selection of assets can be based on contextual relevancy to serve as related listening (or viewing) for a content environment, and presentation of the creative asset in audio or video form can be dynamically optimized to appear in the manner found most effective based on viewer engagement rate.
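
Showing the best-performing format "the most" while still testing alternatives can be approximated with a simple epsilon-greedy policy, sketched below. The format names and the 10% exploration rate are assumptions; any bandit-style selection rule could be substituted.

```typescript
// Sketch: favor the creative format with the highest observed engagement rate,
// while still exploring alternatives occasionally (epsilon-greedy policy).
interface FormatStats {
  format: "inline-article" | "persistent-footer" | "transcribed-text";
  impressions: number;
  engagements: number; // clicks, unmutes, tune-ins, subscriptions, etc.
}

function chooseFormat(stats: FormatStats[], epsilon = 0.1): FormatStats {
  if (Math.random() < epsilon) {
    return stats[Math.floor(Math.random() * stats.length)]; // explore
  }
  return stats.reduce((best, s) =>
    s.engagements / Math.max(s.impressions, 1) >
    best.engagements / Math.max(best.impressions, 1)
      ? s
      : best
  );
}
```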

In another embodiment, a method is provided for displaying a remote control asset so that the audio (or video) creative asset can be paused and played at any point in the app when the main creative asset audio or video player is scrolled out of view of the user, along with presenting a call-to-action to download or subscribe from the persisting remote control asset on the user's screen (which can be configured, for example, to persist in the lower right of the app).

Referring now to FIG. 1, a computing device 100 appropriate for use with embodiments of the present application is shown according to an exemplary embodiment. The computing device 100 may generally be comprised of one or more of a Central Processing Unit (CPU) 101, Random Access Memory (RAM) 102, and a storage medium (e.g., hard disk drive, solid state drive, flash memory) 103. Examples of computing devices usable with embodiments of the present invention include, but are not limited to, personal computers, smartphones, laptops, tablet PCs, mobile computing devices, connected TVs, gaming consoles, smart speakers, AR and VR headsets/displays and servers. The term computing device may also describe two or more computing devices communicatively linked in a manner as to distribute and share one or more resources, such as clustered computing devices and server banks/farms. One of ordinary skill in the art would understand that any number of computing devices could be used, and embodiments of the present invention are contemplated for use with any computing device.

In an exemplary embodiment, data may be provided to the system, stored by the system and provided by the system to users of the system across local area networks (LANs) (e.g., office networks, home networks) or wide area networks (WANs) (e.g., the Internet) with protection of personally identifiable information. In accordance with the previous embodiment, the system may be comprised of numerous servers communicatively connected across one or more LANs and/or WANs. One of ordinary skill in the art would appreciate that there are numerous manners in which the system could be configured and embodiments of the present invention are contemplated for use with any configuration.

In general, the system and methods provided herein may be consumed by a user of a computing device whether connected to a network or not. According to an embodiment, some of the applications of the present invention may not be accessible when not connected to a network, however a user may be able to compose data offline that will be consumed by the system when the user is later connected to a network.

Referring now to FIG. 2, a schematic overview of a system in accordance with an exemplary embodiment is shown. The system is comprised of one or more application servers 203 for electronically storing information used by the system. Applications in the server 203 may retrieve and manipulate information in storage devices and exchange information through a WAN 201 (e.g., the Internet). Applications in server 203 may also be used to manipulate information stored remotely and process and analyze data stored remotely across a WAN 201 (e.g., the Internet).

According to an exemplary embodiment, as shown in FIG. 2, exchange of information through the WAN 201 or other network may occur through one or more high speed connections. In some cases, high speed connections may be over-the-air (OTA), passed through networked systems, directly connected to one or more WANs 201 or directed through one or more routers 202. Router(s) 202 are completely optional and other embodiments in accordance with the present invention may or may not utilize one or more routers 202. One of ordinary skill in the art would appreciate that there are numerous ways server 203 may connect to WAN 201 for the exchange of information, and embodiments of the present invention are contemplated for use with any method for connecting to networks for the purpose of exchanging information. Further, while this application refers to high speed connections, embodiments of the present invention may be utilized with connections of any speed.

Components of the system may connect to server 203 via WAN 201 or other network in numerous ways. For instance, a component may connect to the system i) through a computing device 212 directly connected to the WAN 201, ii) through a computing device 205, 206 connected to the WAN 201 through a routing device 204, iii) through a computing device 208, 209, 210 connected to a wireless access point 207 or iv) through a computing device 211 via a wireless connection (e.g., CDMA, GSM, 3G, 4G) to the WAN 201. One of ordinary skill in the art would appreciate that there are numerous ways that a component may connect to server 203 via WAN 201 or other network, and embodiments of the present invention are contemplated for use with any method for connecting to server 203 via WAN 201 or other network. Furthermore, server 203 could be comprised of a personal computing device, such as a smartphone, acting as a host for other computing devices to connect to.

Embodiments provide methods and systems for generating and dynamically incorporating creative assets such as advertisements into IP connected media, specific examples of which are apps, email clients, smart speakers, over-the-top (OTT) streaming platforms, OOH digital billboards/signs, AR and VR headsets/displays and more traditional website content (static and multimedia). Example embodiments provide a control panel in communication with a creative asset server, for enabling a media entity to dynamically integrate creative assets, advertisements (ads) or other content at any desired location in the app from the time a user first accesses the app to the time the user leaves the app. This provides more control over creative asset placement and provides the flexibility of inserting creative assets at any point throughout the viewer's or listener's app journey from start to end. The creative assets are dynamically updated as a viewer navigates through the app, such as by scrolling through movie and TV show titles of interest, or by dynamic paneling of commercial messaging while watching streaming content, allowing the viewer to interact with the commercial or content in tandem with continuous content playback. For purposes of this invention, one skilled in the art will recognize that “advertisements” can include any type of media or electronic content including static or dynamic text, static or dynamic graphical images of any type, including animated images, web content such as HTML and XML code, other code, video and/or sounds or other audio content including, for example, speech and music whether streamed or downloaded, RSS and other feeds, data such as social media content and other data sources. In addition, advertisements as used herein include any combination of media types, and may be compressed or raw. According to an exemplary embodiment, dynamic information displayed to the user in digital media may be based on relevancy to the user, for example a registry of providers that displays only the providers or centers in closest geographic proximity to the user. In another embodiment, the display of a data source can be based on relevancy to the content being consumed by the viewer, for example the point spread of a game matchup based on the teams playing in a professional sports game, or a real-time display of dynamic odds based on teams mentioned in a content environment.

A creative asset client manages inserting the creative assets and promotions (“promos or ads”) into the app. Upon initialization, the creative asset client connects to a dynamic multimedia server (for example, server 203 of FIG. 2), and establishes a second communication channel into the app. The multimedia server maintains a collection or a list of dynamically modifiable creative assets for incorporation into the app.

Upon the creative asset client receiving an indication that a creative asset is needed in the app, the creative asset client notifies the multimedia server to request a creative asset. The multimedia server, in turn, selects a particular creative asset from its collection according to a set of criteria, and sends that creative asset or an indication of that creative asset to the creative asset client. The creative asset client then provides the creative asset to the app, which inserts the creative asset into the app by displaying it on the screen, playing a sound over the speakers, or by some other method appropriate to the creative asset. In an alternative embodiment, the multimedia server begins sending a set of creative assets to the creative asset client immediately after the second communication channel is established, where they are stored, for example, by the creative asset client until needed.
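
A minimal sketch of this client-side request-and-cache flow is shown below; the endpoint path, query parameters, and asset fields are illustrative assumptions rather than a defined interface of the multimedia server.

```typescript
// Sketch of the client-side flow: ask the multimedia server for an asset when
// the app signals a need, falling back to locally prefetched assets first.
interface CreativeAsset {
  id: string;
  type: "image" | "video" | "audio" | "html";
  genre: string;
  payloadUrl: string;
}

class CreativeAssetClient {
  private cache: CreativeAsset[] = [];

  constructor(private serverBaseUrl: string) {}

  // The server may also push assets ahead of time over the second channel.
  prefetch(assets: CreativeAsset[]): void {
    this.cache.push(...assets);
  }

  async requestAsset(slotType: string, genre: string): Promise<CreativeAsset> {
    const cached = this.cache.find((a) => a.type === slotType && a.genre === genre);
    if (cached) return cached;

    const res = await fetch(
      `${this.serverBaseUrl}/assets?type=${encodeURIComponent(slotType)}&genre=${encodeURIComponent(genre)}`
    );
    return (await res.json()) as CreativeAsset;
  }
}
```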

Embodiments of the present invention can be used with all types of apps, OTT streaming platforms such as Apple tvOS® or Amazon Fire TV®, voice/audio streaming platforms such as Apple Siri® or Amazon Alexa® smart speakers/products and/or services, and systems, software or hardware that can establish a communication channel with the multimedia server. The communication channel may be established with the app through, for example, an application program interface (API) or software development kit (SDK). Furthermore, the app may be running at any location on a standard personal computer or mobile device connected to a network (e.g., the Internet) through narrow or broadband, DSL, ISDN, ATM and Frame Relay, cable modem, optical fiber, satellite or other wireless network, etc. For purposes of this application the term “app” shall include any software application, including on-line websites, email clients, native apps, OTT streaming service environments/platforms, games, etc.

Although the subject technology is discussed below specifically with reference to apps, smart speakers and OTT services, one skilled in the art will appreciate that the techniques disclosed are useful in other contexts as well, such as dynamic integration of creative assets, ads and/or other content into any Internet connected device. Cost of use of such devices and/or access to pay-per-view content may be subsidized by advertisers paying for advertising directed to the devices. Also, the ads can be made much more effective for the recipient than standard advertising by using a system with dynamic context, location, user activity and KPI event-based ads in the content being viewed or listened to, and by tailoring (or personalizing) them to the users of such content.

FIG. 3A is a block diagram of an exemplary embodiment of CAIPD. The CAIPD comprises a creative asset client, one or more apps in communication with the creative asset client, and a dynamic multimedia server 314, each of which is connected to a communication network 310. In one instance, the communication network 310 can be the Internet and will be referred to as such for brevity, but could be any type of computer network, LAN, WAN, or wireless network, etc. The multimedia server 314 includes a control panel 316 used to interact with the app through, for example, an API. The control panel 316 may be used by a media entity to perform the dynamic integration of creative assets into the digital media being listened/viewed/accessed by a user, as discussed further below. Alternatively, the insertion of creative assets, ads and/or content may be performed by a preconfigured automated process.

FIG. 3B is an example block diagram showing additional detail of interaction between the creative asset client 320 and the multimedia server 314 shown in FIG. 3A. The creative asset client 320 includes creative asset client software running on or in communication with app 312 and includes a portion dedicated to creative asset insertion routines 380, a portion dedicated to creative asset measurement routines 370, and a portion dedicated to memory for creative asset storage 390. The app 312 may be loaded into a computing device on which an end user may be presented the creative asset. The creative asset measurement routines 370 may be used to determine (and/or rate) creative asset exposure, including quality metrics, which can be logged and incorporated into a billing system. The creative asset client communicates with the multimedia server 314 through a communication channel established between the creative asset client and the multimedia server 314, which is used to request creative assets for display, send the actual creative assets, and send creative asset insertion logs and other data. The multimedia server 314 includes a creative asset scheduler 340 and a creative asset billing system 350, and is also connected to a database 360 that stores the creative asset data.

In one embodiment, the app 312 uses structural metadata for the creative assets and the creative asset destinations. For example, codes or tags may be included in digital media content associated with the creative asset platforms and, likewise, similarly corresponding codes or tags may be attached to the various creative assets in storage. The codes or tags indicate, for example, permissions that the app verifies against before accepting a creative asset to be placed at a location within the app content. The system extracts the tags from the digital media when searching for assets with corresponding or qualifying tags. For instance, streaming content may contain tags for a billboard on a side of a building in a city. As the viewer watches a streaming program, a billboard may appear in the video having the advertising tag, an ad, or a code designating an ad. If no ad is available locally, an ad is requested from the multimedia server 314. Once the ad is received from the multimedia server 314 (or retrieved locally), it is presented in the space indicated by the billboard tag and appears in the video as a billboard advertising a product. The actual insertion of the ad is performed by the app software itself, with coordination from the creative asset client 320 of FIGS. 3A and 3B. The app is responsible for outputting or displaying the advertisement within the advertising billboard.
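
The tag-to-asset correspondence described above could be modeled with structures along the following lines; the field names are hypothetical, and a real implementation would likely carry additional attributes such as dimensions or scheduling data.

```typescript
// Sketch: match an advertising tag found in the content against stored assets
// whose own tags correspond.
interface AdTag {
  slotId: string;          // e.g. "billboard-eastside"
  objectType: string;      // e.g. "billboard", "hat", "bus-sign"
  allowedGenres: string[];
}

interface TaggedAsset {
  id: string;
  objectType: string;
  genre: string;
}

function findMatchingAssets(tag: AdTag, inventory: TaggedAsset[]): TaggedAsset[] {
  return inventory.filter(
    (a) => a.objectType === tag.objectType && tag.allowedGenres.includes(a.genre)
  );
}
```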

According to one embodiment, the CAIPD extrapolates the native app view hierarchy in order to emulate or remain consistent with the app layout when determining dynamic layout and positioning of creative assets/ads/content in the app. For example, a creative asset may be dynamically integrated into the 3rd row/index of an app on specific devices such as CTV screens, or placed above/below “Upcoming Titles” content element in order to maintain the visual flow and integrity of the app.

The user/viewer may also have the ability to control creative asset, ad or content engagement by interacting with the CAIPD through voice or gesture-based commands native to the user's input device (for example, a remote control paired to a CTV). For example, a user may press an enter button on their CTV remote once to watch a trailer for a creative asset promoting a movie and/or press the enter button twice to purchase, instead of responding by more cumbersome means of pointing and clicking the remote control to interact with the creative asset.
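
A minimal sketch of mapping repeated presses of a remote's Enter/OK key to such actions is shown below; the 400-millisecond press window and the callback names are assumptions.

```typescript
// Sketch: one press of Enter/OK plays the trailer, two presses purchase.
function bindRemoteGestures(onTrailer: () => void, onPurchase: () => void): void {
  let presses = 0;
  let timer: number | undefined;

  document.addEventListener("keydown", (e) => {
    if (e.key !== "Enter") return;
    presses += 1;
    window.clearTimeout(timer);
    timer = window.setTimeout(() => {
      if (presses >= 2) onPurchase();
      else onTrailer();
      presses = 0;
    }, 400); // window in which a second press counts as a double-press
  });
}
```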

In another embodiment, the viewer may interact with a QR code or a vector-format code having custom colors, logos and/or graphics by using a smartphone or other device capable of scanning such codes with a smart camera or application. The scan initiates personalized landing page URLs in association with the promotion or content, deep links that open associated smartphone apps, or other actions such as tickets or coupon/discount rewards that can be stored in the mobile wallets of smartphone users. In this embodiment, the subject system intelligently routes the landing page/destination the user will visit, or a mobile wallet offer, based on event or geographic-based triggers. Referring to FIG. 28, a process of intelligent rule-based routing is shown according to an exemplary embodiment. For example, two different users may be routed, after scanning the same code, to two different landing pages/destinations. The routing, in one embodiment, can be based on the geo-location of each user, so that the geographic information most relevant to one user is presented to that user and the geographic information more relevant to the other user's location is presented and displayed for that user. For example, a user may see the stores of the marketer that are nearest to the user, or public health information that is specific to the location of the viewer. In another example, downloading of a coupon into the user's mobile wallet may be based on user location or on intelligent prioritization/weighting based on other criteria/factors, for example routing more viewers/scanners to purchase a product or service from a higher performing e-commerce store over lower performing e-commerce stores. In another embodiment, a user may be prompted to say or speak a certain word or voice phrase to engage/express interest to learn more about an ad or other content being promoted in the app and/or on a smart speaker.
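
One way this intelligent routing could be sketched is a resolver that scores candidate destinations by distance to the scanner and by observed landing-page performance; the data shape, the haversine distance, and the tie-breaking rule are illustrative assumptions.

```typescript
// Sketch: route a code scan to a destination based on geolocation, breaking
// ties between equally distant destinations by landing-page performance.
interface Destination {
  url: string;
  lat: number;
  lon: number;
  conversionRate: number; // observed landing-page performance
}

function kmBetween(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const r = Math.PI / 180; // degrees to radians
  const a =
    Math.sin(((lat2 - lat1) * r) / 2) ** 2 +
    Math.cos(lat1 * r) * Math.cos(lat2 * r) * Math.sin(((lon2 - lon1) * r) / 2) ** 2;
  return 6371 * 2 * Math.asin(Math.sqrt(a));
}

function routeScan(userLat: number, userLon: number, destinations: Destination[]): string {
  const scored = destinations
    .map((d) => ({ d, km: kmBetween(userLat, userLon, d.lat, d.lon) }))
    .sort((a, b) => a.km - b.km || b.d.conversionRate - a.d.conversionRate);
  return scored[0].d.url;
}
```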

According to an embodiment, the creative asset client 320 may be configured to collect declared data through interactive surveys/overlays to improve listener/viewer targeting. For example, the creative asset client 320 may overlay a question asking whether a viewer liked a commercial selected for display and prompt the viewer to respond via gesture or use of their input device. The declared data can verify interest, for example, in a certain genre of movies. Additionally, survey trees or cascades can be generated by the creative asset client. In this example, a follow-up question can be presented to the viewer on whether they would watch the sequel for the movie. Subsequently, a media entity can use the collected data to target viewers who explicitly expressed interest in watching the sequel when the movie sequel is released. The subject technology enables enhancement of the relevancy of creative assets, promotions and content based on viewer-declared data in a privacy compliant manner, which can include explicitly provided information, for example to join a mailing list or receive an update for a promotion of interest.

FIG. 4 is an example flow diagram of the dynamic insertion of creative assets process from the point of view of an app client. For purposes of this application, the terms “app” and “app client” will be considered equivalent terms and may be used interchangeably. Although the steps are shown in an example order, one skilled in the art will recognize that other orders of these steps are operable with embodiments of the present invention and that multiple threads of execution may be appropriate depending upon the actual implementation. Further, additional steps can be included or some steps omitted, yet similarly achieve the techniques of the present invention.

In step 410, a user accesses an app. In step 420, the app is initiated and the user starts viewing digital media into which dynamic creative assets will be delivered. In step 430, the multimedia server establishes communication with the app through, for example, an API. Data sent from the multimedia server may include advertising data, codes, binary files, web content, event notifications, billing data, and other information as described below. It also may contain scheduling directions for when the creative assets are to be inserted into the app. Additionally, it may contain coded descriptions or indications of the creative assets sent, so that the app client can identify where to place the creative asset. For instance, one type of creative asset could be a hanging sign advertisement. The type and genre of the creative asset could affect where the creative asset is placed within the app. For example, creative assets indicating sporting goods, such as NIKE® shoes, may be out of place in a cooking app, while an ad for BUDWEISER® would be out of place in a children's app. By identifying not only the type of ad, but also the genre of ad, advertisements can be targeted to or restricted from users selectively. A detailed discussion of the types and genres of creative assets follows below.

Referring back to FIG. 4, step 440 performs a check to see if the app user wants to continue using the app. As long as the app user is using the app, step 440 will be exited in the affirmative direction to step 450. In step 450, creative assets are sent from the multimedia server and are downloaded via a UDP or TCP/IP connection. In actuality, this “step” is ongoing and occurs during the entire time that the connection exists (is live) between the multimedia server and the app client. In step 450, the multimedia server sends graphics or other ad data that will make up the actual ad to be displayed in the app and may also include other data, such as scheduling data. The app client receives the creative asset data and stores the creative asset locally. The other data sent by the multimedia server is likewise stored. Alternatively, in order to minimize use of bandwidth resources, the data making up the creative asset to be placed in the app may be compressed. The creative asset, once received by the app client, may be decompressed prior to storing it in memory, or the creative asset may remain in memory in its compressed format and be decompressed only when actually being placed into the app. If the creative asset graphics are compressed or otherwise coded, they will need additional memory for the decompression/decoding. Creative asset encryption can be handled similarly. The creative asset, in either its compressed or uncompressed state, may be stored in memory, or it may be cached out to another type of storage, such as a hard drive or other non-volatile storage.

In step 460, the creative asset client 320 checks for the existence of an advertising tag in the app. An advertising tag, as described above, is a code or other indicator that indicates an advertisement could be placed in a particular location of the app. The tag could have been generated based on information stored in the app file on a hard drive or other storage of the app client, or could have been included in the data sent by the multimedia server to the creative asset client in step 450. Alternatively, the multimedia server can target dynamic creative asset placement based on identification of an intended content element, such as a certain title, paragraph or menu position (second paragraph or fifth menu row). The dynamic content positioning can be altered based on different apps, device types, pages or sections, or other objects in the app. For example, dynamic placement can be performed by the multimedia server at the second paragraph of the lifestyle section and/or on a mobile device, but after the fourth paragraph of the sports section and/or on a desktop, unless either is within a paragraph of an image or video player, in which event an alteration in positioning can be programmed on any device. In an alternative embodiment, a separate execution thread is launched to check for advertising tags or targeted content elements. When one is found, the CAIPD is notified. When an advertising tag or content element is detected in step 460, the creative asset client checks in step 470 to see if it is already storing any creative assets available and appropriate to be placed in that location. In order to determine when a particular creative asset can be placed in the location or on the object that corresponds to an advertising tag or content element, different criteria are used. The exact criteria examined are customizable and dependent upon the type of app (e.g., the app content) and the creative assets. In certain scenarios some criteria may be relevant while others are not. Some criteria that could be examined for appropriateness prior to inserting a creative asset into a particular advertising tag or content element location are creative asset type, creative asset genre, creative asset key performance indicators and creative asset scheduling time. The creative asset type relates to its material. For instance, some of the types of creative assets can include static images, animated or dynamic images, HTML or other programmatic codes, and audio or video files. The audio or video files can either be downloaded in whole or in a streamed format. Additionally, each creative asset stored in the multimedia server is associated with a genre, even if the genre is universal. For instance, one genre may be sports related, and another may refer to alcohol. Creative assets that correspond to the sports genre may be placed in a school related app, while the alcohol genre ads may not be (for many reasons, including convention and legal restrictions). Further, each creative asset has associated scheduling information that prescribes when the advertisement can be placed in the app. For example, an advertiser may want their advertisements only to appear on weekends, and not during weekdays. The scheduling information about that advertisement would instruct the creative asset client only to insert that creative asset during the allowable scheduled times.

Determining if a given creative asset conforms to the type specified in an advertising tag or content element found in step 460 involves comparing the type or types associated with the creative asset to the type or types associated with the particular ad tag or content element. With regard to streaming content, each ad tag or content element within the app may be associated with an object in the content. For example, an object could be a billboard on a building, a sign on a bus, an emblem on a hat or other clothing, a loudspeaker that is actually producing sounds within the content, a scrolling message board such as in Times Square in New York City, or any other type of advertisement that may exist in the real world and is represented in the content. Many of the objects in the streaming content will not have advertising on them, and therefore will not have ad tags associated with them (trees or fire hydrants, for example). When an ad tag is found in step 460, the object to which it is associated will be identified by the creative asset type. For example, step 460 may find an ad tag that is associated with a logo on a baseball cap. The ad tag may be a code that specifies the relative dimensions of the creative asset, or be an indication of the type of object (clothing) or specific object (hat), or some other indication. Step 480 will then examine the inventory of stored creative assets to determine if any of them match the type code of the found ad tag. If one or more advertisements are found, then those could be selected as conforming ads. Alternatively, the creative asset client may request a creative asset to be inserted in the ad tag or content element according to the request, and a conforming creative asset is sent from the multimedia server.

In addition to checking to see if the types and genres match, step 480 will check to see if the scheduling requirements for the creative assets and the located ad tag or content element match. As mentioned above, each creative asset may have scheduling data that describes when the creative asset should preferably be inserted in the app. The scheduling data may indicate that there are no restrictions on when the creative asset can be placed into the app, or the scheduling data may tightly control when the creative asset can be placed. For instance, an advertiser may only want a creative asset placed into the app if it is after 5:00 pm local time. Therefore, prior to insertion into the app, a function is called to preferably ensure that the schedule is met. The scheduling need not be limited to time and date restrictions, which are only given by way of example. For instance, an advertiser may wish that a certain series of creative assets be seen by the user in a particular order. The app client could ensure, prior to inserting the second advertisement in the series into the app, that the first advertisement of the series was already inserted, or display the second advertisement only when a certain viewing time or user interaction has occurred with the first advertisement across any device the user utilizes.

Therefore, at least one of the type, genre, and scheduling time associated with the creative asset preferably identically or closely matches the codes associated with the advertising tag or content element for it to be considered a “conforming” creative asset for the purposes of step 480.
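
Collecting the type, genre, and scheduling tests of steps 460-480 into a single conformance check might look roughly like the following sketch; the field names, the day/hour fields, and the series-ordering constraint are illustrative assumptions.

```typescript
// Sketch of the conformance test of step 480: a creative asset "conforms" when
// its type, genre, and scheduling data are compatible with the located ad tag
// or content element.
interface Schedule {
  allowedDays?: number[];      // 0 = Sunday ... 6 = Saturday
  notBeforeLocalHour?: number; // e.g. 17 for "after 5:00 pm local time"
  mustFollowAssetId?: string;  // series ordering constraint
}

interface StoredAsset {
  id: string;
  type: string;
  genre: string;
  schedule: Schedule;
}

function conforms(
  asset: StoredAsset,
  tag: { type: string; allowedGenres: string[] },
  alreadyShown: Set<string>,
  now: Date = new Date()
): boolean {
  if (asset.type !== tag.type) return false;
  if (!tag.allowedGenres.includes(asset.genre)) return false;

  const s = asset.schedule;
  if (s.allowedDays && !s.allowedDays.includes(now.getDay())) return false;
  if (s.notBeforeLocalHour !== undefined && now.getHours() < s.notBeforeLocalHour)
    return false;
  if (s.mustFollowAssetId && !alreadyShown.has(s.mustFollowAssetId)) return false;
  return true;
}
```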

In step 470, the creative asset client requests a conforming creative asset from the multimedia server by sending information to the multimedia server describing the creative asset specifications specified by the tag or content element found in step 460. Once the multimedia server receives the notice, it will check for conforming creative assets in step 480 and send a creative asset conforming to the desired creative asset specifications (e.g., type, genre, format and/or scheduling time) to the creative asset client. Once the conforming creative asset is downloaded from the multimedia server, it will be re-verified that it conforms to the advertising tag or content element found in step 460, and it will then await insertion into the app.

In step 484, the conforming creative asset is formatted and placed into the app in the location that corresponds to the advertising tag or content element found in step 460. Formatting a stored creative asset for insertion into a native app, OTT, or streaming content may be performed by calling functions or an API provided in the SDK that was used to integrate the system with the native app, OTT, or streaming content. Some of the functions may include decrypting the stored creative asset, either at this step or when it was stored in memory, or decompressing a creative asset that was compressed for transmission from the multimedia server.

Following insertion of the creative asset into the app in step 484, the insertion event or the request to the multimedia server in step 470 is logged by sending the information about the creative asset placement back to the multimedia server in step 490. It is important that the creative asset client communicate this information back to the multimedia server, because the insertion is at least one event that preferably will be paid for by the advertiser. In addition, other information regarding measuring the app user's exposure to the creative asset and the quality of that exposure may be communicated back to the multimedia server for billing and other purposes. In logging the insertion, as much information as possible regarding the event is preferably sent back to the multimedia server, such as creative asset identification, target tag identification, user interaction, date, time, etc. This information can be stored in a backend database associated with the multimedia server in a privacy compliant manner and mined by the media entity or advertisers or others to provide useful information from the patterns that emerge. Additionally, the app client can determine duration of the creative asset (how long the inserted creative asset was displayed) and a measure of the quality of the exposure. The quality of the inserted creative asset may not be the same in every app, and indeed, each insertion may be different. For example, a creative asset may be inserted in the app, but at a location outside of the user viewport, or the creative asset may be presented at a 45° or some other degree angle away from the app user, yielding a less than optimal quality measurement. Alternatively, the app user may be in direct view of an inserted creative asset so that it takes up 33% of the user's available screen. This type of creative asset insertion could be very valuable to advertisers, who may be willing to pay a premium for such treatment. The creative asset measurement routines 370 of the creative asset client 320 can be used to collect and transmit such information to the multimedia server 314.

At a minimum, data about the creative asset insertion is sent by the creative asset client to the multimedia server. Additionally, the creative asset client would specifically identify the creative asset that was inserted. The duration of the creative asset insertion would also be included in the event log, as well as an indication of the quality of the insertion, as discussed above, such as size, position, viewing angle, etc. The last two data points (duration and quality) may be combined to create a “pixel-hours” type number, where the total number of pixels making up the inserted creative assets could be counted or calculated, timed, and multiplied to form a rough indication of overall creative asset insertion exposure. This combined data would be more useful than simply listing the number of creative asset insertions. For example, if there were a large number of creative asset insertions that were of a small size (such that they appeared small on the user's screen), these may be roughly equal to only a few creative asset insertions that appeared larger on the user's screen. Pixel-hours is a way to average all of these events to get an overall sense of the app user's exposure to the inserted creative assets. Another way to reflect the exposure of the creative assets displayed in the app would be to set a minimum time and screen size to qualify as a creative asset “hit.” For example, each time a creative asset takes up 25% of the user's screen for 2 seconds it would be classified as a “hit”; the number of hits per app session could then be counted and sent to the multimedia server in the event logging step 490. Alternatively, the creative asset client 320 might implement a weighting scheme, where certain relative positions and view angles are weighted higher than others. One skilled in the art will recognize that other types of quality measurements and combinations of ratings are possible.
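
The pixel-hours aggregate and hit count described above can be computed directly from logged exposure records, as in the following sketch; the record fields mirror the example values given above and are otherwise assumptions.

```typescript
// Sketch of the exposure metrics: "pixel-hours" aggregates on-screen pixel area
// over time, and a "hit" is any exposure of at least 25% of the screen for 2 s.
interface Exposure {
  widthPx: number;
  heightPx: number;
  durationSec: number;
  screenAreaPx: number;
}

function pixelHours(exposures: Exposure[]): number {
  return exposures.reduce(
    (sum, e) => sum + (e.widthPx * e.heightPx * e.durationSec) / 3600,
    0
  );
}

function countHits(exposures: Exposure[]): number {
  return exposures.filter(
    (e) => e.widthPx * e.heightPx >= 0.25 * e.screenAreaPx && e.durationSec >= 2
  ).length;
}
```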

FIG. 5 is a block diagram of a computer system for practicing embodiments of a server portion of the CAIPD. Specifically, the block diagram of FIG. 5 shows a computer system that performs as a multimedia server and ad scheduler. The computer system 500 includes a central processing unit (CPU) 502, a local storage device 504, as well as other input/output devices 506. The computer system 500 also includes a computer memory 510. As described with reference to FIG. 3B, components of the CAIPD 512 typically reside in the computer memory 510 and execute on the CPU 502. The creative asset scheduler 514, for example, could be a program or process that is used to select and schedule ads or other content to be transmitted to the app. Alternatively, the ad scheduling function may be a separate component from the ad selection function. In some embodiments, a dynamic inserter portion 512 of the CAIPD present on the multimedia server includes a creative asset billing system 520 for processing logged billing information and potentially automatically interacting with billing infrastructure of advertising companies. Other programs 530 also may reside in the memory 510. One of the input/output devices 506 would be the connection means between the multimedia server and the communication network, such as a network card coupled to a LAN.

One skilled in the art will recognize that the exemplary CAIPD can be implemented as one or more code modules and may be implemented in a distributed environment where the various programs residing in memory 510 are instead distributed among several computer systems. Similarly, one would recognize that many computers 500 could be interconnected to make a single multimedia server.

FIG. 6 is an example flow diagram of the dynamic integration of creative assets process from the point of view of a multimedia server. Although the steps are shown in an example order, one skilled in the art will recognize that other orders of these steps are operable with embodiments of the present invention and that multiple threads of execution may be appropriate. Specifically, in step 610, a link is established between the multimedia server and the app. In step 620, a creative asset scheduler (such as scheduler 340) that resides in the multimedia server determines which ad or other content should be next sent to the app. The creative asset scheduler can select the next creative asset based on criteria provided by the app, or can make the determination itself. In step 630, the selected creative asset/ad/content is sent to the app through the communication link established in step 610, and, after the creative asset is inserted into the app, the creative asset insertion event is logged in step 640. Step 650 determines whether data is still being sent from the app. If so, the process loops back to step 620 and another creative asset is selected to send. If, however, the app is no longer connected or responding to the multimedia server, the communication link is closed in step 660, and this event is logged in step 670.
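
The server-side loop of FIG. 6 may be illustrated, purely as a non-limiting sketch, by the following Python fragment. The callables select_asset, send_asset, log_event and app_still_sending are hypothetical stand-ins for the operations of steps 610 through 670 and are not elements of the figures.

def run_session(select_asset, send_asset, log_event, app_still_sending):
    # Step 610: the communication link is established (represented here by a log entry).
    log_event("link_established")
    # Step 650: loop while the app continues to send data.
    while app_still_sending():
        asset = select_asset()                # step 620: scheduler picks the next asset
        send_asset(asset)                     # step 630: transmit the asset to the app
        log_event("insertion", asset=asset)   # step 640: log the insertion event
    # Steps 660 and 670: close the link and log the shutdown event.
    log_event("link_closed")

# Illustrative use with trivial stand-ins for the network and logging calls.
remaining = iter([True, True, False])
run_session(
    select_asset=lambda: "asset-A",
    send_asset=lambda a: None,
    log_event=lambda event, **kw: print(event, kw),
    app_still_sending=lambda: next(remaining),
)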

Discussing the above steps in more detail, in step 610 the multimedia server, such as the multimedia server in FIG. 3A, receives an indication to establish a connection with an app, such as the application software 105 in FIG. 1 or the app 312 in FIG. 3A. Communication with the app is established as described above, for example, with reference to step 430 of FIG. 4. Initially, and after having been authenticated by the multimedia server, the app sends information about itself. In addition to its IP address, which may be immediately expunged once geolocation has been verified by the system to maintain compliance with privacy law such as GDPR and CCPA, and the type of app, some of the information sent by the app could include, for example, the size of memory available for storing and decoding creative assets, the size of memory available for storing other data related to the creative assets, the speed of the app's connection to the Internet, and a list of the types and/or genres of creative assets that can be inserted into the app. Providing a list of the types and/or genres that are acceptable to an app may be done during step 610, or the type/genre of one particular creative asset can be sent as part of the advertising tag that is found during use of the app, as described below. It is useful to send the list all at once in step 610, however, because the multimedia server can then better tailor what is sent to the app. Alternatively, this information can be stored solely on the app side and non-conforming creative assets merely discarded. The list of types and genres of creative assets that can be inserted into the app can be derived from the list of ad tags or content elements that are found in the app, and/or could be previously stored in the multimedia server based on the multimedia server's knowledge of the app. This list of acceptable types/genres of creative assets can be stored in the multimedia server, and creative assets that do not appear on the list would not be sent to the app. For example, the list may include an indication of which physical types of creative assets are acceptable, such as a size of 320×50 pixels. Further, the creative assets may include object “skins”, such as clothing, covers, hats, etc., and entire characters that could be advertisements, such as a person carrying a signboard, or a game monster, etc. Additionally, objects displayed in the app could be presented as advertisements, or have advertisements placed on them. Simply stated, any object or any part of the app content is capable of being used as an advertisement, or having some sort of advertisement placed in it.
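
As a non-limiting illustration of the kind of information the app might send when the link is established in step 610, the following Python sketch builds a hypothetical capability payload; all field names and values are illustrative assumptions rather than a defined protocol.

import json

# Hypothetical capability payload an app client might send in step 610.
# Field names are illustrative only; the IP address itself would be expunged
# by the server once geolocation has been verified, per the privacy handling above.
handshake = {
    "app_type": "connected_tv_news",
    "creative_asset_memory_kb": 4096,      # memory available for storing/decoding assets
    "metadata_memory_kb": 512,             # memory available for data about assets
    "connection_speed_kbps": 25000,
    "acceptable_types": ["banner_320x50", "object_skin", "character"],
    "acceptable_genres": ["sports", "travel"],
    "excluded_genres": ["alcohol"],
}

payload = json.dumps(handshake)
print(payload)  # sent to the multimedia server after authentication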

With respect to genres, as described above with reference to FIG. 4, because all types of creative assets may not be suitable for all app users, embodiments of the CAIPD allow the app client to specify certain genres, or classes, of creative assets that are not acceptable to that app client. Similar to the list of types described above, the list of all of the genres in an app could be sent in step 610, or the individual genre matching an ad tag or content element could be sent as the ad tags are individually found in the app. An example genre of creative assets may be beer or other alcohol-type creative assets, which should not be targeted to those too young to purchase such products. In that case, the app 312 would indicate to the multimedia server that such a genre of creative assets is unacceptable, and such creative assets would not be inserted into the app. (Note, as above, that such control can take place at the server or the client side of the communication.) In addition, a preferred genre, such as sports, could also be sent to the multimedia server, and creative assets of that genre would be selected with more frequency compared to the others, as further described below.
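
A minimal, non-limiting sketch of such genre handling is shown below; the function and field names are hypothetical, and a production scheduler would likely use weighted sampling rather than the simple list repetition used here for clarity.

def filter_and_weight(assets, excluded_genres, preferred_genres, boost=3):
    # Drop assets in excluded genres and repeat preferred ones so they are
    # selected with greater frequency when a choice is later made at random.
    eligible = [a for a in assets if a["genre"] not in excluded_genres]
    weighted = []
    for a in eligible:
        weighted.extend([a] * (boost if a["genre"] in preferred_genres else 1))
    return weighted

assets = [
    {"id": "ad-1", "genre": "alcohol"},
    {"id": "ad-2", "genre": "sports"},
    {"id": "ad-3", "genre": "travel"},
]
pool = filter_and_weight(assets, excluded_genres={"alcohol"}, preferred_genres={"sports"})
print([a["id"] for a in pool])  # 'ad-1' is removed; 'ad-2' appears three times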

The list of acceptable creative asset types and genres that the app client presents to the multimedia server can be sent to the multimedia server in many ways and at different times. As mentioned above, the app client may send an entire list of all of the acceptable creative asset types/genres/formats in the entire app as part of establishing the communication link with the multimedia server (step 610 in FIG. 6). In this manner, the multimedia server can be notified that creative assets not conforming to the list will never be used in the app. Alternatively, the app client could send a list of all of the acceptable creative asset types/genres/formats that are in the vicinity of where the user is currently located in the app. An initial list would be sent in step 610 which would have to be updated on a periodic basis. Under this scenario, creative assets that the multimedia server sends to the app client are more likely to be used in a short time. Or the app client may send only a list of acceptable creative asset types/genres/formats that are currently active. Active types/genres/formats may correspond with ad tags currently within the field of view of the app user. Then, any creative asset that the multimedia server sends down to the app client that conforms to those creative asset types/genres/formats may be inserted into the app immediately, provided that the scheduling information of the downloaded creative assets was also appropriate and that the field of view remained unchanged.

In addition to sending an entire list of acceptable creative asset types/genres/formats under any of these scenarios, the app client may individually update the list stored on the multimedia server. For example, the app client may initially send a full list of acceptable creative asset types that are currently in use. Then, as the user navigates the app, exposing other creative asset types, the app client could send a message to the multimedia server (for example, an “update event”) to add or delete various creative asset types on an individual basis. Note that sending the list of acceptable creative asset types is not the same as sending the list of advertising tags found in the app. For example, there may be 100 advertising tags or content elements in the app that fall into only two acceptable ad types and three ad genres. In this case, preferably only the list of the acceptable ad types and genres, and not each of the advertising tags, may be sent to the multimedia server. In practice, this may involve maintaining a list of acceptable ad types/genres/formats on both the app client and the multimedia server, and, when the app client finds a new ad tag in the app (step 460 of FIG. 4), the app client first updates its list, then sends the update to the list on the multimedia server. Deletions could be similarly handled.
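
The incremental update mechanism may be sketched as follows; this is a non-limiting Python illustration in which the class name and the send callable are hypothetical stand-ins for the app client's list and its network call to the multimedia server.

class AcceptableTypeList:
    # Mirror of the acceptable creative asset types kept on both the app client
    # and the multimedia server; only changes are sent over the link as "update events".

    def __init__(self, send):
        self.types = set()
        self.send = send

    def add(self, asset_type):
        if asset_type not in self.types:      # only a genuinely new type triggers an update
            self.types.add(asset_type)
            self.send({"update": "add", "type": asset_type})

    def remove(self, asset_type):
        if asset_type in self.types:
            self.types.discard(asset_type)
            self.send({"update": "delete", "type": asset_type})

client_list = AcceptableTypeList(send=print)
client_list.add("banner_320x50")   # new ad tag found (step 460) -> update sent to server
client_list.add("banner_320x50")   # duplicate type -> no update sent
client_list.remove("banner_320x50")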

In addition to supporting selection of ads by type, genre, format and scheduling criteria, the CAIPD can support the automatic targeting of ads based upon additional information sent from an app client to the multimedia server, or upon user-declared data and interests collected by the system, for example in the manner of survey responses gathered from previously viewed creative assets. For example, a user's musical or video selections can be used to target (select specifically for that user) creative assets promoting a certain type of show declared to be of interest by the user.

In step 620 of FIG. 6, once communication is established with the app client, the CAIPD must determine which creative asset to send to the app client. Preferred embodiments of the CAIPD will provide a preferred creative asset type, format and genre to the multimedia server. In this manner, the multimedia server can simply choose a creative asset from its database that conforms to the creative asset and device type, format and genre, and prepare the selected creative asset for sending to the app client. Preparation may include compressing or otherwise modifying the creative asset for transfer. Typically, more than one creative asset residing on the multimedia server will meet the preferred type, format and genre descriptions. For example, if the type of the creative asset was a clothing logo and the genre was sports, examples of conforming creative assets could include emblems of NIKE®, ADIDAS®, UNDER ARMOUR®, etc. A decision thread or process running on the multimedia server would select one of the creative assets to send to the app client. If instead no type or genre is specified, the decision thread simply selects any creative asset from its database and allows the app client to determine whether it is appropriate. This second method may not be preferred because the probability that the creative asset will not be a conforming one is high, and the downloaded creative asset may simply be wasted.
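
A non-limiting sketch of such a selection routine is shown below; the catalog structure and field names are hypothetical, and the random choice stands in for whatever decision thread or process the multimedia server employs.

import random

def select_asset(catalog, asset_type=None, fmt=None, genre=None):
    # Step 620 sketch: choose a creative asset matching the preferred type,
    # format and genre if given; otherwise fall back to any asset and let the
    # app client decide whether it conforms.
    matches = [a for a in catalog
               if (asset_type is None or a["type"] == asset_type)
               and (fmt is None or a["format"] == fmt)
               and (genre is None or a["genre"] == genre)]
    return random.choice(matches) if matches else random.choice(catalog)

catalog = [
    {"id": "nike-logo", "type": "clothing_logo", "format": "png", "genre": "sports"},
    {"id": "resort-banner", "type": "banner_320x50", "format": "jpg", "genre": "travel"},
]
print(select_asset(catalog, asset_type="clothing_logo", genre="sports")["id"])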

Step 630 of FIG. 6 provides that, once a creative asset has been selected to send to the app client in step 620, the selected creative asset is transmitted to the app client. In addition to sending the creative asset in step 630, information about the creative asset may also be sent by the multimedia server to the app client. For example, scheduling information of the creative asset, as well as a description of the creative asset may be sent to the app client. The app client can use the scheduling information to determine when to insert the downloaded creative asset into the app.

After the creative asset has been sent to the app client and inserted into the app, in step 640, the app client sends a response back to the multimedia server to log the event. Additionally, the multimedia server notifies the creative asset scheduler in the multimedia server of which creative assets were sent to the app client, whether they were requested or selected, and as much other information as possible about the system. This information may instead or also be sent to a billing system if one is available. For example, the multimedia server may report that creative asset “A” was sent to app client “B” at time “C”. Logging data from both the multimedia server and from the app client in the creative asset scheduler are represented in step 640 of FIG. 6.

The information logged in step 640 is used for a variety of purposes. Foremost, it is used for billing the advertiser, in that each time a creative asset is inserted into an app, it is known by the advertiser that it appeared on the app user's screen. Additionally, as discussed above, some measure of the quality of the exposure to an advertising placement may also be gleaned and reported. This information can also be used in billing the advertiser, perhaps by charging more for premium advertising placements or longer durations. Also, some advertisers may want to purchase a guaranteed minimum number of impressions. The information logging step 640 allows the multimedia servers to track this data in real time, in that a program connected to the creative asset scheduler can easily track the number of creative asset impressions of a particular ad or advertiser. Once the minimum number of impressions is met, the creative asset scheduler may choose another advertiser's advertisement in the selection step 620. The information logged in step 640 can also be used for statistical analysis and reports, such as which advertising tag location in an app yields the most or highest quality insertions, which creative asset types/formats generated the highest KPI results, and which genre, contextual and/or emotional matching between creative assets and digital media content generated the highest performance and user interest/engagement.
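
The event logging and real-time impression tracking described above may be sketched as follows; this is a non-limiting Python illustration in which the record fields and the in-memory counter are hypothetical and stand in for the backend database and billing system.

from collections import defaultdict
from datetime import datetime, timezone

impressions = defaultdict(int)   # running impression count per advertiser (step 640)

def log_insertion(asset_id, advertiser, tag_id, duration_s, quality):
    # Record one insertion event and update the advertiser's real-time
    # impression count; field names are illustrative only.
    record = {
        "asset_id": asset_id,
        "advertiser": advertiser,
        "tag_id": tag_id,
        "duration_s": duration_s,
        "quality": quality,                       # e.g. a size/position/angle score
        "logged_at": datetime.now(timezone.utc).isoformat(),
    }
    impressions[advertiser] += 1
    return record                                  # would be written to the backend database

def minimum_met(advertiser, guaranteed_minimum):
    # Once a guaranteed minimum is reached, step 620 may favor another advertiser.
    return impressions[advertiser] >= guaranteed_minimum

log_insertion("ad-7", "acme", "tag-12", 6.5, 0.8)
print(minimum_met("acme", guaranteed_minimum=10000))  # False until 10,000 impressions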

In step 650, the CAIPD checks to see if the app client is still responding, by either requesting other creative assets or sending creative asset placement events to be logged by the multimedia server, or even by sending fill data. If the app client continues sending data, the flow loops back to step 620 to determine if the app client is requesting a specific creative asset. If instead the app client is no longer responding, then the multimedia server flow may shut down in step 660 by closing the link established with the app client in step 610. The shutdown event is logged in step 670.

When scheduling which creative asset to send to the app client, the creative asset scheduler reviews which creative assets are available to be inserted to the app with respect to creative asset and device type, format, genre, position, timing considerations, etc. Because these requirements may be continuously changing, embodiments of the creative asset scheduler are capable of operating in a mode where all of the limitations are checked prior to sending a creative asset to the app client. For example, if the creative asset scheduler decides not to send advertisement “A” to an app client at a particular time, that does not exclude the advertisement A from ever being sent to the app client. For instance, if advertisement A was not initially sent because the app client indicated that it had not encountered the matching ad tag type, once the matching ad tag type was requested by the app client, the ad scheduler could then choose to insert advertisement A.

Another consideration for choosing which creative asset to send to the app client is the total number of times a creative asset has been seen by app users. The creative asset scheduler can factor in the number of times a particular creative asset has been seen when deciding which creative asset to next send to the app client. For example, an advertiser may pay for 10,000 ad views over a three hour period. The creative asset scheduler may send these conforming creative assets to the app client, and then, by checking the insertion data sent back by the app client, determine that the requisite number of ad views have been inserted, and no longer send that creative asset during the contract period.
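
A minimal, non-limiting sketch of such a time-bounded view cap is shown below, assuming a hypothetical ViewCap class; the 10,000-view, three-hour figures simply mirror the example above.

import time

class ViewCap:
    # Stop sending a creative asset once its purchased view count has been
    # reached within the contract window (e.g. 10,000 views over three hours).

    def __init__(self, max_views=10000, window_s=3 * 3600):
        self.max_views = max_views
        self.deadline = time.time() + window_s
        self.views = 0

    def record_view(self):
        self.views += 1           # driven by insertion data sent back by the app client

    def may_send(self):
        within_window = time.time() < self.deadline
        return within_window and self.views < self.max_views

cap = ViewCap(max_views=10000)
cap.record_view()
print(cap.may_send())   # True until 10,000 views are recorded or the window closes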

It should be remembered that the multimedia server may be coupled to many app clients, and that the selection by the creative asset scheduler of which creative asset to next send in step 620 may be different for every connected app client. In this case the creative asset scheduler keeps track of information on a per client, per session basis. In one embodiment, this tracking is implemented using well-known database techniques.
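
Such per-client, per-session bookkeeping could be sketched, purely as a non-limiting illustration, with an in-memory structure standing in for the well-known database techniques mentioned above; the names below are hypothetical.

from collections import defaultdict

# Per-client, per-session state kept by the creative asset scheduler.
session_state = defaultdict(lambda: {"sent_assets": [], "impressions": 0})

def record_send(client_id, session_id, asset_id):
    # Track which assets were sent, and how many, for this client and session.
    state = session_state[(client_id, session_id)]
    state["sent_assets"].append(asset_id)
    state["impressions"] += 1

record_send("app-client-B", "session-1", "asset-A")
print(session_state[("app-client-B", "session-1")])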

From the foregoing it will be appreciated that, although specific embodiments of the invention have been described herein for purposes of illustration, various modifications may be made without deviating from the spirit and scope of the invention. For example, one skilled in the art will recognize that the methods and systems discussed herein are applicable to other areas and devices other than apps having ads dynamically incorporated, such as email, web browsers, newsreaders, online books, navigation devices, other multimedia devices and environments, voice based media, applications and devices, etc. In addition, different forms of content can be dynamically incorporated into multimedia targets, including, but not limited to, webpages, HTML, XML, other code, RSS and other feeds, data such as social media content and other data sources, audio, video and static or animated text or graphics. One skilled in the art will also recognize that the methods and systems discussed herein are applicable to differing protocols and communication media (optical, wireless, cable, etc.) and that the techniques described herein may be embedded into such a system. In addition, those skilled in the art will understand how to make changes and modifications to the methods and systems described to meet their specific requirements or conditions.

The embodiments described above may include methods performed as computer program products. Traditionally, a computer program includes a finite sequence of computational instructions or program instructions. It will be appreciated that a programmable apparatus or computing device can receive such a computer program and, by processing the computational instructions thereof, produce a technical effect.

A programmable apparatus or computing device includes one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors, programmable devices, programmable gate arrays, programmable array logic, memory devices, application specific integrated circuits, or the like, which can be suitably employed or configured to process computer program instructions, execute computer logic, store computer data, and so on. Throughout this disclosure and elsewhere a computing device can include any and all suitable combinations of at least one general purpose computer, special-purpose computer, programmable data processing apparatus, processor, processor architecture, and so on. It will be understood that a computing device can include a computer-readable storage medium and that this medium may be internal or external, removable and replaceable, or fixed. It will also be understood that a computing device can include a Basic Input/Output System (BIOS), firmware, an operating system, a database, or the like that can include, interface with, or support the software and hardware described herein.

Embodiments of the system as described herein are not limited to applications involving conventional computer programs or programmable apparatuses that run them. It is contemplated, for example, that embodiments of the disclosure as claimed herein could include an optical computer, quantum computer, analog computer, or the like.

Regardless of the type of computer program or computing device involved, a computer program can be loaded onto a computing device to produce a particular machine that can perform any and all of the depicted functions. This particular machine (or networked configuration thereof) provides a technique for carrying out any and all of the depicted functions.

Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Illustrative examples of the computer readable storage medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.

A data store may be comprised of one or more of a database, file storage system, relational data storage system or any other data system or structure configured to store data. The data store may be a relational database, working in conjunction with a relational database management system (RDBMS) for receiving, processing and storing data. A data store may comprise one or more databases for storing information related to the processing described herein, as well as one or more databases configured for storage and retrieval of that information.

Computer program instructions can be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner. The instructions stored in the computer-readable memory constitute an article of manufacture including computer-readable instructions for implementing any and all of the depicted functions.

A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.

Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.

The elements depicted in flowchart illustrations and block diagrams throughout the figures imply logical boundaries between the elements. However, according to software or hardware engineering practices, the depicted elements and the functions thereof may be implemented as parts of a monolithic software structure, as standalone software components or modules, or as components or modules that employ external routines, code, services, and so forth, or any combination of these. All such implementations are within the scope of the present disclosure. In view of the foregoing, it will be appreciated that elements of the block diagrams and flowchart illustrations support combinations of means for performing the specified functions, combinations of steps for performing the specified functions, program instruction techniques for performing the specified functions, and so on.

It will be appreciated that computer program instructions may include computer executable code. A variety of languages for expressing computer program instructions are possible, including without limitation C, C++, Java, JavaScript, assembly language, Lisp, HTML, Perl, and so on. Such languages may include assembly languages, hardware description languages, database programming languages, functional programming languages, imperative programming languages, and so on. In some embodiments, computer program instructions can be stored, compiled, or interpreted to run on a computing device, a programmable data processing apparatus, a heterogeneous combination of processors or processor architectures, and so on. Without limitation, embodiments of the system as described herein can take the form of web-based computer software, which includes client/server software, software-as-a-service, peer-to-peer software, or the like.

In some embodiments, a computing device enables execution of computer program instructions including multiple programs or threads. The multiple programs or threads may be processed more or less simultaneously to enhance utilization of the processor and to facilitate substantially simultaneous functions. By way of implementation, any and all methods, program codes, program instructions, and the like described herein may be implemented in one or more threads. A thread can spawn other threads, which can themselves have assigned priorities associated with them. In some embodiments, a computing device can process these threads based on priority or any other order based on instructions provided in the program code.

Unless explicitly stated or otherwise clear from the context, the verbs “process” and “execute” are used interchangeably to indicate execute, process, interpret, compile, assemble, link, load, any and all combinations of the foregoing, or the like. Therefore, embodiments that process computer program instructions, computer-executable code, or the like can suitably act upon the instructions or code in any and all of the ways just described.

The functions and operations presented herein are not inherently related to any particular computing device or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will be apparent to those of ordinary skill in the art, along with equivalent variations. In addition, embodiments of the disclosure are not described with reference to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the present teachings as described herein, and any references to specific languages are provided for disclosure of enablement and best mode of embodiments of the disclosure. Embodiments of the disclosure are well suited to a wide variety of computer network systems over numerous topologies. Within this field, the configuration and management of large networks include storage devices and computing devices that are communicatively coupled to dissimilar computing and storage devices over a network, such as the Internet, also referred to as “web” or “world wide web”.

In at least some exemplary embodiments, the exemplary disclosed system may utilize sophisticated machine learning and/or artificial intelligence techniques to prepare and submit datasets and variables to cloud computing clusters and/or other analytical tools (e.g., predictive analytical tools) which may analyze such data using artificial intelligence neural networks. The exemplary disclosed system may for example include cloud computing clusters performing predictive analysis. For example, the exemplary neural network may include a plurality of input nodes that may be interconnected and/or networked with a plurality of additional and/or other processing nodes to determine a predicted result. Exemplary artificial intelligence processes may include filtering and processing datasets, processing to simplify datasets by statistically eliminating irrelevant, invariant or superfluous variables or by creating new variables which are an amalgamation of a set of underlying variables, and/or processing for splitting datasets into train, test and validate datasets using at least a stratified sampling technique. The exemplary disclosed system may utilize prediction algorithms and approaches that may include regression models, tree-based approaches, logistic regression, Bayesian methods, deep learning and neural networks, both on a stand-alone and on an ensemble basis, and the final prediction may be based on the model/structure which delivers the highest degree of accuracy and stability as judged by implementation against the test and validate datasets.
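
As a non-limiting sketch of the stratified train/test/validate splitting mentioned above, the following Python fragment uses the scikit-learn library (assumed to be available); the split ratios, helper name and toy dataset are illustrative only.

from sklearn.model_selection import train_test_split

def stratified_three_way_split(X, y, test_size=0.2, validate_size=0.2, seed=42):
    # Split a labeled dataset into train/test/validate partitions while
    # preserving class proportions (stratified sampling).
    X_train, X_hold, y_train, y_hold = train_test_split(
        X, y, test_size=test_size + validate_size, stratify=y, random_state=seed)
    X_test, X_val, y_test, y_val = train_test_split(
        X_hold, y_hold, test_size=validate_size / (test_size + validate_size),
        stratify=y_hold, random_state=seed)
    return (X_train, y_train), (X_test, y_test), (X_val, y_val)

X = [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9]]
y = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
train, test, val = stratified_three_way_split(X, y)
print(len(train[0]), len(test[0]), len(val[0]))  # e.g. 6 train, 2 test, 2 validate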

Throughout this disclosure and elsewhere, block diagrams and flowchart illustrations depict methods, apparatuses (e.g., systems), and computer program products. Each element of the block diagrams and flowchart illustrations, as well as each respective combination of elements in the block diagrams and flowchart illustrations, illustrates a function of the methods, apparatuses, and computer program products. Any and all such functions (“depicted functions”) can be implemented by computer program instructions; by special-purpose, hardware-based computer systems; by combinations of special purpose hardware and computer instructions; by combinations of general purpose hardware and computer instructions; and so on—any and all of which may be generally referred to herein as a “component”, “module,” or “system.”

While the foregoing drawings and description set forth functional aspects of the disclosed systems, no particular arrangement of software for implementing these functional aspects should be inferred from these descriptions unless explicitly stated or otherwise clear from the context.

Each element in flowchart illustrations may depict a step, or group of steps, of a computer-implemented method. Further, each step may contain one or more sub-steps. For the purpose of illustration, these steps (as well as any and all other steps identified and described above) are presented in order. It will be understood that an embodiment can contain an alternate order of the steps adapted to a particular application of a technique disclosed herein. All such variations and modifications are intended to fall within the scope of this disclosure. The depiction and description of steps in any particular order is not intended to exclude embodiments having the steps in a different order, unless required by a particular application, explicitly stated, or otherwise clear from the context.

The functions, systems and methods herein described could be utilized and presented in a multitude of languages. Individual systems may be presented in one or more languages and the language may be changed with ease at any point in the process or methods described above. One of ordinary skill in the art would appreciate that there are numerous languages the system could be provided in, and embodiments of the present disclosure are contemplated for use with any language.

While multiple embodiments are disclosed, still other embodiments of the present disclosure will become apparent to those skilled in the art from this detailed description. There may be aspects of this disclosure that may be practiced without the implementation of some features as they are described. It should be understood that some details have not been described in detail in order to not unnecessarily obscure the focus of the disclosure. The disclosure is capable of myriad modifications in various obvious aspects, all without departing from the spirit and scope of the present disclosure. Accordingly, the drawings and descriptions are to be regarded as illustrative rather than restrictive in nature.

Claims

1. A method for dynamically inserting ads into online content, performed by a computer processor, comprising:

extracting structural metadata and contextual features from a digital media file, wherein the digital media file is configured for performance in an Internet protocol (IP) based medium;
analyzing the structural metadata and contextual features for an understanding of characteristics associated with the digital media file;
identifying a digital media content environment from the characteristics associated with the digital media file;
selecting an advertisement asset from a storage of assets, wherein the selection is based on: content included in the digital media file, wherein the selected advertisement asset includes a correlation to the structural metadata and contextual features of the digital media file, the digital media content environment, and rules with respect to a native content integration correlating with the performance of the selected advertisement asset in the digital media content environment;
processing dynamic rules for asset formatting of the selected advertisement asset; and
integrating and distributing the advertisement asset into the digital media content environment.

2. The method of claim 1, further comprising:

identifying by an event listener controlled by the computer processor, a triggering event within the digital media content environment; and
integrating the advertisement asset into the digital media file in response to the triggering event.

3. The method of claim 2, further comprising inserting an adjunct window displaying the integrated advertisement asset, wherein the adjunct window is displayed concurrently with the digital media file.

4. The method of claim 3, further comprising resizing a main window displaying the digital media file as the adjunct window is inserted.

5. The method of claim 2, wherein the triggering event is based on criteria specified by a media provider or within a file of the advertisement asset.

6. The method of claim 1, further comprising:

identifying by an event listener controlled by the computer processor, a triggering event within the digital media content environment;
identifying a type of user action associated with the triggering event;
identifying a rule associated with the type of user action;
distributing the advertisement asset into the digital media player in a display format dependent on the rule identified with the type of user action, wherein the display format is different based on the rule identified.

7. The method of claim 6, wherein the type of user action includes scrolling through the digital media file.

8. The method of claim 1, further comprising:

identifying whether a criteria associated with the display of the advertisement asset has been met within the digital media content environment; and
persisting display of the advertisement asset within the digital media content environment until a rule based on user interaction with the digital media content environment has been fulfilled.

9. The method of claim 1, further comprising:

logging, in association with a display of a particular advertisement asset: a number of times the particular advertisement asset was displayed, a number of total pixels used in the display of the particular advertisement asset, and an aggregated duration of display time; and
generating a quality of display metric for the particular advertisement asset, wherein the metric is used to determine an overall ad exposure for the particular advertisement asset in the digital media environment.

10. The method of claim 1, wherein:

the step of integration and distribution of the advertisement asset occurs at varying points in time during a performance of the digital media file; and
more than one advertisement asset is integrated and distributed into the digital media content environment during the performance of the digital media file.

11. A computer program product for dynamically inserting ads into online content, the computer program product comprising:

one or more computer readable storage media, and program instructions collectively stored on the one or more computer readable storage media, the program instructions comprising:
extracting structural metadata and contextual features from a digital media file, wherein the digital media file is configured for performance in an Internet protocol (IP) based medium;
analyzing the structural metadata and contextual features for an understanding of characteristics associated with the digital media file;
identifying a digital media content environment from the characteristics associated with the digital media file;
selecting an advertisement asset from a storage of assets, wherein the selection is based on: content included in the digital media file, wherein the selected advertisement asset includes a correlation to the structural metadata and contextual features of the digital media file, the digital media content environment, and rules with respect to a native content integration correlating with the performance of the selected advertisement asset in the digital media content environment;
processing dynamic rules for asset formatting of the selected advertisement asset; and integrating and distributing the advertisement asset into the digital media content environment.

12. The computer program product of claim 11, wherein the program instructions further comprise:

identifying by an event listener controlled by the computer processor, a triggering event within the digital media content environment; and
integrating the advertisement asset into the digital media file in response to the triggering event.

13. The computer program product of claim 12, further comprising inserting an adjunct window displaying the integrated advertisement asset, wherein the adjunct window is displayed concurrently with the digital media file.

14. The computer program product of claim 13, further comprising resizing a main window displaying the digital media file as the adjunct window is inserted.

15. The computer program product of claim 12, wherein the triggering event is based on criteria specified by a media provider or within a file of the advertisement asset.

16. The computer program product of claim 11, further comprising:

identifying by an event listener controlled by the computer processor, a triggering event within the digital media content environment;
identifying a type of user action associated with the triggering event;
identifying a rule associated with the type of user action;
distributing the advertisement asset into the digital media player in a display format dependent on the rule identified with the type of user action, wherein the display format is different based on the rule identified.

17. The computer program product of claim 16, wherein the type of user action includes scrolling through the digital media file.

18. The computer program product of claim 11, further comprising:

identifying whether a criteria associated with the display of the advertisement asset has been met within the digital media content environment; and
persisting display of the advertisement asset within the digital media content environment until a rule based on user interaction with the digital media content environment has been fulfilled.

19. The computer program product of claim 11, further comprising:

logging, in association with a display of a particular advertisement asset: a number of times the particular advertisement asset was displayed, a number of total pixels used in the display of the particular advertisement asset, and an aggregated duration of display time; and
generating a quality of display metric for the particular advertisement asset, wherein the metric is used to determine an overall ad exposure for the particular advertisement asset in the digital media environment.

20. The computer program product of claim 11, wherein:

the step of integration and distribution of the advertisement asset occurs at varying points in time during a performance of the digital media file; and
more than one advertisement asset is integrated and distributed into the digital media content environment during the performance of the digital media file.
Patent History
Publication number: 20220038757
Type: Application
Filed: Aug 2, 2021
Publication Date: Feb 3, 2022
Applicant: Spotible Labs LLC (New York, NY)
Inventor: Dana Ghavami (New York, NY)
Application Number: 17/391,956
Classifications
International Classification: H04N 21/234 (20060101); H04N 21/235 (20060101); H04N 21/431 (20060101); H04N 21/262 (20060101); G06Q 30/02 (20060101);