System for Real Time Internet Protocol Content Integration, Prioritization and Distribution
A system and method to integrate, prioritize, and distribute multimedia assets into Internet Protocol (IP) media in real time. Embodiments include dynamic media integration based on permissible content elements/locations and app events, and asset formatting based on user exposure, key performance indicators, and contextual relevancy. Consumer engagement is enhanced through gesture-based interaction on the user's native device and through intelligent routing from the user's device when performing, for example, a quick response (QR) or embedded code scan, which is prioritized based on user geography and other criteria as well as landing page and other performance parameters. The most suitable assets are dynamically selected, integrated, and distributed into the content environment.
This application claims benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application Number 63/059484, filed Jul. 31, 2020, which is hereby incorporated by reference herein in its entirety.
FIELD OF THE INVENTION

The present invention relates generally to computer-based methods and apparatuses, including computer program products, for a system and method of real time Internet Protocol (IP) content integration, prioritization and distribution.
BACKGROUND OF THE DISCLOSURE

Digital media promotions can be targeted and displayed in a multitude of ways. By and large, however, IP connected media and app promotions are preset in placement.
Conventionally, digital media ad placements are based on predetermined locations positioned inside a website/webpage by location-specific ad tags and rotated by ad systems. Beyond webpage content, consumers are accessing digital content in an increasing number of ways, such as through native smartphone apps and voice-based smart speakers. Native platforms pose even greater challenges for integrating ad placements, which need to be developed and released into new application versions; a process that may take days, weeks or sometimes several months until it is completed, approved and updated on an app store.
Consumer in-app engagement, for example on a connected TV (CTV) or via a device such as a smartphone used to scan a quick response (QR) code presented in digital or non-digital media, is encumbered by cumbersome remote-control-based responses and by a lack of targeting in embedded Uniform Resource Locators (URLs) for personalized landing pages.
Additionally, digital media ad delivery is predominantly fixed in orientation and unable to adapt placement dynamically.
The exemplary disclosed system and method of the present disclosure is directed to overcoming one or more of the shortcomings set forth above and/or other deficiencies in existing technology.
In one aspect of the subject disclosure, a method for dynamically integrating ads into IP content, performed by a computer processor is disclosed. The method includes extracting structural metadata and contextual features from a digital media file. The digital media file is configured for performance in an Internet protocol (IP) based medium. The structural metadata and contextual features are analyzed for an understanding of characteristics associated with the digital media file. A digital media content environment is identified from the characteristics associated with the digital media file. An advertisement asset is selected from a storage of assets. The selection is based on: content included in the digital media file, wherein the selected advertisement asset includes a correlation to the structural metadata and contextual features of the digital media file, the digital media content environment, and rules with respect to a native content integration correlating with the performance of the selected advertisement asset in the digital media content environment. Dynamic rules for asset formatting of the selected advertisement asset are processed. The advertisement asset is integrated and distributed into the digital media content environment.
In another aspect, a computer program product for dynamically integrating ads into IP content is disclosed. The computer program product comprises one or more computer readable storage media, and program instructions collectively stored on the one or more computer readable storage media. The program instructions include: extracting structural metadata and contextual features from a digital media file, wherein the digital media file is configured for performance in an Internet protocol (IP) based medium; analyzing the structural metadata and contextual features for an understanding of characteristics associated with the digital media file; identifying a digital media content environment from the characteristics associated with the digital media file; selecting an advertisement asset from a storage of assets, wherein the selection is based on: content included in the digital media file, wherein the selected advertisement asset includes a correlation to the structural metadata and contextual features of the digital media file, the digital media content environment, and rules with respect to a native content integration correlating with the performance of the selected advertisement asset in the digital media content environment; processing dynamic rules for asset formatting of the selected advertisement asset; and integrating and distributing the advertisement asset into the digital media content environment.
The techniques described herein may be implemented in a number of ways. Example implementations are provided below with reference to the following figures.
DETAILED SPECIFICATION

The subject technology generally relates to dynamically integrating creative assets into software applications (apps) or other digital media. In an exemplary embodiment, a method dynamically integrates creative assets and content into web-based (IP-connected) media and apps at any time within a content's play/replay and at any dynamically determined location in the IP-based digital media while a user interacts with the content or application. Creative assets may be, for example, advertising, promotional or informative multimedia content.
Embodiments provide computer-based methods and systems for providing a minimally intrusive mechanism that allows a media entity such as a content publisher to dynamically incorporate creative assets, such as advertisements, into digital media. Digital media may constitute, for example, IP connected websites/webpages, email clients, native applications, over-the-top (OTT) streaming platforms, CTVs, smart speakers, gaming consoles, out-of-home (OOH) billboards/signs, augmented reality (AR) and virtual reality (VR) headsets/displays or other environments (collectively, an “app”). Example embodiments provide real-time creative asset or content targeting for enabling a viewer or listener to experience promotions or other relevant content dynamically integrated into the app as the user consumes the digital media. These promotions can be dynamically updated as the user navigates the digital media and may be triggered by predetermined events or behaviors of the user with respect to the app.
Referring now to
In one embodiment, a media entity is provided a location-based events control panel to insert creative assets, promotional content or other media in IP connected digital media. The media entity dynamically permits locations within the digital media where creative assets may be rendered. Each permitted location-based event is associated with a content element, which is preferably further associated with one or more criteria, which are used by the system to select the most appropriate creative assets, promotional content or other media. In one example, the media entity may decide to permit the insertion of creative assets into a certain part of a page section, above or below a certain content element, or within a certain paragraph number depending on the length of the webpage or article. In another example, if the subject matter of the content being viewed by a user relates to a certain category such as vehicles, the media entity could enable advertisers to dynamically integrate a car commercial at an optimal location in the content that is not hard-coded in advance but instead determined on-the-fly based on location placement logic and rules, so as to create the highest level of viewability and contextual adjacency and the lowest dilution by placing the commercial at a defined distance from competing promotions or content.
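Purely as an illustrative sketch, the following TypeScript fragment shows one way such location placement logic and rules could be expressed; the interfaces, field names, scores and the 600 px spacing threshold are hypothetical assumptions, not details of the disclosed system.

```typescript
// Hypothetical shapes for permitted placement locations and placement rules.
interface PermittedLocation {
  id: string;
  contentElement: string;        // e.g. "paragraph-3", "below-hero-image"
  offsetPx: number;              // vertical position within the page
  viewabilityScore: number;      // 0..1, estimated from layout analytics
  categories: string[];          // contextual categories, e.g. ["vehicles"]
}

interface PlacementRules {
  minDistancePx: number;         // required distance from competing promotions
  requiredCategory?: string;     // contextual-adjacency requirement
}

// Selects the permitted location with the highest viewability that satisfies
// the distance and contextual-adjacency rules; returns null if none qualify.
function selectPlacement(
  locations: PermittedLocation[],
  competingPromoOffsets: number[],
  rules: PlacementRules
): PermittedLocation | null {
  const qualifying = locations.filter((loc) => {
    const farEnough = competingPromoOffsets.every(
      (offset) => Math.abs(offset - loc.offsetPx) >= rules.minDistancePx
    );
    const contextualMatch =
      !rules.requiredCategory || loc.categories.includes(rules.requiredCategory);
    return farEnough && contextualMatch;
  });
  qualifying.sort((a, b) => b.viewabilityScore - a.viewabilityScore);
  return qualifying[0] ?? null;
}

// Example: place a car commercial near vehicle-related content, at least
// 600 px away from any competing promotion already on the page.
const chosen = selectPlacement(
  [
    { id: "p2", contentElement: "paragraph-2", offsetPx: 900, viewabilityScore: 0.8, categories: ["vehicles"] },
    { id: "p5", contentElement: "paragraph-5", offsetPx: 2400, viewabilityScore: 0.6, categories: ["vehicles"] },
  ],
  [1100],
  { minDistancePx: 600, requiredCategory: "vehicles" }
);
```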
In some embodiments, user activity-based event listeners are programmed to detect certain actions when a user interacts with an app. These actions can include logging into an app, scrolling through content, hovering over an object, selecting a category of content, or some other in-app activity. Promotional integration and exposure are thus achieved throughout the viewer's entire content journey in the app, instead of just a specific section. For example,
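A minimal browser-side sketch of such activity-based event listeners is shown below; the requestCreativeAsset helper, event names and data attributes are hypothetical and included only for illustration.

```typescript
// Activity-based listeners that notify a hypothetical creative asset client
// when in-app events occur.
type AppEvent = "login" | "scroll" | "hover" | "category-select";

// Placeholder for the client call that requests a creative asset for an event.
declare function requestCreativeAsset(event: AppEvent, detail?: string): void;

function registerActivityListeners(root: HTMLElement): void {
  // Fires as the user scrolls through content.
  window.addEventListener("scroll", () => requestCreativeAsset("scroll"), { passive: true });

  // Fires when the user hovers over a tagged object.
  root.querySelectorAll("[data-promo-hover]").forEach((el) =>
    el.addEventListener("mouseenter", () =>
      requestCreativeAsset("hover", (el as HTMLElement).dataset.promoHover)
    )
  );

  // Fires when the user selects a content category.
  root.querySelectorAll("[data-category]").forEach((el) =>
    el.addEventListener("click", () =>
      requestCreativeAsset("category-select", (el as HTMLElement).dataset.category)
    )
  );
}
```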
According to an embodiment, monetization and inventory management optimization can be achieved based on ad exposure time and effective value per user visit. For instance, the lengths and formats of commercials shown on-the-fly can be optimized based on a total value-per-visit goal. By way of example, when selecting commercials of various lengths (for example, 15 and 30 seconds) or types (for example, video or image), a dynamic selection can be made based on the expected length of a content viewer's app visit, calculated from the average time spent during previous visits, effectively optimizing ad frequency and creative asset format exposure to achieve a revenue goal per viewer app session.
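One possible, simplified way to express this kind of session-level selection is sketched below; the greedy revenue-per-second heuristic and the assumption that roughly 10% of an expected visit is available for ad time are illustrative choices, not requirements of the disclosed method.

```typescript
// Hypothetical commercial inventory entry.
interface Commercial {
  id: string;
  lengthSeconds: number;     // e.g. 15 or 30
  format: "video" | "image";
  expectedRevenue: number;   // expected value per exposure
}

// Picks a set of commercials whose total length fits the expected visit
// duration while approaching a per-session revenue goal.
function planSession(
  inventory: Commercial[],
  expectedVisitSeconds: number,   // e.g. average time spent in prior visits
  maxAdSeconds: number,
  revenueGoal: number
): Commercial[] {
  // Assumption: at most ~10% of the expected visit is spent on ads.
  const budget = Math.min(maxAdSeconds, Math.floor(expectedVisitSeconds * 0.1));
  const plan: Commercial[] = [];
  let usedSeconds = 0;
  let revenue = 0;

  // Greedy heuristic: prefer the highest revenue-per-second commercials that fit.
  const byEfficiency = [...inventory].sort(
    (a, b) => b.expectedRevenue / b.lengthSeconds - a.expectedRevenue / a.lengthSeconds
  );
  for (const c of byEfficiency) {
    if (revenue >= revenueGoal) break;
    if (usedSeconds + c.lengthSeconds <= budget) {
      plan.push(c);
      usedSeconds += c.lengthSeconds;
      revenue += c.expectedRevenue;
    }
  }
  return plan;
}
```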
In other embodiments, the time a content viewer is exposed to the creative asset is also measured. Additional qualitative factors can also yield subjective measurement values, such as the activity the viewer was engaged in at the time of exposure to the creative asset. Any one or more factors can be used alone, in combination, or weighted in some manner to achieve optimal creative asset integration, location, and orientation. Accordingly, embodiments may programmatically adapt the functionality of creative asset formats to meet certain performance goals, which can be defined for each type of user or device. For example, a key performance indicator (KPI) rule may be programmed for a creative asset format to optimize rendering for the largest viewable video size versus the highest video completion rate. In one instance, a video or multimedia creative asset is dynamically displayed at the top of an app in the largest viewable size possible for at least two seconds to meet the Media Rating Council (MRC) accredited viewability standard. Alternatively, the same video or multimedia creative asset may be dynamically formatted to display at the top of the app and additionally persist in view when a viewer scrolls down the app in a customizable manner (e.g., persisting the video in a certain orientation and size for a certain amount of time) to achieve a certain multimedia or video quartile/completion rate.
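A simplified sketch of such a KPI rule is given below; the rule shape and field names are assumptions for illustration only.

```typescript
// Hypothetical KPI rule attached to a video creative format.
interface KpiRule {
  goal: "viewability" | "completion";
  minInViewSeconds: number;      // e.g. 2 s for an MRC-style viewability goal
  persistOnScroll: boolean;      // keep a reduced player in view while scrolling
  persistPosition?: "top" | "bottom-right";
}

// Chooses how to render a video asset depending on the KPI goal configured
// for this placement; the concrete rules here are illustrative assumptions.
function formatVideoForGoal(goal: KpiRule["goal"]): KpiRule {
  if (goal === "viewability") {
    // Largest viewable size at the top of the app for at least 2 seconds.
    return { goal, minInViewSeconds: 2, persistOnScroll: false };
  }
  // Optimize quartile/completion rate: keep a persistent player in view.
  return { goal, minInViewSeconds: 2, persistOnScroll: true, persistPosition: "bottom-right" };
}
```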
In another embodiment, KPI optimization may also be programmed for the viewability of imagery or multimedia with reactive persistence.
Additionally, phases/break points can be programmed to adapt creative elements, or different creative sizes/aspect ratios, across different device/screen types. For example, a horizontal 16:9 aspect ratio video can be auto-formatted for larger desktop/laptop, tablet or TV screens, and a vertical 9:16 aspect ratio video can be auto-formatted for smartphone screens in accordance with a KPI rule.
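For illustration, a break-point rule of this kind might be sketched as follows; the pixel thresholds used to classify screens are assumptions.

```typescript
// Sketch of a break-point rule mapping device/screen type to creative aspect ratio.
type ScreenClass = "tv" | "desktop" | "tablet" | "smartphone";

function aspectRatioFor(screen: ScreenClass): "16:9" | "9:16" {
  // Larger horizontal screens get the 16:9 cut; handheld screens get 9:16.
  return screen === "smartphone" ? "9:16" : "16:9";
}

// A simple width-based classification (thresholds are illustrative assumptions).
function classifyScreen(widthPx: number, isTv: boolean): ScreenClass {
  if (isTv) return "tv";
  if (widthPx >= 1200) return "desktop";
  if (widthPx >= 768) return "tablet";
  return "smartphone";
}
```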
In other embodiments, event notifications are sent to the media entity from the app to provide information on the creative assets integrated, the dynamic creative asset locations available, and/or other events. These event notifications may be used by the multimedia server to schedule creative assets. Additionally, signals can be detected in a video stream by programmed cue points, metadata, or real-time multimedia machine analysis to determine a certain point in time or content event as a trigger to dynamically integrate a creative asset by either linear or non-linear insertion. For example, upon identification of a certain cue event or context, such as a change in score in a live streamed game, a creative asset promoting a certain brand can be overlaid on the screen of the app. Alternatively, silence can be detected, such as the point at which a podcast episode is transitioning between topics, and used as a dynamic point in time to introduce a sponsor message or commercial break on a user's device (for example, a smartphone or smart speaker).
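The following sketch illustrates, under assumed signal shapes and threshold values, how cue-point and silence signals might be mapped to linear or non-linear insertions.

```typescript
// Hypothetical stream signals emitted by cue points, metadata, or real-time analysis.
type StreamSignal =
  | { kind: "cue"; name: string; timeSec: number }          // e.g. "score-change"
  | { kind: "silence"; startSec: number; durationSec: number };

// Decide whether a detected signal should trigger a dynamic insertion.
function onStreamSignal(
  signal: StreamSignal,
  insertOverlay: (label: string) => void,       // non-linear overlay insertion
  insertAudioBreak: (atSec: number) => void     // linear audio break insertion
): void {
  if (signal.kind === "cue" && signal.name === "score-change") {
    // A score change in a live stream triggers a sponsor overlay.
    insertOverlay("sponsor-overlay");
  } else if (signal.kind === "silence" && signal.durationSec >= 1.5) {
    // A topic transition in a podcast is a natural break point for a sponsor message.
    insertAudioBreak(signal.startSec);
  }
}
```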
In one embodiment, the real-time content asset integration, prioritization and distribution system (CAIPD hereinafter, sometimes referred to simply as the "system") comprises a multimedia server, a creative asset client, and a creative asset server, each of which is connected to a communication network, such as the Internet. For purposes of this application, the terms "multimedia server" and "creative asset server" shall be regarded as equivalent, since each may serve creative assets and/or other types of messaging or content. In some embodiments, the system further incorporates a creative asset client for managing communication with the multimedia server. In some other embodiments, a first communication link is made between the creative asset client and the multimedia server, which is used to provide a second communication link between the system (and/or the creative asset client associated therewith) and the multimedia server. In one embodiment, the second communication link is a connection for continual delivery of creative assets, promotions, events, and other communication.
According to an embodiment of the invention, the system and method provide dynamic formatting of text (or imagery) or micro-segmented audio content (or video) for presentation and engagement with viewers in the highest performing way. For example, multimedia can be formatted in an inline article format versus a persistent footer format, where the creative asset format that drives the most interactions, click-throughs, audio (or video) tune-ins/unmutes, and/or download and subscription engagement will be shown the most. In one example, a transcription of an audio or video asset may be displayed to maximize engagement with viewers without requiring them to actually listen to (or watch) the multimedia asset with audio. In other examples, various animations can be applied to entice reading of what is being transcribed, with optimization toward the highest performing animation effect or creative format.
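As one hedged illustration, an epsilon-greedy selection (a common bandit-style heuristic, not necessarily the technique used by the disclosed system) could rotate between an inline article format and a persistent footer format while favoring the better performer:

```typescript
// Performance counters per creative format (inline article vs. persistent footer).
interface FormatStats {
  format: "inline-article" | "persistent-footer";
  impressions: number;
  interactions: number;   // clicks, unmutes, tune-ins, subscriptions
}

// Epsilon-greedy selection: usually serve the best-performing format,
// occasionally explore the alternative so its statistics stay current.
function chooseFormat(stats: FormatStats[], epsilon = 0.1): FormatStats["format"] {
  if (Math.random() < epsilon) {
    return stats[Math.floor(Math.random() * stats.length)].format;
  }
  const best = [...stats].sort(
    (a, b) =>
      b.interactions / Math.max(1, b.impressions) -
      a.interactions / Math.max(1, a.impressions)
  )[0];
  return best.format;
}
```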
In another embodiment, a method is provided for displaying a remote control asset so that the audio (or video) creative asset can be paused and played at any point of the app when the main creative asset audio or video player is scrolled out of view of the user, along with presenting call-to-action to download or subscribe from the persisting remote control asset on the user's screen (which can be configured, for example to persist in the lower right of the app).
Referring now to
In an exemplary embodiment, data may be provided to the system, stored by the system and provided by the system to users of the system across local area networks (LANs) (e.g., office networks, home networks) or wide area networks (WANs) (e.g., the Internet) with protection of personal identifiable information. In accordance with the previous embodiment, the system may be comprised of numerous servers communicatively connected across one or more LANs and/or WANs. One of ordinary skill in the art would appreciate that there are numerous manners in which the system could be configured and embodiments of the present invention are contemplated for use with any configuration.
In general, the system and methods provided herein may be consumed by a user of a computing device whether connected to a network or not. According to an embodiment, some of the applications of the present invention may not be accessible when not connected to a network; however, a user may be able to compose data offline that will be consumed by the system when the user is later connected to a network.
Referring now to
According to an exemplary embodiment, as shown in
Components of the system may connect to server 203 via WAN 201 or other network in numerous ways. For instance, a component may connect to the system i) through a computing device 212 directly connected to the WAN 201, ii) through a computing device 205, 206 connected to the WAN 201 through a routing device 204, iii) through a computing device 208, 209, 210 connected to a wireless access point 207, or iv) through a computing device 211 via a wireless connection (e.g., CDMA, GSM, 3G, 4G) to the WAN 201. One of ordinary skill in the art would appreciate that there are numerous ways that a component may connect to server 203 via WAN 201 or other network, and embodiments of the present invention are contemplated for use with any method for connecting to server 203 via WAN 201 or other network. Furthermore, server 203 could be comprised of a personal computing device, such as a smartphone, acting as a host for other computing devices to connect to.
Embodiments provide methods and systems for generating and dynamically incorporating creative assets such as advertisements into IP connected media, specific examples of which are apps, email clients, smart speakers, over-the-top (OTT) streaming platforms, OOH digital billboards/signs, AR and VR headsets/displays and more traditional website content (static and multimedia). Example embodiments provide a control panel in communication with a creative asset server, for enabling a media entity to dynamically integrate creative assets, advertisements (ads) or other content at any desired location in the app from the time a user first accesses the app to the time the user leaves the app. This provides more control over creative asset placement and provides the flexibility of inserting creative assets at any point throughout the viewer's or listener's app journey from start to end. The creative assets are dynamically updated as a viewer navigates through the app, such as by scrolling through movie and TV show titles of interest, or by dynamic paneling of commercial messaging while watching streaming content, allowing the viewer to interact with the commercial or content in tandem with continuous content playback. For purposes of this invention, one skilled in the art will recognize that "advertisements" can include any type of media or electronic content including static or dynamic text, static or dynamic graphical images of any type, including animated images, web content such as HTML and XML code, other code, video and/or sounds or other audio content including, for example, speech and music whether streamed or downloaded, RSS and other feeds, data such as social media content and other data sources. In addition, advertisements as used herein include any combination of media types, and may be compressed or raw. According to an exemplary embodiment, dynamic information displayed to the user in digital media may be based on relevancy to the user, for example a registry of providers that displays only the providers or centers in closest geographic proximity to the user. In another embodiment, the display of a data source can be based on relevancy to the content being consumed by the viewer, for example the point spread of a game matchup based on the teams playing in a professional sports game, or moreover a real-time display of dynamic odds based on teams mentioned in a content environment.
A creative asset client manages inserting the creative assets and promotions (“promos or ads”) into the app. Upon initialization, the creative asset client connects to a dynamic multimedia server (for example, server 203 of
Upon the creative asset client receiving an indication that a creative asset is needed in the app, the creative asset client notifies the multimedia server to request a creative asset. The multimedia server, in turn, selects a particular creative asset from its collection according to a set of criteria, and sends that creative asset or an indication of that creative asset to the creative asset client. The creative asset client then provides the creative asset to the app, which inserts the creative asset into the app by displaying it on the screen, playing a sound over the speakers, or by some other method appropriate to the creative asset. In an alternative environment, the multimedia server begins sending a set of creative assets to the creative asset client immediately after the second communication channel is established, where they are stored, for example, by the creative asset client until needed.
Embodiments of the present invention can be used with all types of apps, OTT streaming platforms such as Apple tvOS® or Amazon Fire TV®, voice/audio streaming platforms such as Apple Siri® or Amazon Alexa® smart speakers/products and/or services, and systems, software or hardware that can establish a communication channel with the multimedia server. The communication channel may be established with the app through, for example, an application program interface (API) or software development kit (SDK). Furthermore, the app may be running at any location on a standard personal computer or mobile device connected to a network (e.g., the Internet) through narrow or broadband, DSL, ISDN, ATM and Frame Relay, cable modem, optical fiber, satellite or other wireless network, etc. For purposes of this application the term “app” shall include any software application, including on-line websites, email clients, native apps, OTT streaming service environments/platforms, games, etc.
Although the subject technology is discussed below specifically with reference to apps, smart speakers and OTT services, one skilled in the art will appreciate that the techniques disclosed are useful in other contexts as well, such as the dynamic integration of creative assets, ads and/or other content into any Internet connected device. The cost of use of such devices and/or access to pay-per-view content may be subsidized by advertisers paying for advertising directed to the devices. Also, by using a system with dynamic context, location, user activity and KPI event-based ads in the content being viewed or listened to, the ads can be made much more effective to the recipient than standard advertising and can be tailored (or personalized) to the users of such content.
In one embodiment, the app 312 uses structural metadata for the creative assets and the creative asset destinations. For example, codes or tags may be included in digital media content associated with the creative asset platforms and, likewise, corresponding codes or tags may be attached to the various creative assets in storage. The codes or tags indicate, for example, permissions that the app verifies against to accept a creative asset to be placed at a location within the app content. The system extracts the tags from the digital media when searching for assets with corresponding or qualifying tags. For instance, streaming content may contain tags for a billboard on the side of a building in a city. As the viewer watches a streaming program, a billboard may appear in the video carrying the advertising tag, an ad, or a code designating an ad. If no ad is available locally, an ad is requested from the multimedia server 314. Once the ad is received from the multimedia server 314 (or retrieved locally), it is presented in the space indicated by the billboard tag and appears in the video as a billboard advertising a product. The actual insertion of the ad is performed by the app software itself, with coordination from the creative asset client 320 of
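A simplified sketch of this tag-matching flow appears below; the AdTag and StoredAsset shapes and the /multimedia-server/assets endpoint are hypothetical stand-ins for whatever interface the multimedia server 314 actually exposes.

```typescript
// Hypothetical tag attached to an object in streaming content (e.g. a billboard).
interface AdTag {
  id: string;
  objectType: string;      // "billboard", "hat", "bus-sign", ...
  allowedGenres: string[]; // genres the publisher permits at this location
}

interface StoredAsset {
  id: string;
  objectType: string;
  genre: string;
  uri: string;
}

// Returns a locally cached asset whose tags qualify for the found ad tag,
// or undefined so the caller can request one from the multimedia server.
function matchLocalAsset(tag: AdTag, cache: StoredAsset[]): StoredAsset | undefined {
  return cache.find(
    (asset) =>
      asset.objectType === tag.objectType && tag.allowedGenres.includes(asset.genre)
  );
}

async function resolveAsset(tag: AdTag, cache: StoredAsset[]): Promise<StoredAsset> {
  const local = matchLocalAsset(tag, cache);
  if (local) return local;
  // Fall back to the multimedia server; endpoint and payload are assumptions.
  const response = await fetch("/multimedia-server/assets", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ objectType: tag.objectType, genres: tag.allowedGenres }),
  });
  return (await response.json()) as StoredAsset;
}
```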
According to one embodiment, the CAIPD extrapolates the native app view hierarchy in order to emulate or remain consistent with the app layout when determining the dynamic layout and positioning of creative assets/ads/content in the app. For example, a creative asset may be dynamically integrated into the third row/index of an app on specific devices such as CTV screens, or placed above/below an "Upcoming Titles" content element, in order to maintain the visual flow and integrity of the app.
The user/viewer may also have the ability to control creative asset, ad or content engagement by interacting with the CAIPD through voice or gesture-based commands native to the user's input device (example, a remote control paired to a CTV). For example, a user may press an enter button on their CTV remote once to watch a trailer for a creative asset promoting a movie and/or press the enter button twice to purchase, instead of responding by more cumbersome means of pointing and clicking the remote control to interact with the creative asset.
In another embodiment, the viewer may interact with a QR code or vector-format code having custom colors, logos and/or graphics by using a smartphone or other device capable of scanning such codes with a smart camera or application. Scanning initiates personalized landing page URLs associated with the promotion or content, deep links that open associated smartphone apps, or other actions such as tickets or coupon/discount rewards that can be stored in the mobile wallets of smartphone users. In this embodiment, the subject system intelligently routes the landing page/destination the user will visit, or a mobile wallet offer, based on event or geographic-based triggers. Referring to
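Purely as a sketch of such intelligent routing, the following fragment chooses a destination for a scanned code based on an active event trigger or the user's geography; the URLs, trigger names and region logic are illustrative assumptions.

```typescript
// Server-side routing for a scanned QR/embedded code: pick the landing
// destination based on geography and active event triggers.
interface ScanContext {
  codeId: string;
  userLat: number;
  userLon: number;
  activeEvents: string[];   // e.g. ["game-day-promo"]
}

interface Destination {
  url: string;              // personalized landing page or deep link
  walletOffer?: string;     // optional mobile-wallet coupon identifier
}

function routeScan(ctx: ScanContext): Destination {
  // Event-based trigger takes priority over plain geographic routing.
  if (ctx.activeEvents.includes("game-day-promo")) {
    return { url: `https://example.com/gameday?code=${ctx.codeId}`, walletOffer: "GAMEDAY-COUPON" };
  }
  // Otherwise route to a regional landing page near the user
  // (a real system would look up the nearest region from the coordinates).
  const region = ctx.userLat >= 0 ? "north" : "south";
  return { url: `https://example.com/${region}/landing?code=${ctx.codeId}` };
}
```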
According to an embodiment, the creative asset client 320 may be configured to collect declared data through interactive surveys/overlays to improve listener/viewer targeting. For example, the creative asset client 320 may overlay a question asking whether a viewer liked a commercial selected to show and prompt the viewer to respond via gesture or use of their input device. The declared data can verify interest, for example, in a certain genre of movies. Additionally, survey trees or cascades can be generated by the creative asset client. In this example, a follow-up question can be presented to the viewer on whether they would watch the sequel to the movie. Subsequently, a media entity can use the collected data to target viewers who explicitly expressed interest in watching the sequel when the movie sequel is released. The subject technology enables enhancement of the relevancy of creative assets, promotions and content based on viewer-declared data in a privacy compliant manner, which can include explicitly provided information, for example, to join a mailing list or receive an update for a promotion of interest.
In step 410, a user accesses an app. In step 420, the app is initiated and the user starts viewing digital media into which dynamic creative assets will be delivered. In step 430, the multimedia server establishes communication with the app through, for example, an API. Data sent from the multimedia server may include advertising data, codes, binary files, web content, event notifications, billing data, and other information as described below. It also may contain scheduling directions for when the creative assets are to be inserted into the app. Additionally, it may contain coded descriptions or indications of the creative assets sent, so that the app client can identify where to place the creative asset. For instance, one type of creative asset could be a hanging sign advertisement. The type and genre of the creative asset could affect where the creative asset is placed within the app. For example, creative assets indicating sporting goods, such as NIKE® shoes may be out of place in a cooking app, while an ad for BUDWEISER® is out of place in a children's app. By identifying not only the type of ad, but also the genre of ad, advertisements can be targeted to or restricted from users selectively. A detailed discussion of the types and genres of creative assets follows below.
Referring back to
In step 460, the creative asset client 320 checks for the existence of an advertising tag in the app. An advertising tag, as described above, is a code or other indicator that indicates an advertisement could be placed in a particular location of the app. The tag could have been generated based on information stored in the app file on a hard drive or other storage of the app client, or could have been included in the data sent by the multimedia server to the creative asset client in step 450. Alternatively, the multimedia server can target dynamic creative asset placement based on identification of an intended content element, such as a certain title, paragraph or menu position (e.g., the second paragraph or fifth menu row). The dynamic content positioning can be altered based on different apps, device types, pages or sections, or other objects in the app. For example, dynamic placement can be performed by the multimedia server on the second paragraph of the lifestyle section and/or on a mobile device, but after the fourth paragraph of the sports section and/or on a desktop, unless either position is within a paragraph of an image or video player, in which event an alteration in positioning can be programmed on any device. In an alternative embodiment, a separate execution thread is launched to check for advertising tags or targeted content elements. When one is found, the CAIPD is notified.

When an advertising tag or content element is detected in step 460, the creative asset client checks in step 470 to see if it is already storing any creative assets available and appropriate to be placed in that location. In order to determine when a particular creative asset can be placed in the location or on the object that corresponds to an advertising tag or content element, different criteria are used. The exact criteria examined are customizable and dependent upon the type of app (e.g., the app content) and the creative assets. In certain scenarios some criteria may be relevant while others are not. Some criteria that could be examined for appropriateness prior to inserting a creative asset into a particular advertising tag or content element location are creative asset type, creative asset genre, creative asset key performance indicators and creative asset scheduling time. The creative asset type relates to its material. For instance, some of the types of creative assets can include static images, animated or dynamic images, HTML or other programmatic codes, and audio or video files. The audio or video files can either be downloaded in whole or delivered in a streamed format. Additionally, each creative asset stored in the multimedia server is associated with a genre, even if the genre is universal. For instance, one genre may be sports related, and another may refer to alcohol. Creative assets that correspond to the sports genre may be placed in a school related app, while the alcohol genre ads may not be (for many reasons, including convention and legal restrictions). Further, each creative asset has associated scheduling information that prescribes when the advertisement can be placed in the app. For example, an advertiser may want their advertisements only to appear on weekends, and not during weekdays. The scheduling information about that advertisement would instruct the creative asset client only to insert that creative asset during the allowable scheduled times.
Determining if a given creative asset conforms to the type specified in an advertising tag or content element found in step 460 involves comparing the type or types associated with the creative asset to the type or types associated with the particular ad tag or content element. With regard to streaming content, each ad tag or content element within the app may be associated with an object in the content. For example, an object could be a billboard on a building, a sign on a bus, an emblem on a hat or other clothing, a loudspeaker that is actually producing sounds within the content, a scrolling message board such as in Times Square in New York City, or any other type of advertisement that may exist in the real world and is represented in the content. Many of the objects in the streaming content will not have advertising on them, and therefore will not have ad tags associated with them (trees or fire hydrants, for example). When an ad tag is found in step 460, the object to which it is associated will be identified by the creative asset type. For example, step 460 may find an ad tag that is associated with a logo on a baseball cap. The ad tag may be a code that specifies the relative dimensions of the creative asset, an indication of the type of object (clothing) or specific object (hat), or some other indication. Step 480 will then examine the inventory of stored creative assets to determine if any of them match the type code of the found ad tag. If one or more advertisements are found, those could be selected as conforming ads. Alternatively, the creative asset client may request a creative asset to be inserted at the ad tag or content element, and a conforming creative asset is sent from the multimedia server in response to the request.
In addition to checking to see if the types and genres match, step 480 will check to see if the scheduling requirements for the creative assets and the located ad tag or content element match. As mentioned above, each creative asset may have scheduling data that describes when the creative asset should preferably be inserted in the app. The scheduling data may indicate that there are no restrictions on when the creative asset can be placed into the app, or the scheduling data may tightly control when the creative asset can be placed. For instance, an advertiser may only want a creative asset placed into the app if it is after 5:00 pm local time. Therefore, prior to insertion into the app, a function is called to preferably ensure that the schedule is met. Scheduling need not be limited to time and date restrictions; these are only given by way of example. For instance, an advertiser may wish that a certain series of creative assets be seen by the user in a particular order. The app client could ensure, prior to inserting the second advertisement in the series into the app, that the first advertisement of the series was already inserted, or display the second advertisement only when a certain viewing time or user interaction has occurred with the first advertisement across any device the user utilizes.
Therefore, at least one of the type, genre, and scheduling time associated with the creative asset preferably identically or closely matches the codes associated with the advertising tag or content element for it to be considered a “conforming” creative asset for the purposes of step 480.
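A compact sketch of such a conformance check is shown below; the criteria shapes are assumptions, while the weekend and after-5:00-pm examples follow the scheduling examples given above.

```typescript
// Criteria carried by an ad tag / content element and by each stored creative asset.
interface TagCriteria {
  type: string;              // e.g. "video", "static-image", "billboard"
  genres: string[];          // acceptable genres at this location
}

interface CreativeAsset {
  id: string;
  type: string;
  genre: string;
  schedule?: { startHour: number; endHour: number; daysOfWeek: number[] }; // local time
}

// A creative asset "conforms" when its type and genre match the tag and the
// current time falls inside its scheduling window (if one is defined).
function conforms(asset: CreativeAsset, tag: TagCriteria, now: Date): boolean {
  const typeOk = asset.type === tag.type;
  const genreOk = tag.genres.includes(asset.genre);
  const scheduleOk =
    !asset.schedule ||
    (asset.schedule.daysOfWeek.includes(now.getDay()) &&
      now.getHours() >= asset.schedule.startHour &&
      now.getHours() < asset.schedule.endHour);
  return typeOk && genreOk && scheduleOk;
}

// Example: only serve after 17:00 local time on weekend days (Sunday = 0, Saturday = 6).
const weekendEvening = { startHour: 17, endHour: 23, daysOfWeek: [0, 6] };
const ok = conforms(
  { id: "ad-1", type: "video", genre: "sports", schedule: weekendEvening },
  { type: "video", genres: ["sports"] },
  new Date()
);
```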
In step 470, the creative asset client requests a conforming creative asset from the multimedia server by sending information to the multimedia server describing the creative asset specifications that are specified by the tag or content element found in step 460. Once the multimedia server receives the notice, it will check for conforming creative assets in step 480 and send a creative asset conforming to the desired creative asset specifications (e.g., type, genre, format and/or scheduling time) to the creative asset client. Once the conforming creative asset is downloaded from the multimedia server, it will be re-verified that it conforms to the advertising tag or content element found in step 460, and it then awaits insertion into the app.
In step 484, the conforming creative asset is formatted and placed into the app in the location that corresponds to the advertising tag or content element found in step 460. Formatting a stored creative asset for insertion into a native app, OTT, or streaming content may be performed by calling functions or an API provided in the SDK that was used to integrate the system with the native app, OTT, or streaming content. Some of the functions may include decrypting the stored creative asset, either at this step or when it was stored in memory, or decompressing a creative asset that was compressed for transmission from the multimedia server.
Following insertion of the creative asset into the app in step 484, the insertion event or the request to the multimedia server in step 470 is logged by sending the information about the creative asset placement back to the multimedia server in step 490. It is important that the creative asset client communicate this information back to the multimedia server, because the insertion is at least one event that preferably will be paid for by the advertiser. In addition, other information regarding measuring the app user's exposure to the creative asset and the quality of that exposure may be communicated back to the multimedia server for billing and other purposes. In logging the insertion, as much information as possible regarding the event is preferably sent back to the multimedia server, such as creative asset identification, target tag identification, user interaction, date, time, etc. This information can be stored in a backend database associated with the multimedia server in a privacy compliant manner and mined by the media entity or advertisers or others to provide useful information from the patterns that emerge. Additionally, the app client can determine duration of the creative asset (how long the inserted creative asset was displayed) and a measure of the quality of the exposure. The quality of the inserted creative asset may not be the same in every app, and indeed, each insertion may be different. For example, a creative asset may be inserted in the app, but at a location outside of the user viewport, or the creative asset may be presented at a 45° or some other degree angle away from the app user, yielding a less than optimal quality measurement. Alternatively, the app user may be in direct view of an inserted creative asset so that it takes up 33% of the user's available screen. This type of creative asset insertion could be very valuable to advertisers, who may be willing to pay a premium for such treatment. The creative asset measurement routines 370 of the creative asset client 320 can be used to collect and transmit such information to the multimedia server 314.
At a minimum, data about the creative asset insertion is sent by the creative asset client to the multimedia server. Additionally, the creative asset client would specifically identify the creative asset that was inserted. The duration of the creative asset insertion would also be included in the event log, as well as an indication of the quality of the insertion, as discussed above, such as size, position, viewing angle, etc. The last two data points (duration and quality) may be combined to create a "pixel-hours" type number, where the total number of pixels making up the inserted creative asset could be counted or calculated, timed, and multiplied to form a rough indication of overall creative asset insertion exposure. This combined data would be more useful than simply listing the number of creative asset insertions. For example, a large number of creative asset insertions that were of a small size (such that they appeared small on the user's screen) may be roughly equal to only a few creative asset insertions that appeared larger on the user's screen. Pixel-hours is a way to average all of these events to get an overall sense of the app user's exposure to the inserted creative assets. Another way to reflect the exposure of the creative assets displayed in the app would be to set a minimum time and screen size to qualify as a creative asset "hit." For example, each time a creative asset takes up 25% of the user's screen for 2 seconds it would be classified as a "hit"; the number of hits per app session could then be counted and sent to the multimedia server in the event logging step 490. Alternatively, the creative asset client 320 might implement a weighting scheme, where certain relative positions and view angles are weighted higher than others. One skilled in the art will recognize that other types of quality measurements and combinations of ratings are possible.
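The pixel-hours and "hit" measures described above can be sketched as follows; the exposure record shape is a hypothetical assumption, while the 25%-of-screen and 2-second thresholds come from the example above.

```typescript
// One exposure record for an inserted creative asset.
interface Exposure {
  pixelsOnScreen: number;   // rendered pixel area of the creative
  durationSeconds: number;
  screenPixels: number;     // total pixels of the user's viewport
}

// "Pixel-hours": pixel area multiplied by time, summed across exposures,
// as a rough aggregate of overall exposure.
function pixelHours(exposures: Exposure[]): number {
  return exposures.reduce(
    (sum, e) => sum + (e.pixelsOnScreen * e.durationSeconds) / 3600,
    0
  );
}

// "Hits": exposures covering at least 25% of the screen for at least 2 seconds.
function countHits(exposures: Exposure[]): number {
  return exposures.filter(
    (e) => e.pixelsOnScreen / e.screenPixels >= 0.25 && e.durationSeconds >= 2
  ).length;
}
```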
One skilled in the art will recognize that the exemplary CAIPD can be implemented as one or more code modules and may be implemented in a distributed environment where the various programs residing in memory 510 are instead distributed among several computer systems. Similarly, one would recognize that many computers 500 could be interconnected to make a single multimedia server.
Discussing the above steps in more detail, in step 610 the multimedia server, such as the multimedia server in
With respect to genres, as described above with reference to
The list of acceptable creative asset types and genres that the app client presents to the multimedia server can be sent to the multimedia server in many ways and at different times. As mentioned above, the app client may send an entire list of all of the acceptable creative asset types/genres/formats in the entire app as part of establishing the communication link with the multimedia server (step 610 in
In addition to sending an entire list of acceptable creative asset types/genres/formats under any of these scenarios, the app client may individually update the list stored on the multimedia server. For example, the app client may initially send a full list of acceptable creative asset types that are currently in use. Then, as the user navigates the app, exposing other creative asset types, the app client could send a message to the multimedia server (for example, an “update event”) to add or delete various creative asset types on an individual basis. Note that sending the list of acceptable creative asset types is not the same as sending the list of advertising tags found in the app. For example, there may be 100 advertising tags or content elements in the app that fall into only two acceptable ad types and three ad genres. In this case, preferably only the list of the acceptable ad types and genres, and not each of the advertising tags, may be sent to the multimedia server. In practice, this may involve maintaining a list of acceptable ad types/genres/formats on both the app client and the multimedia server, and, when the app client finds a new ad tag in the app (step 460 of
In addition to supporting the selection of ads by type, genre, format and scheduling criteria, the CAIPD can support the automatic targeting of ads based upon additional information sent from an app client to the multimedia server, or upon user-declared data and interests collected by the system, as described above, in the manner of survey responses from previously viewed creative assets. For example, a user's musical or video selections can be used to target (select specifically for that user) creative assets promoting a certain type of show declared to be of interest by the user.
In step 620 of
Step 630 of
After the creative asset has been sent to the app client and inserted into the app, in step 640, the app client sends a response back to the multimedia server to log the event. Additionally, the multimedia server notifies the creative asset scheduler in the multimedia server of which creative assets were sent to the app client, whether they were requested or selected, and as much other information as possible about the system. This information may instead or also be sent to a billing system if one is available. For example, the multimedia server may report that creative asset “A” was sent to app client “B” at time “C”. Logging data from both the multimedia server and from the app client in the creative asset scheduler are represented in step 640 of
The information logged in step 640 is used for a variety of purposes. Foremost, it is used for billing the advertiser, in that each time a creative asset is inserted into an app, it is known by the advertiser that it appeared on the app user's screen. Additionally, as discussed above, some measure of the quality of the exposure to an advertising placement may also be gleaned and reported. This information can also be used in billing the advertiser, perhaps by charging more for premium advertising placements or longer durations. Also, some advertisers may want to purchase a guaranteed minimum number of impressions. The information logging step 640 allows the multimedia servers to track this data in real time, in that a program connected to the creative asset scheduler can easily track the number of creative asset impressions of a particular ad or advertiser. Once the minimum number of impressions is met, the creative asset scheduler may choose another advertiser's advertisement in the selection step 620. The information logged in step 640 can also be used for statistical analysis and reports, such as which advertising tag location in an app yields the most or highest quality insertions, which creative asset types/formats generated the highest KPI results, and which genre, contextual and/or emotional matching of creative assets and digital media content generated the highest performance and user interest/engagement.
In step 650, the CAIPD checks to see if the app client is still responding, by either requesting other creative assets or sending creative asset placement events to be logged by the multimedia server, or even by sending useless fill data. If the app client continues sending data, the flow loops back to step 620 to determine if the app client is requesting a specific creative asset. If instead the app client is no longer responding, then the multimedia server flow may shut down in step 660 by closing the link established with the app client in step 610. The shutdown event is logged in step 670.
When scheduling which creative asset to send to the app client, the creative asset scheduler reviews which creative assets are available to be inserted to the app with respect to creative asset and device type, format, genre, position, timing considerations, etc. Because these requirements may be continuously changing, embodiments of the creative asset scheduler are capable of operating in a mode where all of the limitations are checked prior to sending a creative asset to the app client. For example, if the creative asset scheduler decides not to send advertisement “A” to an app client at a particular time, that does not exclude the advertisement A from ever being sent to the app client. For instance, if advertisement A was not initially sent because the app client indicated that it had not encountered the matching ad tag type, once the matching ad tag type was requested by the app client, the ad scheduler could then choose to insert advertisement A.
Another consideration for choosing which creative asset to send to the app client is the total number of times a creative asset has been seen by app users. The creative asset scheduler can factor in the number of times a particular creative asset has been seen when deciding which creative asset to next send to the app client. For example, an advertiser may pay for 10,000 ad views over a three hour period. The creative asset scheduler may send these conforming creative assets to the app client, and then, by checking the insertion data sent back by the app client, determine that the requisite number of ad views have been inserted, and no longer send that creative asset during the contract period.
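A minimal sketch of such frequency-capped scheduling is shown below, using the 10,000-views-over-three-hours example; the Campaign shape and the in-memory impression tracking are illustrative assumptions.

```typescript
// Tracks impressions per creative asset so the scheduler can stop serving an
// asset once a contracted impression count is reached inside a time window.
interface Campaign {
  assetId: string;
  maxImpressions: number;       // e.g. 10,000
  windowMs: number;             // e.g. 3 hours
  impressions: { at: number }[];
}

function recordImpression(c: Campaign, now = Date.now()): void {
  c.impressions.push({ at: now });
}

function isEligible(c: Campaign, now = Date.now()): boolean {
  const windowStart = now - c.windowMs;
  const recent = c.impressions.filter((i) => i.at >= windowStart).length;
  return recent < c.maxImpressions;
}

// Example: 10,000 views allowed over a three-hour period.
const campaign: Campaign = {
  assetId: "ad-A",
  maxImpressions: 10_000,
  windowMs: 3 * 60 * 60 * 1000,
  impressions: [],
};
recordImpression(campaign);
if (isEligible(campaign)) {
  // The scheduler may continue sending this creative asset to app clients.
}
```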
It should be remembered that the multimedia server may be coupled to many app clients, and that the selection by the creative asset scheduler of which creative asset to next send in step 620 may be different for every connected app client. In this case the creative asset scheduler keeps track of information on a per client, per session basis. In one embodiment, this tracking is implemented using well-known database techniques.
From the foregoing it will be appreciated that, although specific embodiments of the invention have been described herein for purposes of illustration, various modifications may be made without deviating from the spirit and scope of the invention. For example, one skilled in the art will recognize that the methods and systems discussed herein are applicable to other areas and devices other than apps having ads dynamically incorporated, such as email, web browsers, newsreaders, online books, navigation devices, other multimedia devices and environments, voice based media, applications and devices, etc. In addition, different forms of content can be dynamically incorporated into multimedia targets, including, but not limited to, webpages, HTML, XML, other code, RSS and other feeds, data such as social media content and other data sources, audio, video and static or animated text or graphics. One skilled in the art will also recognize that the methods and systems discussed herein are applicable to differing protocols and communication media (optical, wireless, cable, etc.) and that the techniques described herein may be embedded into such a system. In addition, those skilled in the art will understand how to make changes and modifications to the methods and systems described to meet their specific requirements or conditions.
The embodiments described above may include methods performed as computer program products. Traditionally, a computer program includes a finite sequence of computational instructions or program instructions. It will be appreciated that a programmable apparatus or computing device can receive such a computer program and, by processing the computational instructions thereof, produce a technical effect.
A programmable apparatus or computing device includes one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors, programmable devices, programmable gate arrays, programmable array logic, memory devices, application specific integrated circuits, or the like, which can be suitably employed or configured to process computer program instructions, execute computer logic, store computer data, and so on. Throughout this disclosure and elsewhere a computing device can include any and all suitable combinations of at least one general purpose computer, special-purpose computer, programmable data processing apparatus, processor, processor architecture, and so on. It will be understood that a computing device can include a computer-readable storage medium and that this medium may be internal or external, removable and replaceable, or fixed. It will also be understood that a computing device can include a Basic Input/Output System (BIOS), firmware, an operating system, a database, or the like that can include, interface with, or support the software and hardware described herein.
Embodiments of the system as described herein are not limited to applications involving conventional computer programs or programmable apparatuses that run them. It is contemplated, for example, that embodiments of the disclosure as claimed herein could include an optical computer, quantum computer, analog computer, or the like.
Regardless of the type of computer program or computing device involved, a computer program can be loaded onto a computing device to produce a particular machine that can perform any and all of the depicted functions. This particular machine (or networked configuration thereof) provides a technique for carrying out any and all of the depicted functions.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Illustrative examples of the computer readable storage medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A data store may be comprised of one or more of a database, file storage system, relational data storage system or any other data system or structure configured to store data. The data store may be a relational database, working in conjunction with a relational database management system (RDBMS) for receiving, processing and storing data. A data store may comprise one or more databases for storing information related to the processing of moving information and estimate information, as well as one or more databases configured for storage and retrieval of moving information and estimate information.
Computer program instructions can be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner. The instructions stored in the computer-readable memory constitute an article of manufacture including computer-readable instructions for implementing any and all of the depicted functions.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
The elements depicted in flowchart illustrations and block diagrams throughout the figures imply logical boundaries between the elements. However, according to software or hardware engineering practices, the depicted elements and the functions thereof may be implemented as parts of a monolithic software structure, as standalone software components or modules, or as components or modules that employ external routines, code, services, and so forth, or any combination of these. All such implementations are within the scope of the present disclosure. In view of the foregoing, it will be appreciated that elements of the block diagrams and flowchart illustrations support combinations of means for performing the specified functions, combinations of steps for performing the specified functions, program instruction technique for performing the specified functions, and so on.
It will be appreciated that computer program instructions may include computer executable code. A variety of languages for expressing computer program instructions are possible, including without limitation C, C++, Java, JavaScript, assembly language, Lisp, HTML, Perl, and so on. Such languages may include assembly languages, hardware description languages, database programming languages, functional programming languages, imperative programming languages, and so on. In some embodiments, computer program instructions can be stored, compiled, or interpreted to run on a computing device, a programmable data processing apparatus, a heterogeneous combination of processors or processor architectures, and so on. Without limitation, embodiments of the system as described herein can take the form of web-based computer software, which includes client/server software, software-as-a-service, peer-to-peer software, or the like.
In some embodiments, a computing device enables execution of computer program instructions including multiple programs or threads. The multiple programs or threads may be processed more or less simultaneously to enhance utilization of the processor and to facilitate substantially simultaneous functions. By way of implementation, any and all methods, program codes, program instructions, and the like described herein may be implemented in one or more thread. The thread can spawn other threads, which can themselves have assigned priorities associated with them. In some embodiments, a computing device can process these threads based on priority or any other order based on instructions provided in the program code.
Unless explicitly stated or otherwise clear from the context, the verbs “process” and “execute” are used interchangeably to indicate execute, process, interpret, compile, assemble, link, load, any and all combinations of the foregoing, or the like. Therefore, embodiments that process computer program instructions, computer-executable code, or the like can suitably act upon the instructions or code in any and all of the ways just described.
The functions and operations presented herein are not inherently related to any particular computing device or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will be apparent to those of ordinary skill in the art, along with equivalent variations. In addition, embodiments of the disclosure are not described with reference to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the present teachings as described herein, and any references to specific languages are provided for disclosure of enablement and best mode of embodiments of the disclosure. Embodiments of the disclosure are well suited to a wide variety of computer network systems over numerous topologies. Within this field, the configuration and management of large networks involve storage devices and computing devices that are communicatively coupled to dissimilar computing and storage devices over a network, such as the Internet, also referred to as the “web” or the “world wide web”.
In at least some exemplary embodiments, the exemplary disclosed system may utilize sophisticated machine learning and/or artificial intelligence techniques to prepare and submit datasets and variables to cloud computing clusters and/or other analytical tools (e.g., predictive analytical tools), which may analyze such data using artificial intelligence neural networks. The exemplary disclosed system may, for example, include cloud computing clusters performing predictive analysis. For example, the exemplary neural network may include a plurality of input nodes that may be interconnected and/or networked with a plurality of additional and/or other processing nodes to determine a predicted result. Exemplary artificial intelligence processes may include filtering and processing datasets; processing to simplify datasets by statistically eliminating irrelevant, invariant, or superfluous variables or by creating new variables that are an amalgamation of a set of underlying variables; and/or processing for splitting datasets into train, test, and validate datasets using at least a stratified sampling technique. The exemplary disclosed system may utilize prediction algorithms and approaches that may include regression models, tree-based approaches, logistic regression, Bayesian methods, and deep learning and neural networks, both on a stand-alone and on an ensemble basis. The final prediction may be based on the model/structure that delivers the highest degree of accuracy and stability as judged by implementation against the test and validate datasets.
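As a non-limiting sketch of the exemplary dataset splitting and model selection just described, the following Python example performs a stratified train/test/validate split, fits several candidate models, and selects the one with the highest held-out accuracy; the synthetic dataset and the particular candidate models are hypothetical placeholders rather than components of the disclosed system.

# Illustrative sketch only: stratified train/test/validate split and selection
# of the best-performing candidate model.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Stratified split: 70% train, 15% test, 15% validate.
X_train, X_rest, y_train, y_rest = train_test_split(
    X, y, test_size=0.30, stratify=y, random_state=0)
X_test, X_val, y_test, y_val = train_test_split(
    X_rest, y_rest, test_size=0.50, stratify=y_rest, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Fit each candidate on the training set and score it on the test set;
# the validate set is reserved for a final check of the chosen model.
scores = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    scores[name] = accuracy_score(y_test, model.predict(X_test))

best_name = max(scores, key=scores.get)
best_model = candidates[best_name]
print(best_name, scores[best_name],
      "validate accuracy:", accuracy_score(y_val, best_model.predict(X_val)))

In an exemplary deployment, stability might additionally be judged by comparing test and validate accuracy before the selected model is promoted to production.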
Throughout this disclosure and elsewhere, block diagrams and flowchart illustrations depict methods, apparatuses (e.g., systems), and computer program products. Each element of the block diagrams and flowchart illustrations, as well as each respective combination of elements in the block diagrams and flowchart illustrations, illustrates a function of the methods, apparatuses, and computer program products. Any and all such functions (“depicted functions”) can be implemented by computer program instructions; by special-purpose, hardware-based computer systems; by combinations of special purpose hardware and computer instructions; by combinations of general purpose hardware and computer instructions; and so on—any and all of which may be generally referred to herein as a “component,” “module,” or “system.”
While the foregoing drawings and description set forth functional aspects of the disclosed systems, no particular arrangement of software for implementing these functional aspects should be inferred from these descriptions unless explicitly stated or otherwise clear from the context.
Each element in flowchart illustrations may depict a step, or group of steps, of a computer-implemented method. Further, each step may contain one or more sub-steps. For the purpose of illustration, these steps (as well as any and all other steps identified and described above) are presented in order. It will be understood that an embodiment can contain an alternate order of the steps adapted to a particular application of a technique disclosed herein. All such variations and modifications are intended to fall within the scope of this disclosure. The depiction and description of steps in any particular order is not intended to exclude embodiments having the steps in a different order, unless required by a particular application, explicitly stated, or otherwise clear from the context.
The functions, systems and methods herein described could be utilized and presented in a multitude of languages. Individual systems may be presented in one or more languages and the language may be changed with ease at any point in the process or methods described above. One of ordinary skill in the art would appreciate that there are numerous languages the system could be provided in, and embodiments of the present disclosure are contemplated for use with any language.
While multiple embodiments are disclosed, still other embodiments of the present disclosure will become apparent to those skilled in the art from this detailed description. Some aspects of this disclosure may be practiced without implementing some of the features as they are described. It should be understood that some features have not been described in detail in order to not unnecessarily obscure the focus of the disclosure. The disclosure is capable of myriad modifications in various obvious aspects, all without departing from the spirit and scope of the present disclosure. Accordingly, the drawings and descriptions are to be regarded as illustrative rather than restrictive in nature.
Claims
1. A method for dynamically inserting ads into online content, performed by a computer processor, comprising:
- extracting structural metadata and contextual features from a digital media file, wherein the digital media file is configured for performance in an Internet protocol (IP) based medium;
- analyzing the structural metadata and contextual features for an understanding of characteristics associated with the digital media file;
- identifying a digital media content environment from the characteristics associated with the digital media file;
- selecting an advertisement asset from a storage of assets, wherein the selection is based on: content included in the digital media file, wherein the selected advertisement asset includes a correlation to the structural metadata and contextual features of the digital media file, the digital media content environment, and rules with respect to a native content integration correlating with the performance of the selected advertisement asset in the digital media content environment;
- processing dynamic rules for asset formatting of the selected advertisement asset; and
- integrating and distributing the advertisement asset into the digital media content environment.
2. The method of claim 1, further comprising:
- identifying by an event listener controlled by the computer processor, a triggering event within the digital media content environment; and
- integrating the advertisement asset into the digital media file in response to the triggering event.
3. The method of claim 2, further comprising inserting an adjunct window displaying the integrated advertisement asset, wherein the adjunct window is displayed concurrently with the digital media file.
4. The method of claim 3, further comprising resizing a main window displaying the digital media file as the adjunct window is inserted.
5. The method of claim 2, wherein the triggering event is based on criteria specified by a media provider or within a file of the advertisement asset.
6. The method of claim 1, further comprising:
- identifying by an event listener controlled by the computer processor, a triggering event within the digital media content environment;
- identifying a type of user action associated with the triggering event;
- identifying a rule associated with the type of user action; and
- distributing the advertisement asset into the digital media player in a display format dependent on the rule identified with the type of user action, wherein the display format is different based on the rule identified.
7. The method of claim 6, wherein the type of user action includes scrolling through the digital media file.
8. The method of claim 1, further comprising:
- identifying whether a criterion associated with the display of the advertisement asset has been met within the digital media content environment; and
- persisting display of the advertisement asset within the digital media content environment until a rule based on user interaction with the digital media content environment has been fulfilled.
9. The method of claim 1, further comprising:
- logging, in association with a display of a particular advertisement asset: a number of times the particular advertisement asset was displayed, a number of total pixels used in the display of the particular advertisement asset, and an aggregated duration of display time; and
- generating a quality of display metric for the particular advertisement asset, wherein the metric is used to determine an overall ad exposure for the particular advertisement asset in the digital media environment.
10. The method of claim 1, wherein:
- the step of integration and distribution of the advertisement asset occurs at varying points in time during a performance of the digital media file; and
- more than one advertisement asset is integrated and distributed into the digital media content environment during the performance of the digital media file.
11. A computer program product for dynamically inserting ads into online content, the computer program product comprising:
- one or more computer readable storage media, and program instructions collectively stored on the one or more computer readable storage media, the program instructions comprising:
- extracting structural metadata and contextual features from a digital media file, wherein the digital media file is configured for performance in an Internet protocol (IP) based medium;
- analyzing the structural metadata and contextual features for an understanding of characteristics associated with the digital media file;
- identifying a digital media content environment from the characteristics associated with the digital media file;
- selecting an advertisement asset from a storage of assets, wherein the selection is based on: content included in the digital media file, wherein the selected advertisement asset includes a correlation to the structural metadata and contextual features of the digital media file, the digital media content environment, and rules with respect to a native content integration correlating with the performance of the selected advertisement asset in the digital media content environment;
- processing dynamic rules for asset formatting of the selected advertisement asset; and
- integrating and distributing the advertisement asset into the digital media content environment.
12. The computer program product of claim 11, wherein the program instructions further comprise:
- identifying by an event listener controlled by the computer processor, a triggering event within the digital media content environment; and
- integrating the advertisement asset into the digital media file in response to the triggering event.
13. The computer program product of claim 12, further comprising inserting an adjunct window displaying the integrated advertisement asset, wherein the adjunct window is displayed concurrently with the digital media file.
14. The computer program product of claim 13, further comprising resizing a main window displaying the digital media file as the adjunct window is inserted.
15. The computer program product of claim 12, wherein the triggering event is based on criteria specified by a media provider or within a file of the advertisement asset.
16. The computer program product of claim 11, further comprising:
- identifying by an event listener controlled by the computer processor, a triggering event within the digital media content environment;
- identifying a type of user action associated with the triggering event;
- identifying a rule associated with the type of user action; and
- distributing the advertisement asset into the digital media player in a display format dependent on the rule identified with the type of user action, wherein the display format is different based on the rule identified.
17. The computer program product of claim 16, wherein the type of user action includes scrolling through the digital media file.
18. The computer program product of claim 11, further comprising:
- identifying whether a criterion associated with the display of the advertisement asset has been met within the digital media content environment; and
- persisting display of the advertisement asset within the digital media content environment until a rule based on user interaction with the digital media content environment has been fulfilled.
19. The computer program product of claim 11, further comprising:
- logging, in association with a display of a particular advertisement asset: a number of times the particular advertisement asset was displayed, a number of total pixels used in the display of the particular advertisement asset, and an aggregated duration of display time; and
- generating a quality of display metric for the particular advertisement asset, wherein the metric is used to determine an overall ad exposure for the particular advertisement asset in the digital media environment.
20. The computer program product of claim 11, wherein:
- the step of integration and distribution of the advertisement asset occurs at varying points in time during a performance of the digital media file; and
- more than one advertisement asset is integrated and distributed into the digital media content environment during the performance of the digital media file.
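Purely as a non-authoritative illustration of how the steps recited in claim 1 might be realized in software, the following minimal Python sketch walks through extraction of structural metadata and contextual features, environment identification, correlation-based asset selection, dynamic formatting, and integration; every class, function, field name, and rule in it is a hypothetical placeholder rather than the claimed implementation.

# Hypothetical sketch of the claim 1 pipeline; all names and rules are
# illustrative placeholders.
from dataclasses import dataclass

@dataclass
class Asset:
    asset_id: str
    topics: set
    environments: set
    native_rules: dict      # e.g. {"placement": "lower_third"}

@dataclass
class MediaFile:
    metadata: dict          # structural metadata (e.g. format, duration)
    keywords: set           # contextual features

def extract_features(media: MediaFile):
    # Step 1: extract structural metadata and contextual features.
    return media.metadata, media.keywords

def identify_environment(metadata: dict) -> str:
    # Steps 2-3: analyze the characteristics and identify the content environment.
    return "ctv_app" if metadata.get("format") == "video" else "web_article"

def select_asset(assets, keywords, environment):
    # Step 4: choose the asset whose topics correlate with the contextual
    # features, that targets this environment, and whose native-integration
    # rules permit performance in it.
    eligible = [a for a in assets
                if environment in a.environments and a.topics & keywords]
    return max(eligible, key=lambda a: len(a.topics & keywords), default=None)

def format_asset(asset: Asset, environment: str) -> dict:
    # Step 5: apply dynamic formatting rules for the selected asset.
    size = "full_bleed" if environment == "ctv_app" else "inline_banner"
    return {"asset_id": asset.asset_id, "size": size, **asset.native_rules}

def integrate(media: MediaFile, formatted: dict) -> None:
    # Step 6: integrate and distribute the formatted asset into the environment.
    media.metadata.setdefault("integrated_assets", []).append(formatted)

media = MediaFile(metadata={"format": "video", "duration": 120},
                  keywords={"travel", "hotels"})
assets = [Asset("a1", {"travel"}, {"ctv_app"}, {"placement": "lower_third"}),
          Asset("a2", {"finance"}, {"web_article"}, {"placement": "in_article"})]

metadata, keywords = extract_features(media)
env = identify_environment(metadata)
chosen = select_asset(assets, keywords, env)
if chosen is not None:
    integrate(media, format_asset(chosen, env))
print(media.metadata.get("integrated_assets"))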
Type: Application
Filed: Aug 2, 2021
Publication Date: Feb 3, 2022
Applicant: Spotible Labs LLC (New York, NY)
Inventor: Dana Ghavami (New York, NY)
Application Number: 17/391,956