CONTEXTUALLY AWARE INTERACTIVE ADVERTISEMENTS

A system and method for contextually aware interactive advertisements are provided. In example embodiments, an advertisement indication that indicates a presentation of an advertisement to a user is received. The advertisement corresponding to the advertisement indication is identified. At least one item listing is determined based, at least in part, on the advertisement. Real-time contextual data corresponding to the advertisement indication is received. The real-time contextual data corresponds to a physical context of the presentation of the advertisement. Contextual conditions associated with the advertisement are accessed, and satisfaction of the contextual conditions is determined based on the real-time contextual data. Based on the determined satisfaction of the contextual conditions, presentation of the at least one item listing to the user is caused.

Description
RELATED APPLICATIONS

This application claims the priority benefit of U.S. Provisional Application No. 61/869,557, entitled “IMPROVED RETAIL EXPERIENCE,” filed Aug. 23, 2013, which is hereby incorporated by reference in its entirety.

TECHNICAL FIELD

Embodiments of the present disclosure relate generally to mobile computing technology and, more particularly, but not by way of limitation, to contextual interactive advertisements.

BACKGROUND

In recent years, mobile devices, wearable devices, smart devices, and the like have pervaded nearly every aspect of modern life. Such devices often include sensors operable to physically detect identifiers such as Quick Response (QR) codes. In addition, the near ubiquity of wireless networks provides users with access to information virtually anywhere.

BRIEF DESCRIPTION OF THE DRAWINGS

Various ones of the appended drawings merely illustrate example embodiments of the present disclosure and cannot be considered as limiting its scope.

FIG. 1A is a block diagram illustrating a networked system, according to some example embodiments.

FIG. 1B illustrates a block diagram showing components provided within the system of FIG. 1A, according to some example embodiments.

FIG. 2 is a block diagram illustrating an example embodiment of an advertisement system, according to some example embodiments.

FIG. 3 is a depiction of an interactive advertisement, according to some example embodiments.

FIG. 4 is a flow diagram illustrating an example method for identifying an advertisement and presenting item listings, according to some example embodiments.

FIG. 5 is an illustration showing example types of sensors that provide various sensor data, according to some example embodiments.

FIG. 6 is a flow diagram illustrating communication between various entities, according to some example embodiments.

FIG. 7 is a flow diagram illustrating further example operations for presenting item listings based on real-time contextual data, according to some example embodiments.

FIGS. 8 and 9 are flow diagrams illustrating further operations for determining contextual conditions, according to some example embodiments.

FIGS. 10-13 illustrate example user interfaces, according to some example embodiments.

FIG. 14 depicts an example mobile device and mobile operating system interface, according to some example embodiments.

FIG. 15 is a block diagram illustrating an example of a software architecture that may be installed on a machine, according to some example embodiments.

FIG. 16 illustrates a diagrammatic representation of a machine in the form of a computer system within which a set of instructions may be executed for causing the machine to perform any one or more of the methodologies discussed herein, according to an example embodiment.

The headings provided herein are merely for convenience and do not necessarily affect the scope or meaning of the terms used.

DETAILED DESCRIPTION

The description that follows includes systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the disclosure. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art, that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques are not necessarily shown in detail.

Mobile devices provide a variety of data that is measured, captured, or otherwise obtained via sensors such as position sensors to capture position data (e.g., using a Global Positioning System (GPS) component), detection sensors to detect identifiers (e.g., an optical sensor to read QR codes), and the like. Such data may be utilized to augment, supplement, or otherwise enhance advertisements (also referred to as “ads”) or promotions. For example, contextually aware interactive advertisements may be realized using such data. In various implementations, exclusivity associated with a particular advertisement can be implemented using contextual data. A particular (in some cases exclusive) offer or deal corresponding to a particular advertisement may be available to users who are physically being presented the particular advertisement at a particular time and location. The element of exclusivity is intended, in some scenarios, to have the effect of generating excitement or a “buzz” regarding an advertising campaign.

In various embodiments, an advertisement indication that indicates a presentation of an advertisement or promotion to a user is received. For instance, the user may scan or otherwise obtain an advertisement identifier (e.g., scanning a QR code that includes the advertisement identifier) corresponding to an advertisement using a user device (e.g., a smart phone equipped with an optical sensor to scan QR codes). In this instance, the advertisement indication comprises the advertisement identifier scanned by the user, although the advertisement indication can include other information such as location, time, or other contextual data. Subsequent to receiving the advertisement indication, the advertisement corresponding to the advertisement indication is identified. For example, if the advertisement indication includes the advertisement identifier, the advertisement can be identified via a lookup based on the advertisement identifier.
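
To make this lookup concrete, the following is a minimal Python sketch, not the claimed implementation; the payload fields and the ADVERTISEMENTS table are hypothetical illustrations.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical in-memory advertisement table keyed by advertisement identifier.
ADVERTISEMENTS = {
    "AD-1234": {"campaign": "summer-sale", "location": (37.7749, -122.4194)},
}

@dataclass
class AdIndication:
    """Payload a user device might send after scanning a QR code."""
    ad_id: str          # advertisement identifier decoded from the QR code
    scanned_at: float   # Unix timestamp of the scan
    user_location: Optional[Tuple[float, float]]  # (latitude, longitude), if available

def identify_advertisement(indication: AdIndication) -> Optional[dict]:
    # Identify the advertisement via a lookup keyed on the scanned identifier.
    return ADVERTISEMENTS.get(indication.ad_id)
```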

In embodiments, one or more item listings are determined based, at least in part, on the identified advertisement. In a specific example, a predefined set of items is accessed based on the advertisement (e.g., a lookup of the predefined set of items corresponding to the advertisement using the advertisement identifier), and one or more item listings corresponding to items included in the predefined set of items may be identified. In another example, the advertisement corresponds to an item type, and item listings associated with the item type are determined (e.g., identifying item listings on an e-commerce website that match the item type). Many other schemes and techniques may be employed to determine the item listings.
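
Either determination strategy can be sketched as follows; the PREDEFINED_LISTINGS, AD_ITEM_TYPE, and CATALOG structures are invented for illustration, and a real system would instead query the publication system described below.

```python
# Hypothetical data; a production system would query an e-commerce backend.
PREDEFINED_LISTINGS = {"AD-1234": ["listing-1", "listing-2"]}
AD_ITEM_TYPE = {"AD-5678": "running shoes"}
CATALOG = [
    {"listing_id": "listing-9", "item_type": "running shoes"},
    {"listing_id": "listing-7", "item_type": "umbrella"},
]

def determine_item_listings(ad_id: str) -> list:
    # Prefer a predefined set configured by the advertiser, when one exists.
    if ad_id in PREDEFINED_LISTINGS:
        return PREDEFINED_LISTINGS[ad_id]
    # Otherwise, match catalog listings against the advertisement's item type.
    item_type = AD_ITEM_TYPE.get(ad_id)
    return [l["listing_id"] for l in CATALOG if l["item_type"] == item_type]
```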

In further embodiments, contextual data corresponding to the advertisement indication is received. In some cases, the contextual data comprises real-time contextual data. In various implementations, the contextual data corresponds to a physical context or physical environment of the presentation of the advertisement. For instance, the contextual data comprises location data (e.g., as determined by a GPS component of a mobile device of the user) that corresponds to a presentation location of the presentation of the advertisement to the user. Thus, based on the real-time contextual data, a location of where the user is viewing a particular advertisement may be ascertained in real-time.

Presentation of the at least one item listing is caused based on the real-time contextual data. For example, if the user is viewing a particular advertisement at a particular location and time, the item listing is presented to the user. In some examples, the item listings are exclusively available to users that meet contextual conditions such as a location condition (e.g., a distance condition) and a temporal condition. For example, if a user location is not within a distance of an advertisement location, the item listings may not be available to the user. In this manner, the user may interact with a contextually aware interactive advertisement.
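
As a hedged illustration of this gating, the sketch below withholds the item listings unless both a distance condition and a temporal condition hold; the 50-meter radius is an arbitrary example, and the distance itself is assumed to be computed elsewhere (one possible computation appears later in this description).

```python
def may_present_listings(distance_m: float,
                         scanned_at: float,
                         campaign_start: float,
                         campaign_end: float,
                         max_distance_m: float = 50.0) -> bool:
    """Return True only when the scan satisfies both contextual conditions."""
    near_enough = distance_m <= max_distance_m                 # distance condition
    in_window = campaign_start <= scanned_at <= campaign_end   # temporal condition
    return near_enough and in_window
```

If either condition fails, presentation is simply not caused, preserving the exclusivity described above.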

With reference to FIG. 1A, an example embodiment of a high-level client-server-based network architecture 100 is shown. A networked system 102 provides server-side functionality via a network 104 (e.g., the Internet or a wide area network (WAN)) to a client device 110. In some implementations, a user (e.g., user 106) interacts with the networked system 102 using the client device 110. FIG. 1A illustrates, for example, a web client 112 (e.g., a browser, such as the Internet Explorer® browser developed by Microsoft® Corporation of Redmond, Washington), client application(s) 114, and a programmatic client 116 executing on the client device 110. The client device 110 may include the web client 112, the client application(s) 114, and the programmatic client 116 alone, together, or in any suitable combination. Although FIG. 1A shows one client device 110, in other implementations, the network architecture 100 comprises multiple client devices.

In various implementations, the client device 110 comprises a computing device that includes at least a display and communication capabilities that provide access to the networked system 102 via the network 104. The client device 110 comprises, but is not limited to, a remote device, work station, computer, general purpose computer, Internet appliance, hand-held device, wireless device, portable device, wearable computer, cellular or mobile phone, Personal Digital Assistant (PDA), smart phone, tablet, ultrabook, netbook, laptop, desktop, multi-processor system, microprocessor-based or programmable consumer electronics, game console, set-top box, network Personal Computer (PC), mini-computer, and so forth. In an example embodiment, the client device 110 comprises one or more of a touch screen, accelerometer, gyroscope, biometric sensor, camera, microphone, Global Positioning System (GPS) device, and the like.

The client device 110 communicates with the network 104 via a wired or wireless connection. For example, one or more portions of the network 104 comprises an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless WAN (WWAN), a Metropolitan Area Network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a Wireless Fidelity (Wi-Fi®) network, a Worldwide Interoperability for Microwave Access (WiMax) network, another type of network, or any suitable combination thereof.

In some example embodiments, the client device 110 includes one or more of the applications (also referred to as “apps”) such as, but not limited to, web browsers, book reader apps (operable to read e-books), media apps (operable to present various media forms including audio and video), fitness apps, biometric monitoring apps, messaging apps, electronic mail (email) apps, and e-commerce site apps (also referred to as “marketplace apps”). In some implementations, the client application(s) 114 include various components operable to present information to the user and communicate with the networked system 102. In some embodiments, if the e-commerce site application is included in the client device 110, then this application may be configured to locally provide the user interface and at least some of the functionalities, with the application configured to communicate with the networked system 102, on an as-needed basis, for data or processing capabilities not locally available (e.g., access to a database of items available for sale, to authenticate a user, to verify a method of payment). Conversely, if the e-commerce site application is not included in the client device 110, the client device 110 can use its web browser to access the e-commerce site (or a variant thereof) hosted on the networked system 102.

In various example embodiments, the user (e.g., the user 106) comprises a person, a machine, or other means of interacting with the client device 110. In some example embodiments, the user is not part of the network architecture 100, but interacts with the network architecture 100 via the client device 110 or another means. For instance, the user provides input (e.g., touch screen input or alphanumeric input) to the client device 110, and the input is communicated to the networked system 102 via the network 104. In this instance, the networked system 102, in response to receiving the input from the user, communicates information to the client device 110 via the network 104 to be presented to the user. In this way, the user can interact with the networked system 102 using the client device 110.

An Application Program Interface (API) server 120 and a web server 122 may be coupled to, and provide programmatic and web interfaces respectively to, one or more application server(s) 140. The application server(s) 140 may host one or more publication system(s) 142, payment system(s) 144, and an advertisement system 150, each of which may comprise one or more modules or applications and each of which may be embodied as hardware, software, firmware, or any combination thereof. The application server(s) 140 are, in turn, shown to be coupled to one or more database server(s) 124 that facilitate access to one or more information storage repositories or database(s) 126. In an example embodiment, the database(s) 126 are storage devices that store information to be posted (e.g., publications or listings) to the publication system(s) 142. The database(s) 126 may also store digital goods information in accordance with some example embodiments.

Additionally, a third party application 132, executing on a third party server 130, is shown as having programmatic access to the networked system 102 via the programmatic interface provided by the API server 120. For example, the third party application 132, utilizing information retrieved from the networked system 102, may support one or more features or functions on a website hosted by the third party. The third party website may, for example, provide one or more promotional, marketplace, or payment functions that are supported by the relevant applications of the networked system 102.

The publication system(s) 142 may provide a number of publication functions and services to the users that access the networked system 102. The payment system(s) 144 may likewise provide a number of functions to perform or facilitate payments and transactions. While the publication system(s) 142 and payment system(s) 144 are shown in FIG. 1A to both form part of the networked system 102, it will be appreciated that, in alternative embodiments, each system 142 and 144 may form part of a payment service that is separate and distinct from the networked system 102. In some example embodiments, the payment system(s) 144 may form part of the publication system(s) 142.

In some implementations, the advertisement system 150 provides functionality to implement contextually aware interactive advertisements. As such, the advertisement system 150 receives an advertisement indication, identifies the advertisement corresponding to the advertisement indication, determines the item listings based on the advertisement, receives the contextual data, and causes the presentation of the item listings to the user based on the contextual data. In some example embodiments, the system 150 communicates with the client device 110, the third party server(s) 130, the publication system(s) 142 (e.g., retrieving item listings), and the payment system(s) 144 (e.g., purchasing a listing). In an alternative example embodiment, the advertisement system 150 is a part of the publication system(s) 142. The advertisement system 150 will be discussed further in connection with FIG. 2 below.

Further, while the client-server-based network architecture 100 shown in FIG. 1A employs a client-server architecture, the present inventive subject matter is, of course, not limited to such an architecture, and may equally well find application in a distributed, or peer-to-peer, architecture system, for example. The various systems of the application server(s) 140 (e.g., the publication system(s) 142 and the payment system(s) 144) may also be implemented as standalone software programs, which do not necessarily have networking capabilities.

The web client 112 may access the various systems of the networked system 102 (e.g., the publication system(s) 142) via the web interface supported by the web server 122. Similarly, the programmatic client 116 and client application(s) 114 may access the various services and functions provided by the networked system 102 via the programmatic interface provided by the API server 120. The programmatic client 116 may, for example, be a seller application (e.g., the Turbo Lister application developed by eBay® Inc., of San Jose, Calif.) to enable sellers to author and manage listings on the networked system 102 in an off-line manner, and to perform batch-mode communications between the programmatic client 116 and the networked system 102.

FIG. 1B illustrates a block diagram showing components provided within the publication system(s) 142, according to some embodiments. In various example embodiments, the publication system(s) 142 may comprise a marketplace system to provide marketplace functionality (e.g., facilitating the purchase of items associated with item listings on an e-commerce website). The networked system 102 may be hosted on dedicated or shared server machines that are communicatively coupled to enable communications between server machines. The components themselves are communicatively coupled (e.g., via appropriate interfaces) to each other and to various data sources, so as to allow information to be passed between the applications or so as to allow the applications to share and access common data. Furthermore, the components may access one or more database(s) 126 via the database server(s) 124.

The networked system 102 may provide a number of publishing, listing, and price-setting mechanisms whereby a seller (also referred to as a “first user”) may list (or publish information concerning) goods or services for sale or barter, a buyer (also referred to as a “second user”) can express interest in or indicate a desire to purchase or barter such goods or services, and a transaction (such as a trade) may be completed pertaining to the goods or services. To this end, the networked system 102 may comprise a publication engine 160 and a selling engine 162. The publication engine 160 may publish information, such as item listings or product description pages, on the networked system 102. In some embodiments, the selling engine 162 may comprise one or more fixed-price engines that support fixed-price listing and price setting mechanisms and one or more auction engines that support auction-format listing and price setting mechanisms (e.g., English, Dutch, Chinese, Double, Reverse auctions, etc.). The various auction engines may also provide a number of features in support of these auction-format listings, such as a reserve price feature whereby a seller may specify a reserve price in connection with a listing and a proxy-bidding feature whereby a bidder may invoke automated proxy bidding. The selling engine 162 may further comprise one or more deal engines that support merchant-generated offers for products and services.

A listing engine 164 allows sellers to conveniently author listings of items and allows authors to author publications. In one embodiment, the listings pertain to goods or services that a user (e.g., a seller) wishes to transact via the networked system 102. In some embodiments, the listings may be an offer, deal, coupon, or discount for the good or service. Each good or service is associated with a particular category. The listing engine 164 may receive listing data such as title, description, and aspect name/value pairs. Furthermore, each listing for a good or service may be assigned an item identifier. In other embodiments, a user may create a listing that is an advertisement or other form of information publication. The listing information may then be stored to one or more storage devices coupled to the networked system 102 (e.g., database(s) 126). Listings also may comprise product description pages that display a product and information (e.g., product title, specifications, and reviews) associated with the product. In some embodiments, the product description page may include an aggregation of item listings that correspond to the product described on the product description page.

The listing engine 164 also may allow buyers to conveniently author listings or requests for items desired to be purchased. In some embodiments, the listings may pertain to goods or services that a user (e.g., a buyer) wishes to transact via the networked system 102. Each good or service is associated with a particular category. The listing engine 164 may receive as much or as little listing data, such as title, description, and aspect name/value pairs, as the buyer is aware of about the requested item. In some embodiments, the listing engine 164 may parse the buyer's submitted item information and may complete incomplete portions of the listing. For example, if the buyer provides a brief description of a requested item, the listing engine 164 may parse the description, extract key terms, and use those terms to make a determination of the identity of the item. Using the determined item identity, the listing engine 164 may retrieve additional item details for inclusion in the buyer item request. In some embodiments, the listing engine 164 may assign an item identifier to each listing for a good or service.

In some embodiments, the listing engine 164 allows sellers to generate offers for discounts on products or services. The listing engine 164 may receive listing data, such as the product or service being offered, a price or discount for the product or service, a time period for which the offer is valid, and so forth. In some embodiments, the listing engine 164 permits sellers to generate offers from sellers' mobile devices. The generated offers may be uploaded to the networked system 102 for storage and tracking.

Searching the networked system 102 is facilitated by a searching engine 166. For example, the searching engine 166 enables keyword queries of listings published via the networked system 102. In example embodiments, the searching engine 166 receives the keyword queries from a device of a user and conducts a review of the storage device storing the listing information. The review will enable compilation of a result set of listings that may be sorted and returned to the client device 110 of the user. The searching engine 166 may record the query (e.g., keywords) and any subsequent user actions and behaviors (e.g., navigations, selections, or click-throughs).

The searching engine 166 also may perform a search based on a location of the user. A user may access the searching engine 166 via a mobile device and generate a search query. Using the search query and the user's location, the searching engine 166 may return relevant search results for products, services, offers, auctions, and so forth to the user. The searching engine 166 may identify relevant search results both in a list form and graphically on a map. Selection of a graphical indicator on the map may provide additional details regarding the selected search result. In some embodiments, the user may specify, as part of the search query, a radius or distance from the user's current location to limit search results.

In a further example, a navigation engine 168 allows users to navigate through various categories, catalogs, or inventory data structures according to which listings may be classified within the networked system 102. For example, the navigation engine 168 allows a user to successively navigate down a category tree comprising a hierarchy of categories (e.g., the category tree structure) until a particular set of listings is reached. Various other navigation applications within the navigation engine 168 may be provided to supplement the searching and browsing applications. The navigation engine 168 may record the various user actions (e.g., clicks) performed by the user in order to navigate down the category tree.

In some example embodiments, a personalization engine 170 provides functionality to personalize various aspects of user interactions with the networked system 102. For instance, the user can define, provide, or otherwise communicate personalization settings used by the personalization engine 170 to determine interactions with the networked system 102. In further example embodiments, the personalization engine 170 determines personalization settings automatically and personalizes interactions based on the automatically determined settings. For example, the personalization engine 170 determines a native language of the user and automatically presents information in the native language.

FIG. 2 is a block diagram of the advertisement system 150 that provides functionality to implement contextually aware interactive advertisements, according to some example embodiments. In an example embodiment, the advertisement system 150 includes a presentation module 210, a communication module 220, an analysis module 230, an item module 240, a condition module 250, and an offer module 260. All, or some, of the modules 210-260 of FIG. 2 communicate with each other either directly or indirectly, for example, via a network coupling, shared memory, and the like. It will be appreciated that each module of the modules 210-260 may be implemented as a single module, combined into other modules, further subdivided into multiple modules, or any suitable combination thereof. Other modules not pertinent to example embodiments may also be included, but are not shown.

The presentation module 210 provides various presentation and user interface functionality operable to interactively present and receive information from the user. For instance, the presentation module 210 can cause presentation of the determined item listings to the user. In various implementations, the presentation module 210 presents or causes presentation of information (e.g., visually displaying information on a screen, acoustic output, haptic feedback). Interactively presenting is intended to include the exchange of information between a particular device and the user. The user may provide input to interact with the user interface in many possible manners such as alphanumeric, point-based (e.g., cursor), tactile, or other input (e.g., touch screen, tactile sensor, light sensor, infrared sensor, biometric sensor, microphone, gyroscope, accelerometer, or other sensors), and the like. It will be appreciated that the presentation module 210 provides many other user interfaces to facilitate functionality described herein. Further, it will be appreciated that “presenting” as used herein is intended to include communicating information or instructions to a particular device that is operable to perform presentation based on the communicated information or instructions.

The communication module 220 provides various communications functionality and web services. For example, the communication module 220 provides network communication such as communicating with the networked system 102, the client device 110, and the third party server(s) 130. In a specific example, the communication module 220 receives the advertisement indication from a user device (e.g., a smart phone) of the user. In another specific example, the communication module 220 receives contextual data corresponding to the advertisement indication. In some instances, the contextual data is real-time contextual data. In various example embodiments, the network communication may operate over wired or wireless modalities. Web services are intended to include retrieving information from the third party server(s) 130, the database(s) 126, and the application server(s) 140. In some implementations, information retrieved by the communication module 220 comprises data associated with the user (e.g., user profile information from an online account, social network service data associated with the user), data associated with one or more items listed on an e-commerce website (e.g., images of the item, reviews of the item, item price), or other data to facilitate the functionality described herein.

The analysis module 230 provides a variety of analysis functions to facilitate the functionality described herein. For example, the analysis module 230 identifies the advertisement corresponding to the advertisement indication. More specifically, the analysis module 230 performs a lookup of the advertisement identifier included in the advertisement indication to identify the advertisement, according to some implementations. In some implementations, the analysis module 230 extracts information from the contextual data such as a user location, a presentation time, a user identity, and so on.

In some further implementations, the analysis module 230 accesses user data corresponding to the user. For instance, user data may include calendars (e.g., user calendar events such as birthdays, trips, exams), user profiles (e.g., demographic information such as age, gender, income level), purchase histories, browse histories (e.g., search terms), social media content (e.g., check-ins, posts, connections), other user data (e.g., bookmarked websites, preferences or settings for various applications, application usage data such as time spent using a particular application), and the like. In various example embodiments, the analysis module 230 accesses, retrieves, or otherwise obtains the user data from the database(s) 126, the third party server(s) 130, the publication system(s) 142, or elsewhere.

The item module 240 provides functionality to determine the item listings, according to some implementations. In various implementations, the item listings correspond to items available for purchase such as a listing on an e-commerce website. The item module 240 may employ a variety of schemes and techniques to determine the item listings based on various data. In an embodiment, the item module 240 accesses a predefined set of item listings corresponding to the advertisement and determines one or more item listings among the set of item listings. The predefined set of item listings may be configured by an operator, advertiser, or another party associated with the advertisement. In another example, the item module 240 determines the item listings from an e-commerce website (e.g., eBay®) based on the advertisement (e.g., an item type or brand corresponding to the advertisement).

The condition module 250 provides functionality to implement contextual conditions in conjunction with the advertisement, according to some embodiments. For instance, the condition module 250 accesses contextual conditions associated with the advertisement and evaluates satisfaction of the contextual conditions. For example, if the contextual conditions include a distance condition, the condition module 250 determines satisfaction of the distance condition based on the user location and the advertisement location. The contextual conditions are intended to create exclusivity in association with the advertisement. In other words, the interactive features of the advertisement may be available under specified conditions associated with the contextual data and otherwise unavailable to the user.

The offer module 260 provides functionality to generate advertisement offers associated with the item listing, according to some embodiments. In an embodiment, the offer comprises a discount for a purchase associated with the item listing. In other embodiments, the offer module 260 provides advertisement features that are specific to the advertisement. The advertisement features comprise, for example, free shipping, faster shipping, otherwise unavailable item features (e.g., a color or style not widely available), exclusive items, and so forth. The offer module 260 may provide a variety of offers to the user, and in some cases, the offers may be based on various data such as the contextual data, user data, and so forth.

Referring now to FIG. 3, a depiction 300 of an interactive advertisement is shown, according to some example embodiments. Scene 310 depicts an advertisement 320 that includes a tag 330 (e.g., a QR code embedded on or near the advertisement 320). In some implementations, the tag 330 is embedded in the advertisement 320, and in other implementations, the tag 330 is merely in the vicinity of the advertisement 320. In some implementations, the user device 350 detects the advertisement identifier encoded in the tag 330 via a signal 340. For instance, the signal 340 may be an optical signal captured, detected, or otherwise obtained by the user device 350, with the user device 350 being operable to decode the signal to extract the advertisement identifier.

In an embodiment, the tag 330 comprises a QR code that is readable by an app executing on a mobile device of the user that includes a camera sensor. In other embodiments, the tag 330 comprises a Radio Frequency Identification (RFID) tag, Near Field Communication (NFC) tag, smart tag, or another storage device operable to store the advertisement identifier and communicate the advertisement identifier to the user device 350 (see FIG. 5 for additional sensors operable to detect identifiers). In still other embodiments, a tagless identification of the advertisement may be implemented by comparing a user location to respective advertisement locations corresponding to a plurality of advertisements and identifying a match between a particular advertisement location and the user location.

In some implementations, the user device 350 is communicatively coupled, via coupling 360, to the network 104, which is in turn communicatively coupled to the networked system 102 including the advertisement system 150 (discussed above in connection with FIG. 1A). User 370 may initiate the identification of the advertisement 320 by operating the user device 350. For example, the user device 350 executes an app operable to obtain the advertisement identifier and to present a user interface that includes the item listings to the user.

In the example depiction 300, the user 370 is carrying the user device 350 (e.g., a smart phone or smart watch) and may be interested in the advertisement 320. The user 370 initiates the identification of the advertisement 320 by detecting the tag 330 with the user device 350. The user device 350 may extract the advertisement identifier from the tag 330 (e.g., scanning a QR code). Subsequently, the advertisement indication that includes the advertisement identifier corresponding to the tag 330 and the advertisement 320 is communicated from the user device 350 to the communication module 220 via the network 104. The analysis module 230 identifies the advertisement corresponding to the advertisement indication (e.g., a lookup of the advertisement based on the advertisement identifier). Once the analysis module 230 identifies the advertisement, the item module 240 determines the item listings, based, at least in part, on the identified advertisement. For instance, the item module 240 may access a predefined set of item listings or dynamically determine item listings corresponding to the advertisement. In some implementations, the item module 240 retrieves the item listings and associated item data from the publication system(s) 142. The item data may include item images, price, description, brand, and so forth.

In various embodiments, the communication module 220 receives the contextual data corresponding to the advertisement indication from the user device 350. In some embodiments, the contextual data includes location data (e.g., as determined by a GPS component of the user device 350). In an embodiment, based on the contextual data, the presentation module 210 causes presentation of the item listings (e.g., by communicating the item listings and instructions to present the item listings to the user device 350). For example, the condition module 250 determines satisfaction of the contextual conditions and, based on the satisfaction of the contextual conditions, the presentation module 210 causes presentation of the item listings to the user.

In a specific example embodiment, the contextual conditions include the distance condition. In this example embodiment, the analysis module 230 identifies the advertisement location corresponding to the advertisement (e.g., the advertisement location may be predefined and accessed by the analysis module 230) and extracts the user location from the contextual data (e.g., the contextual data includes GPS data from the user device 350). The condition module 250 determines satisfaction of the distance condition by determining that the user location is within a distance of the advertisement location. The distance can be predefined or dynamically determined based on the contextual data. In this example, if the contextual conditions are satisfied, the presentation module 210 causes presentation of the item listings. Conversely, if the contextual conditions are not satisfied, the presentation module 210 does not cause presentation of the item listings. Put another way, in the context of this example embodiment, if the user interacts with the advertisement from another location that is outside the distance specified by the distance condition (e.g., scanning the same or a similar QR code from a remote location), the user may not be able to view the item listings. In this way, the presentation of the item listings may be exclusive to users (e.g., the user 370) that are physically in the vicinity of the advertisement 320. Thus, in the depiction 300, the user 370 may interact with the advertisement 320 in a contextually aware manner.
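
One plausible way to evaluate such a distance condition is the haversine formula over GPS coordinates, as in the following sketch; the 50-meter threshold is an invented example rather than a value prescribed by this description.

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two (latitude, longitude) points."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def distance_condition_satisfied(user_loc: tuple, ad_loc: tuple,
                                 max_distance_m: float = 50.0) -> bool:
    # Satisfied when the reported user location falls within the configured
    # radius of the advertisement's known location.
    return haversine_m(*user_loc, *ad_loc) <= max_distance_m
```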

FIG. 4 is a flow diagram illustrating an example method 400 for identifying the advertisement and presenting the item listings, according to some example embodiments. The operations of the method 400 may be performed by components of the advertisement system 150. At operation 410, the communication module 220 may receive, from the user device, the advertisement indication that indicates the presentation of the advertisement or a promotion to the user. For example, the advertisement indication can comprise the advertisement identifier extracted from a QR code, an RFID tag, an NFC tag, a smart tag, an audio signal (e.g., audio tagging), or the contextual data (e.g., a location mapping). In some implementations, the analysis module 230 extracts the advertisement identifier from various suitable combinations of tags and the contextual data. In some example embodiments, the advertisement indication includes the contextual data corresponding to the physical context or physical environment of the presentation of the advertisement to the user. In other example embodiments, the contextual data is received, retrieved, or otherwise obtained as a separate operation as discussed below in connection with operation 440.

In some implementations, the user initiates the advertisement identification (e.g., the user activates a user interface element on the user device to begin the advertisement identification). In other implementations, the advertisement identification initiates automatically via the analysis module 230 monitoring, tracking, or otherwise automatically detecting the advertisement indication. In these implementations, the analysis module 230 monitors the contextual data, received by the communication module 220, for the advertisement indication. For instance, the contextual data may include location data pertaining to the user. In this instance, the analysis module 230 automatically detects the advertisement indication based on the location data (e.g., mapping the user location with a plurality of advertisement locations).

In some embodiments, a QR code, an RFID tag, an NFC tag, a smart tag, or the like is embedded in the advertisement or in the vicinity of the advertisement. The user may initiate the advertisement identification by physically detecting a particular tag corresponding to the advertisement (e.g., physically moving a mobile device operable to detect RFID tags within a detection range of the RFID tag corresponding to the advertisement). In some implementations, the advertisement indication includes an advertisement identifier extracted from the tag (e.g., extracted by the user device).

In other embodiments, the advertisement may include an audio component (e.g., a television advertisement, a radio advertisement, a loud speaker announcement). In this example, the user device 350 or the analysis module 230 extracts the advertisement identifier using audio tag identification software. For example, an app executing on the user device 350, operable to detect and extract the advertisement identifier from an audio signal detected by the user device 350, communicates the advertisement indication including the advertisement identifier to the communication module 220. In an alternative example, the user device 350 communicates the advertisement indication including the audio signal to the communication module 220, and the analysis module 230 subsequently extracts the advertisement identifier from the audio signal.

In alternative embodiments, the analysis module 230 extracts the advertisement indication from the contextual data. For example, the analysis module 230 extracts the user location from the contextual data, accesses location data corresponding to a plurality of advertisements, and identifies a match between the user location and the location data of a particular advertisement among the plurality of advertisements. In this way, the analysis module 230 identifies a particular advertisement being presented to the user based on the contextual data.

At operation 420, the analysis module 230 identifies the advertisement corresponding to the advertisement indication. In an embodiment, the advertisement indication includes the advertisement identifier, and the analysis module 230 identifies the advertisement based on the advertisement identifier. For instance, the analysis module 230 performs a lookup of the advertisement using the advertisement identifier (e.g., a lookup table in a database, such as database(s) 126, indexed with advertisement identifiers). In another embodiment, the advertisement indication includes contextual data that the analysis module 230 uses to identify the advertisement. For instance, if the contextual data includes the user location, the analysis module 230 compares the user location to the advertisement locations (e.g., stored in a database such as database(s) 126).

At operation 430, the item module 240 determines one or more item listings based, at least in part, on the advertisement. For example, an operator, advertiser, or another party associated with the advertisement may specify a predefined set of item listings for the advertisement. In this example, the item module 240 accesses the predefined set of item listings for the advertisement and determines a portion of the predefined set of item listings. For instance, the advertisement may depict a particular celebrity, and the predefined set of item listings may include item listings endorsed by the particular celebrity. In another implementation, the item module 240 dynamically determines the item listings based on the advertisement. For instance, the item module 240 identifies item listings from an e-commerce website that are associated with the advertisement (e.g., same or similar type of item or brand as promoted by the advertisement).

In further implementations, the item module 240 determines the item listings based on the contextual data, user data, or other data. For example, the item module 240 determines the item listings by first identifying item listings on an e-commerce website associated with the advertisement and then refining the identified item listings based on the user data. In a specific example, if the user data indicates a gender of the user, various ones of the identified item listings may be excluded based on the gender (e.g., gender-specific apparel items). In a further example, the item module 240 may employ the contextual data to determine item listings pertinent to the user. For instance, the item module 240 may use the user location included in the contextual data as a basis for determining the item listings (e.g., if the user is near a store that sells a particular item, including an item listing corresponding to the particular item). In another example, the contextual data may indicate weather conditions such as a cold day, and the item module 240 may identify item listings based on the weather conditions (e.g., item listings associated with cold weather such as hot coffee).
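
The refinement described above might be sketched as follows, where the gender and temperature_c fields are hypothetical stand-ins for the user data and weather context.

```python
def refine_listings(listings: list, user_data: dict, context: dict) -> list:
    """Filter candidate item listings using user data and contextual data."""
    refined = listings
    # Exclude gender-specific apparel that does not match the user's profile.
    gender = user_data.get("gender")
    if gender:
        refined = [l for l in refined
                   if l.get("target_gender") in (None, gender)]
    # On a cold day, prefer listings tagged as cold-weather items.
    if context.get("temperature_c", 20.0) < 5.0:
        cold = [l for l in refined if "cold-weather" in l.get("tags", [])]
        refined = cold or refined  # fall back when nothing matches
    return refined
```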

At operation 440, the communication module 220 receives contextual data corresponding to the advertisement indication. In various implementations, the contextual data includes real-time contextual data. The term “real-time data,” as used herein, is intended to include data associated with an event currently happening. For example, the real-time data may include user input data or sensor data communicated to the communication module 220 after a delay interval (e.g., due to transmission delay or other delays such as being temporarily stored at an intermediate device) between capturing the data and the communication module 220 receiving the data.

In further embodiments, the communication module 220 stores the contextual data (e.g., in a storage device such as the database(s) 126) in association with the user and the advertisement (e.g., in a database indexed by a user identifier or the advertisement identifier). In some implementations, the item module 240 determines the item listings based on the stored contextual data. For example, the stored contextual data may indicate that the user has previously initiated identification of a particular advertisement. Based on the indication of the previous presentation of the advertisement to the user, the item module 240 may determine different item listings than those previously presented to the user.

In some implementations, the real-time contextual data corresponds to the physical context of the presentation of the advertisement. The physical context includes the presentation location (e.g., where the advertisement is being presented to the user), a presentation time (e.g., the time the advertisement is being presented to the user), an ambient noise level (e.g., a decibel level corresponding to a noise level of the advertisement presentation), an ambient temperature, an ambient illumination level, biometric data associated with the user, and so on. In various implementations, the communication module 220 receives the contextual data from sensors of the user device as further discussed in connection with FIG. 5, below.
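
For concreteness, a real-time contextual data payload assembled from such sensors might resemble the following; every field name is illustrative rather than prescribed by this description.

```python
contextual_data = {
    "user_id": "user-42",
    "ad_id": "AD-1234",
    "presentation_location": {"lat": 37.7749, "lon": -122.4194},
    "presentation_time": 1377302400,       # Unix timestamp
    "ambient_noise_db": 62.5,              # from a microphone
    "ambient_temperature_c": 18.0,         # from a temperature sensor
    "ambient_illumination_lux": 320.0,     # from a photometer
    "biometrics": {"heart_rate_bpm": 74},  # from biometric components
}
```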

Referring now to FIG. 5, example diagram 500 depicts non-limiting example sensor components 510 that may provide sensor data, according to some example embodiments. In example embodiments, the sensor components 510 include motion components 520, position components 530, environmental components 540, biometric components 550, detection components 560, and a wide gamut of other sensors, gauges, and measurement components not shown in FIG. 5. The sensor components 510 or a suitable combination of the sensor components 510 may be included in any suitable device or machine of FIG. 1A, such as the client device 110, to facilitate the functionality described herein.

The sensor components 510 may receive, detect, measure, capture, or otherwise obtain sensor data associated with physical properties, attributes, or characteristics. The sensor components 510 may provide, produce, transmit, or otherwise communicate the sensor data or other indications associated with the physical properties, attributes, or characteristics (e.g., a sensor included in a device operable to communicate the sensor data to the networked system 102). In some implementations, a combination of devices may be employed to provide the sensor data (e.g., a first device that includes a sensor and is communicatively coupled to a second device that communicates sensor data received from the first device to the networked system 102). As a result, the sensor data provided by the sensor components 510 may be accessible to all, or some, of the modules described above on a real-time or near real-time basis. The sensor components 510 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting.

The motion components 520 include acceleration sensors (e.g., accelerometer), gravitation sensors, rotation sensors (e.g., gyroscope), and so forth. The motion components 520 may provide motion data such as velocity, acceleration, or other force measurements along the x, y, and z axes. In some implementations, the motion data is provided at a regular update rate or sampling rate (e.g., 10 updates per second) that may be configurable.

The position components 530 include location sensors (e.g., a Global Positioning System (GPS) receiver component), altitude sensors (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensors (e.g., magnetometers that provide magnetic field strength along the x, y, and z axes), and the like. In an example embodiment, the position components 530 may provide position data such as latitude, longitude, altitude, and a time stamp. Similar to the motion components 520, the position components 530 may provide the position data at a regular update rate that may be configurable.

The environmental components 540 include illumination sensors (e.g., photometer), temperature sensors (e.g., one or more thermometers that detect ambient temperature), humidity sensors, pressure sensors (e.g., barometer), acoustic sensors (e.g., one or more microphones that detect background noise), proximity sensors (e.g., an infrared sensor that detects nearby objects), gas sensors (e.g., machine olfaction detection sensors, gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), and so on. The environmental components 540 may measure various physical parameters to provide an indication or signal corresponding to the physical environment surrounding the environmental components 540.

The biometric components 550 include components to detect expressions, measure biosignals, or identify people, among other functions. For example, the biometric components 550 include expression components to detect expressions (also referred to as “kinesics”) such as hand gestures (e.g., an optical component to detect a hand gesture or a Doppler component to detect hand motions), vocal expressions (e.g., a microphone to detect changes in voice pitch that may indicate tension), facial expressions (e.g., a camera to detect expressions or micro-expressions of a person such as a smile), body gestures, and eye tracking (e.g., detecting the focal point of a person's eyes or patterns in eye movement). The biometric components 550 may also include, for example, biosignal components to measure biosignals such as blood pressure, heart rate, body temperature, perspiration, and brain waves (e.g., as determined by an electroencephalogram). In further examples, the biometric components 550 include identification components to identify people such as retinal scanners (e.g., a camera component), vocal detectors (e.g., a microphone to receive audio data for voice identification), facial detectors, fingerprint detectors, and electroencephalogram sensors (e.g., to identify a person via unique brain wave patterns).

The detection components 560 provide functionality to detect a variety of identifiers. For example, the detection components 560 include Radio Frequency Identification (RFID) tag reader components, Near Field Communication (NFC) smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar codes, multi-dimensional bar codes such as a Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, Uniform Commercial Code Reduced Space Symbology (UCC RSS)-2D bar code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals). In addition, a variety of information may be derived via various communication components such as location via Internet Protocol (IP) geo-location, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth.

Referring back to FIG. 4, at operation 450, the presentation module 210 causes presentation of the item listings based on the real-time contextual data. For instance, the presentation module 210 communicates the item listings to the user device with instructions to cause presentation of the item listings to the user. In response to receiving the instructions to cause the presentation, the user device generates a user interface including the item listings and displays the user interface to the user, according to some implementations. In alternative implementations, the presentation module 210 generates the user interface including the item listings and communicates the generated user interface to the user device for presentation to the user.

In further embodiments, the offer module 260 generates an advertisement offer associated with the item listings. Subsequently, the presentation module 210 provides the advertisement offer to the user (e.g., a user interface configured to present the offer and receive an indication of a selection of the offer). In some instances, the offer module 260 generates the offer based on the contextual data. For example, if the contextual data indicates the user location is within a distance of a store that sells a particular item associated with the item listings, the offer module 260 may generate an offer for the particular item on that basis (e.g., a discount to entice the user to stop by the store). In various implementations, the offer comprises discounts, coupons, free shipping, or exclusive item features (e.g., an item style otherwise unavailable) associated with the item listings.
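
A minimal sketch of such context-based offer generation follows; the proximity threshold and discount value are invented for illustration.

```python
from typing import Optional

def generate_offer(listing_id: str, distance_to_store_m: float,
                   max_distance_m: float = 500.0) -> Optional[dict]:
    """Generate a discount offer when the user is near a relevant store."""
    if distance_to_store_m <= max_distance_m:
        return {"listing_id": listing_id,
                "type": "discount",
                "percent_off": 10,  # illustrative discount value
                "reason": "user near store"}
    return None  # no contextual basis for an offer

# Example: a user 120 meters from the store receives the offer.
offer = generate_offer("listing-1", 120.0)
```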

In still further embodiments, the presentation module 210 or the offer module 260 may provide exclusive features (e.g., the exclusive features may be included in the presentation) associated with the item listings to the user. For example, the exclusive features may include various promotional techniques such as offering items that are otherwise not available or of limited availability (e.g., a book including an author autograph). In this embodiment, the exclusive features are intended to incentivize the user to interact with the advertisement.

FIG. 6 is a flow diagram 600 illustrating communication between various entities, according to some example embodiments. At operation 606, user 602 initiates advertisement indication capture. For example, the user 602 activates an app executing on a mobile device of the user to capture the advertisement indication. At operation 608, user device 604 captures the advertisement indication. As discussed above, the user device 604 captures the advertisement indication using a variety of techniques such as QR code scanning, RFID tag detection, NFC tag detection, smart tag detection, audio tag detection, user location mapping, and so on.

As discussed above, the advertisement system 150 receives the advertisement indication at the operation 410, identifies the advertisement at the operation 420, determines the item listings at the operation 430, receives the contextual data at the operation 440, and causes presentation of the item listings at the operation 450. At the operation 450, the advertisement system 150 communicates the item listings to the user device 604 for presentation to the user, according to some implementations.

At operation 610, the user device 604 presents the item listings to the user (e.g., via a user interface that includes the item listings). The user 602 may receive (e.g., view) the presentation at operation 612 and may select an option to make a purchase associated with the item listings at the operation 614. For instance, the user interface that includes the item listings may be configured to receive a selection to make a purchase associated with the item listings. At operation 616, the user device 604 receives the selection to make a purchase associated with the item listings. The user device 604 communicates the selection to make the purchase to the advertisement system 150 (e.g., received at the communication module 220). At operation 618, the advertisement system 150 may facilitate the purchase associated with the item listings. For instance, the offer module 260 may perform the transaction for the purchase.

FIG. 7 is a flow diagram illustrating further example operations for presenting item listings based on real-time contextual data, according to some example embodiments. Subsequent to the operation 440, the presentation module 210 causes presentation of the item listings based on the contextual data at the operation 450. In addition, at operation 710, the condition module 250 accesses contextual conditions associated with the advertisement. For example, the contextual conditions may include a location condition, a temporal condition, or other conditions.

At operation 720, the condition module 250 determines satisfaction of the contextual conditions based on the contextual data. The condition module 250 evaluates some, or all, of the conditions included in the contextual conditions (e.g., the condition module 250 iterates through and evaluates each condition included in the contextual conditions). For example, if the contextual conditions include a location condition and a temporal condition, the condition module 250 may determine satisfaction of the contextual conditions if either or both of the location condition and the temporal condition are satisfied.

In some implementations, the satisfaction of the contextual conditions is determined based on a weighting of the satisfaction of the conditions included in the contextual conditions (e.g., the location condition may be associated with a higher weight and given more influence when the condition module 250 determines satisfaction of the contextual conditions). The weighting may be predetermined or dynamically determined (e.g., weighting based on feedback data or other engagement data, such as the user or similar users showing interest in a particular item listing by tapping or clicking on the particular item listing). In an example implementation, the condition module 250 may calculate a contextual condition metric based on the satisfaction of respective conditions included in the contextual conditions. In this implementation, the condition module 250 determines satisfaction of the contextual conditions when the contextual condition metric exceeds a threshold.
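
A minimal sketch of the weighted-metric implementation follows, assuming each condition is represented as a boolean predicate over the contextual data; the particular weights, predicates, and threshold are illustrative assumptions.

```python
def contextual_condition_metric(conditions, weights, contextual_data):
    """Weighted share of satisfied conditions, normalized to [0, 1]."""
    total = sum(weights.values())
    score = sum(weights[name] for name, predicate in conditions.items()
                if predicate(contextual_data))
    return score / total

def conditions_satisfied(conditions, weights, contextual_data, threshold=0.5):
    """True when the contextual condition metric exceeds the threshold."""
    return contextual_condition_metric(conditions, weights, contextual_data) > threshold

# Illustrative use: the location condition is weighted more heavily than
# the temporal condition, so satisfying location alone clears the threshold.
conditions = {
    "location": lambda ctx: ctx.get("distance_to_ad_m", float("inf")) <= 100,
    "temporal": lambda ctx: ctx.get("minutes_since_scan", float("inf")) <= 15,
}
weights = {"location": 0.7, "temporal": 0.3}
print(conditions_satisfied(conditions, weights,
                           {"distance_to_ad_m": 40, "minutes_since_scan": 30}))
# -> True: the metric is 0.7 > 0.5 even though the temporal condition failed
```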

At operation 730, the presentation module 210 causes presentation of the item listings based on the satisfaction of the contextual conditions. For example, if the condition module 250 determines that the contextual conditions are satisfied, the presentation module 210 may then cause presentation of the item listings. Conversely, if the condition module 250 determines the contextual conditions are not satisfied, the presentation module 210 does not cause presentation of the item listings.

In further embodiments, the condition module 250 monitors the contextual data to determine whether the contextual conditions remain satisfied after the presentation of the item listings. For instance, if the contextual conditions include the temporal condition, the condition module 250 may determine, after the item listings are presented to the user, that the temporal condition is no longer satisfied and restrict the presentation of the item listings or restrict features associated with the item listings (e.g., remove or disable an option to purchase the item listings). In this way, the condition module 250 may create an exclusive experience associated with the advertisement available to users who physically interact with the advertisement. The local or ephemeral nature of the presentation of the item listings created by employing the contextual conditions may have the effect of generating demand, interest, excitement, or a “buzz” regarding an advertising campaign and associated item listings.
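
One way such post-presentation monitoring might be sketched is shown below; the callback names are hypothetical, and a production implementation could use event-driven checks rather than polling.

```python
import time

def monitor_presentation(conditions_still_met, fetch_context, restrict, poll_s=60):
    """Hypothetical monitor: once the contextual conditions lapse after the
    item listings are presented, restrict the presentation (e.g., remove or
    disable the purchase option)."""
    while conditions_still_met(fetch_context()):
        time.sleep(poll_s)  # re-check the real-time contextual data periodically
    restrict()
```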

FIG. 8 is a flow diagram illustrating further operations for determining contextual conditions, according to some example embodiments. Subsequent to the operation 710, the condition module 250 may determine satisfaction of the contextual conditions at the operation 720. In some implementations, the operation 720 includes additional operations as shown in FIG. 8, such as when the contextual conditions include the distance condition. At operation 810, the analysis module 230 identifies the advertisement location corresponding to the advertisement. For instance, the advertisement may have a fixed location (e.g., a poster affixed to a wall). An operator, advertiser, or another party may assign a predefined advertisement location to the advertisement (e.g., longitude, latitude, and altitude coordinates for the poster location) and store the advertisement location in a storage device such as database(s) 126. For instance, the analysis module 230 identifies the advertisement location by accessing the advertisement location based on the advertisement identifier (e.g., the advertisement location stored in a database according to the advertisement identifier).

In some implementations, the analysis module 230 automatically determines the location of the advertisement based on the contextual data. For example, the contextual data may indicate a location of a mobile device of the user (e.g., via a GPS component of the mobile device). The mobile device may further detect the advertisement using any of the short range communication techniques described above (e.g., QR code scanning, RFID detection). In this implementation, based on the location of the mobile device and the use of a short range detection technique, the analysis module 230 may infer that the location of the advertisement is in the vicinity of the mobile device when the mobile device detects the advertisement. In further implementations, the analysis module 230 stores (e.g., in a storage device such as database(s) 126) the automatically determined advertisement location to be used in subsequent analysis. Thus, if the user detects the advertisement using a device without location services, the analysis module 230 may access a stored advertisement location corresponding to the advertisement for the user. In some instances, the analysis module 230 stores the automatically determined advertisement locations from many users and identifies the true advertisement location using statistical analysis (e.g., an average or standard-deviation-based analysis). Many other schemes and techniques may be employed by the analysis module 230 to automatically determine the advertisement location.
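
A hypothetical sketch of such a statistical analysis follows, averaging the location fixes reported by many devices after discarding outliers beyond a standard-deviation bound; the function and parameter names are assumptions.

```python
from statistics import mean, stdev

def estimate_ad_location(reported_fixes, max_sigma=2.0):
    """Estimate a fixed advertisement location from (lat, lon) fixes reported
    by many users' devices, discarding outliers beyond max_sigma standard
    deviations before averaging. Illustrative sketch only."""
    lats = [lat for lat, _ in reported_fixes]
    lons = [lon for _, lon in reported_fixes]
    if len(reported_fixes) < 3:
        return mean(lats), mean(lons)  # too few samples to reject outliers
    lat_m, lon_m = mean(lats), mean(lons)
    lat_s = stdev(lats) or 1e-9  # guard against zero spread
    lon_s = stdev(lons) or 1e-9
    kept = [(lat, lon) for lat, lon in reported_fixes
            if abs(lat - lat_m) <= max_sigma * lat_s
            and abs(lon - lon_m) <= max_sigma * lon_s]
    return mean(lat for lat, _ in kept), mean(lon for _, lon in kept)
```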

At operation 820, the analysis module 230 extracts the user location from the contextual data. For example, the contextual data may include location data received from a mobile device of the user operable to provide location as determined by a GPS component of the mobile device. In some implementations, the analysis module 230 may infer the location of the user based on the detection of the advertisement using a short range communication technique, similar to that discussed above in connection with the operation 810. In this implementation, if the advertisement location is known (e.g., predefined by an operator, or the automatically determined advertisement location is stored from a different user), the analysis module 230 may infer the user location is in the vicinity of the advertisement location based on the user detecting the advertisement using a short range communication technique.

At operation 830, the condition module 250 determines satisfaction of the distance condition by determining that the user location is within a distance of the advertisement location. The distance may be predefined (e.g., specified by the advertiser) or dynamically determined.
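
A minimal sketch of the distance-condition check follows, using the haversine great-circle distance between the user location and the advertisement location; the 100-meter default is an illustrative assumption.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def distance_condition_satisfied(user_loc, ad_loc, max_distance_m=100.0):
    """True when the user location is within max_distance_m of the ad."""
    return haversine_m(*user_loc, *ad_loc) <= max_distance_m

# Illustrative use with a predefined poster location:
print(distance_condition_satisfied((37.7749, -122.4194), (37.7752, -122.4190)))
```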

FIG. 9 is a flow diagram illustrating further operations for determining contextual conditions, according to some example embodiments. Subsequent to the operation 710, the condition module 250 may determine satisfaction of the contextual conditions at the operation 720. In some implementations, the operation 720 includes additional operations as shown in FIG. 9, such as the contextual conditions including the temporal condition. At operation 910, the analysis module 230 extracts a presentation time from the real-time contextual data or the advertisement indication. The presentation time is intended to include a time when the user is being presented the advertisement. For example, the advertisement indication may include a time stamp of when the user device detected the advertisement (e.g., when the QR code embedded in the advertisement was scanned).

At operation 920, the condition module 250 determines satisfaction of the temporal condition by determining that the presentation time is within a time period (e.g., 15 minutes or one week). The time period may be predefined or dynamically determined (e.g., a time period based on the length of time the user viewed the advertisement as determined by the contextual data).
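
A minimal sketch of the temporal-condition check follows, assuming the presentation time is an aware UTC timestamp and the time period is a predefined window; the 15-minute default is an illustrative assumption.

```python
from datetime import datetime, timedelta, timezone

def temporal_condition_satisfied(presentation_time, window=timedelta(minutes=15)):
    """True when the extracted presentation timestamp falls within the
    allowed time period (a predefined window in this sketch)."""
    elapsed = datetime.now(timezone.utc) - presentation_time
    return timedelta(0) <= elapsed <= window

# Illustrative use with a time stamp from a scanned QR code:
scanned_at = datetime.now(timezone.utc) - timedelta(minutes=5)
print(temporal_condition_satisfied(scanned_at))  # -> True
```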

Although FIGS. 8 and 9 illustrate the contextual conditions including the distance condition and the temporal condition, it will be appreciated that many other conditions may be included in the contextual conditions. For example, an environmental condition may be implemented based on environmental data included in the contextual data. For instance, the condition module 250 may implement conditions based on ambient noise data, ambient illumination data, or other environmental data corresponding to the physical context of the presentation of the advertisement. For example, if the ambient noise data indicates the physical context of the presentation of the advertisement is noisy (e.g., an audio decibel level exceeding a threshold), the presentation module 210 may omit an audio component of the presentation of the item listings, as the user is unlikely to receive an audio presentation under such conditions. In another instance, the condition module 250 may implement conditions based on biometric data corresponding to the user being presented the advertisement. For instance, heart rate data included in the contextual data may indicate the user is jogging or performing some other vigorous physical activity. The condition module 250 may target physically active users by implementing a biometric condition based on, for example, the heart rate exceeding a threshold.
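
Hypothetical sketches of such an environmental condition and a biometric condition follow; the decibel and heart-rate thresholds are illustrative assumptions.

```python
AMBIENT_DB_CEILING = 70.0      # illustrative "noisy environment" threshold
ACTIVE_HEART_RATE_BPM = 120.0  # illustrative "physically active" threshold

def include_audio_component(contextual_data):
    """Omit the audio component when ambient noise exceeds the threshold."""
    return contextual_data.get("ambient_db", 0.0) <= AMBIENT_DB_CEILING

def biometric_condition_satisfied(contextual_data):
    """Target physically active users via a heart-rate threshold."""
    return contextual_data.get("heart_rate_bpm", 0.0) >= ACTIVE_HEART_RATE_BPM
```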

Example User Interfaces

FIGS. 10-13 depict example user interfaces for interactively presenting the item listings to the user. Although FIGS. 10-13 depict specific example user interfaces and user interface elements, these are merely non-limiting examples and many other alternate user interfaces and user interface elements may be generated by the presentation module 210 and presented to the user. It will be noted that alternate presentations of the displays of FIGS. 10-13 may include additional information, graphics, options, and so forth; other presentations may include less information, or may provide abridged information for easy use by the user.

FIG. 10 depicts an example device 1000 (e.g., a smart phone) displaying an example user interface 1010 that includes user interface element 1020 and item listings 1030, according to some example embodiments. In some implementations, the user interface element 1020 provides an option to sort the item listings 1030, or otherwise navigate the item listings, according to various schemes such as sorting based on recentness (e.g., based on temporal information corresponding to respective item listings), item price, distance from the user, relevance, or other metrics. In an example embodiment, the item listings 1030 include various portions of item information such as an item image, price, merchant, brand, other information retrieved from the publication system(s) 142, and the like. In some implementations, activating a particular item listing presents additional information corresponding to the particular item listing. In a specific example, the item listings 1030 may include item listings associated with a celebrity depicted in the advertisement (e.g., a celebrity endorsement for a basket of items).

FIG. 11 depicts an example device 1100 (e.g., a smart phone) displaying an example user interface 1110 that includes an item listing 1120, user interface element 1130, and user interface element 1140, according to some example embodiments. In an example embodiment, the item listing 1120 includes various portions of item information such as an item image, price, merchant, brand, other information retrieved from the publication system(s) 142, and the like. In some implementations, activating the item listing 1120 presents additional information corresponding to the item listing 1120. In an example embodiment, activating the user interface element 1130 provides the user the option to purchase the item corresponding to the item listing (e.g., activating the user interface element 1130 may facilitate a transaction for the item, for example, using the payment system(s) 144). In some example embodiments, user interface element 1140 includes a map with locations of the item listings or merchants that sell the item corresponding to the item listing 1120. In further example embodiments, a current user location 1150 is determined (e.g., via a GPS component of a mobile device of the user) and used to determine nearby merchants, such as merchant 1160, that sell the item corresponding to the item listing.

FIG. 12 depicts an example device 1200 (e.g., a smart watch) displaying an example user interface 1210. The example user interface 1210 includes user interface element 1220 that may be associated with the identified advertisement (see the advertisement discussed in connection with FIG. 3). The user interface 1210 includes user interface element 1230 that provides the user an option to interact with the user interface 1210. For instance, activating the user interface element 1230 provides additional information associated with the advertisement. For example, the item listings may include a group of particular item listings associated with a celebrity (e.g., a celebrity endorsement).

FIG. 13 depicts an example device 1300 (e.g., a smart phone) displaying an example user interface 1310 that includes a notification 1320, according to some example embodiments. In various example embodiments, the presentation module 210 causes presentation of the notification 1320 to the user. The notification may be presented to the user in response to automatic detection of the advertisement (e.g., location mapping of the user indicates the user is near the advertisement, or automatic detection of an NFC tag embedded in an advertisement). For instance, the presentation module 210 communicates, to the device 1300, instructions to present the notification 1320. In some instances, the instructions include notification content, generated by the presentation module 210, such as a message (e.g., pertinent information) to be presented to the user. In example embodiments, the notification 1320 comprises a text message, such as a Short Message Service (SMS) message, a Multimedia Messaging Service (MMS) message, an Enhanced Messaging Service (EMS) message, and so forth. In other example embodiments, the notification 1320 comprises a push notification or another similar type of notification. In further example embodiments, the notification 1320 comprises interactive user interface elements such as user interface elements 1330. In these example embodiments, the user interface elements 1330 provide the user an option to make a selection (e.g., through an SMS system or a mobile application).

Modules, Components, and Logic

Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A “hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.

In some embodiments, a hardware module may be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may be a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software encompassed within a general-purpose processor or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.

Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software may accordingly configure a particular processor or processors, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.

Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).

The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented module” refers to a hardware module implemented using one or more processors.

Similarly, the methods described herein may be at least partially processor-implemented, with a particular processor or processors being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an Application Program Interface (API)).

The performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented modules may be distributed across a number of geographic locations.

Applications

FIG. 14 illustrates an example mobile device 1400 executing a mobile operating system (e.g., iOS™, Android™, Windows® Phone, or other mobile operating systems), according to example embodiments. In one embodiment, the mobile device 1400 includes a touch screen operable to receive tactile data from a user 1402. For instance, the user 1402 may physically touch 1404 the mobile device 1400, and in response to the touch 1404, the mobile device 1400 determines tactile data such as touch location, touch force, or gesture motion. In various example embodiments, the mobile device 1400 displays a home screen 1406 (e.g., Springboard on iOS™) operable to launch applications or otherwise manage various aspects of the mobile device 1400. In some example embodiments, the home screen 1406 provides status information such as battery life, connectivity, or other hardware statuses. In some implementations, the user 1402 activates user interface elements by touching an area occupied by a respective user interface element. In this manner, the user 1402 may interact with the applications. For example, touching the area occupied by a particular icon included in the home screen 1406 causes launching of an application corresponding to the particular icon.

Many varieties of applications (also referred to as “apps”) may be executing on the mobile device 1400, such as native applications (e.g., applications programmed in Objective-C, Swift, or another suitable language running on iOS™, or applications programmed in Java running on Android™), mobile web applications (e.g., Hyper Text Markup Language-5 (HTML5)), or hybrid applications (e.g., a native shell application that launches an HTML5 session). For example, the mobile device 1400 includes a messaging app 1420, an audio recording app 1422, a camera app 1424, a book reader app 1426, a media app 1428, a fitness app 1430, a file management app 1432, a location app 1434, a browser app 1436, a settings app 1438, a contacts app 1440, a telephone call app 1442, other apps (e.g., gaming apps, social networking apps, biometric monitoring apps), and a third party app 1444.

Software Architecture

FIG. 15 is a block diagram 1500 illustrating an architecture of software 1502, which may be installed on any one or more of the devices described above. FIG. 15 is merely a non-limiting example of a software architecture, and it will be appreciated that many other architectures may be implemented to facilitate the functionality described herein. The software 1502 may be implemented by hardware such as machine 1600 of FIG. 16 that includes processors 1610, memory 1630, and I/O components 1650. In this example architecture, the software 1502 may be conceptualized as a stack of layers where each layer may provide a particular functionality. For example, the software 1502 includes layers such as an operating system 1504, libraries 1506, frameworks 1508, and applications 1510. Operationally, the applications 1510 invoke application programming interface (API) calls 1512 through the software stack and receive messages 1514 in response to the API calls 1512, according to some implementations.

In various implementations, the operating system 1504 manages hardware resources and provides common services. The operating system 1504 includes, for example, a kernel 1520, services 1522, and drivers 1524. The kernel 1520 acts as an abstraction layer between the hardware and the other software layers in some implementations. For example, the kernel 1520 provides memory management, processor management (e.g., scheduling), component management, networking, and security settings, among other functionality. The services 1522 may provide other common services for the other software layers. The drivers 1524 may be responsible for controlling or interfacing with the underlying hardware. For instance, the drivers 1524 may include display drivers, camera drivers, Bluetooth® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), Wi-Fi® drivers, audio drivers, power management drivers, and so forth.

In some implementations, the libraries 1506 provide a low-level common infrastructure that may be utilized by the applications 1510. The libraries 1506 may include system libraries 1530 (e.g., a C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematic functions, and the like. In addition, the libraries 1506 may include API libraries 1532 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as Moving Picture Experts Group-4 (MPEG4), Advanced Video Coding (H.264 or AVC), Moving Picture Experts Group Layer-3 (MP3), Advanced Audio Coding (AAC), Adaptive Multi-Rate (AMR) audio codec, Joint Photographic Experts Group (JPEG or JPG), Portable Network Graphics (PNG)), graphics libraries (e.g., an OpenGL framework used to render two-dimensional (2D) and three-dimensional (3D) graphic content on a display), database libraries (e.g., SQLite to provide various relational database functions), web libraries (e.g., WebKit to provide web browsing functionality), and the like. The libraries 1506 may also include a wide variety of other libraries 1534 to provide many other APIs to the applications 1510.

The frameworks 1508 provide a high-level common infrastructure that may be utilized by the applications 1510, according to some implementations. For example, the frameworks 1508 provide various graphic user interface (GUI) functions, high-level resource management, high-level location services, and so forth. The frameworks 1508 may provide a broad spectrum of other APIs that may be utilized by the applications 1510, some of which may be specific to a particular operating system or platform.

In an example embodiment, the applications 1510 include a home application 1550, a contacts application 1552, a browser application 1554, a book reader application 1556, a location application 1558, a media application 1560, a messaging application 1562, a game application 1564, and a broad assortment of other applications such as third party application 1566. According to some embodiments, the applications 1510 are programs that execute functions defined in the programs. Various programming languages may be employed to create one or more of the applications 1510, structured in a variety of manners, such as object-oriented programming languages (e.g., Objective-C, Java, or C++) or procedural programming languages (e.g., C or assembly language). In a specific example, the third party application 1566 (e.g., an application developed using the Android™ or iOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as iOS™, Android™, Windows® Phone, or other mobile operating systems. In this example, the third party application 1566 may invoke the API calls 1512 provided by the mobile operating system 1504 to facilitate functionality described herein.

Example Machine Architecture and Machine-Readable Medium

FIG. 16 is a block diagram illustrating components of a machine 1600, according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein. Specifically, FIG. 16 shows a diagrammatic representation of the machine 1600 in the example form of a computer system, within which instructions 1616 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 1600 to perform any one or more of the methodologies discussed herein may be executed. In alternative embodiments, the machine 1600 operates as a standalone device or may be coupled (e.g., networked) to other machines. In a networked deployment, the machine 1600 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 1600 may comprise, but not be limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1616, sequentially or otherwise, that specify actions to be taken by machine 1600. Further, while only a single machine 1600 is illustrated, the term “machine” shall also be taken to include a collection of machines 1600 that individually or jointly execute the instructions 1616 to perform any one or more of the methodologies discussed herein.

The machine 1600 may include processors 1610, memory 1630, and I/O components 1650, which may be configured to communicate with each other via a bus 1602. In an example embodiment, the processors 1610 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio-Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, processor 1612 and processor 1614 that may execute instructions 1616. The term “processor” is intended to include multi-core processors that may comprise two or more independent processors (also referred to as “cores”) that may execute instructions contemporaneously. Although FIG. 16 shows multiple processors, the machine 1600 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.

The memory 1630 may include a main memory 1632, a static memory 1634, and a storage unit 1636 accessible to the processors 1610 via the bus 1602. The storage unit 1636 may include a machine-readable medium 1638 on which is stored the instructions 1616 embodying any one or more of the methodologies or functions described herein. The instructions 1616 may also reside, completely or at least partially, within the main memory 1632, within the static memory 1634, within at least one of the processors 1610 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 1600. Accordingly, in various implementations, the main memory 1632, static memory 1634, and the processors 1610 are considered as machine-readable media 1638.

As used herein, the term “memory” refers to a machine-readable medium 1638 able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 1638 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions 1616. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 1616) for execution by a machine (e.g., machine 1600), such that the instructions, when executed by one or more processors of the machine 1600 (e.g., processors 1610), cause the machine 1600 to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory (e.g., flash memory), an optical medium, a magnetic medium, other non-volatile memory (e.g., Erasable Programmable Read-Only Memory (EPROM)), or any suitable combination thereof. The term “machine-readable medium” specifically excludes non-statutory signals per se.

The I/O components 1650 include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. In general, it will be appreciated that the I/O components 1650 may include many other components that are not shown in FIG. 16. The I/O components 1650 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, the I/O components 1650 include output components 1652 and input components 1654. The output components 1652 include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor), other signal generators, and so forth. The input components 1654 include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.

In some further example embodiments, the I/O components 1650 include biometric components 1656, motion components 1658, environmental components 1660, or position components 1662, among a wide array of other components. For example, the biometric components 1656 include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like. The motion components 1658 include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth. The environmental components 1660 include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., machine olfaction detection sensors, gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment. The position components 1662 include location sensor components (e.g., a Global Position System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.

Communication may be implemented using a wide variety of technologies. The I/O components 1650 may include communication components 1664 operable to couple the machine 1600 to a network 1680 or devices 1670 via coupling 1682 and coupling 1672, respectively. For example, the communication components 1664 include a network interface component or another suitable device to interface with the network 1680. In further examples, communication components 1664 include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities. The devices 1670 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).

Moreover, in some implementations, the communication components 1664 detect identifiers or include components operable to detect identifiers. For example, the communication components 1664 include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar codes, multi-dimensional bar codes such as a Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, Uniform Commercial Code Reduced Space Symbology (UCC RSS)-2D bar code, and other optical codes), acoustic detection components (e.g., microphones to identify tagged audio signals), or any suitable combination thereof. In addition, a variety of information can be derived via the communication components 1664, such as location via Internet Protocol (IP) geo-location, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth.

Transmission Medium

In various example embodiments, one or more portions of the network 1680 may be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), the Internet, a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks. For example, the network 1680 or a portion of the network 1680 may include a wireless or cellular network and the coupling 1682 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or other type of cellular or wireless coupling. In this example, the coupling 1682 may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1xRTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE) standard, others defined by various standard setting organizations, other long range protocols, or other data transfer technology.

In example embodiments, the instructions 1616 are transmitted or received over the network 1680 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 1664) and utilizing any one of a number of well-known transfer protocols (e.g., Hypertext Transfer Protocol (HTTP)). Similarly, in other example embodiments, the instructions 1616 are transmitted or received using a transmission medium via the coupling 1672 (e.g., a peer-to-peer coupling) to devices 1670. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions 1616 for execution by the machine 1600, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.

Furthermore, the machine-readable medium 1638 is non-transitory (in other words, not having any transitory signals) in that it does not embody a propagating signal. However, labeling the machine-readable medium 1638 as “non-transitory” should not be construed to mean that the medium is incapable of movement; the medium should be considered as being transportable from one physical location to another. Additionally, since the machine-readable medium 1638 is tangible, the medium may be considered to be a machine-readable device.

Language

Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.

Although an overview of the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader scope of embodiments of the present disclosure. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or inventive concept if more than one is, in fact, disclosed.

The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.

As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present disclosure. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present disclosure as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims

1. A system comprising:

a communication module to receive, from a portable device of a user, a promotion indication that indicates a presentation of a promotion to a user;
an analysis module to identify the promotion that corresponds to the promotion indication;
an item module, implemented by at least one processor of a machine, to determine at least one item listing based, at least in part, on the promotion;
the communication module further to receive, from the portable device of the user, real-time contextual data that corresponds to the promotion indication, the real-time contextual data corresponding to a physical environment of the presentation of the promotion;
a condition module to access contextual conditions associated with the promotion and determine satisfaction of the contextual conditions based on the real-time contextual data; and
based on the determined satisfaction of the contextual conditions, a presentation module to cause presentation of a user interface including the at least one item listing to the user.

2. The system of claim 1, wherein:

the analysis module further to identify a promotion location that corresponds to the promotion and further to extract a user location from the real-time contextual data; and
the condition module further to determine satisfaction of a distance condition, included in the contextual conditions, by determining that the user location is within a distance of the promotion location, the distance being specified by the distance condition.

3. A method comprising:

receiving, from a user device, an advertisement indication that indicates a presentation of an advertisement to a user;
identifying the advertisement corresponding to the advertisement indication;
determining, using a hardware processor of a machine, at least one item listing based, at least in part, on the advertisement;
receiving, from the user device, real-time contextual data corresponding to the advertisement indication, the real-time contextual data corresponding to a physical context of the presentation of the advertisement;
accessing contextual conditions associated with the advertisement;
determining satisfaction of the contextual conditions based on the real-time contextual data; and
based on the determined satisfaction of the contextual conditions, causing presentation of the at least one item listing to the user, the presentation being exclusive based on the real-time contextual data.

4. The method of claim 3, further comprising:

identifying an advertisement location corresponding to the advertisement;
extracting a user location from the real-time contextual data; and
determining satisfaction of a distance condition, included in the contextual conditions, by determining that the user location is within a distance of the advertisement location, the distance being specified by the distance condition.

5. The method of claim 3, further comprising:

extracting a presentation time from the real-time contextual data; and
determining satisfaction of a temporal condition, included in the contextual conditions, by determining that the presentation time is within a time period specified by the temporal condition.

6. The method of claim 3, further comprising:

extracting a user identity from the real-time contextual data;
accessing user data corresponding to the user based on the user identity; and
determining the at least one item listing based on the advertisement and the user data.

7. The method of claim 3, further comprising:

generating an advertisement offer associated with the at least one item listing; and
providing the advertisement offer to the user.

8. The method of claim 7, wherein the generating the offer is based, at least in part, on the real-time contextual data.

9. The method of claim 7, wherein the advertisement offer comprises a discounted purchase associated with the at least one item listing.

10. The method of claim 3, further comprising:

extracting the advertisement indication from the real-time contextual data by: extracting a user location from the real-time contextual data; accessing location data corresponding to a plurality of advertisements; and identifying a match between the user location and the location data of a particular advertisement among the plurality of advertisements.

11. The method of claim 3, further comprising:

storing the real-time contextual data and the advertisement indication in association with the user;
receiving a subsequent advertisement indication that indicates a presentation of the advertisement to the user;
identifying the advertisement corresponding to the subsequent advertisement indication; and
determining the at least one item listing based on the stored real-time contextual data and the advertisement.

12. The method of claim 3, wherein the advertisement indication results from the user device physically detecting an advertisement identifier corresponding to the advertisement.

13. The method of claim 12, wherein the advertisement identifier is detected from at least one of a QR code, a RFID tag, an audio tag, or a smart tag.

14. A machine-readable medium having no transitory signals and storing instructions that, when executed by at least one processor of a machine, cause the machine to perform operations comprising:

receiving, from a user device, an advertisement indication that indicates a presentation of an advertisement to a user;
identifying the advertisement corresponding to the advertisement indication;
determining at least one item listing based, at least in part, on the advertisement;
receiving, from the user device, real-time contextual data corresponding to the advertisement indication, the real-time contextual data corresponding to a physical context of the presentation of the advertisement;
accessing contextual conditions associated with the advertisement;
determining satisfaction of the contextual conditions based on the real-time contextual data; and
based on the determined satisfaction of the contextual conditions, causing presentation of the at least one item listing to the user, the presentation being restricted based on the real-time contextual data.

15. The machine-readable medium of claim 14, wherein the operations further comprise:

identifying an advertisement location corresponding to the advertisement;
extracting a user location from the real-time contextual data; and
determining satisfaction of a distance condition, included in the contextual conditions, by determining that the user location is within a distance of the advertisement location, the distance being specified by the distance condition.

16. The machine-readable medium of claim 14, wherein the operations further comprise:

extracting a presentation time from the real-time contextual data; and
determining satisfaction of a temporal condition, included in the contextual conditions, by determining that the presentation time is within a time period specified by the temporal condition.

17. The machine-readable medium of claim 14, wherein the operations further comprise:

extracting a user identity from the real-time contextual data;
accessing user data corresponding to the user based on the user identity; and
determining the at least one item listing based on the advertisement and the user data.

18. The machine-readable medium of claim 14, wherein the operations further comprise:

generating an advertisement offer associated with the at least one item listing based on the advertisement; and
providing the advertisement offer to the user.

19. The machine-readable medium of claim 18, wherein the generating the advertisement offer is based, at least in part, on the real-time contextual data.

20. The machine-readable medium of claim 18, wherein the advertisement offer comprises a discounted purchase associated with the at least one item listing.

Patent History
Publication number: 20150058123
Type: Application
Filed: Aug 21, 2014
Publication Date: Feb 26, 2015
Inventors: Michael George Lenahan (Moraga, CA), Chahn Chung (San Francisco, CA), Myra Sandoval (San Francisco, CA), Ben Mitchell (Oakland, CA), Timothy Sean Suglian (San Francisco, CA)
Application Number: 14/465,786
Classifications
Current U.S. Class: Based On User Location (705/14.58)
International Classification: G06Q 30/02 (20060101);