METHOD AND APPARATUS FOR CREATING RULE-BASED INTERACTION OF PORTABLE CLIENT DEVICES AT A LIVE EVENT
A system creates an enhanced experience at a concert or other event. A central server provides commands through which information is sent, gathered, or both between portable interactive devices of audience members and the central server. A shared experience, such as a common display sent to all users, is created. A system “app” is installed in each portable interactive device. Rules are implemented in a processor to enable a game to be played by audience members, in which the processor enables interaction of users with the server and processes user inputs according to rules of the game.
This patent application claims priority of Provisional Patent Application 61/648,593 filed May 18, 2012, Provisional Patent Application 61/670,754 filed Jul. 12, 2012, Provisional Patent Application 61/705,051 filed Sep. 24, 2012, Provisional Patent Application 61/771,629 filed Mar. 1, 2013, Provisional Patent Application 61/771,646 filed Mar. 1, 2013, Provisional Patent Application 61/771,690 filed Mar. 1, 2013, and Provisional Patent Application 61/771,704 filed Mar. 1, 2013, the disclosures of which are each incorporated by reference herein in its entirety.
BACKGROUND

1. Field of the Invention
The present subject matter relates to providing a shared experience to audience members communicated from a central server by creating rule-based individual and shared interactions with portable interactive devices of audience members.
2. Related Art
In order to enhance an audience's involvement in a live event such as a concert, video displays may be combined with a concert performance.
For example, United States Patent Application Publication No. 20120239526 discloses an interactive method and apparatus which provide limited interaction between a performer and concert attendees. The performer enters concert information into a server, which is then accessed wirelessly by an electronic device of a concert attendee. Animations from the server are dynamically displayed on the electronic device. In this arrangement, attendees may select a song to download or view its lyrics. The user may select an encore screening to vote on a song to be played during an encore performance. In this arrangement, the attendee interacts only with previously stored information; no new information is generated to enhance the performance. In order to combine further information sources, whether local or accessed through the Internet, the system must provide sufficient bandwidth, or delays and gaps in the data will occur. In the past, it has generally been impossible to provide sufficient bandwidth through a venue connection. Consequently, possible interactions between a performer and an audience are greatly limited.
United States Published Patent Application No. 20070292832 discloses a system for creating sound using visual images. Various controls and features are provided for the selection, editing, and arrangement of the visual images and tones used to create a sound presentation. Visual image characteristics such as shape, speed of movement, direction of movement, quantity, and location can be set by a user. These systems do not provide for interaction with the audience.
To the extent that audience interaction and provision of displays constructed for particular users have been provided, they have had very limited capabilities.
U.S. Pat. No. 8,090,321 discloses a method and system for wirelessly providing venue-based data to one or more hand held devices. The venue-based data can be authenticated and wirelessly transmitted to one or more hand held devices through one or more wireless telecommunications networks in response to authenticating the venue-based data. This method and system provide data to hand held devices. However, an interaction between a device and the venue data source is not disclosed.
United States Published Patent Application No. 20130080348 describes capturing event feedback and providing a representation of feedback results generated using the feedback indicia. The capturing of event feedback involves storing a plurality of different event records for each of a plurality of different events. The information is used by a program presenter for determining audience behavior in response to transmitted content. Production of an enhanced concert experience is not disclosed.
U.S. Pat. No. 7,796,162 discloses a system in which one set of cameras generates multiple synchronized camera views for broadcast of a live activity from a venue to remote viewers. A user chooses which view to follow. However, there is no provision for varying the sets of images sent to users, and no uploading capability for users.
U.S. Pat. No. 6,731,940 discloses methods of using wireless geolocation to customize content and delivery of information to wireless communication devices. The communication devices send signals to a central control system. The method uses an RF receiving site including antenna array and a mobile device operated by a user. At least one p-dimensional array vector is derived from RF signals sampled from p antennas of an array, where p is an integer. At least one p-dimensional array vector is used to derive a location of the mobile device. The device addresses a data source in order to customize information in correspondence with the location. The customized information is transmitted to a user. A significant application of this system is to send and/or receive location-specific information of interest, such as targeted advertisements and special services, to travelers and shoppers. A control function is not provided for generating a display comprising an entertainment performance.
United States Published Patent Application No. 20110075612 discloses a system in which content is venue-cast. The content is sent to a plurality of receiving access terminals comprising portable interactive devices within a venue boundary. Content generated at an access terminal is transmitted to a venue-cast server. A venue-specific network could comprise a wide area network (WAN) or a Wi-Fi hotspot deployment. The system provides “unscheduled ad hoc deliveries” of content via the venue transmission system to provide venue visitors with venue related information. Content is specific to the venue and is not related to groups of users within the venue. The only function provided is a venue cast.
SUMMARY

Briefly stated, in accordance with the present subject matter, there are provided a system, a method, and a machine-readable medium comprising instructions to be executed on a digital processor for permitting a system to create cooperatively determined video compositions and interaction between audience members to produce a composite result. A central server provides commands through which information is sent, gathered, or both between audience members and the central server. Information received from audience members is processed to determine relationships between data from individual users. More specifically, in one form, rules may be implemented in a processor to enable a game, played by audience members, which is coordinated by a central control system.
In one preferred form, a Wi-Fi link is provided in a venue so that the ability to communicate is not limited by Internet or cellular system bandwidth constraints.
A program may issue commands to all interactive devices in an audience to produce a composite result. The present subject matter can direct portable interactive devices to perform functions in response to received signals from a central source including a central server. The central server may interact with a portable interactive device through an app created in accordance with the present subject matter. The app is installed in the portable interactive device.
Another composite result is implementation of a game in which the central server sends commands to gather selected information from each portable interactive device. The information is processed according to a preselected rule, and results are provided to users. A plurality of iterations of information gathering and processing may be performed. Various forms of information may be gathered and processed in accordance with different preselected rules in order to implement different games or information transmission.
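The iterative gather-process-inform cycle described above may be sketched as follows; the function and device names are hypothetical and serve only to illustrate one possible implementation.

```python
def run_game_round(devices, gather, rule, inform):
    """One iteration: gather selected information from each device,
    process it according to a preselected rule, and provide the
    result back to the users."""
    data = {d: gather(d) for d in devices}   # information gathering
    result = rule(data)                      # rule-based processing
    for d in devices:
        inform(d, result)                    # distribute the result
    return result

# Hypothetical round: each device reports a number; the rule picks
# the device with the highest value. Iterating run_game_round with
# different gather/rule pairs implements different games.
reports = {"dev-1": 3, "dev-2": 7}
outcome = run_game_round(
    list(reports),
    gather=lambda d: reports[d],
    rule=lambda data: max(data, key=data.get),
    inform=lambda d, r: None,
)
```

Different games or information transmissions result from substituting different gather and rule functions while the surrounding loop stays the same.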
The present subject matter may be further understood by reference to the following description taken in connection with the following drawings:
A central clock 9 synchronizes operations. The venue 10 may include a stage 12, audience area 14, a control room 16, and a media system 18 which may be located in the control room 16. The media system 18 receives audio, video, and intelligence from sources and may be operated to perform control room functions such as mixing, selecting, and processing. A video program 20 is shown on a display 22.
The media system 18 is used to couple outputs from a video source 26, a sound source 28, and other intelligence source 30. The video source 26 may comprise one or more television cameras 24. In the present illustration, a media source 34 includes the video source 26, sound source 28, and other intelligence source 30. The sound source 28 comprises audio output from a live performance provided by a performer or performers 40 coupled by transducers 42, such as microphones. Alternatively, one or more of the video source 26, the sound source 28, and other intelligence source 30 may comprise sources of streaming content, prerecorded content, stored data, or currently processed content from any source. These sources may be local, remote, or both.
In one preferred form the display 22 is a screen 50 that comprises a backdrop for the stage 12. The display 22 could comprise an array 52 of screens over which the video program 20 is distributed. In another form, often used in arenas, the display 22 could comprise a display unit 56 which includes a plurality of monitors 58 on one support 60, with each monitor 58 facing in a different direction. Examples of the display unit 56 are available under the trademark Jumbotron®.
The media system 18 is operated by a VJ 70. The VJ 70 may comprise one or more personnel or a programmed computer. It is not essential that the control room 16 be located at the venue 10. The media system 18 provides content to a concert network controller 100. The concert network controller 100 may both receive and transmit information. The concert network controller 100 provides an input to a display link 102, which is coupled by a patch panel 104 to the display unit 56.
The concert network controller 100 may also comprise a Wi-Fi hotspot 120 providing signals to and receiving signals from the audience area 14. As further described below, content may be provided both to and from audience members 4. Audience members may also be informed of information relating to the composite responses from all audience members. The concert network controller 100 may also interact with remote participants 140. In another form, a Wi-Fi system 124, discussed further below, may be employed.
The concert network controller 100 is preferably wirelessly connected to an event server 130, which can provide communications between remote participants 140 and the concert network controller 100. The event server 130 is coupled to a content editor 134, which interacts with a staging server 136. The staging server 136 may be coupled to the remote participants 140 by a network, for example, the Internet 144.
Communications will be provided between a target system and a source system. In the present description, “source system” is a device that wishes to send a message to a “target system.” The target system is a device that is configured to receive sent messages via its operating system provided network connection subsystem. The business logic running on the device can select as needed to operate as the target or the source system at any moment. Operating as a source system or target system for a particular messaging transaction does not preclude operating as the other system for a different messaging transaction simultaneously.
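The role-switching behavior described above may be illustrated with a minimal sketch; the Device class and its inbox model are hypothetical simplifications of the operating-system network-connection subsystem.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    """A device that may act as source or target per messaging transaction."""
    uid: str
    inbox: list = field(default_factory=list)

    def send(self, target: "Device", payload: str) -> None:
        # Acting as the source system for this transaction.
        target.receive(self.uid, payload)

    def receive(self, sender_uid: str, payload: str) -> None:
        # Acting as the target system; received messages are queued.
        self.inbox.append((sender_uid, payload))

a = Device("A")
b = Device("B")
a.send(b, "hello")   # A is the source, B the target
b.send(a, "reply")   # roles reversed for a different transaction
```

As the sketch shows, nothing in the device itself fixes its role; the business logic selects source or target behavior per transaction.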
In a nominal application, thousands of portable user devices 6 may communicate with the concert network controller 100. The communication will provide interaction for intended uses of the system 2. This alone could strain resources and require expensive T1 access lines with capacity far beyond that normally utilized within a concert venue. Providing such capacity would be both expensive and impractical. Additionally, users 4 have the option to operate their portable user devices 6 in order to access the Internet and to access cell phone services. It is important to limit bandwidth requirements to accommodate a large number of portable user devices 6. This can be accomplished by disabling access to applications that are not part of the entertainment functions of the system 2. For purposes of the present description, the applications contributing to functioning of the system 2 are referred to as business logic.
The enhanced shared experience of the users 4 may include receiving common displays from the concert network controller 100. An example might be each portable interactive device 300 being commanded to display the same solid color. Another alternative is providing different displays to each of a group 662 of devices.
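A minimal sketch of commanding common and per-group displays follows; the device records and color assignments are hypothetical.

```python
def broadcast_color(devices, color):
    """Command every portable interactive device to show the same solid color."""
    for d in devices:
        d["screen"] = color

def group_colors(devices, assignment):
    """Give each group its own color so the audience forms a composite display."""
    for d in devices:
        d["screen"] = assignment[d["group"]]

devices = [{"uid": 1, "group": "left"}, {"uid": 2, "group": "right"}]
broadcast_color(devices, "#FF0000")                             # common display
group_colors(devices, {"left": "#00FF00", "right": "#0000FF"})  # composite display
```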
In accordance with a further aspect of the present subject matter, the app 306 is provided to coordinate the functions described herein. The operation description acts as a description of the software architecture of the app 306. The app 306 may be supplied for each portable interactive device through a number of means. It may be supplied via the Internet 350 or via a cell phone connection 352 from a software company 360. The software company may comprise a software developer. Apps 306 that are written by developers may be downloaded from a source such as the iTunes store for iOS phones or Google Play for Android phones.
The processor 410 also interacts with an audio module 440 coupled to a microphone 442 and to a speaker 444. Functions connected with the use of the camera lens 420 and associated circuitry within the processor 410 include an ambient light sensor 450, flash lamp 452, and optical proximity sensor 454. Memories associated with the processor 410 include a data memory 460 and a program memory 470. The data memory 460 stores data and provides data as commanded by a program in the program memory 470.
The app 306 is loaded into the program memory 470 when downloaded by a user 4. When the app 306 is activated by the user 4, the smartphone 400 will respond to commands from the central server 8. Information will be uploaded from or downloaded to the smartphone 400. When downloading an app, a user grants permissions for the app to access and upload data, control selected functions, and download data. This grant of permissions may be explicit or it may be made by default.
Permissions utilized by the app 306 may include permission to: modify or delete USB storage contents; discover a list of accounts known by the phone; view information about the state of Wi-Fi in the phone; create network sockets for Internet communication; receive data from the Internet; locate the smartphone 400, either by reading a coarse network location or a fine GPS location; read the state of the phone, including whether a call is active, the number called, and the serial number of the smartphone 400; modify global system settings; retrieve information about currently and recently running tasks; kill background processes; or discover private information about other applications. Consequently, an app 306 can be provided that will readily access the forms of information discussed below from a smartphone 400. The serial number of the smartphone 400 may be used to compose a Unique Identification Number (UID). Other ways of assigning unique identification numbers include assigning numbers as users 4 log on to a session.
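Both ways of assigning unique identification numbers may be sketched as follows; hashing the serial number is an assumed privacy measure for illustration, not a requirement of the system.

```python
import hashlib
import itertools

def uid_from_serial(serial: str) -> str:
    """Compose a UID from the device serial number. Hashing avoids
    exposing the raw serial (an assumed, not required, step)."""
    return hashlib.sha256(serial.encode()).hexdigest()[:16]

# Alternative: assign sequential UIDs as users log on to a session.
_session_counter = itertools.count(1)

def uid_on_logon() -> str:
    """Hand out the next session-scoped UID at log-on time."""
    return f"UID-{next(_session_counter):06d}"
```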
Particular parameters to be requested in order to achieve varying composite results are explained further below.
The central server 8 may permit users to log in to their Facebook accounts when joining a session. Using the permissions described above, the central server 8 scrapes demographic information such as age, gender, and location, and stored information from their devices such as favorite musical artist, number of prior shared-experience events attended, or other information. Selected information may be shown graphically on the display 22.
The system may also organize portable interactive devices 6 into groups, as described below.
In order to group devices, a device family segregation method is employed. The controller-only framework includes provision to segregate client devices into arbitrary groupings based on any provided metric. Groups 662 may be generated from any available information. For example, a group 662 may include selected identification numbers of portable interactive devices 6. A group 662 may include location data of each user device 6 as received from the device. Another group 662 may comprise demographic information gathered from the user devices 6. In order to address a selected segment of an audience, data from a selected group 662 is mapped into an address list. This selection may be made by the VJ 70 via the central server 8. Displays may be provided to the user devices 6 so that an individual group 662 receives an input. Additionally, different groups may be provided with different information or displays in order to produce a composite display for all or part of a venue 10. Groups 662 may also consist of individual users 4.
Criteria for establishing a group 662 include the role of a component or of other users in interaction with the group. Components or other users to be factored into criteria include the concert network controller; the Jumbotron®; the OCS; remote clients; gender; age; location within the venue; device type, e.g., iOS, Android, HTML5, Windows, OSX, or Linux; device family, including iPhone 3G, 3GS, 4/4S, 5, iPad, iPad 2, and iPad 3; and random selection. In addition, the framework allows the creation, collection, and persistence of arbitrary metrics for use as segregation criteria. Individual devices can be added to or removed from any grouping as desired. Groups 662 may be mutually exclusive or can overlap in any manner desired. Once defined, a grouping can be used as the target for specific commands.
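The segregation of client devices by an arbitrary metric, as described above, may be sketched as follows; the device records and metrics are hypothetical.

```python
from collections import defaultdict

def segregate(devices, metric):
    """Segregate client devices into groups keyed by any provided metric.
    `metric` maps a device record to its group key."""
    groups = defaultdict(list)
    for d in devices:
        groups[metric(d)].append(d["uid"])
    return dict(groups)

devices = [
    {"uid": 1, "os": "iOS", "age": 23},
    {"uid": 2, "os": "Android", "age": 31},
    {"uid": 3, "os": "iOS", "age": 35},
]
by_os = segregate(devices, lambda d: d["os"])          # device-type groups
by_age = segregate(devices, lambda d: d["age"] >= 30)  # demographic groups
```

Because each call builds an independent mapping, a device can appear in several groupings at once, matching the overlap allowed above.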
Simplified access to selected users 4 is provided. Additionally, users 4 can be enabled to request specific data. The sorting function supports the creation of arbitrary data elements such as groupings or commands for later reference by name. This can be used to send a specific complex or often used command or to define a grouping of devices.
A first example of an interaction producing a composite result is a timed group photo command.
Further in accordance with the timed group photo command, at block 730 each picture is transmitted via the RF module 414 for transmission back to the event server 130, which contains memory for storing the received pictures. At block 732, various tags may be added to each picture. Most commonly, the tag will comprise a timestamp. Processing is performed at block 734 in order to obtain a result which comprises a composite of the interaction of the portable interactive devices 6 and the system 2. This result may take many forms. For example, the VJ 70, at block 734, can create collages and other photographic displays of the resulting images. As indicated in the loop connection from block 736 back to block 734, this operation can be repeated over the course of an event. This allows accumulation of a large photo archive of the event itself on the event server 130 or the central server 8. The stored images can also be mined for souvenir material after the event. This data can be used to create a searchable, time-stamped record of the event which can be published later.
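The tagging and archiving steps of blocks 730 to 736 may be sketched as follows; the EventServer class and its methods are hypothetical.

```python
import time

class EventServer:
    """Minimal sketch of the event-server side of a timed group photo."""
    def __init__(self):
        self.archive = []

    def receive_photo(self, device_uid, photo_bytes, timestamp=None):
        # Tag each received picture; most commonly the tag is a timestamp.
        record = {
            "device": device_uid,
            "photo": photo_bytes,
            "timestamp": timestamp if timestamp is not None else time.time(),
        }
        self.archive.append(record)
        return record

    def search(self, start, end):
        # The time-stamped archive can be searched and mined after the event.
        return [r for r in self.archive if start <= r["timestamp"] <= end]

server = EventServer()
server.receive_photo("dev-1", b"...", timestamp=100.0)
server.receive_photo("dev-2", b"...", timestamp=205.0)
hits = server.search(90.0, 110.0)   # only the first photo matches
```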
Another form of interaction involves gathering images from the portable interactive devices 6.
At block 770, the images obtained are sent to the event server 130 and may be sent to the data memory 610 in the central server 8. At a processing block 772, the VJ 70 processes the images. At block 774, images are displayed on the big screen 50 or sent to all of the audience or to selected members of the audience.
The system will provide the ability for users who employ “client devices” to upload content to the system. Examples of uploaded content include photos or short video clips. Users may access content for uploading in a variety of ways. Users may take pictures from their client devices. They may access shared material from social media. The user may access a storage area of a client device and may select pictures or other items from storage files. When a first and a second user are connected in selected social applications, each user may also choose content from the other user's library.
Uploaded content may be reviewed at the controller 100.
At block 800, a command is selected by the VJ 70. Many forms of “game-like” interactions can be commanded. Upon a command from the controller 100 at block 802, an interaction called “Shaking” is initiated when a command is issued to enable reading of the accelerometers 432 in the smartphones 400. Each device 400 provides a message to its user on its display 416 that says “Shake your phone!” Each user 4 then begins shaking the respective smartphone 400. The issued command derives output from the accelerometer 432 of each smartphone 400 and transmits data back to the controller 100. At block 804, the accelerometers 432 are read and information is sent back to the central server 8. The accelerometers 432 of the smartphones 400 provide a substantially real-time dataset of the rates of motion and amounts of kinetic energy being expended by the audience members. Tags may be attached at block 806. Data is stored at the central server 8 at block 808. In a loop, data from the central server 8 is integrated at block 810 and stored again at block 808 to provide updated processed data. Rule-based processing is used at block 812 to determine preselected information derived from processing the stored data. For example, data indicative of the movements of a user 4 may be used to characterize the kinds of kinetic energy being created by users, who may be dancing, swaying, waving their hands, or standing idle.
A rule is applied over successive operating cycles, usually clock periods, to update status and keep games current.
This information can be processed in a number of ways. For example, a ranking can be assigned based on physical attributes such as applying the most energy to a smartphone 400, maintaining the most constant rhythm, or performing according to some other criterion. This process can rank all of the participating devices. The VJ 70 can then command the display of the “winners” or “leaders” of the ranking.
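The ranking of participating devices may be sketched as follows; using the sum of squared accelerometer magnitudes as a kinetic-energy proxy is an assumption for illustration.

```python
def rank_shakers(readings):
    """Rank devices by a kinetic-energy proxy: the sum of squared
    accelerometer magnitudes reported by each device."""
    energy = {uid: sum(v * v for v in vals) for uid, vals in readings.items()}
    return sorted(energy, key=energy.get, reverse=True)

readings = {
    "dev-1": [0.2, 0.1],        # standing idle
    "dev-2": [2.0, 1.8, 2.2],   # dancing vigorously
    "dev-3": [1.0, 1.1],        # swaying
}
leaders = rank_shakers(readings)   # ["dev-2", "dev-3", "dev-1"]
```

A different criterion, such as rhythm constancy, would substitute a different scoring expression while the ranking step stays the same.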
The central clock 9 also allows solution of problems in traditional distributed multiplayer gaming. Consider a shooting game in which players are instructed to “draw-and-shoot” their weapons as soon as they see a special signal appear either on the big screen 50 at the venue 10 or on their respective portable user devices 6. A timestamp from the central clock 9 may be associated with each “BANG” message at the time a response is commanded from a user device 6. A winner is determined by comparison of timestamps rather than by the arrival times of the “BANG” messages at the central server 8.
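The timestamp-based winner determination may be sketched as follows; the message format and device names are hypothetical.

```python
def fastest_draw(signal_time, bang_timestamps):
    """Pick the winner by central-clock timestamp rather than by network
    arrival order; responses timestamped before the signal are ignored."""
    valid = {uid: t for uid, t in bang_timestamps.items() if t >= signal_time}
    return min(valid, key=valid.get) if valid else None

# Hypothetical "BANG" messages, each timestamped by the central clock 9.
bangs = {"dev-1": 10.42, "dev-2": 10.31, "dev-3": 9.90}  # dev-3 jumped the gun
winner = fastest_draw(10.0, bangs)   # "dev-2"
```

Because comparison uses central-clock timestamps, variable network latency cannot change the outcome.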
Another game is a scavenger hunt. In one form, codes associated with one subset of users 4 are required by another subset of users 4 to complete a game or puzzle. The second subset of users 4 has to “ping” other users to access the required codes.
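The code-exchange mechanic may be sketched as follows; the code values and device names are hypothetical.

```python
# Codes held by one subset of users 4 (one group 662).
holder_codes = {"dev-7": "A1", "dev-9": "B2"}

def ping(target_uid):
    """A seeker 'pings' another user and receives that user's code, if any."""
    return holder_codes.get(target_uid)

def complete_puzzle(collected, required):
    """The puzzle is complete once all required codes have been collected."""
    return required.issubset(collected)

collected = {ping("dev-7"), ping("dev-9")}
done = complete_puzzle(collected, {"A1", "B2"})   # True
```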
These interactions promote interplay and communication in the physical space among people attending the event. By addressing messages to individual devices 6, lotteries can be conducted and winners and losers can be informed of individual outcomes.
Utilizing device addressing also facilitates messaging between individuals in the audience either one-on-one, one-to-many, or many-to-one. In this example, “many” comprises a group 662.
At block 906, selected data can be arranged for display. At block 908, data is displayed. For example, plot 954 is an illustration of distributions of kinetic energy readings received from the smartphones 400. Displays may be provided on the large screen 50 as well as on the displays 416 of the smartphones 400. Since the app 306 in one preferred form has a wide range of permissions, virtually any data within a smartphone 400 can be accessed. This data can include information scraped from the Facebook social graph, such as a map of home locations represented in a crowd, as seen in the map graphic 956. Statistics may be repeatedly collected from the contents of libraries 302 across all the devices, from local light and sound levels over time, or from accelerometer information over time. The repeated collection updates computed values, thus providing dynamic statistics.
For example, the act of dancing may be measured by the accelerometers 432 in the smartphones 400. The processor may register measurements to determine the top five most active “dancers” in the audience. By virtue of the downloading of social network information corresponding to particular users, the system may access the Facebook picture of each of the five most active “dancers,” as seen in panel 960. Another form of statistical information can be gathered by the geolocation transducers in the smartphones 400. The system can measure an amount of physical movement of each smartphone 400 and then display a list 962 of the most “restless” users.
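The repeated-collection mechanism that yields dynamic statistics may be sketched as follows; the running mean of light-level samples is one assumed example of a computed value updated over time.

```python
class RunningStats:
    """Repeatedly collected readings update a computed value in place,
    providing a dynamic statistic (here, a running mean)."""
    def __init__(self):
        self.n = 0
        self.total = 0.0

    def update(self, value):
        # Each new collection cycle feeds one more reading in.
        self.n += 1
        self.total += value

    @property
    def mean(self):
        return self.total / self.n if self.n else 0.0

light = RunningStats()
for reading in [0.4, 0.6, 0.8]:   # successive light-level samples
    light.update(reading)
# light.mean now holds the running mean of the samples collected so far.
```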
The above description is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the spirit or scope of the invention. For example, one or more elements can be rearranged and/or combined, or additional elements may be added. A wide range of systems may be provided consistent with the principles and novel features disclosed herein.
Claims
1. A method for providing a shared interactive experience to respective users of a set of portable interactive devices at an event in a venue comprising:
- selecting a set of commanded actions defining a shared interactive experience;
- defining a sequence of signals to provide the shared interactive experience;
- transmitting to portable interactive devices content and command signals in correspondence with the defined sequence;
- receiving response signals from the portable interactive devices produced in response to the sequence of signals;
- processing the response signals according to a rule for respective operating cycles to provide processed data; and
- selecting processed data to be embodied in a form of the shared interactive experience.
2. A method according to claim 1 further comprising the step of conducting a game for the set of users by providing data instructing users to perform an action as an element of performance of the game.
3. A method according to claim 2 further comprising establishing a form of processing in accordance with the rules of the game.
4. A method according to claim 3 wherein users are instructed to perform an action to be sensed by a transducer in a respective portable interactive device, and wherein processing the outputs comprises comparing outputs and informing users of comparative performances in accordance with game criteria.
5. A method according to claim 4 wherein the step of informing users comprises creating a graphic representation of game results and providing the representation to a venue display.
6. A method according to claim 1 wherein selecting a sequence of commanded actions comprises detecting personal user information on a respective portable interactive device and wherein providing a command signal comprises instructing the portable user device to provide user information stored on the portable user device.
7. A method according to claim 6 wherein selecting a sequence of commanded actions comprises permitting users to upload selected information.
8. A method according to claim 7 wherein selecting a sequence of commanded actions comprises constructing at least a group of users according to a preselected criterion.
9. A non-transitory machine-readable medium that provides instructions which, when executed by a processor, cause said processor to perform operations comprising:
- selecting a set of commanded actions defining a shared interactive experience;
- defining a sequence of signals to provide the shared interactive experience;
- transmitting to portable interactive devices content and command signals in correspondence with the defined sequence;
- receiving response signals from the portable interactive devices produced in response to the sequence of signals;
- processing the response signals according to a rule for respective operating cycles to provide processed data; and
- selecting processed data to be embodied in a form of the shared interactive experience.
10. A non-transitory machine-readable medium according to claim 9 further causing the processor to perform the step of conducting a game for the set of users by providing data instructing users to perform an action as an element of performance of the game.
11. A non-transitory machine-readable medium according to claim 10 that causes the processor to perform the further step of establishing a form of processing in accordance with the rules of the game.
12. A non-transitory machine-readable medium according to claim 11 wherein users are instructed to perform an action to be sensed by a transducer in a respective portable interactive device, and wherein the medium causes the processor to perform the further steps of comparing outputs and informing users of comparative performances in accordance with game criteria.
13. A non-transitory machine-readable medium according to claim 12 wherein the step of informing users comprises creating a graphic representation of game results and providing the representation to a venue display.
14. A non-transitory machine-readable medium according to claim 13 wherein selecting a sequence of commanded actions comprises detecting personal user information on a respective portable interactive device and wherein providing a command signal comprises instructing the portable user device to provide user information stored on the portable user device.
15. A non-transitory machine-readable medium according to claim 14 wherein selecting a sequence of commanded actions comprises permitting users to upload selected information.
16. A non-transitory machine-readable medium according to claim 15 wherein selecting a sequence of commanded actions comprises constructing at least a group of users according to a preselected criterion.
17. A system for providing a shared interactive audience enhanced experience at an event comprising:
- a server;
- a communications link for coupling outputs from said server to a set of portable interactive devices in a venue;
- said server comprising a processor to provide selected content and selected commands;
- said processor providing addresses to select portable interactive devices for interaction;
- a receiver to receive signals from the portable interactive devices produced in response to respective commands;
- a data memory to store received signals from the portable interactive devices; and
- said processor comprising a rule for processing the device inputs for respective operating cycles.
18. A system according to claim 17 in which said processor is coupled to integrate successive results and wherein said processor comprises a program to produce composite results in response to receipt of inputs received from portable interactive devices.
Type: Application
Filed: May 15, 2013
Publication Date: Nov 21, 2013
Inventors: ANDREW MILBURN (LOS ANGELES, CA), THOMAS HAJDU (SANTA BARBARA, CA)
Application Number: 13/895,307
International Classification: H04L 29/08 (20060101);