METHOD AND APPARATUS FOR CREATING RULE-BASED INTERACTION OF PORTABLE CLIENT DEVICES AT A LIVE EVENT

A system creates an enhanced experience at a concert or other event. A central server provides commands through which information is sent or gathered, or both, between the server and portable interactive devices of audience members. A shared experience, such as a common display sent to all users, is created. A system "app" is installed in each portable interactive device. Rules are implemented in a processor to enable a game to be played by audience members, in which the processor enables interaction of users with the server and processes user inputs according to the rules of the game.

Description
CROSS REFERENCE TO RELATED APPLICATIONS

This patent application claims priority of Provisional Patent Application 61/648,593 filed May 18, 2012, Provisional Patent Application 61/670,754 filed Jul. 12, 2012, Provisional Patent Application 61/705,051 filed Sep. 24, 2012, Provisional Patent Application 61/771,629 filed Mar. 1, 2013, Provisional Patent Application 61/771,646 filed Mar. 1, 2013, Provisional Patent Application 61/771,690 filed Mar. 1, 2013, and Provisional Patent Application 61/771,704 filed Mar. 1, 2013, the disclosures of which are each incorporated by reference herein in its entirety.

BACKGROUND

1. Field of the Invention

The present subject matter relates to providing a shared experience to audience members communicated from a central server by creating rule-based individual and shared interactions with portable interactive devices of audience members.

2. Related Art

In order to enhance an audience's involvement in a live event such as a concert, video displays may be combined with a concert performance.

For example, United States Patent Application Publication No. 20120239526 discloses an interactive method and apparatus which provide limited interaction between a performer and concert attendees. The performer enters concert information into a server, which is then accessed wirelessly by an electronic device of a concert attendee. Animations from the server are dynamically displayed on the electronic device. In this arrangement, attendees may select a song to download or to view the lyrics. The user may select an encore screening to vote on a song to be played during an encore performance. In this arrangement, the attendee interacts only with previously stored information. No new information is generated to enhance the performance. In order to combine further information sources, whether local or accessed through the Internet, the system must provide sufficient bandwidth; otherwise, delays and gaps in the data will occur. In the past, it has generally been impossible to provide sufficient bandwidth through a venue connection. Possible interactions between a performer and an audience are therefore greatly limited.

United States Published Patent Application No. 20070292832 discloses a system for creating sound using visual images. Various controls and features are provided for the selection, editing, and arrangement of the visual images and tones used to create a sound presentation. Visual image characteristics such as shape, speed of movement, direction of movement, quantity, and location can be set by a user. These systems do not provide for interaction with the audience.

To the extent that audience interaction and provision of displays constructed for particular users have been provided, they have had very limited capabilities.

U.S. Pat. No. 8,090,321 discloses a method and system for wirelessly providing venue-based data to one or more hand held devices. The venue-based data can be authenticated and wirelessly transmitted to one or more hand held devices through one or more wireless telecommunications networks in response to authenticating the venue-based data. This method and system provide data to hand held devices. However, an interaction between a device and the venue data source is not disclosed.

United States Published Patent Application No. 20130080348 describes capturing event feedback and providing a representation of feedback results generated using the feedback indicia. The capturing of event feedback involves storing a plurality of different event records for each of a plurality of different events. The information is used by a program presenter for determining audience behavior in response to transmitted content. Production of an enhanced concert experience is not disclosed.

U.S. Pat. No. 7,796,162 discloses a system in which one set of cameras generates multiple synchronized camera views for broadcast of a live activity from a venue to remote viewers. A user chooses which view to follow. However, there is no plan for varying the sets of images sent to users. There is no uploading capability for users.

U.S. Pat. No. 6,731,940 discloses methods of using wireless geolocation to customize content and delivery of information to wireless communication devices. The communication devices send signals to a central control system. The method uses an RF receiving site including antenna array and a mobile device operated by a user. At least one p-dimensional array vector is derived from RF signals sampled from p antennas of an array, where p is an integer. At least one p-dimensional array vector is used to derive a location of the mobile device. The device addresses a data source in order to customize information in correspondence with the location. The customized information is transmitted to a user. A significant application of this system is to send and/or receive location-specific information of interest, such as targeted advertisements and special services, to travelers and shoppers. A control function is not provided for generating a display comprising an entertainment performance.

United States Published Patent Application No. 20110075612 discloses a system in which content is venue-cast. The content is sent to a plurality of receiving access terminals comprising portable interactive devices within a venue boundary. Content generated at an access terminal is transmitted to a venue-cast server. A venue-specific network could comprise a wide area network (WAN) or a Wi-Fi hotspot deployment. The system provides “unscheduled ad hoc deliveries” of content via the venue transmission system to provide venue visitors with venue related information. Content is specific to the venue and is not related to groups of users within the venue. The only function provided is a venue cast.

SUMMARY

Briefly stated, in accordance with the present subject matter, there are provided a system, method, and machine-readable medium comprising instructions to be executed on a digital processor for creating cooperatively determined video compositions and interaction between audience members to produce a composite result. A central server provides commands through which information is sent or gathered, or both, between audience members and the central server. Information received from audience members is processed to determine relationships between data from individual users. More specifically, in one form, rules may be implemented in a processor to enable a game played by audience members which is coordinated by a central control system.

In one preferred form, a Wi-Fi link is provided in a venue so that the ability to communicate is not limited by Internet or cellular system bandwidth constraints.

A program may issue commands to all interactive devices in an audience to produce a composite result. The present subject matter can direct portable interactive devices to perform functions in response to received signals from a central source including a central server. The central server may interact with a portable interactive device through an app created in accordance with the present subject matter. The app is installed in the portable interactive device.

Another composite result is implementation of a game in which the central server sends commands to gather selected information from each portable interactive device. The information is processed according to a preselected rule, and results are provided to users. A plurality of iterations of information gathering and processing may be performed. Various forms of information may be gathered and processed in accordance with different preselected rules in order to implement different games or information transmission.

BRIEF DESCRIPTION OF THE DRAWINGS

The present subject matter may be further understood by reference to the following description taken in connection with the following drawings:

FIG. 1, consisting of FIGS. 1A and 1B, is an illustration of the method and apparatus of the present subject matter operating in a venue;

FIG. 2 is a block diagram of the system illustrated in FIG. 1;

FIG. 3 is a block diagram illustrating one preferred form of effectively connecting client devices in a network;

FIG. 4 is a block diagram of a smartphone;

FIG. 5 is a flow chart illustrating one form of app for enabling portable user devices to interact in the system;

FIG. 6 is a block diagram illustrating the central server acting as a source communicating via a server with portable interactive devices;

FIGS. 7, 8, and 9 are each a flow chart illustrating interactive applications within the present system;

FIG. 10 is a flow chart illustrating a further use of data gathered from interactive devices; and

FIG. 11 is an illustration of a display of forms of interaction performed by the system such as a game or shared activity.

DETAILED DESCRIPTION

FIG. 1, consisting of FIGS. 1A and 1B, is an illustration of a venue 10 housing a system 2 in accordance with the present subject matter. FIG. 2 is a high-level block diagram of communication paths in the system illustrated in FIG. 1. FIGS. 1 and 2 are discussed at the same time. The system 2 may be used in conjunction with a live event, for example a concert. Two-way interactivity is provided between a central server 8 and individual audience members 4 who may each have a portable interactive device 6. The portable interactive device 6 may be a smartphone, tablet, or other device.

A central clock 9 synchronizes operations. The venue 10 may include a stage 12, audience area 14, a control room 16, and a media system 18 which may be located in the control room 16. The media system 18 receives audio, video, and intelligence from sources and may be operated to perform control room functions such as mixing, selecting, and processing. A video program 20 is shown on a display 22.

The media system 18 is used to couple outputs from a video source 26, a sound source 28, and other intelligence source 30. The video source 26 may comprise one or more television cameras 24. In the present illustration, a media source 34 includes the video source 26, sound source 28, and other intelligence source 30. The sound source 28 comprises audio output from a live performance provided by a performer or performers 40 coupled by transducers 42, such as microphones. Alternatively, one or more of the video source 26, the sound source 28, and other intelligence source 30 may comprise sources of streaming content, prerecorded content, stored data, or currently processed content from any source. These sources may be local, remote, or both.

In one preferred form the display 22 is a screen 50 that comprises a backdrop for the stage 12. The display 22 could comprise an array 52 of screens over which the video program 20 is distributed. In another form, often used in arenas, the display 22 could comprise a display unit 56 which includes a plurality of monitors 58 on one support 60, with each monitor 58 facing in a different direction. Examples of the display unit 56 are available under the trademark Jumbotron®.

The media system 18 is operated by a VJ 70. The VJ 70 may comprise one or more personnel or a programmed computer. It is not essential that the control room 16 be located at the venue 10. The media system 18 provides content to a concert network controller 100. The concert network controller 100 may both receive and transmit information. The concert network controller 100 provides an input to a display link 102, which is coupled by a patch panel 104 to the display unit 56.

The concert network controller 100 may also comprise a Wi-Fi hotspot 120 providing and receiving signals to and from an audience area 14. As further described below, content may be provided both to and from audience members 4. Audience members may also be informed of information relating to the composite responses from all audience members. The concert network controller 100 may also interact with remote participants 140. In another form, a Wi-Fi system 124, discussed below with respect to FIG. 2, couples audience members 4 to interact with the system 2.

The concert network controller 100 is preferably wirelessly connected to an event server 130, which can provide communications between remote participants 140 and the concert network controller 100. The event server is coupled to a content editor 134, which interacts with a staging server 136. The staging server 136 may be coupled to the remote participants 140 by a network, for example, the Internet 144.

Communications will be provided between a target system and a source system. In the present description, “source system” is a device that wishes to send a message to a “target system.” The target system is a device that is configured to receive sent messages via its operating system provided network connection subsystem. The business logic running on the device can select as needed to operate as the target or the source system at any moment. Operating as a source system or target system for a particular messaging transaction does not preclude operating as the other system for a different messaging transaction simultaneously.
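The source/target relationship described above can be illustrated with a brief sketch (an editorial illustration, not part of the disclosed embodiment; the class and method names are hypothetical). A single device acts as the source system for one messaging transaction while simultaneously acting as the target system for another:

```python
# Illustrative sketch of source/target roles. A device is a "source system"
# for a transaction in which it sends, and a "target system" for a
# transaction in which it receives; the roles are per-transaction.

class Device:
    def __init__(self, name):
        self.name = name
        self.inbox = []          # messages received while acting as a target system

    def send(self, target, payload):
        """Act as the source system for this messaging transaction."""
        target.receive(self.name, payload)

    def receive(self, sender, payload):
        """Act as the target system for this messaging transaction."""
        self.inbox.append((sender, payload))


phone = Device("phone-1")
server = Device("central-server")

# The same device is a source in one transaction and a target in another.
phone.send(server, "accelerometer-data")
server.send(phone, "display-command")
```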

In a nominal application, thousands of portable user devices 6 may communicate with the concert network controller 100. The communication will provide interaction for intended uses of the system 2. This alone could strain resources and require expensive T1 access lines far beyond the capacity normally utilized within a concert venue. Providing such capacity would be both expensive and impractical. Additionally, users 4 have the option to operate their portable user devices 6 in order to access the Internet and cell phone services. It is important to limit bandwidth requirements to accommodate a large number of portable user devices 6. This can be accomplished by disabling access to applications that are not part of the entertainment functions of the system 2. For purposes of the present description, the applications contributing to functioning of the system 2 are referred to as business logic.

FIG. 3 is a block diagram illustrating one preferred form of effectively connecting client devices in a network. In this embodiment, individual portable interactive devices 300-1 through 300-n such as smartphones or tablet computers each store an application, or app. Each portable interactive device contains its own library 302 and a program memory 304 storing an app 306. The portable interactive device 300 includes a processor 310. Additionally, each portable interactive device may include a display 316, and a graphical user interface 320 of the type normally found on a smartphone. The portable interactive devices 300 each interact via a communications link 330 with a video processor 340. In one preferred form the communications link 330 comprises the Wi-Fi link 120. Interaction is commanded from the central server 8.

The enhanced shared experience by the users 4 may include receiving common displays from the concert controller 100. An example might be each portable interactive device 300 being commanded to display the same solid color. Another alternative is providing different displays to each of a group 662 (FIG. 6) of portable interactive devices 300. As further described below, inputs from the portable interactive devices 300 may be read by the central server 8. Commands may be provided from the central server 8 to respond to the mood of an event and to create an evolving experience to all participants.

In accordance with a further aspect of the present subject matter, the app 306 is provided to coordinate the functions described herein. The operation description acts as a description of the software architecture of the app 306. The app 306 may be supplied for each portable interactive device through a number of means. It may be supplied via the Internet 350 or via a cell phone connection 352 from a software company 360. The software company may comprise a software developer. Apps 306 that are written by developers may be downloaded from a source such as the iTunes store for iOS phones or Google Play for Android phones.

FIG. 4 is a block diagram of the internal circuitry of a nominal smartphone 400 utilizing an app in accordance with the present subject matter. A processor 410 controls operation of the smartphone 400 and communications with the system 2 (FIG. 1). Wi-Fi communication is made through an RF module 414 coupled to the processor 410. The smartphone 400 is preferably of a type equipped with transducers. In one form, the processor 410 receives condition-responsive inputs from many different sources. These sources may include a camera lens 420. Ambient physical parameter sensors may include a humidity sensor 422, gyroscope 424, digital compass 426, atmospheric pressure sensor 428, and temperature sensor 430. An accelerometer 432 and a capacitive touch sensor 434 sense movements made by a user 4.

The processor 410 also interacts with an audio module 440 coupled to a microphone 442 and to a speaker 444. Functions connected with the use of the camera lens 420 and associated circuitry within the processor 410 include an ambient light sensor 450, flash lamp 452, and optical proximity sensor 454. Memories associated with the processor 410 include a data memory 460 and a program memory 470. The data memory 460 stores data and provides data as commanded by a program in the program memory 470.

The app 306 is loaded into the program memory 470 when downloaded by a user 4. When the app 306 is activated by the user 4, the smartphone 400 will respond to commands from the central server 8. Information will be uploaded from or downloaded to the smartphone 400. When downloading an app, a user grants permissions for the app to access and upload data, control selected functions, and download data. This grant of permissions may be explicit or it may be made by default.

Permissions utilized by the app 306 may include permission to: modify or delete USB storage contents; discover a list of accounts known by the phone; view information about the state of Wi-Fi in the phone; create network sockets for Internet communication; receive data from the Internet; locate the smartphone 400, either by reading a coarse network location or a fine GPS location; read the state of the phone, including whether a call is active, the number called, and the serial number of the smartphone 400; modify global system settings; retrieve information about currently and recently running tasks; kill background processes; or discover private information about other applications. Consequently, an app 306 can be provided that will readily access the forms of information discussed below from a smartphone 400. The serial number of the smartphone 400 may be used to compose a Unique Identification Number (UID). Other ways of assigning unique identification numbers include assigning numbers as users 4 log on to a session.
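The two UID strategies mentioned above, composing an identifier from the device serial number or assigning numbers as users log on, can be sketched as follows. This is an illustrative example only; the hashing and numbering schemes are assumptions, not part of the disclosure:

```python
# Hypothetical sketch of two UID-assignment strategies.
import hashlib
import itertools

def uid_from_serial(serial):
    """Compose a stable UID from a device serial number (hashed here
    so the raw serial is not exposed; the hashing is an assumption)."""
    return hashlib.sha256(serial.encode()).hexdigest()[:12]

_session_counter = itertools.count(1)

def uid_from_logon():
    """Assign the next sequential UID as a user logs on to a session."""
    return "session-%06d" % next(_session_counter)
```

The serial-number approach yields the same UID across sessions, while session log-on numbering yields compact identifiers valid only for one event.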

FIG. 5 is a flowchart illustrating one form of software used for the app 306. The app 306 may also enable functions further described below. The app 306 may take many different forms in order to provide the functions of enabling interaction of the portable interactive devices 300 (FIG. 3) and the central server 8. At block 500, a command input is created and sent to the central server 8. The command may be created by the VJ 70 or invoked by an automated program. At block 502, the command input accesses a stored command which is transmitted from the central server 8 via the Wi-Fi transmitter 120 (FIG. 1A). At block 504, the command signal is received by the RF module 414 (FIG. 4) in the smartphone 400. The command signal is translated to the program memory 470 at block 506 in order to access a command. At block 508, entitled “access data,” the command is provided from the program memory 470 in order to access appropriate locations in the data memory 460 for uploading or downloading information. At block 510, the data that is the subject of the command is exchanged between the central server 8 and the smartphone 400.
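The command flow of FIG. 5 can be summarized in a short sketch, assuming a dictionary as the "program memory" of stored commands and another as the "data memory." All structures and command names here are illustrative stand-ins, not the actual software architecture of the app 306:

```python
# Minimal sketch of the FIG. 5 flow: a received command name is looked up
# in program memory, and the selected command accesses data memory in order
# to upload or download information.

data_memory = {"photos": [], "profile": {"age": 34}}

def upload_profile(data):
    return data["profile"]           # information flows device -> server

def store_photo(data, payload=None):
    data["photos"].append(payload)   # information flows server -> device
    return "stored"

program_memory = {"UPLOAD_PROFILE": upload_profile, "STORE_PHOTO": store_photo}

def handle_command(name, payload=None):
    """Receive a command signal, access the stored command, and exchange data."""
    command = program_memory[name]
    if payload is None:
        return command(data_memory)
    return command(data_memory, payload)
```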

FIG. 6 is a detailed partial view of the block diagram in FIG. 1 illustrating the central server 8 and interconnections. The central server 8 is coupled via a data bus 600 to the media system 18 in order to receive preprogrammed selections or selections made by the VJ 70. The central server 8 is also coupled via the data bus 600 to the concert network controller 100. The central server 8 sends commands and information via the concert network controller 100 to the portable interactive devices 6 and receives information from the portable interactive devices 6. The central server 8 comprises a data memory 610 and a program memory 620. A processor 630 coordinates operations. The specific requests may be made by the VJ 70 through a GUI 650 at the media system 18.

Particular parameters to be requested in order to achieve varying composite results are explained further with respect to FIGS. 7-9.

The central server 8 may permit users to log in to their Facebook accounts when joining a session. Using the permissions described above, the central server 8 scrapes demographic information about each user, such as age, gender, and location, and stored information from the user's device, such as favorite musical artist, the number of prior shared-experience events attended, or other information. Selected information may be shown graphically on the display 22, as shown in FIG. 11, to inform the users 4 of audience information.

The system illustrated in FIG. 6 supports messaging between individuals 4 in the audience. Users 4 are allowed to initiate contact with other audience members. A user 4 may operate the GUI 320 (FIG. 3) to select a particular audience member, to select an entire category, or to request the central server 8 to produce a new category in accordance with the wishes of the user 4. Information is gathered and used as described with respect to FIG. 6 above and FIGS. 10 and 11 below to allow a user 4 to search for characteristics of other users 4. For example, a user 4 can request a display of a graph of gender and age and choose to see only the list of men between the ages of 30 and 50 who have joined the session. The user 4 may post a text message to only that subset of users.

In order to group devices, a device family segregation method is employed. The controller-only framework includes provision to segregate client devices into arbitrary groupings based on any provided metric. Groups 662 may be generated from any available information. For example, a group 662 may include selected identification numbers of portable interactive devices 6. A group 662 may include location data of each user device 6 as received from the device. Another group 662 may comprise demographic information gathered from the user devices 6. In order to address a selected segment of an audience, data from a selected group 662 is mapped into an address list. This selection may be made by the VJ 70 via the central server 8. Displays may be provided to the user devices 6 so that an individual group 662 receives an input. Additionally, different groups may be provided with different information or displays in order to produce a composite display for all or part of a venue 10. Groups 662 may also consist of individual users 4.

Criteria for establishing a group 662 include a component's or user's role in interaction with the group. Components and attributes to be factored into criteria include the concert network controller, Jumbotron®, OCS, remote client, gender, age, location within the venue, device type, e.g., iOS, Android, HTML5, Windows, OS X, or Linux, device families including iPhone 3G, 3GS, 4/4S, 5, iPad, iPad 2, and iPad 3, and random selection. In addition, the framework allows the creation, collection, and persistence of arbitrary metrics for use as segregation criteria. Individual devices can be added to or removed from any grouping as desired. Groups 662 may be mutually exclusive or can overlap in any manner desired. Once defined, the groupings can be used as targets for specific commands.
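The segregation of client devices into groups based on an arbitrary metric, and the mapping of a group into an address list for targeted commands, can be sketched as follows. The record fields and group definitions are hypothetical examples:

```python
# Illustrative sketch of device-family segregation: groups are formed from
# any provided metric, may overlap, and are mapped into address lists.

devices = [
    {"uid": "d1", "os": "Android", "age": 35, "gender": "M"},
    {"uid": "d2", "os": "iOS",     "age": 22, "gender": "F"},
    {"uid": "d3", "os": "iOS",     "age": 44, "gender": "M"},
]

def make_group(devices, predicate):
    """Segregate devices by any provided metric; groups may overlap."""
    return [d for d in devices if predicate(d)]

def address_list(group):
    """Map a group into an address list for targeted commands."""
    return [d["uid"] for d in group]

ios_group = make_group(devices, lambda d: d["os"] == "iOS")
men_30_to_50 = make_group(
    devices, lambda d: d["gender"] == "M" and 30 <= d["age"] <= 50)
```

Note that device "d3" belongs to both groups, illustrating that groupings need not be mutually exclusive.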

Simplified access to selected users 4 is provided. Additionally, users 4 can be enabled to request specific data. The sorting function supports the creation of arbitrary data elements such as groupings or commands for later reference by name. This can be used to send a specific complex or often used command or to define a grouping of devices.

FIGS. 7, 8, and 9 are each a flow diagram illustrating interactive applications within the present system.

A first example of an interaction producing a composite result is illustrated in FIG. 7. In this case, the VJ 70 addresses the central server 8 via the GUI 650 (FIG. 6) in order to select a "timed group photo" command. This initiates a countdown period at the end of which the portable interactive devices 6 are commanded to take a flash picture. At block 700, the VJ 70 initiates the command. At block 702, the command is translated via the data bus 600 to address the processor 630 (FIG. 6). At block 704, the timed group photo command is accessed from the program memory 620. Also at block 704, the command is coupled to the concert network controller 100 for transmission by the Wi-Fi transceiver 124 (FIG. 1A). At block 706, the command is received by the RF module 414 (FIG. 4) in the smartphone 400. At block 708, the command is coupled via the processor 410 to access appropriate commands from the program memory 470, and more specifically from the app 306. The program memory 470 operates the smartphone 400 by coupling signals to appropriate modules at block 720, which contains blocks 722 and 724. A first portion of the command within block 720 is at block 722, at which a counter in the processor 410 initiates a countdown and produces an image on the display 416. The portable interactive devices 6 display a synchronized countdown message, such as "3 . . . 2 . . . 1 . . . " At the end of the countdown, operation proceeds to a second portion of the block 720, namely block 724. At block 724, the flash 452 in each smartphone 400 is activated. The smartphones 400 all flash at substantially the same time. A system and method for executing a command at the same time is disclosed in commonly owned patent application serial number 2053U11, the disclosure of which is incorporated herein by reference. The processor 410 enables optical information from the camera lens 420 to be loaded into the data memory 460 to store a picture.
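The countdown portion of the timed group photo command can be sketched as follows, assuming the server distributes a shared fire time derived from the central clock 9 so that each device counts down locally and flashes at substantially the same moment. The function and parameter names are hypothetical:

```python
# Illustrative sketch of the timed group photo countdown: given the current
# time and a server-designated fire time, a device produces the countdown
# messages to display and the moment at which to activate its flash.

def plan_countdown(now, fire_at, step=1.0):
    """Return the countdown messages ("3", "2", "1") and the flash time."""
    remaining = int(round((fire_at - now) / step))
    messages = [str(n) for n in range(remaining, 0, -1)]
    return messages, fire_at

messages, flash_time = plan_countdown(now=100.0, fire_at=103.0)
```

Because every device computes the countdown against the same fire time rather than against its own message arrival time, network latency does not desynchronize the flashes.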

Further in accordance with the timed group photo command, at block 730 each picture is transmitted via the RF module 414 for transmission back to the event server 130, which contains memory for storing the received pictures. At block 732, various tags may be added to each picture. Most commonly, the tag will comprise a timestamp. Processing is performed at block 734 in order to obtain a result which comprises a composite of the interaction of the portable interactive devices 6 and the system 2. This result may take many forms. For example, the VJ 70, at block 734, can create collages and other photographic displays of the resulting images. As indicated in the loop connection from block 736 back to block 734, this operation can be repeated over the course of an event. This allows accumulation of a large photo archive of the event itself on the event server 130 or the central server 8. The stored images can also be mined for souvenir material after the event. This data can be used to create a searchable, time-stamped record of the event which can be published later.

Another form of interaction is illustrated in FIG. 8. At block 760, the VJ initiates a command file via the graphical user interface 650 (FIG. 6) in order to choose a set of portable interactive devices 6 to be commanded. At block 762, a command is issued for enabling the imaging function on the selected portable interactive devices 6 and for commanding activation of the camera flash 452, indicating the location of the active cameras within the audience. At block 764, the signal is transmitted to the RF module 414 in each selected portable interactive device 6. At block 766 the command signal is translated to the processor 410, and the imaging function is executed.

At block 770, the images obtained are sent to the event server 130 and may be sent to the data memory 610 in the central server 8. At processing block 772, the VJ 70 processes the images. At block 774, images are displayed on the big screen 50 or sent to all of the audience or to selected members of the audience.

The system will provide the ability for users who employ "client devices" to upload content to the system. Examples of uploaded content include photos or short video clips. Users may access content for uploading in a variety of ways. Users may take pictures from their client devices. They may access shared material from social media. The user may access a storage area of a client device and may select pictures or other items from storage files. When a first and a second user are connected in selected social applications, each user may also choose content from the other user's library.

Uploaded content may be reviewed at the controller 100 (FIG. 1A). Content review may take many forms. Content review may be manual, automated, or both. The VJ 70 and automated criteria measurement subsystems can browse through uploaded submissions. The uploaded submissions may be handled individually or in groups.

FIG. 9 illustrates a further form of composite result. A game is implemented based on physical actions of audience members. In one form the game compares how much energy users 4 can apply to their respective smartphones 400.

At block 800, a command is selected by the VJ 70. Many forms of "game-like" interactions can be commanded. Upon a command from the controller 100 at block 802, an interaction called "Shaking" is initiated when a command is issued to enable reading of the accelerometers 432 in the smartphones 400. Each smartphone 400 provides a message to its user on its display 416 that says "Shake your phone!" Each user 4 then begins shaking the respective smartphone 400. The issued command derives output from the accelerometer 432 of each smartphone 400 and transmits data back to the controller 100. At block 804, the accelerometers 432 are read and the information is sent back to the central server 8. The accelerometers 432 of the smartphones 400 provide a substantially real-time dataset of the rates of motion and amounts of kinetic energy being expended by the audience members. Tags may be attached at block 806. Data is stored at the central server 8 at block 808. In a loop, data from the central server 8 is integrated at block 810 and stored again at block 808 to provide updated processed data. Rule-based processing is used at block 812 to determine preselected information derived from processing the stored data. For example, data indicative of a user 4's movements may be used to characterize the kinds of kinetic energy being created by users, who may be dancing, swaying, waving their hands, or standing idle.
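One possible rule for the "Shaking" interaction reduces each device's accelerometer samples to a kinetic-energy proxy that the server can rank. The specific metric below (sum of squared acceleration magnitudes) is an assumption for illustration; the disclosure leaves the exact rule open:

```python
# Illustrative rule-based processing for the "Shaking" game: per-device
# accelerometer samples are reduced to an energy score and ranked.

def energy_score(samples):
    """Sum of squared acceleration magnitudes as a kinetic-energy proxy."""
    return sum(x * x + y * y + z * z for (x, y, z) in samples)

readings = {
    "d1": [(0.1, 0.0, 0.0), (0.2, 0.1, 0.0)],   # nearly idle
    "d2": [(2.0, 1.0, 0.5), (1.5, 2.5, 1.0)],   # shaking vigorously
}

# Rank participating devices from most to least energy expended.
ranking = sorted(readings, key=lambda uid: energy_score(readings[uid]),
                 reverse=True)
```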

A rule is applied over successive operating cycles, usually clock periods, to update status and keep games current.

This information can be processed in a number of ways. For example, a ranking can be assigned based on physical attributes, such as applying the most energy to a smartphone, maintaining the most constant rhythm, or performing according to some other criterion. This process can produce a ranking of all the participating devices. The VJ 70 can then command the display of the “winners” or “leaders” of the ranking.
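The device ranking described above reduces to sorting per-device scores; a minimal sketch, assuming scores have already been computed per device (the data structure is hypothetical):

```python
def rank_devices(score_by_device, top_n=5):
    """Rank device ids by their measured score, highest first.

    score_by_device: dict mapping device id -> numeric score (for
    example, accumulated shake energy or rhythm-constancy measure).
    Returns the ids of the top_n "leaders" for display.
    """
    ranked = sorted(score_by_device.items(), key=lambda kv: kv[1], reverse=True)
    return [device_id for device_id, _ in ranked[:top_n]]
```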

The central clock 9 also allows solution of problems found in traditional distributed multiplayer gaming. Consider a shooting game in which players are instructed to “draw-and-shoot” their weapons as soon as they see a special signal appear, either on the big screen 50 at the venue 10 or on their respective portable user devices 6. A timestamp signal from the central clock 9 may be associated with each “BANG” message at the time a response is commanded from a user device 6. A winner is determined by comparison of timestamps rather than by the arrival times of the “BANG” messages at the central server 8.
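The timestamp comparison neutralizes unequal network latency: each reaction time is the difference between two central-clock timestamps, so a slow uplink cannot cost a player the win. A sketch, with a hypothetical response format:

```python
def fastest_draw(responses):
    """Pick the winner by reaction time computed from central-clock timestamps.

    responses: list of (player_id, signal_ts, tap_ts) tuples, where
    signal_ts is when the special signal was shown and tap_ts is when
    the player's "BANG" was registered, both on the shared central
    clock. Server arrival order is irrelevant.
    """
    return min(responses, key=lambda r: r[2] - r[1])[0]
```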

Another game is a scavenger hunt. A “scavenger hunt” may be implemented by providing codes associated with one group 662, or subset of users 4, which another group 662 requires to complete a game or puzzle. The second set of users 4 has to “ping” other users to access the required code.
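The code-exchange mechanic can be sketched as follows; the dictionary-based "ping" stands in for an actual device-to-device message exchange, and all names here are hypothetical.

```python
def collect_fragments(holders, pings):
    """Gather code fragments by "pinging" users in another group.

    holders: dict mapping holder user id -> secret code fragment
    pings: the sequence of user ids the seeking group contacts.
    Returns the assembled code once every fragment has been collected,
    otherwise None (the puzzle is still incomplete).
    """
    collected = [holders[uid] for uid in pings if uid in holders]
    if len(collected) == len(holders):
        return "".join(collected)
    return None
```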

These interactions promote interplay and communication in the physical space among people attending the event. By addressing messages to individual devices 6, lotteries can be conducted and winners and losers can be informed of individual outcomes.

Utilizing device addressing also facilitates messaging between individuals in the audience either one-on-one, one-to-many, or many-to-one. In this example, “many” comprises a group 662.

FIG. 10 is a flow chart illustrating gathering of data from interactive devices 6. FIG. 11 is an illustration of a display 950, which may be produced by the method of FIG. 10. FIGS. 10 and 11 are discussed together. This technique may be used to produce “dynamic statistics.” At block 900, the VJ 70 issues a command to gather data. The selected systems within the smartphones 400 are queried at block 902. At block 904, data is received and sent to the data memory 610 in the main server 8.

At block 906, selected data can be arranged for display. At block 908, the data is displayed. For example, plot 954 is an illustration of distributions of kinetic energy readings received from the smartphones 400. Displays may be provided on the large screen 50 as well as on the displays 416 of the smartphones 400. Since the app 306 in one preferred form has a wide range of permissions, virtually any data within a smartphone 400 can be accessed. This data can include information scraped from the Facebook social graph, such as a map of home locations represented in a crowd, as seen in the map graphic 956. Statistics may be repeatedly collected from the contents of the libraries 302 across all the devices, from local light and sound levels over time, or from accelerometer information over time. The repeated collection updates the computed values, thus providing dynamic statistics.
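The "dynamic statistics" idea — repeated collection updating a running computed value — can be sketched with a small accumulator. The class and field names are assumptions for illustration only:

```python
class DynamicStat:
    """Running statistic over repeated sensor polls.

    Each call to update() folds in a new batch of readings gathered
    from the devices, so the computed mean stays current as collection
    repeats over the course of the event.
    """

    def __init__(self):
        self.count = 0
        self.total = 0.0

    def update(self, readings):
        for r in readings:
            self.count += 1
            self.total += r

    @property
    def mean(self):
        return self.total / self.count if self.count else 0.0
```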

For example, the act of dancing may be measured by the accelerometer 432 instrumentation in the smartphones 400. The processor may register measurements to determine the top five most active “dancers” in the audience. By virtue of the downloading of social network information corresponding to particular users, the system may access the Facebook picture of each of the five most active “dancers,” as seen in panel 960. Another form of statistical information can be gathered by the geolocation transducers in the smartphones 400. The system can measure the amount of physical movement of each smartphone 400 and then display a list 962 of the most “restless” users.
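Pairing the most active users with their profile pictures for panel 960 reduces to a sort plus a lookup; a sketch under the assumption that activity scores and scraped picture references are held in simple dictionaries (hypothetical structures):

```python
def top_dancers(activity_by_user, pictures, n=5):
    """Return the n most active users paired with their profile pictures.

    activity_by_user: dict mapping user id -> activity score derived
    from accelerometer data. pictures: dict mapping user id -> picture
    reference scraped from the social graph (may be missing for some
    users, in which case None is paired).
    """
    leaders = sorted(activity_by_user, key=activity_by_user.get, reverse=True)[:n]
    return [(uid, pictures.get(uid)) for uid in leaders]
```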

The above description is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the spirit or scope of the invention. For example, one or more elements can be rearranged and/or combined, or additional elements may be added. A wide range of systems may be provided consistent with the principles and novel features disclosed herein.

Claims

1. A method for providing a shared interactive experience to respective users of a set of portable interactive devices at an event in a venue comprising:

selecting a set of commanded actions defining a shared interactive experience;
defining a sequence of signals to provide the shared experience;
transmitting to portable interactive devices content and command signals in correspondence with the defined sequence;
receiving response signals from the portable interactive devices produced in response to the sequence of signals;
processing the response signals according to a rule for respective operating cycles to provide processed data; and
selecting processed data to be embodied in a form of the shared interactive experience.

2. A method according to claim 1 further comprising the step of conducting a game for the set of users by providing data instructing users to perform an action as an element of performance of the game.

3. A method according to claim 2 further comprising establishing a form of processing in accordance with the rules of the game.

4. A method according to claim 3 wherein users are instructed to perform an action to be sensed by a transducer in a respective portable interactive device, and wherein processing the outputs comprises comparing outputs and informing users of comparative performances in accordance with game criteria.

5. A method according to claim 4 wherein the step of informing users comprises creating a graphic representation of game results and providing the representation to a venue display.

6. A method according to claim 1 wherein selecting a sequence of commanded actions comprises detecting personal user information on a respective portable interactive device and wherein providing a command signal comprises instructing the portable user device to provide user information stored on the portable user device.

7. A method according to claim 6 wherein selecting a sequence of commanded actions comprises permitting users to upload selected information.

8. A method according to claim 7 wherein selecting a sequence of commanded actions comprises constructing at least a group of users according to preselected criteria.

9. A non-transitory machine-readable medium that provides instructions which, when executed by a processor, cause said processor to perform operations comprising:

selecting a set of commanded actions defining a shared interactive experience;
defining a sequence of signals to provide the shared experience;
transmitting to portable interactive devices content and command signals in correspondence with the defined sequence;
receiving response signals from the portable interactive devices produced in response to the sequence of signals;
processing the response signals according to a rule for respective operating cycles to provide processed data; and
selecting processed data to be embodied in a form of the shared interactive experience.

10. A non-transitory machine-readable medium according to claim 9 further causing the processor to perform the step of conducting a game for the set of users by providing data instructing users to perform an action as an element of performance of the game.

11. A non-transitory machine-readable medium according to claim 10 that causes the processor to perform the further step of establishing a form of processing in accordance with the rules of the game.

12. A non-transitory machine-readable medium according to claim 11 wherein users are instructed to perform an action to be sensed by a transducer in a respective portable interactive device, and wherein the medium causes the processor to perform the further step of processing comparing outputs and informing users of comparative performances in accordance with game criteria.

13. A non-transitory machine-readable medium according to claim 12 wherein the step of informing users comprises creating a graphic representation of game results and providing the representation to a venue display.

14. A non-transitory machine-readable medium according to claim 13 wherein selecting a sequence of commanded actions comprises detecting personal user information on a respective portable interactive device and wherein providing a command signal comprises instructing the portable user device to provide user information stored on the portable user device.

15. A non-transitory machine-readable medium according to claim 14 wherein selecting a sequence of commanded actions comprises permitting users to upload selected information.

16. A non-transitory machine-readable medium according to claim 15 wherein selecting a sequence of commanded actions comprises constructing at least a group of users according to preselected criteria.

17. A system for providing a shared interactive audience enhanced experience at an event comprising:

a server;
a communications link for coupling outputs from said server to a set of portable interactive devices in a venue;
said server comprising a processor to provide selected content and selected commands;
said processor providing addresses to select portable interactive devices for interaction;
a receiver to receive signals from the portable interactive devices produced in response to respective commands;
a data memory to store received signals from the portable interactive devices; and
said processor comprising a rule for processing the device inputs for respective operating cycles.

18. A system according to claim 17 in which said processor is coupled to integrate successive results and wherein said processor comprises a program to produce composite results in response to receipt of inputs received from portable interactive devices.

Patent History
Publication number: 20130311566
Type: Application
Filed: May 15, 2013
Publication Date: Nov 21, 2013
Inventors: ANDREW MILBURN (LOS ANGELES, CA), THOMAS HAJDU (SANTA BARBARA, CA)
Application Number: 13/895,307
Classifications
Current U.S. Class: Computer Conferencing (709/204)
International Classification: H04L 29/08 (20060101);