Media enhanced gaming system

An integrated group of systems, processes, and controls that enable real-time/near real-time media (video and audio) enhancement and capabilities in a gaming environment. Media from a variety of sources may be streamed or pushed to either individual gaming terminal devices, a group of these devices, or an entire network of such units. Additional system functionality allows for two-way interactive visual and audio communications between gaming terminal users/operators and call center personnel as well as providing a standard interface to interact with existing retail sales-oriented equipment that may exist at the installation location.

Description
CROSS-REFERENCE TO RELATED APPLICATION

This application claims the benefit of U.S. Provisional Application No. 60/590,255, filed Jul. 22, 2004, the entirety of which is hereby fully incorporated herein by this reference.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The invention relates generally to gaming and lottery systems. More particularly, the invention relates to systems, processes and controls that allow for the use of modern video and audio compression processes along with high-bandwidth communications circuits to bring media-rich services to the gaming and lottery environment.

2. Description of the Related Art

Traditionally, graphics and other media presented to the operators, players, and other persons present at a gaming establishment have been either pre-generated (canned) or message-based content. An example of such a gaming system is a Keno game implemented by a state lottery authority. The graphic content resides on the gaming terminal and is presented through various interfaces. This content is either downloaded from the central data center(s) during off-hours or via background downloads during operational hours. Message-based content is pushed out to the gaming terminals from a centralized console and presented, usually via a dot-matrix type display. The security required to maintain system integrity typically prevents advanced computer features from being added to the real-time play of the game because of the need to protect the data flow of the game.

These relatively crude methods, by today's standards, place limits on both the quality of the content and the quantity of unique content to present. These deficiencies manifest themselves as players losing interest in the games quickly, which results in lowered sales and/or participation. To attract players, increase their interest, and provide general information, the gaming industry has traditionally relied upon these rudimentary graphics and printed products. What is needed, therefore, is a media-rich method for attracting and informing players of secure game offerings in a real-time environment.

SUMMARY OF THE INVENTION

The present invention provides an improved gaming system which overcomes some of the deficiencies of the known art. In one embodiment, the system is comprised of several hardware and software components which embody and enable core functionality. It is this core design that integrates known encoding schemes with new software and processes to enable ground-breaking media-rich delivery from a central site to remote gaming venues.

In one embodiment, the invention is a system for providing media to users at secure remote gaming locations that includes one or more secure gaming terminals located at remote locations on a communication network, with the one or more secure gaming terminals each allowing a user to play and wager in a game of chance. The system includes at least one media server on the communication network that determines the usable media for the one or more secure gaming terminals, such as multimedia, live video, etc. One or more media feeds in the system selectively feed media to the media server, and the media server selectively distributes the appropriate media content from the one or more media feeds to the one or more secure gaming terminals, preferably during game play. The system can include an assistance server, such as a telephone call center, to help the players and others at the remote terminals.

In one embodiment, the invention is a method for providing media to users at secure remote gaming locations that includes the steps of hosting a game of chance at the one or more secure gaming terminals located at remote locations on a communication network, with the one or more terminals each allowing a user to play and wager in the game of chance, then feeding media content from one or more media feeds to a media server, with the media server determining the usable media for the one or more secure terminals. The method then includes the step of distributing the appropriate media content from the media server to the one or more secure gaming terminals at least during the game of chance.

The present invention therefore provides a media-rich environment at the secure gaming terminal that can both attract and inform players of secure game offerings, even in a real-time environment. Such function is advantageous because it increases player interest and can provide a simplified delivery of general information and instruction.

Other objects, features and advantages of the present application will become apparent after review of the hereinafter set forth Brief Description of the Drawings, Detailed Description of the Invention, and the Claims.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic illustration of an embodiment of a media gaming system of the invention.

FIG. 2 is a schematic illustration of an embodiment of a media server of the invention.

FIG. 3 is an illustration of a video call center for use with the invention.

FIG. 4 is an illustration of a discrete terminal system for use with the invention.

FIG. 5 is an illustration of an integrated terminal system for use with the invention.

FIG. 6 is a flowchart of media server operations.

FIG. 7 is a flowchart of main conferencing operations.

FIG. 8 is a flowchart of call center operations.

DETAILED DESCRIPTION OF THE INVENTION

Referring now to the drawings, in which like reference numbers indicate like parts throughout the several views, and in particular to FIG. 1, the main content delivery system 10 is based upon a Media Server/Sequencer System 12, which is responsible for controlling content type, mix, and delivery. The uniqueness of this core device is found in the software and system interfaces driving its operation. The server will accept various types of media input via industry-standard hardware interfaces such as composite, component, and S-video ports. Additional content is available via encoded media stored locally on a mass storage device 14 or over the communications network 26.

Standard raw media content is passed through the aforementioned standard hardware ports and encoded using well-known and available encoding algorithms. The various types of media that can be processed by the system could be third-party video feeds 16, computer generated graphics 18, and live broadcast content 20. It is the availability of this real-time media and the ability to deliver this content that differentiates this system from those traditionally used and currently available within the industry.

Once this media is available, the sequencing and control logic within the server provides a method to distribute the content to the desired gaming devices over the communications network 26. This distribution can entail a single remote device, a group of these devices, or the entire installed base of devices. The specialized software within the Media Server/Sequencer System 12 controls this distribution via standard Internet Protocol (IP) unicast, multicast, and broadcast methods.
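
The delivery-method selection described above can be summarized in a short illustrative sketch. This is not code from the disclosure; the terminal structure and function name below are assumptions made solely for illustration.

```python
# Illustrative sketch only: selecting an IP delivery method for a media feed.
# The Terminal structure and choose_delivery() name are hypothetical.
from dataclasses import dataclass

@dataclass
class Terminal:
    terminal_id: str
    ip_address: str

def choose_delivery(targets: list[Terminal], installed_base: int) -> str:
    """Pick unicast, multicast, or broadcast based on the size of the target set."""
    if len(targets) == 1:
        return "unicast"        # a single remote device
    if len(targets) < installed_base:
        return "multicast"      # a defined group of devices
    return "broadcast"          # the entire installed base of devices

group = [Terminal("T-001", "10.0.1.11"), Terminal("T-002", "10.0.1.12")]
print(choose_delivery(group, installed_base=250))   # -> multicast
```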

For the far-end gaming terminal locations, two methods of providing media functionality can be utilized. In discrete system locations 28, the existing terminal device 36 is not capable of handling the media content. This could be due to the terminal either being a third-party device or not having the processing power/interfaces to accommodate this feature. In these instances, a separate Media Processor 32 with corresponding media interface devices 34 would be installed to permit delivery of content.

The integrated method is utilized where the terminal device is controlled by the system licensee and it has the ability to handle the media processing tasks. In this scenario at an integrated system remote location 30, the terminal with integrated media capabilities 38 contains the necessary software and interfaces to provide for the delivery of content. These interfaces handle the connections to the various media interface devices 40.

Due to the media-rich capabilities of the remote device locations, they now lend themselves easily to serving as a source of media input. Since they already contain a means of displaying video and producing audio output, the incorporation of readily available video camera and microphone technology provides the capability for the remote location to send video and audio back to the Media Server/Sequencer System 12. This capability enables video conferencing features that can be utilized by the Call Center Media Controller/Queuing System 22.

The Call Center Media Controller/Queuing System 22 is designed to function as an add-on system as well as a standalone offering to customers. Designed around the same core processes and functionality of the Media Server/Sequencer System 12, this system provides for real-time video conferencing contact between the remote device locations and a call center/help desk service.

The Call Center Media Controller/Queuing System 22 receives the encoded media streams from the remote locations through the same functionality that allows it to accept raw media input like its counterpart, the Media Server/Sequencer System 12. How it handles this media differently is a function of additional specialized programming. As in traditional call center telecommunications systems, there are times when all personnel are already assisting callers. This type of situation is handled by the queuing feature of the system.

Requests for conferencing sessions from remote locations route to the Call Center Media Controller/Queuing System 22. If there is an available call center technician, the session is routed through to the selected media-enabled workstation 26 where the technician answers the request. This action begins the two-way video conferencing session. If a technician is not available to immediately handle the session, the queuing controls process the session until the situation changes.

While in queue, the remote location can be controlled to display various informational messages. This can entail a message that no technicians are available, the anticipated wait time, and possibly a logo or promotional graphic. Depending on bandwidth availability over the communications network 26, video-based promotional, technical, or informational content could be displayed. This content is pushed to the remote location from the In-queue Media Pool 24, which resides on a storage device within the server or other like device on the network. Once a call center technician becomes available, the remote session is passed through to the corresponding workstation 26.

Additional functionality is incorporated into this system through more specialized software features. The design features include tracking media and bandwidth capabilities of each individual remote location, real-time bandwidth monitoring of the network, current media sessions, and scheduled media events. These features enable the various functions provided by the system to remain in check and adjust their operation accordingly.

Due to the design of the communications network tying the remote locations back to where the system is housed, varying bandwidth capabilities may exist across the installation base. In order to account for this very possible design constraint, the per-location bandwidth available should be incorporated into the system so that it may adjust media content.

Since media capabilities and/or desires may vary by remote location, this fact should be considered also. Certain groups of remote locations may be members of a chain or corporate structure and thereby have unique needs or restrictions for content. There may also exist a need to provide content based upon regional areas. This capability would be very important should the system be utilized to broadcast weather alerts.

To take these factors into account and act accordingly, both the Media Server/Sequencer System 12 and the Call Center Media Controller/Queuing System 22 maintain a database containing pertinent information. Before establishing a stream or a video conferencing session, respectively, these systems will perform a call to the database to determine the best configuration or capability to carry the session. Information is also contained in this database that provides the system with the configuration of the backbone communications network so that it can adjust system-wide aggregate bandwidth utilization accordingly. When both systems are installed concurrently, one system can be designated to hold the primary database and the other the backup. Changes in information to the primary database are migrated to the backup database by a system process. Each system has the capability to utilize the other's database if corruption or other failure renders its own database unusable.
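
A minimal sketch of the database call and primary/backup failover described above follows. The disclosure does not specify a database product or schema; SQLite, the table layout, and the terminal identifiers below are assumptions for illustration only.

```python
# Illustrative sketch: capability lookup with failover to a backup database copy.
import sqlite3

def make_db(entries):
    """Build an in-memory capability database (schema is an assumption)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE remote_capability (terminal_id TEXT, max_kbps INT, supports_video INT)")
    conn.executemany("INSERT INTO remote_capability VALUES (?, ?, ?)", entries)
    return conn

primary = make_db([("T-001", 512, 1)])
backup = make_db([("T-001", 512, 1)])   # replicated copy kept in step with the primary

def lookup_capability(terminal_id, databases):
    """Query each database in order; fall back to the next copy on failure."""
    for db in databases:
        try:
            row = db.execute(
                "SELECT max_kbps, supports_video FROM remote_capability WHERE terminal_id = ?",
                (terminal_id,)).fetchone()
            if row is not None:
                return {"max_kbps": row[0], "supports_video": bool(row[1])}
        except sqlite3.Error:
            continue   # corruption or failure renders this copy unusable
    return None

primary.close()                                        # simulate a failed primary
print(lookup_capability("T-001", [primary, backup]))   # served from the backup
```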

Similar to the database redundancy and failover capability, both systems are designed with the ability to be deployed in redundant sets. When this method is employed, either strictly for redundancy or for accommodating large installations of remotes, one system will be designated primary and others as backup units. Inter-machine processes on each server monitor the status and eligibility of other servers within the group and react accordingly should a failure occur.

A media scheduling process is contained within both the Media Server/Sequencer System 12 and the Call Center Media Controller/Queuing System 22. In the former instance, this process controls media content and distribution based upon information contained within a separate scheduling database. In the latter, it provides the ability to push out scheduled notices and informational content such as maintenance downtime and impairment releases. The database utilized is structured to control content distribution based on both time of day and remote location affected. An example using this feature would be the distribution of a corporate announcement at a particular time and only to those locations belonging to that corporate entity.
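
A minimal sketch of schedule-driven distribution follows. The schedule structure, group names, and content identifiers are invented for illustration; the disclosure only states that entries are keyed by time of day and affected remote location.

```python
# Illustrative sketch: schedule entries keyed by time of day and corporate group.
from datetime import datetime, time

SCHEDULE = [
    # (start, end, corporate_group, content_id); group None means all locations
    (time(9, 0), time(9, 5), "chain-A", "corp-announcement-17"),
    (time(12, 0), time(12, 30), None, "midday-promo"),
]

def due_content(now: datetime, location_group: str) -> list[str]:
    """Return content scheduled for this moment and this location's group."""
    current = now.time()
    return [
        content for start, end, group, content in SCHEDULE
        if start <= current <= end and group in (None, location_group)
    ]

print(due_content(datetime(2005, 7, 20, 9, 2), "chain-A"))   # ['corp-announcement-17']
```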

The functional components of the Media Server/Sequencer System are shown in FIG. 2. At the heart of the system is the media server engine 50, which is tasked with distributing content based upon control input and automated operational monitoring sub-processes. Content control allows for multiple simultaneous streams of media based upon distribution commands from the server side or on-demand requests from remote terminal locations. Terminal, as used herein, refers to a terminal or a device adapted for gaming use, traditionally defined as a purpose-built unit that accepts and processes wagering transactions and also provides a wagering system interface to the user/operator.

This content control is provided for by the sequencing and control logic 64 process. Programming enables input from various sources to dictate content distribution. Additional inputs from the media server engine 50 and communications interface 72 provide for monitoring of system and network communications operational parameters. This feedback is an essential component of the system and provides for proper operation and utilization of resources.

Explained individually, the first input is provided for by the media schedule 68. This component is comprised of a database and an interface process to the sequencing and control logic 64. Entries into this database control the scheduled distribution of content and to which location(s) this content is directed. The data is maintained by interaction via the operator workstation 70. Date and time information as well as content and intended destination(s) are input into the database. At the prescribed moment, the proper content is pushed out to the intended recipient(s).

The second method for controlling the distribution of content is via commands entered directly into the system from the operator workstation 70. Content selection and recipient information is input via the GUI interface and passed to the sequencing and control logic 64 through the media server control interface 66. This latter process handles the human-machine I/O interface requirements and provides a method to adapt and present a standardized interface to the operator.

In addition to providing a universal interface to the communications network media, the communications interface 72 provides feedback to the sequencing and control logic 64 on communications performance as related to bandwidth utilization and impairments on the communications network 74. To make content available for distribution, the media server engine 50 has several sources from which to draw. First is a raw media interface 62 that is the gateway for pre-encoded external real-time media. Another source for pre-encoded media is media storage 52.

For interfacing with traditional video signals, the media server contains a process dedicated to encoding video signals utilizing well-known compression algorithms. The encoder 54 performs this function. It accepts these traditional signals through industry standard hardware interface adapters installed in the server. Media sources can consist of third-party feeds 56, computer generated graphics 58 input, and live media 60 such as from a broadcast studio. Besides providing real-time content sources, additional processes provide the ability to take these encoded inputs and buffer and/or store them to media storage 52 for later delivery.

Designed around the same core concept as the Media Server/Sequencer System is the Call Center Media Controller/Queuing System, detailed next and illustrated in FIG. 3, which shows video call center detail. Being such, these two systems share many of the same components and logic. Because of the modular architecture of the systems, they are designed to allow deployment individually or as an integrated solution.

Once again, the media server engine 80 is responsible for controlling the flow of media streams to and from the system. Unique to this system is that it is designed to handle the routing of real-time two-way video conferencing traffic. This capability is provided for by the sequencing and control logic 86 process, which listens for conferencing requests from stations, queues and routes these requests, and also oversees established conferences by way of a monitoring process through the communications interface 94.

The feedback received via the communications interface 94 allows the sequencing and control logic 86 to monitor communications network 96 utilization and adjust the operation of the system to prevent degradation to other activities that rely on the network.

System operation is controlled and monitored via the video call center control interface 88 from the master workstation 90. The design of the system allows for the master workstation 90 to be physically connected to the system or located elsewhere on the network. When the workstation is located on the network, no specialized client software is required, which allows control of the system to be easily relocated to another workstation, as a shift change at the call center may dictate.

The video call center control interface 88 maintains a database of the media capabilities and other operational restraints for each remote location and call center workstations 92. In the case of remotes, limitations in the communication network 96 may reduce or preclude the capability for video conferencing and the system must tailor operation accordingly. For the call center workstations, the system must know which workstations are staffed, in conference, and available for service. Additionally, the video call center control interface 88 tracks in-progress conferences to calculate hold times for queued conference requests. This conference volume and hold time information is displayed on each call center workstation 92, master workstation 90, and can be pushed down to queued remotes.

Similar to modern voice-only call center software, the system provides the capability to determine the source of conference requests and perform a lookup within a database of location information. Basic details of in-process and queued conference sessions are displayed on each call center workstation 92 and the master workstation 90. The availability of this information alerts supervisors and technicians to session volume and location detail, which allows them to recognize common denominators amongst the sessions that may indicate problems in the associated gaming system. When flagged for assignment of a new conference session, remote location detail and history information is displayed on the call center workstation 92 to enhance service and reduce conference times. This last function is very similar to the Computer Telephony Interface (CTI) utilized in standard call center software.

The video call center control interface 88 can either utilize its integrated database for remote location detail or interface to an external database via standard Structured Query Language (SQL) calls. This capability allows for tight integration with an existing gaming system database and precludes the requirement to duplicate location information and associated updates across multiple independent databases.
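
A minimal sketch of such an SQL lookup follows, using an in-memory database and an assumed table layout; the actual gaming system database and its schema are not specified in the disclosure.

```python
# Illustrative sketch: resolving a conference request's source terminal to location detail.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE locations (terminal_id TEXT, retailer_name TEXT, city TEXT)")
conn.execute("INSERT INTO locations VALUES ('T-001', 'Corner Market', 'Dacula')")

def location_detail(terminal_id: str):
    """Standard SQL call to fetch the detail displayed at the answering workstation."""
    return conn.execute(
        "SELECT retailer_name, city FROM locations WHERE terminal_id = ?",
        (terminal_id,)).fetchone()

print(location_detail("T-001"))   # ('Corner Market', 'Dacula')
```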

To enhance system functionality, the system incorporates a capability to push notices and other informational messages out to remote locations, either prepared in advance or in real time. This delivery is controlled via the master workstation 90 and pulls content from media storage 82, the raw media interface 84, or via the communications network 96. The system is also designed to permit call center workstations 92 to place conference requests to remote locations. This feature allows technicians to proactively contact remote locations, perform follow-up/courtesy calls, and establishes a basis to enable telemarketing functions with the system.

To enable this media capability at remote locations, two methods can be utilized. Depending upon circumstances, on a per-remote location basis, either an integrated or discrete media processing system can be installed. The first method discussed will be that of a discrete configuration as detailed and referenced in FIG. 4, which illustrates a discrete terminal system.

The discrete method is utilized primarily when the existing remote device either cannot be modified or is incapable of providing the required hardware and software integration. In this instance, a separate processor unit is installed and handles all media-related activities. This method could also be used to provide stand-alone media capabilities within a gaming establishment where media capabilities on a per-terminal basis are either not required or not desired.

At the core of the discrete terminal system is the media engine 100 which directly controls and processes various media streams traversing the unit. Under command from the sequencing and control logic 122, the media engine 100 may establish, route, terminate, and otherwise control content flow. Content may be processed either across the communications network 130 via the communications interface 128, from local media storage 102, or from local external sources.

In the case of external sources, basic video conferencing media capability is provided for by means of a camera 108 and monitor 110 through the video interface 106 and also a speaker 114 and microphone 116 via the audio interface 112. The external monitor 110 and speaker 114 would be utilized in the case of pushed or streamed media to the remote location. Also available is an external interface 118, which provides a means to provide connectivity to external audio/video devices 120. This external interface 118 allows connection to an existing or otherwise available media distribution system that may exist within the remote location. The signals traversing these various interfaces are processed by the encoder/decoder 104 module utilizing well-known compression/decompression (codec) algorithms.

The sequencing and control logic 122 also monitors real-time communications properties via a hook into the communications interface 128. This allows the sequencing and control logic 122 to be aware of communications network 130 utilization, current media sessions, and pending media requests. Video conferencing and on-demand media control is primarily handled by the sequencing and control logic 122 through user commands entered via integrated keyboard or touch-screen methods. To allow for interfacing with existing external systems 126, an adaptive machine interface 124 provides a common-ground capability. The media system may need to interface with traditional Point of Sale (POS) or other terminal devices.

Programming contained within the adaptive machine interface 124 allows the system to accept and provide information to external systems 126 through a separate software module. This module can be modified to present a standard interface to the systems on both sides of the interface without necessarily requiring unique modifications to the systems themselves. The result is a highly adaptable system that is capable of enabling rich media functions integrated with basic and/or legacy terminal devices.
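
A minimal sketch of such an adapter module follows. The message formats on both sides of the interface are invented for illustration; the disclosure does not define a POS protocol.

```python
# Illustrative sketch: an adapter presenting a standard interface to the media system
# while speaking an external device's own (hypothetical) wire format underneath.
class PosAdapter:
    def to_external(self, message: dict) -> bytes:
        # Media system side uses dicts; the external device expects a delimited line.
        return f"{message['type']}|{message['payload']}\n".encode("ascii")

    def from_external(self, raw: bytes) -> dict:
        kind, _, payload = raw.decode("ascii").strip().partition("|")
        return {"type": kind, "payload": payload}

adapter = PosAdapter()
wire = adapter.to_external({"type": "STATUS", "payload": "printer_ok"})
print(adapter.from_external(wire))   # {'type': 'STATUS', 'payload': 'printer_ok'}
```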

The integrated terminal system, as diagrammed and referenced in FIG. 5, is utilized in instances where the remote terminal or system has the capability to accommodate the required hardware interfaces and software modules. The components and design of this integrated system are not much different from the discrete implementation (FIG. 4) and vary only in the means by which it interfaces with the pre-existing terminal application.

Once again, at the core of the integrated terminal system is the media engine 100 which directly controls and processes various media streams traversing the unit. Under command from the sequencing and control logic 122, the media engine 100 may establish, route, terminate, and otherwise control content flow. Content may be processed either across the communications network 126 via the communications interface 124, from local media storage 102, or from local external sources.

In the case of external sources, basic video conferencing media capability is provided once again by means of a camera 108 and monitor 110 through the video interface 106 and also a speaker 114 and microphone 116 via the audio interface 112. The external monitor 110 and speaker 114 would be utilized in the case of pushed or streamed media to the remote location. Also available is an external interface 118 which provides a means to establish connectivity to external audio/video devices 120. This external interface 118 allows connection to existing or an otherwise available media distribution system that may exist within the remote location.

The sequencing and control logic 122 also monitors real-time communications properties via a hook into the communications interface 124. This allows the sequencing and control logic 122 to be aware of communications network 126 utilization, current media sessions, and pending media requests. Video conferencing and on-demand media control is primarily handled by the sequencing and control logic 122 through user commands entered via integrated keyboard or touch-screen methods. To allow for interfacing with existing external systems 132, like that of the discrete terminal system, an adaptive machine interface 130 provides a common-ground capability. The media system may need to interface with traditional Point of Sale (POS) or other terminal devices, and this capability provides that functionality.

Programming contained within the adaptive machine interface 130 allows the system to accept and provide information to external systems 132 through a separate software module. This module can be modified to present a standard interface to the systems on both sides of the interface without necessarily requiring unique modifications to the systems themselves. The result is a highly adaptable system that is capable of enabling rich media functions integrated with basic and/or legacy terminal devices.

Likewise, the terminal application interface 128 allows this same functionality and ease of adaptability to take place with the pre-existing terminal application. In some instances, the licensee will be installing the system on a third-party terminal device that is up to the task of handling the required media content and control. The terminal application interface 128 allows programming a discrete interface software module to allow for seamless interaction without requiring code changes to either the host application or media system core. In the case that the licensee installs the system on their own terminal device, the terminal application interface 128 can be written to provide a standard interface to the application software. In many instances, when a vendor offers multiple models of terminal devices, they will provide for standard interface specifications to external applications. The capability of this system to do likewise allows for portability of the media system across their compatible product line.

From an end-to-end viewpoint, the two systems described herein function along the same basic principles. However, the following text and diagrams will detail the overall interaction between the centralized server systems and remote terminal devices independently due to the distinct properties of each. The flow of processes within the Media Server/Sequencer System is detailed as shown in FIG. 6.

Media can be streamed to remotes using several methods: manually via the operator workstation 140, with a prompt from the schedule 144, or from an on-demand request from a remote terminal 142. These media triggers are validated for timing conflicts with sessions that are imminent or already in progress and may be of higher priority, as shown by decision 148. If there is a conflict, the system will adjust according to schedule and notify the operator via the workstation 140 interface.

If a conflict does not exist, the sequencing and control process 150 queries the database for the remote's capability 152 to ensure that the remote is indeed capable of receiving the media feed. If the remote is flagged in the database as having a bandwidth limitation, the media feed is checked to see if it can be scaled back to fit within the available bandwidth. If the feed is valid (decision 154), the sequencing and control process 156 checks that this bandwidth (decision 160) is available on the communications network by interfacing with the communications interface monitoring process 158.

With bandwidth available, the sequencing and control process 162 sends a command to the media server engine 170 to start the proper media feed. It also informs the communications interface monitor process 158 that the media feed request has been placed. The sequencing and control process continues to monitor 164 the status and bandwidth 166 of the feed via interfaces with the communications interface monitor process 158 and the media server engine 170. If bandwidth must be reduced or the feed must be stopped, the sequencing and control process 168 sends the appropriate commands to the media server engine 170.

The command to the media server engine 170 to start the feed includes a direction as to which media and/or source is to be utilized to supply the given feed. The media server engine 170 selects the proper input from either third party media 172, computer generated media 174, live media 176, or media storage 178. If the media is not available (decision 180), the media server engine 170 notifies the sequencing and control monitor process 164, and the error is displayed on the operator workstation 140.

If the media is available, the media server engine 182 streams the video to the specified remote(s) via the communications interface 184. The media server engine 182 constantly listens for commands to end or otherwise terminate (step 188) the feed. Once the feed has ended or is terminated (decision 186), the media server engine 182 informs the sequencing and control monitor process 164.
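
A minimal sketch of the FIG. 6 decision sequence (conflict, capability, and bandwidth checks before a feed starts) follows. The data structures, capability table, and bandwidth figures are assumptions made for illustration only.

```python
# Illustrative sketch: validating a feed request before commanding the media server engine.
from dataclasses import dataclass

@dataclass
class FeedRequest:
    terminal_id: str
    media_type: str
    bitrate_kbps: int

CAPABILITIES = {"T-001": {"media": {"video", "audio"}, "max_kbps": 512}}   # per-remote limits
NETWORK_HEADROOM_KBPS = 1500   # backbone capacity (assumed figure)
ACTIVE_KBPS = 900              # bandwidth already in use by other sessions

def start_feed(req: FeedRequest, conflict: bool) -> str:
    if conflict:
        return "rescheduled: higher-priority session imminent or in progress"
    cap = CAPABILITIES.get(req.terminal_id)
    if cap is None or req.media_type not in cap["media"]:
        return "rejected: remote cannot receive this media"
    rate = min(req.bitrate_kbps, cap["max_kbps"])        # scale the feed back to fit the remote
    if ACTIVE_KBPS + rate > NETWORK_HEADROOM_KBPS:
        return "deferred: insufficient network bandwidth"
    return f"started at {rate} kbps"

print(start_feed(FeedRequest("T-001", "video", 768), conflict=False))   # started at 512 kbps
```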

The process flow for the setup and teardown of video conferencing sessions pertaining to the call center media controller/queuing system is detailed and referenced in FIG. 7. The sequencing and control monitor 204 process continually monitors sessions and network utilization via the communications interface monitor process 206. It also utilizes the communications interface to listen for conference requests 208 from call center workstations 210 and remote terminals 212. Continuous control and monitoring is available to the master workstation 200 via the video call center control interface 202.

Because the call center workstations 210 are all capable of full conference features, the sequencing and control process 214 checks for remote capability 216 via a database query. If the request is not valid (decision 218), sequencing and control 214 handles the issue and sends a notice to the master workstation 200. If the request is valid, the sequencing and control process 214 next checks to see if the destination is available (decision 220).

If the destination is not available, the sequencing and control process 222 queues the request, makes note of the situation, and sends a request to the media server engine 224 to stream a hold time message to the destination 228 via the communications interface 226. If the destination is available at decision 220, the sequencing and control process 230 continues to process the connection.

The sequencing and control process 230 checks if bandwidth is available (decision 234) for the conference through the communications interface monitor process 232. If not, it will notify the initiator (if a call center workstation 210) that there is a bandwidth conflict and offer an option to queue the call or drop the request. If the initiator is a remote terminal 212, the sequencing and control process 230 will send a message advising of a busy status and queue the request.

With bandwidth available (decision 234), the sequencing and control process 236 will broker the call with the call center workstation 242 and the remote terminal 244 via the communications interface 238 and communications network 240. The communications interface setup conference 238 process is where the proper setup commands and addressing is specified to the conferencing endpoints. The communications interface monitor process 246 continuously monitors the conference for activity (decision 248) and bandwidth (decision 252) availability. If the call is not active at decision 248, the sequencing and control process tears down any remaining conference components, step 250. If bandwidth is a problem at decision 252, the sequencing and control process 254 throttles bandwidth of the conference accordingly.
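
A minimal sketch of the routing and queuing decision of FIG. 7 follows, with hypothetical workstation and queue structures that are not part of the disclosure.

```python
# Illustrative sketch: route a conference request to a free workstation or queue it.
from collections import deque

workstations = {"WS-1": "busy", "WS-2": "available"}   # staffed call center workstations
queue: deque[str] = deque()                            # queued conference requests

def route_conference(terminal_id: str) -> str:
    for ws, status in workstations.items():
        if status == "available":
            workstations[ws] = "busy"
            return f"conference from {terminal_id} routed to {ws}"
    queue.append(terminal_id)
    return f"{terminal_id} queued (position {len(queue)}); streaming hold-time message"

print(route_conference("T-014"))   # routed to WS-2
print(route_conference("T-087"))   # queued (position 1)
```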

Once a conference is in session, the call center operator may want to stream media to the remote. This may be a help video or other way of assisting the remote conference caller. This associated process flow is depicted and annotated in the flowchart of FIG. 8.

The sequencing and control monitoring process 262 is actively handling a conference in session 260 and is aware of media and other traffic on the communications network through the communications interface monitor process 264. A media push request is received from a call center workstation 268 through the communications interface 266. The first step is for the sequencing and control process 270 to perform a remote capability query 272 in the database. This allows the system to validate (decision 274) the remote device's ability to handle the required media stream.

Through the communications interface monitor process 278, the sequencing and control process 276 then checks for bandwidth capacity (decision 280) on the network. If bandwidth is not available at that time, the call center workstation 268 is notified of the situation and offered the opportunity to wait, cancel, or to push the media to the remote terminal in a near real-time fashion. In the latter instance, the media feed is pushed out to the remote as bandwidth permits and is buffered on the remote's storage device.

If bandwidth is available, the sequencing and control process 282 sends a command to the media server engine 284 to send the media stream to the remote terminal 288 via the communications interface 286. The sequencing and control process 282 continues to monitor the feed through the communications interface 286. If bandwidth continues to be available (decision 290), the feed continues unchanged. If bandwidth utilization on the communications network changes and cannot continue to support the media feed at the current rate, the sequencing and control process 292 throttles down the rate and/or buffers the media stream on the remote terminal 288 to minimize the bandwidth impact.
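
A minimal sketch of the bandwidth adjustment of FIG. 8 follows; the rates and the throttle/buffer policy are assumptions made for illustration.

```python
# Illustrative sketch: continue, throttle, or buffer a media push based on available bandwidth.
def adjust_push(requested_kbps: int, available_kbps: int) -> dict:
    if available_kbps >= requested_kbps:
        return {"action": "stream", "rate_kbps": requested_kbps}
    if available_kbps > 0:
        # throttle the feed; the remainder is buffered on the remote's storage device
        return {"action": "throttle_and_buffer", "rate_kbps": available_kbps}
    return {"action": "buffer_only", "rate_kbps": 0}   # near real-time delivery as bandwidth permits

print(adjust_push(requested_kbps=512, available_kbps=256))
# {'action': 'throttle_and_buffer', 'rate_kbps': 256}
```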

Although several preferred embodiments of the invention have been disclosed in the foregoing specification, it is understood that many modifications and other embodiments of the invention will come to mind to those skilled in the art to which the invention pertains, having the benefit of the teaching presented in the foregoing description and associated drawings. It is thus understood that the invention is not limited to the specific embodiments disclosed herein, and that many modifications and other embodiments of the invention are intended to be included within the scope of the appended claims. Moreover, although specific terms are employed herein, as well as in the claims, they are used in a generic and descriptive sense only, and not for the purposes of limiting the described invention, nor the claims which follow below.

Claims

1. A system for providing media to users at secure remote lottery gaming locations, comprising:

one or more secure lottery gaming terminals located at remote locations on a communication network, the one or more secure lottery gaming terminals each allowing a user to play and wager in a lottery game of chance;
at least one media server on the communication network, the media server being capable of determining the usable media for the one or more secure lottery gaming terminals and encoding a live video;
a plurality of media feeds feeding media to the media server including at least one real-time third party video feed, the media server selectively distributing the encoded live video and appropriate media content from the media feeds to the one or more secure lottery gaming terminals during game play according to a scheduling database, the scheduling database having entries related to the date, time, and intended media content for the different respective remote gaming terminals, the scheduling database controlling the distribution of media content between different gaming terminals based on said entries such that a different combination of media is simultaneously sent to different gaming terminals; and
at least one call center media controller configured for real-time video conferencing between the remote gaming terminals and a central call center service location for real-time assistance to players at the gaming terminals without interrupting game play.

2. The system of claim 1, wherein at least one of the media feeds is stored media.

3. The system of claim 1, wherein the call center media controller further comprises an assistance server on the network to selectively provide requested support to the one or more secure gaming terminals.

4. The system of claim 3, wherein the assistance server is a telephone call center.

5. The system of claim 1, wherein the system utilizes internet protocol (IP) on the communication network.

6. The system of claim 1, wherein the lottery game of chance is a sporting event.

7. A method for providing media to users at secure remote lottery gaming locations, comprising the steps of:

hosting a lottery game of chance at one or more secure lottery gaming terminals located at remote locations on a communication network, the one or more lottery terminals each allowing a user to play and wager in the lottery game of chance;
feeding media content from one or more media feeds to a media server, the media content including at least one real-time third party video feed, the media server being capable of determining the usable media for the one or more secure lottery gaming terminals and encoding a live video; and
distributing the encoded live video and appropriate media content from the media server to the one or more secure lottery gaming terminals during game play of the lottery game of chance according to a scheduling database, the scheduling database having entries related to the date, time, and intended media content for the different respective remote gaming terminals, the scheduling database controlling the distribution of media content between different gaming terminals based on said entries such that a different combination of media is simultaneously sent to different gaming terminals; and
with a call center media controller, real-time video conferencing between the remote gaming terminals and a central call center service location for real-time assistance to players at the gaming terminals without interrupting game play.

8. The method of claim 7, wherein the steps of feeding media content and distributing the appropriate media content includes feeding and distributing stored media.

9. The method of claim 7, further comprising the step of providing support from an assistance server on the network to the one or more secure lottery gaming terminals via the call center media controller.

10. The method of claim 9, wherein the assistance server is a telephone call center and the step of providing support is providing telephonic assistance.

11. The method of claim 7, wherein the step of hosting a lottery game of chance is relaying data relative to a sporting event.

12. A system for providing media to users at secure remote gaming locations, comprising:

at least one lottery gaming means located at a remote location on a communication network, the lottery gaming means for allowing a user to play and wager in a lottery game of chance;
at least one media serving means on the communication network, the media serving means for determining the usable media for the at least one lottery gaming means for encoding a live video; and
a plurality of media feeding means for selectively feeding media content to the media serving means, the media content including a real-time third party video feed;
wherein the at least one media serving means selectively encoding the live video and distributing the appropriate media content from the media feeding means to the at least one lottery gaming means during game play according to a scheduling database, the scheduling database having entries related to the date, time, and intended media content for the different respective remote gaming terminals, the scheduling database controlling the distribution of media content between different gaming terminals based on said entries such that a different combination of media is simultaneously sent to different gaming terminals; and
means for real-time video conferencing between the remote gaming terminals and a central call center service location for real-time assistance to players at the gaming terminals without interrupting game play.
References Cited
U.S. Patent Documents
1527929 February 1925 Simons
3089123 May 1963 Hennis et al.
3245697 April 1966 Nugent
3699311 October 1972 Dunbar
3736368 May 1973 Vogelman et al.
3826499 July 1974 Lenkoff
3868057 February 1975 Chavez
3876865 April 1975 Bliss
3902253 September 1975 Sabuzawa et al.
3918174 November 1975 Miller et al.
3922529 November 1975 Orloff
3934120 January 20, 1976 Maymarev
4017834 April 12, 1977 Cuttill et al.
4095824 June 20, 1978 Bachman
4105156 August 8, 1978 Dethloff
4176406 November 1979 Matkan
4191376 March 4, 1980 Goldman et al.
4194296 March 25, 1980 Pagnozzi et al.
4195772 April 1, 1980 Nishimura
4206920 June 10, 1980 Weatherford et al.
4241942 December 30, 1980 Bachman
4243216 January 6, 1981 Mazumder
4273362 June 16, 1981 Carrier et al.
4309452 January 5, 1982 Sachs
4313087 January 26, 1982 Weitzen et al.
4355300 October 19, 1982 Weber
4375666 March 1, 1983 Buck et al.
4398708 August 16, 1983 Goldman et al.
4407443 October 4, 1983 McCorkle
4451759 May 29, 1984 Heynisch
4455039 June 19, 1984 Weitzen et al.
4457430 July 3, 1984 Darling et al.
4464423 August 7, 1984 LaBianca et al.
4466614 August 21, 1984 Bachman et al.
4488646 December 18, 1984 McCorkle
4491319 January 1, 1985 Nelson
4494197 January 15, 1985 Troy et al.
4536218 August 20, 1985 Ganho
4544184 October 1, 1985 Freund et al.
4579371 April 1, 1986 Long et al.
4591189 May 27, 1986 Holmen et al.
4634149 January 6, 1987 Donovan
4665502 May 12, 1987 Kreisner
4669729 June 2, 1987 Solitt et al.
4689742 August 25, 1987 Troy et al.
4726608 February 23, 1988 Walton
4736109 April 5, 1988 Dvorzsak
4740016 April 26, 1988 Konecny et al.
4760247 July 26, 1988 Keane et al.
4763927 August 16, 1988 Schneider
4775155 October 4, 1988 Lees
4792667 December 20, 1988 Chen
4805907 February 21, 1989 Hagiwara
4817951 April 4, 1989 Crouch et al.
4835624 May 30, 1989 Black et al.
4836546 June 6, 1989 Dire et al.
4836553 June 6, 1989 Suttle et al.
4837728 June 6, 1989 Barrie et al.
4856787 August 15, 1989 Itkis
4861041 August 29, 1989 Jones et al.
4870260 September 26, 1989 Niepolomski et al.
4880964 November 14, 1989 Donahue
4888244 December 19, 1989 Masubuchi et al.
4922522 May 1, 1990 Scanlon
4943090 July 24, 1990 Fienberg
4960611 October 2, 1990 Fujisawa et al.
4961578 October 9, 1990 Chateau
4964642 October 23, 1990 Kamille
4996705 February 26, 1991 Entenmann et al.
4998010 March 5, 1991 Chandler et al.
4998199 March 5, 1991 Tashiro et al.
5032708 July 16, 1991 Comerford et al.
5037099 August 6, 1991 Burtch
5046737 September 10, 1991 Fienberg
5074566 December 24, 1991 Desbiens
5083815 January 28, 1992 Scrymgeour et al.
5092598 March 3, 1992 Kamille
5094458 March 10, 1992 Kamille
5100139 March 31, 1992 Di Bella
5109153 April 28, 1992 Johnsen et al.
5112050 May 12, 1992 Koza et al.
5116049 May 26, 1992 Sludikoff et al.
5118109 June 2, 1992 Gumina
5119295 June 2, 1992 Kapur
5158293 October 27, 1992 Mullins
5165967 November 24, 1992 Theno et al.
5168353 December 1, 1992 Walker et al.
5186463 February 16, 1993 Marin et al.
5189292 February 23, 1993 Batterman et al.
5193815 March 16, 1993 Pollard
5193854 March 16, 1993 Borowski, Jr. et al.
5228692 July 20, 1993 Carrick et al.
5232221 August 3, 1993 Sludikoff et al.
5234798 August 10, 1993 Heninger et al.
5249801 October 5, 1993 Jarvis
5259616 November 9, 1993 Bergmann
5273281 December 28, 1993 Lovell
5276980 January 11, 1994 Carter et al.
5282620 February 1, 1994 Keesee
5308992 May 3, 1994 Crane et al.
5317135 May 31, 1994 Finocchio
5326104 July 5, 1994 Pease et al.
5332219 July 26, 1994 Marnell, II et al.
5342047 August 30, 1994 Heidel et al.
5342049 August 30, 1994 Wichinsky et al.
5344144 September 6, 1994 Canon
5346258 September 13, 1994 Behn et al.
5380007 January 10, 1995 Travis et al.
5393057 February 28, 1995 Marnell, II et al.
5401024 March 28, 1995 Simunek
5401541 March 28, 1995 Hodnett, III
5403039 April 4, 1995 Borowski, Jr. et al.
5407199 April 18, 1995 Gumina
5420406 May 30, 1995 Izawa et al.
5432005 July 11, 1995 Tanigami et al.
5451052 September 19, 1995 Behm et al.
5456465 October 10, 1995 Durham
5456602 October 10, 1995 Sakuma
5471039 November 28, 1995 Irwin, Jr. et al.
5471040 November 28, 1995 May
5475205 December 12, 1995 Behm et al.
5486005 January 23, 1996 Neal
5513846 May 7, 1996 Niederlein et al.
5528154 June 18, 1996 Leichner et al.
5536016 July 16, 1996 Thompson
5540442 July 30, 1996 Orselli et al.
5548110 August 20, 1996 Storch et al.
5550746 August 27, 1996 Jacobs
5560610 October 1, 1996 Behm et al.
5564700 October 15, 1996 Celona
5564977 October 15, 1996 Algie
5591956 January 7, 1997 Longacre, Jr. et al.
5599046 February 4, 1997 Behm et al.
5602381 February 11, 1997 Hoshino et al.
5621200 April 15, 1997 Irwin et al.
5628684 May 13, 1997 Bouedec
5630753 May 20, 1997 Fuchs
5651735 July 29, 1997 Baba
5655961 August 12, 1997 Acres et al.
5667250 September 16, 1997 Behm et al.
5682819 November 4, 1997 Beatty
5690366 November 25, 1997 Luciano
5704647 January 6, 1998 Desbiens
5722891 March 3, 1998 Inoue
5726898 March 10, 1998 Jacobs
5732948 March 31, 1998 Yoseloff
5741183 April 21, 1998 Acres et al.
5743800 April 28, 1998 Huard et al.
5752882 May 19, 1998 Acres et al.
5756220 May 26, 1998 Hoshino et al.
5768142 June 16, 1998 Jacobs
5769458 June 23, 1998 Carides et al.
5770533 June 23, 1998 Franchi
5772509 June 30, 1998 Weiss
5772510 June 30, 1998 Roberts
5772511 June 30, 1998 Smeltzer
RE35864 July 28, 1998 Weingardt
5779840 July 14, 1998 Boris
5781734 July 14, 1998 Ohno et al.
5789459 August 4, 1998 Inagaki et al.
5791990 August 11, 1998 Schroeder et al.
5797794 August 25, 1998 Angell
5803504 September 8, 1998 Deshiens et al.
5816920 October 6, 1998 Hanai
5818019 October 6, 1998 Irwin, Jr. et al.
5820459 October 13, 1998 Acres et al.
5823874 October 20, 1998 Adams
5830063 November 3, 1998 Byrne
5830066 November 3, 1998 Goden et al.
5830067 November 3, 1998 Graves et al.
5830068 November 3, 1998 Brenner et al.
5833537 November 10, 1998 Barrie
5835576 November 10, 1998 Katz et al.
5836086 November 17, 1998 Elder
5836817 November 17, 1998 Acres et al.
5848932 December 15, 1998 Adams
5851149 December 22, 1998 Xidos et al.
5863075 January 26, 1999 Rich et al.
5871398 February 16, 1999 Schneier et al.
5876284 March 2, 1999 Acres et al.
5882261 March 16, 1999 Adams
5883537 March 16, 1999 Luoni et al.
5885158 March 23, 1999 Torango et al.
5887906 March 30, 1999 Sultan
5903340 May 11, 1999 Lawandy et al.
5911418 June 15, 1999 Adams
5915588 June 29, 1999 Stoken et al.
5934671 August 10, 1999 Harrison
5970143 October 19, 1999 Schneier et al.
5971271 October 26, 1999 Wynn et al.
5979894 November 9, 1999 Alexoff
5996997 December 7, 1999 Kamille
5997044 December 7, 1999 Behm et al.
6003307 December 21, 1999 Naber et al.
6004207 December 21, 1999 Wilson, Jr. et al.
6004208 December 21, 1999 Takemoto et al.
6007162 December 28, 1999 Hinz et al.
6012982 January 11, 2000 Piechowiak et al.
6014819 January 18, 2000 Elder
6017032 January 25, 2000 Grippo et al.
6024641 February 15, 2000 Sarno
6053405 April 25, 2000 Irwin, Jr. et al.
6077162 June 20, 2000 Weiss
6080062 June 27, 2000 Olson
6086477 July 11, 2000 Walker et al.
6089978 July 18, 2000 Adams
6099407 August 8, 2000 Parker, Jr. et al.
6102400 August 15, 2000 Scott et al.
6107913 August 22, 2000 Gatto et al.
6113495 September 5, 2000 Walker et al.
6119364 September 19, 2000 Elder
6125368 September 26, 2000 Bridge et al.
6142872 November 7, 2000 Walker et al.
6146272 November 14, 2000 Walker et al.
6149521 November 21, 2000 Sanduski
6155491 December 5, 2000 Dueker et al.
6168521 January 2, 2001 Luciano et al.
6168522 January 2, 2001 Walker et al.
6179710 January 30, 2001 Sawyer et al.
6203430 March 20, 2001 Walker et al.
6206373 March 27, 2001 Garrod
6210275 April 3, 2001 Olsen
6217448 April 17, 2001 Olsen
6220596 April 24, 2001 Horan
6220961 April 24, 2001 Keane et al.
6224055 May 1, 2001 Walker et al.
6227969 May 8, 2001 Yoseloff
6238288 May 29, 2001 Walker et al.
6309300 October 30, 2001 Glavich
6312334 November 6, 2001 Yoseloff
6315291 November 13, 2001 Moody
6330976 December 18, 2001 Dymetman et al.
6331143 December 18, 2001 Yoseloff
6334814 January 1, 2002 Adams
6340158 January 22, 2002 Pierce et al.
6368213 April 9, 2002 McNabola
6375568 April 23, 2002 Roffman et al.
6379742 April 30, 2002 Behm et al.
6394899 May 28, 2002 Walker et al.
6398214 June 4, 2002 Moteki et al.
6398643 June 4, 2002 Knowles et al.
6398644 June 4, 2002 Perrie et al.
6398645 June 4, 2002 Yoseloff
6416408 July 9, 2002 Tracy et al.
6419579 July 16, 2002 Bennett
6435408 August 20, 2002 Irwin, Jr. et al.
6435500 August 20, 2002 Gumina
6478677 November 12, 2002 Moody
6491215 December 10, 2002 Irwin, Jr. et al.
6497408 December 24, 2002 Walker et al.
6552290 April 22, 2003 Lawandy
6588747 July 8, 2003 Seelig
6599186 July 29, 2003 Walker et al.
6601772 August 5, 2003 Rubin et al.
6637747 October 28, 2003 Garrod
6648735 November 18, 2003 Miyashita et al.
6648753 November 18, 2003 Tracy et al.
6648755 November 18, 2003 Luciano et al.
6676126 January 13, 2004 Walker et al.
6692354 February 17, 2004 Tracy et al.
6702047 March 9, 2004 Huber
6773345 August 10, 2004 Walker et al.
6776337 August 17, 2004 Irwin, Jr. et al.
6786824 September 7, 2004 Cannon
6823874 November 30, 2004 Lexcen
6875105 April 5, 2005 Behm et al.
6929186 August 16, 2005 Lapstun
20010027130 October 4, 2001 Namba et al.
20010030978 October 18, 2001 Holloway et al.
20010034262 October 25, 2001 Banyai
20010040345 November 15, 2001 Au-Yeung
20020022511 February 21, 2002 Eklund et al.
20020084335 July 4, 2002 Ericson
20020171201 November 21, 2002 Au-Yeung
20020187825 December 12, 2002 Tracy et al.
20030050109 March 13, 2003 Caro et al.
20030114210 June 19, 2003 Meyer et al.
20030216185 November 20, 2003 Varley
20040048670 March 11, 2004 Rowe
20040076310 April 22, 2004 Hersch et al.
20040097288 May 20, 2004 Sloate et al.
20040106454 June 3, 2004 Walker et al.
20040173965 September 9, 2004 Stanek
20040178582 September 16, 2004 Garrod
20040185931 September 23, 2004 Lowell et al.
20040204222 October 14, 2004 Roberts
20040259631 December 23, 2004 Katz et al.
20040266514 December 30, 2004 Penrice
20040266527 December 30, 2004 Anderson et al.
20050014562 January 20, 2005 Fujimoto
20050113173 May 26, 2005 Waters
Foreign Patent Documents
B-18428/92 December 1992 AU
B-21070/92 July 1993 AU
A-50327/96 February 1997 AU
B-52499/96 February 1997 AU
199716432 September 1997 AU
A-45403/97 April 1998 AU
A-63553/98 October 1998 AU
2938307 April 1981 DE
3035898 April 1982 DE
3035947 May 1982 DE
2938307 June 1982 DE
29803107 August 1988 DE
3822636 January 1990 DE
2938307 August 1990 DE
3822636 January 1992 DE
3415114 October 1995 DE
19646956 May 1998 DE
19706286 May 1998 DE
29816453 April 1999 DE
19751746 May 1999 DE
0122902 April 1984 EP
0333934 September 1989 EP
0458623 November 1991 EP
0798676 October 1997 EP
0799649 October 1997 EP
0149712 July 1998 EP
0874337 October 1998 EP
0896304 February 1999 EP
0914875 May 1999 EP
0914875 May 1999 EP
0919965 June 1999 EP
0983801 March 2000 EP
0983801 March 2001 EP
1149712 October 2001 EP
529535 June 1983 ES
529536 June 1983 ES
2006400 April 1989 ES
2006401 April 1989 ES
642892 September 1950 GB
2075918 November 1981 GB
2222712 March 1990 GB
2230373 October 1990 GB
2295775 December 1996 GB
3328311 February 1999 GB
23282311 February 1999 GB
02235744 September 1990 JP
04132672 December 1992 JP
WO85/02250 May 1985 WO
WO91/17529 November 1991 WO
WO 98/03910 January 1998 WO
WO 98/40138 September 1998 WO
WO 99/09364 February 1999 WO
WO 99/26204 May 1999 WO
WO 99/39312 August 1999 WO
WO00/00256 January 2000 WO
WO00/78418 December 2000 WO
WO01/74460 November 2001 WO
WO01/93966 December 2001 WO
WO02/056266 July 2002 WO
Other references
  • ‘Are You In?’, (Article), Feb. 20, 1998.
  • ‘Beginner's Guide—How to Bet’, (www.plimico.com/How+to+wager/beginnersguide/), (Internet Article), 3 Pgs., [accessed May 25, 2005].
  • Chip Brown, ‘Austin American-Statesman’, (Article), May 28, 1998, 2 Pgs., Texas.
  • John C. Hallyburton, Jr., ‘Frequently Asked Questions About Keno’, (Internet Article),1995, 1998, 10 Pgs., (http://conielco.com/faq/keno.html).
  • ‘Horse betting Tutorial—Types of Bets’ (www.homepokergames.com/horsebettingtutorial.php), (Internet Article), 2 Pgs., [accessed May 25, 2005].
  • Judith Gaines, ‘Pool Party Betting Business Booming Throughout Area Workplaces’, (Internet Article), Mar. 19, 1994, 2 Pgs., Issue 07431791, Boston Globe, Boston, MA.
  • ‘Maryland Launches Let It Ride’, (Internet Article), Circa 2001,1 Pg.
  • ‘Notice of Final Rulemaking’, (Internet Article) Mar. 24, 2000, 10 Pgs., vol. 6, Issue #13, Arizona Administrative Register, Arizona.
  • ‘How to Play Megabucks’, (Internet Article), Mar. 9, 2001, 2 Pgs., Oregon Lottery Megabucks,(http://www.oregonlottery.org/mega/mhowto.htm).
  • ‘How to Play Megabucks’, (Internet Article), May 8, 2001, 2 Pgs., Oregon Lottery Megabucks, (http://www.oregonlottery.org/mega/mhowto.htm).
  • ‘Oregon Lottery’, (Internet Article), Apr. 30, 2004, 9 Pgs., Oregon Lottery Web Center, (http://www.oregonlottery.org/general/ghist.shtml).
  • ‘Powerball Odd & Prizes’, ‘How to Play Powerball’, (Internet Article), Dec. 2002, 2 Pgs., (www.powerball.com/pbhowtoplay.shtm).
  • ‘Powerball Prizes and Odds’, (Internet Article), 2 Pgs., http://www.powerball.com/pbprizesNOdds.shtm, Dec. 2002.
  • ‘Learn to Play the Races’ (Internet Article), 15 Pgs., Racing Daily Form (www.drf.com), Circa 2003.
  • Mike Parker, ‘The History of Horse Racing’ (Internet Article),1996, 1997,1998, 5 Pgs., http://www.mrmike.com/explore/hrhist.htm.
Patent History
Patent number: 7621814
Type: Grant
Filed: Jul 20, 2005
Date of Patent: Nov 24, 2009
Patent Publication Number: 20060019751
Assignee: Scientific Games International, Inc. (Newark, DE)
Inventor: Thomas Eugene Garcia (Dacula, GA)
Primary Examiner: Peter DungBa Vo
Assistant Examiner: Milap Shah
Attorney: Dority & Manning, P.A.
Application Number: 11/186,464