Interactive gaming system with animated, real-time characters

A gaming system enables users to interact with one another during electronic game play with animated imagery. The system comprises a central processing unit having certain character-building menus and certain character-directing commands. Users thereof, via user-based hardware, communicate with one another by way of the central processing unit and a communication network. Animatable characters and character display units are displayed upon the user's hardware and provide an entertaining visual and/or auditory forum for interacting with other players. The character commands are operable to animate user-identifying characters for incorporating animated body language into game play for enhancing player interaction.

Description
BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention generally relates to the field of interactive, electronic gaming systems. More particularly, the present invention relates to an online gaming system for enabling players to interact with one another in real-time in a character-based setting.

2. Description of the Prior Art

With the advent of computers, inventive computer-based gaming methodologies inevitably followed. Software developers continually develop electronic means for achieving inventive end results. The gaming industry, for example, has seen rapid growth in the use of web-based interface means for enabling players to game electronically. In this regard, a number of inventive systems and methods have been developed to meet consumer demand and provide the marketplace with quicker, more efficient ways to game. Several of the more pertinent U.S. patent disclosures describing web-based or electronically-based network methodologies for providing players with quicker, more entertaining ways to game are described hereinafter.

U.S. Pat. No. 6,893,347 ('347 patent), which issued to Zilliacus et al., discloses a method and apparatus for playing games between clients at different locations by linking multiple players together through a network using the users' mobile phones. The method includes connecting a plurality of mobile phones together through a network for playing a game, setting up a game scenario for each of the plurality of mobile phones, and transmitting game signals between the plurality of mobile phones across the network. At least two of the plurality of mobile phones are remotely located from one another. The mobile phones connect to the network through a base station. A connection from a mobile phone to the network through a base station is a low power radio frequency connection. The interactive game may be interrupted when a mobile phone receives a call; the interactive game is interrupted only for the mobile phone receiving the call.

U.S. Pat. No. 6,908,390 ('390 patent), which issued to Nguyen et al., discloses a gaming system which may include a number of gaming units and a host computer operatively coupled to the gaming units, and be configured to allow a gaming tournament to be conducted. Each of the gaming units may comprise a video display unit, a microphone, a camera, a speaker and a gaming unit controller. The gaming unit controller may be programmed to allow a person to select tournament play as a single or a group tournament player at a reserved or unreserved gaming unit, and to allow player data to be transmitted to the host computer. The host computer may include a host interface unit capable of receiving audio, visual and/or data input from a tournament host during the tournament, and a host computer controller capable of causing host data to be transmitted to the gaming units.

United States Patent Application Publication No. US 2002/0052235, authored by Hirsch et al., discloses a gaming device which can display a plurality of graphical images at any one time. One or more of these graphical images are specified with predetermined sizes relative to a display frame preferably in addition to predetermined Z-levels, movements and other specifications. This type of gaming device provides players with more realistic and enhanced graphics, adding to a player's excitement and entertainment.

United States Patent Application Publication No. 2003/0060258, authored by Giobbi, discloses a bonus event configured for interaction between a player and a gaming machine initiated through a simulated telecommunications link including a telephone handset or similar device such as a speakerphone. The bonus event may be configured as an “incoming” call triggered by some outcome of a primary game on the gaming machine or as an “outgoing” call by a player to the gaming machine. In either instance, the bonus event includes one or more of positive player interactions with the gaming machine, as through a keypad, prior to generation of a bonus award.

United States Patent Application Publication No. 2004/0063486, authored by Mead, discloses a gaming apparatus comprising a display unit that is capable of generating video images, a value input device, and a controller operatively coupled to the display unit and the value input device, the controller comprising a processor and a memory operatively coupled to the processor. The controller is programmed to allow a person to make a wager, to cause a first video image to be generated on the display unit, the first video image representing a first video game, to determine a game event associated with the first video game, to cause a second video image representing a second video game to be generated on the display unit according to the game event associated with the first video game, to animate the second video image, to receive a player's input to stop animation of the second video image, to stop animation of the second video image according to the player's input, and to determine a value payout to be associated with the second video game.

United States Patent Application Publication No. 2006/0058088, authored by Crawford III et al., relates to an electronic poker table and method which provide an electronic poker game to a plurality of players. The electronic poker table includes a table having a table top with a playing surface, a plurality of electronic player interaction areas located around a periphery of the table top, and a game computer. Each electronic player interaction area provides a player interface for interaction with one of the players. Each player interface has a rabbit button. At least one hand of the electronic poker game is administered using virtual cards. A winner for the at least one hand is determined and a pot is awarded to the winner. The hand includes at least one common card and, after the winner has been determined, a player may view any undealt common cards by actuating the rabbit button.

United States Patent Application Publication No. 2006/0058103, authored by Danieli et al., discloses an online event, and a spectator process which monitors a state of the event, updating a spectator model, so that spectator data streams can be generated and provided to spectators. The spectator data streams can be formatted and provided with content appropriate for use by different types of spectator devices used by the spectators. The spectator process can also automatically generate virtual commentary appropriate for the action occurring in the event for inclusion in the spectator data streams. A media server receives the rendered data streams and distributes them to the electronic devices being used by the spectators. The distribution can be delayed to avoid a spectator conveying information to a participant that would provide an unfair advantage. Executable code can be included in the spectator data stream to provide additional functionality and facilitate interaction between the spectators, and to enable a spectator to also “play” the game.

From a review of these prior art disclosures and from a general consideration of other pertinent prior art generally known to exist, it will be seen that the prior art does not disclose an electronic, networked gaming system incorporating user-identifying animated characters, which characters may be directed for real-time interaction amongst the players. The prior art thus perceives a need for an electronic, networked gaming system incorporating user-identifying animated characters, which characters may be directed for real-time interaction amongst the players.

SUMMARY OF THE INVENTION

Objects of the invention thus include making on-line gaming as live as possible between players (1) by voice connections between players; (2) by visual communications on mini T.V. sets or character display units so that each player may be seen live; and (3) by character-building menus with cooperative character-directing commands for players who do not wish to be seen live but still want some dynamic player interaction, in order to enhance the entertainment value of the game.

To achieve these and other readily apparent objectives, the present invention provides an electronic gaming system for enhancing player interaction during an electronic gaming event, the gaming system comprising user-based data input means, user-based data output means, central processing means, and a communication network. The user-based data output means essentially comprise certain visual and auditory output means. The communication network electronically interconnects the user-based data input means, the user-based data output means, and the central processing means enabling at least two users to communicate with one another via the user-based data input and data output means in real-time. The central processing means essentially comprise a character menu and a gaming menu, the character menu comprising at least two animatable characters, at least two character display units, and a plurality of predefined character commands. Each animatable character is cooperatively associated with a set of character commands.

The gaming menu essentially comprises a theme menu and a game menu, the theme menu comprising at least one gaming theme and the game menu comprising at least one game. The character menu and the gaming menu are displayable upon the visual output means. The users each select a select character display unit, a user-identifying character, a select common game theme, and a select common game from the character and gaming menus. The user-identifying characters are selectable from the animatable characters. The character display units are visually presented in adjacency to one another, and the user-identifying characters are visually presented upon the character display units. The central processing means enable the users to play the selected common game with the selected common theme, and the character commands are operable to animate the user-identifying characters for incorporating animated body language into game play for enhancing player interaction.

Other objects of the present invention, as well as particular features, elements, and advantages thereof, will be elucidated by, or become apparent from, the following descriptions and the accompanying drawing figures.

BRIEF DESCRIPTION OF THE DRAWINGS

Other features of my invention will become more evident from a consideration of the following brief descriptions of the patent drawings:

FIG. 1 is a flowchart type depiction illustrating the interrelatedness of various components of the gaming system.

FIG. 2 is a screenshot type depiction of an on-line gaming system homepage, showing a first gaming table depiction, a plurality of character display units, and a homepage title.

FIG. 3 is a screenshot type depiction of a gaming system play page showing a second gaming table depiction and a plurality of text boxes positioned about the gaming table depiction.

FIG. 4 is a screenshot type depiction of a gaming system play page showing a third gaming table depiction, two animated characters, and a plurality of text boxes positioned about the third gaming table depiction.

FIG. 5 is a depiction of a user at a personal computer station and a fragmentary depiction of the gaming system play page showing the user upon a character display unit adjacent a fourth gaming table depiction.

FIG. 6 is a screenshot type depiction of a gaming system play page showing a fifth gaming table depiction, and a plurality of character display units positioned about the fifth gaming table depiction.

FIG. 7 is a screenshot type depiction of a gaming system play page showing a sixth gaming table depiction, and a plurality of character display units positioned about the sixth gaming table depiction.

FIG. 8 is a screenshot type depiction of a gaming system play page showing a seventh gaming table depiction, and a plurality of text boxes positioned about the seventh gaming table depiction.

FIG. 9 is a screenshot type depiction of a gaming system play page showing an eighth gaming table depiction, and a plurality of character display units positioned about the eighth gaming table depiction.

FIG. 10 is a screenshot type depiction of a gaming system play page showing a ninth gaming table depiction, a plurality of character display units positioned about the ninth gaming table depiction, a plurality of characters positioned about the ninth gaming table depiction, and a single text box positioned in adjacency to the ninth gaming table depiction.

FIG. 11 is a screenshot type depiction of a partial view of a generic stick man figure character as positioned adjacent a fragmentary gaming table depiction.

FIG. 12 is a screenshot type depiction of a gaming system menu page showing three alternative stick man figure characters with a textual line color selection prompt.

FIG. 13 is a screenshot type depiction of a gaming system menu page showing five alternative stick man figure faces with a textual face color selection prompt.

FIG. 14 is a screenshot type depiction of a gaming system menu page showing four hair selections, five nose selections, four eye selections, and an item selection prompt.

FIG. 15 is a screenshot type depiction of a partial view of a personalized stick man figure character as positioned adjacent a fragmentary gaming table depiction.

FIG. 16 is a screenshot type depiction of a gaming system menu page showing five animation selections.

FIG. 17 is an image of a dog and cat.

FIG. 18 is three images of the dog and cat shown in FIG. 17 in various static views, each of which represents a static image of animation.

FIG. 19 is a fragmentary side view depiction of a microphone stand assembly.

FIG. 20 is a fragmentary perspective view depiction of a headset assembly.

FIG. 21 is a fragmentary side view depiction of a web camera assembly positioned atop a computer monitor.

FIG. 22 is a front view depiction of a loudspeaker type visual depiction.

FIG. 23a is a front view of a loudspeaker type visual auditory depiction in silent mode.

FIG. 23b is a front view of a loudspeaker type visual auditory depiction in audio mode.

FIG. 24 is a flowchart type depiction describing how language may be visually and/or aurally presented to users of the gaming system.

FIG. 25 is a flowchart type depiction of how certain visual depictions of speakers indicate when a user or player is communicating and when a user or player is not communicating.

FIG. 26 is a screenshot type depiction of flower themed poker chips showing various colors and denominations of the poker chips.

FIG. 27 is a fragmentary screenshot type depiction of the sixth gaming table depiction shown in FIG. 7.

FIG. 28 is a screenshot type depiction of a themed character display unit showing a live action image of a user.

FIG. 29 is a screenshot type depiction of prison themed poker chips showing various denominations of the poker chips.

FIG. 30 is a screenshot type depiction of a tenth gaming table depiction showing end of game messages.

FIG. 31 is a screenshot type depiction of a character display unit menu with character display unit selection prompt.

FIG. 32 is a screenshot type depiction of a character menu.

FIG. 33 is a screenshot type depiction of a gaming menu showing a gaming table depiction prompt, a theme menu prompt, and a game menu prompt.

FIG. 34 is a screenshot type depiction of a theme menu showing a theme selection prompt.

FIG. 35 is a screenshot type depiction of a game menu showing a game selection prompt.

DETAILED DESCRIPTION OF THE PREFERRED SYSTEM AND METHOD(S)

Referring now to the drawings, the preferred embodiment of the present invention concerns an electronic gaming system for generally enhancing player interaction during an electronic gaming event, such as, but not limited to, on-line poker. In this regard, it is contemplated that poker players and the like who engage in an electronic or on-line gaming scenario may find it entertaining to interact with other players involved in the game if their persons were represented by real-time, animated characters, which characters can provide both visual and auditory stimulation to enhance the gaming experience. The gaming system comprises certain user-based data input means, certain user-based data output means, certain central processing means, and a communication network.

The user-based data input means may be defined by the group including, but not limited to, a keyboard assembly 10 for textual message input as generically illustrated and referenced in FIG. 5; certain microphonic input hardware such as a stand-alone microphone assembly 11 as generically illustrated and referenced in FIG. 19, or a head-set type microphone assembly 12 as illustrated and referenced in FIGS. 5 and 20; and a web-cam(era) type assembly 13 as generically illustrated and referenced in FIGS. 5 and 21. It is thus contemplated that the user-based data input means may be defined by comprising certain sound-inputting means, the sound-inputting means for incorporating auditory stimuli (such as a user's voice) into game play for enhancing player interaction therein. Further, it is contemplated that the user-based data input means may be defined by comprising certain light-inputting means, the light-inputting means for incorporating user-based visual stimuli (such as a user's real-time image) into game play for enhancing player interaction therein.

The user-based data output means may be defined by the group including, but not limited to, a computer monitor assembly 14 (or other visual display means) as generically illustrated and referenced in FIGS. 5 and 21; an audio speaker assembly or speaker assemblies 15 as generically illustrated and referenced in FIGS. 22, 23a, and 23b. It will thus be seen that the user-based data output means may preferably comprise certain visual output means as may be defined by a visual display device (or computer monitor) and certain auditory output means as may be defined by speaker assembly(ies).

The central-processing means 9 (as generically represented in FIG. 5) may be defined by the group comprising a master game server 15, a game execution server 16, a database server 17, a central processing unit 18, a main unit 19, a secondary back-up unit 20, a progressive controller 21, a display controller 22, and a game-linking controller 23 all as generically depicted as (interconnected) black boxes in FIG. 1. Players 24 (as represented by personal computers or similar other user-based gaming means) are further connected to the central processing means via a communication network 25 as further depicted in FIG. 1. It is contemplated that communication network 25 may well be defined by including the so-called Internet or similar other web-based networks so as to enable players in widely displaced locations to interact in a single gaming scenario. Thus, it should be understood that the communication network 25 functions to electronically interconnect the user-based data input means, the user-based data output means, and the central processing means thereby enabling a plurality of users to communicate with one another via the user-based data input and data output means in real-time, much in the same manner as instant messaging.
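By way of a non-limiting illustration only, the following Python sketch models the interconnection generically depicted in FIG. 1: central processing components joined by a communication network that relays each player's input to the other connected players in real time, much like instant messaging. The class and method names are assumptions made for illustration and are not taken from the disclosure.

```python
# Hypothetical sketch of the FIG. 1 interconnection; names are illustrative only.
from dataclasses import dataclass, field


@dataclass
class CentralProcessingMeans:
    """Black-box components generically depicted in FIG. 1, modeled as named units."""
    components: tuple = (
        "master game server", "game execution server", "database server",
        "central processing unit", "main unit", "secondary back-up unit",
        "progressive controller", "display controller", "game-linking controller",
    )


@dataclass
class Player:
    """A user-based station (personal computer or similar) joined to the network."""
    name: str
    inbox: list = field(default_factory=list)  # messages delivered to this player's output means


class CommunicationNetwork:
    """Relays each player's input to every other player in real time."""

    def __init__(self, cpu: CentralProcessingMeans):
        self.cpu = cpu
        self.players: list[Player] = []

    def connect(self, player: Player) -> None:
        self.players.append(player)

    def broadcast(self, sender: Player, message: str) -> None:
        # Route through the central processing means to all other connected players.
        for p in self.players:
            if p is not sender:
                p.inbox.append(f"{sender.name}: {message}")


# Example usage: two remotely located players exchanging a real-time message.
net = CommunicationNetwork(CentralProcessingMeans())
alice, bob = Player("Alice"), Player("Bob")
net.connect(alice)
net.connect(bob)
net.broadcast(alice, "I raise 50.")
print(bob.inbox)  # ['Alice: I raise 50.']
```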

Within the central processing means are certain data stores, which stores comprise character menu(s) and certain gaming menu(s). In this regard, it is contemplated that the character menu 26 may preferably comprise a plurality of animatable characters 25, as generally depicted in FIGS. 4, 10, 11, 15, and 32, from which the user can select a user-identifying character; a plurality of character display units 29; and certain predefined character commands. For example, if the user elects to play as a stickman 27 (as generally depicted in FIGS. 4, 11, and 15), the user may so select from the character menu 26 as generally depicted in FIG. 32. If the user elects to forego a character selection, opting instead to play with a live image of the user, it is contemplated that the user may do so via the web-cam assembly 13, whereby the user's image may be displayed as at 28 in FIGS. 4, 5, and 10. The user may also elect to play without any animated character at all. Alternatively, the user may elect to upload an image of his or her pet or pets as generally depicted in FIGS. 17 and 18 at 40. The user may elect to upload or submit various depictions of the pet (or of himself or herself) in various states of action to simulate static emotions such as (1) happy pets as typified by (a) a wagging tail as depicted in FIG. 18 at 41 or (b) a rewarded pet as depicted in FIG. 18 at 43; or (2) unhappy pets (as typified by scolded pets) as depicted in FIG. 18 at 42.
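For illustration only, the following sketch treats the character menu 26 as a data store: a user-identifying character may be a built-in animatable character, the user's live web-cam image, an uploaded pet image with static "emotion" frames, or no character at all. The field names and file names are assumptions, not part of the disclosure.

```python
# Hypothetical data model for the character menu 26; names are illustrative only.
from dataclasses import dataclass, field
from enum import Enum, auto


class CharacterSource(Enum):
    ANIMATABLE = auto()   # selected from the built-in animatable characters
    LIVE_IMAGE = auto()   # user's real-time web-cam image (ref. 28)
    PET_UPLOAD = auto()   # uploaded pet imagery in various states (refs. 40-43)
    HIDDEN = auto()       # the user opts out of any character


@dataclass
class UserIdentifyingCharacter:
    source: CharacterSource
    # For PET_UPLOAD: static frames keyed by emotion, e.g. {"happy": "wagging_tail.png"}.
    pet_frames: dict[str, str] = field(default_factory=dict)
    # For ANIMATABLE: the set of character commands cooperatively associated with it.
    commands: tuple[str, ...] = ("smile", "sit", "raise arms", "cry", "worry", "die")


# Example: a player uploads pet frames simulating static emotions (hypothetical file names).
pet = UserIdentifyingCharacter(
    source=CharacterSource.PET_UPLOAD,
    pet_frames={"happy-wag": "fig18_41.png", "rewarded": "fig18_43.png", "scolded": "fig18_42.png"},
)
print(pet.source.name, sorted(pet.pet_frames))
```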

In any event, it is contemplated that each animatable character may preferably be displayed upon a character display unit 29 as generically illustrated and referenced in FIGS. 2-9, which character display units may be defined by any number of framing type visual displays 30 as generally depicted in FIGS. 4, 5, 6, and 10; any number of text box type displays 31 as generally depicted in FIGS. 3, 8, and 10; and themed type displays 32 as generally depicted in FIGS. 7 and 9. If the player elects to remain invisible, he or she may select either a text box type display 31, an empty chair type display 33 (as referenced in FIGS. 6, 8, and 10), or similar other depiction representing that the player elects to remain hidden. It will be recalled that the light-inputting means may be defined by at least one live-action camera, each live-action camera for capturing a user's image. In this regard, it is contemplated that the user-identifying characters may be selectable from the group consisting of the animatable characters and/or the user's image. It is thus contemplated that each live-action camera may well function to incorporate user-based visual imagery into game play for enhancing player interaction.
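The brief sketch below merely enumerates the character display unit options described above as a single type, including the framing, text-box, themed, and empty-chair displays and a live-camera feed; the enum members and the helper function are illustrative assumptions.

```python
# Hypothetical enumeration of character display unit 29 options; illustrative only.
from enum import Enum


class CharacterDisplayUnit(Enum):
    FRAMING = "framing type visual display (ref. 30)"
    TEXT_BOX = "text box type display (ref. 31)"
    THEMED = "themed type display (ref. 32, e.g. flower or showerhead)"
    EMPTY_CHAIR = "empty chair type display (ref. 33)"
    LIVE_CAMERA = "live-action camera feed of the user's image (ref. 28)"


def unit_for_hidden_player() -> CharacterDisplayUnit:
    """A player electing to remain invisible keeps a text box or empty chair display."""
    return CharacterDisplayUnit.EMPTY_CHAIR


print(unit_for_hidden_player().value)
```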

It is further contemplated that the user may prefer to build a character and thus the gaming system may incorporate certain character-building means. In this regard, the gaming system may comprise a character-building menu. For ease of illustration, stickmen figures and body parts thereof have been illustrated in FIGS. 11-16. The character-building menu(s) may include means for selecting color(s) of character outlines as generally depicted in FIG. 12. Further, the menu(s) may include means for selecting the color of the character's face as generally depicted in FIG. 13. FIG. 14 generally depicts various body parts that may be incorporated into character design per the user's elections. The completed character may then be displayed on a screenshot as generally depicted in FIG. 15. Thus, the gaming system may preferably comprise certain character-building means and/or menu(s), the user-identifying character being constructed from the character-building means or menus for incorporating user-built characters into game play for enhancing player interaction.
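A minimal sketch of the character-building flow of FIGS. 12-15 follows, under assumed names: the user picks an outline color, a face color, and body parts (hair, nose, eyes), and the completed stick-figure character is assembled for display. Nothing here reflects an actual implementation of the disclosure.

```python
# Hypothetical character-building sketch for FIGS. 12-15; names are illustrative only.
from dataclasses import dataclass, field


@dataclass
class BuiltCharacter:
    outline_color: str = "black"                # FIG. 12: character line color selection
    face_color: str = "tan"                     # FIG. 13: face color selection
    parts: dict = field(default_factory=dict)   # FIG. 14: hair, nose, and eye selections

    def describe(self) -> str:
        chosen = ", ".join(f"{k}={v}" for k, v in self.parts.items()) or "no parts yet"
        return f"{self.outline_color} outline, {self.face_color} face, {chosen}"


# Example: building a personalized stickman from menu selections (FIG. 15).
character = BuiltCharacter(outline_color="blue", face_color="yellow")
character.parts.update(hair="spiky", nose="round", eyes="wide")
print(character.describe())
```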

With regard to the predefined character commands, it is contemplated that each animatable character will be cooperatively associated with a menu of character commands for that character. For example, each animatable character may be commanded to (1) smile 34, as comparatively depicted in FIGS. 4, 15, and 16; (2) sit 35, as comparatively depicted in FIGS. 4, 11, and 16; (3) raise arms 36 (as when winning), as depicted in FIG. 16; (4) cry 37 (as when one loses), as depicted in FIG. 16; (5) worry 38 (as when faced with losing), as depicted in FIG. 16; and (6) die 39 (as when one has lost all means to continue playing, i.e. has lost all money, life, health, or desire), as depicted in FIG. 16. FIG. 16 may thus be said to represent a screen shot of a command menu. The completed character as seen in FIG. 15 may then be animated per the character commands as generally depicted in FIG. 16.
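The following is a hypothetical dispatch-table sketch for the predefined character commands of FIG. 16. Each command maps to an animation routine for the selected user-identifying character; the routines here simply return a description, since the disclosure does not specify how the animation itself is rendered.

```python
# Hypothetical command-to-animation dispatch for FIG. 16; illustrative only.
ANIMATION_COMMANDS = {
    "smile": "character smiles (ref. 34)",
    "sit": "character sits (ref. 35)",
    "raise_arms": "character raises arms, as when winning (ref. 36)",
    "cry": "character cries, as when one loses (ref. 37)",
    "worry": "character worries, as when faced with losing (ref. 38)",
    "die": "character dies, as when all means to continue playing are lost (ref. 39)",
}


def animate(character_name: str, command: str) -> str:
    """Apply a predefined character command to a player's user-identifying character."""
    if command not in ANIMATION_COMMANDS:
        raise ValueError(f"unknown character command: {command!r}")
    return f"{character_name}: {ANIMATION_COMMANDS[command]}"


# Example: a winning player incorporates animated body language into game play.
print(animate("Stickman-27", "raise_arms"))
```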

Further, it is contemplated that the character commands may be operable to animate the user-identifying characters for incorporating word-based language into game play for enhancing player interaction. In this regard, it is contemplated that users or players engaged in the system may communicate with one another via text messaging as generally depicted at text boxes (or text circles) 44 in FIGS. 4 and 24, or via certain auditory exchange as enabled by the exemplary sound-inputting means as heretofore specified. The character commands may thus be operable to animate the user-identifying characters for incorporating either auditory or text-based (word-based) language into game play for enhancing player interaction therein. In this last regard, as a means to enhance the interactive effect of the gaming system, it is contemplated that onscreen visual depictions of speakers 45 or themed auditory elements 46 may be incorporated into the table depiction 50 as generally depicted in FIGS. 4-7, 10, and 22-23b. When a player inputs communication(s) into game play, it is contemplated that the visual depictions of speakers 45 or the like may light up as generally depicted at 47 in FIGS. 4 and 10. The use of, or incorporation of, visual speaker depictions 45 may help users track which player is speaking or communicating at any given time. FIG. 25 is a flowchart type depiction of how the visual depictions of speakers 45 indicate when a user or player is speaking or communicating with other players: a light is illuminated on the visual depiction of the speaker 45 while the player communicates, and the light is not illuminated when the user is silent or non-communicative.
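A small sketch of the behavior charted in FIG. 25 follows, under assumed names: when a player inputs a communication (text or voice), the on-screen speaker depiction 45 for that player lights up; when the player is silent, it is not illuminated.

```python
# Hypothetical speaker-indicator logic for FIG. 25; names are illustrative only.
class SpeakerIndicator:
    def __init__(self, player_name: str):
        self.player_name = player_name
        self.lit = False  # whether the visual speaker depiction is illuminated (ref. 47)

    def on_communication(self, message: str = "") -> str:
        """Light the speaker while the player communicates; darken it otherwise."""
        self.lit = bool(message.strip())
        state = "lit" if self.lit else "dark"
        return f"speaker for {self.player_name} is {state}"


indicator = SpeakerIndicator("Bob")
print(indicator.on_communication("All in!"))  # speaker for Bob is lit
print(indicator.on_communication())           # speaker for Bob is dark
```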

It is further contemplated that the gaming table depiction 50 may preferably comprise certain symbolic interactive element depictions 48 as referenced in FIGS. 7-10, such as chips 70, weapons 71, coins 72, and cigars 73. It is contemplated that the character commands may be operable to enable the user-identifying characters to interact with the interactive element depictions 48 for incorporating object-based language or gesture-based language into game play for enhancing player interaction. In other words, players may elect to brandish a weapon 71, or smoke a cigar 73, or stack coins 72 and/or chips 70. It is contemplated that the characters' ability to interact with the element depictions 48 increases the visual activity displayed upon the user-based data output means and thus serves to enhance user interaction.
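By way of a non-limiting sketch, the interactions with the symbolic interactive element depictions 48 can be modeled as a small lookup: a weapon 71 may be brandished, a cigar 73 smoked, and coins 72 or chips 70 stacked, incorporating object-based or gesture-based language into game play. The element names and reference numerals come from the description above; the function itself is an illustrative assumption.

```python
# Hypothetical interaction with the interactive element depictions 48; illustrative only.
INTERACTIVE_ELEMENTS = {"chips": 70, "weapon": 71, "coins": 72, "cigar": 73}

ELEMENT_ACTIONS = {
    "weapon": "brandishes a weapon",
    "cigar": "smokes a cigar",
    "coins": "stacks coins",
    "chips": "stacks chips",
}


def interact(character_name: str, element: str) -> str:
    """Animate a user-identifying character interacting with a table element depiction."""
    if element not in INTERACTIVE_ELEMENTS:
        raise ValueError(f"no such interactive element depiction: {element!r}")
    return f"{character_name} {ELEMENT_ACTIONS[element]} (ref. {INTERACTIVE_ELEMENTS[element]})"


print(interact("Stickman-27", "cigar"))
```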

The gaming menu 60 is contemplated to preferably comprise a gaming table depiction 50 as generically illustrated and referenced in FIGS. 2-10, 27, and 30; a theme menu 51 as generically depicted as a screen shot thereof in FIG. 34; and a game menu 52 as generically depicted as a screen shot thereof in FIG. 35. The theme menu 51 preferably comprises at least two gaming themes, such as a traditional card table theme 53 as depicted in FIGS. 2-6, 10, and 34; a flower theme 54 as depicted in FIGS. 7 and 34; a gangster theme 55 as depicted in FIGS. 8 and 34; and a prison theme 56 as depicted in FIGS. 9 and 34. It is contemplated that the game menu 52 may preferably comprise a plurality of games from which to choose, such as poker, blackjack, and the like, as generally depicted in FIG. 35. The gaming table depiction 50 (as modifiable by the selection of the theme), the character menu 26, and the gaming menu 60 are all displayable upon the visual output means as heretofore specified.

With regard to themes, it is contemplated that the gaming table depictions 50 may employ either a common character unit display theme as generally depicted at 90 in FIG. 34 or a miscellaneous character unit display theme as generally depicted at 91. Bearing in mind that the rules of the selected game remain unchanged, and that only the layout of the gaming setting is different, it is further contemplated that the selected theme may be exhaustively practiced. In other words, the character display units 29, the chips 70, the gaming table depiction 50, and the auditory elements 46 may all be of a flower theme as comparatively depicted in FIGS. 7 and 26-28. It will be seen from an inspection of the noted figures that the chips 70 may be of different colors; that the gaming table depiction 50 may comprise a flower-based or ornamented periphery; and that the character display unit(s) 29 may be in the form of a flower. In terms of animation, it is contemplated that when a player dies, the flower may die as well, as generally depicted at 94 in FIG. 7. Referring to the prison theme in more detail as generally depicted in FIG. 9, it will be seen that soaps 92 on ropes 93 are representative of the chips 70 (FIG. 29 illustrates a prison theme set of chips, some of which are soaps 92 on ropes 93) and that showerheads 95 are the character display units 29. Each player has his or her own showerhead. If the player has live camera input, his or her image will appear under the showerhead 95. It is contemplated that all themes may provide for a special prize at the end of the game as generally depicted in FIG. 30, the special prize for a jail theme being a soap on a rope collector set 96.
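The following is a minimal sketch, under assumed names, of exhaustively applying a selected theme across the gaming table depiction, chips, character display units, and end-of-game prize, as with the flower theme of FIGS. 7 and 26-28 and the prison theme of FIGS. 9 and 29; the game rules are unchanged, only the layout differs. Where the disclosure is silent (for example, the prison-theme elimination animation), the field value says so rather than guessing.

```python
# Hypothetical theme objects applied across the gaming setting; illustrative only.
from dataclasses import dataclass


@dataclass(frozen=True)
class Theme:
    name: str
    chip_style: str
    display_unit_style: str
    table_style: str
    end_of_game_prize: str
    elimination_animation: str  # what is shown when a player "dies"


FLOWER = Theme("flower", "flower themed chips in several colors (FIG. 26)",
               "flower-form character display units", "flower-ornamented table periphery",
               "flower themed prize", "the player's flower dies (ref. 94)")
PRISON = Theme("prison", "soaps 92 on ropes 93 standing in for chips (FIG. 29)",
               "showerhead character display units (ref. 95)", "prison themed table",
               "soap on a rope collector set (ref. 96)", "not specified in the disclosure")


def apply_theme(theme: Theme) -> str:
    """Summarize how one selected theme is carried through every themed element."""
    return (f"{theme.name} theme: {theme.chip_style}; {theme.display_unit_style}; "
            f"{theme.table_style}; prize: {theme.end_of_game_prize}")


print(apply_theme(PRISON))
```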

The users may then select (for example, by clicking on) a select character display unit 29, a user-identifying character, a select common game theme, and a select common game from the character and gaming menus 26 and 60. The user-identifying characters are selectable, in part, from the animatable characters 25. Preferably, the character display units 29 are visually presented in peripheral adjacency to the gaming table depiction 50 and the user-identifying characters are visually presented upon the character display units 29. For example, either the selected animatable characters 25 or the user's own imagery 28 may be depicted upon the character display unit(s) 29. The central processing means thus enable the users to play the selected common game (as commonly selected from the game menu) with the selected common theme (as selected from the theme menu). The character commands are ever-operable to animate the user-identifying characters for incorporating animated body language (such as smiling 34, sitting 35, arm raising 36, crying 37, worrying 38, and dying 39) into game play for enhancing player interaction therein.
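To tie the menus together, the sketch below shows one hypothetical end-to-end selection flow: each user selects a character display unit, a user-identifying character, a common game theme, and a common game, after which the central processing means run the selected game and the character commands remain operable to animate each character. All function and field names are illustrative assumptions.

```python
# Hypothetical selection and session-start flow; names are illustrative only.
from dataclasses import dataclass


@dataclass
class Selection:
    display_unit: str
    character: str
    theme: str
    game: str


def start_session(selections: dict[str, Selection]) -> str:
    """Verify that all players agree on a common theme and common game, then begin play."""
    themes = {s.theme for s in selections.values()}
    games = {s.game for s in selections.values()}
    if len(themes) != 1 or len(games) != 1:
        raise ValueError("all players must select a common game theme and a common game")
    return f"playing {games.pop()} with the {themes.pop()} theme among {', '.join(selections)}"


players = {
    "Alice": Selection("flower display unit", "stickman", "flower", "poker"),
    "Bob": Selection("live camera", "user image", "flower", "poker"),
}
print(start_session(players))
```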

While the foregoing specifications delineate much specificity, the same should not be construed as limiting the invention, but as providing a backdrop from which the essence of the present invention emerges. Thus, it is contemplated that the present invention discloses various inventive aspects stemming from the same core concepts, including a gaming system for enhancing player interaction during an on-line or networked gaming event, the gaming system comprising certain user-based data input means, certain user-based data output means, certain central processing means, and a communication network. The user-based data output means may preferably comprise certain visual output means and certain auditory data output means, the visual and auditory data output means enabling the users to communicate with visual and auditory messages.

The communication network electronically interconnects the user-based data input means, the user-based data output means, and the central processing means enabling at least two users to communicate with one another via the user-based data input and data output means on-line or via the network in real-time. The central processing means may preferably comprise a character menu and a gaming menu, the character menu comprising at least two animatable characters, at least two character display units, and predefined character commands. Each animatable character is cooperatively associated with a set of character commands. The gaming menu may preferably comprise a theme menu and a game menu, the theme menu comprising at least one gaming theme and the game menu comprising at least one game.

The character menu and the gaming menu are displayable upon the visual output means. Users of the system may then select a select character display unit, a user-identifying character, a select common game theme, and a select common game from the character and gaming menus. The user-identifying characters are selectable from the animatable characters. The character display units are visually presented in adjacency to one another and the user-identifying characters are visually presented upon the character display units. The central processing means enable the users to play the selected common game with the selected common theme, and the character commands are operable to animate the user-identifying characters for incorporating animated body language into game play for enhancing player interaction.

The user-based data input means may preferably comprise certain sound-inputting means and certain light-inputting means, the sound-inputting means for inputting real-time auditory stimuli into game play and the light-inputting means for inputting visual stimuli into game play for enhancing player interaction. The light-inputting means may be defined by at least one live-action camera, each live-action camera for capturing and conveying a user's image. The user-identifying characters are thus selectable not only from the animatable characters but also, where available, from the user's image. Each live-action camera may thus function to incorporate user-based imagery into game play for enhancing player interaction.

Accordingly, although the invention has been described by reference to a preferred system, it is not intended that the novel system(s) be limited thereby, but that modifications thereof are intended to be included as falling within the broad scope and spirit of the foregoing disclosure, the following claims and the appended drawings.

Claims

1. An electronic gaming system for enhancing player interaction during an electronic gaming event, the gaming system comprising user-based data input means, user-based data output means, central processing means, and a communication network, the user-based data output means comprising visual output means and auditory output means, the communication network electronically interconnecting the user-based data input means, the user-based data output means, and the central processing means enabling a plurality of users to communicate with one another via the user-based data input and data output means in real-time, the central processing means comprising a character menu and a gaming menu, the character menu comprising a plurality of animatable characters, a plurality of character display units, and predefined character commands, each animatable character being cooperatively associated with a set of character commands, the gaming menu comprising a gaming table depiction, a theme menu, and a game menu, the theme menu comprising at least two gaming themes, the game menu comprising at least one game, the gaming table depiction, the character menu, and the gaming menu being displayable upon the visual output means, the users each selecting a select character display unit, a user-identifying character, a select common game theme, and a select common game from the character and gaming menus, the user-identifying characters being selectable from the animatable characters, the character display units being visually presented in peripheral adjacency to the gaming table depiction, the user-identifying characters being visually presented upon the character display units, the central processing means enabling the users to play the selected common game with the selected common theme, the character commands being operable to animate the user-identifying characters for incorporating animated body language into game play for enhancing player interaction therein.

2. The gaming system of claim 1 wherein the user-based data input means comprise sound-inputting means, the sound-inputting means for incorporating auditory stimuli into game play for enhancing player interaction therein.

3. The gaming system of claim 1 wherein the user-based data input means comprise light-inputting means, the light-inputting means for incorporating user-based visual stimuli into game play for enhancing player interaction therein.

4. The gaming system of claim 3 wherein the light-inputting means is defined by at least one live-action camera, each live-action camera for capturing a user's image, the user-identifying characters being selectable from the group consisting of the animatable characters and the user's image, each live-action camera for incorporating user imagery into game play for enhancing player interaction therein.

5. The gaming system of claim 1 wherein the character commands are operable to animate the user-identifying characters for incorporating word-based language into game play for enhancing player interaction therein.

6. The gaming system of claim 5 wherein the character commands are operable to animate the user-identifying characters for incorporating auditory word-based language into game play for enhancing player interaction therein.

7. The gaming system of claim 1 wherein the gaming table depiction comprises symbolic interactive element depictions, the character commands being operable to enable the user-identifying characters to interact with the interactive element depictions for incorporating object-based language into game play for enhancing player interaction therein.

8. The gaming system of claim 1 wherein the character menu comprises a character-building menu, the user-identifying character being constructed from the character-building menu for incorporating user-built characters into game play for enhancing player interaction therein.

9. An electronic gaming system for enhancing player interaction during an electronic gaming event, the gaming system comprising user-based data input means, user-based data output means, central processing means, and a communication network, the user-based data output means comprising visual output means, the communication network electronically interconnecting the user-based data input means, the user-based data output means, and the central processing means enabling at least two users to communicate with one another via the user-based data input and data output means in real-time, the central processing means comprising a character menu and a gaming menu, the character menu comprising at least two animatable characters, at least two character display units, and a plurality of predefined character commands, each animatable character being cooperatively associated with a set of character commands, the gaming menu comprising a theme menu and a game menu, the theme menu comprising at least one gaming theme, the game menu comprising at least one game, the character menu and the gaming menu being displayable upon the visual output means, the users each selecting a select character display unit, a user-identifying character, a select common game theme, and a select common game from the character and gaming menus, the user-identifying characters being selectable from the animatable characters, the character display units being visually presented in adjacency to one another, the user-identifying characters being visually presented upon the character display units, the central processing means enabling the users to play the selected common game with the selected common theme, the character commands being operable to animate the user-identifying characters for incorporating animated body language into game play for enhancing player interaction.

10. The gaming system of claim 9 wherein the user-based data output means comprise auditory data output means, the auditory data output means enabling the users to communicate with auditory messages via the character commands.

11. The gaming system of claim 10 wherein the user-based data input means comprise sound-inputting means, the sound-inputting means for inputting real-time auditory stimuli into game play for enhancing player interaction.

12. The gaming system of claim 11 wherein the user-based data input means comprise light-inputting means, the light-inputting means for inputting visual stimuli into game play for enhancing player interaction.

13. The gaming system of claim 12 wherein the light-inputting means is defined by at least one live-action camera, each live-action camera for capturing and conveying a user's image, the user-identifying characters being selectable from the group consisting of the animatable characters and the user's image, each live-action camera thus for incorporating user imagery into game play for enhancing player interaction.

14. The gaming system of claim 13 wherein the theme menu comprises a gaming table depiction, the gaming table depiction being visually depicted upon the visual output means, the character display units being visually presented in adjacency to one another about the periphery of the gaming table depiction for enhancing player interaction.

15. The gaming system of claim 14 wherein the gaming table depiction comprises symbolic interactive element depictions, the character commands being operable to enable the user-identifying characters to interact with the interactive element depictions for incorporating object-based language into game play for enhancing player interaction therein.

16. A gaming system for enhancing player interaction during an on-line gaming event, the gaming system comprising user-based data input means, user-based data output means, central processing means, and a communication network, the user-based data output means comprising visual output means, the communication network electronically interconnecting the user-based data input means, the user-based data output means, and the central processing means enabling at least two users to communicate with one another via the user-based data input and data output means on-line in real-time, the central processing means comprising a character menu and a gaming menu, the character menu comprising at least two animatable characters, at least two character display units, and predefined character commands, each animatable character being cooperatively associated with a set of character commands, the gaming menu comprising at least one game, the character menu and the gaming menu being displayable upon the visual output means, the users each selecting a select character display unit, a user-identifying character, a select common game theme, and a select common game from the character and gaming menus, the user-identifying characters being selectable from the animatable characters, the character display units being visually presented in adjacency to one another, the user-identifying characters being visually presented upon the character display units, the central processing means enabling the users to play the selected common game with the selected common theme, the character commands being operable to animate the user-identifying characters for incorporating animated body language into game play for enhancing player interaction.

17. The gaming system of claim 16 wherein the user-based data output means comprise auditory input-output means, the auditory input-output means enabling the users to communicate with auditory messages.

18. The gaming system of claim 16 wherein the user-based data input means comprises at least one live-action camera, each live-action camera for capturing a user's image, the user-identifying characters being selectable from the group consisting of the animatable characters and the user's image, each live-action camera for incorporating user imagery into game play for enhancing player interaction.

19. The gaming system of claim 16 wherein the character commands are operable to animate the user-identifying characters for incorporating gesture-based language into game play for enhancing player interaction.

20. The gaming system of claim 16 wherein the theme menu comprises a gaming table depiction, the gaming table depiction being visually depicted upon the visual output means, the character display units being visually presented in adjacency to one another about the periphery of the gaming table depiction for enhancing player interaction.

Patent History
Publication number: 20070283265
Type: Application
Filed: May 16, 2006
Publication Date: Dec 6, 2007
Inventor: Michael D. Portano (Palatine, IL)
Application Number: 11/435,328
Classifications
Current U.S. Class: Virtual Character Or Avatar (e.g., Animated Person) (715/706); In A Chance Application (463/16)
International Classification: A63F 9/24 (20060101);