COMBINING MOTION CAPTURE AND TIMING TO CREATE A VIRTUAL GAMING EXPERIENCE

A method for creating a virtual gaming experience without avatars is disclosed. The method described herein may be a computer-implemented software process. Conventional virtual games may require an avatar on a screen with which the player interacts. The present invention provides for direct interaction between two or more players, without the need for a separate visual representation on a screen or television. The two or more gamers may be within view of each other, thus obviating the need for a separate visual representation. The process described herein may use basic motion capture (using accelerometer technology) and timing to represent gaming maneuvers and situations to create a virtual gaming experience in real time. In this way, players are able to physically react to the other players and to their actions and reactions.

Description
CROSS-REFERENCE TO RELATED APPLICATION

This application claims the benefit of priority of U.S. provisional patent application number 61/438,379, filed Feb. 1, 2011, the contents of which are herein incorporated by reference.

BACKGROUND OF THE INVENTION

The present invention generally relates to games and, more particularly, to a virtual gaming experience combining motion capture and timing.

Conventional video gaming uses an avatar on a screen with which the player reacts and interacts. This type of gaming requires the use of a separate visual representation on a screen or television and, in multi-player games, the gamers do not directly interact with each other.

As can be seen, there is a need for a video game system that permits the players to directly interact without the need for a separate visual representation.

SUMMARY OF THE INVENTION

In one aspect of the present invention, a method for creating a virtual gaming experience comprises establishing a connection between all players of the game and a gaming system, each player being within view of each other player; receiving input of a player action by the gaming system, the player action pertaining to the virtual gaming experience, wherein the action equates to a physical motion combined with a timed response; determining whether the player action has an effect on a previous action, if any, by another player; if the player action has an effect on the previous action, if any, by another player, then assessing whether the timing of the response was within timing tolerance limits; and determining consequences for said player action and previous action, if any.

In another aspect of the present invention, a gaming system comprises one or more motion sensors controlled by one or more players; and a processor adapted to detect motion of the one or more motion sensors and create a player action pertaining to the virtual gaming experience, wherein the player action equates to a physical motion combined with a timed response, wherein the processor determines whether the player action has an effect on a previous action, if any, by another player and, if the player action has an effect on the previous action, if any, by another player, then assesses whether the timing of the response was within timing tolerance limits; and the processor determines consequences for said player action and previous action, if any.

These and other features, aspects and advantages of the present invention will become better understood with reference to the following drawings, description and claims.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a flow chart describing a method for creating a virtual gaming experience in accordance with one embodiment of the present invention; and

FIG. 1A illustrates a flow chart describing action logic for creating a virtual gaming experience in accordance with one embodiment of the present invention.

DETAILED DESCRIPTION OF THE INVENTION

The following detailed description is of the best currently contemplated modes of carrying out exemplary embodiments of the invention. The description is not to be taken in a limiting sense, but is made merely for the purpose of illustrating the general principles of the invention, since the scope of the invention is best defined by the appended claims.

Broadly, an embodiment of the present invention generally provides a method for creating a virtual gaming experience without avatars. The method described herein may be a computer-implemented software process.

Prior art virtual games may require an avatar on a screen with which the player interacts. The present invention provides for direct interaction between two or more players, without the need for a separate visual representation on a screen or television. The two or more gamers may be within view of each other, thus obviating the need for a separate visual representation. The process described herein may use basic motion capture (using accelerometer technology) and timing to represent gaming maneuvers and situations to create a virtual gaming experience in real time. In this way, players are able to physically react to the other players and to their actions and reactions. The data generated by the motion capture devices may be delivered to a processor of a gaming system. Each player may use one or more such devices; for some games, the motion capture devices may be incorporated into another device, such as a baseball bat, and in some embodiments the players may attach motion capture devices to various body parts to capture the player's motion.
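The disclosure does not prescribe any particular data format for delivering this motion data to the processor. The following is a minimal sketch, in Python, assuming each accelerometer-equipped device streams timestamped three-axis samples to the gaming system and a simple magnitude threshold flags a candidate maneuver such as a bat swing; the field names, sample structure, and threshold value are illustrative assumptions rather than part of the disclosure.

```python
import math
import time
from dataclasses import dataclass

@dataclass
class AccelSample:
    """One timestamped reading from a player's motion capture device (hypothetical format)."""
    player_id: str
    device_id: str   # e.g., a sensor embedded in a bat or strapped to a wrist
    timestamp: float # seconds, from the device's clock
    x: float         # acceleration in g along each axis
    y: float
    z: float

SWING_THRESHOLD_G = 2.5  # illustrative threshold; a real game would tune this per gesture

def magnitude(sample: AccelSample) -> float:
    """Overall acceleration magnitude, used as a crude motion-intensity signal."""
    return math.sqrt(sample.x ** 2 + sample.y ** 2 + sample.z ** 2)

def detect_motion(sample: AccelSample) -> bool:
    """Flag a candidate gaming maneuver when acceleration exceeds the threshold."""
    return magnitude(sample) > SWING_THRESHOLD_G

# Example: a hard swing registers, a gentle movement does not.
hard = AccelSample("player1", "bat-sensor", time.time(), 2.0, 1.8, 0.5)
soft = AccelSample("player1", "bat-sensor", time.time(), 0.2, 0.1, 0.9)
assert detect_motion(hard) and not detect_motion(soft)
```

In practice, a game would more likely match a sequence of samples against a known gesture pattern rather than a single-sample threshold, but the data path from device to processor is the same.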

Referring now to FIG. 1, illustrated is a flow chart describing a method for creating a virtual gaming experience in accordance with one embodiment of the present invention. At step 1, a connection may need to be established between all users so that the actions of different users can be processed on a real-time basis with no substantial lag or delay between the actions and reactions of the different users.
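The disclosure leaves the transport for this connection open. As one hypothetical illustration in Python, devices on the same local network could announce themselves over UDP broadcast and collect the announcements of nearby players; the port number and message format are assumptions made only for this sketch.

```python
import json
import socket

SESSION_PORT = 47200  # arbitrary illustrative port

def announce_player(player_id: str) -> None:
    """Broadcast a join message so other players' devices learn about this player."""
    msg = json.dumps({"type": "join", "player": player_id}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(msg, ("255.255.255.255", SESSION_PORT))

def listen_for_players(timeout_s: float = 2.0) -> set[str]:
    """Collect join announcements from nearby players during a short discovery window."""
    players: set[str] = set()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", SESSION_PORT))
        sock.settimeout(timeout_s)
        try:
            while True:
                data, _addr = sock.recvfrom(1024)
                msg = json.loads(data.decode("utf-8"))
                if msg.get("type") == "join":
                    players.add(msg["player"])
        except socket.timeout:
            pass
    return players
```

A production game would also need to handle dropped packets and keep the devices' clocks loosely synchronized so that the timing comparisons described below remain meaningful.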

After the connection is established at step 1, game play may begin at step 2. The game play in step 2 may involve a number of additional steps: a step 4, involving motion detection and action, where the action may equate to some sort of physical motion combined with a timed response or timing parameter; a step 5, where a response may include some programmed reaction representing an activity; and a step 6, where each of these actions and responses requires a timing element to allow the interactive experience to be fully realized. Additional details for these steps, as well as the game play, are described further below.
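One hypothetical way to represent such an action in code is shown below: a detected physical motion bundled with the timing parameters that govern how long opponents have to respond. The class name, fields, and example window length are assumptions for illustration, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class PlayerAction:
    """An in-game action: a detected physical motion plus its timing parameters."""
    player_id: str
    motion: str               # e.g., "pitch", "swing", "fire", "block"
    started_at: float         # timestamp when the motion was detected (step 4)
    response_window_s: float  # how long opponents have to react (step 6)

    def response_deadline(self) -> float:
        """Latest time at which an opposing action can still affect this one."""
        return self.started_at + self.response_window_s

# Example: a pitched fastball that a batter must answer within roughly half a second.
pitch = PlayerAction("player1", "pitch", started_at=0.0, response_window_s=0.5)
print(pitch.response_deadline())
```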

The addition of a global positioning system could enhance the experience by determining the users' locations and reacting accordingly (mainly to be used in conjunction with the optional compass element described immediately hereinafter). The addition of a compass, or directional information, can enhance the experience by determining the users' direction and possible intent, and reacting accordingly.

The addition of sound can further enhance the experience. For example, gun shots in battle situations, the crack of the bat in baseball games, or a virtual ping pong ball hitting the table can help a player determine when to react, such as when to swing a virtual paddle. Vibration can also add to the experience. A vibration could allow a player to sense when they are hit with a bullet or a sword. It could also let the player know the magnitude of damage they are taking with such a hit. Sound and vibration could further be used to help players anticipate the timing element. For example, in a baseball game, a fastball would make a higher pitched sound than a change-up, which would help the batter determine when to swing the bat. In a ping pong game, the players may hear (with sound) and feel (with vibration) the ball bounce off the table, and that can help them judge the timing of when to swing in order to return the ping pong ball.

Lights and lighting effects can also add to the experience. Lights and/or lighting effects may allow a player to see a reaction; for example, a light may illuminate when a user strikes a ping pong ball. Lighting effects may also be used to describe closeness to an object. Using the ping pong game as an example, a series of lights may light up as the virtual ball approaches the virtual paddle; as the ball gets closer, more lights may illuminate. Lights and lighting effects may be used in other ways to enhance the gaming experience.
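As a rough illustration of how such cues might be driven, the sketch below maps the virtual ball's distance to a number of illuminated lights and maps pitch speed to an audio tone so that a fastball sounds higher pitched than a change-up; the specific ranges and mappings are assumptions, not taken from the disclosure.

```python
def lights_for_distance(distance_m: float, max_distance_m: float = 3.0, num_lights: int = 8) -> int:
    """Number of lights to illuminate as the virtual ball approaches the paddle:
    the closer the ball, the more lights are lit (illustrative linear mapping)."""
    distance_m = max(0.0, min(distance_m, max_distance_m))
    return round(num_lights * (1.0 - distance_m / max_distance_m))

def tone_for_pitch(pitch_speed_mph: float) -> float:
    """Map pitch speed to an audio frequency in Hz, so a fastball sounds higher than a change-up,
    giving the batter a timing cue. The mapping (70 mph -> 400 Hz, 100 mph -> 1000 Hz) is assumed."""
    return 400.0 + (pitch_speed_mph - 70.0) * 20.0

assert lights_for_distance(3.0) == 0 and lights_for_distance(0.0) == 8
assert tone_for_pitch(95) > tone_for_pitch(75)  # fastball higher pitched than change-up
```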

Referring now to FIG. 1A, illustrated is a flow chart describing action logic for creating a virtual gaming experience in accordance with one embodiment of the present invention. When two or more users decide that they want to use this game, they will stand in the vicinity and within eyesight of each other and begin. At that point, each user becomes the avatar and makes the motions that he would be using in real life; for example, if he were playing baseball, he would make the motion of throwing a ball or swinging a bat; for a game involving guns, he would make a physical motion that would represent firing a gun; and a sword would involve a relatively simple, easily understood motion.

According to this action logic, a first user may perform an action at step 20. The action is any action that the game allows, and may equate to some physical motion combined with a timed response. The response may be a programmed reaction representing an activity. Some examples of such responses may include, but are not limited to, hitting a ball, shooting a gun, dodging an attack, blocking with a sword, attacking with a magical fireball, or throwing a football. Each of the actions and responses requires a timing element. This element may allow for the interactive experience to be fully realized. For example, each action may need some sort of timing parameters in order to allow the other player(s) to respond or not respond.
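A simple, hypothetical way to encode which responses the game recognizes for a given action, together with an illustrative response window for each, is a lookup table such as the following; the pairings and window lengths are assumptions, not taken from the disclosure.

```python
# Hypothetical table of actions, the counter-responses the game recognizes for each,
# and the timing window within which those responses must arrive.
COUNTER_RESPONSES = {
    "pitch":          {"responses": {"swing"},              "window_s": 0.5},
    "fire_gun":       {"responses": {"dodge", "block"},     "window_s": 0.3},
    "sword_attack":   {"responses": {"block", "dodge"},     "window_s": 0.4},
    "fireball":       {"responses": {"shield"},             "window_s": 0.6},
    "throw_football": {"responses": {"catch", "intercept"}, "window_s": 1.0},
}

def is_valid_response(first_motion: str, response_motion: str) -> bool:
    """True if the game treats response_motion as a legal reaction to first_motion."""
    entry = COUNTER_RESPONSES.get(first_motion)
    return entry is not None and response_motion in entry["responses"]

assert is_valid_response("pitch", "swing")
assert not is_valid_response("pitch", "block")
```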

After seeing the first user perform an action, a second or subsequent user may have the potential ability to react to the first user's action with another action. If done correctly and with accurate timing, this subsequent action may take effect and cause the desired result. At step 40, it is determined whether the second or subsequent user performs an action. If not, then at step 50, the first action is completed as directed by the first user.

Alternatively, rather than performing no action, the second or subsequent user may perform such an action at step 30. For example, if player one attacks, a second player may block. Also by way of example, if one player pitches a baseball, a second player may swing a bat. Once the game has started, depending upon the game and the implementation of the actions, the order may change with each game.

If, at step 40, it was determined that the second or subsequent user performed an action, then, at step 60, it is determined whether this latest action has any effect on the first user's actions. If not, then at step 50, the first action is completed as directed by the first user.

If it is determined, at step 60, that the latest action does have an effect on the first user's actions, then, at step 70, it is determined whether the latest action was within the timing parameters or limits set for response. If, at step 70, it is determined that the latest action was outside the timing parameters/limits set for response, then, at step 50, the first action may be completed as directed by the first user.

If the latest action was within such timing limits at step 70, then, at step 80, the result of the first action is changed based on the second or subsequent user's latest action. The process described herein may include a pre-set end to a particular action that would compel the action to end. For example, if a user runs out of energy, the user dies in the game. If time runs out, the game is over. If the user is killed with a sword, the user dies. If someone fails to return a ping pong ball, the point is won.
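Expressed as code, the decision flow of FIG. 1A (steps 20 through 80) might look like the following sketch; the function, its arguments, and the data shapes are assumptions made only to illustrate the branching, not a definitive implementation of the disclosed method.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Action:
    player_id: str
    motion: str
    timestamp: float

def resolve(first: Action,
            response: Optional[Action],
            affects: bool,
            window_s: float) -> str:
    """Mirror FIG. 1A: decide whether a response changes the outcome of the first action.

    first    -- the first user's action (step 20)
    response -- the second or subsequent user's action, or None if no action was taken (steps 30/40)
    affects  -- whether the game rules say the response can affect the first action (step 60)
    window_s -- timing tolerance for the response (step 70)
    """
    if response is None:
        return "first action completed as directed"          # step 50
    if not affects:
        return "first action completed as directed"          # step 50
    if response.timestamp - first.timestamp > window_s:
        return "response too late; first action completed"   # step 70 -> step 50
    return "result of first action changed by response"      # step 80

# Example: a block raised 0.2 s after a sword attack, within a 0.4 s window, changes the result.
attack = Action("player1", "sword_attack", timestamp=10.0)
block = Action("player2", "block", timestamp=10.2)
print(resolve(attack, block, affects=True, window_s=0.4))
```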

Referring back to FIG. 1, at step 3, once all factors are assessed, the consequences of said actions of FIG. 1A may take place. For example, one of the characters may lose his health, a user may hit a home run, a user may catch a football pass, or someone may be killed in battle, just to name a few possible consequences.

The method described herein may be used with phones running iPhone® or Android® OS software as well as other phones. The method described herein may also be used with mobile applications, certain wireless controllers, desktop computers, laptop computers, iPads® and other tablets and hardware. The method may also be used with other devices that can capture human motion. The present method could also be used in places like a stock exchange to facilitate communication and trading. Many other possible uses also exist.

In an alternate embodiment of the present invention, the system described above may be used to track the movements of a single player. For example, if the player holds the device like a golf club, the player can practice a golf swing. The app or hardware could compare the motion data of the swing with standards and tell the player how they did. In some embodiments, the motion detection device could be attached to a golf club during play and a user may input swing results into the system; this could help teach, or calibrate, the device.
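The disclosure does not specify how a captured swing would be compared with a standard. One minimal sketch, assuming both the captured swing and the reference are acceleration traces sampled at the same rate and trimmed to the same length, is a simple root-mean-square comparison; the function name and example data are hypothetical.

```python
import math

def swing_score(captured: list[float], reference: list[float]) -> float:
    """Compare a captured swing's acceleration trace with a reference 'standard' swing.

    Returns a root-mean-square error; lower means the practice swing more closely
    matches the reference. Equal-length, equally sampled traces are assumed.
    """
    if len(captured) != len(reference):
        raise ValueError("traces must be the same length for this simple comparison")
    return math.sqrt(sum((c - r) ** 2 for c, r in zip(captured, reference)) / len(reference))

# Example: a practice swing that tracks the reference closely scores lower than a sloppy one.
reference = [0.1, 0.8, 2.4, 3.1, 1.2, 0.3]
good_swing = [0.2, 0.9, 2.3, 3.0, 1.1, 0.4]
poor_swing = [0.0, 0.3, 1.0, 1.5, 2.8, 1.9]
assert swing_score(good_swing, reference) < swing_score(poor_swing, reference)
```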

Other elements that could be used to enhance the experience include HMDs (Head Mounted Displays) as well as various forms of Augmented Reality. These technologies could allow the user to view the existing world while at the same time overlaying and superimposing images of gameplay effects. As a user swings a golf club, the HMD would have the ability to display a representation of the ball in flight. In a game with magicians, as you wave a wand toward an enemy, you can see both the person you are playing against in real life and the fireball being thrown at her. One way this effect can be accomplished in HMD hardware is with the use of lenses with mirrors and/or semi-transparent mirrors, which allow these computer generated images to be superimposed onto a real world view. This would allow other effects, such as simulating different battle scenarios: watching an arrow fly at a target in a cowboys and Indians game, or viewing the muzzle flash and smoke that could be overlaid on the reality behind.

It should be understood, of course, that the foregoing relates to exemplary embodiments of the invention and that modifications may be made without departing from the spirit and scope of the invention as set forth in the following claims.

Claims

1. A method for creating a virtual gaming experience, comprising:

establishing a connection between all players of the game and a gaming system, each player being within view of each other player;
receiving input of a player action by the gaming system, the player action pertaining to the virtual gaming experience, wherein the action equates to a physical motion combined with timed responses and/or timing parameters;
determining whether the player action has an effect on a previous action, if any, by another player;
if the player action has an effect on the previous action, if any, by another player, then assessing whether the timing of the response was within timing tolerance limits or parameters; and
determining consequences for said player action and previous action, if any.

2. The method of claim 1, further comprising obtaining directional information on each of the players.

3. The method of claim 1, further comprising creating at least one of a vibration, a sound and a light detectable by the player, the vibration, the sound and the light being responsive to the player action.

4. The method of claim 1, further comprising detecting a location of each player with a global positioning device.

5. The method of claim 1, further comprising recording actions made by each of the players for later analysis.

6. The method of claim 1, further comprising utilizing an augmented reality device to provide the player with gameplay effects.

7. The method of claim 1, further comprising utilizing an HMD (Head or Helmet Mounted Display) to provide the player with gameplay effects.

8. A gaming system comprising:

one or more motion sensors controlled by one or more players; and
a processor adapted to detect motion of the one or more motion sensors and create a player action pertaining to the virtual gaming experience, wherein the player action equates to a physical motion combined with timed responses and/or timing parameters, wherein
the processor determines whether the player action has an effect on a previous action, if any, by another player and, if the player action has an effect on the previous action, if any, by another player, then assesses whether the timing of the response was within timing tolerance limits; and
the processor determines consequences for said player action and previous action, if any.

9. The gaming system of claim 8, further comprising one or more effects generators, the effects generators adapted to create at least one of a vibration, a sound and a light.

10. The gaming system of claim 8, wherein the gaming system does not include a separate visual representation of the players.

11. The gaming system of claim 8, further comprising at least one augmented reality device adapted to provide the players with gameplay effects.

12. The gaming system of claim 8, further comprising at least one HMD (Head or Helmet Mounted Display) adapted to provide the players with gameplay effects.

Patent History
Publication number: 20120196684
Type: Application
Filed: Oct 17, 2011
Publication Date: Aug 2, 2012
Inventor: DAVID RICHARDSON (Wilton, CT)
Application Number: 13/275,212
Classifications
Current U.S. Class: Network Type (e.g., Computer Network, Etc.) (463/42)
International Classification: A63F 9/24 (20060101);