Game program and game apparatus

- Nintendo Co., Ltd.

Sound generated by handclaps of a player is converted into an electric signal by a microphone, and the electric signal is inputted to a game apparatus. When a jump instruction is inputted, a player character jumps. However, the character jumps in accordance with the handclaps of the player only when the character is positioned on a jump ramp set in a virtual game world. In a case where ambient noise is inputted through the microphone when the character is not positioned on the jump ramp, the player character does not jump, but performs a provocative action. The provocative action does not exert any influence on a movement of the player character. Thus, in an action game for causing the player character to perform a predetermined action in accordance with a sound input, it becomes possible to prevent the player character from performing an action which is not intended by the player, even if the sound input is mistakenly detected.

Description
CROSS REFERENCE TO RELATED APPLICATION

The disclosure of Japanese Patent Application No. 2006-130776, filed May 9, 2006, is incorporated herein by reference.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a game program and game apparatus capable of controlling a character displayed on a screen in accordance with a sound input.

2. Description of the Background Art

As a conventional game which causes a character displayed on a screen to move by a sound input, there is a game disclosed in Japanese Laid-Open Patent Publication No. 2005-319041 (hereinafter, referred to as patent document 1). Patent document 1 discloses a game which controls a character by operating a percussion controller comprised of two congas disposed side-by-side. More specifically, when a player hits a right conga, the character moves to the right. When the player hits a left conga, the character moves to the left. Finally, when the player simultaneously hits both the right and left congas, the character jumps. The percussion controller includes a sound detecting device, and when the sound detecting device detects sound generated by handclaps of the player, for example, the character performs an action so as to toss an item.

However, in the game disclosed in patent document 1, even when the sound detecting device merely detects ambient noise around the player, the character performs an action that the player does not intend. In patent document 1, for example, when the player claps his or her hands in the vicinity of the percussion controller, the sound of the handclaps is detected as a sound input. However, the percussion controller is operated by being hit by the player, and therefore sound generated by hitting the percussion controller may be mistakenly detected as the sound of handclaps. As a result, the character may frequently perform the action of tossing the item, even when the player merely attempts to move the character.

SUMMARY OF THE INVENTION

Therefore, an object of the present invention is to prevent, in an action game which causes a player character to perform a predetermined action in accordance with a sound input, the player character from performing an action which is not intended by a player, even if the sound input is mistakenly detected.

The present invention has the following features to attain the object mentioned above. Note that reference numerals and figure numbers are shown in parentheses below for assisting a reader in finding corresponding components in the figures to facilitate the understanding of the present invention, but they are in no way intended to restrict the scope of the invention.

A computer-readable storage medium according to the present invention is a computer-readable storage medium storing a game program instructing a computer (31) of a game apparatus (3), which is connected to sound inputting means (6M) and a display apparatus (2), to function as: display controlling means (S32) for generating a game image including a player object, of a virtual game world, which is operated by a player, and causing the display apparatus to display the game image; movement controlling means (S30) for causing the player object to move in the virtual game world; sound detecting means (S42) for determining whether a sound is inputted through the sound inputting means; object position determining means (S54) for determining whether the player object is positioned in a specific area of the virtual game world; and action controlling means (S56) for causing the player object to perform a specific action, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world.

Note that “the movement controlling means” may cause the player object to move in accordance with an instruction of the player, or may allow the computer to automatically move the player object. In a case where the sound is inputted through the sound inputting means when the player object is positioned outside the specific area, the movement controlling means may cause the player object not to perform any action in response to the inputted sound, or may cause the player object to perform another action different from the specific action. The action controlling means determines whether or not the player object is positioned in the specific area based on a positional relationship between the current position of the player object and the specific area. For example, the action controlling means may determine whether or not the player object will be positioned inside the specific area in the near future (e.g., in the frame immediately following the current frame), taking into consideration the current position of the player object with respect to the specific area, or the current moving direction with respect to the current position of the player object. Also, the specific action may be an action related to the specific area. Furthermore, the specific action may be different from one specific area to another.
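
For illustration only, the conditional behavior described above can be sketched in Python as follows; the class, function and field names are assumptions introduced for this sketch and do not appear in the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Area:
    """A rectangular 'specific area' of the course (e.g., a jump ramp); purely illustrative."""
    x_min: float
    x_max: float
    z_min: float
    z_max: float

    def contains(self, x: float, z: float) -> bool:
        return self.x_min <= x <= self.x_max and self.z_min <= z <= self.z_max

def react_to_sound(x, z, sound_detected, area):
    """Decide the player object's reaction to the microphone input for one frame."""
    if not sound_detected:
        return None                  # no sound input: nothing happens
    if area.contains(x, z):
        return "specific_action"     # e.g., a jump that affects movement
    return "other_action"            # e.g., a provocative action with no effect on movement

ramp = Area(x_min=0.0, x_max=4.0, z_min=100.0, z_max=110.0)
print(react_to_sound(2.0, 105.0, sound_detected=True, area=ramp))   # -> "specific_action"
```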

The object position determining means may determine whether the player object is positioned in the specific area, if the sound detecting means determines that the sound is inputted through the sound inputting means (FIG. 8).

The action controlling means may cause the player object to perform an action different from the specific action, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area (S58).

The action controlling means may cause the player object to perform: (a) an action which exerts an influence on a movement of the player object, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world; and (b) an action which does not exert any influence on the movement of the player object, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area of the virtual game world.

Note that “the action which exerts an influence on a movement of the player object” indicates an action in which at least one of a position, moving direction and moving speed of the player object is changed accordingly when the action is performed. For example, when the action which exerts an influence on a movement of the player object is performed, movement parameters (e.g., a speed parameter, acceleration parameter, orientation parameter, etc.) of the player object are changed in accordance with the inputted sound. On the other hand, when the action which does not exert any influence on the movement of the player object is performed, the player object is caused to perform a predetermined action on the spot (while continuing to move if the player object is caused to move due to other factors).
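
As a concrete illustration of this distinction (a minimal sketch under an assumed velocity-based movement model; the field names and the jump speed value are not taken from the disclosure), a movement-affecting action alters a movement parameter, while a non-movement action changes only what is displayed:

```python
from dataclasses import dataclass, field

@dataclass
class MovementParams:
    velocity: list = field(default_factory=lambda: [0.0, 0.0, 1.0])  # speed parameter
    orientation: float = 0.0                                         # orientation parameter
    animation: str = "run"                                           # displayed pose only

def movement_affecting_action(p: MovementParams, jump_speed: float = 5.0) -> None:
    # The inputted sound changes a movement parameter: an upward velocity
    # component makes the object jump, so its future positions change.
    p.velocity[1] += jump_speed
    p.animation = "jump"

def non_movement_action(p: MovementParams) -> None:
    # Only the displayed action changes; position, speed and orientation are
    # untouched, so movement caused by other factors simply continues.
    p.animation = "taunt"
```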

The game program may realize a game providing the player with a specific challenge, and the action controlling means may cause the player object to perform: (a) an action which exerts an influence on a success or failure of the specific challenge, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world; and (b) an action which does not exert any influence on the success or failure of the specific challenge, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area of the virtual game world.

Note that “the specific challenge” includes, for example, “to reach a goal as fast as possible”, “to reach a goal faster than a rival character”, “to acquire as high a score as possible”, and “to defeat an enemy character”, and differs depending on the genre or type of the game.

The sound detecting means may determine whether the sound is inputted through the sound inputting means, if the object position determining means determines that the player object is positioned in the specific area (FIG. 9).

The specific action may be a jump action.

The specific area may indicate a jump ramp disposed in the virtual game world.

The sound detecting means may determine that the sound is inputted through the sound inputting means, when a sound having a predetermined volume level or higher is inputted through the sound inputting means (S42).
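
For example, the volume check might be realized by comparing the peak amplitude of the captured samples with a fixed threshold, as in the following sketch (the sample representation and the threshold value of 0.5 are assumptions, not taken from the disclosure):

```python
def sound_inputted(samples: list[float], threshold: float = 0.5) -> bool:
    # A sound counts as "inputted" only when its peak amplitude reaches the
    # predetermined volume level, so quiet ambient noise is not registered.
    return bool(samples) and max(abs(s) for s in samples) >= threshold

print(sound_inputted([0.02, -0.03, 0.01]))       # quiet background noise -> False
print(sound_inputted([0.1, -0.8, 0.6, -0.2]))    # loud handclap-like peak -> True
```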

The game apparatus may be connected to operation means (6R, 6L), and the movement controlling means may cause the player object to move based on a signal outputted from the operation means.

Note that the operation means and the sound inputting means may have a common housing, or may have separate housings.

The operation means may be a percussion controller (6).

The game program may instruct the computer to further function as input operation detecting means (S46) for determining whether the player operates the operation means based on the signal outputted from the operation means, and the action controlling means may cause the player object not to perform the specific action, at least when the player operates the operation means. Thus, it becomes possible to ignore noise, other than sound of handclaps, which is inputted through the sound inputting means.

The action controlling means may cause the player object not to perform the specific action, when the player operates the operation means and while a predetermined time period has not yet passed after the player finishes operating the operation means (S48). Thus, it becomes possible to ignore the noise, other than the sound of the handclaps, which is inputted through the sound inputting means.

The display controlling means may display an operation guiding image for prompting the player to input the sound in a vicinity of the specific area (FIG. 5). Thus, it becomes possible to inform the player of a timing of inputting the sound in an easily understood manner, thereby allowing the player to appropriately perform an operation.

The game program may be able to realize a game in which at least a first player and a second player simultaneously play against each other, a first player object which can be operated by the first player and a second player object which can be operated by the second player may exist in the virtual game space, the movement controlling means may cause the first player object and the second player object to individually move in the virtual game world, and the action controlling means may include: first player object action controlling means for causing the first player object to perform the specific action, when the sound is inputted through the sound inputting means and the first player object is positioned in the specific area of the virtual game world; and second player object action controlling means for causing the second player object to perform the specific action, when the sound is inputted through the sound inputting means and the second player object is positioned in the specific area of the virtual game world.

The game apparatus is further connected to first operation means (6) operated by the first player and second operation means (7) operated by the second player, each of the first operation means and the second operation means includes the sound inputting means (6M, 7M), the first player object action controlling means may cause the first player object to perform the specific action, when the sound is inputted through the sound inputting means included in the first operation means and the first player object is positioned in the specific area of the virtual game world, and the second player object action controlling means may cause the second player object to perform the specific action, when the sound is inputted through the sound inputting means included in the second operation means and the second player object is positioned in the specific area of the virtual game world.

The game program may instruct the computer to further function as: first virtual camera setting means for setting a parameter (e.g., a view point, fixation point, camera orientation, etc.) of a first virtual camera which images the first player object based on a current position of the first player object; second virtual camera setting means for setting a parameter of a second virtual camera which images the second player object based on a current position of the second player object; and display controlling means for causing the display apparatus to simultaneously display, on the display apparatus, a first game image generated by imaging the virtual game space by means of the first virtual camera and a second game image generated by imaging the virtual game space by means of the second virtual camera.

A game apparatus according to the present invention comprises: sound inputting means (6M); a display apparatus (2); display controlling means (31, S32) for generating a game image including a player object, of a virtual game world, which is operated by a player, and causing the display apparatus to display the game image; movement controlling means (31, S30) for causing the player object to move in the virtual game world; sound detecting means (31, S42) for determining whether a sound is inputted through the sound inputting means; object position determining means (31, S54) for determining whether the player object is positioned in a specific area of the virtual game world; and action controlling means (31, S56) for causing the player object to perform a specific action, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world.

According to the present invention, in an action game for causing a player character to perform a predetermined action in accordance with a sound input, it becomes possible to prevent the character from performing an action which is not intended by the player, even if the sound input is mistakenly detected.

These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is an external view illustrating a configuration of a game system according to an embodiment of the present invention;

FIG. 2 is a block diagram illustrating an internal configuration of a game apparatus body;

FIG. 3 shows an exemplary game image displayed on a screen of a television;

FIG. 4 shows another exemplary game image displayed on the screen of the television;

FIG. 5 shows still another exemplary game image displayed on the screen of the television;

FIG. 6 shows a memory map of a work memory;

FIG. 7 is a flowchart illustrating a flow of a process executed by a CPU;

FIG. 8 is a flowchart illustrating a flow of a handclap process;

FIG. 9 is a flowchart illustrating a flow of the handclap process according to a variant;

FIG. 10 is an external view illustrating the game system obtained when two players simultaneously play a game against each other; and

FIG. 11 is an exemplary game image obtained when the two players simultaneously play the game against each other.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, a game system according to an embodiment of the present invention will be described with reference to the drawings.

FIG. 1 is an external view illustrating a configuration of the game system according to the embodiment of the present invention. As shown in FIG. 1, a game system 1 comprises a television 2, a game apparatus body 3, and a conga controller 6, and has a DVD-ROM 4 and a memory card 5 mounted thereon. The DVD-ROM 4 and the memory card 5 are mounted on the game apparatus body 3 in a removable manner. The conga controller 6 is connected, by a communication cable, to any of four controller port connectors provided on the game apparatus body 3. The television 2 is connected to the game apparatus body 3 by an AV cable or the like. Note that the game apparatus body 3 and the conga controller 6 may communicate with each other by radio communication.

The conga controller 6 is provided with a microphone 6M and three switches: a start button 6S, a right strike surface 6R, and a left strike surface 6L. As described herein below, a player can control a motion of a character in a virtual game world by hitting the right strike surface 6R or left strike surface 6L. Instead of the conga controller 6, any controller including a microphone may be used.

The DVD-ROM 4 fixedly stores a game program, game data and the like. The DVD-ROM 4 is mounted on the game apparatus body 3 when the player plays a game. Here, instead of the DVD-ROM 4, an external storage medium such as a CD-ROM, an MO, a memory card, a ROM cartridge or the like may be used as means for storing the game program and the like.

The game apparatus body 3 reads the game program stored in the DVD-ROM 4, and then performs a process in accordance with the read game program.

The television 2 displays, on a screen, image data outputted from the game apparatus body 3.

The memory card 5 has a rewritable storage medium, e.g., a flash memory, as a backup memory for storing data such as saved data of the game.

FIG. 2 is a block diagram illustrating an internal configuration of the game apparatus body 3. Hereinafter, each component of the game system 1 will be described in more detail with reference to FIG. 2.

As shown in FIG. 2, the game apparatus body 3 comprises a CPU 31, a work memory 32, an external memory interface (I/F) 33, a controller interface (I/F) 34, a video RAM (VRAM) 35, a graphics processing unit (GPU) 36, and an optical disc drive 37.

In order for the game to start, the optical disc drive 37 drives the DVD-ROM 4 mounted on the game apparatus body 3, and then the game program stored in the DVD-ROM 4 is loaded into the work memory 32. The game starts when the CPU 31 executes the program stored in the work memory 32. After the game starts, the player plays the game by using the conga controller 6. In accordance with an operation performed by the player, the conga controller 6 outputs operation data to the game apparatus body 3. The operation data outputted from the conga controller 6 is supplied to the CPU 31 via the controller I/F 34. The CPU 31 performs a game process based on inputted operation data. The GPU 36 is used for image data generation and the like performed in the game process.

The GPU 36 performs arithmetic processing (e.g., rotation, scaling and deformation of a three-dimensional model, and coordinate transformation from a world coordinate system to a camera coordinate system or screen coordinate system) on the coordinates of a solid model of an object or figure (e.g., an object comprised of polygons) placed in the three-dimensional virtual game world. Further, the GPU 36 generates a game image by writing, based on a predetermined texture, color data (RGB data) of each pixel of a solid model projected on the screen coordinate system into the VRAM 35. The GPU 36 thus generates the game image to be displayed on the television 2, and outputs the game image to the television 2 as necessary. Although the present embodiment shows a hardware configuration in which a memory dedicated for image processing (VRAM 35) is separately provided, the present invention is not limited thereto. For example, a UMA (Unified Memory Architecture) system, in which a part of the work memory 32 is used as a memory for image processing, may be used.

The work memory 32 stores various programs and pieces of data loaded from the DVD-ROM 4. These pieces of data include, for example, data related to the polygons comprising a three-dimensional model placed in the virtual game world, and a texture used for coloring the polygons.

FIG. 3 shows an exemplary game image displayed on a screen of the television 2. Although the present embodiment illustrates an example where the present invention is applied to a racing game, the present invention is not limited thereto. The present invention is applicable to an arbitrary game.

On the screen of the television 2, a racecourse set in a virtual game world, a player character operated by the player, and an obstacle and coins disposed on the racecourse are displayed. The player operates the player character by using the conga controller 6 so as to collide with as few obstacles as possible, acquire as many coins as possible, and reach the goal as fast as possible.

By using the conga controller 6, the player can input instructions such as an acceleration instruction, a rightward movement instruction, a leftward movement instruction, a deceleration instruction, and a jump instruction.

The player can input the acceleration instruction by alternately and continuously hitting the right strike surface 6R and left strike surface 6L of the conga controller 6. When the acceleration instruction is inputted, the character accelerates forward (i.e., in a direction in which the character faces or in a moving direction of the character).

The player can input the rightward movement instruction by continuously hitting the right strike surface 6R of the conga controller 6. When the rightward movement instruction is inputted, the character moves to the right (i.e., in a rightward direction with respect to the direction in which the character faces or in a rightward direction with respect to the moving direction of the character). The more rapidly the player continuously hits the right strike surface 6R, the more quickly the character moves to the right. Instead of moving the character to the right, a current direction in which the character faces and a current moving direction of the character may be changed to the right.

The player can input the leftward movement instruction by continuously hitting the left strike surface 6L of the conga controller 6. When the leftward movement instruction is inputted, the character moves to the left (i.e., in a leftward direction with respect to the direction in which the character faces or in a leftward direction with respect to the moving direction of the character). The more rapidly the player continuously hits the left strike surface 6L, the more quickly the character moves to the left. Instead of moving the character to the left, the current direction in which the character faces and the current moving direction of the character may be changed to the left.

The player can input the deceleration instruction by pressing both the right strike surface 6R and left strike surface 6L of the conga controller 6 for a predetermined time period or longer. When the deceleration instruction is inputted, the character decelerates.

The player can input the jump instruction by clapping his or her hands in the vicinity of the conga controller 6. Specifically, sound generated by handclaps of the player is converted into an electric signal by the microphone 6M, so as to be inputted to the game apparatus body 3. When the jump instruction is inputted, the character jumps. Note that the character jumps in accordance with the handclaps of the player only when the character is positioned on a jump ramp (see FIG. 4) which is set in the virtual game world.

In a case where ambient noise (including voice or handclaps of any person other than the player) is inputted through the microphone 6M when the character is not positioned on the jump ramp, the character does not jump, but performs a provocative action, as shown in FIG. 5. The provocative action does not exert any influence on the movement of the character. Therefore, even if the character performs the provocative action, a movement speed or movement direction of the character never changes according to the provocative action. Furthermore, in a case where any of the acceleration instruction, the rightward movement instruction, the leftward movement instruction and the deceleration instruction is inputted while the character is performing the provocative action, the provocative action is immediately released, thereby allowing the character to perform an action (i.e., an acceleration, rightward movement, leftward movement or deceleration) in accordance with the inputted instruction.

As described above, the character jumps in accordance with the handclaps of the player only when the character is positioned on the jump ramp. Thus, it becomes possible to avoid a case where the character unexpectedly jumps in response to the ambient noise, thereby exerting an adverse effect on a game result (e.g., a player ranking for the race, goal time, score, etc.). Although the present embodiment illustrates an example where the character jumps in accordance with the handclaps of the player, the present invention is not limited thereto. For example, when the character is positioned in an acceleration lane as shown in FIG. 11, the character may accelerate in accordance with the handclaps of the player (more rapidly than when the acceleration instruction is inputted). Further, both the jump ramp and the acceleration lane may exist on the racecourse.

Hereinafter, an area, e.g., the jump ramp or acceleration lane, in which the character performs a special movement action such as a jump or acceleration in accordance with the handclaps of the player, is referred to as a "handclap area". Also, an action, e.g., the provocative action, performed by the character in accordance with the handclaps of the player when the character is positioned outside the handclap area, is referred to as a "performance action".

As shown in FIG. 4, a handclap image for prompting the player to clap his or her hands is displayed in the vicinity of the jump ramp (handclap area).

Note that when the character passes through the jump ramp, the player may choose not to input the jump instruction (the sound of the handclaps). In this case, the character passes through the jump ramp without jumping. The same is also true of the acceleration lane.

Hereinafter, an operation of the game apparatus body 3 according to the present embodiment will be described in detail.

FIG. 6 shows a memory map of the work memory 32. The work memory 32 stores a game program 40, game image data 41, racecourse data 42, character controlling data 43, a sound input flag 44, and a sound input timer 45.

The game image data 41 is data for generating a game image displayed on the screen of the television 2, and includes a character image, a background image, and the handclap image.

The racecourse data 42 is data showing the shape of the racecourse set in the virtual game world, and includes handclap area information indicating the position of the handclap area.

The character controlling data 43 is data for controlling the movement of the character in the virtual game world, and includes current position information and speed information. The current position information is information (coordinate data) indicating a current position of the character, and the speed information is information (vector data) indicating a movement speed of the character.

The sound input flag 44 and the sound input timer 45 are a flag and a timer, respectively, used in the handclap process to be described later.
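
Purely as an illustration of how these pieces of data might be organized in the work memory 32 (the field names and types below are assumptions; the actual memory layout is not disclosed):

```python
from dataclasses import dataclass, field

@dataclass
class CharacterControllingData:           # counterpart of character controlling data 43
    position: tuple = (0.0, 0.0)          # current position information (coordinate data)
    velocity: tuple = (0.0, 0.0)          # speed information (vector data)

@dataclass
class RacecourseData:                     # counterpart of racecourse data 42
    handclap_areas: list = field(default_factory=list)   # handclap area information

@dataclass
class WorkMemoryContents:                 # counterpart of the memory map in FIG. 6
    racecourse: RacecourseData = field(default_factory=RacecourseData)
    character: CharacterControllingData = field(default_factory=CharacterControllingData)
    sound_input_flag: bool = False        # sound input flag 44 (off in the initial state)
    sound_input_timer: int = 0            # sound input timer 45, counted in frames
```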

Hereinafter, a flow of a process executed by the CPU 31 based on the game program 40 will be described with reference to FIGS. 7 and 8.

In FIG. 7, when the game program 40 starts to be executed, the CPU 31 firstly displays, in step S10, an initial game image. At this time, initial values of the current position and movement speed of the character are set.

In step S12, it is determined whether the right strike surface 6R has been continuously hit. For example, in a case where, within a predetermined time period after the right strike surface 6R is hit, the right strike surface 6R is hit again, it is determined that the right strike surface 6R has been continuously hit. When it is determined that the right strike surface 6R has been continuously hit, the process proceeds to step S14. On the other hand, when it is determined that the right strike surface 6R has not been continuously hit, the process proceeds to step S16.

In step S14, the speed information is updated (specifically, a direction of a speed vector is changed) such that the character is to turn clockwise (i.e., the character is to move or accelerate to the right). Thereafter, the process proceeds to step S28.

In step S16, it is determined whether the left strike surface 6L has been continuously hit. For example, in a case where, within a predetermined time period after the left strike surface 6L is hit, the left strike surface 6L is hit again, it is determined that the left strike surface 6L has been continuously hit. When it is determined that the left strike surface 6L has been continuously hit, the process proceeds to step S18. On the other hand, when it is determined that the left strike surface 6L has not been continuously hit, the process proceeds to step S20.

In step S18, the speed information is updated (specifically, the direction of the speed vector is changed) such that the character is to turn counterclockwise (i.e., the character is to move or accelerate to the left). Thereafter, the process proceeds to step S28.

In step S20, it is determined whether the right strike surface 6R and the left strike surface 6L have been alternately and continuously hit. For example, in a case where, within a predetermined time period after the left strike surface 6L is hit, the right strike surface 6R is hit, or in a case where, within a predetermined time period after the right strike surface 6R is hit, the left strike surface 6L is hit, it is determined that the right strike surface 6R and the left strike surface 6L have been alternately and continuously hit. When it is determined that the right strike surface 6R and the left strike surface 6L have been alternately and continuously hit, the process proceeds to step S22. On the other hand, when it is determined that the right strike surface 6R and the left strike surface 6L have not been alternately and continuously hit, the process proceeds to step S24.

In step S22, the speed information is updated (specifically, a magnitude of the speed vector is changed) such that the character is to accelerate forward. Thereafter, the process proceeds to step S28.

In step S24, it is determined whether the right strike surface 6R and the left strike surface 6L have been pressed for the predetermined time period or longer. For example, when both the right strike surface 6R and the left strike surface 6L have been pressed for one second or longer, it is determined that both the right strike surface 6R and the left strike surface 6L have been pressed for the predetermined time period or longer. When it is determined that both the right strike surface 6R and the left strike surface 6L have been pressed for the predetermined time period or longer, the process proceeds to step S26. On the other hand, when both the right strike surface 6R and the left strike surface 6L have not been pressed for the predetermined time period or longer, the process proceeds to step S28.

In step S26, the speed information is updated (specifically, the magnitude of the speed vector is changed) such that the character is to decelerate. Thereafter, the process proceeds to step S28.
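
The following sketch condenses steps S12 through S26 into a single per-frame classification of the conga input; the data representation, the 0.5-second hit window and the 1-second hold time are assumptions made for the example, not values from the disclosure.

```python
def classify_conga_input(hits, held_since, now, window=0.5, hold_time=1.0):
    """Return which instruction (if any) the recent conga input corresponds to.

    hits       -- list of (timestamp, side) tuples, oldest first, side in {"R", "L"}
    held_since -- {"R": t or None, "L": t or None}: when each surface was pressed down
    """
    recent = [side for t, side in hits if now - t <= window]
    if len(recent) >= 2:
        last, prev = recent[-1], recent[-2]
        if last == prev == "R":
            return "move_right"    # S12 -> S14: right surface hit twice within the window
        if last == prev == "L":
            return "move_left"     # S16 -> S18: left surface hit twice within the window
        if last != prev:
            return "accelerate"    # S20 -> S22: right and left surfaces hit alternately
    if (held_since["R"] is not None and held_since["L"] is not None
            and now - held_since["R"] >= hold_time
            and now - held_since["L"] >= hold_time):
        return "decelerate"        # S24 -> S26: both surfaces held for 1 s or longer
    return None                    # no instruction: fall through to the handclap process (S28)

# Example: the right surface was hit at t = 9.8 s and again at t = 10.0 s.
print(classify_conga_input([(9.8, "R"), (10.0, "R")],
                           {"R": None, "L": None}, now=10.0))   # -> "move_right"
```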

In step S28, the handclap process is executed. In the handclap process, the character is controlled in accordance with the handclaps of the player. Hereinafter, the handclap process will be described in detail with reference to FIG. 8.

In the handclap process, the CPU 31 firstly determines, in step S40, whether the sound input flag 44 is turned on. Note that the sound input flag 44 is turned off in an initial state. When it is determined that the sound input flag 44 is turned on, the process proceeds to step S46. On the other hand, when it is determined that the sound input flag 44 is turned off, the process proceeds to step S42.

In step S42, it is determined whether sound having a predetermined volume level or higher has been detected by the microphone 6M. When it is determined that the sound having the predetermined volume level or higher has been detected, the process proceeds to step S44. On the other hand, when it is determined that the sound having the predetermined volume level or higher has not been detected, the handclap process is to be finished.

In step S44, the sound input flag 44 is turned on, thereby causing the sound input timer 45 to be started. Thereafter, the process proceeds to step S46.

In step S46, it is determined whether either the right strike surface 6R or the left strike surface 6L is being pressed. When it is determined that either the right strike surface 6R or the left strike surface 6L is being pressed, the process proceeds to step S50. On the other hand, when it is determined that neither the right strike surface 6R nor the left strike surface 6L is being pressed, the process proceeds to step S48. That is, in step S46, when the player is pressing either of the two strike surfaces (i.e., in this state, it is determined that the sound inputted through the microphone 6M is not the handclaps of the player), the sound inputted through the microphone 6M is to be ignored.

In step S48, it is determined whether neither the right strike surface 6R nor the left strike surface 6L has been pressed for a predetermined time period (e.g., a 10-frame period) or longer. When it is determined that neither the right strike surface 6R nor the left strike surface 6L has been pressed for the predetermined time period or longer, the process proceeds to step S52. On the other hand, when it is determined that the predetermined time period has not yet passed after either the right strike surface 6R or the left strike surface 6L was pressed, the process proceeds to step S50. That is, in step S48, when sound is inputted through the microphone 6M before the predetermined time period has passed after either of the two strike surfaces is pressed (i.e., in this state, it is determined that the sound inputted through the microphone 6M is not the handclaps of the player, because a certain time period is required from when the player removes his or her hand from the pressed strike surface to when the player starts to clap his or her hands), the sound inputted through the microphone 6M is to be ignored.

In step S50, the sound input flag 44 is turned off, thereby causing the sound input timer 45 to be reset. Thereafter, the handclap process is to be finished.

In step S52, it is determined whether a count value of the sound input timer 45 is a predetermined value (e.g., 10 frames) or greater. When it is determined that the count value of the sound input timer 45 is the predetermined value or greater, the process proceeds to step S54. On the other hand, when it is determined that the count value of the sound input timer 45 is less than the predetermined value, the handclap process is to be finished. That is, in step S52, when either of the two strike surfaces is pressed before a predetermined time period (e.g., a 10-frame period) has passed after the sound is inputted through the microphone 6M (i.e., in this state, it is determined that the sound inputted through the microphone 6M is not the handclaps of the player, because a certain time period is required from when the player finishes clapping his or her hands to when the player presses a strike surface), the sound inputted through the microphone 6M is to be ignored.

In step S54, by reading the handclap area information of the racecourse data 42 and the current position information of the character controlling data 43, it is determined whether the character is positioned in the handclap area. When it is determined that the character is positioned in the handclap area, the process proceeds to step S56. On the other hand, when it is determined that the character is positioned outside the handclap area, the process proceeds to step S58.

In step S56, the speed information is updated such that the character is to perform a special movement action (e.g., jump). Note that when a plurality of types of handclap areas such as the jump ramp or acceleration lane exist on the racecourse, the speed information is changed such that the special movement action corresponding to the type of the handclap area in which the character is positioned is to be performed. The character may also be caused to perform the special movement action by a method other than changing the speed information. By changing the current position information, for example, the character may be caused to instantaneously move from one place to another on the racecourse.

In step S58, an image of the character is changed such that the character is to perform the performance action (e.g., provocative action).

In step S60, the sound input flag 44 is turned off, thereby causing the sound input timer 45 to be reset. Thereafter, the handclap process is to be finished.
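
Bringing steps S40 through S60 together, the handclap process of FIG. 8 can be approximated per frame by the following sketch; the flag/timer representation, the 10-frame constants and the 0.5 volume threshold are assumptions, and the strike-surface inputs are summarized as two simple arguments.

```python
from dataclasses import dataclass

@dataclass
class HandclapState:
    sound_input_flag: bool = False    # sound input flag 44
    sound_input_timer: int = 0        # sound input timer 45, counted in frames

def handclap_process(state, mic_volume, strike_pressed, frames_since_press,
                     in_handclap_area, volume_threshold=0.5, wait_frames=10):
    """Return the action the character performs this frame, or None."""
    if not state.sound_input_flag:                       # S40
        if mic_volume < volume_threshold:                # S42: no loud sound detected
            return None
        state.sound_input_flag = True                    # S44: remember the sound input
        state.sound_input_timer = 0                      #      and start the timer
    if strike_pressed or frames_since_press < wait_frames:
        state.sound_input_flag = False                   # S46/S48 -> S50: a strike surface
        state.sound_input_timer = 0                      # is (or was just) pressed, so the
        return None                                      # sound is conga noise, not a handclap
    state.sound_input_timer += 1
    if state.sound_input_timer < wait_frames:            # S52: keep waiting to be sure no
        return None                                      #      press follows the sound
    state.sound_input_flag = False                       # S60: clear the flag and timer
    state.sound_input_timer = 0
    if in_handclap_area:                                 # S54
        return "special_movement_action"                 # S56: e.g., jump or boost
    return "performance_action"                          # S58: e.g., provocative action
```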

When the handclap process is finished, the CPU 31 updates, in step S30 shown in FIG. 7, the current position information based on the speed information.

In step S32, the game image is updated based on the current position information which has been updated in step S30. Thereafter, the process returns to step S12.

By repeating steps S12 to S32 mentioned above, the game image is sequentially updated such that the character is to move in accordance with an instruction inputted by the player.

As described above, according to the present embodiment, the character jumps in accordance with the handclaps of the player only when the character is positioned on the jump ramp. Thus, it becomes possible to avoid a case where the character unexpectedly jumps in response to the ambient noise, thereby exerting an adverse effect on a game result (e.g., a player ranking for the race, goal time, score, etc.).

Note that in the handclap process shown in FIG. 8, it is determined whether the character is positioned in the handclap area after the sound having the predetermined volume level or higher is detected. Instead of this, however, the sound may be detected by the microphone 6M only when the character is positioned in the handclap area. FIG. 9 shows such a variant of the handclap process in detail. In FIG. 9, the same steps as those shown in FIG. 8 are denoted by the same reference numerals. FIG. 9 differs from FIG. 8 only in that the process for determining whether the character is positioned in the handclap area (step S54) is executed first, and the process for causing the character to perform the performance action (step S58 in FIG. 8) is eliminated.
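
Under the same assumptions as the preceding sketch, the FIG. 9 variant simply moves the area check to the front and drops the performance action; here the flag and timer are kept in a plain dict so the snippet stands on its own.

```python
def handclap_process_variant(state, mic_volume, strike_pressed, frames_since_press,
                             in_handclap_area, volume_threshold=0.5, wait_frames=10):
    """FIG. 9 variant: step S54 is executed first and step S58 is eliminated."""
    if not in_handclap_area:                        # S54 first: outside the handclap area,
        return None                                 # the microphone input is not examined
    if not state["flag"]:                           # S40
        if mic_volume < volume_threshold:           # S42
            return None
        state["flag"], state["timer"] = True, 0     # S44
    if strike_pressed or frames_since_press < wait_frames:
        state["flag"], state["timer"] = False, 0    # S46/S48 -> S50
        return None
    state["timer"] += 1
    if state["timer"] < wait_frames:                # S52
        return None
    state["flag"], state["timer"] = False, 0        # S60
    return "special_movement_action"                # S56; no performance action in this variant

state = {"flag": False, "timer": 0}
```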

The present invention is particularly effective when a plurality of players simultaneously play the game against each other. Hereinafter, a case where a first player and a second player simultaneously play the game against each other will be described with reference to FIGS. 10 and 11.

In a case where the first player and the second player simultaneously play the game against each other, two conga controllers 6 and 7 used by the first player and the second player, respectively, are connected to the game apparatus body 3. The conga controller 7 used by the second player is also provided with a microphone 7M and three switches: a start button 7S, a right strike surface 7R, and a left strike surface 7L.

FIG. 11 shows an exemplary game image obtained when the first player and the second player simultaneously play the game against each other. The game image is comprised of two image areas: an upper image area and a lower image area. An environment of the virtual game world viewed from a virtual camera imaging a character A is displayed in the upper image area, and another environment of the virtual game world viewed from a virtual camera imaging a character B is displayed in the lower image area. The virtual cameras are placed in the virtual game space by setting various parameters (e.g., a view point, fixation point, camera orientation, etc.). The parameters of the virtual camera imaging the character A are updated in accordance with a current position of the character A, and the parameters of the virtual camera imaging the character B are updated in accordance with a current position of the character B. The character A is operated by the first player, and the character B is operated by the second player. The character A and the character B travel on the same racecourse, and the acceleration lane is disposed on the racecourse.
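
As an illustration of this split-screen setup (the camera offsets and example character positions below are assumptions made for the sketch):

```python
from dataclasses import dataclass

@dataclass
class Camera:
    eye: tuple      # view point
    target: tuple   # fixation point

def follow(character_pos, back=6.0, height=3.0):
    """Place a virtual camera behind and above a character, looking at it."""
    x, y, z = character_pos
    return Camera(eye=(x, y + height, z - back), target=(x, y, z))

# Each frame, one camera is updated from each character's current position; the view
# rendered with camera_a fills the upper image area and that of camera_b the lower one.
camera_a = follow((10.0, 0.0, 42.0))   # images character A, operated by the first player
camera_b = follow((-3.0, 0.0, 57.0))   # images character B, operated by the second player
print(camera_a, camera_b)
```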

The first player can cause the character A to accelerate, move to the right, move to the left and decelerate, by using the right strike surface 6R and left strike surface 6L of the conga controller 6. Furthermore, when the first player claps his or her hands, sound generated by the handclaps of the first player is inputted to the conga controller 6 via the microphone 6M, thereby making it possible to cause the character A to substantially accelerate.

Similarly, the second player can cause the character B to accelerate, move to the right, move to the left and decelerate, by using the right strike surface 7R and left strike surface 7L of the conga controller 7. Furthermore, when the second player claps his or her hands, sound generated by handclaps of the second player is inputted to the conga controller 7 via the microphone 7M, thereby making it possible to cause the character B to substantially accelerate.

In such a case where the first player and the second player simultaneously play the game against each other, when the second player claps his or her hands so as to cause the character B to substantially accelerate in a state where the character B is positioned in the acceleration lane, the sound generated by the handclaps of the second player is inputted not only to the microphone 7M of the conga controller 7 used by the second player but also to the microphone 6M of the conga controller 6 used by the first player. However, according to the present invention, even when the sound generated by the handclaps of the second player is inputted to the microphone 6M of the conga controller 6 used by the first player, the character A only performs the provocative action unless the character A is positioned in the acceleration lane. Therefore, no adverse effect is exerted on a game result of the first player. The same is also true of a case where the first player claps his or her hands so as to cause the character A to substantially accelerate in a state where the character A is positioned in the acceleration lane.

While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is understood that numerous other modifications and variations can be devised without departing from the scope of the invention.

Claims

1. A computer-readable storage medium storing a game program instructing a computer of a game apparatus, which is connected to sound inputting means and a display apparatus, to function as:

display controlling means for generating a game image including a player object, of a virtual game world, which is operated by a player, and causing the display apparatus to display the game image;
movement controlling means for causing the player object to move in the virtual game world;
sound detecting means for determining whether a sound is inputted through the sound inputting means;
object position determining means for determining whether the player object is positioned in a specific area of the virtual game world; and
action controlling means for causing the player object to perform a specific action, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world.

2. The computer-readable storage medium according to claim 1, wherein

the object position determining means determines whether the player object is positioned in the specific area, if the sound detecting means determines that the sound is inputted through the sound inputting means.

3. The computer-readable storage medium according to claim 2, wherein

the action controlling means causes the player object to perform an action different from the specific action, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area.

4. The computer-readable storage medium according to claim 3, wherein

the action controlling means causes the player object to perform:
(a) an action which exerts an influence on a movement of the player object, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world; and
(b) an action which does not exert any influence on the movement of the player object, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area of the virtual game world.

5. The computer-readable storage medium according to claim 3, wherein

the game program realizes a game providing the player with a specific challenge, and
the action controlling means causes the player object to perform:
(a) an action which exerts an influence on a success or failure of the specific challenge, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world; and
(b) an action which does not exert any influence on the success or failure of the specific challenge, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area of the virtual game world.

6. The computer-readable storage medium according to claim 1, wherein

the sound detecting means determines whether the sound is inputted through the sound inputting means, if the object position determining means determines that the player object is positioned in the specific area.

7. The computer-readable storage medium according to claim 1, wherein

the specific action is a jump action.

8. The computer-readable storage medium according to claim 1, wherein

the specific area indicates a jump ramp disposed in the virtual game world.

9. The computer-readable storage medium according to claim 1, wherein

the sound detecting means determines that the sound is inputted through the sound inputting means, when a sound having a predetermined volume level or higher is inputted through the sound inputting means.

10. The computer-readable storage medium according to claim 1, wherein

the game apparatus is connected to operation means, and
the movement controlling means causes the player object to move based on a signal outputted from the operation means.

11. The computer-readable storage medium according to claim 10, wherein

the operation means is a percussion controller.

12. The computer-readable storage medium according to claim 10, wherein

the game program instructs the computer to further function as input operation detecting means for determining whether the player operates the operation means based on the signal outputted from the operation means, and
the action controlling means causes the player object not to perform the specific action, at least when the player operates the operation means.

13. The computer-readable storage medium according to claim 12, wherein

the action controlling means causes the player object not to perform the specific action, when the player operates the operation means and while a predetermined time period has not yet passed after the player finishes operating the operation means.

14. The computer-readable storage medium according to claim 1, wherein

the display controlling means displays an operation guiding image for prompting the player to input the sound in a vicinity of the specific area.

15. The computer-readable storage medium according to claim 1, wherein

the game program is able to realize a game in which at least a first player and a second player simultaneously play against each other,
a first player object which can be operated by the first player and a second player object which can be operated by the second player exist in the virtual game space,
the movement controlling means causes the first player object and the second player object to individually move in the virtual game world, and
the action controlling means includes:
first player object action controlling means for causing the first player object to perform the specific action, when the sound is inputted through the sound inputting means and the first player object is positioned in the specific area of the virtual game world; and
second player object action controlling means for causing the second player object to perform the specific action, when the sound is inputted through the sound inputting means and the second player object is positioned in the specific area of the virtual game world.

16. The computer-readable storage medium according to claim 15, wherein

the game apparatus is further connected to first operation means operated by the first player and second operation means operated by the second player,
each of the first operation means and the second operation means includes the sound inputting means,
the first player object action controlling means causes the first player object to perform the specific action, when the sound is inputted through the sound inputting means included in the first operation means and the first player object is positioned in the specific area of the virtual game world, and
the second player object action controlling means causes the second player object to perform the specific action, when the sound is inputted through the sound inputting means included in the second operation means and the second player object is positioned in the specific area of the virtual game world.

17. The computer-readable storage medium according to claim 15, wherein

the game program instructs the computer to further function as:
first virtual camera setting means for setting a parameter of a first virtual camera which images the first player object based on a current position of the first player object;
second virtual camera setting means for setting a parameter of a second virtual camera which images the second player object based on a current position of the second player object; and
display controlling means for causing the display apparatus to simultaneously display, on the display apparatus, a first game image generated by imaging the virtual game space by means of the first virtual camera and a second game image generated by imaging the virtual game space by means of the second virtual camera.

18. A game apparatus comprising:

sound inputting means;
a display apparatus;
display controlling means for generating a game image including a player object, of a virtual game world, which is operated by a player, and causing a display apparatus to display the game image;
movement controlling means for causing the player object to move in the virtual game world;
sound detecting means for determining whether a sound is inputted through the sound inputting means;
object position determining means for determining whether the player object is positioned in a specific area of the virtual game world; and
action controlling means for causing the player object to perform a specific action, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world.
Patent History
Publication number: 20070265074
Type: Application
Filed: May 4, 2007
Publication Date: Nov 15, 2007
Applicant: Nintendo Co., Ltd. (Kyoto)
Inventors: Eiji Akahori (Minato-ku), Shingo Miyata (Minato-ku), Toshiharu Izuno (Kyoto-shi), Takuji Hotta (Kyoto-shi), Kentaro Nishimura (Kyoto-shi)
Application Number: 11/797,558
Classifications
Current U.S. Class: Audible (463/35)
International Classification: A63F 9/24 (20060101);