Gaming program, gaming machine, and record medium

- ARUZE CORP.

A gaming program, a gaming machine, and a record medium are provided that make it possible to suppress loss of interest in a game and to enhance the game presence. The gaming machine selects a character action mode based on an operation signal and a plurality of pieces of character data and performs character action control based on the selected character action mode. The gaming machine also performs display control of the selection of the character action mode and of the character action control. In particular, the gaming machine performs specific image control while the character action mode is being selected during a battle, and limits the specific image control while the character action control is performed.

Description
RELATED APPLICATION

This application claims the priority of Japanese Patent Application No. 2005-106337 filed on Apr. 1, 2005, which is incorporated herein by reference.

BACKGROUND OF THE INVENTION

1. Field of the Invention

This invention relates to a gaming program, a gaming machine, and a record medium and in particular to a gaming program, a gaming machine, and a record medium for determining the action order of a plurality of characters, selecting the action mode of each character, and controlling the action of each character based on the action mode.

2. Description of the Related Art

Hitherto, various gaming programs have been provided in which the player plays the role of a character in a virtual world displayed on the screen of a computer or a display; as the player enters commands in response to player operation, the action mode of a character in the game is selected according to a predetermined action order and a preset story is advanced. Such a game is generally called an RPG (Role Playing Game).

A gaming program containing a battle scene in which a character operated in response to player operation (hereinafter referred to as a "player character") and an enemy character controlled by a computer fight a battle is generally known; in such a program, the player gains experience points, virtual money, etc., by beating the enemy character in the battle and raises the character level to advance the story.

In such a gaming program, when an action such as an attack of a player character is taken in a battle scene, a command is entered in response to player operation for each player character according to a predetermined action order, and the action mode of each character is selected based on the character data and the operation corresponding to the character, as shown, for example, in Japanese Unexamined Patent Publication No. 2004-237071. Display control in which such action mode selection is made is called character action selection display control. After the action mode of a character is selected, action control of the character is performed in such a manner that display control for the character to fight is performed based on the action mode of the character. Display control in which the actual character action is taken based on the selected action mode is called character action display control. The character action selection display control and the character action display control are thus repeatedly switched according to a predetermined order, whereby a battle scene is executed. When the character action display control is executed, display control is performed to give a powerful impression of the acting character so that the player can clearly recognize the acting character.

For example, Japanese Unexamined Patent Publication No. 2001-351123 discloses a gaming program that produces image display rich in presence, thereby improving the interest in the game, while suppressing specific image control such as fog processing applied to the display objects (characters, etc.) to be displayed on the screen. In the disclosed gaming program, a display area for displaying display objects and a non-display area in which display objects are not displayed are set on the screen based on the capacity of the display objects; if it is determined that a display object is contained in the non-display area, the display object is not displayed and the specific image control is suppressed. This makes it possible to display a large number of characters on one screen in the image processing of the game.

However, if the specific image control is simply performed in the battle scene in such a gaming program, the control load rises, and it is feared that the displayed image may be degraded by frame delay, frame skipping, etc., so that the interest in the game is decreased. On the other hand, if the number of display objects is limited to prevent such image degradation, a character that should be displayed may be skipped and left undisplayed depending on its position, and it is feared that the player is given a feeling of being out of place; it is therefore hard to say that a game high in presence is provided.

SUMMARY OF THE INVENTION

It is therefore an object of the invention to provide a gaming program, a gaming machine, and a record medium that make it possible to suppress loss of interest in a game and to enhance the game presence.

To the end, according to the invention, there are provided the following:

(1) A gaming program product for use in a computer including an input device that can be operated by a player, the product comprising: a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters; a character action order determination module for determining the action order of the plurality of characters; a character action mode selection module for selecting a character action mode based on an operation signal from the input device and the plurality of pieces of character data; a character action controller for performing character action control based on the character action mode selected by the character action mode selection module; a special character action controller for executing selection of the character action mode by the character action mode selection module and the character action control by the character action controller in accordance with the action order of the plurality of characters determined by the character action order determination module; a character action display controller for performing display control of selection of the character action mode and the character action control executed by the special character action controller; a specific image control execution module for executing specific image control for selection of the character action mode and the character action control subjected to display control by the character action display controller; and a specific image control limitation module for limiting the specific image control of the specific image control execution module while the special character action controller performs the character action control.

(2) The gaming program product described in (1), further comprising a specific image control invalidation module for invalidating the specific image control, as the limitation of the specific image control, while the character action controller performs the character action control.

(3) A gaming machine including an input device that can be operated by a player; a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters; a character action order determination module for determining the action order of the plurality of characters; a character action mode selection module for selecting a character action mode based on an operation signal from the input device and the plurality of pieces of character data;

a character action controller for performing character action control based on the character action mode selected by the character action mode selection module; a special character action controller for executing selection of the character action mode by the character action mode selection module and the character action control by the character action controller in accordance with the action order of the plurality of characters determined by the character action order determination module; a character action display controller for performing display control of selection of the character action mode executed by the special character action controller and the character action control; a specific image control execution module for executing specific image control for selection of the character action mode and the character action control subjected to display control by the character action display controller; and a specific image control limitation module for limiting the specific image control of the specific image control execution module while the special character action controller performs the character action control.

According to the invention described in (1) or (3), selection of the action mode of a character based on the operation signal from the input device operated by the player and on a plurality of pieces of character data, and action control of the character based on the selected action mode, are executed in accordance with the action order of the characters, and display control of the action mode selection and of the character action control is performed. Of the character action mode selection and the character action control, execution of the specific image control is limited while the character action control is performed. Therefore, while the character action control, whose control load grows comparatively large, is performed, the specific image control, whose own control load is high, is limited even if the load of other image control grows, so that degradation of the image can be prevented and loss of interest in the game can be suppressed. While the character action mode selection, whose control load is comparatively small, is made, the specific image control is executed; the characters to be displayed are displayed without being skipped because the number of displayed characters is not limited, and the specific image processing is furthermore performed, so that the game presence can be enhanced.

According to the invention described in (2), as the limitation of the specific image control, the specific image control is invalidated while the character action control is performed. Therefore, only image control other than the specific image control, whose control load is high, is executed, so that degradation of the image can be prevented still more reliably and loss of interest in the game can be suppressed.

According to the invention, loss of interest in a game can be suppressed and the game presence can be enhanced.

Additional objects and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objects and advantages of the invention may be realized and obtained by means of the instrumentalities and combinations particularly pointed out hereinafter.

BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention, and together with the general description given above and the detailed description of the embodiments given below, serve to explain the principles of the invention.

In the drawings:

FIG. 1 is a drawing to show the general configuration of a gaming machine incorporating the invention;

FIG. 2 is a block diagram to show the system configuration of the gaming machine in FIG. 1;

FIG. 3 is a drawing to show the character individual skills of player characters A and B;

FIG. 4 shows display examples of a title screen and a world map;

FIG. 5 is a schematic representation to show a battle scene;

FIG. 6 is a schematic representation to show a battle scene;

FIG. 7 is a schematic representation to show a battle scene;

FIG. 8 is a drawing to show the display mode of a judgment ring displayed at the command determination time;

FIG. 9 is a drawing to show the display mode of the judgment ring after the command determination;

FIG. 10 is a drawing to show other examples of 120% areas;

FIG. 11 is a flowchart to show a procedure of main game processing;

FIG. 12 is a flowchart to show a procedure of battle processing;

FIG. 13 is a flowchart to show a procedure of command processing;

FIG. 14 is a flowchart to show a procedure of judgment processing;

FIG. 15 is a flowchart to show a procedure of display control processing;

FIG. 16 is a drawing to show the configuration of a network gaming system;

FIG. 17 is a schematic representation to show image processing executed based on the distance from an eyepoint position; and

FIG. 18 is a schematic representation to show a battle scene.

BEST MODE FOR CARRYING OUT THE INVENTION

Referring now to the accompanying drawings, there are shown preferred embodiments of the invention.

(Configuration of Gaming Machine)

FIG. 1 shows the general configuration of a gaming machine incorporating the invention. The gaming machine 200 is made up of a machine main unit 1, an input device 4 (that can be operated by a player) for outputting a control command to the machine main unit 1 in response to player operation, and a display 15 for displaying an image based on an image signal from the machine main unit 1. In the gaming machine 200, a game is executed as various images, such as a plurality of characters including a player character and an enemy character, are displayed on a display surface (screen) 16 of the display 15 such as a CRT.

A game executed in the gaming machine 200 is executed as a gaming program recorded on an external record medium separate from the machine main unit 1 is read. In addition to a CD-ROM or a DVD-ROM, an FD (flexible disk) or any other record medium can be used as the external record medium recording the gaming program. In the embodiment, a DVD-ROM is used as the external record medium. A cover 2 that can be opened and closed is provided in the top center of the machine main unit 1. As the cover 2 is opened, a DVD-ROM 31 (see FIG. 2) can be placed in a DVD-ROM drive 29 (see FIG. 2) as a record medium drive provided inside the machine main unit 1.

The input device 4 includes various input parts for outputting a control command to a CPU 21 (see FIG. 2) in the machine main unit 1 in response to operation of the player. The input device 4 is provided in the left portion with an up button 7, a down button 8, a left button 9, and a right button 10 mainly operated by the player to move a character appearing in a game or move an option of a menu as the input parts. The input device 4 is provided in the right portion with a Δ button 11, a ◯ button 12, an X button 13, and a □ button 14 mainly operated by the player to determine or cancel various items. The input device 4 is provided in the center with a selection button 6 at the top and a start button 5 at the bottom.

The display 15 has input terminals of a video signal and an audio signal, which are connected to a video output terminal and an audio output terminal of the machine main unit 1 by terminal cables 18 and 19. Used as the display 15 is an existing television having in one piece the screen 16 that can display image data output from an image output section 25 described later (see FIG. 2) and speakers 17L and 17R that can output audio data output from an audio output section 27 described later (see FIG. 2). The machine main unit 1 and the input device 4 are connected by a signal cable 20 as shown in FIG. 1.

The machine main unit 1 is provided on one side with a memory slot 3 as an insertion slot of a memory card 32 (see FIG. 2). The memory card 32 is a storage medium for temporarily recording game data when the player interrupts the game, etc. The data recorded on the memory card 32 is read through a communication interface 30 described later (see FIG. 2) having a card reader function.

(Electric Configuration of Gaming Machine)

FIG. 2 shows the system configuration of the gaming machine 200. The machine main unit 1 includes the CPU 21 as a controller, ROM 22 and RAM 23 as storage modules, an image processing section 24, the image output section 25, an audio processing section 26, the audio output section 27, a decoder 28, the DVD-ROM drive 29, and the communication interface 30.

The DVD-ROM 31 can be attached to and detached from the DVD-ROM drive 29 and the gaming program in the DVD-ROM 31 placed in the DVD-ROM drive 29 is read by the CPU 21 in accordance with a basic operation program of an OS (operating system), etc., stored in the ROM 22. The read gaming program is converted into predetermined signals by the decoder 28 for storage in the RAM 23.

The gaming program stored in the RAM 23 is executed by the CPU 21 in accordance with the basic operation program or an input signal from the input device 4. Image data and audio data are read from the DVD-ROM 31 in response to the executed gaming program. The image data is sent to the image processing section 24 and the audio data is sent to the audio processing section 26.

The image processing section 24 converts the received image data into an image signal and supplies the image signal to the display 15 through the image output section 25, thereby displaying an image on the screen 16. Particularly, the image processing section 24 has functions of calculating the position relationship between the display object placed in a virtual three-dimensional coordinate space (for example, a display object such as a character) and the eyepoint position every predetermined timing, generating image data viewing the display object from the eyepoint position, and displaying an image based on the generated image data on the screen 16.

The audio processing section 26 converts the received audio data into an audio signal and supplies the audio signal to the speakers 17L and 17R through the audio output section 27.

The communication interface 30 enables the input device 4 and the memory card 32 to be connected detachably to the machine main unit 1. Through the communication interface 30, data is read from and written into the memory card 32 and a signal from the input device 4 is sent to the components of the CPU 21, etc.

(Character Individual Skills)

The RAM 23 stores the gaming program in the DVD-ROM 31 and parameters concerning characters based on the memory card 32. As a specific example of the parameters concerning characters, the character individual skills will be discussed with FIG. 3. FIG. 3 is a schematic representation to show the character individual skills of player characters A and B. The character individual skills of the player characters A and B will be discussed below; the character individual skills of other player characters and enemy characters are also to be stored in the RAM 23.

The character individual skills shown in FIG. 3 are stored for each of the characters appearing in a game. The types of character individual skills include hit points (HP), magic points (MP), sanity points (SP), physical offensive power (STR), physical defensive power (VIT), agility (AGL), magic offensive power (INT), magic defensive power (POW), and luck (LUC). Each of them is represented by a numeric value, and a different value is set depending on the type of character even when the character level is the same.

The character individual skills are set in response to the character level (LV). This LV changes with the experience value stored cumulatively in response to the experience of battles, etc., in a game. In particular, as for the HP, MP, and SP, the maximum HP, maximum MP, and maximum SP corresponding to the character individual skills and the actual HP, MP, and SP changing during a game are both stored. The AGL and the LUC also change based on special items or special actions as described later.

The character individual skills are loaded into the RAM 23 as described above. The character individual skills change with the weapons, protectors, items, etc., with which the character is equipped, and also change with the magic worked on the character and with the items used.
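Loaded into the RAM 23, the skill table described above might be modeled as follows. This is only an illustrative sketch; the field names, the `equipment_bonus` dictionary, and the `effective` helper are assumptions and do not appear in the specification.

```python
from dataclasses import dataclass, field

@dataclass
class CharacterSkills:
    """One record of character individual skills, as listed in FIG. 3."""
    level: int         # LV: rises with accumulated experience
    max_hp: int        # maximum hit points
    hp: int            # actual HP, changes during a game
    max_mp: int        # maximum magic points
    mp: int            # actual MP
    max_sp: int        # maximum sanity points
    sp: int            # actual SP
    strength: int      # STR: physical offensive power
    vitality: int      # VIT: physical defensive power
    agility: int       # AGL
    intelligence: int  # INT: magic offensive power
    power: int         # POW: magic defensive power
    luck: int          # LUC
    # Hypothetical modifiers from equipped weapons, protectors, and items.
    equipment_bonus: dict = field(default_factory=dict)

    def effective(self, skill_name: str) -> int:
        """Base skill value plus any equipment modifier for that skill."""
        return getattr(self, skill_name) + self.equipment_bonus.get(skill_name, 0)
```

A character equipped with a strength-boosting weapon, for example, would report a higher effective STR while the base value stored for its level stays unchanged.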

Thus, the CPU 21 reads the parameters concerning the characters, such as the character individual skills, stored in the RAM 23. The CPU 21 that loads such a character individual skill table into the RAM 23, together with the RAM 23, corresponds to an example of a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters. The DVD-ROM 31 storing such a character individual skill table also corresponds to the character data storage module.

(Display Screen)

Next, specific examples of display screens displayed on the screen 16 accompanying the game content executed by the CPU 21 based on the gaming program recorded in the DVD-ROM 31 will be discussed with FIGS. 4 to 7.

When the DVD-ROM 31 is placed in the DVD-ROM drive 29 and power of the machine main unit 1 is turned on, “opening demonstration” is displayed on the screen 16. The “opening demonstration” is effect display for telling the player about the start of a game. After the “opening demonstration” is displayed for a predetermined time, a “title screen” drawing a game title large is displayed as shown in FIG. 4A.

Here, specifically, the character string of the game title, SHADOW HEARTS, is displayed, and two options (NEW GAME and CONTINUE) are displayed below the game title. A cursor 41 is displayed at the left of either the NEW GAME or the CONTINUE option, and as the player operates the up button 7 or the down button 8, the position of the cursor 41 changes. When the player operates the ◯ button 12, the option pointed to by the cursor 41 is selected.

If the player selects NEW GAME on the "title screen," a prologue and the game content are displayed and then a "world map" is displayed as shown in FIG. 4B. On the other hand, if the player selects CONTINUE on the "title screen," the "world map" is displayed based on the data saved when the previous game ended, without displaying the prologue or the game content.

Specifically, the main cities of "A country" as the stage of the game story are displayed on the "world map" and options indicated by five city names (CITY A 42a, CITY B 42b, CITY C 42c, CITY D 42d, and CITY E 42e) are displayed. They are options to make a transition to a provided "submap." As the player operates the up button 7 or the down button 8, the cursor 41 indicating each option moves, and as the player operates the ◯ button 12, one option is selected. When one "submap" is thus selected, the "world map" makes a transition to the screen corresponding to the "submap" and the player can play various games set in response to the "submap." Specifically, the visual scene in each city is displayed as a prerendered background image conforming to the scene development, and while the player characters move within it, various events are cleared and the story proceeds.

When the player operates the □ button 14 on the “world map,” a “menu screen” is displayed, enabling the player to make various settings, etc., on the menu screen. When the “world map” is displayed, if the player selects a city, the start screen of the “submap” corresponding to the city is displayed. As the action on the “submap,” the player character can walk, can speak to a pedestrian, and can make a purchase.

In the game according to the embodiment, a player character which acts based on operation of the player and an enemy character which acts based only on the gaming program appear, and a game developed centering on the battle between the characters is realized on the screen 16. In the embodiment, four player characters, player character A 111, player character B 112, player character C 113, and player character D 114 (see FIG. 5A), appear, and the game proceeds in a party unit made up of the four characters. Various types of status are set for each character. The experience value, money, arms, skills, and the like, which increase with the number of games played, the number of times an enemy character has been beaten, etc., are defined as the status.

Then, if the player character party taking action on the "submap" encounters an enemy character, a battle with the enemy character is started as shown in FIG. 5A. In this case, for example, a battle with three enemy characters, enemy character a 115, enemy character b 116, and enemy character c 117, is started.

In such a battle screen, the action order of all characters (including the player characters and the enemy characters) is determined, and the action of each character is selected and controlled according to the action order.

Specifically, a battle between the player character A 111, the player character B 112, the player character C 113, and the player character D 114 and the enemy character a 115, the enemy character b 116, and the enemy character c 117 is started as shown in FIG. 5A. A predicted action order image 118 is displayed in the upper portion of the screen as shown in FIG. 5A. An action character image 119 indicating the character whose turn to act has come is displayed.
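The specification does not state how the action order itself is computed. One plausible sketch, under the purely illustrative assumption that turns are ordered by the agility (AGL) skill stored for each character, with higher AGL acting earlier:

```python
def determine_action_order(characters):
    """Sort characters into a battle action order.

    `characters` is a list of (name, agility) pairs. The rule that higher
    AGL acts earlier is an assumption for illustration only; the actual
    ordering rule is not given in the specification.
    """
    return [name for name, agility in
            sorted(characters, key=lambda c: c[1], reverse=True)]

# Example: the party and enemies mixed into one ordering.
order = determine_action_order(
    [("player A", 7), ("enemy a", 9), ("player B", 5)])
```

Whatever the actual rule, the resulting list would drive both the predicted action order image 118 and the switching between action mode selection and action control.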

If the player character A 111 becomes the action character according to the action order, an action selection screen is displayed as shown in FIG. 5B, and the player selects the action type of the player character A 111, such as attack, by operating the input device 4. An action object selection screen is then displayed as shown in FIG. 6A, and the player selects the action object of the player character A 111 for the action type. A judgment ring screen is then displayed, in which a judgment ring 100 and a rotation bar 101 are displayed as shown in FIG. 6B, and the success or failure of the action, the action effect, and the like are determined based on operation of the input device 4. The judgment ring is described later in detail with FIGS. 8 to 10.

An action effect screen of the player character A 111 is displayed as shown in FIG. 7A and FIG. 7B based on the action type, the action object, the action success or failure, the action effect, etc., of the player character A 111 selected and determined based on operation of the input device 4.

Although described later in detail, blurring processing to realistically represent depth in a virtual three-dimensional space is executed in the action selection screen, the action object selection screen, and the judgment ring screen as shown in FIGS. 5 and 6; however, the blurring processing is invalidated (limited) in the action effect screen as shown in FIG. 7.
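This switching between executing and invalidating the blurring can be sketched as a simple per-phase flag. The phase names below are hypothetical labels for the screens just described; the specification defines the behavior, not the implementation.

```python
# Battle phases in which blurring (the specific image control) is executed.
BLUR_ENABLED_PHASES = {
    "action_selection",         # FIG. 5B
    "action_object_selection",  # FIG. 6A
    "judgment_ring",            # FIG. 6B
}

# Phase in which blurring is invalidated: the character action display
# control already carries a high control load (FIG. 7).
BLUR_LIMITED_PHASES = {"action_effect"}

def blurring_enabled(phase: str) -> bool:
    """Return True when blurring should run for the given battle phase."""
    if phase in BLUR_LIMITED_PHASES:
        return False
    return phase in BLUR_ENABLED_PHASES
```

Gating the costly specific image control this way keeps the frame budget for the action effect display, which is the stated aim of the specific image control limitation module.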

(Blurring Processing)

The blurring processing is processing that produces the effect of a virtual three-dimensional space, providing perspective and a feeling of distance for the game screen. Such blurring processing will be discussed with FIGS. 17 and 18.

The blurring processing is executed based on the distance between an eyepoint position 214 and a character 212 in the depth direction, as shown in FIG. 17. Specifically, the eyepoint position 214 and the position of the character 212 are separated by a distance a in the depth direction. A plurality of reference positions 210a and 210b are set at predetermined distances in the depth direction from the eyepoint position 214. The two reference positions sandwiching the character 212 (for example, the reference positions 210a and 210b) are determined based on the distance between the eyepoint position 214 and the character 212. That is, the first reference position 210a, nearer than the character distance, and the second reference position 210b, farther than the character distance, are detected based on the character distance between the eyepoint position and the character in the depth direction.

The permeability (transparency) of texture at the reference position 210a is set as α1 and the permeability of texture at the reference position 210b is set as α2. The permeability α1, whose distance in the depth direction from the eyepoint position 214 is short, is set lower than the permeability α2, whose distance in the depth direction from the eyepoint position 214 is long.

The distances from the two reference positions 210a and 210b to the position of the character 212 are calculated. Specifically, the distance from the first reference position 210a to the character 212 is b1 and the distance from the second reference position 210b to the character 212 is b2. The permeability at the character 212 is calculated based on the distances b1 and b2 of the character 212 from the first reference position 210a and the second reference position 210b and on the permeabilities at the first reference position 210a and the second reference position 210b. Accordingly, an image low in permeability, namely a sharp and clear image, can be displayed at a position near to the eyepoint position 214.
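The calculation described above can be sketched as follows. The specification does not give the exact formula, so the linear weighting by closeness to each reference position, and the function name, are illustrative assumptions.

```python
def blended_permeability(char_dist, ref_near, ref_far, alpha_near, alpha_far):
    """Interpolate the texture permeability (transparency) for a character
    between the two reference positions that sandwich it in depth.

    char_dist  -- distance a from the eyepoint position to the character
    ref_near   -- first reference position 210a (ref_near <= char_dist)
    ref_far    -- second reference position 210b (char_dist <= ref_far)
    alpha_near -- permeability alpha1 at ref_near (lower: clearer image)
    alpha_far  -- permeability alpha2 at ref_far (higher: more blurred)
    """
    b1 = char_dist - ref_near   # distance b1 from the first reference position
    b2 = ref_far - char_dist    # distance b2 from the second reference position
    if b1 + b2 == 0:            # degenerate case: references coincide
        return alpha_near
    # Weight each reference permeability by the character's closeness to it:
    # a character sitting on ref_near gets exactly alpha_near, and so on.
    return (alpha_near * b2 + alpha_far * b1) / (b1 + b2)
```

A character midway between the two reference positions would thus receive the average of α1 and α2, giving the smooth clear-to-blurred gradient in depth shown in FIG. 18.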

In addition to the blurring processing, bilinear interpolation, trilinear interpolation, etc., are also executed for the color shades of the texture actually applied to the polygons of the character 212.

As a specific example, if the distance from the eyepoint position increases in the order of the player character A 111, the player character B 112, the player character C 113, and the player character D 114, the player character A 111 is displayed as the clearest image, and the player character B 112, the player character C 113, and the player character D 114 are displayed as progressively less clear images in this order, as shown in FIG. 18. To represent the blurring degree of each display object, the types of lines indicating the characters are changed from one player character to another in FIG. 18. Specifically, the player character C 113 and the player character D 114 are indicated using dashed lines; in fact, however, they are not displayed as dashed lines.

Thus, blurring control is performed based on the permeability set in response to the distance in the depth direction from the eyepoint position, whereby, for example, an image that is more blurred in response to the distance from the focus (the point set as the focal point) of the eyepoint can be generated, and representation of the depth of field (depth of focus) is made possible. Accordingly, unlike a conventional image in which all subjects in a screen are brought into focus, a real and natural game image that is brought into focus in response to the distance from the viewpoint, like a view in the real world, can be generated. Consequently, the virtual reality experienced by the player can be improved markedly.

(Description of Judgment Ring)

The above-described judgment ring will be discussed with FIGS. 8 to 10.

The judgment ring 100 shown in FIG. 8 is displayed just before action control of the player character based on a selected command is performed against the target character, and the judgment ring 100 is used to determine the parameters required for determining the effect described above. FIG. 8 shows the judgment ring 100 at the command determination time when the player character A 111 becomes the action character and soft hit is selected.

The judgment ring 100 as a reference area is displayed in a state in which it is inclined in a slanting direction as shown in FIG. 6B. Displayed on the judgment ring 100 is the rotation bar 101 as a varying area, rotating clockwise like a clock hand about the center point of the judgment ring 100 as a support, as shown in FIG. 8. This means that the rotation bar 101 as a varying area varies relative to the reference area. A variable display area whose display mode changes with the passage of time is formed of the reference area and the varying area varying relative to the reference area.

Also displayed on the judgment ring 100 are areas colored in predetermined angle ranges, which will be hereinafter referred to as timing areas. The timing areas are “effective areas” relatively advantageous to the player. In the judgment ring 100, the areas except the “effective areas” become “non-effective areas” relatively disadvantageous to the player. The timing areas contain a 120% area as a “special effective area” described later.

That is, the reference area is made up of the effective areas relatively advantageous to the player and the non-effective areas relatively disadvantageous to the player, and the effective areas contain the special effective area that is still more advantageous to the player. Accordingly, the effect of the action mode is determined as any of a first effect relatively advantageous to the player, a second effect relatively disadvantageous to the player, or a third effect relatively more advantageous to the player than the first effect.

Then, the settings of the parameters are changed depending on whether or not the player operates the ◯ button 12 while the rotation bar 101, once its rotation is started, is passing through any of the timing areas. There are three timing areas, as shown in FIG. 8. The timing area through which the rotation bar 101 passes first is the “first timing area” 102, the one through which it passes next is the “second timing area” 103, and the one through which it passes last is the “third timing area” 104.

For example, when the player successfully operates the ◯ button 12 on each of the three timing areas, namely, when the player operates the ◯ button 12 while the rotation bar 101 is on each of the three timing areas, the action taken by the player character against the enemy character becomes effective. If a FIGHT command is selected, three attacks are made on the enemy character, causing damage by the predetermined offensive power. If a SPECIAL command is selected and recovery magic is used, magic having predetermined recovery power can be worked on the player character three times, giving recovery power to the player character.

In contrast, if the player misses the operation timing of the ◯ button 12 on one timing area, the effect assigned to that timing area becomes ineffective. In particular, if the player fails on all three, the effect becomes zero. In the embodiment, the player visually recognizes the effective areas of the judgment ring 100; the point, however, is that any of the player's five senses may be used to convey the operation timing. For example, it is also possible to adopt an auditory configuration wherein a specific voice (sound) is generated for a predetermined time and the player is requested to operate within the generation section, or a tactile configuration wherein the input device 4 or a portable terminal is vibrated and the player is requested to operate within the vibration generation section.

The judgment ring 100 is formed according to the angle ranges of the timing areas. Specifically, if the action character is the player character A 111 and an attack command is selected, the top angle and the termination angle of the first timing area 102 are determined to be 45 degrees and 135 degrees, those of the second timing area 103 to be 180 degrees and 247 degrees, and those of the third timing area 104 to be 292 degrees and 337 degrees. As shown in FIG. 8, the “120% area” in the first timing area 102 is the range 102a from 105 degrees (the termination angle 135 degrees minus 30 degrees) to the termination angle 135 degrees; the “120% area” in the second timing area 103 is the range 103a from 224 degrees (the termination angle 247 degrees minus 23 degrees) to the termination angle 247 degrees; and the “120% area” in the third timing area 104 is the range 104a from 322 degrees (the termination angle 337 degrees minus 15 degrees) to the termination angle 337 degrees.
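Using the concrete angles above, the judgment can be sketched as a simple angle lookup. The table values come from the text; the function name and the return convention (an effect multiplier) are assumptions for illustration.

```python
# (top angle, termination angle, "120% area" start) in degrees, from the text.
TIMING_AREAS = [
    (45, 135, 105),   # first timing area 102; "120% area" 102a is 105-135
    (180, 247, 224),  # second timing area 103; 103a is 224-247
    (292, 337, 322),  # third timing area 104; 104a is 322-337
]

def judge_press(angle):
    """Classify the rotation bar's angle when the ◯ button 12 is pressed.

    Returns 1.2 inside a "120% area" (special effective area), 1.0 inside
    the rest of a timing area (effective area), and 0.0 elsewhere
    (non-effective area).
    """
    for top, termination, special in TIMING_AREAS:
        if top <= angle <= termination:
            return 1.2 if angle >= special else 1.0
    return 0.0
```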

FIG. 9 shows the display mode of the judgment ring 100 after the command determination. It shows a state in which the rotation bar 101 starts to rotate and passes through the first timing area 102.

The “120% areas” are not limited to those described above. For example, the “120% area” may be provided in the range from the top angle to a predetermined angle as shown in FIG. 10A, or two “120% areas” may be provided in one timing area as shown in FIG. 10B. FIG. 10A shows the case where the range 102a from the top angle 45 degrees to 65 degrees (45 degrees + 20 degrees) is set as the “120% area”. FIG. 10B shows the case where the range 102a from the top angle 45 degrees to 65 degrees (45 degrees + 20 degrees) and the range from 105 degrees (the termination angle 135 degrees minus 30 degrees) to the termination angle 135 degrees are set as the “120% areas”.

(Operation of Gaming Machine)

Various types of processing executed in the configuration described above will be discussed below with FIGS. 11 to 15.

(Main Game Processing)

If the DVD-ROM 31 is placed in the DVD-ROM drive 29 when the power of the machine main unit 1 is on, “opening demonstration” is displayed on the screen 16 as described above and then main game processing as shown in FIG. 11 is executed.

First, whether or not NEW GAME of the two options is selected on the “title screen” as shown in FIG. 4A is determined (ST1) as shown in FIG. 11. If it is determined that NEW GAME is selected (YES at ST1), a prologue and the game content are displayed (ST2). If it is not determined that NEW GAME is selected (NO at ST1), namely, if it is determined that CONTINUE is selected on the “title screen,” the data saved when the previous game ended is set without displaying the prologue or the game content (ST3).

Next, the “world map” shown in FIG. 4B is displayed (ST4). Whether or not any of the options displayed on the “world map” is selected is determined (ST5). When the determination at ST5 is YES, a start screen of the “submap” responsive to the selected option is displayed and the party of the player characters starts action on the “submap” (ST6). On the other hand, when the determination at ST5 is NO, whether or not the player operates the □ button 14 on the “world map” for making a “menu screen” display request is determined (ST20). When the determination at ST20 is YES, the “menu screen” is displayed and various types of setting processing are performed in response to operation of the player (ST21) and then the process is transferred to ST5. On the other hand, when the determination at ST20 is NO, again the process is transferred to ST5. The action on the “submap” is for the player character to walk, talk to a pedestrian, do shopping, etc. The player can also display the “menu screen” by operating the □ button 14 on the “submap” and various types of operation are made possible. For example, if the player selects a TOOL command, tool command processing is executed and the skills of the player character can be recovered; if the player selects a DEALING command, dealing processing is executed and dealing of the possessed item is made possible.

Next, whether or not the player character party starting action on the “submap” encounters an enemy character is determined (ST7). When the determination at ST7 is YES, “battle processing” is started (ST8). When the “battle processing” is started, a transition is made to a “battle scene” where a battle is fought between the player character party and the enemy character. The “battle processing” is described later with FIG. 12. On the other hand, when the determination at ST7 is NO, whether or not some event occurs is determined (ST9). When the determination at ST9 is YES, the process goes to ST16; when the determination at ST9 is NO, again the process is transferred to ST6.

Next, whether or not the player character party succeeds in escaping from the enemy character in the “battle scene” executed by performing the “battle processing” is determined (ST10). When the determination at ST10 is YES, the process goes to ST16. On the other hand, if the player character party fails in escaping from the enemy character or the player character party fights a battle with the enemy character, whether or not the player character party defeats the enemy character in the “battle scene” is determined (ST11). When the determination is YES, namely, when the player character party defeats the enemy character, points of the experience value, etc., are added, or an item, money, etc., is given to each character of the party in response to the type of enemy character and the battle content (ST12). The level of each character is raised in response to the experience value of the character (ST13). On the other hand, when the determination at ST11 is NO, namely, when the player character party cannot defeat the enemy character, whether all characters of the player character party die is determined (ST14). When the determination at ST14 is YES, the game is over (ST15) and the main game processing is terminated. When the determination at ST14 is NO, the process is transferred to ST16.

At ST16, a movie responsive to the situation is displayed and subsequently whether or not the selected submap request condition has been cleared is determined (ST17). When the determination at ST17 is NO, again the process is transferred to ST6. When the determination at ST17 is YES, whether or not a transition is to be made to the ending is determined (ST18). When the determination at ST18 is YES, a predetermined ending is displayed (ST19) and the main game processing is terminated. On the other hand, when the determination at ST18 is NO, again the process is transferred to ST4.
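The flow of steps ST1 to ST21 can be summarized as a small transition table. The state and event labels below are hypothetical names chosen for this sketch, and several branches (for example, returning from a movie to the submap when the request condition is not cleared) are simplified.

```python
def next_state(state, event=None):
    """Simplified transition table for the main game flow (ST1-ST21)."""
    transitions = {
        ("TITLE", "NEW_GAME"): "PROLOGUE",            # ST1 YES -> ST2
        ("TITLE", "CONTINUE"): "LOAD_SAVE",           # ST1 NO  -> ST3
        ("PROLOGUE", None): "WORLD_MAP",              # ST2 -> ST4
        ("LOAD_SAVE", None): "WORLD_MAP",             # ST3 -> ST4
        ("WORLD_MAP", "SELECT_SUBMAP"): "SUBMAP",     # ST5 YES -> ST6
        ("WORLD_MAP", "MENU"): "MENU",                # ST20 YES -> ST21
        ("MENU", None): "WORLD_MAP",                  # ST21 -> ST5
        ("SUBMAP", "ENCOUNTER"): "BATTLE",            # ST7 YES -> ST8
        ("SUBMAP", "EVENT"): "MOVIE",                 # ST9 YES -> ST16
        ("BATTLE", "PARTY_WIPED"): "GAME_OVER",       # ST14 YES -> ST15
        ("BATTLE", "WON"): "MOVIE",                   # ST11 YES ... -> ST16
        ("MOVIE", "CONDITION_CLEARED"): "WORLD_MAP",  # ST17 YES, ST18 NO
        ("MOVIE", "ENDING"): "ENDING",                # ST18 YES -> ST19
    }
    return transitions.get((state, event), state)
```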

(Battle Processing)

The “battle processing” will be discussed with FIG. 12.

First, processing of setting the parameters concerning the characters and calculating and setting the turn interval value is executed (ST30) as shown in FIG. 12. In this processing, the CPU 21 reads the parameters concerning the characters from a predetermined region of the RAM 23 or the DVD-ROM 31 and sets the parameters in a predetermined region of the RAM 23. The characters mentioned here correspond to the plurality of characters, containing the player characters and the enemy characters, appearing in a “battle scene.” The turn interval value is a value used to determine the action order and is calculated for each of the plurality of characters (including the player characters and the enemy characters). The turn interval value is determined in response to the agility (AGL) and luck (LUC) set for each of the plurality of characters and an execution command correction value corresponding to the executed command type. The CPU 21 sets the turn interval value calculated for each of the plurality of characters appearing in a “battle scene” in a predetermined region of the RAM 23. This means that the CPU 21 for executing such processing corresponds to an example of a character action order determination module for determining the action order of the characters. Upon completion of the processing, the process is transferred to ST31.
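The text specifies that the turn interval value depends on AGL, LUC, and an execution command correction value, but not the exact formula. The sketch below therefore assumes one illustrative formula (higher agility and luck shorten the interval; heavier commands lengthen it) purely for exposition.

```python
def turn_interval(agl, luc, command_correction=1.0, base=1000):
    """Hypothetical turn interval: smaller values act earlier.

    The formula is an assumption; the text only states that AGL, LUC, and
    an execution command correction value determine the interval.
    """
    return base * command_correction / (agl + luc)

# Action order: characters sorted by ascending turn interval value.
party = {"A": (30, 10), "B": (25, 5), "enemy_a": (20, 8)}
order = sorted(party, key=lambda name: turn_interval(*party[name]))
```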

At ST31, a battle scene start screen of a “battle scene” as shown in FIG. 5A is displayed. On the start screen, the player character party (player character A 111, player character B 112, player character C 113, and player character D 114) is displayed facing the player. The enemy characters (for example, enemy character a 115, enemy character b 116, and enemy character c 117) are displayed at the positions corresponding to the player characters, on the side opposed to the player characters. Information concerning the status of each player character is displayed in the lower right portion of the start screen, although not shown in FIG. 5A. Further, the action order image 118, indicating the order in which the player characters and the enemy characters execute actions, and the action character image 119, indicating the character whose action turn has come, are displayed in the upper portion of the start screen.

At ST32, “turn order processing” is performed to manage the order in which the player characters and the enemy characters can take action of attack, etc. In this processing, the CPU 21 manages the turn order of the characters for which command selection is made effective based on the turn interval value calculated from the skills, etc., of the characters. This means that the CPU 21 for executing such processing corresponds to an example of the character action order determination module for determining the action order of the characters.

The CPU 21 displays an image indicating the turn order on the screen 16. The CPU 21 zooms in on the player character for which command selection is made effective (here, the player character A 111) and displays a “command selection screen” as shown in FIG. 5B. Upon completion of the processing, the process is transferred to ST33.

At ST33, whether or not the character for which command selection is made effective in the “turn order processing” is an enemy character is determined. If the determination at ST33 is YES, automatic processing is performed in accordance with the gaming program so that the enemy character makes an attack on the player character (ST34). In the battle automatic processing, the CPU 21 executes action effect screen display control processing in the enemy character. Accordingly, processing of displaying an action effect screen in the enemy character is executed in display control processing described later (see FIG. 15). This means that the CPU 21 for executing such processing performs display control of action control of the character in a battle scene. The CPU 21 for executing such processing corresponds to an example of a character action display controller. Although described later in detail, specific image control is limited (invalidated) in the action effect screen. Upon completion of the processing, the process is transferred to ST36.

On the other hand, if it is determined at ST33 that the character for which command selection is made effective is the player character, subsequently “command processing” of accepting command selection of the player is performed (ST35). In the processing, a command is selected in response to operation input from the input device 4 and the action mode based on the selected command is determined. This means that the CPU 21 for executing such processing executes selection of the action mode of the character and action control of the character (described later) in accordance with the action order of the characters determined at ST30, ST32, ST36, etc. The CPU 21 for executing such processing corresponds to an example of a special character action controller.

The CPU 21 displays a command menu 44 indicating commands to determine the action mode of the player character A 111 as options on the screen 16. The CPU 21 moves a selection cursor 45 (see FIGS. 5 and 6) displayed at the left of the command menu 44 (see FIGS. 5 and 6) as the player operates the up button 7 or the down button 8 of the input device 4. When the player operates the ◯ button 12, the command with the selection cursor 45 displayed at the left position is selected and the action mode of the player character A 111 is determined. Various commands represented by ATTACK, MAGIC, ITEM, DEFEND, and ESCAPE are displayed on the command menu 44.

Effect display responsive to the determined action mode is produced. For example, if the player selects an attack, magic, specific act, or item use command (an “action” command, described later), display processing is executed such that the action is taken against the target character (the enemy character, etc.) as the action target of the player character. In the “command processing,” “judgment processing” for enabling technical intervention according to the operation timing of the player is also performed. The “command processing” is described later in detail with FIG. 13. Upon completion of the processing, the process is transferred to ST36.

At ST36, whenever a character takes action, the turn order is updated. In the processing, the CPU 21 stores the character taking the action in a predetermined region of the RAM 23 and updates the turn order of that character. Accordingly, when the “turn order processing” is executed again, the turn orders are compared and the character whose command selection is to be made effective next can be determined. Once all characters have executed actions, the characters that have acted are reset and stored as characters that have not yet acted. This means that the CPU 21 for executing such processing corresponds to an example of the character action order determination module for determining the action order of the characters. Upon completion of the processing, the process is transferred to ST37.
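The round keeping described at ST36 (mark a character as having acted; once every character has acted, reset all of them to not-yet-acted) can be sketched as a small bookkeeping class. The class name and method names are hypothetical.

```python
class TurnTracker:
    """Tracks which characters have acted this round (cf. ST36)."""

    def __init__(self, names):
        self.names = set(names)
        self.acted = set()

    def mark_acted(self, name):
        self.acted.add(name)
        if self.acted == self.names:  # all have acted: start a new round
            self.acted.clear()

    def has_acted(self, name):
        return name in self.acted
```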

At ST37, update display processing of the turn order is executed. In the processing, the CPU 21 updates and displays the turn order to execute action in the next turn based on the turn order updated at ST36. Upon completion of the processing, the process is transferred to ST38.

At ST38, whether or not the “battle processing” exit condition is satisfied is determined. When the determination at ST38 is NO, the process returns to ST32; when the determination at ST38 is YES, “soul point addition processing” is executed (ST39) and the “battle processing” is exited. The “battle processing” exit condition is satisfied by any of the following: the enemy characters appearing on the battle screen suffer a crushing defeat; the player selects an “ESCAPE” command and the player character party succeeds in escaping from the enemy characters; the player character party suffers a crushing defeat; or an event for terminating the battle occurs.

(Command Processing)

The “command processing” will be discussed with FIG. 13.

First, command screen display control processing is executed (ST200) as shown in FIG. 13. In the processing, the CPU 21 sets data to display a command selection screen in the RAM 23. Accordingly, command screen display processing is executed based on the setup data in display control processing described later (see FIG. 15). This means that the CPU 21 for executing such processing performs display control of selection of the action mode of the character in a battle scene. The CPU 21 for executing such processing corresponds to an example of the character action display controller. Upon completion of the processing, the process is transferred to ST201.

At ST201, whether or not the command is an ACTION SELECTION command is determined. In this processing, the CPU 21 determines whether or not the command is an ACTION SELECTION command in response to an input signal from the input device 4, etc. The ACTION SELECTION command mentioned here includes the ATTACK command and the MAGIC command as well as a SPECIFIC ACT command such as a FUSION command, an ITEM command, etc. If the CPU 21 determines that the command is an ACTION SELECTION command, the CPU 21 executes “judgment processing” of physical attack, magic, specific act, item use, etc. (ST202) and exits the subroutine. The “judgment processing” is described later in detail with FIG. 14. On the other hand, if the CPU 21 does not determine that the command is an ACTION SELECTION command, the CPU 21 transfers the process to ST203.

Subsequently, at ST203, the CPU 21 determines whether or not the command is a DEFEND command. If the CPU 21 determines that the command is a DEFEND command, the CPU 21 executes defense processing (ST204) and exits the subroutine. On the other hand, if the CPU 21 does not determine that the command is a DEFEND command, the CPU 21 executes escape processing (ST205). Upon completion of the processing, the subroutine is exited.
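The branching of ST201 to ST205 amounts to a three-way dispatch. The sketch below uses hypothetical string labels; the embodiment dispatches on input signals from the input device 4.

```python
# Commands treated as ACTION SELECTION commands at ST201.
ACTION_COMMANDS = {"ATTACK", "MAGIC", "SPECIFIC_ACT", "FUSION", "ITEM"}

def dispatch_command(command):
    """Mirror ST201-ST205: action commands go to judgment processing,
    DEFEND to defense processing, and anything else to escape processing."""
    if command in ACTION_COMMANDS:
        return "judgment"   # ST202
    if command == "DEFEND":
        return "defense"    # ST204
    return "escape"         # ST205
```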

(Judgment Processing)

The “judgment processing” will be discussed with FIG. 14.

First, the CPU 21 executes action selection screen display control processing for setting, in the RAM 23, the data to display an action selection screen on the screen 16 (ST221) and executes “command acceptance processing” (ST222). Accordingly, processing of displaying an action selection screen as shown in FIG. 5B is executed based on the setup data in display control processing described later (see FIG. 15). This means that the CPU 21 for executing such processing executes display control of selection of the action mode of the character in a battle scene. The CPU 21 for executing such processing corresponds to an example of a character action display controller. The CPU 21 determines the action to be executed based on the character data corresponding to the character stored in the RAM 23 in response to an operation input signal supplied from the input device 4. Specifically, the CPU 21 determines the type of attack, magic, specific act, or item use. For example, when an ACTION SELECTION command is selected, the type of hit is determined. The hit types are soft hit, normal hit, hard hit, etc. Upon completion of the processing, the process is transferred to ST223.

At ST223, the CPU 21 executes action target selection screen display control processing for setting, in the RAM 23, the data to display an action target selection screen on the screen 16, and executes action target selection command acceptance processing (ST224). Accordingly, processing of displaying an action target selection screen as shown in FIG. 6A is executed based on the setup data in the display control processing described later (see FIG. 15). This means that the CPU 21 for executing such processing executes display control of selection of the action mode of the character in a battle scene. The CPU 21 for executing such processing corresponds to an example of the character action display controller. The CPU 21 determines the character (target character) as the target of the action (attack, attack magic use, recovery magic use, specific act use, item use, etc.) taken based on the command selected at ST222 in response to an operation input signal supplied from the input device 4 and stores the character in a predetermined region of the RAM 23. Upon completion of the processing, the process is transferred to ST225.

At ST225, “judgment ring determination processing” is executed. In the processing, the CPU 21 determines the display mode of the judgment ring 100 and the rotation bar 101 in response to the skills, etc., of the character taking action. This means that the CPU 21 determines the display mode of the judgment ring 100 and the rotation bar 101 based on the character data corresponding to the character stored in the RAM 23. Upon completion of the processing, the process is transferred to ST226.

At ST226, the CPU 21 executes judgment ring screen display control processing for setting the data to display a judgment ring screen on the screen 16 in the RAM 23. Accordingly, processing of displaying a judgment ring screen as shown in FIG. 6B is executed based on the setup data in the display control processing described later (see FIG. 15). This means that the CPU 21 for executing such processing executes display control of selection of the action mode of the character in a battle scene. The CPU 21 for executing such processing corresponds to an example of the character action display controller. Upon completion of the processing, the process is transferred to ST227.

At ST227, “judgment ring judgment processing” is executed. In the processing, the CPU 21 determines the success or failure of action of attack, etc., the effect of action, etc., in response to operation input from the input device 4. Upon completion of the processing, the process is transferred to ST228.

Thus, the CPU 21 selects the character action mode based on the operation signal from the input device 4 and the character data stored in the RAM 23. The CPU 21 for executing such processing corresponds to an example of a character action mode selection module.

At ST228, update processing of the individual skill parameters of HP, MP, SP, AGL, LUC, etc., and the status is executed. In the processing, the CPU 21 updates the values of HP, MP, and SP based on the damage amount or the recovery value calculated by performing the “judgment ring judgment processing.” HP and MP are incremented or decremented and SP is decremented in response to the damage amount, the recovery value, etc. Whenever such processing is executed, SP is decremented by one. That is, SP is decremented by one every turn of the character. The CPU 21 updates and stores the individual skill parameters of AGL, LUC, etc., based on the special item used or the special act executed. Further, the CPU 21 updates the status of the character in response to the action executed by performing the “judgment ring judgment processing.” At that time, if the status of the character is updated to an “abnormal status,” the character enters an abnormal state different from the usual state. The “abnormal status” varies depending on the type of attack item, magic, etc. For example, the “poison” abnormal status is an abnormal status in which the physical strength of the character automatically decreases in every turn in which the character takes action, brought about when the character receives magic from the enemy or receives an attack based on a predetermined item. The “lithification” abnormal status is an abnormal status in which the character is hardened like a stone and command entry is made impossible, likewise brought about when the character receives magic from the enemy or receives an attack based on a predetermined item. Upon completion of the processing, the process is transferred to ST229.
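The per-turn bookkeeping at ST228 (SP decremented by one every turn; a poisoned character automatically losing physical strength) can be sketched as follows. The per-turn poison damage value is an assumption; the text does not specify it.

```python
def apply_turn_end(char):
    """Sketch of the ST228 per-turn updates.

    SP drops by one every turn of the character; "poison" abnormal status
    automatically decreases HP (the amount, 5, is hypothetical).
    """
    char["SP"] = max(0, char["SP"] - 1)
    if char.get("status") == "poison":
        char["HP"] = max(0, char["HP"] - 5)
    return char

hero = {"HP": 50, "MP": 20, "SP": 3, "status": "poison"}
apply_turn_end(hero)
```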

At ST229, “action effect screen display control processing” is executed. In the processing, the CPU 21 executes action effect screen display control processing for setting, in the RAM 23, data to display on the screen 16 an action effect screen as shown in FIGS. 7A and 7B, in which a predetermined action of a character, etc. (attack, magic, specific act, use of an item, etc.) is taken based on the action type selected by performing the “command acceptance processing,” the action target selected by performing the “action target selection command acceptance processing,” and the success or failure and the effect of the action determined by performing the “judgment ring judgment processing.” Accordingly, processing of displaying the action effect screen of the player character is executed based on the setup data in the display control processing described later (see FIG. 15). Although described later in detail, specific image control is limited (invalidated) in the action effect screen. The CPU 21 also displays a parameter image of HP, MP, SP, etc., on the screen 16 based on the updated parameters. Upon completion of the processing, the subroutine is exited.

Thus, the CPU 21 performs character action control based on the action mode of the character selected at steps ST221 to ST227, etc. The CPU 21 for executing such processing corresponds to an example of a character action controller. The CPU 21 for executing such processing performs display control of action control of the character in a battle scene. The CPU 21 for executing such processing corresponds to an example of the character action display controller.

(Display Control Processing)

Unlike the processing described above, the “display control processing” is called at a predetermined timing in the image processing section 24; it will be discussed with FIG. 15. In the processing, a screen is displayed based on the screen display data set in the RAM 23 in the various types of display control processing described above.

First, the image processing section 24 sets the eyepoint position, the light source position, etc., and executes display object three-dimensional coordinate calculation processing (ST231). In the processing, the image processing section 24 selects the display object based on the data supplied from the CPU 21, such as the character to be displayed. The image processing section 24 reads the three-dimensional coordinates of all polygons concerning the selected display objects and places the display objects in virtual three-dimensional coordinates. Upon completion of the processing, the process is transferred to ST232.

At ST232, display object three-dimensional conversion processing is executed. In the processing, the image processing section 24 calculates the positional relationship with the display object with the eyepoint position as the reference based on the eyepoint position and the three-dimensional coordinates of the placed display object. Upon completion of the processing, the process is transferred to ST233.

At ST233, whether or not action control is being performed is determined. In the processing, the image processing section 24 determines whether or not action control is being performed depending on whether or not the action effect screen in the player character or the enemy character is displayed based on the data set in the various types of display control processing. If the image processing section 24 determines that action control is being performed, it skips ST234 and transfers the process to ST235. On the other hand, if the image processing section 24 does not determine that action control is being performed, it transfers the process to ST234.

At ST234, specific image control processing is executed. In the processing, the image processing section 24 executes blurring processing as the specific image control processing. The image processing section 24 calculates the distance between the eyepoint position and the center of the display object. The CPU 21 detects two reference positions based on the distance between the eyepoint position and the center of the display object, and calculates the permeability at the center of the display object based on the distances between the two reference positions and the center of the display object and the permeabilities set at the two reference positions. Accordingly, the CPU 21 sets data to display an image having a depth of focus based on the distance of the display object from the eyepoint position. Upon completion of the processing, the process is transferred to ST235.
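The permeability calculation at ST234 (interpolating between the permeabilities set at the two reference positions that bracket the display object's distance) can be sketched with linear interpolation. The reference table values below are illustrative, not from the text.

```python
def interpolate_permeability(distance, ref_points):
    """Linearly interpolate permeability between the two reference positions
    bracketing the display object's distance (cf. ST234)."""
    ref_points = sorted(ref_points)
    for (d0, p0), (d1, p1) in zip(ref_points, ref_points[1:]):
        if d0 <= distance <= d1:
            t = (distance - d0) / (d1 - d0)
            return p0 + t * (p1 - p0)
    # Outside the table: clamp to the nearest reference permeability.
    return ref_points[0][1] if distance < ref_points[0][0] else ref_points[-1][1]

# Illustrative (distance, permeability) reference positions.
REFS = [(0.0, 1.0), (10.0, 0.5), (20.0, 0.0)]
```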

At ST235, rendering processing is executed. In the processing, the image processing section 24 generates data for visualization based on the positional relationship with the display object with the eyepoint position calculated at ST232 as the reference and the like. Particularly, when the image processing section 24 executes ST234, it executes rendering processing executing the specific image control processing; when the image processing section 24 does not execute ST234, it executes rendering processing skipping the specific image control processing. Upon completion of the processing, the process is transferred to ST236.

This means that the image processing section 24 for executing such processing executes specific image control for selection of the action mode of the character subjected to display control. On the other hand, while character action control is performed, the image processing section 24 limits specific image control by invalidating it. The image processing section 24 for executing such processing corresponds to an example of a specific image control execution module, an example of a specific image control limitation module, and an example of a specific image control invalidation module.
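The ST233 to ST235 branch (skip the specific image control while action control is performed, apply it otherwise) can be condensed into a few lines. The data shapes here are hypothetical.

```python
def render_frame(display_objects, action_control_active):
    """Cf. ST233-ST235: blurring (the specific image control) is applied only
    while action control is NOT being performed; on an action effect screen
    ST234 is skipped to keep the control load down."""
    frames = []
    for obj in display_objects:
        frames.append({
            "name": obj,
            "blur_applied": not action_control_active,  # ST234 gated by ST233
        })
    return frames
```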

At ST236, whether or not processing in all display objects terminates is determined. In the processing, the image processing section 24 determines whether or not processing in all display objects terminates depending on whether or not the processing at steps ST231 to ST235 has been executed for all display objects (for example, characters, etc.) to be displayed based on the data set in the various types of display control processing. If the image processing section 24 determines that processing in all display objects terminates, it transfers the process to ST237. On the other hand, if the image processing section 24 does not determine that processing in all display objects terminates, it again transfers the process to ST231 for executing the processing at steps ST231 to ST235 for the remaining display objects.

At ST237, image display processing is executed. In this processing, the image processing section 24 draws each display object, with the eyepoint position as the reference, from the result of the rendering at ST235 and displays the drawn image on the screen 16. This means that the image processing section 24 executing such processing performs display control of both the selection of the character's action mode executed in a battle scene and the action control of the character. The image processing section 24 executing such processing corresponds to an example of the character action display controller. Upon completion of the processing, the subroutine is exited.
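The per-object flow of steps ST231 to ST237 can be pictured as a rendering loop in which the "specific image control" (for example, blurring) is applied only while the character action mode is being selected and skipped while character action control runs. The following Python sketch is purely illustrative; the patent discloses no source code, and every function name here is a hypothetical stand-in.

```python
# Illustrative sketch of the ST231-ST237 loop; all names are hypothetical.

def render_frame(display_objects, eyepoint, in_action_control):
    """Render every display object, limiting specific image control
    while character action control is performed."""
    rendered = []
    for obj in display_objects:                       # loop of ST231-ST236
        image = rasterize(obj, eyepoint)              # rendering at ST235
        if not in_action_control:                     # ST234 executed?
            image = apply_specific_image_control(image)   # e.g. blurring
        rendered.append(image)
    return compose(rendered)                          # image display, ST237

# Minimal stand-ins so the sketch is self-contained:
def rasterize(obj, eyepoint):
    return {"obj": obj, "blur": False}

def apply_specific_image_control(image):
    image["blur"] = True
    return image

def compose(images):
    return images
```

Under this sketch, a frame rendered during action control contains no blurred objects, while a frame rendered during action-mode selection blurs every object, matching the branching on ST234 described above.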

Thus, selection of the action mode of a character, based on the operation signal from the input device that can be operated by the player and the plurality of pieces of character data, and action control of the character based on the selected action mode are executed in accordance with the action order of the characters, and display control of both the action mode selection and the action control is performed. Of these two phases, execution of specific image control is limited while the character action control is performed. Therefore, during the character action control, whose control load is relatively large, the high-load specific image control is limited even if the load of the other image control grows, so that degradation of the image can be prevented and the loss of interest in the game can be suppressed. While the character action mode is being selected, with its relatively small control load, the specific image control is executed and the characters to be displayed are all displayed without being skipped, because the number of displayed characters is not limited; the specific image processing thus performed enhances the game presence. As the limitation of specific image control, the specific image control is invalidated while the character action control is performed. Therefore, only image control other than the high-load specific image control is executed, so that degradation of the image can be still more reliably prevented and the loss of interest in the game can be suppressed.

(Program)

The above-described gaming program will now be discussed in detail. Specifically, the gaming program is a program for causing a computer to function as the following modules. In other words, it is also a program for causing a computer to execute the following processes (steps), and a program for causing a computer to implement the functions of the following processes. Such a computer includes an input device that can be operated by the player and a display module for displaying an image concerning the game.

(A1) Character data storage module (process) for storing a plurality of pieces of character data concerning a plurality of characters.

(A2) Character action order determination module (process) for determining the action order of the plurality of characters.

(A3) Character action mode selection module (process) for selecting the character action mode based on an operation signal from the input device and the plurality of pieces of character data.

(A4) Character action controller (process) for performing character action control based on the character action mode selected by the character action mode selection module.

(A5) Special character action controller (process) for executing selection of the character action mode by the character action mode selection module and the character action control by the character action controller in accordance with the action order of the plurality of characters determined by the character action order determination module.

(A6) Character action display controller (process) for performing display control of selection of the character action mode and the character action control executed by the special character action controller.

(A7) Specific image control execution module (process) for executing specific image control for selection of the character action mode and the character action control subjected to display control by the character action display controller.

(A8) Specific image control limitation module (process) for limiting the specific image control of the specific image control execution module while the special character action controller performs the character action control.

(A9) Specific image control invalidation module (process) for invalidating the specific image control as limitation of the specific image control while the character action controller performs the character action control.
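One possible way to picture how modules (A1) to (A5) cooperate is as a single battle controller that holds the character data, determines the action order, and then drives mode selection and action control in that order. The class below is a hypothetical sketch; the patent names the modules but prescribes no implementation, and every identifier and the agility-based ordering rule are assumptions for illustration.

```python
# Hypothetical sketch of modules (A1)-(A5); names and rules are illustrative.

class BattleController:
    def __init__(self, characters):
        self.characters = characters          # (A1) character data storage
        self.in_action_control = False        # flag read by (A8)/(A9)

    def determine_action_order(self):         # (A2) action order determination
        # Assumed rule for illustration: higher agility acts first.
        return sorted(self.characters, key=lambda c: -c["agility"])

    def select_action_mode(self, character, operation_signal):  # (A3)
        # Mode comes from the operation signal; default is illustrative.
        return operation_signal or "attack"

    def perform_action(self, character, mode):                  # (A4)
        self.in_action_control = True         # specific image control limited
        # ... animate the selected action here ...
        self.in_action_control = False

    def run_turn(self, operation_signals):    # (A5) drives (A3) then (A4)
        for ch, sig in zip(self.determine_action_order(), operation_signals):
            mode = self.select_action_mode(ch, sig)
            self.perform_action(ch, mode)
```

The `in_action_control` flag is the hook through which modules (A7) to (A9) would decide whether to execute, limit, or invalidate the specific image control during rendering.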

(Storage Medium)

In a computer-readable record medium recording such a gaming program, a skill parameter and a possessed item parameter may be stored for each of the plurality of characters in addition to the gaming program.

Other Embodiments

Although the embodiment has been described, the invention is not limited to the specific embodiment. For example, the input device 4 operated by the player may be integrated into the machine main unit 1.

In the embodiment, after all characters appearing in a "battle scene" have executed their actions, the turn order of all characters is determined again; however, the invention is not limited to this mode and another embodiment may be adopted. For example, a character that has finished executing its action may execute its next action before all the other characters have acted.

In the embodiment, image control executing the blurring processing is adopted as the specific image control, but the invention is not limited to it. For example, fog processing, which masks the display screen color with separately set color data called fog data, may be performed as the specific image control. With the fog processing, a distant object is viewed as a dim object according to the fog color: if the fog color is white, a fog effect is produced; if the fog color is blue, an effect of dimming in the distance is produced. In the embodiment, as the limitation of specific image control, the specific image control is invalidated while the character action control is performed, but the invention is not limited to this mode. For example, while the character action control is performed, the specific image control may be limited by switching execution of the blurring processing, etc., on and off for each frame so as to lighten the added load of image control without completely invalidating the specific image control. Further, the specific image control need not always be limited over the entire time during which the character action control is performed; it may be limited during at least part of that time.
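The two alternatives just described can be sketched concretely: fog processing blends each pixel toward the fog color as distance grows, and limiting (rather than invalidating) the specific image control can mean running it only on every other frame during action control. The code below is an illustrative sketch only; the blend formula, the even-frame rule, and all names are assumptions, not the patent's disclosure.

```python
# Illustrative sketches of fog processing and per-frame limitation.

def apply_fog(pixel_rgb, fog_rgb, distance, max_distance):
    """Blend toward the fog color as distance grows: white fog gives a
    haze effect, blue fog an effect of dimming in the distance."""
    t = min(distance / max_distance, 1.0)      # 0.0 near, 1.0 far
    return tuple(
        round(p * (1.0 - t) + f * t)
        for p, f in zip(pixel_rgb, fog_rgb)
    )

def should_run_specific_control(frame_number, in_action_control):
    """Run the blurring every frame normally, but only on even frames
    while character action control is performed (a lightened limitation
    instead of full invalidation)."""
    if not in_action_control:
        return True
    return frame_number % 2 == 0
```

In this sketch a nearby object keeps its own color, a maximally distant object takes the fog color outright, and during action control the blurring cost is roughly halved rather than removed.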

Further, the invention can also be applied to a portable gaming machine or a desktop gaming machine integrally including an operation section that can be operated by a player, a display section for displaying an image and audio (sound), a storage section for storing a gaming program, and a control section for executing control processing in accordance with the gaming program.

Further, the invention can also be applied to a network game of the type wherein the above-described gaming program is stored in a server connected to a network such as the Internet 56 (see FIG. 16) and a player can play the game by connecting to the server from a personal computer, a mobile telephone, a personal digital assistant (PDA), etc.

A network game system in FIG. 16 will be discussed by way of example. In the network game system, mobile telephones 53A, 53B, and 53C serving as terminals for playing the above-described game are connected to a PDC network 51 capable of conducting packet communications, for example, through base stations 52A and 52B, and an information center 55 is accessed through the PDC network 51 in response to player operation and the game state. The information center 55 acquires various pieces of information through a network such as the Internet 56 from servers 57A and 57B, which store gaming programs as well as the data required for games, in response to requests from the mobile telephones 53A, 53B, and 53C, and transmits the information required for games to the mobile telephones 53A, 53B, and 53C. Like the server 58 in FIG. 16, a server storing the game data, etc., may be connected to the information center 55 by a private or leased communication line 60, not via a network such as the Internet 56.

To play a game, the player downloads a gaming program in advance from the server 57A or 57B into the mobile telephone 53A, 53B, or 53C and executes the gaming program on the mobile telephone main unit. In addition, various systems are possible, such as a system wherein the mobile telephone 53A, 53B, or 53C is assigned a browser-like role: the gaming program is executed on the server 57A or 57B in accordance with instructions from the mobile telephone, and the player views the game on the mobile telephone. The players may share the network game system, or may be able to fight battles with each other by connecting their mobile telephones through the PDC network 51.

In the embodiment, the judgment ring 100 containing the reference area and the rotation bar 101 as a varying area are provided, but the invention is not limited to this mode and another embodiment may be adopted. For example, the judgment ring may be the varying area and an area like the rotation bar may be the reference area. That is, either the reference area or the varying area is formed to contain a plurality of effective areas relatively advantageous to the player and a non-effective area relatively disadvantageous to the player. Such a judgment ring may also be left unused.
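The judgment-ring idea described above amounts to testing the stopping position of the varying area (for example, the rotation bar's angle) against effective and non-effective ranges of the reference area. The following minimal sketch assumes angle-based ranges purely for illustration; the actual geometry and thresholds of the judgment ring 100 are not specified here.

```python
# Hypothetical sketch of a judgment-ring hit test; ranges are made up.

EFFECTIVE_RANGES = [(30.0, 60.0), (200.0, 230.0)]   # advantageous to player

def judge(bar_angle_degrees):
    """Return True if the rotation bar stopped inside an effective area
    of the reference area, False if it stopped in the non-effective area."""
    a = bar_angle_degrees % 360.0               # normalize full rotations
    return any(lo <= a <= hi for lo, hi in EFFECTIVE_RANGES)
```

Swapping which element varies, as the paragraph suggests, would leave this test unchanged: only the source of `bar_angle_degrees` (bar versus ring) differs.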

Although the embodiment of the invention has been described, it is to be understood that the embodiment is illustrative and not restrictive and that the invention is not limited to the specific embodiment thereof. That is, the invention is mainly characterized by a gaming program for causing a computer including an input device that can be operated by a player to function as a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters; a character action order determination module for determining the action order of the plurality of characters; a character action mode selection module for selecting a character action mode based on an operation signal from the input device and the plurality of pieces of character data; a character action controller for performing character action control based on the character action mode selected by the character action mode selection module; a special character action controller for executing selection of the character action mode by the character action mode selection module and the character action control by the character action controller in accordance with the action order of the plurality of characters determined by the character action order determination module; a character action display controller for performing display control of selection of the character action mode executed by the special character action controller and the character action control; a specific image control execution module for executing specific image control for selection of the character action mode and the character action control subjected to display control by the character action display controller; and a specific image control limitation module for limiting the specific image control of the specific image control execution module while the special character action controller performs the character action control. 
However, the specific configurations of the input device, the character data storage module, the character action order determination module, the character action mode selection module, the character action controller, the special character action controller, the character action display controller, the specific image control execution module, the specific image control limitation module, the specific image control invalidation module etc., can be changed in design as required.

The advantages described in the embodiment merely enumerate the most favorable advantages produced by the invention; the advantages of the invention are not limited to those described in the embodiment.

Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.

Claims

1. A gaming program product for use in a computer having an input device that can be operated by a player, comprising:

a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters;
a character action order determination module for determining the action order of the plurality of characters;
a character action mode selection module for selecting a character action mode based on an operation signal from the input device and the plurality of pieces of character data;
a character action controller for performing character action control based on the character action mode selected by said character action mode selection module;
a special character action controller for executing selection of the character action mode by said character action mode selection module and the character action control by said character action controller in accordance with the action order of the plurality of characters determined by said character action order determination module;
a character action display controller for performing display control of selection of the character action mode and the character action control executed by said special character action controller;
a specific image control execution module for executing specific image control for selection of the character action mode and the character action control subjected to display control by said character action display controller; and
a specific image control limitation module for limiting the specific image control of said specific image control execution module while said special character action controller performs the character action control.

2. The gaming program product as claimed in claim 1, further comprising:

a specific image control invalidation module for invalidating the specific image control as limitation of the specific image control while said character action controller performs the character action control.

3. A gaming machine comprising:

an input device that can be operated by a player;
a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters;
a character action order determination module for determining the action order of the plurality of characters;
a character action mode selection module for selecting a character action mode based on an operation signal from said input device and the plurality of pieces of character data;
a character action controller for performing character action control based on the character action mode selected by said character action mode selection module;
a special character action controller for executing selection of the character action mode by said character action mode selection module and the character action control by said character action controller in accordance with the action order of the plurality of characters determined by said character action order determination module;
a character action display controller for performing display control of selection of the character action mode executed by said special character action controller and the character action control;
a specific image control execution module for executing specific image control for selection of the character action mode and the character action control subjected to display control by said character action display controller; and
a specific image control limitation module for limiting the specific image control of said specific image control execution module while said special character action controller performs the character action control.
Patent History
Publication number: 20060223633
Type: Application
Filed: Mar 30, 2006
Publication Date: Oct 5, 2006
Applicant: ARUZE CORP. (Tokyo)
Inventor: Izumi Hamamoto (Tokyo)
Application Number: 11/392,618
Classifications
Current U.S. Class: 463/30.000
International Classification: A63F 9/24 (20060101);