Game process control method, information storage medium, and game device

- NAMCO BANDAI GAMES INC.

A chord type is associated in advance with an attribute of a character. Whether or not a chord is formed is determined from input sound in units of detection time t at specific time intervals, first to third attributes are determined as the attributes of creation candidate characters from the attributes of the characters based on the formation count in chord units, and the creation probability of each of the first to third attributes is determined based on the formation count of the chord of the corresponding type. A character with one of the first to third attributes determined according to the creation probability is created.

Description

Japanese Patent Application No. 2006-234164 filed on Aug. 30, 2006, is hereby incorporated by reference in its entirety.

BACKGROUND OF THE INVENTION

The present invention relates to a game device which causes a new character based on input sound to appear and the like.

A game device has been known which includes a sound input means such as a microphone and utilizes sound input from the sound input means for a game process. For example, technology has been known which determines the parameter of a character caused to appear based on the input sound. According to this technology, the input sound (analog signal) is converted into a digital signal, and the digital signal is converted into a numerical value in frequency band units to create sequence data. Whether or not an arbitrary value in the sequence data coincides with predetermined reference data is determined, and the parameter of the character is determined based on the determination results (e.g. Japanese Patent No. 2860097).

According to the technology disclosed in Japanese Patent No. 2860097, the parameter of the character caused to appear is determined based on the input sound. However, the parameter generated is irrelevant to the meaning of the input sound. Specifically, since the player cannot predict the parameter generated from the input sound, the parameter of the character is effectively determined at random. Therefore, it may be troublesome for the player to input sound for generating a new character, whereby the player may lose interest in the game.

SUMMARY

According to one aspect of the invention, there is provided a game process control method which causes a computer including a sound input section to execute a game in which a game character appears, the method comprising:

detecting a note set included in input sound input to the sound input section, the note set being one of different types of note sets formed by combining predetermined notes;

selecting a game character caused to appear based on the detection result;

causing the selected game character to appear; and

controlling display of each game character including the new game character.

BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING

FIG. 1 shows an example of the outward appearance of a portable game device.

FIG. 2A shows an example of an egg selection screen, and FIG. 2B shows an example of a character list screen.

FIG. 3 shows an example of a character creation production screen.

FIG. 4 shows an example of a character creation screen.

FIG. 5 shows a functional configuration example of a portable game device.

FIG. 6 shows a data configuration example of possessed item data.

FIG. 7 shows a data configuration example of a character setting table.

FIG. 8 shows a data configuration example of detected note data.

FIG. 9 shows a data configuration example of level condition data.

FIG. 10 is a view illustrative of causing silence to occur at a detection time t at which five or more consecutive notes are detected.

FIG. 11 shows a data configuration example of note detection total count data.

FIG. 12 shows a data configuration example of start note data.

FIG. 13 is a view illustrative of generation of note-name-unit detection data from detected note data.

FIG. 14 is a view illustrative of causing silence to occur at a detection time at which seven or more consecutive notes are detected.

FIG. 15 is a view illustrative of determination of formation of a chord in note-name-unit detection data.

FIG. 16 shows a data configuration example of chord formation count data.

FIG. 17 shows a data configuration example of a chord classification table.

FIG. 18 shows a data configuration example of score data.

FIG. 19 shows a data configuration example of a group/attribute correspondence table.

FIG. 20 shows a data configuration example of determined creation probability data.

FIG. 21 shows a data configuration example of a score setting table.

FIG. 22 shows a data configuration example of a note classification table.

FIG. 23 shows a data configuration example of note classification data.

FIG. 24 shows a data configuration example of a creation probability setting table.

FIG. 25 shows a data configuration example of an attribute/color correspondence table.

FIG. 26 shows a data configuration example of particle data.

FIG. 27 shows an example of generation percentage control data.

FIG. 28 shows an example of total generation count control data.

FIG. 29 is a flowchart of a game process.

FIG. 30 is a flowchart of a character creation process executed during the game process.

FIG. 31 is a flowchart of a creation probability determination process executed during the character creation process.

FIG. 32 is a flowchart of a filtering process executed during the creation probability determination process.

FIG. 33 is a flowchart of a chord formation determination process executed during the creation probability determination process.

FIG. 34 is a flowchart of a group count shortage process executed during the creation probability determination process.

FIG. 35 is a flowchart of a character creation production process executed during the character creation process.

DETAILED DESCRIPTION OF THE EMBODIMENT

The invention has been achieved in view of the above-described situation, and may allow a player to estimate the relationship between a game character caused to appear and input sound to a certain extent.

According to one embodiment of the invention, there is provided a game process control method which causes a computer including a sound input section to execute a game in which a game character appears, the method comprising:

detecting a note set included in input sound input to the sound input section, the note set being one of different types of note sets formed by combining predetermined notes;

selecting a game character caused to appear based on the detection result;

causing the selected game character to appear; and

controlling display of each game character including the new game character.

According to another embodiment of the invention, there is provided a game device comprising:

a sound input section;

a note set detection section which detects a note set included in input sound input to the sound input section, the note set being one of different types of note sets formed by combining predetermined notes;

a character selection section which selects a game character caused to appear based on the detection result of the note set detection section; and

a character appearance control section which causes the game character selected by the character selection section to appear.

According to the above embodiment, the note set included in the input sound is detected which is one of different types of note sets formed by combining predetermined notes, and the game character selected based on the detection result is caused to appear in the game. Specifically, the game character caused to appear in the game is determined based on the note set included in the input sound. For example, when detecting a chord such as a major chord or a minor chord as the note set, the game character caused to appear is determined based on the chord included in the input sound. Since the chord is an important element which determines the tone of the input sound, the player can enjoy estimating the game character caused to appear from the tone of the input sound, whereby the player's interest in the game character can be increased.
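For example, whether a set of detected note names forms a major or minor chord can be checked by testing the triad interval pattern with each detected note taken as the root. The embodiment does not prescribe a concrete algorithm, so the following sketch, which represents notes as pitch classes 0-11, is only an illustration:

```python
# Illustrative sketch (not prescribed by the embodiment): classify a set of
# detected pitch classes (0-11) as a major or minor chord by testing the triad
# interval pattern with each detected note taken as the root.
MAJOR = {0, 4, 7}   # root, major third, perfect fifth
MINOR = {0, 3, 7}   # root, minor third, perfect fifth

def classify_chord(pitch_classes):
    pcs = set(pitch_classes)
    for root in pcs:
        relative = {(pc - root) % 12 for pc in pcs}
        if MAJOR <= relative:
            return ("major", root)
        if MINOR <= relative:
            return ("minor", root)
    return None  # no recognized chord type
```

For instance, `classify_chord({0, 4, 7})` recognizes a C major triad, while `classify_chord({9, 0, 4})` recognizes an A minor triad.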

In the game process control method, the note sets may be associated in advance with the game characters;

the method may further comprise determining selection candidate characters including at least the game character corresponding to the detected note set; and

the game character caused to appear may be selected from the determined selection candidate characters.

According to this feature, the note sets are associated in advance with the game characters, and the game character caused to appear is selected from the selection candidate characters including at least the game character corresponding to the detected note set. Specifically, the game character selected from the selection candidate characters including the game character corresponding to the detected note set is caused to appear as a new game character. Therefore, since the game character corresponding to the detected note set is caused to appear depending on the probability, the game character caused to appear differs even if the sound is input, whereby the player can enjoy the game.

In the game process control method, the note set included in the input sound may be detected at given time intervals; and

the selection candidate characters may be determined based on a detection total count of each of the note sets detected.

According to this feature, the note set included in the input sound is detected in units of time at given time intervals, and the selection candidate characters are determined based on the detection total count of each note set. Therefore, the game character determined to correspond to the tone of the input sound appears with a higher probability as a new game character by determining the game character corresponding to the note set with a detection total count equal to or greater than a specific number to be the selection candidate character, for example.

In the game process control method, the selection candidate character corresponding to the note set with a larger detection total count may be selected as the game character caused to appear with a higher probability.

According to this feature, the game character caused to appear is selected so that the selection candidate character corresponding to the note set with a larger detection total count is selected with a higher probability. Specifically, since the game character corresponding to the note set with the largest detection total count appears as a new game character with the highest probability, the game character determined to correspond to the tone of the input sound appears as a new game character with the highest probability.
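A selection in which the candidate corresponding to a larger detection total count wins with a higher probability can be sketched as a weighted random draw. The proportional weighting below is an assumption; the embodiment only requires that a larger count yield a higher probability:

```python
import random

# Illustrative sketch: select the game character to create so that a candidate
# whose note set has a larger detection total count is chosen with a higher
# probability. Proportional weighting is an assumption of this sketch.
def select_candidate(detection_counts, rng=random):
    # detection_counts: {candidate character: detection total count of its note set}
    candidates = list(detection_counts)
    weights = [detection_counts[c] for c in candidates]
    return rng.choices(candidates, weights=weights, k=1)[0]
```

With this weighting, a candidate whose note set was never detected (count 0) is never selected, and the candidate with the largest count is selected most often.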

In the game process control method, the game character determined to be the selection candidate character may be associated in advance with a set note content, the set note content being a percentage of a predetermined note in the input sound;

the method may further comprise determining the set note content of the input sound input to the sound input section; and

the game character corresponding to the determined set note content may be determined to be included in the selection candidate characters.

According to this feature, the game character corresponding to the set note content of the input sound is further included in the selection candidate characters. Specifically, a special game character exists which may appear according to the percentage of a specific note in the input sound. This makes it possible to further increase the player's interest in the game character caused to appear.

The game process control method may further comprise:

detecting whether or not a set note which is a note set in advance is included in the input sound input to the sound input section at given time intervals;

wherein a special character may be determined to be included in the selection candidate characters when a detection total count of the detected set note has reached a specific number.

According to this feature, whether or not the set note is included in the input sound is detected in units of time at given time intervals, and the special character is included in the selection candidate characters when the detection total count has reached a specific number. Specifically, a special character exists which may appear only when the set notes are included in the input sound in a number equal to or greater than a specific number.

In the game process control method, the game character may be associated in advance with each of a plurality of time conditions obtained by dividing a period in which the input sound may be input by date and/or time; and

the game character corresponding to the time condition satisfied by an input time of the input sound from the sound input section may be selected as the game character caused to appear.

According to this feature, the game character is caused to appear which is associated in advance with one of the time conditions obtained by dividing a period in which the input sound may be input by date and/or time and satisfied by the input time of the input sound. Specifically, the game character caused to appear differs depending on the input time, even if the same sound is input, whereby the game playability can be increased.
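Dividing the input period by time of day and associating a character with each division can be sketched as follows; the concrete divisions and character names are hypothetical, since the embodiment leaves the time conditions open:

```python
from datetime import datetime

# Illustrative sketch: associate characters with time conditions (here,
# hour-of-day buckets) and pick the one whose condition the input time
# satisfies. The divisions and character names are hypothetical.
TIME_CONDITIONS = [
    (range(6, 12), "morning_character"),
    (range(12, 18), "afternoon_character"),
    (range(18, 24), "evening_character"),
    (range(0, 6), "night_character"),
]

def character_for_time(input_time: datetime) -> str:
    for hours, character in TIME_CONDITIONS:
        if input_time.hour in hours:
            return character
    raise ValueError("hour out of range")
```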

The game process control method may further comprise:

detecting an input timing of each note included in the input sound input to the sound input section;

wherein the note set may be detected which includes the notes input at the same input timing.

According to this feature, the input timing of each note included in the input sound is detected, and the note set including the notes input at the same input timing is detected.

In the game process control method, the note set included in the input sound input to the sound input section may be detected in note name units.

According to this feature, the note set included in the input sound is detected in note name units (i.e., notes having the same name are regarded as the same note). Specifically, since the note set is detected in note name units irrespective of the octave, the note set is more easily detected.
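Folding detected notes onto the 12 note names, irrespective of octave, amounts to taking the pitch class. A minimal sketch follows, using MIDI note numbers as an assumed input representation (the embodiment does not mention MIDI):

```python
# Illustrative sketch: fold a detected note onto the 12 note names, ignoring
# the octave, so that e.g. "la" in any octave counts as the same note name.
NOTE_NAMES = ["do", "do#", "re", "re#", "mi", "fa",
              "fa#", "sol", "sol#", "la", "la#", "ti"]

def to_note_name(midi_note: int) -> str:
    # MIDI note 60 is middle "do" (C4); the pitch class is the number mod 12.
    return NOTE_NAMES[midi_note % 12]
```

For example, `to_note_name(60)` and `to_note_name(72)` both yield "do", one octave apart.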

The game process control method may further comprise:

subjecting the input sound to a filtering process by detecting only the notes included in the input sound input to the sound input section and having a specific intensity;

wherein the note set may be detected using the input sound subjected to the filtering process as the input sound input to the sound input section.

According to this feature, the note set is detected based on only the notes included in the input sound and having a specific intensity. Specifically, a weak note which is included in the input sound and does not have a specific intensity is not detected as a note.

In the game process control method, the filtering process may include causing a portion of the input sound input to the sound input section in which a specific number or more of notes are input at the same time to be silent.

According to this feature, a portion of the input sound in which a specific number or more of notes are input at the same time is caused to be silent. For example, when inputting a number of notes at the same time by simultaneously pressing piano keys over one octave, a note set is almost necessarily detected. According to the invention, since a portion of the input sound in which a specific number or more of notes are input at the same time is caused to be silent, an unfair input operation of inputting a number of notes at the same time can be prevented.

According to a further embodiment of the invention, there is provided a computer-readable information recording medium storing a program for causing a computer to execute the game process control method.

The term “information storage medium” used herein refers to a storage medium, such as a hard disk, an MO, a CD-ROM, a DVD, a memory card, or an IC memory, from which the stored information can be read by a computer. According to the invention, the note set included in the input sound is detected which is one of different types of note sets formed by combining predetermined notes, and the game character selected based on the detection result is caused to appear in the game. Specifically, the game character caused to appear in the game is determined based on the note set included in the input sound. For example, when detecting a chord as the note set, the game character caused to appear is determined based on the chord included in the input sound. Since the chord is an important element which determines the tone of the input sound, the player can enjoy estimating the game character caused to appear from the tone of the input sound, whereby the player's interest in the game character can be increased.

Preferred embodiments of the invention are described below with reference to the drawings. The following description illustrates an example of causing a portable game device to execute a breeding game. Note that the embodiment to which the invention can be applied is not limited thereto.

<Outward Appearance of Game Device>

FIG. 1 is a view showing an example of the outward appearance of a portable game device 1 according to this embodiment. As shown in FIG. 1, the portable game device 1 is a folding-type game device in which an upper housing 10A and a lower housing 10B are connected through a hinge 11 so that the portable game device 1 can be opened and shut. FIG. 1 illustrates the portable game device 1 in an open state (during use).

The inner sides of the housings 10A and 10B are provided with two displays 12A and 12B disposed on either side of the hinge 11 during use, a speaker 13, a microphone 14, various operation buttons 15, and the like. A touch panel is integrally formed in the display 12B over the entire display region. The touch panel detects a touch position in units of dots forming the display 12B according to a detection principle such as a pressure-sensitive method, an optical method, an electrostatic method, or an electromagnetic induction method, for example. The player can input various operations by utilizing a stylus pen 30 provided as an accessory, or by touching the display 12B.

Game information including a program and data necessary for the portable game device 1 to execute a game process and the like is stored in a cartridge 20 removable from a slot 16 formed in the side surface of the housing 10B. The portable game device 1 may also connect to a wireless communication channel through a built-in wireless communication device 18 and acquire the game information from an external instrument.

The portable game device 1 includes a control device 17 including a CPU and an IC memory, the wireless communication device 18 for performing wireless communication conforming to a wireless LAN standard, a reading device for the cartridge 20, and the like. The CPU provided in the control device 17 executes various game processes based on a program and data read from the IC memory and the cartridge 20, a touch position detected by the touch panel, a sound signal input from the microphone 14, an operation signal input from the operation buttons 15, data received by the wireless communication device 18, and the like, and generates an image signal of a game screen and a sound signal of game sound. The CPU outputs the generated image signal to the displays 12A and 12B to cause the displays 12A and 12B to display a game screen, and outputs the generated sound signal to the speaker 13 to cause the speaker 13 to output game sound. The player enjoys the breeding game by operating the operation buttons 15 or touching the display 12B while watching the game screens displayed on the displays 12A and 12B.

<Outline of Game>

In the breeding game according to this embodiment, the player acquires an egg as one type of item during the game. The player causes a game character (hereinafter simply called “character”) to be created (appear) by playing a melody for the egg, and rears the created character. The player can play a melody by inputting melody sound (music) through the microphone 14.

Different types of eggs are provided which differ in outward appearance. Different types of characters are set for each type of egg as characters created from the egg. One of the characters corresponding to the melody played by the player is created. Specifically, the character created from a single egg differs depending on the melody played by the player. The character to be created differs depending on the type of egg, even if the player plays the same melody.

An attribute is set for each character. The attribute is a parameter by which each character is classified. The attribute affects the rearing of the character, the game process, and the like. In this example, the attribute is classified as “fire”, “wind”, “earth”, “water”, “light”, and “darkness” (six types in total).

<Game Screen>

FIGS. 2A and 2B are views showing an example of a game screen when creating a character. FIG. 2A shows a game screen displayed on the display 12B, and FIG. 2B shows a game screen displayed on the display 12A.

As shown in FIG. 2A, an egg selection screen for selecting an egg from which a character is created is displayed on the display 12B. Different types of eggs OB provided in advance are listed on the egg selection screen together with the number of eggs currently possessed by the player. One of the displayed eggs OB is in a selected state. In FIG. 2A, three of all types of eggs OB are displayed. The remaining eggs OB are displayed by scrolling the screen. Among the three eggs OB displayed, the egg OB at the center of the screen is in a selected state and enclosed by a frame M indicating the selected state.

As shown in FIG. 2B, a character list screen which is a list of the characters set for the egg is displayed on the display 12A. The characters set for the egg which is in a selected state on the egg selection screen (i.e., characters which may be created from the egg) are listed in attribute units. The name of the character which has been created and is possessed by the player is displayed, and the name of the character which is not possessed by the player is not displayed (indicated by “???” in FIG. 2B). This allows the player to easily determine whether or not the player possesses each character.

The player selects the desired egg on the egg selection screen. The player inputs melody sound through the microphone 14 by producing a sound or playing a musical instrument according to a countdown instruction displayed on the display 12A, for example. The portable game device 1 then performs a specific analysis process for the input sound, and causes the character corresponding to the processing results to be created from the selected egg. Note that the character is not necessarily created depending on the input sound.

When the player succeeds in creating the character, a character creation production screen is displayed which produces creation of the character for a specific period of time prior to creation of the character. FIG. 3 is a view showing an example of the character creation production screen. As shown in FIG. 3, the selected egg OB and a number of spherical particles P are displayed on the character creation production screen.

The particle P indicates the character to be created. Different types (three types in FIG. 3) of particles P (P1 to P3) are displayed in combination. Each particle P has an identical shape, but differs in color depending on the type. Each particle P is displayed using animation techniques so that the particle P generated from the egg OB moves around the egg OB and is diffused. Each particle P disappears after a specific period of time (about a few seconds) has expired.

The numbers of respective particles P displayed change with the passage of time. Specifically, the numbers of respective particles P displayed are almost the same when the display of the character creation production screen starts, and gradually change (increase/decrease) with the passage of time. One type of particles P (i.e., particles P of a color corresponding to the attribute of the character to be created) are mainly displayed just before the display of the character creation production screen ends, and the numbers of the remaining two types of particles P displayed are reduced to a large extent. A change in the number of particles P displayed is controlled by changing the number of particles P generated.

When a specific period of time has expired after the display of the character creation production screen has started and the character creation production screen has disappeared, a character creation screen of the character to be created is displayed. FIG. 4 is a view showing an example of the character creation screen. As shown in FIG. 4, a state in which a new character CH is created from the selected egg OB is displayed on the character creation screen. The created character CH is added to the possessed characters, and the number of eggs of the selected type is decremented (reduced) by one.

<Functional Configuration>

FIG. 5 is a block diagram showing a functional configuration of the portable game device 1. In FIG. 5, the portable game device 1 is functionally configured to include an operation input section 100, a sound input section 200, a processing section 300, an image display section 400, a sound output section 500, a communication section 600, and a storage section 700.

The operation input section 100 receives an operation instruction input from the player, and outputs an operation signal corresponding to the operation to the processing section 300. The function of the operation input section 100 is implemented by a button switch, a lever, a dial, a mouse, a keyboard, various sensors, and the like. In FIG. 1, the operation button 15 and the touch panel integrally formed in the display 12B correspond to the operation input section 100.

The sound input section 200 collects sound such as voice input by the player, and outputs a sound signal corresponding to the collected sound to the processing section 300. The function of the sound input section 200 is implemented by a microphone or the like. In FIG. 1, the microphone 14 corresponds to the sound input section 200.

The processing section 300 controls the entire portable game device 1 and performs various calculations such as proceeding with the game and generating an image. The function of the processing section 300 is implemented by a calculation device such as a CPU (CISC or RISC) or an ASIC (e.g. gate array) and its control program, for example. In FIG. 1, the CPU provided in the control device 17 corresponds to the processing section 300.

The processing section 300 includes a game calculation section 310 which mainly performs game calculations, an image generation section 330 which generates a game image based on various types of data calculated by the game calculation section 310, and a sound generation section 340 which generates game sound such as effect sound and background music (BGM).

The game calculation section 310 performs various game processes based on the operation signal input from the operation input section 100, the sound signal input from the sound input section 200, a program and data read from the storage section 700, and the like. In this embodiment, the game calculation section 310 includes a character creation control section 320, and realizes the breeding game by performing a game process based on a game program 710.

The character creation control section 320 includes a creation probability determination section 321 and a creation production section 322, and performs a process relating to the creation of a character. Specifically, the character creation control section 320 refers to possessed item data 731, and causes the image display section 400 to display the egg selection screen in which different types of eggs provided in advance are displayed together with the number of eggs currently possessed by the player, as shown in FIG. 2A, for example.

The possessed item data 731 is data relating to the currently possessed items. FIG. 6 shows an example of the data configuration of the possessed item data 731. As shown in FIG. 6, the possessed item data 731 includes possessed egg data 731a and possessed character data 731d. The possessed egg data 731a is data relating to the possession of eggs, in which an egg type 731b and a possession count 731c are stored while being associated with each other. The possessed character data 731d is data relating to the possession of characters, in which a character type 731f and a possession count 731g are stored while being associated with each other in units of attributes 731e of characters.

The character creation control section 320 refers to a character setting table 732, and causes the image display section 400 to display the character list screen in which a list of the characters corresponding to the egg selected on the egg selection screen is displayed, as shown in FIG. 2B, for example.

The character setting table 732 is a data table relating to the characters set for each egg. FIG. 7 shows an example of the data configuration of the character setting table 732. As shown in FIG. 7, the character setting table 732 is provided for each egg type. The character setting table 732 stores a corresponding egg type 732a, and stores a plurality of character types 732c associated with each attribute 732b.

When the egg has been selected on the egg selection screen, the character creation control section 320 performs a specific countdown display and the like, and starts to record the input sound. Specifically, the character creation control section 320 converts the sound input from the sound input section 200 into a digital signal, and stores the digital signal in the storage section 700 as input sound data 721. After completion of recording, the creation probability determination section 321 performs a specific analysis process for the input sound data 721, and determines the creation probability of each of three candidate attributes as the attributes of creation candidate characters based on the processing results.

Specifically, the creation probability determination section 321 detects notes within a specific octave range (e.g. three octaves) from the input sound data 721 at specific time intervals (e.g. intervals of ⅛ second). One octave includes the 12 notes "do", "do#", "re", "re#", "mi", "fa", "fa#", "sol", "sol#", "la", "la#", and "ti"; these 12 notes are also called note names.
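One conceivable way to decide whether a given note is present in a ⅛-second frame is to measure the signal energy at that note's frequency, e.g. with the Goertzel algorithm. The embodiment does not specify the detection method, so the following is purely an assumed sketch:

```python
import math

# Assumed sketch (the embodiment does not specify the detection method):
# estimate the energy of one target note in a frame of samples with the
# Goertzel algorithm, then mark the note "detected" when the energy exceeds
# a threshold.
def goertzel_power(samples, sample_rate, freq):
    n = len(samples)
    k = round(n * freq / sample_rate)          # nearest DFT bin to the note
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def detect_notes(frame, sample_rate, note_freqs, threshold):
    # note_freqs: {note name: frequency in Hz}; returns the set of detected names
    return {name for name, f in note_freqs.items()
            if goertzel_power(frame, sample_rate, f) > threshold}
```

At 8000 samples per second, a 1000-sample frame corresponds to one ⅛-second detection time t.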

The note detection results are stored as detected note data 722. FIG. 8 shows an example of the detected note data 722. As shown in FIG. 8, the presence or absence of detection of each note 722b is stored as the detected note data 722 in units of detection time 722a. In FIG. 8, “O” indicates that the note is detected, and “x” indicates that the note is not detected.

The creation probability determination section 321 determines the maximum level (sound intensity) of the detected notes. The creation probability determination section 321 excludes the note which is included in the detected note data 722 and does not satisfy a specific level condition from the detected notes.

The level condition is stored as level condition data 733. FIG. 9 shows an example of the data configuration of the level condition data 733. As shown in FIG. 9, the level condition of the note with respect to the maximum level of the detected note is stored as the level condition data 733.
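The exclusion of notes that do not satisfy the level condition relative to the maximum level might look like the following sketch, where the 10% cutoff stands in for whatever the level condition data 733 actually specifies:

```python
# Illustrative sketch: drop weak notes. The condition is taken relative to the
# maximum level among the detected notes; the 10% cutoff is an assumed example,
# standing in for the condition stored as level condition data 733.
def apply_level_condition(note_levels, ratio=0.1):
    # note_levels: {note name: detected level}; keep notes within `ratio` of max
    if not note_levels:
        return {}
    max_level = max(note_levels.values())
    return {n: lv for n, lv in note_levels.items() if lv >= max_level * ratio}
```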

The creation probability determination section 321 examines the detected notes in the detected note data 722 in units of the detection time t. When five or more adjacent notes have been detected at a detection time t, the creation probability determination section 321 excludes all notes at the time t from the detected notes so that the time t becomes silent. In FIG. 10(1), six adjacent notes from "mi" to "la" are detected at the time tn, for example. The creation probability determination section 321 excludes all notes at the time tn, including these six notes, from the detected notes so that silence occurs at the time tn, as shown in FIG. 10(2).
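The silencing of a detection time at which five or more adjacent notes appear, as in FIG. 10, can be sketched as a run-length check over the chromatic note order. For simplicity the sketch works within one octave of note names and does not wrap from "ti" back to "do":

```python
# Illustrative sketch: if five or more chromatically adjacent notes are
# detected at one detection time t, treat the whole time t as silent.
# Simplification: adjacency is checked within one octave, without wrap-around.
NOTES = ["do", "do#", "re", "re#", "mi", "fa",
         "fa#", "sol", "sol#", "la", "la#", "ti"]

def silence_if_cluster(detected, max_adjacent=4):
    # detected: set of note names detected at one detection time t
    run = 0
    for name in NOTES:
        run = run + 1 if name in detected else 0
        if run > max_adjacent:
            return set()        # five or more adjacent notes -> silence
    return detected
```

A normal triad passes through unchanged, while a cluster such as "mi" to "la" (six adjacent notes) is silenced.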

The creation probability determination section 321 calculates the total count (detection total count) of each note detected based on the detected note data 722. Notes having the same note name are counted as the same note, irrespective of the octave. The creation probability determination section 321 sums up the detection total count of each note to calculate the detection total count of all the notes. The creation probability determination section 321 also sums up the detection total count of each note provided with a sharp “#” (black-key note) to calculate the detection total count of all the black-key notes. The black-key notes are “do#”, “re#”, “fa#”, “sol#”, and “la#” (five notes in total).
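A minimal sketch of this counting step, under the same set-of-semitone-indices representation (the names here are illustrative):

```python
NOTE_NAMES = ["do", "do#", "re", "re#", "mi", "fa",
              "fa#", "sol", "sol#", "la", "la#", "ti"]
BLACK_KEYS = {"do#", "re#", "fa#", "sol#", "la#"}

def count_detections(detected):
    """Count detections per note name (octaves folded together), plus the
    grand total and the black-key total.

    `detected` is a list of sets of semitone indices over the whole detection
    range; index % 12 gives the note name.
    """
    per_name = {name: 0 for name in NOTE_NAMES}
    for notes in detected:
        for idx in notes:
            per_name[NOTE_NAMES[idx % 12]] += 1
    total = sum(per_name.values())
    black = sum(c for n, c in per_name.items() if n in BLACK_KEYS)
    return per_name, total, black
```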

The calculated detection total count is stored as note detection total count data 741. FIG. 11 shows an example of the data configuration of the note detection total count data 741. As shown in FIG. 11, a note 741a and a detection total count 741b are stored as the note detection total count data 741 while being associated with each other. A detection total count 741c of all the notes and a detection total count 741d of all the black-key notes are also stored as the note detection total count data 741.

The creation probability determination section 321 determines the start (input timing) of the detected note based on the detected note data 722. Specifically, when each note in the detected note data 722 satisfies one of the following conditions A1 to A3, the creation probability determination section 321 determines that note to be the start.

Condition A1: the note has not been detected at the preceding detection time t−1 and is not detected at the subsequent detection time t+1.

Condition A2: the note has not been detected at the preceding detection time t−1 but is detected at the subsequent detection time t+1, and the level of the note detected at the detection time t+1 is higher than the level of the note detected at the present detection time t.
Condition A3: the note has not been detected at the preceding detection time t−1 but is detected at the subsequent detection time t+1, and the level of the note detected at the subsequent detection time t+1 is lower than the level of the note detected at the present detection time t.
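Conditions A1 to A3 can be sketched as follows, representing each detection time slot as a dict mapping a detected note to its level (note that, as literally stated, a note followed by an equal level at t+1 matches none of the three conditions):

```python
def find_start_notes(levels):
    """Mark note starts per conditions A1 to A3.

    `levels` is a list of dicts {note: level} per detection time slot; a note
    absent from a dict is not detected at that slot.  Returns a parallel list
    of sets of start notes.
    """
    starts = []
    for t, cur in enumerate(levels):
        prev = levels[t - 1] if t > 0 else {}
        nxt = levels[t + 1] if t + 1 < len(levels) else {}
        s = set()
        for note, lv in cur.items():
            if note in prev:
                continue  # already sounding at t-1: not a start
            if note not in nxt:
                s.add(note)      # condition A1
            elif nxt[note] > lv:
                s.add(note)      # condition A2
            elif nxt[note] < lv:
                s.add(note)      # condition A3
        starts.append(s)
    return starts
```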

The start determination results are stored as start note data 723. FIG. 12 shows an example of the start note data 723. As shown in FIG. 12, the start note data 723 indicates whether or not each note 723b is a start note in units of detection time 723a, in the same manner as the detected note data 722. In FIG. 12, “O” indicates that the note is a start note, and “x” indicates that the note is not a start note.

The creation probability determination section 321 combines the detected note data 722 of three octaves into one octave in note name units to obtain note-name-unit detection data 724. As shown in FIG. 13, the creation probability determination section 321 creates the note-name-unit detection data 724 of one octave by combining notes having the same note name, irrespective of the octave, in units of detection time t.

The creation probability determination section 321 calculates the number of detected note names in the note-name-unit detection data 724 in units of detection time t. When the calculated number of note names is seven or more, the creation probability determination section 321 excludes all notes at the time t from the detected notes (silent). In FIG. 14(1), eight notes “do”, “do#”, “mi”, “fa”, “fa#”, “la”, “la#”, and “ti” are detected at the time tn, for example. The creation probability determination section 321 excludes all notes at the time tn, including these eight notes, from the detected notes, as shown in FIG. 14(2).
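The octave folding and the seven-note-name filter can be sketched together, again using semitone indices (index mod 12 gives the note name):

```python
def fold_to_note_names(detected):
    """Fold detections over the whole octave range into one octave of 12
    note-name slots, then silence any time slot in which seven or more
    distinct note names remain (the second filter described above)."""
    folded = []
    for notes in detected:
        names = {idx % 12 for idx in notes}
        folded.append(set() if len(names) >= 7 else names)
    return folded
```

For example, a slot holding the same “do” in three octaves plus “mi” and “sol” folds to three note names and survives, while a slot with eight distinct note names is silenced, as in FIG. 14.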

Likewise, the creation probability determination section 321 combines the start note data 723 within one octave in note name units to generate note-name-unit start data 725.

The creation probability determination section 321 then determines whether or not a chord is formed in the note-name-unit detection data 724. The term “chord” used herein refers to a combination (note set) of predetermined notes, such as a major chord and a minor chord. The creation probability determination section 321 determines formation of different types of chords. As shown in FIG. 15, the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit detection data 724 in units of detection time t, and calculates the formation count in chord units. In this case, the creation probability determination section 321 determines formation of one chord at each detection time t. Likewise, the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit start data 725 in units of detection time t, and calculates the formation count in chord units. The creation probability determination section 321 sums up the formation counts of the note-name-unit detection data 724 and the note-name-unit start data 725 in chord units to calculate the total formation count.

The calculated formation count is stored as chord formation count data 742. FIG. 16 shows an example of the data configuration of the chord formation count data 742. As shown in FIG. 16, a chord determination order 742a, a chord 742b, and a formation count 742c are stored as the chord formation count data 742 while being associated with one another. The determination order 742a is set so that the order of a four-note chord made up of four notes is higher than the order of a three-note chord made up of three notes. The formation count 742c includes the formation count of each of the note-name-unit detection data 724 and the note-name-unit start data 725 and the total value.

The creation probability determination section 321 determines formation of each chord according to the determination order specified by the chord formation count data 742. Specifically, the creation probability determination section 321 determines whether or not each chord is formed in the note-name-unit detection data 724 according to the specified determination order in units of detection time t, and determines the chord of which the formation has been determined first to be a chord formed at the time t. Likewise, the creation probability determination section 321 determines whether or not each chord is formed in the note-name-unit start data 725 according to the specified determination order in units of detection time t, and determines the chord of which the formation has been determined first to be a chord formed at the time t. The creation probability determination section 321 sums up the formation counts of the note-name-unit detection data 724 and the note-name-unit start data 725 in chord units to obtain the total formation count.
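A sketch of the per-slot chord determination in table order; the chord table below is a placeholder with one four-note chord checked ahead of two triads, not the patent's actual table of FIG. 16:

```python
# Chords are pitch-class sets relative to do = 0.  Four-note chords come
# first so the richer match wins, mirroring the determination order 742a.
CHORDS = [
    ("C7", {0, 4, 7, 10}),   # four-note chord, checked first
    ("C",  {0, 4, 7}),       # major triad
    ("Cm", {0, 3, 7}),       # minor triad
]

def count_chord_formations(folded_slots):
    """For each time slot, the first chord (in table order) whose notes are
    all present counts one formation; at most one chord per slot."""
    counts = {name: 0 for name, _ in CHORDS}
    for names in folded_slots:
        for chord_name, pitch_classes in CHORDS:
            if pitch_classes <= names:
                counts[chord_name] += 1
                break
    return counts
```

A slot containing {0, 4, 7, 10} is counted as “C7” rather than “C” because the four-note chord is tested first, which is exactly why the determination order matters.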

The creation probability determination section 321 determines the sum of the formation count of each chord belonging to each of the chord classification groups (A) to (D) to be the score of each group according to a chord classification table 734.

The chord classification table 734 is a data table which defines the classification of chords. FIG. 17 shows an example of the data configuration of the chord classification table 734. As shown in FIG. 17, a group 734a and a chord 734b are stored in the chord classification table 734 while being associated with each other. The group 734a is classified into four groups (A) to (D).

The scores of the groups (A) to (D) are stored as score data 743. FIG. 18 shows an example of the data configuration of the score data 743. As shown in FIG. 18, a group 743a and a score 743b are stored as the score data 743 while being associated with each other. The group 743a is classified into six groups (A) to (F). In this example, the scores of only the groups (A) to (D) are determined, and the scores of the groups (E) and (F) are set at “0”.

The creation probability determination section 321 determines whether or not the scores of the groups (A) to (D) satisfy the following condition B.

Condition B: the score of at least one of the groups (A) to (D) is “5” or more, and the scores of three or more groups are “1” or more.
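Condition B can be sketched directly:

```python
def condition_b(scores):
    """Condition B over the four group scores (A) to (D): at least one score
    is 5 or more, and at least three scores are 1 or more."""
    return max(scores) >= 5 and sum(s >= 1 for s in scores) >= 3
```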

When the condition B is satisfied, the creation probability determination section 321 selects three groups with higher scores from the groups (A) to (D). The creation probability determination section 321 refers to a group/attribute correspondence table 735, and sets the attributes corresponding to the selected groups to be first to third attributes which are attributes of creation candidate characters in the order from the attribute with the highest score.

The group/attribute correspondence table 735 is a data table which defines the correspondence between the groups (A) to (D) and the attributes of the characters. FIG. 19 shows an example of the data configuration of the group/attribute correspondence table 735. As shown in FIG. 19, a group 735a and an attribute 735b of a character are stored in the group/attribute correspondence table 735 while being associated with each other.

In the example shown in FIG. 18, the first attribute is “earth” corresponding to the group (C) with the highest score, the second attribute is “water” corresponding to the group (D) with the second highest score, and the third attribute is “fire” corresponding to the group (A) with the third highest score.

The creation probability determination section 321 determines the creation probability of each of the first to third attributes based on the score of each of the selected groups. Specifically, the creation probability determination section 321 calculates the ratio of the score of each group to the sum of the scores of the three selected groups as the creation probability of the attribute corresponding to each group. In the example shown in FIG. 18, the score of the group (C) corresponding to the first attribute “earth” is “27”, the score of the group (D) corresponding to the second attribute “water” is “23”, and the score of the group (A) corresponding to the third attribute “fire” is “10”. The creation probability of the first attribute “earth” is 45% (=27/60(=27+23+10)), the creation probability of the second attribute “water” is 38% (=23/60), and the creation probability of the third attribute “fire” is 17% (=10/60).
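The ratio calculation can be sketched as follows; rounding to whole percentages matches the worked example of FIG. 18, although the patent does not state the rounding rule:

```python
def creation_probabilities(scores):
    """Given the scores of the three selected groups, highest first, return
    the creation probability of each attribute as a whole-number percentage
    of their sum (rounding is an assumption)."""
    total = sum(scores)
    return [round(100 * s / total) for s in scores]
```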

The determined creation probability of each attribute is stored as determined creation probability data 744. FIG. 20 shows an example of the data configuration of the determined creation probability data 744. As shown in FIG. 20, an attribute 744a and a creation probability 744b are stored as the determined creation probability data 744 while being associated with each other. The attribute 744a includes the first to third attributes. The creation probability 744b is set so that the total value is 100%.

Specifically, when the condition B is satisfied, characters with the attributes corresponding to the groups (A) to (D) (i.e., “fire”, “wind”, “earth”, and “water”) are set to be creation candidate characters (selected candidate characters), and the character to be created is selected from these characters.

When the scores of the groups (A) to (D) do not satisfy the condition B, the creation probability determination section 321 determines the first to third attributes and the creation probabilities as follows. When the scores of all of the groups (A) to (D) are “0” (i.e., no chord is formed), the creation probability determination section 321 determines that the creation of the character has failed, and does not determine the creation probability.

Specifically, when the condition B is not satisfied because the scores of the groups (A) to (D) are less than “5”, the creation probability determination section 321 determines the scores of the groups (E) and (F) by referring to a score setting table 736, based on the detection total count of all the black-key notes calculated from the detected note data 722.

FIG. 21 shows an example of the data configuration of the score setting table 736. As shown in FIG. 21, a ratio 736a of the detection total count of all the black-key notes to the detection total count of all the notes and a score 736b of each of the groups (E) and (F) are stored in the score setting table 736 while being associated with each other.

The creation probability determination section 321 refers to the note detection total count data 741, and calculates the ratio of the detection total count of all the black-key notes to the detection total count of all the notes (black-key note content). The creation probability determination section 321 sets the scores associated with the calculated detection total count ratio in the score setting table 736 to be the scores of the groups (E) and (F). The creation probability determination section 321 then selects three groups with higher scores from the groups (A) to (F), and sets the attributes corresponding to the three selected groups to be the first to third attributes in order from the attribute with the highest score, referring to the group/attribute correspondence table 735. The creation probability determination section 321 determines the creation probability of each of the first to third attributes based on the score of each selected group. Specifically, the creation probability determination section 321 calculates the ratio of the score of each group to the sum of the scores of the three selected groups as the creation probability of the attribute corresponding to each group.
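A sketch of the lookup for the groups (E) and (F); the threshold bands and scores below are placeholders, since the actual contents of the score setting table 736 (FIG. 21) are not reproduced here:

```python
def black_key_scores(black_total, all_total, table=None):
    """Look up the scores of groups (E) and (F) from the ratio of black-key
    detections to all detections.  The bands and scores in the default table
    are illustrative placeholders, not the patent's values."""
    table = table or [
        (0.00, (0, 0)),   # ratio >= 0%  -> (E)=0, (F)=0
        (0.25, (3, 1)),   # ratio >= 25% -> (E)=3, (F)=1
        (0.50, (1, 3)),   # ratio >= 50% -> (E)=1, (F)=3
    ]
    ratio = black_total / all_total if all_total else 0.0
    score_e, score_f = 0, 0
    for threshold, (e, f) in table:
        if ratio >= threshold:
            score_e, score_f = e, f   # last satisfied band wins
    return score_e, score_f
```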

When the condition B is not satisfied because the number of groups included in the groups (A) to (D) and having a score of “1” or more is less than three, the creation probability determination section 321 determines the first to third attributes and the creation probabilities based on the detection total count of each note calculated from the detected note data 722.

Specifically, when the number of groups included in the groups (A) to (D) and having a score of “1” or more is two, the creation probability determination section 321 determines one of the groups with a higher score to be a first group, and determines the attribute corresponding to the first group to be the first attribute referring to the group/attribute correspondence table 735. The creation probability determination section 321 determines the attribute corresponding to the other group to be the second attribute.

When the number of groups included in the groups (A) to (F) and having a score of “1” or more is one, the creation probability determination section 321 determines that group to be a first group, and determines the attribute corresponding to the first group to be the first attribute referring to the group/attribute correspondence table 735. The creation probability determination section 321 then calculates the sum of the detection total count of each note belonging to each of note classification groups (a) to (f) according to a note classification table 737.

The note classification table 737 is a data table which defines the classification of notes. FIG. 22 shows an example of the data configuration of the note classification table 737. As shown in FIG. 22, a group 737a, a note 737b, and an attribute 737c of a character are stored in the note classification table 737 while being associated with one another. The group 737a is classified into six groups (a) to (f). Each of the groups (a) to (f) is associated with two notes.

The calculated sum of the detection total count of each note of each group is stored as note classification data 745. FIG. 23 shows an example of the data configuration of the note classification data 745. As shown in FIG. 23, a group 745a and a detection total count 745b are stored as the note classification data 745 while being associated with each other.

The creation probability determination section 321 refers to the note detection total count data 741, and calculates the sum of the detection total count of each note belonging to each of the groups (a) to (f). The creation probability determination section 321 determines one of the groups (a) to (f) having the largest sum of the detection total counts, and determines the attribute corresponding to the determined group to be the second attribute. When the attribute corresponding to the group having the largest note count coincides with the first attribute, the creation probability determination section 321 determines the attribute corresponding to the group having the second largest note count to be the second attribute.
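The second-attribute selection with its fallback can be sketched as follows (group and attribute names are illustrative):

```python
def second_attribute(group_counts, group_attribute, first_attribute):
    """Pick the attribute of the note-classification group with the largest
    detection total; if it coincides with the first attribute, fall back to
    the group with the next largest total.

    `group_counts` maps a group name to its summed detection count, and
    `group_attribute` maps a group name to its character attribute.
    """
    ranked = sorted(group_counts, key=group_counts.get, reverse=True)
    for group in ranked:
        if group_attribute[group] != first_attribute:
            return group_attribute[group]
    return None  # degenerate case: every group shares the first attribute
```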

After determining the first and second attributes, the creation probability determination section 321 determines the third attribute based on the possessed characters. Specifically, the creation probability determination section 321 refers to the character setting data 732 corresponding to the type of egg selected for causing the character to be created and to the possessed character data 731d, and calculates, for each character attribute, the possession ratio of the number of characters possessed by the player having that attribute to the total number of characters. The creation probability determination section 321 determines the attribute of which the calculated possession ratio is the smallest to be the third attribute.

After determining the first to third attributes, the creation probability determination section 321 determines the creation probability of each of the first to third attributes referring to a creation probability setting table 746.

FIG. 24 shows an example of the data configuration of the creation probability setting table 746. As shown in FIG. 24, a score 746a of the first group and a creation probability 746b of each of the first to third attributes are stored in the creation probability setting table 746 while being associated with each other.

The creation probability determination section 321 determines the creation probability associated with the score of the first group (i.e., one of the groups (a) to (f) corresponding to the first attribute) in the creation probability setting table 746 to be the creation probability of each of the first to third attributes.

Specifically, when the condition B is not satisfied, characters with the attributes corresponding to the groups (A) to (F) (i.e., “fire”, “wind”, “earth”, “water”, “light”, and “darkness”) are set to be creation candidate characters, and the character to be created is selected from these characters.

When the creation probability determination section 321 has determined the creation probability of the character in attribute units based on the chord detected from the input sound, the character creation control section 320 determines the character to be created according to the creation probability of each attribute determined by the creation probability determination section 321. Specifically, the character creation control section 320 determines the attribute of the character to be created from the first to third attributes according to the creation probability. The character creation control section 320 refers to the possessed character data 731d and the character setting data 732 corresponding to the selected egg, and determines a character randomly selected from the characters which have the determined attribute and are not possessed by the player to be the character to be created.

The creation production section 322 then performs a creation production process of producing the creation of the determined character. Specifically, the creation production section 322 determines the color and the generation percentage of each of the first to third particles as three types of particles P to be displayed. The first to third particles respectively correspond to the first to third attributes. The generation percentage refers to the percentage of the number of respective particles generated in the total generation count which is the total number of first to third particles generated. Specifically, since the respective particles P have the same life (about a few seconds), the number of respective particles P displayed is proportional to the percentage of the respective particles P generated.

The creation production section 322 refers to the determined creation probability data 744 and an attribute/color correspondence table 751, and determines the colors corresponding to the first to third attributes to be the colors of the first to third particles, respectively.

The attribute/color correspondence table 751 is a data table which defines the correspondence between the attribute of a character and a color. FIG. 25 shows an example of the data configuration of the attribute/color correspondence table 751. As shown in FIG. 25, an attribute 751a of a character and a color 751b are stored in the attribute/color correspondence table 751 while being associated with each other.

The creation production section 322 determines an initial generation percentage which is the generation percentage when the character creation production starts, an intermediate generation percentage which is the generation percentage during the character creation production, and a final generation percentage which is the generation percentage when the character creation production ends as the generation percentage of the respective particles. Specifically, the initial generation percentages of the first to third particles are set at 33%. The creation probabilities of the first to third attributes are respectively set as the intermediate generation percentages of the first to third particles. The final generation percentage of the particle corresponding to the attribute of the character to be created is set at 90%, and the final generation percentages of the remaining particles are set at 5%.

The determined colors and generation percentages of the respective particles P are stored as particle data 752. FIG. 26 shows an example of the data configuration of the particle data 752. As shown in FIG. 26, a particle 752a, a color 752b, and a generation percentage 752c are stored as the particle data 752 while being associated with one another. The particle 752a is classified as the first to third particles. The generation percentage 752c includes the initial generation percentage, the intermediate generation percentage, and the final generation percentage.

The creation production section 322 then generates generation percentage control data 754 for controlling generation of the respective particles based on the determined generation percentages of the respective particles.

FIG. 27 shows an example of the generation percentage control data 754. FIG. 27 shows the generation percentages of the respective particles P with respect to the time t (the horizontal axis indicates the time t, and the vertical axis indicates the generation percentage). As shown in FIG. 27, the generation percentages of the first to third particles are 33% (initial generation percentage) at the character creation production start time t0. The generation percentage is gradually changed (increased/decreased) so that the generation percentage is set at the intermediate generation percentage at the time t1 during the character creation production and is set at the final generation percentage at the finish time t2.
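One plausible reading of FIG. 27 is piecewise-linear interpolation between the three anchor percentages; the patent only says the value is “gradually changed”, so the linear shape here is an assumption:

```python
def generation_percentage(t, t0, t1, t2, initial, intermediate, final):
    """Interpolate one particle type's generation percentage from `initial`
    at the production start t0, through `intermediate` at t1, to `final` at
    the finish time t2 (piecewise-linear shape is an assumption)."""
    if t <= t0:
        return initial
    if t <= t1:
        return initial + (intermediate - initial) * (t - t0) / (t1 - t0)
    if t <= t2:
        return intermediate + (final - intermediate) * (t - t1) / (t2 - t1)
    return final
```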

The creation production section 322 causes the image display section 400 to display the character creation production screen in which the selected egg and the respective particles are displayed, as shown in FIG. 3, and causes the sound output section 500 to output specific production sound, for example. The creation production section 322 starts controlling the respective particles in the character creation production screen according to the total generation count control data 753 and the generation percentage control data 754.

The total generation count control data 753 is data for controlling the total generation count which is the sum of the numbers of respective particles generated. FIG. 28 shows an example of the total generation count control data 753. FIG. 28 shows the total generation count N with respect to the time t (the horizontal axis indicates the time t, and the vertical axis indicates the generation count N). As shown in FIG. 28, the total generation count N is constant at a total generation count N1 from the character creation production start time to the time t1, and is gradually increased so that the total generation count N reaches a predetermined total generation count N2 at the finish time t2.

Specifically, the creation production section 322 determines the total generation count N at the present time from the total generation count control data 753 in units of a specific period of time, and determines the generation percentage of each of the first to third particles from the generation percentage control data 754. The creation production section 322 determines the generation count of each of the first to third particles by multiplying the total generation count N by the generation percentage of the respective particles, and generates the particles P in the determined generation count. The creation production section 322 causes the particles P to disappear (to be deleted) when the specific life has expired.
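Splitting the total generation count N among the three particle types can be sketched as follows; the patent does not specify how fractional counts are handled, so the truncation here is an assumption:

```python
def particle_counts(total_n, percentages):
    """Divide the total generation count N among the particle types in
    proportion to their generation percentages, truncating fractions."""
    return [int(total_n * p / 100) for p in percentages]
```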

The creation production section 322 controls the movement of each particle P currently displayed. Specifically, the creation production section 322 sets a moving force field acting on the particle P with the position of the egg being the generation base point (center). The moving force field is set as a positive (+) force field which acts to draw the particle P toward the generation base point or a negative (−) force field which acts to move the particle P away from the generation base point. The creation production section 322 moves each particle P according to the external force applied by the moving force field, which corresponds to the distance from the generation base point, and the initial velocity applied to each particle P. For example, when a velocity vector in the direction which rotates around the generation base point of the moving force field is specified as the initial velocity, each particle P moves so that it is diffused or drawn in while rotating around the egg OB.

When a specific period of time predetermined for the character creation production has expired, the creation production section 322 finishes displaying the character creation production screen, and displays the character creation screen in which the character to be created is displayed, as shown in FIG. 4, for example. The creation production section 322 adds the created character to the possessed characters to update the possessed character data 731d, and decrements the eggs of the selected type by one to update the possessed egg data 731a.

In FIG. 5, the image generation section 330 generates a game image for displaying a game screen based on the calculation results from the game calculation section 310, and outputs an image signal of the generated image to the image display section 400. The image display section 400 displays the game screen based on the image signal from the image generation section 330 while redrawing the screen of one frame every 1/60 second, for example. The function of the image display section 400 is implemented by hardware such as a CRT, an LCD, an ELD, a PDP, or an HMD. In FIG. 1, the displays 12A and 12B correspond to the image display section 400.

The sound generation section 340 generates game sound such as effect sound and BGM used during the game, and outputs a sound signal of the generated game sound to the sound output section 500. The sound output section 500 outputs the game sound such as effect sound and BGM based on the sound signal from the sound generation section 340. The function of the sound output section 500 is implemented by a speaker or the like. In FIG. 1, the speaker 13 corresponds to the sound output section 500.

The communication section 600 communicates data with an external device such as another portable game device 1 according to the control signal from the processing section 300. The function of the communication section 600 is implemented by a wireless communication module, a jack for a communication cable, a control circuit, or the like. In FIG. 1, the wireless communication device 18 corresponds to the communication section 600.

The storage section 700 stores a system program for implementing the function for causing the processing section 300 to integrally control the portable game device 1, a program and data necessary for causing the processing section 300 to execute the game, and the like. The storage section 700 is used as a work area for the processing section 300, and temporarily stores the results of calculations performed by the processing section 300 according to various programs, data input from the operation input section 100, and the like. The function of the storage section 700 is implemented by an IC memory, a hard disk, a CD-ROM, a DVD, an MO, a RAM, a VRAM, or the like. In FIG. 1, the ROM, the RAM, and the like provided in the control device 17 correspond to the storage section 700.

The storage section 700 also stores the game program 710 for causing the processing section 300 to function as the game calculation section 310, and game data. The game program 710 includes a character creation program 711 for causing the processing section 300 to function as the character creation control section 320. The game data includes the input sound data 721, the detected note data 722, the start note data 723, the note-name-unit detection data 724, the note-name-unit start data 725, the possessed item data 731, the character setting table 732, the level condition data 733, the chord classification table 734, the group/attribute correspondence table 735, the score setting table 736, the note classification table 737, the note detection total count data 741, the chord formation count data 742, the score data 743, the determined creation probability data 744, the note classification data 745, the creation probability setting table 746, the attribute/color correspondence table 751, the particle data 752, the total generation count control data 753, and the generation percentage control data 754.

<Process Flow>

FIG. 29 is a flowchart illustrative of the flow of the game process according to this embodiment. This process is implemented by causing the game calculation section 310 to execute the process based on the game program 710.

As shown in FIG. 29, the game calculation section 310 controls the process of a known breeding game according to the operation input from the operation input section 100 and the like (step A1). When the player has acquired a new egg (step A3: YES), the game calculation section 310 adds the acquired egg to the possessed eggs, and updates the possessed egg data 731a (step A5). When causing a character to be created (step A7: YES), the character creation control section 320 performs a character creation process (step A9).

FIG. 30 is a flowchart illustrative of the flow of the character creation process.

As shown in FIG. 30, the character creation control section 320 refers to the possessed egg data 731a, and causes the image display section 400 to display the egg selection screen in which different types of eggs provided in advance are displayed together with the number of eggs possessed by the player. The character creation control section 320 refers to the character setting table 732 corresponding to the egg selected on the egg selection screen, and causes the image display section 400 to display the character list screen which is a list of the characters set for the selected egg. The character creation control section 320 selects one egg from the eggs possessed by the player according to the operation input from the operation input section 100 (step B1).

When the egg has been selected, the character creation control section 320 performs a sound input process of allowing the player to input melody sound by performing a specific countdown display and the like, and storing sound data input from the sound input section 200 as the input sound data 721 (step B3). The creation probability determination section 321 then performs a creation probability determination process based on the input sound data 721 (step B5).

FIG. 31 is a flowchart illustrative of the flow of the creation probability determination process.

As shown in FIG. 31, the creation probability determination section 321 detects each note within a specific octave from the input sound data 721, and generates the detected note data 722 (step C1). The creation probability determination section 321 performs a filtering process for the detected note data 722 (step C3).

FIG. 32 is a flowchart illustrative of the flow of the filtering process.

As shown in FIG. 32, the creation probability determination section 321 determines the maximum level of the detected notes based on the detected note data 722 (step D1). The creation probability determination section 321 refers to the level condition data 733, and excludes any note in the detected note data 722 which does not satisfy the specific level condition from the detected notes (step D3). The creation probability determination section 321 determines the detected notes in the detected note data 722 in units of detection time t. When five or more adjacent notes have been detected, the creation probability determination section 321 excludes all notes at the time t from the detected notes (step D5).
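The level filtering and adjacency exclusion of the steps D1 to D5 can be sketched as follows. The data layout (a map from each detection time t to the detected notes and their levels), the relative level threshold, and the "adjacent notes" test (consecutive semitones) are illustrative assumptions, not the embodiment's actual implementation:

```python
def filter_detected_notes(detected, level_ratio=0.5, adjacency_limit=5):
    """detected: {time_t: {midi_note: level}} -> filtered copy."""
    # Step D1: determine the maximum level over all detected notes.
    max_level = max((lvl for notes in detected.values() for lvl in notes.values()),
                    default=0)
    threshold = max_level * level_ratio  # assumed form of the level condition
    out = {}
    for t, notes in detected.items():
        # Step D3: exclude notes that do not satisfy the level condition.
        kept = {n: lvl for n, lvl in notes.items() if lvl >= threshold}
        # Step D5: if five or more adjacent (consecutive) notes remain,
        # treat the frame as noise and exclude all notes at this time t.
        pitches = sorted(kept)
        run = best = 1
        for a, b in zip(pitches, pitches[1:]):
            run = run + 1 if b == a + 1 else 1
            best = max(best, run)
        if pitches and best < adjacency_limit:
            out[t] = kept
    return out
```

Here a frame containing five consecutive semitones is dropped in its entirety, matching the step D5 behavior of excluding every note at that detection time.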

The creation probability determination section 321 calculates the detection total count of each note in the detected note data 722, and sums up the calculated detection total count of each note to calculate the detection total count of all the notes (step D7). The creation probability determination section 321 sums up the detection total count of each black-key note in the detected note data 722 to calculate the detection total count of all the black-key notes (step D9).
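The totals of the steps D7 and D9 amount to simple counting; the sketch below assumes MIDI-style note numbers so that black keys can be identified by pitch class:

```python
BLACK_KEYS = {1, 3, 6, 8, 10}  # pitch classes of the black keys (C#, D#, F#, G#, A#)

def detection_totals(detected):
    """detected: {time_t: iterable of midi notes}
    -> (per-note counts, total of all notes, total of black-key notes)."""
    per_note = {}
    for notes in detected.values():
        for n in notes:
            per_note[n] = per_note.get(n, 0) + 1
    total = sum(per_note.values())                                        # step D7
    black = sum(c for n, c in per_note.items() if n % 12 in BLACK_KEYS)   # step D9
    return per_note, total, black
```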

The creation probability determination section 321 combines the detected note data 722 within one octave in note name units to generate the note-name-unit detection data 724 (step D11). The creation probability determination section 321 calculates the number of types of detected notes in the note-name-unit detection data 724 in units of detection time t. When the calculated number of types of notes is seven or more, the creation probability determination section 321 excludes all notes at the time t from the detected notes (step D13).
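Combining the octaves in note name units (step D11) and excluding frames with seven or more note types (step D13) can be sketched as follows, again assuming MIDI-style note numbers:

```python
def fold_to_note_names(detected, max_types=7):
    """Step D11: fold detected notes into pitch-class (note name) units per frame.
    Step D13: exclude every note at a time t with max_types or more note types."""
    out = {}
    for t, notes in detected.items():
        names = {n % 12 for n in notes}   # e.g. 60 (C4) and 72 (C5) -> name 0
        if len(names) < max_types:
            out[t] = names
    return out
```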

The creation probability determination section 321 determines the start of each detected note in the detected note data 722 to generate the start note data 723 (step D15). The creation probability determination section 321 combines the generated start note data 723 within one octave in note name units to generate the note-name-unit start data 725 (step D17).
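The start determination of the step D15 can be sketched as follows, under the assumed definition that a note "starts" at a detection time t when it is detected at t but was not detected at the immediately preceding detection time:

```python
def start_notes(detected):
    """detected: {time_t: set of notes} -> {time_t: notes that start at t}."""
    out = {}
    prev = set()
    for t in sorted(detected):
        cur = set(detected[t])
        out[t] = cur - prev   # newly sounding notes only (assumed 'start' test)
        prev = cur
    return out
```

The step D17 then folds this start note data into note name units in the same way as the step D11.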

The creation probability determination section 321 thus completes the filtering process.

As shown in FIG. 31, after the completion of the filtering process, the creation probability determination section 321 refers to the note detection total count data 741, and determines the detection total count of all the notes in the detected note data 722. When the detection total count of all the notes is 10 or more (step C5: YES), the creation probability determination section 321 performs a chord formation determination process to determine the scores of the groups (A) to (D) (step C7).

FIG. 33 is a flowchart illustrative of the flow of the chord formation determination process.

As shown in FIG. 33, the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit detection data 724 at each detection time t, and calculates the formation count in chord units (step E1). Likewise, the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit start data 725 at each detection time t, and calculates the formation count in chord units (step E3).

The creation probability determination section 321 sums up the formation counts of the note-name-unit detection data 724 and the note-name-unit start data 725 in chord units to calculate the total formation count (step E5). The creation probability determination section 321 determines the sum of the formation count of each chord belonging to each of the chord classification groups (A) to (D) to be the score of each group (step E7).
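The chord formation determination of the steps E1 to E7 can be sketched as set-inclusion matching of chord templates against each frame of note names. The chord templates and the chord-to-group mapping below are illustrative assumptions; the embodiment's own chord table and groups (A) to (D) are not reproduced here:

```python
CHORDS = {
    "C":  {0, 4, 7},        # assumed templates, as pitch-class sets
    "Cm": {0, 3, 7},
    "G7": {7, 11, 2, 5},
}
CHORD_GROUP = {"C": "A", "Cm": "B", "G7": "C"}  # assumed chord -> group mapping

def chord_formation_counts(name_frames):
    """Steps E1/E3: name_frames: {time_t: set of note names} -> {chord: count}."""
    counts = {}
    for names in name_frames.values():
        for chord, template in CHORDS.items():
            if template <= names:  # every chord tone present at this time t
                counts[chord] = counts.get(chord, 0) + 1
    return counts

def group_scores(detect_counts, start_counts):
    """Step E5: sum the two formation counts per chord.
    Step E7: sum the totals of the chords belonging to each group."""
    scores = {}
    for chord in set(detect_counts) | set(start_counts):
        total = detect_counts.get(chord, 0) + start_counts.get(chord, 0)
        g = CHORD_GROUP[chord]
        scores[g] = scores.get(g, 0) + total
    return scores
```

`chord_formation_counts` would be applied once to the note-name-unit detection data 724 and once to the note-name-unit start data 725, and `group_scores` merges the two results.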

The creation probability determination section 321 thus completes the chord formation determination process.

As shown in FIG. 31, after the completion of the chord formation determination process, the creation probability determination section 321 determines the score of each of the groups (A) to (D). When the score of at least one of the groups (A) to (D) is 5 or more (step C9: YES), and the scores of three or more groups are 1 or more (step C11: YES), the creation probability determination section 321 selects the three groups with the highest scores from the groups (A) to (D) (step C13). The creation probability determination section 321 sets the attributes corresponding to the selected groups to be the first to third attributes in descending order of score (step C15), and determines the creation probability of each of the first to third attributes based on the score of the corresponding group (step C17). The creation probability determination section 321 then determines that the character is successfully created (step C29).
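The selection of the steps C13 to C17 can be sketched as follows. The group-to-attribute mapping is an assumption, and the sketch assumes each creation probability is proportional to its group's score; the embodiment may instead read the probabilities from the score setting table 736:

```python
GROUP_ATTRIBUTE = {"A": "fire", "B": "water", "C": "wind", "D": "earth"}  # assumed

def pick_attributes(scores):
    """Steps C13-C17 sketch: top three groups -> attributes and probabilities."""
    top = sorted(scores, key=scores.get, reverse=True)[:3]   # step C13
    attrs = [GROUP_ATTRIBUTE[g] for g in top]                # step C15
    total = sum(scores[g] for g in top)
    probs = [scores[g] / total for g in top]                 # step C17 (assumed rule)
    return attrs, probs
```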

When the score of at least one of the groups (A) to (D) is 5 or more (step C9: YES), and the number of groups with a score of 1 or more is less than three (step C11: NO), the creation probability determination section 321 performs a group count shortage process, and determines the first to third attributes and the creation probabilities (step C19).

FIG. 34 is a flowchart illustrative of the flow of the group count shortage process.

As shown in FIG. 34, the creation probability determination section 321 determines one of the groups (A) to (F) with the highest score to be the first group, and determines the attribute corresponding to the first group to be the first attribute (step F1).

The creation probability determination section 321 determines the scores of the groups (A) to (F). When the number of groups included in the groups (A) to (F) and having a score of 1 or more is one (step F3: YES), the creation probability determination section 321 sums up the detection total count of each note in the detected note data 722 belonging to each of the note classification groups (a) to (f) referring to the note detection total count data 741 (step F5).

The creation probability determination section 321 selects the one of the groups (a) to (f) having the largest detection total count, and determines whether or not the attribute corresponding to the selected group coincides with the first attribute. When the attribute corresponding to the selected group does not coincide with the first attribute (step F7: YES), the creation probability determination section 321 sets that attribute to be the second attribute (step F9). When the attribute coincides with the first attribute (step F7: NO), the creation probability determination section 321 selects the one of the groups (a) to (f) having the second largest detection total count, and sets the attribute corresponding to that group to be the second attribute (step F11).

When the number of groups included in the groups (A) to (F) and having a score of 1 or more is two in the step F3 (step F3: NO), the creation probability determination section 321 sets the attribute corresponding to one of the groups (A) to (F) with the second highest score to be the second attribute (step F13).

The creation probability determination section 321 refers to the character setting table 732 corresponding to the type of the selected egg and to the possessed item data 731, and determines the attribute with the minimum possession rate from among the attributes of the characters set for the selected type of egg (step F15). The creation probability determination section 321 determines whether or not the determined attribute coincides with the first or second attribute. When the determined attribute does not coincide with the first or second attribute (step F17: YES), the creation probability determination section 321 sets the determined attribute to be the third attribute (step F21). When the determined attribute coincides with the first or second attribute (step F17: NO), the creation probability determination section 321 determines the attribute with the next smallest possession rate (step F19), and returns to the step F17 to determine whether or not this attribute coincides with the first or second attribute.

The creation probability determination section 321 refers to the score setting table 736, and determines the creation probability of each of the first to third attributes according to the score of the first group (step F23).

The creation probability determination section 321 thus completes the group count shortage process.

In FIG. 31, after the completion of the group count shortage process, the creation probability determination section 321 determines that the character is successfully created (step C29).

When the scores of all of the groups (A) to (D) are less than 5 (step C9: NO), the creation probability determination section 321 determines the number of types of detected notes in the detected note data 722 referring to the note detection total count data 741. When the number of types of detected notes is less than two (step C21: NO), the creation probability determination section 321 determines that the character is not successfully created (step C31).

When the number of types of detected notes is two or more (step C21: YES), the creation probability determination section 321 refers to the note detection total count data 741, and determines the scores of the groups (E) and (F) referring to the score setting table 736 based on the detection total count of all the black-key notes in the detected note data 722 (step C23). The creation probability determination section 321 then determines the score of each of the groups (A) to (F). When the scores of three or more of the groups (A) to (F) are 1 or more (step C25: YES), the creation probability determination section 321 selects the three groups with the highest scores from the groups (A) to (F) (step C27). The creation probability determination section 321 sets the attributes corresponding to the selected groups to be the first to third attributes in descending order of score (step C15), and determines the creation probability of each of the first to third attributes based on the score of the corresponding group (step C17). The creation probability determination section 321 determines that the character is successfully created (step C29).

When the number of the groups (A) to (F) with a score of 1 or more is less than three (step C25: NO), the creation probability determination section 321 performs the group count shortage process, and determines the first to third attributes and the creation probabilities (step C19). The creation probability determination section 321 then determines that the character is successfully created (step C29).

The creation probability determination section 321 thus completes the creation probability determination process.

In FIG. 30, after the completion of the creation probability determination process, the character creation control section 320 determines whether or not the character is successfully created. When the character is successfully created (step B7: YES), the character creation control section 320 determines the character to be created according to the type of the selected egg and the creation probability of each attribute determined. Specifically, the character creation control section 320 determines the attribute of the character to be created according to the creation probability of each attribute determined. The character creation control section 320 refers to the character setting table 732 corresponding to the type of the selected egg, and determines a character randomly selected from the characters which have the determined attribute and are not possessed by the player to be the character to be created (step B9). The character creation control section 320 decrements (reduces) the eggs of the selected type by one to update the possessed egg data 731a (step B11). The creation production section 322 then performs a character creation production process (step B13).
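The determination of the step B9 can be sketched as a weighted draw of the attribute followed by a uniform draw among the characters the player does not yet possess. The data layout and names are assumptions for illustration:

```python
import random

def create_character(attrs, probs, candidates, possessed, rng=random):
    """Step B9 sketch: pick an attribute by the determined creation
    probabilities, then pick at random among that attribute's characters
    not possessed by the player. candidates: {attribute: [character, ...]}."""
    attribute = rng.choices(attrs, weights=probs, k=1)[0]
    pool = [c for c in candidates.get(attribute, []) if c not in possessed]
    return attribute, (rng.choice(pool) if pool else None)
```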

FIG. 35 is a flowchart illustrative of the flow of the character creation production process.

As shown in FIG. 35, the creation production section 322 determines the colors corresponding to the first to third attributes to be the colors of the first to third particles, respectively (step G1).

The creation production section 322 determines the generation percentage of each of the first to third particles. Specifically, the creation production section 322 sets the initial generation percentage of each of the first to third particles at the same value (33%) (step G3). The creation production section 322 sets the creation probabilities of the first to third attributes as the intermediate generation percentages of the first to third particles, respectively (step G5). The creation production section 322 sets the final generation percentage of the particle corresponding to the attribute of the character to be created at 90%, and sets the final generation percentages of the remaining particles at 5% (step G7). The creation production section 322 refers to the particle data 752, and generates the generation percentage control data 754 according to the generation percentage of each of the first to third particles (step G9).
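The three-stage generation percentage schedule of the steps G3 to G7 can be sketched as follows; the function signature and the representation of the winning particle are assumptions:

```python
def particle_percentages(creation_probs, winner_index):
    """Steps G3-G7: each particle's generation percentage moves from an equal
    initial value, through its attribute's creation probability, to a final
    value of 90% for the created character's particle and 5% for the rest."""
    n = len(creation_probs)
    initial = [round(100 / n) for _ in creation_probs]            # step G3: ~33% each
    intermediate = [round(p * 100) for p in creation_probs]       # step G5
    final = [90 if i == winner_index else 5 for i in range(n)]    # step G7
    return initial, intermediate, final
```

The generation percentage control data 754 would then interpolate between these three stages over the production period, so the dominant particle color gradually reveals the attribute of the character being created.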

The creation production section 322 causes the image display section 400 to display the character creation production screen in which the egg of the selected type and the respective particles are disposed, and starts controlling each particle in the character creation screen according to the total generation count control data 753 and the generated generation percentage control data 754 (step G11). When a specific period of time has expired after displaying the character creation production screen (step G13: YES), the creation production section 322 finishes displaying the character creation production screen, and causes the image display section 400 to display the character creation screen displaying a state in which the character is created (step G15). The creation production section 322 adds the created character to the possessed characters to update the possessed character data 731d (step G17).

The creation production section 322 thus completes the character creation production process.

When the character is not successfully created in the step B7 in FIG. 30 (step B7: NO), the character creation control section 320 performs a character creation failure production process such as causing the image display section 400 to display the creation failure screen showing that the character is not successfully created, or causing the sound output section 500 to output sound (step B15).

The character creation control section 320 thus completes the character creation process.

In FIG. 29, after the completion of the character creation process, the game calculation section 310 determines whether or not to finish the game. When the game is not to be finished (step A11: NO), the process returns to the step A1. When the game calculation section 310 has determined to finish the game (step A11: YES), the game calculation section 310 finishes the game process, whereby the game ends.

<Effects>

According to this embodiment, whether or not a chord is formed is determined from the input sound in units of detection time t at specific time intervals, the first to third attributes are determined as the attributes of the creation candidate characters from the attributes of the characters based on the determined formation count in chord units, and the creation probability of each of the first to third attributes is determined. The character with one of the first to third attributes determined according to the creation probability is created and added to the possessed characters.

Specifically, since the character corresponding to the type of chord included in the input sound is created depending on the probability, the character to be created differs even if the same melody is input, whereby the player can enjoy the game. The creation probability of each of the first to third attributes is determined by the formation count of the corresponding chord. Specifically, when the number of specific chords is large, the character with the attribute associated with the specific chord is created with a high probability. The chord is an important element which determines the tone of the melody. Therefore, the player can enjoy estimating the character to be created from the tone of the input melody.

<Modification>

The embodiments to which the invention can be applied are not limited to the above-described embodiments. Various modifications and variations may be made without departing from the spirit and scope of the invention.

(A) Detection Total Count of Black-Key Note

In the above-described embodiments, the characters with the attributes “light” and “darkness” respectively corresponding to the groups (E) and (F) are included in the attributes of the creation candidate characters by determining the scores of the groups (E) and (F) according to the ratio of the detection total count of all the black-key notes to the detection total count of all the notes. The characters with the attributes “light” and “darkness” may be included in the creation candidate characters according to the detection total count of all the black-key notes, for example. Specifically, when the detection total count of all the black-key notes is equal to or greater than a first specific number and less than a second specific number, the character with the attribute “light” or “darkness” is included in the creation candidate characters. When the detection total count of all the black-key notes is equal to or greater than the second specific number, the characters with the attributes “light” and “darkness” are included in the creation candidate characters, for example. Note that the second specific number is greater than the first specific number.
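This modification can be sketched as a simple threshold test on the black-key detection total; the threshold values and the choice between "light" and "darkness" in the single-attribute case are left open by the description and are assumptions here:

```python
def light_dark_candidates(black_total, first_n, second_n):
    """Modification (A): include 'light'/'darkness' candidates from the
    detection total count of all black-key notes alone (second_n > first_n)."""
    if black_total >= second_n:
        return ["light", "darkness"]   # both attributes become candidates
    if black_total >= first_n:
        return ["light"]               # or "darkness"; the choice is unspecified
    return []
```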

(B) Particle P

In the above-described embodiments, the particle P is spherical. Note that the particle P may have another shape such as a triangle, a quadrangle, or a line. A state in which the spherical particles P are mixed may be displayed as a cloud or smoke.

The display state such as the size, shape, or brightness of each particle P may be changed with the passage of time. In this case, it is desirable that the color of the particle P not be changed because the color of the particle P indicates the corresponding attribute.

In the above-described embodiments, the number of respective particles P is changed by generating the particles P or causing the particles P to disappear. Note that the total number of particles P may be constant without generating the particles P or causing the particles P to disappear, and the ratio of the numbers of respective particles P may be changed by changing the color of each particle P. In the above-described embodiments, each particle P has the same life. Note that the ratio of the numbers of respective particles P may be changed by changing the life of each particle P depending on the type.

(C) Scale

The above-described embodiments have been described taking an example of a Western music scale (e.g. do, re, mi, fa, sol, la, ti, and do). Note that the invention can also be applied to other scales.

(D) Determination of Character to be Created

In the above-described embodiments, a character randomly selected from the characters corresponding to the attribute determined based on the input sound is created. Note that the character to be created may be selected based on the date (date and time). Specifically, the selection probability of each time zone obtained by dividing one day (24 hours) into a plurality of time zones (e.g. morning, daytime, and night) is set for each character. The character to be created is selected according to the selection probability corresponding to the time zone corresponding to the time at which the melody sound is input among the selection probabilities in time zone units set for each character corresponding to the determined attribute. The selection probability of each season obtained by dividing one year (365 days) into a plurality of seasons (e.g. spring, summer, autumn, and winter) may be set instead of the time zone, and the character to be created may be selected according to the selection probability corresponding to the date at which the melody sound is input. This allows the character to be created to be changed corresponding to the date at which the melody sound is input.
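The time-zone variant of this modification can be sketched as follows; the zone boundaries, the character table layout, and the cumulative-probability draw are assumptions:

```python
def select_by_time(hour, table, rng_value):
    """Modification (D) sketch: table: {character: {zone: probability}};
    rng_value is a uniform draw in [0, 1). The character is selected
    according to the probabilities of the zone containing the input time."""
    zone = "morning" if 5 <= hour < 11 else "daytime" if 11 <= hour < 18 else "night"
    chars = list(table)
    weights = [table[c][zone] for c in chars]
    total = sum(weights)
    acc = 0.0
    for c, w in zip(chars, weights):   # cumulative-probability selection
        acc += w / total
        if rng_value < acc:
            return c
    return chars[-1]
```

The seasonal variant would differ only in mapping the input date to one of the seasons instead of mapping the hour to a time zone.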

Alternatively, the time zone may be associated with the character instead of the selection probability, and the character may be created which corresponds to the time zone corresponding to the time at which the melody sound is input.

(E) Character Creation Timing

In the above-described embodiments, the character is created after performing creation production of displaying the particle P corresponding to the attribute of each creation candidate character. Note that the character may be created during creation production, the character may be created at the same time as creation production, or creation production may be performed after (immediately after) creating the character.

(F) Attribute

In the above-described embodiments, the attribute is set in advance for each character as the parameter by which each character is classified. Note that the capability parameter of each character may be employed such as offensive power, defensive power, or witchcraft.

(G) Applicable Game Device

The above-described embodiments illustrate the case of applying the invention to the portable game device. Note that the invention can also be applied to other devices which can execute a game, such as a consumer game device, an arcade game device, and a portable telephone.

(H) Applicable Game

The above-described embodiments illustrate the case of applying the invention to the breeding game. Note that the invention can also be applied to other games in which a character appears, such as a role-playing game.

Although only some embodiments of the invention have been described above in detail, those skilled in the art would readily appreciate that many modifications are possible in the embodiments without materially departing from the novel teachings and advantages of the invention. Accordingly, such modifications are intended to be included within the scope of the invention.

Claims

1. A game process control method which causes a computer including a sound input section to execute a game in which a game character appears, the method comprising:

detecting a note set included in input sound input to the sound input section, the note set being one of different types of note sets formed by combining predetermined notes;
selecting a game character caused to appear based on the detection result;
causing the selected game character to appear; and
controlling display of each game character including the new game character.

2. The game process control method as defined in claim 1,

wherein the note sets are associated in advance with the game characters;
wherein the method further comprises determining selection candidate characters including at least the game character corresponding to the detected note set; and
wherein the game character caused to appear is selected from the determined selection candidate characters.

3. The game process control method as defined in claim 2,

wherein the note set included in the input sound is detected at given time intervals; and
wherein the selection candidate characters are determined based on a detection total count of each of the note sets detected.

4. The game process control method as defined in claim 3, wherein the selection candidate character corresponding to the note set with a larger detection total count is selected as the game character caused to appear with a higher probability.

5. The game process control method as defined in claim 2,

wherein the game character determined to be the selection candidate character is associated in advance corresponding to a set note content which is a percentage of a predetermined note in the input sound;
wherein the method further comprises determining the set note content of the input sound input to the sound input section; and
wherein the game character corresponding to the determined set note content is determined to be included in the selection candidate characters.

6. The game process control method as defined in claim 2, further comprising:

detecting whether or not a set note which is a note set in advance is included in the input sound input to the sound input section at given time intervals;
wherein a special character is determined to be included in the selection candidate characters when a detection total count of the detected set note has reached a specific number.

7. The game process control method as defined in claim 1,

wherein the game character is associated in advance with each of a plurality of time conditions obtained by dividing a period in which the input sound may be input by date and/or time; and
wherein the game character corresponding to the time condition satisfied by an input time of the input sound from the sound input section is selected as the game character caused to appear.

8. The game process control method as defined in claim 1, further comprising:

detecting an input timing of each note included in the input sound input to the sound input section;
wherein the note set is detected which includes the notes input at the same input timing.

9. The game process control method as defined in claim 1, wherein the note set included in the input sound input to the sound input section is detected in note name units.

10. The game process control method as defined in claim 1, further comprising:

subjecting the input sound to a filtering process by detecting only the notes included in the input sound input to the sound input section and having a specific intensity;
wherein the note set is detected using the input sound subjected to the filtering process as the input sound input to the sound input section.

11. The game process control method as defined in claim 10, wherein the filtering process includes causing a portion of the input sound input to the sound input section in which a specific number or more of notes are input at the same time to be silent.

12. A computer-readable information recording medium storing a program for causing a computer to execute the game process control method as defined in claim 1.

13. A game device comprising:

a sound input section;
a note set detection section which detects a note set included in input sound input to the sound input section, the note set being one of different types of note sets formed by combining predetermined notes;
a character selection section which selects a game character caused to appear based on the detection result of the note set detection section; and
a character appearance control section which causes the game character selected by the character selection section to appear.
Patent History
Publication number: 20080058101
Type: Application
Filed: Aug 27, 2007
Publication Date: Mar 6, 2008
Applicant: NAMCO BANDAI GAMES INC. (TOKYO)
Inventor: Yoshikazu Hato (Yokohama-shi)
Application Number: 11/892,789
Classifications
Current U.S. Class: Audible (463/35); Data Storage Or Retrieval (e.g., Memory, Video Tape, Etc.) (463/43)
International Classification: A63F 9/24 (20060101);