COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN DRAWING PROCESSING PROGRAM, AND INFORMATION PROCESSING APPARATUS

- NINTENDO CO., LTD.

First, a designated position on a display screen is continuously obtained based on a designation performed by a pointing device. Next, it is detected that a sound which satisfies a predetermined condition is inputted to sound input means. Then, while it is detected that the sound which satisfies the predetermined condition is inputted, predetermined drawing-related processing is executed at a position based on the obtained designated position.

Description
CROSS REFERENCE TO RELATED APPLICATION

The disclosure of Japanese Patent Application No. 2009-000443, filed on Jan. 5, 2009, is incorporated herein by reference.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a computer-readable storage medium having stored therein a drawing processing program which allows a player to draw a desired picture, and more particularly, to a computer-readable storage medium having stored therein a drawing processing program which allows a player to draw a desired picture by using a pointing device such as a touch panel.

2. Description of the Background Art

Conventionally, there has been known an apparatus for editing an image by a user operating a touch pen to perform an input to a touch panel (for example, Japanese Laid-Open Patent Publication No. 2003-191567, and Japanese Laid-Open Patent Publication No. 2006-129257). Such an apparatus is capable of, by using the touch pen, editing (e.g., drawing graffiti on) an image obtained by shooting an object (user) itself. In addition, at this time, the thickness or the line type of a pen can also be selected.

However, an apparatus as described above has the following problem. In the above apparatus, drawing is performed while an input is being performed by the touch pen with respect to the touch panel. Therefore, the user can perform an operation as if the user performs drawing on paper by using a pen. On the other hand, such an operation is commonplace, and therefore, does not feel sufficiently novel to the user.

SUMMARY OF THE INVENTION

Therefore, an object of the present invention is to provide a drawing processing program and an information processing apparatus which enable drawing to be performed through a nonconventional and novel way of operation.

The present invention has the following features to achieve the objects mentioned above. Note that reference numerals, supplementary explanations, and the like in the parentheses indicate an example of the correspondence relationship with an embodiment described below in order to aid in understanding the present invention and are not intended to limit, in any way, the scope of the present invention.

A first aspect is a computer-readable storage medium having stored therein a drawing processing program which is executed by a computer of an information processing apparatus in which a pointing device for designating a position on a display screen, and sound input means can be used, the drawing processing program causing the computer to function as designated position detection means (S22), sound detection means (S51), and drawing-related processing execution means (S56). The designated position detection means continuously obtains a designated position on the display screen, based on a designation performed by the pointing device. Here, upon obtaining the designated position, if, for example, the pointing device is a touch panel, a touch operation with respect to the touch panel corresponds to a designation operation. Alternatively, if the pointing device is an operation device including shooting means for shooting a shooting target and is capable of, based on shooting information obtained by the shooting means, designating any position on a screen, pressing down a predetermined button provided on the operation device while the position is being designated corresponds to a designation operation. The sound detection means detects that a sound which satisfies a predetermined condition is inputted to the sound input means. Here, a determination using the predetermined condition may be a determination using a predetermined threshold value. That is, the predetermined condition may be such that when a sound having a magnitude of a certain level is inputted, it is determined that a sound is inputted, or may be such that specified sound determination means described later determines that a predetermined sound is inputted.
The drawing-related processing execution means executes, while the sound detection means detects that the sound which satisfies the predetermined condition is inputted, predetermined drawing-related processing on a position based on the designated position obtained by the designated position detection means. Here, the drawing-related processing includes processing of drawing a line (straight line or curved line), a dot, or an image formed by a collection of the line or the dot on the display screen, and in addition, includes processing of working upon (editing) an image or the like which has been already drawn, and processing of erasing an image or the like which has been already displayed.
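The gating described in the first aspect can be sketched as a per-frame check: drawing occurs only while both a position is designated and a qualifying sound is detected. This is a minimal illustrative sketch, not the patent's implementation; the function names (`process_frame`), the normalized volume representation, and the specific threshold are all assumptions.

```python
# Hypothetical sketch of the first aspect: drawing-related processing runs
# only while a sound satisfying a predetermined condition is being input.
# All names and the threshold value below are illustrative assumptions.

VOLUME_THRESHOLD = 0.2  # assumed "predetermined condition": a volume level


def process_frame(touch_position, mic_volume, canvas):
    """Process one frame of the drawing loop.

    touch_position: (x, y) from the pointing device, or None if not designated.
    mic_volume: normalized 0.0-1.0 volume from the sound input means.
    canvas: set of (x, y) pixels drawn so far.
    Returns True if drawing-related processing was executed this frame.
    """
    # Designated position detection means (S22): a position must be designated.
    if touch_position is None:
        return False
    # Sound detection means (S51): the sound must satisfy the condition.
    if mic_volume < VOLUME_THRESHOLD:
        return False
    # Drawing-related processing execution means (S56): draw at the position.
    canvas.add(touch_position)
    return True
```

In this sketch, lifting the stylus or letting the sound fall below the threshold both stop drawing immediately, which matches the "while it is detected" behavior of the abstract.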

According to the first aspect, a painting program providing a novel way of operation can be provided.

In a second aspect based on the first aspect, the drawing-related processing execution means changes a content of the drawing-related processing to be executed, based on the sound detected by the sound detection means, and in accordance with a characteristic of the sound. Here, the characteristic of the sound is, for example, a volume, a frequency, a tone, or the like.

According to the second aspect, since a content of the drawing-related processing to be executed is changed depending on a content of an inputted sound, a novel way of enjoyment can be provided to the player.

In a third aspect based on the second aspect, the drawing-related processing execution means sequentially changes a content of the drawing-related processing to be executed, in coordination with chronological changes in the characteristic of the sound repeatedly detected by the sound detection means.

According to the third aspect, since a content of the drawing-related processing is changed in real time in accordance with a change in an inputted sound, a novel way of enjoyment can be provided to the player. Further, while drawing is performed based on the position detected by the designated position detection means, the content of the drawing-related processing is changed in accordance with an input from the sound input means which is means other than the designated position detection means. Therefore, the player can change the content of the drawing-related processing while the player continues to designate a position for drawing.

In a fourth aspect based on the second aspect, the drawing-related processing execution means executes the predetermined drawing-related processing only when a volume of the inputted sound is equal to or larger than a predetermined threshold value.

According to the fourth aspect, since the drawing-related processing is executed only when the volume of the inputted sound is larger than a certain degree, a novel way of enjoyment can be provided to the player.

In a fifth aspect based on the second aspect, the drawing-related processing execution means executes, as the drawing-related processing, processing of drawing a line which connects, in chronological order, the positions based on the designated positions sequentially obtained by the designated position detection means.

According to the fifth aspect, since a handwritten image can be drawn on the display screen through a novel way of operation, a novel way of enjoyment can be provided to the player.

In a sixth aspect based on the fifth aspect, the drawing-related processing execution means changes at least one of a thickness of the line and a density of a color in which the line is drawn, in accordance with a volume of the inputted sound.

According to the sixth aspect, since the thickness or the density of the line to be drawn is changed in accordance with the volume of the inputted sound, a novel way of enjoyment can be provided to the player.
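The fifth and sixth aspects amount to sampling the designated position over time into a stroke whose per-segment thickness tracks the input volume. The linear volume-to-thickness mapping and the pixel range below are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of the fifth and sixth aspects: successive designated
# positions are connected in chronological order into a stroke, and the
# pen thickness is derived from the input-sound volume. The linear
# mapping and the 1-8 pixel range are assumed for illustration.

MIN_THICKNESS = 1
MAX_THICKNESS = 8


def thickness_for_volume(volume):
    """Map a normalized volume (0.0-1.0) to a pen thickness in pixels."""
    volume = max(0.0, min(1.0, volume))
    return MIN_THICKNESS + round(volume * (MAX_THICKNESS - MIN_THICKNESS))


def append_stroke_point(stroke, position, volume):
    """Append one sampled point to a stroke.

    Each entry pairs the sampled (x, y) position with the thickness in
    effect when it was sampled, so a renderer can connect consecutive
    entries with line segments of varying thickness.
    """
    stroke.append((position, thickness_for_volume(volume)))
    return stroke
```

A renderer would then walk consecutive `stroke` entries and draw each connecting segment at its recorded thickness, producing the volume-modulated handwritten line of the sixth aspect.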

In a seventh aspect based on the second aspect, the drawing-related processing execution means executes, as the drawing-related processing, processing of drawing one or more dots in a drawing range which is a predetermined range including therein the position based on the designated position.

According to the seventh aspect, for example, a feeling of performing drawing by using a spray, together with a novel way of operation, can be provided to the player.

In an eighth aspect based on the seventh aspect, the drawing-related processing execution means changes at least one of a size of the drawing range and a number of the dots to be drawn in the drawing range, in accordance with a volume of the inputted sound.

In a ninth aspect based on the eighth aspect, the drawing-related processing execution means draws the dots such that an area density of the dots nearer to the position based on the designated position is higher, and an area density of the dots farther from the position based on the designated position is lower.

In a tenth aspect based on the eighth aspect, the drawing-related processing execution means draws the dots at random positions in the drawing range.

According to the eighth to the tenth aspects, various types of drawing can be performed in accordance with the volume of the inputted sound.
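The seventh to tenth aspects together describe a spray tool: random dots scattered in a range around the designated position, with the range size and dot count growing with volume and the area density highest at the center. The following sketch is one possible realization; the scaling constants and the radial sampling trick are assumed implementation details, not taken from the patent.

```python
# Illustrative sketch of the seventh to tenth aspects: a spray tool draws
# random dots around the designated position. The drawing-range radius and
# dot count grow with volume; sampling the radius uniformly (rather than
# by area) concentrates dots near the center, giving the higher central
# area density of the ninth aspect. Scaling constants are assumptions.

import math
import random


def spray_dots(center, volume, rng=random):
    """Return dot positions for one spray step.

    center: (x, y) position based on the designated position.
    volume: normalized 0.0-1.0 input-sound volume.
    rng: random source (injectable for testing).
    """
    cx, cy = center
    radius = 2 + volume * 30       # drawing-range size scales with volume
    count = int(1 + volume * 50)   # dot count scales with volume
    dots = []
    for _ in range(count):
        # Uniform-in-r sampling (not uniform-in-area) biases dots toward
        # the center: area density falls off with distance from `center`.
        r = rng.random() * radius
        theta = rng.random() * 2 * math.pi
        dots.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    return dots
```

Calling this each frame while the sound condition holds produces the spray-can feel described for the seventh aspect; a louder input widens the spray and adds more dots.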

In an eleventh aspect based on the seventh aspect, the drawing-related processing execution means executes, as the drawing-related processing, processing of moving the dots drawn on the display screen in a predetermined direction, based on the position based on the designated position and on the sound detected by the sound detection means.

In a twelfth aspect based on the eleventh aspect, the drawing-related processing execution means includes movement content calculation means for calculating: a direction of a line connecting each of the dots displayed on the display screen, with a reference point which is the position based on the designated position; and a distance from the reference point to each of the dots displayed on the display screen. In addition, the drawing-related processing execution means moves the dots displayed on the screen, based on the direction and the distance calculated by the movement content calculation means.

According to the eleventh and twelfth aspects, dots which have been already drawn can be moved by using a sound input, and thereby a novel way of enjoyment can be provided to the player.
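The twelfth aspect's movement content calculation means reduces to, for each dot, a direction along the line from the reference point and a move amount derived from its distance. The sketch below pushes dots outward with a strength that falls off with distance; that particular falloff formula is an assumption chosen for illustration.

```python
# Hedged sketch of the eleventh and twelfth aspects: dots already drawn on
# the screen are "blown" away from a reference point (the position based
# on the designated position) when a sound is input. The inverse-distance
# falloff is an assumed formula, not one stated in the patent.

import math


def blow_off(dots, reference, strength):
    """Move each dot outward along the line from the reference point.

    dots: list of (x, y) dot positions currently on the screen.
    reference: (x, y) reference point based on the designated position.
    strength: movement magnitude derived from the detected sound.
    """
    rx, ry = reference
    moved = []
    for x, y in dots:
        dx, dy = x - rx, y - ry
        dist = math.hypot(dx, dy)
        if dist == 0:
            moved.append((x, y))  # dot exactly at the reference: leave it
            continue
        # Movement content calculation means: direction (dx, dy) / dist and
        # distance dist; nearer dots are pushed farther than distant ones.
        amount = strength / (1.0 + dist)
        moved.append((x + dx / dist * amount, y + dy / dist * amount))
    return moved
```

Blowing into the microphone (raising `strength`) would then scatter a previously sprayed cluster radially outward from wherever the stylus points.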

In a thirteenth aspect based on the first aspect, the drawing processing program further causes the computer to function as sound effect reproduction means (S60) for causing predetermined sound output means to output a predetermined sound effect when the drawing-related processing execution means is executing the predetermined drawing-related processing.

In a fourteenth aspect based on the thirteenth aspect, the sound effect reproduction means changes a volume at which the sound effect is reproduced, in accordance with a characteristic of the sound detected by the sound detection means.

According to the thirteenth and fourteenth aspects, the player can intuitively recognize whether or not the drawing-related processing is being executed.

In a fifteenth aspect based on the first aspect, the drawing processing program further causes the computer to function as cursor display means (S57) and animation display means (S57). The cursor display means displays a predetermined cursor image at the designated position. The animation display means animates the cursor when the drawing-related processing execution means is executing the predetermined drawing-related processing.

In a sixteenth aspect based on the fifteenth aspect, the animation display means changes a speed of the animation in accordance with a characteristic of the sound detected by the sound detection means.

According to the fifteenth and sixteenth aspects, the player can visually recognize whether or not the drawing-related processing is being executed.

In a seventeenth aspect based on the first aspect, the pointing device is a touch panel.

According to the seventeenth aspect, an intuitive way of operation can be provided to the player.

In an eighteenth aspect based on the first aspect, the drawing processing program further causes the computer to function as shot image obtaining means (S1), and shot image display means (S21). The shot image obtaining means obtains image data of an image shot by predetermined shooting means. The shot image display means displays, on the display screen, the shot image. In addition, the drawing-related processing execution means executes the drawing-related processing on the shot image.

According to the eighteenth aspect, editing of an image with respect to a shot image, or the like can be provided together with a novel way of operation.

In a nineteenth aspect based on the first aspect, the drawing processing program further causes the computer to function as specified sound determination means for determining whether or not the sound detected by the sound detection means is a predetermined sound. In addition, the drawing-related processing execution means executes the drawing-related processing only when the specified sound determination means determines that the sound detected by the sound detection means is the predetermined sound.

According to the nineteenth aspect, it becomes possible to execute the drawing-related processing by identifying a specified sound such as a sound of the player blowing breath, and thereby a novel way of enjoyment can be provided to the player. In addition, it becomes possible to execute the drawing-related processing only when the player utters a specified sound, whereby the drawing-related processing can be prevented from being executed in response to a sound involuntarily uttered.
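The nineteenth aspect gates drawing on a specified sound such as blowing breath. As one hypothetical heuristic (the patent does not specify how the determination is made), breath into a microphone tends to be loud broadband noise, so volume can be combined with a zero-crossing-rate check; both thresholds below are assumptions.

```python
# Illustrative sketch of the nineteenth aspect's specified sound
# determination means. Blowing breath is approximated here as loud
# broadband noise: high volume plus a high zero-crossing rate. The
# heuristic and both thresholds are assumptions, not from the patent.


def zero_crossing_rate(samples):
    """Return the fraction of adjacent sample pairs that change sign."""
    if len(samples) < 2:
        return 0.0
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    return crossings / (len(samples) - 1)


def is_breath_sound(samples, volume, volume_threshold=0.3, zcr_threshold=0.2):
    """Return True when the input plausibly matches the specified sound."""
    return (
        volume >= volume_threshold
        and zero_crossing_rate(samples) >= zcr_threshold
    )
```

Used as a pre-filter before the drawing-related processing, this also illustrates the aspect's second benefit: a quiet or tonal sound uttered involuntarily fails the check, so it does not trigger drawing.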

A twentieth aspect is an information processing apparatus capable of using a pointing device (13) for designating a position on a display screen (12), and of using sound input means (42), the information processing apparatus comprising input coordinate detection means (31), sound detection means (31), and drawing-related processing execution means (31). The input coordinate detection means continuously obtains a designated position on the display screen, based on a designation performed by the pointing device. The sound detection means detects that a sound which satisfies a predetermined condition is inputted to the sound input means. The drawing-related processing execution means executes, while the sound detection means detects that the sound is inputted, predetermined drawing-related processing on a position based on the designated position.

According to the twentieth aspect, the same effect as that of the drawing processing program of the above aspects can be obtained.

In a twenty-first aspect based on the twentieth aspect, the sound detection means is placed in proximity of the display screen.

According to the twenty-first aspect, since the display screen and the sound detection means are placed at positions which are close to each other, it becomes possible to provide an intuitive presentation effect, for example, an effect in which, when a sound is uttered toward the screen, drawing is performed on the display screen in response to the sound.

According to the above aspects, a painting program and a painting game which allow the player to enjoy drawing through a novel way of operation can be provided.

These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is an external view of a hand-held game apparatus 10 according to one embodiment of the present invention;

FIG. 2 is a block diagram of the hand-held game apparatus 10 according to the one embodiment of the present invention;

FIG. 3 shows an example of a screen of a game assumed in the present embodiment;

FIG. 4 shows an example of a shot image;

FIG. 5 shows an example of the screen of the game assumed in the present embodiment;

FIG. 6 shows a relationship between the shot image and a canvas;

FIG. 7 shows an example of the screen of the game assumed in the present embodiment;

FIG. 8 is a drawing for illustrating an operation in the game assumed in the present embodiment;

FIG. 9 shows an example of an image drawn in the game assumed in the present embodiment;

FIG. 10 shows an example of the screen of the game assumed in the present embodiment;

FIG. 11 shows an example of the screen of the game assumed in the present embodiment;

FIG. 12 shows an example of the screen of the game assumed in the present embodiment;

FIG. 13 is an illustrative diagram showing a memory map of a main memory 32 shown in FIG. 2;

FIG. 14 shows an example of a data configuration of a drawing tool master 327;

FIG. 15 shows an example of a data configuration of a spray table 332;

FIG. 16 is a drawing for illustrating a drawing area;

FIG. 17 is a flowchart showing graffiti game processing according to the present embodiment of the present invention;

FIG. 18 is a flowchart showing the detail of the camera processing shown in step S1 in FIG. 17;

FIG. 19 is a flowchart showing the detail of graffiti processing shown in step S2 in FIG. 17;

FIG. 20 is a flowchart showing the detail of pen processing shown in step S29 in FIG. 19;

FIG. 21 is a flowchart showing the detail of spray drawing processing shown in step S43 in FIG. 20;

FIG. 22 is a flowchart showing the detail of eraser processing shown in step S31 in FIG. 19;

FIG. 23 is a flowchart showing the detail of spray eraser processing shown in step S73 in FIG. 22;

FIG. 24 is a drawing for illustrating the processing outline of blow-off processing;

FIG. 25 is a drawing for illustrating the processing outline of the blow-off processing;

FIG. 26 is a drawing for illustrating the processing outline of the blow-off processing;

FIG. 27 is a flowchart showing the detail of the blow-off processing; and

FIG. 28 shows an example of a spray line formed by a plurality of colors.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

The following will describe embodiments of the present invention with reference to the drawings. The present invention is not limited to the embodiments.

FIG. 1 is an external view of a game apparatus 1 which executes a drawing processing program according to the present invention. Here, as an example of the game apparatus 1, a hand-held game apparatus is shown. The game apparatus 1 has a camera, and thus functions as a shooting apparatus for shooting an image with the camera, displaying the shot image on a screen, and saving data of the shot image.

As shown in FIG. 1, the game apparatus 1 is a foldable hand-held game apparatus and is shown in a state (opened state) where the game apparatus 1 is opened. The game apparatus 1 is configured so as to have a size which allows a user to hold the game apparatus 1 with both hands or one hand even in the state where the game apparatus 1 is opened.

The game apparatus 1 includes a lower housing 11 and an upper housing 21. The lower housing 11 and the upper housing 21 are connected to each other such that the game apparatus 1 can be opened or closed (folded). In the example of FIG. 1, the lower housing 11 and the upper housing 21 are each formed in a plate-like shape of a horizontally long rectangle, and connected to each other rotatably around long-side portions thereof. Usually, the user uses the game apparatus 1 in the opened state. When not using the game apparatus 1, the user keeps the game apparatus 1 in a closed state. In the example shown in FIG. 1, in addition to the closed state and the opened state, the game apparatus 1 is capable of maintaining an angle between the lower housing 11 and the upper housing 21 at any angle ranging between the closed state and the opened state by frictional force generated at a connection portion, and the like. In other words, the upper housing 21 can be stationary at any angle with respect to the lower housing 11.

In the lower housing 11, a lower LCD (Liquid Crystal Display) 12 is provided. The lower LCD 12 has a horizontally long shape, and is located such that a long-side direction thereof corresponds to a long-side direction of the lower housing 11. Note that although an LCD is used as a display device provided in the game apparatus 1 in the present embodiment, any other display devices such as a display device using an EL (Electro Luminescence), and the like may be used. In addition, the game apparatus 1 can use a display device of any resolution. Note that although details will be described later, the lower LCD 12 is used mainly for displaying, in real time, an image to be shot by an inner camera 23 or an outer camera 25.

In the lower housing 11, operation buttons 14A to 14K, and a touch panel 13 are provided as input devices. As shown in FIG. 1, among the operation buttons 14A to 14K, the direction input button 14A, the operation button 14B, the operation button 14C, the operation button 14D, the operation button 14E, the power button 14F, the start button 14G, and the select button 14H are provided on an inner main surface of the lower housing 11 which is located inside when the upper housing 21 and the lower housing 11 are folded. The direction input button 14A is used, for example, for a selection operation, and the like. The operation buttons 14B to 14E are used, for example, for a determination operation, a cancellation operation, and the like. The power button 14F is used for turning on/off the power of the game apparatus 1. In the example shown in FIG. 1, the direction input button 14A and the power button 14F are provided on the inner main surface of the lower housing 11 and to one of the left and the right (to the left in FIG. 1) of the lower LCD 12 provided in the vicinity of a center of the inner main surface of the lower housing 11. Further, the operation buttons 14B to 14E, the start button 14G, and the select button 14H are provided on the inner main surface of the lower housing 11 and to the other one of the left and the right (to the right in FIG. 1) of the lower LCD 12. The direction input button 14A, the operation buttons 14B to 14E, the start button 14G, and the select button 14H are used for performing various operations with respect to the game apparatus 1.

Note that the operation buttons 14I to 14K are omitted in FIG. 1. For example, the L button 14I is provided at a left end portion of an upper side surface of the lower housing 11, and the R button 14J is provided at a right end portion of the upper side surface of the lower housing 11. The L button 14I and the R button 14J are used, for example, for performing a shooting instruction operation (shutter operation) with respect to the game apparatus 1. In addition, the volume button 14K is provided on a left side surface of the lower housing 11. The volume button 14K is used for adjusting the volume of speakers of the game apparatus 1.

In addition, the game apparatus 1 further includes the touch panel 13 as an input device other than the operation buttons 14A to 14K. The touch panel 13 is mounted on the lower LCD 12 so as to cover a screen of the lower LCD 12. Note that in the present embodiment, for example, a resistive film type touch panel is used as the touch panel 13. However, the touch panel 13 is not limited to the resistive film type, and any press-type touch panel may be used. In addition, in the present embodiment, the touch panel 13 having the same resolution (detection accuracy) as that of the lower LCD 12 is used, for example. However, the resolutions of the touch panel 13 and the lower LCD 12 may not necessarily be the same as each other. In addition, in a right side surface of the lower housing 11, an insertion opening (indicated by a dotted line in FIG. 1) is provided. The insertion opening is capable of accommodating a touch pen 27 which is used for performing an operation with respect to the touch panel 13. Note that although an input with respect to the touch panel 13 is usually performed by using the touch pen 27, a finger of the user, instead of the touch pen 27, can be used for operating the touch panel 13.

In the right side surface of the lower housing 11, an insertion opening (indicated by a two-dot chain line in FIG. 1) is formed for accommodating a memory card 28. Inside the insertion opening, a connector (not shown) is provided for electrically connecting the game apparatus 1 to the memory card 28. The memory card 28 is, for example, an SD (Secure Digital) memory card, and detachably mounted to the connector. The memory card 28 is used, for example, for storing (saving) an image shot by the game apparatus 1, and for loading an image generated by other apparatuses into the game apparatus 1.

Further, in the upper side surface of the lower housing 11, an insertion opening (indicated by a chain line in FIG. 1) is formed for accommodating a cartridge 29. Also inside the insertion opening, a connector (not shown) is provided for electrically connecting the game apparatus 1 to the cartridge 29. The cartridge 29 is a storage medium storing the drawing processing program, a game program, and the like, and is detachably mounted in the insertion opening provided in the lower housing 11.

Three LEDs 15A to 15C are mounted to a left side part of the connection portion where the lower housing 11 and the upper housing 21 are connected to each other. Here, the game apparatus 1 is capable of performing wireless communication with another apparatus, and the first LED 15A is lit up while the power of the game apparatus 1 is ON. The second LED 15B is lit up while the game apparatus 1 is being charged. The third LED 15C is lit up while wireless communication is established. Thus, the three LEDs 15A to 15C can notify the user of a state of ON/OFF of the power of the game apparatus 1, a state of charge of the game apparatus 1, and a state of communication establishment of the game apparatus 1.

On the other hand, in the upper housing 21, an upper LCD 22 is provided. The upper LCD 22 has a horizontally long shape, and is located such that a long side direction thereof corresponds to a long side direction of the upper housing 21. Note that similarly to the lower LCD 12, a display device which is of any other type or has any other resolution may be used instead of the upper LCD 22. Note that a touch panel may be provided so as to cover the upper LCD 22. For example, on the upper LCD 22, an operation illustration screen is displayed for teaching roles of the operation buttons 14A to 14K and the touch panel 13 to the user.

In addition, in the upper housing 21, two cameras (the inner camera 23 and the outer camera 25) are provided. As shown in FIG. 1, the inner camera 23 is mounted in an inner main surface of the upper housing 21 and at the vicinity of the connection portion. On the other hand, the outer camera 25 is mounted in a surface opposite to the inner main surface in which the inner camera 23 is mounted, namely, in an outer main surface of the upper housing 21 (which is a surface located on the outside of the game apparatus 1 in the closed state, and which is a back surface of the upper housing 21 shown in FIG. 1). Note that in FIG. 1, the outer camera 25 is indicated by a dashed line. Thus, the inner camera 23 is capable of shooting an image in a direction in which the inner main surface of the upper housing 21 faces, and the outer camera 25 is capable of shooting an image in a direction opposite to the shooting direction of the inner camera 23, namely, in a direction in which the outer main surface of the upper housing 21 faces. As described above, in the present embodiment, the two cameras 23 and 25 are provided such that the shooting directions thereof are opposite to each other. For example, the user can shoot a view seen from the game apparatus 1 toward the user with the inner camera 23 as well as a view seen from the game apparatus 1 in a direction opposite to a direction toward the user with the outer camera 25.

Note that in the inner main surface of the upper housing 21 and at the vicinity of the connection portion, a microphone (a microphone 42 shown in FIG. 2) is accommodated as a sound input device. In the inner main surface of the upper housing 21 and at the vicinity of the connection portion, a microphone hole 16 is formed to allow the microphone 42 to detect a sound from outside the game apparatus 1. The position at which the microphone 42 is accommodated and the position of the microphone hole 16 are not necessarily in the connection portion, and, for example, the microphone 42 may be accommodated in the lower housing 11 and the microphone hole 16 may be formed in the lower housing 11 so as to correspond to the accommodating position of the microphone 42.

In addition, in the outer main surface of the upper housing 21, a fourth LED 26 (indicated by a dashed line in FIG. 1) is mounted. The fourth LED 26 is lit up at a time when shooting is performed by the outer camera 25 (when the shutter button is pressed). In addition, the fourth LED 26 is lit up while a moving picture is shot by the outer camera 25. The fourth LED 26 can notify a person to be shot and people around the person that shooting has been performed (is being performed) by the game apparatus 1.

In addition, sound holes 24 are formed in the inner main surface of the upper housing 21 and to the left and right of the upper LCD 22 provided in the vicinity of a center of the inner main surface of the upper housing 21. The speakers are accommodated in the upper housing 21 and at the back of the sound holes 24. The sound holes 24 are holes for releasing a sound from the speakers to the outside of the game apparatus 1 therethrough.

As described above, the inner camera 23 and the outer camera 25 which are configurations for shooting an image, and the upper LCD 22 which is display means for displaying, for example, an operation illustration screen upon shooting are provided in the upper housing 21. On the other hand, the input devices for performing an operation input with respect to the game apparatus 1 (the touch panel 13 and the buttons 14A to 14K), and the lower LCD 12 which is display means for displaying a game screen are provided in the lower housing 11. Thus, when using the game apparatus 1, the user can hold the lower housing 11 and perform an input with respect to the input device while viewing a shot image (an image shot by the camera) displayed on the lower LCD 12.

The following will describe an internal configuration of the game apparatus 1 with reference to FIG. 2. Note that FIG. 2 is a block diagram showing an example of the internal configuration of the game apparatus 1.

As shown in FIG. 2, the game apparatus 1 includes electronic components including a CPU 31, a main memory 32, a memory control circuit 33, a storage data memory 34, a preset data memory 35, a memory card interface (memory card I/F) 36, a cartridge I/F 44, a wireless communication module 37, a local communication module 38, a real time clock (RTC) 39, a power circuit 40, an interface circuit (I/F circuit) 41, and the like. These electronic components are mounted on an electronic circuit substrate and accommodated in the lower housing 11 (or may be accommodated in the upper housing 21).

The CPU 31 is information processing means for executing a predetermined program. In the present embodiment, the predetermined program is stored in a memory (e.g., the storage data memory 34) in the game apparatus 1, or in the memory card 28 and/or the cartridge 29, and the CPU 31 executes later-described graffiti processing by executing the predetermined program. Note that a program executed by the CPU 31 may be stored in advance in a memory in the game apparatus 1, may be obtained from the memory card 28 and/or the cartridge 29, or may be obtained from another apparatus by means of communication with the other apparatus. For example, the program may be downloaded and obtained from a predetermined server via the Internet, or a predetermined program stored in a stationary game apparatus may be downloaded and obtained by performing communication with the stationary game apparatus.

The main memory 32, the memory control circuit 33, and the preset data memory 35 are connected to the CPU 31. In addition, the storage data memory 34 is connected to the memory control circuit 33. The main memory 32 is storage means used as a work area and a buffer area of the CPU 31. In other words, the main memory 32 stores various data used in the graffiti processing, and stores a program obtained from the outside (the memory cards 28 and 29, another apparatus, or the like). In the present embodiment, for example, a PSRAM (Pseudo-SRAM) is used as the main memory 32. The storage data memory 34 is storage means for storing a program executed by the CPU 31, data of an image shot by the inner camera 23 and the outer camera 25, and the like. The storage data memory 34 is constructed of a nonvolatile storage medium, which is, in the present embodiment, a NAND flash memory, for example. The memory control circuit 33 is a circuit for controlling reading of data from the storage data memory 34 or writing of data to the storage data memory 34 in accordance with an instruction from the CPU 31. The preset data memory 35 is storage means for storing data (preset data) of various parameters which are set in advance in the game apparatus 1, and the like. A flash memory connected to the CPU 31 via an SPI (Serial Peripheral Interface) bus can be used as the preset data memory 35.

The memory card I/F 36 is connected to the CPU 31. The memory card I/F 36 reads data from the memory card 28 mounted to the connector or writes data to the memory card 28, in accordance with an instruction from the CPU 31. In the present embodiment, data of images shot by the outer camera 25 is written to the memory card 28, and image data stored in the memory card 28 is read from the memory card 28 to be stored in the storage data memory 34.

The cartridge I/F 44 is connected to the CPU 31. The cartridge I/F 44 reads out data from the cartridge 29 mounted to the connector or writes data to the cartridge 29, in accordance with an instruction from the CPU 31. In the present embodiment, an application program which can be executed by the game apparatus 1 is read out from the cartridge 29 to be executed by the CPU 31, and data associated with the application program (e.g., saved data in a game) is written to the cartridge 29.

Note that the graffiti game program according to the present invention may be supplied to a computer system not only from an external storage medium such as the cartridge 29, but also via a wired or wireless communication line. In addition, the graffiti game program may be stored in advance in a nonvolatile storage unit in the computer system. Note that an information storage medium for storing the graffiti game program is not limited to the above nonvolatile storage unit, and may be a CD-ROM, a DVD, or a similar optical disc-shaped storage medium.

The wireless communication module 37 has a function of connecting to a wireless LAN, for example, by a method conforming to the IEEE 802.11b/g standard. The local communication module 38 has a function of wirelessly communicating with a game apparatus of the same type by a predetermined communication method. The wireless communication module 37 and the local communication module 38 are connected to the CPU 31. The CPU 31 is capable of receiving data from and transmitting data to another apparatus via the Internet by using the wireless communication module 37, and capable of receiving data from and transmitting data to another game apparatus of the same type by using the local communication module 38.

In addition, the RTC 39 and the power circuit 40 are connected to the CPU 31. The RTC 39 counts time, and outputs the time to the CPU 31. For example, the CPU 31 is capable of calculating current time (date) and the like, based on the time counted by the RTC 39. The power circuit 40 controls electric power from a power supply (typically, a battery accommodated in the lower housing 11) of the game apparatus 1 to supply the electric power to each electronic component of the game apparatus 1.

In addition, the game apparatus 1 includes the microphone 42 and an amplifier 43. The microphone 42 and the amplifier 43 are connected to the I/F circuit 41. The microphone 42 detects a voice produced by the user toward the game apparatus 1, and outputs a sound signal indicating the voice to the I/F circuit 41. The amplifier 43 amplifies the sound signal from the I/F circuit 41, and causes the speakers (not shown) to output the sound signal. The I/F circuit 41 is connected to the CPU 31.

In addition, the touch panel 13 is connected to the I/F circuit 41. The I/F circuit 41 includes a sound control circuit for controlling the microphone 42 and the amplifier 43 (the speakers), and a touch panel control circuit for controlling the touch panel 13. The sound control circuit performs A/D conversion or D/A conversion with respect to the sound signal, and converts the sound signal into sound data in a predetermined format. The touch panel control circuit generates touched position data in a predetermined format, based on a signal from the touch panel 13, and outputs the touched position data to the CPU 31. For example, the touched position data is data indicating the coordinates of a position at which an input is performed with respect to an input surface of the touch panel 13. Note that the touch panel control circuit reads a signal from the touch panel 13 and generates touched position data once every predetermined time period. The CPU 31 is capable of recognizing a position at which an input is performed with respect to the touch panel 13 by obtaining the touched position data via the I/F circuit 41.
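
By way of illustration, the conversion performed by the touch panel control circuit can be sketched as follows in Python. The data format and field names below are assumptions made for illustration, not the predetermined format used by the embodiment.

```python
# Illustrative sketch (not the actual circuit behavior): once per sampling
# period, a raw panel sample is converted into touched position data that
# either carries the touched coordinates or marks that no input is performed.

def make_touched_position_data(raw_sample):
    """Convert a raw panel sample into touched position data.

    raw_sample: an (x, y) tuple while the panel is touched, or None
    when no input is being performed. The dict format is hypothetical.
    """
    if raw_sample is None:
        return {"touched": False, "x": None, "y": None}
    x, y = raw_sample
    return {"touched": True, "x": x, "y": y}
```

The CPU-side processing would then read such records each period to recognize the input position.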

An operation button 14 includes the above operation buttons 14A to 14K, and is connected to the CPU 31. The operation button 14 outputs, to the CPU 31, operation data indicating an input state with respect to each of the buttons 14A to 14K (whether or not each button is pressed). The CPU 31 obtains the operation data from the operation button 14, and executes processing in accordance with an input with respect to the operation button 14.

The inner camera 23 and the outer camera 25 are connected to the CPU 31. Each of the inner camera 23 and the outer camera 25 shoots an image in accordance with an instruction from the CPU 31, and outputs data of the shot image to the CPU 31. In the present embodiment, the CPU 31 gives a shooting instruction to the inner camera 23 or the outer camera 25, and the camera which has received the shooting instruction shoots an image and transmits image data to the CPU 31.

In addition, the lower LCD 12 and the upper LCD 22 are connected to the CPU 31. Each of the lower LCD 12 and the upper LCD 22 displays an image thereon in accordance with an instruction from the CPU 31.

Next, referring to FIG. 3 to FIG. 12, the outline of an application assumed in the present embodiment will be described. The application assumed in the present embodiment is a painting application which allows the player to draw a picture by using the touch pen 27. FIG. 3 shows an example of a screen of the application. As shown in FIG. 3, a game screen is displayed on the lower LCD 12, a toolbar 103 is displayed at the top of the game screen, and a canvas 101, which covers most of the game screen, is displayed under the toolbar 103. On the toolbar 103, a drawing tool icon 111, a line-type icon 112, and the like are displayed. In addition, FIG. 3 shows a state in which the canvas 101 is being touched by the touch pen 27, and a cursor 102 is displayed at the touched position on the canvas 101. In the application, the player can draw a picture on the canvas 101 by moving the touch pen 27 on the canvas 101, and the present invention provides a novel manner of performing the drawing operation, as described later.

In addition, in the application, the player can enjoy drawing graffiti on an image shot by the outer camera (or image shot by the inner camera 23) of the game apparatus 1. For example, when an image shown in FIG. 4 is shot by the outer camera 25, the shot image is placed as a “base picture” in the area of the canvas 101 so as to overlap with the canvas 101, whereby the player can enjoy drawing graffiti on the shot image as shown in FIG. 5.

FIG. 6 is a schematic view showing a positional relationship between the shot image and the canvas 101. The concept of the application is that two layers of a base-picture layer 105 and a handwriting layer 106 are used, and the shot image corresponds to the base-picture layer 105. The canvas 101 corresponds to the handwriting layer 106. The handwriting layer 106 is, as it were, a transparent sheet, and conceptually, processing in which the transparent sheet is overlapped on the shot image and graffiti is drawn on the sheet is performed. In other words, processing in which the shot image is directly edited (graffiti is directly drawn on the shot image) is not performed in the present embodiment.
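
The two-layer concept can be sketched as follows (an illustrative Python sketch; the pixel representation and function names are assumptions): pixels of the handwriting layer that hold no drawn content let the base-picture layer show through, and the shot image itself is never modified.

```python
# Minimal sketch of compositing the handwriting layer 106 over the
# base-picture layer 105. None in the handwriting layer stands for a
# "transparent" pixel through which the base picture is visible.

def composite(base_layer, handwriting_layer):
    """Return the displayed image; the base layer is never modified."""
    return [
        [hw if hw is not None else base
         for base, hw in zip(base_row, hw_row)]
        for base_row, hw_row in zip(base_layer, handwriting_layer)
    ]

base = [["B", "B"], ["B", "B"]]       # shot image (base picture)
sheet = [[None, "R"], [None, None]]   # graffiti drawn on the transparent sheet
shown = composite(base, sheet)        # base shows through except where drawn
```

Erasing then only needs to clear pixels of the handwriting layer; the base picture remains intact.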

Next, a drawing operation with respect to the canvas 101 in the application will be described. As described above, in the application, a picture can be drawn by moving the touch pen 27 on the canvas 101. Here, in the application, two types of drawing tools, i.e., a “pen” and an “eraser”, can be used for drawing. The “pen” is a tool for drawing something on the canvas 101, and the “eraser” is a tool for erasing a content drawn on the canvas 101. Upon using the “pen” or the “eraser”, a line of uniform thickness, or a “spray”, can be selected as a type (line type) of a drawn line. In the application, upon selection of the drawing tools, the “pen” or the “eraser” can be selected by operating the drawing tool icon 111 on the toolbar 103. The line type of the selected tool can be selected by operating the line-type icon 112. Specifically, use of the line of uniform thickness or use of the “spray” can be selected. At this time, the thickness of the line of uniform thickness can also be designated, and of the four icons of the line-type icon 112, the left three icons represent the respective thicknesses. In addition, of the four icons, the rightmost icon, on which a picture of a propeller is displayed, represents the “spray”. In addition, when the “pen” is selected as a drawing tool, the drawing color (color of the line or the spray) can also be designated.

Next, an operation performed when the “pen” is used as a drawing tool will be described. When the “pen” is selected and the line of uniform thickness is selected as the line-type, the line of uniform thickness can be drawn at a touched position as shown in FIG. 3 and FIG. 5. Here, in the case of the pen, a line is drawn at the same time as touch is performed. Specifically, after the player operates the toolbar to select the “pen” and select the line of uniform thickness as the line-type, when the player brings the touch pen 27 into contact with the canvas 101 (touch panel 13), a touch input is detected, and at the same time, a dot (when the touch pen 27 is not being moved) or a line segment (when the touch pen 27 is being moved on the canvas 101) is drawn at the touched position.

On the other hand, when the “spray” is selected as the line-type, although a “spray line” as described later can be drawn at a touched position, the drawing is not performed by only touching the touch panel, unlike in the case of using the line of uniform thickness. Hereinafter, referring to FIG. 7 to FIG. 10, a drawing operation performed when the “spray” is selected as the line type will be described. First, the player operates the toolbar 103 to select the “pen” as a drawing tool. Specifically, every time the player touches the drawing tool icon 111, the drawing tool switches to “pen”→“eraser”→“pen” . . . (at this time, an image content of the drawing tool icon 111 also switches between an image of a pen tip and an image of an eraser). Then, after selecting the “pen”, the player touches the rightmost icon of a propeller image among the four icons of the line-type icon 112, and thereby can select the “spray” as the line-type. Thereafter, when the player touches a desired position on the canvas 101, the cursor 102 whose image represents a propeller is displayed at the touched position as shown in FIG. 7. However, as of this moment, nothing is drawn on the canvas 101 (in the case of using the line of uniform thickness, at least a “dot” is drawn at the touched position as of this moment). Therefore, in this state, even if the player moves the touch pen 27 while touching the canvas 101 with the touch pen 27, nothing is drawn on the canvas 101.

In this state, in order to perform drawing on the canvas 101, the player performs an operation of blowing breath on the cursor 102, as shown in FIG. 8. Then, an animation of a rotating propeller is displayed as the cursor 102, and an image which looks like sprayed ink (an image made of a collection of multiple dots) is displayed at the touched position, as shown in FIG. 9. Then, by moving the touch pen 27 while blowing breath on the cursor 102, a line made of a collection of multiple dots (hereinafter referred to as a spray line), which looks as if it were drawn with a spray, is drawn along the trajectory obtained by moving the touch pen 27, as shown in FIG. 10. Thus, in the case of the “spray”, an operation of blowing breath on the propeller-type cursor 102 is performed, and drawing can be performed as if ink were sprayed on the canvas 101.

Moreover, in the application, the thickness and the density of the spray line change in accordance with the strength at which the player blows breath. For example, when the player blows weakly, a spray line which is thin and dilute (has a reduced number of dots) as shown in FIG. 11 can be drawn. When the player blows strongly, a spray line which is thick and dense (has an increased number of dots) as shown in FIG. 12 can be drawn. In addition, how strongly the player blows is reflected in the spray line in real time. For example, when the player desires to draw one spray line, if the player blows strongly at the beginning of the drawing and then gradually decreases the strength of the blowing, the spray line is drawn such that an increased number of dots are drawn when the drawing begins, as shown in FIG. 12, and the number of dots gradually decreases with the progression of the drawing. In addition, in the application, when such a spray line is drawn, a spraying sound is reproduced as a sound effect.

Here, the outline (principle) of drawing processing of the spray line performed in the present embodiment will be described. As shown in FIG. 8, when the player blows on the cursor 102 (touch panel 13) while touching the canvas 101 with the touch pen 27, a sound produced by the player blowing breath is inputted to the microphone 42. In the application, the volume of a sound (hereinafter, referred to as microphone input sound) inputted to the microphone 42 is detected, the thickness and the like of the spray line are determined in accordance with the magnitude of the detected volume, and the spray line is drawn on the canvas 101. That is, in the present embodiment, in the case of using the “spray”, when the two types of inputs, i.e., a “touch input” to the touch panel 13 and a “sound input” to the microphone 42 are performed at the same time, the drawing processing of the spray line is executed. In addition, the volume of the sound effect reproduced when the spray line is drawn is varied in accordance with the magnitude of the detected microphone input sound.
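
The combined-input condition described above can be sketched as follows. This is an illustrative sketch only; the threshold value and names are assumptions, not values defined in the embodiment.

```python
# Sketch of the principle: spray-line drawing is executed only while BOTH a
# touch input to the touch panel and a microphone input of sufficient volume
# are detected at the same time.

BLOW_THRESHOLD = 10  # hypothetical minimum volume on a 0-100 scale


def should_draw_spray(touch_position, mic_volume):
    """True only when the canvas is touched AND breath is blown loudly enough."""
    return touch_position is not None and mic_volume > BLOW_THRESHOLD
```

Each frame, the drawing processing would evaluate this condition before looking up the spray parameters from the detected volume.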

Next, of the drawing tools described above, the “eraser” will be described. The case where the player operates the toolbar 103 to select the eraser as the drawing tool, and to select the line of uniform thickness as the “line type”, will be described. In this case, when the player touches a position on the canvas 101, the cursor 102 which is of eraser type is displayed. An operation performed in this case conforms with that performed in the case where the “pen” is selected and the line of uniform thickness is selected, and a line (uniform line and spray line) drawn at the touched position can be erased. Next, the case where the “spray” is selected as the “line-type” will be described. In this case, when the player touches a position on the canvas 101, the cursor 102 which is of propeller type is displayed as in the case where the “pen” is selected and the “spray” is selected as the line type. Then, when the player blows on the cursor 102, the spray line or the line of uniform thickness drawn within a predetermined range can be erased in accordance with the strength (that is, the magnitude of the microphone input sound) at which the player blows breath, and the touched position.

Thus, in the present embodiment, processing is performed such that drawing on the canvas 101 can be performed only after a touch input and an operation (sound input to the microphone 42) of blowing breath are performed, as in the case of the “spray”. Thus, it becomes possible to provide a drawing application having a nonconventional and novel way of operation.

Next, the detail of application processing performed by the game apparatus 1 will be described. Firstly, data which is stored in the main memory 32 when the application processing is performed will be described. FIG. 13 is an illustrative diagram showing a memory map of the main memory 32 shown in FIG. 2. Referring to FIG. 13, the main memory 32 includes a program storage area 321 and a data storage area 325. Data in the program storage area 321 and a part of data in the data storage area 325 are obtained by copying, onto the main memory 32, data stored in advance in a ROM of the cartridge 29. Note that the programs and the data may be stored in, for example, the storage data memory 34 instead of the cartridge 29, and may be copied from the storage data memory 34 onto the main memory 32 when the programs are executed. In the present embodiment, the latest input coordinate and the input coordinate just prior to the latest input coordinate can be saved as touched position data 3262. The game apparatus 1 repeatedly detects an input to the touch panel 13 at intervals of a unit of time. When an input is detected, data which has been saved as the latest input coordinate is saved as the input coordinate just prior to the latest input coordinate. When the player is touching the touch panel 13, data indicating the coordinates of the touched position is saved as the latest input coordinate in the touched position data 3262. When the player is not touching the touch panel 13, data indicating NULL is saved as the latest input coordinate in the touched position data 3262.
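
The shifting of the latest and just-prior input coordinates can be sketched as follows (illustrative Python; `None` stands in for NULL, and the class and field names are assumptions).

```python
# Sketch of the touched-position history: each sampling period the previous
# "latest" coordinate is shifted into the just-prior slot, and the new sample
# (or None, i.e. NULL, when the panel is not touched) becomes the latest.

class TouchedPositionData:
    def __init__(self):
        self.latest = None   # None (NULL): the player is not touching
        self.prior = None    # input coordinate just prior to the latest

    def update(self, sample):
        """sample is an (x, y) tuple while touching, else None."""
        self.prior = self.latest
        self.latest = sample
```

Keeping both coordinates allows a line segment to be drawn between consecutive samples while the touch pen is moved.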

The program storage area 321 stores a program which is to be executed by the CPU 31 and which includes a main processing program 322, a camera processing program 323, a graffiti processing program 324, and the like.

The main processing program 322 is a program corresponding to processing shown by a flowchart in FIG. 17 described later. The camera processing program 323 is a program for causing the CPU 31 to execute processing for obtaining a shot image by using the outer camera 25, and the graffiti processing program 324 is a program for causing the CPU 31 to execute the processing, described referring to FIG. 5 and the like, for drawing graffiti on the shot image.

The data storage area 325 stores operation data 326, a drawing tool master 327, drawing color data 328, shot image data 329, current tool data 330, sound effect data 331, a spray table 332, sound characteristic data 333, and the like.

The operation data 326 is data indicating a content of an operation performed by the player with respect to the game apparatus 1, and includes the operation button data 3261 and the touched position data 3262. The operation button data 3261 is data indicating an input state of each of the operation buttons 14A to 14K. The touched position data 3262 is data indicating coordinate (input coordinate) of a touched position inputted to the touch panel 13. In the present embodiment, while the player is touching the touch panel 13, the input coordinate is repeatedly obtained and saved as the touched position data 3262. Note that in the present embodiment, it is possible to save the latest input coordinate and input coordinate just prior to the latest input coordinate as the touched position data 3262.

The drawing tool master 327 is a table associated with the drawing tools described above. FIG. 14 shows an example of a data configuration of the drawing tool master 327. The drawing tool master 327 shown in FIG. 14 includes the type 3271, the line type 3272, and the cursor image data 3273. The type 3271 is data indicating the drawing tools, which are, in the present embodiment, the “pen” and the “eraser”. The line type 3272 is data indicating the line types, which are, in the present embodiment, a line (hereinafter, referred to as uniform line) of uniform thickness, and the “spray”. The cursor image data 3273 is image data to be displayed as the cursor 102. In the present embodiment, when the type 3271 is the “pen” and the line type 3272 is the “uniform line”, image data indicating an image of a pen tip is stored as the cursor image data 3273. When the type 3271 is the “pen” and the line type 3272 is the “spray”, image data of a propeller as described above is stored as the cursor image data 3273. When the type 3271 is the “eraser” and the line type 3272 is the “uniform line”, image data of an eraser is stored as the cursor image data 3273, and when the line type 3272 is the “spray”, an image of a propeller is stored as the cursor image data 3273.
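
The lookup performed on the drawing tool master can be sketched as a simple keyed table (illustrative Python; the image identifiers are placeholders standing in for the stored cursor image data, not actual asset names).

```python
# Sketch of the drawing tool master: the pair (type, line type) selects the
# cursor image data to be displayed as the cursor 102.

DRAWING_TOOL_MASTER = {
    ("pen", "uniform line"): "pen_tip_image",
    ("pen", "spray"): "propeller_image",
    ("eraser", "uniform line"): "eraser_image",
    ("eraser", "spray"): "propeller_image",
}


def cursor_image(tool_type, line_type):
    """Look up the cursor image for the currently selected drawing tool."""
    return DRAWING_TOOL_MASTER[(tool_type, line_type)]
```
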

The drawing color data 328 is data indicating the color of a line or the like drawn on the canvas 101 when the type of the drawing tool is the “pen”. The shot image data 329 is data indicating an image shot by the outer camera 25. The current tool data 330 is data indicating the type of the drawing tool (pen or eraser) currently selected and the line type (uniform line or spray). The sound effect data 331 is data of a sound effect to be reproduced upon drawing.

The spray table 332 is a table which associates, with the volume of the above-described microphone input sound, the size of an area in which drawing is performed and the number of dots to be drawn. FIG. 15 shows an example of a data configuration of the spray table 332. The spray table shown in FIG. 15 includes a volume 3321, an area size 3322, and a dot number 3323. The volume 3321 indicates a range of magnitudes of the volume of the microphone input sound. Note that in the present embodiment, the magnitude of the volume is represented as a value from 0 to 100. The area size 3322 indicates the size of the drawing area in which the spray line is drawn by performing the drawing processing once. In the present embodiment, the shape of the drawing area is circular, and a value indicating the radius of the drawing area is defined as a value of the area size 3322. The dot number 3323 defines the number of dots to be drawn in the drawing area. As shown by an example in FIG. 15, for example, when the volume 3321 indicates “11 to 30”, dots whose number is indicated by the dot number 3323 are drawn in a circular area 201 having a size shown in FIG. 16(a). When the volume 3321 indicates “31 to 50”, an increased number of dots are drawn in the circular area 201 having an increased size, as shown in FIG. 16(b) (note that the dotted lines indicating the circular area 201 in FIG. 16 are drawn merely as a matter of convenience, and are not displayed on the screen).
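
The use of the spray table can be sketched as follows. The volume ranges, radii, and dot numbers below are illustrative assumptions and are not the values defined in FIG. 15.

```python
# Sketch of the spray table: the microphone volume (0-100) selects a circular
# drawing-area radius and a dot count, and that many dots are scattered
# uniformly over the circular area centered on the touched position.

import math
import random

SPRAY_TABLE = [          # (volume upper bound inclusive, radius, dot number)
    (10, 0, 0),          # too quiet: nothing is drawn
    (30, 8, 10),
    (50, 12, 25),
    (100, 16, 45),
]


def spray_params(volume):
    """Return (radius, dot number) for the given microphone volume."""
    for upper, radius, dots in SPRAY_TABLE:
        if volume <= upper:
            return radius, dots
    raise ValueError("volume out of range")


def spray_dots(center, volume, rng=random.random):
    """Scatter the looked-up number of dots inside the circular drawing area."""
    radius, count = spray_params(volume)
    dots = []
    for _ in range(count):
        r = radius * math.sqrt(rng())        # sqrt gives uniform density
        theta = 2 * math.pi * rng()
        dots.append((center[0] + r * math.cos(theta),
                     center[1] + r * math.sin(theta)))
    return dots
```

Because the table is consulted each time the drawing processing is performed, a change in blowing strength is reflected in the spray line in real time, as described above.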

Referring to FIG. 13 again, the sound characteristic data 333 is data indicating a characteristic of a sound inputted to the microphone 42, and specifically, is data indicating the volume, frequency, tone, and the like. Note that in the present embodiment, data indicating the volume of the microphone input sound is stored as the sound characteristic data 333.

In processing described later, the data storage area 325 stores, in addition to the above-described data, various flags such as a reproduction flag used for indicating whether or not reproduction of a sound effect is being performed, various image data, and the like.

Next, a flow of application processing (hereinafter, referred to as graffiti game processing) executed by the game apparatus 1 will be described referring to FIG. 17 to FIG. 21. FIG. 17 is a flowchart showing the flow of the graffiti game processing executed by the game apparatus 1. When the game apparatus 1 is powered on, the CPU 31 of the game apparatus 1 executes a starting program stored in a boot ROM which is not shown, thereby initializing each unit such as the main memory 32. Then, a game program stored in the cartridge 29 is read into the main memory 32, thereby starting execution of the application program. As a result, a game image is displayed on the lower LCD 12, and thereby the application is started.

Referring to FIG. 17, first, the CPU 31 displays an inquiry screen for inquiring whether or not to execute camera processing (step S1). That is, in the application processing, it is also possible to execute the graffiti processing described later without performing shooting by using the camera. In other words, it is also possible to perform drawing processing with respect to the canvas 101 which is blank, as shown in FIG. 3, without using the base picture shown in FIG. 5 and the like.

Next, the CPU 31 obtains the operation data 326 from the main memory 32 (step S2). Next, it is determined whether or not a content indicated by the operation data is an instruction of executing camera processing (step S3). As a result, if the content is not an instruction of executing camera processing (NO in step S3), the CPU 31 proceeds to processing in step S5 described later. On the other hand, if the content is an instruction of executing camera processing (YES in step S3), the CPU 31 executes camera processing (step S4). In the camera processing, processing for shooting an image which is to be used as the base picture by using the outer camera 25 and saving the shot image is executed. Next, the CPU 31 executes the graffiti processing (step S5). In the graffiti processing, processing for displaying the screen as shown in FIG. 5 and enabling graffiti to be drawn on the image shot by the camera is executed.

FIG. 18 is a flowchart showing the detail of the camera processing shown in step S4. As shown in FIG. 18, first, the CPU 31 performs initialization processing (step S11). In the initialization processing, various parameters (shooting magnification, exposure time, and the like) for shooting which are defined as initial values in advance are set.

Next, the CPU 31 displays, on the lower LCD 12, a video being shot by the outer camera 25 (step S12).

Next, the CPU 31 obtains the operation data 326 from the main memory 32 (step S13). Thereafter, the CPU 31 determines whether or not a content of an operation performed by the player which is indicated by the operation data 326 indicates that the shutter button is pressed down (step S14). As a result of the determination, if the shutter is pressed down (YES in step S14), the CPU 31 performs processing of storing an image shot by the outer camera 25. That is, the image shot by the outer camera 25 is stored as the shot image data 329 in the main memory 32 (step S15). Thereafter, the CPU 31 returns to the processing in step S12 to repeat the processing.

On the other hand, as a result of the determination in step S14, if the shutter is not pressed (NO in step S14), next, the CPU 31 determines whether or not a content of an operation indicated by the operation data 326 is an operation of an instruction of ending the camera processing (step S16). As a result, if the content is an instruction of ending the camera processing (YES in step S16), the CPU 31 ends the camera processing. On the other hand, if the content is not an instruction of ending the camera processing (NO in step S16), the CPU 31 executes other processing based on the operation data 326 (step S17). For example, the CPU 31 executes setting of control of zoom magnification, exposure control, or the like. Thereafter, the CPU 31 returns to step S12 to repeat processing therefrom. Description of the camera processing is finished here.

Next, the graffiti processing shown in step S5 will be described. FIG. 19 is a flowchart showing the detail of the graffiti processing shown in step S5. As shown in FIG. 19, first, the CPU 31 executes initial processing with respect to the graffiti processing (step S21). Specifically, the CPU 31 generates and displays a game screen as shown in FIG. 3 and the like. In addition, the CPU 31 reads out the shot image data 329 from the main memory 32, and displays, as a “base picture”, a shot image which has been shot through the camera processing, on the canvas 101. At this time, if the camera processing has not been performed, nothing is stored in the shot image data 329, and therefore, in this case, the CPU 31 displays nothing on the canvas 101. That is, the canvas 101 which is blank is displayed. In addition, the CPU 31 sets, as an initial value of the current tool data 330, data indicating that the drawing tool is the “pen” and the line type is the “uniform line”. That is, at the start of the graffiti processing, the “pen” whose line-type is the “uniform line” is selected as the drawing tool.

When the initial processing is finished, next, the CPU 31 obtains the operation data 326 from the main memory 32 (step S22). Thereafter, the CPU 31 determines whether or not a content of an operation indicated by the operation data 326 is an instruction of ending the graffiti processing (step S23). As a result of the determination, if the content is an instruction of ending the graffiti processing (YES in step S23), the CPU 31 ends the graffiti processing.

On the other hand, if the content is not an instruction of ending the graffiti processing (NO in step S23), next, the CPU 31 determines whether or not the content of the operation is an operation of selecting the type of the drawing tool (step S24). As a result, if the content is an operation of selecting the drawing tool (YES in step S24), processing of selecting the drawing tool is executed based on the content of the operation data 326 (step S25). Here, an example of the selection operation will be described. First, the player touches the drawing tool icon 111 on the screen as shown in FIG. 3 and the like. Every time this operation of touching the drawing tool icon 111 is detected, the CPU 31 alternately sets the “pen” and the “eraser” as the drawing tool in the current tool data 330. That is, every time the player touches the drawing tool icon 111, the drawing tool switches between the “pen” and the “eraser”. Moreover, the player touches one of the four icons of the line-type icon 112. When the CPU 31 detects the touched icon (more accurately, the coordinates at which the icon is displayed), the line type corresponding to a content of the icon is set in the current tool data 330. When the rightmost icon is touched among the four icons, the “spray” is set as the line type in the current tool data 330. On the other hand, when one of the other icons is touched, the “uniform line” is set as the line type, and data indicating the thickness of the line is also set in the current tool data 330 in accordance with the touched icon. Referring to the example in FIG. 3, one of three thicknesses of lines can be selected. The leftmost icon indicates the thinnest uniform line, and the third icon from the left indicates the thickest uniform line. Thus, the CPU 31 executes processing of setting data indicating the drawing tool in the current tool data 330 based on the operation data, and thereafter, the CPU 31 returns to step S22 to repeat processing therefrom.
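
The selection processing of step S25 can be sketched as follows (illustrative Python; function names and the returned data shape are assumptions, not the actual structure of the current tool data 330).

```python
# Sketch of step S25: each touch of the drawing tool icon toggles the tool
# between "pen" and "eraser", and touching one of the four line-type icons
# (index 3 being the propeller icon) sets the line type.

def toggle_tool(current_tool):
    """Alternate the drawing tool on each touch of the drawing tool icon."""
    return "eraser" if current_tool == "pen" else "pen"


def select_line_type(icon_index):
    """Icons 0-2 select a uniform line of some thickness; icon 3 selects
    the spray. The dict shape is hypothetical."""
    if icon_index == 3:
        return {"line_type": "spray"}
    return {"line_type": "uniform line", "thickness_icon": icon_index}
```
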

On the other hand, as a result of the determination in step S24, if the content of an operation is not an operation of selecting the drawing tool (NO in step S24), next, the CPU 31 refers to the operation data 326 and thereby determines whether or not a touch input (more accurately, touch input to an area, of the touch panel 13, corresponding to an area in which the canvas 101 is displayed) to the canvas 101 is being performed (step S26). Specifically, the CPU 31 refers to the latest input coordinate stored in the touched position data 3262, and determines whether or not the latest input coordinate is in the area in which the canvas 101 is displayed. As a result, if a touch input to the canvas 101 is being performed (YES in step S26), the CPU 31 executes processing of displaying the cursor 102 at the position where the touch input is being performed (step S27). More specifically, first, the CPU 31 refers to the drawing tool master 327 and obtains a piece of the cursor image data 3272 which corresponds to the drawing tool currently selected. Then, the CPU 31 refers to the touched position data 3262 and displays, as the cursor 102, an image based on the piece of the cursor image data 3272 at the position where the touch input is being performed.

Subsequently to the display of the cursor 102, the CPU 31 determines whether or not the drawing tool currently selected, that is, the drawing tool indicated by the current tool data 330, is the “pen” (step S28). As a result, if the drawing tool is the “pen”, the CPU 31 executes pen processing (step S29). FIG. 20 is a flowchart showing the detail of the pen processing. As shown in FIG. 20, first, the CPU 31 determines whether or not the line type indicated by the current tool data 330 is the “spray” (step S41). As a result, if the line type is not the “spray” (NO in step S41), the CPU 31 executes pen drawing processing (step S42). That is, based on the touched position, processing of drawing a uniform line of the thickness currently selected is executed. Thereafter, the CPU 31 ends the pen processing.

On the other hand, as a result of the determination in step S41, if the line type is the “spray” (YES in step S41), the CPU 31 executes spray drawing processing for drawing a spray line as described above referring to FIG. 10 and the like (step S43), and thereafter, ends the pen processing.

FIG. 21 is a flowchart showing the detail of the spray drawing processing shown in step S43. As shown in FIG. 21, first, the CPU 31 detects the volume of a sound (microphone input sound) inputted to the microphone 42, and stores the volume as the sound characteristic data 333 (step S51).

Next, the CPU 31 determines whether or not the volume indicated by the sound characteristic data 333 is equal to or larger than a predetermined threshold value which is set in advance (step S52). As a result of the determination, if the volume of the microphone input sound is equal to or larger than the predetermined threshold value (YES in step S52), the CPU 31 refers to the spray table 332, and determines the size of an area in which a spray line is drawn, and the number of dots to be drawn, based on the magnitude of the volume (step S53).
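Steps S51 to S53 can be sketched as a threshold check followed by a table lookup. This is a hedged illustration: the RMS volume measure, the threshold value, and the table entries are all assumptions; the document states only that the drawing-area size and dot count are determined from the magnitude of the volume via the spray table 332.

```python
import math

# Hypothetical stand-in for the spray table 332:
# (minimum volume, drawing-area radius, dots drawn per frame)
SPRAY_TABLE = [
    (0.2, 8, 4),
    (0.5, 16, 12),
    (0.8, 24, 24),
]
THRESHOLD = 0.2  # step S52: no drawing below this volume (assumed value)

def detect_volume(samples):
    # Step S51: reduce one frame of microphone samples to a single volume
    # (RMS is an assumption; any monotonic loudness measure would do).
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def spray_parameters(volume):
    # Steps S52-S53: return None below the threshold; otherwise pick the
    # largest table entry whose minimum volume is not exceeded.
    if volume < THRESHOLD:
        return None
    radius, dots = SPRAY_TABLE[0][1:]
    for min_vol, r, d in SPRAY_TABLE:
        if volume >= min_vol:
            radius, dots = r, d
    return radius, dots
```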

Next, in accordance with the magnitude of the volume indicated by the sound characteristic data 333, the CPU 31 determines a volume at which a sound effect reproduced upon drawing a spray line is reproduced (step S54).

Next, in accordance with the magnitude of the volume indicated by the sound characteristic data 333, the CPU 31 sets the speed at which an animation display of the cursor 102 is reproduced (step S55). As described above, in the present embodiment, when a spray line is drawn, an animation in which a propeller rotates is displayed as the propeller-type cursor 102. The CPU 31 executes processing of setting the speed at which the propeller rotates in accordance with the magnitude of the volume of the microphone input sound. For example, the speed at which the animation is reproduced is set such that the larger the volume of the microphone input sound is, the faster the propeller rotates. For example, in the case where the animation in which the propeller rotates includes three images, setting may be performed such that the image is changed every frame when the volume of the microphone input sound is large, and is redrawn only every ten frames when the volume is not large.
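The frame-count mapping in the example above can be written down directly. A minimal sketch, assuming a three-image animation and the one-frame/ten-frame figures from the text (the loudness cutoff is an assumption):

```python
def frames_per_image(volume, loud_threshold=0.8):
    # Change the propeller image every frame for a loud input,
    # every ten frames otherwise (figures taken from the example above).
    return 1 if volume >= loud_threshold else 10

def animation_image(frame_counter, volume, image_count=3):
    # Index of the propeller image to show on this video frame.
    return (frame_counter // frames_per_image(volume)) % image_count
```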

Next, the CPU 31 places the above-described drawing area such that the center of the drawing area coincides with the touched position, and draws dots forming a spray line on the canvas 101 (in the drawing area) in accordance with the content of the determination in step S53 (step S56). Here, the dots forming a spray line may be placed randomly in the drawing area, or may be drawn around the touched position such that the density of the dots is greatest at the touched position and gradually decreases with distance from the touched position.
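The second placement variant (density falloff toward the edge) can be sketched by sampling polar coordinates in the circular drawing area. This is an illustration, not the embodiment's actual routine; sampling the radius linearly, as done here, concentrates dots near the center, which gives the described falloff.

```python
import math
import random

def random_dots(center, radius, count, seed=0):
    # Step S56 sketch: scatter `count` dots inside a circular drawing area
    # centered on the touched position, denser near the center.
    rng = random.Random(seed)
    cx, cy = center
    dots = []
    for _ in range(count):
        # Sampling r linearly (without the usual sqrt correction) biases
        # dots toward the touched position, i.e. the density falloff.
        r = radius * rng.random()
        a = rng.random() * 2 * math.pi
        dots.append((cx + r * math.cos(a), cy + r * math.sin(a)))
    return dots
```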

Next, the CPU 31 displays an animation (animation in which a propeller rotates) of the cursor 102 in accordance with the speed, set in step S55, at which the animation is reproduced (step S57).

Next, the CPU 31 determines whether or not the reproduction flag is set at OFF (step S58). The reproduction flag is a flag indicating whether or not a sound effect is being reproduced, and when a sound effect is not being reproduced, the reproduction flag is set at OFF. As a result of the determination, if the reproduction flag is OFF (YES in step S58), the CPU 31 sets the reproduction flag to ON (step S59). Then, the CPU 31 refers to the sound effect data 331, and starts reproducing a sound effect (spraying sound of a spray) for drawing of a spray line at a volume set in step S54 (step S60). Thereafter, the CPU 31 ends the spray processing.

On the other hand, as a result of the determination in step S58, if the CPU 31 determines that the reproduction flag is not OFF (NO in step S58), since it is considered that a sound effect is being reproduced, the CPU 31 ends the spray processing without executing the processing in steps S59 and S60.

Next, the processing performed when, as a result of the determination in step S52, the volume indicated by the sound characteristic data 333 is smaller than the predetermined threshold value (NO in step S52) will be described. In this case, the CPU 31 next determines whether or not the reproduction flag is set at ON (step S61). As a result, if the reproduction flag is ON (YES in step S61), the CPU 31 stops the reproduction of the sound effect which has been started in step S60 (step S62). Then, the CPU 31 sets the reproduction flag to OFF (step S63).

On the other hand, as a result of the determination in step S61, if the reproduction flag is not ON (NO in step S61), the CPU 31 ends the spray drawing processing without executing the processing in steps S62 and S63. Description of the spray drawing processing will be finished here.
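Taken together, steps S58 to S63 amount to a small edge-triggered state machine: the looping sound effect is started only on the transition from silent to loud, and stopped only on the transition back. A minimal sketch with hypothetical names (the `events` list merely records what a real implementation would send to the audio system):

```python
class SoundEffectState:
    """Sketch of the reproduction-flag logic in steps S58-S63."""

    def __init__(self):
        self.playing = False  # the "reproduction flag"
        self.events = []      # records start/stop calls for illustration

    def update(self, loud_enough):
        if loud_enough and not self.playing:     # steps S58-S60
            self.playing = True
            self.events.append("start")
        elif not loud_enough and self.playing:   # steps S61-S63
            self.playing = False
            self.events.append("stop")
        # Otherwise the sound is already in the right state: do nothing,
        # which is why repeated loud frames do not restart the effect.
```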

Referring to FIG. 19 again, as a result of the determination in step S28, if the current tool data 330 does not indicate the “pen” (NO in step S28), next, the CPU 31 determines whether or not the drawing tool indicated by the current tool data 330 is the “eraser” (step S30). As a result of the determination, if the drawing tool is not the “eraser” (NO in step S30), the CPU 31 returns to step S22 to repeat processing therefrom.

On the other hand, as a result of the determination in step S30, if the current tool data 330 indicates the “eraser” (YES in step S30), the CPU 31 executes eraser processing (step S31). In the eraser processing, when a microphone input sound of a volume equal to or larger than a predetermined value is inputted, a spray line and the like drawn at the touched position are erased. FIG. 22 is a flowchart showing the detail of the eraser processing. As shown in FIG. 22, first, the CPU 31 determines whether or not the line type indicated by the current tool data 330 is the “spray” (step S71). As a result, if the line type is not the “spray” (NO in step S71), the CPU 31 executes pen eraser processing (step S72). That is, the CPU 31 executes processing of erasing, based on the touched position, a uniform line or a spray line with the thickness of the line currently selected. Thereafter, the CPU 31 ends the eraser processing.

On the other hand, as a result of the determination in step S71, if the line type is the “spray”, the CPU 31 executes spray eraser processing (step S73). FIG. 23 is a flowchart showing the detail of the spray eraser processing shown in step S73. Since, in FIG. 23, the processing in steps S51 and S52, and the processing in steps S57 to S63 are the same as the processing in the corresponding steps described referring to FIG. 21, detailed description thereof is omitted and the other processing will mainly be described here.

As shown in FIG. 23, in step S52, the CPU 31 determines whether or not the volume of the microphone input sound is equal to or larger than a predetermined threshold value, and as a result, if the volume is equal to or larger than the predetermined threshold value (YES in step S52), the CPU 31 determines the size of an area (hereinafter, referred to as erasing area) to be erased, in accordance with the magnitude of the volume (step S81). A method for determining the erasing area conforms to a method for determining the drawing area for the spray line. That is, the CPU 31 refers to the spray table 332 and obtains the area size 3322 corresponding to the magnitude of the volume. Then, based on this size, the CPU 31 determines the size of the erasing area. Note that, similarly to the above-described drawing area, the shape of the erasing area is circular.

Next, the CPU 31 determines the volume at which the sound effect for erasing the spray line or the like is reproduced, in accordance with the magnitude of the volume (step S82).

Next, the CPU 31 determines the speed at which an animation of the cursor for the erasing is reproduced (step S83). That is, as in step S55, the CPU 31 determines the speed at which the propeller rotates, in accordance with the magnitude of the volume of the microphone input sound.

Next, the CPU 31 places the erasing area such that the center of the erasing area coincides with the touched position, and erases the spray line drawn within the erasing area (step S84).

Thereafter, the CPU 31 displays the animation of the cursor (step S57), and proceeds to processing of determining whether or not the reproduction flag is OFF (step S58). Since processing in step S58 and subsequent steps is the same as the processing in the respective steps described above referring to FIG. 21, detailed description thereof is omitted. Description of the spray eraser processing is finished here.

Referring to FIG. 19 again, next, the processing performed when, as a result of the determination in step S26, a touch input is not being performed (NO in step S26) will be described. In this case, the CPU 31 next determines whether or not touch off has been performed, based on the operation data 326 (step S33). That is, the CPU 31 determines whether a state in which a touch input continues to be performed has been interrupted, or a state in which a touch is not being performed continues. Specifically, if the touched position data 3262 indicates that the latest input coordinate is NULL and the input coordinate just prior to the latest input coordinate is the coordinate of a touched position, the CPU 31 determines that touch off has been performed; if the touched position data 3262 indicates that the latest input coordinate is NULL and the input coordinate just prior to the latest input coordinate is also NULL, the CPU 31 determines that a state in which a touch is not being performed continues. As a result of the determination, if touch off has been performed (YES in step S33), the CPU 31 erases the cursor 102 (step S34). Thereafter, the CPU 31 returns to step S22 to repeat the processing therefrom. On the other hand, if touch off has not been performed (NO in step S33), the CPU 31 directly returns to step S22 to repeat the processing therefrom. Description of the graffiti processing is finished here.
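The touch-off determination in step S33 can be sketched from the two stored coordinates. A minimal illustration, with Python's `None` standing in for NULL (the function name and return values are assumptions):

```python
def touch_state(latest, previous):
    """Classify the touch state from the latest and the immediately
    preceding input coordinates in the touched position data."""
    if latest is not None:
        return "touching"
    # latest is NULL: either the touch was just released (touch off),
    # or no touch has been performed for at least two frames (idle).
    return "touch_off" if previous is not None else "idle"
```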

Thus, in the present embodiment, in the case where the player performs drawing by using the “spray”, drawing on the canvas 101 can be performed when two types of inputs, that is, a touch input to the canvas 101 and a sound input to the microphone 42 are performed. As a result, a novel game having a nonconventional and new feeling of operation can be provided.

In addition, in the present embodiment, the spray line is drawn while a touch input and a sound input to the microphone 42 continue to be performed (the player continues to blow). Therefore, by changing the strength at which the player blows breath, the thickness of the spray line can be changed in real time. Thus, it becomes possible to provide a novel way of enjoyment in which, depending on how the player blows breath, the thickness (corresponding to so-called pen pressure) of the spray line can be changed, that is, a content to be drawn can be changed.

In the above-described eraser processing, processing of erasing the spray line drawn at a touched position is performed. Other than such processing, processing (hereinafter, referred to as blow-off processing) in which dots forming the spray line present near or at a touched position are blown off in accordance with breath being blown on the touch panel may be executed. Hereinafter, the outline of the blow-off processing will be described. For example, it is assumed that the positional relationship between a touched position and dots is as shown in FIG. 24. FIG. 24 shows, as an example, a state in which there are five dots 211a to 211e above a touched position 210. In this state, when the player blows on the touched position 210 (that is, a sound input to the microphone 42 is performed), first, as shown in FIG. 25, straight lines 212a to 212e which respectively connect the dots 211a to 211e with the touched position 210 are calculated. Then, as shown in FIG. 26, processing of moving the dots 211 in accordance with the lengths and the directions of the respective straight lines 212 is executed. In the example in FIG. 26, the dots 211 are moved along the respective straight lines 212, in the directions opposite to the directions toward the touched position 210. The distances of the movements are inversely proportional to the lengths of the respective straight lines. That is, a dot nearer to the touched position 210 moves a longer distance, and a dot farther from the touched position 210 moves a shorter distance. In other words, a dot nearer to the position on which the player blows (in this case, the touched position 210) is subjected to a stronger blow and is thereby blown off farther.
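The movement rule of FIGS. 24 to 26 can be sketched directly: each dot moves away from the touched position along its connecting line, a distance inversely proportional to the line's length. The `strength` constant is an assumption used only to scale the movement.

```python
import math

def blow_off(dots, touched, strength=100.0):
    """Move each dot away from the touched position; nearer dots are
    blown off farther (distance inversely proportional to separation)."""
    tx, ty = touched
    moved = []
    for x, y in dots:
        dx, dy = x - tx, y - ty
        length = math.hypot(dx, dy)
        if length == 0:
            moved.append((x, y))  # a dot exactly at the touched position
            continue
        step = strength / length  # inverse proportionality
        moved.append((x + dx / length * step, y + dy / length * step))
    return moved
```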

Next, referring to FIG. 27, the detail of the blow-off processing will be described. FIG. 27 is a flowchart showing the detail of the blow-off processing. Here, the case where, in the flowchart of the graffiti processing described referring to FIG. 19, the blow-off processing is executed in place of the spray eraser processing in step S73, will be described as an example. As a matter of course, instead of executing the blow-off processing in place of the spray eraser processing, the blow-off processing may be executed together with the spray eraser processing.

As shown in FIG. 27, first, the CPU 31 detects a volume (step S51), and determines whether or not the detected volume is equal to or larger than a predetermined value (step S52). Since this processing is the same as the processing in steps S51 and S52 in FIG. 23, detailed description thereof is omitted.

As a result of the determination in step S52, if the volume is smaller than the predetermined threshold value (NO in step S52), the CPU 31 proceeds to processing in step S61. Since processing to be performed in this case is also the same as the processing from step S61 in FIG. 23, description thereof is omitted.

On the other hand, as a result of the determination in step S52, if the CPU 31 determines that the volume is equal to or larger than the predetermined threshold value (YES in step S52), the CPU 31 determines the size of an area (hereinafter, referred to as blow-off area) for the blow-off processing, in accordance with the magnitude of the volume (step S81). The size of the area is determined by referring to the spray table 332 and obtaining the area size 3322 corresponding to the magnitude of the volume, as in step S53.

Next, the CPU 31 calculates straight lines (see FIG. 25) which connect, with a touched position, the respective dots forming a spray line present within the blow-off area (step S83).

Next, the CPU 31 moves the dots within the blow-off area in accordance with the directions and the lengths of the respective calculated straight lines (see FIG. 26). Note that at this time, if a moved dot overlaps with a position of another dot, a dot which is nearer to the touched position before the movement is drawn over the other dot.

The dots are moved through the above processing, and thereafter, the CPU 31 proceeds to processing in step S58. Since the processing from step S58 is the same as the corresponding processing described above referring to FIG. 21, description thereof is omitted.

Thus, by performing processing in which, when the player blows breath on a touched position, dots are moved as if sand is blown off, it becomes possible to provide the player with a novel way of enjoyment.

In addition, in the above embodiment, the CPU 31 detects a sound produced when the player blows breath on the touch panel 13 and performs processing based on the volume thereof; at this time, any other sound can be used (for example, a sound of clapping hands). That is, the type and the content of the sound are not identified. However, the present invention is not limited thereto, and a sound of blowing may be identified. A method of detecting or identifying the “sound of blowing” may be of any type. For example, there can be considered a method in which a waveform pattern of a sound segment included in a sound of blowing (sound of breath) is stored in advance, the stored sound segment and a sound segment of an inputted sound are compared with each other, and thereby it is determined whether or not the player has blown. Alternatively, there may be used a method in which the spectrum of an inputted sound is calculated by fast Fourier transform (FFT) processing, the calculated spectrum and the spectrum of a sound of blowing which is stored in advance are compared with each other, and thereby it is determined whether or not the player has blown.
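The spectrum-comparison idea can be illustrated as follows. This is a deliberately simplified sketch: it uses a plain DFT instead of an FFT for brevity, and the reference spectrum and similarity threshold are assumptions; a real implementation would use an FFT library and a template measured from actual breath sounds.

```python
import cmath
import math

def magnitude_spectrum(samples):
    # Plain DFT magnitude (an FFT would be used in practice); only the
    # first half of the bins is kept, as the rest mirrors it.
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]

def is_blow(samples, reference, threshold=0.1):
    # Compare the input's spectrum with the stored reference spectrum;
    # a small mean absolute difference counts as a blow.
    spec = magnitude_spectrum(samples)
    diff = sum(abs(a - b) for a, b in zip(spec, reference)) / len(reference)
    return diff < threshold
```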

In addition, instead of using a volume, or using the type and the content of a specified sound as described above, a characteristic of an inputted sound, such as tone or frequency, may be calculated or identified, and a content of drawing processing may be changed in accordance with the characteristic of the inputted sound.

In addition, the reproduction of the sound effect may be executed such that a fade-in/fade-out effect is used upon the start and end of the reproduction of the sound effect. This can prevent a noise (for example, a popping noise) from being generated upon the start of reproduction.

In addition, for the drawing color used upon drawing, only one color may be used, or a plurality of colors may be used at the same time. In an exemplary case where a plurality of colors are used at the same time, an edged line (a line whose edge has a color different from that of the rest of the line) may be used if the “pen” is used as the drawing tool, for example. In addition, if the “spray” is used, the dots forming a spray line may have colors different from each other. For example, when “gray” and “black” are designated as the drawing colors, a spray line formed by both gray dots and black dots may be drawn (see FIG. 28).

In addition, in the case where a plurality of drawing colors are used, when the above-described blow-off processing is executed and dots having different colors overlap with each other as a result of movements of the dots, the dots may be displayed as one dot having a color obtained by mixing the colors thereof with each other. Thus, when the spray line is drawn by using multiple drawing colors, the spray line which includes various colors in a mixed manner and cannot be predicted by the player can be displayed through the above-described blow-off processing, whereby a new way of enjoyment using the blow-off processing can be provided to the player.
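One simple way to realize the color mixing described above is per-channel averaging of the overlapping dots' colors. Averaging is an assumption; the document says only that the displayed dot has a color obtained by mixing the overlapping colors.

```python
def mix_colors(colors):
    """Mix overlapping dots' colors by averaging each RGB channel.

    colors: list of (r, g, b) tuples with 0-255 per channel.
    """
    n = len(colors)
    return tuple(sum(c[i] for c in colors) // n for i in range(3))
```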

In addition, in the spray eraser processing, a translucence effect may be used for erasing the spray line or the like. That is, instead of erasing the spray line or the like at the moment when the player blows breath, processing in which the spray line or the like is gradually diluted, and finally, cleanly erased may be executed.

In addition to the drawing processing using the “spray line” formed by multiple dots, the drawing processing using the “uniform line” may also be executed only when a microphone input sound is being inputted. Further, in this case, the pen pressure may be changed in accordance with the magnitude of the microphone input sound. For example, when the magnitude of the microphone input sound is small (when the strength at which the player blows breath is weak), a “faded line” or a “line of a dilute color” may be drawn, and when the magnitude of the microphone input sound is large (when the strength at which the player blows breath is strong), a “clear line” or a “line of a deep color” may be drawn. In addition, drawing of the “uniform line” with the “pen” may be executed without a microphone input sound, and the thickness of the line may be changed in real time by a breath being blown on the touch panel 13 while the “uniform line” is being drawn with the “pen”.

In addition, upon drawing in the spray processing, when the volume of a microphone input sound becomes smaller than the predetermined threshold value (that is, when the player stops blowing) while a touch input continues to be detected, drawing of the spray line may continue for about 1 to 2 seconds, for example, instead of stopping immediately. That is, processing in which, even if the player stops blowing breath, rotation of the propeller continues for a short time and the spray line is drawn during that time, may be executed.
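This lingering behavior is a small countdown: each loud frame refills a grace period, and drawing continues while the countdown is positive. A sketch, assuming 60 frames per second so that 90 frames sits inside the 1-to-2-second range given above:

```python
GRACE_FRAMES = 90  # about 1.5 seconds at 60 fps (assumed frame rate)

class LingeringSpray:
    """Keep drawing briefly after the microphone input falls silent."""

    def __init__(self):
        self.remaining = 0

    def update(self, loud_enough):
        # Returns True when the spray line should be drawn this frame.
        if loud_enough:
            self.remaining = GRACE_FRAMES
        elif self.remaining > 0:
            self.remaining -= 1
        return loud_enough or self.remaining > 0
```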

In addition, it is understood that the image created in the above embodiment, on which graffiti has been drawn, may be saved. In this case, only data corresponding to the above-described handwriting layer 106 (see FIG. 6) may be saved, or data obtained by combining data corresponding to handwriting layer 106 with a shot image may be saved as a composite image.

In addition, in the above embodiment, the case where graffiti is drawn on an image shot by the outer camera is described as an example. However, the present invention is not limited thereto, and the present invention is applicable to general painting software which does not use the outer camera 25, that is, which does not allow graffiti to be drawn on a shot image or the like.

In addition, in the present embodiment, a hand-held game apparatus having two display devices is described as an example. However, the present invention is applicable to a hand-held terminal which has a single display device and has a touch panel on a screen of the display device. In addition, in the present embodiment, a touch panel is used as an example of a device which detects a designated position, in an operation area, designated by the player. However, any so-called pointing device which allows the player to designate a position in a predetermined area may be used as the device. For example, there may be used, as the device, a mouse, which is capable of designating any position on a screen, or a tablet, which designates any position on an operation surface having no display screen. Alternatively, there may be used, as the device, a pointing device including shooting means which, when pointed toward the display screen, remotely shoots the display screen or a marker positioned in the periphery of the display screen, and in which the coordinate on the display screen corresponding to the position at which the device has pointed is calculated from the position of the display screen or the marker in the shot image.

While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is understood that numerous other modifications and variations can be devised without departing from the scope of the invention.

Claims

1. A computer-readable storage medium having stored therein a drawing processing program which is executed by a computer of an information processing apparatus in which a pointing device for designating a position on a display screen, and sound input means can be used, the drawing processing program causing the computer to function as:

designated position detection means for continuously obtaining a designated position on the display screen, based on a designation performed by the pointing device;
sound detection means for detecting that a sound which satisfies a predetermined condition is inputted to the sound input means; and
drawing-related processing execution means for, while the sound detection means detects that the sound is inputted, executing predetermined drawing-related processing at a position based on the designated position obtained by the designated position detection means.

2. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein the drawing-related processing execution means changes a content of the drawing-related processing to be executed, based on the sound detected by the sound detection means, and in accordance with a characteristic of the sound.

3. The computer-readable storage medium having stored therein the drawing processing program according to claim 2, wherein the drawing-related processing execution means sequentially changes a content of the drawing-related processing to be executed, in a coordinated manner with changes in chronological order in the characteristic of the sound repeatedly detected by the sound detection means.

4. The computer-readable storage medium having stored therein the drawing processing program according to claim 2, wherein the drawing-related processing execution means executes the predetermined drawing-related processing only when a volume of the inputted sound is equal to or larger than a predetermined threshold value.

5. The computer-readable storage medium having stored therein the drawing processing program according to claim 2, wherein the drawing-related processing execution means executes, as the drawing-related processing, processing of drawing a line which connects, in chronological order, the positions based on the designated positions sequentially obtained by the designated position detection means.

6. The computer-readable storage medium having stored therein the drawing processing program according to claim 5, wherein the drawing-related processing execution means changes at least one of a thickness of the line and a density of a color in which the line is drawn, in accordance with a volume of the inputted sound.

7. The computer-readable storage medium having stored therein the drawing processing program according to claim 2, wherein the drawing-related processing execution means executes, as the drawing-related processing, processing of drawing one or more dots in a drawing range which is a predetermined range including therein the position based on the designated position.

8. The computer-readable storage medium having stored therein the drawing processing program according to claim 7, wherein the drawing-related processing execution means changes at least one of a size of the drawing range and a number of the dots to be drawn in the drawing range, in accordance with a volume of the inputted sound.

9. The computer-readable storage medium having stored therein the drawing processing program according to claim 8, wherein the drawing-related processing execution means draws the dots such that an area density of the number of the dots which are nearer to the position based on the designated position is higher, and that an area density of the number of the dots which are farther from the position based on the designated position is lower.

10. The computer-readable storage medium having stored therein the drawing processing program according to claim 8, wherein the drawing-related processing execution means draws the dots at random positions in the drawing range.

11. The computer-readable storage medium having stored therein the drawing processing program according to claim 7, wherein the drawing-related processing execution means executes, as the drawing-related processing, processing of moving the dots drawn on the display screen in a predetermined direction, based on the position based on the designated position, and the sound input detected by the sound detection means.

12. The computer-readable storage medium having stored therein the drawing processing program according to claim 11, wherein the drawing-related processing execution means

includes movement content calculation means for calculating: a direction of a line connecting each of the dots displayed on the display screen, with a reference point which is the position based on the designated position; and a distance from the reference point to each of the dots displayed on the display screen, and
moves the dots displayed on the screen, based on the direction and the distance calculated by the movement content calculation means.

13. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein the drawing processing program further causes the computer to function as sound effect reproduction means for causing predetermined sound output means to output a predetermined sound effect when the drawing-related processing execution means is executing the predetermined drawing-related processing.

14. The computer-readable storage medium having stored therein the drawing processing program according to claim 13, wherein the sound effect reproduction means changes a volume at which the sound effect is reproduced, in accordance with a characteristic of the sound detected by the sound detection means.

15. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein the drawing processing program further causes the computer to function as:

cursor display means for displaying a predetermined cursor image at the designated position; and
animation display means for animating the cursor when the drawing-related processing execution means is executing the predetermined drawing-related processing.

16. The computer-readable storage medium having stored therein the drawing processing program according to claim 15, wherein the animation display means changes a speed of the animation in accordance with a characteristic of the sound detected by the sound detection means.

17. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein the pointing device is a touch panel.

18. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein the drawing processing program further causes the computer to function as:

shot image obtaining means for obtaining image data of an image shot by predetermined shooting means; and
shot image display means for displaying, on the display screen, the shot image based on the image data, and
the drawing-related processing execution means executes the drawing-related processing on the shot image.

19. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein

the drawing processing program further causes the computer to function as specified sound determination means for determining whether or not the sound detected by the sound detection means is a predetermined sound, and
the drawing-related processing execution means executes the drawing-related processing only when the specified sound determination means determines that the sound detected by the sound detection means is the predetermined sound.

20. An information processing apparatus in which a pointing device for designating a position on a display screen, and sound input means can be used, the information processing apparatus comprising:

designated position detection means for continuously obtaining a designated position on the display screen, based on a designation performed by the pointing device;
sound detection means for detecting that a sound which satisfies a predetermined condition is inputted to the sound input means; and
drawing-related processing execution means for, while the sound detection means detects that the sound is inputted, executing predetermined drawing-related processing at a position based on the designated position.

21. The information processing apparatus according to claim 20, wherein the sound detection means is placed in proximity of the display screen.

Patent History
Publication number: 20100210332
Type: Application
Filed: Dec 23, 2009
Publication Date: Aug 19, 2010
Applicant: NINTENDO CO., LTD. (Kyoto)
Inventor: Daiji IMAI (Kyoto-shi)
Application Number: 12/646,306