GESTURE CONTROLLED GAME SCREEN NAVIGATION

- Microsoft

A game controller, such as a microphone controller, incorporates motion sensors that are configured to detect gestures performed by a user of the game controller. The gestures can be used to navigate and perform actions in a graphic user interface that a game console employs to provide a consistent user experience when navigating to different media types available on the game console. In this way, the game controller avoids the need to use a separate gamepad-type controller to navigate the graphic user interface, while also avoiding the need to incorporate additional buttons. As a result, the sense of realism and the overall gaming experience may be enhanced.

Description
BACKGROUND

Video game consoles employ controllers to allow users to interface with software, such as video games. A typical controller has a number of controls. For example, a gamepad type controller will typically incorporate one or more directional controls, such as a number of buttons arranged in a directional keypad, one or more analog sticks, or a combination of such controls. In addition to the directional controls, a controller will typically include one or more action buttons that may be located on the face or shoulders of the controller. In some cases, directional controls may provide action selection functionality as well. For example, the analog sticks in a controller compatible with the XBOX 360® brand video game console, available from Microsoft Corp. of Redmond, Wash., can be pushed in as well as moved directionally.

In addition to controlling the movement and actions of a character in a video game, a controller can be used to navigate a graphic user interface, such as the dashboard presented to users of the XBOX 360® brand video game console. The graphic user interface may include a number of menus and sub-menus that allow a user to, for example, execute game software, access media resources such as image, video, or audio files or media discs, configure system settings, etc. Navigating the graphic user interface is conventionally accomplished via a combination of directional navigation commands, e.g., left, right, up, and down, input using the directional controls and action buttons. While this control scheme can work well for traditional gamepad type controllers, it relies on the use of several buttons and is thus not well-suited for controllers that lack the buttons typically found on gamepad type controllers.

For example, some video games involve singing into a microphone type controller. The limited space available on the surface of such a controller restricts the number of buttons that can be implemented on it. In addition, the presence of buttons on such a controller may detract from the overall gaming experience by reducing the degree of realism of the controller.

SUMMARY

According to various embodiments, a game controller, such as a microphone controller, incorporates motion sensors that are configured to detect gestures performed by a user of the game controller. The gestures can be used to navigate and perform actions in a graphic user interface that a game console employs to provide a consistent user experience when navigating to different media types available on the game console.

One embodiment is directed to a method for using a game controller to navigate a graphic user interface presented by a video game console to a user. A motion of the game controller is detected and is recognized as a gesture. An operational mode in which the game controller is operating is then determined. If the game controller is operating in a first operational mode, a navigation command corresponding to the recognized gesture is executed in the graphic user interface. On the other hand, if the game controller is operating in a second operational mode, an action corresponding to the recognized gesture is performed in the graphic user interface. This method may be performed by a computer executing instructions stored on a computer readable storage medium.

Another embodiment is directed to a game controller for use with a video game console. The game controller includes a microcontroller in electrical communication with at least one motion sensor configured to detect motion of the game controller. While not required, in some embodiments, the at least one motion sensor is incorporated in the game controller. The microcontroller or the video game console is configured to recognize the detected motion as a gesture. If the game controller is operating in a first operational mode, the recognized gesture is mapped to a navigation command that is executed in a graphic user interface presented by the video game console to a user. If the game controller is operating in a second operational mode, the recognized gesture is mapped to an action that is performed in the graphic user interface.

Various embodiments may realize certain advantages. For example, by using gestures to perform navigation commands and actions in the graphic user interface, the game controller avoids the need to use a separate gamepad-type controller to navigate the graphic user interface, while also avoiding the need to incorporate additional buttons. As a result, the sense of realism and the overall gaming experience may be enhanced.

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.

BRIEF DESCRIPTION OF THE DRAWINGS

The illustrative embodiments will be better understood after reading the following detailed description with reference to the appended drawings, in which:

FIG. 1 is a block diagram representing an exemplary computing device.

FIG. 2 is a block diagram illustrating an implementation of the computing device of FIG. 1 as a game console.

FIG. 3 is a plan view illustrating an example implementation of a microphone controller according to one embodiment.

FIG. 4 is a block diagram representing the microphone controller of FIG. 3.

FIG. 5 is a flow diagram illustrating an example method of performing navigation commands and actions in a graphic user interface using gestures.

DETAILED DESCRIPTION

The inventive subject matter is described with specificity to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, it is contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or combinations of steps similar to the ones described in this document, in conjunction with other present or future technologies.

FIG. 1 illustrates an example of a suitable computing system environment 100 in which the subject matter described above may be implemented. The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the subject matter described above. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.

With reference to FIG. 1, computing system environment 100 includes a general purpose computing device in the form of a computer 110. Components of computer 110 may include, but are not limited to, a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120. The system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus (also known as Mezzanine bus).

Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media include both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CDROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embody computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.

The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, FIG. 1 illustrates operating system 134, application programs 135, other program modules 136, and program data 137.

The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 1 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156, such as a CD-RW, DVD-RW or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM and the like. The hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140, and magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150.

The drives and their associated computer storage media discussed above and illustrated in FIG. 1 provide storage of computer readable instructions, data structures, program modules and other data for the computer 110. In FIG. 1, for example, hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146 and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136 and program data 137. Operating system 144, application programs 145, other program modules 146 and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus 121, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A graphics interface 182 may also be connected to the system bus 121. One or more graphics processing units (GPUs) 184 may communicate with graphics interface 182. A monitor 191 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 190, which may in turn communicate with video memory 186. In addition to monitor 191, computers may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 195.

The computer 110 may operate in a networked or distributed environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks/buses. Such networking environments are commonplace in homes, offices, enterprise-wide computer networks, intranets and the Internet.

When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 1 illustrates remote application programs 185 as residing on memory device 181. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.

It will be appreciated that one particular application of computer 110 is in the form of a game console 200 as depicted in FIG. 2. As seen therein, game console 200 has a central processing unit (CPU) 201 having a level 1 (L1) cache 202, a level 2 (L2) cache 204, and a flash ROM (Read-Only Memory) 206. The level 1 cache 202 and level 2 cache 204 temporarily store data and hence reduce the number of memory access cycles, thereby improving processing speed and throughput. The flash ROM 206 can store executable code that is loaded during an initial phase of a boot process when the game console 200 is initially powered. Alternatively, the executable code that is loaded during the initial boot phase can be stored in a flash memory device (not shown). Further, the flash ROM 206 can be located separately from the CPU 201. Game console 200 can, optionally, be a multi-processor system; for example, game console 200 can have three processors 201, 203, and 205, where processors 203 and 205 have similar or identical components to the CPU 201.

A graphics processing unit (GPU) 208 and a video encoder/video codec (coder/decoder) 214 form a video processing pipeline for high speed and high resolution graphics processing. Data is carried from the graphics processing unit 208 to the video encoder/video codec 214 via a bus. The video processing pipeline outputs data to an A/V (audio/video) port 240 for transmission to a television or other display device. A memory controller 210 is connected to the GPU 208 and CPU 201 to facilitate processor access to various types of memory 212, such as, but not limited to, a RAM (Random Access Memory).

Game console 200 includes an I/O controller 220, a system management controller 222, an audio processing unit 223, a network interface controller 224, a first USB controller 226, a second USB controller 228 and a front panel I/O subassembly 230 that may be implemented on a module 218. The USB controllers 226 and 228 serve as hosts for peripheral controllers 242(1)-242(2), a wireless adapter 248, and an external memory unit 246 (e.g., flash memory, external CD/DVD ROM drive, removable media, etc.). The network interface 224 and/or wireless adapter 248 provide access to a network (e.g., the Internet, a home network, etc.) and may be any of a wide variety of wired or wireless interface components, including an Ethernet card, a modem, a Bluetooth module, a cable modem, and the like. The game console 200 may be connected to a controller sensing device 254 to sense the position or motion of the peripheral controllers 242(1)-242(2) or other accessories. The controller sensing device may be implemented using, for example, a three-dimensional camera or an ultrasonic triangulation system.

System memory 243 is provided to store application data that is loaded during the boot process. A media drive 244 is provided and may comprise a DVD/CD drive, a hard drive, a removable media drive, etc. The media drive 244 may be internal or external to the game console 200. When the media drive 244 is a drive or reader for removable media (such as removable optical disks or flash cartridges), the media drive 244 is an example of an interface onto which (or into which) media are mountable for reading. Application data may be accessed via the media drive 244 for execution, playback, etc. by game console 200. Media drive 244 is connected to the I/O controller 220 via a bus, such as a Serial ATA bus or other high speed connection (e.g., IEEE 1394). While media drive 244 may generally refer to various storage embodiments (e.g., hard disk, removable optical disk drive, etc.), game console 200 may specifically include a hard disk 253, which can be used to store game data.

The system management controller 222 provides a variety of service functions related to assuring availability of the game console 200. The audio processing unit 223 and an audio codec 232 form a corresponding audio processing pipeline with high fidelity, 3D, surround, and stereo audio processing according to aspects of the present subject matter described herein. Audio data is carried between the audio processing unit 223 and the audio codec 232 via a communication link. The audio processing pipeline outputs data to the A/V port 240 for reproduction by an external audio player or device having audio capabilities.

The front panel I/O subassembly 230 supports the functionality of the power button 250 and the eject button 252, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the game console 200. A system power supply module 236 provides power to the components of the game console 200. A fan 238 cools the circuitry within the game console 200.

The CPU 201, GPU 208, memory controller 210, and various other components within the game console 200 are interconnected via one or more buses, including serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures.

When the game console 200 is powered on or rebooted, application data can be loaded from the system memory 243 into memory 212 and/or caches 202, 204 and executed on the CPU 201. The game console 200 can present a graphic user interface that provides a consistent user experience when navigating to different media types available on the game console 200. In operation, applications and/or other media contained within the media drive 244 may be launched or played from the media drive 244 to provide additional functionalities to the game console 200.

The game console 200 may be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, the game console 200 may allow one or more users to interact with the system, watch movies, listen to music, and the like. However, with the integration of broadband connectivity made available through the network interface 224 or the wireless adapter 248, the game console 200 may further be operated as a participant in a larger network community.

The game console 200 can be used with a variety of controllers 242, such as the controllers 242(1), 242(2), and 242(3) of FIG. 2. In the embodiment shown in FIG. 2, controllers 242(1) and 242(2) are wired controllers that communicate with the game console 200 via the USB controller 226. Controller 242(3) is a wireless controller that communicates with the game console 200 via the wireless adapter 248. According to certain embodiments, one or more of the controllers 242 can be implemented as specialized controllers for playing certain types of games. As a particular example, one or more of the controllers 242 can be implemented as a microphone controller for playing singing and music games.

One example implementation of a microphone controller is depicted at FIG. 3 as a microphone controller 300, which may embody the controller 242(3) of FIG. 2. The microphone controller 300 may communicate with the game console 200 via a wired connection, for example, using the USB controller 226, or a wireless connection, for example, using the wireless adapter 248 of FIG. 2. As shown in FIG. 3, the microphone controller 300 may incorporate a power button 302 for turning the microphone controller 300 on and off. The microphone controller 300 may also incorporate a modifier button 304 for selecting an operational mode of the microphone controller 300, as explained more fully below. It will be appreciated that some embodiments of the microphone controller 300 may omit either or both of the power button 302 and the modifier button 304.

FIG. 4 is a block diagram illustrating the functional components of one embodiment of the microphone controller 300. The microphone controller 300 includes a transducer 402 configured to receive an acoustic input, such as a human voice, and to generate an electrical signal. The transducer 402 may be implemented using any of a variety of well-known technologies. The generated electrical signal is output to the game console 200 using a wired connection, such as using the USB controller 226 or the USB controller 228, or a wireless connection, such as using the wireless adapter 248. In some embodiments, the microphone controller 300 may perform some processing on the electrical signal via an analog-to-digital (A/D) codec 403 before outputting the electrical signal to the game console 200. In other embodiments, the electrical signal may be output without any processing.

The power button 302 is connected to a power control module 404, which is in turn connected to a power source 406, such as one or more batteries. When the power button 302 is actuated, the power control module 404 causes the microphone controller 300 to draw power from the power source 406, activating the microphone controller 300. If the microphone controller 300 is already activated, actuating the power button 302 may cause the power control module 404 to deactivate the microphone controller 300. In some embodiments, the power control module 404 may consist of a simple electrical switch that completes a circuit when the power button 302 is actuated. In other embodiments, the power control module 404 may be more complex, such that, for example, the power button 302 must be continuously actuated for some period to either activate or deactivate the microphone controller 300. As another example, the power control module 404 may be configured to automatically deactivate the microphone controller 300 when the microphone controller 300 is not used beyond a specified timeout period. This configuration may provide the advantage of avoiding excess power consumption when the microphone controller 300 is not in use. In some embodiments, as an alternative to deactivating the microphone controller 300, the power control module 404 may be configured to place the microphone controller 300 in a low-power “sleep” mode, either in response to actuation of the power button 302 or after the timeout period.
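
Although the disclosure leaves the implementation of the power control module 404 open, the button-toggle and timeout behaviors described above can be sketched in Python as follows. This is a minimal illustration only: the class name, the polling approach, and the five-minute timeout are assumptions, and an actual controller would implement the equivalent logic in firmware.

    import time

    SLEEP_TIMEOUT_SECONDS = 300.0  # assumed five-minute inactivity window

    class PowerControl:
        """Minimal sketch of the behavior described for power control module 404."""

        def __init__(self):
            self.active = False
            self.last_use = time.monotonic()

        def on_power_button(self):
            # Actuating the power button toggles the controller between
            # its activated and deactivated states.
            self.active = not self.active
            self.last_use = time.monotonic()

        def on_activity(self):
            # Any sensed use of the controller resets the idle timer.
            self.last_use = time.monotonic()

        def poll(self):
            # Called periodically; deactivates the controller (or, in other
            # embodiments, places it in a "sleep" mode) once the specified
            # timeout period expires without use.
            if self.active and time.monotonic() - self.last_use > SLEEP_TIMEOUT_SECONDS:
                self.active = False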

The microphone controller 300 also includes one or more motion sensors. While FIG. 4 depicts three motion sensors 408, 410, and 412, it will be appreciated by those of skill in the art that more or fewer motion sensors may be used. The motion sensors 408, 410, and 412 may be implemented using any of a variety of known technologies, including, for example, accelerometers, gyroscopes, and the like. In addition, while the motion sensors 408, 410, and 412 are depicted as being located in the microphone controller 300, some embodiments may incorporate motion sensing devices located outside the microphone controller 300, such as the controller sensing device 254 of FIG. 2. The number and type of motion sensors affects the capability of the microphone controller 300 to detect motion in various dimensions.

In the embodiment shown in FIG. 4, the motion sensors 408, 410, and 412 generate signals in response to motion of the microphone controller 300. The signals generated by the motion sensors 408, 410, and 412 may be output to the game console 200 for further processing, as indicated by the solid lines emanating from the motion sensors 408, 410, and 412 in FIG. 4. Alternatively, as indicated by the dashed lines emanating from the motion sensors 408, 410, and 412, the signals may be provided to a microcontroller 414, which is indicated by a dashed box in FIG. 4 to denote that it may be omitted in some embodiments.

In embodiments in which the signals are provided to the microcontroller 414, the microcontroller 414 generates an output signal based on the signals received from the motion sensors 408, 410, and 412. This output signal may also be based in part on the signal generated by the transducer 402, as indicated by the dashed line connecting the transducer 402 to the microcontroller 414 in FIG. 4. If the microphone controller 300 incorporates the modifier button 304, the output signal generated by the microcontroller 414 may also be affected by which operational mode has been selected with the modifier button 304. The output signal generated by the microcontroller 414 is output to a radio block 413, to a USB port block 415, or to both the radio block 413 and the USB port block 415, for output to the game console 200 via a wired or wireless connection, e.g., to the USB controller 226 or the wireless adapter 248 of FIG. 2.

In operation, the motion sensors 408, 410, and 412 detect gestures performed by a user of the microphone controller 300. Specifically, each motion sensor 408, 410, and 412 detects motion in one or more orthogonal directions and outputs motion data. Gestures are derived from the motion data by software, which may reside in the game console 200 or may be embedded in the microphone controller 300. If the software resides in the game console 200, the microphone controller 300 may be configured to output the motion data to the game console 200. The gestures can be simple directional movements, e.g., UP, DOWN, LEFT, or RIGHT, or more complex movements that represent commands such as START, BACK, ENTER, ESCAPE, and the like. More complex movements can also be represented by simple movements combined with an actuation of the modifier button 304.
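
The disclosure does not specify how the software converts motion data into gestures. One common approach, offered here purely as an illustrative sketch, is to threshold the displacement reported by the motion sensors and pick the dominant axis; the threshold value, axis convention, and function name below are all assumptions rather than part of the disclosure.

    GESTURE_THRESHOLD = 0.5  # hypothetical minimum displacement for a gesture

    def recognize_gesture(dx, dy):
        """Classify a horizontal (dx) and vertical (dy) displacement as
        UP, DOWN, LEFT, or RIGHT, or None if the motion is too small to
        be a deliberate gesture."""
        if max(abs(dx), abs(dy)) < GESTURE_THRESHOLD:
            return None
        if abs(dx) >= abs(dy):
            return "RIGHT" if dx > 0 else "LEFT"
        return "UP" if dy > 0 else "DOWN"

    # Example: a mostly-horizontal sweep is recognized as RIGHT.
    assert recognize_gesture(0.8, 0.1) == "RIGHT"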

The gestures are then used to control various aspects of the operation of the game console 200, including, for example, selecting and launching a game. Further, as described above, the game console 200 can present a graphic user interface that provides a consistent user experience when navigating to different media types available on the game console 200. One particular example of such a graphic user interface is the dashboard menu used by the XBOX 360® brand video game console. According to various embodiments, gestures detected by the motion sensors 408, 410, and 412 are used to navigate the graphic user interface and to perform actions using the graphic user interface. The modifier button 304 may be used to switch between one operational mode in which gestures are used to perform navigation commands, such as UP, DOWN, LEFT, and RIGHT, and another operational mode in which gestures are used to perform actions, such as START, BACK, ENTER, and ESCAPE. It will be appreciated by those skilled in the art that the modifier button 304 may also be used to place the microphone controller 300 in operational modes other than those specifically described in this disclosure. By using gestures to perform navigation commands and actions in the graphic user interface, the microphone controller 300 avoids the need to use a separate gamepad-type controller to navigate the graphic user interface, while also avoiding the need to incorporate additional buttons on the body of the microphone controller 300. As a result, the sense of realism and the overall gaming experience may be enhanced.
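
As a sketch of the mode switching described above, the modifier button 304 might toggle between the two operational modes roughly as follows. The enumeration names and the simple two-mode toggle are assumptions for illustration, since the disclosure contemplates additional operational modes as well.

    from enum import Enum

    class Mode(Enum):
        NAVIGATION = 1  # gestures perform UP, DOWN, LEFT, RIGHT
        ACTION = 2      # gestures perform START, BACK, ENTER, ESCAPE

    class ModeState:
        """Minimal sketch of modifier button 304 switching operational modes."""

        def __init__(self):
            self.mode = Mode.NAVIGATION

        def on_modifier_button(self):
            # Each actuation of the modifier button switches between the
            # navigation mode and the action mode.
            self.mode = (Mode.ACTION if self.mode is Mode.NAVIGATION
                         else Mode.NAVIGATION)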

As described above in connection with FIG. 3, both the power button 302 and the modifier button 304 are optional. In some embodiments, the functionality provided by the power button 302 can be implemented using gestures. For example, the microphone controller 300 may be configured to activate when the user picks up the microphone controller 300. When the microphone controller 300 is set down after use, the microphone controller 300 may enter a “sleep” mode after a specified timeout period expires without any sensed motion. Similarly, the functionality provided by the modifier button 304 can be implemented using gestures. For example, the microphone controller 300 can be configured to switch between the operational mode in which gestures are used to perform navigation commands and the other operational mode in which gestures are used to perform actions when a specified modifier gesture or combination of gestures is performed. As another alternative, the game console 200 may send a command to the microphone controller 300 to switch between operational modes, for example, in response to the occurrence of a triggering event or based on the context in which a gesture or combination of gestures is performed. Thus, the microphone controller 300 can be implemented without any buttons, thereby enhancing the simulation of a real microphone.
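
A buttonless embodiment could combine these gesture-based replacements for the power button and modifier button roughly as sketched below. The wake-on-motion rule, the use of a shake magnitude as the modifier gesture, and all names and thresholds are illustrative assumptions, not details taken from the disclosure.

    import time

    IDLE_TIMEOUT = 120.0   # hypothetical seconds of stillness before sleeping
    SHAKE_MAGNITUDE = 2.5  # hypothetical acceleration treated as a modifier gesture

    class ButtonlessController:
        """Sketch of a microphone controller implemented without any buttons."""

        def __init__(self):
            self.awake = False
            self.navigation_mode = True
            self.last_motion = time.monotonic()

        def on_motion(self, magnitude):
            self.last_motion = time.monotonic()
            if not self.awake:
                self.awake = True  # picking the controller up activates it
            elif magnitude > SHAKE_MAGNITUDE:
                # A pronounced shake acts as the modifier gesture that
                # switches between operational modes.
                self.navigation_mode = not self.navigation_mode

        def on_console_command(self, navigation_mode):
            # The game console may also command a mode switch directly,
            # e.g., in response to a triggering event.
            self.navigation_mode = navigation_mode

        def poll(self):
            # Enter a "sleep" mode after the timeout expires without motion.
            if self.awake and time.monotonic() - self.last_motion > IDLE_TIMEOUT:
                self.awake = False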

FIG. 5 is a flow diagram illustrating a method of using the microphone controller 300 to perform navigation commands and actions in a graphic user interface using gestures. At a step 500, one or more of the motion sensors 408, 410, and 412 detects motion of the microphone controller 300. Next, at a step 502, the motion of the microphone controller 300 is recognized as a gesture, either by the microcontroller 414 in the microphone controller 300 or by software stored in the system memory 243 of the game console 200 or on a media disc accessible by the media drive 244. After the motion of the microphone controller 300 is recognized as a gesture, the game console 200 determines the operational mode in which the microphone controller 300 is currently operating. If the microphone controller 300 is operating in the operational mode in which gestures are used to perform navigation commands, then, at a step 504, the game console 200 maps the detected gesture to a navigation command, such as UP, DOWN, LEFT, or RIGHT. This gesture mapping can be performed by software located either in the system memory 243 or on a media disc loaded in the media drive 244. At a step 506, the navigation command is executed, causing the graphic user interface to respond appropriately. For example, gestures that are mapped to the navigation commands UP and DOWN may cause various items, such as games and movies, in a menu to be highlighted, while gestures that are mapped to the navigation commands LEFT and RIGHT may cause the graphic user interface to rotate between displays of various menus, such as a game menu, a system configuration menu, etc.

On the other hand, if the microphone controller 300 is operating in the operational mode in which gestures are used to perform actions, then, at a step 508, the game console 200 maps the detected gesture to an action, such as START, BACK, ENTER, and ESCAPE. At a step 510, the action is performed, causing the graphic user interface to respond appropriately. For example, if the detected gesture is mapped to the action START, the highlighted item, e.g., a game or a movie, may be initiated. As another example, if the detected gesture is mapped to the action BACK, the graphic user interface may display a previously displayed menu or menu item. Using a combination of navigation and action gestures, the user can use the microphone controller 300 to perform most or all of the functions that are supported by a gamepad type controller without needing to use a separate controller.
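
Putting the steps of FIG. 5 together, the mode-dependent dispatch might be sketched as follows. The disclosure does not specify which gesture corresponds to which navigation command or action, so the mapping tables and the ui callbacks below are assumptions for illustration only.

    NAVIGATION_MAP = {"UP": "UP", "DOWN": "DOWN",
                      "LEFT": "LEFT", "RIGHT": "RIGHT"}
    ACTION_MAP = {"UP": "START", "DOWN": "BACK",
                  "RIGHT": "ENTER", "LEFT": "ESCAPE"}

    def handle_gesture(gesture, navigation_mode, ui):
        """Dispatch a recognized gesture according to the current mode."""
        if navigation_mode:
            command = NAVIGATION_MAP.get(gesture)  # step 504
            if command is not None:
                ui.execute_navigation(command)     # step 506
        else:
            action = ACTION_MAP.get(gesture)       # step 508
            if action is not None:
                ui.perform_action(action)          # step 510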

While the above embodiments have been described in the context of a microphone controller, it will be appreciated that the principles described herein can be applied to any of a variety of game controllers. These principles may be particularly suitable for game controllers for which it is desirable to minimize the number of buttons, for example, to enhance realism. Other types of game controllers that may particularly benefit from these principles include, but are not limited to, exercise controllers intended to be worn on the wrists and/or feet of a user for use in exercise or dancing games, pointing controllers such as guns, and specialized sports controllers for use in sports games, such as simulated tennis rackets, baseball bats, and the like.

Although the subject matter has been described in language specific to the structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features or acts described above are disclosed as example forms of implementing the claims.

Claims

1. A method for using a game controller to navigate a graphic user interface presented by a video game console to a user, the method comprising:

detecting a motion of the game controller;
recognizing the detected motion as a gesture;
determining an operational mode in which the game controller is operating;
if the game controller is operating in a first operational mode, executing in the graphic user interface a navigation command corresponding to the recognized gesture; and
if the game controller is operating in a second operational mode, performing in the graphic user interface an action corresponding to the recognized gesture.

2. The method of claim 1, further comprising switching the operation of the game controller between the first operational mode and the second operational mode in response to at least one of actuation of a button on the game controller, recognizing the detected motion as a modifier gesture, and a command received by the game controller from the video game console.

3. The method of claim 1, wherein the navigation command comprises at least one of an UP command, a DOWN command, a LEFT command, and a RIGHT command.

4. The method of claim 1, wherein the action comprises at least one of an ENTER action, a BACK action, an ESCAPE action, and a START action.

5. The method of claim 1, further comprising:

using the executed navigation command to select a game; and
using the performed action to launch the selected game.

6. A computer-readable storage medium storing computer-executable instructions for:

detecting a motion of a game controller;
recognizing the detected motion as a gesture;
determining an operational mode in which the game controller is operating;
if the game controller is operating in a first operational mode, executing in a graphic user interface a navigation command corresponding to the recognized gesture; and
if the game controller is operating in a second operational mode, performing in the graphic user interface an action corresponding to the recognized gesture.

7. The computer-readable storage medium of claim 6, wherein the computer-readable storage medium stores further computer-executable instructions for switching the operation of the game controller between the first operational mode and the second operational mode in response to at least one of actuation of a button on the game controller, recognizing the detected motion as a modifier gesture, and a command received by the game controller from a video game console.

8. The computer-readable storage medium of claim 6, wherein the navigation command comprises at least one of an UP command, a DOWN command, a LEFT command, and a RIGHT command.

9. The computer-readable storage medium of claim 6, wherein the action comprises at least one of an ENTER action, a BACK action, an ESCAPE action, and a START action.

10. The computer-readable storage medium of claim 6, wherein the computer-readable storage medium stores further computer-executable instructions for:

using the executed navigation command to select a game; and
using the performed action to launch the selected game.

11. A game controller for use with a video game console, the game controller comprising:

a microcontroller in electrical communication with at least one motion sensor configured to detect motion of the game controller, wherein at least one of the microcontroller and the video game console is configured to recognize the detected motion as a gesture,
wherein if the game controller is operating in a first operational mode, the recognized gesture is mapped to a navigation command that is executed in a graphic user interface presented by the video game console to a user, and
wherein if the game controller is operating in a second operational mode, the recognized gesture is mapped to an action that is performed in the graphic user interface.

12. The game controller of claim 11, further comprising the at least one motion sensor.

13. The game controller of claim 11, wherein the at least one motion sensor is configured to generate motion data, the game controller is configured to output motion data to the video game console, and the video game console is configured to convert the motion data to the recognized gesture.

14. The game controller of claim 11, further comprising a modifier button in electrical communication with the microcontroller and configured to switch the game controller between the first operational mode and the second operational mode when the modifier button is actuated.

15. The game controller of claim 11, wherein the operation of the game controller is switched between the first operational mode and the second operational mode in response to the microcontroller recognizing the detected motion as a modifier gesture.

16. The game controller of claim 11, wherein the navigation command comprises at least one of an UP command, a DOWN command, a LEFT command, and a RIGHT command.

17. The game controller of claim 11, wherein the action comprises at least one of an ENTER action, a BACK action, an ESCAPE action, and a START action.

18. The game controller of claim 11, further comprising a transducer configured to receive an acoustic input and generate a signal based on the acoustic input.

19. The game controller of claim 11, wherein the game controller is configured to activate when the at least one motion sensor detects motion of the game controller.

20. The game controller of claim 11, wherein the game controller comprises at least one of an exercise controller configured to be worn by the user, a pointing controller, and a sports controller.

Patent History
Publication number: 20090305785
Type: Application
Filed: Jun 6, 2008
Publication Date: Dec 10, 2009
Applicant: Microsoft Corporation (Redmond, WA)
Inventors: Steven M. Beeman (Kirkland, WA), Edward C. Giaimo, III (Bellevue, WA), Eric Filer (Renton, WA), Dennis W. Tom (Redmond, WA)
Application Number: 12/134,448
Classifications
Current U.S. Class: Player-actuated Control Structure (e.g., Brain-wave Or Body Signal, Bar-code Wand, Foot Pedal, Etc.) (463/36)
International Classification: A63F 9/24 (20060101);