Video game system with wireless modular handheld controller

- Nintendo Co., Ltd.

A home entertainment system for video games and other applications includes a main unit and handheld controllers. The handheld controllers sense their own motion by detecting illumination emitted by emitters positioned at either side of a display. The controllers can be plugged into expansion units that customize the overall control interface for particular applications including but not limited to legacy video games.

Description

This application is a reissue of U.S. Pat. No. 8,430,753, for which more than one reissue application has been filed, namely application Ser. No. 14/694,783, filed on Apr. 23, 2015, which is also a reissue of U.S. Pat. No. 8,430,753.

CROSS-REFERENCES TO RELATED APPLICATIONS

This application is a reissue of U.S. application Ser. No. 13/071,088 (now U.S. Pat. No. 8,430,753), filed Mar. 24, 2011, which is a continuation of U.S. application Ser. No. 11/532,328, filed Sep. 15, 2006, which claims priority from provisional application No. 60/716,937, filed on Sep. 15, 2005. This application is also related to U.S. application Ser. No. 11/446,187, filed on Jun. 5, 2006; and U.S. application Ser. No. 11/446,188, filed on Jun. 5, 2006, the disclosures of which are incorporated herein by reference.

FIELD

The technology herein relates to consumer electronics, and more particularly to video game and entertainment systems. In still more detail, the technology herein relates to a home video game system including a modular remote wireless handheld control device with capabilities including position sensing.

BACKGROUND AND SUMMARY

Computer graphics technology has come a long way since video games were first developed. Relatively inexpensive 3D graphics engines now provide nearly photo-realistic interactive game play on home video game and personal computer hardware platforms costing only a few hundred dollars.

Most game players demand great graphics, but the core of video game play is the man-machine interface—the interaction between the (human) game player and the gaming platform. Video games are fun and exciting to play because the game player can interact with the game and affect or control the gaming events and outcome. Since the essence of an enjoyable video game play experience relates to the way the user interacts with the game and the game playing system, user input details tend to be important to the success and marketability of home video game play systems.

One aspect of the video game user interface relates to how the user controls the position of one or more objects on the display. Much work has been done on this user interface aspect in the past. For example, the first Magnavox Odyssey home video game systems provided detachable handheld controllers with knobs that allowed the game player to control the horizontal and vertical positioning of objects on the screen. Pong®, another early home video game system, had a very simple user interface providing controls the players manipulated to control the positioning of paddles on the screen. Nintendo's Game and Watch® early handheld video game systems used a “cross-switch” as described in Nintendo's U.S. Pat. No. 4,687,200 to control the position of objects on the screen. These were relatively simple yet effective user interfaces.

In recent years, video game system handheld controllers have tended to become increasingly complicated and capable. Video game platforms offered by Nintendo and others have provided joysticks, cross-switches or other user-manipulable controls as a means for allowing the user to control game play in a variety of simple and sophisticated ways. Many handheld controllers provide multiple joysticks as well as an array of trigger buttons, additional control buttons, memory ports, and other features. Rumble or vibration effects are now common, as are wireless capabilities. Home video game manufacturers supply a variety of user input devices, and game accessory manufacturers often provide an even wider array of input device options. For example, some in the past have also tried to develop a video game handheld controller that senses the orientation of the handheld controller itself to control object position on the display. See U.S. Pat. No. 5,059,958 assigned to the present assignee.

One challenge that some have confronted in the past relates to cross-platform video game play. Generally, most video game system manufacturers differentiate new gaming systems from other or previous ones by providing unique user interface features including for example handheld controller configurations. Video games for play on different home video game platforms may therefore use different handheld controller configurations. While it may be possible in some cases to “remap” the user controls from one interface configuration to another so a game for one platform can be controlled using a different input control interface, such remapping may be less than optimal and/or change the game play experience in significant ways. For example, playing a game using a four-active-position cross-switch to control the movement of the main character on the screen may be quite a different experience for the user as compared with using an analog or digital joystick offering many different directional positions.

Furthermore, most video game platforms in the past have provided a single basic user interface that is used for all games playable on the platform. Even though different video games may provide quite different game play, video game developers have become skilled at using the common set of user input controls provided by the platform to control various different games. For example, most games developed to run on the Nintendo GameCube home video game system make use of the same handheld controller inputs comprising two joysticks, trigger switches and additional miscellaneous controls. Some games allocate different controls to different functions. For example, in one game, the left-hand joystick might navigate a 2D map view of a battlefield whereas in another game that same control might be used to allow the user to adjust virtual camera position or direction within a three-dimensional world.

The technology herein advances home video game user interfaces in ways not previously envisioned, to provide a more flexible and satisfying user experience across an ever increasing and divergent range of video games and other applications.

One illustrative non-limiting exemplary aspect of the technology herein provides for positioning video game objects on the screen in response to the position of a handheld controller relative to the display. Rather than moving a joystick or cross-switch, the user simply moves the entire handheld controller. The motion of the controller is sensed and used to control the position of objects or other parameters in connection with video game play.

Another exemplary non-limiting illustrative aspect of the technology herein provides a handheld controller with a modular design. The basic controller functionality, including wireless connectivity, vibration generation, position sensing, orientation sensing and other features, is provided within a core or basic handheld controller unit. This core unit can control many or most videogame input functions and play most games. However, for enhanced input functionality, the core unit can be plugged into an expansion controller assembly providing additional controls, inputs and other functionality. As one example, the core unit can be plugged into a first accessory expansion unit providing touch pads when it is desired to play videogames requiring touch pad input. The same core unit can be plugged into a different expansion unit providing joysticks and other input devices to play videogames designed for joystick inputs. The same core controller can be plugged into a still additional expansion unit when the player wishes to interact with a videogame system using a simpler control interface providing a cross-switch and additional input buttons. In one exemplary illustrative non-limiting implementation, some of the accessory units are designed to mimic earlier or different videogame platforms to allow the videogame system to match user interactivity experiences provided by such other systems.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other features and advantages will be better and more completely understood by referring to the following detailed description of exemplary illustrative non-limiting implementations in conjunction with the drawings, of which:

FIG. 1 shows an exemplary illustrative videogame system being operated in a typical home game playing environment;

FIG. 2 shows an exemplary illustrative non-limiting implementation of a handheld videogame controller;

FIGS. 2A-2E show different views of the FIG. 2 implementation being grasped by the hand;

FIG. 2F shows exemplary two-handed operation;

FIGS. 3 and 3A show exemplary illustrative variations of the FIG. 2 controller with a top plate removed;

FIG. 4 shows a bottom view of the FIG. 2 controller;

FIG. 5 shows a bottom view of the FIG. 2 controller with bottom cover removed;

FIG. 6 shows a side and front perspective view of the exemplary FIG. 2 controller;

FIG. 6A shows an additional exemplary view of the FIG. 2 controller including a head pivot or tilt feature;

FIGS. 6B-6H show different views of an alternative exemplary illustrative non-limiting handheld controller implementation;

FIGS. 7A and 7B show different views of the FIG. 2 controller when used to detect position relative to light emitters;

FIGS. 8A, 8B, 8B-1, 8C and 8D show exemplary illustrative non-limiting expansion controller units into which the FIG. 2 core unit may be removably disposed and interconnected;

FIG. 9 shows an exemplary illustrative non-limiting block diagram implementation of the FIG. 1 system;

FIG. 10 shows an overall block diagram of the FIG. 2 controller;

FIG. 11 is an exemplary illustrative non-limiting block diagram of an overall system; and

FIGS. 12A-12C show exemplary illustrative non-limiting block diagrams of different expansion unit controller configurations.

DETAILED DESCRIPTION

Example Overall Exemplary Illustrative Non-Limiting System

FIG. 1 shows an illustrative, exemplary non-limiting implementation of a video game system 100. System 100 includes a main unit 102 sometimes also called a “console.” Main unit 102 executes applications including video game software, and generates images for display on the display 104 of a conventional home color television set or other display device 106. Main unit 102 also generates sound for reproduction by TV set 106. People 108 can interact with the video game play to control or affect the images and the progression of the game or other application.

Main unit 102 in the exemplary illustrative non-limiting implementation can be used to play a variety of different games including driving games, adventure games, flying games, fighting games, and almost any other type of game one might think of. The video game software that main unit 102 executes may be delivered on bulk storage devices such as optical disks, semiconductor memory devices or the like, it may be downloaded into the main unit over a network, or it may be provided to the main unit in any other desired manner. Main unit 102 may also be capable of performing applications in addition to video games (e.g., movie playback, email, web browsing, or any other application one can imagine). A security system built into main unit 102 may ensure that only authorized or authentic applications are executed.

FIG. 1 shows people (“video game players”) 108a, 108b interacting with main unit 102 to play a video game. While two players 108 are shown, any number of players may interact with the main unit 102 at any given time. In the exemplary illustrative non-limiting implementation shown, each video game player 108 holds and operates a wireless handheld control unit (“controller”) 200. The players 108 operate these controllers 200 to generate input signals. The controllers 200 communicate their input signals wirelessly to main unit 102. Such wireless communications can be by any convenient wireless method such as radio transmission, infrared, ultraviolet, ultrasonic or any other desired technique. The wireless link could be based on Bluetooth, 802.11 (WiFi), HiperLAN/1, HiperLAN/2, HomeRF, UWB, WiMax or other standards. In other implementations, cords or cables could be used to connect controllers 200 to main unit 102.

In the exemplary illustrative non-limiting implementation of system 100 shown, players 108 operate handheld controllers 200 in various ways to provide input signals to main unit 102. For example, players 108 may depress buttons or otherwise manipulate other controls on controllers 200 to generate certain input signals. The effect of such control manipulations in the exemplary illustrative non-limiting implementation depends, at least in part, on the particular software that main unit 102 is executing. For example, depressing a certain button may provide a “start game” or “pause game” function in some contexts, and may provide different functions (e.g., “jump character”) in other contexts.

In the illustrative exemplary non-limiting implementation shown, controllers 200 have internal capabilities for detecting position and/or orientation. In the exemplary illustrative non-limiting implementation, players may change the orientation or position of controllers 200 to generate input signals. Controllers 200 may sense position and/or orientation and report that information to main unit 102. Main unit 102 may use that information to control or affect video game play or other functionality.

In one exemplary illustrative non-limiting implementation, each handheld controller 200 may include an internal position, attitude or orientation sensor that can sense the position, attitude and/or orientation of the controller relative to the earth's gravitational force. Such a sensor may for example comprise a 3-axis accelerometer that can sense orientation (or changes in orientation) of the controller 200 relative to the direction of earth's gravitational pull. The output of such a sensor may be reported to main unit 102 and used for example to control motion of a character displayed on display 104.
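
By way of a non-limiting illustration, the following sketch shows how pitch and roll angles might be recovered from a single 3-axis accelerometer sample while the controller is held roughly still, so that gravity dominates the measurement. The function name and the assumption that readings are expressed in units of g are illustrative only and are not taken from the text:

    import math

    def tilt_from_gravity(ax, ay, az):
        """Estimate pitch and roll (in degrees) from one 3-axis
        accelerometer sample (ax, ay, az) expressed in units of g.
        Valid only while the controller is at rest or moving slowly,
        so that gravity dominates the measured acceleration."""
        pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
        roll = math.degrees(math.atan2(ay, az))
        return pitch, roll

    # A controller lying flat and level reads roughly (0, 0, 1):
    print(tilt_from_gravity(0.0, 0.0, 1.0))   # -> (0.0, 0.0), no tilt

Such derived inclination values could then be reported to main unit 102 in place of, or in addition to, the raw acceleration samples.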

In addition, the exemplary illustrative non-limiting implementation of system 100 shown in FIG. 1 includes wireless emitters 110a, 110b. These wireless emitters 110 may be placed on each side of display 104 in alignment with the edges of the screen. The wireless emitters 110 may for example each comprise one or more light emitting diodes or other devices 112 that emit infrared or other electromagnetic or other radiation.

In one exemplary illustrative non-limiting implementation, the energy that emitters 110 emit has a wavelength or other characteristic that allows the radiation to be readily distinguished from ambient radiation. In the exemplary illustrative non-limiting implementation, handheld controllers 200 each detect the radiation emitted by emitters 110 and generate signals indicative of the controller's relative position and/or movement. Multiple controllers 200 can sense the same emitted radiation and generate different signals depending on the position or movement of that particular controller. Controllers 200 report the relative position and/or movement signal to main unit 102. Main unit 102 may take any appropriate action in response to such signals such as, for example, moving, rotating or otherwise changing a game character or other object or background on the display 104, scrolling a screen, selecting a different game function, or taking other actions.

In the exemplary illustrative implementation shown, the emitters 110 are added or retro-fitted onto a conventional color television set 106 by for example using an adhesive to attach the emitters onto the top housing of the television set on the extreme left and right of the housing in alignment with the edges of display 104. In this exemplary illustrative non-limiting implementation, emitters 110 can be connected to main unit 102 by cables or wires run behind the television set 106. In other implementations, emitters 110 could be built-in to television set 106 or mounted separately (e.g., on a set top box or otherwise). In still other implementations, emitters 110 could possibly be replaced with small reflective surfaces attached by adhesive to corners of display 104, and controllers 200 could emit electromagnetic radiation and receive reflections from the reflective surfaces (e.g., whose angle of incidence is equal to angle of reflectance). In still other implementations, controllers 200 could emit electromagnetic radiations and units 110 could include sensors that sense the emitted radiation. Other implementations are possible.

Example Illustrative Non-Limiting Handheld Controller Design

FIG. 2 shows a perspective view of an exemplary illustrative non-limiting implementation of controller 200. Controller 200 provides a housing 202 that is graspable by one hand (see FIGS. 2A, 2B, 2C). Controller 200 in the exemplary illustrative non-limiting implementation is compact and has a solid rugged feel to it. It can be dropped onto a hard surface without breaking. Portions of its housing 202 are curved to fit comfortably into the hand (see FIGS. 2A, 2B, 2C).

As shown in FIG. 2A, the thumb can be positioned to operate controls on a top control surface 204 while the fingers are comfortably wrapped around the controller's bottom surface 203. The digits of the hand (including the thumb) can operate the different controls arrayed on a top control surface 204 and elsewhere on the controller without fatigue and without wasted or undue motion. The controller 200 is small and lightweight enough to be comfortably held and supported for long periods of time without fatigue. Controller 200 is dimensioned to exactly and comfortably fit the average hand—not too small, not too big. The controls are arranged such that the controller 200 can be operated equally easily by the right hand or the left hand.

The controller housing 202 provides a top control surface 204 with an array of controls depressible by the digits (fingers and/or thumbs) of the user's hand. In one illustrative non-limiting implementation, the user may operate a direction switch 206 with a thumb or forefinger to indicate a direction in two dimensions. In the illustrative non-limiting exemplary implementation shown, the directional switch 206 may comprise a switch surface 208 that can be rocked in different directions to provide different direction signals. The simplest form of such a directional switch 206 may comprise a so-called “cross switch” (a switch in the shape of a cross) that can be rocked in four different directions to provide four different, mutually exclusive direction signals (i.e., up, down, left, right). A somewhat more flexible form of directional switch 206 may comprise a circular switch surface 208 that can be rocked in any of a number of different directions to provide corresponding different control signals indicating, for example, twelve, sixteen or more different directions. Other directional switch configurations could be used to provide a much higher number of directional inputs approaching, equaling or exceeding the number of signals from an analog or digital joystick. A touch or “joy” pad, a pointing stick, a trackball, or other input device could be used instead of or in addition to a switch. If a joypad were used, it could likely be operated in a direction-indicating mode as opposed to a “drag displacement” mode. Other arrangements could include touch sensitive display(s) or other types of displays.
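
The different switch styles described above can be viewed as progressively finer quantizations of a two-dimensional deflection. The following non-limiting sketch (all names and the dead-zone value are hypothetical) maps a deflection onto four mutually exclusive directions for a cross switch, or onto sixteen directions for a circular switch surface:

    import math

    def quantize_direction(dx, dy, sectors=4):
        """Map a two-dimensional deflection (dx, dy) onto one of
        `sectors` discrete direction codes (0 = up, increasing
        clockwise), or None when the switch is centered.
        sectors=4 models a cross switch (up/down/left/right only);
        sectors=16 approximates the many-direction circular surface."""
        if math.hypot(dx, dy) < 0.25:      # dead zone: not pressed
            return None
        angle = math.degrees(math.atan2(dx, dy)) % 360.0
        return round(angle / (360.0 / sectors)) % sectors

    print(quantize_direction(0.0, 1.0))       # 0: "up" on a 4-way cross switch
    print(quantize_direction(0.7, 0.7, 16))   # 2: finer-grained direction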

Top control surface 204 in the exemplary illustrative non-limiting implementation also provides a pair of thumb-operated control switches 210a, 210b. These control switches 210a, 210b can be oriented as shown, or they could each be rotated, say, 45 degrees so as to be angularly displaced from one another in order to expose more surface area to a thumb positioned to operate either control switches 210 or directional switch 206. Control switches 210a, 210b could be used to actuate a variety of game or other functions including for example “start” and “select” functions.

Top control surface 204 may also provide an additional push button 214 operated by the thumb for other functionality selection. A slide switch 216 on the side of housing 202 may be operated to provide on/off or other functionality. Depending on requirements, a slide switch 216 could be located on either or both side surfaces of the exemplary controller 200.

Top control surface 204 in the exemplary illustrative non-limiting implementation further provides two additional controls 212a, 212b that may comprise indicator lamps or lights. Alternatively, such controls 212 could comprise additional operable controls such as push button switches, so-called “pointing stick” type input devices, or other input devices. These controls 212 may be relatively dormant or little used (while not being subject to accidental operation) when the controller 200 is operated in the hand positions shown in FIGS. 2A, 2B, 2C, 2D, 2E, 2F. However, another way of using controller 200 is to hold the controller in one hand (or place it on a flat surface such as a table) and operate its controls with the forefinger and other fingers of the other hand. In such an alternate operating mode, the forefinger could be used to operate controls 212a, 212b if they are activatable input devices as opposed to indicators. FIG. 2D for example shows that in one exemplary illustrative implementation, the user may move his or her thumb forward or backward to access different controls. FIG. 2E shows the ability to move the thumb side to side to provide different control actuations. FIG. 2F shows an exemplary illustrative non-limiting implementation whereby the user can hold the handheld controller in both hands and operate it with both left thumb and right thumb simultaneously.

FIG. 3 shows an exploded view of controller 200 with a top plate 204 removed to reveal a printed circuit board 220. Metallic pathways (not shown) and associated solder or other electrical interconnections may be used to electrically interconnect components via PC board 220. Various components including integrated circuit chips 222 (e.g., a wireless RF “Bluetooth” or other communications device, an accelerometer and other components) may be mounted to the printed circuit board 220. The printed circuit board 220 may also serve as a mounting surface for the directional switch 206, controls 210, 212, etc. The printed circuit board 220 in one exemplary illustrative non-limiting implementation provides a rugged fiberglass structure used to both mount and electrically interconnect components of controller 200. The same or different printed circuit board 220 may provide an edge or other connector 224 for use in electrically connecting controller 200 to other devices (to be described below). FIG. 3A shows a different exemplary illustrative non-limiting implementation with a different exemplary non-limiting control layout. Further configurations are also possible.

FIG. 4 shows a bottom view of an exemplary illustrative non-limiting implementation of controller 200. The bottom view reveals an access plate 230 for installing one or more small conventional removable/replaceable battery cells (see FIG. 5). FIG. 4 also shows an additional “trigger” type switch 232 operable by the forefinger when the controller is held in the hand (see FIGS. 2A, 2C). “Trigger” switch 232 may for example sense pressure to provide a variable input signal that depends on how much pressure the user's forefinger is exerting on the switch. Such a variable-pressure “trigger” switch 232 can be used in a video game to fire weapons, control the speed of a vehicle in a driving or space game, or provide other functionality.
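
As a non-limiting illustration of such a variable input, the sketch below maps a raw pressure reading onto a normalized throttle value; the 8-bit reading range and the small resting-contact dead zone are assumptions made for illustration, not details from the text:

    def trigger_to_throttle(raw, floor=8, ceiling=255):
        """Map a raw pressure reading from the variable trigger
        (assumed 8-bit here) onto a 0.0-1.0 throttle value, ignoring
        light resting contact below `floor` so a forefinger resting
        in the depression does not register as a squeeze."""
        if raw <= floor:
            return 0.0
        return min(1.0, (raw - floor) / (ceiling - floor))

    print(trigger_to_throttle(8))     # 0.0: finger merely resting
    print(trigger_to_throttle(131))   # ~0.5: half squeeze, half speed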

In the exemplary illustrative non-limiting implementation shown, the trigger switch 232 is disposed on an angular surface 234 of the bottom surface 240 of controller 200 within a V-shaped depression 236 located near the front distal end 238. This V-shaped depression 236 is dimensioned to comfortably provide a resting and grasping slot for the forefinger (see FIG. 2C), which may be slightly rotated and pulled toward the user between a resting position (see FIG. 2C) and an actuation position (see FIG. 2A). With the middle, ring and pinkie fingers wrapped around and grasping the curved center 240c and rear 240r portions of the controller's bottom surface 203 and the forefinger comfortably engaged within the V-shaped depression 236, the user feels quite comfortable holding and operating controller 200 with one hand and positioning and aiming it precisely in desired directions.

FIG. 5 shows an exploded view of controller 200 with the lower housing portion 240 removed to expose internal components such as removably replaceable batteries 250 and associated holders/contacts 252, and trigger switch 232. While two batteries 250 are shown in FIG. 5, any number of batteries (e.g., one, three, etc.) can be used depending on weight, power and other requirements. Note that to replace batteries 250, the user would not usually remove the lower housing 240 but rather would simply remove the access plate 230. In other configurations, the controller 200 might be rechargeable and batteries 250 could be of a nickel-cadmium or other type that does not require routine replacement. In such an exemplary configuration, the controller 200 could be placed into a charging station to recharge the batteries 250 instead of expecting the user to replace the batteries. While FIG. 5 shows a separate edge connector 224, it is possible that the edge connector could be formed by a distal edge of the printed circuit board 220.

FIGS. 6B-6H show an additional exemplary non-limiting illustrative implementation of a handheld controller with a different control configuration. A power button 1002 may be used to activate power on the main unit 102. A control pad 206 provides directional input. An A button 1004 can be operated by the thumb instead of the control pad 206 to provide a momentary on-off control (e.g., to make a character jump, etc.). Select and start buttons 1006, 1008 may be provided for example to start game play, select menu options, etc. A menu button 1010 (which may be recessed to avoid accidental depression) may be provided to display or select menu/home functions. X and Y buttons may be used to provide additional directional or other control. Light emitting diodes or other indicators 1016a-d may be used to indicate various states of operation (e.g., for example to designate which controller number in a multi-controller environment the current controller is assigned). A connector 1018 is provided to connect the controller to external devices. FIG. 6C shows an underneath side perspective view, FIG. 6D shows a top plan view, FIG. 6E shows a side plan view, FIG. 6F shows a bottom plan view, FIG. 6G shows a front plan view, and FIG. 6H shows a rear plan view.

Example Illustrative Non-Limiting Optical Pointing Device Motion Detection

FIG. 6 shows a front perspective view of controller 200 illustrating an additional sensing component 260 also shown in FIG. 5. Sensor 260 in the exemplary illustrative non-limiting implementation is disposed on the “nose” or front surface 262 of controller 200 so that it points forward, looking down a pointing axis P. The direction of pointing axis P changes as the user changes the orientation of controller 200. It is possible to provide a pivot mechanism (see FIG. 6A) to allow the user to pivot the nose portion up and down to provide better ergonomics (e.g., the user could be sitting on the floor below the level of the emitters 112 and still be able to point directly forward, with the sensor 260 axis P being aimed upwardly).

Sensing component 260 in the exemplary illustrative non-limiting implementation comprises an infrared-sensitive CCD type image sensor. Sensor 260 may comprise a one-dimensional line sensor or it could comprise a 2D sensor such as for example a low resolution monochrome CCD or other camera. Motion tracking sensor 260 may include a lens and a closely coupled digital signal processor to process incoming images and reduce the amount of information that needs to be conveyed to main unit 102. In one exemplary non-limiting implementation, motion tracking sensor 260 may include a 128 pixel by 96 pixel relatively low resolution monochrome camera, a digital signal processor and a focusing lens. More than one such sensor could be used if desired.
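
A minimal sketch of the kind of data reduction such a closely coupled digital signal processor might perform is shown below: the low resolution frame is reduced to the centroids of its bright regions, so only a few coordinate pairs, rather than 128 by 96 pixels, need be conveyed to main unit 102. The thresholding and flood-fill approach here is one assumed possibility, not the specific processing used:

    def bright_spots(frame, threshold=200):
        """Reduce a grayscale frame (a list of rows of 0-255 ints,
        e.g. 96 rows of 128 columns) to the centroids of its
        above-threshold regions, grouped by 4-connected flood fill.
        Returns a short list of (x, y) centroids, which is all the
        controller needs to report instead of raw pixels."""
        h, w = len(frame), len(frame[0])
        seen = [[False] * w for _ in range(h)]
        spots = []
        for y in range(h):
            for x in range(w):
                if frame[y][x] < threshold or seen[y][x]:
                    continue
                stack, xs, ys = [(x, y)], [], []
                seen[y][x] = True
                while stack:
                    cx, cy = stack.pop()
                    xs.append(cx)
                    ys.append(cy)
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                                   (cx, cy + 1), (cx, cy - 1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and not seen[ny][nx]
                                and frame[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                spots.append((sum(xs) / len(xs), sum(ys) / len(ys)))
        return spots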

In the exemplary illustrative non-limiting implementation, sensor 260 gives controller 200 optical pointing capabilities. For example, movement of the controller 200 can be detected (e.g., by the controller itself) and used to control what is being displayed on display 104. Such control could include for example scrolling of the screen, rotation or other reorientation of display objects in response to rotation/reorientation of controller 200, and other responsive interactive displays. Such control may provide a better moment arm as compared to a joystick.

In the exemplary illustrative non-limiting implementation, sensor 260 is designed and configured to sense the emitters 110 shown in FIG. 1. FIGS. 7A, 7B show that sensor 260 has a certain well defined field of view (FOV) symmetrical with the sensor pointing axis P. For example, the sensor 260 may have a field of view of about 20.5 degrees on either side of pointing axis P (this particular field of view angle is a design choice; other choices are possible in other configurations). Such a well defined field of view provides an acute triangularly shaped (or cone-shaped for 2D sensor configurations) viewing area that sensor 260 can “see”, with the base of the triangle increasing in length as distance from the controller 200 increases. Sensor 260 also has a well defined sensitivity such that it can only “see” IR emissions above a certain range of intensity. Emitters 112 are designed in the exemplary illustrative non-limiting implementation to provide sufficient output power and beam spreading consistent with the sensitivity of sensor 260 such that the sensor can “see” the emitters at ranges consistent with how video game players arrange themselves in a room relative to a television set 106 (taking into account that a player may sometimes sit close to the television when playing by himself, that players may be sitting on the floor, standing, sitting on chairs or couches or other furniture, etc.).

In more detail, FIG. 7A shows that in the exemplary illustrative non-limiting implementation, the overall field of view of sensor 260 is wider than the typical separation of emitters 112 and is also wider than the beam width of each emitter 112. In one exemplary illustrative non-limiting implementation, the ratio of the beam spreading angle (e.g., 34 degrees) of the beams emitted by emitters 112 to the field of view (e.g., 41 degrees) of sensor 260 may be approximately 0.82 (other ratios are possible). Plural emitters 112 can be used at each emission point to provide a wider beam (horizontal field of view) than might otherwise be available from only a single emitter, or a lens or other optics can be used to achieve desired beam width.
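
Using the example angles quoted above, the geometry can be checked numerically. The short sketch below computes the beam-to-field-of-view ratio and the base width of the triangular viewing area at a given distance (2 * d * tan(FOV / 2)); the 3-meter viewing distance is an arbitrary illustration:

    import math

    FOV_DEG = 41.0    # sensor field of view (2 x 20.5 degrees) from the text
    BEAM_DEG = 34.0   # emitter beam spreading angle from the text

    def visible_width(distance, fov_deg=FOV_DEG):
        """Base width of the triangular viewing area at a given
        distance: the sensor sees a span of 2 * d * tan(FOV / 2)."""
        return 2.0 * distance * math.tan(math.radians(fov_deg / 2.0))

    print(round(BEAM_DEG / FOV_DEG, 2))   # 0.83, i.e. the "approximately 0.82" ratio
    print(round(visible_width(3.0), 2))   # ~2.24 meters seen at a 3 m range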

At an average distance from controller 200 to television set 106 and associated emitters 112 and assuming a maximum television screen size (and thus a maximum physical separation between the emitters), such a ratio may maximize the displacement of two radiation “dots” or points appearing on the CCD sensor array 270 that sensor 260 comprises. Referring to FIG. 7A for example, when the central axis of sensor 260 is directed centrally between displaced emitters 112 (note that in one exemplary illustrative non-limiting implementation, the emitters are disposed on either side of the television display and are therefore relatively far apart relative to the resolution of the image being generated), the CCD array 270 that sensor 260 defines will register maximal illumination at two points near the ends of the sensor array. This provides a higher degree of resolution when the sensor 260's central axis P is displaced relative to the center of separation of the emitters 112 (see FIG. 7B) even when using a relatively low resolution CCD imaging array (e.g., a 128-cell long sensor array). Note that while a linear array 270 is illustrated in FIGS. 7A, 7B for sake of convenience, a rectangular array could be used instead.

In the illustrative, exemplary non-limiting implementation shown, it is unnecessary to modulate or synchronize emitters 112, although it may be desirable to power down the emitters when not in use to conserve power. In other arrangements, however, synchronous detection, modulation and other techniques could be used.

The exemplary illustrative non-limiting implementation of controller 200 and/or main unit 102 includes software or hardware functionality to determine the position of controller 200 relative to emitters 112, in response to the illumination maxima sensed by sensor 260. In one example illustrative non-limiting implementation, controller 200 includes an onboard processor coupled to the sensor 260 that interprets the currently detected illumination pattern, correlates it with previous sensed illumination patterns, and derives a current position. In another example illustrative non-limiting implementation, controller 200 may simply report the sensed pattern to main unit 102 which then performs the needed processing to detect motion of controller 200. The sensor could be affixed to the human operating the system to provide additional control.

Since it may not be desirable to require end users of system 100 to measure and program in the precise distance between the emitters 112, and since television sets vary in dimension from small screens to very large screens, controller 200 does not attempt to calculate or derive exact position or distance information. Rather, controller 200 may determine movement (changes in relative position or distance) by analyzing changes in the illumination pattern “seen” by CCD array 270.

It may be possible to ask the user to initially point the controller 200 at the center of the television screen 104 and press a button, so as to establish a calibration point (e.g., see FIG. 7A)—or the game player may be encouraged to point to the center of the screen by displaying an object at the center of the screen and asking the user to “aim” at the object and depress the trigger switch. Alternatively, to maximize user friendliness, the system can be self-calibrating or require no calibration at all.

Differences in the illumination pattern that CCD array 270 observes relative to previously sensed patterns (see e.g., FIG. 7B) can be used to determine or estimate movement (change in position) relative to a previous position in three dimensions. Even though the CCD array 270 illumination shown in the FIG. 7B scenario is ambiguous (it could be obtained by aiming directly at emitter 112a or at emitter 112b), recording and analyzing illumination patterns on a relatively frequent periodic or other basis (e.g., 200 times per second) allows the controller to continually keep track of where it is relative to the emitters 112 and previous controller positions. The distance between the illumination points of emitters 112 as imaged on CCD array 270 can be used to estimate relative distance from the emitters. Generally, game players can be assumed to be standing directly in front of the television set and perpendicular to the plane of display 106. However, scenarios in which controller 200 is aimed “off axis”, such that its central axis P intersects the plane of emitters 112 at an angle other than perpendicular, can also be detected by determining the decreased separation of the two maximum illumination points on the CCD array 270 relative to an earlier detected separation. Care must be taken, however, since changes in separation can be attributed to changed distance from the emitters 112 as opposed to off-axis aiming. Simpler mathematics can be used for motion and relative position detection if one assumes that the player is aiming the sensor axis P directly at the display 104 so the axis perpendicularly intersects the plane of the display.
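
To make the foregoing concrete, the following non-limiting sketch estimates aim offset and relative range change from the two illumination maxima on a 128-cell linear array. The assumed array-center reference (cell 64) stands in for the calibration point discussed above, and, as just noted, the range estimate is ambiguous with respect to off-axis aiming:

    def pointing_estimate(spots, prev_separation=None):
        """Given the x positions (cells on a 128-cell linear array)
        of the two illumination maxima produced by the paired
        emitters, estimate aim and relative range change. `offset`
        is the signed displacement of the dot pair's midpoint from
        the array center (aiming left/right of screen center);
        `separation` shrinks as the controller moves away from (or
        off axis to) the emitters, so its ratio against an earlier
        separation gives a relative range change."""
        if len(spots) != 2:
            return None                  # need both emitters in view
        a, b = sorted(spots)
        midpoint = (a + b) / 2.0
        separation = b - a
        offset = midpoint - 64.0         # assumed array-center reference
        scale = (prev_separation / separation) if prev_separation else 1.0
        return offset, separation, scale

    # Aimed midway between the emitters: dots near the array ends.
    print(pointing_estimate([10.0, 118.0]))         # (0.0, 108.0, 1.0)
    # A later frame with the dots closer together: moved farther away.
    print(pointing_estimate([30.0, 98.0], 108.0))   # scale ~1.59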

Software algorithms of conventional design can ascertain the position of controller 200 relative to emitters 112 and to each logical or actual edge of the display screen 104. If desired, controller 200 may further include an internal conventional 3-axis accelerometer that detects the earth's gravitational forces in three dimensions and may thus be used as an inclinometer. Such inclination (orientation) information in three axes can be used to provide further inputs to the relative position-detecting algorithm, to provide rough (x, y, z) position information in three dimensions. Such relative position information (or signals from which it can be derived) can be wirelessly communicated to main unit 102 and used to control the position of displayed objects on the screen.

Example Modular Control Interface Controller Expansion

FIGS. 8A-8D illustrate an additional feature of the exemplary illustrative non-limiting implementation of controller 200. In accordance with this additional feature, the controller 200 may be used as the “core” of a modular, larger handheld controller unit by connecting the controller 200 to an additional expansion unit 300. Core controller 200 may “ride piggyback” on an expansion unit 300 to easily and flexibly provide additional control interface functionality that can be changed by simply unplugging the controller from one expansion unit and plugging it into another expansion unit.

FIG. 8A shows one exemplary illustrative non-limiting such additional expansion unit 300 including a housing 302 having a control surface 304 providing an array of additional controls including for example a joystick 306, a cross-switch 308 and various push-button controls 310. Expansion unit 300 includes a depression such that when the rear portion of controller 200 is inserted into the depression, the resulting combined unit provides an overall planar T-shaped control surface that combines the expansion unit 300 control surface with the controller 200 control surface in a flush and continuous manner. In such case, the user may grasp the expansion unit 300 with two hands and may operate the controls of controller 200 (see FIG. 8B-1) or controls on the expansion unit 300. Expansion unit 300 thus effectively converts the controller 200 designed to be held in a single hand into a two-handed controller while also supplying additional controls.

FIG. 8B shows a further expansion unit 300′ having a somewhat different control configuration. FIGS. 8C and 8D show additional non-limiting illustrative example expansion units.

As shown in FIG. 8B-1, expansion units 300 may provide all of the controls that the user would operate to control a video game when controller 200 is plugged into the additional unit. This provides a high degree of flexibility, since any number of additional units 300 of any desired configuration can be provided. Such additional units 300 can be manufactured relatively inexpensively since they can rely on controller 200 for power, processing, wireless communications and all other core functions. In the exemplary illustrative non-limiting implementation, controller edge connector 224 exposes sufficient connections and a sufficiently flexible interface such that an expansion unit 300 of virtually any desirable description can be compatibly used.

One possible motivation for manufacturing expansion units 300 is to provide control interface compatibility with other video game platforms including for example legacy platforms such as the Nintendo Entertainment System, the Super Nintendo Entertainment System, the Nintendo 64, the Nintendo GameCube System, and the Nintendo Game Boy, Game Boy Advance and Nintendo DS systems. An expansion unit 300 providing a control interface similar or identical to, for example, the Super Nintendo Entertainment System could be made available for playing Super Nintendo Entertainment System games on system 100. This would eliminate the need to reprogram or rework Super Nintendo Entertainment System games for use with the newer or different interface provided by controller 200.

Another possible, more general motivation for additional expansion units 300 is to provide customized control interfaces for particular games or other applications. For example, it would be possible to develop a unit 300 with a steering wheel for driving games, a unit with a keyboard for text entry applications, a unit with one or multiple touch pads for touch screen style games, etc. Any desired control configuration is possible and can be flexibly accommodated.

Still another possible application would be to use expansion units 300 to give different players of a multi-player game different capabilities. For example, one game player might use controller 200 “as is” without any expansion, another game player might use the expansion configuration shown in FIG. 12A, yet another game player might use the expansion configuration shown in FIG. 12B, etc. One could imagine a military battle game for example in which game players playing the role of tank drivers use an expansion unit that resembles the controls of a tank, game players playing the role of artillerymen use an expansion unit that resembles controls of heavy artillery, and a game player playing the role of a commanding general uses an expansion unit that provides more general controls to locate infantry, artillery and tanks on the field.

Example Illustrative Non-Limiting Block Diagrams

FIG. 9 shows a block diagram of an exemplary illustrative implementation of system 100. As described above, system 100 includes a main unit 102 and one or several controllers 200a, 200b, 200c, etc. Each controller 200 may be connected to any of additional expansion units 300 or may be used by itself, depending on the application. Additional wireless peripherals for system 100 may include a headset unit 180 for voice chat and other applications, a keyboard unit 182, a mouse or other pointing device 184, and other peripheral input and/or output units.

FIG. 10 is a block diagram of an exemplary illustrative non-limiting implementation of controller 200. In the example shown, controller 200 may comprise a wireless connectivity chip 280 that communicates bidirectionally with main unit 102 via a pattern antenna 278. Wireless communications chip 280 may be based on the Bluetooth standard but customized to provide low latency. In the example shown here, most or all processing is performed by the main unit 102, and controller 200 acts more like a telemetry device to relay sensed information back to the main unit 102. Such sensed inputs may include a motion tracking sensor 260, an accelerometer 290, and various buttons 206, 210, etc. as described above. Output devices included with or within controller 200 may include a vibrational transducer 292 and various indicators 294.
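
Since the controller acts largely as a telemetry device in this implementation, its radio traffic can be pictured as a small fixed-size input report sent to main unit 102 each polling interval. The following sketch packs one such report; the 8-byte layout, field widths and names are entirely hypothetical, as the text does not specify the over-the-air format:

    import struct

    # Hypothetical 8-byte input report: a button bitmask, three signed
    # acceleration samples, and the two brightest-spot coordinates
    # from the optical sensor's DSP.
    REPORT = struct.Struct("<HbbbBBx")

    def pack_report(buttons, ax, ay, az, spot1_x, spot2_x):
        """Pack one controller-to-console telemetry frame."""
        return REPORT.pack(buttons, ax, ay, az, spot1_x, spot2_x)

    frame = pack_report(buttons=0b101, ax=2, ay=-1, az=63,
                        spot1_x=10, spot2_x=118)
    print(frame.hex(), len(frame))   # one 8-byte report per polling interval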

FIG. 11 shows an overall exemplary illustrative non-limiting system block diagram showing a portion of main unit 102 that communicates with controller 200. Such exemplary illustrative non-limiting main unit 102 portion may include for example a wireless controller 1000, a ROM/Real Time Clock 1002, an idle mode indicator 1004, a processor 1006 and various power supplies. Link buttons may be provided on each side of the communications link to provide manual input for synchronization/training/searching.

FIGS. 12A, 12B and 12C show different exemplary block diagram configurations for different expansion units 300. The FIG. 12A example includes dual touch pads 1200 and a joystick 1202 for touch screen compatible gaming; the FIG. 12B example includes two joysticks 1202 and other controls for games requiring two different joysticks (e.g., Nintendo GameCube legacy games); and the FIG. 12C example includes a cross-switch 1204 and other controls for more limited user interface type games (e.g., Nintendo Entertainment System legacy games).

Each expansion unit may be programmed with a 4-bit or other length “type” ID to permit controller 200 to detect which type of expansion unit is being used. Main unit 102 can adapt user interactivity based at least in part on the “type” ID.
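
A non-limiting sketch of how such a “type” ID might drive selection of an input mapping appears below; the specific ID assignments and layout names are hypothetical, loosely following the FIG. 12A-12C configurations:

    # Hypothetical 4-bit expansion "type" IDs; the actual assignments
    # are not given in the text.
    EXPANSION_LAYOUTS = {
        0x0: "core controller only (no expansion attached)",
        0x1: "dual touch pads plus joystick (FIG. 12A style)",
        0x2: "twin joysticks (FIG. 12B style)",
        0x3: "cross-switch legacy pad (FIG. 12C style)",
    }

    def select_layout(type_id):
        """Mask the reported ID to 4 bits and choose an input mapping,
        falling back to the core layout for unknown expansions."""
        return EXPANSION_LAYOUTS.get(type_id & 0xF, EXPANSION_LAYOUTS[0x0])

    print(select_layout(0x2))   # main unit adapts interactivity to the ID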

While the technology herein has been described in connection with exemplary illustrative non-limiting implementations, the invention is not to be limited by the disclosure. The invention is intended to be defined by the claims and to cover all corresponding and equivalent arrangements whether or not specifically disclosed herein.

Claims

1. A wireless handheld remote controller configured to be held in one hand, comprising:

a housing including an upper surface and a lower surface;
at least one digit operable detector disposed on the upper surface;
at least one depressible trigger disposed on said lower surface;
an inertial sensor mounted in the housing;
a two dimensional radiation detector;
a processor that processes an output of the radiation detector and determines an illumination pattern;
a wireless transceiver that transmits information based on signals generated by the inertial sensor and the processor; and
an output device operatively coupled to the transceiver.

2. The controller of claim 1 wherein the radiation detector is disposed, at least in part, at a front portion of the housing.

3. The controller of claim 1, wherein the radiation detector comprises a two dimensional camera.

4. The controller of claim 1, wherein the radiation detector comprises:

a two dimensional radiation sensor array; and
an infrared filter that is mounted on the housing in front of the two dimensional radiation sensor array such that only infrared light passing through the filter is received by the radiation sensor array.

5. The controller of claim 1, wherein the radiation detector generates frames of two dimensional image data, and wherein the processor determines an illumination pattern for each frame of image data.

6. The controller of claim 5, wherein each illumination pattern comprises X and Y coordinates for illuminated objects appearing within a frame of image data.

7. The controller of claim 5, wherein each illumination pattern comprises X and Y coordinates for illuminated objects appearing within a frame of image data that have an intensity that rises above a predetermined threshold value.

8. The controller of claim 5, wherein each illumination pattern comprises X and Y coordinates for illuminated objects appearing within a frame of image data that emit infrared radiation having an intensity that rises above a predetermined threshold value.

9. The controller of claim 5, wherein the wireless transceiver transmits information regarding the illumination patterns for frames of image data.

10. The controller of claim 9, wherein the inertial sensor comprises an accelerometer.

11. The controller of claim 10, wherein the accelerometer is a three axis accelerometer that senses linear acceleration in each of three mutually perpendicular axes, and wherein the inertial sensor outputs three linear acceleration values corresponding to the three mutually perpendicular axes multiple times every second.

12. The controller of claim 11, wherein the wireless transceiver also transmits a set of the three acceleration values multiple times every second.

13. The controller of claim 1, wherein the inertial sensor comprises an accelerometer.

14. The controller of claim 13, wherein the accelerometer is a three axis accelerometer that senses linear acceleration in each of three mutually perpendicular axes, and wherein the inertial sensor outputs three linear acceleration values corresponding to the three mutually perpendicular axes multiple times every second.

15. The controller of claim 14, wherein the wireless transceiver transmits a set of the three acceleration values multiple times every second.

16. The controller of claim 1, wherein the output device comprises a speaker, and wherein the speaker outputs sounds based on a signal received by the wireless transceiver.

17. The controller of claim 1, wherein the output device comprises a vibration module that causes the housing to vibrate based on a signal received by the wireless transceiver.

18. The controller of claim 1, wherein the output device comprises at least one indicator light that is selectively illuminated based on a signal received by the wireless transceiver.

19. The controller of claim 1, wherein the output device comprises an array of indicator lights that are selectively illuminated based on a signal received by the wireless transceiver.

20. The controller of claim 1, wherein the at least one digit operable detector comprises at least one depressible button disposed on the upper surface of the housing.

21. A handheld electronic device, comprising:

a housing configured to be held by both hands of a user for providing input to a processor, wherein the housing includes a top surface, a bottom surface, and a further surface extending between the top surface and the bottom surface;
a touch-sensitive input panel arranged at a surface of the housing and configured to receive touch input; and
a first input device and a second input device arranged at the top surface on a first side from the lateral center of the top surface, the first and second input devices being operable with a thumb of the user to provide directional inputs to the processor,
wherein one of the first input device or the second input device includes a directional switch input device and the other includes an inclinable stick input device, and
wherein the top surface comprises a proximal portion closer to the body of the user when the user holds the housing in two hands, the first input device is arranged between the second input device and the proximal portion.

22. The handheld electronic device according to claim 21, further comprising a third input device arranged at the top surface on a second side that is opposite the first side from the lateral center of the top surface.

23. The handheld electronic device according to claim 22, wherein the third input device includes at least four control buttons in a cross-shaped arrangement.

24. The handheld electronic device according to claim 22, wherein the third input device includes an inclinable stick input device.

25. The handheld electronic device according to claim 24, wherein the first input device includes a directional switch input device, and wherein the first input device and the third input device are positioned symmetrically on opposite sides of the lateral center of the top surface.

26. The handheld electronic device according to claim 22, further comprising a fourth input device arranged at the left side of the further surface and a fifth input device arranged at the right side of the further surface, wherein the fourth input device is operable with a finger of the user's left hand and the fifth input device is operable with a finger of the user's right hand when the user holds the housing with both hands to provide input to the processor.

27. The handheld electronic device according to claim 22, further comprising at least one fourth input device on a surface of the housing other than the top surface, the at least one fourth input device configured to generate an analog signal based upon a level of user input.

28. The handheld electronic device according to claim 27, wherein the at least one fourth input device is further configured to vary the analog signal based upon how much pressure is exerted on the at least one fourth input device by a finger of the user.

29. The handheld electronic device according to claim 22, further comprising a fourth input device arranged at a lateral center of the top surface, the fourth input device being a button switch.

30. The handheld electronic device according to claim 29, wherein the button switch is recessed in relation to the top surface.

31. The handheld electronic device according to claim 30, wherein the fourth input device is configured to generate an input signal to cause a MENU or HOME operation.

32. The handheld electronic device according to claim 22, wherein a recessed button switch is arranged at a lateral center of the top surface, and wherein one of the first input device or the second input device and a second inclinable stick input device are arranged symmetrically on opposite sides of the lateral center of the top surface.

33. The handheld electronic device according to claim 21, further comprising at least one wireless antenna, wherein the handheld electronic device is configured to communicate with a console device using a wireless protocol over the wireless antenna.

34. The handheld electronic device according to claim 33, wherein the wireless protocol is based upon the Bluetooth protocol standard.

35. The handheld electronic device according to claim 21, further comprising a vibration generator configured to vibrate the housing in response to a signal received via a wireless communication interface.

36. The handheld electronic device according to claim 21, further comprising at least one image sensor.

37. The handheld electronic device according to claim 36, further comprising a focusing lens associated with the image sensor.

38. The handheld electronic device according to claim 21, further comprising at least one inertial sensor.

39. A handheld electronic device, comprising:

a housing configured to be held by both hands of a user for providing input to a processor, wherein the housing includes a top surface, a bottom surface, and a further surface extending between the top surface and the bottom surface;
a touch-sensitive input panel arranged at a surface of the housing and configured to receive touch input; and
a first input device and a second input device arranged at the top surface on a first side from the lateral center of the top surface, the first and second input devices being operable with a thumb of the user to provide directional inputs to the processor,
the handheld electronic device further comprising at least one light indication device arranged on the housing and configured to indicate an identification of the handheld electronic device, wherein the identification uniquely identifies the handheld electronic device among a plurality of controllers communicating with a particular console device.

40. A handheld electronic device, comprising:

a housing configured to be held by both hands of a user for providing input to a processor, wherein the housing includes a top surface, a bottom surface, and a further surface extending between the top surface and the bottom surface;
a touch-sensitive input panel arranged at a surface of the housing and configured to receive touch input; and
a first input device and a second input device arranged at the top surface on a first side from the lateral center of the top surface, the first and second input devices being operable with a thumb of the user to provide directional inputs to the processor,
the handheld electronic device further comprising additional input devices arranged at locations of the top surface, wherein the additional input devices are configured, respectively, to cause a start operation, to cause a select operation, and to cause processing to return to a predetermined configuration.

41. A handheld electronic device, comprising:

a housing configured to be held by both hands of a user for providing input to a processor, wherein the housing includes a top surface, a bottom surface, and a further surface extending between the top surface and the bottom surface;
a touch-sensitive input panel arranged at a surface of the housing and configured to receive touch input; and
a first input device and a second input device arranged at the top surface on a first side from the lateral center of the top surface, the first and second input devices being operable with a thumb of the user to provide directional inputs to the processor,
wherein the touch-sensitive input panel includes a touch screen, and wherein
the handheld electronic device further comprises another touch-sensitive input panel.

42. A handheld electronic device, comprising:

a housing configured to be held by both hands of a user for providing input to a processor, wherein the housing includes a top surface, a bottom surface, and a further surface extending between the top surface and the bottom surface;
a touch-sensitive input panel arranged at a surface of the housing and configured to receive touch input; and
a first input device and a second input device arranged at the top surface on a first side from the lateral center of the top surface, the first and second input devices being operable with a thumb of the user to provide directional inputs to the processor,
wherein one of the first input device or the second input device includes a directional switch input device and the other includes an analog directional input device, and
wherein the top surface comprises a proximal portion closer to the body of the user when the user holds the housing in two hands, and wherein the first input device is arranged between the second input device and the proximal portion.

43. A handheld electronic device, comprising:

a housing configured to be held by both hands of a user for providing input to a processor, wherein the housing includes a top surface, a bottom surface, and a further surface extending between the top surface and the bottom surface;
a touch-sensitive input panel arranged at a surface of the housing and configured to receive touch input; and
a first input device and a second input device arranged at the top surface on a first side from the lateral center of the top surface, the first and second input devices being operable with a thumb of the user to provide directional inputs to the processor,
wherein one of the first input device or the second input device includes a directional switch input device and the other includes an analog directional input device, and
wherein the first input device is disposed at a first position on the housing top surface and the second input device is disposed next to the first input device at a second position on the housing top surface, the second position being closer to the body of the user than the first position when the housing is held by both hands of the user.
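
Claims 42 and 43 pair a directional switch (digital) with an analog directional input. A minimal, assumed C sketch of reading the two is given below: the switch yields full-step directions from bit flags, while the analog stick yields a recentered two-axis value. The bit masks and the 0..255, center-128 ADC convention are assumptions for illustration.

    #include <stdint.h>

    /* Hypothetical four-way switch bit flags. */
    #define DPAD_UP    0x01u
    #define DPAD_DOWN  0x02u
    #define DPAD_LEFT  0x04u
    #define DPAD_RIGHT 0x08u

    typedef struct {
        int8_t x;  /* -128..127 after recentering */
        int8_t y;
    } direction_t;

    /* Digital directional switch: each pressed direction is a full step. */
    direction_t read_switch(uint8_t dpad_bits)
    {
        direction_t d = {0, 0};
        if (dpad_bits & DPAD_LEFT)  d.x = -127;
        if (dpad_bits & DPAD_RIGHT) d.x = 127;
        if (dpad_bits & DPAD_UP)    d.y = 127;
        if (dpad_bits & DPAD_DOWN)  d.y = -127;
        return d;
    }

    /* Analog directional input: recenter raw 0..255 ADC readings around zero. */
    direction_t read_stick(uint8_t adc_x, uint8_t adc_y)
    {
        direction_t d = { (int8_t)(adc_x - 128), (int8_t)(adc_y - 128) };
        return d;
    }
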
Referenced Cited
U.S. Patent Documents
3454920 July 1969 Mehr
3474241 October 1969 Kuipers
D220268 March 1971 Kliewer
3660648 May 1972 Kuipers
3973257 August 3, 1976 Rowe
4009619 March 1, 1977 Snyman
4038876 August 2, 1977 Morris
4166406 September 4, 1979 Maughmer
4240638 December 23, 1980 Morrison et al.
4287765 September 8, 1981 Kreft
4303978 December 1, 1981 Shaw et al.
4318245 March 9, 1982 Stowell et al.
4321678 March 23, 1982 Krogmann
4337948 July 6, 1982 Breslow
4342985 August 3, 1982 Desjardins
4402250 September 6, 1983 Baasch
4425488 January 10, 1984 Moskin
4443866 April 17, 1984 Burgiss, Sr.
4450325 May 22, 1984 Luque
4503299 March 5, 1985 Henrard
4514600 April 30, 1985 Lentz
4514798 April 30, 1985 Lesche
4540176 September 10, 1985 Baer
4546551 October 15, 1985 Franks
4558604 December 17, 1985 Auer
4561299 December 31, 1985 Orlando et al.
4578674 March 25, 1986 Baker et al.
4623930 November 18, 1986 Oshima et al.
4672374 June 9, 1987 Desjardins
4739128 April 19, 1988 Grisham
4761540 August 2, 1988 McGeorge
4787051 November 22, 1988 Olson
4816810 March 28, 1989 Moore
4839838 June 13, 1989 LaBiche et al.
4849655 July 18, 1989 Bennett
4851685 July 25, 1989 Dubgen
4862165 August 29, 1989 Gart
4914598 April 3, 1990 Krogmann et al.
4918293 April 17, 1990 McGeorge
4957291 September 18, 1990 Miffitt et al.
4961369 October 9, 1990 McGill
4969647 November 13, 1990 Mical et al.
4988981 January 29, 1991 Zimmerman et al.
4994795 February 19, 1991 MacKenzie
5045843 September 3, 1991 Hansen
D320624 October 8, 1991 Taylor
5059958 October 22, 1991 Jacobs et al.
5062696 November 5, 1991 Oshima et al.
5068645 November 26, 1991 Drumm
D322242 December 10, 1991 Cordell
D325225 April 7, 1992 Ashida
5124938 June 23, 1992 Algrain
5128671 July 7, 1992 Thomas, Jr.
D328463 August 4, 1992 King et al.
5136222 August 4, 1992 Yamamoto
5138154 August 11, 1992 Hotelling
D331058 November 17, 1992 Morales
5175481 December 29, 1992 Kanno
5178477 January 12, 1993 Gambaro
5181181 January 19, 1993 Glynn
5192082 March 9, 1993 Inoue et al.
5202844 April 13, 1993 Kamio et al.
5207426 May 4, 1993 Inoue et al.
D338242 August 10, 1993 Cordell
D340042 October 5, 1993 Copper et al.
5259626 November 9, 1993 Ho
5262777 November 16, 1993 Low et al.
D342256 December 14, 1993 Payne
5280744 January 25, 1994 DeCarlo et al.
D345164 March 15, 1994 Grae
5296871 March 22, 1994 Paley
5307325 April 26, 1994 Scheiber
5317394 May 31, 1994 Hale et al.
5329276 July 12, 1994 Hirabayashi
5332322 July 26, 1994 Gambaro
5339095 August 16, 1994 Redford
D350736 September 20, 1994 Takahashi et al.
D350782 September 20, 1994 Barr
D351430 October 11, 1994 Barr
5357267 October 18, 1994 Inoue
5359321 October 25, 1994 Ribic
5359348 October 25, 1994 Pilcher et al.
5363120 November 8, 1994 Drumm
5369580 November 29, 1994 Monji et al.
H1383 December 6, 1994 Kaplan et al.
5369889 December 6, 1994 Callaghan
5373857 December 20, 1994 Hirabayashi et al.
5396265 March 7, 1995 Ulrich et al.
5421590 June 6, 1995 Robbins
5430435 July 4, 1995 Hoch et al.
D360903 August 1, 1995 Barr et al.
5440326 August 8, 1995 Quinn
5453758 September 26, 1995 Sato
D362870 October 3, 1995 Oikawa
5459489 October 17, 1995 Redford
5469194 November 21, 1995 Clark et al.
5481957 January 9, 1996 Paley et al.
5484355 January 16, 1996 King, II et al.
5485171 January 16, 1996 Copper et al.
5490058 February 6, 1996 Yamasaki et al.
5502486 March 26, 1996 Ueda et al.
5506605 April 9, 1996 Paley
5512892 April 30, 1996 Corballis et al.
5517183 May 14, 1996 Bozeman, Jr.
5523800 June 4, 1996 Dudek
5526022 June 11, 1996 Donahue et al.
5528265 June 18, 1996 Harrison
5531443 July 2, 1996 Cruz
5541860 July 30, 1996 Takei et al.
5551701 September 3, 1996 Bouton et al.
5554033 September 10, 1996 Bizzi
5554980 September 10, 1996 Hashimoto et al.
5561543 October 1, 1996 Ogawa
5563628 October 8, 1996 Stroop
5569085 October 29, 1996 Hashimoto et al.
D375326 November 5, 1996 Yokoi et al.
5573011 November 12, 1996 Felsing
5574479 November 12, 1996 Odell
5579025 November 26, 1996 Itoh
D376826 December 24, 1996 Ashida
5587558 December 24, 1996 Matsushima
5594465 January 14, 1997 Poulachon
5598187 January 28, 1997 Ide et al.
5602569 February 11, 1997 Kato
5603658 February 18, 1997 Cohen
5605505 February 25, 1997 Han
5606343 February 25, 1997 Tsuboyama et al.
5611731 March 18, 1997 Bouton et al.
5615132 March 25, 1997 Horton et al.
5621459 April 15, 1997 Ueda et al.
5624117 April 29, 1997 Ohkubo et al.
5627565 May 6, 1997 Morishita et al.
D379832 June 10, 1997 Ashida
5640152 June 17, 1997 Copper
5641288 June 24, 1997 Zaenglein, Jr.
5643087 July 1, 1997 Marcus et al.
5645077 July 8, 1997 Foxlin et al.
5645277 July 8, 1997 Cheng
5666138 September 9, 1997 Culver
5667220 September 16, 1997 Cheng
5670845 September 23, 1997 Grant et al.
5670988 September 23, 1997 Tickle
5676673 October 14, 1997 Ferre et al.
5679004 October 21, 1997 McGowan et al.
5682181 October 28, 1997 Nguyen et al.
5698784 December 16, 1997 Hotelling et al.
5701131 December 23, 1997 Kuga
5702305 December 30, 1997 Norman et al.
5703623 December 30, 1997 Hall et al.
5724106 March 3, 1998 Autry et al.
5726675 March 10, 1998 Inoue
5734371 March 31, 1998 Kaplan
5734373 March 31, 1998 Rosenberg et al.
5734807 March 31, 1998 Sumi
D393884 April 28, 1998 Hayami
5736970 April 7, 1998 Bozeman, Jr.
5739811 April 14, 1998 Rosenberg et al.
5741182 April 21, 1998 Lipps et al.
5742331 April 21, 1998 Uomori et al.
5745226 April 28, 1998 Gigioli, Jr.
D394264 May 12, 1998 Sakamoto et al.
5746602 May 5, 1998 Kikinis
5751273 May 12, 1998 Cohen
5752880 May 19, 1998 Gabai et al.
5757354 May 26, 1998 Kawamura
5757360 May 26, 1998 Nitta et al.
D395464 June 23, 1998 Shiibashi et al.
5764224 June 9, 1998 Lilja et al.
5769719 June 23, 1998 Hsu
5771038 June 23, 1998 Wang
D396468 July 28, 1998 Schindler et al.
5785317 July 28, 1998 Sasaki
D397162 August 18, 1998 Yokoi et al.
5794081 August 11, 1998 Itoh et al.
5796354 August 18, 1998 Cartabiano et al.
5807284 September 15, 1998 Foxlin
5819206 October 6, 1998 Horton
5820462 October 13, 1998 Yokoi et al.
5822713 October 13, 1998 Profeta
5825350 October 20, 1998 Case, Jr. et al.
D400885 November 10, 1998 Goto
5831553 November 3, 1998 Lenssen et al.
5835077 November 10, 1998 Dao
5835156 November 10, 1998 Blonstein et al.
5841409 November 24, 1998 Ishibashi et al.
D402328 December 8, 1998 Ashida
5847854 December 8, 1998 Benson, Jr.
5850624 December 15, 1998 Gard et al.
5854622 December 29, 1998 Brannon
D405071 February 2, 1999 Gambaro
5867146 February 2, 1999 Kim et al.
5874941 February 23, 1999 Yamada
5875257 February 23, 1999 Marrin et al.
D407071 March 23, 1999 Keating
D407761 April 6, 1999 Barr
5897437 April 27, 1999 Nishiumi et al.
5898421 April 27, 1999 Quinn
5900867 May 4, 1999 Schindler et al.
5902968 May 11, 1999 Sato et al.
D410909 June 15, 1999 Tickle
5912612 June 15, 1999 DeVolpi
5919149 July 6, 1999 Allum
5923317 July 13, 1999 Sayler et al.
5926780 July 20, 1999 Fox et al.
5929782 July 27, 1999 Stark et al.
D412940 August 17, 1999 Kato
5947868 September 7, 1999 Dugan
5955713 September 21, 1999 Titus et al.
5955988 September 21, 1999 Blonstein et al.
5956035 September 21, 1999 Sciammarella et al.
5967898 October 19, 1999 Takasaka et al.
5973757 October 26, 1999 Aubuchon et al.
5982352 November 9, 1999 Pryor
5982356 November 9, 1999 Akiyama
5984548 November 16, 1999 Willner et al.
5984785 November 16, 1999 Takeda
5986644 November 16, 1999 Herder et al.
5991085 November 23, 1999 Rallison et al.
5999168 December 7, 1999 Rosenberg et al.
6002394 December 14, 1999 Schein et al.
D419199 January 18, 2000 Cordell et al.
D419200 January 18, 2000 Ashida
6010406 January 4, 2000 Kajikawa et al.
6011526 January 4, 2000 Toyoshima et al.
6012980 January 11, 2000 Yoshida et al.
6013007 January 11, 2000 Root et al.
6016144 January 18, 2000 Blonstein et al.
6019680 February 1, 2000 Cheng
6020876 February 1, 2000 Rosenberg et al.
6037882 March 14, 2000 Levy
6044297 March 28, 2000 Sheldon et al.
6049823 April 11, 2000 Hwang
6052083 April 18, 2000 Wilson
6057788 May 2, 2000 Cummings
6058342 May 2, 2000 Orbach et al.
6059576 May 9, 2000 Brann
6069594 May 30, 2000 Barnes et al.
6072467 June 6, 2000 Walker
6072470 June 6, 2000 Ishigaki
6075575 June 13, 2000 Schein et al.
6081819 June 27, 2000 Ogino
6084315 July 4, 2000 Schmitt
6084577 July 4, 2000 Sato et al.
6087950 July 11, 2000 Capan
D429718 August 22, 2000 Rudolph
6110039 August 29, 2000 Oh
6115028 September 5, 2000 Balakrishnan
6137457 October 24, 2000 Tokuhashi et al.
D433381 November 7, 2000 Talesfore
6146278 November 14, 2000 Kobayashi
6148100 November 14, 2000 Anderson et al.
6155926 December 5, 2000 Miyamoto et al.
6160405 December 12, 2000 Needle et al.
6160540 December 12, 2000 Fishkin et al.
6162191 December 19, 2000 Foxlin
6164808 December 26, 2000 Shibata et al.
6171190 January 9, 2001 Thanasack et al.
6176837 January 23, 2001 Foxlin
6181329 January 30, 2001 Stork et al.
6183365 February 6, 2001 Tonomura et al.
6184862 February 6, 2001 Leiper
6184863 February 6, 2001 Sibert et al.
6186896 February 13, 2001 Takeda et al.
6191774 February 20, 2001 Schena et al.
6198295 March 6, 2001 Hill
6198470 March 6, 2001 Agam et al.
6198471 March 6, 2001 Cook
6200219 March 13, 2001 Rudell et al.
6200253 March 13, 2001 Nishiumi et al.
6201554 March 13, 2001 Lands
6211861 April 3, 2001 Rosenberg et al.
6217450 April 17, 2001 Meredith
6217478 April 17, 2001 Vohmann et al.
D442998 May 29, 2001 Ashida
6225987 May 1, 2001 Matsuda
6226534 May 1, 2001 Aizawa
6238291 May 29, 2001 Fujimoto et al.
6239726 May 29, 2001 Saida
6239806 May 29, 2001 Nishiumi et al.
6241611 June 5, 2001 Takeda et al.
6243658 June 5, 2001 Raby
6244987 June 12, 2001 Ohsuga et al.
6245014 June 12, 2001 Brainard, II
6264558 July 24, 2001 Nishiumi et al.
6273819 August 14, 2001 Strauss et al.
6280327 August 28, 2001 Leifer et al.
6287198 September 11, 2001 McCauley
6297751 October 2, 2001 Fadavi-Ardekani
6301534 October 9, 2001 McDermott, Jr. et al.
6304250 October 16, 2001 Yang et al.
6315673 November 13, 2001 Kopera et al.
6323614 November 27, 2001 Palazzolo et al.
6323654 November 27, 2001 Needle et al.
6325718 December 4, 2001 Nishiumi et al.
6331841 December 18, 2001 Tokuhashi et al.
6331856 December 18, 2001 Van Hook et al.
6337954 January 8, 2002 Soshi et al.
6346046 February 12, 2002 Miyamoto et al.
6347998 February 19, 2002 Yoshitomi et al.
6361507 March 26, 2002 Foxlin
D456410 April 30, 2002 Ashida
6369794 April 9, 2002 Sakurai et al.
6375572 April 23, 2002 Masuyama et al.
6377793 April 23, 2002 Jenkins
6377906 April 23, 2002 Rowe
D456854 May 7, 2002 Ashida
6383079 May 7, 2002 Takeda et al.
6392613 May 21, 2002 Goto
6394904 May 28, 2002 Stalker
D458972 June 18, 2002 Ashida
6400480 June 4, 2002 Thomas
6400996 June 4, 2002 Hoffberg et al.
6409687 June 25, 2002 Foxlin
D459727 July 2, 2002 Ashida
D460787 July 23, 2002 Nishikawa
6415223 July 2, 2002 Lin et al.
6421056 July 16, 2002 Nishiumi et al.
6424333 July 23, 2002 Tremblay
6426719 July 30, 2002 Nagareda et al.
6426741 July 30, 2002 Goldsmith et al.
D462683 September 10, 2002 Ashida
6452494 September 17, 2002 Harrison
6456276 September 24, 2002 Park
D464053 October 8, 2002 Zicolello
D464950 October 29, 2002 Fraquelli
6466198 October 15, 2002 Feinstein
6466831 October 15, 2002 Shibata et al.
6473070 October 29, 2002 Mishra et al.
6473713 October 29, 2002 McCall et al.
6474159 November 5, 2002 Foxlin et al.
6484080 November 19, 2002 Breed
6492981 December 10, 2002 Stork et al.
6496122 December 17, 2002 Sampsell
6518952 February 11, 2003 Leiper
6530838 March 11, 2003 Ha
6538675 March 25, 2003 Aratani et al.
D473942 April 29, 2003 Motoki et al.
6540607 April 1, 2003 Mokris et al.
6540611 April 1, 2003 Nagata
6544124 April 8, 2003 Ireland et al.
6544126 April 8, 2003 Sawano et al.
6545661 April 8, 2003 Goschy et al.
6554781 April 29, 2003 Carter et al.
D474763 May 20, 2003 Tozaki et al.
6565444 May 20, 2003 Nagata et al.
6567536 May 20, 2003 McNitt et al.
6572108 June 3, 2003 Bristow
6577350 June 10, 2003 Proehl et al.
6582299 June 24, 2003 Matsuyama et al.
6582380 June 24, 2003 Kazlausky et al.
6585596 July 1, 2003 Leifer et al.
6590536 July 8, 2003 Walton
6591677 July 15, 2003 Rothuff
6597342 July 22, 2003 Haruta
6597443 July 22, 2003 Boman
6599194 July 29, 2003 Smith et al.
6605038 August 12, 2003 Teller et al.
6608563 August 19, 2003 Weston et al.
6609977 August 26, 2003 Shimizu et al.
6616607 September 9, 2003 Hashimoto et al.
6628257 September 30, 2003 Oka et al.
6634949 October 21, 2003 Briggs et al.
6636826 October 21, 2003 Abe et al.
6650029 November 18, 2003 Johnston
6650313 November 18, 2003 Levine et al.
6650345 November 18, 2003 Saito et al.
6654001 November 25, 2003 Su
6672962 January 6, 2004 Ozaki et al.
6676520 January 13, 2004 Nishiumi
6677990 January 13, 2004 Kawahara
6681629 January 27, 2004 Foxlin et al.
6682351 January 27, 2004 Abraham-Fuchs et al.
6684062 January 27, 2004 Gosior et al.
D486145 February 3, 2004 Kaminski et al.
6686954 February 3, 2004 Kitaguchi et al.
6692170 February 17, 2004 Abir
6693622 February 17, 2004 Shahoian et al.
6712692 March 30, 2004 Basson et al.
6717573 April 6, 2004 Shahoian et al.
6718280 April 6, 2004 Hermann
6725173 April 20, 2004 An et al.
D489361 May 4, 2004 Mori et al.
6736009 May 18, 2004 Schwabe
D491924 June 22, 2004 Kaminski et al.
D492285 June 29, 2004 Ombao et al.
6743104 June 1, 2004 Ota et al.
6747632 June 8, 2004 Howard
6747690 June 8, 2004 Mølgaard
6749432 June 15, 2004 French et al.
6752719 June 22, 2004 Himoto et al.
6753849 June 22, 2004 Curran et al.
6753888 June 22, 2004 Kamiwada et al.
6757068 June 29, 2004 Foxlin
6757446 June 29, 2004 Li et al.
6761637 July 13, 2004 Weston et al.
6765553 July 20, 2004 Odamura
D495336 August 31, 2004 Andre et al.
6786877 September 7, 2004 Foxlin
6796177 September 28, 2004 Mori
6811489 November 2, 2004 Shimizu et al.
6811491 November 2, 2004 Levenberg et al.
6812881 November 2, 2004 Mullaly et al.
6813525 November 2, 2004 Reid et al.
6813584 November 2, 2004 Zhou et al.
6816151 November 9, 2004 Dellinger
6821204 November 23, 2004 Aonuma et al.
6821206 November 23, 2004 Ishida et al.
6836705 December 28, 2004 Hellmann et al.
6836751 December 28, 2004 Paxton et al.
6836971 January 4, 2005 Wan
6842991 January 18, 2005 Levi et al.
6850221 February 1, 2005 Tickle
6850844 February 1, 2005 Walters et al.
6852032 February 8, 2005 Ishino
6856327 February 15, 2005 Choi
D502468 March 1, 2005 Knight et al.
6868738 March 22, 2005 Moscrip et al.
6872139 March 29, 2005 Sato et al.
6873406 March 29, 2005 Hines et al.
D503750 April 5, 2005 Kit et al.
D504677 May 3, 2005 Kaminski et al.
D505424 May 24, 2005 Ashida et al.
6897845 May 24, 2005 Ozawa
6897854 May 24, 2005 Cho et al.
6906700 June 14, 2005 Armstrong
6908388 June 21, 2005 Shimizu et al.
6922632 July 26, 2005 Foxlin
6925410 August 2, 2005 Narayanan
6929543 August 16, 2005 Ueshima et al.
6929548 August 16, 2005 Wang
6933861 August 23, 2005 Wang
6933923 August 23, 2005 Feinstein
6954980 October 18, 2005 Song
6955606 October 18, 2005 Taho et al.
6956564 October 18, 2005 Williams
6967566 November 22, 2005 Weston et al.
6982697 January 3, 2006 Wilson et al.
6984208 January 10, 2006 Zheng
6990639 January 24, 2006 Wilson
6993206 January 31, 2006 Ishino
6993451 January 31, 2006 Chang et al.
6995748 February 7, 2006 Gordon et al.
6998966 February 14, 2006 Pedersen et al.
7000469 February 21, 2006 Foxlin et al.
7002591 February 21, 2006 Leather et al.
7031875 April 18, 2006 Ellenby et al.
7066781 June 27, 2006 Weston
D524298 July 4, 2006 Hedderich et al.
7081051 July 25, 2006 Himoto et al.
7090582 August 15, 2006 Danieli et al.
7098891 August 29, 2006 Pryor
7098894 August 29, 2006 Yang et al.
7102616 September 5, 2006 Sleator
7107168 September 12, 2006 Oystol et al.
7113776 September 26, 2006 Minear
D531228 October 31, 2006 Ashida et al.
7115032 October 3, 2006 Cantu et al.
7126584 October 24, 2006 Nishiumi et al.
7127370 October 24, 2006 Kelly et al.
D531585 November 7, 2006 Weitgasser et al.
7133026 November 7, 2006 Horie et al.
7136674 November 14, 2006 Yoshie et al.
7139983 November 21, 2006 Kelts
7140962 November 28, 2006 Okuda et al.
7142191 November 28, 2006 Idesawa et al.
7149627 December 12, 2006 Ockerse et al.
7154475 December 26, 2006 Crew
7155604 December 26, 2006 Kawai
7158118 January 2, 2007 Liberty
7173604 February 6, 2007 Marvit et al.
7176919 February 13, 2007 Drebin et al.
7182691 February 27, 2007 Schena
7183480 February 27, 2007 Nishitani et al.
7184059 February 27, 2007 Fouladi et al.
D543246 May 22, 2007 Ashida et al.
7220220 May 22, 2007 Stubbs et al.
7225101 May 29, 2007 Usuda et al.
7231063 June 12, 2007 Naimark et al.
7233316 June 19, 2007 Smith et al.
7236156 June 26, 2007 Liberty et al.
7239301 July 3, 2007 Liberty et al.
7261690 August 28, 2007 Teller et al.
7262760 August 28, 2007 Liberty
D556201 November 27, 2007 Ashida et al.
7292151 November 6, 2007 Ferguson et al.
7301527 November 27, 2007 Marvit
7301648 November 27, 2007 Foxlin
D556760 December 4, 2007 Ashida et al.
D559847 January 15, 2008 Ashida et al.
D561178 February 5, 2008 Azuma
7335134 February 26, 2008 LaVelle
D563948 March 11, 2008 d'Hore
D567243 April 22, 2008 Ashida et al.
7359121 April 15, 2008 French et al.
RE40324 May 20, 2008 Crawford
7379566 May 27, 2008 Hildreth
7395181 July 1, 2008 Foxlin
7414611 August 19, 2008 Liberty
7445550 November 4, 2008 Barney et al.
7488231 February 10, 2009 Weston
7500917 March 10, 2009 Barney et al.
7510477 March 31, 2009 Argentar
7568289 August 4, 2009 Burlingham et al.
7582016 September 1, 2009 Suzuki
7614958 November 10, 2009 Weston et al.
7663509 February 16, 2010 Shen
7774155 August 10, 2010 Sato et al.
7775882 August 17, 2010 Kawamura et al.
7796116 September 14, 2010 Salsman
7877224 January 25, 2011 Ohta
7905782 March 15, 2011 Sawano et al.
7927216 April 19, 2011 Ikeda et al.
7931535 April 26, 2011 Ikeda et al.
7942245 May 17, 2011 Shimizu et al.
20010008847 July 19, 2001 Miyamoto et al.
20010010514 August 2, 2001 Ishino
20010015123 August 23, 2001 Nishitani et al.
20010024973 September 27, 2001 Meredith
20010031662 October 18, 2001 Larian
20010045938 November 29, 2001 Willner et al.
20010049302 December 6, 2001 Hagiwara
20020024500 February 28, 2002 Howard
20020024675 February 28, 2002 Foxlin
20020028071 March 7, 2002 Mølgaard
20020072418 June 13, 2002 Masuyama et al.
20020075335 June 20, 2002 Rekimoto
20020098887 July 25, 2002 Himoto et al.
20020103026 August 1, 2002 Himoto et al.
20020107069 August 8, 2002 Ishino
20020126026 September 12, 2002 Lee
20020137567 September 26, 2002 Cheng
20020140745 October 3, 2002 Ellenby et al.
20020158843 October 31, 2002 Levine et al.
20020183961 December 5, 2002 French et al.
20030038778 February 27, 2003 Noguera et al.
20030052860 March 20, 2003 Park et al.
20030057808 March 27, 2003 Lee et al.
20030063068 April 3, 2003 Anton et al.
20030069077 April 10, 2003 Korienek
20030083131 May 1, 2003 Armstrong
20030107551 June 12, 2003 Dunker
20030144056 July 31, 2003 Leifer et al.
20030193572 October 16, 2003 Wilson et al.
20030195041 October 16, 2003 McCauley
20030204361 October 30, 2003 Townsend et al.
20030216176 November 20, 2003 Shimizu et al.
20030222851 December 4, 2003 Lai et al.
20040028258 February 12, 2004 Naimark et al.
20040034289 February 19, 2004 Teller et al.
20040048666 March 11, 2004 Bagley
20040070564 April 15, 2004 Dawson
20040075650 April 22, 2004 Paul et al.
20040095317 May 20, 2004 Zhang et al.
20040134341 July 15, 2004 Sandoz et al.
20040140954 July 22, 2004 Faeth
20040143413 July 22, 2004 Oystol et al.
20040147317 July 29, 2004 Ito et al.
20040152515 August 5, 2004 Wegmuller et al.
20040193413 September 30, 2004 Wilson et al.
20040203638 October 14, 2004 Chan
20040204240 October 14, 2004 Barney
20040218104 November 4, 2004 Smith et al.
20040222969 November 11, 2004 Buchenrieder
20040227725 November 18, 2004 Calarco et al.
20040229692 November 18, 2004 Breving
20040229693 November 18, 2004 Lind et al.
20040239626 December 2, 2004 Noguera
20040252109 December 16, 2004 Trent et al.
20040254020 December 16, 2004 Dragusin
20040259651 December 23, 2004 Storek
20040268393 December 30, 2004 Hunleth et al.
20050017454 January 27, 2005 Endo et al.
20050020369 January 27, 2005 Davis et al.
20050032582 February 10, 2005 Mahajan
20050047621 March 3, 2005 Cranfill
20050054457 March 10, 2005 Eyestone et al.
20050070359 March 31, 2005 Rodriquez et al.
20050076161 April 7, 2005 Albanna et al.
20050085298 April 21, 2005 Woolston
20050107160 May 19, 2005 Cheng et al.
20050125826 June 9, 2005 Hunleth et al.
20050130739 June 16, 2005 Argentar
20050134555 June 23, 2005 Liao
20050143173 June 30, 2005 Barney et al.
20050170889 August 4, 2005 Lum et al.
20050172734 August 11, 2005 Alsio
20050174324 August 11, 2005 Liberty et al.
20050176485 August 11, 2005 Ueshima
20050179644 August 18, 2005 Alsio
20050210419 September 22, 2005 Kela
20050212749 September 29, 2005 Marvit
20050212750 September 29, 2005 Marvit
20050212751 September 29, 2005 Marvit
20050212752 September 29, 2005 Marvit
20050212753 September 29, 2005 Marvit
20050212754 September 29, 2005 Marvit
20050212755 September 29, 2005 Marvit
20050212756 September 29, 2005 Marvit
20050212757 September 29, 2005 Marvit
20050212758 September 29, 2005 Marvit
20050212759 September 29, 2005 Marvit
20050212760 September 29, 2005 Marvit
20050212764 September 29, 2005 Toba
20050212767 September 29, 2005 Marvit et al.
20050215295 September 29, 2005 Arneson
20050215322 September 29, 2005 Himoto et al.
20050217525 October 6, 2005 McClure
20050233808 October 20, 2005 Himoto et al.
20050239548 October 27, 2005 Ueshima et al.
20050243061 November 3, 2005 Liberty et al.
20050243062 November 3, 2005 Liberty
20050253806 November 17, 2005 Liberty et al.
20050256675 November 17, 2005 Kurata
20060028446 February 9, 2006 Liberty et al.
20060030385 February 9, 2006 Barney et al.
20060046849 March 2, 2006 Kovacs
20060052109 March 9, 2006 Ashman et al.
20060092133 May 4, 2006 Touma et al.
20060094502 May 4, 2006 Katayama et al.
20060122474 June 8, 2006 Teller et al.
20060123146 June 8, 2006 Wu et al.
20060146021 July 6, 2006 Voto et al.
20060148563 July 6, 2006 Yang
20060152487 July 13, 2006 Grunnet-Jepsen et al.
20060152488 July 13, 2006 Salsman et al.
20060152489 July 13, 2006 Sweetser et al.
20060154726 July 13, 2006 Weston et al.
20060178212 August 10, 2006 Penzias
20060205507 September 14, 2006 Ho
20060231794 October 19, 2006 Sakaguchi et al.
20060252477 November 9, 2006 Zalewski et al.
20060256081 November 16, 2006 Zalewski et al.
20060258452 November 16, 2006 Hsu
20060264258 November 23, 2006 Zalewski et al.
20060264260 November 23, 2006 Zalewski et al.
20060282873 December 14, 2006 Zalewski et al.
20060287086 December 21, 2006 Zalewski et al.
20060287087 December 21, 2006 Zalewski et al.
20070015588 January 18, 2007 Matsumoto et al.
20070021208 January 25, 2007 Mao et al.
20070049374 March 1, 2007 Ikeda et al.
20070050597 March 1, 2007 Ikeda et al.
20070052177 March 8, 2007 Ikeda et al.
20070060391 March 15, 2007 Ikeda et al.
20070066394 March 22, 2007 Ikeda et al.
20070066396 March 22, 2007 Weston et al.
20070072680 March 29, 2007 Ikeda et al.
20070091084 April 26, 2007 Ueshima et al.
20070093291 April 26, 2007 Hulvey
20070159362 July 12, 2007 Shen
20070173705 July 26, 2007 Teller et al.
20070252815 November 1, 2007 Kuo et al.
20070265075 November 15, 2007 Zalewski
20070265076 November 15, 2007 Lin et al.
20070265088 November 15, 2007 Nakada et al.
20080014835 January 17, 2008 Weston et al.
20080015017 January 17, 2008 Ashida et al.
20080039202 February 14, 2008 Sawano et al.
20080121782 May 29, 2008 Hotelling et al.
20080273011 November 6, 2008 Lin
20080278445 November 13, 2008 Sweetser et al.
20080280660 November 13, 2008 Ueshima et al.
20090005166 January 1, 2009 Sato
20090051653 February 26, 2009 Barney et al.
20090124165 May 14, 2009 Weston
20090156309 June 18, 2009 Weston et al.
Foreign Patent Documents
1338961 March 2002 CN
1559644 January 2005 CN
3930581 March 1991 DE
19701344 July 1997 DE
19701374 July 1997 DE
19648487 June 1998 DE
19814254 October 1998 DE
19937307 February 2000 DE
10029173 January 2002 DE
10241392 May 2003 DE
10219198 November 2003 DE
1 524 334 March 1977 EP
0 835 676 April 1998 EP
0 848 226 June 1998 EP
0 852 961 July 1998 EP
1 062 994 December 2000 EP
1 279 425 January 2003 EP
1 293 237 March 2003 EP
0993845 December 2005 EP
1524334 September 1978 GB
2 244 546 May 1990 GB
2284478 June 1995 GB
2307133 May 1997 GB
2316482 February 1998 GB
2319374 May 1998 GB
60-077231 May 1985 JP
62-14527 January 1987 JP
03-74434 July 1991 JP
03-08103 August 1991 JP
3-059619 November 1991 JP
04-287888 October 1992 JP
5-056191 July 1993 JP
2-901476 December 1993 JP
6-507758 February 1994 JP
3-262677 May 1994 JP
6-154422 June 1994 JP
03-000028 July 1994 JP
6-190144 July 1994 JP
6-198075 July 1994 JP
3-194841 October 1994 JP
06-77387 October 1994 JP
3-273531 November 1994 JP
6-308879 November 1994 JP
3-228845 January 1995 JP
7-28591 January 1995 JP
7-44315 February 1995 JP
7044315 February 1995 JP
7-107573 April 1995 JP
07-22312 May 1995 JP
7-115690 May 1995 JP
7-146123 June 1995 JP
517482 June 1995 JP
7-200142 August 1995 JP
07-262797 October 1995 JP
7-302148 November 1995 JP
07-318332 December 1995 JP
8-071252 March 1996 JP
8-095704 April 1996 JP
8-106352 April 1996 JP
08-111144 April 1996 JP
11-114223 April 1996 JP
8-114415 May 1996 JP
8-122070 May 1996 JP
8-152959 June 1996 JP
8-211993 August 1996 JP
08-221187 August 1996 JP
8-305355 November 1996 JP
83-35136 December 1996 JP
9-230997 September 1997 JP
9-274534 October 1997 JP
09-319510 December 1997 JP
10-021000 January 1998 JP
10-033831 February 1998 JP
10-99542 April 1998 JP
10-154038 June 1998 JP
10-254614 September 1998 JP
11-099284 April 1999 JP
11-506857 June 1999 JP
2000-270237 September 2000 JP
2000-308756 November 2000 JP
2001-038052 February 2001 JP
30-78268 April 2001 JP
2001-104643 April 2001 JP
03-080103 June 2001 JP
2001-175412 June 2001 JP
2001-251324 September 2001 JP
2001-306245 November 2001 JP
2002-062981 February 2002 JP
2002-082751 March 2002 JP
2002-091692 March 2002 JP
2002-153673 May 2002 JP
2002-202843 July 2002 JP
2002-224444 August 2002 JP
2002-232549 August 2002 JP
2002-233665 August 2002 JP
2002-298145 October 2002 JP
2003-053038 February 2003 JP
34-22383 April 2003 JP
2003-208263 July 2003 JP
2003208260 July 2003 JP
2003-236246 August 2003 JP
2003-325974 November 2003 JP
2004-062774 February 2004 JP
2004-313429 November 2004 JP
2004-313492 November 2004 JP
2005-21458 January 2005 JP
2005-040493 February 2005 JP
2005-063230 March 2005 JP
2003-140823 April 2006 JP
2006-113019 April 2006 JP
2002-136694 June 2006 JP
2006-136694 June 2006 JP
2006-216569 April 2007 JP
2007-083024 April 2007 JP
2007-283134 November 2007 JP
9300171 August 1994 NL
2125853 February 1999 RU
2126161 February 1999 RU
2141738 November 1999 RU
94/02931 February 1994 WO
96/05766 February 1996 WO
97/09101 March 1997 WO
97/12337 April 1997 WO
97/17598 May 1997 WO
97/28864 August 1997 WO
97/32641 September 1997 WO
98/11528 March 1998 WO
99/58214 November 1999 WO
00/33168 June 2000 WO
00/35345 June 2000 WO
00/47108 August 2000 WO
00/63874 October 2000 WO
01/87426 November 2001 WO
01/91042 November 2001 WO
02/17054 February 2002 WO
02/34345 May 2002 WO
03/015005 February 2003 WO
03/107260 June 2003 WO
03/088147 October 2003 WO
2004/039055 May 2004 WO
2004/051391 June 2004 WO
Other references
  • Office Action in related U.S. Appl. No. 14/694,783 dated Sep. 21, 2015.
  • European Examination Report issued in EP Application No. 10176870.3 on Aug. 9, 2011.
  • You et al., Fusion of Vision and Gyro Tracking for Robust Augmented Reality Registration, Proceedings of the Virtual Reality 2001 Conference, 2001, pp. 1-8.
  • Office Action issued in U.S. Appl. No. 12/285,812 on Nov. 9, 2011.
  • English Abstract for Japanese Patent No. JP10021000, published Jan. 23, 1998.
  • English Abstract for Japanese Patent No. JP11053994, published Feb. 26, 1999.
  • English Abstract for Japanese Patent No. JP11099284, published Apr. 13, 1999.
  • English Abstract for Japanese Patent No. JP2001038052, published Feb. 13, 2001.
  • English Abstract for Japanese Patent No. JP2002224444, published Aug. 13, 2002.
  • English Abstract for Japanese Patent No. JP2006136694, published Jun. 1, 2006.
  • English Abstract for Japanese Patent No. WO9732641, published Sep. 12, 1997.
  • Acar, “Robust Micromachined Vibratory Gyroscopes” Dissertation (Dec. 2004).
  • Acar, et al., “Experimental evaluation and comparative analysis of commercial variable-capacitance MEMS accelerometers,” Journal of Micromechanics and Microengineering, vol. 13 (1), pp. 634-645 (May 2003).
  • Achenbach, “Golf's New Measuring Stick,” Golfweek, Jun. 11, 2005, 1 page.
  • Act Labs: Miacomet Background, 1 page, May 1999, http://www.act-labs.com/realfeelbackground/htm.
  • AirPad Controller Manual (AirPad Corp. 2000).
  • Airpad Motion Reflex Controller for Sony Playstation - Physical Product (AirPad Corp. 2000).
  • Algrain, “Estimation of 3-D Angular Motion Using Gyroscopes and Linear Accelerometers,” IEEE Transactions on Aerospace and Electronics Systems, vol. 27, No. 6, pp. 910-920 (Nov. 1991).
  • Algrain, et al., “Accelerometer Based Line-of-Sight Stabilization Approach for Pointing and Tracking System,” Second IEEE Conference on Control Applications, vol. 1, Issue 13-16, pp. 159-163 (Sep. 1993).
  • Algrain, et al., “Interlaced Kalman Filtering of 3-D Angular Motion Based on Euler's Nonlinear Equations,” vol. 30, No. 1 (Jan. 1994).
  • Allen, et al., “A General Method for Comparing the Expected Performance of Tracking and Motion Capture Systems,” {VRST} '05: Proceedings of the ACM symposium on Virtual reality software and technology, pp. 201-210 (Nov. 2005).
  • Allen, et al., “Tracking Beyond 15 minutes of Thought,” SIGGRAPH 2001 Course 11 (Course Pack) from Computer Graphics (2001).
  • Alves, “Extended Kalman filtering applied to a full accelerometer strapdown inertial measurement unit,” M.S. Thesis Massachusetts Institute of Technology. Dept. of Aeronautics and Astronautics, Santiago (1992).
  • Analog Devices Data Sheet, “MicroConverter®, Multichannel 12-Bit ADC with Embedded Flash MCU, ADuC812” (2013) (http://www.analog.com/static/imported-files/datasheets/ADUC812.pdf) 60 pages.
  • Analog Devices “ADXL202E Low-Cost ±2 g Dual-Axis Accelerometers with Duty Cycle Output” (Data Sheet), Rev. A (2000).
  • Analog Devices “ADXL330 Small, Low Power, 3-Axis±2 g iMEMS Accelerometer” (Data Sheet), Rev. PrA (2005).
  • Analog Devices “ADXL50 Single Axis Accelerometer” (Data Sheet), http://www.analog.com/en/obsolete/adxl50/products/product.html (Mar. 1996).
  • Analog Devices “ADXL50 Monolithic Accelerometer with Signal Conditioning” Datasheet (1996).
  • Analog Devices “ADXRS150 ±150°/s Single Chip Yaw Rate Gyro with Signal Conditioning” (Data Sheet), Rev. B (2004).
  • Analog Devices “ADXRS401 ±75°/s Single Chip Yaw Rate Gyro with Signal Conditioning” (Data Sheet), Rev. O (2004).
  • Ang, et al., “Design and Implementation of Active Error Cancelling in Hand-held Microsurgical Instrument,” Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 2, (Oct. 2001).
  • Ang, et al., “Design of All-Accelerometer Inertial Measurement Unit for Tremor Sensing in Hand-held Microsurgical Instrument,” Proceedings of the 2003 IEEE International Conference on Robotics & Automation (Sep. 2003).
  • Apostolyuk, Vladislav, “Theory and design of micromechanical vibratory gyroscopes,” MEMS/NEMS Handbook, Springer, 2006, vol. 1, pp. 173-195 (2006).
  • Arcanatech, “IMP User's Guide” (1994).
  • Arcanatech, IMP (Photos) (1994).
  • Ascension Technology, The Bird 6D Input Devices (specification) (1998).
  • “ASCII Grip One Handed Controller,” One Switch—ASCII Grip One Handed Playstation Controller, http://www.oneswitch.org.uk/1/ascii/grip.htm, Jul. 11, 2008, pp. 1-2.
  • “ASCII Grip” One-Handed Controller: The Ultimate One-Handed Controller Designed for the Playstation Game Console (ASCII Entertainment 1997).
  • “ASCII/Sammy Grip V2,” One Switch-Accessible Gaming Shop—ASCII Grip V2, http://www.oneswitch.org.uk/1/AGS/AGS-onehand/ascii-grip-v2.html, Jul. 10, 2008, pp. 1-2.
  • ASCII, picture of one-handed controller, 2 pages (Feb. 6, 2006).
  • Ashida et al., entitled “Game Controller,” U.S. Appl. No. 11/790,780, filed Apr. 27, 2007, pending.
  • “At-home fishin!” 1 page, Dec. 1996-1999.
  • Ator, “Image-Velocity with Parallel-slit Reticles,” Journal of the Optical Society of America (Dec. 1963).
  • Azarbayejani, et al., “Real-Time 3-D Tracking of the Human Body,” Proceedings of IMAGE'COM 96 (1996).
  • Azarbayejani, et al., “Visually Controlled Graphics,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15, No. 6, pp. 602-605 (Jun. 1993).
  • Azuma et al., “Improving Static and Dynamic Registration in an Optical See-Through HMD,” International Conference on Computer Graphics and Interactive Techniques Proceedings of the 21st annual conference on computer graphics and interactive techniques, pp. 197-204 (1994).
  • Azuma et al., “Making Augmented Reality Work Outdoors Requires Hybrid Tracking,” Proceedings of the International Workshop on Augmented Reality, San Francisco, CA, Nov. 1, 1998, Bellevue, Washington, pp. 219-224 (1999).
  • Azuma, “Predictive Tracking for Augmented Reality,” Ph.D. Dissertation, University of North Carolina at Chapel Hill (1995).
  • Azuma, et al., “A Frequency-Domain Analysis of Head-Motion Prediction,” Proceedings of SIGGRAPH '94, pp. 401-408 (1995).
  • Azuma, et al., “A motion-stabilized outdoor augmented reality system.” Proceedings of IEEE Virtual Reality '99, Houston, TX (Mar. 1999).
  • Bachmann et al., “Inertial and Magnetic Posture Tracking for Inserting Humans into Networked Virtual Environments,” Virtual Reality Software and Technology archive, Proceedings of the ACM Symposium on Virtual Reality Software and Technology, Baniff, Alberta, Canada, pp. 9-16 (2001).
  • Bachmann et al., “Orientation Tracking for Humans and Robots Using Inertial Sensors” (CIRA '99), Naval Postgraduate School, Monterey, CA (1999).
  • Bachmann, “Inertial and Magnetic Angle Tracking of Limb Segments for Inserting Humans into Synthetic Environments,” Dissertation, Naval Postgraduate School, Monterey, CA (Dec. 2000).
  • Baker et al., “Active Multimodal Control of a Floppy Telescope Structure,” Proc. SPIE, vol. 4825, 74 (Mar. 2003).
  • Balakrishnan, “The Rockin' Mouse: Integral 3D Manipulation on a Plane,” CHI '97 , Univ. Toronto, (1997).
  • Ballagas, et al., “iStuff: A Physical User Interface Toolkit for Ubiquitous Computing Environments,” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, vol. 5, No. 1, at 537-544 (ACM) (Apr. 5-10, 2003).
  • Baraff, “An Introduction to Physically Based Modeling,” SIGGRAPH 97 Course Notes (1997).
  • Bass Fishing “Legends of the Lake”, Radica 2 pages, 2002.
  • Baudisch, et al., “Soap: a pointing device that works in mid-air” Proc. UIST (2006).
  • BBN Report, “Virtual Environment Technology for Training (VETT),” The Virtual Environment and Teleoperator Research Consortium (VETREC) (Mar. 1992).
  • Behringer, “Improving Registration Precision Through Visual Horizon Silhouette Matching,” Proceedings of the international workshop on Augmented reality: placing artificial objects in real scenes, Bellevue, Washington, United States, pp. 225-232 (1999).
  • Behringer, “Registration for Outdoor Augmented Reality Applications Using Computer Vision Techniques and Hybrid Sensors,” Virtual Reality, 1999 Proceedings, IEEE Computer Society, pp. 244-261 (1999).
  • Bei, “BEI Gyrochip™ QRS11 Data Sheet,” BEI Systron Donner Inertial Division, BEI Technologies, Inc., (Sep. 1998).
  • Benbasat, “An Inertial Measurement Unit for User Interfaces,” Massachusetts Institute of Technology Dissertation, (Sep. 2000).
  • Benbasat, et al., “An Inertial Measurement Framework for Gesture Recognition and Applications,” Gesture and Sign Language in Human-Computer Interaction, International Gesture Workshop, GW 2001, London UK, 2001 Proceedings, LNAI 2298, at 9-20, I. Wachsmuth and T. Sowa (eds.), Springer-Verlag Berlin Heidelberg (2001, 2002).
  • Beuter, A., Publications University of Quebec at Montreal, http://www.er.uqam.ca/nobel/r11040/publicat.htm (Aug. 2007).
  • BGM-109 Tomahawk, http://en.wikipedia.org/wiki/BGM-109Tomahawk, Wikipedia, Jan. 2009.
  • Bhatnagar, “Position trackers for Head Mounted Display systems: A survey” (Technical Report), University of North Carolina at Chapel Hill (Mar. 1993).
  • Bianchi, “A Tailless Mouse: New Cordless Computer Mouse Invented by ArcanaTech,” Inc. Article (Jun. 1992).
  • Bishop, “The Self-Tracker: A Smart Optical Sensor on Silicon,” Ph.D. Dissertation, Univ. of North Carolina at Chapel Hill (1984).
  • Bishop, et al., “Grids Progress Meeting” (Slides), University of North Carolina at Chapel Hill, NC (1998).
  • Bishop, et al., Self-Tracker: Tracking for Hybrid Environments without Infrastructure (1996).
  • Bloomberg: Nintendo Announces Wireless GBA Link, Sep. 2003, 2 pages.
  • Bona, et al., “Optimum Reset of Ship's Inertial Navigation System,” IEEE Transactions on Aerospace and Electronics Systems (1965).
  • Borenstein, et al., “Where am I? Sensors and Methods for Mobile Robot Positioning” (1996).
  • Boser, “3-Axis Accelerometer with Differential Sense Electronics,” http://www.eecs.berkeley.edu/˜boser/pdf/3axis.pdf (1997).
  • Boser, “Accelerometer Design Example: Analog Devices XL-05/5,” http://www.eecs.berkeley.edu/˜boser/pdf/x105.pdf (1996).
  • Bowman et al., 3D User Interfaces: Theory and Practice, Addison-Wesley, Inc., (2005).
  • Bowman,. et al., “An Introduction to 3-D User Interface Design,” MIT Presence, vol. 10, No. 1 pp. 96-108 (2001).
  • Briefs (New & Improved) (Brief Article), PC Magazine, Oct. 26, 1993.
  • Britton et al., “Making Nested rotations Convenient for the User,” ACM SIGGRAPH Computer Graphics, vol. 12, Issue 3, pp. 222-227 (Aug. 1978).
  • Britton, “A Methodology for the Ergonomic Design of Interactive Computer Graphic Systems, and its Application to Crystallography” (UNC Thesis) (1997).
  • Brownell, Richard: Review of Peripheral-GameCube-G3 Wireless Controller, GAF, Jul. 17, 2003, 2 pages.
  • Buchanan, Levi: “Happy Birthday, Rumble Pak,” IGN.com, Apr. 3, 2008, 2 pages.
  • Business Wire, “Feature/Virtual reality glasses that interface to Sega channel,” Time Warner, TCI: project announced concurrent with COMDEX (Nov. 1994).
  • Business Wire, “Free-space ‘Tilt’ Game Controller for Sony Playstation Uses Scenix Chip; SX Series IC Processes Spatial Data in Real Time for On-Screen” (Dec. 1999).
  • Business Wire, “InterSense Inc. Launches InertiaCube2—The World's Smallest Precision Orientation Sensor With Serial Interface” (Aug. 14, 2001).
  • Business Wire, “Logitech Magellan 3D Controller,” Logitech (Apr. 1997).
  • Business Wire, “Mind Path Introduces Gyropoint RF Wireless Remote” (Jan. 2000).
  • Business Wire, “Pegasus' Wireless PenCell Writes on Thin Air with ART's Handwriting Recognition Solutions,” Business Editors/High Tech Writers Telecom Israel 2000 Hall 29, Booth 19-20 (Nov. 2000).
  • Business Wire, “RPI ships low-cost HMD Plus 3D Mouse and VR PC graphics card system for CES” (Jan. 1995).
  • Buxton, Bill, “Human input/output devices,” In M. Katz (ed.), Technology Forecast: 1995, Menlo Park, C.A.: Price Waterhouse World Firm Technology Center, 49-65 (1994).
  • Buxton, Bill, A Directory of Sources for Input Technologies, http://www.billbuxton.com/InputSources.html, Apr. 2001 (last update 2008).
  • Buxton et al., “A Study in Two-Handed Input,” ACM CHI '86 Proceedings (1986).
  • Byte, “Imp Coexists With Your Mouse,” What's New, Arcana Tec (Jan. 1994).
  • Canaday, R67-26 “The Lincoln Wand,” IEEE Transactions on Electronic Computers, vol. EC-16, No. 2, p. 240 (Apr. 1967).
  • Caruso et al., “New Perspective on Magnetic Field Sensing,” Sensors Magazine (Dec. 1998).
  • Caruso et al., “Vehicle Detection and Compass Applications using AMR Magnetic Sensors,” Honeywell (May 1999).
  • Caruso, “Application of Magnetoresistive Sensors in Navigation Systems,” Sensors and Actuators, SAE SP-1220, pp. 15-21 (Feb. 1997).
  • Caruso, “Applications of Magnetic Sensors for Low Cost Compass Systems,” Honeywell, SSEC, http://www.ssec.honeywell.com/magnetic/datasheets/lowcost.pdf (May 1999).
  • Chatfield, “Fundamentals of High Accuracy Inertial Navigation,” vol. 174 Progress in Astronautics and Aeronautics, American Institute of Aeronautics and Astronautics, Inc. (1997).
  • Cheng, “Direct interaction with large-scale display systems using infrared laser tracking devices,” ACM International Conference Proceeding Series; vol. 142 (2003).
  • Cho, et al., “Magic Wand: A Hand-Drawn Gesture Input Device in 3-D Space with Inertial Sensors,” Proceedings of the 9th Intl Workshop on Frontiers in Handwriting Recognition (IWFHR-9 2004), IEEE (2004).
  • CNET News.com, http://news.com.com/2300-10433-6070295-2.html?tag=ne.gall.pg, “Nintendo Wii Swings Into Action,” May 25, 2006, 1 pg.
  • “Coleco Vision: Super Action™ Controller Set,” www.vintagecomputing.com/wp-content/images/retroscan/colecosac1large.jpg. (Sep. 2006).
  • Computer Mouse (Wikipedia) (Jul. 5, 2005).
  • “Controllers-Atari Space Age Joystick,” Atari Age: Have You Played Atari Today? www.atariage.com/controllerpage.html?SystemID=2600&ControllerID=12. (Sep. 2006).
  • “Controllers-Booster Grip,” AtariAge: Have You Played Atari Today? www.atariage.com/controllerpage.html?SystemID=2600&ControllerID=18. (Sep. 2006).
  • Computergram, “RPI Entertainment Pods Improve Virtual Experience” (1995).
  • Cooke, et al., “NPSNET: flight simulation dynamic modeling using quaternions,” Presence, vol. 1, No. 4, pp. 404-420, MIT Press (1992/1994).
  • Crossan, A. et al.: A General Purpose Control-Based Trajectory Playback for Force-Feedback Systems, University of Glasgow, Dept. Computing Science, 4 pages (Feb. 2008).
  • CSIDC Winners—Tablet-PC Classroom System Wins Design Competition, IEEE Computer Society Press, vol. 36, Issue 8, pp. 15-18, IEEE Computer Society (Aug. 2003).
  • Cutrone, “Hot products: Gyration GyroPoint Desk, GyroPoint Pro gyroscope-controlled wired and wireless mice” (Computer Reseller News) (Dec. 1995).
  • Cutts, “A Hybrid Image/Inertial System for Wide-Area Tracking” (Internal to UNC-CH Computer Science) (Jun. 1999).
  • Cyberglove/Cyberforce, Immersion, Cyberforce CyberGlove Systems “Immersion Ships New Wireless CyberGlove(R) II Hand Motion-Capture Glove; Animators, Designers, and Researchers Gain Enhanced Efficiency and Realism for Animation, Digital Prototyping and Virtual Reality Projects,” Business Wire, Dec. 7, 2005.
  • Deruyck, et al., “An Electromagnetic Position Sensor,” Polhemus Navigation Sciences, Inc., Burlington, VT (Nov. 1973).
  • Dichtburn, “Camera in Direct3D,” Toymaker, Mar. 5, 2005, 5 pages, http://web.archive.org/web/20050206032104/http://toymaker.info/games/html/camera.html.
  • Donelson, et al., “Spatial Management of Information” (1978).
  • Eißele, “Orientation as an Additional User Interface in Mixed-Reality Environments,” 1. Workshop Erweiterte und Virtuelle Realität, pp. 79-90, GI-Fachgruppe AR/VR (2007).
  • Electro-Plankton Weblog, http://www.tranism.com/weblog/2005/09/, “This is the Revolution, Nintendo Style,” Sep. 15, 2005, 2 pgs.
  • “Electronic Plastic: BANDAI—Power Fishing”, “Power Fishing Company: BANDAI”, 1984, 1 page, http://www.handhelden.com/Bandai/PowerFishing.html.
  • Emura, et al., “Sensor Fusion Based Measurement of Human Head Motion,” 3rd IEEE International Workshop on Robot and Human Communication (Jul. 1994).
  • Ewalt, David M., “Nintendo's Wii is a Revolution,” Review, Forbes.com (Nov. 13, 2006).
  • Fielder, Lauren: “E3 2001: Nintendo unleashes GameCube software, a new Miyamoto game, and more,” GameSpot, May 16, 2001, 2 pages, http://www.gamespot.com/downloads/2761390.
  • Ferrin, “Survey of Helmet Tracking Technologies,” Proc. SPIE vol. 1456, p. 86-94 (Apr. 1991).
  • Fishing Games: The Evolution of Virtual Fishing Games and related Video Games/Computer Games , 15 pages, 2003.
  • Foley et al., “Computer Graphics: Principles and Practice,” Second Edition, 1990.
  • Foremski, T. “Remote Control Mouse Aims at Interactive TV”, Electronics Weekly, Mar. 9, 1994.
  • Foxlin et al., “An Inertial Head-Orientation Tracker with Automatic Drift Compensation for Use with HMD's,” Proceedings of the conference on Virtual reality software and technology, Singapore, Singapore, pp. 159-173 (1994).
  • Foxlin et al., “Miniature 6-DOF Inertial System for Tracking HMDs,” SPIE vol. 3362 (Apr. 1998).
  • Foxlin et al., “Miniaturization, Calibration & Accuracy Evaluation of a Hybrid Self-Tracker,” The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 151-160 (2003).
  • Foxlin et al., “WearTrack: A Self-Referenced Head and Hand Tracker for Wearable Computers and Portable VR,” International Symposium on Wearable Computers (ISWC 2000), Oct. 16-18, 2000, Atlanta, GA.
  • Foxlin, “FlightTracker: A Novel Optical/Inertial Tracker for Cockpit Enhanced Vision, Symposium on Mixed and Augmented Reality,” Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality, pp. 212-221 (Nov. 2004).
  • Foxlin, “Generalized architecture for simultaneous localization, auto-calibration, and map-building,” IEEE/RSJ Conf. on Intelligent Robots and Systems, Lausanne, Switzerland (Oct. 2002).
  • Foxlin, “Head-tracking Relative to a Moving Vehicle or Simulator Platform Using Differential Inertial Sensors,” InterSense, Inc., Presented: Helmet and Head-Mounted Displays V, SPIE vol. 4021, AeroSense Symposium, Orlando FL, Apr. 24-25, 2000.
  • Foxlin, “Inertial Head Tracker Sensor Fusion by a Complementary Separate-bias Kalman Filter,” Proceedings of the IEEE 1996 Virtual Reality Annual International Symposium, pp. 185-194, 267 (1996).
  • Foxlin, “Inertial Head-Tracking,” MS Thesis, Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science (Sep. 1993).
  • Foxlin, “Motion Tracking Requirements and Technologies,” Chapter 7, from Handbook of Virtual Environment Technology, Stanney Kay, Ed. (2002).
  • Foxlin, “Pedestrian Tracking with Shoe-Mounted Inertial Sensors,” IEEE Computer Graphics and Applications, vol. 25, No. 6, pp. 38-46 (Nov. 2005).
  • Foxlin, et al., “Constellation: A Wide-Range Wireless Motion-Tracking System for Augmented Reality and Virtual Set Applications,” ACM SIGGRAPH, pp. 372-378 (1998).
  • Foxlin, et al., “VIS-Tracker: A Wearable Vision-Inertial Self-Tracker,” IEEE Computer Society (2003).
  • Frankie, “E3 2002: Roll O Rama,” IGN: Roll-o-Rama Preview, 3 pages, E3 Demo of Kirby game (“Roll O Rama”), http://cube.ign.com/objects/482/482164.html (May 23, 2002).
  • Freiburg Center for Data Analysis and Modeling—Publications, http://www.fdm.uni-freiburg.de/cms/puplications/publications/ (Aug. 2007).
  • Friedmann, et al., “Device Synchronization Using an Optimal Linear Filter,” SI3D '92: Proceedings of the 1992 symposium on Interactive 3D graphics, pp. 57-62 (1992).
  • Friedmann, et al., “Synchronization in virtual realities,” MIT Presence, vol. 1, No. 1, pp. 139-144 (1992).
  • Fröhlich, “The Yo Yo: An interaction device combining elastic and isotonic control,” at http://www.uni-weimar.de/cms/medien/vr/research/hci/3d-handheld-interaction/the-yoyo-a-handheld-device-combining-elastic-and-isotonic-input.html (2003).
  • FrontSide Field Test, “Get This!”, Golf Magazine, Jun. 2005, p. 36.
  • Fuchs, “Inertial Head-Tracking,” Massachusetts Institute of Technology, Sep. 1993.
  • Furniss, Maureen, “Motion Capture,” MoCap MIT (Dec. 1999) 12 pages.
  • “Game Controller” Wikipedia, Aug. 2010, 8 pages, http://en.wikipedia.org/w/index.php?title=Gamecontroller&oldid=21390758.
  • “Game Controller” Wikipedia, Jan. 5, 2005.
  • GameCubicle, Jim—New Contributor, Nintendo WaveBird Control, http://www.gamecubicle.com/news-nintendogamecubewavebirdcontroller.htm, May 14, 2002.
  • Geen et al.: “MEMS Angular Rate-Sensing Gyroscope” pp. 1-3 (2003).
  • Gelmis, J.: “Ready to Play, The Future Way”, Jul. 23, 1996, Buffalo News.
  • “Get Bass”, Videogame by Sega, The International Arcade Museum and the KLOV, 1998, 4 pages.
  • “Glove-based input interfaces”, Cyberglove/Cyberforce, Jun. 1991, 12 pages http://www.angelfire.com/ca7/mellott124/glove1.htm.
  • Goschy, “Midway Velocity Controller” (youtube video http://www.youtube.com/watch?v=wjLhSrSxFNw) (Sep. 8, 2007).
  • Grewal et al., “Global Positioning Systems, Inertial Navigation and Integration,” 2001.
  • Grimm et al., “Real-Time Hybrid Pose Estimation from Vision and Inertial Data,” Proceedings, First Canadian Conference on Computer and Robot Vision, pp. 480-486 (2004).
  • Gyration, Inc., GyroRemote and Mobile RF Keyboard User Manual, Saratoga, CA 24 pages, www.theater.stevejenkins.com/docs/GyrationKeyboardManual (Mar. 9, 2011).
  • Gyration, Inc. GyroRemote GP240-01 Professional Series, copyrighted 2003, www.gyration.com.
  • Gyration Ultra Cordless Optical Mouse, Setting Up Ultra Mouse, Gyration Quick Start Card part No. DL00071-0001 Rev. A. Gyration, Inc. (Jun. 2003).
  • Gyration Ultra Cordless Optical Mouse, User Manual, 1-15, Gyration, Inc. Saratoga, CA (2003).
  • Gyration, “Gyration GP110 Ultra Cordless Optical Mouse Data Sheet,” http://www.gyration.com/descriptions/document/GP110-SPEC-EN.pdf (2002).
  • Gyration, “Gyration GP110 Ultra Cordless Optical Mouse User Manual,” http://www.gyration.com/descriptions/document/GP110-MANUAL-EN.pdf (2002).
  • Gyration, “Gyration MicroGyro 100 Developer Kit Data Sheet,” http://web.archive.org/web/19980708122611/www.gyration.com/html/devkit.html (Jul. 1998).
  • Gyration, “Gyration Ultra Cordless Optical Mouse,” photos (2002).
  • Hamilton Institute, http://www.dcs.gla.ac.uk/~rod/, R. Murray-Smith (Aug. 2007).
  • Harada, et al., “Portable Absolute Orientation Estimation Device with Wireless Network under Accelerated Situation” Proceedings, 2004 IEEE International Conference on Robotics and Automation, vol. 2, Issue, Apr. 26-May 1, 2004, pp. 1412-1417 vol. 2 (Apr. 2004).
  • Harada, et al., “Portable orientation estimation device based on accelerometers, magnetometers and gyroscope sensors for sensor network,” Proceedings of IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, MFI2003, pp. 191-196 (Jul. 2003).
  • Hartley, Matt, “Why is the Nintendo Wii So Successful?”, Smarthouse—The Lifestyle Technology Guide Website (Sep. 12, 2007).
  • Haykin, et al., “Adaptive Tracking of Linear Time-Variant Systems by Extended RLS Algorithms,” IEEE Transactions on Signal Processing, vol. 45, No. 5 (May 1997).
  • Heath, “Virtual Reality Resource Guide,” AI Expert, v9 n5 p. 32(14) (May 1994).
  • Hinckley, Ken, “Haptic Issues for Virtual Manipulation,” Thesis (Dec. 1996).
  • Hinckley, “Synchronous Gestures for Multiple Persons and Computer”, CHI Letters vol. 5 No. 2 (ACM 2003) & Proceedings of the 16th Annual ACM UIST 2003 Symposium on User Interface Software & Technology, at 149-58 (UIST '03 Vancouver BC Canada) (ACM) (Nov. 2003).
  • Hinckley, et al., “Sensing Techniques for Mobile Interaction,” Proceedings of the 13th Annual ACM Symposium on User Interface Software and Technology (San Diego, Cal.), ACM UIST 2000 & Technology, CHI Letters 2 (2), at 91-100 (ACM) (2000).
  • Hinckley, Ken, et al., “The VideoMouse: A Camera-Based Multi-Degree-of-Freedom Input Device,” CHI Letters vol. 1, 1, UIST '99, Asheville, NC, pp. 103-112 (1999).
  • Hinckley, et al., “A Survey of Design Issues in Spatial Input,” Proceedings of the ACM Symposium on User Interface Software and Technology (1994).
  • Hinckley et al., “Stitching: Pen Gestures that Span Multiple Displays,” 2004.
  • Hinckley et al.: Synchronous Gestures for Multiple Persons and Computers, 2003.
  • Hogue, “MARVIN: A Mobile Automatic Realtime Visual and INertial tracking system,” Master's Thesis, York University (2003).
  • Hogue, et al., “An optical-inertial tracking system for fully-enclosed VR displays,” Proceedings of the 1st Canadian Conference on Computer and Robot Vision, pp. 22-29 (May 2004).
  • Holden, Maureen K., et al.: Use of Virtual Environments in Motor Learning and Rehabilitation Department of Brain and Cognitive Sciences, Handbook of Virtual Environments: Design, Implementation, and Applications, Chap. 49, pp. 999-1026, Stanney (ed), Lawrence Erlbaum Associates 2002.
  • Holloway, Richard Lee, “Registration Errors in Augmented Reality Systems,” Ph.D. Dissertation, University of North Carolina at Chapel Hill (1995).
  • House, Matthew, Product Description: Hot Wheels Stunt Track Driver, Hot Wheels (Jan. 2000).
  • Hudson Soft, “Brochure of Toukon Road Brave Warrior, Brave Spirits” (1998).
  • Hudson Soft—Screen Shot of Brave Spirits (1998).
  • Immersion CyberGlove product, Immersion Corporation, 1990, http://www.cyberglovesystem.com.
  • Inman, “Cheap sensors could capture your every move,” http://technology.newscientist.com/article/dn12963-cheap-sensors-could-capture-your-every-move.html (Nov. 2007).
  • InterSense, “InterSense InertiaCube2 Devices,” (Specification) (image) (2001).
  • InterSense, “InterSense InertiaCube2 Manual for Serial Port Model” (2001).
  • InterSense, InterSense IS-900 Technical Overview—Motion Tracking System, 1999.
  • InterSense, “InterSense IS-1200 FlightTracker Prototype Demonstration” (Video) (Nov. 2004).
  • InterSense, “InterSense IS-1200 InertiaHawk Datasheet” (2009).
  • InterSense, “InterSense IS-1200 VisTracker Datasheet” (2007).
  • InterSense, “InterSense IS-1200 VisTracker Devices,” (image) (2007).
  • InterSense, “InterSense IS-900 Micro Trax™ Datasheet” (2007).
  • InterSense, “InterSense IS-900 Systems Datasheet” (2007).
  • InterSense, “InterSense MicroTrax Demo Reel,” http://www.youtube.com/watch?v=O2F4fuCISo (2007).
  • InterSense, “IS-900 Precision Motion Trackers” www.isense.com May 16, 2003.
  • InterSense, “InterSense Motion Trackers” www.isense.com Mar. 12, 1998.
  • InterSense, “InterSense Inc., The New Standard in Motion Tracking” www.isense.com Mar. 27, 2004.
  • InterSense, “IS-900 Precision Motion Trackers” www.isense.com Sep. 10, 2002.
  • Intersense, “IS-900 Product Technology Brief,” http://www.intersense.com/uploadedFiles/Products/WhitePapers/IS900TechOverviewEnhanced.pdf (1999).
  • InterSense, Inc., “Comparison of InterSense IS-900 System and Optical Systems,” http://www.intersense.com/uploadedFiles/Products/WhitePapers/Comparison%20of%20InterSense%20IS-900%20System%20and%20Optical%20Systems.pdf (Jul. 12, 2004).
  • Izumori et al., High School Algebra: Geometry (1986).
  • Jacob, “Human-Computer Interaction—Input Devices” http://www.cs.tufts.edu/˜jacob/papers/surveys.html, “Human-Computer Interaction: Input Devices,” ACM Computing Surveys, vol. 28, No. 1, pp. 177-179 (Mar. 1996).
  • Jakubowski, et al., “Increasing Effectiveness of Human Hand Tremor Separation Process by Using Higher-Order Statistics,” Measurement Science Review, vol. 1 (2001).
  • Jakubowski, et al., “Higher Order Statistics and Neural Network for Tremor Recognition,” IEEE Transactions on Biomedical Engineering, vol. 49, No. 2 (Feb. 2002).
  • Ji, H.: “Study on the Infrared Remote-Control Lamp-Gesture Device”, Yingyong Jiguang/Applied Laser Technology, v. 17, n. 5, p. 225-227, Oct. 1997 Language: Chinese—Abstract only.
  • Jian, et al., “Adaptive Noise Cancellation,” Rice University, http://www.ece.rice.edu/~klwang/elec434/elec434.htm (Aug. 2007).
  • Jiang, “Capacitive position-sensing interface for micromachined inertial sensors,” Dissertation at Univ. of Cal. Berkeley (2003).
  • Ju, et al., “The Challenges of Designing a User Interface for Consumer Interactive Television,” IEEE 1994 International Conference on Consumer Electronics, Digest of Technical Papers, pp. 114-115 (Jun. 1994).
  • Kalawsky, “The Science of Virtual Reality and Virtual Environments,” 1993.
  • Keir, et al., “Gesture-recognition with Non-referenced Tracking,” IEEE Symposium on 3D User Interfaces, pp. 151-158 (Mar. 25-26, 2006).
  • Kennedy, P.J., “Hand-Held Data Input Device,” IBM Technical Disclosure Bulletin, vol. 26, No. 11, pp. 5826-5827 (Apr. 1984).
  • Kessler, et al., “The Simple Virtual Environment Library” (MIT Presence) (2000).
  • Kindratenko, “A Comparison of the Accuracy of an Electromagnetic and a Hybrid Ultrasound-Inertia Position Tracking System,” MIT Presence, vol. 10, No. 6, Dec. 2001, 657-663 (2001).
  • Klein et al., “Tightly Integrated Sensor Fusion for Robust Visual Tracking,” British Machine Vision Computing, vol. 22, No. 10, pp. 769-776 (2004).
  • Kohler, “Triumph of the Wii: How Fun Won Out in the Console Wars,” www.wired.com/print/gaming/hardware/news/2007/06/wii. (Jun. 2007).
  • Kohlhase, “NASA Report, The Voyager Neptune Travel guide,” Jet Propulsion Laboratory Publication 89-24, excerpt (Jun. 1989).
  • Krumm, et al., “How a Smart Environment Can Use Perception,” Ubicomp 2001 (Sep. 2001).
  • Kuipers, Jack B., “SPASYN—An Electromagnetic Relative Position and Orientation Tracking System,” IEEE Transactions on Instrumentation and Measurement, vol. 29, No. 4, pp. 462-466 (Dec. 1980).
  • Kunz, Andreas M. et al., “Design and Construction of a New Haptic Interface,” Proceedings of DETC '00, ASME 2000 Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Baltimore, Maryland (Sep. 10-13, 2000).
  • La Scala, et al., “Design of an Extended Kalman Filter Frequency Tracker,” IEEE Transactions on Signal Processing, vol. 44, No. 3 (Mar. 1996).
  • Larimer et al., “VEWL: A Framework for building a Windowing Interface in a Virtual Environment,” in Proc. of IFIP TC13 Int. Conf. on Human-Computer Interaction Interact'2003 (Zürich), http://people.cs.vt.edu/˜bowman/papers/VEWLfinal.pdf (2003).
  • Laughlin, et al., “Inertial Angular Rate Sensors: Theory and Applications,” Sensors Magazine (Oct. 1992).
  • Lee et al., “Tilta-Pointer: the Free-Space Pointing Device,” Princeton COS 436 Project, http://www.milyhuang.com/cos436/project/soecs.html (2004).
  • Lee, et al., “Innovative Estimation Method with Measurement Likelihood for all-Accelerometer Type Inertial Navigation Systems,” vol. 38, No. 1 (Jan. 2002).
  • Lee, et al., “Two-Dimensional Position Detection System with MEMS Accelerometer for Mouse Applications,” Design Automation Conference 2001, Proceedings, pp. 852-857 (Jun. 2001).
  • Leganchuk et al., “Manual and Cognitive Benefits of Two-Handed Input: An Experimental Study,” ACM Transactions on Computer-Human Interaction, vol. 5, No. 4, pp. 326-359 (Dec. 1998).
  • Leonard, “Computer Pointer Controls 3D Images in Free Space,” Electronic Design, pp. 160, 162, 165 (Nov. 1991).
  • Liang, et al., “On Temporal-Spatial Realism in the Virtual Reality Environment,” ACM 1991 Symposium on User Interface Software and Technology (Nov. 1991).
  • Link, “Field-Qualified Silicon Accelerometers From 1 Milli g to 200,000 g,” Sensors (Mar. 1993).
  • Liu, et al., “Enhanced Fisher Linear Discriminant Models for Face Recognition,” Proc. 14th International Conference on Pattern Recognition, Queensland, Australia (Aug. 1998).
  • Lobo et al., “Vision and Inertial Sensor Cooperation Using Gravity as a Vertical Reference,” IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 25, No. 12, pp. 1597-1608 (Dec. 2003).
  • Logitech, Logitech 2D/6D Mouse Devices Specification (1991).
  • Logitech, “Logitech 2D/6D Mouse Technical Reference Manual” (1991).
  • Logitech, Inc., “3D Mouse & Head Tracker Technical Reference Manual” (1992).
  • Logitech WingMan Cordless Rumblepad, Logitech, Press Release Sep. 2, 2001, 2 pages.
  • Louderback, Jim, “Nintendo Wii,” Reviews by PC Magazine, (Nov. 13, 2006).
  • “LPC2104/2105/2106, Single-chip 32-bit microcontrollers; 128 kB ISP/IAP Flash with 64 kB/32 kB/16 kB RAM,” Philips, Dec. 22, 2004; 32 pages.
  • Luinge, Inertial sensing of human movement, Thesis, University of Twente (2002).
  • Luinge, et al., “Estimation of orientation with gyroscopes and accelerometers,” Proceedings of the First Joint BMES/EMBS Conference, 1999., vol. 2, p. 844 (Oct. 1999).
  • Luthi, P. et al., “Low Cost Inertial Navigation System,” and translation (2000).
  • MacKenzie et al., “A two-ball mouse affords three degrees of freedom,” Extended Abstracts of the CHI '97 Conference on Human Factors in Computing Systems, pp. 303-304. New York: ACM (1997).
  • MacKinlay, “Rapid Controlled Movement Through a Virtual 3D Workspace,” ACM SIGGRAPH Computer Graphics archive, vol. 24, No. 4, pp. 171-176 (Aug. 1990).
  • MacLean, “Designing with Haptic Feedback,” Proceedings of IEEE Robotics and Automation (ICRA '2000), at 783-88 (Apr. 22-28, 2000).
  • MacLean, Karen, Publications and patents, bibliography (Nov. 2006).
  • Maggioni, C., “A novel gestural input device for virtual reality”, IEEE Virtual Reality Annual International Symposium, 118-24, 1993.
  • Markey et al., “The Mechanics of Inertial Position and Heading Indication,” Massachusetts Institute of Technology, 1961.
  • Marti et al., “Biopsy navigator: a smart haptic interface for interventional radiological gestures”, International Congress Series, vol. 1256, Jun. 2003, 6 pages.
  • Marrin, “Possibilities for the Digital Baton as a General-Purpose Gestural Interface”, Late-Breaking/Short Talks. CHI 97, Mar. 22-27, 1997 (pp. 311-312).
  • Marrin, Teresa et al.: “The Digital Baton: a Versatile Performance Instrument” (1997).
  • Marrin, Teresa: “Toward an Understanding of Musical Gesture: Mapping Expressive Intention with the Digital Baton” (1996).
  • Masliah, “Measuring the Allocation of Control in 6 Degree of Freedom Human-Computer Interaction Tasks,” Proceedings of the SIGCHI conference on Human factors in computing systems, pp. 25-32 (2001).
  • Maybeck, “Stochastic Models, Estimation and Control,” vol. 1, Mathematics in Science and Engineering, vol. 141 (1979).
  • “MEMS enable smart golf clubs” Small Times—MEMS enable smart golf clubs, Jan. 6, 2005, 2 pages.
  • Merians, Alam S. et al.: “Virtual Reality-Augmented Rehabilitation for Patients Following Stroke,” Physical Therapy, vol. 82, No. 9 (Sep. 2002).
  • Merrill, “FlexGesture: A sensor-rich real-time adaptive gesture and affordance learning platform for electronics music control,” Thesis, Massachusetts Institute of Technology (Jun. 2004).
  • Meyer et al., “A Survey of Position Trackers,” MIT Presence, vol. 1, Issue 2, pp. 173-200 (1992).
  • Microsoft Research Corp., “XWand Device” (image) (Apr. 2009).
  • Miles, “New pads lack control,” The Times, Dec. 6, 1999.
  • Mizell, “Using Gravity to Estimate Accelerometer Orientation,” IEEE Computer Society (2003).
  • Morgan, C.; “Still chained to the overhead projector instead of the podium? (TV Interactive Corp's LaserMouse Remote Pro infrared mouse) (Clipboard)(Brief Article) (Product Announcement)”, Government Computer News, Jun. 13, 1994.
  • Morris, “Accelerometry—a technique for the measurement of human body movements,” J Biomechanics 6: 729-736 (1973).
  • Moser, “Low Budget Inertial Navigation Platform (2000),” www.tmoser.ch/typo3/11.0.html, Oct. 2008.
  • Mulder, “How to Build an Instrumental Glove Based on the Powerglove Flex Sensors,” PCVR 16, pp. 10-14 (1994).
  • Mulder, “Human movement tracking technology,” School of Kinesiology, Simon Fraser University (Jul. 1994).
  • Myers, et al., “Interacting at a Distance: Measuring the Performance of Laser Pointers and Other Devices,” CHI 2002, (Apr. 2002).
  • N.I.C.E., “The N.I.C.E. Project” (video), (1997) http://www.niceproject.com/.
  • Naimark, et al., “Circular Data Matrix Fiducial System and Robust Image Processing for a Wearable Vision-Inertial Self-Tracker,” Proceedings. International Symposium on Mixed and Augmented Reality, ISMAR (2002).
  • Naimark, et al., “Encoded LED System for Optical Trackers,” Fourth IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 150-153 (2005).
  • Navarrete, et al., “Eigenspace-based Recognition of Faces: Comparisons and a new approach,” Image Analysis and Processing (2001).
  • Newswire PR, “Five New Realities to Carry Gyration's Gyropoint Point and Gyropoint Pro” (1996).
  • Newswire PR, “Three-Axis MEMS-based Accelerometer From STMicroelectronics Targets Handheld terminals,” STMicro (Feb. 2003).
  • Nichols, “Geospatial Registration of Information for Dismounted Soldiers (GRIDS),” Contractor's Progress, Status, and Management Report (Milestone 3 Report to DARPA ETO) (Oct. 1998).
  • Nintendo, G3 Wireless Controller (Pelican) (2001).
  • Nintendo, Game Boy Advance SP System (2003).
  • Nintendo, GameBoy Color (1998).
  • Nintendo Game Boy, Consumer Information and Precautions Booklet, Nintendo, Jul. 31, 1989.
  • Nintendo, GameCube Controller (2001).
  • Nintendo, GameCube System and Controller (2001).
  • Nintendo, NES Controller (1984).
  • Nintendo, NES Duck Hunt Game (1984).
  • Nintendo, NES System and Controllers (1984).
  • Nintendo, NES Zapper Guns (1984).
  • Nintendo, Nintendo 64 Controller (1996).
  • Nintendo, Nintendo 64 System (N64) (1996).
  • Nintendo, Nintendo 64 System and Controllers (1996).
  • Nintendo, Nintendo Entertainment System (NES) (1984).
  • Nintendo, Nintendo Game Boy Advance (2001).
  • Nintendo, Nintendo Game Boy Advance System (2001).
  • Nintendo, Nintendo Game Boy Advance Wireless Adapter (Sep. 26, 2003).
  • Nintendo, Nintendo Game Boy Color Cartridge with Built-In Rumble (Jun. 28, 2009).
  • Nintendo, Nintendo GameBoy Color System (1998).
  • Nintendo, Nintendo GameBoy System (1989).
  • Nintendo, Nintendo GameCube System (2001).
  • Nintendo, Nintendo N64 Controller with Rumble Pack (1996-1997).
  • Nintendo, Nintendo N64 Rumble Packs (1996-1997).
  • Nintendo, Nintendo Super NES (SNES) (1991).
  • Nintendo, Nintendo: Kirby Tilt & Tumble game, packaging and user manual (Aug. 2000-2001).
  • Nintendo, Nintendo: WarioWare: Twisted! game, packaging and user manual (2004-2005).
  • Nintendo, Pokémon Pinball (1998).
  • Nintendo, SNES Superscope (1991).
  • Nintendo, SNES System & Controllers (1991).
  • Nintendo, Wavebird Wireless Controllers (May 2002).
  • Nintendo, Wavebird Controller, Nintendo, Jun. 2010 Wikipedia Article, http://en.wikipedia.org/wiki/WaveBird.
  • Nintendo, Nintendo Entertainment System Consumer Information and Precautions Booklet, Nintendo of America, Inc. 1992.
  • Nintendo, Nintendo Entertainment System Instruction Manual, Nintendo of America, Inc., 1992.
  • Nintendo, Nintendo Entertainment System Booth 2002.
  • Nintendo, Nintendo Entertainment System Layout, May 9, 2002.
  • Nintendo, Nintendo Feature: History of Pokémon Part 2, Official Nintendo Magazine, May 17, 2009, http://www.offficialnintendomagazine.co.uk/article.php?id=8576.
  • Nishiyama, “A Nonlinear Filter for Estimating a Sinusoidal Signal and its Parameters in White Noise: On the Case of a Single Sinusoid,” IEEE Transactions on Signal Processing, vol. 45, No. 4 (Apr. 1997).
  • Nishiyama, “Robust Estimation of a Single Complex Sinusoid in White Noise—H∞ Filtering Approach,” IEEE Transactions on Signal Processing, vol. 47, No. 10 (Oct. 1999).
  • Odell, “An Optical Pointer for Infrared Remote Controllers,” Proceedings of International Conference on Consumer Electronics (1995).
  • Odell, Transcript of Testimony, Investigation No. 337-TA-658, Before the United States International Trade Commission, vol. IV, redacted (May 14, 2009).
  • Ogawa et al., “Wii are the Elite,” GameSpot web site (Feb. 5, 2008).
  • Ojeda, et al., “No GPS? No Problem!” University of Michigan Develops Award-Winning Personal Dead-Reckoning (PDR) System for Walking Users, http://www.engin.umich.edu/research/mrl/urpr/InPress/P135.pdf (post 2004).
  • OLPC, “One Laptop Per Child,” wiki.laptop.org/go/OneLaptopperChild (May 2009).
  • Omelyan, “On the numerical integration of motion for rigid polyatomics: The modified quaternion approach,” Computers in Physics, vol. 12, No. 1, pp. 97-103 (1998).
  • Ovaska, “Angular Acceleration Measurement: A Review,” Instrumentation and Measurement Technology Conference, Conference Proceedings. IEEE, vol. 2 (Oct. 1998).
  • PAD-Controller and Memory I/F in Playstation (Apr. 17, 1995; Jan. 12, 2002).
  • Pai, et al., “The Tango: A Tangible Tangoreceptive Whole-Hand Interface,” Proceedings of World Haptics and IEEE Eurohaptics Conference, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (2005).
  • Paley, W. Bradford, “Interaction in 3D Graphics,” SIGGRAPH Computer Graphics Newsletter, Cricket input device (Nov. 1998).
  • Paradiso, et al., “Interactive Therapy with Instrumented Footwear,” CHI 2004, Apr. 24-29, 2004, Vienna, Austria (2004).
  • Paradiso, Joseph A., “The Brain Opera Technology: New Instruments and Gestural Sensors for Musical Interaction and Performance” (Nov. 1998) (“Brain Opera Article”).
  • Park, Adaptive control strategies for MEMS gyroscopes (Dissertation), Univ. Cal. Berkeley (2000).
  • PC World, “The 20 Most Innovative Products of the Year” (Dec. 27, 2006).
  • Perry, Simon: “Nintendo to Launch Wireless Game Boy Adaptor,” Digital Lifestyles, Sep. 26, 2003, http://digital-lifestyles.info/2003/09/26/nintendo-to-launch-wireless-game-boy-adaptor/.
  • Pham, Hubert “Pointing in Intelligent Environments with WorldCursor,” Proceedings of Internet 2003, Andrew Wilson (2003).
  • Phillips, “Forward/Up Directional Incompatibilities During Cursor Placement Within Graphical User Interfaces,” Ergonomics, informaworld.com (May 2005).
  • Phillips, “On the Right Track: A unique optical tracking system gives users greater freedom to explore virtual worlds” (Apr. 2000).
  • Photographs of prior art ASCII Grip V2 Controller, (ASCII/Sammy Grip V2 One Switch-Accessible Gaming Shop-ASCII Grip V2, http://www.oneswitch.org.uk/1/AGS/AGS-onehand/ascii-grip-v2.html, Jul. 10, 2008, pp. 1-2.).
  • Pierce et al., “Image Plane Interaction Techniques in 3D Immersive Environments,” Proceedings of the 1997 symposium on Interactive 3D graphics, portal.acm.org (1997).
  • Pilcher, “AirMouse Remote Controls,” IEEE Conference on Consumer Electronics (1992).
  • Pique, “Semantics of Interactive Rotations,” Interactive 3D Graphics, Proceedings of the 1986 workshop on Interactive 3D graphics, pp. 259-269 (Oct. 1986).
  • Piyabongkarn, “Development of a MEMS Gyroscope for Absolute Angle Measurement,” IEEE Transactions on Control Systems Technology, vol. 13, Issue 2, pp. 185-195 (Mar. 2005).
  • Piyabongkarn, “Development of a MEMS Gyroscope for Absolute Angle Measurement,” Dissertation, Univ. Minnesota (Nov. 2004).
  • Pokémon Pinball Game, 1999, Wikipedia Article, http://www.en.wikipedia.org/wiki/Pok%C3%A9monPinball.
  • Polhemus, “Polhemus 3Space FASTRAK devices” (image) (2000).
  • Polhemus: “FASTRAK, The Fast and Easy Digital Tracker,” copyrighted 2001, Colchester, Vermont, 2 pages.
  • PowerGlove product Program Guide, Mattel, 1989.
  • PowerGlove product, Mattel, 1989 Wikipedia Article.
  • PowerGlove product, Instructions, Mattel, 1989.
  • Pryor et al., “A Reusable Software Architecture for Manual Controller Integration,” IEEE Conf. on Robotics and Automation, Univ. of Texas (Apr. 1997).
  • Raab et al., “Magnetic Position and Orientation Tracking System,” IEEE Transactions on Aerospace and Electronics Systems, vol. AES-15, No. 5, pp. 709-718 (Sep. 1979).
  • Raethjen, et al., “Tremor Analysis in Two Normal Cohorts,” Clinical Neurophysiology 115 (2004).
  • Rebo, “Helmet-Mounted Virtual Environment Display System,” Thesis, Air Force Institute of Technology, Defense Technical Information Center (Dec. 1988).
  • Rebo, et al., “Helmet-Mounted Virtual Environment Display System,” Proc. SPIE vol. 1116, pp. 80-84 (Sep. 1989).
  • Regan, “Smart Golf Clubs”, The Baltimore Sun. Jun. 17, 2005, 1 page.
  • Rekimoto, “Tilting Operations for Small Screen Interface,” Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology, pp. 167-168 (1996).
  • Reunert, “Fiber-Optic Gyroscopes: Principles and Applications,” Sensors (Aug. 1993).
  • Ribo, et al., “Hybrid Tracking for Outdoor Augmented Reality Applications,” IEEE Computer Graphics and Applications, vol. 22, No. 6, pp. 54-63 (Nov./Dec. 2002).
  • Riviere, Cameron Testimony, Trial Day 5, In the Matter of Certain Video Game Machines and Related Three-Dimensional Pointing Devices, ITC Investigation No. 337-TA-658 (May 15, 2009).
  • Riviere, C., Robotics Institute, http://www.ri.cmu.edu/people/riviere_cameron.html http://www.ri.cmu.edu/person.html?type=publications&personid=248 (Aug. 2007).
  • Riviere, et al., “Adaptive Canceling of Physiological Tremor for Improved Precision in Microsurgery,” IEEE Transactions on Biomedical Engineering, vol. 45, No. 7 (Jul. 1998).
  • Riviere, et al., “Toward Active Tremor Canceling in Handheld Microsurgical Instruments,” IEEE Transactions on Robotics and Automation, vol. 19, No. 5 (Oct. 2003).
  • Roberts, “The Lincoln Wand,” AFIPS Conference Proceedings, MIT Lincoln Laboratory (1966).
  • Robinett et al., “Implementation of Flying, Scaling, and Grabbing in Virtual Worlds,” ACM Symposium (1992).
  • Robinett et al., “The Visual Display Transformation for Virtual Reality,” University of North Carolina at Chapel Hill (1994).
  • Robotics Research Group, “Robot Design: Robot Manual Controller Design,” The University of Texas at Austin, May 2009.
  • Roetenberg, “Inertial and magnetic sensing of human motion,” Thesis (2006).
  • Roetenberg, et al., “Inertial and Magnetic Sensing of Human Movement Near Ferromagnetic Materials,” Proceedings. The Second IEEE and ACM International Symposium on Mixed and Augmented Reality (Mar. 2003).
  • Rolland, et al., “A Survey of Tracking Technology for Virtual Environments,” University of Central Florida, Center for Research and Education in Optics and Lasers (CREOL) (2001).
  • Sakai, et al., “Optical Spatial Filter Sensor for Ground Speed,” Optical Review, vol. 2, No. 1, pp. 65-67 (1994).
  • Satterfield, Shane, E3 2002: Nintendo announces new GameCube games, GameSpot, May 21, 2002, http://wwwgamespot.com/gamecube/action/rollarama/new.html?sid=2866974&comact-convert&omclk=nesfeatures&tag=newsfeatures%Btitle%3B.
  • Savage, Paul G., “Advances in Strapdown Inertial Systems,” Lecture Series Advisory Group for Aerospace Research and Development Neuilly-Sur-Seine (France) (1984).
  • Sawada et al., “A Wearable Attitude-Measurement System Using a Fiberoptic Gyroscope” Massachusetts Institute of Technology, vol. 11, No., Apr. 2002, pp. 109-118.
  • Saxena et al., “In Use Parameter Estimation of Inertial Sensors by Detecting Multilevel Quasi-Static States,” Lecture Notes in Computer Science, 2005—Berlin: Springer-Verlag, (Apr. 2004).
  • Sayed, “A Framework for State-Space Estimation with Uncertain Models,” IEEE Transactions on Automatic Control, vol. 46, No. 7 (Jul. 2001).
  • Sayed, UCLA Adaptive Systems Laboratory—Home Page, UCLA, http://asl.ee.ucla.edu/index.php?option=com_frontpage&Itemid=1 (Aug. 2007).
  • Schmorrow et al., “The PSI Handbook of Virtual Environments for Training and Education,” vol. 1, 2009.
  • Schofield, Jack et al., “Coming up for airpad,” The Guardian (Feb. 2000).
  • Sega/Sports Sciences, Inc., “Batter Up, It's a Hit,” Instruction Manual, Optional Equipment Manual (1994).
  • Sega/Sports Sciences, Inc., “Batter Up, It's a Hit,” Photos of baseball bat (1994).
  • Selectech, “Airmouse Remote Control System Model AM-1 User's Guide,” Colchester, VT (Sep. 24, 1991).
  • Selectech, “AirMouse Remote Controls, AirMouse Remote Control Warranty” (1991).
  • Selectech, “Changing Driver Versions on CDTV/AMIGA” (Oct. 17, 1991).
  • Selectech, “Selectech AirMouse Remote Controls, Model # AM-R1,” photographs (1991).
  • Selectech, Facsimile Transmission from Rossner to Monastiero, Airmouse Remote Controls, Colchester, VT (Mar. 25, 1992).
  • Selectech, Selectech AirMouse Devices (image) (1991).
  • Selectech, Software, “AirMouse for DOS and Windows IBM & Compatibles,” “AirMouse Remote Control B0100EN-C, Amiga Driver, CDTV Driver, Version: 1.00,” “AirMouse Remote Control B0100EM-C.1, Apple Macintosh Serial Driver Version: 1.00(1.01B),” “AirMouse Remote Control B0100EL-B/3.05 DOS Driver Versions: 3.0 Windows Driver Version 1.00,” AirMouse Remote Control MS-DOS Driver Version: 3.00/3.05, Windows 3.0 Driver Version: 1.00 (1991).
  • “Self-Contained, Free Standing “Fishing Rod” Fishing Games,” Miacomet and Interact Announce Agreement to Launch Line of Reel Feel™ Sport Controllers, Press Release, May 13, 1999, 4 pages.
  • Seoul National Univ., “EMMU System”—Seoul National Univ Power Point Presentation, www.computer.org/portal/cmsdocsieeecs/ieeecs/education/cside/CSIDC03Presentations/SNU.ppt (2003).
  • Serial Communication (Wikipedia) (Jul. 2, 2005).
  • Shoemake, Ken, Quaternions, UPenn, Online (Oct. 2006).
  • Simon, et al., “The Yo Yo: A Handheld Combining Elastic and Isotonic Input,” http://www.uni-weimar.de/cms/fileadmin/medien/vr/documents/publications/TheYoYo-Interacts2003-Talk.pdf (2003).
  • Simon, et al., “The Yo Yo: A Handheld Device Combining Elastic and Isotonic Input,” Human-computer Interaction—INTERACT'03, pp. 303-310 (2003).
  • Smartswing internal drawing, 1 page (2004).
  • Smartswing, Training Aid, Apr. 2005, Austin, Texas.
  • SmartSwing: “Register to be notified when Smartswing products are available for purchase,” 3 pages, May 2004, retrieved May 19, 2009, http://web.archive.org/web/20040426182437/www.smartswing-golf.com/.
  • SmartSwing: “SmartSwing: Intelligent Golf Clubs that Build a Better Swing,” 2 pages, 2004, retrieved May 19, 2009, http://web.archive.org/web/20040728221951/http://www.smartswinggolf . . . .
  • SmartSwing: “The SmartSwing Learning System Overview,” 3 pages, 2004, retrieved May 19, 2009, http://web.archive.org/web/20040810142134/http://www.smartswinggolf.com/t . . . .
  • SmartSwing: “The SmartSwing Product,” 3 pages, 2004, retrieved May 19, 2009, http://web.archive.org/web/20040032004628/http://www.smartswinggolf.com/ . . . .
  • SmartSwing: “The SmartSwing Product: Technical Information,” 1 page, 2004, retrieved May 19, 2009, http://web.archive.org/web/200400403205906/http://www.smartswinggolf.com/ . . . .
  • SmartSwing, Letter from the CEO—pp. 1-3, May 2009.
  • SmartSwing: The SmartSwing Learning System: How it Works, 3 pages, 2004, retrieved May 19, 2009, http://web.archive.org/web/20040403213108/http://www.smartswinggolf.com/.
  • Smith, “Gyrevolution: Orienting the Digital Era,” http://www.gyration.com/images/pdfs/GyrationWhitePaper.pdf (2007).
  • Sorenson, et al., “The Minnesota Scanner: A Prototype Sensor for Three-Dimensional Tracking of Moving Body Segments,” IEEE Transactions on Robotics and Automation (Aug. 1989).
  • Sourceforge.com, “ARToolkit API Documentation” (SourceForge web pages) (2004-2006).
  • Stovall, “Basic Inertial Navigation,” NAWCWPNS TM 8128, Navigation and Data Link Section, Systems Integration Branch (Sep. 1997).
  • Sulic, “Logitech Wingman Cordless Rumblepad Review”, Review at IGN, 4 pages, Jan. 14, 2002.
  • “Superfamicom Grip controller by ASCII,” http://superfami.com/sfcgrip.html, Jul. 10, 2008, pp. 1-2.
  • Sutherland, “A Head-Mounted Three Dimensional Display,” AFIPS '68 (Fall, part I): Proceedings of the Dec. 9-11, 1968, fall joint computer conference, part I, pp. 757-764 (Dec. 1968).
  • Sutherland, Ivan E., “Sketchpad: A Man-Machine Graphical Communication System,” AFIPS '63 (Spring): Proceedings of the May 21-23, 1963, Spring Joint Computer Conference, pp. 329-346 (May 1963).
  • Sweetser, “A Quaternion Algebra Tool Set,” http://world.std.com/%7Esweetser/quaternions/intro/tools/tools.html (Jun. 2005).
  • Swisher “How Science Can Improve Your Golf Game, Your Club is Watching” The Wall Street Journal, Apr. 18, 2005, 1 page.
  • Templeman, James N., “Virtual Locomotion: Walking in Place through Virtual Environments,” Presence, vol. 8 No. 6, pp. 598-617, Dec. 1999.
  • Thinkoptics, Thinkoptics Wavit devices (image) (2007).
  • Timmer, “Data Analysis and Modeling Dynamic Processes in the Life Sciences,” Freiburg Center for Data Analysis and Modeling, http://webber.physik.uni-freiburg.de/~jeti/ (Aug. 2007).
  • Timmer, “Modeling Noisy Time Series: Physiological Tremor,” International Journal of Bifurcation and Chaos, vol. 8, No. 7 (1998).
  • Timmer et al., “Pathological Tremors: Deterministic Chaos or Non-linear Stochastic Oscillators?” Chaos, vol. 10, No. 1 (Mar. 2000).
  • Timmer, et al., “Characteristics of Hand Tremor Time Series,” Biological Cybernetics, vol. 70 (1993).
  • Timmer, et al., “Cross-Spectral Analysis of Physiological Tremor and Muscle Activity: I. Theory and Application to Unsynchronized Electromyogram,” Biological Cybernetics, vol. 78 (1998).
  • Timmer, et al., “Cross-Spectral Analysis of Physiological Tremor and Muscle Activity: II. Application to Synchronized Electromyogram,” Biological Cybernetics, vol. 78 (1998).
  • Timmer, et al., “Cross-Spectral Analysis of Tremor Time Series,” International Journal of Bifurcation and Chaos, vol. 10, No. 11 (2000).
  • Titterton et al., “Strapdown Inertial Navigation Technology,” pp. 1-56 and pp. 292-321 (May 1997).
  • Traq 3D (Trazer) Product, http://www.exergamefitness.com/traq3d.htm, http://www.trazer.com/, http://www.traq3d.com/ (1997).
  • Traq 3D, “Healthcare,” 1 page, //www.traq3d.com/Healthcare/Healthcare.aspx, 1997.
  • Translation of the brief of BigBen of Oct. 27, 2010 and original German text (Nov. 3, 2010).
  • Translation of the brief of System Com 99 of Oct. 27, 2010 and original German text.
  • Translation of Exhibit B-B01: Cancellation Request of BigBen of Oct. 15, 2010 against German utility model 20 2006 020 818 (UM1) (Oct. 15, 2010) and original German text.
  • Translation of Exhibit B-C01: Cancellation Request of BigBen of Oct. 15, 2010 against German utility model 20 2006 020 819 (UM2) (Oct. 15, 2010) and original German text.
  • Translation of Exhibit B-D01: Cancellation Request of BigBen of Oct. 15, 2010 against German utility model 20 2006 020 820 (UM3) (Oct. 15, 2010) and original German text.
  • Translation of Opposition Brief of BigBen of Sep. 2, 2010 Against European Patent No. EP 1854518.
  • Transmission Mode (Apr. 22, 1999).
  • Ulanoff, Lance, “Nintendo's Wii is the Best Product Ever,” PC Magazine (Jun. 21, 2007).
  • UNC Computer Science Department, “News & Notes from Sitterson Hall,” UNC Computer Science, Department Newsletter, Issue 24, Spring 1999, (Apr. 1999).
  • Univ. Illinois at Chicago, “CAVE—A Virtual Reality Theater,” http://www.youtube.com/watch?v=Sf6bJjwSCE (1993).
  • Univ. Wash., “ARToolkit” (U. Wash. web pages) (1999).
  • Urban, “BAA 96-37 Proposer Information,” DARPA/ETO (1996).
  • US Dynamics Corp, “Spinning Mass Mechanical Gyroscopes” (Aug. 2006).
  • US Dynamics Corp, “The Concept of ‘Rate’ (more particularly, angular rate pertaining to rate gyroscopes) (rate gyro explanation)” (Aug. 2006).
  • US Dynamics Corp, “US Dynamics Model 475 Series Rate Gyroscope Technical Brief—brief discussion on rate gyroscope basics, operation, and uses, and a dissection of the model by major component” (Dec. 2005).
  • US Dynamics Corp, “US Dynamics Rate Gyroscope Interface Brief (rate gyro IO)” (Aug. 2006).
  • VTi, Mindflux-VTi CyberTouch, 1996, http://www.mindflux.com.au/products/vti/cybertouch.html.
  • Van Den Bogard, “Using linear filters for real-time smoothing of rotational data in virtual reality application,” http://www.science.uva.nl/research/ias/alumni/m.sc.theses/theses/RobvandenBogard.pdf (Aug. 2004).
  • Van Laerhoven, et al., “Using an Autonomous Cube for Basic Navigation and Input,” Proceedings of the 5th International Conference on Multimodal Interfaces, Vancouver, British Columbia, Canada, pp. 203-210 (2003).
  • Van Rheeden, et al., “Noise Effects on Centroid Tracker Aim Point Estimation,” IEEE Trans. on Aerospace and Electronic Systems, vol. 24, No. 2, pp. 177-185 (Mar. 1988).
  • Vaz, et al., “An Adaptive Estimation of Periodic Signals Using a Fourier Linear Combiner,” IEEE Transactions on Signal Processing vol. 42, Issue 1, pp. 1-10 (Jan. 1994).
  • Verplaetse, “Inertial Proprioceptive Devices: Self-Motion Sensing Toys and Tools,” IBM Systems Journal (Sep. 1996).
  • Verplaetse, “Inertial-Optical Motion-Estimating Camera for Electronic Cinematography,” Masters of Science Thesis, MIT, (1997).
  • Villoria, Gerald, Hand on Roll-O-Rama Game Cube, Game Spot, May 29, 2002, http://www.gamespot.com/gamecube/action/rollorama/news.html?sid=2868421&comact=convert&omclk=newsfeatures&tag=newsfeatures;title;1&m.
  • Virtual Fishing, Operational Manual, 2 pages, Tiger Electronics, Inc., 1998.
  • Virtual Technologies, Inc., Cyberglove brochure, Palo Alto, CA, www.virtex.com. (1999).
  • Vorozcovs, et al., “The Hedgehog: A Novel Optical Tracking Method for Spatially Immersive Displays,” MIT Presence, vol. 15, No. 1, pp. 108-121 (2006).
  • VR Solutions, “IS-1200”, www.vrs.com.au/motion-tracking/intersense/is-1200.html 2 pages (May 2009).
  • Wang, et al., “Tracking a Head-Mounted Display in a Room-Sized Environment with Head-Mounted Cameras,” SPIE 1990 Technical Symposium on Optical Engineering and Photonics in Aerospace Sensing, vol. 1290, pp. 47-57 (1990).
  • Ward, et al., “A Demonstrated Optical Tracker With Scalable Work Area for Head-Mounted Display Systems,” Symposium on Interactive 3D Graphics, Proceedings of the 1992 Symposium on Interactive 3D Graphics, pp. 43-52, ACM Press, Cambridge, MA (1992).
  • Watt, 3D Computer Graphics, “Three-Dimensional Geometry in Computer Graphics,” pp. 1-22 Addison-Wesley (1999).
  • Welch et al., HiBall Devices (image) (2002-2006).
  • Welch et al., “Motion Tracking: No Silver Bullet, but a Respectable Arsenal,” IEEE Computer Graphics and Applications, vol. 22, No. 6, pp. 24-38 (Nov. 2002).
  • Welch, “Hybrid Self-Tracker: An Inertial/Optical Hybrid Three-Dimensional Tracking System,” Tech. Report TR95-048, Dissertation Proposal, Univ. of North Carolina at Chapel Hill, Dept. Computer Science, Chapel Hill, N.C. (1995).
  • Welch, “Hawkeye Zooms in on Mac Screens with Wireless Infrared Penlight Pointer,” MacWeek (May 1993).
  • Welch, et al., “Complementary Tracking and Two-Handed Interaction for Remote 3D Medical Consultation with a PDA,” Proceedings of Trends and Issues in Tracking for Virtual Environments, Workshop at the IEEE Virtual Reality 2007 Conference (Mar. 2007).
  • Welch, et al., “High-Performance Wide-Area Optical Tracking: The HiBall Tracking System,” MIT Presence: Teleoperators & Virtual Environments (2001).
  • Welch, et al., “SCAAT: Incremental Tracking with Incomplete Information,” Computer Graphics, SIGGRAPH 97 Conference Proceedings, pp. 333-344 (Aug. 1997).
  • Welch, et al., “Source Code for HiBall+Inertial device,” UNC-CH Computer Science (Jun. 1998).
  • Welch, et al., “The HiBall Tracker: High-Performance Wide-Area Tracking for Virtual and Augmented Environments,” ACM SIGGRAPH, Addison-Wesley (1999).
  • Welch, et al., “High-Performance Wide-Area Optical Tracking: The HiBall Tracking System,” MIT Presence, vol. 10, No. 1 (Feb. 2001).
  • Welch, et al., “Tracking for Training in Virtual Environments: Estimating the Pose of People and Devices for Simulation and Assessment,” [in J. Cohn, D. Nicholson, D. Schmorrow (eds.), The PSI Handbook of Virtual Environments for Training and Education: Developments for the Military and Beyond, Chap. 1, pp. 23-47] (2008).
  • Widrow, et al., “Fundamental Relations Between the LMS Algorithm and the DFT,” IEEE Transactions on Circuits and Systems, vol. CAS-34, No. 7 (Jul. 1987).
  • Wiley, M.: “Nintendo Wavebird Review,” US, Jun. 11, 2002, 21 pages.
  • Williams, et al., “Physical Presence: Palettes in Virtual Spaces,” Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series, vol. 3639, pp. 374-384 (May 1999).
  • Williams, Robert L. et al., “Implementation and Evaluation of a Haptic Playback System,” vol. 3 No. 3, Haptics-e (2004).
  • Williams, Robert L. et al., “The Virtual Haptic Back Project,” Presented at the Image 2003 Conference, Scottsdale, Arizona (Jul. 14-18, 2003).
  • Wilson, “Wireless User Interface Devices for Connected Intelligent Environments,” Ubicomp 2003 Workshop (2003).
  • Wilson, “WorldCursor: Pointing in Intelligent Environments with the World Cursor,” UIST '03 Companion (Nov. 2003).
  • Wilson, “XWand: UI for Intelligent Environments,” http://research.microsoft.com/en-us/um/people/awilson/wand/default.htm (Apr. 2004).
  • Wilson, et al., “Demonstration of the XWand Interface for Intelligent Space,”UIST '02 Companion, pp. 37-38 (Oct. 2002).
  • Wilson, et al., “Gesture Recognition Using the Xwand,” ri.cmu.edu (2004).
  • Wilson, et al., “Xwand: UI for Intelligent Space,” CHI 2003, Proceedings of the SIGCHI conference on Human factors in computing systems, pp. 545-552 (Apr. 2003).
  • Wilson, Research page, biography available at http://research.microsoft.com/en-us/um/people/awilson/?0sr=a, Microsoft Corp. (2009).
  • Wilson, Transcript of Testimony Investigation No. 337-TA-658, Before the United States International Trade Commission, vol. V (May 15, 2009).
  • Wilson, XWand video, http://research.microsoft.com/˜awilson/wand/wand%20video%20768k.WMV (Mar. 2002).
  • Wired Glove, Wikipedia Article, 4 pages, http://en.wikipedia.org/wiki/Wiredglove, (Nov. 18, 2010).
  • Wireless (Wikipedia) (Aug. 12, 2005).
  • Wormell, “Unified Camera, Content and Talent Tracking in Digital Television and Movie Production,” InterSense, Inc. & Mark Read, Hypercube Media Concepts, Inc. Presented: NAB 2000, Las Vegas, NV, Apr. 8-13, 2000.
  • Wormell, et al., “Advancements in 3D Interactive Devices for Virtual Environments,” ACM International Conference Proceeding Series; vol. 39 (2003).
  • Office Action issued in Taiwanese Patent Appl. No. 1002112610 on Dec. 14, 2011.
  • Office Action/Search Report issued in Taiwanese Patent Appl. No. 10021121610 on Dec. 14, 2011.
  • Worringham, et al., “Directional Stimulus-Response Compatibility: A Test of Three Alternative Principles,” Ergonomics, vol. 41, Issue 6, pp. 864-880 (Jun. 1998).
  • www.3rdtech.com (2000-2006).
  • Yang, et al., “Implementation and Evaluation of ‘Just Follow Me’: An Immersive, VR-Based Motion-Training System,” MIT Presence: Teleoperators and Virtual Environments, vol. 11 No. 3, at 304-23 (MIT Press) (Jun. 2002).
  • You, et al., “Hybrid Inertial and Vision Tracking for Augmented Reality Registration,” http://graphics.usc.edu/cgit/pdf/papers/Vr1999.PDF (1999).
  • You, et al., “Orientation Tracking for Outdoor Augmented Reality Registration,” IEEE Computer Graphics and Applications, IEEE, vol. 19, No. 6, pp. 36-42 (Nov. 1999).
  • Youngblut, et al., “Review of Virtual Environment Interface Technology,” Institute for Defense Analyses (Jul. 1996).
  • Yun, et al., “Recent Developments in Silicon Microaccelerometers,” Sensors, University of California at Berkeley (Oct. 1992).
  • Zhai, “Human Performance in Six Degree of Freedom Input Control,” Thesis, University of Toronto (1995).
  • Zhai, “User Performance in Relation to 3D Input Device Design”, Computer Graphics 32(4), Nov. 1998, 15 pages.
  • Zhou, et al., “A survey—Human Movement Tracking and Stroke Rehabilitation,” Technical Report: CSM-420, ISSN 1744-8050, Dept. of Computer Sciences, University of Essex, UK (Dec. 8, 2004).
  • Zhu, et al., “A Real-Time Articulated Human Motion Tracking Using Tri-Axis Inertial/Magnetic Sensors Package,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 12, No. 2 (Jun. 2004).
  • European Search Report for Application No. EP 07 11 2880, Oct. 18, 2007.
  • European Search Report for Application No. EP 10178309.0, Apr. 2, 2011.
  • Office Action issued in related Chinese patent application 200610111559.7 (Sep. 18, 2009).
  • Office Action issued in related Japanese patent application 2006-216569 (Oct. 20, 2009).
  • Office Action issued in corresponding Japanese patent application 2007-203785 (Oct. 27, 2008).
  • Office Action issued in corresponding Japanese patent application 2008-2566858 (Sep. 9, 2010).
  • Office Action issued in corresponding Japanese patent application 2005-249265 (Apr. 21, 2011).
  • U.S. Appl. No. 11/745,842, filed May 8, 2007.
  • U.S. Appl. No. 11/404,871, filed Apr. 17, 2006.
  • U.S. Appl. No. 11/404,844, filed Apr. 17, 2006.
  • U.S. Appl. No. 11/790,780, filed Apr. 27, 2007.
  • U.S. Appl. No. 12/889,863, filed Sep. 24, 2010.
  • U.S. Appl. No. 13/028,648, filed Feb. 16, 2011.
  • U.S. Appl. No. 13/071,008, filed Mar. 24, 2011.
  • U.S. Appl. No. 13/071,028, filed Mar. 24, 2011.
Patent History
Patent number: RE45905
Type: Grant
Filed: Nov 27, 2013
Date of Patent: Mar 1, 2016
Assignee: Nintendo Co., Ltd. (Kyoto)
Inventors: Akio Ikeda (Kyoto), Kuniaki Ito (Kyoto), Ryoji Kuroda (Kyoto), Genyo Takeda (Kyoto), Masahiro Urata (Kyoto)
Primary Examiner: James S. McClellan
Application Number: 14/092,481
Classifications
Current U.S. Class: Display Peripheral Interface Input Device (345/156)
International Classification: A63F 13/20 (20140101); A63F 13/22 (20140101);