Multi-user touch-responsive entertainment device

An entertainment device comprising a housing or base unit that supports a display surface is disclosed. A display generating device displays visual images on the display surface. The display generating device may be a projection-type display device or a display panel device. A touch/proximity sensing device detects positions of a user appendage on the display surface in the course of a game or activity. In addition, users may interact with the entertainment device with at least one input controller device that comprises buttons, directional pad devices, etc. A control unit, connected to the touch/proximity sensing device and to the display generating device, is responsive to signals from the at least one input controller and/or position detections from the touch/proximity sensing device to, among other operations, alter the visual image displayed on the display surface.

Description
RELATED APPLICATIONS

This application claims priority to U.S. Provisional Application No. 60/625,108, filed Nov. 5, 2004, the entirety of which is incorporated herein by reference.

FIELD OF THE INVENTION

The present invention relates to a multi-user entertainment device that displays visual images and generates audio in coordinated response to touch and other user interaction.

BACKGROUND OF THE INVENTION

Interactive electronic game devices have evolved to somewhat complex systems that provide audio and visual output in a variety of forms. These devices can be useful to provide entertainment to children as well as serving as a learning tool.

Many entertainment devices of this type heretofore known have limited, if any, multi-user or multiplayer capability. In addition, these games are not flexible in terms of the type of user input devices that can be used. Very often they are limited to unique controllers that operate only with a particular game device. Many prior art devices also use outdated analog technologies and do not take advantage of the many types of data or content available in digital format. Moreover, many interactive game devices require the use of physical game pieces, which are easily lost or misplaced, in combination with displayed images.

It would be desirable to provide an interactive entertainment device that is fully digital and embodied in a flexible hardware platform that can bring an endless variety of environments and experiences in a way not heretofore known.

SUMMARY OF THE INVENTION

Briefly, an entertainment device is provided comprising a housing or base unit that supports a display surface. A display generating device displays visual images on the display surface. The display generating device may be a projection-type display device or a display panel device. A touch/proximity sensing device detects positions of a user's appendage on the display surface in the course of a game or activity. Users may also interact with the entertainment device with at least one input controller device that comprises buttons, directional pad devices, etc. A control unit is responsive to signals from the at least one input controller and/or position detections from the touch/proximity sensing device to, among other operations, alter the visual image displayed on the display surface and/or generate accompanying audio in the form of game sounds, music, etc.

Users may interact with the entertainment device at each of a plurality of user positions that are located around the display surface. When a transition in an activity is made from one user position to another, the control unit controls the display generating device to rotate the displayed visual images so that they are properly aligned with the other user position. In addition, the control unit adjusts how it interprets touch position detections made by the touch/proximity sensing device during such a transition from one user position to another.
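
The coordinate adjustment described above can be illustrated with a short sketch (Python is used here purely for illustration; the specification does not prescribe any particular implementation). Assuming a square N×N cross-point sensor grid and 90° rotations between the four user positions, a touch detection can be rotated into the frame of the active user position as follows:

```python
def rotate_touch(row, col, quarter_turns, n=16):
    """Map a (row, col) cross-point detection into the frame of the
    active user position, rotated by quarter_turns * 90 degrees
    clockwise on an n x n grid (0-indexed)."""
    for _ in range(quarter_turns % 4):
        # One 90-degree clockwise rotation of grid coordinates.
        row, col = col, (n - 1) - row
    return row, col
```

For example, a touch at Row 5, Column 4 seen from one side of the housing corresponds to Row 10, Column 11 when the activity has rotated 180° to the opposite side of a sixteen-by-sixteen grid.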

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a perspective view of an entertainment device in accordance with an embodiment of the present invention, with the entertainment device in a first, projection position.

FIG. 2A illustrates a perspective view of the entertainment device of FIG. 1, with the entertainment device in an initial stage of transition to a second, storage position.

FIG. 2B illustrates a perspective view of the entertainment device of FIG. 1, with the entertainment device completely transformed into the second, storage position.

FIG. 2C illustrates a top perspective view of the entertainment device of FIG. 1, with the entertainment device in the second, storage position.

FIG. 3 is a block diagram of the electrical components of the entertainment device shown in FIGS. 1 and 2.

FIG. 4 is a cross-sectional view of an embodiment of a display generating device for use with an entertainment device in accordance with the present invention.

FIGS. 5 and 6 are schematic representations of alternative forms of display generating devices useful with an entertainment device according to the invention.

FIG. 7 is a fragmentary view of part of a corner of the touch sensing device underlying the display surface according to one embodiment of the device shown in FIGS. 1 and 2.

FIG. 8 is a top plan view of a schematic of the cross-point sensor array for the touch sensing device shown in FIG. 7.

FIG. 9 is a schematic view of part of the display surface overlying the touch sensing device shown in FIGS. 7 and 8.

FIGS. 10-11 are diagrammatic sectional views of one sensor in the sensor array shown in FIG. 8, with and without contact by a user's finger/appendage.

FIG. 12 is an electrical block diagram of the circuitry used in the touch-sensing device shown in FIGS. 7-11.

FIGS. 13A and 13B depict a flow chart for touch position response algorithms that interpret the operative position of a user's touch.

FIG. 14 is a schematic diagram of an integrated display monitor/touch panel that may be used in the entertainment device shown in FIGS. 1 and 2 according to an embodiment of the invention.

FIG. 15 is a plan view of a display surface of the entertainment device and showing how it changes the orientation of the displayed visual image according to which of multiple users is active in a game or activity.

FIG. 16 is a flow chart for the display re-orientation algorithm depicted in FIG. 15.

DETAILED DESCRIPTION

The Entertainment Device Generally

In accordance with the present invention, a game/entertainment/creativity device is disclosed. The game system may include a display generating device that displays visual images on a display surface. The display generating device may be a projection-type display device or a display panel device. A touch/proximity sensing device detects positions of a user appendage on the display surface in the course of a game or activity. Users may also interact with the entertainment device with at least one input controller device that comprises buttons, directional pad devices, etc. A control unit is responsive to signals from the at least one input controller and/or position detections from the touch/proximity sensing device to, among other operations, alter the visual image displayed on the display surface and/or generate accompanying audio.

FIG. 1 shows a perspective view of an entertainment device 1000 in accordance with an embodiment of the present invention. The entertainment device 1000 of the present invention may include a housing or base unit 1100. The housing 1100 may include first, second, third, and fourth sides 1102, 1104, 1106, and 1108, respectively. The housing 1100 of the entertainment device 1000 of the present invention may also include a touch-sensitive display surface 1110 received thereon. The housing 1100 may include a control unit (e.g., a microprocessor—not shown) housed therein.

The housing or base unit 1100 of the entertainment device 1000 of the present invention may also include one or more input controllers, two of which are shown at reference numerals 1120(1) and 1120(2). In the embodiment of the entertainment device 1000 of the present invention illustrated in FIG. 1, the input controllers 1120(1), 1120(2) may be removably or non-removably received in one or more recesses or bays 1150 included in the housing 1100 of the entertainment device 1000. Although only two input controllers 1120(1), 1120(2) are shown in FIG. 1, any number of input controllers may be utilized without departing from the scope of the present invention (see input controller 1120(3) in FIG. 2). For example, in the design shown in FIG. 1, there are four user or player positions around the display surface 1110, one on each side 1102, 1104, 1106 and 1108 of the housing 1100, and there may be an input controller at each of these user or player positions. The at least one input controller 1120(1), 1120(2), 1120(3) may be operably coupled to the control unit via any known method, including a wired coupling at the corresponding bay 1150 or a wireless coupling to electronics in the base unit 1100.

The input controllers 1120(1), 1120(2), 1120(3) may comprise a conventional video game controller. Additionally, the input controllers 1120(1), 1120(2), 1120(3) may comprise any type of user-manipulable electronic input device (e.g., a directional pad, steering wheel, joystick, touchpad, dancepad, or motion-sensitive implement (an implement such as a bat, racquet, paddle, etc. housing a motion sensor)) without departing from the scope of the present invention. For example, the input controllers 1120(1), 1120(2), 1120(3) may include a directional pad 1120a(1), 1120a(2) and several buttons 1120b(1), 1120b(2), respectively. One or more of the bays 1150 may be “universal” in that it may also accept and operably connect to other accessory devices such as microphones, electric musical instruments (e.g., a keyboard), and other user-manipulable electronic input devices such as dancepads, joysticks, steering wheels, etc.

The entertainment device 1000 in accordance with an embodiment of the present invention may also include a display portion 1130. The display portion 1130 of this embodiment includes a display housing 1180 with a display generating device 1175 housed therein. As shown in FIG. 1, the display generating device 1175 is supported above the touch-sensitive display surface 1110. Thus, the display generating device 1175 is positioned in a first, projection position. In this first, projection position, the display generating device 1175 is located spaced-apart from the touch-sensitive display surface 1110. In one embodiment, the display generating device 1175, in its first, projection position, may be positioned approximately 18 inches above the touch-sensitive display surface 1110.

As shown in FIG. 1, the display generating device 1175 of the entertainment device 1000 of the present invention is supported above the touch-sensitive display surface 1110 of the housing 1100. The display generating device 1175 is supported above the touch-sensitive display surface 1110 by support portions or arm members 1132, 1134, 1136, and 1138. The support portions 1132, 1134, 1136, and 1138 are configured to move from a projection position (illustrated in FIG. 1) to a storage/folded position (illustrated in FIG. 2). The support portions (arms) 1132, 1134, 1136, and 1138 may comprise a plurality of generally rigid, arcuate tubes capable of supporting the display housing 1180 above the touch-sensitive display surface 1110. The arm pair formed by arms 1134, 1136 and the arm pair formed by arms 1132, 1138 attach at their distal ends to display mount or clamp members 1160 and 1162, respectively. The clamp members 1160 and 1162 hold the display housing 1180 in position over the touch-sensitive display surface 1110. As will be explained hereinafter in conjunction with FIGS. 2A, 2B and 2C, the display housing 1180 may be removed from the clamp members 1160 and 1162 when the arms 1132, 1134, 1136 and 1138 are rotated to their storage positions. The display housing 1180 has connectors (not shown) for data, control, and power that mate with complementary connectors (not shown) on one or both of the clamp members 1160, 1162, which are in turn electrically connected to the system controller electronics in the base unit 1100. The support portions 1132, 1134, 1136, and 1138 and the display generating device 1175 may also be rotated into a different position convenient for storage (see FIG. 2). In addition, the support arms 1132, 1134, 1136, and 1138 may be hollow metal tubes that act as heat dissipating “fins” to help conduct heat away from the display generating device 1175, e.g., a projector head.

There is also a port 1190 on the housing 1100 at one of the sides 1102, 1104, 1106 and 1108 that receives a program cartridge 1192 that contains computer or microprocessor instructions for one or more games or activities for one or more players of the device 1000 (in the embodiment illustrated in FIG. 1, the port 1190 and the program cartridge 1192 are shown on side 1104 of housing 1100). The program cartridge 1192 may contain a read only memory (ROM) storage device that contains the one or more computer or microprocessor programs to provide game/entertainment content to the control unit and the display generating device 1175.

The housing 1100 of the entertainment device 1000 of the present invention includes indentations or grooves 1140 configured to permit the support portions 1132, 1134, 1136, and 1138 to be moved/adjusted from the projection position to the storage/folded position. Furthermore, the housing 1100 of the entertainment device 1000 of the present invention may incorporate an audio output generating device (e.g., a speaker or speakers (for stereo sound)). Finally, the housing 1100 of the entertainment device 1000 of the present invention may incorporate a removable media storage/playback unit (e.g., a CD, DVD, ROM cartridge—not illustrated) operably coupled to the control unit. The removable media storage/playback unit may be configured to provide additional game/entertainment content to the control unit and the display generating device 1175.

Turning to FIGS. 2A, 2B and 2C, the entertainment device 1000 may be collapsed into a second, storage position 1200 for storage or portability when the entertainment device 1000 is not in use. FIG. 2A illustrates the entertainment device 1000 in an initial stage of transition to the second, storage position. The display housing 1180 has a cylindrical portion 1182 that is recessed or has a smaller diameter as compared to the remainder of the housing 1180. There are curved (scalloped) surfaces 1184 at the transition between the cylindrical portion 1182 and the remainder of the housing 1180 that are complementary to, and mate with, respective curved surfaces on the clamp members 1160 and 1162. In the initial stage of collapsing the entertainment device 1000 to the second, storage position, the arms 1132, 1134, 1136 and 1138 are rotated outward. The display housing 1180 can be disconnected from the data, control, and power connectors in one or both of the clamp members 1160, 1162 and physically removed from the clamp members 1160 and 1162 as shown in FIG. 2A. In addition, the input controllers 1120(1), 1120(2) . . . may be disconnected from the connections at their respective bays 1150 when putting the entertainment device 1000 into its storage position.

FIG. 2A also shows a storage cover 2000 that is used for covering the entertainment device 1000 in its storage position. The storage cover 2000 comprises a generally planar body portion 2010 having a raised section 2100 in which a storage recess 2110 is formed for storing the display housing 1180. The storage cover 2000 further includes lateral portions 2120 and 2122 on the sides of the body portion 2010. Each lateral portion 2120 and 2122 has two additional structural storage features. First, lateral portions 2120 and 2122 have curved bottom surfaces 2130 and 2132, respectively, which mate with and snap-fit to complementary surfaces on the base unit 1100. Second, the lateral portions 2120 and 2122 have curved recesses 2140 and 2142 on their top surfaces that receive and mate with and snap-fit to portions of the clamp members 1160 and 1162, respectively. The storage cover 2000 also has a handle 2200 extending from the main body portion 2010.

Turning to FIGS. 2B and 2C, the entertainment device 1000 is shown fully folded into the second, storage position 1200 with the entertainment device 1000 and the storage cover 2000 locked together as a single integrated unit. These figures show that the main body portion 2010 of the storage cover 2000 covers and protects the touch-sensitive display surface 1110 of the entertainment device 1000, as well as the controller bays 1150 at each user position. The clamp members 1160 and 1162 are shown snapped to the curved surfaces 2140 and 2142 of the storage cover 2000. A recess cover 2300 may be provided that removeably snaps over the storage recess 2110 to contain the display housing 1180 therein. Thus, when the entertainment device 1000 is in the second, storage position, the storage cover 2000 and the entertainment device 1000 become a single unit that is portable (via handle 2200) and easy to store.

The Electrical Systems

Turning to FIG. 3, an electrical system block diagram is shown for the entertainment device 1000. The electrical system of the device 1000 comprises a system controller 3000 that is connected to the various sub-systems. The system controller 3000 may be a commercially available microprocessor device, such as those sold by Sharp Electronics, for example. Each of the input controllers, shown in FIG. 3 at reference numerals 1120(1), 1120(2), 1120(3) and 1120(4), connects to the system controller 3000. With respect to the touch-sensitive surface subsystem, there is touch surface sensing circuitry 3100 connected to a touch surface controller 3200. The touch surface sensing circuitry 3100 is responsive to touch-related conditions on the display surface 1110 (FIGS. 1 and 2). One example of appropriate touch surface sensing circuitry 3100 for use with the present invention is described hereinafter in conjunction with FIGS. 7-13B. The electrical system of the device 1000 may include an external memory sub-system including a flash memory device 3300 and a larger working memory, such as a DRAM device 3310. The external memory sub-system and its memory devices are useful for storing game parameters, such as user scores, user preferences, etc., for a game. These external memory devices 3300, 3310 may also be used by the controller 3000 when executing one or more processes associated with a particular game or activity stored in the ROM cartridge 1192.

The electrical system of the device 1000 of the present invention may include a projection sub-system 3400 that includes the display generating device 1175 (FIGS. 1 and 2). The components of the display generating device 1175 may vary with the type of projection technology used. In the example shown in FIG. 3, the projection sub-system 3400 comprises a light source 3410, an image display panel or portion 3420, and a serializer-deserializer (SERDES) 3430. There is also another SERDES 3450 that interfaces data from the system controller 3000 to the projection sub-system 3400.

In order to generate audio output for the device 1000, the electrical system includes a stereo coder/decoder (CODEC) 3500 that is connected to the system controller 3000. The CODEC 3500 is responsive to commands and data received from the system controller 3000 to produce sound in the form of music, speech, or other sound that is synchronized to the data representing the displayed visual images produced by the display generating device 1175. Audio output may be produced by left and right speakers 3510 and 3520, as well as headphone ports 3530 and 3540 connected to the CODEC 3500.

The electrical system of the device 1000 may also include a program cartridge interface 3600 that communicates data stored on a ROM cartridge 1192 to the system controller 3000. In addition, the electrical system of the device 1000 may utilize a memory card interface 3700 connected to the system controller 3000. The memory card interface 3700 may support one or more of a variety of memory card formats, including Multimedia™ memory card, Smartmedia™, and Compactflash™.

The accessory block shown at reference numeral 3650 may include controllers/interfaces for devices such as an audio system, a television, a compact disk (CD) or digital video disk (DVD) player, or other accessory devices such as musical instruments (e.g., a keyboard), optical devices (e.g., cameras), and other user-manipulable electronic input devices such as dancepads, joysticks, steering wheels, etc.

As illustrated in FIG. 3, the electrical system of the device 1000 provides for communication between control buttons 1170 (e.g., power, volume, brightness, mode, reset, etc.) and the system controller 3000. Additionally, a power source (AC or DC, not shown) is utilized to power all of the components of the device 1000.

The system controller 3000 coordinates displayed image data with the positions of a player's hand or finger on the touch-sensitive surface as gathered by the touch surface controller 3200. In so doing, the system controller 3000, based on instructions contained in a particular ROM cartridge 1192, will generate image display output and/or audio output, and change its interaction to another player using the device 1000.

Display Generating Device

FIG. 4 illustrates a cross-sectional view of an embodiment of a display generating device 1175 for use with an entertainment device 1000 in accordance with the present invention. As referenced above, the display generating device 1175 may be an image projection system that projects a visual image on the touch-sensitive display surface 1110. FIG. 4 illustrates an example of a projection type display generating device 1175. If such a projection system is utilized, the display generating device 1175 may include a first lens 310, an image display portion 3420, an illumination source 3410, and a second lens 340. One example of a first lens 310 operable within the display generating device 1175 of the present invention is a projection-type lens. The image display portion 3420 operable within the display generating device 1175 of the present invention may comprise a transmissive LCD panel and the second lens 340 may comprise a condensing lens. Finally, the illumination source 3410 may comprise any known light source such as an incandescent source, a KPR bulb, a halogen source (e.g., Xenon), or a light-emitting diode (LED) such as an ultra-bright white LED. Moreover, there may be multiple illumination sources, such as LEDs of multiple colors. In the projection-type display generating device 1175 shown in FIG. 4, light generated by the illumination source 3410 passes through the second lens 340 (condensing lens) and into the rear surface of the image display portion 3420, such as a transmissive TFT LCD panel. The image on the image display portion 3420 is then transmitted to and through the first lens 310 (projection-type lens), where the image is projected from the first lens 310 and onto the touch-sensitive display surface 1110. The display generating device 1175 may also comprise a focus adjustment mechanism (not shown) to more clearly focus the image onto the touch-sensitive display surface 1110.

FIG. 5 illustrates a schematic representation of an alternative embodiment of a display generating device 1175 for use with an entertainment device 1000 in accordance with the present invention. As shown in FIG. 5, a super-bright LED light source 3410 is condensed via condensing lens pair 340A and 340B onto the rear surface of image panel 3420, which in this example is a transmissive liquid crystal on silicon (LCOS) panel. The light and image then pass through a projection lens 310 onto the touch-sensitive display surface 1110. The touch-sensitive display surface 1110 utilized in the present invention may comprise a white, metallized surface.

FIG. 6 illustrates a schematic representation of an additional alternative embodiment of a display generating device 1175 for use with an entertainment device 1000 in accordance with the present invention. In the embodiment illustrated in FIG. 6, the super-bright LED light source 3410 is condensed onto the front surface of a reflective LCOS panel 3425 via a mirror 360. The lighted image reflected off of the LCOS panel 3425 passes through a projection lens 310 onto the touch-sensitive display surface 1110, which again may be a white metallized surface.

Although an LCD projector-type system is described above for the display generating device 1175, any type of display generating system may be used without departing from the spirit and scope of the present invention. For example, a rear projection system could be utilized. Furthermore, display generating systems such as an LCD panel, plasma display panel, or a digital light processing (DLP) device could be utilized to perform the function of the display generating device 1175. Still other image generating technologies that are useful in connection with the device 1000 are a high temperature polysilicon panel (HTPS) and a MEMS reflective display device.

Regardless of the type of display generating device 1175 used, the system controller 3000 (FIG. 3) calibrates it to ensure that the image projected onto the touch-sensitive display surface 1110 is sized and oriented to match the corresponding touch-sensitive surface underlying the display surface 1110 to achieve a desirable interaction with a game or activity executed by the system controller 3000. This may involve mechanical adjustment of the projection system (focus, etc.) and/or the use of extra “border pixels” in the projected image to move the image properly onto the display surface 1110 in an alignment with the underlying touch-sensitive surface.
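
The "border pixel" adjustment described above can be sketched in code (the function name, frame dimensions, and offsets below are assumed for illustration only; the specification does not define a software interface). The idea is to pad the rendered frame with blank pixels so that the active image lands in registration with the underlying touch-sensitive surface:

```python
def pad_frame(image_w, image_h, dx, dy, frame_w, frame_h):
    """Compute blank border widths (left, top, right, bottom), in
    pixels, that shift an image_w x image_h image by (dx, dy) within
    a frame_w x frame_h projected frame, aligning the image with the
    touch-sensitive surface beneath the display surface."""
    left, top = dx, dy
    right = frame_w - image_w - dx
    bottom = frame_h - image_h - dy
    if min(left, top, right, bottom) < 0:
        raise ValueError("image does not fit in the frame at this offset")
    return left, top, right, bottom
```

In practice, the offsets (dx, dy) would be determined during the calibration step performed by the system controller 3000, in combination with any mechanical adjustment of the projection system.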

In addition, a projection type display generating device may be rotated from its normal projection position so as to project images onto a wall or other surface, rather than onto the display surface 1110. This feature may be useful in the event a user wishes to view images or watch a video presentation on a DVD, CD, etc.

In accordance with one embodiment of the present invention, the touch-sensitive display surface 1110 may include a vellum projection screen. More specifically, the vellum projection screen may be a vacuum-metallized vellum screen with a mirrored back portion for improved reflectivity. As referenced above, the touch-sensitive display surface 1110 need not include a projection-type screen (a projector and a separate screen), and may comprise additional appropriate integrated touch-sensitive display surfaces such as an LCD or plasma touch panel display, as described hereinafter in connection with FIG. 14.

Touch-Sensitive Surface Sub-System

Turning to FIGS. 7-13B, an example of a touch-sensitive screen technology useful in connection with the entertainment device 1000 according to the invention will be described. This touch-sensitive screen technology is described in co-pending U.S. Patent Publication No. 2004/0043371 A1, published Mar. 4, 2004, corresponding to U.S. patent application Ser. No. 10/448,582, filed May 30, 2003, entitled “Interactive Multi-Sensory Reading System Electronic Teaching/Learning Device,” the entirety of which is incorporated herein by reference. This touch screen technology also is used in the publicly available Fisher-Price PowerTouch™ Learning System.

As shown in FIG. 7, there is a sensor array 142 located directly beneath a plastic spacer 515 forming recess surface 130. The plastic spacer 515 or the recess surface 130 may serve as the display surface 1110 for the image generated by the display generating device 1175 if a projection system is used. Spaced beneath sensor array or matrix 142 is an electrically conductive metal plate 510.

Turning to FIG. 8, the sensor array or matrix 142 may include two sets of generally parallel, individual, separate and separated conductive lines arranged as a plurality of spaced apart, column or vertical conductive lines (also referred to as vertical grid lines) 248 and a plurality of spaced apart, row or horizontal conductive lines or traces (also referred to as horizontal grid lines) 246 transverse and preferably perpendicular to the plurality of column conductive lines 248. The sets of lines 246, 248 are referred to as “rows” and “columns” for convenience; the “rows” run east-west/left-right while the “columns” are perpendicular (or otherwise transverse) to such “rows,” running north-south/up-down, but the nomenclature could be reversed. The set of column conductive lines 248 and the set of row conductive lines 246 are separated by an electrically-insulative spacer, for example a Mylar plastic sheet. The row and column conductive lines 246, 248 are printed in conductive inks on opposite sides of the Mylar sheet to provide electrical isolation between the sets and form the sensor matrix 142. The sensor matrix 142 includes sixteen rows 246 and sixteen columns 248 of the conductive lines or traces; however, different numbers of either or both could be utilized. Each point where a row 246 and a column 248 cross creates a single individual “cross-point” sensor. The sixteen by sixteen line array therefore creates two hundred and fifty-six individual cross-point sensors.
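
The cross-point arithmetic described above can be sketched as follows (a simple row-major numbering, purely illustrative; the specification does not dictate how the controller indexes the sensors):

```python
ROWS, COLS = 16, 16  # sixteen row traces 246 and sixteen column traces 248

def cross_point_index(row, col):
    """Row-major index of a cross-point sensor on the 16 x 16 grid
    (0-indexed), yielding 16 * 16 = 256 distinct sensors."""
    if not (0 <= row < ROWS and 0 <= col < COLS):
        raise ValueError("coordinates off the sensor matrix")
    return row * COLS + col
```

Under this numbering, the sensor at the last row and column receives index 255, consistent with the two hundred and fifty-six cross-point sensors noted above.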

FIG. 9 depicts schematically part of the display surface 1110 that overlies the sensor array 142 of the device 1000, with the word “BALL” projected and displayed on the display surface 1110. Also shown in phantom is the outline of a user's hand, primarily the user's thumb, being placed on the touch-sensitive display surface 1110. The operation of the entertainment device 1000 allows a user to select any active area on the display surface 1110 by touching or simply placing a finger, thumb, etc., sufficiently close to the selected area. Upon selection of an active area in this manner, the system controller 3000 of the entertainment device 1000 may generate and output a certain audible message or visual display responsive to the selection. By way of example, when the user's finger touches the word “BALL” on the touch-sensitive display surface 1110, the system controller 3000 of the entertainment device 1000 may produce a spoken audio output “BALL” and the displayed graphical representation of a ball 9000 may change color. The audible message and video output are generated in direct response to the user touching the displayed word “BALL” on the touch-sensitive display surface 1110. Different audible messages and video output would be generated if the user touched other areas of the touch-sensitive display surface 1110. Touching the ball graphic on the display surface could produce the sound of a bouncing ball (and/or the image of a bouncing ball). Touching any area of the touch-sensitive display surface 1110 (overlying the sensor array 142) that does not have text or graphics displayed on it (a “non-assigned area”) could generate a generic sound of a single bell ringing to signify that there is no audio/video associated with that area. Additionally, touching a non-assigned area could produce a generic spoken audio output or visual display, such as “try again,” or the input selection could simply be ignored. It can be seen from FIG. 9 that each word and/or image displayed on the touch-sensitive display surface 1110 may be mapped to one or more x and y coordinate pairs of the sensor array 142. For instance, the word “BALL” is located at Row 5, Column 4 and Row 5, Column 5 of the sensor array 142. This map location is stored in memory (e.g., the memory devices 3300, 3310 of FIG. 3) along with the associated audible message that is played when either cross-point sensor location is selected.
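
The mapping described above can be sketched as a lookup table from cross-point coordinates to responses (the file names and action labels below are hypothetical, chosen only to mirror the “BALL” example; the specification describes the behavior, not a data format):

```python
# Illustrative active-area map: (row, col) -> response.
# The word "BALL" spans two cross-points, per the example above.
active_areas = {
    (5, 4): {"audio": "ball.wav", "action": "recolor_ball"},
    (5, 5): {"audio": "ball.wav", "action": "recolor_ball"},
}

def respond_to_touch(row, col):
    """Return the stored response for a touch at (row, col), or a
    generic fallback (single bell) for a non-assigned area."""
    return active_areas.get((row, col),
                            {"audio": "bell.wav", "action": None})
```

Either cross-point under “BALL” produces the same spoken output, while a touch anywhere else falls through to the generic bell response (or could simply be ignored).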

FIGS. 10 and 11 show examples of cross-sections of the sensor array 142 without and with an overlying display surface 1110, respectively. These figures show a plastic spacer 515, a plurality of the spaced-apart column (vertical) traces 248, the non-conductive (e.g., Mylar) sheet 525 and one of the spaced-apart row (horizontal) traces 246 transverse to the plurality of column traces 248. The non-conductive sheet 525 supports and separates the column traces 248 from the row traces 246. The conductive plane 510, in the form of a metal plate, is connected to system ground and is parallel to and spaced away from the sensor array 142.

The plastic spacer 515, which forms the recess surface 130, may be approximately 0.080″ thick and is placed on top of the array 142 to act as an insulator, so that the touch surface is separated from the matrix 142 by at least this amount. The spacer 515 may be styrene or ABS with a dielectric constant between about 2 and 3, although the thickness and dielectric constant can be adjusted to achieve the desired sensitivity. The function of the spacer 515 is to provide a stable response from the matrix 142 when touched by a finger/appendage 505. The width and thickness of the column traces 248 (vertical columns) and row traces 246 (horizontal rows) should be kept to a minimum at the cross-points to reduce the capacitive effect at each of the cross-points, but are preferably increased between and around the cross-points, for example, by widening the individual row and column traces into four-pointed stars, diagonal squares or the like around and between the cross-point locations.

The conductive plane 510 is spaced approximately one-quarter inch (5 mm) below the matrix 142. The conductive plane 510 provides shielding for the matrix 142 and as a result, affects the area sensed around each cross-point in the matrix 142.

Referring back to FIG. 7, the individual traces 246, 248 are extended to side and bottom edges of the sheet 525 supporting the traces. Preferably, shorter traces 530 and 535 are extended from the side and bottom edges, respectively, of the sheet 525, one shorter trace 530 or 535 on either side of each sensor trace 246 or 248, respectively (see FIG. 8). The shorter traces 530 and 535 are all connected to system ground through or with the conductive plane 510. The horizontal traces 530 extend inwardly from the vertical edge to just beyond where the row traces 246 widen out to form terminals and, with a uniform length, provide some impedance control. The vertical traces 535 extend from the bottom edge up to a point where the vertical traces 248 begin to run parallel, just below where those traces are flared and to within about one-half inch (12 mm) of the lowest cross-points. Traces 535 prevent cross coupling between the column traces 248 when the columns are being driven by an oscillator.

Generally, baseline or reference values of signals generated by the sensor matrix 142 are read and stored without human interaction with the arrays to obtain a reference value for each cross-point. The reference value of each cross-point sensor is individually determined and updated. Preferably, each is a running “average” of successive scan values (e.g., approximately sixteen) for the cross-point. Successive scans are compared to the reference values to determine the proximity of a human finger or other extremity. Data may be accumulated starting at zero when the device 1000 is powered on.

FIG. 12 is an electrical block diagram of the touch surface sensing circuitry 3100 and the touch surface controller 3200. The touch surface controller 3200 is a dedicated microprocessor controller such as the publicly available Sunplus SPLI30A microprocessor. The touch surface sensing circuitry 3100 comprises a column driver circuit 254, a row select circuit 258, and a synchronous detector, multiplexer and filter circuit 260 that processes the raw sensor signals and passes processed signals to an analog-to-digital converter 262 for digitization. Alternatively, the functions of the touch surface controller 3200 might be performed by the device system controller 3000 (FIG. 3). The touch surface sensing circuitry 3100 further comprises the cross-point matrix or sensor array 142 and a signal oscillator 252, which powers the sensor array 142 and controls the detector 260.

Operation of the touch surface sensing circuitry 3100 is as follows (and is illustrated in FIGS. 13A and 13B). Firmware associated with touch surface controller 3200 directs the column driver circuit 254 to pass an RF excitation signal, for example, a 250 kHz, 3300 millivolt square wave signal, from the signal oscillator 252 to column traces 248 of the sensor array 142. The firmware also directs the row select circuit 258 to generate appropriate control signals sent to the row sensor circuit (not shown) to connect a row trace 246 in the sensor array 142 to the synchronous detector, multiplexer and filter circuit 260. The touch surface controller 3200 further controls the transfer of data from the synchronous detector, multiplexer and filter circuit 260, which generates a DC level analog voltage signal through A/D converter 262. The row traces 246 may be scanned bottom to top while the column traces 248 are driven innermost to outermost.
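The scan order described above might be sketched as follows; the `read_mv` callable and the parameter names are assumptions for illustration only, standing in for the hardware path through the oscillator, row select circuit, detector and A/D converter.

```python
def scan_array(read_mv, row_order, col_order):
    """One full scan of the sensor array: for each driven column (e.g.,
    innermost to outermost), read every row (e.g., bottom to top).
    read_mv(row, col) is a placeholder for the hardware read and returns
    a millivolt value for that cross-point."""
    return {(row, col): read_mv(row, col)
            for col in col_order
            for row in row_order}
```

For a 16×16 arrangement (one layout consistent with the 256 cross-points mentioned later), a full scan yields 256 readings per cycle.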

After the initial values from the sensor array 142 are stored, the sensor array 142 is cyclically and continually scanned, and the results for each cross-point sensor are compared with the stored reference values, which are themselves cyclically and continuously updated. If any individual cross-point sensor value has a differential from its reference value that is greater than a predetermined or threshold amount (“threshold”), the touch surface controller 3200 will mark the point as “touched” or “selected”. A fixed threshold is established for the device 1000 by characterizing the device 1000 during manufacture. For the circuitry, materials and structure described, it has been found that with an applied 3300 millivolt, 250 kHz square wave signal, individual cross-point sensors of the sensor array 142 output signals of about 2200 millivolts±400 millivolts without user interaction. Deflection of the signal (i.e., a drop in detected signal strength) at each cross-point sensor location for user contacts ranging between that of a large adult directly touching the surface and that of a small child touching the surface ranges from about 1600 millivolts in the first case to only about 200-300 millivolts in the second case. The threshold may be set as close as possible to the smallest expected user-generated deflection. In the device 1000 being described, the threshold is set at slightly less than 200 millivolts, between about 190 and 200 millivolts, for each cross-point sensor. If the measured voltage value for the cross-point being sensed is less than the reference value in memory by an amount equal to or greater than the threshold amount, the point is considered touched and is “marked” as such by the touch surface controller 3200. If the difference is less than the threshold, the reference value is updated each 64-millisecond period (the full scan time), resulting in a settling of the reference values after about one second. 
After the sensor array 142 is scanned, cross-points that have been “marked” as touched for two scan cycles are considered valid and selected for further processing by a “best candidate” algorithm, as will be described.
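The per-sensor touch test just described reduces to a simple deflection comparison, sketched here with the example values from the text (the function name is illustrative, not from the disclosure):

```python
THRESHOLD_MV = 200  # near the smallest expected user-generated deflection

def is_touched(measured_mv, reference_mv, threshold_mv=THRESHOLD_MV):
    """A cross-point is marked 'touched' when its measured signal has
    dropped below the stored reference value by at least the threshold."""
    return (reference_mv - measured_mv) >= threshold_mv
```

With a nominal reference of about 2200 millivolts, a large adult's ~1600 millivolt deflection and a small child's ~200-300 millivolt deflection both satisfy this test, while smaller fluctuations do not.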

When the sensor array 142 is scanned, each cross-point data value is initially compared to a “High Limit” value. If the data value exceeds this High Limit value, it is ignored as a candidate for that scan and ignored for updating the reference value for that sensor. The purpose of the High Limit value is to prevent abnormally high data values from causing a cross-point sensor to appear permanently pressed.

As noted above, for each array scan, each time the data value associated with a cross-point sensor is read, it is compared against the reference value, which may be thought of and herein referred to as a “Running Average” associated with that cross-point sensor. If the data value is less than the Running Average minus the threshold, the cross-point sensor is considered “touched” for that scan. The threshold is the fixed data value mentioned above (i.e. 190 to 200 millivolts) that represents the minimum deflection which is expected to indicate that a cross-point sensor is considered touched.

If the data value does not indicate that the cross-point sensor is considered touched (that is, the data value is not less than [Running Average−Threshold]), then the data value is used to update the Running Average. Upon power-up of the device 1000, the Running Average for each point is set to zero. Each time the data value for a cross-point sensor is not greater than the High Limit, and not low enough to indicate that the cross-point sensor is touched, the data value is used to update the Running Average for that point. The formula used to compute the new Running Average is as follows:
New Running Average=Running Average+(data value−Running Average)/16.
Thus, the preferred “running average” is not truly an average but rather a convergence algorithm.
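In code, the convergence update of the preceding formula might look like the following sketch (the function name is illustrative):

```python
def update_running_average(running_average, data_value):
    """Convergence update per the formula above: the stored reference
    moves 1/16 of the way toward each new in-range sample, rather than
    maintaining a true windowed average."""
    return running_average + (data_value - running_average) / 16
```

Starting from the power-up value of zero, repeated in-range samples at the nominal level pull the Running Average up toward that level within roughly a second's worth of 64 millisecond scans.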

With the above knowledge, the function of the High Limit algorithm can now be explained. The reference value/running average algorithm can be fooled by situations where high levels of interference exist and the cross-point sensor readings climb significantly. Without the High Limit cut-off, abnormally high data values (due to a continuous noise source) could eventually result in an abnormally high Running Average for a given cross-point sensor. Then, when the scanned data values return to their nominal value range, if those values are low enough to fall below the abnormally high Running Average minus the threshold, the cross-point sensor will be considered touched. This will result in newly scanned data values never being used in the calculation of the Running Average and therefore will not allow the Running Average to return to its normal level, causing the cross-point sensor to appear permanently touched for the duration of use of the device 1000. Consequently, the only sensor data which is used or stored is data which is less than the High Limit. A High Limit value of 3100 millivolts (about fifty percent higher than the nominal voltage) may be appropriate.

The following describes a “Fast Recovery” algorithm. This algorithm compares the latest reading from a cross-point to the reference value or Running Average. If the latest reading is higher by more than the Fast Recovery threshold, the reference value will be set equal to the latest reading. This algorithm counters a situation where the user “hovers” a finger over a point for an extended period of time, which artificially forces the reference value down. A quick release and touch of the same point in this situation may otherwise cause the system not to respond, because the differential between the reference value and the latest reading is not more than the touch threshold value (threshold).
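A minimal sketch of the Fast Recovery comparison follows. The 400 millivolt Fast Recovery threshold used here is an assumed placeholder, since the text does not specify a value for it:

```python
def fast_recovery(reference_mv, latest_mv, fast_recovery_threshold_mv=400):
    """If the latest reading exceeds the reference by more than the Fast
    Recovery threshold, snap the reference up to the latest reading so a
    hover-depressed reference recovers immediately and a quick release-and-
    touch of the same point still registers."""
    if latest_mv - reference_mv > fast_recovery_threshold_mv:
        return latest_mv
    return reference_mv
```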

The previous section described in detail how the cross-point sensors of the 256 cross-point sensor array 142 are determined to be activated (i.e., “touched” or “selected”) or not. During each scan, every cross-point sensor is considered to be activated/touched or not.

After each scan, the touched points are processed to identify a “best candidate”. Generally speaking, the best candidate is the cross-point sensor selected by the touch surface controller 3200 as being the point most likely to have been selected by the user in touching the sensor. Specifically, it is the touched point which is highest (most northern/top) or, if two potential candidates of equal height are activated on the sensor array 142, the highest and most left (i.e., most northwestern/top left). For convenience, these will be referred to collectively as simply “the most northwestern” point. Also, a cross-point sensor preferably must be “touched” for two consecutive 64 millisecond scans to be considered as the new most northwestern point of the sensor array.

The touch surface controller 3200 first identifies a set of touched sensors. It next identifies those which have been touched for at least two consecutive 64 millisecond cycles. These are the new most northwestern candidate sensors. Once the best candidate has been chosen, its identification/location is communicated from the touch surface controller 3200 to the system controller 3000.
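Assuming rows are numbered top to bottom and columns left to right (an assumed orientation, consistent with the Row/Column examples above), the most-northwestern selection reduces to a lexicographic minimum over the touched (row, column) pairs, as this illustrative sketch shows; the two-consecutive-scan debouncing is assumed to have been applied upstream.

```python
def most_northwestern(touched_points):
    """Pick the touched (row, col) that is highest on the array, breaking
    ties by the leftmost column.  Tuple comparison does exactly this:
    smallest row first, then smallest column."""
    if not touched_points:
        return None
    return min(touched_points)
```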

Once a new most northwestern point (cross-point sensor) has been chosen, a “Southern Lockout” algorithm takes effect for the sensor array 142. The Southern Lockout algorithm causes any point of the same array touched in subsequent scans below the new most northwestern point to be ignored until the earlier of (a) expiration of one second while the new most northwestern point remains selected, or (b) release of the new most northwestern point. After the lockout, all cross-points of the array become candidates for new most northwestern point. This algorithm covers the situation where the user rests the heel of the pointing hand on the array after touching the array with a finger (as a young child may be prone to do).
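The Southern Lockout behavior might be sketched as follows. The class structure and names are illustrative, and the 16-scan lockout is an assumption derived from the 64 millisecond scan period noted earlier (16 × 64 ms ≈ one second):

```python
class SouthernLockout:
    """After a most-northwestern point is chosen, ignore touches on rows
    south of it until the point is released or the lockout expires."""

    def __init__(self, lockout_scans=16):   # ~1 s at 64 ms per scan
        self.locked_row = None
        self.scans_remaining = 0
        self.lockout_scans = lockout_scans

    def select(self, point):
        """Arm the lockout at the row of the newly chosen point."""
        self.locked_row = point[0]
        self.scans_remaining = self.lockout_scans

    def release(self):
        """The locked point was released; all cross-points are candidates again."""
        self.locked_row = None
        self.scans_remaining = 0

    def filter(self, touched_points):
        """Drop touched points south of the locked row while the lockout holds."""
        if self.locked_row is None or self.scans_remaining <= 0:
            return touched_points
        self.scans_remaining -= 1
        return [p for p in touched_points if p[0] <= self.locked_row]
```

This captures the heel-of-hand scenario: the later, more southern contact is filtered out rather than displacing the finger touch.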

A “Peak Search” algorithm may be employed after a new most northwestern point of the sensor array 142 is identified. The cross-point sensors immediately East (right), South (below) and Southeast (below right) of the new most northwestern point are examined for touch, and the relative deflections of any touched sensors among the four are compared to one another. The one sensor of those up to four sensors having the greatest deflection (i.e., change from the reference value/Running Average) is selected as the “Best Candidate” and its identity/location/position is passed to the main (base unit) system controller 3000.
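Under the same top-to-bottom row and left-to-right column numbering assumed above, the Peak Search might be sketched as follows; the deflection map passed in is a hypothetical input containing entries only for touched sensors.

```python
def peak_search(nw_point, deflections):
    """Among the new most-northwestern point and its East, South and
    Southeast neighbours, pick the touched sensor with the greatest
    deflection as the Best Candidate.  `deflections` maps (row, col) to
    deflection in millivolts for touched sensors only."""
    r, c = nw_point
    neighbourhood = [(r, c), (r, c + 1), (r + 1, c), (r + 1, c + 1)]
    candidates = [p for p in neighbourhood if p in deflections]
    return max(candidates, key=lambda p: deflections[p])
```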

Each time a new best candidate is selected, its position is transferred by the touch surface sensor controller 3200 to the system controller 3000. The system controller 3000 would then decide how to use this information (interrupt current activity or not, use a neighbor cross-point sensor instead of the best candidate, etc.).

The device 1000 will determine if there are multiple hands placed on the touch surface. In the event that the system controller 3000 sees two hands placed on the sensor, it will look to see if either input is a clearly defined most northern point. If so, it will select this input as the best candidate. Instead of having to generate an audio output directing the user to use “one finger at a time”, or any other appropriate statement, when the device 1000 cannot determine the likely input with reasonable accuracy, this technique can select a “best candidate” based on the above-described algorithm.

Other types of touch-sensitive surface or position detection technologies may be utilized without departing from the scope of the present invention. For example, analog resistive or capacitive touch panels may be used, digital camera CCD technology, so-called gesture recognition technology, heat sensitive, color sensitive, pattern sensing, object sensing or any other contextual sensing technology based on electro-physical material properties, photo-reflective properties or photo-absorption properties.

Still another alternative is to use a LCD monitor with an integrated touch panel. This alternative embodiment is shown in FIG. 14, where the integrated LCD monitor/touch panel is shown at reference numeral 4000. This integrated LCD monitor/touch panel would replace both the display generating device 1175 and the display surface 1110 (illustrated in FIG. 1). The monitor/touch panel 4000 has its own touch surface sensing circuitry 3100 and touch surface controller 3200 that are in turn connected to the system controller 3000. The system controller 3000 responds to touch position information supplied to it by the touch surface controller 3200 and also generates display image data that is supplied (through the appropriate intervening display driver circuitry) to the monitor/touch panel 4000. Numerous models of integrated display monitor/touch panels are known in the art and may be used in accordance with the present invention.

Examples of other types of touch or proximity sensing technologies that may be used with the entertainment device 1000 of the present invention include pressure-sensitive switch matrices such as a Mylar® switch matrix, proximity sensing antenna arrays and proximity sensing capacitive arrays. Some examples of additional appropriate touch-sensitive display surfaces are LCD or plasma touch panel displays.

Exemplary Games and Game System Operation

With general reference to FIGS. 1 and 3, in the operation of an entertainment device 1000 in accordance with the present invention, an image generated by the display generating device 1175 is displayed on the touch-sensitive display surface 1110. The touch-sensitive display surface 1110 and the at least one input controller 1120(1), 1120(2) are both operably coupled to the system controller 3000. Furthermore, the system controller 3000 is responsive to signals from the touch-sensitive display surface 1110 and the at least one input controller 1120(1), 1120(2) to alter the visual image displayed on the touch-sensitive display surface 1110 in response to a user's interaction with the touch-sensitive display surface 1110 and/or the at least one input controller 1120(1), 1120(2). Thus, a user of the entertainment device 1000 in accordance with the present invention can use either the touch-sensitive display surface 1110, the at least one input controller 1120(1), 1120(2), or both the touch-sensitive display surface 1110 and the at least one input controller 1120(1), 1120(2) to provide input to the system controller 3000 to alter the image (move portions of, change, re-orient, etc., as opposed to merely a brightness control or an on/off control) displayed on the touch-sensitive display surface 1110.

Turning to FIGS. 15 and 16, another feature of the entertainment device 1000 will be described. As described above, the device 1000 is intended for use by multiple players, and in the examples described herein there are four play positions at each of four sides 1102, 1104, 1106 and 1108 of the generally rectangular base housing 1100. Consequently, in the course of a game or activity, it is necessary to transition from one player to another player, but the players are positioned at different orientations with respect to the touch-sensitive display surface 1110. Therefore, the system controller 3000 needs to recognize this and re-orient the image displayed by the display generating device 1175 (projected or displayed on a display panel) and also re-orient how it responds to touch commands on the display surface 1110 with respect to the re-oriented images. When a transition in an activity is made from one user position to another user position, the system controller 3000 controls the display generating device 1175 to rotate the displayed visual images so that they are properly aligned with the new currently active user position. In addition, the system controller 3000 adjusts how it interprets touch position detections made by the touch/proximity sensing device during such a transition from one user position to another user position.

A procedure useful to adjust the orientations during a game or activity is shown at reference numeral 5000 in FIG. 16. Step 5010 shows normal execution of a game or activity with the current player. For example, as shown in FIG. 15, the game or activity involves interaction with Player 1. As an example, a message is displayed for Player 1 involving a selection after which the game or activity moves to Player 2. A displayed message to Player 1 in this example is:

“Select the Song You Wish to Hear During Your Next Turn

    • A
    • B
    • C”
      When Player 1 makes the selection (from A, B and C), by either pressing a button on the input controller 1120(1) or touching the touch-sensitive display surface 1110 proximate the desired selection, the system controller 3000 detects that Player 1's turn is over. This corresponds to step 5020, in which the system controller 3000 detects a player transition event. Next, in step 5030, the system controller 3000 determines the next player according to rules of the game or activity. For example, Player 2 may be the next player. Then, in step 5040, the system controller 3000 re-orients the image data to be displayed so that it is aligned properly for the new player, e.g., by rotating it 90 degrees to the right on the touch-sensitive display surface 1110 so that it appears oriented and intended for Player 2. The system controller 3000 also, in step 5050, re-orients how it responds to touch surface commands according to the re-oriented displayed image. For example, to respond to touch or proximity detected commands from Player 2, the system controller 3000 adjusts (rotates) by 90 degrees to the right those positions that are “hot” and cause a certain action in response to the touch position signals it receives from the touch surface controller 3200, representing touch positions of Player 2. A partial view of the conductive wires associated with the (M×N) sensor array 142 is shown in FIG. 15 to depict how the system controller 3000 is programmed to re-orient how it responds to touches from a user: the sensor array 142 is fixed, but the relative positions on the sensor array where a user may touch to cause a particular action (visual and/or audible) with respect to a displayed visual image are different for each user position 1102, 1104, 1106, and 1108.
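The coordinate adjustment applied to the “hot” positions in step 5050 can be sketched as follows. The function and its parameters are illustrative, and the array dimensions are passed in because a 16×16 arrangement is only one layout consistent with the 256 cross-points described earlier:

```python
def rotate_hot_point(row, col, n_rows, n_cols, quarter_turns=1):
    """Rotate a 'hot' (row, col) cross-point 90 degrees clockwise per
    quarter turn, mirroring the rotation applied to the displayed image
    when play passes to the next player position on a four-sided device."""
    for _ in range(quarter_turns % 4):
        row, col = col, n_rows - 1 - row     # standard clockwise mapping
        n_rows, n_cols = n_cols, n_rows      # the grid's dimensions swap
    return row, col
```

Four quarter turns return a point to its original position, matching a full circuit of the four player positions.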

This re-orientation process is repeated when transitioning from Player 2 to any other player. It should be understood that if only 2 or 3 players are active in a particular game or activity, the system controller would know to re-orient the image display data and touch position responsiveness accordingly. Moreover, this re-orientation process can be applied to an entertainment device that has fewer than 4 or more than 4 player positions, such that the re-orientation is not a simple 90 degree adjustment. This re-orientation process 5000 applies for a display generating device that is a display panel or monitor 4000 as well as an image projection system 1175. In the case of an image projection system 1175 (such as in FIG. 1), it is possible that the image can be re-oriented by adjusting one or more optical devices in the projected image path. However, it may be more desirable to re-orient the raw image data prior to its projection.

The following are generic examples of games or activities that may be played on the entertainment device 1000 of the present invention. The instructions, scripts and programs for these games or activities may be embodied in a removable memory cartridge device 1192, as described above. The games or activities are software programs containing digital data for animated characters accompanied by voice, music, and other graphical elements. The games/activities may involve sequential, interactive, narrative stories containing puzzles, activities or games interwoven as challenges to provide a progressive, rewarding experience for the players.

Digitally Animated Adventure Game

One type of game is an animated adventure game where one or more animated characters are displayed and the character(s) negotiate a variety of activities, such as an underwater amusement park. Each player may select a particular character and negotiate a simulated displayed game board, for example, collecting certain items in order to win the game. A player's character may progress on the displayed game board using an electronic or virtual roll of the dice, for example. In addition, when a player lands on a particular spot on the game board, the player may be prompted, through visual and audio stimulus, to engage in a particular activity in order to earn a particular item or “ticket” award that counts towards winning the game. A player may accumulate tickets in order to redeem them for certain animated or displayed items. A player may engage in these so-called mini-games or activities (including educational or learning activities) using the input controllers or the touch-sensitive display screen. These mini-games may be distributed randomly throughout the game board each time a new game is started.

Portions of the visual display proximate each player's position at the entertainment device may be dedicated to tracking each player's digital scorecard concerning their progress in the game. The scorecard may show a player's character and which items the player has collected.

There may be virtual animated “vendors” that appear randomly at different spots on the game board to “sell” certain items to players who have collected a sufficient number of tickets. The items that can be purchased may be used by a player during play of the game (e.g., a rolling bonus, a time bonus, etc.), while others may be used against opponents (lose a turn, etc.). The game may also include sudden appearance of certain animated characters that give bonus tickets to certain players, for example, or play special side games or activities.

Digitally Animated Adventure Tales

Another type of game may involve a digital book consisting of a combination of a traditional storybook, a children's activity book and web-type flash games. A player or user becomes part of the adventure, helping the animated characters complete certain challenges and reach their goals. Each so-called “page” of the storybook includes a full-screen combination of artwork, a story line, object identification and animated “hot spots”. As the story is read to the user, or as an animated character speaks, the accompanying text will appear on-screen and highlight. Each phrase or sentence will highlight individually as those words are also heard as voiceover. On certain pages, several objects are tagged as “identifiable hotspots.” When a child touches one of these objects, that word or phrase is said aloud. Certain areas and objects on the pages are tagged for special animations, so that if a child touches that area, the name of that object is said aloud, and an animation or other reward will be revealed. In addition, certain pages of the storybook may contain mini-games, activities or challenges (including educational or learning activities) related to the storyline.

The game and activity examples described above highlight the necessity of re-orienting the displayed visual image according to which player is active in a game. For example, the animated characters may be intended for a particular player. Consequently, the device needs to keep track of players' turns in the game and re-orient certain displayed visual images toward the player whose turn it currently is. Moreover, if the game calls for detecting a touch or proximity command from a player, the device also re-orients which positions on the sensor array it needs to respond to for the currently active player.

While the invention has been described in detail and with reference to specific embodiments thereof, it will be apparent to one skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope thereof. For example, the housing 1100 of the entertainment device 1000 of the present invention may include headphone jacks for a user's convenience. Additionally, the entertainment device 1000 of the present invention may include a gesture recognition system, including a camera and/or sensors to sense, model, and react to a user's hand motions, adding an extra dimension of interactivity to the entertainment device 1000. Also, the entertainment device 1000 of the present invention may include a rechargeable power source. The entertainment device 1000 of the present invention may also include night vision goggles, a magnifying glass, or a special optical device that would allow a user to reveal secret codes, cards, letters, or other information displayed in certain wavelengths of light on the touch-sensitive display surface 1110. Furthermore, the entertainment device 1000 of the present invention may include deluxe input controllers which include all of the features of the at least one input controller 1120(1), 1120(2), 1120(3), 1120(4) and also may include an onboard display screen displaying individual user messages (e.g., things like scrabble letters, hidden game clues, etc.). Also, the entertainment device 1000 of the present invention may include a memory unit (removable or non-removable) for storing game/player related information (such as high scores, etc.). Finally, the housing 1100 of the entertainment device 1000 of the present invention may include light sources to identify which user is in control of the entertainment device 1000 (i.e., which user's turn it is to control the entertainment device 1000). 
Thus, it is intended that the present invention covers the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

Claims

1. An entertainment device comprising:

a display surface;
a display generating device operable to display a visual image on the display surface;
a touch/proximity sensing device that detects positions of a user appendage in proximity to the display surface;
a control unit operably coupled to the touch/proximity sensitive device and to the display generating device; and
at least one input controller operably coupled to the control unit;
wherein the control unit is responsive to signals from the at least one input controller and position detections from the touch/proximity sensing device to generate and supply data to the display generating device for displaying visual images on the display surface.

2. The entertainment device of claim 1, wherein the touch/proximity sensing device is positioned beneath the display surface and is responsive to user touches on or in proximity to said display surface to output a signal indicative of the position thereof, and wherein the display generating device comprises a projection system for projecting the visual image on the display surface.

3. The entertainment device of claim 2, wherein the projection system comprises a back-lit projection device having an illumination source, wherein the illumination source comprises at least one of: a light-emitting diode, a halogen bulb, and an incandescent bulb.

4. The entertainment device of claim 2, wherein the projection system comprises a transmissive display panel that generates the visual image in response to data supplied by the control unit.

5. The entertainment device of claim 2, wherein the projection system comprises a reflective display device that generates the visual image.

6. The entertainment device of claim 2, wherein the projection system is moveably mounted with respect to the display surface to assume a first position for projecting the visual image onto the display surface from above, the projection system being located apart from the display surface in the first position.

7. The entertainment device of claim 6, wherein the projection system is configured to assume a second position, the projection system being located proximate the display surface in the second position.

8. The entertainment device of claim 7, and further comprising a base member that supports the display surface, and at least one arm member that extends from the base member to said projection system to hold said projection system above said display surface.

9. The entertainment device of claim 8, wherein said at least one arm is collapsible towards said base member.

10. The entertainment device of claim 1, wherein said display surface, said display generating device and said touch/proximity sensing device are housed within a display monitor/touch panel device that is coupled to the control unit.

11. The entertainment device of claim 1, and further comprising a plurality of user positions around said display surface at which a user interacts with the entertainment device, wherein said control unit controls the display generating device to change an orientation of the visual image depending on which user position is engaged in an activity or game.

12. The entertainment device of claim 11, wherein said control unit further adjusts how it responds to signals from the touch/proximity sensing device when there is a change in the orientation of the visual image.
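Claims 11 and 12 recite reorienting the visual image for whichever user position is active and adjusting how touch signals are interpreted after the orientation changes. Purely as an illustration (the claims specify no particular algorithm, and the square display, `remap_touch` name, and coordinate convention here are assumptions), one minimal way to remap a raw sensor coordinate under 90-degree image rotations is:

```python
def remap_touch(x, y, size, rotation_deg):
    """Map a raw touch coordinate (x, y) from the fixed sensor grid into
    the coordinate frame of an image rotated clockwise by rotation_deg
    on a square display of side `size`. Hypothetical helper; the claims
    do not specify any particular mapping."""
    rotation_deg %= 360
    if rotation_deg == 0:
        return x, y
    if rotation_deg == 90:
        # Image rotated 90 degrees clockwise: image pixel (u, v) appears
        # at sensor (size-1-v, u), so the inverse is (u, v) = (y, size-1-x).
        return y, size - 1 - x
    if rotation_deg == 180:
        return size - 1 - x, size - 1 - y
    if rotation_deg == 270:
        return size - 1 - y, x
    raise ValueError("rotation must be a multiple of 90 degrees")
```

Applying the 90-degree remapping four times returns the original coordinate, which is a quick sanity check on the mapping's consistency.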

13. The entertainment device of claim 1, and further comprising a port to receive a memory device that stores one or more computer programs for a game or activity, wherein said control unit is connected to said port to read and execute said one or more computer programs.

14. The entertainment device of claim 1, further comprising an audio output generating device operably coupled to said control unit, said audio output generating device configured to generate audio output in response to signals from said control unit, wherein said audio output is synchronized to displayed visual images.

15. The entertainment device of claim 1, and further comprising a port connected to said control unit that removably receives a memory module that stores instructions executed by said control unit to perform a particular activity or game that involves displaying a particular visual image and responding to commands derived from a user touching or coming into close proximity with certain positions on said display surface.

16. The entertainment device of claim 1, and further comprising at least one controller port that is connected to said control unit and which receives said at least one input controller, wherein the at least one controller port is further capable of accepting other accessory devices.

17. An entertainment device comprising:

a base unit;
a display surface on said base unit;
a plurality of player positions on said base unit circumscribing said display surface;
a display generating device operable to display a visual image on said display surface;
a touch/proximity sensing device that detects positions of a user appendage on said display surface;
a control unit operably coupled to the touch/proximity sensing device and to the display generating device;
an input controller at each player position that is operably coupled to the control unit; and
wherein the control unit is responsive to signals from the input controller and signals representing user appendage positions detected by the touch/proximity sensing device to alter data supplied to said display generating device.

18. The entertainment device of claim 17, wherein said touch/proximity sensing device is positioned beneath said display surface and is responsive to user touches on or in proximity to said display surface to output a signal indicative of the position thereof, and wherein said display generating device comprises a projection system for projecting the visual image on the display surface.

19. The entertainment device of claim 17, wherein said display surface, said display generating device and said touch/proximity sensing device are integrated into a display monitor/touch panel device that is coupled to said control unit.

20. The entertainment device of claim 17, wherein said control unit supplies data to said display generating device to change an orientation of a displayed visual image for viewing at each of said player positions around said display surface.

21. The entertainment device of claim 20, wherein said control unit further adjusts how it responds to signals from the touch/proximity sensing device upon a change in the orientation of said visual image.

22. An entertainment device comprising:

a housing having multiple player positions;
means for generating a visual image;
means on said housing for displaying said visual image on an image display surface;
means for detecting positions of a user appendage on said image display surface;
an input controller at each player position for receiving user input; and
control means coupled to said means for generating a visual image, said means for detecting positions of a user appendage on said image display surface, and said input controller, said control means operable to alter data supplied to the means for generating a visual image in response to signals from said input controller and signals from said means for detecting positions of a user appendage on said image display surface.

23. A method for generating entertainment in a multi-user device, comprising the steps of:

(a) displaying visual images on a surface around which there are a plurality of user positions;
(b) monitoring touch or proximity positions of a user's appendage on said surface;
(c) receiving input from a user via a user input device at one of said plurality of user positions; and
(d) controlling said step (a) based upon detected touch or proximity positions of said user's appendage on said surface and input received from a user via said user input device.

24. The method of claim 23, wherein step (a) comprises projecting a visual image onto a display surface.

25. The method of claim 23, wherein step (a) comprises displaying the visual image with a display panel device.

26. The method of claim 23, wherein step (b) comprises monitoring the position of a user's hand with respect to said surface with a touch or proximity sensing device, and wherein step (a) comprises projecting a visual image onto said surface.

27. The method of claim 23, wherein step (d) comprises changing an orientation of the visual image depending on which user position is active.

28. The method of claim 27, wherein step (d) further comprises adjusting data generated by said step (b) upon a change in the orientation of the visual image.
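Steps (a) through (d) of the method of claim 23 amount to an event loop that displays images, merges touch and controller input, and feeds both back into what is displayed, including the orientation change of claim 27. The sketch below is illustrative only: the `EntertainmentLoop` class, event records, and "end_turn" button are assumptions, not anything recited in the claims.

```python
import collections

# Hypothetical event records; the claims specify only the kinds of input.
Touch = collections.namedtuple("Touch", "x y user_position")
Button = collections.namedtuple("Button", "name user_position")

class EntertainmentLoop:
    def __init__(self, positions):
        self.positions = positions     # user positions around the surface
        self.active = positions[0]     # position currently engaged in play
        self.frame = {}                # stand-in for the displayed image state

    def display(self):
        # Step (a): render the image oriented toward the active position.
        self.frame["orientation"] = self.positions.index(self.active) * 90
        return self.frame

    def handle(self, event):
        # Steps (b)-(d): react to a touch position or a controller signal
        # and alter what is displayed accordingly.
        if isinstance(event, Touch):
            self.frame["last_touch"] = (event.x, event.y)
        elif isinstance(event, Button) and event.name == "end_turn":
            nxt = (self.positions.index(self.active) + 1) % len(self.positions)
            self.active = self.positions[nxt]
        return self.display()
```

For example, an "end_turn" button press from one position would rotate the image orientation toward the next position around the surface, while a touch event simply updates the displayed state.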

29. An entertainment device comprising:

a touch or proximity-sensitive display surface;
a plurality of user positions around said display surface;
a display generating device operable to display a visual image on said display surface;
a control unit operably coupled to said display generating device, wherein said control unit executes a program to supply data to said display generating device to generate visual images for display on said display surface according to said program, and wherein said control unit supplies said data to said display generating device to change the orientation of said visual images for viewing by a user at a particular user position.

30. The entertainment device of claim 29, and further comprising a touch/proximity sensing device that detects positions of a user appendage on said display surface, wherein said control unit is responsive to position detections made by said touch/proximity sensing device in supplying data to said display generating device.

31. The entertainment device of claim 30, wherein said control unit is further responsive to signals from said touch/proximity sensing device upon a change in the orientation of visual images for a particular user position.

32. The entertainment device of claim 29, wherein said control unit supplies data to said display generating device to rotate said displayed visual images for viewing at one user position to viewing at said particular user position.

33. A method for generating entertainment in a multi-user entertainment device, comprising:

(a) displaying visual images on a touch or proximity-sensitive surface around which there are a plurality of user positions;
(b) controlling said step (a) to rotate said visual images on said touch or proximity-sensitive surface from one user position to another user position.

34. The method of claim 33, and further comprising monitoring touch or proximity positions of a user's appendage on said touch or proximity-sensitive surface, and further comprising altering a response of said entertainment device upon receipt of signals representing said monitored positions when the visual images are rotated from one user position to another user position.

35. An entertainment device comprising:

a base unit that supports a display surface;
a display generating device operable to display a visual image on said display surface;
at least first and second support members that attach to said base unit and operable to move between a first position and a second position with respect to said base unit;
first and second connectors attached to distal ends of said first and second support members that removably hold said display generating device above said display surface when said support members are in said first position;
a cover member that fits over said display surface of said base unit and having a storage recess to store said display generating device therein when removed from said first and second connectors, said cover member having a body with surface portions that mate with corresponding surfaces of said first and second connectors when said first and second support members are in said second position.

36. The entertainment device of claim 35, wherein said cover member further comprises a carrying handle.

37. The entertainment device of claim 35, wherein said cover member further comprises lateral portions configured to mate with complementary portions of said base unit.

38. The entertainment device of claim 35, further comprising a recess cover that selectively covers said storage recess in said cover member.

Patent History
Publication number: 20060183545
Type: Application
Filed: Nov 4, 2005
Publication Date: Aug 17, 2006
Inventors: Robert Jourdian (Elma, NY), Jeffrey Miller (Orchard Park, NY), George Inashvili (East Aurora, NY), Brian Mysliwy (Tonawanda, NY), Christopher Cimerman (Clarence Center, NY), John Taylor (Cowlesville, NY), Dan Klitsner (Larkspur, CA), Gary Levenberg (San Francisco, CA), Brian Clemens (San Francisco, CA)
Application Number: 11/266,593
Classifications
Current U.S. Class: 463/36.000
International Classification: A63F 9/24 (20060101);