Interactive game pieces using touch screen devices for toy play
There is provided a system and method for facilitating an interaction using first and second peripheral devices and related structures. Each of the first and second peripheral devices has a plurality of touch points for touching a touch surface of a touch-sensitive system. According to an exemplary embodiment, a method comprises detecting a plurality of contemporaneous touches on the touch surface of the touch-sensitive system. One of the first and second peripheral devices is identified based on the plurality of contemporaneous touches as compared to the plurality of touch points of one of the first and second peripheral devices. In some embodiments, the orientation of the one of the first and second peripheral devices can be determined based on the plurality of contemporaneous touches as compared to the plurality of touch points of the one of the first and second peripheral devices.
This application claims priority to U.S. Provisional Application No. 61/399,249, filed on Jul. 8, 2010, which is hereby incorporated by reference in its entirety.
BACKGROUND OF THE INVENTION

1. Field of the Invention
The present invention relates generally to the field of gaming, and more particularly, to gaming using touch-sensitive surfaces.
2. Background Art
Touch-sensitive devices, such as touch screen devices, are becoming increasingly prevalent in the marketplace. These touch-sensitive devices offer a touch-sensitive surface that can detect the presence and position of touch-based input, opening up the possibility of new ways to interact with electronic devices. The popularity of recent touch screen devices, such as the iPad from APPLE®, means that touch screen devices can be found in many family households. At the same time, physical toys remain a staple of childhood, with collectible figures and objects serving as a bedrock for imaginative toy play in the form of interaction between the figures and objects as well as with the larger environment.
In this context, kids have increasing exposure to touch screen devices, making it desirable to provide new technologies that can enhance the interactive experience between touch screen devices and their existing toys.
The conventional approach is to integrate toys with video games or other software running on a computer by establishing wired or wireless communications between a toy and the computer. However, adding a communication interface to the toys, such as Bluetooth, Wi-Fi, audio I/O interfaces, or proprietary wired connectors, adds substantial cost and complexity to the toys.
Accordingly, there is a need to overcome the drawbacks and deficiencies in the art while providing interactive toys, which can be used with touch screen devices.
SUMMARY OF THE INVENTION

There is provided a system and method for providing interactive game pieces and/or toys using touch screen devices, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
The features and advantages of the present invention will become more readily apparent to those ordinarily skilled in the art after reviewing the following detailed description and accompanying drawings.
The present application is directed to a system and method of facilitating an interaction using first and second peripheral devices and related structures. The following description contains specific information pertaining to the implementation of the present invention. One skilled in the art will recognize that the present invention may be implemented in a manner different from that specifically discussed in the present application. Moreover, some of the specific details of the invention are not discussed in order not to obscure the invention. The specific details not described in the present application are within the knowledge of a person of ordinary skill in the art. The drawings in the present application and their accompanying detailed description are directed to merely exemplary embodiments of the invention. To maintain brevity, other embodiments of the invention, which use the principles of the present invention, are not specifically described in the present application and are not specifically illustrated by the present drawings.
It should be noted that the embodiment shown in diagram 100 of FIG. 1 is merely exemplary.
Peripheral devices 102 and 104 comprise, for example, interactive game pieces and/or toys, where the touch-sensitive system of FIG. 1 comprises electronic device 106, which can run a video game or other software.
In one example, the video game can be an electronic board game where touch surface 116 can display the game board and peripheral devices 102, 104, and 150 can comprise game pieces. For example, in one embodiment, peripheral devices 102, 104, and 150 can be chess pieces used with a chess board displayed on touch surface 116, where electronic device 106 can identify each chess piece and run a game of chess. Thus, electronic device 106 can, for example, identify peripheral device 102 as a black king and peripheral device 150 as a white king and provide a scoreboard or offer a suggested move displayed on touch surface 116 based on the identity and position of various chess pieces.
In the present embodiment, processor 118, speaker 152, memory 120, and touch surface 116 of electronic device 106 can communicate using bus 124. More particularly, processor 118, speaker 152, memory 120, and touch surface 116 can use bus 124 to facilitate an interaction using peripheral device 102 and another device, such as peripheral device 104. It will be appreciated that processor 118, speaker 152, memory 120, and touch surface 116 can be connected to one another using other means, for example, a plurality of dedicated lines, or a combination of buses and dedicated lines.
Processor 118, which can comprise, for example, a central processing unit (CPU), is configured to operate in accordance with executable code 122 stored in memory 120. Memory 120 can comprise random access memory (RAM) and executable code 122 can include a software program, for example, a video game, educational software, or other software, such that processor 118 operates in accordance with instructions in the software program. By operating in accordance with executable code 122, processor 118 can facilitate an interaction using peripheral devices 102 and 104.
In electronic device 106, processor 118 can detect a plurality of contemporaneous touches on touch surface 116. Thus, touch surface 116 is capable of registering the presence and position of multiple touch-based inputs thereon. In one particular embodiment, touch surface 116 is a capacitive touch screen, which uses charge variation to sense touch-based input. Processor 118 can detect, for example, contemporaneous touches 126 shown in FIG. 1.
Processor 118 can also identify peripheral device 102 based on contemporaneous touches 126 as compared to one of the touch points of peripheral device 102 and the touch points of peripheral device 104. In the present embodiment, for example, processor 118 can identify peripheral device 102 using identifying data 130, which can characterize the touch points of peripheral device 102 and the touch points of peripheral device 104. According to one embodiment, identifying data 130 in memory 120 can comprise distances between touch points of peripheral device 102 and distances between touch points of peripheral device 104, which will be further described with respect to FIG. 2.
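The distance-based identification described above can be sketched as follows. This is a hypothetical illustration, not code from the application: the device names, distance values, and tolerance are assumptions, and the stored "identifying data" is modeled as the sorted pairwise distances between each device's touch points.

```python
import itertools
import math

# Hypothetical identifying data: for each known peripheral device, the
# sorted pairwise distances (in millimeters) between its touch points.
IDENTIFYING_DATA = {
    "peripheral_102": [30.0, 40.0, 50.0],  # three touch points
    "peripheral_104": [32.0, 40.0, 51.5],  # differs by a few millimeters
}

TOLERANCE_MM = 0.5  # assumed per-distance measurement error allowance

def pairwise_distances(touches):
    """Sorted distances between every pair of contemporaneous touches."""
    return sorted(
        math.dist(a, b) for a, b in itertools.combinations(touches, 2)
    )

def identify(touches):
    """Return the known device whose distance signature matches, else None."""
    observed = pairwise_distances(touches)
    for device, signature in IDENTIFYING_DATA.items():
        if len(signature) == len(observed) and all(
            abs(o - s) <= TOLERANCE_MM for o, s in zip(observed, signature)
        ):
            return device
    return None

# A 3-4-5 right triangle of touches matches peripheral_102's signature.
print(identify([(0.0, 0.0), (30.0, 0.0), (30.0, 40.0)]))  # peripheral_102
```

Sorting the distances makes the signature invariant to the order in which the touch surface reports the contacts, so the match works regardless of the device's position or rotation.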
Referring to FIG. 2, identifying data 130 can characterize the arrangement of touch points 210b and 210c of peripheral device 202 and touch points 114b and 114c of peripheral device 104.
As shown in FIG. 2, identifying data 130 in memory 120 can comprise distance C between touch points 210b and 210c, which characterizes the arrangement of touch points 210b and 210c of peripheral device 202.
Similarly, identifying data 130 in memory 120 can comprise a distance between touch points 114b and 114c, which characterizes the arrangement of touch points 114b and 114c of peripheral device 104. The distance between touch points 114b and 114c can be different than distance C, such that processor 118 can distinguish between touch points 114b and 114c of peripheral device 104 and touch points 210b and 210c of peripheral device 202, and thus between peripheral devices 202 and 104. In one specific embodiment, the distance can vary from distance C by 1-2 millimeters. Thus, peripheral devices 202 and 104 can have different identities, which can be determined by processor 118.
While FIG. 2 shows peripheral device 202 having two touch points, peripheral device 202 can include additional touch points.
Referring again to FIG. 2, peripheral device 202 can comprise touch points 210a, 210b, and 210c.
As shown in FIG. 2, identifying data 130 in memory 120 can comprise distances between touch points 210a, 210b, and 210c, which characterize the arrangement of touch points 210a, 210b, and 210c of peripheral device 202.
Similarly, identifying data 130 in memory 120 can comprise distances between touch points 114a, 114b, and 114c, which characterize the arrangement of touch points 114a, 114b, and 114c of peripheral device 104. The distances can be provided such that processor 118 can distinguish between touch points 114a, 114b, and 114c of peripheral device 104 and touch points 210a, 210b, and 210c of peripheral device 202, and thus between peripheral devices 202 and 104. Thus, peripheral devices 202 and 104 can have different identities, which can be determined by processor 118.
In the embodiment described with respect to FIG. 2, processor 118 can identify peripheral device 202 based on contemporaneous touches 126 as compared to touch points 210a, 210b, and 210c of peripheral device 202.
Furthermore, processor 118 can determine the orientation of peripheral device 202 based on contemporaneous touches 126 as compared to the touch points of peripheral device 202. For example, in the present embodiment, because identifying data 130 can characterize at least three touch points, for example, touch points 210a, 210b, and 210c, processor 118 can determine the orientation of peripheral device 202 with respect to touch surface 116. As a specific example, processor 118 can associate touch point 210c with the back of peripheral device 202 and the midpoint of distance B with the front of peripheral device 202.
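The front/back convention above can be turned into a heading angle as sketched below. This is a hypothetical illustration under the stated convention: it assumes the matching step has already told us which observed touch corresponds to the "back" touch point and which two flank the "front" midpoint.

```python
import math

def orientation_degrees(back, other_a, other_b):
    """Heading of the device on the touch surface: the angle of the vector
    from the 'back' touch point to the midpoint of the other two touch
    points (the 'front'), in degrees counterclockwise from the +x axis."""
    front = ((other_a[0] + other_b[0]) / 2.0, (other_a[1] + other_b[1]) / 2.0)
    return math.degrees(math.atan2(front[1] - back[1], front[0] - back[0]))

# Back point at the origin, front midpoint directly above it on the surface:
print(orientation_degrees((0.0, 0.0), (-10.0, 20.0), (10.0, 20.0)))  # 90.0
```

Three non-collinear points are the minimum needed for this: with only two touch points the pattern is symmetric, so position can be determined but the facing direction is ambiguous.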
Referring now to FIG. 3, peripheral device 302 can comprise touch points 310a, 310b, and 310c and contact regions 332a, 332b, and 332c.
In peripheral device 302, each of contact regions 332a, 332b, and 332c is configured to transfer at least one touch to touch surface 116 through any combination of touch points 310a, 310b, and 310c, for example, by providing a grounding path when touch points 310a, 310b, and 310c are provided over touch surface 116. For example, when a user touches one of contact regions 332a, 332b, and 332c with a finger, the touch can be transferred through at least one of touch points 310a, 310b, and 310c over touch surface 116. In one embodiment, touch lead 108 (not shown in FIG. 3) can provide the grounding path.
It is preferred that touch lead 108 provides a grounding path between at least one of contact regions 332a, 332b, and 332c and at least two of touch points 310a, 310b, and 310c. Thus, peripheral device 302 can expand, for example, one touch on peripheral device 302 to a plurality of contemporaneous touches on touch surface 116. By way of example, in the embodiment described with respect to FIG. 3, a single touch on one of contact regions 332a, 332b, and 332c can provide a plurality of contemporaneous touches on touch surface 116 through at least two of touch points 310a, 310b, and 310c.
Peripheral device 302 can further include touch switch 346 connected to a touch lead, such as touch lead 108, in addition to or instead of contact regions 332a, 332b, and 332c, which, when enabled, can provide touches on touch surface 116 through any combination of touch points 310a, 310b, and 310c. Touch switch 346 can provide touches on touch surface 116 without requiring user contact with peripheral device 302, for example, by providing charge variation on touch surface 116 when enabled.
Peripheral device 302 can include additional touch leads similar to touch lead 108. As a particular example, in the embodiment described with respect to FIG. 3, an additional touch lead can be connected to touch point 310a such that touch point 310a can serve as a toggle point.
In some embodiments, the toggle point can remain isolated from touch lead 108, for example, to provide a signal to electronic device 106 using touch 111a separate from touches 111b and 111c. As a specific example, the additional touch lead can provide a grounding path between contact region 332b and touch point 310a. In this example, the toggle point can function as a button that a user can press by touching contact region 332b, for example, to fire a gun or select a menu option in a video game or other software displayed on touch surface 116.
In other embodiments, the toggle point can become connected to touch lead 108 to signal electronic device 106 as to a condition of peripheral device 302. For example, putting a laser or other accessory in Buzz Lightyear's hand may signal electronic device 106 by changing the identity of peripheral device 302 or by sending a pulsing signal or pulse to electronic device 106 using touch 111a. The accessory may, for example, be attached to peripheral device 302, thereby shorting a conductive path to a touch lead, such as touch lead 108, or triggering a switch in peripheral device 302. In some embodiments, the accessory can be another peripheral device; for example, Buzz Lightyear could be put into his spaceship and electronic device 106 could run a space-themed mini-game.
In further embodiments, peripheral device 302 may comprise a poseable or transformable figurine or action figure, where the toggle point signals to electronic device 106 that a different pose or transformation has occurred. As an example, electronic device 106 can be running a video game and peripheral device 302 can be a Buzz Lightyear figurine, from Disney's Toy Story, holding a laser gun in his hand. When a user raises Buzz Lightyear's hand, touch point 310a can toggle, thereby providing a signal to electronic device 106 as touch 111a, as described above. For example, a switch at the pivot point of Buzz Lightyear's arm could toggle touch point 310a. Subsequently, electronic device 106 can run a mini-game, such as a shooting game, responsive to the signal. The toggle point can then be used as a button so that a user can fire the laser gun in the mini-game by touching, for example, contact region 332b. Alternatively, another toggle point could be used as the button.
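From the touch-sensitive system's side, a toggle point of this kind simply appears as an extra touch joining the device's base touch pattern. The sketch below is a hypothetical illustration, not code from the application: the tolerance and the point coordinates are assumptions, and the expected toggle location is taken as already known from the device's identification.

```python
import math

TOGGLE_RADIUS_MM = 2.0  # assumed tolerance for matching a touch to a point

def toggle_active(touches, base_points, toggle_point):
    """True when, beyond the device's base touch points, an extra touch
    appears at the expected toggle location (e.g. a raised arm or a
    pressed contact region)."""
    unmatched = list(touches)
    for p in base_points:  # consume the touches explained by the base points
        near = min(unmatched, key=lambda t: math.dist(t, p), default=None)
        if near is None or math.dist(near, p) > TOGGLE_RADIUS_MM:
            return False  # the base pattern itself is not present
        unmatched.remove(near)
    return any(math.dist(t, toggle_point) <= TOGGLE_RADIUS_MM for t in unmatched)

base = [(0.0, 0.0), (30.0, 0.0)]
toggle = (15.0, 25.0)
print(toggle_active([(0.0, 0.0), (30.0, 0.0)], base, toggle))                # False
print(toggle_active([(0.0, 0.0), (30.0, 0.0), (15.0, 25.0)], base, toggle))  # True
```

Polling this predicate each frame lets the software raise an event on the False-to-True transition, which is how a toggle point could drive a button press or a pose-change signal in the game.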
Moving to FIG. 4, flowchart 400 describes an exemplary method of facilitating an interaction using first and second peripheral devices and a touch-sensitive system.
Referring to step 410 of flowchart 400 in FIG. 4, step 410 of flowchart 400 comprises providing peripheral device 102 over touch surface 116 of the touch-sensitive system.
Referring to step 420 of flowchart 400 in FIG. 4, step 420 of flowchart 400 comprises detecting, using processor 118, a plurality of contemporaneous touches on touch surface 116 of the touch-sensitive system.
Referring to step 430 of flowchart 400 in FIG. 4, step 430 of flowchart 400 comprises identifying, using processor 118, peripheral device 102 based on the plurality of contemporaneous touches as compared to the plurality of touch points of peripheral device 102.
Referring to step 440 of flowchart 400 in FIG. 4, step 440 of flowchart 400 comprises determining the orientation of peripheral device 102 based on the plurality of contemporaneous touches as compared to the plurality of touch points of peripheral device 102.
Referring to step 450 of flowchart 400 in FIG. 4, step 450 of flowchart 400 comprises performing an interaction using peripheral devices 102 and 104, for example, by playing an audible indicator using speaker 152. For example, the identity of peripheral device 102 can be Woody and the audible indicator could be in the voice of Woody saying, "Well, if it's a sheriff you need, you sure came to the right toy." Furthermore, according to some embodiments, the audible indicator can be context sensitive, for example, based on the position of peripheral device 102 on touch surface 116, a state of the video game being run, or any of the identity, position, and orientation of peripheral devices 104 and 150 stored in memory 120. For example, the audible indicator could be in the voice of Woody saying, "Howdy Buzz, what brings you around these parts?" based on the identity of peripheral device 104 as Buzz Lightyear.
Furthermore, the interaction can be based on any of the position and orientation of peripheral device 102 relative to touch surface 116. For example, in one embodiment processor 118 can facilitate an interaction using peripheral devices 102 and 104 based on the position and orientation of peripheral device 102 with respect to peripheral device 104. As a specific example, processor 118 can determine that the front of peripheral device 102 is facing the front of, and is adjacent to, peripheral device 104 as shown in FIG. 1.
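A facing-and-adjacent test like the one just described can be sketched as follows. This is a hypothetical illustration: the adjacency threshold and the 0.7 facing cutoff (roughly a 45° cone) are assumptions, and each device's heading is taken as a unit vector already derived from its touch-point orientation.

```python
import math

ADJACENT_MM = 60.0  # assumed distance threshold for "adjacent"

def facing_each_other(pos_a, heading_a, pos_b, heading_b):
    """True when each device's front faces the other and they are adjacent.
    Positions are (x, y) on the touch surface; headings are unit vectors."""
    to_b = (pos_b[0] - pos_a[0], pos_b[1] - pos_a[1])
    dist = math.hypot(*to_b)
    if dist == 0 or dist > ADJACENT_MM:
        return False
    to_b = (to_b[0] / dist, to_b[1] / dist)  # normalize direction A -> B
    a_faces_b = heading_a[0] * to_b[0] + heading_a[1] * to_b[1] > 0.7
    b_faces_a = heading_b[0] * -to_b[0] + heading_b[1] * -to_b[1] > 0.7
    return a_faces_b and b_faces_a

# Two devices 50 mm apart with headings pointing at each other:
print(facing_each_other((0, 0), (1, 0), (50, 0), (-1, 0)))  # True
print(facing_each_other((0, 0), (1, 0), (50, 0), (1, 0)))   # False
```

The same dot-product test, applied between a device's heading and the direction to a displayed object, could decide whether a character "can see" an on-screen element such as the horse in the sequence below.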
An exemplary sequence will be described below with respect to the video game shown in FIG. 1.
A user could have recently provided peripheral device 102 over touch surface 116 in accordance with step 410. Subsequently, processor 118 can detect touch points from peripheral device 102 based on contemporaneous touches on touch surface 116 in accordance with step 420. Processor 118 can then identify peripheral device 102 as Woody in accordance with step 430. After peripheral device 102 is identified, Woody can say, “Well, if it's a sheriff you need, you sure came to the right toy” using speaker 152. Also, processor 118 can determine that peripheral device 102 is next to and facing the front of peripheral device 104 based on the orientation and position of each peripheral device, which is being tracked by electronic device 106. Based on this orientation and positioning, processor 118 can simulate a conversation between Buzz and Woody. For example, Buzz could say, “Hey Woody, warm day we are having isn't it?” using speaker 152. Then Woody could say, “Sure is Buzz, wish I had a way to cool down.”
Next, because processor 118 has determined that peripheral device 102 has the identity of Woody, that peripheral device 150 has the identity of Hamm, and that truck 164 is positioned between peripheral devices 150 and 102, peripheral device 150 (e.g., Hamm) could say using speaker 152, "Hey Woody! Buzz! Is that you? I didn't see you behind that truck. Would you like to cool down with a swim in my lake?" Based on the relative position of peripheral devices 104 and 150, electronic device 106 can determine that Hamm is far apart from Buzz and Woody and thus, using speaker 152, Buzz can say, "Hey Hamm! I didn't see you over there. Woody, I'm busy saving the planet, but you can go ahead . . . I just don't see how you are going to get all the way to Hamm's house?" Then, based on the orientation of peripheral devices 102 and 104 being tracked by electronic device 106, processor 118 can determine that peripheral device 104 is facing away from and therefore cannot see horse 170, and furthermore that peripheral device 102 is facing and therefore can see horse 170. Based on this information, Woody could say, "Buzz! I guess you didn't notice my horse Bullseye is right behind you. I'll hop on Bullseye and head over." Based on the positions of peripheral devices 102 and 104, tracked by electronic device 106, touch surface 116 can display an arrow from peripheral device 102 to horse 170, which goes around peripheral device 104, to indicate that the user should move peripheral device 102 over horse 170.
The user can then move peripheral device 102 toward horse 170, and because electronic device 106 is tracking that peripheral device 102 is positioned over grass 172 and has the identity of Woody, who is a cowboy, processor 118 can select the sound of cowboy boots walking on grass to be played using speaker 152 as peripheral device 102 moves over touch surface 116. When processor 118 determines that peripheral device 102 is over horse 170, a mini-game can commence in which horse 170 stays under peripheral device 102 as peripheral device 102 is moved by the user, because the position of peripheral device 102 over touch surface 116 is being tracked by electronic device 106. The mini-game can be, for example, a traffic-dodging game in which horse 170 must cross road 168 while avoiding truck 164 and car 166. The sequence can incorporate various cinematics displayed on touch surface 116. For example, as peripheral device 102 is moved by the user near road 168, touch surface 116 can zoom in to display a more detailed traffic scene. Furthermore, based on processor 118 determining that peripheral device 102 is positioned over road 168, speaker 152 can play the sound of a horse walking on a road, as opposed to walking on a grassy surface when positioned over grass 172. The mini-game could end when processor 118 determines that peripheral device 102 and horse 170 are positioned near lake 160.
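The terrain-dependent sound selection in the sequence above amounts to a point-in-region lookup on the tracked position. The sketch below is a hypothetical illustration: the region coordinates and sound file names are assumptions, with the region names following the example above.

```python
# Hypothetical rectangular regions of the displayed scene, in touch-surface
# coordinates (x0, y0, x1, y1), mapped to footstep sounds. More specific
# regions come first; the broad grass region acts as the fallback.
REGIONS = [
    ("road_168", (100, 0, 160, 200), "horse_on_road.wav"),
    ("grass_172", (0, 0, 300, 200), "boots_on_grass.wav"),
]

def footstep_sound(position):
    """Pick the sound for the first region containing the tracked position."""
    x, y = position
    for _, (x0, y0, x1, y1), sound in REGIONS:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return sound
    return None  # position is outside the scene

print(footstep_sound((120, 50)))  # horse_on_road.wav
print(footstep_sound((20, 50)))   # boots_on_grass.wav
```

Because the list is checked in order, overlapping regions resolve to the most specific match, which is how the road sound can override the grass sound while the horse crosses road 168.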
Thus, according to various embodiments, as set forth in flowchart 400 of FIG. 4, interactive game pieces and/or toys can be provided for use with touch screen devices without adding costly communication interfaces to the toys.
From the above description of the invention it is manifest that various techniques can be used for implementing the concepts of the present invention without departing from its scope. Moreover, while the invention has been described with specific reference to certain embodiments, a person of ordinary skill in the art would recognize that changes can be made in form and detail without departing from the spirit and the scope of the invention. As such, the described embodiments are to be considered in all respects as illustrative and not restrictive. It should also be understood that the invention is not limited to the particular embodiments described herein, but is capable of many rearrangements, modifications, and substitutions without departing from the scope of the invention.
Claims
1-20. (canceled)
21. A method for identifying one or more of a plurality of peripheral devices including a first peripheral device and a second peripheral device using a touch-sensitive system having a processor and a touch surface, each of the first and second peripheral devices having a plurality of touch points for touching the touch surface of the touch-sensitive system, the method comprising:
- detecting, using the processor, a plurality of contemporaneous touches on the touch surface of the touch-sensitive system; and
- identifying, using the processor, one of the first and second peripheral devices based on the plurality of contemporaneous touches as compared to the plurality of touch points of one of the first and second peripheral devices.
22. The method of claim 21, wherein identifying the one of the first and second peripheral devices is based on the plurality of contemporaneous touches as compared to two touch points of the one of the first and second peripheral devices.
23. The method of claim 21, comprising determining the orientation of the one of the first and second peripheral devices based on the plurality of contemporaneous touches as compared to the plurality of touch points of the one of the first and second peripheral devices.
24. The method of claim 21, comprising performing an interaction using the first and second peripheral devices.
25. The method of claim 24, wherein the interaction is based on the relative orientation of the first and second peripheral devices.
26. The method of claim 24, wherein the interaction is based on the identity of one of the first and second peripheral devices.
27. The method of claim 24, wherein the interaction is based on the position of the first peripheral device with respect to the second peripheral device.
28. A touch-sensitive system for identifying one or more of a plurality of peripheral devices including a first peripheral device and a second peripheral device, each of the first peripheral device and the second peripheral device having a plurality of touch points, the touch-sensitive system comprising:
- a touch surface;
- a processor, the processor being configured to: detect a plurality of contemporaneous touches on the touch surface; and identify one of the first peripheral device and second peripheral device based on the plurality of contemporaneous touches as compared to the plurality of touch points of the one of the first peripheral device and second peripheral device.
29. The system of claim 28, wherein the processor is configured to identify the one of the first and second peripheral devices based on the plurality of contemporaneous touches as compared to two touch points of the one of the first and second peripheral devices.
30. The system of claim 28, wherein the processor is configured to determine the orientation of the one of the first and second peripheral devices based on the plurality of contemporaneous touches as compared to the plurality of touch points of the one of the first and second peripheral devices.
31. The system of claim 28, wherein the processor is configured to perform an interaction using the first and second peripheral devices.
32. The system of claim 31, wherein the interaction is based on the relative orientation of the first and second peripheral devices.
33. The system of claim 31, wherein the interaction is based on the identity of the identified one of the first and second peripheral devices.
34. The system of claim 31, wherein the interaction is based on the position of the first peripheral device with respect to the second peripheral device.
35. A peripheral device for use with a touch-sensitive system having a touch surface, the peripheral device comprising:
- a plurality of touch points configured to contact the touch surface of the touch-sensitive system;
- the plurality of touch points further configured to provide information to the touch-sensitive system for identifying the peripheral device.
36. The peripheral device of claim 35, wherein the identifying distinguishes between the plurality of touch points of the peripheral device and a plurality of touch points of another peripheral device.
37. The peripheral device of claim 35, wherein the plurality of touch points are further configured to provide information to the touch-sensitive system for determining the orientation of the peripheral device.
38. The peripheral device of claim 35, wherein the information provided to the touch-sensitive system comprises at least one distance between any of the plurality of touch points.
39. The peripheral device of claim 35, comprising a touch lead configured to provide a grounding path between at least one contact region and at least two of the plurality of touch points.
40. The peripheral device of claim 35, comprising a touch lead configured to provide a grounding path between a touch switch and at least one of the plurality of touch points.
Type: Application
Filed: Aug 24, 2010
Publication Date: Jan 12, 2012
Applicant: DISNEY ENTERPRISES, INC. (BURBANK, CA)
Inventors: Christopher W. Heatherly (Monrovia, CA), Kenlip Ong (Burbank, CA), Armen Mkrtchyan (Glendale, CA), Jonathan Backer (Burbank, CA), Brian White (Simi Valley, CA)
Application Number: 12/806,986
International Classification: G06F 3/041 (20060101);