METHOD AND APPARATUS FOR INTERACTIVE PLAY
An interactive toy object apparatus having a toy body that includes a plurality of object body portions; an object control circuit secured to the body that includes an object processor, an object memory device, and one or more object transceivers; a plurality of object inputs and object outputs secured to one or more of the object body portions and in communication with the object control circuit; and a first control program stored in the object memory device and operable by the object processor, wherein the interactive object is capable of communicating with a controller, via the one or more object transceivers, to receive or transmit at least one of commands, inputs, and outputs, therebetween.
The present Application is a non-provisional application claiming the benefit of U.S. provisional patent application No. 61/754,769, having the same title as the present Application and filed on Jan. 21, 2013, which is hereby incorporated by reference in its entirety.
FIELD
The method and apparatus relate to interactive objects and, more particularly, to interactive toys.
BACKGROUND
Various types of toys are available for play with a user, such as a child. The interaction of these toys with the user has traditionally been limited to actions performed by the toy in response to an action of the user. For example, pushing a specific spot on a stuffed animal can initiate an action such as a sound or movement. Typically, the toy includes a basic circuit that receives a hard input from a user and responds in a pre-programmed manner with a hard output. Such limited interactive capabilities with a user, particularly a child, can lead to rapid boredom and subsequent non-use of the toy.
Embodiments of the method and apparatus are disclosed with reference to the accompanying drawings and are for illustrative purposes only. The method and apparatus are not limited in their application to the details of construction or the arrangement of the components illustrated in the drawings. The method and apparatus are capable of other embodiments or of being practiced or carried out in various other ways. In the drawings:
In at least some embodiments, the method and apparatus for interactive play relates to an apparatus that includes a controller comprising a controller processor, a controller memory device, a communication device, a controller display screen, one or more controller transceivers, and a first control program resident in the controller memory device and operable by the controller processor. In addition, the apparatus includes an interactive object comprising an object control circuit, an object processor, an object memory device, one or more object transceivers, and a second control program resident in the object memory device and operable by the object processor, wherein the controller is capable of communicating with the interactive object, via the transceivers, to receive and transmit at least one of commands, inputs, and outputs, therebetween.
In additional embodiments, the method and apparatus for interactive play relates to an interactive toy object apparatus having a toy body that includes a plurality of object body portions; an object control circuit secured to the body that includes an object processor, an object memory device, and one or more object transceivers; a plurality of object inputs and object outputs secured to one or more of the object body portions and in communication with the object control circuit; and a first control program stored in the object memory device and operable by the object processor, wherein the interactive object is capable of communicating with a controller, via the one or more object transceivers, to receive or transmit at least one of commands, inputs, and outputs, therebetween.
In other additional embodiments, the method and apparatus for interactive play relates to a method of interactive play with a toy that includes providing a toy object that includes a toy body and an object control circuit secured to the body that includes an object processor, an object memory device, and at least one communication device; activating one of a plurality of object inputs secured to one of a plurality of object body portions forming the toy body; and playing an audio track stored in the object memory device via one or more object outputs secured to one or more of the body portions, wherein the audio track is assignable to be activated by one or more of the object inputs.
In further additional embodiments, the method and apparatus for interactive play relates to a method of interactive play with a toy that includes providing a controller having a display screen, a controller processor, a controller memory, and a wireless controller transceiver, wherein the controller is configured to communicate with a toy object via the controller transceiver, and displaying on the display screen a plurality of selections that include one or more of selecting an audio track from a library of audio tracks in the controller memory, downloading an audio track for storage in the controller memory, and recording an audio track for storage in the controller memory.
Other embodiments, aspects, features, objectives and advantages of the method and apparatus will be understood and appreciated upon a full reading of the detailed description and the claims that follow.
DETAILED DESCRIPTION
In at least some embodiments, the controller transceivers 133 utilize a wireless technology for communication, such as, but not limited to, cellular-based communication technologies such as analog communications (using AMPS), digital communications (using CDMA, TDMA, GSM, iDEN, GPRS, EDGE, etc.), and next generation communications (using UMTS, WCDMA, LTE, IEEE 802.16, etc.), or variants thereof; peer-to-peer or ad hoc communication technologies such as HomeRF (radio frequency), Bluetooth, near field communications (NFC), and IEEE 802.11 (a, b, g, or n); or other wireless communication technologies such as Infra-Red (IR). The controller 104 can utilize one or more of the aforementioned technologies, as well as other technologies not currently developed, to communicate with the interactive object 102. In at least some embodiments, it is preferable that the controller 104 and the interactive object 102 communicate utilizing Bluetooth and/or NFC protocols. In particular, this is practical because it does not require an outside network interface (e.g., WiFi).
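The transport preference described above — peer-to-peer links such as Bluetooth or NFC, which require no outside network interface — can be sketched as a simple selection rule. This is an illustrative sketch only; the transport names, the ranking, and the `choose_transport` function are assumptions introduced here, not part of the disclosed apparatus.

```python
# Illustrative sketch: rank candidate links so that peer-to-peer
# transports (no outside network required) are preferred.
PEER_TO_PEER = {"nfc", "bluetooth", "ir"}   # no network infrastructure needed
NETWORKED = {"wifi", "cellular"}            # require an outside interface

def choose_transport(available):
    """Return the preferred transport from those currently available.

    Peer-to-peer transports are tried first (NFC before Bluetooth,
    since NFC can also trigger pairing by proximity); networked
    transports are used only as a fallback.
    """
    preference = ["nfc", "bluetooth", "ir", "wifi", "cellular"]
    for transport in preference:
        if transport in available:
            return transport
    return None

# A controller with both WiFi and Bluetooth available picks Bluetooth.
print(choose_transport({"wifi", "bluetooth"}))  # bluetooth
```

The fixed preference list is one possible policy; an embodiment could equally weight transports by signal strength or power budget.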
As discussed above, the controller 104 includes the first control program 190 and/or the first control program interface 197, which reside on the controller 104 to initiate actions to be performed by the interactive object 102. In addition, the second control program 198 resides on the interactive object 102 to receive and execute instructions received from the controller 104, as well as to transmit instructions from the interactive object 102 to the controller 104. When the first control program 190 is resident on the controller 104, the control program interface 197 is not required. If the first control program 190 is not resident on the controller 104, but merely accessed by the controller 104 for operation, then the first control program interface 197 is resident on the controller 104 to facilitate communication with the first control program 190 installed on another device/source.
The first control program 190 is, in at least some embodiments, a software application. The software application can be configured to run on various types of controllers 104. The type of controller 104 typically determines the operating system utilized. For example, the first control program 190 can be configured to operate on one or more of the IOS, ANDROID, and WINDOWS 8 operating systems. In at least some embodiments, the first control program 190 is installed directly into the memory of the controller 104. Installation of the first control program 190 can be performed utilizing one of many installation methods. For example, the first control program 190 can be downloaded, via wired or wireless communication, from Internet stores such as GOOGLE PLAY, the APP STORE, the WINDOWS STORE, etc. Once downloaded, the first control program 190 resides on the controller 104 and can be configured for communication with the interactive object 102. The interfacing and operation of the controller 104 and the interactive object 102 can be performed in numerous manners. At least one exemplary embodiment is illustrated in the flow charts discussed below.
The first control program 190 includes various screen views that are configured to include arrangements of selection buttons 113 and other objects, such as an avatar 211, for display on the controller display screen 180 of the controller 104. Through selection or other manipulation of the controller 104 by a user, various different screen views can be displayed offering new, old, or modified selections (e.g., selection buttons 113). The first control program 190 includes a home screen 127 that displays an avatar 211 of the interactive object 102 and a first set of selection buttons 113, whose identifier or value can dynamically change based on user selections. It is to be further understood that the term selection button 113 can include any one of the identified buttons in the process steps listed below.
If the interactive object 102 is not found by the controller 104 in step 206, then in step 208, a message can be displayed on the controller 104 indicating that no interactive objects 102 have been found. Alternatively, if the interactive object 102 is found in step 206, an indication is provided to the user in step 210, such as by displaying an avatar 211 of the interactive object 102.
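The search-and-annunciate sequence of steps 206-210 can be sketched as a short discovery routine. The `scan` callable and the returned status strings are hypothetical names introduced for illustration; the disclosure does not prescribe any particular interface.

```python
def discover_object(scan):
    """Sketch of steps 206-210: scan for an interactive object and
    report the outcome.

    `scan` is a hypothetical callable returning a found object's
    identifier, or None if no object responds (an assumption made
    here for illustration).
    """
    found = scan()
    if found is None:
        # Step 208: tell the user nothing was found.
        return {"status": "not_found",
                "message": "No interactive objects found"}
    # Step 210: indicate success, e.g. by displaying the object's avatar.
    return {"status": "paired", "avatar": found}

print(discover_object(lambda: None)["status"])       # not_found
print(discover_object(lambda: "bear-01")["avatar"])  # bear-01
```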
From the displayed selections, a user selects one of the selectable songs in step 304. This selection prompts the song to be played from the object speaker 162 in the interactive object 102 and/or the controller speaker 179 in step 306. In addition, selection of the song can initiate illumination of one or more body lights 160 on the interactive object 102, as in step 308. Further, selection of the song in step 304 can initiate the playing of an animation on the controller 104 in step 310. Annunciation of the end of the selected song can be provided by the interactive object 102, such as by sounding a noise (e.g., a "giggle"), as in step 312, and/or illuminating one or more LEDs on the interactive object 102, as in step 314. In step 316, the user is prompted with a choice to repeat the song. If the user chooses not to repeat, then at step 318, the sub-menu from step 302 is displayed.
Beginning again from step 302 with a listing of available songs displayed, if the user selects “save songs” button at step 330, then at step 332, the avatar 211 of the interactive object 102 is displayed with avatar body portions 123 on the controller 104. The avatar body portions 123 are displayed on the controller 104 to assist a user with identifying the status of the various object inputs 120. More particularly, the avatar body portions 123 are displayed with body portion indicators 151 (e.g., illuminated or contrast colored display screen portions (e.g., pixels)). The body portion indicators 151 represent unassigned object inputs 120 and serve as user input points on the controller display screen 180 for a user to touch to make a selection. For example, an avatar body portion 123 (e.g., an avatar foot 125) can be shown with a body portion indicator 151 illuminated green if no song is assigned and red if a song is already assigned to a particular object body portion 119. Additionally, as in step 334, body lights 160 on the interactive object 102 can also be illuminated to correspond with the body portion indicator 151 on the avatar 211. For example, a body portion light 160 on the foot 105 of the interactive object 102 would be illuminated with a color that corresponds to the color displayed on the body portion indicator 151 of the avatar 211. This provides easy identification of the available choices for assigning a song. In step 336, the user selects one of the displayed songs and assigns it to the desired object body portion 119 of the interactive object 102. The assignment can be accomplished in many ways, such as by touching the song and then the desired avatar body portion 123 displayed on the controller 104. After the assignment, the newly assigned body portion light 160, and/or the body portion indicator 151 can be illuminated to acknowledge the assignment, as shown in steps 338 and 339. 
For example, before the selection of a song, a green illumination shown at a body portion light 160 of the interactive object 102, and/or at a body portion indicator 151 on the controller 104, can change to red upon a successful assignment. In step 340, the controller 104 can display an option to assign further songs to the interactive object 102; if additional assignments are desired, the process returns to step 336.
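The assignment flow of steps 332-340 — green indicators marking unassigned body portions, red marking assigned ones, with an acknowledgement on a successful assignment — can be sketched as a small bookkeeping class. The body-portion names and the `SongAssigner` interface are illustrative assumptions; the disclosure leaves the set of portions and the data structures to the embodiment.

```python
class SongAssigner:
    """Sketch of steps 332-340: assign songs to object body portions,
    tracking the green/red indicator state described above."""

    def __init__(self, portions):
        # No songs assigned initially, so every indicator starts green.
        self.assignments = {p: None for p in portions}

    def indicator_color(self, portion):
        # Green marks an unassigned portion, red an assigned one.
        return "green" if self.assignments[portion] is None else "red"

    def assign(self, song, portion):
        if self.assignments[portion] is not None:
            return False  # already taken; the user must pick a green portion
        self.assignments[portion] = song
        return True       # steps 338-339: acknowledge the assignment

assigner = SongAssigner(["left_foot", "right_foot"])
assert assigner.indicator_color("left_foot") == "green"
assigner.assign("lullaby", "left_foot")
print(assigner.indicator_color("left_foot"))  # red
```

In a full embodiment the same state would also drive the corresponding body lights 160 on the toy itself, mirroring the on-screen indicators.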
Numerous sounds and functions can be preprogrammed into the object memory device 134 during manufacturing. In addition, sounds and functions can be downloaded to the interactive object 102 from the controller 104. Returning to step 210, once the interactive object 102 is paired with the controller 104, the object inputs 120 can be manipulated to activate various object outputs 122, including lights, sounds, etc. For example, the user can activate the sensor 150 positioned in the stomach 115 of the interactive object 102, and a sound, such as a "laugh," can be played from the object speaker 162. In addition, body lights 160 can be illuminated and an animation can be displayed on the controller 104 for viewing by the user. Other actions can include turning the interactive object 102 upside-down, thereby activating an object input 120, such as the accelerometer 152 and/or gyroscope 154, resulting in a preselected or random audio track being played from the object speaker 162.
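The input-to-output behavior described above — a stomach sensor triggering a laugh, an inversion event triggering an audio track — can be sketched as a simple dispatch table. The trigger names and response tuples are assumptions introduced for illustration, not the disclosed firmware.

```python
# Illustrative sketch: each object input event maps to one or more
# object output actions, as preprogrammed or downloaded to the toy.
RESPONSES = {
    "stomach_sensor": [("speaker", "laugh"), ("lights", "blink")],
    "inverted": [("speaker", "random_track")],  # accelerometer/gyroscope event
}

def handle_input(trigger):
    """Return the output actions associated with a given input event;
    unknown events produce no actions."""
    return RESPONSES.get(trigger, [])

print(handle_input("stomach_sensor")[0])  # ('speaker', 'laugh')
```

Because the table lives on the toy, the mapped responses remain available even without a live connection to the controller, consistent with the standalone operation described later.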
The above exemplary processes have been described with primary focus on the actions and responses from the controller 104 and interactive object 102 from the perspective of the user. Below and with reference to
The pairing can also occur via NFC by touching the interactive object 102 with the controller 104, or by placing the controller 104 in near proximity to the interactive object 102, wherein the interactive object 102 can have an NFC tag installed. The NFC tag is, in at least some embodiments, a programmable device that provides or triggers an action instruction in the controller 104 when the controller 104 senses the tag in near proximity. In addition, as in step 608, the controller 104 receives a connect instruction, via the "connect" button, and initiates the pairing process. At step 610, the controller 104 verifies that pairing is successful and the process moves to step 612. If pairing is unsuccessful, annunciation is provided by the controller 104 and the process returns to step 604 for further instruction. At step 612, the controller 104 displays the avatar 211 of the interactive object 102. Further, the controller 104 can transmit a command to the interactive object 102, via communication initiated between the object processor 129 and the controller processor 131, utilizing the object transceiver(s) 132 and controller transceiver(s) 133. The command can include an annunciation of the pairing to be performed by the interactive object 102, such as illuminating body lights 160, etc. In addition to displaying the avatar 211, one or more additional or replacement selection buttons 113 are displayed on the controller 104.
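The command exchange of steps 608-612 — the controller transmitting, over the transceivers, a command such as "annunciate the pairing by illuminating body lights" — can be sketched as a minimal message protocol. The JSON wire format and the field names (`action`, `params`) are assumptions made here for illustration; the disclosure only requires that commands pass between the transceivers.

```python
import json

def make_command(action, **params):
    """Controller-side sketch: encode a command (e.g. step 612's
    pairing annunciation) for transmission to the toy."""
    return json.dumps({"action": action, "params": params})

def handle_command(raw):
    """Object-side sketch: decode a received command and pick the
    output to drive in response."""
    cmd = json.loads(raw)
    if cmd["action"] == "illuminate":
        return ("body_lights", cmd["params"].get("color", "white"))
    return ("noop", None)

wire = make_command("illuminate", color="green")
print(handle_command(wire))  # ('body_lights', 'green')
```

A keyed message format like this lets new command types be added without changing the transport, which suits the variety of lights, sounds, and animations the flow charts describe.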
During the playing of the audio track on the object speaker 162, the controller 104 can display animation of the avatar 211 associated with the interactive object 102, along with any additional avatars displayed as a result of pairing with additional objects, as in step 720. Further, the process moves to step 714, where a command is transmitted from the controller 104 to one or more interactive objects 102 that instructs them to illuminate one or more body portion indicators 151. At step 722, the controller 104 determines that the audio track is finished playing, and at step 724 the controller 104 transmits a command to the interactive object 102 to annunciate the end of the song, such as playing an audio track that includes a "giggle", and/or, in step 726, the controller 104 transmits a command to the interactive object 102 to illuminate one or more body lights 160 with a distinctive color. At the end of the "giggle" audio track in step 728, the process returns to step 706 to display the song list again on the controller 104.
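The playback sequence of steps 714-728 — illuminate the indicators, play the track, then annunciate its end with a "giggle" and a distinctive light color — can be sketched as a short controller-side routine. The `send` callable and the command tuples are hypothetical, introduced only to make the ordering of the steps concrete.

```python
def play_sequence(track, send):
    """Sketch of steps 714-728 from the controller's perspective.

    `send` is a hypothetical transmit callable standing in for the
    transceiver link; the command names are assumptions.
    """
    events = []
    send(("illuminate", "indicators"))         # step 714
    events.append(("playing", track))          # steps 716-720 (playback)
    events.append(("finished", track))         # step 722: track done
    send(("play", "giggle"))                   # step 724: end-of-song annunciation
    send(("illuminate", "distinctive_color"))  # step 726
    return events

sent = []
log = play_sequence("lullaby", sent.append)
print(len(sent))  # 3
```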
In addition to reading a story with a pre-recorded voice, a new story voice can be recorded and played with the story.
Manipulation of the interactive object 102, with or without pairing to the controller 104, can initiate an object output 122 to be activated by the second control program 198.
Although various actions above are described as being performed by the controller 104, it is to be understood that these actions are more specifically performed via instruction from the first control program 190 as operated by the controller processor 131. Likewise, although various actions above are described as being performed by the interactive object 102, it is to be understood that these actions are more specifically performed via instruction from the second control program 198 as operated by the object processor 129. It should be noted that although this process is described with respect to songs, stories, etc., numerous types of media (e.g., photos) and numerous functions (e.g., speaking phrases, vibrating body portions, etc.) can be assigned to operate with various object body portions 119. With regard to the first control program 190 and second control program 198, one or both of the programs can be configured (in real time or pre-programmed) to play various games and activities that take advantage of the various object inputs 120, object outputs 122, controller inputs 193, and controller outputs 194.
The interactive object 102 can, in at least some embodiments, take the form of numerous other objects, such as medical devices, kitchen appliances, and other consumer products that can benefit from the interfacing and control afforded by the controller 104. The first control program 190 and second control program 198 can be tailored to address the desired features for these additional objects. It is specifically intended that the method and apparatus for interactive play is not to be limited to the embodiments and illustrations contained herein, but include modified forms of those embodiments including portions of the embodiments and combinations of elements of different embodiments as come within the scope of the following claims. Further, the steps described herein with reference to the method of operation (processes) are not to be considered limiting and can include variations, such as additional steps, removed steps, and re-ordered steps.
It should be appreciated that the present disclosure is intended to encompass numerous embodiments as disclosed herein and further described by the following:
(i). An interactive toy object apparatus comprising:
- a toy body that includes a plurality of object body portions;
- an object control circuit secured to the body that includes an object processor, an object memory device, and one or more object transceivers;
- a plurality of object inputs and object outputs secured to one or more of the object body portions and in communication with the object control circuit; and
- a first control program stored in the object memory device and operable by the object processor, wherein the interactive object is capable of communicating with a controller, via the one or more object transceivers, to receive or transmit at least one of commands, inputs, and outputs, therebetween.
(ii). The apparatus of (i), wherein the plurality of object inputs include a first tactile sensor secured to a first object body portion and a second tactile sensor secured to a second body portion, and wherein the plurality of object outputs include a speaker.
(iii). The apparatus of any one of (i)-(ii), wherein the toy body is a stuffed animal comprised at least in part of a cloth material and a stuffing material.
(iv). The apparatus of any one of (i)-(iii), wherein the plurality of object inputs include a microphone for receiving a voice prompt, and wherein the first control program is configured to respond to the voice prompt by activating one or more of the plurality of object outputs to provide an annunciation of at least one of sound, motion, or light.
(v). The apparatus of any one of (i)-(iv), wherein the plurality of object outputs includes the speaker and one or more lights.
(vi). The apparatus of any one of (i)-(v), wherein the plurality of object inputs includes a gyroscope for sensing a position change of the toy body from a first position to an inverted second position, wherein the first control program is configured to activate one or more of the plurality of object outputs to provide an annunciation of at least one of sound, motion, or light upon sensing the toy body moving from the first position to the inverted second position.
(vii). The apparatus of any one of (i)-(vi), wherein the plurality of object inputs includes an accelerometer for sensing a position change of the toy body from a first position to a second position, wherein the first control program is configured to activate one or more of the plurality of object outputs to provide an annunciation of at least one of sound, motion, or light upon sensing the toy body moving from the first position to the second position.
(viii). The apparatus of any one of (i)-(vii), wherein the one or more transceivers are configured to receive wireless programming instructions, and wherein the programming instructions include the assignment of one or more of the plurality of object inputs to one or more of the object outputs.
(ix). The apparatus of any one of (i)-(viii), wherein the one or more transceivers are configured to receive wireless instructions, and wherein the instructions include the assignment of one or more of the plurality of object inputs to one or more of the object outputs, such that activation of one or more of the plurality of object inputs activates associated object outputs to provide at least one of sound, motion, or light.
(x). The apparatus of any one of (i)-(ix), wherein once the instructions are received, the object outputs respond to the associated object inputs without further instructions being received by the transceivers.
(xi). The apparatus of any one of (i)-(x), wherein the toy body includes a pair of arms, a pair of legs, and a chest, wherein the arms and chest each include an object input and an object output.
(xii). The apparatus of any one of (i)-(xi), wherein the one or more transceivers are paired with a controller using a wireless connection, and the controller includes a display screen that displays an avatar of the toy body.
(xiii). A method of interactive play with a toy comprising:
- providing a toy object that includes a toy body and an object control circuit secured to the body that includes an object processor, an object memory device, and at least one communication device;
- activating one of a plurality of object inputs secured to one of a plurality of object body portions forming the toy body; and
- playing an audio track stored in the object memory device via one or more object outputs secured to one or more of the body portions, wherein the audio track is assignable to be activated by one or more of the object inputs.
(xiv). The method of (xiii), wherein the toy body is a stuffed animal comprised at least in part of a cloth material and a stuffing material.
(xv). The method of any one of (xiii)-(xiv), wherein the audio track is assignable via a controller capable of communicating with the communication device.
(xvi). The method of any one of (xiii)-(xv), wherein the audio track is chosen from a plurality of audio tracks that are stored in a controller memory of the controller.
(xvii). The method of any one of (xiii)-(xvi), further including displaying an avatar image of the toy body on the controller, wherein the body portions of the toy body displayed in the avatar image are selectable on the controller for assignment to one of the audio tracks.
(xviii). The method of any one of (xiii)-(xvii), further including activating one or more object outputs associated with the body portion during the assignment of a selected audio track.
(xix). A method of interactive play with a toy comprising:
- providing a controller having a display screen, a controller processor, a controller memory, and a wireless controller transceiver, wherein the controller is configured to communicate with a toy object via the controller transceiver; and
- displaying on the display screen a plurality of selections that include one or more of selecting an audio track from a library of audio tracks in the controller memory, downloading an audio track for storage in the controller memory, and recording an audio track for storage in the controller memory.
(xx). The method of (xix), further including:
- selecting one of the audio tracks stored in the library of the controller memory and communicating the audio track to the toy object via the controller transceiver;
- receiving the communicated audio track at the toy object;
- storing the communicated audio track in an object memory of the toy object; and
- playing the audio track via a speaker secured to the toy object upon receipt of an activation of an object input situated on a body portion of the toy object.
(xxi). An apparatus comprising:
- a controller comprising:
- a controller processor;
- a controller memory device;
- a communication device;
- a controller display screen;
- one or more controller transceivers;
- a first control program resident in the controller memory device and operable by the controller processor; and
- an interactive object comprising:
- an object control circuit;
- an object processor;
- an object memory device;
- one or more object transceivers; and
- a second control program resident in the object memory device and operable by the object processor,
- wherein the controller is capable of communicating with the interactive object, via the transceivers, to receive and transmit at least one of commands, inputs, and outputs, therebetween.
(xxii). The apparatus of (xxi), wherein the controller is paired with the interactive object using a wireless connection.
(xxiii). The apparatus of any one of (xxi)-(xxii), wherein the interactive object is configured to be operated without a continuous connection to the controller.
While the principles of the method and apparatus for interactive play have been described above in connection with a specific apparatus, it is to be clearly understood that this description is made only by way of example and not as a limitation on the scope of the method and apparatus for interactive play. It is specifically intended that the method and apparatus for interactive play not be limited to the embodiments and illustrations contained herein, but include modified forms of those embodiments, including portions of the embodiments and combinations of elements of different embodiments as come within the scope of the following claims. In addition, the various methods of use described herein can include additional steps not described herein or can omit steps described herein. Further, the various steps can be performed in a different order than described herein.
Claims
1. An interactive toy object apparatus comprising:
- a toy body that includes a plurality of object body portions;
- an object control circuit secured to the body that includes an object processor, an object memory device, and one or more object transceivers;
- a plurality of object inputs and object outputs secured to one or more of the object body portions and in communication with the object control circuit; and
- a first control program stored in the object memory device and operable by the object processor, wherein the interactive object is capable of communicating with a controller, via the one or more object transceivers, to receive or transmit at least one of commands, inputs, and outputs, therebetween.
2. The apparatus of claim 1, wherein the plurality of object inputs include a first tactile sensor secured to a first object body portion and a second tactile sensor secured to a second body portion, and wherein the plurality of object outputs include a speaker.
3. The apparatus of claim 2, wherein the toy body is a stuffed animal comprised at least in part of a cloth material and a stuffing material.
4. The apparatus of claim 3, wherein the plurality of object inputs include a microphone for receiving a voice prompt, and wherein the first control program is configured to respond to the voice prompt by activating one or more of the plurality of object outputs to provide an annunciation of at least one of sound, motion, or light.
5. The apparatus of claim 4, wherein the plurality of object outputs includes the speaker and one or more lights.
6. The apparatus of claim 5, wherein the plurality of object inputs includes a gyroscope for sensing a position change of the toy body from a first position to an inverted second position, wherein the first control program is configured to activate one or more of the plurality of object outputs to provide an annunciation of at least one of sound, motion, or light upon sensing the toy body moving from the first position to the inverted second position.
7. The apparatus of claim 6, wherein the plurality of object inputs includes an accelerometer for sensing a position change of the toy body from a first position to a second position, wherein the first control program is configured to activate one or more of the plurality of object outputs to provide an annunciation of at least one of sound, motion, or light upon sensing the toy body moving from the first position to the second position.
8. The apparatus of claim 7, wherein the one or more transceivers are configured to receive wireless programming instructions, and wherein the programming instructions include the assignment of one or more of the plurality of object inputs to one or more of the object outputs.
9. The apparatus of claim 8, wherein the one or more transceivers are configured to receive wireless instructions, and wherein the instructions include the assignment of one or more of the plurality of object inputs to one or more of the object outputs, such that activation of one or more of the plurality of object inputs activates associated object outputs to provide at least one of sound, motion, or light.
10. The apparatus of claim 9, wherein once the instructions are received, the object outputs respond to the associated object inputs without further instructions being received by the transceivers.
11. The apparatus of claim 10, wherein the toy body includes a pair of arms, a pair of legs, and a chest, wherein the arms and chest each include an object input and an object output.
12. The apparatus of claim 3, wherein the one or more transceivers are paired with a controller using a wireless connection, and the controller includes a display screen that displays an avatar of the toy body.
13. A method of interactive play with a toy comprising:
- providing a toy object that includes a toy body and an object control circuit secured to the body that includes an object processor, an object memory device, and at least one communication device;
- activating one of a plurality of object inputs secured to one of a plurality of object body portions forming the toy body; and
- playing an audio track stored in the object memory device via one or more object outputs secured to one or more of the body portions, wherein the audio track is assignable to be activated by one or more of the object inputs.
14. The method of claim 13, wherein the toy body is a stuffed animal comprised at least in part of a cloth material and a stuffing material.
15. The method of claim 14, wherein the audio track is assignable via a controller capable of communicating with the communication device.
16. The method of claim 15, wherein the audio track is chosen from a plurality of audio tracks that are stored in a controller memory of the controller.
17. The method of claim 14, further including displaying an avatar image of the toy body on the controller, wherein the body portions of the toy body displayed in the avatar image are selectable on the controller for assignment to one of the audio tracks.
18. The method of claim 17, further including activating one or more object outputs associated with the body portion during the assignment of a selected audio track.
19. A method of interactive play with a toy comprising:
- providing a controller having a display screen, a controller processor, a controller memory, and a wireless controller transceiver, wherein the controller is configured to communicate with a toy object via the controller transceiver; and
- displaying on the display screen a plurality of selections that include one or more of selecting an audio track from a library of audio tracks in the controller memory, downloading an audio track for storage in the controller memory, and recording an audio track for storage in the controller memory.
20. The method of claim 19, further including:
- selecting one of the audio tracks stored in the library of the controller memory and communicating the audio track to the toy object via the controller transceiver;
- receiving the communicated audio track at the toy object;
- storing the communicated audio track in an object memory of the toy object; and
- playing the audio track via a speaker secured to the toy object upon receipt of an activation of an object input situated on a body portion of the toy object.
Type: Application
Filed: Jan 17, 2014
Publication Date: Jul 24, 2014
Inventors: Benjamin Huyck (Chicago, IL), Casparus Cate (Chicago, IL)
Application Number: 14/158,110
International Classification: A63H 30/04 (20060101);