INTERACTIVE PLUSH TOY
A system is presented for controlling an interactive electronic plush toy using a macro computer program created with a mobile application on a mobile electronic device, such as a smartphone. The plush toy is provided with an electronic wireless network interface for communicating wirelessly with the smartphone via a network. The macro provides computer programming for instructing the plush toy to perform an Internet search query, receive results of the query, and perform predetermined actions based on the query results. The actions may include producing sound or motion to make the toy seem life-like.
The present application relates to the field of electronically-controlled toys. More particularly, the described embodiments relate to a plush toy that is controlled by a smartphone application or “app.” The plush toy may connect wirelessly to the Internet and may be programmed to perform actions based on information obtained from the Internet.
SUMMARY

One embodiment of the present invention provides a plush toy having built-in wireless communication features. A smartphone app may wirelessly control the plush toy's movement, sounds, or functions by assigning a predetermined macro computer program to the toy, or by creating a series of instructions for the toy to follow. In one aspect, a user may create a macro program that instructs the toy to query a resource, receive results, apply a conditional rule to the result, and perform one or more actions indicated by the rule. The toy may have moving parts, may play sounds, may play pre-recorded or real-time messages, may have text-to-speech features, and may play back custom voice recordings. The toy may have integrated sensors and respond to ambient factors such as movement, sounds, or light. The toy may be programmed to perform functions based on the time of day. The smartphone app may define tasks such as pulling weather information, financial information, or social media information from the Internet.
Smartphone 150 and plush toy 100 have wireless interfaces 151 and 155 for wireless data communication between plush toy 100 and smartphone 150. Plush toy 100 and smartphone 150 are also each able to connect wirelessly to a remote network 160 such as the Internet. Wireless interfaces 151, 155 may comprise one or more of Bluetooth, Wi-Fi, cellular GSM, cellular CDMA, or other wireless data communication interfaces. Wireless interfaces 151, 155 may also be capable of using more than one type of wireless data communication. For example, smartphone 150 may connect to plush toy 100 via Bluetooth communication and connect to network 160 via cellular communication.
Remote information resources 170, 180, and 190 are accessible to smartphone 150 and plush toy 100 through network 160. In the exemplary embodiment, weather information 170 may include temperature, precipitation, and pollen count information. Weather information 170 could also include weather forecasts, severe weather information, driving conditions, UV index, or other weather-related information. News information 180 may include national news headlines, local news, and stock news. News information 180 could also include category-specific news such as technology or political news. Social media 190 may include social networks and email, but may also include news sources, blogs, Internet forums, podcasts, video, etc. Information resources 170, 180, 190 may comprise databases that are accessible through data feeds, news feeds, RSS feeds, or other data dissemination methods. Data may also be extracted from information resources 170, 180, 190 via known data scraping or web scraping techniques. Other information resources accessible to smartphone 150 and plush toy 100 could include sports, horoscope, travel, shopping, and other types of information resources available via a network 160.
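As one illustration of how data might be extracted from an information resource such as weather information 170, the following sketch parses a raw feed payload into the fields a macro would consume. The feed layout and field names are hypothetical; an actual resource (an RSS feed, a scraped page, etc.) would need its own parsing logic.

```python
# Sketch: pulling a temperature reading out of a weather feed payload.
# The JSON layout here is an assumption, not a real feed format.
import json

def parse_weather_feed(payload: str) -> dict:
    """Extract the fields the toy cares about from a raw feed payload."""
    data = json.loads(payload)
    return {
        "temperature": float(data["current"]["temp_f"]),
        "precipitation": data["current"].get("precip", "none"),
        "pollen_count": data["current"].get("pollen", None),
    }

# Example payload, standing in for a response from a hypothetical feed URL.
sample = '{"current": {"temp_f": 72, "precip": "none", "pollen": 3}}'
reading = parse_weather_feed(sample)
```

The same pattern would apply to news information 180 or social media 190, with different field names per resource.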
Smartphone 150 may include a plush toy control app 152 for controlling and programming plush toy 100. As explained further in relation to
Plush toy 100 has numerous interactive features to make plush toy 100 seem life-like. The features may include movement capabilities such as movement 115, which includes moving the head, arms, legs, eyes, and mouth of plush toy 100. Movement capabilities may be implemented electronically or mechanically, and may be software- or hardware-controlled. In one embodiment, plush toy 100 could contain movement actuators using gears, pulleys, etc. connected to one or more servo motors or stepper motors controlled by processor 145. Other implementations of movement in a plush toy 100 will be known to one skilled in the art.
Plush toy 100 has a digital processor 145 in data communication with a tangible, non-transitory memory 135. In one embodiment, memory 135 includes both non-volatile memory, such as magnetic-based memory or flash memory, and volatile but faster-access memory, such as RAM. Processor 145 executes computer instructions and may control movement, sensors, and sound production within plush toy 100. Processor 145 may be a general purpose processor such as those processors based on the ARM processing architecture by ARM Holdings PLC (Cambridge, UK). Memory 135 may contain macros 130 with instructions for accessing information over a network such as the Internet, querying databases or other information sources, monitoring sensors 105 within the plush toy 100, and instructing the plush toy 100 to produce movement and generate sounds. Memory 135 may also contain general programming 132 that controls the basic features of the plush toy 100. The general programming may, for instance, instruct the plush toy 100 to make a particular noise when a pressure sensor in an arm of the plush toy 100 senses pressure. The general programming 132 also instructs the plush toy 100 when and how to communicate over wireless interface 155 to the network 160, and how to download macros 130 from the network 160. In one embodiment, the macros 130 contain instructions that are interpreted and implemented on the processor 145 by the general instructions 132. The general instructions 132 and macros 130 may be loaded from the non-volatile portions of the memory 135 into the volatile but faster RAM portion of the memory 135 in order to speed up execution of the macros 130 and instructions 132.
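The idea that macros 130 are interpreted by the general instructions 132 can be sketched as a small dispatch loop. The opcode names and handler table below are illustrative assumptions, not the actual firmware design.

```python
# Minimal sketch of general programming (132) interpreting a stored
# macro (130) as a sequence of (opcode, argument) instructions.
def run_macro(macro, handlers):
    """Step through macro instructions, dispatching each to a handler."""
    results = []
    for opcode, arg in macro:
        results.append(handlers[opcode](arg))
    return results

# Hypothetical handlers standing in for the toy's sound and movement routines.
handlers = {
    "SAY": lambda text: f"speak:{text}",
    "MOVE": lambda part: f"actuate:{part}",
}
log = run_macro([("SAY", "hello"), ("MOVE", "arm")], handlers)
```

Keeping the macro as interpreted data rather than native code is consistent with the described design, since new macros can then be downloaded over the wireless interface without reflashing the general programming.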
Sensors 105 are integrated within plush toy 100 and may be software- or hardware-controlled and monitored. In the preferred embodiment, sensors 105 monitor various conditions internal and external to plush toy 100. Sensors 105 may include one or more of a motion sensor, light sensor, clock, sound sensor, accelerometer, pressure sensor, or other types of known sensors. Sensors 105 may operate continuously, or may be activated on demand by computer instructions within plush toy 100 or instructions received from outside of plush toy 100. In some embodiments, a sensor 105 is provided that monitors the wireless interface 155 for a trigger signal. This allows a Bluetooth connection or a Wi-Fi connection to receive signals that trigger actions in the plush toy 100.
Movement features 115 of plush toy 100 allow for physical movement of the body, head, arms, legs, eyes, and other parts of plush toy 100, and may include light-up features of plush toy 100. Movement 115 may be controlled by macros 130 or as part of the general instructions 132. Sound generating features 125 include pre-recorded sound, text-to-speech features, playback of live audio, or other types of sound. Sound generating features 125 are preferably coordinated with movement features 115, for example to synchronize the mouth of plush toy 100 with speech sounds.
In the first step 210 of
In step 220, the user chooses an execute trigger for the macro. The execute trigger is a conditional statement indicating when to execute the steps of the macro. The execute trigger condition may be a specified time of day, a sound cue, a motion cue, a light cue, or other external condition determined by a sensor 105 on the plush toy. Alternatively, the trigger could be the powering on or powering off of the plush toy 100. The trigger may be recurring or intermittent. The user may choose in step 220 to program the macro to execute an unlimited number of times, or may choose to program the macro to run only a specified number of times, after which the macro will not run.
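The execute trigger of step 220 can be sketched as a condition plus an optional run limit, after which the macro stops firing. The class and field names below are illustrative, not part of the described app.

```python
# Sketch of the step-220 execute trigger: a condition evaluated against
# sensor state, with an optional maximum number of runs.
class ExecuteTrigger:
    def __init__(self, condition, max_runs=None):
        self.condition = condition   # callable on the current sensor state
        self.max_runs = max_runs     # None means unlimited executions
        self.runs = 0

    def should_fire(self, state):
        if self.max_runs is not None and self.runs >= self.max_runs:
            return False             # run limit reached; macro no longer runs
        if self.condition(state):
            self.runs += 1
            return True
        return False

# Fire at 7:00 am, at most twice.
trig = ExecuteTrigger(lambda s: s["time"] == "07:00", max_runs=2)
fired = [trig.should_fire({"time": t})
         for t in ["06:59", "07:00", "07:00", "07:00"]]
```

A sound, motion, or light cue would simply be a different `condition` callable reading a different sensor field.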
In step 230, the user of the plush toy app 152 defines conditional rules indicating how to differentiate the query results returned from the information resource as a result of the query. The plush toy app 152 may have built-in or predetermined instructions for step 230, or the user may be given flexible choices for differentiating the query results. In the preferred embodiment, at step 230 the macro is programmed to evaluate the query results using conditional rules, then determine actions for the plush toy 100 based on an applied rule. The rules may include being above or below a certain threshold, may be Boolean yes/no conditions, may be based on predetermined keywords, or other programmable rules. It is possible to create macros that contain no conditionals in step 230.
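The three rule styles named in step 230 (threshold comparison, Boolean yes/no, and keyword matching) can be sketched as a single dispatch function. The rule representation is an assumption for illustration.

```python
# Sketch of the step-230 conditional rules applied to a query result.
def apply_rule(rule, result):
    kind = rule["kind"]
    if kind == "threshold":
        return result > rule["value"]          # above/below a threshold
    if kind == "boolean":
        return bool(result)                    # yes/no condition
    if kind == "keyword":
        return any(k in result for k in rule["keywords"])  # keyword match
    raise ValueError(f"unknown rule kind: {kind}")

warm = apply_rule({"kind": "threshold", "value": 70}, 72)
raining = apply_rule({"kind": "keyword", "keywords": ["rain"]},
                     "light rain expected")
```

A macro with no conditionals, as the paragraph permits, would skip this step and proceed directly to its actions.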
In step 240 the macro is programmed to define actions to perform based on the differentiated query results. In one embodiment, an action is performed for every query result. In an alternate embodiment, an action may be performed for some query results, while no actions are performed for other query results. The actions to be performed may include movement, sound projection, text-to-speech playback, illuminating lights, or other actions. The actions to be performed may include executing a second macro and performing the steps of the second macro.
In step 250, the user of plush toy app 152 defines a return trigger indicating when to perform the actions defined in step 240. The return trigger may be a specified time of day, a sound cue, a motion cue, a light cue, or other external condition. In one embodiment the plush toy 100 waits until a return trigger condition is met before executing the actions defined in step 240. In another embodiment, no return trigger is defined in step 250 and the plush toy 100 may perform the actions defined in step 240 immediately without waiting for a return trigger condition. The programming of the macro is finished when step 250 is complete. In step 260, the macro programming instructions are transferred wirelessly from the plush toy app 152 and saved in the memory 135 of plush toy 100 as a macro 130. The wireless transfer may be made locally through a connection such as a Bluetooth connection, or may be made over a remote network such as the Internet. If the wireless transfer of the macro 130 is made via the Internet, the smartphone 150 does not need to be near the plush toy 100 to send macro instructions.
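The transfer of step 260 implies that the finished macro is serialized into some wire format the toy's firmware understands. The JSON encoding and field names below are assumptions; any agreed serialization would serve.

```python
# Sketch of step 260: packaging the finished macro for wireless transfer
# from the app (152) to the toy's memory (135).
import json

def package_macro(query, execute_trigger, rules, actions, return_trigger=None):
    macro = {
        "query": query,
        "execute_trigger": execute_trigger,
        "rules": rules,
        "actions": actions,
        "return_trigger": return_trigger,  # None: act immediately (step 250)
    }
    return json.dumps(macro).encode("utf-8")  # bytes for the radio link

blob = package_macro("weather/temperature", {"time": "07:00"},
                     [{"above": 70}], ["say: no coat needed"],
                     {"motion": True})
restored = json.loads(blob)
```

Because the payload is plain data, it can travel equally well over a local Bluetooth link or through the Internet, matching the two transfer paths described above.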
Turning specifically to the steps in
In step 320, the macro 130 is programmed to receive the results of the query, after which conditional rules are applied. In the exemplary embodiment of
Once the differentiating conditional options of steps 332, 334, and 336 are programmed, actions are assigned for each result. If the conditional statement at step 332 is satisfied, the method proceeds to step 342. The macro programmer may choose one or more actions and sounds to perform for the particular query result. In the exemplary embodiment of
After the actions to be performed are defined at steps 342, 344, and 346, the user at the plush toy app 152 may define a conditional statement indicating when to perform the actions 342, 344, or 346. In one embodiment the actions may be performed immediately without waiting for a conditional return trigger to occur. In the exemplary embodiment of
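The exemplary weather macro of steps 332-346 can be summarized end to end: query the temperature, differentiate it at the 70- and 55-degree thresholds, and select the matching action. The warm-branch phrase comes from the example; the cold-branch phrase is an assumed placeholder, since the text does not specify it.

```python
# End-to-end sketch of the exemplary weather macro's differentiating
# conditionals (steps 332, 334, 336) and assigned actions (342, 344, 346).
def choose_action(temp_f):
    if temp_f > 70:
        return "say: you won't need a coat today"
    if temp_f < 55:
        return "say: bundle up, it's cold out"  # assumed cold-branch phrase
    return None  # between thresholds: no action in this sketch

actions = [choose_action(t) for t in (72, 60, 40)]
```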
A memory 511 is operably connected to processor 510, and stores macros 512 and general instructions 513 for the operation of the plush toy 500. In the preferred embodiment a macro is created on a mobile electronic device and sent via a wireless signal to be stored in the memory 511 of plush toy 500. The macros 512 may be macros programmed according to
Processor 510 is responsible for controlling motion and producing sound in plush toy 500 according to the general instructions 513 stored in memory 511. Processor 510 also receives input from sensors 530 and uses the sensor data in connection with macros 512. Sensors 530 of the plush toy 500 may include one or more of a motion sensor, light sensor, clock, sound sensor, accelerometer, or other types of known sensors. Plush toy 500 may have a speaker and amplifier 524 for generating sound. Text-to-speech functions 540 may be used to produce sound. As is true with the other capabilities of the plush toy 500, text-to-speech functionality 540 may be implemented in software as part of the general instructions 513, or may be implemented as hardware, such as in a special purpose processor designed to convert text data to audible speech. Pre-recorded sound may be stored in memory 511 and be projected through speaker 524.
Plush toy 500 has motion actuators including arm actuators 520, leg actuators 521, eye actuator 522, and mouth actuator 523, which produce movement and make plush toy 500 appear life-like. Motion actuators of plush toy 500 may include small, rotating motors connected to gears, pulleys, cams, or levers. Servo or stepper motors connected to the processor 510 may drive motion in actuators 520-523. Processor 510 may control mouth actuator 523 and sound generator 540 to synchronize speech sounds with mouth motion of plush toy 500. The movement features of plush toy 500 may be implemented in a number of different ways that are known to one skilled in the art.
In one embodiment, macro templates are created to ease the creation of common macro types. For example,
Edit sounds button 614 and edit movement button 615 allow a user to program the actions that the plush toy 500 will perform after receiving results of a query. In the exemplary embodiment the data field 624 shows that the plush toy 500 will use text-to-speech features to play back the phrase “you won't need a coat today.” The data field 624 specifically shows the action to take when the weather query results are above 70 degrees. The plush toy app 601 would also allow the user to choose sounds when the weather query results indicate that the temperature is below 55 degrees. Movement instructions are given in data field 625. In the example shown, when the temperature is above 70 degrees the processor 510 of plush toy 500 will cause the arm actuators 520 to wave, and will cause the mouth actuator 523 to “lip synch” along with the text-to-speech sounds projected by speaker and amplifier 524. Plush toy app 601 would also allow the user to program motions for the plush toy 500 to perform when the results show that the temperature is below 55 degrees.
Edit return options button 616 allows the user to edit the conditions under which plush toy 500 will perform the sound 624 and movement 625 actions. In the example of
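The return-option behavior set by button 616 amounts to holding the programmed actions until the motion sensor fires. The sensor-event interface below is an assumption used only to illustrate the wait-then-act flow.

```python
# Sketch of the return option: after the morning query, hold the actions
# until a motion event arrives from the toy's sensors.
def wait_then_act(sensor_events, action):
    """Return the performed action once a motion event arrives, else None."""
    for event in sensor_events:
        if event == "motion":
            return action
    return None  # no motion detected; actions stay pending

performed = wait_then_act(["sound", "light", "motion"],
                          "say: you won't need a coat today")
```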
When the programming for macro 620 is complete, the plush toy app 601 will wirelessly send the macro program to plush toy 500 when the user selects the “send to my toy” button 680. The plush toy app 601 may send the macro directly to plush toy 500 via a wireless interface using a local wireless signal, for example through a Bluetooth connection. Plush toy app 601 may also route the macro programming 620 through a network such as the Internet. In this case, plush toy 500 would receive the wireless signals via a wireless Internet connection. When plush toy app 601 uses the Internet to communicate with plush toy 500, a smartphone does not need to be in close proximity to plush toy 500 to create and send the macro program 620.
It is possible to use the app 601 to program multiple plush toys 500. In this case, the send to my toy button 680 would require the user to identify which toy is being programmed by this macro. In a multi-toy environment, it is possible to program interactions between toys 500. These interactions usually require that one toy 500 perform an action that is sensed by the other toy 500. For instance, a first toy could be programmed as described above, to query the outdoor temperature at 7:00 am, wait for movement in the room, and then speak the words “You won't need a coat today.” The other toy could also be programmed so that at 7:00 am it will query the predicted temperature tomorrow, and may discover that the high tomorrow will be 92 degrees. The second toy would wait for the first toy to state its line (such as by waiting for a sound sensor to hear the words, or waiting until the second toy senses movement, then senses sound, and then waits 5 seconds). When this occurs, the second toy then states “and tomorrow looks like a hot one. It will be over ninety degrees tomorrow.” To improve interaction between toys 500, toys could include the ability to trigger one another, such as through unique sounds, or even wireless digital connections operating between the wireless interfaces 550 of each toy 500. Macros could be programmed to initiate upon receipt of a signal from a companion toy 500, cause an action to be performed, and then send a return signal to the companion toy 500 to trigger an additional macro at the companion toy. By combining this type of interaction with the ability to query external data sources, complicated and informative interactions between toys could be developed.
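The companion-toy handshake described above (toy A acts and sends a trigger signal; toy B's macro initiates on that signal and replies) can be sketched with an in-memory channel standing in for the wireless link between interfaces 550. The message format is hypothetical.

```python
# Sketch of the two-toy interaction: toy A speaks and signals its
# companion; toy B's macro triggers on the signal and responds.
def toy_a(send):
    send({"from": "A", "event": "spoke",
          "line": "You won't need a coat today."})

def toy_b(inbox, send):
    for msg in inbox:
        if msg.get("event") == "spoke":      # trigger from companion toy
            send({"from": "B", "event": "spoke",
                  "line": "And tomorrow looks like a hot one."})

# Simulate the exchange over an in-memory "wireless" channel.
channel = []
toy_a(channel.append)
replies = []
toy_b(list(channel), replies.append)
```

In a real deployment the trigger could equally be a unique sound detected by toy B's sound sensor, as the paragraph notes; the digital-signal variant simply makes the handshake explicit.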
The many features and advantages of the invention are apparent from the above description. Numerous modifications and variations will readily occur to those skilled in the art. For example, the plush toy app could be implemented as a control panel attached to the plush toy. Since such modifications are possible, the invention is not to be limited to the exact construction and operation illustrated and described. Rather, the present invention should be limited only by the following claims.
Claims
1. A system for controlling an electronic toy, comprising:
- a) a mobile electronic device having a device processor, a device wireless communication interface in data communication with the electronic toy over a wireless data connection, a tangible, non-transitory device memory, and a mobile application residing on the device memory, the mobile application including macro instructions transmitted wirelessly to the toy over the wireless data connection; and
- b) the electronic toy having i) a toy processor, ii) a tangible, non-transitory toy memory, iii) a toy wireless communication interface in data communication with the mobile electronic device over the wireless data connection, iv) a processor-controlled apparatus selected from a set comprising a sound generator and a motion actuator, and v) the macro instructions received from the mobile electronic device via the wireless data connection and stored in the toy memory, the macro instructions configured to cause the toy to (1) perform a query of an external information resource over the toy wireless communication interface, (2) receive a query result, (3) apply a conditional rule to the query result, and (4) cause the processor-controlled apparatus to perform an action indicated by the conditional rule as applied to the query result.
2. The system of claim 1, wherein the information resource is a weather information resource and the query includes a request for at least one of temperature, precipitation, UV index, and pollen count information.
3. The system of claim 1, wherein the performed action is generating sound on the sound generator located within the toy, the sound being at least one of recorded sound and text-to-speech sound.
4. The system of claim 1, wherein the action performed is activating the motion actuator of the toy.
5. (canceled)
6. (canceled)
7. The system of claim 1, wherein the query result is a numeric value and the conditional rule differentiates the query result by comparing the numeric value to a threshold value.
8. The system of claim 1, wherein the toy further comprises a sensor, and the macro instructions include instructions to monitor the sensor and send the query after the sensor makes a detection.
9. The system of claim 8, wherein the sensor is one of a motion sensor, a sound sensor, a light sensor, a voice sensor, a clock, and an accelerometer.
10. The system of claim 1, wherein the toy further comprises a sensor, and the macro instructions include instructions to monitor the sensor after receiving the query result and perform the action after the sensor makes a detection.
11. A method for controlling a processor-controlled interactive toy via a mobile device, the mobile device having a processor, a wireless network interface, a user interface, and a tangible, non-transitory memory, the method comprising:
- a) creating a macro, at the mobile device, by selecting, through the user interface, i) a search query for performing a query of an information source on a remote network, ii) a conditional rule to apply to a result of the query, and iii) a first action for the toy to perform, the first action including at least one of generating sound at a sound generator of the toy and activating a motion actuator of the toy; and
- b) transmitting the macro from the mobile device to the toy via the wireless network interface of the mobile device to be stored in a tangible, non-transitory memory of the toy;
wherein the conditional rule provides instructions for the toy to perform the first action if the query result satisfies a first condition.
12. (canceled)
13. The method of claim 11, wherein the conditional rule provides instructions to perform a second action different from the first action if the query result does not satisfy the first condition.
14. The method of claim 11, wherein step a) further comprises selecting a trigger condition on which to initiate the query of the information resource.
15. The method of claim 14, wherein step a) further comprises selecting a trigger condition on which to initiate performing the action.
16. The method of claim 11, wherein the information source is one of a weather information source, a news information source, and a social media information source.
17. A method for executing a macro by an electronic toy having a processor, a non-transitory memory, and a wireless interface connected to a wireless network, the method comprising:
- a) receiving macro instructions over the wireless network;
- b) storing the macro instructions in the memory of the electronic toy;
- c) executing the macro instructions after step b), the macro instructions including instructions to: i) query a remote information resource via the wireless interface, ii) receive a result of the query, and iii) apply a conditional rule to the query result, the conditional rule providing instructions to perform an action if the query result meets a first condition; and
- d) performing the action in response to executing the macro instructions;
wherein the action is one of generating sound at a sound generator of the toy and activating a motion actuator of the electronic toy.
18. The method of claim 17, wherein the macro instructions further include instructions to query the remote information resource after a trigger event.
19. The method of claim 17, wherein the query result is a numeric value and the conditional rule includes comparing the numeric value to a threshold value.
20. The method of claim 17, wherein the motion actuator is one of a leg actuator, an arm actuator, an eye actuator, and a mouth actuator.
21. The method of claim 17, wherein the information resource is one of a weather resource, a news resource, and a social media resource.
22. The system of claim 1, wherein the conditional rule is a rule to differentiate the query result using a Boolean yes/no condition.
23. The system of claim 1, wherein the conditional rule is a rule to differentiate the query result based on predetermined keywords.
24. The method of claim 11, further comprising the steps of:
- c) storing the macro in the memory of the toy;
- d) performing the query of the information source on the remote network via a wireless interface within the toy;
- e) applying the conditional rule to the result of the query; and
- f) performing the first action, by the toy.
Type: Application
Filed: Aug 6, 2012
Publication Date: Feb 6, 2014
Applicant: BBY SOLUTIONS (Richfield, MN)
Inventors: Anshuman Sharma (St. Louis Park, MN), Patrick McGinnis (Chanhassen, MN), Todd Coate (Minneapolis, MN), Newton Guillen (Plymouth, MN)
Application Number: 13/567,490
International Classification: A63H 30/02 (20060101); A63H 3/28 (20060101);