INTERACTIVE PLUSH TOY

- BBY SOLUTIONS

A system is presented for controlling an interactive electronic plush toy using a macro computer program created on a mobile electronic device with a mobile application. The plush toy is provided with an electronic wireless network interface for communicating wirelessly with the mobile device via a network. The macro provides computer programming instructing the plush toy to perform an Internet search query, receive the results of the query, and perform predetermined actions based on the query results. The actions may include producing sound or motion to make the toy seem life-like.

Description
FIELD OF THE INVENTION

The present application relates to the field of electronically-controlled toys. More particularly, the described embodiments relate to a plush toy that is controlled by a smartphone application or “app.” The plush toy may connect wirelessly to the Internet and may be programmed to perform actions based on information obtained from the Internet.

SUMMARY

One embodiment of the present invention provides a plush toy having built-in wireless communication features. A smartphone app may wirelessly control the plush toy's movement, sounds, or functions by assigning a predetermined macro computer program to the toy, or by creating a series of instructions for the toy to follow. In one aspect, a user may create a macro program that instructs the toy to query a resource, receive results, apply a conditional rule to the result, and perform one or more actions indicated by the rule. The toy may have moving parts, may play sounds, may play pre-recorded or real-time messages, may have text-to-speech features, and may play back custom voice recordings. The toy may have integrated sensors and respond to ambient factors such as movement, sounds, or light. The toy may be programmed to perform functions based on the time of day. The smartphone app may define tasks such as pulling weather information, financial information, or social media information from the Internet.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic diagram depicting a system for controlling an interactive plush toy.

FIG. 2 is a flow chart showing steps in a method for creating a macro.

FIG. 3 is a flow chart showing the steps for creating an exemplary macro.

FIG. 4 is a flow chart detailing steps in a method for executing a macro.

FIG. 5 is a schematic diagram depicting the primary elements of an interactive plush toy.

FIG. 6 is a diagram showing elements of a smartphone app used to program a macro for an interactive plush toy.

DETAILED DESCRIPTION

FIG. 1 shows a system for controlling an interactive plush toy. The system includes a plush toy 100, a handheld mobile communication device such as a smartphone 150, a network 160 such as the Internet, and information resources 170, 180, and 190. The plush toy is represented in FIG. 1 as a schematic of circuitry inside a plush toy 100. Although toy 100 is described as a “plush toy” in relation to the various embodiments, toy 100 may also be a doll, an action figure, a toy vehicle, or other plaything. Device 150 is described as a “smartphone,” but could also be a tablet computer, a music player, a digital watch, or any other mobile device with computing power able to execute a mobile application.

Smartphone 150 and plush toy 100 have wireless interfaces 151 and 155 for wireless data communication between plush toy 100 and smartphone 150. Plush toy 100 and smartphone 150 are also each able to connect wirelessly to a remote network 160 such as the Internet. Wireless interfaces 151, 155 may comprise one or more of Bluetooth, Wi-Fi, cellular GSM, cellular CDMA, or other wireless data communication interfaces. Wireless interfaces 151, 155 may also be capable of using more than one type of wireless data communication. For example, smartphone 150 may connect to plush toy 100 via Bluetooth communication and connect to network 160 via cellular communication.

Remote information resources 170, 180, and 190 are accessible to smartphone 150 and plush toy 100 through network 160. In the exemplary embodiment, weather information 170 may include temperature, precipitation, and pollen count information. Weather information 170 could also include weather forecasts, severe weather information, driving conditions, UV index, or other weather-related information. News information 180 may include national news headlines, local news, and stock news. News information 180 could also include category-specific news such as technology or political news. Social media 190 may include social networks and email, but may also include news sources, blogs, Internet forums, podcasts, video, etc. Information resources 170, 180, 190 may comprise databases that are accessible through data feeds, news feeds, RSS feeds, or other data dissemination methods. Data may also be extracted from information resources 170, 180, 190 via known data scraping or web scraping techniques. Other information resources accessible to smartphone 150 and plush toy 100 could include sports, horoscope, travel, shopping, and other types of information resources available via a network 160.
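
For illustration only, a query against such a resource might look like the following minimal sketch; the endpoint URL, query parameters, and JSON field names are assumptions, not part of the described system.

```python
# Minimal sketch of pulling one weather attribute from a network resource;
# the endpoint, parameters, and response fields are placeholders, not a real feed.
import json
import urllib.parse
import urllib.request


def fetch_weather_attribute(location: str, attribute: str = "temperature_f") -> float:
    """Query a hypothetical weather feed and return a single numeric attribute."""
    params = urllib.parse.urlencode({"location": location, "fields": attribute})
    url = "https://weather-feed.example/current?" + params  # placeholder endpoint
    with urllib.request.urlopen(url, timeout=10) as response:
        payload = json.load(response)
    return float(payload[attribute])
```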

Smartphone 150 may include a plush toy control app 152 for controlling and programming plush toy 100. As explained further in relation to FIGS. 2-6, plush toy app 152 allows a user to create a macro for plush toy 100 by defining a search query, defining conditional instructions indicating when to perform the query, defining conditional rules differentiating the results of the query, and defining actions to perform based on the query results. After plush toy app 152 finishes programming a macro, the macro is sent to the plush toy 100 and stored in memory 135 as macro 130. Plush toy 100 may execute the macro 130 by executing the query, receiving the query results, comparing the query results to the conditional rules, and performing the actions provided in the conditional rules based on the query results. There are numerous ways to implement plush toy app 152 that would be known to one skilled in the art.
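
As a rough picture of what plush toy app 152 might hand to the toy, the sketch below models a macro as a small serializable structure holding the query, the triggers, and the conditional rules; the schema and field names are illustrative assumptions, not the patent's actual data format.

```python
# Illustrative sketch of a macro as the app might serialize it before sending
# it to the toy over Bluetooth or the Internet; the schema is an assumption.
import json
from dataclasses import asdict, dataclass, field
from typing import Optional


@dataclass
class Macro:
    resource: str                              # which information resource to query
    query: dict                                # search parameters for that resource
    execute_trigger: dict                      # when to run the query
    rules: list = field(default_factory=list)  # conditional rules over the result
    return_trigger: Optional[dict] = None      # optional condition before acting

    def to_wire(self) -> bytes:
        """Serialize for wireless transfer to the toy."""
        return json.dumps(asdict(self)).encode("utf-8")


morning_weather = Macro(
    resource="weather",
    query={"location": "55423", "attribute": "temperature_f"},
    execute_trigger={"type": "time_of_day", "at": "07:00"},
    rules=[
        {"op": "<", "threshold": 55, "actions": ["say:Remember your jacket!"]},
        {"op": ">=", "threshold": 70, "actions": ["say:You won't need a coat today!"]},
    ],
    return_trigger={"type": "motion_detected"},
)
payload = morning_weather.to_wire()
```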

Plush toy 100 has numerous interactive features to make plush toy 100 seem life-like. The features may include movement capabilities such as movement 115, which includes moving the head, arms, legs, eyes, and mouth of plush toy 100. Movement capabilities may be implemented electronically or mechanically, and may be software- or hardware-controlled. In one embodiment, plush toy 100 could contain movement actuators using gears, pulleys, etc. connected to one or more servo motors or stepper motors controlled by processor 145. Other implementations of movement in a plush toy 100 will be known to one skilled in the art.

Plush toy 100 has a digital processor 145 in data communication with a tangible, non-transitory memory 135. In one embodiment, memory 135 includes both non-volatile memory, such as magnetic-based memory or flash memory, and volatile but faster-access memory, such as RAM. Processor 145 executes computer instructions and may control movement, sensors, and sound production within plush toy 100. Processor 145 may be a general purpose processor such as those based on the ARM processing architecture by ARM Holdings PLC (Cambridge, UK). Memory 135 may contain macros 130 with instructions for accessing information over a network such as the Internet, querying databases or other information sources, monitoring sensors 105 within the plush toy 100, and instructing the plush toy 100 to produce movement and generate sounds. Memory 135 may also contain general programming 132 that controls the basic features of the plush toy 100. The general programming may, for instance, instruct the plush toy 100 to make a particular noise when a pressure sensor in an arm of the plush toy 100 senses pressure. The general programming 132 also instructs the plush toy 100 when and how to communicate over wireless interface 155 to the network 160, and how to download macros 130 from the network 160. In one embodiment, the macros 130 contain instructions that are interpreted and implemented on the processor 145 by the general instructions 132. The general instructions 132 and macros 130 may be loaded from the non-volatile portions of the memory 135 into the volatile but faster RAM portion of the memory 135 in order to speed up execution of the macro 130 and instructions 132.

Sensors 105 are integrated within plush toy 100 and may be software- or hardware-controlled and monitored. In the preferred embodiment, sensors 105 monitor various conditions internal and external to plush toy 100. Sensors 105 may include one or more of a motion sensor, light sensor, clock, sound sensor, accelerometer, pressure sensor, or other types of known sensors. Sensors 105 may operate continuously, or may be activated on demand by computer instructions within plush toy 100 or instructions received from outside of plush toy 100. In some embodiments, a sensor 105 is provided that monitors the wireless interface 155 for a trigger signal. This allows a Bluetooth connection or a Wi-Fi connection to receive signals that trigger actions in the plush toy 100.

Movement features 115 of plush toy 100 allow for physical movement of the body, head, arms, legs, eyes and other parts of plush toy 100, and may include light-up features of plush toy 100. Movement 115 may be controlled by macros 130 or as part of the general instructions 132. Sound generating features 125 include pre-recorded sound, text-to-speech features, playback of live audio, or other types of sound. Sound generating features 125 preferably are coordinated with movement features 115, for example to synchronize the mouth of plush toy 100 with speech sounds.

FIG. 2 presents a method for creating a macro for use with plush toy 100 of FIG. 1. FIG. 3 presents a specific implementation of the method of FIG. 2, in which a macro is programmed to access weather information via a network such as the Internet, and then perform actions based on the retrieved information. The macro is preferably created at the smartphone 150 using plush toy app 152.

In the first step 210 of FIG. 2, a user of a smartphone 150 uses plush toy app 152 to create a macro. Step 210 includes choosing an information resource and a query for that information resource. The information resource may be one or more of information resources 170, 180, 190 of FIG. 1, but could also include other types of resources accessible over a network 160. Step 210 may include choosing a database, webpage, URL, or other web resource to query, and preferably includes a search term or search parameter.

In step 220, the user chooses an execute trigger for the macro. The execute trigger is a conditional statement indicating when to execute the steps of the macro. The execute trigger condition may be a specified time of day, a sound cue, a motion cue, a light cue, or other external condition determined by a sensor 105 on the plush toy. Alternatively, the trigger could be the powering on or powering off of the plush toy 100. The trigger may be recurring or intermittent. The user may choose in step 220 to program the macro to execute an unlimited number of times, or may choose to program the macro to run only a specified number of times, after which the macro will not run.
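
To make the trigger idea concrete, a small sketch of how the toy might test an execute trigger follows; the trigger types and the run-count field are assumptions for illustration only.

```python
# Sketch of execute-trigger evaluation (step 220); the trigger types and the
# optional run counter are illustrative assumptions.
import datetime


def trigger_met(trigger: dict, sensor_values: dict) -> bool:
    """Return True when the macro's execute trigger condition is satisfied."""
    kind = trigger.get("type")
    if kind == "time_of_day":
        return datetime.datetime.now().strftime("%H:%M") == trigger["at"]
    if kind in ("motion", "sound", "light"):
        return bool(sensor_values.get(kind))
    if kind == "power_on":
        return bool(sensor_values.get("just_powered_on"))
    return False


def should_run(macro_state: dict, trigger: dict, sensor_values: dict) -> bool:
    """Honor a limited run count if the user chose one; None means unlimited."""
    runs_left = macro_state.get("runs_left")
    if runs_left is not None and runs_left <= 0:
        return False
    return trigger_met(trigger, sensor_values)
```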

In step 230, the user of the plush toy app 152 defines conditional rules indicating how to differentiate the query results returned from the information resource as a result of the query. The plush toy app 152 may have built-in or predetermined instructions for step 230, or the user may be given flexible choices for differentiating the query results. In the preferred embodiment, at step 230 the macro is programmed to evaluate the query results using conditional rules, then determine actions for the plush toy 100 based on an applied rule. The rules may include being above or below a certain threshold, may be Boolean yes/no conditions, may be based on predetermined keywords, or other programmable rules. It is possible to create macros that contain no conditionals in step 230.

In step 240 the macro is programmed to define actions to perform based on the differentiated query results. In one embodiment, an action is performed for every query result. In an alternate embodiment, an action may be performed for some query results, while no actions are performed for other query results. The actions to be performed may include movement, sound projection, text-to-speech, illuminating lights, or other actions. The actions to be performed may include executing a second macro and performing the steps of the second macro.

In step 250, the user of plush toy app 152 chooses a return trigger indicating when to perform the actions defined in step 240. The return trigger may be a specified time of day, a sound cue, a motion cue, a light cue, or other external condition. In one embodiment the plush toy 100 waits until a return trigger condition is met before executing the action defined in step 240. In another embodiment, no return trigger is defined in step 250 and the plush toy 100 may perform the actions defined in step 240 immediately without waiting for a return trigger condition. The programming of the macro is finished when step 250 is complete. In step 260, the macro programming instructions are transferred wirelessly from the plush toy app 152 and saved in the memory 135 of plush toy 100 as a macro 130. The wireless transfer may be made locally through a connection such as a Bluetooth connection, or may be made over a remote network such as the Internet. If the wireless transfer of the macro 130 is made via the Internet, the smartphone 150 does not need to be near the plush toy 100 to send macro instructions.

FIG. 3 shows an exemplary macro, such as macro 130 of FIG. 1, which may be programmed at a plush toy app 152. The function of the macro 130 is to perform an Internet query of a weather information resource to determine the outside temperature at 7 AM. The results of the query are numeric, and are differentiated by particular numeric thresholds. If the temperature query result returns a number below a certain threshold, the macro 130 instructs the plush toy 100 to perform a first set of actions; if the result is above a particular threshold the macro 130 instructs the plush toy 100 to perform a second set of actions. If the numeric result is between the two thresholds, the plush toy 100 takes no action. The macro 130 instructs the plush toy 100 to perform the chosen action after a motion sensor is triggered.

Turning specifically to the steps in FIG. 3, in step 310 the macro 130 is programmed to query an information resource 170 for weather information. At step 312, the macro defines the query for the weather information resource 170, asking the resource to return the current temperature (in degrees Fahrenheit) for a particular geographic location. At step 314, a trigger is defined that causes the macro to perform this query at 7:00 AM every day. Other triggers (not shown) could be defined, for example, to only execute the macro 130 on Mondays, or to only execute the macro 130 a particular number of times, then discontinue executing the macro. The macro 130 could also be programmed to execute more than one time per day, or only in response to a predetermined external stimulus detected by a sensor 105, such as a voice command.

In step 320, the macro 130 is programmed to receive the results of the query, after which conditional rules are applied. In the exemplary embodiment of FIG. 3, the results of the query are numeric and are represented in FIG. 3 by the variable “TEMP.” These results are differentiated in three steps. In step 332 the query result is tested against a first conditional statement. Step 332 determines whether the numeric value of “TEMP” is less than 55 degrees. If step 332 returns a negative, the result is tested at step 334, which determines whether “TEMP” is less than 70 degrees. If step 334 returns a negative, then it is known that “TEMP” is greater than or equal to 70 degrees, as indicated in FIG. 3 by box 336. In the example of FIG. 3, all numerical (e.g., integer) query results will follow one of the paths at steps 332, 334, and 336. In alternate embodiments, some query results may not fulfill one of the given conditions. In such a case a default action could be indicated for all query results that do not meet any of the conditional rules, or alternatively no action would result in these circumstances.

Once the differentiating conditional options of steps 332, 334, and 336 are programmed, actions are assigned for each result. If the conditional statement at step 332 is satisfied, the method proceeds to step 342. The macro programmer may choose one or more actions and sounds to perform for the particular query result. In the exemplary embodiment of FIG. 3, the user programs the macro 130 at step 342 to cause the plush toy 100 to move the mouth, eyes, and arms, and to produce the sound “Remember your jacket!” through either text-to-speech functions, pre-recorded sound, or by recording a custom voice sound. At step 344, the user programs the macro 130 to take no action when the condition at step 334 is met. At step 346, the user programs the plush toy 100 to move the mouth and eyes and produce the sound “You won't need a coat today!”

After the actions to be performed are defined at steps 342, 344, and 346, the user at the plush toy app 152 may define a conditional statement indicating when to perform the actions 342, 344, or 346. In one embodiment the actions may be performed immediately without waiting for a conditional return trigger to occur. In the exemplary embodiment of FIG. 3, the user chooses to make the plush toy 100 wait until motion is sensed by the plush toy before performing one of the actions 342, 344, and 346. By programming this step 348 the user of plush toy app 152 will decrease the likelihood that the plush toy 100 performs the actions when no person is around. The programming at step 348 will cause the plush toy 100 to wait for motion to be sensed, and if no motion is sensed, plush toy 100 will continue to monitor using its motion sensors. In one embodiment, the repeating step of waiting for motion to be sensed in step 348 can time out after a predetermined time period, such as two hours. At step 350, the user programs the plush toy 100 to perform the determined step 342, 344, 346 when the conditional statement at step 348 is satisfied.
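
Read as code, the FIG. 3 macro might look like the sketch below; query_temperature(), speak(), animate(), and wait_for_motion() are hypothetical stand-ins for the toy's network, sound, movement, and sensor features.

```python
# Sketch of the FIG. 3 macro logic; the helper callables passed in are
# hypothetical stand-ins for network, sound, movement, and sensor features.
import time


def run_morning_weather_macro(query_temperature, speak, animate, wait_for_motion,
                              timeout_s=2 * 60 * 60):
    temp = query_temperature()                 # step 320: numeric query result "TEMP"
    if temp < 55:                              # step 332
        def action():
            animate("mouth", "eyes", "arms")
            speak("Remember your jacket!")
    elif temp < 70:                            # step 334: no action between 55 and 70
        return
    else:                                      # step 336: TEMP >= 70
        def action():
            animate("mouth", "eyes")
            speak("You won't need a coat today!")

    deadline = time.monotonic() + timeout_s    # step 348 may time out (e.g. two hours)
    while time.monotonic() < deadline:         # wait for the motion return trigger
        if wait_for_motion(poll_interval_s=1.0):
            action()                           # step 350: perform the chosen action
            return
```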

FIG. 4 shows a method for executing a macro in a system such as the system shown in FIG. 1 having a plush toy 100 with a memory 135 and a processor 145. In the system the plush toy 100 is in wireless data communication with a network 160 and may access information resources such as information resources 170, 180, or 190 over the network 160. The first step 400 is for the plush toy 100 to receive the macro 130 that was programmed by the mobile device 150 and store the macro 130 in memory 135. In step 410 macro programming 130 then begins executing by instructing processor 145 to perform an Internet query of an information resource 170, 180, or 190 over a network 160 using the defined search query. In step 420 the results of the query are received at the processor 145. In step 430 the processor 145 compares the query results to predetermined results conditionals specified in the macro programming 130. Based on the results conditionals, the processor 145 determines actions to perform at the plush toy 100 at step 440. In step 450 the plush toy 100 waits for one or more events determined by macro 130 indicating when to perform the actions. At step 460 the processor 145 determines that the conditionals have been satisfied, and at step 470 the plush toy 100 performs the actions. As explained above, it is possible for no results conditionals to be defined in the macro programming 130, in which case the actions will be performed in step 470 immediately after determining the actions to be performed in step 440, thereby skipping steps 450 and 460.
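
A compact sketch of this execution flow, assuming the illustrative JSON macro format sketched earlier and hypothetical helper hooks on the toy, follows.

```python
# Sketch of the FIG. 4 execution flow on the toy; perform_query(), wait_for(),
# and do_actions() are hypothetical hooks, and the macro schema is an assumption.
import json
import operator

_OPS = {"<": operator.lt, "<=": operator.le, ">": operator.gt, ">=": operator.ge}


def execute_macro(wire_bytes: bytes, perform_query, wait_for, do_actions):
    macro = json.loads(wire_bytes)                            # step 400: stored macro
    result = perform_query(macro["resource"], macro["query"]) # steps 410-420
    actions = []
    for rule in macro.get("rules", []):                       # step 430: apply conditionals
        if _OPS[rule["op"]](result, rule["threshold"]):
            actions = rule["actions"]                         # step 440: chosen actions
            break
    if not actions:                                           # no rule matched: do nothing
        return
    return_trigger = macro.get("return_trigger")
    if return_trigger:                                        # steps 450-460: optional wait
        wait_for(return_trigger)
    do_actions(actions)                                       # step 470: perform actions
```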

FIG. 5 shows an exemplary plush toy to be used with the methods and system described herein. Plush toy 500 comprises a processor 510 that controls the operations of plush toy 500. A wireless interface 550 connects plush toy 500 to a network such as network 160 and a smartphone such as smartphone 150 of FIG. 1. Wireless interface 550 may be one or more of Bluetooth, Wi-Fi, cellular GSM, cellular CDMA, or other wireless data communication interfaces.

A memory 511 is operably connected to processor 510, and stores macros 512 and general instructions 513 for the operation of the plush toy 500. In the preferred embodiment a macro is created on a mobile electronic device and sent via a wireless signal to be stored in the memory 511 of plush toy 500. The macros 512 may be macros programmed according to FIG. 3. Multiple macros 512 may be programmed and transmitted to the plush toy 500 via wireless interface 550. It is contemplated that plush toy 500 would run more than one macro at a time. Allowing the toy 500 to perform multiple actions at various times in response to a number of different stimuli will make the toy 500 more interactive and lifelike.

Processor 510 is responsible for controlling motion and producing sound in plush toy 500 according to the general instructions 513 stored in memory 511. Processor 510 also receives input from sensors 530 and uses the sensor data in connection with macros 512. Sensors 530 of the plush toy 500 may include one or more of a motion sensor, light sensor, clock, sound sensor, accelerometer, or other types of known sensors. Plush toy 500 may have a speaker and amplifier 524 for generating sound. Text-to-speech functions 540 may be used to produce sound. As is true with the other capabilities of the plush toy 500, text-to-speech functionality 540 may be implemented in software as part of the general instructions 513, or may be implemented as hardware, such as in a special purpose processor designed to convert text data to audible speech. Pre-recorded sound may be stored in memory 511 and be projected through speaker 524.

Plush toy 500 has motion actuators including arm actuators 520, leg actuators 521, eyes actuator 522, and mouth actuator 523, which produce movement and make plush toy 500 appear life-like. Motion actuators of plush toy 500 may include small, rotating motors connected to gears, pulleys, cams, or levers. Servo or stepper motors connected to the processor 510 may drive motion in actuators 520-523. Processor 510 may control mouth actuator 523 and sound generator 540 to synchronize speech sounds with mouth motion of plush toy 500. The movement features of plush toy 500 may be implemented in a number of different ways that are known to one skilled in the art.
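
As one way to picture the mouth/speech synchronization mentioned above, a rough sketch follows that toggles a mouth actuator while a phrase plays; start_speech() and set_mouth() are hypothetical sound and actuator hooks.

```python
# Rough sketch of lip-synching the mouth actuator with speech playback;
# start_speech() and set_mouth() are hypothetical audio and actuator hooks.
import time


def lip_synch(phrase: str, start_speech, set_mouth, step_s: float = 0.18):
    duration_s = start_speech(phrase)    # begin playback; assume it returns length in seconds
    end = time.monotonic() + duration_s
    mouth_open = True
    while time.monotonic() < end:        # alternate open/closed while the audio plays
        set_mouth("open" if mouth_open else "closed")
        mouth_open = not mouth_open
        time.sleep(step_s)
    set_mouth("closed")                  # return to rest position when speech ends
```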

FIG. 6 shows an exemplary embodiment of a smartphone or mobile application for creating a macro program to be used with a toy such as plush toy 500. A user interface 600 for a plush toy app 601 has features for programming a macro and sending the macro 512 via a wireless connection to the plush toy 500. In the preferred embodiment, the smartphone has a touch-sensitive screen for interacting with user interface 600, making plush toy app 601 very easy to use. The user interface 600 shows an edit window 610 containing various modifiable fields. In a preferred embodiment, the plush toy app 601 has predefined macro templates to simplify macro creation. Temperature macro 620 is an example of such a template. A user is able to edit certain functional elements of the macro 620 without having to understand the underlying computer programming.

In one embodiment, macro templates are created to ease the creation of common macro types. For example, the macro 620 of FIG. 6 could be created based upon a weather macro template that simplifies the creation of a macro for the plush toy based on the weather conditions in a location. In this example, the macro queries the outdoor temperature at a location. The macro 620 allows a user to edit the search query data field 621. The search query data field 621 defines the instructions that are used to search weather information such as weather information resource 170 of FIG. 1 over a wireless network 160. Search data field 621 is shown searching for the temperature in degrees Fahrenheit, but the data field 621 could also search other weather attributes such as precipitation forecasts and pollen count. This field could be edited by clicking on edit search button 611 in the edit window 610 portion of user interface 600. The application would then allow a user to change one or more of the variables in the search field 621, such as the desired weather condition or the relevant geographic location.

By choosing edit run settings button 612 the user may change the conditions under which the macro 620 will perform its functions. Run settings field 622 indicates that the macro will be executed when the clock within plush toy 500 registers 7:00 AM. The run settings field 622 could support other programming instructions. For example, the macro 620 could be programmed to run at different times on different days, or to run in response to a voice command, a motion detection, a light detection, or other type of stimulus internal or external to plush toy 500. In the preferred embodiment the macro programming 620 causes processor 510 to monitor one or more sensors 530 (including the clock) to determine when to run a macro program 620.

Edit results options button 613 allows a user to change the results options field 623 for macro 620. In the embodiment of FIG. 6, the search query results are differentiated by numerical temperature. In a first case, the results option field 623 indicates that if the temperature is below 55 degrees the macro 620 will cause the processor 510 to perform a first set of actions. If the temperature is above 70 degrees the processor 510 will perform another set of actions. In an alternate scenario the macro 620 could be programmed to search for other information, for example, precipitation information. In that case the results options 623 would be differentiated by a different variable. For example, the search query may query a weather information resource and return results indicating a percent chance of precipitation. In this case the results option 623 could instruct the plush toy 500 to perform a particular action if the chance of rain is below twenty percent, and perform another action if the chance of precipitation is fifty percent or higher.
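
For the precipitation variant just described, an editable results-options template might be sketched as follows; the structure and field names are assumptions for illustration, consistent with the earlier macro sketch.

```python
# Sketch of the precipitation variant of the results options; the template
# structure is an illustrative assumption, matching the earlier macro sketch.
import operator

_OPS = {"<": operator.lt, "<=": operator.le, ">": operator.gt, ">=": operator.ge}

PRECIPITATION_TEMPLATE = {
    "search": {"resource": "weather", "attribute": "precip_chance_pct",
               "location": "55423"},
    "run_settings": {"type": "time_of_day", "at": "07:00"},
    "results_options": [
        {"op": "<", "threshold": 20, "actions": ["say:Leave the umbrella at home."]},
        {"op": ">=", "threshold": 50, "actions": ["say:Better take an umbrella today!"]},
        # results from 20 to 49 percent fall through: no action
    ],
    "return_options": {"type": "motion_detected"},
}


def pick_actions(template: dict, result: float) -> list:
    """Apply the results options to a numeric query result."""
    for option in template["results_options"]:
        if _OPS[option["op"]](result, option["threshold"]):
            return option["actions"]
    return []
```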

Edit sounds button 614 and edit movement button 615 allow a user to program the actions that the plush toy 500 will perform after receiving results of a query. In the exemplary embodiment the data field 624 shows that the plush toy 500 will use text-to-speech features to play back the phrase “you won't need a coat today.” The data field 624 specifically shows the action to take when the weather query results are above 70 degrees. The plush toy app 601 would also allow the user to choose sounds when the weather query results indicate that the temperature is below 55 degrees. Movement instructions are given in data field 625. In the example shown, when the temperature is above 70 degrees the processor 510 of plush toy 500 will cause the arm actuators 520 to wave, and will cause the mouth actuator 523 to “lip synch” along with the text-to-speech sounds projected by speaker and amplifier 524. Plush toy app 601 would also allow the user to program motions for the plush toy 500 to perform when the results show that the temperature is below 55 degrees.

Edit return options button 616 allows the user to edit the conditions under which plush toy 500 will perform the sound 624 and movement 625 actions. In the example of FIG. 6, return options field 626 instructs the plush toy 500 to signal sensors 530 of FIG. 5 to activate a motion sensor and wait until the motion sensor senses a movement before returning the results and performing the specified actions 624, 625. Other sensors could be employed, such as a light sensor, a speech sensor, an accelerometer, a clock, etc. The user could also choose that the results be returned immediately, without waiting for a sensor 530.

When the programming for macro 620 is complete, the plush toy app 601 will wirelessly send the macro program to plush toy 500 when the user selects the “send to my toy” button 680. The plush toy app 601 may send the macro directly to plush toy 500 via a wireless interface using a local wireless signal, for example through a Bluetooth connection. Plush toy app 601 may also route the macro programming 620 through a network such as the Internet. In this case, plush toy 500 would receive the wireless signals via a wireless Internet connection. When plush toy app 601 uses the Internet to communicate with plush toy 500, a smartphone does not need to be in close proximity to plush toy 500 to create and send the macro program 620.

It is possible to use the app 601 to program multiple plush toys 500. In this case, the send to my toy button 680 would require the user to identify which toy is being programmed by this macro. In a multi-toy environment, it is possible to program interactions between toys 500. These interactions usually require that one toy 500 perform an action that is sensed by the other toy 500. For instance, a first toy could be programmed as described above, to query the outdoor temperature at 7:00 am, wait for movement in the room, and then speak the words “You won't need a coat today.” The other toy could also be programmed so that at 7:00 am it will query the predicted temperature tomorrow, and may discover that the high tomorrow will be 92 degrees. The second toy would wait for the first toy to state its line (such as by waiting for a sound sensor to hear the words, or waiting until the second toy senses movement, then senses sound, and then waits 5 seconds). When this occurs, the second toy then states “and tomorrow looks like a hot one. It will be over ninety degrees tomorrow.” To improve interaction between toys 500, toys could include the ability to trigger one another, such as through unique sounds, or even wireless digital connections operating between the wireless interfaces 550 of each toy 500. Macros could be programmed to initiate upon receipt of a signal from a companion toy 500, cause an action to be performed, and then send a return signal to the companion toy 500 to trigger an additional macro at the companion toy. By combining this type of interaction with the ability to query external data sources, complicated and informative interactions between toys could be developed.
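
A hedged sketch of the companion toy's side of such an exchange is shown below; wait_for_companion_cue(), speak(), and send_signal() are hypothetical sensor, sound, and wireless hooks.

```python
# Sketch of the second (companion) toy's side of the interaction; the cue,
# speech, and signaling hooks passed in are hypothetical.
def companion_reply(tomorrow_high_f: float, wait_for_companion_cue, speak, send_signal):
    # Wait for the first toy's cue: a wireless trigger signal, or a sensed
    # movement-then-sound pattern followed by a short delay, as described above.
    if not wait_for_companion_cue(timeout_s=30):
        return
    if tomorrow_high_f >= 90:
        speak("And tomorrow looks like a hot one. "
              "It will be over ninety degrees tomorrow.")
    send_signal("companion", event="reply_spoken")  # may trigger a further macro
```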

The many features and advantages of the invention are apparent from the above description. Numerous modifications and variations will readily occur to those skilled in the art. For example, the plush toy app could be implemented as a control panel attached to the plush toy. Since such modifications are possible, the invention is not to be limited to the exact construction and operation illustrated and described. Rather, the present invention should be limited only by the following claims.

Claims

1. A system for controlling an electronic toy, comprising:

a) a mobile electronic device having a device processor, a device wireless communication interface in data communication with the electronic toy over a wireless data connection, a tangible, non-transitory device memory, and a mobile application residing on the device memory, the mobile application including macro instructions transmitted wirelessly to the toy over the wireless data connection; and
b) the electronic toy having i) a toy processor, ii) a tangible, non-transitory toy memory, iii) a toy wireless communication interface in data communication with the mobile electronic device over the wireless data connection, iv) a processor-controlled apparatus selected from a set comprising a sound generator and a motion actuator, and v) the macro instructions received from the mobile electronic device via the wireless data connection and stored in the toy memory, the macro instructions configured to cause the toy to (1) perform a query of an external information resource over the toy wireless communication interface, (2) receive a query result, (3) apply a conditional rule to the query result, and (4) cause the processor-controlled apparatus to perform an action indicated by the conditional rule as applied to the query result.

2. The system of claim 1, wherein the information resource is a weather information resource and the query includes a request for at least one of temperature, precipitation, UV index, and pollen count information.

3. The system of claim 1, wherein the performed action is generating sound on the sound generator located within the toy, the sound being at least one of recorded sound and text-to-speech sound.

4. The system of claim 1, wherein the action performed is activating the motion actuator of the toy.

5. (canceled)

6. (canceled)

7. The system of claim 1, wherein the query result is a numeric value and the conditional rule differentiates the query result by comparing the numeric value to a threshold value.

8. The system of claim 1, wherein the toy further comprises a sensor, and the macro instructions include instructions to monitor the sensor and send the query after the sensor makes a detection.

9. The system of claim 8, wherein the sensor is one of a motion sensor, a sound sensor, a light sensor, a voice sensor, a clock, and an accelerometer.

10. The system of claim 1, wherein the toy further comprises a sensor, and the macro instructions include instructions to monitor the sensor after receiving the query result and perform the action after the sensor makes a detection.

11. A method for controlling a processor-controlled interactive toy via a mobile device, the mobile device having a processor, a wireless network interface, a user interface, and a tangible, non-transitory memory, the method comprising:

a) creating a macro, at the mobile device, by selecting, through the user interface, i) a search query for performing a query of an information source on a remote network, ii) a conditional rule to apply to a result of the query, iii) a first action for the toy to perform, the action including at least one of generating sound at a sound generator of the toy and activating a motion actuator of the toy; and
b) transmitting the macro from the mobile device to the toy via the wireless network interface of the mobile device to be stored in a tangible, non-transient memory of the toy;
wherein the conditional rule provides instructions for the toy to perform the first action if the query result satisfies a first condition.

12. (canceled)

13. The method of claim 11, wherein the conditional rule provides instructions to perform a second action different from the first action if the query result does not satisfy the first condition.

14. The method of claim 11, wherein step a) further comprises selecting a trigger condition on which to initiate the query of the information resource.

15. The method of claim 14, wherein step a) further comprises selecting a trigger condition on which to initiate performing the action.

16. The method of claim 11, wherein the information source is one of a weather information source, a news information source, and a social media information source.

17. A method for executing a macro by an electronic toy having a processor, a non-transitory memory, and a wireless interface connected to a wireless network, the method comprising:

a) receiving macro instructions over the wireless network;
b) storing the macro instructions in the memory of the electronic toy;
c) executing the macro instructions after step b), the macro instructions including instructions to: i) query a remote information resource via the wireless interface, ii) receive a result of the query, iii) apply a conditional rule to the query result, the conditional rule providing instructions to perform an action if the query result meets a first condition; and
d) performing the action in response to executing the macro instructions;
wherein the action is one of generating sound at a sound generator of the toy and activating a motion actuator of the electronic toy.

18. The method of claim 17, wherein the macro instructions further include instructions to query the remote information resource after a trigger event.

19. The method of claim 17, wherein the query result is a numeric value and the conditional rule includes comparing the numeric value to a threshold value.

20. The method of claim 17, wherein the motion actuator is one of a leg actuator, an arm actuator, an eye actuator, and a mouth actuator.

21. The method of claim 17, wherein the information resource is one of a weather resource, a news resource, and a social media resource.

22. The system of claim 1, wherein the conditional rule is a rule to differentiate the query result using a Boolean yes/no condition.

23. The system of claim 1, wherein the conditional rule is a rule to differentiate the query result based on predetermined keywords.

24. The method of claim 11, further comprising the steps of:

c) storing the macro in the memory of the toy;
d) performing the query of the information source on the remote network via a wireless interface within the toy;
e) applying the conditional rule to the result of the query; and
f) performing the first action, by the toy.
Patent History
Publication number: 20140038489
Type: Application
Filed: Aug 6, 2012
Publication Date: Feb 6, 2014
Applicant: BBY SOLUTIONS (Richfield, MN)
Inventors: Anshuman Sharma (St. Louis Park, MN), Patrick McGinnis (Chanhassen, MN), Todd Coate (Minneapolis, MN), Newton Guillen (Plymouth, MN)
Application Number: 13/567,490
Classifications
Current U.S. Class: Having Light- or Sound-responsive Switch or Control (446/175); Having Sounding Means (446/297)
International Classification: A63H 30/02 (20060101); A63H 3/28 (20060101);