Dynamic App Programming Environment with Physical Object Interaction

An app programming environment that enables a smartphone or tablet app to incorporate real toys or objects, using an NFC chip located in the toy or object and a database of information regarding that toy or object.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims priority to Provisional Application No. 62/044,310, filed Aug. 31, 2014, which is incorporated herein by reference.

BACKGROUND

Many children's games and toys, including educational games and toys, incorporate tablets and smartphones. However, a smartphone or tablet offers only a virtual environment rather than real objects. Since children are concrete, tactile thinkers, a virtual environment is often insufficient to hold their interest or to allow them to learn effectively.

While some games and toys exist that incorporate real objects into virtual environments, such games and toys are typically limited to particular types of physical objects (e.g., alphabet blocks) and do not provide for unbounded imaginative play.

A need therefore exists for a way for a child to incorporate real toys into a tablet- or smartphone-based game, and for a programming environment that allows app developers to incorporate real toys into their apps in an easy and standardized way.

A need also exists for a tablet- or smartphone-based virtual environment that responds to the manipulation of real toys and that is open-ended and responsive to the child's actions, rather than scripted.

SUMMARY OF THE INVENTION

The system of the present invention comprises: at least one physical manipulative; a computing device comprising a display, a user interface, an identification module for identifying the physical manipulative, and a communication module for communicating with a server; and a server comprising a database that lists the types of physical manipulatives used in the game and at least one descriptive attribute for each physical manipulative. The computing device is configured to identify the physical manipulative, look up at least one descriptive attribute associated with it, and display an animation related to that attribute.

The physical manipulative can be an animal figure, a cartoon character figure, a doll, an action figure, a vehicle, or any other toy or shape.

The descriptive attribute of the physical manipulative can be any or all of the following: personality, habits, sounds, phrases, geographic origin, size, diet, spelling of the name, pronunciation of the name, or one or more Internet links.

The identification module can be a camera, an NFC reader, a QR reader, a bar code reader, an RF receiver, a sound detection device, or any combination of the above.

In an embodiment, the computing device can also detect a motion pattern of the physical manipulative (e.g., a horse being moved in a “galloping” motion) and display an animation relating to the motion pattern.

In an embodiment, the computing device can also detect an orientation of the physical manipulative (e.g., upside-down or right-side-up) and display an animation relating to the orientation.

In an embodiment, the system comprises at least two physical manipulatives, and the computing device can also detect the absolute position and orientation of each physical manipulative, as well as their position relative to each other, and display an animation relating to the interaction between the physical manipulatives.

In an embodiment, the system further comprises a sound input device, and the computing device can detect a sound received by the sound input device, look up the sound in the database, determine the physical manipulative associated with the sound, and display an animation relating to the physical manipulative and the sound.

In an embodiment, the system can detect and respond to voice commands.

In an embodiment, the physical manipulative emits sounds and the computing device can detect the sounds emitted by the physical manipulative and display an animation relating to the physical manipulative. In an embodiment, the physical manipulative emits sounds only when it is squeezed, pressed, or moved. The energy used to make the physical manipulative emit the sound may be generated by the action of squeezing, pressing, or moving it.

The method of the present invention comprises detecting the presence of at least one physical manipulative near a computing device, identifying the at least one physical manipulative, using the computing device to look up at least one descriptive attribute associated with the at least one physical manipulative, and using the computing device to display an animation relating to the at least one descriptive attribute.

In an embodiment, the method may also comprise detecting a motion pattern of the physical manipulative near the computing device, and using the computing device to display an animation relating to the at least one descriptive attribute and the motion pattern.

In an embodiment, the method may also comprise detecting the presence of a second physical manipulative near the computing device, using the computing device to look up at least one second descriptive attribute associated with the second physical manipulative, and using the computing device to display an animation relating to both descriptive attributes.

In an embodiment, the method may also comprise detecting the presence of a sound near the computing device, using the computing device to look up a physical manipulative associated with the sound and at least one descriptive attribute associated with the physical manipulative, and using the computing device to display an animation relating to the at least one descriptive attribute.

In an embodiment, the sound is emitted by the physical manipulative itself.

The descriptive attribute can be at least one of the following: personality, habits, sounds, phrases, geographic origin, size, diet, spelling of a name, pronunciation of a name, or a link to the Internet.

The at least one physical manipulative can be an animal figure, a cartoon character figure, a doll, an action figure, or a vehicle.

The step of identifying the at least one physical manipulative may be performed by a QR reader, a bar code reader, a camera, an NFC reader, an RF receiver, or a sound detection device.

LIST OF FIGURES

FIG. 1 shows a diagram of an embodiment of the system of the present invention.

FIG. 2 shows a diagram of an alternate embodiment of the system of the present invention.

FIG. 3 shows a diagram of the operation of the software side of the present invention.

FIG. 4A shows a diagram of the operation of the present invention when a new toy is introduced.

FIG. 4B shows a diagram of the operation of the present invention when a toy is removed.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

In the preferred embodiment, the system of the present invention comprises a tablet or other mobile device such as a smartphone (hereafter referred to as a “tablet”), and at least one physical toy incorporating an NFC chip or a similar chip for storing information. The tablet comprises a means of reading the NFC chip (or similar chip; throughout this disclosure, it will be understood that any chip that can store information and can be read by a tablet can be used as an alternative to NFC). The tablet preferably also comprises a gaming or educational app that provides an interactive experience for a child.

FIG. 1 shows an example of the system of the present invention. Physical toy 300 comprises an NFC chip 330 that encodes various information about the toy. The NFC chip communicates with a tablet 310, which reads the information on the chip and uses it to determine which toy is in front of the tablet. The tablet then communicates with a server 200 to access information relating to the toy: the animal represented by the toy, the animal's habits, the noises it makes, links to websites related to the animal, and so on. The server 200 preferably comprises a database that stores any such descriptive attributes. The descriptive attributes are then used to display animations within the app, to perform various actions within the app, to add characters to a game app, to display information in an educational app, and so on.
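A minimal sketch of this identify-and-look-up flow, in Python, is shown below. The data structure, function names, and sample entries are illustrative assumptions; the disclosure does not prescribe any particular implementation.

```python
# Sketch of the FIG. 1 flow: read a toy ID from an NFC chip, look up the
# toy's descriptive attributes, and trigger an animation in the app.
# All names and sample data here are hypothetical.

TOY_DATABASE = {
    # toy_id (as encoded on the NFC chip): descriptive attributes
    "nfc:lion-001": {"name": "lion", "diet": "carnivore",
                     "animation": "lion_idle.anim"},
    "nfc:giraffe-001": {"name": "giraffe", "diet": "herbivore",
                        "animation": "giraffe_idle.anim"},
}

def identify_toy(nfc_payload: str) -> str:
    """Map the raw NFC chip contents (chip 330) to a toy ID."""
    return nfc_payload.strip().lower()

def look_up_attributes(toy_id: str) -> dict:
    """Fetch the toy's descriptive attributes (server 200 in FIG. 1)."""
    return TOY_DATABASE.get(toy_id, {})

def display_animation(attributes: dict) -> None:
    """Stand-in for the app's rendering layer (tablet 310)."""
    if attributes:
        print(f"Playing {attributes['animation']} for the {attributes['name']}")
    else:
        print("Unknown toy; nothing to display")

# Example: a lion figure is placed near the tablet.
display_animation(look_up_attributes(identify_toy("NFC:LION-001")))
```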

FIG. 2 shows an alternate embodiment of the system of the present invention, wherein the physical toy does not communicate directly with the computing device but rather with a short-range-to-long-range router/converter 420. Note that the server is not shown in this Figure but is nevertheless present. In this embodiment, the router/converter 420 communicates with the toy 400 via NFC link 430, and then communicates with the tablet or other computing device via a longer-range connection 440 (such as Bluetooth, Wi-Fi, or any other wireless link). In this embodiment, the toy does not need to be as close to the tablet in order to be recognized.

It will be understood that while the preferred embodiment uses NFC chips, any other communication method is also acceptable. The toy itself may communicate with the computing device via Bluetooth or Wi-Fi. However, using NFC communication reduces the cost and complexity of the system.

As shown in FIGS. 1 and 2, the toy may also comprise a sound generator 340. In the preferred embodiment, the sound generator requires no external power or batteries: it is either a mechanical sound generator, such as a squeaker or a rattle, or an electromechanical sound generator whose power is supplied by the user's handling of the toy (e.g., squeezing the toy or moving it around). The sound generator preferably serves two functions: providing an additional entertainment experience for the child, and serving as an additional way for the tablet 310 to identify the toy 300 (assuming that each sound is unique to each toy). In an embodiment, the sound may also trigger the tablet to display a special animation, play a special sound, or perform some other action within a game.

The app programming environment of the present invention is shown in FIG. 3. The command center comprises a relational database containing an entry for each physical object that is embedded with an NFC chip, along with descriptive attributes of each such object. For example, the descriptive attributes can be the name of the toy, the dimensions of the physical toy, the spelling and pronunciation of its name, the type of toy it is (animal, cartoon character, vehicle, etc.), the geographic origin of the animal or character, various characteristics of the animal or character, and so on. This information is preferably stored in the cloud. The information may be customized to allow for different programming environments and different apps.
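One possible shape for such a relational database is sketched below using SQLite. The table and column names are hypothetical, as the disclosure does not specify a schema.

```python
# Hypothetical relational schema for the command center database:
# one row per NFC-tagged toy, plus a table of descriptive attributes.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE toys (
    toy_id    TEXT PRIMARY KEY,   -- value encoded on the NFC chip
    name      TEXT NOT NULL,
    toy_type  TEXT NOT NULL       -- animal, cartoon character, vehicle, ...
);
CREATE TABLE attributes (
    toy_id      TEXT REFERENCES toys(toy_id),
    attr_name   TEXT NOT NULL,    -- e.g. diet, geographic origin, sound
    attr_value  TEXT NOT NULL
);
""")
conn.execute("INSERT INTO toys VALUES ('nfc:giraffe-001', 'giraffe', 'animal')")
conn.executemany(
    "INSERT INTO attributes VALUES (?, ?, ?)",
    [("nfc:giraffe-001", "diet", "herbivore"),
     ("nfc:giraffe-001", "origin", "Africa"),
     ("nfc:giraffe-001", "sound", "hum")],
)

# An app queries all attributes for a toy sensed by the tablet:
rows = conn.execute(
    "SELECT attr_name, attr_value FROM attributes WHERE toy_id = ?",
    ("nfc:giraffe-001",),
).fetchall()
print(dict(rows))  # e.g. {'diet': 'herbivore', 'origin': 'Africa', 'sound': 'hum'}
```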

An app developer wishing to incorporate physical objects into an app can thus include the attributes of those objects in the app experience. When the tablet senses an NFC chip belonging to a particular toy, it can trigger the app to perform some action or to make some change.
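A developer-facing API for this trigger mechanism might resemble the following sketch. The `ToyEvents` class and its method names are assumptions for illustration, not an API defined by the disclosure.

```python
# Hypothetical developer-facing API: an app registers callbacks that fire
# when the tablet senses a toy's NFC chip.
class ToyEvents:
    def __init__(self):
        self._handlers = []

    def on_toy_detected(self, handler):
        """Register a callback receiving the toy's descriptive attributes."""
        self._handlers.append(handler)
        return handler

    def nfc_sensed(self, attributes: dict):
        """Called by the platform when an NFC chip is read."""
        for handler in self._handlers:
            handler(attributes)

events = ToyEvents()

@events.on_toy_detected
def add_character(attrs):
    print(f"Adding a {attrs['name']} character to the scene")

# Simulate the tablet sensing a giraffe doll's chip:
events.nfc_sensed({"name": "giraffe", "diet": "herbivore"})
```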

For example, if an app is running and the tablet senses a plastic doll of a lion and a plastic doll of a giraffe, the lion and giraffe 3D graphic files are pulled from the cloud and dynamically introduced into the app's activities. The main characters of the app can then change into lions and giraffes that match the dolls in color, size, and level of realism.

In another embodiment, the app can also pull characteristics of both animals or characters from the cloud and introduce them into the app's activities. The action of the app can then change to incorporate the interaction of a carnivore and a large herbivore, in the case of the lion and the giraffe, for example. The characteristics can include behavior, food habits, reactions to external factors, sounds made by the animal, and so on.
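The sketch below illustrates one way such characteristics could drive the on-screen interaction; the `diet` attribute and the rule set are illustrative assumptions.

```python
# Hypothetical rule for attribute-driven interactions: when two toys are
# present, their "diet" attributes determine the on-screen behavior.
def choose_interaction(a: dict, b: dict) -> str:
    diets = {a["diet"], b["diet"]}
    if diets == {"carnivore", "herbivore"}:
        return "chase"            # predator-and-prey animation
    if diets == {"herbivore"}:
        return "graze_together"
    return "idle"

lion = {"name": "lion", "diet": "carnivore"}
giraffe = {"name": "giraffe", "diet": "herbivore"}
print(choose_interaction(lion, giraffe))  # chase
```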

In an embodiment, the tablet senses not just the presence of a toy but also its movement patterns. For example, a child moving the toy in a “running” motion will make the corresponding animal “run” on the screen. A child moving the toy from right to left will make the corresponding animal move from right to left on the screen. In the preferred embodiment, the movement patterns of the toy are detected and identified by the tablet's built-in camera, or by another camera connected to the tablet.
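A simple heuristic classifier over camera-reported toy positions might look like the following; the thresholds and gesture names are illustrative assumptions, not part of the disclosure.

```python
# Illustrative motion-pattern classifier: the camera reports a series of
# (x, y) toy positions; simple heuristics map them to named gestures.
def classify_motion(positions: list[tuple[float, float]]) -> str:
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    dx = xs[-1] - xs[0]                 # net horizontal travel
    bounce = max(ys) - min(ys)          # vertical oscillation
    if bounce > 20 and abs(dx) > 50:
        return "running"                # fast, bouncy horizontal motion
    if dx < -50:
        return "move_left"
    if dx > 50:
        return "move_right"
    return "still"

# A child sweeps the toy across the frame while bobbing it up and down:
path = [(300, 100), (240, 130), (180, 95), (120, 135), (60, 100)]
print(classify_motion(path))  # running
```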

The present invention may be used with a wide range of apps. For example, game apps may use physical manipulatives to control game play; educational apps may use physical manipulatives and their movement patterns to deliver educational material to the child (e.g., a child can put a giraffe toy in front of the tablet and learn about giraffes from the app); and so on. The present invention is not meant to be limited to any particular app or type of app.

FIGS. 4A and 4B illustrate the operation of the system of the present invention when a new toy is added to or removed from the vicinity of the tablet. FIG. 4A shows what happens when a new toy is added. The tablet detects the change in physical manipulatives (in this case, the addition of the zebra) and looks up the characteristics associated with the zebra toy. The app running on the tablet then incorporates an animation based on the characteristics of the zebra toy.

Similarly, FIG. 4B shows what happens when a toy is removed. Once again, the tablet detects the change in physical manipulatives (in this case, the removal of the zebra), and removes the zebra animation from the app.
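This add/remove behavior amounts to comparing the set of currently sensed toys against the previous scan, as in the following sketch (all names illustrative):

```python
# Sketch of the FIG. 4A/4B behavior: compare the set of toys currently
# sensed with the previous scan and add or remove characters accordingly.
def update_scene(previous: set[str], current: set[str]) -> None:
    for toy in current - previous:
        print(f"Toy added: {toy} -> look up attributes, add its animation")
    for toy in previous - current:
        print(f"Toy removed: {toy} -> remove its animation from the app")

update_scene({"lion"}, {"lion", "zebra"})  # zebra added (FIG. 4A)
update_scene({"lion", "zebra"}, {"lion"})  # zebra removed (FIG. 4B)
```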

It must be noted that in the preferred embodiment, the app is written in such a way as to change the behavior of each individual character based on which other characters are present. However, the present invention is not limited to any particular app or type of app, and may encompass any type of change in what is displayed based on the addition, removal, or movement of a toy.

The relational database can be located in the cloud, or can be downloaded onto the tablet for ease of access. Different animals and characters can be added to the database by developers as new toys are added to the system.

While the preferred embodiment uses NFC chips to communicate with the toys, the app may also use the tablet's camera to analyze the toys visually. The app may use the camera to identify the toy (by shape, color, or a special marking such as a QR code on the toy), or may use the camera to identify the way the toy is moving or oriented, while using the NFC chip to identify the toy itself.

For example, in this embodiment, the app can identify a toy as a giraffe by its NFC chip. Then, the tablet camera can determine that the child is moving the giraffe in a running motion. The app can then show a running giraffe on the screen, and use the descriptive information regarding the giraffe to give the representation the proper gait as it runs.
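Combining the two channels might look like the following sketch, where the NFC chip supplies the toy's identity and the camera supplies the motion label; the lookup table is an illustrative assumption.

```python
# Sketch of combining the two input channels: the NFC chip names the toy,
# the camera names the motion, and the database supplies gait details.
# All file names and table entries are hypothetical.
GAITS = {
    ("giraffe", "running"): "giraffe_gallop_longstride.anim",
    ("lion", "running"): "lion_sprint.anim",
}

def pick_animation(toy_from_nfc: str, motion_from_camera: str) -> str:
    return GAITS.get((toy_from_nfc, motion_from_camera), "idle.anim")

print(pick_animation("giraffe", "running"))  # giraffe_gallop_longstride.anim
```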

In an embodiment, the system can also use the tablet's microphone to identify sounds or speech. This can be used to allow the child to make the appropriate animal noise for the animal toy and have the animal make the same noise on-screen, to allow the child to give instructions for what he or she wants to see happen on the tablet screen (e.g., “I want to see Spongebob riding a giraffe!”), and so on.

In a related embodiment, the toys can generate sound on their own. For example, a toy may be equipped with a squeaker or some other mechanical sound generating device, or an electrical sound generating device and some way to either store or generate power for the device. The toy may generate sound when squeezed, moved, or touched. In that embodiment, the tablet may identify the sound emitted by the toy and change the animation or make a sound when the toy emits that sound.
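One simple way the tablet could match a detected sound to a toy is by dominant pitch, as in the following sketch; the signature table and tolerance are illustrative assumptions.

```python
# Illustrative sound-matching step: each toy's built-in squeaker has a
# known dominant pitch, so a detected pitch can be mapped back to a toy.
SOUND_SIGNATURES = {
    880.0: "duck",   # Hz of the duck toy's squeaker (hypothetical)
    440.0: "lion",
}

def match_toy_by_sound(detected_hz: float, tolerance: float = 15.0):
    for pitch, toy in SOUND_SIGNATURES.items():
        if abs(detected_hz - pitch) <= tolerance:
            return toy
    return None  # no toy matched; ignore the sound

print(match_toy_by_sound(873.5))  # duck -> trigger a special animation
```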

Exemplary embodiments have been described above. It will be understood by a person of reasonable skill in the art that the invention encompasses other embodiments that are equivalent to the ones described above.

Claims

1. A system for interactive play with physical and virtual elements, comprising:

at least one physical manipulative;
a computing device comprising: a display; a user interface; an identification module for identifying the at least one physical manipulative; a communication module for communicating with a server;
a server comprising a database, said database comprising: a listing of physical manipulatives; for each physical manipulative, at least one descriptive attribute;
wherein the computing device is configured to: identify the physical manipulative; look up at least one descriptive attribute associated with the physical manipulative in the database; display or modify an animation related to the at least one descriptive attribute on the display.

2. The system of claim 1, wherein the at least one physical manipulative is one of the following group: an animal figure, a cartoon character figure, a doll, an action figure, a vehicle.

3. The system of claim 1, wherein the at least one descriptive attribute is at least one of the following group: personality, habits, sounds, phrases, geographic origin, size, diet, spelling of a name, pronunciation of a name, at least one Internet link.

4. The system of claim 1, wherein the identification module comprises at least one of the following: a camera, an NFC module, a QR reader, a bar code reader, an RF receiver, a sound detection device.

5. The system of claim 1, wherein the computing device is further configured to:

detect a motion pattern of the physical manipulative;
display or modify an animation relating to the motion pattern on the display.

6. The system of claim 1, wherein the physical manipulative emits a sound, wherein the identification module is configured to detect the sound and to identify the physical manipulative based on the sound.

7. The system of claim 6, wherein the physical manipulative emits the sound when at least one of the following actions is performed: the physical manipulative is squeezed, the physical manipulative is pressed, the physical manipulative is moved.

8. The system of claim 7, wherein the action generates energy that is used to make the physical manipulative emit the sound.

9. The system of claim 1, wherein the computing device is further configured to:

detect an orientation of the physical manipulative;
display an animation relating to the orientation on the display.

10. The system of claim 1, comprising a first physical manipulative and a second physical manipulative, wherein the computing device is further configured to:

detect one of the following: the relative position of the two physical manipulatives, the absolute position of each physical manipulative, the orientation of each physical manipulative;
display an animation relating to the interaction between the first physical manipulative and the second physical manipulative.

11. The system of claim 1, further comprising:

a sound input device;
wherein the computing device is further configured to: detect a sound received by the sound input device; look up the sound in the database; determine the physical manipulative associated with the sound; display an animation relating to the physical manipulative and to the sound.

12. The system of claim 11, wherein the computing device is further configured to:

detect a voice command received by the sound input device;
display an animation relating to the physical manipulative and the voice command.

13. The system of claim 11, wherein the physical manipulative comprises a sound generating device, wherein the computing device is further configured to:

detect sounds emitted by the physical manipulative;
display an animation relating to the physical manipulative.

14. A method of entertaining a child, comprising:

detecting the presence of at least one physical manipulative near a computing device;
identifying the at least one physical manipulative;
using the computing device to look up at least one descriptive attribute associated with the at least one physical manipulative;
using the computing device to display an animation relating to the at least one descriptive attribute.

15. The method of claim 14, further comprising:

detecting a motion pattern of the physical manipulative near the computing device;
using the computing device to display an animation relating to the at least one descriptive attribute and the motion pattern.

16. The method of claim 14, further comprising:

detecting the presence of a second physical manipulative near the computing device;
using the computing device to look up at least one second descriptive attribute associated with the second physical manipulative;
using the computing device to display an animation relating to the at least one descriptive attribute and the at least one second descriptive attribute.

17. The method of claim 14, further comprising:

detecting a sound near the computing device;
using the computing device to look up a physical manipulative associated with the sound;
using the computing device to look up at least one descriptive attribute associated with the physical manipulative;
using the computing device to display an animation relating to the at least one descriptive attribute.

18. The method of claim 17, wherein the sound is emitted by the physical manipulative.

19. The method of claim 14, wherein the at least one descriptive attribute is at least one of the following group: personality, habits, sounds, phrases, geographic origin, size, diet, spelling of a name, pronunciation of a name, a link to the Internet.

20. The method of claim 14, wherein the at least one physical manipulative is at least one of the following group: an animal figure, a cartoon character figure, a doll, an action figure, a vehicle.

21. The method of claim 14, wherein the step of identifying the at least one physical manipulative is performed by at least one of the following: a QR reader, a bar code reader, a camera, an NFC reader, an RF receiver, a sound detection device.

Patent History
Publication number: 20160184724
Type: Application
Filed: Aug 27, 2015
Publication Date: Jun 30, 2016
Inventors: Andrew Butler (Palo Alto, CA), Tom Boeckle (Las Vegas, NV), F Brian Iannce (San Jose, CA), Vivian Lee (Sunnyvale, CA)
Application Number: 14/838,307
Classifications
International Classification: A63H 29/22 (20060101);