Panoramic Coloring Kit

- Crayola LLC

A panoramic coloring kit for coloring a 360-degree coloring environment is provided. The panoramic coloring kit may include a coloring stylus, a stamper device, and a digital spyglass, each of which may be configured to interact with a touch-screen surface of a computing device. The kit may further include an application that, when executed by the computing device, generates a digital image of a panoramic view of the 360-degree coloring environment.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Application Ser. No. 61/788,371, filed Mar. 15, 2013, entitled “Digital Coloring Tools Kit With Panoramic View And Create-To-Destroy Interactive Features,” having Attorney Docket No. HALC.178959, and is related by subject matter to concurrently filed U.S. application Ser. No. ______, entitled “Digital Coloring Tools Kit With Dynamic Paint Palette,” having Attorney Docket No. HALC.204556, the contents of both of which are hereby incorporated by reference in their entirety.

SUMMARY

Embodiments of the invention are defined by the claims below, not this summary. A high-level overview of various aspects of the invention is provided here for that reason: to give an overview of the disclosure and to introduce a selection of concepts that are further described in the Detailed Description section below. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in isolation to determine the scope of the claimed subject matter.

In brief and at a high level, this disclosure describes, among other things, a digital coloring tools kit for creating a realistic painting experience for use with a touch-screen device. In further embodiments, the invention includes a panoramic coloring kit for coloring in a 360-degree digital coloring environment. In additional embodiments, the invention is directed to a create-to-destroy interactive kit for use with a touch-screen device.

DESCRIPTION OF THE DRAWINGS

Illustrative embodiments of the invention are described in detail below with reference to the attached drawing figures, wherein:

FIG. 1 is a perspective view of components of a create-to-destroy interactive kit for use with a touch-screen device, in accordance with an embodiment of the invention;

FIG. 2 is a top perspective view of a digital stamping tool, in accordance with an embodiment of the invention;

FIG. 3 is a bottom perspective view of the digital stamping tool of FIG. 2, in accordance with an embodiment of the invention;

FIG. 4 is a top perspective view of a digital paint palette 60, in accordance with an embodiment of the invention;

FIG. 5 is a bottom perspective view of the digital paint palette of FIG. 4, in accordance with an embodiment of the invention;

FIG. 6 is a perspective view of a single-touch-point brush, in accordance with an embodiment of the invention;

FIG. 7A is a top view of a digital paint palette, in accordance with an embodiment of the invention;

FIG. 7B is a top view of a digital paint palette, in accordance with an embodiment of the invention;

FIG. 7C is a top view of a digital paint palette, in accordance with an embodiment of the invention;

FIG. 7D is a top view of a digital paint palette, in accordance with an embodiment of the invention;

FIG. 8 is a perspective view of a digital coloring tools kit, in accordance with an embodiment of the invention;

FIG. 9 is a top view of a touch-screen device for implementing embodiments of a panoramic coloring kit, in accordance with an embodiment of the invention;

FIG. 10 is a perspective view of a 360-degree digital coloring environment generated in association with the panoramic coloring kit, in accordance with an embodiment of the invention;

FIG. 11A is a top perspective view of a spyglass tool, in accordance with an embodiment of the invention;

FIG. 11B is a bottom perspective view of the spyglass tool of FIG. 11A, in accordance with an embodiment of the invention;

FIG. 12 is a top view of a touch-screen device, including a spyglass tool, for implementing embodiments of a panoramic coloring kit, in accordance with an embodiment of the invention;

FIG. 13 is an exemplary computing system for executing an application in accordance with embodiments of the invention; and

FIG. 14 is an exemplary method including steps for providing an interactive coloring environment.

DETAILED DESCRIPTION

The subject matter of embodiments of the invention is described with specificity herein to meet statutory requirements. The description itself, however, is not intended to limit the scope of the claims. Rather, the claimed subject matter might be embodied in other ways to include different steps or combinations of steps similar to the ones described in this document, in conjunction with other present or future technologies. Terms should not be interpreted as implying any particular order among or between various steps herein disclosed unless and except when the order of individual steps is explicitly described.

In one embodiment, the present invention is directed to a panoramic coloring kit. The coloring kit may include an activation feature for a panoramic coloring kit application. The coloring kit may further include at least one of the following digital input devices: a stylus, a stamper, and a spyglass. The stylus, stamper, and spyglass may be configured to interact with a touch-screen surface of a computing device executing the panoramic coloring kit application.

Another embodiment of the present invention is directed to a method of providing an interactive panoramic coloring environment. The method may include presenting, on a touch-screen display of a computing device, an interactive panoramic coloring environment. The interactive panoramic coloring environment may be presented on the touch-screen display of the computing device in response to executing an application on the computing device. Presenting the interactive panoramic coloring environment may include generating a digital image comprising a panoramic view of the interactive coloring environment, where such digital image may be configured to display at least a portion of the interactive coloring environment corresponding to a manipulation of the computing device by a user.

In yet another embodiment, the present invention provides a panoramic coloring kit that includes an activation feature for an application that when executed by a computing device generates a digital image comprising a 360-degree view of a coloring environment. The panoramic coloring kit may further include at least one of the following: a single touch-point coloring stylus for at least one of selecting an item in the coloring environment and coloring an item in the coloring environment; a multiple touch-point stamper for adding new items to the coloring environment; and a multiple touch-point spyglass, including a spyglass frame that encloses a transparent viewing area. The single touch-point coloring stylus, multiple touch-point stamper, and multiple touch-point spyglass may be configured to interact with a touch-screen surface of the computing device.

With reference now to the figures, apparatus, methods, and systems for providing a panoramic coloring kit are described in accordance with embodiments of the invention. Various embodiments are described with respect to the figures in which like elements are depicted with like reference numerals.

Referring initially to FIG. 13, an exemplary operating environment 10 in which embodiments of the present invention may be implemented is described below to provide a general context for various aspects of the present invention. Exemplary operating environment 10 includes a computing device 12, which is but one example of a computing environment for use with the present invention. The computing device 12 is not intended to suggest any limitation as to the scope of use or functionality of embodiments of the invention, and should not be interpreted as having any dependency or requirement relating to any one component nor any combination of components illustrated. As one skilled in the art would recognize, one or more of the components of operating environment 10 may be used to execute an application associated with embodiments of the invention.

Computing device 12 may include hand-held devices, consumer electronics, general-purpose computers, more specialty computing devices, touch-pad computing devices, touch-screen computing devices, and the like. Embodiments of the invention may be described in the general context of computer code or machine-useable instructions, including computer-useable or computer-executable instructions such as program modules, being executed by the computing device 12. The computing device 12 typically includes a variety of computer-readable media, which may be any available media that is accessible by the computing device 12, such as computer storage media that stores computer-executable instructions for executing by the computing device 12. In one embodiment, computing device 12 is a touch-screen device having a camera, such as an iPod touch®, iPad®, and/or an iPhone® device provided by Apple® Inc.

As shown in the example of FIG. 13, the computing device 12 includes the following components: a memory 14, one or more processors 16, one or more presentation components 18, one or more input/output (I/O) ports 20, one or more I/O components 22, and an illustrative power supply 24. As will be understood, the components of exemplary computing device 12 may be used in connection with one or more embodiments of the invention, and may include fewer or additional components than those depicted in exemplary FIG. 13.

The memory 14 includes computer-storage media in the form of volatile and/or nonvolatile memory that may be removable, non-removable, or a combination thereof. The computing device 12 also includes one or more processors 16 that read data from various entities such as the memory 14 or the I/O components 22. The presentation component(s) 18 present data indications to a user or other device, such as a display device, speaker, printing component, vibrating component, and the like. The I/O ports 20 allow the computing device 12 to be logically coupled to other devices, while the I/O components 22 may include a microphone, joystick, game pad, satellite dish, scanner, printer, wireless device, and a controller, such as a stylus, a keyboard and a mouse, a natural user interface (NUI), and the like.

Turning now to FIG. 1, a create-to-destroy interactive kit 26 for use with a touch-screen device is described in accordance with an embodiment of the invention. As shown in FIG. 1, the interactive kit 26 may be a kit for use with a touch-screen device 38, which may resemble the computing device 12 of FIG. 13. The kit may include a tool case 28, a digital catapult 30 moveable into a compressed position 32, a digital character stamp 34, and a digital stacker stamp 36. The kit 26 may further include physical glyphs that may be attached to the digital catapult 30 for projection. In various embodiments of the interactive kit 26, a portion of the components depicted in FIG. 1 may be part of the interactive kit 26, while other components may be excluded from the kit. The tool case 28 provides a storage mechanism for one or more of the components of the interactive kit 26, such as the digital catapult 30, the digital character stamp 34, and the digital stacker stamp 36. In embodiments, the tool case 28 is a hard-shell storage case configured to couple to a computing device 12, such as a tool case 28 that clips onto the touch-screen device 38. Additionally, the tool case 28 may further serve as a stand for the computing device 38 used in conjunction with the kit 26. For example, the exterior of the tool case may include a slot configured to receive a portion of the computing device 38. In further embodiments, the tool case 28 may be opened and spread flat on a playing surface, such as a table or a floor, such that, when the slot receives a portion of the computing device 38, the tool case 28 provides a stable stand for the computing device 38. Based on an orientation of the computing device 38 when secured by the case/stand, a camera on the front surface of the computing device 38 may remain exposed to capture an image of the digital catapult 30 and/or glyph oriented in front of the computing device 38 and case/stand.

In embodiments, the digital catapult 30 is used to simulate the projection of objects towards a touch-screen device 38, such as computing device 12, which may include an iPad®. The touch-screen device 38 includes a display surface 40 where images generated by the interactive kit 26 may be displayed. The constructed image 42 may include elements generated using the components of the interactive kit 26. Accordingly, in some embodiments, a constructed image 42 includes a structure for a user to build and then disassemble. In embodiments, an application associated with the interactive kit 26 is executed by the touch-screen device 38 to display a constructed image 42 based on user interaction. Accordingly, the digital stacker stamp 36 may be used to add structure to the constructed image 42, such as adding bricks to a castle image based on contacting the digital stacker stamp 36 with the display surface 40.

Having generated a constructed image 42 by contacting the digital stacker stamp 36 to the display surface 40 and/or displaying a pre-defined constructed image 42 provided by the application executed by touch-screen device 38, a user may then “destroy” the constructed image 42. In embodiments, the user may aim the digital catapult 30 at the display surface 40 and, upon retracting and releasing it from the compressed position 32, alter at least a portion of the constructed image 42. For example, a digital catapult 30 may be aimed and released with respect to the display surface 40. The touch-screen device 38 may receive an indication that the digital catapult 30 was aimed and released, and generate a corresponding response for an image depicted on the display surface 40. For example, the digital catapult 30 may be used to simulate the throwing of bricks at a building, and the destruction of such building upon the detection of contact. In embodiments, a camera associated with the touch-screen device 38, such as a camera on a computing device 12, may detect the position and/or location of the digital catapult 30. In detecting the position and/or location of the digital catapult 30, the application may respond according to actions by the digital catapult 30, such as responding to the simulated “throwing” of items at the display surface 40.

During attack by the digital catapult 30, one or more physical glyphs attached to the digital catapult 30 may be used to indicate the catapult's location to the touch-screen device 38. As such, a physical glyph attached to the arm of the digital catapult 30 may be identified during an attack, and a user may attach different physical glyphs to the digital catapult 30 for projection. In this way, the glyphs may be used to indicate the aiming and distance of the digital catapult 30 with respect to the touch-screen device 38. In some embodiments, in determining the angle and distance of the digital catapult 30 from the touch-screen device 38, two small tag labels may be coupled to the front of the digital catapult 30 at a determined distance from each other. In embodiments, when the digital catapult 30 is placed in front of the camera of a touch-screen device 38, the application/software associated with the interactive kit 26 assigns a size to each of the tags. The interactive kit 26 then determines the angle of each of these tags with respect to the display surface 40, based at least in part on the proportion of each of the tags to each other. In further embodiments, the size proportion of the largest tag can also be used to determine the distance of the catapult from the touch-screen device 38, such as a distance from an iPad®. In one embodiment, LED lights on the digital catapult 30 may be used to determine user interactions using a camera of the computing device 12.
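
By way of a non-limiting illustration, the tag-based estimation described above might be sketched as follows, assuming a simple pinhole-camera model; the focal length, tag width, tag separation, and apparent tag widths below are illustrative assumptions rather than values taken from this disclosure.

```python
# Minimal sketch of estimating the catapult's distance and yaw angle from two
# front-mounted tags, assuming a simple pinhole-camera model. The focal length,
# physical tag size, and tag separation below are illustrative values only.

import math

FOCAL_LENGTH_PX = 1400.0      # assumed camera focal length, in pixels
TAG_WIDTH_CM = 2.0            # assumed physical width of each tag
TAG_SEPARATION_CM = 10.0      # assumed distance between the two tags

def tag_distance_cm(apparent_width_px: float) -> float:
    """Distance to a tag from its apparent width (pinhole model)."""
    return FOCAL_LENGTH_PX * TAG_WIDTH_CM / apparent_width_px

def catapult_pose(left_width_px: float, right_width_px: float):
    """Estimate distance (from the larger, closer tag) and yaw angle."""
    d_left = tag_distance_cm(left_width_px)
    d_right = tag_distance_cm(right_width_px)
    distance = min(d_left, d_right)   # the larger tag is the closer one
    # Yaw angle follows from the depth difference across the known separation.
    yaw = math.degrees(math.asin(
        max(-1.0, min(1.0, (d_right - d_left) / TAG_SEPARATION_CM))))
    return distance, yaw

if __name__ == "__main__":
    dist, yaw = catapult_pose(left_width_px=52.0, right_width_px=47.0)
    print(f"estimated distance: {dist:.1f} cm, yaw: {yaw:.1f} degrees")
```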

In further embodiments of the invention, the impact of an attack with the digital catapult 30 may depend upon the power applied by the digital catapult 30. In one embodiment, a mechanical clicker mechanism may be built into the digital catapult 30. For example, when the digital catapult 30 is pulled back to simulate the launch of a projectile, the number of clicks it takes to reach a final position (i.e., the number of clicks to reach the compressed position 32) may indicate how far the digital catapult 30 will launch its “payload.” In some embodiments, the application/software will compare the number of clicks to the distance from the computing device 12, based on the use of the tags discussed above, to determine whether the user under-shot the target, hit the target, or over-shot the target.
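
As a non-limiting illustration, the comparison of click count against measured distance might be sketched as follows; the range added per click and the tolerance are illustrative assumptions.

```python
# Minimal sketch comparing the number of catapult clicks (pull-back amount)
# against the measured distance to the screen to classify a launch. The
# click-to-range mapping and tolerance are illustrative assumptions.

CM_PER_CLICK = 25.0   # assumed launch range added per click

def classify_launch(clicks: int, distance_to_screen_cm: float,
                    tolerance_cm: float = 10.0) -> str:
    """Return 'under-shot', 'hit', or 'over-shot' for a simulated launch."""
    launch_range = clicks * CM_PER_CLICK
    if launch_range < distance_to_screen_cm - tolerance_cm:
        return "under-shot"
    if launch_range > distance_to_screen_cm + tolerance_cm:
        return "over-shot"
    return "hit"

if __name__ == "__main__":
    print(classify_launch(clicks=3, distance_to_screen_cm=80.0))   # hit
    print(classify_launch(clicks=1, distance_to_screen_cm=80.0))   # under-shot
```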

In another embodiment of the invention, a tag may be placed at the bottom of the digital catapult 30 arm, and a tag may be placed at an area where the “payload” would be placed (i.e., the simulated object being launched by the digital catapult 30). The application/software could then measure, for example, the amount of time that elapses from when the tag at the bottom of the arm is no longer seen to when the tag located near the payload is seen, to determine the amount of power generated by the throw. As such, the capture rate of the touch-screen camera may also be determined.
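
A minimal sketch of estimating throw power from the interval between camera frames in which the two tags are seen follows; the frame rate and the inverse-time power scale are illustrative assumptions.

```python
# Minimal sketch of estimating throw "power" from the interval between the
# moment the arm-base tag disappears and the moment the payload tag appears,
# using camera frame indices. The frame rate and scaling are assumptions.

def throw_power(frame_base_last_seen: int, frame_payload_first_seen: int,
                frames_per_second: float = 30.0) -> float:
    """Shorter swing intervals mean a faster arm and therefore more power."""
    elapsed_s = (frame_payload_first_seen - frame_base_last_seen) / frames_per_second
    if elapsed_s <= 0:
        raise ValueError("payload tag must appear after the base tag disappears")
    return 1.0 / elapsed_s   # arbitrary units: inverse of swing time

if __name__ == "__main__":
    print(f"power: {throw_power(100, 106):.2f}")  # 6 frames at 30 fps -> 5.00
```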

In some embodiments, the interactive kit 26 includes a device stand 44 for supporting the touch-screen device 38. As mentioned, the tool case 28 may be configured for use as a stand. In further embodiments, the digital character stamp 34 may be used to add animated images to the scene depicted on display surface 40. Accordingly, in some embodiments, the digital character stamp 34 may be recognized by the application executed on touch-screen device 38 based on contact of at least a portion of the digital character stamp 34 with the display surface 40. For example, based on contacting the touch-screen device 38 with the digital character stamp 34, an animated image may be added to the image including the constructed image 42, such as a fire-breathing dragon character being displayed based on contact of the digital character stamp 34 with a portion of the display surface 40.

In further embodiments, additional enhancements may be provided by the application executed by the touch-screen device 38 as part of the interactive kit 26. In one example, sound enhancements may be generated in response to a user's interaction with the application, such as an explosion sound being generated in response to a user projecting a virtual item with the digital catapult 30 towards the display surface 40 (i.e., “throwing” a simulated brick at the constructed image 42). In another embodiment, motion sensors of the touch-screen device 38 may detect motion of the device such as shaking or raising/lowering of the device with respect to a surface, and display a corresponding response. For example, a user may shake the touch-screen device 38 to simulate an earthquake, which may cause one or more images on the display surface 40 to be altered (e.g., a building collapsing, or one or more bricks stamped with the digital stacker stamp 36 shifting position).
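
By way of a non-limiting illustration, a shake gesture might be detected from accelerometer samples so that the application can trigger the earthquake-style response described above; the threshold and window values are illustrative assumptions.

```python
# Minimal sketch of detecting a "shake" gesture from accelerometer samples so
# the application can trigger an earthquake-style response. The sample data,
# threshold, and window length are illustrative assumptions.

import math

SHAKE_THRESHOLD_G = 1.8   # assumed acceleration magnitude that counts as a shake
MIN_SHAKE_SAMPLES = 3     # assumed number of strong samples needed in a window

def is_shaking(samples) -> bool:
    """samples: iterable of (x, y, z) accelerometer readings in g."""
    strong = sum(1 for x, y, z in samples
                 if math.sqrt(x * x + y * y + z * z) > SHAKE_THRESHOLD_G)
    return strong >= MIN_SHAKE_SAMPLES

if __name__ == "__main__":
    window = [(0.1, 0.0, 1.0), (1.9, 0.3, 1.1), (-2.1, 0.2, 0.9), (2.0, -0.4, 1.0)]
    if is_shaking(window):
        print("earthquake: shift or collapse stamped bricks")
```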

In embodiments of the invention, the digital stacker stamp 36 may be used to create buildings on the touch-screen device 38, such as a castle, fortress, or other desired structure. The digital stacker stamp 36 may also be used to assemble blocks and/or add doors and windows to a simulated structure presented on the display surface 40. In embodiments, the application may be used to simulate a gaming environment that tests the strength of a structure assembled using the digital stacker stamp 36. For example, a user may “build” a virtual castle with individual bricks imprinted using the digital stacker stamp 36, and then test the strength of their creation by launching virtual boulders at the touch-screen device 38 using digital catapult 30. In some embodiments, with the digital character stamp 34, a user may add additional enhancements to a scene displayed on display surface 40, such as adding a stationary or animated figure that interacts with the constructed image 42.

In some embodiments, the create-to-destroy interactive kit 26 may be used in conjunction with multiple touch-screen devices. As such, a user may interact with another user executing another instance of the same application to compete in the simulated environment of the interactive kit 26. Thus, a first user may create a constructed image 42 that is destroyed by a second user, and vice versa. In one embodiment, a user may save, email, print, or otherwise store a constructed image 42 that the user assembles, based at least in part on the addition of one or more “bricks” using the digital stacker stamp 36.

Embodiments of the interactive kit 26 include an application that provides a first tier of options available to a user upon purchase of the interactive kit 26, and a second tier of options available to the user upon “unlocking” a full mode of the interactive kit 26. In one example, a user may purchase the interactive kit 26 with an application that enables the user to access a “try me” mode of the product, and utilize a limited number of building materials, backgrounds, attack modes, etc. In another example, the user may “unlock” a full mode of the purchased interactive kit 26 to activate additional options with the digital stacker stamp 36, the digital character stamp 34, the digital catapult 30, and other features of the interactive kit 26 that may be limited and/or restricted based on which mode a user is executing.

In one embodiment of the invention, the interactive kit 26 generates a menu screen for presentation on the display surface 40. The menu screen may include multiple components for selection by a user, including a “start creating” indicator, a “start destroying” indicator, a “gallery” indicator, an “options” indicator, and a “More from Crayola” indicator. The “start creating” indicator may be selected to begin constructing a building, such as the constructed image 42. In embodiments, the selection of the “start creating” indicator presents a user with a selection screen to initiate a variety of beginning options, such as selecting a premade castle/building that the user may build upon and/or customize using a building tool. In another example, the selection screen provides a number of backgrounds or landscapes, such as three-dimensional backgrounds, for a user to view behind their constructed image 42.

In further embodiments of the “start creating” mode, the user may begin building a three-dimensional figure using bricks, gates, towers, walls, and the like. In some embodiments, a user selects an object to build with, and selects what material the object will be made out of. For example, a user may select from a menu of building materials, such as wood, stone, glass, brick, and/or mud. In embodiments of the invention, different building materials may react differently (i.e., produce a different result displayed to a user) to different “attacks” from a user (i.e., to destruction by a catapult, or other user intervention).

In some embodiments, the interactive kit 26 includes multiple create modes for a user to create a constructed image 42. In one embodiment, an open create mode includes features that allow a user to select a type of object, with the number of objects available to the user limited only by the size of the work area. In another embodiment, a challenge mode includes features that allow the application to determine a type and number of objects for use. For example, in the challenge mode, individual objects may be presented one at a time for the user to place in the work area (i.e., for the user to manipulate on display surface 40). In one example, the challenge mode requires a user to create a different structure each time, with a limited number of objects.

The “start destroying” indicator may be selected to direct the user to a pre-constructed image 42 for destruction by the user. In one example, the “start destroying” indicator may be selected to navigate the user to a gallery where the user can select a pre-made castle (provided by the application) or a saved design previously created and stored by the user. In embodiments of the invention, a user may “destroy” the constructed image 42 using manual manipulation of the touch-screen device 38, such as touching the display surface 40 with a finger, or virtual manipulation, such as projecting a catapult towards the screen. In embodiments, the user may attack in a “destroy” mode of the invention using different types of destruction techniques, such as a fireball, cannonball, ground shaker, lightning bolt, dinosaur, cow, etc. In one embodiment, tapping a surface of the touch-screen device 38 launches an attack as if it were coming from the digital catapult 30. Each attack by a user may affect each material differently, based on the type of attack and the type of material being affected. For example, a fireball that contacts wood may light the wood on fire but may have little effect on stone. In another example, a cannonball may put a hole in a wooden structure, but may crumble stone. In one embodiment, objects such as walls and towers are reduced to bricks when hit with an attack.
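
As a non-limiting illustration, the pairing of attack types with building materials might be implemented as a simple lookup, following the fireball and cannonball examples above; entries not named in the text, and the default result, are illustrative assumptions.

```python
# Minimal sketch of a lookup table pairing attack types with building materials
# to pick a displayed result, following the examples in the text (a fireball
# ignites wood but barely affects stone; a cannonball holes wood but crumbles
# stone). Entries not named in the text are illustrative assumptions.

EFFECTS = {
    ("fireball", "wood"):    "catch fire",
    ("fireball", "stone"):   "little effect",
    ("cannonball", "wood"):  "punch a hole",
    ("cannonball", "stone"): "crumble",
    ("ground shaker", "brick"): "reduce walls and towers to bricks",  # assumed
}

def attack_result(attack: str, material: str) -> str:
    return EFFECTS.get((attack, material), "no visible effect")  # assumed default

if __name__ == "__main__":
    print(attack_result("fireball", "wood"))     # catch fire
    print(attack_result("cannonball", "stone"))  # crumble
```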

In embodiments of the invention, a dinosaur attack mode may be executed by the user to cause one or more changes to the constructed image 42. For example, a user can select to attack a castle with a dinosaur character, or may simulate launching a dinosaur character onto a scene by “virtually” propelling it using the digital catapult 30. In embodiments, a dinosaur character may damage a constructed image 42 through movement of the dinosaur character's body, such as slashing of the dinosaur character's tail that knocks down bricks. In one embodiment, an “attack” by a dinosaur character may be initiated by a user, while in another embodiment, the application may initiate a dinosaur attack on the constructed image 42. In further embodiments, a dinosaur attack may last for a predetermined amount of time, such as a particular number of seconds. After the predetermined amount of time has passed, the digital image of the dinosaur character may be removed from view on the display surface 40, such as by running away.

In embodiments of the invention utilizing a game-playing mode, a variety of features and/or simulations may be generated by the application to enhance the user experience of interacting with the interactive kit 26. In embodiments, an amount of destruction of a constructed image 42 may be measured in points, with a number of points being assigned, for example, to the number of bricks knocked down by a user per attack. For example, in one attack, one brick knocked down by a user may generate one point, while in another attack that knocks down ten bricks, ten points may be generated. In one embodiment, a goal of the game may be to accumulate as many points as possible with the fewest number of attacks. Accordingly, in another embodiment, multiple players accessing the application may be allowed to take turns building and destroying each other's castles.
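
A minimal sketch of the scoring rule described above (one point per brick knocked down) follows, with an assumed points-per-attack efficiency metric as a possible tie-breaker.

```python
# Minimal sketch of the scoring rule described above: one point per brick
# knocked down, with the overall goal of maximizing points over as few attacks
# as possible. The efficiency metric is an illustrative assumption.

def score_attack(bricks_knocked_down: int) -> int:
    return bricks_knocked_down  # one point per brick

def efficiency(total_points: int, total_attacks: int) -> float:
    """Assumed tie-breaker: average points earned per attack."""
    return total_points / total_attacks if total_attacks else 0.0

if __name__ == "__main__":
    attacks = [1, 10, 4]                      # bricks knocked down per attack
    points = sum(score_attack(b) for b in attacks)
    print(points, efficiency(points, len(attacks)))   # 15 5.0
```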

Selection of the “gallery” indicator may provide a view of multiple scenes previously interacted with and saved by the user, including images of buildings constructed and/or destroyed using the digital stacker stamp 36, the digital character stamp 34, and the digital catapult 30. The “options” indicator may be selected to generate a menu of additional options for execution for a user, while the “More from Crayola” indicator may be selected to navigate a user to additional enhancements that may be accessed and/or purchased for use with the interactive kit 26.

In some embodiments, as part of the digital application provided with the interactive kit 26, a user may be able to purchase additional digital enhancements for the application (i.e., “in-app purchases”) that further enhance the user's experience, such as updating the interactive kit 26 with additional pre-built castles/buildings, adding additional building materials, adding additional attack modes, etc. In further embodiments of the invention, the application is adapted to build structures (buildings, castles, etc.) in both a two-dimensional and a three-dimensional mode for presentation to a user.

With reference now to FIGS. 2-3, embodiments of a digital coloring tools kit may include a digital stamper 46 that is used to add enhancements to an image displayed on a computing device, such as the computing device 12 of FIG. 13, which may include a touch-screen device, based on contacting a touch-screen surface of the computing device 12. The digital stamper 46 has a top surface 48 and a bottom surface 50. When the bottom surface 50 contacts the touch-screen of the computing device 12, one or more touch-points contact the surface, such as stationary touch-points 52 and 54, and translating touch-point 56 that travels inside a defined space 58. To determine which enhancements to generate based on contact with the digital stamper 46, an application associated with the digital coloring tools kit is configured to identify a particular orientation of the translating touch-point 56 relative to the two other stationary touch-points 52 and 54. In further embodiments of the invention, digital stamper 46 includes two touch-points detectable by a touch-screen device.
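
By way of a non-limiting illustration, the orientation of the translating touch-point relative to the stationary touch-points might be computed as follows; the angle computation and the sector-to-enhancement mapping are illustrative assumptions.

```python
# Minimal sketch of identifying a stamper setting from the position of the
# translating touch-point relative to the two stationary touch-points. The
# angle-to-enhancement mapping is an illustrative assumption.

import math

def translating_angle(stationary_a, stationary_b, translating) -> float:
    """Angle (degrees) of the translating point about the midpoint of the
    two stationary points, measured from the line joining them."""
    mid = ((stationary_a[0] + stationary_b[0]) / 2,
           (stationary_a[1] + stationary_b[1]) / 2)
    base = math.atan2(stationary_b[1] - stationary_a[1],
                      stationary_b[0] - stationary_a[0])
    ray = math.atan2(translating[1] - mid[1], translating[0] - mid[0])
    return math.degrees(ray - base) % 360

def enhancement_for(angle_deg: float) -> str:
    # Assumed mapping: each 90-degree sector selects a different stamp.
    sectors = ["bee stamp", "dragon stamp", "brick stamp", "flower stamp"]
    return sectors[int(angle_deg // 90) % 4]

if __name__ == "__main__":
    a, b, t = (0.0, 0.0), (4.0, 0.0), (2.0, 3.0)   # translating point above midpoint
    ang = translating_angle(a, b, t)
    print(f"{ang:.0f} degrees -> {enhancement_for(ang)}")   # 90 degrees -> dragon stamp
```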

As shown in FIGS. 4-5, embodiments of the digital coloring tools kit include a digital paint palette 60 having a top surface 62, a plurality of transparent openings 64, a bottom surface 66, and a plurality of digital touch-points 70. The plurality of transparent openings 64 may be viewed from the bottom surface 66 as a plurality of transparent openings 68. The plurality of digital touch-points 70 may be oriented in a particular configuration on the bottom surface 66 such that the identity of the digital paint palette 60 may be recognized by a touch-screen device, such as an iPad® computing device.
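
As a non-limiting illustration, recognizing which accessory rests on the screen might be done by comparing the pairwise distances between its detected touch-points against stored signatures; the signature values and tolerance are illustrative assumptions.

```python
# Minimal sketch of recognizing which accessory (stamper or paint palette) is
# resting on the screen by comparing the pairwise distances of its detected
# touch-points against known signatures. Signature values are illustrative.

from itertools import combinations
import math

SIGNATURES = {
    "digital stamper": [30.0, 40.0, 50.0],        # assumed pairwise distances (mm)
    "digital paint palette": [25.0, 60.0, 65.0],  # assumed
}

def pairwise_distances(points):
    return sorted(math.dist(p, q) for p, q in combinations(points, 2))

def identify(points, tolerance: float = 3.0):
    observed = pairwise_distances(points)
    for name, signature in SIGNATURES.items():
        if len(signature) == len(observed) and all(
                abs(o - s) <= tolerance for o, s in zip(observed, signature)):
            return name
    return None

if __name__ == "__main__":
    # Touch-points roughly matching the assumed stamper signature.
    print(identify([(0.0, 0.0), (40.0, 0.0), (0.0, 30.0)]))   # digital stamper
```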

Embodiments of the digital coloring tools kit provide “realistic” painting effects like color mixing, using intuitive painting tools designed specifically for a touch-screen device, such as the single touch-point brush 72 in FIG. 6. The single touch-point brush 72 has a body 74, with a single touch-point 76 at a proximal first end of the single touch-point brush 72, a distal second end 78 of the single touch-point brush 72, and a plurality of paintbrush bristles 80 surrounding the single touch-point 76. The single touch-point 76 and/or paintbrush bristles 80 may be detected by a touch-screen surface on a computing device 12.

In embodiments, an application associated with the digital coloring tools kit, for execution by a computing device such as a touch-screen device, may include thematic backgrounds such as an easel, canvas, watercolor paper, etc. In further embodiments, the application may include line art coloring-page backgrounds, fully-completed assets that can be stamped onto a display (such as a painted bee), pre-mixed colors, different brushes and/or brush tip effects for selection by a user, and different lay-down effects.

Embodiments of the digital coloring tools kit include an application that provides a first tier of options available to a user upon purchase of the digital coloring tools kit, and a second tier of options available to the user upon “unlocking” a full mode of the digital coloring tools kit. In one example, a user may purchase the digital coloring tools kit with an application that enables the user to access a “try me” mode of the product, and utilize a limited number of features, such as backgrounds, effects, color mixing techniques, etc. In another example, the user may “unlock” a full mode of the purchased digital coloring tools kit to activate additional features, such as options for painting with a mixed color, blending of particular colors, and activating additional features of the digital coloring tools kit that may be limited and/or restricted based on which mode a user is executing. In one embodiment, a digital stamper and/or digital paint palette may be used to unlock features of the digital coloring tools kit.

An application associated with the digital coloring tools kit may include a “start creating” indicator, a “my gallery” indicator, an “options” indicator, a “more Crayola” indicator, and an “unlock more” indicator. Further, embodiments of the invention may provide for background selection, where a user can select realistic/simulated backgrounds like canvas and watercolor surfaces for painting. In some embodiments, coloring page backgrounds are provided, as well as solid-colored backgrounds for painting on.

Embodiments of the invention provide realistic painting effects for coloring on a touch-screen device. For example, a kit may provide realistic paint effects such as color mixing and swirling, color bleeding, color slowly decreasing in intensity (lightening) as a brush stroke “runs out” of paint, and additional subtle painting effects for individual types of paint. Accordingly, a painting effect for a corresponding type of paint may include providing a slightly embossed appearance for a child's paint. In another example, a painting effect for a particular type of paint may include added transparency and/or darker-colored edge appearance for a watercolor paint. In embodiments of the invention, painted designs created by a user may be altered using erasable features and undo features. In further embodiments, color-mixing features are provided where, in the image displayed on the computing device screen, a color stays bright rather than muddy during color mixing. Additionally, embodiments of the invention provide for various lay-down effects, such as brush width, patterns, and glitter/metallic features of the ink being digitally painted with on the touch-screen device. In one embodiment, the digital coloring tools kit provides an interactive user interface, upon executing an application of the kit, with realistic-looking paint mixing, blending, and swirling features, as well as robust but simple to use color mixing features, when interacted with a variety of digital coloring tools. In some embodiments, digital creations generated by a user with the kit may be saved to a memory for sharing or future enhancing.

In embodiments, placing a paint palette on the touch-screen of a computing device, such as an iPad® device, brings up a palette of colors underneath the digital paint palette device. In one embodiment, primary mixing colors appear through the transparent openings on the paint palette, while the larger opening on the paint palette remains empty for color mixing. In embodiments, a user can “pick up” a color and add it to the mixing area. Once two or more colors have been added, the user can mix the selected colors with the digital paintbrush to swirl colors and thoroughly mix the selected colors. In embodiments, an amount of color added to the mixing area is determined based on an amount of times a user taps a particular paint palette opening. In one example, colors are mixed in a one-to-one ratio unless the user adds more taps of a color than another.
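
By way of a non-limiting illustration, tap-weighted mixing, in which colors combine one-to-one unless a color receives additional taps, might be sketched as follows; the weighted RGB average is an illustrative assumption about how the blend could be computed.

```python
# Minimal sketch of mixing palette colors by the number of taps on each
# opening: colors combine one-to-one unless a color receives extra taps.
# The simple weighted RGB average is an illustrative assumption.

def mix_colors(tapped_colors):
    """tapped_colors: dict mapping an (r, g, b) color to its tap count."""
    total_taps = sum(tapped_colors.values())
    if total_taps == 0:
        raise ValueError("no colors have been added to the mixing area")
    mixed = [0.0, 0.0, 0.0]
    for color, taps in tapped_colors.items():
        for i in range(3):
            mixed[i] += color[i] * taps
    return tuple(round(c / total_taps) for c in mixed)

if __name__ == "__main__":
    red, yellow = (255, 0, 0), (255, 255, 0)
    print(mix_colors({red: 1, yellow: 1}))   # (255, 128, 0): an even orange
    print(mix_colors({red: 2, yellow: 1}))   # (255, 85, 0): weighted toward red
```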

For example, as shown in FIGS. 7A and 7B, an activated paint palette 82 displays multiple coloring options inside a plurality of transparent openings 86 on a digital paint palette 84. As such, the paintbrush bristles 80 (and/or single touch-point 76) on the body 74 may interact with the transparent openings 86 to indicate a selection by a user of a particular color being displayed in a particular opening. Accordingly, FIG. 7A depicts an embodiment of the invention including an activated paint palette 82 that displays colors on a surface of a touch-screen device through the individual, transparent openings, such as transparent opening 86, on the digital paint palette 84. Having selected the color of paint displayed through transparent opening 86 with the single touch-point brush 72, the user may then add that color to the mixing area 88, as shown in FIG. 7B. The user may then paint an image on the touch-screen surface using the selected color, or may continue to mix an additional color. As such, as in FIG. 7C, the user may select a second color from a transparent opening 90 to mix with the color from transparent opening 86. As shown in FIG. 7D, the user may thereby create a blended paint in the mixing area 88 that includes both colors.

Accordingly, in embodiments of the invention, the colors being presented through the openings in the activated paint palette 82 are those that are presented on the touch-screen based on the touch-screen recognizing the location/identity of the digital paint palette 84 using digital touch-points, such as the digital touch-points 70 of FIG. 5. Further, upon providing one or more colors for selection in the transparent openings, such as transparent opening 86, the touch-screen may then receive an indication of a color selection of at least one of the colors populated in each opening by the touch-screen. The selected color(s) may then be used to draw in a virtual environment of the digital coloring tools kit, such as on a watercolor background. In one embodiment, two selected colors are mixed in the mixing area 88 and then used to color in a virtual environment on the touch-screen display.

As shown in FIG. 8, an exemplary digital coloring tools kit 92 includes a digital stamper 100 and a digital paint palette 84 that are used to add enhancements to the painting created by the paintbrush bristles 80 of the digital paintbrush. In one embodiment, a mixture of paint selected by the user creates a mixed paint stroke 98 on the display surface 96 of the touch-screen device 94.

Mixing of colors using the digital paintbrush tool may, in some embodiments, be a gradual mixing based on the techniques executed as part of the application. For example, mixed colors may remain bright rather than becoming muddy gray/brown upon mixing. In some embodiments, color mixing will have two distinct stages for creation—a first stage where colors are initially swirled together, and a second stage where a user is able to paint with a fully-mixed color. Accordingly, in one embodiment, the touch-screen device may display the swirled paint upon selection of at least two colors for mixing, and may provide the fully mixed color in response to a threshold amount of time and/or an amount of swirling motion created by the user with the digital paintbrush.
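
A minimal sketch of the two-stage mixing behavior follows, in which the paint remains visibly swirled until the brush has traveled an assumed total distance inside the mixing area, after which a fully mixed color is available.

```python
# Minimal sketch of the two-stage mixing behavior: paint stays visibly
# "swirled" until the user's brush has traveled far enough inside the
# mixing area, after which a fully mixed color is returned. The distance
# threshold is an illustrative assumption.

import math

class MixingArea:
    FULL_MIX_DISTANCE = 500.0   # assumed total swirl distance, in points

    def __init__(self, color_a, color_b):
        self.colors = (color_a, color_b)
        self.swirl_distance = 0.0
        self.last_point = None

    def swirl(self, point):
        """Record a brush position inside the mixing area."""
        if self.last_point is not None:
            self.swirl_distance += math.dist(self.last_point, point)
        self.last_point = point

    @property
    def stage(self) -> str:
        if self.swirl_distance >= self.FULL_MIX_DISTANCE:
            return "fully mixed"
        return "swirled"

if __name__ == "__main__":
    area = MixingArea((255, 0, 0), (255, 255, 0))
    for i in range(60):
        area.swirl((i * 10.0, (i % 2) * 10.0))   # zig-zag brush motion
    print(area.stage)   # fully mixed
```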

Embodiments of the digital coloring tools kit include options for generating various brush techniques and paint patterns on the painted display. For example, the kit may include a variety of traditional brushes (thick bristles, thin bristles, fanned bristles) for use with the touch-screen surface. Additionally, the application may generate a variety of patterns in response to user interaction with a paintbrush and the touch-screen, such as dotted, wavy, splattered, and/or dripped patterns. In further embodiments, as discussed above, various digital painting effects are provided that correspond with a particular type of virtual painting medium selected for painting, such as a kids' paint, a watercolor paint, a glitter paint, a metallic paint, a crackle paint, etc. In one embodiment, a user may select from a variety of surprising paint types (such as animal fur paint, a growing vines paint, etc.) for simulation in the digital coloring tools environment.

In some embodiments of the digital coloring tools kit, stamped, pre-made artwork may be displayed on the touch-screen surface for a user to paint in, on, or around, or otherwise interact with during painting. In one embodiment, a stamped image is generated based on contacting the surface of the touch-screen device with the digital stamper 46. For example, a fully-painted asset may be stamped onto the screen using the digital stamper 46, while in other embodiments, a newly-added stamped item (from touching the digital stamper 46 to the touch-screen surface) may be enhanced with additional amounts of paint once stamped. For example, a user may paint over a stamped item, or may color in or change colors of the stamped item. In one embodiment, the single touch-point brush 72 and/or digital stamper 46 may be used to manipulate items stamped on the screen, such as re-sizing, rotating, moving, blending, coloring, etc. In another embodiment of the invention, a variety of stamping effects are provided to create a visual impression of a particular painting technique, such as sponge painting on the surface of the touch-screen device.

In some embodiments, as part of the digital application provided with the digital coloring tools kit, a user may be able to purchase additional, digital enhancements for the application (i.e., “in-app purchases”) that further enhance the user's experience, such as updating the digital coloring tools kit with additional backgrounds, stamps, lay-down painting effects, etc.

With reference to FIG. 9, a panoramic coloring kit 102 for digitally coloring a 360-degree coloring environment is depicted according to an embodiment of the invention. In one embodiment, the panoramic coloring kit 102 includes a simulated 360-degree coloring environment 104 in which a computing device 106, such as the computing device 12 of FIG. 13, which may include an iPad® or other touch-screen device, captures a selected scene 108 for coloring by a user on the touch-screen surface of the computing device 106.

As indicated by the arrows in FIG. 9, a user may maneuver the computing device 106 up, down, left, or right to manipulate a location of the selected scene 108 with respect to the coloring environment 104. The user may also rotate the computing device 106 about an axis, such as when the user holds the computing device 106 such that a display screen of the computing device is oriented parallel to the user's body and then turns around in a circle, in order to manipulate a location of the selected scene 108 with respect to the coloring environment 104. Additionally or alternatively, the user may interact with the touch-screen via physical touch, such as, for example, by swiping a finger across the touch-screen, in order to manipulate a location of the selected scene 108 with respect to the coloring environment 104. So for example, in terms of manipulating a location of the selected scene 108, moving the computing device 106 to the right may have the same effect as a user turning in a clockwise direction while holding the computing device 106, which may further have the same effect as a user swiping a finger to the left across the screen of the computing device 106. As such, a user may view a continuous coloring environment 104 and select a particular portion of the virtual coloring environment 104 for coloring. In embodiments, when the user reaches a selected scene 108 desired to be colored, the user may tap the touch-screen and freeze the motion of the virtual coloring environment 104.
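
As a non-limiting illustration, a single view heading for the 360-degree environment might be maintained and updated from either device rotation or a finger swipe, so that turning clockwise and swiping left move the selected scene 108 in the same way; the swipe sensitivity is an illustrative assumption.

```python
# Minimal sketch of keeping one "view heading" for the 360-degree environment
# and updating it from either device rotation or a finger swipe, so that
# turning clockwise and swiping left move the selected scene the same way.
# The swipe-to-degrees scale factor is an illustrative assumption.

DEGREES_PER_SWIPE_PIXEL = 0.2   # assumed touch sensitivity

class PanoramaView:
    def __init__(self):
        self.heading_deg = 0.0    # which slice of the 360-degree scene is shown
        self.frozen = False       # True once the user taps to pick a scene

    def rotate_device(self, delta_yaw_deg: float):
        if not self.frozen:
            self.heading_deg = (self.heading_deg + delta_yaw_deg) % 360

    def swipe(self, delta_x_px: float):
        # Swiping left (negative delta_x) turns the view clockwise, matching
        # a clockwise body turn or a rightward device movement.
        self.rotate_device(-delta_x_px * DEGREES_PER_SWIPE_PIXEL)

    def tap(self):
        self.frozen = True        # freeze motion on the selected scene

if __name__ == "__main__":
    view = PanoramaView()
    view.rotate_device(30.0)      # user turns clockwise
    view.swipe(-150.0)            # user swipes a finger to the left
    view.tap()
    print(f"selected scene heading: {view.heading_deg:.0f} degrees")   # 60
```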

The 360-degree coloring environment is further described with respect to FIG. 10, which provides perspective view 110 of a virtual tour 112 inside a 360-degree coloring environment 114 in accordance with an embodiment of the invention. In embodiments, the panoramic coloring scene in the 360-degree virtual coloring environment 114 is populated with a variety of objects and figures with which the user may interact, such as by coloring. When viewed with the computing device 106, embodiments of the coloring environment 114 may include looping animations in a background of a scene, more robust animations in foreground objects, and other animated enhancements to the coloring environment 114. In some embodiments, the virtual tour 112 of the digital coloring environment 114 allows the user to physically maneuver the computing device 106 to change the portion of the coloring environment 114 that is available for selection by the user (i.e., change which portion of the coloring environment 114 populates the display of the computing device 106 as the selected scene 108).

As mentioned, the user may manipulate a location of the selected scene 108 with respect to the coloring environment 114 in a number of ways, including: moving the computing device 106 up, down, left, or right; rotating the computing device 106 about an axis, such as when the user turns around in circles while holding the computing device 106; physically touching the touch-screen of the computing device 106, such as when the user swipes a finger across the touch-screen; as well as other means of interacting with the computing device 106.

In embodiments, rotating the computing device 106 about an axis includes holding the computing device 106 such that the display screen of the computing device 106 is parallel to the axis of rotation, and then rotating the computing device 106 about that axis. For example, the computing device 106 may be rotated about an axis that runs vertically through the user's body. This may be accomplished when a user holds the computing device 106 such that the display screen of the computing device 106 is parallel to the user's body and the user turns around in a circle. Thus, the computing device 106 is rotated about an axis that runs vertically through the user's body. In other embodiments, the computing device 106 may be rotated about any other axis for the purpose of manipulating a location of a selected scene 108. For example, the computing device 106 may be rotated about an axis that runs perpendicular to the user's body. This may be accomplished when a user holds the computing device 106 such that the display screen of the computing device 106 is parallel to the floor (which is also parallel to the axis of rotation), and then moves the computing device 106 in a counterclockwise direction until the computing device 106 is parallel to a wall to the user's right.

In other embodiments, a user may manipulate a location of the selected scene 108 by panning the computing device 106 in all directions. This may include panning the computing device 106 across the floor, ceiling, and walls of a room. A guided tour feature may also be provided, in which the user selects the guided tour option and is automatically guided through the coloring environment 114 without any further user intervention.

As described with respect to FIGS. 9-10, embodiments of the invention allow a user to interact with a panoramic coloring environment. In some embodiments, a 360-degree virtual tour effect allows a user to navigate through a large coloring page by physically moving a computing device, such as the computing device 106 of FIGS. 9-10, which may include an iPad®, up and down, such as up toward and across a ceiling or down toward and across a floor, as well as moving the computing device side to side or rotating the computing device about an axis, such as when the user turns around while standing in place. Also, as described, in embodiments, the user may navigate through the large coloring page by touching the touch-screen of the computing device, such as by swiping a finger across the screen.

The 360-degree coloring environment of the present invention, as represented by reference numeral 104 in FIG. 9 and reference numeral 114 in FIG. 10, may include a variety of interactive features. In one example, as an object enters a portion of a display screen on the computing device 106, such as the center portion of the display screen, an animation and/or sound associated with that object may be played. For example, as an airplane becomes visible on the touch-screen, propellers on the airplane may spin and an airplane engine sound may accompany the animation. In another example, some objects in the panoramic coloring environment remain in constant motion around the scene. Thus, as the user explores the coloring environment, it may be filled with animation and sound. In other embodiments, the panoramic coloring environment lacks animation and/or sound as the user explores the panoramic scene, but after the user selects a particular portion of the screen, as will be discussed below, the animation and/or audio features may be triggered.
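
By way of a non-limiting illustration, an object's animation and sound might be triggered when it enters the center portion of the display, as in the airplane example above; the width of the center band and the scene contents are illustrative assumptions.

```python
# Minimal sketch of triggering an object's animation and sound when it enters
# the center portion of the display, as in the airplane example above. The
# center-band width and the object list are illustrative assumptions.

CENTER_BAND = (0.35, 0.65)   # assumed normalized horizontal band counted as "center"

def objects_to_animate(objects):
    """objects: list of dicts with 'name' and normalized 'x' screen position."""
    low, high = CENTER_BAND
    return [obj["name"] for obj in objects if low <= obj["x"] <= high]

if __name__ == "__main__":
    scene = [{"name": "airplane", "x": 0.5}, {"name": "jellyfish", "x": 0.9}]
    for name in objects_to_animate(scene):
        print(f"play animation and sound for: {name}")   # airplane
```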

Continuing on with respect to FIG. 10, in one embodiment, a user may tap the screen of a computing device having a touch-screen, or may otherwise interact with any kind of computing device, to freeze the motion of the 360-degree virtual tour 112, and thereby select a particular selected scene 108 for coloring. In this way, an “explore mode” and a “coloring mode” may be provided. The explore mode could provide for the exploration of the 360-degree coloring environment 114 by manipulating the computing device 106 as described above, while the coloring mode could enable the user to color and otherwise interact with a selected scene 108 from the 360-degree coloring environment 114, as will be further described below. In the coloring mode, any animation associated with the selected scene 108 may continue while the user colors the selected scene 108, or the animation may discontinue while the user colors. In one embodiment, the 360-degree virtual tour may present the panoramic coloring environment to a user with distorted and/or multi-dimensional appearance. As such, upon selection of a particular scene for coloring, the screen may quickly fade into a non-distorted coloring page.

When a user selects a particular scene for coloring, various coloring tools may be made available to the user. Alternatively, such tools may be continuously available to the user, not only upon selection of a selected scene 108. For example, when the user selects a selected scene 108, a coloring control may be provided. The coloring control may include a plurality of coloring tools, such as the coloring control described in the concurrently filed U.S. application Ser. No. (not yet assigned), entitled “Digital Collage Creation Kit,” Attorney Docket No. HALC.204554. The coloring tools may include crayons, paintbrushes, and other coloring tools having various colors. In some embodiments, the panoramic coloring kit 102 includes a digital coloring stylus or other digital input device for selection and/or coloring of items within a selected scene 108. For example, in one embodiment, the single touch-point brush 72 depicted in FIG. 6 may be used to digitally color in portions of the selected scene 108. In this example, the single touch-point brush 72 may or may not include the paintbrush bristles 80 surrounding the single touch-point 76. In embodiments, the single touch-point brush includes a plurality of conductive touch-points concentrated in a single, localized area, such as conductive paintbrush bristles. In other words, the single touch-point is not limited to one conductive point, but can include a number of conductive points that are concentrated in one area. A paint palette with various colors for selection and mixing, such as the digital paint palette 60 of FIGS. 4-5, may further be provided.

In some embodiments of the invention, the panoramic coloring kit 102 may include other accessories for enhancing and interacting with a panoramic coloring environment. For example, the kit 102 might include a touch-point stamper, such as the digital stamper 46 discussed with reference to FIGS. 2-3. In embodiments, the touch-point stamper is a multiple touch-point stamper. Regarding embodiments of the panoramic coloring kit 102, the digital stamper may be used to add new objects to a 360-degree coloring environment, such as coloring environments 104 and 114 depicted in FIGS. 9-10. For example, a user may add additional characters to a selected scene 108, which may then be animated based on interactive features of an application. The digital stamper may further be used to create additional figures or objects for coloring, thereby enabling a user to customize the coloring environment. In embodiments, a digital sticker may be added to a selected scene 108 based on contacting a plurality of touch-points, such as the touch-points 52, 54, and 56 on the bottom surface 50 of the digital stamper 46 in FIGS. 2-3, with the touch-screen surface of a computing device 106.

Embodiments of the panoramic coloring kit 102 may include other interactive features in a 360-degree coloring mode. For example, a coloring-inside-the-lines feature may be provided. A feature for stamping new objects into a scene, as well as other features for use while coloring a selected scene from the panoramic view, may also be provided. These user-provided creations, including coloring, stamps, and stickers, may then be added to the 360-degree coloring environment such that they are visible when the user leaves the selected scene 108 and explores the coloring environment.

In some embodiments, the panoramic coloring kit 102 may also include a spyglass tool 116 depicted in FIGS. 11A and 11B. FIG. 11A provides a top perspective view of the spyglass tool 116 and FIG. 11B provides a bottom perspective view of the spyglass tool 116. In embodiments, the spyglass tool 116 includes a spyglass frame 118, which encloses a transparent viewing area 120, such that when the spyglass tool 116 is placed on the touch-screen of the computing device 106, a user can look through the transparent viewing area 120 and see the selected scene 108 on the computing device 106. In embodiments, the transparent viewing area 120 includes a transparent lens, such as a glass or plastic lens. In other embodiments, the transparent viewing area 120 may include a tinted or colored lens. In other embodiments, the transparent viewing area 120 does not include a lens at all. A bottom surface of the spyglass tool 116 may include one or more digital touch-points 122, in which case the spyglass tool 116 may be described as a multiple touch-point spyglass. When the digital touch-points 122 contact the surface of the computing device 106, an application associated with the panoramic coloring kit that is running on computing device 106 recognizes the spyglass tool 116 and features associated with the spyglass tool 116 are activated, as discussed in greater detail below.

As shown in the interactive coloring environment 124 of FIG. 12, placement of the spyglass tool 116 on the display screen of the computing device 106, such that the spyglass tool 116 covers at least a portion of the selected scene 108, may cause one or more enhancements to be activated in the coloring environment 104. The spyglass tool 116 may enable the discovery of hidden objects or the selective animation of objects that were previously motionless. In embodiments, the spyglass tool 116 can detect hidden animations on a coloring page, such as the selected scene 108. For example, as shown in FIG. 12, when the spyglass tool 116 is placed on the touch-screen of the computing device 106, a hidden animation is revealed. The jellyfish, which was previously motionless, as illustrated in FIG. 9, is now associated with a jiggling animation, as indicated by the animation lines 126 in FIG. 12. In embodiments, when such hidden animations are found and revealed using the spyglass tool 116, the animations are populated in the explore mode, such that when the user returns to the virtual tour, the animation remains visible.
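
As a non-limiting illustration, a hidden animation might be revealed when the spyglass viewing area covers its anchor point in the selected scene 108 and then kept revealed so it persists in the explore mode; the object positions and radius are illustrative assumptions.

```python
# Minimal sketch of revealing a hidden animation when the spyglass viewing
# area covers a hidden object in the selected scene, and keeping it revealed
# afterward so it persists in explore mode. Object names, positions, and the
# viewing radius are illustrative assumptions.

import math

HIDDEN_ANIMATIONS = {
    "jellyfish": {"pos": (320.0, 480.0), "revealed": False},       # assumed position
    "school of fish": {"pos": (900.0, 200.0), "revealed": False},  # assumed position
}

def update_spyglass(center, radius):
    """Reveal any hidden animation whose anchor lies inside the viewing area."""
    found = []
    for name, entry in HIDDEN_ANIMATIONS.items():
        if not entry["revealed"] and math.dist(center, entry["pos"]) <= radius:
            entry["revealed"] = True     # stays visible back in explore mode
            found.append(name)
    return found

if __name__ == "__main__":
    for name in update_spyglass(center=(310.0, 470.0), radius=80.0):
        print(f"You found the {name}!")
```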

In addition to discovering hidden animations, the spyglass tool 116 may also be used to discover hidden sounds or hidden objects. For example, when the spyglass tool 116 touches the display screen of the computing device 106 and passes over an object on the selected scene 108, a sound associated with that object may be revealed. In other embodiments, when the spyglass tool 116 passes over an apparently empty space within the selected scene 108, a hidden object may be revealed. For example, when the spyglass tool 116 is placed over an apparently empty section of an ocean scene, a school of fish may be revealed. In some embodiments, an achievement message may be presented upon detecting a particular hidden object, animation, or sound, such as a “You found X!” statement that flashes across the screen.

Having colored, activated, manipulated, stamped, and otherwise altered at least a portion of the coloring environment, a user may then save a scene, such as by using an auto-saving feature of an application. In one example, the user may return to a saved scene to finish coloring the scene. In further embodiments, a user may select a “start over” option to save a current state of a scene as a separate scene.

When the user finishes coloring, animating, or otherwise interacting with a screen, the user may return to exploring the coloring environment, where a user input provided in the coloring mode is visible in the explore mode. In other words, the items that the user colored when in the coloring mode may now appear in color in the explore mode. Other user creations made in the coloring mode may also be populated in the explore mode, such as stamps, stickers, animations, sound effects, and unhidden objects. Thus, when the user pans around the coloring environment, these user creations are included in the panoramic coloring environment. In some embodiments, a user may toggle between a coloring mode and an explore mode by pressing a button on a computing device, such as a touch-screen device like the iPad®, to move from a static coloring scene to a panning view of the 360-degree coloring environment.
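The shared-state behavior described above, in which creations made in coloring mode also appear in explore mode, could be sketched as follows in Swift; the ViewMode, UserCreation, and ColoringSession names are assumptions made for illustration.

    // Hypothetical sketch: creations made in coloring mode are kept in shared
    // session state, so the panoramic explore mode renders the same items.
    enum ViewMode { case explore, coloring }

    struct UserCreation {
        let kind: String        // e.g., "stamp", "sticker", "animation"
        let sceneName: String
    }

    final class ColoringSession {
        private(set) var mode: ViewMode = .explore
        private(set) var creations: [UserCreation] = []

        // Toggled, for example, by a button press on the touch-screen device.
        func toggleMode() {
            mode = (mode == .explore) ? .coloring : .explore
        }

        // Added while in coloring mode, but stored on the shared session.
        func add(_ creation: UserCreation) {
            creations.append(creation)
        }

        // Used by both modes, so colored items and stamps appear in each.
        func creationsVisible(in sceneName: String) -> [UserCreation] {
            creations.filter { $0.sceneName == sceneName }
        }
    }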

Embodiments of a user interface displayed on a computing device 106 in association with the panoramic coloring kit 102 include a launch screen that prompts the user to select one or more features, including, for example, a "start creating" indicator, a "my gallery" indicator, an "options" indicator, a "more Crayola" indicator, and an "unlock more" indicator. In one embodiment, a scene selection feature for initiating coloring with the panoramic coloring kit 102 may include multiple options for a user. For example, a scene selection feature may provide an explore mode that lets a user choose a scene and pick a coloring page from that scene, such as picking a selected scene from an animated, panning image displayed on the computing device screen. As another example, the scene selection feature may provide a pre-selected image mode in which a user selects a scene to start coloring from a set of pre-selected coloring pages. In that example, the user may begin coloring the pre-selected scene and subsequently explore other portions of the panoramic view of an associated coloring environment. In that embodiment, a pre-selected coloring page may include a particular call to action, such as a coloring page that instructs the user to "Find the Hidden Octopus."
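The two entry paths into coloring described above, choosing a scene while exploring or starting from a pre-selected page, might be represented as in the hypothetical Swift sketch below; the SceneSelection cases and the startingScene function are invented, while the example call to action echoes the description.

    // Hypothetical sketch of the two scene-selection paths.
    enum SceneSelection {
        case exploreMode(chosenScene: String)
        case preSelected(page: String, callToAction: String?)
    }

    func startingScene(for selection: SceneSelection) -> String {
        switch selection {
        case .exploreMode(let chosenScene):
            return chosenScene
        case .preSelected(let page, let callToAction):
            return callToAction.map { "\(page): \($0)" } ?? page
        }
    }

    // Example: startingScene(for: .preSelected(page: "Ocean Scene",
    //                                          callToAction: "Find the Hidden Octopus"))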

In embodiments, the panoramic coloring kit 102 includes a panoramic coloring kit application that, upon execution by the computing device 106, is configured to generate a digital image comprising a panoramic view of a coloring environment, such as the simulated 360-degree coloring environment 104. In further embodiments, the coloring kit 102 includes an activation feature for such a panoramic coloring kit application. The activation feature may include an activation code and/or an activation indicator, such as a web page, website URL, or other indicator of a resource from which a user may access one or more features of the application embodying the 360-degree coloring environment 104. In some embodiments, user interaction with the activation feature enables and/or activates a panoramic coloring kit application retrieved by using the activation feature, and/or a panoramic coloring kit application associated with the activation feature. In further embodiments, a non-user-specific application, such as Crayola ColorStudio HD™, is downloaded from an external source, and the specific features of the panoramic coloring kit application may then be activated and/or "unlocked" by an activation feature. In this instance, the activation feature may involve touching a component included in the user-specific kit to the touch-screen of the computing device 106 running the non-user-specific application. For example, a user may download ColorStudio HD™ to a computing device 106 and then touch a digital stamper included in a user-specific panoramic coloring kit to the touch-screen of the computing device in order to activate the panoramic coloring kit application.
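A minimal sketch of the activation flow described above, assuming a generic application is installed first and kit-specific features are unlocked either by an activation code or by a recognized kit tool touching the screen, is shown below in Swift. The code value, type names, and unlock rule are placeholders, not real product details.

    // Hypothetical sketch of activation/unlocking of the kit application.
    enum KitTool { case stylus, stamper, spyglass }   // as in the earlier sketch

    struct KitApplication {
        var panoramicKitUnlocked = false
        let validActivationCodes: Set<String> = ["EXAMPLE-CODE"]   // placeholder

        // Unlock via an activation code supplied with the kit.
        mutating func activate(withCode code: String) -> Bool {
            if validActivationCodes.contains(code) { panoramicKitUnlocked = true }
            return panoramicKitUnlocked
        }

        // Unlock by touching a recognized kit tool (e.g., the stamper) to the screen.
        mutating func activate(withRecognizedTool tool: KitTool) -> Bool {
            if tool == .stamper { panoramicKitUnlocked = true }
            return panoramicKitUnlocked
        }
    }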

Once enabled and/or activated, the panoramic coloring kit application may be accessed, retrieved, downloaded, and/or otherwise interacted with via a source separate from the coloring kit 102. For example, the panoramic coloring kit application may be accessed and/or downloaded from a website, a database, a data store, or any other external source that provides applications. An example of such an external source is the online iTunes® store.

Embodiments of the panoramic coloring kit 102 include an application that provides a first tier of options available to a user upon purchase of the panoramic coloring kit 102, and a second tier of options available to the user upon "unlocking" a full mode of the panoramic coloring kit 102. In one example, a user may purchase the panoramic coloring kit 102 with a carrying case and/or computing device holder (e.g., a touch-screen device holder, such as an iPad® holder), a digital stylus, a digital stamper, and a spyglass tool. The user may also purchase, as part of the panoramic coloring kit 102 or as an associated or accessible feature available to the purchaser of the panoramic coloring kit 102, an application that enables the user to access a "try me" mode of the product. The "try me" mode of the panoramic coloring kit 102 may allow access to a limited number of features, such as a single scene, use of a single stamp (i.e., a stamp/sticker image that appears on a screen of the computing device 106 upon stamping with a digital stamper tool), and/or a spyglass. In another example, the user may "unlock" a full mode of the purchased panoramic coloring kit 102 to activate additional options, such as hidden animations, additional coloring scenes, and other features of the panoramic coloring kit 102 that may be limited and/or restricted depending on which mode of the application the user is running. In one embodiment, a digital stamper, such as the digital stamper 46 of FIG. 2, may be used to activate and/or unlock a particular mode of the application, such as a limited mode or a full mode.
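The two-tier access described above can be pictured with the short Swift sketch below; the feature counts and names are invented for illustration and are not the product's actual limits.

    // Hypothetical sketch of "try me" versus full-mode feature gating.
    enum KitMode { case tryMe, full }

    struct FeatureSet {
        let availableScenes: Int
        let availableStamps: Int
        let hiddenAnimationsEnabled: Bool
    }

    func features(for mode: KitMode) -> FeatureSet {
        switch mode {
        case .tryMe:
            return FeatureSet(availableScenes: 1, availableStamps: 1, hiddenAnimationsEnabled: false)
        case .full:
            return FeatureSet(availableScenes: 12, availableStamps: 30, hiddenAnimationsEnabled: true)
        }
    }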

In some embodiments, as part of the digital application provided with the panoramic coloring kit 102, a user may be able to purchase additional digital enhancements for the application (i.e., "in-app purchases") that further enhance the user's experience, such as new scenes for the panoramic coloring kit 102 and new stamp images for stamping into those scenes.

Turning now to FIG. 14, a flow diagram 128 including steps for providing an interactive coloring environment is illustrated. At step 130, an interactive coloring environment is presented on a touch-screen display of a computing device. In embodiments, presenting the interactive coloring environment includes generating a digital image of a panoramic view of the interactive coloring environment. At step 132, in embodiments, a selection of a scene in the interactive coloring environment is received upon interaction with the digital image. The method may further include, at step 134, receiving at the selected scene an input from at least one of a single touch-point coloring stylus, a multiple touch-point stamper, and a multiple touch-point spyglass.
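As a non-authoritative illustration of the method of FIG. 14, the Swift sketch below walks through the three steps in order; the type and function names are invented for the example, and the return values are placeholders.

    // Hypothetical walk-through mirroring steps 130, 132, and 134 of FIG. 14.
    enum ToolInput { case stylusStroke, stamperPress, spyglassPlacement }

    struct InteractiveColoringFlow {
        // Step 130: present the interactive coloring environment as a panoramic image.
        func presentEnvironment() -> String {
            return "digital image comprising a panoramic view"
        }

        // Step 132: receive a selection of a scene upon interaction with the image.
        func receiveSceneSelection(named name: String) -> String {
            return name
        }

        // Step 134: receive input at the selected scene from one of the kit tools.
        func receiveInput(_ input: ToolInput, atScene scene: String) -> String {
            switch input {
            case .stylusStroke: return "coloring \(scene)"
            case .stamperPress: return "stamping into \(scene)"
            case .spyglassPlacement: return "activating a feature in \(scene)"
            }
        }
    }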

The following U.S. patent applications are hereby incorporated by reference in their entirety: U.S. Provisional Application No. 61/788,371 entitled “Digital Coloring Tools Kit with Panoramic View and Create-to-Destroy Interactive Features,” filed Mar. 15, 2013, having Attorney Docket No. HALC.178959; “Digital Coloring Tools Kit with Dynamic Digital Paint Palette,” filed Mar. 14, 2014, having Attorney Docket No. HALC.204556; U.S. Provisional Application No. 61/788,349, entitled “Personalized Digital Animation and Digital Collage Creation Kit,” filed Mar. 15, 2013, having Attorney Docket No. HALC.178958; U.S. Nonprovisional Application No. ______, entitled “Digital Collage Creation Kit,” filed Mar. 14, 2014, having Attorney Docket No. HALC.204554; U.S. Nonprovisional Application No. ______, entitled “Personalized Digital Animation Kit,” filed Mar. 14, 2014, having Attorney Docket No. HALC.204555; U.S. Provisional Application No. 61/788,381, entitled “Digital Fashion Portfolio and Green Screen Animation Kit,” having Attorney Docket No. HALC.178960; and U.S. Nonprovisional Application No. ______, entitled “Coloring Kit for Capturing and Animating Two-dimensional Colored Creation,” filed Mar. 14, 2014, having Attorney Docket No. HALC.204558.

Many different arrangements of the various components depicted, as well as components not shown, are possible without departing from the scope of the claims below. Embodiments of the technology have been described with the intent to be illustrative rather than restrictive. Alternative embodiments will become apparent to readers of this disclosure after and because of reading it. Alternative means of implementing the aforementioned can be completed without departing from the scope of the claims below. Certain features and subcombinations are of utility and may be employed without reference to other features and subcombinations and are contemplated within the scope of the claims.

Claims

1. A panoramic coloring kit comprising:

an activation feature for a panoramic coloring kit application;
at least one of the following digital input devices: (1) a stylus; (2) a stamper; and (3) a spyglass,
wherein the stylus, stamper, and spyglass are configured to interact with a touch-screen surface of a computing device executing the panoramic coloring kit application.

2. The panoramic coloring kit of claim 1, wherein upon executing the panoramic coloring kit application by a computing device, the panoramic coloring kit application is configured to generate a digital image comprising a panoramic view of a coloring environment.

3. The panoramic coloring kit of claim 2, wherein the stylus comprises a single touch-point coloring stylus for at least one of selecting an item in the coloring environment and coloring the item in the coloring environment.

4. The panoramic coloring kit of claim 2, wherein the stamper comprises a multiple touch-point stamper for adding new items to the coloring environment.

5. The panoramic coloring kit of claim 2, wherein the spyglass comprises a multiple touch-point spyglass, including a spyglass frame that encloses a transparent viewing area.

6. The panoramic coloring kit of claim 2, wherein the spyglass is configured to activate a feature in the coloring environment.

7. The panoramic coloring kit of claim 6, wherein activating the feature comprises at least one of activating an animation feature, activating an audio feature, and activating a hidden object feature.

8. The panoramic coloring kit of claim 2, wherein the application provides an explore mode for exploring the coloring environment by at least one of moving the computing device and touching the touch-screen surface of the computing device.

9. The panoramic coloring kit of claim 8, wherein the explore mode further provides for manipulating a location of a selected scene.

10. The panoramic coloring kit of claim 9, wherein the application provides a coloring mode for coloring the selected scene from the coloring environment.

11. The panoramic coloring kit of claim 10, wherein in the coloring mode, at least one animated feature is included in the coloring environment.

12. The panoramic coloring kit of claim 10, wherein a user input provided in the coloring mode is visible in the explore mode.

13. A method of providing an interactive panoramic coloring environment, the method comprising:

presenting, on a touch-screen display of a computing device, an interactive panoramic coloring environment,
wherein the interactive panoramic coloring environment is presented on the touch-screen display of the computing device in response to executing an application on the computing device,
wherein presenting the interactive panoramic coloring environment comprises generating a digital image comprising a panoramic view of the interactive coloring environment,
wherein the digital image is configured to display at least a portion of the interactive coloring environment corresponding to a manipulation of the computing device by a user.

14. The method of claim 13, wherein upon interaction with the digital image, a selection of a scene in the interactive coloring environment is received.

15. The method of claim 14, wherein the selected scene of the interactive coloring environment is configured to receive an input from at least one of the following:

a single touch-point coloring stylus;
a multiple touch-point stamper; and
a multiple touch-point spyglass,
wherein the single touch-point coloring stylus, multiple touch-point stamper, and multiple touch-point spyglass are configured to interact with a touch-screen surface of the computing device.

16. The method of claim 15, wherein, based on the received input, at least one item in the selected scene is modified.

17. The method of claim 15, wherein receiving an input from the multiple touch-point spyglass comprises receiving an indication that the multiple touch-point spyglass is in contact with the touch-screen surface of the computing device, and wherein the modifying the at least one item in the selected scene comprises, based on the received indication, activating at least one of an animation feature, an audio feature, and a hidden object feature.

18. A panoramic coloring kit comprising:

an activation feature for an application that when executed by a computing device generates a digital image comprising a 360-degree view of a coloring environment;
at least one of the following: (1) a single touch-point coloring stylus for at least one of selecting an item in the coloring environment and coloring an item in the coloring environment; (2) a multiple touch-point stamper for adding new items to the coloring environment; and (3) a multiple touch-point spyglass, including a spyglass frame that encloses a transparent viewing area,
wherein the single touch-point coloring stylus, multiple touch-point stamper, and multiple touch-point spyglass are configured to interact with a touch-screen surface of the computing device.

19. The panoramic coloring kit of claim 18, wherein the digital image comprising the 360-degree view of the coloring environment includes animation features and audio features.

20. The panoramic coloring kit of claim 18, wherein the application provides an explore mode for exploring the 360-degree view of the coloring environment, wherein a location of a selected scene is manipulated based on at least one of moving the computing device and touching the touch-screen surface of the computing device.

Patent History
Publication number: 20140273715
Type: Application
Filed: Mar 14, 2014
Publication Date: Sep 18, 2014
Applicant: Crayola LLC (Easton, PA)
Inventors: Joseph Thomas Moll (Bethlehem, PA), Brian Nemeckay (Belvidere, NJ), Stephen Weiss (Easton, PA)
Application Number: 14/211,815
Classifications
Current U.S. Class: Having Means To Draw (446/146)
International Classification: A63H 33/00 (20060101);