INTERACTIVE ANIMATE LUGGAGE

A suitcase that, in certain embodiments, allows for animate and interactive zoolocomotion and zoomimicry of luggage. Importantly, such output may occur without the need for a user to press buttons; instead, such output may be triggered by natural interactions with embodiments described herein, such as when a child strokes a cat causing the cat to purr in enjoyment.

Description
PRIORITY

This application claims the benefit of co-pending U.S. provisional application 62/587,211, filed Nov. 16, 2017 by the same inventors, which is incorporated by reference as if fully set forth herein. This application is also a continuation-in-part of co-pending application Ser. No. 29/617,217 filed Sep. 9, 2017, which in turn is a continuation of U.S. Pat. D824,676 issued Aug. 7, 2018.

BACKGROUND

Field of Invention

Embodiments of the present disclosure relate generally to luggage, and more specifically, to an animal-like, rolling suitcase with sensory input and audiovisual, physical and tactile output.

Description of Related Art

Currently, luggage is inanimate and lacks interactivity with owners. Thus owners are unlikely to be emotionally attached to their luggage. This is especially true with children. Because of this dearth, luggage is merely borne by tired, disinterested users, dragged and bumped along through hotels and airports all over the world. Clearly there is a need for interactive, animate and entertaining luggage.

SUMMARY

Embodiments described herein allow for animate and interactive zoolocomotion and zoomimicry of luggage. Importantly, such output may occur without the need for a user to press buttons; instead, such output may be triggered by natural interactions with embodiments described herein, such as when a child strokes a cat causing the cat to purr in enjoyment.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a circuit block diagram, according to one embodiment of the present disclosure.

FIG. 2 illustrates an interactive and animate suitcase, according to one embodiment of the present disclosure.

FIG. 3 illustrates a method for driving audiovisual, physical and/or tactile output based on sensory input, according to one embodiment of the present disclosure.

DETAILED DESCRIPTION

Generality of Invention

This application should be read in the most general possible form. This includes, without limitation, the following:

References to specific techniques include alternative and more general techniques, especially when discussing aspects of embodiments described herein, or how the embodiment might be made or used.

References to “preferred” techniques generally mean that the inventor contemplates using those techniques, and thinks those techniques are best for the intended application. This does not exclude other techniques for embodiments described herein, and does not mean that those techniques are necessarily essential or would be preferred in all circumstances.

References to contemplated causes and effects for some implementations do not preclude other causes or effects that might occur in other implementations.

References to reasons for using particular techniques do not preclude other reasons or techniques, even if completely contrary, where circumstances would indicate that the stated reasons or techniques are not as applicable.

Furthermore, embodiments described herein are in no way limited to the specifics of any particular embodiments and examples disclosed herein. Many other variations are possible which remain within the content, scope and spirit of embodiments described herein, and these variations would become clear to those skilled in the art after perusal of this application.

More detail may be found in the attached appendix, which is incorporated by reference as if fully set forth herein.

Glossary

As used herein, “coapproach” (noun) may refer to the process of movement of one or more objects towards the other objects and/or towards a common center before collision. As used herein, “coapproaching” (verb) refers to locomotion during coapproach.

As used herein, “zoolocomotion” (noun), “zoolocomote” (verb), and “zoolocomotory” (adjective) may refer to any animal-like movement of a member or part of a member of the animal kingdom.

As used herein, “zoomimicry” (noun), “zoomimical” (adjective), and “zoomimic/zoomimicking” (verb) may refer to non-living objects taking the appearance and/or behavior of a member or part of a member of the animal kingdom.

FIG. 1

FIG. 1 illustrates a circuit block diagram, according to one embodiment of the present disclosure. Circuit 100 employs bus 102 to electrically connect (1) data processing 105, (2) sensory input 120, (3) audiovisual, physical and tactile output 155, and (4) power system 180. Data processing 105 includes processor 110 and memory 115. Sensory input 120 includes accelerometer 125, tactile sensor 130, microphone 135, camera 140, rotation sensor 145, and handle sensor 150. Audiovisual, physical and tactile output 155 includes vibratory driver 160, speaker 165, display 170, and actuator 175. Power system 180 includes motor/dynamo 185, power/data cable 190 and battery 195. The inventors contemplate the connection of elements of circuit 100 in any and all conceivable fashions.

Data Processing 105

Data processing 105 includes processor 110 and memory 115. In one embodiment, processor 110 may execute commands related to sensory input data from sensory input 120 (described herein). In another embodiment, processor 110 may execute instructions that may trigger actions by output 155 (described herein).

Memory 115 may store, by way of example and not limitation, inputs, commands, outputs or other data. By way of example and not limitation, memory 115 may be short-term memory (e.g., random access memory) or long-term data storage (e.g., EEPROM or solid state memory). In one embodiment, memory 115 may store sensory input data from sensory input 120 (described herein). In another embodiment, memory 115 may store command instructions (e.g., software or firmware) that, when executed by processor 110, may trigger actions by output 155 (described herein). In an additional embodiment, memory 115 may be used as backup/auxiliary data storage for smart devices, laptops and the like.

Certain embodiments may include wireless communications circuitry (not shown) to effectuate programmability. This may include Bluetooth, near field communications (NFC), and Wi-Fi circuitry. Moreover, certain embodiments may include location sensing, such as GPS, coupled to the processor.

Sensory Input 120

Sensory input 120 includes accelerometer 125, tactile sensor 130, microphone 135, camera 140, rotation sensor 145, and handle sensor 150. Accelerometer 125 may be a 3-axis (i.e., X-, Y- and Z-axis) accelerometer capable of detecting motion of circuit 100 and/or embodiments described herein connected to circuit 100. In this manner, accelerometer 125 may detect when embodiments described herein undergo motion, and may send telemetry to processor 110.

Tactile sensor 130 may send touch data to processor 110, which, in turn, causes processor 110 to execute commands described herein. In some embodiments, tactile sensor 130 may be located on a suitcase handle (not pictured) or on a suitcase shell (not pictured) and may detect when a user touches tactile sensor 130.

Microphone 135 may record audio and transmit audio data to processor 110, causing processor 110 to execute commands described herein. In one embodiment, processor 110 may cause microphone 135 to monitor audio for ‘key phrases’ spoken by a user. In this example, when a ‘key phrase’ is spoken and detected by microphone 135, microphone 135 may trigger processor 110 to execute output commands described herein.
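
By way of illustration and not limitation, a minimal software sketch of such a key-phrase trigger follows. It assumes the audio has already been transcribed to text by an unspecified recognizer; the phrase list, function names and printed command are hypothetical placeholders rather than disclosed behavior.

```python
# Hypothetical sketch of a key-phrase trigger. Assumes speech has already been
# transcribed to text by an unspecified recognizer; names are illustrative only.
KEY_PHRASES = {"wake up", "good kitty"}  # example phrases, not from the disclosure

def detect_key_phrase(transcript: str) -> str | None:
    """Return the first key phrase found in the transcript, if any."""
    lowered = transcript.lower()
    for phrase in KEY_PHRASES:
        if phrase in lowered:
            return phrase
    return None

def on_audio(transcript: str) -> None:
    phrase = detect_key_phrase(transcript)
    if phrase is not None:
        # Stand-in for processor 110 issuing an output command to output 155.
        print(f"trigger output for key phrase: {phrase}")

on_audio("good kitty, let's go")  # -> trigger output for key phrase: good kitty
```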

Camera 140 may record video and transmit video data to processor 110, causing processor 110 to execute commands described herein. In one embodiment, processor 110 may cause camera 140 to monitor video for ‘key gestures’ performed by a user. In this example, when a ‘key gesture’ is performed and detected by camera 140, camera 140 may trigger processor 110 to execute output commands described herein.

Rotation sensor 145 may detect rotation of embodiments described herein and send rotational data (i.e., telemetry) to processor 110. In one embodiment, a commercially available rotation sensor 145 may be mechanically linked to a wheel (not pictured) and may detect when a user causes the wheel to roll. In a further embodiment, this telemetry may be used to trigger processor 110 to execute output commands described herein.
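
By way of illustration and not limitation, the sketch below shows one way such rotational telemetry might be converted into a rolling speed for processor 110 to act on. The counts-per-revolution, wheel diameter and function name are assumed values, not specifics of this disclosure.

```python
# Hedged example of turning rotation-sensor pulse counts into a rolling speed that
# could serve as telemetry; counts-per-revolution and wheel size are assumptions.
import math

def wheel_speed_mps(pulse_count: int, interval_s: float,
                    counts_per_rev: int = 20, wheel_diameter_m: float = 0.05) -> float:
    """Convert encoder pulses observed over an interval into linear speed (m/s)."""
    revolutions = pulse_count / counts_per_rev
    distance_m = revolutions * math.pi * wheel_diameter_m
    return distance_m / interval_s

print(f"{wheel_speed_mps(40, 1.0):.2f} m/s")  # 40 pulses observed in one second
```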

Handle sensor 150 may detect telescopic movement of coaxial shafts. Such coaxial shafts may be, by way of example and not limitation, a telescoping, extendable handle shaft such as that found on a suitcase. In one embodiment, handle sensor 150 may be located proximate to a handle shaft (not pictured) and may detect when a user telescopically extends a handle shaft. This detection may be effectuated using a commercially available proximity sensor or limit switch. In a further embodiment, movement detected by handle sensor 150 may cause processor 110 to execute output commands described herein.

Audiovisual, Physical and Tactile Output 155 (Output 155)

Accelerometer 125, tactile sensor 130, microphone 135, camera 140, rotation sensor 145, and handle sensor 150 (together, sensory input 120) may record sensory input and send sensory input data to data processing 105. Data processing 105 may send commands to cause audiovisual, physical and tactile output from output 155 as described herein. More specifically, a user's interactions with embodiments described herein may cause movements or reactions that mimic those of animals (i.e., zoolocomotion and zoomimicry, respectively).

Vibratory devices such as haptic motion devices and vibratory drivers 160 may cause vibrations of embodiments described herein that are detectable by a user. Vibratory driver 160 may be a solid-state vibratory driver or any known vibratory driver. In one embodiment, vibratory driver 160 may mimic the vibrations of an animal's body when “purring” occurs (e.g., a cat's purring can be felt as vibrations by a user upon being petted). However, the inventors contemplate embodiments described herein conducting any and all types of zoolocomotion and zoomimicry.

Embodiments described herein provide for inputs from sensory input 120 to trigger outputs from output 155. By way of example and not limitation, tactile sensor 130 may detect a “petting” motion (e.g., when one strokes a pet affectionately) and may communicate touch data to data processing 105. Said petting motion may be detected by tracking contact movement across a surface. In turn, data processing 105 may trigger vibratory driver 160. In this manner, tactile sensor 130 may trigger vibratory driver 160 using zoomimicry of embodiments described herein. By way of example and not limitation, zoomimicry in this example may take the appearance of “purring” (such as that conducted by a cat in response to being petted). Thus, embodiments described herein provide for zoomimicking reactions triggered by user stimuli.
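
By way of illustration and not limitation, a minimal sketch of detecting a petting motion by tracking contact movement across a surface follows. The trace format, distance threshold and function names are illustrative assumptions.

```python
# Hedged sketch: classify a touch trace as a "petting" stroke if the contact point
# moves steadily across the sensor surface. Thresholds and sample format are assumed.
from typing import Sequence, Tuple

def is_petting(trace: Sequence[Tuple[float, float]], min_travel: float = 5.0) -> bool:
    """trace: successive (x, y) contact positions reported by the tactile sensor."""
    if len(trace) < 3:
        return False
    travel = sum(
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(trace, trace[1:])
    )
    return travel >= min_travel  # sustained movement across the surface

def handle_touch(trace):
    if is_petting(trace):
        # Stand-in for data processing 105 triggering vibratory driver 160.
        print("start purr: vibratory driver on")

handle_touch([(0, 0), (1.5, 0.2), (3.1, 0.4), (5.0, 0.5)])
```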

In an optional embodiment, speaker 165, connected to circuit 100, may be employed to play sounds based on commands generated by processor 110. Such sounds may include “purring,” any zoomimicry or known audio file. Moreover, these commands may occur in reaction to one or more of: data from handle sensor 150, tactile sensor 130, rotation sensor 145 or telemetry from accelerometer 125.

Continuing with examples provided herein, a user's petting motion detected by tactile sensor 130 or other sensory input 120 may cause speaker 165 to play a “purring sound,” either in isolation or together with other output (e.g., simultaneously with vibrations caused by vibratory driver 160). In this manner, embodiments described herein may more realistically zoomimic animal reactions to user stimuli.

Display 170 may consist of any known visual display such as, by way of example and not limitation, one or more LEDs or an LCD display. In one embodiment, display 170 may take the form of a collection of LEDs. By way of example and not limitation, display 170 may be a “heart-shaped” formation of multi-colored LEDs capable of varying illumination levels and color displays.

While zoomimical and zoolocomotory examples have been given, the inventors contemplate that embodiments described herein are not limited to realistic forms and movements. Indeed, fantastical shapes and actions are contemplated by the inventors. A fantastical example follows: display 170 may “pulse” in response to commands output by processor 110 in reaction to movement detected by accelerometer 125. By way of example and not limitation, such pulsing may also take the form of changing shapes and/or colors, occurring at stable or varying frequencies. Moreover, the pulsing may be fashioned after an animal heartbeat.

Accordingly, a user may interact with embodiments described herein in a manner detectable by sensory input 120, causing display 170 to change in response. By way of example and not limitation, accelerometer 125 or other sensory input 120 may detect movement of embodiments described herein, causing display 170 to appear as a “beating heart” in a fantastical representation of life. For example, a user may “pet,” “shake,” or roll embodiments described herein, causing embodiments described herein to appear “awake” or “alive” to a user.

In a further example, the “heartbeat” described herein may change frequency depending on a rolling speed detected by rotation sensor 145 or other sensory input 120. In an even further example, rolling embodiments described herein faster may cause a faster “heartbeat.” In this manner, display 170 may cause embodiments described herein to take on a fantastical appearance of “excitement.”
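
By way of illustration and not limitation, one possible mapping from rolling speed to heartbeat rate is sketched below. The resting and excited pulse rates and the speed range are assumed values, not specifics of this disclosure.

```python
# Illustrative mapping from rolling speed to a "heartbeat" pulse rate for the LED
# display; the speed range and beats-per-minute limits are assumed values.
def heartbeat_bpm(wheel_speed_mps: float,
                  resting_bpm: float = 40.0,
                  excited_bpm: float = 160.0,
                  max_speed_mps: float = 2.0) -> float:
    """Interpolate a pulse rate between resting and excited as rolling speed rises."""
    fraction = max(0.0, min(wheel_speed_mps / max_speed_mps, 1.0))
    return resting_bpm + fraction * (excited_bpm - resting_bpm)

for speed in (0.0, 0.5, 1.0, 2.0):
    print(f"{speed} m/s -> {heartbeat_bpm(speed):.0f} bpm")
```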

Actuator 175 may rotate, oscillate or otherwise actuate in response to commands output by data processing 105. This actuation may occur in reaction to data recorded by sensory input 120. In a further embodiment, appendages as described herein (not pictured) may be attached to actuator 175 such that the appendages undergo movement driven by actuator 175 in reaction to telemetry from accelerometer 125. These appendages may mimic animal shapes; however, the inventors contemplate that the appendages may take on any form or shape.

By way of example and not limitation, appendages may be attached to one or more actuators 175 to give the appearance of “cat ears.” Further in this example, user stimuli detected by sensory input 120 may cause actuators 175 and attached appendages to execute zoolocomotion.

Continuing with examples provided herein, a user's petting motion detected by tactile sensor 130 or other sensory input 120 may cause one or more actuators 175 to “wiggle,” by way of example and not limitation, appendages in the form of “cat ears.” Thus, “petting” embodiments described herein may cause a “lifelike” reaction in the form of “twitching” or “twisting” cat ears in a “perkily attentive” manner. Said wiggling may be effectuated by having the “ears” magnetically coupled to a support plate, which in turn is rotated or re-positioned under programmatic control. More information on such zoolocomotion is provided herein. Magnetically coupling appendages allows for changing to different ears or shapes to provide different effects. Moreover, magnetically coupling appendages allows the appendages to be removed for storing or shipping the suitcase.
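
By way of illustration and not limitation, the following sketch shows one way a support plate might be re-positioned under programmatic control to produce a wiggle. The actuator interface, angles and timing are assumptions rather than the disclosed mechanism.

```python
# A minimal sketch of a programmatic "wiggle" for an ear support plate: a short
# sequence of small angle offsets sent to an actuator. The actuator interface,
# angles, and timing are illustrative assumptions only.
import time

def wiggle_plate(set_angle, amplitude_deg: float = 10.0, cycles: int = 3,
                 dwell_s: float = 0.05) -> None:
    """set_angle: callable that positions the support plate at a given angle."""
    for _ in range(cycles):
        for angle in (amplitude_deg, 0.0, -amplitude_deg, 0.0):
            set_angle(angle)
            time.sleep(dwell_s)

# Stand-in for actuator 175; a real embodiment would drive a servo or stepper here.
wiggle_plate(lambda angle: print(f"plate angle: {angle:+.1f} deg"))
```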

The inventors contemplate that such actuation by actuator 175 may occur either in isolation or together with other output. Furthering examples used herein, wiggling appendages caused by actuation of one or more actuators 175 may occur simultaneously with vibrations caused by vibratory driver 160 and/or “purring” sounds played by speaker 165. In this manner, embodiments described herein may more realistically zoomimic animal reactions to user stimuli. Again, output possibilities from output 155 are not limited to lifelike representations. Furthering examples still, display 170 may play visual output simultaneously with one or more of the output examples given herein, thus fantastically enhancing output and/or animations described herein. By way of example and not limitation, display 170 may display a pulsing heart in response to a user petting embodiments described herein.

Power Systems 180

Power systems 180 include motor/dynamo 185, power/data cable 190 and battery 195. Motor/dynamo 185 may be located in wheels (not pictured) to cause movement of embodiments described herein, or to generate electricity by acting as a dynamo when embodiments described herein are moved. In one embodiment, motor/dynamo 185 may be mechanically linked to rotation sensor 145. Motor/dynamo 185 may also cause locomotion of embodiments described herein in response to sensory input. In one embodiment, motor/dynamo 185 may cause embodiments described herein to zoolocomote. By way of example and not limitation, such zoolocomotion may mimic the way a pet follows a person (e.g., a cat follows an owner).

Power/data cable 190 may be connected to circuit 100 and may be employed to power or recharge one or more elements of circuit 100. Power/data cable 190 may be retractable or concealable with a cover plate as known. Battery 195 may be connected to circuit 100 and may be employed to power one or more elements of circuit 100. Battery 195 may be any known battery including rechargeable-type batteries. In one embodiment, battery 195 may be recharged by embodiments described herein, including but not limited to: motor/dynamo 185 and power/data cable 190.

Power/data cable 190 may be of any cable type, including USB. The inventors contemplate that power/data cable 190 may feed power into other devices to recharge them or transmit/receive data. In one embodiment, power/data cable 190 may be compatible with smart devices for multiple purposes including but not limited to: recharging smart devices and backing up/storing data for smart devices. In one embodiment, the inventors contemplate the usage of embodiments described herein (e.g., rolling a suitcase) to charge battery 195, and in turn, battery 195 may be used to recharge a user's smart device through power/data cable 190. A power/data cable 190, such as a USB cable, has myriad uses in the art and the inventors contemplate all such uses.

Data Communications System

The inventors also contemplate the addition of a communications system (not pictured), including but not limited to: wireless or wired communications (e.g., Wi-Fi, 3G, 4G or LTE data communications) as known. Wired or wireless communications may provide Internet connectivity, data transfer capability and access to cloud data storage. Data storage capabilities as described herein (e.g., memory 115) may allow for wireless backup or wired backup (e.g., through power/data cable 190) for smart devices, laptops and the like.

Conventional radio modules such as GPS, Bluetooth and the like may be included in some embodiments to allow for enhanced operations such as the ability to transmit, to a server, location information, or to effectuate certain motions in response to specific location information.

The inventors also contemplate that any data from embodiments described herein (e.g., sensory input data, telemetry and the like, or any data as described herein) may be recorded and uploaded to cloud-based Internet storage. Furthermore, the inventors contemplate storing and sharing of this data on the Internet (e.g., social media websites) with Internet users, e.g., social media users and/or other owners of embodiments described herein, such that these Internet users may be aware of each other's shared data. By way of example and not limitation, users of embodiments described herein may share data related to embodiments described herein with each other, e.g., compare telemetry. Moreover, users may download certain program instructions to alter or enhance the operations described herein.

The above illustration provides many different embodiments for implementing different features of embodiments described herein. Specific embodiments of components and processes are described to help clarify embodiments described herein. These are, of course, merely embodiments and are not intended to limit embodiments described herein from that described in the claims.

FIG. 2

FIG. 2 illustrates an interactive and animate suitcase, according to one embodiment of the present disclosure. View 200 shows a front three-quarters profile of suitcase 205. Suitcase 205 includes wheels 210, tactile sensor 215, one or more appendages 220, display 225, casters 230, handle 255, and actuators 275.

Suitcase 205 may be of impact-resistant or ballistic material as known. Suitcase 205 may be water resistant or waterproof (e.g., IP68 or IP69K) as known. Suitcase 205 may contain multiple, isolated compartments as known. Suitcase 205 may be a clamshell or foldable design as known. Suitcase 205 may be a hardshell, soft fabric, or hybrid design as known. Suitcase 205 may have zippers, clasps, buttons, or magnetic enclosures and may seal closed by any known means.

Some embodiments may include handles 255 made from clear plastic, providing for certain illumination effects. This may be effectuated by placing a light source, such as one or more LEDs (not shown), at the base of the handle 255. With the LEDs under programmatic control, the color and light intensity may be varied for certain movements or in response to certain stimuli.
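
By way of illustration and not limitation, a simple sketch of varying LED color and intensity with a stimulus level follows. The color mapping and the 0.0-1.0 stimulus scale are assumed for illustration only.

```python
# A small illustrative sketch, not a disclosed control law: vary the color and
# intensity of LEDs at the handle base with a stimulus level between 0.0 and 1.0.
def handle_led_state(stimulus: float) -> tuple:
    """Return an (R, G, B) value that warms and brightens as stimulus increases."""
    level = max(0.0, min(stimulus, 1.0))
    red = int(255 * level)            # warmer with more stimulus
    green = int(80 + 100 * level)     # assumed baseline glow
    blue = int(255 * (1.0 - level))   # cooler when idle
    return (red, green, blue)

for s in (0.0, 0.5, 1.0):
    print(s, "->", handle_led_state(s))
```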

Optionally, suitcase 205 may include speakers, handle sensors and/or vibratory drivers (not shown) as described herein. In some embodiments, casters 230 may be fixed- or spinner-style, as known. In other embodiments, wheels 210 and/or casters 230 may be of varying opacity (e.g., transparent). In one embodiment, casters 230 and/or wheels 210 may be transparent with an opaque portion to give the appearance of “paws” or “feet.”

Sensory Input

Tactile sensor 215 may detect physical contact by a user as described herein. Tactile sensor 215 may transmit sensory data to a processor. In one embodiment, tactile sensor 215 may detect, by way of example and not limitation, “petting” of suitcase 205 by a user in a similar manner that a user may pet an animal. While “petting” of tactile sensor 215 is provided as an example, the inventors contemplate any physical interaction with tactile sensor 215 to cause tactile sensor 215 to transmit sensory data to the processor.

As illustrated, tactile sensor 215 is positioned on suitcase 205 near appendages 220; however, the inventors contemplate positioning one or more tactile sensors 215 at any location on suitcase 205. Furthermore, tactile sensor 215 may take any shape and occupy any surface of suitcase 205.

A rotation sensor (not shown) may be mechanically linked with one or more wheels 210 in order to detect rotation of wheels 210, and thus locomotion of suitcase 205 by a user. In turn, the rotation sensor may send telemetry to a processor. The rotation sensor may be mounted coaxially to, or otherwise mechanically engaged with, one or more wheels 210 such that wheels 210 supply rotational drive to the rotation sensor by known means.

An accelerometer (not shown) may be employed in suitcase 205 to detect movement of embodiments described herein, or physical contact of embodiments described herein by a user. Thus, the accelerometer may transmit telemetry to a processor. By way of example and not limitation, the accelerometer may detect rolling of suitcase 205 by a user, “petting” or shaking of suitcase 205 by a user.

Finally, a microphone and/or camera (not shown) may be located proximate to suitcase 205 to record audiovisual data. Audiovisual data may be sent to a processor and may in turn cause audiovisual, physical or tactile output as described herein.

Handle 255, as shown in rear three-quarters view 250, may allow a user to drag or push suitcase 205. Handle 255 may be a fixed- or telescoping-style handle as known. Handle 255 may also contain a handle sensor (not pictured) that sends information related to handle movement to a processor. In one embodiment, handle 255, when deployed or retracted by a user, may trigger activation/deactivation of embodiments described herein. In further embodiments, such triggering may reduce charge depletion and improve battery performance.
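
By way of illustration and not limitation, the sketch below models such activation/deactivation as a simple wake/sleep controller keyed to handle deployment. The state names and printed actions are hypothetical placeholders rather than disclosed behavior.

```python
# Sketch of the contemplated wake/sleep behavior tied to the handle sensor:
# deploying the handle "wakes" outputs, retracting it powers them down to reduce
# charge depletion. State names and actions are assumptions for illustration.
class HandleWakeController:
    def __init__(self):
        self.awake = False

    def on_handle_event(self, deployed: bool) -> None:
        if deployed and not self.awake:
            self.awake = True
            print("wake: enable display heartbeat, appendage motion, speaker")
        elif not deployed and self.awake:
            self.awake = False
            print("sleep: disable outputs to reduce charge depletion")

ctrl = HandleWakeController()
ctrl.on_handle_event(deployed=True)
ctrl.on_handle_event(deployed=False)
```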

Audiovisual, Physical & Tactile Output

As illustrated, appendages 220 appear as “cat ears,” but appendages 220 may, by way of example and not limitation, take any zoomorphic form, and the inventors contemplate appendages 220 taking the form of any known three-dimensional shape. Also as illustrated, appendages 220 are located, by way of example and not limitation, on top of suitcase 205. However, the inventors contemplate placing appendages 220 at any location proximate to suitcase 205. In one embodiment, appendages 220 may be removable and stored in suitcase 205 to allow suitcase 205 to occupy a smaller overall volume profile. In this manner, such a reduced footprint may allow for easier storage in airplane overhead compartments and better compatibility with passenger airline regulations.

Appendages 220 may be driven by actuators 275, as shown in top front (cutaway) view 270. Under the power of actuators 275, appendages 220 may “move,” or more specifically, appendages 220 may coapproach or oscillate. By way of example and not limitation, appendages 220 may “twitch,” “wiggle,” or “twist,” in a manner similar to the motion of the ears of a cat. Note, however, that the inventors contemplate any and all movements of appendages 220 physically possible by actuators 275. By way of example and not limitation, such movements may take the form of zoolocomotion. Movement of appendages 220 may occur in reaction to accelerometer/rotation sensor telemetry, handle sensor data, or tactile sensor 215 data being received and processed by a processor.

Display 225 may be lit by a power source (not shown) in suitcase 205. As illustrated, display 225 takes the form of LEDs deployed in a “heart” formation on suitcase 205, but the inventors contemplate that display 225 may take any shape, be of any number, and be located at any position on suitcase 205. LEDs are illustrated only by way of example and not limitation. Thus, embodiments described herein provide for any known video displays, such as LCD screens and the like. Furthermore, the inventors contemplate the display of any and all visual media on display 225, and a processor updating or otherwise changing this visual media based on sensory input data as provided herein. In one embodiment, display 225 may illuminate in reaction to sensory input as described herein.

In one embodiment, a speaker (not shown) may produce sound in reaction to accelerometer/rotation sensor telemetry, microphone, camera, handle sensor or tactile sensor 215 data being received and processed by a processor as described herein. By way of example and not limitation, the speaker may play music or animal sounds; however, the inventors contemplate any and all known audio output.

A vibratory driver (not shown) may be employed to cause vibration of suitcase 205 or parts of the suitcase according to some embodiments described herein. In one embodiment, the vibratory driver may vibrate suitcase 205 in reaction to accelerometer/rotation sensor telemetry, handle sensor or tactile sensor 215 data being received and processed by a processor.

By way of example and not limitation, output by appendages 220, display 225, a vibratory driver or speaker may occur in reaction to data from a user “petting” suitcase 205, rolling or shaking suitcase 205, using (e.g., deploying/retracting) handle 255, or any other stimuli. In other words, a user interfacing with suitcase 205 may cause output by appendages 220, display 225, a vibratory driver or speaker to imitate a “reaction” by suitcase 205. While zoolocomotory/zoomimical examples have been provided, the inventors contemplate any and all possible movements as reactions by suitcase 205 to user stimuli. Furthermore, the speed and/or frequency of output by appendages 220, display 225, a vibratory driver or speaker may vary in reaction to user stimuli.

In one embodiment, pulling up on handle 255 (i.e., telescoping or deploying) may cause embodiments described herein to appear to “wake up,” e.g., triggering a “heartbeat” on display 225, movement of appendages 220, or causing audiovisual, physical or tactile output as described herein.

Power System

Suitcase 205 may include one or more of: a power cable (not pictured) or a battery (not pictured) as described herein. One or more wheels 210 may include one or more motor/dynamos (not pictured). Motor/dynamos may recharge a battery (not shown) or provide locomotion to suitcase 205, as described herein. Motor/dynamos may also cause suitcase 205 to move in response to sensory input as described herein. In one embodiment, motor/dynamos may cause suitcase 205 to zoolocomote (by way of example and not limitation: "follow" a user in a manner similar to a pet following an owner) based on commands from a processor and/or sensory input as described herein.

FIG. 3

FIG. 3 illustrates a method for driving audiovisual, physical and/or tactile output based on sensory input, according to one embodiment of the present disclosure. Although the method steps are described in conjunction with FIGS. 1-3, persons skilled in the art will understand that any system configured to perform the method steps, in any order, falls within the scope of the present disclosure. The steps in this method are illustrative only and do not necessarily need to be performed in the order presented herein. Some steps may be omitted completely.

The method begins at a step 305, in which sensory input is received by a sensor. In some embodiments, sensors may be tactile sensors, handle sensors, rotational sensors or accelerometers as described herein. By way of example and not limitation, sensory input for a tactile sensor may take the form of touch data, such as if a user were to “pet” the tactile sensor. A handle sensor may send a handle trigger output when, by way of example and not limitation, deployment of a telescoping suitcase handle is detected by a handle sensor. By way of example and not limitation, an accelerometer or a rotational sensor mechanically linked to a wheel may send telemetry related to movement of embodiments described herein.

At a step 310, sensory input data is transmitted to a processor. At a step 315, the processor may execute instructions from software stored in memory in response to and/or dependent on sensory input data. By way of example and not limitation, software may take the form of firmware, software loaded in short- or long-term data storage as described herein, or Internet/cloud-stored information.

At a step 320, the processor may send commands based on the instructions to an audiovisual, physical or tactile output system (output system). In some embodiments, an output system may include one or more of the following: motor/dynamos, vibratory drivers, displays and speakers as described herein. By way of example and not limitation, a command sent by the processor to a motor/dynamo may cause voltage to be applied to the motor/dynamo, causing a suitcase to zoolocomote, as described herein. By way of example and not limitation, commands to a vibratory driver may cause voltage to be applied to the vibratory driver, causing a suitcase to zoomimic a cat (e.g., “purring”), as described herein. By way of example and not limitation, commands sent to a speaker may be formed as audio data signals, causing a suitcase to zoomimic “purring” sounds through the speaker, as described herein. By way of example and not limitation, commands sent to a display may take the form of voltage applied to an LED or video data sent to a display, causing pulsation/illumination of a heart formation of LEDs or fantastical representations of lifelike appearances and animations as described herein.
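
By way of illustration and not limitation, the method of FIG. 3 might be modeled in software as a simple mapping from sensory events to output commands, as sketched below. The event names and command strings are illustrative assumptions, not claimed behavior.

```python
# A compact, hypothetical rendering of the FIG. 3 flow: sensory input is received
# (step 305), passed to processing (steps 310/315), and mapped to output commands
# (step 320). Event names and command strings are illustrative only.
from typing import Dict, List

OUTPUT_RULES: Dict[str, List[str]] = {
    "petting":         ["vibratory_driver: purr", "speaker: purr_sound"],
    "handle_deployed": ["display: start_heartbeat", "actuator: wiggle_ears"],
    "rolling":         ["display: heartbeat_faster", "motor: assist"],
}

def process_event(event: str) -> List[str]:
    """Return the output commands the processor would issue for a sensory event."""
    return OUTPUT_RULES.get(event, [])

for event in ("petting", "handle_deployed", "rolling", "unknown"):
    print(event, "->", process_event(event))
```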

Although embodiments described herein are illustrated and described herein as embodied in one or more specific examples, they are nevertheless not intended to be limited to the details shown, since various modifications and structural changes may be made therein without departing from the spirit of embodiments described herein and within the scope and range of equivalents of the claims. Moreover, this application includes additional images in the attached appendix to the specification. Accordingly, it is appropriate that the appended claims be construed broadly and in a manner consistent with the scope of embodiments described herein, as set forth in the following claims.

Claims

1. A container including:

a plurality of wheels,
an extendable handle, coupled to said container;
a processor coupled to the container, said processor coupled to a memory device, a motion detector and an actuator;
said memory device operable to hold program instructions directing the processor to perform a method including:
detecting motion and moving an appendage in response to the motion.

2. The container of claim 1 wherein the moving the appendage includes zoolocomotion.

3. The container of claim 1 wherein the appendage is substantially in the form of cat ears.

4. The container of claim 1 wherein the container is a suitcase.

5. The container of claim 4 further including:

a wireless communication system, said wireless communications system coupled to the processor;
wherein the processor and wireless communications system are operable to send and receive information to alter the program instructions.

6. The container of claim 5 wherein the wireless communications system is either Bluetooth or Wi-Fi.

7. The container of claim 1 further including a tactile sensor, said tactile sensor coupled to the processor.

8. The container of claim 1 wherein the motion detector is an accelerometer.

9. A suitcase including:

at least one wheel, said wheel including a rotation sensor;
a processor coupled to the suitcase, said processor coupled to a memory device, a tactile sensor, a motion detector, an actuator and the rotation sensor;
said memory device operable to hold program instructions directing the processor to perform a method including one or more of the following:
detecting motion and moving an appendage in response to the motion,
detecting rotation and moving an appendage in response to the rotation,
detecting touch and moving an appendage in response to the touch;
wherein said appendage substantially represents an animal ear;
wherein the moving the appendage includes movement of the animal ear in a substantially zoomimical motion.

10. The suitcase of claim 9 further including:

a second appendage,
wherein said moving the appendage includes moving both appendages.

11. A suitcase including:

at least one wheel, said wheel including a rotation sensor;
a substantially transparent handle, said handle disposed on a first surface of the suitcase;
a light source, said light source disposed to illuminate at least a portion of the handle;
a processor coupled to the suitcase, said processor coupled to a memory device, a tactile sensor, a motion detector, the light source and the rotation sensor;
said memory device operable to hold program instructions directing the processor to perform a method including one or more of the following:
detecting motion and illuminating the light source in response to the motion,
detecting rotation and illuminating the light source in response to the rotation, or
detecting touch and illuminating the light source in response to the touch.

12. The suitcase of claim 11 wherein the light source includes multi-colored LEDs.

Patent History
Publication number: 20190231045
Type: Application
Filed: Nov 14, 2018
Publication Date: Aug 1, 2019
Inventor: Aili Jian (Alameda, CA)
Application Number: 16/190,526
Classifications
International Classification: A45C 7/00 (20060101); A45C 13/00 (20060101);