USB desktop toy
An amusing desktop toy, and methods of its use. The toy, in the shape of a realistic or imaginary creature, is connected to a local computer via a communication port, and is actuated by a local or remote user to move so as to directly express gestures visually in three dimensions. The remote user sends gesture instructions to actuate the toy via a telecommunication mechanism from a remote computer. The remote user can program the gesture instructions via a graphical user interface. Optionally, the toy includes a speaker and/or a USB flash drive (UFD).
This patent application claims the benefit of U.S. Provisional Patent Application No. 60/720,056 filed Sep. 26, 2005.
FIELD AND BACKGROUND OF THE INVENTION

The present invention relates to a system and method for connecting an amusing mechanical toy to a computer that can be controlled by a computer user (either locally or remotely) in order to convey various forms of expression.
Software programs used for textual communication between computer users often provide users with graphical symbols known as icons, emoticons, and winks. Users often embed these symbols in their messages for additional modes of expression. In the prior art, these symbols have been limited to two-dimensional drawings on the screen.
In order to further enhance textual communication between computer users, a device that could convey an intended emotion or gesture by movement in physical space (as opposed to just a screen depiction or animation) would be desirable. This type of three-dimensional figurine (or toy) would extend the forms of expression available to these users. This would expand the variety of multimedia available to textual message software, and its users, beyond just screen images and sounds. In addition, it would provide an amusing experience, as most toys do.
For the purpose of this disclosure and claims, the term “amusing” is used in this application to describe the property of stimulating at least one of an interest, an excitement, or an arousal in a viewer. An amusing object or event is, under this definition, distinguished from a functional object or event. A robot is an automated mechanical device that is functional, in the sense that it can work unobtrusively and may not create any sentiment or interest in the viewer. A toy drummer is an automated mechanical device that may not be functional or useful at all, but its motion is funny, threatening, intriguing, or arousing to a viewer. Clearly, a device can be both amusing and functional, or be neither amusing nor functional. This application is about amusing devices, in the sense of the definition provided above.
The term “gesture” is used in this application as a name of a predetermined collection of movements of the model (specifically, a minimum of three movements), carried out through a set of software instructions designed to represent a known motion, such as nodding, kneeling, or waving hands.
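The disclosure does not fix a data format for gestures, but the definition above can be illustrated concretely. The following Python sketch (all names and fields hypothetical) represents a gesture as a named, ordered sequence of at least three movement commands, matching the minimum stated in the definition:

```python
from dataclasses import dataclass

@dataclass
class Movement:
    """One actuator command: which part to move, target position, duration."""
    part: str         # e.g. "head", "left_arm" (hypothetical part names)
    angle: float      # target position, in degrees
    duration_ms: int  # time allotted to reach the target

@dataclass
class Gesture:
    """A named gesture: a predetermined collection of movements."""
    name: str
    movements: list   # ordered list of Movement objects

    def is_valid(self):
        # Per the definition above, a gesture comprises at least three movements.
        return len(self.movements) >= 3

# A "nodding" gesture built from three head movements:
nod = Gesture("nod", [
    Movement("head", -20, 300),
    Movement("head", 20, 300),
    Movement("head", 0, 300),
])
```

Under this representation, a single rotation (one movement) would not qualify as a gesture, which is the distinction drawn later against prior-art devices with only limited rotational movement.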
SUMMARY OF THE INVENTION

It is the purpose of the present invention to provide an amusing mechanical toy that is connected to the computer via a communication port, such as a universal serial bus (USB) port. This toy responds with motion to instructions provided to it by either its local user or a remote user who is in communication with the local user. The toy is a mechanical model of a real or imaginary creature, such as a person or an animal, preferably with an amusing appearance, which is connected to a computer via a serial port or a USB port and is placed on or near a desk. The toy is equipped with mechanical actuators, such as motors or electromagnets, that cause perceptible motion in the model in response to commands sent from the computer in order to convey a gesture.
The term “perceptible motion” is used in this application to refer to motion that is appreciable enough to be observed by a user, as opposed to slight variations in shape created e.g. by tightening, loosening, vibrating, or heating an object.
A gesture is expressed directly by motion itself and not by a consequence of the motion. Therefore, we are excluding a three-dimensional motion that indirectly expresses a two-dimensional gesture, such as the motion of a printer printing a smiley face on paper.
This model can serve as a toy or as mechanism to support communication between people by adding mechanized body language to the verbal or textual discourse. In these latter applications, the remote user has a vocabulary of gestures from which to choose. If the local user has such a model connected to the local computer, and if the remote user is made aware of this fact, then the remote user can include, in his text, commands that will cause the local toy to perform some of the motions to physically emphasize the text.
Preferably, a variety of models of the toy are made available that have a different set of gestures. Each toy is supported by a file listing its set of gestures, and the host computer is able to read the toy capabilities and configure the toy software accordingly. This enables a remote computer to activate a local toy by reading its possible gestures and presenting these to the remote user, who can then send proper instructions to the local computer to activate the local toy.
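The disclosure says only that each toy "is supported by a file listing its set of gestures"; it does not specify the file's format. As one possibility, the capability file could be a small JSON document that the host (or a remote computer) parses to learn which gestures a given model supports. The sketch below is a hypothetical illustration of that idea, including the model name and gesture list as assumed fields:

```python
import json

# Hypothetical capability file for one toy model; the actual format
# is not specified in the disclosure.
CAPABILITY_JSON = """
{
  "model": "desk-duck-v1",
  "gestures": ["nod", "wave", "kneel"]
}
"""

def load_capabilities(text):
    """Parse a toy's capability file; return its model name and gesture set."""
    info = json.loads(text)
    return info["model"], set(info["gestures"])

model, gestures = load_capabilities(CAPABILITY_JSON)

def can_perform(gesture):
    # A remote computer can present only these gestures to the remote user
    # and reject instructions the local toy cannot perform.
    return gesture in gestures
```

Publishing the capability file to the remote computer is what lets the remote user's interface show only gestures the local toy can actually execute.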
The system includes a telecommunication mechanism for triggering a command from a computer, and a communication mechanism for sending a command from a second computer to the device. The telecommunication mechanism includes an “internet,” which is a set of interconnected computer networks. The most well-known internet is the Internet.
Therefore, according to the present invention, there is provided for the first time a system for expressing a gesture in three dimensions, the system including: (a) a mechanical device operative to perceptively move at least three parts thereof in response to gesture instructions; (b) a gesture instruction, for controlling said mechanical device, embedded in a correspondence, the gesture instruction originating from a remote computer; (c) a local computer for supplying the gesture instruction to the mechanical device during the course of the correspondence between the remote computer and the local computer; and (d) a gesture instruction interpreter operative to extract the gesture instruction from the correspondence, and convey the gesture instruction to the mechanical device, thereby directly expressing the gesture visually.
Preferably, the system also includes: (e) a telecommunication mechanism for sending the gesture instruction from a remote computer.
Most preferably, the telecommunication mechanism also includes an internet.
Most preferably, the remote computer also includes a graphical user interface (GUI) for programming the gesture instruction via the telecommunication mechanism.
Preferably, the system also includes: (e) a communication mechanism, such as a generic, wired serial port or a wireless port, for sending at least one gesture instruction from a local computer to the mechanical device.
Preferably, the system also includes: (e) a UFD housed in the mechanical device for storing and retrieving data.
Most preferably, the UFD is detachable.
According to the present invention, there is provided for the first time an amusing physical model controlled by a computer, the model including: (a) at least three movable parts; (b) at least one mechanical actuator for moving the at least three movable parts, the number of mechanical actuators being less than the number of movable parts; and (c) a communication port for receiving a gesture instruction from the computer for controlling the mechanical actuator to directly express a gesture visually.
Preferably, the model is shaped as a realistic creature or as a fictitious creature.
Preferably, the movable parts represent limbs.
Preferably, the model also includes: (d) a UFD housed in the model for storing and retrieving data to and from the local computer.
Most preferably, the UFD is detachable.
Preferably, the model also includes: (d) a speaker housed in the model for receiving signals from the computer via the communication port for conveying sound.
Preferably, the model includes only one mechanical actuator that is operative to move at least three movable parts using at least one flexible mechanical linkage for controlling the model to express the gesture.
Preferably, the mechanical actuator is pneumatically-controlled.
Preferably, the mechanical actuator is hydraulically-controlled.
Preferably, the mechanical actuator comprises at least one gear for moving the movable parts.
Preferably, the communication port is a generic, wired serial port or a wireless port.
Preferably, the model also includes: (d) a computer-readable storage medium that includes a data file of the gesture instruction.
According to the present invention, there is provided for the first time an amusing physical model controlled by a computer, the model including: (a) at least three movable parts; (b) a mechanical actuator for moving at least three movable parts; (c) a communication port for receiving gesture instructions from the computer for controlling the mechanical actuator to directly express a gesture visually; and (d) a UFD housed in the model for storing and retrieving data to and from the computer via the communication port.
Preferably, the model is shaped as a realistic creature or as a fictitious creature.
Preferably, the movable parts represent limbs.
Preferably, the UFD is detachable.
Preferably, the model also includes: (e) a speaker housed in the model for receiving signals from the computer via the communication port for conveying sound.
Preferably, the model includes only one mechanical actuator that is operative to move at least three movable parts using at least one flexible mechanical linkage for controlling the model to express the gesture.
Preferably, the mechanical actuator is pneumatically-controlled.
Preferably, the mechanical actuator is hydraulically-controlled.
Preferably, the mechanical actuator comprises at least one gear for moving the movable parts.
Preferably, the communication port is a generic, wired serial port or a wireless port.
Preferably, the model also includes: (e) a computer-readable storage medium that includes a data file of the gesture instructions.
According to the present invention, there is provided for the first time a method of directly expressing a gesture of a physical model with at least three movable parts visually in three dimensions by a remote user of a remote computer to a local user of a local computer, the method includes the steps of: (a) providing the local user with a computer-controlled mechanical device operationally connected to the local computer; and (b) remotely activating the mechanical device to move in a manner that expresses the gesture visually by the remote user to the local user.
Preferably, the step of activating the mechanical device is effected by an application running on the remote computer.
According to the present invention, there is provided for the first time a method of self-expression through a three-dimensional visual gesture of a physical model with at least three movable parts, the method includes the steps of: (a) programming a computer with at least one software module that activates the model to perform the gesture by moving parts of the model; (b) operationally connecting the model to the computer; and (c) activating at least one software module from the computer, thereby directly expressing the gesture.
These and further embodiments will be apparent from the detailed description and examples that follow.
Devices that resemble the present invention are known in the art. One such device is the Doc Johnson High Joy Enabled® iVibe Rabbit from High Joy Products, LLC, which allows a user to be stimulated physically by the device through a computer control that can be operated by the user or a remote operator. In contrast to the present invention, the user is required to be in physical contact with the device in order to be stimulated. In addition, the act of wearing the device creates a sense of anticipation of the forthcoming stimulation in the user. In the case of the present invention, by contrast, the user can be spontaneously surprised by the device, since its operation does not require active involvement by the user (other than being in viewing range). Furthermore, the stimulation, in and of itself, does not constitute a gesture as defined above.
Another device that resembles the present invention is the Nabaztag (“Wi-Fi Rabbit”) from Violet, The Smart Object Company. This device allows a user to be notified of various information through a wireless link to the Internet. The information is conveyed by the device's speaker via simulated voices, and is obtained from the services that the Nabaztag provides (such as weather forecasts). In addition to talking, the Nabaztag can flash lights, play music, and move its ears. In contrast to the present invention, the expression of gestures as defined in this application is not possible with the Nabaztag; its limited rotational movement is inadequate to express a gesture. In the case of the present invention, the device makes at least three movements to express a gesture. This is considered, in this application, to be the minimum number of motions necessary to express a gesture realistically, where the moving parts represent limbs of a figure or creature (e.g. head, arms, legs, tail, etc.).
Another example of a prior art device, developed by a research team at Nanyang Technological University, Singapore, is a USB jacket that can simulate the act of hugging to the wearer transmitted by a remote operator. In contrast to the present invention, the expression of the gesture is created by physical contact and creates no visual effect. Furthermore, the prior art device does not move or change its shape perceptibly, as defined above, in its operation. In the case of the present invention, the device changes its shape perceptibly to convey the visual expression of a gesture.
BRIEF DESCRIPTION OF THE DRAWINGS

The invention is herein described, by way of example only, with reference to the accompanying drawings, wherein:
The present invention is of a system for a USB desktop toy and a method for expressing gestures via such a toy. Specifically, the present invention can be used for amusement and expressing emotion. The principles and operation of a USB desktop toy according to the present invention may be better understood with reference to the drawings and the accompanying description.
Referring now to the drawings,
The instructions to mechanical actuator 36 in
Clearly, the model can represent either real creatures, such as humans or animals, or also imaginary creatures, such as Donald Duck or Smurfs. As toy 20 is located near local computer 34, to which toy 20 is connected via a communication port (such as a USB port), toy 20 can also easily serve as a portable storage device and contain a USB flash memory drive (UFD). As seen in
As toy 20 is computer-controlled, a batch of gesture instructions contained in a data file can either be created on local computer 34, or can be received from remote computer 42. This batch of instructions can make the toy move according to predefined choreography in order to perform predefined gestures. Remote computer 42 contains a memory 46 and a database 48. Database 48 contains the data files representing the feasible gestures that toy 20 can make. Database 48 is either loaded into memory 46 or transmitted to remote computer 42 via internet 40 at the start of a correspondence session.
In one embodiment, remote computer 42 embeds the gesture instruction in the correspondence with local computer 34. A gesture instruction interpreter 49, residing on local computer 34, extracts the gesture instruction from the correspondence, and conveys the gesture instruction to toy 20.
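The disclosure does not specify how gesture instructions are marked up inside the correspondence. One plausible sketch of gesture instruction interpreter 49 treats them as inline tags (the `<<gesture:...>>` syntax below is hypothetical) that are stripped from the displayed text and forwarded to the toy:

```python
import re

# Hypothetical inline markup for gesture instructions embedded in a
# text correspondence; the disclosure fixes no particular wire format.
GESTURE_TAG = re.compile(r"<<gesture:(\w+)>>")

def interpret(message):
    """Split a correspondence message into display text and gesture commands.

    Returns the message with gesture tags removed (whitespace normalized)
    plus the ordered list of gesture names to send to the toy.
    """
    gestures = GESTURE_TAG.findall(message)
    text = " ".join(GESTURE_TAG.sub("", message).split())
    return text, gestures

text, gestures = interpret(
    "Great news! <<gesture:nod>> See you soon <<gesture:wave>>"
)
```

The local user sees only the cleaned text, while the extracted gesture names are conveyed to the toy, which is what makes the gesture appear to accompany the conversation.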
As there may be many models, with a variety of possible movements, the set of possible movements can be saved in a file which can be made available to remote computers. This will enable a remote user to program instructions for the local toy, and use these instructions to cause the local toy to move in response to remote instructions. When this is done in the course of real-time correspondence, the toy adds gestures and body language to emphasize important ideas and feelings/moods within the conversation.
As many of the potential users of this toy may not be professional programmers, the toy preferably is schematically represented on the screen of the computer so that the user can program movements by using a mouse to click and drag control points on the screen which represent real movements of the corresponding points of the toy in space. Preferably, the system has a “record” mode and a “play” mode; wherein, the record mode stores the gestures as marked by the mouse on the screen, and the play mode sends these gestures to the toy for execution.
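The record/play behavior described above can be sketched in a few lines. In this hypothetical Python sketch (the class and callback names are assumptions, not the disclosure's API), record mode stores each on-screen drag of a control point as one movement, and play mode replays the stored movements to the toy in order:

```python
class GestureRecorder:
    """Minimal record/play sketch. In record mode, each mouse drag of an
    on-screen control point is stored as one movement of the corresponding
    toy part; in play mode the stored movements are sent to the toy in order."""

    def __init__(self, send_to_toy):
        self.send_to_toy = send_to_toy  # callback that writes to the toy's port
        self.recording = []

    def record(self, part, angle):
        # Called once per mouse drag of a control point on the screen.
        self.recording.append((part, angle))

    def play(self):
        # Send the recorded gesture to the toy for execution.
        for part, angle in self.recording:
            self.send_to_toy(part, angle)

# Usage with a stand-in transport that simply collects the commands:
sent = []
recorder = GestureRecorder(lambda part, angle: sent.append((part, angle)))
for part, angle in [("head", -20), ("head", 20), ("head", 0)]:
    recorder.record(part, angle)
recorder.play()
```

In a real system the callback would write to the toy's USB or serial port rather than to a list; the list here just makes the record/play separation visible.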
In a preferred embodiment of the present invention, the model also includes a speaker, and the instructions to the model can include sound files to be played through that speaker.
It is noted that the toy does not need to be connected to the computer through a USB port. It can equally be connected via a serial port, a parallel port, or a wireless port such as Bluetooth.
In a preferred embodiment of this invention, the toy can have a USB socket, and serve as a UFD cradle on the desk on which the toy rests.
In another preferred embodiment of this invention, the instructions to the toy can be triggered by software instructions in running applications.
While the invention has been described with respect to a limited number of embodiments, it will be appreciated that many variations, modifications, and other applications of the invention may be made.
Claims
1. A system for expressing a gesture in three dimensions, the system comprising:
- (a) a mechanical device operative to perceptively move at least three parts thereof in response to gesture instructions;
- (b) a gesture instruction, for controlling said mechanical device, embedded in a correspondence, said gesture instruction originating from a remote computer;
- (c) a local computer for supplying said gesture instruction to said mechanical device during the course of said correspondence between said remote computer and said local computer; and
- (d) a gesture instruction interpreter operative to extract said gesture instruction from said correspondence, and convey said gesture instruction to said mechanical device, thereby directly expressing the gesture visually.
2. The system of claim 1, further comprising:
- (e) a telecommunication mechanism for sending said gesture instruction from said remote computer.
3. The system of claim 2, wherein said telecommunication mechanism includes an internet.
4. The system of claim 2, wherein said remote computer includes a graphical user interface (GUI) for programming said gesture instruction via said telecommunication mechanism.
5. The system of claim 1, the system further comprising:
- (e) a communication mechanism for sending at least one said gesture instruction from said local computer to said mechanical device.
6. The system of claim 5, wherein said communication mechanism is a generic, wired serial port or a wireless port.
7. The system of claim 1, the system further comprising:
- (e) a UFD housed in said mechanical device for storing and retrieving data to and from said local computer.
8. The system of claim 7, wherein said UFD is detachable.
9. An amusing physical model controlled by a computer, the model comprising:
- (a) at least three movable parts;
- (b) at least one mechanical actuator for moving said at least three movable parts, the number of said at least one mechanical actuator being less than the number of said at least three movable parts; and
- (c) a communication port for receiving a gesture instruction from the computer for controlling said mechanical actuator to directly express a gesture visually.
10. The model of claim 9, wherein the model is shaped as a realistic creature.
11. The model of claim 9, wherein the model is shaped as a fictitious creature.
12. The model of claim 9, wherein said movable parts represent limbs.
13. The model of claim 9, the model further comprising:
- (d) a UFD housed in the model for storing and retrieving data to and from the computer.
14. The model of claim 13, wherein said UFD is detachable.
15. The model of claim 9, the model further comprising:
- (d) a speaker housed in the model for receiving signals from the computer via said communication port for conveying sound.
16. The model of claim 9, wherein a single said mechanical actuator is operative to move said at least three movable parts using at least one flexible mechanical linkage for controlling the model to express said gesture.
17. The model of claim 9, wherein said mechanical actuator is pneumatically-controlled.
18. The model of claim 9, wherein said mechanical actuator is hydraulically-controlled.
19. The model of claim 9, wherein said mechanical actuator comprises at least one gear for moving said at least three movable parts.
20. The model of claim 9, wherein said communication port is a generic, wired serial port or a wireless port.
21. The model of claim 9, the model further comprising:
- (d) a computer-readable storage medium that includes a data file of said gesture instruction.
22. An amusing physical model controlled by a computer, the model comprising:
- (a) at least three movable parts;
- (b) a mechanical actuator for moving said at least three movable parts;
- (c) a communication port for receiving a gesture instruction from the computer for controlling said mechanical actuator to directly express a gesture visually; and
- (d) a UFD housed in the model for storing and retrieving data to and from the computer via said communication port.
23. The model of claim 22, wherein the model is shaped as a realistic creature.
24. The model of claim 22, wherein the model is shaped as a fictitious creature.
25. The model of claim 22, wherein said movable parts represent limbs.
26. The model of claim 22, wherein said UFD is detachable.
27. The model of claim 22, the model further comprising:
- (e) a speaker housed in the model for receiving signals from the computer via said communication port for conveying sound.
28. The model of claim 22, wherein a single said mechanical actuator is operative to move said at least three movable parts using at least one flexible mechanical linkage for controlling the model to express said gesture.
29. The model of claim 22, wherein said mechanical actuator is pneumatically-controlled.
30. The model of claim 22, wherein said mechanical actuator is hydraulically-controlled.
31. The model of claim 22, wherein said mechanical actuator comprises at least one gear for moving said at least three movable parts.
32. The model of claim 22, wherein said communication port is a generic, wired serial port or a wireless port.
33. The model of claim 22, the model further comprising:
- (e) a computer-readable storage medium that includes a data file of said gesture instruction.
34. A method of directly expressing a gesture of a physical model with at least three movable parts visually in three dimensions by a remote user of a remote computer to a local user of a local computer, the method comprising the steps of:
- (a) providing the local user with a computer-controlled mechanical device operationally connected to the local computer; and
- (b) remotely activating said mechanical device to move in a manner that expresses the gesture visually by the remote user to the local user.
35. The method of claim 34, wherein said step of activating said mechanical device is effected by an application running on the remote computer.
36. A method of self-expression through a three-dimensional visual gesture of a physical model with at least three movable parts, the method comprising the steps of:
- (a) programming a computer with at least one software module that activates the model to perform the gesture by moving parts of the model;
- (b) operationally connecting the model to said computer; and
- (c) activating at least one said software module from said computer, thereby directly expressing the gesture.
Type: Application
Filed: Feb 9, 2006
Publication Date: Mar 29, 2007
Inventors: Shuka Zernovizky (Tel Aviv), Itzhak Pomerantz (Kefar Saba)
Application Number: 11/349,991
International Classification: A63H 3/20 (20060101);