Interactive animated characters
A very compact interactive animated character is provided that offers highly life-like and intelligent-seeming interaction with its user. The animated character can take the form of a small animal-like creature having a variety of moving body parts, including a smile/emotion assembly, which are coordinated to exhibit life-like emotional states by precisely controlling and synchronizing their movements in response to external sensed conditions. The animated character also includes sound generating circuitry to generate speech sounds as well as sounds associated with various emotional states, which are coordinated with a lip sync assembly simulating speech mouth movement. The drive system utilizes first and second reversible motors which are able to power and precisely coordinate the lip sync assembly producing speech mouth movement with the movable parts and the smile/emotion assembly to produce life-like interactions and emotional expressions.
The present invention relates to interactive toys and, more particularly, to animated characters that can perform movements with body parts thereof in a precisely controlled and coordinated manner in response to external sensed conditions.
BACKGROUND OF THE INVENTION
One major challenge with toys in general is keeping a child interested in playing with the toy for more than a short period of time. To this end, toy dolls and animals have been developed that can talk and/or have moving body parts. The goal with these devices is to provide a plaything that appears to interact with the child when they play with the toy.
Although prior art interactive toys have several moving parts, the life-like action attributed to these moving parts is largely due to the random nature of their movements with respect to each other, while the individual parts tend to move in a predictable cyclic manner; in other words, prior toys provide no control over the motion of a specific part individually on command, and highly controlled coordination of one part with the movement of other parts is generally not done. Emotion has proven difficult to capture with conventional mechanical actuators, and thus it would be desirable to provide better coordinated constituent assemblies that exhibit life-like emotional states by precisely controlling and synchronizing their movements in response to external sensed conditions. Moreover, coordination with sound generating circuitry would be desirable to generate speech sounds as well as sounds associated with various emotional states, coordinated with lip synchronization simulating speech mouth movements. Thus, there is a need for an animated character that provides more precisely controlled and coordinated movements between its various moving parts and allows individual parts to be moved in a more realistic manner.
SUMMARY OF THE INVENTION
In accordance with the present invention, a very compact animated character is provided that offers highly life-like and intelligent-seeming interaction with its user. The animated character can take the form of a small animal-like creature having a variety of moving body parts, including a smile/emotion assembly, which are coordinated to exhibit life-like emotional states by precisely controlling and synchronizing their movements in response to external sensed conditions. The animated character also includes sound generating circuitry to generate speech sounds as well as sounds associated with various emotional states, which are coordinated with a lip sync assembly simulating speech mouth movement. The drive system utilizes first and second reversible motors which are able to power and precisely coordinate the lip sync assembly producing speech mouth movement with the movable parts and the smile/emotion assembly to produce life-like interactions and emotional expressions.
More particularly, the drive system powers the movement of the character body parts, e.g., the eye, brow, mouth, ear, plume, chest, and foot assemblies, and the animated character includes a mouth assembly on the front facial area comprising a flexible molded material having upper and lower mouth portions and first and second opposing corners thereof. The animated character has a lip sync assembly to simulate speech mouth movement that is independently controlled and coordinated with the movable parts, resulting in coordinated speech mouth movement with the desired life-like emotional states. A first mouth mechanism is operable with the mouth assembly for controlling the first and second corners of the mouth assembly to define smile/emotion states of the mouth assembly responsive to external input from sensed conditions of the plurality of sensors. A second mouth mechanism is further operable with the mouth assembly for controlling the upper and lower mouth portions to provide lip synchronism response to the multisyllabic words generated with the sound generating circuitry.
The drive system that powers movement of the first and second mouth mechanisms thus drives each of these mechanisms independently to simulate life-like responses to sensed conditions, and at least one of these mechanisms may also be used for causing movement of another of the movable body parts in addition to the first or second mechanisms. The cams have surfaces that are programmed for very precise and controlled movements of the body parts in particular ranges of shaft movement, such that generally every point on a particular cam surface has meaning to the controller in terms of what type of movement the body part is undergoing and where it needs to be for its subsequent movement, or for when the body part is to remain stationary. In this manner, the controller can coordinate movements of the body parts to provide the animated character with different states such as sleeping, waking or excited states. Further, the controller is provided with sound generating circuitry for generating words and sounds that complement the different states, such as snoring in the sleeping state or various exclamations in the excited state. In addition, the programmed surfaces of the cam mechanisms are preferably provided on the walls of slots, with the cam mechanisms including followers that ride in the slots.
The animated character herein is also capable of playing games with the user in a highly interactive and intelligent-seeming manner. These and other advantages are realized with the described interactive plaything. The advantages of the invention may be best understood from the following detailed description taken in conjunction with the accompanying flow charts of Appendix A and the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
An animated character 100 as shown in
The sensors 104 signal a controller or processor circuitry 400, described hereinafter, which controls a drive system 106 for coordinating speech mouth movement with the desired life-like emotional states. The drive system 106 utilizes two low-power reversible electric motors 108 and 110, which are able to power and precisely control the lip sync assembly 140 and the movable parts 102 to produce life-like interactions and emotional expressions. Further, the control processor circuitry 400 includes sound generating circuitry to generate speech sounds as well as sounds associated with various emotional states, such as a laugh, gasp, sigh, growl, snore, etc., coordinated with speech mouth movement driven by the lip sync assembly 140 and movement of the various body parts 102 of the animated character 100. A prior form of the device was available from the Assignee herein under the name “Furby” ™, for which prior issued U.S. Pat. Nos. 6,544,098; 6,537,128; 6,514,117; 6,497,607; and 6,149,490 for “Interactive toy” to Hampton et al. of Applicant's Assignee are hereby incorporated by reference in their entirety.
The animated character 100, as seen in
Many of the movable body parts 102 of the animated character 100 herein are provided in a front facial area 117 toward the upper end 116 of the animated character body 112. The facial area contains eye and eye lid assemblies 123 and 124 respectively and mouth assembly 126, with a brow assembly 122 adjacent thereto, as seen in
A face frame 119 is mounted to the body 112 in an upper opening and includes a pair of upper eye openings and a lower mouth opening centered therebelow (openings not shown). An eye assembly 123 is provided including a pair of semi-spherical eyeballs 248 and 250 sized to fit in the eye openings of the frame 119 and pivotally attached thereto via pivot eye shafts 252 and 254, respectively. Thus, the pivot shafts 252 and 254 are spaced forwardly of and vertically higher than a first control shaft 136 (discussed further below) and extend perpendicular thereto.
Movement of each body part 102 is driven either by motor 108, as seen in
More specifically, cam mechanisms 138 are associated with the ear assembly 118, the plume assembly 120, the brow assembly 122, the eye lid assembly 124, the eye assembly 125, the smile/emotion assembly 126, and the chest assembly 128, coordinating the movement of the aforementioned assemblies for the expression of various life-like emotional states. Simultaneously, motor 110 rotates the shuttle gear 142 associated with a second control shaft 139 in one direction, driving movement of the lip sync assembly 140 simulating speech mouth movement, or rotates the shuttle gear 142 and associated second control shaft 139 in the opposite direction, driving movement of the foot assembly 130. The dual motor system of the present embodiment separates the lip sync assembly 140 operations from operations of the movable body parts 102 associated with the first control shaft 136 and cam operating mechanisms 138. Thus any speech generated by the animated character 100 can be combined and coordinated with any of the exhibited life-like emotional states: the animated character 100 is able to coordinate the same speech with any exhibited emotional state, and likewise to coordinate different speech with the same emotional state.
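By way of illustration only, the following is a minimal sketch, in C, of how a controller might exploit this separation of the lip sync drive (motor 110) from the expression cam drive (motor 108) so that any phrase can be paired with any emotional state. The function, type, and phrase names are hypothetical and are not taken from the patent or its firmware.

```c
/* Hedged sketch: independent motor channels let any phrase pair with any
 * expression. All names here are hypothetical illustrations. */
#include <stdio.h>

typedef enum { NEUTRAL, HAPPY, SURPRISED, SAD, ANGRY, SLEEPY } emotion_t;

/* Motor 108 channel: drive the expression cam shaft to the position for an emotion. */
static void drive_expression_cam(emotion_t e) {
    printf("motor 108 -> cam position for emotion %d\n", e);
}

/* Motor 110 channel: run the lip sync shuttle while a phrase plays. */
static void play_phrase_with_lip_sync(const char *phrase) {
    printf("motor 110 -> lip sync while speaking: %s\n", phrase);
}

/* Because the two drives are independent, the pairing is unconstrained. */
static void perform(emotion_t e, const char *phrase) {
    drive_expression_cam(e);           /* set the emotional state */
    play_phrase_with_lip_sync(phrase); /* speak with synced mouth movement */
}

int main(void) {
    perform(HAPPY, "me love you");     /* same phrase ...            */
    perform(SAD,   "me love you");     /* ... different emotion      */
    perform(HAPPY, "big fun!");        /* same emotion, different phrase */
    return 0;
}
```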
Cam mechanisms 138 each include a disc-shaped cam member and a follower or actuator linkage thereof. More specifically, and referencing
Left and right movable ear devices 132 and 133 respectively, are able to bend forward and back in addition to their pivoting action upon rotation of the control shaft 136. As shown in
Right movable ear device 133 also bends forward and back upon rotation of control shaft 136 in the same manner through similar mechanisms as described herein for bending left movable ear device 132. A prior form of a movable appendage was available from the Assignee herein for which prior issued U.S. Pat. No. 6,773,327 for “Apparatus for actuating a toy” to Felice et al. issued Aug. 10, 2004 to Applicant's Assignee is hereby incorporated by reference in its entirety with reference to apparatus for an appendage including a moveable device within a toy appendage that is attached to a body of a toy and an actuator connected to the moveable device. The actuator is configured to rotate the moveable device about a drive axis that is fixed relative to the body of the toy. The actuator is configured to rotate at least a first portion of the moveable device relative to at least a second portion of the moveable device about a device axis that is fixed relative to the moveable device.
The mouth assembly 126 includes a first mouth mechanism, herein the smile/emotion assembly 127, and a second mouth mechanism, herein the lip sync assembly 140, which operate with the mouth assembly 126 to independently drive two different types of mouth movement. The mouth assembly 126 has a mouth member 196 comprised of substantially identical upper and lower mouth portions 204 and 206, covered with a flexible molded material, in the form of upper and lower halves of a beak in the present embodiment, as seen in
A smile/emotion cam mechanism 178 drives the vertical up and down movement of the smile/emotion assembly 127 and includes a left cam member 146 (shared with left ear cam member as set forth herein) and a right cam member 179 as shown in
The second mouth mechanism herein the lip sync assembly 140 attaches to upper and lower mouth portions 204 and 206 respectively, of mouth member 196 to achieve a second type of mouth movement. In particular,
The upper and lower mouth portions 204 and 206 are pivotally mounted on shaft 208 by rear semi-circular boss portions thereof spaced on either side of the mouth portions 204 and 206 so as to provide space for a tongue member 210 therebetween, as seen in
The eye lid assembly 124, as shown in
Accordingly, rotation of shaft 136 rotates cam member 228 with pin 236 riding in slot 230 thereof to cause the follower 234 to translate in a fore and aft direction while engaged with upper and lower lid portions 220 and 222. The follower shifting forwardly causes upper and lower lid portions 220 and 222 to move away from one another and seemingly close the eyes of animated character 100, and the follower 234 shifting rearwardly causes the lid portions 220 and 222 to move toward each other seemingly opening the eyes of animated character 100.
The eye assembly 123, as also seen in
Further expressive features of the animated character 100 which are driven for movement by rotation of the first control shaft 136 include the plume assembly 120 and the brow assembly 122. The plume assembly 120 and the brow assembly 122 are pivotally attached to a brow bracket 278 fixed to the upper end 116 of the body 112. The plume assembly 120 as seen in
The brow assembly 122 as seen in
A chest assembly 128, as seen in
The animated character 100 also includes a foot assembly 130 including a pair of feet 300, as seen in
The control processor circuitry 400 must be able to precisely control and determine the position of the first control shaft 136 when the motor 108 is activated; however, it is also desirable to avoid the expense and moving parts of a closed loop servo mechanism for providing the necessary feedback. The drive system 106 of an embodiment herein instead includes an optical counting assembly 302 which counts intervals of rotation of a slotted gear wheel 304 in the gear train transmission of the drive system 106, as seen in
For programming of the cam surfaces, modeling of the animated character's different states is based on puppeteering actions to achieve the positions of the body parts for generating animated character movements. With the neutral position as a starting point, the cam is designed to actuate the leaf spring switch on a regular basis to zero out the count for the motor. In this manner, the shaft position does not fall out of synchronization, the count of the processor being zeroed to provide recurrent and regular calibration of the shaft position. From the neutral position, a rotation and direction are determined to cause certain coordinated, precise movements of the various body parts. In this regard, the cams are provided with cam surfaces that have active regions and inactive regions so that in the active regions the part associated with the particular cam is undergoing movement, while in the inactive regions the part is stationary.
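By way of illustration only, the following is a minimal sketch, in C, of the open-loop position tracking described above, assuming the slot counting, the neutral-position zeroing, and the active/inactive cam regions operate as stated; all names, the simulated switch timing, and the example active window are hypothetical.

```c
/* Hedged sketch of open-loop shaft position tracking: count optical slots,
 * re-zero the count whenever the neutral-position leaf spring switch closes.
 * Names and values are illustrative, not the patent's firmware. */
#include <stdbool.h>
#include <stdio.h>

#define SLOTS_PER_REV 206          /* total slot count per cam revolution (0 to 205) */

static int slot_count = 0;         /* current shaft position in slot counts */

/* Called for each slot edge seen by the optical counting assembly. */
static void on_optical_slot(bool neutral_switch_closed) {
    slot_count = (slot_count + 1) % SLOTS_PER_REV;
    if (neutral_switch_closed)     /* recurrent calibration at the neutral position */
        slot_count = 0;
}

/* A cam surface has active regions (part moving) and inactive regions (part
 * parked); here a single illustrative active window for one body part. */
static bool part_is_active(int count, int start, int end) {
    return count >= start && count < end;
}

int main(void) {
    /* Simulate one revolution; pretend the neutral switch closes on the last slot. */
    for (int i = 0; i < SLOTS_PER_REV; i++) {
        on_optical_slot(i == SLOTS_PER_REV - 1);
        if (part_is_active(slot_count, 30, 40))
            printf("slot %d: example part in active region\n", slot_count);
    }
    return 0;
}
```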
As shown in
Contacts of a leaf spring switch are mounted between the disc 320 and the speaker grill 324 and affixed thereto. Thus, depressing the disc 320 as by pushing or rubbing on the hide of the character thereover causes engagement of the contact strips which signals the processor circuitry 400. Actuating a front sensor assembly can simulate tickling of the animated character 100 in its belly region.
As previously stated, cam surfaces of the cam mechanisms 138 herein are provided with precise predetermined shapes which are coordinated with the programming of the processor circuitry 400 so that at every point of the cam surfaces, the processor circuitry 400 can be used to determine the position of the moving body parts 102 associated therewith. In this manner, the animated character 100 can be provided with a number of different expressions to simulate different predetermined physical and emotional states. For instance, changes in emotional expressions of animated character 100 upon rotation of first control shaft 136 are provided as shown in
In the present embodiment, the following expressions are provided:
- A neutral position is provided at a zero degree position of the control shaft 136, wherein the eye lids 220 and 222 are open, the ear devices 132 and 133 are up at a forty-five degree angle, the chest is in, the plume 256 is down, and the mouth corners 198 and 200 and the brow are in neutral positions, neither up nor down.
- A happy expression is provided at a thirty-six slot count clockwise rotation of the control shaft 136, wherein the eye lids 220 and 222 are open, the ear devices 132 and 133 are pivoted up to a twenty-five degree angle, the chest is in, the plume 256 is up, the mouth corners 198 and 200 are up in a smile, and the brow is up.
- A surprised expression is provided at a seventy-two slot count clockwise rotation of the control shaft 136, wherein the eye lids 220 and 222 are wide open, the ear devices 132 and 133 remain up at a twenty-five degree angle, the chest is in, the plume 256 is up, the mouth corners 198 and 200 are in a neutral position, neither up nor down, and the brow remains up.
- A sad expression is provided at a one hundred eight slot count rotation of the control shaft 136, wherein the eye lids 220 and 222 are lowered to an open position, the ear devices 132 and 133 are down at a ninety degree angle, the chest is in, the plume 256 is down, the mouth corners 198 and 200 are down in a frown, and the brow remains up.
- An angry expression is provided at a one hundred forty-four slot count rotation of the control shaft 136, wherein the eye lids 220 and 222 are narrowed, the ear devices 132 and 133 are down at a ninety degree angle, the chest is about thirty percent out, the plume 256 is up, the mouth corners 198 and 200 are down in a frown, and the brow is down.
- A sleep expression is provided at a one hundred eighty slot count rotation of the control shaft 136, wherein the eye lids 220 and 222 are wide open, the ear devices 132 and 133 are up at a forty-five degree angle, the chest is about fifty percent out (the chest is fully out at a one hundred sixty-eight slot count rotation of control shaft 136), the plume 256 is down, and the mouth corners 198 and 200 and the brow are in neutral positions, neither up nor down.
The total slot count for one revolution of the cam system is 206 counts (0 to 205).
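By way of illustration only, the expression positions listed above can be held as a lookup table keyed by slot count of the first control shaft 136. The following sketch, in C, uses the values from the description; the structure layout and field names are hypothetical.

```c
/* Hedged sketch: the expression positions described above as a lookup table
 * keyed by slot count of the first control shaft. Field names are
 * illustrative only; values follow the description in the text. */
#include <stdio.h>

struct expression {
    const char *name;
    int slot_count;      /* clockwise rotation of control shaft 136 */
    const char *ears;    /* ear devices 132 and 133 */
    const char *plume;   /* plume 256 */
    const char *mouth;   /* mouth corners 198 and 200 */
    const char *brow;
};

static const struct expression presets[] = {
    { "neutral",   0,   "up 45 deg",   "down", "neutral", "neutral" },
    { "happy",     36,  "up 25 deg",   "up",   "smile",   "up"      },
    { "surprised", 72,  "up 25 deg",   "up",   "neutral", "up"      },
    { "sad",       108, "down 90 deg", "down", "frown",   "up"      },
    { "angry",     144, "down 90 deg", "up",   "frown",   "down"    },
    { "sleep",     180, "up 45 deg",   "down", "neutral", "neutral" },
};

int main(void) {
    for (size_t i = 0; i < sizeof presets / sizeof presets[0]; i++)
        printf("%-9s at slot %3d: ears %s, plume %s, mouth %s, brow %s\n",
               presets[i].name, presets[i].slot_count, presets[i].ears,
               presets[i].plume, presets[i].mouth, presets[i].brow);
    return 0;
}
```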
The embedded microprocessor circuit for the animated character 100 is identified in
The circuitry employs wireless transmission 404. The data lines from the input/output (I/O) port of the information processor 402 are capacitively coupled for wireless communication. Capacitive coupling methods are employed to initiate simple wireless communication between two bodies by placing them within a few inches of one another. For example, communication is facilitated through the use of two small plates (406, 408) about 0.75 square inches in size and mounted side by side about ½ inch apart horizontally. A receiver amplifier 410 is provided as a receiver module preamplifier, herein a Waitrony Module No. WPI-T2100, used for amplification of the capacitively coupled electrical carrier signals. Accordingly, the emitter plate 406 is used as a transmitter, with the other plate 408 used as a receiver. When located near a matching pair of plates, communication can be established by initiating a capacitive coupling between the aligned plates. The transmit and receive protocol is assigned on the fly (i.e., which unit talks first). Capacitive coupling techniques known in the art include, e.g., expired U.S. Pat. No. 4,242,666 to Reschovsky et al. for “Range selectable contactless data acquisition system for rotating machinery,” issued Dec. 30, 1980, which discloses a multichannel data acquisition system using radio telemetry for data transfer, providing a capacitive coupling link between rotating and stationary members with a pulse-code modulated signal containing the measured information for transmission through the capacitive coupling link.
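The passage states only that the transmit/receive protocol is assigned on the fly, without specifying how. By way of illustration only, the following is a speculative sketch, in C, of one simple contention scheme consistent with that remark (listen through a random backoff and transmit first only if no carrier is detected); it is not the patent's actual protocol, and all names are hypothetical.

```c
/* Hedged, speculative sketch of a "who talks first" contention scheme:
 * listen briefly, and claim the transmitter role only if the channel stays
 * quiet through a random backoff. Illustrative only. */
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical stand-in for sensing a carrier on the receiver plate 408. */
static bool carrier_detected(void) { return false; }

/* Returns true if this unit should act as the initial transmitter. */
static bool assign_talk_first(void) {
    int backoff = rand() % 100;         /* random backoff in arbitrary ticks */
    for (int t = 0; t < backoff; t++) {
        if (carrier_detected())         /* the other unit spoke first */
            return false;
    }
    return true;                        /* channel quiet: this unit talks first */
}

int main(void) {
    srand(42);
    printf(assign_talk_first() ? "role: transmitter first\n"
                               : "role: receiver first\n");
    return 0;
}
```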
As described, the wireless transmission 404 provides circuitry under control of the speech processing incorporated with the information processor 402 which serves to receive, transmit and process speech and other information. The wireless receive circuit block 408 is coupled to the information processor 402 for receiving wireless signals from the transmit circuitry 404 of another animated character device as described herein.
The information processor 402 is provided for speech and wireless communications capabilities. The RSC-4x supports speech recognition and synthesis with its Sensory Speech™ 7 technology, providing advanced on-chip speech recognition algorithms with substantial accuracy for both speaker-dependent and speaker-independent recognition. Audible speech synthesis is additionally provided.
The described information processor 402 of
The sound detection and voice recognition are provided via microphone (Mic In 1 and Mic In 2) inputs to allow the information processor 402 to receive audible information as sensory inputs from the child who is interacting with the animated character 100. Optical control circuitry 412 is used with the motor control circuitry 414, as discussed herein, to provide an electronic motor control interface for controlling the position and direction of the electric motors. An H-bridge circuit is provided for operating each motor in either the forward or reverse direction. A power control block 416 is used to voltage regulate the battery power to the processor CPU, nonvolatile memory (EEPROM) and other functional components of the processor circuitry 400.
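By way of illustration only, the following is a minimal sketch, in C, of the usual two-input H-bridge drive logic implied by the forward/reverse motor control described above; set_pin() is a hypothetical stand-in for whatever port write the motor control circuitry 414 actually performs.

```c
/* Hedged sketch of generic two-input H-bridge drive logic. The pin names and
 * set_pin() helper are hypothetical, not taken from the patent. */
#include <stdbool.h>
#include <stdio.h>

typedef enum { MOTOR_STOP, MOTOR_FORWARD, MOTOR_REVERSE } motor_cmd_t;

static void set_pin(const char *pin, bool level) {
    printf("%s = %d\n", pin, level);    /* stand-in for a real port write */
}

/* Driving the two bridge inputs with opposite levels selects the direction;
 * driving both low typically lets the motor coast. */
static void drive_motor(motor_cmd_t cmd) {
    set_pin("IN_A", cmd == MOTOR_FORWARD);
    set_pin("IN_B", cmd == MOTOR_REVERSE);
}

int main(void) {
    drive_motor(MOTOR_FORWARD);   /* e.g., rotate a shaft in one direction */
    drive_motor(MOTOR_REVERSE);   /* e.g., rotate the shaft the other way  */
    drive_motor(MOTOR_STOP);
    return 0;
}
```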
Various other sensory inputs 418 provide a plurality of sensory inputs coupled to the information processor 402, allowing the animated character 100 to be responsive to its environment and to sensory signals from the child. A tilt/invert sensor 420 is provided to facilitate single pole, double throw switching with a captured conductive metal ball, allowing the unswitched CPU voltage to be provided at either of two input ports, indicating tilt and inversion of the plaything respectively. The sensory inputs 418 of the described embodiment are provided as push button switches, although pressure transducers and the like may also be provided for sensory input. The sensory inputs 418 are momentary push-button controlled; e.g., a mouth sensor at the tongue of the plaything is acquired with the audio ADC provided as a switch-select, allowing the processor 402 to receive the feed input along with other I/O inputs. Additional momentary switches are provided for the front and back sensors of the plaything, respectively, as push button sensory inputs 418.
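By way of illustration only, the following is a minimal sketch, in C, of how firmware might interpret the two input ports fed by the tilt/invert sensor 420; the function names and the precedence of the inverted reading over the tilted reading are assumptions, not taken from the patent.

```c
/* Hedged sketch: the tilt/invert sensor routes the unswitched CPU voltage to
 * one of two input ports; reading both lines tells the firmware whether the
 * character is upright, tilted, or inverted. Names are hypothetical. */
#include <stdbool.h>
#include <stdio.h>

typedef enum { POSE_UPRIGHT, POSE_TILTED, POSE_INVERTED } pose_t;

static pose_t read_pose(bool tilt_port_high, bool invert_port_high) {
    if (invert_port_high) return POSE_INVERTED;  /* assumed precedence */
    if (tilt_port_high)   return POSE_TILTED;
    return POSE_UPRIGHT;
}

int main(void) {
    printf("%d\n", read_pose(false, false));  /* upright  */
    printf("%d\n", read_pose(true,  false));  /* tilted   */
    printf("%d\n", read_pose(false, true));   /* inverted */
    return 0;
}
```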
The motor interface provided between the information processor 402 and the motor control block 414 allows the information processor 402 to control the actuator linkages. As described, the plurality of sensory inputs, e.g., the switch sensory inputs 418, the audio inputs (Mic In 1 and Mic In 2), and the wireless transmission 404, are coupled to the information processor 402 for receiving corresponding sensory signals. The computer programs referenced below in connection with the program flow diagrams for operating the embedded processor design embodiment of Appendix A facilitate processing of the sensory signals for a plurality of operational modes. For each operational mode of the plaything, the program controls the at least one actuator linkage and the plurality of movable members, and processes the corresponding sensory signals, to generate voice interaction with the child, thereby providing interactive artificial intelligence for the animated character 100. As discussed, the animated character includes a doll, plush toy or the like having movable body parts, with one or more of the body parts of the doll being controlled by the plurality of movable members for interacting with the child in a life-like manner.
The software program flow diagrams of Appendix A (see programs P.1-P.48 below) provide for operation of the embedded processor circuitry described above. In the program flow, initialization, diagnostics, and calibration routines are executed by the embedded processor circuitry 400 prior to the normal run mode of the processor circuitry 400. As provided in connection with the general random table (P.48), pseudo-random values are introduced. For example, when push button sensory inputs 418 are activated, increments of, e.g., 0, 1, or 2 are applied instead of always adding 1 when an input is triggered once, adding a random increment when sensory inputs are being triggered. The mean value of the pseudo-random output may be set to unity (1) to give a randomness factor of the trigger for the fluency calculation. There are three conditions under which the fluency count increases (a minimal sketch of this fluency bookkeeping appears after the program list below): (a) each time that the animated character HEARS and UNDERSTANDS a word, that word's fluency increases (+5 each time for each word), the fluency parameter being updated in each VR response; (b) each time that the animated character SPEAKS a word, that word's fluency increases by +1, the fluency parameter being updated in each phrase-in-English response; and (c) for every hour in which Furby has some interaction with the player, ALL words increase by +1, e.g., an hourly check for any key/VR input within the previous hour. The flow charts also address the new fluency mechanism references to fluency increasing (the OOV response and the 'I don't understand' response). Various artificial intelligence (AI) and sensor training functions are provided, in which training between the random and sequential behavior modification of the animated character allows the child to provide reinforcement of desirable activities and responses. In connection with the AI functions, appropriate responses are provided in response to particular activities or conditions, e.g., bored, hungry, sick, or sleepy. Such predefined conditions have programmed responses which are undertaken by the animated character at appropriate times in its operative states. Accordingly, summarizing the wide range of life-like functions and activities the compact and cost-effective toy 100 herein can perform to entertain and provide intelligent-seeming interaction with a child, the following is a description of the various abilities the preferred animated character 100 has, and some of the specifics of how these functions can be implemented, in subroutines or programs P.1-P.48 (Appendix A) as follows.
- P.1 Game Play Flowchart
- P.2 Power Up Sequence
- P.3 Game Play Loop
- P.4 Idle Mode
- P.5 OOV Response
- P.6 Acknowledgement Response
- P.7 Time Out Response
- P.8 Mischief Mode
- P.9 Initiate Response
- P.10 Main Input Mode
- P.11 Main Input Mode (Furbish)
- P.12 Story Mode
- P.13 Song Mode
- P.14 Joke Mode Entry
- P.15 Joke Player Response
- P.16 Joke Correct Respond Sequence
- P.17 End Joke Sequence
- P.18 Hungry Mode
- P.19 Play Mode Entry
- P.20 Furby Select Action (Play Mode)
- P.21 Game Over Handle (Play Mode)
- P.22 Player Tilted Furby (Play Mode)
- P.23 Player Too Slow (Play Mode)
- P.24 Dance Mode
- P.25 Love Mode
- P.26 “How are you” mode
- P.27 Sleep Response
- P.28 Sleep Mode
- P.29 Deep Sleep Mode
- P.30 Sensor Check
- P.31 Tilt Mode
- P.32 Petting Mode
- P.33 Tickle Mode
- P.34 Feeding Mode
- P.35 Question Mode
- P.36 Yes No Response
- P.37 Don't Understand Handle
- P.38 Furby to Furby Mode Entry
- P.39 Furby to Furby VR Check
- P.40 Furby to Furby VR Check pt ½
- P.41 Furby to Furby VR Check pt ¾
- P.42 Furby to Furby VR Check pt 5
- P.43 Play Phrase Handling
- P.44 Fluency Handle During VR Check
- P.45 Hourly Fluency Handle
- P.46 Try Me Mode (Part 1)
- P.47 Try Me Mode (Part 2)
- P.48 General Random Table
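By way of illustration only, the following is the minimal fluency-bookkeeping sketch referenced above, in C, assuming the three fluency conditions and the 0/1/2 pseudo-random sensor increment operate as described; the data layout and function names are hypothetical.

```c
/* Hedged sketch of the fluency bookkeeping described above: +5 when a word is
 * heard and understood, +1 when a word is spoken, and +1 to every word for
 * each hour in which there was some interaction. Also shown is the
 * pseudo-random 0/1/2 sensor increment with a mean of about one. */
#include <stdio.h>
#include <stdlib.h>

#define NUM_WORDS 8

static int fluency[NUM_WORDS];

static void on_word_heard_and_understood(int w) { fluency[w] += 5; } /* per VR response */
static void on_word_spoken(int w)               { fluency[w] += 1; } /* per English phrase */

static void on_hourly_check(int had_interaction) {
    if (!had_interaction) return;
    for (int w = 0; w < NUM_WORDS; w++)          /* all words gain +1 */
        fluency[w] += 1;
}

/* General random table idea: a trigger adds 0, 1 or 2 instead of always 1,
 * keeping the average increment near unity. */
static int random_increment(void) { return rand() % 3; }

int main(void) {
    srand(7);
    on_word_heard_and_understood(3);
    on_word_spoken(3);
    on_hourly_check(1);
    printf("fluency[3] = %d, sample sensor increment = %d\n",
           fluency[3], random_increment());
    return 0;
}
```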
While there have been illustrated and described particular embodiments of the present invention, it will be appreciated that numerous changes and modifications will occur to those skilled in the art, and it is intended in the appended claims to cover all those changes and modifications which fall within the true spirit and scope of the present invention.
Claims
1. An electrically controlled animating apparatus for simulating life-like movements, the apparatus comprising:
- a front facial area, a body and a plurality of movable body parts thereof;
- sound generating circuitry for generating speech including multisyllabic words;
- a plurality of sensors for detecting external inputs;
- a controller responsive to the plurality of sensors;
- said controller being operable to control the sound generating circuitry;
- a mouth assembly on the front facial area comprising flexible molded material having upper and lower mouth portions and having first and second opposing corners thereof;
- a first mouth mechanism operable with the mouth assembly for controlling the first and second corners of the mouth assembly to define smile/emotion states of the mouth assembly responsive to external input from sensed conditions of the plurality of sensors;
- a second mouth mechanism operable with the mouth assembly for controlling the upper and lower mouth portions to provide lip synchronism response to the multisyllabic words generated with the sound generating circuitry;
- a drive system that powers movement of the first and second mouth mechanisms independently to simulate life-like responses to sensed conditions;
- a first control shaft for the first mouth mechanism driven for rotation by the drive system;
- a second control shaft for the second mouth mechanism driven by the drive system;
- at least one of the first or second control shafts having a predetermined range of rotation for causing movement of at least one of the plurality of movable body parts in addition to the first or second mechanisms.
2. The animating apparatus as recited in claim 1 wherein said first control shaft causes movement of at least one of the plurality of movable body parts in addition to the first and second corners of the mouth assembly.
3. The animating apparatus as recited in claim 2 wherein said drive system further comprises a first reversible motor which powers the rotation of said first control shaft for movement of the corners of the mouth assembly, and a second reversible motor which powers the rotation of said second control shaft for movement of the upper and lower mouth portions of the mouth assembly.
4. The animating apparatus as recited in claim 3 further comprising a mechanical coupling between said second motor and said second control shaft for transmitting rotary output power from the second motor to the second shaft for rotation thereof.
5. The animating apparatus as recited in claim 4 wherein said at least one of the plurality of movable body parts includes a foot portion, and rotation of said second control shaft in a direction causes movement of said foot portion and rotation of said second control shaft in an opposite direction causes movement of said second mouth mechanism for controlling upper and lower mouth portions of said mouth assembly.
6. The animating apparatus as recited in claim 5 wherein said mechanical coupling comprises one or more of a shuttle gear or a clutch mechanism between said second motor and said second control shaft for transmitting rotary output power from the second motor to the second shaft for rotation of said second control shaft in the direction causing movement of said foot portion and rotation of said second control shaft in the opposite direction causing movement of said second mouth mechanism.
7. The animating apparatus as recited in claim 1 wherein said controller provides the animating apparatus with a plurality of states comprising animating apparatus modes that include excited, sleeping, and waking modes.
8. The animating apparatus as recited in claim 2 wherein said at least one of a plurality of body parts includes one or more of the following: an ear assembly; an eye assembly; an eye lid assembly; a plume assembly; a brow assembly; and a chest assembly.
9. The animating apparatus as recited in claim 2 wherein said first and second corners of the mouth assembly pivot to a neutral state, an up/smile state and a down/frown state.
10. The animating apparatus as recited in claim 9 wherein said first control shaft has a neutral position in the predetermined range of shaft rotation with the corners of the mouth assembly in a neutral state.
11. An animating system comprising:
- means for generating speech including multisyllabic words;
- means for detecting external sensor inputs;
- means for controlling the means for generating speech responsive to the means for detecting external sensor inputs;
- said means for controlling being operable to control a mouth assembly comprising flexible molded material having upper and lower mouth portions and having first and second opposing corners thereof;
- said means for controlling being operable to control a first mouth mechanism operable with the mouth assembly for controlling the first and second corners of the mouth assembly to define smile/emotion states of the mouth assembly responsive to external input from sensed conditions;
- said means for controlling being operable to control a second mouth mechanism operable with the mouth assembly for controlling the upper and lower mouth portions to provide lip synchronism response to the multisyllabic words generated with the sound generating; and
- means for driving movement of the first and second mouth mechanisms independently to simulate life-like responses to sensed conditions including a first control shaft for the first mouth mechanism driven for rotation by the drive system, and a second control shaft for the second mouth mechanism driven by the drive system with at least one of the first or second control shafts having a predetermined range of rotation for causing movement of at least one of the plurality of movable body parts in addition to the first or second mechanisms.
12. The animating system as recited in claim 11 wherein said first control shaft causes movement of at least one of the plurality of movable body parts in addition to the first and second corners of the mouth assembly.
13. The animating system as recited in claim 12 wherein said drive system further comprises a first reversible motor which powers the rotation of said first control shaft for movement of the corners of the mouth assembly, and a second reversible motor which powers the rotation of said second control shaft for movement of the upper and lower mouth portions of the mouth assembly.
14. The animating system as recited in claim 13 further comprising means for coupling between said second motor and said second control shaft for transmitting rotary output power from the second motor to the second shaft for rotation thereof.
15. The animating system as recited in claim 14 wherein said at least one of the plurality of movable body parts includes a foot portion, and rotation of said second control shaft in a direction causes movement of said foot portion and rotation of said second control shaft in an opposite direction causes movement of said second mouth mechanism for controlling upper and lower mouth portions of said mouth assembly.
16. An animating method comprising:
- generating speech including multisyllabic words;
- detecting external sensor inputs;
- controlling a mouth assembly comprising flexible molded material having upper and lower mouth portions and having first and second opposing corners thereof for generating speech responsive thereto and detecting external sensor inputs;
- controlling a first mouth mechanism operable with the mouth assembly for controlling the first and second corners of the mouth assembly to define smile/emotion states of the mouth assembly responsive to external input from sensed conditions;
- controlling a second mouth mechanism operable with the mouth assembly for controlling the upper and lower mouth portions to provide lip synchronism response to the multisyllabic words generated with the sound generating; and
- driving movement of the first and second mouth mechanisms independently to simulate life-like responses to sensed conditions including a first control shaft for the first mouth mechanism driven for rotation by the drive system, and a second control shaft for the second mouth mechanism driven by the drive system with at least one of the first or second control shafts having a predetermined range of rotation for causing movement of at least one of the plurality of movable body parts in addition to the first or second mechanisms.
17. The animating method as recited in claim 16 wherein said first control shaft causes movement of at least one of the plurality of movable body parts in addition to the first and second corners of the mouth assembly.
18. The animating method as recited in claim 16 wherein said controlling provides a plurality of states comprising modes that include excited, sleeping, and waking modes.
19. The animating method as recited in claim 18 wherein at least one of a plurality of body parts includes one or more of the following: an ear assembly; an eye assembly; an eye lid assembly; a plume assembly; a brow assembly; and a chest assembly.
20. The animating method as recited in claim 17 wherein said first and second corners of the mouth assembly pivot to a neutral state an up/smile state and a down/frown state, and said first control shaft has a neutral position in the predetermined range of shaft rotation with the corners of the mouth assembly in a neutral state.
21. An electrically controlled animating apparatus for simulating life-like movements, the apparatus comprising:
- a front facial area of a character head and a plurality of movable parts thereof;
- a mouth assembly on the front facial area comprising flexible molded material having upper and lower lips and having first and second opposing corners where the lips meet;
- a first mouth mechanism operable with the mouth assembly for animating the first and second corners of the mouth assembly;
- a second mouth mechanism operable with the mouth assembly for animating the upper and lower mouth portions for animating the lips between opened and closed positions independent of the first mouth mechanism;
- a drive system that powers movement of the first and second mouth mechanisms independently to simulate life-like responses and for causing movement of at least one of the plurality of movable parts in addition to the first or second mouth mechanisms.
22. The animating apparatus as recited in claim 21 wherein said first and second corners of the first mouth mechanism are moveable between neutral, up/smile state and down/frown states.
23. The animating apparatus as recited in claim 21 wherein said drive system further comprises a first reversible motor which powers the movement of the corners of the mouth assembly, and a second reversible motor which powers the movement of the upper and lower mouth portions of the mouth assembly independently.
24. The animating apparatus as recited in claim 21 wherein the drive system facilitates the animating apparatus with a plurality of states comprising animating apparatus modes that include excited, sleeping, and waking modes.
25. The animating apparatus as recited in claim 24 wherein the plurality of modes includes movement of one or more of the following moveable parts: an ear assembly; an eye assembly; an eye lid assembly; a plume assembly; a brow assembly; and a chest assembly.
26. An electrically controlled animating apparatus for simulating life-like movements in a front facial area of a character head, the apparatus comprising:
- a molded flexible elastomeric mouth including upper and lower lips having center portions and opposite side corners where the lips meet, the mouth including attachment points at the center portions and corners;
- a first animation mechanism for animating the mouth for simulated talking movement by moving the attachment points at the center portions; and
- a second animating mechanism independent from the first animating mechanism for moving the attachment points at the corners to simulate mouth movement corresponding to facial expression;
- wherein the first and second animating mechanisms act to distort the elastomeric mouth to achieve simulated talking movement and facial expression as the animating mechanisms are operated.
27. The animating apparatus as recited in claim 26 comprising a drive system that powers movement of the first animation mechanism independent of the movement of the second animation mechanism for providing simulated talking movement or facial expression, or a combination thereof as the animating mechanisms are operated.
28. The animating apparatus as recited in claim 26, said elastomeric mouth comprising a neutral undeformed position, wherein the second animating mechanism moves the attachment points at the corners to deform the elastomeric mouth from the neutral undeformed position.
29. The animating apparatus as recited in claim 28 wherein said opposite side corners where the lips meet are moveable between neutral, up/smile state and down/frown states with the second animating mechanism.
Type: Application
Filed: May 27, 2005
Publication Date: Nov 30, 2006
Inventors: Richard Maddocks (Barrington, RI), Eduardo Rodriguez (Tiverton, RI), Jeffrey Ford (Warwick, RI), Peter Hall (Norfolk)
Application Number: 11/140,483
International Classification: A63H 3/36 (20060101);