Toy system cooperating with Computer

Disclosed is a toy system for downloading digital contents via the internet and processing the downloaded digital contents through real time cooperation between a computer and a toy. The computer comprises a recording medium which stores digital contents and a digital contents driving program. The toy comprises a microprocessor, a communication port, a digital sound decoder, a speaker, a motor driving unit, a motor, an LED/lighting device and an LED/lighting device control unit. The digital contents comprise image data, first and second sound data and motion control data. The digital contents driving program processes the following steps: reading the digital contents; displaying the image data of the digital contents on a screen of the computer; outputting the second sound data of the digital contents to a speaker of the computer; and transmitting the first sound data and the motion control data of the digital contents as a packet form to the toy connected with the computer.

Description
BACKGROUND OF THE INVENTION

[0001] 1. Field of the Invention

[0002] The present invention relates generally to a toy for executing digital contents, and more particularly to a toy system for downloading digital contents via the internet to a computer and processing the downloaded digital contents interactively in the computer and in a toy connected to the computer.

[0003] 2. Description of the Prior Art

[0004] As the internet has developed lately, a number of digital contents providers have come to provide various digital contents. The digital contents include, for example, MP3 music files. For executing these digital contents, a digital contents driving program can be processed in a computer or a digital contents player.

[0005] Therefore, when a user has the digital contents driving program or the digital contents player, he/she can be provided with new digital contents through access to a digital contents provider's server and execute the contents in the computer or the digital contents player.

[0006] Meanwhile, motor driven toys such as robots, vehicles and the like are currently on the market, which can move by using motors installed therein or output sound by using memories and speakers installed therein. These kinds of toys of the related art have small memory capacity and cannot be upgraded. So, they execute or output very simple motions or sounds, thereby boring children after a short time. Also, a long-term educational benefit cannot be expected from these toys.

[0007] In order to solve the foregoing problems, it has been proposed to apply internet technologies to the toys. An example of such proposals is disclosed in Korean Patent Application laid-open No. 1999-0027814, which relates to an apparatus for controlling motion of a toy. More particularly, the apparatus can control a motor driving unit to perform various motions according to a program which is transmitted from a computer after being composed therein.

[0008] However, according to the above-identified application, the digital contents downloaded from the computer are stored in a storage unit in the toy, and the toy is operated independently from the computer. Therefore, the toy is required to further comprise at least a certain capacity of memory, and the size of the downloaded digital contents is restricted according to the memory capacity installed in the toy.

SUMMARY OF THE INVENTION

[0009] Accordingly, the present invention solves the foregoing problems of the related art. Therefore, it is an object of the invention to provide a toy system which comprises a computer and a toy connected to the computer. The computer of the toy system accesses the server of a digital contents provider and downloads digital contents to a memory of the computer. When the digital contents are executed in the computer, the screen of the computer cooperates with the motion and sound of the toy on a real time base.

[0010] It is another object of the invention to provide a toy system which enables cooperation between a toy and a computer by transmitting motion control data or sound data from the computer to the toy without downloading the digital contents to the toy, so that a toy with a small memory capacity can drive digital contents larger than its memory.

[0011] It is a further object of the invention to provide a toy which can automatically upgrade a motion code table therein without replacement of parts of the toy or of the toy itself, thereby performing various motions in response to the digital contents.

[0012] It is yet another object of the invention to provide a motor control apparatus and method which controls the motion of the mouth of a toy or a robot according to the amplitude of a sound signal output to the mouth, thereby enabling the mouth to move very naturally.

[0013] According to an embodiment of the invention to achieve the foregoing objects, a toy system is provided which comprises a computer with a recording medium in which digital contents and a digital contents driving program are stored, and a toy interfaced with the computer.

[0014] The digital contents are stored in the memory of the computer and comprise image data, sound data and motion control data.

[0015] The digital contents driving program processes the steps of reading the digital contents, displaying the image data of the digital contents on a screen of the computer and transmitting the sound data and the motion control data of the digital contents as a packet form to the toy.

[0016] The toy is comprised of a communication port for providing an interface with the computer; a microprocessor for decoding a data packet received via the communication port, outputting sound data and a motion control signal, and controlling operations of the toy; memory for storing a program and a motion code table required for the operation of the microprocessor; a decoder for decoding the sound data from the microprocessor to output a sound signal; a motor driving unit for driving a motor in response to the motion control signal from the microprocessor; a motor installed in each part of the toy; and a speaker for outputting the sound signal.

[0017] According to the invention, the toy receives the control data and the sound data necessary for the operation thereof from the computer, decodes and outputs the received data in real time so that the motion and sound output of the toy is synchronized with the display image on the computer screen.

[0018] Preferably, the toy has a motion code table in the memory, in which the motion code table comprises a character ID, a motion code table version, motion codes and motor control data corresponding to each motion code. The digital contents driving program further processes the steps of: comparing the version of the motion code table of the toy with that of a motion code table stored in the computer; and, if the version of the motion code table of the toy is lower than that of the computer, downloading the motion code table stored in the computer to the memory of the toy.

[0019] The toy can also comprise a robot and a stage unit; the robot has a motor and an EEPROM in which the motion code table is written, and the robot is detachable from the stage unit.

[0020] The digital contents can also comprise a sound data pool, the sound data pool being composed of sound data IDs, and sound data and motion control data corresponding to each sound data ID. Also, the digital contents driving program obtains the sound data and the motion control data corresponding to the sound data ID and transmits the sound data and the motion control data to the toy.

[0021] Also, the toy transmits an event signal to the digital contents driving program when the input unit of the toy receives an external input. The digital contents driving program processes an operation corresponding to the event signal, which informs of start, stop or another operation from the user control buttons installed in the toy, or informs whether the robot is detached.

[0022] According to another embodiment of the invention to achieve the foregoing objects, there is provided a method of controlling a motor by using a sound signal, the method comprising the steps of: (a) amplifying the sound signal; (b) removing high-frequency components of the amplified sound signal to detect an envelope of the sound signal; and (c) using the envelope as the reference signal for controlling the motor.

[0023] Preferably, the method further comprises the steps of converting an analog signal corresponding to the envelope of the sound signal into a digital signal, and using the converted digital signal as the reference input signal for controlling the motor.

[0024] The motor controlling method of the invention is applied to control the motor driving the mouth of a speaking robot or toy, and the sound signal is preferably a sound signal outputted via a speaker installed in the robot or toy.

[0025] The control data for operating the toy and the sound data for reciting a fairy tale are transmitted from the computer in real time instead of being downloaded to and stored in the toy, so that the toy does not require a large memory capacity for storing the whole digital contents. Also, the motion, sound output and lighting control of the toy can be synchronized in real time with the image and sound output of the computer.

[0026] It is to be understood that the foregoing general description of the present invention is exemplary and explanatory.

BRIEF DESCRIPTION OF THE DRAWINGS

[0027] The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the principle of the invention.

[0028] FIG. 1 is a block diagram generally showing a computer and a toy system which cooperates with the computer according to the present invention;

[0029] FIG. 2A shows a configuration of digital contents according to the first embodiment of the present invention;

[0030] FIG. 2B shows a configuration of digital contents according to the second embodiment of the present invention;

[0031] FIG. 2C shows a configuration of a sound data pool of the digital contents according to the second embodiment of the present invention;

[0032] FIG. 3 is a block diagram showing a digital contents driving program of the invention;

[0033] FIG. 4 is a flow chart showing the operation of the digital contents driving program according to the present invention;

[0034] FIG. 5A through 5D show configurations of packets including sound data, motion control data, LED control data and lighting control data;

[0035] FIG. 6 shows a download control program in a digital contents driving program according to the invention;

[0036] FIG. 7 is a block diagram showing a toy system according to the invention;

[0037] FIG. 8 is a block diagram showing the operation of a microprocessor;

[0038] FIG. 9A is a flow chart showing a sequential process of a microprocessor;

[0039] FIG. 9B is a flow chart showing an interrupt handling routine of the microprocessor shown in FIG. 9A;

[0040] FIG. 10 is a circuit diagram of a motor control unit adopted in a toy according to the invention; and

[0041] FIG. 11 is a flow chart showing the operation of the motor control unit shown in FIG. 10.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

[0042] Hereinafter, a detailed description will be provided about the configuration and operation of the digital contents, the digital contents driving program and the toy system with reference to the accompanying drawings. First, FIG. 1 is a system block diagram generally showing a toy system according to the invention. A description will be provided about the configuration and operation of the first embodiment of the toy system according to the invention with reference to FIG. 1.

[0043] The toy system of the invention is comprised of a computer having digital contents and a digital contents driving program, and a toy connected with the computer by either a wired or a wireless interface. The toy is a movable device having motors.

[0044] As shown in FIG. 2A, the digital contents of the first embodiment comprise image data 210, first sound data 216, second sound data 212, motion control data 214 and LED/lighting control data 218. Here, the data need not be written in a uniform order.

[0045] A user accesses a server of a digital contents provider via the internet and downloads such digital contents. Here, an updated new motion code table corresponding to a character ID of a robot is downloaded together.

[0046] FIG. 2B shows digital contents according to the second embodiment of the invention. The digital contents include a sound data pool and a sound data ID as well as image data, second sound data, motion control data and LED/lighting control data.

[0047] Meanwhile, as shown in FIG. 2C, the sound data pool includes sound data IDs, and first sound data, motion control data and LED/lighting control data corresponding to each sound data ID. Here, the first sound data, the motion control data and the LED/lighting control data corresponding to one sound data ID are signals necessary for operating simultaneously in the toy. In other words, the sound data pool is provided in a certain area of the digital contents, and the sound data ID functioning as a key word of the sound data pool is used in the digital contents to allow size reduction of the digital contents.
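As an illustration only (not part of the disclosure), the relationship between a sound data ID and the sound data pool can be sketched in Python as follows; the class and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SoundPoolEntry:
    """Data bundled under one sound data ID in the sound data pool (FIG. 2C)."""
    first_sound_data: bytes      # sound to be decoded and output by the toy
    motion_control_data: bytes   # motion codes to be executed while speaking
    led_lighting_data: bytes     # LED/lighting control data

# The pool maps each sound data ID to its bundled data, so the body of the
# digital contents only needs to carry the ID, reducing the overall content size.
sound_data_pool = {
    0x0001: SoundPoolEntry(b"...", b"...", b"..."),   # placeholder payloads
    0x0002: SoundPoolEntry(b"...", b"...", b"..."),
}

def expand(sound_data_id: int) -> SoundPoolEntry:
    """Resolve a sound data ID found in the digital contents."""
    return sound_data_pool[sound_data_id]
```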

[0048] Again, the user can access the server of the digital contents provider via the internet and download such digital contents together with the motion code table corresponding to the character ID of the robot. The downloaded digital contents and the motion code tables are stored in a memory of the computer. The motion code table is downloaded to the memory in the toy by the digital contents driving program.

[0049] The image data and the second sound data are respectively outputted to a monitor and a speaker of the computer when the digital contents are executed. The first sound data, the motion control data and the LED/lighting control data are transmitted to the toy from the computer after being packeted therein. A detailed description about each of these data will be provided later in the specification.

[0050] If the data of the read digital contents is a sound data ID, the first sound data, the motion control data and the LED/lighting control data corresponding to the sound data ID are read from the sound data pool and transmitted to the toy from the computer after being packeted therein. By including the sound data pool, the digital contents can be reduced in size.

[0051] Meanwhile, the digital contents can further include data corresponding to encryption codes for preventing any illegal copy thereof.

[0052] FIG. 3 is a block diagram showing the function of a driving program 300 for reading and executing the digital contents stored in the memory of the computer, and FIG. 4 is a flow chart showing the operation of the digital contents driving program.

[0053] First, a description about a configuration of the driving program in reference to FIG. 3 will be provided.

[0054] The driving program comprises a digital contents reader 310 for reading the digital contents from the memory, a PC screen display unit 320 and a PC speaker output unit 322 for outputting data of the digital contents to a screen and a speaker according to data types, respectively, a data packet transmitting unit 324, a download control unit 330 and an input signal processing unit 340.

[0055] Upon receiving an event or a request signal from the toy, the input signal processing unit 340 processes the event corresponding to the input signal by using an interrupt handling routine. The type and substance of the event will be described later.

[0056] The operation of the download control unit 330 will also be described later.

[0057] Hereinafter, a description will be provided about the operation of the digital contents driving program in reference to FIG. 4.

[0058] First, the digital contents are read and loaded in step 410, and the digital contents are analyzed in step 420.

[0059] Then, if the data of the digital contents are image data in step 430, the image data are displayed on the screen of the computer in step 432. If the data of the digital contents are the second sound data in step 440, the second sound data are outputted to the speaker installed in the computer in step 442.

[0060] In sequence, if the data of the digital contents are one of the first sound data, the motion control data and the LED/lighting control data in step 450, the data are formed into a packet and transmitted to the toy that is interfaced with the computer in step 452. Then, if the digital contents are terminated, the operation of the digital contents driving program is ended; if not, the process returns to step 420.
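The dispatch performed in steps 420 through 452 can be pictured with the following Python sketch; the data-type tags and the send_packet helper are assumptions for illustration, not the actual driving program.

```python
def run_driving_program(contents, screen, pc_speaker, toy_link):
    """Illustrative main loop of the digital contents driving program (FIG. 4)."""
    for chunk in contents:                      # steps 410/420: read and analyze
        if chunk.kind == "image":               # steps 430/432
            screen.display(chunk.data)
        elif chunk.kind == "second_sound":      # steps 440/442
            pc_speaker.play(chunk.data)
        elif chunk.kind in ("first_sound", "motion", "led_lighting"):
            toy_link.send_packet(chunk.kind, chunk.data)   # steps 450/452
        # the loop ends when the digital contents are exhausted
```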

[0061] FIG. 5A to FIG. 5D show configurations of the data packet formed by the digital contents driving program.

[0062] FIG. 5A is a packet of motion control data, and is composed of a communication header and the motion control data. The motion control data represent data codes in the motion code table previously set about desired operations and written in the memory of the toy.

[0063] FIG. 5B is a packet of lighting control data which is composed of a communication header, a start RGB value, a termination RGB value, a fixed time and a number of repetitions. The lighting device varies from the start RGB value to the termination RGB value over the fixed time, repeated as many times as the number of repetitions.

[0064] FIG. 5C is a packet of an LED control signal which is composed of a communication header, a number of repetitions, and pairs of LED ON/OFF and time fields. Each LED ON/OFF field is a string of bits; an LED is turned on when its bit is 1 and off when its bit is 0. Each LED is turned on/off at the specified time, repeated as many times as the number of repetitions, where the time is measured relative to the point of receiving the command.

[0065] FIG. 5D is a packet of the first sound data and is composed of the first sound data, time, a control command, . . . , time and another control command. While the first sound data are decoded and processed, the packet is synchronized and the control commands are executed so as to cause the robot to speak and move. Here, the control commands are data required for controlling the motion of the robot and the lighting as soon as the first sound data are outputted.
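The packet layouts of FIGS. 5A through 5C could be serialized as in the following hedged Python sketch; the one-byte header codes and field widths are illustrative assumptions, since the disclosure does not fix them.

```python
import struct

HDR_MOTION, HDR_LIGHTING, HDR_LED = 0x01, 0x02, 0x03   # assumed header codes

def motion_packet(motion_code: int) -> bytes:
    # FIG. 5A: communication header + motion code from the motion code table
    return struct.pack(">BB", HDR_MOTION, motion_code)

def lighting_packet(start_rgb, end_rgb, fixed_time_ms, repeats) -> bytes:
    # FIG. 5B: header, start RGB, termination RGB, fixed time, repetitions
    return struct.pack(">B3B3BHB", HDR_LIGHTING, *start_rgb, *end_rgb,
                       fixed_time_ms, repeats)

def led_packet(repeats, on_off_time_pairs) -> bytes:
    # FIG. 5C: header, repetitions, then (LED on/off bitmap, time) pairs
    body = b"".join(struct.pack(">BH", bits, t) for bits, t in on_off_time_pairs)
    return struct.pack(">BB", HDR_LED, repeats) + body
```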

[0066] FIG. 6 is a flow chart showing the operation of the download control unit.

[0067] First, the digital contents driving program requests the character ID of the toy and version information of the motion code table written in the memory of the toy in step 600, and receives the character ID and the version information from the toy in step 610.

[0068] Then, the driving program compares the character ID and the motion code table version received from the toy with the character ID and the motion code table version which are downloaded from the internet and stored in the computer in step 620.

[0069] If the character IDs are the same and the motion code table of the toy has a lower version than that of the computer in step 630, the driving program transmits a download start code to the toy in step 640. Then, upon receiving a ready-for-receiving code from the toy in step 650, the driving program downloads the motion code table corresponding to the character ID to the memory installed in the toy in steps 660 and 670, and then terminates.
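The handshake of FIG. 6 can be summarized in the following Python sketch; the handshake code values and the toy_link methods are hypothetical.

```python
DOWNLOAD_START, READY_FOR_RECEIVING = 0xA0, 0xA1   # assumed handshake codes

def update_motion_code_table(toy_link, pc_table):
    """Illustrative download control flow of FIG. 6 (steps 600-670)."""
    toy_link.request_id_and_version()                         # step 600
    char_id, toy_version = toy_link.receive_id_and_version()  # step 610
    if char_id != pc_table.character_id:                      # steps 620/630
        return                                                # IDs differ: no download
    if toy_version >= pc_table.version:
        return                                                # toy already up to date
    toy_link.send_code(DOWNLOAD_START)                        # step 640
    if toy_link.receive_code() == READY_FOR_RECEIVING:        # step 650
        toy_link.download(pc_table.serialize())               # steps 660/670
```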

[0070] Hereinafter, the configuration and the operation of the toy are described with reference to FIG. 7 to FIG. 10.

[0071] As shown in FIG. 7, the toy comprises a communication port 710, a microprocessor 720, a digital sound decoder 750, a speaker 754, an envelope detector 756, an A/D converter 758, a motor driving unit 760, motor 762, an LED/lighting device driving unit 770, an LED or a lighting device 772, an input unit 740 and memories 730 and 732.

[0072] Here, the toy is comprised of a stage unit and a robot, in which the robot is preferably detachable from the stage unit. In this configuration, the robot can be replaced with a new robot character at any time. Also, the motion code table is varied according to the character type of the robot, and can be replaced together with the robot.

[0073] The stage unit includes the communication port, the microprocessor for controlling the entire operations of the toy, the memories, the digital sound decoder, the motor driving unit, the LED/lighting device driving unit, the LED/lighting device and the input unit.

[0074] The communication port 710 enables the wire or wireless interface of the toy with the computer.

[0075] The microprocessor 720 controls the entire operations of the toy, and as shown in FIG. 8, comprises a communication processor 820, a packet decoder unit 830, an input processing unit 810, a motor control unit 840, a sound data processing unit 850 and an LED/lighting device control unit 860. Hereinafter, a detailed description is provided about the operation of each component of the microprocessor.

[0076] The communication processor 820 successively stores a data stream received from the communication port and, upon completion of receiving a packet, outputs the received packet to the packet decoder unit 830, which will be described later, or transmits the data received from the input processing unit 810 to the computer.

[0077] The packet decoder unit 830 decodes the packet from the communication processor, and transmits the packet to the sound data processing unit if the packet is the first sound data, to the motor control unit if it is motion control data, and to the LED/lighting control unit if it is LED/lighting control data.
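The routing performed by the packet decoder unit 830 amounts to a dispatch on the packet type, as in this illustrative Python sketch; the header tags are assumed values continuing the earlier packet sketch.

```python
HDR_FIRST_SOUND, HDR_MOTION, HDR_LED, HDR_LIGHTING = 0x04, 0x01, 0x03, 0x02  # assumed tags

def decode_and_route(packet: bytes, sound_unit, motor_unit, led_unit):
    """Illustrative dispatch of the packet decoder unit 830."""
    header, payload = packet[0], packet[1:]
    if header == HDR_FIRST_SOUND:
        sound_unit.process(payload)       # -> sound data processing unit 850
    elif header == HDR_MOTION:
        motor_unit.process(payload)       # -> motor control unit 840
    elif header in (HDR_LED, HDR_LIGHTING):
        led_unit.process(payload)         # -> LED/lighting control unit 860
```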

[0078] When an internal event or an external event from the input unit takes place, the input processing unit 810 transmits a signal corresponding to each event via the communication processor to the computer. Here, the internal events include a signal informing whether the sound output to the speaker of the toy and the motion of the toy are synchronized, and another signal informing whether the robot is detached from or attached to the stage unit. The event from the input unit is a signal inputted from the user control buttons installed in the toy, or a signal informing of the start, stop, forward and backward of the contents, the start of the contents reader, and the like.

[0079] The motor control unit 840, after reading position data and time duration data of the motors corresponding to the motion control data from the packet decoder in the motion code table, obtains a motion control signal from the position and time duration data, and outputs the motion control signal to the motor driving unit.

[0080] The sound data processing unit 850 outputs the sound data from the packet decoder to the digital sound decoder.

[0081] The LED/lighting control unit 860 obtains an LED/lighting control signal from the LED/light control data outputted from the packet decoder, and then outputs the control signal to the LED/lighting device driving unit.

[0082] In this manner, the operation and the speaking motion of the toy are controlled in response to the signals output from the microprocessor.

[0083] The memories store the programs and data required for the operation of the microprocessor. Preferably, the first memory stores the program and data required for the operation of the microprocessor, and the second memory stores the motion code table. More preferably, the second memory is an EEPROM, i.e., an electrically erasable programmable ROM.

[0084] The digital sound decoder 750 decodes and amplifies the sound signal from the microprocessor 720 by using the sound amplifier 752 and outputs the amplified sound signal to the speaker 754.

[0085] Meanwhile, the envelope detector 756 detects an envelope of a sound signal from the digital sound decoder 750, and the A/D converter 758 converts an analog signal of the envelope into a digital signal. The converted sound signal is used as the reference input signal to control the motor 762 for moving the mouth of the robot.

[0086] The motor driving unit 760 drives the motor 762 in response to the motion control signal from the microprocessor 720.

[0087] The LED/lighting device driving unit 770 controls the operation of the LED or the lighting device 772 in response to the LED/lighting control signal produced from the microprocessor 720. The LED/lighting control signal expresses brightness, color and the like of the LED or the lighting device 772.

[0088] The input unit 740 comprises the user control buttons and, when an external input takes place, interrupts the microprocessor to inform it of the external input.

[0089] The robot comprises a plurality of motors, speakers and memories.

[0090] The motion code table is composed of the character ID of the toy, version of the motion code table, the motion codes and control data of the motor respectively corresponding to the motion codes. The control data include at least position data and time duration data.

[0091] Upon receiving the motion control data packet from the computer, the toy obtains the motion code from the packet and, using the motion code as a key, retrieves the corresponding motor control data from the motion code table. The motors in the robot are then controlled by using the motor control data.
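The lookup described above, using the motion code as a key into the motion code table to retrieve position and duration data, might look like the following Python sketch; the table layout and example entries are assumptions consistent with paragraph [0090].

```python
motion_code_table = {
    # motion_code: list of (motor_index, position, duration_ms) -- illustrative entries
    0x10: [(0, 90, 500), (1, 45, 500)],
    0x11: [(2, 30, 300)],
}

def handle_motion_packet(motion_code: int, motor_driver):
    """Call the motor control data keyed by the received motion code."""
    for motor_index, position, duration_ms in motion_code_table[motion_code]:
        motor_driver.move(motor_index, position, duration_ms)
```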

[0092] If the motion code table has a version lower than that of the motion code table stored in the cooperating computer, a new version of the motion code table is downloaded from the computer to the second memory of the robot. Therefore, the motion code table can be upgraded without replacing the existing robot or the toy.

[0093] The speaker 754 outputs the sound signal transmitted from the computer.

[0094] The motor 762 is mounted to portions of the robot allowing limbs of the robot to move. Also, the motor is operated in response to the motion control signal which the microprocessor produced by using the motor control data from the motion code table.

[0095] One of the motors, connected to the mouth of the robot, is controlled by the sound signal using an additional motor control unit as shown in FIG. 10. The motor control unit comprises an amplifier 1000 for amplifying an input signal, a diode 1010 and a low-pass filter 1020. More preferably, the motor control unit further comprises an A/D converter as shown in FIG. 7.

[0096] FIG. 9A is a flow chart showing a sequential process of the microprocessor, and FIG. 9B is a flow chart showing a process which is interrupted at a certain time interval in the microprocessor.

[0097] Referring to FIG. 9A, when a new packet is received in step 910, the received packet is decoded in step 960 to inspect the data of the packet. Here, if the received packet is the sound data packet in step 962, the packet is stored in step 964. If the received packet is the motion control data packet in step 970, a motion mode is set in step 972. If the received packet is the LED/lighting control data packet in step 980, an LED/lighting mode is set in step 982.

[0098] If a new data packet is not received in step 910, it is checked whether the buffer of the sound decoder is full in step 920. If the buffer of the sound decoder is not full, it is checked whether data stream receipt is ended in step 930. If the data stream is not ended, it is checked whether restoration of the received sound data packet is terminated in step 940. If restoration is not terminated, the sound data are outputted in step 950. If terminated, transmission of the next packet is requested in step 940. Then, the process returns to step 910.
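The sequential process of FIG. 9A reduces to a polling loop over the packet flag and the sound decoder buffer, as in this illustrative Python sketch; the state object and its method names are hypothetical.

```python
def microprocessor_main_loop(state):
    """Illustrative sequential process of FIG. 9A."""
    while True:
        if state.new_packet_received:                 # step 910
            packet = state.take_packet()
            route = decode(packet)                    # step 960
            if route == "sound":                      # steps 962/964
                state.store_sound(packet)
            elif route == "motion":                   # steps 970/972
                state.set_motion_mode(packet)
            elif route == "led_lighting":             # steps 980/982
                state.set_led_lighting_mode(packet)
        elif not state.decoder_buffer_full():         # step 920
            if not state.stream_ended():              # step 930
                if not state.restoration_done():      # step 940
                    state.output_sound_data()         # step 950
                else:
                    state.request_next_packet()
```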

[0099] FIG. 9B is an interrupt handling routine carried out at a certain time interval separately from the sequential process routine of the microprocessor. For control of the LED and the lighting device, this routine should be executed at constant time intervals and is therefore processed as an interrupt routine. For example, after setting the routine so that a timer interrupt takes place at intervals of 25 ms, A/D conversion is started in the timer interrupt. The A/D converted signal may include an output of the envelope detector or a position signal of each motor. When A/D conversion is ended, an A/D conversion-complete interrupt takes place and the routines of motor control, LED control and lighting control are sequentially processed as shown in FIG. 9B. Therefore, motor control, LED flashing and lighting device control can be executed at an exact time interval of 25 ms.
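The fixed-interval control described above can be pictured as a periodic tick handler; in the following Python sketch the 25 ms figure comes from the description, while the function and object names are hypothetical.

```python
TICK_MS = 25  # interval stated in the description

def on_timer_interrupt(adc):
    """Illustrative 25 ms tick: kick off A/D conversion."""
    adc.start_conversion()            # envelope and motor position channels

def on_adc_complete(adc, motor_ctrl, led_ctrl, lighting_ctrl):
    """Illustrative A/D-complete interrupt: run the control routines in order."""
    samples = adc.read()
    motor_ctrl.update(samples)        # motor control
    led_ctrl.update()                 # LED flashing
    lighting_ctrl.update()            # lighting device control
```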

[0100] Further, the microprocessor of the toy also uses an external interrupt routine when an input takes place from a switch or an external input unit. Therefore, when any button of the switch is pushed, the microprocessor of the toy is interrupted so that it reads the switch state and transmits the state value of the switch to the computer through the communication processor.

[0101] The microprocessor executes an interrupt routine whenever data are received from the computer. To be more specific, the data transmitted from the computer are received in the microprocessor via the communication port, in which a receipt interrupt takes place whenever one byte is received. When the receipt interrupt takes place, the microprocessor continues storing the received data in the previously reserved buffer, and when the last byte of the packet is received, sets a flag informing of the receipt of the last byte. Then, in the flow chart of FIG. 9A, the microprocessor can confirm whether a new data packet has been received by inspecting the flag. When packet decoding is terminated, the microprocessor resets the packet flag to "0".
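The byte-by-byte receipt and the packet-complete flag can be sketched as follows in Python; for simplicity the sketch assumes a fixed packet length, which is not specified in the disclosure.

```python
class ReceiveBuffer:
    """Illustrative receive-interrupt handling: one byte stored per interrupt."""
    def __init__(self, packet_length: int):
        self.buf = bytearray()
        self.packet_length = packet_length   # assumed fixed length for illustration
        self.packet_ready = False            # flag inspected in the FIG. 9A loop

    def on_rx_interrupt(self, byte: int):
        self.buf.append(byte)
        if len(self.buf) == self.packet_length:   # last byte of the packet received
            self.packet_ready = True

    def take_packet(self) -> bytes:
        packet = bytes(self.buf)
        self.buf.clear()
        self.packet_ready = False            # reset the flag after decoding
        return packet
```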

[0102] Hereinafter, a detailed description will be provided about the motor control unit for moving the mouth of the robot in reference to FIG. 11.

[0103] First, in step 1100, the motor control unit amplifies the sound signal decoded from the first sound data, which are transmitted from the computer to the toy in a packet form; the amplifier preferably employs a differential amplifier. Amplification is needed because, when the sound signal is lower than 0.7 volt, the diode following the amplifier outputs zero voltage, making envelope detection impossible.

[0104] Then, the sound signal from the amplifier passes through the diode, which removes the negative component of the sound signal, in step 1110. The output signal then passes through the low-pass filter, comprised of a capacitor and a resistor, to remove high-frequency components of the sound signal and detect an envelope of the sound signal in step 1120.

[0105] Then in step 1130, an analog signal corresponding to the detected envelope is converted into a digital signal by using the A/D converter, and the converted signal is used as the reference input signal for controlling the motor in step 1140.
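In software terms, the analog chain of FIG. 11 (amplify, rectify with the diode, low-pass filter, then A/D convert) corresponds to the following Python model; the gain, diode drop, filter constant and 8-bit/5 V conversion are illustrative values, not taken from the disclosure.

```python
def envelope_reference(samples, gain=10.0, diode_drop=0.7, alpha=0.05):
    """Illustrative model of FIG. 11: amplify, rectify, low-pass, quantize."""
    out, acc = [], 0.0
    for s in samples:
        v = gain * s                                 # step 1100: amplification
        v = max(v - diode_drop, 0.0)                 # step 1110: diode removes the negative part
        acc += alpha * (v - acc)                     # step 1120: RC low-pass yields the envelope
        out.append(min(255, int(acc * 255 / 5.0)))   # step 1130: 8-bit A/D over 0-5 V
    return out                                       # step 1140: reference signal for the motor
```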

[0106] The foregoing motor control unit can be used to control the motor connected to the mouth of the robot or the toy. The motor control unit employs the sound signal outputted to the speaker of the robot or the toy so that the degree of moving the mouth can be varied according to the amplitude of the sound signal. As a result, the mouth shape of the robot or the toy is varied according to sound output so that the speaking motion of the robot or the toy can appear more natural.

[0107] According to the toy system as described above, the digital contents driving program is executed in the PC to activate multimedia output such as sound, image and animation of the digital contents, and the toy cooperates with the multimedia in real time to move the limbs of the toy while, for example, reciting a fairy tale, thereby stimulating the interest of a person watching the toy.

[0108] Therefore, the toy system of the invention can be applied to a robot system which recites a fairy tale in an interesting manner, in which the computer screen displays characters and backgrounds of the fairy tale while the robot recites the tale corresponding to the displayed image with various motions and stage lighting effects, thereby making the fairy tale much more amusing.

[0109] Also, the invention can be applied to a language educational system. The computer screen displays images or captions in a foreign language and the toy speaks in correspondence with the caption data displayed on the computer screen, so that children or beginners can study the language with interest while watching the toy.

[0110] The toy system of the invention allows the toy to receive the data from the computer in real time and output the same so that the motion and the sound output of the toy can cooperate with the computer screen.

[0111] The robot of the toy is detachable from the stage unit of the toy and the motion code table is written in the memory installed in the robot, thereby enabling substitution of a new robot at any time. Also, the memory for the motion code table employs an EEPROM, which is electrically rewritable, so that a new version of the motion code table can be downloaded from the computer to update the motion code table of the robot to the new version. Therefore, the motion code table written in the memory of the robot can be upgraded without any additional hardware replacement.

[0112] Also, the sound output signal is used to control the motor for driving the mouth of the toy so that more natural motion of the mouth can be achieved when reciting a fairy tale.

Claims

1. A toy system comprising:

a computer with a recording medium in which digital contents and a digital contents driving program are stored; and
a movable device interfaced with said computer;
wherein said digital contents are stored in said recording medium of said computer, and comprise image data, sound data and motion control data;
wherein said digital contents driving program processes the following steps of:
(a) reading said digital contents;
(b) displaying the image data of said digital contents on a screen of said computer; and
(c) transmitting the sound data and the motion control data of said digital contents as a packet form to said movable device connected with said computer;
wherein said movable device comprises:
a communication port for providing an interface with said computer;
a microprocessor for decoding a data packet received via said communication port, outputting sound data and a motion control signal, and controlling operations of said movable device;
a memory for storing a program and a motion code table required for the operation of said microprocessor;
a decoder for decoding the sound data from said microprocessor to output a sound signal;
a motor driving unit for driving a motor in response to the motion control signal from said microprocessor; and
a motor installed in each part of said movable device; and
wherein the image data is displayed on said computer screen and said movable device decodes the data packet transmitted from said computer to output the sound data to said digital sound decoder and the motion control signal to the motor driving unit when said digital contents driving program is processed in said computer, whereby the motion and sound output of said movable device are synchronized with the screen output of said computer.

2. A toy system according to claim 1, wherein said motion code table comprises a character ID, a motion code table version, motion codes and motor control data corresponding to each motion code, and

wherein said digital contents driving program further processes the steps of:
comparing the version of said motion code table of said movable device with that of a motion code table stored in said computer; and
if said motion code table of said movable device has a lower version than said motion code table stored in said computer, downloading said motion code table stored in said computer to said memory of said movable device.

3. A toy system according to claim 2, wherein said motor control data is composed of position data and time duration data.

4. A toy system according to claim 1, wherein said movable device comprises a robot and a stage unit, said robot having a motor and being detachable from said stage unit.

5. A toy system according to claim 4, wherein said motion code table is stored in an electrically rewritable memory installed in said robot.

6. A toy system according to claim 4, wherein said memory includes first and second memories, said first memory storing a program necessary for the operation of said microprocessor of said movable device and being installed in said stage unit, said second memory storing said motion code table and being installed in said robot.

7. A toy system according to claim 6, wherein said second memory is electrically rewritable.

8. A toy system according to claim 1, wherein said digital contents further comprise a sound data pool, said sound data pool being composed of sound data IDs, and sound data and motion control data corresponding to each sound data ID; and

wherein said digital contents driving program obtains the sound data and the motion control data corresponding to the sound data ID from said sound data pool and transmits the sound data and the motion control data to said movable device.

9. A toy system according to claim 1, wherein said movable device further comprises an input unit for inputting an external signal, said microprocessor of said movable device transmitting an event signal to said digital contents driving program when said input unit receives an external input, said digital contents driving program executing an operation corresponding to the event signal.

10. A toy system according to claim 9, wherein said input unit has user control buttons for instructing start, stop, forward and backward, said event signal informing of the start, stop, forward and backward from said user control buttons installed in said movable device.

11. A toy system according to claim 3, wherein said movable device further comprises an input unit for inputting an external signal, said microprocessor of said movable device transmitting an event signal to said digital contents driving program when said input unit receives an external input, said digital contents driving program executing an operation corresponding to the event signal, said input unit having user control buttons for instructing start, stop, forward and backward, said event signal informing of the start, stop, forward and backward from said user control buttons installed in said movable device, or informing if said robot is detached.

12. A toy system according to claim 1, wherein said digital contents comprise additional sound data, said digital contents driving program outputting said additional sound data of said digital contents to a speaker installed in said computer.

13. A toy system according to claim 1, wherein said microprocessor of said movable device detects an envelope from a waveform of the sound data transmitted via said communication port, and uses the envelope as an electric reference signal to control a motor connected to the mouth of said movable device.

14. A toy system according to claim 1, wherein said microprocessor comprises:

a communication processor for successively storing a data stream received from said communication port, and then upon completion of receiving the packet, outputting the received packet to a packet decoder unit or transmitting data received from an input processing unit to said computer;
a packet decoder unit for decoding the packet from said communication processor, and then transmitting the decoded packet to a motor control unit or a sound data processing unit;
an input processing unit for transmitting a signal corresponding to each event of an input unit to said communication processor when the input unit is inputted;
a motor control unit for using motion control data from said packet decoder unit to read position and time data of motors corresponding to the motion code from the motion code table, obtain the motion control signal from said position and time data, and output the motion control signal to said motor driving unit; and
a sound data processing unit for outputting the sound data from said packet decoder unit to said digital sound decoder.

15. A toy system according to claim 1, wherein said digital contents are downloaded together with the motion code table corresponding to a character ID of said digital contents from a server of a digital contents provider via the internet.

16. A toy system according to claim 15, wherein said digital contents driving program further performs the steps of:

(a) receiving the character ID and the motion code table version of said movable device from said movable device;
(b) comparing the character ID and the motion code table version stored in said computer with the character ID and the motion code table version received from said movable device;
(c) if the character IDs are not the same, stopping the downloading; and
(d) if the character IDs are the same and the motion code table version of said movable device is lower than the motion code table version stored in said computer, downloading the motion code table in said computer to said memory in said movable device.

17. A toy system according to claim 1, wherein said digital contents further comprise LED/lighting control data, said digital contents driving program forming the LED/lighting control data into a packet to transmit the packeted data to said movable device; and

wherein said movable device further comprises an LED or a lighting device, and controls said LED or said lighting device according to the LED/lighting control data from said digital contents driving program.

18. A toy system according to claim 17, wherein said microprocessor comprises:

a communication processor for successively storing a data stream received from said communication port, and then upon completion of receiving the packet, outputting the received packet to a packet decoder unit or transmitting data received from an input processing unit toward said computer;
a packet decoder unit for decoding the packet from said communication processor, and then transmitting the decoded packet to a motor control unit, a sound data processing unit or an LED/lighting control unit;
an input processing unit for transmitting a signal corresponding to each event of an input unit to said communication processor when the input unit is inputted;
a motor control unit for using motion control data from said packet decoder unit to read position and time data of motors corresponding to motion code from the motion code table, obtain the motion control signal from said position and time data, and output the motion control signal to said motor driving unit;
a sound data processing unit for outputting the sound data from said packet decoder unit to said digital sound decoder; and
an LED/lighting control unit for obtaining an LED/lighting control signal from LED/lighting control data outputted from said packet decoder, and outputting said LED/lighting control signal to an LED/lighting driving unit.
Patent History
Publication number: 20020077021
Type: Application
Filed: Aug 6, 2001
Publication Date: Jun 20, 2002
Inventors: Soon Young Cho (Seoul), Jung Ho Moon (Kyunggi-Do), In Seob Lee (Seoul), Sang Kyu Park (Seoul)
Application Number: 09922883
Classifications
Current U.S. Class: Sounding (446/265)
International Classification: A63H001/28