AUTOMATIC COLORING SYSTEM AND METHOD

- ZONG JING INVESTMENT, INC.

An automatic coloring system and method are used for coloring a three-dimensional object. An electronic device executes a coloring design process to obtain a coloring procedure corresponding to the three-dimensional object. The coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof. A connecting interface electrically connects the electronic device to an automatic coloring machine in a separable manner, so as to output the coloring procedure from the electronic device to the automatic coloring machine. Finally, the automatic coloring machine directly executes the coloring instructions in the coloring procedure sequentially.

Description
CROSS-REFERENCES TO RELATED APPLICATIONS

This non-provisional application claims priority under 35 U.S.C. §119(a) on Patent Application No. 101146207 filed in Taiwan, R.O.C. on 2012 Dec. 7, the entire contents of which are hereby incorporated by reference.

BACKGROUND OF THE INVENTION

1. Technical Field

The present invention relates to a coloring technology, and more particularly to an automatic coloring system and method for coloring a three-dimensional object.

2. Related Art

Wanting to be beautiful is a natural human desire, so major manufacturers provide the market with a wide variety of care products and cosmetics for consumers to purchase. However, in order to compose makeup that a person likes and that suits that person, makeup techniques must be practiced repeatedly, and various cosmetics and makeup tools must be purchased, so as to draw various eyebrow shapes, eye lines, eyelashes, eye contours, face makeup, labial makeup, appearance modifications, and color changes. Moreover, differences in proficiency in makeup techniques and the wide range of cosmetics usually result in a gap between the achieved makeup effect and the effect expected by the consumer.

As information technology continues to evolve, some research has provided simulation devices for trying color makeup or care products. Through such a simulation device, a user may simulate the effect of makeup on a screen before purchase instead of trying a color makeup product in person; see, for example, US Patent Publication No. 2005/0135675A1. However, the simulated effect shown on the screen still has to be reproduced by manual makeup skills applying the color makeup to the human face, and the real effect of manual makeup performed by the user is not necessarily equal to the effect presented by the simulation on the screen.

SUMMARY

In an embodiment, an automatic coloring system is used for coloring a three-dimensional object. The automatic coloring system includes an automatic coloring machine, and the automatic coloring machine includes a first connecting interface, a material supply module, a moving module, at least one coloring tool, and a control unit. The material supply module has at least one pigment. The coloring tool is disposed on the moving module. The control unit is connected electrically to the first connecting interface, the material supply module, and the moving module.

The first connecting interface is used for receiving a coloring procedure in a wireless manner or in a wired manner. The coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof. The control unit sequentially executes the coloring instructions in the coloring procedure, and according to the executed coloring instruction controls the material supply module to select at least one pigment and controls the moving module to move one coloring tool to apply the selected pigment to the three-dimensional object.

In some embodiments, the automatic coloring system may further include an electronic device, and the electronic device includes a processing unit, a user interface, and a second connecting interface. The processing unit is connected electrically to the user interface and the second connecting interface. The processing unit is used for receiving an appearance image of the three-dimensional object, and generating an outline image through feature analysis of the appearance image. The user interface is used for displaying the outline image, and sequentially outputting at least one edit instruction corresponding to the outline image, so that the processing unit obtains the coloring procedure in response to the edit instruction. The second connecting interface then outputs the coloring procedure to the first connecting interface in a wireless manner or in a wired manner.

In some embodiments, the automatic coloring system may further include an image capturing module, and the image capturing module is used for capturing the appearance image of the three-dimensional object. The electronic device, the automatic coloring machine, and the image capturing module may be devices capable of being separated from each other. Alternatively, the image capturing module is built in the electronic device or in the automatic coloring machine.

In an embodiment, an automatic coloring method includes receiving an appearance image of a three-dimensional object; generating an outline image through feature analysis on the appearance image; displaying the outline image on a user interface; using the user interface to sequentially output at least one edit instruction corresponding to the outline image; in response to the at least one edit instruction, obtaining a coloring procedure; and outputting the obtained coloring procedure to an automatic coloring machine in a wireless manner or in a wired manner. The coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof.

In some embodiments, the automatic coloring method may further include sequentially executing the coloring instructions in the coloring procedure. An execution step of each coloring instruction includes: according to the executed coloring instruction, controlling the material supply module of the automatic coloring machine to select at least one pigment; and according to the executed coloring instruction, controlling the moving module of the automatic coloring machine to move a coloring tool, so as to apply the selected pigment to the three-dimensional object.

In some embodiments, each coloring instruction includes track information represented by two-dimensional coordinates or represented by three-dimensional coordinates.

In some embodiments, the outline image may be a three-dimensional simulated image.

In view of the above, the automatic coloring system and method according to the present invention are used for coloring a three-dimensional object. Herein, the electronic device executes a coloring design process to obtain a coloring procedure corresponding to the three-dimensional object. The coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof. Through separable electrical connection of the connecting interface, the coloring procedure is output from the electronic device to the automatic coloring machine. Then, the automatic coloring machine directly executes the coloring instructions in the coloring procedure sequentially. In other words, the automatic coloring system and method according to the present invention have the coloring design process (executed by the electronic device), and the actual coloring process (executed by the automatic coloring machine), that are separable, so that the user can design and exchange a colored pattern anytime anywhere. Further, the automatic coloring system and method according to the present invention enable an external device to provide a coloring procedure to be directly executed by the automatic coloring machine, thereby facilitating simplification of the structure of the automatic coloring machine. In some embodiments, by directly providing track information represented by three-dimensional coordinates, the automatic coloring machine can execute the actual coloring process more precisely. In some embodiments, by directly presenting a three-dimensional simulated image, the coloring action in the coloring design process is closer to that in the actual coloring process.

BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will become more fully understood from the detailed description given herein below for illustration only, and thus not limitative of the present invention, wherein:

FIG. 1 is a schematic block diagram of an automatic coloring system according to a first embodiment of the present invention;

FIG. 2 is a schematic block diagram of an electronic device according to a first embodiment of the present invention;

FIG. 3 is a schematic block diagram of an electronic device according to a first embodiment of the present invention;

FIG. 4 is a schematic block diagram of an automatic coloring machine according to a first embodiment of the present invention;

FIG. 5 is a schematic view of an automatic coloring system according to a second embodiment of the present invention;

FIG. 6 is a schematic view of an automatic coloring system according to a third embodiment of the present invention;

FIG. 7 is a flow chart of an automatic coloring method according to a first embodiment of the present invention;

FIG. 8 is a schematic view of a user interface according to an embodiment;

FIG. 9 is a schematic view of a tool option according to an embodiment;

FIG. 10A is a schematic view of a color palette option according to a first embodiment;

FIG. 10B is a schematic view of a color palette option according to a second embodiment;

FIG. 11 is a schematic view of an automatic coloring system according to a fourth embodiment of the present invention;

FIG. 12 is a schematic view of a template option according to an embodiment; and

FIG. 13 is a flow chart of an automatic coloring method according to a second embodiment of the present invention.

DETAILED DESCRIPTION

Terms such as “first”, “second”, and “third” in the following description are used for distinguishing elements, not used for sequencing or limiting differences between the elements, and not used for limiting the scope of the present invention.

Please refer to FIG. 1 to FIG. 4, in which an automatic coloring system 10 includes an electronic device 11 and an automatic coloring machine 12. The electronic device 11 may output a coloring procedure corresponding to a colored pattern of a three-dimensional object 14 to the automatic coloring machine 12, and the automatic coloring machine 12 colors the three-dimensional object 14 by executing the coloring procedure. Herein, the electronic device 11 may be a device capable of executing an application or an equivalent device thereof, such as a portable electronic device or a personal computer. The portable electronic device may be a smart phone, a notebook computer, a tablet computer, or another equivalent device. The three-dimensional object 14 may be a human body, a specific part of a human body (such as the face, an eye, or a nail), or an article (such as a mask or a cup).

The automatic coloring system 10 further includes an image capturing module 13. The image capturing module 13 is used for capturing an appearance image Pf of the three-dimensional object 14. In some embodiments, the electronic device 11, the automatic coloring machine 12, and the image capturing module 13 may be devices capable of being separated from each other. The separable image capturing module 13 is, for example, a digital camera or a webcam. Preferably, the image capturing module 13 is an image pickup device capable of color photographing. In some embodiments, the image capturing module 13 may be built in the electronic device 11 (as shown in FIG. 2) or in the automatic coloring machine 12 (as shown in FIG. 4).

Please refer to FIG. 2 and FIG. 3, in which the electronic device 11 includes a processing unit 110, a user interface 120, a connecting interface 130, and a storage unit 140.

Please refer to FIG. 4, in which the automatic coloring machine 12 includes a control unit 210, a connecting interface 230, a material supply module 240, a moving module 250, and at least one coloring tool 260 and 262.

To make the description clear, in the following, the connecting interface 230 of the automatic coloring machine 12 is referred to as the first connecting interface 230, and the connecting interface 130 of the electronic device 11 is referred to as the second connecting interface 130.

Please refer to FIG. 2 and FIG. 3, in which the processing unit 110 is connected electrically to the user interface 120, the second connecting interface 130, and the storage unit 140. The second connecting interface 130 is used for being connected electrically to the first connecting interface 230 of the automatic coloring machine 12 in a wireless manner or in a wired manner. The electrical connection in the wired manner may be direct connection (for example, the first connecting interface 230 and the second connecting interface 130 are a male connector and a female connector, which are physical connectors, respectively), or indirect connection (for example, through a connecting cable 15 or an equivalent device thereof).

In some embodiments, please refer to FIG. 2, in which the electronic device 11 may have the built-in image capturing module 13, and the image capturing module 13 is connected electrically to the processing unit 110, as shown in FIG. 2. The appearance image Pf captured by the image capturing module 13 may be transmitted to the processing unit 110, or may be stored in the storage unit 140 in advance.

In some embodiments, please refer to FIG. 3, in which the electronic device 11 may further include another connecting interface 132. To make the description clear, in the following, the connecting interface 132 is referred to as the third connecting interface 132.

The third connecting interface 132 is connected electrically to the processing unit 110. The image capturing module 13 outside the electronic device 11 is connected to the third connecting interface 132 in a wireless manner, in a directly connected manner, or through a connecting cable, so that the image capturing module 13 is connected electrically to the processing unit 110 through the third connecting interface 132, as shown in FIG. 3. In this case, the appearance image Pf captured by the image capturing module 13 may be transmitted to the processing unit 110 through the third connecting interface 132. Herein, the image capturing module 13 may be a Charge Coupled Device (CCD) element, a Complementary Metal Oxide Semiconductor (CMOS) element, or other equivalent element. Preferably, the image capturing module 13 is an image pickup device capable of color photographing.

In some embodiments, the user interface 120 may be a touch screen, a combination of a touch screen and at least one physical button, a combination of a screen and an input assembly (for example, a keyboard, a mouse, a handwriting pad, or a combination thereof), or an equivalent device.

In the automatic coloring machine 12, please refer to FIG. 4, in which the control unit 210 is connected electrically to the first connecting interface 230, the material supply module 240, and the moving module 250. The coloring tools 260 and 262 are disposed on the moving module 250. The material supply module 240 has at least one pigment.

An example of makeup of a human face is taken in the following to exemplarily illustrate the structure of the automatic coloring machine 12 in detail. In other words, in this example, the three-dimensional object 14 is the face of a user.

Please refer to FIG. 5, in which the automatic coloring machine 12 may further include a table 202 and a face positioning module 220. The control unit 210, the face positioning module 220, and the moving module 250 are disposed on the table 202.

The face positioning module 220 is disposed so as to correspond to the moving module 250. The face positioning module 220 is provided so that the head of the user can rest on it, thereby fixing the position of the face.

The face positioning module 220 includes a lower-jaw support 221 and an overhead positioning member 222. The lower-jaw support 221 is used by the user to place the lower jaw thereof, so as to support the head (face) of the user. The overhead positioning member 222 is disposed above the lower-jaw support 221. Herein, the overhead positioning member 222 is slightly inverted U-shaped, and an arc-shaped holding portion 223 is formed in an upper middle position corresponding to the forehead. During use, the user may urge the forehead thereof against the holding portion 223 of the overhead positioning member 222, and urge the chin against the lower-jaw support 221, so as to ensure that the face of the user is opposite to the position of the moving module 250.

The moving module 250 includes a moving block 251, a lifter 252, a horizontal rail 253, and a telescopic platform 254. The horizontal rail 253 spans above the lifter 252, and by adjusting the lifter 252, the horizontal rail 253 can move vertically along a first direction (for example, the Y-axis direction). The telescopic platform 254 is slidably disposed on the horizontal rail 253, and can move left and right on the horizontal rail 253 along a second direction (for example, the X-axis direction in the drawing). The moving block 251 is disposed on the telescopic platform 254, and can move back and forth on the telescopic platform 254 along a third direction (for example, the Z-axis direction in the drawing). Further, a motor controlled by the control unit 210 drives the moving block 251, the lifter 252, and the telescopic platform 254, so that the moving block 251 can move in three dimensions and be precisely positioned.

In this embodiment, the output and makeup operations of the material supply module 240 are controlled through the control unit 210. The material supply module 240 is disposed on the moving block 251 of the moving module 250. The material supply module 240 stores various coloring materials. An output port of the material supply module 240 is appropriately connected to each of the coloring tools 260 and 262, and supplies a corresponding pigment to the coloring tools 260 and 262. Each coloring tool 260 and 262 may be a spray head, a nozzle, or a coating pen.

When the coloring tool 260 is a nozzle, the material supply module 240 may have a supply cup and an air pressure pipe. The supply cup stores a pigment. The air pressure pipe is connected to an air compressor, provides air flowing to the output port, and can draw the pigment from the supply cup and spray the pigment out through the output port.

When the coloring tool 262 is a coating pen, the material supply module 240 may be designed to have a rotary wheel in which various output ports are disposed, so as to output the pigments to the outside. The output ports are disposed on the circumference of the rotary wheel, and rotating the rotary wheel selects among different pigments.

The diversified material supply module 240 facilitates automatic coating using different coloring tools 260 and 262 or pigments.

A control module 204 may be disposed on the table 202. The control module 204 has the control unit 210 and the first connecting interface 230.

The first connecting interface 230 receives a coloring procedure from the electronic device 11 in a wireless manner or in a wired manner, and transmits the received coloring procedure to the control unit 210 to sequentially execute each coloring instruction in the coloring procedure. In other words, the coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof. Each coloring instruction may include track information represented by two-dimensional coordinates or track information represented by three-dimensional coordinates.

In some embodiments, when each coloring instruction includes track information represented by three-dimensional coordinates, the control unit 210 controls, based on track information in a currently executed coloring instruction, movement of the moving module 250, to make the moving block 251 move to be positioned.

In some embodiments, when each coloring instruction includes track information represented by two-dimensional coordinates, the automatic coloring machine 12 may further include a range finding device 270. The range finding device 270 is mounted on the moving block 251 of the moving module 250. The range finding device 270 can measure a position in the third direction to provide a position signal and a calibration signal, so as to convert a two-dimensional image into a three-dimensional image for operation, thereby ensuring that the coloring tools 260 and 262 contact the face of the user safely or keep a safe distance from the face of the user.

The control unit 210 controls movement of the moving module 250 based on track information in a currently executed coloring instruction, so as to make the moving block 251 drive the coloring tool to apply a selected pigment to the face of the user. Further, according to the type of the selected coloring tool and the position signal obtained by the range finding device 270, the control unit 210 controls a distance of movement of the moving module 250 relative to the face, so that the moving block 251 moves the coloring tool to the position for contacting the face of the user safely or a position for keeping a safe distance from the face of the user.

In some embodiments, the range finding device 270 may be a laser range finder, a tellurometer, an infrared range finder, an image capturing module, or other equivalent range finding devices.

In some embodiments, the three-dimensional object 14 may be an eye of the user.

Please refer to FIG. 6, in which for the automatic coloring machine 12 dedicated to the eye, the aforementioned face positioning module 220 may be an eye mask to enable the eye of the user to correspond to the moving module 250 of the automatic coloring machine 12.

In some embodiments, the second connecting interface 130 may be a wireless transceiver module, a Universal Serial Bus (USB), or an External Serial Advanced Technology Attachment (e-SATA) connector. The third connecting interface may be a wireless transceiver module, a USB, or an e-SATA connector.

The wireless transceiver module may adopt various wireless communications technologies in the prior art, such as the Bluetooth technology, the Wireless Fidelity (WiFi) technology, and the Near Field Communication (NFC) technology.

In some embodiments, the pigment may be powdery, foamy, gelatinous, or in a liquid state, in any one of these forms or a combination thereof, for example, shining pieces, mist, or another special state. The pigment is, for example, a makeup base material, a concealing material, an eyebrow color material, a cheek color material, a labial makeup material, a decorative color makeup material, a basic care material, various colors of inks, or various colors of dyeing materials, which may be mixed arbitrarily.

The operation of the automatic coloring system 10 is illustrated below in detail for demonstration. Please refer to FIG. 1 to FIG. 8, in which the storage unit 140 stores a coloring application.

The processing unit 110 executes the coloring application, so as to display a coloring editing window 121 on the user interface 120 (Step S21). The coloring editing window 121 includes an image preview box 122 and a design function bar 124. The design function bar 124 has an edit option 125, a return option 126, a clear option 127, a complete option 128, and a file option 129. The edit option 125 has a tool option 1251 and a color palette option 1252, as shown in FIG. 9.

In some embodiments, the tool option 1251 and the color palette option 1252 may be located on the same level of menu, as shown in FIG. 9 and FIG. 10A. In some embodiments, the tool option 1251 and the color palette option 1252 may be located on different levels of menu, as shown in FIG. 10B. For example, please refer to FIG. 10B, in which the tool option 1251 has multiple tool pictures A1 and A2, and each tool picture A1 and A2 is connected to a color palette option 1252. When a tool picture A1 is selected, the coloring application provides the color palette option 1252 connected to the tool picture A1, for selection by the user. The color palette option 1252 has multiple color pictures C1 and C2 to be selected by the user.

The processing unit 110 may receive an appearance image Pf of a three-dimensional object 14 from the image capturing module 13, read a stored appearance image Pf from the storage unit 140, or receive an appearance image Pf from an external electronic device or storage device (Step S23). In some embodiments, the appearance image Pf may be a plane simulated image, that is, the appearance image contains a 2D image of the three-dimensional object 14. In some embodiments, the appearance image Pf may be a three-dimensional simulated image, that is, the appearance image contains a 3D model of the three-dimensional object 14.

Then, the processing unit 110 performs feature analysis on the received appearance image Pf, so as to generate an outline image Pp (Step S25). In some embodiments, the processing unit 110 may directly read a stored outline image Pp from the storage unit 140, or receive an outline image Pp from an external electronic device or storage device. For example, the user may use the file option 129 to select an outline image Pp to be displayed in the image preview box 122.
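The particular feature-analysis algorithm is not fixed by the disclosure. Purely as an illustration, a minimal sketch of Step S25 that derives an outline image by edge detection (using OpenCV's Canny operator, which is an assumption rather than the disclosed method) might look like this:

    # Illustrative sketch of Step S25 (not the disclosed algorithm): derive an
    # outline image Pp from an appearance image Pf by edge detection.
    import cv2

    def generate_outline_image(appearance_image_path, out_path="outline.png"):
        pf = cv2.imread(appearance_image_path)       # appearance image Pf
        gray = cv2.cvtColor(pf, cv2.COLOR_BGR2GRAY)  # single-channel copy
        edges = cv2.Canny(gray, 50, 150)             # edges of facial features
        cv2.imwrite(out_path, edges)                 # outline image Pp
        return edges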

The processing unit 110 then displays the outline image Pp in the image preview box 122 on the user interface 120 (Step S27).

At the moment, the user may use the edit option 125 to perform coloring design of the outline image Pp.

In the process of coloring design, the user may use the tool option 1251 to select a coloring tool to be used (that is, click a tool picture A1/A2 in the tool option), use the color palette option 1252 to select a color to be used (that is, click a color picture C1/C2 in the color palette option), and use the selected coloring tool and color to perform a coloring action on the outline image Pp in the image preview box 122 (that is, move the mouse to perform simulated coloring on the outline image Pp), so as to apply the selected color to the outline image Pp.

For each coloring action performed by the user, the user interface 120 outputs an edit instruction in response to the coloring action of the user (Step S29), so that the coloring application (that is, the processing unit 110) generates a coloring instruction in response to the edit instruction. The coloring instruction includes tool information indicating the coloring tool selected by the user, color information indicating the color selected by the user, and track information indicating a movement track of the coloring action.

In some embodiments, the track information is formed of multiple consecutive positioning points. Herein, the start of the coloring action corresponds to a first positioning point, the end of the coloring action corresponds to a last positioning point, and a movement process of the coloring action corresponds to a second positioning point to a penultimate positioning point sequentially. Each positioning point may be coordinate data.
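For illustration, a coloring instruction carrying the three kinds of information described above could be modeled as follows; the field names are assumptions, not a format defined by the disclosure.

    # Sketch of one coloring instruction: tool information, color information,
    # and track information as an ordered list of positioning points (2D or 3D).
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class ColoringInstruction:
        tool: str                                    # e.g. "nozzle" or "coating pen"
        color: str                                   # e.g. "red"
        track: List[Tuple[float, ...]] = field(default_factory=list)

    # The coloring procedure is simply the instructions kept in generation order.
    coloring_procedure: List[ColoringInstruction] = []
    coloring_procedure.append(
        ColoringInstruction(tool="nozzle", color="red",
                            track=[(10.0, 20.0), (12.0, 21.5), (15.0, 23.0)]))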

After the user performs multiple coloring actions, the user may click the complete option 128 to make the user interface 120 output a confirm instruction. At the moment, the coloring application (that is, the processing unit 110), in response to the confirm instruction, sequences the multiple coloring instructions corresponding to the multiple coloring actions according to a generation order, so as to generate a coloring procedure (Step S31), and output the generated coloring procedure to the outside or store the generated coloring procedure in the storage unit 140 (Step S33). In other words, the coloring procedure has multiple coloring instructions, and the coloring instructions are sequenced according to each individual generation order thereof (that is, an order in which the user performs the multiple coloring actions).

Further, the coloring application (that is, the processing unit 110), may obtain a colored pattern Pc through the outline image Pp in the image preview box 122 in response to the confirm instruction. In some embodiments, the processing unit 110 may store the colored pattern Pc and the coloring procedure corresponding to the colored pattern Pc in the storage unit 140, so as to form a pattern database. In other words, the coloring application may have a pattern database. The pattern database is stored in the storage unit 140. The pattern database has one or more colored patterns Pc that are edited and stored in advance, and each colored pattern Pc has a corresponding coloring procedure Sp. Therefore, during next time of use, the user may directly use the file option 129 to select a colored pattern Pc to be used from the pattern database, and display the colored pattern Pc in the image preview box 122, which is for confirmation by the user.
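As a sketch only (the storage format is not specified by the disclosure), the pattern database could pair each colored pattern Pc with its coloring procedure Sp, for example in a simple JSON file:

    # Sketch of storing a pattern database entry: a colored pattern Pc together
    # with its coloring procedure Sp, so the pair can be reused later.
    import json

    def save_pattern(db_path, pattern_name, colored_pattern_file, coloring_procedure):
        try:
            with open(db_path) as f:
                db = json.load(f)
        except FileNotFoundError:
            db = {}
        db[pattern_name] = {
            "colored_pattern": colored_pattern_file,   # e.g. path to the Pc image
            "coloring_procedure": coloring_procedure,  # list of coloring instructions
        }
        with open(db_path, "w") as f:
            json.dump(db, f, indent=2)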

Herein, the example in which the user edits and designs the colored pattern Pc is provided, but the present invention is not limited thereto. That is to say, please refer to FIG. 11, in which after the user captures the appearance image Pf of the three-dimensional object 14 (such as the face, an eye, or another object) to be colored, the appearance image Pf or the outline image Pp may be transmitted to another electronic device 11′ through the second connecting interface 130 in a wireless manner, in a wired manner, or in another far-end transmission manner. Herein, the second connecting interface 130 may be a telecommunication module, so as to transmit the appearance image Pf as a multimedia messaging service (MMS) message.

A designer may perform coloring design on the outline image Pp through the electronic device 11′, that is, Step S25 to Step S31 or Step S27 to Step S31. Herein, after the design is completed, the colored pattern Pc and the corresponding coloring procedure Sp are transmitted back to the electronic device 11 of the user through the second connecting interface 130 in a wireless manner or in a wired manner (Step S33).

In some embodiments, please refer to FIG. 12, in which the edit option 125 may further have a template option 1253. The template option 1253 has multiple template patterns E1 and E2. Herein, each of the template patterns E1 and E2 is a colored pattern Pc that is edited and stored in the pattern database in advance. That is to say, each of the template patterns E1 and E2 has a respective corresponding coloring procedure Sp, and the coloring procedure Sp is already stored in the pattern database correspondingly in advance.

In other words, each edited colored pattern Pc and the corresponding coloring procedure thereof may be optionally stored as a template, so as to become an option in the template option 1253. When the edited colored pattern Pc and the corresponding coloring procedure thereof are stored as a template, the colored pattern Pc may act as a template pattern. In some embodiments, each of the template patterns E1 and E2 is a colored pattern Pc, but is not limited to being shown on the outline image Pp of the three-dimensional object 14. In other words, each of the template patterns E1 and E2 presents a result of coloring design.

When the user clicks a template pattern E1 in the template option 1253, the result of coloring design represented by the template pattern E1 is applied to the outline image Pp of the three-dimensional object 14 (that is, the image displayed in the image preview box 122), so as to obtain a final colored pattern Pc. At the moment, the user interface 120 may output an edit instruction corresponding to the template pattern E1 in response to a select operation of the user, so that the coloring application reads the coloring procedure Sp corresponding to the template pattern E1 from the pattern database in response to the edit instruction (Step S31), and outputs the read coloring procedure Sp when the user clicks the complete option 128 (Step S33).

In some embodiments, the coloring procedure is generated in a script form. The coloring procedure in the script form is, for example, as follows:

 <coloring interface = skin>                 (that is, the type of the three-dimensional object 14 to be colored)
 <coloring pigment = No. 1 spray material>   (that is, the type of the coloring pigment)
 <color spray fineness = A>                  (that is, the type of the coloring tool)
 <color spray color = red>                   (that is, the color of the coloring pigment)
 <system positioning points X, Y scale X>
 <draw point X,Y>
 <draw line X0,Y0 X1,Y1>
 <draw plane X0,Y0 X1,Y1>
 <draw picture picture name>
 <draw character character name>
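As an illustration only, such a script could be parsed into coloring instructions roughly as follows; the tag names mirror the example above, and the parsing rules are an assumption rather than a grammar defined by the disclosure.

    # Sketch of parsing a script-form coloring procedure like the example above.
    # "<key = value>" lines set parameters; "<draw ...>" lines become coloring
    # actions carrying the parameters currently in effect; other lines are skipped.
    def parse_coloring_script(text):
        settings, actions = {}, []
        for line in text.strip().splitlines():
            body = line.strip().lstrip("<").rstrip(">")
            if "=" in body:
                key, value = (part.strip() for part in body.split("=", 1))
                settings[key] = value
            elif body.startswith("draw"):
                actions.append({"command": body, **settings})
        return actions

    script = """
    <coloring interface = skin>
    <coloring pigment = No. 1 spray material>
    <color spray color = red>
    <draw line X0,Y0 X1,Y1>
    """
    print(parse_coloring_script(script))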

In some embodiments, the outline image Pp displayed in the image preview box 122 may be a plane simulated image. In other words, the outline image Pp contains a 2D image of the three-dimensional object 14.

In some embodiments, the outline image Pp displayed in the image preview box 122 may be a three-dimensional simulated image. In other words, the outline image Pp contains a 3D model of the three-dimensional object 14. Implementation of the three-dimensional simulated image is well known by persons skilled in the art, and is not repeated herein.

Therefore, when the user performs a coloring action on the three-dimensional simulated image, each coloring action correspondingly generates track information represented by three-dimensional coordinates. In other words, each positioning point in the track information is three-dimensional coordinate data.

Please refer to FIG. 13, in which when the user actually performs makeup on the face, the user may connect the electronic device 11 to the automatic coloring machine 12, that is, electrically connect the first connecting interface 230 of the automatic coloring machine 12 to the second connecting interface 130 of the electronic device 11 (Step S41).

After the connection, the user may operate the electronic device 11, so that the processing unit 110 outputs a coloring procedure to the automatic coloring machine 12. In other words, the automatic coloring machine 12 receives through the first connecting interface 230 the coloring procedure transmitted in a wireless manner or in a wired manner (for example, through the connecting cable 15 or a physical connector) (Step S43).

Then, the automatic coloring machine 12 performs makeup on the face of the user based on the coloring procedure. Herein, the control unit 210 of the automatic coloring machine 12 sequentially executes each coloring instruction in the coloring procedure.

The control unit 210 controls, according to the executed coloring instruction, the material supply module 240 to select a pigment corresponding to color information in the coloring instruction (Step S45), and controls, according to the executed coloring instruction, the moving module 250 to select a coloring tool corresponding to tool information in the coloring instruction (Step S47). The execution order of Step S45 and Step S47 is not limited by the present invention. That is to say, besides sequential execution of Step S45 and Step S47, Step S45 and Step S47 may be executed at the same time, or Step S47 is executed before Step S45.

In Step S45, the color indicated by the color information may be one of the multiple pigments included in the material supply module 240. Alternatively, the color indicated by the color information may not be among the multiple pigments included in the material supply module 240. In this case, the material supply module 240 may select, according to the color indicated by the color information, two or more pigments from the multiple pigments to obtain the needed pigment (that is, the color indicated by the color information) by mixing. In other words, the automatic coloring machine 12 may have a storage unit, and a color database is established in the storage unit. The color database has multiple colors and corresponding mixing methods (for example, the pigments and proportions thereof required for mixing).
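A minimal sketch of such a color database follows; the pigment names and proportions are assumptions for illustration and do not come from the disclosure.

    # Sketch of a color database mapping a requested color to stock pigments
    # and the proportions in which they are mixed.
    COLOR_DATABASE = {
        "red":    {"red pigment": 1.0},
        "orange": {"red pigment": 0.5, "yellow pigment": 0.5},
        "pink":   {"red pigment": 0.3, "white pigment": 0.7},
    }

    def select_pigments(color_name):
        """Return the pigments and proportions needed to obtain the requested color."""
        mix = COLOR_DATABASE.get(color_name)
        if mix is None:
            raise ValueError("color '%s' has no stored mixing method" % color_name)
        return mix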

Then, the control unit 210 moves the moving module 250 according to track information in the executed coloring instruction, and applies the selected pigment to the face of the user (the three-dimensional object 14), through the selected coloring tool (Step S49).

In some embodiments, when the track information is represented by two-dimensional coordinates, the control unit 210 moves the moving module 250 in the first direction and in the second direction according to each positioning point in the track information, so as to move the moving module 250 to a corresponding designated position. Further, during movement to each positioning point or upon arriving at the designated position, the control unit 210 receives a position signal from the range finding device 270 to control movement of the moving module 250 relative to the face (that is, the movement in the third direction), so that the coloring tool is positioned in a position capable of applying the pigment to the face of the user safely. Herein, the first direction, the second direction, and the third direction are the Y-axis, the X-axis, and the Z-axis of a movement coordinate system of the moving module 250 respectively.

After a coloring instruction is executed (that is, after a coloring action is completed), the control unit 210 continues to execute a next coloring instruction, until all coloring instructions are executed (Step S51).
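Putting Steps S45 to S51 together, the control flow on the automatic coloring machine could be sketched as below. The material_supply, moving_module, and range_finder objects and their method names are assumptions made for illustration; the disclosure does not prescribe a programming interface.

    # Illustrative control loop for Steps S45-S51: for each coloring instruction,
    # select the pigment, select the coloring tool, then follow the track.
    def execute_coloring_procedure(procedure, material_supply, moving_module, range_finder):
        for instruction in procedure:                       # sequential, in generation order
            pigments = material_supply.select(instruction.color)   # Step S45
            tool = moving_module.select_tool(instruction.tool)     # Step S47
            for point in instruction.track:                         # Step S49
                if len(point) == 2:                         # 2D track: depth measured separately
                    x, y = point
                    z = range_finder.measure_depth(x, y)    # keep a safe working distance
                else:                                       # 3D track: depth already given
                    x, y, z = point
                moving_module.move_to(x, y, z)
                tool.apply(pigments)
        # Step S51: the loop ends once every coloring instruction has been executed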

In some embodiments, the coloring application has a coordinate system conversion step, so that an image coordinate system of the outline image Pp corresponds to the movement coordinate system of the moving module 250.

In some embodiments, the coordinate system conversion step may use features or edges obtained in the feature analysis step (Step S25) as corresponding points, so that the image coordinate system of the outline image Pp corresponds to the movement coordinate system of the moving module 250. That is to say, the outline image Pp is mapped to the actual position of the three-dimensional object 14 in the movement coordinate system of the moving module 250.

In some embodiments, the coordinate system conversion step may be implemented by using a scaling object with known actual size.

When the user uses the image capturing module 13 to capture the appearance image Pf of the three-dimensional object 14, the scaling object is placed within the capture range at the same time. In other words, the image capturing module 13 is used to capture the appearance image Pf including both the image of the three-dimensional object 14 and the image of the scaling object. According to the known actual size and the image size of the scaling object in the appearance image Pf, the scale between the image coordinate system of the outline image Pp and the movement coordinate system of the moving module 250 is calculated. Then, by using the features or edges obtained in the feature analysis step (Step S25) as the corresponding points and by using the calculated scale, the outline image Pp is mapped to the actual position of the three-dimensional object 14 in the movement coordinate system of the moving module 250.
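For example, if a scaling object with a known width of 100 mm occupies 400 pixels in the appearance image Pf, the scale is 0.25 mm per pixel. A minimal sketch of this calibration, with the units and the origin offset assumed for illustration:

    # Sketch of scale calibration with a scaling object of known physical size.
    def pixels_to_millimeters_scale(object_actual_size_mm, object_image_size_px):
        """Scale factor converting outline-image pixels into machine millimeters."""
        return object_actual_size_mm / object_image_size_px

    def map_point_to_machine(px, py, scale, origin_mm=(0.0, 0.0)):
        """Map an outline-image positioning point into the moving module's coordinates."""
        ox, oy = origin_mm       # machine coordinates of the image origin (from feature matching)
        return ox + px * scale, oy + py * scale

    scale = pixels_to_millimeters_scale(100.0, 400.0)    # 0.25 mm per pixel
    print(map_point_to_machine(120.0, 80.0, scale))      # -> (30.0, 20.0)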

In some embodiments, the coordinate system conversion step may be implemented by using camera parameters (such as a focal length of the lens and an image format) of the image capturing module 13 and specifications of the screen in the user interface 120. The coloring application may calculate a scale between the image size of the outline image Pp and the actual size of the three-dimensional object 14 according to the camera parameters of the image capturing module 13 and the specifications of the screen in the user interface 120. Then, by using the features or edges obtained in the feature analysis step (Step S25) as the corresponding points and by using the calculated scale, the outline image Pp is mapped to the actual position of the three-dimensional object 14 in the movement coordinate system of the moving module 250.

After the coordinate system conversion step is completed, the coloring action applied to the outline image Pp on the user interface 120 can enable the coloring application to generate corresponding track information based on the movement coordinate system of the moving module 250.

In some embodiments, the coloring application may be implemented as a computer program product, so that after a computer (that is, the electronic device) is loaded with and executes the coloring application, the automatic coloring method according to any embodiment of the present invention may be performed. In some embodiments, the computer program product may be a readable recording medium, and the coloring application is stored in the readable recording medium to be loaded into a computer. In some embodiments, the coloring application may be a computer program product transmitted to the computer in a wired manner or in a wireless manner.

In view of the above, the automatic coloring system and method according to the present invention have the coloring design process (executed by the electronic device), and the actual coloring process (executed by the automatic coloring machine), that are separable, so that the user can design and exchange a colored pattern anytime anywhere. Further, the automatic coloring system and method according to the present invention enable an external device to provide a coloring procedure to be directly executed by the automatic coloring machine, thereby facilitating simplification of the structure of the automatic coloring machine. For example, the control unit of the automatic coloring machine does not need to have a powerful processing function, and may be implemented by, for example, a microcontroller, or the automatic coloring machine does not need to be provided with the image capturing module. In some embodiments, by directly providing track information represented by three-dimensional coordinates, the automatic coloring machine can execute the actual coloring process more precisely. In some embodiments, by directly presenting a three-dimensional simulated image, the coloring action in the coloring design process is closer to that in the actual coloring process.

While the present invention has been described by the way of example and in terms of the preferred embodiments, it is to be understood that the invention need not be limited to the disclosed embodiments. On the contrary, it is intended to cover various modifications and similar arrangements included within the spirit and scope of the appended claims, the scope of which should be accorded the broadest interpretation so as to encompass all such modifications and similar structures.

Claims

1. An automatic coloring system, used for coloring a three-dimensional object, the automatic coloring system comprising:

an automatic coloring machine, comprising: a first connecting interface, used for receiving a coloring procedure in a wireless manner or in a wired manner, wherein the coloring procedure has a plurality of coloring instructions, and the coloring instructions are sequenced according to each individual generation order thereof; a material supply module, having at least one pigment; a moving module; at least one coloring tool, disposed on the moving module; and a control unit, connected electrically to the first connecting interface, the material supply module, and the moving module, so as to sequentially execute the coloring instructions in the coloring procedure, and according to the executed coloring instruction control the material supply module to select at least one of the at least one pigment and control the moving module to move one of the at least one coloring tool to apply the selected at least one pigment to the three-dimensional object.

2. The automatic coloring system according to claim 1, wherein each of the coloring instructions comprises track information represented by two-dimensional coordinates, and the control unit controls, according to the track information, the moving module to move.

3. The automatic coloring system according to claim 1, wherein each of the coloring instructions comprises track information represented by three-dimensional coordinates, and the control unit controls, according to the track information, the moving module to move.

4. The automatic coloring system according to claim 1, further comprising:

an electronic device, comprising: a processing unit, used for receiving an appearance image of the three-dimensional object, and generating an outline image through feature analysis of the appearance image; a user interface, connected electrically to the processing unit, so as to display the outline image and sequentially output at least one edit instruction corresponding to the outline image, so that the processing unit obtains the coloring procedure in response to the at least one edit instruction; and a second connecting interface, connected electrically to the processing unit, so as to output the coloring procedure to the first connecting interface in a wireless manner or in a wired manner.

5. The automatic coloring system according to claim 4, further comprising:

an image capturing module, used for capturing the appearance image of the three-dimensional object;
wherein the electronic device further comprises:
a third connecting interface, connected electrically to the second connecting interface, wherein the image capturing module is connected to the second connecting interface in a wireless manner or in a wired manner, so that the processing unit receives the appearance image from the image capturing module through the second connecting interface and the third connecting interface.

6. The automatic coloring system according to claim 4, wherein the automatic coloring machine further comprises:

an image capturing module, connected electrically to the first connecting interface, so as to capture the appearance image of the three-dimensional object, and transmit the appearance image to the processing unit through the first connecting interface and the second connecting interface.

7. The automatic coloring system according to claim 4, wherein the electronic device further comprises:

an image capturing module, connected electrically to the processing unit, so as to capture the appearance image of the three-dimensional object.

8. The automatic coloring system according to claim 4, wherein the processing unit further executes coordinate system conversion by using camera parameters of an image capturing module and display specifications of the outline image in the user interface, so as to correspondingly convert coordinates of the outline image into coordinates for moving the moving module, and therefore obtain track information in each of the coloring instructions.

9. The automatic coloring system according to claim 4, wherein the outline image is a three-dimensional simulated image.

10. The automatic coloring system according to claim 4, wherein the number of the at least one edit instruction is multiple, and the edit instructions correspond to the coloring instructions respectively.

11. An automatic coloring method, comprising:

receiving an appearance image of a three-dimensional object;
generating an outline image through feature analysis on the appearance image;
displaying the outline image on a user interface;
using the user interface to sequentially output at least one edit instruction corresponding to the outline image;
in response to the at least one edit instruction, obtaining a coloring procedure, wherein the coloring procedure has a plurality of coloring instructions, and the coloring instructions are sequenced according to each individual generation order thereof; and
outputting the obtained coloring procedure in a wireless manner or in a wired manner.

12. The automatic coloring method according to claim 11, wherein each of the coloring instructions comprises track information represented by two-dimensional coordinates.

13. The automatic coloring method according to claim 11, wherein each of the coloring instructions comprises track information represented by three-dimensional coordinates.

14. The automatic coloring method according to claim 11, further comprising:

executing coordinate system conversion by using camera parameters used when capturing the outline image and display specifications of the outline image in the user interface, so as to correspondingly convert coordinates of the outline image into coordinates for moving a moving module of an automatic coloring machine, and therefore obtain track information in each of the coloring instructions.

15. The automatic coloring method according to claim 11, further comprising:

an automatic coloring machine receiving the coloring procedure and sequentially executing the coloring instructions in the coloring procedure, which comprises:
according to the executed coloring instruction, controlling a material supply module of the automatic coloring machine to select at least one pigment; and
according to the executed coloring instruction, controlling a moving module of the automatic coloring machine to move a coloring tool, so as to apply the selected pigment to the three-dimensional object.

16. The automatic coloring method according to claim 15, wherein each of the coloring instructions comprises track information represented by two-dimensional coordinates, and a control step of the moving module comprises: controlling, according to the track information, the moving module to move.

17. The automatic coloring method according to claim 15, wherein each of the coloring instructions comprises track information represented by three-dimensional coordinates, and a control step of the moving module comprises: controlling, according to the track information, the moving module to move.

18. The automatic coloring method according to claim 11, further comprising:

capturing the appearance image of the three-dimensional object.

19. The automatic coloring method according to claim 11, wherein the outline image is a three-dimensional simulated image.

20. The automatic coloring method according to claim 11, wherein the number of the at least one edit instruction is multiple, and the edit instructions correspond to the coloring instructions respectively.

21. A computer program product, capable of implementing the automatic coloring method according to claim 11 after a computer is loaded with and executes the program.

Patent History
Publication number: 20140161507
Type: Application
Filed: Mar 14, 2013
Publication Date: Jun 12, 2014
Applicant: ZONG JING INVESTMENT, INC. (Taipei)
Inventor: Charlene Hsueh-Ling WONG (Taipei)
Application Number: 13/829,526
Classifications
Current U.S. Class: Combined (401/195)
International Classification: A45D 40/00 (20060101);