Image processing method and apparatus for processing an image by using a face detection result
An image processing apparatus determines, when white balance correction is executed on an image signal of a face area detected by face detection unit, based on a first white balance correction value obtained by detecting white pixels, whether the corrected image signal of the face area is present in a second color signal area around a first color signal area indicating a flesh color. When the corrected image signal of the face area is determined to be present in the second color signal area, the image processing apparatus calculates a second white balance correction value for correcting the image signal based on a relationship between the corrected image signal of the face area and the first color signal area.
This application is a broadening reissue application of prior U.S. application Ser. No. 12/207,335 filed Sep. 9, 2008, now U.S. Pat. No. 8,160,310 issued Apr. 17, 2012, which claims priority from Japanese Patent Application No. 2007-235948 filed Sep. 11, 2007, which are hereby incorporated by reference herein in their entirety.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an image processing apparatus and a method, and an imaging apparatus, and more particularly to an image processing apparatus and a method for processing an image by using a face detection result, and an imaging apparatus.
2. Description of the Related Art
An operation of a conventional white balance gain calculation circuit used for a digital camera will be described. First, the image signal output from an image sensor is divided into an arbitrary number m of blocks, the pixel values of each block are averaged for each color to obtain color average values (R[i], G[i], B[i]), and color evaluation values (Cx[i], Cy[i]) are calculated by using the following equation (1):
Cx[i] = (R[i] − B[i]) / Y[i] × 1024
Cy[i] = {(R[i] + B[i]) − 2G[i]} / Y[i] × 1024 (1)
where Y[i] = R[i] + 2G[i] + B[i], and [i] is the index number of each block.
A white object is captured under various light sources beforehand, and color evaluation values are calculated to define a white detection range 301. If the color evaluation values (Cx[i], Cy[i]) calculated for a block fall within the white detection range 301, the block is determined to be white, and the pixel values of the blocks determined to be white are integrated.
Then, based on the integrated pixel values (sumR, sumG, sumB), white balance coefficients (WBCo_R, WBCo_G, WBCo_B) are calculated by using the following equation (2):
WBCo_R=sumY×1024/sumR
WBCo_G=sumY×1024/sumG (2)
WBCo_B=sumY×1024/sumB
where sumY=(sumR+2×sumG+sumB)/4.
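The following is a minimal sketch of this conventional calculation, assuming the image is an H×W×3 NumPy array of linear RGB values. The function name, the block count, and the in_white_range predicate are illustrative placeholders; the actual white detection range 301 is defined from white objects measured under various light sources, and the sketch assumes at least one block is judged white.

```python
import numpy as np

def conventional_wb_coefficients(image, m_blocks=64, in_white_range=None):
    """Sketch of equations (1) and (2): per-block color evaluation values,
    white detection, integration, and white balance coefficients."""
    if in_white_range is None:
        # Hypothetical placeholder range; not the patent's detection range 301.
        in_white_range = lambda cx, cy: abs(cx) < 300 and abs(cy) < 300

    h, w, _ = image.shape
    side = int(np.sqrt(m_blocks))
    sum_r = sum_g = sum_b = 0.0
    for by in range(side):
        for bx in range(side):
            block = image[by * h // side:(by + 1) * h // side,
                          bx * w // side:(bx + 1) * w // side]
            r, g, b = block.reshape(-1, 3).mean(axis=0)   # color average values
            y = max(r + 2 * g + b, 1e-6)
            cx = (r - b) / y * 1024                        # equation (1)
            cy = ((r + b) - 2 * g) / y * 1024
            if in_white_range(cx, cy):                     # block judged white
                sum_r, sum_g, sum_b = sum_r + r, sum_g + g, sum_b + b
    sum_y = (sum_r + 2 * sum_g + sum_b) / 4
    return (sum_y * 1024 / sum_r,                          # equation (2)
            sum_y * 1024 / sum_g,
            sum_y * 1024 / sum_b)
```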
However, the conventional white balance coefficient calculation method has the following problems. Under a light source such as sunlight, white color evaluation values are distributed around an area A along the blackbody locus. When a close-up of a face is captured under such a light source, however, the color evaluation values of the flesh color are distributed close to those of a white object captured under a light source of low color temperature, so the flesh color may be erroneously determined to be white and corrected to appear whitish.
The above-described problems have conventionally been dealt with by determining the light source to be external light when the object illuminance is high and narrowing the white detection range, so that the flesh color is not erroneously determined to be white.
Generally, fluorescent lamps encompass a variety of light sources, including one under which whites are distributed around an area C and another under which whites are distributed below the area C. The white detection range therefore has to be expanded to cover such light sources. However, under high color temperature fluorescent lamps of low illuminance, or under middle color temperature fluorescent lamps, flesh colors may be distributed around the portion (area C) below the blackbody locus. Expanding the white detection range thus causes the flesh color to be erroneously determined to be white, resulting in a whitened flesh color.
U.S. Pat. No. 6,975,759 discusses a method of detecting a face, extracting a flesh color of the face when the face is detected, comparing the extracted flesh color with a reference flesh color, and correcting white balance based on the comparison result.
However, this conventional white balance correction value calculation method gives no consideration to the case where the face detection circuit erroneously detects a face. Consequently, even when the face detection circuit erroneously detects a false face area, the false face area may be corrected toward an appropriate flesh color, so a desired white balance correction value cannot be acquired.
Depending on the processing speed of an imaging apparatus, face detection may be unavailable during shooting. In such an imaging apparatus, even if a face is detected at the shooting preparation stage, the position of the object may shift during shooting. Such a case has not been taken into consideration. In other words, even if a face is correctly detected during the shooting preparation period, white balance correction accuracy may ultimately be lowered.
The present invention is directed to an image processing apparatus and a method capable of performing more stable white balance correction based on a face detection result, and an imaging apparatus.
According to an aspect of the present invention, an image processing apparatus for processing an image signal of an image obtained by capturing includes a first calculation unit configured to calculate a first white balance correction value by detecting white pixels from the image, a face detection unit configured to detect a face area from the image, a determination unit configured to determine, when white balance correction based on the first white balance correction value is executed on an image signal of the face area detected by the face detection unit, whether the corrected image signal of the face area is present in a second color signal area around a first color signal area indicating a flesh color, and a second calculation unit configured to calculate, when the determination unit determines that the corrected image signal of the face area is present in the second color signal area, a second white balance correction value for correcting the image signal based on a relationship between the corrected image signal of the face area and the first color signal area.
Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
Various exemplary embodiments, features, and aspects of the invention will be described in detail below with reference to the drawings.
First Exemplary Embodiment
In the imaging apparatus according to the first exemplary embodiment, an image signal obtained by capturing is stored in a memory 102.
A face detection unit 114 detects a face area based on the image signal stored in the memory 102. Various methods have been proposed for detecting face areas, and any method can be used as long as information on the position and size of a face can be obtained; these methods are in no way limitative of the present invention. For example, a method using learning represented by a neural network, and a method of extracting parts such as eyes or a nose that have characteristic physical shapes by template matching, are known. Another available technique detects an image feature amount, such as a flesh (skin) color or an eye shape, and analyzes it by using a statistical method (see Japanese Patent Application Laid-Open Nos. 10-232934 and 2000-48184). Other currently available techniques include a method of detecting a face by using wavelet transform and an image feature amount, and a method that combines such approaches with template matching.
As a face detection method, a template matching method, which is one of the pattern recognition methods, will be described. Pattern recognition is a process of correlating an observed pattern with one of predefined concepts (classes).
First, in step S1, the face detection unit 114 reads out image data from the memory 102 and pre-processes the image data. In step S2, the face detection unit 114 extracts a pattern of a feature part from the pre-processed image data. The face detection unit 114 then correlates the extracted pattern with a template (standard pattern) (template matching): the template is shifted over the observed image, and the position at which the degree of matching between the template and the image is highest is recognized as the corresponding feature.
Thus, the face detection unit 114 obtains the recognition pattern in step S3, and outputs the obtained recognition pattern in step S4 to finish the pattern recognition process.
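As a rough illustration of template matching (not the patent's specific detector), the sketch below slides a standard pattern over a grayscale image and keeps positions whose normalized correlation exceeds a threshold. The function name, scoring function, and threshold value are assumptions for illustration only.

```python
import numpy as np

def match_template(image, template, threshold=0.8):
    """Slide a template over the image and return (x, y, score) positions whose
    normalized cross-correlation score exceeds the threshold (illustrative)."""
    ih, iw = image.shape
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-6)
    matches = []
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-6)
            score = float((p * t).mean())   # correlation with the standard pattern
            if score >= threshold:
                matches.append((x, y, score))
    return matches
```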
Referring back to the configuration described above, a white balance (WB) control unit 103 calculates a WB correction value based on the image signal stored in the memory 102 and the face information obtained from the face detection unit 114. Using the calculated WB correction value, the WB control unit 103 performs white balance correction (WB correction) on the image signal stored in the memory 102. The method by which the WB control unit 103 calculates the WB correction value is described in detail below.
A color conversion matrix (MTX) circuit 104 amplifies the image signal WB-corrected by the WB control unit 103 by color gains and converts it into color-difference signals R-Y and B-Y so that the image signal is reproduced in optimal colors. A low-pass filter (LPF) circuit 105 limits the frequency band of the color-difference signals R-Y and B-Y. A chroma-suppress (CSUP) circuit 106 suppresses false color signals in the saturated parts of the image signal band-limited by the LPF circuit 105.
The image signal WB-corrected by the WB control unit 103 is also output to a luminance signal (Y) generation circuit 111 to generate a luminance signal Y. An edge enhancement circuit 112 carries out edge enhancement for the generated luminance signal Y.
An RGB conversion circuit 107 converts the color-difference signals R-Y and B-Y output from the CSUP circuit 106, and the luminance signal (Y) output from the edge enhancement circuit 112 into RGB signals, and then a gamma correction circuit 108 corrects gradation of the signals. Subsequently, a color luminance conversion circuit 109 converts the signals into YUV signals, and then the YUV signals are compressed, for example, by using JPEG compression at a compression circuit 110 to be recorded as image signals on an external recording medium or an internal recording medium.
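To make the order of these operations concrete, here is a schematic sketch. Every operation is a simplified stand-in (the actual MTX, LPF, CSUP, edge enhancement, YUV conversion, and compression circuits are not specified here), so only the sequencing and the split into chroma and luminance paths reflect the description above.

```python
import numpy as np

def develop(image_rgb, wb_gains):
    """Order of the described signal chain with simplified stand-in operations;
    none of these stand-ins reproduces the actual circuits 104 through 112."""
    wb = image_rgb * np.asarray(wb_gains, float) / 1024      # WB correction (unit 103)

    # Chroma path: MTX 104 -> LPF 105 -> CSUP 106
    y = wb @ np.array([0.299, 0.587, 0.114])                 # luminance used for color differences
    r_y, b_y = wb[..., 0] - y, wb[..., 2] - y                # color-difference signals R-Y, B-Y
    # (band limiting and chroma suppression of saturated parts omitted in this sketch)

    # Luminance path: Y generation 111 -> edge enhancement 112 (sharpening omitted here)
    luma = y

    # RGB conversion 107 -> gamma correction 108 (YUV conversion 109 and JPEG 110 not shown)
    g = luma - 0.509 * r_y - 0.194 * b_y
    rgb = np.stack([r_y + luma, g, b_y + luma], axis=-1)
    return np.clip(rgb, 0, None) ** (1 / 2.2)                # simple gamma as a placeholder
```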
The WB correction value calculation process performed by the WB control unit 103 according to the first exemplary embodiment is as follows.
First, in step S11, the WB control unit 103 detects white pixels from the image signal stored in the memory 102 to calculate a first white balance correction value (first WB correction value). The method for calculating the first WB correction value will now be described in detail.
First, in step S101, the WB control unit 103 reads out the image signal stored in the memory 102 and divides the image area into m blocks. In step S102, the WB control unit 103 averages the pixel values of each block for each color to calculate color average values (R[i], G[i], B[i]), and calculates color evaluation values (Cx[i], Cy[i]) by using the following equation (1):
Cx[i] = (R[i] − B[i]) / Y[i] × 1024
Cy[i] = {(R[i] + B[i]) − 2G[i]} / Y[i] × 1024 (1)
where Y[i] = R[i] + 2G[i] + B[i], and [i] is the index number of each block.
In step S103, the WB control unit 103 determines whether the color evaluation values (Cx[i], Cy[i]) of the i-th block calculated in step S102 are included within the predetermined white detection range 301 described above.
If the calculated color evaluation values (Cx[i], Cy[i]) are included within the white detection range 301 (YES in step S103), the WB control unit 103 determines that the block is white. Then, in step S104, the WB control unit 103 integrates the color average values (R[i], G[i], B[i]) of the block. If the calculated color evaluation values (Cx[i], Cy[i]) are not included (NO in step S103), the WB control unit 103 proceeds to step S105 without any addition. Steps S103 and S104 can be represented by the following equation (3):
sumR = Σ Sw[i] × R[i]
sumG = Σ Sw[i] × G[i] (3)
sumB = Σ Sw[i] × B[i]
where each sum is taken over all blocks i = 1 to m.
In equation (3), Sw[i] is set to 1 if color evaluation values (Cx[i], Cy[i]) are included within the white detection range 301, and set to 0 if not included. In this way, whether to add color average values (R[i], G[i], B[i]) is practically determined based on the determination in step S103. In step S105, the WB control unit 103 determines whether the aforementioned process has been executed for all the blocks. If there is a block yet to be processed (NO in step S105), the WB control unit 103 returns to step S102 to repeat the process. If all of the blocks have been processed (YES in step S105), the process proceeds to step S106.
In step S106, the WB control unit 103 calculates first WB correction values (WBCo1_R, WBCo1_G, WBCo1_B) based on the obtained integrated values (sumR, sumG, sumB) of the color evaluation values by using the following equation (4):
WBCo1_R=sumY×1024/sumR
WBCo1_G=sumY×1024/sumG (4)
WBCo1_B=sumY×1024/sumB
where sumY=(sumR+2×sumG+sumB)/4.
After the calculation of the first WB correction values, in step S12, the WB control unit 103 determines whether a face has been detected. If no face is detected (NO in step S12), then in step S20, the WB control unit 103 determines to use the first WB correction values calculated in step S11 for the WB process, and the process ends.
If a face is detected (YES in step S12), then in step S13, the WB control unit 103 obtains the blocks corresponding to the face area. In step S14, for one of those blocks, the WB control unit 103 obtains the color average values (R[i], G[i], B[i]) that were calculated in step S102 during the calculation of the first WB correction values.
Then, the WB control unit 103 multiplies each of the color average values obtained in step S14 by the corresponding first WB correction value obtained in step S11 to calculate a flesh color average value (a value obtained by WB-correcting the color average value of the face-area block based on the first WB correction values, i.e., a corrected image signal). In step S15, the WB control unit 103 determines to which of the flesh color area (area (A)), the flesh color correction target area (area (B)) surrounding it, and the area outside both the calculated flesh color average value belongs, and in step S16 adds the flesh color average value with a weight that depends on the determined area.
For example, when a face is erroneously detected, or when the object shifts during shooting, the corrected image signals of the affected blocks tend to fall outside both the flesh color area (A) and the flesh color correction target area (B). A lower weight is therefore applied in step S16 to flesh color average values included in neither area than to those included in area (A) or area (B), so that such blocks contribute less to the addition.
After the aforementioned process has been executed for all of the blocks of the face area (i.e., when the determination in step S17 becomes YES), the WB control unit 103 proceeds to step S18. In step S18, the WB control unit 103 determines whether the total (total of image signals) of the flesh color average values obtained by the addition in step S16 is within the flesh color correction target area (B).
If the total of the flesh color average values is within the flesh color area (A), the face area already shows an appropriate flesh color after correction with the first WB correction values, so in step S20 the WB control unit 103 determines to use the first WB correction values for the WB process. If the total is within the flesh color correction target area (B) (YES in step S18), then in step S19 the WB control unit 103 calculates second WB correction values that shift the total of the flesh color average values into the flesh color area (A), based on the relationship between the total and the flesh color area (A), and determines to use the second WB correction values for the WB process. If the total is included in neither area, the WB control unit 103 uses the first WB correction values calculated in step S11. Then, the process ends.
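A minimal sketch of this face-based correction flow (steps S13 through S19) follows, assuming the per-block color averages and first WB correction values from the earlier sketch. The area predicates for areas (A) and (B), the weight applied to blocks outside both areas, and the flesh_reference target point are illustrative assumptions, not the patent's defined values.

```python
import numpy as np

def face_aware_wb(block_averages, face_block_ids, first_wb,
                  in_flesh_area, in_target_area, flesh_reference):
    """Sketch of steps S13-S19, assuming:
    - block_averages: dict block_id -> (R, G, B) color average values from step S102,
    - first_wb: (WBCo1_R, WBCo1_G, WBCo1_B) from the white-pixel based calculation,
    - in_flesh_area / in_target_area: predicates for areas (A) and (B), whose actual
      shapes are defined by the patent's figures and are placeholders here,
    - flesh_reference: an (R, G, B) point inside area (A) used as the correction
      target; how that point is chosen is not reproduced in this sketch."""
    total = np.zeros(3)
    for i in face_block_ids:
        corrected = np.asarray(block_averages[i], float) * np.asarray(first_wb, float) / 1024
        inside = in_flesh_area(corrected) or in_target_area(corrected)
        weight = 1.0 if inside else 0.25   # lower weight outside (A) and (B); value illustrative
        total += weight * corrected

    if in_target_area(total):
        # Second WB correction values: scale the first values so that the face
        # signal is pulled toward the flesh color area (A).
        return tuple(np.asarray(first_wb, float) * np.asarray(flesh_reference, float) / total)
    return tuple(first_wb)                 # otherwise keep the first WB correction values
```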
According to the above-described exemplary embodiment, erroneous corrections or excessive corrections, which may be caused by an erroneous face recognition or a shift of an object during shooting, can be reduced. Thus, more stable white balance correction can be carried out according to a face detection result.
The present invention can be applied to a system that includes a plurality of devices (e.g., a host computer, an interface device, and a camera head), or to an apparatus (e.g., a digital still camera or a digital video camera).
The present invention can also be achieved as follows. First, a storage medium or a recording medium, which stores software program code to realize the functions of the exemplary embodiment, is supplied to a system or an apparatus. Then, a computer (CPU or MPU) of the system or the computer in an apparatus reads out the program code stored in the storage medium and executes the same. In this case, the program code read out from the storage medium realizes the functions of the exemplary embodiment.
The present invention is not limited to the case where the computer executes the read program code to realize the functions of the exemplary embodiment. The invention can also be achieved as follows. Based on instructions of the readout program code, an operating system (OS) operating in the computer executes a part or the whole of the actual process to realize the functions of the exemplary embodiment. The storage medium which stores the program code, for example, can be a floppy disk, a hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic tape, a non-volatile memory card, a compact disc-ROM (CD-ROM), a CD-recordable (CD-R), a digital versatile disc (DVD), an optical disk, or a magneto-optical disk (MO). A computer network, such as a local area network (LAN) or a wide area network (WAN), can be used for supplying the program code.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions.
This application claims priority from Japanese Patent Application No. 2007-235948 filed Sep. 11, 2007, which is hereby incorporated by reference herein in its entirety.
Claims
1. An image processing apparatus for processing an image signal of an image obtained by capturing, the image processing apparatus comprising:
- a first calculation unit configured to calculate a first white balance correction value by detecting white pixels from the image;
- a face detection unit configured to detect a face area from the image;
- a determination unit configured to determine, after white balance correction based on the first white balance correction value is executed on an image signal of the face area detected by the face detection unit, whether the corrected image signal of the face area is present in a second color signal area around a first color signal area indicating a flesh color; and
- a second calculation unit configured to calculate, when the determination unit determines that the corrected image signal of the face area is present in the second color signal area, a second white balance correction value for correcting the image signal based on a relationship between the corrected image signal of the face area and the first color signal area; and
- a white balance correction unit configured to correct white balance of the image,
- wherein the white balance correction unit executes white balance correction selectively by using the second white balance correction value when the determination unit determines that the corrected image signal of the face area is present in the second color signal area, and by using the first white balance correction value when the determination unit determines that the corrected image signal of the face area is not present in the second color signal area.
2. The image processing apparatus according to claim 1, further comprising a white balance correction unit configured to correct white balance of the image,
- wherein the white balance correction unit executes white balance correction selectively by using the second white balance correction value when the determination unit determines that the corrected image signal of the face area is present in the second color signal area, and by using the first white balance correction value when the determination unit determines that the corrected image signal of the face area is not present in the second color signal area.
3. The image processing apparatus according to claim 1, wherein the second calculation unit corrects the image signal so that the image signal of the face area is located in the first color signal correction area.
4. The image processing apparatus according to claim 1, wherein the first calculation unit detects the white pixels of the image for each predetermined block wherein the image is divided into blocks, and the determination unit executes the determination on the image signal of the face area for each predetermined block.
5. An image processing apparatus for processing an image signal of an image obtained by capturing, the image processing apparatus comprising:
- a white balance correction unit configured to correct white balance of the image signal;
- a face detection unit configured to detect a face area from the image;
- a first calculation unit configured to calculate a first white balance correction value used for the white balance correction based on an image signal of a block which is present in a color signal area indicating white from among blocks into which the image is divided and each of which includes a plurality of pixels; and
- a second calculation unit configured to add corrected image signals present in a first color signal area indicating a flesh color and a second color signal area that is a surrounding area of the first color signal area from among corrected image signals obtained after correcting white balance of image signals of blocks included in the face area based on the first white balance correction value when the face detection unit detects the face area, and to calculate a second white balance correction value for correcting a total of image signals to enter the first color signal area when the total of image signals obtained by the addition is present in the second color signal area,
- wherein the white balance correction unit executes white balance correction by using the second white balance correction value when the second white balance correction value is calculated.
6. The image processing apparatus according to claim 5, wherein the second calculation unit adds the corrected image signals included in neither of the first and second color signal areas, and adds the corrected image signals included in the first color signal area, the corrected image signals included in the second color signal area, and the corrected image signals included in neither of the first and second color signal areas after applying different weights to the respective corrected image signals, and
- wherein the weights applied to the corrected image signals included in neither of the first and second color signal areas are lower than the weights applied to the corrected image signals included in the first and second color signal areas.
7. The image processing apparatus according to claim 5, wherein, when no second white balance correction value is calculated, the white balance correction unit executes white balance correction by using the first white balance correction value.
8. The image processing apparatus according to claim 5, wherein the first and second calculation units calculate the first and second white balance correction values by using image signals of images obtained by capturing after the image processed by the face detection unit.
9. An imaging apparatus comprising:
- an imaging unit configured to output an image signal of an image obtained by capturing; and
- the image processing apparatus according to claim 5.
10. A method for processing an image signal of an image obtained by capturing, the method comprising:
- calculating, using a processor, a first white balance correction value by detecting a white pixel from the image;
- detecting a face area from the image;
- determining whether the corrected image signal of the face area is present in a second color signal area around a first color signal area indicating a flesh color after white balance correction is executed on an image signal of the detected face area based on the first white balance correction value; and
- calculating, using a processor, a second white balance correction value for correcting the image signal when the corrected image signal of the face area is determined to be present in the second color signal area based on a relationship between the corrected image signal of the face area and the first color signal area; and
- executing white balance correction selectively by using the second white balance correction value when it is determined that the corrected image signal of the face area is present in the second color signal area, and by using the first white balance correction value when it is determined that the corrected image signal of the face area is not present in the second color signal area.
11. A method for processing an image signal of an image obtained by capturing, the method comprising:
- detecting a face area from the image;
- calculating, using a processor, a first white balance value used for white balance correction based on an image signal of a block present in a color signal area indicating white from among blocks into which the image is divided for a plurality of pixels;
- adding, when the face area is detected, corrected image signals present in a first color signal area indicating a flesh color and a second color signal area that is a surrounding area of the first color signal area from among corrected image signals obtained after correcting white balance of image signals of blocks included in the face area based on the first white balance correction value, and calculating, using a processor, a second white balance correction value when a total of image signals obtained by the addition is present in the second color signal area for correcting the total of image signals to enter the first color signal area;
- executing white balance correction for the image signal by using the first and second white balance correction values; and
- executing, when the second white balance correction value is calculated, white balance correction by using the second white balance correction value.
12. An image processing apparatus for processing an image signal of an image obtained by capturing, the image processing apparatus comprising:
- a first calculation unit configured to calculate a first white balance correction value based on the image;
- a face detection unit configured to detect a face area from the image;
- a determination unit configured to determine, in a case where white balance correction based on the first white balance correction value is executed on an image signal of a face area, whether the corrected image signal of the face area is present in a second color signal area different from a first color signal area indicating a flesh color;
- a second calculation unit configured to calculate, in a case where the determination unit determines that the corrected image signal of the face area is present in the second color signal area, a second white balance correction value for correcting the image signal based on a relationship between the corrected image signal of the face area and the first color signal area; and
- a white balance correction unit configured to correct white balance of the image using the first white balance correction value and the second white balance correction value.
13. The image processing apparatus according to claim 12, wherein the first calculation unit calculates the first white balance correction value by detecting white pixels from the image.
14. The image processing apparatus according to claim 12, wherein the second color signal area is around a first color signal area.
15. The image processing apparatus according to claim 12, wherein the white balance correction unit executes white balance correction selectively by using the second white balance correction value in a case where the determination unit determines that the corrected image signal of the face area is present in the second color signal area, and by using the first white balance correction value in a case where the determination unit determines that the corrected image signal of the face area is not present in the second color signal area.
16. The image processing apparatus according to claim 12, wherein the second calculation unit corrects the image signal so that the image signal of the face area is located in the first color signal area.
17. The image processing apparatus according to claim 12, wherein the image is divided into blocks, and the determination unit executes the determination on the image signal of the face area for each block.
18. A method for processing an image signal of an image obtained by capturing, the method comprising:
- calculating, using a processor, a first white balance correction value;
- detecting a face area from the image;
- determining, in a case where white balance correction is executed on an image signal of a face area based on the first white balance correction value, whether the corrected image signal of the face area is present in a second color signal area different from a first color signal area indicating a flesh color;
- calculating, using a processor, a second white balance correction value for correcting the image signal in a case where the corrected image signal of the face area is determined to be present in the second color signal area based on a relationship between the corrected image signal of the face area and the first color signal area; and
- correcting white balance of the image using the first white balance correction value and the second white balance correction value.
19. A non-transitory computer-readable storage medium storing a control program for implementing a control method of controlling an image processing apparatus for processing an image signal of an image obtained by capturing, the control method comprising:
- calculating, using a processor, a first white balance correction value by detecting a white pixel from the image;
- detecting a face area from the image based on the image;
- determining, in a case where white balance correction is executed on an image signal of the detected face area based on the first white balance correction value, whether the corrected image signal of the face area is present in a second color signal area around a first color signal area indicating a flesh color;
- calculating, using a processor, a second white balance correction value for correcting the image signal in a case where the corrected image signal of the face area is determined to be present in the second color signal area based on a relationship between the corrected image signal of the face area and the first color signal area; and
- correcting white balance of the image using the first white balance correction value and the second white balance correction value.
20. A non-transitory computer-readable storage medium storing a control program for implementing a control method of controlling an image processing apparatus for processing an image signal of an image obtained by capturing, the control method comprising:
- detecting a face area from the image;
- calculating, using a processor, a first white balance value used for white balance correction based on an image signal of a block present in a color signal area indicating white among blocks into which the image is divided for a plurality of pixels;
- adding, in a case where the face area is detected, corrected image signals present in a first color signal area indicating a flesh color and a second color signal area that is a surrounding area of the first color signal area among corrected image signals obtained after correcting white balance of image signals of blocks included in the face area based on the first white balance correction value, and calculating, using a processor, a second white balance correction value in a case where a total of image signals obtained by the addition is present in the second color signal area for correcting the total of image signals to enter the first color signal area;
- executing white balance correction for the image signal by using the first and second white balance correction values; and
- executing, in a case where the second white balance correction value is calculated, white balance correction by using the second white balance correction value.
21. A non-transitory computer-readable storage medium storing a control program for implementing a control method of controlling an image processing apparatus for processing an image signal of an image obtained by capturing, the control method comprising:
- calculating, using a processor, a first white balance correction value;
- detecting a face area from the image;
- determining, in a case where white balance correction is executed on an image signal of a face area based on the first white balance correction value, whether the corrected image signal of the face area is present in a second color signal area different from a first color signal area indicating a flesh color;
- calculating, using a processor, a second white balance correction value for correcting the image signal in a case where the corrected image signal of the face area is determined to be present in the second color signal area based on a relationship between the corrected image signal of the face area and the first color signal area; and
- correcting white balance of the image using the first white balance correction value and the second white balance correction value.
References Cited
U.S. patent documents:
- 4,739,393 (April 19, 1988) Seki et al.
- 6,975,759 (December 13, 2005) Lin
- 7,139,425 (November 21, 2006) Takahashi
- 7,599,093 (October 6, 2009) Kagaya
- 7,636,108 (December 22, 2009) Suzuki et al.
- 7,652,717 (January 26, 2010) Enge et al.
- 7,864,222 (January 4, 2011) Yoshino et al.
- 7,868,929 (January 11, 2011) Fujiwara
- 8,045,014 (October 25, 2011) Fujiwara et al.
- 2003/0001958 (January 2, 2003) Hoshuyama
- 2003/0058350 (March 27, 2003) Ishimaru et al.
- 2003/0156206 (August 21, 2003) Ikeda et al.
- 2004/0156544 (August 12, 2004) Kajihara
- 2004/0184671 (September 23, 2004) Fukuda et al.
- 2004/0208363 (October 21, 2004) Berge et al.
- 2006/0284991 (December 21, 2006) Ikeda
- 2008/0002865 (January 3, 2008) Toyoda
- 2008/0211925 (September 4, 2008) Misawa et al.
- 2009/0002519 (January 1, 2009) Nakamura
- 2009/0067683 (March 12, 2009) Takayama
- 2009/0167892 (July 2, 2009) Takayama
- 2009/0225226 (September 10, 2009) Kakuta
Foreign patent documents:
- JP 10-232934 (September 1998)
- JP 2000-048184 (February 2000)
- JP 2001-148863 (May 2001)
- JP 2004-180114 (June 2004)
- JP 2005-122612 (May 2005)
Type: Grant
Filed: Apr 17, 2014
Date of Patent: Dec 6, 2016
Assignee: Canon Kabushiki Kaisha (Tokyo)
Inventor: Masahiro Takayama (Tokyo)
Primary Examiner: Adam L Basehoar
Application Number: 14/255,748
International Classification: G06K 9/00 (20060101); G06K 9/38 (20060101); H04N 1/60 (20060101); G06T 5/00 (20060101); H04N 5/232 (20060101); H04N 9/73 (20060101);