Method and system for luminance noise filtering
In a method and system for luminance noise filtering, a region of pixel data directly from the image sensor is used for determining a virtually filtered luminance for a pixel location within the region. Luminance noise reduction is performed using the region of pixel data directly from the image sensor such that frame memory is eliminated. In addition, the present invention provides adaptive noise filtering by selecting the virtually filtered luminance as a final luminance for a darker image and by selecting a reference luminance without virtual noise filtering for a brighter image.
The present application is a continuation application of an earlier filed copending patent application with Ser. No. 10/776,447, filed on Feb. 10, 2004, for which priority is claimed. This earlier filed copending patent application with Ser. No. 10/776,447 is incorporated herein in its entirety by reference.
The present application also claims priority under 35 USC § 119 to Korean Patent Application No. 2003-0037268, filed on Jun. 10, 2003, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference. A certified copy of Korean Patent Application No. 2003-0037268 is contained in the parent copending patent application with Ser. No. 10/776,447.
TECHNICAL FIELD
The present invention relates generally to image pick-up devices, and more particularly, to luminance noise filtering for a pixel location using a region of pixel data directly from an image sensor such that line memory capacity is minimized.
BACKGROUND OF THE INVENTION
A signal processor 110 manipulates the pixel data from the image sensor 104 for showing the image of the object 106 on a display 112, for further processing by an image recognition system 114, or for sending the image via a transmission system 116 such that the image is shown on a remote display 118.
With the Bayer filter array, the image sensor 104 generates an intensity signal of a respective color at each pixel location. A square labeled with an “R” is for a pixel location on the image sensor 104 that generates an intensity signal of the red color component. Similarly, a square labeled with a “G” is for a pixel location on the image sensor 104 that generates an intensity signal of the green color component. Further, a square labeled with a “B” is for a pixel location on the image sensor 104 that generates an intensity signal of the blue color component.
An interpolation algorithm is then used by the signal processor 110 to determine the full set of intensity signals of the respective interpolated RGB color components for each of the pixel locations. The interpolation algorithm uses the pixel data of the Bayer color filter array 120 for such a determination.
Such an interpolation algorithm is known to one of ordinary skill in the art as disclosed in U.S. Pat. No. 5,382,976, U.S. Pat. No. 5,506,619, or U.S. Pat. No. 6,091,862. For determining the interpolated color components R′, G′, and B′ at a particular pixel location 124 with such an interpolation algorithm, a region of pixel data 126 surrounding that pixel location 124 is used.
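By way of illustration only (not part of the original disclosure), the following is a minimal sketch of one common interpolation approach, simple bilinear averaging of neighboring same-color samples. It is not necessarily the algorithm of the cited patents, and the RGGB pattern assumption and the function names are illustrative.

def bayer_color(row, col):
    # Color of the Bayer filter at (row, col), assuming an RGGB pattern:
    # R at even row/even column, B at odd row/odd column, G elsewhere.
    if row % 2 == 0 and col % 2 == 0:
        return "R"
    if row % 2 == 1 and col % 2 == 1:
        return "B"
    return "G"

def interpolate_component(bayer, row, col, color):
    # Bilinear estimate of the color component missing at (row, col): the
    # average of the nearest surrounding samples of that color
    # (interior pixel locations only; boundary handling omitted).
    samples = [bayer[row + dr][col + dc]
               for dr in (-1, 0, 1) for dc in (-1, 0, 1)
               if (dr, dc) != (0, 0) and bayer_color(row + dr, col + dc) == color]
    return sum(samples) // len(samples)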
Temporal noise affects the quality of the image of the object as detected and generated by the image pick-up device 102. Temporal noise is the variation in the output from the image sensor 104 even under uniform illumination onto the image sensor 104. Such temporal noise may arise from shot noise and 1/f noise at the photo-diodes of the image sensor 104, from thermal noise at the transistors and other circuit components used within the image sensor 104, or from quantization error of an A/D (analog to digital) converter used within the image sensor 104.
Such temporal noise increases with the brightness of the image. However, the detrimental effect of the temporal noise on the image is greater at lower illumination because the SNR (signal to noise ratio) decreases with lower illumination. For example, when shot noise dominates, the noise grows roughly as the square root of the signal level, so the SNR also grows only as the square root of the signal level and degrades at low illumination. In fact, temporal noise sets a limit on the dynamic range of the image sensor 104 under dark conditions.
In the prior art, after the interpolated RGB color components 122A, 122B, and 122C for an n×n array of pixel locations are generated and stored in the frame memory device 122, a noise reducing block 132 uses such interpolated RGB color components for reducing the deleterious effects of the temporal noise.
In any case, for such a prior art noise reducing process, the capacity of the frame memory device 122 must be sufficient to store the n×n arrays of interpolated RGB color components used by the noise reducing block 132. However, such a relatively large capacity of the frame memory device 122 is disadvantageous when the camera system 102 is incorporated as part of a hand-held device such as a cell phone or a PDA, for example. Thus, elimination of the frame memory device 122 is desired for a smaller device size, lower power dissipation, and lower cost, especially when the camera system 102 is incorporated into a hand-held device.
SUMMARY OF THE INVENTION
Accordingly, in a general aspect of the present invention, luminance noise filtering is performed for a pixel location using a relatively small region of pixel data directly from the image sensor such that the frame memory may be eliminated.
In a general embodiment of the present invention, in a method and system for luminance noise filtering, a region of pixel data from the image sensor is used for determining a virtually filtered luminance for a pixel location within the region. In an example embodiment, the virtually filtered luminance is determined by averaging the respective pixel data multiplied with a respective weighting coefficient for each pixel location of the region.
In another embodiment of the present invention, the color components for the pixel location are determined from the region of pixel data.
In a further embodiment of the present invention, a reference luminance is determined for the pixel location from the color components. A final luminance of the pixel location is selected between the virtually filtered luminance and the reference luminance depending on an adaptive luminance that indicates the brightness of the image. The present invention provides adaptive noise filtering by selecting the virtually filtered luminance as the final luminance for a darker image.
In this manner, noise filtering is performed for a pixel location within a region of pixel data using virtual luminance that is determined using such a region of pixel data directly from the image sensor. Thus, frame memory for storing interpolated pixel data is eliminated with the present invention. The elimination of frame memory is especially advantageous for smaller device size, lower power dissipation, and lower cost of the camera system incorporated into a hand-held device.
These and other features and advantages of the present invention will be better understood by considering the following detailed description of the invention which is presented with the attached drawings.
The figures referred to herein are drawn for clarity of illustration and are not necessarily drawn to scale. Elements having the same reference number in different figures refer to elements having similar structure and function.
Referring to the accompanying figures, a system 200 for luminance noise filtering is typically implemented within the signal processor 110 for manipulating pixel data from the image sensor 104. The system 200 includes an RGB matrix 202 and a line memory device 204 that stores a region of pixel data 206 from the image sensor 104.
The line memory device 204 may be implemented with any type of data storage device. For example, the line memory device 204 may store pixel data for 4×H pixel locations, with H being the number of columns of pixels at the image sensor 104 (for example, 648 columns), when the image sensor 104 outputs pixel data row by row.
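By way of illustration only, a minimal sketch of how such a line memory can supply a small region of raw pixel data around the current row without any frame memory. The five-row window and the class and method names are assumptions, not part of the original disclosure.

class LineMemory:
    # Retains only the most recent sensor rows (here, four stored rows plus
    # the incoming row) so that a small vertical window is available around
    # the current pixel location without storing a whole frame.
    def __init__(self, stored_rows=4):
        self.stored_rows = stored_rows
        self.rows = []

    def push_row(self, row):
        # Called as the image sensor outputs pixel data row by row.
        self.rows.append(list(row))
        if len(self.rows) > self.stored_rows + 1:
            self.rows.pop(0)  # discard the oldest row instead of buffering a frame

    def region(self, col, half_width=2):
        # Returns the available window of raw pixel data centered on column
        # 'col' (interior columns only; boundary handling omitted).
        return [r[col - half_width: col + half_width + 1] for r in self.rows]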
The RGB matrix 202 also includes an interpolation processor 208 for determining interpolated color components R′, B′, and G′ of a pixel location 210 within the region 206 (step 304).
In addition, the RGB matrix 202 determines a reference luminance, Y1H, from the interpolated color components R′, B′, and G′ as follows (step 304):
Y1H=(19*R′+38*G′+7*B′)/64
Such a reference luminance Y1H approximates a conventional industry standard for calculating luminance (approximately 0.299 R, 0.587 G, and 0.114 B), with integer coefficients 19, 38, and 7 that sum to 64, as known to one of ordinary skill in the art.
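By way of illustration only, a minimal sketch of the reference luminance computation given above; the function name is an assumption.

def reference_luminance(r, g, b):
    # Y1H = (19*R' + 38*G' + 7*B') / 64; the integer coefficients sum to 64,
    # so the division can be implemented as a 6-bit right shift.
    return (19 * r + 38 * g + 7 * b) >> 6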
Furthermore, the RGB matrix 202 includes a virtual luminance processor 212 for determining virtual luminance arrays Y0V, Y1V, and Y2V from the region of pixel data 206 (step 304).
The first virtual luminance array Y0V is comprised of virtual luminance values A, B, and C; the second virtual luminance array Y1V is comprised of virtual luminance values D, E, and F; and the third virtual luminance array Y2V is comprised of virtual luminance values G, H, and I.
Referring back to the system 200, a noise filter 214 includes a first multiplier 222, a second multiplier 224, a third multiplier 226, and an adder 228 for determining a virtually filtered luminance GOUT from the virtual luminance arrays Y0V, Y1V, and Y2V.
The first multiplier 222 multiplies the first luminance array Y0V with a first weighted coefficient array GAD0 [39 63 39] as follows:
39*A+63*B+39*C.
Similarly, the second multiplier 224 multiplies the second luminance array Y1V with a second weighted coefficient array GAD1 [63 104 63] as follows:
63*D+104*E+63*F.
Also, the third multiplier 226 multiplies the third luminance array Y2V with a third weighted coefficient array GAD2 [39 63 39] as follows:
39*G+63*H+39*I.
The adder 228 sums together the resulting values from the multipliers 222, 224, and 226. A shifter 232 within a brightness control block 234 divides the result from the adder 228 by the sum of all the coefficients of GAD0, GAD1, and GAD2 (i.e., 512, implemented as a right shift by nine bits). A fourth multiplier 236 within the brightness control block 234 multiplies the result from the shifter 232 by a luminance compensation factor α to generate the virtually filtered luminance GOUT.
In one embodiment of the present invention, the luminance compensation factor α and the weighted coefficient arrays GAD0, GAD1, and GAD2 are determined using a Gaussian distribution equation for optimum image quality and are stored within a data register 220.
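By way of illustration only, a minimal sketch of the weighted averaging described above; the function name and the handling of the compensation factor α as a plain multiplier are assumptions.

# Weighted coefficient arrays GAD0, GAD1, and GAD2; all coefficients sum to 512.
GAD = [[39, 63, 39],
       [63, 104, 63],
       [39, 63, 39]]

def virtually_filtered_luminance(y_v, alpha):
    # y_v holds the virtual luminance values [[A, B, C], [D, E, F], [G, H, I]].
    weighted_sum = sum(GAD[i][j] * y_v[i][j] for i in range(3) for j in range(3))
    # Divide by the coefficient sum of 512 with a 9-bit right shift (shifter 232),
    # then apply the luminance compensation factor alpha (multiplier 236).
    return (weighted_sum >> 9) * alpha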
In addition, a data processor 218 determines a threshold value THV from an adaptive luminance AY that indicates the brightness of the image.
For example, in one embodiment of the present invention, the adaptive luminance AY is determined from an average reference luminance for a predetermined region of pixel data, or is indicated by an auto exposure gain for the image sensor 104.
In another embodiment of the present invention, the reference luminance Y1H is used to indicate the adaptive luminance AY.
Referring back to the noise filter 214, a final luminance of the pixel location 210 is selected between the virtually filtered luminance GOUT and the reference luminance Y1H depending on the threshold value THV.
If the absolute value of the difference between the virtually filtered luminance GOUT and the reference luminance Y1H is less than or equal to the threshold value THV, the noise filter 214 selects the virtually filtered luminance GOUT as the final luminance. Otherwise, if such an absolute value of the difference is greater than the threshold value THV, the noise filter 214 selects the reference luminance Y1H as the final luminance.
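By way of illustration only, a minimal sketch of this adaptive selection; the function name is an assumption.

def select_final_luminance(g_out, y1h, thv):
    # Keep the virtually filtered luminance GOUT when it stays within the
    # threshold value THV of the reference luminance Y1H; otherwise fall back
    # to the unfiltered reference luminance Y1H.
    if abs(g_out - y1h) <= thv:
        return g_out
    return y1h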
If the pixel location 210 is a last pixel location for an image (step 316), the process ends. Otherwise, the above steps are repeated for a next pixel location of the image.
Generally, a larger array of pixel data is generated from the image sensor 104 for producing a smaller array of the processed image, as known to one of ordinary skill in the art. Pixel data from locations toward the outer perimeter of the image sensor 104 are used for image signal processing of adjacent pixel locations toward the center of the processed image. However, such pixel locations toward the outer perimeter are cut off from the processed image because a sufficient region of pixel data surrounding such a pixel location is not available, as known to one of ordinary skill in the art. Steps 304, 306, 308, 310, 312, 314, and 316 are thus performed for the pixel locations that remain part of the processed image.
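By way of illustration only, a minimal sketch of how the processed image shrinks relative to the sensor array when a surrounding window of pixel data is required; the 5×5 window size and the function name are assumptions.

def processed_image_size(sensor_rows, sensor_cols, window=5):
    # Pixel locations within window // 2 rows or columns of the sensor edge
    # lack a full surrounding region of pixel data and are cut off from the
    # processed image.
    border = window // 2
    return sensor_rows - 2 * border, sensor_cols - 2 * border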
In this manner, noise filtering is performed for a single pixel location using a region of pixel data 206 directly from the image sensor 104 while determining the final luminance Y1H/GOUT for the pixel location 210. The region of pixel data 206 is also used for determining the interpolated color components R′, G′, and B′ of the pixel location 210. Because noise filtering is performed by using the region of pixel data 206 directly from the image sensor 104, frame memory for storing interpolated pixel data may be eliminated with the present invention.
Thus, the capacity of the memory device included in the image pick-up device may be minimized since the luminance noise filter uses the image data directly from the image sensor for determining the final luminance Y1H/GOUT and the interpolated color components R′, G′, and B′ of the pixel location 210. Such smaller memory capacity is advantageous for smaller device size, lower power dissipation, and lower cost especially when the camera system is incorporated into a hand-held device.
In addition, the present invention provides adaptive noise filtering by varying the threshold value THV depending on the brightness of the image. For a brighter image, the reference luminance Y1H is selected as the final luminance instead of the virtually filtered luminance GOUT. Noise filtering introduces distortion to the image, and the effect of temporal noise is less for a brighter image. Thus, the reference luminance Y1H without distortion from virtual noise filtering is selected as the final luminance for a brighter image. On the other hand, the deleterious effect of temporal noise is greater for a darker image. Thus, the virtually filtered luminance GOUT with noise filtering is selected as the final luminance for a darker image.
Referring back to the system 200, the final luminance Y1H/GOUT and the interpolated color components R′, G′, and B′ are further processed by a Y/C processor 216 and a Y/C formatter 219.
The Y/C formatter 219 generates luminance and chrominance data Y, Cb, Cr/R, and color data R′″, G′″, and B′″ according to a standard as required by the display 112, the image recognition system 114, or the transmission system 116. Such components of the Y/C processor 216 and the Y/C formatter 219 are known to one of ordinary skill in the art. The data processor 218 also determines and outputs to the image pick-up device auto exposure control data AED from Y1H, R′, G′, and B′, as known to one of ordinary skill in the art. The present invention lies in the darkly outlined components of the RGB matrix 202, the noise filter 214, the data processor 218, and the data register 220.
The foregoing is by way of example only and is not intended to be limiting. For example, the present invention is described for the camera system 102 that may be part of a hand-held device. However, the present invention may be used for any type of imaging device performing image signal processing. In addition, the components illustrated and described herein for an example embodiment of the present invention may be implemented with any combination of hardware and/or software and in discrete and/or integrated circuits. Furthermore, any number illustrated and described herein, such as the number of pixels, is by way of example only.
The present invention is limited only as defined in the following claims and equivalents thereof.
Claims
1. A method for luminance noise filtering, comprising:
- inputting a region of pixel data from an image sensor;
- determining a virtually filtered luminance from a first processing of said region of pixel data and without using other pixel data for a pixel location within the region; and
- determining a reference luminance for the pixel location from a second processing of said same region of pixel data and without using other pixel data,
- wherein the reference luminance is determined after respective interpolated color components for the pixel location are determined such that the reference luminance is determined using said respective interpolated color components.
2. The method of claim 1, wherein the second processing includes the steps of:
- determining said interpolated color components for the pixel location from said region of pixel data; and
- determining the reference luminance for the pixel location from the interpolated color components.
3. The method of claim 1, further comprising:
- selecting between the virtually filtered luminance and the reference luminance as a final luminance of the pixel location depending on an adaptive luminance.
4. The method of claim 3, further comprising:
- determining a threshold value from the adaptive luminance;
- selecting the virtually filtered luminance if an absolute value of a difference between the virtually filtered luminance and the reference luminance is less than or equal to the threshold value; and
- selecting the reference luminance if the absolute value of the difference between the virtually filtered luminance and the reference luminance is greater than the threshold value.
5. The method of claim 4, wherein the adaptive luminance is determined from an average reference luminance for a predetermined region of pixel data.
6. The method of claim 3, wherein the adaptive luminance is indicated by an auto exposure gain for the image sensor.
7. The method of claim 3, wherein the adaptive luminance is indicated by the reference luminance.
8. The method of claim 1, wherein the virtually filtered luminance is determined by averaging a respective pixel data multiplied with a respective weighting coefficient for each pixel location of the region.
9. The method of claim 1, wherein the image sensor is part of a hand-held image pick-up device having minimized line memory capacity.
10. A system for luminance noise filtering, comprising:
- a memory device for storing a region of pixel data from an image sensor;
- a noise filter for determining a virtually filtered luminance from a first processing of said region of pixel data and without using other pixel data for a pixel location within the region; and
- a matrix for determining a reference luminance for the pixel location from a second processing of said same region of pixel data and without using other pixel data,
- wherein the matrix determines the reference luminance after respective interpolated color components for the pixel location are determined such that the reference luminance is determined using said respective interpolated color components.
11. The system of claim 10, wherein the matrix determines said interpolated color components for the pixel location from said region of pixel data such that the reference luminance is determined from the interpolated color components.
12. The system of claim 10, wherein the noise filter selects between the virtually filtered luminance and the reference luminance as a final luminance of the pixel location depending on an adaptive luminance.
13. The system of claim 12, further comprising:
- a data processor that determines a threshold value from the adaptive luminance;
- wherein the noise filter selects the virtually filtered luminance if an absolute value of a difference between the virtually filtered luminance and the reference luminance is less than or equal to the threshold value; and
- wherein the noise filter selects the reference luminance if the absolute value of the difference between the virtually filtered luminance and the reference luminance is greater than the threshold value.
14. The system of claim 13, wherein the adaptive luminance is determined from an average reference luminance for a predetermined region of pixel data.
15. The system of claim 12, wherein the adaptive luminance is indicated by an auto exposure gain for the image sensor.
16. The system of claim 12, wherein the adaptive luminance is indicated by the reference luminance.
17. The system of claim 10, wherein the virtually filtered luminance is determined by averaging a respective pixel data multiplied with a respective weighting coefficient for each pixel location of the region.
18. The system of claim 10, wherein the image sensor is part of a hand-held image pick-up device having minimized line memory capacity.
19. A system for luminance noise filtering, comprising:
- means for inputting a region of pixel data from an image sensor;
- means for determining a virtually filtered luminance from said region of pixel data and without using other pixel data for a pixel location within the region; and
- means for determining a reference luminance for the pixel location from a second processing of said same region of pixel data and without using other pixel data,
- wherein the reference luminance is determined after respective interpolated color components for the pixel location are determined such that the reference luminance is determined using said respective interpolated color components.
20. The system of claim 19, further comprising:
- means for determining said interpolated color components for the pixel location from said region of pixel data; and
- means for determining the reference luminance for the pixel location from the interpolated color components.
21. The system of claim 20, further comprising:
- means for selecting between the virtually filtered luminance and the reference luminance as a final luminance of the pixel location depending on an adaptive luminance.
Type: Application
Filed: Nov 28, 2008
Publication Date: Apr 2, 2009
Inventor: Hyung-Guen Lee (Suwon-Si)
Application Number: 12/315,147
International Classification: H04N 5/225 (20060101);