Fingerprint region segmenting apparatus, directional filter unit and methods thereof
A fingerprint region segmenting apparatus and methods thereof. The fingerprint region segmenting apparatus may include at least one directional filter receiving an input fingerprint image and filtering the input fingerprint image to generate at least one directional image, a normalization unit normalizing the at least one directional image and a region classification unit dividing the normalized at least one directional image into a plurality of blocks and classifying each of the plurality of blocks. In an example, the classification for each of the plurality of blocks may be one of a foreground of the input fingerprint image and a background of the input fingerprint image. In an example method, a fingerprint may be segmented by dividing a fingerprint image into a plurality of regions based on a plurality of directional images, each of the plurality of directional images associated with a different angular direction.
This application claims the benefit of Korean Patent Application No. 10-2005-0000807, filed on Jan. 5, 2005, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates generally to a fingerprint apparatus, directional filter and methods thereof, and more particularly to a fingerprint region segmenting apparatus, directional filter and methods thereof.
2. Description of the Related Art
Fingerprints may vary from person to person. Further, a fingerprint may not change throughout a person's life. Accordingly, fingerprints may be a useful tool for identification. Conventional fingerprint recognition systems may verify a person's identity, and may be included in, for example, an automated security system, a financial transaction system, etc.
In conventional fingerprint recognition systems, an input fingerprint image may include a foreground and a background. The foreground may refer to an area of the input fingerprint image including ridges. The ridges may indicate where a finger may contact a fingerprint input apparatus when the fingerprint may be made. The background may refer to an area that may not include ridge information, which may be a portion of the fingerprint image where a finger may not contact the fingerprint input apparatus when the fingerprint may be made.
Conventional fingerprint recognition systems may distinguish between the foreground and the background with fingerprint segmentation. The fingerprint segmentation may divide a given fingerprint image into a foreground and a background. The fingerprint segmentation may be performed at an initial stage of a fingerprint recognition process.
The fingerprint segmentation may enable other stages of the fingerprint recognition process, such as, for example, an extraction of ridge directions in the foreground, enhancement of foreground image quality and/or thinning of the foreground. Accordingly, the fingerprint segmentation may reduce a duration of the fingerprint recognition process and/or increase a reliability of the fingerprint recognition process.
However, errors may occur with respect to the information extracted from the background and/or the foreground. A fingerprint region segmenting process may reduce errors with respect to the background and/or the foreground. In the conventional region segmenting process, a brightness value in a given direction for each pixel of a fingerprint image (e.g., the background and/or the foreground) may be calculated. The fingerprint image may be divided into a plurality of blocks having a given pixel size (e.g., 16×16). The conventional region segmenting process may use a histogram distribution of the brightness values associated with the given directions in corresponding blocks to divide the fingerprint image into a plurality of regions.
However, if a given region in the plurality of regions has a uniform brightness, the direction for the given region may not be determined and the given region may not be divided correctly. Other conventional methods for determining a given fingerprint region may be based on a maximum response of a Gabor filter bank, reconstructing a fingerprint region, a consistency of ridge directions, a mean and variance of brightness of a fingerprint image, an absolute value of a ridge gradient calculated in given units and/or establishing a reliability metric based on information from neighboring blocks/regions.
However, each of the above-described conventional methodologies may be based on fixed threshold values which may filter a fingerprint image received from a given fingerprint input apparatus. Thus, if the given fingerprint apparatus is changed, the fixed threshold values may be less accurate, which may reduce an accuracy of a fingerprint region segmentation. In addition, other fingerprint characteristics (e.g., a humidity level or whether a fingerprint may be wet or dry) may vary between fingerprint images, which may further reduce the accuracy of the fingerprint region segmentation.
SUMMARY OF THE INVENTION
An example embodiment of the present invention is directed to a fingerprint region segmenting apparatus, including a directional filter unit receiving an input fingerprint image and filtering the input fingerprint image to generate at least one directional image, a normalization unit normalizing the at least one directional image and a region classification unit dividing the normalized at least one directional image into a plurality of blocks and classifying each of the plurality of blocks.
Another example embodiment of the present invention is directed to a method of segmenting a fingerprint image, including filtering an input fingerprint image to generate at least one directional image, normalizing the at least one directional image, dividing the at least one normalized directional image into a plurality of blocks and classifying each of the plurality of blocks.
Another example embodiment of the present invention is directed to a method of segmenting a fingerprint image, including segmenting the fingerprint image into a plurality of blocks based on a plurality of directional images, each of the plurality of directional images associated with a different angular direction.
Another example embodiment of the present invention is directed to a directional filter unit, including a plurality of directional filters generating a plurality of directional images based on a fingerprint image, each of the plurality of directional images associated with a different angular direction.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate example embodiments of the present invention and, together with the description, serve to explain principles of the present invention.
Hereinafter, example embodiments of the present invention will be described in detail with reference to the accompanying drawings.
In the Figures, the same reference numerals are used to denote the same elements throughout the drawings.
In the example embodiment of
An example embodiment of the directional gradient filter unit 120 of
FIGS. 2(A), 2(B), 2(C) and 2(D) illustrate example directional gradient filters 220/240/260/280 corresponding to angular directions of 0°, 45°, 90°, and 135°, respectively, according to another example embodiment of the present invention. The following Equations 1-4 may correspond to the example embodiments illustrated in
DGF0(x, y) = Σ_{k=-m}^{+m} {I(x + d, y - k) - I(x - d, y - k)}  (Equation 1)

DGF45(x, y) = Σ_{k=-m}^{+m} {I(x + d/2 + k, y + d/2 - k) - I(x - d/2 + k, y - d/2 - k)}  (Equation 2)

DGF90(x, y) = Σ_{k=-m}^{+m} {I(x - k, y + d) - I(x - k, y - d)}  (Equation 3)

DGF135(x, y) = Σ_{k=-m}^{+m} {I(x - d/2 + k, y + d/2 + k) - I(x + d/2 + k, y - d/2 + k)}  (Equation 4)

where a coordinate (x) may denote a horizontal position of a given pixel of the FIMG, a coordinate (y) may denote a vertical position of the given pixel of the FIMG, I(x, y) may denote a level of brightness of the given pixel at coordinate (x, y), DGF0(x,y), DGF45(x,y), DGF90(x,y), and DGF135(x,y) may denote a brightness difference at the given pixel in angular directions of 0°, 45°, 90°, and 135°, respectively, a distance d may denote a distance from a center pixel C, and (2m+1) may denote a width of a filter (e.g. directional gradient filter 122, 124, 126, 128, etc.).
In the example embodiment of FIGS. 2(A), 2(B), 2(C) and 2(D), a variable m may equal 1 and the distance d may equal 2. Further, the directional gradient filters 220/240/260/280 may be represented as two sets of three pixels (e.g., −1, 1, etc.) and the center pixel C in a 5×5 pixel grid.
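As an illustration, the m = 1, d = 2 configuration above can be sketched as follows (a minimal Python/NumPy sketch; the function name directional_gradients is hypothetical, the image is assumed indexed as img[y, x], and the diagonal filters follow one plausible reading of Equations 2 and 4):

```python
import numpy as np

def directional_gradients(img, x, y, d=2, m=1):
    """Apply the four directional gradient filters of Equations 1-4
    (0°, 45°, 90° and 135°) to the pixel at (x, y), where img[y, x]
    holds the brightness I(x, y), d is the sampling distance and
    (2m + 1) is the filter width."""
    h = d // 2  # half-distance used by the diagonal (45°/135°) filters
    dgf0 = sum(img[y - k, x + d] - img[y - k, x - d] for k in range(-m, m + 1))
    dgf45 = sum(img[y + h - k, x + h + k] - img[y - h - k, x - h + k]
                for k in range(-m, m + 1))
    dgf90 = sum(img[y + d, x - k] - img[y - d, x - k] for k in range(-m, m + 1))
    dgf135 = sum(img[y + h + k, x - h + k] - img[y - h + k, x + h + k]
                 for k in range(-m, m + 1))
    return dgf0, dgf45, dgf90, dgf135
```

On a uniform region all four responses are approximately zero, while a horizontal brightness ramp yields a strong 0° response, matching the foreground/background behavior described in the text.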
In the example embodiment of
In another example embodiment of the present invention, the directional gradient filters 220/240/260/280 of FIGS. 2(A)-2(D) may correspond to the first/second/third/fourth directional gradient filters 122/124/126/128, respectively, of
In the example embodiment of
In another example embodiment of the present invention, directional gradient filters 220/240/260/280, which may use Equations 1-4, respectively, may output filtered values DGF1, DGF2, DGF3 and DGF4, respectively. In an example, if the difference of brightness values at a given angular direction is higher, the absolute value of the filtered values DGF1/DGF2/DGF3/DGF4 may be higher. Likewise, if the difference of brightness values at a given angular direction is lower, the absolute value of the filtered values DGF1/DGF2/DGF3/DGF4 may be lower (e.g., approximately zero).
In another example, there may be a lower brightness value difference among neighboring pixels in a background of a given fingerprint image. In another example, there may be an increased brightness value difference among neighboring pixels in a foreground of the given fingerprint image. If the absolute value of the filtered value DGF1/DGF2/DGF3/DGF4 is lower (e.g., approximately zero), there may be a higher probability that a corresponding center pixel is located in the background of the given fingerprint image. Likewise, if the absolute value of the filtered value DGF1/DGF2/DGF3/DGF4 is higher, there may be a higher probability that a corresponding center pixel is located in the foreground of the given fingerprint image.
In another example embodiment of the present invention, if noise (e.g., point noise) occurs in a fingerprint image, a brightness difference among neighboring pixels may be higher. Accordingly, if an absolute value of the filtered value DGF1/DGF2/DGF3/DGF4 is equal to or greater than a maximum threshold MAX or equal to or less than a minimum threshold MIN, there may be a higher probability that a corresponding center pixel may be located in a noise region. In an example, the maximum threshold MAX and the minimum threshold MIN may be values corresponding to the upper 1% and the lower 1%, respectively, of the filtered values DGF1/DGF2/DGF3/DGF4 obtained by filtering a number of pixels (e.g., all pixels) in a plurality of angular directions (e.g., 0°, 45°, 90°, 135°, etc). However, it is understood that values for the maximum threshold MAX and the minimum threshold MIN may be established in any well-known manner in other example embodiments of the present invention. For example, a user may set values for the thresholds MIN/MAX in another example embodiment of the present invention.
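The 1% threshold heuristic above might be sketched as follows (a Python sketch with hypothetical helper names noise_thresholds and likely_noise; percentiles over the pooled filtered values are assumed to approximate the upper/lower 1% cut-offs):

```python
import numpy as np

def noise_thresholds(filtered_values, pct=1.0):
    """Derive the MIN/MAX noise thresholds as the lower and upper 1% of
    the filtered values DGF1..DGF4 pooled over pixels and directions."""
    v = np.asarray(filtered_values, dtype=float).ravel()
    return np.percentile(v, pct), np.percentile(v, 100.0 - pct)

def likely_noise(dgf, vmin, vmax):
    """A center pixel whose filtered value is at or beyond either
    threshold has a higher probability of lying in a noise region."""
    return dgf <= vmin or dgf >= vmax
```

In practice the thresholds could instead be fixed by a user, as the text notes; the percentile form simply tracks the distribution of the current image set.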
In the example embodiment of
In another example embodiment of the present invention, brightness ranges may vary based on a type of fingerprint input apparatus receiving a given fingerprint. Thus, the directional gradient images associated with fingerprint images of the same finger may vary based at least in part on the type of fingerprint input apparatus.
In another example embodiment of the present invention, fingerprint images associated with the same finger may have different brightness ranges with respect to a humidity level of a fingerprint input apparatus. Thus, the directional gradient images of fingerprint images may vary based at least in part on a humidity level associated with a received fingerprint image.
In the example embodiment of
In another example embodiment of the present invention, a normalization of the directional gradient images DGIMG1/DGIMG2/DGIMG3/DGIMG4 may be given as

NDGIθ(x, y) = {(MIN - DGFθ(x, y)) / MIN} × (A + 1)/2, if DGFθ(x, y) < 0
NDGIθ(x, y) = {(MAX + DGFθ(x, y)) / MAX} × (A + 1)/2, otherwise  (Equation 5)
where NDGI(x,y) may denote a value obtained by normalizing the values DGF1/DGF2/DGF3/DGF4 filtered for a given pixel at a coordinate (x, y), angle θ may denote a given angular direction associated with one of the directional gradient filters 122/124/126/128, and a value A may denote an upper bound in a range for normalization. In the example embodiment of
An example embodiment of the normalization represented in Equation 5 will now be described in greater detail.
In the example embodiment of Equation 5, filtered values DGF1/DGF2/DGF3/DGF4, which may be distributed between the maximum threshold MAX and the minimum threshold MIN (e.g., as illustrated in
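Equation 5's normalization of a filtered value into the range 0 to A might be sketched as (normalize_dgf is a hypothetical name; saturating out-of-range values at the MIN/MAX thresholds is an assumption, and the formula as written peaks at A + 1 for DGFθ = MAX, so rounding down to A is presumed to happen downstream):

```python
import numpy as np

def normalize_dgf(dgf, vmin, vmax, A=255):
    """Normalize a filtered value per Equation 5: negative values are
    mapped against MIN (vmin) into the lower half of the range, and
    non-negative values against MAX (vmax) into the upper half."""
    dgf = np.clip(dgf, vmin, vmax)  # saturate beyond the 1% thresholds (assumption)
    if dgf < 0:
        return (vmin - dgf) / vmin * (A + 1) / 2
    return (vmax + dgf) / vmax * (A + 1) / 2
```

Note that the mapping is continuous at zero: a filtered value of 0 lands at the central value (A + 1)/2 from either branch.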
In the example embodiment of FIGS. 5(A), 5(B), 5(C) and 5(D), the normalized directional gradient image 510 may be clear (e.g., having portions with a higher probability of correctly characterizing as one of a foreground or a background) in the 0° degree direction, the normalized directional gradient image 520 may be clear in the 45° degree direction, the normalized directional gradient image 530 may be clear in the 90° degree direction and the normalized directional gradient image 540 may be clear in the 135° degree direction.
In the example embodiment of
Ei(p, q) = (1/m²) Σ_{x=pm+1}^{pm+m} Σ_{y=qm+1}^{qm+m} NDGIi(x, y)

where coordinate (p,q) may denote a position for one of the plurality of blocks in a normalized gradient image, and direction i may denote a given angular direction (e.g., 0°, 45°, 90°, 135°) of the directional gradient filter.
In the example embodiment of
In the example embodiment of
HS(p, q) = |CHH(p, q) - CHL(p, q)| / (CHH(p, q) + CHL(p, q))

where the coordinate (p,q) may denote a position for one of the plurality of blocks in a normalized gradient image, a first number CHL may denote the number of normalized values less than the central value and a second number CHH may denote the number of normalized values greater than the central value. The symmetrical coefficient HS may have a value between 0 and 1. In an example, the symmetry of a given block's distribution of normalized values may increase as the symmetrical coefficient HS approaches 0 and may decrease as the symmetrical coefficient HS approaches 1.
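The per-block variance and symmetrical coefficient, together with the classification rule from the claims (foreground if the variance exceeds a variance threshold TV while HS stays below a symmetrical coefficient threshold), might be sketched as (block_stats and classify_block are hypothetical names; taking 128, the midpoint of the normalized 0 to 255 range, as the central value is an assumption):

```python
import numpy as np

def block_stats(ndgi_block, central=128.0):
    """Mean, variance and symmetrical coefficient HS of the normalized
    brightness differences inside one block of a directional image."""
    v = np.asarray(ndgi_block, dtype=float).ravel()
    mean = v.mean()
    var = ((mean - v) ** 2).mean()
    chh = np.count_nonzero(v > central)  # normalized values above the central value
    chl = np.count_nonzero(v < central)  # normalized values below the central value
    # No off-center values: treat the block as maximally asymmetric (assumption).
    hs = abs(chh - chl) / (chh + chl) if (chh + chl) else 1.0
    return mean, var, hs

def classify_block(var, hs, tv, th):
    """Foreground if the block varies strongly and its distribution is
    symmetric about the central value; background otherwise."""
    return 'foreground' if var > tv and hs < th else 'background'
```

A block of alternating ridge/valley extremes thus scores a high variance and an HS near 0 (foreground), while a flat block scores zero variance (background).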
In the example embodiment of
In another example embodiment of the present invention, a fingerprint region may be segmented by normalizing a plurality of directional gradient images. Thus, threshold values (e.g., variance threshold TV, symmetrical coefficient threshold, etc.) need not be adjusted for different environments (e.g., different fingerprint input apparatuses, different humidity levels, etc.).
In the example embodiment of
Although described primarily in terms of hardware above, the example methodology implemented by one or more components of the example system described above may also be embodied in software as a computer program. For example, a program in accordance with the example embodiments of the present invention may be a computer program product causing a computer to execute a method of segmenting a fingerprint image into a plurality of regions, as described above.
The computer program product may include a computer-readable medium having computer program logic or code portions embodied thereon for enabling a processor of the system to perform one or more functions in accordance with the example methodology described above. The computer program logic may thus cause the processor to perform the example method, or one or more functions of the example method described herein.
The computer-readable storage medium may be a built-in medium installed inside a computer main body or removable medium arranged so that it can be separated from the computer main body. Examples of the built-in medium include, but are not limited to, rewriteable non-volatile memories, such as RAM, ROM, flash memories and hard disks. Examples of a removable medium may include, but are not limited to, optical storage media such as CD-ROMs and DVDs; magneto-optical storage media such as MOs; magnetic storage media such as floppy disks (trademark), cassette tapes, and removable hard disks; media with a built-in rewriteable non-volatile memory such as memory cards; and media with a built-in ROM, such as ROM cassettes.
These programs may also be provided in the form of an externally supplied propagated signal and/or a computer data signal embodied in a carrier wave. The computer data signal embodying one or more instructions or functions of the example methodology may be carried on a carrier wave for transmission and/or reception by an entity that executes the instructions or functions of the example methodology. For example, the functions or instructions of the example method may be implemented by processing one or more code segments of the carrier wave in a computer controlling one or more of the components of the example apparatus 100 of
Further, such programs, when recorded on computer-readable storage media, may be readily stored and distributed. The storage medium, as it is read by a computer, may enable the processing of multimedia data signals, prevention of copying of these signals, allocation of multimedia data signals within an apparatus configured to process the signals, and/or the reduction of communication overhead in an apparatus configured to process multiple multimedia data signals, in accordance with the example method described herein.
Example embodiments of the present invention being thus described, it will be obvious that the same may be varied in many ways. For example, while above-described example embodiments include four directional gradient filters corresponding to four angular directions, it is understood that other example embodiments of the present invention may include any number of directional gradient filters and/or angular directions. Further, while above-described example embodiments are illustrated with a symmetrical distribution (e.g., in
Further, the example embodiment illustrated in
Further, while above-described as directional gradient filters 122/124/126/128/220/240/260/280, it is understood that in other example embodiments of the present invention, any directional filter may be employed. Likewise, while above-described as directional gradient images, it is understood that in other example embodiments any directional image may be generated by other example directional filters.
Such variations are not to be regarded as a departure from the spirit and scope of example embodiments of the present invention, and all such modifications as would be obvious to one skilled in the art are intended to be included within the scope of the following claims.
Claims
1. A fingerprint region segmenting apparatus, comprising:
- a directional filter unit receiving an input fingerprint image and filtering the input fingerprint image to generate at least one directional image;
- a normalization unit normalizing the at least one directional image; and
- a region classification unit dividing the normalized at least one directional image into a plurality of blocks and classifying each of the plurality of blocks.
2. The fingerprint region segmenting apparatus of claim 1, further comprising:
- a pre-processing unit for reducing noise in the input fingerprint image.
3. The fingerprint region segmenting apparatus of claim 1, wherein the directional filter unit includes a plurality of directional filters and the at least one directional image includes a plurality of directional images.
4. The fingerprint region segmenting apparatus of claim 3, wherein the plurality of directional filters filters the input fingerprint image at a plurality of angular directions.
5. The fingerprint region segmenting apparatus of claim 4, wherein each of the plurality of directional filters filters the input fingerprint image at a different one of the plurality of angular directions.
6. The fingerprint region segmenting apparatus of claim 1, wherein the region classification unit classifies based at least in part on variances and symmetrical coefficients associated with the plurality of blocks.
7. The fingerprint region segmenting apparatus of claim 1, wherein the region classification unit classifies each of the plurality of blocks as being associated with one of a foreground of the input fingerprint image and a background of the input fingerprint image.
8. The fingerprint region segmenting apparatus of claim 4, wherein the plurality of angular directions includes at least one of 0°, 45°, 90°, and 135°.
9. The fingerprint region segmenting apparatus of claim 4, wherein the plurality of angular directions include a first angular direction, a second angular direction, a third angular direction and a fourth angular direction, wherein a brightness difference between pixels in the input fingerprint image for the first, second, third and fourth angular directions may be represented respectively as

DGF0(x, y) = Σ_{k=-m}^{+m} {I(x + d, y - k) - I(x - d, y - k)}

DGF45(x, y) = Σ_{k=-m}^{+m} {I(x + d/2 + k, y + d/2 - k) - I(x - d/2 + k, y - d/2 - k)}

DGF90(x, y) = Σ_{k=-m}^{+m} {I(x - k, y + d) - I(x - k, y - d)}

DGF135(x, y) = Σ_{k=-m}^{+m} {I(x - d/2 + k, y + d/2 + k) - I(x + d/2 + k, y - d/2 + k)}

where DGF0, DGF45, DGF90, and DGF135 denote the brightness differences in angular directions of 0°, 45°, 90°, and 135°, respectively, coordinate (x,y) denotes coordinates indicating the position of the pixel in the directional image, d denotes a distance from the pixel and (2m+1) denotes the width of a corresponding directional filter.
10. The fingerprint region segmenting apparatus of claim 9, wherein m equals 1 and d equals 2.
11. The fingerprint region segmenting apparatus of claim 1, wherein the normalization unit generates the normalized at least one directional image by normalizing brightness differences of each pixel of the at least one directional image into values in a given range.
12. The fingerprint region segmenting apparatus of claim 11, wherein the given range ranges from 0 to A, and the normalized brightness difference is expressed as

NDGIθ(x, y) = {(min - DGFθ(x, y)) / min} × (A + 1)/2, if DGFθ(x, y) < 0
NDGIθ(x, y) = {(max + DGFθ(x, y)) / max} × (A + 1)/2, otherwise

where NDGI denotes the normalized brightness difference, min denotes a brightness difference corresponding to a lowest 1% from among a brightness distribution, θ denotes one of a plurality of angular directions associated with the at least one directional filter, and max denotes the brightness difference corresponding to a highest 1% from among the brightness distribution.
13. The fingerprint region segmenting apparatus of claim 12, wherein A equals 255.
14. The fingerprint region segmenting apparatus of claim 1, wherein the region classification unit includes:
- a block segmenting unit dividing the normalized directional image into the plurality of blocks, each of the plurality of blocks having a given size;
- a variance calculation unit calculating a first variance of normalized brightness differences in each of the plurality of blocks;
- a symmetrical coefficient calculation unit calculating a symmetrical coefficient of the normalized brightness difference in each of the plurality of blocks; and
- a region determination unit determining a classification associated with each of the plurality of blocks based at least in part on the calculated first variance and the calculated symmetrical coefficient.
15. The fingerprint region segmenting apparatus of claim 14, wherein the variance calculation unit calculates a mean of the normalized brightness differences at a plurality of angular directions for each of the plurality of blocks, calculates a second variance of the normalized brightness differences at the plurality of angular directions for each of the plurality of blocks and selects a maximum value among the calculated second variances at the plurality of angular directions as the first variance for one of the plurality of blocks.
16. The fingerprint region segmenting apparatus of claim 15, wherein the mean is expressed as

Ei(p, q) = (1/m²) Σ_{x=pm+1}^{pm+m} Σ_{y=qm+1}^{qm+m} NDGIi(x, y)

where coordinate (p,q) denotes a position of one of the plurality of blocks in the normalized at least one image and i denotes one of the plurality of angular directions.
17. The fingerprint region segmenting apparatus of claim 15, wherein the second variance is expressed as

Vi(p, q) = (1/m²) Σ_{x=pm+1}^{pm+m} Σ_{y=qm+1}^{qm+m} {Ei(p, q) - NDGIi(x, y)}²

where coordinate (p,q) denotes a position of one of the plurality of blocks in the normalized at least one image and i denotes one of the plurality of angular directions.
18. The fingerprint region segmenting apparatus of claim 14, wherein the symmetrical coefficient calculation unit calculates the symmetrical coefficient for each of the plurality of blocks based on a ratio of a number of the normalized brightness differences greater than a central value in a brightness distribution to a number of the normalized brightness differences less than the central value in the brightness distribution.
19. The fingerprint region segmenting apparatus of claim 18, wherein the symmetrical coefficient is expressed as

HS(p, q) = |CHH(p, q) - CHL(p, q)| / (CHH(p, q) + CHL(p, q))

where coordinate (p,q) denotes a position of one of the plurality of blocks in the normalized at least one image, CHL denotes the number of normalized brightness differences less than the central value in the brightness distribution, and CHH denotes the number of normalized brightness differences greater than the central value in the brightness distribution.
20. The fingerprint region segmenting apparatus of claim 14, wherein the region determination unit classifies a given block as associated with a foreground of the input fingerprint image if the variance is greater than a variance threshold and the symmetrical coefficient is less than a symmetrical coefficient threshold and classifies the given block as associated with a background of the input fingerprint image if the variance is not greater than the variance threshold and the symmetrical coefficient is not less than the symmetrical coefficient threshold.
21. The fingerprint region segmenting apparatus of claim 14, further comprising:
- a preprocessing unit reducing noise in the input fingerprint image.
22. The fingerprint region segmenting apparatus of claim 21, wherein the preprocessing unit reduces the noise with a Gaussian-filtering process.
23. The fingerprint region segmenting apparatus of claim 1, further comprising:
- a post-processing unit correcting a classification for at least one incorrectly classified block from among the plurality of blocks.
24. The fingerprint region segmenting apparatus of claim 23, wherein the at least one corrected block is initially classified incorrectly by the region classification unit.
25. The fingerprint region segmenting apparatus of claim 24, wherein the post-processing unit corrects the at least one incorrectly classified block by repeatedly median-filtering the fingerprint image in which the incorrectly classified block is classified.
26. A method of segmenting a fingerprint image, comprising:
- filtering an input fingerprint image to generate at least one directional image;
- normalizing the at least one directional image;
- dividing the at least one normalized directional image into a plurality of blocks; and
- classifying each of the plurality of blocks.
27. The method of claim 26, further comprising:
- preprocessing the input fingerprint image to reduce noise before the filtering.
28. The method of claim 27, wherein the filtering filters the input fingerprint image at a plurality of angular directions.
29. The method of claim 27, wherein the dividing is based at least in part on a variance and a symmetrical coefficient of each of the plurality of blocks.
30. The method of claim 27, wherein the classifying classifies each of the plurality of blocks as being associated with one of a foreground of the input fingerprint image and a background of the input fingerprint image.
31. The method of claim 28, wherein the plurality of angular directions include at least one of 0°, 45°, 90°, and 135°.
32. The method of claim 28, wherein the plurality of angular directions include a first angular direction, a second angular direction, a third angular direction and a fourth angular direction, wherein a brightness difference between pixels in the input fingerprint image for the first, second, third and fourth angular directions may be represented respectively as

DGF0(x, y) = Σ_{k=-m}^{+m} {I(x + d, y - k) - I(x - d, y - k)}

DGF45(x, y) = Σ_{k=-m}^{+m} {I(x + d/2 + k, y + d/2 - k) - I(x - d/2 + k, y - d/2 - k)}

DGF90(x, y) = Σ_{k=-m}^{+m} {I(x - k, y + d) - I(x - k, y - d)}

DGF135(x, y) = Σ_{k=-m}^{+m} {I(x - d/2 + k, y + d/2 + k) - I(x + d/2 + k, y - d/2 + k)}

where DGF0, DGF45, DGF90, and DGF135 denote the brightness differences in angular directions 0°, 45°, 90°, and 135°, respectively, coordinate (x,y) denotes coordinates indicating the position of the pixel in the directional image, d denotes a distance from the pixel and (2m+1) denotes the width of a corresponding directional filter.
33. The method of claim 32, wherein m equals 1 and d equals 2.
34. The method of claim 26, wherein the normalizing includes normalizing brightness differences of each pixel of the at least one directional image into values in a given range.
35. The method of claim 34, wherein the given range ranges from 0 to A, and the normalized brightness difference is expressed as

NDGIθ(x, y) = {(min - DGFθ(x, y)) / min} × (A + 1)/2, if DGFθ(x, y) < 0
NDGIθ(x, y) = {(max + DGFθ(x, y)) / max} × (A + 1)/2, otherwise

where NDGI denotes the normalized brightness difference, min denotes a brightness difference corresponding to a lowest 1% from among a brightness distribution, θ denotes one of a plurality of angular directions associated with the at least one directional filter, and max denotes the brightness difference corresponding to a highest 1% from among the brightness distribution.
36. The method of claim 35, wherein A equals 255.
37. The method of claim 26, wherein the classifying includes:
- dividing the at least one normalized directional image into the plurality of blocks, each of the plurality of blocks having a given size;
- calculating a first variance of normalized brightness differences for each of the plurality of blocks;
- calculating a symmetrical coefficient of the brightness difference for each of the plurality of blocks; and
- determining a classification associated with each of the plurality of blocks based on the calculated first variance and the calculated symmetrical coefficient.
38. The method of claim 37, wherein the calculating of the first variance includes:
- calculating a mean of the normalized brightness differences at a plurality of angular directions for each of the plurality of blocks;
- calculating a second variance of the normalized brightness differences at the plurality of angular directions for each of the plurality of blocks; and
- selecting a maximum value among the calculated second variances at the plurality of angular directions as the first variance for one of the plurality of blocks.
39. The method of claim 38, wherein the mean is expressed as

Ei(p, q) = (1/m²) Σ_{x=pm+1}^{pm+m} Σ_{y=qm+1}^{qm+m} NDGIi(x, y)

where coordinate (p,q) denotes a position of one of the plurality of blocks in the normalized at least one image and i denotes one of the plurality of angular directions.
40. The method of claim 38, wherein the second variance is expressed as

Vi(p, q) = (1/m²) Σ_{x=pm+1}^{pm+m} Σ_{y=qm+1}^{qm+m} {Ei(p, q) - NDGIi(x, y)}²

where coordinate (p,q) denotes a position of one of the plurality of blocks in the normalized at least one image and i denotes one of the plurality of angular directions.
41. The method of claim 37, wherein the calculating of the symmetrical coefficient is based on a ratio of a number of the normalized brightness differences greater than a central value to a number of the normalized brightness differences less than the central value.
42. The method of claim 37, wherein the symmetrical coefficient is expressed as

HS(p, q) = |CHH(p, q) - CHL(p, q)| / (CHH(p, q) + CHL(p, q))

where coordinate (p,q) denotes a position of one of the plurality of blocks in the normalized at least one image, CHL denotes the number of normalized brightness differences less than the central value and CHH denotes the number of normalized brightness differences greater than the central value.
43. The method of claim 37, wherein the classifying classifies a given block as associated with a foreground of the input fingerprint image if the variance is greater than a variance threshold and the symmetrical coefficient is less than a symmetrical coefficient threshold.
44. The method of claim 37, wherein the classifying classifies a given block as associated with a background of the input fingerprint image if the variance is not greater than a variance threshold and the symmetrical coefficient is not less than a symmetrical coefficient threshold.
45. The method of claim 27, wherein the preprocessing is performed before the at least one directional image is generated.
46. The method of claim 27, wherein the preprocessing includes a Gaussian-filtering process.
47. The method of claim 26, wherein the classifying classifies at least one of the plurality of blocks incorrectly.
48. The method of claim 47, further comprising:
- correcting the classification of the at least one incorrectly classified block.
49. The method of claim 48, wherein the correcting includes applying a median-filtering process repeatedly.
Type: Application
Filed: Jan 4, 2006
Publication Date: Jul 6, 2006
Inventors: Dong-jae Lee (Seoul), Deok-soo Park (Seoul)
Application Number: 11/324,380
International Classification: G06K 9/00 (20060101);