Fast Iris Detection via Shape-Based Circularity

Milos Stojmenovic, Aleksandar Jevremovic
Singidunum University, Belgrade, Serbia
{mstojmenovic, ajevremovic}@singidunum.ac.rs

Amiya Nayak
SEECS, University of Ottawa, Canada
[email protected]

Abstract— We define an iris detection algorithm that performs an order of magnitude faster than the state of the art while preserving accuracy. The algorithm isolates the pupil boundary by extracting image edges, then finding the largest contiguous set of points that satisfies the circularity criterion and contains mostly dark pixels. The iris/sclera boundary is found by searching horizontally, simultaneously in both directions from the pupil center, for the highest cumulative difference in intensities. Current detection systems mainly rely on the methods proposed by [D] to isolate the iris pattern reliably, yet those methods are computationally expensive and correspondingly slow. We apply a measure of circularity to isolate both the sclera and pupil boundaries, avoiding the exhaustive search required by [D]. Our method correctly identifies the iris region in 95% of test cases in the CASIA 3 dataset [CA]. While this detection rate is slightly lower than that of its competitors, our method runs in O(n²) time, compared to the Ω(n³) of [D]. The iris detection procedure of [D], implemented in Matlab, runs in roughly 15 seconds per image on average, while our own implementation in C++ on a single-core 2.0 GHz processor takes about 5 milliseconds on the same system, which is also faster than [HAK].

Keywords: Iris Detection, Circularity, Image Processing.

I. INTRODUCTION

Like other measures for primitive geometric shapes, the measure of circularity is motivated by real world image processing problems, in this case biometrics. Circularity is common in nature and industry, and finding a way of identifying it can be important for potential applications to both. The main part of our solution involves measuring how circular a finite set of points is.

In analyzing various algorithms, we restrict ourselves to the following criteria. Circularity values are assigned to sets of points, and these values are numbers in the range [0, 1]. The circularity measure equals 1 if and only if the shape is a circle, and equals 0 when the shape is highly non-circular, such as a line. A shape's circularity value should be invariant under similarity transformations of the shape, such as scaling, rotation and translation. The algorithms should also be resistant to protrusions in the data set. Finally, circularity values should be computable by a simple and fast algorithm for any finite set of points. Using such a framework, we are able to quickly detect circular regions as candidates for pupil and sclera boundary detection.

Circularity measures were discussed in [LS, DD, CKT, KA, P]. All of them are area based and linked to closed curves except for one: [P]. That measure is shape based and can be applied to open curves.

Daugman's method [D] extracts the iris area by using an integrodifferential circular edge detector which tests a given point and radius (x, y, r) for circular characteristics. An exhaustive search is performed over the entire image for circular regions. Once the optimal locations of the circles have been determined, the area between them is used for iris comparison. [ZTW] detects the pupil by assuming it is the largest dark spot in the iris image, and the outer boundary of the iris by "maximizing changes of the perimeter-normalized sum of gray level values along the circle." [RB, FH] manually localized the iris, while [YZW] find the center of the pupil by first applying a binary threshold, and then finding the longest consecutive sequence of black pixels while searching through each horizontal line of the binary image. [HAK] introduces a real time solution to iris detection which is based on a combination of morphology, thresholding and an incremental, horizontal search for the pupil.

We propose shape based algorithms that assign circularity values to both open and closed curves. These measures are adaptations of the linearity measures proposed in [SNZ]. The choice of center of each shape influences its overall circularity value. The center of each shape is traditionally taken to be its center of gravity. We also consider another definition of shape center here. The 'true center' (Xtc, Ytc) of a shape is defined as the center of the circle C that best fits the shape. It is determined by sampling k triplets of points from the point set and finding the center of the circle each triplet defines. The per-coordinate median of the k sampled centers is taken as the shape's true center. We use both definitions of shape center in order to eliminate instances where a boundary's center is outside the image. We successfully apply our algorithm to the problem of iris detection in the CASIA V3 near infrared iris dataset, which contains 2655 iris images from 249 subjects.
Our algorithm was able to detect 95% of all irises (sclera and pupil boundaries), an order of magnitude faster than [D]. The literature review is given in section 2. Our circularity measure and iris detection procedure are presented in section 3. Experimental data are presented in section 4, along with a general discussion of our results.

II. LITERATURE REVIEW

A. Iris Detection

Daugman's iris recognition algorithm [D] first became commercialized in the 1990s. The algorithm automatically recognizes persons in real time by encoding the random patterns visible in the iris of the eye from some distance, and applying a powerful test of statistical independence. It is currently used in many identification applications, such as several border crossing controls. Daugman's algorithm uses a Gabor wavelet transform to extract the phase structure of the iris. This is encoded into a very compact bit stream, the IrisCode, that is stored in a database for identification purposes.

The iris detection algorithm of [D] extracts the iris from a near infrared image. After the Gabor wavelet transform, it applies an integrodifferential circular edge detector, an exhaustive search over the entire image where an iris is present, taking into account 3 parameters (Xi, Yi, R), where Xi and Yi are the proposed coordinates of the center of the eye, and R is the radius from the center of the iris or pupil to the boundary. Each iris has 2 radii (one to the edge of the pupil, and the other to the edge of the iris). Once the optimal location of the center has been determined via this exhaustive search, the area between the two radii is used for iris comparison. The number of tests is of the order of the product of the width, height and maximum radius of the iris, and is therefore cubic in these dimensions. Since each test (for the locations of the pupil and sclera boundaries) has possibly non-constant time complexity, we conclude that the time complexity of [D] is Ω(n³).

[YZW] finds the center of the pupil by first applying a binary threshold to the eye image, and then finding the longest consecutive sequence of black pixels while searching through each horizontal line of the binary image. The longest such 'chord' is declared the diameter of the pupil. Due to illumination issues within the pupil, this technique is not always stable. It remains unclear how the outer ring of the iris is determined, since the authors only mention that this is achieved by "using edge detection in a certain region determined by the center of the pupil".
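To make the cost of this exhaustive search concrete, the following is a rough, hedged sketch of a discrete integrodifferential search (simplified: no Gaussian smoothing of the radial derivative, a coarse grid step, and a fixed angular sampling; all function and parameter names are ours, not [D]'s):

```python
import numpy as np

def circle_mean(img, cx, cy, r, n=64):
    """Mean intensity along a circle of radius r centred at (cx, cy)."""
    th = np.linspace(0, 2 * np.pi, n, endpoint=False)
    xs = np.clip((cx + r * np.cos(th)).astype(int), 0, img.shape[1] - 1)
    ys = np.clip((cy + r * np.sin(th)).astype(int), 0, img.shape[0] - 1)
    return img[ys, xs].mean()

def integrodifferential(img, r_min, r_max, step=2):
    """Exhaustive search for the (cx, cy, r) maximising the radial
    derivative of the circular line integral -- cubic in the image
    dimensions, which is exactly why [D] is slow."""
    best, best_params = -np.inf, None
    for cy in range(r_max, img.shape[0] - r_max, step):
        for cx in range(r_max, img.shape[1] - r_max, step):
            means = np.array([circle_mean(img, cx, cy, r)
                              for r in range(r_min, r_max)])
            deriv = np.abs(np.diff(means))   # discrete radial derivative
            r_idx = int(np.argmax(deriv))
            if deriv[r_idx] > best:
                best, best_params = deriv[r_idx], (cx, cy, r_min + r_idx)
    return best_params
```

On a synthetic image containing a dark disk on a bright background, the maximal radial derivative is attained at the disk's centre and radius, illustrating the operator's behaviour on a pupil-like region.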
[ZTW] detect the pupil by assuming it is the largest dark spot in the iris image, and the outer boundary of the iris by "maximizing changes of the perimeter-normalized sum of gray level values along the circle." No further details are given; however, this is reminiscent of the approach taken by [D]. The iris images used for testing are cropped such that nothing except the iris is present, which reduces the general applicability of the system.

A real time iris detection system was proposed by [T], which claims to trivially find pupils via a global threshold followed by a morphological opening to remove light sources reflected within. The iris boundary is found by an algorithm that remotely resembles active contour models, which optimizes an energy function at 4 points which roughly form the iris outer boundary. Their algorithm requires tuning several parameters to achieve optimal results, which they claim to be real time, yet no specific execution times are mentioned. Earlier versions of the [CA] database contained simpler inclusions, which made the pupil detection operation simpler, but recent iterations make morphological operations far less successful in isolating pupil regions.

[J] discusses the performance of 5 different iris segmentation algorithms on challenging images. The goal is to convey some of the difficulties in localizing the iris structure in images of the eye characterized by variations in illumination, eye-lid and eye-lash occlusion, de-focus blur, motion blur and low resolution. Although some of the algorithms achieve high detection rates, no discussion is offered regarding the speed of any of the surveyed algorithms.

A real time iris detection system is proposed by [HAK]. They employ a Hough circle search for pupillary boundary detection and a gradient search method for limbic boundary detection. Their method also uses Otsu thresholds, morphological operations, median filtering, and a gradient search for the iris/sclera boundary, the last part of which is reminiscent of the approach of [D]. Their algorithm, implemented as a multithreaded C++ program in Unix on a quad-core 2.4 GHz processor, runs in an average of 0.19 s. Masek [M] uses a Hough based search approach for localization of the iris and pupil region, which runs slightly faster than [D], with similar detection accuracy. [SM] propose a fuzzy circle detector, which is a set of circles with different radii. The set is tested against any region by comparing the intensity values on the perimeters of consecutive circles. Circular regions that pass the test will have a high relative difference of intensities, which is a similar idea to Daugman's integrodifferential circular detector, although the authors of [SM] maintain that their algorithm runs an order of magnitude faster than [D] while suffering only slightly in terms of detection rate.

B. Existing circularity measures

The most used circularity measure is the well-known C = 4πA/P², where A is the area of the shape and P is its perimeter. It is scale independent. However, the border of an object may be highly irregular; the calculated value of C will then be very small and will not reflect the correct circularity measure, as observed in [DD]. Di Ruberto and Dempster [DD] defined new circularity measures which are translation and scale invariant and are based on mathematical morphology. Morphological operators simplify images, and quantify and preserve the main shape of objects.
Erosion g = f ⊖ s of a binary image f by a structuring element s is an image g such that g(x,y) = 1 if s, when centered at (x,y), fits f, and g(x,y) = 0 otherwise. Thus erosion creates a new image that marks all of the locations of a structuring element's origin at which it fits the input image [E]. Di Ruberto and Dempster [DD] make use of a distance function dist(p) associated with each pixel p, defined as the integer-valued radius of the smallest circle which erodes p. Let R(p,r) be a circle with radius r centered at p. Then dist(p) = min {r ∈ N : p ∉ f ⊖ R(p,r)}. The regional maxima of the distance function represent inner points of f located at the longest distance from the border of f. In this sense, they can be considered 'centres of gravity' of an object. A perfect circle has one regional maximum, at the centre of the circle. Let h = max{dist(p)} be the maximum of the distance function. The V measure [DD] is defined as V = Σ dist(p)/h³, which is the ratio of the volume under the distance function to the cube of the height of the same function. For a perfect circle this value is π/3, so after normalization by π/3 it equals 1. The T measure is defined as T = A/h² [DD], where A is the area of the object. The E measure of [DD] is E = h/√A, where h is the radius of the largest disk contained inside the given object.
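The distance-function measures above can be sketched numerically. The following is a minimal, hedged illustration (our own naming; the distance map is computed by brute force rather than by repeated erosions, which is equivalent for the Euclidean metric up to discretization):

```python
import numpy as np

def distance_map(mask):
    """Brute-force Euclidean distance transform: distance from each
    shape pixel to the nearest background pixel."""
    by, bx = np.nonzero(~mask)
    fy, fx = np.nonzero(mask)
    d2 = (fy[:, None] - by) ** 2 + (fx[:, None] - bx) ** 2
    dist = np.zeros(mask.shape)
    dist[fy, fx] = np.sqrt(d2.min(axis=1))
    return dist

def dd_measures(mask):
    """V, T and E measures of [DD] computed from the distance map."""
    dist = distance_map(mask)
    h = dist.max()             # radius of the largest inscribed disk
    area = float(mask.sum())
    V = dist.sum() / h ** 3    # ~ pi/3 for a perfect disk, 1 after scaling
    T = area / h ** 2          # ~ pi for a perfect disk
    E = h / np.sqrt(area)      # ~ 1/sqrt(pi) for a perfect disk
    return V, T, E
```

On a rasterized disk, V, T and E come out close to their analytic values π/3, π and 1/√π, up to discretization error.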

In the M measure of [DD], the distances from the center of gravity to the border in several directions (e.g. 8 or 16 cardinal directions) are calculated, and the object's deformity is measured by the variance of these distances. If there are several centers of gravity, the sum of variances is taken. A perfect circle has zero variance. Let L be the number of centres of gravity. If d(i,j) is the distance from centre of gravity i in direction j, then M = var(d(1,j)/max_j d(1,j)) + … + var(d(L,j)/max_j d(L,j)). Experiments in [DD] seem to favor the M and E measures. All of these measures are restricted to closed curves only. The measures presented in [DD] will be analyzed in section III.

Proffitt [P] introduced measures for circularity and ellipticity on a digital grid. His circularity measure is based on taking the mean radius μr, presumably from the center of gravity to each border pixel, together with the standard deviation σr of all such radii. The proposed circularity measure was P = 1 − (σr/μr)². It is a variation of the measure σr/μr proposed by Haralick [H]. This is the only shape based measure that was found in the literature.

Lee and Sallee [LS] introduce a set of area based measures of circularity, triangularity, and rectangularity. Their circularity measure first calculates the intersection and union of the shape area S with the area of the circle C that best fits the shape; the final circularity measure is the ratio of these areas, (S∩C)/(S∪C). Determining the best-fit circle C was not explained in their work.

Novaski and Barczak [NB] surveyed and compared several known circularity algorithms, which can be divided into two groups: MZC (minimum zone center) and LSC (least-square center); one representative of each group is selected. The comparison is in the context of computer-aided inspection through coordinate measuring machines, an important process controller in industry, since circularity is an important attribute. In the LSC algorithm, the circularity value is calculated based on the nearest and farthest points measured from the center of mass. In the Voronoi-diagram based MZC approach, the nearest and farthest Voronoi diagrams are first constructed. Each intersection point of an edge of the nearest and an edge of the farthest Voronoi diagram is the center of the smallest circle that contains all the points of the set. The two points that define that nearest Voronoi edge define another circle centered at the same intersection point. The MZC solution corresponds to the pair of circles with the smallest separation. We did not consider either measure because they are sensitive to protrusions in the set.

Sladoje, Nyström, and Saha [SNS, SNS-2] investigated and proposed several measurements of digitized 2D and 3D roundness (compactness, circularity) for objects with fuzzy borders.
That is, each pixel, instead of a binary value, has a value in the interval [0, 1] which corresponds to its probability of belonging to the shape. Their method provides advantages for low resolution images with respect to crisp (binary) images.

A 3D roundness index, a generalization of the circularity measure to 3D, has been studied in several articles, for instance [SNS, BJM]. Roundness is normally defined as A³/(36πV²), or the square or cube root of it (A is the area while V is the volume of a 3D image). In [BJM], a voxel-based measure of digital compactness for brain imaging is defined.

Lee, Wang and Lee [LWL] searched for a method of measuring the perimeter, area and compactness of a digital shape so that, for a given analog geometric shape, the difference between its analog compactness and digital compactness is minimized. Compactness is measured by the traditional formula, while the novelty lies in several definitions of perimeter and area. Perimeter measures include the total number of boundary pixels, the sum of lengths between adjacent pixels, and the sum of lengths of 'cracks' (lines between adjacent pairs of object and background pixels). Area measures have several variants, depending on whether boundary pixels are included, excluded, or partially included in several ways.

Stojmenovic, Nayak and Zunic [SNZ] proposed 6 linearity measures for finite planar point sets. Their measures are quickly calculated and are invariant to scale, position and rotation. All of their methods give linearity estimates in the range [0, 1] after some normalization. Their Average Orientations measure takes k random pairs of points along the curve, finds their slopes m, and the normals (−m, 1) to those slopes. These normals are averaged, separately for each vector coordinate, and the resulting normal (A, B) is deemed to be the normal to the orientation of the curve. The measure of linearity is defined as √(A² + B²).

III. MEASURING CIRCULARITY

We present our circularity measure for unordered point sets here.
The choice of center for each shape is an important factor in measuring circularity. We have used two methods for finding the center of each shape. The first is the general center of gravity of the shape, which corresponds to the per-coordinate average of all pixels in the shape. The true center finding method takes the per-coordinate median of the centers defined by k sampled triplets of points belonging to the curve. The results of all test cases are presented in the next chapter.

A. Adapting Linearity Measures to Measure Circularity

The Average Orientations linearity measure presented in [SNZ] was used to measure circularity here. The intended input for the linearity functions was an array of n Cartesian pixel coordinates of the form (xi, yi). The Cartesian pixel array was transformed into an array of polar coordinate pixels. The polar coordinate representation of a point p is of the form (rp, αp), where rp is the distance of p from the origin, and αp is the angle p makes with respect to the x-axis, as seen in Figure 1. Points are transferred from Cartesian to polar coordinates as follows: point (x, y) in Cartesian form is represented by (√(x² + y²), arctan(y/x)), where x ≠ 0. This form of point representation is beneficial since circular objects whose points are transferred to polar coordinates appear linear when these points are mapped. Figure 2 shows a circle with radius r drawn in planar Cartesian coordinates, and its corresponding polar representation on the right.
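The [SNZ] Average Orientations linearity measure that this transformation feeds into can be sketched as follows (a hedged sketch under our own naming: we sample k random pairs with a seeded generator and skip vertical pairs, whose slope is undefined):

```python
import numpy as np

def avg_orientations_linearity(pts, k=2000, seed=0):
    """Average Orientations linearity [SNZ] (sketch): sample k random
    point pairs, take the unit normal (-m, 1)/|(-m, 1)| of each pair's
    slope m, average the normals per coordinate, and return the
    magnitude sqrt(A^2 + B^2) of the averaged normal (A, B)."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(pts, float)
    i = rng.integers(0, len(pts), k)
    j = rng.integers(0, len(pts), k)
    dx, dy = pts[i, 0] - pts[j, 0], pts[i, 1] - pts[j, 1]
    keep = dx != 0                     # skip vertical/degenerate pairs
    m = dy[keep] / dx[keep]
    normals = np.stack([-m, np.ones_like(m)], axis=1)
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    a, b = normals.mean(axis=0)
    return float(np.hypot(a, b))
```

Points on a straight line all share one slope, so the unit normals coincide and the measure is 1; for points scattered around a circle in Cartesian coordinates the normals point in many directions and partially cancel, giving a markedly lower value.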

Figure 3. Choosing the correct shape center

Figure 1. Polar coordinate example

Here, we see a semicircular shape where the red dot, labeled Cg, represents the center of the shape as determined by the center of gravity method. Transferring the shape to polar coordinates with respect to the red dot would not yield a straight line, but rather a curved one. This would result in a lower circularity measure than expected for the given shape. However, had the green dot, labeled Ct, been chosen as the center, the resulting polar coordinate representation would have looked similar to the line seen in Figure 4, and a much higher circularity measure would have been awarded.

Figure 2. Cartesian and polar representations
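The Cartesian-to-polar mapping of Figure 2 can be sketched in a few lines. Note that arctan2 sidesteps the x ≠ 0 restriction of plain arctan(y/x); the flatness score below is our own crude stand-in used only to illustrate the pipeline, not the [SNZ] linearity measure itself:

```python
import numpy as np

def to_polar(points):
    """Translate a point set to its centre of gravity and map each
    point to polar coordinates (angle, radius)."""
    pts = np.asarray(points, float)
    pts = pts - pts.mean(axis=0)           # centre of gravity at origin
    return np.stack([np.arctan2(pts[:, 1], pts[:, 0]),
                     np.hypot(pts[:, 0], pts[:, 1])], axis=1)

def circularity(points):
    """Circularity sketch: a circle maps to a horizontal line of radii,
    so we score how flat the radius column is (1 = perfectly flat)."""
    r = to_polar(points)[:, 1]
    return max(0.0, 1.0 - r.std() / r.mean())
```

Sampled circle points map to a constant radius and score near 1, while a straight segment maps to widely varying radii and scores much lower, mirroring the intent of Figure 2.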

We can see from Figure 2 that measuring the linearity of a polar coordinate point set is equivalent to measuring the circularity of the point set in Cartesian coordinates. The linearity measures of [SNZ] were used to measure the circularity of polar coordinate input sets as seen in section II. The algorithm used to test the circularity of a planar set of points with a general linearity method X is presented below.

Algorithm: Average Orientations Circularity
Input: array of points Points = (Xi, Yi), 1 ≤ i ≤ n
Output: circularity X
Begin:
  Find the center of gravity (Xc, Yc) of the set of points;
  Translate the set by (−Xc, −Yc) so that it is centered at the origin;
  Transform the set to polar coordinates;
  X = linearity value of the polar coordinate set;
  Output X;
End;

B. Finding the Center of a Shape

The trivial way of choosing a shape's center is to take the per-coordinate average of all pixels, which is referred to as the center of gravity. This is the method usually chosen when measuring a shape property such as linearity or elongation. Choosing the appropriate center of a shape when measuring circularity is more delicate and heavily influences the result of the circularity measure. To illustrate this point, we turn to Figure 3.

Figure 4. Finding the center from 3 points
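The construction of Figure 4, the circumcenter of three sampled points combined over k triplets by a per-coordinate median, can be sketched as follows (a hedged sketch; the circumcenter formula is the standard one, and the sampling parameters are ours):

```python
import numpy as np

def circumcenter(a, b, c):
    """Centre of the circle through three points (None if collinear)."""
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None
    sa, sb, sc = ax ** 2 + ay ** 2, bx ** 2 + by ** 2, cx ** 2 + cy ** 2
    ux = (sa * (by - cy) + sb * (cy - ay) + sc * (ay - by)) / d
    uy = (sa * (cx - bx) + sb * (ax - cx) + sc * (bx - ax)) / d
    return ux, uy

def true_center(pts, k=200, seed=0):
    """'True centre' (Xtc, Ytc): per-coordinate median of the
    circumcentres of k randomly sampled point triplets."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(pts, float)
    centers, attempts = [], 0
    while len(centers) < k and attempts < 50 * k:  # guard vs collinear sets
        attempts += 1
        i, j, l = rng.choice(len(pts), 3, replace=False)
        c = circumcenter(pts[i], pts[j], pts[l])
        if c is not None:
            centers.append(c)
    centers = np.array(centers)
    return float(np.median(centers[:, 0])), float(np.median(centers[:, 1]))
```

For a semicircular arc, the true centre recovers the circle's actual centre, while the centre of gravity is pulled toward the arc, which is exactly the situation of Figure 3.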

We used both types of center finding methods in this work. To find the 'true center' of a shape, as opposed to its traditional center of gravity, we sampled k triplets of points from its point set. From each triplet, we found the center of the circle the three points define. Figure 4 shows the idea behind finding each such center. It is expected that each triplet of points will yield a different center (Xi, Yi). To choose a unique center for the shape, we individually sort the k center values per coordinate, and choose the per-coordinate median to be the true center (Xtc, Ytc).

C. Application to Iris Detection

Given that our main contribution is a circularity measure, it is natural to apply it to iris detection. Even if human irises are sometimes not perfect circles, nor do the iris and pupil necessarily form concentric circles, a circle is the best approximation of these shapes most of the time. Consider the CASIA test image in the top left of Figure 5 as input to our iris detection algorithm. We begin by applying a Gaussian blur to the original image (top right), followed by a Laplacian edge detector (bottom left).

The pseudocode for the outer iris boundary search is given below:

Algorithm: Iris Boundary Detection
Input: center of pupil (Px, Py), pupil radius R, input image IM
Output: iris boundary circle X
Begin:
  Extract the 1D horizontal array HR[1..IM.width] of pixels at row Py;
  Exclude the pupil pixels {Px−R, ..., Px+R} from HR, making its size IM.width − 2R − 1;
  Let Pc = Px − R be the index in HR where the center of the pupil previously existed;
  Gaussian-blur HR;
  Let Loc be an array of size 3R;

Figure 5. Finding the Pupil

A binary threshold is applied to the resulting edges, eliminating minor noise, followed by a dilation operation on the remaining edges to join them into connected regions. The three largest connected regions of edge pixels are tested for circularity by both the center of gravity and true center approaches. The pupil is selected as the region that has similar circular dimensions as determined by the two center finding approaches, has the darkest median color, and is entirely contained within the image. Large circular regions such as the line of the eyelid form circles whose centers are outside the scope of the image and are ignored. Note that no morphological operations were performed to eliminate the reflections of light sources in the pupil (which can sometimes be rather large compared to the area of the pupil).

The outer iris boundary is more difficult to detect due to the more varied, yet generally smaller, contrast between the iris and sclera regions. It is detected by searching horizontally, left and right of the pupil, through the 1D array of pixels that passes through its center, for the symmetric locations with the largest intensity contrast, as seen in Figure 6. These locations are found by examining the differences in intensities 10 pixels to the left and right of each candidate spot, on both sides of the pupil. This number of pixels depends on the resolution of the image, but appears to work well for the [CA] test set. The location with the maximal cumulative intensity difference is chosen as the outer iris boundary.
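The pupil-side steps (edge map, threshold, connected regions) can be sketched without any image library. This is a hedged, numpy-only illustration, not the authors' C++ implementation: the Laplacian is the 4-neighbour kernel, the wrap-around of np.roll is harmless here because the image border is constant, and region labeling is a plain BFS:

```python
import numpy as np
from collections import deque

def laplacian_edges(img, thresh=40):
    """4-neighbour Laplacian followed by a binary threshold (sketch)."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return np.abs(lap) > thresh

def largest_components(mask, n=3):
    """Label 4-connected regions by BFS; return the n largest as
    lists of (y, x) pixels, biggest first."""
    seen = np.zeros_like(mask, bool)
    comps = []
    H, W = mask.shape
    for sy, sx in zip(*np.nonzero(mask)):
        if seen[sy, sx]:
            continue
        comp, q = [], deque([(sy, sx)])
        seen[sy, sx] = True
        while q:
            y, x = q.popleft()
            comp.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < H and 0 <= nx < W and mask[ny, nx] and not seen[ny, nx]:
                    seen[ny, nx] = True
                    q.append((ny, nx))
        comps.append(comp)
    return sorted(comps, key=len, reverse=True)[:n]
```

On a synthetic pupil (a dark disk on a bright background), the largest component is the edge ring around the disk, and the mean of its pixel coordinates recovers the pupil centre.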

  For each i ∈ {Pc, ..., Pc + 3R}, paired with its mirrored point 2·Pc − i:
    Loc[i − Pc] = |HR[i−10] − HR[i+10]| + |HR[2·Pc−i−10] − HR[2·Pc−i+10]|;
  IrisR = index of the maximum of Loc;
End;
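A minimal sketch of this 1-D symmetric contrast search (our own naming; the row is assumed already blurred, with the pupil pixels conceptually removed, and delta = 10 matches the offset used above):

```python
import numpy as np

def iris_radius(row, pc, r_pupil, delta=10):
    """For each candidate radius d, sum the intensity contrast at the
    two mirrored positions pc + d and pc - d along the pixel row, and
    return the d with the largest cumulative contrast."""
    best_d, best_score = r_pupil, -1.0
    for d in range(r_pupil, 3 * r_pupil):
        right, left = pc + d, pc - d
        if left - delta < 0 or right + delta >= len(row):
            break  # candidate circle would leave the image
        score = (abs(row[right - delta] - row[right + delta]) +
                 abs(row[left - delta] - row[left + delta]))
        if score > best_score:
            best_score, best_d = score, d
    return best_d
```

On a synthetic row with a smooth iris-to-sclera brightness transition 30 pixels either side of the centre, the search locates that transition.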

The blur, edge detection, circle detection and inner circle detection determine the time complexity of our algorithm. Blur and edge detection are O(n²) operations, whereas the circle detection procedure is O(1) since it requires a constant number of pixels to test for circularity, regardless of the resolution or the number of pixels on the circle boundary. Searching for the iris boundary is an O(n) operation since the search is performed only along the width of the image. In summary, the entire procedure has time complexity O(n²).

IV. EXPERIMENTAL DATA

Applying the SCS algorithm to the iris detection problem on the CASIA v3 set [CA] proved successful 95% of the time; that is, the pupil and outer iris boundaries were both successfully detected at this rate. The algorithm was implemented in C++ using the EMGU OpenCV wrapper on a single-core Intel T3200 2.0 GHz processor, and performed the iris localization procedure in the order of 5 ms. Figure 7 shows the results of our algorithm on a small subset of the CASIA v3 test set.

Figure 7. Iris detection subset

Figure 6. Finding the outer iris boundary

We compared the speed of our algorithm to the iris localization procedure proposed by [D], which was implemented in Matlab on the same hardware. The Matlab implementation localized irises at an average speed of 15 s per image. Taking into account that C# code might run 100 times faster than Matlab code, we still arrive at the fact that our approach is several orders of magnitude faster than that of [D]. Our method also performs faster than, to the best of our knowledge, the fastest algorithm found in the literature, [HAK], which was multithreaded, implemented on an Intel 2.4 GHz quad-core processor, and runs in an average of 190 ms per image.

V. CONCLUSION

This article investigated the circularity of point sets, or how close these points, considered together, are to a circle. A related problem is to measure how close a set of pixels is to a digital circle. Digital circles are sets of pixels that represent circles in digital space. This problem is studied in [KA, CKT, SNS, LWL, and B].

The 'true center' method produces circularity results that perform better than the 'center of gravity' method when it comes to semicircular shapes, if the latter are to be regarded as fully circular shapes. The new circularity measures proposed here outperformed the ones from the literature when compared to human measurements.

Recently, there have been several applications that concentrate on finding circular objects in images. They range from identifying onions in gardens, to finding tires on cars when seen from the side. Future applications might include identifying individual cells based on their boundaries in medical imaging.

VI. ACKNOWLEDGMENTS

This work was partially supported by the grant "Digital signal processing, and the synthesis of an information security system", TR32054, Serbian Ministry of Science and Education, and by an NSERC Discovery grant.

VII. REFERENCES

[BJM] E. Bribiesca, J.R. Jimenez, V. Medina, R. Valdes, O. Yanez, A voxel-based measure of discrete compactness for brain imaging, 25th IEEE Annual Int. Conf. Engineering in Medicine and Biology Society, Vol. 1, Sept. 2003, 910-913.
[CA] CASIA Iris Image Database, http://biometrics.idealtest.org/, Chinese Academy of Sciences' Institute of Automation (CASIA).
[CKT] V. Chatzis, V.G. Kaburlasos, M. Theodorides, An image processing method for particle size and shape estimation, 2nd Int. Scientific Conf. Computer Science, Chalkidiki, Greece, 30 September - 2 October 2005.
[D] J. Daugman, How iris recognition works, IEEE Transactions on Circuits and Systems for Video Technology, Vol. 14, No. 1, January 2004.
[DD] C. Di Ruberto and A. Dempster, Circularity measures based on mathematical morphology, Electronics Letters, Vol. 38, No. 20, pp. 1691-1693, September 2000.
[E] N. Efford, Digital Image Processing, Addison-Wesley, 2000.
[H] R.M. Haralick, A measure for circularity of digital figures, IEEE Trans. Syst. Man Cybernet., SMC-4, pp. 394-396, 1974.
[HAK] R.H. Abiyev, K.I. Kilic, Robust feature extraction and iris recognition for biometric personal identification, Biometric Systems, Design and Applications, pp. 149-169, 2011.
[J] R. Jillela, A. Ross, V.N. Boddeti, B.V. Kumar, X. Hu, R. Plemmons, P. Pauca, An evaluation of iris segmentation algorithms in challenging periocular images, in Handbook of Iris Recognition, Eds. M. Burge, K. Bowyer, Springer, 2012.
[KA] C.E. Kim, T.A. Anderson, Digital disks and a digital compactness measure, Proc. Sixteenth Annual ACM Symposium on Theory of Computing (STOC), pp. 117-124, 1984.
[LS] D.R. Lee, T. Sallee, A method of measuring shape, Geographical Review, Vol. 60, No. 4, pp. 555-563, 1970.
[LWL] S.C. Lee, Y. Wang, E.T. Lee, Compactness measure of digital shapes, Region 5 Conference: Annual Technical and Leadership Workshop, pp. 103-105, April 2004.
[M] L. Masek, Recognition of Human Iris Patterns for Biometric Identification, BEng. thesis, School of Computer Science and Software Engineering, The University of Western Australia, 2003.
[NB] O. Novaski, C. Barczak, Utilization of Voronoi diagrams for circularity algorithms, Precision Engineering, Vol. 20, pp. 188-195, 1997.
[P] D. Proffitt, The measurement of circularity and ellipticity on a digital grid, Pattern Recognition, Vol. 15, No. 5, pp. 383-387, 1982.
[RB] E.S. Reddy, I.R. Babu, Performance of iris based hard fuzzy vault, IJCSNS Int. J. Computer Science and Network Security, Vol. 8, No. 1, January 2008.
[SM] M. Savoj, S.A. Monadjemi, Iris localization using circle and fuzzy circle detection method, World Academy of Science, Engineering and Technology, No. 61, 2012.
[SN] M. Stojmenovic, A. Nayak, Shape based circularity measures of planar point sets, IEEE International Conference on Signal Processing and Communications (ICSPC), pp. 1279-1282, 2007.
[SNS] N. Sladoje, I. Nyström, P.K. Saha, Measurements of digitized objects with fuzzy borders in 2D and 3D, Image and Vision Computing, Vol. 23, pp. 123-132, 2005.
[SNS-2] N. Sladoje, I. Nyström, P.K. Saha, Measuring perimeter and area in low resolution images using a fuzzy approach, Proc. 13th Scandinavian Conf. Image Analysis (SCIA 2003), Göteborg, Sweden, Lecture Notes in Computer Science, Vol. 2749, pp. 853-860, Springer-Verlag, 2003.
[SNZ] M. Stojmenovic, A. Nayak, J. Zunic, Measuring linearity of planar points, Pattern Recognition, Vol. 41, pp. 2503-2511, 2008.
[T] T. Maenpaa, An iterative algorithm for fast iris detection, International Workshop on Biometric Recognition Systems (IWBRS) 2005, Lecture Notes in Computer Science, Vol. 3781, pp. 127-134, 2005.
[YZW] L. Yu, D. Zhang, K. Wang, The relative distance of key point based iris recognition, Pattern Recognition, Vol. 40, No. 2, pp. 423-430, 2007.
[ZTW] Y. Zhu, T. Tan, Y. Wang, Biometric personal identification based on iris patterns, Pattern Recognition, Vol. 2, pp. 801-804, 2000.