(IJCSIS) International Journal of Computer Science and Information Security, Vol. 9, No. 2, February 2011

PIFS CODES BASED FOR BIOMETRIC PALMPRINT VERIFICATION

I Ketut Gede Darma Putra
Department of Electrical Engineering, Faculty of Engineering
Udayana University, Bukit Jimbaran, Bali, Indonesia
email: duglaire@yahoo.com

Abstract — This paper proposes a new technique to extract palmprint features from fractal codes. The palmprint feature representation is formed from the positions of the range blocks and the directions between the positions of the range and domain blocks of the fractal codes. Each palmprint representation is divided into a set of n blocks, and the mean value of each block is used to form the feature vector. A normalized correlation metric is used to measure the degree of similarity between two palmprint feature vectors. We collected 1050 palmprint images, 5 samples from each of 210 persons. Experimental results show that the proposed method achieves an acceptable accuracy rate, with FRR = 1.754 and FAR = 0.699.

Keywords: biometrics, fractal codes, fractal dimension, feature extraction, palmprint recognition

I. INTRODUCTION

Personal verification has become an important and highly demanded technique for security access systems in this information era. Traditional automatic personal recognition can be divided into two categories: token-based, such as a physical key, an ID card, or a passport, and knowledge-based, such as a password or a PIN. However, these approaches have some limitations. In the token-based approach, the "token" can easily be stolen or lost; in the knowledge-based approach, the "knowledge" can be guessed or forgotten [21]. To reduce the security problems caused by traditional methods, biometric verification techniques have been intensively studied and developed to improve the reliability of personal verification. Biometric-based approaches use human physiological or behavioral features to identify a person. The most widely used biometric features are fingerprints, and the most reliable are irises. However, it is very difficult to extract small minutiae features from unclear fingerprints, and iris input devices are very expensive [19]. Other biometric features, such as face, voice, hand geometry, and handwriting, are less accurate: faces and voices can be mimicked easily, and hand geometry and handwriting can be faked easily.

Palmprint is a relatively new physiological biometric [18]. There are many unique features in a palmprint image that can be used for personal recognition: principal lines, wrinkles, ridges, minutiae points, singular points, and texture are all regarded as useful features for palmprint representation [21]. A palmprint has several advantages compared to other available features: low-resolution images and low-cost capture devices can be used, it is very difficult or impossible to fake a palmprint, and its characteristics are stable and unique [18].

Recently, many verification/identification technologies using palmprint biometrics have been developed [2],[3],[4],[5],[11],[12],[13],[18],[21]. Zhang et al. [21] applied a 2-D Gabor filter to obtain the texture features of palmprints. Pang et al. [13] used pseudo-orthogonal moments to extract palmprint features. Li et al. [12] transformed the palmprint from the spatial to the frequency domain using the Fourier transform and then computed ring and sector energy features. Connie et al. [2] extracted the texture features of palmprints using PCA and ICA. Wu et al. [18] extracted line feature vectors (LFV) using the magnitudes and orientations of the gradient of the points on palm lines. Kumar et al. [11] combined palmprints and hand geometry for a verification system; each palmprint was divided into overlapping blocks, and the standard deviation of each block was used to form the feature vector.

In this paper, we propose a new technique to extract palmprint features based on fractal codes. This technique is different from the methods in [4] and [5].

II. IMAGE ACQUISITION

All palm images are captured using a Sony DSC P72 digital camera with a resolution of 640 x 480 pixels. Each person was requested to put his/her left hand palm down on a board with a black background. There are some pegs on the board to control the hand's orientation, translation, and stretching. A sample of the hand and peg positions on the black board is shown in Figure 1 (a).

III. PALMPRINT EXTRACTION AND NORMALIZATION

This paper uses a new technique to extract the ROI (region of interest) of the palmprint. The technique applies the center-of-mass (centroid) method twice, in the following steps.

a. The gray-level hand image is thresholded to obtain the binary hand image. The threshold value is computed automatically using the Otsu method. A median filter is then applied to remove white pixels (non-object pixels) outside the hand object.
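Step (a) relies on Otsu's method to pick the binarization threshold automatically. The sketch below is a minimal NumPy illustration of that idea (the function name and the synthetic test image are our own, not code from the paper): it searches for the gray level that maximizes the between-class variance of the histogram.

```python
import numpy as np

def otsu_threshold(img: np.ndarray) -> int:
    """Return the gray level (0-255) maximizing between-class variance."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic "hand on black background": dark background, bright object.
img = np.zeros((100, 100), dtype=np.uint8)
img[30:70, 30:70] = 200                    # bright square stands in for the hand
binary = img >= otsu_threshold(img)        # binary hand image
```

In practice a library routine such as scikit-image's `threshold_otsu` would be used instead; the exhaustive search above is only meant to make the criterion explicit.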
b. Each of the acquired hand images needs to be aligned in a preferred direction so as to capture the same features for matching. The moment orientation method is applied to the binary image to estimate the orientation of the hand. In this method, the angle of rotation θ is the difference between the normal axis and the major axis of the ellipse, computed as

θ = (1/2) tan⁻¹( 2 μ₁,₁ / (μ₂,₀ − μ₀,₂) )   (1)

μ_{p,q} = Σ_m Σ_n (m − m̄)^p (n − n̄)^q   (2)

where μ_{p,q} represents the (p,q)th central moment and (m̄, n̄) represents the center of area, defined as

m̄ = (1/N) Σ_m Σ_n m ,   n̄ = (1/N) Σ_m Σ_n n   (3)

where N represents the number of object pixels. The grayscale and binary images are then rotated by θ degrees.

c. A bounding-box operation is applied to the rotated binary image to obtain the smallest rectangle containing the binary hand image. The original hand image, the binarized image, and the bounded image are shown in Figure 1 (a), (b), and (c), respectively.

d. The centroid of the bounded image is computed using equation (3), and based on this centroid the bounded binary and original images are segmented to 200 x 200 pixels. The segmented image and its centroid position are shown in Figure 1 (d) and (e).

e. The centroid of the segmented binary image is computed, and based on this centroid the ROI of the grayscale palmprint image is cropped to 128 x 128 pixels. The first and second centroid positions in the binary and gray-level images are shown in Figure 1 (f) and (g).

This method is simple.

Figure 1. Extraction of the palmprint: (a) original image, (b) binary image of (a), (c) bounded object, (d) and (e) position of the first centroid in the segmented binary and gray-level images, respectively, (f) and (g) position of the second centroid in the segmented binary and gray-level images, respectively.
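Step (b) can be sketched directly from equations (1)-(3). The snippet below (our own illustration; the function name is hypothetical) computes the central moments of a binary image and the resulting rotation angle, using `arctan2` to avoid division by zero when μ₂,₀ = μ₀,₂:

```python
import numpy as np

def orientation_angle(binary: np.ndarray) -> float:
    """Major-axis angle (degrees) from central moments, per eqs. (1)-(3)."""
    m, n = np.nonzero(binary)                        # object pixel coordinates
    m_bar, n_bar = m.mean(), n.mean()                # center of area, eq. (3)
    mu11 = ((m - m_bar) * (n - n_bar)).sum()         # central moments, eq. (2)
    mu20 = ((m - m_bar) ** 2).sum()
    mu02 = ((n - n_bar) ** 2).sum()
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)  # eq. (1)
    return float(np.degrees(theta))

# A diagonal line of pixels should have its major axis at 45 degrees.
img = np.zeros((50, 50), dtype=bool)
np.fill_diagonal(img, True)
angle = orientation_angle(img)
```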
This method has been tested on 1050 palmprint images acquired from 210 persons, and the results show that it is reliable.

Before the feature extraction phase, the extracted ROI is normalized using the normalization method in [11] to reduce possible imperfections in the image due to non-uniform illumination. The method is as follows:

I′(x, y) = φ_d + λ  if I(x, y) > φ ;  φ_d − λ  otherwise   (4)

λ = sqrt( ρ_d (I(x, y) − φ)² / ρ )   (5)

where I and I′ represent the original grayscale palmprint image and the normalized image, respectively; φ and ρ represent the mean and variance of the original image, respectively; and φ_d and ρ_d are the desired values of the mean and variance, respectively. This research uses φ_d = 180 and ρ_d = 180 for all experiments.

IV. FEATURES EXTRACTION

There are three main steps in extracting the palmprint features based on the fractal codes proposed in this paper. These steps can be explained as follows.

A. Extraction of fractal codes of palmprint images

Fractal codes of palmprint images are obtained using the partitioned iterated function system (PIFS) method. In the PIFS method, each image is partitioned into range blocks and domain blocks. The size of the domain blocks is usually larger than the size of the range blocks. The relation between a pair of range block R_i and domain block D_i is written as

R_i = w_i(D_i)   (6)

where w_i is a contractive mapping that describes the similarity relation between R_i and D_i, usually defined as an affine transformation:

w_i [x_i ; y_i ; z_i] = [a_i b_i 0 ; c_i d_i 0 ; 0 0 s_i] [x_i ; y_i ; z_i] + [e_i ; f_i ; o_i]   (7)

where x_i and y_i represent the top-left coordinate of R_i and z_i is the brightness value of the block. The matrix elements a_i, b_i, c_i, and d_i are the parameters of the spatial rotations and flips of D_i, s_i is the contrast scaling, and o_i is the luminance offset. The vector elements e_i and f_i are the spatial offsets. In this paper the domain size is twice the range size, so the values of a_i, b_i, c_i, and d_i are 0.5. In practice, the fractal code f_i below is usually used [19]:

f_i = ((x_Di, y_Di), (x_Ri, y_Ri), size_i, θ_i, s_i, o_i)   (8)

where (x_Ri, y_Ri) and (x_Di, y_Di) represent the top-left coordinate positions of the range block and domain block, respectively, and size_i is the size of the range block. The fractal code of a palmprint image is then denoted as

F = ⋃_{i=1}^{N} f_i   (9)

where N represents the number of fractal codes.
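As a rough illustration of PIFS encoding, the sketch below is a deliberately simplified toy (our own code, not the paper's implementation: it fixes one range size, omits the eight rotation/flip candidates of eq. (7), and fits only the contrast s_i and offset o_i by least squares). Each range block is matched against every 2x-downsampled domain block, and the best (domain position, range position, size, s, o) tuple is stored, mirroring the shape of f_i in eq. (8):

```python
import numpy as np

def encode_pifs(img: np.ndarray, r: int = 4):
    """Toy PIFS encoder: match each r x r range block to the 2r x 2r domain
    block (downsampled to r x r) minimizing RMSE under z' = s*z + o."""
    h, w = img.shape
    # Pre-downsample candidate domain blocks (stride r keeps the search small).
    domains = []
    for dy in range(0, h - 2 * r + 1, r):
        for dx in range(0, w - 2 * r + 1, r):
            d = img[dy:dy + 2 * r, dx:dx + 2 * r]
            d = d.reshape(r, 2, r, 2).mean(axis=(1, 3))   # 2x2 average pooling
            domains.append((dx, dy, d.ravel()))
    codes = []
    for ry in range(0, h, r):
        for rx in range(0, w, r):
            rv = img[ry:ry + r, rx:rx + r].astype(float).ravel()
            best = None
            for dx, dy, dv in domains:
                var = dv.var()
                # Least-squares contrast s and offset o for this pair.
                s = ((dv - dv.mean()) * (rv - rv.mean())).mean() / var if var > 0 else 0.0
                o = rv.mean() - s * dv.mean()
                err = np.sqrt(((s * dv + o - rv) ** 2).mean())  # RMSE, cf. eq. (10)
                if best is None or err < best[0]:
                    best = (err, dx, dy, s, o)
            _, dx, dy, s, o = best
            codes.append(((dx, dy), (rx, ry), r, s, o))        # cf. f_i, eq. (8)
    return codes

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(16, 16)).astype(float)
codes = encode_pifs(img, r=4)   # 16 range blocks for a 16 x 16 image
```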
The inequality below is used to indicate whether a range block and a candidate domain block are similar:

d(R, D) ≤ ε   (10)

where d(R, D) is the RMSE value and ε is the threshold (tolerance) value. The range block and the domain block are similar if d(R, D) is less than or equal to ε; otherwise, the pair is regarded as not similar.

B. Palmprint features representation

The first step of this method is the forming of the angle image A as follows:

A(j, k) = α_i ,  j = 1, 2, 3, …, M1 ,  k = 1, 2, 3, …, M2   (11)

α_i = arctan( (y_Di − y_Ri) / (x_Di − x_Ri) )  if j = x_Ri and k = y_Ri ;  otherwise α_i = 0   (12)

where (x_Di, y_Di) represents the top-left coordinate of the domain block (see formula (8)) and α_i represents the angle between the range and domain blocks. The angle image is not a binary image representation. The criteria below are added to compute the direction α_i:

if x_R < x_D and y_R ≥ y_D then α_i = α_i
if x_R > x_D and y_R ≥ y_D then α_i = 180 − α_i
if x_R > x_D and y_R ≤ y_D then α_i = 180 + α_i
if x_R < x_D and y_R ≤ y_D then α_i = 360 − α_i
if x_R = x_D and y_R ≥ y_D then α_i = 90
if x_R = x_D and y_R ≤ y_D then α_i = 270   (13)

The criterion size_i = min(size) means that the palmprint feature representation is formed using only the coordinates of the smallest-size range blocks. The representation is then filtered as follows:

I′(x, y) = I(x, y) ∗ h(x, y)_{m×n}   (14)

where h(x, y) is a filter whose components are all one. Figure 2 (b) shows the palmprint feature image of Figure 2 (a).

C. Palmprint feature vector

The palmprint feature vector V is obtained by dividing the palmprint image into 16 x 16 blocks and computing the mean value of each block, giving V = (v1, v2, …, vN), where N = 256 and vi is the mean value of block i.

Figure 2. Palmprint feature extraction: (a) original image, (b) image I, (c) image I′, (d) block feature representation.

Figure 2 (d) shows the palmprint feature representation in 16 x 16 sub-blocks. Figure 3 shows examples of three groups of palmprints: from the same palm, and from different palms with similar and with different line structures. The features of these palmprints are plotted in Figure 4. The results show that the features of three palm images from the same person are closer to each other than the features of three palm images from different persons with similar or different line structures.

V. PALMPRINT FEATURE MATCHING

The degree of similarity between two palmprint features is computed as

d_rs = 1 − ( (x_r − x̄_r)(x_s − x̄_s)^T ) / ( [(x_r − x̄_r)(x_r − x̄_r)^T]^{1/2} [(x_s − x̄_s)(x_s − x̄_s)^T]^{1/2} )   (15)

where x̄_r and x̄_s are the means of the palmprint features x_r and x_s, respectively. This equation computes one minus the normalized correlation between the palmprint feature vectors x_r and x_s. The values of d_rs lie between 0 and 2. d_rs will be close to 0 if x_r and x_s are obtained from two images of the same palmprint; otherwise, d_rs will be far from 0.

Figure 4 compares the feature components of the palmprints shown in Figure 3, and their scores are listed in Table 1. The matching scores of group A are close to 0, while the matching scores of groups B and C are far from 0. The average scores of groups A, B, and C are 0.1762, 0.5057, and 0.6452, respectively. It is easy to distinguish group A from groups B and C using these scores.
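Sections C and V above amount to block-mean pooling followed by a correlation distance. A compact sketch (our own illustration, assuming a 128 x 128 feature image divided into a 16 x 16 grid of 8 x 8 blocks):

```python
import numpy as np

def feature_vector(feat_img: np.ndarray, grid: int = 16) -> np.ndarray:
    """Divide the image into grid x grid blocks and take each block's mean."""
    h, w = feat_img.shape
    bh, bw = h // grid, w // grid
    blocks = feat_img[:bh * grid, :bw * grid].reshape(grid, bh, grid, bw)
    return blocks.mean(axis=(1, 3)).ravel()      # length grid*grid = 256

def match_distance(xr: np.ndarray, xs: np.ndarray) -> float:
    """One minus normalized correlation, eq. (15); 0 means identical shape."""
    xr, xs = xr - xr.mean(), xs - xs.mean()
    return 1.0 - float(xr @ xs / (np.linalg.norm(xr) * np.linalg.norm(xs)))

rng = np.random.default_rng(1)
a = rng.random((128, 128))
v1 = feature_vector(a)
v2 = feature_vector(a * 2.0 + 10.0)   # same palm under a brightness/contrast change
d_same = match_distance(v1, v2)       # near 0: correlation ignores affine shifts
d_diff = match_distance(v1, feature_vector(rng.random((128, 128))))
```

Because the correlation is computed on mean-centered, norm-scaled vectors, global brightness and contrast changes leave the score unchanged, which is one reason a correlation-based metric suits eq. (15).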
Figure 3. Examples of three groups of palmprints. (a1)-(a3) Group A: palmprints from the same person; (b1)-(b3) Group B: palmprints from different persons with similar line structure; (c1)-(c3) Group C: palmprints from different persons with different line structure.

Table 1. Matching scores of groups A, B, and C in Figure 3

        a1      a2      a3      Average
a1      0       0.1957  0.1404
a2      0.1957  0       0.1925  0.1762
a3      0.1404  0.1925  0

        b1      b2      b3      Average
b1      0       0.5352  0.3056
b2      0.5352  0       0.6763  0.5057
b3      0.3056  0.6763  0

        c1      c2      c3      Average
c1      0       0.6900  0.6177
c2      0.6900  0       0.6280  0.6452
c3      0.6177  0.6280  0

Figure 4. Comparison of the feature components of the palmprint groups shown in Figure 3. (a), (b), and (c) are the feature components of groups A, B, and C, respectively. Red, green, and blue are the first, second, and third palmprint in each group, respectively.

VI. EXPERIMENTS AND RESULTS

We collected palm images from 210 persons of both sexes and different ages, 5 samples from each person, so our database contains 1050 images. The resolution of each hand image is 640 x 480 pixels. The palmprint images, of size 128 x 128 pixels, were automatically extracted from the hand images as described in Section III. The averages of the first three images from each user were used for training, and the rest were used for testing.

The performance of the verification system is obtained by matching each of the testing palmprint images against all of the training palmprint images in the database. A matching is noted as correct if the two palmprint images are from the same palm, and as incorrect otherwise.

Figure 5. Distribution of three feature components (v22, v24, v26) of 1050 palmprints in feature space.
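The FAR/FRR figures reported below follow from the genuine and imposter score distributions. A sketch of that bookkeeping (our own illustration; the score arrays are synthetic stand-ins for the d_rs matching scores, since a distance score is accepted when it falls below the threshold):

```python
import numpy as np

def far_frr(genuine, imposter, threshold):
    """FRR: % of genuine pairs rejected (score >= threshold).
    FAR: % of imposter pairs accepted (score < threshold)."""
    genuine, imposter = np.asarray(genuine), np.asarray(imposter)
    frr = 100.0 * (genuine >= threshold).mean()
    far = 100.0 * (imposter < threshold).mean()
    return far, frr

# Synthetic d_rs scores: genuine pairs cluster near 0, imposters sit higher.
rng = np.random.default_rng(2)
genuine = rng.normal(0.18, 0.10, 1000).clip(0, 2)
imposter = rng.normal(0.60, 0.10, 1000).clip(0, 2)

# Sweep thresholds; the EER is where the FAR and FRR curves cross.
best = min((abs(f - r), t) for t in np.linspace(0, 1, 201)
           for f, r in [far_frr(genuine, imposter, t)])
```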
Figure 6 (a) shows the probability distributions of the genuine and imposter parts with tolerance value ε = 3 and feature vector length 256 (16 x 16 blocks). The genuine and imposter parts are estimated from the correct and incorrect matching scores, respectively. The false acceptance rate (FAR) and false rejection rate (FRR) at various thresholds are shown in Figure 6 (b). The equal error rate (EER) of the verification system is 1.2758. Table 2 shows the FAR/FRR performance of the system at several threshold values.

Figure 6. Performance of the verification system: (a) genuine and imposter distributions, (b) FAR/FRR/EER at various thresholds.

Table 2. FRR/FAR at various threshold values

Threshold   FRR      FAR
0.4386      2.0734   0.4734
0.4586      1.9139   0.5158
0.4626      1.7544   0.6998
0.4746      1.4354   0.9160
0.4786      1.2759   1.3552
0.4986      1.1164   2.1480
0.5386      1.1164   2.2881

The main advantage of using PIFS codes in this paper is that both the palmprint feature and the palmprint image can be obtained directly from the compressed domain (the fractal codes).

VII. CONCLUSIONS AND FUTURE WORK

In this paper, we introduced a fractal-characteristics-based feature extraction and representation method for palmprint verification. The experimental results show that the proposed method can achieve an acceptable accuracy rate, with FRR = 1.7544 and FAR = 0.6998. In the future, we will combine the proposed method with the wavelet transformation to extract palmprint features while retaining the block operation.

REFERENCES

[1] Chih-Lung Lin, "Biometric Verification Using Palmprints and Vein-patterns of Palm-dorsum", http://thesis.lib.ncu.edu.tw/etd-db/etd-search/
[2] Connie T., Andrew Teoh, Michael Goh, David Ngo, 2003, "Palmprint Recognition with PCA and ICA", sprg.massye.ac.nz/ivcnz/proccedings/ivcnz_41.pdf
[3] C. L. Lin, 2004, Biometric Verification Using Palmprints and Vein-patterns of Palm-dorsum, http://thesis.lib.ncu.edu.tw/etd-db/etd-search/
[4] Darma Putra I K. G., Adhi Susanto, A. Harjoko, T. S. Widodo, 2006, Palmprint Verification Based on Fractal Codes and Fractal Dimensions, Proceedings of the Eighth IASTED International Conference on Signal and Image Processing, Honolulu, Hawaii, pp. 323-328.
[5] Darma Putra, Adhi Susanto, Agus Harjoko, Thomas Sri Widodo, 2006, Biometrics Palmprint Verification Using Fractal Method, EECCIS Proceedings, Part 2, pp. 22-23, Brawijaya University, Malang, Indonesia.
[6] Duta N., Jain A. K., Mardia K. V., 2002, Matching of Palmprints, Pattern Recognition Letters, 23, pp. 477-485.
[7] Ekinci Murat, Vasif V. Nabiyev, Yusuf Ozturk, 2003, A Biometric Personal Verification Using Palmprint Structural Features and Classifications, IJCI Proceedings of Intl. XII, Vol. 1, No. 1.
[8] Jain A. K., 1995, Fundamentals of Digital Image Processing, Second Printing, Prentice-Hall, Inc.
[9] Jain A. K., Ross A., Pankanti S., 1999, A Prototype Hand Geometry-based Verification System, www.research.ibm.com/ecvg/publications.html
[10] Jain A. K., Introduction to Biometrics System, http://biometrics.cse.msu.edu/
[11] Kumar A., David C. M. Wong, Helen C. Shen, Anil K. Jain, 2004, "Personal Verification Using Palmprint and Hand Geometry Biometric", http://biometrics.cse.msu.edu/Kumar_AVBPA2003.pdf
[12] Li Wen-xin, David Zhang, Shuo-qun Xu, 2002, Palmprint Recognition Based on Fourier Transform, Journal of Software, Vol. 13, No. 5.
[13] Pang Y., Andrew T. B. J., David N. C. L., Hiew Fu San, 2003, Palmprint Verification with Moments, Journal of WSCG, Vol. 12, No. 1-3, ISSN 1213-6972, Science Press.
[14] Sarraille J., 2002, Developing Algorithms For Measuring Fractal Dimension, http://ishi.csustan.edu
[15] Shu W., Zhang D., 1998, Automated Personal Identification by Palmprint, Opt. Eng., Vol. 37, No. 8, pp. 2359-2363.
[16] Tao Y., Thomas R. I., Yuan Y. T., Extraction of Rotation Invariant Signature Based on Fractal Geometry, http://cs.tamu.edu
[17] Wohlberg B., Gerhard de Jager, 1999, A Review of the Fractal Image Coding Literature, IEEE Transactions on Image Processing, Vol. 8, No. 12.
[18] Wu Xiang-Quan, Kuan-Quan Wang, David Zhang, 2004, An Approach to Line Feature Representation and Matching for Palmprint Recognition, Journal of Software, Vol. 15, No. 6.
[19] Yokoyama T., Sugawara K., Watanabe T., Similarity-based Image Retrieval System Using Partitioned Iterated Function System Codes, The 8th International Symposium on Artificial Life and Robotics, January 24-26, 2006, Oita, Japan.
[20] Yokoyama T., Watanabe T., Koga H., Similarity-Based Retrieval Method for Fractal Coded Images in the Compressed Data Domain.
[21] Zhang D., Wai-Kin Kong, Jane You, Michael Wong, 2003, Online Palmprint Identification, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 25, No. 9.
[22] Zhang D., Shu W., 1999, Two Novel Characteristics in Palmprint Verification: Datum Point Invariance and Line Feature Matching, Pattern Recognition, Vol. 32, pp. 691-702.

AUTHOR PROFILE

Dr. I Ketut Gede Darma Putra is a lecturer in the Department of Electrical Engineering and Information Technology, Udayana University, Bali, Indonesia. He obtained his master's and doctorate degrees in informatics engineering from the Department of Electrical Engineering, Gadjah Mada University, Indonesia. His research interests include biometrics, image processing, expert systems, and soft computing.