International Journal of Emerging Trends & Technology in Computer Science (IJETTCS) Web Site: www.ijettcs.org Email: [email protected] Volume 3, Issue 3, May – June 2014

ISSN 2278-6856

A Bank Cheque Signature Verification System using FFBP Neural Network Architecture and Feature Extraction based on GLCM

Ashok Kumar D.1 and Dhandapani S.2

1,2 Department of Computer Science and Applications, Government Arts College, Tiruchirapalli – 620 022, Tamil Nadu, INDIA.

Abstract: The signature has served as an authorization tool from the earliest days of business to the present, and businessmen use bank cheques for most of their transactions. Even though banks are computerized, verification of the signature on a cheque is done manually, which consumes time and is sometimes misleading. This need gave rise to signature verification systems. Verification of signatures can be done on-line or off-line, depending on the application. In this study, a Neural Network model is designed for signature verification and testing in an Offline Bank Cheque Signature Verification System. The signature acquired from the bank cheque is preprocessed and the Gray Level Co-occurrence Matrix (GLCM) features of the signature are extracted. The extracted features are used to train a Feed Forward Back Propagation Neural Network (FFBPNN). The features of a signature to be tested are fed to the trained neural network to decide whether the signature is genuine or forged. The modeled system is checked with 120 genuine and 120 forged signatures and is found to have an accuracy of 92.08%, with a sensitivity of 93.25% and a specificity of 91.12%.

Keywords: Forgeries, off-line signature verification, cheque, GLCM, Neural Network.

1. INTRODUCTION

The signature is the cheapest way to authenticate a person. Every document carries a signature to identify the signer, and a signature verification system is badly needed on every document processing platform [18]. Today almost all banks are computerized, but even then the signature of the account holder found on a cheque is verified manually. This gave birth to the automation of signature verification systems. Such a system accepts a signature as input and classifies it as genuine or forged. The system modeled here aims to verify the signature with maximum accuracy in minimum time, using few computational resources. A signature is a scanned image with a pattern that differs for each person [9]. A person's signature is constant, unique and difficult for others to imitate. His or her handwriting may vary due to aging, sickness or unusual conditions, but the width-height ratio of the signature does not vary. A forged signature can often be detected from a few basic cues: the density of the ink will be high at the point where the forger pauses to check against the original, and the stroke immediately after the pause will be heavier. There are three types of signature forgery, namely (i) random, (ii) simple and (iii) skilled. In a random forgery, the forger has not seen the signature before and, knowing only the name, signs randomly. In a simple forgery, the forger has seen the signature once or twice in a document and tries to reproduce it. In a skilled forgery, the forger has a copy of the signature in hand and tries to imitate it. Figure 1 and Figure 2 show samples of a cheque and signatures.

Figure 1 Sample Cheque

Figure 2 Sample Signatures

There are two types of signature verification systems: on-line and off-line. On-line signature verification is easier, since the signature is captured on-line and various dynamic features, such as pen velocity, pressure applied at a point, time taken to sign, acceleration and the sequence of strokes, can be recorded for the verification process. It also requires equipment such as a digitizing pad and stylus. Off-line signature verification, in contrast, requires only a scanner to convert the signature into digital format, after which the signatures are verified using digital image processing techniques. Further, the signature can be obtained anywhere and only has to reach the processing point, in this study the bank, for processing; in an on-line system, the signature must be obtained at the processing point itself, with the necessary electronic equipment. Based on these advantages, the authors chose off-line signature verification for this study. Features for off-line verification are extracted from grey-level information in the signature image [16]. The signature is scanned and preprocessed to enhance the quality of the image. Various techniques, such as Support Vector Machines, Hidden Markov Models, Neural Networks, Principal Component Analysis and Dynamic Time Warping, are used to extract features and verify whether a signature is genuine. The general steps of the verification process, shown in Figure 3, are image acquisition, preprocessing, feature extraction, training, verification and finally the result. The rest of the paper is organized as follows. Section 2 presents the proposed off-line signature verification model and a review. Section 3 gives the quality measures, Section 4 the experiment and result analysis, and Section 5 the conclusion and future work.

2. PROPOSED OFF-LINE SIGNATURE VERIFICATION MODEL AND REVIEW

Signature verification systems are created and verified against commonly available signature databases. A signature database, or signature corpus, is a set of digitized signatures. Public corpora available for off-line signature verification include BME-AUT1, SVC2004, GPDS, MCYT and the Center of Excellence for Document Analysis and Recognition (CEDAR).

2.1 Image Acquisition
The process of selecting an image and giving it to the system as input is called image acquisition. Signatures can be captured by a camera or scanned by a scanner. A camera is rated in megapixels (MP) and a scanner in dots per inch (DPI), and scanning the signature with a scanner gives more accurate results [20]. The scanner digitizes the image for analysis and processing; its output is an uncompressed image. The scanner has to be placed in a proper environment with computer connectivity. The proposed model uses 120 genuine and 120 forged signatures of 5 persons. The scanned signature is treated as the pixel distribution of a particular person and is considered a single digital image, since the individual characters in a signature cannot be split. The scanned image should have good resolution. The background of the signature differs for each bank cheque; there should be no fold marks on the signature, the use of rubber stamps should not overshadow its clear appearance, and the scanned image should cover all portions of the cheque.

2.2 Image Preprocessing
Preprocessing is done to enhance the quality of the scanned signature. The acquired image is binarized to convert it into black and white pixels of 0s and 1s. Skeletonization then reduces the signature to one-pixel width, reducing the memory and computational time needed to process it. Finally, the image is filtered for noise, which may arise from scanning standards, background dots, designs and lines, or even dust.
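As a concrete illustration of the binarization and noise-filtering steps described above, the following Python sketch thresholds a grayscale image and drops isolated ink pixels. The threshold value and the isolated-pixel filter are illustrative assumptions (the paper does not specify them), and skeletonization is omitted for brevity.

```python
def binarize(gray, threshold=128):
    """Map a grayscale image (list of rows of 0-255 ints) to 0/1 pixels.
    Pixels darker than the threshold become 1 (ink), others 0 (background).
    The threshold of 128 is an assumed default, not the paper's value."""
    return [[1 if px < threshold else 0 for px in row] for row in gray]

def remove_isolated_pixels(binary):
    """Drop ink pixels with no 8-connected neighbour -- a crude noise filter
    for background dots and dust."""
    h, w = len(binary), len(binary[0])
    out = [row[:] for row in binary]
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1:
                has_neighbour = any(
                    binary[ny][nx] == 1
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))
                    if (ny, nx) != (y, x)
                )
                if not has_neighbour:
                    out[y][x] = 0
    return out
```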

Figure 3 General Process in Off-line Signature Verification

Many situations make the system complex: finding the position of the signature on the bank cheque, complex backgrounds with designs, the watermarked logo of the bank, mutilated cheques and signatures at low resolution. The Reserve Bank of India (RBI) has introduced the Cheque Truncation System (CTS-2010), and a CTS-compliant cheque reduces a number of the aforementioned problems; for instance, it requires the account holder to sign in dark-coloured ink. Banks usually display the processing time for each transaction on a notice board, and cheque processing should not exceed the allocated time. Even if a signature is suspected, the entire verification process should stay within the bank's time norms [7].

2.3 Feature Extraction
After preprocessing, feature extraction is performed to collect the similar, unique properties of a signature. Various methods are available for creating features from the signature image, and they can be classified into direct and indirect methods. Direct methods use feature values from image pixels, such as grid-based information, gray-level information and pixel density, whereas indirect methods apply transformations such as Fourier, Wavelet and Radon. For this study, the direct method is used. There are three types of features: global, grid and texture features [6]. The selection of features depends on the application, and texture features play a vital role in image classification. In off-line signature verification, various features such as height, width, curvature, stroke, peaks and aspect ratio are considered. In [1], Abu Shamim et al. verify signatures by comparing the curves in the signature, taking the stroke and its length into account. Directional features and energy density are combined for verification in [11], which takes a minimal number of samples to train the system. Angle and pixel density combined with a Neural Network classifier are used in [17]. GLCM features combined with a Support Vector Machine were used in [10], with good results. Vargas et al. [19] measure stroke gray-level variations and compare signatures on that basis. In [2], the original image is segmented from complex backgrounds, gray-level features are trained with an SVM, and the genuineness of the signature is determined. In [9], the authors use the horizontal and vertical projection profiles of the image, i.e., the sums of all pixels along each coordinate. In [4], geometric features are extracted by segmenting the image horizontally and vertically and taking the midpoints. In [12], the author measures the robustness of gray-level features when the image is distorted by a complex background. A developed system should consider all three types of signature forgery if it is to be efficient. In [5], Daramola Samuel partitions the signature based on its center of gravity; feature vectors are developed for each partition and classified by Euclidean distance, achieving good results. Anu Rathi, Divya Rathi and Parmanand [3] take the entire signature, perform a pixel-by-pixel comparison, and obtain 80% accuracy. In [8], the authors find that a mixed method of energy with chain-code features gives the best performance. In [13], signature verification is based on the angular features of genuine and forged signatures: the signature is divided into blocks based on its center, and the angular features are extracted and verified.

This paper employs the GLCM, a powerful tool for image feature extraction. The GLCM is a method for texture feature extraction proposed by Haralick [14]. A GLCM is a matrix whose numbers of rows and columns equal the number of gray levels in the image. It captures how often intensity values occur at pairs of points with a given spatial relationship. The features extracted using the GLCM are Autocorrelation, Contrast, Correlation, Cluster Prominence, Cluster Shade, Dissimilarity, Energy, Entropy, Homogeneity, Maximum probability, Sum of squares, Variance, Sum average, Sum variance, Sum entropy, Difference variance, Difference entropy, Information measure of correlation, Inverse difference normalized (INN) and Inverse difference moment normalized. The co-occurrence matrix C is defined over an n × m image I, parameterized by an offset (Δx, Δy), as:

C_{Δx,Δy}(i, j) = Σ_{p=1..n} Σ_{q=1..m} [ 1 if I(p, q) = i and I(p + Δx, q + Δy) = j; 0 otherwise ]

where i and j are the intensity values of the image, p and q are the spatial positions in the image I, and the offset (Δx, Δy) depends on the direction used and the distance d at which the matrix is computed. The "value" of the image originally referred to the grayscale value of the specified pixel, but it could be anything from a binary on/off value to 32-bit color and beyond. Table 1 gives the formulas for calculating feature values from the GLCM [21] and Table 2 gives sample feature values.
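The co-occurrence matrix definition can be implemented directly by counting pixel pairs at the given offset. This is a hedged sketch: the row/column convention for (Δx, Δy) and the assumption that gray levels run from 0 to levels−1 are choices of this example, not specified by the paper.

```python
def glcm(image, dx, dy, levels):
    """Gray level co-occurrence matrix: C[i][j] counts the pixel pairs where
    I(p, q) == i and the pixel at offset (dy rows, dx columns) equals j.
    `image` is a list of rows of integer gray levels in 0..levels-1."""
    n, m = len(image), len(image[0])
    C = [[0] * levels for _ in range(levels)]
    for p in range(n):
        for q in range(m):
            p2, q2 = p + dy, q + dx
            if 0 <= p2 < n and 0 <= q2 < m:  # skip pairs that fall off the image
                C[image[p][q]][image[p2][q2]] += 1
    return C
```

With offset (dx=1, dy=0) this matches the horizontally adjacent pairing described for Figure 4 below.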

Figure 4 Gray Level Co-occurrence Matrix

Figure 4 shows the gray level co-occurrence matrix of a signature image. The GLCM is created by calculating how often a pixel with gray-level (grayscale intensity) value i occurs horizontally adjacent to a pixel with value j. The GLCM used in this study yields a series of second-order texture calculations. First-order texture measures are statistics calculated from the original image values, like variance, and do not consider neighbourhood pixel relationships. Second-order measures consider the relationship between groups of two (usually neighbouring) pixels in the original image, which is useful for signature verification. These measures require that each GLCM cell contain a probability rather than a count; this process is called normalizing the matrix. The probability is only approximate, because a true probability would require continuous values, whereas the grey levels are discrete integers. Texture measures can be grouped into three: the contrast group, whose measures are related to contrast and use weights based on distance from the GLCM diagonal; measures related to orderliness; and descriptive statistics of the GLCM texture measures.

Table 1: Formulas for calculating feature values (Pij is the normalized GLCM; MATLAB notation)

Feature | Formula
Autocorrelation | sum(sum(Pij.*(i.*j)))
Contrast | sum(sum((abs(i-j).^2).*Pij))
Correlation (paper) | (corp - u_x*u_y)/(s_x*s_y)
Correlation (Matlab) | corm/(s_x*s_y)
Cluster Prominence | sum(sum(Pij.*((i+j-u_x-u_y).^4)))
Cluster Shade | sum(sum(Pij.*((i+j-u_x-u_y).^3)))
Dissimilarity | sum(sum(abs(i-j).*Pij))
Energy | sum(sum(Pij.^2))
Entropy | -sum(sum(Pij.*log(Pij+eps)))
Homogeneity (Matlab) | sum(sum(Pij./(1+abs(i-j))))
Homogeneity | sum(sum(Pij./(1+abs(i-j).^2)))
Maximum probability | max(Pij(:))
Sum of squares: Variance | sum(sum(Pij.*((j - glcm_mean).^2)))
Sum average | sum((ii+1).*p_xplusy)
Sum variance | sum((((ii+1) - out.senth(k)).^2).*p_xplusy)
Sum entropy | -sum(p_xplusy.*log(p_xplusy+eps))
Difference variance | sum((jj.^2).*p_xminusy)
Difference entropy | -sum(p_xminusy.*log(p_xminusy+eps))
Information measure of correlation 1 | (hxy-hxy1)/(max([hx,hy]))
Information measure of correlation 2 | (1-exp(-2*(hxy2-hxy)))^0.5
Inverse difference normalized (INN) | sum(sum(Pij./(1+(abs(i-j)./size_glcm_1))))
Inverse difference moment normalized | sum(sum(Pij./(1+((i-j)./size_glcm_1).^2)))

where
hxy = out.entro(k); hx = -sum(p_x.*log(p_x+eps)); hy = -sum(p_y.*log(p_y+eps));
hxy1 = -sum(sum(Pij.*log(p_x*p_y' + eps))); hxy2 = -sum(sum((p_x*p_y').*log(p_x*p_y' + eps)));
s_x = sum(sum(Pij.*((i-u_x).^2)))^0.5; s_y = sum(sum(Pij.*((j-u_y).^2)))^0.5;
corp = sum(sum(Pij.*(i.*j))); corm = sum(sum(Pij.*(i-u_x).*(j-u_y)))
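A few of the Table 1 features can be reproduced in Python from a raw co-occurrence matrix, normalizing the counts to probabilities Pij first as described above. This is a sketch covering only four of the 22 features; natural log is assumed for entropy, matching the MATLAB formulas.

```python
import math

def glcm_features(C):
    """Compute contrast, energy, entropy and homogeneity (Matlab form)
    from a raw count GLCM C, after normalizing counts to probabilities."""
    total = sum(sum(row) for row in C)
    P = [[v / total for v in row] for row in C]
    L = len(P)
    cells = [(i, j) for i in range(L) for j in range(L)]
    contrast    = sum(abs(i - j) ** 2 * P[i][j] for i, j in cells)
    energy      = sum(P[i][j] ** 2 for i, j in cells)
    entropy     = -sum(P[i][j] * math.log(P[i][j])        # skip zero cells
                       for i, j in cells if P[i][j] > 0)  # instead of adding eps
    homogeneity = sum(P[i][j] / (1 + abs(i - j)) for i, j in cells)
    return {"contrast": contrast, "energy": energy,
            "entropy": entropy, "homogeneity": homogeneity}
```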

Table 2: Sample Feature Values

Feature | Original | Original | Forged | Forged
Autocorrelation | 62.9267 | 62.8221 | 62.8575 | 62.7876
Contrast | 0.1208 | 0.0988 | 0.0605 | 0.0588
Correlation | 0.755 | 0.7857 | 0.8712 | 0.8735
Correlation | 0.755 | 0.7857 | 0.8712 | 0.8735
Cluster Prominence | 38.1143 | 28.3666 | 33.2206 | 29.4771
Cluster Shade | -5.4083 | -4.5291 | -5.1067 | -4.7728
Dissimilarity | 0.0532 | 0.053 | 0.0397 | 0.0411
Energy | 0.9274 | 0.9113 | 0.9186 | 0.9101
Entropy | 0.2818 | 0.324 | 0.2936 | 0.3161
Homogeneity | 0.9819 | 0.9798 | 0.9834 | 0.9823
Homogeneity | 0.9798 | 0.978 | 0.9822 | 0.9812
Maximum probability | 0.963 | 0.9546 | 0.9584 | 0.9539
Sum of squares: Variance | 62.7398 | 62.6244 | 62.6406 | 62.57
Sum average | 15.8418 | 15.8292 | 15.8307 | 15.8221
Sum variance | 244.313 | 242.6733 | 243.342 | 242.428
Sum entropy | 0.239 | 0.2777 | 0.2595 | 0.28
Difference variance | 0.1208 | 0.0988 | 0.0605 | 0.0588
Difference entropy | 0.1706 | 0.1856 | 0.1561 | 0.1643
Information measure of correlation 1 | -0.4487 | -0.4718 | -0.5655 | -0.5712
Information measure of correlation 2 | 0.3879 | 0.4258 | 0.4546 | 0.4726
Inverse difference normalized (INN) | 0.9948 | 0.9946 | 0.9958 | 0.9956
Inverse difference moment normalized | 0.9983 | 0.9986 | 0.9991 | 0.9991

2.4 Training and Signature Verification using FFBPNN
Once feature extraction is over, the sample signature features are stored in the database. A network is designed to train on the extracted features. Signature comparison is then made between the sample and input signatures using the network, and the verification technique adopted either accepts or rejects the input signature.

To train the network with the features extracted by the gray level co-occurrence matrix, a feed forward backpropagation network (Figure 5) is used. Since this is a supervised training method, both sample inputs and expected outputs are given. The network consists of an input layer, a hidden layer and an output layer. The input layer has 22 nodes, as there are 22 features to be fed to the network. The second layer is the hidden layer, which has a tan-sigmoid activation function represented by

yi = tanh(vi)

Figure 5 FFBPNN Architecture

This is a hyperbolic tangent function, where yi is the output of the ith node (neuron) and vi is the weighted sum of the inputs to that node. The output of each layer can be represented by

y_{N×1} = f(W_{N×M} x_{M×1} + b_{N×1})

where y is a vector containing the output of each of the N neurons in a layer, W is a matrix containing the weights for each of the M inputs, x is a vector containing the inputs, b is a vector containing the biases and f is the activation function.
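The layer equation translates directly into code. In this sketch `f` defaults to the tan-sigmoid (tanh) used by the hidden layer; passing an identity function gives the purelin output layer described below.

```python
import math

def layer_forward(W, x, b, f=math.tanh):
    """One network layer: y = f(W x + b).
    W is an N x M weight matrix (list of rows), x a length-M input vector,
    b a length-N bias vector; f is applied elementwise."""
    return [f(sum(w * xi for w, xi in zip(row, x)) + bi)
            for row, bi in zip(W, b)]
```

For example, `layer_forward(W, x, b, f=lambda v: v)` computes a purelin (linear) layer.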

The network uses the trainlm (Levenberg-Marquardt) training function, which updates weight and bias values according to Levenberg-Marquardt optimization and is often the fastest backpropagation algorithm in the toolbox. The network further uses purelin as the transfer function that calculates the output layer's output from its net input. The general neural network design process has seven steps: collecting the data, creating the network, configuring the network, initializing the weights and biases, training the network, validating the network (post-training analysis) and finally using or testing the network.

Algorithm 1: Steps involved in the proposed Signature Verification System Model
Step 1: Convert the scanned signature to gray scale.
Step 2: Create and normalize the GLCM.
Step 3: Compute the mean after normalization.
Step 4: Calculate the features using the formulas given in Table 1.
Step 5: Create a FFBPNN with 22 input nodes.
Step 6: Use the tan sigmoid f(x) = tanh(x) as the activation function.
Step 7: Use trainlm (Levenberg-Marquardt) for NN training.
Step 8: Use purelin as the transfer function that calculates the output layer's output from its net input.
Step 9: Signature features are fed through the network in a forward direction, producing results at the output layer.
Step 10: The error is calculated at the output nodes based on known target information, as E = Σ_{i=1..n} (Ti − Yi)^2, where Ti is the actual (target) value and Yi is the estimated value.
Step 11: The necessary changes to the weights leading into the output layer are determined from this error calculation. The signal on the output layer can be expressed as netk = TVk + ∂kL, where TVk is the target value of output neuron k and ∂kL is the error of neuron k.
Step 12: The changes to the weights leading into the preceding network layers are determined as a function of the properties of the neurons to which they are directly connected, until all necessary weight changes are calculated for the entire network.
Step 13: Weight changes are applied layer by layer, as a function of the errors determined for all subsequent layers, working backward toward the input layer.
Step 14: The next iteration begins, and the entire procedure is repeated using the next training pattern.

Figure 6 shows the entire verification process. The system is divided into a training phase and a testing phase. In the training phase, the sample signatures are acquired and preprocessed, and feature extraction takes place; the extracted features are used to train the designed neural network. In the testing phase, the signature to be tested is acquired and preprocessed, and its feature values are extracted and fed into the network for analysis. The FFBPNN makes the decision and reports the signature as genuine or forged.

Figure 6 Signature Verification System Model
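The training loop of Steps 9-14 can be sketched as follows. Note the hedges: this substitutes plain stochastic gradient descent for the Levenberg-Marquardt (trainlm) optimization the paper actually uses, and shrinks the layer sizes so the example runs quickly. It illustrates the forward pass, output-error calculation and backward weight updates, not the authors' implementation.

```python
import math, random

def train_sgd(samples, n_in=4, n_hidden=3, epochs=300, lr=0.1, seed=0):
    """Train a tiny FFBPNN: tanh hidden layer, linear (purelin) output.
    `samples` is a list of (features, target) pairs, target 1 = genuine,
    0 = forged. Plain SGD stands in for Levenberg-Marquardt here."""
    rng = random.Random(seed)
    W1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
    b1 = [0.0] * n_hidden
    W2 = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, t in samples:
            # Step 9: forward pass (tan-sigmoid hidden layer, linear output)
            h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
                 for row, b in zip(W1, b1)]
            y = sum(w * hi for w, hi in zip(W2, h)) + b2
            # Step 10: error at the output node
            d_out = y - t
            # Steps 11-13: propagate changes backward, layer by layer
            for j in range(n_hidden):
                d_hid = d_out * W2[j] * (1 - h[j] ** 2)  # tanh derivative
                W2[j] -= lr * d_out * h[j]
                for i in range(n_in):
                    W1[j][i] -= lr * d_hid * x[i]
                b1[j] -= lr * d_hid
            b2 -= lr * d_out
    return W1, b1, W2, b2

def predict(net, x):
    """Forward pass with the trained weights; output near 1 = genuine."""
    W1, b1, W2, b2 = net
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2, h)) + b2
```

In the paper's setting the feature vectors would be the 22 GLCM values and the targets the genuine/forged labels of the 240 samples.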

3. QUALITY MEASURES

The signature verification system is modeled using the aforesaid concepts. Performance evaluation of a signature verification system is usually done with the error rates: the False Acceptance Ratio (FAR) and the False Rejection Ratio (FRR). FAR is the count of false signatures accepted as true; FRR is the count of true signatures rejected as false. The accuracy of the system depends on the FAR, the FRR and the threshold value. Specificity and sensitivity are statistical measures of performance: sensitivity measures the proportion of actual positives identified as such (the true positive rate), and specificity measures the proportion of negatives identified as such (the true negative rate). Specificity and sensitivity play a decisive role in any signature verification system [15], [9]. Together they determine the accuracy, which is calculated from True Positives (TP), True Negatives (TN), False Positives (FP) and False Negatives (FN) as shown in Table 3. TP is the number of correct positive predictions, FP the number of incorrect positive predictions, TN the number of correct negative predictions and FN the number of incorrect negative predictions. When the values differentiating genuine from forged signatures are applied very strictly, a genuine signature may not be accepted, so an error percentage has to be allowed for. Systems securing highly confidential platforms, such as the military, should have a higher threshold, whereas a system like signature verification can tolerate a lower threshold with low FN values. This is due to intrapersonal variation in signatures: it is hard for a person to put exactly the same signature each time. At the same time, if the threshold value is too high, the system tends to fail. The signature verification system has been modeled with these conditions in mind.

Table 3: Performance Measures
Sensitivity = TP / (TP + FN)
Specificity = TN / (TN + FP)
Accuracy = (TP + TN) / (TP + TN + FP + FN)

4. EXPERIMENT AND RESULT ANALYSIS

In this model, the 2-D image of the signature is scanned using a high-resolution (300 dpi) scanner. The scanned image is converted to grayscale and the GLCM is created from it. The features are extracted and a Neural Network is designed. The features of a person's sample signatures are loaded as input to the network for training. The result of the network is either 1, indicating that the signature is genuine, or 0, indicating that it is forged. All 240 samples of genuine and forged signatures are fed to the network and the output values are stored in the database. Outliers with a high deviation from the expected result are found using the neural network, and the signatures are classified as genuine or forged. Sensitivity, specificity, accuracy, FAR and FRR are calculated and the results are tabulated in Tables 4 and 5.

Table 4: Sensitivity and Specificity Analysis

Run | No. of Samples (50% genuine, 50% forgery) | TP | TN | FP | FN
Sign.1 | 48 | 22 | 21 | 2 | 3
Sign.2 | 48 | 21 | 23 | 3 | 1
Sign.3 | 48 | 21 | 22 | 3 | 2
Sign.4 | 48 | 23 | 23 | 1 | 1
Sign.5 | 48 | 22 | 23 | 2 | 1
Average | | 21.8 | 22.4 | 2.2 | 1.6

Table 5: Experimental Results

Run | Sensitivity | Specificity | Accuracy | FRR | FAR
Sign.1 | 88.00 | 91.30 | 89.58 | 8.33 | 12.50
Sign.2 | 95.45 | 88.46 | 91.67 | 12.50 | 4.17
Sign.3 | 91.30 | 88.00 | 89.58 | 12.50 | 8.33
Sign.4 | 95.83 | 95.83 | 95.83 | 4.17 | 4.17
Sign.5 | 95.65 | 92.00 | 93.75 | 8.33 | 4.17
Average | 93.25 | 91.12 | 92.08 | 9.17 | 6.67

Table 5 shows that the system has performed well for the signatures trained and classifies a given signature as genuine or forged. The average sensitivity of the modeled system is 93.25% and the specificity is 91.12%. The performance chart in Figure 7 shows an excellent result, with an accuracy of 92.08%. Also, the FAR is less than the FRR, which is desirable for such verification systems.

Figure 7 Chart Showing Experimental Analysis

5. CONCLUSION AND FUTURE WORK

In this study, the authors proposed a model for off-line signature verification on bank cheques. The signature on a bank cheque is scanned and preprocessed, features are extracted using the GLCM, and a FFBPNN is designed and trained on these features. After training, the signature to be tested is fed to the trained network for classification as genuine or forged. The modeled system is tested with 120 genuine and 120 forged signatures and shows excellent performance, with an accuracy of 92.08%, a sensitivity of 93.25% and a specificity of 91.12%.

The modeled system can be improved in future by minimizing the number of input samples needed to train the network. Further, the system currently classifies signatures only as genuine or forged; the classification can be extended to determine whether a forgery is skilled, random or simple.

Acknowledgements
The authors are very thankful to the researchers whose valuable inputs helped bring out this model. The authors thank everyone who extended their help and support.
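The Table 3 measures are one-line computations. The check below uses the Sign.4 row of Table 4 (TP = TN = 23, FP = FN = 1), which reproduces the 95.83% figures of Table 5.

```python
def sensitivity(tp, fn):
    """True positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

def accuracy(tp, tn, fp, fn):
    """Overall accuracy: (TP + TN) / (TP + TN + FP + FN)."""
    return (tp + tn) / (tp + tn + fp + fn)

# Sign.4 row of Table 4: 23/24 = 95.83% for all three measures
print(round(100 * accuracy(23, 23, 1, 1), 2))  # → 95.83
```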

References
[1] Abu Shamim Mohammad Arif, Md. Sabbir Hussain, Md. Rafiqul Islam, S. A. Ahsan Rajon and Abdullah Al Nahid, An Approach for Off-Line Signature Verification System using Peak and Curve Comparision, JCIT, ISSN 2078-5828 (print), ISSN 2218-5224 (online), Volume 01, Issue 01, 2010.
[2] Anjali R. and Manju Rani Mathew, Offline Signature Verification Based on SVM and Neural Network, International Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering, Vol. 2, Special Issue 1, December 2013.

[3] Anu Rathi, Divya Rathi and Parmanand Astya, Offline Handwritten Signature Verification by using Pixel based Method, International Journal of Engineering Research & Technology (IJERT), Vol. 1, Issue 7, ISSN: 2278-0181, September 2012.
[4] Banshider Majhi, Y. Santhosh and Prasanna, Novel Features for Off-line Signature Verification, IJCCC, Vol. I, 2006.
[5] Daramola Samuel and Prof. Ibiyemi Samuel, Novel Feature Extraction Technique for Off-line Signature Verification System, IJEST, Vol. ED-2(7), 2010.
[6] H. Baltzakis and Papamarkos, A New Signature Verification Technique Based on a Two-stage Neural Network Classifier, EAAI 14, 2001.
[7] İlkhan Cuceloglu and Hasan Ogul, Detecting Handwritten Signatures in Scanned Documents, 19th Computer Vision Winter Workshop, Zuzana Kúkelová and Jan Heller (eds.), Křtiny, Czech Republic, February 3-5, 2014.
[8] Imran Hussain, Vikash Shrivastava and Vivek Kr. Shrivastava, Review on Offline Signature Verification Methods Based on Artificial Intelligence Technique, IJOART, Volume 2, Issue 5, May 2013.
[9] Kritika, Niketa Dubey, Riju Nema and Rishabh, Signature Verification through MATLAB using Image Processing, IJETECS, Vol. 2, Issue 4, April 2013.
[10] Miguel A. Ferrer, Francisco Vargas J., Aythami Morales and Aaron Ordnez, Robustness of Offline Signature Verification Based on Gray Level Features, IEEE Transactions on Information Forensics and Security, Volume 7, No. 3, June 2012.
[11] Minal Tomar and Pratibha Singh, A Directional Feature with Energy based Offline Signature Verification Network, International Journal on Soft Computing (IJSC), Vol. 2, No. 1, February 2011.
[12] Nagendra Babu P., K. Chaithanya Sagar and A. Surendra Reddy, Offline Signature Verification based on GLCM, International Journal of Electronics Signals and Systems (IJESS), ISSN: 2231-5969, Vol. 3, Issue 2, 2013.
[13] Prashanth C.R. and Raja K. B., Off-line Signature Verification Based on Angular Features, International Journal of Modeling and Optimization, Volume 2, No. 4, August 2012.
[14] R. M. Haralick, K. Shanmugam and I. Dinstein, Textural Features of Image Classification, IEEE Transactions on Systems, Man and Cybernetics, Vol. SMC-3, No. 6, November 1973.
[15] R. Manavala and K. Thangavel, Evaluation of Textural Feature Extraction from GRLM for Prostrate Cancer through Medical Images, IJAET, January 2012.
[16] R. Plamondon and N. Shrihari, Online and Offline Handwritten Recognition: A Comprehensive Survey, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, January 2000.


[17] Rahul Verma and Rao D.S., Offline Signature Verification and Identification Using Angle Feature and Pixel Density Feature and Both Methods Together, International Journal of Soft Computing and Engineering (IJSCE), ISSN: 2231-2307, Volume 2, Issue 4, April 2013.
[18] Saba Mustaq and Ajaz Hussain Mir, Signature Verification Based on Texture Features of Image, IJCST, Vol. 4, Issue 8, August 2013.
[19] Vargas J.F. (GEPAR, Univ. de Antioquia, Medellin, Colombia), Travieso C.M., Alonso J.B. and Ferrer M.A., Off-line Signature Verification Based on Gray Level Information Using Wavelet Transform and Texture Features, International Conference on Frontiers in Handwriting Recognition (ICFHR), Print ISBN: 978-1-4244-8353-2, 2010.
[20] Vargas J.F., Ferrer M.A., Travieso C.M. and Alonso J.B., Off-line Signature Verification System Performance against Image Acquisition Resolution, Document Analysis and Recognition, ICDAR 2007.
[21] http://www.mathworks.com/matlabcentral/fileexchange/22187-glcm-texture-features.

AUTHORS

Dr. D. Ashok Kumar received his Master's degree in Mathematics and Computer Applications in 1995 and completed his Ph.D. on Intelligent Partitional Clustering Algorithms in 2008 at Gandhigram Rural Institute (Deemed University), Gandhigram, Tamil Nadu, INDIA. He is currently working as Associate Professor and Head of the Department of Computer Science and Applications, Government Arts College, Tiruchirapalli – 620 022, Tamil Nadu, INDIA. His research interests include Pattern Recognition and Data Mining by various soft computing approaches, viz., Neural Networks, Genetic Algorithms, Fuzzy Logic, Rough Sets, etc.

S. Dhandapani completed his M.C.A. degree at Bharathiar University, Coimbatore, Tamil Nadu and his M.Phil. at Periyar University, Salem. He is currently a Ph.D. research scholar in the field of Pattern Recognition and Image Processing at Bharathidasan University, Tiruchirapalli, Tamil Nadu.
