
Handwritten Character Recognition Using BP NN, LAMSTAR NN and SVM

Majed Valad Beigi PhD student at EECS department of Northwestern University

Email: [email protected] SPRING 2015

Abstract

An off-line handwritten alphabetical character recognition system using a Back Propagation neural network, a LAMSTAR neural network and a Support Vector Machine (SVM) is described in this report. The general steps of the algorithm are: 1- scanning the source material (a paper with all the characters written on it) using an optical scanner; 2- performing automatic pre-processing on the image; 3- creating the input dataset for the ANN or SVM by extracting the most important attributes from the image of each character and representing the features as a matrix of ‘0’s and ‘1’s (the chosen attributes have a crucial impact on the end results); 4- classifying the dataset using the BP NN, the LAMSTAR NN or the SVM, and performing recognition during the test; 5- finally, obtaining the recognition results. Each part is explained in more detail in the remainder of the report.

For the results, four data sets, each containing all 52 alphabet characters (26 upper-case and 26 lower-case) written by various people, are used for training the neural networks, and 520 different handwritten characters (10 per each of the 52 classes) are used for testing. The proposed recognition system performs quite well. Experimental results show that the LAMSTAR NN, with a success rate of 93.84% and a training time of about 1.2 seconds, is the fastest and the most efficient of the three classifiers (Back Propagation Neural Network, LAMSTAR NN and Support Vector Machine) for this problem.

1. Introduction

The development of handwriting recognition systems began in the 1950s, when human operators converted data from various documents into electronic format, making the process quite long and often affected by errors. Automatic text recognition aims at limiting these errors by using image pre-processing techniques that bring increased speed and precision to the entire recognition process. Handwriting recognition has been one of the most fascinating and challenging research areas in the field of image processing and pattern recognition in recent years. It contributes immensely to the advancement of the automation process and improves the interface between man and machine in numerous applications. Optical character recognition is a field of study that can encompass many different solving techniques. Neural networks (Sandhu & Leon, 2009), support vector machines and statistical classifiers seem to be the preferred solutions to the problem due to their proven accuracy in classifying new data [1].


An optical character recognizer is essentially a converter that translates images of handwritten text into machine-encoded text. In general, handwriting recognition is classified into two types: off-line and on-line. In off-line recognition, the writing is usually captured optically by a scanner and the completed writing is available as an image; in other words, off-line handwritten text is text that has been scanned into a digital format. In an on-line system, by contrast, the two-dimensional coordinates of successive points are recorded as a function of time, together with the order of the strokes made by the writer; the resulting X-Y coordinates give the location of the pen, the pressure applied by the user and the writing speed. On-line handwritten text is typically written with a stylus on a tablet. There is also a third, less common category, in which machine-printed text produced by laser or inkjet devices is recognized [2].

There is extensive work in the field of handwriting recognition, and a number of reviews exist. On-line methods have been shown to be superior to their off-line counterparts in recognizing handwritten characters, due to the temporal information available to the former [3] [4]. However, several applications, including mail sorting, bank processing, document reading and postal address recognition, require off-line handwriting recognition systems. Moreover, in off-line systems, neural networks and support vector machines have been used successfully to yield comparably high recognition accuracy levels. As a result, off-line handwriting recognition continues to be an active area of research aimed at exploring newer techniques that would improve recognition accuracy [5] [6]. Therefore, for this report, I have decided to work on an off-line handwritten alphabetical character recognition system using a Back Propagation neural network, a LAMSTAR neural network and a Support Vector Machine (SVM).

An Artificial Neural Network (ANN) is a computing model inspired by the brain, with parallel distributed processing elements that learn by adjusting the connection weights between neurons. Due to its flexibility and strength, it is now broadly used in different fields such as pattern recognition, decision-making optimization, market analysis and robot intelligence [7]. ANNs are remarkable as computational processors for tasks such as data compression, classification, combinatorial optimization and pattern recognition [8]. ANNs have many advantages over classical methods: despite their computational complexity, they offer many advantages in pattern recognition by adopting a small measure of human-like intelligence [9]. In off-line recognition systems, neural networks have emerged as fast and reliable tools for classification towards achieving high recognition accuracy [10]. Classification techniques have been applied to handwritten character recognition since the 1990s. These methods include statistical methods based on the Bayes decision rule, Artificial Neural Networks (ANNs), kernel methods including Support Vector Machines (SVMs), and multiple classifier combination [11], [12].

I have taken the main idea of this project from [13]. As the authors of [13] did, I have chosen to use the Image Processing Toolbox of MATLAB for the image pre-processing stage of the handwritten character recognition problem at hand. In [13], a back propagation Artificial Neural Network is used for performing the classification and recognition tasks. However, I have also evaluated the performance of the LAMSTAR neural network and of a Support Vector Machine classifier on this problem. Moreover, the authors of [13] only computed the average value in the 10×10 sub-matrices of the bigger original matrix obtained from the image of each character, whereas in this work I have resized the character images to two different sizes (50×70 pixels and 90×120 pixels) and computed the average value in the 10×10 sub-matrices for the former and in the 15×15 sub-matrices for the latter.

This report is organized as follows. Section 2 describes the design. In this part, I first explain the automatic image pre-processing technique performed on the scanned image of the handwritten characters to extract each character from the scanned image. Then I describe the feature extraction technique in which, first, the sub-image of each character is resized (in order to standardize sub-images of different sizes), then a matrix of Boolean values is created from the resized sub-image, and finally a smaller matrix of fuzzy values is created by running a window of a specific size (explained earlier) over the original matrix and taking the average of the values in that window. Sections 3, 4 and 5 explain the different classification techniques examined in this project in more detail. Section 6 gives a summary of the results obtained with each of the three classification techniques. Section 7 concludes the report, and the full MATLAB source code is given in the Appendix.

2. Design

The objective of this project is to identify handwritten characters with the use of a Back Propagation neural network, a LAMSTAR neural network and a Support Vector Machine. For the Back Propagation and LAMSTAR methods, a suitable neural network has to be constructed and trained properly. For the SVM, multiple binary classifiers have to be combined to perform the multi-class classification required for this problem.

The program should be able to extract the characters one by one and map the target output for training purposes. After automatic processing of the image, the training dataset is used to train the “classification engine” (BP NN, LAMSTAR NN or SVM) for recognition purposes. The program code is written in MATLAB and supported with a Graphical User Interface (GUI).

o Automatic Image Preprocessing

The image is first converted to grayscale and then thresholded, which turns it into a binary image. The binary image is then sent through a connectivity test in order to find the maximum connected component, that is, the box of the form. After locating the box, the individual characters are cropped into different sub-images; these are the raw data for the following feature extraction routine.

The sizes of the sub-images are not fixed, since noise affects the cropping process and makes it vary from one character to another. This makes the network input non-standard and prevents the data from being fed through the network. To solve this problem, I initially resized the sub-images to 50×70 pixels and then, by taking the average value in each 10×10 block, converted the images to 5×7 matrices of fuzzy values, giving 35 inputs for the networks. However, in order to get a higher resolution, I also resized the sub-images to 90×120 pixels and then took the average value in each 15×15 block, converting the images to 6×8 matrices and thus obtaining 48 inputs for the networks. Before resizing the sub-images, another process must be carried out to eliminate the white space in the boxes.

1- Read the image into the MATLAB workspace
2- Convert to grayscale
3- Convert to a binary image
4- Edge detection
5- Morphology: at this step, image dilation and image filling are performed.
6- Blob analysis: at this step, all the objects in the image and the properties of each object are found.
7- Plot the object locations: at this step, the location of each object is plotted.

(A MATLAB sketch of these steps is given after Figure 1.)

Figure 1: Automatic image pre-processing steps.
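The steps above correspond to the following minimal MATLAB sketch; the file name and variable names are illustrative, and the full implementation used in this project is the function edu_imgpreprocess in the Appendix.

% Minimal sketch of the automatic image pre-processing pipeline.
I      = imread('characters_sheet.bmp');      % 1- read the scanned sheet (illustrative file name)
Igray  = rgb2gray(I);                         % 2- convert to grayscale
Ibw    = im2bw(Igray, graythresh(Igray));     % 3- convert to binary (Otsu threshold)
Iedge  = edge(uint8(Ibw));                    % 4- edge detection
Iedge2 = imdilate(Iedge, strel('square', 3)); % 5- morphology: dilation ...
Ifill  = imfill(Iedge2, 'holes');             %    ... and hole filling
Ilabel = bwlabel(Ifill);                      % 6- blob analysis: label the connected components
props  = regionprops(Ilabel, 'BoundingBox');  %    and get the bounding box of each object
figure, imshow(Ibw), hold on                  % 7- plot the location of each object
for k = 1:numel(props)
    rectangle('Position', props(k).BoundingBox, 'EdgeColor', 'r');
end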

After these steps, the characters can be extracted from the image, and we are then able to extract the most important features of each character.

o Feature Extraction

Neural networks can be used if we have a suitable dataset for training and learning purposes. The dataset is one of the most important parts of constructing a neural network; without a proper dataset, training is useless. In order to get a proper dataset, we first scan the image. After the image is scanned, we apply a processing algorithm that extracts the important attributes from the image and maps them into a dataset.

The extracted attributes have numerical values and are usually stored in arrays. With these values, the neural networks and the SVM can be trained to obtain good end results. The quality of a dataset also depends on carefully chosen attributes; the attributes are important and can have a crucial impact on the end results. In the following, the steps required to extract the most important features of each character are described:

1- The sub-images have to be cropped sharply to the border of the character in order to standardize them. This standardization is done by finding the row and column with the maximum number of 1s and, starting from that peak, increasing and decreasing the counters until white space, i.e. a line of all 0s, is met. This technique is shown in the figure below, where a character “C” is cropped and resized.


Figure 2: Cropped and Resized Picture.

2- The image pre-processing is then followed by another resize to meet the network input requirement, i.e. 5×7 (or 6×8) matrices, where each cell receives a value derived from the corresponding 10×10 (15×15) block of the resized sub-image (a value of 1 is assigned when the whole block is filled with 1s), as shown below:

Figure 3: Image resize again to meet the network input requirement.

Finally, the 5×7 (6×8) matrix is concatenated into a stream so that it can be fed into a network with 35 (48) input neurons. The input of the network is actually the negative image of the figure, with input values ranging from 0 to 1, where 0 corresponds to black and 1 to white, while values in between indicate the intensity of the relevant pixel.

In this way, we are able to extract the character and pass it to the next stage for “classification” or “training” of the neural network or SVM.
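As a concrete illustration, the block-averaging step for the 5×7 configuration can be written as the following sketch (bw_cropped is an assumed, already cropped binary sub-image; the project's own implementation is the function edu_imgresize in the Appendix):

% Sketch of the feature extraction for the 5x7 configuration.
bw_7050 = imresize(bw_cropped, [70, 50]);           % standardize the cropped sub-image
feat = zeros(7, 5);
for r = 1:7
    for c = 1:5
        block = bw_7050(r*10-9:r*10, c*10-9:c*10);  % one 10x10 block of the sub-image
        feat(r, c) = 1 - sum(block(:))/100;         % fraction of character (dark) pixels
    end
end
input_vec = reshape(feat', 1, 35);                  % 1x35 input vector for the classifiers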


3. METHOD ‘1’: Back Propagation Neural Network

In this part, an Artificial Neural Network (ANN) forms the basis of an OCR system and is trained using the Back Propagation algorithm. After converting the handwritten English characters into 5×7 or 6×8 matrices as explained earlier, these matrices can be fed to the ANN as input. The feed-forward pass gives the workings of the neural network, and the Back Propagation algorithm then performs the training by calculating the error and modifying the weights. The BP algorithm starts with the output layer, which is the only one for which desired outputs are available. The error in the output layer is calculated as the difference between the desired output and the actual output. In this project, the result of the BP ANN is a 1×52 vector, from which one of the 52 (26 upper-case and 26 lower-case) letters of the English alphabet is obtained.

(a) Structure:

The Back Propagation neural network implemented for this project is composed of 3 layers: one input, one hidden and one output layer. For the 5×7 (6×8) matrices, the input layer has 35 (48) neurons, the hidden layer has 100 neurons (the number of neurons in the hidden layer was determined by trial and error), and the output layer has 52 neurons. The output layer is in fact a competitive layer (only one bit of the output becomes 1 for each class). For this project, the sigmoid function has been used as the non-linear neuron activation function:

f(z) = 1 / (1 + e^(-z))

Bias terms (equal to 1) with trainable weights were also included in the network structure. The structural diagram of the neural network is given in the following figure:

Figure 4: Schematic design of the back-propagation neural network.
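For clarity, the forward pass through this structure (shown here for the 48-100-52 case) amounts to the following sketch; the weight and bias names are illustrative, and the project's own routines are edu_initnet and edu_simnet in the Appendix.

% Forward pass of the 48-100-52 back-propagation network (sketch).
sigmoid = @(z) 1 ./ (1 + exp(-z));                          % neuron activation function
x  = rand(48, 1);                                           % one feature vector (as a column)
W1 = rand(100, 48)*0.8 - 0.4;  b1 = rand(100, 1)*0.8 - 0.4; % input  -> hidden (random init)
W2 = rand(52, 100)*0.8 - 0.4;  b2 = rand(52, 1)*0.8 - 0.4;  % hidden -> output (random init)
h  = sigmoid(W1*x + b1);                                    % 100 hidden activations
y  = sigmoid(W2*h + b2);                                    % 52 outputs, one per class
[~, class_idx] = max(y);                                    % competitive output: winning class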


(b) Network set-up:

The Back propagation (BP) learning algorithm was used to solve the problem. The goal of this algorithm is to minimize the error energy at the output layer. In this method a training set of input vectors is applied vector-by-vector to the input of the network and is forward-propagated to the output. Weights are then adjusted by the BP algorithm. Subsequently, these steps are repeated for all training sets. The algorithm stops when adequate convergence is reached.

o Training algorithm:

To train the network to recognize the English alphabet characters, the corresponding 5×7 (6×8) grids are applied in the form of 1×35 (1×48) vectors to the input of the network. The weights are then updated using the standard BP equations given in the textbook. The initial learning rate was experimentally set to 1.5; it is divided by a factor of 2 every 100 iterations and reset to its initial value every 400 iterations, and the momentum rate is set to 0.95 (see the sketch below).
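This learning-rate schedule can be expressed as the following sketch (the loop body is a placeholder for the actual update step):

% Learning-rate schedule used during BP training (sketch).
lr0 = 1.5;  lr = lr0;  mc = 0.95;   % initial learning rate and momentum
num_itr = 5000;                     % maximum number of training iterations
for itr = 1:num_itr
    if mod(itr, 400) == 0
        lr = lr0;                   % reset to the initial value every 400 iterations
    elseif mod(itr, 100) == 0
        lr = lr / 2;                % halve the learning rate every 100 iterations
    end
    % ... forward pass, error computation and weight/momentum update go here ...
end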

o Testing algorithm:

For testing, the weights that were calculated during training are used. The testing inputs are given in the form of 1×35 (1×48) vectors for the corresponding 5×7 (6×8) grids. A character is considered recognized if all the outputs of the network are no more than 0.01 off their respective desired values.
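This recognition criterion amounts to a simple element-wise check, e.g.:

% y and y_desired are 52x1 vectors of actual and desired outputs.
recognized = all(abs(y - y_desired) <= 0.01);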


4. METHOD ‘2’: LAMSTAR Neural Network

The LAMSTAR network has the following components:

(a) Input word and its subwords:

The input word is divided into a number of subwords. Each subword represents an attribute of the input word. For this part I have considered the 6×8 matrices obtained from the feature extraction algorithm.

In this case, each subword is 4 bits long and is obtained by running a 2×2 window over the feature matrix of a character (each cell of the feature matrix contains the information of a 15×15-pixel block of the original sub-image of that character). Thus there are 12 subwords in total for each character; a short sketch of this windowing is given after Figure 6.

Figure 5: Obtaining each sub-word by running a 2×2 window on the feature matrix.

Figure 6: Layout of the LAMSTAR Neural Network.
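A minimal sketch of the subword formation, splitting a 6×8 feature matrix into twelve non-overlapping 2×2 windows (the variable names are illustrative):

% Split a 6x8 feature matrix into 12 subwords of 4 values each.
feat = rand(6, 8) > 0.5;            % example 6x8 feature matrix
subwords = zeros(12, 4);
k = 0;
for r = 1:2:6
    for c = 1:2:8
        k = k + 1;
        win = feat(r:r+1, c:c+1);   % one 2x2 window
        subwords(k, :) = win(:)';   % flatten to a 4-element subword
    end
end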


(b) SOM modules for storing input subwords:

For every subword there is an associated Self Organizing Map (SOM) module with neurons that are designed to function as Kohonen `Winner Take All' neurons where the winning neuron has an output of 1 while all other neurons in that SOM module have a zero output.

In this algorithm, the SOM modules are built dynamically: instead of fixing the number of neurons at some arbitrary value, neurons are created depending on the inputs that a particular subword module actually receives. For example, if two subwords have all their pixels equal to ‘1’, they fire the same neuron in their SOM module, so only 1 neuron is needed instead of 2. In this way the network is designed with a smaller number of neurons, and the time taken to find the winning neuron at the classification stage is reduced considerably.

(c) Output (decision) layer:

The output (decision) layer is designed to have six neurons, which fire according to the following patterns:

Class:   A       B       C       D       E       F       G       H       I       J       K       L       M
Output:  000000  000001  000010  000011  000100  000101  000110  000111  001000  001001  001010  001011  001100

Class:   N       O       P       Q       R       S       T       U       V       W       X       Y       Z
Output:  001101  001110  001111  010000  010001  010010  010011  010100  010101  010110  010111  011000  011001

Class:   a       b       c       d       e       f       g       h       i       j       k       l       m
Output:  100110  100111  101000  101001  101010  101011  101100  101101  101110  101111  110000  110001  110010

Class:   n       o       p       q       r       s       t       u       v       w       x       y       z
Output:  110011  110100  110101  110110  110111  111000  111001  111010  111011  111100  111101  111110  111111

Table 1: Values at the output layer for each character (class).

The link weights from the input SOM modules to the output (decision) layer are adjusted during training on a reward/punishment principle, and they continue to be trained during normal operational runs. Specifically, if the output of a particular output neuron is what is desired, then the link weights to that neuron are rewarded by increasing them by a small non-zero increment, while they are punished by a small non-zero decrement if the output is not what is desired.


o Training algorithm:

The LAMSTAR network was trained with 208 characters (4 for each of the 52 English alphabet classes). The training was performed as follows:

1- Subword Formation:

The input patterns are to be divided into subwords before training/testing the LAMSTAR network.

2- Input Normalization:

Each subword of every input pattern is normalized as follows:

x_j' = x_j / sqrt( x_1^2 + x_2^2 + x_3^2 + x_4^2 ),   j = 1, 2, 3, 4

where x_j is the j-th element of the un-normalized subword of an input pattern and x_j' is the corresponding element of the normalized subword. During this process, the subwords that are all zeros are identified and their normalized values are manually set to zero.

3- Dynamic Neuron formation in the SOM modules:

The first neurons in all the SOM modules are constructed as Kohonen neurons as follows:

As the first pattern is input to the system, one neuron with 4 inputs is built, and its weights are initialized to the normalized input subword.

When a subword of a subsequent pattern is input to the respective module, the outputs of the previously built neurons are checked to see whether any of them is close to 1 (with a tolerance of 0.05). If one of the neurons satisfies this condition, it is declared the winning neuron, i.e. a neuron whose weights closely resemble the input subword. Otherwise another neuron is built, with a new set of weights that are normalized and adjusted as above to resemble the input subword.

During this process, if a subword is all zeros it will not contribute to a change in the output; hence its output is set to zero and the process of finding a winning neuron is bypassed in that case.

4- Desired neuron firing pattern:

The output neuron firing pattern for each character in the training set has been established as given in Table 1.


5- Link weights:

Link weights are defined as the weights that go from the winning neuron of every module to the 6 output neurons. If, in the desired firing pattern, a neuron is to be fired, then its corresponding link weights are rewarded by adding a small positive value of 0.05 until the output is as expected. On the other hand, if a neuron should not be fired, its link weights are reduced by 0.05 until the output is as expected. (Steps 2-5 are summarized in the sketch after this list.)

This results in the summed link weights at the output layer being a positive value, indicating a fired neuron, if the neuron has to be fired for the pattern, and a negative value if it should not be fired.

6- The weights at the SOM neuron modules and the link weights are stored.
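A simplified, illustrative sketch of steps 2-5 for a single training pattern is given below. Here subwords is the 12×4 matrix produced by the subword formation step, desired is the 6-bit firing pattern of the pattern's class from Table 1, and som is an assumed cell array holding the dynamically built Kohonen weights (W) and link weights (L) of each module, with every module starting empty (som{m}.W = zeros(0,4), som{m}.L = zeros(0,6)). In the full procedure the reward/punishment step is repeated until the output matches the desired pattern.

% Sketch of dynamic SOM-neuron formation and link-weight training (one pattern).
tol = 0.05;  delta = 0.05;                    % winner tolerance and reward/punishment step
for m = 1:12                                  % one SOM module per subword
    s = subwords(m, :);
    if any(s)                                 % all-zero subwords are skipped (output 0)
        s = s / norm(s);                      % step 2: normalize the subword
        out = som{m}.W * s';                  % outputs of the existing Kohonen neurons
        [val, win] = max(out);
        if isempty(out) || val < 1 - tol      % step 3: no stored neuron is close enough,
            som{m}.W(end+1, :) = s;           %         so build a new neuron for this subword
            som{m}.L(end+1, :) = zeros(1, 6); %         with fresh link weights to the 6 outputs
            win = size(som{m}.W, 1);
        end
        % step 5: reward link weights of outputs that should fire, punish the others
        som{m}.L(win, :) = som{m}.L(win, :) + delta * (2*desired - 1);
    end
end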

o Testing:

The LAMSTAR network was tested with 520 patterns (10 for each of the 52 classes), as follows:

1- The patterns are processed to get 12 subwords as before. Normalization is done for the subwords as explained in the training.

2- The stored weights are loaded

3- The subwords are propagated through the network; the neuron with the maximum output in each Kohonen (SOM) module is found, and its link weights are sent to the output neurons.

4- The output is the sum of all these link weights, as in the sketch below.
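Using the modules built in the training sketch above, the decision for one test pattern reduces to summing the link weights of the winning neurons and firing the output neurons with positive sums:

% Sketch of the LAMSTAR decision for one test pattern.
out_sum = zeros(1, 6);
for m = 1:12
    if any(subwords(m, :))
        s = subwords(m, :) / norm(subwords(m, :));
        [~, win] = max(som{m}.W * s');         % winning Kohonen neuron of module m
        out_sum = out_sum + som{m}.L(win, :);  % accumulate its link weights
    end
end
firing = out_sum > 0;                          % 6-bit firing pattern, compared with Table 1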


5. METHOD ‘3’: Support Vector Machine

o Training algorithm:

(a) Linearly Separable Binary Classification:

The training data for the SVM is of the form {x_i, y_i}, i = 1, 2, ..., 52, where x_i is the input vector and y_i ∈ {-1, +1} is the label. The corresponding 5×7 (6×8) grids are applied in the form of 1×35 (1×48) vectors to the input.

In an SVM we look for a hyperplane which separates the positive (y_i = +1) from the negative (y_i = -1) examples (a separating hyperplane). The points x which lie on the hyperplane satisfy x · w + b = 0, where w is normal to the hyperplane, |b| / ||w|| is the perpendicular distance from the hyperplane to the origin, and ||w|| is the Euclidean norm of w. Let d+ (d-) be the shortest distance from the separating hyperplane to the closest positive (negative) example. The “margin” of a separating hyperplane is defined to be d+ + d-. For the linearly separable case, the support vector algorithm simply looks for the separating hyperplane with the largest margin. This can be formulated as follows: suppose that all the training data satisfy the constraints

x_i · w + b >= +1   for y_i = +1        (Equation 1)
x_i · w + b <= -1   for y_i = -1        (Equation 2)

These can be combined into one set of inequalities:

y_i (x_i · w + b) - 1 >= 0   for all i      (Constraint 1)

Now consider the points for which the equality in Equation 1 holds (requiring that such a point exists is equivalent to choosing a scale for w and b). These points lie on the hyperplane H1: x_i · w + b = 1, with normal w and perpendicular distance from the origin |1 - b| / ||w||. Similarly, the points for which the equality in Equation 2 holds lie on the hyperplane H2: x_i · w + b = -1, again with normal w and perpendicular distance from the origin |-1 - b| / ||w||. Hence d+ = d- = 1 / ||w|| and the margin is simply 2 / ||w||. H1 and H2 are parallel (they have the same normal) and no training points fall between them. Thus we can find the pair of hyperplanes which gives the maximum margin by minimizing ||w||^2, subject to Constraint 1.

The solution for a typical two-dimensional case then has the form shown in the following figure. Those training points for which the equality in Constraint 1 holds (i.e. those which wind up lying on one of the hyperplanes H1, H2), and whose removal would change the solution found, are called support vectors; they are indicated in the figure by the extra circles [14].


Figure 7: Linear separating hyperplanes for the separable case.

To solve this minimization problem subject to the constraint above, Lagrange multipliers must be used.
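Concretely, following [14], one Lagrange multiplier α_i >= 0 is introduced per constraint, giving the primal Lagrangian

L_P = (1/2) ||w||^2 - Σ_i α_i y_i (x_i · w + b) + Σ_i α_i ,

which is minimized with respect to w and b while the derivatives with respect to all α_i are required to vanish, subject to α_i >= 0.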

(b) Non-Linearly separable Binary Classification:

When the data set is not linearly separable, a kernel function is used to map the data into a higher-dimensional space (the feature space). Some examples of kernel functions are given below; for this project, I have used the linear kernel function.
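Common examples of kernel functions (standard choices; the linear one is the kernel used in this project) are:

K(x_i, x_j) = x_i · x_j                        (linear kernel)
K(x_i, x_j) = (x_i · x_j + 1)^p                (polynomial kernel of degree p)
K(x_i, x_j) = exp(-||x_i - x_j||^2 / (2σ^2))   (Gaussian / RBF kernel)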

Figure 8: A Kernel function can be used to map the data point to a higher dimensional space.


(c) Multi-Class Classification:

SVMs are inherently two-class classifiers. The most common technique in practice has been to build as many one-versus-rest classifiers as there are classes (commonly referred to as “one-versus-all” or OVA classification) and to choose the classifier with the largest positive output. In other words, this technique is based on building binary classifiers which distinguish between one of the labels and the rest (one-versus-all). Classification of new instances in the one-versus-all scheme is done by a winner-takes-all strategy, in which the classifier with the highest output function assigns the class. Therefore, for training, K (the number of classes) different binary problems must be solved (K binary classifiers are required), classifying “class k” versus “the rest” for k = 1, 2, ..., K. For this project, I have 52 classifiers (because I have 52 classes).

o Testing algorithm:

For testing, a test sample is assigned to the class that gives the largest (most positive) value of f_c(x), where f_c(x) is the decision function obtained from the c-th binary problem (classifier).
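A minimal sketch of this one-versus-all scheme, using MATLAB's fitcsvm (Statistics and Machine Learning Toolbox) as a stand-in for the project's own SVM routines; Xtrain (N×48), labels (N×1, values 1..52) and the test vector x (1×48) are assumed to exist:

% One-versus-all training: one linear binary SVM per class.
K = 52;
models = cell(1, K);
for k = 1:K
    yk = 2*(labels == k) - 1;                        % +1 for class k, -1 for the rest
    models{k} = fitcsvm(Xtrain, yk, 'KernelFunction', 'linear');
end

% Classification of a test vector x: pick the most positive decision value.
scores = zeros(1, K);
for k = 1:K
    [~, s] = predict(models{k}, x);                  % s(2) is the score for the +1 class
    scores(k) = s(2);
end
[~, predicted_class] = max(scores);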


6. Results

o Method ‘1’: Back Propagation Neural Network

As explained earlier, in order to train the network to recognize the English alphabet characters, I applied the corresponding 5×7 or 6×8 grids in the form of 1×35 (1×48) vectors to the input of the network. I initially tested the network only with the 5×7 grids (obtained by running a 10×10 window over the 50×70-pixel resized sub-image of a character), but I then also tested it with the larger matrices, resizing each sub-image to 90×120 pixels and running a 15×15 window over it. Since the resolution of the second approach is much higher, the results improved.

The error goal is set to 0.01 for all 52 outputs, which means a character is recognized only if all fifty-two outputs of the network are no more than 0.01 off their respective desired values. The initial learning rate was experimentally set to 1.5 and the momentum to 0.95. In summary, the network has 35 (or 48) input neurons, 100 hidden neurons and 52 output neurons, with sigmoid activations, an initial learning rate of 1.5, a momentum of 0.95 and an error goal of 0.01.

Four data sets, each containing 52 alphabets (26 Upper-Case and 26 Lower-Case characters) written by various people, are used for training the neural network and 520 (10 per each 52 characters) different handwritten alphabetical characters are used for testing.

Mis-predictions between the visually identical case pairs ‘c’/‘C’, ‘o’/‘O’, ‘z’/‘Z’, ‘p’/‘P’, ‘w’/‘W’ and ‘s’/‘S’ (in either direction) are not counted as errors.

The following table gives a summary of the results for each character for the 5×7 input grids: (Note that SR means Success Rate).

Class   A    B    C    D    E    F    G    H    I    J    K    L    M
SR (%)  90   90   80   80   90   80   80   90   70   80   100  80   90

Class   N    O    P    Q    R    S    T    U    V    W    X    Y    Z
SR (%)  90   70   90   70   90   100  80   70   90   100  70   90   90

Class   a    b    c    d    e    f    g    h    i    j    k    l    m
SR (%)  70   60   80   90   90   70   90   60   70   60   90   70   80

Class   n    o    p    q    r    s    t    u    v    w    x    y    z
SR (%)  70   70   90   80   70   100  90   80   70   100  70   70   90


The average success rate of the BP NN with the 1×35 input vector (5×7 grid) is 81.35%.

The following table gives a summary of the results for each character for the 6×8 input grids. (Note that SR means Success Rate).

Class   A    B    C    D    E    F    G    H    I    J    K    L    M
SR (%)  100  100  80   90   90   90   80   100  80   80   100  80   100

Class   N    O    P    Q    R    S    T    U    V    W    X    Y    Z
SR (%)  100  70   90   80   90   100  90   80   100  100  90   90   100

Class   a    b    c    d    e    f    g    h    i    j    k    l    m
SR (%)  80   80   80   90   100  90   100  80   90   90   100  80   90

Class   n    o    p    q    r    s    t    u    v    w    x    y    z
SR (%)  80   70   90   90   80   100  100  90   100  100  80   80   100

The average success rate of the BP NN with the 1×48 input vector (6×8 grid) is 89.61%.

The training time for BP NN was about 3.5 seconds on average and the testing time for a single character was about 15 milliseconds.

The following diagram compares the success rates of the two BP NN configurations described in this sub-section:

Figure 9: Comparison of the success rates (in percent) of the two BP NN configurations.

I have been able to train the neural network so that it successfully recognizes handwritten English alphabet characters. However, there is a price to pay for this convenience: when the network is tested on handwritten characters that are very different from the characters it was trained with, the success rate drops and the network becomes less robust.

From the testing results, I found that the BP NN with the 1×35 input vector mostly confuses characters that have almost the same shape. For example, the network frequently misclassified ‘h’ as ‘b’, ‘O’ as ‘Q’, ‘n’ as ‘a’, and ‘I’ as ‘L’. When I increased the resolution and tested the BP NN with the 1×48 input vector, the number of misclassifications decreased and the system became more robust.


o Method ‘2’: LAMSTAR Neural Network

Four data sets, each containing 52 alphabets (26 Upper-Case and 26 Lower-Case characters) written by various people, are used for training the neural network and 520 (10 per each 52 characters) different handwritten alphabetical characters are used for testing.

Mis-predictions between the visually identical case pairs ‘c’/‘C’, ‘o’/‘O’, ‘z’/‘Z’, ‘p’/‘P’, ‘w’/‘W’ and ‘s’/‘S’ (in either direction) are not counted as errors. The results obtained after training the network are presented in the following table (SR means Success Rate).

Class   A    B    C    D    E    F    G    H    I    J    K    L    M
SR (%)  100  100  90   90   90   90   90   100  90   80   100  80   100

Class   N    O    P    Q    R    S    T    U    V    W    X    Y    Z
SR (%)  100  90   100  90   90   100  100  100  100  100  90   90   100

Class   a    b    c    d    e    f    g    h    i    j    k    l    m
SR (%)  90   80   100  90   100  100  100  90   90   90   100  80   90

Class   n    o    p    q    r    s    t    u    v    w    x    y    z
SR (%)  100  90   100  90   80   100  100  90   100  100  90   90   100

The average success rate of the LAMSTAR NN for the 6×8 input grid is 93.84%.

The training time for the LAMSTAR NN was about 1.2 seconds, and the testing time for a single character was about 10 milliseconds.

Observations:

1. The LAMSTAR NN network was much faster than the Back Propagation network for the same handwritten character recognition problem.

2. By dynamically building the neurons in the SOM modules, the amount of computation is greatly reduced, as the search for the winning neuron is limited to a small number of neurons in many cases.

3. Even when neurons are lost (simulated as the case where the output of a neuron is zero, i.e. all its inputs are zeros), the recognition efficiency remains quite high. This is attributed to the link weights, which take care of such situations.

4. The network keeps learning during normal operation, even on patterns it was not trained on.

5. The accuracy of the LAMSTAR NN is higher than that of the BP NN.

The following diagram compares the success rate of the best BP NN described in the previous sub-section and the LAMSTAR NN:


Figure 10: Comparison of the success rates (in percent) of the two BP NN configurations and the LAMSTAR NN.

o Method ‘3’: Support Vector Machine

As mentioned earlier, the training data for the SVM is of the form {x_i, y_i}, i = 1, 2, ..., 52, where x_i is the input vector and y_i ∈ {-1, +1} is the label. The corresponding 5×7 (6×8) grids are applied in the form of 1×35 (1×48) vectors to the input. In summary, the SVM uses a linear kernel with 52 one-versus-all binary classifiers.

The following table gives a summary of the results for each character for the 5×7 input grids (Note that SR means Success Rate).

Class   A    B    C    D    E    F    G    H    I    J    K    L    M
SR (%)  100  90   90   90   90   90   90   100  80   80   100  80   90

Class   N    O    P    Q    R    S    T    U    V    W    X    Y    Z
SR (%)  100  90   100  80   90   100  100  100  90   100  90   90   90

Class   a    b    c    d    e    f    g    h    i    j    k    l    m
SR (%)  80   80   100  90   100  100  100  90   90   90   100  80   90

Class   n    o    p    q    r    s    t    u    v    w    x    y    z
SR (%)  80   90   100  90   90   100  90   80   80   100  90   90   90


The average success rate of the SVM with the 1×35 input vector (5×7 grid) is 91.34%.

The following table gives a summary of the results for each character for the 6×8 input grids (Note that SR means Success Rate).

Class   A    B    C    D    E    F    G    H    I    J    K    L    M
SR (%)  100  100  90   90   90   90   100  100  90   80   100  80   100

Class   N    O    P    Q    R    S    T    U    V    W    X    Y    Z
SR (%)  100  90   100  90   90   100  100  100  90   100  90   90   100

Class   a    b    c    d    e    f    g    h    i    j    k    l    m
SR (%)  90   80   100  90   100  100  100  80   90   90   100  80   90

Class   n    o    p    q    r    s    t    u    v    w    x    y    z
SR (%)  90   90   100  90   90   100  90   90   90   100  90   90   100

The average success rate of the SVM with the 1×48 input vector (6×8 grid) is 93.26%.

The training time for SVM was about 5.6 seconds on average and the testing time for a single character was about 35 milliseconds.

The following diagram compares the success rate of the best BP NN, the LAMSTAR NN and the best SVM:

Figure 11: Comparison of the success rates (in percent) of the best BP NN, the LAMSTAR NN and the best SVM.

As can be seen from the diagram above, the success rate of the LAMSTAR NN is higher than that of the other two methods, although the success rate of the SVM is quite close to that of the LAMSTAR. The BP NN has the lowest success rate of the three methods.

The following diagram compares the training times of the best BP NN, the LAMSTAR NN and the best SVM. As can be seen from this diagram, the LAMSTAR NN is much faster than the other two approaches, and the SVM is the slowest of the three methods.


Figure 12: Average training time of the BP NN, LAMSTAR NN and SVM.

7. Conclusion

An off-line handwritten character recognition system using a Back Propagation Neural Network, a LAMSTAR Neural Network and a Support Vector Machine has been described in this report.

A summary of this project is as follows:

1- Scanning the paper page with the handwritten characters on it.
2- Extracting sub-images of individual characters from the scanned image using the MATLAB Image Processing Toolbox.
3- Resizing the sub-images either to a 50×70-pixel image or to a 90×120-pixel image (since the cropped sub-images of characters from the previous step can have different sizes, they have to be resized to the same standard size before being given as input to the classifiers).
4- Creating a 50×70 or a 90×120 matrix of Boolean values from each sub-image by assigning ‘0’ to white pixels and ‘1’ to black pixels.
5- Resizing the original large matrix to a smaller matrix (by running a 10×10/15×15 window over the 50×70/90×120 original matrix and taking the average of the values in the window).
6- Feeding the 5×7/6×8 matrix in the form of a 1×35/1×48 vector to the input of the BP NN or SVM. For the LAMSTAR NN, I have only considered the 6×8 input matrix.
7- Finally, obtaining the classification results.

As the results of the previous section suggest, the LAMSTAR NN is the most efficient and the fastest classifier for this problem compared to the other two techniques examined in this project. Moreover, for the BP NN and the SVM, where I considered two different input sizes, I obtained higher accuracy for the input with the higher resolution (the 6×8 input matrix).


8. References

[1] E. Tautu and F. Leon, “Optical Character Recognition System Using Support Vector Machines,” pp. 1-13, 2012.

[2] Designing an Intelligent System for Optical Handwritten Character Recognition using ANN

[3] R. Plamondon and S. N. Srihari, “On-line and off-line handwritten character recognition: A comprehensive survey,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 1, pp. 63-84, 2000.

[4] N. Arica and F. Yarman-Vural, “An Overview of Character Recognition Focused on Off-line Handwriting”, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 2001, 31(2), pp. 216 - 233.

[5] U. Bhattacharya, and B. B. Chaudhuri, “Handwritten numeral databases of Indian scripts and multistage recognition of mixed numerals,” IEEE Transaction on Pattern analysis and machine intelligence, vol.31, No.3, pp.444-457, 2009.

[6] U. Pal, T. Wakabayashi and F. Kimura, “Handwritten numeral recognition of six popular scripts,” Ninth International conference on Document Analysis and Recognition ICDAR 07, Vol.2, pp.749-753, 2007.

[7] M. Simner, W. Hulstijn, and P. Girouard (Eds.), “Forensic, developmental and neuropsychological aspects of handwriting,” Special issue of the Journal of Forensic Document Examination, 1994.

[8] R. Plamondon, “Pattern recognition,” Special Issue on Automatic Signature Verification, vol. 8, no. 3, June 1994.

[9] G. P. Van Galen and P. Morasso, “Neuromotor control in handwriting and drawing,” Acta Psychologica, vol. 100, no. 1-2, p. 236, 1998.

[10] R.G. Casey and E.Lecolinet, “A Survey of Methods and Strategies in Character Segmentation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 18, No.7, July 1996, pp. 690-706.

[11] C. L. Liu, H. Fujisawa, “Classification and Learning for Character Recognition: Comparison of Methods and Remaining Problems”, Int. Workshop on Neural Networks and Learning in Document Analysis and Recognition, Seoul, 2005.

[12] F. Bortolozzi, A. S. Brito, Luiz S. Oliveira and M. Morita, “Recent Advances in Handwritten Recognition”, Document Analysis, Umapada Pal, Swapan K. Parui, Bidyut B. Chaudhuri, pp 1-30.

[13] J. Pradeep, E. Srinivasan, and S. Himavathi, “Diagonal based feature extraction for handwritten character recognition system using neural network,” Electronics Computer Technology (ICECT), 2011, vol. 4, pp. 364-368.

[14] C. J. C. Burges, “A tutorial on support vector machines for pattern recognition,” Data Mining and Knowledge Discovery, vol. 2, pp. 121–167, 1998.


9. Appendix

All of the source code is written in MATLAB.

o Method ‘1’: Back Propagation Neural Network

function img = edu_imgpreprocess(I, selected_col, selected_ln)
Igray = rgb2gray(I);
Ibw = im2bw(Igray, graythresh(Igray));
Iedge = edge(uint8(Ibw));
se = strel('square', 3);
Iedge2 = imdilate(Iedge, se);
Ifill = imfill(Iedge2, 'holes');
[Ilabel, num] = bwlabel(Ifill);
Iprops = regionprops(Ilabel);
Ibox = [Iprops.BoundingBox];
[y, x] = size(Ibox);
x = x/4;
Ibox = reshape(Ibox, [4 x]);
Ic = [Iprops.Centroid];
[z, w] = size(Ic);
w = w/2;
Ic = reshape(Ic, [2 w]);
Ic = Ic';
Ic(:,3) = (mean(Ic.^2, 2)).^(1/2);
Ic(:,4) = 1:w;
Ic2 = sortrows(Ic, 2);
for cnt = 1:selected_ln
    Ic2((cnt-1)*selected_col+1:cnt*selected_col, :) = ...
        sortrows(Ic2((cnt-1)*selected_col+1:cnt*selected_col, :), 4);
end
Ic3 = Ic2(:, 1:2);
ind = Ic2(:, 4);
for cnt = 1:selected_ln*selected_col
    img{cnt} = imcrop(Ibw, Ibox(:, ind(cnt)));
end

function bw2 = edu_imgcrop(bw)
% Find the boundary of the character inside the image
[y2temp, x2temp] = size(bw);
x1 = 1; y1 = 1; x2 = x2temp; y2 = y2temp;
% Find blank (all-white) columns on the left side
cntB = 1;
while (sum(bw(:, cntB)) == y2temp)
    x1 = x1 + 1;
    cntB = cntB + 1;
end
% Find blank rows on the top side
cntB = 1;
while (sum(bw(cntB, :)) == x2temp)
    y1 = y1 + 1;
    cntB = cntB + 1;
end
% Find blank columns on the right side
cntB = x2temp;
while (sum(bw(:, cntB)) == y2temp)
    x2 = x2 - 1;
    cntB = cntB - 1;
end
% Find blank rows on the bottom side
cntB = y2temp;
while (sum(bw(cntB, :)) == x2temp)
    y2 = y2 - 1;
    cntB = cntB - 1;
end
% Crop the image to the character's bounding box
bw2 = imcrop(bw, [x1, y1, (x2-x1), (y2-y1)]);

function lett = edu_imgresize(bw2)
% This function takes the cropped binary image and changes it to a 7 x 5
% character representation in a single vector.
bw_7050 = imresize(bw2, [70, 50]);
for cnt = 1:7
    for cnt2 = 1:5
        Atemp = sum(bw_7050((cnt*10-9:cnt*10), (cnt2*10-9:cnt2*10)));
        lett((cnt-1)*5+cnt2) = sum(Atemp);
    end
end
lett = ((100-lett)/100);
fid = fopen('imageresize.txt', 'w');
lett = lett';
fprintf(fid, '%6.2f \n', lett);
fclose(fid);

function [ net ] = edu_initnet( x, y, hid_layers )
%EDU_INITNET Initialize the BP network weights, biases and training parameters.
hid_layers = [size( x, 1 ) hid_layers size( y, 1 )];
net.weights = cell( 1, length( hid_layers ) - 1 );
net.bias = net.weights;
for i = 2:length( hid_layers )
    net.weights{i-1} = rand( hid_layers( i ), ...
        hid_layers( i - 1 ) ) * 0.8 - 0.4;
    net.bias{i-1} = rand( hid_layers( i ), 1 ) * 0.8 - 0.4;
end
net.lr = 1.5;        % initial learning rate
net.mc = 0.95;       % momentum
net.goal = 0.01;     % error goal
net.num_itr = 5000;  % maximum number of iterations
net.reset = 400;     % learning-rate reset period
net.max_fail = 100;  % maximum number of validation failures
end

function [net] = edu_trainnet( net, x, y )
% Train the BP network with an 80/20 train/validation split and early stopping.
rand_indx = randperm( size( x, 2 ) );
num_samples = size( x, 2 );
x = x( :, rand_indx );
y = y( :, rand_indx );
num_tr = 0.8 * num_samples;
num_v  = 0.2 * num_samples;
lr = net.lr;
train_x = x( :, 1:num_tr );
train_y = y( :, 1:num_tr );
val_x = x( :, num_tr+1:num_tr+num_v );
val_y = y( :, num_tr+1:num_tr+num_v );
num_layers = length( net.weights ) + 1;
layer_sum = cell( 1, num_layers );
layer_val = cell( 1, num_layers );
n_train_samples = size( train_x, 2 );
n_val_samples = size( val_x, 2 );
layer_val{1} = train_x;
net.hist_train = [];
net.hist_val = [];
best_net = [];
min_err = inf;
num_fail = 0;
last_mse_err = inf;
str_out = [];
for itr = 1:net.num_itr
    % Forward pass (sigmoid activations) on the training and validation sets
    val_nn_out = val_x;
    for j = 2:num_layers
        layer_sum{j} = net.weights{j-1} * layer_val{j-1} + repmat( ...
            net.bias{j-1}, 1, n_train_samples );
        layer_val{j} = 1 ./ ( 1 + exp( -1 * layer_sum{j} ) );
        val_nn_out = 1 ./ ( 1 + exp( -1 * ( net.weights{j-1} * val_nn_out + repmat( ...
            net.bias{j-1}, 1, n_val_samples ) ) ) );
    end
    mse_err_train = sum( sum( ( train_y - layer_val{end} ) .^ 2 ) ) / n_train_samples;
    net.hist_train = [net.hist_train mse_err_train];
    mse_err_val = sum( sum( ( val_y - val_nn_out ) .^ 2 ) ) / n_val_samples;
    if mod( itr, 10 ) == 0
        fprintf( repmat( '\b', 1, length( str_out ) ) );
        str_out = sprintf( [ 'msev = %.2f, mset = %.2f,' ...
            ' itr = %.2f, nfail = %.2f, merr = %.2f ' ], ...
            mse_err_val, mse_err_train, itr, num_fail, min_err );
        fprintf( str_out );
    end
    net.hist_val = [net.hist_val mse_err_val];
    if mse_err_train <= net.goal
        net = best_net;
        break
    end
    % Early stopping: count consecutive iterations without validation improvement
    if mse_err_val - 0.00001 >= last_mse_err
        num_fail = num_fail + 1;
        if num_fail > net.max_fail
            net = best_net;
            break;
        end
    else
        num_fail = 0;
        if min_err > mse_err_val
            best_net = net;
            min_err = mse_err_val;
        end
    end
    last_mse_err = mse_err_val;
    % Back-propagate the error and update the biases and weights
    prop_err = layer_val{end} - train_y;
    for j = num_layers:-1:2
        prop_err = prop_err .* layer_val{j} .* ( 1 - layer_val{j} );  % sigmoid derivative
        net.bias{j-1} = net.bias{j-1} - lr * sum( prop_err, 2 );
        u_weights = prop_err * layer_val{j-1}';
        prop_err = net.weights{j-1}' * prop_err;
        net.weights{j-1} = net.weights{j-1} - lr * u_weights;
    end
end

function [y] = edu_simnet( net, x )
% Propagate the input columns of x through the trained network.
num_layers = size( net.weights, 2 );
num_samples = size( x, 2 );
y = x;
for i = 1:num_layers
    y = 1 ./ ( 1 + exp( -1 * ( net.weights{i} * y + repmat( ...
        net.bias{i}, 1, num_samples ) ) ) );
end

function varargout = charGUI4(varargin) %{ H = CHARGUI4 returns the handle to a new CHAR GUI4 or the handle to the existing singleton*. CHARGUI4('CALLBACK',hObject,eventData,handles ,...) calls the local function named CALLBACK in CHARGUI4.M with th e given input arguments. CHARGUI4('Property','Value',...) creates a ne w CHARGUI4 or raises the existing singleton*. Starting from the left, property value pairs are applied to the GUI before charGUI_OpeningFunc tion gets called. An unrecognized property name or invalid value m akes property application stop. All inputs are passed to charGUI4_Open ingFcn via varargin. %} gui_Singleton = 1; gui_State = struct('gui_Name', mfilename, ... 'gui_Singleton', gui_Singleton, ... 'gui_OpeningFcn', @charGUI4_Open ingFcn, ... 'gui_OutputFcn', @charGUI4_Outp utFcn, ...

Page 27: Handwritten Character Recognition Using BP NN, …users.eecs.northwestern.edu/~mvb541/EECS349/Report.pdf · Handwritten Character Recognition Using BP ... The development of handwriting

'gui_LayoutFcn', [] , ... 'gui_Callback', []); if nargin && ischar(varargin{1}) gui_State.gui_Callback = str2func(varargin{1}); end if nargout [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:}); else gui_mainfcn(gui_State, varargin{:}); end % End initialization code - DO NOT EDIT % --- Executes just before charGUI4 is made visible . function charGUI4_OpeningFcn(hObject, eventdata, ha ndles, varargin) % This function has no output args, see OutputFcn. % hObject handle to figure % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % varargin command line arguments to charGUI4 (se e VARARGIN) load data; assignin('base','net',net); % Choose default command line output for charGUI4 handles.output = hObject; % Update handles structure guidata(hObject, handles); % UIWAIT makes charGUI4 wait for user response (see UIRESUME) % uiwait(handles.figure1); % --- Outputs from this function are returned to th e command line. function varargout = charGUI4_OutputFcn(hObject, ev entdata, handles) % varargout cell array for returning output args ( see VARARGOUT); % hObject handle to figure % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Get default command line output from handles stru cture varargout{1} = handles.output; % --- Executes on button press in pbLoadSet. function pbLoad_Callback(hObject, eventdata, handle s) % hObject handle to pbLoadSet (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) [filename, pathname] = uigetfile({'*.bmp';'*.jpg';' *.gif';'*.*'}, 'Pick an Image File'); S = imread([pathname,filename]); axes(handles.axes1); imshow(S); handles.S = S; guidata(hObject, handles); % --- Executes on button press in pbSelect. function pbSelect_Callback(hObject, eventdata, hand les) % hObject handle to pbSelect (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) S = handles.S; axes(handles.axes1); % Selection of location if isfield(handles,'api') handles.api.delete(); rmfield(handles,'api'); rmfield(handles,'hRect'); axes(handles.axes1);

Page 28: Handwritten Character Recognition Using BP NN, …users.eecs.northwestern.edu/~mvb541/EECS349/Report.pdf · Handwritten Character Recognition Using BP ... The development of handwriting

imshow(S); end axes(handles.axes1); sz = size(S); handles.hRect = imrect(gca,[round(sz(2)/2) round(sz (1)/2) 20 20]); % Select object handles.api = iptgetapi(handles.hRect); guidata(hObject, handles); %{ img_crop = imcrop(S); axes(handles.axes2); imshow(img_crop); handles.img_crop = img_crop; %} %{ guidata(hObject, handles); S = handles.S; axes(handles.axes1); img_crop = imcrop(S); axes(handles.axes2); imshow(img_crop); handles.img_crop = img_crop; %} %{ axes(handles.axes12); I=handles.img_crop; Igray = rgb2gray(I); Ibw = im2bw(Igray,graythresh(Igray)); Iedge = edge(uint8(Ibw)); se = strel('square',2); Iedge2 = imdilate(Iedge, se); Ifill= imfill(Iedge2,'holes'); [Ilabel num] = bwlabel(Ifill); Iprops = regionprops(Ilabel); Ibox = [Iprops.BoundingBox]; [y,x]=size(Ibox);% x=x/4;% Ibox = reshape(Ibox,[4 x]);% handles.Ibox=Ibox; imshow(I) selected_col = get(handles.edit5,'string'); selected_col = evalin('base',selected_col); selected_ln = get(handles.edit6,'string'); selected_ln = evalin('base',selected_ln); for cnt = 1:selected_ln * selected_col rectangle('position',Ibox(:,cnt),'edgecolor','r '); end guidata(hObject, handles); %} % --- Executes on button press in pbCrop. function pbCrop_Callback(hObject, eventdata, handle s) handles.loc = handles.api.getPosition(); axes(handles.axes1); S = handles.S; handles.img_crop = imcrop(S,handles.loc); axes(handles.axes12); I=handles.img_crop; Igray = rgb2gray(I); Ibw = im2bw(Igray,graythresh(Igray)); Iedge = edge(uint8(Ibw));

Page 29: Handwritten Character Recognition Using BP NN, …users.eecs.northwestern.edu/~mvb541/EECS349/Report.pdf · Handwritten Character Recognition Using BP ... The development of handwriting

se = strel('square',2); Iedge2 = imdilate(Iedge, se); Ifill= imfill(Iedge2,'holes'); [Ilabel num] = bwlabel(Ifill); Iprops = regionprops(Ilabel); Ibox = [Iprops.BoundingBox]; [y,x]=size(Ibox);% disp(x); x=x/4;% handels.x = x; Ibox = reshape(Ibox,[4 x]);% handles.Ibox=Ibox; imshow(I) [y, selected_col] = size(handles.Ibox); selected_ln = 1; for cnt = 1:selected_ln * selected_col rectangle('position',Ibox(:,cnt),'edgecolor','r' ); end i=1; for cnt = 1:selected_ln * selected_col %rectangle('position',Ibox(:,cnt),'edgecolor',' r'); X=Ibox(:,cnt); a=X(2,1); b=X(1,1); c=X(3,1)+b; d=X(4,1)+a; %handles.cutImg = I(a:d,b:c); if(i==1) i=i+1; cutImg1 = I(a:d,b:c); handles.cutImg1 = cutImg1; axes(handles.axes2); imshow(handles.cutImg1); elseif(i==2) i=i+1; handles.cutImg2 = I(a:d,b:c); axes(handles.axes13); imshow(handles.cutImg2); elseif(i==3) i=i+1; handles.cutImg3 = I(a:d,b:c); axes(handles.axes26); imshow(handles.cutImg3); end end guidata(hObject, handles); % --- Executes on button press in pbPreprocess. function pbPreprocess_Callback(hObject, eventdata, handles) [y, selected_col] = size(handles.Ibox); i=1; while(i<=selected_col) if(i==1) i = i+1; img_crop1 = handles.cutImg1; %imgGray1 = rgb2gray(img_crop1); bw11 = im2bw(img_crop1,graythresh(img_c rop1));


axes(handles.axes31); imshow(bw11); bw12 = edu_imgcrop(bw11); axes(handles.axes40); imshow(bw12); handles.bw12 = bw12; elseif(i==2) i=i+1; img_crop2 = handles.cutImg2; %imgGray2 = rgb2gray(img_crop2); bw21 = im2bw(img_crop2,graythresh(img_ crop2)); axes(handles.axes32); imshow(bw21); bw22 = edu_imgcrop(bw21); axes(handles.axes41); imshow(bw22); handles.bw22 = bw22; elseif(i==3) i=i+1; img_crop3 = handles.cutImg3; %imgGray3 = rgb2gray(img_crop3); bw31 = im2bw(img_crop3,graythresh(img_ crop3)); axes(handles.axes35); imshow(bw31); bw32 = edu_imgcrop(bw31); axes(handles.axes44); imshow(bw32); handles.bw32 = bw32; end end guidata(hObject, handles); % --- Executes on button press in pbExtract. function pbExtract_Callback(hObject, eventdata, han dles) [y, selected_col] = size(handles.Ibox); i=1; while(i<=selected_col) if(i==1) i = i+1; bw12 = handles.bw12; charvec1 = edu_imgresize(bw12); axes(handles.axes5); plotchar(charvec1); handles.charvec1 = charvec1; elseif(i==2) i=i+1; bw22 = handles.bw22; charvec2 = edu_imgresize(bw22); axes(handles.axes49); plotchar(charvec2); handles.charvec2 = charvec2; elseif(i==3) i=i+1; bw32 = handles.bw32; charvec3 = edu_imgresize(bw32); axes(handles.axes50);


plotchar(charvec3); handles.charvec3 = charvec3; end end guidata(hObject, handles); % --- Executes on button press in pbRecognize. function pbRecognize_Callback(hObject, eventdata, h andles) [y, selected_col] = size(handles.Ibox); i=1; while(i<=selected_col) if(i==1) i = i+1; charvec1 = handles.charvec1; tic if (handles.training == 0) selected_net = handles.net; result1 = edu_simnet(selected_net,c harvec1); [val, num1] = max(result1); elseif (handles.training == 1) models = handles.models; numClasses = handles.numClasses; for j=1:35 svm_charvec1(1,j) = charvec1(j, 1); end num1 = edu_testsvm(numClasses,svm_c harvec1,models); end if(num1 == 1) S1 = 'A '; elseif(num1 == 2) S1 = 'B '; elseif(num1 == 3) S1 = 'C '; elseif(num1 == 4) S1 = 'D '; elseif(num1 == 5) S1 = 'E '; elseif(num1 == 6) S1 = 'F '; elseif(num1 == 7) S1 = 'G '; elseif(num1 == 8) S1 = 'H '; elseif(num1 == 9) S1 = 'I '; elseif(num1 == 10) S1 = 'J '; elseif(num1 == 11) S1 = 'K '; elseif(num1 == 12) S1 = 'L '; elseif(num1 == 13) S1 = 'M '; elseif(num1 == 14) S1 = 'N '; elseif(num1 == 15) S1 = 'O '; elseif(num1 == 16) S1 = 'P '; elseif(num1 == 17) S1 = 'Q '; elseif(num1 == 18) S1 = 'R '; elseif(num1 == 19) S1 = 'S '; elseif(num1 == 20)


S1 = 'T '; elseif(num1 == 21) S1 = 'U '; elseif(num1 == 22) S1 = 'V '; elseif(num1 == 23) S1 = 'W '; elseif(num1 == 24) S1 = 'X '; elseif(num1 == 25) S1 = 'Y '; elseif(num1 == 26) S1 = 'Z '; elseif(num1 == 27) S1 = 'a '; elseif(num1 == 28) S1 = 'b '; elseif(num1 == 29) S1 = 'c '; elseif(num1 == 30) S1 = 'd '; elseif(num1 == 31) S1 = 'e '; elseif(num1 == 32) S1 = 'f '; elseif(num1 == 33) S1 = 'g '; elseif(num1 == 34) S1 = 'h '; elseif(num1 == 35) S1 = 'i '; elseif(num1 == 36) S1 = 'j '; elseif(num1 == 37) S1 = 'k '; elseif(num1 == 38) S1 = 'l '; elseif(num1 == 39) S1 = 'm '; elseif(num1 == 40) S1 = 'n '; elseif(num1 == 41) S1 = 'o '; elseif(num1 == 42) S1 = 'p '; elseif(num1 == 43) S1 = 'q '; elseif(num1 == 44) S1 = 'r '; elseif(num1 == 45) S1 = 's '; elseif(num1 == 46) S1 = 't '; elseif(num1 == 47) S1 = 'u '; elseif(num1 == 48) S1 = 'v '; elseif(num1 == 49) S1 = 'w '; elseif(num1 == 50) S1 = 'x '; elseif(num1 == 51) S1 = 'y '; elseif(num1 == 52) S1 = 'z '; end S10 = strcat(S1); toc elseif(i==2) i=i+1;


charvec2 = handles.charvec2; if (handles.training == 0) selected_net = handles.net; result2 = edu_simnet(selected_net,c harvec2); [val, num2] = max(result2); elseif (handles.training == 1) models = handles.models; numClasses = handles.numClasses; for j=1:35 svm_charvec2(1,j) = charvec2(j, 1); end num2 = edu_testsvm(numClasses,svm_c harvec2,models); end if(num2 == 1) S2 = 'A '; elseif(num2 == 2) S2 = 'B '; elseif(num2 == 3) S2 = 'C '; elseif(num2 == 4) S2 = 'D '; elseif(num2 == 5) S2 = 'E '; elseif(num2 == 6) S2 = 'F '; elseif(num2 == 7) S2 = 'G '; elseif(num2 == 8) S2 = 'H '; elseif(num2 == 9) S2 = 'I '; elseif(num2 == 10) S2 = 'J '; elseif(num2 == 11) S2 = 'K '; elseif(num2 == 12) S2 = 'L '; elseif(num2 == 13) S2 = 'M '; elseif(num2 == 14) S2 = 'N '; elseif(num2 == 15) S2 = 'O '; elseif(num2 == 16) S2 = 'P '; elseif(num2 == 17) S2 = 'Q '; elseif(num2 == 18) S2 = 'R '; elseif(num2 == 19) S2 = 'S '; elseif(num2 == 20) S2 = 'T '; elseif(num2 == 21) S2 = 'U '; elseif(num2 == 22) S2 = 'V '; elseif(num2 == 23) S2 = 'W '; elseif(num2 == 24) S2 = 'X '; elseif(num2 == 25) S2 = 'Y '; elseif(num2 == 26) S2 = 'Z '; elseif(num2 == 27) S2 = 'a '; elseif(num2 == 28) S2 = 'b ';


elseif(num2 == 29) S2 = 'c '; elseif(num2 == 30) S2 = 'd '; elseif(num2 == 31) S2 = 'e '; elseif(num2 == 32) S2 = 'f '; elseif(num2 == 33) S2 = 'g '; elseif(num2 == 34) S2 = 'h '; elseif(num2 == 35) S2 = 'i '; elseif(num2 == 36) S2 = 'j '; elseif(num2 == 37) S2 = 'k '; elseif(num2 == 38) S2 = 'l '; elseif(num2 == 39) S2 = 'm '; elseif(num2 == 40) S2 = 'n '; elseif(num2 == 41) S2 = 'o '; elseif(num2 == 42) S2 = 'p '; elseif(num2 == 43) S2 = 'q '; elseif(num2 == 44) S2 = 'r '; elseif(num2 == 45) S2 = 's '; elseif(num2 == 46) S2 = 't '; elseif(num2 == 47) S2 = 'u '; elseif(num2 == 48) S2 = 'v '; elseif(num2 == 49) S2 = 'w '; elseif(num2 == 50) S2 = 'x '; elseif(num2 == 51) S2 = 'y '; elseif(num2 == 52) S2 = 'z '; end S10 = strcat(S1,',',S2) elseif(i==3) i=i+1; charvec3 = handles.charvec3; if (handles.training == 0) selected_net = handles.net; result3 = edu_simnet(selected_net,c harvec3); [val, num3] = max(result3); elseif (handles.training == 1) models = handles.models; numClasses = handles.numClasses; for j=1:35 svm_charvec3(1,j) = charvec3(j, 1); end num3 = edu_testsvm(numClasses,svm_c harvec3,models); end if(num3 == 1) S3 = 'A '; elseif(num3 == 2)


S3 = 'B '; elseif(num3 == 3) S3 = 'C '; elseif(num3 == 4) S3 = 'D '; elseif(num3 == 5) S3 = 'E '; elseif(num3 == 6) S3 = 'F '; elseif(num3 == 7) S3 = 'G '; elseif(num3 == 8) S3 = 'H '; elseif(num3 == 9) S3 = 'I '; elseif(num3 == 10) S3 = 'J '; elseif(num3 == 11) S3 = 'K '; elseif(num3 == 12) S3 = 'L '; elseif(num3 == 13) S3 = 'M '; elseif(num3 == 14) S3 = 'N '; elseif(num3 == 15) S3 = 'O '; elseif(num3 == 16) S3 = 'P '; elseif(num3 == 17) S3 = 'Q '; elseif(num3 == 18) S3 = 'R '; elseif(num3 == 19) S3 = 'S '; elseif(num3 == 20) S3 = 'T '; elseif(num3 == 21) S3 = 'U '; elseif(num3 == 22) S3 = 'V '; elseif(num3 == 23) S3 = 'W '; elseif(num3 == 24) S3 = 'X '; elseif(num3 == 25) S3 = 'Y '; elseif(num3 == 26) S3 = 'Z '; elseif(num3 == 27) S3 = 'a '; elseif(num3 == 28) S3 = 'b '; elseif(num3 == 29) S3 = 'c '; elseif(num3 == 30) S3 = 'd '; elseif(num3 == 31) S3 = 'e '; elseif(num3 == 32) S3 = 'f '; elseif(num3 == 33) S3 = 'g '; elseif(num3 == 34) S3 = 'h '; elseif(num3 == 35) S3 = 'i '; elseif(num3 == 36) S3 = 'j '; elseif(num3 == 37) S3 = 'k ';


elseif(num3 == 38) S3 = 'l '; elseif(num3 == 39) S3 = 'm '; elseif(num3 == 40) S3 = 'n '; elseif(num3 == 41) S3 = 'o '; elseif(num3 == 42) S3 = 'p '; elseif(num3 == 43) S3 = 'q '; elseif(num3 == 44) S3 = 'r '; elseif(num3 == 45) S3 = 's '; elseif(num3 == 46) S3 = 't '; elseif(num3 == 47) S3 = 'u '; elseif(num3 == 48) S3 = 'v '; elseif(num3 == 49) S3 = 'w '; elseif(num3 == 50) S3 = 'x '; elseif(num3 == 51) S3 = 'y '; elseif(num3 == 52) S3 = 'z '; end S10 = strcat(S1,',',S2,',',S3) end end set(handles.editResult, 'string',S10); guidata(hObject, handles); function pbNN_Callback(hObject, eventdata, handles) % --- Executes on button press in pbSVM function SVM_Callback(hObject, eventdata, handles) handles.training = 1; guidata(hObject, handles); % hObject handle to editNN (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Hints: get(hObject,'String') returns contents of editNN as text % str2double(get(hObject,'String')) returns contents of editNN as a double % --- Executes during object creation, after settin g all properties. function SVM_CreateFcn(hObject, eventdata, handles) % hObject handle to editNN (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor')) set(hObject,'BackgroundColor','white'); end % --- Executes on button press in pbBP function BP_Callback(hObject, eventdata, handles) handles.training = 0; guidata(hObject, handles); % hObject handle to editNN (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA)


% Hints: get(hObject,'String') returns contents of editNN as text % str2double(get(hObject,'String')) returns contents of editNN as a double % --- Executes during object creation, after settin g all properties. function BP_CreateFcn(hObject, eventdata, handles) % hObject handle to editNN (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor')) set(hObject,'BackgroundColor','white'); end function editResult_Callback(hObject, eventdata, ha ndles) % hObject handle to editResult (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Hints: get(hObject,'String') returns contents of editResult as text % str2double(get(hObject,'String')) returns contents of editResult as a double % --- Executes during object creation, after settin g all properties. function editResult_CreateFcn(hObject, eventdata, h andles) % hObject handle to editResult (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor')) set(hObject,'BackgroundColor','white'); end % --- Executes on button press in pbLoad Training S et. function pbLoadSet_Callback(hObject, eventdata, han dles) % hObject handle to pushbutton6 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) [filename, pathname] = uigetfile({'*.bmp';'*.jpg';' *.gif';'*.*'}, 'Pick an Image File'); trimg = imread([pathname,filename]); selected_ln = 5; selected_col = 52; img = edu_imgpreprocess(trimg,selected_col,selected _ln); for cnt = 1:selected_ln * selected_col bw2 = edu_imgcrop(img{cnt}); charvec = edu_imgresize(bw2); out(:,cnt) = charvec; svm_out(cnt,:) = charvec; end if (handles.training == 0) P = out(:,1:208); T = [eye(52) eye(52) eye(52) eye(52)]; elseif (handles.training == 1) TrainingSet = svm_out(1:208,:); Labels = [1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 1 6 17 18 19 20 21 22 23 24 25 26 27 28 ... 29 30 31 32 33 34 35 36 37 38 39 40 41 42 4 3 44 45 46 47 48 49 50 51 52]; GroupTrain = [Labels Labels Labels Labels]; end if (handles.training == 0) net = edu_createnn(P,T); handles.net = net; elseif (handles.training == 1) tic


[models,numClasses] = edu_multisvm(TrainingSet, GroupTrain); toc handles.models = models; handles.numClasses = numClasses; end guidata(hObject, handles); function edit5_Callback(hObject, eventdata, handles ) % hObject handle to edit5 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Hints: get(hObject,'String') returns contents of edit5 as text % str2double(get(hObject,'String')) returns contents of edit5 as a double % --- Executes during object creation, after settin g all properties. function edit5_CreateFcn(hObject, eventdata, handle s) % hObject handle to edit5 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor')) set(hObject,'BackgroundColor','white'); end function edit6_Callback(hObject, eventdata, handles ) % hObject handle to edit6 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Hints: get(hObject,'String') returns contents of edit6 as text % str2double(get(hObject,'String')) returns contents of edit6 as a double % --- Executes during object creation, after settin g all properties. function edit6_CreateFcn(hObject, eventdata, handle s) % hObject handle to edit6 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor')) set(hObject,'BackgroundColor','white'); end function edit9_Callback(hObject, eventdata, handles ) % hObject handle to edit9 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Hints: get(hObject,'String') returns contents of edit9 as text % str2double(get(hObject,'String')) returns contents of edit9 as a double % --- Executes during object creation, after settin g all properties. function edit9_CreateFcn(hObject, eventdata, handle s) % hObject handle to edit9 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor')) set(hObject,'BackgroundColor','white'); end % --- Executes on key press with focus on pbLoad an d none of its controls. function pbLoad_KeyPressFcn(hObject, eventdata, han dles) % hObject handle to pbLoad (see GCBO)


% eventdata structure with the following fields (s ee MATLAB.UI.CONTROL.UICONTROL) % Key: name of the key that was pressed, in lower c ase % Character: character interpretation of the key(s) that was pressed % Modifier: name(s) of the modifier key(s) (i.e., c ontrol, shift) pressed % handles structure with handles and user data ( see GUIDATA) % --- Executes on key press with focus on editNN an d none of its controls. function editNN_KeyPressFcn(hObject, eventdata, han dles) % hObject handle to editNN (see GCBO) % eventdata structure with the following fields (s ee MATLAB.UI.CONTROL.UICONTROL) % Key: name of the key that was pressed, in lower c ase % Character: character interpretation of the key(s) that was pressed % Modifier: name(s) of the modifier key(s) (i.e., c ontrol, shift) pressed % handles structure with handles and user data ( see GUIDATA)
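For completeness, below is a minimal launch sketch for this GUI, assuming P and T have been built exactly as in pbLoadSet_Callback; charGUI4_OpeningFcn above expects a MAT-file named data.mat that contains the trained network variable net. This is only an illustration of the calling convention visible in the listing, not additional code from the project.

% Hypothetical launch sketch (assumes P and T built as in pbLoadSet_Callback):
net = edu_createnn(P, T);   % train the Back Propagation network
save data net               % charGUI4_OpeningFcn runs "load data" at startup
charGUI4                    % open the recognition GUI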

o Method ‘2’: LAMSTAR Neural Network

function img = edu_imgpreprocess(I, selected_col, selected_ln)
Igray = rgb2gray(I);
Ibw = im2bw(Igray, graythresh(Igray));
Iedge = edge(uint8(Ibw));
se = strel('square', 3);
Iedge2 = imdilate(Iedge, se);
Ifill = imfill(Iedge2, 'holes');
[Ilabel num] = bwlabel(Ifill);
Iprops = regionprops(Ilabel);
Ibox = [Iprops.BoundingBox];
[y, x] = size(Ibox);
x = x/4;
Ibox = reshape(Ibox, [4 x]);
Ic = [Iprops.Centroid];
[z, w] = size(Ic);
w = w/2;
Ic = reshape(Ic, [2 w]);
Ic = Ic';
Ic(:,3) = (mean(Ic.^2, 2)).^(1/2);
Ic(:,4) = 1:w;
Ic2 = sortrows(Ic, 2);
for cnt = 1:selected_ln
    Ic2((cnt-1)*selected_col+1:cnt*selected_col, :) = sortrows(Ic2((cnt-1)*selected_col+1:cnt*selected_col, :), 4);
end
Ic3 = Ic2(:, 1:2);
ind = Ic2(:, 4);
for cnt = 1:selected_ln*selected_col
    img{cnt} = imcrop(Ibw, Ibox(:, ind(cnt)));
end
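For reference, the sketch below shows how this segmentation routine is meant to be driven; it mirrors the call in pbLoadSet_Callback above (five rows of 52 characters per sheet). The file name training_sheet.bmp and the variable names are only illustrative assumptions.

% Hypothetical usage sketch (file name and variable names are assumptions):
trimg = imread('training_sheet.bmp');            % scanned RGB sheet of handwriting
selected_ln = 5;                                 % rows of characters on the sheet
selected_col = 52;                               % characters per row (A-Z, a-z)
cells = edu_imgpreprocess(trimg, selected_col, selected_ln);
figure, imshow(cells{1});                        % first segmented character (binary)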

function bw2 = edu_imgcrop(bw)
% Find the boundary of the image
[y2temp x2temp] = size(bw);
x1 = 1;
y1 = 1;
x2 = x2temp;
y2 = y2temp;

% Finding left side blank spaces
cntB = 1;
while (sum(bw(:,cntB)) == y2temp)
    x1 = x1+1;
    cntB = cntB+1;
end
% Finding right side blank spaces
cntB = 1;
while (sum(bw(cntB,:)) == x2temp)
    y1 = y1+1;
    cntB = cntB+1;
end
% Finding upper side blank spaces
cntB = x2temp;
while (sum(bw(:,cntB)) == y2temp)
    x2 = x2-1;
    cntB = cntB-1;
end
% Finding lower side blank spaces
cntB = y2temp;
while (sum(bw(cntB,:)) == x2temp)
    y2 = y2-1;
    cntB = cntB-1;
end
% Crop the image to the edge
bw2 = imcrop(bw,[x1,y1,(x2-x1),(y2-y1)]);

function lett = edu_imgresize(bw2)
% This function will take the cropped binary image and change it to a 7 x 5
% character representation in a single vector.
bw_7050 = imresize(bw2,[70,50]);
for cnt = 1:7
    for cnt2 = 1:5
        Atemp = sum(bw_7050((cnt*10-9:cnt*10),(cnt2*10-9:cnt2*10)));
        lett((cnt-1)*5+cnt2) = sum(Atemp);
    end
end
lett = ((100-lett)/100);
fid = fopen('imageresize.txt','w');
lett = lett';
fprintf(fid,'%6.2f \n',lett);
fclose(fid);
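A minimal sketch of the feature-extraction chain that feeds the classifiers follows, assuming cells is the cell array returned by edu_imgpreprocess in the earlier sketch; plotchar is the Neural Network Toolbox helper already used by the GUI callbacks.

% Hypothetical usage sketch: one segmented cell -> 35-element feature vector.
bw = cells{1};                  % binary character image from edu_imgpreprocess
bw2 = edu_imgcrop(bw);          % trim blank rows/columns around the character
charvec = edu_imgresize(bw2);   % 35 x 1 vector of normalized 10x10-block densities
plotchar(charvec);              % visualize the 7 x 5 representation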

function edu_LAMSTARtrain(train_pattern)
X = train_pattern;
nu = 0.04;
k = 1;
n = 12 % Number of subwords
flag = zeros(1, n);
% To make 12 subwords from 1 input
for i = 1:min(size(X)),
    X_r{i} = reshape(X(:,i), 6, 8);
    for k = 1:6:2
        for m = 1:8:2
            for j = 1:n,
                temp = X_r{i}(k:k+1, m:m+1);
                temp = reshape(temp, 1, 4)
                X_in{i}(j,:) = temp(1:4)';
            end
        end
    end
    % To check if a subword is all '0's and makes it normalized value equal to zero
    % and to normalize all other input subwords
    p(1,:) = zeros(1, 8);
    for k = 1:n,
        for t = 1:4,
            if (X_in{i}(k,t) ~= p(1,t)),
                X_norm{i}(k,:) = X_in{i}(k,:)/sqrt(sum(X_in{i}(k,:).^2));
            else
                X_norm{i}(k,:) = zeros(1, 4);
            end
        end
    end
end %%% End of for

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Dynamic Building of neurons
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Building of the first neuron is done as Kohonen Layer neuron
% (this is for all the subwords in the first input pattern for all SOM modules)
i = 1;
ct = 1;
while (i <= n),
    i
    cl = 0;
    for t = 1:4,
        if (X_norm{ct}(i,t) == 0),
            cl = cl+1;
        end
    end
    if (cl == 4),
        Z{ct}(i) = 0;
    elseif (flag(i) == 0),
        W{i}(:,ct) = rand(4,1);
        flag(i) = ct;
        W_norm{i}(:,ct) = W{i}(:,ct)/sqrt(sum(W{i}(:,ct).^2));
        Z{ct}(i) = X_norm{ct}(i,:)*W_norm{i};
        alpha = 0.8;
        tol = 1e-5;
        while (Z{ct}(i) <= (1-tol)),
            W_norm{i}(:,ct) = W_norm{i}(:,ct) + alpha*(X_norm{ct}(i,:)' - W_norm{i}(:,ct));
            Z{ct}(i) = X_norm{ct}(i,:)*W_norm{i}(:,ct);
        end %%%%% End of while
    end %%%% End of if
    r(ct,i) = 1;
    i = i+1;
end %%%% End of while
r(ct,:) = 1;
ct = ct+1;

while (ct <= min(size(X))),
    for i = 1:n,
        cl = 0;
        for t = 1:4,
            if (X_norm{ct}(i,t) == 0),
                cl = cl+1;
            end
        end
        if (cl == 4),
            Z{ct}(i) = 0;
        else
            i
            r(ct,i) = flag(i);
            r_new = 0;
            for k = 1:max(r(ct,i)),
                Z{ct}(i) = X_norm{ct}(i,:)*W_norm{i}(:,k);
                if Z{ct}(i) >= 0.95,
                    r_new = k;
                    flag(i) = r_new;
                    r(ct,i) = flag(i);
                    break;
                end %%% End of if
            end %%%%%%% End of for
            if (r_new == 0),
                flag(i) = flag(i)+1;
                r(ct,i) = flag(i);
                W{i}(:,r(ct,i)) = rand(4,1);
                %flag(i) = r
                W_norm{i}(:,r(ct,i)) = W{i}(:,r(ct,i))/sqrt(sum(W{i}(:,r(ct,i)).^2));
                Z{ct}(i) = X_norm{ct}(i,:)*W_norm{i}(:,r(ct,i));
                alpha = 0.8;
                tol = 1e-5;
                while (Z{ct}(i) <= (1-tol)),
                    W_norm{i}(:,r(ct,i)) = W_norm{i}(:,r(ct,i)) + alpha*(X_norm{ct}(i,:)' - W_norm{i}(:,r(ct,i)));
                    Z{ct}(i) = X_norm{ct}(i,:)*W_norm{i}(:,r(ct,i));
                end %%% End of while
            end %%% End of if
            %r_new
            %disp('Flag')
            %flag(i)
        end %%%% End of if
    end
    ct = ct+1;
end
save W_norm W_norm

for i = 1:156:52, % 156 = 3*52
    d(i,:)    = [0 0 0 0 0 0];
    d(i+1,:)  = [0 0 0 0 0 1];
    d(i+2,:)  = [0 0 0 0 1 0];
    d(i+3,:)  = [0 0 0 0 1 1];
    d(i+4,:)  = [0 0 0 1 0 0];
    d(i+5,:)  = [0 0 0 1 0 1];
    d(i+6,:)  = [0 0 0 1 1 0];
    d(i+7,:)  = [0 0 0 1 1 1];
    d(i+8,:)  = [0 0 1 0 0 0];
    d(i+9,:)  = [0 0 1 0 0 1];
    d(i+10,:) = [0 0 1 0 1 0];
    d(i+11,:) = [0 0 1 0 1 1];
    d(i+12,:) = [0 0 1 1 0 0];
    d(i+13,:) = [0 0 1 1 0 1];
    d(i+14,:) = [0 0 1 1 1 0];
    d(i+15,:) = [0 0 1 1 1 1];
    d(i+16,:) = [0 1 0 0 0 0];
    d(i+17,:) = [0 1 0 0 0 1];
    d(i+18,:) = [0 1 0 0 1 0];
    d(i+19,:) = [0 1 0 0 1 1];
    d(i+20,:) = [0 1 0 1 0 0];
    d(i+21,:) = [0 1 0 1 0 1];
    d(i+22,:) = [0 1 0 1 1 0];
    d(i+23,:) = [0 1 0 1 1 1];
    d(i+24,:) = [0 1 1 0 0 0];
    d(i+25,:) = [0 1 1 0 0 1];
    d(i+26,:) = [1 0 0 1 1 0];
    d(i+27,:) = [1 0 0 1 1 1];
    d(i+28,:) = [1 0 1 0 0 0];
    d(i+29,:) = [1 0 1 0 0 1];
    d(i+30,:) = [1 0 1 0 1 0];
    d(i+31,:) = [1 0 1 0 1 1];
    d(i+32,:) = [1 0 1 1 0 0];
    d(i+33,:) = [1 0 1 1 0 1];
    d(i+34,:) = [1 0 1 1 1 0];
    d(i+35,:) = [1 0 1 1 1 1];
    d(i+36,:) = [1 1 0 0 0 0];
    d(i+37,:) = [1 1 0 0 0 1];
    d(i+38,:) = [1 1 0 0 1 0];
    d(i+39,:) = [1 1 0 0 1 1];
    d(i+40,:) = [1 1 0 1 0 0];
    d(i+41,:) = [1 1 0 1 0 1];
    d(i+42,:) = [1 1 0 1 1 0];
    d(i+43,:) = [1 1 0 1 1 1];
    d(i+44,:) = [1 1 1 0 0 0];
    d(i+45,:) = [1 1 1 0 0 1];
    d(i+46,:) = [1 1 1 0 1 0];
    d(i+47,:) = [1 1 1 0 1 1];
    d(i+48,:) = [1 1 1 1 0 0];
    d(i+49,:) = [1 1 1 1 0 1];
    d(i+50,:) = [1 1 1 1 1 0];
    d(i+51,:) = [1 1 1 1 1 1];
end

%%%%%%%%%%%%%%%
% Link Weights
%%%%%%%%%%%%%%%
ct = 1;
m_r = max(r);
for i = 1:n,
    L_w{i} = zeros(m_r(i), 6);
end
ct = 1;
%%% Link weights and output calculations
Z_out = zeros(52, 6);
while (ct <= 52),
    ct
    %for mn = 1:6
    L = zeros(8, 6);
    for i = 1:n,
        if (r(ct,i) ~= 0),
            for j = 1:6,
                if (d(ct,j) == 0),
                    L_w{i}(r(ct,i),j) = L_w{i}(r(ct,i),j) - 0.05*20; % /i ;
                else
                    L_w{i}(r(ct,i),j) = L_w{i}(r(ct,i),j) + 0.05*20; % /i ;
                end %% End if loop
            end %%% End for loop
            L(i,:) = L_w{i}(r(ct,i),:);
        end %%% End for loop
    end %%% End for loop
    Z_out(ct,:) = sum(L);
    ct = ct+1;
end
save L_w L_w

function edu_LAMSTARtest(test_pattern)
X = test_pattern;
nu = 0.04;
k = 1;
figure(1)
load W_norm
load L_w
% To make 12 subwords from 1 input
for i = 1:min(size(X)),
    X_r{i} = reshape(X(:,i), 6, 8);
    for k = 1:6:2
        for m = 1:8:2
            for j = 1:n,
                temp = X_r{i}(k:k+1, m:m+1);
                temp = reshape(temp, 1, 4)
                X_in{i}(j,:) = temp(1:4)';
            end
        end
    end
    p(1,:) = zeros(1, 4);
    for k = 1:12,
        for t = 1:4,
            if (X_in{i}(k,t) ~= p(1,t)),
                X_norm{i}(k,:) = X_in{i}(k,:)/sqrt(sum(X_in{i}(k,:).^2));
            else
                X_norm{i}(k,:) = zeros(1, 4);
            end
        end
    end
    for k = 1:12,
        Z = X_norm{i}(k,:)*W_norm{k};
        if (max(Z) == 0),
            Z_out(k,:) = [0 0];
        else
            index(k) = find(Z == max(Z));
            L(k,:) = L_w{k}(index(k),:);
            Z_out(k,:) = L(k,:)*Z(index(k));
        end
    end
    final_Z(i) = sum(Z_out)
end
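A minimal sketch of how these two routines are wired together follows, assuming the conventions visible in the listing: each pattern is a column with 48 entries (the code reshapes every column to a 6 x 8 block before splitting it into subwords), training saves W_norm.mat and L_w.mat, and testing reloads them. The random placeholder data is an assumption for illustration only, and, as listed, edu_LAMSTARtest also relies on the subword count n being in scope.

% Hypothetical usage sketch with placeholder data (not the report's real dataset).
train_pattern = double(rand(48, 208) > 0.5);   % 4 sets x 52 characters, 48-entry columns
edu_LAMSTARtrain(train_pattern);               % builds the SOM modules, saves W_norm / L_w
test_pattern = double(rand(48, 1) > 0.5);      % one character to classify
edu_LAMSTARtest(test_pattern);                 % displays final_Z for the test pattern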

function varargout = charGUI4(varargin) %{ H = CHARGUI4 returns the handle to a new CHAR GUI4 or the handle to the existing singleton*. CHARGUI4('CALLBACK',hObject,eventData,handles ,...) calls the local function named CALLBACK in CHARGUI4.M with th e given input arguments. CHARGUI4('Property','Value',...) creates a ne w CHARGUI4 or raises the existing singleton*. Starting from the left, property value pairs are applied to the GUI before charGUI_OpeningFunc tion gets called. An unrecognized property name or invalid value m akes property application stop. All inputs are passed to charGUI4_Open ingFcn via varargin. %} gui_Singleton = 1; gui_State = struct('gui_Name', mfilename, ... 'gui_Singleton', gui_Singleton, ... 'gui_OpeningFcn', @charGUI4_Open ingFcn, ... 'gui_OutputFcn', @charGUI4_Outp utFcn, ... 'gui_LayoutFcn', [] , ... 'gui_Callback', []); if nargin && ischar(varargin{1}) gui_State.gui_Callback = str2func(varargin{1}); end if nargout [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:}); else gui_mainfcn(gui_State, varargin{:}); end % End initialization code - DO NOT EDIT % --- Executes just before charGUI4 is made visible . function charGUI4_OpeningFcn(hObject, eventdata, ha ndles, varargin) % This function has no output args, see OutputFcn. % hObject handle to figure % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % varargin command line arguments to charGUI4 (se e VARARGIN) load data; assignin('base','net',net); % Choose default command line output for charGUI4 handles.output = hObject; % Update handles structure


guidata(hObject, handles); % UIWAIT makes charGUI4 wait for user response (see UIRESUME) % uiwait(handles.figure1); % --- Outputs from this function are returned to th e command line. function varargout = charGUI4_OutputFcn(hObject, ev entdata, handles) % varargout cell array for returning output args ( see VARARGOUT); % hObject handle to figure % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Get default command line output from handles stru cture varargout{1} = handles.output; % --- Executes on button press in pbLoadSet. function pbLoad_Callback(hObject, eventdata, handle s) % hObject handle to pbLoadSet (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) [filename, pathname] = uigetfile({'*.bmp';'*.jpg';' *.gif';'*.*'}, 'Pick an Image File'); S = imread([pathname,filename]); axes(handles.axes1); imshow(S); handles.S = S; guidata(hObject, handles); % --- Executes on button press in pbSelect. function pbSelect_Callback(hObject, eventdata, hand les) % hObject handle to pbSelect (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) S = handles.S; axes(handles.axes1); % Selection of location if isfield(handles,'api') handles.api.delete(); rmfield(handles,'api'); rmfield(handles,'hRect'); axes(handles.axes1); imshow(S); end axes(handles.axes1); sz = size(S); handles.hRect = imrect(gca,[round(sz(2)/2) round(sz (1)/2) 20 20]); % Select object handles.api = iptgetapi(handles.hRect); guidata(hObject, handles); %{ img_crop = imcrop(S); axes(handles.axes2); imshow(img_crop); handles.img_crop = img_crop; %} %{ guidata(hObject, handles); S = handles.S; axes(handles.axes1); img_crop = imcrop(S); axes(handles.axes2); imshow(img_crop); handles.img_crop = img_crop; %}


%{ axes(handles.axes12); I=handles.img_crop; Igray = rgb2gray(I); Ibw = im2bw(Igray,graythresh(Igray)); Iedge = edge(uint8(Ibw)); se = strel('square',2); Iedge2 = imdilate(Iedge, se); Ifill= imfill(Iedge2,'holes'); [Ilabel num] = bwlabel(Ifill); Iprops = regionprops(Ilabel); Ibox = [Iprops.BoundingBox]; [y,x]=size(Ibox);% x=x/4;% Ibox = reshape(Ibox,[4 x]);% handles.Ibox=Ibox; imshow(I) selected_col = get(handles.edit5,'string'); selected_col = evalin('base',selected_col); selected_ln = get(handles.edit6,'string'); selected_ln = evalin('base',selected_ln); for cnt = 1:selected_ln * selected_col rectangle('position',Ibox(:,cnt),'edgecolor','r '); end guidata(hObject, handles); %} % --- Executes on button press in pbCrop. function pbCrop_Callback(hObject, eventdata, handle s) handles.loc = handles.api.getPosition(); axes(handles.axes1); S = handles.S; handles.img_crop = imcrop(S,handles.loc); axes(handles.axes12); I=handles.img_crop; Igray = rgb2gray(I); Ibw = im2bw(Igray,graythresh(Igray)); Iedge = edge(uint8(Ibw)); se = strel('square',2); Iedge2 = imdilate(Iedge, se); Ifill= imfill(Iedge2,'holes'); [Ilabel num] = bwlabel(Ifill); Iprops = regionprops(Ilabel); Ibox = [Iprops.BoundingBox]; [y,x]=size(Ibox);% disp(x); x=x/4;% handels.x = x; Ibox = reshape(Ibox,[4 x]);% handles.Ibox=Ibox; imshow(I) [y, selected_col] = size(handles.Ibox); selected_ln = 1; for cnt = 1:selected_ln * selected_col rectangle('position',Ibox(:,cnt),'edgecolor','r' ); end i=1; for cnt = 1:selected_ln * selected_col %rectangle('position',Ibox(:,cnt),'edgecolor',' r');


X=Ibox(:,cnt); a=X(2,1); b=X(1,1); c=X(3,1)+b; d=X(4,1)+a; %handles.cutImg = I(a:d,b:c); if(i==1) i=i+1; cutImg1 = I(a:d,b:c); handles.cutImg1 = cutImg1; axes(handles.axes2); imshow(handles.cutImg1); elseif(i==2) i=i+1; handles.cutImg2 = I(a:d,b:c); axes(handles.axes13); imshow(handles.cutImg2); elseif(i==3) i=i+1; handles.cutImg3 = I(a:d,b:c); axes(handles.axes26); imshow(handles.cutImg3); end end guidata(hObject, handles); % --- Executes on button press in pbPreprocess. function pbPreprocess_Callback(hObject, eventdata, handles) [y, selected_col] = size(handles.Ibox); i=1; while(i<=selected_col) if(i==1) i = i+1; img_crop1 = handles.cutImg1; %imgGray1 = rgb2gray(img_crop1); bw11 = im2bw(img_crop1,graythresh(img_c rop1)); axes(handles.axes31); imshow(bw11); bw12 = edu_imgcrop(bw11); axes(handles.axes40); imshow(bw12); handles.bw12 = bw12; elseif(i==2) i=i+1; img_crop2 = handles.cutImg2; %imgGray2 = rgb2gray(img_crop2); bw21 = im2bw(img_crop2,graythresh(img_ crop2)); axes(handles.axes32); imshow(bw21); bw22 = edu_imgcrop(bw21); axes(handles.axes41); imshow(bw22); handles.bw22 = bw22; elseif(i==3) i=i+1; img_crop3 = handles.cutImg3;


%imgGray3 = rgb2gray(img_crop3); bw31 = im2bw(img_crop3,graythresh(img_ crop3)); axes(handles.axes35); imshow(bw31); bw32 = edu_imgcrop(bw31); axes(handles.axes44); imshow(bw32); handles.bw32 = bw32; end end guidata(hObject, handles); % --- Executes on button press in pbExtract. function pbExtract_Callback(hObject, eventdata, han dles) [y, selected_col] = size(handles.Ibox); i=1; while(i<=selected_col) if(i==1) i = i+1; bw12 = handles.bw12; charvec1 = edu_imgresize(bw12); axes(handles.axes5); plotchar(charvec1); handles.charvec1 = charvec1; elseif(i==2) i=i+1; bw22 = handles.bw22; charvec2 = edu_imgresize(bw22); axes(handles.axes49); plotchar(charvec2); handles.charvec2 = charvec2; elseif(i==3) i=i+1; bw32 = handles.bw32; charvec3 = edu_imgresize(bw32); axes(handles.axes50); plotchar(charvec3); handles.charvec3 = charvec3; end end guidata(hObject, handles); % --- Executes on button press in pbRecognize. function pbRecognize_Callback(hObject, eventdata, h andles) [y, selected_col] = size(handles.Ibox); i=1; while(i<=selected_col) if(i==1) i = i+1; charvec1 = handles.charvec1; tic %edu_LAMSTARtest(charvec1); if (handles.training == 0) selected_net = handles.net; result1 = edu_simnet(selected_net,c harvec1); [val, num1] = max(result1); elseif (handles.training == 1) models = handles.models;


numClasses = handles.numClasses; for j=1:35 svm_charvec1(1,j) = charvec1(j, 1); end num1 = edu_testsvm(numClasses,svm_c harvec1,models); end if(num1 == 1) S1 = 'A '; elseif(num1 == 2) S1 = 'B '; elseif(num1 == 3) S1 = 'C '; elseif(num1 == 4) S1 = 'D '; elseif(num1 == 5) S1 = 'E '; elseif(num1 == 6) S1 = 'F '; elseif(num1 == 7) S1 = 'G '; elseif(num1 == 8) S1 = 'H '; elseif(num1 == 9) S1 = 'I '; elseif(num1 == 10) S1 = 'J '; elseif(num1 == 11) S1 = 'K '; elseif(num1 == 12) S1 = 'L '; elseif(num1 == 13) S1 = 'M '; elseif(num1 == 14) S1 = 'N '; elseif(num1 == 15) S1 = 'O '; elseif(num1 == 16) S1 = 'P '; elseif(num1 == 17) S1 = 'Q '; elseif(num1 == 18) S1 = 'R '; elseif(num1 == 19) S1 = 'S '; elseif(num1 == 20) S1 = 'T '; elseif(num1 == 21) S1 = 'U '; elseif(num1 == 22) S1 = 'V '; elseif(num1 == 23) S1 = 'W '; elseif(num1 == 24) S1 = 'X '; elseif(num1 == 25) S1 = 'Y '; elseif(num1 == 26) S1 = 'Z '; elseif(num1 == 27) S1 = 'a '; elseif(num1 == 28) S1 = 'b '; elseif(num1 == 29) S1 = 'c '; elseif(num1 == 30) S1 = 'd '; elseif(num1 == 31) S1 = 'e '; elseif(num1 == 32) S1 = 'f ';


elseif(num1 == 33) S1 = 'g '; elseif(num1 == 34) S1 = 'h '; elseif(num1 == 35) S1 = 'i '; elseif(num1 == 36) S1 = 'j '; elseif(num1 == 37) S1 = 'k '; elseif(num1 == 38) S1 = 'l '; elseif(num1 == 39) S1 = 'm '; elseif(num1 == 40) S1 = 'n '; elseif(num1 == 41) S1 = 'o '; elseif(num1 == 42) S1 = 'p '; elseif(num1 == 43) S1 = 'q '; elseif(num1 == 44) S1 = 'r '; elseif(num1 == 45) S1 = 's '; elseif(num1 == 46) S1 = 't '; elseif(num1 == 47) S1 = 'u '; elseif(num1 == 48) S1 = 'v '; elseif(num1 == 49) S1 = 'w '; elseif(num1 == 50) S1 = 'x '; elseif(num1 == 51) S1 = 'y '; elseif(num1 == 52) S1 = 'z '; end S10 = strcat(S1); toc elseif(i==2) i=i+1; charvec2 = handles.charvec2; if (handles.training == 0) selected_net = handles.net; result2 = edu_simnet(selected_net,c harvec2); [val, num2] = max(result2); elseif (handles.training == 1) models = handles.models; numClasses = handles.numClasses; for j=1:35 svm_charvec2(1,j) = charvec2(j, 1); end num2 = edu_testsvm(numClasses,svm_c harvec2,models); end if(num2 == 1) S2 = 'A '; elseif(num2 == 2) S2 = 'B '; elseif(num2 == 3) S2 = 'C '; elseif(num2 == 4) S2 = 'D '; elseif(num2 == 5) S2 = 'E ';


elseif(num2 == 6) S2 = 'F '; elseif(num2 == 7) S2 = 'G '; elseif(num2 == 8) S2 = 'H '; elseif(num2 == 9) S2 = 'I '; elseif(num2 == 10) S2 = 'J '; elseif(num2 == 11) S2 = 'K '; elseif(num2 == 12) S2 = 'L '; elseif(num2 == 13) S2 = 'M '; elseif(num2 == 14) S2 = 'N '; elseif(num2 == 15) S2 = 'O '; elseif(num2 == 16) S2 = 'P '; elseif(num2 == 17) S2 = 'Q '; elseif(num2 == 18) S2 = 'R '; elseif(num2 == 19) S2 = 'S '; elseif(num2 == 20) S2 = 'T '; elseif(num2 == 21) S2 = 'U '; elseif(num2 == 22) S2 = 'V '; elseif(num2 == 23) S2 = 'W '; elseif(num2 == 24) S2 = 'X '; elseif(num2 == 25) S2 = 'Y '; elseif(num2 == 26) S2 = 'Z '; elseif(num2 == 27) S2 = 'a '; elseif(num2 == 28) S2 = 'b '; elseif(num2 == 29) S2 = 'c '; elseif(num2 == 30) S2 = 'd '; elseif(num2 == 31) S2 = 'e '; elseif(num2 == 32) S2 = 'f '; elseif(num2 == 33) S2 = 'g '; elseif(num2 == 34) S2 = 'h '; elseif(num2 == 35) S2 = 'i '; elseif(num2 == 36) S2 = 'j '; elseif(num2 == 37) S2 = 'k '; elseif(num2 == 38) S2 = 'l '; elseif(num2 == 39) S2 = 'm '; elseif(num2 == 40) S2 = 'n '; elseif(num2 == 41)


S2 = 'o '; elseif(num2 == 42) S2 = 'p '; elseif(num2 == 43) S2 = 'q '; elseif(num2 == 44) S2 = 'r '; elseif(num2 == 45) S2 = 's '; elseif(num2 == 46) S2 = 't '; elseif(num2 == 47) S2 = 'u '; elseif(num2 == 48) S2 = 'v '; elseif(num2 == 49) S2 = 'w '; elseif(num2 == 50) S2 = 'x '; elseif(num2 == 51) S2 = 'y '; elseif(num2 == 52) S2 = 'z '; end S10 = strcat(S1,',',S2) elseif(i==3) i=i+1; charvec3 = handles.charvec3; if (handles.training == 0) selected_net = handles.net; result3 = edu_simnet(selected_net,c harvec3); [val, num3] = max(result3); elseif (handles.training == 1) models = handles.models; numClasses = handles.numClasses; for j=1:35 svm_charvec3(1,j) = charvec3(j, 1); end num3 = edu_testsvm(numClasses,svm_c harvec3,models); end if(num3 == 1) S3 = 'A '; elseif(num3 == 2) S3 = 'B '; elseif(num3 == 3) S3 = 'C '; elseif(num3 == 4) S3 = 'D '; elseif(num3 == 5) S3 = 'E '; elseif(num3 == 6) S3 = 'F '; elseif(num3 == 7) S3 = 'G '; elseif(num3 == 8) S3 = 'H '; elseif(num3 == 9) S3 = 'I '; elseif(num3 == 10) S3 = 'J '; elseif(num3 == 11) S3 = 'K '; elseif(num3 == 12) S3 = 'L '; elseif(num3 == 13) S3 = 'M '; elseif(num3 == 14) S3 = 'N ';


elseif(num3 == 15) S3 = 'O '; elseif(num3 == 16) S3 = 'P '; elseif(num3 == 17) S3 = 'Q '; elseif(num3 == 18) S3 = 'R '; elseif(num3 == 19) S3 = 'S '; elseif(num3 == 20) S3 = 'T '; elseif(num3 == 21) S3 = 'U '; elseif(num3 == 22) S3 = 'V '; elseif(num3 == 23) S3 = 'W '; elseif(num3 == 24) S3 = 'X '; elseif(num3 == 25) S3 = 'Y '; elseif(num3 == 26) S3 = 'Z '; elseif(num3 == 27) S3 = 'a '; elseif(num3 == 28) S3 = 'b '; elseif(num3 == 29) S3 = 'c '; elseif(num3 == 30) S3 = 'd '; elseif(num3 == 31) S3 = 'e '; elseif(num3 == 32) S3 = 'f '; elseif(num3 == 33) S3 = 'g '; elseif(num3 == 34) S3 = 'h '; elseif(num3 == 35) S3 = 'i '; elseif(num3 == 36) S3 = 'j '; elseif(num3 == 37) S3 = 'k '; elseif(num3 == 38) S3 = 'l '; elseif(num3 == 39) S3 = 'm '; elseif(num3 == 40) S3 = 'n '; elseif(num3 == 41) S3 = 'o '; elseif(num3 == 42) S3 = 'p '; elseif(num3 == 43) S3 = 'q '; elseif(num3 == 44) S3 = 'r '; elseif(num3 == 45) S3 = 's '; elseif(num3 == 46) S3 = 't '; elseif(num3 == 47) S3 = 'u '; elseif(num3 == 48) S3 = 'v '; elseif(num3 == 49) S3 = 'w '; elseif(num3 == 50)


S3 = 'x '; elseif(num3 == 51) S3 = 'y '; elseif(num3 == 52) S3 = 'z '; end S10 = strcat(S1,',',S2,',',S3) end end set(handles.editResult, 'string',S10); guidata(hObject, handles); function pbNN_Callback(hObject, eventdata, handles) % --- Executes on button press in pbSVM function SVM_Callback(hObject, eventdata, handles) handles.training = 1; guidata(hObject, handles); % hObject handle to editNN (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Hints: get(hObject,'String') returns contents of editNN as text % str2double(get(hObject,'String')) returns contents of editNN as a double % --- Executes during object creation, after settin g all properties. function SVM_CreateFcn(hObject, eventdata, handles) % hObject handle to editNN (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor')) set(hObject,'BackgroundColor','white'); end % --- Executes on button press in pbBP function BP_Callback(hObject, eventdata, handles) handles.training = 0; guidata(hObject, handles); % hObject handle to editNN (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Hints: get(hObject,'String') returns contents of editNN as text % str2double(get(hObject,'String')) returns contents of editNN as a double % --- Executes during object creation, after settin g all properties. function BP_CreateFcn(hObject, eventdata, handles) % hObject handle to editNN (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor')) set(hObject,'BackgroundColor','white'); end function editResult_Callback(hObject, eventdata, ha ndles) % hObject handle to editResult (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Hints: get(hObject,'String') returns contents of editResult as text % str2double(get(hObject,'String')) returns contents of editResult as a double


% --- Executes during object creation, after settin g all properties. function editResult_CreateFcn(hObject, eventdata, h andles) % hObject handle to editResult (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor')) set(hObject,'BackgroundColor','white'); end % --- Executes on button press in pbLoad Training S et. function pbLoadSet_Callback(hObject, eventdata, han dles) % hObject handle to pushbutton6 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) [filename, pathname] = uigetfile({'*.bmp';'*.jpg';' *.gif';'*.*'}, 'Pick an Image File'); trimg = imread([pathname,filename]); selected_ln = 5; selected_col = 52; img = edu_imgpreprocess(trimg,selected_col,selected _ln); for cnt = 1:selected_ln * selected_col bw2 = edu_imgcrop(img{cnt}); charvec = edu_imgresize(bw2); out(:,cnt) = charvec; svm_out(cnt,:) = charvec; end if (handles.training == 0) P = out(:,1:208); T = [eye(52) eye(52) eye(52) eye(52)]; elseif (handles.training == 1) TrainingSet = svm_out(1:208,:); Labels = [1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 1 6 17 18 19 20 21 22 23 24 25 26 27 28 ... 29 30 31 32 33 34 35 36 37 38 39 40 41 42 4 3 44 45 46 47 48 49 50 51 52]; GroupTrain = [Labels Labels Labels Labels]; end if (handles.training == 0) net = edu_createnn(P,T); handles.net = net; elseif (handles.training == 1) tic [models,numClasses] = edu_multisvm(TrainingSet, GroupTrain); toc handles.models = models; handles.numClasses = numClasses; end %LAMSTARtrain(P); guidata(hObject, handles); function edit5_Callback(hObject, eventdata, handles ) % hObject handle to edit5 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Hints: get(hObject,'String') returns contents of edit5 as text % str2double(get(hObject,'String')) returns contents of edit5 as a double % --- Executes during object creation, after settin g all properties. function edit5_CreateFcn(hObject, eventdata, handle s) % hObject handle to edit5 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))


set(hObject,'BackgroundColor','white'); end function edit6_Callback(hObject, eventdata, handles ) % hObject handle to edit6 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Hints: get(hObject,'String') returns contents of edit6 as text % str2double(get(hObject,'String')) returns contents of edit6 as a double % --- Executes during object creation, after settin g all properties. function edit6_CreateFcn(hObject, eventdata, handle s) % hObject handle to edit6 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor')) set(hObject,'BackgroundColor','white'); end function edit9_Callback(hObject, eventdata, handles ) % hObject handle to edit9 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Hints: get(hObject,'String') returns contents of edit9 as text % str2double(get(hObject,'String')) returns contents of edit9 as a double % --- Executes during object creation, after settin g all properties. function edit9_CreateFcn(hObject, eventdata, handle s) % hObject handle to edit9 (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles empty - handles not created until afte r all CreateFcns called % Hint: edit controls usually have a white backgrou nd on Windows. % See ISPC and COMPUTER. if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor')) set(hObject,'BackgroundColor','white'); end % --- Executes on key press with focus on pbLoad an d none of its controls. function pbLoad_KeyPressFcn(hObject, eventdata, han dles) % hObject handle to pbLoad (see GCBO) % eventdata structure with the following fields (s ee MATLAB.UI.CONTROL.UICONTROL) % Key: name of the key that was pressed, in lower c ase % Character: character interpretation of the key(s) that was pressed % Modifier: name(s) of the modifier key(s) (i.e., c ontrol, shift) pressed % handles structure with handles and user data ( see GUIDATA) % --- Executes on key press with focus on editNN an d none of its controls. function editNN_KeyPressFcn(hObject, eventdata, han dles) % hObject handle to editNN (see GCBO) % eventdata structure with the following fields (s ee MATLAB.UI.CONTROL.UICONTROL) % Key: name of the key that was pressed, in lower c ase % Character: character interpretation of the key(s) that was pressed % Modifier: name(s) of the modifier key(s) (i.e., c ontrol, shift) pressed % handles structure with handles and user data ( see GUIDATA)

o Method ‘3’: Support Vector Machine

function img = edu_imgpreprocess(I, selected_col, selected_ln)
Igray = rgb2gray(I);
Ibw = im2bw(Igray, graythresh(Igray));
Iedge = edge(uint8(Ibw));
se = strel('square', 3);
Iedge2 = imdilate(Iedge, se);
Ifill = imfill(Iedge2, 'holes');
[Ilabel num] = bwlabel(Ifill);
Iprops = regionprops(Ilabel);
Ibox = [Iprops.BoundingBox];
[y, x] = size(Ibox);
x = x/4;
Ibox = reshape(Ibox, [4 x]);
Ic = [Iprops.Centroid];
[z, w] = size(Ic);
w = w/2;
Ic = reshape(Ic, [2 w]);
Ic = Ic';
Ic(:,3) = (mean(Ic.^2, 2)).^(1/2);
Ic(:,4) = 1:w;
Ic2 = sortrows(Ic, 2);
for cnt = 1:selected_ln
    Ic2((cnt-1)*selected_col+1:cnt*selected_col, :) = sortrows(Ic2((cnt-1)*selected_col+1:cnt*selected_col, :), 4);
end
Ic3 = Ic2(:, 1:2);
ind = Ic2(:, 4);
for cnt = 1:selected_ln*selected_col
    img{cnt} = imcrop(Ibw, Ibox(:, ind(cnt)));
end

function bw2 = edu_imgcrop(bw)
% Find the boundary of the image
[y2temp x2temp] = size(bw);
x1 = 1;
y1 = 1;
x2 = x2temp;
y2 = y2temp;

% Finding left side blank spaces
cntB = 1;
while (sum(bw(:,cntB)) == y2temp)
    x1 = x1+1;
    cntB = cntB+1;
end
% Finding right side blank spaces
cntB = 1;
while (sum(bw(cntB,:)) == x2temp)
    y1 = y1+1;
    cntB = cntB+1;
end
% Finding upper side blank spaces
cntB = x2temp;
while (sum(bw(:,cntB)) == y2temp)
    x2 = x2-1;
    cntB = cntB-1;
end
% Finding lower side blank spaces
cntB = y2temp;
while (sum(bw(cntB,:)) == x2temp)
    y2 = y2-1;
    cntB = cntB-1;
end
% Crop the image to the edge
bw2 = imcrop(bw,[x1,y1,(x2-x1),(y2-y1)]);

function lett = edu_imgresize(bw2)
% This function will take the cropped binary image and change it to a 7 x 5
% character representation in a single vector.
bw_7050 = imresize(bw2,[70,50]);
for cnt = 1:7
    for cnt2 = 1:5
        Atemp = sum(bw_7050((cnt*10-9:cnt*10),(cnt2*10-9:cnt2*10)));
        lett((cnt-1)*5+cnt2) = sum(Atemp);
    end
end
lett = ((100-lett)/100);
fid = fopen('imageresize.txt','w');
lett = lett';
fprintf(fid,'%6.2f \n',lett);
fclose(fid);

function [model] = edu_svmtrain(X, Y, C)
tol = 1e-3;
max_passes = 5;

% Data parameters
m = size(X, 1);
n = size(X, 2);

% Map 0 to -1
Y(Y==0) = -1;

% Variables
alphas = zeros(m, 1);
b = 0;
E = zeros(m, 1);
passes = 0;
eta = 0;
L = 0;
H = 0;
K = X*X';

% Train
dots = 12;
while passes < max_passes,
    num_changed_alphas = 0;
    for i = 1:m,
        % E(i) = b + sum (X(i, :) * (repmat(alphas.*Y,1,n).*X)') - Y(i);
        E(i) = b + sum(alphas.*Y.*K(:,i)) - Y(i);
        if ((Y(i)*E(i) < -tol && alphas(i) < C) || (Y(i)*E(i) > tol && alphas(i) > 0)),
            j = ceil(m * rand());
            while j == i,   % Make sure i \neq j
                j = ceil(m * rand());
            end
            E(j) = b + sum(alphas.*Y.*K(:,j)) - Y(j);
            % Save old alphas
            alpha_i_old = alphas(i);
            alpha_j_old = alphas(j);
            if (Y(i) == Y(j)),
                L = max(0, alphas(j) + alphas(i) - C);
                H = min(C, alphas(j) + alphas(i));
            else
                L = max(0, alphas(j) - alphas(i));
                H = min(C, C + alphas(j) - alphas(i));
            end
            if (L == H),
                % continue to next i.
                continue;
            end
            eta = 2 * K(i,j) - K(i,i) - K(j,j);
            if (eta >= 0),
                % continue to next i.
                continue;
            end
            alphas(j) = alphas(j) - (Y(j) * (E(i) - E(j))) / eta;
            % Clip
            alphas(j) = min(H, alphas(j));
            alphas(j) = max(L, alphas(j));
            % Check if change in alpha is significant
            if (abs(alphas(j) - alpha_j_old) < tol),
                % continue to next i.
                % replace anyway
                alphas(j) = alpha_j_old;
                continue;
            end
            alphas(i) = alphas(i) + Y(i)*Y(j)*(alpha_j_old - alphas(j));
            b1 = b - E(i) - Y(i) * (alphas(i) - alpha_i_old) * K(i,j)' - Y(j) * (alphas(j) - alpha_j_old) * K(i,j)';
            b2 = b - E(j) - Y(i) * (alphas(i) - alpha_i_old) * K(i,j)' - Y(j) * (alphas(j) - alpha_j_old) * K(j,j)';
            if (0 < alphas(i) && alphas(i) < C),
                b = b1;
            elseif (0 < alphas(j) && alphas(j) < C),
                b = b2;
            else
                b = (b1+b2)/2;
            end
            num_changed_alphas = num_changed_alphas + 1;
        end
    end
    if (num_changed_alphas == 0),
        passes = passes + 1;
    else
        passes = 0;
    end
    fprintf('.');
    dots = dots + 1;
    if dots > 78
        dots = 0;
        fprintf('\n');
    end
end
%fprintf(' Done! \n\n');

% Save the model
idx = alphas > 0;
model.X = X(idx,:);
model.y = Y(idx);
model.kernelFunction = kernelFunction;
model.b = b;
model.alphas = alphas(idx);
model.w = ((alphas.*Y)'*X)';
end

function result = edu_testsvm(numClasses, TestSet, models)
result = zeros(length(TestSet(:,1)), 1);
% classify test cases
for k = 1:numClasses
    if (svmclassify(models(k), TestSet))
        break;
    end
end
result = k;
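A minimal sketch of the one-vs-all SVM path as it is driven from the GUI callbacks above follows; edu_multisvm (called in pbLoadSet_Callback but not reproduced here) is assumed to return one binary model per class, and the random placeholder features stand in for the 35-element vectors produced by edu_imgresize.

% Hypothetical usage sketch with placeholder features (shapes follow the GUI code).
TrainingSet = double(rand(208, 35) > 0.5);         % 208 samples (4 sets x 52), 35 features each
GroupTrain = repmat(1:52, 1, 4);                   % class labels 1..52 repeated for the 4 sets
[models, numClasses] = edu_multisvm(TrainingSet, GroupTrain);
testvec = double(rand(1, 35) > 0.5);               % one 35-element test feature row
label = edu_testsvm(numClasses, testvec, models);  % index 1..52 mapped to 'A'..'Z','a'..'z'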

function varargout = charGUI4(varargin) %{ H = CHARGUI4 returns the handle to a new CHAR GUI4 or the handle to the existing singleton*. CHARGUI4('CALLBACK',hObject,eventData,handles ,...) calls the local function named CALLBACK in CHARGUI4.M with th e given input arguments. CHARGUI4('Property','Value',...) creates a ne w CHARGUI4 or raises the existing singleton*. Starting from the left, property value pairs are applied to the GUI before charGUI_OpeningFunc tion gets called. An unrecognized property name or invalid value m akes property application stop. All inputs are passed to charGUI4_Open ingFcn via varargin. %} gui_Singleton = 1; gui_State = struct('gui_Name', mfilename, ... 'gui_Singleton', gui_Singleton, ... 'gui_OpeningFcn', @charGUI4_Open ingFcn, ... 'gui_OutputFcn', @charGUI4_Outp utFcn, ... 'gui_LayoutFcn', [] , ... 'gui_Callback', []); if nargin && ischar(varargin{1}) gui_State.gui_Callback = str2func(varargin{1}); end if nargout [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:}); else gui_mainfcn(gui_State, varargin{:}); end % End initialization code - DO NOT EDIT % --- Executes just before charGUI4 is made visible . function charGUI4_OpeningFcn(hObject, eventdata, ha ndles, varargin) % This function has no output args, see OutputFcn. % hObject handle to figure % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % varargin command line arguments to charGUI4 (se e VARARGIN) load data; assignin('base','net',net); % Choose default command line output for charGUI4 handles.output = hObject;


% Update handles structure guidata(hObject, handles); % UIWAIT makes charGUI4 wait for user response (see UIRESUME) % uiwait(handles.figure1); % --- Outputs from this function are returned to th e command line. function varargout = charGUI4_OutputFcn(hObject, ev entdata, handles) % varargout cell array for returning output args ( see VARARGOUT); % hObject handle to figure % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) % Get default command line output from handles stru cture varargout{1} = handles.output; % --- Executes on button press in pbLoadSet. function pbLoad_Callback(hObject, eventdata, handle s) % hObject handle to pbLoadSet (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) [filename, pathname] = uigetfile({'*.bmp';'*.jpg';' *.gif';'*.*'}, 'Pick an Image File'); S = imread([pathname,filename]); axes(handles.axes1); imshow(S); handles.S = S; guidata(hObject, handles); % --- Executes on button press in pbSelect. function pbSelect_Callback(hObject, eventdata, hand les) % hObject handle to pbSelect (see GCBO) % eventdata reserved - to be defined in a future v ersion of MATLAB % handles structure with handles and user data ( see GUIDATA) S = handles.S; axes(handles.axes1); % Selection of location if isfield(handles,'api') handles.api.delete(); rmfield(handles,'api'); rmfield(handles,'hRect'); axes(handles.axes1); imshow(S); end axes(handles.axes1); sz = size(S); handles.hRect = imrect(gca,[round(sz(2)/2) round(sz (1)/2) 20 20]); % Select object handles.api = iptgetapi(handles.hRect); guidata(hObject, handles); %{ img_crop = imcrop(S); axes(handles.axes2); imshow(img_crop); handles.img_crop = img_crop; %} %{ guidata(hObject, handles); S = handles.S; axes(handles.axes1); img_crop = imcrop(S); axes(handles.axes2); imshow(img_crop); handles.img_crop = img_crop;

%}
%{
axes(handles.axes12);
I = handles.img_crop;
Igray = rgb2gray(I);
Ibw = im2bw(Igray, graythresh(Igray));
Iedge = edge(uint8(Ibw));
se = strel('square', 2);
Iedge2 = imdilate(Iedge, se);
Ifill = imfill(Iedge2, 'holes');
[Ilabel, num] = bwlabel(Ifill);
Iprops = regionprops(Ilabel);
Ibox = [Iprops.BoundingBox];
[y, x] = size(Ibox);
x = x/4;
Ibox = reshape(Ibox, [4 x]);
handles.Ibox = Ibox;
imshow(I)
selected_col = get(handles.edit5, 'string');
selected_col = evalin('base', selected_col);
selected_ln = get(handles.edit6, 'string');
selected_ln = evalin('base', selected_ln);
for cnt = 1:selected_ln * selected_col
    rectangle('position', Ibox(:,cnt), 'edgecolor', 'r');
end
guidata(hObject, handles);
%}


% --- Executes on button press in pbCrop.
function pbCrop_Callback(hObject, eventdata, handles)
handles.loc = handles.api.getPosition();
axes(handles.axes1);
S = handles.S;
handles.img_crop = imcrop(S, handles.loc);
axes(handles.axes12);
I = handles.img_crop;
Igray = rgb2gray(I);
Ibw = im2bw(Igray, graythresh(Igray));          % binarize (Otsu threshold)
Iedge = edge(uint8(Ibw));                       % edge map
se = strel('square', 2);
Iedge2 = imdilate(Iedge, se);                   % thicken edges
Ifill = imfill(Iedge2, 'holes');                % fill character bodies
[Ilabel, num] = bwlabel(Ifill);                 % label connected components
Iprops = regionprops(Ilabel);
Ibox = [Iprops.BoundingBox];
[y, x] = size(Ibox);
disp(x);
x = x/4;                                        % each bounding box contributes four values
handles.x = x;
Ibox = reshape(Ibox, [4 x]);                    % one [x y w h] box per column
handles.Ibox = Ibox;
imshow(I);
[y, selected_col] = size(handles.Ibox);
selected_ln = 1;
for cnt = 1:selected_ln * selected_col
    rectangle('position', Ibox(:,cnt), 'edgecolor', 'r');
end
i = 1;
for cnt = 1:selected_ln * selected_col

    %rectangle('position',Ibox(:,cnt),'edgecolor','r');
    X = Ibox(:,cnt);
    a = X(2,1);
    b = X(1,1);
    c = X(3,1) + b;
    d = X(4,1) + a;
    %handles.cutImg = I(a:d,b:c);
    % Cut out up to three characters and show each in its own axes
    if(i==1)
        i = i+1;
        cutImg1 = I(a:d, b:c);
        handles.cutImg1 = cutImg1;
        axes(handles.axes2);
        imshow(handles.cutImg1);
    elseif(i==2)
        i = i+1;
        handles.cutImg2 = I(a:d, b:c);
        axes(handles.axes13);
        imshow(handles.cutImg2);
    elseif(i==3)
        i = i+1;
        handles.cutImg3 = I(a:d, b:c);
        axes(handles.axes26);
        imshow(handles.cutImg3);
    end
end
guidata(hObject, handles);


% --- Executes on button press in pbPreprocess.
function pbPreprocess_Callback(hObject, eventdata, handles)
[y, selected_col] = size(handles.Ibox);
i = 1;
while (i <= min(selected_col, 3))               % the GUI handles at most three characters
    if(i==1)
        i = i+1;
        img_crop1 = handles.cutImg1;
        %imgGray1 = rgb2gray(img_crop1);
        bw11 = im2bw(img_crop1, graythresh(img_crop1));
        axes(handles.axes31);
        imshow(bw11);
        bw12 = edu_imgcrop(bw11);               % crop to the character's bounding box
        axes(handles.axes40);
        imshow(bw12);
        handles.bw12 = bw12;
    elseif(i==2)
        i = i+1;
        img_crop2 = handles.cutImg2;
        %imgGray2 = rgb2gray(img_crop2);
        bw21 = im2bw(img_crop2, graythresh(img_crop2));
        axes(handles.axes32);
        imshow(bw21);
        bw22 = edu_imgcrop(bw21);
        axes(handles.axes41);
        imshow(bw22);
        handles.bw22 = bw22;
    elseif(i==3)
        i = i+1;

        img_crop3 = handles.cutImg3;
        %imgGray3 = rgb2gray(img_crop3);
        bw31 = im2bw(img_crop3, graythresh(img_crop3));
        axes(handles.axes35);
        imshow(bw31);
        bw32 = edu_imgcrop(bw31);
        axes(handles.axes44);
        imshow(bw32);
        handles.bw32 = bw32;
    end
end
guidata(hObject, handles);


% --- Executes on button press in pbExtract.
function pbExtract_Callback(hObject, eventdata, handles)
[y, selected_col] = size(handles.Ibox);
i = 1;
while (i <= min(selected_col, 3))               % at most three characters
    if(i==1)
        i = i+1;
        bw12 = handles.bw12;
        charvec1 = edu_imgresize(bw12);         % 35-element feature vector
        axes(handles.axes5);
        plotchar(charvec1);
        handles.charvec1 = charvec1;
    elseif(i==2)
        i = i+1;
        bw22 = handles.bw22;
        charvec2 = edu_imgresize(bw22);
        axes(handles.axes49);
        plotchar(charvec2);
        handles.charvec2 = charvec2;
    elseif(i==3)
        i = i+1;
        bw32 = handles.bw32;
        charvec3 = edu_imgresize(bw32);
        axes(handles.axes50);
        plotchar(charvec3);
        handles.charvec3 = charvec3;
    end
end
guidata(hObject, handles);


% --- Executes on button press in pbRecognize.
function pbRecognize_Callback(hObject, eventdata, handles)
[y, selected_col] = size(handles.Ibox);
i = 1;
while (i <= min(selected_col, 3))               % at most three characters
    if(i==1)
        i = i+1;
        charvec1 = handles.charvec1;
        tic
        if (handles.training == 0)              % BP network
            selected_net = handles.net;
            result1 = edu_simnet(selected_net, charvec1);
            [val, num1] = max(result1);
        elseif (handles.training == 1)          % multi-class SVM
            models = handles.models;

            numClasses = handles.numClasses;
            for j = 1:35
                svm_charvec1(1,j) = charvec1(j,1);   % row vector for the SVM tester
            end
            num1 = edu_testsvm(numClasses, svm_charvec1, models);
        end
        % Map the class index to its label: 1-26 -> 'A'-'Z', 27-52 -> 'a'-'z'
        charLabels = ['A':'Z' 'a':'z'];
        S1 = [charLabels(num1) ' '];

        S10 = strcat(S1);
        toc
    elseif(i==2)
        i = i+1;
        charvec2 = handles.charvec2;
        if (handles.training == 0)
            selected_net = handles.net;
            result2 = edu_simnet(selected_net, charvec2);
            [val, num2] = max(result2);
        elseif (handles.training == 1)
            models = handles.models;
            numClasses = handles.numClasses;
            for j = 1:35
                svm_charvec2(1,j) = charvec2(j,1);
            end
            num2 = edu_testsvm(numClasses, svm_charvec2, models);
        end

        % Same index-to-character mapping as for the first character
        S2 = [charLabels(num2) ' '];

        S10 = strcat(S1, ',', S2);
    elseif(i==3)
        i = i+1;
        charvec3 = handles.charvec3;
        if (handles.training == 0)
            selected_net = handles.net;
            result3 = edu_simnet(selected_net, charvec3);
            [val, num3] = max(result3);
        elseif (handles.training == 1)
            models = handles.models;
            numClasses = handles.numClasses;
            for j = 1:35
                svm_charvec3(1,j) = charvec3(j,1);
            end
            num3 = edu_testsvm(numClasses, svm_charvec3, models);
        end

        S3 = [charLabels(num3) ' '];

        S10 = strcat(S1, ',', S2, ',', S3);
    end
end
set(handles.editResult, 'string', S10);
guidata(hObject, handles);


function pbNN_Callback(hObject, eventdata, handles)


% --- Executes on button press in pbSVM.
function SVM_Callback(hObject, eventdata, handles)
% hObject    handle to pbSVM (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
handles.training = 1;       % select the multi-class SVM as the classifier
guidata(hObject, handles);


% --- Executes during object creation, after setting all properties.
function SVM_CreateFcn(hObject, eventdata, handles)
% hObject    handle to pbSVM (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject, 'BackgroundColor', 'white');
end


% --- Executes on button press in pbBP.
function BP_Callback(hObject, eventdata, handles)
% hObject    handle to pbBP (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
handles.training = 0;       % select the back-propagation network as the classifier
guidata(hObject, handles);


% --- Executes during object creation, after setting all properties.
function BP_CreateFcn(hObject, eventdata, handles)
% hObject    handle to pbBP (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject, 'BackgroundColor', 'white');
end


function editResult_Callback(hObject, eventdata, handles)
% hObject    handle to editResult (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: get(hObject,'String') returns contents of editResult as text
%        str2double(get(hObject,'String')) returns contents of editResult as a double
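% Note: BP_Callback and SVM_Callback above only record the selected classifier in
% handles.training (0 = back-propagation network, 1 = multi-class SVM). The model
% itself is built in pbLoadSet_Callback below, and pbRecognize_Callback above uses
% whichever model (handles.net or handles.models) was stored there.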

% --- Executes during object creation, after setting all properties.
function editResult_CreateFcn(hObject, eventdata, handles)
% hObject    handle to editResult (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject, 'BackgroundColor', 'white');
end


% --- Executes on button press in pbLoadSet (Load Training Set).
function pbLoadSet_Callback(hObject, eventdata, handles)
% hObject    handle to pbLoadSet (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
[filename, pathname] = uigetfile({'*.bmp';'*.jpg';'*.gif';'*.*'}, 'Pick an Image File');
trimg = imread([pathname, filename]);
selected_ln = 5;                            % rows of characters on the training sheet
selected_col = 52;                          % 26 upper-case + 26 lower-case characters per row
img = edu_imgpreprocess(trimg, selected_col, selected_ln);
for cnt = 1:selected_ln * selected_col
    bw2 = edu_imgcrop(img{cnt});
    charvec = edu_imgresize(bw2);
    out(:,cnt) = charvec;                   % features as columns (for the BP network)
    svm_out(cnt,:) = charvec;               % features as rows (for the SVM)
end
if (handles.training == 0)
    P = out(:, 1:208);                          % first four rows of the sheet: 4 x 52 samples
    T = [eye(52) eye(52) eye(52) eye(52)];      % one-hot targets repeated for the four sets
elseif (handles.training == 1)
    TrainingSet = svm_out(1:208, :);
    Labels = 1:52;                              % class labels for one row of characters
    GroupTrain = [Labels Labels Labels Labels];
end
if (handles.training == 0)
    net = edu_createnn(P, T);
    handles.net = net;
elseif (handles.training == 1)
    tic
    [models, numClasses] = edu_multisvm(TrainingSet, GroupTrain);
    toc
    handles.models = models;
    handles.numClasses = numClasses;
end
guidata(hObject, handles);


function edit5_Callback(hObject, eventdata, handles)
% hObject    handle to edit5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: get(hObject,'String') returns contents of edit5 as text
%        str2double(get(hObject,'String')) returns contents of edit5 as a double


% --- Executes during object creation, after setting all properties.
function edit5_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject, 'BackgroundColor', 'white');

end


function edit6_Callback(hObject, eventdata, handles)
% hObject    handle to edit6 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: get(hObject,'String') returns contents of edit6 as text
%        str2double(get(hObject,'String')) returns contents of edit6 as a double


% --- Executes during object creation, after setting all properties.
function edit6_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit6 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject, 'BackgroundColor', 'white');
end


function edit9_Callback(hObject, eventdata, handles)
% hObject    handle to edit9 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: get(hObject,'String') returns contents of edit9 as text
%        str2double(get(hObject,'String')) returns contents of edit9 as a double


% --- Executes during object creation, after setting all properties.
function edit9_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit9 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject, 'BackgroundColor', 'white');
end


% --- Executes on key press with focus on pbLoad and none of its controls.
function pbLoad_KeyPressFcn(hObject, eventdata, handles)
% hObject    handle to pbLoad (see GCBO)
% eventdata  structure with the following fields (see MATLAB.UI.CONTROL.UICONTROL)
%   Key: name of the key that was pressed, in lower case
%   Character: character interpretation of the key(s) that was pressed
%   Modifier: name(s) of the modifier key(s) (i.e., control, shift) pressed
% handles    structure with handles and user data (see GUIDATA)


% --- Executes on key press with focus on editNN and none of its controls.
function editNN_KeyPressFcn(hObject, eventdata, handles)
% hObject    handle to editNN (see GCBO)
% eventdata  structure with the following fields (see MATLAB.UI.CONTROL.UICONTROL)
%   Key: name of the key that was pressed, in lower case
%   Character: character interpretation of the key(s) that was pressed
%   Modifier: name(s) of the modifier key(s) (i.e., control, shift) pressed
% handles    structure with handles and user data (see GUIDATA)
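The GUI above drives the complete pipeline interactively. For reference, the short command-line sketch below shows how the same edu_* helper functions used by the GUI could be chained to classify a single, already-isolated character with the trained BP network. The file name single_char.bmp is only a placeholder, and the sketch assumes data.mat (loaded in charGUI4_OpeningFcn) contains the trained network net.

% Command-line recognition sketch (illustrative; the image file name is a placeholder).
load data                                   % provides the trained BP network 'net'
I     = imread('single_char.bmp');          % scan containing one handwritten character
Igray = rgb2gray(I);                        % assumes an RGB scan, as in pbCrop_Callback
bw    = im2bw(Igray, graythresh(Igray));    % binarize with Otsu's threshold
bw    = edu_imgcrop(bw);                    % crop to the character's bounding box
vec   = edu_imgresize(bw);                  % 35-element feature vector (plotted 5x7 by plotchar)
out   = edu_simnet(net, vec);               % one network output per class
[~, idx] = max(out);                        % winning class index, 1-52
labels = ['A':'Z' 'a':'z'];
fprintf('Recognized character: %c\n', labels(idx));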