
Data Mining: Assignment 3.1

Antonio Villacorta Benito

[email protected]

February 12, 2013

1 Introduction

In this assignment, several exercises related to classifier design using neural networks will be completed.

2 Activities

SNNS is an efficient universal simulator of neural networks for Unix workstations and Unix PCs. It consists of a simulator kernel, written in ANSI C for efficiency and portability reasons, a graphical user interface under X11R6, a network compiler which is able to generate C programs out of trained SNNS nets, a batch version of the simulator and various other analysis tools [3].

JavaNNS is the successor of SNNS. It is based on its computing kernel, with a newly developed, comfortable graphical user interface written in Java set on top of it. Hence compatibility with SNNS is achieved, while platform independence is increased [2].

JavaNNS has been used to complete the activities of this assignment, following some instructions from the user manual available at: http://www.ra.cs.uni-tuebingen.de/software/JavaNNS/manual/

2.1 Loading the Training and Validation Files in SNNS

In the File menu, we select Open... and pick the training file tra.pat. We repeat the steps to load the validation file val.pat. In the menu Tools → Control Panel, we select the Patterns tab. The two combo boxes, Training set and Validation set, are used for selecting the active training and validation set, respectively. We choose tra for training and val for validation.


2.2 Neural Network Structure: input and output layers

An artificial neural network is a computational device that consists of many simple connected units (neurons) that work in parallel. The connections between the units or nodes are usually weighted by real-valued weights. Weights are the primary means of learning in neural networks, and a learning algorithm is used to adjust the weights [1].

A neural network has three different classes of units: input, hidden, and output units. An activation pattern is presented on its input units and spreads in a forward direction from the input units through one or more layers of hidden units to the output units. The activation coming into a unit from other units is multiplied by the weights on the links over which it spreads. All incoming activation is then added together, and the unit becomes activated only if the incoming result is above the unit's threshold.
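The unit behaviour described above can be sketched in a few lines of Python (an illustration only; the input values, weights, and threshold below are invented, and real simulators such as SNNS use smooth activation functions rather than a hard threshold):

```python
import numpy as np

def unit_output(inputs, weights, threshold):
    """Sum the incoming activation, weighted by the link weights;
    the unit activates only if the result exceeds its threshold."""
    net = float(np.dot(inputs, weights))
    return 1 if net > threshold else 0

# Invented example values: net = 0.5*0.4 + 1.0*0.3 + 0.2*(-0.1) = 0.48
x = np.array([0.5, 1.0, 0.2])
w = np.array([0.4, 0.3, -0.1])
print(unit_output(x, w, threshold=0.4))  # 0.48 > 0.4, so the unit fires: 1
```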

Our exercise consists of classifying between two star types, using 100 wavelengths to distinguish between them.

The number of neurons comprising the input layer is equal to the number of features in our data. Some neural network configurations add one additional node for a bias term. As we don't have any indication of adding such a term, we consider our input layer for the exercise to have Ne = 100 neurons.

Regarding the output layer, its number of neurons is completely determined by the chosen model configuration. As our network is a classifier in this exercise, the output layer has one node per class label in our model, that is, Ns = 2 neurons.

2.3 Neural Network Creation

The next step is to create the neural network with a hidden layer of No neurons, where

No = (Ne + Ns) / 2 = (100 + 2) / 2 = 51.

This is consistent with common practice in neural network design, which recommends setting the hidden layer configuration using just two rules: (i) use one hidden layer; and (ii) set the number of neurons in that layer to the mean of the neurons in the input and output layers.
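As a quick sanity check, the sizing rule can be written out directly (a sketch; the function name is ours, not part of any tool):

```python
def hidden_layer_size(n_input, n_output):
    """Rule of thumb used here: one hidden layer whose size is the
    mean of the input and output layer sizes."""
    return (n_input + n_output) // 2

Ne, Ns = 100, 2  # 100 wavelength features, 2 star classes
No = hidden_layer_size(Ne, Ns)
print(No)  # 51
```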

2.4 Random weights initialization and backpropagation training

Training is performed through the Control Panel. In the Initializing tab, an initialization function and its parameters can be set. We select the function Random Weights and click the Init button to perform the initialization.

Under the Learning tab, we choose the learning function, Backpropagation in our case. We accept the default parameters provided.

The Learn current button performs training with the currently selected pattern, and Learn all with all patterns from the pattern set. We want to use both the training and validation sets, so we click Learn all.


In order to monitor the learning progress, we open the Error Graph and the Log window, both available from the View menu. During learning, the error graph displays the error curve.

The error-backpropagation algorithm is one of the most important and widely used learning techniques for neural networks. Backpropagation is used almost exclusively with feed-forward, multi-layer perceptrons using continuous-valued cells. Learning takes place based upon mean squared error and gradient descent. Backpropagation allows finding the network's error gradient with respect to the weights for a given pattern.

The steps of the algorithm are:

1. Initialize weights

2. For each pattern:
   2.1 Perform a forward propagation step
   2.2 Perform backward propagation
   2.3 Update weights

3. Stop when total error is acceptable
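The steps above can be sketched as a minimal NumPy implementation (an illustration under our own assumptions: a toy 4-3-2 network, sigmoid activations, a single invented training pattern, and a made-up learning rate; the JavaNNS kernel is not reproduced here):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 3, 2          # toy sizes (the exercise uses 100-51-2)

# 1. Initialize weights with small random values
W1 = rng.uniform(-1, 1, (n_in, n_hid))
W2 = rng.uniform(-1, 1, (n_hid, n_out))

x = rng.uniform(size=n_in)            # one invented input pattern
t = np.array([1.0, 0.0])              # its target class encoding
eta = 0.5                             # illustrative learning rate

for _ in range(1000):
    # 2.1 Forward propagation step
    h = sigmoid(x @ W1)
    y = sigmoid(h @ W2)
    # 2.2 Backward propagation: deltas from the MSE gradient
    delta_out = (y - t) * y * (1 - y)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)
    # 2.3 Update weights by gradient descent
    W2 -= eta * np.outer(h, delta_out)
    W1 -= eta * np.outer(x, delta_hid)

# 3. The mean squared error on this pattern has become acceptably small
print(np.mean((y - t) ** 2))
```

On a single pattern the error drops quickly toward zero; with a real pattern set, step 2 loops over all patterns before the stopping check.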

Selecting the number of cycles to be 25 with 5 steps, we obtain after 11 iterations the mean square errors and validation results shown in Appendix A. The corresponding graph is shown in figure 1. In this case, the final validation error obtained is around 0.15.

Figure 1: Backpropagation Error


2.5 Weight initialization variation: between 10% and 1000% of default values

Initialization of the coefficients of neural networks before their optimization represents a very important but also complicated research topic. The main problem is to select the global optimum from all possible local minima on the error surface, and to start the optimization from a set of values as close to the optimum as possible in order to minimize the number of training cycles.
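The three initialization ranges compared in this section can be sketched as follows (a sketch only; the Random Weights function is mirrored here as a plain uniform draw, and the default range is taken to be [–1, 1], since 10% corresponds to [–0.1, 0.1]):

```python
import numpy as np

def init_weights(shape, scale, seed=0):
    """Draw initial weights uniformly from [-scale, scale]."""
    rng = np.random.default_rng(seed)
    return rng.uniform(-scale, scale, shape)

shape = (100, 51)                            # input-to-hidden weights
w_default = init_weights(shape, scale=1.0)   # default range
w_small   = init_weights(shape, scale=0.1)   # 10% of the default values
w_large   = init_weights(shape, scale=10.0)  # 1000% of the default values
print(w_small.min(), w_small.max())          # both within [-0.1, 0.1]
```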

Initializing the weights with thresholds of 10% of the original values, i.e. between –0.1 and 0.1, gives the results presented in Appendix B. The corresponding graph is shown in figure 2. With the weights initialized as above, the network achieved a lower mean square error in a much shorter time. The validation error was 0.08 after just 100 iterations.

Figure 2: Backpropagation Error with weight initialization 10%

On the other hand, selecting initialization values with a threshold of 1000% of the original values, i.e. between –10 and 10, does not allow the network to learn properly, as both the MSE and the validation errors are very high. Results are shown in Appendix C. The corresponding graph is shown in figure 3.


Figure 3: Backpropagation Error with weight initialization 1000%

2.6 Hidden layer structure impact on validation error

Removing hidden units negatively affects the performance of the network for the example studied. Removing 10 hidden units at a time results in an increase of the validation error from 0.14 to 0.32 after 40 of the total of 55 hidden units have been removed.

Results are presented in Appendix D.

2.7 Backpropagation Momentum Experiments

Deciding on the magnitude of the learning rate for the network is not an easy problem: too large, and learning can oscillate; too small, and convergence becomes very slow. The addition of a momentum term helps to smooth out the learning curve. Backpropagation momentum adds a momentum term to the weight update. This normally reduces training times. As the name implies, the weight update is given some inertia to keep going in the direction of the previous update.
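The inertia idea can be made concrete with a small sketch (our own illustration; the learning rate and momentum factor below are invented, not the JavaNNS defaults):

```python
import numpy as np

def momentum_update(w, grad, velocity, eta=0.2, mu=0.9):
    """Keep a fraction mu of the previous step and add the new
    gradient step, so updates gain inertia in the previous direction."""
    velocity = mu * velocity - eta * grad
    return w + velocity, velocity

w = np.zeros(3)
v = np.zeros(3)
g = np.array([1.0, -1.0, 0.5])        # a constant, invented gradient
w, v = momentum_update(w, g, v)       # first step: a plain gradient step
w, v = momentum_update(w, g, v)       # second step is larger thanks to inertia
print(w)                              # approximately [-0.58, 0.58, -0.29]
```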

Results for the backpropagation momentum test are shown in Appendix E. The corresponding graph is shown in figure 4.

Here the MSE increases at first, while the validation error stays flat, until both decrease at around 30 iterations. There is an increase in the validation error at 80 iterations, which suggests we have reached the maximum amount of training that should be performed on this network.


Figure 4: Backpropagation Momentum

Again, tests with values of 10% of the initialization weights are performed. Results are presented in Appendix F. The corresponding graph is shown in figure 5. As both errors are flat, the parameters used for this test are not meaningful and should be discarded.

Figure 5: Backpropagation Momentum with Weight Initialization 10%

Results of the tests with 1000% of the initialization weights are presented in Appendix G.


The corresponding graph is shown in figure 6.

A lower validation error is found in this case, but a large increase can be seen after 50 iterations.

The minimum validation error for backpropagation momentum is obtained with the default parameters, with a value around 0.1975.

Figure 6: Backpropagation Momentum with Weight Initialization 1000%

2.8 Backpropagation Weight Decay Experiments

Weight decay is a mechanism for dynamically keeping the number of hidden neurons to a minimum. During training, each weight is decayed according to the following step:

wij = (1 − ε) wij

where ε is a small parameter between 0 and 1 that determines the magnitude of the decay. Weights that are not continuously reinforced will gradually decay to 0. When a weight is 0, it is effectively equivalent to disconnecting that input. Results of this test are included in Appendix H. The corresponding graph is shown in figure 7.
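The decay step is straightforward to sketch (an illustration with an invented ε; in practice the decay is interleaved with the ordinary backpropagation updates):

```python
import numpy as np

def decay_step(w, eps=0.01):
    """Weight decay step: w_ij = (1 - eps) * w_ij."""
    return (1.0 - eps) * w

w = np.array([1.0, -2.0, 0.5])        # invented weights, never reinforced
for _ in range(100):
    w = decay_step(w)
print(w)  # each weight shrunk by 0.99**100, roughly a factor of 0.37
```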

Results of the tests with 10% of the initialization weights are presented in Appendix I. The corresponding graph is shown in figure 8. Results of the tests with 1000% of the initialization weights are presented in Appendix J. The corresponding graph is shown in figure 9.

For the case of the default parameters, after 30 iterations both the MSE and validation errors are decreasing at a very low rate. Further iterations do not help the model achieve better performance. Results with 1000% of the initialization weights have the lowest validation error, with a value around 0.608.


Figure 7: Backpropagation Weight Decay

Figure 8: Backpropagation Weight Decay with Weight Initialization 10%


Figure 9: Backpropagation Weight Decay with Weight Initialization 1000%

2.9 Weka Experiments

WEKA allows testing the same input using the MultilayerPerceptron classifier. After generating the corresponding ARFF input file, we load it in the Explorer. We perform an attribute selection using the CfsSubsetEval evaluator to keep only the most relevant attributes and simplify the calculation.

Results obtained with Weka are presented in Appendix K.

For this execution, the following parameters have been used:

• Test option: Use training set

• Classifier: Multilayer Perceptron

• Weight decay: false

• Momentum: 0.2

• Hidden Layers: 1

The relative absolute error reported by the classifier is 0.369.

3 Conclusion

Several exercises with neural networks have been performed as part of this assignment. Backpropagation methods, including momentum and weight decay, have been studied using the JavaNNS tool. An additional test in Weka has also been presented.


References

[1] Cândida Ferreira. “Designing Neural Networks Using Gene Expression Programming”. In: 9th Online World Conference on Soft Computing in Industrial Applications, September 20, 2004.

[2] University of Tubingen. JavaNNS. http://www.ra.cs.uni-tuebingen.de/software/JavaNNS/welcome_e.html. 2012.

[3] University of Tubingen. SNNS. http://www.ra.cs.uni-tuebingen.de/software/snns/welcome_e.html. 2012.


Appendix A BackPropagation Results

Step 5  MSE: 0.6385461330413819  validation: 0.8256843566894532
Step 10  MSE: 0.6195818424224854  validation: 0.8256843566894532
Step 15  MSE: 0.6111838817596436  validation: 0.8256843566894532
Step 20  MSE: 0.13696887493133544  validation: 0.8256843566894532
Step 25  MSE: 0.11046689748764038  validation: 0.8256843566894532
Step 30  MSE: 0.09606691002845764  validation: 0.65080246925354
Step 35  MSE: 0.08632581830024719  validation: 0.65080246925354
Step 40  MSE: 0.07582169771194458  validation: 0.65080246925354
Step 45  MSE: 0.06633694767951966  validation: 0.65080246925354
Step 50  MSE: 0.05856606364250183  validation: 0.65080246925354
Step 55  MSE: 0.04844736456871033  validation: 0.39900884628295896
Step 60  MSE: 0.03881103098392487  validation: 0.39900884628295896
Step 65  MSE: 0.03307296335697174  validation: 0.39900884628295896
Step 70  MSE: 0.027684205770492555  validation: 0.39900884628295896
Step 75  MSE: 0.023453566431999206  validation: 0.39900884628295896
Step 80  MSE: 0.020055408775806426  validation: 0.22285239696502684
Step 85  MSE: 0.01675291359424591  validation: 0.22285239696502684
Step 90  MSE: 0.014078685641288757  validation: 0.22285239696502684
Step 95  MSE: 0.012243212759494781  validation: 0.22285239696502684
Step 100  MSE: 0.010242762416601181  validation: 0.22285239696502684
Step 105  MSE: 0.008835302293300628  validation: 0.16151795387268067
Step 110  MSE: 0.007454404234886169  validation: 0.16151795387268067
Step 115  MSE: 0.006529396772384644  validation: 0.16151795387268067
Step 120  MSE: 0.005403700470924378  validation: 0.16151795387268067
Step 125  MSE: 0.004951256513595581  validation: 0.16151795387268067
Step 130  MSE: 0.004579658061265946  validation: 0.1448182225227356
Step 135  MSE: 0.0039053142070770265  validation: 0.1448182225227356
Step 140  MSE: 0.0036826081573963164  validation: 0.1448182225227356
Step 145  MSE: 0.003497226536273956  validation: 0.1448182225227356
Step 150  MSE: 0.002844603545963764  validation: 0.1448182225227356
Step 155  MSE: 0.0026634920388460158  validation: 0.14298603534698487
Step 160  MSE: 0.002358761616051197  validation: 0.14298603534698487
Step 165  MSE: 0.0019976142793893814  validation: 0.14298603534698487
Step 170  MSE: 0.0022356173023581503  validation: 0.14298603534698487
Step 175  MSE: 0.0018595172092318534  validation: 0.14298603534698487
Step 180  MSE: 0.0014857038855552672  validation: 0.14660775661468506
Step 185  MSE: 0.001415438950061798  validation: 0.14660775661468506
Step 190  MSE: 0.0016158677637577057  validation: 0.14660775661468506
Step 195  MSE: 0.0015712801367044448  validation: 0.14660775661468506
Step 200  MSE: 0.0011937223374843598  validation: 0.14660775661468506
Step 205  MSE: 0.001273664738982916  validation: 0.14551749229431152
Step 210  MSE: 0.0011805791407823562  validation: 0.14551749229431152
Step 215  MSE: 0.0011836985126137734  validation: 0.14551749229431152
Step 220  MSE: 8.775186724960804E-4  validation: 0.14551749229431152
Step 225  MSE: 8.465692400932312E-4  validation: 0.14551749229431152
Step 230  MSE: 0.001021929271519184  validation: 0.1511439323425293
Step 235  MSE: 8.138338103890419E-4  validation: 0.1511439323425293
Step 240  MSE: 7.958724163472652E-4  validation: 0.1511439323425293
Step 245  MSE: 5.254192277789116E-4  validation: 0.1511439323425293
Step 250  MSE: 5.608030129224062E-4  validation: 0.1511439323425293
Step 255  MSE: 5.298847798258066E-4  validation: 0.1525209665298462
Step 260  MSE: 2.5185835547745226E-4  validation: 0.1525209665298462
Step 265  MSE: 0.0  validation: 0.1525209665298462
Step 270  MSE: 0.0  validation: 0.1525209665298462
Step 275  MSE: 0.0  validation: 0.1525209665298462


Appendix B BackPropagation Results with Weight Initialization variation: 10%

Step 5  MSE: 0.28783159255981444  validation: 0.7078684329986572
Step 10  MSE: 0.3041790008544922  validation: 0.7078684329986572
Step 15  MSE: 0.2542862892150879  validation: 0.7078684329986572
Step 20  MSE: 0.17635586261749267  validation: 0.7078684329986572
Step 25  MSE: 0.11892862319946289  validation: 0.7078684329986572
Step 30  MSE: 0.08226268887519836  validation: 0.24700021743774414
Step 35  MSE: 0.05834963321685791  validation: 0.24700021743774414
Step 40  MSE: 0.04328305125236511  validation: 0.24700021743774414
Step 45  MSE: 0.032400116324424744  validation: 0.24700021743774414
Step 50  MSE: 0.026984813809394836  validation: 0.24700021743774414
Step 55  MSE: 0.02120078504085541  validation: 0.10502679347991943
Step 60  MSE: 0.017360036075115205  validation: 0.10502679347991943
Step 65  MSE: 0.013704812526702881  validation: 0.10502679347991943
Step 70  MSE: 0.011695709824562073  validation: 0.10502679347991943
Step 75  MSE: 0.009418381005525589  validation: 0.10502679347991943
Step 80  MSE: 0.008438212424516678  validation: 0.08463789224624634
Step 85  MSE: 0.007001315057277679  validation: 0.08463789224624634
Step 90  MSE: 0.00616910494863987  validation: 0.08463789224624634
Step 95  MSE: 0.005603433400392532  validation: 0.08463789224624634
Step 100  MSE: 0.004665692895650863  validation: 0.08463789224624634
Step 105  MSE: 0.0046350758522748945  validation: 0.08253878355026245
Step 110  MSE: 0.004252982512116432  validation: 0.08253878355026245
Step 115  MSE: 0.0034595858305692673  validation: 0.08253878355026245
Step 120  MSE: 0.0032017264515161515  validation: 0.08253878355026245
Step 125  MSE: 0.0029774675145745277  validation: 0.08253878355026245
Step 130  MSE: 0.0027750935405492783  validation: 0.08562710285186767
Step 135  MSE: 0.002592264115810394  validation: 0.08562710285186767
Step 140  MSE: 0.0024330282583832743  validation: 0.08562710285186767
Step 145  MSE: 0.002162282541394234  validation: 0.08562710285186767
Step 150  MSE: 0.0019096722826361656  validation: 0.08562710285186767
Step 155  MSE: 0.0015558460727334023  validation: 0.10684666633605958
Step 160  MSE: 0.0018046379089355468  validation: 0.10684666633605958
Step 165  MSE: 0.0017492618411779405  validation: 0.10684666633605958
Step 170  MSE: 0.0011214664205908774  validation: 0.10684666633605958
Step 175  MSE: 0.0010800822637975216  validation: 0.10684666633605958
Step 180  MSE: 8.396579883992672E-4  validation: 0.10668759346008301
Step 185  MSE: 8.302817121148109E-4  validation: 0.10668759346008301
Step 190  MSE: 8.072376251220704E-4  validation: 0.10668759346008301
Step 195  MSE: 7.8533124178648E-4  validation: 0.10668759346008301
Step 200  MSE: 7.644744589924812E-4  validation: 0.10668759346008301
Step 205  MSE: 7.445913273841142E-4  validation: 0.10879552364349365
Step 210  MSE: 7.256179116666317E-4  validation: 0.10879552364349365
Step 215  MSE: 7.074945140630006E-4  validation: 0.10879552364349365
Step 220  MSE: 6.901655346155166E-4  validation: 0.10879552364349365
Step 225  MSE: 6.735833827406168E-4  validation: 0.10879552364349365
Step 230  MSE: 6.57701026648283E-4  validation: 0.11123313903808593
Step 235  MSE: 6.424780003726482E-4  validation: 0.11123313903808593
Step 240  MSE: 6.278756074607373E-4  validation: 0.11123313903808593
Step 245  MSE: 6.138564087450505E-4  validation: 0.11123313903808593
Step 250  MSE: 6.003896705806255E-4  validation: 0.11123313903808593
Step 255  MSE: 5.874437279999256E-4  validation: 0.11341379880905152
Step 260  MSE: 5.749902687966823E-4  validation: 0.11341379880905152
Step 265  MSE: 5.630030762404203E-4  validation: 0.11341379880905152
Step 270  MSE: 5.514571443200111E-4  validation: 0.11341379880905152
Step 275  MSE: 5.403297953307628E-4  validation: 0.11341379880905152
Step 280  MSE: 5.295995622873306E-4  validation: 0.1349014401435852


Step 285  MSE: 5.442085210233926E-4  validation: 0.1349014401435852
Step 290  MSE: 0.0  validation: 0.1349014401435852
Step 295  MSE: 0.0  validation: 0.1349014401435852
Step 300  MSE: 0.0  validation: 0.1349014401435852


Appendix C BackPropagation Results with Weight Initialization variation: 1000%

Step 5  MSE: 0.7439137458801269  validation: 0.8075325965881348
Step 10  MSE: 0.7394987106323242  validation: 0.8075325965881348
Step 15  MSE: 0.7389796733856201  validation: 0.8075325965881348
Step 20  MSE: 0.7383403301239013  validation: 0.8075325965881348
Step 25  MSE: 0.7379341125488281  validation: 0.8075325965881348
Step 30  MSE: 0.7376955032348633  validation: 0.8199188232421875
Step 35  MSE: 0.7378225326538086  validation: 0.8199188232421875
Step 40  MSE: 0.7376434326171875  validation: 0.8199188232421875
Step 45  MSE: 0.7376845359802247  validation: 0.8199188232421875
Step 50  MSE: 0.7376195907592773  validation: 0.8199188232421875
Step 55  MSE: 0.7376099586486816  validation: 0.8219523429870605
Step 60  MSE: 0.737328052520752  validation: 0.8219523429870605
Step 65  MSE: 0.7375877857208252  validation: 0.8219523429870605
Step 70  MSE: 0.7374239444732666  validation: 0.8219523429870605
Step 75  MSE: 0.7372795104980469  validation: 0.8219523429870605
Step 80  MSE: 0.7375036239624023  validation: 0.8299553871154786
Step 85  MSE: 0.7375109672546387  validation: 0.8299553871154786
Step 90  MSE: 0.7374855995178222  validation: 0.8299553871154786
Step 95  MSE: 0.7374994277954101  validation: 0.8299553871154786
Step 100  MSE: 0.7374274253845214  validation: 0.8299553871154786
Step 105  MSE: 0.7373716831207275  validation: 0.8317861557006836
Step 110  MSE: 0.7373436450958252  validation: 0.8317861557006836
Step 115  MSE: 0.7374117851257325  validation: 0.8317861557006836
Step 120  MSE: 0.7372286319732666  validation: 0.8317861557006836
Step 125  MSE: 0.7373687744140625  validation: 0.8317861557006836
Step 130  MSE: 0.737220287322998  validation: 0.8748334884643555
Step 135  MSE: 0.7372170448303222  validation: 0.8748334884643555
Step 140  MSE: 0.7365022659301758  validation: 0.8748334884643555
Step 145  MSE: 0.7259296894073486  validation: 0.8748334884643555
Step 150  MSE: 0.6967463016510009  validation: 0.8748334884643555
Step 155  MSE: 0.6588467121124267  validation: 0.8385674476623535
Step 160  MSE: 0.6548052310943604  validation: 0.8385674476623535
Step 165  MSE: 0.6537187576293946  validation: 0.8385674476623535
Step 170  MSE: 0.652716064453125  validation: 0.8385674476623535
Step 175  MSE: 0.652466106414795  validation: 0.8385674476623535
Step 180  MSE: 0.6518962860107422  validation: 0.8461722373962403
Step 185  MSE: 0.6517388343811035  validation: 0.8461722373962403
Step 190  MSE: 0.6518072605133056  validation: 0.8461722373962403
Step 195  MSE: 0.6511579513549804  validation: 0.8461722373962403
Step 200  MSE: 0.6512362480163574  validation: 0.8461722373962403
Step 205  MSE: 0.6510291576385498  validation: 0.846339225769043
Step 210  MSE: 0.6510846138000488  validation: 0.846339225769043
Step 215  MSE: 0.6508896827697754  validation: 0.846339225769043
Step 220  MSE: 0.6509273052215576  validation: 0.846339225769043
Step 225  MSE: 0.6510131359100342  validation: 0.846339225769043
Step 230  MSE: 0.6508260726928711  validation: 0.8478297233581543
Step 235  MSE: 0.6506863117218018  validation: 0.8478297233581543
Step 240  MSE: 0.6507798194885254  validation: 0.8478297233581543
Step 245  MSE: 0.6507604598999024  validation: 0.8478297233581543
Step 250  MSE: 0.6510311126708984  validation: 0.8478297233581543
Step 255  MSE: 0.6507403373718261  validation: 0.8500283241271973
Step 260  MSE: 0.650849437713623  validation: 0.8500283241271973
Step 265  MSE: 0.6506006240844726  validation: 0.8500283241271973
Step 270  MSE: 0.6506382465362549  validation: 0.8500283241271973
Step 275  MSE: 0.6504817962646484  validation: 0.8500283241271973
Step 280  MSE: 0.6504709243774414  validation: 0.8480534553527832


Step 285  MSE: 0.6504622459411621  validation: 0.8480534553527832
Step 290  MSE: 0.6504539012908935  validation: 0.8480534553527832
Step 295  MSE: 0.650580883026123  validation: 0.8480534553527832
Step 300  MSE: 0.650442361831665  validation: 0.8480534553527832


Appendix D Removal of Hidden Units Results

Step 10  MSE: 0.11142877340316773  validation: 0.1476667881011963
Step 20  MSE: 0.06621989607810974  validation: 0.1476667881011963
Step 30  MSE: 0.03818177878856659  validation: 0.1476667881011963
Step 40  MSE: 0.025848716497421265  validation: 0.1476667881011963
Step 50  MSE: 0.017600364983081818  validation: 0.1476667881011963
Step 60  MSE: 0.011762099713087082  validation: 0.1476667881011963
Step 70  MSE: 0.008713281154632569  validation: 0.1476667881011963
Step 80  MSE: 0.006434543430805207  validation: 0.1476667881011963
Step 90  MSE: 0.0049030866473913194  validation: 0.1476667881011963
Step 100  MSE: 0.003921518847346306  validation: 0.1476667881011963

Step 10  MSE: 0.11482280492782593  validation: 0.15903276205062866
Step 20  MSE: 0.06861187815666199  validation: 0.15903276205062866
Step 30  MSE: 0.04024173319339752  validation: 0.15903276205062866
Step 40  MSE: 0.024357706308364868  validation: 0.15903276205062866
Step 50  MSE: 0.016482774913311005  validation: 0.15903276205062866
Step 60  MSE: 0.01196279376745224  validation: 0.15903276205062866
Step 70  MSE: 0.00786278247833252  validation: 0.15903276205062866
Step 80  MSE: 0.0058764155954122545  validation: 0.15903276205062866
Step 90  MSE: 0.00453682504594326  validation: 0.15903276205062866
Step 100  MSE: 0.0037276022136211394  validation: 0.15903276205062866

Step 10  MSE: 0.15671133995056152  validation: 0.30880825519561766
Step 20  MSE: 0.11338856220245361  validation: 0.30880825519561766
Step 30  MSE: 0.0770088016986847  validation: 0.30880825519561766
Step 40  MSE: 0.043926915526390074  validation: 0.30880825519561766
Step 50  MSE: 0.02434147149324417  validation: 0.30880825519561766
Step 60  MSE: 0.01689411699771881  validation: 0.30880825519561766
Step 70  MSE: 0.012554386258125305  validation: 0.30880825519561766
Step 80  MSE: 0.009251511842012405  validation: 0.30880825519561766
Step 90  MSE: 0.006868529319763184  validation: 0.30880825519561766
Step 100  MSE: 0.005096805840730667  validation: 0.30880825519561766

Step 10  MSE: 0.21999382972717285  validation: 0.3224976062774658
Step 20  MSE: 0.11915276050567628  validation: 0.3224976062774658
Step 30  MSE: 0.061674898862838744  validation: 0.3224976062774658
Step 40  MSE: 0.031208288669586182  validation: 0.3224976062774658
Step 50  MSE: 0.018721453845500946  validation: 0.3224976062774658
Step 60  MSE: 0.013021048903465272  validation: 0.3224976062774658
Step 70  MSE: 0.009395958483219146  validation: 0.3224976062774658
Step 80  MSE: 0.006773640215396881  validation: 0.3224976062774658
Step 90  MSE: 0.0047381572425365445  validation: 0.3224976062774658
Step 100  MSE: 0.0034866876900196075  validation: 0.3224976062774658


Appendix E Backpropagation Momentum Results

Step 5  MSE: 0.1420825481414795  validation: 0.7292706966400146
Step 10  MSE: 0.15109704732894896  validation: 0.7292706966400146
Step 15  MSE: 0.18733338117599488  validation: 0.7292706966400146
Step 20  MSE: 0.2822264194488525  validation: 0.7292706966400146
Step 25  MSE: 0.3557072877883911  validation: 0.7292706966400146
Step 30  MSE: 0.32199726104736326  validation: 0.17061424255371094
Step 35  MSE: 0.2577648162841797  validation: 0.17061424255371094
Step 40  MSE: 0.17378295660018922  validation: 0.17061424255371094
Step 45  MSE: 0.06730784177780151  validation: 0.17061424255371094
Step 50  MSE: 0.018583722412586212  validation: 0.17061424255371094
Step 55  MSE: 0.007354919612407684  validation: 0.14153279066085817
Step 60  MSE: 0.003461942821741104  validation: 0.14153279066085817
Step 65  MSE: 0.0021449776366353037  validation: 0.14153279066085817
Step 70  MSE: 0.0016316570341587066  validation: 0.14153279066085817
Step 75  MSE: 9.877845644950868E-4  validation: 0.14153279066085817
Step 80  MSE: 4.0708663873374463E-4  validation: 0.1975197196006775
Step 85  MSE: 4.2545082978904245E-4  validation: 0.1975197196006775
Step 90  MSE: 0.0  validation: 0.1975197196006775
Step 95  MSE: 0.0  validation: 0.1975197196006775
Step 100  MSE: 0.0  validation: 0.1975197196006775


Appendix F BackPropagation Momentum Results with Weight Initialization variation: 10%

Step 5  MSE: 0.4585519790649414  validation: 0.6485322952270508
Step 10  MSE: 0.47411680221557617  validation: 0.6485322952270508
Step 15  MSE: 0.47916426658630373  validation: 0.6485322952270508
Step 20  MSE: 0.4799241065979004  validation: 0.6485322952270508
Step 25  MSE: 0.4799973964691162  validation: 0.6485322952270508
Step 30  MSE: 0.48000369071960447  validation: 0.6485433578491211
Step 35  MSE: 0.48000426292419435  validation: 0.6485433578491211
Step 40  MSE: 0.48000431060791016  validation: 0.6485433578491211
Step 45  MSE: 0.48000431060791016  validation: 0.6485433578491211
Step 50  MSE: 0.48000431060791016  validation: 0.6485433578491211
Step 55  MSE: 0.48000431060791016  validation: 0.6485433578491211
Step 60  MSE: 0.48000431060791016  validation: 0.6485433578491211
Step 65  MSE: 0.48000431060791016  validation: 0.6485433578491211
Step 70  MSE: 0.48000431060791016  validation: 0.6485433578491211
Step 75  MSE: 0.48000431060791016  validation: 0.6485433578491211


Appendix G BackPropagation Momentum Results with Weight Initialization variation: 1000%

Step 5  MSE: 0.18023643493652344  validation: 0.4294907093048096
Step 10  MSE: 0.0912842571735382  validation: 0.4294907093048096
Step 15  MSE: 0.11970610618591308  validation: 0.4294907093048096
Step 20  MSE: 0.09203974008560181  validation: 0.4294907093048096
Step 25  MSE: 0.013596837222576142  validation: 0.4294907093048096
Step 30  MSE: 0.023742586374282837  validation: 0.37595040798187257
Step 35  MSE: 0.1100746750831604  validation: 0.37595040798187257
Step 40  MSE: 0.05771434307098389  validation: 0.37595040798187257
Step 45  MSE: 0.04049423038959503  validation: 0.37595040798187257
Step 50  MSE: 0.06679400205612182  validation: 0.37595040798187257
Step 55  MSE: 0.0018063843250274657  validation: 0.7180890560150146
Step 60  MSE: 7.56674213334918E-4  validation: 0.7180890560150146
Step 65  MSE: 3.226003143936396E-4  validation: 0.7180890560150146
Step 70  MSE: 2.9366351664066314E-4  validation: 0.7180890560150146
Step 75  MSE: 2.675879281014204E-4  validation: 0.7180890560150146
Step 80  MSE: 0.0  validation: 0.7209718704223633
Step 85  MSE: 0.0  validation: 0.7209718704223633
Step 90  MSE: 0.0  validation: 0.7209718704223633
Step 95  MSE: 0.0  validation: 0.7209718704223633
Step 100  MSE: 0.0  validation: 0.7209718704223633


Appendix H Backpropagation Weight Decay Results

Step 5 MSE: 0.1531648278236389 validation: 0.8822230339050293
Step 10 MSE: 0.14724457263946533 validation: 0.8822230339050293
Step 15 MSE: 0.14965202808380126 validation: 0.8822230339050293
Step 20 MSE: 0.15610545873641968 validation: 0.8822230339050293
Step 25 MSE: 0.18509423732757568 validation: 0.8822230339050293
Step 30 MSE: 0.22008249759674073 validation: 0.7769925117492675
Step 35 MSE: 0.2746063947677612 validation: 0.7769925117492675
Step 40 MSE: 0.2894240379333496 validation: 0.7769925117492675
Step 45 MSE: 0.29315934181213377 validation: 0.7769925117492675
Step 50 MSE: 0.290570068359375 validation: 0.7769925117492675
Step 55 MSE: 0.285865330696106 validation: 0.765700101852417
Step 60 MSE: 0.2822000741958618 validation: 0.765700101852417
Step 65 MSE: 0.2785912036895752 validation: 0.765700101852417
Step 70 MSE: 0.2757643461227417 validation: 0.765700101852417
Step 75 MSE: 0.2732561111450195 validation: 0.765700101852417
Step 80 MSE: 0.2710933446884155 validation: 0.7513556480407715
Step 85 MSE: 0.26959595680236814 validation: 0.7513556480407715
Step 90 MSE: 0.2697579860687256 validation: 0.7513556480407715
Step 95 MSE: 0.26788547039031985 validation: 0.7513556480407715
Step 100 MSE: 0.26771624088287355 validation: 0.7513556480407715
Step 105 MSE: 0.26571102142333985 validation: 0.7442266464233398
Step 110 MSE: 0.2642052412033081 validation: 0.7442266464233398
Step 115 MSE: 0.2631103277206421 validation: 0.7442266464233398
Step 120 MSE: 0.26329431533813474 validation: 0.7442266464233398
Step 125 MSE: 0.2620270252227783 validation: 0.7442266464233398
Step 130 MSE: 0.26094348430633546 validation: 0.7428317546844483
Step 135 MSE: 0.2602025032043457 validation: 0.7428317546844483
Step 140 MSE: 0.2597073793411255 validation: 0.7428317546844483
Step 145 MSE: 0.2593426465988159 validation: 0.7428317546844483
Step 150 MSE: 0.2589634656906128 validation: 0.7428317546844483
Step 155 MSE: 0.2605225324630737 validation: 0.731752586364746
Step 160 MSE: 0.259334397315979 validation: 0.731752586364746
Step 165 MSE: 0.2585177183151245 validation: 0.731752586364746
Step 170 MSE: 0.2595790147781372 validation: 0.731752586364746
Step 175 MSE: 0.2583564281463623 validation: 0.731752586364746
Step 180 MSE: 0.25734915733337405 validation: 0.7308727741241455
Step 185 MSE: 0.2566494941711426 validation: 0.7308727741241455
Step 190 MSE: 0.25615429878234863 validation: 0.7308727741241455
Step 195 MSE: 0.25580003261566164 validation: 0.7308727741241455
Step 200 MSE: 0.255544376373291 validation: 0.7308727741241455
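Logs of this kind are easiest to analyse programmatically. The following Python sketch locates the training step with the lowest validation error, i.e. the natural early-stopping point. It assumes the log has been cleaned into one `Step N MSE: x validation: y` record per line; the record format is taken from the appendix logs, and `best_validation_step` is an illustrative helper name, not part of JavaNNS.

```python
import re

# One record per line, e.g.
# "Step 5 MSE: 0.1531648278236389 validation: 0.8822230339050293"
RECORD = re.compile(r"Step (\d+) MSE: ([\d.]+) validation: ([\d.]+)")

def best_validation_step(log_text):
    """Return (step, mse, validation) for the record with the lowest
    validation error, i.e. the natural early-stopping point."""
    records = [(int(s), float(m), float(v))
               for s, m, v in RECORD.findall(log_text)]
    return min(records, key=lambda r: r[2])

# Three records excerpted from the log above:
log = """Step 5 MSE: 0.1531648278236389 validation: 0.8822230339050293
Step 30 MSE: 0.22008249759674073 validation: 0.7769925117492675
Step 200 MSE: 0.255544376373291 validation: 0.7308727741241455"""
print(best_validation_step(log))  # step 200 has the lowest validation error
```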

Appendix I BackPropagation Weight Decay Results with Weight Initialization variation: 10%

Step 5 MSE: 0.45860490798950193 validation: 0.6485340595245361
Step 10 MSE: 0.47444753646850585 validation: 0.6485340595245361
Step 15 MSE: 0.4792498588562012 validation: 0.6485340595245361
Step 20 MSE: 0.479934024810791 validation: 0.6485340595245361
Step 25 MSE: 0.479998254776001 validation: 0.6485340595245361
Step 30 MSE: 0.48000383377075195 validation: 0.6485433578491211
Step 35 MSE: 0.48000426292419435 validation: 0.6485433578491211
Step 40 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 45 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 50 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 55 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 60 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 65 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 70 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 75 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 80 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 85 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 90 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 95 MSE: 0.48000431060791016 validation: 0.6485433578491211
Step 100 MSE: 0.48000431060791016 validation: 0.6485433578491211

Appendix J BackPropagation Weight Decay Results with Weight Initialization variation: 1000%

Step 5 MSE: 0.2646505355834961 validation: 0.9129238128662109
Step 10 MSE: 0.25199041366577146 validation: 0.9129238128662109
Step 15 MSE: 0.169249427318573 validation: 0.9129238128662109
Step 20 MSE: 0.08482854962348937 validation: 0.9129238128662109
Step 25 MSE: 0.11307567358016968 validation: 0.9129238128662109
Step 30 MSE: 0.07542955875396729 validation: 0.6089982986450195
Step 35 MSE: 0.06859263181686401 validation: 0.6089982986450195
Step 40 MSE: 0.031367373466491696 validation: 0.6089982986450195
Step 45 MSE: 0.010646123439073563 validation: 0.6089982986450195
Step 50 MSE: 0.0 validation: 0.6089982986450195
Step 55 MSE: 0.0 validation: 0.6089982986450195
Step 60 MSE: 0.0 validation: 0.6089982986450195
Step 65 MSE: 0.0 validation: 0.6089982986450195
Step 70 MSE: 0.0 validation: 0.6089982986450195
Step 75 MSE: 0.0 validation: 0.6089982986450195
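Appendices I and J differ only in the spread of the random weight initialization applied before training. Assuming the percentage maps to a symmetric uniform range (e.g. 10% to [-0.1, 0.1]; this mapping is an assumption, since JavaNNS's Randomize Weights dialog takes explicit bounds), the procedure can be sketched as:

```python
import random

def init_weights(n, spread):
    """Draw n weights uniformly from [-spread, +spread], as when
    randomizing an SNNS/JavaNNS network before training."""
    return [random.uniform(-spread, spread) for _ in range(n)]

random.seed(0)  # reproducibility only
small = init_weights(5, 0.1)   # "10%" variation: weights in [-0.1, 0.1]
large = init_weights(5, 10.0)  # "1000%" variation: weights in [-10, 10]
```

With the small range, all units start in the near-linear region of the sigmoid, which is consistent with Appendix I stalling at a high MSE; the large range gives a much more varied starting point, consistent with the fast (and overfitted) convergence in Appendix J.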

Appendix K Weka Results

=== Run information ===

Scheme: weka.classifiers.functions.MultilayerPerceptron -L 0.3 -M 0.2 -N 500 -V 0 -S 0 -E 20 -H 1

Relation: DM_Assigment3_val-weka.filters.unsupervised.attribute.Remove-R1-2,4,6,10-25,28-29,31,34-47,49-79,81-93,95-100

Instances: 80
Attributes: 14
              X3
              X5
              X7
              X8
              X9
              X26
              X27
              X30
              X32
              X33
              X48
              X80
              X94
              class

Test mode: evaluate on training data

=== Classifier model (full training set) ===

Linear Node 0
    Inputs    Weights
    Threshold    1.1994570847385295
    Node 1    -2.614723695767132
Sigmoid Node 1
    Inputs    Weights
    Threshold    -0.7554063442554315
    Attrib X3    1.8523489640717965
    Attrib X5    0.5518004119247022
    Attrib X7    0.4936058899906072
    Attrib X8    -0.6277377634032811
    Attrib X9    -0.49799394244070255
    Attrib X26    1.2879558749327256
    Attrib X27    1.1085280606366834
    Attrib X30    0.6700420402390828
    Attrib X32    1.0771276410842265
    Attrib X33    0.8175176130192898
    Attrib X48    -0.5123583600544321
    Attrib X80    0.6166653535819162
    Attrib X94    0.4773680078446146
Class
    Input
    Node 0

Time taken to build model: 0.08 seconds

=== Evaluation on training set ===
=== Summary ===

Correlation coefficient                  1
Mean absolute error                      0.0018
Root mean squared error                  0.0024
Relative absolute error                  0.369  %
Root relative squared error              0.4836 %
Total Number of Instances               80
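Because the Weka model has a single sigmoid hidden unit feeding one linear output node, it can be evaluated by hand from the weights listed above. The sketch below assumes Weka's convention that each printed "Threshold" is an additive bias term on its node; the function name `predict` and the all-zero sample are illustrative only, not part of the Weka output.

```python
import math

# Weights transcribed from the Weka classifier model above. The listed
# "Threshold" is assumed to act as an additive bias on each node.
SIGMOID_BIAS = -0.7554063442554315
SIGMOID_WEIGHTS = {
    "X3": 1.8523489640717965, "X5": 0.5518004119247022,
    "X7": 0.4936058899906072, "X8": -0.6277377634032811,
    "X9": -0.49799394244070255, "X26": 1.2879558749327256,
    "X27": 1.1085280606366834, "X30": 0.6700420402390828,
    "X32": 1.0771276410842265, "X33": 0.8175176130192898,
    "X48": -0.5123583600544321, "X80": 0.6166653535819162,
    "X94": 0.4773680078446146,
}
LINEAR_BIAS = 1.1994570847385295
LINEAR_WEIGHT = -2.614723695767132  # weight from Sigmoid Node 1 to Linear Node 0

def predict(sample):
    """Forward pass through the one-hidden-unit network printed by Weka."""
    net = SIGMOID_BIAS + sum(w * sample[a] for a, w in SIGMOID_WEIGHTS.items())
    hidden = 1.0 / (1.0 + math.exp(-net))        # Sigmoid Node 1
    return LINEAR_BIAS + LINEAR_WEIGHT * hidden  # Linear Node 0 (output)

# Illustrative input only (an all-zero sample, not a real instance):
zero = {a: 0.0 for a in SIGMOID_WEIGHTS}
print(predict(zero))
```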
