

Research Article
The Application and Research of the GA-BP Neural Network Algorithm in the MBR Membrane Fouling

Chunqing Li,1 Zixiang Yang,1 Hongying Yan,1 and Tao Wang2

1 School of Computer Science and Software Engineering, Tianjin Polytechnic University, Tianjin 300387, China
2 School of Environmental and Chemical Engineering, Tianjin Polytechnic University, Tianjin 300387, China

Correspondence should be addressed to Chunqing Li; [email protected]

Received 4 December 2013; Accepted 24 January 2014; Published 6 March 2014

Academic Editor: Rudong Chen

Copyright © 2014 Chunqing Li et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Predicting the MBR membrane flux as an indicator of membrane fouling is one of the important issues in the field of today's sewage treatment. Firstly, this paper used the principal component analysis method to achieve dimensionality reduction and decorrelation of the input variables and obtained the three factors affecting membrane fouling most obviously: MLSS, total resistance, and operating pressure. Then it used the BP neural network to establish the system model of the MBR intelligent simulation, the relationship between the three parameters and the membrane flux, which characterizes the degree of membrane fouling. Because the BP neural network has a slow training speed, is sensitive to the initial weights and thresholds, and easily falls into local minimum points, this paper used the genetic algorithm to optimize the initial weights and thresholds of the BP neural network and established a membrane fouling prediction model based on the GA-BP network. As this research has shown, under the same conditions, the BP network model of MBR membrane fouling optimized by GA predicts the membrane flux better than the unoptimized one. It demonstrates that, compared with the BP network, the GA-BP network model of MBR membrane fouling is more suitable for simulation of the MBR membrane fouling process.

1. Introduction

As a new wastewater treatment technology, the membrane bioreactor (MBR) has attracted much concern for the high quality of its effluent water and is considered a water-saving technology with better economic, social, and environmental benefits. But it has been proved by practice that membrane fouling is a major bottleneck restricting its development. Thus, through the study of the mechanism of membrane fouling, it is imminent to slow the membrane fouling rate and reduce the membrane pollution.

With the emergence of computer simulation technology, the time, space, and cost of MBR experiments are greatly reduced. Therefore, MBR computer simulation technology has become a powerful tool for MBR research, and its development will have a positive reference and guiding role for practical engineering applications of the MBR.

Based on this idea, according to the real data of the experiment and industrial production of an MBR sewage treatment plant in Shijiazhuang, this paper focused on the membrane fouling problem in the process of membrane bioreactor sewage treatment. And it established an intelligent prediction model with the correlation and the relationship between the operation parameters and the flux, by means of a smart algorithm. Through the constant optimization of the algorithms and models, it achieved more accurate prediction of unknown membrane fluxes and thus predicted the degree of the membrane fouling. And, seeking the optimum operating conditions for controlling the membrane pollution trend, it improved the membrane fouling problem of the MBR process. The research of this topic has a certain reference value for membrane fouling prediction, parameter selection, and operation of MBR practical engineering. It also has a positive guiding significance for stable operation and further application of the membrane bioreactor in the field of wastewater treatment

Hindawi Publishing Corporation, Abstract and Applied Analysis, Volume 2014, Article ID 673156, 8 pages, http://dx.doi.org/10.1155/2014/673156


and thus further expressed the technological advantages of this process.

2. Prediction Model of MBR Membrane Fouling Based on BP Algorithm

2.1. Intelligent BP Simulation Model of MBR Membrane Fouling. Membrane flux is an important indicator of the degree of membrane fouling, so this paper used intelligent simulation of the MBR membrane flux to feed back the membrane fouling. The membrane fouling index can be expressed as

f = (1 − J_v/J_0) × 100. (1)

According to the filtering model based on Darcy's law, the membrane flux J_v is

J_v = Δp / (μ(R_m + R_g + R_c)), (2)

where Δp is the pressure difference on both sides of the membrane (Pa), μ is the filtrate viscosity (Pa·s), R_m is the resistance of the membrane (1/m), R_g is the resistance of membrane fouling (1/m), and R_c is the resistance of the cake layer (1/m).

In the initial state, R_c and R_g are both 0 and the MLSS is also 0. When the initial viscosity value is μ_0, we can obtain the equation of the initial membrane flux as follows:

J_0 = Δp / (μ_0 R_m). (3)

Finally, the membrane flux is determined by many factors, and these are the questions this paper discussed and researched.
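To make equations (1)-(3) concrete, the following is a minimal numeric sketch. The pressure, viscosity, and resistance values below are illustrative assumptions, not the plant's data; the paper's own computations were done in MATLAB, while this sketch uses Python.

```python
# Sketch of equations (1)-(3); all numeric values are assumptions.
def initial_flux(dp, mu0, rm):
    """Eq. (3): J_0 = dp / (mu0 * R_m), the clean-membrane flux."""
    return dp / (mu0 * rm)

def flux(dp, mu, rm, rg, rc):
    """Eq. (2): J_v = dp / (mu * (R_m + R_g + R_c)), Darcy's-law filtration."""
    return dp / (mu * (rm + rg + rc))

def fouling_index(jv, j0):
    """Eq. (1): f = (1 - J_v/J_0) * 100, percent loss of flux."""
    return (1.0 - jv / j0) * 100.0

dp = 0.02e6              # operating pressure, Pa (illustrative)
mu0 = 1.0e-3             # initial filtrate viscosity, Pa*s
rm = 2.0e12              # membrane resistance, 1/m
rg, rc = 1.0e12, 1.5e12  # fouling and cake resistances, 1/m

j0 = initial_flux(dp, mu0, rm)
jv = flux(dp, mu0, rm, rg, rc)
print(round(fouling_index(jv, j0), 2))  # -> 55.56 (resistance grew, so f > 0)
```

Note that the index depends only on the resistance ratio here: J_v/J_0 = R_m / (R_m + R_g + R_c) when the viscosity is unchanged.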

Mathematical simulation models are built on the basis of certain assumptions. This paper assumed that the entire membrane bioreactor system had been running in a stable state, that activated sludge was completely trapped in the MBR, and that materials in the MBR were fully mixed.

On this basis, for the historical data, a total of 80 groups from the industrial production and tests of an MBR sewage treatment plant in Shijiazhuang city, after treating them by collection, statistical analysis, and principal component analysis, we determined the MLSS of the inlet, the operating pressure, and the total resistance as the three main factors affecting the MBR membrane flux. The finished sample data and its format are shown in Table 1. And for the four columns of this table, we used MATLAB programming software to determine gradually the optimal number of neurons in the hidden layer of the neural network, as well as to simulate and eventually predict the MBR membrane fouling. After determining the input and output vectors, we normalized them.
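The variable-screening step can be sketched as a standard PCA over the standardized operating records. The data matrix below is synthetic random data standing in for the 80 plant records, so the component split is illustrative only; the mechanics (standardize, eigendecompose the covariance, keep enough components for a variance threshold) are the same.

```python
import numpy as np

# Minimal PCA sketch of the variable-screening step; the 80x5 data
# matrix is synthetic, standing in for the plant's operating records.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))                 # 5 candidate operating variables
X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize first

cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)       # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

ratio = eigvals / eigvals.sum()              # variance explained per component
cum = np.cumsum(ratio)
k = int(np.searchsorted(cum, 0.85)) + 1      # keep enough components for 85%
print(k, ratio[:k])
```

On real, correlated plant data a few components dominate, which is how the three factors (MLSS, total resistance, operating pressure) were retained.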

In the experiment, we recorded the evolution generations the BP network needs to meet the preset accuracy requirements under different numbers of units in the hidden layer. And after the evolution of each network was completed, we input test data to verify its accuracy and calculated the respective average prediction error. After the network structure was determined, we input training data for network training, saved the network structure after training, and input forecasting data for validation. And we analyzed the prediction error between the actual output data and the desired one, thus evaluating the performance of the BP network prediction of the MBR membrane flux.

2.2. The Simulation and Results. The setting of the BP algorithm parameters was as follows: the initial learning rate was 0.08; the initial space of the weights was B = (−1, 1); the transfer function from the input layer to the hidden layer was logsig, and from the hidden layer to the output layer was purelin; the training function was trainlm; the error goal was set as 2e−4. We selected randomly 74 groups from the experimental samples as training samples to train the BP neural network. And we selected the remaining 6 groups as test samples to analyze the degree of fitting between the measured values and the predicted values of the neural network model, so as to test the effect of learning.
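A from-scratch sketch of this 3-7-1 setup is below (logsig hidden layer, purelin output, weights drawn from (−1, 1), squared-error goal 2e−4). The 74 training samples are synthetic stand-ins, and plain gradient descent with the paper's initial rate 0.08 replaces MATLAB's trainlm for brevity, so only the structure, not the convergence speed, matches the paper.

```python
import numpy as np

# Sketch of the 3-7-1 BP network described above; data is synthetic
# and plain gradient descent stands in for trainlm.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(74, 3))        # normalized MLSS, pressure, resistance
y = X @ np.array([[0.5], [-0.3], [0.2]])    # stand-in target flux

W1 = rng.uniform(-1, 1, (3, 7)); b1 = np.zeros(7)
W2 = rng.uniform(-1, 1, (7, 1)); b2 = np.zeros(1)
lr = 0.08                                   # initial learning rate from the paper

history = []
for epoch in range(2000):
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))   # logsig hidden layer
    out = h @ W2 + b2                          # purelin output
    err = out - y
    history.append(float((err ** 2).mean()))
    if history[-1] < 2e-4:                     # error goal reached
        break
    g2 = 2.0 * err / len(X)                    # output-layer gradient
    g1 = (g2 @ W2.T) * h * (1.0 - h)           # backpropagated hidden gradient
    W2 -= lr * h.T @ g2; b2 -= lr * g2.sum(axis=0)
    W1 -= lr * X.T @ g1; b1 -= lr * g1.sum(axis=0)

print(history[-1] < history[0])                # error decreases with training
```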

By the simulation of the training samples and testing samples, we get the following charts (Figures 1 and 2 and Table 2).

As shown in Figure 1, the network error of the model stabilized after 4254 iterations, which indicated that the convergence speed of the BP network is slow. And the error was bigger in a few iterations. But at the end of the training, the mean square error of the network was 0.000199978 and reached the predetermined value of 0.0002. From the histogram of the prediction results in Figure 2, we can see that the network-predicted and measured values were basically consistent. It suggested that the BP network model of MBR membrane fouling was relatively successful and can basically realize the prediction of the MBR membrane flux.

However, it is shown from the relative prediction error values in Table 2 that the maximum prediction error is 16.27%, which is relatively large. It explains that the forecast of the BP neural network is not accurate enough for individual points, and the average relative error is 7.40%, which needs to be improved.

2.3. Defects and Improvement of BP Neural Network

2.3.1. Defects. Among artificial neural networks, the BP neural network is the most widely used and has many remarkable advantages, but some limitations still exist [1, 2]. They are mainly the following aspects:

(1) slow convergence speed in the learning process

(2) easy to fall into local minima

(3) difficult to determine network structure

(4) lack of an effective selection method for the learning step.

So, in order to make the BP neural network play a greater role in the field of forecasting, we need to make appropriate improvements in actual modeling.

Abstract and Applied Analysis 3

Table 1: Data of membrane flux, pressure, and so forth, collected.

Time (h) | Temperature (°C) | Pressure (MPa) | MLSS of inflow (mg/L) | Total resistance (×10^12 m^−1) | Flux (L/(m²·h))
1 | 24 | 0.016 | 3433.8 | 2.05 | 46.4
2 | 24 | 0.0168 | 3781.5 | 2.403 | 45.5
3 | 24 | 0.0175 | 3924.2 | 2.989 | 45.3
4 | 24 | 0.0243 | 4724.3 | 3.796 | 42.2
5 | 24 | 0.0268 | 4837.3 | 3.957 | 45.1
6 | 24 | 0.0291 | 5831.6 | 4.41 | 42.2
7 | 24 | 0.0325 | 5564.3 | 5.119 | 39.7
8 | 24 | 0.0362 | 5035.6 | 6.075 | 37.3
9 | 24 | 0.0351 | 5914.1 | 7.143 | 31.4
10 | 24 | 0.0385 | 5618.5 | 8.421 | 28.9
11 | 24 | 0.0269 | 6124.2 | 7.623 | 21.7
12 | 24 | 0.0226 | 6554.7 | 9.737 | 14.5
13 | 24 | 0.0198 | 7124.3 | 11.659 | 11.2
14 | 24 | 0.0193 | 6151.2 | 12.911 | 10.5
15 | 24 | 0.0187 | 7158.9 | 12.54 | 9.4
16 | 24 | 0.0157 | 3312.8 | 1.793 | 51.8
17 | 24 | 0.0164 | 3801.5 | 2.4175 | 46.5
18 | 24 | 0.0197 | 3914.1 | 3.197 | 42.3
19 | 24 | 0.0239 | 4424.4 | 3.673 | 41.2
20 | 24 | 0.0264 | 4937.2 | 3.961 | 43.4
21 | 24 | 0.0283 | 5872.3 | 4.438 | 39.2
22 | 24 | 0.0319 | 5684.3 | 5.233 | 37.8
23 | 24 | 0.0355 | 5835.6 | 6.019 | 36.7
24 | 24 | 0.0369 | 5854.1 | 7.013 | 32.5
25 | 24 | 0.0384 | 5716.7 | 8.025 | 28.9
26 | 24 | 0.0268 | 6013.2 | 9.043 | 21.6
27 | 24 | 0.0234 | 6541.5 | 9.941 | 15.9
28 | 24 | 0.0196 | 6982.3 | 10.892 | 11.3
29 | 24 | 0.0191 | 6957.2 | 12.136 | 10.1
30 | 24 | 0.0185 | 7094.6 | 12.759 | 8.4

Table 2: Relative errors of membrane flux prediction of BP network.

Desired values (L/(m²·h)): 11.2000, 45.3000, 42.3000, 9.4000, 21.7000, 28.9000
Actual values (L/(m²·h)): 10.2366, 47.8755, 41.6611, 8.9128, 23.2551, 24.1967
Relative error: 0.0860, 0.0569, 0.0151, 0.0518, 0.0717, 0.1627
Average relative error: 0.0740

2.3.2. Improvement. In response to these problems, researchers have proposed many effective improvements, for example, the following.

(1) Momentum Additional Method. The momentum additional method is based on the method of reverse propagation and at the same time takes into account the effect of the error on the spatial gradient and error surface. The change of each new connection weight includes a value proportional to the last connection-weight change. Without additional momentum, the network may fall into a shallow local minimum; with the role of additional momentum, it can slip past that minimum.

(2) Adaptive Learning Rate. The learning rate is usually obtained by experience or experiment, but even a learning rate whose effect is good in initial training cannot guarantee appropriate training later. So the learning rate needs to be corrected adaptively. Its criteria are as follows: check whether the error function really decreases after the weights are corrected. If it does, the learning rate selected is small and can be increased by a quantity; if not, the value of the learning rate should be reduced.
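Improvements (1) and (2) can be sketched together on a toy one-dimensional objective: gradient descent keeps a momentum term, and the learning rate grows when an update reduced the error and shrinks when it did not. The objective, the momentum factor, the growth/shrink factors, and the cap are all illustrative assumptions, not values from the paper.

```python
# Toy sketch of momentum plus the adaptive learning-rate rule above;
# all constants are illustrative assumptions.
def error(w):                 # toy error surface E(w) = (w - 3)^2
    return (w - 3.0) ** 2

def grad(w):                  # its gradient
    return 2.0 * (w - 3.0)

w, velocity = 0.0, 0.0
lr, momentum = 0.1, 0.5
prev_err = error(w)
for _ in range(100):
    velocity = momentum * velocity - lr * grad(w)  # keep part of the last step
    w += velocity
    err = error(w)
    if err < prev_err:
        lr = min(lr * 1.05, 0.4)  # error fell: grow the rate (capped for stability)
    else:
        lr *= 0.7                 # error rose: shrink the rate
    prev_err = err

print(abs(w - 3.0) < 0.5)         # w has moved close to the minimum at 3
```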

(3) Optimize Initial Weights and the Threshold of the Network. The connection weights between the neurons of each layer of the neural network contain all the knowledge of the neural network system. As the BP network is based on the gradient-descent technique, it may produce entirely different results for different initial values given randomly, as in the traditional method. Once the values are improper, they will cause oscillation or nonconvergence of the network, and it is also easy to fall into local extrema in complex situations. The genetic algorithm has independence from the initial values, global search and optimization ability, a faster convergence rate, and so forth. So we can use the genetic algorithm to search for proper initial weights of the BP neural network to optimize the network, which are then assigned to the BP network for training.

Figure 1: The error curve of membrane flux in BP network training process (training stopped at 4254 epochs; performance 0.000199978, goal 0.0002).

3. Optimize Parameters of BP Model of the MBR Membrane Fouling by Genetic Algorithm

As known from the last section, the BP neural network can achieve the prediction of the MBR membrane flux, but the accuracy and speed need further improvement. The BP algorithm only adjusted the connection weights of the neural network from the local angle but did not examine the whole learning process from the global perspective, so it easily converges to a local minimum. The first two methods for improving the BP algorithm proposed in the last section essentially adjust the network weights in a local area and cannot overcome the local-searching feature of the BP algorithm.

To avoid this, we need to consider it from the global angle. One of the ways to improve it is using the third method proposed in the last section: introducing a global intelligent optimization searching method, the genetic algorithm, into the learning process of the neural network. So this section considered adopting the genetic algorithm to search for the initial weights and thresholds suitable for the BP neural network, so as to realize the optimization of the network and achieve the purpose of improving the MBR membrane fouling prediction.

Figure 2: Measured values of membrane flux and predicted values of BP network (prediction result of membrane flux of BP for input forecast samples 1-6; desired value versus BP output).

3.1. Genetic Algorithm. The genetic algorithm (GA) is a kind of random searching algorithm referencing natural selection and the genetic mechanism of biological evolution. The algorithm is a new class of general optimization algorithms for solving general nonlinear optimization problems. It does not require the model to be linear, continuous, or differentiable and is also less affected by the number of variables and constraints. And it can adaptively learn and accumulate information about the search space so as to find the optimal solution. GA has been applied to function optimization, combinatorial optimization, automatic control, robotics, parameter optimization of neural network structure design, image processing systems, and other fields.

Compared with other optimization algorithms, GA optimization is a kind of robust and global searching algorithm, and it mainly has the following features:

(1) adopting the encoding of the decision variables asoperands

(2) directly taking values of the objective function assearching information

(3) using multiple searching points to search for information at the same time

(4) using probabilistic search technology

3.2. Optimization of BP Network Parameters by GA

3.2.1. Description of the Algorithm. The BP algorithm is a kind of local searching method based on gradient descent. After the initial weight vector is determined, its searching direction is also determined. So the algorithm can only start from the initial value and gradually go in that direction until it meets the stopping criteria of the algorithm. Its advantage is a better ability of local optimization, but the drawback


is that it easily falls into a local minimum and has a slow convergence rate. This is because the initial weight vector of the BP network is selected randomly, which makes the direction of the optimization process uncertain. So the effect of the solution is sometimes good and sometimes bad. As a kind of probabilistic optimization algorithm based on the law of biological evolution, the genetic algorithm has a strong adaptive searching and global optimization ability and can avoid local minimum points with great probability. Meanwhile, it does not need gradient information for solving problems, and its operation is more convenient. This shows that the genetic algorithm is suitable to be used to enhance the learning ability of the BP algorithm. Therefore, we introduced GA ideas into the theory of the BP network. The two can well complement each other and achieve the purpose of global optimization quickly and efficiently. In fact, the combination of the genetic algorithm and the neural network has become one of the focuses of the research and application of artificial intelligence algorithms [3]. Making full use of the advantages of both, the new algorithm has the robustness and self-learning ability of the neural network and also the global searching ability of the genetic algorithm.

Optimizing the weights and thresholds of the BP neural network by GA is divided into two parts. The first part used the genetic algorithm embedded in the neural network and searched for the best individual in the probable scope of the weights of the BP network. The second part continued to use the BP algorithm: on the basis of the GA optimization, we used the optimal individual searched by GA as the initial weights and thresholds of the BP network and then directly used the BP algorithm to train the network. So this section mainly described the method achieved under the premise of a fixed structure of the BP network; that is, we first used GA to optimize the initial connection weights and thresholds of the BP network and then used the BP algorithm to search accurately for the optimal solution of the weights and thresholds.

3.2.2. Steps of Implementation. GA optimizes the weights and thresholds of the neural network of the BP algorithm, whose steps of implementation are as follows.

(1) Encoding and Initialization. Using the genetic algorithm to optimize the neural network, the main focus is to optimize the connection weights and thresholds between the neurons in the neural network. The initial populations are neural networks whose connection weights and thresholds are initialized. Because the connection weights and thresholds of the network are real numbers, each connection weight and threshold is directly expressed by a real number. This encoding is more intuitive compared with binary encoding, and its accuracy is also higher. We set the size of the group as N, and each individual of the population contained S genes (connection weights and thresholds), S = R × S_1 + S_1 × S_2 + S_1 + S_2, where R, S_1, S_2 represented the number of neurons in the input layer, the number of neurons in the hidden layer, and the number of neurons in the output layer. The scope of each real number was set as (−1, 1), and we get the initial population P(t).

(2) Calculation of Fitness. Calculate the fitness value of all individuals of the population. We assigned the S weights and thresholds of the initial group to the BP neural network for forward propagation of the input signal and calculated the sum of squared errors between the output value and the desired value of the network. So the fitness function can be measured by the reciprocal of the sum of squared errors [4] as follows:

fitness(sol) = 1 / Σ_{i=1}^{n_0} (o_i − y_i)², (4)

where sol means any individual of the population, o_i means the actual output of the ith output neuron, y_i means the desired output of the ith output neuron, and n_0 means the number of output neurons. The smaller the difference between the actual output and the target output is, the bigger the value of the fitness function is, and the closer the individual is to the desired requirement.
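The fitness evaluation of equation (4) can be sketched as: decode an individual into network weights, run the forward pass, and score it by the reciprocal of the sum of squared errors. The shapes follow the paper's 3-7-1 network (36 genes), but the sample data and the gene layout chosen here are illustrative assumptions; this is Python, while the paper used MATLAB.

```python
import numpy as np

# Sketch of the fitness evaluation in eq. (4); the gene layout and
# the sample data are assumptions.
def decode(ind):
    """Split a 36-gene individual into W1 (3x7), B1 (7), W2 (7x1), B2 (1)."""
    W1 = ind[:21].reshape(3, 7)
    B1 = ind[21:28]
    W2 = ind[28:35].reshape(7, 1)
    B2 = ind[35:]
    return W1, B1, W2, B2

def fitness(ind, X, y):
    W1, B1, W2, B2 = decode(ind)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + B1)))  # logsig hidden layer
    out = h @ W2 + B2                         # purelin output
    sse = float(((out - y) ** 2).sum())       # sum of squared errors
    return 1.0 / sse                          # eq. (4): reciprocal of SSE

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (10, 3))
y = rng.uniform(0, 1, (10, 1))
ind = rng.uniform(-1, 1, 36)
print(fitness(ind, X, y) > 0)  # smaller error gives larger fitness
```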

(3) Selection Sort. After calculating the fitness of each individual in the population, we sorted all individuals in the population according to their respective fitness. According to this order, the probability of each individual being selected is determined, that is, the proportional selection method:

P(sol_i) = f_i / Σ_{i=1}^{M} f_i. (5)

It ensured that the bigger the fitness value of an individual is, the more likely it is to be selected, and the lower the fitness value of an individual is, the less likely it is to be selected. But an individual with a small fitness value may also be selected. Therefore, while choosing, we joined the best-choice strategy and retained the optimal individual of each generation into the offspring.
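The selection step above can be sketched as roulette-wheel sampling with the probabilities of equation (5), plus a guaranteed copy of the best individual (the elitist retention described above). The population and fitness values are illustrative.

```python
import random

# Sketch of fitness-proportional selection with elitist retention;
# the population and fitness values are illustrative.
def select(population, fitnesses, n):
    total = sum(fitnesses)
    probs = [f / total for f in fitnesses]         # eq. (5): P_i = f_i / sum(f)
    chosen = random.choices(population, weights=probs, k=n - 1)
    best = population[fitnesses.index(max(fitnesses))]
    return chosen + [best]                         # elitism: the best always survives

random.seed(0)
pop = ["a", "b", "c", "d"]
fit = [0.1, 0.4, 0.2, 0.3]
new_pop = select(pop, fit, 4)
print(new_pop[-1])  # -> b, the fittest individual
```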

(4) Genetic Operators

(a) Crossover Operator. Crossover is the most important means of getting new best individuals. This paper used arithmetic crossover, which produced two new individuals by a linear combination of two individuals. We set two individuals X_A^i and X_B^i that crossed each other; then the new individuals produced are

X_A^{i+1} = αX_B^i + (1 − α)X_A^i,
X_B^{i+1} = αX_A^i + (1 − α)X_B^i, (6)

where α ≤ P_c, α (0 < α < 1) is a random number generated for each pair of parent individuals, and P_c is the crossover rate of each pair of parent individuals.

(b) Mutation Operator. Use uniform mutation. We set the mutation probability as P_m and set the mutation point as x_k, whose range is [U_k^min, U_k^max]. Then the new genetic value of the mutation point is x'_k = U_k^min + r(U_k^max − U_k^min), where r is a random number in the range [0, 1].

(5) Generate New Populations. We inserted the new individuals into the population P(t), which generated a new population P(t + 1). Then we gave the connection weights of the individuals of


the new population to the neural network and computed the fitness function of the new individuals. If it reached the predetermined value, then we entered the next step; otherwise, we continued the genetic operations.

(6) Assign Values to the Initial Weights of BP Network. We decoded the optimal individual searched by GA and assigned it to the corresponding initial connection weights of the BP network. Then we continued to use the BP algorithm to train the network until the error sum of squares achieved the specified accuracy E_bp (E_bp < E_ga) or reached the set maximum number of iterations, and the algorithm ends.

The first part of the genetic BP algorithm was that we embedded the neural network in the genetic algorithm and searched out the optimal individual within the approximate range of the neural network weights. If the error sum of squares achieved E_ga, we entered the next step, without requiring it to reach the highest accuracy. The second part continued to adopt the BP algorithm to train the network. On the basis of the optimization by the genetic algorithm, we set the optimal individual searched by GA as the initial weights of the neural network and then directly used the BP algorithm until the error sum of squares achieved the minimum error of the problem. Theoretically, the genetic algorithm has global searching ability and can quickly determine the area of the global optimal solution, but its local searching ability is weak and it cannot search out the optimal solution quickly. So, after the genetic algorithm searched the approximate scope of the global optimal solution, we gave full play to the local searching of the BP algorithm. With the two parts combined organically, it can better accelerate operation efficiency and increase the learning ability of the neural network.
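The two-part procedure can be sketched on a toy one-dimensional error surface: a GA coarse search stopped at the looser goal E_ga, then gradient descent (standing in for BP) from the best individual down to the tighter goal E_bp < E_ga. The surface, population size, mutation scale, and goals are all illustrative assumptions.

```python
import random

# Toy sketch of the two-stage GA-BP idea; all constants are assumptions.
def error(w):
    return (w - 0.7) ** 2              # stand-in error surface, minimum at 0.7

def ga_stage(e_ga, pop_size=30, generations=50):
    pop = [random.uniform(-1, 1) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=error)
        if error(pop[0]) < e_ga:       # coarse goal E_ga reached: stop early
            break
        parents = pop[:pop_size // 2]  # keep the best half
        pop = parents + [p + random.gauss(0, 0.1) for p in parents]
    return min(pop, key=error)

def bp_stage(w, e_bp, lr=0.1):
    while error(w) >= e_bp:            # fine local search by gradient descent
        w -= lr * 2.0 * (w - 0.7)
    return w

random.seed(3)
w0 = ga_stage(e_ga=1e-2)               # GA locates the region of the optimum
w = bp_stage(w0, e_bp=1e-8)            # "BP" refines it to the tight goal
print(error(w) < 1e-8)                 # True: the two stages reach the fine goal
```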

3.3. Model of GA-BP Neural Network of the MBR Membrane Fouling Prediction

3.3.1. The Design of the GA-BP Model of the MBR Membrane Fouling

(1) Set BP Network Parameters. The BP network structure had been determined in Section 2.2. In order to compare with the BP neural network, we used the same parameter settings as in the second section: the initial learning rate was 0.08; the transfer function from the input layer to the hidden layer was logsig, and from the hidden layer to the output layer was purelin; the training function was trainlm; the error goal was set as 2e−4.

(2) Optimize Weights and the Threshold of BP Network by the GA. Because the weights can be expressed as real numbers, we used the real-coding method. In addition, because the variables themselves are real numbers, we can directly calculate the target value and fitness value without decoding, which can accelerate the searching speed [4] and is easy to combine with the BP algorithm. Each connection weight is directly expressed by a real number; the bit string obtained by encoding all weights and thresholds was called an individual, sol = [s_1, s_2, ..., s_L]. Each weight or threshold corresponded to a bit of the individual, which was called a gene. We put all weights and thresholds together in the order W_1, B_1, W_2, B_2, which formed the chromosome of the real-coded GA, and its length was L = n_i × n + n + n × n_0 + n_0. In this paper there
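The chromosome-length formula can be checked directly for the 3-7-1 network used here; this small Python helper just restates the count of weight and threshold entries.

```python
# Check of the chromosome length L = n_i*n + n + n*n_0 + n_0
# for the 3-7-1 network described above.
def chromosome_length(n_in, n_hidden, n_out):
    weights = n_in * n_hidden + n_hidden * n_out  # W1 and W2 entries
    thresholds = n_hidden + n_out                 # B1 and B2 entries
    return weights + thresholds

print(chromosome_length(3, 7, 1))  # -> 3*7 + 7 + 7*1 + 1 = 36
```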

The genetic algorithm basically does not use external information in the process of the evolutionary search and is only based on the fitness function. So the choice of the fitness function is crucial; it directly affects the convergence rate of the genetic algorithm and the ability to find the optimal solution. Since the objective function is the error sum of squares of the neural network, and we seek its minimum value, the fitness function adopts the reciprocal of the error function.

The selection operation used the fitness-proportion method, which is the most common way; the crossover used the method of arithmetic crossover; and the mutation used the operation of nonuniform mutation. It was implemented with MATLAB statements as follows:

[x, endPop, bPop, trace] = ga(aa, 'gabpEval', [], initPop, [1e-6 1 1], 'maxGenTerm', gen, 'normGeomSelect', [0.09], ['arithXover'], [2], 'nonUnifMutation', [2 gen 3])

If the value of the fitness function corresponding to the optimal individual met the accuracy requirements, or the number of iterations (T = 100) that we set was reached, or the change of the average fitness value was less than a certain constant over a certain number of generations, then we finished the training. At this time, we set the individual that had the largest fitness as the output of the optimal solution, and it gave the initial weights and thresholds that needed to be optimized. Otherwise, we continued the cycle: after reordering the current parents and offspring, we chose N individuals that had higher fitness values as the next generation of the group. And we computed their fitness and trained them again until satisfying the termination conditions above.

(3) Train the BP Network. We decoded the optimal solution obtained from the genetic algorithm and assigned it to the BP network, before starting training, as the initial weights of the BP network. Then, according to the BP algorithm, we input the training samples for training and learning of the network and calculated the error between the output value and the expected value. If it was beyond the precision requirement, then we turned to the backpropagation process and returned the error signal along the way; at the same time, according to the size of the error of each layer, we adjusted the weights and thresholds layer by layer. When the error E was less than the given value 2e−4 or the predetermined training number was reached, we ended the BP algorithm. We saved the trained weights and thresholds of the hidden layer as the new initial weights and thresholds of the network and then input the test samples for simulation.

In summary, we used MATLAB 7.10.0 (R2010a), with the genetic algorithm toolbox GAOT installed, to compile an M-file which achieved the optimization of the weights and thresholds; the core code is as follows.

Abstract and Applied Analysis 7

Table 3: Relative errors of the membrane flux prediction of the GA-BP network.

Desired values (L/m²·h)  39.2000  11.3000  43.4000  41.2000  46.5000  11.2000
Actual values (L/m²·h)   36.7028  10.7743  44.6790  41.5090  45.3245  10.9101
Relative error            0.0637   0.0465   0.0295   0.0075   0.0253   0.0259
Average relative error    0.0331

    % Train the weights and thresholds of BP by GA. P and T are a pair of
    % training samples (the input and output of the network); R, S1, and S2
    % are the numbers of input, hidden, and output neurons; S is the
    % encoding length of the genetic algorithm.
    [P, T, R, S1, S2, S] = nninit;
    aa = ones(S, 1) * [-1 1];        % the search scope of the group
    popu = 50;                       % the size of the population
    % Initial population; gabpEval is the objective (fitness) function
    initPop = initializega(popu, aa, 'gabpEval');
    gen = 100;                       % the number of generations
    [x, endPop, bPop, trace] = ga(aa, 'gabpEval', [], initPop, [1e-6 1 1], ...
        'maxGenTerm', gen, 'normGeomSelect', [0.09], ...
        ['arithXover'], [2], 'nonUnifMutation', [2 gen 3]);
    % Decode the weights and thresholds from the code x; gadecod is the
    % decoding function
    [W1, B1, W2, B2, P, T, A1, A2, SE, val] = gadecod(x);
    % Build the network, then assign the decoded values before training
    net = newff(inputXX, outputYY, 7, {'tansig', 'purelin'}, 'trainlm');
    net.IW{1,1} = W1;
    net.LW{2,1} = W2;
    net.b{1} = B1;
    net.b{2} = B2;
    % Train the network with the new weights and thresholds
    [net, tr] = train(net, inputXX, outputYY);
    % Simulation test
    GABP = sim(net, input_test);

3.4. Experimental and Simulation Results. To illustrate the performance of the genetic algorithm for designing and optimizing the weights and thresholds of neural networks, this paper used the same sample data; that is, we randomly selected 74 groups as the training samples and used the rest as test data, to observe the training and testing performance of the GA-BP network on the MBR membrane flux.

In the experiment, the error sum of squares and the fitness curve of the genetic algorithm are shown in Figure 3. The figure shows how, as the genetic algorithm converges, the error sum of squares and the fitness change with the increase of genetic generations: on the whole, the error decreases and the fitness increases.

In the experiment, the BP network optimized by the genetic algorithm needed only 106 iterations to reach the same predetermined error, which significantly accelerates the training speed of the network.

Testing on the remaining 6 groups of data, the experimental result is shown in Figure 4. Obviously, the GA-BP network also realized a reasonable prediction of the membrane flux. And we can see from Table 3 that the biggest relative error is reduced to 6.37% and the average relative error is reduced to 3.31%. Obviously, it had a more accurate prediction effect compared to the BP network without optimization, and it realized a more reasonable forecast for the process of MBR membrane fouling. It is visible that, after optimization, the self-learning ability of the GA-BP neural network was stronger than that of the unoptimized BP neural network, and it had relatively higher prediction accuracy, better generalization and nonlinear mapping ability, and better network performance.

4. Conclusion and Outlook

For the MBR membrane fouling factors, the traditional mathematical models of MBR membrane fouling have defects to different degrees and cannot accurately explain the phenomenon of membrane fouling. By consulting the relevant references, this paper introduced intelligent algorithms, such as the neural network and the genetic algorithm, to build the model of MBR membrane fouling and implement the relevant optimization. First, we established the system model of MBR intelligent simulation based on the BP neural network, describing the relationship between the membrane fouling factors and the membrane flux that characterizes the degree of membrane fouling. The experiment found that the predicted values were basically consistent with the desired values, and only the error of a few points was larger; it indicated that the BP neural network prediction model for membrane fouling was fairly successful. Then, referencing the characteristics of the genetic algorithm, and optimizing the weights and thresholds of the BP neural network by the genetic algorithm, this paper established the prediction model of MBR membrane fouling based on the GA-BP neural network. The experimental study showed that the optimized GA-BP network was better than the nonoptimized BP network model in convergence rate and prediction precision. It shows that,


Figure 3: The error sum of squares (a) and the fitness curve (b) of the genetic algorithm over 100 generations (MAX and AVG values; the fitness curve ends at T = 100).

Figure 4: Predicted and measured values of the GA-BP membrane flux for the 6 input forecast samples (desired values versus GA-BP output).

compared to the relatively simple BP network, the GA-BP network was more suitable for the prediction of MBR membrane fouling.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] H.-M. Ke and C.-Z. Chen, "Study on BP neural network classification with optimization of genetic algorithm for remote sensing imagery," Journal of Southwest University, vol. 32, no. 7, article 128, 2010.

[2] Y.-S. Xu and J.-H. Gu, "Handwritten character recognition based on improved BP neural network," Communications Technology, vol. 5, no. 44, article 107, 2011.

[3] A. A. Javadi, R. Farmani, and T. P. Tan, "A hybrid intelligent genetic algorithm," Advanced Engineering Informatics, vol. 19, no. 4, pp. 255–262, 2005.

[4] X.-F. Meng, H.-R. Lu, and L. Guo, "Approximately duplicate record detection method based on neural network and genetic algorithm," Computer Engineering and Design, vol. 31, no. 7, pp. 1550–1553, 2010.




and thus further demonstrated the technological advantages of this process.

2. Prediction Model of MBR Membrane Fouling Based on BP Algorithm

2.1. Intelligent BP Simulation Model of MBR Membrane Fouling. Membrane flux is an important indicator of the degree of membrane fouling, so this paper used intelligent simulation of the MBR membrane flux to feed back the membrane fouling. The membrane fouling index can be expressed as

f = (1 − J_v / J_0) × 100.  (1)

According to the filtration model based on Darcy's law, the membrane flux J_v is

J_v = Δp / [μ (R_m + R_g + R_c)],  (2)

where Δp is the pressure difference on both sides of the membrane (Pa), μ is the filtrate viscosity (Pa·s), R_m is the resistance of the membrane (1/m), R_g is the resistance of membrane fouling (1/m), and R_c is the resistance of the cake layer (1/m).

In the initial state, R_c and R_g are both 0 and MLSS is also 0. When the initial viscosity value is μ_0, we can obtain the equation of the initial membrane flux as follows:

J_0 = Δp / (μ_0 R_m).  (3)
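As a quick numerical illustration of (1)–(3), the Darcy-law relations can be written out in Python; the parameter values below are assumed placeholders for illustration, not measurements from the plant.

```python
def membrane_flux(dp, mu, r_m, r_g=0.0, r_c=0.0):
    # Darcy-law filtration model, eq. (2): J = dp / (mu * (Rm + Rg + Rc))
    return dp / (mu * (r_m + r_g + r_c))

def fouling_index(j_v, j_0):
    # Eq. (1): f = (1 - Jv/J0) * 100, in percent
    return (1.0 - j_v / j_0) * 100.0

dp = 2.0e4      # transmembrane pressure (Pa), assumed value
mu0 = 1.0e-3    # initial filtrate viscosity (Pa*s), assumed value
r_m = 2.0e12    # membrane resistance (1/m), assumed value

j_0 = membrane_flux(dp, mu0, r_m)                      # eq. (3): Rg = Rc = 0
j_v = membrane_flux(dp, mu0, r_m, r_g=5e11, r_c=5e11)  # fouled membrane
f = fouling_index(j_v, j_0)  # the index rises as Rg + Rc accumulate
```

The fouling index stays at 0 for a clean membrane and grows toward 100 as the fouling and cake resistances come to dominate the total resistance.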

Finally, the membrane flux is determined by many factors, which is exactly the question this paper discussed and researched.

Mathematical simulation models are built on the basis of certain assumptions. This paper assumed that the entire membrane bioreactor system had been running in a stable state, that the activated sludge was completely trapped in the MBR, and that the materials in the MBR were fully mixed.

On this basis, for the historical data (a total of 80 groups) from the industrial production and tests of an MBR sewage treatment plant in Shijiazhuang city, after treating them by collection, statistical analysis, and principal component analysis, we determined the MLSS of the inflow, the operating pressure, and the total resistance as the three main factors affecting the MBR membrane flux. The sample data and its format are shown in Table 1. For the four columns of this table, we used MATLAB programming to determine gradually the optimal number of neurons in the hidden layer of the neural network, as well as to simulate and eventually predict the MBR membrane fouling. After determining the input and output vectors, we normalized them.
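The principal-component screening step can be sketched as follows; as a stand-in for the full 80-group data set, this uses only the first six rows of Table 1 (operating pressure, MLSS, total resistance), so the numbers merely illustrate the procedure.

```python
import numpy as np

# Columns: operating pressure (MPa), MLSS of inflow (mg/L),
# total resistance (x10^12 1/m) -- first six rows of Table 1
X = np.array([
    [0.0160, 3433.8, 2.050],
    [0.0168, 3781.5, 2.403],
    [0.0175, 3924.2, 2.989],
    [0.0243, 4724.3, 3.796],
    [0.0268, 4837.3, 3.957],
    [0.0291, 5831.6, 4.410],
])
Z = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize each factor
eigvals = np.linalg.eigvalsh(np.cov(Z, rowvar=False))[::-1]
explained = eigvals / eigvals.sum()             # explained-variance ratios
# the leading component dominates because these factors rise together
```

On the full data set the same computation ranks the candidate operating parameters by how much of the overall variance they carry.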

In the experiment, we recorded the evolution generations that the BP network needs to meet the preset accuracy requirements for different numbers of units in the hidden layer. After the evolution of each network was completed, we input the test data to verify its accuracy and calculated the respective average prediction error. After the network structure was determined, we input the training data for network training, saved the network structure after training, and input the forecasting data for validation. Then we analyzed the prediction error between the actual output data and the desired one, thus evaluating the performance of the BP network prediction of the MBR membrane flux.

2.2. The Simulation and Results. The setting of the BP algorithm parameters was as follows: the initial learning rate was 0.08; the initial space of the weights was B = (−1, 1); the transfer function from the input layer to the hidden layer was logsig, and from the hidden layer to the output layer it was purelin; the training function was trainlm; the error index was set as 2e−4. We randomly selected 74 groups from the experimental samples as training samples to train the BP neural network, and we selected the remaining 6 groups as test samples to analyze the degree of fitting between the measured values and the predicted values of the neural network model, so as to test the effect of learning.

By the simulation of the training samples and testing samples, we obtained the following results (Figures 1 and 2 and Table 2).

As shown in Figure 1, the network error of the model stabilized after 4254 iterations, which indicated that the convergence speed of the BP network is slow, and the error was bigger in a few iterations. But at the end of the training, the mean square error of the network was 0.000199978 and reached the predetermined value of 0.0002. From the histogram of the prediction results in Figure 2, we can see that the network predicted values and the measured values were basically consistent. It suggested that the BP network model of MBR membrane fouling was fairly successful and can basically realize the prediction of the MBR membrane flux.

However, it is shown from the relative prediction error values in Table 2 that the maximum prediction error is 16.27%, which is relatively large. It explains that the forecast of the BP neural network is not accurate enough for individual points, and the average relative error of 7.40% also needs to be improved.

2.3. Defects and Improvement of BP Neural Network

2.3.1. Defects. Among artificial neural networks, the BP neural network is the most widely used and has many remarkable advantages, but some limitations still exist [1, 2]. They are mainly the following aspects:

(1) slow convergence speed in the learning process

(2) easy to fall into local minima

(3) difficult to determine network structure

(4) lack of an effective selection method for the learning step size.

So, in order to make the BP neural network play a greater role in the field of forecasting, we need to make appropriate improvements in actual modeling.


Table 1: Data of membrane flux, pressure, and other parameters collected.

Time (h)  Temperature (°C)  Pressure (MPa)  MLSS of inflow (mg/L)  Total resistance (×10¹² m⁻¹)  Flux (L/m²·h)
 1   24  0.0160  3433.8   2.0500  46.4
 2   24  0.0168  3781.5   2.4030  45.5
 3   24  0.0175  3924.2   2.9890  45.3
 4   24  0.0243  4724.3   3.7960  42.2
 5   24  0.0268  4837.3   3.9570  45.1
 6   24  0.0291  5831.6   4.4100  42.2
 7   24  0.0325  5564.3   5.1190  39.7
 8   24  0.0362  5035.6   6.0750  37.3
 9   24  0.0351  5914.1   7.1430  31.4
10   24  0.0385  5618.5   8.4210  28.9
11   24  0.0269  6124.2   7.6230  21.7
12   24  0.0226  6554.7   9.7370  14.5
13   24  0.0198  7124.3  11.6590  11.2
14   24  0.0193  6151.2  12.9110  10.5
15   24  0.0187  7158.9  12.5400   9.4
16   24  0.0157  3312.8   1.7930  51.8
17   24  0.0164  3801.5   2.4175  46.5
18   24  0.0197  3914.1   3.1970  42.3
19   24  0.0239  4424.4   3.6730  41.2
20   24  0.0264  4937.2   3.9610  43.4
21   24  0.0283  5872.3   4.4380  39.2
22   24  0.0319  5684.3   5.2330  37.8
23   24  0.0355  5835.6   6.0190  36.7
24   24  0.0369  5854.1   7.0130  32.5
25   24  0.0384  5716.7   8.0250  28.9
26   24  0.0268  6013.2   9.0430  21.6
27   24  0.0234  6541.5   9.9410  15.9
28   24  0.0196  6982.3  10.8920  11.3
29   24  0.0191  6957.2  12.1360  10.1
30   24  0.0185  7094.6  12.7590   8.4

Table 2: Relative errors of the membrane flux prediction of the BP network.

Desired values (L/m²·h)  11.2000  45.3000  42.3000   9.4000  21.7000  28.9000
Actual values (L/m²·h)   10.2366  47.8755  41.6611   8.9128  23.2551  24.1967
Relative error            0.0860   0.0569   0.0151   0.0518   0.0717   0.1627
Average relative error    0.0740
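The relative errors in Table 2 can be reproduced directly from the desired and actual values:

```python
# Desired and actual membrane fluxes (L/m^2*h) from Table 2
desired = [11.2, 45.3, 42.3, 9.4, 21.7, 28.9]
actual = [10.2366, 47.8755, 41.6611, 8.9128, 23.2551, 24.1967]

rel_err = [abs(a - d) / d for d, a in zip(desired, actual)]
avg_rel_err = sum(rel_err) / len(rel_err)
# rounded to 4 places this gives 0.0860 ... 0.1627, with an average of 0.0740
```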

2.3.2. Improvement. In response to these problems, researchers have proposed many effective improvements, for example, the following.

(1) Momentum Additional Method. The additional momentum method is based on the backpropagation method and at the same time takes into account the effect of the gradient of the error surface. The change of each connection weight is augmented by a value proportional to the previous change of that connection weight. Without additional momentum, the network may fall into a shallow local minimum; with the role of additional momentum, it can slip over such a minimum.
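A minimal sketch of the weight update with additional momentum; the momentum coefficient 0.9 is a typical assumed value, not one taken from the paper.

```python
def momentum_step(grad, prev_delta, lr=0.08, momentum=0.9):
    # delta(t) = -lr * dE/dw + momentum * delta(t-1);
    # the second term lets the update carry speed across flat regions
    # and shallow local minima of the error surface
    return -lr * grad + momentum * prev_delta

delta = momentum_step(grad=1.0, prev_delta=0.0)    # plain gradient step
delta = momentum_step(grad=1.0, prev_delta=delta)  # momentum now adds in
```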

(2) Adaptive Learning Rate. The learning rate is usually obtained by experience or experiment, but even a learning rate whose effect is good in the initial training cannot be guaranteed to remain appropriate later. So the learning rate needs to be corrected adaptively. The criterion is as follows: check whether the error function really decreases after the weights are corrected. If it does, the selected learning rate is small and can be increased by a quantity; if not, the value of the learning rate should be reduced.
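This adaptive rule can be sketched as follows; the increase and decrease factors (1.05 and 0.7) are common assumed choices, not values from the paper.

```python
def adapt_lr(lr, prev_error, new_error, inc=1.05, dec=0.7):
    # error decreased -> the step was safe, so the rate may grow;
    # error increased -> the step overshot, so the rate is cut back
    if new_error < prev_error:
        return lr * inc
    return lr * dec

lr = adapt_lr(0.08, prev_error=1.0, new_error=0.8)  # error fell: rate grows
lr = adapt_lr(lr, prev_error=0.8, new_error=0.9)    # error rose: rate shrinks
```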

(3) Optimize Initial Weights and Thresholds of the Network. The connection weights between the neurons of each layer of the neural network contain all the knowledge of the neural


Figure 1: The error curve of membrane flux in the BP network training process (training performance 0.000199978 against the goal 0.0002; training stopped after 4254 epochs).

network system. As the BP network is based on the gradient descent technique, it may produce entirely different results for different initial values, which are traditionally given randomly. Once the value is improper, it will cause oscillation or nonconvergence of the network, and it is also easy to fall into local extrema in complex situations. The genetic algorithm has independence from the initial value, global search and optimization ability, a faster convergence rate, and so forth. So we can use the genetic algorithm to search for proper initial weights of the BP neural network to optimize the network; they are then assigned to the BP network for training.

3. Optimize Parameters of the BP Model of the MBR Membrane Fouling by Genetic Algorithm

As shown in the last section, the BP neural network can achieve the prediction of the MBR membrane flux, but the accuracy and speed need to be further improved. The BP algorithm only adjusts the connection weights of the neural network from the local angle and does not examine the whole learning process from the global perspective, so it is easy to converge to a local minimum. The first two methods for improving the BP algorithm proposed in the last section essentially adjust the network weights in a local area and cannot overcome the local searching feature of the BP algorithm.

To avoid this, we need to consider it from the global angle. One of the ways to improve is the third method proposed in the last section: introducing a global intelligent optimization searching method, the genetic algorithm, into the learning process of the neural network. So this section considered adopting the genetic algorithm to search for the initial weights and thresholds suitable for the BP neural network, so as to realize the optimization of the network and achieve the purpose of improving the MBR membrane fouling prediction.

Figure 2: Measured values of membrane flux and predicted values of the BP network for the 6 input forecast samples (desired values versus BP output).

3.1. Genetic Algorithm. The genetic algorithm (GA) is a kind of random searching algorithm referencing natural selection and the genetic mechanism of biological evolution. The algorithm is a new class of general optimization algorithm for solving general nonlinear optimization problems. It does not require the model to be linear, continuous, or differentiable and is less affected by the number of variables and constraints. And it can adaptively learn and accumulate information about the search space so as to find the optimal solution. GA has been applied to function optimization, combinatorial optimization, automatic control, robotics, parameter optimization of neural network structure design, image processing systems, and other fields.

Compared with other optimization algorithms GA opti-mization is a kind of robust and global searching algorithmand it mainly has the following features

(1) adopting the encoding of the decision variables asoperands

(2) directly taking values of the objective function assearching information

(3) using multiple searching points to search for information at the same time

(4) using probabilistic search technology

3.2. Optimization of BP Network Parameters by GA

3.2.1. Description of the Algorithm. The BP algorithm is a kind of local searching method based on gradient descent. After the initial weight vector is determined, its searching direction is also determined, so the algorithm can only start from the initial value and gradually proceed in that direction until it meets the stopping criteria of the algorithm. Its advantage is the better ability of local optimization, but the drawback


is that it easily falls into local minima and has a slow convergence rate. This is because the initial weight vector of the BP network is selected randomly, which leaves the direction of the optimization process without certainty, so the effect of the solution is sometimes good and sometimes bad. As a kind of probabilistic optimization algorithm based on the law of biological evolution, the genetic algorithm has a strong adaptive searching and global optimization ability and can avoid local minimum points with great probability. Meanwhile, it does not need gradient information for solving problems, and its operation is more convenient. This shows that the genetic algorithm is suitable for enhancing the learning ability of the BP algorithm. Therefore, we introduced GA ideas into the theory of the BP network; the two can well complement each other and achieve the purpose of global optimization fast and efficiently. In fact, the combination of the genetic algorithm and the neural network has become one of the focuses of the research and application of artificial intelligence algorithms [3]. Making full use of the advantages of both, the new algorithm has the robustness and self-learning ability of the neural network and also the global searching ability of the genetic algorithm.

Optimizing the weights and thresholds of the BP neural network by GA is divided into two parts. The first part used the genetic algorithm embedding the neural network and searched for the best individual in the probable scope of the weights of the BP network. The second part continued to use the BP algorithm: on the basis of the GA optimization, we used the optimal individuals searched by GA as the initial weights and thresholds of the BP network and then directly used the BP algorithm to train the network. So this section mainly described the method achieved under the premise of a fixed structure of the BP network; that is, we first used GA to optimize the initial connection weights and thresholds of the BP network and then used the BP algorithm to search accurately for the optimal solution of the weights and thresholds.

3.2.2. Steps of Implementation. GA optimizes the weights and thresholds of the neural network of the BP algorithm; the steps of implementation are as follows.

(1) Encoding and Initialization. Using the genetic algorithm to optimize the neural network, the main focus is to optimize the connection weights and thresholds between the neurons in the neural network, so the initial populations are initialized from the connection weights and thresholds of the neural network. Because the connection weights and thresholds of the network are real, each connection weight and threshold is directly expressed by a real number. This encoding is more intuitive compared with binary encoding, and its accuracy is also higher. We set the size of the group as N, and each individual of the population contained S genes (connection weights and thresholds), S = R × S1 + S1 × S2 + S1 + S2, where R, S1, and S2 represent the number of neurons in the input layer, the number of neurons in the hidden layer, and the number of neurons in the output layer, respectively. The scope of each real number was set as (−1, 1), and we obtained the initial population P(t).
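The encoding scheme can be sketched as follows; `encoding_length` mirrors S = R × S1 + S1 × S2 + S1 + S2, and every gene is drawn from (−1, 1).

```python
import random

def encoding_length(R, S1, S2):
    # weights input->hidden, weights hidden->output, plus both bias vectors
    return R * S1 + S1 * S2 + S1 + S2

def init_population(N, S, rng=None):
    rng = rng or random.Random(0)
    # one individual = one real-coded string of all weights and thresholds
    return [[rng.uniform(-1.0, 1.0) for _ in range(S)] for _ in range(N)]

S = encoding_length(3, 7, 1)        # the paper's 3-7-1 network: 36 genes
population = init_population(50, S)
```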

(2) Calculation of Fitness. Calculate the fitness value of all individuals of the population. We assigned the S weights and thresholds of the initial group to the BP neural network for the forward propagation of the input signal and calculated the sum of squared errors between the output value and the desired value of the network. So the fitness function can be measured by the reciprocal of the sum of squared errors [4] as follows:

fitness(sol) = 1 / Σ_{i=1}^{n_0} (o_i − y_i)²,  (4)

where sol means any individual of the population, o_i means the actual output of the ith output neuron, y_i means the desired output of the ith output neuron, and n_0 means the number of the output neurons. The smaller the difference between the actual output and the target output, the bigger the value of the fitness function and the closer to the desired requirement.

(3) Selection and Sorting. After calculating the fitness of each individual in the population, we sorted all the individuals in the population according to their respective fitness. According to this order, the probability of each individual being selected was determined, that is, the proportional selection method:

P(sol_i) = f_i / Σ_{j=1}^{M} f_j,  (5)

where f_i is the fitness of the ith individual and M is the size of the population. It ensured that the bigger the fitness value of an individual is, the more likely it is to be selected, and the lower the fitness value, the less likely; but an individual with a small fitness value may also be selected. Therefore, while choosing, we joined the best-choice (elitist) strategy and retained the optimal individual of each generation into the offspring.
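A sketch of the fitness-proportional selection of (5) together with the elitist strategy of always retaining the best individual; the roulette-wheel implementation is an illustrative choice.

```python
import random

def select_next_generation(population, fitnesses, rng=None):
    rng = rng or random.Random(0)
    total = sum(fitnesses)
    best = max(range(len(population)), key=lambda i: fitnesses[i])
    chosen = [population[best]]               # elitism: keep the best
    while len(chosen) < len(population):
        pick, acc = rng.uniform(0.0, total), 0.0
        for ind, f in zip(population, fitnesses):
            acc += f
            if acc >= pick:                   # roulette wheel, eq. (5)
                chosen.append(ind)
                break
    return chosen

next_gen = select_next_generation(["a", "b", "c"], [0.1, 0.8, 0.1])
```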

(4) Genetic Operators

(a) Crossover Operator. Crossover is the most important means of getting new best individuals. This paper used the arithmetic crossover, which produces two new individuals by a linear combination of two individuals. Let two individuals X_A^i and X_B^i cross each other; then the new individuals produced are

X_A^{i+1} = α X_B^i + (1 − α) X_A^i,
X_B^{i+1} = α X_A^i + (1 − α) X_B^i,  (6)

where α ≤ P_c, α (0 < α < 1) is a random number generated for each pair of parent individuals, and P_c is the crossover rate of each pair of parent individuals.
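Arithmetic crossover (6) in code; since each gene pair is a convex combination, the children stay inside the parents' range.

```python
def arithmetic_crossover(xa, xb, alpha):
    # eq. (6): child_A = alpha*X_B + (1-alpha)*X_A, and symmetrically
    child_a = [alpha * b + (1.0 - alpha) * a for a, b in zip(xa, xb)]
    child_b = [alpha * a + (1.0 - alpha) * b for a, b in zip(xa, xb)]
    return child_a, child_b

ca, cb = arithmetic_crossover([0.0, 1.0], [1.0, 0.0], alpha=0.25)
# gene by gene, ca[i] + cb[i] equals xa[i] + xb[i]
```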

(b) Mutation Operator. We used uniform mutation. Set the mutation probability as P_m and the mutation point as x_k, whose range is [U_k^min, U_k^max]. Then the new genetic value of the mutation point is x'_k = U_k^min + r (U_k^max − U_k^min), where r is a random number in the range [0, 1].

(5) Generate New Populations. We inserted the new individuals into the population P(t), which generated a new population P(t + 1). Then we gave the connection weights of the individuals of the


new population to the neural network and computed the fitness function of the new individuals. If it reached the predetermined value, then we entered the next step; otherwise, we continued the genetic operations.

(6) Assign Values to the Initial Weights of BP Network. We decoded the optimal individual searched by GA and assigned it to the corresponding initial connection weights of the BP network. Then we continued to use the BP algorithm to train the network until the error sum of squares achieved the specified accuracy E_bp (E_bp < E_ga) or the maximum number of iterations set was reached, and the algorithm ends.

The first part of the genetic BP algorithm was that we embedded the neural network in the genetic algorithm and searched out the optimal individual within the approximate range of the neural network weights; if the error sum of squares achieved E_ga, we entered the next step, without requiring the highest accuracy to be reached. The second part continued to adopt the BP algorithm to train the network: on the basis of the optimization by the genetic algorithm, we set the optimal individual searched by GA as the initial weights of the neural network and then directly used the BP algorithm until the error sum of squares achieved the minimum error of the problem. Theoretically, the genetic algorithm has a global searching ability and can quickly determine the area of the global optimal solution, but its local searching ability is weak and it cannot find the optimal solution quickly. So, after the genetic algorithm searched out the approximate scope of the global optimal solution, we gave full play to the local searching of the BP algorithm. With the two parts combined organically, it can better accelerate the operation efficiency and increase the learning ability of the neural network.

3.3. Model of GA-BP Neural Network of the MBR Membrane Fouling Prediction

3.3.1. The Design of the GA-BP Model of the MBR Membrane Fouling

(1) Set BP Network Parameters. The BP network structure had been determined in Section 2.2. In order to compare with the BP neural network, we used the same parameter settings as in the second section: the initial learning rate was 0.08; the transfer function from the input layer to the hidden layer was logsig, and from the hidden layer to the output layer it was purelin; the training function was trainlm; the error index was set as 2e−4.

(2) Optimize Weights and Thresholds of BP Network by the GA. Because the weights can be expressed by real numbers, we used the real coding method. In addition, because the variables themselves are real numbers, we can directly calculate the target value and the fitness value without decoding, which can accelerate the searching speed [4] and is easy to combine with the BP algorithm. Each connection weight is directly expressed by a real number, and the string obtained by encoding all the weights and thresholds was called an individual, sol = [s_1, s_2, ..., s_L]. Each weight or threshold corresponded to one bit of the individual, which was called a gene. We put all the weights and thresholds together in the order W1, B1, W2, B2, which formed the chromosome of the real-coded GA, and its length was L = n_i × n + n + n × n_0 + n_0. In this paper there are 3 input units, 7 hidden units, and one output unit; adding the thresholds of the hidden layer and output layer units, each individual consists of 3 × 7 + 7 + 7 × 1 + 1 = 36 genes, and the range of each real number is (−1, 1).

The genetic algorithm basically does not use external information in the process of the evolutionary search and is only based on the fitness function. So the choice of the fitness function is crucial, as it directly affects the convergence rate of the genetic algorithm and its ability to find the optimal solution. Since the objective function is the error sum of squares of the neural network and we seek its minimum value, the fitness function adopts the inverse of the error function.

The selection operation used the fitness proportion, which is the most common way; the crossover used the method of arithmetic crossover; and the mutation used the operation of nonuniform mutation. It was implemented with the following MATLAB statements:

    [x, endPop, bPop, trace] = ga(aa, 'gabpEval', [], initPop, [1e-6 1 1], ...
        'maxGenTerm', gen, 'normGeomSelect', [0.09], ...
        ['arithXover'], [2], 'nonUnifMutation', [2 gen 3]);

If the fitness value of the best individual met the accuracy requirement, or the preset number of iterations (T = 100) was reached, or the change in the average fitness over a certain number of generations fell below a given constant, the training finished. At that point we output the individual with the largest fitness as the optimal solution; it provided the initial weights and thresholds to be optimized. Otherwise we continued the cycle: after sorting the current parents and offspring together, we chose the N individuals with the highest fitness as the next generation, computed their fitness, and evolved them again until the termination conditions above were satisfied.

(3) Train the BP Network. We decoded the optimal solution obtained from the genetic algorithm and assigned it to the untrained BP network as its initial weights. Then, following the BP algorithm, we fed in the training samples for learning and calculated the error between the output value and the expected value. If it exceeded the precision requirement, we switched to the backpropagation pass and propagated the error signal backward, adjusting the weights and thresholds layer by layer according to the size of each layer's error. When the error E fell below the given value 2e−4, or the predetermined number of training iterations was reached, the BP algorithm ended. We saved the trained weights and hidden-layer thresholds as the new initial parameters of the network and then fed in the test samples for simulation.

In summary, we used MATLAB 7.10.0 (R2010a) with the genetic algorithm toolbox GAOT installed to write an M-file that optimizes the weights and thresholds; the core code is as follows.

Abstract and Applied Analysis 7

Table 3: Relative error of the prediction of the GA-BP network.

Desired values (L/(m²·h)):  39.2000  11.3000  43.4000  41.2000  46.5000  11.2000
Actual values (L/(m²·h)):   36.7028  10.7743  44.6790  41.5090  45.3245  10.9101
Relative error:              0.0637   0.0465   0.0295   0.0075   0.0253   0.0259
Average relative error: 0.0331
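The tabulated errors can be reproduced directly from the desired and actual values (a quick Python check; the result matches the 6.37% maximum and 3.31% average quoted in the text):

```python
desired = [39.2, 11.3, 43.4, 41.2, 46.5, 11.2]
actual = [36.7028, 10.7743, 44.6790, 41.5090, 45.3245, 10.9101]

# relative error of each test point: |desired - actual| / desired
rel = [abs(d - a) / d for d, a in zip(desired, actual)]
avg = sum(rel) / len(rel)   # rounds to 0.0331; the maximum rounds to 0.0637
```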

Train the weights and thresholds of BP by the GA:

    [P, T, R, S1, S2, S] = nninit;
    % P and T are a pair of training samples (the input and output of the
    % network); R, S1, and S2 are the numbers of input, hidden, and output
    % neurons; S is the encoding length of the genetic algorithm.
    aa = ones(S,1)*[-1 1];      % the scope of the group (range of each gene)
    popu = 50;                  % population size
    initPop = initializega(popu, aa, 'gabpEval');
    % initial population; gabpEval is the objective (fitness) function
    gen = 100;                  % the number of generations
    [x, endPop, bPop, trace] = ga(aa, 'gabpEval', [], initPop, ...
        [1e-6 1 1], 'maxGenTerm', gen, 'normGeomSelect', [0.09], ...
        'arithXover', [2], 'nonUnifMutation', [2 gen 3]);
    % Decode the weights and thresholds from the solution x and assign them
    % to the BP network before training; gadecod is the decoding function.
    % (The network must be created before its weights can be assigned, so
    % newff comes first here.)
    net = newff(inputXX, outputYY, 7, {'tansig', 'purelin'}, 'trainlm');
    [W1, B1, W2, B2, P, T, A1, A2, SE, val] = gadecod(x);
    net.IW{1,1} = W1;
    net.LW{2,1} = W2;
    net.b{1} = B1;
    net.b{2} = B2;
    % Use the new weights and thresholds for training, then simulate
    [net, tr] = train(net, inputXX, outputYY);
    GABP = sim(net, inputn_test);

3.4. Experimental and Simulation Results. To illustrate the performance of the genetic algorithm in designing and optimizing the weights and thresholds of the neural network, this paper used the same sample data: we randomly selected 74 groups as the training samples, and the rest were used as test data to observe the training and testing performance of the GA-BP network on MBR membrane flux.

In the experiment, the error sum of squares and the fitness curve of the genetic algorithm are shown in Figure 3. The figure shows how, as the genetic algorithm converges, the error sum of squares and the fitness change with the number of generations: on the whole, the error decreases and the fitness increases.

In the experiment, the BP network optimized by the genetic algorithm needs only 106 iterations to reach the same predetermined error, which significantly accelerates the training of the network.

Testing on the remaining 6 groups of data, the experimental result is shown in Figure 4. The GA-BP network also produced reasonable predictions of the membrane flux. As Table 3 shows, the largest relative error dropped to 6.37% and the average relative error to 3.31%. It is clearly more accurate than the BP network without optimization, and it gave a more reasonable forecast of the MBR membrane fouling process. Evidently, after optimization the self-learning ability of the GA-BP neural network was stronger than that of the unoptimized BP neural network, with higher prediction accuracy, better generalization and nonlinear mapping, and better overall network performance.

4. Conclusion and Outlook

For the MBR membrane fouling problem, the traditional mathematical models of MBR membrane fouling have defects to varying degrees and cannot accurately explain the phenomenon of membrane fouling. Consulting the relevant literature, this paper introduced intelligent algorithms, namely the neural network and the genetic algorithm, to model MBR membrane fouling and to carry out the corresponding optimization. First, we established a BP-neural-network-based intelligent simulation model of the MBR, relating the membrane fouling factors to the membrane flux that characterizes the degree of fouling. The experiment found that the predicted values were basically consistent with the desired values, with larger errors at only a few points, indicating that the BP neural network prediction model for membrane fouling was successful. Then, drawing on the characteristics of the genetic algorithm, we optimized the weights and thresholds of the BP neural network with the GA and established a prediction model of MBR membrane fouling based on the GA-BP neural network. The experimental study showed that the optimized GA-BP network was better than the nonoptimized BP network model in both convergence rate and prediction precision.


Figure 3: (a) The error sum of squares and (b) the fitness curve (MAX and AVG) of the genetic algorithm over 100 generations.

Figure 4: Predicted and measured values of the GA-BP membrane flux for the 6 forecast samples.

This shows that, compared with the relatively simple BP network, the GA-BP network is more suitable for predicting MBR membrane fouling.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] H.-M. Ke and C.-Z. Chen, "Study on BP neural network classification with optimization of genetic algorithm for remote sensing imagery," Journal of Southwest University, vol. 32, no. 7, article 128, 2010.

[2] Y.-S. Xu and J.-H. Gu, "Handwritten character recognition based on improved BP neural network," Communications Technology, vol. 5, no. 44, article 107, 2011.

[3] A. A. Javadi, R. Farmani, and T. P. Tan, "A hybrid intelligent genetic algorithm," Advanced Engineering Informatics, vol. 19, no. 4, pp. 255–262, 2005.

[4] X.-F. Meng, H.-R. Lu, and L. Guo, "Approximately duplicate record detection method based on neural network and genetic algorithm," Computer Engineering and Design, vol. 31, no. 7, pp. 1550–1553, 2010.




Table 1: Data of membrane flux, pressure, and so forth collected.

Time (h)  Temperature (°C)  Pressure (MPa)  MLSS of inflow (mg/L)  Total resistance (×10¹² m⁻¹)  Flux (L/(m²·h))
 1        24                0.0160          3433.8                  2.05                          46.4
 2        24                0.0168          3781.5                  2.403                         45.5
 3        24                0.0175          3924.2                  2.989                         45.3
 4        24                0.0243          4724.3                  3.796                         42.2
 5        24                0.0268          4837.3                  3.957                         45.1
 6        24                0.0291          5831.6                  4.41                          42.2
 7        24                0.0325          5564.3                  5.119                         39.7
 8        24                0.0362          5035.6                  6.075                         37.3
 9        24                0.0351          5914.1                  7.143                         31.4
10        24                0.0385          5618.5                  8.421                         28.9
11        24                0.0269          6124.2                  7.623                         21.7
12        24                0.0226          6554.7                  9.737                         14.5
13        24                0.0198          7124.3                 11.659                         11.2
14        24                0.0193          6151.2                 12.911                         10.5
15        24                0.0187          7158.9                 12.54                           9.4
16        24                0.0157          3312.8                  1.793                         51.8
17        24                0.0164          3801.5                  2.4175                        46.5
18        24                0.0197          3914.1                  3.197                         42.3
19        24                0.0239          4424.4                  3.673                         41.2
20        24                0.0264          4937.2                  3.961                         43.4
21        24                0.0283          5872.3                  4.438                         39.2
22        24                0.0319          5684.3                  5.233                         37.8
23        24                0.0355          5835.6                  6.019                         36.7
24        24                0.0369          5854.1                  7.013                         32.5
25        24                0.0384          5716.7                  8.025                         28.9
26        24                0.0268          6013.2                  9.043                         21.6
27        24                0.0234          6541.5                  9.941                         15.9
28        24                0.0196          6982.3                 10.892                         11.3
29        24                0.0191          6957.2                 12.136                         10.1
30        24                0.0185          7094.6                 12.759                          8.4

Table 2: Relative errors of membrane flux prediction of the BP network.

Desired values (L/(m²·h)):  11.2000  45.3000  42.3000   9.4000  21.7000  28.9000
Actual values (L/(m²·h)):   10.2366  47.8755  41.6611   8.9128  23.2551  24.1967
Relative error:              0.0860   0.0569   0.0151   0.0518   0.0717   0.1627
Average relative error: 0.0740

2.3.2. Improvement. In response to these problems, researchers have proposed many effective improvements, for example, the following.

(1) Momentum Additional Method. The additional momentum method builds on standard backpropagation, taking into account not only the error gradient but also the trend of the error surface. Each new weight change is the gradient term plus a value proportional to the previous change of that connection weight. Without the additional momentum the network may fall into a shallow local minimum; with it, the network can slip past such minima.
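As an illustration, a minimal Python sketch of one momentum update (our own code, not the paper's; the learning rate and momentum coefficient are arbitrary):

```python
def momentum_step(w, grad, prev_delta, lr=0.1, mc=0.9):
    # delta(t) = -lr * gradient + mc * delta(t-1): the new change is the
    # gradient term plus a value proportional to the last weight change
    delta = [-lr * g + mc * d for g, d in zip(grad, prev_delta)]
    w_new = [wi + di for wi, di in zip(w, delta)]
    return w_new, delta

# with a zero previous change this reduces to plain gradient descent
w, delta = momentum_step([0.5, -0.2], [0.1, -0.3], [0.0, 0.0])
```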

(2) Adaptive Learning Rate. The learning rate is usually obtained by experience or experiment, but even a learning rate that works well in early training cannot be guaranteed to remain appropriate later. So the learning rate needs adaptive correction. The criterion is as follows: check whether the weight correction really reduced the error function. If it did, the selected learning rate was small and can be increased by some amount; if not, the learning rate should be reduced.
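This criterion can be written as a one-line rule (a hedged sketch; the increase and decrease factors 1.05 and 0.7 are our illustrative choices, not values from the paper):

```python
def adapt_learning_rate(lr, err_new, err_old, up=1.05, down=0.7):
    # if the last correction really reduced the error, the rate was
    # small and can be increased; otherwise reduce it
    return lr * up if err_new < err_old else lr * down

lr_up = adapt_learning_rate(0.1, 0.5, 1.0)    # error fell: increase
lr_down = adapt_learning_rate(0.1, 1.5, 1.0)  # error rose: decrease
```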

(3) Optimize Initial Weights and the Threshold of the Network. The connection weights between the neurons of each layer contain all the knowledge of the neural network system.


Figure 1: The error curve of membrane flux in the BP network training process (training stopped after 4254 epochs; performance 0.000199978 against a goal of 0.0002).

As the BP network is based on the gradient descent technique, it may produce entirely different results for different randomly given initial values. Once the initial values are improper, the network will oscillate or fail to converge, and in complex situations it easily falls into local extrema. The genetic algorithm is independent of the initial values and has global search and optimization ability, a faster convergence rate, and so forth. So we can use the genetic algorithm to search for proper initial weights of the BP neural network to optimize the network, and then assign them to the BP network for training.

3. Optimizing Parameters of the BP Model of MBR Membrane Fouling by the Genetic Algorithm

As shown in the last section, the BP neural network can predict the MBR membrane flux, but its accuracy and speed need further improvement. The BP algorithm only adjusts the connection weights of the neural network from a local angle and does not examine the whole learning process from a global perspective, so it easily converges to a local minimum. The first two improvements to the BP algorithm proposed in the last section essentially adjust the network weights in a local area and cannot overcome the local searching character of the BP algorithm.

To avoid this, we need to consider the problem from a global angle. One way is the third method proposed in the last section: introducing a global intelligent optimization search, the genetic algorithm, into the learning process of the neural network. So this section adopts the genetic algorithm to search for initial weights and thresholds suitable for the BP neural network, so as to optimize the network and improve MBR membrane fouling prediction.

Figure 2: Measured values of membrane flux and predicted values of the BP network for the 6 forecast samples.

3.1. Genetic Algorithm. The genetic algorithm (GA) is a random search algorithm that draws on natural selection and the genetic mechanisms of biological evolution. It is a general optimization algorithm for solving nonlinear search problems: it does not require the model to be linear, continuous, or differentiable, and it is less affected by the number of variables and constraints. It can adaptively learn and accumulate information about the search space so as to find the optimal solution. The GA has been applied in function optimization, combinatorial optimization, automatic control, robotics, parameter optimization of neural network structure design, image processing systems, and other fields.

Compared with other optimization algorithms, GA optimization is a robust, global search algorithm, and it mainly has the following features:

(1) adopting the encoding of the decision variables as operands;

(2) directly taking values of the objective function as search information;

(3) using multiple search points to search for information at the same time;

(4) using probabilistic search technology.

3.2. Optimization of BP Network Parameters by GA

3.2.1. Description of the Algorithm. The BP algorithm is a local search method based on gradient descent. Once the initial weight vector is determined, the search direction is also determined, so the algorithm can only start from the initial value and step in that direction until it meets the stopping criteria. Its advantage is good local optimization ability, but its drawback


is that it easily falls into local minima and converges slowly. This is because the initial weight vector of the BP network is selected randomly, which leaves the direction of the optimization process uncertain, so the quality of the solution varies from run to run. As a probabilistic optimization algorithm based on the laws of biological evolution, the genetic algorithm has strong adaptive search and global optimization ability and can avoid local minima with high probability. Meanwhile, it needs no gradient information about the problem, and its operation is convenient. This shows that the genetic algorithm is well suited to enhancing the learning ability of the BP algorithm. Therefore we introduced GA ideas into the BP network: the two complement each other well and achieve fast, efficient global optimization. In fact, the combination of the genetic algorithm and the neural network has become one of the focuses of research on artificial intelligence algorithms [3]. Making full use of the advantages of both, the new algorithm has the robustness and self-learning ability of the neural network as well as the global search ability of the genetic algorithm.

Optimizing the weights and thresholds of the BP neural network by GA is divided into two parts. The first part embeds the neural network in the genetic algorithm and searches for the best individual within the probable range of the BP network weights. The second part continues with the BP algorithm: on the basis of GA optimization, we use the optimal individuals found by the GA as the initial weights and thresholds of the BP network and then train the network directly with the BP algorithm. This section mainly describes the method under the premise of a fixed BP network structure; that is, we first used the GA to optimize the initial connection weights and thresholds of the BP network and then used the BP algorithm to search precisely for the optimal weights and thresholds.

3.2.2. Steps of Implementation. The GA optimizes the weights and thresholds of the BP neural network with the following steps of implementation.

(1) Encoding and Initialization. When the genetic algorithm is used to optimize the neural network, the main focus is optimizing the connection weights and thresholds between the neurons. The initial population consists of initialized sets of the network's connection weights and thresholds. Because the connection weights and thresholds are real-valued, each is expressed directly by a real number; this encoding is more intuitive than binary encoding and also more accurate. We set the size of the population as N, and each individual of the population contained S genes (connection weights and thresholds), S = R × S1 + S1 × S2 + S1 + S2, where R, S1, and S2 represent the numbers of neurons in the input layer, the hidden layer, and the output layer, respectively. The range of each real number was set as (−1, 1), and we obtained the initial population P(t).

(2) Calculation of Fitness. Calculate the fitness value of all individuals of the population. We assigned the S weights and thresholds of each individual of the initial population to the BP neural network, performed forward propagation of the input signal, and calculated the sum of squared errors between the network's output values and the desired values. The fitness function can thus be measured by the reciprocal of the sum of squared errors [4]:

    fitness(sol) = 1 / Σ_{i=1}^{n0} (o_i − y_i)²,    (4)

where sol means any individual of the population, o_i means the actual output of the i-th output neuron, y_i means the desired output of the i-th output neuron, and n0 means the number of output neurons. The smaller the difference between the actual output and the target output, the bigger the value of the fitness function, and the closer the individual is to the desired requirement.
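Equation (4) translates directly into code (an illustrative Python version; note the reciprocal is undefined for an exactly zero error, which in practice means the accuracy goal is already met):

```python
def fitness(outputs, targets):
    # reciprocal of the sum of squared errors, as in (4)
    sse = sum((o - y) ** 2 for o, y in zip(outputs, targets))
    return 1.0 / sse

f_close = fitness([1.0, 2.0], [1.1, 2.1])  # small error, large fitness
f_far = fitness([1.0, 2.0], [2.0, 3.0])    # large error, small fitness
```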

(3) Selection and Sorting. After calculating the fitness of each individual in the population, we sorted all individuals according to their respective fitness. According to this order, the probability of each individual being selected is proportional to its fitness (proportional selection):

    p_i = f_i / Σ_{j=1}^{M} f_j,    (5)

where f_i is the fitness of the i-th individual and M is the population size. This ensures that individuals with bigger fitness values are more likely to be selected and those with lower fitness less likely, although an individual with a small fitness value may still be selected. Therefore, while choosing, we added an elitist strategy and retained the optimal individual of each generation into the offspring.
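The proportional selection of (5) together with the elitist strategy can be sketched as follows (our own Python illustration; the paper itself implements selection through the GAOT toolbox):

```python
import random

def roulette_select(population, fitnesses, n, rng=random.random):
    total = sum(fitnesses)
    probs = [f / total for f in fitnesses]   # p_i = f_i / sum of f_j
    best = max(range(len(population)), key=lambda i: fitnesses[i])
    chosen = [population[best]]              # elitism: keep the best
    while len(chosen) < n:
        r, acc = rng(), 0.0
        for ind, p in zip(population, probs):
            acc += p
            if r <= acc:                     # spin the roulette wheel
                chosen.append(ind)
                break
    return chosen

# with a fixed "random" draw of 0.1 the result is deterministic
picked = roulette_select(['a', 'b'], [1.0, 3.0], 4, rng=lambda: 0.1)
```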

(4) Genetic Operators

(a) Crossover Operator. Crossover is the most important means of obtaining new, better individuals. This paper used arithmetic crossover, which produces two new individuals by a linear combination of two parents. Let two individuals X_A^(i) and X_B^(i) cross with each other; the new individuals produced are

    X_A^(i+1) = α·X_B^(i) + (1 − α)·X_A^(i),
    X_B^(i+1) = α·X_A^(i) + (1 − α)·X_B^(i),    (6)

where α (0 < α < 1) is a random number generated for each pair of parent individuals and P_c is the crossover rate of each pair (the pair is crossed when α ≤ P_c).
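The arithmetic crossover of (6) in Python (an illustrative sketch; since the two combinations are complementary, the children's gene-wise sums equal the parents'):

```python
import random

def arithmetic_crossover(xa, xb, alpha=None):
    # children are complementary linear combinations of the parents
    if alpha is None:
        alpha = random.random()      # 0 < alpha < 1
    child_a = [alpha * b + (1 - alpha) * a for a, b in zip(xa, xb)]
    child_b = [alpha * a + (1 - alpha) * b for a, b in zip(xa, xb)]
    return child_a, child_b

ca, cb = arithmetic_crossover([0.0, 1.0], [1.0, -1.0], alpha=0.25)
```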

(b) Mutation Operator. We used uniform mutation. Let the mutation probability be P_m and the mutation point be x_k, whose range is [U_k^min, U_k^max]. The new genetic value of the mutation point is then

    x_k' = U_k^min + r·(U_k^max − U_k^min),

where r is a random number in the range [0, 1].
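In code (our own sketch of uniform mutation, with the gene range (−1, 1) used in this paper):

```python
import random

def uniform_mutation(x, p_m, lo=-1.0, hi=1.0, rng=random):
    # each gene mutates with probability p_m to
    # x_k' = U_min + r * (U_max - U_min), with r uniform in [0, 1)
    return [lo + rng.random() * (hi - lo) if rng.random() < p_m else g
            for g in x]

random.seed(0)
mutated = uniform_mutation([0.0] * 5, p_m=1.0)    # every gene resampled
unchanged = uniform_mutation([0.5] * 5, p_m=0.0)  # nothing mutates
```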

(5) Generate New Populations. We inserted the new individuals into the population P(t), which generated a new population P(t+1). Then we gave the connection weights of the individuals of the


new population to the neural network and computed the fitness function of the new individuals. If it reached the predetermined value, we entered the next step; otherwise, we continued the genetic operations.

(6) Assign Values to the Initial Weights of the BP Network. We decoded the optimal individual found by the GA and assigned it to the corresponding initial connection weights of the BP network. Then we continued to train the network with the BP algorithm until the error sum of squares reached the specified accuracy E_bp (E_bp < E_ga) or the maximum number of iterations was reached, and the algorithm ends.

The first part of the genetic BP algorithm embeds the neural network in the genetic algorithm and searches out the optimal individual within the approximate range of the neural network weights; once the error sum of squares reaches E_ga, we enter the next step without requiring the highest accuracy. The second part continues with the BP algorithm: on the basis of the genetic optimization, we set the optimal individual found by the GA as the initial weights of the neural network and then apply the BP algorithm directly until the error sum of squares reaches the minimum error of the problem. Theoretically, the genetic algorithm has global search ability and can quickly locate the area of the global optimal solution, but its local search ability is weak and it cannot refine the optimal solution quickly. So after the genetic algorithm has found the approximate region of the global optimum, we give full play to the local search of the BP algorithm. Combining the two parts organically accelerates the operation and increases the learning ability of the neural network.
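This division of labor can be illustrated on a toy error surface (our own Python sketch of the idea, not the paper's model; the quadratic objective is a hypothetical stand-in for the network's error sum of squares):

```python
import random

def error(w):
    # toy "error surface" with its minimum at w = 0.7
    return (w - 0.7) ** 2

def grad(w):
    return 2.0 * (w - 0.7)

def ga_then_gd(generations=30, pop_size=20, steps=100, lr=0.1, seed=1):
    rng = random.Random(seed)
    # part 1: crude GA-style global search over the range (-1, 1)
    pop = [rng.uniform(-1.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=error)
        elite = pop[: pop_size // 2]                            # best half
        pop = elite + [e + rng.gauss(0.0, 0.1) for e in elite]  # mutate them
    w = min(pop, key=error)
    # part 2: BP-style local gradient descent from the GA result
    for _ in range(steps):
        w -= lr * grad(w)
    return w

w_final = ga_then_gd()   # lands very close to the optimum 0.7
```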

3.3. Model of GA-BP Neural Network for MBR Membrane Fouling Prediction

3.3.1. The Design of the GA-BP Model of MBR Membrane Fouling

(1) Set BP Network Parameters. The BP network structure had been determined in Section 2.2. In order to compare with the plain BP neural network, we used the same parameter settings as in the second section: the initial learning rate was 0.08, the transfer function from the input layer to the hidden layer was logsig and from the hidden layer to the output layer was purelin, the training function was trainlm, and the error goal was set as 2e−4.
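For reference, the forward pass of such a 3-7-1 network (logsig hidden layer, purelin output) can be written out explicitly; this is our own Python sketch, and the weights below are random placeholders, not the trained values from the paper:

```python
import math
import random

def logsig(x):
    # log-sigmoid transfer function: 1 / (1 + e^-x)
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, W1, b1, W2, b2):
    # hidden layer: logsig; output layer: purelin (identity)
    hidden = [logsig(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(W2, hidden)) + b2

rng = random.Random(0)
W1 = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(7)]
b1 = [rng.uniform(-1, 1) for _ in range(7)]
W2 = [rng.uniform(-1, 1) for _ in range(7)]
b2 = rng.uniform(-1, 1)

y = forward([0.5, 0.2, 0.8], W1, b1, W2, b2)  # one scalar flux estimate
```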


In the experiment the BP network optimized by geneticalgorithm only needs to go through 106 iterations to achievethe same predetermined error which significantly acceleratesthe training speed of the network

To detect the remaining 6 groups of data the experimen-tal result is shown in Figure 4 Obviously the GA-BP networkalso realized the reasonable prediction about membrane fluxAnd we can see from Table 3 that the biggest relative erroris reduced to 637 and the average relative error is reducedto 331 Obviously it had a prediction effect with moreaccuracy comparing to BP network with optimization And itrealized more reasonable forecast for the process of the MBRmembrane fouling It is visible that after optimization theself-learning ability of GA-BP neural network was strongerthan that of the BP neural network without optimizationAnd it had relatively higher accuracy of prediction bettergeneralization and nonlinear mapping and better networkperformance

4 Conclusion and Outlook

For MBR membrane fouling factor the traditional mathe-matical model of MBR membrane fouling has some defectsin different degrees which cannot accurately explain thephenomenon of membrane fouling this paper introducedsome intelligent algorithms by consulting relevant referencessuch as neural network and genetic algorithm to build themodel of MBR membrane fouling and implement relevantoptimization First we established the system model of MBRintelligent simulation based on BP neural network aboutthe relationship between the membrane fouling factors andmembrane fouling that expresses the degree of membraneflux The experiment found that it can be basically consistentbetween predicted value and desired value and only the errorof a few points was larger It indicated that the predictionmodel of BP neural network for membrane fouling was moresuccessful Referencing the characteristics of genetic algo-rithm followed by optimizing theweight and the threshold ofthe BP neural network by using genetic algorithm this paperestablished the prediction model of MBR membrane foulingthat was based on GA-BP neural network It was shownfrom experimental study that GA-BP network optimizedwas better than BP network model nonoptimized on therate of convergence and prediction precision It shows that

8 Abstract and Applied Analysis

Sum

-squ

ared

erro

rThe error sum of square

15

10

5

00 10 20 30 40 50 60 70 80 90 100

Generation

MAXAVG

(a)

10

5

0

The fitness curve end T = 100

0 10 20 30 40 50 60 70 80 90 100

Generation

Fitn

ess

MAXAVG

(b)

Figure 3 The error sum of square and the fitness curve

05

101520253035404550

Out

put o

f the

mem

bran

e

Prediction result of membrane flux of GA-BP

Input forecast samples

DesiredGA-BP

1 2 3 4 5 6

Figure 4 Predicted and measured values of the GA-BP membraneflux

comparing to the relatively simple BP network the GA-BP network was more suitable for the prediction of MBRmembrane fouling

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

References

[1] H-M Ke and C-Z Chen ldquoStudy on BP neural networkclassification with optimization of genetic algorithm for remotesensing imageryrdquo Journal of Southwest University vol 32 no 7article 128 2010

[2] Y-S Xu and J-H Gu ldquoHandwritten character recognitionbased on improved BP neural networkrdquo Communications Tech-nology vol 5 no 44 article 107 2011

[3] A A Javadi R Farmani and T P Tan ldquoA hybrid intelligentgenetic algorithmrdquo Advanced Engineering Informatics vol 19no 4 pp 255ndash262 2005

[4] X-F Meng H-R Lu and L Guo ldquoApproximately duplicaterecord detectionmethod based on neural network and geneticalgorithmrdquo Computer Engineering and Design vol 31 no 7 pp1550ndash1553 2010



4 Abstract and Applied Analysis

Figure 1: The error curve of membrane flux in the BP network training process (training stopped after 4254 epochs; performance 0.000199978 against the goal of 0.0002).

network system. Because the BP network is based on the gradient-descent technique, it may produce entirely different results for different randomly given initial values. Once the initial values are improper, the network will oscillate or fail to converge, and in complex situations it also easily falls into local extrema. The genetic algorithm, by contrast, is independent of the initial values and offers global search ability and a faster convergence rate. We can therefore use the genetic algorithm to search for proper initial weights of the BP neural network and then assign them to the BP network for training.

3. Optimizing Parameters of the BP Model of MBR Membrane Fouling by the Genetic Algorithm

As shown in the last section, the BP neural network can predict the MBR membrane flux, but its accuracy and speed need further improvement. The BP algorithm adjusts the connection weights of the neural network only from a local angle and does not examine the whole learning process from a global perspective, so it easily converges to a local minimum. The first two methods for improving the BP algorithm proposed in the last section essentially adjust the network weights in a local area and cannot overcome the local-search character of the BP algorithm.

To avoid this, we need to consider the problem from a global angle. One way to do so is the third method proposed in the last section: introducing a global intelligent optimization method, the genetic algorithm, into the learning process of the neural network. This section therefore adopted the genetic algorithm to search for initial weights and thresholds suitable for the BP neural network, so as to optimize the network and improve the MBR membrane fouling prediction.

Figure 2: Measured values of membrane flux and predicted values of the BP network (membrane flux output, 0 to 50 L/m2 h, against input forecast samples 1-6).

3.1. Genetic Algorithm. The genetic algorithm (GA) is a kind of random search algorithm that references natural selection and the genetic mechanism of biological evolution. It is a new class of general optimization algorithm for solving general nonlinear optimization problems: it does not require the model to be linear, continuous, or differentiable, and it is little affected by the number of variables and constraints. It can adaptively learn and accumulate information about the search space so as to find the optimal solution. GA has been applied to function optimization, combinatorial optimization, automatic control, robotics, parameter optimization of neural network structure design, image processing systems, and other fields.

Compared with other optimization algorithms, GA is a robust, global search algorithm, and it mainly has the following features:

(1) adopting an encoding of the decision variables as operands;

(2) directly taking the values of the objective function as search information;

(3) using multiple search points to search for information at the same time;

(4) using probabilistic search technology.

3.2. Optimization of BP Network Parameters by GA

3.2.1. Description of the Algorithm. The BP algorithm is a kind of local search method based on gradient descent. After the initial weight vector is determined, its search direction is also determined, so the algorithm can only start from the initial value and move step by step in that direction until it meets the stopping criteria. Its advantage is good local optimization ability, but its drawbacks are that it easily falls into a local minimum and converges slowly. This is because the initial weight vector of the BP network is selected randomly, which leaves the direction of the optimization process uncertain, so the quality of the solution varies from run to run. As a kind of probabilistic optimization algorithm based on the laws of biological evolution, the genetic algorithm has strong adaptive search and global optimization ability and can avoid local minimum points with high probability. Meanwhile, it does not need gradient information about the problem, which makes it more convenient to apply. This shows that the genetic algorithm is well suited to enhance the learning ability of the BP algorithm. Therefore, we introduced GA ideas into the BP network: the two complement each other well and achieve the goal of fast, efficient global optimization. In fact, the combination of the genetic algorithm and the neural network has become one of the focuses of research and application in artificial intelligence algorithms [3]. By making full use of the advantages of both, the combined algorithm has the robustness and self-learning ability of the neural network as well as the global search ability of the genetic algorithm.

Optimizing the weights and thresholds of the BP neural network by GA is divided into two parts. The first part embedded the neural network in the genetic algorithm and searched for the best individual within the probable range of the BP network weights. The second part continued with the BP algorithm: on the basis of the GA optimization, we used the optimal individual found by GA as the initial weights and thresholds of the BP network and then trained the network directly with the BP algorithm. This section describes the method under the premise of a fixed BP network structure; that is, we first used GA to optimize the initial connection weights and thresholds of the BP network and then used the BP algorithm to search accurately for the optimal weights and thresholds.

3.2.2. Steps of Implementation. GA optimizes the weights and thresholds of the BP neural network; the steps of implementation are as follows.

(1) Encoding and Initialization. Using the genetic algorithm to optimize the neural network mainly means optimizing the connection weights and thresholds between neurons. The initial population consists of initialized connection weights and thresholds of the network. Because the connection weights and thresholds of the network are real numbers, each of them is expressed directly by a real number. This encoding is more intuitive than binary encoding and also of higher accuracy. We set the size of the group as N; each individual of the population contained S genes (connection weights and thresholds), S = R × S1 + S1 × S2 + S1 + S2, where R, S1, and S2 represent the number of neurons in the input layer, the hidden layer, and the output layer, respectively. The scope of each real number was set as (−1, 1), and we obtained the initial population P(t).
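As a sketch of this encoding (a Python illustration, not the authors' MATLAB code; the layer sizes are the paper's 3-7-1 network), the chromosome length S and a random real-coded initial population can be computed as follows:

```python
import random

def chromosome_length(R, S1, S2):
    # S = R*S1 + S1*S2 + S1 + S2: input-hidden weights, hidden-output
    # weights, hidden-layer thresholds, and output-layer thresholds.
    return R * S1 + S1 * S2 + S1 + S2

def init_population(N, S, lo=-1.0, hi=1.0):
    # Each individual is a real-coded vector of S genes drawn from (-1, 1).
    return [[random.uniform(lo, hi) for _ in range(S)] for _ in range(N)]

S = chromosome_length(3, 7, 1)   # the paper's 3-7-1 network gives S = 36
pop = init_population(50, S)     # group size N = 50, as in Section 3.3
```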

(2) Calculation of Fitness. Calculate the fitness value of all individuals of the population. We assigned the S weights and thresholds of each individual of the initial group to the BP neural network for forward propagation of the input signal and calculated the sum of squared errors between the output value and the desired value of the network. The fitness function can thus be measured by the reciprocal of the sum of squared errors [4]:

fitness(sol) = 1 / Σ_{i=1}^{n0} (o_i − y_i)²,   (4)

where sol means any individual of the population, o_i means the actual output of the ith output neuron, y_i means the desired output of the ith output neuron, and n0 means the number of output neurons. The smaller the difference between the actual output and the target output, the bigger the value of the fitness function and the better the individual meets the desired requirement.
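Equation (4) translates directly into code; this Python sketch (not from the paper) adds a small epsilon to guard against division by zero when the network output is exact:

```python
def fitness(outputs, targets):
    # Reciprocal of the sum of squared errors between actual outputs o_i
    # and desired outputs y_i, as in equation (4).
    sse = sum((o - y) ** 2 for o, y in zip(outputs, targets))
    return 1.0 / (sse + 1e-12)
```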

(3) Selection. After calculating the fitness of each individual in the population, we sorted all individuals according to their respective fitness. This order determined the probability of each individual being selected, that is, the proportional selection method:

P_i = f_i / Σ_{i=1}^{M} f_i.   (5)

This ensures that the bigger the fitness value of an individual, the more likely it is to be selected, and the lower the fitness value, the less likely; but an individual with a small fitness value may still be selected. Therefore, while choosing, we added a best-choice (elitist) strategy and retained the optimal individual of each generation into the offspring.
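The proportional selection of equation (5) together with the elitist retention can be sketched as follows (an illustrative Python roulette wheel, not the paper's implementation):

```python
import random

def select(population, fitnesses, rng=random.random):
    # Fitness-proportional (roulette-wheel) selection: individual i is
    # chosen with probability f_i / sum(f).  The best individual is
    # always carried into the offspring (the elitist strategy).
    total = sum(fitnesses)
    best = max(range(len(population)), key=lambda i: fitnesses[i])
    chosen = [population[best]]               # elitism
    while len(chosen) < len(population):
        r, acc = rng() * total, 0.0
        for ind, f in zip(population, fitnesses):
            acc += f
            if acc >= r:
                chosen.append(ind)
                break
    return chosen
```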

(4) Genetic Operators

(a) Crossover Operator. Crossover is the most important means of producing new best individuals. This paper used arithmetic crossover, which produces two new individuals by a linear combination of two individuals. Let two individuals X_A^i and X_B^i cross each other; then the new individuals produced are

X_A^{i+1} = αX_B^i + (1 − α)X_A^i,
X_B^{i+1} = αX_A^i + (1 − α)X_B^i,   (6)

where α ≤ P_c, α (0 < α < 1) is a random number generated for each pair of parent individuals, and P_c is the crossover rate of each pair of parent individuals.
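Equation (6) in code form (an illustrative Python sketch, not the paper's MATLAB):

```python
import random

def arithmetic_crossover(xa, xb, alpha=None):
    # Arithmetic crossover (equation (6)): the two children are linear
    # combinations of the two parents with a random alpha in (0, 1).
    if alpha is None:
        alpha = random.random()
    child_a = [alpha * b + (1 - alpha) * a for a, b in zip(xa, xb)]
    child_b = [alpha * a + (1 - alpha) * b for a, b in zip(xa, xb)]
    return child_a, child_b
```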

(b) Mutation Operator. We used uniform mutation. Set the mutation probability as P_m and the mutation point as x_k, whose range is [U_k^min, U_k^max]. Then the new genetic value of the mutation point is x'_k = U_k^min + r(U_k^max − U_k^min), where r is a random number in the range [0, 1].

(5) Generate New Populations. We inserted the new individuals into the population P(t), which generated a new population P(t + 1). Then we gave the connection weights of the individuals of the new population to the neural network and computed the fitness function of each new individual. If it reached the predetermined value, we entered the next step; otherwise we continued the genetic operations.

(6) Assign Values to the Initial Weights of the BP Network. We decoded the optimal individual found by GA and assigned it to the corresponding initial connection weights of the BP network. Then we continued to train the network with the BP algorithm until the error sum of squares reached the specified accuracy E_bp (E_bp < E_ga) or the maximum number of iterations was reached, and the algorithm ended.

In the first part of the genetic BP algorithm, we embedded the neural network in the genetic algorithm and searched out the optimal individual within the approximate range of the neural network weights; once the error sum of squares reached E_ga, we entered the next step without requiring the highest accuracy. The second part continued to train the network with the BP algorithm: on the basis of the genetic optimization, we set the optimal individual found by GA as the initial weights of the neural network and then ran the BP algorithm until the error sum of squares reached the minimum error of the problem. Theoretically, the genetic algorithm has global search ability and can quickly locate the area of the global optimal solution, but its local search ability is weak, so it cannot find the exact optimal solution quickly. So, after the genetic algorithm had searched out the approximate scope of the global optimal solution, we gave full play to the local search of the BP algorithm. With the two parts combined organically, the method accelerates operation and increases the learning ability of the neural network.
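The two-stage idea, global search to the loose accuracy E_ga followed by gradient descent to the strict accuracy E_bp, can be shown on a deliberately simplified one-parameter toy model (a Python sketch; the mutation-plus-elitism search here merely stands in for the full GA, and the gradient step stands in for BP training):

```python
import random

def sse(w, xs, ys):
    # Error sum of squares of a toy one-weight model y = w * x.
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys))

def ga_then_bp(xs, ys, E_ga=1e-2, E_bp=1e-6, pop_size=20, gens=100, lr=0.01):
    # Stage 1: coarse global search (GA stand-in) until SSE < E_ga
    # or the generation budget runs out; the best individual is elitist.
    pop = [random.uniform(-1, 1) for _ in range(pop_size)]
    best = min(pop, key=lambda w: sse(w, xs, ys))
    for _ in range(gens):
        if sse(best, xs, ys) < E_ga:
            break
        pop = [best + random.gauss(0, 0.1) for _ in range(pop_size)]
        best = min(pop + [best], key=lambda w: sse(w, xs, ys))
    # Stage 2: fine local search by gradient descent (the BP stage)
    # down to the stricter accuracy E_bp (E_bp < E_ga).
    w = best
    while sse(w, xs, ys) > E_bp:
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys))
        w -= lr * grad
    return w
```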

3.3. Model of the GA-BP Neural Network for MBR Membrane Fouling Prediction

3.3.1. The Design of the GA-BP Model of MBR Membrane Fouling

(1) Set BP Network Parameters. The BP network structure had been determined in Section 2.2. In order to compare with the plain BP neural network, we used the same parameter settings as in the second section: the initial learning rate was 0.08, the transfer function from the input layer to the hidden layer was logsig and from the hidden layer to the output layer was purelin, the training function was trainlm, and the error goal was set as 2e−4.

(2) Optimize Weights and Thresholds of the BP Network by the GA. Because the weights can be expressed by real numbers, we used the real-coding method. In addition, because the variables themselves are real numbers, we can directly calculate the target value and fitness value without decoding, which accelerates the search [4] and is easy to combine with the BP algorithm. Each connection weight is directly expressed by a real number; the string obtained by encoding all weights and thresholds is called an individual, sol = [s_1, s_2, ..., s_L]. Each weight or threshold corresponds to one bit of the individual, called a gene. We put all weights and thresholds together in the order W1, B1, W2, B2, which formed the chromosome of the real-coded GA, and its length was L = n_i × n + n + n × n_0 + n_0, where n_i, n, and n_0 are the numbers of input, hidden, and output units. In this paper there are 3 input units, 7 hidden units, and one output unit, so, adding the thresholds of the hidden-layer and output-layer units, each individual consists of 3 × 7 + 7 + 7 × 1 + 1 = 36 genes, and the range of each real number is (−1, 1).
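A 36-gene individual laid out in the order W1, B1, W2, B2 can be unpacked into network parameters as sketched below; this hypothetical Python helper `decode` plays the role the paper's MATLAB `gadecod` does:

```python
def decode(chromosome, n_in=3, n_hid=7, n_out=1):
    # Split a flat individual into W1 (n_hid x n_in), B1 (n_hid),
    # W2 (n_out x n_hid), and B2 (n_out), in the order W1, B1, W2, B2.
    i = 0
    W1 = [chromosome[i + k * n_in : i + (k + 1) * n_in] for k in range(n_hid)]
    i += n_hid * n_in
    B1 = chromosome[i : i + n_hid]
    i += n_hid
    W2 = [chromosome[i + k * n_hid : i + (k + 1) * n_hid] for k in range(n_out)]
    i += n_out * n_hid
    B2 = chromosome[i : i + n_out]
    return W1, B1, W2, B2
```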

The genetic algorithm basically uses no external information in the evolutionary search and relies only on the fitness function, so the choice of fitness function is crucial: it directly affects the convergence rate of the genetic algorithm and its ability to find the optimal solution. Since the objective function is the error sum of squares of the neural network and we want its minimum value, the fitness function adopts the inverse of the error function.

The selection operation used fitness proportion, which is the most common way; the crossover used the method of arithmetic crossover; and the mutation used the operation of nonuniform mutation. It was implemented with the following MATLAB statement:

[x, endPop, bPop, trace] = ga(aa, 'gabpEval', [], initPop, [1e-6 1 1], 'maxGenTerm', gen, 'normGeomSelect', [0.09], ['arithXover'], [2], 'nonUnifMutation', [2 gen 3])

If the fitness value of the corresponding optimal individual met the accuracy requirement, or the number of iterations reached the preset value (T = 100), or the change of the average fitness value over a certain number of generations was less than a certain constant, then we finished the training. At this time we took the individual with the largest fitness as the output optimal solution; it provided the initial weights and thresholds to be optimized. Otherwise we continued the cycle: after reordering the current parents and offspring, we chose the N individuals with higher fitness values as the next generation of the group, computed their fitness, and trained them again until the above termination conditions were satisfied.

(3) Train the BP Network. We decoded the optimal solution obtained from the genetic algorithm and assigned it to the untrained BP network as its initial weights. Then, according to the BP algorithm, we input the training samples for training and learning of the network and calculated the error between the output value and the expected value. If it was beyond the precision requirement, we turned to the backpropagation process and returned the error signal along the original path, adjusting the weights and thresholds layer by layer according to the size of the error of each layer, until the error E was less than the given value 2e−4 or the predetermined number of training epochs was reached; then the BP algorithm ended. We saved the trained weights and thresholds of the hidden layer as the new initial weights and thresholds of the network and then input the test samples for simulation.

In summary, we used MATLAB 7.10.0 (R2010a) with the genetic algorithm toolbox GAOT installed to compile an M-file that achieved the optimization of the weights and thresholds; the core code is as follows.


Table 3: Relative error of the prediction of the GA-BP network.

Desired values (L/m2 h):  39.2000  11.3000  43.4000  41.2000  46.5000  11.2000
Actual values (L/m2 h):   36.7028  10.7743  44.6790  41.5090  45.3245  10.9101
Relative error:            0.0637   0.0465   0.0295   0.0075   0.0253   0.0259
Average relative error: 0.0331
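The relative errors in Table 3 can be checked directly from the desired and actual values (a quick Python verification):

```python
desired = [39.2000, 11.3000, 43.4000, 41.2000, 46.5000, 11.2000]
actual  = [36.7028, 10.7743, 44.6790, 41.5090, 45.3245, 10.9101]

# Relative error |desired - actual| / desired for each test sample,
# and their mean, matching the last two rows of Table 3.
rel = [abs(d - a) / d for d, a in zip(desired, actual)]
avg = sum(rel) / len(rel)

print([round(r, 4) for r in rel])   # [0.0637, 0.0465, 0.0295, 0.0075, 0.0253, 0.0259]
print(round(avg, 4))                # 0.0331
```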

% Train the weights and thresholds of BP by GA
[P, T, R, S1, S2, S] = nninit;   % P and T are a pair of training samples for the
                                 % input and output of the network; R, S1, and S2 are,
                                 % respectively, the numbers of input, hidden-layer,
                                 % and output neurons; S is the encoding length of
                                 % the genetic algorithm
aa = ones(S, 1) * [-1 1];        % the scope of the group
popu = 50;                       % population size
initPop = initializega(popu, aa, 'gabpEval');   % initial population; gabpEval is the
                                                % target function, also called the
                                                % fitness function
gen = 100;                       % the number of generations
[x, endPop, bPop, trace] = ga(aa, 'gabpEval', [], initPop, [1e-6 1 1], ...
    'maxGenTerm', gen, 'normGeomSelect', [0.09], ...
    ['arithXover'], [2], 'nonUnifMutation', [2 gen 3]);
% Decode the weights and thresholds from the code x
[W1, B1, W2, B2, P, T, A1, A2, SE, val] = gadecod(x);   % gadecod is the decoding function
% Build the network and assign the decoded values to it before training starts
net = newff(inputXX, outputYY, 7, {'tansig', 'purelin'}, 'trainlm');
net.IW{1,1} = W1; net.LW{2,1} = W2; net.b{1} = B1; net.b{2} = B2;
% Use the new weights and thresholds for training
[net, tr] = train(net, inputXX, outputYY);   % train network
GABP = sim(net, inputn_test);    % simulation test

3.4. Experimental and Simulation Results. To illustrate the performance of the genetic algorithm in designing and optimizing the weights and thresholds of the neural network, this paper used the same sample data; that is, we randomly selected 74 groups as the training sample and took the rest as test data to observe the training and testing performance of MBR membrane flux prediction by the GA-BP network.

In the experiment, the error sum of squares and the fitness curve of the genetic algorithm are shown in Figure 3. The figure shows how, as the genetic algorithm converges, the error sum of squares and the fitness change with the increase of genetic generations: on the whole, the error decreases and the fitness increases.

In the experiment, the BP network optimized by the genetic algorithm needed only 106 iterations to reach the same predetermined error, which significantly accelerates the training speed of the network.

Detecting the remaining 6 groups of data, the experimental result is shown in Figure 4. Obviously, the GA-BP network also achieved a reasonable prediction of membrane flux, and we can see from Table 3 that the biggest relative error is reduced to 6.37% and the average relative error is reduced to 3.31%. It clearly predicted with more accuracy than the BP network without optimization and realized a more reasonable forecast of the MBR membrane fouling process. It is visible that, after optimization, the self-learning ability of the GA-BP neural network was stronger than that of the unoptimized BP neural network, with relatively higher prediction accuracy, better generalization and nonlinear mapping, and better network performance.

4. Conclusion and Outlook

For the MBR membrane fouling problem, the traditional mathematical models of MBR membrane fouling have defects to different degrees and cannot accurately explain the phenomenon of membrane fouling. Consulting relevant references, this paper therefore introduced intelligent algorithms, namely the neural network and the genetic algorithm, to build a model of MBR membrane fouling and implement the relevant optimization. First, we established the system model of MBR intelligent simulation based on the BP neural network, describing the relationship between the membrane fouling factors and the membrane flux that expresses the degree of membrane fouling. The experiment found that the predicted values were basically consistent with the desired values and only the error of a few points was larger, which indicated that the BP neural network prediction model for membrane fouling was fairly successful. Then, referencing the characteristics of the genetic algorithm, we optimized the weights and thresholds of the BP neural network by the genetic algorithm and established the prediction model of MBR membrane fouling based on the GA-BP neural network. The experimental study showed that the optimized GA-BP network was better than the nonoptimized BP network model in convergence rate and prediction precision. It shows that,


Figure 3: The error sum of squares (a) and the fitness curve (b) of the genetic algorithm over 100 generations (MAX and AVG curves; the fitness curve ends at T = 100).

Figure 4: Predicted and measured values of the GA-BP membrane flux (membrane flux output, 0 to 50 L/m2 h, against input forecast samples 1-6).

compared with the relatively simple BP network, the GA-BP network was more suitable for the prediction of MBR membrane fouling.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] H.-M. Ke and C.-Z. Chen, "Study on BP neural network classification with optimization of genetic algorithm for remote sensing imagery," Journal of Southwest University, vol. 32, no. 7, article 128, 2010.

[2] Y.-S. Xu and J.-H. Gu, "Handwritten character recognition based on improved BP neural network," Communications Technology, vol. 5, no. 44, article 107, 2011.

[3] A. A. Javadi, R. Farmani, and T. P. Tan, "A hybrid intelligent genetic algorithm," Advanced Engineering Informatics, vol. 19, no. 4, pp. 255-262, 2005.

[4] X.-F. Meng, H.-R. Lu, and L. Guo, "Approximately duplicate record detection method based on neural network and genetic algorithm," Computer Engineering and Design, vol. 31, no. 7, pp. 1550-1553, 2010.


Page 5: Research Article The Application and Research of the GA-BP

Abstract and Applied Analysis 5

is easy to fall into a local minimum and has a slow convergence rate. This is because the initial weight vector of the BP network is selected randomly, which leaves the direction of the optimization process uncertain, so the quality of the solution is sometimes good and sometimes bad. As a probabilistic optimization algorithm based on the law of biological evolution, the genetic algorithm has a strong adaptive search and global optimization ability and can avoid local minimum points with high probability. Meanwhile, it needs no gradient information about the problem being solved, and its operation is more convenient. This shows that the genetic algorithm is well suited to enhancing the learning ability of the BP algorithm. Therefore, we introduced GA ideas into the BP network. The two complement each other well and achieve the purpose of global optimization quickly and efficiently. In fact, the combination of the genetic algorithm and the neural network has become one of the focuses of research and application in artificial intelligence algorithms [3]. By making full use of the advantages of both, the combined algorithm has the robustness and self-learning ability of the neural network as well as the global searching ability of the genetic algorithm.

Optimizing the weights and the threshold of the BP neural network by GA is divided into two parts. The first part embedded the neural network in the genetic algorithm and searched for the best individual within the probable scope of the BP network weights. The second part continued with the BP algorithm: on the basis of the GA optimization, we used the optimal individual found by GA as the initial weights and threshold of the BP network and then directly used the BP algorithm to train the network. So this section mainly described the method under the premise of a fixed BP network structure; that is, we first used GA to optimize the initial connection weights and threshold of the BP network and then used the BP algorithm to search accurately for the optimal solution of the weights and threshold.

3.2.2. Steps of Implementation. GA optimizes the weights and the threshold of the BP neural network; the steps of implementation are as follows.

(1) Encoding and Initialization. When using the genetic algorithm to optimize a neural network, the main focus is to optimize the connection weights and thresholds between the neurons of the network. The initial population consists of individuals whose connection weights and thresholds are initialized. Because the connection weights and thresholds of the network are real-valued, each of them is expressed directly by a real number. This encoding is more intuitive than binary encoding, and its accuracy is also higher. We set the size of the population as N, and each individual of the population contained S genes (connection weights and thresholds), S = R·S1 + S1·S2 + S1 + S2, where R, S1, and S2 represent the number of neurons in the input layer, the hidden layer, and the output layer, respectively. The scope of each real number was set as (−1, 1), and we obtained the initial population P(t).
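As an illustrative sketch of this encoding step (written in Python rather than the paper's MATLAB; the function and variable names are ours, not the authors'):

```python
import random

def init_population(N, R, S1, S2, seed=0):
    """Create N real-coded individuals for an R-S1-S2 BP network.

    Each individual stores every connection weight and threshold:
    S = R*S1 + S1*S2 + S1 + S2 genes, each drawn uniformly from (-1, 1).
    """
    rng = random.Random(seed)
    S = R * S1 + S1 * S2 + S1 + S2
    return [[rng.uniform(-1.0, 1.0) for _ in range(S)] for _ in range(N)]

pop = init_population(N=50, R=3, S1=7, S2=1)
print(len(pop), len(pop[0]))  # 50 36  (3*7 + 7*1 + 7 + 1 = 36 genes)
```

With the paper's 3-7-1 network this gives exactly the 36-gene chromosome described in Section 3.3.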

(2) Calculation of Fitness. Calculate the fitness value of all individuals of the population. We assigned the S weights and thresholds of an individual of the initial population to the BP neural network, ran the forward propagation of the input signal, and calculated the sum of squared errors between the output value and the desired value of the network. So the fitness function can be measured by the reciprocal of the sum of squared errors [4] as follows:

fitness(sol) = 1 / ∑_{i=1}^{n_0} (o_i − y_i)^2, (4)

where sol means any individual of the population, o_i means the actual output of the ith output neuron, y_i means the desired output of the ith output neuron, and n_0 means the number of output neurons. The smaller the difference between the actual output and the target output, the bigger the value of the fitness function and the closer the network is to the desired requirement.
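A minimal sketch of this fitness computation (Python, with names of our own choosing; the network's forward pass is assumed to have produced the outputs already):

```python
def fitness(outputs, desired):
    """Fitness of one individual: reciprocal of the error sum of squares
    between the network outputs o_i and the desired outputs y_i (Eq. (4)).
    If a zero error is possible, a tiny constant should be added to sse."""
    sse = sum((o - y) ** 2 for o, y in zip(outputs, desired))
    return 1.0 / sse  # smaller error -> larger fitness

print(fitness([0.5], [1.0]))  # 1 / 0.25 = 4.0
```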

(3) Selection. After calculating the fitness of each individual in the population, we sorted all individuals in the population according to their respective fitness. According to this order, the probability of each individual being selected is determined by the proportional selection method as follows:

p_i = f_i / ∑_{j=1}^{M} f_j, (5)

where f_i is the fitness of the ith individual and M is the population size. This ensured that the bigger the fitness value of an individual, the greater its chance of being selected, and the lower the fitness value, the lower the chance; an individual with a small fitness value may nevertheless still be selected. Therefore, while choosing, we added an elitist strategy and retained the optimal individual of each generation in the offspring.
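The proportional selection with the elitist strategy can be sketched as follows (Python; a hypothetical helper, not the paper's code — `random.choices` implements the roulette-wheel draw):

```python
import random

def select(population, fitnesses, seed=0):
    """Fitness-proportional (roulette-wheel) selection with elitism:
    individual i is picked with probability f_i / sum_j f_j (Eq. (5)),
    and the best individual is always copied into the next generation."""
    rng = random.Random(seed)
    total = sum(fitnesses)
    probs = [f / total for f in fitnesses]
    best = max(range(len(population)), key=lambda i: fitnesses[i])
    chosen = [population[best]]  # elitist strategy: keep the best
    chosen += rng.choices(population, weights=probs, k=len(population) - 1)
    return chosen

pop = [[0.1], [0.2], [0.3]]
new_pop = select(pop, [1.0, 2.0, 7.0])
print(new_pop[0])  # [0.3] -- the fittest individual is always retained
```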

(4) Genetic Operators

(a) Crossover Operator. Crossover is the most important means of producing new, better individuals. This paper used arithmetic crossover, which produces two new individuals by a linear combination of two parents. Let X_A^(i) and X_B^(i) be two individuals that cross each other; then the new individuals produced are

X_A^(i+1) = αX_B^(i) + (1 − α)X_A^(i),
X_B^(i+1) = αX_A^(i) + (1 − α)X_B^(i), (6)

where α ≤ P_c, α (0 < α < 1) is a random number generated for each pair of parent individuals, and P_c is the crossover rate of each pair of parent individuals.

(b) Mutation Operator. We used uniform mutation. We set the mutation probability as P_m and the mutation point as x_k, whose range is [U_k^min, U_k^max]. Then the new genetic value of the mutation point is x'_k = U_k^min + r(U_k^max − U_k^min), where r is a random number in the range [0, 1].
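Both operators can be sketched as follows (Python, with our own function names; the paper itself implements them via the GAOT toolbox):

```python
import random

rng = random.Random(0)

def arith_crossover(xa, xb, alpha):
    """Arithmetic crossover (Eq. (6)): two children formed as linear
    combinations of the two parents, with 0 < alpha < 1."""
    child_a = [alpha * b + (1 - alpha) * a for a, b in zip(xa, xb)]
    child_b = [alpha * a + (1 - alpha) * b for a, b in zip(xa, xb)]
    return child_a, child_b

def uniform_mutation(x, pm, lo=-1.0, hi=1.0):
    """Uniform mutation: with probability pm, gene x_k is replaced by
    x_k' = U_min + r * (U_max - U_min), r drawn uniformly from [0, 1)."""
    return [lo + rng.random() * (hi - lo) if rng.random() < pm else g
            for g in x]

a, b = arith_crossover([0.0, 0.0], [1.0, 1.0], alpha=0.25)
print(a, b)  # [0.25, 0.25] [0.75, 0.75]
```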

(5) Generate New Populations. We inserted the new individuals into the population P(t), which generated a new population P(t + 1). Then we gave the connection weights of the individuals of the new population to the neural network and computed the fitness function of each new individual. If it reached the predetermined value, we entered the next step; otherwise, we continued the genetic operations.

(6) Assign Values to the Initial Weights of the BP Network. We decoded the optimal individual found by GA and assigned it to the corresponding initial connection weights of the BP network. Then we continued to use the BP algorithm to train the network until the error sum of squares achieved the specified accuracy E_bp (E_bp < E_ga) or the maximum number of iterations was reached, and the algorithm ends.

The first part of the genetic BP algorithm embedded the neural network in the genetic algorithm and searched out the optimal individual within the approximate range of the neural network weights; once the error sum of squares achieved E_ga, we entered the next step without requiring the highest accuracy. The second part continued to adopt the BP algorithm to train the network: on the basis of the genetic optimization, we set the optimal individual found by GA as the initial weights of the neural network and then directly used the BP algorithm until the error sum of squares achieved the minimum error of the problem. Theoretically, the genetic algorithm has global searching ability and can quickly determine the area of the global optimal solution, but its local searching ability is weak and it cannot find the exact optimal solution quickly. So after the genetic algorithm searched out the approximate scope of the global optimal solution, we gave full play to the local searching of the BP algorithm. With the two parts combined organically, the method can accelerate operation efficiency and increase the learning ability of the neural network.
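The two-phase control flow described above can be summarized in a short sketch (Python; `ga_step`, `bp_step`, and `sse` are hypothetical caller-supplied routines, and the toy usage below stands in for the real network):

```python
def ga_bp_train(ga_step, bp_step, sse, pop, E_ga, E_bp, max_gen, max_epochs):
    """Two-phase GA-BP skeleton: the GA coarsely locates the region of the
    global optimum (phase 1, stop once the error sum of squares <= E_ga),
    then plain BP refines the best individual until the error reaches the
    stricter goal E_bp (E_bp < E_ga)."""
    best = min(pop, key=sse)
    for _ in range(max_gen):              # phase 1: global search by GA
        pop = ga_step(pop)
        best = min(pop, key=sse)
        if sse(best) <= E_ga:
            break
    weights = best
    for _ in range(max_epochs):           # phase 2: local refinement by BP
        if sse(weights) <= E_bp:
            break
        weights = bp_step(weights)        # one gradient-descent update
    return weights

# toy stand-ins: minimize (w - 0.5)^2 as if it were the network's SSE
sse = lambda w: (w - 0.5) ** 2
ga_step = lambda pop: [0.5 * w + 0.25 for w in pop]   # contracts toward 0.5
bp_step = lambda w: w + 0.1 * (0.5 - w)               # small gradient step

w = ga_bp_train(ga_step, bp_step, sse, [0.0, 2.0],
                E_ga=0.05, E_bp=1e-4, max_gen=50, max_epochs=500)
print(sse(w) <= 1e-4)  # True
```

The skeleton only fixes the control flow (coarse GA goal, then strict BP goal); everything problem-specific is passed in.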

3.3. Model of the GA-BP Neural Network for MBR Membrane Fouling Prediction

3.3.1. The Design of the GA-BP Model of MBR Membrane Fouling

(1) Set BP Network Parameters. The BP network structure had been determined in Section 2.2. In order to compare with the plain BP neural network, we used the same parameter settings as in the second section: the initial learning rate was 0.08, the transfer function from the input layer to the hidden layer was logsig and from the hidden layer to the output layer was purelin, the training function was trainlm, and the error goal was set as 2e−4.

(2) Optimize Weights and the Threshold of the BP Network by GA. Because the weights can be expressed by real numbers, we used the real-coding method. In addition, because the variables themselves are real numbers, we can directly calculate the target value and the fitness value without decoding, which accelerates the search [4] and makes the method easy to combine with the BP algorithm. Each connection weight is directly expressed by a real number; the string obtained by encoding all weights and thresholds is called an individual, sol = [s_1, s_2, ..., s_L]. Each weight or threshold corresponds to one position of the individual, which is called a gene. We put all weights and thresholds together in the order W_1, B_1, W_2, B_2, which formed the chromosome of the real-coded GA, and its length was L = n_i × n + n + n × n_0 + n_0, where n_i, n, and n_0 are the numbers of input, hidden, and output units. In this paper there are 3 input units, 7 hidden units, and one output unit; so, adding the thresholds of the hidden-layer and output-layer units, each individual consists of 3 × 7 + 7 + 7 × 1 + 1 = 36 genes, and the range of each real number is (−1, 1).

The genetic algorithm basically does not use external information in the process of the evolutionary search and relies only on the fitness function. So the choice of the fitness function is crucial; it directly affects the convergence rate of the genetic algorithm and its ability to find the optimal solution. Since the objective function is the error sum of squares of the neural network and we seek its minimum value, the fitness function adopts the reciprocal of the error function.

The selection operation used fitness proportion, which is the most common way; the crossover used the method of arithmetic crossover; and the mutation used the operation of nonuniform mutation. It was implemented with the following MATLAB statement:

[x, endPop, bPop, trace] = ga(aa, 'gabpEval', [], initPop, [1e-6 1 1], 'maxGenTerm', gen, 'normGeomSelect', [0.09], ['arithXover'], [2], 'nonUnifMutation', [2 gen 3]);

If the fitness value of the corresponding optimal individual met the accuracy requirement, or the number of iterations reached the limit we set (T = 100), or the change of the average fitness value over a certain number of generations was less than a certain constant, then we finished the training. At this time we took the individual with the largest fitness as the output of the optimal solution, and it provided the initial weights and thresholds to be optimized. Otherwise, we continued the cycle: after reordering the current parents and offspring, we chose the N individuals with higher fitness values as the next generation of the population, computed their fitness, and trained them again until the termination conditions above were satisfied.

(3) Train the BP Network. We decoded the optimal solution obtained from the genetic algorithm and assigned it to the untrained BP network as its initial weights. Then, according to the BP algorithm, we input the training samples for training and learning of the network and calculated the error between the output value and the expected value. If it exceeded the precision requirement, we turned to the backpropagation process and propagated the error signal back through the network, adjusting the weights and thresholds layer by layer according to the size of the error of each layer. When the error E was less than the given value 2e−4, or the predetermined number of training iterations was reached, we ended the BP algorithm. We saved the trained weights and thresholds of the hidden layer as the new initial weights and thresholds of the network and then input the test samples for simulation.

In summary, we used MATLAB 7.10.0 (R2010a) with the genetic algorithm toolbox GAOT installed to write an M-file that achieved the optimization of the weights and thresholds; the core code is as follows.

Abstract and Applied Analysis 7

Table 3: Relative error of the prediction of the GA-BP network.

Desired values (L/(m²·h)):  39.2000  11.3000  43.4000  41.2000  46.5000  11.2000
Actual values (L/(m²·h)):   36.7028  10.7743  44.6790  41.5090  45.3245  10.9101
Relative error:              0.0637   0.0465   0.0295   0.0075   0.0253   0.0259
Average relative error: 0.0331

% Train the weights and the threshold of BP by GA
[P, T, R, S1, S2, S] = nninit;
% P and T are a pair of training samples for the input and output of the
% network; R, S1, and S2 are, respectively, the numbers of input, hidden,
% and output neurons; S is the encoding length of the genetic algorithm.
aa = ones(S, 1) * [-1 1];
popu = 50;                                    % the size of the population
initPop = initializega(popu, aa, 'gabpEval');
% initial population; gabpEval is the objective (fitness) function
gen = 100;                                    % the number of generations
[x, endPop, bPop, trace] = ga(aa, 'gabpEval', [], initPop, [1e-6 1 1], ...
    'maxGenTerm', gen, 'normGeomSelect', [0.09], ...
    ['arithXover'], [2], 'nonUnifMutation', [2 gen 3]);
% Decode the weights and thresholds from the code x and assign them to the
% BP network before training; gadecod is the decoding function
[W1, B1, W2, B2, P, T, A1, A2, SE, val] = gadecod(x);
net = newff(inputXX, outputYY, 7, {'tansig', 'purelin'}, 'trainlm');
net.IW{1,1} = W1;
net.LW{2,1} = W2;
net.b{1} = B1;
net.b{2} = B2;
% Use the new weights and thresholds for training
[net, tr] = train(net, inputXX, outputYY);    % train the network
% Simulation test
GABP = sim(net, input_test);

3.4. Experimental and Simulation Results. To illustrate the performance of the genetic algorithm in designing and optimizing the weights and threshold of the neural network, this paper used the same sample data; that is, we randomly selected 74 groups as the training samples and used the rest as test data to observe the training and testing performance of the MBR membrane flux under the GA-BP network.

In the experiment, the error sum of squares and the fitness curve of the genetic algorithm are shown in Figure 3. The figure shows how, as the genetic algorithm converges, the error sum of squares and the fitness change with the increase of genetic generations: on the whole, the error decreases and the fitness increases.

In the experiment, the BP network optimized by the genetic algorithm needed only 106 iterations to achieve the same predetermined error, which significantly accelerates the training of the network.

Testing on the remaining 6 groups of data, the experimental result is shown in Figure 4. Obviously, the GA-BP network also realized a reasonable prediction of membrane flux. And we can see from Table 3 that the biggest relative error is reduced to 6.37% and the average relative error is reduced to 3.31%. Obviously, it predicted with more accuracy than the BP network without optimization and realized a more reasonable forecast of the process of MBR membrane fouling. It is visible that, after optimization, the self-learning ability of the GA-BP neural network was stronger than that of the unoptimized BP neural network, and it had relatively higher prediction accuracy, better generalization and nonlinear mapping ability, and better network performance.
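The relative errors quoted here can be reproduced directly from the desired and actual flux values of Table 3 (a quick Python check):

```python
desired = [39.2000, 11.3000, 43.4000, 41.2000, 46.5000, 11.2000]  # L/(m^2 h)
actual  = [36.7028, 10.7743, 44.6790, 41.5090, 45.3245, 10.9101]

# relative error of each test point, |desired - actual| / desired
rel_err = [abs(d - a) / d for d, a in zip(desired, actual)]
print([round(e, 4) for e in rel_err])
# [0.0637, 0.0465, 0.0295, 0.0075, 0.0253, 0.0259]
print(round(sum(rel_err) / len(rel_err), 4))  # 0.0331 -> average of 3.31%
```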

Figure 3: The error sum of squares and the fitness curve of the genetic algorithm over T = 100 generations: (a) error sum of squares versus generation; (b) fitness versus generation (MAX and AVG curves).

Figure 4: Predicted (GA-BP) and desired values of the membrane flux for the six forecast samples.

4. Conclusion and Outlook

For the MBR membrane fouling factors, the traditional mathematical models of MBR membrane fouling have defects in different degrees and cannot accurately explain the phenomenon of membrane fouling, so this paper introduced intelligent algorithms, such as the neural network and the genetic algorithm, by consulting relevant references to build the model of MBR membrane fouling and implement the relevant optimization. First, we established the system model of MBR intelligent simulation based on the BP neural network, describing the relationship between the membrane fouling factors and the membrane flux, which characterizes the degree of membrane fouling. The experiment found that the predicted values were basically consistent with the desired values and only the error of a few points was larger, which indicated that the BP neural network prediction model for membrane fouling was successful. Then, drawing on the characteristics of the genetic algorithm, we optimized the weights and the threshold of the BP neural network by the genetic algorithm and established the prediction model of MBR membrane fouling based on the GA-BP neural network. The experimental study showed that the optimized GA-BP network was better than the nonoptimized BP network model in convergence rate and prediction precision. It shows that, compared with the relatively simple BP network, the GA-BP network is more suitable for the prediction of MBR membrane fouling.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] H.-M. Ke and C.-Z. Chen, "Study on BP neural network classification with optimization of genetic algorithm for remote sensing imagery," Journal of Southwest University, vol. 32, no. 7, article 128, 2010.

[2] Y.-S. Xu and J.-H. Gu, "Handwritten character recognition based on improved BP neural network," Communications Technology, vol. 5, no. 44, article 107, 2011.

[3] A. A. Javadi, R. Farmani, and T. P. Tan, "A hybrid intelligent genetic algorithm," Advanced Engineering Informatics, vol. 19, no. 4, pp. 255–262, 2005.

[4] X.-F. Meng, H.-R. Lu, and L. Guo, "Approximately duplicate record detection method based on neural network and genetic algorithm," Computer Engineering and Design, vol. 31, no. 7, pp. 1550–1553, 2010.


Page 6: Research Article The Application and Research of the GA-BP

6 Abstract and Applied Analysis

new population to neural network and computed the fitnessfunction of the new individual If it reached the predeter-mined value then we entered the next step Otherwise wecontinued their genetic operations

(6) Assign Values to the Initial Weights of BP Network Wedecoded the optimal individual searched by GA and assignedit to the initial connection weight corresponding with BPnetwork Then we continued to use BP algorithm to trainnetwork until the error sum of squares achieved the specifiedaccuracy 119864bp(119864bp lt 119864ga) or achieved the maximum numberof iterations set and the algorithm ends

The first part of the genetic BP algorithm was that weembedded neural network by genetic algorithm and searchedout the optimal individual within the approximate rangeof the neural network weights If the error sum of squaresachieved 119864ga we entered the next step without requiring toreach the highest accuracy The second part continued toadopt BP algorithm to train the network On the basis of opti-mization by genetic algorithm we set the optimal individualsearched by GA as initial weights of neural network and thendirectly used BP algorithm until the error sum of squaresachieved the minimum error of the problem Theoreticallygenetic algorithm has the global searching ability and canquickly determine the area of global optimal solution But itslocal searching ability is weak and cannot search the optimalsolution quickly So after the genetic algorithm searchedthe approximate scope of the global optimal solution wegave full play to local searching of BP algorithm With twoparts combining organically it can better accelerate operationefficiency and increase the learning ability of neural network

33 Model of GA-BP Neural Network of the MBR MembraneFouling Prediction

331 The Design of the GA-BP Model of the MBRMembrane Fouling

(1) Set BP Network Parameters BP network structure hadbeen determined by Section 22 In order to compare with BPneural network we used the same parameters settingwith thesecond section The rate of the initial learning was 008 thetransfer function of input layer to hidden layerwas logsig andhidden layer to output layer was purelin the training functionwas trainlm the index of error was set as 2119890 minus 4

(2) Optimize Weights and the Threshold of BP Networkby the GA Because the weight can be expressed by realnumber sp we used the real coding method In additionbecause the variable itself is real number we can directlycalculate the target value and fitness value without decodingwhich can accelerate the searching speed [4] and be easyto combine with BP algorithm Each connection weight isdirectly expressed by a real number the bit string obtained byencoding all weights and thresholds was called an individualsol = [119904

1 1199042 119904

119871] Each weight or threshold corresponded

to a bit of the individual which was called a gene We get allweights and thresholds together with the order 119882

1 1198611198822 119878

which formed chromosome form of the real-coded GA andits length was 119871 = 119899

119894times 119899 + 119899 + 119899 times 119899

0+ 1198990 In this paper there

are 3 input units 7 hidden units and one output unit Andadding the threshold of hidden layer and output layer unitseach individual consists of 3 times 7 + 7 + 7 times 1 + 1 = 36 andthe range of each real number is (minus1 1)

Genetic algorithm basically does not use external infor-mation in the process of the evolutionary search and is onlybased on the fitness function So it is crucial for choosingfitness function which directly affects the convergence rate ofgenetic algorithm and the ability to find the optimal solutionSince the objective function is error sum of squares of theneural network and for getting its minimum value so thefitness function adapts the inverse of error function

The selecting operation used fitness proportion which isthe most common way the cross used the method of arith-metic crossover and variation method used the operation ofnonuniform mutation It was implemented with MATLABstatements as follows

[119909 119890119899119889119875119900119901 119887119875119900119901 119905119903119886119888119890]

= 119892119886(1198861198861015840119892119886119887119901119864V1198861198971015840 [] 119894119899119894119905119875119901119901 [1119890 minus 611]1015840

1198981198861199091198661198901198991198791198901199031198981015840 11989211989011989910158401198991199001199031198981198661198901199001198981198781198901198971198901198881199051015840 [009][1015840119886119903119894119905ℎ119883119900V1198901199031015840] [2]10158401198991199001198991198801198991198941198911198721199061199051198861199051198941199001198991015840[21198921198901198993])

If the value of the fitness function the optimal individualcorresponding to met accuracy requirements or reachediterations (119879 = 100) that we set or the range of the averagefitness value changed lastly that was less than a certainconstant and over a certain algebra then we finished thetraining At this time we set the individual that had thelargest fitness as output of the optimal solution and it wasthe initial weight and threshold that need to be optimizedOtherwise we continued the cycle After reordering thecurrent parent and offspring we chose 119873 individuals thathad higher fitness value as the next generation of the groupAnd we computed its fitness and trained them again untilsatisfying the termination conditions of the above

(3) Train the BP Network We decoded the optimal solutionwhich was obtained from genetic algorithm and we assignedthem to BP network without starting training as initialweights of BP network Then according to BP algorithmwe input training samples for training and learning of thenetwork and calculated the error between the output valueand the expected value If it was beyond the precisionrequirement then we turned to the backpropagation processand returned the error signal along the way At the same timeaccording to the size of the error of each layer we adjustedits weights and thresholds layer and layer Until the error 119864was less than given value 2119890 minus 4 or it reached the trainingnumber predetermined then we ended BP algorithm Wesaved the weight and the threshold of hidden layer that hadbeen trained as the new initial weights and the threshold ofthe network and then input the test sample for simulation

In summary that used MATLAB7100 (R2010) withinstalling the genetic algorithm toolbox gaot to compilefile M which achieved the optimization of weights andthresholds the core code is as follows

Abstract and Applied Analysis 7

Table 3 Relative error of the prediction of GA-BP network

Desired values (Lm2 h) 392000 113000 434000 412000 465000 112000Actual values (Lm2 h) 367028 107743 446790 415090 453245 109101Relative error 00637 00465 00295 00075 00253 00259Average relative error 00331

% Train the weights and thresholds of BP by GA.
% P and T are a pair of training samples (network input and output);
% R, S1, and S2 are the numbers of input, hidden-layer, and output
% neurons, respectively; S is the encoding length of the genetic algorithm.
[P, T, R, S1, S2, S] = nninit;
aa = ones(S, 1) * [-1 1];      % the scope of the group
popu = 50;                     % population size
% Initial population; gabpEval is the target (fitness) function.
initPop = initializega(popu, aa, 'gabpEval');
gen = 100;                     % number of generations
[x, endPop, bPop, trace] = ga(aa, 'gabpEval', [], initPop, [1e-6 1 1], ...
    'maxGenTerm', gen, 'normGeomSelect', [0.09], ...
    'arithXover', [2], 'nonUnifMutation', [2 gen 3]);
% Decode the weights and thresholds from the code x; gadecod is the
% decoding function.
[W1, B1, W2, B2, P, T, A1, A2, SE, val] = gadecod(x);
% Assign them to the BP network before training starts.
net = newff(inputXX, outputYY, 7, {'tansig', 'purelin'}, 'trainlm');
net.IW{1,1} = W1;
net.LW{2,1} = W2;
net.b{1} = B1;
net.b{2} = B2;
% Use the new weights and thresholds for training.
[net, tr] = train(net, inputXX, outputYY);  % train the network
GABP = sim(net, inputn_test);               % simulation test

3.4. Experimental and Simulation Results. To illustrate the performance of the genetic algorithm in designing and optimizing the weights and thresholds of the neural network, this paper used the same sample data: we randomly selected 74 groups as the training samples, and the rest were used as test data to observe the training and testing performance of the GA-BP network on the MBR membrane flux.

In the experiment, the error sum of squares and the fitness curve of the genetic algorithm are shown in Figure 3. The figure shows how, as the genetic algorithm converges, the error sum of squares and the fitness change with the number of generations: on the whole, the error decreases and the fitness increases.
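This monotone trend has a simple mechanistic reading: when the best individual is carried over between generations (elitism), the best fitness can never decrease. The following toy sketch illustrates the idea; the quadratic stand-in objective, the mutation scale, and all names are illustrative assumptions, not the paper's gaot run.

```python
import numpy as np

rng = np.random.default_rng(1)

def sse(w):
    # Stand-in for the network's error sum of squares (toy quadratic).
    return float(np.sum((w - 0.7) ** 2))

def fitness(w):
    # gaot-style fitness: larger is better as the error shrinks.
    return 1.0 / (1.0 + sse(w))

pop = rng.uniform(-1, 1, (50, 8))   # population of 50 chromosomes
best_fit = []
for gen in range(100):
    fits = np.array([fitness(w) for w in pop])
    elite = pop[fits.argmax()].copy()                       # remember the best
    parents = pop[rng.choice(50, 50, p=fits / fits.sum())]  # fitness-proportional selection
    children = parents + rng.normal(0, 0.05, parents.shape) # Gaussian mutation
    children[0] = elite                                     # elitism: best survives unchanged
    pop = children
    best_fit.append(fitness(elite))
```

Because `elite` is reinserted unmutated, `best_fit` is non-decreasing generation by generation, which is exactly the qualitative shape of the fitness curve described for Figure 3.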

In the experiment, the BP network optimized by the genetic algorithm needed only 106 iterations to reach the same predetermined error, which significantly accelerates the training of the network.

Testing on the remaining 6 groups of data, the experimental result is shown in Figure 4. The GA-BP network again produced a reasonable prediction of the membrane flux. As Table 3 shows, the biggest relative error is reduced to 6.37% and the average relative error to 3.31%. The optimized network clearly predicted with higher accuracy than the BP network without optimization and gave a more reasonable forecast of the MBR membrane fouling process. After optimization, the self-learning ability of the GA-BP neural network was stronger than that of the unoptimized BP neural network, with higher prediction accuracy, better generalization and nonlinear mapping, and better overall network performance.

4. Conclusion and Outlook

The traditional mathematical models of MBR membrane fouling have defects of varying degrees and cannot accurately explain the phenomenon of membrane fouling. Consulting the relevant references, this paper therefore introduced intelligent algorithms, namely the neural network and the genetic algorithm, to model MBR membrane fouling and to carry out the corresponding optimization. First, we established the system model of the MBR intelligent simulation based on the BP neural network, capturing the relationship between the membrane fouling factors and the membrane flux, which characterizes the degree of membrane fouling. The experiment found that the predicted values were basically consistent with the desired values, with larger errors at only a few points, indicating that the BP neural network prediction model for membrane fouling was fairly successful. Then, drawing on the characteristics of the genetic algorithm, this paper optimized the weights and thresholds of the BP neural network with the genetic algorithm and established the prediction model of MBR membrane fouling based on the GA-BP neural network. The experimental study showed that the optimized GA-BP network was better than the unoptimized BP network model in both convergence rate and prediction precision.

[Figure 3: The error sum of squares (a) and the fitness curve (b) of the genetic algorithm, plotted against generation (0 to 100); each panel shows the maximum and average values.]

[Figure 4: Predicted (GA-BP) and desired values of the membrane flux for the 6 forecast samples.]

Compared with the relatively simple BP network, the GA-BP network was more suitable for the prediction of MBR membrane fouling.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] H.-M. Ke and C.-Z. Chen, "Study on BP neural network classification with optimization of genetic algorithm for remote sensing imagery," Journal of Southwest University, vol. 32, no. 7, article 128, 2010.

[2] Y.-S. Xu and J.-H. Gu, "Handwritten character recognition based on improved BP neural network," Communications Technology, vol. 5, no. 44, article 107, 2011.

[3] A. A. Javadi, R. Farmani, and T. P. Tan, "A hybrid intelligent genetic algorithm," Advanced Engineering Informatics, vol. 19, no. 4, pp. 255–262, 2005.

[4] X.-F. Meng, H.-R. Lu, and L. Guo, "Approximately duplicate record detection method based on neural network and genetic algorithm," Computer Engineering and Design, vol. 31, no. 7, pp. 1550–1553, 2010.
