Research Article

Adaptive Cuckoo Search Algorithm for Unconstrained Optimization
Pauline Ong
Faculty of Mechanical and Manufacturing Engineering, Universiti Tun Hussein Onn Malaysia (UTHM), 86400 Parit Raja, Batu Pahat, Johor, Malaysia
Correspondence should be addressed to Pauline Ong; ongp@uthm.edu.my
Received 24 June 2014; Accepted 15 August 2014; Published 14 September 2014
Academic Editor: Nirupam Chakraborti
Copyright © 2014 Pauline Ong. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Modification of the intensification and diversification approaches in the recently developed cuckoo search algorithm (CSA) is performed. The alteration involves the implementation of an adaptive step size adjustment strategy, thus enabling faster convergence to the global optimal solutions. The feasibility of the proposed algorithm is validated against benchmark optimization functions, where the obtained results demonstrate a marked improvement over the standard CSA in all cases.
1. Introduction
The solutions to multitudinous domains, whether in engineering design, operational research, industrial processes, or economics, inevitably have optimization at heart. However, obtaining a solid grasp of such problems turns out to be painstakingly tough and tedious, and thus gearing towards an efficient and effective algorithm, in the light of solving increasingly complex optimization problems in practice, is of paramount significance. Extensive and intensive studies in this aspect have yielded numerous optimization techniques, particularly the bioinspired metaheuristic methods, which draw inspiration from the means by which humans and living creatures struggle to survive in a challenging environment; for instance, the genetic algorithm (GA) [1], particle swarm optimization (PSO) [2], differential evolution (DE) [3], ant colony optimization [4], the artificial bee colony algorithm [5], and the firefly algorithm [6] form the hot topics in this area.
Cuckoo search algorithm (CSA), another adoption of biomimicry in optimization, which reproduces the breeding strategy of the best-known brood-parasitic bird, the cuckoo, has been proposed by Yang and Deb recently [7, 8]. Cuckoos, probably one of the most vicious and cunning of all bird breeds, clandestinely lay their eggs in the nests of other host birds, sparing themselves the parental responsibilities of raising their young. In fact, cuckoos practice the art of deception all the time in their reproductive life.

They mimic the colour and pattern of the host eggshell in order to disguise their eggs from being detected by the host birds. To make more space and food for their young chick, cuckoos will steal a host egg while sneaking their own into the nest. However, the relationship between the host species and the cuckoos is often a continuous arms race: the hosts learn to discern the imposters, and they either throw out the parasitic eggs or desert the nest; the parasites improve their forgery skill to make their eggs appear more alike to the host eggs.

The feasibility of applying the CSA to locate the global optimum of optimization problems has been investigated in the literature. In the pioneering work of Yang and Deb, the CSA was implemented successfully in optimizing several benchmark functions, and their findings showed that the global search ability of the CSA is more efficient than that of GA and PSO [7, 8]. On the other hand, the CSA has been employed in diverse domains since its inception, including engineering design processes [9-12], chaotic systems [13], wireless sensor networks [14, 15], structural optimization problems [9, 16, 17], image processing [18, 19], the milling process [20], and scheduling problems [21-23]. Undoubtedly, its popularity will increase unceasingly in the not-too-distant future.
Nevertheless, in real-world situations, obtaining the exact global optimum is impracticable, as the underlying problems are always subject to various uncertainties and constraints. In this case, instead of finding the actual optimum,
Hindawi Publishing Corporation, The Scientific World Journal, Volume 2014, Article ID 943403, 8 pages, http://dx.doi.org/10.1155/2014/943403
the core consideration in selecting an appropriate optimization technique is how much improvement is achievable for a given application, at a plausible computational complexity and with an acceptable error. The main thrust of this paper is therefore geared towards a modified CSA, which integrates an accelerated searching strategy in its computation. The improvement over the CSA is tested and validated through the optimization of several benchmarks. The paper is organized as follows. In Section 2, the standard CSA is introduced and its deficiencies are discussed. The modified CSA, specifically the adaptive cuckoo search algorithm (ACSA), is proposed in Section 3, and the comparative results in evaluating the benchmark optimization functions are presented in Section 4. Finally, some conclusions are drawn in Section 5.
2. Cuckoo Search Algorithm
The CSA, which draws inspiration from the cuckoo's adaptation to breeding and reproduction, is idealized with the following assumptions:

(i) each cuckoo lays one egg in a randomly selected host nest at a time, where the egg represents a possible solution for the problem under study;

(ii) the CSA follows the survival-of-the-fittest principle: only the fittest among all the host nests, with high quality eggs, will be passed on to the next generation;

(iii) the number of host nests in the CSA is fixed beforehand. The host bird spots the intruder egg with a probability p_a ∈ [0, 1]. For such incidents, the host bird will either evict the parasitic egg or abandon the nest totally and seek a new site to rebuild the nest.

Derived from these assumptions, the steps involved in the computation of the standard CSA are presented in Algorithm 1 [7].
Such idealized assumptions in the CSA, similarly to what was previously proposed in other metaheuristic optimization approaches, make use of the ideas of elitism, intensification, and diversification. Starting with a population of possible solutions, a new and potentially better solution (cuckoo egg) is generated. If the fitness value of this cuckoo egg is higher than that of another randomly selected egg from the available host nests, then it will replace that poor solution.
The cuckoo lays an egg at a random location via a Lévy flight, which is characterized by

x_i(iter + 1) = x_i(iter) + α × levy(λ),

levy(λ) = |Γ(1 + λ) × sin(πλ/2) / (Γ((1 + λ)/2) × λ × 2^((λ−1)/2))|^(1/λ),    (1)

where x_i is the possible solution, iter denotes the current generation number, Γ is the gamma function, defined by the integral Γ(x) = ∫₀^∞ e^(−t) t^(x−1) dt, and λ is a constant (1 < λ ≤ 3).
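The levy(λ) coefficient in (1) is the scale factor that appears in Mantegna's algorithm for sampling Lévy-stable steps, which is how the CSA literature commonly realizes the flight in code. A minimal sketch (the function names and the Mantegna-style sampling are illustrative assumptions, not mandated by the paper):

```python
import math
import numpy as np

def levy_coefficient(lam):
    # scale factor from Eq. (1):
    # |Γ(1+λ)·sin(πλ/2) / (Γ((1+λ)/2)·λ·2^((λ−1)/2))|^(1/λ)
    num = math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
    den = math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2)
    return abs(num / den) ** (1 / lam)

def levy_step(dim, lam=1.5, rng=None):
    # Mantegna's algorithm: u ~ N(0, sigma^2), v ~ N(0, 1),
    # heavy-tailed step = u / |v|^(1/λ)
    rng = rng or np.random.default_rng()
    sigma = levy_coefficient(lam)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / lam)
```

The heavy tail of the resulting step distribution is what produces the occasional far-field jumps that let the search escape local optima.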
The Lévy flight process is a random walk that forms a series of instantaneous jumps chosen from a heavy-tailed probability density function [24]. The step size α, which controls the scale of this random search pattern, helps exploit the search space around the current best solution and, meanwhile, explore the search space more thoroughly by far-field randomization, such that the search process can effectively move away from a local optimum. Therefore, the value of α must be assigned judiciously. The search process will be less efficient if α is chosen as a small value, since the location of the newly generated solution is near the previous one. On the other hand, if the value of α is too big, the new cuckoo egg might be placed outside the bounds. To balance the effectiveness of both intensification and diversification, Yang and Deb assigned the value of α as 1 [7].

Yang and Deb have also pointed out that the CSA outperforms the GA and the PSO in terms of the number of parameters to be adjusted [7]. In the CSA, only the probability of the abandoned nests p_a is tuned. However, the setting of p_a = 0.25 is sufficient, as they found that the convergence rate of the CSA is insensitive to p_a. Thus, the fraction of nests to desert, p_a, is assigned as 0.25 in this study.
3. Adaptive Cuckoo Search Algorithm
For any optimization approach, finding the optimum solutions competently and accurately relies utterly on the inherent search process. The effectiveness of the standard CSA is unquestionable, meaning that, when given enough computation time, it is guaranteed to converge to the optimum solutions eventually. However, the search process may be time consuming due to the associated random walk behavior [24]. In order to improve the convergence rate while maintaining the eye-catching characteristics of the CSA, an accelerated searching process, similar to the inertia weight control strategy in the PSO [25], is proposed here.
The step size α, which manages the local and global searching, is assigned as a constant in the standard CSA, where α = 1 is applied. In this present work, a new adaptive cuckoo search algorithm (ACSA) is presented. Instead of using a constant value, the step size α is adjusted adaptively in the proposed ACSA, based on the assumption that the cuckoos lay their eggs in the area with a higher egg survival rate. In this regard, by adjusting the step size α adaptively, the cuckoos search around the current good solutions for laying an egg, as this region probably contains the optimal solutions; on the contrary, they explore more rigorously for a better environment if the current habitat is not suitable for breeding. The step size α is determined adaptively as follows:
α = { α_L + (α_U − α_L) × (F_j − F_min) / (F_avg − F_min),   F_j ≤ F_avg,
    { α_U / √t,                                              F_j > F_avg,    (2)

where α_L is the predefined minimum step size, α_U is the predefined maximum step size, F_j is the fitness value of the jth cuckoo egg, F_min and F_avg denote the minimum and the average fitness values of all host nests, respectively, and t is the current generation number. The flow of the ACSA is given in Algorithm 2.
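Equation (2) translates directly into a small helper. A sketch in minimization form, where F_min is the best fitness in the population (the guard against F_avg = F_min is an added safeguard, not part of the paper's formula):

```python
import math

def adaptive_alpha(F_j, F_min, F_avg, alpha_L, alpha_U, t):
    # Eq. (2): eggs at least as fit as average search locally, with a step
    # interpolated between alpha_L (near the best nest) and alpha_U (near
    # average); the rest explore with a step that decays as alpha_U / sqrt(t).
    if F_j <= F_avg:
        if F_avg == F_min:  # degenerate population: all nests equally fit
            return alpha_L
        return alpha_L + (alpha_U - alpha_L) * (F_j - F_min) / (F_avg - F_min)
    return alpha_U / math.sqrt(t)
```

Note the intended behaviour: an egg at the current best (F_j = F_min) gets the smallest step α_L for fine local refinement, while worse-than-average eggs take large but gradually shrinking exploratory steps.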
The step size α determines how far a new cuckoo egg is located from the current host nest. Specifying the minimum
begin
  Generate initial population of q host nests x_i, i = 1, 2, …, q
  for all x_i do
    Evaluate the fitness function F_i = f(x_i)
  end for
  while (iter < MaxGeneration) or (stopping criterion)
    Generate a cuckoo egg x_j from a random host nest by using Lévy flight
    Calculate the fitness function F_j = f(x_j)
    Get a random nest i among q host nests
    if (F_j > F_i) then
      Replace x_i with x_j
      Replace F_i with F_j
    end if
    Abandon a fraction p_a of the worst nests
    Build new nests randomly to replace nests lost
    Evaluate the fitness of new nests
  end while
end

Algorithm 1: Cuckoo search algorithm.
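Algorithm 1 can be sketched compactly in Python. This is a minimal minimization-form sketch (lower f means fitter, so the paper's comparison F_j > F_i becomes f_egg < fit[i]); the Mantegna-style Lévy step, the clipping of out-of-bounds eggs, and the function names are implementation choices assumed here, not prescribed by the paper:

```python
import math
import numpy as np

def levy_step(dim, lam, rng):
    # Mantegna-style heavy-tailed step; scale factor taken from Eq. (1)
    sigma = abs(math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
                / (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / lam)

def cuckoo_search(f, lb, ub, q=20, pa=0.25, alpha=1.0, lam=1.5,
                  max_gen=1000, tol=1e-5, seed=None):
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    d = lb.size
    nests = rng.uniform(lb, ub, (q, d))
    fit = np.array([f(x) for x in nests])
    for _ in range(max_gen):
        # generate a cuckoo egg from a random host nest via a Levy flight
        j = rng.integers(q)
        egg = np.clip(nests[j] + alpha * levy_step(d, lam, rng), lb, ub)
        f_egg = f(egg)
        # compare against another randomly chosen nest; keep the better one
        i = rng.integers(q)
        if f_egg < fit[i]:
            nests[i], fit[i] = egg, f_egg
        # abandon a fraction pa of the worst nests and rebuild them randomly
        n_bad = max(1, int(pa * q))
        worst = np.argsort(fit)[-n_bad:]
        nests[worst] = rng.uniform(lb, ub, (n_bad, d))
        fit[worst] = [f(x) for x in nests[worst]]
        if fit.min() < tol:
            break
    best = int(np.argmin(fit))
    return nests[best], float(fit[best])
```

For example, running it on a 2-dimensional sphere function with the paper's settings (q = 20, p_a = 0.25, α = 1) drives the best fitness towards zero over the generations.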
begin
  Generate initial population of q host nests x_i, i = 1, 2, …, q
  Define minimum step size value α_L
  Define maximum step size value α_U
  for all x_i do
    Evaluate the fitness function F_i = f(x_i)
  end for
  while (iter < MaxGeneration) or (stopping criterion)
    Calculate the average fitness value F_avg of all the host nests
    Find the minimum fitness value F_min among all the host nests
    for all host nests do
      Current position x_i
      Calculate the step size α for Lévy flight using (2)
      Generate a cuckoo egg x_j from host nest x_i by using Lévy flight
      if (x_j falls outside the bounds) then
        Replace x_j with x_i
      end if
      Calculate the fitness function F_j = f(x_j)
      if (F_j > F_i) then
        Replace x_i with x_j
        Replace F_i with F_j
      end if
    end for
    Abandon a fraction p_a of the worst nests
    Build new nests randomly to replace nests lost
    Evaluate the fitness of new nests
  end while
end

Algorithm 2: Adaptive cuckoo search algorithm.
and the maximum Lévy flight step size values properly is crucial, such that the search process in the ACSA is neither too aggressive nor too ineffective. The α_L and α_U are chosen based on the domain of x_i. Additionally, as a precautionary measure, if the ACSA generates a cuckoo egg that falls outside the domain of interest, its position will remain unchanged.
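The out-of-bounds precaution of Algorithm 2 (keep the current position rather than clip to the boundary) can be sketched as follows; the function name is illustrative:

```python
import numpy as np

def propose_egg(x_i, alpha, step, lb, ub):
    # ACSA precaution: if the Levy flight lands outside the domain of
    # interest, the cuckoo's position remains unchanged (Algorithm 2)
    x_new = x_i + alpha * step
    inside = np.all((x_new >= lb) & (x_new <= ub))
    return x_new if inside else x_i
```

Keeping the old position instead of clipping avoids artificially piling solutions up on the boundary of the search domain.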
4. Numerical Simulations
To evaluate the feasibility of the proposed ACSA, the algorithm is applied to optimize five benchmark functions with known global optima, of which two are unimodal and three are multimodal. The optimization
performance is compared with the standard CSA. For each test function, the initial population of 20 host nests is generated randomly. The simulations are performed for 30 independent runs. The optimization process stops if the best fitness value is less than a given tolerance ξ ≤ 10^−5. For both the CSA and ACSA, the Euclidean distance from the known global minimum to the location of the best host nest with the lowest fitness value is evaluated in each iteration. The average of this distance for each loop over all the 30 trials is then measured.
In addition, to authenticate the statistical significance of the proposed ACSA, the two-tailed t-test is applied. The null hypothesis is rejected at the 5% significance level if the difference between the means of the CSA and ACSA is statistically significant. The results are presented in the last column of Table 1.
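The two-tailed test on two sets of 30 per-run iteration counts can be sketched with a pooled-variance t statistic (a minimal stdlib version for illustration; in practice a library routine such as scipy.stats.ttest_ind does the same computation and also returns the p-value):

```python
from statistics import mean, stdev
import math

def t_statistic(a, b):
    # pooled-variance two-sample t statistic for independent samples
    na, nb = len(a), len(b)
    var_a, var_b = stdev(a) ** 2, stdev(b) ** 2
    sp = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean(a) - mean(b)) / (sp * math.sqrt(1 / na + 1 / nb))
```

The statistic is then compared against the critical value of the t distribution with na + nb − 2 degrees of freedom at the 5% level.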
4.1. Ackley's Function. Ackley's function is a multimodal function, which is described as [26]
f(x) = −20 exp(−0.2 √((1/d) Σ_{i=1}^{d} x_i²)) − exp((1/d) Σ_{i=1}^{d} cos(2πx_i)) + 20 + e,    (3)

where x_i ∈ [−32.768, 32.768] and the data dimension d = 50. It has a global minimum of f* = 0 at x* = (0, 0, …, 0). The α_L and α_U are chosen as 0.2×domain of x_i and 0.8×domain of x_i, respectively. Figure 1 compares the relative performances of the CSA and ACSA in optimizing Ackley's function. As observed in this figure, the convergence rate of the ACSA is better than that of the standard CSA on this test function. Considering Table 1, which summarizes the average cycles needed for both algorithms to meet the stopping criterion, it can be clearly seen that the CSA takes more iterations to converge; the proposed ACSA reaches the global optimum about twice as fast on average. The ACSA is able to find the global solution after approximately 2500 fitness function evaluations, as opposed to the CSA, which reaches the global solution after 4000 objective function evaluations.
4.2. De Jong's Function. As the simplest unimodal test function, de Jong's function is given by [27]
f(x) = Σ_{i=1}^{d} x_i²,    (4)

where x_i ∈ [−5.12, 5.12] and d = 50. This is a sphere function with the known global minimum of f* = 0 at x* = (0, 0, …, 0). The α_L and α_U are chosen as 0.2×domain of x_i and 0.8×domain of x_i, respectively. The results achieved by the standard CSA and the proposed ACSA in optimizing de Jong's function are presented in Figure 2. Owing to the simplicity of this test function, both algorithms take considerably fewer iteration steps to converge, but the ACSA with adjustable step size is much more efficient than the standard CSA. As shown in Table 1, the ACSA reaches the known global optimum in a mean of 1000 cycles, while the CSA requires a longer processing time in order to converge.

Figure 1: Comparison of the performance of the standard CSA and the proposed ACSA for Ackley's function.

Figure 2: Comparison of the performance of the standard CSA and the proposed ACSA for de Jong's function.
4.3. Griewank's Function. Figure 3 depicts the optimization results of the CSA and ACSA, in terms of convergence characteristics, for the multimodal Griewank's function [28]:

f(x) = (1/4000) Σ_{i=1}^{d} x_i² − Π_{i=1}^{d} cos(x_i/√i) + 1,    (5)
Table 1: Performance comparison of ACSA with CSA (in terms of the number of iterations needed in order to converge).

Benchmark function | CSA: Best / Worst / Mean / SD    | ACSA: Best / Worst / Mean / SD   | Significant
Ackley             | 3884 / 4862 / 4222 / 250.74      | 1912 / 2068 / 1989 / 41.03       | Yes
De Jong            | 1690 / 1833 / 1763 / 37.55       | 1087 / 1158 / 1121 / 19.81       | Yes
Griewank           | 4287 / 4632 / 4499 / 75.63       | 3617 / 3880 / 3725 / 63.77       | Yes
Rastrigin          | 1582 / 1994 / 1734 / 118.73      | 1190 / 1389 / 1286 / 50.69       | Yes
Rosenbrock         | 17519 / 33929 / 25402 / 4592.24  | 9131 / 28803 / 22164 / 5416.74   | Yes

*SD denotes the standard deviation.
Figure 3: Comparison of the performance of the standard CSA and the proposed ACSA for Griewank's function.
where x_i ∈ [−600, 600], with d = 100. This is a high-dimensional multimodal function with the known global minimum of f* = 0 at x* = (0, 0, …, 0). The α_L and α_U are chosen as 0.02×domain of x_i and 0.08×domain of x_i, respectively. Considering Figure 3 and Table 1, the obtained results suggest that the ACSA shows improved convergence efficiency. Moreover, it can be inferred from Table 1 that the superiority of the ACSA is obvious, as it outperforms the CSA in terms of the best, worst, average, and standard deviation of the number of iterations needed to reach the known global optimum.
4.4. Rastrigin's Function. The multimodal Rastrigin's function is defined as [29]
f(x) = 10d + Σ_{i=1}^{d} [x_i² − 10 cos(2πx_i)],    (6)
where x_i ∈ [−5.12, 5.12] and d = 100, with a global minimum of f* = 0 at x* = (0, 0, …, 0). The α_L and α_U are chosen as 0.2×domain of x_i and 0.8×domain of x_i, respectively. Reaching the global minimum of Rastrigin's function is a difficult process, as this test function has many local minima, which can be observed in its 3-dimensional surface plot in Figure 4. A more challenging 100-dimensional Rastrigin's function is chosen in this case to corroborate the global searching ability of the proposed algorithm.

Figure 4: The 3-dimensional surface plot for Rastrigin's function.

Figure 5: Comparison of the performance of the standard CSA and the proposed ACSA for Rastrigin's function.
Figure 6: The 3-dimensional surface plot for Rosenbrock's function.
Figure 5 presents the optimization performances of the CSA and ACSA for Rastrigin's function. It can be deduced from the figure that the proposed ACSA is superior to the standard CSA in terms of the convergence rate. Initially, the CSA appears to converge faster to the known global minimum than the ACSA. However, as the evolution continues, the ACSA tends to get closer to the true solution than the CSA. This is presumably because the CSA gets trapped in a local solution, as there are many local minima present in this test function. Moreover, the optimization process of the CSA may span a longer period of time than that of the proposed ACSA, as it makes too small and cautious steps while exploring the search space; that is, the α which controls the scale of the search patterns is fixed at 1. On the contrary, the ACSA with the adaptive step size control strategy performs a more rigorous search through the solution space. The ACSA is able to find the global minimum in a mean of 20000 iterations, while on average the CSA needs 2000 more iterations in order to reach the optimal solution.
4.5. Rosenbrock's Function. Figure 6 illustrates the 3-dimensional surface plot for Rosenbrock's function, which is defined as [30]
f(x) = Σ_{i=1}^{d−1} [(1 − x_i)² + 100(x_{i+1} − x_i²)²],    (7)

where x_i ∈ [−100, 100] and d = 10. This test function has a global minimum of f* = 0 at x* = (1, 1, …, 1). The α_L and α_U are chosen as 0.02×domain of x_i and 0.08×domain of x_i, respectively. As shown in this figure, there is a long, narrow, parabolic-shaped valley in the surface plot. This is the region where the global minimum resides. Although finding this valley is not tedious, reaching the global optimum is difficult [24].
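The five benchmarks of Section 4 are straightforward to state in code. A sketch in minimization form with numpy (each function has global minimum 0 at the optimum stated in the text):

```python
import numpy as np

def ackley(x):
    # Eq. (3), x in [-32.768, 32.768]^d, minimum 0 at the origin
    d = len(x)
    return (-20 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / d) + 20 + np.e)

def de_jong(x):
    # Eq. (4), sphere function, x in [-5.12, 5.12]^d
    return np.sum(x ** 2)

def griewank(x):
    # Eq. (5), x in [-600, 600]^d
    i = np.arange(1, len(x) + 1)
    return np.sum(x ** 2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1

def rastrigin(x):
    # Eq. (6), x in [-5.12, 5.12]^d
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def rosenbrock(x):
    # Eq. (7), x in [-100, 100]^d, minimum 0 at (1, 1, ..., 1)
    return np.sum((1 - x[:-1]) ** 2 + 100 * (x[1:] - x[:-1] ** 2) ** 2)
```

Evaluating each function at its known optimum returns (numerically) zero, which is a quick sanity check before running any of the search algorithms on them.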
The performances of the CSA and ACSA in terms of convergence efficiency are shown in Figure 7. Comparing the best results obtained among all the 30 trials, the proposed ACSA apparently converges almost twice as fast as the standard CSA. Moreover,
Figure 7: Comparison of the performance of the standard CSA and the proposed ACSA for Rosenbrock's function.
it can be noted that the average number of iterations needed by the CSA is still lower than the longest run required by the ACSA among all the 30 independent runs.
4.6. Performance Comparison with Other Optimization Algorithms. In comparing the performances of different algorithms in optimizing the same benchmark functions, several stopping criteria can be considered; comparison of the best fitness value for a fixed number of fitness function evaluations, a commonly used approach, has been adopted in this study. The other optimization algorithms considered are DE, evolutionary programming (EP), GA, PSO, simulated annealing (SA), and the CSA. The simulations of all algorithms are performed for 30 independent runs, with the number of fitness function evaluations set to 2000. The best fitness value in each iteration is evaluated, and its average at each iteration over all the 30 trials is then measured. The obtained average best fitness values at the fixed iteration numbers of 1, 500, 1000, 1500, and 2000 in optimizing the benchmark functions are summarized in Table 2.

As shown in this table, the proposed ACSA has higher precision than any of the other algorithms for all benchmarks considered, except for Griewank's function. The DE produces a solution that is closer to the global optimum than the proposed ACSA in this case. However, it is pertinent to note that there is no or only marginal improvement in DE after 500 function evaluations; the obtained best fitness value only decreases slightly, from 0.35 to 0.33, which might indicate that the DE gets trapped in a local optimum. In fact, as shown in Table 2, the DE, EP, and PSO algorithms usually converge faster initially, but they often get stuck in local optima easily, which is particularly obvious in the cases of Ackley's, de Jong's, and Griewank's functions. On the other hand, the proposed ACSA gets gradually closer to the global optimum as the iteration count increases. The superiority of the ACSA is more noticeable in
Table 2: Performance comparison of ACSA with other optimization methods (in terms of fixed iteration number). Entries are the average best fitness value over 30 independent runs.

Benchmark function | Generation | DE       | EP       | GA       | PSO      | SA       | CSA      | ACSA
Ackley             | 1          | 19.15    | 19.62    | 20.93    | 20.77    | 21.08    | 20.95    | 20.90
                   | 500        | 2.68     | 4.94     | 11.54    | 2.69     | 20.54    | 11.16    | 3.76
                   | 1000       | 2.68     | 4.63     | 8.44     | 2.69     | 19.43    | 4.01     | 0.06
                   | 1500       | 2.68     | 4.41     | 6.94     | 2.69     | 17.57    | 1.90     | 6.39E−04
                   | 2000       | 2.68     | 4.29     | 6.00     | 2.69     | 15.39    | 0.50     | 8.71E−06
de Jong            | 1          | 32.88    | 41.85    | 334.35   | 334.12   | 332.88   | 335.59   | 343.79
                   | 500        | 0.02     | 1.44     | 17.79    | 0.27     | 89.98    | 1.91     | 0.14
                   | 1000       | 0.02     | 1.21     | 7.65     | 0.27     | 68.02    | 0.02     | 7.58E−05
                   | 1500       | 0.02     | 1.13     | 4.52     | 0.27     | 40.57    | 1.29E−04 | 3.91E−08
                   | 2000       | 0.02     | 1.04     | 2.96     | 0.27     | 20.32    | 1.06E−06 | 2.10E−11
Griewank           | 1          | 96.76    | 155.82   | 2521.20  | 2477.39  | 289.59   | 2516.57  | 2543.72
                   | 500        | 0.35     | 0.92     | 342.43   | 2.13     | 252.83   | 89.87    | 69.32
                   | 1000       | 0.34     | 0.85     | 199.79   | 2.13     | 197.62   | 7.40     | 4.23
                   | 1500       | 0.33     | 0.83     | 136.00   | 2.13     | 147.97   | 1.46     | 1.17
                   | 2000       | 0.33     | 0.78     | 103.27   | 2.13     | 106.96   | 1.01     | 0.70
Rastrigin          | 1          | 1196.4   | 1254.9   | 1228.3   | 1265.2   | 1913.9   | 1230.0   | 1222.6
                   | 500        | 146.9    | 459.0    | 81.6     | 280.9    | 1868.4   | 36.7     | 29.4
                   | 1000       | 146.9    | 412.7    | 42.7     | 280.8    | 1730.8   | 0.62     | 0.25
                   | 1500       | 146.9    | 385.4    | 32.1     | 278.4    | 1654.8   | 1.16E−03 | 8.82E−05
                   | 2000       | 146.9    | 368.2    | 22.8     | 277.8    | 1582.8   | 9.72E−07 | 8.65E−09
Rosenbrock         | 1          | 3.11E+09 | 5.10E+09 | 4.81E+09 | 5.54E+09 | 1.79E+10 | 5.44E+09 | 5.90E+09
                   | 500        | 1.09E+05 | 3.39E+04 | 3.28E+05 | 7.34E+04 | 7.93E+09 | 205.31   | 61.01
                   | 1000       | 1.04E+05 | 1.31E+04 | 3.79E+04 | 7.31E+04 | 4.30E+08 | 13.82    | 6.37
                   | 1500       | 1.04E+05 | 9.34E+03 | 8.99E+03 | 7.31E+04 | 3.02E+05 | 4.72     | 3.17
                   | 2000       | 1.04E+05 | 8.36E+03 | 5.21E+03 | 7.31E+04 | 8.97E+04 | 2.94     | 1.75
the cases of Ackley's, de Jong's, and Rastrigin's functions, where the ACSA reaches the near-optimal solution after 1500 function evaluations.
5. Conclusions
In this paper, by scrutinizing the advantages and the limitations of the standard CSA, a new modified CSA, specifically the ACSA, which adopts an adjustable step size in its computation, has been proposed. For all the considered benchmark optimization functions, the superiority of the ACSA over the standard CSA is validated in terms of the convergence characteristics. The proposed algorithm preserves the uniqueness and the fascinating features of the original CSA, as both algorithms are able to find the global optimum when given enough computation time, but the ACSA converges in fewer iterations. This is probably due to its capability to concurrently refine the local search around the current good solutions while exploring more aggressively for the optimal solutions, attributed to the adopted adaptive step size. It is worth mentioning that, through empirical simulations, increasing the maximum step size value will encourage a more thorough global exploration and eventually lead to faster convergence; however, the generated new solutions might fall outside the design domain in some cases. Thus, a moderate value is chosen in this study.
Conflict of Interests
The author declares that there is no conflict of interests regarding the publication of this paper.
Acknowledgment
Financial support from Universiti Tun Hussein Onn Malaysia (UTHM) through Short Term Grant Vot 1287 is gratefully acknowledged.
References
[1] J. Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence, MIT Press, Cambridge, Mass, USA, 1992.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 1-6, pp. 1942-1948, 1995.

[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.

[4] A. Colorni, M. Dorigo, and V. Maniezzo, "Distributed optimization by ant colonies," in Proceedings of the 1st European Conference on Artificial Life, pp. 134-142, Paris, France, 1991.

[5] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Tech. Rep. TR06, Erciyes University, Kayseri, Turkey, 2005.

[6] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, vol. 5792 of Lecture Notes in Computer Science, pp. 169-178, Springer, Berlin, Germany, 2009.

[7] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210-214, December 2009.

[8] X.-S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330-343, 2010.

[9] I. Durgun and A. R. Yildiz, "Structural design optimization of vehicle components using Cuckoo search algorithm," Materials Testing, vol. 54, no. 3, pp. 185-188, 2012.

[10] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, "Use of Modified Cuckoo Search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms," Applied Energy, vol. 114, pp. 901-908, 2013.

[11] A. H. Gandomi, S. Talatahari, X.-S. Yang, and S. Deb, "Design optimization of truss structures using cuckoo search algorithm," Structural Design of Tall and Special Buildings, vol. 22, pp. 1330-1349, 2013.

[12] R. Sharma, K. P. S. Rana, and V. Kumar, "Performance analysis of fractional order fuzzy PID controllers applied to a robotic manipulator," Expert Systems with Applications, vol. 41, no. 9, pp. 4274-4289, 2014.

[13] X.-T. Li and M.-H. Yin, "Parameter estimation for chaotic systems using the cuckoo search algorithm with an orthogonal learning method," Chinese Physics B, vol. 21, no. 5, Article ID 050507, 2012.

[14] M. Dhivya and M. Sundarambal, "Cuckoo Search for data gathering in wireless sensor networks," International Journal of Mobile Communications, vol. 9, no. 6, pp. 642-656, 2011.

[15] S. Goyal and M. S. Patterh, "Wireless sensor network localization based on cuckoo search algorithm," Wireless Personal Communications, 2014.

[16] A. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17-35, 2013.

[17] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17-35, 2013.

[18] A. K. Bhandari, V. K. Singh, A. Kumar, and G. K. Singh, "Cuckoo search algorithm and wind driven optimization based study of satellite image segmentation for multilevel thresholding using Kapur's entropy," Expert Systems with Applications, vol. 41, pp. 3538-3560, 2014.

[19] A. K. Bhandari, V. Soni, A. Kumar, and G. K. Singh, "Cuckoo search algorithm based satellite image contrast and brightness enhancement using DWT-SVD," ISA Transactions, vol. 53, no. 4, pp. 1286-1296, 2014.

[20] A. R. Yildiz, "Cuckoo search algorithm for the selection of optimal machining parameters in milling operations," International Journal of Advanced Manufacturing Technology, vol. 64, no. 1-4, pp. 55-61, 2013.

[21] K. Chandrasekaran and S. P. Simon, "Multi-objective scheduling problem: hybrid approach using fuzzy assisted cuckoo search algorithm," Swarm and Evolutionary Computation, vol. 5, pp. 1-16, 2012.

[22] S. Burnwal and S. Deb, "Scheduling optimization of flexible manufacturing system using cuckoo search-based approach," The International Journal of Advanced Manufacturing Technology, vol. 64, no. 5-8, pp. 951-959, 2013.

[23] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "Improved cuckoo search algorithm for hybrid flow shop scheduling problems to minimize makespan," Applied Soft Computing Journal, vol. 19, pp. 93-101, 2014.

[24] S. Walton, O. Hassan, K. Morgan, and M. R. Brown, "Modified cuckoo search: a new gradient free optimisation algorithm," Chaos, Solitons and Fractals, vol. 44, no. 9, pp. 710-718, 2011.

[25] C. Li, J. Zhou, P. Kou, and J. Xiao, "A novel chaotic particle swarm optimization based fuzzy clustering algorithm," Neurocomputing, vol. 83, pp. 98-109, 2012.

[26] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170-215, Morgan Kaufmann, Los Altos, Calif, USA, 1987.

[27] K. A. de Jong, An Analysis of the Behavior of a Class of Genetic Adaptive Systems, Department of Computer and Communication Sciences, University of Michigan, Ann Arbor, Mich, USA, 1975.

[28] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11-39, 1981.

[29] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974.

[30] H. H. Rosenbrock, "An automatic method for finding the greatest or least value of a function," The Computer Journal, vol. 3, no. 3, pp. 175-184, 1960.
2 The Scientific World Journal
The core consideration in selecting an appropriate optimization technique is how much improvement is achievable for a given application at a plausible computational complexity and with an acceptable error. The main thrust of this paper is therefore a modified CSA which integrates an accelerated searching strategy in its computation. The improvement over the CSA is tested and validated through the optimization of several benchmarks. The paper is organized as follows. In Section 2, the standard CSA is introduced and its deficiencies are discussed. The modified CSA, specifically the adaptive cuckoo search algorithm (ACSA), is proposed in Section 3, and the comparative results in evaluating the benchmark optimization functions are presented in Section 4. Finally, some conclusions are drawn in Section 5.
2. Cuckoo Search Algorithm
The CSA, which draws inspiration from the cuckoo's adaptation to breeding and reproduction, is idealized with the following assumptions:

(i) each cuckoo lays one egg in a randomly selected host nest at a time, where the egg represents a possible solution for the problem under study;

(ii) the CSA follows the survival-of-the-fittest principle: only the fittest among all the host nests with high-quality eggs will be passed on to the next generation;

(iii) the number of host nests in the CSA is fixed beforehand. The host bird spots the intruder egg with a probability $p_a \in [0, 1]$. For such incidents, the host bird will either evict the parasitic egg or abandon the nest totally and seek a new site to rebuild the nest.

Derived from these assumptions, the steps involved in the computation of the standard CSA are presented in Algorithm 1 [7].
Such idealized assumptions in the CSA, similar to what was previously proposed in other metaheuristic optimization approaches, make use of the ideas of elitism, intensification, and diversification. Starting with a population of possible solutions, a new and potentially better solution (cuckoo egg) is generated. If the fitness value of this cuckoo egg is higher than that of another randomly selected egg from the available host nests, then it will replace that poorer solution.
The cuckoo lays an egg at a random location via a Lévy flight, which is characterized by

$$x_i(\text{iter} + 1) = x_i(\text{iter}) + \alpha \times \text{levy}(\lambda),$$
$$\text{levy}(\lambda) = \left| \frac{\Gamma(1 + \lambda) \times \sin(\pi\lambda/2)}{\Gamma((1 + \lambda)/2) \times \lambda \times 2^{(\lambda - 1)/2}} \right|^{1/\lambda}, \qquad (1)$$

where $x_i$ is the possible solution, iter denotes the current generation number, $\Gamma$ is the gamma function, defined by the integral $\Gamma(x) = \int_0^\infty e^{-t} t^{x-1}\,dt$, and $\lambda$ is a constant ($1 < \lambda \le 3$).
The Lévy flight process is a random walk that forms a series of instantaneous jumps chosen from a heavy-tailed probability density function [24]. The step size $\alpha$, which controls the scale of this random search pattern, helps exploit the search space around the current best solution and, meanwhile, explore the search space more thoroughly by far-field randomization, such that the search process can effectively move away from a local optimum. Therefore, the value of $\alpha$ must be assigned judiciously. The search process will be less efficient if $\alpha$ is chosen too small, since the location of the newly generated solution is near the previous one. On the other hand, if the value of $\alpha$ is too big, the new cuckoo egg might be placed outside the bounds. To balance the effectiveness of both intensification and diversification, Yang and Deb assigned the value of $\alpha$ as 1 [7].
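The Lévy step in (1) is commonly drawn with Mantegna's algorithm, in which the bracketed factor of (1) serves as the scale $\sigma$ of a Gaussian ratio. The sketch below follows that reading; it is an illustration under that assumption, not the author's exact implementation, and the function names are mine.

```python
import math
import random

def levy_step(lam=1.5):
    # Scale sigma is the bracketed factor of Eq. (1) for exponent lam.
    sigma = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
             / (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    u = random.gauss(0.0, sigma)   # heavy-tailed numerator
    v = random.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / lam)

def levy_move(x, alpha=1.0, lam=1.5):
    # One update of Eq. (1): x_i(iter + 1) = x_i(iter) + alpha * levy(lam).
    return [xi + alpha * levy_step(lam) for xi in x]
```

Most steps are small (local search), but the heavy tail occasionally produces a very long jump, which is what gives the Lévy flight its far-field randomization.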
Yang and Deb have also pointed out that the CSA outperforms the GA and the PSO in terms of the number of parameters to be adjusted [7]. In the CSA, only the probability of the abandoned nests, $p_a$, is tuned. Moreover, the setting of $p_a = 0.25$ is sufficient, as they found that the convergence rate of the CSA is insensitive to $p_a$. Thus, the fraction of nests to desert, $p_a$, is assigned as 0.25 in this study.
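As a concrete reading of the abandonment step, a fraction $p_a$ of the worst nests can be rebuilt at random positions each generation. This is a sketch assuming a minimization problem (larger objective value = worse nest); the bounds and helper name are illustrative, not prescribed by the paper.

```python
import random

def abandon_worst(nests, fitness, f, pa=0.25, lower=-5.12, upper=5.12):
    # Rebuild the worst fraction pa of nests at random positions and
    # re-evaluate them with the objective f.
    q = len(nests)
    worst = sorted(range(q), key=lambda i: fitness[i], reverse=True)[:int(pa * q)]
    for i in worst:
        nests[i] = [random.uniform(lower, upper) for _ in nests[i]]
        fitness[i] = f(nests[i])
    return nests, fitness
```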
3. Adaptive Cuckoo Search Algorithm
For any optimization approach, finding the optimum solutions competently and accurately relies utterly on the inherent search process. The effectiveness of the standard CSA is unquestionable, meaning that, when given enough computation time, it is guaranteed to converge to the optimum solutions eventually. However, the search process may be time consuming due to the associated random walk behavior [24]. In order to improve the convergence rate while maintaining the eye-catching characteristics of the CSA, an accelerated searching process, which is similar to the inertia weight control strategy in the PSO [25], is proposed here.
The step size $\alpha$, which manages the local and global searching, is assigned as a constant in the standard CSA, where $\alpha = 1$ is applied. In this present work, a new adaptive cuckoo search algorithm (ACSA) is presented. Instead of using a constant value, the step size $\alpha$ is adjusted adaptively in the proposed ACSA, based on the assumption that the cuckoos lay their eggs in the area with a higher egg survival rate. In this regard, by adjusting the step size $\alpha$ adaptively, the cuckoos search around the current good solutions for laying an egg, as this region will probably contain the optimal solutions; on the contrary, they explore more rigorously for a better environment if the current habitat is not suitable for breeding. The step size $\alpha$ is determined adaptively as follows:
$$\alpha = \begin{cases} \alpha_L + (\alpha_U - \alpha_L)\,\dfrac{F_j - F_{\min}}{F_{\text{avg}} - F_{\min}}, & F_j \le F_{\text{avg}}, \\[6pt] \dfrac{\alpha_U}{\sqrt{t}}, & F_j > F_{\text{avg}}, \end{cases} \qquad (2)$$

where $\alpha_L$ is the predefined minimum step size, $\alpha_U$ is the predefined maximum step size, $t$ is the current generation number, $F_j$ is the fitness value of the $j$th cuckoo egg, and $F_{\min}$ and $F_{\text{avg}}$ denote the minimum and the average fitness values of all host nests, respectively. The flow of the ACSA is given in Algorithm 2.
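Equation (2) translates almost line-for-line into code. The generation counter t is assumed to start at 1, and a guard for the degenerate case $F_{\text{avg}} = F_{\min}$ (all nests equally fit) is added as a practical detail not spelled out in the paper.

```python
import math

def adaptive_alpha(F_j, F_min, F_avg, alpha_L, alpha_U, t):
    # Eq. (2): for F_j <= F_avg the step interpolates between alpha_L
    # (at F_min) and alpha_U (at F_avg); otherwise the maximum step
    # alpha_U shrinks over generations as 1/sqrt(t).
    if F_j <= F_avg:
        spread = F_avg - F_min
        if spread == 0.0:          # degenerate case: all fitness values equal
            return alpha_L
        return alpha_L + (alpha_U - alpha_L) * (F_j - F_min) / spread
    return alpha_U / math.sqrt(t)
```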
The step size $\alpha$ determines how far a new cuckoo egg is located from the current host nest. Specifying the minimum
begin
    Generate initial population of q host nests x_i, i = 1, 2, ..., q
    for all x_i do
        Evaluate the fitness function F_i = f(x_i)
    end for
    while (iter < MaxGeneration) or (stopping criterion)
        Generate a cuckoo egg x_j from a random host nest by using Lévy flight
        Calculate the fitness function F_j = f(x_j)
        Get a random nest i among q host nests
        if (F_j > F_i) then
            Replace x_i with x_j
            Replace F_i with F_j
        end if
        Abandon a fraction p_a of the worst nests
        Build new nests randomly to replace nests lost
        Evaluate the fitness of new nests
    end while
end

Algorithm 1: Cuckoo search algorithm.
begin
    Generate initial population of q host nests x_i, i = 1, 2, ..., q
    Define minimum step size value α_L
    Define maximum step size value α_U
    for all x_i do
        Evaluate the fitness function F_i = f(x_i)
    end for
    while (iter < MaxGeneration) or (stopping criterion)
        Calculate the average fitness value F_avg of all the host nests
        Find the minimum fitness value F_min among all the host nests
        for all host nests do
            Current position x_i
            Calculate the step size α for Lévy flight using (2)
            Generate a cuckoo egg x_j from host nest x_i by using Lévy flight
            if (x_j falls outside the bounds) then
                Replace x_j with x_i
            end if
            Calculate the fitness function F_j = f(x_j)
            if (F_j > F_i) then
                Replace x_i with x_j
                Replace F_i with F_j
            end if
        end for
        Abandon a fraction p_a of the worst nests
        Build new nests randomly to replace nests lost
        Evaluate the fitness of new nests
    end while
end

Algorithm 2: Adaptive cuckoo search algorithm.
and the maximum Lévy flight step size values properly is crucial, such that the search process in the ACSA is neither too aggressive nor too ineffective. The $\alpha_L$ and $\alpha_U$ are chosen based on the domain of $x_i$. Additionally, as a precautionary measure, if the ACSA generates a cuckoo egg that falls outside the domain of interest, its position will remain unchanged.
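Putting Algorithm 2 together for a minimization problem (the pseudocode's fitness comparison is flipped accordingly), a compact sketch might look as follows. The population size, bounds, tolerance, and the Mantegna-style Lévy sampler are illustrative choices, not prescribed by the paper.

```python
import math
import random

def levy(lam=1.5):
    # Mantegna-style Levy step (illustrative sampler for Eq. (1)).
    sigma = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
             / (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    return random.gauss(0.0, sigma) / abs(random.gauss(0.0, 1.0)) ** (1 / lam)

def acsa(f, dim, lower, upper, q=20, pa=0.25, max_gen=2000, tol=1e-5):
    # Adaptive cuckoo search (Algorithm 2), minimization form.
    span = upper - lower
    alpha_L, alpha_U = 0.2 * span, 0.8 * span   # Sec. 4 uses fractions of the domain
    nests = [[random.uniform(lower, upper) for _ in range(dim)] for _ in range(q)]
    fit = [f(x) for x in nests]
    for t in range(1, max_gen + 1):
        f_min, f_avg = min(fit), sum(fit) / q
        for i in range(q):
            # adaptive step size, Eq. (2)
            if fit[i] <= f_avg:
                spread = (f_avg - f_min) or 1.0
                alpha = alpha_L + (alpha_U - alpha_L) * (fit[i] - f_min) / spread
            else:
                alpha = alpha_U / math.sqrt(t)
            cand = [xi + alpha * levy() for xi in nests[i]]
            if any(not lower <= c <= upper for c in cand):
                cand = nests[i]                 # out of bounds: keep current position
            fc = f(cand)
            if fc < fit[i]:                     # lower objective replaces the nest
                nests[i], fit[i] = cand, fc
        # abandon a fraction pa of the worst nests and rebuild them randomly
        worst = sorted(range(q), key=fit.__getitem__, reverse=True)[:int(pa * q)]
        for i in worst:
            nests[i] = [random.uniform(lower, upper) for _ in range(dim)]
            fit[i] = f(nests[i])
        if min(fit) < tol:
            break
    best = min(range(q), key=fit.__getitem__)
    return nests[best], fit[best]
```

For example, `acsa(lambda v: sum(t * t for t in v), dim=2, lower=-5.12, upper=5.12)` would minimize a 2-dimensional sphere function.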
4. Numerical Simulations
To evaluate the feasibility of the proposed ACSA, the algorithm is applied to optimize five benchmark functions with known global optima, two of which are unimodal and three of which are multimodal. The optimization
performance is compared with the standard CSA. For each test function, an initial population of 20 host nests is generated randomly. The simulations are performed for 30 independent runs. The optimization process stops if the best fitness value is less than a given tolerance, $\xi \le 10^{-5}$. For both the CSA and the ACSA, the Euclidean distance from the known global minimum to the location of the best host nest with the lowest fitness value is evaluated in each iteration. The average of this distance for each loop over all the 30 trials is then measured.
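The convergence metric just described is simply the Euclidean norm of the error vector; for instance (helper name mine):

```python
import math

def distance_to_optimum(best_nest, optimum):
    # Euclidean distance from the best nest to the known global minimum,
    # the quantity plotted on the vertical axes of Figures 1-7.
    return math.sqrt(sum((b - o) ** 2 for b, o in zip(best_nest, optimum)))
```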
In addition, to authenticate the statistical significance of the proposed ACSA, the two-tailed t-test is applied. The null hypothesis of equal means is rejected at the 5% significance level, indicating that the difference between the means of the CSA and the ACSA is statistically significant. The results are presented in the last column of Table 1.
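The paper does not state which variant of the two-sample t-test is used; a Welch-style statistic over the 30 per-run iteration counts would be computed along these lines (the degrees of freedom and the p-value lookup are omitted from this sketch):

```python
import math
import statistics

def welch_t(sample_a, sample_b):
    # Two-sample t statistic with unequal variances (Welch); positive when
    # sample_a has the larger mean.
    va = statistics.variance(sample_a) / len(sample_a)
    vb = statistics.variance(sample_b) / len(sample_b)
    return (statistics.mean(sample_a) - statistics.mean(sample_b)) / math.sqrt(va + vb)
```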
4.1. Ackley's Function. Ackley's function is a multimodal function which is described as [26]

$$f(x) = -20 \exp\left(-0.2 \sqrt{\frac{1}{d} \sum_{i=1}^{d} x_i^2}\right) - \exp\left(\frac{1}{d} \sum_{i=1}^{d} \cos(2\pi x_i)\right) + 20 + e, \qquad (3)$$

where $x_i \in [-32.768, 32.768]$ and the data dimension $d = 50$. It has a global minimum of $f^* = 0$ at $x^* = (0, 0, \ldots, 0)$. The $\alpha_L$ and $\alpha_U$ are chosen as 0.2 and 0.8 times the domain of $x_i$, respectively. Figure 1 compares the relative performances of the CSA and the ACSA in optimizing Ackley's function. As observed in this figure, the convergence rate of the ACSA is clearly better than that of the standard CSA on this test function. Considering Table 1, which summarizes the average cycles needed for both algorithms to meet the stopping criterion, it can be clearly seen that the CSA takes more iterations to converge; the proposed ACSA reaches the global optimum roughly twice as fast on average. The ACSA is able to find the global solution after approximately 2500 fitness function evaluations, as opposed to the CSA, which reaches the global solution after 4000 objective function evaluations.
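For reference, Eq. (3) can be coded directly; at the origin the two exponential terms cancel the constants, giving f = 0.

```python
import math

def ackley(x):
    # Ackley's function, Eq. (3); global minimum f* = 0 at x* = (0, ..., 0).
    d = len(x)
    s1 = sum(xi * xi for xi in x)
    s2 = sum(math.cos(2.0 * math.pi * xi) for xi in x)
    return (-20.0 * math.exp(-0.2 * math.sqrt(s1 / d))
            - math.exp(s2 / d) + 20.0 + math.e)
```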
4.2. De Jong's Function. As the simplest unimodal test function, De Jong's function is given by [27]

$$f(x) = \sum_{i=1}^{d} x_i^2, \qquad (4)$$

where $x_i \in [-5.12, 5.12]$ and $d = 50$. This is a sphere function with the known global minimum of $f^* = 0$ at $x^* = (0, 0, \ldots, 0)$. The $\alpha_L$ and $\alpha_U$ are chosen as 0.2 and 0.8 times the domain of $x_i$, respectively. The results achieved by the standard CSA and the proposed ACSA in optimizing De Jong's function are presented in Figure 2. Due to the simplicity of the test function, both algorithms take considerably fewer iteration steps to converge, but the ACSA with adjustable step size is much
Figure 1: Comparison of the performance of the standard CSA and the proposed ACSA for Ackley's function.
Figure 2: Comparison of the performance of the standard CSA and the proposed ACSA for De Jong's function.
more efficient than the standard CSA. As shown in Table 1, the ACSA reaches the known global optimum in a mean of 1000 cycles, while the CSA requires a longer processing time to converge.
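De Jong's sphere function, Eq. (4), is a one-liner:

```python
def de_jong(x):
    # De Jong's sphere function, Eq. (4); global minimum f* = 0 at the origin.
    return sum(xi * xi for xi in x)
```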
4.3. Griewank's Function. Figure 3 depicts the optimization results of the CSA and the ACSA, in terms of convergence characteristics, for the multimodal Griewank's function [28]:

$$f(x) = \frac{1}{4000} \sum_{i=1}^{d} x_i^2 - \prod_{i=1}^{d} \cos\left(\frac{x_i}{\sqrt{i}}\right) + 1, \qquad (5)$$
Table 1: Performance comparison of ACSA with CSA (in terms of the number of iterations needed in order to converge).

Benchmark function | CSA Best | CSA Worst | CSA Mean | CSA SD | ACSA Best | ACSA Worst | ACSA Mean | ACSA SD | Significant
Ackley     |  3884 |  4862 |  4222 |  250.74 | 1912 |  2068 |  1989 |   41.03 | Yes
De Jong    |  1690 |  1833 |  1763 |   37.55 | 1087 |  1158 |  1121 |   19.81 | Yes
Griewank   |  4287 |  4632 |  4499 |   75.63 | 3617 |  3880 |  3725 |   63.77 | Yes
Rastrigin  |  1582 |  1994 |  1734 |  118.73 | 1190 |  1389 |  1286 |   50.69 | Yes
Rosenbrock | 17519 | 33929 | 25402 | 4592.24 | 9131 | 28803 | 22164 | 5416.74 | Yes

*SD denotes the standard deviation.
Figure 3: Comparison of the performance of the standard CSA and the proposed ACSA for Griewank's function.
where $x_i \in [-600, 600]$ and $d = 100$. This is a high-dimensional multimodal function with the known global minimum of $f^* = 0$ at $x^* = (0, 0, \ldots, 0)$. The $\alpha_L$ and $\alpha_U$ are chosen as 0.02 and 0.08 times the domain of $x_i$, respectively. Considering Figure 3 and Table 1, the obtained results suggest that the ACSA shows improved convergence efficiency. Moreover, it can be inferred from Table 1 that the superiority of the ACSA is obvious, as it outperforms the CSA in terms of the best, worst, average, and standard deviation of the number of iterations needed to reach the known global optimum.
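Eq. (5), with its 1/√i coupling inside the cosine product (1-based index i), reads:

```python
import math

def griewank(x):
    # Griewank's function, Eq. (5); global minimum f* = 0 at the origin.
    s = sum(xi * xi for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i)) for i, xi in enumerate(x, start=1))
    return s - p + 1.0
```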
4.4. Rastrigin's Function. The multimodal Rastrigin's function is defined as [29]

$$f(x) = 10d + \sum_{i=1}^{d} \left[x_i^2 - 10 \cos(2\pi x_i)\right], \qquad (6)$$

where $x_i \in [-5.12, 5.12]$ and $d = 100$, with a global minimum of $f^* = 0$ at $x^* = (0, 0, \ldots, 0)$. The $\alpha_L$ and $\alpha_U$ are chosen as 0.2 and 0.8 times the domain of $x_i$, respectively. Reaching the global minimum of Rastrigin's function is a difficult process, as this test function has many local minima, which
Figure 4: The 3-dimensional surface plot for Rastrigin's function.
Figure 5: Comparison of the performance of the standard CSA and the proposed ACSA for Rastrigin's function.
can be observed in its 3-dimensional surface plot in Figure 4. A more challenging 100-dimensional Rastrigin's function is chosen in this case to corroborate the global searching ability of the proposed algorithm.
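Eq. (6) in code:

```python
import math

def rastrigin(x):
    # Rastrigin's function, Eq. (6); global minimum f* = 0 at the origin.
    return 10.0 * len(x) + sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi)
                               for xi in x)
```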
Figure 6: The 3-dimensional surface plot for Rosenbrock's function.
Figure 5 presents the optimization performances of the CSA and the ACSA for Rastrigin's function. It can be deduced from the figure that the proposed ACSA is superior to the standard CSA in terms of the convergence rate. It is evident that, initially, the CSA appears to converge faster to the known global minimum than the ACSA. However, as the evolution continues, the ACSA tends to get closer to the true solution than the CSA. This is presumably because the CSA gets trapped in a local solution, as there are many local minima present in this test function. Moreover, the optimization process of the CSA may span a longer period of time than that of the proposed ACSA, as it makes too small and cautious steps while exploring the search space; that is, the $\alpha$ which controls the scale of the search pattern is fixed at 1. On the contrary, the ACSA with the adaptive step size control strategy performs a more rigorous search through the solution space. The ACSA is able to find the global minimum in a mean of 20000 iterations, while, on average, the CSA needs 2000 more iterations to reach the optimal solution.
4.5. Rosenbrock's Function. Figure 6 illustrates the 3-dimensional surface plot for Rosenbrock's function, which is defined as [30]

$$f(x) = \sum_{i=1}^{d-1} \left[(1 - x_i)^2 + 100\left(x_{i+1} - x_i^2\right)^2\right], \qquad (7)$$

where $x_i \in [-100, 100]$ and $d = 10$. This test function has a global minimum of $f^* = 0$ at $x^* = (1, 1, \ldots, 1)$. The $\alpha_L$ and $\alpha_U$ are chosen as 0.02 and 0.08 times the domain of $x_i$, respectively. As shown in this figure, there is a long, narrow, parabolic-shaped valley in the surface plot. This is the region where the global minimum resides. Although finding this valley is not tedious, reaching the global optimum is difficult [24].
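Eq. (7), whose valley floor follows x_{i+1} = x_i^2, is:

```python
def rosenbrock(x):
    # Rosenbrock's function, Eq. (7); global minimum f* = 0 at x* = (1, ..., 1).
    return sum((1.0 - x[i]) ** 2 + 100.0 * (x[i + 1] - x[i] ** 2) ** 2
               for i in range(len(x) - 1))
```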
The performances of the CSA and the ACSA in terms of convergence efficiency are shown in Figure 7. Comparing the best results obtained among all the 30 trials, the proposed ACSA apparently speeds up the computation by roughly a factor of two compared to the standard CSA. Moreover,
Figure 7: Comparison of the performance of the standard CSA and the proposed ACSA for Rosenbrock's function.
it can be noted that the average number of iterations needed by the CSA is still lower than the longest run recorded by the ACSA among all the 30 independent runs.
4.6. Performance Comparison with Other Optimization Algorithms. In comparing the performances of different algorithms in optimizing the same benchmark functions, several approaches with different stopping criteria can be considered; the comparison of the best fitness value for a fixed number of fitness function evaluations, a commonly used approach, has been adopted in this study. The other optimization algorithms considered are the DE, evolutionary programming (EP), the GA, the PSO, simulated annealing (SA), and the CSA. The simulations of all algorithms are performed for 30 independent runs, with the number of fitness function evaluations set to 2000. The best fitness value in each iteration is evaluated, and its average at each iteration over all the 30 trials is then measured. The obtained average best fitness values at the fixed iteration numbers of 1, 500, 1000, 1500, and 2000 in optimizing the benchmark functions are summarized in Table 2.

As shown in this table, the proposed ACSA has higher precision than any of the other algorithms for all the benchmarks considered, except for Griewank's function. The DE produces a solution that is closer to the global optimum than the proposed ACSA in this case. However, it is pertinent to note that there is no or only marginal improvement in the DE after 500 function evaluations. The obtained best fitness value only decreases slightly, from 0.35 to 0.33, which might indicate that the DE gets trapped in a local optimum. In fact, as shown in Table 2, the DE, EP, and PSO algorithms usually converge faster initially, but they often get stuck in local optima easily, which is particularly obvious in the cases of Ackley's, De Jong's, and Griewank's functions. On the other hand, the proposed ACSA gets gradually closer to the global optimum as the iteration number increases. The superiority of the ACSA is more noticeable in
Table 2: Performance comparison of ACSA with other optimization methods (in terms of fixed iteration number). Entries are the average best fitness value over 30 independent runs.

Benchmark function | Generation | DE | EP | GA | PSO | SA | CSA | ACSA
Ackley     |    1 |    19.15 |    19.62 |    20.93 |    20.77 |    21.08 |    20.95 |    20.90
Ackley     |  500 |     2.68 |     4.94 |    11.54 |     2.69 |    20.54 |    11.16 |     3.76
Ackley     | 1000 |     2.68 |     4.63 |     8.44 |     2.69 |    19.43 |     4.01 |     0.06
Ackley     | 1500 |     2.68 |     4.41 |     6.94 |     2.69 |    17.57 |     1.90 | 6.39E-04
Ackley     | 2000 |     2.68 |     4.29 |     6.00 |     2.69 |    15.39 |     0.50 | 8.71E-06
De Jong    |    1 |    32.88 |    41.85 |   334.35 |   334.12 |   332.88 |   335.59 |   343.79
De Jong    |  500 |     0.02 |     1.44 |    17.79 |     0.27 |    89.98 |     1.91 |     0.14
De Jong    | 1000 |     0.02 |     1.21 |     7.65 |     0.27 |    68.02 |     0.02 | 7.58E-05
De Jong    | 1500 |     0.02 |     1.13 |     4.52 |     0.27 |    40.57 | 1.29E-04 | 3.91E-08
De Jong    | 2000 |     0.02 |     1.04 |     2.96 |     0.27 |    20.32 | 1.06E-06 | 2.10E-11
Griewank   |    1 |    96.76 |   155.82 |  2521.20 |  2477.39 |   289.59 |  2516.57 |  2543.72
Griewank   |  500 |     0.35 |     0.92 |   342.43 |     2.13 |   252.83 |    89.87 |    69.32
Griewank   | 1000 |     0.34 |     0.85 |   199.79 |     2.13 |   197.62 |     7.40 |     4.23
Griewank   | 1500 |     0.33 |     0.83 |   136.00 |     2.13 |   147.97 |     1.46 |     1.17
Griewank   | 2000 |     0.33 |     0.78 |   103.27 |     2.13 |   106.96 |     1.01 |     0.70
Rastrigin  |    1 |   119.64 |   125.49 |   122.83 |   126.52 |   191.39 |   123.00 |   122.26
Rastrigin  |  500 |    14.69 |    45.90 |     8.16 |    28.09 |   186.84 |     3.67 |     2.94
Rastrigin  | 1000 |    14.69 |    41.27 |     4.27 |    28.08 |   173.08 |     0.62 |     0.25
Rastrigin  | 1500 |    14.69 |    38.54 |     3.21 |    27.84 |   165.48 | 1.16E-03 | 8.82E-05
Rastrigin  | 2000 |    14.69 |    36.82 |     2.28 |    27.78 |   158.28 | 9.72E-07 | 8.65E-09
Rosenbrock |    1 | 3.11E+09 | 5.10E+09 | 4.81E+09 | 5.54E+09 | 1.79E+10 | 5.44E+09 | 5.90E+09
Rosenbrock |  500 | 1.09E+05 | 3.39E+04 | 3.28E+05 | 7.34E+04 | 7.93E+09 |   205.31 |    61.01
Rosenbrock | 1000 | 1.04E+05 | 1.31E+04 | 3.79E+04 | 7.31E+04 | 4.30E+08 |    13.82 |     6.37
Rosenbrock | 1500 | 1.04E+05 | 9.34E+03 | 8.99E+03 | 7.31E+04 | 3.02E+05 |     4.72 |     3.17
Rosenbrock | 2000 | 1.04E+05 | 8.36E+03 | 5.21E+03 | 7.31E+04 | 8.97E+04 |     2.94 |     1.75
the cases of Ackley's, De Jong's, and Rastrigin's functions, where the ACSA reaches the near-optimal solution after 1500 function evaluations.
5. Conclusions
In this paper, by scrutinizing the advantages and the limitations of the standard CSA, a new modified CSA, specifically the ACSA, which adopts an adjustable step size in its computation, has been proposed. For all the considered benchmark optimization functions, the superiority of the ACSA over the standard CSA is validated in terms of the convergence characteristics. The proposed algorithm preserves the uniqueness and the fascinating features of the original CSA, as both algorithms are able to find the global optimum when given enough computation time, but the ACSA is able to converge in fewer iterations. This is probably due to its capability to concurrently refine the local search phase around the current good solutions while exploring more aggressively for the optimal solutions, attributed to the adopted adaptive step size. It is worth mentioning that, through empirical simulations, increasing the maximum step size value encourages a more thorough global exploration and eventually leads to faster convergence; however, the generated new solutions might fall outside the design domain in some cases. Thus, a moderate value is chosen in this study.
Conflict of Interests
The author declares that there is no conflict of interests regarding the publication of this paper.
Acknowledgment
Financial support from Universiti Tun Hussein Onn Malaysia (UTHM) through Short Term Grant Vot 1287 is gratefully acknowledged.
References
[1] J. Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence, MIT Press, Cambridge, Mass, USA, 1992.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 1–6, pp. 1942–1948, 1995.
[3] R. Storn and K. Price, "Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] A. Colorni, M. Dorigo, and V. Maniezzo, "Distributed optimization by ant colonies," in Proceedings of the 1st European Conference on Artificial Life, pp. 134–142, Paris, France, 1991.
[5] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Tech. Rep. TR06, Erciyes University, Kayseri, Turkey, 2005.
[6] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, vol. 5792 of Lecture Notes in Computer Science, pp. 169–178, Springer, Berlin, Germany, 2009.
[7] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, December 2009.
[8] X.-S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330–343, 2010.
[9] I. Durgun and A. R. Yildiz, "Structural design optimization of vehicle components using Cuckoo search algorithm," Materials Testing, vol. 54, no. 3, pp. 185–188, 2012.
[10] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, "Use of Modified Cuckoo Search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms," Applied Energy, vol. 114, pp. 901–908, 2013.
[11] A. H. Gandomi, S. Talatahari, X.-S. Yang, and S. Deb, "Design optimization of truss structures using cuckoo search algorithm," Structural Design of Tall and Special Buildings, vol. 22, pp. 1330–1349, 2013.
[12] R. Sharma, K. P. S. Rana, and V. Kumar, "Performance analysis of fractional order fuzzy PID controllers applied to a robotic manipulator," Expert Systems with Applications, vol. 41, no. 9, pp. 4274–4289, 2014.
[13] X.-T. Li and M.-H. Yin, "Parameter estimation for chaotic systems using the cuckoo search algorithm with an orthogonal learning method," Chinese Physics B, vol. 21, no. 5, Article ID 050507, 2012.
[14] M. Dhivya and M. Sundarambal, "Cuckoo Search for data gathering in wireless sensor networks," International Journal of Mobile Communications, vol. 9, no. 6, pp. 642–656, 2011.
[15] S. Goyal and M. S. Patterh, "Wireless sensor network localization based on cuckoo search algorithm," Wireless Personal Communications, 2014.
[16] A. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
[17] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
[18] A. K. Bhandari, V. K. Singh, A. Kumar, and G. K. Singh, "Cuckoo search algorithm and wind driven optimization based study of satellite image segmentation for multilevel thresholding using Kapur's entropy," Expert Systems with Applications, vol. 41, pp. 3538–3560, 2014.
[19] A. K. Bhandari, V. Soni, A. Kumar, and G. K. Singh, "Cuckoo search algorithm based satellite image contrast and brightness enhancement using DWT-SVD," ISA Transactions, vol. 53, no. 4, pp. 1286–1296, 2014.
[20] A. R. Yildiz, "Cuckoo search algorithm for the selection of optimal machining parameters in milling operations," International Journal of Advanced Manufacturing Technology, vol. 64, no. 1–4, pp. 55–61, 2013.
[21] K. Chandrasekaran and S. P. Simon, "Multi-objective scheduling problem: hybrid approach using fuzzy assisted cuckoo search algorithm," Swarm and Evolutionary Computation, vol. 5, pp. 1–16, 2012.
[22] S. Burnwal and S. Deb, "Scheduling optimization of flexible manufacturing system using cuckoo search-based approach," The International Journal of Advanced Manufacturing Technology, vol. 64, no. 5–8, pp. 951–959, 2013.
[23] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "Improved cuckoo search algorithm for hybrid flow shop scheduling problems to minimize makespan," Applied Soft Computing Journal, vol. 19, pp. 93–101, 2014.
[24] S. Walton, O. Hassan, K. Morgan, and M. R. Brown, "Modified cuckoo search: a new gradient free optimisation algorithm," Chaos, Solitons and Fractals, vol. 44, no. 9, pp. 710–718, 2011.
[25] C. Li, J. Zhou, P. Kou, and J. Xiao, "A novel chaotic particle swarm optimization based fuzzy clustering algorithm," Neurocomputing, vol. 83, pp. 98–109, 2012.
[26] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–215, Morgan Kaufmann, Los Altos, Calif, USA, 1987.
[27] K. A. de Jong, An Analysis of the Behavior of a Class of Genetic Adaptive Systems, Department of Computer and Communication Sciences, University of Michigan, Ann Arbor, Mich, USA, 1975.
[28] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.
[29] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974.
[30] H. H. Rosenbrock, "An automatic method for finding the greatest or least value of a function," The Computer Journal, vol. 3, no. 3, pp. 175–184, 1960.
The Scientific World Journal 3
begin
    Generate initial population of q host nests x_i, i = 1, 2, ..., q
    for all x_i do
        Evaluate the fitness function F_i = f(x_i)
    end for
    while (iter < MaxGeneration) or (stopping criterion) do
        Generate a cuckoo egg x_j from a random host nest by using a Lévy flight
        Calculate the fitness function F_j = f(x_j)
        Get a random nest i among the q host nests
        if (F_j > F_i) then
            Replace x_i with x_j
            Replace F_i with F_j
        end if
        Abandon a fraction p_a of the worst nests
        Build new nests randomly to replace the nests lost
        Evaluate the fitness of the new nests
    end while
end

Algorithm 1: Cuckoo search algorithm.
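The flow of Algorithm 1 can be sketched in Python as follows. This is a minimal illustration rather than the paper's implementation: the Lévy steps are drawn with Mantegna's algorithm (a common choice; the sampling scheme is not fixed in this excerpt), the fitness comparison is inverted to `f_egg < fit[i]` because the benchmark functions below are minimized, and the defaults q = 20, p_a = 0.25, and β = 1.5 are assumptions.

```python
import math
import random

def levy_step(beta=1.5):
    """One heavy-tailed step via Mantegna's algorithm (beta is an assumption)."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(f, lo, hi, dim, q=20, pa=0.25, alpha=1.0, max_gen=200):
    """Minimise f over [lo, hi]^dim, following the structure of Algorithm 1."""
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(q)]
    fit = [f(x) for x in nests]
    for _ in range(max_gen):
        # Generate a cuckoo egg from a random nest by a Levy flight,
        # clipped to the search domain.
        j = random.randrange(q)
        egg = [min(hi, max(lo, xi + alpha * levy_step())) for xi in nests[j]]
        f_egg = f(egg)
        # Compare against a randomly chosen nest; since f is minimised here,
        # the pseudocode's fitness test F_j > F_i becomes f_egg < fit[i].
        i = random.randrange(q)
        if f_egg < fit[i]:
            nests[i], fit[i] = egg, f_egg
        # Abandon a fraction pa of the worst nests and rebuild them randomly.
        worst = sorted(range(q), key=fit.__getitem__, reverse=True)[:int(pa * q)]
        for k in worst:
            nests[k] = [random.uniform(lo, hi) for _ in range(dim)]
            fit[k] = f(nests[k])
    return min(fit)
```

Seeding `random` makes a run reproducible; on a low-dimensional sphere function the returned best fitness typically falls well below the initial population's best within a few hundred generations.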
begin
    Generate initial population of q host nests x_i, i = 1, 2, ..., q
    Define the minimum step size value α_L
    Define the maximum step size value α_U
    for all x_i do
        Evaluate the fitness function F_i = f(x_i)
    end for
    while (iter < MaxGeneration) or (stopping criterion) do
        Calculate the average fitness value F_avg of all the host nests
        Find the minimum fitness value F_min among all the host nests
        for all host nests do
            Current position x_i
            Calculate the step size α for the Lévy flight using (2)
            Generate a cuckoo egg x_j from host nest x_i by using a Lévy flight
            if (x_j falls outside the bounds) then
                Replace x_j with x_i
            end if
            Calculate the fitness function F_j = f(x_j)
            if (F_j > F_i) then
                Replace x_i with x_j
                Replace F_i with F_j
            end if
        end for
        Abandon a fraction p_a of the worst nests
        Build new nests randomly to replace the nests lost
        Evaluate the fitness of the new nests
    end while
end

Algorithm 2: Adaptive cuckoo search algorithm.
Choosing the minimum and the maximum Lévy flight step size values properly is crucial, such that the search process in the ACSA is neither too aggressive nor too ineffective. The α_L and α_M are chosen based on the domain of x_i. Additionally, as a precautionary measure, if the ACSA generates a cuckoo egg that falls outside the domain of interest, its position remains unchanged.
4. Numerical Simulations

To evaluate the feasibility of the proposed ACSA, the algorithm is applied to five benchmark optimization functions with known global optima, two of which are unimodal and three of which are multimodal. The optimization performance is compared with that of the standard CSA. For each test function, an initial population of 20 host nests is generated randomly, and the simulations are performed for 30 independent runs. The optimization process stops once the best fitness value falls below a given tolerance ξ ≤ 10^−5. For both the CSA and the ACSA, the Euclidean distance from the known global minimum to the location of the best host nest (the one with the lowest fitness value) is evaluated in each iteration, and the average of this distance over all 30 trials is then measured.
In addition, to authenticate the statistical significance of the proposed ACSA, the two-tailed t-test is applied. The null hypothesis is rejected at the 5% significance level if the difference between the means of the CSA and the ACSA is statistically significant. The results are presented in the last column of Table 1.
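The significance test above can be reproduced with a hand-rolled two-sample t statistic; the Welch form below is one common variant (the paper does not state whether a pooled or unpooled variance was used), and the sample data in the test are illustrative, not the experimental results.

```python
import math

def two_sample_t(sample1, sample2):
    """Welch's t statistic for two independent samples."""
    n1, n2 = len(sample1), len(sample2)
    m1, m2 = sum(sample1) / n1, sum(sample2) / n2
    # Unbiased sample variances (divide by n - 1).
    v1 = sum((x - m1) ** 2 for x in sample1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in sample2) / (n2 - 1)
    return (m1 - m2) / math.sqrt(v1 / n1 + v2 / n2)
```

The decision at the 5% level additionally requires comparing |t| against the two-tailed critical value for the corresponding degrees of freedom.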
4.1. Ackley's Function. Ackley's function is a multimodal function described as [26]

$$f(x) = -20\exp\left(-0.2\sqrt{\frac{1}{d}\sum_{i=1}^{d} x_i^2}\right) - \exp\left(\frac{1}{d}\sum_{i=1}^{d}\cos(2\pi x_i)\right) + 20 + e, \qquad (3)$$
where x_i ∈ [−32.768, 32.768] and the data dimension is d = 50. The function has a global minimum of f* = 0 at x* = (0, 0, ..., 0). The α_L and α_M are chosen as 0.2 and 0.8 times the domain of x_i, respectively. Figure 1 compares the relative performances of the CSA and the ACSA in optimizing Ackley's function. As observed in this figure, the convergence rate of the ACSA is better than that of the standard CSA on this test function. Table 1, which summarizes the average number of cycles needed by both algorithms to meet the stopping criterion, clearly shows that the CSA takes more iterations to converge; the proposed ACSA reaches the global optimum roughly twice as fast on average, finding the global solution after approximately 2500 fitness function evaluations, as opposed to the CSA, which reaches it after 4000 objective function evaluations.
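Equation (3) translates directly into Python; a quick check confirms the global minimum f* = 0 at the origin:

```python
import math

def ackley(x):
    """Ackley's function, Eq. (3): global minimum f = 0 at the origin."""
    d = len(x)
    s1 = sum(xi * xi for xi in x)
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x)
    return (-20 * math.exp(-0.2 * math.sqrt(s1 / d))
            - math.exp(s2 / d) + 20 + math.e)
```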
4.2. De Jong's Function. As the simplest unimodal test function, De Jong's function is given by [27]

$$f(x) = \sum_{i=1}^{d} x_i^2, \qquad (4)$$
where x_i ∈ [−5.12, 5.12] and d = 50. This is a sphere function with the known global minimum of f* = 0 at x* = (0, 0, ..., 0). The α_L and α_M are chosen as 0.2 and 0.8 times the domain of x_i, respectively. The results achieved by the standard CSA and the proposed ACSA in optimizing De Jong's function are presented in Figure 2. Owing to the simplicity of the test function, both algorithms take considerably fewer iteration steps to converge, but the ACSA with adjustable step size is much
Figure 1: Comparison of the performance of the standard CSA and the proposed ACSA for Ackley's function (Euclidean distance from the known global optimum versus iteration number).
Figure 2: Comparison of the performance of the standard CSA and the proposed ACSA for De Jong's function (Euclidean distance from the known global optimum versus iteration number).
more efficient than the standard CSA. As shown in Table 1, the ACSA reaches the known global optimum in a mean of approximately 1000 cycles, while the CSA requires a longer processing time to converge.
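Equation (4), the sphere function, is a one-liner in Python:

```python
def de_jong(x):
    """De Jong's sphere function, Eq. (4): global minimum f = 0 at the origin."""
    return sum(xi * xi for xi in x)
```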
4.3. Griewank's Function. Figure 3 depicts the optimization results of the CSA and the ACSA, in terms of convergence characteristics, for the multimodal Griewank's function [28]:

$$f(x) = \frac{1}{4000}\sum_{i=1}^{d} x_i^2 - \prod_{i=1}^{d}\cos\left(\frac{x_i}{\sqrt{i}}\right) + 1, \qquad (5)$$
Table 1: Performance comparison of the ACSA with the CSA (in terms of the number of iterations needed to converge).

                         CSA                                  ACSA
Benchmark function    Best    Worst   Mean    SD          Best    Worst   Mean    SD          Significant
Ackley                3884    4862    4222    250.74      1912    2068    1989    41.03       Yes
De Jong               1690    1833    1763    37.55       1087    1158    1121    19.81       Yes
Griewank              4287    4632    4499    75.63       3617    3880    3725    63.77       Yes
Rastrigin             1582    1994    1734    118.73      1190    1389    1286    50.69       Yes
Rosenbrock            17519   33929   25402   4592.24     9131    28803   22164   5416.74     Yes

*SD denotes the standard deviation.
Figure 3: Comparison of the performance of the standard CSA and the proposed ACSA for Griewank's function (Euclidean distance from the known global optimum versus iteration number).
where x_i ∈ [−600, 600], with d = 100. This is a high-dimensional multimodal function with the known global minimum of f* = 0 at x* = (0, 0, ..., 0). The α_L and α_M are chosen as 0.02 and 0.08 times the domain of x_i, respectively. Considering Figure 3 and Table 1, the obtained results suggest that the ACSA shows improved convergence efficiency. Moreover, it can be inferred from Table 1 that the superiority of the ACSA is obvious, as it outperforms the CSA in terms of the best, worst, average, and standard deviation of the number of iterations needed to reach the known global optimum.
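Equation (5) in Python; note the 1/√i scaling inside the cosine product:

```python
import math

def griewank(x):
    """Griewank's function, Eq. (5): global minimum f = 0 at the origin."""
    s = sum(xi * xi for xi in x) / 4000.0
    p = 1.0
    for i, xi in enumerate(x, start=1):   # product term uses 1-based index
        p *= math.cos(xi / math.sqrt(i))
    return s - p + 1.0
```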
4.4. Rastrigin's Function. The multimodal Rastrigin's function is defined as [29]

$$f(x) = 10d + \sum_{i=1}^{d}\left[x_i^2 - 10\cos(2\pi x_i)\right], \qquad (6)$$
where x_i ∈ [−5.12, 5.12] and d = 100, with a global minimum of f* = 0 at x* = (0, 0, ..., 0). The α_L and α_M are chosen as 0.2 and 0.8 times the domain of x_i, respectively. Reaching the global minimum of Rastrigin's function is difficult, as this test function has many local minima, which
Figure 4: The 3-dimensional surface plot for Rastrigin's function.
Figure 5: Comparison of the performance of the standard CSA and the proposed ACSA for Rastrigin's function (Euclidean distance from the known global optimum versus iteration number).
can be observed in its 3-dimensional surface plot in Figure 4. A more challenging 100-dimensional Rastrigin's function is chosen in this case to corroborate the global searching ability of the proposed algorithm.
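Equation (6) in Python; the cosine term generates the regular grid of local minima visible in Figure 4:

```python
import math

def rastrigin(x):
    """Rastrigin's function, Eq. (6): global minimum f = 0 at the origin."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)
```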
Figure 6: The 3-dimensional surface plot for Rosenbrock's function.
Figure 5 presents the optimization performances of the CSA and the ACSA for Rastrigin's function. It can be deduced from the figure that the proposed ACSA is superior to the standard CSA in terms of convergence rate. Initially, the CSA appears to converge faster towards the known global minimum than the ACSA; however, as the evolution continues, the ACSA gets closer to the true solution than the CSA. This is presumably because the CSA becomes trapped in a local solution, as many local minima are present in this test function. Moreover, the optimization process of the CSA may span a longer period of time than that of the proposed ACSA, as it takes overly small and cautious steps while exploring the search space; that is, the α which controls the scale of the search patterns is fixed at 1. On the contrary, the ACSA with its adaptive step size control strategy performs a more rigorous search through the solution space. The ACSA is able to find the global minimum in a mean of 20000 iterations, while on average the CSA needs 2000 more iterations to reach the optimal solution.
4.5. Rosenbrock's Function. Figure 6 illustrates the 3-dimensional surface plot of Rosenbrock's function, which is defined as [30]

$$f(x) = \sum_{i=1}^{d-1}\left[(1 - x_i)^2 + 100\left(x_{i+1} - x_i^2\right)^2\right], \qquad (7)$$
where x_i ∈ [−100, 100] and d = 10. This test function has a global minimum of f* = 0 at x* = (1, 1, ..., 1). The α_L and α_M are chosen as 0.02 and 0.08 times the domain of x_i, respectively. As shown in the figure, the surface plot contains a long, narrow, parabolic-shaped valley, in which the global minimum resides. Although finding this valley is easy, converging to the global optimum is difficult [24].
The performances of the CSA and the ACSA in terms of convergence efficiency are shown in Figure 7. Comparing the best results obtained among all 30 trials, the proposed ACSA apparently speeds up the computation nearly twofold compared with the standard CSA. Moreover,
Figure 7: Comparison of the performance of the standard CSA and the proposed ACSA for Rosenbrock's function (Euclidean distance from the known global optimum versus iteration number).
it can be noted that the average number of iterations needed by the CSA is below even the longest run recorded by the ACSA among all 30 independent runs.
4.6. Performance Comparison with Other Optimization Algorithms. In comparing the performances of different algorithms on the same benchmark functions, several approaches with different stopping criteria can be considered; a commonly used one, adopted in this study, is to compare the best fitness value for a fixed number of fitness function evaluations. The other optimization algorithms considered are DE, evolutionary programming (EP), GA, PSO, simulated annealing (SA), and the CSA. The simulations of all algorithms are performed for 30 independent runs, with the number of fitness function evaluations set to 2000. The best fitness value in each iteration is evaluated, and its average over all 30 trials is then measured. The obtained average best fitness values at the fixed iteration numbers of 1, 500, 1000, 1500, and 2000 for the benchmark functions are summarized in Table 2.
As shown in this table, the proposed ACSA attains higher precision than any of the other algorithms on all benchmarks considered, except for Griewank's function, where the DE produces a solution closer to the global optimum than the proposed ACSA. However, it is pertinent to note that there is little or no improvement in the DE after 500 function evaluations; the obtained best fitness value decreases only slightly, from 0.35 to 0.33, which might indicate that the DE gets trapped in a local optimum. In fact, as shown in Table 2, the DE, EP, and PSO algorithms usually converge faster initially, but they often get stuck in local optima, which is particularly obvious for Ackley's, De Jong's, and Griewank's functions. On the other hand, the proposed ACSA gets gradually closer to the global optimum as the iterations increase. The superiority of the ACSA is more noticeable in
Table 2: Performance comparison of the ACSA with other optimization methods (average best fitness value over 30 independent runs, at fixed iteration numbers).

Benchmark     Generation   DE          EP          GA          PSO         SA          CSA         ACSA
Ackley        1            19.15       19.62       20.93       20.77       21.08       20.95       20.90
              500          2.68        4.94        11.54       2.69        20.54       11.16       3.76
              1000         2.68        4.63        8.44        2.69        19.43       4.01        0.06
              1500         2.68        4.41        6.94        2.69        17.57       1.90        6.39E-04
              2000         2.68        4.29        6.00        2.69        15.39       0.50        8.71E-06
De Jong       1            32.88       41.85       334.35      334.12      332.88      335.59      343.79
              500          0.02        1.44        17.79       0.27        89.98       1.91        0.14
              1000         0.02        1.21        7.65        0.27        68.02       0.02        7.58E-05
              1500         0.02        1.13        4.52        0.27        40.57       1.29E-04    3.91E-08
              2000         0.02        1.04        2.96        0.27        20.32       1.06E-06    2.10E-11
Griewank      1            96.76       155.82      2521.20     2477.39     289.59      2516.57     2543.72
              500          0.35        0.92        342.43      2.13        252.83      89.87       69.32
              1000         0.34        0.85        199.79      2.13        197.62      7.40        4.23
              1500         0.33        0.83        136.00      2.13        147.97      1.46        1.17
              2000         0.33        0.78        103.27      2.13        106.96      1.01        0.70
Rastrigin     1            119.64      125.49      122.83      126.52      191.39      123.00      122.26
              500          14.69       45.90       8.16        28.09       186.84      3.67        2.94
              1000         14.69       41.27       4.27        28.08       173.08      0.62        0.25
              1500         14.69       38.54       3.21        27.84       165.48      1.16E-03    8.82E-05
              2000         14.69       36.82       2.28        27.78       158.28      9.72E-07    8.65E-09
Rosenbrock    1            3.11E+09    5.10E+09    4.81E+09    5.54E+09    1.79E+10    5.44E+09    5.90E+09
              500          1.09E+05    3.39E+04    3.28E+05    7.34E+04    7.93E+09    205.31      61.01
              1000         1.04E+05    1.31E+04    3.79E+04    7.31E+04    4.30E+08    13.82       6.37
              1500         1.04E+05    9.34E+03    8.99E+03    7.31E+04    3.02E+05    4.72        3.17
              2000         1.04E+05    8.36E+03    5.21E+03    7.31E+04    8.97E+04    2.94        1.75
the case of Ackley's, De Jong's, and Rastrigin's functions, where the ACSA reaches a near-optimal solution after 1500 function evaluations.
5. Conclusions

In this paper, by scrutinizing the advantages and limitations of the standard CSA, a modified CSA, namely the ACSA, which adopts an adjustable step size in its computation, has been proposed. On all the considered benchmark optimization functions, the superiority of the ACSA over the standard CSA is validated in terms of convergence characteristics. The proposed algorithm preserves the uniqueness and appealing features of the original CSA: both algorithms are able to find the global optimum when given enough computation time, but the ACSA converges in fewer iterations. This is probably due to its capability to concurrently refine the local search around the current good solutions while exploring more aggressively for the optimal solutions, attributed to the adopted adaptive step size. It is worth mentioning that, through empirical simulations, increasing the maximum step size value encourages a more thorough global exploration and eventually leads to faster convergence; however, the generated new solutions might fall outside the design domain in some cases. Thus, a moderate value is chosen in this study.
Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

Acknowledgment

Financial support from Universiti Tun Hussein Onn Malaysia (UTHM) through Short Term Grant Vot 1287 is gratefully acknowledged.
References

[1] J. Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence, MIT Press, Cambridge, Mass, USA, 1992.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 1–6, pp. 1942–1948, 1995.
[3] R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] A. Colorni, M. Dorigo, and V. Maniezzo, "Distributed optimization by ant colonies," in Proceedings of the 1st European Conference on Artificial Life, pp. 134–142, Paris, France, 1991.
[5] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Tech. Rep. TR06, Erciyes University, Kayseri, Turkey, 2005.
[6] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, vol. 5792 of Lecture Notes in Computer Science, pp. 169–178, Springer, Berlin, Germany, 2009.
[7] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NABIC '09), pp. 210–214, December 2009.
[8] X.-S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330–343, 2010.
[9] I. Durgun and A. R. Yildiz, "Structural design optimization of vehicle components using Cuckoo search algorithm," Materials Testing, vol. 54, no. 3, pp. 185–188, 2012.
[10] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, "Use of Modified Cuckoo Search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms," Applied Energy, vol. 114, pp. 901–908, 2013.
[11] A. H. Gandomi, S. Talatahari, X.-S. Yang, and S. Deb, "Design optimization of truss structures using cuckoo search algorithm," Structural Design of Tall and Special Buildings, vol. 22, pp. 1330–1349, 2013.
[12] R. Sharma, K. P. S. Rana, and V. Kumar, "Performance analysis of fractional order fuzzy PID controllers applied to a robotic manipulator," Expert Systems with Applications, vol. 41, no. 9, pp. 4274–4289, 2014.
[13] X.-T. Li and M.-H. Yin, "Parameter estimation for chaotic systems using the cuckoo search algorithm with an orthogonal learning method," Chinese Physics B, vol. 21, no. 5, Article ID 050507, 2012.
[14] M. Dhivya and M. Sundarambal, "Cuckoo Search for data gathering in wireless sensor networks," International Journal of Mobile Communications, vol. 9, no. 6, pp. 642–656, 2011.
[15] S. Goyal and M. S. Patterh, "Wireless sensor network localization based on cuckoo search algorithm," Wireless Personal Communications, 2014.
[16] A. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
[17] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
[18] A. K. Bhandari, V. K. Singh, A. Kumar, and G. K. Singh, "Cuckoo search algorithm and wind driven optimization based study of satellite image segmentation for multilevel thresholding using Kapur's entropy," Expert Systems with Applications, vol. 41, pp. 3538–3560, 2014.
[19] A. K. Bhandari, V. Soni, A. Kumar, and G. K. Singh, "Cuckoo search algorithm based satellite image contrast and brightness enhancement using DWT-SVD," ISA Transactions, vol. 53, no. 4, pp. 1286–1296, 2014.
[20] A. R. Yildiz, "Cuckoo search algorithm for the selection of optimal machining parameters in milling operations," International Journal of Advanced Manufacturing Technology, vol. 64, no. 1–4, pp. 55–61, 2013.
[21] K. Chandrasekaran and S. P. Simon, "Multi-objective scheduling problem: hybrid approach using fuzzy assisted cuckoo search algorithm," Swarm and Evolutionary Computation, vol. 5, pp. 1–16, 2012.
[22] S. Burnwal and S. Deb, "Scheduling optimization of flexible manufacturing system using cuckoo search-based approach," The International Journal of Advanced Manufacturing Technology, vol. 64, no. 5-8, pp. 951–959, 2013.
[23] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "Improved cuckoo search algorithm for hybrid flow shop scheduling problems to minimize makespan," Applied Soft Computing Journal, vol. 19, pp. 93–101, 2014.
[24] S. Walton, O. Hassan, K. Morgan, and M. R. Brown, "Modified cuckoo search: a new gradient free optimisation algorithm," Chaos, Solitons and Fractals, vol. 44, no. 9, pp. 710–718, 2011.
[25] C. Li, J. Zhou, P. Kou, and J. Xiao, "A novel chaotic particle swarm optimization based fuzzy clustering algorithm," Neurocomputing, vol. 83, pp. 98–109, 2012.
[26] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–215, Morgan Kaufmann, Los Altos, Calif, USA, 1987.
[27] K. A. De Jong, An Analysis of the Behavior of a Class of Genetic Adaptive Systems, Department of Computer and Communication Sciences, University of Michigan, Ann Arbor, Mich, USA, 1975.
[28] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.
[29] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974.
[30] H. H. Rosenbrock, "An automatic method for finding the greatest or least value of a function," The Computer Journal, vol. 3, no. 3, pp. 175–184, 1960.
4 The Scientific World Journal
performance is compared with the standard CSA For eachtest function the initial populations of 20 host nests aregenerated randomly The simulations are performed for 30independent runs The optimization process stops if the bestfitness value is less than a given tolerance 120585 le 10minus5 For boththe CSA and ACSA the Euclidean distance from the knownglobal minimum to the location of the best host nest with thelowest fitness value is evaluated in each iterationThe averageof the distance difference for each loop from all the 30 trialsis then measured
In addition to authenticate the statistical significance ofthe proposed ACSA the two-tailed 119905-test is applied The nullhypothesis is rejected at the confidence interval of 5 levelif the difference of the means of both CSA and ACSA isstatistically significant The results are presented in the lastcolumn of Table 1
41 Ackleyrsquos Function The Ackleyrsquos function is a multimodalfunction which is described as [26]
119891 (119909) = minus20 exp(minus02radic 1119889
119889
sum
119894=1
1199092119894)
minus exp(1119889
119889
sum
119894=1
cos (2120587119909119894)) + 20 + 119890
(3)
where 119909119894isin [minus32768 32768] and data dimension 119889 = 50
has a global minima of 119891lowast= 0 at 119909
lowast= (0 0 0) The
120572119871and 120572
119872are chosen as 02lowastdomain 119909
119894and 08lowastdomain
119909119894 respectively Figure 1 compares the relative performances
of the CSA and ACSA in optimizing Ackleyrsquos function Asobserved in this figure apparently the convergence rate of theACSA is better than the standard CSA on this test functionConsidering Table 1 which summarizes the average cyclesneeded for both the algorithms tomeet the stopping criterionit can be clearly seen that the CSA takes more iteration toconverge however the proposed ACSA reaches the globaloptima about one time faster on average The ACSA is ableto find the global solution approximately after 2500 fitnessfunction evaluations as opposed to CSA which reaches theglobal solution after 4000 objective function evaluations
42 De Jongrsquos Function As the simplest unimodal test func-tion the de Jongrsquos function is given by [27]
119891 (119909) =
119889
sum
119894=1
1199092
119894 (4)
where 119909119894isin [minus512 512] and 119889 = 50 This is a sphere
function with the known global minimum of 119891lowast= 0 at
119909lowast= (0 0 0) The 120572
119871and 120572
119872are chosen as 02lowastdomain
119909119894and 08lowastdomain 119909
119894 respectively The results achieved by
the standard CSA and the proposed ACSA in optimizingthe de Jongrsquos function are presented in Figure 2 Due tothe simplicity of the test function it can be found thatboth algorithms take considerably less iteration steps forconvergence but the ACSA with adjustable step size is much
0 500 1000 1500 2000 2500 3000 3500 4000 4500 50000
20
40
60
80
100
120
140
Iteration number
Eucli
dean
dist
ance
from
kno
wn
glob
al o
ptim
a
CSAACSA
Figure 1 Comparison of the performance of the standard CSA andthe proposed ACSA for the Ackleyrsquos function
0 200 400 600 800 1000 1200 1400 1600 1800 20000
2
4
6
8
10
12
14
16
18
20
Iteration number
Eucli
dean
dist
ance
from
kno
wn
glob
al o
ptim
a
CSAACSA
Figure 2 Comparison of the performance of the standard CSA andthe proposed ACSA for the de Jongrsquos function
more efficient than the standard CSA As shown in Table 1the ACSA reaches the known global optimum in a mean of1000 cycles while the CSA requires a longer processing timein order to converge
43 Griewankrsquos Function Figure 3 depicts the optimizationresults of the CSA and ACSA in terms of convergencecharacteristics for the multimodal Griewankrsquos function [28]
119891 (119909) =1
4000
119889
sum
119894=1
1199092
119894minus
119889
prod
119894=1
cos(119909119894
radic119894) + 1 (5)
The Scientific World Journal 5
Table 1 Performance comparison of ACSA with CSA (in terms of the number of iterations needed in order to converge)
Benchmark function CSA ACSA SignificantBest Worst Mean SD Best Worst Mean SD
Ackley 3884 4862 4222 25074 1912 2068 1989 4103 YesDe Jong 1690 1833 1763 3755 1087 1158 1121 1981 YesGriewank 4287 4632 4499 7563 3617 3880 3725 6377 YesRastrigin 1582 1994 1734 11873 1190 1389 1286 5069 YesRosenbrock 17519 33929 25402 459224 9131 28803 22164 541674 YeslowastSD denotes the standard deviation
0
3500
3000
2500
2000
1500
1000
500
0 35003000 45004000 50002500200015001000500Iteration number
Eucli
dean
dist
ance
from
kno
wn
glob
al o
ptim
a
CSAACSA
Figure 3 Comparison of the performance of the standard CSA andthe proposed ACSA for the Griewankrsquos function
where 119909119894isin [minus600 600] with 119889 = 100 This is a high
dimensional multimodal function with the known globalminimum of 119891
lowast= 0 at 119909
lowast= (0 0 0) The 120572
119871and
120572119872
are chosen as 002lowastdomain 119909119894and 008lowastdomain 119909
119894
respectively Considering Figure 3 and Table 1 the obtainedresults suggest that the ACSA shows improved convergenceefficiency Moreover it can be inferred from Table 1 that thesuperiority of the ACSA is obvious as it outperforms the CSAin terms of the best worst average and standard deviationfor the number of iterations needed in order to reach theknown global optimum
44 Rastriginrsquos Function Themultimodal Rastriginrsquos functionis defined as [29]
119891 (119909) = 10119889 +
119889
sum
119894=1
[1199092
119894minus 10 cos (2120587119909
119894)] (6)
where 119909119894isin [minus512 512] and 119889 = 100 with a globalminimum
of 119891lowast= 0 at 119909
lowast= (0 0 0) The 120572
119871and 120572
119872are chosen
as 02lowastdomain 119909119894and 08lowastdomain 119909
119894 respectively Getting
the global minimum of the Rastriginrsquos function is a difficultprocess as this test function has many local minima which
90
80
70
60
50
40
30
20
10
06
64
4220
0minus2minus2minus4 minus4minus6
minus6
Figure 4 The 3-dimensional surface plot for the Rastriginrsquos func-tion
0 500 1000 1500 2000 25000
1
2
3
4
5
6
7
8
Iteration number
Eucli
dean
dist
ance
from
kno
wn
glob
al o
ptim
a
CSAACSA
Figure 5 Comparison of the performance of the standard CSA andthe proposed ACSA for the Rastriginrsquos function
can be observed in its 3-dimensional surface plot in Figure 4A more challenging 100-dimensional Rastriginrsquos function ischosen in this case to corroborate the global searching abilityof the proposed algorithm
6 The Scientific World Journal
6
5
4
3
2
1
0
minus1
minus1
minus05
minus05
0
0
05
05
1 1
Figure 6 The 3-dimensional surface plot for the Rosenbrockrsquosfunction
Figure 5 presents the optimization performances of theCSA and ACSA for Rastriginrsquos function It can be deducedfrom the figure that the proposed ACSA is superior to thestandard CSA in terms of the convergence rate It is evidentthat initially the CSA appears to converge faster to the knownglobal minimum than ASCA However as the evolutioncontinues the ACSA tends to get closer to the true solutionthan CSA This is presumably due to the CSA is gettingtrapped in local solution as there are many local minimapresent in this test function Moreover the optimizationprocess of CSA may span a longer period of time than theproposed ACSA as it makes too small and cautious stepswhile exploring the search space that is the 120572which controlsthe scale of the search patterns is specified as 1 On thecontrary the ACSA with adaptive step size control strategyperforms more rigorous search through the solution spaceThe ACSA is able to find the global minimum in a mean of20000 iterationsWhile on average the CSAneeds 2000moreiterations in order to reach the optimal solution
4.5. Rosenbrock's Function. Figure 6 illustrates the 3-dimensional surface plot for Rosenbrock's function, which is defined as [30]
f(x) = ∑_{i=1}^{d−1} [(1 − x_i)^2 + 100 (x_{i+1} − x_i^2)^2],  (7)
where x_i ∈ [−100, 100] and d = 10. This test function has a global minimum of f* = 0 at x* = (1, 1, ..., 1). The α_L and α_M are chosen as 0.02 × the domain of x_i and 0.08 × the domain of x_i, respectively. As shown in the figure, there is a long, narrow, parabolic valley in the surface plot; this is the region where the global minimum resides. Although finding this valley is not tedious, reaching the global optimum is difficult [24].
The performances of the CSA and ACSA in terms of convergence efficiency are shown in Figure 7. Comparing the best results obtained among all 30 trials, the proposed ACSA evidently speeds up the computation more than twofold compared with the standard CSA. Moreover,
Figure 7: Comparison of the performance of the standard CSA and the proposed ACSA for Rosenbrock's function (Euclidean distance from the known global optimum versus iteration number).
it can be noted that the average number of iterations needed by the CSA is nevertheless smaller than the longest run taken by the ACSA among the 30 independent runs.
4.6. Performance Comparison with Other Optimization Algorithms. In comparing the performances of different algorithms on the same benchmark functions, several stopping criteria can be considered; comparison of the best fitness value for a fixed number of fitness function evaluations, a commonly used approach, has been adopted in this study. The other optimization algorithms considered are DE, evolutionary programming (EP), GA, PSO, simulated annealing (SA), and CSA. The simulations of all algorithms are performed for 30 independent runs, with the number of fitness function evaluations set to 2000. The best fitness value in each iteration is evaluated, and its average at each iteration over all 30 trials is then measured. The obtained average best fitness values at fixed iteration numbers of 1, 500, 1000, 1500, and 2000 in optimizing the benchmark functions are summarized in Table 2.
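The comparison protocol above — a fixed budget of fitness evaluations, 30 independent runs, and the best-so-far fitness averaged at fixed checkpoints — can be sketched generically. Here `optimizer_step` is a hypothetical placeholder for one iteration of any of the compared algorithms; the greedy random walk below exists only to demonstrate the harness.

```python
import numpy as np

def benchmark(optimizer_step, objective, dim, bounds, runs=30, budget=2000,
              checkpoints=(1, 500, 1000, 1500, 2000), seed=0):
    """Record the best-so-far fitness per iteration, averaged over all runs."""
    history = np.empty((runs, budget))
    for r in range(runs):
        rng = np.random.default_rng(seed + r)
        x = rng.uniform(bounds[0], bounds[1], dim)     # random initial solution
        best = objective(x)
        for t in range(budget):
            x, fx = optimizer_step(x, objective, bounds, rng)
            best = min(best, fx)
            history[r, t] = best                       # best-so-far at iteration t+1
    return {c: float(history[:, c - 1].mean()) for c in checkpoints}

def random_walk_step(x, objective, bounds, rng):
    """Toy one-iteration optimizer: keep a Gaussian perturbation if it improves."""
    trial = np.clip(x + rng.normal(0.0, 0.1, x.size), bounds[0], bounds[1])
    if objective(trial) < objective(x):
        x = trial
    return x, objective(x)

sphere = lambda x: float(np.sum(np.asarray(x) ** 2))   # de Jong's function
print(benchmark(random_walk_step, sphere, dim=5, bounds=(-5.12, 5.12),
                runs=5, budget=500, checkpoints=(1, 500)))
```

Because the recorded quantity is the best-so-far fitness, the averaged curves are monotonically nonincreasing, which makes the checkpoint values in Table 2 directly comparable across algorithms.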
As shown in this table, the proposed ACSA has higher precision than any of the other algorithms for all benchmarks considered, except for Griewank's function. The DE produces a solution that is closer to the global optimum than the proposed ACSA in this case. However, it is pertinent to note that there is no or only marginal improvement in DE after 500 function evaluations; the obtained best fitness value decreases only slightly, from 0.35 to 0.33, which might indicate that the DE gets trapped in a local optimum. In fact, as shown in Table 2, the DE, EP, and PSO algorithms usually converge faster initially, but they often get stuck in local optima easily, which is particularly obvious in the cases of the Ackley, de Jong, and Griewank functions. On the other hand, the proposed ACSA gets gradually closer to the global optimum as the iterations increase. The superiority of the ACSA is more noticeable in
Table 2: Performance comparison of ACSA with other optimization methods (in terms of fixed iteration number). Entries are the average best fitness value over 30 independent runs.

Benchmark    Generation   DE         EP         GA         PSO        SA         CSA        ACSA
Ackley       1            19.15      19.62      20.93      20.77      21.08      20.95      20.90
             500          2.68       4.94       11.54      2.69       20.54      11.16      3.76
             1000         2.68       4.63       8.44       2.69       19.43      4.01       0.06
             1500         2.68       4.41       6.94       2.69       17.57      1.90       6.39E-04
             2000         2.68       4.29       6.00       2.69       15.39      0.50       8.71E-06
de Jong      1            32.88      41.85      334.35     334.12     332.88     335.59     343.79
             500          0.02       1.44       17.79      0.27       89.98      1.91       0.14
             1000         0.02       1.21       7.65       0.27       68.02      0.02       7.58E-05
             1500         0.02       1.13       4.52       0.27       40.57      1.29E-04   3.91E-08
             2000         0.02       1.04       2.96       0.27       20.32      1.06E-06   2.10E-11
Griewank     1            96.76      155.82     2521.20    2477.39    289.59     2516.57    2543.72
             500          0.35       0.92       342.43     2.13       252.83     89.87      69.32
             1000         0.34       0.85       199.79     2.13       197.62     7.40       4.23
             1500         0.33       0.83       136.00     2.13       147.97     1.46       1.17
             2000         0.33       0.78       103.27     2.13       106.96     1.01       0.70
Rastrigin    1            119.64     125.49     122.83     126.52     191.39     123.00     122.26
             500          14.69      45.90      8.16       28.09      186.84     3.67       2.94
             1000         14.69      41.27      4.27       28.08      173.08     0.62       0.25
             1500         14.69      38.54      3.21       27.84      165.48     1.16E-03   8.82E-05
             2000         14.69      36.82      2.28       27.78      158.28     9.72E-07   8.65E-09
Rosenbrock   1            3.11E+09   5.10E+09   4.81E+09   5.54E+09   1.79E+10   5.44E+09   5.90E+09
             500          1.09E+05   3.39E+04   3.28E+05   7.34E+04   7.93E+09   205.31     61.01
             1000         1.04E+05   1.31E+04   3.79E+04   7.31E+04   4.30E+08   13.82      6.37
             1500         1.04E+05   9.34E+03   8.99E+03   7.31E+04   3.02E+05   4.72       3.17
             2000         1.04E+05   8.36E+03   5.21E+03   7.31E+04   8.97E+04   2.94       1.75
the cases of the Ackley, de Jong, and Rastrigin functions, where the ACSA reaches a near-optimal solution after 1500 function evaluations.
5. Conclusions
In this paper, by scrutinizing the advantages and the limitations of the standard CSA, a new modified CSA, specifically the ACSA, which adopts an adjustable step size in its computation, has been proposed. Across all the considered benchmark optimization functions, the superiority of the ACSA over the standard CSA is validated in terms of convergence characteristics. The proposed algorithm preserves the uniqueness and the appealing features of the original CSA: both algorithms are able to find the global optimum when given enough computation time, but the ACSA converges in fewer iterations. This is probably due to its capability to concurrently refine the local search around the current good solutions while exploring more aggressively for the optimal solutions, attributed to the adopted adaptive step size. It is worth mentioning that, through empirical simulations, increasing the maximum step size value encourages a more thorough global exploration and eventually leads to faster convergence; however, the generated new solutions might fall outside the design domain in some cases. Thus, a moderate value is chosen in this study.
Conflict of Interests
The author declares that there is no conflict of interests regarding the publication of this paper.
Acknowledgment
Financial support from Universiti Tun Hussein Onn Malaysia (UTHM) through Short Term Grant Vot 1287 is gratefully acknowledged.
References
[1] J. Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence, MIT Press, Cambridge, Mass, USA, 1992.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 1–6, pp. 1942–1948, 1995.
[3] R. Storn and K. Price, "Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] A. Colorni, M. Dorigo, and V. Maniezzo, "Distributed optimization by ant colonies," in Proceedings of the 1st European Conference on Artificial Life, pp. 134–142, Paris, France, 1991.
[5] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Tech. Rep. TR06, Erciyes University, Kayseri, Turkey, 2005.
[6] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, vol. 5792 of Lecture Notes in Computer Science, pp. 169–178, Springer, Berlin, Germany, 2009.
[7] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, December 2009.
[8] X.-S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330–343, 2010.
[9] I. Durgun and A. R. Yildiz, "Structural design optimization of vehicle components using Cuckoo search algorithm," Materials Testing, vol. 54, no. 3, pp. 185–188, 2012.
[10] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, "Use of Modified Cuckoo Search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms," Applied Energy, vol. 114, pp. 901–908, 2013.
[11] A. H. Gandomi, S. Talatahari, X.-S. Yang, and S. Deb, "Design optimization of truss structures using cuckoo search algorithm," Structural Design of Tall and Special Buildings, vol. 22, pp. 1330–1349, 2013.
[12] R. Sharma, K. P. S. Rana, and V. Kumar, "Performance analysis of fractional order fuzzy PID controllers applied to a robotic manipulator," Expert Systems with Applications, vol. 41, no. 9, pp. 4274–4289, 2014.
[13] X.-T. Li and M.-H. Yin, "Parameter estimation for chaotic systems using the cuckoo search algorithm with an orthogonal learning method," Chinese Physics B, vol. 21, no. 5, Article ID 050507, 2012.
[14] M. Dhivya and M. Sundarambal, "Cuckoo Search for data gathering in wireless sensor networks," International Journal of Mobile Communications, vol. 9, no. 6, pp. 642–656, 2011.
[15] S. Goyal and M. S. Patterh, "Wireless sensor network localization based on cuckoo search algorithm," Wireless Personal Communications, 2014.
[16] A. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
[17] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
[18] A. K. Bhandari, V. K. Singh, A. Kumar, and G. K. Singh, "Cuckoo search algorithm and wind driven optimization based study of satellite image segmentation for multilevel thresholding using Kapur's entropy," Expert Systems with Applications, vol. 41, pp. 3538–3560, 2014.
[19] A. K. Bhandari, V. Soni, A. Kumar, and G. K. Singh, "Cuckoo search algorithm based satellite image contrast and brightness enhancement using DWT-SVD," ISA Transactions, vol. 53, no. 4, pp. 1286–1296, 2014.
[20] A. R. Yildiz, "Cuckoo search algorithm for the selection of optimal machining parameters in milling operations," International Journal of Advanced Manufacturing Technology, vol. 64, no. 1–4, pp. 55–61, 2013.
[21] K. Chandrasekaran and S. P. Simon, "Multi-objective scheduling problem: hybrid approach using fuzzy assisted cuckoo search algorithm," Swarm and Evolutionary Computation, vol. 5, pp. 1–16, 2012.
[22] S. Burnwal and S. Deb, "Scheduling optimization of flexible manufacturing system using cuckoo search-based approach," The International Journal of Advanced Manufacturing Technology, vol. 64, no. 5–8, pp. 951–959, 2013.
[23] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "Improved cuckoo search algorithm for hybrid flow shop scheduling problems to minimize makespan," Applied Soft Computing Journal, vol. 19, pp. 93–101, 2014.
[24] S. Walton, O. Hassan, K. Morgan, and M. R. Brown, "Modified cuckoo search: a new gradient free optimisation algorithm," Chaos, Solitons and Fractals, vol. 44, no. 9, pp. 710–718, 2011.
[25] C. Li, J. Zhou, P. Kou, and J. Xiao, "A novel chaotic particle swarm optimization based fuzzy clustering algorithm," Neurocomputing, vol. 83, pp. 98–109, 2012.
[26] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–215, Morgan Kaufmann, Los Altos, Calif, USA, 1987.
[27] K. A. de Jong, An Analysis of the Behavior of a Class of Genetic Adaptive Systems, Department of Computer and Communication Sciences, University of Michigan, Ann Arbor, Mich, USA, 1975.
[28] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.
[29] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974.
[30] H. H. Rosenbrock, "An automatic method for finding the greatest or least value of a function," The Computer Journal, vol. 3, no. 3, pp. 175–184, 1960.
6 The Scientific World Journal
6
5
4
3
2
1
0
minus1
minus1
minus05
minus05
0
0
05
05
1 1
Figure 6 The 3-dimensional surface plot for the Rosenbrockrsquosfunction
Figure 5 presents the optimization performances of the CSA and ACSA for Rastrigin's function. It can be deduced from the figure that the proposed ACSA is superior to the standard CSA in terms of convergence rate. Initially, the CSA appears to converge faster to the known global minimum than the ACSA; however, as the evolution continues, the ACSA gets closer to the true solution than the CSA. This is presumably because the CSA becomes trapped in a local solution, as many local minima are present in this test function. Moreover, the optimization process of the CSA may span a longer period of time than that of the proposed ACSA, as it takes overly small and cautious steps while exploring the search space; that is, the α which controls the scale of the search patterns is fixed at 1. On the contrary, the ACSA, with its adaptive step size control strategy, performs a more rigorous search through the solution space. The ACSA is able to find the global minimum in a mean of 20000 iterations, while on average the CSA needs 2000 more iterations in order to reach the optimal solution.
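The contrast between the CSA's fixed scale (α = 1) and the ACSA's adaptive control can be sketched with a simple schedule that shrinks the step scale between a lower bound α_L and an upper bound α_M. Note this linear decay is an illustrative assumption for exposition, not necessarily the exact update rule of the ACSA:

```python
def adaptive_step_scale(t, t_max, alpha_low=0.02, alpha_high=0.08):
    """Shrink the Levy-flight step scale linearly from alpha_high to alpha_low.

    Illustrative only: alpha_low/alpha_high mirror the paper's alpha_L and
    alpha_M parameters, but the linear schedule is an assumption, not the
    ACSA's exact rule.
    """
    frac = t / float(t_max)
    return alpha_high - (alpha_high - alpha_low) * frac

# Large exploratory steps early, small refining steps late.
scales = [adaptive_step_scale(t, 1000) for t in (0, 500, 1000)]
print(scales)
```

Any monotonically decreasing schedule achieves the same qualitative effect: aggressive global exploration at the start and fine local refinement near convergence.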
4.5. Rosenbrock's Function. Figure 6 illustrates the 3-dimensional surface plot for Rosenbrock's function, which is defined as [30]

$$f(x) = \sum_{i=1}^{d-1} \left[ (1 - x_i)^2 + 100 \left( x_{i+1} - x_i^2 \right)^2 \right], \qquad (7)$$
where x_i ∈ [−100, 100] and d = 10. This test function has a global minimum of f* = 0 at x* = (1, 1, …, 1). The α_L and α_M are chosen as 0.02 × domain of x_i and 0.08 × domain of x_i, respectively. As shown in this figure, the surface plot contains a long, narrow, parabolic-shaped valley; this is the region where the global minimum resides. Although finding this valley is not tedious, reaching the global optimum is difficult [24].
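Eq. (7) is straightforward to evaluate directly, for instance with NumPy. This sketch implements only the benchmark objective (not the ACSA itself) and confirms the stated optimum:

```python
import numpy as np

def rosenbrock(x):
    """Rosenbrock's function, Eq. (7):
    f(x) = sum_{i=1}^{d-1} [(1 - x_i)^2 + 100 (x_{i+1} - x_i^2)^2]."""
    x = np.asarray(x, dtype=float)
    return float(np.sum((1.0 - x[:-1]) ** 2 + 100.0 * (x[1:] - x[:-1] ** 2) ** 2))

d = 10
print(rosenbrock(np.ones(d)))   # 0.0 -- the global minimum at x* = (1, ..., 1)
print(rosenbrock(np.zeros(d)))  # 9.0 -- each of the d-1 terms contributes (1 - 0)^2
```

The origin lies inside the parabolic valley yet far from the optimum, which is exactly what makes this function a hard test of convergence rather than of valley-finding.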
The performances of the CSA and ACSA in terms of convergence efficiency are shown in Figure 7. Comparing the best results obtained among all 30 trials, the proposed ACSA is apparently more than twice as fast as the standard CSA. Moreover,
Figure 7: Comparison of the performance of the standard CSA and the proposed ACSA for Rosenbrock's function (Euclidean distance from the known global optimum versus the iteration number).
it can be noted that the average number of evaluations needed by the CSA exceeds even the longest iteration count recorded by the ACSA among all 30 independent runs.
4.6. Performance Comparison with Other Optimization Algorithms. In comparing the performances of different algorithms on the same benchmark functions, several stopping criteria can be considered; a commonly used approach, comparing the best fitness value after a fixed number of fitness function evaluations, has been adopted in this study. The other optimization algorithms considered are DE, evolutionary programming (EP), GA, PSO, simulated annealing (SA), and the CSA. The simulations of all algorithms are performed for 30 independent runs, with the number of fitness function evaluations set to 2000. The best fitness value in each iteration is evaluated, and its average over all 30 trials is then taken. The obtained average best fitness values at the fixed iteration numbers of 1, 500, 1000, 1500, and 2000 in optimizing the benchmark functions are summarized in Table 2.
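This fixed-budget protocol can be sketched as a small harness. The `optimizer(objective, n_iters, rng)` interface and the random-search stand-in below are hypothetical illustrations of the measurement procedure, not the actual CSA/ACSA implementations:

```python
import numpy as np

def average_best_fitness(optimizer, objective, n_runs=30, n_iters=2000,
                         checkpoints=(1, 500, 1000, 1500, 2000), seed=0):
    """Mean best-so-far fitness over repeated runs, sampled at fixed iterations.

    `optimizer(objective, n_iters, rng)` is an assumed interface that yields
    the best fitness found so far at each iteration.
    """
    rng = np.random.default_rng(seed)
    history = np.empty((n_runs, n_iters))
    for r in range(n_runs):
        for t, best in enumerate(optimizer(objective, n_iters, rng)):
            history[r, t] = best
    mean_curve = history.mean(axis=0)  # average over the independent trials
    return {c: mean_curve[c - 1] for c in checkpoints}

def random_search(objective, n_iters, rng):
    """Toy stand-in for DE/EP/GA/PSO/SA/CSA: pure random sampling."""
    best = np.inf
    for _ in range(n_iters):
        best = min(best, objective(rng.uniform(-5.12, 5.12, size=10)))
        yield best

sphere = lambda x: float(np.sum(np.square(x)))  # de Jong's sphere function
curve = average_best_fitness(random_search, sphere)
print(curve)  # best-so-far fitness never worsens as the budget grows
```

Because each run records the best-so-far value, the averaged curve is monotonically nonincreasing, which is why the checkpoint columns in Table 2 can only improve (or stall) as the generation count grows.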
As shown in this table, the proposed ACSA achieves higher precision than any of the other algorithms for all benchmarks considered, except for Griewank's function, where the DE produces a solution closer to the global optimum than the proposed ACSA. However, it is pertinent to note that there is no or only marginal improvement in the DE after 500 function evaluations; the obtained best fitness value decreases only slightly, from 0.35 to 0.33, which might indicate that the DE is trapped in a local optimum. In fact, as shown in Table 2, the DE, EP, and PSO algorithms usually converge faster initially but often get stuck in local optima easily, which is particularly obvious in the cases of the Ackley, de Jong, and Griewank functions. On the other hand, the proposed ACSA gradually gets closer to the global optimum as the iterations increase. The superiority of the ACSA is more noticeable in
Table 2: Performance comparison of ACSA with other optimization methods (average best fitness value over 30 independent runs at fixed iteration numbers).

Function    Generation  DE        EP        GA        PSO       SA        CSA       ACSA
Ackley      1           19.15     19.62     20.93     20.77     21.08     20.95     20.90
            500         2.68      4.94      11.54     2.69      20.54     11.16     3.76
            1000        2.68      4.63      8.44      2.69      19.43     4.01      0.06
            1500        2.68      4.41      6.94      2.69      17.57     1.90      6.39E-04
            2000        2.68      4.29      6.00      2.69      15.39     0.50      8.71E-06
de Jong     1           32.88     41.85     334.35    334.12    332.88    335.59    343.79
            500         0.02      1.44      17.79     0.27      89.98     1.91      0.14
            1000        0.02      1.21      7.65      0.27      68.02     0.02      7.58E-05
            1500        0.02      1.13      4.52      0.27      40.57     1.29E-04  3.91E-08
            2000        0.02      1.04      2.96      0.27      20.32     1.06E-06  2.10E-11
Griewank    1           96.76     155.82    2521.20   2477.39   289.59    2516.57   2543.72
            500         0.35      0.92      342.43    2.13      252.83    89.87     69.32
            1000        0.34      0.85      199.79    2.13      197.62    7.40      4.23
            1500        0.33      0.83      136.00    2.13      147.97    1.46      1.17
            2000        0.33      0.78      103.27    2.13      106.96    1.01      0.70
Rastrigin   1           119.64    125.49    122.83    126.52    191.39    123.00    122.26
            500         14.69     45.90     8.16      28.09     186.84    3.67      2.94
            1000        14.69     41.27     4.27      28.08     173.08    0.62      0.25
            1500        14.69     38.54     3.21      27.84     165.48    1.16E-03  8.82E-05
            2000        14.69     36.82     2.28      27.78     158.28    9.72E-07  8.65E-09
Rosenbrock  1           3.11E+09  5.10E+09  4.81E+09  5.54E+09  1.79E+10  5.44E+09  5.90E+09
            500         1.09E+05  3.39E+04  3.28E+05  7.34E+04  7.93E+09  205.31    61.01
            1000        1.04E+05  1.31E+04  3.79E+04  7.31E+04  4.30E+08  13.82     6.37
            1500        1.04E+05  9.34E+03  8.99E+03  7.31E+04  3.02E+05  4.72      3.17
            2000        1.04E+05  8.36E+03  5.21E+03  7.31E+04  8.97E+04  2.94      1.75
the cases of the Ackley, de Jong, and Rastrigin functions, where the ACSA reaches a near-optimal solution after 1500 function evaluations.
5. Conclusions
In this paper, by scrutinizing the advantages and the limitations of the standard CSA, a new modified CSA, specifically the ACSA, which adopts an adjustable step size in its computation, has been proposed. Across all the considered benchmark optimization functions, the superiority of the ACSA over the standard CSA is validated in terms of convergence characteristics. The proposed algorithm preserves the uniqueness and the appealing features of the original CSA: both algorithms are able to find the global optimum when given enough computation time, but the ACSA converges in fewer iterations. This is probably due to its capability to concurrently refine the local search around the current good solutions while exploring more aggressively for the optimal solutions, which is attributed to the adopted adaptive step size. It is worth mentioning that, through empirical simulations, increasing the maximum step size value encourages a more thorough global exploration and eventually leads to faster convergence; however, the generated new solutions might fall outside the design domain in some cases. Thus, a moderate value is chosen in this study.
Conflict of Interests
The author declares that there is no conflict of interests regarding the publication of this paper.
Acknowledgment
Financial support from Universiti Tun Hussein Onn Malaysia (UTHM) through Short Term Grant Vot 1287 is gratefully acknowledged.
References
[1] J. Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence, MIT Press, Cambridge, Mass, USA, 1992.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 1–6, pp. 1942–1948, 1995.
[3] R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] A. Colorni, M. Dorigo, and V. Maniezzo, "Distributed optimization by ant colonies," in Proceedings of the 1st European Conference on Artificial Life, pp. 134–142, Paris, France, 1991.
[5] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Tech. Rep. TR06, Erciyes University, Kayseri, Turkey, 2005.
[6] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, vol. 5792 of Lecture Notes in Computer Science, pp. 169–178, Springer, Berlin, Germany, 2009.
[7] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, December 2009.
[8] X.-S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330–343, 2010.
[9] I. Durgun and A. R. Yildiz, "Structural design optimization of vehicle components using Cuckoo search algorithm," Materials Testing, vol. 54, no. 3, pp. 185–188, 2012.
[10] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, "Use of Modified Cuckoo Search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms," Applied Energy, vol. 114, pp. 901–908, 2013.
[11] A. H. Gandomi, S. Talatahari, X.-S. Yang, and S. Deb, "Design optimization of truss structures using cuckoo search algorithm," The Structural Design of Tall and Special Buildings, vol. 22, pp. 1330–1349, 2013.
[12] R. Sharma, K. P. S. Rana, and V. Kumar, "Performance analysis of fractional order fuzzy PID controllers applied to a robotic manipulator," Expert Systems with Applications, vol. 41, no. 9, pp. 4274–4289, 2014.
[13] X.-T. Li and M.-H. Yin, "Parameter estimation for chaotic systems using the cuckoo search algorithm with an orthogonal learning method," Chinese Physics B, vol. 21, no. 5, Article ID 050507, 2012.
[14] M. Dhivya and M. Sundarambal, "Cuckoo Search for data gathering in wireless sensor networks," International Journal of Mobile Communications, vol. 9, no. 6, pp. 642–656, 2011.
[15] S. Goyal and M. S. Patterh, "Wireless sensor network localization based on cuckoo search algorithm," Wireless Personal Communications, 2014.
[16] A. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
[17] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
[18] A. K. Bhandari, V. K. Singh, A. Kumar, and G. K. Singh, "Cuckoo search algorithm and wind driven optimization based study of satellite image segmentation for multilevel thresholding using Kapur's entropy," Expert Systems with Applications, vol. 41, pp. 3538–3560, 2014.
[19] A. K. Bhandari, V. Soni, A. Kumar, and G. K. Singh, "Cuckoo search algorithm based satellite image contrast and brightness enhancement using DWT-SVD," ISA Transactions, vol. 53, no. 4, pp. 1286–1296, 2014.
[20] A. R. Yildiz, "Cuckoo search algorithm for the selection of optimal machining parameters in milling operations," International Journal of Advanced Manufacturing Technology, vol. 64, no. 1–4, pp. 55–61, 2013.
[21] K. Chandrasekaran and S. P. Simon, "Multi-objective scheduling problem: hybrid approach using fuzzy assisted cuckoo search algorithm," Swarm and Evolutionary Computation, vol. 5, pp. 1–16, 2012.
[22] S. Burnwal and S. Deb, "Scheduling optimization of flexible manufacturing system using cuckoo search-based approach," The International Journal of Advanced Manufacturing Technology, vol. 64, no. 5–8, pp. 951–959, 2013.
[23] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "Improved cuckoo search algorithm for hybrid flow shop scheduling problems to minimize makespan," Applied Soft Computing, vol. 19, pp. 93–101, 2014.
[24] S. Walton, O. Hassan, K. Morgan, and M. R. Brown, "Modified cuckoo search: a new gradient free optimisation algorithm," Chaos, Solitons and Fractals, vol. 44, no. 9, pp. 710–718, 2011.
[25] C. Li, J. Zhou, P. Kou, and J. Xiao, "A novel chaotic particle swarm optimization based fuzzy clustering algorithm," Neurocomputing, vol. 83, pp. 98–109, 2012.
[26] D. H. Ackley, "An empirical study of bit vector function optimization," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed., pp. 170–215, Morgan Kaufmann, Los Altos, Calif, USA, 1987.
[27] K. A. de Jong, An Analysis of the Behavior of a Class of Genetic Adaptive Systems, Department of Computer and Communication Sciences, University of Michigan, Ann Arbor, Mich, USA, 1975.
[28] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.
[29] L. A. Rastrigin, Extremal Control Systems, Nauka, Moscow, Russia, 1974.
[30] H. H. Rosenbrock, "An automatic method for finding the greatest or least value of a function," The Computer Journal, vol. 3, no. 3, pp. 175–184, 1960.
Computer Networks and Communications
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporation
httpwwwhindawicom Volume 2014
Advances in
Multimedia
International Journal of
Biomedical Imaging
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
ArtificialNeural Systems
Advances in
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
RoboticsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Computational Intelligence and Neuroscience
Industrial EngineeringJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014
The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Human-ComputerInteraction
Advances in
Computer EngineeringAdvances in
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014