Research Article
Parallel and Cooperative Particle Swarm Optimizer for Multimodal Problems

Geng Zhang^1 and Yangmin Li^{1,2}

^1 Department of Electromechanical Engineering, University of Macau, E11-4067, Avenida da Universidade, Taipa, Macau
^2 Tianjin Key Laboratory for Advanced Mechatronic System Design and Intelligent Control, Tianjin University of Technology, Tianjin 300384, China

Correspondence should be addressed to Yangmin Li; ymli@umac.mo

Received 25 November 2014; Accepted 13 February 2015

Academic Editor: Erik Cuevas

Copyright © 2015 G. Zhang and Y. Li. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Although the original particle swarm optimizer (PSO) method and its related variant methods show some effectiveness for solving optimization problems, they may easily get trapped into local optima, especially when solving complex multimodal problems. Aiming to solve this issue, this paper puts forward a novel method called parallel and cooperative particle swarm optimizer (PCPSO). In case the interacting of the elements in the D-dimensional function vector X = [x_1, x_2, ..., x_d, ..., x_D] is independent, cooperative particle swarm optimizer (CPSO) is used. Based on this, the PCPSO is presented to solve real problems. Since in real problems the dimension cannot be split into several lower-dimensional search spaces because of the interacting of the elements, PCPSO exploits the cooperation of two parallel CPSO algorithms by orthogonal experimental design (OED) learning. Firstly, the CPSO algorithm is used to generate two locally optimal vectors separately; then the OED is used to learn the merits of these two vectors and create a better combination of them to generate further search. Experimental studies on a set of test functions show that PCPSO exhibits better robustness and converges much closer to the global optimum than several other peer algorithms.
1. Introduction
Inspired from social behavior and cognitive behavior, Kennedy and Eberhart [1, 2] proposed the particle swarm optimizer (PSO) algorithm to search for optimal values through a population-based iterative learning algorithm. Due to the simple implementation and effective searching ability of the PSO algorithm, it has been widely used in feature selection [3], robot path planning [4], data processing [5], and other problems. However, many experiments have shown that the PSO algorithm may easily get trapped into local optima, especially when facing complex multimodal optimization problems. Better optimization algorithms are always needed for solving complex real-world engineering problems. In general, the unconstrained optimization problems that we are going to solve can be formulated as a D-dimensional minimization problem as follows:

\min f(X), \quad X = [x_1, x_2, \ldots, x_d, \ldots, x_D],    (1)

where X = [x_1, x_2, ..., x_d, ..., x_D] is the vector to be optimized and D is the number of parameters [6].
In PSO, a member in the swarm, called a particle, represents a potential solution, which is a point in the search space. Let M denote the size of the swarm; the current state of each particle i is represented by its position vector X_i = [x_i1, x_i2, ..., x_id, ..., x_iD], and the movement of particle i is represented by its velocity vector V_i = [v_i1, v_i2, ..., v_id, ..., v_iD], where i = 1, 2, ..., M is a positive integer indexing the particle in the swarm. Using t = 1, 2, ..., T to represent the iteration number, the velocity v_id(t) and the position x_id(t) can be updated as follows:

v_{id}(t+1) = \omega v_{id}(t) + c_1 r_{1id} [p_{id}(t) - x_{id}(t)] + c_2 r_{2id} [p_{nd}(t) - x_{id}(t)],
x_{id}(t+1) = x_{id}(t) + v_{id}(t+1),    (2)
where the inertia weight ω ∈ (0, 1) determines how much of the previous velocity can be preserved. A large inertia weight value tends toward global exploration and a small value toward local exploitation. c_1 and c_2 denote the acceleration constants, which are usually set to 2.0 or adaptively controlled [7]. r_1id and r_2id are random numbers generated between 0 and 1 for the dth dimension of the ith particle. P_i(t) = [p_i1(t), p_i2(t), ..., p_id(t), ..., p_iD(t)] represents the best previous position of particle i, which is defined by pbest from the previous t iterations, and P_n(t) = [p_n1(t), p_n2(t), ..., p_nd(t), ..., p_nD(t)] represents the best position among particle i's neighborhood, which is defined by gbest or lbest from the previous t iterations. The gbest model, which is inclined to exploitation, has a faster convergence speed but a higher probability of getting stuck in a local optimum than the lbest model. On the contrary, the lbest model, considered to focus more on exploration, is less vulnerable to the attraction of local optima but has a slower convergence speed than the gbest model [8]. The position vector X_i = [x_i1, x_i2, ..., x_id, ..., x_iD] and the velocity vector V_i = [v_i1, v_i2, ..., v_id, ..., v_iD] are initialized randomly and are updated by (2) generation by generation until some criteria are met.

Hindawi Publishing Corporation, Mathematical Problems in Engineering, Volume 2015, Article ID 743671, 10 pages. http://dx.doi.org/10.1155/2015/743671
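As a concrete illustration, the update rules in (2) can be sketched in a few lines of Python. This is a minimal sketch, not the authors' code: it uses the gbest neighborhood model, and the defaults w = 0.7 and c_1 = c_2 = 1.494 are assumptions for illustration.

```python
import random

def pso_step(positions, velocities, pbest, nbest, w=0.7, c1=1.494, c2=1.494):
    """One application of the update rules in (2) for all M particles.

    positions, velocities, pbest are M x D lists; nbest is the neighborhood
    (here: global) best position, i.e., the gbest model.
    """
    M, D = len(positions), len(positions[0])
    for i in range(M):
        for d in range(D):
            r1, r2 = random.random(), random.random()  # r_1id, r_2id in [0, 1)
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * r1 * (pbest[i][d] - positions[i][d])
                                + c2 * r2 * (nbest[d] - positions[i][d]))
            positions[i][d] += velocities[i][d]
    return positions, velocities
```

Calling `pso_step` repeatedly, while refreshing `pbest` and `nbest` after each call, reproduces the iteration loop described above.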
There are many modified versions of PSO that have been reported in the literature. Most studies address the performance improvement of PSO from one of four aspects: population topology [9–11], diversity maintenance [6, 12, 13], hybridization with auxiliary operations [14–16], and adaptive PSO [8, 17]. However, it is difficult to find a robust solution due to the complexity and large search space of high-dimensional problems. Therefore, people try to split the large search space into smaller ones in order to simplify the problem. In [18], the search space implemented by a genetic algorithm is divided by splitting the solution vector into several smaller vectors. Each smaller search space is optimized separately, and the fitness function is evaluated by combining all the solutions found in the smaller search spaces. The same technique is used in PSO, called cooperative PSO (CPSO-H), which uses several subswarms to search the D-dimensional function vector X = [x_1, x_2, ..., x_d, ..., x_D] separately [19]. The searching results are integrated by a global swarm to improve the performance of the original PSO on high-dimensional problems. Compared with traditional PSO, CPSO-H shows significant performance in terms of solution quality and robustness and performs better and better as the dimension of the problem increases. However, the efficiency of the algorithm is highly affected by the degree of interacting between the elements of the function vector X = [x_1, x_2, ..., x_d, ..., x_D].
Inspired from previous studies, a novel algorithm called parallel and cooperative PSO (PCPSO) is proposed in this paper. PCPSO tries to overcome the influence of interacting between vector elements while taking a split method similar to that mentioned in [19]. Assuming that there is no interacting among the elements of the vector X = [x_1, x_2, ..., x_d, ..., x_D], the CPSO is used [19]. Although CPSO alone is of limited use in solving real problems, it provides a framework for PCPSO. For a high degree of interacting of vector elements, a local optimum can be obtained by CPSO [20]. In order to jump out of the local optimum, orthogonal experimental design (OED) is used to learn from two locally optimal vectors which are achieved by CPSO [21]. A better combination of these two vectors can be obtained to push the search further and get closer to the global optimum.
The rest of this paper is organized as follows. Section 2 describes the PCPSO algorithm. In Section 3, benchmark test functions are used to test the PCPSO algorithm, and the results are compared with some peer algorithms taken from the literature to verify the effectiveness and robustness of the PCPSO algorithm. Final conclusions are given in Section 4.
2. Parallel and Cooperative Particle Swarm Optimizer
2.1. Cooperative Particle Swarm Optimizer. Among PSO variants in the literature, premature convergence is still the main deficiency of the PSO, especially when facing large-dimensional, high-correlation problems. The difficulty for particles to jump out of a local optimum is that they need to change several elements of the vector X = [x_1, x_2, ..., x_d, ..., x_D] simultaneously, but usually the PSO variants can only change one element at a time when facing a local optimum. Although the OED method is used to combine the best elements of the vector X = [x_1, x_2, ..., x_d, ..., x_D] as an exemplar [21], the function value f(X) usually gets bigger because of the high correlation. Here we first introduce the CPSO used in [19]; then the PCPSO is proposed based on CPSO. OED is used to learn and create a new vector from two locally optimal vectors achieved by CPSO, and the new vector is treated as a start point for further search.
In [19], there are two cooperative PSO algorithms, CPSO-S_k and CPSO-H_k. In CPSO-S_k, the D-dimensional search space is decomposed into k subcomponents, each corresponding to a swarm of s dimensions (where D = k × s). CPSO-H_k is a hybrid method combining a standard PSO with CPSO-S_k. The k subcomponents are used to place interacting elements together. However, the interacting elements are not known in real problems. We simplify the CPSO-S_k algorithm as CPSO, which means the D-dimensional search space is decomposed into D subcomponents. Algorithm 1 shows the pseudocode of CPSO. In order to evaluate the fitness of a particle in a swarm, the context vector is applied, which is a concatenation of all global best particles from all D swarms. The evaluation of the ith particle in the dth swarm is done by calling the function b(d, S_d.x_i), which returns a D-dimensional vector with its dth component replaced by S_d.x_i [22].
2.2. Parallel and Cooperative Particle Swarm Optimizer Algorithm. Since the CPSO method uses one-dimensional swarms to search each dimension, and interacting elements of the solution vector cannot be changed simultaneously, it is easier to fall into a local optimum. If we try to use the CPSO algorithm (called First1D) to solve a high-correlation problem, the vector X_L1 = [x_L11, x_L12, ..., x_L1d, ..., x_L1D], started from random values, will fall into a locally optimal vector. However, when another CPSO algorithm (called Second1D), which does not have any relationship with First1D, is also used to solve the same problem, another locally optimal vector X_L2 = [x_L21, x_L22, ..., x_L2d, ..., x_L2D] will be obtained. Since the performance of CPSO depends on the numerical distribution when X = [x_1, x_2, ..., x_d, ..., x_D] is initialized by random
(1) Define b(d, z) ≡ (S_1.p_n, S_2.p_n, ..., S_{d−1}.p_n, z, S_{d+1}.p_n, ..., S_D.p_n)
(2) Create and initialize D one-dimensional PSOs: S_d, d ∈ [1, D]
(3) Repeat:
(4)   for each swarm d ∈ [1, D]
(5)     for each particle i ∈ [1, M]
(6)       if f(b(d, S_d.x_i)) < f(b(d, S_d.p_i))
(7)         then S_d.p_i = S_d.x_i
(8)       if f(b(d, S_d.p_i)) < f(b(d, S_d.p_n))
(9)         then S_d.p_n = S_d.p_i
(10)    end for
(11)    Perform velocity and position updates using (2) for each particle in S_d
(12)  end for
(13) Until the stopping condition is met

The dth swarm is denoted as S_d; S_d.x_i denotes the current position of the ith particle in the dth swarm; S_d.p_n represents the best position in the dth swarm. Similarly, S_d.p_i represents the best previous position of particle i in the dth swarm. The context vector is the concatenation of S_1.p_n, S_2.p_n, ..., S_D.p_n.

Algorithm 1: Pseudocode for the generic CPSO algorithm.
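Algorithm 1 can be sketched as follows. This is an illustrative reimplementation, not the authors' code: the names mirror the pseudocode (`b` is the context-vector evaluation, `pn` holds the per-swarm bests S_d.p_n), while the bounds and parameter defaults are assumptions.

```python
import random

def cpso_minimize(f, D, M=10, iters=100, lo=-5.0, hi=5.0, w=0.7, c1=1.494, c2=1.494):
    """Minimal CPSO after Algorithm 1: D one-dimensional swarms, where the
    fitness of a particle in swarm d is evaluated inside the context vector."""
    x = [[random.uniform(lo, hi) for _ in range(M)] for _ in range(D)]   # S_d.x_i
    v = [[0.0] * M for _ in range(D)]
    p = [row[:] for row in x]                                            # S_d.p_i
    pn = [row[0] for row in x]                                           # S_d.p_n

    def b(d, z):
        # context vector with its dth component replaced by z
        ctx = pn[:]
        ctx[d] = z
        return ctx

    for _ in range(iters):
        for d in range(D):
            for i in range(M):
                if f(b(d, x[d][i])) < f(b(d, p[d][i])):
                    p[d][i] = x[d][i]
                if f(b(d, p[d][i])) < f(b(d, pn[d])):
                    pn[d] = p[d][i]
            for i in range(M):
                # one-dimensional velocity/position update, as in (2)
                r1, r2 = random.random(), random.random()
                v[d][i] = (w * v[d][i] + c1 * r1 * (p[d][i] - x[d][i])
                           + c2 * r2 * (pn[d] - x[d][i]))
                x[d][i] += v[d][i]
    return pn, f(pn)
```

On a separable function such as the Sphere function, the context vector `pn` converges toward the global optimum dimension by dimension.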
Table 1: Best combination levels of the Sphere function by using the OED method.

Combination | x_1      | x_2       | x_3      | Function value
C_1         | (X_L1) 2 | (X_L1) 0  | (X_L1) 1 | F_1 = 5
C_2         | (X_L1) 2 | (X_L2) −2 | (X_L2) 0 | F_2 = 8
C_3         | (X_L2) 3 | (X_L1) 0  | (X_L2) 0 | F_3 = 9
C_4         | (X_L2) 3 | (X_L2) −2 | (X_L1) 1 | F_4 = 14

Levels | Factor analysis
L_1    | S_11 = (F_1 + F_2)/2 = 6.5  | S_21 = (F_1 + F_3)/2 = 7  | S_31 = (F_1 + F_4)/2 = 9.5
L_2    | S_12 = (F_3 + F_4)/2 = 11.5 | S_22 = (F_2 + F_4)/2 = 11 | S_32 = (F_2 + F_3)/2 = 8.5

Best combination | (X_L1) 2 | (X_L1) 0 | (X_L2) 0 | F_final = 4
values, the independent vectors X_L1 and X_L2, each containing good information for the optimum search, will fall into different locally optimal vectors. A method is needed for extracting the good information from X_L1 and X_L2 and forming a new vector X_Ln which is closer to the globally optimal value. If we exhaustively tested all the combinations of X_L1 and X_L2 for the new vector X_Ln, there would be 2^D trials, which is unrealistic to accomplish in practice. With the help of OED [15, 16, 23, 24], a relatively good vector X_Ln can be obtained from X_L1 and X_L2 using only a few experimental tests.
The OED, with both the orthogonal array (OA) and the factor analysis, makes it possible to discover the best combinations of different factors with only a small number of experimental samples [23, 24]. Here we give a brief discussion of OED combined with an example using vectors X_L1 and X_L2. In order to simplify the problem, assuming D = 3, the optimization problem is to minimize the Sphere function f(X) = x_1^2 + x_2^2 + x_3^2, with X_L1 = [2, 0, 1] and X_L2 = [3, −2, 0], which are derived from First1D and Second1D (actually, First1D and Second1D should obtain the same globally optimal vector because the test function is simple; here we just want to explain the OED theory). The whole analysis of using OED for the Sphere function is shown in Table 1.
An OA is a predefined table which has numbers arranged in rows and columns. In Table 1, we can see that there are three factors x_1, x_2, and x_3 that affect function values, and two levels, X_L1 and X_L2, to choose from. The rows represent the levels of the factors in each combination, and the columns represent the specific factors that can be changed from each combination [15, 24]. The programming of generating an OA can be found in [24]. Factor analysis evaluates the effects of individual factors based on function values and determines the best level for each factor to form a new vector X_Ln. Assume F_t denotes the function value of combination t, where t = 1, ..., n and n stands for the total number of combinations. The main effect of factor d with level l, denoted S_{dl}, is expressed as

S_{dl} = \frac{\sum_{t=1}^{n} F_t \cdot C_{dlt}}{\sum_{t=1}^{n} C_{dlt}},    (3)

where d = 1, ..., D and l = 1, 2; C_{dlt} = 1 if factor d takes level l in combination t, and C_{dlt} = 0 otherwise. In Table 1, we first calculate the effect S_11 of level 1 in factor x_1; the effect of level 2 in factor x_1 is then S_12. Since this is a minimization problem, the better level 1 is chosen, because the effect value S_11 of level 1 is less than S_12 in Table 1. As the example in Table 1 shows, although the new vector X_Ln = [2, 0, 0] does not appear among the four combinations tested, the best combination can be obtained by the OED method.
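The factor analysis of (3) and Table 1 can be reproduced with a short script. The L4(2^3) array below is the standard two-level orthogonal array for three factors (the general OA-generation procedure is described in [24]); `oed_combine` evaluates each OA row and picks, per factor, the level with the smaller main effect.

```python
def oed_combine(f, xl1, xl2, oa):
    """Pick, for each factor d, the level (X_L1 or X_L2) whose main effect
    S_dl in equation (3) is smaller, and return the combined vector X_Ln."""
    levels = [xl1, xl2]
    D = len(xl1)
    # evaluate every OA combination: row t selects a level per factor
    F = [f([levels[row[d]][d] for d in range(D)]) for row in oa]
    xln = []
    for d in range(D):
        effects = []
        for l in (0, 1):  # S_d1, S_d2: mean of F over rows where factor d is at level l
            rows = [t for t, row in enumerate(oa) if row[d] == l]
            effects.append(sum(F[t] for t in rows) / len(rows))
        xln.append(levels[0][d] if effects[0] <= effects[1] else levels[1][d])
    return xln

# L4(2^3) orthogonal array: 4 trials for 3 two-level factors
L4 = [[0, 0, 0],
      [0, 1, 1],
      [1, 0, 1],
      [1, 1, 0]]

sphere = lambda x: sum(v * v for v in x)
print(oed_combine(sphere, [2, 0, 1], [3, -2, 0], L4))  # → [2, 0, 0]
```

The four OA rows here are exactly the combinations C_1–C_4 of Table 1, and the result [2, 0, 0] with f = 4 matches the table's best combination.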
OED works well when the interacting of vector elements is small. When the OED method is used to process some
[Figure 1 shows the PCPSO loop: for i = 1 to 6, run the parallel First1D and Second1D algorithms (each operating eighty iterations); use the two vectors X_L1 and X_L2 obtained from First1D and Second1D for OED learning to create a new vector X_Ln; choose the better of X_L1 and X_L2 as the new X_L1 and assign X_Ln to X_L2; then run First1D and Second1D again with X_L1 and X_L2 as context vectors.]

Figure 1: Flowchart of the PCPSO algorithm.
functions with highly interacting vector elements, there exists a problem that the function value of the new vector X_Ln is bigger than the function values of the vectors X_L1 and X_L2. However, there are still two advantages of using OED in highly correlated problems. One is that the particle with the new vector X_Ln jumps out of the locally optimal vector; the other is that each element of the new vector X_Ln, chosen between X_L1 and X_L2, is probably closer to the globally optimal vector. Therefore, we try to use the new vector X_Ln as the context vector to repeat the CPSO algorithm.

Based on the analysis mentioned above, we propose
the PCPSO algorithm to overcome the problem of falling into local optima. Figure 1 shows the flowchart of PCPSO. First, we use two parallel CPSO algorithms (First1D and Second1D); then OED learning is applied to form the new vector X_Ln, which is assigned to X_L2, while the better vector chosen from X_L1 and X_L2 is assigned to the new X_L1. The new vectors X_L1 and X_L2, treated as context vectors in the CPSO algorithm, keep the search going further. In order to illustrate the efficiency of the PCPSO algorithm, Figure 2 shows the convergence characteristic of PCPSO on the Rotated Ackley's function. The detailed description of the Rotated Ackley's function can be found in Section 3. The parameter settings for First1D and Second1D are the same as for the CPSO-H algorithm [19], where ω decreases linearly from 0.9 to 0.4, c_1 = c_2 = 1.494, M = 10 stands for the particle number, and D = 30 represents the number of dimensions. Assuming First1D and Second1D need 80 CPSO iterations to obtain locally optimal vectors, followed by OED computation, there are 6 iterations of OED computation, and therefore the total number of CPSO iterations is 6 × 80 = 480. From Figure 2, we can see that the function value drops sharply in the first 80 iterations and that First1D and Second1D have similar performance. For the next 80 iterations, the better vector chosen from X_L1 and X_L2 is assigned to X_L1 for another First1D implementation, while the new vector X_Ln is assigned to X_L2 for another Second1D implementation. The performances of First1D and Second1D are significantly different in iterations 80 to 240: the First1D algorithm gets trapped in a locally optimal vector, while the Second1D algorithm still makes the function value drop sharply, even though its value is higher at the start point at the 81st iteration. Since the First1D algorithm has almost lost its searching ability after 80 iterations, we can also adopt the strategy that the First1D algorithm is only implemented in the first 80 iterations
Figure 2: The convergence characteristic of the PCPSO on the Rotated Ackley's function in 30 dimensions. First1D and Second1D are CPSO algorithms.
in order to save computational cost. Vector X_L1 is used for saving the best optimal vector found so far and is combined with vector X_L2 to form the new vector X_Ln by the OED method after 80 iterations. These operations iterate generation by generation until some criteria are met.
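The alternation just described can be sketched at a high level as follows. This is a loose sketch under stated simplifications: `run_cpso` is a hypothetical coordinate-wise local-improvement stand-in for a full First1D/Second1D pass, and `oed_pick` is a cheap per-element proxy for the OA-based factor analysis; neither is the paper's exact procedure.

```python
import random

def run_cpso(f, start, iters=80, step=0.3):
    """Hypothetical stand-in for one First1D/Second1D pass: coordinate-wise
    local improvement from a given context vector (like CPSO, it changes
    one element at a time and can stall in a local optimum)."""
    x = list(start)
    for _ in range(iters):
        for d in range(len(x)):
            trial = list(x)
            trial[d] += random.uniform(-step, step)
            if f(trial) < f(x):
                x = trial
    return x

def oed_pick(f, xl1, xl2):
    """Simplified per-element combination: keep the element of X_L2 whose
    one-at-a-time swap lowers f (a cheap proxy for the factor analysis)."""
    xln = list(xl1)
    for d in range(len(xl1)):
        trial = list(xln)
        trial[d] = xl2[d]
        if f(trial) < f(xln):
            xln = trial
    return xln

def pcpso(f, D, rounds=6, lo=-5.0, hi=5.0):
    """Outer PCPSO loop of Figure 1: two parallel passes, OED-style combine,
    keep the better vector as X_L1 and the combination as X_L2, repeat."""
    xl1 = run_cpso(f, [random.uniform(lo, hi) for _ in range(D)])
    xl2 = run_cpso(f, [random.uniform(lo, hi) for _ in range(D)])
    for _ in range(rounds):
        xln = oed_pick(f, xl1, xl2)     # learn from X_L1 and X_L2
        xl1 = min(xl1, xl2, key=f)      # better of the two kept as X_L1
        xl2 = xln                       # X_Ln becomes the new X_L2
        xl1, xl2 = run_cpso(f, xl1), run_cpso(f, xl2)
    return min(xl1, xl2, key=f)
```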
Compared with other PSO algorithms using OED [15, 16], the PCPSO needs some extra computation for updating vector X_L1. However, the cooperation between the local vectors X_L1 and X_L2 makes the search more effective. In [15], the OED result is set as an exemplar for the other particles to learn from. From Figure 2, we can see that the function values after OED are sometimes even bigger. Another limitation of [15] is that the search gets trapped into local optima easily, because the two vectors used for OED learning are similar. Based on the principle that each element of the function vector moves closer to the globally optimal vector after OED, it can be seen that the new vector X_Ln jumps out of the local optimum and keeps searching further, even though its function value is higher than the corresponding local optimum at the starting iteration.
3. Experimental Results and Discussions
3.1. Test Functions. Test functions, including unimodal functions and multimodal functions, are used to investigate the performance of PSO algorithms [25]. We choose 15 test functions on 30 dimensions. The formulas and the properties of these functions are presented below.

Group A: unimodal and simple multimodal functions are as follows.
(1) Sphere function:

f_1(x) = \sum_{i=1}^{D} x_i^2.    (4)
(2) Rosenbrock's function:

f_2(x) = \sum_{i=1}^{D-1} [100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2].    (5)
(3) Griewank's function:

f_3(x) = \sum_{i=1}^{D} \frac{x_i^2}{4000} - \prod_{i=1}^{D} \cos(\frac{x_i}{\sqrt{i}}) + 1.    (6)
(4) Rastrigin's function:

f_4(x) = \sum_{i=1}^{D} [x_i^2 - 10 \cos(2\pi x_i) + 10].    (7)
(5) Ackley's function:

f_5(x) = -20 \exp(-0.2 \sqrt{\frac{1}{D} \sum_{i=1}^{D} x_i^2}) - \exp(\frac{1}{D} \sum_{i=1}^{D} \cos(2\pi x_i)) + 20 + e.    (8)
(6) Weierstrass function:

f_6(x) = \sum_{i=1}^{D} (\sum_{k=0}^{k_{max}} [a^k \cos(2\pi b^k (x_i + 0.5))]) - D \sum_{k=0}^{k_{max}} [a^k \cos(\pi b^k)],
a = 0.5, b = 3, k_{max} = 20.    (9)
(7) Noncontinuous Rastrigin's function:

f_7(x) = \sum_{i=1}^{D} (y_i^2 - 10 \cos(2\pi y_i) + 10),
y_i = x_i if |x_i| < 1/2, y_i = round(2x_i)/2 if |x_i| >= 1/2, for i = 1, 2, ..., D.    (10)
(8) Schwefel's function:

f_8(x) = 418.9829 \times D - \sum_{i=1}^{D} x_i \sin(|x_i|^{1/2}).    (11)
Group B: rotated and shifted multimodal functions are as follows.

In Group A, the functions can be divided into smaller search spaces because of the low interacting of the elements; therefore, the CPSO algorithm is suitable for optimizing these functions. In Group B, seven rotated multimodal functions are generated by an orthogonal matrix M to test the performance of the PCPSO algorithm. The new rotated vector y = M * x, obtained through the original vector x left-multiplied by the orthogonal matrix M, behaves like a highly interacting vector, because all elements in vector y change once one element in vector x changes. A detailed illustration of generating the orthogonal matrix is introduced in [26]. Meanwhile, one shifted and rotated function is also listed in Group B to test the performance of the PSO algorithms. Table 2 shows the globally optimal vectors x*, the corresponding function values f(x*), the search range, and the initialization range of each function. Consider the following.
(9) Rotated Ackley's function:

f_9(x) = -20 \exp(-0.2 \sqrt{\frac{1}{D} \sum_{i=1}^{D} y_i^2}) - \exp(\frac{1}{D} \sum_{i=1}^{D} \cos(2\pi y_i)) + 20 + e, \quad y = M * x.    (12)
(10) Rotated Rastrigin's function:

f_{10}(x) = \sum_{i=1}^{D} [y_i^2 - 10 \cos(2\pi y_i) + 10], \quad y = M * x.    (13)
(11) Rotated Griewank's function:

f_{11}(x) = \sum_{i=1}^{D} \frac{y_i^2}{4000} - \prod_{i=1}^{D} \cos(\frac{y_i}{\sqrt{i}}) + 1, \quad y = M * x.    (14)
(12) Rotated Weierstrass function:

f_{12}(x) = \sum_{i=1}^{D} (\sum_{k=0}^{k_{max}} [a^k \cos(2\pi b^k (y_i + 0.5))]) - D \sum_{k=0}^{k_{max}} [a^k \cos(\pi b^k)],
a = 0.5, b = 3, k_{max} = 20, \quad y = M * x.    (15)
Table 2: Globally optimal vector x*, corresponding function value f(x*), search range, and initialization range of the test functions.

Function | x*                        | f(x*) | Search range     | Initialization range
f_1      | [0, 0, ..., 0]            | 0     | [−100, 100]^D    | [−100, 50]^D
f_2      | [1, 1, ..., 1]            | 0     | [−2, 2]^D        | [−2, 2]^D
f_3      | [0, 0, ..., 0]            | 0     | [−600, 600]^D    | [−600, 200]^D
f_4      | [0, 0, ..., 0]            | 0     | [−5.12, 5.12]^D  | [−5.12, 2]^D
f_5      | [0, 0, ..., 0]            | 0     | [−32, 32]^D      | [−32, 16]^D
f_6      | [0, 0, ..., 0]            | 0     | [−0.5, 0.5]^D    | [−0.5, 0.2]^D
f_7      | [0, 0, ..., 0]            | 0     | [−5.12, 5.12]^D  | [−5.12, 2]^D
f_8      | [420.96, ..., 420.96]     | 0     | [−500, 500]^D    | [−500, 500]^D
f_9      | [0, 0, ..., 0]            | 0     | [−32, 32]^D      | [−32, 16]^D
f_10     | [0, 0, ..., 0]            | 0     | [−5.12, 5.12]^D  | [−5.12, 2]^D
f_11     | [0, 0, ..., 0]            | 0     | [−600, 600]^D    | [−600, 200]^D
f_12     | [0, 0, ..., 0]            | 0     | [−0.5, 0.5]^D    | [−0.5, 0.2]^D
f_13     | [0, 0, ..., 0]            | 0     | [−5.12, 5.12]^D  | [−5.12, 2]^D
f_14     | [420.96, ..., 420.96]     | 0     | [−500, 500]^D    | [−500, 500]^D
f_15     | [o_1, o_2, ..., o_D]      | −330  | [−5.12, 5.12]^D  | [−5.12, 2]^D
(13) Rotated noncontinuous Rastrigin's function:

f_{13}(x) = \sum_{i=1}^{D} (z_i^2 - 10 \cos(2\pi z_i) + 10),
z_i = y_i if |y_i| < 1/2, z_i = round(2y_i)/2 if |y_i| >= 1/2, for i = 1, 2, ..., D, \quad y = M * x.    (16)
(14) Rotated Schwefel's function:

f_{14}(x) = 418.9829 \times D - \sum_{i=1}^{D} z_i,
z_i = y_i \sin(|y_i|^{1/2}) if |y_i| <= 500, z_i = 0 otherwise, for i = 1, 2, ..., D,
y = y' + 420.96, \quad y' = M * (x - 420.96).    (17)
(15) Rotated and Shifted Rastrigin's function:

f_{15}(x) = \sum_{i=1}^{D} (z_i^2 - 10 \cos(2\pi z_i) + 10) - 330,
z = M * y, \quad y = x - o,    (18)

where o = [o_1, o_2, ..., o_D] is the shifted globally optimal vector.
3.2. PCPSO's Performance on 15 Functions. The performance of PCPSO is tested on functions f_1–f_15, which include multimodal, rotated, and shifted functions in 30 dimensions.
[Figure 3 plots log(f(x)) against iteration for eight curves: the Sphere, Rosenbrock, Griewank, Rastrigin's, Ackley, Weierstrass, Noncontinuous Rastrigin's, and Schwefel's functions.]

Figure 3: The convergence characteristics of the PCPSO algorithm on test functions f_1–f_8 in 30 iterations.
The parameter settings for the PCPSO algorithm are as follows: ω decreases linearly from 0.9 to 0.4, c_1 = c_2 = 1.494, and the particle number M = 10. Since the vectors of f_1–f_8 have very weakly interacting elements, Figure 3 shows the convergence characteristics of the PCPSO algorithm on test functions f_1–f_8 in 30 iterations, where the y-axis is expressed in log format. From Figure 3, we can see that all the function values drop significantly within 30 iterations, except for Rosenbrock's function. In order to fully test PCPSO's performance on Rosenbrock's function, we make a comparison between the PCPSO and CPSO algorithms in Figure 4. The upper plot of Figure 4 shows the convergence characteristic of PCPSO, which cooperates with
Figure 4: The convergence characteristics of the PCPSO and CPSO algorithms on Rosenbrock's function: (a) PCPSO (First1D and Second1D); (b) CPSO.
the two independent algorithms (First1D and Second1D) by using OED learning; the lower plot shows the convergence characteristic of the CPSO algorithm. From Figure 4, we can see that the OED learning makes the function vector jump out of the locally optimal vector and finally drops the function value. The convergence characteristics for the remaining six test functions f_10–f_15 are shown in Figure 5. All six are rotated, multimodal, and nonseparable functions. Similar to Figure 2, despite the fact that the function value of Second1D is increased in some cases at the start point, caused by OED learning, it drops sharply as the iterations increase. This phenomenon is especially obvious in test function f_12. To sum up, PCPSO works well in terms of avoiding locally optimal vectors and dropping the function value sharply. The OED learning finds the best combination of the two locally optimal vectors X_L1 and X_L2 at iterations 80, 160, and 240. However, there are still some drawbacks causing the vector to fall into a locally optimal vector, as with f_14; the reason is that, when the vectors X_L1 and X_L2 become equal to each other, the OED learning ability is finally lost.
3.3. Comparison with Other PSOs and Discussions. Seven PSO variants are taken from the literature for comparison on the 15 test functions with 30 dimensions. In the following, we briefly describe the peer algorithms shown in Table 3. The first algorithm is the standard PSO (SPSO), whose performance is improved by using a random topology. The second algorithm, fully informed PSO (FIPS), uses all the neighbor particles to influence the flying velocity. The third, orthogonal PSO (OPSO), aims to generate a better position by using OED. The fourth algorithm, fitness-distance-ratio PSO (FDR-PSO), solves the premature convergence problem
Table 3: Parameter settings for the seven PSO variants.

Algorithm     | Parameter settings
SPSO [27]     | ω: 0.4–0.9, c_1 = c_2 = 1.193
FIPS [10]     | χ = 0.7298, Σ c_i = 4.1
OPSO [16]     | ω: 0.4–0.9, c_1 = c_2 = 2.0
FDR-PSO [13]  | ω: 0.4–0.9, Σ c_i = 4.0
CLPSO [6]     | ω: 0.4–0.9, c_1 = c_2 = 2.0
CPSO-H [19]   | ω: 0.4–0.9, c = 1.49, κ = 6
OLPSO [15]    | ω: 0.4–0.9, c = 2.0, G = 5
by using the fitness distance ratio The fifth comprehensivelearning PSO (CLPSO) is proposed for solving multimodalproblems The sixth cooperative PSO (CPSO H) uses one-dimensional swarms to search each dimension separatelyThe seventh orthogonal learning PSO (OLPSO) can guideparticles to fly towards an exemplar which is constructed by119875119894and 119875119899using OED method
The swarm size is set to 40 for the seven PSO variants mentioned above and to 10 for the PCPSO algorithm. The maximal number of FEs for the eight algorithms is set to 200,000 in each run of each test function. All functions were run 25 times, and the mean values and standard deviations of the results are presented in Table 4, with the best result for each function highlighted. From the results we observe that CPSO-H, CLPSO, and OLPSO perform well on the first 8 functions. The reason is that CPSO-H is well suited to one-dimensional search, while CLPSO and OLPSO can search further by using their own learning strategies: the strategy in CLPSO prevents falling into a local optimum through random learning, while the strategy in OLPSO finds a better combination of the locally optimal vector. The PCPSO method can keep searching through the cooperation of the two locally optimal vectors X_L1 and X_L2. With the implementation of Second1D, the algorithm can always find a better vector X_L2, compared with the previous best one X_L1, by using the OED method. The PCPSO method performs well on f_2–f_3, f_9–f_11, f_13, and f_15. However, the function values obtained by this method are not close enough to the globally optimal values; premature convergence still happens on f_12 and f_14 due to a lack of vector diversity. The advantages of this method are its rapid convergence speed and robustness: whatever the test function, the algorithm can find an acceptable optimum, especially when facing complex functions. The nonparametric Wilcoxon rank sum tests are taken as t-tests in Table 5 to determine whether the results obtained by PCPSO are statistically different from the best results achieved by the other algorithms. A value of one indicates that the performance of the two algorithms is statistically different with 95% certainty, whereas a value of zero implies that the performance is not statistically different. Among these 15 functions, there are 10 functions that are statistically different between the best results and the results of PCPSO. From Tables 4 and 5 we can see that PCPSO obtains the best result on 5 functions (f_2, f_3, f_10, f_11, f_13) that are statistically different from the others. In addition, most of the best results obtained by PCPSO are focused on
8 Mathematical Problems in Engineering
Figure 5: The convergence progress of 30-dimensional test functions using PCPSO (First1D and Second1D); each panel plots function value against iteration (0–600). (a) Rotated Rastrigin's function; (b) Rotated Griewank's function; (c) Rotated Weierstrass function; (d) Rotated noncontinuous Rastrigin's function; (e) Rotated Schwefel's function; (f) Rotated and Shifted Rastrigin's function.
Table 4: Results for the 30-D functions (mean ± standard deviation over 25 runs; * marks the best result for each function).

f   | SPSO                  | FIPS                  | OPSO                  | FDR-PSO
f1  | 1.16e−69 ± 2.86e−68   | 2.69e−12 ± 2.28e−11   | 6.71e−18 ± 6.82e−18   | 4.88e−102 ± 1.53e−101 *
f2  | 1.68e+01 ± 8.56e+00   | 2.47e+01 ± 2.19e−01   | 4.12e+01 ± 1.19e+01   | 1.54e+01 ± 6.79e+00
f3  | 1.73e−02 ± 1.08e−01   | 4.65e−03 ± 3.71e−02   | 2.10e−03 ± 6.19e−03   | 1.72e−02 ± 9.08e−03
f4  | 4.61e+01 ± 7.85e+01   | 7.44e+01 ± 1.56e+01   | 3.27e+00 ± 6.35e+00   | 2.91e+01 ± 8.06e+00
f5  | 1.27e+00 ± 4.79e+00   | 5.12e−07 ± 7.81e−07   | 3.74e−09 ± 5.96e−09   | 2.85e−14 ± 4.32e−15
f6  | 2.38e+00 ± 8.04e+00   | 3.13e−01 ± 3.74e−01   | 3.28e−02 ± 4.93e−02   | 7.75e−03 ± 1.21e−02
f7  | 5.21e+01 ± 4.19e+01   | 6.38e+01 ± 9.05e+00   | 4.23e+00 ± 8.24e+00   | 1.45e+01 ± 6.58e+00
f8  | 3.76e+03 ± 3.43e+03   | 2.56e+03 ± 7.31e+02   | 2.72e+03 ± 3.31e+03   | 3.16e+03 ± 4.65e+02
f9  | 1.35e+00 ± 1.02e+00   | 7.26e−05 ± 1.42e−05   | 4.07e−05 ± 8.26e−06   | 3.63e−01 ± 5.72e−01
f10 | 8.38e+01 ± 1.53e+02   | 1.75e+02 ± 5.87e+01   | 1.64e+02 ± 3.98e+01   | 4.67e+01 ± 1.44e+01
f11 | 1.16e−02 ± 1.28e−02   | 6.95e−05 ± 3.21e−05   | 1.13e−03 ± 2.45e−03   | 8.79e−03 ± 1.07e−02
f12 | 2.83e+01 ± 5.31e+00   | 1.35e+01 ± 8.39e+00   | 1.21e+01 ± 1.05e+01   | 2.35e+00 ± 1.41e+00
f13 | 5.51e+01 ± 2.07e+01   | 8.74e+01 ± 1.92e+01   | 7.67e+01 ± 7.43e+01   | 4.64e+01 ± 9.01e+00
f14 | 5.59e+03 ± 6.07e+02   | 2.67e+03 ± 7.72e+02   | 2.45e+03 ± 2.89e+02   | 3.80e+03 ± 6.18e+02
f15 | −2.49e+02 ± 4.76e+01  | −1.07e+02 ± 1.52e+02  | −2.26e+02 ± 1.07e+01  | −2.71e+02 ± 3.69e+01

f   | CLPSO                 | CPSO-H                | OLPSO                 | PCPSO
f1  | 4.46e−14 ± 1.51e−14   | 1.16e−33 ± 2.26e−33   | 1.16e−38 ± 1.56e−38   | 1.12e−30 ± 5.14e−31
f2  | 2.10e+01 ± 2.71e+00   | 2.06e+01 ± 1.13e+01   | 1.74e+01 ± 1.05e+01   | 1.12e+01 ± 8.19e+00 *
f3  | 3.14e−10 ± 4.64e−10   | 3.63e−02 ± 7.11e−02   | 3.17e−10 ± 6.23e−11   | 3.36e−12 ± 7.51e−13 *
f4  | 4.76e−10 ± 2.17e−10   | 0 ± 0 *               | 0 ± 0 *               | 2.21e−12 ± 6.14e−13
f5  | 4.97e−14 ± 2.12e−14   | 4.56e−14 ± 1.25e−14   | 4.76e−15 ± 1.57e−16 * | 4.83e−08 ± 1.38e−08
f6  | 4.34e−07 ± 2.09e−07   | 8.15e−15 ± 8.92e−16 * | 1.27e−02 ± 2.79e−02   | 2.41e−11 ± 1.09e−11
f7  | 4.36e−10 ± 2.44e−10 * | 1.21e−01 ± 3.19e−01   | 4.51e−08 ± 1.17e−08   | 2.65e−07 ± 2.27e−07
f8  | 1.26e−12 ± 8.62e−13 * | 1.08e+03 ± 2.27e+02   | 3.88e−04 ± 6.21e−05   | 3.82e−04 ± 7.31e−06
f9  | 3.47e−04 ± 2.11e−04   | 2.13e+00 ± 3.55e−01   | 2.28e−05 ± 1.14e−05   | 2.25e−06 ± 1.08e−06 *
f10 | 3.35e+01 ± 5.58e+00   | 8.04e+01 ± 2.23e+01   | 5.77e+01 ± 1.27e+01   | 1.63e+00 ± 5.18e+00 *
f11 | 7.07e−05 ± 1.18e−05   | 5.54e−02 ± 4.79e−02   | 2.73e−05 ± 3.08e−05   | 1.22e−05 ± 5.15e−06 *
f12 | 3.09e+00 ± 1.51e+00 * | 1.53e+01 ± 3.62e+00   | 7.03e+00 ± 1.41e+00   | 9.37e+00 ± 3.58e+00
f13 | 3.53e+01 ± 5.52e+00   | 8.60e+01 ± 2.61e+01   | 4.03e+01 ± 1.34e+01   | 1.07e+01 ± 2.37e+00 *
f14 | 2.46e+03 ± 1.88e+03 * | 3.82e+03 ± 6.62e+02   | 3.15e+03 ± 1.27e+03   | 3.26e+03 ± 1.23e+03
f15 | −2.97e+02 ± 2.13e+01  | −2.38e+02 ± 7.42e+01  | −2.76e+02 ± 3.47e+01  | −3.08e+02 ± 1.02e+01 *
Table 5: t-test on the 15 functions.

Function | f1 | f2 | f3 | f4 | f5 | f6 | f7 | f8
t-test   |  1 |  1 |  1 |  1 |  1 |  0 |  0 |  1
Function | f9 | f10 | f11 | f12 | f13 | f14 | f15
t-test   |  0 |   1 |   1 |   1 |   1 |   0 |   0
nonseparable multimodal problems, which demonstrates the effectiveness and robustness of the PCPSO algorithm.
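The 1/0 entries of Table 5 come from a nonparametric rank-sum comparison at the 95% level. The check can be sketched with a normal-approximation z-test as below; this is an illustrative sketch (the helper name and the toy samples are ours, not the paper's actual run data).

```python
import math

def ranksum_different(a, b, z_crit=1.96):
    """Wilcoxon rank-sum via normal approximation: returns 1 if the two
    samples differ at the 95% level (|z| > 1.96), else 0; ties get mid-ranks."""
    pooled = sorted((v, g) for g, xs in ((0, a), (1, b)) for v in xs)
    ranks, i = {}, 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j][0] == pooled[i][0]:
            j += 1
        mid = (i + j + 1) / 2.0            # mid-rank shared by tied values
        for k in range(i, j):
            ranks[k] = mid
        i = j
    W = sum(ranks[k] for k, (_, g) in enumerate(pooled) if g == 0)
    n1, n2 = len(a), len(b)
    mu = n1 * (n1 + n2 + 1) / 2.0          # mean of W under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (W - mu) / sigma
    return 1 if abs(z) > z_crit else 0

# toy samples standing in for 25-run results of two algorithms
a = [0.01 * k for k in range(25)]
b = [1.0 + 0.01 * k for k in range(25)]
print(ranksum_different(a, b))  # clearly separated samples -> prints 1
```

Identical samples give z = 0 and hence 0, matching the meaning of the zero entries in Table 5.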
4. Conclusion
In this paper, a novel PCPSO algorithm is presented in order to overcome the problem of falling into a local optimum. In PCPSO, considering the complex interaction among the elements of the vector, the CPSO algorithm (First1D and Second1D) is used twice to create two locally optimal vectors. These two vectors are combined by OED learning so that the best elements can be chosen, allowing the search to jump out of the local optimum and continue further.
Although the OED operation may in some cases produce a function value higher than the local optima, the function value always drops sharply after OED because the operation chooses the elements that are closer to the globally optimal vector. Experimental tests have been conducted on 15 benchmark functions, including unimodal, multimodal, rotated, and shifted functions. From the comparative results it can be concluded that PCPSO significantly improves the performance of PSO and provides a better solution on most rotated multimodal problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work was supported in part by the National Natural Science Foundation of China (51275353), the Macao Science and Technology Development Fund (108/2012/A3, 110/2013/A3), and the Research Committee of the University of Macau (MYRG183(Y1-L3)FST11-LYM, MYRG203(Y1-L4)-FST11-LYM).
References
[1] R. C. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39–43, Nagoya, Japan, October 1995.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[3] A. Adam, M. I. Shapiai, M. Z. M. Tumari, M. S. Mohamad, and M. Mubin, "Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization," The Scientific World Journal, vol. 2014, Article ID 973063, 13 pages, 2014.
[4] N. Geng, D. W. Gong, and Y. Zhang, "PSO-based robot path planning for multisurvivor rescue in limited survival time," Mathematical Problems in Engineering, vol. 2014, Article ID 187370, 10 pages, 2014.
[5] M. Shoaib and W.-C. Song, "Data aggregation for Vehicular Ad-hoc Network using particle swarm optimization," in Proceedings of the 14th Asia-Pacific Network Operations and Management Symposium, pp. 1–6, September 2012.
[6] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
[7] Y. H. Shi and R. C. Eberhart, "Parameter selection in particle swarm optimization," in Evolutionary Programming VII, vol. 1447 of Lecture Notes in Computer Science, pp. 591–600, Springer, Berlin, Germany, 1998.
[8] C. Li, S. Yang, and T. T. Nguyen, "A self-learning particle swarm optimizer for global optimization problems," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 42, no. 3, pp. 627–646, 2012.
[9] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[10] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[11] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the Congress on Evolutionary Computation (CEC '02), pp. 1671–1676, Honolulu, Hawaii, USA, May 2002.
[12] T. M. Blackwell and J. Branke, "Multiswarms, exclusion, and anti-convergence in dynamic environments," IEEE Transactions on Evolutionary Computation, vol. 10, no. 4, pp. 459–472, 2006.
[13] T. Peram, K. Veeramachaneni, and C. K. Mohan, "Fitness-distance-ratio based particle swarm optimization," in Proceedings of the IEEE Swarm Intelligence Symposium (SIS '03), pp. 174–181, IEEE, April 2003.
[14] X. Chen and Y. M. Li, "A modified PSO structure resulting in high exploration ability with convergence guaranteed," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 37, no. 5, pp. 1271–1289, 2007.
[15] Z.-H. Zhan, J. Zhang, Y. Li, and Y.-H. Shi, "Orthogonal learning particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 15, no. 6, pp. 832–847, 2011.
[16] S. Y. Ho, H. S. Lin, W. H. Liauh, and S. J. Ho, "OPSO: orthogonal particle swarm optimization and its application to task assignment problems," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 38, no. 2, pp. 288–298, 2008.
[17] Z.-H. Zhan, J. Zhang, Y. Li, and H. S.-H. Chung, "Adaptive particle swarm optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 39, no. 6, pp. 1362–1381, 2009.
[18] M. A. Potter and K. A. D. Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.
[19] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.
[20] G. Zhang and Y. M. Li, "Cooperative particle swarm optimizer with elimination mechanism for global optimization of multimodal problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 210–217, Beijing, China, July 2014.
[21] G. Zhang and Y. M. Li, "Orthogonal experimental design method used in particle swarm optimization for multimodal problems," in Proceedings of the 6th International Conference on Advanced Computational Intelligence (ICACI '13), pp. 183–188, Hangzhou, China, October 2013.
[22] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.
[23] D. C. Montgomery, Design and Analysis of Experiments, John Wiley & Sons, New York, NY, USA, 5th edition, 2000.
[24] S.-Y. Ho, L.-S. Shu, and J.-H. Chen, "Intelligent evolutionary algorithms for large parameter optimization problems," IEEE Transactions on Evolutionary Computation, vol. 8, no. 6, pp. 522–541, 2004.
[25] X. Yao, Y. Liu, and G. M. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[26] R. Salomon, "Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions. A survey of some theoretical and practical aspects of genetic algorithms," BioSystems, vol. 39, no. 3, pp. 263–278, 1996.
[27] SPSO 2007, http://www.particleswarm.info/Programs.html.
controlled [7]. r_{1id} and r_{2id} are random numbers generated between 0 and 1 for the dth dimension of the ith particle. P_i(t) = [p_{i1}(t), p_{i2}(t), ..., p_{id}(t), ..., p_{iD}(t)] represents the best previous position of particle i, which is defined by pbest from the previous t iterations, and P_n(t) = [p_{n1}(t), p_{n2}(t), ..., p_{nd}(t), ..., p_{nD}(t)] represents the best position among particle i's neighborhood, which is defined by gbest or lbest from the previous t iterations. The gbest model, which is inclined to exploitation, has a faster convergence speed but a higher probability of getting stuck in a local optimum than the lbest model. On the contrary, the lbest model, which focuses more on exploration, is less vulnerable to the attraction of local optima but has a slower convergence speed than the gbest model [8]. The position vector X_i = [x_{i1}, x_{i2}, ..., x_{id}, ..., x_{iD}] and the velocity vector V_i = [v_{i1}, v_{i2}, ..., v_{id}, ..., v_{iD}] are initialized randomly and are updated by (2) generation by generation until some criteria are met.
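As a concrete illustration, the gbest-model velocity and position update of (2) can be sketched as follows. This is a minimal sketch under our own choices (the function name, the Sphere objective, the search bounds, and the parameter values are assumptions, not the paper's implementation).

```python
import random

random.seed(1)  # for reproducibility of this sketch

def pso_minimize(f, dim, n_particles=20, iters=200, w=0.7, c1=1.494, c2=1.494):
    """Minimal gbest-model PSO: the velocity/position update of (2)."""
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                                   # pbest positions P_i
    pf = [f(x) for x in P]                                  # pbest values
    g = P[min(range(n_particles), key=lambda i: pf[i])][:]  # gbest position P_n
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive (pbest) + social (gbest) terms
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pf[i]:                                  # update pbest
                P[i], pf[i] = X[i][:], fx
                if fx < f(g):                               # update gbest
                    g = X[i][:]
    return g, f(g)

sphere = lambda x: sum(v * v for v in x)
best, val = pso_minimize(sphere, dim=5)
```

On a separable unimodal function such as the Sphere, this update drives the swarm close to the origin within a few hundred iterations.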
There are many modified versions of PSO reported in the literature. Most studies address the performance improvement of PSO from one of four aspects: population topology [9–11], diversity maintenance [6, 12, 13], hybridization with auxiliary operations [14–16], and adaptive PSO [8, 17]. However, it is difficult to find a robust solution due to the complexity and large search space of high-dimensional problems. Therefore, researchers try to split the large search space into smaller ones in order to simplify the problem. In [18], the search space explored by a genetic algorithm is divided by splitting the solution vector into several smaller vectors. Each smaller search space is optimized separately, and the fitness function is evaluated by combining all the solutions found by the smaller search spaces. The same technique is used in PSO, called cooperative PSO (CPSO-H), which uses several subswarms to search the D-dimensional function vector X = [x_1, x_2, ..., x_d, ..., x_D] separately [19]. The search results are integrated by a global swarm to improve the performance of the original PSO on high-dimensional problems. Compared with traditional PSO, CPSO-H shows significant improvement in terms of solution quality and robustness and performs better and better as the dimension of the problem increases. However, the efficiency of the algorithm is highly affected by the degree of interaction among the elements of the vector X = [x_1, x_2, ..., x_d, ..., x_D].

Inspired by previous studies, a novel algorithm called parallel and cooperative PSO (PCPSO) is proposed in this paper. PCPSO tries to overcome the influence of interaction among vector elements while taking a split method similar to that in [19]. Under the assumption that there is no interaction among the elements of the vector X = [x_1, x_2, ..., x_d, ..., x_D], the CPSO is used [19]. Although CPSO by itself is of limited use in solving real problems, it provides the framework for PCPSO. For problems with a high degree of interaction among vector elements, a local optimum can be obtained by CPSO [20]. In order to jump out of the local optimum, orthogonal experimental design (OED) is used to learn from the two locally optimal vectors achieved by CPSO [21]. A better combination of these two vectors can be obtained to push the search further and get closer to the global optimum.
The rest of this paper is organized as follows. Section 2 describes the PCPSO algorithm. In Section 3, benchmark test functions are used to test the PCPSO algorithm, and the results are compared with those of some peer algorithms taken from the literature to verify the effectiveness and robustness of PCPSO. Final conclusions are given in Section 4.
2. Parallel and Cooperative Particle Swarm Optimizer

2.1. Cooperative Particle Swarm Optimizer. Among the PSO variants in the literature, premature convergence is still the main deficiency of PSO, especially when facing large-dimensional, highly correlated problems. The difficulty for particles to jump out of a local optimum is that they need to change several elements of the vector X = [x_1, x_2, ..., x_d, ..., x_D] simultaneously, but PSO variants can usually change only one element at a time when facing a local optimum. Although the OED method has been used to combine the best elements of the vector X = [x_1, x_2, ..., x_d, ..., x_D] into an exemplar [21], the function value f(X) usually gets bigger because of the high correlation. Here we first introduce the CPSO used in [19]; then PCPSO is proposed based on CPSO. OED is used to learn from, and create a new vector out of, the two locally optimal vectors achieved by CPSO, and the new vector is treated as a start point for further search.
In [19], there are two cooperative PSO algorithms: CPSO-S_k and CPSO-H_k. In CPSO-S_k, the D-dimensional search space is decomposed into k subcomponents, each corresponding to a swarm of s dimensions (where D = k × s). CPSO-H_k is a hybrid method combining a standard PSO with CPSO-S_k. The k subcomponents are used to place interacting elements together; however, the interacting elements are not known in real problems. We simplify the CPSO-S_k algorithm as CPSO, which means the D-dimensional search space is decomposed into D subcomponents. Algorithm 1 shows the CPSO algorithm. In order to evaluate the fitness of a particle in a swarm, the context vector is applied, which is a concatenation of all global best particles from all D swarms. The evaluation of the ith particle in the dth swarm is done by calling the function b(d, S_d·x_i), which returns a D-dimensional vector with its dth component replaced by S_d·x_i [22].
2.2. Parallel and Cooperative Particle Swarm Optimizer Algorithm. Since the CPSO method uses one-dimensional swarms to search each dimension, and interacting elements of the solution vector cannot be changed simultaneously, it is easy to fall into a local optimum. If we use the CPSO algorithm (called First1D) to solve a highly correlated problem, the vector X_L1 = [x_{L11}, x_{L12}, ..., x_{L1d}, ..., x_{L1D}], started from random values, will fall into a locally optimal vector. However, when another CPSO algorithm (called Second1D), which does not have any relationship with First1D, is used to solve the same problem, another locally optimal vector X_L2 = [x_{L21}, x_{L22}, ..., x_{L2d}, ..., x_{L2D}] will be obtained. Since the performance of CPSO depends on the numerical distribution when X = [x_1, x_2, ..., x_d, ..., x_D] is initialized with random
(1) Define b(d, z) ≡ (S_1·p_n, S_2·p_n, ..., S_{d−1}·p_n, z, S_{d+1}·p_n, ..., S_D·p_n)
(2) Create and initialize D one-dimensional PSOs: S_d, d ∈ [1, D]
(3) Repeat
(4)   for each swarm d ∈ [1, D]
(5)     for each particle i ∈ [1, M]
(6)       if f(b(d, S_d·x_i)) < f(b(d, S_d·p_i))
(7)         S_d·p_i = S_d·x_i
(8)       if f(b(d, S_d·p_i)) < f(b(d, S_d·p_n))
(9)         S_d·p_n = S_d·p_i
(10)    end for
(11)    Perform velocity and position updates using (2) for each particle in S_d
(12)  end for
(13) Until stopping condition is met

The dth swarm is denoted as S_d; S_d·x_i denotes the current position of the ith particle in the dth swarm; S_d·p_n represents the best position in the dth swarm; similarly, S_d·p_i represents the best previous position of particle i in the dth swarm. The context vector is the concatenation of S_1·p_n, S_2·p_n, ..., S_D·p_n.

Algorithm 1: Pseudocode for the generic CPSO algorithm.
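The context-vector evaluation b(d, z) at the heart of Algorithm 1 can be illustrated with a compact sketch. The per-dimension 1-D swarms, the parameter values, and the Sphere objective below are our own simplifications of the generic CPSO, not the authors' code.

```python
import random

random.seed(2)  # for reproducibility of this sketch

def cpso_minimize(f, dim, M=5, iters=100, w=0.7, c=1.494):
    """CPSO sketch: D one-dimensional swarms; a particle is judged by
    plugging its coordinate into the context vector via b(d, z)."""
    x = [[random.uniform(-5, 5) for _ in range(M)] for _ in range(dim)]  # S_d.x_i
    v = [[0.0] * M for _ in range(dim)]
    p = [row[:] for row in x]           # S_d.p_i, personal bests
    pn = [row[0] for row in x]          # S_d.p_n, per-swarm bests (context vector)

    def b(d, z):                        # context vector with the dth slot set to z
        ctx = pn[:]
        ctx[d] = z
        return ctx

    for _ in range(iters):
        for d in range(dim):
            for i in range(M):
                if f(b(d, x[d][i])) < f(b(d, p[d][i])):   # lines (6)-(7)
                    p[d][i] = x[d][i]
                if f(b(d, p[d][i])) < f(b(d, pn[d])):     # lines (8)-(9)
                    pn[d] = p[d][i]
                r1, r2 = random.random(), random.random()
                v[d][i] = (w * v[d][i] + c * r1 * (p[d][i] - x[d][i])
                           + c * r2 * (pn[d] - x[d][i]))  # line (11)
                x[d][i] += v[d][i]
    return pn, f(pn)

sphere = lambda xs: sum(t * t for t in xs)
ctx, val = cpso_minimize(sphere, dim=4)
```

Since each dimension of the Sphere function is independent, the concatenated context vector converges toward the global optimum, in line with the separability assumption discussed in the text.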
Table 1: Best combination levels of the Sphere function obtained by the OED method.

Combination |    x1     |     x2     |    x3     | Function value
C1          | (X_L1) 2  | (X_L1)  0  | (X_L1) 1  | F1 = 5
C2          | (X_L1) 2  | (X_L2) −2  | (X_L2) 0  | F2 = 8
C3          | (X_L2) 3  | (X_L1)  0  | (X_L2) 0  | F3 = 9
C4          | (X_L2) 3  | (X_L2) −2  | (X_L1) 1  | F4 = 14

Levels  | Factor analysis
L1      | S11 = (F1 + F2)/2 = 6.5  | S21 = (F1 + F3)/2 = 7  | S31 = (F1 + F4)/2 = 9.5
L2      | S12 = (F3 + F4)/2 = 11.5 | S22 = (F2 + F4)/2 = 11 | S32 = (F2 + F3)/2 = 8.5
Best    | (X_L1) 2  | (X_L1)  0  | (X_L2) 0  | F_final = 4
values, the independent vectors X_L1 and X_L2, each containing good information for the optimum search, will fall into different locally optimal vectors. A method is therefore needed to extract the good information from X_L1 and X_L2 and form a new vector X_Ln which is closer to the globally optimal value. If we exhaustively tested all combinations of X_L1 and X_L2 for the new vector X_Ln, there would be 2^D trials, which is unrealistic to accomplish in practice. With the help of OED [15, 16, 23, 24], a relatively good vector X_Ln can be obtained from X_L1 and X_L2 using only a few experimental tests.

The OED, with both the orthogonal array (OA) and the factor analysis, makes it possible to discover the best combination of different factors with only a small number of experimental samples [23, 24]. Here we give a brief discussion of OED combined with an example using the vectors X_L1 and X_L2. To simplify the problem, assume D = 3 and that the optimization problem is to minimize the Sphere function f(X) = x_1^2 + x_2^2 + x_3^2, with X_L1 = [2, 0, 1] and X_L2 = [3, −2, 0] derived from First1D and Second1D (in fact, First1D and Second1D should reach the same globally optimal vector because this test function is simple; we use it here only to explain the OED theory). The whole analysis of using OED for the Sphere function is shown in Table 1.
OA is a predefined table with numbers arranged in rows and columns. In Table 1, there are three factors x_1, x_2, and x_3 that affect the function value and two levels, X_L1 and X_L2, to choose from. The rows represent the levels of the factors in each combination, and the columns represent the specific factors that can be changed in each combination [15, 24]. The programming of generating an OA can be found in [24]. Factor analysis evaluates the effects of the individual factors based on the function values and determines the best level for each factor to form a new vector X_Ln. Let F_t denote the function value of combination t, where t = 1, ..., n and n stands for the total number of combinations. The main effect of factor d with level l, denoted S_dl, is expressed as

S_dl = (Σ_{t=1}^{n} F_t · C_dlt) / (Σ_{t=1}^{n} C_dlt),   (3)

where d = 1, ..., D and l = 1, 2; C_dlt = 1 if combination t uses level l for factor d, and C_dlt = 0 otherwise. In Table 1 we first calculate the effect S_11 of level 1 in factor x_1 and then the effect S_12 of level 2 in factor x_1. Since this is a minimization problem, level 1 is chosen for x_1 because its effect value S_11 is less than S_12 in Table 1. As the example in Table 1 shows, although the new vector X_Ln = [2, 0, 0] does not appear among the four combinations tested, this best combination can be obtained by the OED method.
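The factor analysis of (3) applied to the Table 1 example can be sketched as follows; the two-level L4(2^3) orthogonal array is written out explicitly, and the function and variable names are our own.

```python
# Two-level OED sketch reproducing the Table 1 example (Sphere, D = 3).
XL1 = [2, 0, 1]           # locally optimal vector from First1D
XL2 = [3, -2, 0]          # locally optimal vector from Second1D
f = lambda x: sum(v * v for v in x)

# L4(2^3) orthogonal array: rows = combinations, entries = level per factor
OA = [[0, 0, 0],
      [0, 1, 1],
      [1, 0, 1],
      [1, 1, 0]]
levels = [XL1, XL2]
F = [f([levels[row[d]][d] for d in range(3)]) for row in OA]  # F_t per combination

# main effect S_dl of factor d at level l, as in (3)
def S(d, l):
    vals = [F[t] for t in range(len(OA)) if OA[t][d] == l]
    return sum(vals) / len(vals)

# pick the level with the smaller effect for each factor (minimization)
XLn = [levels[0][d] if S(d, 0) <= S(d, 1) else levels[1][d] for d in range(3)]
print(XLn, f(XLn))  # -> [2, 0, 0] 4
```

The four evaluated combinations give F = [5, 8, 9, 14], matching Table 1, and the chosen levels yield the vector [2, 0, 0] with f = 4 even though that combination was never tested directly.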
OED works well when the interaction among the vector elements is small. When the OED method is used to process some
Figure 1: Flowchart of the PCPSO algorithm. Two parallel CPSO algorithms (First1D and Second1D) each operate for eighty iterations; the resulting vectors X_L1 and X_L2 are used for OED learning to create the new vector X_Ln; the better of X_L1 and X_L2 is kept as X_L1 and X_Ln is assigned to X_L2; the parallel runs are then repeated with X_L1 and X_L2 as context vectors, for i = 1 to 6.
functions with highly interacting vector elements, there is a problem that the function value of the new vector X_Ln may be bigger than the function values of the vectors X_L1 and X_L2. However, there are still two advantages of using OED in highly correlated problems: one is that the particle with the new vector X_Ln jumps out of the locally optimal vector; the other is that each element of the new vector X_Ln, chosen between X_L1 and X_L2, is probably closer to the globally optimal vector. Therefore, we try to use the new vector X_Ln as the context vector to repeat the CPSO algorithm.

Based on the analysis mentioned above, we propose the PCPSO algorithm to overcome the problem of falling into a local optimum. Figure 1 shows the flowchart of PCPSO. First, we run two parallel CPSO algorithms (First1D and Second1D); then OED learning is applied to form the new vector X_Ln, which is assigned to X_L2, while the better vector chosen from X_L1 and X_L2 is assigned to the new X_L1. The new vectors X_L1 and X_L2, treated as context vectors in the CPSO algorithm, keep the search going further. In order to illustrate the efficiency of the PCPSO algorithm, Figure 2 shows the convergence characteristic of PCPSO on Rotated Ackley's function. The detailed description of Rotated Ackley's function can be found in Section 3. The parameter settings for First1D and Second1D are the same as for the CPSO-H algorithm [19], where ω decreases linearly from 0.9 to 0.4, c_1 = c_2 = 1.494, M = 10 stands for the particle number, and D = 30 represents the dimensions. Assuming First1D and Second1D need 80 CPSO iterations to reach locally optimal vectors, followed by OED computation, there are 6 iterations of OED computation, and therefore the total number of CPSO iterations is 6 × 80 = 480. From Figure 2 we can see that the function value drops sharply in the first 80 iterations and that First1D and Second1D have similar performance. For the next 80 iterations, the better vector chosen from X_L1 and X_L2 is assigned to X_L1 for another First1D implementation, and the new vector X_Ln is assigned to X_L2 for another Second1D implementation. The performance of First1D and Second1D differs significantly in iterations 80 to 240: First1D gets trapped in a locally optimal vector, while Second1D still makes the function value drop sharply, even though it is higher at the start point at the 81st iteration. Since the First1D algorithm has almost lost its searching ability after 80 iterations, we can also adopt the strategy that First1D is only implemented for the first 80 iterations
Figure 2: The convergence characteristic of the PCPSO on Rotated Ackley's function in 30 dimensions (x-axis: iteration, 0-500; y-axis: function value). First1D and Second1D are CPSO algorithms.
in order to save computational cost. The vector X_L1 is used for saving the best optimal vector ever found and is combined with the vector X_L2 into the new vector X_Ln by the OED method after 80 iterations. These operations iterate generation by generation until some criteria are met.

Compared with other PSO algorithms using OED [15, 16], PCPSO needs some extra computation for updating the vector X_L1. However, the cooperation between the local vectors X_L1 and X_L2 makes the search more effective. In [15], the OED result is set as an exemplar for the other particles to learn from. From Figure 2 we can see that sometimes the function values after OED are even bigger. Another limitation of [15] is that the searching method gets trapped in a local optimum easily, because the two vectors used for OED learning are similar. Based on the principle that each element of the function vector moves closer to the globally optimal vector after OED, it can be seen that the new vector X_Ln jumps out of the local optimum and keeps searching further, even though its function value is higher than the corresponding local optimum at the starting iteration.
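The alternation described above can be sketched in a few lines of Python. This is an illustration of the outer loop only, not the authors' implementation: `local_search` is a simplified stand-in for a full First1D/Second1D CPSO pass (a greedy coordinate search), and `combine` stands in for OED learning with a per-element greedy pick instead of a true orthogonal array.

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def local_search(f, x, step=1.0, iters=25):
    # Stand-in for one CPSO pass (First1D / Second1D): refine the
    # context vector one coordinate at a time, halving the step size.
    x = list(x)
    for _ in range(iters):
        for d in range(len(x)):
            improved = True
            while improved:
                improved = False
                for cand in (x[d] - step, x[d] + step):
                    trial = list(x)
                    trial[d] = cand
                    if f(trial) < f(x):
                        x, improved = trial, True
                        break
        step *= 0.5
    return x

def combine(f, xa, xb):
    # Stand-in for OED learning: choose, element by element, the level
    # (value taken from xa or xb) that lowers the function value.
    xn = list(xa)
    for d in range(len(xa)):
        trial = list(xn)
        trial[d] = xb[d]
        if f(trial) < f(xn):
            xn = trial
    return xn

def pcpso_outline(f, dim, rounds=6, seed=1):
    rng = random.Random(seed)
    xl1 = [rng.uniform(-5, 5) for _ in range(dim)]
    xl2 = [rng.uniform(-5, 5) for _ in range(dim)]
    for _ in range(rounds):
        xl1 = local_search(f, xl1)    # First1D pass
        xl2 = local_search(f, xl2)    # Second1D pass
        xln = combine(f, xl1, xl2)    # OED-style combination
        # Better of XL1/XL2 becomes the new XL1; XLn becomes the new XL2.
        xl1 = min([xl1, xl2], key=f)
        xl2 = xln
    return min([xl1, xl2], key=f)

best = pcpso_outline(sphere, dim=5)
print(sphere(best))
```

The key structural point carried over from PCPSO is that both search tracks are restarted from vectors that mix the two previous local optima, rather than from a single best position.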
Mathematical Problems in Engineering 5
3. Experimental Results and Discussions

3.1. Test Functions. Test functions, including unimodal functions and multimodal functions, are used to investigate the performance of PSO algorithms [25]. We choose 15 test functions on 30 dimensions. The formulas and the properties of these functions are presented below.

Group A: unimodal and simple multimodal functions are as follows.
(1) Sphere function:

    f_1(x) = \sum_{i=1}^{D} x_i^2.    (4)

(2) Rosenbrock's function:

    f_2(x) = \sum_{i=1}^{D-1} [100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2].    (5)

(3) Griewank's function:

    f_3(x) = \sum_{i=1}^{D} x_i^2 / 4000 - \prod_{i=1}^{D} \cos(x_i / \sqrt{i}) + 1.    (6)

(4) Rastrigin's function:

    f_4(x) = \sum_{i=1}^{D} (x_i^2 - 10 \cos(2\pi x_i) + 10).    (7)

(5) Ackley's function:

    f_5(x) = -20 \exp(-0.2 \sqrt{(1/D) \sum_{i=1}^{D} x_i^2}) - \exp((1/D) \sum_{i=1}^{D} \cos(2\pi x_i)) + 20 + e.    (8)

(6) Weierstrass function:

    f_6(x) = \sum_{i=1}^{D} (\sum_{k=0}^{k_max} [a^k \cos(2\pi b^k (x_i + 0.5))]) - D \sum_{k=0}^{k_max} [a^k \cos(\pi b^k)],
    a = 0.5, b = 3, k_max = 20.    (9)

(7) Noncontinuous Rastrigin's function:

    f_7(x) = \sum_{i=1}^{D} (y_i^2 - 10 \cos(2\pi y_i) + 10),
    y_i = x_i if |x_i| < 1/2, y_i = round(2 x_i)/2 if |x_i| >= 1/2, for i = 1, 2, ..., D.    (10)

(8) Schwefel's function:

    f_8(x) = 418.9829 \times D - \sum_{i=1}^{D} x_i \sin(|x_i|^{1/2}).    (11)
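As a reference, the Group A definitions (4)-(11) translate directly into Python. The snippet below is a plain transcription for checking function values, not part of the PCPSO implementation; note that Python's built-in `round` uses banker's rounding, which differs from the conventional round(·) in (10) only at exact half-integers.

```python
import math

def sphere(x):                      # eq. (4)
    return sum(v * v for v in x)

def rosenbrock(x):                  # eq. (5)
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))

def griewank(x):                    # eq. (6)
    s = sum(v * v for v in x) / 4000
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x))
    return s - p + 1

def rastrigin(x):                   # eq. (7)
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

def ackley(x):                      # eq. (8)
    d = len(x)
    return (-20 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / d))
            - math.exp(sum(math.cos(2 * math.pi * v) for v in x) / d)
            + 20 + math.e)

def weierstrass(x, a=0.5, b=3, kmax=20):    # eq. (9)
    s = sum(a ** k * math.cos(2 * math.pi * b ** k * (v + 0.5))
            for v in x for k in range(kmax + 1))
    return s - len(x) * sum(a ** k * math.cos(math.pi * b ** k)
                            for k in range(kmax + 1))

def noncont_rastrigin(x):           # eq. (10); round() is banker's rounding
    y = [v if abs(v) < 0.5 else round(2 * v) / 2 for v in x]
    return rastrigin(y)

def schwefel(x):                    # eq. (11)
    return 418.9829 * len(x) - sum(v * math.sin(math.sqrt(abs(v)))
                                   for v in x)

print(sphere([0] * 30), rosenbrock([1] * 30))  # 0 at the global optima
```

Evaluating each function at its optimum from Table 2 (e.g. the origin, or [1, ..., 1] for Rosenbrock's) is a quick sanity check of the transcription.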
Group B: rotated and shifted multimodal functions are as follows.

In Group A, the functions can be divided into smaller search spaces because of the low interaction among the elements. Therefore, the CPSO algorithm is suitable for optimizing these functions. In Group B, six rotated multimodal functions are generated by an orthogonal matrix M to test the performance of the PCPSO algorithm. The new rotated vector y = M * x, which is obtained through the original vector x left-multiplied by the orthogonal matrix M, behaves like a highly interacting vector, because all elements in vector y will change once one element in vector x changes. A detailed illustration of generating the orthogonal matrix is introduced in [26]. Meanwhile, one shifted and rotated function is also listed in Group B to test the performance of the PSO algorithms. Table 2 shows the globally optimal vectors x*, the corresponding function values f(x*), the search range, and the initialization range of each function. Consider the following.
(9) Rotated Ackley's function:

    f_9(x) = -20 \exp(-0.2 \sqrt{(1/D) \sum_{i=1}^{D} y_i^2}) - \exp((1/D) \sum_{i=1}^{D} \cos(2\pi y_i)) + 20 + e,
    y = M * x.    (12)

(10) Rotated Rastrigin's function:

    f_10(x) = \sum_{i=1}^{D} (y_i^2 - 10 \cos(2\pi y_i) + 10),  y = M * x.    (13)

(11) Rotated Griewank's function:

    f_11(x) = \sum_{i=1}^{D} y_i^2 / 4000 - \prod_{i=1}^{D} \cos(y_i / \sqrt{i}) + 1,  y = M * x.    (14)

(12) Rotated Weierstrass function:

    f_12(x) = \sum_{i=1}^{D} (\sum_{k=0}^{k_max} [a^k \cos(2\pi b^k (y_i + 0.5))]) - D \sum_{k=0}^{k_max} [a^k \cos(\pi b^k)],
    a = 0.5, b = 3, k_max = 20,  y = M * x.    (15)
Table 2: Globally optimal vector x*, corresponding function value f(x*), search range, and initialization range of test functions.

Function   x*                              f(x*)   Search range        Initialization range
f_1        [0, 0, ..., 0]                  0       [-100, 100]^D       [-100, 50]^D
f_2        [1, 1, ..., 1]                  0       [-2, 2]^D           [-2, 2]^D
f_3        [0, 0, ..., 0]                  0       [-600, 600]^D       [-600, 200]^D
f_4        [0, 0, ..., 0]                  0       [-5.12, 5.12]^D     [-5.12, 2]^D
f_5        [0, 0, ..., 0]                  0       [-32, 32]^D         [-32, 16]^D
f_6        [0, 0, ..., 0]                  0       [-0.5, 0.5]^D       [-0.5, 0.2]^D
f_7        [0, 0, ..., 0]                  0       [-5.12, 5.12]^D     [-5.12, 2]^D
f_8        [420.96, 420.96, ..., 420.96]   0       [-500, 500]^D       [-500, 500]^D
f_9        [0, 0, ..., 0]                  0       [-32, 32]^D         [-32, 16]^D
f_10       [0, 0, ..., 0]                  0       [-5.12, 5.12]^D     [-5.12, 2]^D
f_11       [0, 0, ..., 0]                  0       [-600, 600]^D       [-600, 200]^D
f_12       [0, 0, ..., 0]                  0       [-0.5, 0.5]^D       [-0.5, 0.2]^D
f_13       [0, 0, ..., 0]                  0       [-5.12, 5.12]^D     [-5.12, 2]^D
f_14       [420.96, 420.96, ..., 420.96]   0       [-500, 500]^D       [-500, 500]^D
f_15       [o_1, o_2, ..., o_D]            -330    [-5.12, 5.12]^D     [-5.12, 2]^D
(13) Rotated noncontinuous Rastrigin's function:

    f_13(x) = \sum_{i=1}^{D} (z_i^2 - 10 \cos(2\pi z_i) + 10),
    z_i = y_i if |y_i| < 1/2, z_i = round(2 y_i)/2 if |y_i| >= 1/2, for i = 1, 2, ..., D,
    y = M * x.    (16)

(14) Rotated Schwefel's function:

    f_14(x) = 418.9829 \times D - \sum_{i=1}^{D} z_i,
    z_i = y_i \sin(|y_i|^{1/2}) if |y_i| <= 500, z_i = 0 otherwise, for i = 1, 2, ..., D,
    y = y' + 420.96,  y' = M * (x - 420.96).    (17)

(15) Rotated and Shifted Rastrigin's function:

    f_15(x) = \sum_{i=1}^{D} (z_i^2 - 10 \cos(2\pi z_i) + 10) - 330,
    z = M * y,  y = x - o,  o = [o_1, o_2, ..., o_D],    (18)

where o = [o_1, o_2, ..., o_D] is the shifted globally optimal vector.
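The effect of the rotation y = M * x can be illustrated with a small hand-rolled orthogonal matrix: a single Givens rotation embedded in the identity. The paper generates M with Salomon's method [26], so the matrix below is only a stand-in sharing the defining property M^T M = I.

```python
import math

def givens_matrix(dim, i, j, theta):
    # Identity with a 2x2 rotation block in rows/columns i and j:
    # an easy-to-build orthogonal matrix (M^T M = I).
    m = [[float(r == c) for c in range(dim)] for r in range(dim)]
    m[i][i] = m[j][j] = math.cos(theta)
    m[i][j] = -math.sin(theta)
    m[j][i] = math.sin(theta)
    return m

def matvec(m, x):
    return [sum(row[c] * x[c] for c in range(len(x))) for row in m]

def transpose(m):
    return [list(col) for col in zip(*m)]

def rastrigin(x):
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

def rotated_rastrigin(x, m):    # eq. (13): f_10(x) = Rastrigin(M * x)
    return rastrigin(matvec(m, x))

D = 5
M = givens_matrix(D, 0, 1, 0.7)
# The optimum stays at the origin, since M * 0 = 0:
print(rotated_rastrigin([0.0] * D, M))    # 0.0
# Rotating a point back reproduces the unrotated value, while a change
# in one element of x now moves two elements of y, coupling them.
p = [1.0, 2.0, 0.0, 0.0, 0.0]
x = matvec(transpose(M), p)               # chosen so that M * x == p
print(abs(rotated_rastrigin(x, M) - rastrigin(p)))
```

This is exactly why the rotated functions cannot be decomposed into independent one-dimensional subproblems, which motivates PCPSO over plain CPSO in Group B.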
3.2. PCPSO's Performance on 15 Functions. The performance of PCPSO is tested on functions f_1-f_15, which include multimodal, rotated, and shifted functions in 30 dimensions.
Figure 3: The convergence characteristics of the PCPSO algorithm on test functions f_1-f_8 in 30 iterations (y-axis: log(f(x)); curves: Sphere, Rosenbrock, Griewank, Rastrigin's, Ackley, Weierstrass, Noncontinuous Rastrigin's, and Schwefel's functions).
The parameter settings for the PCPSO algorithm are as follows: ω decreases linearly from 0.9 to 0.4, c1 = c2 = 1.494, and the particle number M = 10. Since the vectors of f_1-f_8 have very low interaction among their elements, Figure 3 shows the convergence characteristics of the PCPSO algorithm on test functions f_1-f_8 in 30 iterations, where the y-axis is expressed in log format. From Figure 3 we can see that all the function values drop significantly within 30 iterations except for Rosenbrock's function. In order to fully test PCPSO's performance on Rosenbrock's function, we make a comparison between the PCPSO and CPSO algorithms in Figure 4. The upper plot of Figure 4 shows the convergence characteristic of PCPSO, which cooperates with
Figure 4: The convergence characteristics of PCPSO and CPSO algorithms on Rosenbrock's function: (a) PCPSO (curves First1D and Second1D); (b) CPSO.
the two independent algorithms (First1D and Second1D) by using OED learning; the lower plot is the convergence characteristic of the CPSO algorithm. From Figure 4 we can see that the OED learning makes the function vector jump out of the locally optimal vector and finally drops the function value. The convergence characteristics for the remaining six test functions f_10-f_15 are shown in Figure 5. All these six functions are rotated, multimodal, and nonseparable functions. Similar to Figure 2, despite the fact that the function value of Second1D is increased in some cases at the start point, caused by OED learning, it drops sharply with the increase of iterations. This phenomenon is especially obvious in test function f_12. To sum up, PCPSO works well in terms of avoiding locally optimal vectors and dropping the function value sharply. The OED learning finds the best combination of the two locally optimal vectors X_L1 and X_L2 in iterations 80, 160, and 240. However, there are still some drawbacks causing the vector to fall into a locally optimal vector, as in f_14; the reason is that once the vectors X_L1 and X_L2 are equal to each other, the OED learning ability is finally lost.
3.3. Comparison with Other PSOs and Discussions. Seven PSO variants are taken from the literature for comparison on the 15 test functions with 30 dimensions. In the following, we briefly describe the peer algorithms shown in Table 3. The first algorithm is the standard PSO (SPSO), whose performance is improved by using a random topology. The second algorithm, fully informed PSO (FIPS), uses all the neighbor particles to influence the flying velocity. The third, orthogonal PSO (OPSO), aims to generate a better position by using OED. The fourth algorithm, fitness-distance-ratio PSO (FDR_PSO), solves the premature convergence problem
Table 3: Parameter settings for the seven PSO variants.

Algorithm      Parameter settings
SPSO [27]      ω: 0.4-0.9, c1 = c2 = 1.193
FIPS [10]      χ = 0.7298, Σ c_i = 4.1
OPSO [16]      ω: 0.4-0.9, c1 = c2 = 2.0
FDR_PSO [13]   ω: 0.4-0.9, Σ c_i = 4.0
CLPSO [6]      ω: 0.4-0.9, c1 = c2 = 2.0
CPSO_H [19]    ω: 0.4-0.9, c = 1.49, κ = 6
OLPSO [15]     ω: 0.4-0.9, c = 2.0, G = 5
by using the fitness-distance ratio. The fifth, comprehensive learning PSO (CLPSO), is proposed for solving multimodal problems. The sixth, cooperative PSO (CPSO_H), uses one-dimensional swarms to search each dimension separately. The seventh, orthogonal learning PSO (OLPSO), can guide particles to fly towards an exemplar which is constructed from P_i and P_n using the OED method.
The swarm size is set to 40 for the seven PSO variants mentioned above and to 10 for the PCPSO algorithm. The maximal number of FEs for the eight algorithms is set to 200,000 in each run of each test function. All functions were run 25 times, and the mean values and standard deviations of the results are presented in Table 4. The best results are shown in boldface. From the results we observe that CPSO_H, CLPSO, and OLPSO perform well on the first 8 functions. The reason is that CPSO_H suits 1-dimensional search well, while CLPSO and OLPSO can search further by using their own learning strategies. The learning strategy in CLPSO is to prevent falling into a local optimum by random learning, while in OLPSO the strategy is to find a better combination of locally optimal vectors. The PCPSO method can keep searching through the cooperation of the two locally optimal vectors X_L1 and X_L2. With the implementation of Second1D, the algorithm can always find a better vector X_L2 compared with the previous best one X_L1 by using the OED method. The PCPSO method performs well on f_2, f_3, f_9-f_11, f_13, and f_15. However, the function values obtained by this method are not close enough to the globally optimal values; premature convergence still happens in f_12 and f_14 due to the lack of vector diversity. The advantages of this method are its rapid convergence speed and robustness: whatever the test functions are, the algorithm can find an acceptable optimum, especially when facing some complex functions. The nonparametric Wilcoxon rank sum tests are taken as the t-test in Table 5 to determine whether the results obtained by PCPSO are statistically different from the best results achieved by the other algorithms. A value of one indicates that the performance of the two algorithms is statistically different with 95% certainty, whereas a value of zero implies that the performance is not statistically different. Among these 15 functions, there are 10 functions that are statistically different between the best results and the results of PCPSO. From Tables 4 and 5 we can see that PCPSO obtains the best results on 5 functions (f_2, f_3, f_10, f_11, f_13) that are statistically different from the others. In addition, most of the best functions obtained by PCPSO are focused on
Figure 5: The convergence progress of 30-dimensional test functions using PCPSO (curves First1D and Second1D): (a) Rotated Rastrigin's function; (b) Rotated Griewank's function; (c) Rotated Weierstrass function; (d) Rotated noncontinuous Rastrigin's function; (e) Rotated Schwefel's function; (f) Rotated and Shifted Rastrigin's function.
Table 4: Results for 30-D functions (mean ± standard deviation over 25 runs). The best results are shown in boldface (marked here with *).

f      SPSO                        FIPS                        OPSO                        FDR_PSO
f_1    1.16e-069 ± 2.86e-068       2.69e-012 ± 2.28e-011       6.71e-018 ± 6.82e-018       *4.88e-102 ± 1.53e-101
f_2    1.68e+001 ± 8.56e+000       2.47e+001 ± 2.19e-001       4.12e+001 ± 1.19e+001       1.54e+001 ± 6.79e+000
f_3    1.73e-002 ± 1.08e-001       4.65e-003 ± 3.71e-002       2.10e-003 ± 6.19e-003       1.72e-002 ± 9.08e-003
f_4    4.61e+001 ± 7.85e+001       7.44e+001 ± 1.56e+001       3.27e+000 ± 6.35e+000       2.91e+001 ± 8.06e+000
f_5    1.27e+000 ± 4.79e+000       5.12e-007 ± 7.81e-007       3.74e-009 ± 5.96e-009       2.85e-014 ± 4.32e-015
f_6    2.38e+000 ± 8.04e+000       3.13e-001 ± 3.74e-001       3.28e-002 ± 4.93e-002       7.75e-003 ± 1.21e-002
f_7    5.21e+001 ± 4.19e+001       6.38e+001 ± 9.05e+000       4.23e+000 ± 8.24e+000       1.45e+001 ± 6.58e+000
f_8    3.76e+003 ± 3.43e+003       2.56e+003 ± 7.31e+002       2.72e+003 ± 3.31e+003       3.16e+003 ± 4.65e+002
f_9    1.35e+000 ± 1.02e+000       7.26e-005 ± 1.42e-005       4.07e-005 ± 8.26e-006       3.63e-001 ± 5.72e-001
f_10   8.38e+001 ± 1.53e+002       1.75e+002 ± 5.87e+001       1.64e+002 ± 3.98e+001       4.67e+001 ± 1.44e+001
f_11   1.16e-002 ± 1.28e-002       6.95e-005 ± 3.21e-005       1.13e-003 ± 2.45e-003       8.79e-003 ± 1.07e-002
f_12   2.83e+001 ± 5.31e+000       1.35e+001 ± 8.39e+000       1.21e+001 ± 1.05e+001       2.35e+000 ± 1.41e+000
f_13   5.51e+001 ± 2.07e+001       8.74e+001 ± 1.92e+001       7.67e+001 ± 7.43e+001       4.64e+001 ± 9.01e+000
f_14   5.59e+003 ± 6.07e+002       2.67e+003 ± 7.72e+002       2.45e+003 ± 2.89e+002       3.80e+003 ± 6.18e+002
f_15   -2.49e+002 ± 4.76e+001      -1.07e+002 ± 1.52e+002      -2.26e+002 ± 1.07e+001      -2.71e+002 ± 3.69e+001

f      CLPSO                       CPSO_H                      OLPSO                       PCPSO
f_1    4.46e-014 ± 1.51e-014       1.16e-033 ± 2.26e-033       1.16e-038 ± 1.56e-038       1.12e-030 ± 5.14e-031
f_2    2.10e+001 ± 2.71e+000       2.06e+001 ± 1.13e+001       1.74e+001 ± 1.05e+001       *1.12e+001 ± 8.19e+000
f_3    3.14e-010 ± 4.64e-010       3.63e-002 ± 7.11e-002       3.17e-010 ± 6.23e-011       *3.36e-012 ± 7.51e-013
f_4    4.76e-010 ± 2.17e-010       *0 ± 0                      *0 ± 0                      2.21e-012 ± 6.14e-013
f_5    4.97e-014 ± 2.12e-014       4.56e-014 ± 1.25e-014       *4.76e-015 ± 1.57e-016      4.83e-008 ± 1.38e-008
f_6    4.34e-007 ± 2.09e-007       *8.15e-015 ± 8.92e-016      1.27e-002 ± 2.79e-002       2.41e-011 ± 1.09e-011
f_7    *4.36e-010 ± 2.44e-010      1.21e-001 ± 3.19e-001       4.51e-008 ± 1.17e-008       2.65e-007 ± 2.27e-007
f_8    *1.26e-012 ± 8.62e-013      1.08e+003 ± 2.27e+002       3.88e-004 ± 6.21e-005       3.82e-004 ± 7.31e-006
f_9    3.47e-004 ± 2.11e-004       2.13e+000 ± 3.55e-001       2.28e-005 ± 1.14e-005       *2.25e-006 ± 1.08e-006
f_10   3.35e+001 ± 5.58e+000       8.04e+001 ± 2.23e+001       5.77e+001 ± 1.27e+001       *1.63e+000 ± 5.18e+000
f_11   7.07e-005 ± 1.18e-005       5.54e-002 ± 4.79e-002       2.73e-005 ± 3.08e-005       *1.22e-005 ± 5.15e-006
f_12   *3.09e+000 ± 1.51e+000      1.53e+001 ± 3.62e+000       7.03e+000 ± 1.41e+000       9.37e+000 ± 3.58e+000
f_13   3.53e+001 ± 5.52e+000       8.60e+001 ± 2.61e+001       4.03e+001 ± 1.34e+001       *1.07e+001 ± 2.37e+000
f_14   *2.46e+003 ± 1.88e+003      3.82e+003 ± 6.62e+002       3.15e+003 ± 1.27e+003       3.26e+003 ± 1.23e+003
f_15   -2.97e+002 ± 2.13e+001      -2.38e+002 ± 7.42e+001      -2.76e+002 ± 3.47e+001      *-3.08e+002 ± 1.02e+001
Table 5: t-test on 15 functions.

Function   f_1   f_2   f_3   f_4   f_5   f_6   f_7   f_8
t-test     1     1     1     1     1     0     0     1

Function   f_9   f_10  f_11  f_12  f_13  f_14  f_15
t-test     0     1     1     1     1     0     0
nonseparable multimodal problems, which demonstrates the effectiveness and robustness of the PCPSO algorithm.
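The significance test used in Table 5 can be reproduced from the 25 recorded runs with a rank-sum statistic. The sketch below implements the normal approximation of the Wilcoxon rank-sum test in plain Python (mean ranks for ties, no continuity correction); it is a simplification of what a statistics package (e.g. SciPy's `ranksums`) would compute.

```python
import math

def rank_sum_z(a, b):
    # Wilcoxon rank-sum test, normal approximation.
    # Returns the z statistic; |z| > 1.96 indicates a statistically
    # different performance at the 95% level.
    pooled = sorted((v, src) for src, xs in ((0, a), (1, b)) for v in xs)
    ranks = {}
    k = 0
    while k < len(pooled):
        j = k
        while j < len(pooled) and pooled[j][0] == pooled[k][0]:
            j += 1
        mean_rank = (k + 1 + j) / 2.0    # tied values share their mean rank
        for t in range(k, j):
            ranks[t] = mean_rank
        k = j
    w = sum(ranks[t] for t, (v, src) in enumerate(pooled) if src == 0)
    n1, n2 = len(a), len(b)
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (w - mu) / sigma

# Clearly separated samples give a large |z| ...
print(rank_sum_z(list(range(1, 11)), list(range(11, 21))))
# ... and identical samples give z = 0.
print(rank_sum_z([5, 5, 5], [5, 5, 5]))
```

In the paper's setting, `a` and `b` would be the 25 final function values of PCPSO and of the best-performing competitor on a given test function.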
4. Conclusion

In this paper, a novel PCPSO algorithm is presented in order to overcome the problem of falling into a local optimum. In PCPSO, considering the complex interaction among the elements of the vector, the CPSO algorithm (First1D and Second1D) is used twice to create two locally optimal vectors. These two vectors are combined together by OED learning so that the best elements can be chosen to jump out of the local optimum and search further.

Although an OED operation may cause the function value to be higher than the local optima in some cases, the function value will always drop sharply after OED, because the OED operation chooses the elements that are closer to the globally optimal vector. Experimental tests have been conducted on 15 benchmark functions, including unimodal, multimodal, rotated, and shifted functions. From the comparative results, it can be concluded that PCPSO significantly improves the performance of PSO and provides a better solution on most rotated multimodal problems.
Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments

This work was supported in part by the National Natural Science Foundation of China (51275353), the Macao Science and Technology Development Fund (108/2012/A3, 110/2013/A3), and the Research Committee of University of Macau (MYRG183(Y1-L3)/FST11-LYM, MYRG203(Y1-L4)/FST11-LYM).
References

[1] R. C. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39-43, Nagoya, Japan, October 1995.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, December 1995.
[3] A. Adam, M. I. Shapiai, M. Z. M. Tumari, M. S. Mohamad, and M. Mubin, "Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization," The Scientific World Journal, vol. 2014, Article ID 973063, 13 pages, 2014.
[4] N. Geng, D. W. Gong, and Y. Zhang, "PSO-based robot path planning for multisurvivor rescue in limited survival time," Mathematical Problems in Engineering, vol. 2014, Article ID 187370, 10 pages, 2014.
[5] M. Shoaib and W.-C. Song, "Data aggregation for Vehicular Ad-hoc Network using particle swarm optimization," in Proceedings of the 14th Asia-Pacific Network Operations and Management Symposium, pp. 1-6, September 2012.
[6] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281-295, 2006.
[7] Y. H. Shi and R. C. Eberhart, "Parameter selection in particle swarm optimization," in Evolutionary Programming VII, vol. 1447 of Lecture Notes in Computer Science, pp. 591-600, Springer, Berlin, Germany, 1998.
[8] C. Li, S. Yang, and T. T. Nguyen, "A self-learning particle swarm optimizer for global optimization problems," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 42, no. 3, pp. 627-646, 2012.
[9] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150-169, 2010.
[10] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204-210, 2004.
[11] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the Congress on Evolutionary Computation (CEC '02), pp. 1671-1676, Honolulu, Hawaii, USA, May 2002.
[12] T. M. Blackwell and J. Branke, "Multiswarms, exclusion, and anti-convergence in dynamic environments," IEEE Transactions on Evolutionary Computation, vol. 10, no. 4, pp. 459-472, 2006.
[13] T. Peram, K. Veeramachaneni, and C. K. Mohan, "Fitness-distance-ratio based particle swarm optimization," in Proceedings of the IEEE Swarm Intelligence Symposium (SIS '03), pp. 174-181, IEEE, April 2003.
[14] X. Chen and Y. M. Li, "A modified PSO structure resulting in high exploration ability with convergence guaranteed," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 37, no. 5, pp. 1271-1289, 2007.
[15] Z.-H. Zhan, J. Zhang, Y. Li, and Y.-H. Shi, "Orthogonal learning particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 15, no. 6, pp. 832-847, 2011.
[16] S. Y. Ho, H. S. Lin, W. H. Liauh, and S. J. Ho, "OPSO: orthogonal particle swarm optimization and its application to task assignment problems," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 38, no. 2, pp. 288-298, 2008.
[17] Z.-H. Zhan, J. Zhang, Y. Li, and H. S.-H. Chung, "Adaptive particle swarm optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 39, no. 6, pp. 1362-1381, 2009.
[18] M. A. Potter and K. A. De Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249-257, Springer, Berlin, Germany, 1994.
[19] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225-239, 2004.
[20] G. Zhang and Y. M. Li, "Cooperative particle swarm optimizer with elimination mechanism for global optimization of multimodal problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 210-217, Beijing, China, July 2014.
[21] G. Zhang and Y. M. Li, "Orthogonal experimental design method used in particle swarm optimization for multimodal problems," in Proceedings of the 6th International Conference on Advanced Computational Intelligence (ICACI '13), pp. 183-188, Hangzhou, China, October 2013.
[22] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210-224, 2012.
[23] D. C. Montgomery, Design and Analysis of Experiments, John Wiley & Sons, New York, NY, USA, 5th edition, 2000.
[24] S.-Y. Ho, L.-S. Shu, and J.-H. Chen, "Intelligent evolutionary algorithms for large parameter optimization problems," IEEE Transactions on Evolutionary Computation, vol. 8, no. 6, pp. 522-541, 2004.
[25] X. Yao, Y. Liu, and G. M. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.
[26] R. Salomon, "Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions: a survey of some theoretical and practical aspects of genetic algorithms," BioSystems, vol. 39, no. 3, pp. 263-278, 1996.
[27] SPSO, 2007, http://www.particleswarm.info/Programs.html.
(1)  Define b(d, z) ≡ (S_1·p_n, S_2·p_n, ..., S_{d-1}·p_n, z, S_{d+1}·p_n, ..., S_D·p_n)
(2)  Create and initialize D one-dimensional PSOs: S_d, d ∈ [1, D]
(3)  Repeat
(4)    for each swarm d ∈ [1, D]
(5)      for each particle i ∈ [1, M]
(6)        if f(b(d, S_d·x_i)) < f(b(d, S_d·p_i))
(7)          S_d·p_i = S_d·x_i
(8)        if f(b(d, S_d·p_i)) < f(b(d, S_d·p_n))
(9)          S_d·p_n = S_d·p_i
(10)     end for
(11)     Perform velocity and position updates using (2) for each particle in S_d
(12)   end for
(13) Until stopping condition is met

The dth swarm is denoted as S_d; S_d·x_i denotes the current position of the ith particle in the dth swarm; S_d·p_n represents the best position in the dth swarm. Similarly, S_d·p_i represents the best previous position of particle i in the dth swarm. The context vector is the concatenation of S_1·p_n, S_2·p_n, ..., S_D·p_n.

Algorithm 1: Pseudocode for the generic CPSO algorithm.
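Algorithm 1 translates almost line for line into Python. The sketch below uses the standard inertia-weight velocity update in place of the paper's update rule (2), which is not reproduced in this excerpt; the constants ω = 0.72 and c1 = c2 = 1.49, the swarm sizes, the search range, and the Sphere objective are all illustrative assumptions.

```python
import random

def cpso(f, dim, n_particles=10, iters=60, lo=-5.0, hi=5.0, seed=3):
    rng = random.Random(seed)
    w, c1, c2 = 0.72, 1.49, 1.49           # assumed update constants
    # One 1-D swarm per dimension: positions x, velocities v,
    # personal bests p, and per-swarm bests pn (the context vector).
    x = [[rng.uniform(lo, hi) for _ in range(n_particles)]
         for _ in range(dim)]
    v = [[0.0] * n_particles for _ in range(dim)]
    p = [row[:] for row in x]
    pn = [x[d][0] for d in range(dim)]

    def b(d, z):
        # Context vector with dimension d replaced by candidate z.
        ctx = pn[:]
        ctx[d] = z
        return ctx

    for _ in range(iters):
        for d in range(dim):
            for i in range(n_particles):
                if f(b(d, x[d][i])) < f(b(d, p[d][i])):
                    p[d][i] = x[d][i]
                if f(b(d, p[d][i])) < f(b(d, pn[d])):
                    pn[d] = p[d][i]
            for i in range(n_particles):
                v[d][i] = (w * v[d][i]
                           + c1 * rng.random() * (p[d][i] - x[d][i])
                           + c2 * rng.random() * (pn[d] - x[d][i]))
                x[d][i] += v[d][i]
    return pn

def sphere(x):
    return sum(u * u for u in x)

best = cpso(sphere, dim=5)
print(sphere(best))
```

Note how every fitness evaluation goes through b(d, ·): a one-dimensional particle is only ever judged in the context of the other swarms' current bests, which is what makes the decomposition cooperative rather than independent.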
Table 1: Best combination levels of the Sphere function by using the OED method.

Combination   x_1         x_2          x_3         Function value
C_1           (X_L1) 2    (X_L1) 0     (X_L1) 1    F_1 = 5
C_2           (X_L1) 2    (X_L2) -2    (X_L2) 0    F_2 = 8
C_3           (X_L2) 3    (X_L1) 0     (X_L2) 0    F_3 = 9
C_4           (X_L2) 3    (X_L2) -2    (X_L1) 1    F_4 = 14

Levels   Factor analysis
L_1      S_11 = (F_1 + F_2)/2 = 6.5    S_21 = (F_1 + F_3)/2 = 7     S_31 = (F_1 + F_4)/2 = 9.5
L_2      S_12 = (F_3 + F_4)/2 = 11.5   S_22 = (F_2 + F_4)/2 = 11    S_32 = (F_2 + F_3)/2 = 8.5
L_3      (X_L1) 2    (X_L1) 0    (X_L2) 0    F_final = 4
values, the independent vectors X_L1 and X_L2, which contain good information for the optimum search, will fall into different locally optimal vectors. A method is therefore needed for extracting the good information from X_L1 and X_L2 to form a new vector X_Ln that is closer to the globally optimal value. If we exhaustively tested all the combinations of X_L1 and X_L2 for the new vector X_Ln, there would be 2^D trials, which is unrealistic to accomplish in practice. With the help of OED [15, 16, 23, 24], a relatively good vector X_Ln is obtained from X_L1 and X_L2 using only a few experimental tests.
The OED, with both the orthogonal array (OA) and the factor analysis, makes it possible to discover the best combination of different factors with only a small number of experimental samples [23, 24]. Here we give a brief discussion of OED combined with an example using the vectors X_L1 and X_L2. To simplify the problem, assume D = 3 and let the optimization problem be to minimize the Sphere function f(X) = x_1^2 + x_2^2 + x_3^2, with X_L1 = [2, 0, 1] and X_L2 = [3, −2, 0] derived from First1D and Second1D (actually, First1D and Second1D should obtain the same globally optimal vector because this test function is simple; here we just want to explain the OED theory). The whole analysis of using OED for the Sphere function is shown in Table 1.
An OA is a predefined table with numbers arranged in rows and columns. In Table 1, there are three factors x_1, x_2, and x_3 that affect the function value, and two levels, X_L1 and X_L2, to choose from. The rows represent the levels of the factors in each combination, and the columns represent the specific factors that can be changed from each combination [15, 24]. The programming of generating an OA can be found in [24]. Factor analysis evaluates the effects of the individual factors based on the function values and determines the best level for each factor to form a new vector X_Ln. Let F_t denote the function value of combination t, where t = 1, ..., n and n stands for the total number of combinations. The main effect of factor d with level l, denoted S_dl, is expressed as

    S_dl = (Σ_{t=1}^{n} F_t · C_dlt) / (Σ_{t=1}^{n} C_dlt),    (3)
where d = 1, ..., D and l = 1, 2; C_dlt = 1 if combination t uses level l for factor d, and C_dlt = 0 otherwise. In Table 1, we first calculate the effect S_11 of level 1 in factor x_1, and then the effect S_12 of level 2 in factor x_1. Since this is a minimization problem, the better level 1 is chosen for x_1 because its effect value S_11 is less than S_12 in Table 1. As the example in Table 1 shows, although the new vector X_Ln = [2, 0, 0] does not appear among the four tested combinations, this best combination can still be obtained by the OED method.
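The factor analysis of Table 1 can be reproduced numerically. The sketch below, a simplified NumPy version with the L4(2^3) orthogonal array written out by hand rather than generated as in [24], recovers the combination X_Ln = [2, 0, 0].

```python
import numpy as np

# Two locally optimal vectors from the text, for the 3-D Sphere function.
X_L1 = np.array([2.0, 0.0, 1.0])
X_L2 = np.array([3.0, -2.0, 0.0])
sphere = lambda x: float(np.sum(x**2))

# L4(2^3) orthogonal array: 4 rows (combinations), 3 columns (factors),
# entries 0/1 selecting the level (X_L1 or X_L2) of each factor.
OA = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

levels = np.stack([X_L1, X_L2])            # levels[l][d]
trials = levels[OA, np.arange(3)]          # one combination vector per row
F = np.array([sphere(t) for t in trials])  # F = [5, 8, 9, 14] as in Table 1

# Factor analysis, eq. (3): mean function value of the combinations that use
# level l for factor d; pick the level with the smaller effect (minimization).
S = np.array([[F[OA[:, d] == l].mean() for d in range(3)] for l in range(2)])
best = np.argmin(S, axis=0)                # best level per factor -> [0, 0, 1]
X_Ln = levels[best, np.arange(3)]          # -> [2, 0, 0], function value 4
```

Running this reproduces the effects S_11 = 6.5, S_12 = 11.5, S_21 = 7, S_22 = 11, S_31 = 9.5, and S_32 = 8.5 of Table 1.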
OED works well when the interaction among the vector elements is small. When the OED method is used to process some
4 Mathematical Problems in Engineering
Figure 1: Flowchart of the PCPSO algorithm. (Flow: i = 1; run the parallel First1D and Second1D algorithms, each for eighty iterations; use the two resulting vectors X_L1 and X_L2 for OED learning to create a new vector X_Ln; keep the better of X_L1 and X_L2 as X_L1 and assign X_Ln to X_L2; run the parallel First1D and Second1D with X_L1 and X_L2 as context vectors, respectively; increment i and repeat while i ≤ 6.)
functions with high interaction among the vector elements, there exists a problem that the function value of the new vector X_Ln may be bigger than the function values of the vectors X_L1 and X_L2. However, there are still two advantages to using OED in highly correlated problems. One is that the particle with the new vector X_Ln jumps out of the locally optimal vector; the other is that each element of the new vector X_Ln, chosen between X_L1 and X_L2, is probably closer to the globally optimal vector. Therefore, we try to use the new vector X_Ln as the context vector to repeat the CPSO algorithm.

Based on the analysis mentioned above, we propose
the PCPSO algorithm to overcome the problem of falling into local optima. Figure 1 shows the flowchart of PCPSO. First, we run two parallel CPSO algorithms (First1D and Second1D); then OED learning is applied to form the new vector X_Ln, which is assigned to X_L2, while the better vector chosen from X_L1 and X_L2 is assigned to the new X_L1. The new vectors X_L1 and X_L2, which are treated as context vectors in the CPSO algorithm, keep the search going further. In order to illustrate the efficiency of the PCPSO algorithm, Figure 2 shows the convergence characteristic of PCPSO on the Rotated Ackley's function; the detailed description of Rotated Ackley's function can be found in Section 3. The parameter settings for First1D and Second1D are the same as for the CPSO_H algorithm [19]: ω decreases linearly from 0.9 to 0.4, c_1 = c_2 = 1.494, M = 10 stands for the particle number,
and D = 30 represents the number of dimensions. Assuming First1D and Second1D each need 80 CPSO iterations to reach locally optimal vectors, followed by OED computation, there are 6 iterations of OED computation, and therefore the total number of CPSO iterations is 6 × 80 = 480. From Figure 2 we can see that the function value drops sharply during the first 80 iterations and that First1D and Second1D have similar performance. For the next 80 iterations, the better vector chosen from X_L1 and X_L2 is assigned to X_L1 for another First1D implementation, and the new vector X_Ln is assigned to X_L2 for another Second1D implementation. The performances of First1D and Second1D are significantly different in iterations 80 to 240: the First1D algorithm gets trapped in a locally optimal vector, while the Second1D algorithm still makes the function value drop sharply, even though its value is higher at the start point, at the 81st iteration. Since the First1D algorithm has almost lost its searching ability after 80 iterations, we can also adopt the strategy that the First1D algorithm is only implemented for the first 80 iterations
Figure 2: The convergence characteristic of the PCPSO on Rotated Ackley's function in 30 dimensions (function value versus iteration, 0–500). First1D and Second1D are CPSO algorithms.
in order to save computational cost. Vector X_L1 is used for saving the best optimal vector found so far and is combined with vector X_L2 to form the new vector X_Ln by the OED method after every 80 iterations. These operations iterate generation by generation until some stopping criteria are met.
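The outer loop of Figure 1 can be sketched as follows. This is a toy illustration, not the authors' implementation: toy_local_search is a random-improvement stand-in for one 80-iteration First1D/Second1D CPSO run, and combine is a simplified per-dimension selection rather than the OA-based procedure described above.

```python
import numpy as np

def toy_local_search(f, x, rng, iters=80):
    # Stand-in for one 80-iteration First1D/Second1D run:
    # random coordinate perturbations, keeping only improvements.
    x = x.copy()
    for _ in range(iters):
        cand = x.copy()
        d = rng.integers(len(x))
        cand[d] += rng.normal(scale=0.1)
        if f(cand) < f(x):
            x = cand
    return x

def combine(f, x1, x2):
    # Simplified OED-style step: per dimension, keep the element of x1
    # or x2 whose substitution gives the lower function value.
    out = x1.copy()
    for d in range(len(x1)):
        trial = out.copy()
        trial[d] = x2[d]
        if f(trial) < f(out):
            out = trial
    return out

def pcpso_sketch(f, D, rng, rounds=6):
    X_L1 = rng.uniform(-1.0, 1.0, D)
    X_L2 = rng.uniform(-1.0, 1.0, D)
    for _ in range(rounds):              # the i = 1..6 loop of Figure 1
        X_L1 = toy_local_search(f, X_L1, rng)
        X_L2 = toy_local_search(f, X_L2, rng)
        X_Ln = combine(f, X_L1, X_L2)    # learn the best combination
        if f(X_L2) < f(X_L1):            # keep the better vector as X_L1
            X_L1 = X_L2
        X_L2 = X_Ln                      # restart the second search from X_Ln
    return X_L1 if f(X_L1) < f(X_L2) else X_L2
```

Every step in this sketch accepts a vector only if it improves f, so the best function value held by the pair (X_L1, X_L2) never increases across rounds.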
Compared with other PSO algorithms that use OED [15, 16], PCPSO needs some extra computation for updating the vector X_L1. However, the cooperation between the local vectors X_L1 and X_L2 makes the search more effective. In [15], the OED result is set as an exemplar for the other particles to learn from; from Figure 2 we can see that the function values after OED are sometimes even bigger. Another limitation of [15] is that the searching method easily gets trapped in a local optimum because the two vectors used for OED learning are similar. Based on the principle that each element of the function vector moves closer to the globally optimal vector after OED, the new vector X_Ln jumps out of the local optimum and keeps searching further, even though its function value is higher than the corresponding local optimum at the starting iteration.
3. Experimental Results and Discussions

3.1. Test Functions. Test functions, including unimodal functions and multimodal functions, are used to investigate the performance of PSO algorithms [25]. We choose 15 test functions in 30 dimensions. The formulas and the properties of these functions are presented below.

Group A: unimodal and simple multimodal functions are as follows.
(1) Sphere function:

    f_1(x) = Σ_{i=1}^{D} x_i^2.    (4)
(2) Rosenbrock's function:

    f_2(x) = Σ_{i=1}^{D−1} [100 (x_{i+1} − x_i^2)^2 + (x_i − 1)^2].    (5)
(3) Griewank's function:

    f_3(x) = Σ_{i=1}^{D} x_i^2 / 4000 − Π_{i=1}^{D} cos(x_i / √i) + 1.    (6)
(4) Rastrigin's function:

    f_4(x) = Σ_{i=1}^{D} [x_i^2 − 10 cos(2πx_i) + 10].    (7)
(5) Ackley's function:

    f_5(x) = −20 exp(−0.2 √((1/D) Σ_{i=1}^{D} x_i^2)) − exp((1/D) Σ_{i=1}^{D} cos(2πx_i)) + 20 + e.    (8)
(6) Weierstrass function:

    f_6(x) = Σ_{i=1}^{D} (Σ_{k=0}^{k_max} [a^k cos(2πb^k (x_i + 0.5))]) − D Σ_{k=0}^{k_max} [a^k cos(πb^k)],
    a = 0.5, b = 3, k_max = 20.    (9)
(7) Noncontinuous Rastrigin's function:

    f_7(x) = Σ_{i=1}^{D} (y_i^2 − 10 cos(2πy_i) + 10),
    y_i = x_i if |x_i| < 1/2, and y_i = round(2x_i)/2 if |x_i| ≥ 1/2, for i = 1, 2, ..., D.    (10)
(8) Schwefel's function:

    f_8(x) = 418.9829 × D − Σ_{i=1}^{D} x_i sin(|x_i|^{1/2}).    (11)
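The Group A definitions translate directly into code. A minimal NumPy sketch of (4), (5), (7), and (8), written here for reference rather than taken from the authors' test suite:

```python
import numpy as np

def sphere(x):        # eq. (4)
    return float(np.sum(x**2))

def rosenbrock(x):    # eq. (5)
    return float(np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (x[:-1] - 1)**2))

def rastrigin(x):     # eq. (7)
    return float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10))

def ackley(x):        # eq. (8)
    D = len(x)
    return float(-20 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / D))
                 - np.exp(np.sum(np.cos(2 * np.pi * x)) / D) + 20 + np.e)
```

Each function evaluates to 0 at its global optimum listed in Table 2 (the all-zero vector, or the all-one vector for Rosenbrock's function), up to floating-point rounding for Ackley's function.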
Group B: rotated and shifted multimodal functions are as follows.

In Group A, the functions can be divided into smaller search spaces because of the low interaction among their elements; the CPSO algorithm is therefore well suited to optimizing these functions. In Group B, six rotated multimodal functions are generated by an orthogonal matrix M to test the performance of the PCPSO algorithm. The new rotated vector y = M ∗ x, obtained by left-multiplying the original vector x by the orthogonal matrix M, behaves like a highly interacting vector because all elements of y change once one element of x changes. A detailed illustration of generating the orthogonal matrix is given in [26]. Meanwhile, one shifted and rotated function is also listed in Group B to test the performance of the PSO algorithms. Table 2 shows the globally optimal vectors x*, the corresponding function values f(x*), the search range, and the initialization range of each function. Consider the following.
(9) Rotated Ackley's function:

    f_9(x) = −20 exp(−0.2 √((1/D) Σ_{i=1}^{D} y_i^2)) − exp((1/D) Σ_{i=1}^{D} cos(2πy_i)) + 20 + e,  y = M ∗ x.    (12)
(10) Rotated Rastrigin's function:

    f_10(x) = Σ_{i=1}^{D} [y_i^2 − 10 cos(2πy_i) + 10],  y = M ∗ x.    (13)
(11) Rotated Griewank's function:

    f_11(x) = Σ_{i=1}^{D} y_i^2 / 4000 − Π_{i=1}^{D} cos(y_i / √i) + 1,  y = M ∗ x.    (14)
(12) Rotated Weierstrass function:

    f_12(x) = Σ_{i=1}^{D} (Σ_{k=0}^{k_max} [a^k cos(2πb^k (y_i + 0.5))]) − D Σ_{k=0}^{k_max} [a^k cos(πb^k)],
    a = 0.5, b = 3, k_max = 20,  y = M ∗ x.    (15)
Table 2: Globally optimal vector x*, corresponding function value f(x*), search range, and initialization range of the test functions.

Function   x*                              f(x*)   Search range       Initialization range
f_1        [0, 0, ..., 0]                  0       [−100, 100]^D      [−100, 50]^D
f_2        [1, 1, ..., 1]                  0       [−2, 2]^D          [−2, 2]^D
f_3        [0, 0, ..., 0]                  0       [−600, 600]^D      [−600, 200]^D
f_4        [0, 0, ..., 0]                  0       [−5.12, 5.12]^D    [−5.12, 2]^D
f_5        [0, 0, ..., 0]                  0       [−32, 32]^D        [−32, 16]^D
f_6        [0, 0, ..., 0]                  0       [−0.5, 0.5]^D      [−0.5, 0.2]^D
f_7        [0, 0, ..., 0]                  0       [−5.12, 5.12]^D    [−5.12, 2]^D
f_8        [420.96, 420.96, ..., 420.96]   0       [−500, 500]^D      [−500, 500]^D
f_9        [0, 0, ..., 0]                  0       [−32, 32]^D        [−32, 16]^D
f_10       [0, 0, ..., 0]                  0       [−5.12, 5.12]^D    [−5.12, 2]^D
f_11       [0, 0, ..., 0]                  0       [−600, 600]^D      [−600, 200]^D
f_12       [0, 0, ..., 0]                  0       [−0.5, 0.5]^D      [−0.5, 0.2]^D
f_13       [0, 0, ..., 0]                  0       [−5.12, 5.12]^D    [−5.12, 2]^D
f_14       [420.96, 420.96, ..., 420.96]   0       [−500, 500]^D      [−500, 500]^D
f_15       [o_1, o_2, ..., o_D]            −330    [−5.12, 5.12]^D    [−5.12, 2]^D
(13) Rotated noncontinuous Rastrigin's function:

    f_13(x) = Σ_{i=1}^{D} (z_i^2 − 10 cos(2πz_i) + 10),
    z_i = y_i if |y_i| < 1/2, and z_i = round(2y_i)/2 if |y_i| ≥ 1/2, for i = 1, 2, ..., D,  y = M ∗ x.    (16)
(14) Rotated Schwefel's function:

    f_14(x) = 418.9829 × D − Σ_{i=1}^{D} z_i,
    z_i = y_i sin(|y_i|^{1/2}) if |y_i| ≤ 500, and z_i = 0 otherwise, for i = 1, 2, ..., D,
    y = y′ + 420.96,  y′ = M ∗ (x − 420.96).    (17)
(15) Rotated and Shifted Rastrigin's function:

    f_15(x) = Σ_{i=1}^{D} (z_i^2 − 10 cos(2πz_i) + 10) − 330,
    z = M ∗ y,  y = x − o,  o = [o_1, o_2, ..., o_D],    (18)

where o = [o_1, o_2, ..., o_D] is the shifted globally optimal vector.
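The rotations y = M ∗ x in (12)–(18) require an orthogonal matrix M. Reference [26] gives one construction; a common alternative, sketched below under the assumption that any orthogonal M is acceptable, draws a Gaussian random matrix and takes the Q factor of its QR decomposition.

```python
import numpy as np

def random_orthogonal(D, rng):
    # QR decomposition of a Gaussian matrix yields an orthogonal Q;
    # multiplying the columns by sign(diag(R)) fixes the sign ambiguity.
    Q, R = np.linalg.qr(rng.normal(size=(D, D)))
    return Q * np.sign(np.diag(R))

def rotate(f, M):
    # Wrap a benchmark f into its rotated version f(M * x).
    return lambda x: f(M @ x)
```

Since M is orthogonal, M ∗ 0 = 0, so a rotated function keeps the optimum value of its unrotated counterpart; for example, a rotated Rastrigin still evaluates to 0 at the origin.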
3.2. PCPSO's Performance on the 15 Functions. The performance of PCPSO is tested on functions f_1–f_15, which include multimodal, rotated, and shifted functions in 30 dimensions. The
Figure 3: The convergence characteristics of the PCPSO algorithm on test functions f_1–f_8 over 30 iterations (log(f(x)) versus iteration; curves: Sphere, Rosenbrock, Griewank, Rastrigin's, Ackley, Weierstrass, Noncontinuous Rastrigin's, and Schwefel's functions).
parameter settings for the PCPSO algorithm are as follows: ω decreases linearly from 0.9 to 0.4, c_1 = c_2 = 1.494, and the particle number M = 10. Since the vectors of f_1–f_8 have very low interaction among their elements, Figure 3 shows the convergence characteristics of the PCPSO algorithm on test functions f_1–f_8 over 30 iterations, where the y-axis is expressed in log format. From Figure 3 we can see that all the function values drop significantly within 30 iterations, except for Rosenbrock's function. In order to fully test PCPSO's performance on Rosenbrock's function, we make a comparison between the PCPSO and CPSO algorithms in Figure 4. The upper plot of Figure 4 shows the convergence characteristic of PCPSO, which cooperates with
Figure 4: The convergence characteristics of the PCPSO and CPSO algorithms on Rosenbrock's function (function value versus iteration, 0–500): (a) PCPSO (First1D and Second1D curves); (b) CPSO.
the two independent algorithms (First1D and Second1D) through OED learning; the lower plot is the convergence characteristic of the CPSO algorithm. From Figure 4 we can see that the OED learning makes the function vector jump out of the locally optimal vector and finally drops the function value. The convergence characteristics for the remaining six test functions f_10–f_15 are shown in Figure 5. All six are rotated, multimodal, and nonseparable functions. Similar to Figure 2, despite the fact that the function value of Second1D is increased in some cases at the start point, caused by OED learning, it drops sharply as the iterations increase; this phenomenon is especially obvious for test function f_12. To sum up, PCPSO works well in terms of avoiding locally optimal vectors and dropping the function value sharply. The OED learning finds the best combination of the two locally optimal vectors X_L1 and X_L2 at iterations 80, 160, and 240. However, there are still some drawbacks that cause the vector to fall into a locally optimal vector, as for f_14: once the vectors X_L1 and X_L2 become equal to each other, the OED learning ability is finally lost.
3.3. Comparison with Other PSOs and Discussions. Seven PSO variants are taken from the literature for comparison on the 15 test functions with 30 dimensions. In the following, we briefly describe the peer algorithms shown in Table 3. The first algorithm is the standard PSO (SPSO), whose performance is improved by using a random topology. The second algorithm, the fully informed PSO (FIPS), uses all the neighbor particles to influence the flying velocity. The third, orthogonal PSO (OPSO), aims to generate a better position by using OED. The fourth algorithm, fitness-distance-ratio PSO (FDR_PSO), solves the premature convergence problem
Table 3: Parameter settings for the seven PSO variants.

Algorithm       Parameter settings
SPSO [27]       ω: 0.4–0.9, c_1 = c_2 = 1.193
FIPS [10]       χ = 0.7298, Σ c_i = 4.1
OPSO [16]       ω: 0.4–0.9, c_1 = c_2 = 2.0
FDR_PSO [13]    ω: 0.4–0.9, Σ c_i = 4.0
CLPSO [6]       ω: 0.4–0.9, c_1 = c_2 = 2.0
CPSO_H [19]     ω: 0.4–0.9, c = 1.49, κ = 6
OLPSO [15]      ω: 0.4–0.9, c = 2.0, G = 5
by using the fitness-distance ratio. The fifth, comprehensive learning PSO (CLPSO), is proposed for solving multimodal problems. The sixth, cooperative PSO (CPSO_H), uses one-dimensional swarms to search each dimension separately. The seventh, orthogonal learning PSO (OLPSO), guides particles to fly towards an exemplar constructed from P_i and P_n by the OED method.

The swarm size is set to 40 for the seven PSO variants mentioned above and to 10 for the PCPSO algorithm. The maximal number of FEs for the eight algorithms is set to 200,000 in each run on each test function. All functions were run 25 times, and the mean values and standard deviations of the results are presented in Table 4, with the best results shown in boldface. From the results we observe that CPSO_H, CLPSO, and OLPSO perform well on the first 8 functions. The reason is that CPSO_H is well suited to one-dimensional search, while CLPSO and OLPSO can search further by using their own learning strategies: the learning strategy in CLPSO prevents falling into local optima by random learning, while the strategy in OLPSO is to find a better combination of the locally optimal vector. The PCPSO method keeps searching through the cooperation of the two locally optimal vectors X_L1 and X_L2; with the implementation of Second1D, the algorithm can always find a better vector X_L2 compared with the previous best one X_L1 by using the OED method. The PCPSO method performs well on f_2, f_3, f_9–f_11, f_13, and f_15. However, the function values obtained by this method are not close enough to the globally optimal values; premature convergence still happens on f_12 and f_14 due to the lack of vector diversity. The advantages of this method are its rapid convergence speed and robustness: whatever the test function, the algorithm can find an acceptable optimum, especially when facing some complex functions. The nonparametric Wilcoxon rank sum tests are taken as the t-tests in Table 5 to determine whether the results obtained by PCPSO are statistically different from the best results achieved by the other algorithms. A value of one indicates that the performance of the two algorithms is statistically different with 95% certainty, whereas a value of zero implies that the performance is not statistically different. Among these 15 functions, there are 10 functions for which the best results and the results of PCPSO are statistically different. From Tables 4 and 5 we can see that PCPSO obtains 5 best results (on f_2, f_3, f_10, f_11, and f_13) that are statistically different from the others. In addition, most of the best functions obtained by PCPSO are focused on
Figure 5: The convergence progress of the 30-dimensional test functions using PCPSO (function value versus iteration for First1D and Second1D): (a) Rotated Rastrigin's function, (b) Rotated Griewank's function, (c) Rotated Weierstrass function, (d) Rotated noncontinuous Rastrigin's function, (e) Rotated Schwefel's function, and (f) Rotated and Shifted Rastrigin's function.
Mathematical Problems in Engineering 9
Table 4: Results for the 30-D functions (mean ± standard deviation over 25 runs; best results in boldface).

F      SPSO                       FIPS                       OPSO                       FDR_PSO
f_1    1.16e−069 ± 2.86e−068      2.69e−012 ± 2.28e−011      6.71e−018 ± 6.82e−018      **4.88e−102 ± 1.53e−101**
f_2    1.68e+001 ± 8.56e+000      2.47e+001 ± 2.19e−001      4.12e+001 ± 1.19e+001      1.54e+001 ± 6.79e+000
f_3    1.73e−002 ± 1.08e−001      4.65e−003 ± 3.71e−002      2.10e−003 ± 6.19e−003      1.72e−002 ± 9.08e−003
f_4    4.61e+001 ± 7.85e+001      7.44e+001 ± 1.56e+001      3.27e+000 ± 6.35e+000      2.91e+001 ± 8.06e+000
f_5    1.27e+000 ± 4.79e+000      5.12e−007 ± 7.81e−007      3.74e−009 ± 5.96e−009      2.85e−014 ± 4.32e−015
f_6    2.38e+000 ± 8.04e+000      3.13e−001 ± 3.74e−001      3.28e−002 ± 4.93e−002      7.75e−003 ± 1.21e−002
f_7    5.21e+001 ± 4.19e+001      6.38e+001 ± 9.05e+000      4.23e+000 ± 8.24e+000      1.45e+001 ± 6.58e+000
f_8    3.76e+003 ± 3.43e+003      2.56e+003 ± 7.31e+002      2.72e+003 ± 3.31e+003      3.16e+003 ± 4.65e+002
f_9    1.35e+000 ± 1.02e+000      7.26e−005 ± 1.42e−005      4.07e−005 ± 8.26e−006      3.63e−001 ± 5.72e−001
f_10   8.38e+001 ± 1.53e+002      1.75e+002 ± 5.87e+001      1.64e+002 ± 3.98e+001      4.67e+001 ± 1.44e+001
f_11   1.16e−002 ± 1.28e−002      6.95e−005 ± 3.21e−005      1.13e−003 ± 2.45e−003      8.79e−003 ± 1.07e−002
f_12   2.83e+001 ± 5.31e+000      1.35e+001 ± 8.39e+000      1.21e+001 ± 1.05e+001      2.35e+000 ± 1.41e+000
f_13   5.51e+001 ± 2.07e+001      8.74e+001 ± 1.92e+001      7.67e+001 ± 7.43e+001      4.64e+001 ± 9.01e+000
f_14   5.59e+003 ± 6.07e+002      2.67e+003 ± 7.72e+002      2.45e+003 ± 2.89e+002      3.80e+003 ± 6.18e+002
f_15   −2.49e+002 ± 4.76e+001     −1.07e+002 ± 1.52e+002     −2.26e+002 ± 1.07e+001     −2.71e+002 ± 3.69e+001

F      CLPSO                      CPSO_H                     OLPSO                      PCPSO
f_1    4.46e−014 ± 1.51e−014      1.16e−033 ± 2.26e−033      1.16e−038 ± 1.56e−038      1.12e−030 ± 5.14e−031
f_2    2.10e+001 ± 2.71e+000      2.06e+001 ± 1.13e+001      1.74e+001 ± 1.05e+001      **1.12e+001 ± 8.19e+000**
f_3    3.14e−010 ± 4.64e−010      3.63e−002 ± 7.11e−002      3.17e−010 ± 6.23e−011      **3.36e−012 ± 7.51e−013**
f_4    4.76e−010 ± 2.17e−010      **0 ± 0**                  **0 ± 0**                  2.21e−012 ± 6.14e−013
f_5    4.97e−014 ± 2.12e−014      4.56e−014 ± 1.25e−014      **4.76e−015 ± 1.57e−016**  4.83e−008 ± 1.38e−008
f_6    4.34e−007 ± 2.09e−007      **8.15e−015 ± 8.92e−016**  1.27e−002 ± 2.79e−002      2.41e−011 ± 1.09e−011
f_7    **4.36e−010 ± 2.44e−010**  1.21e−001 ± 3.19e−001      4.51e−008 ± 1.17e−008      2.65e−007 ± 2.27e−007
f_8    **1.26e−012 ± 8.62e−013**  1.08e+003 ± 2.27e+002      3.88e−004 ± 6.21e−005      3.82e−004 ± 7.31e−006
f_9    3.47e−004 ± 2.11e−004      2.13e+000 ± 3.55e−001      2.28e−005 ± 1.14e−005      **2.25e−006 ± 1.08e−006**
f_10   3.35e+001 ± 5.58e+000      8.04e+001 ± 2.23e+001      5.77e+001 ± 1.27e+001      **1.63e+000 ± 5.18e+000**
f_11   7.07e−005 ± 1.18e−005      5.54e−002 ± 4.79e−002      2.73e−005 ± 3.08e−005      **1.22e−005 ± 5.15e−006**
f_12   **3.09e+000 ± 1.51e+000**  1.53e+001 ± 3.62e+000      7.03e+000 ± 1.41e+000      9.37e+000 ± 3.58e+000
f_13   3.53e+001 ± 5.52e+000      8.60e+001 ± 2.61e+001      4.03e+001 ± 1.34e+001      **1.07e+001 ± 2.37e+000**
f_14   **2.46e+003 ± 1.88e+003**  3.82e+003 ± 6.62e+002      3.15e+003 ± 1.27e+003      3.26e+003 ± 1.23e+003
f_15   −2.97e+002 ± 2.13e+001     −2.38e+002 ± 7.42e+001     −2.76e+002 ± 3.47e+001     **−3.08e+002 ± 1.02e+001**
Table 5: t-test on the 15 functions.

Function   f_1   f_2   f_3   f_4   f_5   f_6   f_7   f_8
t-test     1     1     1     1     1     0     0     1

Function   f_9   f_10  f_11  f_12  f_13  f_14  f_15
t-test     0     1     1     1     1     0     0
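The rank-sum decision behind Table 5 can be sketched as follows. The 25-run samples below are synthetic stand-ins (the per-run values behind Table 4 are not listed in the paper), the normal approximation of the z-statistic is used, and ties are ignored for brevity.

```python
import numpy as np

def rank_sum_z(a, b):
    # Two-sided Wilcoxon rank-sum z-statistic (normal approximation,
    # no tie correction): rank all values jointly, sum the ranks of a.
    n1, n2 = len(a), len(b)
    order = np.concatenate([a, b]).argsort()
    ranks = np.empty(n1 + n2)
    ranks[order] = np.arange(1, n1 + n2 + 1)
    W = ranks[:n1].sum()
    mu = n1 * (n1 + n2 + 1) / 2
    sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return (W - mu) / sigma

rng = np.random.default_rng(1)
runs_a = rng.normal(1.0, 0.1, 25)   # stand-in for 25 runs of algorithm A
runs_b = rng.normal(2.0, 0.1, 25)   # stand-in for 25 runs of algorithm B
different = int(abs(rank_sum_z(runs_a, runs_b)) > 1.96)  # 1 -> different at 95%
```

With fully separated samples such as these, W takes its extreme value and |z| ≈ 6, so the test reports a statistically significant difference, corresponding to an entry of 1 in Table 5.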
nonseparable multimodal problems, which demonstrates the effectiveness and robustness of the PCPSO algorithm.
4. Conclusion

In this paper, a novel PCPSO algorithm is presented in order to overcome the problem of falling into local optima. In PCPSO, considering the complex interactions among the elements of the vector, the CPSO algorithm (First1D and Second1D) is used twice to create two locally optimal vectors. These two vectors are combined together by OED learning so that the best elements can be chosen to jump out of the local optimum and search further.

Although the OED operation may give a function value higher than the local optima in some cases, the function value always drops sharply after OED because the OED operation chooses the elements that are closer to the globally optimal vector. Experimental tests have been conducted on 15 benchmark functions, including unimodal, multimodal, rotated, and shifted functions. From the comparative results, it can be concluded that PCPSO significantly improves the performance of PSO and provides a better solution on most rotated multimodal problems.
Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments

This work was supported in part by the National Natural Science Foundation of China (51275353), the Macao Science and Technology Development Fund (108/2012/A3, 110/2013/A3), and the Research Committee of the University of Macau (MYRG183(Y1-L3)FST11-LYM, MYRG203(Y1-L4)-FST11-LYM).
References

[1] R. C. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39–43, Nagoya, Japan, October 1995.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[3] A. Adam, M. I. Shapiai, M. Z. M. Tumari, M. S. Mohamad, and M. Mubin, "Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization," The Scientific World Journal, vol. 2014, Article ID 973063, 13 pages, 2014.
[4] N. Geng, D. W. Gong, and Y. Zhang, "PSO-based robot path planning for multisurvivor rescue in limited survival time," Mathematical Problems in Engineering, vol. 2014, Article ID 187370, 10 pages, 2014.
[5] M. Shoaib and W.-C. Song, "Data aggregation for Vehicular Ad-hoc Network using particle swarm optimization," in Proceedings of the 14th Asia-Pacific Network Operations and Management Symposium, pp. 1–6, September 2012.
[6] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
[7] Y. H. Shi and R. C. Eberhart, "Parameter selection in particle swarm optimization," in Evolutionary Programming VII, vol. 1447 of Lecture Notes in Computer Science, pp. 591–600, Springer, Berlin, Germany, 1998.
[8] C. Li, S. Yang, and T. T. Nguyen, "A self-learning particle swarm optimizer for global optimization problems," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 42, no. 3, pp. 627–646, 2012.
[9] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[10] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[11] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the Congress on Evolutionary Computation (CEC '02), pp. 1671–1676, Honolulu, Hawaii, USA, May 2002.
[12] T. M. Blackwell and J. Branke, "Multiswarms, exclusion, and anti-convergence in dynamic environments," IEEE Transactions on Evolutionary Computation, vol. 10, no. 4, pp. 459–472, 2006.
[13] T. Peram, K. Veeramachaneni, and C. K. Mohan, "Fitness-distance-ratio based particle swarm optimization," in Proceedings of the IEEE Swarm Intelligence Symposium (SIS '03), pp. 174–181, IEEE, April 2003.
[14] X. Chen and Y. M. Li, "A modified PSO structure resulting in high exploration ability with convergence guaranteed," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 37, no. 5, pp. 1271–1289, 2007.
[15] Z.-H. Zhan, J. Zhang, Y. Li, and Y.-H. Shi, "Orthogonal learning particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 15, no. 6, pp. 832–847, 2011.
[16] S. Y. Ho, H. S. Lin, W. H. Liauh, and S. J. Ho, "OPSO: orthogonal particle swarm optimization and its application to task assignment problems," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 38, no. 2, pp. 288–298, 2008.
[17] Z.-H. Zhan, J. Zhang, Y. Li, and H. S.-H. Chung, "Adaptive particle swarm optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 39, no. 6, pp. 1362–1381, 2009.
[18] M. A. Potter and K. A. De Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.
[19] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.
[20] G. Zhang and Y. M. Li, "Cooperative particle swarm optimizer with elimination mechanism for global optimization of multimodal problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 210–217, Beijing, China, July 2014.
[21] G. Zhang and Y. M. Li, "Orthogonal experimental design method used in particle swarm optimization for multimodal problems," in Proceedings of the 6th International Conference on Advanced Computational Intelligence (ICACI '13), pp. 183–188, Hangzhou, China, October 2013.
[22] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.
[23] D. C. Montgomery, Design and Analysis of Experiments, John Wiley & Sons, New York, NY, USA, 5th edition, 2000.
[24] S.-Y. Ho, L.-S. Shu, and J.-H. Chen, "Intelligent evolutionary algorithms for large parameter optimization problems," IEEE Transactions on Evolutionary Computation, vol. 8, no. 6, pp. 522–541, 2004.
[25] X. Yao, Y. Liu, and G. M. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[26] R. Salomon, "Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions: a survey of some theoretical and practical aspects of genetic algorithms," BioSystems, vol. 39, no. 3, pp. 263–278, 1996.
[27] SPSO 2007, http://www.particleswarm.info/Programs.html.
Mathematical Problems in Engineering
Figure 1: Flowchart of the PCPSO algorithm. Two parallel CPSO algorithms (First1D and Second1D), each operating eighty iterations, run with X_{L1} and X_{L2} as context vectors, respectively; the two vectors X_{L1} and X_{L2} obtained from First1D and Second1D are used for OED learning to create the new vector X_{Ln}; the better of X_{L1} and X_{L2} is chosen as the new X_{L1}, and X_{Ln} is assigned to X_{L2}; the loop runs for i = 1 while i ≤ 6.
functions with high interacting vector elements, there exists a problem: the function value of the new vector X_{Ln} may be bigger than the function values of the vectors X_{L1} and X_{L2}. However, there are still two advantages to using OED in highly correlated problems. One is that the particle with the new vector X_{Ln} jumps out of the locally optimal vector; the other is that each element of the new vector X_{Ln}, chosen between X_{L1} and X_{L2}, is probably closer to the globally optimal vector. Therefore we try to use the new vector X_{Ln} as a context vector to repeat the CPSO algorithm.

Based on the analysis mentioned above, we propose the PCPSO algorithm to overcome the problem of falling into local optima. Figure 1 shows the flowchart of PCPSO. First we use two parallel CPSO algorithms (First1D and Second1D); then OED learning is applied to form the new vector X_{Ln}, which is assigned to X_{L2}; the better vector chosen from X_{L1} and X_{L2} is assigned to the new X_{L1}. The new vectors X_{L1} and X_{L2}, which are treated as context vectors in the CPSO algorithm, keep the search going further. In order to illustrate the efficiency of the PCPSO algorithm, Figure 2 shows the convergence characteristic of the PCPSO on Rotated Ackley's function. The detailed description of Rotated Ackley's function can be found in Section 3. The parameter settings for First1D and Second1D are the same as for the CPSO_H algorithm [19], where ω decreases linearly from 0.9 to 0.4, c_1 = c_2 = 1.494, M = 10 stands for the particle number, and D = 30 represents the dimensions. Assuming First1D and Second1D need 80 CPSO iterations to accomplish locally optimal vectors followed by OED computation, there are 6 iterations of OED computation, and therefore the total CPSO iteration count is 6 × 80 = 480. From Figure 2 we can see that the function value drops sharply in the first 80 iterations and that First1D and Second1D have similar performance. For the next 80 iterations, the better vector chosen from X_{L1} and X_{L2} is assigned to X_{L1} for another First1D implementation, and the new vector X_{Ln} is assigned to X_{L2} for another Second1D implementation. The performances of First1D and Second1D are significantly different in iterations 80 to 240: First1D gets trapped in a locally optimal vector, while Second1D still makes the function value drop sharply even though it is higher at the start point at the 81st iteration. Since First1D has almost lost its searching ability after 80 iterations, we can also adopt the strategy that First1D is only implemented in the first 80 iterations
Figure 2: The convergence characteristic of the PCPSO on Rotated Ackley's function in 30 dimensions (function value versus iteration, 0–480). First1D and Second1D are CPSO algorithms.
in order to save computational cost. The vector X_{L1} is used for saving the best optimal vector found so far and is combined with the vector X_{L2} into the new vector X_{Ln} by the OED method after 80 iterations. These operations iterate generation by generation until some criteria are met.

Compared with other PSO algorithms using OED [15, 16], the PCPSO needs some extra computation for updating the vector X_{L1}. However, the cooperation between the local vectors X_{L1} and X_{L2} makes the search more effective. In [15] the OED result is set as an exemplar for the other particles to learn from. From Figure 2 we can see that sometimes the function values after OED are even bigger. Another limitation of [15] is that the search gets trapped in a local optimum easily because the two vectors used for OED learning are similar. Based on the principle that each element of the function vector moves closer to the globally optimal vector after OED, it can be seen that the new vector X_{Ln} jumps out of the local optimum and keeps searching further even though its function value is higher than the corresponding local optimum at the start iteration.
3. Experimental Results and Discussions

3.1. Test Functions. Test functions including unimodal functions and multimodal functions are used to investigate the performance of PSO algorithms [25]. We choose 15 test functions on 30 dimensions. The formulas and the properties of these functions are presented below.

Group A: unimodal and simple multimodal functions are as follows.
(1) Sphere function:

$$f_1(x) = \sum_{i=1}^{D} x_i^2 \tag{4}$$
(2) Rosenbrock's function:

$$f_2(x) = \sum_{i=1}^{D-1} \left[ 100 \left( x_{i+1} - x_i^2 \right)^2 + \left( x_i - 1 \right)^2 \right] \tag{5}$$
(3) Griewank's function:

$$f_3(x) = \sum_{i=1}^{D} \frac{x_i^2}{4000} - \prod_{i=1}^{D} \cos\left( \frac{x_i}{\sqrt{i}} \right) + 1 \tag{6}$$
(4) Rastrigin's function:

$$f_4(x) = \sum_{i=1}^{D} \left( x_i^2 - 10 \cos\left( 2\pi x_i \right) + 10 \right) \tag{7}$$
(5) Ackley's function:

$$f_5(x) = -20 \exp\left( -0.2 \sqrt{ \frac{1}{D} \sum_{i=1}^{D} x_i^2 } \right) - \exp\left( \frac{1}{D} \sum_{i=1}^{D} \cos\left( 2\pi x_i \right) \right) + 20 + e \tag{8}$$
(6) Weierstrass function:

$$f_6(x) = \sum_{i=1}^{D} \left( \sum_{k=0}^{k_{\max}} \left[ a^k \cos\left( 2\pi b^k \left( x_i + 0.5 \right) \right) \right] \right) - D \sum_{k=0}^{k_{\max}} \left[ a^k \cos\left( \pi b^k \right) \right], \quad a = 0.5,\ b = 3,\ k_{\max} = 20 \tag{9}$$
(7) Noncontinuous Rastrigin's function:

$$f_7(x) = \sum_{i=1}^{D} \left( y_i^2 - 10 \cos\left( 2\pi y_i \right) + 10 \right), \quad y_i = \begin{cases} x_i, & |x_i| < \frac{1}{2} \\ \frac{\operatorname{round}(2 x_i)}{2}, & |x_i| \ge \frac{1}{2} \end{cases} \quad \text{for } i = 1, 2, \ldots, D \tag{10}$$
(8) Schwefel's function:

$$f_8(x) = 418.9829 \times D - \sum_{i=1}^{D} x_i \sin\left( |x_i|^{1/2} \right) \tag{11}$$
Group B: rotated and shifted multimodal functions are as follows.

In Group A the functions can be divided into smaller search spaces because of the low interaction of the elements; therefore, the CPSO algorithm is suitable for optimizing these functions. In Group B, seven rotated multimodal functions are generated by an orthogonal matrix M to test the performance of the PCPSO algorithm. The new rotated vector y = M * x, obtained by left-multiplying the original vector x by the orthogonal matrix M, behaves like a high interacting vector because all elements in vector y change once one element in vector x changes. A detailed illustration of generating the orthogonal matrix is given in [26]. Meanwhile, one shifted and rotated function is also listed in Group B to test the performance of the PSO algorithms. Table 2 shows the globally optimal vectors x*, the corresponding function values f(x*), the search range, and the initialization range of each function.
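A common way to realize such an orthogonal matrix M is QR decomposition of a random Gaussian matrix. The paper points to [26] for its construction, so the recipe below is a standard stand-in rather than necessarily the one used there:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(d):
    """Random orthogonal matrix via QR decomposition of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(d, d)))
    return q * np.sign(np.diag(r))  # fix column signs for a uniform rotation

def rotate(f, m):
    """Turn a separable benchmark f into a nonseparable one: x -> f(M x)."""
    return lambda x: f(m @ x)

rastrigin = lambda z: float(np.sum(z**2 - 10*np.cos(2*np.pi*z) + 10))
m = random_orthogonal(30)
f10 = rotate(rastrigin, m)
print(f10(np.zeros(30)))  # the optimum is preserved: M @ 0 = 0, so f10(0) = 0.0
```

Because M mixes every coordinate into every other, optimizing one dimension of x at a time no longer decouples the problem, which is exactly the property Group B uses to stress CPSO-style decomposition.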
(9) Rotated Ackley's function:

$$f_9(x) = -20 \exp\left( -0.2 \sqrt{ \frac{1}{D} \sum_{i=1}^{D} y_i^2 } \right) - \exp\left( \frac{1}{D} \sum_{i=1}^{D} \cos\left( 2\pi y_i \right) \right) + 20 + e, \quad y = M * x \tag{12}$$
(10) Rotated Rastrigin's function:

$$f_{10}(x) = \sum_{i=1}^{D} \left( y_i^2 - 10 \cos\left( 2\pi y_i \right) + 10 \right), \quad y = M * x \tag{13}$$
(11) Rotated Griewank's function:

$$f_{11}(x) = \sum_{i=1}^{D} \frac{y_i^2}{4000} - \prod_{i=1}^{D} \cos\left( \frac{y_i}{\sqrt{i}} \right) + 1, \quad y = M * x \tag{14}$$
(12) Rotated Weierstrass function:

$$f_{12}(x) = \sum_{i=1}^{D} \left( \sum_{k=0}^{k_{\max}} \left[ a^k \cos\left( 2\pi b^k \left( y_i + 0.5 \right) \right) \right] \right) - D \sum_{k=0}^{k_{\max}} \left[ a^k \cos\left( \pi b^k \right) \right], \quad a = 0.5,\ b = 3,\ k_{\max} = 20,\ y = M * x \tag{15}$$
Table 2: Globally optimal vector x*, corresponding function value f(x*), search range, and initialization range of the test functions.

Function  x*                             f(x*)  Search range      Initialization range
f1        [0, 0, ..., 0]                 0      [-100, 100]^D     [-100, 50]^D
f2        [1, 1, ..., 1]                 0      [-2, 2]^D         [-2, 2]^D
f3        [0, 0, ..., 0]                 0      [-600, 600]^D     [-600, 200]^D
f4        [0, 0, ..., 0]                 0      [-5.12, 5.12]^D   [-5.12, 2]^D
f5        [0, 0, ..., 0]                 0      [-32, 32]^D       [-32, 16]^D
f6        [0, 0, ..., 0]                 0      [-0.5, 0.5]^D     [-0.5, 0.2]^D
f7        [0, 0, ..., 0]                 0      [-5.12, 5.12]^D   [-5.12, 2]^D
f8        [420.96, 420.96, ..., 420.96]  0      [-500, 500]^D     [-500, 500]^D
f9        [0, 0, ..., 0]                 0      [-32, 32]^D       [-32, 16]^D
f10       [0, 0, ..., 0]                 0      [-5.12, 5.12]^D   [-5.12, 2]^D
f11       [0, 0, ..., 0]                 0      [-600, 600]^D     [-600, 200]^D
f12       [0, 0, ..., 0]                 0      [-0.5, 0.5]^D     [-0.5, 0.2]^D
f13       [0, 0, ..., 0]                 0      [-5.12, 5.12]^D   [-5.12, 2]^D
f14       [420.96, 420.96, ..., 420.96]  0      [-500, 500]^D     [-500, 500]^D
f15       [o_1, o_2, ..., o_D]           -330   [-5.12, 5.12]^D   [-5.12, 2]^D
(13) Rotated noncontinuous Rastrigin's function:

$$f_{13}(x) = \sum_{i=1}^{D} \left( z_i^2 - 10 \cos\left( 2\pi z_i \right) + 10 \right), \quad z_i = \begin{cases} y_i, & |y_i| < \frac{1}{2} \\ \frac{\operatorname{round}(2 y_i)}{2}, & |y_i| \ge \frac{1}{2} \end{cases} \quad \text{for } i = 1, 2, \ldots, D,\ y = M * x \tag{16}$$
(14) Rotated Schwefel's function:

$$f_{14}(x) = 418.9829 \times D - \sum_{i=1}^{D} z_i, \quad z_i = \begin{cases} y_i \sin\left( |y_i|^{1/2} \right), & |y_i| \le 500 \\ 0, & \text{otherwise} \end{cases} \quad \text{for } i = 1, 2, \ldots, D,$$
$$y = y' + 420.96, \quad y' = M * \left( x - 420.96 \right) \tag{17}$$
(15) Rotated and Shifted Rastrigin's function:

$$f_{15}(x) = \sum_{i=1}^{D} \left( z_i^2 - 10 \cos\left( 2\pi z_i \right) + 10 \right) - 330, \quad z = M * y, \quad y = x - o \tag{18}$$

where o = [o_1, o_2, ..., o_D] is the shifted globally optimal vector.
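Equation (18) composes a shift with a rotation. A minimal sketch of the composition; the identity matrix and the all-ones shift are illustrative placeholders, since the paper's actual M and o are not given:

```python
import numpy as np

rastrigin = lambda z: float(np.sum(z**2 - 10*np.cos(2*np.pi*z) + 10))

def shifted_rotated_rastrigin(x, m, o, bias=-330.0):
    """f15 of eq. (18): rotate the shifted variable, z = M (x - o)."""
    z = m @ (x - o)
    return rastrigin(z) + bias

d = 30
o = np.full(d, 1.0)   # illustrative shift; the paper's o is unspecified
m = np.eye(d)         # identity stands in for the orthogonal matrix M
print(shifted_rotated_rastrigin(o, m, o))  # at x* = o the value is the bias, -330.0
```

The shift moves the optimum off the origin, so algorithms that implicitly favor the center of the search range no longer get the optimum for free.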
3.2. PCPSO's Performance on 15 Functions. The performance of PCPSO is tested on functions f_1–f_15, which include multimodal, rotated, and shifted functions in 30 dimensions. The parameter settings for the PCPSO algorithm are as follows: ω decreases linearly from 0.9 to 0.4, c_1 = c_2 = 1.494, and the particle number M = 10. Since the vectors of f_1–f_8 have very low interacting elements, Figure 3 shows the convergence characteristics of the PCPSO algorithm on test functions f_1–f_8 in 30 iterations, where the y-axis is expressed in log format. From Figure 3 we can see that all the function values drop significantly within 30 iterations except for Rosenbrock's function. In order to fully test PCPSO's performance on Rosenbrock's function, we make a comparison between the PCPSO and CPSO algorithms in Figure 4.

Figure 3: The convergence characteristics of the PCPSO algorithm on test functions f_1–f_8 (Sphere, Rosenbrock, Griewank, Rastrigin, Ackley, Weierstrass, noncontinuous Rastrigin, and Schwefel functions) in 30 iterations; the y-axis shows log(f(x)).

The upper plot of Figure 4 shows the convergence characteristic of PCPSO, which cooperates with
Figure 4: The convergence characteristics of the PCPSO and CPSO algorithms on Rosenbrock's function (function value versus iteration): (a) PCPSO (First1D and Second1D); (b) CPSO.
the two independent algorithms (First1D and Second1D) by using OED learning; the lower plot is the convergence characteristic of the CPSO algorithm. From Figure 4 we can see that the OED learning makes the function vector jump out of the locally optimal vector and finally drops the function value. The convergence characteristics for the remaining six test functions f_10–f_15 are shown in Figure 5. All six are rotated multimodal and nonseparable functions. Similar to Figure 2, despite the fact that the function value of Second1D is increased in some cases at the start point, caused by OED learning, it drops sharply as the iterations increase. This phenomenon is especially obvious in test function f_12. To sum up, PCPSO works well in terms of escaping locally optimal vectors and dropping the function value sharply. The OED learning finds the best combination of the two locally optimal vectors X_{L1} and X_{L2} at iterations 80, 160, and 240. However, there are still some drawbacks that cause the vector to fall into a locally optimal vector, as in f_14: when the vectors X_{L1} and X_{L2} become equal to each other, the OED learning ability is finally lost.

3.3. Comparison with Other PSOs and Discussions. Seven PSO variants are taken from the literature for comparison on the 15 test functions with 30 dimensions. In the following we briefly describe the peer algorithms shown in Table 3. The first algorithm is the standard PSO (SPSO), whose performance is improved by using a random topology. The second algorithm, fully informed PSO (FIPS), uses all the neighbor particles to influence the flying velocity. The third, orthogonal PSO (OPSO), aims to generate a better position by using OED. The fourth algorithm, fitness-distance-ratio PSO (FDR_PSO), solves the premature convergence problem
Table 3: Parameter settings for the seven PSO variants.

Algorithm      Parameter settings
SPSO [27]      ω: 0.4–0.9, c_1 = c_2 = 1.193
FIPS [10]      χ = 0.7298, Σ c_i = 4.1
OPSO [16]      ω: 0.4–0.9, c_1 = c_2 = 2.0
FDR_PSO [13]   ω: 0.4–0.9, Σ c_i = 4.0
CLPSO [6]      ω: 0.4–0.9, c_1 = c_2 = 2.0
CPSO_H [19]    ω: 0.4–0.9, c = 1.49, κ = 6
OLPSO [15]     ω: 0.4–0.9, c = 2.0, G = 5
by using the fitness distance ratio. The fifth, comprehensive learning PSO (CLPSO), is proposed for solving multimodal problems. The sixth, cooperative PSO (CPSO_H), uses one-dimensional swarms to search each dimension separately. The seventh, orthogonal learning PSO (OLPSO), can guide particles to fly towards an exemplar constructed from P_i and P_n using the OED method.

The swarm size is set to 40 for the seven PSO variants mentioned above and to 10 for the PCPSO algorithm. The maximal FEs for the eight algorithms are set to 200,000 in each run of each test function. All functions were run 25 times, and the mean values and standard deviations of the results are presented in Table 4. The best result for each function is highlighted in Table 4. From the results we observe that CPSO_H, CLPSO, and OLPSO perform well on the first 8 functions. The reason is that CPSO_H suits one-dimensional search well, while CLPSO and OLPSO can search further by using their own learning strategies: the learning strategy in CLPSO prevents falling into local optima by random learning, while the strategy in OLPSO is to find a better combination of the locally optimal vector. The PCPSO method can keep searching through the cooperation of the two locally optimal vectors X_{L1} and X_{L2}. With the implementation of Second1D, the algorithm can always find a better vector X_{L2} compared with the previous best one X_{L1} by using the OED method. The PCPSO method performs well on f_2, f_3, f_9–f_11, f_13, and f_15. However, the function values obtained by this method are not close enough to the globally optimal values; premature convergence still happens in f_12 and f_14 due to a lack of vector diversity. The advantages of this method are its rapid convergence speed and robustness: whatever the test function, the algorithm can find an acceptable optimum, especially when facing complex functions. The nonparametric Wilcoxon rank sum test is taken as the t-test in Table 5 to determine whether the results obtained by PCPSO are statistically different from the best results achieved by the other algorithms. A value of one indicates that the performance of the two algorithms is statistically different with 95% certainty, whereas a value of zero implies that the performance is not statistically different. Among these 15 functions, there are 10 functions that are statistically different between the best results and the results of PCPSO. From Tables 4 and 5 we can see that PCPSO obtains 5 best functions (f_2, f_3, f_10, f_11, f_13) that are statistically different from the others. In addition, most of the best functions obtained by PCPSO are focused on
Figure 5: The convergence progress of 30-dimensional test functions using PCPSO (First1D and Second1D, function value versus iteration): (a) Rotated Rastrigin's function; (b) Rotated Griewank's function; (c) Rotated Weierstrass function; (d) Rotated noncontinuous Rastrigin's function; (e) Rotated Schwefel's function; (f) Rotated and Shifted Rastrigin's function.
Table 4: Results for 30D functions (mean ± standard deviation over 25 runs; * marks the best result for each function).

F     SPSO                      FIPS                      OPSO                      FDR_PSO
f1    1.16e-069 ± 2.86e-068     2.69e-012 ± 2.28e-011     6.71e-018 ± 6.82e-018     4.88e-102 ± 1.53e-101 *
f2    1.68e+001 ± 8.56e+000     2.47e+001 ± 2.19e-001     4.12e+001 ± 1.19e+001     1.54e+001 ± 6.79e+000
f3    1.73e-002 ± 1.08e-001     4.65e-003 ± 3.71e-002     2.10e-003 ± 6.19e-003     1.72e-002 ± 9.08e-003
f4    4.61e+001 ± 7.85e+001     7.44e+001 ± 1.56e+001     3.27e+000 ± 6.35e+000     2.91e+001 ± 8.06e+000
f5    1.27e+000 ± 4.79e+000     5.12e-007 ± 7.81e-007     3.74e-009 ± 5.96e-009     2.85e-014 ± 4.32e-015
f6    2.38e+000 ± 8.04e+000     3.13e-001 ± 3.74e-001     3.28e-002 ± 4.93e-002     7.75e-003 ± 1.21e-002
f7    5.21e+001 ± 4.19e+001     6.38e+001 ± 9.05e+000     4.23e+000 ± 8.24e+000     1.45e+001 ± 6.58e+000
f8    3.76e+003 ± 3.43e+003     2.56e+003 ± 7.31e+002     2.72e+003 ± 3.31e+003     3.16e+003 ± 4.65e+002
f9    1.35e+000 ± 1.02e+000     7.26e-005 ± 1.42e-005     4.07e-005 ± 8.26e-006     3.63e-001 ± 5.72e-001
f10   8.38e+001 ± 1.53e+002     1.75e+002 ± 5.87e+001     1.64e+002 ± 3.98e+001     4.67e+001 ± 1.44e+001
f11   1.16e-002 ± 1.28e-002     6.95e-005 ± 3.21e-005     1.13e-003 ± 2.45e-003     8.79e-003 ± 1.07e-002
f12   2.83e+001 ± 5.31e+000     1.35e+001 ± 8.39e+000     1.21e+001 ± 1.05e+001     2.35e+000 ± 1.41e+000
f13   5.51e+001 ± 2.07e+001     8.74e+001 ± 1.92e+001     7.67e+001 ± 7.43e+001     4.64e+001 ± 9.01e+000
f14   5.59e+003 ± 6.07e+002     2.67e+003 ± 7.72e+002     2.45e+003 ± 2.89e+002     3.80e+003 ± 6.18e+002
f15   -2.49e+002 ± 4.76e+001    -1.07e+002 ± 1.52e+002    -2.26e+002 ± 1.07e+001    -2.71e+002 ± 3.69e+001

F     CLPSO                     CPSO_H                    OLPSO                     PCPSO
f1    4.46e-014 ± 1.51e-014     1.16e-033 ± 2.26e-033     1.16e-038 ± 1.56e-038     1.12e-030 ± 5.14e-031
f2    2.10e+001 ± 2.71e+000     2.06e+001 ± 1.13e+001     1.74e+001 ± 1.05e+001     1.12e+001 ± 8.19e+000 *
f3    3.14e-010 ± 4.64e-010     3.63e-002 ± 7.11e-002     3.17e-010 ± 6.23e-011     3.36e-012 ± 7.51e-013 *
f4    4.76e-010 ± 2.17e-010     0 ± 0                     0 ± 0                     2.21e-012 ± 6.14e-013
f5    4.97e-014 ± 2.12e-014     4.56e-014 ± 1.25e-014     4.76e-015 ± 1.57e-016 *   4.83e-008 ± 1.38e-008
f6    4.34e-007 ± 2.09e-007     8.15e-015 ± 8.92e-016 *   1.27e-002 ± 2.79e-002     2.41e-011 ± 1.09e-011
f7    4.36e-010 ± 2.44e-010 *   1.21e-001 ± 3.19e-001     4.51e-008 ± 1.17e-008     2.65e-007 ± 2.27e-007
f8    1.26e-012 ± 8.62e-013 *   1.08e+003 ± 2.27e+002     3.88e-004 ± 6.21e-005     3.82e-004 ± 7.31e-006
f9    3.47e-004 ± 2.11e-004     2.13e+000 ± 3.55e-001     2.28e-005 ± 1.14e-005     2.25e-006 ± 1.08e-006 *
f10   3.35e+001 ± 5.58e+000     8.04e+001 ± 2.23e+001     5.77e+001 ± 1.27e+001     1.63e+000 ± 5.18e+000 *
f11   7.07e-005 ± 1.18e-005     5.54e-002 ± 4.79e-002     2.73e-005 ± 3.08e-005     1.22e-005 ± 5.15e-006 *
f12   3.09e+000 ± 1.51e+000 *   1.53e+001 ± 3.62e+000     7.03e+000 ± 1.41e+000     9.37e+000 ± 3.58e+000
f13   3.53e+001 ± 5.52e+000     8.60e+001 ± 2.61e+001     4.03e+001 ± 1.34e+001     1.07e+001 ± 2.37e+000 *
f14   2.46e+003 ± 1.88e+003 *   3.82e+003 ± 6.62e+002     3.15e+003 ± 1.27e+003     3.26e+003 ± 1.23e+003
f15   -2.97e+002 ± 2.13e+001    -2.38e+002 ± 7.42e+001    -2.76e+002 ± 3.47e+001    -3.08e+002 ± 1.02e+001 *
Table 5: t-test on 15 functions.

Functions  f1  f2  f3  f4  f5  f6  f7  f8
t-test     1   1   1   1   1   0   0   1

Functions  f9  f10 f11 f12 f13 f14 f15
t-test     0   1   1   1   1   0   0
nonseparable multimodal problems, which demonstrates the effectiveness and robustness of the PCPSO algorithm.
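The "statistically different with 95% certainty" decision behind Table 5 is a two-sample Wilcoxon rank-sum test over the 25 runs. A self-contained sketch using the normal approximation follows; the run data here are synthetic placeholders, not the paper's results:

```python
import math

def ranks(values):
    """1-based ranks, with ties sharing their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def ranksum_p(a, b):
    """Two-sided p-value of the Wilcoxon rank-sum test (normal approximation)."""
    n1, n2 = len(a), len(b)
    r = ranks(list(a) + list(b))
    w = sum(r[:n1])                              # rank sum of sample a
    mu = n1 * (n1 + n2 + 1) / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2))      # equals 2 * (1 - Phi(|z|))

# Synthetic final values from 25 runs of two algorithms (placeholders).
pcpso_runs = [1.5 + 0.01 * i for i in range(25)]
other_runs = [45.0 + 0.1 * i for i in range(25)]
print(int(ranksum_p(pcpso_runs, other_runs) < 0.05))  # 1: statistically different
```

Thresholding the p-value at 0.05 yields exactly the 1/0 entries reported in Table 5.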
4. Conclusion

In this paper a novel PCPSO algorithm is presented in order to overcome the problem of falling into local optima. In PCPSO, considering the complex interaction of the elements of the vector, the CPSO algorithm (First1D and Second1D) is used twice to create two locally optimal vectors. These two vectors are combined together by OED learning so that the best elements can be chosen, allowing the search to jump out of the local optimum and continue further.

Although the OED operation may leave the function value higher than the local optima in some cases, the function value always drops sharply after OED because the OED operation chooses the elements that are closer to the globally optimal vector. Experimental tests have been conducted on 15 benchmark functions including unimodal, multimodal, rotated, and shifted functions. From the comparative results it can be concluded that PCPSO significantly improves the performance of PSO and provides a better solution on most rotated multimodal problems.
Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments

This work was supported in part by the National Natural Science Foundation of China (51275353), the Macao Science and Technology Development Fund (108/2012/A3, 110/2013/A3), and the Research Committee of the University of Macau (MYRG183(Y1-L3)FST11-LYM, MYRG203(Y1-L4)-FST11-LYM).
References

[1] R. C. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39–43, Nagoya, Japan, October 1995.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[3] A. Adam, M. I. Shapiai, M. Z. M. Tumari, M. S. Mohamad, and M. Mubin, "Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization," The Scientific World Journal, vol. 2014, Article ID 973063, 13 pages, 2014.
[4] N. Geng, D. W. Gong, and Y. Zhang, "PSO-based robot path planning for multisurvivor rescue in limited survival time," Mathematical Problems in Engineering, vol. 2014, Article ID 187370, 10 pages, 2014.
[5] M. Shoaib and W.-C. Song, "Data aggregation for Vehicular Ad-hoc Network using particle swarm optimization," in Proceedings of the 14th Asia-Pacific Network Operations and Management Symposium, pp. 1–6, September 2012.
[6] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
[7] Y. H. Shi and R. C. Eberhart, "Parameter selection in particle swarm optimization," in Evolutionary Programming VII, vol. 1447 of Lecture Notes in Computer Science, pp. 591–600, Springer, Berlin, Germany, 1998.
[8] C. Li, S. Yang, and T. T. Nguyen, "A self-learning particle swarm optimizer for global optimization problems," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 42, no. 3, pp. 627–646, 2012.
[9] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[10] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[11] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the Congress on Evolutionary Computation (CEC '02), pp. 1671–1676, Honolulu, Hawaii, USA, May 2002.
[12] T. M. Blackwell and J. Branke, "Multiswarms, exclusion, and anti-convergence in dynamic environments," IEEE Transactions on Evolutionary Computation, vol. 10, no. 4, pp. 459–472, 2006.
[13] T. Peram, K. Veeramachaneni, and C. K. Mohan, "Fitness-distance-ratio based particle swarm optimization," in Proceedings of the IEEE Swarm Intelligence Symposium (SIS '03), pp. 174–181, IEEE, April 2003.
[14] X. Chen and Y. M. Li, "A modified PSO structure resulting in high exploration ability with convergence guaranteed," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 37, no. 5, pp. 1271–1289, 2007.
[15] Z.-H. Zhan, J. Zhang, Y. Li, and Y.-H. Shi, "Orthogonal learning particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 15, no. 6, pp. 832–847, 2011.
[16] S. Y. Ho, H. S. Lin, W. H. Liauh, and S. J. Ho, "OPSO: orthogonal particle swarm optimization and its application to task assignment problems," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 38, no. 2, pp. 288–298, 2008.
[17] Z.-H. Zhan, J. Zhang, Y. Li, and H. S.-H. Chung, "Adaptive particle swarm optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 39, no. 6, pp. 1362–1381, 2009.
[18] M. A. Potter and K. A. D. Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.
[19] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.
[20] G. Zhang and Y. M. Li, "Cooperative particle swarm optimizer with elimination mechanism for global optimization of multimodal problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 210–217, Beijing, China, July 2014.
[21] G. Zhang and Y. M. Li, "Orthogonal experimental design method used in particle swarm optimization for multimodal problems," in Proceedings of the 6th International Conference on Advanced Computational Intelligence (ICACI '13), pp. 183–188, Hangzhou, China, October 2013.
[22] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.
[23] D. C. Montgomery, Design and Analysis of Experiments, John Wiley & Sons, New York, NY, USA, 5th edition, 2000.
[24] S.-Y. Ho, L.-S. Shu, and J.-H. Chen, "Intelligent evolutionary algorithms for large parameter optimization problems," IEEE Transactions on Evolutionary Computation, vol. 8, no. 6, pp. 522–541, 2004.
[25] X. Yao, Y. Liu, and G. M. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[26] R. Salomon, "Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions: a survey of some theoretical and practical aspects of genetic algorithms," BioSystems, vol. 39, no. 3, pp. 263–278, 1996.
[27] SPSO 2007, http://www.particleswarm.info/Programs.html.
Mathematical Problems in Engineering 5
3. Experimental Results and Discussions

3.1. Test Functions. Test functions, including unimodal functions and multimodal functions, are used to investigate the performance of PSO algorithms [25]. We choose 15 test functions on 30 dimensions. The formulas and the properties of these functions are presented below.

Group A: unimodal and simple multimodal functions are as follows.
(1) Sphere function:

f_1(x) = \sum_{i=1}^{D} x_i^2.  (4)
(2) Rosenbrock's function:

f_2(x) = \sum_{i=1}^{D-1} \left[ 100 \left( x_{i+1} - x_i^2 \right)^2 + \left( x_i - 1 \right)^2 \right].  (5)
(3) Griewank's function:

f_3(x) = \sum_{i=1}^{D} \frac{x_i^2}{4000} - \prod_{i=1}^{D} \cos\left( \frac{x_i}{\sqrt{i}} \right) + 1.  (6)
(4) Rastrigin's function:

f_4(x) = \sum_{i=1}^{D} \left( x_i^2 - 10 \cos\left( 2\pi x_i \right) + 10 \right).  (7)
(5) Ackley's function:

f_5(x) = -20 \exp\left( -0.2 \sqrt{\frac{1}{D} \sum_{i=1}^{D} x_i^2} \right) - \exp\left( \frac{1}{D} \sum_{i=1}^{D} \cos\left( 2\pi x_i \right) \right) + 20 + e.  (8)
(6) Weierstrass function:

f_6(x) = \sum_{i=1}^{D} \left( \sum_{k=0}^{k_{\max}} \left[ a^k \cos\left( 2\pi b^k \left( x_i + 0.5 \right) \right) \right] \right) - D \sum_{k=0}^{k_{\max}} \left[ a^k \cos\left( \pi b^k \right) \right],
a = 0.5, \quad b = 3, \quad k_{\max} = 20.  (9)
(7) Noncontinuous Rastrigin's function:

f_7(x) = \sum_{i=1}^{D} \left( y_i^2 - 10 \cos\left( 2\pi y_i \right) + 10 \right),
y_i = \begin{cases} x_i, & |x_i| < 1/2 \\ \mathrm{round}(2x_i)/2, & |x_i| \ge 1/2 \end{cases} \quad \text{for } i = 1, 2, \ldots, D.  (10)
(8) Schwefel's function:

f_8(x) = 418.9829 \times D - \sum_{i=1}^{D} x_i \sin\left( |x_i|^{1/2} \right).  (11)
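The Group A definitions above translate directly into code. The following is a minimal NumPy sketch (the function names are our own); each implementation can be checked against the globally optimal vectors and function values listed in Table 2.

```python
import numpy as np

def sphere(x):
    # f1: sum of squares, global minimum f(0, ..., 0) = 0
    return np.sum(x ** 2)

def rosenbrock(x):
    # f2: curved narrow valley, global minimum f(1, ..., 1) = 0
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2)

def rastrigin(x):
    # f4: highly multimodal, global minimum f(0, ..., 0) = 0
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def ackley(x):
    # f5: nearly flat outer region with a deep central funnel, f(0, ..., 0) = 0
    d = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d) + 20.0 + np.e)

def schwefel(x):
    # f8: the global minimum lies near (420.96, ..., 420.96), far from the origin
    return 418.9829 * x.size - np.sum(x * np.sin(np.sqrt(np.abs(x))))
```

Evaluating each function at its optimum from Table 2 is a quick sanity check that a reimplementation is faithful.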
Group B: rotated and shifted multimodal functions are as follows.

In Group A, the functions can be divided into smaller search spaces because of the low interaction of their elements; therefore, the CPSO algorithm is suitable for optimizing these functions. In Group B, seven rotated multimodal functions are generated by an orthogonal matrix M to test the performance of the PCPSO algorithm. The new rotated vector y = Mx, obtained by left-multiplying the original vector x by the orthogonal matrix M, behaves like a vector of highly interacting elements because all elements of y change once any one element of x changes. A detailed illustration of generating the orthogonal matrix is given in [26]. Meanwhile, one shifted and rotated function is also listed in Group B to test the performance of the PSO algorithms. Table 2 shows the globally optimal vectors x*, the corresponding function values f(x*), the search range, and the initialization range of each function. Consider the following.
(9) Rotated Ackley's function:

f_9(x) = -20 \exp\left( -0.2 \sqrt{\frac{1}{D} \sum_{i=1}^{D} y_i^2} \right) - \exp\left( \frac{1}{D} \sum_{i=1}^{D} \cos\left( 2\pi y_i \right) \right) + 20 + e, \quad y = Mx.  (12)
(10) Rotated Rastrigin's function:

f_{10}(x) = \sum_{i=1}^{D} \left( y_i^2 - 10 \cos\left( 2\pi y_i \right) + 10 \right), \quad y = Mx.  (13)
(11) Rotated Griewank's function:

f_{11}(x) = \sum_{i=1}^{D} \frac{y_i^2}{4000} - \prod_{i=1}^{D} \cos\left( \frac{y_i}{\sqrt{i}} \right) + 1, \quad y = Mx.  (14)
(12) Rotated Weierstrass function:

f_{12}(x) = \sum_{i=1}^{D} \left( \sum_{k=0}^{k_{\max}} \left[ a^k \cos\left( 2\pi b^k \left( y_i + 0.5 \right) \right) \right] \right) - D \sum_{k=0}^{k_{\max}} \left[ a^k \cos\left( \pi b^k \right) \right],
a = 0.5, \quad b = 3, \quad k_{\max} = 20, \quad y = Mx.  (15)
Table 2: Globally optimal vector x*, corresponding function value f(x*), search range, and initialization range of the test functions.

Function | x* | f(x*) | Search range | Initialization range
f1  | [0, 0, ..., 0] | 0 | [−100, 100]^D | [−100, 50]^D
f2  | [1, 1, ..., 1] | 0 | [−2, 2]^D | [−2, 2]^D
f3  | [0, 0, ..., 0] | 0 | [−600, 600]^D | [−600, 200]^D
f4  | [0, 0, ..., 0] | 0 | [−5.12, 5.12]^D | [−5.12, 2]^D
f5  | [0, 0, ..., 0] | 0 | [−32, 32]^D | [−32, 16]^D
f6  | [0, 0, ..., 0] | 0 | [−0.5, 0.5]^D | [−0.5, 0.2]^D
f7  | [0, 0, ..., 0] | 0 | [−5.12, 5.12]^D | [−5.12, 2]^D
f8  | [420.96, 420.96, ..., 420.96] | 0 | [−500, 500]^D | [−500, 500]^D
f9  | [0, 0, ..., 0] | 0 | [−32, 32]^D | [−32, 16]^D
f10 | [0, 0, ..., 0] | 0 | [−5.12, 5.12]^D | [−5.12, 2]^D
f11 | [0, 0, ..., 0] | 0 | [−600, 600]^D | [−600, 200]^D
f12 | [0, 0, ..., 0] | 0 | [−0.5, 0.5]^D | [−0.5, 0.2]^D
f13 | [0, 0, ..., 0] | 0 | [−5.12, 5.12]^D | [−5.12, 2]^D
f14 | [420.96, 420.96, ..., 420.96] | 0 | [−500, 500]^D | [−500, 500]^D
f15 | [o_1, o_2, ..., o_D] | −330 | [−5.12, 5.12]^D | [−5.12, 2]^D
(13) Rotated noncontinuous Rastrigin's function:

f_{13}(x) = \sum_{i=1}^{D} \left( z_i^2 - 10 \cos\left( 2\pi z_i \right) + 10 \right),
z_i = \begin{cases} y_i, & |y_i| < 1/2 \\ \mathrm{round}(2y_i)/2, & |y_i| \ge 1/2 \end{cases} \quad \text{for } i = 1, 2, \ldots, D, \quad y = Mx.  (16)
(14) Rotated Schwefel's function:

f_{14}(x) = 418.9829 \times D - \sum_{i=1}^{D} z_i,
z_i = \begin{cases} y_i \sin\left( |y_i|^{1/2} \right), & \text{if } |y_i| \le 500 \\ 0, & \text{otherwise} \end{cases} \quad \text{for } i = 1, 2, \ldots, D,
y = y' + 420.96, \quad y' = M \left( x - 420.96 \right).  (17)
(15) Rotated and Shifted Rastrigin's function:

f_{15}(x) = \sum_{i=1}^{D} \left( z_i^2 - 10 \cos\left( 2\pi z_i \right) + 10 \right) - 330,
z = My, \quad y = x - o, \quad o = [o_1, o_2, \ldots, o_D],  (18)

where o = [o_1, o_2, ..., o_D] is the shifted globally optimal vector.
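The rotated and shifted constructions above can be sketched in a few lines. Note that [26] describes a particular way of building the orthogonal matrix M; as a stand-in, this sketch takes Q from the QR decomposition of a Gaussian matrix, which also yields an orthogonal matrix. The function names are our own.

```python
import numpy as np

def random_orthogonal_matrix(d, rng):
    # QR decomposition of a Gaussian matrix gives an orthogonal Q;
    # a stand-in for the construction described in [26].
    q, r = np.linalg.qr(rng.standard_normal((d, d)))
    return q * np.sign(np.diag(r))  # fix column signs

def rastrigin(x):
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def rotated_rastrigin(x, m):
    # f10: y = M x couples all elements, making the function nonseparable
    return rastrigin(m @ x)

def rotated_shifted_rastrigin(x, m, o):
    # f15: shift by o first, then rotate; the bias -330 moves the optimum value
    return rastrigin(m @ (x - o)) - 330.0
```

Because M is orthogonal, the landscape is only rotated, not rescaled, so f10 still attains its minimum 0 at x = 0 and f15 attains −330 at x = o.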
3.2. PCPSO's Performance on 15 Functions. The performance of PCPSO is tested on functions f1–f15, which include multimodal, rotated, and shifted functions in 30 dimensions.

[Figure 3: The convergence characteristics of the PCPSO algorithm on test functions f1–f8 in 30 iterations. Axes: log(f(x)) versus iteration; curves: Sphere, Rosenbrock, Griewank, Rastrigin, Ackley, Weierstrass, noncontinuous Rastrigin, and Schwefel functions.]

The parameter settings for the PCPSO algorithm are as follows: ω decreases linearly from 0.9 to 0.4, c1 = c2 = 1.494, and the particle number M = 10. Since the vectors of f1–f8 have very low interacting elements, Figure 3 shows the convergence characteristics of the PCPSO algorithm on test functions f1–f8 over 30 iterations, where the y-axis is expressed in log format. From Figure 3 we can see that all the function values drop significantly within 30 iterations except for Rosenbrock's function. In order to fully test PCPSO's performance on Rosenbrock's function, we make a comparison between the PCPSO and CPSO algorithms in Figure 4. The upper plot of Figure 4 shows the convergence characteristic of PCPSO, which cooperates with
[Figure 4: The convergence characteristics of the PCPSO and CPSO algorithms on Rosenbrock's function: (a) PCPSO (First1D and Second1D); (b) CPSO. Axes: function value versus iteration (0–500).]
the two independent algorithms (First1D and Second1D) by using OED learning; the lower plot shows the convergence characteristic of the CPSO algorithm. From Figure 4 we can see that the OED learning makes the function vector jump out of the locally optimal vector and finally drops the function value. The convergence characteristics for the remaining six test functions f10–f15 are shown in Figure 5. All six are rotated multimodal and nonseparable functions. Similar to Figure 2, despite the fact that the function value of Second1D increases in some cases at the start point, caused by OED learning, it drops sharply as the iterations increase. This phenomenon is especially obvious on test function f12. To sum up, PCPSO works well in terms of escaping locally optimal vectors and dropping the function value sharply. The OED learning finds the best combination of the two locally optimal vectors X_L1 and X_L2 at iterations 80, 160, and 240. However, there are still some cases, such as f14, in which the vector falls into a locally optimal vector; the reason is that once the vectors X_L1 and X_L2 become equal to each other, the OED learning ability is finally lost.
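The OED learning step discussed above, which picks the better elements of the two locally optimal vectors X_L1 and X_L2, can be illustrated with a two-level orthogonal array plus factor analysis. This is only a sketch under our own naming, not the authors' exact implementation: it builds the array from a Sylvester-type Hadamard matrix and assigns one dimension per factor, whereas practical variants may group dimensions to keep the array small.

```python
import numpy as np

def two_level_oa(n_factors):
    # Build a two-level orthogonal array from a Sylvester Hadamard matrix;
    # rows index trials, columns index factors (levels coded 0/1).
    m, h = 1, np.array([[1]])
    while m - 1 < n_factors:
        h = np.block([[h, h], [h, -h]])
        m *= 2
    return (h[:, 1:n_factors + 1] < 0).astype(int)  # drop the all-ones column

def oed_combine(f, xa, xb):
    """Choose, dimension by dimension, the better elements of two candidate
    vectors xa and xb via orthogonal trials and factor analysis (a sketch
    of OED learning; function names are our own)."""
    d = xa.size
    oa = two_level_oa(d)                    # trials x factors, entries 0/1
    trials = np.where(oa == 0, xa, xb)      # level 0 -> xa_i, level 1 -> xb_i
    vals = np.array([f(t) for t in trials])
    # factor analysis: mean response of each level in every dimension
    best = np.empty(d)
    for i in range(d):
        mean0 = vals[oa[:, i] == 0].mean()
        mean1 = vals[oa[:, i] == 1].mean()
        best[i] = xa[i] if mean0 <= mean1 else xb[i]
    return best
```

For a D-dimensional vector, the array needs only on the order of D trials instead of the 2^D exhaustive combinations, which is the point of using OED here.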
3.3. Comparison with Other PSOs and Discussions. Seven PSO variants are taken from the literature for comparison on the 15 test functions with 30 dimensions. In the following, we briefly describe the peer algorithms shown in Table 3. The first algorithm is the standard PSO (SPSO), whose performance is improved by using a random topology. The second algorithm, fully informed PSO (FIPS), uses all the neighbor particles to influence the flying velocity. The third, orthogonal PSO (OPSO), aims to generate a better position by using OED. The fourth algorithm, fitness-distance-ratio PSO (FDR-PSO), solves the premature convergence problem
Table 3: Parameter settings for the seven PSO variants.

Algorithm | Parameter settings
SPSO [27] | ω: 0.4–0.9, c1 = c2 = 1.193
FIPS [10] | χ = 0.7298, Σ c_i = 4.1
OPSO [16] | ω: 0.4–0.9, c1 = c2 = 2.0
FDR-PSO [13] | ω: 0.4–0.9, Σ c_i = 4.0
CLPSO [6] | ω: 0.4–0.9, c1 = c2 = 2.0
CPSO-H [19] | ω: 0.4–0.9, c = 1.49, κ = 6
OLPSO [15] | ω: 0.4–0.9, c = 2.0, G = 5
by using the fitness distance ratio. The fifth, comprehensive learning PSO (CLPSO), is proposed for solving multimodal problems. The sixth, cooperative PSO (CPSO-H), uses one-dimensional swarms to search each dimension separately. The seventh, orthogonal learning PSO (OLPSO), guides particles to fly towards an exemplar constructed from P_i and P_n by the OED method.

The swarm size is set to 40 for the seven PSO variants mentioned above and to 10 for the PCPSO algorithm. The maximal number of FEs for the eight algorithms is set to 200,000 in each run on each test function. All functions were run 25 times, and the mean values and standard deviations of the results are presented in Table 4. The best results are shown in boldface. From the results we observe that CPSO-H, CLPSO, and OLPSO perform well on the first 8 functions. The reason is that CPSO-H suits 1-dimensional search well, while CLPSO and OLPSO can search further by using their own learning strategies: the strategy in CLPSO prevents falling into local optima through random learning, whereas the strategy in OLPSO is to find a better combination of the locally optimal vector. The PCPSO method keeps searching through the cooperation of the two locally optimal vectors X_L1 and X_L2. With the implementation of Second1D, the algorithm can always find a better vector X_L2 than the previous best one X_L1 by using the OED method. The PCPSO method performs well on f2, f3, f9–f11, f13, and f15. However, the function values obtained by this method are not close enough to the globally optimal values; premature convergence still happens on f12 and f14 due to a lack of vector diversity. The advantages of this method are its rapid convergence speed and robustness: whatever the test function is, the algorithm can find an acceptable optimum, especially when facing complex functions. The nonparametric Wilcoxon rank sum tests are taken as the t-test in Table 5 to determine whether the results obtained by PCPSO are statistically different from the best results achieved by the other algorithms. A value of one indicates that the performance of the two algorithms is statistically different with 95% certainty, whereas a value of zero implies that the performance is not statistically different. Among these 15 functions, there are 10 functions that are statistically different between the best results and the results of PCPSO. From Tables 4 and 5 we can see that PCPSO obtains 5 best results (f2, f3, f10, f11, f13) that are statistically different from the others. In addition, most of the best results obtained by PCPSO are focused on
[Figure 5: The convergence progress of the 30-dimensional test functions using PCPSO (function value versus iteration, curves for First1D and Second1D): (a) Rotated Rastrigin's function; (b) Rotated Griewank's function; (c) Rotated Weierstrass function; (d) Rotated noncontinuous Rastrigin's function; (e) Rotated Schwefel's function; (f) Rotated and Shifted Rastrigin's function.]
Table 4: Results for the 30-D functions (mean ± standard deviation over 25 runs; the best result for each function is marked with *).

Function | SPSO | FIPS | OPSO | FDR-PSO
f1  | 1.16e−069 ± 2.86e−068 | 2.69e−012 ± 2.28e−011 | 6.71e−018 ± 6.82e−018 | *4.88e−102 ± 1.53e−101
f2  | 1.68e+001 ± 8.56e+000 | 2.47e+001 ± 2.19e−001 | 4.12e+001 ± 1.19e+001 | 1.54e+001 ± 6.79e+000
f3  | 1.73e−002 ± 1.08e−001 | 4.65e−003 ± 3.71e−002 | 2.10e−003 ± 6.19e−003 | 1.72e−002 ± 9.08e−003
f4  | 4.61e+001 ± 7.85e+001 | 7.44e+001 ± 1.56e+001 | 3.27e+000 ± 6.35e+000 | 2.91e+001 ± 8.06e+000
f5  | 1.27e+000 ± 4.79e+000 | 5.12e−007 ± 7.81e−007 | 3.74e−009 ± 5.96e−009 | 2.85e−014 ± 4.32e−015
f6  | 2.38e+000 ± 8.04e+000 | 3.13e−001 ± 3.74e−001 | 3.28e−002 ± 4.93e−002 | 7.75e−003 ± 1.21e−002
f7  | 5.21e+001 ± 4.19e+001 | 6.38e+001 ± 9.05e+000 | 4.23e+000 ± 8.24e+000 | 1.45e+001 ± 6.58e+000
f8  | 3.76e+003 ± 3.43e+003 | 2.56e+003 ± 7.31e+002 | 2.72e+003 ± 3.31e+003 | 3.16e+003 ± 4.65e+002
f9  | 1.35e+000 ± 1.02e+000 | 7.26e−005 ± 1.42e−005 | 4.07e−005 ± 8.26e−006 | 3.63e−001 ± 5.72e−001
f10 | 8.38e+001 ± 1.53e+002 | 1.75e+002 ± 5.87e+001 | 1.64e+002 ± 3.98e+001 | 4.67e+001 ± 1.44e+001
f11 | 1.16e−002 ± 1.28e−002 | 6.95e−005 ± 3.21e−005 | 1.13e−003 ± 2.45e−003 | 8.79e−003 ± 1.07e−002
f12 | 2.83e+001 ± 5.31e+000 | 1.35e+001 ± 8.39e+000 | 1.21e+001 ± 1.05e+001 | 2.35e+000 ± 1.41e+000
f13 | 5.51e+001 ± 2.07e+001 | 8.74e+001 ± 1.92e+001 | 7.67e+001 ± 7.43e+001 | 4.64e+001 ± 9.01e+000
f14 | 5.59e+003 ± 6.07e+002 | 2.67e+003 ± 7.72e+002 | 2.45e+003 ± 2.89e+002 | 3.80e+003 ± 6.18e+002
f15 | −2.49e+002 ± 4.76e+001 | −1.07e+002 ± 1.52e+002 | −2.26e+002 ± 1.07e+001 | −2.71e+002 ± 3.69e+001

Function | CLPSO | CPSO-H | OLPSO | PCPSO
f1  | 4.46e−014 ± 1.51e−014 | 1.16e−033 ± 2.26e−033 | 1.16e−038 ± 1.56e−038 | 1.12e−030 ± 5.14e−031
f2  | 2.10e+001 ± 2.71e+000 | 2.06e+001 ± 1.13e+001 | 1.74e+001 ± 1.05e+001 | *1.12e+001 ± 8.19e+000
f3  | 3.14e−010 ± 4.64e−010 | 3.63e−002 ± 7.11e−002 | 3.17e−010 ± 6.23e−011 | *3.36e−012 ± 7.51e−013
f4  | 4.76e−010 ± 2.17e−010 | 0 ± 0 | 0 ± 0 | 2.21e−012 ± 6.14e−013
f5  | 4.97e−014 ± 2.12e−014 | 4.56e−014 ± 1.25e−014 | *4.76e−015 ± 1.57e−016 | 4.83e−008 ± 1.38e−008
f6  | 4.34e−007 ± 2.09e−007 | *8.15e−015 ± 8.92e−016 | 1.27e−002 ± 2.79e−002 | 2.41e−011 ± 1.09e−011
f7  | *4.36e−010 ± 2.44e−010 | 1.21e−001 ± 3.19e−001 | 4.51e−008 ± 1.17e−008 | 2.65e−007 ± 2.27e−007
f8  | *1.26e−012 ± 8.62e−013 | 1.08e+003 ± 2.27e+002 | 3.88e−004 ± 6.21e−005 | 3.82e−004 ± 7.31e−006
f9  | 3.47e−004 ± 2.11e−004 | 2.13e+000 ± 3.55e−001 | 2.28e−005 ± 1.14e−005 | *2.25e−006 ± 1.08e−006
f10 | 3.35e+001 ± 5.58e+000 | 8.04e+001 ± 2.23e+001 | 5.77e+001 ± 1.27e+001 | *1.63e+000 ± 5.18e+000
f11 | 7.07e−005 ± 1.18e−005 | 5.54e−002 ± 4.79e−002 | 2.73e−005 ± 3.08e−005 | *1.22e−005 ± 5.15e−006
f12 | *3.09e+000 ± 1.51e+000 | 1.53e+001 ± 3.62e+000 | 7.03e+000 ± 1.41e+000 | 9.37e+000 ± 3.58e+000
f13 | 3.53e+001 ± 5.52e+000 | 8.60e+001 ± 2.61e+001 | 4.03e+001 ± 1.34e+001 | *1.07e+001 ± 2.37e+000
f14 | *2.46e+003 ± 1.88e+003 | 3.82e+003 ± 6.62e+002 | 3.15e+003 ± 1.27e+003 | 3.26e+003 ± 1.23e+003
f15 | −2.97e+002 ± 2.13e+001 | −2.38e+002 ± 7.42e+001 | −2.76e+002 ± 3.47e+001 | *−3.08e+002 ± 1.02e+001
Table 5: t-test on the 15 functions.

Function | f1 | f2 | f3 | f4 | f5 | f6 | f7 | f8
t-test   | 1  | 1  | 1  | 1  | 1  | 0  | 0  | 1

Function | f9 | f10 | f11 | f12 | f13 | f14 | f15
t-test   | 0  | 1   | 1   | 1   | 1   | 0   | 0
nonseparable multimodal problems, which demonstrates the effectiveness and robustness of the PCPSO algorithm.
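The Wilcoxon rank-sum comparison behind Table 5 can be reproduced in a few lines. The sketch below is our own minimal implementation, not the authors' code: it uses the normal approximation and assumes tie-free samples, which is reasonable for real-valued error measurements from 25 independent runs.

```python
import numpy as np
from math import erfc, sqrt

def ranksum_p(a, b):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation.
    Assumes continuous data (no ties) in the two samples a and b."""
    n1, n2 = len(a), len(b)
    order = np.concatenate([a, b]).argsort()
    ranks = np.empty(n1 + n2)
    ranks[order] = np.arange(1, n1 + n2 + 1)       # rank 1 = smallest value
    w = ranks[:n1].sum()                           # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    return erfc(abs(z) / sqrt(2.0))                # two-sided p-value
```

A p-value below 0.05 corresponds to the "1" entries of Table 5 (statistically different with 95% certainty); values at or above 0.05 correspond to "0".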
4. Conclusion

In this paper, a novel PCPSO algorithm is presented in order to overcome the problem of falling into local optima. In PCPSO, considering the complex interaction of the elements of the vector, the CPSO (First1D and Second1D) algorithm is used twice to create two locally optimal vectors. These two vectors are combined together by OED learning so that the best elements can be chosen to jump out of the local optimum and search further.

Although the OED operation may make the function value higher than the local optima in some cases, the function value always drops sharply after OED because the OED operation chooses the elements that are closer to the globally optimal vector. Experimental tests have been conducted on 15 benchmark functions, including unimodal, multimodal, rotated, and shifted functions. From the comparative results, it can be concluded that PCPSO significantly improves the performance of PSO and provides a better solution on most rotated multimodal problems.
Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments

This work was supported in part by the National Natural Science Foundation of China (51275353), the Macao Science and Technology Development Fund (108/2012/A3, 110/2013/A3), and the Research Committee of the University of Macau (MYRG183(Y1-L3)FST11-LYM, MYRG203(Y1-L4)-FST11-LYM).
References

[1] R. C. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39–43, Nagoya, Japan, October 1995.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[3] A. Adam, M. I. Shapiai, M. Z. M. Tumari, M. S. Mohamad, and M. Mubin, "Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization," The Scientific World Journal, vol. 2014, Article ID 973063, 13 pages, 2014.
[4] N. Geng, D. W. Gong, and Y. Zhang, "PSO-based robot path planning for multisurvivor rescue in limited survival time," Mathematical Problems in Engineering, vol. 2014, Article ID 187370, 10 pages, 2014.
[5] M. Shoaib and W.-C. Song, "Data aggregation for Vehicular Ad-hoc Network using particle swarm optimization," in Proceedings of the 14th Asia-Pacific Network Operations and Management Symposium, pp. 1–6, September 2012.
[6] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
[7] Y. H. Shi and R. C. Eberhart, "Parameter selection in particle swarm optimization," in Evolutionary Programming VII, vol. 1447 of Lecture Notes in Computer Science, pp. 591–600, Springer, Berlin, Germany, 1998.
[8] C. Li, S. Yang, and T. T. Nguyen, "A self-learning particle swarm optimizer for global optimization problems," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 42, no. 3, pp. 627–646, 2012.
[9] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[10] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[11] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the Congress on Evolutionary Computation (CEC '02), pp. 1671–1676, Honolulu, Hawaii, USA, May 2002.
[12] T. M. Blackwell and J. Branke, "Multiswarms, exclusion, and anti-convergence in dynamic environments," IEEE Transactions on Evolutionary Computation, vol. 10, no. 4, pp. 459–472, 2006.
[13] T. Peram, K. Veeramachaneni, and C. K. Mohan, "Fitness-distance-ratio based particle swarm optimization," in Proceedings of the IEEE Swarm Intelligence Symposium (SIS '03), pp. 174–181, IEEE, April 2003.
[14] X. Chen and Y. M. Li, "A modified PSO structure resulting in high exploration ability with convergence guaranteed," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 37, no. 5, pp. 1271–1289, 2007.
[15] Z.-H. Zhan, J. Zhang, Y. Li, and Y.-H. Shi, "Orthogonal learning particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 15, no. 6, pp. 832–847, 2011.
[16] S. Y. Ho, H. S. Lin, W. H. Liauh, and S. J. Ho, "OPSO: orthogonal particle swarm optimization and its application to task assignment problems," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 38, no. 2, pp. 288–298, 2008.
[17] Z.-H. Zhan, J. Zhang, Y. Li, and H. S.-H. Chung, "Adaptive particle swarm optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 39, no. 6, pp. 1362–1381, 2009.
[18] M. A. Potter and K. A. D. Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.
[19] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.
[20] G. Zhang and Y. M. Li, "Cooperative particle swarm optimizer with elimination mechanism for global optimization of multimodal problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 210–217, Beijing, China, July 2014.
[21] G. Zhang and Y. M. Li, "Orthogonal experimental design method used in particle swarm optimization for multimodal problems," in Proceedings of the 6th International Conference on Advanced Computational Intelligence (ICACI '13), pp. 183–188, Hangzhou, China, October 2013.
[22] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.
[23] D. C. Montgomery, Design and Analysis of Experiments, John Wiley & Sons, New York, NY, USA, 5th edition, 2000.
[24] S.-Y. Ho, L.-S. Shu, and J.-H. Chen, "Intelligent evolutionary algorithms for large parameter optimization problems," IEEE Transactions on Evolutionary Computation, vol. 8, no. 6, pp. 522–541, 2004.
[25] X. Yao, Y. Liu, and G. M. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[26] R. Salomon, "Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions: a survey of some theoretical and practical aspects of genetic algorithms," BioSystems, vol. 39, no. 3, pp. 263–278, 1996.
[27] SPSO 2007, http://www.particleswarm.info/Programs.html.
6 Mathematical Problems in Engineering
Table 2: Globally optimal vector x*, corresponding function value f(x*), search range, and initialization range of test functions.

Function   x*                      f(x*)   Search range       Initialization range
f1         [0, 0, ..., 0]          0       [-100, 100]^D      [-100, 50]^D
f2         [1, 1, ..., 1]          0       [-2, 2]^D          [-2, 2]^D
f3         [0, 0, ..., 0]          0       [-600, 600]^D      [-600, 200]^D
f4         [0, 0, ..., 0]          0       [-5.12, 5.12]^D    [-5.12, 2]^D
f5         [0, 0, ..., 0]          0       [-32, 32]^D        [-32, 16]^D
f6         [0, 0, ..., 0]          0       [-0.5, 0.5]^D      [-0.5, 0.2]^D
f7         [0, 0, ..., 0]          0       [-5.12, 5.12]^D    [-5.12, 2]^D
f8         [420.96, ..., 420.96]   0       [-500, 500]^D      [-500, 500]^D
f9         [0, 0, ..., 0]          0       [-32, 32]^D        [-32, 16]^D
f10        [0, 0, ..., 0]          0       [-5.12, 5.12]^D    [-5.12, 2]^D
f11        [0, 0, ..., 0]          0       [-600, 600]^D      [-600, 200]^D
f12        [0, 0, ..., 0]          0       [-0.5, 0.5]^D      [-0.5, 0.2]^D
f13        [0, 0, ..., 0]          0       [-5.12, 5.12]^D    [-5.12, 2]^D
f14        [420.96, ..., 420.96]   0       [-500, 500]^D      [-500, 500]^D
f15        [o1, o2, ..., oD]       -330    [-5.12, 5.12]^D    [-5.12, 2]^D
(13) Rotated noncontinuous Rastrigin's function:

$$f_{13}(x) = \sum_{i=1}^{D} \left( z_i^{2} - 10\cos(2\pi z_i) + 10 \right),$$

$$z_i = \begin{cases} y_i, & |y_i| < \frac{1}{2}, \\ \frac{\mathrm{round}(2 y_i)}{2}, & |y_i| \ge \frac{1}{2}, \end{cases} \qquad \text{for } i = 1, 2, \ldots, D, \quad y = M * x. \tag{16}$$

(14) Rotated Schwefel's function:

$$f_{14}(x) = 418.9829 \times D - \sum_{i=1}^{D} z_i,$$

$$z_i = \begin{cases} y_i \sin\left( |y_i|^{1/2} \right), & \text{if } |y_i| \le 500, \\ 0, & \text{otherwise}, \end{cases} \qquad \text{for } i = 1, 2, \ldots, D,$$

$$y = y' + 420.96, \qquad y' = M * (x - 420.96). \tag{17}$$

(15) Rotated and Shifted Rastrigin's function:

$$f_{15}(x) = \sum_{i=1}^{D} \left( z_i^{2} - 10\cos(2\pi z_i) + 10 \right) - 330,$$

$$z = M * y, \qquad y = x - o, \qquad o = [o_1, o_2, \ldots, o_D], \tag{18}$$

where $o = [o_1, o_2, \ldots, o_D]$ is the shifted globally optimal vector.
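The three rotated benchmark functions above can be sketched in NumPy as follows. This is an illustrative implementation, not the authors' code; it assumes M is a D x D rotation matrix supplied by the caller.

```python
import numpy as np

def f13(x, M):
    """Rotated noncontinuous Rastrigin's function, Eq. (16)."""
    y = M @ x
    # elements with |y_i| >= 1/2 are snapped to the nearest half-integer
    z = np.where(np.abs(y) < 0.5, y, np.round(2.0 * y) / 2.0)
    return float(np.sum(z ** 2 - 10.0 * np.cos(2.0 * np.pi * z) + 10.0))

def f14(x, M):
    """Rotated Schwefel's function, Eq. (17)."""
    y = M @ (x - 420.96) + 420.96
    z = np.where(np.abs(y) <= 500.0, y * np.sin(np.sqrt(np.abs(y))), 0.0)
    return float(418.9829 * x.size - np.sum(z))

def f15(x, M, o):
    """Rotated and Shifted Rastrigin's function, Eq. (18)."""
    z = M @ (x - o)
    return float(np.sum(z ** 2 - 10.0 * np.cos(2.0 * np.pi * z) + 10.0) - 330.0)
```

With M set to the identity and x at the optima of Table 2, f13 returns 0 and f15 returns -330 exactly, while f14 evaluates to approximately 0 (the residual comes from the rounded constants 418.9829 and 420.96).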
3.2. PCPSO's Performance on 15 Functions. The performance of PCPSO is tested on functions f1–f15, which include multimodal, rotated, and shifted functions in 30 dimensions. The
Figure 3: The convergence characteristics of the PCPSO algorithm on test functions f1–f8 in 30 iterations (y-axis: log(f(x)); curves: Sphere, Rosenbrock, Griewank, Rastrigin's, Ackley, Weierstrass, Noncontinuous Rastrigin's, and Schwefel's functions).
parameter settings for the PCPSO algorithm are as follows: ω decreases linearly from 0.9 to 0.4, c1 = c2 = 1.494, and the particle number M = 10. Since the vectors of f1–f8 have very weakly interacting elements, Figure 3 shows the convergence characteristics of the PCPSO algorithm on test functions f1–f8 over 30 iterations, where the y-axis is expressed in log format. From Figure 3 we can see that all the function values drop significantly within 30 iterations except for Rosenbrock's function. To fully test PCPSO's performance on Rosenbrock's function, we compare PCPSO with the CPSO algorithm in Figure 4. The upper plot of Figure 4 shows the convergence characteristic of PCPSO, which cooperates with
Figure 4: The convergence characteristics of the PCPSO and CPSO algorithms on Rosenbrock's function: (a) PCPSO (First1D and Second1D); (b) CPSO.
the two independent algorithms (First1D and Second1D) by using OED learning; the lower plot shows the convergence characteristic of the CPSO algorithm. From Figure 4 we can see that the OED learning makes the function vector jump out of the locally optimal vector and finally drops the function value. The convergence characteristics for the remaining six test functions f10–f15 are shown in Figure 5. All six functions are rotated, multimodal, and nonseparable. Similar to Figure 2, despite the fact that in some cases the function value of Second1D increases at the start point because of OED learning, it drops sharply as the iterations increase. This phenomenon is especially obvious in test function f12. To sum up, PCPSO works well in terms of escaping locally optimal vectors and dropping the function value sharply. The OED learning finds the best combination of the two locally optimal vectors X_L1 and X_L2 at iterations 80, 160, and 240. However, there are still some drawbacks that cause the vector to fall into a locally optimal vector, as in f14; the reason is that once the vectors X_L1 and X_L2 become equal to each other, the OED learning ability is finally lost.
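The OED learning step that combines the two locally optimal vectors can be sketched as below. This is a minimal illustration under stated assumptions: a two-level orthogonal array built by the standard parity (Hadamard-type) construction, one factor per dimension, and a minimization objective f. The paper's actual array size and factor grouping may differ; the function names are ours.

```python
import numpy as np

def two_level_oa(n_factors):
    # L_{2^k} orthogonal array: entry (r, c) is the parity of r & mask_c,
    # giving a strength-2 design with levels 0/1 for up to 2^k - 1 factors.
    k = 1
    while (1 << k) - 1 < n_factors:
        k += 1
    rows = 1 << k
    cols = [[bin(r & c).count("1") % 2 for r in range(rows)]
            for c in range(1, n_factors + 1)]
    return np.array(cols).T                      # shape: (rows, n_factors)

def oed_combine(x_l1, x_l2, f):
    """Mix elements of X_L1 (level 0) and X_L2 (level 1) via OED trials."""
    D = x_l1.size
    A = two_level_oa(D)
    trials = np.where(A == 0, x_l1, x_l2)        # each row is one mixed vector
    vals = np.array([f(t) for t in trials])
    # factor analysis: per dimension, keep the level with the lower mean cost
    best = np.where([vals[A[:, d] == 0].mean() <= vals[A[:, d] == 1].mean()
                     for d in range(D)], x_l1, x_l2)
    cand = np.vstack([trials, best])
    return cand[np.argmin([f(c) for c in cand])]
```

For a 30-dimensional vector this builds a 32-trial array, which is OED's appeal: the best per-dimension combination is estimated from 32 evaluations instead of the 2^30 exhaustive combinations of X_L1 and X_L2.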
3.3. Comparison with Other PSOs and Discussions. Seven PSO variants are taken from the literature for comparison on 15 test functions with 30 dimensions. In the following we briefly describe the peer algorithms, whose parameter settings are shown in Table 3. The first algorithm is the standard PSO (SPSO), whose performance is improved by using a random topology. The second algorithm, fully informed PSO (FIPS), uses all the neighbor particles to influence the flying velocity. The third, orthogonal PSO (OPSO), aims to generate a better position by using OED. The fourth algorithm, fitness-distance-ratio PSO (FDR PSO), solves the premature convergence problem
Table 3: Parameter settings for the seven PSO variants.

Algorithm      Parameter settings
SPSO [27]      ω: 0.4–0.9, c1 = c2 = 1.193
FIPS [10]      χ = 0.7298, Σci = 4.1
OPSO [16]      ω: 0.4–0.9, c1 = c2 = 2.0
FDR PSO [13]   ω: 0.4–0.9, Σci = 4.0
CLPSO [6]      ω: 0.4–0.9, c1 = c2 = 2.0
CPSO H [19]    ω: 0.4–0.9, c = 1.49, κ = 6
OLPSO [15]     ω: 0.4–0.9, c = 2.0, G = 5
by using the fitness distance ratio. The fifth, comprehensive learning PSO (CLPSO), is proposed for solving multimodal problems. The sixth, cooperative PSO (CPSO H), uses one-dimensional swarms to search each dimension separately. The seventh, orthogonal learning PSO (OLPSO), guides particles to fly towards an exemplar constructed from P_i and P_n by using the OED method.
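All the variants in Table 3 share the same underlying inertia-weight velocity update with a linearly decreasing ω. A generic sketch follows; it is not the authors' PCPSO implementation, and the small swarm driver and sphere objective are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

def pso_step(x, v, pbest, gbest, it, max_it,
             w_max=0.9, w_min=0.4, c1=1.494, c2=1.494):
    # inertia weight decreases linearly from w_max to w_min over the run
    w = w_max - (w_max - w_min) * it / max_it
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v

def sphere(x):
    return float(np.sum(x * x))

# tiny driver: 10 particles on the 2-D sphere function
swarm = rng.uniform(-100.0, 50.0, size=(10, 2))   # f1's initialization range
vel = np.zeros_like(swarm)
pbest, pval = swarm.copy(), np.array([sphere(p) for p in swarm])
gbest = pbest[pval.argmin()].copy()
for it in range(100):
    for i in range(len(swarm)):
        swarm[i], vel[i] = pso_step(swarm[i], vel[i], pbest[i], gbest, it, 100)
        if (fv := sphere(swarm[i])) < pval[i]:
            pval[i], pbest[i] = fv, swarm[i].copy()
    gbest = pbest[pval.argmin()].copy()
```

The variants differ mainly in what replaces pbest/gbest in this update: all neighbors in FIPS, an OED-built exemplar in OPSO/OLPSO, randomly assembled exemplars in CLPSO, and per-dimension subswarms in CPSO H.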
The swarm size is set to 40 for the seven PSO variants mentioned above and to 10 for the PCPSO algorithm. The maximal number of FEs for the eight algorithms is set to 200,000 in each run of each test function. All functions were run 25 times, and the mean values and standard deviations of the results are presented in Table 4, where the best results are marked. From the results we observe that CPSO H, CLPSO, and OLPSO perform well on the first 8 functions. The reason is that CPSO H suits 1-dimensional search well, while CLPSO and OLPSO can search further by using their own learning strategies: the strategy in CLPSO is to prevent falling into local optima by random learning, while in OLPSO the strategy is to find a better combination of the locally optimal vector. The PCPSO method can keep searching through the cooperation of the two locally optimal vectors X_L1 and X_L2. With the implementation of Second1D, the algorithm can always find a better vector X_L2 compared with the previous best one X_L1 by using the OED method. The PCPSO method performs well on f2, f3, f9–f11, f13, and f15. However, the function values obtained by this method are not close enough to the globally optimal values; premature convergence still happens on f12 and f14 due to a lack of vector diversity. The advantages of this method are its rapid convergence speed and robustness: whatever the test functions are, the algorithm can find an acceptable optimum, especially when facing complex functions. The nonparametric Wilcoxon rank sum tests are taken as the t-test in Table 5 to determine whether the results obtained by PCPSO are statistically different from the best results achieved by the other algorithms. A value of one indicates that the performance of the two algorithms is statistically different with 95% certainty, whereas a value of zero implies that the performance is not statistically different. Among these 15 functions, there are 10 functions that are statistically different between the best results and the results of PCPSO. From Tables 4 and 5 we can see that PCPSO obtains 5 best results (f2, f3, f10, f11, f13) that are statistically different from the others. In addition, most of the best functions obtained by PCPSO are focused on
Figure 5: The convergence progress of 30-dimensional test functions using PCPSO (curves: First1D and Second1D): (a) Rotated Rastrigin's function; (b) Rotated Griewank's function; (c) Rotated Weierstrass function; (d) Rotated noncontinuous Rastrigin's function; (e) Rotated Schwefel's function; (f) Rotated and Shifted Rastrigin's function.
Table 4: Results for 30-D functions (mean ± standard deviation over 25 runs; * marks the best result for each function).

F     SPSO                       FIPS                       OPSO                       FDR PSO
f1    1.16e-069 ± 2.86e-068      2.69e-012 ± 2.28e-011      6.71e-018 ± 6.82e-018      4.88e-102 ± 1.53e-101 *
f2    1.68e+001 ± 8.56e+000      2.47e+001 ± 2.19e-001      4.12e+001 ± 1.19e+001      1.54e+001 ± 6.79e+000
f3    1.73e-002 ± 1.08e-001      4.65e-003 ± 3.71e-002      2.10e-003 ± 6.19e-003      1.72e-002 ± 9.08e-003
f4    4.61e+001 ± 7.85e+001      7.44e+001 ± 1.56e+001      3.27e+000 ± 6.35e+000      2.91e+001 ± 8.06e+000
f5    1.27e+000 ± 4.79e+000      5.12e-007 ± 7.81e-007      3.74e-009 ± 5.96e-009      2.85e-014 ± 4.32e-015
f6    2.38e+000 ± 8.04e+000      3.13e-001 ± 3.74e-001      3.28e-002 ± 4.93e-002      7.75e-003 ± 1.21e-002
f7    5.21e+001 ± 4.19e+001      6.38e+001 ± 9.05e+000      4.23e+000 ± 8.24e+000      1.45e+001 ± 6.58e+000
f8    3.76e+003 ± 3.43e+003      2.56e+003 ± 7.31e+002      2.72e+003 ± 3.31e+003      3.16e+003 ± 4.65e+002
f9    1.35e+000 ± 1.02e+000      7.26e-005 ± 1.42e-005      4.07e-005 ± 8.26e-006      3.63e-001 ± 5.72e-001
f10   8.38e+001 ± 1.53e+002      1.75e+002 ± 5.87e+001      1.64e+002 ± 3.98e+001      4.67e+001 ± 1.44e+001
f11   1.16e-002 ± 1.28e-002      6.95e-005 ± 3.21e-005      1.13e-003 ± 2.45e-003      8.79e-003 ± 1.07e-002
f12   2.83e+001 ± 5.31e+000      1.35e+001 ± 8.39e+000      1.21e+001 ± 1.05e+001      2.35e+000 ± 1.41e+000
f13   5.51e+001 ± 2.07e+001      8.74e+001 ± 1.92e+001      7.67e+001 ± 7.43e+001      4.64e+001 ± 9.01e+000
f14   5.59e+003 ± 6.07e+002      2.67e+003 ± 7.72e+002      2.45e+003 ± 2.89e+002      3.80e+003 ± 6.18e+002
f15   -2.49e+002 ± 4.76e+001     -1.07e+002 ± 1.52e+002     -2.26e+002 ± 1.07e+001     -2.71e+002 ± 3.69e+001

F     CLPSO                      CPSO H                     OLPSO                      PCPSO
f1    4.46e-014 ± 1.51e-014      1.16e-033 ± 2.26e-033      1.16e-038 ± 1.56e-038      1.12e-030 ± 5.14e-031
f2    2.10e+001 ± 2.71e+000      2.06e+001 ± 1.13e+001      1.74e+001 ± 1.05e+001      1.12e+001 ± 8.19e+000 *
f3    3.14e-010 ± 4.64e-010      3.63e-002 ± 7.11e-002      3.17e-010 ± 6.23e-011      3.36e-012 ± 7.51e-013 *
f4    4.76e-010 ± 2.17e-010      0 ± 0                      0 ± 0                      2.21e-012 ± 6.14e-013
f5    4.97e-014 ± 2.12e-014      4.56e-014 ± 1.25e-014      4.76e-015 ± 1.57e-016 *    4.83e-008 ± 1.38e-008
f6    4.34e-007 ± 2.09e-007      8.15e-015 ± 8.92e-016 *    1.27e-002 ± 2.79e-002      2.41e-011 ± 1.09e-011
f7    4.36e-010 ± 2.44e-010 *    1.21e-001 ± 3.19e-001      4.51e-008 ± 1.17e-008      2.65e-007 ± 2.27e-007
f8    1.26e-012 ± 8.62e-013 *    1.08e+003 ± 2.27e+002      3.88e-004 ± 6.21e-005      3.82e-004 ± 7.31e-006
f9    3.47e-004 ± 2.11e-004      2.13e+000 ± 3.55e-001      2.28e-005 ± 1.14e-005      2.25e-006 ± 1.08e-006 *
f10   3.35e+001 ± 5.58e+000      8.04e+001 ± 2.23e+001      5.77e+001 ± 1.27e+001      1.63e+000 ± 5.18e+000 *
f11   7.07e-005 ± 1.18e-005      5.54e-002 ± 4.79e-002      2.73e-005 ± 3.08e-005      1.22e-005 ± 5.15e-006 *
f12   3.09e+000 ± 1.51e+000 *    1.53e+001 ± 3.62e+000      7.03e+000 ± 1.41e+000      9.37e+000 ± 3.58e+000
f13   3.53e+001 ± 5.52e+000      8.60e+001 ± 2.61e+001      4.03e+001 ± 1.34e+001      1.07e+001 ± 2.37e+000 *
f14   2.46e+003 ± 1.88e+003 *    3.82e+003 ± 6.62e+002      3.15e+003 ± 1.27e+003      3.26e+003 ± 1.23e+003
f15   -2.97e+002 ± 2.13e+001     -2.38e+002 ± 7.42e+001     -2.76e+002 ± 3.47e+001     -3.08e+002 ± 1.02e+001 *
Table 5: t-test on 15 functions.

Functions   f1   f2   f3   f4   f5   f6   f7   f8
t-test       1    1    1    1    1    0    0    1

Functions   f9   f10  f11  f12  f13  f14  f15
t-test       0    1    1    1    1    0    0
nonseparable multimodal problems, which demonstrates the effectiveness and robustness of the PCPSO algorithm.
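The 0/1 entries of Table 5 come from a Wilcoxon rank sum test at the 95% level. A self-contained sketch using the large-sample normal approximation follows; scipy.stats.ranksums would serve the same purpose, and the function name here is illustrative, not from the paper.

```python
import math

def rank_sum_test(runs_a, runs_b, alpha=0.05):
    """Return 1 if the two result samples differ at the (1 - alpha) level,
    else 0 (Wilcoxon rank sum, normal approximation, average ranks for ties)."""
    n1, n2 = len(runs_a), len(runs_b)
    pooled = sorted([(v, 0) for v in runs_a] + [(v, 1) for v in runs_b])
    ranks = [0.0] * (n1 + n2)
    i = 0
    while i < len(pooled):
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        for k in range(i, j + 1):          # tied values share the average rank
            ranks[k] = (i + j) / 2 + 1
        i = j + 1
    w = sum(r for r, (_, g) in zip(ranks, pooled) if g == 0)   # rank sum of sample A
    mu = n1 * (n1 + n2 + 1) / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    p = 1 - math.erf(abs(z) / math.sqrt(2))    # two-sided p-value, 2*(1 - Phi(|z|))
    return int(p < alpha)
```

Applied to two sets of 25 run results (one per algorithm), it yields 1 when the medians differ with 95% certainty and 0 otherwise, matching the convention used in Table 5.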
4. Conclusion

In this paper, a novel PCPSO algorithm is presented in order to overcome the problem of falling into local optima. In PCPSO, considering the complex interaction among the elements of the vector, the CPSO algorithm (First1D and Second1D) is used twice to create two locally optimal vectors. These two vectors are combined by OED learning so that the best elements can be chosen to jump out of the local optimum and search further.

Although the OED operation may in some cases produce a function value higher than the local optima, the function value always drops sharply after OED because the OED operation chooses the elements that are closer to the globally optimal vector. Experimental tests have been conducted on 15 benchmark functions, including unimodal, multimodal, rotated, and shifted functions. From the comparative results it can be concluded that PCPSO significantly improves the performance of PSO and provides a better solution on most rotated multimodal problems.
Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments

This work was supported in part by the National Natural Science Foundation of China (51275353), the Macao Science and Technology Development Fund (108/2012/A3, 110/2013/A3), and the Research Committee of the University of Macau (MYRG183(Y1-L3)FST11-LYM, MYRG203(Y1-L4)-FST11-LYM).
References
[1] R. C. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39-43, Nagoya, Japan, October 1995.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, December 1995.
[3] A. Adam, M. I. Shapiai, M. Z. M. Tumari, M. S. Mohamad, and M. Mubin, "Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization," The Scientific World Journal, vol. 2014, Article ID 973063, 13 pages, 2014.
[4] N. Geng, D. W. Gong, and Y. Zhang, "PSO-based robot path planning for multisurvivor rescue in limited survival time," Mathematical Problems in Engineering, vol. 2014, Article ID 187370, 10 pages, 2014.
[5] M. Shoaib and W.-C. Song, "Data aggregation for Vehicular Ad-hoc Network using particle swarm optimization," in Proceedings of the 14th Asia-Pacific Network Operations and Management Symposium, pp. 1-6, September 2012.
[6] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281-295, 2006.
[7] Y. H. Shi and R. C. Eberhart, "Parameter selection in particle swarm optimization," in Evolutionary Programming VII, vol. 1447 of Lecture Notes in Computer Science, pp. 591-600, Springer, Berlin, Germany, 1998.
[8] C. Li, S. Yang, and T. T. Nguyen, "A self-learning particle swarm optimizer for global optimization problems," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 42, no. 3, pp. 627-646, 2012.
[9] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150-169, 2010.
[10] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204-210, 2004.
[11] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the Congress on Evolutionary Computation (CEC '02), pp. 1671-1676, Honolulu, Hawaii, USA, May 2002.
[12] T. M. Blackwell and J. Branke, "Multiswarms, exclusion, and anti-convergence in dynamic environments," IEEE Transactions on Evolutionary Computation, vol. 10, no. 4, pp. 459-472, 2006.
[13] T. Peram, K. Veeramachaneni, and C. K. Mohan, "Fitness-distance-ratio based particle swarm optimization," in Proceedings of the IEEE Swarm Intelligence Symposium (SIS '03), pp. 174-181, IEEE, April 2003.
[14] X. Chen and Y. M. Li, "A modified PSO structure resulting in high exploration ability with convergence guaranteed," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 37, no. 5, pp. 1271-1289, 2007.
[15] Z.-H. Zhan, J. Zhang, Y. Li, and Y.-H. Shi, "Orthogonal learning particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 15, no. 6, pp. 832-847, 2011.
[16] S. Y. Ho, H. S. Lin, W. H. Liauh, and S. J. Ho, "OPSO: orthogonal particle swarm optimization and its application to task assignment problems," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 38, no. 2, pp. 288-298, 2008.
[17] Z.-H. Zhan, J. Zhang, Y. Li, and H. S.-H. Chung, "Adaptive particle swarm optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 39, no. 6, pp. 1362-1381, 2009.
[18] M. A. Potter and K. A. De Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249-257, Springer, Berlin, Germany, 1994.
[19] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225-239, 2004.
[20] G. Zhang and Y. M. Li, "Cooperative particle swarm optimizer with elimination mechanism for global optimization of multimodal problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 210-217, Beijing, China, July 2014.
[21] G. Zhang and Y. M. Li, "Orthogonal experimental design method used in particle swarm optimization for multimodal problems," in Proceedings of the 6th International Conference on Advanced Computational Intelligence (ICACI '13), pp. 183-188, Hangzhou, China, October 2013.
[22] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210-224, 2012.
[23] D. C. Montgomery, Design and Analysis of Experiments, John Wiley & Sons, New York, NY, USA, 5th edition, 2000.
[24] S.-Y. Ho, L.-S. Shu, and J.-H. Chen, "Intelligent evolutionary algorithms for large parameter optimization problems," IEEE Transactions on Evolutionary Computation, vol. 8, no. 6, pp. 522-541, 2004.
[25] X. Yao, Y. Liu, and G. M. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.
[26] R. Salomon, "Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions: a survey of some theoretical and practical aspects of genetic algorithms," BioSystems, vol. 39, no. 3, pp. 263-278, 1996.
[27] SPSO 2007, http://www.particleswarm.info/Programs.html.
Mathematical Problems in Engineering 7
0 100 200 300 400 5000
200400600800
1000
Iteration
Func
tion
valu
e
First1DSecond1D
(a)
0 100 200 300 400 5000
200
400
600
Iteration
Func
tion
valu
e
(b)
Figure 4 The convergence characteristics of PCPSO and CPSOalgorithms on Rosenbrockrsquos function (a) PCPSO (b) CPSO
the two independent algorithms (First1D and Second1D)by using OED learning the lower plot is the convergencecharacteristic of CPSO algorithm From Figure 4 we cansee that the OED learning makes the function vector jumpout of locally optimal vector and drops the function valuefinally The convergence characteristics for the remaining sixtest functions 119891
10ndash11989115
are shown in Figure 5 All these sixfunctions are rotated multimodal and nonseparable func-tions Similar to Figure 2 in spite the fact that the functionvalue of Second1D is increased in some cases at the startpoint caused by OED learning it will drop sharply with theincrease of iterationsThis phenomenon is especially obviousin test function 119891
12 To sum up PCPSO works well in terms
of preventing locally optimal vector and dropping functionvalue sharply The OED learning finds the best combinationof two locally optimal vectors 119883
1198711and 119883
1198712in iterations 80
160 and 240However there are still some drawbacks causingthe vector falling into locally optimal vector like 119891
14 the
reason is that the vectors1198831198711and119883
1198712are equal to each other
the OED learning ability is lost finally
33 Comparison with Other PSOs and Discussions SevenPSO variants are taken from the literature for comparisonon 15 test functions with 30 dimensions In the followingwe briefly describe the peer algorithms shown in Table 3The first algorithm is the standard PSO (SPSO) whoseperformance is improved by using a random topology Thesecond algorithm fully informed PSO (FIPS) uses all theneighbor particles to influence the flying velocity The thirdorthogonal PSO (OPSO) aims to generate a better positionby using OED The fourth algorithm fitness-distance-ratioPSO (FDR PSO) solves the premature convergence problem
Table 3 Parameter settings for the seven PSO variants
Algorithm Parameter settingsSPSO [27] 120596 04sim09 119888
1= 1198882= 1193
FIPS [10] 120594 = 07298 sum 119888119894= 41
OPSO [16] 120596 04sim09 1198881= 1198882= 20
FDR PSO [13] 120596 04sim09sum119888119894= 40
CLPSO [6] 120596 04sim09 1198881= 1198882= 20
CPSO H [19] 120596 04sim09 119888 = 149 120581 = 6OLPSO [15] 120596 04sim09 119888 = 20 119866 = 5
by using the fitness distance ratio The fifth comprehensivelearning PSO (CLPSO) is proposed for solving multimodalproblems The sixth cooperative PSO (CPSO H) uses one-dimensional swarms to search each dimension separatelyThe seventh orthogonal learning PSO (OLPSO) can guideparticles to fly towards an exemplar which is constructed by119875119894and 119875119899using OED method
The swarm size is set 40 for the seven PSO variantsmentioned above and 10 for the PCPSO algorithm Themaximal FEs for the eight algorithms are set 200000 ineach run of each test function All functions were run 25times and the mean values and standard deviation of theresults are presented in Table 4 The best results are shownin boldface From the results we observe that CPSO HCLPSO andOLPSOperformwell on the first 8 functionsThereason is that CPSO-H suits well on 1-dimensional searchCLPSO and OLPSO can search further by using their ownlearning strategies The learning strategy in CLPSO is toprevent falling into local optimumby random learning whilein OLPSO the strategy is to find a better combination oflocally optimal vector PCPSOmethod can keep searching bythe cooperation of two locally optimal vectors 119883
1198711and 119883
1198712
With the implementation of Second1D the algorithm canalways find a better vector 119883
1198712compared with the previous
best one 1198831198711
by using OED method The PCPSO methodperforms well on 119891
2-1198913 1198919ndash11989111 11989113 and 119891
15 However the
function value obtained by this method is not close enoughto the globally optimal value the premature convergence stillhappens in 119891
12and 119891
14due to lack of vector diversity The
advantages of this method are the rapid convergent speedand robustness whatever test functions are the algorithmcan find an acceptable optimum especially when facing somecomplex functions The nonparametric Wilcoxon rank sumtests are taken as 119905-test in Table 5 to determine whether theresults obtained by PCPSO are statistically different fromthe best results achieved by the other algorithms A value ofone indicates that the performance of the two algorithms isstatistically different with 95 certainty whereas the valueof zero implies that the performance is not statisticallydifferent Among these 15 functions there are 10 functionsthat are statistically different between the best results andthe results of PCPSO From Tables 4 and 5 we can see thatthe PCPSO obtains 5 best functions (119891
2 1198913 11989110 11989111 11989113) that
are statistically different from the others In addition mostof the best functions obtained by PCPSO are focused on
8 Mathematical Problems in Engineering
0 200 400 6000
100
200
300
Iteration
Func
tion
valu
e
(a)
0 200 400 600Iteration
0
02
04
06
08
Func
tion
valu
e
(b)
0 200 400 600Iteration
Func
tion
valu
e
0
10
20
30
40
50
(c)
0 200 400 600Iteration
Func
tion
valu
e
0
50
100
150
200
250
(d)
0 200 400 600Iteration
Func
tion
valu
e
2000
4000
6000
8000
10000
12000
First1DSecond1D
(e)
0 200 400 600Iteration
minus1000
0
1000
2000
3000
Func
tion
valu
e
First1DSecond1D
(f)
Figure 5 The convergence progress of 30-dimensional test functions using PCPSO (a) Rotated Rastriginrsquos function (b) Rotated Griewankrsquosfunction (c) Rotated Weierstrass function (d) Rotated noncontinuous Rastriginrsquos function (e) Rotated Schwefelrsquos function and (f) Rotatedand Shifted Rastriginrsquos function
Mathematical Problems in Engineering 9
Table 4 Results for 30D functions
119865 SPSO FIPS OPSO FDR PSO1198911
116119890 minus 069 plusmn 286119890 minus 068 269119890 minus 012 plusmn 228119890 minus 011 671119890 minus 018 plusmn 682119890 minus 018 488e minus 102 plusmn 153e minus 1011198912
168119890 + 001 plusmn 856119890 + 000 247119890 + 001 plusmn 219119890 minus 001 412119890 + 001 plusmn 119119890 + 001 154119890 + 001 plusmn 679119890 + 000
1198913
173119890 minus 002 plusmn 108119890 minus 001 465119890 minus 003 plusmn 371119890 minus 002 210119890 minus 003 plusmn 619119890 minus 003 172119890 minus 002 plusmn 908119890 minus 003
1198914
461119890 + 001 plusmn 785119890 + 001 744119890 + 001 plusmn 156119890 + 001 327119890 + 000 plusmn 635119890 + 000 291119890 + 001 plusmn 806119890 + 000
1198915
127119890 + 000 plusmn 479119890 + 000 512119890 minus 007 plusmn 781119890 minus 007 374119890 minus 009 plusmn 596119890 minus 009 285119890 minus 014 plusmn 432119890 minus 015
1198916
238119890 + 000 plusmn 804119890 + 000 313119890 minus 001 plusmn 374119890 minus 001 328119890 minus 002 plusmn 493119890 minus 002 775119890 minus 003 plusmn 121119890 minus 002
1198917
521119890 + 001 plusmn 419119890 + 001 638119890 + 001 plusmn 905119890 + 000 423119890 + 000 plusmn 824119890 + 000 145119890 + 001 plusmn 658119890 + 000
1198918
376119890 + 003 plusmn 343119890 + 003 256119890 + 003 plusmn 731119890 + 002 272119890 + 003 plusmn 331119890 + 003 316119890 + 003 plusmn 465119890 + 002
1198919
135119890 + 000 plusmn 102119890 + 000 726119890 minus 005 plusmn 142119890 minus 005 407119890 minus 005 plusmn 826119890 minus 006 363119890 minus 001 plusmn 572119890 minus 001
11989110
838119890 + 001 plusmn 153119890 + 002 175119890 + 002 plusmn 587119890 + 001 164119890 + 002 plusmn 398119890 + 001 467119890 + 001 plusmn 144119890 + 001
11989111
116119890 minus 002 plusmn 128119890 minus 002 695119890 minus 005 plusmn 321119890 minus 005 113119890 minus 003 plusmn 245119890 minus 003 879119890 minus 003 plusmn 107119890 minus 002
11989112
283119890 + 001 plusmn 531119890 + 000 135119890 + 001 plusmn 839119890 + 000 121119890 + 001 plusmn 105119890 + 001 235119890 + 000 plusmn 141119890 + 000
11989113
551119890 + 001 plusmn 207119890 + 001 874119890 + 001 plusmn 192119890 + 001 767119890 + 001 plusmn 743119890 + 001 464119890 + 001 plusmn 901119890 + 000
11989114
559119890 + 003 plusmn 607119890 + 002 267119890 + 003 plusmn 772119890 + 002 245119890 + 003 plusmn 289119890 + 002 380119890 + 003 plusmn 618119890 + 002
11989115
minus249119890 + 002 plusmn 476119890 + 001 minus107119890 + 002 plusmn 152119890 + 002 minus226119890 + 002 plusmn 107119890 + 001 minus271119890 + 002 plusmn 369119890 + 001
f   | CLPSO                   | CPSO-H                  | OLPSO                   | PCPSO
f1  | 4.46e−014 ± 1.51e−014   | 1.16e−033 ± 2.26e−033   | 1.16e−038 ± 1.56e−038   | 1.12e−030 ± 5.14e−031
f2  | 2.10e+001 ± 2.71e+000   | 2.06e+001 ± 1.13e+001   | 1.74e+001 ± 1.05e+001   | 1.12e+001 ± 8.19e+000*
f3  | 3.14e−010 ± 4.64e−010   | 3.63e−002 ± 7.11e−002   | 3.17e−010 ± 6.23e−011   | 3.36e−012 ± 7.51e−013*
f4  | 4.76e−010 ± 2.17e−010   | 0 ± 0                   | 0 ± 0                   | 2.21e−012 ± 6.14e−013
f5  | 4.97e−014 ± 2.12e−014   | 4.56e−014 ± 1.25e−014   | 4.76e−015 ± 1.57e−016*  | 4.83e−008 ± 1.38e−008
f6  | 4.34e−007 ± 2.09e−007   | 8.15e−015 ± 8.92e−016*  | 1.27e−002 ± 2.79e−002   | 2.41e−011 ± 1.09e−011
f7  | 4.36e−010 ± 2.44e−010*  | 1.21e−001 ± 3.19e−001   | 4.51e−008 ± 1.17e−008   | 2.65e−007 ± 2.27e−007
f8  | 1.26e−012 ± 8.62e−013*  | 1.08e+003 ± 2.27e+002   | 3.88e−004 ± 6.21e−005   | 3.82e−004 ± 7.31e−006
f9  | 3.47e−004 ± 2.11e−004   | 2.13e+000 ± 3.55e−001   | 2.28e−005 ± 1.14e−005   | 2.25e−006 ± 1.08e−006*
f10 | 3.35e+001 ± 5.58e+000   | 8.04e+001 ± 2.23e+001   | 5.77e+001 ± 1.27e+001   | 1.63e+000 ± 5.18e+000*
f11 | 7.07e−005 ± 1.18e−005   | 5.54e−002 ± 4.79e−002   | 2.73e−005 ± 3.08e−005   | 1.22e−005 ± 5.15e−006*
f12 | 3.09e+000 ± 1.51e+000*  | 1.53e+001 ± 3.62e+000   | 7.03e+000 ± 1.41e+000   | 9.37e+000 ± 3.58e+000
f13 | 3.53e+001 ± 5.52e+000   | 8.60e+001 ± 2.61e+001   | 4.03e+001 ± 1.34e+001   | 1.07e+001 ± 2.37e+000*
f14 | 2.46e+003 ± 1.88e+003*  | 3.82e+003 ± 6.62e+002   | 3.15e+003 ± 1.27e+003   | 3.26e+003 ± 1.23e+003
f15 | −2.97e+002 ± 2.13e+001  | −2.38e+002 ± 7.42e+001  | −2.76e+002 ± 3.47e+001  | −3.08e+002 ± 1.02e+001*
Entries marked with * were highlighted (bold) as the best results in the original table.
Table 5: t-test on 15 functions.

Functions | f1 | f2 | f3 | f4 | f5 | f6 | f7 | f8
t-test    | 1  | 1  | 1  | 1  | 1  | 0  | 0  | 1

Functions | f9 | f10 | f11 | f12 | f13 | f14 | f15
t-test    | 0  | 1   | 1   | 1   | 1   | 0   | 0
nonseparable multimodal problems, which demonstrates the effectiveness and robustness of the PCPSO algorithm.
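As a rough illustration of how the 1/0 significance flags in Table 5 can be obtained from the mean ± standard deviation summaries in Table 4, the sketch below computes Welch's t-statistic from summary statistics alone. The run count of 30 and the two-tailed 5% threshold of |t| > 2.0 are assumptions for illustration, not values stated by the paper:

```python
import math

def welch_t(mean1, std1, n1, mean2, std2, n2):
    """Welch's t-statistic for two independent samples, from summary stats."""
    se_sq = std1**2 / n1 + std2**2 / n2  # squared standard error of the difference
    return (mean1 - mean2) / math.sqrt(se_sq)

# PCPSO vs. SPSO on f10 (means and stds taken from Table 4), assuming 30 runs each.
t = welch_t(1.63e+000, 5.18e+000, 30, 8.38e+001, 1.53e+002, 30)
significant = abs(t) > 2.0  # rough two-tailed 5% cut-off for ~30 degrees of freedom
print(round(t, 2), significant)
```

Under these assumptions the f10 comparison comes out significant, consistent with the "1" recorded for f10 in Table 5.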
4 Conclusion
In this paper, a novel PCPSO algorithm is presented in order to overcome the problem of falling into local optima. In PCPSO, considering the complex interaction of the elements of the vector, the CPSO algorithm (First1D and Second1D) is used twice to create two locally optimal vectors. These two vectors are then combined by OED learning so that the best elements can be chosen to jump out of the local optimum and search further.

Although the OED operation may in some cases produce a function value higher than the local optima, the function value always drops sharply after OED, because the OED operation chooses the elements that are closer to the globally optimal vector. Experimental tests have been conducted on 15 benchmark functions, including unimodal, multimodal, rotated, and shifted functions. From the comparative results, it can be concluded that PCPSO significantly improves the performance of PSO and provides a better solution on most rotated multimodal problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
10 Mathematical Problems in Engineering
Acknowledgments
This work was supported in part by the National Natural Science Foundation of China (51275353), the Macao Science and Technology Development Fund (108/2012/A3, 110/2013/A3), and the Research Committee of the University of Macau (MYRG183(Y1-L3)/FST11-LYM, MYRG203(Y1-L4)-FST11-LYM).
References
[1] R. C. Eberhart and J. Kennedy, "New optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39–43, Nagoya, Japan, October 1995.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[3] A. Adam, M. I. Shapiai, M. Z. M. Tumari, M. S. Mohamad, and M. Mubin, "Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization," The Scientific World Journal, vol. 2014, Article ID 973063, 13 pages, 2014.
[4] N. Geng, D. W. Gong, and Y. Zhang, "PSO-based robot path planning for multisurvivor rescue in limited survival time," Mathematical Problems in Engineering, vol. 2014, Article ID 187370, 10 pages, 2014.
[5] M. Shoaib and W.-C. Song, "Data aggregation for Vehicular Ad-hoc Network using particle swarm optimization," in Proceedings of the 14th Asia-Pacific Network Operations and Management Symposium, pp. 1–6, September 2012.
[6] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
[7] Y. H. Shi and R. C. Eberhart, "Parameter selection in particle swarm optimization," in Evolutionary Programming VII, vol. 1447 of Lecture Notes in Computer Science, pp. 591–600, Springer, Berlin, Germany, 1998.
[8] C. Li, S. Yang, and T. T. Nguyen, "A self-learning particle swarm optimizer for global optimization problems," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 42, no. 3, pp. 627–646, 2012.
[9] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[10] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[11] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the Congress on Evolutionary Computation (CEC '02), pp. 1671–1676, Honolulu, Hawaii, USA, May 2002.
[12] T. M. Blackwell and J. Branke, "Multiswarms, exclusion, and anti-convergence in dynamic environments," IEEE Transactions on Evolutionary Computation, vol. 10, no. 4, pp. 459–472, 2006.
[13] T. Peram, K. Veeramachaneni, and C. K. Mohan, "Fitness-distance-ratio based particle swarm optimization," in Proceedings of the IEEE Swarm Intelligence Symposium (SIS '03), pp. 174–181, IEEE, April 2003.
[14] X. Chen and Y. M. Li, "A modified PSO structure resulting in high exploration ability with convergence guaranteed," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 37, no. 5, pp. 1271–1289, 2007.
[15] Z.-H. Zhan, J. Zhang, Y. Li, and Y.-H. Shi, "Orthogonal learning particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 15, no. 6, pp. 832–847, 2011.
[16] S. Y. Ho, H. S. Lin, W. H. Liauh, and S. J. Ho, "OPSO: orthogonal particle swarm optimization and its application to task assignment problems," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 38, no. 2, pp. 288–298, 2008.
[17] Z.-H. Zhan, J. Zhang, Y. Li, and H. S.-H. Chung, "Adaptive particle swarm optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 39, no. 6, pp. 1362–1381, 2009.
[18] M. A. Potter and K. A. D. Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.
[19] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.
[20] G. Zhang and Y. M. Li, "Cooperative particle swarm optimizer with elimination mechanism for global optimization of multimodal problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 210–217, Beijing, China, July 2014.
[21] G. Zhang and Y. M. Li, "Orthogonal experimental design method used in particle swarm optimization for multimodal problems," in Proceedings of the 6th International Conference on Advanced Computational Intelligence (ICACI '13), pp. 183–188, Hangzhou, China, October 2013.
[22] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.
[23] D. C. Montgomery, Design and Analysis of Experiments, John Wiley & Sons, New York, NY, USA, 5th edition, 2000.
[24] S.-Y. Ho, L.-S. Shu, and J.-H. Chen, "Intelligent evolutionary algorithms for large parameter optimization problems," IEEE Transactions on Evolutionary Computation, vol. 8, no. 6, pp. 522–541, 2004.
[25] X. Yao, Y. Liu, and G. M. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[26] R. Salomon, "Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions. A survey of some theoretical and practical aspects of genetic algorithms," BioSystems, vol. 39, no. 3, pp. 263–278, 1996.
[27] SPSO 2007, http://www.particleswarm.info/Programs.html.
[Figure 5 appears here: six panels of convergence curves, each plotting function value against iteration (0–600); panels (e) and (f) additionally distinguish the First1D and Second1D stages.]

Figure 5: The convergence progress of 30-dimensional test functions using PCPSO. (a) Rotated Rastrigin's function. (b) Rotated Griewank's function. (c) Rotated Weierstrass function. (d) Rotated noncontinuous Rastrigin's function. (e) Rotated Schwefel's function. (f) Rotated and shifted Rastrigin's function.
Mathematical Problems in Engineering 9
Table 4 Results for 30D functions
119865 SPSO FIPS OPSO FDR PSO1198911
116119890 minus 069 plusmn 286119890 minus 068 269119890 minus 012 plusmn 228119890 minus 011 671119890 minus 018 plusmn 682119890 minus 018 488e minus 102 plusmn 153e minus 1011198912
168119890 + 001 plusmn 856119890 + 000 247119890 + 001 plusmn 219119890 minus 001 412119890 + 001 plusmn 119119890 + 001 154119890 + 001 plusmn 679119890 + 000
1198913
173119890 minus 002 plusmn 108119890 minus 001 465119890 minus 003 plusmn 371119890 minus 002 210119890 minus 003 plusmn 619119890 minus 003 172119890 minus 002 plusmn 908119890 minus 003
1198914
461119890 + 001 plusmn 785119890 + 001 744119890 + 001 plusmn 156119890 + 001 327119890 + 000 plusmn 635119890 + 000 291119890 + 001 plusmn 806119890 + 000
1198915
127119890 + 000 plusmn 479119890 + 000 512119890 minus 007 plusmn 781119890 minus 007 374119890 minus 009 plusmn 596119890 minus 009 285119890 minus 014 plusmn 432119890 minus 015
1198916
238119890 + 000 plusmn 804119890 + 000 313119890 minus 001 plusmn 374119890 minus 001 328119890 minus 002 plusmn 493119890 minus 002 775119890 minus 003 plusmn 121119890 minus 002
1198917
521119890 + 001 plusmn 419119890 + 001 638119890 + 001 plusmn 905119890 + 000 423119890 + 000 plusmn 824119890 + 000 145119890 + 001 plusmn 658119890 + 000
1198918
376119890 + 003 plusmn 343119890 + 003 256119890 + 003 plusmn 731119890 + 002 272119890 + 003 plusmn 331119890 + 003 316119890 + 003 plusmn 465119890 + 002
1198919
135119890 + 000 plusmn 102119890 + 000 726119890 minus 005 plusmn 142119890 minus 005 407119890 minus 005 plusmn 826119890 minus 006 363119890 minus 001 plusmn 572119890 minus 001
11989110
838119890 + 001 plusmn 153119890 + 002 175119890 + 002 plusmn 587119890 + 001 164119890 + 002 plusmn 398119890 + 001 467119890 + 001 plusmn 144119890 + 001
11989111
116119890 minus 002 plusmn 128119890 minus 002 695119890 minus 005 plusmn 321119890 minus 005 113119890 minus 003 plusmn 245119890 minus 003 879119890 minus 003 plusmn 107119890 minus 002
11989112
283119890 + 001 plusmn 531119890 + 000 135119890 + 001 plusmn 839119890 + 000 121119890 + 001 plusmn 105119890 + 001 235119890 + 000 plusmn 141119890 + 000
11989113
551119890 + 001 plusmn 207119890 + 001 874119890 + 001 plusmn 192119890 + 001 767119890 + 001 plusmn 743119890 + 001 464119890 + 001 plusmn 901119890 + 000
11989114
559119890 + 003 plusmn 607119890 + 002 267119890 + 003 plusmn 772119890 + 002 245119890 + 003 plusmn 289119890 + 002 380119890 + 003 plusmn 618119890 + 002
11989115
minus249119890 + 002 plusmn 476119890 + 001 minus107119890 + 002 plusmn 152119890 + 002 minus226119890 + 002 plusmn 107119890 + 001 minus271119890 + 002 plusmn 369119890 + 001
119891 CLPSO CPSO H OLPSO PCPSO1198911
446119890 minus 014 plusmn 151119890 minus 014 116119890 minus 033 plusmn 226119890 minus 033 116119890 minus 038 plusmn 156119890 minus 038 112119890 minus 030 plusmn 514119890 minus 31
1198912
210119890 + 001 plusmn 271119890 + 000 206119890 + 001 plusmn 113119890 + 001 174119890 + 001 plusmn 105119890 + 001 112e + 001 plusmn 819e + 0001198913
314119890 minus 010 plusmn 464119890 minus 010 363119890 minus 002 plusmn 711119890 minus 002 317119890 minus 010 plusmn 623119890 minus 011 336e minus 012 plusmn 751e minus 0131198914
476119890 minus 010 plusmn 217119890 minus 010 0 plusmn 0 0 plusmn 0 221119890 minus 012 plusmn 614119890 minus 013
1198915
497119890 minus 014 plusmn 212119890 minus 014 456119890 minus 014 plusmn 125119890 minus 014 476e minus 015 plusmn 157e minus 016 483119890 minus 008 plusmn 138119890 minus 008
1198916
434119890 minus 007 plusmn 209119890 minus 007 815e minus 015 plusmn 892e minus 016 127119890 minus 002 plusmn 279119890 minus 002 241119890 minus 011 plusmn 109119890 minus 011
1198917
436e minus 010 plusmn 244e minus 010 121119890 minus 001 plusmn 319119890 minus 001 451119890 minus 008 plusmn 117119890 minus 008 265119890 minus 007 plusmn 227119890 minus 007
1198918
126e minus 012 plusmn 862e minus 013 108119890 + 003 plusmn 227119890 + 002 388119890 minus 004 plusmn 621119890 minus 005 382119890 minus 004 plusmn 731119890 minus 006
1198919
347119890 minus 004 plusmn 211119890 minus 004 213119890 + 000 plusmn 355119890 minus 001 228119890 minus 005 plusmn 114119890 minus 005 225e minus 006 plusmn 108e minus 00611989110
335119890 + 001 plusmn 558119890 + 000 804119890 + 001 plusmn 223119890 + 001 577119890 + 001 plusmn 127119890 + 001 163e + 000 plusmn 518e + 00011989111
707119890 minus 005 plusmn 118119890 minus 005 554119890 minus 002 plusmn 479119890 minus 002 273119890 minus 005 plusmn 308119890 minus 005 122e minus 005 plusmn 515e minus 00611989112
309e + 000 plusmn 151e + 000 153119890 + 001 plusmn 362119890 + 000 703119890 + 000 plusmn 141119890 + 000 937119890 + 000 plusmn 358119890 + 000
11989113
353119890 + 001 plusmn 552119890 + 000 860119890 + 001 plusmn 261119890 + 001 403119890 + 001 plusmn 134119890 + 001 107e + 001 plusmn 237e + 00011989114
246e + 003 plusmn 188e + 003 382119890 + 003 plusmn 662119890 + 002 315119890 + 003 plusmn 127119890 + 003 326119890 + 003 plusmn 123119890 + 003
11989115
minus297119890 + 002 plusmn 213119890 + 001 minus238119890 + 002 plusmn 742119890 + 001 minus276119890 + 002 plusmn 347119890 + 001 minus308e + 002 plusmn 102e + 001
Table 5 119905-test on 15 functions
Functions 1198911
1198912
1198913
1198914
1198915
1198916
1198917
1198918
119905-test 1 1 1 1 1 0 0 1Functions 119891
911989110
11989111
11989112
11989113
11989114
11989115
119905-test 0 1 1 1 1 0 0
nonseparable multimodal problems which demonstrates theeffectiveness and robustness of PCPSO algorithm
4 Conclusion
In this paper a novel PCPSO algorithm is presented in orderto overcome the problem of falling into local optimum InPCPSO considering the complex interacting of the elementsof the vector the CPSO (First1D and Second1D) algorithm isused twice to create two locally optimal vectors These two
vectors are combined together by OED learning so that thebest elements can be chosen to jump out of local optimumand search further
Although OED operation may cause the function valuehigher than local optima in some cases the function valuewill always drop sharply after OED because the OEDoperation chooses the elements that are closer to globallyoptimal vector Experimental tests have been conducted on15 benchmark functions including unimodal multimodalrotated and shifted functions From the comparative resultsit can be concluded that PCPSO significantly improves theperformance of PSO and provides a better solution on mostrotated multimodal problems
Conflict of Interests
The authors declare that there is no conflict of interestsregarding the publication of this paper
10 Mathematical Problems in Engineering
Acknowledgments
This work was supported in part by National Natural ScienceFoundation of China (51275353) Macao Science and Tech-nology Development Fund (1082012A3 1102013A3) andResearch Committee of University of Macau (MYRG183(Y1-L3)FST11-LYM MYRG203(Y1-L4)-FST11-LYM)
References
[1] R C Eberhart and J Kennedy ldquoNew optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium onMicroMachine and Human Science pp 39ndash43 NagoyaJapan October 1995
[2] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995
[3] A Adam M I Shapiai M Z M Tumari M S Mohamadand M Mubin ldquoFeature selection and classifier parametersestimation for EEG signals peak detection using particle swarmoptimizationrdquoThe Scientific World Journal vol 2014 Article ID973063 13 pages 2014
[4] N Geng D W Gong and Y Zhang ldquoPSO-based robot pathplanning for multisurvivor rescue in limited survival timerdquoMathematical Problems in Engineering vol 2014 Article ID187370 10 pages 2014
[5] M Shoaib andW-C Song ldquoData aggregation for Vehicular Ad-hocNetwork using particle swarmoptimizationrdquo in Proceedingsof the 14th Asia-Pacific Network Operations and ManagementSymposium pp 1ndash6 September 2012
[6] J J Liang A K Qin P N Suganthan and S BaskarldquoComprehensive learning particle swarm optimizer for globaloptimization of multimodal functionsrdquo IEEE Transactions onEvolutionary Computation vol 10 no 3 pp 281ndash295 2006
[7] Y H Shi and R C Eberhart ldquoParameter selection in par-ticle swarm optimizationrdquo in Evolutionary Programming VIIvol 1447 of Lecture Notes in Computer Science pp 591ndash600Springer Berlin Germany 1998
[8] C Li S Yang and T T Nguyen ldquoA self-learning particle swarmoptimizer for global optimization problemsrdquo IEEE Transactionson Systems Man and Cybernetics Part B Cybernetics vol 42no 3 pp 627ndash646 2012
[9] X Li ldquoNiching without niching parameters particle swarmoptimization using a ring topologyrdquo IEEE Transactions onEvolutionary Computation vol 14 no 1 pp 150ndash169 2010
[10] R Mendes J Kennedy and J Neves ldquoThe fully informedparticle swarm simpler maybe betterrdquo IEEE Transactions onEvolutionary Computation vol 8 no 3 pp 204ndash210 2004
[11] J Kennedy and R Mendes ldquoPopulation structure and par-ticle swarm performancerdquo in Proceedings of the Congress onEvolutionary Computation (CEC rsquo02) pp 1671ndash1676 HonoluluHawaii USA May 2002
[12] T M Blackwell and J Branke ldquoMultiswarms exclusion andanti-convergence in dynamic environmentsrdquo IEEETransactionson Evolutionary Computation vol 10 no 4 pp 459ndash472 2006
[13] T Peram K Veeramachaneni and C K Mohan ldquoFitness-distance-ratio based particle swarm optimizationrdquo in Proceed-ings of the IEEE Swarm Intelligence Symposium (SIS rsquo03) pp 174ndash181 IEEE April 2003
[14] X Chen and Y M Li ldquoA modified PSO structure resulting inhigh exploration ability with convergence guaranteedrdquo IEEE
Transactions on Systems Man and Cybernetics Part B Cyber-netics vol 37 no 5 pp 1271ndash1289 2007
[15] Z-H Zhan J Zhang Y Li and Y-H Shi ldquoOrthogonal learningparticle swarm optimizationrdquo IEEE Transactions on Evolution-ary Computation vol 15 no 6 pp 832ndash847 2011
[16] S Y Ho H S Lin W H Liauh and S J Ho ldquoOPSO Orthog-onal particle swarm optimization and its application to taskassignment problemsrdquo IEEE Transactions on Systems Man andCybernetics Part A Systems andHumans vol 38 no 2 pp 288ndash298 2008
[17] Z-H Zhan J Zhang Y Li and H S-H Chung ldquoAdaptiveparticle swarm optimizationrdquo IEEE Transactions on SystemsMan and Cybernetics Part B Cybernetics vol 39 no 6 pp1362ndash1381 2009
[18] M A Potter and K A D Jong ldquoA cooperative coevolutionaryapproach to function optimizationrdquo in Parallel Problem Solvingfrom NaturemdashPPSN III vol 866 of Lecture Notes in ComputerScience pp 249ndash257 Springer Berlin Germany 1994
[19] F van den Bergh andA P Engelbrecht ldquoA cooperative approachto participle swam optimizationrdquo IEEE Transactions on Evolu-tionary Computation vol 8 no 3 pp 225ndash239 2004
[20] G Zhang and Y M Li ldquoCooperative particle swarm opti-mizer with elimination mechanism for global optimization ofmultimodal problemsrdquo in Proceedings of the IEEE Congresson Evolutionary Computation (CEC rsquo14) pp 210ndash217 BeijingChina July 2014
[21] G Zhang and Y M Li ldquoOrthogonal experimental designmethod used in particle swarm optimization for multimodalproblemsrdquo in Proceedings of the 6th International Conference onAdvanced Computational Intelligence (ICACI rsquo13) pp 183ndash188Hangzhou China October 2013
[22] X Li and X Yao ldquoCooperatively coevolving particle swarmsfor large scale optimizationrdquo IEEE Transactions on EvolutionaryComputation vol 16 no 2 pp 210ndash224 2012
[23] D C Montgomery Design and Analysis of Experiments JohnWiley amp Sons New York NY USA 5th edition 2000
[24] S-Y Ho L-S Shu and J-H Chen ldquoIntelligent evolutionaryalgorithms for large parameter optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 8 no 6 pp522ndash541 2004
[25] X Yao Y Liu andGM Lin ldquoEvolutionary programmingmadefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999
[26] R Salomon ldquoRe-evaluating genetic algorithm performanceunder coordinate rotation of benchmark functions A survey ofsome theoretical and practical aspects of genetic algorithmsrdquoBioSystems vol 39 no 3 pp 263ndash278 1996
[27] SPSO 2007 httpwwwparticleswarminfoProgramshtml
Submit your manuscripts athttpwwwhindawicom
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
MathematicsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Mathematical Problems in Engineering
Hindawi Publishing Corporationhttpwwwhindawicom
Differential EquationsInternational Journal of
Volume 2014
Applied MathematicsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Mathematical PhysicsAdvances in
Complex AnalysisJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
OptimizationJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Operations ResearchAdvances in
Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Function Spaces
Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014
International Journal of Mathematics and Mathematical Sciences
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Algebra
Discrete Dynamics in Nature and Society
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Decision SciencesAdvances in
Discrete MathematicsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom
Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Stochastic AnalysisInternational Journal of
Mathematical Problems in Engineering 9
Table 4 Results for 30D functions
119865 SPSO FIPS OPSO FDR PSO1198911
116119890 minus 069 plusmn 286119890 minus 068 269119890 minus 012 plusmn 228119890 minus 011 671119890 minus 018 plusmn 682119890 minus 018 488e minus 102 plusmn 153e minus 1011198912
168119890 + 001 plusmn 856119890 + 000 247119890 + 001 plusmn 219119890 minus 001 412119890 + 001 plusmn 119119890 + 001 154119890 + 001 plusmn 679119890 + 000
1198913
173119890 minus 002 plusmn 108119890 minus 001 465119890 minus 003 plusmn 371119890 minus 002 210119890 minus 003 plusmn 619119890 minus 003 172119890 minus 002 plusmn 908119890 minus 003
1198914
461119890 + 001 plusmn 785119890 + 001 744119890 + 001 plusmn 156119890 + 001 327119890 + 000 plusmn 635119890 + 000 291119890 + 001 plusmn 806119890 + 000
1198915
127119890 + 000 plusmn 479119890 + 000 512119890 minus 007 plusmn 781119890 minus 007 374119890 minus 009 plusmn 596119890 minus 009 285119890 minus 014 plusmn 432119890 minus 015
1198916
238119890 + 000 plusmn 804119890 + 000 313119890 minus 001 plusmn 374119890 minus 001 328119890 minus 002 plusmn 493119890 minus 002 775119890 minus 003 plusmn 121119890 minus 002
1198917
521119890 + 001 plusmn 419119890 + 001 638119890 + 001 plusmn 905119890 + 000 423119890 + 000 plusmn 824119890 + 000 145119890 + 001 plusmn 658119890 + 000
1198918
376119890 + 003 plusmn 343119890 + 003 256119890 + 003 plusmn 731119890 + 002 272119890 + 003 plusmn 331119890 + 003 316119890 + 003 plusmn 465119890 + 002
1198919
135119890 + 000 plusmn 102119890 + 000 726119890 minus 005 plusmn 142119890 minus 005 407119890 minus 005 plusmn 826119890 minus 006 363119890 minus 001 plusmn 572119890 minus 001
11989110
838119890 + 001 plusmn 153119890 + 002 175119890 + 002 plusmn 587119890 + 001 164119890 + 002 plusmn 398119890 + 001 467119890 + 001 plusmn 144119890 + 001
11989111
116119890 minus 002 plusmn 128119890 minus 002 695119890 minus 005 plusmn 321119890 minus 005 113119890 minus 003 plusmn 245119890 minus 003 879119890 minus 003 plusmn 107119890 minus 002
11989112
283119890 + 001 plusmn 531119890 + 000 135119890 + 001 plusmn 839119890 + 000 121119890 + 001 plusmn 105119890 + 001 235119890 + 000 plusmn 141119890 + 000
11989113
551119890 + 001 plusmn 207119890 + 001 874119890 + 001 plusmn 192119890 + 001 767119890 + 001 plusmn 743119890 + 001 464119890 + 001 plusmn 901119890 + 000
11989114
559119890 + 003 plusmn 607119890 + 002 267119890 + 003 plusmn 772119890 + 002 245119890 + 003 plusmn 289119890 + 002 380119890 + 003 plusmn 618119890 + 002
11989115
minus249119890 + 002 plusmn 476119890 + 001 minus107119890 + 002 plusmn 152119890 + 002 minus226119890 + 002 plusmn 107119890 + 001 minus271119890 + 002 plusmn 369119890 + 001
119891 CLPSO CPSO H OLPSO PCPSO1198911
446119890 minus 014 plusmn 151119890 minus 014 116119890 minus 033 plusmn 226119890 minus 033 116119890 minus 038 plusmn 156119890 minus 038 112119890 minus 030 plusmn 514119890 minus 31
1198912
210119890 + 001 plusmn 271119890 + 000 206119890 + 001 plusmn 113119890 + 001 174119890 + 001 plusmn 105119890 + 001 112e + 001 plusmn 819e + 0001198913
314119890 minus 010 plusmn 464119890 minus 010 363119890 minus 002 plusmn 711119890 minus 002 317119890 minus 010 plusmn 623119890 minus 011 336e minus 012 plusmn 751e minus 0131198914
476119890 minus 010 plusmn 217119890 minus 010 0 plusmn 0 0 plusmn 0 221119890 minus 012 plusmn 614119890 minus 013
1198915
497119890 minus 014 plusmn 212119890 minus 014 456119890 minus 014 plusmn 125119890 minus 014 476e minus 015 plusmn 157e minus 016 483119890 minus 008 plusmn 138119890 minus 008
1198916
434119890 minus 007 plusmn 209119890 minus 007 815e minus 015 plusmn 892e minus 016 127119890 minus 002 plusmn 279119890 minus 002 241119890 minus 011 plusmn 109119890 minus 011
1198917
436e minus 010 plusmn 244e minus 010 121119890 minus 001 plusmn 319119890 minus 001 451119890 minus 008 plusmn 117119890 minus 008 265119890 minus 007 plusmn 227119890 minus 007
1198918
126e minus 012 plusmn 862e minus 013 108119890 + 003 plusmn 227119890 + 002 388119890 minus 004 plusmn 621119890 minus 005 382119890 minus 004 plusmn 731119890 minus 006
1198919
347119890 minus 004 plusmn 211119890 minus 004 213119890 + 000 plusmn 355119890 minus 001 228119890 minus 005 plusmn 114119890 minus 005 225e minus 006 plusmn 108e minus 00611989110
335119890 + 001 plusmn 558119890 + 000 804119890 + 001 plusmn 223119890 + 001 577119890 + 001 plusmn 127119890 + 001 163e + 000 plusmn 518e + 00011989111
707119890 minus 005 plusmn 118119890 minus 005 554119890 minus 002 plusmn 479119890 minus 002 273119890 minus 005 plusmn 308119890 minus 005 122e minus 005 plusmn 515e minus 00611989112
309e + 000 plusmn 151e + 000 153119890 + 001 plusmn 362119890 + 000 703119890 + 000 plusmn 141119890 + 000 937119890 + 000 plusmn 358119890 + 000
11989113
353119890 + 001 plusmn 552119890 + 000 860119890 + 001 plusmn 261119890 + 001 403119890 + 001 plusmn 134119890 + 001 107e + 001 plusmn 237e + 00011989114
246e + 003 plusmn 188e + 003 382119890 + 003 plusmn 662119890 + 002 315119890 + 003 plusmn 127119890 + 003 326119890 + 003 plusmn 123119890 + 003
11989115
minus297119890 + 002 plusmn 213119890 + 001 minus238119890 + 002 plusmn 742119890 + 001 minus276119890 + 002 plusmn 347119890 + 001 minus308e + 002 plusmn 102e + 001
Table 5 119905-test on 15 functions
Functions 1198911
1198912
1198913
1198914
1198915
1198916
1198917
1198918
119905-test 1 1 1 1 1 0 0 1Functions 119891
911989110
11989111
11989112
11989113
11989114
11989115
119905-test 0 1 1 1 1 0 0
nonseparable multimodal problems which demonstrates theeffectiveness and robustness of PCPSO algorithm
4 Conclusion
In this paper a novel PCPSO algorithm is presented in orderto overcome the problem of falling into local optimum InPCPSO considering the complex interacting of the elementsof the vector the CPSO (First1D and Second1D) algorithm isused twice to create two locally optimal vectors These two
vectors are combined together by OED learning so that thebest elements can be chosen to jump out of local optimumand search further
Although the OED operation may in some cases yield a function value higher than the local optima, the function value always drops sharply after OED, because the OED operation selects the elements that are closer to the globally optimal vector. Experimental tests have been conducted on 15 benchmark functions, including unimodal, multimodal, rotated, and shifted functions. From the comparative results, it can be concluded that PCPSO significantly improves the performance of PSO and provides a better solution on most rotated multimodal problems.
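The OED learning step described above can be sketched as follows. The function `oed_combine`, the tiny L4(2^3) orthogonal array, and the one-factor-per-dimension mapping are illustrative assumptions for a 3-dimensional minimization case; a full implementation would group dimensions into factors and use a correspondingly larger orthogonal array.

```python
def oed_combine(x1, x2, f):
    """Sketch of OED learning: each dimension is a two-level factor
    (level 0 -> take the element from x1, level 1 -> from x2); an
    orthogonal array of trial vectors is evaluated, and for each
    dimension the level with the better mean objective value is kept."""
    L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]  # L4(2^3) orthogonal array
    trials = [(f([x2[d] if lvl else x1[d] for d, lvl in enumerate(row)]), row)
              for row in L4]
    combined = []
    for d in range(3):
        # Factor analysis: mean objective value at each level of factor d.
        m0 = sum(fv for fv, row in trials if row[d] == 0) / 2
        m1 = sum(fv for fv, row in trials if row[d] == 1) / 2
        combined.append(x2[d] if m1 < m0 else x1[d])
    return combined

# Sphere function: the best elements of the two vectors are recombined,
# reaching a point better than either input vector.
print(oed_combine([0.0, 5.0, 5.0], [5.0, 0.0, 0.0],
                  lambda v: sum(x * x for x in v)))  # → [0.0, 0.0, 0.0]
```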
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
10 Mathematical Problems in Engineering
Acknowledgments
This work was supported in part by the National Natural Science Foundation of China (51275353), the Macao Science and Technology Development Fund (108/2012/A3 and 110/2013/A3), and the Research Committee of the University of Macau (MYRG183(Y1-L3)-FST11-LYM and MYRG203(Y1-L4)-FST11-LYM).
References
[1] R. C. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39–43, Nagoya, Japan, October 1995.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[3] A. Adam, M. I. Shapiai, M. Z. M. Tumari, M. S. Mohamad, and M. Mubin, "Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization," The Scientific World Journal, vol. 2014, Article ID 973063, 13 pages, 2014.
[4] N. Geng, D. W. Gong, and Y. Zhang, "PSO-based robot path planning for multisurvivor rescue in limited survival time," Mathematical Problems in Engineering, vol. 2014, Article ID 187370, 10 pages, 2014.
[5] M. Shoaib and W.-C. Song, "Data aggregation for Vehicular Ad-hoc Network using particle swarm optimization," in Proceedings of the 14th Asia-Pacific Network Operations and Management Symposium, pp. 1–6, September 2012.
[6] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
[7] Y. H. Shi and R. C. Eberhart, "Parameter selection in particle swarm optimization," in Evolutionary Programming VII, vol. 1447 of Lecture Notes in Computer Science, pp. 591–600, Springer, Berlin, Germany, 1998.
[8] C. Li, S. Yang, and T. T. Nguyen, "A self-learning particle swarm optimizer for global optimization problems," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 42, no. 3, pp. 627–646, 2012.
[9] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[10] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[11] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the Congress on Evolutionary Computation (CEC '02), pp. 1671–1676, Honolulu, Hawaii, USA, May 2002.
[12] T. M. Blackwell and J. Branke, "Multiswarms, exclusion, and anti-convergence in dynamic environments," IEEE Transactions on Evolutionary Computation, vol. 10, no. 4, pp. 459–472, 2006.
[13] T. Peram, K. Veeramachaneni, and C. K. Mohan, "Fitness-distance-ratio based particle swarm optimization," in Proceedings of the IEEE Swarm Intelligence Symposium (SIS '03), pp. 174–181, IEEE, April 2003.
[14] X. Chen and Y. M. Li, "A modified PSO structure resulting in high exploration ability with convergence guaranteed," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 37, no. 5, pp. 1271–1289, 2007.
[15] Z.-H. Zhan, J. Zhang, Y. Li, and Y.-H. Shi, "Orthogonal learning particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 15, no. 6, pp. 832–847, 2011.
[16] S. Y. Ho, H. S. Lin, W. H. Liauh, and S. J. Ho, "OPSO: orthogonal particle swarm optimization and its application to task assignment problems," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 38, no. 2, pp. 288–298, 2008.
[17] Z.-H. Zhan, J. Zhang, Y. Li, and H. S.-H. Chung, "Adaptive particle swarm optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 39, no. 6, pp. 1362–1381, 2009.
[18] M. A. Potter and K. A. De Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.
[19] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.
[20] G. Zhang and Y. M. Li, "Cooperative particle swarm optimizer with elimination mechanism for global optimization of multimodal problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 210–217, Beijing, China, July 2014.
[21] G. Zhang and Y. M. Li, "Orthogonal experimental design method used in particle swarm optimization for multimodal problems," in Proceedings of the 6th International Conference on Advanced Computational Intelligence (ICACI '13), pp. 183–188, Hangzhou, China, October 2013.
[22] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.
[23] D. C. Montgomery, Design and Analysis of Experiments, John Wiley & Sons, New York, NY, USA, 5th edition, 2000.
[24] S.-Y. Ho, L.-S. Shu, and J.-H. Chen, "Intelligent evolutionary algorithms for large parameter optimization problems," IEEE Transactions on Evolutionary Computation, vol. 8, no. 6, pp. 522–541, 2004.
[25] X. Yao, Y. Liu, and G. M. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[26] R. Salomon, "Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions: a survey of some theoretical and practical aspects of genetic algorithms," BioSystems, vol. 39, no. 3, pp. 263–278, 1996.
[27] SPSO 2007, http://www.particleswarm.info/Programs.html.