
Page 1

Evolutionary Large-Scale Global Optimization
An Introduction: Part I

Mohammad Nabi Omidvar¹, Xiaodong Li², Daniel Molina³, Antonio LaTorre⁴

¹ School of Computer Science, University of Birmingham, UK

² School of Science, RMIT University, Australia

³ DASCI Andalusian Institute of Data Science, University of Granada, Spain

⁴ Universidad Politécnica de Madrid, Spain

Mohammad Nabi Omidvar, Xiaodong Li, Daniel Molina, Antonio LaTorre · Decomposition and CC for LSGO · 1/91

Page 2

Outline

1 Introduction: Large Scale Global Optimization

2 Approaches to Large-Scale Optimization

3 Variable Interaction: Definitions and Importance

4 Interaction Learning: Exploiting Modularity

5 Conclusion

6 Questions

Page 3

Optimization

min f(x),  x = (x_1, ..., x_n) ∈ R^n    (1)

s.t.  g(x) ≤ 0    (2)

      h(x) = 0    (3)

Can be converted to unconstrained optimization using:

Penalty method;

Lagrangian;

Augmented Lagrangian.
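As a concrete illustration of the penalty approach, the sketch below converts a hypothetical 1-D constrained problem (min x² subject to x ≥ 1) into an unconstrained one and shows the penalized minimizer approaching the constrained optimum x* = 1 as the penalty coefficient μ grows. The example problem, the ternary-search minimizer, and the μ schedule are illustrative assumptions, not from the slides.

```python
def minimize_1d(F, lo, hi, iters=200):
    # Ternary search: adequate here because each penalized F below is unimodal.
    for _ in range(iters):
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if F(m1) < F(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

f = lambda x: x * x          # objective
g = lambda x: 1 - x          # constraint g(x) <= 0, i.e. x >= 1

def penalized(mu):
    # Quadratic penalty: feasible points pay nothing, violations pay mu * g(x)^2.
    return lambda x: f(x) + mu * max(0.0, g(x)) ** 2

for mu in (1, 10, 100, 1000):
    # The penalized minimizer is mu / (1 + mu), which tends to 1 as mu grows.
    print(mu, round(minimize_1d(penalized(mu), -2.0, 3.0), 4))
```

For equality constraints h(x) = 0, the same construction uses μ·h(x)² without the max; the Lagrangian variants refine this by adding multiplier terms so that μ need not go to infinity.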

Our focus is unconstrained optimization. We must learn how to walk before we can run.

Page 4

Large Scale Global Optimization (LSGO)

How large is large?

The notion of large-scale is not fixed.

Changes over time.

Differs from problem to problem.

The dimension at which existing methods start to fail.

State-of-the-art (EC)

Binary: ≈ 1 billion [a].

Integer (linear): ≈ 1 billion [b], [c].

Real: ≈ 1000-5000.

[a] Kumara Sastry, David E. Goldberg, and Xavier Llorà. “Towards billion-bit optimization via a parallel estimation of distribution algorithm”. In: Genetic and Evolutionary Computation Conference. ACM. 2007, pp. 577–584.

[b] Kalyanmoy Deb and Christie Myburgh. “Breaking the Billion-Variable Barrier in Real-World Optimization Using a Customized Evolutionary Algorithm”. In: Genetic and Evolutionary Computation Conference. ACM. 2016, pp. 653–660.

[c] Kalyanmoy Deb and Christie Myburgh. “A population-based fast algorithm for a billion-dimensional resource allocation problem with integer variables”. In: European Journal of Operational Research 261.2 (2017), pp. 460–474.

Page 5

Large Scale Global Optimization: Applications

Why is large-scale optimization important?

Growing applications in various fields:
◮ Target shape design optimization [a].
◮ Satellite layout design [b].
◮ Parameter estimation in large-scale biological systems [c].
◮ Seismic waveform inversion [d].
◮ Parameter calibration of water distribution systems [e].
◮ Vehicle routing [f].

[a] Zhenyu Yang et al. “Target shape design optimization by evolving B-splines with cooperative coevolution”. In: Applied Soft Computing 48 (Nov. 2016), pp. 672–682.

[b] Hong-Fei Teng et al. “A dual-system variable-grain cooperative coevolutionary algorithm: satellite-module layout design”. In: IEEE Transactions on Evolutionary Computation 14.3 (Dec. 2010), pp. 438–455.

[c] Shuhei Kimura et al. “Inference of S-system models of genetic networks using a cooperative coevolutionary algorithm”. In: Bioinformatics 21.7 (Apr. 2005), pp. 1154–1163.

[d] Chao Wang and Jinghuai Gao. “High-dimensional waveform inversion with cooperative coevolutionary differential evolution algorithm”. In: IEEE Geoscience and Remote Sensing Letters 9.2 (Mar. 2012), pp. 297–301.

[e] Yu Wang et al. “Two-stage based ensemble optimization framework for large-scale global optimization”. In: European Journal of Operational Research 228.2 (2013), pp. 308–320.

[f] Yi Mei, Xiaodong Li, and Xin Yao. “Cooperative coevolution with route distance grouping for large-scale capacitated arc routing problems”. In: IEEE Transactions on Evolutionary Computation 18.3 (2014), pp. 435–449.

Page 6

Large Scale Global Optimization: Research

Page 7

Large Scale Global Optimization: Research

Page 8

The Challenge of Large Scale Optimization

Why is it difficult?

Exponential growth in the size of the search space (the curse of dimensionality).
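A back-of-the-envelope sketch of that exponential growth (the sample counts are illustrative):

```python
# A grid with 10 samples per axis needs 10^n points in n dimensions, and a
# region spanning 90% of each axis covers only 0.9^n of the unit hypercube:
# nearly all of the volume migrates to the boundary as n grows.
for n in (2, 10, 100, 1000):
    print(f"n = {n}: grid points = 10^{n}, inner-region volume = {0.9 ** n:.3e}")
```

At n = 1000, a budget that densely samples a 2-D problem covers a vanishing fraction of the space, which is why blind sampling alone cannot scale.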

Research Goal

Improving search quality (get to the optimal point).

Improving search efficiency (get there fast).

Page 9

Large Scale Global Optimization: Evolutionary Approaches

1 Initialization

2 Sampling and Variation Operators

3 Approximation and Surrogate Modeling

4 Local Search and Memetic Algorithms

5 Decomposition and Divide-and-Conquer

6 Parallelization (GPU, CPU)

7 Hybridization

Page 10

Initialization Methods

Study the importance of initialization methods [1] in large-scale optimization.

[1] Borhan Kazimipour, Xiaodong Li, and A. Kai Qin. “A review of population initialization techniques for evolutionary algorithms”. In: IEEE Congress on Evolutionary Computation. IEEE. 2014, pp. 2585–2592.

Page 11

Initialization Methods

Inconclusive evidence for or against initialization methods:
◮ Uniform design works worse than RNG, while good-lattice-point and opposition-based methods perform better [1].
◮ Another study showed that population size has a more significant effect than the initialization method [2].
◮ Achieving uniformity is difficult in high-dimensional spaces [3].
◮ Yet another study suggests that comparing average performances may not reveal the effect of initialization [4].

Shortcomings:
◮ It is difficult to isolate the effect of initialization.
◮ Different effects on different algorithms (mostly tested on DE).
◮ Numerous parameters to study.

[1] Borhan Kazimipour, Xiaodong Li, and A. Kai Qin. “Initialization methods for large scale global optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2013, pp. 2750–2757.

[2] Borhan Kazimipour, Xiaodong Li, and A. Kai Qin. “Effects of population initialization on differential evolution for large scale optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2014, pp. 2404–2411.

[3] Borhan Kazimipour, Xiaodong Li, and A. Kai Qin. “Why advanced population initialization techniques perform poorly in high dimension?” In: SEAL. 2014, pp. 479–490.

[4] Eduardo Segredo et al. “On the comparison of initialisation strategies in differential evolution for large scale optimisation”. In: Optimization Letters (2017), pp. 1–14.

Page 12

Sampling and Variation Operators

Opposition-based sampling [1].

Center-based sampling [2].

Quantum-behaved particle swarm [3].

Competitive Swarm Optimizer [4].

Social learning PSO [5].

Mutation operators [6], [7].
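A minimal sketch of the opposition-based idea behind [1]: the opposite of a point x in [a, b] is a + b − x, and evaluating both a sample and its opposite, then keeping the better of the pair, can never make the kept sample worse. The sphere objective, bounds, and population size here are illustrative assumptions.

```python
import random

def sphere(x):
    # Simple separable test objective; any fitness function would do.
    return sum(v * v for v in x)

def opposite(x, lo, hi):
    # Opposition-based learning: reflect every coordinate about the interval centre.
    return [lo + hi - v for v in x]

random.seed(0)
lo, hi, dim = -5.0, 5.0, 20
pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(30)]
# Keep the better of each point and its opposite.
improved = [min(x, opposite(x, lo, hi), key=sphere) for x in pop]
assert all(sphere(a) <= sphere(b) for a, b in zip(improved, pop))
```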

[1] Hui Wang, Zhijian Wu, and Shahryar Rahnamayan. “Enhanced opposition-based differential evolution for solving high-dimensional continuous optimization problems”. In: Soft Computing 15.11 (2011), pp. 2127–2140.

[2] Sedigheh Mahdavi, Shahryar Rahnamayan, and Kalyanmoy Deb. “Center-based initialization of cooperative co-evolutionary algorithm for large-scale optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2016, pp. 3557–3565.

[3] Deyu Tang et al. “A quantum-behaved particle swarm optimization with memetic algorithm and memory for continuous non-linear large scale problems”. In: Information Sciences 289 (2014), pp. 162–189.

[4] Ran Cheng and Yaochu Jin. “A competitive swarm optimizer for large scale optimization”. In: IEEE Transactions on Cybernetics 45.2 (2015), pp. 191–204.

[5] Ran Cheng and Yaochu Jin. “A social learning particle swarm optimization algorithm for scalable optimization”. In: Information Sciences 291 (2015), pp. 43–60.

[6] Hongwei Ge et al. “Cooperative differential evolution with fast variable interdependence learning and cross-cluster mutation”. In: Applied Soft Computing 36 (2015), pp. 300–314.

[7] Ali Wagdy Mohamed and Abdulaziz S. Almazyad. “Differential Evolution with Novel Mutation and Adaptive Crossover Strategies for Solving Large Scale Global Optimization Problems”. In: Applied Computational Intelligence and Soft Computing 2017 (2017).

Page 13

Approximation Methods and Surrogate Modeling

High-Dimensional Model Representation (HDMR) [1].

Radial Basis Functions [2].

Kriging and Gradient-Enhanced Kriging Metamodels [3].

Piecewise Polynomial (Spline) [4].

Turning large-scale problems into expensive optimization problems [5].

[1] Enying Li, Hu Wang, and Fan Ye. “Two-level Multi-surrogate Assisted Optimization method for high dimensional nonlinear problems”. In: Applied Soft Computing 46 (2016), pp. 26–36.

[2] Rommel G. Regis. “Evolutionary programming for high-dimensional constrained expensive black-box optimization using radial basis functions”. In: IEEE Transactions on Evolutionary Computation 18.3 (2014), pp. 326–347.

[3] Selvakumar Ulaganathan et al. “A hybrid sequential sampling based metamodelling approach for high dimensional problems”. In: IEEE Congress on Evolutionary Computation. IEEE. 2016, pp. 1917–1923.

[4] Zhenyu Yang et al. “Target shape design optimization by evolving B-splines with cooperative coevolution”. In: Applied Soft Computing 48 (Nov. 2016), pp. 672–682.

[5] Peng Yang, Ke Tang, and Xin Yao. “Turning high-dimensional optimization into computationally expensive optimization”. In: IEEE Transactions on Evolutionary Computation 22.1 (2018), pp. 143–156.

Page 14

Local Search and Memetic Algorithms

Multiple Trajectory Search (MTS) [1].

Memetic algorithm with local search chaining [2].
◮ MA-SW-Chains [3].
◮ MA-SSW-Chains [4].

Multiple offspring sampling (MOS) [5], [6].

[1] Lin-Yu Tseng and Chun Chen. “Multiple trajectory search for large scale global optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2008, pp. 3052–3059.

[2] Daniel Molina, Manuel Lozano, and Francisco Herrera. “Memetic algorithm with local search chaining for large scale continuous optimization problems”. In: IEEE Congress on Evolutionary Computation. IEEE. 2009, pp. 830–837.

[3] Daniel Molina, Manuel Lozano, and Francisco Herrera. “MA-SW-Chains: Memetic algorithm based on local search chains for large scale continuous global optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2010, pp. 1–8.

[4] Daniel Molina et al. “Memetic algorithms based on local search chains for large scale continuous optimisation problems: MA-SSW-Chains”. In: Soft Computing 15.11 (2011), pp. 2201–2220.

[5] Antonio LaTorre, Santiago Muelas, and José-María Peña. “Multiple offspring sampling in large scale global optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2012, pp. 1–8.

[6] Antonio LaTorre, Santiago Muelas, and José-María Peña. “A MOS-based dynamic memetic differential evolution algorithm for continuous optimization: a scalability test”. In: Soft Computing 15.11 (2011), pp. 2187–2199.

Page 15

Parallelization

Algorithms capable of parallelization [1], [2].

GPU [3], [4].

CPU/OpenMP [5].

[1] Jing Tang, Meng Hiot Lim, and Yew Soon Ong. “Diversity-adaptive parallel memetic algorithm for solving large scale combinatorial optimization problems”. In: Soft Computing 11.9 (2007), pp. 873–888.

[2] Hui Wang, Shahryar Rahnamayan, and Zhijian Wu. “Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems”. In: Journal of Parallel and Distributed Computing 73.1 (2013), pp. 62–73.

[3] Kumara Sastry, David E. Goldberg, and Xavier Llorà. “Towards billion-bit optimization via a parallel estimation of distribution algorithm”. In: Genetic and Evolutionary Computation Conference. ACM. 2007, pp. 577–584.

[4] Alberto Cano and Carlos García-Martínez. “100 Million dimensions large-scale global optimization using distributed GPU computing”. In: IEEE Congress on Evolutionary Computation. IEEE. 2016, pp. 3566–3573.

[5] A. J. Umbarkar. “OpenMP Genetic Algorithm for Continuous Nonlinear Large-Scale Optimization Problems”. In: International Conference on Soft Computing for Problem Solving. Springer. 2016, pp. 203–214.

Page 16

Hybridization (The best of both worlds)

Rationale: benefiting from the unique features of different optimizers.
◮ EDA+DE [1].
◮ PSO+ABC [2].
◮ Different DE variants: JADE+SaNSDE [3].
◮ PSO+ACO [4].
◮ Minimum Population Search+CMA-ES [5].

[1] Yu Wang, Bin Li, and Thomas Weise. “Estimation of distribution and differential evolution cooperation for large scale economic load dispatch optimization of power systems”. In: Information Sciences 180.12 (2010), pp. 2405–2420.

[2] L. N. Vitorino, S. F. Ribeiro, and Carmelo J. A. Bastos-Filho. “A hybrid swarm intelligence optimizer based on particles and artificial bees for high-dimensional search spaces”. In: IEEE Congress on Evolutionary Computation. IEEE. 2012, pp. 1–6.

[3] Sishi Ye et al. “A hybrid adaptive coevolutionary differential evolution algorithm for large-scale optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2014, pp. 1277–1284.

[4] Wu Deng et al. “A novel two-stage hybrid swarm intelligence optimization algorithm and application”. In: Soft Computing 16.10 (2012), pp. 1707–1722.

[5] Antonio Bolufé-Röhler, Sonia Fiol-González, and Stephen Chen. “A minimum population search hybrid for large scale global optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2015, pp. 1958–1965.

Page 17

Decomposition Methods

Divide-and-conquer

Page 18

Variable Interaction, Linkage, Epistasis

What is variable interaction?

Genetics: two genes are said to interact with each other if they collectively represent a feature at the phenotype level.

The extent to which the fitness of one gene can be suppressed by another gene.

The extent to which the value taken by one gene activates or deactivates the effectof another gene.

Why variable interaction?

The effectiveness of optimization algorithms is affected by how much they take variable interaction into account.

Also applies to classic mathematical programming methods.

Page 19

Variable Interaction, Linkage, Epistasis

Illustrative Example

f(x, y) = x² + λ₁y²

g(x, y) = x² + λ₁y² + λ₂xy
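The interaction can be checked numerically: the x that minimizes f at a fixed y never moves, while for g it shifts with y (closed form: ∂g/∂x = 2x + λ₂y = 0 gives x* = −λ₂y/2). The coefficient values λ₁ = 1, λ₂ = 0.8 and the search interval below are illustrative assumptions.

```python
def best_x(func, y, lo=-10.0, hi=10.0, iters=200):
    # 1-D ternary search for the minimizing x at a fixed y
    # (both example functions are convex in x).
    for _ in range(iters):
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if func(m1, y) < func(m2, y):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

l1, l2 = 1.0, 0.8
f = lambda x, y: x * x + l1 * y * y               # separable
g = lambda x, y: x * x + l1 * y * y + l2 * x * y  # nonseparable

print(best_x(f, 0.0), best_x(f, 4.0))  # both near 0: y does not matter
print(best_x(g, 0.0), best_x(g, 4.0))  # near 0 vs near -1.6: the best x depends on y
```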

[Figure: improvement intervals of a point A in the (x1, x2) plane for the separable function (left) and for the nonseparable function with the cross term (right), where the rotated landscape couples the improvement intervals of the two variables.]

Page 20

Definitions

Variable Interaction

A variable x_i is separable, i.e., it does not interact with any other variable, iff:

arg min_x f(x) = ( arg min_{x_i} f(x), arg min_{x_j, ∀j ≠ i} f(x) ),

where x = (x_1, ..., x_n)^T is a decision vector of n dimensions.

Partial Separability

A function f(x) is partially separable with m independent subcomponents iff:

arg min_x f(x) = ( arg min_{x_1} f(x_1, ...), ..., arg min_{x_m} f(..., x_m) ),

where x_1, ..., x_m are disjoint sub-vectors of x, and 2 ≤ m ≤ n.

Note: a function is fully separable if the sub-vectors x_1, ..., x_m are all 1-dimensional (i.e., m = n).

Page 21

Definitions

Full Nonseparability

A function f (x) is fully non-separable if every pair of its decision variables interact witheach other.

Additive Separability

A function is partially additively separable if it has the following general form:

f(x) = Σ_{i=1}^{m} f_i(x_i),

where the x_i are mutually exclusive decision vectors of the f_i, x = (x_1, ..., x_n)^T is a global decision vector of n dimensions, and m is the number of independent subcomponents.
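A hypothetical 3-variable example of this definition: with m = 2 additive components that share no variables, concatenating the minimizers of the components minimizes the whole function, which is exactly what decomposition methods exploit.

```python
# f(x) = f1(x1, x2) + f2(x3): partially additively separable with m = 2.
f1 = lambda x1, x2: (x1 - 1) ** 2 + (x1 - x2) ** 2
f2 = lambda x3: (x3 + 2) ** 2
f = lambda x: f1(x[0], x[1]) + f2(x[2])

# f1 is minimized at x1 = x2 = 1 and f2 at x3 = -2; the concatenation
# (1, 1, -2) of the two sub-solutions minimizes the full function.
assert f([1.0, 1.0, -2.0]) == 0.0
```

Note that x1 and x2 interact inside f1, so the decomposition stops at m = 2: splitting further would separate interacting variables.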

Page 22

Effect of Variable Interaction (1)

Sampling and Variation Operators:

GAs: inversion operator to promote tight linkage [1].
◮ Increasing the likelihood of placing linked genes close to each other to avoid disruption by crossover.
◮ Rotation of the landscape has a detrimental effect on GAs [2].

The need for rotational invariance:
◮ Model-building methods:
⋆ Estimation of distribution algorithms and evolution strategies: Covariance Matrix Adaptation.
⋆ Bayesian optimization: Bayesian networks.
◮ DE’s crossover is not rotationally invariant.
◮ PSO is also affected by rotation [3].

[1] David E. Goldberg, Robert Lingle, et al. “Alleles, loci, and the traveling salesman problem”. In: International Conference on Genetic Algorithms and Their Applications. Vol. 154. 1985, pp. 154–159.

[2] Ralf Salomon. “Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions. A survey of some theoretical and practical aspects of genetic algorithms”. In: BioSystems 39.3 (1996), pp. 263–278.

[3] Daniel N. Wilke, Schalk Kok, and Albert A. Groenwold. “Comparison of linear and classical velocity update rules in particle swarm optimization: Notes on scale and frame invariance”. In: International Journal for Numerical Methods in Engineering 70.8 (2007), pp. 985–1008.

Page 23

Effect of Variable Interaction (2)

1 Approximation and Surrogate Modelling:
◮ Should be able to capture variable interaction.
◮ Second-order terms of HDMR.

2 Local Search and Memetic Algorithms:
◮ What subset of variables should be optimized in each iteration of local search?
◮ Coordinate-wise search may not be effective. Memetic algorithms perform well on separable functions. A coincidence?

3 Decomposition and Divide-and-Conquer:
◮ Interacting variables should be placed in the same component.

Page 24

Linkage Learning and Exploiting Modularity

Implicit Methods:
◮ In EC:
⋆ Estimation of Distribution Algorithms
⋆ Bayesian Optimization: BOA, hBOA, Linkage Trees
⋆ Adaptive Encoding, CMA-ES
◮ Classic Optimization:
⋆ Quasi-Newton Methods: approximation of the Hessian.
⋆ Adaptive Coordinate Descent

Explicit Methods:
◮ In EC:
⋆ Random Grouping
⋆ Statistical Correlation-Based Methods
⋆ Delta Grouping
⋆ Meta Modelling
⋆ Monotonicity Checking
⋆ Differential Grouping
◮ Classic Optimization:
⋆ Block Coordinate Descent

Page 25

Implicit Methods

Scaling Up EDAs:
◮ Model Complexity Control [1].
◮ Random Matrix Projection [2].
◮ Use of mutual information [3].
◮ Cauchy-EDA [4].

[1] Weishan Dong et al. “Scaling up estimation of distribution algorithms for continuous optimization”. In: IEEE Transactions on Evolutionary Computation 17.6 (2013), pp. 797–822.

[2] Ata Kabán, Jakramate Bootkrajang, and Robert John Durrant. “Toward large-scale continuous EDA: A random matrix theory perspective”. In: Evolutionary Computation 24.2 (2016), pp. 255–291.

[3] Qi Xu, Momodou L. Sanyang, and Ata Kabán. “Large scale continuous EDA using mutual information”. In: IEEE Congress on Evolutionary Computation. IEEE. 2016, pp. 3718–3725.

[4] Momodou L. Sanyang, Robert J. Durrant, and Ata Kabán. “How effective is Cauchy-EDA in high dimensions?” In: IEEE Congress on Evolutionary Computation. IEEE. 2016, pp. 3409–3416.

Page 26

Implicit Methods

Scaling up CMA-ES:
◮ CC-CMA-ES [1].
◮ sep-CMA-ES [2].
◮ Reducing space complexity:
⋆ L-CMA-ES [3].
⋆ LM-CMA [4].

[1] Jinpeng Liu and Ke Tang. “Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution”. In: International Conference on Intelligent Data Engineering and Automated Learning. Springer. 2013, pp. 350–357.

[2] Raymond Ros and Nikolaus Hansen. “A simple modification in CMA-ES achieving linear time and space complexity”. In: Parallel Problem Solving from Nature. Springer. 2008, pp. 296–305.

[3] James N. Knight and Monte Lunacek. “Reducing the space-time complexity of the CMA-ES”. In: Genetic and Evolutionary Computation Conference. ACM. 2007, pp. 658–665.

[4] Ilya Loshchilov. “LM-CMA: An Alternative to L-BFGS for Large-Scale Black Box Optimization”. In: Evolutionary Computation (2015).

Page 27

Scalability issues of EDAs

Accurate estimation requires a large sample size, which grows exponentially with the dimensionality of the problem [1].

A small sample results in poor estimation of the eigenvalues [2].

The cost of sampling from a multi-dimensional Gaussian distribution increases cubically with the problem size [3].

[1] Jerome Friedman, Trevor Hastie, and Robert Tibshirani. The Elements of Statistical Learning. Vol. 1. Springer Series in Statistics. Springer, Berlin, 2001.

[2] Roman Vershynin. “Introduction to the non-asymptotic analysis of random matrices”. In: arXiv preprint arXiv:1011.3027 (2010).

[3] Weishan Dong and Xin Yao. “Unified eigen analysis on multivariate Gaussian based estimation of distribution algorithms”. In: Information Sciences 178.15 (2008), pp. 3000–3023.

Page 28

Random Projection EDA [1]


[1] Ata Kabán, Jakramate Bootkrajang, and Robert John Durrant. “Toward large-scale continuous EDA: A random matrix theory perspective”. In: Evolutionary Computation 24.2 (2016), pp. 255–291.

Page 29

Explicit Methods

A large problem can be subdivided into smaller and simpler problems.

Dates back to René Descartes (Discourse on the Method).

Has been widely used in many areas:
◮ Computer Science: sorting algorithms (quick sort, merge sort)
◮ Optimization: large-scale linear programs (Dantzig)
◮ Politics: divide and rule (divide et impera is the third of the political maxims in Immanuel Kant’s Perpetual Peace).

Acknowledgement: the above image is obtained from: http://draininbrain.blogspot.com.au/

Page 30

Decomposition in EAs: Cooperative Co-evolution [1]

[1] Mitchell A. Potter and Kenneth A. De Jong. “A cooperative coevolutionary approach to function optimization”. In: Proc. Int. Conf. Parallel Problem Solving from Nature. Vol. 2. 1994, pp. 249–257.

Page 31

CC is a Framework

CC as a scalability agent:

CC is not an optimizer.

Requires a component optimizer.

CC coordinates how the component optimizer is applied to the components.

A scalability agent.

Page 32

Challenges of CC

Main Questions

1 How to decompose the problem?

2 How to allocate resources?

3 How to coordinate?

Page 33

The Decomposition Challenge

How to decompose?

There are many possibilities.

Which decomposition is the best?

Optimal decomposition

It is governed by the interaction structure of decision variables.

An optimal decomposition is one that minimizes the interaction between components.

Page 34

Survey of Decomposition Methods

Uninformed Decomposition [1]
◮ n 1-dimensional components (the original CC)
◮ k s-dimensional components.

Random Grouping [2]

Statistical Correlation-Based Methods

Delta Grouping [3]

Meta Modelling [4]

Monotonicity Checking [5]

Differential Grouping [6]

[1] F. van den Bergh and Andries P. Engelbrecht. “A cooperative approach to particle swarm optimization”. In: IEEE Transactions on Evolutionary Computation 2.3 (June 2004), pp. 225–239.

[2] Zhenyu Yang, Ke Tang, and Xin Yao. “Large scale evolutionary optimization using cooperative coevolution”. In:Information Sciences 178.15 (2008), pp. 2985–2999.

[3] Mohammad Nabi Omidvar, Xiaodong Li, and Xin Yao. “Cooperative co-evolution with delta grouping for large scalenon-separable function optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2010, pp. 1–8.

[4] Sedigheh Mahdavi, Mohammad Ebrahim Shiri, and Shahryar Rahnamayan. “Cooperative co-evolution with a newdecomposition method for large-scale optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2014,pp. 1285–1292.

[5] Wenxiang Chen et al. “Large-scale global optimization using cooperative coevolution with variable interaction learning”. In:Parallel Problem Solving from Nature. Springer. 2010, pp. 300–309.

[6] Mohammad Nabi Omidvar et al. “Cooperative co-evolution with differential grouping for large scale optimization”. In:IEEE Transactions on Evolutionary Computation 18.3 (2014), pp. 378–393.


Illustrative Example (Canonical CC)

Figure: Variable interaction of a hypothetical function (variables x1–x7).

n 1-dimensional components:
◮ C1: {x1}, {x2}, {x3}, {x4}, {x5}, {x6}, {x7}
◮ C2: {x1}, {x2}, {x3}, {x4}, {x5}, {x6}, {x7}
◮ ...
◮ Cc: {x1}, {x2}, {x3}, {x4}, {x5}, {x6}, {x7}


Illustrative Example (fixed k s-dimensional)

Figure: Variable interaction of a hypothetical function (variables x1–x7).

k s-dimensional (k = 2, s = 4):
◮ C1: {x1, x2, x3, x4}, {x5, x6, x7}
◮ C2: {x1, x2, x3, x4}, {x5, x6, x7}
◮ ...
◮ Cc: {x1, x2, x3, x4}, {x5, x6, x7}


Illustrative Example (Random Grouping)

Figure: Variable interaction of a hypothetical function (variables x1–x7).

Random Grouping (k = 2, s = 4):
◮ C1: {x2, x3, x6, x5}, {x7, x1, x4}
◮ C2: {x3, x4, x1, x2}, {x6, x7, x5}
◮ ...
◮ Cc: {x1, x5, x6, x7}, {x2, x4, x3}


Random Grouping

Theorem

Given N cycles, the probability of assigning v interacting variables

x1, x2, ..., xv into one subcomponent for at least k cycles is:

P(X ≥ k) = Σ_{r=k}^{N} C(N, r) (1/m^(v−1))^r (1 − 1/m^(v−1))^(N−r)   (4)

where N is the number of cycles, v is the total number of interacting

variables, m is the number of subcomponents, and the random variable X

is the number of times that v interacting variables are grouped in one

subcomponent.


Random Grouping

Example

Given n = 1000, m = 10, N = 50 and v = 4, we have:

P(X ≥ 1) = 1 − P(X = 0) = 1 − (1 − 1/10³)^50 = 0.0488

which means that over 50 cycles, the probability of grouping the 4 interacting variables into one subcomponent for at least 1 cycle is only 0.0488. This probability is already very small, and it shrinks further as the number of interacting variables grows.
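Equation (4) is easy to check numerically. A minimal sketch (the function name is ours; `math.comb` requires Python 3.8+):

```python
from math import comb

def prob_grouped(N, m, v, k=1):
    """P(X >= k): probability that v interacting variables fall into the
    same subcomponent for at least k of N random-grouping cycles.
    Per cycle the chance is (1/m)**(v-1): the first variable fixes the
    group, and the other v-1 must independently land in it."""
    p = (1.0 / m) ** (v - 1)
    return sum(comb(N, r) * p**r * (1.0 - p) ** (N - r) for r in range(k, N + 1))

print(round(prob_grouped(N=50, m=10, v=4), 4))  # 0.0488, matching the example
```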


Figure: Increasing v, the number of interacting variables, significantly decreases the probability of grouping them in one subcomponent, given n = 1000 and m = 10. (Plotted series: P(X ≥ 1) for N = 50 and N = 10000; probability vs. number of interacting variables v.)


Figure: Increasing N, the number of cycles, increases the probability of grouping v interacting variables in one subcomponent. (Plotted series: P(X ≥ 1) for v = 3, 4, 5, 6; probability vs. number of cycles.)


Illustrative Example (Informed with Fixed Groups)

Figure: Variable interaction of a hypothetical function (variables x1–x7).

Delta Grouping (k = 2, s = 4):
◮ C1: {x1, x5, x2, x4}, {x3, x6, x7}
◮ C2: {x3, x5, x6, x7}, {x1, x2, x4}
◮ ...
◮ Cc: {x3, x6, x1, x4}, {x2, x5, x7}


Delta Grouping

[Figure: two panels plotting x2 against x1, showing the improvement intervals of each variable relative to a region A.]


Informed Decompositions with Fixed Groups

Adaptive Variable Partitioning [1].

Delta Grouping [2].

Min/Max-Variance Decomposition (MiVD/MaVD) [3].
◮ Sorts the dimensions based on the diagonal elements of the covariance matrix in CMA-ES.

Fitness Difference Partitioning [4], [5], [6].

[1] Tapabrata Ray and Xin Yao. “A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning”. In: IEEE Congress on Evolutionary Computation. IEEE. 2009, pp. 983–989.

[2] Mohammad Nabi Omidvar, Xiaodong Li, and Xin Yao. “Cooperative co-evolution with delta grouping for large scale non-separable function optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2010, pp. 1–8.

[3] Jinpeng Liu and Ke Tang. “Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution”. In: International Conference on Intelligent Data Engineering and Automated Learning. Springer. 2013, pp. 350–357.

[4] Eman Sayed, Daryl Essam, and Ruhul Sarker. “Dependency identification technique for large scale optimization problems”. In: IEEE Congress on Evolutionary Computation. IEEE. 2012, pp. 1–8.

[5] Eman Sayed et al. “Decomposition-based evolutionary algorithm for large scale constrained problems”. In: Information Sciences 316 (2015), pp. 457–486.

[6] Adan E. Aguilar-Justo and Efren Mezura-Montes. “Towards an improvement of variable interaction identification for large-scale constrained problems”. In: IEEE Congress on Evolutionary Computation. IEEE. 2016, pp. 4167–4174.


Informed Decompositions with Variable Groups

Multilevel Grouping: MLCC [1], MLSoft [2].

Adaptive Variable Partitioning 2 [3].

4CDE [4].

Fuzzy Clustering [5].

[1] Zhenyu Yang, Ke Tang, and Xin Yao. “Multilevel cooperative coevolution for large scale optimization”. In: IEEE Congress on Evolutionary Computation. IEEE. 2008, pp. 1663–1670.

[2] Mohammad Nabi Omidvar, Yi Mei, and Xiaodong Li. “Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms”. In: IEEE Congress on Evolutionary Computation. IEEE. 2014, pp. 1305–1312.

[3] Hemant Kumar Singh and Tapabrata Ray. “Divide and conquer in coevolution: A difficult balancing act”. In: Agent-Based Evolutionary Search. Springer, 2010, pp. 117–138.

[4] Yazmin Rojas and Ricardo Landa. “Towards the use of statistical information and differential evolution for large scale global optimization”. In: International Conference on Electrical Engineering Computing Science and Automatic Control. IEEE. 2011, pp. 1–6.

[5] Jianchao Fan, Jun Wang, and Min Han. “Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods”. In: IEEE Transactions on Fuzzy Systems 22.4 (2014), pp. 829–839.


Illustrative Example (Exact Methods)

Figure: Variable interaction of a hypothetical function (variables x1–x7).

Differential Grouping and Variable Interaction Learning:
◮ C1: {x1, x2, x4}, {x3, x5, x6, x7}
◮ C2: {x1, x2, x4}, {x3, x5, x6, x7}
◮ ...
◮ Cc: {x1, x2, x4}, {x3, x5, x6, x7}


Monotonicity Check

∃ x, x′i, x′j :

f(x1, ..., xi, ..., xj, ..., xn) < f(x1, ..., x′i, ..., xj, ..., xn) ∧
f(x1, ..., xi, ..., x′j, ..., xn) > f(x1, ..., x′i, ..., x′j, ..., xn)


Monotonicity Check (Algorithms)

Linkage Identification by Non-Monotonicity Detection [1]

Adaptive Coevolutionary Learning [2]

Variable Interaction Learning [3]

Variable Interdependence Learning [4]

Fast Variable Interdependence [5]

[1] Masaharu Munetomo and David E. Goldberg. “Linkage identification by non-monotonicity detection for overlapping functions”. In: Evolutionary Computation 7.4 (1999), pp. 377–398.

[2] Karsten Weicker and Nicole Weicker. “On the improvement of coevolutionary optimizers by learning variable interdependencies”. In: IEEE Congress on Evolutionary Computation. Vol. 3. IEEE. 1999, pp. 1627–1632.

[3] Wenxiang Chen et al. “Large-scale global optimization using cooperative coevolution with variable interaction learning”. In: Parallel Problem Solving from Nature. Springer. 2010, pp. 300–309.

[4] Liang Sun et al. “A cooperative particle swarm optimizer with statistical variable interdependence learning”. In: Information Sciences 186.1 (2012), pp. 20–39.

[5] Hongwei Ge et al. “Cooperative differential evolution with fast variable interdependence learning and cross-cluster mutation”. In: Applied Soft Computing 36 (2015), pp. 300–314.


Differential Grouping [1]

Theorem

Let f(x) be an additively separable function. ∀ a, b1 ≠ b2, δ ∈ ℝ, δ ≠ 0: if the following condition holds,

∆δ,xp[f](x)|xp=a, xq=b1 ≠ ∆δ,xp[f](x)|xp=a, xq=b2,   (5)

then xp and xq are non-separable, where

∆δ,xp[f](x) = f(..., xp + δ, ...) − f(..., xp, ...)   (6)

refers to the forward difference of f with respect to variable xp with interval δ.

[1] Mohammad Nabi Omidvar et al. “Cooperative co-evolution with differential grouping for large scale optimization”. In:IEEE Transactions on Evolutionary Computation 18.3 (2014), pp. 378–393.


Pages 50–56 build up one figure step by step (Figure: f(x1, x2) = x1² + x2²):

◮ Evaluate p1 = (x1, x2) and p2 = (x′1, x2), giving ∆1 = f(p1) − f(p2).
◮ Shift the second coordinate to x′2 and evaluate p1 = (x1, x′2) and p2 = (x′1, x′2), giving ∆2 = f(p1) − f(p2).
◮ Λk,12 = |∆1 − ∆2| > 0 ⇒ x1 and x2 are nonseparable.


Separability ⇒ ∆1 = ∆2

Assuming:

f(x) = Σ_{i=1}^{m} f_i(x_i)

We prove that:

Separability ⇒ ∆1 = ∆2

By contraposition (P ⇒ Q ≡ ¬Q ⇒ ¬P):

∆1 ≠ ∆2 ⇒ non-separability

or

|∆1 − ∆2| > ε ⇒ non-separability


The Differential Grouping Algorithm

Detecting Non-separable Variables

|∆1 − ∆2| > ε ⇒ non-separability

Detecting Separable Variables

|∆1 − ∆2| ≤ ε ⇒ separability (more plausible)


Example

Consider the non-separable objective function f(x1, x2) = x1² + λx1x2 + x2², λ ≠ 0. Then

∂f(x1, x2)/∂x1 = 2x1 + λx2.

This shows that the change in the global objective function with respect to x1 depends on both x1 and x2. Applying the theorem:

∆δ,x1[f] = [(x1 + δ)² + λ(x1 + δ)x2 + x2²] − [x1² + λx1x2 + x2²]
         = δ² + 2δx1 + λx2δ.
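The derivation can be checked numerically with a direct pairwise implementation of the theorem. A sketch: the function name and the choices of δ and ε are ours, not part of the published algorithm.

```python
def dg_interacts(f, x, p, q, delta=1.0, eps=1e-10):
    """Pairwise test from the differential-grouping theorem: x_p and x_q
    interact if the forward difference w.r.t. x_p changes when x_q moves."""
    base = list(x)
    pert = list(x); pert[p] += delta
    d1 = f(pert) - f(base)              # Delta1 at the original x_q
    base[q] += delta; pert[q] += delta  # move x_q and repeat
    d2 = f(pert) - f(base)              # Delta2 at the shifted x_q
    return abs(d1 - d2) > eps           # True -> non-separable pair

lam = 0.5
f_ns = lambda v: v[0]**2 + lam * v[0] * v[1] + v[1]**2  # the slide's example
f_s  = lambda v: v[0]**2 + v[1]**2                      # separable control
print(dg_interacts(f_ns, [1.0, 2.0], 0, 1))  # True  (|d1 - d2| = lam*delta = 0.5)
print(dg_interacts(f_s,  [1.0, 2.0], 0, 1))  # False
```

For f_ns the two differences are 4 and 4.5, matching the λx2δ term of the derivation above.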


Differential Grouping vs CCVIL

Figure: Detection of interacting variables using differential grouping and CCVIL on different regions of a 2D Schwefel Problem 1.2. (Axes: x1 and x2 over [−2, 2]; x′1 = x1 + δ and x′2 mark the perturbed coordinates.)


Differential Grouping Family of Algorithms

Linkage Identification by Non-linearity Check (LINC, LINC-R) [1]
Differential Grouping (DG) [2]
Global Differential Grouping (GDG) [3]
Improved Differential Grouping (IDG) [4]
eXtended Differential Grouping (XDG) [5]
Graph-based Differential Grouping (gDG) [6]
Fast Interaction Identification [7]
Recursive Differential Grouping (RDG1 and RDG2) [8]

[1] Masaru Tezuka, Masaharu Munetomo, and Kiyoshi Akama. “Linkage identification by nonlinearity check for real-coded genetic algorithms”. In: Genetic and Evolutionary Computation – GECCO 2004. Springer. 2004, pp. 222–233.

[2] Mohammad Nabi Omidvar et al. “Cooperative co-evolution with differential grouping for large scale optimization”. In: IEEE Transactions on Evolutionary Computation 18.3 (2014), pp. 378–393.

[3] Yi Mei et al. “Competitive Divide-and-Conquer Algorithm for Unconstrained Large Scale Black-Box Optimization”. In: ACM Transactions on Mathematical Software 42.2 (June 2015), p. 13.

[4] Mohammad Nabi Omidvar et al. IDG: A Faster and More Accurate Differential Grouping Algorithm. Technical Report CSR-15-04. University of Birmingham, School of Computer Science, Sept. 2015.

[5] Yuan Sun, Michael Kirley, and Saman Kumara Halgamuge. “Extended differential grouping for large scale global optimization with direct and indirect variable interactions”. In: Genetic and Evolutionary Computation Conference. ACM. 2015, pp. 313–320.

[6] Yingbiao Ling, Haijian Li, and Bin Cao. “Cooperative co-evolution with graph-based differential grouping for large scale global optimization”. In: International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery. IEEE. 2016, pp. 95–102.

[7] Xiao-Min Hu et al. “Cooperation coevolution with fast interdependency identification for large scale optimization”. In: Information Sciences 381 (2017), pp. 142–160.

[8] Yuan Sun, Michael Kirley, and Saman K. Halgamuge. “A recursive decomposition method for large scale continuous optimization”. In: IEEE Transactions on Evolutionary Computation 22.5 (2017), pp. 647–661.


Shortcomings of Differential Grouping

Cannot detect overlapping functions.

Slow if all pairwise interactions are to be checked.

Requires a threshold parameter (ε).

Can be sensitive to the choice of the threshold parameter (ε).


Direct/Indirect Interactions

Indirect Interactions

In an objective function f(x), decision variables xi and xj interact directly (denoted by xi ↔ xj) if

∃ a : ∂²f / ∂xi ∂xj |_{x=a} ≠ 0;

decision variables xi and xj interact indirectly if

∂²f / ∂xi ∂xj = 0

and there exists a set of decision variables {xk1, ..., xks} such that xi ↔ xk1, ..., xks ↔ xj.


Efficiency vs Accuracy

Saving budget at the expense of missing overlaps:

eXtended Differential Grouping [1].

Fast Interdependence Identification [2].

Figure: The interaction structures represented by the two graphs (over variables 1–6) cannot be distinguished by XDG.

[1] Yuan Sun, Michael Kirley, and Saman Kumara Halgamuge. “Extended differential grouping for large scale global optimization with direct and indirect variable interactions”. In: Genetic and Evolutionary Computation Conference. ACM. 2015, pp. 313–320.

[2] Xiao-Min Hu et al. “Cooperation coevolution with fast interdependency identification for large scale optimization”. In: Information Sciences 381 (2017), pp. 142–160.


Differential Grouping 2: Improving Accuracy [1]

DG2 estimates the computational round-off error and uses it as the threshold:

e_inf := γ₂ max{f(x) + f(y′), f(y) + f(x′)}   (7)

e_sup := γ_√n max{f(x), f(x′), f(y), f(y′)}   (8)

λ < e_inf → separable;
λ > e_sup → non-separable.

Otherwise:

ε = (η0 / (η0 + η1)) e_inf + (η1 / (η0 + η1)) e_sup,   (9)

λ < ε → separable;
λ ≥ ε → non-separable.

[1] Mohammad Nabi Omidvar et al. “DG2: A faster and more accurate differential grouping for large-scale black-boxoptimization”. In: IEEE Transactions on Evolutionary Computation 21.6 (2017), pp. 929–942.
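A minimal sketch of the bounds in Eqs. (7)–(8). The definition γ_k = ku/(1 − ku), with u the unit roundoff of IEEE doubles, is an assumption taken from the DG2 paper rather than stated on the slide; all function names are ours.

```python
import math

U = 2.0 ** -53  # unit roundoff of IEEE-754 binary64 (assumption; see the DG2 paper)

def gamma(k):
    # Standard bound on the relative error of roughly k accumulated
    # floating-point operations: gamma_k = k*u / (1 - k*u).
    return k * U / (1.0 - k * U)

def dg2_bounds(fx, fxp, fy, fyp, n):
    """Lower/upper estimates e_inf, e_sup of the round-off error in
    lambda = |Delta1 - Delta2| for an n-dimensional objective, as in
    Eqs. (7)-(8); fx, fxp, fy, fyp are f(x), f(x'), f(y), f(y')."""
    e_inf = gamma(2) * max(fx + fyp, fy + fxp)
    e_sup = gamma(math.sqrt(n)) * max(fx, fxp, fy, fyp)
    return e_inf, e_sup

e_inf, e_sup = dg2_bounds(1.0, 1.0, 1.0, 1.0, n=1000)
print(e_inf < e_sup)  # True: the two bounds bracket the ambiguous "grey zone"
```

λ below e_inf is explained by round-off alone (separable); λ above e_sup cannot be (non-separable); the weighted ε of Eq. (9) resolves the grey zone in between.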


DG2 Error Analysis


Differential Grouping 2: Improving Efficiency

Figure: Geometric representation of point generation in DG2 for a 3D function: from a base point (a, b, c), the singly-perturbed points (a′, b, c), (a, b′, c), (a, b, c′) and the doubly-perturbed points (a′, b′, c), (a′, b, c′), (a, b′, c′) are generated and shared across the pairwise tests:

x1 ↔ x2: ∆(1) = f(a′, b, c) − f(a, b, c),  ∆(2) = f(a′, b′, c) − f(a, b′, c)
x1 ↔ x3: ∆(1) = f(a′, b, c) − f(a, b, c),  ∆(2) = f(a′, b, c′) − f(a, b, c′)
x2 ↔ x3: ∆(1) = f(a, b′, c) − f(a, b, c),  ∆(2) = f(a, b′, c′) − f(a, b, c′)

λ = |∆(1) − ∆(2)|


Differential Grouping 2: Improving Efficiency

Minimum Evaluations

The minimum number of unique function evaluations needed to detect the interactions between all pairs of variables is

h(n) ≥ n(n + 1)/2 + 1.   (10)

Improving efficiency beyond this lower bound is impossible unless we:

◮ sacrifice accuracy (a partial variable-interaction matrix); and/or
◮ extend the DG theorem.
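The bound in (10) is met exactly by the point-sharing scheme of the previous slide: one base point, n singly-perturbed points, and n(n−1)/2 doubly-perturbed points. A one-line check (function name ours):

```python
def dg2_unique_evaluations(n):
    """Unique points needed for a full n-by-n interaction matrix:
    1 base point + n singly-perturbed + n*(n-1)//2 doubly-perturbed,
    which equals the lower bound n*(n+1)//2 + 1 of Eq. (10)."""
    return 1 + n + n * (n - 1) // 2

print(dg2_unique_evaluations(1000))  # 500501 = 1000*1001//2 + 1
```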


Extending the Theorem: Further Improving Efficiency

1 Fast Interaction Identification (FII) [1]
◮ identifies separable variables by checking the interaction between a single variable and all remaining variables.
◮ examines pairwise interactions only among the non-separable variables.

2 Recursive Differential Grouping (RDG) [2] (O(n log n))
◮ examines the interaction between two subsets of variables.
◮ uses a recursive procedure to group variables.

[1] Xiao-Min Hu et al. “Cooperation coevolution with fast interdependency identification for large scale optimization”. In: Information Sciences 381 (2017), pp. 142–160.

[2] Yuan Sun, Michael Kirley, and Saman K. Halgamuge. “A recursive decomposition method for large scale continuous optimization”. In: IEEE Transactions on Evolutionary Computation 22.5 (2017), pp. 647–661.


Variants of RDG

1 RDG2 [1]: Combining the efficiency of RDG and accuracy of DG2.

2 RDG3 [2]: extending RDG2 for decomposing overlapping problems.

[Figure: two variable-interaction graphs over x1, ..., x7.]

[1] Yuan Sun et al. “Adaptive threshold parameter estimation with recursive differential grouping for problem decomposition”. In: Proceedings of the Genetic and Evolutionary Computation Conference. ACM. 2018, pp. 889–896.

[2] Yuan Sun et al. “Decomposition for Large-scale Optimization Problems with Overlapping Components”. In: Proceedings of the IEEE Congress on Evolutionary Computation. IEEE. 2019.


Benchmark Suites

CEC’2005 Benchmark Suite (non-modular)

CEC’2008 LSGO Benchmark Suite (non-modular)

CEC’2010 LSGO Benchmark Suite

CEC’2013 LSGO Benchmark Suite


Challenges of CC

Main Questions

1 How to decompose the problem?

2 How to allocate resources?

3 How to coordinate?


The Imbalance Problem

Non-uniform contribution of components.

Imbalanced Functions

f(x) = Σ_{i=1}^{m} w_i f_i(x_i),   (11)

w_i = 10^(s·N(0,1)),
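The weight scheme of Eq. (11) is easy to reproduce; a minimal sketch (function name and seed are ours):

```python
import random

def imbalanced_weights(m, s=1.0, rng=random.Random(0)):
    """Sample component weights w_i = 10**(s * N(0,1)) as in Eq. (11):
    with s > 0, a few components dominate the objective by orders of
    magnitude, creating the imbalance problem."""
    return [10.0 ** (s * rng.gauss(0.0, 1.0)) for _ in range(m)]

w = imbalanced_weights(8, s=2.0)
print(max(w) / min(w))  # ratio of the largest to the smallest contribution scale
```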


The Imbalance Problem (2)

[Figure: objective value (log scale) vs. iterations for DECC-I-Nonsep and DECC-I-Sep.]


Contribution-Based Cooperative Co-evolution (CBCC)

Types of CC

CC: round-robin optimization of components.

CBCC: favors components with a higher contribution.
◮ Quantifies the contribution of each component.
◮ Optimizes the one with the highest contribution.

How to Quantify the Contribution

For quantifying contributions, a relatively accurate decomposition is needed.

Changes in the objective value while the other components are kept constant.
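The select-and-update cycle can be sketched minimally; the greedy rule and all names here are our simplification, not the exact CBCC implementation:

```python
def cbcc_select(contributions):
    """Pick the component with the largest recorded contribution (greedy rule)."""
    return max(range(len(contributions)), key=contributions.__getitem__)

def cbcc_update(contributions, idx, old_best, new_best):
    """Attribute the drop in the best objective value (minimization) to
    component idx, since all other components were kept constant."""
    contributions[idx] = old_best - new_best

contrib = [5.0, 0.1, 2.0]    # last observed improvement per component
i = cbcc_select(contrib)     # component 0 looks most promising
cbcc_update(contrib, i, old_best=100.0, new_best=99.9)  # but it stagnates
print(cbcc_select(contrib))  # 2: attention shifts to the next-best component
```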


[Figure: f(x) (log scale) vs. evaluations for components S1–S8, under (c) Round-Robin CC and (d) Contribution-Based CC.]


Contribution-Aware Algorithms

Contribution-Based Cooperative Co-evolution (CBCC) [1], [2].

Bandit-based Cooperative Coevolution (BBCC) [3].

Incremental Cooperative Coevolution [4]

Multilevel Framework for LSGO [5]

[1] Mohammad Nabi Omidvar, Xiaodong Li, and Xin Yao. “Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms”. In: Proc. of Genetic and Evolutionary Computation Conference. ACM, 2011, pp. 1115–1122.

[2] Mohammad Nabi Omidvar et al. “CBCC3 – A Contribution-Based Cooperative Co-evolutionary Algorithm with Improved Exploration/Exploitation Balance”. In: Proc. IEEE Congr. Evolutionary Computation. 2016, pp. 3541–3548.

[3] kazimipour2018bandit.

[4] Sedigheh Mahdavi, Shahryar Rahnamayan, and Mohammad Ebrahim Shiri. “Incremental cooperative coevolution for large-scale global optimization”. In: Soft Computing (2016), pp. 1–20.

[5] Sedigheh Mahdavi, Shahryar Rahnamayan, and Mohammad Ebrahim Shiri. “Multilevel framework for large-scale global optimization”. In: Soft Computing (2016), pp. 1–30.


Some Auxiliary Topics

Variable Interaction and Constraint Handling [1], [2], [3]

Large-Scale Multiobjective Optimization

Available Benchmark Suites

[1] Eman Sayed et al. “Decomposition-based evolutionary algorithm for large scale constrained problems”. In: Information Sciences 316 (2015), pp. 457–486.

[2] Adan E. Aguilar-Justo and Efren Mezura-Montes. “Towards an improvement of variable interaction identification for large-scale constrained problems”. In: IEEE Congress on Evolutionary Computation. IEEE. 2016, pp. 4167–4174.

[3] Julien Blanchard, Charlotte Beauthier, and Timoteo Carletti. “A cooperative co-evolutionary algorithm for solving large-scale constrained problems with interaction detection”. In: Proceedings of the Genetic and Evolutionary Computation Conference. ACM. 2017, pp. 697–704.


Variable Interaction and Constraint Handling [1]

min f(x) = x1² x2 + 4 x5

s.t. g1(x) = x3/x4² + √x5 − x6 ≤ 0

g2(x) = x1 − x2 e^(−x6) ≤ 0

Θ0 = [0 1 0 0 0 0; 1 0 0 0 0 0; 0 0 0 0 0 0; 0 0 0 0 0 0; 0 0 0 0 0 0; 0 0 0 0 0 0] (from f: x1–x2),

Θ1 = [0 0 0 0 0 0; 0 0 0 0 0 0; 0 0 0 1 0 0; 0 0 1 0 0 0; 0 0 0 0 0 0; 0 0 0 0 0 0] (from g1: x3–x4),

Θ2 = [0 0 0 0 0 0; 0 0 0 0 0 1; 0 0 0 0 0 0; 0 0 0 0 0 0; 0 0 0 0 0 0; 0 1 0 0 0 0] (from g2: x2–x6),

Θglob = [0 1 0 0 0 0; 1 0 0 0 0 1; 0 0 0 1 0 0; 0 0 1 0 0 0; 0 0 0 0 0 0; 0 1 0 0 0 0] (element-wise union).

Figure: variable interaction graph of Θglob over x1, …, x6, with edges x1–x2, x2–x6, and x3–x4; x5 is isolated.

[1] Julien Blanchard, Charlotte Beauthier, and Timoteo Carletti. “A cooperative co-evolutionary algorithm for solving large-scale constrained problems with interaction detection”. In: Proceedings of the Genetic and Evolutionary Computation Conference. ACM. 2017, pp. 697–704.
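The per-function interaction matrices and their element-wise union can be reproduced with a finite-difference non-additivity check in the spirit of differential grouping. A minimal sketch, using the example above: all function names, the perturbation point x = (1, …, 1), and the step/tolerance values are illustrative choices, not taken from the cited paper.

```python
import math

def delta(f, x, i, eps):
    # Finite-difference effect of perturbing x_i alone.
    y = x[:]
    y[i] += eps
    return f(y) - f(x)

def interact(f, i, j, n, eps=0.1, tol=1e-6, base=1.0):
    # x_i and x_j interact iff the effect of perturbing x_i
    # changes when x_j changes (non-additivity check).
    x = [base] * n
    d1 = delta(f, x, i, eps)
    x[j] += eps
    d2 = delta(f, x, i, eps)
    return abs(d1 - d2) > tol

def theta(f, n):
    # Interaction matrix of a single function.
    return [[1 if a != b and interact(f, a, b, n) else 0
             for b in range(n)] for a in range(n)]

def union(mats):
    # Theta_glob: element-wise OR over objective and constraint matrices.
    n = len(mats[0])
    return [[int(any(m[a][b] for m in mats)) for b in range(n)]
            for a in range(n)]

# The slide's example, 0-indexed (x1 -> x[0], ..., x6 -> x[5]):
f  = lambda x: x[0] ** 2 * x[1] + 4 * x[4]
g1 = lambda x: x[2] / x[3] ** 2 + math.sqrt(x[4]) - x[5]
g2 = lambda x: x[0] - x[1] * math.exp(-x[5])

theta_glob = union([theta(f, 6), theta(g1, 6), theta(g2, 6)])
# Recovers edges x1–x2 (from f), x3–x4 (from g1), x2–x6 (from g2); x5 isolated.
```

A single perturbation point can miss interactions that only manifest elsewhere in the domain; practical differential-grouping methods choose the points and thresholds more carefully.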


Large-Scale Multiobjective Optimization

Large-scale multiobjective optimization is growing in popularity:

Benchmark development and analysis:

◮ Development of a benchmark [1].
◮ Analysis of the existing benchmarks [2].

Algorithm development:

◮ Exploiting modularity using CC [3], [4], [5], [6].
◮ Problem transformation [7].

[1] Ran Cheng et al. “Test problems for large-scale multiobjective and many-objective optimization”. In: IEEE Transactions on Cybernetics (2016).

[2] Ke Li et al. “Variable Interaction in Multi-objective Optimization Problems”. In: Parallel Problem Solving from Nature. Springer International Publishing. 2016, pp. 399–409.

[3] Luis Miguel Antonio and Carlos A. Coello Coello. “Use of cooperative coevolution for solving large scale multiobjective optimization problems”. In: IEEE Congress on Evolutionary Computation. IEEE. 2013, pp. 2758–2765.

[4] Luis Miguel Antonio and Carlos A. Coello Coello. “Decomposition-Based Approach for Solving Large Scale Multi-objective Problems”. In: Parallel Problem Solving from Nature. Springer. 2016, pp. 525–534.

[5] Xiaoliang Ma et al. “A multiobjective evolutionary algorithm based on decision variable analyses for multiobjective optimization problems with large-scale variables”. In: IEEE Transactions on Evolutionary Computation 20.2 (2016), pp. 275–298.

[6] Xingyi Zhang et al. “A Decision Variable Clustering-Based Evolutionary Algorithm for Large-scale Many-objective Optimization”. In: IEEE Transactions on Evolutionary Computation (2016).

[7] Heiner Zille et al. “A Framework for Large-Scale Multiobjective Optimization Based on Problem Transformation”. In: IEEE Transactions on Evolutionary Computation 22.2 (2018), pp. 260–275.


Analysis of ZDT

Θ = [0 1 1 1 1 1; 1 0 1 1 1 1; 1 1 0 1 1 1; 1 1 1 0 1 1; 1 1 1 1 0 1; 1 1 1 1 1 0]

All off-diagonal entries are 1: every pair of variables x1, …, x6 interacts, giving a fully connected interaction graph.

Figure: Variable interaction structure of the f2 function of the ZDT test suite [1].

[1] Ke Li et al. “Variable Interaction in Multi-objective Optimization Problems”. In: Parallel Problem Solving from Nature. Springer International Publishing. 2016, pp. 399–409.


Analysis of DTLZ1–DTLZ4

Figure: Variable interaction graphs of DTLZ1 to DTLZ4 (nodes x1, …, x6).

Proposition 1

For DTLZ1 to DTLZ4, ∀ fi, i ∈ {1, …, m}, we divide the corresponding decision variables into two non-overlapping sets: xI = (x1, …, xℓ)^T, where ℓ = m − 1 for i ∈ {1, 2} while ℓ = m − i + 1 for i ∈ {3, …, m}; and xII = (xm, …, xn)^T. All members of xI not only interact with each other, but also interact with those of xII; all members of xII are independent of each other.


Analysis of DTLZ5–DTLZ7

Figure: Variable interaction graphs of DTLZ5 and DTLZ6 (nodes x1, …, x6).

Proposition 2

For DTLZ5 and DTLZ6, ∀ fi, i ∈ {1, …, m}, we divide the corresponding decision variables into two non-overlapping sets: xI = (x1, …, xℓ)^T, where ℓ = m − 1 for i ∈ {1, 2} while ℓ = m − i + 1 for i ∈ {3, …, m}; and xII = (xm, …, xn)^T. For fi, where i ∈ {1, …, m − 1}, all members of xI and xII interact with each other; for fm, the interaction structure is the same as in DTLZ1–DTLZ4.

Proposition 3

All objective functions of DTLZ7 are fully separable.


Decomposition Based Large-Scale EMO

Figure: Image taken from [1]

[1] Xiaoliang Ma et al. “A multiobjective evolutionary algorithm based on decision variable analyses for multiobjective optimization problems with large-scale variables”. In: IEEE Transactions on Evolutionary Computation 20.2 (2016), pp. 275–298.


Weighted Optimization Framework (WOF) [1], [2]

Figure: Weighted Optimization Framework

[1] Heiner Zille et al. “Weighted Optimization Framework for Large-scale Multi-objective Optimization”. In: Genetic and Evolutionary Computation Conference. ACM. 2016, pp. 83–84.

[2] Heiner Zille et al. “A Framework for Large-Scale Multiobjective Optimization Based on Problem Transformation”. In: IEEE Transactions on Evolutionary Computation 22.2 (2018), pp. 260–275.
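The core idea of WOF, searching a low-dimensional space of per-group weights applied to a fixed candidate solution, can be sketched as follows. This is a hedged illustration only: `sphere`, the random search over weights, and the parameter values are stand-ins, and WOF proper alternates weight optimization with phases of full-dimensional multiobjective optimization.

```python
import random

def sphere(x):
    # Toy single-objective stand-in for one objective of an MOP.
    return sum(v * v for v in x)

def transform(x, w, groups):
    # Each variable is scaled by its group's weight, so optimizing w
    # searches a len(groups)-dimensional subspace of the n-dim problem.
    y = x[:]
    for w_k, group in zip(w, groups):
        for i in group:
            y[i] = w_k * x[i]
    return y

def optimize_weights(f, x, groups, evals=200, seed=0):
    # Random search stands in for the low-dimensional EA used in WOF.
    rng = random.Random(seed)
    best_w, best_val = [1.0] * len(groups), f(x)  # w = 1 is the identity
    for _ in range(evals):
        w = [rng.uniform(0.0, 2.0) for _ in groups]
        val = f(transform(x, w, groups))
        if val < best_val:
            best_w, best_val = w, val
    return transform(x, best_w, groups), best_val

x0 = [random.Random(1).uniform(-5, 5) for _ in range(4)]
x1, v1 = optimize_weights(sphere, x0, groups=[(0, 1), (2, 3)])
# v1 <= sphere(x0): the identity weights are always available as a fallback.
```

The payoff is that the inner search dimension equals the number of groups, not the number of variables, at the cost of only reaching solutions expressible as group-wise rescalings of x0.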


Some Future Directions (I)

What if the components have overlap?

Differential grouping is time-consuming. Is there a more efficient method?

Do we need 100% accurate grouping? What is the relationship between grouping accuracy and the optimality achieved by a CC algorithm?


Some Future Directions (II)

CC for combinatorial optimization, e.g.,
◮ Y. Mei, X. Li and X. Yao, “Cooperative Co-evolution with Route Distance Grouping for Large-Scale Capacitated Arc Routing Problems,” IEEE Transactions on Evolutionary Computation, 18(3):435–449, June 2014.

However, every combinatorial optimization problem has its own characteristics. We need to investigate CC for other combinatorial optimization problems.


Some Future Directions (III)

Learning variable interdependencies is a strength of estimation of distribution algorithms (EDAs), e.g.,

◮ W. Dong, T. Chen, P. Tino and X. Yao, “Scaling Up Estimation of Distribution Algorithms for Continuous Optimization,” IEEE Transactions on Evolutionary Computation, 17(6):797–822, December 2013.

◮ A. Kaban, J. Bootkrajang and R.J. Durrant, “Towards Large Scale Continuous EDA: A Random Matrix Theory Perspective,” Evolutionary Computation

Interestingly, little work exists on scaling up EDAs.
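As a baseline for what "scaling up" must improve on, a univariate continuous EDA fits in a few lines. Everything here (names, parameters, the sphere objective) is illustrative; the point of the comment is that the model below deliberately ignores interdependencies, while a full-covariance Gaussian would capture them at a cost of O(n²) parameters.

```python
import random
import statistics

def umda_c(f, dim, pop=40, elite=10, gens=60, seed=0):
    # Univariate continuous EDA: model each variable with an independent
    # Gaussian. Capturing variable interdependencies requires a full
    # covariance model, whose O(dim^2) parameters are the scaling
    # bottleneck discussed above.
    rng = random.Random(seed)
    mu, sigma = [0.0] * dim, [2.0] * dim
    sample = []
    for _ in range(gens):
        sample = [[rng.gauss(mu[i], sigma[i]) for i in range(dim)]
                  for _ in range(pop)]
        sample.sort(key=f)          # truncation selection
        sel = sample[:elite]
        for i in range(dim):
            col = [s[i] for s in sel]
            mu[i] = statistics.fmean(col)
            sigma[i] = max(statistics.pstdev(col), 1e-3)  # avoid collapse
    return min(sample, key=f)

best = umda_c(lambda x: sum(v * v for v in x), dim=10)
```

On a separable function like the sphere this baseline converges quickly; on non-separable problems its independence assumption is exactly what the cited work on scaling up EDAs tries to relax.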


LSGO Resources

There is an IEEE Computational Intelligence Society (CIS) Task Force on LSGO:

LSGO Repository: http://www.cercia.ac.uk/projects/lsgo


Acknowledgement

Thanks to

Dr. Yuan Sun from RMIT University, for assistance in revising the slides;

Dr. Ata Kaban and Dr. Momodou L. Sanyang for allowing us to use some figures from their publications.


Questions

Thanks for your attention!
