Nonsmooth Optimization by combining MADS and VNS
Charles Audet, Vincent Béchard, Sébastien Le Digabel
École Polytechnique de Montréal
May 2006
Audet, Bechard and Le Digabel Nonsmooth Optimization by combining MADS and VNS 1/25
Presentation Outline
Introduction
MADS Algorithm
VNS Metaheuristic
Coupling of MADS and VNS
Preliminary Results
Conclusion
Introduction
- MADS is an algorithm for nonsmooth optimization
- VNS is a metaheuristic, most often used for combinatorial optimization
- This work presents a way to incorporate VNS into MADS
- This is natural because:
  - MADS has a flexible step allowing the introduction of heuristics
  - MADS defines a discrete structure of the variable space, easy to use as VNS neighborhoods
  - The two algorithms have complementary behaviour: the MADS search is more diversified when new solutions are found, whereas the VNS search is more diversified when no improvements are made
Problem presentation
    min_{x ∈ Ω ⊆ R^n} f(x)

where
- f : R^n → R ∪ {∞}
- the objective function f and the functions defining Ω are
  - nonsmooth, costly, and can possibly fail to evaluate; derivative approximation is problematic
  - viewed as unexploitable black-box functions
MADS Overview
- MADS: Mesh Adaptive Direct Search [Audet, Dennis]
- NOMAD is the C++ implementation of MADS (freely available at www.gerad.ca/nomad) [Couture]
- MADS generalizes the Generalized Pattern Search (GPS, [Torczon]) algorithm
- Main convergence result: MADS leads to a Clarke-KKT stationary point x ∈ Ω if f is Lipschitz near x
MADS Overview
- The black-box functions are evaluated at some trial points, which are either accepted as new iterates or rejected
- Constraints are handled by a filter method determining which new iterates to accept
- All trial points are constructed to lie on a mesh

      M(k, Δk) = { xk + Δk D z : z ∈ N^nD } ⊂ R^n

  where Δk ∈ R+ is the mesh size parameter and D a fixed set of nD directions in R^n
- After each iteration, the mesh size parameter Δk is reduced when no new iterate has been found (a failed iteration)
- Each MADS iteration has two steps, the Search and the Poll
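As a concrete sketch (not from the slides), the mesh definition above fits in a few lines of Python; `D` and the integer vector `z` are as in the formula, and `D = [I −I]` is taken here as an example choice of mesh directions:

```python
import numpy as np

def mesh_point(xk, delta_k, D, z):
    """Return the mesh point x_k + delta_k * D z of M(k, delta_k).

    D is the fixed n x nD matrix of mesh directions and z a vector
    of nonnegative integers, as in the definition on this slide.
    """
    z = np.asarray(z)
    return xk + delta_k * (D @ z)

# Example with D = [I -I] in R^2 (nD = 4 directions)
D = np.hstack([np.eye(2), -np.eye(2)])
xk = np.zeros(2)
p = mesh_point(xk, 0.25, D, [1, 0, 0, 2])   # x_k + 0.25 * (e1 - 2 e2)
# p is [0.25, -0.5], a point of M(k, 0.25)
```

Varying z over all nonnegative integer vectors generates the whole (infinite) mesh; algorithms only ever evaluate finitely many of these points.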
MADS Poll
- Local exploration near the best current iterate xk
- A set of directions Dk is randomly chosen. In GPS these directions had to be taken from the global set of directions D, but MADS allows a larger choice through a second mesh size parameter Δ^p_k
- The set of poll trial points (the poll frame) is then constructed:

      Pk = { xk + Δk d : d ∈ Dk } ⊆ M(k, Δk)

- The poll is rigidly defined (mesh update, directions used) to ensure the convergence results
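The frame construction above can be sketched as follows. Note the direction set used here is only a GPS-like placeholder: real MADS draws Dk randomly so that the union of normalized directions over all iterations is dense in the unit sphere.

```python
import numpy as np

def poll_frame(xk, delta_k, Dk):
    """Poll trial points P_k = { x_k + delta_k * d : d in D_k }.

    Dk holds one poll direction per row.  The 2n coordinate
    directions used below are a GPS-like stand-in for the richer
    random directions of MADS.
    """
    return [xk + delta_k * d for d in Dk]

xk = np.array([1.0, 2.0])
Dk = np.vstack([np.eye(2), -np.eye(2)])   # +-e1, +-e2
Pk = poll_frame(xk, 0.5, Dk)              # 4 trial points around x_k
# first trial point is [1.5, 2.0]
```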
Poll illustration (successive fails and mesh shrink)

[Figure: poll frames around xk on a progressively shrinking mesh]
- Δk = 1, Δ^p_k = 1: Pk = {p1, p2, p3}
- Δk+1 = 1/4, Δ^p_{k+1} = 1/2: Pk+1 = {p4, p5, p6}
- Δk+2 = 1/16, Δ^p_{k+2} = 1/4: Pk+2 = {p7, p8, p9}
MADS Search
- The search is a flexible global search strategy
- A valid search must only generate a finite number of points lying on the mesh
- The user can provide a problem-specific search
- There are also generic searches (Random Search, Latin Hypercube Sampling)
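A generic search must respect the finite-number-of-mesh-points requirement. As an assumed sketch (not NOMAD's actual implementation), a Latin hypercube search can simply snap its samples to the coordinate mesh:

```python
import numpy as np

def lh_search_on_mesh(xk, delta_k, lb, ub, n_pts, rng):
    """Latin hypercube sample in [lb, ub], snapped to the coordinate
    mesh { x_k + delta_k * z : z integer }, so the search generates
    finitely many mesh points.  A sketch only: real NOMAD handles
    bounds and general direction sets D differently."""
    n = len(lb)
    # one stratified value per interval and coordinate, then permute
    u = (rng.random((n_pts, n)) + np.arange(n_pts)[:, None]) / n_pts
    for j in range(n):
        u[:, j] = rng.permutation(u[:, j])
    pts = lb + u * (ub - lb)
    z = np.round((pts - xk) / delta_k)     # snap to the mesh
    return xk + delta_k * z

rng = np.random.default_rng(0)
S = lh_search_on_mesh(np.zeros(2), 0.125, np.array([-1.0, -1.0]),
                      np.array([1.0, 1.0]), 5, rng)
# 5 mesh points, roughly stratified over the box
```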
Detailed MADS algorithm:

[0] Initializations
    x0 ∈ X, Δ0 ∈ R+
    k ← 0
[1] Poll and search step
    Search step
        evaluate the functions on a finite number of points of M(k, Δk)
    Poll step
        compute p MADS directions Dk ∈ R^{n×p}
        construct the frame Pk ⊆ M(k, Δk) with xk, Dk and Δk
        evaluate the functions on the p points of Pk
[2] Updates
    determine the type of success of iteration k
    solution update (xk+1)
    mesh update (Δk+1)
    k ← k + 1
    check the stopping conditions
    goto [1]
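A heavily simplified, unconstrained transcription of this loop is given below. It keeps only the poll step with fixed coordinate directions (so it is closer to GPS than to full MADS), uses an opportunistic strategy, and shrinks the mesh by 4 on failure as in the poll illustration:

```python
import numpy as np

def mads_poll_only(f, x0, delta0=1.0, max_eval=200):
    """Simplified sketch of the MADS loop above: poll step only,
    fixed coordinate directions (GPS-like), opportunistic strategy,
    mesh shrunk by 4 on failure.  Real MADS draws richer random
    directions and may also expand the mesh on success."""
    n = len(x0)
    xk = np.asarray(x0, dtype=float)
    fk = f(xk)
    dk = delta0
    n_eval = 1
    while n_eval < max_eval and dk > 1e-9:
        success = False
        for d in np.vstack([np.eye(n), -np.eye(n)]):   # [1] poll step
            ft = f(xk + dk * d)
            n_eval += 1
            if ft < fk:                                # new incumbent
                xk, fk, success = xk + dk * d, ft, True
                break                                  # opportunistic
        if not success:                                # [2] mesh update
            dk /= 4.0
    return xk, fk

x_best, f_best = mads_poll_only(lambda v: (v[0] - 1)**2 + v[1]**2,
                                [0.0, 0.0])
# converges to the smooth minimizer (1, 0)
```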
VNS Overview
- VNS: Variable Neighborhood Search [Hansen, Mladenović]
- More often used in combinatorial optimization, but can be applied in the continuous case
- It is based on a local search (descent) and on a perturbation method (shaking) allowing escape from local optima
- The perturbation method is parametrized by ξk and changes the current solution more strongly as ξk grows
- The search becomes more and more global when no improvements are made
VNS illustration

[Figure sequence, reconstructed from the slide build-up:]
1. x' = shaking(xk, 1), x'' = descent(x'): no improvement, so xk+1 = xk and the amplitude grows
2. x' = shaking(xk, 2), x'' = descent(x'): no improvement, so xk+2 = xk and the amplitude grows
3. x' = shaking(xk, 3), x'' = descent(x'): the larger perturbation lets the descent reach a better optimum, which becomes xk+3
Detailed VNS algorithm (for minimizing f without constraints):

[0] Initializations
    ξmax, ξ0, δ ∈ N+, x0 ∈ X
    k ← 0
[1] while (ξk ≤ ξmax)
    x' ← shaking(xk, ξk)
    x'' ← descent(x')
    if f(x'') < f(xk)
        xk+1 ← x''
        ξk+1 ← ξ0
    else
        xk+1 ← xk
        ξk+1 ← ξk + δ
    k ← k + 1
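This loop transcribes directly into Python. The toy problem, shaking and descent below are assumptions for illustration only (a deterministic shake and a greedy ±1 integer descent), not anything from the slides:

```python
def vns(f, x0, shake, descent, xi0=1, xi_max=5, delta=1):
    """Direct transcription of the VNS loop above: shake with
    amplitude xi_k, run a descent, accept on improvement (resetting
    xi to xi_0), otherwise grow xi until it exceeds xi_max."""
    xk, xi = x0, xi0
    while xi <= xi_max:
        x2 = descent(shake(xk, xi))
        if f(x2) < f(xk):
            xk, xi = x2, xi0      # success: recentre, reset amplitude
        else:
            xi += delta           # failure: search more globally
    return xk

# Toy 1-D integer landscape (assumed): a local minimum at x = 0
# with f = 2, and the global minimum at x = 10 with f = 0.
f = lambda x: abs(x) + 2 if x <= 5 else abs(x - 10)

def descent(x):                   # greedy +-1 local descent
    while min(f(x - 1), f(x + 1)) < f(x):
        x = x - 1 if f(x - 1) < f(x + 1) else x + 1
    return x

shake = lambda x, xi: x + xi      # deterministic stand-in for shaking
best = vns(f, 0, shake, descent)  # escapes x = 0, reaches x = 10
```

Small shakes (ξ ≤ 4) fall back into the basin of 0; only the ξ = 5 perturbation crosses into the basin of 10, which is exactly the behaviour the slide's loop is designed to produce.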
Coupling of MADS and VNS
- The main contribution of this work is the incorporation of VNS into the search step of MADS
- This new VNS search only has to generate a finite number of mesh points in order to keep the convergence properties of MADS
- The two VNS components (descent and shaking) are defined using the mesh of MADS
VNS shaking
- The mesh defines a natural structure for the perturbation method, which can be seen as a function

      shaking : M(k, Δk) × N → M(k, ΔV) ⊆ M(k, Δk),   x' ← shaking(x, ξk)

- ξk ∈ N is the perturbation amplitude
- The fixed-size mesh M(k, ΔV) ⊆ M(k, Δk) lets the perturbation depend only on the amplitude ξk, independently of the current mesh size parameter Δk
- ΔV is called the VNS trigger (the VNS search occurs at iteration k only when Δk ≤ ΔV and ΔV = ℓΔk for some ℓ ∈ N)
- If D = [I −I]^T, the perturbed point x' can be chosen so that ‖xk − x'‖∞ = ξkΔV
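For the D = [I −I] case, one way to realize ‖xk − x'‖∞ = ξkΔV is to draw a random integer step on the ℓ∞ sphere of radius ξk and scale it by ΔV. This rejection-sampling sketch is an assumption, not NOMAD's code:

```python
import numpy as np

def shake(xk, xi, delta_v, rng):
    """Perturbation on the coarse mesh M(k, Delta^V) with D = [I -I]:
    draw a random integer vector z with ||z||_inf = xi, so the
    perturbed point satisfies ||x' - x_k||_inf = xi * Delta^V.
    (Rejection sampling; a sketch only.)"""
    n = len(xk)
    while True:
        z = rng.integers(-xi, xi + 1, size=n)   # integer mesh steps
        if np.abs(z).max() == xi:               # keep the l_inf sphere
            return xk + delta_v * z

rng = np.random.default_rng(1)
xp = shake(np.zeros(2), 3, 0.5, rng)   # a point at l_inf distance 1.5
```

In 2 dimensions with ξk = 2 this sampler picks among exactly the 8 points shown on the first example mesh of the next slide.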
Examples of meshes M(k, Δk) (gray) and M(k, ΔV) (black), with possible choices for the perturbation (points x^i on the bold frame at distance ξkΔV from xk):

[Figure: two example meshes]
- ΔV = Δk, ξkΔV = 2Δk: shaking(xk, 2) ∈ {x1, ..., x8}
- ΔV = 4Δk, ξkΔV = 12Δk: shaking(xk, 3) ∈ {x1, ..., x24}
VNS descent
- The descent can be seen as a function descent : M(k, ΔV) → M(k, Δk), with x'' ← descent(x')
- It uses a specific poll step, with its own mesh size parameter and its own filter
- It cannot reduce the current mesh size
- Strategies to reduce the evaluation cost of the descent:
  - use surrogate functions if available
  - compare the descent trial points with the points in the cache (DS strategy)
Detailed MADS/VNS algorithm:

[0] Initializations
    x0 ∈ X, Δ0 ∈ R+, ξ0, ξmax, δ, ΔV
    k ← 0
[1] Poll and search step
    Search step (optional)
        x' ← shaking(xk, ξk)   (perturbation of amplitude ξk)
        x'' ← descent(x')   (descent on M(k, ΔV) ⊆ M(k, Δk))
        Sk ← finite number of points of M(k, Δk) (possibly empty)
        evaluate the functions on Sk ∪ {x''}
    Poll step
        compute p MADS directions Dk ∈ R^{n×p}
        construct the frame Pk ⊆ M(k, Δk) with xk, Dk and Δk
        evaluate the functions on the p points of Pk
[2] Updates
    update of VNS amplitude (ξk+1 ← ξ0 or ξk+1 ← ξk + δ)
    updates of solution and mesh
    k ← k + 1
    check the stopping conditions, goto [1]
An analytic problem with many local optima
    min f(a, b) = (1000 b sin²(b) sin(300a)) / a
    s.t. 75 ≤ a ≤ 500
          0 ≤ b ≤ 10

[Figure: landscape of f with the starting point x0]
Results for the analytic problem
       parameters              average               objective (f)          neval
test   LH       VNS   DS      obj. (f)   neval     best       worst      best   worst
1      no       no    no      -22.099    281       -104.419   -3.441     216    528
2      100, 10  no    no      -84.896    1286      -105.119   -53.302    884    2275
3      100, 10  0.1   no      -104.801   5718      -105.119   -102.794   2485   10000
4      100, 10  0.1   10^-4   -103.304   3818      -105.119   -87.627    1764   8065
Results for the analytic problem
[Figure: objective value f versus number of evaluations (neval), panels
1: MADS poll only, 2: + LH, 3: + LH + VNS, 4: + LH + VNS + DS,
plus a summary panel of the average values]
A MDO problem
- MDO: MultiDisciplinary Optimization
- Simplified aircraft model with 10 variables, 10 constraints and 3 disciplines, one for each main model component: structure, aerodynamics and propulsion
- Convergence of the model with a fixed-point method through the 3 disciplines
- Surrogate obtained by relaxing the fixed-point method stopping criteria (warning: only the number of "true" function evaluations is counted in these tests)
Results for the MDO problem
       parameters         average                objective (f)           neval
test   LH       VNS      obj. (f)    neval     best        worst       best   worst
1      5000, 0  no       -1623.416   5000      -2273.648   -1315.849   5000   5000
2      no       no       -3101.393   2567      -3964.199   -1588.350   1178   5165
3      100, 10  no       -3443.092   5690      -3964.200   -1355.656   1204   10000
4      100, 10  0.1      -3961.385   3060      -3964.198   -3881.935   1374   5806
Results for the MDO problem
[Figure: objective value f versus number of evaluations (neval), panels
1: LH alone, 2: MADS poll only, 3: + LH, 4: + LH + VNS + SRGTE,
plus a summary panel of the average values]
Conclusion
- MADS and VNS are two complementary algorithms (MADS mesh and VNS neighborhoods; diversification when successes or failures occur), so it was natural to combine the two
- Preliminary results show an improvement in terms of solution quality (the random component of MADS is less critical)
- The use of surrogates and of the DS strategy reduces the number of function evaluations