Page 1

Design of Experiments and Sequential Parameter Optimization

Thomas Bartz-Beielstein

Cologne University of Applied Sciences

spotseven.org

3rd Brazilian-German Frontiers of Science and Technology Symposium

November 8-12, 2012, Brasilia, Brazil

Page 2

Agenda

Preliminary Remarks

The Scientific Approach

Some Stats

Examples

Outlook

Page 3

The SPOTSeven Team


• www.spotseven.org

Page 4

Preliminaries

What is this Thing called Science?

• Astrology versus Astronomy

Page 5

Preliminaries

Algorithm Invention and Introduction

• How do we generate scientific results?

Page 6

The Scientific Approach

How to Publish a Paper in Evolutionary Computation

• An extraordinary number of algorithms exist in Evolutionary Computation

• Establishing new algorithms is highly competitive

• A paper that introduces the new operators is published

• Benchmarks

• Appropriate parameter settings must be determined

Page 7

The Scientific Approach

Scientific Ingredient 1: Benchmarking—General Rules

• Validity

• Reproducibility

• Comparability

• Common rules:
  • Use statistics
  • Documentation
  • Comparisons

• Ongoing discussion

Frank Hoffmann, Ralf Mikut, Andreas Kroll, Markus Reischl, Oliver Nelles, Horst Schulte, Torsten Bertram:
"Computational Intelligence: State-of-the-Art Methoden und Benchmarkprobleme"
(Technische Universität Dortmund; Karlsruher Institut für Technologie; Universität Kassel; Universität Siegen; HTW Berlin)

Abstract (translated from the German): This contribution gives an overview of the state of the art in Computational Intelligence methods for classification, text mining, nonlinear regression, nonlinear system identification, and control. The focus is on a systematic procedure, meeting scientific standards, for the comparative evaluation and analysis of alternative approaches. The individual sections give practical pointers to existing, preferably freely available, implementations, benchmark data sets, and benchmark problems as an aid for method comparisons in future publications within the CI workshop.

1 Introduction (translated from the German): The methodology and procedure for evaluating, comparing, and systematically analyzing novel methods of pattern recognition and function approximation have led to criticism and discussion at past Computational Intelligence workshops. Some contributions lacked …

Page 8

The Scientific Approach

Scientific Ingredient 2: Features of Test Function Generators

• Difficult to solve using simple methods such as hill climbers

• Nonlinear, non-separable, non-symmetric

• Scalable with respect to problem dimensionality

• Scalable with respect to evaluation time

• Tunable by a small number of user parameters

See, e.g., [2]

Page 9

The Scientific Approach

The Current Situation

• Combining the two scientific ingredients

• Authors report parameter values that seem to work reasonably well

• Test-problem suite setup: usually a small number of specified test problems. Each algorithm is run some number of times, say ten, on each problem. Statistics are reported, e.g., mean and standard deviation

• What is the problem with this approach?

Page 10

The Scientific Approach

One Problem (only?)

• The paper does not tell the whole story, because:
  • Preliminary trials, which were performed to determine these parameter settings, are not reported
  • There is a 5% chance that your new algorithm performs better than the best algorithm
  • Run your new (in fact worse) algorithm 100 times
  • Report only the positive results

• Even if not intended, this might happen unconsciously

Page 11

The Scientific Approach

Ben Michael Goldacre’s TED talk

http://embed.ted.com/talks/ben_goldacre_what_doctors_don_t_know_about_the_drugs_they_prescribe.html

Page 12

The Scientific Approach

Benchmarking—A First Solution?

• One expert compares his new algorithm with established approaches. Subjective (unfair?) comparison

• Many experts compare their algorithms on several standardized data sets. Objective (fair) comparison

• Use accepted databases, e.g., UCI

• Divide the data into training, validation, and test sets

• What is the problem with this approach?

Page 13

The Scientific Approach

Benchmarking—Only A First Solution

• Algorithms are trained for this specific set of benchmark functions

• Who defines this set of functions?

• In practice, I do not need an algorithm that performs well on a set of test problems (which was developed by some experts)

• Really wanted:
  • An algorithm that performs very well on my set of real-world test problems
  • Not only demonstrating
  • Understanding!

• Let's have a short look at the problem

Page 14

Some Stats: Taxonomy

A Taxonomy of Algorithm and Problem Designs

• Classify parameters

• Parameters may be qualitative, such as the presence or absence of a recombination operator, or numerical, such as parameters that assume real values

• Our interest: understanding the contribution of these components

• Statistically speaking, parameters are called factors

• The interest is in the effects of the specific levels chosen for these factors (see the sketch below)
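To make the statistical view concrete, the following minimal R sketch (all names are illustrative assumptions, not from the talk) codes a qualitative and a numerical algorithm parameter as factors with levels in a full factorial design:

# A qualitative factor (recombination on/off) and a numerical factor (mutation rate)
design <- expand.grid(
  recombination = factor(c("off", "on")),  # qualitative factor, two levels
  mutationRate  = c(0.01, 0.05, 0.10)      # numerical factor, three levels
)
print(design)  # full factorial: 2 x 3 = 6 parameter settings whose effects can be studied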

Page 15

Some Stats: Taxonomy

Problems and Algorithms

[Figure: taxonomy of designs — "tuning" at the top, connecting four quadrants: multiple algorithms/single problem, multiple algorithms/multiple problems, single algorithm/multiple problems, single algorithm/single problem]

• How to perform comparisons?
• Adequate statistics and models?

Page 16

Some Stats: Taxonomy

SASP: Algorithm and Problem Designs

• Basic design: assess the performance of an optimization algorithm on a single problem instance $\pi$

• Randomized optimization algorithms ⇒ the performance $Y$ on one instance is a random variable

• Experiment: run the algorithm $r$ times on an instance $\pi$ ⇒ $r$ replicates of the performance measure $Y$, i.e., the sample data $Y_1, \ldots, Y_r$

• Conditionally on the sampled instance, and given the random nature of the algorithm, the samples are independent and identically distributed (i.i.d.), i.e.,

$$p(y_1, \ldots, y_r \mid \pi) = \prod_{j=1}^{r} p(y_j \mid \pi). \qquad (1)$$
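As a minimal illustration of this design, the following sketch (base R only; the sphere test function and all settings are illustrative assumptions, not from the talk) collects r i.i.d. replicates of a randomized optimizer on one fixed instance:

f <- function(x) sum(x^2)   # assumed fixed problem instance (sphere function)
set.seed(1)                 # reproducibility of the randomized runs
# r = 10 independent SANN runs on the same instance => replicates Y_1, ..., Y_10
Y <- replicate(10, optim(par = c(1, 1), fn = f, method = "SANN",
                         control = list(maxit = 100))$value)
mean(Y); sd(Y)              # summary statistics of the performance measure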

Page 17

Some Stats: Taxonomy

SASP – Single Algorithm, Single Problem

[Figure: the taxonomy with "single algorithm / single problem" highlighted — real-world setting; determine important factors; optimization; crucial: number of function evaluations; benefit: …]

Page 18

Some Stats: Taxonomy

MASP: Algorithm and Problem Designs

• Several optimization algorithms are compared on one fixed problem instance $\pi$

• Experiment: collect sample data $Y_1, \ldots, Y_R$ (independent, identically distributed)

• Goal: comparison of algorithms on one (real-world) problem instance $\pi$

[Figure: the taxonomy with "multiple algorithms / single problem" highlighted — research: a beginner's paper ⇒ rejected; optimization: multiple algorithms ≈ one algorithm with different parameters; tuning and comparison; benefit: similarities between algorithms, understanding]

Page 19

Some Stats: Taxonomy

SAMP: Algorithm and Problem Designs

• Goal: drawing conclusions about a certain class or population of instances $\Pi$

[Figure: the taxonomy with "single algorithm / multiple problems" highlighted — algorithm development; optimization; robustness; benefit: important factors, understanding]

Page 20

Some Stats: Taxonomy

MAMP: Fixed Algorithm and Problem Designs

• Typically:
  • Take a few fixed instances of the problem at hand
  • Collect the results of some runs of the algorithms on these instances

• Statistically, instances are also levels of a factor

• Instances are treated as blocks

• All algorithms are run on each single instance

• Results are therefore grouped per instance

[Figure: the taxonomy with "multiple algorithms / multiple problems" highlighted — research: an expert paper ⇒ accepted; benefit: comparison; huge complexity]

Page 21

Some Stats: Taxonomy

MAMP: Randomized Problem Designs

• Sometimes several hundred (or even more) problem instances are to be tested ⇒ the interest is not just in the performance of the algorithms on a few specific instances, but rather in the generalization of the results to the entire population of instances

• Procedure: instances are chosen at random from a large set of possible instances of the problem

• Statistically, instances are again levels of a factor

• However, this factor is of a different nature from the fixed algorithmic factors described above

• Its levels are chosen at random, and the interest is not in these specific levels but in the population from which they are sampled

• ⇒ levels and factors are random

• This leads naturally to a mixed model [1]; a small sketch follows
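A minimal sketch of the data-generating process behind such a mixed model (the numbers are illustrative assumptions, not the experiment reported on the following pages): each randomly drawn instance contributes a random effect, on top of which the run-to-run noise of the algorithm acts:

# y_ij = mu + a_i + e_ij with random instance effects a_i ~ N(0, sigma_a^2)
set.seed(42)
q <- 9; r <- 5                            # 9 random instances, 5 runs each
a <- rnorm(q, sd = 1)                     # random effects of the sampled instances
y <- 10 + rep(a, each = r) + rnorm(q * r, sd = 2)   # mu = 10, run noise e_ij
inst <- factor(rep(seq_len(q), each = r))
summary(aov(y ~ inst))                    # between- vs. within-instance variability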

Page 22

Examples

Comparison of Two Simulated Annealing Parameter Settings

• Fixed-effects design: one algorithm is evaluated on one instance $\pi$ (fixed), i.e., SASP

> set.seed(123)
> library(SPOT)
> fn <- spotBraninFunction  # test function to be optimized by SANN
> x0 <- c(-2,3)             # starting point that SANN uses when optimizing Branin
> maxit <- 100              # number of evaluations of Branin allowed for SANN
> temp <- 10
> tmax <- 10
> n <- 100
> y <- rep(1,n)
> y0 <- sapply(y, function(x) x <- optim(par=x0, fn=fn, method="SANN",
+         control=list(maxit=maxit, temp=temp, tmax=tmax))$value)
> temp <- 4
> tmax <- 62
> y <- rep(1,n)
> y1 <- sapply(y, function(x) x <- optim(par=x0, fn=fn, method="SANN",
+         control=list(maxit=maxit, temp=temp, tmax=tmax))$value)

Page 23

Examples

Comparison: Simple EDA Using Boxplots

> summary(y0)
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
 0.3984  0.4444  0.6587  2.2770  3.4020 17.9600
> summary(y1)
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
 0.3985  0.4150  0.4439  0.5609  0.5736  2.2250

> boxplot(y0,y1)

[Boxplot of y0 (left) and y1 (right)]

Page 24

Examples

Comparison: Simple EDA Using Histograms

> par(mfrow=c(2,1))
> hist(y0, xlim = c(min(y0,y1), max(y0,y1)))
> hist(y1, xlim = c(min(y0,y1), max(y0,y1)))
> par(mfrow=c(1,1))

[Figure: histograms of y0 (top) and y1 (bottom) on a common x-axis]

Page 25

Examples

Comparison

• What have we learned from this comparison?
  • Reducing temp from 10 to 4 while increasing tmax from 10 to 62 improves the performance of simulated annealing
  • We have demonstrated that one setting is better

• Is this interesting for other researchers?

• Can this result be generalized?

Page 26

Examples

Introducing Mixed Models: Problem Instances

• Nine problem instances, randomly drawn from an infinite number of instances: fSeed

[Figure: the nine randomly drawn problem instances, each plotted as this.f(x) for x in −120:120]

Page 27

Examples

Introducing Mixed Models: Algorithm Settings

• Evolution Strategy with and without mutation: mut

• Five repeats of the ES: algSeed

> df <- read.table("expBart12a1.csv", header=T)
> df$mut <- factor(df$mut)
> df$fSeed <- factor(df$fSeed)
> df$algSeed <- factor(df$algSeed)
> df <- cbind(df, yLog = log(df$y))
> str(df)

'data.frame': 90 obs. of 5 variables:

$ y : num 6.22 5.4 4.56 6.12 4.65 ...

$ mut : Factor w/ 2 levels "1","2": 1 1 1 1 1 2 2 2 2 2 ...

$ fSeed : Factor w/ 9 levels "1","2","3","4",..: 1 1 1 1 1 1 1 1 1 1 ...

$ algSeed: Factor w/ 5 levels "1","2","3","4",..: 1 2 3 4 5 1 2 3 4 5 ...

$ yLog : num 1.83 1.69 1.52 1.81 1.54 ...

Page 28

Examples

Introducing Mixed Models: Assumptions

• Test the validity of the model assumptions by generating normal quantile plots (QQ plots)

[Figure: normal QQ plots (a) and (b), theoretical quantiles vs. sample quantiles]

Page 29

Examples

The Easiness of Mixed Models: The classical ANOVA

Table: ANOVA table for one-factor fixed- and random-effects models

Source of Variation | Sum of Squares | Degrees of Freedom | Mean Square  | EMS (Fixed)                                 | EMS (Random)
Treatment           | $SS_{treat}$   | $q-1$              | $MS_{treat}$ | $\sigma^2 + r\sum_{i=1}^{q}\tau_i^2/(q-1)$  | $\sigma^2 + r\sigma_\tau^2$
Error               | $SS_{err}$     | $q(r-1)$           | $MS_{err}$   | $\sigma^2$                                  | $\sigma^2$
Total               | $SS_{total}$   | $qr-1$             |              |                                             |

In the random-effects case, equating the observed mean squares to their expected values (EMS) yields the variance-component estimator $\hat\sigma_\tau^2 = (MS_{treat} - MS_{err})/r$, which is computed on the next page.

> obj1 <- aov(yLog ~ fSeed, data=df)
> summary(obj1)

Df Sum Sq Mean Sq F value Pr(>F)

fSeed 8 53.6 6.705 1.412 0.204

Residuals 81 384.6 4.748

Page 30

Examples

The Easiness of Mixed Models: Some Statistics

• Technical stuff: how to access information in R

> (M1 <- anova(obj1))

Analysis of Variance Table

Response: yLog

Df Sum Sq Mean Sq F value Pr(>F)

fSeed 8 53.64 6.7049 1.4122 0.204

Residuals 81 384.59 4.7480

> (MSA <- M1[1,3])

[1] 6.704854

> (MSE <- M1[2,3])

[1] 4.74797

> r <- length(unique(df$algSeed))
> q <- nlevels(df$fSeed)
> (var.A <- (MSA - MSE)/r)

[1] 0.3913767

> (var.E <- MSE)

[1] 4.74797
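As a cross-check, the same variance components can also be estimated directly with a random-effects model; a minimal sketch, assuming the lme4 package is installed (it is not used in the talk):

> library(lme4)
> fit <- lmer(yLog ~ 1 + (1 | fSeed), data = df)
> VarCorr(fit)  # variance components: fSeed (between-instance) and residual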

Page 31

Examples

Results: What have we Learned?

• Relatively high total variance:

> var.A + var.E

[1] 5.139347

• The large p value indicates that the variance is caused by the mutation operator, not by the problem instances:

> 1-pf(MSA/MSE,q-1,q*(r-1))

[1] 0.2249

• The confidence interval provides information about the algorithm's performance on this set of problem instances (a prediction for new problem instances):

> s <- sqrt(MSA/(q*r))
> Y.. <- mean(df$yLog)
> qsr <- qt(1-0.025, r)
> c(exp(Y.. - qsr * s), exp(Y.. + qsr * s))

[1] 0.3529964 2.5681782

Page 32

Examples

Results: What have we Learned?

• Mutation improves algorithm performance (smaller values are better)

> summary(df$y[df$mut==1])

Min. 1st Qu. Median Mean 3rd Qu. Max.

0.5352 1.9560 4.1170 4.6380 7.0540 10.2300

> summary(df$y[df$mut==2])

Min. 1st Qu. Median Mean 3rd Qu. Max.

0.002223 0.055440 0.511000 1.354000 1.550000 8.042000

[Boxplot of y for mut = 1 (left) and mut = 2 (right)]

Page 33

Outlook

Outlook

• The Journal of Negative Results in BioMedicine strongly promotes and invites the publication of clinical trials that fall short of demonstrating an improvement over current treatments

• The aim of the journal is to provide scientists and physicians with responsible and balanced information in order to improve experimental designs and clinical decisions

• Do we need a similar infrastructure in computer science?
  • Register and announce experiments
  • Log in
  • Download test data
  • Perform experiments
  • Report

• Only a fantasy? Let's start a discussion...

Page 34

Suggested Readings

Suggested Reading

• Experimental Methods for the Analysis of Optimization Algorithms

• See also Kleijnen [3] and Saltelli et al.

• http://www.spotseven.org

Page 35

Acknowledgments

Acknowledgments

• This work has been supported by the Federal Ministry of Education and Research (BMBF) under the grants MCIOP (FKZ 17N0311) and CIMO (FKZ 17002X11)

Page 36

Acknowledgments

[1] Marco Chiarandini and Yuri Goegebeur. Mixed models for the analysis of optimization algorithms. In Thomas Bartz-Beielstein, Marco Chiarandini, Luís Paquete, and Mike Preuss, editors, Experimental Methods for the Analysis of Optimization Algorithms, pages 225–264. Springer, Germany, 2010. Preliminary version available as Tech. Rep. DMF-2009-07-001 at the Danish Mathematical Society.

[2] M. Gallagher and B. Yuan. A general-purpose tunable landscape generator. IEEE Transactions on Evolutionary Computation, 10(5):590–603, 2006.

[3] J. P. C. Kleijnen. Design and Analysis of Simulation Experiments. Springer, New York, NY, 2008.
