
Page 1: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Biologically Inspired Computing:

Evolutionary Algorithms: Some Details and Examples

This is lecture 4 of 'Biologically Inspired Computing'.

Contents: steady state and generational EAs, basic ingredients, example run-throughs on the TSP, example encodings, hillclimbing, landscapes.

Page 2: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

A Standard Evolutionary Algorithm

The algorithm whose pseudocode is on the next slide is a steady state, replace-worst EA with tournament selection, using mutation, but no crossover.

Parameters are popsize, tournament size, mutation-rate.

It can be applied to any problem; the details glossed over are all problem-specific.

Page 3: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

A steady state, mutation-only, replace-worst EA with tournament selection

0. Initialise: generate a population of popsize random solutions, and evaluate their fitnesses.

1. Run Select to obtain a parent solution X.
2. With probability mute_rate, mutate a copy of X to obtain a mutant M (otherwise M = X).
3. Evaluate the fitness of M.
4. Let W be the current worst in the population (BTR: break ties randomly). If M is not less fit than W, then replace W with M (otherwise do nothing).
5. If a termination condition is met (e.g. we have done 10,000 evaluations) then stop. Otherwise go to 1.

Select: randomly choose tsize individuals from the population. Let X be the one with the best fitness (BTR); return X.
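A minimal Python sketch of this algorithm, assuming the problem-specific pieces are supplied by the caller as a random_solution() generator, a fitness() function (higher is fitter) and a mutate() operator; these names and the default parameter values are illustrative, not part of the slides.

    import random

    def steady_state_ea(random_solution, fitness, mutate,
                        popsize=50, tsize=3, mute_rate=0.9, max_evals=10000):
        # 0. Initialise: popsize random solutions and their fitnesses.
        pop = [random_solution() for _ in range(popsize)]
        fits = [fitness(s) for s in pop]
        evals = popsize

        while evals < max_evals:                      # 5. termination condition
            # 1. Select: tournament over tsize random individuals, best one wins
            #    (random.sample returns them in random order, so ties are broken arbitrarily).
            contenders = random.sample(range(popsize), tsize)
            x = pop[max(contenders, key=lambda i: fits[i])]

            # 2. With probability mute_rate, mutate a copy of X; otherwise M = X.
            m = mutate(x) if random.random() < mute_rate else x

            # 3. Evaluate the fitness of M.
            fm = fitness(m)
            evals += 1

            # 4. Replace the current worst with M if M is not less fit.
            #    (Ties for worst go to the lowest index here; the slides break ties randomly.)
            worst = min(range(popsize), key=lambda i: fits[i])
            if fm >= fits[worst]:
                pop[worst], fits[worst] = m, fm

        best = max(range(popsize), key=lambda i: fits[i])
        return pop[best], fits[best]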

Page 4: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

A generational, elitist, crossover+mutation EA with Rank-Based selection

0. Initialise: generate a population G of popsize random solutions, and evaluate their fitnesses.
1. Run Select 2*(popsize - 1) times to obtain a collection I of 2*(popsize - 1) parents.
2. Randomly pair up the parents in I (into popsize - 1 pairs) and apply Vary to produce a child from each pair. Let the set of children be C.
3. Evaluate the fitness of each child.
4. Keep the best in the population G (BTR) and delete the rest.
5. Add all the children to G.
6. If a termination condition is met (e.g. we have done 100 or more generations, i.e. runs through steps 1-5) then stop. Otherwise go to 1.

Page 5: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

A generational, elitist, crossover+mutation EA with Rank-Based selection, continued …

Select: sort the contents of G from best to worst, assigning rank popsize to the best, popsize-1 to the next best, and so on, down to rank 1 for the worst. The ranks sum to F = popsize(popsize+1)/2. Associate a probability Rank_i/F with each individual i. Using these probabilities, choose one individual X, and return X.

Vary:
1. With probability cross_rate, do a crossover: i.e. produce a child by applying a crossover operator to the two parents. Otherwise, let the child be a randomly chosen one of the parents.
2. Apply mutation to the child.
3. Return the mutated child.
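A sketch of Select and Vary in Python, assuming the population is kept as a list pop with a parallel list fits of fitness values (higher is fitter), and that crossover and mutate are problem-specific operators passed in by the caller; the default cross_rate of 0.7 is just a placeholder.

    import random

    def rank_select(pop, fits):
        """Rank-based Select: rank 1 for the worst ... rank popsize for the best."""
        order = sorted(range(len(pop)), key=lambda i: fits[i])   # worst first
        ranks = list(range(1, len(pop) + 1))                     # ranks sum to popsize*(popsize+1)/2
        chosen = random.choices(order, weights=ranks, k=1)[0]    # P(individual i) = rank_i / F
        return pop[chosen]

    def vary(parent1, parent2, crossover, mutate, cross_rate=0.7):
        """Vary: crossover with probability cross_rate, else copy one parent; then mutate."""
        if random.random() < cross_rate:
            child = crossover(parent1, parent2)
        else:
            child = random.choice([parent1, parent2])
        return mutate(child)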

Page 6: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

The basic ingredients

A running EA can be seen as a sequence of generations. A generation is simply characterised by a population.

An EA goes from generation t to generation t+1 in 3 steps:

[Diagram: Gen t --(select parents)--> Parents --(produce children from the parents)--> Children --(merge parent and child populations, and remove some so that popsize is maintained)--> Gen t+1]

Page 7: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

The basic ingredients

A running EA can be seen as a sequence of generations. A generation is simply characterised by a population.

An EA goes from generation t to generation t+1 in 3 steps:

[Diagram: Gen t+1 --(select parents)--> Parents --(…)--> etc., as on the last slide, leading to Gen t+2, and so on.]

Page 8: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Steady State vs Generational

There are two extremes:

Steady state: the population only changes slightly in each generation (e.g. select 1 parent, produce 1 child, add that child to the population and remove the worst).

Generational: the population changes completely in each generation (select some [could still be 1] parent(s), produce popsize children, and they become the next generation).

In practice, though, if we are using a generational EA we usually use elitism, which means the new generation always contains the best solution from the previous generation, the remaining popsize-1 individuals being new.

Page 9: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Selection and Variation

A selection method is a way to choose a parent from the population, in such a way that the fitter an individual is, the more likely it is to be selected.

A variation operator (or genetic operator) is any method that takes in a (set of) parent(s) and produces a new individual (the child). If the input is a single parent, it is called a mutation operator; if two parents, it is called crossover or recombination.

Page 10: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Replacement

A replacement method is a way to decide how to produce the next generation from the merged previous generation and children.

E.g. we might simply sort them in order of fitness, and take the best popsize of them. What else might we do instead?
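As one concrete illustration of that sort-and-keep-the-best option, here is a minimal Python sketch, assuming maximisation and that the merged generation is a list of (solution, fitness) pairs; the function name is mine, not the slides'.

    def truncation_replace(merged, popsize):
        """Sort merged parents + children by fitness (best first) and keep the top popsize."""
        merged.sort(key=lambda sol_fit: sol_fit[1], reverse=True)
        return merged[:popsize]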

Page 11: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Encodings

We want to evolve schedules, networks, routes, coffee percolators, drug designs – how do we encode or represent solutions?

How you encode dictates what your operators can be, and it imposes constraints that the operators must meet.

Page 12: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

E.g. Bin-Packing

The Problem: You have items of different sizes or weights, e.g.:
item 1 (30), item 2 (30), item 3 (30), item 4 (20), item 5 (10)
and you have several 'bins' with a maximum capacity, say 40. Find a way to pack the items into the smallest number of bins. This is a hard problem.

We will look at the equalize-bins version, also hard: pack these items into N bins (we will choose N = 3), with arbitrary capacity, so that the weights of the N bins are as equal as possible.

Example encoding: have nitems numbers, each between 1 and N. So 1,2,1,3,3 means: items 1 and 3 are in bin 1, item 2 is in bin 2, and items 4 and 5 are in bin 3.

Fitness: minimise the difference between the heaviest and lightest bin. In this case what is the fitness?

Mutation: choose a gene at random, and change it to a random new value.

Crossover: choose a random crossover point c (one of genes 1, 2, 3 or 4); the child is a copy of parent 1 up to and including gene c, and a copy of parent 2 thereafter.
E.g. crossover of 3,1,2,3,3 and 2,2,1,2,1 at gene 2 gives 3,1,1,2,1; crossover of 1,1,3,2,2 and 2,3,1,1,1 at gene 2 gives 1,1,1,1,1.
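A sketch of this encoding and its operators in Python, using the item weights and N = 3 from the example above; names like WEIGHTS and N_BINS are mine, not the slides'.

    import random

    WEIGHTS = [30, 30, 30, 20, 10]   # items 1..5 from the slide
    N_BINS = 3

    def fitness(genome):
        """Difference between the heaviest and lightest bin (to be minimised)."""
        loads = [0] * N_BINS
        for item, bin_no in enumerate(genome):        # genome[i] is a bin number in 1..N_BINS
            loads[bin_no - 1] += WEIGHTS[item]
        return max(loads) - min(loads)

    def mutate(genome):
        """Choose a gene at random and change it to a different random bin."""
        child = list(genome)
        g = random.randrange(len(child))
        child[g] = random.choice([b for b in range(1, N_BINS + 1) if b != child[g]])
        return child

    def crossover(p1, p2):
        """One-point crossover: copy parent 1 up to and including gene c, parent 2 thereafter."""
        c = random.randint(1, len(p1) - 1)            # c is one of genes 1..4 for 5 items
        return p1[:c] + p2[c:]

    # Example: crossover([3,1,2,3,3], [2,2,1,2,1]) with c = 2 gives [3,1,1,2,1],
    # and fitness([1,2,3,1,2]) == 20 (the bins weigh 50, 40 and 30).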

Page 13: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Back to Basics

With your thirst for seeing example EAs temporarily quenched, the story now skips to simpler algorithms.

This will help to explain what it is about the previous ones that makes them work.

Page 14: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

The Travelling Salesperson Problem (also see http://francisshanahan.com/tsa/tsaGAworkers.htm )

An example (hard) problem, for illustration.

The Travelling Salesperson Problem: find the shortest tour through the cities, given the distances between each pair:

      A    B    C    D    E
A     -    5    7    4   15
B     5    -    3    4   10
C     7    3    -    2    7
D     4    4    2    -    9
E    15   10    7    9    -

[Diagram: the five cities A-E, with an example tour of length 33 drawn.]

Page 15: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Simplest possible EA: Hillclimbing

0. Initialise: generate a random solution c; evaluate its fitness, f(c). Call c the current solution.

1. Mutate a copy of the current solution; call the mutant m. Evaluate the fitness of m, f(m).

2. If f(m) is no worse than f(c), then replace c with m; otherwise do nothing (effectively discarding m).

3. If a termination condition has been reached, stop. Otherwise, go to 1.

Note: there is no population (well, a population of 1). This is a very simple version of an EA, although it has been around for much longer.
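A minimal Python sketch of this hillclimber, again assuming caller-supplied random_solution(), fitness() (higher is fitter) and mutate() functions, and an illustrative fixed evaluation budget.

    def hillclimb(random_solution, fitness, mutate, max_evals=10000):
        # 0. Initialise: a random current solution and its fitness.
        current = random_solution()
        f_current = fitness(current)
        for _ in range(max_evals):
            # 1. Mutate a copy of the current solution and evaluate it.
            mutant = mutate(current)
            f_mutant = fitness(mutant)
            # 2. Accept the mutant if it is no worse than the current solution.
            if f_mutant >= f_current:
                current, f_current = mutant, f_mutant
        # 3. Termination: here, simply a fixed budget of evaluations.
        return current, f_current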

Page 16: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Why “Hillclimbing”?

Suppose that solutions are lined up along the x axis, and that mutation always gives you a nearby solution. Fitness is on the y axis; this is a landscape.

[Plot: a one-dimensional fitness landscape, with points labelled 1-10 marking the successive solutions visited by the hillclimber.]

1. Initial solution; 2. rejected mutant; 3. new current solution; 4. new current solution; 5. new current solution; 6. new current solution; 7. rejected mutant; 8. rejected mutant; 9. new current solution; 10. rejected mutant, …

Page 17: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Example: HC on the TSP

We can encode a candidate solution to the TSP as a permutation of the cities.

Here is our initial random solution, ACEDB, with fitness 32. This is our Current solution.

[Diagram: the Current tour A-C-E-D-B drawn over the five cities; distance matrix as on the earlier slide.]
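For this run-through, the permutation encoding, tour length ('fitness': note that lower is better here) and adjacent-swap mutation might be sketched in Python as follows, with the distance matrix taken from the slides.

    import random

    # Symmetric distance matrix from the slides; keys are unordered city pairs.
    DIST = {frozenset(pair): d for pair, d in {
        ('A', 'B'): 5,  ('A', 'C'): 7,  ('A', 'D'): 4,  ('A', 'E'): 15,
        ('B', 'C'): 3,  ('B', 'D'): 4,  ('B', 'E'): 10,
        ('C', 'D'): 2,  ('C', 'E'): 7,  ('D', 'E'): 9,
    }.items()}

    def tour_length(tour):
        """Length of the closed tour, e.g. tour_length('ACEDB') == 32."""
        return sum(DIST[frozenset((tour[i], tour[(i + 1) % len(tour)]))]
                   for i in range(len(tour)))

    def swap_adjacent(tour):
        """Swap a randomly chosen pair of adjacent cities (first and last count as adjacent)."""
        i = random.randrange(len(tour))
        j = (i + 1) % len(tour)
        cities = list(tour)
        cities[i], cities[j] = cities[j], cities[i]
        return ''.join(cities)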

Page 18: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Example: HC on the TSP

Here is our initial random solution, ACEDB, with fitness 32. We are going to mutate it: first, make Mutant a copy of Current.

[Diagram: Current and Mutant (both ACEDB) drawn over the five cities; distance matrix as before.]

Page 19: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

HC on the TSP

We randomly mutate the copy (swapping a randomly chosen pair of adjacent nodes), taking ACEDB to ACDEB, which has fitness 33. So Current stays the same, because we reject this mutant.

[Diagram: Current tour ACEDB and Mutant tour ACDEB; distance matrix as before.]

Page 20: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

HC on the TSP

We now try another mutation of Current (again swapping randomly chosen adjacent nodes), from ACEDB to CAEDB. Fitness is 38, so we reject that too.

[Diagram: Current tour ACEDB and Mutant tour CAEDB; distance matrix as before.]

Page 21: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

HC on the TSP

Our next mutant of Current takes ACEDB to AECDB. Fitness 33; reject this too.

[Diagram: Current tour ACEDB and Mutant tour AECDB; distance matrix as before.]

Page 22: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

HC on the TSP

Our next mutant of Current takes ACEDB to ACDEB. Fitness 33; reject this too.

[Diagram: Current tour ACEDB and Mutant tour ACDEB; distance matrix as before.]

Page 23: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

HC on the TSP

Our next mutant of Current takes ACEDB to ACEBD. Fitness is 32, equal to Current, so this becomes the new Current.

[Diagram: Current tour ACEDB and Mutant tour ACEBD; distance matrix as before.]

Page 24: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

HC on the TSP

ACEBD is our Current solution, with fitness 32.

[Diagram: Current tour ACEBD; distance matrix as before.]

Page 25: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

HC on the TSP

ACEBD is our Current solution, with fitness 32. We mutate it to DCEBA (swapping the first and last cities, which count as adjacent nodes since the tour is a cycle); its fitness is 28, so this becomes our new Current solution.

[Diagram: Current tour ACEBD and Mutant tour DCEBA; distance matrix as before.]

Page 26: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

HC on the TSP

Our new Current is DCEBA, with fitness 28.

[Diagram: Current tour DCEBA; distance matrix as before.]

Page 27: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

HC on the TSP

Our new Current is DCEBA, with fitness 28. We mutate it, this time getting DCEAB, with fitness 33, so we reject that and DCEBA is still our Current solution.

[Diagram: Current tour DCEBA and Mutant tour DCEAB; distance matrix as before.]

Page 28: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

And so on …

Page 29: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Landscapes

Recall S, the search space, and f(s), the fitness of a candidate s in S.

[Plot: f(s) on the vertical axis, with the members of S lined up along the horizontal axis.]

The structure we get by imposing f(s) on S is called a landscape.

What does the landscape look like if f(s) is a random number generator?
What kinds of problems would have very smooth landscapes?
What is the importance of the mutation operator in all this?

Page 30: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Neighbourhoods

Let s be an individual in S, f(s) our fitness function, and M our mutation operator, so that M(s1) → s2, where s2 is a mutant of s1.

Given M, we can usually work out the neighbourhood of an individual point s: the neighbourhood of s is the set of all possible mutants of s.

E.g. Encoding: permutations of k objects (e.g. for the k-city TSP). Mutation: swap any adjacent pair of objects. Neighbourhood: each individual has k neighbours. E.g. the neighbours of EABDC are: {AEBDC, EBADC, EADBC, EABCD, CABDE}.

Encoding: binary strings of length L (e.g. for L-item bin-packing). Mutation: choose a bit at random and flip it. Neighbourhood: each individual has L neighbours. E.g. the neighbours of 00110 are: {10110, 01110, 00010, 00100, 00111}.
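A sketch of how these two neighbourhoods can be enumerated in Python, with permutations and bit strings both represented as strings.

    def permutation_neighbours(perm):
        """All adjacent-pair swaps of a permutation (first and last positions count as adjacent)."""
        k = len(perm)
        out = []
        for i in range(k):
            j = (i + 1) % k
            p = list(perm)
            p[i], p[j] = p[j], p[i]
            out.append(''.join(p))
        return out

    def bitflip_neighbours(bits):
        """All single-bit flips of a binary string."""
        return [bits[:i] + ('1' if bits[i] == '0' else '0') + bits[i + 1:]
                for i in range(len(bits))]

    # permutation_neighbours("EABDC") -> ['AEBDC', 'EBADC', 'EADBC', 'EABCD', 'CABDE']
    # bitflip_neighbours("00110")     -> ['10110', '01110', '00010', '00100', '00111']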

Page 31: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Landscape Topology

Mutation operators lead to slight changes in the solution, which tend to lead to slight changes in fitness.

I.e. the fitnesses in the neighbourhood of s are often similar to the fitness of s.

Landscapes tend to be locally smooth. What about big mutations? It turns out that …

Page 32: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Typical Landscapes

[Plot: f(s) against the members of S lined up along the horizontal axis; a few narrow high-fitness peaks in an otherwise low-fitness landscape.]

Typically, with large (realistic) problems, the huge majority of the landscape has very poor fitness; there are tiny areas where the decent solutions lurk. So, big random changes are very likely to take us outside the nice areas.

Page 33: Biologically Inspired Computing:   Evolutionary Algorithms: Some Details and Examples

Quiz Questions

1. Here is an altered distance matrix for the TSP problem in these slides. Illustrate five steps of hillclimbing, in the same way as done in these slides (inventing your own random mutations etc.), using this altered matrix:

      A    B    C    D    E
A     -   15   17    4   15
B    15    -    3   14   10
C    17    3    -    2   17
D     4   14    2    -   19
E    15   10   17   19    -

2. Consider the example encoding and mutation operator given in these slides for the equalize-bins problem, and imagine nitems = 100 and N (the number of bins) is 10. What is the neighbourhood size?
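Not from the slides, but if you want to check your hand-worked answers to question 1, the altered matrix can be dropped into a tour-length helper like the one sketched earlier.

    # Altered distance matrix from the quiz (assumed symmetric, like the original).
    QUIZ_DIST = {frozenset(pair): d for pair, d in {
        ('A', 'B'): 15, ('A', 'C'): 17, ('A', 'D'): 4,  ('A', 'E'): 15,
        ('B', 'C'): 3,  ('B', 'D'): 14, ('B', 'E'): 10,
        ('C', 'D'): 2,  ('C', 'E'): 17, ('D', 'E'): 19,
    }.items()}

    def tour_length(tour, dist=QUIZ_DIST):
        """Length of the closed tour under the given distance matrix."""
        return sum(dist[frozenset((tour[i], tour[(i + 1) % len(tour)]))]
                   for i in range(len(tour)))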