Genetic Programming: Hypothesis Evolution


By Nam Nguyen and Greg Nelson

Introduction to Evol

An attempt to more directly simulate the process of natural selection. The system is set up as a large board with agents and examples. Each agent executes on its own thread. Agents live on the board and wander around, processing examples and interacting with other agents.

Two kinds of agents: helpers and fighters. Examples act as food for the agents.

Goal

Implement a new genetic algorithm that, rather than being based on probabilities, actually models the process of natural selection.

Build a dynamic, intuitive learning system that very closely emulates natural selection.

Find a balance in parameter settings (how much "life" an agent starts with, what the "age" threshold is in order to breed, how much an agent's "life" is boosted by eating an example, etc.); some settings cause agents to breed like rabbits and grow to doomsday, others lead to almost immediate extinction.
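The slides do not give the actual parameter names or values; the following is a minimal Python sketch of the kinds of knobs being balanced here. All names and defaults are illustrative assumptions, not the authors' settings.

from dataclasses import dataclass

@dataclass
class EvolParams:
    # All names and values are illustrative assumptions, not the authors' settings.
    start_life: int = 500           # how much "life" an agent starts with
    breed_age_threshold: int = 100  # how old an agent must be before it may breed
    eat_bonus: int = 50             # how much eating an example boosts an agent's "life"
    breed_cost: int = 250           # life each parent gives up when breeding (matches the slide arithmetic)
    step_cost: int = 1              # life lost per simulation step (starvation pressure)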

Problem Representation

Environment is represented by a board. All entities live on this board. Agents walk around, eat examples, breed, share information, and fight with other agents.

[Board diagram: agents and examples (E) scattered over a grid]

Hierarchy Tree

Entity
  Example
  Agent
    Helper
    Fighter
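A skeleton of this hierarchy might look like the following; class and field names are assumptions for illustration, not the project's actual code.

class Entity:
    """Anything that occupies a cell on the board."""
    def __init__(self, row, col):
        self.row, self.col = row, col

class Example(Entity):
    """A training example encoded as a bit string; acts as food."""
    def __init__(self, row, col, bits):
        super().__init__(row, col)
        self.bits = bits

class Agent(Entity):
    """Base class for the two kinds of agents."""
    pass

class Helper(Agent):
    """Cooperative agent: shares knowledge and breeds."""
    pass

class Fighter(Agent):
    """Aggressive agent: fights other agents, but can still breed."""
    pass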

Play Tennis Example

Example 1: (Outlook = Sunny) && (Temp = Hot) && (Humidity = High) && (Wind = Weak) Then (PlayTennis = Yes)

Bit string: 10010010101 (first three bits for Outlook, etc…)

All hypotheses and examples are represented with this notation.
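A sketch of this encoding: one bit per attribute value, plus a final classification bit. Only "first three bits for Outlook" is stated in the slides; the remaining value orderings are assumptions.

ATTRIBUTES = [
    ("Outlook",  ["Sunny", "Overcast", "Rain"]),
    ("Temp",     ["Hot", "Mild", "Cool"]),
    ("Humidity", ["High", "Normal"]),
    ("Wind",     ["Weak", "Strong"]),
]

def encode(example, play_tennis):
    """Turn an attribute->value dict plus the PlayTennis label into a bit string."""
    bits = []
    for name, values in ATTRIBUTES:
        bits.extend("1" if example[name] == value else "0" for value in values)
    bits.append("1" if play_tennis else "0")
    return "".join(bits)

print(encode({"Outlook": "Sunny", "Temp": "Hot",
              "Humidity": "High", "Wind": "Weak"}, True))   # -> 10010010101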

Agents

The agents contain a genotype and a phenotype. Both represent hypotheses. The phenotype is used (and changed); the genotype is inherited at birth, never changed, and passed on to future generations.

Agents also interact by breeding, helping each other, or fighting.
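A sketch of how an agent might carry these two hypotheses (board position and threading omitted). The matching rule in classifies_correctly is an assumption, chosen to be consistent with the rules in the sample output at the end: within one attribute the set bits are ORed together, attributes are ANDed, and the last bit is the predicted class. How uncovered examples are handled is not specified, so they are simply treated as misclassified here.

ATTR_SIZES = [3, 3, 2, 2]   # bits per attribute for PlayTennis (Outlook, Temp, Humidity, Wind)

class Agent:
    """Sketch of an agent's state; board position and threading are omitted."""
    def __init__(self, genotype, life=500):
        self.genotype = tuple(genotype)   # fixed at birth, never changed, passed on to offspring
        self.phenotype = list(genotype)   # working hypothesis: the part that is used and changed
        self.life = life
        self.age = 0

    def classifies_correctly(self, example_bits):
        """Does the phenotype cover the example and predict its class bit? (assumed semantics)"""
        pos = 0
        for size in ATTR_SIZES:
            hyp = self.phenotype[pos:pos + size]
            ex = example_bits[pos:pos + size]
            if not any(h == "1" and e == "1" for h, e in zip(hyp, ex)):
                return False              # the hypothesis does not allow this attribute value
            pos += size
        return self.phenotype[-1] == example_bits[-1]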

Examples Continued

Examples are created from the training data set and act as food for the agents. When an agent comes upon an example it does one of two things:
1. If it classifies it correctly, it "eats" it, which increases its strength and likelihood of survival.
2. If not, it tries to learn something from it (change some of the phenotype bits to match the example's).
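Building on the Agent sketch above, the eat-or-learn step might look like this. The life bonus and the number of bits copied are assumptions; the slides only say that some phenotype bits are changed to match the example's.

import random

EAT_BONUS = 50   # assumed value: life gained by eating an example

def process_example(agent, example_bits):
    """Called when an agent lands on a cell containing an example."""
    if agent.classifies_correctly(example_bits):
        agent.life += EAT_BONUS           # 1. correct: "eat" it, boosting strength and survival odds
        return "ate"
    # 2. incorrect: learn from it by copying a few of the example's bits into the phenotype
    differing = [i for i, (p, e) in enumerate(zip(agent.phenotype, example_bits)) if p != e]
    for i in random.sample(differing, k=min(2, len(differing))):
        agent.phenotype[i] = example_bits[i]
    return "learned"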

When Helpers Meet Other Helpers

[Cartoon: Helper 1 (0110100) meets Helper 2 (1110110)]

"You are too young. We should help each other."
"Okay, let's be friends!"

"This is what I learned."
"Here's my knowledge."

"Bit 5 is true."
"Bit 6 is true."
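The slides show the dialogue but not the exchange mechanism. One plausible reading, sketched below with all details assumed, is that two helpers that are too young to breed each pass on a phenotype bit they currently believe is true, and the other adopts it.

import random

def share_knowledge(helper1, helper2):
    """Helpers below the breeding threshold swap one 'learned' bit each (mechanism assumed)."""
    for teacher, student in ((helper1, helper2), (helper2, helper1)):
        true_bits = [i for i, b in enumerate(teacher.phenotype) if b == "1"]
        if true_bits:
            i = random.choice(true_bits)      # "Bit 5 is true" / "Bit 6 is true"
            student.phenotype[i] = "1"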

When Helpers Meet Other Helpers: Part 2

[Cartoon: Helper 1 (life 500) meets Helper 2 (life 800)]

"How old are you?"
"I am above threshold."

[Censored]

Afterwards: Helper 1 has life 250, Helper 2 has life 550, and Helper 3 (the baby) has life 500.

Several Cycles Later…
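When both agents are above the age threshold they breed. In the slide each parent gives up 250 life and the baby starts with 500; a sketch reusing the Agent class above (the crossover method is an assumption, since the slides do not say how the child's genotype is formed):

import random

def breed(parent1, parent2, cost=250):
    """Each parent transfers `cost` life to the child: 500/800 -> 250/550, plus a baby with 500."""
    parent1.life -= cost
    parent2.life -= cost
    # The child's genotype is built from the parents' genotypes; uniform crossover is assumed here.
    child_genotype = [random.choice(pair) for pair in zip(parent1.genotype, parent2.genotype)]
    # Whether the baby is a Helper or a Fighter is not modelled in this sketch.
    return Agent(child_genotype, life=2 * cost)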

When Fighters Meet Other Agents

[Cartoon: Fighter 1 confronts another agent]

"Quit stealing my mates and examples!"
"No, you stop!"

[Lightning bolt] POW! "AH HA HA HA!"
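The slides do not say how a fight is resolved. In the sample log at the end, the agent with more life beats up the other and the loser dies, so the sketch below assumes exactly that.

def fight(agent_a, agent_b):
    """Resolve a fight: the agent with more life wins and the loser dies (assumption based on the log)."""
    winner, loser = (agent_a, agent_b) if agent_a.life >= agent_b.life else (agent_b, agent_a)
    loser.life = 0    # the loser "is dying"
    return winner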

When Fighters Meet Wise Helpers

[Cartoon: Helper 1 (life 900) meets Fighter 1 (life 800)]

"Must… fight!"
"That agent is too cute to bully…"

"Hello! How are you?"
"Get over here!"

Zip! AHHHHH! Whoosh. Smooch!

Afterwards: Helper 1 has life 650, Fighter 1 has life 550, and Fighter 2 (the baby) has life 500.

Several Cycles Later…

Learning Algorithm

Agents move around the board, learning from examples and from each other.

The more successful agents will breed more readily, while the less successful agents will be killed by starvation and combat.
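Putting the pieces together, each agent could run a loop like this on its own thread. The board API (move_randomly, occupant) and the interact dispatcher are assumed names, not the project's actual interface; process_example, Example, and Agent refer to the earlier sketches.

import threading
import time

def run_agent(agent, board, params, stop_event):
    """Per-agent thread body: wander, eat or learn, interact with neighbours, and slowly starve."""
    while not stop_event.is_set() and agent.life > 0:
        board.move_randomly(agent)               # wander to a neighbouring cell (assumed board method)
        other = board.occupant(agent)            # whatever else shares this cell, if anything (assumed)
        if isinstance(other, Example):
            process_example(agent, other.bits)   # eat it or learn from it (see earlier sketch)
        elif isinstance(other, Agent):
            interact(agent, other)               # share knowledge, breed, or fight, by kind and age (assumed)
        agent.life -= params.step_cost           # living costs life; starvation removes weak agents
        agent.age += 1
        time.sleep(0.001)                        # yield to the other agent threads

# Each agent would be launched with something like:
# threading.Thread(target=run_agent, args=(agent, board, params, stop_event), daemon=True).start()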

Performance

Seemed to learn simple tasks such as playTennis and mushroom:
- classified 80% of the playTennis set correctly
- classified 92% of the mushroom test set correctly, but only because it "learned" to classify everything as edible

Performed dismally on other tasks such as voting.

Varying parameters might improve stability and speed up convergence.

Problems

Population control (doomsday and extinction, memory issues).

Too much interaction between agents, not enough eating!

Disturbing bug (fixed): agents would have a child and then immediately mate with it!

Conclusions

Hard to control/predict a dynamic population of agents acting independently.

Nondeterministic: different results every time, although similar trends (above).

With lots of work, we think this could work very well.

Sample log file

. . .
(Fighter3,9999692) at [12,3] ate (person161,99) at [12,3]
(Fighter3,10499691) at [12,3] ate (person161,98) at [12,3]
(Fighter0,14499692) at [20,1] is breeding with (Helper0,8999691) at [20,1]!
(Fighter0,14499692) at [20,1] begat (Fighter20,500000) at [20,1]!
(Fighter20,499999) at [19,1] beat up (Fighter19,499987) at [19,1]!
(Fighter19,499987) at [19,1] is dying!
(Fighter0,14499686) at [20,2] is breeding with (Helper0,8999685) at [20,2]!
(Fighter0,14499686) at [20,2] begat (Fighter21,500000) at [20,2]!
(Fighter21,500000) at [20,2] got beat up by (Fighter0,14499685) at [20,2]
(Fighter21,500000) at [20,2] is dying!
(Helper1,13999675) at [18,0] ate (person77,98) at [18,0]
(Fighter0,14499676) at [21,4] is breeding with (Helper0,8999675) at [21,4]!
(Fighter0,14499676) at [21,4] begat (Fighter22,500000) at [21,4]!
(Fighter3,10999672) at [18,0] ate (person77,97) at [18,0]
(Helper2,9499670) at [5,16] ate (person112,99) at [5,16]
(Fighter3,11499670) at [18,24] ate (person52,95) at [18,24]
(Helper2,9999665) at [6,17] ate (person130,100) at [6,17]
. . .

Sample output

Time limit reached: terminating.
The following 42 rules were learned from the training set:

Helper1, RANK 2670: IF ((handicappedkids=y) OR (handicappedkids=n)) AND ... AND ((export_act_safrica=y)) THEN Democrat
...

Questions…

???
