Martin V. Butz, Olivier Sigaud. "XCSF with Local Deletion: Preventing Detrimental Forgetting", IWLCS, 2011
Cognitive Bodyspaces: Learning and Behavior
Department of Psychology (III), Cognitive Psychology

XCSF with Local Deletion: Preventing Detrimental Forgetting

Martin V. Butz
Department of Psychology III, University of Würzburg
Röntgenring 11, 97070 Würzburg

Olivier Sigaud
Institut des Systèmes Intelligents et de Robotique, Université Pierre et Marie Curie Paris 6
CNRS UMR 7222, 4 place Jussieu, F-75005 Paris
[email protected]
13. 7. 2011 XCSF with Local Deletion Martin V. Butz & Olivier Sigaud
Motivation
Achieve the following goals:
– Maintain a complete solution
– Avoid detrimental forgetting
– Enable continuous learning with selective focus

… particularly in problems where:
– the problem space is non-uniformly or non-independently sampled (not i.i.d.),
– the sub-space is not fully sampled (learning in manifolds),
– some problem subspaces need to be known better than others, i.e. with smaller error (selective learning).
Observation
• XCSF reproduces locally but deletes globally.
• This is good, because we generate a generalization pressure (local classifiers are on average more general).
• This is bad, however, because non-uniformly sampled problems can lead to forgetting.
• Thus, how can we
– delete locally and still
– generate the generalization pressure?
[Figure: set pressure, mutation pressure, and subsumption pressure act along the specificity axis (0 to 1), driving the population toward accurate, maximally general classifiers.]
Approach: choose local candidates for deletion without dependence on their generality.
Algorithm
1. Select a random classifier cl from [M].
2. [D] = ∅
3. for all c ∈ [P] do
4.   if cl does match the center of c then
5.     add c to the candidate list [D]
6.   end if
7. end for
8. Delete from the candidate list [D].
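The algorithm above can be sketched in Python as follows. This is a minimal illustration, not the paper's implementation: the classifier representation (a center plus a radius), the matching predicate, and the vote-proportionate deletion over [D] are simplified assumptions standing in for XCSF's richer conditions and deletion scheme.

```python
import random

# Hypothetical minimal classifier: the condition is reduced to a center
# and a matching radius; real XCSF conditions are richer (e.g. ellipsoids).
class Classifier:
    def __init__(self, center, radius, deletion_vote=1.0):
        self.center = center
        self.radius = radius
        self.deletion_vote = deletion_vote  # e.g. an action-set-size estimate

    def matches(self, point):
        # Simple Euclidean matching sketch.
        dist2 = sum((c - p) ** 2 for c, p in zip(self.center, point))
        return dist2 <= self.radius ** 2

def local_deletion(population, match_set):
    """Sketch of local deletion:
    1. pick a random classifier cl from the match set [M];
    2. collect every classifier in [P] whose center cl matches into [D];
    3. delete from [D] only (here: roulette wheel over deletion votes)."""
    cl = random.choice(match_set)
    candidates = [c for c in population if cl.matches(c.center)]
    if not candidates:
        return None
    # Roulette-wheel deletion over the local candidate list.
    total = sum(c.deletion_vote for c in candidates)
    pick = random.uniform(0, total)
    acc = 0.0
    for c in candidates:
        acc += c.deletion_vote
        if acc >= pick:
            population.remove(c)
            return c
    population.remove(candidates[-1])
    return candidates[-1]
```

Because cl always matches its own center, [D] is never empty when the match set is drawn from the population, so exactly one classifier is removed per call.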
The Two Evaluation Functions
Crossed-Ridge Function and Diagonal Sine Function
Evaluation with Different Sampling Types
• Normal: uniform sampling
1. Random walk sampling:
   – the next sample is located in a radial vicinity of the previous one
2. Random walk sampling in a ring (area of distance 0.3 to 0.4 from the center)
3. Centered Gaussian sampling
4. Ring-based Gaussian sampling
Parameter settings: N = 4000, ε₀ = 0.002
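The sampling regimes above can be sketched as follows on the unit square. The step size, Gaussian standard deviations, and ring radius of 0.35 are illustrative assumptions (the slides only fix the ring at distance 0.3 to 0.4 from the center), not the paper's settings.

```python
import math
import random

CENTER = (0.5, 0.5)

def clip(v):
    # Keep coordinates inside the unit interval.
    return min(1.0, max(0.0, v))

def uniform_sample():
    return (random.random(), random.random())

def random_walk_sample(prev, step=0.05):
    # Next sample lies in a radial vicinity of the previous one.
    angle = random.uniform(0, 2 * math.pi)
    r = random.uniform(0, step)
    return (clip(prev[0] + r * math.cos(angle)),
            clip(prev[1] + r * math.sin(angle)))

def in_ring(p, lo=0.3, hi=0.4):
    return lo <= math.dist(p, CENTER) <= hi

def random_walk_ring_sample(prev, step=0.05):
    # Rejection-sample walk steps until one stays inside the ring.
    while True:
        cand = random_walk_sample(prev, step)
        if in_ring(cand):
            return cand

def gaussian_sample(sigma=0.1):
    # Centered Gaussian sampling around the middle of the space.
    return (clip(random.gauss(CENTER[0], sigma)),
            clip(random.gauss(CENTER[1], sigma)))

def ring_gaussian_sample(sigma=0.02):
    # Gaussian scatter around a random point on a ring of radius 0.35.
    angle = random.uniform(0, 2 * math.pi)
    cx = CENTER[0] + 0.35 * math.cos(angle)
    cy = CENTER[1] + 0.35 * math.sin(angle)
    return (clip(random.gauss(cx, sigma)), clip(random.gauss(cy, sigma)))
```

Each generator violates the i.i.d. assumption in a different way (temporal correlation, restricted support, or non-uniform density), which is exactly what exposes global deletion to detrimental forgetting.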
Crossed Ridge: Uniform Sampling
Crossed-Ridge Comparison: Before Condensation
Normal XCSF vs. XCSF with Local Deletion
Crossed-Ridge Comparison: After Condensation
Normal XCSF vs. XCSF with Local Deletion
Crossed Ridge: Random Walk Sampling
Crossed Ridge: Ring-Based Gaussian Sampling
Sine Function: Uniform Sampling
Diagonal Sine Function: Before Condensation
Normal XCSF vs. XCSF with Local Deletion
Diagonal Sine Function: After Condensation
Normal XCSF vs. XCSF with Local Deletion
Sine Function: Random Walk Sampling
Sine Function: Random Walk Sampling in Ring
Sine Function: Gaussian Sampling
Sine Function: Ring-Based Gaussian Sampling
Summary & Conclusions
• Local deletion does not negatively affect performance.
• During condensation, local deletion better sustains the problem solution.
• Some of the results also indicate better structural development during learning.
• These results have been confirmed in various other settings.
• There is no apparent drawback to applying local deletion (it adds only constant computational overhead).
• Use this mechanism also in other condition settings!
• Use it also to selectively learn more accurate and less accurate approximations in different problem subspaces!