Focus on Unsupervised Learning

No teacher specifying the right answer.

Techniques for autonomous software or robots to learn to characterize their sensations.

“Competitive” learning algorithm: winner-take-all.


Learning Rule: Iterate
- Find the “winner” (the prototype closest to the sample)
- Delta = learning rate * (sample - prototype)


Example: learning rate = 0.05, sample = (122, 180), winner = (84, 203)

DeltaX = learning rate * (sample x - winner x)
DeltaX = 0.05 * (122 - 84) = 1.9
New prototype x value = 84 + 1.9 = 85.9

DeltaY = 0.05 * (180 - 203) = -1.15
New prototype y value = 203 - 1.15 = 201.85
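A minimal Python sketch of this winner-take-all update, using the numbers from the example (the function names and the second prototype are mine, added for illustration):

# Winner-take-all competitive learning: move the winning prototype toward the sample.

def find_winner(prototypes, sample):
    """Return the index of the prototype closest to the sample (squared Euclidean distance)."""
    def dist2(p):
        return sum((pi - si) ** 2 for pi, si in zip(p, sample))
    return min(range(len(prototypes)), key=lambda i: dist2(prototypes[i]))

def update_winner(prototypes, sample, learning_rate=0.05):
    """Move the winning prototype a fraction of the way toward the sample."""
    w = find_winner(prototypes, sample)
    prototypes[w] = [p + learning_rate * (s - p) for p, s in zip(prototypes[w], sample)]

prototypes = [[84.0, 203.0], [300.0, 40.0]]   # (84, 203) is the winner for this sample
update_winner(prototypes, (122, 180), learning_rate=0.05)
print(prototypes[0])   # approximately [85.9, 201.85], matching the worked example above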


Python Demo


Sound familiar?


- Clustering
- Dimensionality reduction
- Data visualization


Yves Amu Klein’s Octofungi uses a Kohonen neural network to react to its environment.

Associative learning method
- Biologically inspired
- Behavioral conditioning and psychological models

activation = sign(input sum)
- +1 and -1 inputs
- 2 layers
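A quick sketch of that activation rule for bipolar (+1/-1) inputs (variable names are mine; I assume a sum of exactly zero maps to +1):

# Bipolar threshold neuron: activation is the sign of the weighted input sum.

def sign(x):
    return 1 if x >= 0 else -1    # assumption: ties at zero go to +1

def activation(inputs, weights):
    """inputs: a list of +1/-1 values; weights: one weight per input."""
    input_sum = sum(w * x for w, x in zip(weights, inputs))
    return sign(input_sum)

print(activation([1, -1, 1], [0.5, 0.2, -0.4]))   # sum = 0.5 - 0.2 - 0.4 = -0.1, so -1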


Hebbian rule (unsupervised): weight change = learning constant * neuron A activation * neuron B activation


Supervised Hebbian variant: weight change = learning constant * desired output * input value
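Both rules are single multiplicative weight updates; a hedged sketch of how they might be coded (function names are mine):

# Hebbian-style weight updates, exactly as stated above.

def hebbian_delta(learning_constant, activation_a, activation_b):
    """Unsupervised: weight change = learning constant * neuron A activation * neuron B activation."""
    return learning_constant * activation_a * activation_b

def supervised_hebbian_delta(learning_constant, desired_output, input_value):
    """Supervised: weight change = learning constant * desired output * input value."""
    return learning_constant * desired_output * input_value

weight = 0.0
weight += hebbian_delta(0.2, +1, +1)    # neurons active together: weight strengthened to 0.2
weight += hebbian_delta(0.2, +1, -1)    # activities disagree: weight weakened back to 0.0
print(weight)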


Long-term memory
- Inspired by Hebbian learning
- Content-addressable memory
- Feedback and convergence


Attractor – “a state or output vector in a system towards which the system consistently evolves, given a specific input vector.”


Attractor Basin – “the set of input vectors surrounding a learned vector which will converge to the same output vector.”


Bi-directional Associative Memory (BAM)
- Attractor network with 2 layers (for example, a Smell layer and a Taste layer)
- Information flows in both directions
- Matrix worked out in advance

Hamming vector – a vector composed of +1 and -1 only
Ex. [1,-1,-1,1], [1,1,-1,1]

Hamming distance – the number of components by which two vectors differ
Ex. [1,-1,-1,1] and [1,1,-1,1] differ in only one element (index 1), so the Hamming distance = 1
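A one-function Python sketch of that measure:

# Hamming distance: count the positions where the two vectors differ.

def hamming_distance(v1, v2):
    return sum(1 for a, b in zip(v1, v2) if a != b)

print(hamming_distance([1, -1, -1, 1], [1, 1, -1, 1]))   # 1, as in the example above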

Weights are a matrix based on the memories we want to store.

To associate X = [1,-1,-1,-1] with Y = [-1,1,1], each weight is the product of the corresponding Y and X components (X across the top, Y down the side):

            1   -1   -1   -1
   -1:     -1    1    1    1
    1:      1   -1   -1   -1
    1:      1   -1   -1   -1
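That table is just the outer product of Y and X; a small sketch with no libraries (the function name is mine):

# Weight matrix for one X<->Y association as an outer product:
# row i, column j holds Y[i] * X[j].

def outer_product(y, x):
    return [[yi * xj for xj in x] for yi in y]

X = [1, -1, -1, -1]
Y = [-1, 1, 1]
for row in outer_product(Y, X):
    print(row)
# [-1, 1, 1, 1]
# [1, -1, -1, -1]
# [1, -1, -1, -1]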


To store two associations, [1,-1,-1,-1] -> [1,1,1] and [-1,-1,-1,1] -> [1,-1,1], add the two outer-product matrices:

    1  -1  -1  -1        -1  -1  -1   1         0  -2  -2   0
    1  -1  -1  -1    +    1   1   1  -1    =    2   0   0  -2
    1  -1  -1  -1        -1  -1  -1   1         0  -2  -2   0
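A sketch of the full procedure under the same conventions: build one outer-product matrix per stored pair, add them, then recall Y from X by multiplying the matrix by X and taking the sign of each component (helper names are mine; ties at zero are broken toward +1):

# Sum one outer-product matrix per stored (X, Y) pair, then recall with a sign threshold.

def outer_product(y, x):
    return [[yi * xj for xj in x] for yi in y]

def add_matrices(a, b):
    return [[p + q for p, q in zip(row_a, row_b)] for row_a, row_b in zip(a, b)]

def recall_y(weights, x):
    """Recover the Y vector associated with the input vector x."""
    return [1 if sum(w * xi for w, xi in zip(row, x)) >= 0 else -1 for row in weights]

pairs = [([1, -1, -1, -1], [1, 1, 1]),
         ([-1, -1, -1, 1], [1, -1, 1])]

weights = outer_product(pairs[0][1], pairs[0][0])
weights = add_matrices(weights, outer_product(pairs[1][1], pairs[1][0]))

print(weights)                                # [[0, -2, -2, 0], [2, 0, 0, -2], [0, -2, -2, 0]]
print(recall_y(weights, [1, -1, -1, -1]))     # [1, 1, 1]
print(recall_y(weights, [-1, -1, -1, 1]))     # [1, -1, 1]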


Autoassociative
Recurrent


To remember the pattern [1,-1,1,-1,1] (pattern across the top and down the side, each entry the product of the two):

            1   -1    1   -1    1
    1:      1   -1    1   -1    1
   -1:     -1    1   -1    1   -1
    1:      1   -1    1   -1    1
   -1:     -1    1   -1    1   -1
    1:      1   -1    1   -1    1
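A sketch of storing and recalling the pattern (the weight matrix is exactly the table above; recall multiplies the matrix by an input vector and thresholds with sign, and a noisy probe settles back to the stored pattern):

# Autoassociative recall: weights are the pattern's outer product with itself;
# recall takes the sign of (weights times the input vector).

def outer_product(y, x):
    return [[yi * xj for xj in x] for yi in y]

def recall(weights, v):
    return [1 if sum(w * vi for w, vi in zip(row, v)) >= 0 else -1 for row in weights]

pattern = [1, -1, 1, -1, 1]
weights = outer_product(pattern, pattern)

noisy = [1, 1, 1, -1, 1]          # second component flipped
print(recall(weights, noisy))      # [1, -1, 1, -1, 1] -- the stored pattern is recovered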


Demo


Complements of a vector also become attractors.

Ex. Installing [1,-1,1] means [-1,1,-1] is also “remembered”.
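Continuing the recall sketch above with this 3-component example: flipping every input component flips the sign of every weighted sum, so the complement is recalled just as strongly as the installed vector.

# The complement of an installed pattern behaves as a second stored memory.

def outer_product(y, x):
    return [[yi * xj for xj in x] for yi in y]

def recall(weights, v):
    return [1 if sum(w * vi for w, vi in zip(row, v)) >= 0 else -1 for row in weights]

pattern = [1, -1, 1]
weights = outer_product(pattern, pattern)
complement = [-p for p in pattern]

print(recall(weights, pattern))      # [1, -1, 1]  -- the installed vector
print(recall(weights, complement))   # [-1, 1, -1] -- its complement, also "remembered"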


Crosstalk – interference between the stored memories


George Christos, “Memory and Dreams”


Ralph E. Hoffman’s models of schizophrenia


Spurious Memories
