Associative Memory Neural Nets
Map user-selected vectors x1, x2, …, xl into user-selected vectors y1, y2, …, yl
Autoassociative class (y vectors = corresponding x vectors)
Heteroassociative class (y vectors ≠ corresponding x vectors)
Bidirectional Associative Memory (BAM)
Classification of ANN Paradigms
[Figure: the Bidirectional Associative Memory's place among ANN paradigms]
Bart Kosko
Associate Professor of Electrical Engineering
Bart Kosko received bachelor's degrees in Economics and Philosophy from the University of Southern California, a master's degree in Applied Mathematics from the University of California, San Diego, and the Ph.D. degree in Electrical Engineering from the University of California, Irvine.
Research Interests
Adaptive Systems
Fuzzy Theory
Neural Networks
Bio-Computing
Nonlinear Signal Processing
Intelligent Agents
Smart Materials
Stochastic Resonance
http://sipi.usc.edu/~kosko/
Bidirectional Associative Memory
Retrieve information from both directions:
Pattern X <-> BAM <-> Pattern Y
Applications
Associative Memorization
Pattern Recognition
Noise Suppression
Kosko BAM
Kosko BAM: Architecture
Laurene Fausett, Fundamentals of Neural Networks, Prentice Hall
n units in its X-layer
m units in its Y-layer
Weights are found from the sum of the outer products of the bipolar form of the training vector pairs (W = XᵀY)
Activation function is a step function
The connections between the layers are bidirectional:
Weight matrix for signals sent from the X-layer to the Y-layer is W
Weight matrix for signals sent from the Y-layer to the X-layer is Wᵀ
Kosko BAM: Algorithm
Step 1: Compute the connection weights.
Let P be the total number of associated pattern pairs, and for each pattern pair p:
xp = (xp1, xp2, …, xpi, …, xpn)
yp = (yp1, yp2, …, ypj, …, ypm)
Then the connection weight from node i to node j is
wij = Σp (2xpi − 1)(2ypj − 1)   if the patterns are binary, xpi ∈ {0, 1}
or wij = Σp xpi ypj   if the patterns are bipolar, xpi ∈ {−1, +1}
or, in matrix form, W = XᵀY   for bipolar patterns
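For bipolar patterns, a minimal NumPy sketch of this weight computation (the name bam_weights and the pattern-per-row layout are assumptions of this illustration, not part of the original slides):

import numpy as np

def bam_weights(X, Y):
    """Kosko encoding: W is the sum over all pairs p of outer(x_p, y_p).

    X: shape (P, n), one bipolar x-pattern in {-1, +1} per row.
    Y: shape (P, m), one bipolar y-pattern in {-1, +1} per row.
    Returns W with shape (n, m); X.T @ Y equals the sum of outer products.
    """
    X = np.asarray(X)
    Y = np.asarray(Y)
    return X.T @ Y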
Algorithm (Cont)
Step 2: For each testing input
A) Present input pattern x to the X-layer
B) Present input pattern y to the Y-layer
Step 3: Iterate (update outputs) until convergence.
Update activations of units in the Y-layer:
yj(t+1) = F[ Σi=1..n wij xi(t) ],   j = 1 to m
or Y(t+1) = F[X(t)·W]
where F[e] = +1 if e ≥ 0
or = −1 (0 for binary patterns) if e < 0
Algorithm (Cont)
Update activations of units in the X-layer:
xi(t+1) = F[ Σj=1..m wij yj(t) ],   i = 1 to n
or X(t+1) = F[Y(t)·Wᵀ]
where F[e] = +1 if e ≥ 0
or = −1 (0 for binary patterns) if e < 0
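A hedged sketch of this recall iteration for an input presented to the X-layer (the Y-side entry of Step 2 is omitted for brevity); F(0) = +1 is the tie-breaking rule the worked example below appears to use, and bam_recall / max_iters are illustrative names:

def bam_recall(W, x, max_iters=100):
    """Iterate Y(t+1) = F[X(t) W], X(t+1) = F[Y(t) W^T] until both stabilize."""
    F = lambda e: np.where(e >= 0, 1, -1)  # bipolar step function, F(0) = +1
    x = np.asarray(x)
    y = F(x @ W)                           # first Y-layer update
    for _ in range(max_iters):
        x_new = F(y @ W.T)                 # X-layer update
        y_new = F(x_new @ W)               # Y-layer update
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break                          # converged: a bidirectionally stable pair
        x, y = x_new, y_new
    return x, y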
Kosko BAM: an example
Suppose a heteroassociative net is to be trained to store/recall the following mapping from input row vectors x = (x1, x2, x3, x4, x5, x6) to output row vectors y = (y1, y2, y3, y4):
Pattern x1,x2,x3,x4,x5,x6 y1,y2,y3,y4
1st (1 -1 1 -1 1 -1) (1 1 -1 -1)
2nd (1 1 1 -1 -1 -1) (1 -1 1 -1)
[Network diagram: X-layer units X1–X6 fully interconnected with Y-layer units Y1–Y4]
Weights calculation: Outer Product

W = XᵀY

W =
[ +1 +1 ]
[ -1 +1 ]
[ +1 +1 ]   .   [ +1 +1 -1 -1 ]
[ -1 -1 ]       [ +1 -1 +1 -1 ]
[ +1 -1 ]
[ -1 -1 ]

W =
[  2  0  0 -2 ]     [ w11 w12 w13 w14 ]
[  0 -2  2  0 ]     [ w21 w22 w23 w24 ]
[  2  0  0 -2 ]  =  [ w31 w32 w33 w34 ]
[ -2  0  0  2 ]     [ w41 w42 w43 w44 ]
[  0  2 -2  0 ]     [ w51 w52 w53 w54 ]
[ -2  0  0  2 ]     [ w61 w62 w63 w64 ]
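As a check, feeding the two training pairs to the bam_weights sketch above reproduces this matrix:

X = np.array([[ 1, -1,  1, -1,  1, -1],
              [ 1,  1,  1, -1, -1, -1]])
Y = np.array([[ 1,  1, -1, -1],
              [ 1, -1,  1, -1]])
W = bam_weights(X, Y)
print(W)
# [[ 2  0  0 -2]
#  [ 0 -2  2  0]
#  [ 2  0  0 -2]
#  [-2  0  0  2]
#  [ 0  2 -2  0]
#  [-2  0  0  2]]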
Testing the net for x = (1 -1 -1 -1 1 -1)

Update activations of units in the Y-layer:

Y = F[xW] = F[ (1 -1 -1 -1 1 -1) · W ] = F[ (4 4 -4 -4) ] = (1 1 -1 -1)
Update activations of units in the X-layer:

X = F[yWᵀ] = F[ (1 1 -1 -1) · Wᵀ ],  where Wᵀ =
[  2  0  2 -2  0 -2 ]
[  0 -2  0  0  2  0 ]
[  0  2  0  0 -2  0 ]
[ -2  0 -2  2  0  2 ]

X = F[ (4 -4 4 -4 4 -4) ] = (1 -1 1 -1 1 -1)
Update activations of units in the Y-layer:

Y = F[xW] = F[ (1 -1 1 -1 1 -1) · W ] = F[ (8 4 -4 -8) ] = (1 1 -1 -1) = previous Y

Repeat until convergence.
Update activations of units in the X-layer:

X = F[yWᵀ] = F[ (1 1 -1 -1) · Wᵀ ] = F[ (4 -4 4 -4 4 -4) ] = (1 -1 1 -1 1 -1) = previous X

Results from recalling x = (1 -1 -1 -1 1 -1):
X = (1 -1 1 -1 1 -1) and Y = (1 1 -1 -1)
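The same recall can be reproduced with the bam_recall sketch above (the test x differs from the first stored pattern in a single bit):

x_noisy = np.array([1, -1, -1, -1, 1, -1])  # bit 3 flipped from (1 -1 1 -1 1 -1)
x_rec, y_rec = bam_recall(W, x_noisy)
print(x_rec)  # [ 1 -1  1 -1  1 -1]  -> the stored x of the 1st pair
print(y_rec)  # [ 1  1 -1 -1]        -> the stored y of the 1st pair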
Testing the net (2 bits different) for x = (-1 1 -1 -1 -1 -1)

Update activations of units in the Y-layer:

Y = F[xW] = F[ (-1 1 -1 -1 -1 -1) · W ] = F[ (0 -4 4 0) ] = (1 -1 1 1)
Update activations of units in the X-layer:

X = F[yWᵀ] = F[ (1 -1 1 1) · Wᵀ ] = F[ (0 4 0 0 -4 0) ] = (1 1 1 1 -1 1)
Update activations of units in the Y-layer:

Y = F[xW] = F[ (1 1 1 1 -1 1) · W ] = F[ (0 -4 4 0) ] = (1 -1 1 1) = previous Y

Repeat until convergence.
Update activations of units in the X-layer:

X = F[yWᵀ] = F[ (1 -1 1 1) · Wᵀ ] = F[ (0 4 0 0 -4 0) ] = (1 1 1 1 -1 1) = previous X

Stop: crosstalk!
Recalled X = (1 1 1 1 -1 1), but expected X = (1 1 1 -1 -1 -1)
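Running the same sketch on this two-bit-noisy input reproduces the failure: the net settles into a spurious state rather than the stored pair:

x_noisy2 = np.array([-1, 1, -1, -1, -1, -1])  # 2 bits flipped from (1 1 1 -1 -1 -1)
x_rec, y_rec = bam_recall(W, x_noisy2)
print(x_rec)  # [ 1  1  1  1 -1  1]  -> crosstalk, not the expected [ 1  1  1 -1 -1 -1]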
Kosko BAM
Advantages
Fast training and recall
Straightforward operations
Bidirectional data flow

Limitations
Performance (limited storage capacity; crosstalk on noisy inputs, as in the example above)
Research in BAM
Multiple Training Algorithm (Wang et al., 1991)
Dummy Augmentation (Wang et al., 1990)
Indirect Generalized Inverse BAM (Li & Nutter, 1991)
Multiple Training Algorithm
[Figure: Training pairs for the bidirectional associative memory]
[Figure: Test results using Kosko's encoding method]
The sequential multiple training algorithm applies multiple training sequentially to those pairs which cannot be recalled correctly, as sketched in the code after the flowchart:

[Flowchart: Start → form W0 → i = 1 → recall Pi → if recall fails, train Pi and re-form W, then recall Pi again → once Pi is recalled, i = i + 1 → repeat until i = N → End]
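A minimal sketch of this loop, under the assumption that "train Pi once more" means adding another copy of that pair's outer product to W (the multiplicity idea of Wang et al., 1991); multiple_training and max_rounds are illustrative names, and bam_recall is the sketch from the algorithm section:

def multiple_training(X, Y, max_rounds=100):
    """Sequentially re-train pairs that the current W fails to recall."""
    W = X.T @ Y                              # Kosko's initial encoding W0
    for _ in range(max_rounds):
        all_recalled = True
        for x, y in zip(X, Y):
            x_rec, y_rec = bam_recall(W, x)  # try to recall pair Pi from its x side
            if not (np.array_equal(x_rec, x) and np.array_equal(y_rec, y)):
                W = W + np.outer(x, y)       # train Pi once more and re-form W
                all_recalled = False
        if all_recalled:
            return W                         # every pair is now a stable fixed point
    return W                                 # may not stabilize for every pattern set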
[Figure: Test results using the multiple training method]
[Figure: Test results using the multiple training method with noise present]
Dummy augmentation method
[Figure: Training pairs for the dummy augmentation method]
[Figure: A sequential test for dummy augmentation]
[Figure: A test for dummy augmentation]