Feedback Networks and Associative Memories
虞台文 · Intelligent Multimedia Lab, Graduate Institute of Computer Science and Engineering, Tatung University
Contents

Introduction
Discrete Hopfield NNs
Continuous Hopfield NNs
Associative Memories
– Hopfield Memory
– Bidirectional Associative Memory
Feedback Networks and Associative Memories
Introduction
Feedforward/Feedback NNs
Feedforward NNs
– The connections between units do not form cycles.
– Usually produce a response to an input quickly.
– Most feedforward NNs can be trained using a wide variety of efficient algorithms.

Feedback or recurrent NNs
– There are cycles in the connections.
– In some feedback NNs, each time an input is presented, the NN must iterate for a potentially long time before it produces a response.
– Usually more difficult to train than feedforward NNs.
Supervised-Learning NNs
Feedforward NNs
– Perceptron
– Adaline, Madaline
– Backpropagation (BP)
– Artmap
– Learning Vector Quantization (LVQ)
– Probabilistic Neural Network (PNN)
– General Regression Neural Network (GRNN)

Feedback or recurrent NNs
– Brain-State-in-a-Box (BSB)
– Fuzzy Cognitive Map (FCM)
– Boltzmann Machine (BM)
– Backpropagation Through Time (BPTT)
Unsupervised-Learning NNs
Feedforward NNs
– Learning Matrix (LM)
– Sparse Distributed Associative Memory (SDM)
– Fuzzy Associative Memory (FAM)
– Counterpropagation (CPN)

Feedback or recurrent NNs
– Binary Adaptive Resonance Theory (ART1)
– Analog Adaptive Resonance Theory (ART2, ART2a)
– Discrete Hopfield (DH)
– Continuous Hopfield (CH)
– Discrete Bidirectional Associative Memory (BAM)
– Kohonen Self-Organizing Map / Topology-Preserving Map (SOM/TPM)
The Hopfield NNs
In 1982, Hopfield, a Caltech physicist, mathematically tied together many of the ideas from previous research.
A fully connected, symmetrically weighted network where each node functions both as input and output node.
Used for:
– Associative memories
– Combinatorial optimization
Associative Memories
An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns.
Two types of associative memory: autoassociative and heteroassociative.

Auto-association
– retrieves a previously stored pattern that most closely resembles the current pattern.

Hetero-association
– the retrieved pattern is, in general, different from the input pattern not only in content but possibly also in type and format.
Associative Memories

[Figure: auto-association maps a degraded "A" back to the stored "A"; hetero-association maps an input image (Niagara Falls) to a different associated pattern in memory.]
Optimization Problems
Costs are associated with the energy function of a Hopfield network.
– The cost must be expressible in quadratic form.

The Hopfield network settles into locally satisfactory solutions; it does not search over a set of candidate solutions.
It finds local optima, not necessarily global ones.
Feedback Networks and Associative Memories
Discrete Hopfield NNs
The Discrete Hopfield NNs

[Figure: a fully connected network of n neurons. Neuron i has output v_i and external input I_i, and is connected to every other neuron j through weight w_ij.]

The weights are symmetric, with no self-connections:

$$w_{ij} = w_{ji}, \qquad w_{ii} = 0$$
State Update Rule

Asynchronous mode: at each time step, exactly one neuron is selected and updated.

Update rule for the selected neuron i:

$$H_i(t) = \sum_{\substack{j=1 \\ j \ne i}}^{n} w_{ij}\, v_j(t) + I_i$$

$$v_i(t+1) = \mathrm{sgn}\big(H_i(t)\big) = \begin{cases} +1 & \text{if } H_i(t) \ge 0 \\ -1 & \text{if } H_i(t) < 0 \end{cases}$$

Stable?
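A minimal Python sketch of this asynchronous update (function names such as `hopfield_step` and `run_until_stable` are ours, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def hopfield_step(v, W, I, i):
    """Asynchronously update neuron i: v_i <- sgn(H_i), H_i = sum_j w_ij v_j + I_i."""
    H = W[i] @ v + I[i]            # since w_ii = 0, the j = i term contributes nothing
    v[i] = 1 if H >= 0 else -1     # threshold at zero, as in the update rule above
    return v

def run_until_stable(v, W, I, max_sweeps=100):
    """Repeat single-neuron updates until a full sweep changes nothing."""
    n = len(v)
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):   # asynchronous: one neuron at a time
            old = v[i]
            hopfield_step(v, W, I, i)
            changed |= bool(v[i] != old)
        if not changed:
            break                      # fixed point reached
    return v
```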
Energy Function

Define the energy of the network as

$$E = -\frac{1}{2}\sum_{i=1}^{n}\sum_{\substack{j=1 \\ j \ne i}}^{n} w_{ij}\, v_i v_j \;-\; \sum_{i=1}^{n} I_i v_i$$

Fact: E is lower (upper) bounded.
If E is monotonically decreasing (increasing), the system is stable.
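In code, the energy is a one-liner; because w_ii = 0, summing over all i, j is equivalent to restricting to i ≠ j (a sketch, with an illustrative `energy` name):

```python
import numpy as np

def energy(v, W, I):
    """E = -1/2 * sum_{i,j} w_ij v_i v_j - sum_i I_i v_i  (valid since w_ii = 0)."""
    return -0.5 * v @ W @ v - I @ v
```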
The Proof

Suppose that at time t + 1, the kth neuron is selected for update. Split the energy into terms that involve v_k and terms that do not:

$$E(t) = -\frac{1}{2}\sum_{i \ne k}\sum_{\substack{j \ne k \\ j \ne i}} w_{ij}\, v_i(t)\, v_j(t) - \sum_{i \ne k} I_i\, v_i(t) \;-\; \sum_{i \ne k} w_{ik}\, v_i(t)\, v_k(t) - I_k\, v_k(t)$$

$$E(t+1) = -\frac{1}{2}\sum_{i \ne k}\sum_{\substack{j \ne k \\ j \ne i}} w_{ij}\, v_i(t+1)\, v_j(t+1) - \sum_{i \ne k} I_i\, v_i(t+1) \;-\; \sum_{i \ne k} w_{ik}\, v_i(t+1)\, v_k(t+1) - I_k\, v_k(t+1)$$
The Proof

Only the kth neuron is updated, so v_i(t + 1) = v_i(t) for every i ≠ k: all terms of E that do not involve v_k are not changed at time t + 1 and cancel in the difference E(t + 1) − E(t).
The Proof

Using the symmetry w_ik = w_ki and v_i(t + 1) = v_i(t) for i ≠ k:

$$\Delta E = E(t+1) - E(t) = -\Big(\sum_{i \ne k} w_{ki}\, v_i(t) + I_k\Big)\big[v_k(t+1) - v_k(t)\big] = -H_k(t)\,\big[v_k(t+1) - v_k(t)\big]$$
The Proof

$$\Delta E = -H_k(t)\,\big[v_k(t+1) - v_k(t)\big]$$

  v_k(t)   v_k(t+1)   H_k(t)    ΔE
    1         1        ≥ 0       0
   −1        −1        < 0       0
   −1         1        ≥ 0      ≤ 0
    1        −1        < 0      < 0

In every case ΔE ≤ 0, so E never increases; since E is lower bounded, the network is stable.
Feedback Networks and Associative Memories
Continuous Hopfield NNs
The Neuron of Continuous Hopfield NNs

[Figure: neuron i as an electrical circuit with capacitance C_i and conductance g_i. The inputs v_1, …, v_n arrive through conductances w_i1, …, w_in, together with an external current I_i. The internal state is u_i; the output is v_i = a(u_i), where a is a monotonically increasing sigmoid with inverse u_i = a^{-1}(v_i).]
The Dynamics

Balancing the currents at the input node of neuron i (each input v_j drives a current w_ij(v_j − u_i) through its conductance):

$$C_i \frac{du_i}{dt} = \sum_{\substack{j=1 \\ j \ne i}}^{n} w_{ij}\,(v_j - u_i) - g_i u_i + I_i = \sum_{\substack{j=1 \\ j \ne i}}^{n} w_{ij}\, v_j - \Big(\sum_{\substack{j=1 \\ j \ne i}}^{n} w_{ij} + g_i\Big) u_i + I_i$$

Defining $G_i = \sum_{j \ne i} w_{ij} + g_i$,

$$C_i \frac{du_i}{dt} = \sum_{\substack{j=1 \\ j \ne i}}^{n} w_{ij}\, v_j - G_i u_i + I_i$$
The Continuous Hopfield NNs

[Figure: the full network of n such neurons, with outputs v_1, …, v_n fed back to every other neuron through the symmetric weights w_ij.]

Each neuron obeys

$$C_i \frac{du_i}{dt} = \sum_{\substack{j=1 \\ j \ne i}}^{n} w_{ij}\, v_j - G_i u_i + I_i, \qquad i = 1, 2, \ldots, n$$

Stable?
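A forward-Euler simulation sketch of these dynamics. The slides leave the activation unspecified; here we assume a(u) = tanh(λu), and the function name `simulate` is ours:

```python
import numpy as np

def simulate(W, G, C, I, u0, lam=5.0, dt=1e-3, steps=5000):
    """Forward-Euler integration of C_i du_i/dt = sum_j w_ij v_j - G_i u_i + I_i,
    with v_i = a(u_i) = tanh(lam * u_i) as the (assumed) sigmoidal activation."""
    u = u0.astype(float).copy()
    for _ in range(steps):
        v = np.tanh(lam * u)             # v_i = a(u_i)
        du = (W @ v - G * u + I) / C     # the circuit equation, per neuron (w_ii = 0)
        u += dt * du
    return np.tanh(lam * u)              # final outputs v_1, ..., v_n
```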
Equilibrium Points

Consider the autonomous system

$$\dot{\mathbf{y}}(t) = \mathbf{f}(\mathbf{y}), \qquad \begin{pmatrix} \dot{y}_1 \\ \dot{y}_2 \\ \vdots \\ \dot{y}_n \end{pmatrix} = \begin{pmatrix} f_1(\mathbf{y}) \\ f_2(\mathbf{y}) \\ \vdots \\ f_n(\mathbf{y}) \end{pmatrix}$$

Equilibrium points $\mathbf{y}^*$ satisfy

$$\mathbf{f}(\mathbf{y}^*) = \mathbf{0}$$
Lyapunov Theorem

The system $\dot{\mathbf{y}}(t) = \mathbf{f}(\mathbf{y})$ is asymptotically stable if there exists a positive-definite function $E(\mathbf{y})$ such that

$$\frac{dE(\mathbf{y})}{dt} < 0$$

Such an $E(\mathbf{y})$ is called an energy (Lyapunov) function.
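A one-line illustration (ours, not from the slides): for the linear system $\dot{\mathbf{y}} = -\mathbf{y}$, the positive-definite function $E(\mathbf{y}) = \tfrac{1}{2}\|\mathbf{y}\|^2$ satisfies

$$\frac{dE}{dt} = \mathbf{y}^T \dot{\mathbf{y}} = -\|\mathbf{y}\|^2 < 0 \quad \text{for } \mathbf{y} \ne \mathbf{0},$$

so the origin is asymptotically stable.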
Lyapunov Energy Function

For the continuous Hopfield network, define

$$E = -\frac{1}{2}\sum_{i=1}^{n}\sum_{\substack{j=1 \\ j \ne i}}^{n} w_{ij}\, v_i v_j + \sum_{i=1}^{n} G_i \int_0^{v_i} a^{-1}(v)\, dv - \sum_{i=1}^{n} I_i v_i$$
Lyapunov Energy Function

[Figure: the inverse activation u = a^{-1}(v) and the shaded area $\int_0^{v} a^{-1}(x)\,dx$ contributed by each neuron.]

Since the activation a is sigmoidal and strictly increasing, so is its inverse:

$$\frac{da^{-1}(v)}{dv} > 0$$
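If one assumes a(u) = tanh(λu), the integral term has a closed form, $\int_0^{v} \mathrm{artanh}(x)\,dx = v\,\mathrm{artanh}(v) + \tfrac{1}{2}\ln(1-v^2)$, and the Lyapunov energy can be evaluated directly (a sketch under that assumed activation; the name `lyapunov_energy` is ours):

```python
import numpy as np

def lyapunov_energy(v, W, G, I, lam=5.0):
    """E = -1/2 sum_ij w_ij v_i v_j + sum_i G_i * int_0^{v_i} a^{-1}(x) dx - sum_i I_i v_i,
    assuming a(u) = tanh(lam*u), so a^{-1}(v) = arctanh(v)/lam. Requires |v_i| < 1."""
    integral = (v * np.arctanh(v) + 0.5 * np.log1p(-v**2)) / lam
    return -0.5 * v @ W @ v + G @ integral - I @ v
```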
Stability of Continuous Hopfield NNs

Dynamics: $C_i \dfrac{du_i}{dt} = \sum_{j \ne i} w_{ij}\, v_j - G_i u_i + I_i$

Differentiate E along trajectories; by symmetry of the weights, $\dfrac{\partial E}{\partial v_i} = -\Big(\sum_{j \ne i} w_{ij}\, v_j - G_i u_i + I_i\Big)$, so

$$\frac{dE}{dt} = \sum_{i=1}^{n} \frac{\partial E}{\partial v_i}\frac{dv_i}{dt} = -\sum_{i=1}^{n} C_i \frac{du_i}{dt}\frac{dv_i}{dt}$$

With $u_i = a^{-1}(v_i)$, so that $\dfrac{du_i}{dt} = \dfrac{da^{-1}(v_i)}{dv_i}\dfrac{dv_i}{dt}$, this becomes

$$\frac{dE}{dt} = -\sum_{i=1}^{n} C_i \frac{da^{-1}(v_i)}{dv_i}\Big(\frac{dv_i}{dt}\Big)^2$$
Stability of Continuous Hopfield NNs

$$\frac{dE}{dt} = -\sum_{i=1}^{n} C_i \underbrace{\frac{da^{-1}(v_i)}{dv_i}}_{>\,0}\Big(\frac{dv_i}{dt}\Big)^2 \le 0$$

with equality only when $dv_i/dt = 0$ for all i. Hence E decreases monotonically along trajectories.
Stability of Continuous Hopfield NNs

[Figure: the full network, each neuron obeying $C_i\, du_i/dt = \sum_{j \ne i} w_{ij}\, v_j - G_i u_i + I_i$.]

Since E is bounded below and monotonically decreasing, the continuous Hopfield network is stable.
Feedback Networks and Associative Memories
Associative Memories
Associative Memories
Also called content-addressable memory.

Autoassociative Memory
– Hopfield Memory

Heteroassociative Memory
– Bidirectional Associative Memory (BAM)
Associative Memories

Stored patterns:

$$(\mathbf{x}^1, \mathbf{y}^1),\ (\mathbf{x}^2, \mathbf{y}^2),\ \ldots,\ (\mathbf{x}^p, \mathbf{y}^p), \qquad \mathbf{x}^i \in R^n,\ \mathbf{y}^i \in R^m$$

Autoassociative: $\mathbf{y}^i = \mathbf{x}^i$
Heteroassociative: $\mathbf{y}^i \ne \mathbf{x}^i$
Feedback Networks and Associative Memories
Associative Memories
– Hopfield Memory
– Bidirectional Memory
Hopfield Memory

– 12 × 10 neurons
– Fully connected
– 14,400 weights
Example

[Figure: three recall examples. A corrupted or partial version of a stored 12 × 10 pattern is presented; the memory retrieves the matching stored pattern.]

How to store patterns?
The Storage Algorithm

Suppose the stored patterns are of dimension n. How should the weight matrix be chosen?

$$\mathbf{W} = \begin{pmatrix} w_{11} & w_{12} & \cdots & w_{1n} \\ w_{21} & w_{22} & \cdots & w_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ w_{n1} & w_{n2} & \cdots & w_{nn} \end{pmatrix} = \;?$$
The Storage Algorithm

$$\mathbf{W} = \sum_{k=1}^{p} \mathbf{x}^k (\mathbf{x}^k)^T - p\,\mathbf{I}$$

where

$$\mathbf{x}^k = (x_1^k, x_2^k, \ldots, x_n^k)^T, \quad k = 1, 2, \ldots, p, \qquad x_i^k \in \{-1, +1\}, \quad i = 1, 2, \ldots, n$$

Equivalently,

$$w_{ij} = \begin{cases} \displaystyle\sum_{k=1}^{p} x_i^k x_j^k & i \ne j \\[2pt] 0 & i = j \end{cases}$$
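A sketch of this outer-product (Hebbian) storage rule in Python (`store` is an illustrative name):

```python
import numpy as np

def store(patterns):
    """W = sum_k x^k (x^k)^T - p*I, for `patterns` given as a (p, n) array of +/-1."""
    X = np.asarray(patterns)
    p, n = X.shape
    # Subtracting p*I zeroes the diagonal, since each x_i^k squared equals 1.
    return X.T @ X - p * np.eye(n)
```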
Analysis

Suppose that $\mathbf{x} = \mathbf{x}^i$, one of the stored patterns. Using $(\mathbf{x}^i)^T\mathbf{x}^i = n$,

$$\mathbf{W}\mathbf{x}^i = \Big(\sum_{k=1}^{p} \mathbf{x}^k (\mathbf{x}^k)^T - p\,\mathbf{I}\Big)\mathbf{x}^i = n\,\mathbf{x}^i - p\,\mathbf{x}^i + \sum_{k \ne i} \mathbf{x}^k \big((\mathbf{x}^k)^T \mathbf{x}^i\big)$$

If the stored patterns are mutually (nearly) orthogonal, the cross-terms vanish and $\mathbf{W}\mathbf{x}^i = (n - p)\,\mathbf{x}^i$. Since n > p, $\mathrm{sgn}(\mathbf{W}\mathbf{x}^i) = \mathbf{x}^i$: each stored pattern is a stable state.
Example

$$\mathbf{x}^1 = (1, -1, -1, 1)^T, \qquad \mathbf{x}^2 = (-1, 1, -1, 1)^T$$

$$\mathbf{x}^1(\mathbf{x}^1)^T = \begin{pmatrix} 1 & -1 & -1 & 1 \\ -1 & 1 & 1 & -1 \\ -1 & 1 & 1 & -1 \\ 1 & -1 & -1 & 1 \end{pmatrix}, \qquad \mathbf{x}^2(\mathbf{x}^2)^T = \begin{pmatrix} 1 & -1 & 1 & -1 \\ -1 & 1 & -1 & 1 \\ 1 & -1 & 1 & -1 \\ -1 & 1 & -1 & 1 \end{pmatrix}$$

$$\mathbf{x}^1(\mathbf{x}^1)^T + \mathbf{x}^2(\mathbf{x}^2)^T = \begin{pmatrix} 2 & -2 & 0 & 0 \\ -2 & 2 & 0 & 0 \\ 0 & 0 & 2 & -2 \\ 0 & 0 & -2 & 2 \end{pmatrix}$$

$$\mathbf{W} = \sum_{k=1}^{2} \mathbf{x}^k(\mathbf{x}^k)^T - 2\,\mathbf{I} = \begin{pmatrix} 0 & -2 & 0 & 0 \\ -2 & 0 & 0 & 0 \\ 0 & 0 & 0 & -2 \\ 0 & 0 & -2 & 0 \end{pmatrix}$$
Example (cont.)

With this W and no external inputs (I = 0), the energy is

$$E(\mathbf{x}) = -\frac{1}{2}\sum_{i=1}^{n}\sum_{\substack{j=1 \\ j \ne i}}^{n} w_{ij}\, x_i x_j - \sum_{i=1}^{n} I_i x_i = -\frac{1}{2}\,\mathbf{x}^T\mathbf{W}\mathbf{x} = 2\,(x_1 x_2 + x_3 x_4)$$
Example (cont.)

$$E(\mathbf{x}) = 2\,(x_1 x_2 + x_3 x_4)$$

[Figure: the 16 states of the network arranged by energy level E ∈ {−4, 0, 4}, with arrows showing asynchronous transitions flowing downhill.]

The stored patterns x¹, x² and their complements are the four states with minimum energy E = −4; they are stable. All other states have E = 0 or E = 4 and eventually transition to one of the stable states.
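A quick numerical check of this example (an illustrative script, not from the slides):

```python
import numpy as np

x1 = np.array([ 1, -1, -1, 1])
x2 = np.array([-1,  1, -1, 1])
W = np.outer(x1, x1) + np.outer(x2, x2) - 2 * np.eye(4)

def energy(x):
    return -0.5 * x @ W @ x            # no external inputs in this example

print(W)                               # matches the reconstructed W above
print(energy(x1), energy(x2))          # both stored patterns sit at the minimum E = -4
print(energy(-x1), energy(-x2))        # the complements are equally deep minima
```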
Problems of Hopfield Memory
– Complement memories
– Spurious stable states
– Capacity
Capacity of Hopfield Memory

The number of storable patterns w.r.t. the size of the network.

Study methods:
– Experiments
– Probability
– Information theory
– Radius of attraction (ρ)
Capacity of Hopfield Memory

Hopfield (1982) demonstrated that the maximum number of patterns that can be stored in the Hopfield model of n nodes, before the error in the retrieved pattern becomes severe, is around 0.15n.

The memory capacity of the Hopfield model can be increased, as shown by Andrecut (1972).

With radius of attraction ρ (0 ≤ ρ ≤ 1/2), the capacity c for retrieval free of false memories is bounded by

$$c \le \frac{(1 - 2\rho)^2\, n}{4 \ln n}$$
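As a rough sanity check (ours, assuming the earlier example indeed uses 12 × 10 = 120 neurons): taking ρ = 0 gives

$$c \le \frac{n}{4\ln n} = \frac{120}{4\ln 120} \approx \frac{120}{19.2} \approx 6 \text{ patterns},$$

while the 0.15n rule of thumb gives about 18.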
Feedback Networks and Associative Memories
Associative Memories
– Hopfield Memory
– Bidirectional Memory
Bidirectional Memory

[Figure: two layers. The X layer holds x_1, …, x_m; the Y layer holds y_1, …, y_n; the layers are fully interconnected through the weight matrix W = [w_ij]_{n×m}.]

Forward pass: $\mathbf{y}(t+1) = a\big(\mathbf{W}\,\mathbf{x}(t)\big)$

Backward pass: $\mathbf{x}(t+1) = a\big(\mathbf{W}^T\mathbf{y}(t)\big)$
Stored Patterns

$$(\mathbf{x}^1, \mathbf{y}^1),\ (\mathbf{x}^2, \mathbf{y}^2),\ \ldots,\ (\mathbf{x}^p, \mathbf{y}^p)$$

$$\mathbf{W} = \;?$$
The Storage Algorithm

Stored patterns:

$$(\mathbf{x}^1, \mathbf{y}^1),\ (\mathbf{x}^2, \mathbf{y}^2),\ \ldots,\ (\mathbf{x}^p, \mathbf{y}^p)$$

with

$$\mathbf{x}^k = (x_1^k, \ldots, x_m^k)^T, \quad x_i^k \in \{-1, 1\}, \qquad \mathbf{y}^k = (y_1^k, \ldots, y_n^k)^T, \quad y_i^k \in \{-1, 1\}$$

$$\mathbf{W} = \sum_{k=1}^{p} \mathbf{y}^k (\mathbf{x}^k)^T, \qquad w_{ij} = \sum_{k=1}^{p} y_i^k x_j^k$$
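A compact BAM sketch combining this storage rule with the forward/backward recall passes. The slides do not pin down the activation at zero; here we assume a bipolar threshold with a(0) = +1, and the function names are ours:

```python
import numpy as np

def bam_store(X, Y):
    """W = sum_k y^k (x^k)^T, i.e. w_ij = sum_k y_i^k x_j^k.  X: (p, m), Y: (p, n)."""
    return Y.T @ X

def bam_recall(W, x, iters=20):
    """Alternate y <- a(Wx) and x <- a(W^T y) until (x, y) stops changing."""
    a = lambda z: np.where(z >= 0, 1, -1)    # assumed bipolar threshold activation
    for _ in range(iters):
        y = a(W @ x)          # forward pass
        x_new = a(W.T @ y)    # backward pass
        if np.array_equal(x_new, x):
            break             # (x, y) is a fixed point
        x = x_new
    return x, y
```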
Analysis

Suppose $\mathbf{x}^{k'}$ is one of the stored vectors. Using $(\mathbf{x}^{k'})^T\mathbf{x}^{k'} = m$,

$$a\big(\mathbf{W}\mathbf{x}^{k'}\big) = a\Big(\sum_{k=1}^{p} \mathbf{y}^k (\mathbf{x}^k)^T \mathbf{x}^{k'}\Big) = a\Big(m\,\mathbf{y}^{k'} + \sum_{k \ne k'} \mathbf{y}^k (\mathbf{x}^k)^T \mathbf{x}^{k'}\Big)$$

If the cross-terms are negligible, i.e. $(\mathbf{x}^k)^T\mathbf{x}^{k'} \approx 0$ for $k \ne k'$, then

$$a\big(\mathbf{W}\mathbf{x}^{k'}\big) = a\big(m\,\mathbf{y}^{k'}\big) = \mathbf{y}^{k'}$$
Energy Function

$$E(\mathbf{x}, \mathbf{y}) = -\tfrac{1}{2}\,\mathbf{x}^T\mathbf{W}^T\mathbf{y} - \tfrac{1}{2}\,\mathbf{y}^T\mathbf{W}\mathbf{x} = -\mathbf{y}^T\mathbf{W}\mathbf{x}$$