Lecture 23: Associative memory & Hopfield Networks

Page 1

Lecture 23: Associative memory & Hopfield Networks

Page 2

Feedforward/Feedback NNs

• Feedforward NNs
▫ The connections between units do not form cycles.
▫ Usually produce a response to an input quickly.
▫ Most feedforward NNs can be trained using a wide variety of efficient algorithms.
• Feedback or recurrent NNs
▫ There are cycles in the connections.
▫ In some feedback NNs, each time an input is presented, the NN must iterate for a potentially long time before it produces a response.
▫ Usually more difficult to train than feedforward NNs.

Page 3

Supervised-Learning NNs

• Feedforward NNs
▫ Perceptron
▫ Adaline, Madaline
▫ Backpropagation (BP)
▫ Artmap
▫ Learning Vector Quantization (LVQ)
▫ Probabilistic Neural Network (PNN)
▫ General Regression Neural Network (GRNN)
• Feedback or recurrent NNs
▫ Brain-State-in-a-Box (BSB)
▫ Fuzzy Cognitive Map (FCM)
▫ Boltzmann Machine (BM)
▫ Backpropagation Through Time (BPTT)

Page 4

Unsupervised-Learning NNs

• Feedforward NNs
▫ Learning Matrix (LM)
▫ Sparse Distributed Associative Memory (SDM)
▫ Fuzzy Associative Memory (FAM)
▫ Counterpropagation (CPN)
• Feedback or recurrent NNs
▫ Binary Adaptive Resonance Theory (ART1)
▫ Analog Adaptive Resonance Theory (ART2, ART2a)
▫ Discrete Hopfield (DH)
▫ Continuous Hopfield (CH)
▫ Discrete Bidirectional Associative Memory (BAM)

Page 5

Neural Networks with Temporal Behavior

• Inclusion of feedback gives temporal characteristics to neural networks: recurrent networks.
• Two ways to add feedback:
▫ Local feedback
▫ Global feedback
• Recurrent networks can become unstable or stable.
• Main interest is in a recurrent network's stability: neurodynamics.
• Stability is a property of the whole system: coordination between parts is necessary.

Page 6

The Hopfield NNs

• In 1982, Hopfield, a Caltech physicist, mathematically tied together many of the ideas from previous research.
• A fully connected, symmetrically weighted network where each node functions both as an input and an output node.
• Used for
▫ Associative memories
▫ Combinatorial optimization

Page 7

Associative Memories

• An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns.
• Two types of associative memory: autoassociative and heteroassociative.
• Auto-association
▫ retrieves a previously stored pattern that most closely resembles the current pattern.
• Hetero-association
▫ the retrieved pattern is, in general, different from the input pattern not only in content but possibly also in type and format.

Page 8

Associative Memories

[Figure: auto-association — a degraded pattern presented to the AA memory is mapped back to the stored pattern itself; hetero-association — an input pattern is mapped to a different associated pattern (example involving "Tehran" and "City").]

Page 9

Linear associative memory

• The linear associator is one of the simplest and first-studied associative memory models.
• A feedforward-type network where the output is produced in a single feedforward computation.
• The inputs are connected to the outputs via the connection weight matrix W = [w_ij]_{m×n}.
• It is W that stores the N different associated pattern pairs {(X_k, Y_k) | k = 1, 2, ..., N}, where the inputs and outputs are either -1 or +1.

Page 10

Linear associative memory

• Building an associative memory is constructing W such that when an input pattern is presented, the stored pattern associated with that input pattern is retrieved (encoding).
• W_k for a particular associated pattern pair (X_k, Y_k) is computed as

(w_ij)_k = (x_i)_k (y_j)_k

• Then

W = a Σ_{k=1}^{N} W_k

• a is a proportionality or normalizing constant that prevents the synaptic values from growing too large when there are a number of associated pattern pairs to be memorized; usually a = 1/N.
• The connection weight matrix construction above simultaneously stores or remembers N different associated pattern pairs in a distributed manner.
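A minimal NumPy sketch of this encode/decode scheme (the function names and the toy patterns below are illustrative, not from the slides):

```python
import numpy as np

def encode(X, Y):
    """W = a * sum_k outer(X_k, Y_k) with a = 1/N, as on this slide.

    X: (N, m) bipolar input patterns; Y: (N, n) bipolar output patterns."""
    N = len(X)
    return sum(np.outer(x, y) for x, y in zip(X, Y)) / N

def decode(W, x):
    """Recall: threshold the net input to each output unit at zero."""
    return np.where(x @ W >= 0, 1, -1)

X = np.array([[1, -1,  1, -1],
              [1,  1, -1, -1]])
Y = np.array([[1, -1],
              [-1, 1]])
W = encode(X, Y)
print(decode(W, X[0]))  # -> [ 1 -1], i.e., Y[0] is recovered
```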

Page 11

Linear associative memory

• After encoding or memorization, the network can be used for retrieval or decoding (mapping a stimulus X to its stored associate Y).
• Given a stimulus input pattern X, decoding or recollection is accomplished by computing the net input to the output units using the previous formula.
• Then y_j (-1 or +1) can be computed by thresholding at neuron j.
• The input pattern may contain errors and noise, or may be an incomplete version of some previously encoded pattern.
• Nevertheless, when presented with such a corrupted input pattern, the network will retrieve the stored pattern that is closest to the actual input pattern.
• So the model is robust and fault tolerant, i.e., the presence of noise or errors results only in a mere decrease rather than total degradation of the performance of the network.
• Associative memories being robust and fault tolerant is a byproduct of having a number of processing elements performing highly parallel and distributed computations.

Page 12

Linear associative memory

• Traditional measures of associative memory performance are its memory capacity and content-addressability.
• Memory capacity refers to the maximum number of associated pattern pairs that can be stored and correctly retrieved.
• Content-addressability is the ability of the network to retrieve the correct stored pattern.
• It can be shown that using Hebb's learning rule in building the connection weight matrix of an associative memory yields a significantly low memory capacity.
• Due to the limitation brought about by using Hebb's learning rule, several modifications and variants have been proposed to maximize the memory capacity.

Page 13

Intuition

• Manipulation of attractors as a recurrent neural network paradigm.
• We can identify attractors with computational objects.
• In order to do so, we must exercise control over the location of the attractors in the state space of the system.
• A learning algorithm will manipulate the equations governing the dynamical behavior so that a desired location of attractors is set.
• One good way to do this is to use the energy-minimization paradigm (e.g., by Hopfield).

Page 14

Intuition

• N units with full connection among every node (no self-feedback).
• Given M input patterns, each having the same dimensionality as the network, they can be memorized in attractors of the network.
• Starting with an initial pattern, the dynamics will converge toward the attractor of the basin of attraction where the initial pattern was placed.

Page 15

Example of using Hopfield NNs

Page 16

Additive model of a neuron

• Low input resistance
• Unity current gain
• High output resistance

Page 17

Additive model of a neuron

• The total current flowing toward the input node of the nonlinear element is:

Σ_{i=1}^{N} w_ji x_i(t) + I_j

• The total current flowing away from the input node of the nonlinear element is:

v_j(t)/R_j + C_j dv_j(t)/dt

• By applying Kirchhoff's current law, we get

C_j dv_j(t)/dt + v_j(t)/R_j = Σ_{i=1}^{N} w_ji x_i(t) + I_j

• Given the induced local field v_j(t), we may determine the output of neuron j by using the nonlinear relation

x_j(t) = φ(v_j(t))

Page 18

Additive model of a neuron

• So, ignoring interneuron propagation time delays, we may define the dynamics of the network by the following system of coupled first-order ODEs:

C_j dv_j(t)/dt = -v_j(t)/R_j + Σ_{i=1}^{N} w_ji x_i(t) + I_j ; j = 1,...,N
x_j(t) = φ(v_j(t))

For example: φ(v_j(t)) = 1/(1 + exp(-v_j(t)))

• This can be converted to:

dx_j(t)/dt = -x_j(t)/R_j + φ(Σ_{i=1}^{N} w_ji x_i(t)) + K_j
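To make the dynamics concrete, here is a small forward-Euler simulation of these coupled ODEs (a sketch; the 3-neuron network, parameters, and step size are illustrative assumptions):

```python
import numpy as np

def simulate_additive(W, I, R=1.0, C=1.0, dt=0.01, steps=2000):
    """Integrate C dv_j/dt = -v_j/R + sum_i w_ji x_i + I_j with x = phi(v),
    phi the logistic sigmoid from this slide, by forward Euler."""
    phi = lambda v: 1.0 / (1.0 + np.exp(-v))
    v = np.zeros(len(I))
    for _ in range(steps):
        x = phi(v)
        v = v + dt * (-v / R + W @ x + I) / C
    return v, phi(v)

# illustrative 3-neuron network with symmetric weights
W = np.array([[ 0.0, 1.0, -1.0],
              [ 1.0, 0.0,  0.5],
              [-1.0, 0.5,  0.0]])
I = np.array([0.1, -0.2, 0.0])
v, x = simulate_additive(W, I)   # v settles to an equilibrium
```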

Page 19

Hopfield Model

• The Hopfield network (model) consists of a set of neurons and a corresponding set of unit delays, forming a multiple-loop feedback system.
• The number of feedback loops is equal to the number of neurons.
• The output of each neuron is fed back, via a unit delay element, to each of the other neurons in the network.
▫ i.e., there is no self-feedback in the network.

Page 20

Hopfield Architecture

Page 21

Hopfield Model: learning

• Let t_1, t_2, ..., t_M denote a known set of N-dimensional fundamental memories.
• The weights are computed using Hebb's rule:

w_ji = (1/M) Σ_{q=1}^{M} t_{q,j} t_{q,i} ; j ≠ i
w_ji = 0 ; j = i

t_{q,i}: the i-th element of t_q

• W becomes an N×N matrix.
• The elements of vector t_q are equal to -1 or +1.
• Once computed, the synaptic weights are kept fixed.
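A direct NumPy implementation of this storage rule (a sketch; `T` holds the fundamental memories as rows):

```python
import numpy as np

def store(T):
    """w_ji = (1/M) sum_q t_{q,j} t_{q,i}, with zero diagonal.

    T: (M, N) array of bipolar (+/-1) fundamental memories."""
    M = T.shape[0]
    W = (T.T @ T) / M            # (1/M) sum_q outer(t_q, t_q)
    np.fill_diagonal(W, 0.0)     # w_ji = 0 for j = i (no self-feedback)
    return W
```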

Page 22

Hopfield Model: initialization

• Let t_probe denote an unknown N-dimensional probe input vector presented to the network.
• The algorithm is initialized by setting

x_j(0) = t_{j,probe} ; j = 1,...,N

• where x_j(0) is the state of neuron j at time n = 0, and t_{j,probe} is the j-th element of the vector t_probe.

Page 23

Hopfield Model: iteration

• The states of the neurons are iterated asynchronously (i.e., randomly and one at a time), using a difference equation for discrete time and differential equations for continuous time, until convergence. For discrete time it is

x_j(n + 1) = sgn [Σ_{i=1}^{N} w_ji x_i(n)] ; j = 1,2,...,N

• The convergence of the above rule is guaranteed (we will see why!)
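An asynchronous retrieval loop matching this update (a sketch; the stopping test — a full sweep with no sign flips — is one common convention):

```python
import numpy as np

def retrieve(W, probe, seed=0):
    """Update randomly chosen neurons with x_j <- sgn(sum_i w_ji x_i)
    until a complete sweep leaves the state unchanged (a fixed point)."""
    rng = np.random.default_rng(seed)
    x = probe.copy()
    while True:
        changed = False
        for j in rng.permutation(len(x)):
            s = 1 if W[j] @ x >= 0 else -1
            if s != x[j]:
                x[j], changed = s, True
        if not changed:
            return x
```

Combined with the `store` sketch above, `retrieve(store(T), noisy_probe)` recovers the nearest fundamental memory when the load is low.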

Page 24

Hopfield Model: Outputting

• Let X_fixed denote the fixed point (stable state) computed at the end of the previous step.
• The resulting output vector y of the network is set as y = X_fixed.
• Step 1 is the storage phase, while the last three steps constitute the retrieval phase.

Page 25

The Discrete Hopfield NNs

[Figure: n fully interconnected units 1,...,n with weights w_ij between every pair, external inputs I_1,...,I_n, and outputs v_1,...,v_n.]

Page 26

The Discrete Hopfield NNs

[Same network figure, with the weight constraints:]

w_ij = w_ji
w_ii = 0

Page 27

The Discrete Hopfield NNs

[Same figure as the previous slide: symmetric weights w_ij = w_ji and no self-connections, w_ii = 0.]

Page 28

State Update Rule

• Asynchronous mode: one unit is updated at a time.
• Update rule:

H_i(t+1) = Σ_{j≠i} w_ij v_j(t) + I_i

v_i(t+1) = sgn(H_i(t+1)) = { +1 if H_i(t+1) ≥ 0 ; -1 if H_i(t+1) < 0 }

Page 29

Energy Function

H_i(t+1) = Σ_{j≠i} w_ij v_j(t) + I_i

E = -(1/2) Σ_{i=1}^{n} Σ_{j≠i} w_ij v_i v_j - Σ_{i=1}^{n} I_i v_i

If E is monotonically decreasing, the system is stable (due to the Lyapunov theorem).
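The energy is straightforward to evaluate, which makes the Lyapunov property easy to check empirically (a sketch; assumes the zero-diagonal, symmetric W from the storage phase):

```python
import numpy as np

def energy(W, v, I):
    """E = -(1/2) sum_i sum_{j != i} w_ij v_i v_j - sum_i I_i v_i.
    With a zero diagonal the full quadratic form equals the j != i sum."""
    return -0.5 * v @ W @ v - I @ v
```

Evaluating `energy` before and after each asynchronous flip never shows an increase — exactly the argument proved on the next slides.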

Page 30

The Proof

E = -(1/2) Σ_{i=1}^{n} Σ_{j≠i} w_ij v_i v_j - Σ_{i=1}^{n} I_i v_i

Suppose that at time t + 1 the k-th neuron is selected for update; the values of the other neurons are not changed at time t + 1. Splitting off the terms that involve v_k (using w_ij = w_ji):

E(t) = -(1/2) Σ_{i≠k} Σ_{j≠i,k} w_ij v_i(t) v_j(t) - Σ_{i≠k} I_i v_i(t) - Σ_{i≠k} w_ik v_i(t) v_k(t) - I_k v_k(t)

E(t+1) = -(1/2) Σ_{i≠k} Σ_{j≠i,k} w_ij v_i(t+1) v_j(t+1) - Σ_{i≠k} I_i v_i(t+1) - Σ_{i≠k} w_ik v_i(t+1) v_k(t+1) - I_k v_k(t+1)

Page 31

The Proof

Since v_i(t+1) = v_i(t) for all i ≠ k, subtracting the two expressions leaves only the terms involving the k-th neuron:

ΔE = E(t+1) - E(t) = -[Σ_{i≠k} w_ki v_i(t) + I_k][v_k(t+1) - v_k(t)]
   = -H_k(t+1)[v_k(t+1) - v_k(t)]

Page 32

The Proof

ΔE = E(t+1) - E(t) = -H_k(t+1)[v_k(t+1) - v_k(t)]

v_k(t)   v_k(t+1)   H_k(t+1)   ΔE
 -1        +1         ≥ 0       ≤ 0
 +1        -1         < 0       < 0
 -1        -1         < 0        0
 +1        +1         ≥ 0        0

In every case ΔE ≤ 0, so E never increases and the network converges to a stable state.

Page 33

The Neuron of Continuous Hopfield NNs

[Figure: a continuous neuron i — the inputs v_1,...,v_n enter through conductances w_i1,...,w_in, an external current I_i is injected, and a leaky integrator (capacitance C_i, conductance g_i) holds the internal state u_i; the output is v_i = a(u_i), where a is a sigmoid with inverse u_i = a^{-1}(v_i).]

Page 34

The Dynamics

• The current flowing into node i through conductance w_ij is w_ij (v_j - u_i), so

C_i du_i/dt = Σ_{j≠i} w_ij (v_j - u_i) - g_i u_i + I_i

• Collecting the terms in u_i:

C_i du_i/dt = Σ_{j≠i} w_ij v_j - (Σ_{j≠i} w_ij + g_i) u_i + I_i

• Defining G_i = g_i + Σ_{j≠i} w_ij, the dynamics become

C_i du_i/dt = Σ_{j≠i} w_ij v_j - G_i u_i + I_i ,  with v_j = a(u_j)

Page 35

The Continuous Hopfield NNs

C_i du_i/dt = Σ_{j≠i} w_ij v_j - G_i u_i + I_i ; i = 1,...,n

[Figure: the full network — n continuous neurons (C_i, g_i, u_i) coupled through the symmetric weights w_ij, with external inputs I_1,...,I_n and outputs v_1,...,v_n.]

Page 36

The Continuous Hopfield NNs

Writing the dynamics out for each neuron:

C_1 du_1/dt = Σ_{j≠1} w_1j v_j - G_1 u_1 + I_1
C_2 du_2/dt = Σ_{j≠2} w_2j v_j - G_2 u_2 + I_2
C_3 du_3/dt = Σ_{j≠3} w_3j v_j - G_3 u_3 + I_3
...
C_n du_n/dt = Σ_{j≠n} w_nj v_j - G_n u_n + I_n

Page 37

Lyapunov Energy Function

E = -(1/2) Σ_{i=1}^{n} Σ_{j≠i} w_ij v_i v_j + Σ_{i=1}^{n} G_i ∫_0^{v_i} a^{-1}(v) dv - Σ_{i=1}^{n} I_i v_i

Page 38

Lyapunov Energy Function

• The middle term uses the inverse activation u = a^{-1}(v):

∫_0^{v} a^{-1}(x) dx

[Figure: the sigmoid v_i = a(u_i), its inverse u_i = a^{-1}(v_i), and the shaded area representing the integral ∫_0^{v} a^{-1}(x) dx.]

Page 39

Stability of Continuous Hopfield NNs

Dynamics: C_i du_i/dt = Σ_{j≠i} w_ij v_j - G_i u_i + I_i

Energy: E = -(1/2) Σ_{i=1}^{n} Σ_{j≠i} w_ij v_i v_j + Σ_{i=1}^{n} G_i ∫_0^{v_i} a^{-1}(v) dv - Σ_{i=1}^{n} I_i v_i

Differentiating E along trajectories:

dE/dt = Σ_{i=1}^{n} (∂E/∂v_i)(dv_i/dt)
      = -Σ_{i=1}^{n} [Σ_{j≠i} w_ij v_j - G_i u_i + I_i] (dv_i/dt)
      = -Σ_{i=1}^{n} C_i (du_i/dt)(dv_i/dt)

With u_i = a^{-1}(v_i): du_i/dt = [d a^{-1}(v_i)/dv_i] (dv_i/dt), so

dE/dt = -Σ_{i=1}^{n} C_i [d a^{-1}(v_i)/dv_i] (dv_i/dt)²

Page 40

Stability of Continuous Hopfield NNs

dE/dt = -Σ_{i=1}^{n} C_i [d a^{-1}(v_i)/dv_i] (dv_i/dt)²

Since a^{-1}(v) is monotonically increasing, d a^{-1}(v)/dv > 0, and therefore

dE/dt ≤ 0

with equality only at a fixed point (dv_i/dt = 0 for all i): the network is stable.
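The same conclusion can be checked numerically: integrate the continuous dynamics with a(u) = tanh(u) and watch E(t) decrease (a sketch; the closed form of the integral term holds for the tanh inverse, and the two-neuron network is an illustrative assumption):

```python
import numpy as np

def continuous_energy_trace(W, I, G, C, u0, dt=1e-3, steps=4000):
    """Euler-integrate C_i du_i/dt = sum_{j!=i} w_ij v_j - G_i u_i + I_i with
    v = tanh(u), recording E(t).
    Uses int_0^v arctanh(x) dx = v*arctanh(v) + log(1 - v^2)/2."""
    u, E = u0.astype(float).copy(), []
    for _ in range(steps):
        v = np.tanh(u)
        integral = v * np.arctanh(v) + 0.5 * np.log(1.0 - v**2)
        E.append(-0.5 * v @ W @ v + G @ integral - I @ v)
        u += dt * (W @ v - G * u + I) / C
    return np.array(E)

W = np.array([[0., 1.], [1., 0.]])           # symmetric, zero diagonal
E = continuous_energy_trace(W, I=np.zeros(2), G=np.ones(2),
                            C=np.ones(2), u0=np.array([0.5, -0.3]))
print(bool(np.all(np.diff(E) <= 1e-8)))      # True: E is non-increasing
```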

Page 41

Stability of Continuous Hopfield NNs

[Figure: the network of Page 35 again, annotated with the per-neuron dynamics C_i du_i/dt = Σ_{j≠i} w_ij v_j - G_i u_i + I_i for i = 1,...,n.]

Page 42

Basins of Attraction

Page 43

Basins of Attraction

Page 44

Local/Global Minima

Energy Landscape

Page 45

Hopfield Model (Summary)

• The Hopfield network may be operated in a continuous mode or a discrete mode.
• The continuous mode of operation is based on an additive model.
• The discrete mode of operation is based on the McCulloch-Pitts model.
• We may establish the relationship between the stable states of the continuous and discrete Hopfield models with the following assumptions:
▫ The output of a neuron has the asymptotic values x_j = 1 when v_j → ∞ and x_j = -1 when v_j → -∞.
▫ The midpoint of the activation function of a neuron lies at the origin, that is, φ(0) = 0.
▫ Correspondingly, we may set the bias I_j equal to zero for all j.

Page 46

Hopfield Model

• If the gain of the activation function is very large (approaching infinity), the energy function E of the continuous Hopfield model reduces to that of the discrete model:

E = -(1/2) Σ_{i=1}^{N} Σ_{j≠i} w_ji x_i x_j

• The models try to minimize the above energy function.

Page 47

Hopfield Model: Storage

• The learning is similar to Hebbian learning:

w_ji = (1/M) Σ_{q=1}^{M} t_{q,j} t_{q,i}

• w_ji = 0 if i = j. (Learning is one-shot.)
• In matrix form the above becomes:

W = (1/M) Σ_{q=1}^{M} t_q t_q^T - I

(subtracting I zeroes the diagonal, since t_{q,i}² = 1)

• The resulting weight matrix W is symmetric: W = W^T

Page 48

Hopfield Model: Activation (Retrieval)

• Initialize the network with a probe pattern p_probe: x_j(0) = p_{j,probe}.
• Update the output of each neuron (picking them at random) as

x_j(n+1) = sgn [Σ_{i=1}^{N} w_ji x_i(n)]

• until x reaches a fixed point.
• The fixed point is the retrieved output.

Page 49

Storage Capacity of Hopfield Network

• Given a probe (unknown input) equal to the stored pattern t_ν, the activation of the j-th neuron can be decomposed into a signal term and a noise term (the unit hypercube is N-dimensional and we have M fundamental memories):

v_j = Σ_{i≠j} w_ji t_{ν,i} = ((N-1)/M) t_{ν,j}  [signal]  + (1/M) Σ_{q≠ν} t_{q,j} Σ_{i≠j} t_{q,i} t_{ν,i}  [noise]

Page 50

Storage Capacity of Hopfield Network

• The signal-to-noise ratio is obtained as ρ = N/M.
• It has been shown that memory recall of the Hopfield network deteriorates with an increasing load parameter α = M/N = 1/ρ, and breaks down at the critical value α_c = 0.14.
• Thus, M < 0.15N is required in order that the Hopfield network retrieve correctly.
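This breakdown is easy to see in a small experiment (a sketch; random bipolar memories, and "correct retrieval" here just means each stored pattern is a fixed point of one synchronous sgn(Wx) step):

```python
import numpy as np

def fraction_stable(N, M, trials=20, seed=1):
    """Average fraction of the M stored patterns that sgn(Wx) leaves unchanged."""
    rng = np.random.default_rng(seed)
    frac = 0.0
    for _ in range(trials):
        T = rng.choice([-1, 1], size=(M, N)).astype(float)
        W = (T.T @ T) / M
        np.fill_diagonal(W, 0.0)
        S = np.where(T @ W >= 0, 1, -1)      # one synchronous update per pattern
        frac += (S == T).all(axis=1).mean()
    return frac / trials

for M in (5, 10, 14, 20, 30):                # N = 100: load M/N crosses 0.14
    print(M, fraction_stable(100, M))
```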

Page 51

Cohen-Grossberg Theorem

• Cohen & Grossberg defined a general principle for assessing the stability of a certain class of neural networks described by the following equations (the Hopfield model is a special case):

du_j(t)/dt = a_j(u_j) [ b_j(u_j) - Σ_{i=1}^{N} c_ji φ_i(u_i) ] ; j = 1,...,N

• They proved that the energy function E described below can be considered as a Lyapunov function for the above system:

E = (1/2) Σ_{i=1}^{N} Σ_{j=1}^{N} c_ji φ_i(u_i) φ_j(u_j) - Σ_{j=1}^{N} ∫_0^{u_j} b_j(λ) φ'_j(λ) dλ

where φ'_j(λ) = (d/dλ) φ_j(λ)

Page 52

Associative Memories

• Also named content-addressable memory.
• Autoassociative Memory (Hopfield Memory)
• Heteroassociative Memory (Bidirectional Associative Memory)

Stored patterns: (x^1, y^1), (x^2, y^2), ..., (x^p, y^p), with x^i ∈ R^n and y^i ∈ R^m
▫ x^i = y^i : autoassociative
▫ x^i ≠ y^i : heteroassociative

Page 53

Hopfield Memory

[Figure: a fully connected Hopfield memory with 120 neurons and 14,400 weights; stored patterns are recalled from corrupted probes (memory association).]

Page 54

Hopfield Memory

[Same figure as the previous slide: stored patterns and a further example of memory association.]

Page 55

The Storage Algorithm

Suppose that the set of stored patterns is of dimension n:

x^k = (x_1^k, x_2^k, ..., x_n^k)^T , k = 1, 2, ..., p ;  x_i^k ∈ {-1, +1}

The weight matrix W = [w_ij]_{n×n} is built as

W = Σ_{k=1}^{p} ( x^k (x^k)^T - I )

i.e.,

w_ij = Σ_{k=1}^{p} x_i^k x_j^k  for i ≠ j ;  w_ij = 0 for i = j

Page 56

Analysis

W = Σ_{k=1}^{p} ( x^k (x^k)^T - I )

Suppose that x = x^i is one of the stored vectors. Since (x^i)^T x^i = n,

W x^i = Σ_{k=1}^{p} x^k (x^k)^T x^i - p x^i = (n - p) x^i + Σ_{k≠i} ((x^k)^T x^i) x^k

If the stored patterns are mutually orthogonal, the cross-talk sum vanishes and W x^i = (n - p) x^i, so for n > p each stored pattern is a fixed point of the sgn(Wx) map.

Page 57

Example

x^1 = (1, 1, 1, 1)^T ,  x^2 = (1, 1, -1, -1)^T

x^1 (x^1)^T = [  1  1  1  1 ]        x^2 (x^2)^T = [  1  1 -1 -1 ]
              [  1  1  1  1 ]                      [  1  1 -1 -1 ]
              [  1  1  1  1 ]                      [ -1 -1  1  1 ]
              [  1  1  1  1 ]                      [ -1 -1  1  1 ]

W = (x^1 (x^1)^T - I) + (x^2 (x^2)^T - I) = [ 0  2  0  0 ]
                                            [ 2  0  0  0 ]
                                            [ 0  0  0  2 ]
                                            [ 0  0  2  0 ]

Page 58

Example

E(x) = -(1/2) Σ_i Σ_{j≠i} w_ij x_i x_j - Σ_i I_i x_i = -(1/2) x^T W x   (here I = 0)

With the W above:

E(x) = -2 (x_1 x_2 + x_3 x_4)

Page 59

Example

E(x) = -2 (x_1 x_2 + x_3 x_4)

[Figure: state-transition diagram of the 4-neuron network; states have energies E = 4, E = 0, or E = -4, and the minimum-energy states (E = -4), including the stored patterns, are stable.]

Page 60

Example

[Figure continued: asynchronous updates move the higher-energy states (E = 4, E = 0) downhill until a stable state with E = -4 is reached.]

Page 61

Bidirectional Memory

[Figure: the BAM — an X layer x_1,...,x_m and a Y layer y_1,...,y_n connected through W = [w_ij]_{n×m}. The forward pass computes y(t+1) = a(W x(t)); the backward pass computes x(t+1) = a(W^T y(t)).]

Page 62

Bidirectional Memory

[Same figure, annotated with the stored pattern pairs (x^1, y^1), (x^2, y^2), ..., (x^p, y^p).]

Page 63

The Storage Algorithm

Stored patterns:

x^k = (x_1^k, x_2^k, ..., x_m^k)^T , x_i^k ∈ {-1, 1}
y^k = (y_1^k, y_2^k, ..., y_n^k)^T , y_i^k ∈ {-1, 1}
(x^k, y^k) , k = 1, ..., p

W = Σ_{k=1}^{p} y^k (x^k)^T  i.e.  w_ij = Σ_{k=1}^{p} y_i^k x_j^k
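A compact BAM sketch combining this storage rule with the forward/backward recall passes from the previous slides (the names and the iteration cap are illustrative):

```python
import numpy as np

def bam_store(X, Y):
    """W = sum_k y^k (x^k)^T, an n x m matrix.
    X: (p, m) and Y: (p, n) bipolar pattern arrays."""
    return sum(np.outer(y, x) for x, y in zip(X, Y))

def bam_recall(W, x, iters=20):
    """Alternate y = sgn(W x) and x = sgn(W^T y) until x stops changing."""
    sgn = lambda z: np.where(z >= 0, 1, -1)
    for _ in range(iters):
        y = sgn(W @ x)
        x_new = sgn(W.T @ y)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x, y
```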

Page 64

Analysis

Suppose x^{k'} is one of the stored vectors. Then

W x^{k'} = Σ_{k=1}^{p} y^k (x^k)^T x^{k'} = m y^{k'} + Σ_{k≠k'} ((x^k)^T x^{k'}) y^k

If the stored x-patterns are mutually orthogonal, the cross-talk term vanishes and

a(W x^{k'}) = a(m y^{k'}) = y^{k'}

Energy function:

E(x, y) = -(1/2) x^T W^T y - (1/2) y^T W x = -y^T W x

Page 65

Applications of Hopfield Memory

• Pattern restoration
• Pattern completion
• Pattern generalization
• Pattern association

Page 66

Reading

• S. Haykin, Neural Networks: A Comprehensive Foundation, 2007 (Chapter 14).