
Page 1: Hopfield NNets

N. Laskaris

Page 2: Hopfield NNets

Professor John Hopfield

The Howard A. Prior Professor of Molecular Biology

Dept. of Molecular Biology (Computational Neurobiology; Biophysics)

Princeton University

Page 3: Hopfield NNets

The physicist Hopfield showed that models of physical systems could be used to solve computational problems.

Such systems could be implemented in hardware by combining standard components such as capacitors and resistors.

Page 4: Hopfield NNets

The practical importance of Hopfield nets is limited by theoretical limitations of the structure, but in some cases they form interesting models.

Page 5: Hopfield NNets

Usually employed in binary-logic tasks: e.g. pattern completion and association

Page 6: Hopfield NNets

The concept

Page 7: Hopfield NNets

In the early 1980s, Hopfield published two scientific papers which attracted much interest.

This was the starting point of the new era of neural networks, which continues today:

(1982): ‘’Neural networks and physical systems with emergent collective computational abilities’’. Proceedings of the National Academy of Sciences, 79:2554-2558.

(1984): ‘’Neurons with graded response have collective computational properties like those of two-state neurons’’. Proceedings of the National Academy of Sciences, 81:3088-3092.

Page 8: Hopfield NNets

‘‘The dynamics of brain computation”

The core question:

How is one to understand the incredible effectiveness of a brain in tasks such as recognizing a particular face in a complex scene?

Page 9: Hopfield NNets

Simple models of the dynamics of neural circuits are described that have collective dynamical properties. These can be exploited in recognizing sensory patterns. Using these collective properties in processing information is effective in that it exploits the spontaneous properties of nerve cells and circuits to produce robust computation.

Like all computers, a brain is a dynamical system that carries out its computations by the change of its 'state' with time.

Page 10: Hopfield NNets

J. Hopfield’s quest

While the brain is totally unlike modern computers, much of what it does can be described as computation. Associative memory, logic and inference, recognizing an odor or a chess position, parsing the world into objects, and generating appropriate sequences of locomotor muscle commands are all describable as computation.

His research focuses on understanding how the neural circuits of the brain produce such powerful and complex computations.

Page 11: Hopfield NNets

Olfaction

The simplest problem in olfaction is simply identifying a known odor. However, olfaction allows remote sensing, and much more complex computations, involving wind direction and fluctuating mixtures of odors, must be described to account for the ability of homing pigeons or slugs to navigate through the use of odors.

Hopfield has been studying how such computations might be performed by the known neural circuitry of the olfactory bulb and prepiriform cortex of mammals, or the analogous circuits of simpler animals.

Page 12: Hopfield NNets

Dynamical systems

Any computer does its computation by its changes in internal state. In neurobiology, the change of potentials of neurons (and changes in the strengths of the synapses) with time is what performs the computations. Systems of differential equations can represent these aspects of neurobiology.

He seeks to understand some aspects of neurobiological computation through studying the behavior of equations modeling the time-evolution of neural activity.

Page 13: Hopfield NNets

Action potential computation

For much of neurobiology, information is represented by the paradigm of ‘‘firing rates’’, i.e. information is represented by the rate of generation of action potential spikes, and the exact timing of these spikes is unimportant.

Page 14: Hopfield NNets

Action potential computation

Since action potentials last only about a millisecond, the use of action potential timing seems a powerful potential means of neural computation.

Page 15: Hopfield NNets

Action potential computation

There are cases, for example the binaural auditory determination of the location of a sound source, where information is encoded in the timing of action potentials.

Page 16: Hopfield NNets

Speech

Identifying words in natural speech is a difficult computational task which brains can easily do. They use this task as a test-bed for thinking about the computational abilities of neural networks and neuromorphic ideas.

Page 17: Hopfield NNets
Page 18: Hopfield NNets

Simple (e.g. binary-logic) neurons are coupled in a system with recurrent signal flow.

Page 19: Hopfield NNets

1st Example

A 2-neuron Hopfield network with continuous states, characterized by 2 stable states. [Contour plot]

Page 20: Hopfield NNets

2nd Example

A 3-neuron Hopfield network with 2^3 = 8 states, characterized by 2 stable states.

Page 21: Hopfield NNets

3rd Example

Wij = Wji

The behavior of such a dynamical system is fully determined by the synaptic weights, and can be thought of as an Energy-minimization process.

Page 22: Hopfield NNets

Hopfield Nets are fully connected, symmetrically-weighted networks that extend the ideas of linear associative memories by adding cyclic connections.

Note: no self-feedback !

Page 23: Hopfield NNets

Operation of the network

To train a Hopfield net as a content-addressable memory, the outer-product rule for storing patterns is used.

After the ‘teaching stage’, in which the weights are defined, the initial state of the network is set (input pattern) and a simple recurrent rule is iterated till convergence to a stable state (output pattern).

There are two main modes of operation: synchronous vs. asynchronous updating.
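As an illustration, here is a minimal sketch in Python (NumPy assumed; the function names and the probe pattern are illustrative, not from the slides) of outer-product storage and synchronous recall:

```python
import numpy as np

def train_hopfield(patterns):
    """Outer-product (Hebbian) storage of bipolar (+1/-1) patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-feedback
    return W

def recall_sync(W, y, max_iters=100):
    """Synchronous updating: all neurons change state at once."""
    for _ in range(max_iters):
        h = W @ y                                            # total input to each neuron
        y_new = np.where(h > 0, 1, np.where(h < 0, -1, y))   # zero input: keep state
        if np.array_equal(y_new, y):
            return y_new                                     # stable state reached
        y = y_new
    return y

patterns = np.array([[1, -1, 1], [-1, 1, -1]])  # the patterns of the example below
W = train_hopfield(patterns)
print(recall_sync(W, np.array([1, 1, 1])))      # converges to the stored [1, -1, 1]
```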

Page 24: Hopfield NNets

[Diagram: Hebbian learning defines the weights; a probe pattern initializes the state; dynamical evolution leads to a stored pattern]

Page 25: Hopfield NNets

A Simple Example

Step_1. Design a network with memorized patterns (vectors) [ 1, -1, 1 ] & [ -1, 1, -1 ]
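A quick worked computation, following the outer-product rule with zero diagonal stated above: since [ -1, 1, -1 ] is just the negative of [ 1, -1, 1 ], both outer products coincide, and the weights come out as

W =   0  -2   2
     -2   0  -2
      2  -2   0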

Page 26: Hopfield NNets

Step_2. Initialization

There are 8 different states that can be reached by the net, and any of them can therefore be used as its initial state.

[State diagram with neuron outputs #1: y1, #2: y2, #3: y3]

Page 27: Hopfield NNets

Step_3. Iterate till convergence - Synchronous Updating -

Three different examples of the net’s flow: it converges immediately.

Page 28: Hopfield NNets

Step_3. Iterate till convergence - Synchronous Updating -

[Schematic diagram of all the dynamical trajectories of the designed net, with the stored pattern marked]

Page 29: Hopfield NNets

Or, Step_3. Iterate till convergence - Asynchronous Updating -

Each time, select one neuron at random and update its state with the previous rule, with the usual convention that if the total input to that neuron is 0, its state remains unchanged.
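A minimal sketch of this asynchronous mode in the same vein as the earlier snippet (the random seed and iteration cap are illustrative choices):

```python
import numpy as np

def recall_async(W, y, seed=0, max_iters=1000):
    """Asynchronous updating: one randomly chosen neuron at a time."""
    rng = np.random.default_rng(seed)
    y = y.copy()
    for _ in range(max_iters):
        i = rng.integers(len(y))   # pick one neuron at random
        h = W[i] @ y               # its total input
        if h != 0:                 # zero input: state unchanged, by convention
            y[i] = 1 if h > 0 else -1
    return y
```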

Page 30: Hopfield NNets

Explanation of the convergence

There is an energy function associated with each state of the Hopfield network:

E( [y1, y2, …, yn]^T ) = -Σ Σ wij yi yj

where [y1, y2, …, yn]^T is the vector of the neurons’ outputs, wij is the weight from neuron j to neuron i, and the double sum runs over i and j.
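To make the convergence concrete, a small sketch that tracks this energy along an asynchronous run of the worked example above; with symmetric weights, zero diagonal, and the zero-input convention, it never increases:

```python
import numpy as np

def energy(W, y):
    """E(y) = -sum_i sum_j w_ij y_i y_j (double sum over i and j)."""
    return -y @ W @ y

W = np.array([[0, -2, 2], [-2, 0, -2], [2, -2, 0]])  # the worked 3-neuron example
y = np.array([1, 1, 1])
rng = np.random.default_rng(1)
E_prev = energy(W, y)
for _ in range(50):
    i = rng.integers(len(y))         # asynchronous update, as above
    h = W[i] @ y
    if h != 0:
        y[i] = 1 if h > 0 else -1
    E_now = energy(W, y)
    assert E_now <= E_prev           # energy is non-increasing along the flow
    E_prev = E_now
print(y, energy(W, y))               # ends in a stable, minimum-energy state
```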

Page 31: Hopfield NNets

The corresponding dynamical system evolves toward states of lower Energy

Page 32: Hopfield NNets

States of lowest energy correspond to attractors of Hopfield-net dynamics:

E( [y1, y2, …, yn]^T ) = -Σ Σ wij yi yj

[Energy-landscape plot: attractor-state]

Page 33: Hopfield NNets
Page 34: Hopfield NNets

Capacity of the Hopfield memory

In short, while training the net (via the outer-product rule) we are storing patterns by placing different attractors in the state-space of the system. While operating, the net searches for the closest attractor. When this is found, the corresponding pattern of activation is output.

Page 35: Hopfield NNets

How many patterns can we store in a Hopfield net?

About 0.15 N, where N is the number of neurons.
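A rough empirical check of this rule of thumb, as a sketch (the network size, pattern counts, and the one-step stability criterion are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                                   # neurons
for M in (5, 10, 15, 20, 25):             # number of stored random patterns
    patterns = rng.choice([-1, 1], size=(M, N))
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)
    # count stored patterns left unchanged by one synchronous update
    stable = sum(np.array_equal(np.where(W @ p >= 0, 1, -1), p) for p in patterns)
    print(M, stable / M)                  # stability degrades past M ≈ 0.15 * N
```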

Page 36: Hopfield NNets
Page 37: Hopfield NNets

A simple Pattern Recognition Example

Computer Experimentation: Class-project

Page 38: Hopfield NNets

Stored Patterns (binary images)

Page 39: Hopfield NNets

Perfect Recall - Image Restoration

Erroneous Recall

Page 40: Hopfield NNets

Irrelevant results

Note: explain the ‘negatives’ ….

Page 41: Hopfield NNets
Page 42: Hopfield NNets

The continuous Hopfield-Net as optimization machinery

Page 43: Hopfield NNets

‘Simple "Neural" Optimization Networks: An A/D Converter, Signal Decision Circuit, and a Linear Programming Circuit’

[ Tank and Hopfield ; IEEE Trans. Circuits Syst. 1986; 33: 533-541.]:

Page 44: Hopfield NNets

Hopfield modified his network so as to work with continuous activations and, by adopting a dynamical-systems approach, showed that the resulting system is characterized by a Lyapunov function, which he termed the ‘Computational Energy’ and which can be used to tailor the net for specific optimizations.

Page 45: Hopfield NNets

The system of coupled differential equations describing the operation of the continuous Hopfield net:

du_i/dt = -u_i/τ + Σ_{j=1..n} T_ij g(u_j) + I_i

g(u) = (1/2) (1 + tanh(gain · u))

The Computational Energy:

E = -(1/2) Σ_{i=1..n} Σ_{j=1..n} T_ij g(u_i) g(u_j) - Σ_{i=1..n} I_i g(u_i)

or, writing V_i = g(u_i):

E = -(1/2) Σ_{i=1..n} Σ_{j=1..n} T_ij V_i V_j - Σ_{i=1..n} I_i V_i

with T_ij = T_ji and T_ii = 0.

Weights: Wij ≡ Tij
Biases: Ii
Neuronal outputs: Yi ≡ Vi
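A minimal numerical sketch of these dynamics (Euler integration; the time constant, gain, step size, and random symmetric weights are illustrative assumptions, not values from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
n, tau, gain, dt = 4, 1.0, 2.0, 0.01

T = rng.normal(size=(n, n))
T = (T + T.T) / 2                 # symmetric weights ...
np.fill_diagonal(T, 0)            # ... with no self-feedback
I = np.zeros(n)

def g(u):
    return 0.5 * (1 + np.tanh(gain * u))   # graded response in (0, 1)

def comp_energy(u):
    V = g(u)
    return -0.5 * V @ T @ V - I @ V

u = rng.normal(size=n)
for _ in range(5000):             # Euler steps of the coupled ODEs
    u += dt * (-u / tau + T @ g(u) + I)
print(g(u), comp_energy(u))       # the state settles and E ends up low
```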

Page 46: Hopfield NNets

When Hopfield nets are used for function optimization, the objective function F to be minimized is written as an energy function in the form of the computational energy E.

The comparison between E and F leads to the design of the network that can solve the problem, i.e. the definition of its links and biases.

Page 47: Hopfield NNets

The actual advantage of doing this is that the Hopfield net has a direct hardware implementation, which enables even a VLSI integration of the algorithm performing the optimization task.

Page 48: Hopfield NNets

An example: ‘Dominant-Mode Clustering’

Given a set of N vectors {Xi}, define the k among them that form the most compact cluster {Zi}.

Indicator variables: u_i = 1 if X_i ∈ {Z}, u_i = 0 if X_i ∉ {Z}, with Σ_{i=1..N} u_i = k.

F({u_i}) = Σ_{i=1..N} Σ_{j=1..N} u_i u_j ||X_i - X_j||^2

The objective function F can easily be written in the form of the computational energy E.

Page 49: Hopfield NNets

Written in the form of the computational energy:

F = -(1/2) Σ_{i=1..N} Σ_{j=1..N} T_ij V_i V_j - Σ_{i=1..N} I_i V_i

with T_ij = T_ji = -2 D(i,j) = -2 ||X_i - X_j||^2 for i ≠ j, T_ii = 0, and I_i = 0.

With each pattern Xi we associate a neuron in the Hopfield network (i.e. #neurons = N). The synaptic weights are the pairwise distances (times -2). If a neuron’s activation is ‘1’ when the net has converged, the corresponding pattern is included in the cluster.

There is an additional constraint so that exactly k neurons are ‘on’.
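A small sketch of this design (the data are synthetic and the 0/1 assignment arbitrary, just to verify that the computational energy and the objective F agree under the mapping above; the k-neurons-on constraint is not enforced here):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 2))                  # synthetic patterns, one neuron each

# Pairwise squared distances D(i, j) = ||X_i - X_j||^2
D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)

# Network design from the mapping: T_ij = -2 D(i, j), T_ii = 0, I_i = 0
T = -2.0 * D
np.fill_diagonal(T, 0.0)
I = np.zeros(len(X))

u = (rng.random(len(X)) < 0.5).astype(float)  # an arbitrary 0/1 cluster assignment
E = -0.5 * u @ T @ u - I @ u                  # computational energy of that state
F = u @ D @ u                                 # summed intra-cluster squared distances
assert np.isclose(E, F)                       # they coincide by construction
```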

Page 50: Hopfield NNets

A classical example: ‘The Travelling Salesman Problem’

Page 51: Hopfield NNets

The principle

Coding a possible route as a combination of neurons’ firings:

route: 5 → 3 → 4 → 1 → 2 → 5

length: |5-3| + |3-4| + |4-1| + |1-2| + |2-5|
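A sketch of this encoding (the city-by-position binary matrix is the standard Hopfield-Tank representation; the |i-j| ‘distances’ simply mirror the toy cost written above):

```python
import numpy as np

n = 5
route = [5, 3, 4, 1, 2]            # city visited at tour positions 1..5

# Encode the route as an n x n binary matrix of 'neuron firings':
# V[city-1, pos] = 1 means this city is visited at this tour position
V = np.zeros((n, n), dtype=int)
for pos, city in enumerate(route):
    V[city - 1, pos] = 1

d = np.abs(np.subtract.outer(np.arange(1, n + 1), np.arange(1, n + 1)))  # toy |i-j| costs

# Tour length: costs between consecutively visited cities, closing the loop
length = sum(d[route[t] - 1, route[(t + 1) % n] - 1] for t in range(n))
print(length)                      # |5-3|+|3-4|+|4-1|+|1-2|+|2-5| = 10
```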

Page 52: Hopfield NNets

An example from clinical Encephalography

The problem :

The idea :

Page 53: Hopfield NNets

The solution :

‘‘Hopfield Neural Nets for monitoring Evoked Potential Signals’’

N. Laskaris et al. [Electroenc. Clin. Neuroph. 1997; 104(2)]

Page 54: Hopfield NNets

The Boltzmann Machine

Improving Hopfield nets by simulated annealing and by adopting more complex topologies

Page 55: Hopfield NNets

(430 – 355) BC

‘Let me close here, then . . . . . . . someone else, perhaps, will complete what I could not finish’

- Themistogenes of Syracuse, 1st year of the 105th Olympiad

HELLENICA

Page 56: Hopfield NNets

(1979-1982)

Hopfield nets, PNAS (1982)

Page 57: Hopfield NNets
Page 58: Hopfield NNets

‘‘The kids in the stands are your only Hope ....’’

Page 59: Hopfield NNets
Page 60: Hopfield NNets

A Very Last Comment on Brain-Mind-Intelligence-Life-Happiness

Page 61: Hopfield NNets

How I Became Stupid, by Martin Page

Penguin Books, 2004, 160 pp. ISBN: 0-14-200495-2

Page 62: Hopfield NNets

In HOW I BECAME STUPID, the 25-year-old Antoine concludes that ‘‘to think is to suffer’’, a twist on the familiar assertion of Descartes.

For Antoine, intelligence is the source of unhappiness. He embarks on a series of hilarious strategies to make himself stupid and possibly happy.

Page 63: Hopfield NNets

Animals that Abandon their Brains

Dr. Jun Aruga, Laboratory for Comparative Neurogenesis

A “primitive but successful” animal: Oxycomanthus japonicus

Page 64: Hopfield NNets

There is astonishing diversity in the nervous systems of animals, and the variation between species is remarkable. From the basic, distributed nervous systems of jellyfish and sea anemones, to the centralized neural networks of squid and octopuses, to the complex brain structures at the terminal end of the neural tube in vertebrates, the variation across species is humbling.

People may claim that “more advanced” species like humans are the result of an increasingly centralized nervous system produced through evolution. This claim of advancement through evolution is a common, but misleading, one. It suggests that evolution always moves in one direction: the advancement of species by increasing complexity.

Page 65: Hopfield NNets

Evolution may selectively enable body structures that are more enhanced and complicated, but it may just as easily enable species that have abandoned complex adaptations in favour of simplification. Brains, too, have evolved in the same way. While the brains of some species, including humans, developed to allow them to thrive, others have abandoned their brains because they are no longer necessary.

Page 66: Hopfield NNets

For example, the ascidian, or sea squirt, which lives in shallow coastal waters and is a staple food in certain regions, has a vertebrate-like neural structure with a neural tube and notochord in its larval stage.

As the larva becomes an adult, however, these features disappear until only very basic ganglions remain. In evolutionary terms this animal is a “winner” because it develops a very simplified neural system better adapted to a stationary life in seawater.

In the long run, however, evolutionary success will be determined by which species survives longer: humans with their complex brains (and their weapons), or the brainless Dicyemida.

Page 67: Hopfield NNets

1948-1990

Great-grandson of Zorbas and nephew of Elli Alexiou. Born in Athens.

He began his career in 1970 in Thessaloniki with the duo "Damon kai Fintias". In 1976 he founded the band "Spyridoula".

Page 68: Hopfield NNets
Page 69: Hopfield NNets

Is our thinking our master, or our servant?

Page 70: Hopfield NNets
Page 71: Hopfield NNets

Emotional Intelligence

also called EI or EQ, describes an ability, capacity, or skill to perceive, assess, and manage the emotions of oneself, of others, and of groups

Page 72: Hopfield NNets
Page 73: Hopfield NNets

Poetic intelligence may be absent from the know-it-alls, and yet dwell within the simplest person.

Page 74: Hopfield NNets
Page 75: Hopfield NNets

Class-project Oral-Exams

Page 76: Hopfield NNets

Oral-Exam Appointments (entries are student AEM numbers)

Time      | 31 May             | 5 June              | 7 June
1st hour  | 794, 845, 893, 899 | 711, 809, 874, 909  | 627, 887, 946, 960
2nd hour  | 915, 920, 932, 949 | 923, 950, 979, 1024 | 962, 980, 995, 1202
3rd hour  | 1023               | 1227                | 1223

Page 77: Hopfield NNets

Further Inquiries

Page 78: Hopfield NNets