
Proc. Natl. Acad. Sci. USA Vol. 83, pp. 9469-9473, December 1986
Biophysics

Sequential state generation by model neural networks
(associative memory/central pattern generators/feedback/synaptic connections/time delays)

DAVID KLEINFELD
Department of Molecular Biophysics, AT&T Bell Laboratories, Murray Hill, NJ 07974

Communicated by F. A. Bovey, September 3, 1986

The publication costs of this article were defrayed in part by page charge payment. This article must therefore be hereby marked "advertisement" in accordance with 18 U.S.C. §1734 solely to indicate this fact.

Abbreviation: CPG, central pattern generator.

ABSTRACT Sequential patterns of neural output activity form the basis of many biological processes, such as the cyclic pattern of outputs that control locomotion. I show how such sequences can be generated by a class of model neural networks that make defined sets of transitions between selected memory states. Sequence-generating networks depend upon the interplay between two sets of synaptic connections. One set acts to stabilize the network in its current memory state, while the second set, whose action is delayed in time, causes the network to make specified transitions between the memories. The dynamic properties of these networks are described in terms of motion along an energy surface. The performance of the networks, both with intact connections and with noisy or missing connections, is illustrated by numerical examples. In addition, I present a scheme for the recognition of externally generated sequences by these networks.

Cyclic patterns of motor neuron activity are involved in the production of many rhythmic movements, such as locomotion. The neural circuits that control the muscles involved in executing the movements are typically referred to as central pattern generators (CPGs) (for reviews see refs. 1-3). A common feature of CPGs is that sequence generation can occur in the absence of both sensory feedback and feedback from other neural centers. A second feature of some CPGs is that they do not contain intrinsic pacemakers. Thus the cyclic activity, and the timing of this activity, is a collective property of the CPG.

In this paper I present a formal model of sequence generation in the context of networks of model neurons, such as those discussed by Hopfield (4, 5) (see also refs. 6-13). Hopfield networks use highly interconnected model neurons to perform complex computational tasks. These tasks, such as recalling information by association (4, 9, 13) or solving optimization problems (14), involve networks that seek a single, stationary state as their output. The sequential state generators that I describe do not come to rest in a single state. Rather they make transitions between selected states and thus generate sequential patterns of bit sequences. Sequence-generating networks provide a formal setting for understanding some CPGs. They also provide a basis for the sequential recall of information in associative memories.

THEORY

Background. As an antecedent to understanding sequence generation, we consider a network consisting of N model neurons that functions as an associative memory (inner circuit, Fig. 1) (4, 5). Neurons are represented by nonlinear amplifiers whose output, V_i(t), saturates when the input, u_i(t), exceeds a threshold value. The detailed form of this behavior does not affect the function of these networks. We take

$$V_i(t) = \begin{cases} +1 & \text{if } u_i(t) \ge 1/\beta \\ \beta\, u_i(t) & \text{if } -1/\beta < u_i(t) < 1/\beta \\ -1 & \text{if } u_i(t) \le -1/\beta \end{cases} \qquad [1]$$

where β is the gain of the neuron in the linear operating region.

A state of the network at time t, V(t), is defined by a sequence of N numbers that specify the value of the output of each neuron. Under saturating conditions, neurons are either fully active (+1) or quiescent (−1) and V(t) is defined by a binary sequence. There are 2^N possible binary sequences. Memories, M^ν, are represented by specific states chosen from the 2^N possibilities. For example, M^ν = (+1, −1, −1, ...) implies that neuron 1 is active, neurons 2 and 3 are quiescent, etc.

Each model neuron receives input from all other neurons via a set of pairwise synaptic connections (Fig. 1). The network functions as an associative memory when the strength of each synapse is determined from the memories according to the outer-product rule (e.g., ref. 9):

$$T_{ij} = \frac{1}{N}\sum_{\nu=1}^{n} M_i^{\nu} M_j^{\nu} \quad (i \ne j), \qquad [2]$$

where n is the number of stored memories and T_ii = 0. The T matrix is essentially a sum of projection operators that map the present state of the network to the closest memory, i.e., the memory with the least number of differing bits.
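As a concrete illustration of Eq. 2, here is a minimal NumPy sketch (not from the paper; the names N, n, and M are mine) that builds the T matrix from a set of random ±1 memories:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 100, 14                         # neurons, stored memories
M = rng.choice([-1, 1], size=(n, N))   # M[nu] is the nu-th memory vector

# Outer-product rule (Eq. 2): T_ij = (1/N) sum_nu M_i^nu M_j^nu, with T_ii = 0
T = (M.T @ M).astype(float) / N
np.fill_diagonal(T, 0.0)
```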

The output of the network will converge from an arbitrary initial state to a stationary memory according to the circuit equations (5) (Fig. 2):

$$\tau_{si}\,\frac{du_i(t)}{dt} + u_i(t) = \sum_{j=1}^{N} T_{ij} V_j(t), \qquad [3]$$

where τ_si = R_si C is the charging time of the ith neuron and R_si is related to the input resistance of the neuron, r_si, and the synaptic resistances t_ij and d_ij (next section) by

$$\frac{1}{R_{si}} = \frac{1}{r_{si}} + \sum_{j=1}^{N}\frac{1}{t_{ij}} + \sum_{j=1}^{N}\frac{1}{d_{ij}}. \qquad [4]$$

The magnitudes of the synaptic strengths T_ij are given by |T_ij| = R_si/t_ij, with negative values of T_ij obtained by using the inverted value of the output, −V_j(t) (broken lines in Fig. 1).

Sequence Generation. Projection operators that map the state of the network from one memory to the next, i.e., terms of the form M_i^{ν+1} M_j^ν, should generate transitions from the νth to the (ν + 1)th memories (4, 15, 16). However, when terms of this form were added to the T matrix, only sequences of limited length were achieved (4). The difficulty with this construction is that the network does not become stationary in one memory before the transition to the next memory begins. This causes the state of the network to become progressively mixed among all the memories making up a sequence, which makes the generation of arbitrarily long linear sequences, or cyclic sequences, unattainable.


FIG. 1. Diagram of the neural network sequence generator and related circuits (panels: F MATRIX, D MATRIX, T MATRIX, NEURONS, TIME DELAYS). The ith neuron is modeled by a nonlinear amplifier (triangle labeled β) that buffers a capacitive charging circuit; there are N neurons. Broken lines correspond to inverted output signals. The inner circuit, consisting of the neurons and the synaptic strengths T_ij ∝ 1/t_ij, constitutes an associative memory. The addition of time delays, 𝒟(t, τ_Dj), linear buffering amplifiers (open triangles), and the synaptic strengths D_ij ∝ 1/d_ij forms the sequence generator. Sequence recognition is performed on the external inputs, E_i(t), multiplied by the synaptic strengths F_ij ∝ 1/f_ij.

The essential requirement for sequence generation is to allow the network to become stationary in one memory before inducing a transition to another memory. The transitions between memories can be delayed by having the input to each neuron depend upon the history of the network via a set of delayed output states, V_Dj(t). These states are defined as the convolution of V_j(t) with a (normalized) delay function, 𝒟(t, τ_Dj), i.e.,

$$V_{Dj}(t) = \int V_j(t - x)\,\mathcal{D}(x, \tau_{Dj})\,dx. \qquad [5]$$

The delay time, τ_Dj, must be long compared to the charging time, τ_si. For the propagation delays inherent in nondispersive cables or with active propagation along axons, 𝒟(t, τ_Dj) = δ(t − τ_Dj) and V_Dj(t) = V_j(t − τ_Dj). For the delays associated with charging a capacitor, 𝒟(t, τ_Dj) = e^{−t/τ_Dj}/τ_Dj.

The delayed outputs form a second set of inputs to the neurons via the synaptic connections D_ij (Fig. 1). The strengths of these connections are chosen to cause transitions through a sequence of memories. For each independent sequence of length m, m ≤ n:

$$D_{ij} = \frac{1}{N}\sum_{\nu=1}^{m} M_i^{\nu+1} M_j^{\nu} \quad (i \ne j), \qquad [6]$$

with D_ii = 0. The magnitudes of the strengths D_ij are related to the resistances d_ij and R_si by |D_ij| = R_si/d_ij. To generate cyclic sequences, the sum in Eq. 6 is performed modulo m.
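A sketch of Eq. 6 under the same assumptions as the T-matrix sketch above; the modulo step closes the sequence into a cycle:

```python
import numpy as np

def transition_matrix(M):
    """D_ij = (1/N) sum_nu M_i^{nu+1} M_j^nu (Eq. 6), summed modulo m."""
    m, N = M.shape
    D = np.zeros((N, N))
    for nu in range(m):
        D += np.outer(M[(nu + 1) % m], M[nu])   # maps memory nu -> nu + 1
    D /= N
    np.fill_diagonal(D, 0.0)                    # D_ii = 0
    return D
```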

The circuit equations describing the dynamics of the sequence generator are

$$\tau_{si}\,\frac{du_i(t)}{dt} + u_i(t) = \sum_{j=1}^{N} T_{ij} V_j(t) + \sum_{j=1}^{N} D_{ij} V_{Dj}(t). \qquad [7]$$

Binary Limit with Delta Function Delay. The model discussed above uses neurons with a graded input-output relation (Eq. 1). It is useful to consider the "binary" limit of this model, i.e., β → ∞ and τ_si → 0, with the additional constraint 𝒟(t, τ_Dj) = δ(t − τ_Dj). This case simplifies the analysis of the sequence generator without changing the essential features of its dynamics. The input to the ith neuron is now

$$u_i(k) = \sum_{j=1}^{N} T_{ij} V_j(k) + \sum_{j=1}^{N} D_{ij} V_j(k - \kappa_D), \qquad [8]$$

where the time step k approximates the charging time, Ts.The delay interval, KD, is taken to be the same for all neuronsand is equivalent to KD T7/Ts. The outputs Vi(k) are updat-ed to either + 1 or -1 depending upon the sign of ui(k) (Eq. 1with P -* o0). The updating is performed asynchronously-i.e., the value of the index i is chosen at random to simulatevariations in Tsi between neurons.Sequence Dynamics. The dynamic properties of the se-
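The binary limit is easy to simulate directly. The following sketch of Eq. 8 (my own code with illustrative names; T and D as in the sketches above) uses a delta-function delay and random-order asynchronous updates:

```python
import numpy as np

def simulate(T, D, V0, kappa_D=6, steps=200, seed=1):
    """Iterate Eq. 8 with a delta-function delay, V(k - kappa_D)."""
    rng = np.random.default_rng(seed)
    history = [V0.copy() for _ in range(kappa_D + 1)]  # pad initial history
    V = V0.copy()
    for _ in range(steps):
        V_delayed = history[-(kappa_D + 1)]            # V(k - kappa_D)
        for i in rng.permutation(len(V)):              # asynchronous updates
            u = T[i] @ V + D[i] @ V_delayed            # input of Eq. 8
            V[i] = 1 if u >= 0 else -1                 # Eq. 1, beta -> infinity
        history.append(V.copy())
    return np.array(history)
```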

Sequence Dynamics. The dynamic properties of the sequence generator can be qualitatively understood by examining the input to each neuron as a function of V(k) and V(k − κ_D). To place this analysis in a physical setting, we illustrate the dynamics in terms of motion on a quadratic, "energy," surface defined by

$$E = -\frac{1}{N}\left(\sum_{i,j} V_i(k)\,T_{ij}\,V_j(k) + \sum_{i,j} V_i(k)\,D_{ij}\,V_j(k - \kappa_D)\right). \qquad [9]$$

Fig. 2 shows a contour plot of the energy (E) as a function of V(k) and V(k − κ_D) for the case of orthogonal memories, i.e.,

$$\frac{1}{N}\sum_{j=1}^{N} M_j^{\nu} M_j^{\nu'} = \delta(\nu - \nu'). \qquad [10]$$

Memories appear as minima, or bowls, with E = −2. They are separated by relatively broad saddles with E ≈ −1.
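A direct transcription of Eq. 9; with the T and D matrices of Eqs. 2 and 6 and orthogonal memories it returns ≈ −2 at a memory (the function name and the placement of the 1/N follow my reconstruction of Eq. 9):

```python
import numpy as np

def energy(V_now, V_delayed, T, D):
    """Energy of Eq. 9 for the current and delayed states."""
    return -(V_now @ T @ V_now + V_now @ D @ V_delayed) / len(V_now)
```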

We consider a network soon after it has settled into the νth memory, i.e., V(k) = M^ν (point a, Fig. 2). For a time short compared to κ_D, the value of the delayed state is V(k − κ_D) = M^{ν−1}. The energy of the network is presently at a minimum and the input to the ith neuron is (Eqs. 8 and 10):


$$u_i(k) = \sum_{j=1}^{N} T_{ij} M_j^{\nu} + \sum_{j=1}^{N} D_{ij} M_j^{\nu-1} \approx 2 M_i^{\nu}. \qquad [11]$$

Contributions from both the current state and the delayed state thus stabilize the network in the νth memory.
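Eq. 11 follows from Eqs. 2, 6, and 10 in one line; here is the check for the T term, ignoring the small corrections from the omitted diagonal elements:

```latex
\sum_{j=1}^{N} T_{ij} M_j^{\nu}
  = \sum_{\nu'=1}^{n} M_i^{\nu'} \left( \frac{1}{N}\sum_{j=1}^{N} M_j^{\nu'} M_j^{\nu} \right)
  = \sum_{\nu'=1}^{n} M_i^{\nu'}\,\delta(\nu' - \nu)
  = M_i^{\nu}.
```

Likewise Σ_j D_ij M_j^{ν−1} = M_i^ν, so the two contributions add to give u_i(k) ≈ 2M_i^ν.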

After the network has remained stationary in the νth memory for an interval ≈ κ_D, the value of V(k − κ_D) shifts toward M^ν (path a-b, Fig. 2). This change increases the energy of the network such that it moves towards the top of a bowl in the energy surface (point b, Fig. 2). When V(k − κ_D) finally equals M^ν, the input to the ith neuron becomes

$$u_i(k) = \sum_{j=1}^{N} T_{ij} M_j^{\nu} + \sum_{j=1}^{N} D_{ij} M_j^{\nu} \approx M_i^{\nu} + M_i^{\nu+1}. \qquad [12]$$

These new values for u_i(k) place the network in a mixed, nonstationary state from which it can make the transition from M^ν to M^{ν+1}. The transition corresponds to moving across a saddle region on the energy surface (path b-c, Fig. 2) and falling into the adjacent bowl (path c-d, Fig. 2). The network will remain in the new memory for an interval ≈ κ_D, after which it will make a transition to the next memory in the sequence, etc.

Sequence Recognition. An extension of the sequence generator can be used to recognize external sequences (Fig. 1). Recognition is defined as a completed set of transitions of the sequence generator in response to a particular external input, E(k) (17). We require that the sequence of states, L^ν, composing the external input can be mapped onto the sequence of memories used to form the D matrix (Eq. 6). A set of synaptic connections that produces this mapping is

$$F_{ij} = \frac{1}{N}\sum_{\nu=1}^{m} M_i^{\nu} L_j^{\nu} \quad (i \ne j), \qquad [13]$$

where F_ij connects the jth component of the external input to the ith neuron, and F_ii = 0. The external input is clocked with an average period of κ_E. The circuit equations for the recognition network, shown in Fig. 1, are

$$u_i(k) = \sum_{j=1}^{N} T_{ij} V_j(k) + \lambda \sum_{j=1}^{N} D_{ij} V_j(k - \kappa_D) + \epsilon \sum_{j=1}^{N} F_{ij} E_j(k). \qquad [14]$$

The coefficient λ is adjusted so that the contribution to u_i(k) from the delayed states alone is insufficient to cause transitions between memories. Sequence generation is now possible only when the external input can assist in driving the transitions. Thus, a completed set of transitions occurs when the externally generated sequence of states is equivalent to the internal sequence of memories (Eq. 12). This set is completed in a period Δk = m·κ_E with cyclic sequences. In this scheme, ε must be large enough to elicit transitions but small enough so that E(k) does not dominate the inputs to the individual neurons. The external period must be longer than the delay time, i.e., κ_E > κ_D.
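In the discrete notation of Eq. 8, the recognition input of Eq. 14 is one line. The default values of λ and ε below come from the simulation section of this paper; the function name and argument layout are mine:

```python
import numpy as np

def recognition_input(T, D, F, V, V_delayed, E_ext, lam=0.4, eps=0.2):
    """Input to the neurons under Eq. 14; lam < 1, so transitions need E."""
    return T @ V + lam * (D @ V_delayed) + eps * (F @ E_ext)
```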

The recognition scheme can be interpreted in terms of the energy diagram of Fig. 2. With λ < 1, inputs from the delayed state can force the network toward the top of an energy bowl (point b, Fig. 2) but not out of that bowl. The additional force required to move the network up to the saddle region and over into the next bowl, or memory, must be supplied by the external input.

NUMERICAL SIMULATIONS

Sequence Generation. I first demonstrate the functioning of the sequence generator (Eqs. 1, 2, 6, and 8) with a small network. Fig. 3a shows the individual neural outputs from a network containing 7 neurons and 3 memories with a delay interval κ_D = 6. The statistical independence of the memories used to form the T and D matrices was ensured by selecting nearly orthogonal memories (Eq. 10). Starting from a randomly selected initial state, the network converged toward a memory and subsequently made cyclic transitions between all three memories. Relatively smooth transitions, however, did not occur until after the first few cycles; this is illustrated by the output of neuron 7 (Fig. 3a), which was specified to have the value V_7(k) = −1 for all 3 memories.

[Fig. 2 plot: horizontal axis CURRENT STATE, V(k); vertical axis DELAYED STATE, V(k − κ_D); tick labels M^{ν−2}, M^{ν−1}, M^ν, M^{ν+1} on both axes.]

FIG. 2. Contour plot of the energy function, E, for the sequence generator. The plot was constructed by using Eq. 9 with orthogonal memory states. The arrows indicate a segment of the path the output of the network follows in the limit β → ∞ (Eq. 1) with a delta function delay.

FIG. 3. Simulation of a network with 7 neurons and 3 memories, labeled M1, M2, and M3, that generates cyclic sequences; see text for details. (a) Output of each neuron, V_i(k), plotted as a function of the time step k. (b) Overlap of the current state of the network with each of the memories (Eq. 15), ⟨V(k)·M^ν⟩ (top three traces); the overlap of the current state with the delayed state, ⟨V(k)·V(k − κ_D)⟩ (bottom trace).



Details of the sequential behavior can be highlighted by examining the overlap of V(k) with each of the memories, as opposed to examining the outputs of the individual neurons (23). We define the (normalized) overlap, ⟨V(k)·M^ν⟩, as

$$\langle V(k) \cdot M^{\nu} \rangle = \frac{1}{N}\sum_{j=1}^{N} V_j(k)\, M_j^{\nu}. \qquad [15]$$

A magnitude of 1 implies that the network is in the νth memory. These overlaps are plotted in Fig. 3b for the 7-neuron sequence generator. The width of each overlap, ≈ κ_D, corresponds to the time spent in each memory. I also show in Fig. 3b the overlap of the current state with the delayed state, ⟨V(k)·V(k − κ_D)⟩. This overlap peaks when the network is undergoing a transition. The maximum value ⟨V(k)·V(k − κ_D)⟩ = 1 occurs just prior to a transition. It corresponds to the network sitting on the upper edge of a bowl in the energy surface (point b, Fig. 2). The width of the peak, 2-3 time steps, corresponds to the transition time between memories.
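The overlap of Eq. 15 is a normalized dot product; a one-line sketch (names mine), which can be applied to each state returned by the simulate sketch above:

```python
import numpy as np

def overlap(V, M_nu):
    """Normalized overlap of Eq. 15; equals 1 when V coincides with M_nu."""
    return float(V @ M_nu) / len(V)

# e.g., the top trace of Fig. 3b: [overlap(h, M[0]) for h in history]
```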

We now focus on sequence generators containing 100 neurons, in which the statistical properties of large neural networks should emerge. For these studies we used memories composed of bit sequences chosen at random. Fig. 4 illustrates the cyclic output of a network with κ_D = 6 and 14 memories; 14 memories is just above the value for which retrieval errors can occur with associative memories (18). Approximately two complete cycles were required before the output reached a steady periodicity, after which the cycle period remained constant. Details of the initial transient behavior depended on both the initial state of the network, V(0), and the value of κ_D; with longer delays fewer cycles were required to reach steady conditions.

The cyclic output of these networks continued essentially indefinitely for sequences of length m = n < 25 and delay intervals κ_D ≥ 5. With shorter values of κ_D the network did not become stationary in each memory and often became trapped in a spurious stable state (19). The period for each cycle under steady conditions was approximately Δk = n(κ_D + 2); the time step of 2 corresponds to the transition time between memories.

Fault Tolerance. Neural networks are expected to function properly with errors in their synaptic connections. Qualitatively, this occurs because n·N bits of information are contained in the memories while the D and T matrices each contain N(N − 1) log₂ n ≫ n·N bits. Errors in the values of some synaptic strengths are offset by the redundant storage of information in the synapses.

I assessed the ability of the sequence generator to function with noisy synaptic connections by randomly incrementing or decrementing the T_ij and D_ij connections (i ≠ j) in one-bit increments. For a network of 100 neurons with 14 memories (Fig. 4), the sequential output remained essentially unaffected with root mean square (rms) noise levels up to approximately twice the rms value of the synaptic strength. At this threshold level for failure, the rms signal-to-noise ratio of the input to each neuron is reduced from the intrinsic value of ≈ (N/n)^{1/2} ≈ 3 to a value of ≈ 1.5.

I examined the effect of stochastic removal of synaptic connections on sequence generation by setting randomly selected T_ij and D_ij connections to zero. For a network of 100 neurons and 14 memories (Fig. 4) the cyclic behavior was essentially unperturbed so long as 40% or fewer of the connections were removed. However, when additional connections were removed the output degraded rapidly, ceasing within one cycle with 50% of the connections missing.

A second method for removing 50% of the connections is to set one randomly chosen member of each T_ij and T_ji pair and each D_ij and D_ji pair to zero. Sequence generation was essentially unaffected by this alteration. This shows that the sequence generator functions without bidirectional connections between neurons. In contrast to the first method, the second method removes connections in a relatively uniform manner and therefore did not cause the large fluctuations in the D matrix necessary to halt transitions.

FIG. 4. Simulation of a network with 100 neurons and 14 memories that generates cyclic sequences; see text for details. The overlap of the current state of the network with each of the memories is plotted as a function of the time step, k. The bar in the lower left indicates an overlap value of 1.

FIG. 5. Recognition of an external cyclic sequence by a network with 100 neurons and 14 memories; see text for details. The overlap of the current state with each of the memories is plotted as a function of the time step, k. The bar in the lower left indicates an overlap value of 1. [In-plot annotations: MEMORIES PLUS EXTERNAL INPUTS; Δφ₀ ≈ 260°; cycle period.]

Capacitative Delays. An exponential weighting function was used as the time delay, i.e., 𝒟(k, κ_D) = e^{−k/κ_D}/κ_D, for simulations with networks of 100 neurons. Sequential output was obtained with delay intervals as short as κ_D = 3, although the maximum length of a sequence was limited to m = n ≈ 7. This result implies that sequence generation does not depend on the detailed form of the time delay. Furthermore, sequential output was maintained only if the peak value of ⟨V(k)·V_D(k)⟩ exceeded ≈ 0.8 prior to a transition; for a delta function delay the peak value is ≈ 1.0. This suggests that reliable transitions occur only if the trajectory of the network along its energy surface comes close to the top of a bowl (point b, Fig. 2).


Sequence Recognition. I demonstrate the sequence recognition scheme (Eqs. 1, 2, 6, 13, and 14) with a network of 100 neurons and 14 randomly selected memories. The threshold value of λ, below which sequence generation does not occur, was found to be λ ≈ 0.5. Thus I chose λ = 0.4 and ε = 0.2 as parameters for the simulation to ensure that only the external input, E(k), could elicit sequential outputs. The external sequence was clocked with a period κ_E = 3κ_D = 18. The sequence consisted of 14 randomly chosen states that were mapped onto the memories via synapses with strengths F_ij (Eq. 13). The initial state of the network was set to V(0) = M^1 and the initial state of the external input was E(0) = L^11. The difference between the states, Δν = 10, corresponds to an initial phase difference of Δφ₀ = (Δν/n)·360° ≈ 260° between the internal and external states.

The output of the sequence recognizer is shown in Fig. 5.

At first the output fluctuated between memories, concomitant with the phase difference between E(k) and V(k) converging toward zero. Sequential outputs emerged when the states of the external sequence and the memories of the sequence generator varied in synchrony, i.e., Δφ = 0°. The period of the cyclic output (m = n) was Δk = n·κ_E ≈ 250, as compared with Δk ≈ 110 for the free-running sequence generator (Eq. 14 with λ = 1 and ε = 0); cf. Figs. 4 and 5. The time required for the external and internal sequences to synchronize included contributions from the overall settling time of the network (Fig. 4) and corresponded roughly to 2n·κ_E, independent of the value of Δφ₀.

DISCUSSION

I have shown that the generation of sequential output states can be understood in the context of a model neural network. This network contains no intrinsic oscillators and functions in the absence of external inputs. Sequence generation is an emergent property of the network that depends upon the interplay between two sets of synaptic connections. One set stabilizes the network in its current memory state, while a second set, whose action is delayed in time, causes the network to make transitions between memories.

all pairs of neurons, while biological systems almost univer-sally have a lesser degree of connectivity. Reducing thenumber of connections in the model network will decreasethe number of memories that can be incorporated into se-quences. However, sequence generation will still occur witha reduced number of connections if, on average, a Tij con-nection is paralleled by a Dij connection of the opposite sign.This corresponds in biological systems to balancing short-term inhibition by delayed excitation, and vice versa. Recip-rocal connections of the same sign, which can cause oscilla-tions between pairs of neurons (3), are not necessary forCPGs to function.How does the behavior of the model compare with the

How does the behavior of the model compare with the measured properties of CPGs? Details of the networks that underlie CPGs have been studied in a number of preparations (1-3). We focus on the analysis by Getting (20, 21) of the CPG controlling the escape swimming response in the mollusk Tritonia diomedea. This CPG functions with 4 neural groups and 2 stable states (M and −M). Despite these small numbers, it has many features in common with the model network. The neurons exhibit well-defined on (bursting) and off (quiescent) states. None of the neurons are intrinsic pacemakers. Many of the synaptic connections show both a short-term and a long-term response (κ_D ≈ 20). The connections exhibit short-term excitation followed by delayed inhibition and short-term inhibition followed by delayed excitation (D_ij ≈ −T_ij).

The output activity of many CPGs can be switched on and off via external inputs from command neurons (22). These neurons modulate a select fraction of the neurons in a CPG. The corresponding control of cyclic output can be achieved with schemes similar to that for sequence recognition. For example, when the D_ij connections are formed without including the term M_i^1 M_j^m, only a linear sequence can be generated (Eq. 6). Cyclic output will be enabled by an external input E(k) = E^0 if the connection strengths F_ij are chosen to cause the transition from the last memory to the first memory in the sequence, i.e., F_ij = M_i^1 E_j^0 (Eq. 12). Different command neurons can elicit different output patterns from the same CPG (22). This corresponds to storing multiple sequences in the model network. A particular sequence is activated when the external input causes a transition to one of the memories in that sequence.

Note Added in Proof. A similar theory for generating sequences has been independently derived by Kanter and Sompolinsky (24).

I thank G. E. Blonder, H. J. Chiel, J. J. Hopfield, and H. Sompolinsky for many useful discussions. The sequence recognition scheme follows a suggestion by Hopfield. D. B. Pendergraft assisted with the simulations.

1. Roberts, A. & Roberts, B. L., eds. (1983) Neural Origin of Rhythmic Movements (Cambridge Univ. Press, Cambridge).
2. Selverston, A. I. & Moulins, M. (1985) Annu. Rev. Physiol. 47, 29-48.
3. Cohen, A. H., Rossignol, S. & Grillner, S., eds. (1986) Neural Control of Rhythmic Movements (Wiley, New York).
4. Hopfield, J. J. (1982) Proc. Natl. Acad. Sci. USA 79, 2554-2558.
5. Hopfield, J. J. (1984) Proc. Natl. Acad. Sci. USA 81, 3088-3092.
6. Steinbuch, K. & Piske, U. A. W. (1963) IEEE Trans. Electron. Comput. 12, 846-862.
7. Caianiello, E. R. (1966) Kybernetik 3, 98-100.
8. Grossberg, S. (1970) Studies Appl. Math. 49, 135-166.
9. Nakano, K. (1972) IEEE Trans. Syst. Man Cybern. 2, 380-387.
10. Cooper, L. N. (1973) in Proceedings of the Nobel Symposium on Collective Properties of Physical Systems, eds. Lundqvist, B. & Lundqvist, S. (Academic, New York), pp. 252-264.
11. Little, W. A. & Shaw, G. L. (1975) Behav. Biol. 14, 115-133.
12. Anderson, J. A., Silverstein, J. W., Ritz, S. A. & Jones, R. S. (1977) Psychol. Rev. 84, 412-451.
13. Kohonen, T. (1980) Content Addressable Memories (Springer, Berlin).
14. Hopfield, J. J. & Tank, D. W. (1985) Biol. Cybern. 52, 141-152.
15. Amari, S.-I. (1972) IEEE Trans. Comput. C-21, 1197-1206.
16. Platt, J. C. (1985) Sequential Threshold Circuits (California Institute of Technology, Pasadena), Tech. Rep. 5197:TR:85.
17. Fukushima, K. (1973) Kybernetik 12, 58-63.
18. Amit, D. J., Gutfreund, H. & Sompolinsky, H. (1985) Phys. Rev. Lett. 55, 1530-1533.
19. Amit, D. J., Gutfreund, H. & Sompolinsky, H. (1985) Phys. Rev. A 32, 1007-1018.
20. Getting, P. A. (1981) J. Neurophysiol. 46, 65-79.
21. Getting, P. A. (1983) J. Neurophysiol. 49, 1036-1050.
22. Kupfermann, I. & Weiss, K. R. (1978) Behav. Brain Sci. 1, 3-39.
23. Peretto, P. & Niez, J. J. (1986) in Disordered Systems and Biological Organization, eds. Bienenstock, E., et al. (Springer, Berlin), pp. 171-185.
24. Kanter, I. & Sompolinsky, H. (1986) Phys. Rev. Lett., in press.
