

616 IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS—I: REGULAR PAPERS, VOL. 56, NO. 3, MARCH 2009

Global Asymptotic Stability and Robust Stability of a Class of Cohen–Grossberg Neural Networks With Mixed Delays

Huaguang Zhang, Senior Member, IEEE, Zhanshan Wang, and Derong Liu, Fellow, IEEE

Abstract—This paper is concerned with the global asymptotic stability of a class of Cohen–Grossberg neural networks with both multiple time-varying delays and continuously distributed delays. Two classes of amplification functions are considered, and some sufficient stability criteria are established to ensure the global asymptotic stability of the concerned neural networks; these criteria can be expressed in the form of linear matrix inequalities and are easy to check. Furthermore, some sufficient conditions guaranteeing the global robust stability are also established in the case of parameter uncertainties.

Index Terms—Cohen–Grossberg neural networks, distributed delays, global asymptotic stability, linear matrix inequality (LMI), multiple time-varying delays, nonnegative equilibrium points, robust stability.

    I. INTRODUCTION

COHEN–GROSSBERG [11] proposed in 1983 a neural network model described by the following system:

dx_i(t)/dt = -a_i(x_i(t)) [ b_i(x_i(t)) - Σ_{j=1}^{n} t_ij s_j(x_j(t)) + J_i ],  i = 1, 2, ..., n    (1)

where x_i(t) is the state of neuron i, a_i(·) is an amplification function, b_i(·) is a well-defined function to guarantee the existence of solutions of system (1), s_j(·) is an activation function describing the effects of input on the output of a neuron, t_ij is the connection weight coefficient of the neural network, J_i is an external constant input, and i = 1, 2, ..., n. System (1) includes a number of models from neurobiology, population

Manuscript received June 23, 2007; revised December 02, 2007 and June 12, 2008. First published August 04, 2008; current version published March 11, 2009. This work was supported in part by the National Natural Science Foundation of China under Grants 60534010, 60572070, 60728307, 60774048, and 60774093, by the 111 Project of the Ministry of Education of China under Grant B08015, by the Natural Science Foundation of Liaoning Province under Grant 20072025, by the Program for Cheung Kong Scholars and Innovative Research Groups of China under Grant 60521003, by the National High Technology Research and Development Program of China under Grant 2006AA04Z183, and by the Postdoctor Foundation of Northeastern University under Grant 20080314. This paper was recommended by Associate Editor M. Di Marco.

H. Zhang and Z. Wang are with the School of Information Science and Engineering, Northeastern University, Shenyang, Liaoning 110004, China, and also with the Key Laboratory of Integrated Automation of Process Industry (Northeastern University), Ministry of Education of China, Shenyang, Liaoning 110004, China (e-mail: [email protected]; [email protected]).

D. Liu is with the Department of Electrical and Computer Engineering, University of Illinois at Chicago, Chicago, IL 60607 USA (e-mail: [email protected]).

    Digital Object Identifier 10.1109/TCSI.2008.2002556

biology, and evolution theory, as well as the Hopfield neural network model [20] as a special case. In electronic implementations of analog neural networks, delays always exist due to the transmission of signals and the finite switching speed of amplifiers. On the other hand, it is desirable to introduce delays into neural networks when dealing with problems associated with motions. Therefore, model (1) and its delayed versions have attracted the attention of many researchers (see, e.g., [1], [5]–[7], [10], [19], [26], [27], [29], [33], [35], [36], [40], [42], [45]). Among them, [42] introduced constant delays into (1), which yields the following form:

dx_i(t)/dt = -a_i(x_i(t)) [ b_i(x_i(t)) - Σ_{j=1}^{n} t_ij s_j(x_j(t)) - Σ_{j=1}^{n} t^τ_ij s_j(x_j(t - τ_ij)) + J_i ]    (2)

where τ_ij are bounded constant delays, t^τ_ij are the connection weight coefficients, and other notations are the same as those in system (1), i.e., i = 1, 2, ..., n and j = 1, 2, ..., n.
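The displayed equations were damaged in extraction, but the delayed model (2) has the standard Cohen–Grossberg structure. As a rough sketch only (every function, weight matrix, and delay value below is a hypothetical placeholder satisfying the usual assumptions of positive amplification, bounded activation, and small weights; none are taken from the paper), a forward-Euler simulation of a two-neuron delayed network can be written as:

```python
import numpy as np
from collections import deque

# Forward-Euler simulation of a two-neuron delayed Cohen-Grossberg network
# in the standard form
#   dx_i/dt = -a_i(x_i) [ b_i(x_i) - sum_j t_ij s_j(x_j)
#                          - sum_j tt_ij s_j(x_j(t - tau)) + J_i ].
def simulate(t_end=50.0, dt=0.001, tau=1.0):
    T = np.array([[0.10, -0.20], [0.15, 0.10]])      # instantaneous weights
    T_tau = np.array([[0.05, 0.10], [-0.10, 0.05]])  # delayed weights
    J = np.array([0.5, -0.3])                        # external constant input
    a = lambda x: 1.0 + 0.5 / (1.0 + x ** 2)         # positive amplification
    b = lambda x: 2.0 * x                            # behaved function
    s = np.tanh                                      # bounded activation
    n_delay = int(round(tau / dt))
    # constant initial history on [-tau, 0]
    hist = deque([np.array([0.8, -0.6])] * (n_delay + 1), maxlen=n_delay + 1)
    x = hist[-1].copy()
    for _ in range(int(t_end / dt)):
        x_del = hist[0]                              # state tau seconds ago
        x = x + dt * (-a(x) * (b(x) - T @ s(x) - T_tau @ s(x_del) + J))
        hist.append(x.copy())
    return x

x_final = simulate()
```

With these illustrative parameters the trajectory settles to an equilibrium, which is the qualitative behavior the paper's stability criteria guarantee.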

We remark that one can consider several types of delays in (2), [15]–[17], [31], and [32]. References [17] and [18] first investigated the stability problem of asymmetric Hopfield neural networks with continuously distributed delays. The characteristic of this kind of continuously distributed delay is that the delays range over an infinitely long duration. Dynamics of different kinds of neural networks with this kind of continuously distributed delays have been widely studied [6], [8], [9], [25], [34]. Although the results of [6], [8], [9], [25], and [34] are generally easy to verify, all the results in [6], [8], [9], [25], and [34] take the absolute value operation on the connection weight coefficients. Therefore, the sign difference of entries in the connection matrix is ignored, which leads to the ignorance of the neuron's excitatory and inhibitory effects on the neural network. However, some kinds of continuously distributed delays in a practical system often range over a finite duration. For example, one application of this kind of continuously distributed delay systems can be found in the modeling of feeding systems and combustion chambers in a liquid monopropellant rocket motor with pressure feeding [12], [13]. This kind of continuously distributed delay system has been investigated in [22], [23], [38], and [41], but all the results in [22], [23], [38], and [41] are only suitable for the case of linear systems with constant delay. It is well known that neural networks are large-scale and complex nonlinear dynamic systems. Different types of delays may occur, and it is useful to investigate the dynamics of neural networks with different types of delays.

Motivated by the aforementioned discussion, the purpose of this paper is to establish sufficient conditions for the global asymptotic stability of the following system with both multiple

    1549-8328/$25.00 © 2009 IEEE

    Authorized licensed use limited to: University of Illinois. Downloaded on May 4, 2009 at 15:17 from IEEE Xplore. Restrictions apply.


time-varying delays and a kind of continuously distributed delays ranging over a finite duration:

    (3)

where and are connection weight coefficients of the neural network, time delays and are all bounded, is an external constant input, and are activation functions, respectively, and other notations are the same as those in system (1), i.e., , , and .

The neural network model studied here is more general and includes a large class of existing neural networks of the following two cases.

Case 1) for all and .
Case 2) for all and , .

For example, in Case 1), model (3) contains the models studied in [3], [4], [7], [10], [19]–[21], [24], [26], [27], [33], [37], [42], [45]–[47], [49], and [50]. In Case 2), model (3) contains the classical Cohen–Grossberg neural model [11], [28], [30] and the famous Lotka–Volterra neural model [43], [44]. Instead of using the Jensen inequality approach, as used in [3] and [46], we construct a suitable Lyapunov–Krasovskii functional in this paper to deal with the continuously distributed delays, and some sufficient conditions are derived to ensure the global asymptotic stability of system (3) for Cases 1) and 2), respectively, which can be expressed in the form of linear matrix inequalities (LMIs) and are independent of the values of the time-varying delays and amplification functions. Corollaries are also given for some special cases of system (3). Moreover, a global robust stability criterion is also established for system (3) with parameter uncertainties.
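LMI-form criteria of the kind derived here are attractive because feasibility can be checked numerically. A minimal sketch of the checking step (the matrices A, Q, and the trial diagonal P below are arbitrary illustrative choices, not the paper's LMIs, which involve more blocks and are normally fed to an SDP solver):

```python
import numpy as np

def is_negative_definite(M, tol=1e-9):
    """Check M < 0 via the eigenvalues of its symmetric part."""
    S = 0.5 * (M + M.T)
    return bool(np.max(np.linalg.eigvalsh(S)) < -tol)

# Hypothetical stability-type LMI  A^T P + P A + Q < 0  with a trial
# positive diagonal P; the paper's conditions (11)-(16), (33), etc.
# are verified the same way once the block matrices are assembled.
A = np.array([[-3.0, 0.5], [0.2, -2.5]])
Q = np.eye(2)
P = np.diag([1.0, 1.2])
lmi = A.T @ P + P @ A + Q
print(is_negative_definite(lmi))  # True for this stable A
```

In practice one searches for the decision variables (here P) with a semidefinite-programming solver rather than guessing them.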

Throughout this paper, let A^T, A^{-1}, λ_min(A), λ_max(A), and ||A|| denote the transpose, the inverse, the smallest eigenvalue, the largest eigenvalue, and the Euclidean norm of a square matrix A, respectively. Let P > 0 (P < 0) denote a positive (negative) definite symmetric matrix P. Let I and 0 denote the identity matrix and the zero matrix with compatible dimensions, respectively. Time delays and are all bounded, i.e., , , , , , , , , and .
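The notational quantities above are all directly computable; a small numeric illustration (the symmetric matrix A is an arbitrary example) also records two standard identities for symmetric positive definite matrices:

```python
import numpy as np

# The notation A^T, A^{-1}, lambda_min(A), lambda_max(A), ||A|| used
# throughout the paper, evaluated for a small symmetric example.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
eigs = np.linalg.eigvalsh(A)            # sorted ascending for symmetric A
lam_min, lam_max = eigs[0], eigs[-1]
spec_norm = np.linalg.norm(A, 2)        # Euclidean (spectral) norm
A_inv = np.linalg.inv(A)
# For symmetric positive definite A: ||A|| = lambda_max(A), and
# lambda_min(A^{-1}) = 1 / lambda_max(A).
print(np.isclose(spec_norm, lam_max))                         # True
print(np.isclose(np.linalg.eigvalsh(A_inv)[0], 1 / lam_max))  # True
```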

    II. GLOBAL ASYMPTOTIC STABILITY RESULTS

In this section, we need the following assumptions and lemma.

Assumption 2.1: There exist constants such that the function satisfies

    (4)

for , , and .

Assumption 2.2: The bounded activation functions , , and satisfy the following conditions:

    (5)

    (6)

    (7)

    for , , and and for , , and.

    Let , ,, and .

Assumption 2.3: The amplification function is positive and continuous, i.e., for all .

Lemma 2.1 (See [37]): Let and be two real vectors with appropriate dimensions, and let and be two matrices with appropriate dimensions, where . Then, for any two positive constants and , the following inequality holds:

    (8)

According to [35], for every external constant input , neural network (3) has an equilibrium point if , , , , and satisfy the aforementioned conditions . Letting , then model (3) is transformed into the following form:

    (9)

    or in a vector matrix form by the method in [48]

    (10)

    where ,,

    , ,,



    ,, ,

    ,,

    , ,,

    , ,, , , and

. The initial condition of (10) is of the form for , and its supremum bound is .

We now state and prove the results of this paper as follows.

Theorem 2.1: Suppose that Assumptions 2.1–2.3 hold, , and . If there exist positive diagonal matrices , , , , and and positive definite symmetric matrices , , , , , , , and such that the following conditions hold:

    (11)

    (12)

    (13)

    (14)

    (15)

    (16)

then the equilibrium point of model (10) is globally asymptotically stable, where and ( , , and ).

Proof: We will prove the theorem in two steps. First, we will show that (11)–(16) are sufficient conditions guaranteeing the uniqueness of the equilibrium point of system (10). Consider (10) at equilibrium point , i.e.,

(17)

If , , and , then it is easy to see that . Now, suppose that . Then, , , and . Multiplying , , and on both sides of (17) yields

    (18)

where we have used Lemma 2.1.

However, for , , and , from (12)–(14), we have

(19)

Obviously, (18) contradicts (19), which, in turn, implies that, at the equilibrium point , , , and . This means that the origin of (10) is the unique equilibrium point.

Second, we will show that conditions (11)–(16) are also sufficient conditions guaranteeing the global asymptotic stability of the equilibrium point of system (10). Consider the following Lyapunov–Krasovskii functional

    (20)

    where

    where , , , , , ,, and and .
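The mechanics behind such a functional can be seen in a delay-free analogue: for a linear system xdot = A x with A Hurwitz, solving the Lyapunov equation A^T P + P A = -I gives P > 0, and V(x) = x^T P x strictly decreases along trajectories. A minimal numerical sketch (the matrix A below is an arbitrary illustrative choice, not from the paper):

```python
import numpy as np

# Solve A^T P + P A = -I for P via row-major vectorization:
# (A^T (x) I + I (x) A^T) vec(P) = vec(-I), then verify P > 0 and
# that V(x) = x^T P x decreases along an Euler-integrated trajectory.
A = np.array([[-2.0, 1.0], [0.0, -1.5]])
n = A.shape[0]
I = np.eye(n)
M = np.kron(A.T, I) + np.kron(I, A.T)
P = np.linalg.solve(M, (-I).flatten()).reshape(n, n)

V = lambda x: float(x @ P @ x)
x = np.array([1.0, -2.0])
dt = 0.001
vals = []
for _ in range(5000):
    vals.append(V(x))
    x = x + dt * (A @ x)

p_posdef = bool(np.min(np.linalg.eigvalsh(0.5 * (P + P.T))) > 0)
v_decreasing = all(b < a for a, b in zip(vals, vals[1:]))
print(p_posdef, v_decreasing)  # True True
```

The Lyapunov–Krasovskii functional in the proof plays the same role, with additional integral terms over the delay intervals absorbing the delayed contributions.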



The derivative of along the trajectories of (10) is as follows:

    (21)

    By Lemma 2.1, for and positive constants ,, we have

Then, (21) yields

    (22)

Similarly, the derivative of along the trajectories of (10) is as follows:

    (23)

    Thus, we have

    (24)

The derivative of along the trajectories of (10) is given by (25), shown at the bottom of the page, where , , , , and .

    (25)



By Assumptions 2.1 and 2.2 and Lemma 2.1, we have (26), shown at the bottom of the page. Therefore, see (27), shown at the bottom of the next page. Now, we choose such that

(28)

Then

    (29)

Meanwhile, see the second equation at the bottom of the next page, where is defined in (13). If we choose such that

    (30)

then we have for .

Similarly, for a sufficiently large number , the condition in (31), shown at the bottom of page 622, holds for , where is defined in (12).

    (26)



Moreover, by condition (15) in Theorem 2.1, we have . Then, there exists a sufficiently large constant such that

    (32)

Therefore, by (14), (28), (30), (31), and (32), and from (27), we have for , , , and . if and only if , , , and . According to Lyapunov stability theory, the origin of (10) is globally asymptotically stable.

If we take another Lyapunov functional, we can obtain the following result.

Theorem 2.2: Suppose that Assumptions 2.1–2.3 hold, , and . If there exist positive diagonal matrices , , , , , , , , and such that the following LMI holds:

    (33)

then the equilibrium point of model (10) is globally asymptotically stable, where , ,

    (27)



    , ,, ,

    , ,, ,

    , ,, ,

, ,, denotes the corresponding symmetric part in a matrix, , and ( , , and ).

Proof: Consider the following Lyapunov–Krasovskii functional:

    (34)

    where and are defined in (20), except that, and

    where , , , , and (and ).

The derivative of along the trajectories of (10) is given by (35), shown at the bottom of the page, where , , , , and ( ).

By Assumption 2.2, the following inequalities hold for positive diagonal matrices , , and ( and ):

    (36)

    Combining (24), (35), and (36), we have

    (37)

    where

    .

Obviously, if , then for . By Lyapunov stability theory, the origin of system (10) is globally asymptotically stable.

We have the following results, which can be derived in a similar manner to the proof of Theorem 2.2.

    Theorem 2.3: Suppose that Assumptions 2.1–2.3 hold,, and . If there exist positive

    (31)

    (35)



diagonal matrices , , , , and such that the following LMI holds:

    (38)

then the equilibrium point of model (10) is globally asymptotically stable, where

    ,, , ,

    , ,, ,

, denotes the corresponding symmetric part in a matrix, , and

( , , and ).

Proof: Consider the following Lyapunov–Krasovskii functional

    (39)

where and are the same as those defined in (20), except that , and

In a similar manner to the proof of Theorem 2.2, Theorem 2.3 can easily be derived.

Theorem 2.4: Suppose that Assumptions 2.1–2.3 hold, , and . If there exist positive diagonal matrices , , , and such that the following LMI holds:

    (40)

then the equilibrium point of model (10) is globally asymptotically stable, where

    , ,, ,

, ,, denotes the corresponding symmetric part in a matrix, , and ( , , and ).

    When and , i.e., ,, system (10) becomes

    (41)

We have the following results, which can be derived in a similar manner to the proof of Theorem 2.2.

Theorem 2.5: Suppose that Assumptions 2.1–2.3 hold and . If there exist positive diagonal matrices , , and such that the following condition holds:

    (42)

then system (41) is globally asymptotically stable, where ( , , and ).

Corollary 2.1: Suppose that Assumptions 2.1–2.3 hold and . If there exist positive diagonal matrices , , and such that the following condition holds:

    (43)

then system (41) is globally asymptotically stable, where ( , , and ).

For neural network (41) with ( and ), we have the following result, which can be obtained directly from Theorem 2.5.

Corollary 2.2: Suppose that Assumptions 2.1–2.3 hold and . If there exist positive diagonal matrices and such that the following condition holds:

(44)

then the equilibrium point of model (41) with is globally asymptotically stable, where ( , , and ).

For the following Cohen–Grossberg neural networks:

    (45)

where , , , , and are the same as those in (3) ( ). We have the following result.

Corollary 2.3: Suppose that Assumptions 2.1–2.3 hold and . If there exist positive diagonal matrices and such that the following condition holds:

(46)

then the equilibrium point of model (45) is globally asymptotically stable for a given , where , is a square matrix whose th row is composed of the th row of square matrix , and the other rows are all zero ( ).



    III. GLOBAL ROBUST STABILITY RESULT

Consider the Cohen–Grossberg neural network (9) with uncertainties in the form of

    (47)

    or in a vector matrix form

    (48)

where , , and denote the unknown connection weight coefficients representing time-varying parameter uncertainties, and other notations are the same as those in (9). For convenience of description, we let , , , , , and ( , , and ).

Assumption 3.1: Parameter uncertainties satisfy

, , and , respectively, where , , , , , and are the known structural matrices and is an unknown time-varying function matrix satisfying ( and

).

Lemma 3.1 (See [39]): If Y, D, and E are real matrices of appropriate dimensions with Y satisfying Y = Y^T, then Y + DFE + E^T F^T D^T < 0 for all F satisfying F^T F ≤ I, if and only if there exists a positive constant ε such that Y + ε D D^T + ε^{-1} E^T E < 0.
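This is the standard Petersen-type bound for norm-bounded uncertainty, and its sufficiency direction is easy to spot-check numerically (the matrices Y, D, E and the constant eps below are arbitrary illustrative choices):

```python
import numpy as np

# Spot-check: if Y + eps*D*D^T + (1/eps)*E^T*E < 0 for some eps > 0,
# then Y + D F E + E^T F^T D^T < 0 for every F with F^T F <= I.
def max_eig(S):
    return float(np.max(np.linalg.eigvalsh(0.5 * (S + S.T))))

rng = np.random.default_rng(0)
Y = -5.0 * np.eye(3)
D = 0.3 * rng.normal(size=(3, 2))
E = 0.3 * rng.normal(size=(2, 3))
eps = 1.0

bound_ok = max_eig(Y + eps * D @ D.T + (1.0 / eps) * E.T @ E) < 0
samples_ok = True
for _ in range(100):
    F = rng.normal(size=(2, 2))
    F = F / max(1.0, np.linalg.norm(F, 2))   # enforce F^T F <= I
    samples_ok &= max_eig(Y + D @ F @ E + E.T @ F.T @ D.T) < 0
print(bound_ok, samples_ok)  # True True
```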

Theorem 3.1: Suppose that Assumptions 2.1–2.3 and 3.1 hold, , and . If there exist positive diagonal matrices , , , , , , , , and , positive constants , , , and , , , , such that the following LMI holds:

(49)

then system (47) is globally robustly stable, where

    , ,,

    , ,, ,

    , ,,,,

, denotes the corresponding symmetric part in a matrix, and others are defined in Theorem 2.2.

Proof: Consider the same Lyapunov–Krasovskii functional defined in (20). In a similar manner to the proof of Theorem 2.2, the derivative of (20) along the trajectories of (48) is of the form

    (50)

    with being defined in (37)

    (51)

    where , ,, ,

    , , ,, ,

, ,, and others are defined in Theorem 2.2.

According to Assumption 3.1, Lemma 3.1, and the Schur complement, in a similar routine to the proof in [46], (51) is equivalent to (49). The details are omitted.
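The Schur complement step invoked here rests on the standard equivalence: for a symmetric block matrix M = [[A, B], [B^T, C]] with C > 0, M > 0 if and only if C > 0 and A - B C^{-1} B^T > 0. A small numerical illustration (the blocks are arbitrary examples):

```python
import numpy as np

def posdef(S):
    """Positive definiteness via eigenvalues of the symmetric part."""
    return bool(np.min(np.linalg.eigvalsh(0.5 * (S + S.T))) > 0)

A = np.array([[3.0, 0.5], [0.5, 2.0]])
B = np.array([[0.4, 0.1], [0.0, 0.3]])
C = np.array([[2.0, 0.2], [0.2, 1.5]])
M = np.block([[A, B], [B.T, C]])
schur = A - B @ np.linalg.inv(C) @ B.T   # Schur complement of C in M
print(posdef(M), posdef(C) and posdef(schur))  # True True -- the tests agree
```

The same equivalence (stated for negative definiteness) is what lets the nonlinear-looking robust condition be rewritten as the single LMI (49).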

IV. GLOBAL ASYMPTOTIC STABILITY OF COHEN–GROSSBERG NEURAL NETWORKS WITH NONNEGATIVE EQUILIBRIUM POINTS

In the original paper of Cohen and Grossberg [11], the Cohen–Grossberg neural network model was proposed as a kind of competitive-cooperation dynamical system for decision rules, pattern formation, and parallel memory storage. Therefore, each state of the neuron might be the population size, activity, or concentration of the th species in the



system, which is always nonnegative. It is clear that this subset of Cohen–Grossberg neural networks includes the famous Lotka–Volterra recurrent neural networks [11], [43], [44].

When systems (1) and (3) are applied to biology, initial conditions will be of the following type:

    (52)

where each is a continuous function defined on .

In applications, the activation functions of model (3) may not be bounded, for example, in the Lotka–Volterra model [43], [44]. Therefore, in this section, we will employ the following assumptions and lemmas.

    Assumption 4.1: satisfies (4) and ,.

Assumption 4.2: Activation functions , , and satisfy conditions (5)–(7), and , , and , .

Assumption 4.3: The amplification function for all and , and for any , holds for all .

Lemma 4.1: Assume that satisfies Assumption 4.3. Then, the solution of system (3) is positive with the initial condition (52).

If Assumption 4.3 holds, then the nonnegative equilibrium point of system (3) is a solution of the following:

    (53)

    where

Although (53) might possess multiple solutions, in a similar routine to [30, Proof of Proposition 1], we can show that if is an asymptotically stable nonnegative equilibrium point of system (3), then it must be a solution of the following problem:

    (54)

where .

Lemma 4.2 (See [30]): Equation (54) has a unique solution for every if and only if is norm coercive, i.e.,

    and locally univalent, where ,, ,

    , if and if, if and if , .

Lemma 4.3 (See [14], [30]): Let be an matrix, , and . If there exists a positive definite diagonal matrix such that

holds, then, for any positive definite diagonal matrix and nonnegative definite diagonal matrix satisfying , we have , i.e., is nonsingular.

Theorem 4.1: Suppose that Assumptions 4.1–4.3 hold, , and ( and ). Then, system (3) with initial conditions (52) has a unique nonnegative equilibrium point, which is globally asymptotically stable if there exist positive definite diagonal matrices , , , and such that the following condition holds:

    (55)

where denotes the symmetric parts in a matrix, and its variables are described

Proof: We prove Theorem 4.1 in two steps. First, we will show that (55) ensures the existence and uniqueness of a nonnegative equilibrium point of system (3).

From (55), we know that , which also means that

is positive definite, where we have used Lemma 2.1 and the facts that and

According to Lemma 4.2, we only need to prove that is norm coercive and locally univalent. We first prove that is locally univalent. For any , without loss of generality, by some rearrangement of , we can assume that if ; if ; and if for some integers . Moreover, if is sufficiently close to , without loss of generality, we can also assume that if ; if ; if ;



if ; and if for some integers . It can be seen that

    (56)

    (57)

    where , ,, ,

    otherwise

    otherwise

    otherwise

    otherwise.

    Then, , , , and ().

    If , then we have

(58)

By (56), without loss of generality, we can assume that

where and for some integer . Therefore, (58) can be written as

According to Lemma 4.3 and from , is nonsingular. By matrix decomposition, we can easily obtain and , respectively. Therefore, and , i.e., , which implies that is locally univalent.

We next prove that is norm coercive. Suppose that there exists a sequence such that . Then, there exists some index such that , which implies that in the case of the bounded activation functions , , and . In the case of , , or , we will prove that .

Simple algebraic manipulation leads to

    (59)

Let , , , , and by Lemma 2.1, (59) becomes



    (60)

where . Therefore, , which implies that is norm coercive. In combination with Lemma 4.2, (55) ensures the existence and uniqueness of the nonnegative equilibrium point of system (3).

Second, we will prove that (55) is also a sufficient condition guaranteeing the global asymptotic stability of the nonnegative equilibrium point of system (3). It suffices to show that ensures the global asymptotic stability of the nonnegative equilibrium point.

Let be the nonnegative equilibrium point of system (3) and . Then, model (3) is transformed into the following form:

    (61)

    with

where , the initial condition satisfies (52), and others are the same as those defined in (10).

Since is the nonnegative equilibrium point of system (3), from (54) we know that holds for all . This implies that , , and hold for all and .

Consider the same Lyapunov functional used in the proof of Theorem 2.3 with , and following the same routine as the proof of Theorem 2.3, we can show that guarantees the global asymptotic stability of the nonnegative equilibrium point of system (3). Details are omitted.

Remark 4.1: If , is a constant delay in system (61), , and under the same assumptions of Theorem 4.1, we can prove that conditions (42) in Theorem 2.5 and (43) in Corollary 2.1 ensure the global asymptotic stability of the nonnegative equilibrium point of system (61).

Remark 4.2: In a similar routine to the proof of Theorem 3.1, we can easily derive the robust stability criterion from Theorem 4.1 in the case of uncertainties satisfying Assumption 3.1.

    V. ILLUSTRATIVE EXAMPLES

In this section, we will use two examples to show the effectiveness of the obtained results.

Example 5.1: Consider the following Cohen–Grossberg neural network with two neurons:

    (62)

    where and areany bounded constant delays and are some kinds ofamplification functions satisfying Assumption 2.3, whose exactvalues of lower and upper bounds may be unknown ( ).Obviously, the results in [10] and [19] cannot be applied to thisexample. Pertaining to this example, the result in [27] and [7,Th. 2] are not satisfied. Applying Corollary 2.3 of this paper, wehave , ,and . Therefore, the neural net-work (62) is globally asymptotically stable. Moreover, if we let

in (62), , [24, Ths. 1 and 2], [47, Th. 1], and [4, Th. 3] are not satisfied either.
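Because the exact coefficients of Example 5.1 do not survive in this rendering, a behavior of this kind can still be sketched numerically. The following simulation uses hypothetical parameters (bounded amplification functions, a constant delay, and tanh activations) chosen only to illustrate a globally stable two-neuron Cohen–Grossberg network; none of the numbers are the paper's:

```python
import math

# Hypothetical two-neuron Cohen-Grossberg network with one constant delay:
#   x_i'(t) = -a_i(x_i) [ b_i * x_i - sum_j t_ij * f(x_j(t - tau)) ]
A_LOW, A_UP = 0.5, 1.5             # assumed bounds on the amplification functions

def amp(x):
    """Bounded positive amplification function in [A_LOW, A_UP]."""
    return A_LOW + (A_UP - A_LOW) / (1.0 + x * x)

B = [2.0, 2.0]                     # behaved-function slopes b_i (hypothetical)
T = [[0.3, -0.2], [0.1, 0.4]]      # delayed connection weights t_ij (hypothetical)
f = math.tanh                      # activation function

def simulate(x0, tau=1.0, h=0.01, steps=5000):
    """Forward-Euler integration; a FIFO buffer supplies the delayed state."""
    lag = int(round(tau / h))
    hist = [list(x0) for _ in range(lag + 1)]   # constant initial condition
    x = list(x0)
    for _ in range(steps):
        xd = hist.pop(0)                        # state at time t - tau
        x = [x[i] + h * (-amp(x[i]) * (B[i] * x[i]
                 - sum(T[i][j] * f(xd[j]) for j in range(2))))
             for i in range(2)]
        hist.append(list(x))
    return x

print(simulate([1.0, -0.8]))       # both components decay toward the origin
```

With the diagonal dominance built into these numbers, trajectories from any bounded constant initial history decay toward the zero equilibrium, which is the qualitative behavior the corollary guarantees.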

Example 5.2: Consider the following two Cohen–Grossberg neural networks with two neurons:

    (63)

    (64)

    where , ,

    , and .

Pertaining to this example, [28, Th. 4] cannot judge the stability of system (63). Applying Corollary 2.2 and Theorem 2.5, respectively, we can show that systems (63) and (64) are both globally asymptotically stable.

In the following, we set initial conditions and for , respectively.

When , the equilibrium points of system (63) are , , , and . Among them, is the nonnegative equilibrium point. is the unique equilibrium point of system (64). Obviously, the different dynamics of systems (63) and (64) are caused by the different amplification functions.
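Equilibria of such networks are the zeros of the algebraic system obtained by setting the right-hand side to zero (with the positive amplification factored out). As a hedged illustration of how multiple equilibria arise and can be located numerically, the scalar toy problem below (not the paper's system (63)) uses g(x) = x - 2*tanh(x), which has three zeros:

```python
import math

# Equilibria solve g_i(x) = b_i(x_i) - sum_j t_ij f_j(x_j) - J_i = 0.
# Illustrative scalar case with three equilibria (hypothetical system):
def g(x):
    return x - 2.0 * math.tanh(x)

def bisect(lo, hi, tol=1e-10):
    """Standard bisection on an interval where g changes sign."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Scan a grid for sign changes, then refine each bracket by bisection.
grid = [(-4.05 + 0.1 * k) for k in range(82)]
roots = []
for a, b in zip(grid, grid[1:]):
    if g(a) * g(b) < 0:
        roots.append(bisect(a, b))
print([round(r, 4) for r in roots])   # three equilibria near -1.915, 0, 1.915
```

The same scan-and-refine idea extends to two-neuron systems by gridding the plane, which is one practical way to enumerate the four equilibria of a system like (63) and identify the nonnegative one.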

    Authorized licensed use limited to: University of Illinois. Downloaded on May 4, 2009 at 15:17 from IEEE Xplore. Restrictions apply.


    VI. CONCLUSION

Under different assumptions on the amplification functions, some sufficient criteria have been derived for the global asymptotic stability of a class of Cohen–Grossberg neural networks with both multiple time-varying delays and continuously distributed delays; these criteria are independent of the size of the time-varying delays and of the amplification functions. The results can be applied to some special cases of Cohen–Grossberg neural networks, for example, Hopfield neural networks and Lotka–Volterra recurrent neural networks. Two numerical examples have been employed to demonstrate the effectiveness of the obtained results.

    REFERENCES

[1] S. Arik and Z. Orman, “Global stability analysis of Cohen–Grossberg neural networks with time-varying delays,” Phys. Lett. A, vol. 341, no. 5/6, pp. 410–421, 2005.

[2] S. Boyd, L. El Ghaoui, E. Feron, and V. Balakrishnan, Linear Matrix Inequalities in System and Control Theory, ser. Studies in Applied Mathematics. Philadelphia, PA: SIAM, 1994, vol. 15.

[3] J. Cao, K. Yuan, and H. X. Li, “Global asymptotical stability of recurrent neural networks with multiple discrete delays and distributed delays,” IEEE Trans. Neural Netw., vol. 17, no. 6, pp. 1646–1651, Nov. 2006.

[4] T. Chen, “Global exponential stability of delayed Hopfield neural networks,” Neural Netw., vol. 14, no. 8, pp. 977–980, Oct. 2001.

[5] T. Chen and L. Rong, “Delay-independent stability analysis of Cohen–Grossberg neural networks,” Phys. Lett. A, vol. 317, no. 5/6, pp. 436–449, Oct. 2003.

[6] T. Chen and W. Lu, “Global asymptotical stability of Cohen–Grossberg neural networks with time-varying and distributed delays,” Proc. Int. Symp. Neural Netw., ser. Lecture Notes in Computer Science, vol. 3971, pp. 192–197, 2006.

[7] T. Chen and L. Rong, “Robust global exponential stability of Cohen–Grossberg neural networks with time delays,” IEEE Trans. Neural Netw., vol. 15, no. 1, pp. 203–206, Jan. 2004.

[8] W.-H. Chen and W. Zheng, “Global asymptotic stability of a class of neural networks with distributed delays,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 53, no. 3, pp. 644–652, Mar. 2006.

[9] Y. Chen, “Global stability of neural networks with distributed delays,” Neural Netw., vol. 15, no. 7, pp. 867–871, Sep. 2002.

[10] Y. Chen, “Global asymptotic stability of delayed Cohen–Grossberg neural networks,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 53, no. 2, pp. 351–357, Feb. 2006.

[11] M. A. Cohen and S. Grossberg, “Absolute stability of global pattern formation and parallel memory storage by competitive neural networks,” IEEE Trans. Syst., Man, Cybern., vol. SMC-13, no. 5, pp. 815–826, Sep./Oct. 1983.

[12] L. Crocco, “Aspects of combustion stability in liquid propellant rocket motors, Part I: Fundamentals of low frequency instability with monopropellants,” J. Amer. Rocket Soc., vol. 21, no. 2, pp. 163–178, Feb. 1951.

[13] Y. A. Fiagbedzi and A. E. Pearson, “A multistage reduction technique for feedback stabilizing distributed time-lag systems,” Automatica, vol. 23, no. 3, pp. 311–326, May 1987.

[14] M. Forti and A. Tesi, “New conditions for global stability of neural networks with application to linear and quadratic programming problems,” IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 42, no. 7, pp. 354–366, Jul. 1995.

[15] M. Forti, P. Nistri, and M. Quincampoix, “Generalized neural network for nonsmooth nonlinear programming problems,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 51, no. 9, pp. 1741–1754, Sep. 2004.

[16] M. Forti, P. Nistri, and D. Papini, “Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain,” IEEE Trans. Neural Netw., vol. 16, no. 6, pp. 1449–1463, Nov. 2005.

[17] K. Gopalsamy and X. He, “Stability in asymmetric Hopfield nets with transmission delays,” Phys. D, Nonlinear Phenomena, vol. 76, no. 4, pp. 344–358, Sep. 1994.

[18] K. Gopalsamy and X. He, “Delay-independent stability in bidirectional associative neural networks,” IEEE Trans. Neural Netw., vol. 5, no. 6, pp. 998–1002, Nov. 1994.

[19] S. Guo and L. Huang, “Stability analysis of Cohen–Grossberg neural networks,” IEEE Trans. Neural Netw., vol. 17, no. 1, pp. 106–116, Jan. 2006.

[20] J. J. Hopfield, “Neurons with graded response have collective computational properties like those of two-state neurons,” Proc. Nat. Acad. Sci. USA, vol. 81, no. 10, pp. 3088–3092, May 1984.

[21] H. Huang, D. W. C. Ho, and J. D. Cao, “Analysis of global exponential stability and periodic solutions of neural networks with time-varying delays,” Neural Netw., vol. 18, no. 2, pp. 161–170, Mar. 2005.

[22] V. B. Kolmanovskii and J.-P. Richard, “Stability of some linear systems with delays,” IEEE Trans. Autom. Control, vol. 44, no. 5, pp. 984–989, May 1999.

[23] J. Lam, H. Gao, and C. Wang, “H∞ model reduction of linear systems with distributed delay,” Proc. Inst. Elect. Eng.—Control Theory Appl., vol. 152, no. 6, pp. 662–674, Nov. 2005.

[24] X. X. Liao and J. Wang, “Algebraic criteria for global exponential stability of cellular neural networks with multiple time delays,” IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 50, no. 2, pp. 268–275, Feb. 2003.

[25] P. Liu and Q.-L. Han, “On stability of recurrent neural networks—An approach from Volterra integro-differential equations,” IEEE Trans. Neural Netw., vol. 17, no. 1, pp. 264–267, Jan. 2006.

[26] H. Lu, “Global exponential stability analysis of Cohen–Grossberg neural networks,” IEEE Trans. Circuits Syst. II, Exp. Briefs, vol. 52, no. 9, pp. 476–479, Aug. 2005.

[27] H. Lu, R. Shen, and F.-L. Chung, “Global exponential convergence of Cohen–Grossberg neural networks with time delays,” IEEE Trans. Neural Netw., vol. 16, no. 6, pp. 1694–1696, Nov. 2005.

[28] K. Lu, D. Xu, and Z. Yang, “Global attraction and stability for Cohen–Grossberg neural networks with delays,” Neural Netw., vol. 19, no. 10, pp. 1538–1549, Dec. 2006.

[29] W. Lu and T. Chen, “New conditions on global stability of Cohen–Grossberg neural networks,” Neural Comput., vol. 15, no. 5, pp. 1173–1189, May 2003.

[30] W. Lu and T. Chen, “R₊ⁿ-global stability of Cohen–Grossberg neural network system with nonnegative equilibria,” Neural Netw., vol. 20, no. 6, pp. 714–722, Aug. 2007.

[31] S.-I. Niculescu, Delay Effects on Stability—A Robust Control Approach. London, U.K.: Springer-Verlag, 2001.

[32] J.-P. Richard, “Time-delay systems: An overview of some recent advances and open problems,” Automatica, vol. 39, no. 10, pp. 1667–1694, Oct. 2003.

[33] L. Rong, “LMI-based criteria for robust stability of Cohen–Grossberg neural networks with delay,” Phys. Lett. A, vol. 339, no. 1/2, pp. 63–73, May 2005.

[34] L. Wang, “Stability of Cohen–Grossberg neural networks with distributed delays,” Appl. Math. Comput., vol. 160, no. 1, pp. 93–110, Jan. 2005.

[35] L. Wang and X. Zou, “Harmless delays in Cohen–Grossberg neural networks,” Phys. D, Nonlinear Phenomena, vol. 170, no. 2, pp. 162–173, Sep. 2002.

[36] L. Wang and X. Zou, “Exponential stability of Cohen–Grossberg neural networks,” Neural Netw., vol. 15, no. 3, pp. 415–422, Apr. 2002.

[37] Z. Wang, H. Zhang, and W. Yu, “Robust exponential stability analysis of neural networks with multiple time delays,” Neurocomputing, vol. 70, no. 13–15, pp. 2534–2543, Aug. 2007.

[38] L. Xie, E. Fridman, and U. Shaked, “Robust H∞ control of distributed delay systems with application to combustion control,” IEEE Trans. Autom. Control, vol. 46, no. 12, pp. 1930–1935, Dec. 2001.

[39] L. Xie, M. Fu, and C. E. de Souza, “H∞ control and quadratic stabilization of systems with parameter uncertainty via output feedback,” IEEE Trans. Autom. Control, vol. 37, no. 8, pp. 1253–1256, Aug. 1992.

[40] W. Xiong and J. Cao, “Global exponential stability of discrete-time Cohen–Grossberg neural networks,” Neurocomputing, vol. 64, pp. 433–446, Mar. 2005.

[41] S. Xu, J. Lam, T. Chen, and Y. Zou, “A delay-dependent approach to robust H∞ filtering for uncertain distributed delay systems,” IEEE Trans. Signal Process., vol. 53, no. 10, pp. 3764–3772, Oct. 2005.

[42] H. Ye, A. N. Michel, and K. Wang, “Qualitative analysis of Cohen–Grossberg neural networks with multiple delays,” Phys. Rev. E, Stat. Phys. Plasmas Fluids Relat. Interdiscip. Top., vol. 51, no. 3, pp. 2611–2618, Mar. 1995.

[43] Z. Yi and K. K. Tan, “Dynamic stability conditions for Lotka–Volterra recurrent neural networks with delays,” Phys. Rev. E, Stat. Phys. Plasmas Fluids Relat. Interdiscip. Top., vol. 66, no. 1, p. 011910, Jul. 2002.



[44] Z. Yi and K. K. Tan, “Global convergence of Lotka–Volterra recurrent neural networks with delays,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 52, no. 11, pp. 2482–2489, Nov. 2005.

[45] K. Yuan and J. Cao, “An analysis of global asymptotic stability of delayed Cohen–Grossberg neural networks via nonsmooth analysis,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 52, no. 9, pp. 1854–1861, Sep. 2005.

[46] K. Yuan, J. Cao, and H. X. Li, “Robust stability of switched Cohen–Grossberg neural networks with mixed time-varying delays,” IEEE Trans. Syst., Man, Cybern. B, Cybern., vol. 36, no. 6, pp. 1356–1363, Dec. 2006.

[47] Z. Zeng, J. Wang, and X. X. Liao, “Global exponential stability of a general class of recurrent neural networks with time-varying delays,” IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 50, no. 10, pp. 1353–1358, Oct. 2003.

[48] H. Zhang and Z. Wang, “Global asymptotic stability of delayed cellular neural networks,” IEEE Trans. Neural Netw., vol. 18, no. 3, pp. 947–950, May 2007.

[49] H. Zhang, Z. Wang, and D. Liu, “Robust exponential stability of recurrent neural networks with multiple time-varying delays,” IEEE Trans. Circuits Syst. II, Exp. Briefs, vol. 54, no. 8, pp. 730–734, Aug. 2007.

[50] H. Zhang, Z. Wang, and D. Liu, “Global asymptotic stability of recurrent neural networks with multiple time-varying delays,” IEEE Trans. Neural Netw., vol. 19, no. 5, pp. 855–873, May 2008.

Huaguang Zhang (SM’04) was born in Jilin, China, in 1959. He received the B.S. and M.S. degrees in control engineering from Northeastern Dianli University of China, Jilin City, China, in 1982 and 1985, respectively, and the Ph.D. degree in thermal power engineering and automation from Southeast University, Nanjing, China, in 1991.

He was with the Department of Automatic Control, Northeastern University, Shenyang, China, in 1992, as a Postdoctoral Fellow for two years. Since 1994, he has been a Professor and the Head of the Institute of Electric Automation, School of Information Science and Engineering, Northeastern University. His main research interests include fuzzy control, stochastic system control, neural-network-based control, nonlinear control, and their applications.

Dr. Zhang is an Associate Editor for the IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS—PART B and Automatica. He was the recipient of the Outstanding Youth Science Foundation Award from the National Natural Science Foundation of China in 2003. He was named a Cheung Kong Scholar by the Education Ministry of China in 2005.

Zhanshan Wang was born in Liaoning, China, in 1971. He received the M.S. degree in control theory and control engineering from Fushun Petroleum Institute, Fushun, China, in 2001 and the Ph.D. degree in control theory and control engineering from Northeastern University, Shenyang, China, in 2006.

He is currently an Associate Professor with the School of Information Science and Engineering, Northeastern University. His research interests include stability analysis of recurrent neural networks, fault diagnosis, fault-tolerant control, and nonlinear control.

Derong Liu (S’91–M’94–SM’96–F’05) received the Ph.D. degree in electrical engineering from the University of Notre Dame, Notre Dame, IN, in 1994.

From 1993 to 1995, he was a Staff Fellow with the General Motors Research and Development Center, Warren, MI. From 1995 to 1999, he was an Assistant Professor with the Department of Electrical and Computer Engineering, Stevens Institute of Technology, Hoboken, NJ. He has been with the University of Illinois, Chicago, since 1999, where he is currently a Full Professor of electrical and computer engineering and computer science and has also been the Director of Graduate Studies with the Department of Electrical and Computer Engineering since 2005. He has published eight books (four research monographs and four edited volumes).

Dr. Liu is an Associate Editor of Automatica. He is currently the Editor of the IEEE Computational Intelligence Society’s Electronic Letter and an Associate Editor for the IEEE TRANSACTIONS ON NEURAL NETWORKS, the IEEE Computational Intelligence Magazine, and the IEEE Circuits and Systems Magazine. He is the General Chair for the 2008 IEEE International Conference on Networking, Sensing and Control (Sanya, China). He was the Program Chair for the 2008 International Joint Conference on Neural Networks, the 2007 IEEE International Symposium on Approximate Dynamic Programming and Reinforcement Learning, the 21st IEEE International Symposium on Intelligent Control (2006), and the 2006 International Conference on Networking, Sensing and Control. He is an elected AdCom Member of the IEEE Computational Intelligence Society (2006–2008), the Chair of the Chicago Chapter of the IEEE Computational Intelligence Society, the Chair of the Technical Committee on Neural Networks of the IEEE Computational Intelligence Society, and the Past Chair of the Technical Committee on Neural Systems and Applications of the IEEE Circuits and Systems Society. He was the recipient of the Michael J. Birck Fellowship from the University of Notre Dame (1990), the Harvey N. Davis Distinguished Teaching Award from Stevens Institute of Technology (1997), the Faculty Early Career Development (CAREER) Award from the National Science Foundation (1999), and the University Scholar Award from the University of Illinois (2006–2009). He is a member of Eta Kappa Nu.
