

  • Research Article
  • Open Access

Global Exponential Stability of Delayed Cohen-Grossberg BAM Neural Networks with Impulses on Time Scales

Journal of Inequalities and Applications 2009, 2009:491268

  • Received: 18 April 2009
  • Accepted: 14 July 2009
  • Published:


Based on the theory of calculus on time scales, homeomorphism theory, the Lyapunov functional method, and some analysis techniques, sufficient conditions are obtained for the existence, uniqueness, and global exponential stability of the equilibrium point of Cohen-Grossberg bidirectional associative memory (BAM) neural networks with distributed delays and impulses on time scales. This is the first time that time-scale calculus has been applied to unify the discrete-time and continuous-time Cohen-Grossberg BAM neural networks with impulses within a single framework.


  • Neural Network
  • Equilibrium Point
  • Impulsive Effect
  • Generalized Exponential Function
  • Hopfield Neural Network

1. Introduction

In recent years, bidirectional associative memory (BAM) neural networks and Cohen-Grossberg neural networks (CGNNs), together with their various generalizations, have attracted the attention of many mathematicians, physicists, and computer scientists (see [1–17]) due to their wide range of applications in, for example, pattern recognition, associative memory, and combinatorial optimization. In particular, as discussed in [18–20], time delays may occur during the communication and response of neurons in the hardware implementation of neural networks. Indeed, time delays are known to be a possible source of instability in many real-world systems in engineering, biology, and so forth (see, e.g., [21] and the references therein). Besides the delay effect, however, the impulsive effect likewise exists in a wide variety of evolutionary processes in which states change abruptly at certain moments of time, in fields such as medicine and biology, economics, mechanics, electronics, and telecommunications. As artificial electronic systems, neural networks such as Hopfield neural networks, bidirectional neural networks, and recurrent neural networks are often subject to impulsive perturbations, which can affect the dynamical behaviors of the systems just as time delays do. Therefore, it is necessary to consider both the impulsive effect and the delay effect on the stability of neural networks.

As is well known, both continuous and discrete systems are very important in implementation and applications. However, it is cumbersome to study the stability of continuous and discrete systems separately. It is therefore worthwhile to employ a method, such as the time-scale theory, that can unify the continuous and discrete situations.
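The unifying role of the time scale can be made concrete with a small sketch (illustrative, and not part of the paper): on a time scale, the delta derivative reduces to the ordinary derivative at right-dense points (graininess zero, as on the real line) and to the forward difference quotient at right-scattered points (as on the integers). The function name `delta_derivative` is a hypothetical helper.

```python
def delta_derivative(f, t, mu, eps=1e-6):
    """Delta derivative of f at t on a time scale with graininess mu(t).

    At a right-scattered point (mu(t) > 0):
        f^Delta(t) = (f(sigma(t)) - f(t)) / mu(t), with sigma(t) = t + mu(t).
    At a right-dense point (mu(t) == 0), f^Delta(t) is the usual limit,
    approximated here by a central difference.
    """
    m = mu(t)
    if m > 0:  # right-scattered: forward difference up to the jump sigma(t)
        return (f(t + m) - f(t)) / m
    return (f(t + eps) - f(t - eps)) / (2 * eps)  # right-dense: ordinary f'(t)
```

For example, for f(t) = t^2 at t = 3 the delta derivative is 6 when the time scale is the reals (mu = 0) but 7 when it is the integers (mu = 1), since (4^2 - 3^2)/1 = 7.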

Motivated by the above discussions, the objective of this paper is to study the global exponential stability of the following Cohen-Grossberg bidirectional associative memory networks with impulses and time delays on time scales:


where is a time scale; are continuous; and are the states of the th neuron from the neural field and the th neuron from the neural field at time , respectively; denote the activation functions of the th neuron from and the th neuron from , respectively; and are constants denoting the external inputs on the th neuron from and the th neuron from , respectively; and correspond to the transmission delays; and represent the amplification functions; and are appropriately behaved functions such that the solutions of system (1.1) remain bounded; and denote the connection strengths, which correspond to the neuronal gains associated with the neuronal activations; and denote the external inputs. For each interval of , we denote by the impulses at moments ; and represent the right and left limits of and in the sense of time scales; is a strictly increasing sequence.

The system (1.1) is supplemented with initial values given by


where , are continuous real-valued functions defined on their respective domains.

As usual in the theory of impulsive differential equations, at the points of discontinuity of the solution we assume that


for .

The organization of the rest of this paper is as follows. In Section 2, we introduce some notations and definitions, and state some preliminary results which are needed in later sections. In Section 3, by means of homeomorphism theory, we study the existence and uniqueness of the equilibrium point of system (1.1). In Section 4, by constructing a suitable Lyapunov function, we establish the exponential stability of the equilibrium of (1.1). In Section 5, we present an example to illustrate the feasibility and effectiveness of our results obtained in previous sections.

2. Preliminaries

In this section, we will cite some definitions and lemmas which will be used in the proofs of our main results.

Let be a nonempty closed subset (time scale) of . The forward and backward jump operators and the graininess : are defined, respectively, by


A point is called left-dense if and , left-scattered if , right-dense if and , and right-scattered if . If has a left-scattered maximum , then ; otherwise . If has a right-scattered minimum , then ; otherwise .
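The classification above can be illustrated with a small sketch (not part of the original paper), assuming a finite time scale stored as a sorted list of reals; `classify` is a hypothetical helper name. On a finite set every interior point is scattered on both sides, and the usual conventions sigma(max) = max and rho(min) = min apply.

```python
def classify(ts, t):
    """Classify a point t of a finite time scale ts (a sorted list of reals).

    sigma(t) = inf{s in ts : s > t} and rho(t) = sup{s in ts : s < t};
    t is right-scattered if sigma(t) > t, left-scattered if rho(t) < t,
    and the graininess is mu(t) = sigma(t) - t.
    """
    i = ts.index(t)
    sigma = ts[i + 1] if i + 1 < len(ts) else t  # convention at the maximum
    rho = ts[i - 1] if i > 0 else t              # convention at the minimum
    return {
        "sigma": sigma,
        "rho": rho,
        "mu": sigma - t,
        "right_scattered": sigma > t,
        "left_scattered": rho < t,
    }
```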

A function is right-dense continuous provided that it is continuous at right-dense points in and its left-side limits exist at left-dense points in . If is continuous at each right-dense point and each left-dense point, then is said to be a continuous function on . The set of right-dense continuous functions will be denoted by .

For and , we define the delta derivative of to be the number (if it exists) with the property that for a given , there exists a neighborhood of such that


for all .

If is continuous, then is right-dense continuous; if is delta differentiable at , then is continuous at .

Let be right-dense continuous. If , then we define the delta integral by


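On a purely discrete time scale the delta integral collapses to a finite sum weighted by the graininess, which the following sketch illustrates (an aside, not from the paper; `delta_integral` is a hypothetical helper name).

```python
def delta_integral(f, ts, a, b):
    """Delta integral of f over [a, b) on a purely discrete time scale ts.

    On isolated points the integral reduces to the sum of f(t) * mu(t)
    over t in [a, b), where mu(t) = sigma(t) - t is the graininess.
    For ts = Z this is exactly f(a) + f(a+1) + ... + f(b-1).
    """
    total = 0.0
    for t, nxt in zip(ts, ts[1:]):  # consecutive pairs give sigma(t) = nxt
        if a <= t < b:
            total += f(t) * (nxt - t)  # f(t) times the graininess mu(t)
    return total
```

On the integers, for instance, the delta integral of f(t) = t over [0, 5) is 0 + 1 + 2 + 3 + 4 = 10.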
Definition 2.1 (see [22]).

For each , let be a neighborhood of . Then, for , define to mean that, given , there exists a right neighborhood of such that
for each , , where . If is right-dense and is continuous at , this reduces to

Definition 2.2 (see [23]).

If , and is right dense continuous on , then we define the improper integral by

provided that this limit exists, and we say that the improper integral converges in this case. If this limit does not exist, then we say that the improper integral diverges.

A function is called regressive if


for all .

If is a regressive function, then the generalized exponential function is defined by


with the cylinder transformation


Let be two regressive functions, then we define


Then the generalized exponential function has the following properties.

Lemma 2.3 (see [24]).

Assume that are two regressive functions, then

(i) and







(viii) .
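On a purely discrete time scale the generalized exponential function collapses to a finite product of factors (1 + mu(tau) p(tau)), which makes the product-type property of Lemma 2.3 relating e_p, e_q, and the "circle plus" operation easy to check numerically. The sketch below is illustrative and assumes isolated points only; `exp_ts` and `circle_plus` are hypothetical names.

```python
def exp_ts(p, ts, s, t):
    """Generalized exponential e_p(t, s) on a purely discrete time scale ts.

    At isolated points the cylinder-transform definition reduces to the
    product of (1 + mu(tau) * p(tau)) over tau in [s, t); regressivity
    guarantees every factor is nonzero.
    """
    out = 1.0
    for tau, nxt in zip(ts, ts[1:]):
        if s <= tau < t:
            out *= 1.0 + (nxt - tau) * p(tau)  # factor for one jump
    return out

def circle_plus(p, q, mu):
    """The time-scale addition (p (+) q)(t) = p(t) + q(t) + mu(t) p(t) q(t)."""
    return lambda t: p(t) + q(t) + mu(t) * p(t) * q(t)
```

With constant p = 0.1 and q = 0.2 on the integers, e_p(5,0) e_q(5,0) = (1.1 * 1.2)^5 coincides with e_{p(+)q}(5,0) = (1.32)^5, as Lemma 2.3 asserts.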

Definition 2.4.

The equilibrium point of system (1.1) is said to be exponentially stable if there exists a positive constant such that, for every , there exists such that the solution of (1.1) with initial value satisfies

Lemma 2.5 (see [25]).

If satisfies the following conditions:

(i) is injective on ,

(ii) as ,

then is a homeomorphism of onto itself.

For , we define the norm as


Throughout this paper, we assume that

() , and satisfy ;

()the activation functions and there exist positive constants such that


for all ;

() and there exist positive constants such that


()the kernels and defined on are nonnegative continuous integrable functions such that ,

3. Existence and Uniqueness of the Equilibrium

In this section, using homeomorphism theory, we will study the existence and uniqueness of the equilibrium point of system (1.1).

An equilibrium point of (1.1) is a constant vector which satisfies the system


where the impulsive jumps satisfy


From the assumptions and , it follows that


Note that if exist and the activation functions and are bounded, then the existence of an equilibrium point of system (1.1) follows easily from Brouwer's fixed point theorem; see [28].

Theorem 3.1.

Assume that and hold. Suppose further that for each , the following inequalities are satisfied:

Then there exists a unique equilibrium point of system (1.1).


Proof.

Consider a mapping defined by
where . First, we show that is injective on . Suppose, by contradiction, that there exist distinct such that , where and . Then it follows from (3.5) that
In view of - and (3.6), we have
Thus, we can obtain

It follows from (3.4) and (3.8) that and , that is, , which leads to a contradiction. Therefore, is injective on .

Next, we prove that is a homeomorphism on . For convenience, we let , where

We assert that as . Otherwise, there would exist a sequence such that while remains bounded as , where . Noting that
we have
On the other hand, we have
It follows from (3.11) and (3.12) that
That is

which contradicts our choice of . Hence, satisfies as . By Lemma 2.5, is a homeomorphism on and there exists a unique point such that . From the definition of , we know that is the unique equilibrium point of (1.1).
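The constructive side of Theorem 3.1 can be hinted at numerically. The sketch below is an assumption-laden simplification, not the paper's system: the amplification functions are taken to be 1, the behaved functions are linear with decay rates d1, d2 > 0, the activations are tanh (Lipschitz constant 1), the delay terms are dropped, and all numbers are illustrative. Under a dominance condition of the same flavor as Theorem 3.1 (row sums of |W|/d1 and |V|/d2 below 1) the equilibrium map is a contraction, so fixed-point iteration converges to the unique equilibrium.

```python
import math

def bam_equilibrium(d1, d2, W, V, I, J, iters=200):
    """Fixed-point iteration for the equilibrium of the simplified BAM pair

        d1 * x = W @ tanh(y) + I,    d2 * y = V @ tanh(x) + J.

    When the dominance condition holds, the right-hand side defines a
    contraction, so the iteration converges to the unique fixed point.
    """
    n, m = len(I), len(J)
    x, y = [0.0] * n, [0.0] * m
    for _ in range(iters):
        x = [(sum(W[i][j] * math.tanh(y[j]) for j in range(m)) + I[i]) / d1
             for i in range(n)]
        y = [(sum(V[j][i] * math.tanh(x[i]) for i in range(n)) + J[j]) / d2
             for j in range(m)]
    return x, y
```

For instance, with d1 = d2 = 2, single connections of weight 0.5, and unit inputs, the contraction factor is at most 0.25 and the returned pair satisfies both equilibrium equations to machine precision.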

4. Global Exponential Stability of the Equilibrium

In this section, we will construct some suitable Lyapunov functions to derive the sufficient conditions which ensure that the equilibrium of (1.1) is globally exponentially stable.

Theorem 4.1.

Assume that ()–() hold. Suppose further that

()for each , the following inequalities are satisfied:


()the impulsive operators and satisfy


Then the unique equilibrium point of system (1.1) is globally exponentially stable.


Proof.

According to Theorem 3.1, we know that (1.1) has a unique equilibrium point . In view of , it is easy to see that and . Suppose that is an arbitrary solution of (1.1). Let ; then system (1.1) can be rewritten as
where, for ,
Also, for all ,
Hence by and , we have
Also, for
Similarly, we have
Let and be defined by
respectively, where . By , we have
Since are continuous on and , as , there exist such that and , for , for . By choosing , we obtain
where . For , it follows from (4.6)–(4.15) that
Define a Lyapunov function
We note that for and . Calculating the -derivatives of , we get
It follows that for and hence, for , we can obtain
In view of (4.14)-(4.15) and the previous inequality, we have

The proof is complete.

5. An Example

In this section, we give an example to illustrate our results.

Consider the following Cohen-Grossberg BAM neural networks system with distributed delays and impulses:


where . A simple computation shows that . It is easy to check that all conditions of Theorems 3.1 and 4.1 are satisfied. Hence, (5.1) has a unique equilibrium point, which is globally exponentially stable.
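Since the coefficients of example (5.1) were not reproduced here, the following sketch uses illustrative parameters of the same shape (decay rate 2, connection strength 0.5, tanh activations, unit inputs, multiplicative impulses), chosen so that a stability condition of the same flavor as () holds; none of these numbers come from the paper. It demonstrates the qualitative conclusion numerically: two solutions started from different initial values converge together, impulses included.

```python
import math

def step(x, y, h=0.1):
    """One Euler step of the illustrative two-neuron Cohen-Grossberg BAM pair
    x' = -2x + 0.5*tanh(y) + 1,  y' = -2y + 0.5*tanh(x) + 1.
    Per step, the gap between two solutions shrinks by a factor of at most
    |1 - 2h| + 0.5h = 0.85 in the sup norm, so gaps decay exponentially."""
    return (x + h * (-2 * x + 0.5 * math.tanh(y) + 1),
            y + h * (-2 * y + 0.5 * math.tanh(x) + 1))

def simulate(x0, y0, steps=200, impulse_every=20, gamma=0.8):
    """Simulate with impulsive jumps x -> gamma*x, y -> gamma*y at the
    impulse moments; |gamma| <= 1 keeps the jumps from destroying stability."""
    x, y = x0, y0
    for k in range(1, steps + 1):
        x, y = step(x, y)
        if k % impulse_every == 0:  # impulsive perturbation at t_k
            x, y = gamma * x, gamma * y
    return x, y
```

Running `simulate` from two well-separated initial values, the final states agree to many decimal places, consistent with global exponential stability of a unique attractor.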

6. Conclusion

Using time-scale calculus, homeomorphism theory, and the Lyapunov functional method, sufficient conditions are obtained that ensure the existence and global exponential stability of the unique equilibrium point of Cohen-Grossberg BAM neural networks with distributed delays and impulses on time scales. This is the first time that time-scale calculus has been applied to unify and improve results on impulsive Cohen-Grossberg BAM neural networks with distributed delays within a single framework. The sufficient conditions obtained can easily be checked in practice by simple algebraic methods.



Acknowledgments

This work was supported by the National Natural Sciences Foundation of People's Republic of China and the Natural Sciences Foundation of Yunnan Province under Grant 04Y239A.

Authors’ Affiliations

Department of Mathematics, Yunnan University, Kunming, Yunnan, 650091, China
School of Statistics and Mathematics, Yunnan University of Finance and Economics, Kunming, Yunnan, 650221, China


  1. Wang L, Zou X: Exponential stability of Cohen-Grossberg neural networks. Neural Networks 2002, 15(3):415–422. 10.1016/S0893-6080(02)00025-4
  2. Wang L, Zou X: Harmless delays in Cohen-Grossberg neural networks. Physica D 2002, 170(2):162–173. 10.1016/S0167-2789(02)00544-4
  3. Chen T, Rong L: Delay-independent stability analysis of Cohen-Grossberg neural networks. Physics Letters A 2003, 317(5–6):436–449. 10.1016/j.physleta.2003.08.066
  4. Chen T, Rong L: Robust global exponential stability of Cohen-Grossberg neural networks with time delays. IEEE Transactions on Neural Networks 2004, 15(1):203–205. 10.1109/TNN.2003.822974
  5. Lu W, Chen T: New conditions on global stability of Cohen-Grossberg neural networks. Neural Computation 2003, 15(5):1173–1189. 10.1162/089976603765202703
  6. Song Q, Cao J: Stability analysis of Cohen-Grossberg neural network with both time-varying and continuously distributed delays. Journal of Computational and Applied Mathematics 2006, 197(1):188–203. 10.1016/
  7. Liao X, Li C, Wong K-W: Criteria for exponential stability of Cohen-Grossberg neural networks. Neural Networks 2004, 17(10):1401–1414. 10.1016/j.neunet.2004.08.007
  8. Rong L: LMI-based criteria for robust stability of Cohen-Grossberg neural networks with delay. Physics Letters A 2005, 339(1–2):63–73. 10.1016/j.physleta.2005.03.023
  9. Ye H, Michel AN, Wang K: Qualitative analysis of Cohen-Grossberg neural networks with multiple delays. Physical Review E 1995, 51(3):2611–2618. 10.1103/PhysRevE.51.2611
  10. Wang L: Stability of Cohen-Grossberg neural networks with distributed delays. Applied Mathematics and Computation 2005, 160(1):93–110. 10.1016/j.amc.2003.09.014
  11. Zhao H: Global stability of bidirectional associative memory neural networks with distributed delays. Physics Letters A 2002, 297(3–4):182–190. 10.1016/S0375-9601(02)00434-6
  12. Bai C: Stability analysis of Cohen-Grossberg BAM neural networks with delays and impulses. Chaos, Solitons & Fractals 2008, 35(2):263–267. 10.1016/j.chaos.2006.05.043
  13. Li Y: Existence and stability of periodic solution for BAM neural networks with distributed delays. Applied Mathematics and Computation 2004, 159(3):847–862. 10.1016/j.amc.2003.11.007
  14. Li Y, Xing W, Lu L: Existence and global exponential stability of periodic solution of a class of neural networks with impulses. Chaos, Solitons & Fractals 2006, 27(2):437–445. 10.1016/j.chaos.2005.04.021
  15. Gui Z, Yang X-S, Ge W: Periodic solution for nonautonomous bidirectional associative memory neural networks with impulses. Neurocomputing 2007, 70(13–15):2517–2527.
  16. Huang T, Chan A, Huang Y, Cao J: Stability of Cohen-Grossberg neural networks with time-varying delays. Neural Networks 2007, 20(8):868–873. 10.1016/j.neunet.2007.07.005
  17. Huang T, Huang Y, Li C: Stability of periodic solution in fuzzy BAM neural networks with finite distributed delays. Neurocomputing 2008, 71(16–18):3064–3069.
  18. Baldi P, Atiya AF: How delays affect neural dynamics and learning. IEEE Transactions on Neural Networks 1994, 5(4):612–621. 10.1109/72.298231
  19. Babcock KL, Westervelt RM: Dynamics of simple electronic neural networks. Physica D 1987, 28(3):305–316. 10.1016/0167-2789(87)90021-2
  20. Wei J, Ruan S: Stability and bifurcation in a neural network model with two delays. Physica D 1999, 130(3–4):255–272. 10.1016/S0167-2789(99)00009-3
  21. Gu K, Kharitonov V, Chen J: Stability of Time-Delay Systems. Birkhäuser, Boston, Mass, USA; 2003.
  22. Saker SH, Agarwal RP, O'Regan D: Oscillation of second-order damped dynamic equations on time scales. Journal of Mathematical Analysis and Applications 2007, 330(2):1317–1337. 10.1016/j.jmaa.2006.06.103
  23. Bi L, Bohner M, Fan M: Periodic solutions of functional dynamic equations with infinite delay. Nonlinear Analysis: Theory, Methods & Applications 2008, 68(5):1226–1245. 10.1016/
  24. Bohner M, Peterson A: Dynamic Equations on Time Scales: An Introduction with Applications. Birkhäuser, Boston, Mass, USA; 2001:x+358.
  25. Forti M, Tesi A: New conditions for global stability of neural networks with application to linear and quadratic programming problems. IEEE Transactions on Circuits and Systems I 1995, 42(7):354–366. 10.1109/81.401145


© Yongkun Li et al. 2009

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.