Open Access

A New Singular Impulsive Delay Differential Inequality and Its Application

Journal of Inequalities and Applications 2009, 2009:461757

https://doi.org/10.1155/2009/461757

Received: 10 January 2009

Accepted: 5 March 2009

Published: 8 March 2009

Abstract

A new singular impulsive delay differential inequality is established. Using this inequality, the invariant and attracting sets for impulsive neutral neural networks with delays are obtained. Our results extend and improve those in earlier publications.

1. Introduction

It is well known that the inequality technique is an important tool for investigating the dynamical behavior of differential equations. The significance of differential and integral inequalities in the qualitative investigation of various classes of functional equations has been fully illustrated during the last 40 years [1–3]. Various inequalities have been established, such as the delay integral inequality in [4], the differential inequalities in [5, 6], the impulsive differential inequalities in [7–10], Halanay inequalities in [11–13], and generalized Halanay inequalities in [14–17]. By using the technique of inequalities, the invariant and attracting sets for differential systems have been studied by many authors [9, 18–21].

However, the inequalities mentioned above are ineffective for studying the invariant and attracting sets of impulsive nonautonomous neutral neural networks with time-varying delays. This article is therefore devoted to that problem.

Motivated by the above discussions, in this paper a new singular impulsive delay differential inequality is established. Applying this inequality and using the methods in [10, 22], some sufficient conditions ensuring the invariant set and the global attracting set for a class of neutral neural network systems with impulsive effects are obtained.

2. Preliminaries

Throughout the paper, means -dimensional unit matrix, the set of real numbers, the set of positive integers, and . means that each pair of corresponding elements of and satisfies the inequality " ( )". Especially, is called a nonnegative matrix if .

denotes the space of continuous mappings from the topological space to the topological space . In particular, let , where is a constant.
denotes the space of piecewise continuous functions with at most countably many discontinuity points, at which they are right continuous. Especially, let . Furthermore, put .
, where denotes the derivative of . In particular, let .
is a positive integrable function and satisfies and .
For , or , we define And we introduce the following norm, respectively,
(2.1)
For any , we define the following norm:
(2.2)
For an -matrix defined in [23], we denote
(2.3)

It is a cone without conical surface in . We call it an " -cone".
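The M-matrix property that underlies this cone can be verified numerically. The following minimal sketch (in Python with NumPy) uses the standard characterization from [23]: a Z-matrix, i.e., one with nonpositive off-diagonal entries, is a nonsingular M-matrix if and only if it is invertible with an entrywise nonnegative inverse. The matrix `A` below is an illustrative assumption, not a matrix taken from the paper.

```python
import numpy as np

def is_nonsingular_M_matrix(A, tol=1e-12):
    """Characterization from Berman and Plemmons [23]: a Z-matrix
    (nonpositive off-diagonal entries) is a nonsingular M-matrix if and
    only if it is invertible with an entrywise nonnegative inverse."""
    A = np.asarray(A, dtype=float)
    off_diag = A - np.diag(np.diag(A))
    if np.any(off_diag > tol):            # not even a Z-matrix
        return False
    try:
        inv = np.linalg.inv(A)
    except np.linalg.LinAlgError:         # singular, hence not M
        return False
    return bool(np.all(inv >= -tol))

# Illustrative Z-matrix (an assumption, not from the paper).
A = np.array([[ 2.0, -0.5],
              [-0.5,  2.0]])
print(is_nonsingular_M_matrix(A))         # → True

# For a nonsingular M-matrix, z = A^{-1} 1 is positive and satisfies
# A z > 0 -- the kind of positive vector invoked in proofs of this type.
z = np.linalg.inv(A) @ np.ones(2)
print(np.all(z > 0), np.all(A @ z > 0))   # → True True
```

Such a positive vector z is exactly what the proof of Theorem 3.1 below draws from the M-matrix hypothesis.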

3. Singular Impulsive Delay Differential Inequality

For convenience, we introduce the following conditions.

Let the -dimensional diagonal matrix satisfy
(3.1)
Let be an -matrix, where and satisfies
(3.2)

Theorem 3.1.

Assume the conditions and hold. Let and be a solution of the following singular delay differential inequality with the initial conditions :
(3.3)
where , and , , . Then
(3.4)
provided that the initial conditions satisfy
(3.5)
where , and the positive number satisfies the following inequality:
(3.6)

Proof.

By the conditions ( ) and the definition of -matrix, there is a constant vector such that , exists and .

By continuity, there must exist a positive constant satisfying the inequality (3.6), that is,
(3.7)
Denote by
(3.8)
It follows from (3.3) and (3.5) that
(3.9)
In the following, we will prove that for any positive constant ,
(3.10)
Let
(3.11)

If inequality (3.10) is not true, then is a nonempty set and there must exist some integer such that .

If , by and the inequality (3.5), we can get
(3.12)
(3.13)
By using , (3.3), (3.7), (3.12), (3.13), and , we obtain that
(3.14)
Since , we have by . Then (3.14) becomes
(3.15)

which contradicts the second inequality in (3.12).

If , then by and . From the inequality (3.5), we can get
(3.16)
By using , (3.3), (3.7), (3.16), and , we obtain that
(3.17)
This is a contradiction. Thus the inequality (3.10) holds. Therefore, letting in (3.10), we have
(3.18)

The proof is complete.

Remark 3.2.

In order to overcome the difficulty that in (3.3) may be discontinuous, we introduce the notation which is different from the notation in [7]. However, when is continuous in , we have
(3.19)

So we can get [7, Lemma 1] when we choose , , in Theorem 3.1.

Remark 3.3.

Suppose that and in Theorem 3.1, then we can get [10, Theorem 3.1].

4. Applications

The singular impulsive delay differential inequality obtained in Section 3 can be widely applied to study the dynamics of impulsive neutral differential equations. To illustrate the theory, we consider the following nonautonomous impulsive neutral neural network with delays:
(4.1)

where is the neural state vector; , , are the interconnection matrices representing the weighting coefficients of the neurons; are activation functions; are transmission delays; denotes the external inputs at time . represents impulsive perturbations; the fixed moments of time satisfy .
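To make the structure of a system of this kind concrete, the following sketch simulates a generic impulsive delayed network by the forward Euler method. Everything here is an illustrative assumption — the matrices, delay, input, impulse times, and impulse map are invented for the demonstration, and the neutral (derivative-delay) term of (4.1) is omitted for brevity — so this is a sketch of the model class, not the paper's exact system.

```python
import numpy as np

# Illustrative parameters (assumptions, not taken from the paper).
D = np.diag([1.0, 1.2])                    # self-inhibition rates
A = np.array([[0.10, -0.20], [0.05, 0.10]])  # instantaneous weights
B = np.array([[0.05,  0.10], [-0.10, 0.05]]) # delayed weights
J = np.array([0.5, -0.3])                  # constant external input
tau, h, T = 0.5, 0.01, 20.0                # delay, step size, horizon
f = np.tanh                                # activation function

n_delay = round(tau / h)
steps = round(T / h)
x = np.zeros((steps + 1, 2))
x[: n_delay + 1] = np.array([1.0, -1.0])   # constant initial history

# Impulses at t_k = 1, 3, 5, ...: a contractive perturbation of the state.
impulse_times = {round(k / h) for k in range(1, int(T), 2)}
for i in range(n_delay, steps):
    xd = x[i - n_delay]                    # delayed state x(t - tau)
    dx = -D @ x[i] + A @ f(x[i]) + B @ f(xd) + J
    x[i + 1] = x[i] + h * dx               # Euler step
    if i + 1 in impulse_times:
        x[i + 1] = 0.9 * x[i + 1]          # impulsive perturbation

print(np.round(x[-1], 3))                  # state settles in a bounded set
```

With these (assumed) contractive parameters the trajectory remains bounded, which is the qualitative behavior the attracting-set results below describe.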

The initial condition for (4.1) is given by
(4.2)

We always assume that for any , (4.1) has at least one solution through ( ), denoted by or (simply or if no confusion should occur).

Definition 4.1.

The set is called a positive invariant set of (4.1), if for any initial value , we have the solution for .

Definition 4.2.

The set is called a global attracting set of (4.1), if for any initial value , the solution converges to as . That is,
(4.3)

where , for .
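When the attracting set has the box form S = {x : |x_i| ≤ L_i} (the shape typically produced by results of this kind), the distance appearing in Definition 4.2 can be computed by coordinate clipping. The sketch below assumes the maximum norm and the box form of S; both are assumptions for illustration.

```python
import numpy as np

def dist_to_box(x, L):
    """Distance, in the max norm, from x to the box {y : |y_i| <= L_i}.
    The nearest point of the box clips each coordinate into [-L_i, L_i],
    so the distance is the largest amount by which |x_i| exceeds L_i."""
    x, L = np.asarray(x, float), np.asarray(L, float)
    excess = np.maximum(np.abs(x) - L, 0.0)
    return float(np.max(excess))

print(dist_to_box([1.5, -0.2], [1.0, 1.0]))  # → 0.5 (point outside S)
print(dist_to_box([0.3,  0.4], [1.0, 1.0]))  # → 0.0 (point inside S)
```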

Throughout this section, we suppose the following.

are continuous. Moreover, and .

There exist nonnegative matrices , , , and a constant such that

(4.4)

There exist nonnegative matrices such that

(4.5)

There exist nonnegative matrices , , such that for all the activating functions and satisfy

(4.6)
There exists a nonnegative matrix such that for all , and
(4.7)

.

Denote by
(4.8)

and let be an -matrix, and .

There exists a constant such that
(4.9)
where , and the scalar is determined by the inequality
(4.10)
where , and
(4.11)

Theorem 4.3.

Assume that hold. Then is a global attracting set of (4.1).

Proof.

Denote . Let be the sign function. For , define

Calculating the upper right derivative along system (4.1), from (4.1) and , we have
(4.12)
On the other hand, from (4.1) and , we have
(4.13)
Let
(4.14)
then from (4.12)–(4.14) and , we have
(4.15)
By the conditions and the definition of -matrix, we may choose a vector such that
(4.16)
By continuity, there must be a positive constant satisfying the inequality (4.10). Let and ; then . Since , denote
(4.17)

then . From the property of the -cone, we have .

For the initial conditions , , where and (without loss of generality, we assume and ), we can get
(4.18)
Then (4.18) yields
(4.19)
Let , and . Thus, all conditions of Theorem 3.1 are satisfied. By Theorem 3.1, we have
(4.20)
Suppose that for all , the inequalities
(4.21)

hold, where .

From (4.21), , and , we can get
(4.22)
Since , we have
(4.23)
On the other hand, it follows from that
(4.24)
Then from (4.21)–(4.24), we have
(4.25)
which together with (4.22) yields that
(4.26)
Then, it follows from (4.21) and (4.26) that
(4.27)
Using Theorem 3.1 again, we have
(4.28)
By mathematical induction, we can conclude that
(4.29)
Noticing that , by , we can use (4.29) to conclude that
(4.30)

This implies that the conclusion of the theorem holds.

By using Theorem 4.3 with , we can obtain a positive invariant set of (4.1); the proof is similar to that of Theorem 4.3.

Theorem 4.4.

Assume that with hold. Then is a positive invariant set and also a global attracting set of (4.1).

Remark 4.5.

Suppose that in , and ; then we can get Theorems 1 and 2 in [9].

Remark 4.6.

If , then (4.1) becomes a nonautonomous neutral neural network without impulses, and we can get Theorem 4.1 in [22].

5. Illustrative Example

The following illustrative example will demonstrate the effectiveness of our results.

Example 5.1.

Consider nonlinear impulsive neutral neural networks:
(5.1)
with
(5.2)

where , , , .

The parameters of conditions are as follows:
(5.3)
It is easy to prove that is an -matrix and
(5.4)
Let , then and . Let which satisfies the inequality
(5.5)

Now, we discuss the asymptotical behavior of the system (5.1) as follows.

(i) If , then
(5.6)

Thus , , , and . Clearly, all conditions of Theorem 4.3 are satisfied; hence, by Theorem 4.3, is a global attracting set of (5.1).

(ii) If , then . By Theorem 4.4, is a positive invariant set of (5.1).

Declarations

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant no. 10671133 and the Scientific Research Fund of Sichuan Provincial Education Department (08ZA044).

Authors’ Affiliations

(1)
College of Computer Science and Technology, Southwest University for Nationalities
(2)
Yangtze Center of Mathematics, Sichuan University

References

  1. Walter W: Differential and Integral Inequalities. Springer, New York, NY, USA; 1970.
  2. Lakshmikantham V, Leela S: Differential and Integral Inequalities: Theory and Applications. Vol. I: Ordinary Differential Equations, Mathematics in Science and Engineering. Volume 55. Academic Press, New York, NY, USA; 1969.
  3. Lakshmikantham V, Leela S: Differential and Integral Inequalities: Theory and Applications. Vol. II: Functional, Partial, Abstract, and Complex Differential Equations, Mathematics in Science and Engineering. Volume 55. Academic Press, New York, NY, USA; 1969.
  4. Xu DY: Integro-differential equations and delay integral inequalities. Tohoku Mathematical Journal 1992, 44(3):365–378. doi:10.2748/tmj/1178227303
  5. Wang L, Xu DY: Global exponential stability of Hopfield reaction-diffusion neural networks with time-varying delays. Science in China. Series F 2003, 46(6):466–474. doi:10.1360/02yf0146
  6. Huang YM, Xu DY, Yang ZG: Dissipativity and periodic attractor for non-autonomous neural networks with time-varying delays. Neurocomputing 2007, 70(16–18):2953–2958.
  7. Xu DY, Yang Z: Impulsive delay differential inequality and stability of neural networks. Journal of Mathematical Analysis and Applications 2005, 305(1):107–120. doi:10.1016/j.jmaa.2004.10.040
  8. Xu DY, Zhu W, Long S: Global exponential stability of impulsive integro-differential equation. Nonlinear Analysis: Theory, Methods & Applications 2006, 64(12):2805–2816. doi:10.1016/j.na.2005.09.020
  9. Xu DY, Yang Z: Attracting and invariant sets for a class of impulsive functional differential equations. Journal of Mathematical Analysis and Applications 2007, 329(2):1036–1044. doi:10.1016/j.jmaa.2006.05.072
  10. Xu DY, Yang Z, Yang Z: Exponential stability of nonlinear impulsive neutral differential equations with delays. Nonlinear Analysis: Theory, Methods & Applications 2007, 67(5):1426–1439. doi:10.1016/j.na.2006.07.043
  11. Driver RD: Ordinary and Delay Differential Equations, Applied Mathematical Sciences. Volume 20. Springer, New York, NY, USA; 1977.
  12. Gopalsamy K: Stability and Oscillations in Delay Differential Equations of Population Dynamics, Mathematics and Its Applications. Volume 74. Kluwer Academic Publishers, Dordrecht, The Netherlands; 1992.
  13. Halanay A: Differential Equations: Stability, Oscillations, Time Lags. Academic Press, New York, NY, USA; 1966.
  14. Amemiya T: Delay-independent stabilization of linear systems. International Journal of Control 1983, 37(5):1071–1079. doi:10.1080/00207178308933029
  15. Liz E, Trofimchuk S: Existence and stability of almost periodic solutions for quasilinear delay systems and the Halanay inequality. Journal of Mathematical Analysis and Applications 2000, 248(2):625–644. doi:10.1006/jmaa.2000.6947
  16. Ivanov A, Liz E, Trofimchuk S: Halanay inequality, Yorke 3/2 stability criterion, and differential equations with maxima. Tohoku Mathematical Journal 2002, 54(2):277–295. doi:10.2748/tmj/1113247567
  17. Tian H: The exponential asymptotic stability of singularly perturbed delay differential equations with a bounded lag. Journal of Mathematical Analysis and Applications 2002, 270(1):143–149. doi:10.1016/S0022-247X(02)00056-2
  18. Lu KN, Xu DY, Yang ZC: Global attraction and stability for Cohen-Grossberg neural networks with delays. Neural Networks 2006, 19(10):1538–1549. doi:10.1016/j.neunet.2006.07.006
  19. Xu DY, Li S, Zhou X, Pu Z: Invariant set and stable region of a class of partial differential equations with time delays. Nonlinear Analysis: Real World Applications 2001, 2(2):161–169. doi:10.1016/S0362-546X(00)00111-5
  20. Xu DY, Zhao H-Y: Invariant and attracting sets of Hopfield neural networks with delay. International Journal of Systems Science 2001, 32(7):863–866.
  21. Zhao H: Invariant set and attractor of nonautonomous functional differential systems. Journal of Mathematical Analysis and Applications 2003, 282(2):437–443. doi:10.1016/S0022-247X(02)00370-0
  22. Guo Q, Wang X, Ma Z: Dissipativity of non-autonomous neutral neural networks with time-varying delays. Far East Journal of Mathematical Sciences 2008, 29(1):89–100.
  23. Berman A, Plemmons RJ: Nonnegative Matrices in the Mathematical Sciences, Computer Science and Applied Mathematics. Academic Press, New York, NY, USA; 1979.

Copyright

© Z. Ma and X. Wang. 2009

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.