Sufficient conditions for nonstability of stochastic differential systems
Journal of Inequalities and Applications volume 2015, Article number: 377 (2015)
Abstract
In this paper, we use a local \(C^{2}\)-conjugate transformation to transform nonlinear stochastic differential systems into ordinary differential systems and give some sufficient conditions for the exponential stability and instability of stochastic differential systems. Our conditions depend only on the derivatives of the drift and diffusion terms at the equilibrium points.
Introduction
The exponential stability of nonlinear stochastic differential systems has attracted a significant amount of attention in the last two decades and produced many results and methods (see [1–4]). Because of the complexity of such systems, most of the literature employs Lyapunov functions to discuss these problems; these methods have the advantage that one can study the stability of a system knowing only that its solutions exist.
However, Lyapunov functions are often difficult to construct, which limits the method in applications. We therefore transform nonlinear stochastic differential systems into ordinary differential systems in which the Brownian motion appears only as a parameter, and then discuss the stability in different situations.
We consider a simple example when the dimension of stochastic differential system is one as an illustration of our ideas:
where \(W(t)\) is a standard one-dimensional Brownian motion, σ is a constant, and \(b(x)\) is a function which satisfies the Lipschitz condition and the linear growth condition. Let \(Y(t)=X(t)\exp\{-\sigma W(t)\}\); by the Itô formula for continuous semimartingales (see [5], p.32), we have
which is the same as
We remark that equation (1.2) is different from equation (1.1): it is an ordinary differential equation in which the Brownian motion \(W(t)\) appears as a parameter. We can solve (1.2) explicitly for many choices of \(b(x)\), such as \(b(x)=kx\) or \(b(x)=kx+hx^{n}\).
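The reduction from (1.1) to (1.2) is a short Itô computation; assuming, as the discussion above suggests, \(dX(t)=b(X(t))\,dt+\sigma X(t)\,dW(t)\) and \(Y(t)=X(t)e^{-\sigma W(t)}\), it can be sketched as

```latex
\begin{aligned}
d\bigl(e^{-\sigma W(t)}\bigr)
  &= e^{-\sigma W(t)}\Bigl(-\sigma\,dW(t)+\tfrac{\sigma^{2}}{2}\,dt\Bigr),\\
dY(t) &= e^{-\sigma W(t)}\,dX(t) + X(t)\,d\bigl(e^{-\sigma W(t)}\bigr)
        + d\bigl\langle X, e^{-\sigma W}\bigr\rangle_{t}\\
      &= e^{-\sigma W(t)}b\bigl(X(t)\bigr)\,dt
        + \Bigl(\tfrac{\sigma^{2}}{2}-\sigma^{2}\Bigr)X(t)e^{-\sigma W(t)}\,dt\\
      &= \Bigl(e^{-\sigma W(t)}\,b\bigl(e^{\sigma W(t)}Y(t)\bigr)
        - \tfrac{\sigma^{2}}{2}\,Y(t)\Bigr)\,dt,
\end{aligned}
```

the martingale terms \(\sigma X e^{-\sigma W}\,dW - \sigma X e^{-\sigma W}\,dW\) cancelling, so \(Y(t)\) solves a pathwise ordinary differential equation.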
By the law of the iterated logarithm for Brownian motion (see [6], p.112), the exponential stability of \(Y(t)\) is equivalent to that of \(X(t)\). Because (1.2) is an ordinary differential equation with the parameter \(W(t)\), its stability can be studied by many classical methods.
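To make the reduction concrete, here is a small numerical sketch (a hypothetical check, not from the paper): taking \(b(x)=kx\) and a deterministic stand-in path for \(W\), the path cancels from the transformed equation and \(Y(T)\approx y_{0}e^{(k-\sigma^{2}/2)T}\).

```python
import math

def integrate_Y(k, sigma, y0, T, n, W):
    """Forward-Euler integration of the transformed equation
    Y'(t) = exp(-sigma*W(t)) * b(exp(sigma*W(t)) * Y(t)) - (sigma**2/2) * Y(t)
    with b(x) = k*x; the path W enters only as a parameter."""
    dt = T / n
    y = y0
    for i in range(n):
        w = W(i * dt)
        y += dt * (math.exp(-sigma * w) * (k * math.exp(sigma * w) * y)
                   - 0.5 * sigma ** 2 * y)
    return y

# Two different "paths" (deterministic stand-ins for a Brownian trajectory):
flat = lambda t: 0.0
wavy = lambda t: math.sin(5.0 * t)

y1 = integrate_Y(k=1.0, sigma=2.0, y0=1.0, T=1.0, n=20000, W=flat)
y2 = integrate_Y(k=1.0, sigma=2.0, y0=1.0, T=1.0, n=20000, W=wavy)

# For linear b the path cancels and Y(T) ~ y0 * exp((k - sigma**2/2) * T):
assert abs(y1 - y2) < 1e-9
assert abs(y1 - math.exp(1.0 - 2.0)) < 1e-3
```

With \(k=1\), \(\sigma=2\) the decay rate is \(k-\sigma^{2}/2=-1\), so the trivial solution is exponentially stable even though \(k>0\).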
The main purpose of this paper is to extend this idea to a wider class of systems. The paper is organized as follows. We first give preliminaries: some definitions and fundamental lemmas used in what follows, together with the transformations in the different situations. We then give the main results of the paper: we discuss the exponential stability of nonlinear stochastic differential systems not only in the one-dimensional situation but also in the high-dimensional situation, in two cases.
Preliminaries: transformation
Throughout this paper, unless otherwise specified, we let \((\Omega ,\mathscr{F}, \{\mathscr{F}_{t}\}_{t \geq0}, P)\) be a complete probability space with a filtration \(\{\mathscr{F}_{t}\}_{t\geq0}\) satisfying the following usual conditions:

(i)
\(\mathscr{F}_{0}\) contains all the events of probability 0;

(ii)
\(\mathscr{F}_{t}\) is right continuous.
Let \(\{W(t)\}_{t\geq 0}\) be a standard d-dimensional Brownian motion defined on \((\Omega, \mathscr{F}, \{\mathscr{F}_{t}\}_{t\geq0}, P)\) and adapted to \(\{\mathscr{F}_{t}\}_{t \geq0}\).
In this paper, we mainly consider the stochastic differential equation
Here \(b(x)\) and \(\sigma(x)\) are two continuous functions on \(\mathbb{R}^{m}\) and satisfy the following conditions:

(A1)
\(b(x)\) and \(\sigma(x)\) are continuously differentiable;

(A2)
there exists a positive constant M such that \(|b(x)|+|\sigma (x)|\leq M(1+|x|)\).
We know that equation (2.1) has a unique solution (see [7]). We suppose that

(A3)
0 is the only zero of \(b(x)\) and \(\sigma(x)\), and \(\partial \sigma(0)\neq0\).
If \(\sigma(x)\) and \(b(x)\) satisfy (A1), (A2), and (A3), then (2.1) with initial value 0 has the unique solution \(X(t)\equiv0\). We call this solution the trivial solution or the equilibrium position.
From Lemma 2.1 of [8], we have the following result.
Lemma 1
If we denote the solution of (2.1) as \(X(t;x_{0})\), then
which means that almost all the sample paths of the solution of (2.1) starting from a nonzero state will never reach the origin.
Definition 1
If
then we call the trivial solution of (2.1) stable; otherwise, the trivial solution of (2.1) is unstable. Furthermore, if there exists a positive constant γ such that
then the trivial solution of (2.1) is exponentially stable.
Definition 2
We suppose that U and V are vector fields on \(\mathbb{R}^{m}\) with \(U(0)=V(0)=0\). Suppose there exist a positive constant ρ and a mapping \(H:O_{\rho}(0)\rightarrow H(O_{\rho}(0))\) satisfying the following two conditions:

(i)
H is a \(C^{2}\)-differentiable homeomorphism;

(ii)
\(\partial H(x)U(x)=V(H(x))\), \(\forall x\in O_{\rho}(0)\).
Then U and V are said to be \(C^{2}\)-equivalent, and H is called the \(C^{2}\)-conjugate mapping between U and V.
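As a concrete illustration of Definition 2 in dimension one (a hypothetical example, not from the paper): \(H(x)=x/(1+x)\) conjugates \(U(x)=x+x^{2}\) to its linearization \(V(y)=y\) near 0, since \(H'(x)U(x)=V(H(x))\). A quick numerical check:

```python
# Hypothetical 1-D example of Definition 2:
# U(x) = x + x^2, V(y) = y, H(x) = x/(1+x) on a neighborhood of 0.
def U(x): return x + x * x
def V(y): return y
def H(x): return x / (1.0 + x)

def dH(x, h=1e-6):
    # central finite difference approximating H'(x)
    return (H(x + h) - H(x - h)) / (2 * h)

# the conjugacy identity dH(x)*U(x) = V(H(x)) holds near 0
for x in [-0.3, -0.1, 0.05, 0.2]:
    assert abs(dH(x) * U(x) - V(H(x))) < 1e-8
```

Indeed \(H'(x)=1/(1+x)^{2}\), so \(H'(x)U(x)=x/(1+x)=H(x)=V(H(x))\) exactly.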
One dimensional situation
Let \(m=d=1\). By (A3), we may suppose that \(\sigma(x)>0\) when \(x\in(0, \infty)\) and \(\sigma(x)<0\) when \(x\in(-\infty, 0)\); then \(\sigma'(0)>0\), where \(\sigma'(x)\) is the derivative of \(\sigma(x)\).
Let
It is obvious that \(F(0)=0\) and that \(F(x)\) is a strictly monotonically increasing function. We denote its inverse function by \(G(x)\); then we have from (2.2)
In the same way we can get
For any \(x_{0}\neq0\), we write \(X(t;x_{0})\) as \(X(t)\) for short whenever this causes no ambiguity. Let
where \(\sigma(x)\), \(W(t)\), and \(X(t)\) are as in (2.1).
By the Itô formula we get
and
According to the product formula of continuous semimartingales, (2.3), (2.4), and (2.5), we have
Let
It is easy to see that \(\lim_{x\rightarrow0} R(x)=0\).
By (2.6) and (2.7) we know that \(Y(t)\) satisfies
We notice that equation (2.8) is an ordinary differential equation in which \(W(t)\) is a parameter. Thus (2.8) has a unique solution for every trajectory \(W(t)\) of the Brownian motion.
High dimensional situation I
Now consider the case \(m>1\) and \(d=1\). Let \(A=(a_{ij})_{i,j\leq m}\) be an \(m\times m\) matrix, and regard every m-dimensional vector as an \(m\times1\) matrix. For any constant x, we make the convention
Let \(A(s)=(a_{ij}(s))_{i,j\leq m}\) and \(B(s)=(b_{ij}(s))_{i,j\leq m}\) be \(m\times m\) matrix functions and \(\langle dA(s),dB(s)\rangle\) be the abbreviated form of \((\sum_{k=1}^{m} \langle da_{ik}(s),db_{kj}(s)\rangle)_{i,j\leq m}\). If \(a_{ij}(s)\) and \(b_{ij}(s)\) are all continuous semimartingales, then we have
Let A be an \(m\times m\) matrix, \(W(t)\) be a one-dimensional Brownian motion, and
then we have
and
Furthermore, if A and B are commutative \(m \times m\) matrices (\(AB=BA\)), then we have
Let \(A=\partial\sigma(0)\), let \(\sigma(x)\) and Ax be \(C^{2}\)-equivalent, and let H be the \(C^{2}\)-conjugate mapping between \(\sigma(x)\) and Ax. Suppose that \(G(y)\) is the inverse mapping of \(H(x)\); then \(\partial G(0)=(\partial H(0))^{-1}\). By Definition 2, we have \(\partial H(x)\sigma(x)=A H(x)\), \(\forall x\in O_{\rho}(0)\). Differentiating both sides at \(x=0\) and using \(\sigma(0)=0\), we get \(\partial H(0)A=A\partial H(0)\), which means that \(\partial H(0)\) and A are commutative.
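The commutativity identity used in this subsection, \(e^{A+B}=e^{A}e^{B}\) for \(AB=BA\), can be checked numerically. A minimal sketch (a truncated-Taylor matrix exponential, chosen here only for illustration, not the paper's method):

```python
import numpy as np

def expm(M, terms=40):
    # Truncated Taylor series for the matrix exponential; adequate for small ||M||.
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for n in range(1, terms):
        term = term @ M / n
        out = out + term
    return out

A = np.array([[0.0, 1.0], [0.0, 0.0]])   # nilpotent
B = np.array([[0.5, 0.0], [0.0, 0.5]])   # scalar matrix: commutes with everything
assert np.allclose(A @ B, B @ A)
assert np.allclose(expm(A + B), expm(A) @ expm(B))   # holds since AB = BA

C = np.array([[0.0, 1.0], [1.0, 0.0]])   # does not commute with A
assert not np.allclose(expm(A + C), expm(A) @ expm(C))
```

The non-commuting pair shows why the commutativity hypotheses in the theorems below cannot simply be dropped.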
Now suppose that \(X(t;x_{0})\) is the solution of (2.1) for some \(x_{0}\in O_{\rho}(0)\), denoted by \(X(t)\). Let
and
where \(A=\partial\sigma(0)\) and H is the \(C^{2}\)-conjugate mapping above. Then
Let
and
Then we have
So we can get from (2.9), (2.10), (2.11), and (2.12)
for \(t\in(0, S)\), which means that equation (2.13) is an ordinary differential equation.
High dimensional situation II
Next we suppose \(d>1\). Let \(W(t)=(W_{1}(t),\ldots,W_{d}(t))\) and \(\sigma(x)=(\sigma_{1}(x),\ldots,\sigma_{d}(x))\), where \(W_{1}(t), \ldots, W_{d}(t)\) are d independent Brownian motions and each \(\sigma_{i}(x)\), \(i=1, \ldots, d\), is a function defined on \(\mathbb{R}^{m}\).
Let \(A_{1}=\partial\sigma_{1}(0),\ldots, A_{d}=\partial\sigma_{d}(0)\), and suppose that \(\sigma_{k}(x)\) and \(A_{k} x\) are \(C^{2}\)-equivalent for \(k=1, \ldots, d\). When \(m=1\), this \(C^{2}\)-equivalence holds automatically.
Suppose that \(H^{(k)}\) is the \(C^{2}\)-conjugate mapping between \(\sigma_{k}(x)\) and \(A_{k} x\), \(k=1, \ldots, d\). Let
If \(\partial b(0)\), \(A_{1},\ldots,A_{d}\) commute pairwise, then we have
Main results: exponential stability and instability
One dimensional situation
Let \(\gamma=b'(0)-\frac{\sigma'(0)^{2}}{2}\) in equation (2.8); then we have the following results.
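As a consistency check (a standard fact about the linear case, stated here for orientation): if \(b(x)=kx\) and \(\sigma(x)=\sigma x\), so that \(b'(0)=k\) and \(\sigma'(0)=\sigma\), the solution of (2.1) is the geometric Brownian motion

```latex
X(t)=x_{0}\exp\Bigl(\bigl(k-\tfrac{\sigma^{2}}{2}\bigr)t+\sigma W(t)\Bigr),
\qquad
\frac{1}{t}\log\bigl|X(t)\bigr|\;\xrightarrow[t\to\infty]{}\;k-\frac{\sigma^{2}}{2}
\quad\text{a.s.},
```

so the sign of \(k-\sigma^{2}/2\) decides exponential stability, in line with Theorems 1 and 2 below.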
Theorem 1
If \(\gamma<0\), then the trivial solution of (2.1) is exponentially stable.
Proof
We only prove Theorem 1 in the case \(x_{0}>0\), because the proof is similar in the case \(x_{0}\leq0\).
By the law of iterated logarithm for Brownian motion, almost surely, we have
so there is a corresponding \(M_{W}\) such that
for every trajectory \(W(t)\) of Brownian motion, where \(t\geq0\).
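For completeness, the law of the iterated logarithm invoked here (see [6]) states that, almost surely,

```latex
\limsup_{t\to\infty}\frac{W(t)}{\sqrt{2t\log\log t}}=1,
\qquad
\liminf_{t\to\infty}\frac{W(t)}{\sqrt{2t\log\log t}}=-1,
```

so \(|W(t)|=o(t)\) almost surely; in particular \(e^{\sigma|W(t)|}\) is dominated by every exponential \(e^{\lambda t}\) with \(\lambda>0\), which is what makes the path-dependent constants \(M_{W}\) and \(N_{W}\) finite.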
It is obvious that
is a bounded function. If we denote its upper bound by \(N_{W}\), then it is easy to get \(N_{W}>1\).
Since \(\lim_{x\rightarrow0}R(x)=0\), we can find an \(\epsilon>0\) such that \(R(x)<-\frac{\gamma}{3} \), where \(x\in(0, \epsilon)\). Noticing that \(C=\lim_{x\rightarrow0+} \frac{F(x)}{x}\) is a nonzero constant and \(G(x)\) is the inverse function of \(F(x)\), we have \(\lim_{x\rightarrow0+} \frac{G(x)}{x}=\frac{1}{C}\), which means that there exists a positive constant δ such that
for any \(x\in(0, \delta)\).
Let \(T=\inf\{ t: R(X(t;x_{0}))\geq-\frac{\gamma}{2}\}\), where \(x_{0}<\frac{\min\{\delta,\epsilon\}}{4(1+C)N_{W}}\). It is obvious that \(T>0\). If T is finite, then \(R(X(T))=-\frac{\gamma}{2}\), and for \(t\in[0, T]\) we have from (2.8)
So
By the definition of ϵ we can get \(R(X(t))<-\frac{\gamma }{3}\) for any \(t\in[0, T]\), which is a contradiction. So we have \(R(X(t))\leq-\frac{\gamma }{2}\) for all \(t\geq0\), which together with (2.8) gives \(Y(t)\leq e^{\frac{\gamma t}{2}}F(x_{0})\). This means that
where \(x_{0}<\frac{\min\{\delta,\epsilon\}}{4(1+C)N_{W}}\). According to Definition 1, we immediately know that the trivial solution of (2.1) is exponentially stable.
The proof in the case \(x_{0}<0\) is similar and is omitted. This completes the proof. □
Remark 1
In the proof of Theorem 1, the constant \(N_{W}\) depends on \(W(t)\), which means that \(N_{W}\) may differ for different trajectories \(W(t)\) of the Brownian motion. Theorem 1 can be strengthened as follows: for almost every \(\omega \in\Omega\) there exists a domain \(O_{\omega}(0)\) such that, for every \(x_{0}\in O_{\omega}(0)\),
where \(\lambda< b'(0)-\frac{\sigma'(0)^{2}}{2}\).
Theorem 2
If \(\gamma>0\), then the trivial solution of (2.1) is unstable.
Proof
Suppose that \(x_{0}>0\). There exists a positive constant ϵ such that
where \(x\in(0, \epsilon]\).
Let \(T=\inf\{t: X(t;x_{0})\geq \epsilon\}\), where \(x_{0}\in(0, \epsilon]\). It is easy to see that \(T>0\). For any \(t\in[0, T)\), we know that
which, together with the comparison principle of ordinary differential equation, gives
If \(P\{T=\infty\}>0\), then on the event \(\{T=\infty\}\) we have, for sufficiently large t,
By the law of the iterated logarithm for Brownian motion, almost surely, we know that \(\limsup_{t\rightarrow\infty}X(t;x_{0})=\infty\) on \(\{T=\infty\}\), which is a contradiction.
So \(P\{T<\infty\}=1\), which means that \(\sup_{t>0}X(t;x_{0})\geq \epsilon\), a.s., so the trivial solution of (2.1) is unstable.
We can prove the conclusion in the case \(x_{0}<0\) similarly, so we omit it here.
This proves Theorem 2. □
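The comparison step in the proof uses the following elementary form of the ODE comparison principle (stated here for completeness): if \(x(t)\) is differentiable, \(x(0)=x_{0}>0\), and \(x'(t)\geq c\,x(t)\) on \([0, T)\), then

```latex
\frac{d}{dt}\Bigl(x(t)e^{-ct}\Bigr)=e^{-ct}\bigl(x'(t)-c\,x(t)\bigr)\geq0
\quad\Longrightarrow\quad
x(t)\geq x_{0}e^{ct},\qquad t\in[0, T),
```

so a positive lower bound on the growth rate forces exponential escape from the origin.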
High dimensional situation I
Let \(\Gamma_{1}=\partial b(0)-\frac{1}{2}A^{2}\) in equation (2.13). Then we have the following results.
Theorem 3
If \(\partial b(0)\) and A are commutative, and the real parts of all the eigenvalues of \(\Gamma_{1}\) are negative, then the trivial solution of (2.1) is exponentially stable.
Proof
Since \(\partial b(0)\) and A are commutative, \(\partial H(0)\) and A are also commutative and (2.13) can be simplified to
We suppose that the maximum of the real parts of all the eigenvalues of \(\Gamma_{1}\) is \(-\lambda\) (\(\lambda>0\)). If \(Z(t)=e^{\frac{\lambda t}{3}}Y(t)\), then \(Z(t)\) satisfies
Finally we complete the proof of Theorem 3 by arguing as in Theorem 1. □
Theorem 4
If \(\partial b(0)\) and A are commutative, and some eigenvalue of \(\Gamma_{1}\) has positive real part, then the trivial solution of (2.1) is unstable.
Proof
We omit the proof of Theorem 4 because we can obtain it in a completely parallel way to the proof of Theorem 2. □
When all the eigenvalues of \(\Gamma_{1}\) have zero real parts, we cannot decide the stability of the trivial solution of (2.1) by these results.
Example 1
Let \(\sigma(x)=Ax\) and \(b(x)=\frac{1}{2}A^{2} x\). Then the solution of (2.1) is
It is easy to see that if the product of all the eigenvalues of A is 1, then the trivial solution of (2.1) is stable; otherwise, the trivial solution of (2.1) is unstable.
High dimensional situation II
Let \(\Gamma_{d}=\partial b(0)-\frac{1}{2}\sum_{k=1}^{d}A_{k}^{2}\)
in equation (2.14). Similarly, we have the following results in this situation.
Theorem 5
If \(\partial b(0)\), \(A_{1}, \ldots, A_{d}\) are commutative and the real parts of all the eigenvalues of \(\Gamma_{d}\) are negative, then the trivial solution of (2.1) is exponentially stable.
Theorem 6
If \(\partial b(0)\), \(A_{1}, \ldots, A_{d}\) are commutative and some eigenvalue of \(\Gamma_{d}\) has positive real part, then the trivial solution of (2.1) is unstable.
References
 1.
Kushner, H: Stochastic Stability and Control. Academic Press, New York (1967)
 2.
Khasminskii, R: Stochastic Stability of Differential Equations. Springer, Berlin (1980)
 3.
Krystul, J, Blom, H: Generalized stochastic hybrid processes as strong solutions of stochastic differential equations. Hybridge report 2 (2005)
 4.
Boukas, E: Stochastic Switching Systems: Analysis and Design. Birkhäuser, Boston (2005)
 5.
Mao, X: Stochastic Differential Equations and Applications. Horwood, Chichester (1997)
 6.
Karatzas, I, Shreve, S: Brownian Motion and Stochastic Calculus. Springer, New York (2000)
 7.
Skorohod, A: Asymptotic Method in the Theory of Stochastic Differential Equations. Am. Math. Soc., Providence (1989)
 8.
Mao, X: Stability of stochastic differential equations with Markovian switching. Stoch. Process. Appl. 79, 45-67 (1999)
Acknowledgements
The author is thankful to the referees for their helpful suggestions and necessary corrections in the completion of this paper.
Competing interests
The author declares that there is no conflict of interests regarding the publication of this article.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Keywords
 nonlinear stochastic differential system
 Itô formula
 \(C^{2}\)-equivalence