Parameter estimation for Ornstein–Uhlenbeck processes driven by fractional Lévy process
Journal of Inequalities and Applications volume 2018, Article number: 356 (2018)
Abstract
We study the minimum Skorohod distance estimation \(\theta _{\varepsilon}^{\ast }\) and the minimum \(L_{1}\)-norm estimation \(\widetilde {\theta _{\varepsilon}}\) of the drift parameter θ of the stochastic differential equation \(dX_{t}=\theta X_{t}\,dt+\varepsilon \,dL^{d}_{t}\), \(X_{0}=x_{0}\), where \(\{L^{d}_{t},0\leq t\leq T\}\) is a fractional Lévy process and \(\varepsilon \in (0,1]\). We obtain their consistency and limit distributions for fixed T as \(\varepsilon \rightarrow 0\). Moreover, we study the asymptotic laws of their limit distributions as \(T\rightarrow \infty \).
1 Introduction
Statistical inference for stochastic differential equations is a major research direction in probability theory and its applications. The asymptotic theory of parametric estimation for diffusion processes with small noise is well developed. Genon-Catalot [8] and Laredo [17] considered efficient estimation of drift parameters of small diffusions from discrete observations as \(\epsilon \rightarrow 0\) and \(n\rightarrow \infty \). Using martingale estimating functions, Sørensen [27] obtained consistency and asymptotic normality of the estimators of drift and diffusion coefficient parameters as \(\epsilon \rightarrow 0\) with n fixed. Using a contrast function under suitable conditions on ϵ and n, Sørensen and Uchida [28] and Gloter and Sørensen [9] considered efficient estimation of unknown parameters in both the drift and diffusion coefficient functions. Long [20] and Ma [21] studied parameter estimation for Ornstein–Uhlenbeck processes driven by small Lévy noises from discrete observations when \(\epsilon \rightarrow 0\) and \(n\rightarrow \infty \) simultaneously. Shen and Yu [26] obtained consistency and the asymptotic distribution of the estimator for Ornstein–Uhlenbeck processes with small fractional Lévy noises.
Recently, Diop and Yode [4] obtained the minimum Skorohod distance estimate of the parameter θ of the stochastic differential equation \(dX_{t}=\theta X_{t}\,dt+\epsilon \,dZ_{t}\), \(X_{0}=x_{0}\), driven by a centered Lévy process \(\{Z_{t}, 0\leq t\leq T\}\), with \(\epsilon \in (0,1]\).
When \(\{Z_{t}, 0\leq t\leq T\}\) is a Brownian motion, Millar [24] obtained the asymptotic behavior of the estimator of the parameter θ. The minimum uniform metric estimate of parameters of diffusion-type processes was considered by Kutoyants and Pilibossian [14, 15]. Hénaff [10] considered the asymptotics of a minimum distance estimator of the parameter of the Ornstein–Uhlenbeck process. Prakasa Rao [25] studied the minimum \(L_{1}\)-norm estimates of the drift parameter of an Ornstein–Uhlenbeck process driven by fractional Brownian motion and investigated their asymptotic properties following Kutoyants and Pilibossian [14, 15]. Surveys on parameter estimation for fractional Ornstein–Uhlenbeck processes can be found in Hu and Nualart [11], El Onsy, Es-Sebaiy and Ndiaye [5], Xiao, Zhang and Xu [29], Jiang and Dong [12], and Liu and Song [19].
Motivated by the above results, in this paper we consider the minimum Skorohod distance estimation \(\theta _{\varepsilon }^{\ast }\) and minimum \(L_{1}\)-norm estimation \(\widetilde{\theta _{\varepsilon }}\) of the drift parameter θ for Ornstein–Uhlenbeck processes driven by the fractional Lévy process \(\{L^{d}_{t}, 0\leq t\leq T\}\), which satisfy the following stochastic differential equation:
\(dX_{t}=\theta X_{t}\,dt+\varepsilon \,dL^{d}_{t},\quad X_{0}=x_{0},\ 0\leq t\leq T,\) (1)
where the drift parameter \(\theta \in \varTheta =(\theta _{1},\theta _{2}) \subseteq {R}\) is unknown and \(\varepsilon \in (0,1]\). Denote by \(\theta _{0}\) the true value of the unknown parameter θ. Note that \(x_{t}(\theta )=x_{0}e^{\theta t}\) is the solution of (1) with \(\varepsilon =0\).
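As a quick numerical sanity check (an illustrative sketch of ours, not part of the paper), the closed form \(x_{t}(\theta )=x_{0}e^{\theta t}\) can be compared with an Euler discretization of the noise-free equation \(dx_{t}=\theta x_{t}\,dt\):

```python
import math

def euler_ode(theta, x0, T, n):
    """Euler scheme for the noise-free equation dx_t = theta * x_t dt on [0, T]."""
    dt = T / n
    x = x0
    for _ in range(n):
        x += theta * x * dt
    return x

theta0, x0, T = 0.5, 1.0, 2.0
approx = euler_ode(theta0, x0, T, 100000)
exact = x0 * math.exp(theta0 * T)  # closed-form solution x_T = x0 * e^(theta * T)
print(abs(approx - exact))  # small discretization error
```

The discrepancy shrinks as the step size decreases, confirming the closed form.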
Recall that the fractional Lévy process is a natural generalization of the integral representation of fractional Brownian motion. Analogously to Mandelbrot and Van Ness [22] for fractional Brownian motion, we introduce the following definition.
Definition 1.1
(Marquardt [23])
Let \(L=(L(t), t\in {R})\) be a zero-mean two-sided Lévy process with \(E[L(1)^{2}]<\infty \) and without a Brownian component, where \(L(t)=L_{1}(t)\) for \(t\geq 0\) and \(L(t)=-L_{2}(-t-)\) for \(t<0\), with \(\{L_{1}(t), t\geq 0\}\) and \(\{L_{2}(t), t\geq 0\}\) two independent copies of a one-sided Lévy process. For \(d\in (0,\frac{1}{2})\), the stochastic process
\(L^{d}_{t}=\frac{1}{\Gamma (d+1)}\int _{{R}} [(t-s)_{+}^{d}-(-s)_{+}^{d} ]\,L(ds),\quad t\in {R},\)
is called a fractional Lévy process (fLp).
Lemma 1.1
(Marquardt [23])
Let \(g\in H\), where H is the completion of \(L^{1}( {R})\cap L^{2}({R})\) with respect to the norm \(\|g\|_{H}^{2}=E[L(1)^{2}]\int _{ {R}}(I_{-}^{d}g)^{2}(u)\,du\). Then
\(\int _{{R}}g(s)\,dL^{d}_{s}=\int _{{R}} (I_{-}^{d}g )(u)\,dL(u),\)
where the equality holds in the \(L^{2}\) sense and \(I_{-}^{d}g\) denotes the Riemann–Liouville fractional integral defined by
\((I_{-}^{d}g )(u)=\frac{1}{\Gamma (d)}\int _{u}^{\infty }g(s) (s-u)^{d-1}\,ds.\)
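Assuming the standard Riemann–Liouville definition \((I_{-}^{d}g)(u)=\frac{1}{\Gamma (d)}\int _{u}^{\infty }g(s)(s-u)^{d-1}\,ds\), the identity \((I_{-}^{d}\mathbf{1}_{[0,t]})(u)=((t-u)_{+}^{d}-(-u)_{+}^{d})/\Gamma (d+1)\), which recovers the fLp kernel, can be checked numerically (our own illustrative sketch):

```python
import math

def riemann_liouville_indicator(u, t, d, n=100000):
    """Midpoint-rule approximation of (I_-^d 1_[0,t])(u)
    = (1/Gamma(d)) * int over [max(u,0), t] of (s-u)^(d-1) ds.
    Take u < 0 so the integrand has no singularity on the domain."""
    a = max(u, 0.0)
    h = (t - a) / n
    total = 0.0
    for k in range(n):
        s = a + (k + 0.5) * h
        total += (s - u) ** (d - 1.0)
    return h * total / math.gamma(d)

d, u, t = 0.3, -1.0, 1.0
numeric = riemann_liouville_indicator(u, t, d)
# closed form ((t-u)_+^d - (-u)_+^d) / Gamma(d+1): the moving-average kernel of the fLp
closed = ((t - u) ** d - (-u) ** d) / math.gamma(d + 1.0)
print(abs(numeric - closed))  # agreement up to quadrature error
```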
Lemma 1.2
(Marquardt [23])
Let \(|f|\), \(|g|\in H\). Then
\(E [\int _{{R}}f(s)\,dL^{d}_{s}\int _{{R}}g(s)\,dL^{d}_{s} ]=E [L(1)^{2} ]\int _{{R}} (I_{-}^{d}f )(u) (I_{-}^{d}g )(u)\,du.\)
Lemma 1.3
(Bender et al. [2])
Let \(L_{t}^{d}\) be an fLp. Then, for every \(p\geq 2\) and \(\delta >0\) such that \(d+\delta <\frac{1}{2}\), there exists a constant \(C_{p,\delta ,d}\), independent of the driving Lévy process L, such that for every \(T\geq 1\)
For the study of fLp see Bender et al. [3], Fink and Klüppelberg [7], Lin and Cheng [18], Benassi et al. [1], Lacaux [16], Engelke [6] and the references therein.
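To build intuition, here is a rough simulation sketch of an fLp path (our own construction for illustration only): Marquardt's moving-average kernel applied to a compound Poisson driver with symmetric ±1 jumps (zero mean, finite variance), with the integral truncated to jump times in \([-S,T]\):

```python
import math, random

def flp_path(d, T, grid_n=50, rate=50.0, S=50.0, seed=1):
    """Crude approximation of M_t = (1/Gamma(d+1)) * int_R [(t-s)_+^d - (-s)_+^d] dL(s),
    with L a compound Poisson process with symmetric +-1 jumps, truncated to [-S, T]."""
    rng = random.Random(seed)
    # jump times of L on [-S, T]: exponential inter-arrival gaps
    times, s = [], -S
    while True:
        s += rng.expovariate(rate)
        if s > T:
            break
        times.append(s)
    jumps = [rng.choice([-1.0, 1.0]) for _ in times]
    g = math.gamma(d + 1.0)

    def kernel(t, s):
        # Mandelbrot-Van Ness type kernel of the fLp
        return (max(t - s, 0.0) ** d - max(-s, 0.0) ** d) / g

    ts = [T * k / grid_n for k in range(grid_n + 1)]
    path = [sum(kernel(t, s) * j for s, j in zip(times, jumps)) for t in ts]
    return ts, path

ts, path = flp_path(d=0.3, T=1.0)
print(path[0], path[-1])  # the path starts at 0 by construction
```

The truncation level \(S\), jump rate, and grid are arbitrary tuning choices of this sketch, not quantities from the paper.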
The rest of this paper is organized as follows. In Sect. 2, we consider the minimum Skorohod distance estimation \(\theta _{\varepsilon }^{ \ast }\) of the drift parameter θ; its consistency and limit distribution are studied for fixed T as \(\varepsilon \rightarrow 0\), and the asymptotic law of its limit distribution is studied for \(T\rightarrow \infty \). The analogous problems for the minimum \(L_{1}\)-norm estimation \(\widetilde{\theta _{\varepsilon }}\) of the drift parameter θ are studied in Sect. 3.
2 Minimum Skorohod distance estimation
In this section, we consider the minimum Skorohod distance estimation, which is defined by
where
on the Skorohod space \({D}([0,T], {R})\) of càdlàg functions on \([0,T]\), where \(\varLambda ([0,T])\) is the set of continuous, strictly increasing functions μ from \([0,T]\) onto \([0,T]\) such that \(\mu (0)=0\) and \(\mu (T)=T\), and
Let
where \(\dot{x}(\theta _{0})=x_{0}te^{\theta _{0}t}\) is the derivative of \(x_{t}(\theta _{0})\) with respect to \(\theta _{0}\) and
Let
and \({P}^{(\varepsilon )}_{\theta _{0}}\) denotes the probability measure induced by the process \({X_{t}}\) for fixed ε.
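Before stating the consistency result, here is a small numerical sketch of minimum-distance estimation, with two simplifications that are ours alone: the fLp driver is replaced by Brownian increments as a stand-in, and the Skorohod distance is replaced by the sup-norm distance, which dominates it:

```python
import math, random

def simulate_ou(theta, x0, eps, T, n, rng):
    """Euler scheme for dX = theta*X dt + eps*dW (Brownian stand-in for the fLp driver)."""
    dt = T / n
    x, path = x0, [x0]
    for _ in range(n):
        x += theta * x * dt + eps * rng.gauss(0.0, math.sqrt(dt))
        path.append(x)
    return path

def min_sup_distance_estimate(path, x0, T, thetas):
    """Minimize t -> sup |X_t - x0*e^(theta*t)| over a grid of candidate thetas."""
    n = len(path) - 1
    best, best_theta = float("inf"), None
    for th in thetas:
        dist = max(abs(path[k] - x0 * math.exp(th * T * k / n)) for k in range(n + 1))
        if dist < best:
            best, best_theta = dist, th
    return best_theta

rng = random.Random(42)
theta0, x0, eps, T = 1.0, 1.0, 0.01, 1.0
path = simulate_ou(theta0, x0, eps, T, 2000, rng)
grid = [0.5 + 0.01 * k for k in range(101)]  # candidate thetas in [0.5, 1.5]
est = min_sup_distance_estimate(path, x0, T, grid)
print(est)  # close to theta0 = 1.0 when eps is small
```

For small ε the minimizer sits near the true drift, mirroring the consistency statement below, though with a Gaussian rather than fractional Lévy driver.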
Theorem 2.1
(Consistency)
For every \(p\geq 2\) and \(\kappa >0\), there exists a constant \(C_{p,\kappa ,d}\), depending only on p, κ and d, such that, for every \(T\geq 1\), we have
Proof
Fix \(\kappa >0\) and let
Then we can obtain \(\mathcal{I}_{0}=\{|\theta _{\varepsilon }^{\ast }- \theta _{0}|>\kappa \}\). In fact, for \(\omega \in \mathcal{I}_{0}\), we have
thus, \(|\theta _{\varepsilon }^{\ast }(\omega )-\theta _{0}|>\kappa \). On the other hand, assume that \(|\theta _{\varepsilon }^{\ast }(\omega )- \theta _{0}|>\kappa \),
For any \(\kappa >0\), we have
Besides, since the process \({X_{t}}\) satisfies the stochastic differential equation (1), it follows that
Then
Hence, we have
because of the Gronwall–Bellman lemma. Thus,
According to Lemma 1.3 and Chebyshev’s inequality, for all \(p\geq 2\), we get
This completes the proof. □
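For completeness, the Gronwall–Bellman step used above can be written out as follows (our reconstruction from equation (1) and the closed form \(x_{t}(\theta _{0})=x_{0}e^{\theta _{0}t}\)):

```latex
X_{t}-x_{t}(\theta_{0})
   =\theta_{0}\int_{0}^{t}\bigl(X_{s}-x_{s}(\theta_{0})\bigr)\,ds
    +\varepsilon L^{d}_{t},
\qquad
\bigl|X_{t}-x_{t}(\theta_{0})\bigr|
   \le \varepsilon\sup_{0\le s\le T}\bigl|L^{d}_{s}\bigr|
    +|\theta_{0}|\int_{0}^{t}\bigl|X_{s}-x_{s}(\theta_{0})\bigr|\,ds,
\qquad
\sup_{0\le t\le T}\bigl|X_{t}-x_{t}(\theta_{0})\bigr|
   \le \varepsilon\,e^{|\theta_{0}|T}\sup_{0\le t\le T}\bigl|L^{d}_{t}\bigr|.
```

Chebyshev's inequality together with the moment bound of Lemma 1.3 then controls the tail probability by a constant multiple of \(\varepsilon ^{p}\).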
Remark 2.1
As a consequence of the above theorem, \(\theta _{\varepsilon }^{\ast }\) converges in probability to \(\theta _{0}\) under the \({P}^{(\varepsilon )}_{\theta _{0}}\)-measure as \(\varepsilon \rightarrow 0\). Furthermore, the rate of convergence is of order \(O(\varepsilon ^{p})\) for every \(p\geq 2\).
Theorem 2.2
(Limit distribution)
Suppose that, for any \(h\in {D}([0,T], {R})\) satisfying \(h(0)=0\), the function \(u\mapsto \phi ^{\alpha }_{h}(u)=\rho (h,u\cdot a)\), where \(a(t)=te^{\alpha t}\), \(\alpha \in {R}\), \(u\in {R}\), admits a unique minimum. Then, as \(\varepsilon \rightarrow 0\), \(\varepsilon ^{-1}( \theta ^{\ast }_{\varepsilon }-\theta _{0})\stackrel{d}{\rightarrow } \zeta _{T}\), where “\(\stackrel{d}{\rightarrow }\)” denotes convergence in distribution.
Remark 2.2
\(\phi ^{\alpha }_{h}\) is a convex function of u and \(\phi ^{\alpha }_{h}(u)\rightarrow +\infty \) as \(|u|\rightarrow +\infty \), so \(\phi ^{\alpha }_{h}\) admits a minimum.
The following lemma, due to Diop and Yode [4], is vital for our proof of Theorem 2.2.
Lemma 2.1
Let \(\{K_{\varepsilon }\}_{\varepsilon >0}\) be a sequence of continuous functions on R and \(K_{0}\) be a convex function which admits a unique minimum η on R. Let \(\{L_{\varepsilon }\}_{\varepsilon >0}\) be a sequence of positive numbers such that \(L_{\varepsilon }\rightarrow +\infty \) as \(\varepsilon \rightarrow 0\). We suppose that
Then
where if there are several minima of \(K_{\varepsilon }\), we choose one of them arbitrarily.
Proof of Theorem 2.2
We introduce the following notations:
Since
with \(\widetilde{\theta }=\widetilde{\theta }_{\varepsilon ,u,t} \in (\theta _{0}, \theta _{0}+\varepsilon u)\), where the second equality follows from a Taylor expansion. If we take \(L_{\varepsilon }= \varepsilon ^{\delta -1}\) with \(\delta \in (1/2, 1)\), we get
Therefore, we get the desired results by Lemma 2.1. □
In the following, we will consider the limiting behavior of \(\eta _{T}\) for \(T\rightarrow +\infty \). Let us introduce the following notations:
From Theorem 3.6.6 of Jurek and Mason [13] and Lemma 4 of Diop and Yode [4], the logarithmic moment condition \({E}(\log (1+|L_{1}|))<+\infty \) is necessary and sufficient for the existence of the improper integral \(A_{0}\).
Lemma 2.2
Suppose that \({E}(\log (1+|L_{1}|))<+\infty \). Then
where “\(\stackrel{d}{=}\)” denotes equality in distribution.
Proof
It is not hard to see,
In a similar way,
From Lemma 4 of Diop and Yode [4], we have immediately
□
The next theorem gives the asymptotic behavior of the limit distribution \(\eta _{T}\) for large T.
Theorem 2.3
Suppose that \(\theta _{0}>0\) and \({E}(\log (1+|L _{1}|))<+\infty \). Then \(\xi _{T}=x_{0} T \eta _{T}\) converges in distribution to \(A_{0}\) as \(T\rightarrow +\infty \).
Proof
Recall that
By a change of variables, we have
where \(M_{t}(\omega )=\frac{\omega te^{\theta _{0}t}}{T}\) and \(N(\cdot )=\rho (Y(\theta _{0}),M(\cdot ))\).
We want to show that, for every \(\Delta >0\),
Therefore, let us consider the set
where \({P}_{\theta _{0}}\) is the probability measure induced by the process \({X_{t}}\) when \(\theta _{0}\) is the true parameter and \(\varepsilon \rightarrow 0\). We can get
On the other hand, for \(\omega \in V_{\Delta }\), we have
Hence, we have
where the maximum of the function \((T-t)e^{\theta _{0}t}\) on \([0,T]\), attained at \(t=T-1/\theta _{0}\) (for \(\theta _{0}T>1\)) and equal to \(\theta _{0}^{-1}e^{\theta _{0}T-1}\), is obtained by differentiation.
We obtain
Using Lemma 2.2 we have
In addition, using (18), \(\xi _{T}\in V_{\Delta }\), we have
3 Minimum \(L_{1}\)-norm estimation
In this section, we will study the minimum \(L_{1}\)-norm estimation \(\widetilde{\theta _{\varepsilon }}\) of the drift parameter θ. Let
We say that \(\widetilde{\theta _{\varepsilon }}\) is a minimum \(L_{1}\)-norm estimator if there exists a measurable selection \(\widetilde{\theta _{\varepsilon }}\) such that
Suppose that there exists a measurable selection \(\widetilde{\theta _{\varepsilon }}\) satisfying the above equation. We can also define the estimator \(\widetilde{\theta _{\varepsilon }}\) by the relation
For any \(\kappa >0\), we define
Theorem 3.1
(Consistency)
For any \(p\geq 2\) and every \(\kappa > 0\), there exists a constant \(C_{p,\kappa ,d}\), depending only on p, κ and d, such that we have
Proof
Let \(\Vert \cdot \Vert \) denote the \(L_{1}\)-norm. Then we have
Since the process \(X_{t}\) satisfies the stochastic differential equation (1), it follows that
where \(x_{t}(\theta )=x_{0}e^{\theta t}\).
Similar to the proof of Theorem 2.1, we have
Thus,
Applying Lemma 1.3 to the estimate obtained above, we have
This completes the proof. □
Remark 3.1
It follows from Theorem 3.1 that \(\widetilde{\theta }_{\varepsilon }\) converges in probability to \(\theta _{0}\) under the \({P}^{(\varepsilon )}_{\theta _{0}}\)-measure as \(\varepsilon \rightarrow 0\). Furthermore, the rate of convergence is of order \(O(\varepsilon ^{p})\) for every \(p \geq 2\).
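The minimum \(L_{1}\)-norm estimator can be sketched numerically in the same spirit as the Skorohod-distance sketch of Sect. 2 (again our own illustration, with a Brownian stand-in for the fLp driver and a discretized \(L_{1}\) integral; the simulator is repeated to keep the sketch self-contained):

```python
import math, random

def simulate_ou(theta, x0, eps, T, n, rng):
    """Euler scheme for dX = theta*X dt + eps*dW (Brownian stand-in for the fLp driver)."""
    dt = T / n
    x, path = x0, [x0]
    for _ in range(n):
        x += theta * x * dt + eps * rng.gauss(0.0, math.sqrt(dt))
        path.append(x)
    return path

def min_l1_estimate(path, x0, T, thetas):
    """Minimize the discretized L1 distance int_0^T |X_t - x0*e^(theta*t)| dt over a theta grid."""
    n = len(path) - 1
    dt = T / n

    def l1(th):
        return dt * sum(abs(path[k] - x0 * math.exp(th * dt * k)) for k in range(n + 1))

    return min(thetas, key=l1)

rng = random.Random(7)
theta0, x0, eps, T = 1.0, 1.0, 0.01, 1.0
path = simulate_ou(theta0, x0, eps, T, 2000, rng)
grid = [0.5 + 0.01 * k for k in range(101)]  # candidate thetas in [0.5, 1.5]
est = min_l1_estimate(path, x0, T, grid)
print(est)  # close to theta0 = 1.0 when eps is small
```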
Theorem 3.2
(Limit distribution)
As \(\varepsilon \rightarrow 0\), \(\varepsilon ^{-1}(\widetilde{\theta }_{\varepsilon }-\theta _{0}) \stackrel{d}{ \rightarrow } \xi \), where ξ has the same probability distribution as η̃ under \({P}^{(\varepsilon )}_{\theta _{0}}\).
Proof
Let
and
Furthermore, let
It is easy to see that the random variable \(\widetilde{u}_{\varepsilon }=\varepsilon ^{-1}(\widetilde{\theta }_{\varepsilon }-\theta _{0})\) satisfies the equation
Define
Observe that, with probability one,
where \(\widetilde{\theta }=\theta _{0}+\alpha (\theta -\theta _{0})\) for some \(\alpha \in (0,1]\). Note that the last term in the above inequality tends to zero as \(\varepsilon \rightarrow 0\). This follows from the arguments given in Theorem 2 of Kutoyants and Pilibossian [14, 15]. In addition, we can choose the interval \([-L,L]\) such that
and
Note that \(\widetilde{f}(L)\) increases as L increases. The processes \(\{Z_{\varepsilon }(u), u\in [-L,L]\}\) and \(\{Z_{0}(u), u\in [-L,L]\}\) satisfy Lipschitz conditions, and \(Z_{\varepsilon }(u)\) converges uniformly to \(Z_{0}(u)\) over \(u\in [-L,L]\). Hence the minimizer of \(Z_{\varepsilon }(\cdot )\) converges to the minimizer of \(Z_{0}(\cdot )\). This completes the proof. □
Although the distribution of η̃ is not explicit, we can consider its limiting behavior as \(T\rightarrow +\infty \).
Theorem 3.3
(Asymptotic law)
Suppose that \(\theta _{0}>0\) and \({E}(\log (1+|L_{1}|))<+\infty \). Then
where \(L_{1}\), \(A_{0}\) and the other notations below are the same as in Theorem 2.3.
Proof
Recall that
Let \(\Vert \cdot \Vert \) denote the \(L_{1}\)-norm. By a change of variables, we have the following:
where \(\widetilde{M}_{t}(\omega )=\frac{\omega te^{\theta _{0}t}}{T}\) and \(\widetilde{N}(\cdot )=\Vert Y-\widetilde{M}(\cdot )\Vert \).
We want to show that, for every \(\Delta >0\),
Therefore, we consider the set
where \({P}_{\theta _{0}}\) is the probability measure induced by the process \({X_{t}}\) when \(\theta _{0}\) is the true parameter and \(\varepsilon \rightarrow 0\).
Besides, we have
On the other hand, for \(\omega \in V_{\Delta }\), we can get
Obviously, we have
with
We obtain with probability one
Moreover, using Lemma 2.2 we obtain
By (43) and (44), we obtain as \(T\rightarrow +\infty \)
Using (41), \(\widetilde{\xi }_{T}\in V_{\Delta }\) implies
Therefore, from Eqs. (45) and (46), we have the result (42). □
Remark 3.2
If \({L^{d}_{t}}\) is a Brownian motion, then \(\widetilde{\xi }_{T}\) is asymptotically Gaussian; this case is treated by Kutoyants and Pilibossian [14, 15].
References
Benassi, A., Cohen, S., Istas, J.: Identification and properties of real harmonizable fractional Lévy motions. Bernoulli 8, 97–115 (2002)
Bender, C., Knobloch, R., Oberacker, P.: Maximal inequalities for fractional Lévy and related processes. Stoch. Anal. Appl. 33, 701–714 (2015)
Bender, C., Lindner, A., Schicks, M.: Finite variation of fractional Lévy processes. J. Theor. Probab. 25, 594–612 (2012)
Diop, A., Yode, A.F.: Minimum distance parameter estimation for Ornstein–Uhlenbeck processes driven by Lévy process. Stat. Probab. Lett. 80, 122–127 (2010)
El Onsy, B., Es-Sebaiy, K., Ndiaye, D.: Parameter estimation for discretely observed non-ergodic fractional Ornstein–Uhlenbeck processes of the second kind. Braz. J. Probab. Stat. 32, 545–558 (2018)
Engelke, S.: A unifying approach to fractional Lévy processes. Stoch. Dyn. 13, 1250017 (2013)
Fink, H., Klüppelberg, C.: Fractional Lévy-driven Ornstein–Uhlenbeck processes and stochastic differential equations. Bernoulli 17, 484–506 (2011)
Genon-Catalot, V.: Maximum contrast estimation for diffusion processes from discrete observations. Statistics 21, 99–116 (1990)
Gloter, A., Sørensen, M.: Estimation for stochastic differential equations with a small diffusion coefficient. Stoch. Process. Appl. 119, 679–699 (2009)
Hénaff, S.: Asymptotics of a minimum distance estimator of the Ornstein–Uhlenbeck process. C. R. Acad. Sci., Sér. 1 Math. 325, 911–914 (1997)
Hu, Y., Nualart, D.: Parameter estimation for fractional Ornstein–Uhlenbeck processes. Stat. Probab. Lett. 80, 1030–1038 (2010)
Jiang, H., Dong, X.: Parameter estimation for the non-stationary Ornstein–Uhlenbeck process with linear drift. Stat. Pap. 56, 257–268 (2015)
Jurek, Z.J., Mason, J.D.: Operator-Limit Distributions in Probability Theory. Wiley, New York (1993)
Kutoyants, Y., Pilibossian, P.: On minimum \(L_{1}\)-norm estimate of the parameter of the Ornstein–Uhlenbeck process. Stat. Probab. Lett. 20, 117–123 (1994)
Kutoyants, Y., Pilibossian, P.: On minimum uniform metric estimate of parameters of diffusion-type processes. Stoch. Process. Appl. 51, 259–267 (1994)
Lacaux, C.: Real harmonizable multifractional Lévy motions. Ann. Inst. Henri Poincaré 40, 259–277 (2004)
Laredo, C.F.: A sufficient condition for asymptotic sufficiency of incomplete observations of a diffusion process. Ann. Stat. 18, 1158–1171 (1990)
Lin, Z., Cheng, Z.: Existence and joint continuity of local time of multiparameter fractional Lévy processes. Appl. Math. Mech. 30, 381–390 (2009)
Liu, Z., Song, N.: Minimum distance estimation for fractional Ornstein–Uhlenbeck type process. Adv. Differ. Equ. 2014, 137 (2014)
Long, H.: Least squares estimator for discretely observed Ornstein–Uhlenbeck processes with small Lévy noises. Stat. Probab. Lett. 79, 2076–2085 (2009)
Ma, C.: A note on “Least squares estimator for discretely observed Ornstein–Uhlenbeck processes with small Lévy noises”. Stat. Probab. Lett. 80, 1528–1531 (2010)
Mandelbrot, B.B., Van Ness, J.W.: Fractional Brownian motions, fractional noises and applications. SIAM Rev. 10, 422–437 (1968)
Marquardt, T.: Fractional Lévy processes with an application to long memory moving average processes. Bernoulli 12, 1099–1126 (2006)
Millar, P.W.: A general approach to the optimality of the minimum distance estimators. Trans. Am. Math. Soc. 286, 377–418 (1984)
Prakasa Rao, B.L.S.: Minimum \(L_{1}\)-norm estimation for fractional Ornstein–Uhlenbeck type process. Theory Probab. Math. Stat. 71, 181–189 (2005)
Shen, G., Yu, Q.: Least squares estimator for Ornstein–Uhlenbeck processes driven by fractional Lévy processes from discrete observations. Stat. Pap. (2017). https://doi.org/10.1007/s00362-017-0918-4
Sørensen, M.: Small dispersion asymptotics for diffusion martingale estimating functions. Preprint No. 2000-2, Department of Statistics and Operation Research, University of Copenhagen, Copenhagen (2000)
Sørensen, M., Uchida, M.: Small diffusion asymptotics for discretely sampled stochastic differential equations. Bernoulli 9, 1051–1069 (2003)
Xiao, W., Zhang, W., Xu, W.: Parameter estimation for fractional Ornstein–Uhlenbeck processes at discrete observation. Appl. Math. Model. 35, 4196–4207 (2011)
Acknowledgements
The authors are grateful to the referee for carefully reading the manuscript and for providing some comments and suggestions which led to improvements in this paper.
Funding
This research is supported by the Distinguished Young Scholars Foundation of Anhui Province (1608085J06), the National Natural Science Foundation of China (11271020, 11601260), the Top Talent Project of University Discipline (Speciality) (gxbjZD03), the Natural Science Foundation of Chuzhou University (2016QD13), and Natural Science Foundation of Shandong Province (ZR2016AB01).
Author information
Authors and Affiliations
Contributions
All authors contributed equally to this work. All authors read and approved the final manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors declare that they have no competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
About this article
Cite this article
Shen, G., Li, Y. & Gao, Z. Parameter estimation for Ornstein–Uhlenbeck processes driven by fractional Lévy process. J Inequal Appl 2018, 356 (2018). https://doi.org/10.1186/s13660-018-1951-0
MSC
- 60G18
- 65C30
- 93E24
Keywords
- Fractional Lévy process
- Minimum Skorohod distance estimation
- Minimum \(L_{1}\)-norm estimation
- Consistency
- Limit distribution
- Asymptotic law