Parameter estimation for Ornstein–Uhlenbeck processes driven by a fractional Lévy process

We study the minimum Skorohod distance estimation θ*_ε and minimum L1-norm estimation θ̃_ε of the drift parameter θ of a stochastic differential equation dX_t = θX_t dt + ε dL^d_t, X_0 = x_0, where {L^d_t, 0 ≤ t ≤ T} is a fractional Lévy process and ε ∈ (0, 1]. We obtain their consistency and limit distributions for fixed T as ε → 0. Moreover, we also study the asymptotic laws of these limit distributions as T → ∞.


Introduction
Statistical inference for stochastic equations is a main research direction in probability theory and its applications. The asymptotic theory of parametric estimation for diffusion processes with small noise is well developed. Genon-Catalot [8] and Laredo [17] considered the efficient estimation of drift parameters of small diffusions from discrete observations as ε → 0 and n → ∞. Using martingale estimating functions, Sørensen [27] obtained consistency and asymptotic normality of the estimators of drift and diffusion coefficient parameters as ε → 0 with n fixed. Using a contrast function under suitable conditions on ε and n, Sørensen and Uchida [28] and Gloter and Sørensen [9] considered the efficient estimation of unknown parameters in both the drift and diffusion coefficient functions. Long [20] and Ma [21] studied parameter estimation for Ornstein-Uhlenbeck processes driven by small Lévy noises from discrete observations when ε → 0 and n → ∞ simultaneously. Shen and Yu [26] obtained consistency and the asymptotic distribution of the estimator for Ornstein-Uhlenbeck processes with small fractional Lévy noises.
When {Z_t, 0 ≤ t ≤ T} is a Brownian motion, Millar [24] obtained the asymptotic behavior of the estimator of the parameter θ. The minimum uniform metric estimates of parameters of diffusion-type processes were considered by Kutoyants and Pilibossian [14, 15]. Hénaff [10] considered the asymptotics of a minimum distance estimator of the parameter of the Ornstein-Uhlenbeck process. Prakasa Rao [25] studied the minimum L1-norm estimates of the drift parameter of an Ornstein-Uhlenbeck process driven by fractional Brownian motion and investigated their asymptotic properties following Kutoyants and Pilibossian [14, 15]. Surveys on parameter estimation for fractional Ornstein-Uhlenbeck processes can be found in Hu and Nualart [11], El Onsy, Es-Sebaiy and Ndiaye [5], Xiao, Zhang and Xu [29], Jiang and Dong [12], and Liu and Song [19].
Motivated by the above results, in this paper we consider the minimum Skorohod distance estimation θ*_ε and minimum L1-norm estimation θ̃_ε of the drift parameter θ for Ornstein-Uhlenbeck processes driven by the fractional Lévy process {L^d_t, 0 ≤ t ≤ T}, which satisfy the following stochastic differential equation:

dX_t = θX_t dt + ε dL^d_t, X_0 = x_0, 0 ≤ t ≤ T, (1)

where the shift parameter θ ∈ Θ = (θ_1, θ_2) ⊆ R is unknown and ε ∈ (0, 1]. Denote by θ_0 the true value of the unknown parameter θ. Note that the solution of Eq. (1) can be written explicitly as X_t = e^{θt}(x_0 + ε ∫_0^t e^{−θs} dL^d_s). Recall that the fractional Lévy process is a natural generalization of the integral representation of fractional Brownian motion. Analogously to Mandelbrot and Van Ness [22] for fractional Brownian motion, we introduce the following definition: the process

L^d_t = (1/Γ(d+1)) ∫_R [(t − s)_+^d − (−s)_+^d] dL_s, t ∈ R,

is called a fractional Lévy process (fLp), where {L_t, t ∈ R} is the two-sided Lévy process obtained from {L_1(t), t ≥ 0} and {L_2(t), t ≥ 0}, two independent copies of a one-sided Lévy process, by setting L_t = L_1(t) for t ≥ 0 and L_t = −L_2((−t)−) for t < 0.
where the equality holds in the L^2 sense and I^d_− g denotes the Riemann-Liouville fractional integral defined by

(I^d_− g)(u) = (1/Γ(d)) ∫_u^∞ g(s)(s − u)^{d−1} ds.

More details on fractional Lévy processes can be found in [16], Engelke [6] and the references therein.
The rest of this paper is organized as follows. In Sect. 2, we consider the minimum Skorohod distance estimation θ*_ε of the drift parameter θ; its consistency and limit distribution are studied for fixed T as ε → 0. Moreover, the asymptotic law of its limit distribution is also studied as T → ∞. The analogous problems for the minimum L1-norm estimation θ̃_ε of the drift parameter θ are studied in Sect. 3.
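Before turning to the estimators, the small-noise behavior underlying both sections can be sketched numerically. The code below simulates Eq. (1) by an Euler scheme; since exact simulation of a fractional Lévy process is involved, a standard Brownian motion is used as a stand-in driver, so this illustrates only the generic small-noise effect (the function name and all parameter values are illustrative assumptions, not part of the paper):

```python
import numpy as np

def simulate_ou(theta, eps, x0=1.0, T=1.0, n=1000, rng=None):
    """Euler scheme for dX_t = theta*X_t dt + eps*dW_t, X_0 = x0.

    The paper's driver is a fractional Levy process L^d_t; a standard
    Brownian motion W is substituted here purely for illustration.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    dt = T / n
    t = np.linspace(0.0, T, n + 1)
    x = np.empty(n + 1)
    x[0] = x0
    dW = rng.normal(0.0, np.sqrt(dt), size=n)
    for i in range(n):
        x[i + 1] = x[i] + theta * x[i] * dt + eps * dW[i]
    return t, x

theta0, x0 = 0.5, 1.0
t, x = simulate_ou(theta0, eps=0.01)
# As eps -> 0 the path approaches the deterministic solution x0*exp(theta0*t).
print(np.max(np.abs(x - x0 * np.exp(theta0 * t))))
```

With ε = 0.01 the maximal deviation from x_0 e^{θ_0 t} is already small, and shrinking ε drives it to zero, which is the regime in which the consistency results below operate.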

Minimum Skorohod distance estimation
In this section, we consider the minimum Skorohod distance estimation θ*_ε, defined as a minimizer of the Skorohod distance between the observed process X and the deterministic solution of Eq. (1) with ε = 0. Let P^(ε)_{θ_0} denote the probability measure induced by the process X_t for fixed ε.
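The displayed definitions here appear to have been lost in extraction. In the minimum-distance framework of Kutoyants and Pilibossian [14, 15], the estimator takes the following form; this is a hedged reconstruction based on that framework, not the paper's exact display:

```latex
\theta_{\varepsilon}^{*} := \arg\inf_{\theta \in \Theta} \rho\bigl(X, x(\theta)\bigr),
\qquad
x_{t}(\theta) = x_{0}e^{\theta t}, \quad 0 \le t \le T,
```

where ρ denotes the Skorohod metric on D[0, T] and x(θ) solves Eq. (1) with ε = 0.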
where the constant C_{p,κ,d} depends only on p, κ and d.
Proof Fix κ > 0. Since the process X_t satisfies the stochastic differential equation (1), it follows from the Gronwall-Bellman lemma that

sup_{0≤t≤T} |X_t − x_0 e^{θ_0 t}| ≤ ε e^{|θ_0|T} sup_{0≤t≤T} |L^d_t|.

Then, according to Lemma 1.3 and Chebyshev's inequality, for all p ≥ 2 we obtain the stated bound. This completes the proof.
Remark 2.1 As a consequence of the above theorem, θ*_ε converges in probability to θ_0 under the P^(ε)_{θ_0}-measure as ε → 0. Furthermore, the rate of convergence is of order O(ε^p) for every p ≥ 2.
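Remark 2.1 can be probed numerically. The sketch below replaces the Skorohod metric by the sup-norm distance (which dominates it on D[0, T]) and minimizes over a grid of candidate values of θ; as before, a Brownian motion stands in for the fractional Lévy driver, and all names and parameter choices are illustrative assumptions rather than the paper's procedure:

```python
import numpy as np

def estimate_theta_sup(eps, theta0=0.5, x0=1.0, T=1.0, n=1000, seed=1):
    """Minimum sup-norm-distance estimate of theta from one observed path."""
    rng = np.random.default_rng(seed)
    dt = T / n
    t = np.linspace(0.0, T, n + 1)
    x = np.empty(n + 1)
    x[0] = x0
    dW = rng.normal(0.0, np.sqrt(dt), size=n)
    for i in range(n):  # Euler scheme with a Brownian stand-in driver
        x[i + 1] = x[i] + theta0 * x[i] * dt + eps * dW[i]
    thetas = np.linspace(0.0, 1.0, 2001)  # grid over a compact subset of Theta
    dists = [np.max(np.abs(x - x0 * np.exp(th * t))) for th in thetas]
    return thetas[int(np.argmin(dists))]

for eps in (0.1, 0.01, 0.001):
    print(eps, estimate_theta_sup(eps))
```

The printed estimates move toward the true value θ_0 = 0.5 as ε decreases, in line with the consistency statement.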
The following lemma, due to Diop and Yode [4], is vital for our proof of Theorem 2.2.

Lemma 2.1
Let {K_ε}_{ε>0} be a sequence of continuous functions on R and let K_0 be a convex function which admits a unique minimum η on R. Let {L_ε}_{ε>0} be a sequence of positive numbers such that L_ε → +∞ as ε → 0. Under the hypotheses of [4], any minimizer of K_ε converges to η, where, if there are several minima of K_ε, we choose one of them arbitrarily.
Proof of Theorem 2.2 We introduce the following notation, where the second equality follows from a Taylor expansion. Taking L_ε = ε^{δ−1} with δ ∈ (1/2, 1), we obtain the desired results by Lemma 2.1.
In the following, we consider the limiting behavior of η_T as T → +∞. Let us introduce the following notation. From Theorem 3.6.6 of Jurek and Mason [13] and Lemma 4 of Diop and Yode [4], the logarithmic moment condition E(log(1 + |L_1|)) < +∞ is necessary and sufficient for the existence of the improper integral A_0.
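The display defining A_0 was lost in extraction. Judging from Lemma 4 of Diop and Yode [4] and the appeal to Theorem 3.6.6 of Jurek and Mason [13], A_0 is presumably an exponential functional of the driving Lévy process, of the form (an assumption, requiring θ_0 > 0):

```latex
A_{0} = \int_{0}^{\infty} e^{-\theta_{0}s}\, dL_{s},
```

for which almost-sure convergence of the improper integral is equivalent to the logarithmic moment condition E(log(1 + |L_1|)) < +∞.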

Lemma 2.2 Suppose that E(log(1 + |L_1|)) < +∞. Then
Proof It is not hard to see that the first assertion holds. The second follows in a similar way.

From Lemma 4 of Diop and Yode [4], we immediately have the following.
The next theorem gives the asymptotic behavior of the limit distribution η_T for large T.
Proof Recall that, by a change of variable, we have the representation with M_t(ω) = ωte^{θ_0 t}/T and N(·) = ρ(Y(θ_0), M(·)). We want to show that, for every positive constant, the convergence (19) holds. Therefore, let us consider the set V, where P_{θ_0} is the probability measure induced by the process X_t when θ_0 is the true parameter and ε → 0. We obtain a bound on the complement of V; on the other hand, for ω ∈ V, we have the required estimate. Hence we obtain the bound, where the maximum value of the function te^{θ_0 t}/T is found by taking the derivative. Using Lemma 2.2 and combining (20) and (21), we obtain the first estimate. In addition, using (18) and ξ_T ∈ V, we obtain the second. The desired result (19) follows from Eqs. (22) and (23).

Minimum L1-norm estimation
In this section, we will study the minimum L1-norm estimation θ̃_ε of the drift parameter θ. It is well known that θ̃_ε is the minimum L1-norm estimator if there exists a measurable selection θ̃_ε attaining the infimum of the L1-distance over Θ. This completes the proof.
Remark 3.1 It follows from Theorem 3.1 that θ̃_ε converges in probability to θ_0 under the P^(ε)_{θ_0}-measure as ε → 0. Furthermore, the rate of convergence is of order O(ε^p) for every p ≥ 2. Furthermore, it is easy to see that the random variable u_ε = ε^{−1}(θ̃_ε − θ_0) satisfies the stated equation. Observe that, with probability one, the displayed inequality holds, where θ′ = θ_0 + α(θ − θ_0) for some α ∈ (0, 1]. Note that the last term in the above inequality tends to zero as ε → 0; this follows from the arguments given in Theorem 2 of Kutoyants and Pilibossian [14, 15]. In addition, we can choose the interval [−L, L] so that the two displayed conditions hold. Note that f(L) increases as L increases. The processes {Z_ε(u), u ∈ [−L, L]} and {Z_0(u), u ∈ [−L, L]} satisfy the Lipschitz conditions, and Z_ε(u) converges uniformly to Z_0(u) over u ∈ [−L, L]. Hence the minimizer of Z_ε(·) converges to the minimizer of Z_0(·). This completes the proof.
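The minimum L1-norm estimation lends itself to the same kind of numerical sketch: the integral ∫_0^T |X_t − x_0 e^{θt}| dt is approximated by a Riemann sum and minimized over a grid. As before, a Brownian motion stands in for the fractional Lévy driver, and every name and parameter below is an illustrative assumption:

```python
import numpy as np

def estimate_theta_l1(eps, theta0=0.5, x0=1.0, T=1.0, n=1000, seed=2):
    """Minimum L1-norm estimate of theta from one observed path."""
    rng = np.random.default_rng(seed)
    dt = T / n
    t = np.linspace(0.0, T, n + 1)
    x = np.empty(n + 1)
    x[0] = x0
    dW = rng.normal(0.0, np.sqrt(dt), size=n)
    for i in range(n):  # Euler scheme with a Brownian stand-in driver
        x[i + 1] = x[i] + theta0 * x[i] * dt + eps * dW[i]
    thetas = np.linspace(0.0, 1.0, 2001)
    # Riemann-sum approximation of int_0^T |X_t - x0*exp(theta*t)| dt
    costs = [np.sum(np.abs(x - x0 * np.exp(th * t))) * dt for th in thetas]
    return thetas[int(np.argmin(costs))]

for eps in (0.1, 0.01, 0.001):
    print(eps, estimate_theta_l1(eps))
```

As with the sup-norm sketch, the estimates approach θ_0 = 0.5 as ε shrinks, consistent with Remark 3.1.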
Although the distribution of η is not explicit, we can consider its limiting behavior as T → +∞.
where L_1, A_0 and the other notations in the following are the same as in Theorem 2.3.

Proof Recall that
Let ‖·‖ denote the L1-norm. By a change of variable, we have the representation with M_t(ω) = ωte^{θ_0 t}/T and N(·) = ‖Y − M(·)‖. We want to show that, for every positive constant, the corresponding convergence holds. Therefore, we consider the set V, where P_{θ_0} is the probability measure induced by the process X_t when θ_0 is the true parameter and ε → 0. The remaining bounds then follow as in the proof of Theorem 2.3.