Berry-Esseen bounds of weighted kernel estimator for a nonparametric regression model based on linear process errors under a LNQD sequence
Abstract
In this paper, we investigate the Berry-Esseen bounds of the weighted kernel estimator for a nonparametric regression model based on linear process errors generated by a LNQD random variable sequence. The rate of normal approximation is shown to be \(O(n^{-1/6})\) under appropriate conditions. The results obtained in this article generalize or improve, in some sense, the corresponding ones for mixing dependent sequences.
1 Introduction
We consider the estimation of the fixed design nonparametric regression model, which involves a regression function \(g(\cdot )\) defined on the closed interval \([0,1]\):
where \(\{t_{i}\}\) are known fixed design points, assumed to be ordered as \(0\leq t_{1}\leq \cdots \leq t_{n}\leq 1\), and \(\{\varepsilon_{i}\}\) are random errors.
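Written out in the standard fixed design form (with \(y_{i}\) denoting the observed responses; this symbol is our notation), the model reads
\[ y_{i}=g(t_{i})+\varepsilon_{i},\qquad i=1,\ldots,n. \]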
Model (1.1) has been studied extensively by many authors. For example, Schuster and Yakowitz [1] considered model (1.1) with i.i.d. errors and obtained the strong convergence and asymptotic normality of the estimator of \(g(\cdot )\), while Qin [2] obtained its strong consistency. Yang [3–5] studied model (1.1) with φ-mixing errors, censored data and negatively associated errors, and obtained the complete convergence, strong consistency and uniformly asymptotic normality of the estimator of \(g(\cdot )\), respectively. Zhou et al. [6] studied model (1.1) with weakly dependent processes and obtained the moment consistency, strong consistency, strong convergence rate and asymptotic normality of the estimator of \(g(\cdot )\). Motivated by this literature, we investigate the Berry-Esseen bounds of the estimator for the nonparametric regression model (1.1) with linear process errors.
In this article, we discuss the Berry-Esseen bounds of the estimator of \(g(\cdot )\) in model (1.1) with repeated measurements. We first recall the weighted kernel estimator of a nonparametric regression function. A popular nonparametric estimate of \(g(\cdot )\) is then
where \(K(u)\) is a Borel measurable kernel function and \(h_{n}>0\) is a bandwidth satisfying \(h_{n}\to 0\) as \(n\to \infty \).
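The weight \(\frac{t_{i}-t_{i-1}}{h_{n}}K ( \frac{t-t_{i}}{h_{n}} ) \) appearing in assumption (A4) below indicates the Priestley-Chao form of the estimator; written out (with \(t_{0}=0\) and \(y_{i}\) as above), it is
\[ \widehat{g}_{n}(t)=\sum_{i=1}^{n}\frac{t_{i}-t_{i-1}}{h_{n}}K \biggl( \frac{t-t_{i}}{h_{n}} \biggr) y_{i},\qquad t\in [0,1]. \]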
The weighted kernel estimator was first proposed by Priestley and Chao [7], who discussed conditions for its weak consistency, and it has subsequently been studied extensively by many authors. For instance, under the independence assumption, Benedetti [8] gave sufficient conditions for the strong consistency of the estimator of \(g(\cdot )\) under the moment condition \(\mathrm{E}\varepsilon_{1}^{4}<\infty \), Schuster and Yakowitz [1] discussed its uniformly strong consistency, and Qin [2] relaxed the moment condition to \(\mathrm{E} \vert \varepsilon_{1} \vert ^{2+\delta }<\infty \) (for some \({\delta >0}\)). Under mixing dependence assumptions, Yang [3, 9] not only improved these results for φ-mixing and ρ-mixing errors, but also relaxed the moment condition to \(\sup_{i} {\mathrm{E}} \vert \varepsilon_{i} \vert ^{r}<\infty \) (for some \(r>1\)) and weakened the conditions on the kernel function \(K(\cdot )\). Pan and Sun [10] extended this discussion to censored data and gave some sufficient conditions for strong consistency in the independent and φ-mixing cases. Yang [4] discussed the consistency of weighted kernel estimators of a nonparametric regression function with censored data and obtained strong consistency under weaker sufficient conditions. However, up to now, there have been few results on the weighted kernel estimator for model (1.1) with linear process errors.
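As a purely illustrative numerical sketch (not part of the paper's theory), the following Python code implements a Priestley-Chao type weighted kernel estimate with a Gaussian kernel and simulates data with one-sided linear-process errors built from i.i.d. innovations; the function name, kernel, regression function, coefficients \(a_{k}\) and bandwidth are all choices made here for illustration only.

```python
import numpy as np

def weighted_kernel_estimate(t, y, t_grid, h, K=None):
    """Priestley-Chao type weighted kernel estimate of g, evaluated on t_grid."""
    if K is None:
        # Gaussian kernel, integrates to 1 over R (illustrative choice only).
        K = lambda u: np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    spacings = np.diff(np.concatenate(([0.0], t)))   # t_i - t_{i-1}, with t_0 = 0
    # ghat(s) = sum_i (t_i - t_{i-1}) / h * K((s - t_i) / h) * y_i
    return np.array([np.sum((spacings / h) * K((s - t) / h) * y) for s in t_grid])

# Toy data: equidistant design, moving-average (linear process) errors; all choices illustrative.
rng = np.random.default_rng(0)
n = 500
t = np.arange(1, n + 1) / n
a = 0.5 ** np.arange(6)                    # summable coefficients a_k
e = rng.standard_normal(n + len(a) - 1)    # i.i.d. innovations e_j
eps = np.convolve(e, a, mode="valid")      # linear-process errors of length n
y = np.sin(2 * np.pi * t) + 0.2 * eps      # y_i = g(t_i) + eps_i with g(s) = sin(2*pi*s)
ghat = weighted_kernel_estimate(t, y, t_grid=np.linspace(0.0, 1.0, 101), h=n ** (-1 / 5))
```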
The Berry-Esseen theorem quantifies the rate of convergence in the central limit theorem, and there is a substantial literature on Berry-Esseen bounds. For example, Cheng [11] established a Berry-Esseen type theorem showing the near-optimal quality of the normal approximation to the distribution of smooth quantile density estimators. Wang and Zhang [12] obtained a Berry-Esseen type estimate for NA random variables with only a finite second moment; they also improved the convergence rate in the central limit theorem and the precise asymptotics in the law of the iterated logarithm for NA and linearly negative quadrant dependent sequences. Liang and Li [13] derived a Berry-Esseen type bound based on linear process errors under negatively associated random variables. Li et al. [14] established the Berry-Esseen bounds of the wavelet estimator for a nonparametric regression model with linear process errors generated by φ-mixing sequences. Yang et al. [15] investigated the Berry-Esseen bound of sample quantiles for NA random variables, where the rate of normal approximation is shown to be \(O(n^{-1/9})\).
In this paper, we shall study the above nonparametric regression problem with linear process errors generated by a linearly negative quadrant dependent sequence.
Definition 1.1
([16])
Two random variables X and Y are said to be negative quadrant dependent (NQD in short) if, for any \(x, y\in \mathbf{R}\),
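the defining inequality of Lehmann [16] holds:
\[ \mathrm{P}(X\leq x, Y\leq y)\leq \mathrm{P}(X\leq x)\,\mathrm{P}(Y\leq y). \]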
Definition 1.2
([17])
A sequence \(\{X_{n}, n\geq 1\}\) of random variables is said to be linearly negative quadrant dependent (LNQD in short) if, for any disjoint subsets \(\mathbf{A}, \mathbf{B} \subset \mathbf{Z}^{+}\) and any positive real numbers \(r_{i}\), the sums \(\sum_{i\in \mathbf{A}}r_{i}X_{i}\) and \(\sum_{j\in \mathbf{B}}r_{j}X_{j}\) are NQD.
The concept of LNQD sequence was introduced by Newman [17], who investigated the central limit theorem for a strictly stationary LNQD process, and it subsequently has been studied by many authors. Wang and Zhang [12] provided the uniform rates of convergence in the central limit theorem for LNQD random variables. Ko et al. [18] established the Hoeffding-type inequality for a LNQD sequence. Ko et al. [19] discussed the strong convergence and central limit theorem for weighted sums of LNQD random variables. Wang et al. [20] presented some exponential inequalities and complete convergence for a LNQD sequence. Wang and Wu [21] gave some strong laws of large numbers and strong convergence properties for arrays of rowwise NA and LNQD random variables. Li et al. [22] established some inequalities and asymptotic normality of the weight function estimate of a regression function for a LNQD sequence. Shen et al. [23] investigated the complete convergence for weighted sums of LNQD random variables based on the exponential bounds and obtained some complete convergence for arrays of rowwise LNQD random variables, etc.
However, there are very few results on Berry-Esseen bounds of the weighted kernel estimator for the nonparametric regression model (1.1) with linear process errors. The main purpose of this paper is therefore to investigate the Berry-Esseen bounds of the weighted kernel estimator for the nonparametric regression model with linear process errors generated by a LNQD sequence.
In what follows, let C denote a positive constant whose value may differ in various places. All limits are taken as the sample size n tends to ∞, unless specified otherwise.
The structure of the rest of the paper is as follows. In Section 2, we give some basic assumptions and main results. Some preliminary lemmas are stated in Section 3. Proofs of the main results are provided in Section 4. Authors’ declaration is given at the end of the paper.
2 Assumptions and main results
In order to facilitate the presentation, we write \(\rho_{n}^{2}:=\rho_{n}^{2}(t)=\operatorname {Var}(\widehat{g}_{n}(t))\), \(U_{n}:=U_{n}(t)=\rho_{n}^{-1} \{\widehat{g}_{n}(t)-\mathrm{E}\widehat{g}_{n}(t)\}\), \(V(q)=\sup_{j\geq 1}\sum_{i: \vert j-i \vert \geq q} \vert \operatorname {Cov}(\varepsilon_{i}, \varepsilon_{j}) \vert \) with \(V(n)=O(n^{-(r-2)(r+\delta )/(2\delta )})\), and \(\delta_{n}= \max_{1\leq i\leq n}(t_{i}-t_{i-1})\) (with \(t_{0}=0\)).
First, we make the following basic assumptions:
(A1) \(\{\varepsilon_{j}\}_{j\in \mathbf{Z}}\) has a linear representation \(\varepsilon_{j}=\sum_{k=-\infty }^{\infty }a_{k}e_{k-j}\), where \(\{a_{k}\}\) is a sequence of real numbers with \(\sum_{k=-\infty }^{\infty } \vert a_{k} \vert <\infty \), and \(\{e_{j}\}\) is a strictly stationary LNQD sequence with \(\mathrm{E}e_{j}=0\), \(\mathrm{E} \vert e_{j} \vert ^{r}<\infty \) for some \(r>2\), and \(V(1)=\sup_{j\geq 1}\sum_{i: \vert j-i \vert \geq 1} \vert \operatorname {Cov}(\varepsilon_{i}, \varepsilon_{j}) \vert <\infty \);
(A2) \(g(\cdot )\) satisfies the Lipschitz condition of order α (\(\alpha >0\)) on \([0, 1]\); \(K(\cdot )\) is bounded on \(\mathbf{R}^{1}\), satisfies the Lipschitz condition of order β (\(\beta >0\)) on \(\mathbf{R}^{1}\), and fulfills \(\int^{+\infty }_{-\infty }K(u)\,du=1\) and \(\int^{+\infty }_{-\infty } \vert K(u) \vert \,du< \infty \);
(A3) \(h_{n}\to 0\) and \(\delta_{n}\to 0\) as \(n\to \infty \), \(\frac{1}{h_{n}} ( ( \frac{\delta_{n}}{h_{n}} ) ^{\beta }+ \delta_{n}^{\alpha } ) \to 0\) as \(n\to \infty \), and \(\frac{\delta_{n}}{h_{n}}=O(n^{-\theta })\) for some \(\theta >0\);
(A4) \(\max_{1\leq i\leq n}\frac{t_{i}-t_{i-1}}{h_{n}}K ( \frac{t-t_{i}}{h_{n}} ) =O(\rho_{n}^{2}(t))\leq O ( \frac{\delta_{n}}{h_{n}} ) =O(n^{-\theta })\);
(A5) There exist two positive integers p and q such that \(p+q\leq 3n\), \(qp^{-1}\to 0\), and \(\xi_{in}\to 0\) (\(i=1,2,3,4\)), where \(\xi_{1n}=n^{1-\theta }qp^{-1}\), \(\xi_{2n}=pn^{-\theta }\), \(\xi_{3n}=n ( \sum_{ \vert j \vert >n} \vert a_{j} \vert ) ^{2}\), \(\xi_{4n}=p^{r/2-1}n^{(1-r/2)\theta }\).
Remark 2.1
(A1) is a basic condition on the LNQD sequence, and conditions (A2)-(A3) are standard assumptions on the weighted kernel estimator which have been used by several authors, such as Yang [3, 4, 9] and Pan and Sun [10].
Remark 2.2
Let \(K(\cdot )\) be bounded and suppose that conditions (A2)-(A3) hold. Then we have
Thus, (A4) can be assumed.
Remark 2.3
For (A5), the requirements \(\xi_{in}\to 0\), \(i=1,2,3,4\), are easily satisfied if p and q are chosen reasonably, in the same way as in Yang [5] and Li et al. [14]. Hence (A5) is a standard regularity condition commonly used in the literature.
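For instance (an illustrative choice, different from the one used in the proof of Corollary 2.2), take \(\theta =0.9\), \(p=[n^{1/2}]\), \(q=[n^{1/4}]\) and \(r=3\); then
\[ \xi_{1n}=n^{1-\theta }qp^{-1}=O(n^{-0.15}),\qquad \xi_{2n}=pn^{-\theta }=O(n^{-0.4}),\qquad \xi_{4n}=p^{1/2}n^{-\theta /2}=O(n^{-0.2}), \]
and \(\xi_{3n}=n ( \sum_{ \vert j \vert >n} \vert a_{j} \vert ) ^{2}\to 0\) whenever \(\sum_{ \vert j \vert >n} \vert a_{j} \vert =o(n^{-1/2})\), e.g., for geometrically decaying \(a_{j}\); hence all four conditions in (A5) are met.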
Therefore, we can see that conditions (A1)-(A5) in this paper are suitable and reasonable.
Next, we give the main results as follows.
Theorem 2.1
Assume that (A1)-(A5) hold. Then, for each \(t\in [0,1]\), we have
Corollary 2.1
Assume that (A1)-(A5) hold. Then, for each \(t\in [0,1]\), we have
Corollary 2.2
Assume that (A1)-(A5) hold, that \(\frac{\delta_{n}}{h_{n}}=O(n^{-\theta })\) for some \(\theta >0\), and that \(\sup_{n\geq 1} n^{\frac{2\theta \delta +6\theta +3\delta +3}{12+8\delta }} \sum_{ \vert j \vert >n} \vert a_{j} \vert <\infty \) for some \(\delta > 0\). Then we have
Observe that, taking \(r=3\) and \(\theta \approx 1\) as \(\delta \to 0\), it follows that \(\sup_{y} \vert {\mathrm{P}}(U_{n}(t)\leq y) -\Phi (y) \vert =O(n^{-1/6})\).
Remark 2.4
We develop weighted kernel estimator methods for the nonparametric regression model (1.1), which differ from the estimation methods of Liang and Li [13] and Li et al. [14]. Our theorem and corollaries improve Theorem 3.1 of Li et al. [22] for the case of linear process errors generated by LNQD sequences, and they also generalize the results of Li et al. [14] from linear process errors generated by φ-mixing sequences to those generated by LNQD sequences. Hence the results obtained in this paper generalize and improve some corresponding ones for φ-mixing random variables to the LNQD setting.
3 Some preliminary lemmas
First, we have by (1.1) and (1.2) that
where
Let \(U_{1n}=U_{1n}^{\prime }+U_{1n}^{\prime \prime }+U_{1n}^{\prime \prime \prime }\), where \(U_{1n}^{\prime }=\sum_{v=1}^{k}z_{nv}\), \(U _{1n}^{\prime \prime }=\sum_{v=1}^{k}z_{nv}^{\prime }\), \(U_{1n}^{ \prime \prime \prime }=z^{\prime }_{n k+1}\),
where
then
Next, we give the following main lemmas.
Lemma 3.1
([3])
Let \(K(\cdot )\) satisfy the Lipschitz condition of order β (\(\beta >0\)) on \(\mathbf{R}^{1}\), and \(\int^{+\infty }_{-\infty }K(u)\,du=1\), \(\int^{+\infty }_{-\infty } \vert K(u) \vert \,du< \infty \), where \(K(\cdot )\) is bounded on \(\mathbf{R}^{1}\). Assume that \(h_{n}\to 0\) and \(\delta_{n}\to 0\) as \(n\to \infty \), and \(\frac{1}{h _{n}} ( ( \frac{\delta_{n}}{h_{n}} ) ^{\beta }+\delta_{n} ^{\alpha_{0}} ) \to 0\) as \(n\to \infty \), then for any \(\alpha_{0}>0\),
Lemma 3.2
([22])
Let \(\{X_{j}, j\geq 1\}\) be a LNQD random variable sequence with zero mean and finite second moment \(\sup_{j \geq 1}{\mathrm{E}}(X^{2}_{j})<\infty \). Assume that \(\{a_{j}, j\geq 1\}\) is a real constant sequence satisfying \(a:=\sup_{j\geq 1} \vert a_{j} \vert <\infty \). Then, for any \(r>1\),
Lemma 3.3
([22])
If \(X_{1},\ldots, X_{m}\) are LNQD random variables with finite second moments, let \(\varphi_{j}(t_{j})\) and \(\varphi (t_{1},\ldots,t_{m})\) be characteristic functions of \(X_{j}\) and \((X_{1},\ldots, X_{m})\), respectively, then for all nonnegative (or nonpositive) real numbers \(t_{1},\ldots, t_{m}\),
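an inequality of the following Newman type holds (we state only its shape, with a generic constant C; the precise statement and constant are as in [22]):
\[ \Biggl\vert \varphi (t_{1},\ldots,t_{m})-\prod_{j=1}^{m}\varphi_{j}(t_{j}) \Biggr\vert \leq C\sum_{1\leq j< l\leq m} \vert t_{j} \vert \, \vert t_{l} \vert \, \bigl\vert \operatorname {Cov}(X_{j}, X_{l}) \bigr\vert . \]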
Lemma 3.4
([14])
Suppose that \(\{\zeta_{n}: n\geq 1\}\), \(\{\eta_{n}: n\geq 1\}\) and \(\{\gamma_{n}: n\geq 1\}\) are three random variable sequences, \(\{\xi_{n}: n\geq 1\}\) is a positive constant sequence, and \(\xi_{n}\to 0\). If \(\sup_{y} \vert F_{\zeta_{n}}(y)-\Phi (y) \vert \leq C\xi_{n}\), then for any \(\varepsilon_{1}>0\) and \(\varepsilon_{2}>0\),
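the conclusion is the standard perturbation bound used in this line of work (stated here with a generic constant C; the exact form is as in [14]):
\[ \sup_{y} \bigl\vert F_{\zeta_{n}+\eta_{n}+\gamma_{n}}(y)-\Phi (y) \bigr\vert \leq C \bigl\{ \xi_{n}+\varepsilon_{1}+\varepsilon_{2}+\mathrm{P}\bigl( \vert \eta_{n} \vert \geq \varepsilon_{1}\bigr)+\mathrm{P}\bigl( \vert \gamma_{n} \vert \geq \varepsilon_{2}\bigr) \bigr\} . \]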
Lemma 3.5
Assume that (A1)-(A5) hold. Then we have the following:
(1) \(\mathrm{E}(U_{1n}^{\prime \prime })^{2}\leq C\xi_{1n}\), \(\mathrm{E}(U_{1n}^{\prime \prime \prime })^{2}\leq C\xi_{2n}\), \(\mathrm{E}(U_{2n})^{2}\leq C\xi_{3n}\);
(2) \(\mathrm{P}( \vert U''_{1n} \vert \geq \xi_{1n}^{1/3}) \leq C \xi_{1n}^{1/3}\), \(\mathrm{P}( \vert U'''_{1n} \vert \geq \xi_{2n}^{1/3}) \leq C \xi_{2n}^{1/3}\), \(\mathrm{P}( \vert U_{2n} \vert \geq \xi_{3n}^{1/3}) \leq C \xi_{3n}^{1/3}\).
Proof of Lemma 3.5
By Lemma 3.2 and assumptions (A4)-(A5), we can obtain that
This completes the proof of Lemma 3.5(1). In addition, Lemma 3.5(2) can be derived from the Markov inequality and Lemma 3.5(1) immediately. □
Lemma 3.6
Assume that (A1)-(A5) hold and write \(u_{n}^{2}= \sum_{v=1}^{k}\operatorname {Var}(z_{nv})\). Then we have
Let \(\{\mu_{nv}:v=1,\ldots, k\}\) be independent random variables and \(\mu_{nv}\stackrel{\mathcal{D}}{=}z_{nv}\), \(v=1,\ldots, k\). Write \(H_{n}=\sum_{v=1}^{k}{\mu_{nv}}\). Then we get the following.
Proof of Lemma 3.6
Let \(\Theta_{n}=\sum_{1\leq i< j\leq k} \operatorname {Cov}(z_{ni}, z_{nj})\), then \(u_{n}^{2}=\mathrm{E}(U_{1n}^{\prime })^{2}-2\Theta_{n}\). By \({\mathrm{E}(U_{n})^{2}=1}\), Lemma 3.5(1), the \(C_{r}\)-inequality and the Cauchy-Schwarz inequality, it follows that
Hence, we obtain
On the other hand, from the definition of a LNQD sequence, Lemma 3.1, (A1) and (A4), we can prove that
Therefore, combining equations (3.1) and (3.2), we can get that
□
Lemma 3.7
Assume that (A1)-(A5) hold. Then, applying Lemma 3.6, we can obtain that
Proof of Lemma 3.7
By using the Berry-Esseen inequality [24], we obtain
According to Lemma 3.1, Lemma 3.2, (A1), (A4) and (A5), we can get that
Hence, by Lemma 3.6, combining equations (3.3) and (3.4), we can obtain Lemma 3.7. □
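For reference, one standard form of the Berry-Esseen inequality in Petrov [24], which applies to sums of independent random variables such as the blocks \(\mu_{nv}\) used above, is the following: if \(\xi_{1},\ldots,\xi_{k}\) are independent with \(\mathrm{E}\xi_{v}=0\), \(\mathrm{E} \vert \xi_{v} \vert ^{r}<\infty \) for some \(2< r\leq 3\), and \(B_{k}=\sum_{v=1}^{k}\operatorname {Var}(\xi_{v})\), then
\[ \sup_{x} \Biggl\vert \mathrm{P} \Biggl( B_{k}^{-1/2}\sum_{v=1}^{k}\xi_{v}\leq x \Biggr) -\Phi (x) \Biggr\vert \leq CB_{k}^{-r/2}\sum_{v=1}^{k}\mathrm{E} \vert \xi_{v} \vert ^{r}. \]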
Lemma 3.8
Assume that (A1)-(A5) hold. Then, applying Lemma 3.4, we can obtain that
Proof of Lemma 3.8
Let \(\chi (t)\) and \(\lambda (t)\) be the characteristic functions of \(U_{1n}^{\prime }\) and \(H_{n}\), respectively. It then follows from Lemma 3.3, (A1) and (A4) that
It is easily seen that
which implies
Consequently, from Lemma 3.7, we find that
Therefore
Thus, combining equations (3.5) and (3.6), taking \(T=V^{-1/3}(q) \), it suffices to prove that
□
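The choice \(T=V^{-1/3}(q)\) is the truncation point in Esseen's smoothing inequality (see Petrov [24]), which, with generic constants, states that for any distribution function F with characteristic function f and any \(T>0\),
\[ \sup_{x} \bigl\vert F(x)-\Phi (x) \bigr\vert \leq C_{1} \int_{-T}^{T} \biggl\vert \frac{f(t)-e^{-t^{2}/2}}{t} \biggr\vert \,dt+\frac{C_{2}}{T}. \]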
4 Proofs of the main results
Proof of Theorem 2.1
According to Lemma 3.8, Lemma 3.7 and Lemma 3.6, it follows that
Hence, by (4.1)-(4.4), we have that
Thus, by Lemma 3.4, Lemma 3.5(2) and (4.5), it suffices to prove that
□
Proof of Corollary 2.1
By (A1) we can easily see that \(V(q)\to 0\); therefore, Corollary 2.1 holds. □
Proof of Corollary 2.2
Let \(p=[n^{\kappa }]\) and \(q=[n^{2\kappa -1}]\). Taking \(\kappa =\frac{2\theta \delta +\delta +3}{6+4\delta }\) with \(0<\kappa <\theta \), we have for \(r=3\) that
Therefore, the desired result is completed by Corollary 2.1 immediately. □
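As a quick arithmetic check of the \(O(n^{-1/6})\) rate noted after Corollary 2.2: with \(p=[n^{\kappa }]\) and \(q=[n^{2\kappa -1}]\), the quantities in (A5) satisfy
\[ \xi_{1n}=n^{1-\theta }qp^{-1}=O\bigl(n^{\kappa -\theta }\bigr),\qquad \xi_{2n}=pn^{-\theta }=O\bigl(n^{\kappa -\theta }\bigr),\qquad \xi_{4n}=p^{1/2}n^{-\theta /2}=O\bigl(n^{(\kappa -\theta )/2}\bigr)\quad (r=3), \]
while the summability condition of Corollary 2.2 gives \(\xi_{3n}=n ( \sum_{ \vert j \vert >n} \vert a_{j} \vert ) ^{2}=O(n^{\kappa -\theta })\), since \(\kappa -\theta =\frac{\delta +3-6\theta -2\theta \delta }{6+4\delta }\). Letting \(\delta \to 0\) and \(\theta \to 1\) gives \(\kappa -\theta \to -\frac{1}{2}\), so \(\xi_{1n}^{1/3}\), \(\xi_{2n}^{1/3}\) and \(\xi_{3n}^{1/3}\) are of order \(n^{-1/6}\) and \(\xi_{4n}\) is of order \(n^{-1/4}\), which is consistent with the stated \(O(n^{-1/6})\) rate.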
References
Schuster, E, Yakowitz, S: Contributions to the theory of nonparametric regression, with application to system identification. Ann. Stat. 7(1), 139-149 (1979)
Qin, YS: A result on the nonparametric estimation of regression functions. J. Eng. Math. 6(3), 120-123 (1989) (in Chinese)
Yang, SC: Consistency of weighted kernel estimators of nonparametric regression functions under φ-mixing errors. Appl. Math. J. Chin. Univ. 10(2), 173-180 (1995)
Yang, SC: The weighted kernel estimators of nonparametric regression function with censored data. Acta Math. Sin. 42(2), 255-262 (1999)
Yang, SC: Uniformly asymptotic normality of the regression weighted estimator for negatively associated samples. Stat. Probab. Lett. 62, 101-110 (2003)
Zhou, XC, Lin, JG, Yin, CM: Asymptotic properties of wavelet-based estimator in nonparametric regression model with weakly dependent processes. J. Inequal. Appl. 2013, Article ID 261 (2013)
Priestley, MB, Chao, MT: Nonparametric function fitting. J. R. Stat. Soc. B 34(3), 385-392 (1972)
Benedetti, JK: On the nonparametric estimation of regression functions. J. R. Stat. Soc. B 39(2), 248-253 (1977)
Yang, SC: Moment inequality for mixing sequences and nonparametric estimation. Acta Math. Sin. 40(2), 271-279 (1997)
Pan, JM, Sun, YP: Strong consistency of the weighted kernel estimate of nonparametric regression function with censored data. Math. Stat. Appl. Probab. 12(2), 151-160 (1997)
Cheng, C: A Berry-Esseen-type theorem of quantile density estimators. Stat. Probab. Lett. 39(3), 255-262 (1998)
Wang, JF, Zhang, LX: A Berry-Esséen theorem for weakly negatively dependent random variables and its applications. Acta Math. Hung. 110(4), 293-308 (2006). https://doi.org/10.1007/s10474-006-0024-x
Liang, HY, Li, YY: A Berry-Esseen type bound of regression estimator based on linear process errors. J. Korean Math. Soc. 45(6), 1753-1767 (2008)
Li, YM, Wei, CD, Xin, GD: Berry-Esseen bounds of wavelet estimator in a regression with linear process errors. Stat. Probab. Lett. 81(1), 103-111 (2011)
Yang, WZ, Hu, SH, Wang, XJ, Zhang, QC: Berry-Esséen bound of sample quantiles for negatively associated sequence. J. Inequal. Appl. 2011, Article ID 83 (2011)
Lehmann, EL: Some concepts of dependence. Ann. Math. Stat. 37, 1137-1153 (1966)
Newman, CM: Asymptotic independence and limit theorems for positively and negatively dependent random variables. In: Tong, YL (ed.) Inequalities in Statistics and Probability. IMS Lecture Notes-Monograph Series, vol. 5, pp. 127-140. Institute of Mathematical Statistics, Hayward (1984)
Ko, MH, Choi, YK, Choi, YS: Exponential probability inequality for linearly negative quadrant dependent random variables. Commun. Korean Math. Soc. 22(1), 137-143 (2007)
Ko, MH, Ryu, DH, Kim, TS: Limiting behaviors of weighted sums for linearly negative quadrant dependent random variables. Taiwan. J. Math. 11(2), 511-522 (2007)
Wang, XJ, Hu, SH, Yang, WZ, Li, XQ: Exponential inequalities and complete convergence for a LNQD sequence. J. Korean Stat. Soc. 39, 555-564 (2010)
Wang, JF, Wu, QY: Strong laws of large numbers for arrays of rowwise NA and LNQD random variables. J. Probab. Stat. (2011). https://doi.org/10.1155/2011/708087
Li, YM, Guo, JH, Li, NY: Some inequalities for a LNQD sequence with applications. J. Inequal. Appl. 2012, Article ID 216 (2012)
Shen, AT, Zhu, HY, Wu, RC, Zhang, Y: Complete convergence for weighted sums of LNQD random variables. Stochastics 87(1), 160-169 (2015)
Petrov, VV: Limit Theorems of Probability Theory: Sequences of Independent Random Variables. Oxford University Press, New York (1995)
Acknowledgements
This research is supported by the National Natural Science Foundation of China (11271189, 11461057) and the 2017 Youth Teacher Research and Development Fund Project of Guangxi University of Finance and Economics (2017QNA01).
Contributions
All authors contributed equally and read and approved the final manuscript.
Ethics declarations
Competing interests
The authors declare that they have no competing interests.