Further research on complete moment convergence for moving average process of a class of random variables
Journal of Inequalities and Applications volume 2017, Article number: 46 (2017)
Abstract
In this article, the complete moment convergence for the partial sum of moving average processes \(\{X_{n}=\sum_{i=-\infty}^{\infty}a_{i}Y_{i+n},n\geq 1\}\) is established under some mild conditions, where \(\{Y_{i},-\infty < i<\infty\}\) is a doubly infinite sequence of random variables satisfying the Rosenthal type maximal inequality and \(\{a_{i},-\infty< i<\infty\}\) is an absolutely summable sequence of real numbers. These conclusions extend and improve the corresponding results given by Ko (J. Inequal. Appl. 2015:225, 2015).
1 Introduction
We first introduce the Rosenthal type maximal inequality, which is one of the most useful inequalities in probability theory and mathematical statistics. Suppose that \(\{ Y_{n},n\geq1\}\) is a sequence of random variables satisfying \(E\vert Y_{i}\vert ^{r}<\infty\) for some \(r\geq2\). Then there exists a positive constant \(C(r)\), depending only on r, such that
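The display (1.1) is not reproduced above. In its standard form (the exact display in the published article may differ slightly in notation), the Rosenthal type maximal inequality reads:

```latex
E\Bigl(\max_{1\le k\le n}\Bigl\vert \sum_{i=1}^{k}Y_{i}\Bigr\vert ^{r}\Bigr)
\le C(r)\Biggl\{\sum_{i=1}^{n}E\vert Y_{i}\vert ^{r}
+\Bigl(\sum_{i=1}^{n}EY_{i}^{2}\Bigr)^{r/2}\Biggr\}.
\tag{1.1}
```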
Inequality (1.1) is satisfied by many dependent or mixing sequences. Peligrad [2], Zhou [3], Wang and Lu [4], and Utev and Peligrad [5] established it for ρ-mixing, φ-mixing, \(\rho^{-}\)-mixing, and ρ̃-mixing sequences, respectively. We also refer to Shao [6], Stoica [7], Shen [8], and Yuan and An [9] for negatively associated (NA) sequences, martingale difference sequences, extended negatively dependent (END) sequences, and asymptotically almost negatively associated (AANA) random sequences, respectively.
The following definitions will be useful in this paper. The first one can be found in Kuczmaszewska [10].
Definition 1.1
A sequence \(\{Y_{i},-\infty< i<\infty\}\) of random variables is said to satisfy a weak dominating condition with a dominating random variable Y if
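The defining display is missing here. In Kuczmaszewska's formulation [10] it presumably reads (this reconstruction is an assumption, consistent with Lemma 3.2 below):

```latex
\frac{1}{n}\sum_{i=1}^{n}P\bigl\{\vert Y_{i}\vert >x\bigr\}
\le C\,P\bigl\{\vert Y\vert >x\bigr\}
\quad\text{for all } x\ge 0 \text{ and } n\ge 1,
```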
where C is a positive constant.
Definition 1.2
A real valued function \(l(x)\), positive and measurable on \([0,\infty)\), is said to be slowly varying at infinity if for each \(\lambda>0\), \(\lim_{x\to\infty}\frac{l(\lambda x)}{l(x)}=1\).
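A standard example: \(l(x)=\log(1+x)\) is slowly varying, since for each fixed \(\lambda>0\),

```latex
\lim_{x\to\infty}\frac{\log(1+\lambda x)}{\log(1+x)}
=\lim_{x\to\infty}\frac{\log\lambda+\log x+o(1)}{\log x+o(1)}=1 .
```

Constants and powers of logarithms are likewise slowly varying, whereas \(x^{\delta}\) with \(\delta\neq0\) is not.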
Throughout the paper, let \(\{Y_{i},-\infty< i<\infty\}\) be a sequence of random variables with zero means and \(\{a_{i},-\infty< i<\infty\}\) be a sequence of real numbers with \(\sum_{i=-\infty}^{\infty} \vert a_{i}\vert <\infty\), and the moving average process \(\{X_{n}, n\geq1\}\) is defined by \(X_{n}=\sum_{i=-\infty}^{\infty}a_{i}Y_{i+n}\). The complete moment convergence of moving average process \(\{X_{n},n\geq1\}\) has been widely investigated by many authors. We list some results as follows.
Li and Zhang [11] established the following complete moment convergence of moving average processes under NA assumptions.
Theorem A
Let \(\{X_{n}=\sum_{i=-\infty}^{\infty} a_{i}\varepsilon_{i+n}, n\geq1\}\) be a moving average process, where \(\{a_{i},-\infty< i<\infty\}\) is a sequence of real numbers with \(\sum_{i=-\infty}^{\infty} \vert a_{i}\vert <\infty\) and \(\{\varepsilon_{i},-\infty< i<\infty\}\) is a sequence of identically distributed NA random variables with \(E\varepsilon_{1}=0\) and \(E\varepsilon_{1}^{2}<\infty\). Let h be a function slowly varying at infinity, \(1\leq q<2 \), and \(r>1+q/2\). Then \(E\vert \varepsilon_{1}\vert ^{r}h(\vert \varepsilon_{1}\vert ^{q})<\infty\) implies
for all \(\varepsilon>0\).
Later on, the following complete moment convergence of moving average processes generated by ρ-mixing sequence was proved by Zhou and Lin [12].
Theorem B
Let h be a function slowly varying at infinity, \(p\geq1\), \(p\alpha>1\), and \(\alpha>1/2\). Suppose that \(\{X_{n},n\geq1\}\) is a moving average process based on a sequence \(\{ Y_{i},-\infty< i<\infty\}\) of identically distributed ρ-mixing random variables. If \(EY_{1}=0\) and \(E\vert Y_{1}\vert ^{p+\delta }h(\vert Y_{1}\vert ^{1/{\alpha}})<\infty\) for some \(\delta>0\), then for all \(\varepsilon>0\),
Recently, Ko [1] obtained the complete moment convergence of moving average processes generated by a class of random variables.
Theorem C
Let h be a function slowly varying at infinity, \(p\geq1\), \(p\alpha>1\), and \(\alpha>1/2\). Assume that \(\{ a_{i},-\infty< i<\infty\}\) is an absolutely summable sequence of real numbers and that \(\{ Y_{i},-\infty< i<\infty\}\) is a sequence of mean zero random variables satisfying a weak mean dominating condition with a mean dominating random variable Y and \(E\vert Y\vert ^{p}h(\vert Y\vert ^{1/{\alpha}})<\infty\). Suppose that \(\{X_{n},n\geq1\}\) is a moving average process based on the sequence \(\{ Y_{i},-\infty< i<\infty\}\). Assume that the Rosenthal type maximal inequality of \(Y_{xj}=-xI\{Y_{j}<-x\}+Y_{j}I\{\vert Y_{j}\vert \leq x\}+xI\{Y_{j}>x\}\) holds for any \(q\geq2\) and \(x> 0\). Then, for all \(\varepsilon>0\),
The aim of this paper is to study the complete moment convergence of moving average processes of random sequences under the assumption that the random variables satisfy the Rosenthal type maximal inequality and the weak mean dominating condition. The paper is organized as follows: in Section 2 we state the main results, while Sections 3 and 4 provide some lemmas and the details of the proofs, respectively. Throughout the sequel, C denotes a positive constant whose value may change from one appearance to the next, \(a_{n}=O(b_{n})\) means \(\vert a_{n}/{b_{n}}\vert \leq C\), and \(I\{A\}\) stands for the indicator function of the set A.
2 Main results
Theorem 2.1
Let l be a function slowly varying at infinity. Suppose that \(\{ a_{i},-\infty< i<\infty\}\) is an absolutely summable sequence of real numbers. Let \(\{g(n), n\geq1\}\) and \(\{f(n), n\geq1\}\) be two sequences of positive constants such that, for some \(r\geq\max\{2,p\}\), \(p\geq1\),
- (C1) \(f(n)\uparrow\infty\), \(\frac{n}{f^{p}(n)}\to0\);
- (C2) \(\sum_{m=1}^{k}\log(\frac{f(m+1)}{f(m)})\sum_{n=1}^{m}\frac{ng(n)l(n)}{f(n)}=O(f^{p-1}(k)l(k))\);
- (C3) \(\sum_{m=k}^{\infty}[f^{1-r}(m+1)-f^{1-r}(m)]\sum_{n=1}^{m}\frac{ng(n)l(n)}{f(n)}=O(f^{p-r}(k)l(k))\);
- (C4) \(\sum_{m=1}^{k}[f(m+1)-f(m)]\sum_{n=1}^{m}\frac{ng(n)l(n)}{f(n)}=O(f^{p}(k)l(k))\);
- (C5) \(\sum_{m=1}^{\infty}[f^{1-r}(m+1)-f^{1-r}(m)]f^{t}(m+1)\sum_{n=1}^{m}\frac{n^{r/2}g(n)l(n)}{f(n)}<\infty\), where \(t=\max\{0,2-p\}r/2\);
- (C6) \(\sum_{m=1}^{\infty}[f(m+1)-f(m)]f^{t'}(m+1)\sum_{n=1}^{m}\frac{n^{r/2}g(n)l(n)}{f(n)}<\infty\), where \(t'=-\min\{2,p\}r/2\).
Assume that \(\{X_{n}=\sum_{i=-\infty}^{\infty} a_{i}Y_{i+n}, n\geq1\}\) is a moving average process generated by a sequence of random variables \(\{ Y_{i},-\infty< i<\infty\}\) with zero means, satisfying a weak dominating condition with a dominating random variable Y and \(E\vert Y\vert ^{p}(1\vee l(f^{-1}(\vert Y\vert )))<\infty\), where \(f^{-1}\) is the inverse function of f.
Assume that the Rosenthal type maximal inequality of \(Y_{xj}=-xI\{Y_{j}<-x\}+Y_{j}I\{\vert Y_{j}\vert \leq x\}+xI\{Y_{j}>x\}\) holds for the above r and all \(x> 0\). Then, for all \(\varepsilon>0\),
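Note that the truncation \(Y_{xj}\) used above is simply \(Y_{j}\) clipped to the interval \([-x,x]\):

```latex
Y_{xj}=-xI\{Y_{j}<-x\}+Y_{j}I\{\vert Y_{j}\vert \le x\}+xI\{Y_{j}>x\}
=\max\bigl\{-x,\min\{Y_{j},x\}\bigr\},
```

so \(\vert Y_{xj}\vert \le x\) and \(\vert Y_{j}-Y_{xj}\vert =(\vert Y_{j}\vert -x)I\{\vert Y_{j}\vert >x\}\le \vert Y_{j}\vert I\{\vert Y_{j}\vert >x\}\), a bound used in the proof of Theorem 2.1.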
Corollary 2.2
If we replace conditions (C2)-(C6) by the following:

- (C7) \(\sum_{n=1}^{k} \frac{ng(n)l(n)}{f(n)}=O(f^{p-1}(k)l(k))\), \(\sum_{n=1}^{\infty}\frac{n^{r/2}g(n)l(n)}{f^{\min\{2,p\}r}(n)}<\infty\), \(\sum_{n=k}^{\infty}\frac{ng(n)l(n)}{f^{r}(n)}=O(f^{p-r}(k)l(k))\),

and the other assumptions of Theorem 2.1 still hold, then, for all \(\varepsilon>0\), we have
Conditions (C1)-(C7) are satisfied by many sequences; we list some examples in the following remarks.
Remark 2.3
Let \(g(n)=n^{p\alpha-2}\) and \(f(n)=n^{\alpha}\) for \(p\alpha>1\) and \(1/2<\alpha\leq1\), and assume that (1.1) holds true for \(\{Y_{xj}\}\) and
then conditions (C1)-(C7) can be verified easily by Lemma 3.1, and therefore we know
Obviously, Theorem 3.1 and Corollary 3.2 of Ko [1] coincide with (2.3) and (2.4), respectively, so we extend the known results. If we take \(a_{0}=1\), \(a_{i}=0\) for \(i\neq0\), and \(l(x)=1\), and let \(\{Y,Y_{i},-\infty < i<\infty\}\) be a sequence of i.i.d. random variables, then \(\sum_{n=1}^{\infty}n^{p\alpha-1}P\{\vert Y\vert >n^{\alpha}\}<\infty\) is equivalent to \(E\vert Y\vert ^{p}<\infty\), which implies (2.4); thus we recover Remark 1.1 of Chen et al. [13].
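As a sample verification in this setting (a sketch; the remaining conditions are checked similarly), condition (C1) and the first part of (C7) follow directly from Lemma 3.1:

```latex
\frac{n}{f^{p}(n)}=n^{1-p\alpha}\to 0 \quad (p\alpha>1),
\qquad
\sum_{n=1}^{k}\frac{ng(n)l(n)}{f(n)}
=\sum_{n=1}^{k}n^{p\alpha-1-\alpha}l(n)
\le C k^{(p-1)\alpha}l(k)=C f^{p-1}(k)l(k),
```

where the last bound uses Lemma 3.1(1) with \(s=p\alpha-1-\alpha>-1\) (note that \(p\alpha>1\) and \(\alpha\le1\) force \(p>1\), hence \(\alpha(p-1)>0\)).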
Remark 2.4
If we take \(g(n)=n^{s-2}\) and \(f(n)=n^{s/p}\) for \(s>p>1\), and suppose that (1.1) holds true for \(\{Y_{xj}\}\) and
then conditions (C1)-(C7) can be verified easily by Lemma 3.1, so we can obtain
Remark 2.5
If we set \(g(n)=\frac{\log n}{n}\) and \(f(n)=(n\log n)^{1/p}\) for \(1< p\leq 2\), and assume that (1.1) holds true for \(\{Y_{xj}\}\) with \(r>4\), then conditions (C1)-(C7) are easily seen to be satisfied by Lemma 3.1, so we can obtain
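In the setting of Remark 2.5, condition (C1), for example, is immediate:

```latex
\frac{n}{f^{p}(n)}=\frac{n}{n\log n}=\frac{1}{\log n}\to 0 .
```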
Remark 2.6
Set \(g(n)=\frac{1}{n\log n}\) and \(f(n)=(n\log\log n)^{1/p}\) for \(1< p\leq2\), and assume that (1.1) holds true for \(\{Y_{xj}\}\) with \(r>2\); then conditions (C1)-(C7) are easily verified by Lemma 3.1, and hence the following holds:
Theorem 2.7
Let l be a function slowly varying at infinity. Suppose that \(\{X_{n}= \sum_{i=-\infty}^{\infty} a_{i}Y_{i+n}, n\geq1\}\) is a moving average process generated by a sequence of random variables \(\{ Y_{i},-\infty< i<\infty\}\) with zero means, where \(\{a_{i},-\infty< i<\infty\}\) is an absolutely summable sequence of real numbers. Let \(\{g(n), n\geq1\}\) and \(\{f(n), n\geq1\}\) be two sequences of positive constants with \(f(n)\uparrow\infty\), and let \(\{\Psi_{n}(t),n\geq1\}\) be a sequence of even and nonnegative functions such that, for each \(n\geq1\), \(\Psi_{n}(t)>0\) for \(t>0\). Assume that
for some \(1\leq p< q\leq2\), and
for any \(j\geq0\). Assume that the Rosenthal type maximal inequality of \(Y_{nj}=Y_{j}I\{\vert Y_{j}\vert \leq f(n)\}\) holds true for \(r=2\). Then, for all \(\varepsilon>0\),
3 Preliminary lemmas
In order to prove the main results, we shall need the following lemmas.
Lemma 3.1
Zhou [3]
If l is slowly varying at infinity, then
- (1) \(\sum_{n=1}^{m}n^{s}l(n)\leq C m^{s+1}l(m)\) for \(s>-1\) and any positive integer m;
- (2) \(\sum_{n=m}^{\infty}n^{s}l(n)\leq C m^{s+1}l(m)\) for \(s<-1\) and any positive integer m.
Lemma 3.2
Gut [14]
Let \(\{Y_{n}, n\geq1\}\) be a sequence of random variables satisfying a weak dominating condition with a dominating random variable Y. For any \(b>0\), set
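The truncation display is missing here; in Gut's notation [14], the truncated variables are presumably defined as

```latex
Y_{i}'=Y_{i}I\{\vert Y_{i}\vert \le b\},\qquad
Y_{i}''=Y_{i}I\{\vert Y_{i}\vert >b\},\qquad
Y'=YI\{\vert Y\vert \le b\},\qquad
Y''=YI\{\vert Y\vert >b\}.
```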
Then, for any \(a>0\) and some constant C,
- (1) if \(E\vert Y\vert ^{a}<\infty\), then \(n^{-1}\sum_{i=1}^{n}E\vert Y_{i}\vert ^{a}\leq CE\vert Y\vert ^{a}\);
- (2) \(n^{-1}\sum_{i=1}^{n}E\vert Y_{i}^{\prime} \vert ^{a}\leq C(E\vert Y^{\prime} \vert ^{a}+b^{a}P\{\vert Y\vert >b\})\);
- (3) \(n^{-1}\sum_{i=1}^{n}E\vert Y_{i}^{\prime\prime} \vert ^{a}\leq CE\vert Y^{\prime\prime} \vert ^{a}\).
4 Proofs
Proof of Theorem 2.1
Clearly, \(\sum_{k=1}^{n}X_{k}=\sum_{i=-\infty}^{\infty}a_{i}\sum_{j=i+1}^{i+n}Y_{j}\). Noting that \(\sum_{i=-\infty}^{\infty} \vert a_{i}\vert <\infty\), \(EY_{i}=0\), and \(E\vert Y\vert ^{p}(1\vee l(f^{-1}(\vert Y\vert )))<\infty\), by Lemma 3.2 and condition (C1), for any \(x>f(n)\), we conclude
Therefore, one can get
for any \(\varepsilon>0\) and \(x>f(n)\) large enough. Hence it follows that
We first show that \(I_{1}<\infty\). Since \(\vert Y_{j}-Y_{xj}\vert \leq \vert Y_{j}\vert I\{\vert Y_{j}\vert > x\}\), it follows by Markov's inequality, Lemma 3.2, and conditions (C1) and (C2) that
It remains to show that \(I_{2}<\infty\). By Markov's inequality, the Hölder inequality, and the Rosenthal type maximal inequality, for \(r>\max\{ 2,p\}\), it is easy to see that
For \(I_{21}\), it follows by the \(C_{r}\) inequality, Lemma 3.2, and conditions (C1), (C3), and (C4) that
Finally, we show that \(I_{22}<\infty\). By the \(C_{r}\) inequality, Lemma 3.2, and conditions (C1), (C5), and (C6), it follows that
Hence the proof of (2.1) is completed by combining (4.1)-(4.5). □
Proof of Theorem 2.7
Clearly \(\sum_{j=1}^{k}X_{j}=\sum_{i=-\infty}^{\infty}a_{i}\sum_{j=i+1}^{i+k}Y_{j}\). Noting that \(\sum_{i=-\infty}^{\infty} \vert a_{i}\vert <\infty\) and \(EY_{j}=0\), then by (2.5) and (2.6), we know
Hence for n large enough and any \(\varepsilon>0\), we obtain
Then one can get
By Markov’s inequality, (2.5), and (2.6), it is easy to check that
It follows from Markov’s inequality, the Hölder inequality, the Rosenthal type inequality, (2.5), and (2.6) that
Thus we have completed the proof of Theorem 2.7. □
References
Ko, MH: Complete moment convergence of moving average process generated by a class of random variables. J. Inequal. Appl. 2015, 225 (2015)
Peligrad, M: Convergence rates of the strong law for stationary mixing sequences. Z. Wahrscheinlichkeitstheor. Verw. Geb. 70(2), 307-314 (1985)
Zhou, XC: Complete moment convergence of moving average processes under φ-mixing assumptions. Stat. Probab. Lett. 80, 285-292 (2010)
Wang, JF, Lu, FB: Inequalities of maximum of partial sums and weak convergence for a class of weak dependent random variables. Acta Math. Sin. 22, 693-700 (2006)
Utev, S, Peligrad, M: Maximal inequalities and an invariance principle for a class of weakly dependent random variables. J. Theor. Probab. 16(1), 101-115 (2003)
Shao, QM: A comparison theorem on moment inequalities between negatively associated and independent random variables. J. Theor. Probab. 13(2), 343-356 (2000)
Stoica, G: A note on the rate of convergence in the strong law of large numbers for martingales. J. Math. Anal. Appl. 381(2), 910-913 (2011)
Shen, AT: Probability inequalities for END sequence and their applications. J. Inequal. Appl. 2011, 98 (2011)
Yuan, DM, An, J: Rosenthal type inequalities for asymptotically almost negatively associated random variables and applications. Sci. China Ser. A 52(9), 1887-1904 (2009)
Kuczmaszewska, A: On complete convergence in Marcinkiewicz-Zygmund type SLLN for negatively associated random variables. Acta Math. Hung. 128(1-2), 116-130 (2010)
Li, YX, Zhang, LX: Complete moment convergence of moving average processes under dependence assumptions. Stat. Probab. Lett. 70, 191-197 (2004)
Zhou, XC, Lin, JG: Complete moment convergence of moving average processes under ρ-mixing assumption. Math. Slovaca 61(6), 979-992 (2011)
Chen, PY, Yi, JM, Sung, SH: An extension of the Baum-Katz theorem to i.i.d. random variables with general moment conditions. J. Inequal. Appl. 2015, 414 (2015)
Gut, A: Complete convergence for arrays. Period. Math. Hung. 25(1), 51-75 (1992)
Acknowledgements
The paper is supported by NSFC (Grant Nos. 11101180, 11201175) and the Science and Technology Development Program of Jilin Province (Grant Nos. 20130522096JH, 20140520056JH, 20170101152JC).
Additional information
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
All authors contributed to each part of this work equally and read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Cite this article
Zhang, Y., Ding, X. Further research on complete moment convergence for moving average process of a class of random variables. J Inequal Appl 2017, 46 (2017). https://doi.org/10.1186/s13660-017-1322-2