Theorem 2.1
Let l be a function slowly varying at infinity. Suppose that \(\{ a_{i},-\infty< i<\infty\}\) is an absolutely summable sequence of real numbers. Let \(\{g(n), n\geq1\}\) and \(\{f(n), n\geq1\}\) be two sequences of positive constants such that, for some \(r\geq\max\{2,p\}\) with \(p\geq1\),
- (C1) \(f(n)\uparrow\infty\), \(\frac{n}{f^{p}(n)}\to0\);
- (C2) \(\sum_{m=1}^{k}\log(\frac{f(m+1)}{f(m)})\sum_{n=1}^{m}\frac{ng(n)l(n)}{f(n)}=O(f^{p-1}(k)l(k))\);
- (C3) \(\sum_{m=k}^{\infty}[f^{1-r}(m+1)-f^{1-r}(m)]\sum_{n=1}^{m}\frac{ng(n)l(n)}{f(n)}=O(f^{p-r}(k)l(k))\);
- (C4) \(\sum_{m=1}^{k}[f(m+1)-f(m)]\sum_{n=1}^{m}\frac{ng(n)l(n)}{f(n)}=O(f^{p}(k)l(k))\);
- (C5) \(\sum_{m=1}^{\infty}[f^{1-r}(m+1)-f^{1-r}(m)]f^{t}(m+1)\sum_{n=1}^{m}\frac{n^{r/2}g(n)l(n)}{f(n)}<\infty\), where \(t=\max\{0,2-p\}r/2\);
- (C6) \(\sum_{m=1}^{\infty}[f(m+1)-f(m)]f^{t'}(m+1)\sum_{n=1}^{m}\frac{n^{r/2}g(n)l(n)}{f(n)}<\infty\), where \(t'=-\min\{2,p\}r/2\).
Assume that \(\{X_{n}=\sum_{i=-\infty}^{\infty} a_{i}Y_{i+n}, n\geq1\}\) is a moving average process generated by a sequence of random variables \(\{ Y_{i},-\infty< i<\infty\}\) with zero means that satisfies a weak dominating condition with dominating random variable Y and \(E\vert Y\vert ^{p}(1\vee l(f^{-1}(\vert Y\vert )))<\infty\), where \(f^{-1}\) is the inverse function of f. Assume that the Rosenthal type maximal inequality for \(Y_{xj}=-xI\{Y_{j}<-x\}+Y_{j}I\{\vert Y_{j}\vert \leq x\}+xI\{Y_{j}>x\}\) holds for the above r and all \(x> 0\). Then, for all \(\varepsilon>0\),
$$\begin{aligned} \sum_{n=1}^{\infty} \frac{g(n)l(n)}{f(n)}E\Biggl\{ \max_{1\leq k\leq n} \Biggl\vert \sum _{j=1}^{k}X_{j}\Biggr\vert -\varepsilon f(n)\Biggr\} ^{+} < \infty. \end{aligned}$$
(2.1)
Corollary 2.2
If we replace conditions (C2)-(C6) by the following condition:
- (C7) \(\sum_{n=1}^{k} \frac{ng(n)l(n)}{f(n)}=O(f^{p-1}(k)l(k))\), \(\sum_{n=1}^{\infty}\frac{n^{r/2}g(n)l(n)}{f^{\min\{2,p\}r}(n)}<\infty\), \(\sum_{n=k}^{\infty}\frac{ng(n)l(n)}{f^{r}(n)}=O(f^{p-r}(k)l(k))\),

and the other assumptions of Theorem 2.1 remain in force, then, for all \(\varepsilon>0\), we have
$$\begin{aligned} \sum_{n=1}^{\infty}{g(n)l(n)}P \Biggl\{ \max_{1\leq k\leq n} \Biggl\vert \sum _{j=1}^{k}X_{j}\Biggr\vert >\varepsilon f(n)\Biggr\} < \infty. \end{aligned}$$
(2.2)
Conditions (C1)-(C7) are satisfied by many sequences; we list some in the following remarks.
Remark 2.3
Let \(g(n)=n^{p\alpha-2}\) and \(f(n)=n^{\alpha}\) for \(p\alpha>1\) and \(1/2<\alpha\leq1\), and assume that (1.1) holds for \(\{Y_{xj}\}\) and
$$ \textstyle\begin{cases} r>2, & 1< p\leq2,\\ r>\frac{2(p\alpha-1)}{2\alpha-1}, & p>2, \end{cases} $$
then conditions (C1)-(C7) can easily be verified by Lemma 3.1, and therefore
$$\begin{aligned} &\sum_{n=1}^{\infty}n^{p\alpha-\alpha-2}l(n)E \Biggl\{ \max_{1\leq k\leq n} \Biggl\vert \sum _{j=1}^{k}X_{j}\Biggr\vert -\varepsilon n^{\alpha}\Biggr\} ^{+} < \infty, \end{aligned}$$
(2.3)
$$\begin{aligned} &\sum_{n=1}^{\infty}n^{p\alpha-2}l(n)P \Biggl\{ \max_{1\leq k\leq n} \Biggl\vert \sum _{j=1}^{k}X_{j}\Biggr\vert >\varepsilon n^{\alpha}\Biggr\} < \infty. \end{aligned}$$
(2.4)
Theorem 3.1 and Corollary 3.2 of Ko [1] coincide with (2.3) and (2.4), respectively, so our results extend the known ones. If we take \(a_{0}=1\), \(a_{i}=0\) for \(i\neq0\), and \(l(x)=1\), and let \(\{Y,Y_{i},-\infty < i<\infty\}\) be a sequence of i.i.d. random variables, then \(\sum_{n=1}^{\infty}n^{p\alpha-1}P\{\vert Y\vert >n^{\alpha}\}<\infty\) is equivalent to \(E\vert Y\vert ^{p}<\infty\), which implies (2.4); thus we recover Remark 1.1 of Chen [13].
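As a minimal numeric sanity check (our own illustration, not part of the proof), one can watch the first bound in (C7) hold for Remark 2.3's choice, taking \(l(n)=1\) and sample parameter values of our choosing:

```python
# Illustrative check (not a proof) of the first bound in (C7) for Remark 2.3:
# with g(n) = n^{p*alpha-2}, f(n) = n^alpha, l(n) = 1, the partial sum
# S(k) = sum_{n<=k} n*g(n)/f(n) = sum_{n<=k} n^{p*alpha-1-alpha}
# should be O(f^{p-1}(k)) = O(k^{alpha*(p-1)}).
p, alpha = 2.0, 0.75  # sample values with p*alpha = 1.5 > 1 and 1/2 < alpha <= 1

def S(k):
    return sum(n ** (p * alpha - 1 - alpha) for n in range(1, k + 1))

ratios = [S(k) / k ** (alpha * (p - 1)) for k in (10**2, 10**3, 10**4)]
print(ratios)  # bounded: increases toward 1/(p*alpha - alpha) = 4/3
```

The ratio stabilizing near a finite constant is exactly what the \(O(\cdot)\) bound predicts for this choice of \(g\) and \(f\).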
Remark 2.4
If we take \(g(n)=n^{s-2}\) and \(f(n)=n^{s/p}\) for \(s>p>1\), and suppose that (1.1) holds for \(\{Y_{xj}\}\) and
$$ \textstyle\begin{cases} r>2, & 1< p\leq2,\\ r> \frac{2(s-1)p}{2s-p}, & p>2, \end{cases} $$
then conditions (C1)-(C7) can easily be verified by Lemma 3.1, so we obtain
$$\begin{aligned} &\sum_{n=1}^{\infty}n^{s-s/p-2}l(n)E\Biggl\{ \max_{1\leq k\leq n} \Biggl\vert \sum_{j=1}^{k}X_{j} \Biggr\vert -\varepsilon n^{s/p}\Biggr\} ^{+} < \infty, \\ &\sum_{n=1}^{\infty}n^{s-2}l(n)P\Biggl\{ \max_{1\leq k\leq n} \Biggl\vert \sum_{j=1}^{k}X_{j} \Biggr\vert >\varepsilon n^{s/p}\Biggr\} < \infty. \end{aligned}$$
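A quick arithmetic check (our own illustration) of the second series in (C7) for Remark 2.4's choice: with \(g(n)=n^{s-2}\), \(f(n)=n^{s/p}\), \(l(n)=1\), the general term is a pure power \(n^{e}\), so the series converges exactly when \(e<-1\). The sample parameter triples below are our own and merely satisfy the stated constraints:

```python
# Illustrative check (not a proof) that the second series in (C7) converges
# for Remark 2.4's choice: the general term n^{r/2} g(n) / f^{min(2,p)r}(n)
# equals n^e with e = r/2 + s - 2 - min(2,p)*(s/p)*r, and sum n^e < infinity
# iff e < -1.
def term_exponent(s, p, r):
    return r / 2 + s - 2 - min(2.0, p) * (s / p) * r

# sample parameters satisfying s > p > 1 and Remark 2.4's rate condition on r
cases = [
    (4.0, 3.0, 4.0),   # p > 2: requires r > 2(s-1)p/(2s-p) = 3.6
    (3.0, 1.5, 2.5),   # 1 < p <= 2: requires r > 2
]
exponents = [term_exponent(s, p, r) for s, p, r in cases]
print(exponents)  # both strictly below -1, so both series converge
```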
Remark 2.5
If we set \(g(n)=\frac{\log n}{n}\) and \(f(n)=(n\log n)^{1/p}\) for \(1< p\leq 2\), and assume that (1.1) holds for \(\{Y_{xj}\}\) and \(r>4\), then conditions (C1)-(C7) are easily checked via Lemma 3.1, so we obtain
$$\begin{aligned} & \sum_{n=1}^{\infty} \frac{(\log n)^{1-1/p}l(n)}{n^{1+1/p}}E\Biggl\{ \max_{1\leq k\leq n} \Biggl\vert \sum _{j=1}^{k}X_{j}\Biggr\vert - \varepsilon(n\log n)^{1/p}\Biggr\} ^{+} < \infty, \\ & \sum_{n=1}^{\infty} \frac{(\log n)l(n)}{n}P\Biggl\{ \max_{1\leq k\leq n} \Biggl\vert \sum _{j=1}^{k}X_{j}\Biggr\vert > \varepsilon(n\log n)^{1/p}\Biggr\} < \infty. \end{aligned}$$
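For Remark 2.5 the slowly varying factor \(\log n\) enters both \(g\) and \(f\); a numeric sketch (our own, with \(l(n)=1\) and a sample value of \(p\)) shows the first bound in (C7) still behaving as claimed:

```python
import math

# Illustrative check (not a proof) of the first bound in (C7) for Remark 2.5:
# with g(n) = (log n)/n, f(n) = (n log n)^{1/p}, l(n) = 1, the partial sum
# S(k) = sum_{n<=k} n*g(n)/f(n) = sum_{n<=k} (log n)^{1-1/p} / n^{1/p}
# should be O(f^{p-1}(k)) = O((k log k)^{(p-1)/p}).
p = 1.5  # sample value with 1 < p <= 2

def S(k):
    # n = 1 contributes nothing since log 1 = 0
    return sum(math.log(n) ** (1 - 1 / p) / n ** (1 / p)
               for n in range(2, k + 1))

ratios = [S(k) / (k * math.log(k)) ** ((p - 1) / p)
          for k in (10**3, 10**4, 10**5)]
print(ratios)  # stays bounded as k grows
```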
Remark 2.6
Set \(g(n)=\frac{1}{n\log n}\) and \(f(n)=(n\log\log n)^{1/p}\) for \(1< p\leq2\), and assume that (1.1) holds for \(\{Y_{xj}\}\) and \(r>2\); then conditions (C1)-(C7) can be verified by Lemma 3.1, and hence the following hold:
$$\begin{aligned} &\sum_{n=1}^{\infty} \frac{l(n)}{n^{1+1/p}(\log n)(\log\log n)^{1/p}}E\Biggl\{ \max_{1\leq k\leq n} \Biggl\vert \sum _{j=1}^{k}X_{j}\Biggr\vert - \varepsilon(n\log\log n)^{1/p}\Biggr\} ^{+} < \infty, \\ &\sum_{n=1}^{\infty} \frac{l(n)}{n\log n}P\Biggl\{ \max_{1\leq k\leq n} \Biggl\vert \sum _{j=1}^{k}X_{j}\Biggr\vert > \varepsilon(n\log\log n)^{1/p}\Biggr\} < \infty. \end{aligned}$$
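The tail estimate in (C7) can likewise be sampled numerically for Remark 2.6's choice (our own sketch, truncating the convergent tail and taking \(l(n)=1\) with sample \(p\), \(r\)):

```python
import math

# Illustrative check (not a proof) of the tail bound in (C7) for Remark 2.6:
# with g(n) = 1/(n log n), f(n) = (n log log n)^{1/p}, l(n) = 1, the tail
# T(k) = sum_{n>=k} n*g(n)/f^r(n) should be O(f^{p-r}(k)).
p, r = 2.0, 3.0  # sample values with 1 < p <= 2 and r > 2
N = 200_000      # truncation point for the (rapidly convergent) tail sum

def tail(k):
    return sum(1.0 / (math.log(n) * (n * math.log(math.log(n))) ** (r / p))
               for n in range(k, N))

ratios = [tail(k) / (k * math.log(math.log(k))) ** ((p - r) / p)
          for k in (100, 300, 1000)]
print(ratios)  # bounded, as the O(.) bound predicts
```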
Theorem 2.7
Let l be a function slowly varying at infinity. Suppose that \(\{X_{n}= \sum_{i=-\infty}^{\infty} a_{i}Y_{i+n}, n\geq1\}\) is a moving average process generated by a sequence of random variables \(\{ Y_{i},-\infty< i<\infty\}\) with zero means, where \(\{a_{i},-\infty< i<\infty\}\) is an absolutely summable sequence of real numbers. Let \(\{g(n), n\geq1\}\) and \(\{f(n), n\geq1\}\) be two sequences of positive constants with \(f(n)\uparrow\infty\), and let \(\{\Psi_{n}(t),n\geq1\}\) be a sequence of even, nonnegative functions such that, for each \(n\geq1\), \(\Psi_{n}(t)>0\) for \(t>0\). Assume that
$$\begin{aligned} \frac{\Psi_{n}(\vert t\vert )}{\vert t\vert ^{p}}\uparrow,\qquad \frac{\Psi _{n}(\vert t\vert )}{\vert t\vert ^{q}}\downarrow,\quad \textit{as } \vert t\vert \uparrow \end{aligned}$$
(2.5)
for some \(1\leq p< q\leq2\), and
$$\begin{aligned} \sum_{n=1}^{\infty}g(n)l(n)\sum _{i=j+1}^{j+n}\frac{E\Psi _{i}(Y_{i})}{\Psi_{i}(f(n))}< \infty, \qquad\sum _{i=j+1}^{j+n}\frac{E\Psi _{i}(Y_{i})}{\Psi_{i}(f(n))}\to0,\quad \textit{as }n\to \infty \end{aligned}$$
(2.6)
for any \(j\geq0\). Assume that the Rosenthal type maximal inequality for \(Y_{nj}=Y_{j}I\{\vert Y_{j}\vert \leq f(n)\}\) holds for \(r=2\). Then, for all \(\varepsilon>0\),
$$\begin{aligned} \sum_{n=1}^{\infty}g(n)l(n)P \Biggl\{ \max_{1\leq k\leq n} \Biggl\vert \sum _{j=1}^{k}X_{j}\Biggr\vert >\varepsilon f(n)\Biggr\} < \infty. \end{aligned}$$
(2.7)
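A concrete family satisfying the monotonicity conditions (2.5) is \(\Psi_{n}(t)=\vert t\vert ^{s}\) with \(p\leq s\leq q\): then \(\Psi_{n}(\vert t\vert )/\vert t\vert ^{p}=\vert t\vert ^{s-p}\) is nondecreasing and \(\Psi_{n}(\vert t\vert )/\vert t\vert ^{q}=\vert t\vert ^{s-q}\) is nonincreasing. This example and the sample exponents below are our own illustration:

```python
# Illustrative example: Psi_n(t) = |t|^s with p <= s <= q satisfies (2.5),
# since Psi(|t|)/|t|^p = |t|^{s-p} is nondecreasing and
# Psi(|t|)/|t|^q = |t|^{s-q} is nonincreasing in |t|.
p, q, s = 1.0, 2.0, 1.5  # sample exponents with 1 <= p < q <= 2, p <= s <= q

def psi(t):
    return abs(t) ** s

ts = [0.5, 1.0, 2.0, 4.0, 8.0]
up = [psi(t) / t ** p for t in ts]    # should be nondecreasing
down = [psi(t) / t ** q for t in ts]  # should be nonincreasing
print(up, down)
```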