• Research
• Open Access

Complete convergence and complete moment convergence for a class of random variables

Journal of Inequalities and Applications 2012, 2012:229

https://doi.org/10.1186/1029-242X-2012-229

• Accepted: 1 October 2012

Abstract

In this paper, we establish complete convergence and complete moment convergence for a class of random variables satisfying a Rosenthal-type maximal inequality, and we prove that the two modes of convergence are equivalent. The Baum-Katz-type theorem and the Hsu-Robbins-type theorem are extended to this class of random variables.

MSC:60F15.

Keywords

• complete convergence
• complete moment convergence
• Baum-Katz-type theorem
• Hsu-Robbins-type theorem

1 Introduction

The concept of complete convergence was introduced by Hsu and Robbins  as follows. A sequence of random variables $\left\{{U}_{n},n\ge 1\right\}$ is said to converge completely to a constant C if ${\sum }_{n=1}^{\mathrm{\infty }}P\left(|{U}_{n}-C|>\epsilon \right)<\mathrm{\infty }$ for all $\epsilon >0$. In view of the Borel-Cantelli lemma, this implies that ${U}_{n}\to C$ almost surely (a.s.). The converse is true if the $\left\{{U}_{n},n\ge 1\right\}$ are independent. Hsu and Robbins  proved that the sequence of arithmetic means of independent and identically distributed (i.i.d.) random variables converges completely to the expected value if the variance of the summands is finite. Erdős  proved the converse. The Hsu-Robbins-Erdős result is a fundamental theorem in probability theory and has been generalized and extended in several directions by many authors. One of the most important generalizations, due to Baum and Katz , concerns the strong law of large numbers and reads as follows.

Theorem A (Baum and Katz )

Let $\alpha >1/2$ and $\alpha p>1$. Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of i.i.d. random variables. Assume further that $E{X}_{1}=0$ if $\alpha \le 1$. Then the following statements are equivalent:

(i) $E|X_1|^{p}<\infty$;

(ii) $\sum_{n=1}^{\infty}n^{\alpha p-2}P\left(|S_n|\ge \epsilon n^{\alpha}\right)<\infty$ for all $\epsilon >0$.

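For the reader's convenience, the Borel-Cantelli step in the definition of complete convergence above can be written out explicitly (a standard sketch, not part of the original text):

```latex
% Fix \epsilon > 0. Complete convergence gives, by the first Borel-Cantelli lemma,
\sum_{n=1}^{\infty} P(|U_n - C| > \epsilon) < \infty
  \;\Longrightarrow\;
  P\bigl(|U_n - C| > \epsilon \ \text{i.o.}\bigr) = 0.
% Taking \epsilon = 1/k and a countable union over k = 1, 2, \dots yields
P\Bigl(\bigcup_{k=1}^{\infty} \bigl\{|U_n - C| > 1/k \ \text{i.o.}\bigr\}\Bigr) = 0,
% which is precisely U_n \to C a.s.
```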
Many authors have studied the Baum-Katz-type theorem for dependent random variables; see, for example, Peligrad  for a strictly stationary ρ-mixing sequence, Peligrad and Gut  for a ${\rho }^{\ast }$-mixing sequence, Stoica [6, 7] for a martingale difference sequence, Stoica  for bounded subsequences, Wang and Hu  for φ-mixing random variables, and so forth.

One of the most useful inequalities in probability theory is the Rosenthal-type maximal inequality. For a sequence $\left\{{X}_{i},1\le i\le n\right\}$ of i.i.d. random variables with $E{|{X}_{1}|}^{q}<\mathrm{\infty }$ for some $q\ge 2$, there exists a positive constant ${C}_{q}$ depending only on q such that
$E{\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}\left({X}_{i}-E{X}_{i}\right)|\right)}^{q}\le {C}_{q}\left\{\sum _{i=1}^{n}E{|{X}_{i}|}^{q}+{\left(\sum _{i=1}^{n}E{X}_{i}^{2}\right)}^{q/2}\right\}.$
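For orientation, when $q=2$ the right-hand side collapses to a single variance sum, so that (up to constants and centering) the inequality reduces to the classical Kolmogorov-Doob-type $L^2$ maximal bound:

```latex
E\Bigl(\max_{1\le j\le n}\Bigl|\sum_{i=1}^{j}(X_i - EX_i)\Bigr|\Bigr)^{2}
  \le C_2\Bigl\{\sum_{i=1}^{n}E|X_i|^{2}+\sum_{i=1}^{n}EX_i^{2}\Bigr\}
  = 2C_2\sum_{i=1}^{n}EX_i^{2}.
```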

The inequality above has been obtained for dependent random variables by many authors. See, for example, Shao  for negatively associated random variables, Utev and Peligrad  for ${\rho }^{\ast }$-mixing random variables, Wang et al.  for φ-mixing random variables with the mixing coefficients satisfying certain conditions, and so forth.

The purpose of this work is to obtain complete convergence and complete moment convergence for a sequence of random variables satisfying a Rosenthal-type maximal inequality.

Throughout the paper, let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of random variables defined on a fixed probability space $\left(\mathrm{\Omega },\mathcal{A},P\right)$. Let $I\left(A\right)$ be the indicator function of the set A. Denote ${S}_{n}={\sum }_{i=1}^{n}{X}_{i}$, ${S}_{0}=0$, ${\ln}^{+}x=\ln \max \left(x,e\right)$, ${x}^{+}=xI\left(x\ge 0\right)$. The symbol C denotes a positive constant which may be different in various places.

The following definition will be used frequently in the paper.

Definition 1.1 A sequence of random variables $\left\{{X}_{n},n\ge 1\right\}$ is said to be stochastically dominated by a random variable X if there exists a positive constant C such that
$P\left(|{X}_{n}|>x\right)\le CP\left(|X|>x\right)$

for all $x\ge 0$ and $n\ge 1$.
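As a purely illustrative example of Definition 1.1 (our own, not from the paper): if $|X_n|=c_n|Z|$ with $0<c_n\le 1$ and $|Z|$ standard exponential, then $P(|X_n|>x)=e^{-x/c_n}\le e^{-x}=P(|Z|>x)$, so the sequence is stochastically dominated by $Z$ with $C=1$. A minimal numerical check of this tail comparison (the choice $c_n=n/(n+1)$ is a hypothetical assumption):

```python
import math

def tail_Xn(x, n):
    """P(|X_n| > x) for the illustrative choice |X_n| = c_n * |Z|,
    c_n = n / (n + 1), |Z| ~ Exp(1); the tail is exp(-x / c_n)."""
    c_n = n / (n + 1.0)
    return math.exp(-x / c_n)

def tail_Z(x):
    """P(|Z| > x) for |Z| ~ Exp(1)."""
    return math.exp(-x)

# Check the domination P(|X_n| > x) <= C * P(|Z| > x) with C = 1
# over a grid of x >= 0 and several n.
ok = all(tail_Xn(x / 10.0, n) <= tail_Z(x / 10.0)
         for x in range(0, 100) for n in range(1, 50))
print(ok)
```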

In this paper, we establish complete convergence and complete moment convergence for sequences in the class of random variables satisfying a Rosenthal-type maximal inequality, and we prove the equivalence of the two modes of convergence. The Baum-Katz-type theorem and the Hsu-Robbins-type theorem are extended to this class of random variables. As a consequence, the Marcinkiewicz-Zygmund strong law of large numbers is obtained for this class.

2 Main results

Theorem 2.1 Let $\alpha >1/2$, $\alpha p\ge 1$ and $p>1$. Suppose that $\left\{{X}_{n},n\ge 1\right\}$ is a sequence of zero mean random variables which is stochastically dominated by a random variable X with $E{|X|}^{p}<\mathrm{\infty }$. Assume that for any $q\ge 2$, there exists a positive constant ${C}_{q}$ depending only on q such that
$E\left(\underset{1\le j\le n}{max}{|\sum _{i=1}^{j}\left({Y}_{ti}-E{Y}_{ti}\right)|}^{q}\right)\le {C}_{q}\left\{\sum _{i=1}^{n}E{|{Y}_{ti}|}^{q}+{\left(\sum _{i=1}^{n}E{Y}_{ti}^{2}\right)}^{q/2}\right\},$
(2.1)
where ${Y}_{ti}=-tI\left({X}_{i}<-t\right)+{X}_{i}I\left(|{X}_{i}|\le t\right)+tI\left({X}_{i}>t\right)$ for all $t>0$. Then
$\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}P\left(\underset{1\le j\le n}{max}|{S}_{j}|\ge \epsilon {n}^{\alpha }\right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathit{\text{for all}}\phantom{\rule{0.5em}{0ex}}\epsilon >0$
(2.2)
and
$\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }E{\left(\underset{1\le j\le n}{max}|{S}_{j}|-\epsilon {n}^{\alpha }\right)}^{+}<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathit{\text{for all}}\phantom{\rule{0.5em}{0ex}}\epsilon >0.$
(2.3)

Furthermore, (2.2) is equivalent to (2.3).
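Numerically, the truncation $Y_{ti}$ appearing in (2.1) is just $X_i$ clipped to the interval $[-t,t]$. A minimal sketch of this operator (the helper names `truncate` and `truncate_rest` are ours, not the paper's):

```python
def truncate(x, t):
    """Y_t(x) = -t*I(x < -t) + x*I(|x| <= t) + t*I(x > t),
    i.e. x clipped to the interval [-t, t] (t > 0)."""
    if x < -t:
        return -t
    if x > t:
        return t
    return x

def truncate_rest(x, t):
    """The complementary part Y'_t(x) = x - Y_t(x); it vanishes unless
    |x| > t, and satisfies |Y'_t(x)| <= |x| * I(|x| > t), the bound
    used for the term J in the proof of Lemma 3.2."""
    return x - truncate(x, t)

print(truncate(5.0, 2.0), truncate(-5.0, 2.0), truncate(1.0, 2.0))
```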

Corollary 2.1 Let $1<p<2$. Suppose that $\left\{{X}_{n},n\ge 1\right\}$ is a sequence of zero mean random variables which is stochastically dominated by a random variable X with $E{|X|}^{p}<\mathrm{\infty }$. If (2.1) holds, then
$\frac{{S}_{n}}{{n}^{1/p}}\to 0\phantom{\rule{1em}{0ex}}\mathit{\text{a.s.}}$

For $p=1$, we have the following theorem.

Theorem 2.2 Let $\alpha >0$. Suppose that $\left\{{X}_{n},n\ge 1\right\}$ is a sequence of zero mean random variables which is stochastically dominated by a random variable X with $E|X|{ln}^{+}|X|<\mathrm{\infty }$. If (2.1) holds, then for all $\epsilon >0$,
$\sum _{n=1}^{\mathrm{\infty }}{n}^{-2}E{\left(\underset{1\le j\le n}{max}|{S}_{j}|-\epsilon {n}^{\alpha }\right)}^{+}<\mathrm{\infty }.$
(2.4)
Theorem 2.3 Let $\alpha >1/2$, $p>1$ and $\alpha p>1$. Suppose that $\left\{{X}_{n},n\ge 1\right\}$ is a sequence of zero mean random variables which is stochastically dominated by a random variable X with $E{|X|}^{p}<\mathrm{\infty }$. If (2.1) holds, then for all $\epsilon >0$,
$\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}P\left(\underset{j\ge n}{sup}|\frac{{S}_{j}}{{j}^{\alpha }}|\ge \epsilon \right)<\mathrm{\infty }.$
(2.5)

Remark 2.1 In (2.1), $\left\{{Y}_{ti},i\ge 1\right\}$ is a monotone transformation of $\left\{{X}_{i},i\ge 1\right\}$. If $\left\{{X}_{i},i\ge 1\right\}$ is a sequence of independent random variables, then (2.1) is clearly satisfied. There are many sequences of dependent random variables satisfying (2.1) for all $q\ge 2$. Examples include sequences of negatively associated (NA) random variables (see Shao ), ${\rho }^{\ast }$-mixing random variables (see Utev and Peligrad ), φ-mixing random variables with the mixing coefficients satisfying certain conditions (see Wang et al. ), and ${\rho }^{-}$-mixing random variables with the mixing coefficients satisfying certain conditions (see Wang and Lu ).

Remark 2.2 In Theorem 2.1, we not only generalize the Baum-Katz-type theorem for the class of random variables satisfying (2.1), but also consider the case $\alpha p=1$. Furthermore, if we take $\alpha =1$ and $p=2$, then we can get the Hsu-Robbins-type theorem (see Hsu and Robbins ) for the class of random variables satisfying (2.1).

3 Lemmas

The following lemmas will be needed to prove the main results of the paper.

Lemma 3.1 (cf. Wu )

Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of random variables, which is stochastically dominated by a random variable X. Then for any $a>0$ and $b>0$, the following two statements hold:
$E{|{X}_{n}|}^{a}I\left(|{X}_{n}|\le b\right)\le {C}_{1}\left\{E{|X|}^{a}I\left(|X|\le b\right)+{b}^{a}P\left(|X|>b\right)\right\}$
and
$E{|{X}_{n}|}^{a}I\left(|{X}_{n}|>b\right)\le {C}_{2}E{|X|}^{a}I\left(|X|>b\right),$

where ${C}_{1}$ and ${C}_{2}$ are positive constants.
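For completeness, the first inequality of Lemma 3.1 follows from the tail-integral formula $EY=\int_0^{\infty}P(Y>s)\,ds$ for $Y\ge 0$ (a sketch, not part of the original text):

```latex
E|X_n|^{a} I(|X_n|\le b)
  = \int_{0}^{b^{a}} P\bigl(s^{1/a} < |X_n| \le b\bigr)\,ds
  \le \int_{0}^{b^{a}} P\bigl(|X_n| > s^{1/a}\bigr)\,ds
  \le C\int_{0}^{b^{a}} P\bigl(|X| > s^{1/a}\bigr)\,ds
  = C\,E\min\bigl(|X|^{a}, b^{a}\bigr)
  = C\bigl\{E|X|^{a} I(|X|\le b) + b^{a} P(|X|>b)\bigr\}.
```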

Lemma 3.2 Under the conditions of Theorem  2.1,
$\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}P\left(\underset{1\le j\le n}{max}|{S}_{j}|>t\right)\phantom{\rule{0.2em}{0ex}}dt<\mathrm{\infty }.$
(3.1)
Proof For fixed $n\ge 1$, denote ${Y}_{ti}^{\prime }={X}_{i}-{Y}_{ti}$, $i\ge 1$. Since $EX_i=0$, we have $S_j=\sum_{i=1}^{j}(Y_{ti}-EY_{ti})+\sum_{i=1}^{j}(Y_{ti}^{\prime}-EY_{ti}^{\prime})$, and hence
$\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}\int_{n^{\alpha}}^{\infty}P\Bigl(\max_{1\le j\le n}|S_j|>t\Bigr)\,dt\le \sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}\int_{n^{\alpha}}^{\infty}P\Bigl(\max_{1\le j\le n}\Bigl|\sum_{i=1}^{j}(Y_{ti}-EY_{ti})\Bigr|>t/2\Bigr)\,dt+\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}\int_{n^{\alpha}}^{\infty}P\Bigl(\max_{1\le j\le n}\Bigl|\sum_{i=1}^{j}(Y_{ti}^{\prime}-EY_{ti}^{\prime})\Bigr|>t/2\Bigr)\,dt:=I+J.$
For J, noting that $|{Y}_{ti}^{\prime }|\le |{X}_{i}|I\left(|{X}_{i}|>t\right)$, we have by Markov’s inequality, Lemma 3.1 and $E{|X|}^{p}<\mathrm{\infty }$ that
$\begin{array}{rcl}J& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-1}\sum _{i=1}^{n}E|{Y}_{ti}^{\prime }|\phantom{\rule{0.2em}{0ex}}dt\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-1}\sum _{i=1}^{n}E|{X}_{i}|I\left(|{X}_{i}|>t\right)\phantom{\rule{0.2em}{0ex}}dt\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-1}E|X|I\left(|X|>t\right)\phantom{\rule{0.2em}{0ex}}dt\\ =& C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha }\sum _{m=n}^{\mathrm{\infty }}{\int }_{{m}^{\alpha }}^{{\left(m+1\right)}^{\alpha }}{t}^{-1}E|X|I\left(|X|>t\right)\phantom{\rule{0.2em}{0ex}}dt\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha }\sum _{m=n}^{\mathrm{\infty }}{m}^{-1}E|X|I\left(|X|>{m}^{\alpha }\right)\\ =& C\sum _{m=1}^{\mathrm{\infty }}{m}^{-1}E|X|I\left(|X|>{m}^{\alpha }\right)\sum _{n=1}^{m}{n}^{\alpha p-1-\alpha }\\ \le & C\sum _{m=1}^{\mathrm{\infty }}{m}^{-1}E|X|I\left(|X|>{m}^{\alpha }\right){m}^{\alpha p-\alpha }\\ =& C\sum _{m=1}^{\mathrm{\infty }}{m}^{\alpha p-1-\alpha }E|X|I\left(|X|>{m}^{\alpha }\right)\\ =& C\sum _{m=1}^{\mathrm{\infty }}{m}^{\alpha p-1-\alpha }\sum _{n=m}^{\mathrm{\infty }}E|X|I\left(n<{|X|}^{1/\alpha }\le n+1\right)\\ \le & C\sum _{n=1}^{\mathrm{\infty }}E|X|I\left(n<{|X|}^{1/\alpha }\le n+1\right)\sum _{m=1}^{n}{m}^{\alpha p-1-\alpha }\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha }E|X|I\left(n<{|X|}^{1/\alpha }\le n+1\right)\le CE{|X|}^{p}<\mathrm{\infty }.\end{array}$
(3.2)
For I, by Markov’s inequality and (2.1), we have that for $q\ge 2$,
$\begin{array}{rcl}I& \le & {C}_{q}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-q}E\left(\underset{1\le j\le n}{max}{|\sum _{i=1}^{j}\left({Y}_{ti}-E{Y}_{ti}\right)|}^{q}\right)\phantom{\rule{0.2em}{0ex}}dt\\ \le & {C}_{q}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-q}\sum _{i=1}^{n}E{|{Y}_{ti}|}^{q}\phantom{\rule{0.2em}{0ex}}dt+{C}_{q}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-q}{\left(\sum _{i=1}^{n}E{Y}_{ti}^{2}\right)}^{q/2}\phantom{\rule{0.2em}{0ex}}dt\\ :=& {I}_{1}+{I}_{2}.\end{array}$
(3.3)

We will consider the following three cases:

Case 1. $\alpha >1/2$, $\alpha p>1$ and $p\ge 2$.

Take $q>max\left(p,\frac{\alpha p-1}{\alpha -\frac{1}{2}}\right)$, which implies that $\alpha p-2-\alpha q+q/2<-1$. We have by Lemma 3.1 and the proof of (3.2) that
$\begin{array}{rcl}{I}_{1}& \le & {C}_{q}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-q}\sum _{i=1}^{n}\left(E{|{X}_{i}|}^{q}I\left(|{X}_{i}|\le t\right)+{t}^{q}P\left(|{X}_{i}|>t\right)\right)\phantom{\rule{0.2em}{0ex}}dt\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-q}E{|X|}^{q}I\left(|X|\le t\right)\phantom{\rule{0.2em}{0ex}}dt+C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-1}E|X|I\left(|X|>t\right)\phantom{\rule{0.2em}{0ex}}dt\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha }\sum _{m=n}^{\mathrm{\infty }}{\int }_{{m}^{\alpha }}^{{\left(m+1\right)}^{\alpha }}{t}^{-q}E{|X|}^{q}I\left(|X|\le t\right)\phantom{\rule{0.2em}{0ex}}dt+C\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha }\sum _{m=n}^{\mathrm{\infty }}{m}^{\alpha -1-\alpha q}E{|X|}^{q}I\left(|X|\le {\left(m+1\right)}^{\alpha }\right)+C\\ =& C\sum _{m=1}^{\mathrm{\infty }}{m}^{\alpha -1-\alpha q}E{|X|}^{q}I\left(|X|\le {\left(m+1\right)}^{\alpha }\right)\sum _{n=1}^{m}{n}^{\alpha p-1-\alpha }+C\\ \le & C\sum _{m=1}^{\mathrm{\infty }}{m}^{\alpha p-1-\alpha q}E{|X|}^{q}I\left({m}^{\alpha }<|X|\le {\left(m+1\right)}^{\alpha }\right)\\ +C\sum _{m=1}^{\mathrm{\infty }}{m}^{\alpha p-1-\alpha q}E{|X|}^{q}I\left(|X|\le {m}^{\alpha }\right)+C\\ \le & C\sum _{m=1}^{\mathrm{\infty }}{m}^{-1}E{|X|}^{p}I\left({m}^{\alpha }<|X|\le {\left(m+1\right)}^{\alpha }\right)\\ +C\sum _{m=1}^{\mathrm{\infty }}{m}^{\alpha \left(p-q\right)-1}\sum _{j=1}^{m}{j}^{\alpha q}P\left(j-1<{|X|}^{1/\alpha }\le j\right)+C\\ \le & CE{|X|}^{p}+C\sum _{j=1}^{\mathrm{\infty }}{j}^{\alpha q}P\left(j-1<{|X|}^{1/\alpha }\le j\right)\sum _{m=j}^{\mathrm{\infty }}{m}^{\alpha \left(p-q\right)-1}+C\\ \le & C\sum _{j=1}^{\mathrm{\infty }}{j}^{\alpha p}P\left(j-1<{|X|}^{1/\alpha }\le j\right)+C\le CE{|X|}^{p}+C<\mathrm{\infty }.\end{array}$
(3.4)
Note that $E{X}^{2}<\mathrm{\infty }$ if $E{|X|}^{p}<\mathrm{\infty }$ for $p\ge 2$. We have that
$\begin{array}{rcl}{I}_{2}& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-q}{\left(\sum _{i=1}^{n}E{X}_{i}^{2}\right)}^{q/2}\phantom{\rule{0.2em}{0ex}}dt\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-q}{\left(\sum _{i=1}^{n}E{X}^{2}\right)}^{q/2}\phantom{\rule{0.2em}{0ex}}dt\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha +q/2}{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-q}\phantom{\rule{0.2em}{0ex}}dt\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha q+q/2}<\mathrm{\infty }.\end{array}$

Case 2. $\alpha >1/2$, $\alpha p>1$ and $1<p<2$.

Take $q=2$. Similar to the proofs of (3.3) and (3.4), we have that
$\begin{array}{rcl}I& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }{\int }_{{n}^{\alpha }}^{\mathrm{\infty }}{t}^{-2}\sum _{i=1}^{n}\left(E{X}_{i}^{2}I\left(|{X}_{i}|\le t\right)+{t}^{2}P\left(|{X}_{i}|>t\right)\right)\phantom{\rule{0.2em}{0ex}}dt\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha }\sum _{m=n}^{\mathrm{\infty }}{m}^{-\alpha -1}E{X}^{2}I\left(|X|\le {\left(m+1\right)}^{\alpha }\right)+C\\ =& C\sum _{m=1}^{\mathrm{\infty }}{m}^{-\alpha -1}E{X}^{2}I\left(|X|\le {\left(m+1\right)}^{\alpha }\right)\sum _{n=1}^{m}{n}^{\alpha p-1-\alpha }+C\\ \le & C\sum _{m=1}^{\mathrm{\infty }}{m}^{\alpha p-1-2\alpha }E{X}^{2}I\left({m}^{\alpha }<|X|\le {\left(m+1\right)}^{\alpha }\right)\\ +C\sum _{m=1}^{\mathrm{\infty }}{m}^{\alpha p-1-2\alpha }E{X}^{2}I\left(|X|\le {m}^{\alpha }\right)+C\\ \le & C\sum _{m=1}^{\mathrm{\infty }}{m}^{-1}E{|X|}^{p}I\left({m}^{\alpha }<|X|\le {\left(m+1\right)}^{\alpha }\right)\\ +C\sum _{m=1}^{\mathrm{\infty }}{m}^{\alpha \left(p-2\right)-1}\sum _{j=1}^{m}{j}^{2\alpha }P\left(j-1<{|X|}^{1/\alpha }\le j\right)+C\\ \le & CE{|X|}^{p}+C\sum _{j=1}^{\mathrm{\infty }}{j}^{2\alpha }P\left(j-1<{|X|}^{1/\alpha }\le j\right)\sum _{m=j}^{\mathrm{\infty }}{m}^{\alpha \left(p-2\right)-1}+C\\ \le & C\sum _{j=1}^{\mathrm{\infty }}{j}^{\alpha p}P\left(j-1<{|X|}^{1/\alpha }\le j\right)+C\\ \le & CE{|X|}^{p}+C<\mathrm{\infty }.\end{array}$
(3.5)

Case 3. $\alpha >1/2$, $\alpha p=1$ and $p>1$.

Take $q=2$. Note that $1/2<\alpha <1$ if $\alpha p=1$ and $p>1$. Similar to the proof of (3.5), it follows that $I<\mathrm{\infty }$. Combining the statements above proves (3.1). The proof of the lemma is completed. □

Lemma 3.3 (cf. Sung )

Let $\left\{{Y}_{n},n\ge 1\right\}$ and $\left\{{Z}_{n},n\ge 1\right\}$ be sequences of random variables. Then for any $q>1$, $\epsilon >0$ and $a>0$,
$E{\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}\left({Y}_{i}+{Z}_{i}\right)|-\epsilon a\right)}^{+}\le \left(\frac{1}{{\epsilon }^{q}}+\frac{1}{q-1}\right)\frac{1}{{a}^{q-1}}E\underset{1\le j\le n}{max}{|\sum _{i=1}^{j}{Y}_{i}|}^{q}+E\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{Z}_{i}|.$
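As a sanity check (our own, not part of the paper), the inequality in Lemma 3.3 can be verified numerically for degenerate (point-mass) random variables, where every expectation reduces to the value itself. The helper names below are ours, and the test instances are arbitrary:

```python
def maxpartial(seq, power=1):
    """Compute (max_{1<=j<=n} |sum_{i<=j} seq_i|) ** power."""
    s, best = 0.0, 0.0
    for v in seq:
        s += v
        best = max(best, abs(s))
    return best ** power

def sung_bound_holds(Y, Z, q, eps, a):
    """Check Lemma 3.3 for point-mass random variables:
    (max|sum(Y_i+Z_i)| - eps*a)^+
      <= (1/eps^q + 1/(q-1)) * a^{-(q-1)} * max|sum Y_i|^q + max|sum Z_i|."""
    lhs = max(maxpartial([y + z for y, z in zip(Y, Z)]) - eps * a, 0.0)
    rhs = ((1.0 / eps**q + 1.0 / (q - 1.0)) / a**(q - 1.0)
           * maxpartial(Y, power=q)
           + maxpartial(Z))
    return lhs <= rhs

# A few fixed instances (purely illustrative choices).
print(sung_bound_holds([1.0, -2.0, 3.0], [0.5, 0.5, -1.0], q=2, eps=0.5, a=1.0))
print(sung_bound_holds([4.0, 4.0], [1.0, -2.0], q=3, eps=2.0, a=1.5))
```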

4 The proofs of main results

Proof of Theorem 2.1 First, we prove (2.2). For fixed $n\ge 1$, let ${X}_{ni}=-{n}^{\alpha }I\left({X}_{i}<-{n}^{\alpha }\right)+{X}_{i}I\left(|{X}_{i}|\le {n}^{\alpha }\right)+{n}^{\alpha }I\left({X}_{i}>{n}^{\alpha }\right)$ and ${X}_{ni}^{\prime }={X}_{i}-{X}_{ni}$, $i\ge 1$. Since $EX_i=0$, we have $S_j=\sum_{i=1}^{j}(X_{ni}-EX_{ni})+\sum_{i=1}^{j}(X_{ni}^{\prime}-EX_{ni}^{\prime})$, and hence
$\sum_{n=1}^{\infty}n^{\alpha p-2}P\Bigl(\max_{1\le j\le n}|S_j|\ge \epsilon n^{\alpha}\Bigr)\le \sum_{n=1}^{\infty}n^{\alpha p-2}P\Bigl(\max_{1\le j\le n}\Bigl|\sum_{i=1}^{j}(X_{ni}-EX_{ni})\Bigr|\ge \frac{\epsilon n^{\alpha}}{2}\Bigr)+\sum_{n=1}^{\infty}n^{\alpha p-2}P\Bigl(\max_{1\le j\le n}\Bigl|\sum_{i=1}^{j}(X_{ni}^{\prime}-EX_{ni}^{\prime})\Bigr|\ge \frac{\epsilon n^{\alpha}}{2}\Bigr):=I^{\ast}+J^{\ast}.$
For ${J}^{\ast }$, noting that $|{X}_{ni}^{\prime }|\le |{X}_{i}|I\left(|{X}_{i}|>{n}^{\alpha }\right)$, we have by Markov’s inequality, Lemma 3.1 and the proof of (3.2) that
$\begin{array}{rcl}{J}^{\ast }& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }\sum _{i=1}^{n}E|{X}_{ni}^{\prime }|\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha }\sum _{i=1}^{n}E|{X}_{i}|I\left(|{X}_{i}|>{n}^{\alpha }\right)\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha }E|X|I\left(|X|>{n}^{\alpha }\right)\\ \le & CE{|X|}^{p}<\mathrm{\infty }.\end{array}$
(4.1)
For ${I}^{\ast }$, by Markov’s inequality and (2.1), we have that for any $q\ge 2$,
$\begin{array}{rcl}{I}^{\ast }& \le & {C}_{q}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha q}E\left(\underset{1\le j\le n}{max}{|\sum _{i=1}^{j}\left({X}_{ni}-E{X}_{ni}\right)|}^{q}\right)\\ \le & {C}_{q}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha q}\sum _{i=1}^{n}E{|{X}_{ni}|}^{q}+{C}_{q}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha q}{\left(\sum _{i=1}^{n}E{X}_{ni}^{2}\right)}^{q/2}\\ :=& {I}_{1}^{\ast }+{I}_{2}^{\ast }.\end{array}$
(4.2)

We consider the following three cases:

Case 1. $\alpha >1/2$, $\alpha p>1$ and $p\ge 2$.

Take $q>max\left(p,\frac{\alpha p-1}{\alpha -\frac{1}{2}}\right)$, which implies that $\alpha p-2-\alpha q+q/2<-1$.

For ${I}_{1}^{\ast }$, we have by ${C}_{r}$’s inequality, the proofs of (3.2) and (3.4) that
$\begin{array}{rcl}{I}_{1}^{\ast }& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha q}\sum _{i=1}^{n}\left(E{|{X}_{i}|}^{q}I\left(|{X}_{i}|\le {n}^{\alpha }\right)+{n}^{\alpha q}P\left(|{X}_{i}|>{n}^{\alpha }\right)\right)\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha q}\sum _{i=1}^{n}\left(E{|X|}^{q}I\left(|X|\le {n}^{\alpha }\right)+{n}^{\alpha q}P\left(|X|>{n}^{\alpha }\right)\right)\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha q}E{|X|}^{q}I\left(|X|\le {n}^{\alpha }\right)+C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha }E|X|I\left(|X|>{n}^{\alpha }\right)\\ <& \mathrm{\infty }.\end{array}$
(4.3)
For ${I}_{2}^{\ast }$, note that $E{X}^{2}<\mathrm{\infty }$ if $E{|X|}^{p}<\mathrm{\infty }$ for $p\ge 2$. We have that
$\begin{array}{rcl}{I}_{2}^{\ast }& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha q}{\left(\sum _{i=1}^{n}E{X}_{i}^{2}\right)}^{q/2}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha q}{\left(\sum _{i=1}^{n}E{X}^{2}\right)}^{q/2}\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\alpha q+q/2}<\mathrm{\infty }.\end{array}$

Case 2. $\alpha >1/2$, $\alpha p>1$ and $1<p<2$.

Take $q=2$. Similar to the proofs of (4.2), (4.3), (3.5) and (4.1), we have that
$\begin{array}{rcl}{I}^{\ast }& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-2\alpha }\sum _{i=1}^{n}\left(E{X}_{i}^{2}I\left(|{X}_{i}|\le {n}^{\alpha }\right)+{n}^{2\alpha }P\left(|{X}_{i}|>{n}^{\alpha }\right)\right)\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-2\alpha }E{X}^{2}I\left(|X|\le {n}^{\alpha }\right)+C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1-\alpha }E|X|I\left(|X|>{n}^{\alpha }\right)\\ <& \mathrm{\infty }.\end{array}$
(4.4)

Case 3. $\alpha >1/2$, $\alpha p=1$ and $p>1$.

Take $q=2$. Note that $1/2<\alpha <1$ if $\alpha p=1$. Similar to the proof of (4.4), it follows that ${I}^{\ast }<\mathrm{\infty }$. From all the statements above, we have proved (2.2).

Next, we prove (2.3). Note that ${S}_{j}={\sum }_{i=1}^{j}{X}_{i}$ and ${X}_{i}=\left({X}_{ni}-E{X}_{ni}\right)+\left({X}_{ni}^{\prime }-E{X}_{ni}^{\prime }\right)$, $i=1,2,\dots ,n$. Applying Lemma 3.3 with $a=n^{\alpha}$, together with the proof of (4.1) and ${I}^{\ast }<\mathrm{\infty }$, it follows that
$\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}E\Bigl(\max_{1\le j\le n}|S_j|-\epsilon n^{\alpha}\Bigr)^{+}\le C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha q}E\Bigl(\max_{1\le j\le n}\Bigl|\sum_{i=1}^{j}(X_{ni}-EX_{ni})\Bigr|^{q}\Bigr)+C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}\sum_{i=1}^{n}E|X_{ni}^{\prime}|<\infty.$

Hence (2.3) holds.

Finally, we prove the equivalence of (2.2) and (2.3). First, (2.2) implies (2.3): since $E(\max_{1\le j\le n}|S_j|-\epsilon n^{\alpha})^{+}=\int_{\epsilon n^{\alpha}}^{\infty}P(\max_{1\le j\le n}|S_j|>t)\,dt$, we have for all $\epsilon >0$, by (2.2) and Lemma 3.2, that
$\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}E\Bigl(\max_{1\le j\le n}|S_j|-\epsilon n^{\alpha}\Bigr)^{+}\le \sum_{n=1}^{\infty}n^{\alpha p-2}P\Bigl(\max_{1\le j\le n}|S_j|>\epsilon n^{\alpha}\Bigr)+\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}\int_{n^{\alpha}}^{\infty}P\Bigl(\max_{1\le j\le n}|S_j|>t\Bigr)\,dt<\infty.$

Conversely, for all $\epsilon >0$,
$\sum_{n=1}^{\infty}n^{\alpha p-2}P\Bigl(\max_{1\le j\le n}|S_j|\ge \epsilon n^{\alpha}\Bigr)\le \frac{2}{\epsilon}\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}\int_{\epsilon n^{\alpha}/2}^{\epsilon n^{\alpha}}P\Bigl(\max_{1\le j\le n}|S_j|>t\Bigr)\,dt\le \frac{2}{\epsilon}\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}E\Bigl(\max_{1\le j\le n}|S_j|-\frac{\epsilon n^{\alpha}}{2}\Bigr)^{+}.$
(4.5)

Hence, by (4.5), (2.3) implies (2.2). The proof of the theorem is completed. □

Proof of Corollary 2.1 Taking $\alpha p=1$ in Theorem 2.1, we have that for all $\epsilon >0$,
$\sum _{n=1}^{\mathrm{\infty }}{n}^{-1}P\left(\underset{1\le j\le n}{max}|{S}_{j}|\ge \epsilon {n}^{1/p}\right)<\mathrm{\infty }.$

The rest of the proof is similar to that of Theorem 3.1 in Dung and Tien  and is omitted. □
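For the reader's convenience, the standard subsequence argument behind this step can be sketched as follows (our sketch; cf. the cited proof):

```latex
% Summing the displayed series over the dyadic blocks 2^{k-1} \le n < 2^k
% (on each block \max_{1\le j\le n}|S_j| \ge \max_{1\le j\le 2^{k-1}}|S_j|,
% n^{1/p} \le 2^{k/p}, and \sum_{n=2^{k-1}}^{2^k-1} n^{-1} \ge 1/2) gives
\sum_{k=1}^{\infty} P\Bigl(\max_{1\le j\le 2^{k-1}} |S_j| \ge \epsilon\, 2^{k/p}\Bigr) < \infty,
% so by the Borel-Cantelli lemma (\epsilon > 0 being arbitrary)
2^{-k/p} \max_{1\le j\le 2^{k}} |S_j| \to 0 \quad \text{a.s.},
% and finally, for 2^{k-1} \le n < 2^{k},
\frac{|S_n|}{n^{1/p}} \le 2^{1/p}\cdot 2^{-k/p}\max_{1\le j\le 2^{k}}|S_j| \to 0 \quad \text{a.s.}
```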

Proof of Theorem 2.2 We use the same notation as that in the proof of Theorem 2.1. Taking $q=2$ and $a={n}^{\alpha }$ in Lemma 3.3, by (2.1), Lemma 3.1 and $E|X|{\ln}^{+}|X|<\infty$, it follows that
$\begin{array}{rcl}\sum_{n=1}^{\infty}n^{-2}E\Bigl(\max_{1\le j\le n}|S_j|-\epsilon n^{\alpha}\Bigr)^{+}&\le& C\sum_{n=1}^{\infty}n^{-2-\alpha}E\Bigl(\max_{1\le j\le n}\Bigl|\sum_{i=1}^{j}(X_{ni}-EX_{ni})\Bigr|^{2}\Bigr)+C\sum_{n=1}^{\infty}n^{-2}\sum_{i=1}^{n}E|X_{ni}^{\prime}|\\ &\le& C\sum_{n=1}^{\infty}n^{-1-\alpha}\bigl(EX^{2}I(|X|\le n^{\alpha})+n^{2\alpha}P(|X|>n^{\alpha})\bigr)+C\sum_{n=1}^{\infty}n^{-1}E|X|I(|X|>n^{\alpha})\\ &\le& C\sum_{n=1}^{\infty}n^{-1-\alpha}EX^{2}I(|X|\le n^{\alpha})+C\sum_{n=1}^{\infty}n^{-1}E|X|I(|X|>n^{\alpha})\\ &\le& CE|X|+CE|X|{\ln}^{+}|X|<\infty.\end{array}$

Hence, (2.4) holds. □

Proof of Theorem 2.3 Inspired by the proof of Theorem 12.1 of Gut , we have that
$\begin{array}{rcl}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}P\left(\underset{j\ge n}{sup}|\frac{{S}_{j}}{{j}^{\alpha }}|\ge \epsilon \right)& =& \sum _{m=1}^{\mathrm{\infty }}\sum _{n={2}^{m-1}}^{{2}^{m}-1}{n}^{\alpha p-2}P\left(\underset{j\ge n}{sup}|\frac{{S}_{j}}{{j}^{\alpha }}|\ge \epsilon \right)\\ \le & C\sum _{m=1}^{\mathrm{\infty }}P\left(\underset{j\ge {2}^{m-1}}{sup}|\frac{{S}_{j}}{{j}^{\alpha }}|\ge \epsilon \right)\sum _{n={2}^{m-1}}^{{2}^{m}-1}{2}^{m\left(\alpha p-2\right)}\\ \le & C\sum _{m=1}^{\mathrm{\infty }}{2}^{m\left(\alpha p-1\right)}P\left(\underset{j\ge {2}^{m-1}}{sup}|\frac{{S}_{j}}{{j}^{\alpha }}|\ge \epsilon \right)\\ =& C\sum _{m=1}^{\mathrm{\infty }}{2}^{m\left(\alpha p-1\right)}P\left(\underset{k\ge m}{sup}\underset{{2}^{k-1}\le j<{2}^{k}}{max}|\frac{{S}_{j}}{{j}^{\alpha }}|\ge \epsilon \right)\\ \le & C\sum _{m=1}^{\mathrm{\infty }}{2}^{m\left(\alpha p-1\right)}\sum _{k=m}^{\mathrm{\infty }}P\left(\underset{1\le j\le {2}^{k}}{max}|{S}_{j}|\ge \epsilon {2}^{\alpha \left(k-1\right)}\right)\\ \le & C\sum _{k=1}^{\mathrm{\infty }}P\left(\underset{1\le j\le {2}^{k}}{max}|{S}_{j}|\ge \epsilon {2}^{\alpha \left(k-1\right)}\right)\sum _{m=1}^{k}{2}^{m\left(\alpha p-1\right)}\\ \le & C\sum _{k=1}^{\mathrm{\infty }}{2}^{k\left(\alpha p-1\right)}P\left(\underset{1\le j\le {2}^{k}}{max}|{S}_{j}|\ge \epsilon {2}^{\alpha \left(k-1\right)}\right)\\ \le & C\sum _{k=1}^{\mathrm{\infty }}\sum _{n={2}^{k}}^{{2}^{k+1}-1}{n}^{\alpha p-2}P\left(\underset{1\le j\le n}{max}|{S}_{j}|\ge \left(\frac{\epsilon }{{4}^{\alpha }}\right){n}^{\alpha }\right)\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}P\left(\underset{1\le j\le n}{max}|{S}_{j}|\ge \left(\frac{\epsilon }{{4}^{\alpha }}\right){n}^{\alpha }\right).\end{array}$
(4.6)

The desired result (2.5) follows from (2.2) and (4.6) immediately. □

Declarations

Acknowledgements

The authors are most grateful to the editor Andrei Volodin and anonymous referees for careful reading of the manuscript and valuable suggestions which helped in significantly improving an earlier version of this paper. The research was supported by the National Natural Science Foundation of China (11171001, 11201001, 11126176), Natural Science Foundation of Anhui Province (1208085QA03), Provincial Natural Science Research Project of Anhui Colleges (KJ2010A005), Academic Innovation Team of Anhui University (KJTD001B), Doctoral Research Start-up Funds Projects of Anhui University, and the Talents Youth Fund of Anhui Province Universities (2011SQRL012ZD).

Authors’ Affiliations

(1)
School of Mathematical Science, Anhui University, Hefei, 230039, P.R. China

References 