
An extension of the Baum-Katz theorem to i.i.d. random variables with general moment conditions

Abstract

For a sequence of i.i.d. random variables \(\{X, X_{n}, n\ge1\}\) and a sequence of positive real numbers \(\{a_{n}, n\ge1\}\) with \(0< a_{n}/n^{1/p}\uparrow\) for some \(0< p<2\), the Baum-Katz complete convergence theorem is extended to sequences satisfying the general moment condition \(\sum^{\infty}_{n=1}n^{r-1}P\{|X|>a_{n}\}<\infty\), where \(r\ge1\). The relationship between complete convergence and the strong law of large numbers is also established.

1 Introduction and main result

The concept of complete convergence was first introduced by Hsu and Robbins [1] and has played a very important role in probability theory. A sequence of random variables \(\{U_{n},n\geq1\}\) is said to converge completely to a constant C if \(\sum^{\infty}_{n=1}P\{|U_{n}-C|>\varepsilon\}<\infty\) for any \(\varepsilon>0\). Hsu and Robbins [1] proved that the sequence of arithmetic means of independent and identically distributed (i.i.d.) random variables converges completely to the expected value if the variance of the summands is finite. Their result has been generalized and extended by many authors.
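The summability condition in this definition can be illustrated numerically. The following sketch (a Monte Carlo illustration only, not part of the original argument; the standard normal law, \(\varepsilon=0.5\), and the sample sizes are our choices) estimates the tail probabilities \(P\{|S_{n}/n|>\varepsilon\}\) for a few values of n and shows how quickly they decay when the variance is finite.

```python
# Monte Carlo sketch (illustration only): for i.i.d. summands with finite
# variance, the tail probabilities P{|S_n/n - C| > eps} decay fast, in line
# with the Hsu-Robbins theorem.  Standard normal summands, C = 0, eps = 0.5.
import random

random.seed(0)
EPS = 0.5
TRIALS = 2000

def tail_prob(n):
    """Estimate P{ |S_n / n| > EPS } by simulation."""
    hits = 0
    for _ in range(TRIALS):
        s = sum(random.gauss(0.0, 1.0) for _ in range(n))
        if abs(s / n) > EPS:
            hits += 1
    return hits / TRIALS

probs = [tail_prob(n) for n in (10, 40, 160)]
print(probs)  # rapidly decreasing tail probabilities
```

Of course, a simulation over finitely many n cannot verify summability; it only makes the rapid decay of the summands visible.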

The following result is well known.

Theorem A

Let \(r\geq1\) and \(0< p<2\). Let \(\{X, X_{n},n\geq1\}\) be a sequence of i.i.d. random variables with partial sums \(S_{n}=\sum_{k=1}^{n} X_{k}\), \(n\geq1\). Then the following statements are equivalent:

$$\begin{aligned} &E|X|^{rp}< \infty, \end{aligned}$$
(1.1)
$$\begin{aligned} &\sum^{\infty}_{n=1}n^{r-2}P \bigl\{ \vert S_{n}-nb\vert >\varepsilon n^{1/p} \bigr\} < \infty \quad\forall \varepsilon>0, \end{aligned}$$
(1.2)
$$\begin{aligned} &\sum^{\infty}_{n=1}n^{r-2}P \Bigl\{ \max_{1\leq m\leq n}\vert S_{m}-mb\vert >\varepsilon n^{1/p} \Bigr\} < \infty\quad \forall \varepsilon>0, \end{aligned}$$
(1.3)

where \(b=0\) if \(0< rp<1\) and \(b=EX\) if \(rp\geq1\).

When \(r=1\), each of (1.1)-(1.3) is equivalent to

$$ \frac{S_{n} -nb}{n^{1/p}}\to0 \quad \mbox{a.s.} $$
(1.4)

When \(r=1\), the equivalence of (1.1) and (1.4) is known as the Marcinkiewicz and Zygmund strong law of large numbers. Katz [2] proved the equivalence of (1.1) and (1.2) for the case of \(p=1\). Baum and Katz [3] proved the equivalence of (1.1) and (1.2) for the case of \(0< p<2\). The result of Baum and Katz was generalized and extended in several directions. Some versions of the Baum and Katz theorem under higher-order moment conditions were established by Lanzinger [4], Gut and Stadtmüller [5], and Chen and Sung [6]. When \(p=1\), \(1\leq r<3\), and \(\{X_{n}, n\geq1\}\) is a sequence of pairwise independent, but not necessarily identically distributed, random variables, Spătaru [7] gave sufficient conditions for (1.2).

It is interesting to find more general moment conditions under which complete convergence holds. Li et al. [8] and Sung [9] have obtained results in this direction. In particular, it is worth pointing out that Sung [9] obtained the following complete convergence for pairwise i.i.d. random variables \(\{X,X_{n},n\geq1\}\):

$$\sum^{\infty}_{n=1}n^{-1}P \Biggl\{ \Biggl\vert \sum^{n}_{k=1}X_{k}-nEXI\bigl(|X| \leq a_{n}\bigr)\Biggr\vert >\varepsilon a_{n} \Biggr\} < \infty \quad \forall\varepsilon>0, $$

provided that \(\sum^{\infty}_{n=1}P\{|X|>a_{n}\}<\infty\), where \(0< a_{n}/n\uparrow\).

Motivated by the work of Sung [9], the aim of this paper is to obtain the complete convergence under more general moment conditions. Our main result includes the Baum and Katz [3] complete convergence and the Marcinkiewicz and Zygmund strong law of large numbers.

Now we state the main result. The necessary lemmas and the proof of the main result are given in the next section.

Theorem 1.1

Let \(r\geq1\) and \(0< p<2\). Let \(\{X, X_{n},n\geq1\}\) be a sequence of i.i.d. random variables with partial sums \(S_{n}=\sum_{k=1}^{n} X_{k}\), \(n\geq1\), and \(\{a_{n},n\geq1\}\) a sequence of positive real numbers with \(0< a_{n}/n^{1/p}\uparrow\). Then the following statements are equivalent:

$$\begin{aligned} &\sum^{\infty}_{n=1}n^{r-1}P \bigl\{ |X|>a_{n}\bigr\} < \infty, \end{aligned}$$
(1.5)
$$\begin{aligned} &\sum^{\infty}_{n=1}n^{r-2}P \bigl\{ \vert S_{n} -nb_{n}\vert >\varepsilon a_{n} \bigr\} < \infty \quad\forall \varepsilon>0, \end{aligned}$$
(1.6)
$$\begin{aligned} &\sum^{\infty}_{n=1}n^{r-2}P \Bigl\{ \max_{1\leq m\leq n}\vert S_{m} -mb_{n} \vert >\varepsilon a_{n} \Bigr\} < \infty\quad \forall \varepsilon>0, \end{aligned}$$
(1.7)

where \(b_{n}=0\) if \(0< p<1\) and \(b_{n}=EXI(|X|\leq a_{n})\) if \(1\leq p<2\).

When \(r=1\), each of (1.5)-(1.7) is equivalent to

$$ a_{n}^{-1} (S_{n} -nb_{n} )\rightarrow0 \quad\mbox{a.s. } $$
(1.8)

Remark 1.1

When \(a_{n}=n^{1/p}\) for \(n\geq1\), (1.5) is equivalent to (1.1). In this case, \(a_{n}^{-1}\cdot n EXI(|X|>a_{n})\to0\) if \(1\leq p<2\) and (1.1) holds. Hence, (1.1) ⇒ (1.2), (1.1) ⇒ (1.3), and (1.1) ⇒ (1.4) (in this case, \(r=1\)) follow from Theorem 1.1. Although the converses do not follow directly from Theorem 1.1, the proofs can be done easily. When \(a_{n}=n^{1/p}(\ln n)^{\alpha}\) for \(n\geq1\), where \(\alpha>0\), (1.5) is equivalent to \(E|X|^{rp}/(\ln(|X|+2))^{\alpha rp}<\infty\).
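The equivalence in Remark 1.1 can be made concrete for a specific tail. Assuming (our choice, for illustration) a Pareto-type tail \(P\{|X|>t\}=t^{-\alpha}\) for \(t\geq1\) and \(a_{n}=n^{1/p}\), condition (1.5) becomes \(\sum_{n}n^{r-1-\alpha/p}\), which converges exactly when \(\alpha>rp\), matching \(E|X|^{rp}<\infty\). A quick numerical comparison of partial sums:

```python
# Illustration of Remark 1.1 under an assumed Pareto-type tail
# P{|X| > t} = t**(-alpha) (t >= 1) with a_n = n**(1/p): condition (1.5)
# reads sum_n n**(r - 1 - alpha/p), convergent exactly when alpha > rp.
def partial_sum(exponent, n_max):
    """Partial sum sum_{n=1}^{n_max} n**exponent."""
    return sum(n ** exponent for n in range(1, n_max + 1))

r, p = 1.0, 1.0                     # sample values; critical index is alpha = rp = 1
conv = [partial_sum(r - 1 - 2.0 / p, N) for N in (10**3, 10**4)]   # alpha = 2 > rp
div  = [partial_sum(r - 1 - 0.5 / p, N) for N in (10**3, 10**4)]   # alpha = 0.5 < rp

print(conv[1] - conv[0])  # tiny increment: the series converges
print(div[1] - div[0])    # large increment: the series diverges
```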

Throughout this paper, the symbol C denotes a positive constant that is not necessarily the same one in each appearance, and \(I(A)\) denotes the indicator function of an event A.

2 Lemmas and proofs

To prove the main result, the following lemmas are needed. Lemma 2.1 is the Rosenthal inequality for the sum of independent random variables; see, for example, Petrov [10].

Lemma 2.1

Let \(\{Y_{n}, n\ge1 \}\) be a sequence of independent random variables with \(EY_{n}=0\) and \(E|Y_{n}|^{s}<\infty\) for some \(s\ge2\) and all \(n\ge1\). Then there exists a positive constant C depending only on s such that for all \(n\ge1\),

$$E\max_{1\leq m\leq n}\Biggl\vert \sum_{k=1}^{m} Y_{k}\Biggr\vert ^{s}\le C \Biggl\{ \sum _{k=1}^{n} E|Y_{k}|^{s} + \Biggl( \sum_{k=1}^{n} EY_{k}^{2} \Biggr)^{s/2} \Biggr\} . $$
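As a sanity check on the growth rate in Lemma 2.1 (for illustration only; the maximum is dropped, and the choice of Rademacher summands \(Y_{k}=\pm1\) with \(s=4\) is ours), one can compute \(E|S_{n}|^{4}=3n^{2}-2n\) exactly by enumeration and compare it with the Rosenthal-type bound \(C(\sum E|Y_{k}|^{4}+(\sum EY_{k}^{2})^{2})=C(n+n^{2})\), which holds here with \(C=3\):

```python
# Exact small-n check (illustration; the maximum is omitted) of Lemma 2.1 for
# Rademacher variables Y_k = +-1 and s = 4: E|S_n|^4 = 3n^2 - 2n, which is
# bounded by C * (sum E|Y_k|^4 + (sum E Y_k^2)**2) = C * (n + n^2) with C = 3.
from itertools import product

def fourth_moment(n):
    """Exact E|S_n|^4 over all 2**n sign patterns."""
    total = sum(sum(signs) ** 4 for signs in product((-1, 1), repeat=n))
    return total / 2 ** n

for n in (2, 4, 8, 10):
    m4 = fourth_moment(n)
    assert m4 == 3 * n * n - 2 * n      # closed form for Rademacher sums
    assert m4 <= 3 * (n + n * n)        # Rosenthal-type bound with C = 3
print("ok")
```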

Lemma 2.2

Under the assumptions of Theorem  1.1, if \(0< p<1\) and (1.5) holds, then

$$a_{n}^{-1}\cdot nEXI\bigl(|X|\leq a_{n}\bigr)\rightarrow0 $$

as \(n\rightarrow\infty\).

Proof

Since \(0< p<1\), by \(0< a_{n}/n^{1/p}\uparrow\) we have \(0< a_{n}/n\uparrow\infty\). By (1.5) we have

$$\sum^{\infty}_{n=1}P\bigl\{ |X|>a_{n}\bigr\} < \infty. $$

Therefore, by Lemma 2.4 in Sung [9] we have the desired result. □

Lemma 2.3

Under the assumptions of Theorem  1.1, if \(rp\geq2\) and (1.5) holds, then

$$a_{n}^{-2}\cdot nE|X|^{2}I\bigl(|X|\leq a_{n}\bigr)\leq Cn^{1-2/p}. $$

Proof

By \(0< a_{n}/n^{1/p}\uparrow\) we have \(a_{k}/a_{n}\leq (k/n)^{1/p}\) for any \(1\leq k\leq n\). Hence,

$$\begin{aligned} a_{n}^{-2}\cdot nE|X|^{2}I\bigl(|X|\leq a_{n}\bigr) &=a_{n}^{-2}\cdot n\sum ^{n}_{k=1}E|X|^{2}I\bigl(a_{k-1}< |X|\leq a_{k}\bigr) \quad (\mbox{set } a_{0}=0) \\ &\leq a_{n}^{-2}\cdot n\sum^{n}_{k=1}a_{k}^{2}P \bigl\{ a_{k-1}< |X|\leq a_{k}\bigr\} \\ &\leq n^{1-2/p}\sum^{n}_{k=1}k^{2/p}P \bigl\{ a_{k-1}< |X|\leq a_{k}\bigr\} \\ &\leq n^{1-2/p}\sum^{n}_{k=1}k^{r}P \bigl\{ a_{k-1}< |X|\leq a_{k}\bigr\} \quad (\mbox{by }rp\geq2) \\ &\leq n^{1-2/p}\sum^{\infty}_{k=0} \bigl[(k+1)^{r}-k^{r}\bigr]P\bigl\{ |X|>a_{k}\bigr\} \\ &\leq n^{1-2/p}\cdot \Biggl(1+ r2^{r-1}\sum ^{\infty}_{k=1}k^{r-1}P\bigl\{ |X|>a_{k}\bigr\} \Biggr). \end{aligned}$$

Set \(C=1+ r2^{r-1}\sum^{\infty}_{k=1}k^{r-1}P\{|X|>a_{k}\}\). By (1.5), \(C<\infty\). So we complete the proof. □
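The last two steps of the proof use the elementary bound \((k+1)^{r}-k^{r}\leq r2^{r-1}k^{r-1}\) for \(r\geq1\) and \(k\geq1\), which follows from the mean value theorem together with \(k+1\leq2k\). A quick numerical check over a grid (for illustration only):

```python
# Numerical check (illustration only) of the elementary bound used above:
# for r >= 1 and k >= 1,  (k+1)**r - k**r <= r * 2**(r-1) * k**(r-1),
# by the mean value theorem and k + 1 <= 2k.
ok = all(
    (k + 1) ** r - k ** r <= r * 2 ** (r - 1) * k ** (r - 1) + 1e-9
    for r in (1.0, 1.5, 2.0, 3.0, 4.5)
    for k in range(1, 200)
)
print(ok)  # True
```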

Lemma 2.4

Under the assumptions of Theorem  1.1, if \(s>rp\) and (1.5) holds, then

$$\sum^{\infty}_{n=1}n^{r-2} \cdot a_{n}^{-s}nE|X|^{s}I\bigl(|X|\leq a_{n}\bigr)< \infty. $$

Proof

By \(0< a_{n}/n^{1/p}\uparrow\) we have \(a_{k}/a_{n}\leq (k/n)^{1/p}\) for any \(n\geq k\). Hence,

$$\begin{aligned} \sum^{\infty}_{n=1}n^{r-2} \cdot a_{n}^{-s}nE|X|^{s}I\bigl(|X|\leq a_{n}\bigr) &= \sum^{\infty}_{n=1}n^{r-1}a_{n}^{-s} \sum^{n}_{k=1}E|X|^{s}I\bigl(a_{k-1}< |X| \leq a_{k}\bigr) \\ &\leq\sum^{\infty}_{n=1}n^{r-1}a_{n}^{-s} \sum^{n}_{k=1}a_{k}^{s}P \bigl\{ a_{k-1}< |X|\leq a_{k}\bigr\} \\ &=\sum^{\infty}_{k=1}a_{k}^{s}P \bigl\{ a_{k-1}< |X|\leq a_{k}\bigr\} \sum^{\infty}_{n=k}n^{r-1}a_{n}^{-s} \\ &\leq\sum^{\infty}_{k=1}k^{s/p}P \bigl\{ a_{k-1}< |X|\leq a_{k}\bigr\} \sum^{\infty}_{n=k}n^{r-1-s/p} \\ &\leq C\sum^{\infty}_{k=1}k^{r}P \bigl\{ a_{k-1}< |X|\leq a_{k}\bigr\} < \infty. \end{aligned}$$

Therefore, the proof is completed. □
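The third step of the display interchanges the order of summation, which is justified for nonnegative terms (Tonelli's theorem for series): \(\sum_{n\geq1}d_{n}\sum_{k=1}^{n}c_{k}=\sum_{k\geq1}c_{k}\sum_{n\geq k}d_{n}\). A finite-truncation sanity check with arbitrary nonnegative sequences (our choices, for illustration):

```python
# Sanity check (illustration, not part of the proof) of the summation-order
# interchange: for nonnegative terms,
#   sum_{n=1}^{N} d_n * sum_{k=1}^{n} c_k  ==  sum_{k=1}^{N} c_k * sum_{n=k}^{N} d_n.
N = 50
c = [1.0 / (k + 1) for k in range(1, N + 1)]   # arbitrary nonnegative c_k
d = [1.0 / (n * n) for n in range(1, N + 1)]   # arbitrary nonnegative d_n

lhs = sum(d[n - 1] * sum(c[k - 1] for k in range(1, n + 1)) for n in range(1, N + 1))
rhs = sum(c[k - 1] * sum(d[n - 1] for n in range(k, N + 1)) for k in range(1, N + 1))
print(abs(lhs - rhs) < 1e-12)  # True
```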

Lemma 2.5

Let \(\{X, X_{n}, n\geq1\}\) be a sequence of i.i.d. symmetric random variables, and \(\{a_{n},n\geq1\}\) a sequence of real numbers with \(0< a_{n}\uparrow \infty\). Suppose that

$$ \sum^{\infty}_{n=1}n^{-1}P \Biggl\{ \Biggl\vert \sum^{n}_{k=1}X_{k} \Biggr\vert >\varepsilon a_{n} \Biggr\} < \infty \quad\forall \varepsilon>0. $$
(2.1)

Then

$$ a_{n}^{-1}\sum^{n}_{k=1}X_{k} \rightarrow0 \textit{ in probability}. $$
(2.2)

Proof

Set \(S_{n}=\sum^{n}_{k=1}X_{k}\), \(n\geq1\). Note that for all \(\varepsilon>0\),

$$\begin{aligned}[b] P\bigl\{ |S_{2n+1}|>\varepsilon a_{2n+1}\bigr\} &\leq P \bigl\{ |S_{2n}|>\varepsilon a_{2n+1}/2\bigr\} +P\bigl\{ |X_{2n+1}|> \varepsilon a_{2n+1}/2 \bigr\} \\ &\leq P\bigl\{ |S_{2n}|>\varepsilon a_{2n}/2\bigr\} +P\bigl\{ |X|>\varepsilon a_{2n+1}/2 \bigr\} \end{aligned} $$

and \(P\{|X|>\varepsilon a_{2n+1}/2 \}\rightarrow0\) as \(n\rightarrow \infty\). Hence, to prove (2.2), it suffices to prove that

$$ a_{2n}^{-1}S_{2n}\rightarrow0 \mbox{ in probability}. $$
(2.3)

We will prove (2.3) by contradiction. Suppose that there exist a constant \(\varepsilon>0\) and a sequence of integers \(\{n_{i},i\geq1\}\) with \(n_{i}\uparrow\infty\) such that

$$P\bigl\{ |S_{2n_{i}}|>\varepsilon a_{2n_{i}}\bigr\} \geq\varepsilon \quad \mbox{for all } i\geq1. $$

Without loss of generality, we can assume that \(2n_{i}< n_{i+1}\). By the Lévy inequality (see, for example, formula (2.6) in Ledoux and Talagrand [11]) we have

$$\begin{aligned} \sum^{\infty}_{n=1}n^{-1}P \bigl\{ |S_{n}|>\varepsilon a_{n}/2\bigr\} &\geq\frac{1}{2}\sum ^{\infty}_{n=1}n^{-1}P\Bigl\{ \max _{1\leq k\leq n}|S_{k}|>\varepsilon a_{n}/2\Bigr\} \\ &\geq\frac{1}{2}\sum^{\infty}_{i=2}\sum ^{2n_{i}}_{n=n_{i}+1}n^{-1}P\Bigl\{ \max _{1\leq k\leq n}|S_{k}|>\varepsilon a_{n}/2\Bigr\} \\ &\geq\frac{1}{2}\sum^{\infty}_{i=2}\sum ^{2n_{i}}_{n=n_{i}+1}n^{-1}P\Bigl\{ \max _{1\leq k\leq n_{i}}|S_{k}|>\varepsilon a_{2n_{i}}/2\Bigr\} \\ &\geq\frac{1}{2}\sum^{\infty}_{i=2}\sum ^{2n_{i}}_{n=n_{i}+1}n^{-1}P\bigl\{ |S_{n_{i}}|>\varepsilon a_{2n_{i}}/2\bigr\} \\ &=\frac{1}{4}\sum^{\infty}_{i=2}\sum ^{2n_{i}}_{n=n_{i}+1}n^{-1} \bigl(P\bigl\{ |S_{n_{i}}|>\varepsilon a_{2n_{i}}/2\bigr\} +P\bigl\{ |S_{2n_{i}}-S_{n_{i}}|> \varepsilon a_{2n_{i}}/2\bigr\} \bigr) \\ &\geq\frac{1}{4}\sum^{\infty}_{i=2}\sum ^{2n_{i}}_{n=n_{i}+1}n^{-1}P\bigl\{ |S_{2n_{i}}|>\varepsilon a_{2n_{i}}\bigr\} \\ &\geq\frac{\varepsilon}{4}\sum^{\infty}_{i=2}\sum ^{2n_{i}}_{n=n_{i}+1}n^{-1}=\infty, \end{aligned}$$

which contradicts (2.1). Hence, (2.3) holds, and so the proof is completed. □
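The divergence at the end of the proof rests on the elementary fact that each block sum \(\sum_{n=n_{i}+1}^{2n_{i}}n^{-1}\) is bounded below by \(1/2\) (there are \(n_{i}\) terms, each at least \(1/(2n_{i})\)), so infinitely many disjoint blocks force the whole series to diverge. A quick check:

```python
# Lower bound for the block sums used in the proof of Lemma 2.5 (illustration):
# sum_{n=m+1}^{2m} 1/n >= 1/2 for every m, and the block sums increase to log 2.
def block_sum(m):
    return sum(1.0 / n for n in range(m + 1, 2 * m + 1))

print([round(block_sum(m), 4) for m in (1, 10, 100, 1000)])  # all >= 0.5
```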

Proof of Theorem 1.1

We first prove that (1.5) implies (1.7). By Lemma 2.2, to prove (1.7), it suffices to prove that

$$ \sum^{\infty}_{n=1}n^{r-2}P \Biggl\{ \max_{1\leq m\leq n}\Biggl\vert \sum^{m}_{k=1} \bigl(X_{k}-EX_{k}I\bigl(|X_{k}|\leq a_{n}\bigr) \bigr)\Biggr\vert >\varepsilon a_{n} \Biggr\} < \infty\quad \forall \varepsilon>0. $$
(2.4)

Note that

$$\begin{aligned} & \Biggl\{ \max_{1\leq m\leq n}\Biggl\vert \sum ^{m}_{k=1}\bigl(X_{k}-EX_{k}I\bigl(|X_{k}| \leq a_{n}\bigr)\bigr)\Biggr\vert >\varepsilon a_{n} \Biggr\} \\ &\quad\subset\bigcup^{n}_{k=1} \bigl\{ |X_{k}|>a_{n}\bigr\} \cup \Biggl\{ \max_{1\leq m\leq n} \Biggl\vert \sum^{m}_{k=1} \bigl(X_{k}I\bigl(|X_{k}|\leq a_{n}\bigr)-EX_{k}I\bigl(|X_{k}| \leq a_{n}\bigr)\bigr)\Biggr\vert >\varepsilon a_{n} \Biggr\} . \end{aligned}$$

Hence, by (1.5), to prove (2.4) it suffices to prove that for all \(\varepsilon>0\),

$$ \sum^{\infty}_{n=1}n^{r-2}P \Biggl\{ \max_{1\leq m\leq n}\Biggl\vert \sum^{m}_{k=1} \bigl(X_{k}I\bigl(|X_{k}|\leq a_{n}\bigr)-EX_{k}I\bigl(|X_{k}| \leq a_{n}\bigr)\bigr)\Biggr\vert >\varepsilon a_{n} \Biggr\} < \infty. $$
(2.5)

For any \(s\geq2\), by the Markov inequality and Lemma 2.1,

$$\begin{aligned} &\sum^{\infty}_{n=1}n^{r-2}P \Biggl\{ \max_{1\leq m\leq n}\Biggl\vert \sum^{m}_{k=1} \bigl(X_{k}I\bigl(|X_{k}|\leq a_{n}\bigr)-EX_{k}I\bigl(|X_{k}| \leq a_{n}\bigr)\bigr)\Biggr\vert >\varepsilon a_{n} \Biggr\} \\ &\quad\leq C\sum^{\infty}_{n=1}n^{r-2} a_{n}^{-s}E\max_{1\leq m\leq n}\Biggl\vert \sum ^{m}_{k=1}\bigl(X_{k}I\bigl(|X_{k}| \leq a_{n}\bigr)-EX_{k}I\bigl(|X_{k}|\leq a_{n}\bigr) \bigr)\Biggr\vert ^{s} \\ &\quad\leq C \Biggl(\sum^{\infty}_{n=1}n^{r-2} \bigl\{ a_{n}^{-2}nEX^{2}I\bigl(|X|\leq a_{n}\bigr) \bigr\} ^{s/2}+\sum^{\infty}_{n=1}n^{r-1}a_{n}^{-s}E|X|^{s}I\bigl(|X| \leq a_{n}\bigr) \Biggr) \\ &\quad=C(I_{1}+I_{2}). \end{aligned}$$

If \(rp\geq2\), taking s large enough such that \(r-2-s/p+s/2<-1\), by Lemma 2.3 we have

$$I_{1}\leq C\sum^{\infty}_{n=1}n^{r-2-s/p+s/2}< \infty. $$

Since \(s>rp\), \(I_{2}<\infty\) by Lemma 2.4. If \(0< rp<2\), taking \(s=2\) (in this case \(I_{1}=I_{2}\)), we have \(I_{1}=I_{2}<\infty\) by Lemma 2.4 again. Hence, (2.5) holds for all \(\varepsilon>0\).
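The existence of a suitable s in the case \(rp\geq2\) is just exponent bookkeeping: since \(0< p<2\), the coefficient of s in \(r-2-s/p+s/2\) is \(1/2-1/p<0\), so the exponent can be pushed below \(-1\) by taking s large. A numerical check with sample values (our choices, for illustration):

```python
# Exponent bookkeeping for the case rp >= 2 (illustration): since p < 2, the
# coefficient of s in r - 2 - s/p + s/2 is negative, so a large enough s makes
# the exponent < -1 and hence I_1 summable.
def exponent(r, p, s):
    return r - 2 - s / p + s / 2

r, p = 2.0, 1.0       # sample values with rp >= 2 (assumed for illustration)
s = 10.0              # a "large enough" choice
print(exponent(r, p, s))  # -5.0, which is < -1
```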

It is trivial that (1.7) implies (1.6). Now we prove that (1.6) implies (1.5). Let \(\{X', X_{n}', n\geq1\}\) be an independent copy of \(\{X,X_{n},n\geq1\}\). Then we also have

$$\sum^{\infty}_{n=1}n^{r-2}P \Biggl\{ \Biggl\vert \sum^{n}_{k=1}X_{k}'-nb_{n} \Biggr\vert >\varepsilon a_{n} \Biggr\} < \infty \quad \forall \varepsilon>0. $$

Hence,

$$\sum^{\infty}_{n=1}n^{r-2}P \Biggl\{ \Biggl\vert \sum^{n}_{k=1} \bigl(X_{k}-X_{k}'\bigr)\Biggr\vert > \varepsilon a_{n} \Biggr\} < \infty \quad \forall \varepsilon>0, $$

from which it follows that

$$\sum^{\infty}_{n=1}n^{-1}P \Biggl\{ \Biggl\vert \sum^{n}_{k=1} \bigl(X_{k}-X_{k}'\bigr)\Biggr\vert > \varepsilon a_{n} \Biggr\} < \infty \quad \forall \varepsilon>0. $$

Then, by Lemma 2.5,

$$a_{n}^{-1}\sum^{n}_{k=1} \bigl(X_{k}-X_{k}'\bigr)\rightarrow0 \mbox{ in probability}. $$

By the Lévy inequality (see, for example, formula (2.7) in Ledoux and Talagrand [11]), for any fixed \(\varepsilon>0\),

$$P \Bigl\{ \max_{1\leq k\leq n}\bigl|X_{k}-X_{k}'\bigr|> \varepsilon a_{n} \Bigr\} \leq 2P \Biggl\{ \Biggl\vert \sum ^{n}_{k=1}\bigl(X_{k}-X_{k}' \bigr)\Biggr\vert >\varepsilon a_{n} \Biggr\} \rightarrow0 $$

as \(n\rightarrow\infty\). Then, for all n large enough,

$$P\Bigl\{ \max_{1\leq k\leq n}\bigl|X_{k}-X_{k}'\bigr|> \varepsilon a_{n}\Bigr\} \leq1/2. $$

Therefore, by Lemma 2.6 in Ledoux and Talagrand [11] and the Lévy inequality (see formula (2.7) in Ledoux and Talagrand [11]) we have that for all n large enough,

$$\begin{aligned} nP\bigl\{ \bigl|X-X'\bigr|>\varepsilon a_{n}\bigr\} &=\sum ^{n}_{k=1}P\bigl\{ \bigl|X_{k}-X_{k}'\bigr|> \varepsilon a_{n}\bigr\} \\ &\leq2P\Bigl\{ \max_{1\leq k\leq n}\bigl|X_{k}-X_{k}'\bigr|> \varepsilon a_{n}\Bigr\} \leq4P \Biggl\{ \Biggl\vert \sum ^{n}_{k=1}\bigl(X_{k}-X_{k}' \bigr)\Biggr\vert >\varepsilon a_{n} \Biggr\} . \end{aligned}$$

Therefore,

$$ \sum^{\infty}_{n=1}n^{r-1}P\bigl\{ \bigl|X-X'\bigr|>\varepsilon a_{n}\bigr\} < \infty \quad\forall \varepsilon>0. $$
(2.6)

Since \(P\{|X|>a_{n}/2\}\to0\) as \(n\to\infty\), \(|\operatorname{med}(X)/(a_{n}/2)|\leq1\) for all n large enough. By the weak symmetrization inequality we have that for all n large enough,

$$ P\bigl\{ |X|>a_{n}\bigr\} \leq P\bigl\{ \bigl|X-\operatorname{med}(X)\bigr|>a_{n}/2 \bigr\} \leq2P\bigl\{ \bigl|X-X'\bigr|> a_{n}/2\bigr\} , $$
(2.7)

which, together with (2.6), implies that (1.5) holds.
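The weak symmetrization inequality used in (2.7) states that for any \(t>0\) and any median m of X, \(P\{|X-m|>t\}\leq2P\{|X-X'|>t\}\), where \(X'\) is an independent copy of X. A Monte Carlo sanity check (for illustration only; the Exp(1) law, whose median is \(\ln2\), and the threshold are our choices):

```python
# Monte Carlo sanity check (illustration only) of the weak symmetrization
# inequality: P{|X - med(X)| > t} <= 2 * P{|X - X'| > t} for an independent
# copy X' of X.  Here X ~ Exp(1), whose median is ln 2 ~ 0.6931.
import random

random.seed(1)
T = 1.5
N = 20000
xs = [random.expovariate(1.0) for _ in range(N)]
ys = [random.expovariate(1.0) for _ in range(N)]   # independent copies
med = 0.6931

lhs = sum(abs(x - med) > T for x in xs) / N
rhs = sum(abs(x - y) > T for x, y in zip(xs, ys)) / N
print(lhs, rhs)  # lhs stays well below 2 * rhs
```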

Finally, we prove that (1.5) and (1.8) are equivalent when \(r=1\). Assume that (1.5) holds for \(r=1\). Since \(\sum^{\infty}_{i=1}iP\{a_{i}<|X|\leq a_{i+1}\}=\sum^{\infty}_{n=1} P\{ |X|>a_{n}\}<\infty\), for any fixed \(\varepsilon>0\), there exists a positive integer N such that \(\sum^{\infty}_{i=N+1}iP\{a_{i}<|X|\leq a_{i+1}\}<\varepsilon\). Then, for \(n>N+1\),

$$\begin{aligned} &\Biggl\vert a_{n}^{-1}\cdot nEXI\bigl(|X|\leq a_{n}\bigr)-a_{n}^{-1}\sum^{n}_{k=1}EXI\bigl(|X| \leq a_{k}\bigr)\Biggr\vert \\ &\quad\leq a_{n}^{-1}\sum^{n-1}_{k=1}E|X|I\bigl(a_{k}< |X| \leq a_{n}\bigr) \\ &\quad= a_{n}^{-1}\sum^{n-1}_{k=1} \sum^{n-1}_{i=k}E|X|I\bigl(a_{i}< |X|\leq a_{i+1}\bigr) \\ &\quad= a_{n}^{-1}\sum^{n-1}_{i=1}iE|X|I\bigl(a_{i}< |X| \leq a_{i+1}\bigr) \\ &\quad\leq a_{n}^{-1}\sum^{N}_{i=1}iE|X|I\bigl(a_{i}< |X|\leq a_{i+1}\bigr)+ \sum^{n-1}_{i=N+1}iP\bigl\{ a_{i}< |X|\leq a_{i+1}\bigr\} \\ &\quad\leq a_{n}^{-1}\sum^{N}_{i=1}iE|X|I\bigl(a_{i}< |X|\leq a_{i+1}\bigr)+ \varepsilon \\ &\quad\rightarrow\varepsilon \quad\mbox{as } n\to\infty. \end{aligned}$$

It follows that

$$\Biggl\vert a_{n}^{-1}\cdot nEXI\bigl(|X|\leq a_{n}\bigr)-a_{n}^{-1}\sum^{n}_{k=1}EXI\bigl(|X| \leq a_{k}\bigr)\Biggr\vert \to0 $$

as \(n\to\infty\). Hence, to prove (1.8), by Lemma 2.2 it suffices to prove that

$$ a^{-1}_{n}\sum^{n}_{k=1} \bigl(X_{k}I\bigl(|X_{k}|\leq a_{k}\bigr)-EX_{k}I\bigl(|X_{k}|\leq a_{k}\bigr)\bigr)\rightarrow0\quad \mbox{a.s.} $$
(2.8)

Since \(0< a_{n}/n^{1/p}\uparrow\) and \(0< p<2\),

$$\begin{aligned} &\sum^{\infty}_{n=1}a_{n}^{-2}Var \bigl(X_{n}I\bigl(|X_{n}|\leq a_{n}\bigr)\bigr) \\ &\quad\leq\sum^{\infty}_{n=1}a_{n}^{-2}EX^{2}I\bigl(|X| \leq a_{n}\bigr) \\ &\quad=\sum^{\infty}_{n=1}a_{n}^{-2} \sum^{n}_{k=1}EX^{2}I\bigl(a_{k-1}< |X| \leq a_{k}\bigr) \quad (\mbox{set } a_{0}=0) \\ &\quad\leq\sum^{\infty}_{k=1}a_{k}^{2}P \bigl\{ a_{k-1}< |X|\leq a_{k}\bigr\} \sum^{\infty}_{n=k}a_{n}^{-2} \\ &\quad\leq C\sum^{\infty}_{k=1}kP \bigl\{ a_{k-1}< |X|\leq a_{k}\bigr\} < \infty. \end{aligned}$$

Then by the Kolmogorov convergence criterion and the Kronecker lemma, (2.8) holds, and so (1.8) also holds.
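The Kronecker lemma invoked here states that if \(\sum_{n}x_{n}/a_{n}\) converges and \(0< a_{n}\uparrow\infty\), then \(a_{n}^{-1}\sum_{k=1}^{n}x_{k}\to0\). A deterministic illustration (the sequences \(a_{n}=n\) and \(x_{n}=n^{-1/2}\) are our choices, so that \(\sum x_{n}/a_{n}=\sum n^{-3/2}<\infty\) while the partial sums of \(x_{n}\) still diverge, only more slowly than \(a_{n}\)):

```python
# Deterministic illustration (assumed example) of the Kronecker lemma:
# a_n = n increases to infinity, x_n = 1/sqrt(n), and sum x_n/a_n = sum n^(-3/2)
# converges; hence (x_1 + ... + x_N) / a_N -> 0 even though sum x_n diverges.
import math

N = 100_000
a_N = float(N)                                        # a_n = n at n = N
x = [1.0 / math.sqrt(n) for n in range(1, N + 1)]     # partial sums grow like 2*sqrt(N)

ratio = sum(x) / a_N                                  # behaves like 2/sqrt(N)
print(ratio)  # small, and tending to 0 as N grows
```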

Conversely, assume that (1.8) holds. Let \(\{X', X_{n}', n\geq1\}\) be an independent copy of \(\{X,X_{n},n\geq1\}\). Then we also have

$$a_{n}^{-1} \Biggl(\sum^{n}_{k=1}X_{k}'-nb_{n} \Biggr)\rightarrow0 \quad\mbox{a.s.} $$

Hence, we have

$$a_{n}^{-1}\sum^{n}_{k=1} \bigl(X_{k}-X_{k}'\bigr)\rightarrow0 \quad \mbox{a.s.} $$

So, we have by \(0< a_{n}\uparrow\) that

$$a^{-1}_{n}\bigl(X_{n}-X_{n}' \bigr)=a_{n}^{-1}\sum^{n}_{k=1} \bigl(X_{k}-X_{k}'\bigr)-\bigl(a_{n-1}a_{n}^{-1} \bigr)a_{n-1}^{-1}\sum^{n-1}_{k=1} \bigl(X_{k}-X_{k}'\bigr) \rightarrow0 \quad \mbox{a.s.} $$

By the Borel-Cantelli lemma,

$$\sum^{\infty}_{n=1}P\bigl\{ \bigl|X_{n}-X_{n}'\bigr|> \varepsilon a_{n}\bigr\} < \infty \quad \forall \varepsilon>0, $$

which, together with (2.7), implies that (1.5) holds for \(r=1\). So we complete the proof. □

References

  1. Hsu, PL, Robbins, H: Complete convergence and the law of large numbers. Proc. Natl. Acad. Sci. USA 33, 25-31 (1947)


  2. Katz, M: The probability in the tail of a distribution. Ann. Math. Stat. 34, 312-318 (1963)


  3. Baum, LE, Katz, M: Convergence rates in the law of large numbers. Trans. Am. Math. Soc. 120, 108-123 (1965)


  4. Lanzinger, H: A Baum-Katz theorem for random variables under exponential moment conditions. Stat. Probab. Lett. 39, 89-95 (1998)


  5. Gut, A, Stadtmüller, U: An intermediate Baum-Katz theorem. Stat. Probab. Lett. 81, 1486-1492 (2011)


  6. Chen, P, Sung, SH: A Baum-Katz theorem for i.i.d. random variables with higher order moments. Stat. Probab. Lett. 94, 63-68 (2014)


  7. Spătaru, A: Generalizing a theorem of Katz. Stat. Probab. Lett. 80, 1136-1140 (2010)


  8. Li, W, Chen, P, Hu, TC: Complete convergence for moving average processes associated to heavy-tailed distributions and applications. J. Math. Anal. Appl. 420, 66-76 (2014)


  9. Sung, SH: On the strong law of large numbers for pairwise i.i.d. random variables with general moment conditions. Stat. Probab. Lett. 83, 1963-1968 (2013)


  10. Petrov, VV: Limit Theorems of Probability Theory: Sequences of Independent Random Variables. Clarendon Press, Oxford (1995)


  11. Ledoux, M, Talagrand, M: Probability in Banach Spaces. Springer, Berlin (1991)



Acknowledgements

The authors would like to thank the referees for the helpful comments. The research of Pingyan Chen and Jiaming Yi is supported by the National Natural Science Foundation of China (No. 11271161). The research of Soo Hak Sung is supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2014R1A1A2058041).

Author information


Corresponding author

Correspondence to Soo Hak Sung.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

All authors read and approved the manuscript.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Chen, P., Yi, J. & Sung, S.H. An extension of the Baum-Katz theorem to i.i.d. random variables with general moment conditions. J Inequal Appl 2015, 414 (2015). https://doi.org/10.1186/s13660-015-0939-2

