# Complete convergence for Sung’s type weighted sums of END random variables

## Abstract

In this paper, we study complete convergence for Sung’s type weighted sums of sequences of END random variables and obtain some new results. These results extend and improve the corresponding theorems of Sung (Discrete Dyn. Nat. Soc. 2010:630608, 2010, doi:10.1155/2010/630608).

MSC:60F15.

## 1 Introduction and main results

The concept of complete convergence was introduced by Hsu and Robbins [1] as follows. A sequence of random variables $\left\{{X}_{n},n\ge 1\right\}$ is said to converge completely to a constant c if

$\sum _{n=1}^{\mathrm{\infty }}P\left(|{X}_{n}-c|>\epsilon \right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.$
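Complete convergence is strictly stronger than almost sure convergence; the connection, a standard consequence of the Borel–Cantelli lemma (spelled out here for the reader, not part of the original text), is:

```latex
% Summability of the tail probabilities forces the exceedance events
% to occur only finitely often, for every fixed epsilon > 0:
\sum_{n=1}^{\infty} P\bigl(|X_n - c| > \epsilon\bigr) < \infty
\;\Longrightarrow\;
P\bigl(|X_n - c| > \epsilon \ \text{i.o.}\bigr) = 0
\;\Longrightarrow\;
X_n \to c \quad \text{almost surely.}
```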

From then on, many authors have studied complete convergence; see, for example, [2–4, 6].

Recently, Sung [5] obtained a complete convergence result for weighted sums of identically distributed ${\rho }^{\ast }$-mixing random variables (we call these Sung’s type weighted sums).

Theorem A Let $p>1/\alpha$ and $1/2<\alpha \le 1$. Let $\left\{X,{X}_{n},n\ge 1\right\}$ be a sequence of identically distributed ${\rho }^{\ast }$-mixing random variables with $EX=0$ and $E{|X|}^{p}<\mathrm{\infty }$. Assume that $\left\{{a}_{ni},1\le i\le n,n\ge 1\right\}$ is an array of real numbers satisfying

$\sum _{i=1}^{n}{|{a}_{ni}|}^{q}=O\left(n\right)$
(1.1)

for some $q>p$. Then

$\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}{X}_{i}|>\epsilon {n}^{\alpha }\right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.$
(1.2)

Conversely, (1.2) implies $EX=0$ and $E{|X|}^{p}<\mathrm{\infty }$ if (1.2) holds for any array $\left\{{a}_{ni},1\le i\le n,n\ge 1\right\}$ with (1.1) for some $q>p$.

In this paper, we extend Theorem A to the END setting. We first introduce the concept of END random variables.

Definition 1.1 Random variables ${Y}_{1},{Y}_{2},\dots$ are said to be extended negatively dependent (END) if there exists a constant $M>0$ such that, for each $n\ge 2$,

$P\left({Y}_{1}\le {y}_{1},\dots ,{Y}_{n}\le {y}_{n}\right)\le M\prod _{i=1}^{n}P\left({Y}_{i}\le {y}_{i}\right)$

and

$P\left({Y}_{1}>{y}_{1},\dots ,{Y}_{n}>{y}_{n}\right)\le M\prod _{i=1}^{n}P\left({Y}_{i}>{y}_{i}\right)$

hold for every sequence $\left\{{y}_{1},\dots ,{y}_{n}\right\}$ of real numbers.
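Independent random variables satisfy both inequalities above with $M=1$, so independence is the simplest END example. The following sketch (hypothetical thresholds, and a small slack on $M$ to absorb Monte Carlo error) checks the two defining inequalities empirically for an independent normal sample:

```python
import numpy as np

# Empirical check of the two END inequalities for an independent sample,
# where the definition holds with M = 1.  The thresholds y_i and the
# slack on M (to absorb Monte Carlo error) are arbitrary choices.
rng = np.random.default_rng(0)
n, size = 3, 200_000
Y = rng.standard_normal((size, n))        # columns play the role of Y_1, ..., Y_n
y = np.array([-0.5, 0.0, 0.7])            # hypothetical thresholds y_1, ..., y_n

joint_lower = np.mean(np.all(Y <= y, axis=1))   # P(Y_1 <= y_1, ..., Y_n <= y_n)
prod_lower = np.prod(np.mean(Y <= y, axis=0))   # prod_i P(Y_i <= y_i)
joint_upper = np.mean(np.all(Y > y, axis=1))    # P(Y_1 > y_1, ..., Y_n > y_n)
prod_upper = np.prod(np.mean(Y > y, axis=0))    # prod_i P(Y_i > y_i)

M = 1.05  # slack above the theoretical M = 1 for sampling noise
assert joint_lower <= M * prod_lower
assert joint_upper <= M * prod_upper
```

For a genuinely dependent END sequence the same empirical comparison would hold with some finite $M>1$; the point of the definition is only that a single constant works for all $n$ and all thresholds.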

The concept was introduced by Liu [7]. When $M=1$, the notion of END random variables reduces to the well-known notion of negatively dependent (ND) random variables, which was first introduced by Ebrahimi and Ghosh [8]; some properties and limit results can be found in Alam and Saxena [9], Block et al. [10], Joag-Dev and Proschan [11], and Wu and Zhu [12]. As mentioned in Liu [7], the END structure is substantially more comprehensive than the ND structure in that it can reflect not only a negative dependence structure but also, to some extent, a positive one. Liu [7] pointed out that END random variables can be taken as negatively or positively dependent and provided some interesting examples to support this idea. Joag-Dev and Proschan [11] also pointed out that negatively associated (NA) random variables must be ND, while ND random variables are not necessarily NA; thus NA random variables are END. A great number of articles on NA random variables have appeared in the literature, but only a few papers treat END random variables. For example, for END random variables with heavy tails, Liu [7] obtained precise large deviations and Liu [13] studied sufficient and necessary conditions for moderate deviations, while Qiu et al. [3] and Wu and Guan [14] studied complete convergence for weighted sums and for arrays of rowwise END random variables.

Now we state the main results; some lemmas and the proofs will be detailed in the next section.

Theorem 1.1 Let $p>1/\alpha$ and $1/2<\alpha \le 1$. Let $\left\{X,{X}_{n},n\ge 1\right\}$ be a sequence of identically distributed END random variables with $EX=0$ and $E{|X|}^{p}<\mathrm{\infty }$. Assume that $\left\{{a}_{ni},1\le i\le n,n\ge 1\right\}$ is an array of real numbers satisfying (1.1) for some $q>p$. Then (1.2) holds. Conversely, (1.2) implies $EX=0$ and $E{|X|}^{p}<\mathrm{\infty }$ if (1.2) holds for any array $\left\{{a}_{ni},1\le i\le n,n\ge 1\right\}$ with (1.1) for some $q>p$.

Remark 1.1 The key tool in the proof of Theorem A is the maximal Rosenthal moment inequality. However, it is not known whether the maximal Rosenthal moment inequality holds for an END sequence, so the proof of Theorem 1.1 differs from that of Theorem A.

Remark 1.2 Theorem 1.1 does not cover the interesting boundary case $p\alpha =1$. In fact, it is still an open problem whether (1.2) holds even for partial sums of an END sequence when $p\alpha =1$. We do, however, have the following partial result.

Theorem 1.2 Let $\left\{X,{X}_{n},n\ge 1\right\}$ be a sequence of identically distributed END random variables with $EX=0$. Assume that $\left\{{a}_{ni},1\le i\le n,n\ge 1\right\}$ is an array of real numbers satisfying (1.1) for some $q>1$. Then

$\sum _{n=1}^{\mathrm{\infty }}{n}^{-1}P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}{X}_{i}|>\epsilon n\right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.$
(1.3)

Conversely, (1.3) implies $EX=0$ if (1.3) holds for any array $\left\{{a}_{ni},1\le i\le n,n\ge 1\right\}$ with (1.1) for some $q>1$.

Throughout this paper, C always stands for a positive constant which may differ from one place to another.

## 2 Lemmas and proofs of main results

To prove the main results, we need the following lemmas.

Lemma 2.1 (see [7])

Let ${X}_{1},{X}_{2},\dots ,{X}_{n}$ be END random variables. Assume that ${f}_{1},{f}_{2},\dots ,{f}_{n}$ are Borel functions all of which are monotone increasing (or all are monotone decreasing). Then ${f}_{1}\left({X}_{1}\right),{f}_{2}\left({X}_{2}\right),\dots ,{f}_{n}\left({X}_{n}\right)$ are END random variables.

The following lemma is due to Chen et al. [15] when $1<r\le 2$ and to Shen [16] when $r\ge 2$.

Lemma 2.2 For any $r>1$, there is a positive constant ${C}_{r}$ depending only on r such that if $\left\{{X}_{n},n\ge 1\right\}$ is a sequence of END random variables with $E{X}_{n}=0$ for every $n\ge 1$, then, for all $n\ge 1$,

$E|\sum _{i=1}^{n}{X}_{i}{|}^{r}\le {C}_{r}\sum _{i=1}^{n}E{|{X}_{i}|}^{r}$

holds when $1<r\le 2$, and

$E|\sum _{i=1}^{n}{X}_{i}{|}^{r}\le {C}_{r}\left\{\sum _{i=1}^{n}E{|{X}_{i}|}^{r}+{\left(\sum _{i=1}^{n}E{|{X}_{i}|}^{2}\right)}^{r/2}\right\}$

holds when $r\ge 2$.

By Lemma 2.2 and the same argument as in the proof of Theorem 2.3.1 in Stout [17], the following lemma holds.

Lemma 2.3 For any $r>1$, there is a positive constant ${C}_{r}$ depending only on r such that if $\left\{{X}_{n},n\ge 1\right\}$ is a sequence of END random variables with $E{X}_{n}=0$ for every $n\ge 1$, then, for all $n\ge 1$,

$E\underset{1\le k\le n}{max}|\sum _{i=1}^{k}{X}_{i}{|}^{r}\le {C}_{r}{\left(logn\right)}^{r}\sum _{i=1}^{n}E{|{X}_{i}|}^{r}$

holds when $1<r\le 2$, and

$E\underset{1\le k\le n}{max}|\sum _{i=1}^{k}{X}_{i}{|}^{r}\le {C}_{r}{\left(logn\right)}^{r}\left\{\sum _{i=1}^{n}E{|{X}_{i}|}^{r}+{\left(\sum _{i=1}^{n}E{|{X}_{i}|}^{2}\right)}^{r/2}\right\}$

holds when $r\ge 2$.

Lemma 2.4 Let $p>1/\alpha$ and $1/2<\alpha \le 1$. Let $\left\{X,{X}_{n},n\ge 1\right\}$ be a sequence of identically distributed END random variables with $EX=0$ and $E{|X|}^{p}<\mathrm{\infty }$. Assume that $\left\{{a}_{ni},1\le i\le n,n\ge 1\right\}$ is an array of real numbers satisfying $|{a}_{ni}|\le 1$ for $1\le i\le n$ and $n\ge 1$. Then (1.2) holds.

Proof Without loss of generality, we can assume that

${a}_{ni}\ge 0,\phantom{\rule{1em}{0ex}}\mathrm{\forall }1\le i\le n,n\ge 1,$
(2.1)

from which it follows that

$\sum _{i=1}^{n}{a}_{ni}^{\tau }\le n,\phantom{\rule{1em}{0ex}}\mathrm{\forall }\tau \ge 1.$
(2.2)

Since $p>1/\alpha$ and $1/2<\alpha \le 1$, we have $0\le \left(1-\alpha \right)/\left(p\alpha -\alpha \right)<1$. We take $t$ such that $\left(1-\alpha \right)/\left(p\alpha -\alpha \right)<t<1$.
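Such a $t$ exists because the interval is nonempty; a one-line verification (added here for completeness):

```latex
\frac{1-\alpha}{p\alpha-\alpha} < 1
\;\Longleftrightarrow\;
1-\alpha < p\alpha-\alpha
\;\Longleftrightarrow\;
p\alpha > 1,
```

which holds by assumption.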

For $1\le i\le n$, $n\ge 1$, set

$\begin{array}{c}{X}_{ni}^{\left(1\right)}=-{n}^{t\alpha }I\left({X}_{i}<-{n}^{t\alpha }\right)+{X}_{i}I\left(|{X}_{i}|\le {n}^{t\alpha }\right)+{n}^{t\alpha }I\left({X}_{i}>{n}^{t\alpha }\right),\hfill \\ {X}_{ni}^{\left(2\right)}=\left({X}_{i}-{n}^{t\alpha }\right)I\left({n}^{t\alpha }<{X}_{i}\le {n}^{\alpha }\right)+{n}^{\alpha }I\left({X}_{i}>{n}^{\alpha }\right),\hfill \\ {X}_{ni}^{\left(3\right)}=\left({X}_{i}-{n}^{t\alpha }-{n}^{\alpha }\right)I\left({X}_{i}>{n}^{\alpha }\right),\hfill \\ {X}_{ni}^{\left(4\right)}=\left({X}_{i}+{n}^{t\alpha }\right)I\left(-{n}^{\alpha }\le {X}_{i}<-{n}^{t\alpha }\right)-{n}^{\alpha }I\left({X}_{i}<-{n}^{\alpha }\right),\hfill \\ {X}_{ni}^{\left(5\right)}=\left({X}_{i}+{n}^{t\alpha }+{n}^{\alpha }\right)I\left({X}_{i}<-{n}^{\alpha }\right).\hfill \end{array}$
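These five truncations reassemble $X_i$ exactly, i.e. ${X}_{i}={\sum }_{l=1}^{5}{X}_{ni}^{\left(l\right)}$; checking case by case on the range of $X_i$ (a routine verification, added for the reader):

```latex
\begin{cases}
|X_i|\le n^{t\alpha}: & X_{ni}^{(1)}=X_i,\quad X_{ni}^{(2)}=\cdots=X_{ni}^{(5)}=0,\\[2pt]
n^{t\alpha}<X_i\le n^{\alpha}: & X_{ni}^{(1)}=n^{t\alpha},\quad X_{ni}^{(2)}=X_i-n^{t\alpha},\\[2pt]
X_i>n^{\alpha}: & X_{ni}^{(1)}=n^{t\alpha},\quad X_{ni}^{(2)}=n^{\alpha},\quad X_{ni}^{(3)}=X_i-n^{t\alpha}-n^{\alpha},
\end{cases}
```

with the symmetric cases for negative $X_i$ handled by $X_{ni}^{(4)}$ and $X_{ni}^{(5)}$.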

Therefore

$\begin{array}{c}\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}{X}_{i}|>\epsilon {n}^{\alpha }\right)\hfill \\ \phantom{\rule{1em}{0ex}}\le \sum _{l=1}^{5}\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}{X}_{ni}^{\left(l\right)}|>\epsilon {n}^{\alpha }/5\right):=\sum _{l=1}^{5}{I}_{l}.\hfill \end{array}$

For ${I}_{3}$,

$\begin{array}{rl}{I}_{3}& \le \sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left(\bigcup _{i=1}^{n}\left({X}_{ni}^{\left(3\right)}\ne 0\right)\right)\le \sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}\sum _{i=1}^{n}P\left(|{X}_{i}|>{n}^{\alpha }\right)\\ =\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -1}P\left(|X|>{n}^{\alpha }\right)\le CE{|X|}^{p}<\mathrm{\infty }.\end{array}$
(2.3)
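The last inequality in (2.3) is the standard series–moment comparison, which applies since $p\alpha >1$; in outline:

```latex
\sum_{n=1}^{\infty} n^{p\alpha-1}P\bigl(|X|>n^{\alpha}\bigr)
=\sum_{n=1}^{\infty} n^{p\alpha-1}P\bigl(|X|^{1/\alpha}>n\bigr)
\le C\,E\bigl(|X|^{1/\alpha}\bigr)^{p\alpha}
= C\,E|X|^{p}.
```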

By the same argument as (2.3), we also have ${I}_{5}<\mathrm{\infty }$.

For ${I}_{1}$, by $EX=0$, Markov’s inequality, (2.1), (2.2), and $\left(1-\alpha \right)/\left(p\alpha -\alpha \right)<t<1$,

$\begin{array}{rl}{n}^{-\alpha }\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}E{X}_{ni}^{\left(1\right)}|& \le 2{n}^{-\alpha }\sum _{i=1}^{n}{a}_{ni}E|{X}_{i}|I\left(|{X}_{i}|>{n}^{t\alpha }\right)\\ \le 2{n}^{-\alpha }E|X|I\left(|X|>{n}^{t\alpha }\right)\sum _{i=1}^{n}{a}_{ni}\\ \le 2{n}^{1-\alpha -\left(p\alpha -\alpha \right)t}E{|X|}^{p}I\left(|X|>{n}^{t\alpha }\right)\to 0,\phantom{\rule{1em}{0ex}}n\to \mathrm{\infty }.\end{array}$
(2.4)

Hence, to prove ${I}_{1}<\mathrm{\infty }$, it is enough to show that

${I}_{1}^{\ast }=\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}\left({X}_{ni}^{\left(1\right)}-E{X}_{ni}^{\left(1\right)}\right)|>\epsilon {n}^{\alpha }/10\right)<\mathrm{\infty }.$

By the Markov inequality, Lemma 2.1, Lemma 2.3, the ${C}_{r}$-inequality, (2.1), and (2.2), for any $r\ge 2$,

$\begin{array}{rl}{I}_{1}^{\ast }& \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}E\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}\left({X}_{ni}^{\left(1\right)}-E{X}_{ni}^{\left(1\right)}\right){|}^{r}\\ \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}{\left(logn\right)}^{r}\left\{\sum _{i=1}^{n}{a}_{ni}^{r}E{|{X}_{ni}^{\left(1\right)}|}^{r}+{\left(\sum _{i=1}^{n}{a}_{ni}^{2}E{|{X}_{ni}^{\left(1\right)}|}^{2}\right)}^{r/2}\right\}\\ \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -1}{\left(logn\right)}^{r}E{|{X}_{n1}^{\left(1\right)}|}^{r}+C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2+r/2}{\left(logn\right)}^{r}{\left(E{|{X}_{n1}^{\left(1\right)}|}^{2}\right)}^{r/2}\\ :=C{I}_{11}^{\ast }+C{I}_{12}^{\ast }.\end{array}$
(2.5)

If $p\ge 2$, then $\left(p\alpha -1\right)/\left(\alpha -1/2\right)\ge p$. Taking r such that $r>\left(p\alpha -1\right)/\left(\alpha -1/2\right)$,

${I}_{12}^{\ast }\le \sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2+r/2}{\left(logn\right)}^{r}{\left(E{|X|}^{2}\right)}^{r/2}<\mathrm{\infty }.$
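The convergence of ${I}_{12}^{\ast }$ is exactly where the choice of $r$ matters: the polynomial exponent must fall below $-1$, which the logarithmic factor cannot spoil. Explicitly:

```latex
(p-r)\alpha - 2 + \frac{r}{2} < -1
\;\Longleftrightarrow\;
r\Bigl(\alpha-\frac{1}{2}\Bigr) > p\alpha - 1
\;\Longleftrightarrow\;
r > \frac{p\alpha-1}{\alpha-1/2}.
```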

Since $r>p$ and $t<1$, by the ${C}_{r}$-inequality, we get

$\begin{array}{rl}{I}_{11}^{\ast }& \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -1}{\left(logn\right)}^{r}\left\{E{|X|}^{r}I\left(|X|\le {n}^{t\alpha }\right)+{n}^{tr\alpha }P\left(|X|>{n}^{t\alpha }\right)\right\}\\ \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\left(1-t\right)\alpha -1}{\left(logn\right)}^{r}E{|X|}^{p}<\mathrm{\infty }.\end{array}$
(2.6)

If $p<2$, then we can take $r=2$; in this case ${I}_{11}^{\ast }={I}_{12}^{\ast }$ in (2.5). Since $r>p$ and $t<1$, (2.6) still holds. Therefore ${I}_{1}<\mathrm{\infty }$.

For ${I}_{2}$, note that ${I}_{2}={\sum }_{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left({\sum }_{i=1}^{n}{a}_{ni}{X}_{ni}^{\left(2\right)}>\epsilon {n}^{\alpha }/5\right)$, by (2.1) and (2.2),

$\begin{array}{rl}0& \le {n}^{-\alpha }\sum _{i=1}^{n}E\left({a}_{ni}{X}_{ni}^{\left(2\right)}\right)\le {n}^{1-\alpha }\left\{EXI\left({n}^{t\alpha }<X\le {n}^{\alpha }\right)+{n}^{\alpha }P\left(X>{n}^{\alpha }\right)\right\}\\ \le {n}^{1-\alpha }E|X|I\left(|X|>{n}^{t\alpha }\right)+nP\left(|X|>{n}^{\alpha }\right)\\ \le {n}^{1-\alpha -\left(p\alpha -\alpha \right)t}E{|X|}^{p}I\left(|X|>{n}^{t\alpha }\right)+{n}^{1-p\alpha }E{|X|}^{p}I\left(|X|>{n}^{\alpha }\right)\\ \to 0,\phantom{\rule{1em}{0ex}}n\to \mathrm{\infty }.\end{array}$
(2.7)

Hence, in order to prove ${I}_{2}<\mathrm{\infty }$, it is enough to show that

${I}_{2}^{\ast }=\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left(\sum _{i=1}^{n}{a}_{ni}\left({X}_{ni}^{\left(2\right)}-E{X}_{ni}^{\left(2\right)}\right)>\epsilon {n}^{\alpha }/10\right)<\mathrm{\infty }.$

By the Markov inequality, Lemma 2.1, Lemma 2.2, the ${C}_{r}$-inequality, (2.1), and (2.2), we have, for any $r\ge 2$,

$\begin{array}{rl}{I}_{2}^{\ast }& \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}\left\{\sum _{i=1}^{n}{a}_{ni}^{r}E|{X}_{ni}^{\left(2\right)}{|}^{r}+{\left(\sum _{i=1}^{n}{a}_{ni}^{2}E|{X}_{ni}^{\left(2\right)}{|}^{2}\right)}^{r/2}\right\}\\ \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -1}E|{X}_{n1}^{\left(2\right)}{|}^{r}+C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2+r/2}{\left(E|{X}_{n1}^{\left(2\right)}{|}^{2}\right)}^{r/2}\\ :=C{I}_{21}^{\ast }+C{I}_{22}^{\ast }.\end{array}$
(2.8)

If $p\ge 2$, we take r such that $r>\left(p\alpha -1\right)/\left(\alpha -1/2\right)$. It follows that

${I}_{22}^{\ast }\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2+r/2}{\left(E{|X|}^{2}\right)}^{r/2}<\mathrm{\infty }.$

Since $r>p$, we get by (2.9) of Sung [5]

${I}_{21}^{\ast }\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -1}E{|X|}^{r}I\left(|X|\le {n}^{\alpha }\right)+C\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -1}P\left(|X|>{n}^{\alpha }\right)\le CE{|X|}^{p}<\mathrm{\infty }.$
(2.9)

If $p<2$, then we take $r=2$; in this case ${I}_{21}^{\ast }={I}_{22}^{\ast }$. Since $r>p$, (2.9) still holds. Therefore, ${I}_{2}<\mathrm{\infty }$. Similar to the proof of ${I}_{2}<\mathrm{\infty }$, we also have ${I}_{4}<\mathrm{\infty }$. Thus, (1.2) holds. □

Lemma 2.5 Let $p>1/\alpha$ and $1/2<\alpha \le 1$. Let $\left\{X,{X}_{n},n\ge 1\right\}$ be a sequence of identically distributed END random variables with $EX=0$ and $E{|X|}^{p}<\mathrm{\infty }$. Assume that $\left\{{a}_{ni},1\le i\le n,n\ge 1\right\}$ is an array of real numbers satisfying (1.1) for some $q>p$ and ${a}_{ni}=0$ or $|{a}_{ni}|>1$. Then (1.2) holds.

Proof Let t be as in Lemma 2.4. Without loss of generality, we may assume that ${a}_{ni}\ge 0$ and ${\sum }_{i=1}^{n}{a}_{ni}^{q}\le n$ for some $q>p$, thus, we have

$\sum _{i=1}^{n}{a}_{ni}^{\tau }\le n,\phantom{\rule{1em}{0ex}}\mathrm{\forall }0<\tau \le q.$
(2.10)
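Inequality (2.10) uses the assumed dichotomy on the weights: each $a_{ni}$ is either $0$ or greater than $1$, so $a_{ni}^{\tau}\le a_{ni}^{q}$ whenever $0<\tau\le q$, and therefore

```latex
\sum_{i=1}^{n} a_{ni}^{\tau}
\le \sum_{i=1}^{n} a_{ni}^{q}
\le n,
\qquad 0<\tau\le q.
```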

As in the proof of Lemma 2.4 of Sung [5], we may assume that (2.10) holds for some $q$ with $p<q\le 2$ when $p<2$. For $1\le i\le n$, $n\ge 1$, set

$\begin{array}{c}{X}_{ni}^{\left(1\right)}=-{n}^{t\alpha }I\left({a}_{ni}{X}_{i}<-{n}^{t\alpha }\right)+{a}_{ni}{X}_{i}I\left(|{a}_{ni}{X}_{i}|\le {n}^{t\alpha }\right)+{n}^{t\alpha }I\left({a}_{ni}{X}_{i}>{n}^{t\alpha }\right),\hfill \\ {X}_{ni}^{\left(2\right)}=\left({a}_{ni}{X}_{i}-{n}^{t\alpha }\right)I\left({n}^{t\alpha }<{a}_{ni}{X}_{i}\le {n}^{\alpha }\right)+{n}^{\alpha }I\left({a}_{ni}{X}_{i}>{n}^{\alpha }\right),\hfill \\ {X}_{ni}^{\left(3\right)}=\left({a}_{ni}{X}_{i}-{n}^{t\alpha }-{n}^{\alpha }\right)I\left({a}_{ni}{X}_{i}>{n}^{\alpha }\right),\hfill \\ {X}_{ni}^{\left(4\right)}=\left({a}_{ni}{X}_{i}+{n}^{t\alpha }\right)I\left(-{n}^{\alpha }\le {a}_{ni}{X}_{i}<-{n}^{t\alpha }\right)-{n}^{\alpha }I\left({a}_{ni}{X}_{i}<-{n}^{\alpha }\right),\hfill \\ {X}_{ni}^{\left(5\right)}=\left({a}_{ni}{X}_{i}+{n}^{t\alpha }+{n}^{\alpha }\right)I\left({a}_{ni}{X}_{i}<-{n}^{\alpha }\right).\hfill \end{array}$

Therefore

$\begin{array}{c}\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}{X}_{i}|>\epsilon {n}^{\alpha }\right)\hfill \\ \phantom{\rule{1em}{0ex}}\le \sum _{l=1}^{5}\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{X}_{ni}^{\left(l\right)}|>\epsilon {n}^{\alpha }/5\right):=\sum _{l=1}^{5}{I}_{l}.\hfill \end{array}$

By the proof of Lemma 2.4 in Sung [5], we have

$\begin{array}{rl}{I}_{3}& \le \sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left(\bigcup _{i=1}^{n}\left({X}_{ni}^{\left(3\right)}\ne 0\right)\right)\\ \le \sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}\sum _{i=1}^{n}P\left(|{a}_{ni}{X}_{i}|>{n}^{\alpha }\right)\le CE{|X|}^{p}<\mathrm{\infty }.\end{array}$
(2.11)

Similarly, we have ${I}_{5}<\mathrm{\infty }$.

For ${I}_{1}$, since $E{X}_{i}=0$, $p>1/\alpha$, $1/2<\alpha \le 1$, $\left(1-\alpha \right)/\left(p\alpha -\alpha \right)<t<1$, and (2.10), we get

$\begin{array}{rl}{n}^{-\alpha }\underset{1\le j\le n}{max}|\sum _{i=1}^{j}E{X}_{ni}^{\left(1\right)}|& \le 2{n}^{-\alpha }\sum _{i=1}^{n}E|{a}_{ni}{X}_{i}|I\left(|{a}_{ni}{X}_{i}|>{n}^{t\alpha }\right)\\ \le 2{n}^{-\alpha -\left(p-1\right)t\alpha }\sum _{i=1}^{n}{a}_{ni}^{p}E|{X}_{i}{|}^{p}I\left(|{a}_{ni}{X}_{i}|>{n}^{t\alpha }\right)\\ \le C{n}^{-\alpha -\left(p-1\right)t\alpha }\sum _{i=1}^{n}{a}_{ni}^{p}\le C{n}^{1-\alpha -\left(p-1\right)t\alpha }\to 0,\phantom{\rule{1em}{0ex}}n\to \mathrm{\infty }.\end{array}$
(2.12)

Hence, in order to prove ${I}_{1}<\mathrm{\infty }$, it is enough to show that

${I}_{1}^{\ast }=\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}\left({X}_{ni}^{\left(1\right)}-E{X}_{ni}^{\left(1\right)}\right)|>\epsilon {n}^{\alpha }/10\right)<\mathrm{\infty }.$

Similar to the proof of (2.5), we have, for any $r\ge 2$,

$\begin{array}{rl}{I}_{1}^{\ast }& \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}{\left(logn\right)}^{r}\sum _{i=1}^{n}E{|{X}_{ni}^{\left(1\right)}|}^{r}+C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}{\left(logn\right)}^{r}{\left(\sum _{i=1}^{n}E{|{X}_{ni}^{\left(1\right)}|}^{2}\right)}^{r/2}\\ :=C{I}_{11}^{\ast }+C{I}_{12}^{\ast }.\end{array}$
(2.13)

If $p\ge 2$, we take r such that $r>\left(p\alpha -1\right)/\left(\alpha -1/2\right)$. By (2.10)

${I}_{12}^{\ast }\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}{\left(logn\right)}^{r}{\left(\sum _{i=1}^{n}{a}_{ni}^{2}E{|{X}_{1}|}^{2}\right)}^{r/2}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2+r/2}{\left(logn\right)}^{r}<\mathrm{\infty }.$

Since $r>p$ and $0<t<1$, we get by (2.10)

$\begin{array}{rl}{I}_{11}^{\ast }& \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}{\left(logn\right)}^{r}\left\{\sum _{i=1}^{n}\left(E{|{a}_{ni}{X}_{i}|}^{r}I\left(|{a}_{ni}{X}_{i}|\le {n}^{t\alpha }\right)+{n}^{tr\alpha }P\left(|{a}_{ni}{X}_{i}|>{n}^{t\alpha }\right)\right)\right\}\\ \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}{\left(logn\right)}^{r}\left\{\sum _{i=1}^{n}{n}^{\left(r-p\right)t\alpha }{a}_{ni}^{p}E{|{X}_{i}|}^{p}\right\}\\ \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\left(1-t\right)\alpha -1}{\left(logn\right)}^{r}E{|X|}^{p}<\mathrm{\infty }.\end{array}$
(2.14)

If $p<2$, then we take $r=2$; in this case ${I}_{11}^{\ast }={I}_{12}^{\ast }$ in (2.13). Since $r>p$ and $t<1$, (2.14) still holds. Therefore ${I}_{1}<\mathrm{\infty }$.

For ${I}_{2}$, since $\left(1-\alpha \right)/\left(p\alpha -\alpha \right)<t<1$, we have by (2.10)

$\begin{array}{rl}0& \le {n}^{-\alpha }\sum _{i=1}^{n}E\left({X}_{ni}^{\left(2\right)}\right)\le {n}^{-\alpha }\sum _{i=1}^{n}\left\{E{a}_{ni}{X}_{i}I\left({n}^{t\alpha }<{a}_{ni}{X}_{i}\le {n}^{\alpha }\right)+{n}^{\alpha }P\left({a}_{ni}{X}_{i}>{n}^{\alpha }\right)\right\}\\ \le \sum _{i=1}^{n}\left\{{n}^{-\alpha }E{a}_{ni}{X}_{i}I\left({a}_{ni}{X}_{i}>{n}^{t\alpha }\right)+P\left({a}_{ni}{X}_{i}>{n}^{\alpha }\right)\right\}\\ \le \sum _{i=1}^{n}\left\{{n}^{-\left(p-1\right)t\alpha -\alpha }E{|{a}_{ni}{X}_{i}|}^{p}I\left(|{a}_{ni}{X}_{i}|>{n}^{t\alpha }\right)+{n}^{-p\alpha }E{|{a}_{ni}{X}_{i}|}^{p}I\left(|{a}_{ni}{X}_{i}|>{n}^{\alpha }\right)\right\}\\ \le C\sum _{i=1}^{n}{a}_{ni}^{p}\left({n}^{-\left(p-1\right)t\alpha -\alpha }+{n}^{-p\alpha }\right)\\ \le C{n}^{1-\alpha -\left(p-1\right)t\alpha }+C{n}^{1-p\alpha }\to 0,\phantom{\rule{1em}{0ex}}n\to \mathrm{\infty }.\end{array}$
(2.15)

Hence, in order to prove ${I}_{2}<\mathrm{\infty }$, it is enough to show that

${I}_{2}^{\ast }=\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}P\left(\sum _{i=1}^{n}\left({X}_{ni}^{\left(2\right)}-E{X}_{ni}^{\left(2\right)}\right)>\epsilon {n}^{\alpha }/10\right)<\mathrm{\infty }.$

Similar to the proof of (2.8), we have for any $r\ge 2$

$\begin{array}{rl}{I}_{2}^{\ast }& \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}\sum _{i=1}^{n}E|{X}_{ni}^{\left(2\right)}{|}^{r}+C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}{\left(\sum _{i=1}^{n}E|{X}_{ni}^{\left(2\right)}{|}^{2}\right)}^{r/2}\\ :=C{I}_{21}^{\ast }+C{I}_{22}^{\ast }.\end{array}$
(2.16)

If $p\ge 2$, we take r such that $r>max\left\{\left(p\alpha -1\right)/\left(\alpha -1/2\right),q\right\}$. By (2.10), we have

$\begin{array}{rl}{I}_{22}^{\ast }& \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}{\left(\sum _{i=1}^{n}E\left\{|{a}_{ni}{X}_{i}{|}^{2}I\left(|{a}_{ni}{X}_{i}|\le {n}^{\alpha }\right)+{n}^{2\alpha }P\left(|{a}_{ni}{X}_{i}|>{n}^{\alpha }\right)\right\}\right)}^{r/2}\\ \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}{\left(\sum _{i=1}^{n}{a}_{ni}^{2}E{|{X}_{i}|}^{2}\right)}^{r/2}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2+r/2}<\mathrm{\infty },\end{array}$

and, by the ${C}_{r}$-inequality, (2.21)-(2.23) of Sung [5], and (2.16), we get

$\begin{array}{rl}{I}_{21}^{\ast }& \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}\sum _{i=1}^{n}E\left\{{\left({a}_{ni}{X}_{i}\right)}^{r}I\left({n}^{t\alpha }<{a}_{ni}{X}_{i}\le {n}^{\alpha }\right)+{n}^{r\alpha }I\left({a}_{ni}{X}_{i}>{n}^{\alpha }\right)\right\}\\ \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\left(p-r\right)\alpha -2}\sum _{i=1}^{n}E|{a}_{ni}{X}_{i}{|}^{r}I\left(|{a}_{ni}{X}_{i}|\le {n}^{\alpha }\right)+C\sum _{n=1}^{\mathrm{\infty }}{n}^{p\alpha -2}\sum _{i=1}^{n}P\left(|{a}_{ni}{X}_{i}|>{n}^{\alpha }\right)\\ \le CE{|X|}^{p}<\mathrm{\infty }.\end{array}$
(2.17)

If $p<2$, then we take $r=2$; in this case ${I}_{21}^{\ast }={I}_{22}^{\ast }$ in (2.16). As in the proof of Lemma 2.4 of Sung [5], (2.17) still holds. Therefore, ${I}_{2}<\mathrm{\infty }$. Similar to the proof of ${I}_{2}<\mathrm{\infty }$, we have ${I}_{4}<\mathrm{\infty }$. Thus, (1.2) holds. □

Proof of Theorem 1.1 By Lemmas 2.4 and 2.5, the proof is similar to that in Sung [5], so we omit the details. □

Proof of Theorem 1.2 Sufficiency. Without loss of generality, we can assume that ${a}_{ni}\ge 0$ and that (1.1) holds for some $1<q\le 2$ by the Hölder inequality. We first prove that

$\sum _{n=1}^{\mathrm{\infty }}{n}^{-1}P\left(|\sum _{i=1}^{n}{a}_{ni}{X}_{i}|>\epsilon n\right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.$
(2.18)

For $1\le i\le n$, $n\ge 1$, set

${X}_{ni}^{\left(1\right)}=-nI\left({X}_{i}<-n\right)+{X}_{i}I\left(|{X}_{i}|\le n\right)+nI\left({X}_{i}>n\right),\phantom{\rule{2em}{0ex}}{X}_{ni}^{\left(2\right)}={X}_{i}-{X}_{ni}^{\left(1\right)}.$

Since $EX=0$, by the Hölder inequality,

${n}^{-1}|E\sum _{i=1}^{n}{a}_{ni}{X}_{ni}^{\left(1\right)}|\le CE|X|I\left(|X|>n\right)\to 0.$

Hence to prove (2.18), it is enough to show that for any $\epsilon >0$

${I}_{1}=\sum _{n=1}^{\mathrm{\infty }}{n}^{-1}P\left(|\sum _{i=1}^{n}{a}_{ni}\left({X}_{ni}^{\left(1\right)}-E{X}_{ni}^{\left(1\right)}\right)|>\epsilon n\right)<\mathrm{\infty }$

and

${I}_{2}=\sum _{n=1}^{\mathrm{\infty }}{n}^{-1}P\left(|\sum _{i=1}^{n}{a}_{ni}{X}_{ni}^{\left(2\right)}|>\epsilon n\right)<\mathrm{\infty }.$

By the Markov inequality, Lemma 2.3, the ${C}_{r}$-inequality, (1.1), and a standard computation

$\begin{array}{rl}{I}_{1}& \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{-1-q}E|\sum _{i=1}^{n}{a}_{ni}\left({X}_{ni}^{\left(1\right)}-E{X}_{ni}^{\left(1\right)}\right){|}^{q}\\ \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{-1-q}\left(\sum _{i=1}^{n}{|{a}_{ni}|}^{q}\right)\left\{E{|X|}^{q}I\left(|X|\le n\right)+{n}^{q}P\left(|X|>n\right)\right\}\\ \le C\sum _{n=1}^{\mathrm{\infty }}{n}^{-q}E{|X|}^{q}I\left(|X|\le n\right)+C\sum _{n=1}^{\mathrm{\infty }}P\left(|X|>n\right)\\ \le CE|X|<\mathrm{\infty }.\end{array}$

Obviously,

${I}_{2}\le \sum _{n=1}^{\mathrm{\infty }}P\left(|X|>n\right)\le CE|X|<\mathrm{\infty }.$

To prove (1.3), it is enough to prove that

$\sum _{n=1}^{\mathrm{\infty }}{n}^{-1}P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}\left({X}_{i}^{+}-E{X}_{i}^{+}\right)|>\epsilon n\right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0$
(2.19)

and

$\sum _{n=1}^{\mathrm{\infty }}{n}^{-1}P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}\left({X}_{i}^{-}-E{X}_{i}^{-}\right)|>\epsilon n\right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0,$
(2.20)

where ${x}^{+}=max\left\{0,x\right\}$ and ${x}^{-}={\left(-x\right)}^{+}$.

Let $\epsilon >0$ be given. By $E{X}^{+}\le E|X|<\mathrm{\infty }$, (1.1), and the Hölder inequality, there exists a constant $x=x\left(\epsilon \right)>0$ such that

${n}^{-1}\sum _{i=1}^{n}{a}_{ni}E\left({X}_{i}^{+}-x\right)I\left({X}_{i}^{+}>x\right)={n}^{-1}\left(\sum _{i=1}^{n}{a}_{ni}\right)E\left\{\left({X}^{+}-x\right)I\left({X}^{+}>x\right)\right\}\le \epsilon /6.$
(2.21)

Set

${X}_{i,x}^{\left(1\right)}={X}_{i}^{+}I\left({X}_{i}^{+}\le x\right)+xI\left({X}_{i}^{+}>x\right),\phantom{\rule{2em}{0ex}}{X}_{i,x}^{\left(2\right)}={X}_{i}^{+}-{X}_{i,x}^{\left(1\right)}.$

Note that by (2.21)

$\begin{array}{c}\underset{1\le j\le n}{max}{n}^{-1}|\sum _{i=1}^{j}{a}_{ni}\left({X}_{i}^{+}-E{X}_{i}^{+}\right)|\hfill \\ \phantom{\rule{1em}{0ex}}\le \underset{1\le j\le n}{max}{n}^{-1}|\sum _{i=1}^{j}{a}_{ni}\left({X}_{i,x}^{\left(1\right)}-E{X}_{i,x}^{\left(1\right)}\right)|+\underset{1\le j\le n}{max}{n}^{-1}|\sum _{i=1}^{j}{a}_{ni}\left({X}_{i,x}^{\left(2\right)}-E{X}_{i,x}^{\left(2\right)}\right)|\hfill \\ \phantom{\rule{1em}{0ex}}\le \underset{1\le j\le n}{max}{n}^{-1}|\sum _{i=1}^{j}{a}_{ni}\left({X}_{i,x}^{\left(1\right)}-E{X}_{i,x}^{\left(1\right)}\right)|+{n}^{-1}|\sum _{i=1}^{n}{a}_{ni}\left({X}_{i,x}^{\left(2\right)}-E{X}_{i,x}^{\left(2\right)}\right)|+\epsilon /3.\hfill \end{array}$

Therefore, to prove (2.19), it is enough to prove that

${I}_{3}=\sum _{n=1}^{\mathrm{\infty }}{n}^{-1}P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}\left({X}_{i,x}^{\left(1\right)}-E{X}_{i,x}^{\left(1\right)}\right)|>\epsilon n/3\right)<\mathrm{\infty }$

and

${I}_{4}=\sum _{n=1}^{\mathrm{\infty }}{n}^{-1}P\left(|\sum _{i=1}^{n}{a}_{ni}\left({X}_{i,x}^{\left(2\right)}-E{X}_{i,x}^{\left(2\right)}\right)|>\epsilon n/3\right)<\mathrm{\infty }.$

By the Markov inequality, Lemma 2.3, and (1.1),

${I}_{3}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{-1-q}E\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{a}_{ni}\left({X}_{i,x}^{\left(1\right)}-E{X}_{i,x}^{\left(1\right)}\right){|}^{q}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{-q}{\left(logn\right)}^{q}<\mathrm{\infty }.$

By Lemma 2.1, $\left\{{X}_{i,x}^{\left(2\right)}-E{X}_{i,x}^{\left(2\right)},i\ge 1\right\}$ is a sequence of identically distributed END random variables with zero mean. Then ${I}_{4}<\mathrm{\infty }$ follows by applying (2.18) with $\left\{{X}_{i,x}^{\left(2\right)}-E{X}_{i,x}^{\left(2\right)},i\ge 1\right\}$ in place of $\left\{{X}_{i},i\ge 1\right\}$. Hence (2.19) holds.

The proof of (2.20) is the same as that of (2.19).

Necessity. The proof is similar to that of Theorem 2.2 in Sung [5]; we omit the details. This completes the proof. □

## References

1. Hsu P, Robbins H: Complete convergence and the law of large numbers. Proc. Natl. Acad. Sci. USA 1947, 33: 25–31. 10.1073/pnas.33.2.25

2. Chen P, Hu T-C, Volodin A: Limiting behavior of moving average processes under negative association. Teor. Imovir. Mat. Stat. 2007, 7: 154–166.

3. Qiu D, Chen P, Antonini RG, Volodin A: On the complete convergence for arrays of rowwise extended negatively dependent random variables. J. Korean Math. Soc. 2013,50(2):379–392. 10.4134/JKMS.2013.50.2.379

4. Sung SH: Complete convergence for weighted sums of random variables. Stat. Probab. Lett. 2007,77(3):303–311. 10.1016/j.spl.2006.07.010

5. Sung SH: Complete convergence for weighted sums of ${\rho }^{\ast }$-mixing random variables. Discrete Dyn. Nat. Soc. 2010, 2010: Article ID 630608. 10.1155/2010/630608

6. Zhang L, Wang J: A note on complete convergence of pairwise NQD random sequences. Appl. Math. J. Chin. Univ. Ser. A 2004,19(2):203–208. 10.1007/s11766-004-0055-4

7. Liu L: Precise large deviations for dependent random variables with heavy tails. Stat. Probab. Lett. 2009,79(9):1290–1298. 10.1016/j.spl.2009.02.001

8. Ebrahimi N, Ghosh M: Multivariate negative dependence. Commun. Stat., Theory Methods 1981, 10: 307–336. 10.1080/03610928108828041

9. Alam K, Saxena KML: Positive dependence in multivariate distributions. Commun. Stat., Theory Methods 1981,10(12):1183–1196. 10.1080/03610928108828102

10. Block HW, Savits TH, Shaked M: Some concepts of negative dependence. Ann. Probab. 1982,10(3):765–772. 10.1214/aop/1176993784

11. Joag-Dev K, Proschan F: Negative association of random variables with applications. Ann. Stat. 1983,11(1):286–295. 10.1214/aos/1176346079

12. Wu YF, Zhu DJ: Convergence properties of partial sums for arrays of rowwise negatively orthant dependent random variables. J. Korean Stat. Soc. 2010, 39: 189–197. 10.1016/j.jkss.2009.05.003

13. Liu L: Necessary and sufficient conditions for moderate deviations of dependent random variables with heavy tails. Sci. China Math. 2010,53(6):1421–1434. 10.1007/s11425-010-4012-9

14. Wu YF, Guan M: Convergence properties of the partial sums for sequences of END random variables. J. Korean Math. Soc. 2012, 49: 1097–1110. 10.4134/JKMS.2012.49.6.1097

15. Chen P, Bai P, Sung SH: The von Bahr-Esseen moment inequality for pairwise independent random variables and applications. J. Math. Anal. Appl. 2014, 419: 1290–1302. 10.1016/j.jmaa.2014.05.067

16. Shen A: Probability inequalities for END sequence and their applications. J. Inequal. Appl. 2011., 2011: Article ID 98 10.1186/1029-242X-2011-98

17. Stout WF: Almost Sure Convergence. Academic Press, New York; 1974.

## Acknowledgements

The author would like to thank the referees and the editors for their helpful comments and suggestions. The research is supported by the General Project of the Science and Technology Department of Hunan Province (2013NK3017) and by the Agriculture Science and Technology Supporting Programme of the Hengyang Municipal Bureau of Science and Technology (2013KN36).

## Author information

Authors

### Corresponding author

Correspondence to Guohui Zhang. 