
# A note on the complete convergence for weighted sums of negatively dependent random variables

Journal of Inequalities and Applications 2012, 2012:158

https://doi.org/10.1186/1029-242X-2012-158

• Received: 7 April 2012
• Accepted: 27 June 2012
• Published:

## Abstract

The complete convergence theorems for weighted sums of arrays of rowwise negatively dependent random variables were obtained by Wu (Wu, Q: Complete convergence for weighted sums of sequences of negatively dependent random variables. J. Probab. Stat. 2011, Article ID 202015, 16 pages) and Wu (Wu, Q: A complete convergence theorem for weighted sums of arrays of rowwise negatively dependent random variables. J. Inequal. Appl. 2012, 50). In this paper, we complement the results of Wu.

MSC: 60F15.

## Keywords

• complete convergence
• weighted sums
• negatively dependent random variables

## 1 Introduction

The concept of complete convergence of a sequence of random variables was introduced by Hsu and Robbins. A sequence $\{X_{n}, n\ge 1\}$ of random variables converges completely to the constant $\theta$ if

$\sum _{n=1}^{\infty }P\left(|X_{n}-\theta |>\epsilon \right)<\infty$ for all $\epsilon >0$.

By the Borel-Cantelli lemma, this implies that $X_{n}\to \theta$ almost surely (a.s.). The converse is true if $\{X_{n}, n\ge 1\}$ are independent random variables. Complete convergence is therefore an important tool for establishing almost sure convergence. There are many complete convergence theorems for sums and weighted sums of independent random variables.
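For completeness, the Borel-Cantelli step can be written out; this is the standard argument, not specific to this paper:

```latex
% For each fixed \epsilon > 0, summability of the tail probabilities gives
\sum_{n=1}^{\infty} P\bigl(|X_n-\theta| > \epsilon\bigr) < \infty
  \;\Longrightarrow\; P\bigl(|X_n-\theta| > \epsilon \ \text{i.o.}\bigr) = 0
  \;\Longrightarrow\; \limsup_{n\to\infty} |X_n-\theta| \le \epsilon \ \text{a.s.}
% Taking \epsilon = 1/k, k = 1, 2, \ldots, and intersecting the resulting
% probability-one events yields X_n \to \theta almost surely.
```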

Volodin et al. and Chen et al. (for $\beta >-1$ and $\beta =-1$, respectively) proved the following complete convergence result for weighted sums of arrays of rowwise independent random elements in a real separable Banach space.

We recall that the array $\{X_{ni}, i\ge 1, n\ge 1\}$ of random variables is said to be stochastically dominated by a random variable $X$ if

$P\left(|X_{ni}|>x\right)\le CP\left(|X|>x\right)$ for all $x\ge 0$, $i\ge 1$, and $n\ge 1$,

where C is a positive constant.

Theorem 1.1 ([2, 3])

Suppose that $\beta \ge -1$. Let $\{X_{ni}, i\ge 1, n\ge 1\}$ be an array of rowwise independent random elements in a real separable Banach space which are stochastically dominated by a random variable X. Let $\{a_{ni}, i\ge 1, n\ge 1\}$ be an array of constants satisfying
$\sup _{i\ge 1}|a_{ni}|=O\left(n^{-\gamma }\right)$ for some $\gamma >0$
(1.1)
and
$\sum _{i=1}^{\infty }{|a_{ni}|}^{\theta }=O\left(n^{\mu }\right)$
(1.2)
for some $0<\theta \le 2$ and μ such that $\theta +\mu /\gamma <2$ and $1+\mu +\beta >0$. If $E{|X|}^{\theta +\left(1+\mu +\beta \right)/\gamma }<\infty$ and $\sum _{i=1}^{\infty }a_{ni}X_{ni}\to 0$ in probability, then
$\sum _{n=1}^{\infty }n^{\beta }P\left(\parallel \sum _{i=1}^{\infty }a_{ni}X_{ni}\parallel >\epsilon \right)<\infty$ for all $\epsilon >0$.
(1.3)

If $\beta <-1$, then (1.3) is immediate. Hence Theorem 1.1 is of interest only for $\beta \ge -1$.

Recently, Wu extended Theorem 1.1 to negatively dependent random variables when $\beta >-1$, and also considered the case of $1+\mu +\beta =0$ ($\beta >-1$). However, the proof given there does not work for the case $\beta =-1$.

The concept of negatively dependent random variables was introduced by Lehmann. A finite family of random variables $\{X_{1},\dots ,X_{n}\}$, $n\ge 2$, is said to be negatively dependent (or negatively orthant dependent) if the following two inequalities hold:
$P\left({X}_{1}\le {x}_{1},\dots ,{X}_{n}\le {x}_{n}\right)\le \prod _{i=1}^{n}P\left({X}_{i}\le {x}_{i}\right)$
and
$P\left({X}_{1}>{x}_{1},\dots ,{X}_{n}>{x}_{n}\right)\le \prod _{i=1}^{n}P\left({X}_{i}>{x}_{i}\right)$

for all real numbers ${x}_{1},\dots ,{x}_{n}$. An infinite family of random variables is negatively dependent if every finite subfamily is negatively dependent.
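As a concrete illustration (ours, not the paper's), the antithetic pair $X_{1}\sim \text{Bernoulli}(1/2)$, $X_{2}=1-X_{1}$ is negatively dependent; both defining inequalities can be checked by enumeration over a grid of thresholds covering all jump points:

```python
from itertools import product

# Joint law of the illustrative antithetic pair: X2 = 1 - X1, X1 ~ Bernoulli(1/2).
joint = {(0, 1): 0.5, (1, 0): 0.5}

def prob(pred):
    """Probability of an event described by a predicate on (x1, x2)."""
    return sum(p for (x1, x2), p in joint.items() if pred(x1, x2))

# Check both negative-dependence inequalities on thresholds around the atoms.
for t1, t2 in product([-0.5, 0, 0.5, 1, 1.5], repeat=2):
    le_joint = prob(lambda a, b: a <= t1 and b <= t2)
    le_prod = prob(lambda a, b: a <= t1) * prob(lambda a, b: b <= t2)
    gt_joint = prob(lambda a, b: a > t1 and b > t2)
    gt_prod = prob(lambda a, b: a > t1) * prob(lambda a, b: b > t2)
    assert le_joint <= le_prod + 1e-12
    assert gt_joint <= gt_prod + 1e-12
print("both ND inequalities hold for the antithetic pair")
```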

Theorem 1.2 (Wu)

Suppose that $\beta >-1$. Let $\{X_{ni}, i\ge 1, n\ge 1\}$ be an array of negatively dependent random variables which are stochastically dominated by a random variable X. Let $\{a_{ni}, i\ge 1, n\ge 1\}$ be an array of constants satisfying (1.1) for some $\gamma >0$ and (1.2) for some θ and μ such that $\mu <2\gamma$ and $0<\theta <2-\mu /\gamma$. Furthermore, assume that $EX_{ni}=0$ for all $i\ge 1$ and $n\ge 1$ if $\theta +\left(1+\mu +\beta \right)/\gamma \ge 1$.
1. (i)
If $1+\mu +\beta >0$ and $E{|X|}^{\theta +\left(1+\mu +\beta \right)/\gamma }<\infty$, then
$\sum _{n=1}^{\infty }n^{\beta }P\left(|\sum _{i=1}^{\infty }a_{ni}X_{ni}|>\epsilon \right)<\infty$ for all $\epsilon >0$.
(1.4)

2. (ii)

If $1+\mu +\beta =0$ and $E{|X|}^{\theta }log|X|<\mathrm{\infty }$, then (1.4) holds.

Using the moment inequality for negatively dependent random variables, Wu obtained a complete convergence result for weighted sums of identically distributed negatively dependent random variables.

Theorem 1.3 (Wu)

Suppose that $r>1$. Let $\left\{X,{X}_{n},n\ge 1\right\}$ be a sequence of identically distributed negatively dependent random variables. Let $\left\{{a}_{ni},1\le i\le n,n\ge 1\right\}$ be an array of constants satisfying
(1.5)
and
$\sum _{i=1}^{n}{|{a}_{ni}|}^{2\left(r-1\right)}=O\left(1\right).$
(1.6)
Furthermore, assume that $EX=0$ if $2\left(r-1\right)\ge 1$. Then, for $r\ge 2$,
$E{|X|}^{2\left(r-1\right)}log|X|<\mathrm{\infty }$
(1.7)
if and only if
$\sum _{n=1}^{\infty }n^{r-2}P\left(\max _{1\le k\le n}|\sum _{i=1}^{k}a_{ni}X_{i}|>\epsilon n^{1/2}\right)<\infty$ for all $\epsilon >0$.
(1.8)

For $1<r<2$, (1.7) implies (1.8).

In (1.5), $a\approx b$ means that $a=O\left(b\right)$ and $b=O\left(a\right)$. Theorem 1.3 extends a result of Liang and Su for negatively associated random variables to the negatively dependent case. The proof of the sufficiency part in Liang and Su is mistakenly based on the fact that (1.8) implies that

The proof of the sufficiency is correct when $r\ge 2$. However, condition (1.5) cannot hold: as $m\to \infty$, the left-hand side of (1.5) converges to $\#\{1\le i\le n:a_{ni}\ne 0\}$, while the right-hand side diverges. Hence no array satisfies (1.5).

In this paper, we obtain complete convergence results for weighted sums of arrays of rowwise negatively dependent random variables. Our results complement the results of Wu [4, 6].

Throughout this paper, the symbol C denotes a positive constant which is not necessarily the same at each appearance. It proves convenient to define $\log x=\max \{1,\ln x\}$, where $\ln x$ denotes the natural logarithm.
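This convention keeps logarithmic moment conditions such as $E|X|^{p}\log |X|<\infty$ well behaved near zero; a quick sketch of the clipped logarithm:

```python
import math

def paper_log(x):
    """The paper's convention: log x = max{1, ln x} (so log x = 1 for 0 < x <= e)."""
    return max(1.0, math.log(x))

# Values of |X| below e contribute the factor 1, so |X|^p * log|X| never
# picks up a negative or unbounded factor from small |X|.
assert paper_log(0.1) == 1.0
assert abs(paper_log(math.e ** 2) - 2.0) < 1e-9  # ln(e^2) = 2
```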

## 2 Preliminary lemmas

In this section, we present some lemmas which will be used to prove our main results.

The following two lemmas are well known and their proofs are standard.

Lemma 2.1 Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of random variables which are stochastically dominated by a random variable X. For any $\alpha >0$ and $b>0$, the following statements hold:
1. (i)

$E{|{X}_{n}|}^{\alpha }I\left(|{X}_{n}|\le b\right)\le C\left\{E{|X|}^{\alpha }I\left(|X|\le b\right)+{b}^{\alpha }P\left(|X|>b\right)\right\}$.

2. (ii)

$E{|{X}_{n}|}^{\alpha }I\left(|{X}_{n}|>b\right)\le CE{|X|}^{\alpha }I\left(|X|>b\right)$.

The following Lemma 2.2(i)-(iii) can be found in Sung.

Lemma 2.2 Let X be a random variable with $E{|X|}^{r}<\mathrm{\infty }$ for some $r>0$. For any $t>0$, the following statements hold:
1. (i)

${\sum }_{n=1}^{\mathrm{\infty }}{n}^{-1-t\delta }E{|X|}^{r+\delta }I\left(|X|\le {n}^{t}\right)\le CE{|X|}^{r}$ for any $\delta >0$.

2. (ii)

${\sum }_{n=1}^{\mathrm{\infty }}{n}^{-1+t\delta }E{|X|}^{r-\delta }I\left(|X|>{n}^{t}\right)\le CE{|X|}^{r}$ for any $\delta >0$ such that $r-\delta >0$.

3. (iii)

${\sum }_{n=1}^{\mathrm{\infty }}{n}^{-1+tr}P\left(|X|>{n}^{t}\right)\le CE{|X|}^{r}$.

4. (iv)

${\sum }_{n=1}^{\mathrm{\infty }}{n}^{-1}E{|X|}^{r}I\left(|X|>{n}^{t}\right)\le CE{|X|}^{r}log|X|$.
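As a numerical illustration of part (iii) (our example, not part of the lemma's proof), take X with Pareto tail $P\left(|X|>x\right)=\min \left(1,x^{-a}\right)$, $a=3$, and $r=2<a$, $t=1$, so that $E{|X|}^{r}=a/\left(a-r\right)=3<\infty$; the partial sums of the series in (iii) then stabilize:

```python
# Illustrative Pareto tail P(|X| > x) = min(1, x**(-a)) with a = 3, r = 2, t = 1.
# The terms of the series in Lemma 2.2(iii) are n**(-1 + t*r) * P(|X| > n**t)
# = n * n**(-3) = n**(-2), so the series converges (to pi^2/6 here).
a, r, t = 3.0, 2.0, 1.0

def tail(x):
    return min(1.0, x ** (-a))

def partial_sum(N):
    return sum(n ** (-1 + t * r) * tail(n ** t) for n in range(1, N + 1))

s_small, s_big = partial_sum(1000), partial_sum(100000)
# Stabilizing partial sums are consistent with the lemma's claim that the
# full series is finite whenever E|X|^r < infinity.
assert s_big - s_small < 1e-2
print(round(s_big, 3))  # → 1.645
```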

The Marcinkiewicz-Zygmund and Rosenthal type inequalities play an important role in establishing complete convergence. Asadian et al. proved Marcinkiewicz-Zygmund and Rosenthal type inequalities for negatively dependent random variables.

Lemma 2.3 (Asadian et al.)

Let $\{X_{n}, n\ge 1\}$ be a sequence of negatively dependent random variables with $EX_{n}=0$ and $E{|X_{n}|}^{p}<\infty$ for some $p\ge 1$ and all $n\ge 1$. Then there exist constants $C_{p}>0$ and $D_{p}>0$ depending only on p such that

$E|\sum _{i=1}^{n}X_{i}{|}^{p}\le C_{p}\sum _{i=1}^{n}E{|X_{i}|}^{p}+D_{p}{\left(\sum _{i=1}^{n}EX_{i}^{2}\right)}^{p/2}.$
The last lemma is a complete convergence theorem for an array of rowwise negatively dependent mean zero random variables.

Lemma 2.4 ([10, 11])

Let $\left\{{X}_{ni},i\ge 1,n\ge 1\right\}$ be an array of rowwise negatively dependent random variables with $E{X}_{ni}=0$ and $E{X}_{ni}^{2}<\mathrm{\infty }$ for all $i\ge 1$ and $n\ge 1$. Let $\left\{{b}_{n},n\ge 1\right\}$ be a sequence of nonnegative constants. Suppose that the following conditions hold.
1. (i)

${\sum }_{n=1}^{\mathrm{\infty }}{b}_{n}{\sum }_{i=1}^{\mathrm{\infty }}P\left(|{X}_{ni}|>ϵ\right)<\mathrm{\infty }$ for all $ϵ>0$.

2. (ii)
There exists $J\ge 1$ such that
$\sum _{n=1}^{\mathrm{\infty }}{b}_{n}{\left(\sum _{i=1}^{\mathrm{\infty }}E{X}_{ni}^{2}\right)}^{J}<\mathrm{\infty }.$

Then ${\sum }_{n=1}^{\mathrm{\infty }}{b}_{n}P\left(|{\sum }_{i=1}^{\mathrm{\infty }}{X}_{ni}|>ϵ\right)<\mathrm{\infty }$ for all $ϵ>0$.

## 3 Main results

In this section, we obtain two complete convergence results for weighted sums of arrays of rowwise negatively dependent random variables.

Theorem 3.1 Suppose that $\beta \ge -1$. Let $\{X_{ni}, i\ge 1, n\ge 1\}$ be an array of rowwise negatively dependent mean zero random variables which are stochastically dominated by a random variable X satisfying $E{|X|}^{p}<\infty$ for some $p\ge 1$. Let $\{a_{ni}, i\ge 1, n\ge 1\}$ be an array of constants satisfying (1.1) for some $\gamma >0$ and
$\sum _{i=1}^{\infty }{|a_{ni}|}^{q}=O\left(n^{\gamma \left(p-q\right)-1-\beta }\right)$ for some $q<p$.
(3.1)
Furthermore, assume that
$\sum _{i=1}^{\infty }a_{ni}^{2}=O\left(n^{-\alpha }\right)$ for some $\alpha >0$
(3.2)
if $p\ge 2$. Then
$\sum _{n=1}^{\infty }n^{\beta }P\left(|\sum _{i=1}^{\infty }a_{ni}X_{ni}|>\epsilon \right)<\infty$ for all $\epsilon >0$.
Proof Since ${a}_{ni}={a}_{ni}^{+}-{a}_{ni}^{-}$, we may assume that ${a}_{ni}\ge 0$. For $i\ge 1$ and $n\ge 1$, define
${X}_{ni}^{\prime }={X}_{ni}I\left(|{X}_{ni}|\le {n}^{\gamma }\right)+{n}^{\gamma }I\left({X}_{ni}>{n}^{\gamma }\right)-{n}^{\gamma }I\left({X}_{ni}<-{n}^{\gamma }\right),\phantom{\rule{2em}{0ex}}{X}_{ni}^{″}={X}_{ni}-{X}_{ni}^{\prime }.$
Then $\left\{{X}_{ni}^{\prime },i\ge 1,n\ge 1\right\}$ and $\left\{{X}_{ni}^{″},i\ge 1,n\ge 1\right\}$ are still arrays of rowwise negatively dependent random variables, $|{X}_{ni}^{\prime }|=|{X}_{ni}|I\left(|{X}_{ni}|\le {n}^{\gamma }\right)+{n}^{\gamma }I\left(|{X}_{ni}|>{n}^{\gamma }\right)$, and $|{X}_{ni}^{″}|=\left({X}_{ni}-{n}^{\gamma }\right)I\left({X}_{ni}>{n}^{\gamma }\right)-\left({X}_{ni}+{n}^{\gamma }\right)I\left({X}_{ni}<-{n}^{\gamma }\right)\le |{X}_{ni}|I\left(|{X}_{ni}|>{n}^{\gamma }\right)$. Since ${a}_{ni}\ge 0$, $\left\{{a}_{ni}{X}_{ni}^{\prime },i\ge 1,n\ge 1\right\}$ and $\left\{{a}_{ni}{X}_{ni}^{″},i\ge 1,n\ge 1\right\}$ are also arrays of rowwise negatively dependent random variables. In view of $E{X}_{ni}=0$ for all $i\ge 1$ and $n\ge 1$, it suffices to show that
${I}_{1}:=\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }P\left(|\sum _{i=1}^{\mathrm{\infty }}{a}_{ni}\left({X}_{ni}^{\prime }-E{X}_{ni}^{\prime }\right)|>ϵ\right)<\mathrm{\infty }$
(3.3)
and
${I}_{2}:=\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }P\left(|\sum _{i=1}^{\mathrm{\infty }}{a}_{ni}\left({X}_{ni}^{″}-E{X}_{ni}^{″}\right)|>ϵ\right)<\mathrm{\infty }.$
(3.4)

We will prove (3.3) and (3.4) with three cases.

Case 1 ($p=1$).

For ${I}_{1}$, we get by Markov’s inequality, Lemmas 2.1-2.3, (1.1), and (3.1) that

The sixth inequality follows from Lemma 2.2.

For ${I}_{2}$, we first prove that
${I}_{3}:=\sum _{i=1}^{\mathrm{\infty }}|{a}_{ni}|E|{X}_{ni}^{″}|\to 0.$
(3.5)
By Lemma 2.1, (1.1), and (3.1), ${I}_{3}$ is dominated by
$\begin{array}{r}\sum _{i=1}^{\mathrm{\infty }}|{a}_{ni}|E|{X}_{ni}|I\left(|{X}_{ni}|>{n}^{\gamma }\right)\\ \phantom{\rule{1em}{0ex}}\le C\sum _{i=1}^{\mathrm{\infty }}|{a}_{ni}|E|X|I\left(|X|>{n}^{\gamma }\right)\\ \phantom{\rule{1em}{0ex}}\le C\underset{i\ge 1}{sup}{|{a}_{ni}|}^{1-q}\sum _{i=1}^{\mathrm{\infty }}{|{a}_{ni}|}^{q}E|X|I\left(|X|>{n}^{\gamma }\right)\\ \phantom{\rule{1em}{0ex}}\le C{n}^{-1-\beta }E|X|I\left(|X|>{n}^{\gamma }\right).\end{array}$

Since $\beta \ge -1$ and $E|X|I\left(|X|>{n}^{\gamma }\right)\to 0$ as $n\to \mathrm{\infty }$, (3.5) holds.

Hence, to prove (3.4), it suffices to show that
${I}_{2}^{\ast }:=\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }P\left(|\sum _{i=1}^{\mathrm{\infty }}{a}_{ni}{X}_{ni}^{″}|>ϵ\right)<\mathrm{\infty }.$
(3.6)
Take $\delta >0$ such that $p-\delta >max\left\{0,q\right\}$. Since $0<p-\delta <1$, we get by Markov’s inequality, Lemmas 2.1-2.2, (1.1), and (3.1) that

Case 2 ($1<p<2$).

As in Case 1, we have that ${I}_{1}\le CE{|X|}^{p}<\mathrm{\infty }$.

For ${I}_{2}$, we take $\delta >0$ such that $p-\delta \ge max\left\{1,q\right\}$. Then we have by Markov’s inequality, Lemmas 2.1-2.3, (1.1), and (3.1) that

Case 3 ($p\ge 2$).

In this case, we will prove (3.3) and (3.4) by using Lemma 2.4. To prove (3.3), we take $\delta >0$. Then we obtain by Markov’s inequality, Lemmas 2.1-2.2, (1.1), and (3.1) that
$\begin{array}{r}\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }\sum _{i=1}^{\mathrm{\infty }}P\left(|{a}_{ni}\left({X}_{ni}^{\prime }-E{X}_{ni}^{\prime }\right)|>ϵ\right)\\ \phantom{\rule{1em}{0ex}}\le {ϵ}^{-p-\delta }\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }\sum _{i=1}^{\mathrm{\infty }}E{|{a}_{ni}\left({X}_{ni}^{\prime }-E{X}_{ni}^{\prime }\right)|}^{p+\delta }\\ \phantom{\rule{1em}{0ex}}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }\sum _{i=1}^{\mathrm{\infty }}{|{a}_{ni}|}^{p+\delta }E{|{X}_{ni}^{\prime }|}^{p+\delta }\\ \phantom{\rule{1em}{0ex}}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }\sum _{i=1}^{\mathrm{\infty }}{|{a}_{ni}|}^{p+\delta }\left\{E{|X|}^{p+\delta }I\left(|X|\le {n}^{\gamma }\right)+{n}^{\gamma \left(p+\delta \right)}P\left(|X|>{n}^{\gamma }\right)\right\}\\ \phantom{\rule{1em}{0ex}}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }\underset{i\ge 1}{sup}{|{a}_{ni}|}^{p+\delta -q}\sum _{i=1}^{\mathrm{\infty }}{|{a}_{ni}|}^{q}\left\{E{|X|}^{p+\delta }I\left(|X|\le {n}^{\gamma }\right)+{n}^{\gamma \left(p+\delta \right)}P\left(|X|>{n}^{\gamma }\right)\right\}\\ \phantom{\rule{1em}{0ex}}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{-1-\gamma \delta }\left\{E{|X|}^{p+\delta }I\left(|X|\le {n}^{\gamma }\right)+{n}^{\gamma \left(p+\delta \right)}P\left(|X|>{n}^{\gamma }\right)\right\}\\ \phantom{\rule{1em}{0ex}}\le CE{|X|}^{p}<\mathrm{\infty }.\end{array}$
We also obtain that for $J\ge 1$ such that $\alpha J-\beta >1$,

Hence (3.3) holds by Lemma 2.4.

To prove (3.4), we take $\delta >0$ such that $p-\delta \ge max\left\{1,q\right\}$. The proof of the rest is similar to that of (3.3) and is omitted. □
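The decomposition $X_{ni}=X_{ni}^{\prime }+X_{ni}^{″}$ used in the proof is ordinary clipping at level $n^{\gamma }$; a small numerical sketch (illustrative values only) confirms the three properties quoted after the definition:

```python
def split(x, c):
    """Decompose x = x' + x'' by clipping at level c > 0, as in the proof:
    x' = x I(|x| <= c) + c I(x > c) - c I(x < -c), x'' = x - x'."""
    x_prime = max(-c, min(c, x))  # clipping equals the three-term definition
    return x_prime, x - x_prime

c = 2.0  # stands in for n**gamma
for x in [-5.0, -2.0, -0.5, 0.0, 1.5, 2.0, 7.0]:
    xp, xpp = split(x, c)
    assert xp + xpp == x
    # |x'| = |x| I(|x| <= c) + c I(|x| > c)
    assert abs(xp) == (abs(x) if abs(x) <= c else c)
    # |x''| <= |x| I(|x| > c)
    assert abs(xpp) <= (abs(x) if abs(x) > c else 0.0)
print("decomposition checks pass")
```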

Remark 3.1 When $0<p<2$, Theorem 3.1 holds without the condition of negative dependence (see Theorem 2(i) in Sung). Theorem 3.1 extends the result of Sung for independent random variables to the negatively dependent case.

Remark 3.2 Theorem 1.2(i) follows from Theorem 3.1 by taking $p=\theta +\left(1+\mu +\beta \right)/\gamma$ and $q=\theta$, since
$\sum _{i=1}^{\mathrm{\infty }}{a}_{ni}^{2}\le \underset{i\ge 1}{sup}{|{a}_{ni}|}^{2-\theta }\sum _{i=1}^{\mathrm{\infty }}{|{a}_{ni}|}^{\theta }=O\left({n}^{-\left(\gamma \left(2-\theta \right)-\mu \right)}\right).$

But, Theorem 1.2(i) does not deal with the case of $\beta =-1$.

Note that conditions (1.1) and (3.1) together imply
$\sum _{i=1}^{\mathrm{\infty }}{|{a}_{ni}|}^{p}=O\left({n}^{-1-\beta }\right).$
(3.7)
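Writing (1.1) as $\sup _{i\ge 1}|a_{ni}|=O\left(n^{-\gamma }\right)$ and (3.1) as $\sum _{i=1}^{\infty }{|a_{ni}|}^{q}=O\left(n^{\gamma \left(p-q\right)-1-\beta }\right)$ with $q<p$, the implication is a one-line check:

```latex
\sum_{i=1}^{\infty}|a_{ni}|^{p}
  \le \Bigl(\sup_{i\ge 1}|a_{ni}|\Bigr)^{p-q}\sum_{i=1}^{\infty}|a_{ni}|^{q}
  = O\bigl(n^{-\gamma(p-q)}\bigr)\, O\bigl(n^{\gamma(p-q)-1-\beta}\bigr)
  = O\bigl(n^{-1-\beta}\bigr),
% which is exactly (3.7).
```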

The following theorem shows that if the moment condition of Theorem 3.1 is strengthened to $E{|X|}^{p}\log |X|<\infty$, then condition (3.1) can be weakened to (3.7).

Theorem 3.2 Suppose that $\beta \ge -1$. Let $\{X_{ni}, i\ge 1, n\ge 1\}$ be an array of rowwise negatively dependent mean zero random variables which are stochastically dominated by a random variable X satisfying $E{|X|}^{p}\log |X|<\infty$ for some $p\ge 1$. Let $\{a_{ni}, i\ge 1, n\ge 1\}$ be an array of constants satisfying (1.1) and (3.7). Furthermore, assume that (3.2) holds for some $\alpha >0$ if $p\ge 2$. Then
$\sum _{n=1}^{\infty }n^{\beta }P\left(|\sum _{i=1}^{\infty }a_{ni}X_{ni}|>\epsilon \right)<\infty$ for all $\epsilon >0$.
Proof As in the proof of Theorem 3.1, it suffices to prove (3.3) and (3.4). The proof of (3.3) is the same as in Theorem 3.1, except that q is replaced by p.

We now prove (3.4). When $1\le p<2$, we have by Markov’s inequality, Lemmas 2.1-2.3, and (3.7) that
$\begin{array}{rcl}{I}_{2}& \le & {ϵ}^{-p}\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }E|\sum _{i=1}^{\mathrm{\infty }}{a}_{ni}\left({X}_{ni}^{″}-E{X}_{ni}^{″}\right){|}^{p}\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }\sum _{i=1}^{\mathrm{\infty }}{|{a}_{ni}|}^{p}E{|{X}_{ni}^{″}-E{X}_{ni}^{″}|}^{p}\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }\sum _{i=1}^{\mathrm{\infty }}{|{a}_{ni}|}^{p}E{|X|}^{p}I\left(|X|>{n}^{\gamma }\right)\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{-1}E{|X|}^{p}I\left(|X|>{n}^{\gamma }\right)\\ \le & CE{|X|}^{p}log|X|<\mathrm{\infty }.\end{array}$
When $p\ge 2$, we will prove (3.4) by using Lemma 2.4. We have by Markov’s inequality, Lemmas 2.1-2.2, and (3.7) that
$\begin{array}{r}\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }\sum _{i=1}^{\mathrm{\infty }}P\left(|{a}_{ni}\left({X}_{ni}^{″}-E{X}_{ni}^{″}\right)|>ϵ\right)\\ \phantom{\rule{1em}{0ex}}\le {ϵ}^{-p}\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }\sum _{i=1}^{\mathrm{\infty }}E{|{a}_{ni}\left({X}_{ni}^{″}-E{X}_{ni}^{″}\right)|}^{p}\\ \phantom{\rule{1em}{0ex}}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\beta }\sum _{i=1}^{\mathrm{\infty }}{|{a}_{ni}|}^{p}E{|X|}^{p}I\left(|X|>{n}^{\gamma }\right)\\ \phantom{\rule{1em}{0ex}}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{-1}E{|X|}^{p}I\left(|X|>{n}^{\gamma }\right)\\ \phantom{\rule{1em}{0ex}}\le CE{|X|}^{p}log|X|<\mathrm{\infty }.\end{array}$
We also have that for $J\ge 1$ such that $\alpha J-\beta >1$,

Hence (3.4) holds by Lemma 2.4. □

Remark 3.3 If $1+\mu +\beta =0$, then $\mu =-1-\beta$. Hence Theorem 1.2(ii) follows from Theorem 3.2 by taking $p=\theta$. But, Theorem 1.2(ii) does not deal with the case of $\beta =-1$.

As mentioned in the Introduction, (1.5) does not hold. Hence it is of interest to find a complete convergence result similar to Theorem 1.3 without condition (1.5). The following corollary does not assume condition (1.5).

Corollary 3.1 Suppose that $r\ge 3/2$. Let $\{X, X_{n}, n\ge 1\}$ be a sequence of identically distributed negatively dependent mean zero random variables. Let $\{a_{ni}, 1\le i\le n, n\ge 1\}$ be an array of constants satisfying (1.6) and $|a_{ni}|=O\left(1\right)$. If (1.7) holds, then
$\sum _{n=1}^{\infty }n^{r-2}P\left(|\sum _{i=1}^{n}a_{ni}X_{i}|>\epsilon n^{1/2}\right)<\infty$ for all $\epsilon >0$.
(3.8)
Proof Let ${c}_{ni}={a}_{ni}/{n}^{1/2}$ for $1\le i\le n$ and ${c}_{ni}=0$ for $i>n$. We will apply Theorem 3.2 with $p=2\left(r-1\right)$, $\beta =r-2$, ${X}_{ni}={X}_{i}$, and ${a}_{ni}$ replaced by ${c}_{ni}$. Then
$\begin{array}{r}\underset{i\ge 1}{sup}|{c}_{ni}|=O\left({n}^{-1/2}\right),\\ \sum _{i=1}^{\mathrm{\infty }}{|{c}_{ni}|}^{p}={n}^{-\left(r-1\right)}\sum _{i=1}^{n}{|{a}_{ni}|}^{2\left(r-1\right)}=O\left({n}^{1-r}\right)=O\left({n}^{-1-\beta }\right).\end{array}$
Furthermore, if $p=2\left(r-1\right)\ge 2$, then
$\sum _{i=1}^{\mathrm{\infty }}{c}_{ni}^{2}={n}^{-1}\sum _{i=1}^{n}{a}_{ni}^{2}\le {n}^{-1}{\left(\sum _{i=1}^{n}{|{a}_{ni}|}^{2\left(r-1\right)}\right)}^{1/\left(r-1\right)}{n}^{1-1/\left(r-1\right)}=O\left({n}^{-1/\left(r-1\right)}\right).$

Hence the result follows from Theorem 3.2. □
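The Hölder step in the last display (valid for $r\ge 2$, with exponents $r-1$ and $\left(r-1\right)/\left(r-2\right)$) can be sanity-checked numerically on arbitrary arrays; this is an illustration, not part of the proof:

```python
import random

def holder_check(a, r):
    """Verify sum_i a_i^2 <= (sum_i |a_i|^{2(r-1)})^{1/(r-1)} * n^{1 - 1/(r-1)}
    (Hoelder applied to a_i^2 * 1 with exponents r-1 and (r-1)/(r-2)), r >= 2."""
    n = len(a)
    lhs = sum(x * x for x in a)
    rhs = (sum(abs(x) ** (2.0 * (r - 1.0)) for x in a) ** (1.0 / (r - 1.0))
           * n ** (1.0 - 1.0 / (r - 1.0)))
    return lhs <= rhs * (1.0 + 1e-12)  # tolerance for floating-point roundoff

random.seed(0)
for r in [2.0, 2.5, 3.0]:
    for _ in range(100):
        a = [random.uniform(-1.0, 1.0) for _ in range(random.randint(1, 50))]
        assert holder_check(a, r)
print("Hoelder step verified on random arrays")
```

For $r=2$ the bound holds with equality, since the second factor is $n^{0}=1$.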

Remark 3.4 When $1<r<2$, Corollary 3.1 holds without the condition of negative dependence. Although (3.8) is weaker than (1.8), (3.8) can be strengthened to (1.8) if negative dependence is replaced by the stronger condition of negative association.

## Declarations

### Acknowledgement

The author would like to thank the referees for helpful comments and suggestions. This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (2010-0013131).

## Authors’ Affiliations

(1)
Department of Applied Mathematics, Pai Chai University, Taejon, 302-735, South Korea
