# Complete convergence for negatively orthant dependent random variables

## Abstract

In this paper, necessary and sufficient conditions for complete convergence are obtained for the maximum partial sums of negatively orthant dependent (NOD) random variables. The results extend and improve those of Kuczmaszewska (Acta Math. Hung. 128(1-2):116-130, 2010) for negatively associated (NA) random variables.

MSC:60F15, 60G50.

## 1 Introduction

The concept of complete convergence for a sequence of random variables was introduced by Hsu and Robbins [1] as follows. A sequence $\left\{{U}_{n},n\ge 1\right\}$ of random variables converges completely to the constant θ if

$\sum _{n=1}^{\mathrm{\infty }}P\left(|{U}_{n}-\theta |>\epsilon \right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.$
Moreover, they proved that the sequence of arithmetic means of independent identically distributed (i.i.d.) random variables converges completely to the expected value if the variance of the summands is finite. This result has been generalized and extended in several directions by many authors. One can refer to [2–16], and so forth. Kuczmaszewska [8] proved the following result.
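As a quick numerical illustration (not part of the original argument): for i.i.d. variables bounded in $[0,1]$, Hoeffding's inequality bounds each tail probability $P\left(|{S}_{n}/n-\mu |\ge \epsilon \right)$ by $2exp\left(-2n{\epsilon }^{2}\right)$, so the defining series of complete convergence is dominated by a convergent geometric series. A minimal Python sketch of this domination:

```python
import math

# Hedged sketch (illustration only): partial sums of the Hoeffding upper
# bounds 2*exp(-2*n*eps**2), n = 1..n_terms, which dominate the series
# sum_n P(|S_n/n - mu| >= eps) for i.i.d. variables bounded in [0, 1].
def tail_bound_series(eps, n_terms):
    return sum(2 * math.exp(-2 * n * eps**2) for n in range(1, n_terms + 1))

eps = 0.5
r = math.exp(-2 * eps**2)        # geometric ratio e^{-2 eps^2} < 1
closed_form = 2 * r / (1 - r)    # exact sum of the geometric bounding series
partial = tail_bound_series(eps, 200)
assert abs(partial - closed_form) < 1e-12  # bounding series converges
```

Since the bounding series is finite for every $\epsilon >0$, complete convergence of the arithmetic means follows in this bounded case.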

Theorem A Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of negatively associated (NA) random variables and let X be a random variable, possibly defined on a different space, satisfying the condition

$\frac{1}{n}\sum _{i=1}^{n}P\left(|{X}_{i}|>x\right)\le DP\left(|X|>x\right)$

for all $x>0$, all $n\ge 1$ and some positive constant D. Let $\alpha p>1$ and $\alpha >1/2$. Assume additionally that $E{X}_{n}=0$ for all $n\ge 1$ if $p\ge 1$. Then the following statements are equivalent:

1. (i)

$E{|X|}^{p}<\mathrm{\infty }$,

2. (ii)

${\sum }_{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}P\left({max}_{1\le j\le n}|{\sum }_{i=1}^{j}{X}_{i}|\ge \epsilon {n}^{\alpha }\right)<\mathrm{\infty }$, $\mathrm{\forall }\epsilon >0$.

The aim of this paper is to extend and improve Theorem A to negatively orthant dependent (NOD) random variables. The key tool in the proof of Theorem A is the Rosenthal maximal inequality for NA sequences (cf. [17]), but no maximal inequality of this kind has been established for NOD sequences. Hence the truncation method used here is different, and the proofs of our main results are more involved.

The concepts of negative association (NA) and negative orthant dependence (NOD) were introduced by Joag-Dev and Proschan [18] in the following way.

Definition 1.1 A finite family of random variables $\left\{{X}_{i},1\le i\le n\right\}$ is said to be negatively associated (NA) if for every pair of disjoint nonempty subsets ${A}_{1}$, ${A}_{2}$ of $\left\{1,2,\dots ,n\right\}$,

$Cov\left({f}_{1}\left({X}_{i},i\in {A}_{1}\right),{f}_{2}\left({X}_{j},j\in {A}_{2}\right)\right)\le 0,$

where ${f}_{1}$ and ${f}_{2}$ are coordinatewise nondecreasing functions such that the covariance exists. An infinite sequence $\left\{{X}_{n},n\ge 1\right\}$ is NA if every finite subfamily is NA.

Definition 1.2 A finite family of random variables $\left\{{X}_{i},1\le i\le n\right\}$ is said to be

1. (a)

negatively upper orthant dependent (NUOD) if

$P\left({X}_{i}>{x}_{i},i=1,2,\dots ,n\right)\le \prod _{i=1}^{n}P\left({X}_{i}>{x}_{i}\right)$

for $\mathrm{\forall }{x}_{1},{x}_{2},\dots ,{x}_{n}\in R$,

2. (b)

negatively lower orthant dependent (NLOD) if

$P\left({X}_{i}\le {x}_{i},i=1,2,\dots ,n\right)\le \prod _{i=1}^{n}P\left({X}_{i}\le {x}_{i}\right)$

for $\mathrm{\forall }{x}_{1},{x}_{2},\dots ,{x}_{n}\in R$,

3. (c)

negatively orthant dependent (NOD) if they are both NUOD and NLOD.

A sequence of random variables $\left\{{X}_{n},n\ge 1\right\}$ is said to be NOD if for each n, ${X}_{1},{X}_{2},\dots ,{X}_{n}$ are NOD.

Obviously, every sequence of independent random variables is NOD. Joag-Dev and Proschan [18] pointed out that NA implies NOD, but neither NUOD nor NLOD implies NA. They gave an example that is NOD but not NA, which shows that NOD is strictly weaker than NA. For more details on NOD random variables, one can refer to [3, 6, 11, 14, 19–21], and so forth.
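The orthant inequalities of Definition 1.2 can be verified exactly for a toy example (an illustration, not from the paper): the antithetic pair $\left({X}_{1},{X}_{2}\right)$ uniform on $\left\{\left(0,1\right),\left(1,0\right)\right\}$ is negatively dependent (it is in fact NA, hence NOD), and the NUOD/NLOD inequalities can be checked by enumeration over a grid of thresholds straddling the support:

```python
from itertools import product

# Hedged sketch: exact check of the NUOD and NLOD inequalities for the
# antithetic pair (X1, X2) uniform on {(0, 1), (1, 0)}.
support = [((0, 1), 0.5), ((1, 0), 0.5)]   # (value, probability) pairs

def joint_upper(x):   # P(X1 > x1, X2 > x2)
    return sum(p for (v, p) in support if all(v[i] > x[i] for i in range(2)))

def joint_lower(x):   # P(X1 <= x1, X2 <= x2)
    return sum(p for (v, p) in support if all(v[i] <= x[i] for i in range(2)))

def marg_upper(i, xi):  # P(Xi > xi)
    return sum(p for (v, p) in support if v[i] > xi)

def marg_lower(i, xi):  # P(Xi <= xi)
    return sum(p for (v, p) in support if v[i] <= xi)

grid = [-0.5, 0.5, 1.5]  # thresholds straddling the support {0, 1}
for x in product(grid, repeat=2):
    assert joint_upper(x) <= marg_upper(0, x[0]) * marg_upper(1, x[1]) + 1e-12  # NUOD
    assert joint_lower(x) <= marg_lower(0, x[0]) * marg_lower(1, x[1]) + 1e-12  # NLOD
```

For independent variables both inequalities hold with equality; the strict gap at, e.g., $\left({x}_{1},{x}_{2}\right)=\left(0.5,0.5\right)$ reflects the negative dependence of this pair.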

In order to prove our main results, we need the following lemmas.

Lemma 1.1 (Bozorgnia et al. [19])

Let ${X}_{1},{X}_{2},\dots ,{X}_{n}$ be NOD random variables.

1. (i)

If ${f}_{1},{f}_{2},\dots ,{f}_{n}$ are Borel functions all of which are monotone increasing (or all monotone decreasing), then ${f}_{1}\left({X}_{1}\right),{f}_{2}\left({X}_{2}\right),\dots ,{f}_{n}\left({X}_{n}\right)$ are NOD random variables.

2. (ii)

$E{\prod }_{i=1}^{n}{X}_{i}^{+}\le {\prod }_{i=1}^{n}E{X}_{i}^{+}$, $\mathrm{\forall }n\ge 2$.

Lemma 1.2 (Asadian et al. [22])

For any $q\ge 2$, there is a positive constant $C\left(q\right)$ depending only on q such that if $\left\{{X}_{n},n\ge 1\right\}$ is a sequence of NOD random variables with $E{X}_{n}=0$ for every $n\ge 1$, then for all $n\ge 1$,

$E|\sum _{i=1}^{n}{X}_{i}{|}^{q}\le C\left(q\right)\left\{\sum _{i=1}^{n}E{|{X}_{i}|}^{q}+{\left(\sum _{i=1}^{n}E{X}_{i}^{2}\right)}^{q/2}\right\}.$

Lemma 1.3 For any $q\ge 2$, there is a positive constant $C\left(q\right)$ depending only on q such that if $\left\{{X}_{n},n\ge 1\right\}$ is a sequence of NOD random variables with $E{X}_{n}=0$ for every $n\ge 1$, then for all $n\ge 1$,

$E\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{X}_{i}{|}^{q}\le C\left(q\right){\left(log\left(4n\right)\right)}^{q}\left\{\sum _{i=1}^{n}E{|{X}_{i}|}^{q}+{\left(\sum _{i=1}^{n}E{X}_{i}^{2}\right)}^{q/2}\right\}.$

Proof By Lemma 1.2, the proof is similar to that of Theorem 2.3.1 in Stout [23], so it is omitted here. □

Lemma 1.4 (Kuczmaszewska [8])

Let β, γ be positive constants. Suppose that $\left\{{X}_{n},n\ge 1\right\}$ is a sequence of random variables, X is a random variable, and there exists a constant $D>0$ such that

$\sum _{i=1}^{n}P\left(|{X}_{i}|>x\right)\le DnP\left(|X|>x\right),\phantom{\rule{1em}{0ex}}\mathrm{\forall }x>0,\mathrm{\forall }n\ge 1;$
(1.1)

Then the following statements hold:
1. (i)

if $E{|X|}^{\beta }<\mathrm{\infty }$, then $\frac{1}{n}{\sum }_{j=1}^{n}E{|{X}_{j}|}^{\beta }\le CE{|X|}^{\beta }$;

2. (ii)

$\frac{1}{n}{\sum }_{j=1}^{n}E{|{X}_{j}|}^{\beta }I\left(|{X}_{j}|\le \gamma \right)\le C\left\{E{|X|}^{\beta }I\left(|X|\le \gamma \right)+{\gamma }^{\beta }P\left(|X|>\gamma \right)\right\}$;

3. (iii)

$\frac{1}{n}{\sum }_{j=1}^{n}E{|{X}_{j}|}^{\beta }I\left(|{X}_{j}|>\gamma \right)\le CE{|X|}^{\beta }I\left(|X|>\gamma \right)$.

Recall that a function $h\left(x\right)$ is said to be slowly varying at infinity if it is real valued, positive, and measurable on $\left[0,\mathrm{\infty }\right)$, and if for each $\lambda >0$

$\underset{x\to \mathrm{\infty }}{lim}\frac{h\left(\lambda x\right)}{h\left(x\right)}=1.$

We refer to Seneta [24] for other equivalent definitions and for a detailed and comprehensive study of properties of slowly varying functions.

We frequently use the following properties of slowly varying functions (cf. Seneta [24]).

Lemma 1.5 If $h\left(x\right)$ is a function slowly varying at infinity, then for any $s>0$

${C}_{1}{n}^{-s}h\left(n\right)\le \sum _{i=n}^{\mathrm{\infty }}{i}^{-1-s}h\left(i\right)\le {C}_{2}{n}^{-s}h\left(n\right)$

and

${C}_{3}{n}^{s}h\left(n\right)\le \sum _{i=1}^{n}{i}^{-1+s}h\left(i\right)\le {C}_{4}{n}^{s}h\left(n\right),$

where ${C}_{1},{C}_{2},{C}_{3},{C}_{4}>0$ depend only on s.
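The two-sided comparisons of Lemma 1.5 can be illustrated numerically (the constants below are empirical, not the ${C}_{1},\dots ,{C}_{4}$ of the lemma) with the slowly varying function $h\left(x\right)=logx$ and $s=1$:

```python
import math

# Hedged sketch: numerical illustration of Lemma 1.5 for h(x) = log x, s = 1.
def tail_sum(n, s, N=200_000):
    # approximates sum_{i=n}^{infinity} i^{-1-s} h(i) by truncating at N
    return sum(i ** (-1 - s) * math.log(i) for i in range(n, N))

def head_sum(n, s):
    # sum_{i=1}^{n} i^{-1+s} h(i)
    return sum(i ** (-1 + s) * math.log(i) for i in range(1, n + 1))

s = 1.0
for n in (10, 100, 1000):
    ratio_tail = tail_sum(n, s) / (n ** (-s) * math.log(n))
    ratio_head = head_sum(n, s) / (n ** s * math.log(n))
    assert 0.5 <= ratio_tail <= 2.0  # tail sum comparable to n^{-s} h(n)
    assert 0.5 <= ratio_head <= 2.0  # head sum comparable to n^{ s} h(n)
```

The ratios stay within fixed bounds as n grows, matching the lemma; for $h\equiv 1$ they tend to $1/s$ and $1/s$ respectively, which is the usual integral-comparison heuristic.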

Throughout this paper, C will represent positive constants of which the value may change from one place to another.

## 2 Main results and proofs

Theorem 2.1 Let $\alpha >1/2$, $p>0$, $\alpha p>1$ and let $h\left(x\right)$ be a slowly varying function at infinity. Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of NOD random variables and let X be a random variable, possibly defined on a different space, satisfying the condition (1.1). Assume additionally that $E{X}_{n}=0$ for all $n\ge 1$ when $\alpha \le 1$. If

$E{|X|}^{p}h\left({|X|}^{1/\alpha }\right)<\mathrm{\infty },$
(2.1)

then the following statements hold:

$\left(\mathrm{i}\right)\phantom{\rule{1em}{0ex}}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{1\le j\le n}{max}|{S}_{j}|\ge \epsilon {n}^{\alpha }\right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0;$
(2.2)
$\left(\mathrm{i}\mathrm{i}\right)\phantom{\rule{1em}{0ex}}\sum _{n=2}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{1\le k\le n}{max}|{S}_{n}^{\left(k\right)}|\ge \epsilon {n}^{\alpha }\right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0;$
(2.3)
$\left(\mathrm{i}\mathrm{i}\mathrm{i}\right)\phantom{\rule{1em}{0ex}}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{1\le j\le n}{max}|{X}_{j}|\ge \epsilon {n}^{\alpha }\right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0;$
(2.4)
$\left(\mathrm{i}\mathrm{v}\right)\phantom{\rule{1em}{0ex}}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{j\ge n}{sup}{j}^{-\alpha }|{S}_{j}|\ge \epsilon \right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0;$
(2.5)
$\left(\mathrm{v}\right)\phantom{\rule{1em}{0ex}}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{j\ge n}{sup}{j}^{-\alpha }|{X}_{j}|\ge \epsilon \right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.$
(2.6)

Here ${S}_{n}={\sum }_{i=1}^{n}{X}_{i}$, ${S}_{n}^{\left(k\right)}={S}_{n}-{X}_{k}$, $k=1,2,\dots ,n$.

Proof First, we prove (2.2). Choose q such that $1/\left(\alpha p\right)<q<1$. Let ${X}_{i}^{\left(n,1\right)}=-{n}^{\alpha q}I\left({X}_{i}<-{n}^{\alpha q}\right)+{X}_{i}I\left(|{X}_{i}|\le {n}^{\alpha q}\right)+{n}^{\alpha q}I\left({X}_{i}>{n}^{\alpha q}\right)$, ${X}_{i}^{\left(n,2\right)}=\left({X}_{i}-{n}^{\alpha q}\right)I\left({X}_{i}>{n}^{\alpha q}\right)$, ${X}_{i}^{\left(n,3\right)}=-\left({X}_{i}+{n}^{\alpha q}\right)I\left({X}_{i}<-{n}^{\alpha q}\right)$, $\mathrm{\forall }n\ge 1$, $1\le i\le n$. Note that

${X}_{i}={X}_{i}^{\left(n,1\right)}+{X}_{i}^{\left(n,2\right)}-{X}_{i}^{\left(n,3\right)}$

and

$\begin{array}{c}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{1\le j\le n}{max}|{S}_{j}|>\epsilon {n}^{\alpha }\right)\hfill \\ \phantom{\rule{1em}{0ex}}\le \sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}{X}_{i}^{\left(n,1\right)}|>\epsilon {n}^{\alpha }/3\right)\hfill \\ \phantom{\rule{2em}{0ex}}+\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\sum _{i=1}^{n}{X}_{i}^{\left(n,2\right)}>\epsilon {n}^{\alpha }/3\right)+\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\sum _{i=1}^{n}{X}_{i}^{\left(n,3\right)}>\epsilon {n}^{\alpha }/3\right)\hfill \\ \phantom{\rule{1em}{0ex}}\stackrel{\mathrm{def}}{=}{I}_{1}+{I}_{2}+{I}_{3}.\hfill \end{array}$
(2.7)
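The three-part truncation at level $b={n}^{\alpha q}$ reconstructs each ${X}_{i}$ exactly; the following sketch (an illustration, not from the paper) checks the identity ${X}_{i}={X}_{i}^{\left(n,1\right)}+{X}_{i}^{\left(n,2\right)}-{X}_{i}^{\left(n,3\right)}$ pointwise:

```python
# Hedged sketch: the truncation X = X1 + X2 - X3 at level b, where X1 is the
# value clipped to [-b, b], X2 the overshoot above b, X3 the overshoot below -b.
def truncate(x, b):
    x1 = max(-b, min(x, b))            # X_i^{(n,1)}: clipped part
    x2 = (x - b) if x > b else 0.0     # X_i^{(n,2)}: overshoot above b
    x3 = -(x + b) if x < -b else 0.0   # X_i^{(n,3)}: overshoot below -b
    return x1, x2, x3

b = 2.0
for x in (-5.0, -2.0, -0.3, 0.0, 1.9, 2.0, 7.5):
    x1, x2, x3 = truncate(x, b)
    assert abs((x1 + x2 - x3) - x) < 1e-12   # X = X1 + X2 - X3
    assert abs(x1) <= b and x2 >= 0.0 and x3 >= 0.0
```

The clipped part is bounded by b (so higher moments can be controlled), while the two overshoot parts are non-negative and vanish unless $|{X}_{i}|$ exceeds the truncation level.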

In order to prove (2.2), it suffices to show that ${I}_{l}<\mathrm{\infty }$ for $l=1,2,3$. Obviously, for $0<\eta <p$, the condition (2.1) implies $E{|X|}^{p-\eta }<\mathrm{\infty }$. Therefore, we choose $0<\eta <p$ small enough that $\alpha \left(p-\eta \right)>\alpha \left(p-\eta \right)q>1$, and $p-\eta -1>0$ if $p>1$. In order to prove ${I}_{1}<\mathrm{\infty }$, we first prove that

$\underset{n\to \mathrm{\infty }}{lim}{n}^{-\alpha }\underset{1\le j\le n}{max}|\sum _{i=1}^{j}E{X}_{i}^{\left(n,1\right)}|=0.$
(2.8)

When $\alpha \le 1$: since $\alpha p>1$, we have $p>1$. By $E{X}_{i}=0$, $i\ge 1$, and Lemma 1.4, we have

$\begin{array}{rcl}{n}^{-\alpha }\underset{1\le j\le n}{max}|\sum _{i=1}^{j}E{X}_{i}^{\left(n,1\right)}|& \le & {n}^{-\alpha }\underset{1\le j\le n}{max}\sum _{i=1}^{j}\left\{E|{X}_{i}|I\left(|{X}_{i}|>{n}^{\alpha q}\right)+{n}^{\alpha q}P\left(|{X}_{i}|>{n}^{\alpha q}\right)\right\}\\ \le & 2{n}^{-\alpha }\sum _{i=1}^{n}E|{X}_{i}|I\left(|{X}_{i}|>{n}^{\alpha q}\right)\le C{n}^{1-\alpha }E|X|I\left(|X|>{n}^{\alpha q}\right)\\ \le & C{n}^{-\left\{\alpha \left(p-\eta \right)q-1\right\}-\alpha \left(1-q\right)}E{|X|}^{p-\eta }\\ \to & 0,\phantom{\rule{1em}{0ex}}n\to \mathrm{\infty }.\end{array}$

When $\alpha >1$, $p>1$,

$\begin{array}{rcl}{n}^{-\alpha }\underset{1\le j\le n}{max}|\sum _{i=1}^{j}E{X}_{i}^{\left(n,1\right)}|& \le & {n}^{-\alpha }\underset{1\le j\le n}{max}\sum _{i=1}^{j}\left\{E|{X}_{i}|I\left(|{X}_{i}|\le {n}^{\alpha q}\right)+{n}^{\alpha q}P\left(|{X}_{i}|>{n}^{\alpha q}\right)\right\}\\ \le & {n}^{-\alpha }\sum _{i=1}^{n}E|{X}_{i}|\le C{n}^{1-\alpha }E|X|\\ \to & 0,\phantom{\rule{1em}{0ex}}n\to \mathrm{\infty }.\end{array}$

When $\alpha >1$, $p\le 1$,

$\begin{array}{rcl}{n}^{-\alpha }\underset{1\le j\le n}{max}|\sum _{i=1}^{j}E{X}_{i}^{\left(n,1\right)}|& \le & {n}^{-\alpha }\underset{1\le j\le n}{max}\sum _{i=1}^{j}\left\{E|{X}_{i}|I\left(|{X}_{i}|\le {n}^{\alpha q}\right)+{n}^{\alpha q}P\left(|{X}_{i}|>{n}^{\alpha q}\right)\right\}\\ \le & {n}^{-\alpha }\sum _{i=1}^{n}\left\{E|{X}_{i}|I\left(|{X}_{i}|\le {n}^{\alpha q}\right)+{n}^{\alpha q}P\left(|{X}_{i}|>{n}^{\alpha q}\right)\right\}\\ \le & {n}^{-\alpha }\sum _{i=1}^{n}\left({n}^{\alpha \left(1-p+\eta \right)q}E{|{X}_{i}|}^{p-\eta }\right)\\ \le & C{n}^{-\left\{\alpha \left(p-\eta \right)q-1\right\}-\alpha \left(1-q\right)}E{|X|}^{p-\eta }\\ \to & 0,\phantom{\rule{1em}{0ex}}n\to \mathrm{\infty }.\end{array}$

Therefore, (2.8) holds. So, in order to prove ${I}_{1}<\mathrm{\infty }$, it is enough to prove that

${I}_{1}^{\ast }:=\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{1\le j\le n}{max}|\sum _{i=1}^{j}\left({X}_{i}^{\left(n,1\right)}-E{X}_{i}^{\left(n,1\right)}\right)|>\epsilon {n}^{\alpha }/6\right)<\mathrm{\infty }.$
(2.9)

By Lemma 1.1, for each $n\ge 1$, $\left\{{X}_{i}^{\left(n,1\right)}-E{X}_{i}^{\left(n,1\right)},1\le i\le n\right\}$ is a sequence of NOD random variables. When $0<p\le 2$, by $\alpha \left(p-\eta \right)>1$ and $0<q<1$, we have $\alpha -\frac{1}{2}-\alpha \left(1-\frac{p-\eta }{2}\right)q>\alpha -\frac{1}{2}-\alpha \left(1-\frac{p-\eta }{2}\right)>0$. Taking v such that $v>max\left\{2,p,\left(\alpha p-1\right)/\left(\alpha -1/2\right),\left(\alpha p-1\right)/\left(\alpha -\frac{1}{2}-\alpha \left(1-\frac{p-\eta }{2}\right)q\right),\frac{p-\left(p-\eta \right)q}{1-q}\right\}$, we get by the Markov inequality, the ${C}_{r}$ inequality, the Hölder inequality, and Lemma 1.3,

$\begin{array}{rcl}{I}_{1}^{\ast }& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-2}h\left(n\right){\left(log\left(4n\right)\right)}^{v}\sum _{i=1}^{n}E|{X}_{i}^{\left(n,1\right)}{|}^{v}\\ +C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-2}h\left(n\right){\left(log\left(4n\right)\right)}^{v}{\left(\sum _{i=1}^{n}E|{X}_{i}^{\left(n,1\right)}{|}^{2}\right)}^{v/2}\stackrel{\mathrm{def}}{=}{I}_{11}^{\ast }+{I}_{12}^{\ast }.\end{array}$

By the ${C}_{r}$ inequality, Lemma 1.4, and Lemma 1.5, we have

$\begin{array}{rcl}{I}_{11}^{\ast }& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-2}h\left(n\right){\left(log\left(4n\right)\right)}^{v}\sum _{i=1}^{n}E\left\{{|{X}_{i}|}^{v}I\left(|{X}_{i}|\le {n}^{\alpha q}\right)+{n}^{\alpha qv}P\left(|{X}_{i}|>{n}^{\alpha q}\right)\right\}\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-1}h\left(n\right){\left(log\left(4n\right)\right)}^{v}E\left\{{|X|}^{v}I\left(|X|\le {n}^{\alpha q}\right)+{n}^{\alpha qv}P\left(|X|>{n}^{\alpha q}\right)\right\}\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha \left\{-\left(1-q\right)v+p-q\left(p-\eta \right)\right\}-1}h\left(n\right){\left(log\left(4n\right)\right)}^{v}E{|X|}^{p-\eta }<\mathrm{\infty }.\end{array}$

By the ${C}_{r}$ inequality and Lemma 1.4,

$\begin{array}{rcl}{I}_{12}^{\ast }& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-2}h\left(n\right){\left(log\left(4n\right)\right)}^{v}{\left\{\sum _{i=1}^{n}\left(E{|{X}_{i}|}^{2}I\left(|{X}_{i}|\le {n}^{\alpha q}\right)+{n}^{2\alpha q}P\left(|{X}_{i}|>{n}^{\alpha q}\right)\right)\right\}}^{v/2}\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\left(\alpha -1/2\right)v}h\left(n\right){\left(log\left(4n\right)\right)}^{v}{\left\{E{|X|}^{2}I\left(|X|\le {n}^{\alpha q}\right)+{n}^{2\alpha q}P\left(|X|>{n}^{\alpha q}\right)\right\}}^{v/2}.\end{array}$

When $p>2$,

${I}_{12}^{\ast }\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\left(\alpha -1/2\right)v}h\left(n\right){\left(log\left(4n\right)\right)}^{v}{\left(E{X}^{2}\right)}^{v/2}<\mathrm{\infty }.$

When $0<p\le 2$,

$\begin{array}{rcl}{I}_{12}^{\ast }& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\left(\alpha -1/2\right)v}h\left(n\right){\left(log\left(4n\right)\right)}^{v}{\left(E{|X|}^{p-\eta }\right)}^{v/2}{n}^{\alpha q\left\{2-\left(p-\eta \right)\right\}v/2}\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2-\left\{\alpha -\frac{1}{2}-\alpha \left(1-\frac{p-\eta }{2}\right)q\right\}v}h\left(n\right){\left(log\left(4n\right)\right)}^{v}<\mathrm{\infty }.\end{array}$

Therefore, (2.9) holds, and hence ${I}_{1}<\mathrm{\infty }$. Next, we prove ${I}_{2}<\mathrm{\infty }$. Define ${Y}_{i}^{\left(n,2\right)}=\left({X}_{i}-{n}^{\alpha q}\right)I\left({n}^{\alpha q}<{X}_{i}\le {n}^{\alpha }+{n}^{\alpha q}\right)+{n}^{\alpha }I\left({X}_{i}>{n}^{\alpha }+{n}^{\alpha q}\right)$, $1\le i\le n$, $n\ge 1$. Since ${X}_{i}^{\left(n,2\right)}={Y}_{i}^{\left(n,2\right)}+\left({X}_{i}-{n}^{\alpha q}-{n}^{\alpha }\right)I\left({X}_{i}>{n}^{\alpha }+{n}^{\alpha q}\right)$, we have

$\begin{array}{r}{I}_{2}\le \sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\sum _{i=1}^{n}{Y}_{i}^{\left(n,2\right)}>\epsilon {n}^{\alpha }/6\right)\\ \phantom{{I}_{2}\le }+\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\sum _{i=1}^{n}\left({X}_{i}-{n}^{\alpha q}-{n}^{\alpha }\right)I\left({X}_{i}>{n}^{\alpha }+{n}^{\alpha q}\right)>\epsilon {n}^{\alpha }/6\right)\\ \phantom{{I}_{2}}\stackrel{\mathrm{def}}{=}{I}_{21}+{I}_{22}.\end{array}$
(2.10)

By Lemma 1.5, (2.1), and a standard computation, we have

$\begin{array}{rcl}{I}_{22}& \le & \sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)\sum _{i=1}^{n}P\left({X}_{i}>{n}^{\alpha }+{n}^{\alpha q}\right)\le \sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)\sum _{i=1}^{n}P\left(|{X}_{i}|>{n}^{\alpha }\right)\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1}h\left(n\right)P\left(|X|>{n}^{\alpha }\right)\le C+CE{|X|}^{p}h\left({|X|}^{1/\alpha }\right)<\mathrm{\infty }.\end{array}$
(2.11)

Now we prove ${I}_{21}<\mathrm{\infty }$. By (2.1) and Lemma 1.4, we have ${n}^{-\alpha }{\sum }_{i=1}^{n}E{Y}_{i}^{\left(n,2\right)}\to 0$ as $n\to \mathrm{\infty }$.
Therefore, in order to prove ${I}_{21}<\mathrm{\infty }$, it is enough to prove that

${I}_{21}^{\ast }\le \sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\sum _{i=1}^{n}\left({Y}_{i}^{\left(n,2\right)}-E{Y}_{i}^{\left(n,2\right)}\right)>\epsilon {n}^{\alpha }/12\right)<\mathrm{\infty }.$
(2.12)

Taking v such that $v>max\left\{2,\frac{\alpha p-1}{\alpha -1/2},\frac{2\left(\alpha p-1\right)}{\alpha \left(p-\eta \right)-1}\right\}$, we get by Lemma 1.1, the Markov inequality, the ${C}_{r}$ inequality, the Hölder inequality, and Lemma 1.2,

$\begin{array}{rcl}{I}_{21}^{\ast }& \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-2}h\left(n\right)E|\sum _{i=1}^{n}\left({Y}_{i}^{\left(n,2\right)}-E{Y}_{i}^{\left(n,2\right)}\right){|}^{v}\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-2}h\left(n\right)\sum _{i=1}^{n}E|{Y}_{i}^{\left(n,2\right)}{|}^{v}+C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-2}h\left(n\right){\left(\sum _{i=1}^{n}E{\left({Y}_{i}^{\left(n,2\right)}\right)}^{2}\right)}^{v/2}\\ \stackrel{\mathrm{def}}{=}& {I}_{211}^{\ast }+{I}_{212}^{\ast }.\end{array}$

By the ${C}_{r}$ inequality, Lemma 1.4, Lemma 1.5, (2.1), and a standard computation, we have

$\begin{array}{rcl}{I}_{211}^{\ast }& =& C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-2}h\left(n\right)\sum _{i=1}^{n}E|{Y}_{i}^{\left(n,2\right)}{|}^{v}\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-2}h\left(n\right)\sum _{i=1}^{n}\left\{E{X}_{i}^{v}I\left({n}^{\alpha q}<{X}_{i}\le {n}^{\alpha q}+{n}^{\alpha }\right)+{n}^{\alpha v}P\left({X}_{i}>{n}^{\alpha q}+{n}^{\alpha }\right)\right\}\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-2}h\left(n\right)\sum _{i=1}^{n}\left\{E{|{X}_{i}|}^{v}I\left(|{X}_{i}|\le 2{n}^{\alpha }\right)+{n}^{\alpha v}P\left(|{X}_{i}|>{n}^{\alpha }\right)\right\}\\ \le & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-\alpha v-1}h\left(n\right)\left\{E{|X|}^{v}I\left(|X|\le 2{n}^{\alpha }\right)+{n}^{\alpha v}P\left(|X|>{n}^{\alpha }\right)\right\}\\ \le & C+CE{|X|}^{p}h\left({|X|}^{1/\alpha }\right)<\mathrm{\infty }\end{array}$

and, by an argument similar to that used for ${I}_{12}^{\ast }$, we also have ${I}_{212}^{\ast }<\mathrm{\infty }$.
Therefore, (2.12) holds. By (2.10)-(2.12), we get ${I}_{2}<\mathrm{\infty }$. In the same way as for ${I}_{2}$, we can obtain ${I}_{3}<\mathrm{\infty }$. Thus, (2.2) holds.

(2.2) ⇒ (2.3). Note that $|{S}_{n}^{\left(k\right)}|=|{S}_{n}-{X}_{k}|\le |{S}_{n}|+|{X}_{k}|=|{S}_{n}|+|{S}_{k}-{S}_{k-1}|\le |{S}_{n}|+|{S}_{k}|+|{S}_{k-1}|\le 3{max}_{1\le j\le n}|{S}_{j}|$, so $\left({max}_{1\le k\le n}|{S}_{n}^{\left(k\right)}|\ge \epsilon {n}^{\alpha }\right)\subseteq \left({max}_{1\le j\le n}|{S}_{j}|\ge \epsilon {n}^{\alpha }/3\right)$; hence (2.3) follows from (2.2).

(2.3) ⇒ (2.4). Since $\frac{1}{2}|{S}_{n}|\le \frac{n-1}{n}|{S}_{n}|=|\frac{1}{n}{\sum }_{k=1}^{n}{S}_{n}^{\left(k\right)}|\le {max}_{1\le k\le n}|{S}_{n}^{\left(k\right)}|$, $\mathrm{\forall }n\ge 2$, and $|{X}_{k}|=|{S}_{n}-{S}_{n}^{\left(k\right)}|\le |{S}_{n}|+|{S}_{n}^{\left(k\right)}|$, we have $\left({max}_{1\le k\le n}|{X}_{k}|\ge \epsilon {n}^{\alpha }\right)\subseteq \left(|{S}_{n}|\ge \epsilon {n}^{\alpha }/2\right)\cup \left({max}_{1\le k\le n}|{S}_{n}^{\left(k\right)}|\ge \epsilon {n}^{\alpha }/2\right)\subseteq \left({max}_{1\le k\le n}|{S}_{n}^{\left(k\right)}|\ge \epsilon {n}^{\alpha }/4\right)$, $\mathrm{\forall }n\ge 2$; hence (2.4) follows from (2.3).
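The deterministic inequalities behind these event inclusions hold for arbitrary real sequences, so they can be checked directly (a sketch for illustration, not from the paper):

```python
import random

# Hedged sketch: for any reals x_1..x_n with partial sums S_j and leave-one-out
# sums S_n^{(k)} = S_n - x_k, check the deterministic bounds used above:
#   max_k |S_n^{(k)}| <= 3 * max_j |S_j|   and   ((n-1)/n)|S_n| <= max_k |S_n^{(k)}|.
random.seed(0)
for _ in range(100):
    n = random.randint(2, 20)
    x = [random.uniform(-1, 1) for _ in range(n)]
    S = [sum(x[: j + 1]) for j in range(n)]      # partial sums S_1..S_n
    Snk = [S[-1] - x[k] for k in range(n)]       # S_n^{(k)} = S_n - X_k
    max_S = max(abs(s) for s in S)
    max_Snk = max(abs(s) for s in Snk)
    assert max_Snk <= 3 * max_S + 1e-12
    assert (n - 1) / n * abs(S[-1]) <= max_Snk + 1e-12
```

The first bound is the telescoping estimate $|{S}_{n}-{X}_{k}|\le |{S}_{n}|+|{S}_{k}|+|{S}_{k-1}|$; the second follows from $\frac{1}{n}{\sum }_{k}{S}_{n}^{\left(k\right)}=\frac{n-1}{n}{S}_{n}$ and the triangle inequality.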

(2.2) ⇒ (2.5). By Lemma 1.5 and (2.2), we have

$\begin{array}{c}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{j\ge n}{sup}{j}^{-\alpha }|{S}_{j}|\ge \epsilon \right)\hfill \\ \phantom{\rule{1em}{0ex}}=\sum _{i=1}^{\mathrm{\infty }}\sum _{{2}^{i-1}\le n<{2}^{i}}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{j\ge n}{sup}{j}^{-\alpha }|{S}_{j}|\ge \epsilon \right)\hfill \\ \phantom{\rule{1em}{0ex}}\le C\sum _{i=1}^{\mathrm{\infty }}{2}^{i\left(\alpha p-1\right)}h\left({2}^{i}\right)P\left(\underset{j\ge {2}^{i-1}}{sup}{j}^{-\alpha }|{S}_{j}|\ge \epsilon \right)\hfill \\ \phantom{\rule{1em}{0ex}}\le C\sum _{i=1}^{\mathrm{\infty }}{2}^{i\left(\alpha p-1\right)}h\left({2}^{i}\right)\sum _{k=i}^{\mathrm{\infty }}P\left(\underset{{2}^{k-1}\le j<{2}^{k}}{max}|{S}_{j}|\ge \epsilon {2}^{\alpha \left(k-1\right)}\right)\hfill \\ \phantom{\rule{1em}{0ex}}\le C\sum _{k=1}^{\mathrm{\infty }}P\left(\underset{{2}^{k-1}\le j<{2}^{k}}{max}|{S}_{j}|\ge \epsilon {2}^{\alpha \left(k-1\right)}\right)\sum _{i=1}^{k}{2}^{i\left(\alpha p-1\right)}h\left({2}^{i}\right)\hfill \\ \phantom{\rule{1em}{0ex}}\le C\sum _{k=1}^{\mathrm{\infty }}{2}^{k\left(\alpha p-1\right)}h\left({2}^{k}\right)P\left(\underset{1\le j<{2}^{k}}{max}|{S}_{j}|\ge \epsilon {2}^{\alpha \left(k-1\right)}\right)<\mathrm{\infty }.\hfill \end{array}$

(2.5) ⇒ (2.6). The proof of (2.5) ⇒ (2.6) is similar to that of (2.2) ⇒ (2.4), so it is omitted. □

Theorem 2.2 Let $\alpha >1/2$, $p>0$, $\alpha p>1$ and let $h\left(x\right)$ be a slowly varying function at infinity. Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of NOD random variables and let X be a random variable, possibly defined on a different space. Assume additionally that $E{X}_{n}=0$ for all $n\ge 1$ when $\alpha \le 1$. If there exist constants ${D}_{1}>0$ and ${D}_{2}>0$ such that

$\frac{{D}_{1}}{n}\sum _{i=n}^{2n-1}P\left(|{X}_{i}|>x\right)\le P\left(|X|>x\right)\le \frac{{D}_{2}}{n}\sum _{i=n}^{2n-1}P\left(|{X}_{i}|>x\right),\phantom{\rule{1em}{0ex}}\mathrm{\forall }x>0,n\ge 1,$

then (2.1)-(2.6) are equivalent.

Proof From the proof of Theorem 2.1, in order to prove Theorem 2.2, it is enough to show that (2.4) ⇒ (2.6) and (2.6) ⇒ (2.1). The proof of (2.4) ⇒ (2.6) is similar to that of (2.2) ⇒ (2.5). Now, we prove (2.6) ⇒ (2.1). Firstly, we prove that

$\underset{n\to \mathrm{\infty }}{lim}P\left(\underset{j\ge n}{sup}{j}^{-\alpha }|{X}_{j}|\ge \epsilon \right)=0,\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.$
(2.13)

Otherwise, there are ${\epsilon }_{0}>0$, $\delta >0$, and a sequence of positive integers $\left\{{n}_{k},k\ge 1\right\}$, ${n}_{k}↑\mathrm{\infty }$ such that $P\left({sup}_{j\ge {n}_{k}}{j}^{-\alpha }|{X}_{j}|\ge {\epsilon }_{0}\right)\ge \delta$, $\mathrm{\forall }k\ge 1$. Without loss of generality, we can assume that ${n}_{k+1}\ge 2{n}_{k}$, $\mathrm{\forall }k\ge 1$. Therefore, we have

$P\left(\underset{j\ge 2{n}_{k}}{sup}{j}^{-\alpha }|{X}_{j}|\ge {\epsilon }_{0}\right)\ge \delta ,\phantom{\rule{1em}{0ex}}\mathrm{\forall }k\ge 1.$

By $\alpha p>1$ we have

$\begin{array}{c}\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{j\ge n}{sup}{j}^{-\alpha }|{X}_{j}|\ge {\epsilon }_{0}\right)\hfill \\ \phantom{\rule{1em}{0ex}}\ge \sum _{k=1}^{\mathrm{\infty }}\sum _{n={n}_{k}+1}^{2{n}_{k}}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{j\ge n}{sup}{j}^{-\alpha }|{X}_{j}|\ge {\epsilon }_{0}\right)\hfill \\ \phantom{\rule{1em}{0ex}}\ge C\sum _{k=1}^{\mathrm{\infty }}{n}_{k}^{\alpha p-1}h\left({n}_{k}\right)P\left(\underset{j\ge 2{n}_{k}}{sup}{j}^{-\alpha }|{X}_{j}|\ge {\epsilon }_{0}\right)=\mathrm{\infty },\hfill \end{array}$

which contradicts (2.6); thus, (2.13) holds. By Lemma 1.1, we get

$\begin{array}{rcl}P\left(\underset{j\ge n}{sup}{j}^{-\alpha }|{X}_{j}|\ge \epsilon \right)& \ge & P\left(\underset{n\le j<2n}{max}{j}^{-\alpha }|{X}_{j}|\ge \epsilon \right)\\ \ge & P\left(\underset{n\le j<2n}{max}|{X}_{j}|\ge {\left(2n\right)}^{\alpha }\epsilon \right)\\ \ge & 1-P\left(\underset{n\le j<2n}{max}{X}_{j}<{\left(2n\right)}^{\alpha }\epsilon \right)=1-E\left(\prod _{j=n}^{2n-1}I\left({X}_{j}<{\left(2n\right)}^{\alpha }\epsilon \right)\right)\\ \ge & 1-\prod _{j=n}^{2n-1}P\left({X}_{j}<{\left(2n\right)}^{\alpha }\epsilon \right)=1-\prod _{j=n}^{2n-1}\left(1-P\left({X}_{j}\ge {\left(2n\right)}^{\alpha }\epsilon \right)\right)\\ \ge & 1-exp\left(-\sum _{j=n}^{2n-1}P\left({X}_{j}\ge {\left(2n\right)}^{\alpha }\epsilon \right)\right).\end{array}$

By (2.13), we have ${lim}_{n\to \mathrm{\infty }}{\sum }_{j=n}^{2n-1}P\left({X}_{j}\ge {\left(2n\right)}^{\alpha }\epsilon \right)=0$, $\mathrm{\forall }\epsilon >0$. Therefore, when n is large enough, we have

$\begin{array}{rcl}P\left(\underset{n\le j<2n}{max}{j}^{-\alpha }|{X}_{j}|\ge \epsilon \right)& \ge & 1-\left\{1-\sum _{j=n}^{2n-1}P\left({X}_{j}\ge {\left(2n\right)}^{\alpha }\epsilon \right)+\frac{1}{2}{\left(\sum _{j=n}^{2n-1}P\left({X}_{j}\ge {\left(2n\right)}^{\alpha }\epsilon \right)\right)}^{2}\right\}\\ \ge & C\sum _{j=n}^{2n-1}P\left({X}_{j}\ge {\left(2n\right)}^{\alpha }\epsilon \right),\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.\end{array}$

In a similar way, when n is large enough,

$P\left(\underset{n\le j<2n}{max}{j}^{-\alpha }|{X}_{j}|\ge \epsilon \right)\ge C\sum _{j=n}^{2n-1}P\left(-{X}_{j}\ge {\left(2n\right)}^{\alpha }\epsilon \right),\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.$

Thus, when n is large enough, we have

$P\left(\underset{n\le j<2n}{max}{j}^{-\alpha }|{X}_{j}|\ge \epsilon \right)\ge C\sum _{j=n}^{2n-1}P\left(|{X}_{j}|\ge {\left(2n\right)}^{\alpha }\epsilon \right)\ge CnP\left(|X|\ge {\left(2n\right)}^{\alpha }\epsilon \right),\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.$
(2.14)
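The elementary bounds behind (2.14) can be checked numerically (a sketch for illustration, not from the paper): for ${p}_{j}\in \left[0,1\right]$ one has $1-{\prod }_{j}\left(1-{p}_{j}\right)\ge 1-exp\left(-{\sum }_{j}{p}_{j}\right)$, and for $s\ge 0$, $1-{e}^{-s}\ge s-{s}^{2}/2$, which gives $1-{e}^{-s}\ge s/2$ once $s\le 1$:

```python
import math

# Hedged sketch: elementary inequalities used to pass from the product bound
# to a constant multiple of sum_j p_j when the p_j are small.
def one_minus_prod(ps):
    prod = 1.0
    for p in ps:
        prod *= 1.0 - p
    return 1.0 - prod

for ps in ([0.01] * 10, [0.001, 0.02, 0.005], [0.3, 0.1]):
    s = sum(ps)
    # 1 - prod(1 - p_j) >= 1 - exp(-sum p_j), since 1 - p <= exp(-p)
    assert one_minus_prod(ps) >= 1.0 - math.exp(-s) - 1e-12
    # 1 - exp(-s) >= s - s^2/2 for s >= 0
    assert 1.0 - math.exp(-s) >= s - s * s / 2.0 - 1e-12
    if s <= 1.0:
        assert 1.0 - math.exp(-s) >= s / 2.0 - 1e-12  # constant C = 1/2
```

Since ${\sum }_{j=n}^{2n-1}P\left({X}_{j}\ge {\left(2n\right)}^{\alpha }\epsilon \right)\to 0$ by (2.13), the sums eventually lie in $\left[0,1\right]$, and the last inequality yields the constant C in (2.14).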

Taking $\epsilon ={2}^{-\alpha }$, by (2.6), (2.14), Lemma 1.5, and a standard computation, we have

$\begin{array}{rcl}\mathrm{\infty }& >& \sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{j\ge n}{sup}{j}^{-\alpha }|{X}_{j}|\ge {2}^{-\alpha }\right)\ge \sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\underset{n\le j<2n}{max}{j}^{-\alpha }|{X}_{j}|\ge {2}^{-\alpha }\right)\\ \ge & C\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-1}h\left(n\right)P\left(|X|\ge {n}^{\alpha }\right)\\ \ge & CE{|X|}^{p}h\left({|X|}^{1/\alpha }\right).\end{array}$

Thus, (2.1) holds. □

In the following, let $\left\{{\tau }_{n},n\ge 1\right\}$ be a sequence of non-negative, integer valued random variables and τ a positive random variable. All random variables are defined on the same probability space.

Theorem 2.3 Let $\alpha >1/2$, $p>0$, $\alpha p>1$ and let $h\left(x\right)>0$ be a slowly varying function as $x\to +\mathrm{\infty }$. Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of NOD random variables and let X be a random variable, possibly defined on a different space, satisfying the conditions (1.1) and (2.1). Assume additionally that $E{X}_{n}=0$ for all $n\ge 1$ when $\alpha \le 1$. If there exists $\lambda >0$ such that ${\sum }_{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(\frac{{\tau }_{n}}{n}<\lambda \right)<\mathrm{\infty }$, then

$\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(|{S}_{{\tau }_{n}}|\ge \epsilon {\tau }_{n}^{\alpha }\right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.$
(2.15)

Proof Note that

$\left(|{S}_{{\tau }_{n}}|\ge \epsilon {\tau }_{n}^{\alpha }\right)\subseteq \left({\tau }_{n}/n<\lambda \right)\cup \left(|{S}_{{\tau }_{n}}|\ge \epsilon {\tau }_{n}^{\alpha },{\tau }_{n}\ge \lambda n\right)\subseteq \left({\tau }_{n}/n<\lambda \right)\cup \left(\underset{j\ge \lambda n}{sup}{j}^{-\alpha }|{S}_{j}|\ge \epsilon \right).$

Thus, by (2.5) of Theorem 2.1, we have (2.15). □

Theorem 2.4 Let $\alpha >1/2$, $p>0$, $\alpha p>1$ and let $h\left(x\right)$ be a slowly varying function at infinity. Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of NOD random variables and let X be a random variable, possibly defined on a different space, satisfying the conditions (1.1) and (2.1). Assume additionally that $E{X}_{n}=0$ for all $n\ge 1$ when $\alpha \le 1$. If there exists $\theta >0$ such that ${\sum }_{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(|\frac{{\tau }_{n}}{n}-\tau |>\theta \right)<\mathrm{\infty }$ with $P\left(\tau \le B\right)=1$ for some $B>0$, then

$\sum _{n=1}^{\mathrm{\infty }}{n}^{\alpha p-2}h\left(n\right)P\left(|{S}_{{\tau }_{n}}|\ge \epsilon {n}^{\alpha }\right)<\mathrm{\infty },\phantom{\rule{1em}{0ex}}\mathrm{\forall }\epsilon >0.$
(2.16)

Proof Note that

$\begin{array}{rcl}\left(|{S}_{{\tau }_{n}}|\ge \epsilon {n}^{\alpha }\right)& \subseteq & \left(|\frac{{\tau }_{n}}{n}-\tau |>\theta \right)\cup \left(|{S}_{{\tau }_{n}}|\ge \epsilon {n}^{\alpha },|\frac{{\tau }_{n}}{n}-\tau |\le \theta \right)\\ \subseteq & \left(|\frac{{\tau }_{n}}{n}-\tau |>\theta \right)\cup \left(|{S}_{{\tau }_{n}}|\ge \epsilon {n}^{\alpha },{\tau }_{n}\le \left(\tau +\theta \right)n\right)\\ \subseteq & \left(|\frac{{\tau }_{n}}{n}-\tau |>\theta \right)\cup \left(|{S}_{{\tau }_{n}}|\ge \epsilon {n}^{\alpha },{\tau }_{n}\le \left(B+\theta \right)n\right)\\ \subseteq & \left(|\frac{{\tau }_{n}}{n}-\tau |>\theta \right)\cup \left(\underset{1\le j\le \left(B+\theta \right)n}{max}|{S}_{j}|\ge \epsilon {n}^{\alpha }\right).\end{array}$

Thus, by (2.2) of Theorem 2.1, we have (2.16). □

## References

1. Hsu P, Robbins H: Complete convergence and the law of large numbers. Proc. Natl. Acad. Sci. USA 1947, 33: 25–31. 10.1073/pnas.33.2.25

2. Baum LE, Katz M: Convergence rates in the law of large numbers. Trans. Am. Math. Soc. 1965, 120: 108–123. 10.1090/S0002-9947-1965-0198524-1

3. Baek J, Park ST: Convergence of weighted sums for arrays of negatively dependent random variables and its applications. J. Stat. Plan. Inference 2010, 140: 2461–2469. 10.1016/j.jspi.2010.02.021

4. Bai ZD, Su C: The complete convergence for partial sums of i.i.d. random variables. Sci. China Ser. A 1985, 5: 399–412.

5. Chen P, Hu TC, Liu X, Volodin A: On complete convergence for arrays of rowwise negatively associated random variables. Theory Probab. Appl. 2007, 52: 323–328.

6. Gan S, Chen P: Strong convergence rate of weighted sums for negatively dependent sequences. Acta. Math. Sci. Ser. A 2008, 28: 283–290. (in Chinese)

7. Gut A: Complete convergence for arrays. Period. Math. Hung. 1992, 25: 51–75. 10.1007/BF02454383

8. Kuczmaszewska A: On complete convergence in Marcinkiewicz-Zygmund type SLLN for negatively associated random variables. Acta Math. Hung. 2010,128(1–2):116–130. 10.1007/s10474-009-9166-y

9. Liang HY, Wang L: Convergence rates in the law of large numbers for B -valued random elements. Acta Math. Sci. Ser. B 2001, 21: 229–236.

10. Peligrad M, Gut A: Almost-sure results for a class of dependent random variables. J. Theor. Probab. 1999, 12: 87–104. 10.1023/A:1021744626773

11. Qiu DH, Chang KC, Antonini RG, Volodin A: On the strong rates of convergence for arrays of rowwise negatively dependent random variables. Stoch. Anal. Appl. 2011, 29: 375–385. 10.1080/07362994.2011.548683

12. Sung SH: Complete convergence for weighted sums of random variables. Stat. Probab. Lett. 2007, 77: 303–311. 10.1016/j.spl.2006.07.010

13. Sung SH: A note on the complete convergence for arrays of rowwise independent random elements. Stat. Probab. Lett. 2008, 78: 1283–1289. 10.1016/j.spl.2007.11.018

14. Taylor RL, Patterson R, Bozorgnia A: A strong law of large numbers for arrays of rowwise negatively dependent random variables. Stoch. Anal. Appl. 2002, 20: 643–656. 10.1081/SAP-120004118

15. Wang XM: Complete convergence for sums of NA sequence. Acta Math. Appl. Sin. 1999, 22: 407–412.

16. Zhang LX, Wang JF: A note on complete convergence of pairwise NQD random sequences. Appl. Math. J. Chin. Univ. Ser. A 2004, 19: 203–208. 10.1007/s11766-004-0055-4

17. Shao QM: A comparison theorem on moment inequalities between negatively associated and independent random variables. J. Theor. Probab. 2000, 13: 343–356. 10.1023/A:1007849609234

18. Joag-Dev K, Proschan F: Negative association of random variables with applications. Ann. Stat. 1983, 11: 286–295. 10.1214/aos/1176346079

19. Bozorgnia A, Patterson RF, Taylor RL: Limit theorems for dependent random variables. II. In Proc. of the First World Congress of Nonlinear Analysts ’92. de Gruyter, Berlin; 1996:1639–1650.

20. Ko MH, Han KH, Kim TS: Strong laws of large numbers for weighted sums of negatively dependent random variables. J. Korean Math. Soc. 2006, 43: 1325–1338.

21. Ko MH, Kim TS: Almost sure convergence for weighted sums of negatively dependent random variables. J. Korean Math. Soc. 2005, 42: 949–957.

22. Asadian N, Fakoor V, Bozorgnia A: Rosenthal’s type inequalities for negatively orthant dependent random variables. J. Iran. Stat. Soc. 2006,5(1–2):69–75.

23. Stout WF: Almost Sure Convergence. Academic Press, New York; 1974.

24. Seneta E: Regularly Varying Functions. Lecture Notes in Mathematics 508. Springer, Berlin; 1976.

## Acknowledgements

The authors would like to thank the referees and the editors for the helpful comments and suggestions. This work was supported by the National Natural Science Foundation of China (Grant No. 11271161).

## Author information


### Corresponding author

Correspondence to Dehua Qiu.

### Competing interests

The authors declare that they have no competing interests.

### Authors’ contributions

All authors contributed equally to the writing of this paper. All authors read and approved the final manuscript.

## Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Qiu, D., Wu, Q. & Chen, P. Complete convergence for negatively orthant dependent random variables. J Inequal Appl 2014, 145 (2014). https://doi.org/10.1186/1029-242X-2014-145