# Some probability inequalities for a class of random variables and their applications

## Abstract

Some probability inequalities for a class of random variables are presented. As applications, we study complete convergence for this class. Our main results generalize the corresponding ones for negatively associated random variables and negatively orthant dependent random variables.

MSC:60E15, 60F15.

## 1 Introduction

Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of random variables defined on a fixed probability space $\left(\mathrm{\Omega },\mathcal{F},P\right)$. The exponential inequality for the partial sums ${\sum }_{i=1}^{n}\left({X}_{i}-E{X}_{i}\right)$ plays an important role in various proofs of limit theorems. In particular, it provides a measure of the convergence rate for the strong law of large numbers. The main purpose of the paper is to present some probability inequalities for a class of random variables. As applications, we will establish some complete convergence results for this class.

Firstly, we will recall the definitions of negatively orthant dependent random variables and acceptable random variables.

Definition 1.1 A finite collection of random variables ${X}_{1},{X}_{2},\dots ,{X}_{n}$ is said to be negatively orthant dependent (NOD, in short) if the following two inequalities:

$P\left({X}_{1}>{x}_{1},{X}_{2}>{x}_{2},\dots ,{X}_{n}>{x}_{n}\right)\le \prod _{i=1}^{n}P\left({X}_{i}>{x}_{i}\right)$
(1.1)

and

$P\left({X}_{1}\le {x}_{1},{X}_{2}\le {x}_{2},\dots ,{X}_{n}\le {x}_{n}\right)\le \prod _{i=1}^{n}P\left({X}_{i}\le {x}_{i}\right)$
(1.2)

hold for all real numbers ${x}_{1},{x}_{2},\dots ,{x}_{n}$. An infinite sequence $\left\{{X}_{n},n\ge 1\right\}$ is said to be NOD if every finite subcollection is NOD.

The notion of NOD random variables was introduced by Lehmann [1] and developed in Joag-Dev and Proschan [2]. Obviously, independent random variables are NOD. Joag-Dev and Proschan [2] pointed out that negatively associated (NA, in short) random variables are NOD, but neither NUOD (negative upper orthant dependence, i.e., (1.1)) nor NLOD (negative lower orthant dependence, i.e., (1.2)) alone implies NA. They also presented an example of a random vector $X=\left({X}_{1},{X}_{2},{X}_{3},{X}_{4}\right)$ that is NOD but not NA. So, we can see that NOD is strictly weaker than NA.

Recently, Giuliano Antonini et al. [3] introduced the following notion of acceptability.

Definition 1.2 We say that a finite collection of random variables ${X}_{1},{X}_{2},\dots ,{X}_{n}$ is acceptable if for any real λ,

$Eexp\left(\lambda \sum _{i=1}^{n}{X}_{i}\right)\le \prod _{i=1}^{n}Eexp\left(\lambda {X}_{i}\right).$
(1.3)

An infinite sequence of random variables $\left\{{X}_{n},n\ge 1\right\}$ is acceptable if every finite subcollection is acceptable.
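
For a dependent example, consider the antithetic pair $\left(X,-X\right)$ with $X$ standard normal: it is NA, hence NOD, and (1.3) can be checked numerically. The following Monte Carlo sketch is illustrative only (the sample size and seed are arbitrary choices, not part of the paper):

```python
import numpy as np

# Antithetic pair (X, -X): here S_2 = X + (-X) = 0, so the left-hand side of
# (1.3) is E exp(lam * 0) = 1, while the product of the marginal MGFs is
# approximately exp(lam^2 / 2) * exp(lam^2 / 2) >= 1, so (1.3) holds with slack.
rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)

for lam in (-1.0, -0.3, 0.5, 1.2):
    lhs = np.mean(np.exp(lam * (x + (-x))))                    # exactly 1
    rhs = np.mean(np.exp(lam * x)) * np.mean(np.exp(-lam * x))
    assert lhs <= rhs
```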

As mentioned in Giuliano Antonini et al. [3], a sequence of NOD random variables with a finite Laplace transform or finite moment generating function near zero (and hence a sequence of negatively associated random variables with a finite Laplace transform, too) provides an example of acceptable random variables. For example, Xing et al. [4] considered a strictly stationary negatively associated sequence of random variables; by the observation above, such a sequence is acceptable. Hence, the model of acceptable random variables is more general than the models considered in the previous literature, and studying the limiting behavior of acceptable random variables is of interest.

The main purpose of the paper is to present some exponential probability inequalities for a sequence of acceptable random variables and to give some applications of these inequalities. For more details on exponential probability inequalities, one can refer to Wang et al. [5–7], Sung et al. [8], Sung [9] and Xing et al. [4, 10], among others.

The paper is organized as follows. The exponential probability inequalities for a sequence of acceptable random variables are presented in Section 2, and the corresponding complete convergence results are obtained in Section 3. Our results require only moment conditions, whereas the main results of Sung et al. [8] require both moment conditions and identical distribution.

Throughout the paper, let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of acceptable random variables and denote ${S}_{n}={\sum }_{i=1}^{n}{X}_{i}$ for each $n\ge 1$.

## 2 Probability inequalities for acceptable random variables

Theorem 2.1 Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of acceptable random variables and $\left\{{g}_{n},n\ge 1\right\}$ be a sequence of positive numbers with ${G}_{n}={\sum }_{i=1}^{n}{g}_{i}$ for each $n\ge 1$. For fixed $n\ge 1$, if there exists a positive number T such that

$E{e}^{t{X}_{k}}\le {e}^{\frac{1}{2}{g}_{k}{t}^{2}},\phantom{\rule{1em}{0ex}}0\le t\le T,k=1,2,\dots ,n,$
(2.1)

then

$P\left({S}_{n}\ge x\right)\le \left\{\begin{array}{cc}{e}^{-\frac{{x}^{2}}{2{G}_{n}}},\hfill & 0\le x\le {G}_{n}T,\hfill \\ {e}^{-\frac{Tx}{2}},\hfill & x\ge {G}_{n}T.\hfill \end{array}$
(2.2)

Proof For each x, by Markov’s inequality, Definition 1.2 and (2.1), we can see that

$P\left({S}_{n}\ge x\right)\le {e}^{-tx}E{e}^{t{S}_{n}}\le {e}^{-tx}\prod _{i=1}^{n}E{e}^{t{X}_{i}}\le {e}^{\frac{{G}_{n}{t}^{2}}{2}-tx},\phantom{\rule{1em}{0ex}}0\le t\le T,$
(2.3)

which implies that

$P\left({S}_{n}\ge x\right)\le exp\left\{\underset{0<t\le T}{inf}\left(\frac{{G}_{n}{t}^{2}}{2}-tx\right)\right\}.$
(2.4)

For fixed $x\ge 0$, if $T\ge \frac{x}{{G}_{n}}\ge 0$, then

${e}^{{inf}_{0<t\le T}\left(\frac{{G}_{n}{t}^{2}}{2}-tx\right)}\le {e}^{\frac{{G}_{n}}{2}{\left(\frac{x}{{G}_{n}}\right)}^{2}-\frac{{x}^{2}}{{G}_{n}}}={e}^{-\frac{{x}^{2}}{2{G}_{n}}};$
(2.5)

if $T\le \frac{x}{{G}_{n}}$, then

${e}^{{inf}_{0<t\le T}\left(\frac{{G}_{n}{t}^{2}}{2}-tx\right)}\le {e}^{\frac{{G}_{n}{T}^{2}}{2}-Tx}\le {e}^{\frac{Tx}{2}-Tx}={e}^{-\frac{Tx}{2}}.$
(2.6)

The desired result (2.2) follows from (2.4)-(2.6) immediately. □
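
As a sanity check (not from the paper): if the ${X}_{k}$ are independent $N\left(0,{g}_{k}\right)$, then $E{e}^{t{X}_{k}}={e}^{{g}_{k}{t}^{2}/2}$ exactly, so (2.1) holds for every $T>0$ and (2.2) yields the sub-Gaussian tail ${e}^{-{x}^{2}/2{G}_{n}}$ for all $x\ge 0$. A hedged Monte Carlo sketch (the particular ${g}_{k}$, sample size and seed are arbitrary):

```python
import numpy as np

# Independent N(0, g_k) summands: E exp(t X_k) = exp(g_k t^2 / 2) exactly,
# so Theorem 2.1 gives P(S_n >= x) <= exp(-x^2 / (2 G_n)) for all x >= 0.
rng = np.random.default_rng(1)
g = np.array([0.5, 1.0, 2.0])                 # the g_k of the theorem
G = g.sum()                                   # G_n
S = rng.normal(0.0, np.sqrt(g), size=(500_000, 3)).sum(axis=1)

for x in (1.0, 2.0, 3.0):
    emp = np.mean(S >= x)                     # empirical tail P(S_n >= x)
    assert emp <= np.exp(-x * x / (2 * G))
```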

Corollary 2.1 Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of acceptable random variables and $\left\{{g}_{n},n\ge 1\right\}$ be a sequence of positive numbers with ${G}_{n}={\sum }_{i=1}^{n}{g}_{i}$ for each $n\ge 1$. For fixed $n\ge 1$, if there exists a positive number T such that

$E{e}^{t{X}_{k}}\le {e}^{\frac{1}{2}{g}_{k}{t}^{2}},\phantom{\rule{2em}{0ex}}|t|\le T,\phantom{\rule{1em}{0ex}}k=1,2,\dots ,n,$
(2.7)

then

$P\left({S}_{n}\le -x\right)\le \left\{\begin{array}{cc}{e}^{-\frac{{x}^{2}}{2{G}_{n}}},\hfill & 0\le x\le {G}_{n}T,\hfill \\ {e}^{-\frac{Tx}{2}},\hfill & x\ge {G}_{n}T,\hfill \end{array}$
(2.8)

and

$P\left(|{S}_{n}|\ge x\right)\le \left\{\begin{array}{cc}2{e}^{-\frac{{x}^{2}}{2{G}_{n}}},\hfill & 0\le x\le {G}_{n}T,\hfill \\ 2{e}^{-\frac{Tx}{2}},\hfill & x\ge {G}_{n}T.\hfill \end{array}$
(2.9)

Proof It is easily seen that $\left\{-{X}_{n},n\ge 1\right\}$ is still a sequence of acceptable random variables (apply (1.3) with λ replaced by −λ). By Theorem 2.1, we can see that

$P\left(-{S}_{n}\ge x\right)\le \left\{\begin{array}{cc}{e}^{-\frac{{x}^{2}}{2{G}_{n}}},\hfill & 0\le x\le {G}_{n}T,\hfill \\ {e}^{-\frac{Tx}{2}},\hfill & x\ge {G}_{n}T,\hfill \end{array}$
(2.10)

which implies that (2.8) is valid. Finally, (2.9) follows from (2.2) and (2.8) immediately. □

Corollary 2.2 Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of acceptable random variables with $E{X}_{k}=0$ and $E{X}_{k}^{2}={\sigma }_{k}^{2}<\mathrm{\infty }$ for each $k\ge 1$. Denote ${B}_{n}^{2}={\sum }_{k=1}^{n}{\sigma }_{k}^{2}$ for each $n\ge 1$. For fixed $n\ge 1$, if there exists a positive number H such that

$|E{X}_{k}^{m}|\le \frac{m!}{2}{\sigma }_{k}^{2}{H}^{m-2},\phantom{\rule{1em}{0ex}}k=1,2,\dots ,n$
(2.11)

for any positive integer $m\ge 2$, then

$P\left(|{S}_{n}|\ge x\right)\le \left\{\begin{array}{cc}2{e}^{-\frac{{x}^{2}}{4{B}_{n}^{2}}},\hfill & 0\le x\le \frac{{B}_{n}^{2}}{H},\hfill \\ 2{e}^{-\frac{x}{4H}},\hfill & x\ge \frac{{B}_{n}^{2}}{H}.\hfill \end{array}$
(2.12)

Proof By (2.11) and $E{X}_{k}=0$, we can see that

$E{e}^{t{X}_{k}}=1+\frac{{t}^{2}}{2}{\sigma }_{k}^{2}+\frac{{t}^{3}}{6}E{X}_{k}^{3}+\cdots \le 1+\frac{{t}^{2}}{2}{\sigma }_{k}^{2}\left(1+H|t|+{H}^{2}{t}^{2}+\cdots \right)$

for $k=1,2,\dots ,n$. When $|t|\le \frac{1}{2H}$, it follows that

$E{e}^{t{X}_{k}}\le 1+\frac{{t}^{2}{\sigma }_{k}^{2}}{2}\cdot \frac{1}{1-H|t|}\le 1+{t}^{2}{\sigma }_{k}^{2}\le {e}^{{t}^{2}{\sigma }_{k}^{2}}\doteq {e}^{\frac{1}{2}{g}_{k}{t}^{2}},\phantom{\rule{1em}{0ex}}k=1,2,\dots ,n,$
(2.13)

where ${g}_{k}=2{\sigma }_{k}^{2}$. Take ${G}_{n}\doteq {\sum }_{k=1}^{n}{g}_{k}=2{B}_{n}^{2}$ and $T=\frac{1}{2H}$. Hence, the conditions of Corollary 2.1 are satisfied. Therefore, (2.12) follows from Corollary 2.1 immediately. □
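
Condition (2.11) is a Bernstein-type moment condition, and bounded random variables satisfy it. For instance (an illustrative choice, not from the paper), $X$ uniform on $\left[-1,1\right]$ has ${\sigma }^{2}=1/3$ and $E{X}^{m}=1/\left(m+1\right)$ for even m, 0 for odd m, and (2.11) holds with the hypothetical choice $H=1$:

```python
import math

# Check the Bernstein-type condition (2.11) for X ~ Uniform[-1, 1] with the
# (hypothetical) choice H = 1: |E X^m| <= (m!/2) * sigma^2 * H^(m-2).
sigma2 = 1.0 / 3.0
H = 1.0
for m in range(2, 12):
    moment = 1.0 / (m + 1) if m % 2 == 0 else 0.0   # E X^m for Uniform[-1, 1]
    assert abs(moment) <= math.factorial(m) / 2 * sigma2 * H ** (m - 2)
```

Note that for $m=2$ the two sides coincide, since ${H}^{m-2}=1$ there.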

## 3 Complete convergence for acceptable random variables

In this section, we will present some complete convergence results for a sequence of acceptable random variables by using the probability inequalities established in Section 2.

Theorem 3.1 Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of acceptable random variables with $E{X}_{k}=0$ and $E{X}_{k}^{2}\doteq {\sigma }_{k}^{2}<\mathrm{\infty }$ for each $k\ge 1$. Denote ${B}_{n}^{2}={\sum }_{i=1}^{n}{\sigma }_{i}^{2}$, $n\ge 1$. For fixed $n\ge 1$, suppose that there exists a positive number H such that (2.11) holds true. If for any $\epsilon >0$,

$\sum _{n=1}^{\mathrm{\infty }}exp\left\{-\frac{{b}_{n}^{2}{\epsilon }^{2}}{4{B}_{n}^{2}}\right\}<\mathrm{\infty },$
(3.1)

and

$\sum _{n=1}^{\mathrm{\infty }}exp\left\{-\frac{{b}_{n}\epsilon }{4H}\right\}<\mathrm{\infty },$
(3.2)

where $\left\{{b}_{n},n\ge 1\right\}$ is a sequence of positive numbers, then $\frac{1}{{b}_{n}}{\sum }_{i=1}^{n}{X}_{i}\to 0$ completely as $n\to \mathrm{\infty }$.

Proof By Corollary 2.2, we have for any $x\ge 0$,

$P\left(|\sum _{i=1}^{n}{X}_{i}|\ge x\right)\le 2exp\left\{-\frac{{x}^{2}}{4{B}_{n}^{2}}\right\}+2exp\left\{-\frac{x}{4H}\right\},$

which implies that

$\sum _{n=1}^{\mathrm{\infty }}P\left(|\frac{1}{{b}_{n}}\sum _{i=1}^{n}{X}_{i}|\ge \epsilon \right)\le 2\sum _{n=1}^{\mathrm{\infty }}exp\left\{-\frac{{b}_{n}^{2}{\epsilon }^{2}}{4{B}_{n}^{2}}\right\}+2\sum _{n=1}^{\mathrm{\infty }}exp\left\{-\frac{{b}_{n}\epsilon }{4H}\right\}<\mathrm{\infty }.$

This completes the proof of the theorem. □

It is easily seen that (3.2) holds if ${b}_{n}=n$. So, we have the following corollary.

Corollary 3.1 Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of acceptable random variables with $E{X}_{i}=0$ and $E{X}_{i}^{2}\doteq {\sigma }_{i}^{2}<\mathrm{\infty }$ for each $i\ge 1$. Denote ${B}_{n}^{2}={\sum }_{i=1}^{n}{\sigma }_{i}^{2}$, $n\ge 1$. Suppose that conditions (2.11) and (3.1) hold with ${b}_{n}=n$. Then $\frac{1}{n}{\sum }_{i=1}^{n}{X}_{i}\to 0$ completely as $n\to \mathrm{\infty }$.

Theorem 3.2 Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of acceptable random variables with $E{X}_{i}=0$ and $E{X}_{i}^{2}\doteq {\sigma }_{i}^{2}<\mathrm{\infty }$ for each $i\ge 1$. Denote ${B}_{n}^{2}={\sum }_{i=1}^{n}{\sigma }_{i}^{2}$ and

${c}_{n}=max\left\{esssup\frac{|{X}_{i}|}{\sqrt{{B}_{n}^{2}}},1\le i\le n\right\},\phantom{\rule{1em}{0ex}}n\ge 1.$

If for any $\epsilon >0$,

$\sum _{n=1}^{\mathrm{\infty }}exp\left\{-\frac{n{\epsilon }^{2}}{2{c}_{n}^{2}{B}_{n}^{2}}\right\}<\mathrm{\infty },$
(3.3)

then $\frac{1}{n}{\sum }_{i=1}^{n}{X}_{i}\to 0$ completely as $n\to \mathrm{\infty }$.

Proof By Markov’s inequality and Definition 1.2, for any $\epsilon >0$ and $t>0$,

$\begin{array}{rcl}P\left(|\sum _{i=1}^{n}{X}_{i}|\ge \epsilon \right)& =& P\left(\sum _{i=1}^{n}{X}_{i}\ge \epsilon \right)+P\left(\sum _{i=1}^{n}\left(-{X}_{i}\right)\ge \epsilon \right)\\ \le & {e}^{-\frac{t\epsilon }{\sqrt{{B}_{n}^{2}}}}Eexp\left\{\frac{t}{\sqrt{{B}_{n}^{2}}}\sum _{i=1}^{n}{X}_{i}\right\}+{e}^{-\frac{t\epsilon }{\sqrt{{B}_{n}^{2}}}}Eexp\left\{-\frac{t}{\sqrt{{B}_{n}^{2}}}\sum _{i=1}^{n}{X}_{i}\right\}\\ \le & {e}^{-\frac{t\epsilon }{\sqrt{{B}_{n}^{2}}}}\left(\prod _{i=1}^{n}E{e}^{\frac{t{X}_{i}}{\sqrt{{B}_{n}^{2}}}}+\prod _{i=1}^{n}E{e}^{\frac{-t{X}_{i}}{\sqrt{{B}_{n}^{2}}}}\right)\\ \le & 2exp\left\{-\frac{t\epsilon }{\sqrt{{B}_{n}^{2}}}+\frac{n{t}^{2}{c}_{n}^{2}}{2}\right\}.\end{array}$

Taking $t=\frac{\epsilon }{n{c}_{n}^{2}\sqrt{{B}_{n}^{2}}}$ in the inequality above, we can get that

$P\left(|\sum _{i=1}^{n}{X}_{i}|\ge \epsilon \right)\le 2exp\left\{-\frac{{\epsilon }^{2}}{2n{c}_{n}^{2}{B}_{n}^{2}}\right\}.$

It follows from the inequality above (applied with ε replaced by nε) and (3.3) that

$\sum _{n=1}^{\mathrm{\infty }}P\left(|\frac{1}{n}\sum _{i=1}^{n}{X}_{i}|\ge \epsilon \right)\le 2\sum _{n=1}^{\mathrm{\infty }}exp\left\{-\frac{n{\epsilon }^{2}}{2{c}_{n}^{2}{B}_{n}^{2}}\right\}<\mathrm{\infty },$

which completes the proof of the theorem. □
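
The final step of the proof above uses a sub-Gaussian MGF bound for bounded, mean-zero variables: since $E{X}_{i}=0$ and $|{X}_{i}|/\sqrt{{B}_{n}^{2}}\le {c}_{n}$ a.s., Hoeffding's lemma gives $E{e}^{\pm t{X}_{i}/\sqrt{{B}_{n}^{2}}}\le {e}^{{t}^{2}{c}_{n}^{2}/2}$. A minimal numerical check of this lemma for a uniform variable (an arbitrary bounded, mean-zero example, not from the paper):

```python
import numpy as np

# Hoeffding's lemma: for mean-zero X with |X| <= c, E exp(t X) <= exp(t^2 c^2 / 2).
# Checked by Monte Carlo for X ~ Uniform[-1, 1] (so c = 1); sample size arbitrary.
rng = np.random.default_rng(2)
x = rng.uniform(-1.0, 1.0, 400_000)

for t in (0.25, 0.5, 1.0, 2.0):
    assert np.mean(np.exp(t * x)) <= np.exp(t * t / 2)
```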

Hanson and Wright [12] obtained a bound on tail probabilities for quadratic forms in independent random variables using the following condition: for all $n\ge 1$ and all $x\ge 0$, there exist positive constants M and γ such that

$P\left(|{X}_{n}|\ge x\right)\le M{\int }_{x}^{+\mathrm{\infty }}{e}^{-\gamma {t}^{2}}\phantom{\rule{0.2em}{0ex}}dt.$
(3.4)

Wright [11] proved that the bound established by Hanson and Wright [12] for independent symmetric random variables also holds when the random variables are not symmetric, provided condition (3.4) is valid. We will study complete convergence for a sequence of acceptable random variables under condition (3.4). The main result is as follows.

Theorem 3.3 Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of acceptable random variables satisfying condition (3.4) for all $n\ge 1$ and all $x\ge 0$, where M and γ are positive constants. Suppose that there exists a positive constant C not depending on n such that

$E{\left(\sum _{i=1}^{n}{X}_{i}\right)}^{2}\le C\sum _{i=1}^{n}E{X}_{i}^{2}.$
(3.5)

Then for all $\beta >1$, $\frac{1}{{n}^{\beta }}{\sum }_{i=1}^{n}{X}_{i}\to 0$ completely as $n\to \mathrm{\infty }$.

Proof By Markov’s inequality and assumption (3.5), we have that for any $\epsilon >0$,

$\sum _{n=1}^{\mathrm{\infty }}P\left(|\frac{1}{{n}^{\beta }}\sum _{i=1}^{n}{X}_{i}|>\epsilon \right)\le \sum _{n=1}^{\mathrm{\infty }}\frac{1}{{n}^{2\beta }{\epsilon }^{2}}E{\left(\sum _{i=1}^{n}{X}_{i}\right)}^{2}\le C\sum _{n=1}^{\mathrm{\infty }}\frac{1}{{n}^{2\beta }{\epsilon }^{2}}\sum _{i=1}^{n}E{X}_{i}^{2}.$

In the following, we will estimate $E{X}_{i}^{2}$. By (3.4), we can see that

$\begin{array}{rcl}E{X}_{i}^{2}& =& {\int }_{0}^{+\mathrm{\infty }}2xP\left(|{X}_{i}|\ge x\right)\phantom{\rule{0.2em}{0ex}}dx\le {\int }_{0}^{+\mathrm{\infty }}2x\left(M{\int }_{x}^{+\mathrm{\infty }}{e}^{-\gamma {t}^{2}}\phantom{\rule{0.2em}{0ex}}dt\right)\phantom{\rule{0.2em}{0ex}}dx\\ =& M{\int }_{0}^{+\mathrm{\infty }}{e}^{-\gamma {t}^{2}}\left({\int }_{0}^{t}2x\phantom{\rule{0.2em}{0ex}}dx\right)\phantom{\rule{0.2em}{0ex}}dt=M{\int }_{0}^{+\mathrm{\infty }}{t}^{2}{e}^{-\gamma {t}^{2}}\phantom{\rule{0.2em}{0ex}}dt=\frac{M\sqrt{\pi }}{4{\gamma }^{3/2}}.\end{array}$

Hence,

$\sum _{n=1}^{\mathrm{\infty }}P\left(|\frac{1}{{n}^{\beta }}\sum _{i=1}^{n}{X}_{i}|>\epsilon \right)\le \frac{CM\sqrt{\pi }}{4{\gamma }^{3/2}{\epsilon }^{2}}\sum _{n=1}^{\mathrm{\infty }}\frac{1}{{n}^{2\beta -1}}<\mathrm{\infty }.$

This completes the proof of the theorem. □
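
The closed form of the Gaussian-moment integral used above, ${\int }_{0}^{+\mathrm{\infty }}{t}^{2}{e}^{-\gamma {t}^{2}}\phantom{\rule{0.2em}{0ex}}dt=\frac{\sqrt{\pi }}{4{\gamma }^{3/2}}$, can be confirmed symbolically (a sketch assuming SymPy is available):

```python
import sympy as sp

# Symbolic confirmation of the integral in the proof of Theorem 3.3:
# int_0^oo t^2 exp(-gamma t^2) dt = sqrt(pi) / (4 gamma^(3/2)) for gamma > 0.
t, gamma = sp.symbols('t gamma', positive=True)
integral = sp.integrate(t**2 * sp.exp(-gamma * t**2), (t, 0, sp.oo))
assert sp.simplify(integral - sp.sqrt(sp.pi) / (4 * gamma**sp.Rational(3, 2))) == 0
```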

## References

1. Lehmann E: Some concepts of dependence. Ann. Math. Stat. 1966, 37: 1137–1153.

2. Joag-Dev K, Proschan F: Negative association of random variables with applications. Ann. Stat. 1983, 11(1):286–295. 10.1214/aos/1176346079

3. Giuliano Antonini R, Kozachenko Y, Volodin A: Convergence of series of dependent φ-sub-Gaussian random variables. J. Math. Anal. Appl. 2008, 338: 1188–1203. 10.1016/j.jmaa.2007.05.073

4. Xing GD, Yang SC, Liu AL, Wang X: A remark on the exponential inequality for negatively associated random variables. J. Korean Stat. Soc. 2009, 38: 53–57. 10.1016/j.jkss.2008.06.005

5. Wang XJ, Hu SH, Yang WZ, Li XQ: Exponential inequalities and complete convergence for a LNQD sequence. J. Korean Stat. Soc. 2010, 39(4):555–564. 10.1016/j.jkss.2010.01.002

6. Wang XJ, Hu SH, Yang WZ, Ling NX: Exponential inequalities and inverse moment for NOD sequence. Stat. Probab. Lett. 2010, 80(5–6):452–461. 10.1016/j.spl.2009.11.023

7. Wang XJ, Hu SH, Shen AT, Yang WZ: An exponential inequality for a NOD sequence and a strong law of large numbers. Appl. Math. Lett. 2011, 24: 219–223. 10.1016/j.aml.2010.09.007

8. Sung SH, Srisuradetchai P, Volodin A: A note on the exponential inequality for a class of dependent random variables. J. Korean Stat. Soc. 2011, 40: 109–114. 10.1016/j.jkss.2010.08.002

9. Sung SH: On the exponential inequalities for negatively dependent random variables. J. Math. Anal. Appl. 2011, 381: 538–545. 10.1016/j.jmaa.2011.02.058

10. Xing GD, Yang SC, Liu AL: Exponential inequalities for positively associated random variables and applications. J. Inequal. Appl. 2008., 2008: Article ID 385362

11. Wright FT: A bound on tail probabilities for quadratic forms in independent random variables whose distributions are not necessarily symmetric. Ann. Probab. 1973, 1(6):1068–1070. 10.1214/aop/1176996815

12. Hanson DL, Wright FT: A bound on tail probabilities for quadratic forms in independent random variables. Ann. Math. Stat. 1971, 42: 1079–1083. 10.1214/aoms/1177693335

## Acknowledgements

The authors are most grateful to the editor and the anonymous referee for careful reading of the manuscript and valuable suggestions which helped in improving an earlier version of this paper. This work was supported by the National Natural Science Foundation of China (11201001, 11171001, 11126176 and 11226207), the Specialized Research Fund for the Doctoral Program of Higher Education of China (20093401120001), the Natural Science Foundation of Anhui Province (1308085QA03, 11040606M12, 1208085QA03), the 211 project of Anhui University and the Students Science Research Training Program of Anhui University (KYXL2012007).

## Author information

### Corresponding author

Correspondence to Aiting Shen.


Shen, A., Wu, R. Some probability inequalities for a class of random variables and their applications. J Inequal Appl 2013, 57 (2013). https://doi.org/10.1186/1029-242X-2013-57 