# Some exponential inequalities for acceptable random variables and complete convergence

*Journal of Inequalities and Applications*
**volume 2011**, Article number: 142 (2011)

## Abstract

Some exponential inequalities for a sequence of acceptable random variables are obtained, including a Bernstein-type inequality and a Hoeffding-type inequality. The Bernstein-type inequality for acceptable random variables generalizes and improves the corresponding results of Yang for NA random variables and Wang et al. for NOD random variables. Using these exponential inequalities, we further study complete convergence for acceptable random variables.

**MSC(2000):** 60E15, 60F15.

## 1 Introduction

Let {*X*_{n}, *n* ≥ 1} be a sequence of random variables defined on a fixed probability space \left(\mathrm{\Omega},\mathcal{F},P\right). The exponential inequality for the partial sums {\sum}_{i=1}^{n}\left({X}_{i}-E{X}_{i}\right) plays an important role in various proofs of limit theorems. In particular, it provides a measure of the convergence rate for the strong law of large numbers. Several versions are available in the literature for independent random variables under assumptions of uniform boundedness or some, quite relaxed, control on their moments. While the independent case is classical in the literature, the treatment of dependent variables is more recent.

First, we recall the definitions of some dependence structures.

**Definition 1.1**. *A finite collection of random variables X*_{1}, *X*_{2},..., *X*_{n} *is said to be negatively associated (NA) if for every pair of disjoint subsets A*_{1}, *A*_{2} *of* {1, 2,..., *n*},

\text{Cov}\left(f\left({X}_{i},i\in {A}_{1}\right),g\left({X}_{j},j\in {A}_{2}\right)\right)\le 0, \qquad (1.1)

*whenever f and g are coordinatewise nondecreasing (or coordinatewise nonincreasing) such that this covariance exists. An infinite sequence of random variables* {*X*_{n}, *n* ≥ 1} *is NA if every finite subcollection is NA*.

**Definition 1.2**. *A finite collection of random variables X*_{1}, *X*_{2},..., *X*_{n} *is said to be negatively upper orthant dependent (NUOD) if for all real numbers x*_{1}, *x*_{2},..., *x*_{n},

P\left({X}_{i}>{x}_{i},i=1,2,\dots ,n\right)\le {\prod}_{i=1}^{n}P\left({X}_{i}>{x}_{i}\right), \qquad (1.2)

*and negatively lower orthant dependent (NLOD) if for all real numbers x*_{1}, *x*_{2},..., *x*_{n},

P\left({X}_{i}\le {x}_{i},i=1,2,\dots ,n\right)\le {\prod}_{i=1}^{n}P\left({X}_{i}\le {x}_{i}\right). \qquad (1.3)

*A finite collection of random variables X*_{1}, *X*_{2},..., *X*_{n} *is said to be negatively orthant dependent (NOD) if they are both NUOD and NLOD. An infinite sequence* {*X*_{n}, *n* ≥ 1} *is said to be NOD if every finite subcollection is NOD*.

The concept of NA random variables was introduced by Alam and Saxena [1] and carefully studied by Joag-Dev and Proschan [2]. Joag-Dev and Proschan [2] pointed out that a number of well-known multivariate distributions possess the negative association property, such as the multinomial, convolution of unlike multinomials, multivariate hypergeometric, Dirichlet, permutation distribution, negatively correlated normal distribution, random sampling without replacement, and joint distribution of ranks. The notion of NOD random variables was introduced by Lehmann [3] and developed in Joag-Dev and Proschan [2]. Obviously, independent random variables are NOD. Joag-Dev and Proschan [2] pointed out that NA random variables are NOD, but neither NUOD nor NLOD implies NA. They also presented an example in which *X* = (*X*_{1}, *X*_{2}, *X*_{3}, *X*_{4}) possesses NOD, but does not possess NA. Hence, NOD is weaker than NA.

Recently, Giuliano et al. [4] introduced the following notion of acceptability.

**Definition 1.3**. *We say that a finite collection of random variables X*_{1}, *X*_{2},..., *X*_{n} *is acceptable if for any real λ*,

E\phantom{\rule{0.2em}{0ex}}exp\left\{\lambda {\sum}_{i=1}^{n}{X}_{i}\right\}\le {\prod}_{i=1}^{n}E\phantom{\rule{0.2em}{0ex}}exp\left\{\lambda {X}_{i}\right\}. \qquad (1.4)

*An infinite sequence of random variables* {*X*_{n}, *n* ≥ 1} *is acceptable if every finite subcollection is acceptable*.

Since inequality (1.4) is required to hold for all *λ*, Sung et al. [5] weakened the condition on *λ* and gave the following definition of acceptability.

**Definition 1.4**. *We say that a finite collection of random variables X*_{1}, *X*_{2},..., *X*_{n} *is acceptable if there exists δ >* 0 *such that for any real λ* ∈ (-*δ*, *δ*),

E\phantom{\rule{0.2em}{0ex}}exp\left\{\lambda {\sum}_{i=1}^{n}{X}_{i}\right\}\le {\prod}_{i=1}^{n}E\phantom{\rule{0.2em}{0ex}}exp\left\{\lambda {X}_{i}\right\}. \qquad (1.5)

*An infinite sequence of random variables* {*X*_{n}, *n* ≥ 1} *is acceptable if every finite subcollection is acceptable*.

First, we point out that Definition 1.3 of acceptability will be used in the current article. As mentioned in Giuliano et al. [4], a sequence of NOD random variables with a finite Laplace transform or finite moment generating function near zero (and hence a sequence of NA random variables with a finite Laplace transform, too) provides an example of acceptable random variables. For instance, Xing et al. [6] considered a strictly stationary NA sequence of random variables; by the observation above, such a sequence is acceptable.

Another interesting example of a sequence {*Z*_{n}, *n* ≥ 1} of acceptable random variables can be constructed in the following way. Feller [[7], Problem III.1] (cf. also Romano and Siegel [[8], Section 4.30]) provides an example of two random variables *X* and *Y* such that the density of their sum is the convolution of their densities, yet they are not independent. It is easy to see that *X* and *Y* are not negatively dependent either. Since they are bounded, their Laplace transforms *E* exp(*λX*) and *E* exp(*λY*) are finite for any *λ*. Next, since the density of their sum is the convolution of their densities, we have

E\phantom{\rule{0.2em}{0ex}}exp\left\{\lambda \left(X+Y\right)\right\}=E\phantom{\rule{0.2em}{0ex}}exp\left\{\lambda X\right\}\cdot E\phantom{\rule{0.2em}{0ex}}exp\left\{\lambda Y\right\}\phantom{\rule{0.5em}{0ex}}\text{for any real}\phantom{\rule{0.2em}{0ex}}\lambda .

The announced sequence {*Z*_{n}, *n* ≥ 1} of acceptable random variables can now be constructed in the following way. Let (*X*_{k}, *Y*_{k}), *k* ≥ 1, be independent copies of the random vector (*X*, *Y*). For any *n* ≥ 1, set *Z*_{n} = *X*_{k} if *n* = 2*k* + 1 and *Z*_{n} = *Y*_{k} if *n* = 2*k*. Hence, the model of acceptable random variables considered in this article (Definition 1.3) is more general than the models considered in the previous literature, and studying the limiting behavior of acceptable random variables is of interest.
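The interleaving step of this construction can be sketched as follows. Note that `draw_pair` below is a placeholder stand-in of our own for one copy of the dependent vector (*X*, *Y*), not Feller's actual example; only the index bookkeeping *Z*_{2k+1} = *X*_{k}, *Z*_{2k} = *Y*_{k} is being illustrated.

```python
import random

def draw_pair(rng):
    """Placeholder for one copy of a dependent bounded vector (X, Y).
    NOT Feller's construction -- any dependent bounded pair suffices
    to illustrate the interleaving."""
    u = rng.random()
    return u, 1.0 - u  # bounded and dependent, just a stand-in

def build_Z(num_pairs, seed=0):
    """Interleave independent copies (X_k, Y_k), k = 1..num_pairs:
    Z_{2k} = Y_k and Z_{2k+1} = X_k."""
    rng = random.Random(seed)
    pairs = [draw_pair(rng) for _ in range(num_pairs)]
    Z = {}
    for k, (x, y) in enumerate(pairs, start=1):
        Z[2 * k] = y          # even indices carry the Y's
        Z[2 * k + 1] = x      # odd indices carry the X's
    return Z, pairs

Z, pairs = build_Z(5)
# Check the index bookkeeping: pair k sits at positions 2k and 2k + 1.
for k, (x, y) in enumerate(pairs, start=1):
    assert Z[2 * k] == y and Z[2 * k + 1] == x
```

Dependence only ever occurs within a pair, while distinct pairs are independent; this is exactly why acceptability survives the interleaving.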

Recently, Sung et al. [5] established an exponential inequality for a random variable with a finite Laplace transform. Using this inequality, they obtained an exponential inequality for identically distributed acceptable random variables with finite Laplace transforms. The main purpose of this article is to establish some exponential inequalities for acceptable random variables under very mild conditions. Furthermore, we will study complete convergence for acceptable random variables using these exponential inequalities.

Throughout the article, let {*X*_{n}, *n* ≥ 1} be a sequence of acceptable random variables, and denote {S}_{n}={\sum}_{i=1}^{n}{X}_{i} for each *n* ≥ 1.

**Remark 1.1**. If {*X*_{n}, *n* ≥ 1} is a sequence of acceptable random variables, then {-*X*_{n}, *n* ≥ 1} is still a sequence of acceptable random variables. Furthermore, we have for each *n* ≥ 1 and any real *λ*,

E\phantom{\rule{0.2em}{0ex}}exp\left\{\lambda {\sum}_{i=1}^{n}\left({X}_{i}-E{X}_{i}\right)\right\}=exp\left\{-\lambda {\sum}_{i=1}^{n}E{X}_{i}\right\}E\phantom{\rule{0.2em}{0ex}}exp\left\{\lambda {\sum}_{i=1}^{n}{X}_{i}\right\}\le {\prod}_{i=1}^{n}E\phantom{\rule{0.2em}{0ex}}exp\left\{\lambda \left({X}_{i}-E{X}_{i}\right)\right\}.

Hence, {*X*_{n} - *EX*_{n}, *n* ≥ 1} is also a sequence of acceptable random variables.

The following lemma is useful.

**Lemma 1.1**. *If X is a random variable such that a* ≤ *X* ≤ *b, where a and b are finite real numbers, then for any real number h*,

E\phantom{\rule{0.2em}{0ex}}exp\left\{hX\right\}\le \frac{b-EX}{b-a}{e}^{ha}+\frac{EX-a}{b-a}{e}^{hb}. \qquad (1.6)

*Proof*. Since the exponential function exp(*hx*) is convex, its graph is bounded above on the interval *a* ≤ *x* ≤ *b* by the straight line connecting its ordinates at *x* = *a* and *x* = *b*. Thus

exp\left\{hX\right\}\le \frac{b-X}{b-a}{e}^{ha}+\frac{X-a}{b-a}{e}^{hb},

and taking expectations implies (1.6). □
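As a quick numerical illustration of Lemma 1.1 (not part of the original proof), the following sketch compares *E* exp(*hX*) with the chord bound ((*b* − *EX*)*e*^{ha} + (*EX* − *a*)*e*^{hb})/(*b* − *a*) for a uniform *X* on [*a*, *b*]; the uniform distribution, for which the Laplace transform has a closed form, is our choice.

```python
import math

def uniform_mgf(h, a, b):
    """E exp(hX) for X ~ Uniform(a, b) and h != 0 (closed form)."""
    return (math.exp(h * b) - math.exp(h * a)) / (h * (b - a))

def chord_bound(h, a, b, mean):
    """Right-hand side of Lemma 1.1: exp(hx) lies below the chord on [a, b]."""
    return ((b - mean) * math.exp(h * a) + (mean - a) * math.exp(h * b)) / (b - a)

a, b = -1.0, 2.0
mean = (a + b) / 2.0  # EX for Uniform(a, b)
for h in (-2.0, -0.5, 0.5, 1.0, 3.0):
    assert uniform_mgf(h, a, b) <= chord_bound(h, a, b, mean) + 1e-12
print("Lemma 1.1 bound holds for all tested h")
```

The bound is tight exactly when the law of *X* is concentrated on the endpoints {*a*, *b*}, since the chord and the exponential then agree at the only charged points.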

The rest of the article is organized as follows. In Section 2, we present some exponential inequalities for a sequence of acceptable random variables, including a Bernstein-type inequality and a Hoeffding-type inequality. The Bernstein-type inequality for acceptable random variables generalizes and improves the corresponding results of Yang [9] for NA random variables and Wang et al. [10] for NOD random variables. In Section 3, we study complete convergence for acceptable random variables using the exponential inequalities established in Section 2.

## 2 Exponential inequalities for acceptable random variables

In this section, we present some exponential inequalities for acceptable random variables, including a Bernstein-type inequality and a Hoeffding-type inequality.

**Theorem 2.1**. *Let* {*X*_{n}, *n* ≥ 1} *be a sequence of acceptable random variables with EX*_{i} = 0 *and* E{X}_{i}^{2}={\sigma}_{i}^{2}<\infty *for each i* ≥ 1. *Denote* {B}_{n}^{2}={\sum}_{i=1}^{n}{\sigma}_{i}^{2} *for each n* ≥ 1. *If there exists a positive number c such that* |*X*_{i}| ≤ *cB*_{n} *for each* 1 ≤ *i* ≤ *n*, *n* ≥ 1, *then for any ε >* 0,

**Proof**. For fixed *n* ≥ 1, take *t >* 0 such that *tcB*_{n} ≤ 1. It is easily seen that

Hence,

By Definition 1.3 and the inequality above, we have

which implies that

We take t=\frac{\epsilon}{{B}_{n}} when *εc* ≤ 1, and t=\frac{1}{c{B}_{n}} when *εc >* 1. Thus, the desired result (2.1) follows immediately from (2.2). □

**Theorem 2.2**. *Let* {*X*_{n}, *n* ≥ 1} *be a sequence of acceptable random variables with EX*_{i} = 0 *and* |*X*_{i}| ≤ *b for each i* ≥ 1, *where b is a positive constant. Denote* {\sigma}_{i}^{2}=E{X}_{i}^{2} *and* {B}_{n}^{2}={\sum}_{i=1}^{n}{\sigma}_{i}^{2} *for each n* ≥ 1. *Then, for any ε >* 0,

P\left({S}_{n}\ge \epsilon \right)\le exp\left\{-\frac{{\epsilon}^{2}}{2{B}_{n}^{2}+\frac{2}{3}b\epsilon}\right\} \qquad (2.3)

*and*

P\left(\mid {S}_{n}\mid \ge \epsilon \right)\le 2exp\left\{-\frac{{\epsilon}^{2}}{2{B}_{n}^{2}+\frac{2}{3}b\epsilon}\right\}. \qquad (2.4)

**Proof**. For any *t >* 0, by Taylor's expansion, *EX*_{i} = 0, and the inequality 1 + *x* ≤ *e*^{x}, we can get that for *i* = 1, 2,..., *n*,

where

Denote *C* = *b*/3 and {M}_{n}=\frac{b\epsilon}{3{B}_{n}^{2}}+1. Choosing *t >* 0 such that *tC <* 1 and

It is easy to check that for *i* = 1, 2,..., *n* and *j* ≥ 2,

which implies that for *i* = 1, 2,..., *n*,

By Markov's inequality, Definition 1.3, (2.5) and (2.6), we can get

Take t=\frac{\epsilon}{{B}_{n}^{2}{M}_{n}}=\frac{\epsilon}{C\epsilon +{B}_{n}^{2}}. It is easily seen that *tC <* 1 and tC=\frac{C\epsilon}{C\epsilon +{B}_{n}^{2}}. Substituting t=\frac{\epsilon}{{B}_{n}^{2}{M}_{n}} into the right-hand side of (2.7), we obtain (2.3) immediately. By (2.3), we have

P\left(-{S}_{n}\ge \epsilon \right)\le exp\left\{-\frac{{\epsilon}^{2}}{2{B}_{n}^{2}+\frac{2}{3}b\epsilon}\right\}, \qquad (2.8)

since {-*X*_{n}, *n* ≥ 1} is still a sequence of acceptable random variables. The desired result (2.4) follows from (2.3) and (2.8) immediately. □
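The Bernstein-type bound of Theorem 2.2 can be sanity-checked by simulation. Independent random variables are trivially acceptable, so the sketch below uses i.i.d. uniform summands on [−1, 1] (our choice, with *b* = 1 and σ_{i}² = 1/3) and compares the empirical tail probability of |*S*_{n}| with the bound 2 exp{−ε²/(2*B*_{n}² + (2/3)*bε*)}, the form also quoted in Remark 2.1 with ε = *nt*.

```python
import math
import random

random.seed(12345)

n, b = 100, 1.0            # n summands, each |X_i| <= b, EX_i = 0
sigma2 = 1.0 / 3.0         # variance of Uniform(-1, 1)
Bn2 = n * sigma2           # B_n^2 = sum of the variances
eps = 10.0

# Two-sided Bernstein-type bound of Theorem 2.2
bound = 2.0 * math.exp(-eps ** 2 / (2.0 * Bn2 + (2.0 / 3.0) * b * eps))

# Monte Carlo estimate of P(|S_n| >= eps)
trials = 20000
hits = sum(
    abs(sum(random.uniform(-1.0, 1.0) for _ in range(n))) >= eps
    for _ in range(trials)
)
empirical = hits / trials
assert empirical <= bound
print(f"empirical {empirical:.4f} <= bound {bound:.4f}")
```

The empirical tail (roughly the normal tail P(|Z| ≥ ε/*B*_{n})) sits well below the bound, as expected for an exponential inequality of this type.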

**Remark 2.1**. By Theorem 2.2, we can get that for any *t >* 0,

and

It is well known that for independent random variables the upper bound of *P*(|*S*_{n}| ≥ *nt*) is also 2exp\left\{-\frac{{n}^{2}{t}^{2}}{2{B}_{n}^{2}+\frac{2}{3}bnt}\right\}. So Theorem 2.2 extends the corresponding results for independent random variables without adding any extra conditions. In addition, it is easy to check that

which implies that our Theorem 2.2 generalizes and improves the corresponding results of Yang [9, Lemma 3.5] for NA random variables and Wang et al. [10, Theorem 2.3] for NOD random variables.

In the following, we provide a Hoeffding-type inequality for acceptable random variables.

**Theorem 2.3**. *Let* {*X*_{n}, *n* ≥ 1} *be a sequence of acceptable random variables. If there exist two sequences of real numbers* {*a*_{n}, *n* ≥ 1} *and* {*b*_{n}, *n* ≥ 1} *such that a*_{i} ≤ *X*_{i} ≤ *b*_{i} *for each i* ≥ 1, *then for any ε >* 0 *and n* ≥ 1,

P\left({S}_{n}-E{S}_{n}\ge \epsilon \right)\le exp\left\{-\frac{2{\epsilon}^{2}}{{\sum}_{i=1}^{n}{\left({b}_{i}-{a}_{i}\right)}^{2}}\right\}, \qquad (2.9)

P\left(E{S}_{n}-{S}_{n}\ge \epsilon \right)\le exp\left\{-\frac{2{\epsilon}^{2}}{{\sum}_{i=1}^{n}{\left({b}_{i}-{a}_{i}\right)}^{2}}\right\}, \qquad (2.10)

and

P\left(\mid {S}_{n}-E{S}_{n}\mid \ge \epsilon \right)\le 2exp\left\{-\frac{2{\epsilon}^{2}}{{\sum}_{i=1}^{n}{\left({b}_{i}-{a}_{i}\right)}^{2}}\right\}. \qquad (2.11)

**Proof**. For any *h >* 0, by Markov's inequality, we can see that

It follows from Remark 1.1 that

Denote *EX*_{i} = *μ*_{i} for each *i* ≥ 1. By *a*_{i} ≤ *X*_{i} ≤ *b*_{i} and Lemma 1.1, we have

where

The first two derivatives of *L*(*h*_{i}) with respect to *h*_{i} are

The last ratio is of the form *u*(1 - *u*), where 0 *< u <* 1. Hence,

Therefore, by Taylor's expansion and (2.16), we can get

By (2.12), (2.13), and (2.17), we have

It is easily seen that the right-hand side of (2.18) attains its minimum at h=\frac{4\epsilon}{{\sum}_{i=1}^{n}{\left({b}_{i}-{a}_{i}\right)}^{2}}. Inserting this value into (2.18), we obtain (2.9) immediately. Since {-*X*_{n}, *n* ≥ 1} is a sequence of acceptable random variables, (2.9) implies (2.10). Therefore, (2.11) follows from (2.9) and (2.10) immediately. This completes the proof of the theorem. □
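A Monte Carlo sanity check of the two-sided Hoeffding-type bound 2 exp{−2ε²/Σ(*b*_{i} − *a*_{i})²}, which the minimizing choice *h* = 4ε/Σ(*b*_{i} − *a*_{i})² in the proof yields, can be sketched as follows. The i.i.d. Bernoulli summands (independent, hence acceptable) with *a*_{i} = 0 and *b*_{i} = 1 are our choice.

```python
import math
import random

random.seed(2011)

n, p = 200, 0.3            # Bernoulli(p) summands with a_i = 0, b_i = 1
eps = 15.0
range2 = n * (1 - 0) ** 2  # sum of (b_i - a_i)^2

# Two-sided Hoeffding-type bound
bound = 2.0 * math.exp(-2.0 * eps ** 2 / range2)

# Monte Carlo estimate of P(|S_n - ES_n| >= eps)
trials = 20000
mean_Sn = n * p
hits = sum(
    abs(sum(1 if random.random() < p else 0 for _ in range(n)) - mean_Sn) >= eps
    for _ in range(trials)
)
empirical = hits / trials
assert empirical <= bound
print(f"empirical {empirical:.4f} <= Hoeffding bound {bound:.4f}")
```

Because the bound only uses the ranges *b*_{i} − *a*_{i} and not the variances, it is loosest when the summands have small variance relative to their range, which is visible in the gap between the empirical tail and the bound here.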

## 3 Complete convergence for acceptable random variables

In this section, we present some complete convergence results for a sequence of acceptable random variables. The concept of complete convergence was introduced by Hsu and Robbins [11] as follows. A sequence of random variables {*U*_{n}, *n* ≥ 1} is said to converge completely to a constant *C* if {\sum}_{n=1}^{\infty}P\left(\mid {U}_{n}-C\mid \phantom{\rule{2.77695pt}{0ex}}>\epsilon \right)<\infty for all *ε >* 0. In view of the Borel-Cantelli lemma, this implies that *U*_{n} → *C* almost surely (a.s.). The converse is true if the {*U*_{n}, *n* ≥ 1} are independent. Hsu and Robbins [11] proved that the sequence of arithmetic means of independent and identically distributed (i.i.d.) random variables converges completely to the expected value if the variance of the summands is finite. Erdös [12] proved the converse. The result of Hsu-Robbins-Erdös is a fundamental theorem in probability theory and has been generalized and extended in several directions by many authors.
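To illustrate the summability at the heart of complete convergence (a numerical aside of our own, not from the source), note that for sample means of i.i.d. [0, 1]-valued variables, Hoeffding's inequality bounds P(|*S*_{n}/*n* − *μ*| > ε) by 2*e*^{−2nε²}, and the series of these bounds over *n* is a convergent geometric series; the sketch below checks this with ε = 0.1.

```python
import math

eps = 0.1
# Hoeffding tail bound for the sample mean of [0,1]-valued i.i.d. variables:
# P(|S_n/n - mu| > eps) <= 2 * exp(-2 * n * eps**2), a geometric series in n.
terms = [2.0 * math.exp(-2.0 * n * eps ** 2) for n in range(1, 10001)]
partial = sum(terms)

# Closed form of the full series: 2r/(1 - r) with r = exp(-2 eps^2)
r = math.exp(-2.0 * eps ** 2)
closed = 2.0 * r / (1.0 - r)

# The tail beyond n = 10000 is astronomically small, so the partial sum
# already agrees with the closed form.
assert abs(partial - closed) < 1e-6
print(f"sum of tail bounds = {closed:.3f} < infinity")
```

Since the series of tail probabilities is finite, the Borel-Cantelli lemma gives almost sure convergence of the sample means, which is exactly the implication described above.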

Define the space of sequences

\mathcal{H}=\left\{\left\{{a}_{n}\right\}:{a}_{n}>0\phantom{\rule{0.2em}{0ex}}\text{and}\phantom{\rule{0.2em}{0ex}}{\sum}_{n=1}^{\infty}exp\left\{-\epsilon {a}_{n}\right\}<\infty \phantom{\rule{0.2em}{0ex}}\text{for all}\phantom{\rule{0.2em}{0ex}}\epsilon >0\right\}.

The following results are based on the space of sequences \mathcal{H}.

**Theorem 3.1**. *Let* {*X*_{n}, *n* ≥ 1} *be a sequence of acceptable random variables with EX*_{i} = 0 *and* |*X*_{i}| ≤ *b for each i* ≥ 1, *where b is a positive constant. Assume that* {\sum}_{i=1}^{n}E{X}_{i}^{2}=O\left({b}_{n}\right) *for some* \left\{{b}_{n}\right\}\in \mathcal{H}. *Then*,

\frac{{S}_{n}}{{b}_{n}}\to 0\phantom{\rule{0.2em}{0ex}}\text{completely}. \qquad (3.1)

**Proof**. For any *ε >* 0, it follows from Theorem 2.2 that

which implies (3.1). Here, *C* is a positive number not depending on *n*. □

**Theorem 3.2**. *Let* {*X*_{n}, *n* ≥ 1} *be a sequence of acceptable random variables with* |*X*_{i}| ≤ *c <* ∞ *for each i* ≥ 1, *where c is a positive constant. Then, for every* \left\{{b}_{n}\right\}\in \mathcal{H},

**Proof**. For any *ε >* 0, it follows from Theorem 2.3 that

which implies (3.2). □

**Theorem 3.3**. *Let* {*X*_{n}, *n* ≥ 1} *be a sequence of acceptable random variables with EX*_{i} = 0 *and* E{X}_{i}^{2}={\sigma}_{i}^{2}<\infty *for each i* ≥ 1. *Denote* {B}_{n}^{2}={\sum}_{i=1}^{n}{\sigma}_{i}^{2} *for each n* ≥ 1. *For fixed n* ≥ 1, *assume that there exists a positive number H such that*

\mid E{X}_{i}^{m}\mid \le \frac{m!}{2}{\sigma}_{i}^{2}{H}^{m-2},\phantom{\rule{1em}{0ex}}i=1,2,\dots ,n, \qquad (3.3)

*for any positive integer m* ≥ 2. *Then*,

\frac{{S}_{n}}{{b}_{n}}\to 0\phantom{\rule{0.2em}{0ex}}\text{completely}, \qquad (3.4)

*provided that* \left\{{b}_{n}^{2}\u2215{B}_{n}^{2}\right\}\in \mathcal{H} *and* \left\{{b}_{n}\right\}\in \mathcal{H}.

**Proof**. By (3.3), we can see that

for *i* = 1, 2,..., *n*, *n* ≥ 1. When \mid t\mid \le \frac{1}{2H}, it follows that

Therefore, by Markov's inequality, Definition 1.3 and (3.5), we can get that for any *x* ≥ 0 and \mid t\mid \le \frac{1}{2H},

Hence,

If 0\le x\le \frac{{B}_{n}^{2}}{H}, then

if x\ge \frac{{B}_{n}^{2}}{H}, then

From the statements above, we can get that

which implies that for any *x* ≥ 0,

Therefore, the assumptions on {*b*_{n}} yield that

This completes the proof of the theorem. □

## References

1. Alam K, Saxena KML: **Positive dependence in multivariate distributions.** *Commun Stat Theory Methods* 1981, **10:** 1183–1196. 10.1080/03610928108828102
2. Joag-Dev K, Proschan F: **Negative association of random variables with applications.** *Ann Stat* 1983, **11**(1): 286–295. 10.1214/aos/1176346079
3. Lehmann E: **Some concepts of dependence.** *Ann Math Stat* 1966, **37:** 1137–1153. 10.1214/aoms/1177699260
4. Giuliano AR, Kozachenko Y, Volodin A: **Convergence of series of dependent *φ*-subGaussian random variables.** *J Math Anal Appl* 2008, **338:** 1188–1203. 10.1016/j.jmaa.2007.05.073
5. Sung SH, Srisuradetchai P, Volodin A: **A note on the exponential inequality for a class of dependent random variables.** *J Korean Stat Soc* 2011, **40**(1): 109–114. 10.1016/j.jkss.2010.08.002
6. Xing G, Yang S, Liu A, Wang X: **A remark on the exponential inequality for negatively associated random variables.** *J Korean Stat Soc* 2009, **38:** 53–57. 10.1016/j.jkss.2008.06.005
7. Feller W: *An Introduction to Probability Theory and its Applications, Volume II.* 2nd edition. Wiley, New York; 1971.
8. Romano JP, Siegel AF: *Counterexamples in Probability and Statistics.* The Wadsworth & Brooks/Cole Statistics/Probability Series. Wadsworth & Brooks/Cole Advanced Books & Software, Monterey; 1986.
9. Yang SC: **Uniformly asymptotic normality of the regression weighted estimator for negatively associated samples.** *Stat Probab Lett* 2003, **62:** 101–110.
10. Wang XJ, Hu SH, Yang WZ, Ling NX: **Exponential inequalities and inverse moment for NOD sequence.** *Stat Probab Lett* 2010, **80:** 452–461. 10.1016/j.spl.2009.11.023
11. Hsu PL, Robbins H: **Complete convergence and the law of large numbers.** *Proc Natl Acad Sci USA* 1947, **33**(2): 25–31. 10.1073/pnas.33.2.25
12. Erdös P: **On a theorem of Hsu and Robbins.** *Ann Math Stat* 1949, **20**(2): 286–291. 10.1214/aoms/1177730037

## Acknowledgements

The authors are most grateful to the editor and anonymous referee for the careful reading of the manuscript and valuable suggestions which helped in significantly improving an earlier version of this article.

The study was supported by the National Natural Science Foundation of China (11171001, 71071002, 11126176) and the Academic Innovation Team of Anhui University (KJTD001B).

## Additional information

### Competing interests

The authors declare that they have no competing interests.

### Authors' contributions

Some exponential inequalities for a sequence of acceptable random variables are obtained, including a Bernstein-type inequality and a Hoeffding-type inequality. Complete convergence is further studied by using these exponential inequalities. All authors read and approved the final manuscript.

## Rights and permissions

**Open Access**
This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (
https://creativecommons.org/licenses/by/2.0
), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

## About this article

### Cite this article

Shen, A., Hu, S., Volodin, A. *et al.* Some exponential inequalities for acceptable random variables and complete convergence.
*J Inequal Appl* **2011**, 142 (2011). https://doi.org/10.1186/1029-242X-2011-142
