Conditional acceptability of random variables
Journal of Inequalities and Applications volume 2016, Article number: 149 (2016)
Abstract
Acceptable random variables, introduced by Giuliano Antonini et al. (J. Math. Anal. Appl. 338:1188-1203, 2008), form a class of dependent random variables that contains negatively dependent random variables as a particular case. The concept of acceptability has been studied under various versions of the definition, such as extended acceptability or wide acceptability. In this paper, we combine the concept of acceptability with the concept of conditioning, which has been the subject of current research activity. For conditionally acceptable random variables, we provide a number of probability inequalities that can be used to obtain asymptotic results.
1 Introduction
Let \(\{X_{n},n\in\mathbb{N}\}\) be a sequence of random variables defined on a probability space \((\Omega, \mathcal{A},\mathcal{P})\). Giuliano Antonini et al. [1] introduced the concept of acceptable random variables as follows.
Definition 1
A finite collection of random variables \(X_{1},X_{2},\ldots,X_{n}\) is said to be acceptable if, for any real λ,

$$E \exp \Biggl(\lambda\sum_{i=1}^{n}X_{i} \Biggr) \leq\prod_{i=1}^{n}E \exp (\lambda X_{i} ). $$
An infinite sequence of random variables \(\{X_{n},n\in\mathbb{N}\}\) is acceptable if every finite subcollection is acceptable.
The class of acceptable random variables includes in a trivial way collections of independent random variables. But in most cases, acceptable random variables exhibit a form of negative dependence. In fact, as Giuliano Antonini et al. [1] point out, negatively associated random variables with a finite Laplace transform satisfy the notion of acceptability. However, acceptable random variables do not have to be negatively dependent. A classical example of acceptable random variables that are not negatively dependent can be constructed based on problem III.1 listed in the classical book of Feller [2]. Details can be found in Giuliano Antonini et al. [1], Shen et al. [3], or Sung et al. [4].
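Independence is the boundary case of Definition 1: for independent summands, the moment generating function of the sum factorizes, so the defining inequality holds with equality. This can be checked by exact enumeration; the two two-point distributions below are illustrative choices of ours, not taken from the paper.

```python
import itertools, math

# Exact check, on small discrete distributions, that independent random
# variables are acceptable: E exp(lam*(X1+X2)) equals the product of the
# individual E exp(lam*Xi), so the inequality of Definition 1 holds with
# equality.  The two two-point distributions are illustrative choices.

X1 = [(-1.0, 0.5), (2.0, 0.5)]    # (value, probability) pairs
X2 = [(0.0, 0.25), (1.0, 0.75)]

def mgf(dist, lam):
    """E exp(lam * X) for a discrete distribution."""
    return sum(p * math.exp(lam * x) for x, p in dist)

for lam in (-1.0, -0.3, 0.7, 2.0):
    # under independence the joint law is the product law
    lhs = sum(p * q * math.exp(lam * (x + y))
              for (x, p), (y, q) in itertools.product(X1, X2))
    rhs = mgf(X1, lam) * mgf(X2, lam)
    assert math.isclose(lhs, rhs, rel_tol=1e-9)
```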
The idea of acceptability has been modified and generalized in several directions. For example, Giuliano Antonini et al. [1] introduced the concept of m-acceptable random variables, whereas other authors provided weaker versions such as the notions of extended acceptability or wide acceptability. The following definition, given by Sung et al. [4], provides a weaker version of acceptability by imposing a condition on λ.
Definition 2
A finite collection of random variables \(X_{1},X_{2},\ldots,X_{n}\) is said to be acceptable if there exists \(\delta>0\) such that, for any real \(|\lambda|<\delta\),

$$E \exp \Biggl(\lambda\sum_{i=1}^{n}X_{i} \Biggr) \leq\prod_{i=1}^{n}E \exp (\lambda X_{i} ). $$
An infinite sequence of random variables \(\{X_{n},n\in\mathbb{N}\}\) is acceptable if every finite subcollection is acceptable.
The concept of acceptable random variables has been studied extensively by a few authors, and a number of results are available in the literature such as exponential inequalities and complete convergence results. For the interested reader, we suggest the papers of Giuliano Antonini et al. [1], Sung et al. [4], Wang et al. [5], Shen et al. [3], among others.
Further, in addition to the definition of acceptability, Choi and Baek [6] introduced the concept of extended acceptability as follows.
Definition 3
A finite collection of random variables \(X_{1},X_{2},\ldots,X_{n}\) is said to be extended acceptable if there exists a constant \(M>0\) such that, for any real λ,

$$E \exp \Biggl(\lambda\sum_{i=1}^{n}X_{i} \Biggr) \leq M\prod_{i=1}^{n}E \exp (\lambda X_{i} ). $$
An infinite sequence of random variables \(\{X_{n},n\in\mathbb{N}\}\) is extended acceptable if every finite subcollection is extended acceptable.
It is clear that acceptable random variables are extended acceptable. The following example provides a sequence of random variables that satisfies the notion of extended acceptability.
Example 4
Let \(\{X_{n}, n\in\mathbb{N}\}\) be absolutely continuous random variables with the same distribution \(F(x)\) and density \(f(x)\) such that the finite-dimensional distributions are given by the copula
This copula can be equivalently represented as
The density of C can be obtained by the formula
and it can be proven that, for \(\beta\leq\log2\),
Furthermore, \(\int_{0}^{1}\cdots\int_{0}^{1}c(u_{1},\ldots,u_{k}) \, du_{1}\cdots \, du_{k}=1\), which shows that \(c(u_{1},\ldots,u_{k})\) is a density function. It is known that
which leads to
Hence,
proving that the random variables satisfy the definition of extended acceptability with \(M = \exp(\beta)>0\).
Observe that \(\{X_{n},n\in\mathbb{N}\}\) is a strictly stationary sequence that is negatively dependent for \(\beta<0\) and positively dependent for \(\beta>0\).
For the class of extended acceptable random variables, Choi and Baek [6] provide an exponential inequality that enables the derivation of asymptotic results based on complete convergence.
A different version of acceptability, the notion of wide acceptability, is provided by Wang et al. [5].
Definition 5
Random variables \(\{X_{n},n\in\mathbb{N}\}\) are said to be widely acceptable for \(\delta_{0}>0\) if, for any real \(0<\lambda<\delta_{0}\), there exist positive numbers \(g(n)\), \(n\geq1\), such that

$$E \exp \Biggl(\lambda\sum_{i=1}^{n}X_{i} \Biggr) \leq g(n)\prod_{i=1}^{n}E \exp (\lambda X_{i} ). $$
The following example gives random variables that satisfy the definition of wide acceptability.
Example 6
Consider a sequence \(\{X_{n}, n\in\mathbb{N}\}\) of random variables with the same absolutely continuous distribution function \(F(x)\). Assume that the joint distribution of \((X_{i_{1}},\ldots,X_{i_{n}})\) is given by
provided that \(\sum_{1\leq i< j\leq n} a_{ij} \leq1\). Then it can be proven that
which leads to
proving that \(\{X_{n}, n\in\mathbb{N}\}\) is a sequence of widely acceptable random variables with \(g(n) = 1+ \sum_{1\leq i< j\leq n} a_{ij}\).
The concept of widely acceptable random variables follows naturally from the concept of wide dependence of random variables introduced by Wang et al. [7]. Wang et al. [8] and Wang et al. [7] stated (without proof) that, for widely orthant dependent random variables, the inequality in Definition 5 is true for any λ. For widely acceptable random variables, Wang et al. [5] pointed out, without providing the details, that one can obtain exponential inequalities similar to those for acceptable random variables.
In this paper, we combine the concept of conditioning on a σ-algebra with the concept of acceptability (in fact, wide acceptability) and define conditionally acceptable random variables. In Section 2.1, we give the basic definitions and examples and prove some classical exponential inequalities. In Section 2.2, we provide asymptotic results by utilizing the tools of Section 2.1. Finally, in Section 3, we give some concluding remarks.
2 Results and discussion
Recently, various researchers have studied extensively the concepts of conditional independence and conditional association (see, e.g., Chow and Teicher [9], Majerek et al. [10], Roussas [11], and Prakasa Rao [12]), providing conditional versions of known results such as the generalized Borel-Cantelli lemma, the generalized Kolmogorov inequality, and the generalized Hájek-Rényi inequalities. Counterexamples available in the literature show that conditional independence and conditional association are not equivalent to the corresponding unconditional concepts.
Following the notation introduced by Prakasa Rao [12], let \(E^{\mathcal{F}}(X) = E(X\mid\mathcal{F})\) and \(P^{\mathcal{F}}(A) = P(A\mid\mathcal{F})\) denote the conditional expectation and the conditional probability, respectively, where \(\mathcal{F}\) is a sub-σ-algebra of \(\mathcal{A}\). Furthermore, \(\operatorname{Cov}^{\mathcal {F}}(X,Y)\) denotes the conditional covariance of X and Y given \(\mathcal{F}\), where

$$\operatorname{Cov}^{\mathcal{F}}(X,Y) = E^{\mathcal{F}} \bigl[ \bigl(X-E^{\mathcal{F}}(X) \bigr) \bigl(Y-E^{\mathcal{F}}(Y) \bigr) \bigr] = E^{\mathcal{F}}(XY)-E^{\mathcal{F}}(X)E^{\mathcal{F}}(Y), $$

whereas the conditional variance is defined as \(\operatorname{Var}^{\mathcal{F}} (X) = \operatorname{Cov}^{\mathcal{F}}(X,X)\).
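On a finite probability space, these conditional quantities can be computed atom by atom of \(\mathcal{F}\). The sketch below checks that the product form \(E^{\mathcal{F}}(XY)-E^{\mathcal{F}}(X)E^{\mathcal{F}}(Y)\) agrees with the centered form of the conditional covariance and that the conditional variance is nonnegative on every atom; the four-point space and the variables X, Y are illustrative choices of ours, not taken from the paper.

```python
from fractions import Fraction

# Atom-by-atom computation of the conditional covariance on a finite space.
# F is generated by the single event B, so E^F(.) is constant on B and on
# its complement.  The space and the variables X, Y are illustrative.

omega = [1, 2, 3, 4]
prob = {w: Fraction(1, 4) for w in omega}
X = {1: 1, 2: -1, 3: 2, 4: 0}
Y = {1: 0, 2: 3, 3: -1, 4: 1}
B = {1, 2}

def cond_exp(Z, atom):
    """E(Z | atom) for an atom of the sigma-algebra generated by B."""
    pa = sum(prob[w] for w in atom)
    return sum(prob[w] * Z[w] for w in atom) / pa

for atom in (B, set(omega) - B):
    mx, my = cond_exp(X, atom), cond_exp(Y, atom)
    # Cov^F(X,Y) = E^F(XY) - E^F(X) E^F(Y) ...
    cov = cond_exp({w: X[w] * Y[w] for w in omega}, atom) - mx * my
    # ... agrees with the centered form E^F[(X - E^F X)(Y - E^F Y)]
    centered = cond_exp({w: (X[w] - mx) * (Y[w] - my) for w in omega}, atom)
    assert cov == centered
    # Var^F(X) = Cov^F(X,X) is nonnegative on every atom
    var = cond_exp({w: (X[w] - mx) ** 2 for w in omega}, atom)
    assert var >= 0
```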
The concept of conditional negative association was introduced by Roussas [11]. Let us recall its definition since it is related to the results presented further.
Definition 7
A finite collection of random variables \(X_{1}, \ldots, X_{n}\) is said to be conditionally negatively associated given \(\mathcal{F}\) (\(\mathcal{F}\)-NA) if

$$\operatorname{Cov}^{\mathcal{F}} \bigl(f(X_{i}, i\in A), g(X_{j}, j\in B) \bigr) \leq0 \quad\textit{a.s.} $$

for any disjoint subsets A and B of \(\{1, 2, \ldots, n\}\) and for any real-valued componentwise nondecreasing functions f, g on \(\mathbb{R}^{|A|}\) and \(\mathbb{R}^{|B|}\), respectively, where \(|A| = \operatorname{card}(A)\), provided that the covariance is defined. An infinite collection is conditionally negatively associated given \(\mathcal{F}\) if every finite subcollection is \(\mathcal{F}\)-NA.
As mentioned earlier, the concepts of negative association and conditional negative association are not equivalent. See, for example, Yuan et al. [13], where various counterexamples are given.
2.1 Conditional acceptability
In this paper, we define the concept of conditional acceptability by combining the concept of conditioning on a σ-algebra with the concept of acceptability. In particular, conditioning is combined with the concept of wide acceptability. We therefore give the following definition.
Definition 8
A finite family of random variables \(X_{1}, X_{2},\ldots,X_{n}\) is said to be \(\mathcal{F}\)-acceptable for \(\delta>0\) if \(E (\exp (\delta |X_{i}| ) )<\infty\) for all i and, for any \(|\lambda|< \delta\), there exist positive numbers \(g(n)\), \(n\geq1\), such that

$$E^{\mathcal{F}} \Biggl(\exp \Biggl(\lambda\sum_{i=1}^{n}X_{i} \Biggr) \Biggr) \leq g(n)\prod_{i=1}^{n}E^{\mathcal{F}} \bigl(\exp(\lambda X_{i}) \bigr) \quad\textit{a.s.} $$

A sequence of random variables \(\{X_{n}, n\in\mathbb{N}\}\) is \(\mathcal{F}\)-acceptable for \(\delta>0\) if every finite subfamily is \(\mathcal{F}\)-acceptable for \(\delta>0\).
Remark 9
The definition of \(\mathcal{F}\)-acceptability allows the quantity λ to be a random object. As a result, if the random variables \(X_{1}, X_{2},\ldots,X_{n}\) are \(\mathcal{F}\)-acceptable for \(\delta>0\), then

$$E^{\mathcal{F}} \Biggl(\exp \Biggl(\lambda\sum_{i=1}^{n}X_{i} \Biggr) \Biggr) \leq g(n)\prod_{i=1}^{n}E^{\mathcal{F}} \bigl(\exp(\lambda X_{i}) \bigr) \quad\textit{a.s.} $$

for an \(\mathcal{F}\)-measurable random variable λ such that \(|\lambda|<\delta\) a.s.
Remark 10
It can be easily verified that if random variables \(X_{1},\ldots,X_{n}\) are \(\mathcal{F}\)-acceptable, then the random variables \(X_{1}-E^{\mathcal{F}}(X_{1}), X_{2}-E^{\mathcal{F}}(X_{2}),\ldots,X_{n}-E^{\mathcal{F}}(X_{n})\) are also \(\mathcal{F}\)-acceptable, and \(-X_{1}, -X_{2}, \ldots, -X_{n}\) are also \(\mathcal{F}\)-acceptable.
The random variables given in the following example satisfy the definition of \(\mathcal{F}\)-acceptability.
Example 11
Let \(\Omega= \{1,2,3,4\}\) with \(P(\{i\})=\frac{1}{4}\). Define the events
and the random variables
Let \(B=\{1\}\) and \(\mathcal{F} = \{\Omega, B, B^{c}, \emptyset\}\). Then
and
For the random variables to satisfy the definition of \(\mathcal{F}\)-acceptability, the following inequality needs to be valid:

for all \(\lambda\in(-\delta,\delta)\) and \(g(2)>0\). For the case where \(\omega\in B\), this inequality is satisfied for all \(\lambda\in\mathbb{R}\) if \(g(2)\) is chosen to be any number such that \(g(2)\geq1\). For the case where \(\omega\in B^{c}\), the inequality is equivalent to

Observe that the last inequality is satisfied for all \(\lambda\in\mathbb{R}\) if \(g(2)\geq\frac{3}{2}\). Thus, the random variables are \(\mathcal{F}\)-acceptable for any real \(\lambda\in(-\delta,\delta)\), where \(\delta>0\) and \(g(2)\geq\frac{3}{2}\). Furthermore, it is worth mentioning that the random variables \(\{X_{1},X_{2}\}\) are not \(\mathcal{F}\)-NA.
In the case where \(\mathcal{F}\) is chosen to be the trivial σ-algebra, that is, \(\mathcal{F} = \{\emptyset,\Omega\}\), the definition of \(\mathcal{F}\)-acceptability reduces to the definition of unconditional wide acceptability. The converse is not always true, as the following counterexample shows, so the concepts of \(\mathcal{F}\)-acceptability and acceptability are not equivalent.
Example 12
Let \(\Omega= \{1,2,3,4,5,6\}\), and let \(P(\{i\}) = \frac {1}{6}\). Define the events \(A_{1}\) and \(A_{2}\) by
and the random variables \(X_{1}\) and \(X_{2}\) by
Let \(B = \{6\}\), and let \(\mathcal{F} = \{\Omega, B, B^{c},\emptyset\}\) be the sub-σ-algebra generated by B. Yuan et al. [13] proved that \(\{X_{1},X_{2}\}\) are \(\mathcal{F}\)-NA. By Proposition P1 of the same paper it follows that, for all \(\lambda\in\mathbb{R}\), \(\{e^{\lambda X_{1}},e^{\lambda X_{2}}\}\) are \(\mathcal{F}\)-NA, and therefore \(\{X_{1},X_{2}\}\) are \(\mathcal{F}\)-acceptable with \(g(2) =1\).
Note that
and
For \(\lambda= \log2\), this difference is positive, proving that \(\{X_{1} , X_{2} \}\) do not satisfy the definition of acceptability.
Let X be a random variable, \(X\geq0\) a.s., and let ϵ be an \(\mathcal{F}\)-measurable random variable such that \(\epsilon>0\) a.s. It is known that

$$P^{\mathcal{F}} (X\geq\epsilon ) \leq\frac{E^{\mathcal{F}}(X)}{\epsilon} \quad\textit{a.s.} \qquad (1) $$

This inequality is a conditional version of Markov's inequality and is an essential tool for obtaining the results of this paper.
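The conditional Markov inequality can be verified by direct enumeration on a finite space, where conditioning on \(\mathcal{F}\) amounts to restricting to the atoms of the generating event, and an \(\mathcal{F}\)-measurable ϵ is one that is constant on each atom. The six-point space, the variable X, and the values of ϵ below are illustrative choices of ours.

```python
from fractions import Fraction

# Enumeration check of the conditional Markov inequality
#   P^F(X >= eps) <= E^F(X) / eps  a.s.
# on a six-point space, with F generated by the single event B.
# All numbers below are illustrative choices.

omega = [1, 2, 3, 4, 5, 6]
prob = {w: Fraction(1, 6) for w in omega}
X = {1: 3, 2: 0, 3: 1, 4: 5, 5: 2, 6: 4}          # X >= 0 everywhere
B = {1, 2, 3}
# an F-measurable eps: one value on B, another on its complement
eps = {frozenset(B): Fraction(2), frozenset(omega) - B: Fraction(3)}

for atom in (frozenset(B), frozenset(omega) - B):
    pa = sum(prob[w] for w in atom)
    ce = sum(prob[w] * X[w] for w in atom) / pa    # E^F(X) on this atom
    e = eps[atom]
    cp = sum(prob[w] for w in atom if X[w] >= e) / pa  # P^F(X >= eps)
    assert cp <= ce / e
```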
It is well known that exponential inequalities have played an important role in obtaining asymptotic results for sums of independent random variables. Classical exponential inequalities were obtained, for example, by Bernstein, Hoeffding, Kolmogorov, Fuk, and Nagaev (see the monograph of Petrov [14]). A crucial step in proving an exponential inequality is the use of an inequality like that in Definition 2. Next, we provide several exponential inequalities for \(\mathcal{F}\)-acceptable random variables.
The following Hoeffding-type inequality was obtained by Yuan and Xie [16].
Lemma 13
Assume that \(P(a\leq X\leq b) = 1\), where a and b are \(\mathcal{F}\)-measurable random variables such that \(a< b\) a.s. Then

$$E^{\mathcal{F}} \bigl(\exp \bigl(\lambda \bigl(X-E^{\mathcal{F}}(X) \bigr) \bigr) \bigr) \leq\exp \biggl(\frac{\lambda^{2}(b-a)^{2}}{8} \biggr) \quad\textit{a.s.} $$

for any \(\mathcal{F}\)-measurable random variable λ.
The result that follows is a conditional version of the well-known Hoeffding inequality (Hoeffding [15], Theorem 2). Similar results were proven by Shen et al. [3], Theorem 2.3, for acceptable random variables and by Yuan and Xie [16], Theorem 1, for conditionally linearly negatively quadrant dependent random variables. Our result improves Theorem 1 of Yuan and Xie [16].
Theorem 14
Let \(X_{1}, X_{2},\ldots,X_{n}\) be \(\mathcal{F}\)-acceptable random variables for \(\delta>0\) such that \(P(a_{i}\leq X_{i}\leq b_{i})=1\), \(i=1,2,\ldots,n\), where \(a_{i}\) and \(b_{i}\) are \(\mathcal{F}\)-measurable random variables such that \(a_{i}< b_{i}\) a.s. for all i. Then for an \(\mathcal{F}\)-measurable ϵ with \(0<\epsilon<\frac{\delta}{4}\sum_{i=1}^{n}(b_{i}-a_{i})^{2}\) a.s., we have

$$P^{\mathcal{F}} \bigl( \bigl\vert S_{n}-E^{\mathcal{F}}(S_{n}) \bigr\vert \geq\epsilon \bigr) \leq2g(n)\exp \biggl(-\frac{2\epsilon^{2}}{\sum_{i=1}^{n}(b_{i}-a_{i})^{2}} \biggr) \quad\textit{a.s.}, $$
where \(S_{n} = \sum_{i=1}^{n}X_{i}\).
Proof
Let λ be an \(\mathcal{F}\)-measurable random variable such that \(0<\lambda<\delta\) a.s. Then by the conditional version of Markov's inequality

where the first inequality follows by applying (1) and the second because of the \(\mathcal{F}\)-acceptability property for \(\lambda\in(0,\delta)\). Since the minimum of the function \(f(x) = ax^{2}+bx\) with \(a>0\) is attained at \(x=-\frac{b}{2a}\), the minimum of the above expression is attained at \(\lambda= \frac{4\epsilon}{\sum_{i=1}^{n}(b_{i}-a_{i})^{2}}\). Then
Since \(-X_{1},\ldots,-X_{n}\) are also \(\mathcal{F}\)-acceptable, we also have that
The desired result follows by combining the last two expressions. □
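In the unconditional special case (trivial \(\mathcal{F}\), \(g(n)=1\), independent summands), Theorem 14 reduces to the classical Hoeffding bound \(P(|S_{n}-ES_{n}|\geq\epsilon)\leq2\exp (-2\epsilon^{2}/\sum_{i=1}^{n}(b_{i}-a_{i})^{2} )\), which can be checked by exact enumeration; the choice of fair Bernoulli summands below is illustrative.

```python
import itertools, math

# Exact-enumeration check of the classical (unconditional) Hoeffding bound
# that Theorem 14 generalizes.  Independent variables are F-acceptable with
# g(n) = 1 when F is trivial.  X_i uniform on {0, 1}: a_i = 0, b_i = 1.

n = 10
for eps in (1.0, 2.0, 3.0):
    exceed = sum(1 for xs in itertools.product((0, 1), repeat=n)
                 if abs(sum(xs) - n * 0.5) >= eps)
    p = exceed / 2 ** n                      # exact tail probability
    bound = 2 * math.exp(-2 * eps ** 2 / n)  # sum (b_i - a_i)^2 = n
    assert p <= bound
```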
The result that follows is the conditional version of Theorem 2.1 of Shen et al. [3].
Theorem 15
Let \(X_{1},\ldots,X_{n}\) be \(\mathcal{F}\)-acceptable random variables for \(\delta>0\). Assume that \(E^{\mathcal{F}}(X_{i})=0\) and that \(E^{\mathcal{F}}(X_{i}^{2})\) is finite for any i. Let \(B_{n}^{2} = \sum_{i=1}^{n}E^{\mathcal{F}}(X_{i}^{2})\). Assume that \(X_{i}\leq cB_{n}\) a.s., where c is a positive \(\mathcal{F}\)-measurable random variable. Then

$$P^{\mathcal{F}} (S_{n}\geq\epsilon B_{n} ) \leq \textstyle\begin{cases} g(n)\exp (-\frac{\epsilon^{2}}{2} (1-\frac{\epsilon c}{2} ) ) & \text{if } \epsilon c\leq1 \text{ a.s.},\\ g(n)\exp (-\frac{\epsilon}{4c} ) & \text{if } \epsilon c\geq1 \text{ a.s.}, \end{cases} $$

for any positive \(\mathcal{F}\)-measurable random variable ϵ.
Proof
Let t be a positive \(\mathcal{F}\)measurable random variable such that \(tcB_{n}\leq1\). Note that, for \(k\geq2\),
Then
where the second inequality follows because \(tcB_{n} \leq1\) a.s., and the third because of the fact that \(\frac{1}{3}+\frac{1}{3\cdot4}+\frac {1}{3\cdot4\cdot5}+\cdots\leq\frac{1}{2}\). Hence, for \(t<\delta\),
where the first inequality follows by the \(\mathcal{F}\)-acceptability property. Note that since ϵ, \(B_{n}\), and c are positive \(\mathcal{F}\)-measurable random variables, the conditional Markov inequality can be applied, and by using the above calculations we have that, for \(0< t<\delta\),
If \(\epsilon c\leq1\) a.s., then put \(t = \frac{\epsilon}{B_{n}}\) a.s. Therefore, \(0< t<\delta\) is equivalent to \(\epsilon<\delta B_{n}\) a.s. Then we obtain from (2) that
If \(\epsilon c\geq1\) a.s., then put \(t=\frac{1}{cB_{n}}\) a.s. Then we need \(\frac{1}{cB_{n}}<\delta\) a.s. In this case, from (2) we obtain
□
Theorem 16
Let \(X_{1},\ldots,X_{n}\) be \(\mathcal{F}\)-acceptable random variables for \(\delta>0\). Assume that \(E^{\mathcal{F}}(X_{i}) = 0\) a.s. and that there is an \(\mathcal{F}\)-measurable random variable b such that \(X_{i}\leq b\) a.s. for all i. Define \(B_{n}^{2}=\sum_{i=1}^{n}E^{\mathcal{F}} (X_{i}^{2} )\). Then, for any positive \(\mathcal{F}\)-measurable random variable ϵ with \(\frac{\epsilon}{B_{n}^{2}+\frac{b}{3}\epsilon}<\delta\), we have

$$P^{\mathcal{F}} (S_{n}\geq\epsilon ) \leq g(n)\exp \biggl(-\frac{\epsilon^{2}}{2 (B_{n}^{2}+\frac{b\epsilon}{3} )} \biggr) \quad\textit{a.s.} $$

and

$$P^{\mathcal{F}} \bigl( \vert S_{n} \vert \geq\epsilon \bigr) \leq2g(n)\exp \biggl(-\frac{\epsilon^{2}}{2 (B_{n}^{2}+\frac{b\epsilon}{3} )} \biggr) \quad\textit{a.s.} $$
Proof
For \(t>0\), by using Taylor’s expansion we have that
where
Note that on the set where \(E^{\mathcal{F}}[X_{i}^{2}] = 0\) a.s., we have the obvious upper bound \(E^{\mathcal{F}}[\exp(tX_{i})]\leq1\) a.s. since \(E^{\mathcal{F}}[X_{i}^{j}]\leq E^{\mathcal{F}}[X_{i}^{2}]b^{j-2}\) a.s.
Define \(c = \frac{b}{3}\) and \(M_{n} = 1+\frac{b\epsilon}{3B_{n}^{2}}=1+c\frac {\epsilon}{B_{n}^{2}}\) a.s. By choosing \(t>0\) such that \(t<\frac{1}{c}\) we have that
Observe that, for \(j\geq2\),
Therefore,
By applying the conditional Markov inequality and the \(\mathcal{F}\)-acceptability of the sequence \(\{X_{n}, n\in\mathbb{N}\}\) for \(0< t<\delta\) a.s. we have
The minimum is obtained at \(t_{0}=\frac{\epsilon}{B_{n}^{2}M_{n}}\) a.s. It can be verified that \(t_{0} = \frac{\epsilon}{c\epsilon+B_{n}^{2}}\), and therefore the condition \(t_{0}c<1\) a.s. is satisfied. Moreover, \(t_{0}c = \frac{c\epsilon}{c\epsilon+B_{n}^{2}} = \frac{M_{n}-1}{M_{n}}\). So the above calculations show that the choice of \(t_{0}=\frac{\epsilon}{B_{n}^{2}M_{n}}\) a.s. is valid. Furthermore, we have assumed that \(0< t<\delta\) a.s., and for the particular choice of \(t_{0}\) this restriction leads to \(\frac{\epsilon}{B_{n}^{2}M_{n}}<\delta\), which is equivalent to
which is satisfied because of the assumption of the theorem. By substituting \(t_{0} =\epsilon/ B_{n}^{2}M_{n}\) into (3) we obtain
Since \(\{-X_{n},n\in\mathbb{N}\}\) is again a sequence of \(\mathcal{F}\)-acceptable random variables, we obtain
□
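In the unconditional case with independent summands (trivial \(\mathcal{F}\), \(g(n)=1\)), Theorem 16 is an analogue of the classical Bernstein bound \(P(|S_{n}|\geq\epsilon)\leq2\exp (-\epsilon^{2}/(2(B_{n}^{2}+b\epsilon/3)) )\), which can be checked by exact enumeration. The Rademacher summands below (\(\pm1\) with probability \(\frac{1}{2}\)) are an illustrative choice: they are mean zero, bounded above by \(b=1\), with \(B_{n}^{2}=n\).

```python
import itertools, math

# Exact-enumeration check of the classical Bernstein bound underlying
# Theorem 16 in the unconditional case (F trivial, g(n) = 1, independent
# summands).  Rademacher variables are an illustrative choice.

n, b = 10, 1.0
Bn2 = float(n)                      # E X_i^2 = 1 for Rademacher X_i
for eps in (4.0, 6.0):
    exceed = sum(1 for xs in itertools.product((-1, 1), repeat=n)
                 if abs(sum(xs)) >= eps)
    p = exceed / 2 ** n             # exact two-sided tail probability
    bound = 2 * math.exp(-eps ** 2 / (2 * (Bn2 + b * eps / 3)))
    assert p <= bound
```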
The probability inequalities presented above were proven under the assumption of bounded random variables. The result that follows provides a probability inequality under a moment condition.
Theorem 17
Let \(\{X_{n},n\in\mathbb{N}\}\) be a sequence of \(\mathcal{F}\)-acceptable random variables for \(\delta>0\), and let \(\{c_{n},n\in\mathbb{N}\}\) be a sequence of positive \(\mathcal{F}\)-measurable random variables with \(C_{n} = \sum_{i=1}^{n}c_{i}\) for all \(n=1,2,\ldots\) . Assume that there is a positive \(\mathcal{F}\)-measurable random variable T with \(T\leq\delta\) a.s. such that, for \(|t|\leq T\) and fixed \(n\geq1\),

$$E^{\mathcal{F}} \bigl(\exp(tX_{i}) \bigr) \leq\exp \biggl(\frac{c_{i}t^{2}}{2} \biggr) \quad\textit{a.s.}, \quad i=1,2,\ldots,n. \qquad (4) $$

Then, for any positive \(\mathcal{F}\)-measurable random variable ϵ,

$$P^{\mathcal{F}} \bigl( \vert S_{n} \vert \geq\epsilon \bigr) \leq \textstyle\begin{cases} 2g(n)\exp (-\frac{\epsilon^{2}}{2C_{n}} ) & \text{if } \epsilon\leq C_{n}T \text{ a.s.},\\ 2g(n)\exp (-\frac{T\epsilon}{2} ) & \text{if } \epsilon\geq C_{n}T \text{ a.s.} \end{cases} $$
Proof
The proof follows arguments similar to those presented in Petrov [14]. Let \(0< t\leq T\) a.s. Then, by applying the conditional Markov inequality and the property of \(\mathcal{F}\)-acceptability, we have that
The aim is to minimize the above bound. Observe that, for \(0< \epsilon \leq C_{n}T\) a.s., the above bound is minimized for \(t= \frac{\epsilon }{C_{n}}\), which satisfies the condition \(0< t\leq T\) a.s. Therefore, by letting \(t= \frac{\epsilon}{C_{n}}\) we have that
Now, suppose that \(\epsilon\geq C_{n}T\). Then the minimum is obtained for \(t=T\), and the above bound becomes of the form
We see that \(X_{1},X_{2},\ldots,X_{n}\) satisfy condition (4) for \(-T\leq t\leq0\) for the same positive \(\mathcal{F}\)-measurable random variables T and \(\{c_{n},n\in\mathbb{N}\}\). So \(-X_{1},-X_{2},\ldots,-X_{n}\) are \(\mathcal{F}\)-acceptable random variables that satisfy condition (4) for \(0\leq t\leq T\), and therefore, by applying inequalities (5) and (6) to \(-S_{n}\), we have that
and
The desired result follows by combining (5) with (7) and (6) with (8). □
Remark 18
When \(\{X_{n}, n\in\mathbb{N}\}\) is a sequence of \(\mathcal{F}\)-acceptable random variables with \(g(n)\equiv1\), where \(\mathcal{F}\) is the trivial σ-algebra, the above result reduces to Corollary 2.1 of Shen and Wu [17].
2.2 Conditional complete convergence
Complete convergence results are well known for independent random variables (see, e.g., Gut [18]). The classical results of Hsu, Robbins, Erdős, Baum, and Katz were extended to certain dependent sequences. Using the results of Section 2.1, we can establish complete convergence for partial sums of \(\mathcal{F}\)-acceptable random variables under various assumptions. We will need the following definition of conditional complete convergence (see Christofides and Hadjikyriakou [19] for details).
Definition 19
A sequence of random variables \(\{X_{n},n\in\mathbb{N}\}\) is said to converge completely given \(\mathcal{F}\) to a random variable X if

$$\sum_{n=1}^{\infty}P^{\mathcal{F}} \bigl( \vert X_{n}-X \vert \geq\epsilon \bigr) <\infty \quad\textit{a.s.} $$

for any \(\mathcal{F}\)-measurable random variable ϵ such that \(\epsilon>0\) a.s.
The following set of sequences, introduced for brevity, was first defined by Shen et al. [3]:
The theorem that follows is a complete convergence theorem for 'self-normalized' sums.
Theorem 20
Let \(X_{1}, X_{2},\ldots\) be a sequence of \(\mathcal{F}\)-acceptable random variables for \(\delta>0\). Assume that \(E^{\mathcal{F}}(X_{i}) = 0\) a.s. and \(X_{i}\leq b\) a.s. for all i, where b is an \(\mathcal{F}\)-measurable random variable. Assume that \(g(n)\leq K\) a.s., where K is an a.s. finite \(\mathcal{F}\)-measurable random variable. Let \(B_{n}^{2} = \sum_{i=1}^{n}E^{\mathcal{F}} ( X_{i}^{2} )\), and assume that \(\{B_{n}^{2},n\in\mathbb{N}\}\in\mathcal{H}\) a.s. Then

$$\frac{S_{n}}{B_{n}^{2}} \textit{ converges completely to } 0 \textit{ given } \mathcal{F}. $$
Proof
By applying the result of Theorem 16 for \(\frac{\epsilon }{1+\frac{b\epsilon}{3}}<\delta\) a.s. we have that
because \(g(n)\leq K\) a.s. and \(\{B_{n}^{2},n\in\mathbb{N}\}\in\mathcal {H}\) a.s. □
Remark 21
In Theorem 20, the condition \(\{B_{n}^{2},n\in\mathbb{N}\}\in\mathcal{H}\) can be replaced by \(B_{n}^{2} = O(b_{n})\), where \(\{b_{n}\}\in\mathcal{H}\) a.s. Then it can be proven that

Here \(b_{n}\) can be a sequence of \(\mathcal{F}\)-measurable random variables. In this case, the result is a conditional version of Theorem 3.1 of Shen et al. [3].
Next, we provide a result that is a conditional version of Theorem 3.2 of Shen et al. [3].
Theorem 22
Let \(X_{1},X_{2},\ldots\) be a sequence of \(\mathcal{F}\)-acceptable random variables for \(\delta> 0\). Assume that \(X_{i}\leq c\) a.s., where c is an \(\mathcal{F}\)-measurable random variable such that \(c>0\) a.s. Assume that \(g(n)\leq K\) a.s., where K is an a.s. finite \(\mathcal{F}\)-measurable random variable. Moreover, let \(\{b_{n}\}\in\mathcal{H}\) a.s., where the \(b_{n}\) are \(\mathcal{F}\)-measurable. Then
Proof
By applying Theorem 14 for \(0<\epsilon<c^{2}\delta\sqrt{\frac {n}{b_{n}}}\) a.s. we have that
because \(g(n)\leq K\) a.s. and \(b_{n}\in\mathcal{H}\) a.s. □
Theorem 23
Let \(X_{1},X_{2},\ldots\) be a sequence of \(\mathcal{F}\)-acceptable random variables for \(\delta> 0\) such that all the assumptions of Theorem 17 are satisfied. If, for any \(\epsilon>0\) a.s.,

where \(\{b_{n},n\in\mathbb{N}\}\) is a sequence of positive \(\mathcal{F}\)-measurable random variables, then
Proof
By applying Theorem 17 we have
□
The theorem that follows gives a conditional exponential inequality for the partial sum of \(\mathcal{F}\)-acceptable random variables under a moment condition that, in the unconditional case, appears very frequently in large deviation results (see, e.g., Nagaev [20] and Teicher [21]). It also appears as condition (3.3) of Theorem 3.3 of Shen et al. [3]. However, the bound provided here allows us to prove complete convergence and, in the unconditional case, does so under assumptions different from those of Theorem 3.3 of Shen et al. [3].
Theorem 24
Let \(X_{1},X_{2},\ldots\) be a sequence of \(\mathcal{F}\)-acceptable random variables for \(\delta>0\). Assume that \(E^{\mathcal{F}}(X_{i}) = 0\), and let \(\sigma_{i}^{2} = E^{\mathcal{F}}(X_{i}^{2})\) be a.s. finite. Let \(B_{n}^{2} = \sum_{i=1}^{n}\sigma_{i}^{2}\). Assume that there exists an a.s. positive and a.s. finite \(\mathcal{F}\)-measurable random variable H such that

$$E^{\mathcal{F}} \bigl( \vert X_{i} \vert ^{k} \bigr) \leq\frac{k!}{2}\sigma_{i}^{2}H^{k-2} \quad\textit{a.s. for all } k\geq2 \textit{ and all } i. $$
(i)
If \(\frac{1}{H} [ 1 - \sqrt{\frac{B_{n}^{2}}{2Hx + B_{n}^{2}}} ]<\delta\), then, for an a.s. positive \(\mathcal{F}\)-measurable random variable x, we have
$$P^{\mathcal{F}} \Biggl( \Biggl\vert \sum_{i=1}^{n}X_{i} \Biggr\vert \geq x \Biggr) \leq2g(n) \exp \biggl[ -\frac{1}{2 H^{2}} { \Bigl( \sqrt{2Hx + B_{n}^{2}} - \sqrt{B_{n}^{2}} \Bigr)}^{2} \biggr] \quad \textit{a.s.} $$
(ii)
If \(g(n)\leq K\) a.s. for all n, where K is a.s. finite and \(\{ B_{n}^{2}\}\in\mathcal{H}\) a.s., then
$$\frac{S_{n}}{B_{n}^{2}}\textit{ converges completely to }0\textit{ given } \mathcal{F}. $$
Proof
where the second equality follows by the fact that \(E^{\mathcal{F}}(X_{i}) = 0\) and t is an \(\mathcal{F}\)-measurable random variable. If \(t\leq\frac{1}{H}\), then

For \(\mathcal{F}\)-measurable \(x\geq0\) and \(0\leq t\leq\frac{1}{H}\), we have that
Let
Then \(h(t)\) is minimized at
and substituting this value into the RHS of (9), after some algebraic manipulations, we obtain that
Since \(-X_{1},-X_{2},\ldots,-X_{n}\) is also a sequence of \(\mathcal{F}\)-acceptable random variables, we obtain
Using this inequality, we can prove that
□
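The bound of part (i) can be checked by exact enumeration in the unconditional case (trivial \(\mathcal{F}\), \(g(n)=1\), independent summands). Rademacher summands (\(\pm1\) with probability \(\frac{1}{2}\)) are an illustrative choice: with \(\sigma_{i}^{2}=1\) and \(H=\frac{1}{3}\) they satisfy the Bernstein-type moment condition \(E|X_{i}|^{k}=1\leq\frac{k!}{2}H^{k-2}\) for all \(k\geq2\).

```python
import itertools, math

# Exact-enumeration check of the bound in part (i) of Theorem 24 in the
# unconditional case (F trivial, g(n) = 1, independent summands).
# Rademacher variables with H = 1/3 are an illustrative choice.

n, H = 10, 1.0 / 3.0
Bn2 = float(n)                                  # B_n^2 = sum sigma_i^2
for x in (4.0, 6.0):
    exceed = sum(1 for xs in itertools.product((-1, 1), repeat=n)
                 if abs(sum(xs)) >= x)
    p = exceed / 2 ** n                         # exact tail probability
    bound = 2 * math.exp(-(1 / (2 * H ** 2))
                         * (math.sqrt(2 * H * x + Bn2) - math.sqrt(Bn2)) ** 2)
    assert p <= bound
```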
Remark 25
In the previous theorem, it is assumed that \(g(n) \leq K\) a.s. for every n, where K is finite a.s. However, we may have the complete convergence without this assumption. For example, the RHS of (10) may be finite even when g is not bounded. Similar statements can be made for Theorems 20 and 22.
3 Conclusions
In this paper, we define the class of conditionally acceptable random variables as a generalization of the class of acceptable random variables studied previously by Giuliano Antonini et al. [1], Shen et al. [3], and Sung et al. [4], among others. The idea of conditioning on a σ-algebra is gaining increasing popularity, with potential applications in fields such as risk theory and actuarial science. For the class of conditionally acceptable random variables, we provide useful probability inequalities, mainly of the exponential type, which can be used to establish asymptotic results and, in particular, complete convergence results. We anticipate that the results presented in this paper will serve as a basis for further research activity yielding additional theoretical results and applications.
References
Giuliano Antonini, R, Kozachenko, Y, Volodin, A: Convergence of series of dependent φ-subgaussian random variables. J. Math. Anal. Appl. 338, 1188-1203 (2008)
Feller, W: An Introduction to Probability Theory and Its Applications, vol. II, 2nd edn. Wiley, New York (1971)
Shen, A, Hu, S, Volodin, A, Wang, X: Some exponential inequalities for acceptable random variables and complete convergence. J. Inequal. Appl. 2011, 142 (2011)
Sung, SH, Srisuradetchai, P, Volodin, A: A note on the exponential inequality for a class of dependent random variables. J. Korean Stat. Soc. 40, 109-114 (2011)
Wang, Y, Li, Y, Gao, Q: On the exponential inequality for acceptable random variables. J. Inequal. Appl. 2011, 40 (2011)
Choi, JY, Baek, JI: Exponential inequalities and complete convergence of extended acceptable random variables. J. Appl. Math. Inform. 31, 417-424 (2013)
Wang, X, Xu, C, Hu, TC, Volodin, A, Hu, S: On complete convergence for widely orthant-dependent random variables and its applications in nonparametric regression models. Test 23(3), 607-629 (2014)
Wang, K, Wang, Y, Gao, Q: Uniform asymptotics for the finite-time ruin probability of a new dependent risk model with constant interest rate. Methodol. Comput. Appl. Probab. 15(1), 109-124 (2013)
Chow, YS, Teicher, H: Probability Theory: Independence, Interchangeability, Martingales. Springer, New York (1978)
Majerek, D, Nowak, W, Zieba, W: Conditional strong law of large numbers. Int. J. Pure Appl. Math. 20, 143-157 (2005)
Roussas, GG: On conditional independence, mixing and association. Stoch. Anal. Appl. 26, 1274-1309 (2008)
Prakasa Rao, BLS: Conditional independence, conditional mixing and conditional association. Ann. Inst. Stat. Math. 61, 441-460 (2009)
Yuan, DM, An, J, Wu, XS: Conditional limit theorems for conditionally negatively associated random variables. Monatshefte Math. 161, 449-473 (2010)
Petrov, VV: Limit Theorems of Probability Theory. Oxford University Press, London (1995)
Hoeffding, W: Probability inequalities for sums of bounded random variables. J. Am. Stat. Assoc. 58, 13-30 (1963)
Yuan, DM, Xie, Y: Conditional limit theorems for conditionally linearly negative quadrant dependent random variables. Monatshefte Math. 166, 281-299 (2012)
Shen, A, Wu, R: Some probability inequalities for a class of random variables and their applications. J. Inequal. Appl. 2013, 57 (2013)
Gut, A: Probability: A Graduate Course. Springer, Berlin (2005)
Christofides, TC, Hadjikyriakou, M: Conditional demimartingales and related results. J. Math. Anal. Appl. 398, 380-391 (2013)
Nagaev, SV: Large deviations of sums of independent random variables. Ann. Probab. 7, 745-789 (1979)
Teicher, H: Exponential bounds for large deviations of sums of unbounded random variables. Sankhya 46, 41-53 (1984)
Acknowledgements
The authors are grateful to the two anonymous referees for their valuable comments, which led to a much improved version of the manuscript.
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
All authors contributed equally and read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Christofides, T.C., Fazekas, I. & Hadjikyriakou, M. Conditional acceptability of random variables. J Inequal Appl 2016, 149 (2016). https://doi.org/10.1186/s13660-016-1093-1