Probability inequalities for END sequence and their applications
Journal of Inequalities and Applications volume 2011, Article number: 98 (2011)
Abstract
Some probability inequalities for extended negatively dependent (END) sequences are provided. Using these probability inequalities, we obtain several moment inequalities, in particular a Rosenthal-type inequality, for END sequences. Finally, we study the asymptotic approximation of the inverse moment for nonnegative END sequences with finite first moments, which generalizes and improves the corresponding results of Wu et al. [Stat. Probab. Lett. 79, 1366-1371 (2009)], Wang et al. [Stat. Probab. Lett. 80, 452-461 (2010)], and Sung [J. Inequal. Appl. 2010, Article ID 823767 (2010). doi:10.1155/2010/823767].
MSC(2000): 60E15; 62G20.
1 Introduction
It is well known that probability inequalities play an important role in various proofs of limit theorems. In particular, they provide a measure of the convergence rate in the strong law of large numbers. The main purpose of this article is to provide some probability inequalities for extended negatively dependent (END) sequences, a class that contains independent sequences, negatively associated (NA) sequences, and negatively orthant dependent (NOD) sequences as special cases. The probability inequalities for END random variables are mainly inspired by Fakoor and Azarnoosh [1] and Asadian et al. [2]. Using these probability inequalities, we further study moment inequalities and the asymptotic approximation of inverse moments for END sequences.
First, we will recall the definitions of NOD and END sequences.
Definition 1.1 (cf. Joag-Dev and Proschan [3]). A finite collection of random variables X_1, X_2, ..., X_n is said to be negatively upper orthant dependent (NUOD) if for all real numbers x_1, x_2, ..., x_n,

P(X_1 > x_1, X_2 > x_2, ..., X_n > x_n) ≤ ∏_{i=1}^{n} P(X_i > x_i),

and negatively lower orthant dependent (NLOD) if for all real numbers x_1, x_2, ..., x_n,

P(X_1 ≤ x_1, X_2 ≤ x_2, ..., X_n ≤ x_n) ≤ ∏_{i=1}^{n} P(X_i ≤ x_i).
A finite collection of random variables X1, X2, ..., X n is said to be negatively orthant dependent (NOD) if they are both NUOD and NLOD.
An infinite sequence {X n , n ≥ 1} is said to be NOD if every finite subcollection is NOD.
Definition 1.2 (cf. Liu [4]). We call random variables {X_n, n ≥ 1} END if there exists a constant M > 0 such that both

P(X_1 > x_1, X_2 > x_2, ..., X_n > x_n) ≤ M ∏_{i=1}^{n} P(X_i > x_i)

and

P(X_1 ≤ x_1, X_2 ≤ x_2, ..., X_n ≤ x_n) ≤ M ∏_{i=1}^{n} P(X_i ≤ x_i)

hold for each n ≥ 1 and all real numbers x_1, x_2, ..., x_n.
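Before working with Definition 1.2, it may help to see that independent random variables satisfy it with M = 1, since the joint orthant probabilities factorize. The following check, a toy example with two independent fair dice (not from the paper), verifies both inequalities exhaustively.

```python
# Sanity check of the END inequalities in the independent case with M = 1.
# Two independent fair dice: joint orthant probabilities factorize exactly.
import itertools

outcomes = list(itertools.product(range(1, 7), repeat=2))  # all (X1, X2) values
p = 1.0 / len(outcomes)                                    # uniform joint law

def marg(pred, x):
    # marginal probability P(pred(X, x)) for one fair die
    return sum(1 for a in range(1, 7) if pred(a, x)) / 6.0

gt = lambda a, x: a > x    # upper orthant event
le = lambda a, x: a <= x   # lower orthant event

M = 1.0
upper_ok = all(
    sum(p for a, b in outcomes if a > x1 and b > x2)
    <= M * marg(gt, x1) * marg(gt, x2) + 1e-12
    for x1 in range(7) for x2 in range(7))
lower_ok = all(
    sum(p for a, b in outcomes if a <= x1 and b <= x2)
    <= M * marg(le, x1) * marg(le, x2) + 1e-12
    for x1 in range(7) for x2 in range(7))
print(upper_ok and lower_ok)  # prints True: both END bounds hold with M = 1
```

For independent variables the two sides are in fact equal; the constant M in Definition 1.2 allows a bounded departure from this factorization.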
The concept of an END sequence was introduced by Liu [4], and several applications have since been found: Liu [4] obtained precise large deviations for dependent random variables with heavy tails, and Liu [5] gave sufficient and necessary conditions for moderate deviations of dependent random variables with heavy tails. It is easily seen that independent random variables and NOD random variables are END. Joag-Dev and Proschan [3] pointed out that NA random variables are NOD; thus, NA random variables are END. Since the END condition is much weaker than independence, negative association, and NOD, studying the limit behavior of END sequences is of interest.
Throughout the article, let {X_n, n ≥ 1} be a sequence of END random variables defined on a fixed probability space with respective distribution functions F_1, F_2, .... Denote X^+ = max{0, X}. c_n ~ d_n means c_n/d_n → 1 as n → ∞, and c_n = o(d_n) means c_n/d_n → 0 as n → ∞. Let M and C be positive constants which may differ in various places. Set S_n = ∑_{i=1}^{n} X_i and M_{t,n} = ∑_{i=1}^{n} E(X_i^+)^t for n ≥ 1 and t > 0.
The following lemma is useful.
Lemma 1.1 (cf. Liu [5]). Let random variables X_1, X_2, ..., X_n be END.
(i) If f_1, f_2, ..., f_n are all nondecreasing (or all nonincreasing) functions, then the random variables f_1(X_1), f_2(X_2), ..., f_n(X_n) are END.
(ii) For each n ≥ 1, there exists a constant M > 0 such that

E(∏_{i=1}^{n} X_i^+) ≤ M ∏_{i=1}^{n} E X_i^+.
Lemma 1.2. Let {X_n, n ≥ 1} be a sequence of END random variables and {t_n, n ≥ 1} a sequence of nonnegative (or nonpositive) numbers. Then for each n ≥ 1, there exists a constant M > 0 such that

E exp(∑_{i=1}^{n} t_i X_i) ≤ M ∏_{i=1}^{n} E exp(t_i X_i).

As a byproduct, for any t ∈ ℝ,

E exp(tS_n) ≤ M ∏_{i=1}^{n} E exp(tX_i).

Proof. The desired result follows from Lemma 1.1 (i) and (ii) immediately. □
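As a quick illustration of the byproduct inequality, consider independent Rademacher variables (independent sequences are END with M = 1): E exp(tS_n) then factorizes exactly into the product of the marginal moment generating functions. The enumeration below is a hypothetical example, not taken from the paper.

```python
# Exact check that E exp(t S_n) equals the product of marginal mgfs for
# independent Rademacher variables, i.e. Lemma 1.2's bound with M = 1.
import math
from itertools import product

vals = (-1, 1)              # Rademacher: P(X = -1) = P(X = 1) = 1/2
n, t = 4, 0.7

# E exp(t S_n) by exact enumeration over the 2^n sign patterns
lhs = sum(math.exp(t * sum(signs)) for signs in product(vals, repeat=n)) / 2 ** n

# Product of the marginal mgfs: E exp(t X_i) = cosh(t), so the product is cosh(t)^n
rhs = math.cosh(t) ** n

print(abs(lhs - rhs) < 1e-12)  # prints True: equality, hence the bound with M = 1
```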
The organization of this article is as follows: The probability inequalities for END sequence are provided in Section 2, the moment inequalities for END sequence are presented in Section 3, and the asymptotic approximation of inverse moment for nonnegative END sequence is studied in Section 4.
2 Probability inequalities for sums of END sequence
In this section, we give some probability inequalities for END random variables, which can be applied to obtain moment inequalities and the strong law of large numbers. The proofs of these probability inequalities are mainly inspired by Fakoor and Azarnoosh [1] and Asadian et al. [2]. Let x and y be arbitrary positive numbers.
Theorem 2.1. Let 0 < t ≤ 1. Then, there exists a positive constant M such that
If xy^{t−1} > M_{t,n}, then
Proof. For y > 0, denote Y_i = min(X_i, y), i = 1, 2, ..., n, and T_n = ∑_{i=1}^{n} Y_i, n ≥ 1. It is easy to check that
which implies that for any positive number h,
Lemma 1.1 (i) implies that Y1, Y2, ..., Y n are still END random variables. It follows from (2.3) and Lemma 1.2 that
where M is a positive constant. For 0 < t ≤ 1, the function (e^{hu} − 1)/u^t is increasing in u > 0. Thus,
Combining the inequality above and (2.4), we can get that
Taking in the right-hand side of (2.5), we can get (2.1) immediately. If xy^{t−1} > M_{t,n}, then the right-hand side of (2.5) attains its minimum value when . Substituting this value of h into the right-hand side of (2.5), we get (2.2) immediately. This completes the proof of the theorem.
By Theorem 2.1, we can get the following Theorem 2.2 immediately.
Theorem 2.2. Let 0 < t ≤ 1. Then, there exists a positive constant M such that
If xy^{t−1} > M_{t,n}, then
Theorem 2.3. Assume that EX_i = 0 for each i ≥ 1. Then for any h, x, y > 0, there exists a positive constant M such that
If we take , then
Proof. We use the same notation as in Theorem 2.1. It is easy to see that (e^{hu} − 1 − hu)/u^2 is nondecreasing on the real line. Therefore,
which implies that
Replacing X i by -X i , we have
Therefore, (2.8) follows from statements above immediately, which yields the desired result (2.9). The proof is completed.
Theorem 2.4. Assume that EX_i = 0 and |X_i| ≤ C for each i ≥ 1, where C is a positive constant. Denote B_n = ∑_{i=1}^{n} EX_i^2 for each n ≥ 1. Then, for any x > 0, there exists a positive constant M such that
and
Proof. It is easily seen that
and
Thus, for all α > 0 and i = 1, 2, ..., n, we can get that
The last inequality above follows from the fact that the function is nondecreasing on the half-line (0, ∞).
Since x = (x − 1) + 1 ≤ e^{x−1} for all x ∈ ℝ, we have by Lemma 1.2 that
where C is a positive constant. Therefore, for all α > 0 and x > 0, we have
Taking in the right-hand side of (2.12), we can see that and (2.10) follows.
Since {-X n , n ≥ 1} is still a sequence of END random variables from Lemma 1.1, we have by (2.10) that
Hence, (2.11) follows from (2.10) and (2.13) immediately. This completes the proof of the theorem.
Theorem 2.5. Assume that EX_i = 0 and |X_i| ≤ C for each i ≥ 1, where C is a positive constant. If , then n^{−1}S_n → 0 completely and, in consequence, n^{−1}S_n → 0 a.s.
Proof. For any ε > 0, we have by Theorem 2.4 that
where D is a positive constant. Therefore,
which implies that n^{−1}S_n → 0 completely, and hence n^{−1}S_n → 0 a.s. by the Borel-Cantelli lemma. The proof is completed.
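Theorem 2.5 contains, as a special case, the classical strong law for bounded centered independent summands, since independent sequences are END. The simulation below is an illustrative sketch of that special case, with uniform increments on [−1, 1]; the seed and sample sizes are arbitrary choices, not from the paper.

```python
# Track n^{-1} S_n for bounded, mean-zero, independent increments (|X_i| <= C = 1)
# and watch the sample mean settle near 0, as Theorem 2.5 predicts.
import random

random.seed(12345)
partial_sum = 0.0
averages = {}
for n in range(1, 100_001):
    partial_sum += random.uniform(-1.0, 1.0)   # bounded, mean-zero increment
    if n in (100, 10_000, 100_000):
        averages[n] = partial_sum / n          # snapshot of n^{-1} S_n

# The sample mean has standard deviation about 1/sqrt(3n), so at n = 100000
# it should be within a few standard errors of 0.
print(abs(averages[100_000]) < 0.02)
```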
Theorem 2.6. Assume that and ES_n ≤ 0 for each n ≥ 1. Denote . If there exists a nondecreasing sequence of positive numbers {c_n, n ≥ 1} such that P(S_n ≤ c_n) = 1, then for any x > 0,
In order to prove Theorem 2.6, the following lemma is useful.
Lemma 2.1 (cf. Shao[6]). For any x ≥ 0,
Proof of Theorem 2.6. Noting that (e^x − 1 − x)/x^2 is nondecreasing on the real line, for any h > 0 and n ≥ 1, we have
Hence,
Taking in the right-hand side of (2.15), we can obtain that
By Lemma 2.1, we can get that
The desired result (2.14) follows from the above inequality and (2.16) immediately.
3 Moment inequalities for END sequence
In this section, we present some moment inequalities, especially a Rosenthal-type inequality, for END sequences by means of the probability inequalities obtained in Section 2. The proofs are also inspired by Asadian et al. [2]. The Rosenthal-type inequality will be applied to prove the asymptotic approximation of inverse moments for nonnegative END random variables in Section 4.
Theorem 3.1. Let 0 < t ≤ 1 and let g(x) be a nonnegative even function that is nondecreasing on the half-line [0, ∞). Assume that g(0) = 0 and Eg(X_i) < ∞ for each i ≥ 1. Then, for every r > 0, there exists a positive constant M such that
Proof. Taking in Theorem 2.2, we have
which implies that
Therefore, the desired result (3.1) follows from the inequality above and Lemma 2.4 in Petrov [7] immediately. This completes the proof of the theorem.
Corollary 3.1. Let 0 < t ≤ 1, p ≥ t, and E|X i | p < ∞ for each i ≥ 1. Then, there exists a positive constant C(p, t) depending only on p and t such that
Proof. Taking g(x) = |x| p , p ≥ t in Theorem 3.1, we can get that
It is easy to check that
If we set in the last equality above, then we have for r > p/t that
where
is the Beta function. Substituting I into (3.3) and choosing
we can obtain the desired result (3.2) immediately. The proof is completed.
Similar to the proofs of Theorem 3.1 and Corollary 3.1, we can get the following Theorem 3.2 and Corollary 3.2 using Theorem 2.3. The details are omitted.
Theorem 3.2. Let EX i = 0 for each i ≥ 1. Assume that the conditions of Theorem 3.1 are satisfied, then for every r > 0, there exists a positive constant M such that
Corollary 3.2 (Rosenthal-type inequality). Let p ≥ 2, EX_i = 0, and E|X_i|^p < ∞ for each i ≥ 1. Then, there exists a positive constant C_p depending only on p such that
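For orientation, in the independent case the classical Rosenthal inequality takes the following form; Corollary 3.2 asserts the analogous bound for END sequences, with the constant also absorbing the dominating constant M from Definition 1.2. The display below is the standard independent-case statement, given for comparison, not a quotation of the paper's inequality.

```latex
E\left|\sum_{i=1}^{n} X_i\right|^{p}
  \le C_{p}\left\{\sum_{i=1}^{n} E|X_i|^{p}
  + \left(\sum_{i=1}^{n} E X_i^{2}\right)^{p/2}\right\},
  \qquad p \ge 2.
```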
4 Asymptotic approximation of inverse moment for nonnegative END random variables
Recently, Wu et al. [8] studied the asymptotic approximation of inverse moments for nonnegative independent random variables by means of the truncation method and Bernstein's inequality, and obtained the following result:
Theorem A. Let {Z_n, n ≥ 1} be a sequence of independent, nonnegative, and non-degenerate random variables. Suppose that
(i) EZ_n^2 < ∞, ∀ n ≥ 1;
(ii) EX_n → ∞ as n → ∞, where X_n = B_n^{−1} ∑_{i=1}^{n} Z_i and B_n^2 = ∑_{i=1}^{n} Var(Z_i);
(iii) there exists a finite positive constant C_1 not depending on n such that sup_{1≤i≤n} EZ_i/B_n ≤ C_1;
(iv) for some η > 0,
Then, for all real numbers a > 0 and α > 0,

E(a + X_n)^{−α} ~ (a + EX_n)^{−α}.    (4.2)
Wang et al. [9] pointed out that condition (iii) in Theorem A can be removed and extended the result for independent random variables to the case of NOD random variables. Shi et al. [10] obtained (4.2) for B_n = 1 and pointed out that the existence of finite second moments is not required. Sung [11] studied the asymptotic approximation of inverse moments for nonnegative random variables satisfying a Rosenthal-type inequality. For more details on the asymptotic approximation of inverse moments, one can refer to Garcia and Palacios [12], Kaluszka and Okolewski [13], and Hu et al. [14], among others.
The main purpose of this section is to show that (4.2) holds under very mild conditions. Our results will extend and improve the results of Wu et al. [8], Wang et al. [9], and Sung [11].
Now, we state and prove our results on the asymptotic approximation of inverse moments for nonnegative END random variables.
Theorem 4.1. Let {Z n , n ≥ 1} be a sequence of nonnegative END random variables and {B n , n ≥ 1} be a sequence of positive constants. Suppose that
(i) EZ n < ∞, ∀n ≥ 1;
(ii) EX_n → ∞ as n → ∞, where X_n = B_n^{−1} ∑_{i=1}^{n} Z_i;
(iii) there exists some b > 0 such that
Then, for all real numbers a > 0 and α > 0, (4.2) holds.
Proof. It is easily seen that f(x) = (a + x)^{−α} is a convex function of x on [0, ∞). By Jensen's inequality, we have
which implies that
To prove (4.2), it is enough to show that
In order to prove (4.6), we need only to show that for all δ ∈ (0, 1),
By (iii), we can see that for all δ ∈ (0, 1), there exists n(δ) > 0 such that
Let
and
For Q1, since X n ≥ U n , we have
By (4.8), we have for n ≥ n(δ) that
For each n ≥ 1, it is easy to see that {Z_k I(Z_k ≤ bB_n) + bB_n I(Z_k > bB_n), 1 ≤ k ≤ n} are END random variables by Lemma 1.1. Therefore, by (4.11), Markov's inequality, Corollary 3.2, and the C_r inequality, for any p > 2 and n ≥ n(δ),
Taking p > max {2, 2α, α +1}, we have by (4.9), (4.10), and (4.12) that
which implies (4.7). This completes the proof of the theorem.
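To see (4.2) numerically in a simple END setting, take Z_i i.i.d. Poisson(1) and B_n ≡ 1, so that X_n = Z_1 + ... + Z_n is Poisson(n). The exact computation below is an illustrative check (the parameters a, α and the truncation rule are ad hoc choices, not from the paper): it shows E(a + X_n)^{−α} (a + EX_n)^{α} approaching 1 as the mean grows.

```python
# Exact inverse moments of a Poisson variable, compared with (a + mean)^(-alpha).
import math

def inverse_moment(lam, a, alpha):
    """Exact E(a + X)^(-alpha) for X ~ Poisson(lam), summing until the tail is negligible."""
    total, logp = 0.0, -lam                       # logp = log P(X = 0)
    for k in range(int(lam + 40 * math.sqrt(lam)) + 50):
        total += math.exp(logp) * (a + k) ** (-alpha)
        logp += math.log(lam) - math.log(k + 1)   # advance to log P(X = k + 1)
    return total

a, alpha = 1.0, 2.0
ratios = [inverse_moment(lam, a, alpha) * (a + lam) ** alpha for lam in (10, 100, 1000)]
# The ratios decrease toward 1, illustrating E(a + X_n)^(-alpha) ~ (a + E X_n)^(-alpha).
print(all(abs(r - 1.0) < 0.5 for r in ratios))
```

Working in log-probabilities keeps the summation stable even when the leading Poisson weights underflow to zero for large means.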
Remark 4.1. Theorem 4.1 in this article generalizes and improves the corresponding results of Wu et al. [8], Wang et al. [9], and Sung [11]. First, Theorem 4.1 is based on the condition EZ_n < ∞, ∀ n ≥ 1, which is weaker than the condition EZ_n^2 < ∞, ∀ n ≥ 1, in the above cited references. Second, {B_n, n ≥ 1} is an arbitrary sequence of positive constants in Theorem 4.1, while B_n^2 = ∑_{i=1}^{n} Var(Z_i) in the above cited references. If we take B_n ≡ 1, we obtain the asymptotic approximation of inverse moments for the partial sums of nonnegative END random variables. Third, (4.3) is weaker than (4.1). Actually, by condition (4.1), we can see that
which implies that
since μ_n → ∞, i.e., (4.3) holds.
References
Fakoor V, Azarnoosh HA: Probability inequalities for sums of negatively dependent random variables. Pak J Stat 2005, 21(3):257–264.
Asadian N, Fakoor V, Bozorgnia A: Rosenthal's type inequalities for negatively orthant dependent random variables. JIRSS 2006,5(1–2):69–75.
Joag-Dev K, Proschan F: Negative association of random variables with applications. Ann Stat 1983,11(1):286–295. 10.1214/aos/1176346079
Liu L: Precise large deviations for dependent random variables with heavy tails. Stat Probab Lett 2009, 79: 1290–1298. 10.1016/j.spl.2009.02.001
Liu L: Necessary and sufficient conditions for moderate deviations of dependent random variables with heavy tails. Sci China Ser A Math 2010,53(6):1421–1434. 10.1007/s11425-010-4012-9
Shao QM: A comparison theorem on moment inequalities between negatively associated and independent random variables. J Theoret Probab 2000, 13: 343–356. 10.1023/A:1007849609234
Petrov VV: Limit Theorems of Probability Theory: Sequences of Independent Random Variables. Clarendon Press, Oxford; 1995.
Wu TJ, Shi XP, Miao BQ: Asymptotic approximation of inverse moments of nonnegative random variables. Stat Probab Lett 2009, 79: 1366–1371. 10.1016/j.spl.2009.02.010
Wang XJ, Hu SH, Yang WZ, Ling NX: Exponential inequalities and inverse moment for NOD sequence. Stat Probab Lett 2010, 80: 452–461. 10.1016/j.spl.2009.11.023
Shi XP, Wu YH, Liu Y: A note on asymptotic approximations of inverse moments of nonnegative random variables. Stat Probab Lett 2010, 80: 1260–1264. 10.1016/j.spl.2010.04.004
Sung SH: On inverse moments for a class of nonnegative random variables. J Inequal Appl 2010, Article ID 823767 (2010). doi:10.1155/2010/823767
Garcia NL, Palacios JL: On inverse moments of nonnegative random variables. Stat Probab Lett 2001, 53: 235–239. 10.1016/S0167-7152(01)00008-6
Kaluszka M, Okolewski A: On Fatou-type lemma for monotone moments of weakly convergent random variables. Stat Probab Lett 2004, 66: 45–50. 10.1016/j.spl.2003.10.009
Hu SH, Chen GJ, Wang XJ, Chen EB: On inverse moments of nonnegative weakly convergent random variables. Acta Math Appl Sin 2007, 30: 361–367. (in Chinese)
Acknowledgements
The author is most grateful to the Editor Andrei Volodin and an anonymous referee for their careful reading of the manuscript and valuable suggestions, which helped significantly improve an earlier version of this article.
The research was supported by the National Natural Science Foundation of China (11171001, 71071002), the Academic Innovation Team of Anhui University (KJTD001B), Provincial Natural Science Research Project of Anhui Colleges (KJ2010A005), Talents youth Fund of Anhui Province Universities (2010SQRL016ZD) and Youth Science Research Fund of Anhui University (2009QN011A).
Competing interests
The author declares that they have no competing interests.
Authors' contributions
Some probability inequalities and moment inequalities for extended negatively dependent (END) sequence are provided. The asymptotic approximation of inverse moment for nonnegative END sequence with finite first moments is obtained. All authors read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Shen, A. Probability inequalities for END sequence and their applications. J Inequal Appl 2011, 98 (2011). https://doi.org/10.1186/1029-242X-2011-98