Journal of Inequalities and Applications


Moment Inequalities and Complete Moment Convergence

Journal of Inequalities and Applications 2009, 2009:271265

DOI: 10.1155/2009/271265

Received: 22 August 2009

Accepted: 26 September 2009

Published: 11 October 2009

Abstract

Let and be sequences of random variables. For any and , bounds for and are obtained. From these results, we establish general methods for obtaining the complete moment convergence. The results of Chow (1988), Zhu (2007), and Wu and Zhu (2009) are generalized and extended from independent (or dependent) random variables to random variables satisfying some mild conditions. Some applications to dependent random variables are discussed.

1. Introduction

Let $\{X_n, n \ge 1\}$ be a sequence of random variables defined on a fixed probability space $(\Omega, \mathcal{F}, P)$. Among the inequalities most useful in probability theory are the Marcinkiewicz–Zygmund and Rosenthal inequalities. For a sequence $\{X_n, n \ge 1\}$ of i.i.d. random variables with $EX_1 = 0$ and $E|X_1|^q < \infty$ for some $q \ge 1$, Marcinkiewicz and Zygmund [1] and Rosenthal [2] ($1 \le q \le 2$ and $q > 2$, resp.) proved that there exist positive constants $A_q$ and $B_q$ depending only on $q$ such that

$E\left|\sum_{i=1}^{n} X_i\right|^q \le A_q \sum_{i=1}^{n} E|X_i|^q$ for $1 \le q \le 2$, (1.1)

$E\left|\sum_{i=1}^{n} X_i\right|^q \le B_q \left\{ \sum_{i=1}^{n} E|X_i|^q + \left( \sum_{i=1}^{n} EX_i^2 \right)^{q/2} \right\}$ for $q > 2$. (1.2)

The following Marcinkiewicz–Zygmund and Rosenthal type maximal inequalities are well known. For a sequence $\{X_n, n \ge 1\}$ of i.i.d. random variables with $EX_1 = 0$ and $E|X_1|^q < \infty$ for some $q \ge 1$, there exist positive constants $A_q$ and $B_q$ depending only on $q$ such that

$E\max_{1 \le k \le n}\left|\sum_{i=1}^{k} X_i\right|^q \le A_q \sum_{i=1}^{n} E|X_i|^q$ for $1 \le q \le 2$, (1.3)

$E\max_{1 \le k \le n}\left|\sum_{i=1}^{k} X_i\right|^q \le B_q \left\{ \sum_{i=1}^{n} E|X_i|^q + \left( \sum_{i=1}^{n} EX_i^2 \right)^{q/2} \right\}$ for $q > 2$. (1.4)

Note that (1.3) and (1.4) imply (1.1) and (1.2), respectively. The above inequalities have been obtained for dependent random variables by many authors. Shao [3] proved that (1.3) and (1.4) hold for negatively associated random variables. Asadian et al. [4] proved that (1.1) and (1.2) hold for negatively orthant dependent random variables.

For certain sequences of mixing random variables, (1.4) also holds; however, the constant then depends on both $q$ and the mixing rate of the sequence. Shao [5] obtained (1.4) for $\varphi$-mixing identically distributed random variables satisfying $\sum_{n=1}^{\infty} \varphi^{1/2}(n) < \infty$. Shao [6] also obtained (1.4) for $\rho$-mixing identically distributed random variables satisfying $\sum_{n=1}^{\infty} \rho^{2/q}(2^n) < \infty$. Utev and Peligrad [7] obtained (1.4) for $\tilde{\rho}$-mixing random variables.

The concept of complete convergence was introduced by Hsu and Robbins [8]. A sequence $\{Y_n, n \ge 1\}$ of random variables is said to converge completely to the constant $\theta$ if

$\sum_{n=1}^{\infty} P(|Y_n - \theta| > \epsilon) < \infty$ for all $\epsilon > 0$. (1.5)

In view of the Borel–Cantelli lemma, (1.5) implies that $Y_n \to \theta$ almost surely, since it forces $P(|Y_n - \theta| > \epsilon \text{ infinitely often}) = 0$ for every $\epsilon > 0$. Complete convergence is therefore an important tool for establishing the almost sure convergence of sums of random variables. Hsu and Robbins [8] proved that the sequence of arithmetic means of i.i.d. random variables converges completely to the expected value if the variance of the summands is finite. Erdös [9] proved the converse.
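As a quick numerical illustration of the Hsu–Robbins theorem (not part of the original paper; the function name and parameters below are our own), the tail probabilities of the arithmetic means of i.i.d. mean-zero summands with finite variance fall off rapidly in $n$, which is what makes the series in (1.5) summable:

```python
import random

def tail_prob(n, eps=0.5, reps=2000, seed=0):
    """Monte Carlo estimate of P(|S_n / n| > eps) for i.i.d. Uniform(-1, 1) summands."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        s = sum(rng.uniform(-1.0, 1.0) for _ in range(n))
        if abs(s / n) > eps:
            hits += 1
    return hits / reps

if __name__ == "__main__":
    # Estimates drop off rapidly as n grows, consistent with (1.5).
    for n in (2, 20, 200):
        print(n, tail_prob(n))
```

Here the exact value for $n = 2$ is $1/4$ (the mean of two Uniform$(-1,1)$ variables has a triangular density), while for $n = 20$ and $n = 200$ the probability is already far below Monte Carlo resolution.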

The result of Hsu-Robbins-Erdös has been generalized and extended in several directions. Baum and Katz [10] proved that if $\{X_n, n \ge 1\}$ is a sequence of i.i.d. random variables with $EX_1 = 0$, then $E|X_1|^{pt} < \infty$ ($t \ge 1$, $0 < p < 2$, $pt \ge 1$) is equivalent to

$\sum_{n=1}^{\infty} n^{t-2} P\left( \max_{1 \le k \le n} |S_k| > \epsilon n^{1/p} \right) < \infty$ for all $\epsilon > 0$, (1.6)

where $S_k = \sum_{i=1}^{k} X_i$. Chow [11] generalized the result of Baum and Katz [10] by showing the following complete moment convergence: if $\{X_n, n \ge 1\}$ is a sequence of i.i.d. random variables with $EX_1 = 0$ and $E\{|X_1|^{pt} + |X_1|\} < \infty$ for some $t \ge 1$ and $0 < p < 2$ with $pt \ge 1$, then

$\sum_{n=1}^{\infty} n^{t-2-1/p} E\left( \max_{1 \le k \le n} |S_k| - \epsilon n^{1/p} \right)^+ < \infty$ for all $\epsilon > 0$, (1.7)

where $x^+ = \max\{x, 0\}$. Note that (1.7) implies (1.6) (see Remark 2.6).

Recently, Zhu [12] obtained a complete convergence result for $\tilde{\rho}$-mixing random variables. Wu and Zhu [13] obtained complete moment convergence results for negatively orthant dependent random variables.

In this paper, we give general methods for obtaining the complete moment convergence by using some moment inequalities. From these results, we generalize and extend the results of Chow [11], Zhu [12], and Wu and Zhu [13] from independent (or dependent) random variables to random variables satisfying some conditions similar to (1.1)–(1.4).

2. Complete Moment Convergence for Random Variables

In this section, we give general methods for obtaining the complete moment convergence by using some moment inequalities. The first two lemmas are simple inequalities for real numbers.

Lemma 2.1.

For any real numbers , , , the following inequality holds:
(2.1)

Proof.

The result follows by an elementary calculation.
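The exact statement of (2.1) is not shown; an elementary inequality of this type, offered here only as an illustrative sketch following from the triangle inequality, is: for all real $a$, $b$ and every $\epsilon > 0$,

```latex
% Since |a+b| - \epsilon \le (|a| - \epsilon) + |b| \le (|a| - \epsilon)^+ + |b|,
% and the right-hand side is nonnegative, taking positive parts gives
(|a+b| - \epsilon)^+ \;\le\; (|a| - \epsilon)^+ \,+\, |b|.
```

Bounds of this shape are what allow the "perturbation" term to be split off from the main sum in the lemmas below.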

The following lemma is a slight generalization of Lemma 2.1.

Lemma 2.2.

Let and be two sequences of real numbers. Then for any real number , the following inequality holds:
(2.2)

Proof.

By Lemma 2.1, we obtain
(2.3)

The next two lemmas play essential roles in the paper. Lemma 2.3 gives a moment inequality for the sum of random variables.

Lemma 2.3.

Let and be sequences of random variables. Then for any , , ,
(2.4)

Proof.

By Lemma 2.1,
(2.5)
On the other hand, we have by Markov's inequality that
(2.6)

Substituting (2.6) into (2.5), we have the result.
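The Markov-type step (2.6) can be sketched in generic notation (a reconstruction, since the exact symbols of (2.6) are not shown): for a random variable $T$, $q > 1$, and $\epsilon > 0$,

```latex
E(|T| - \epsilon)^+
  = \int_0^\infty P(|T| > \epsilon + t)\,dt          % tail-integral formula for the positive part
  \le \int_0^\infty \frac{E|T|^q}{(\epsilon + t)^q}\,dt   % Markov's inequality applied pointwise
  = \frac{E|T|^q}{(q-1)\,\epsilon^{\,q-1}}.          % \int_0^\infty (\epsilon+t)^{-q}\,dt = \epsilon^{1-q}/(q-1)
```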

The following lemma gives a moment inequality for the maximum partial sum of random variables.

Lemma 2.4.

Let and be sequences of random variables. Then for any , , ,
(2.7)

Proof.

By Lemma 2.2,
(2.8)

The rest of the proof is similar to that of Lemma 2.3 and is omitted.

Now we state and prove one of our main results. The following theorem gives a general method for obtaining the complete moment convergence for sums of random variables satisfying (2.9). Condition (2.9) is the well-known Marcinkiewicz-Zygmund inequality.

Theorem 2.5.

Let be an array of random variables with for , . Let and be sequences of positive real numbers. Suppose that the following conditions hold.

(i)For some , there exists a positive constant depending only on such that
(2.9)

where .

(ii) .

(iii) .

Then
(2.10)

Proof.

Observe that
(2.11)
(2.12)
Then we have by Lemma 2.3, (2.9), (2.11), and (2.12) that
(2.13)

The above two series converge by (ii) and (iii). Hence the result is proved.

Remark 2.6.

If (2.10) holds, then for all , since
(2.14)

Hence complete moment convergence is more general than complete convergence.
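The inequality behind (2.14) can be sketched in generic notation (a reconstruction, assuming $a_n$ and $b_n$ are the weight and normalizing sequences of the theorem): for any random variable $T$ and $\epsilon > 0$,

```latex
E(|T| - \epsilon b_n)^+
  = \int_0^\infty P(|T| > \epsilon b_n + t)\,dt
  \ge \int_0^{\epsilon b_n} P(|T| > \epsilon b_n + t)\,dt   % restrict to t \le \epsilon b_n
  \ge \epsilon b_n \, P(|T| > 2\epsilon b_n),               % the integrand is \ge P(|T| > 2\epsilon b_n)
```

so summability of the weighted moments on the left forces summability of the corresponding weighted probabilities, which is complete convergence.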

When $q > 2$, we have the following theorem. Condition (2.15) is the well-known Rosenthal inequality.

Theorem 2.7.

Let be an array of random variables with for , . Let and be sequences of positive real numbers. Suppose that the following conditions hold.

(i)For some , there exists a positive constant depending only on such that
(2.15)

where .

(ii) .

(iii) .

(iv) for some .

Then (2.10) holds.

Proof.

The proof is the same as that of Theorem 2.5 except that
(2.16)

Corollary 2.8.

Let be a sequence of positive real numbers. Let be an array of random variables satisfying (2.15) for some . Suppose that the following conditions hold.

(i) .

(ii) .

(iii) for some and .

Then
(2.17)
and hence,
(2.18)

Proof.

By Remark 2.6, (2.17) implies (2.18). To prove (2.17), we apply Theorem 2.7 with . Since ,
(2.19)

Hence the result follows by Theorem 2.7.

The following theorem gives a general method for obtaining the complete moment convergence for maximum partial sums of random variables satisfying condition (2.20).

Theorem 2.9.

Let be an array of random variables with for , . Let and be sequences of positive real numbers. Suppose that the following conditions hold.

(i)For some , there exists a positive constant depending only on such that
(2.20)

where .

(ii) .

(iii) .

Then
(2.21)

Proof.

The proof is similar to that of Theorem 2.5. We have by Lemma 2.4, (2.20), (ii), and (iii) that
(2.22)

Hence the result is proved.

Remark 2.10.

If (2.21) holds, then for all , since, as in Remark 2.6,
(2.23)

When $q > 2$, we have the following theorem.

Theorem 2.11.

Let be an array of random variables with for , . Let and be sequences of positive real numbers. Suppose that the following conditions hold.

(i)For some , there exists a positive constant depending only on such that
(2.24)

where .

(ii) .

(iii) .

(iv) for some .

Then (2.21) holds.

Proof.

The proof is similar to that of Theorem 2.9 and is omitted.

Corollary 2.12.

Let be a sequence of positive real numbers. Let be an array of random variables satisfying (2.24) for some . Suppose that the following conditions hold.

(i) .

(ii) .

(iii) for some and .

Then
(2.25)
and hence,
(2.26)

Proof.

By Remark 2.10, (2.25) implies (2.26). As in the proof of Corollary 2.8,
(2.27)

Hence the result follows by Theorem 2.11 with .

3. Corollaries

In this section, we establish some complete moment convergence results by using the results obtained in the previous section.

Throughout this section, let be a sequence of positive even functions satisfying
(3.1)

for some .

To obtain complete moment convergence results, the following lemmas are needed.

Lemma 3.1.

Let be a random variable and a sequence of positive even functions satisfying (3.1) for some . Then for all and , the following hold.

(i)If , then .

(ii) .

Proof.

First note by (3.1) that is an increasing function. If , then implies , and so
(3.2)
Hence (i) holds. Since ,
(3.3)

So (ii) holds.

Lemma 3.2.

Let be an array of random variables with for , . Let and be sequences of positive real numbers. Assume that is a sequence of positive even functions satisfying (3.1) for some and
(3.4)

Then the following hold.

(i)If , then .

(ii) .

Proof.

The result follows from Lemma 3.1.

By using Lemma 3.2, we can obtain Corollaries 3.3, 3.4, 3.5, and 3.6 from Theorem 2.5, Corollary 2.8, Theorem 2.9, and Corollary 2.12, respectively.

Corollary 3.3.

Let and be sequences of positive real numbers, and let be a sequence of positive even functions satisfying (3.1) for some . Assume that is an array of random variables satisfying (2.9) for and (3.4). Then (2.10) holds.

Corollary 3.4.

Let be a sequence of positive real numbers, and let be a sequence of positive even functions satisfying (3.1) for some . Assume that is an array of random variables satisfying (2.15) for some ( is the same as in (3.6)),
(3.5)
(3.6)

Then (2.17) holds and hence, (2.18) holds.

Corollary 3.5.

Let and be sequences of positive real numbers, and let be a sequence of positive even functions satisfying (3.1) for some . Assume that is an array of random variables satisfying (2.20) for and (3.4). Then (2.21) holds.

Corollary 3.6.

Let be a sequence of positive real numbers, and let be a sequence of positive even functions satisfying (3.1) for some ( is the same as in (3.6)). Assume that is an array of random variables satisfying (2.24) for some , (3.5), and (3.6). Then (2.25) holds and hence, (2.26) holds.

Remark 3.7.

Marcinkiewicz-Zygmund and Rosenthal (type) inequalities hold for dependent random variables as well as independent random variables.
1. (1)

For an array of rowwise negatively associated random variables, condition (2.20) holds if $1 \le q \le 2$, and (2.24) holds if $q > 2$, by Shao's [3] results. Note that is still an array of rowwise negatively associated random variables. Hence Corollaries 3.3–3.6 hold for arrays of rowwise negatively associated random variables.

2. (2)

For an array of rowwise negatively orthant dependent random variables, condition (2.9) holds if $1 \le q \le 2$, and (2.15) holds if $q > 2$, by the results of Asadian et al. [4]. Hence Corollaries 3.3 and 3.4 hold for arrays of rowwise negatively orthant dependent random variables. These results were also proved by Wu and Zhu [13]. Hence Corollaries 3.3 and 3.4 extend the results of Wu and Zhu [13] from arrays of negatively orthant dependent random variables to arrays of random variables satisfying (2.9) and (2.15).

3. (3)

For an array of rowwise $\tilde{\rho}$-mixing random variables, condition (2.24) does not necessarily hold with a constant depending only on $q$. As mentioned in Section 1, Utev and Peligrad [7] proved (1.4) for $\tilde{\rho}$-mixing random variables; however, the constant depends on both $q$ and the sequence of $\tilde{\rho}$-mixing random variables. Hence condition (2.24) holds for an array of rowwise $\tilde{\rho}$-mixing random variables under the additional condition that the constants, which depend on the sequence of random variables in each row, are uniformly bounded. So Corollary 3.6 holds for arrays of rowwise $\tilde{\rho}$-mixing random variables satisfying this additional condition. Zhu [12] obtained only (2.26) in Corollary 3.6 when the array consists of rowwise $\tilde{\rho}$-mixing random variables satisfying the additional condition; this additional condition should be added in Zhu [12]. Hence Corollary 3.6 generalizes and extends Zhu's [12] result from $\tilde{\rho}$-mixing random variables to more general random variables.

Finally, we apply the complete moment convergence results obtained in the previous section to a sequence of identically distributed random variables.

Corollary 3.8.

Let be a sequence of identically distributed random variables with for some and . Assume that for any , there exists a positive constant depending only on such that
(3.7)

where . Then (1.7) holds.

Proof.

Let for , . We apply Theorem 2.7 with and . Take and such that , , and . Then it is easy to see that
(3.8)

Hence the result follows from Theorem 2.7.
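For orientation, the normalization in this proof would, under the standard Baum–Katz parametrization (our assumption, chosen to match the form of (1.7)), be

```latex
a_n = n^{\,t-2-1/p}, \qquad b_n = n^{1/p},
```

under which the conclusion (2.10) of Theorem 2.7 becomes exactly the series appearing in (1.7).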

Corollary 3.9.

Let be a sequence of identically distributed random variables with for some and . Assume that for any , there exists a positive constant depending only on such that
(3.9)
where . Then
(3.10)

Proof.

As in the proof of Corollary 3.8, (3.8) is satisfied. So the result follows from Theorem 2.11.

Remark 3.10.

If is a sequence of i.i.d. random variables, then conditions (3.7) and (3.9) are satisfied when . Hence Corollaries 3.8 and 3.9 generalize and extend the result of Chow [11]. There are many sequences of dependent random variables satisfying (3.7) for all . Examples include sequences of negatively orthant dependent random variables, negatively associated random variables, $\tilde{\rho}$-mixing random variables, $\varphi$-mixing identically distributed random variables satisfying the mixing-rate condition of Shao [5], and $\rho$-mixing identically distributed random variables satisfying the mixing-rate condition of Shao [6]. The above sequences of dependent random variables, except the negatively orthant dependent ones, also satisfy (3.9) when . Hence Corollaries 3.8 and 3.9 hold for many dependent as well as independent random variables.

Declarations

Acknowledgments

The author would like to thank the referees for the helpful comments and suggestions. This work was supported by the Korea Science and Engineering Foundation (KOSEF) grant funded by the Korea government (MOST) (no. R01-2007-000-20053-0).

Authors’ Affiliations

(1)
Department of Applied Mathematics, Pai Chai University

References

1. Marcinkiewicz J, Zygmund A: Sur les fonctions indépendantes. Fundamenta Mathematicae 1937, 29: 60–90.
2. Rosenthal HP: On the subspaces of $L^p$ ($p > 2$) spanned by sequences of independent random variables. Israel Journal of Mathematics 1970, 8: 273–303. 10.1007/BF02771562
3. Shao Q-M: A comparison theorem on moment inequalities between negatively associated and independent random variables. Journal of Theoretical Probability 2000,13(2):343–356. 10.1023/A:1007849609234
4. Asadian N, Fakoor V, Bozorgnia A: Rosenthal's type inequalities for negatively orthant dependent random variables. Journal of the Iranian Statistical Society 2006, 5: 69–75.
5. Shao Q-M: A moment inequality and its applications. Acta Mathematica Sinica 1988,31(6):736–747.
6. Shao Q-M: Maximal inequalities for partial sums of $\rho$-mixing sequences. The Annals of Probability 1995, 23(2): 948–965. 10.1214/aop/1176988297
7. Utev S, Peligrad M: Maximal inequalities and an invariance principle for a class of weakly dependent random variables. Journal of Theoretical Probability 2003,16(1):101–115. 10.1023/A:1022278404634
8. Hsu PL, Robbins H: Complete convergence and the law of large numbers. Proceedings of the National Academy of Sciences of the United States of America 1947, 33: 25–31. 10.1073/pnas.33.2.25
9. Erdös P: On a theorem of Hsu and Robbins. Annals of Mathematical Statistics 1949, 20: 286–291. 10.1214/aoms/1177730037
10. Baum LE, Katz M: Convergence rates in the law of large numbers. Transactions of the American Mathematical Society 1965, 120: 108–123. 10.1090/S0002-9947-1965-0198524-1
11. Chow YS: On the rate of moment convergence of sample sums and extremes. Bulletin of the Institute of Mathematics. Academia Sinica 1988,16(3):177–201.
12. Zhu M-H: Strong laws of large numbers for arrays of rowwise $\tilde{\rho}$-mixing random variables. Discrete Dynamics in Nature and Society 2007, 2007: 6 pages.
13. Wu Y-F, Zhu D-J: Convergence properties of partial sums for arrays of rowwise negatively orthant dependent random variables. Journal of the Korean Statistical Society. In press.