
Complete qth moment convergence for arrays of random variables

Journal of Inequalities and Applications 2013, 2013:24

https://doi.org/10.1186/1029-242X-2013-24

Received: 9 October 2012

Accepted: 3 January 2013

Published: 17 January 2013

Abstract

Let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of random variables with $EX_{ni}=0$ and $E|X_{ni}|^{q}<\infty$ for some $q\ge 1$. For any sequences $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ of positive real numbers, sets of sufficient conditions are given for complete $q$th moment convergence of the form
$$\sum_{n=1}^{\infty} b_n a_n^{-q} E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|-\epsilon a_n\Bigr)_+^{q}<\infty,\quad \epsilon>0,$$
where $x_+=\max\{x,0\}$. From these results, we can easily obtain some known results on complete $q$th moment convergence.

Keywords

complete convergence; complete moment convergence; $L_q$-convergence; dependent random variables

1 Introduction

The concept of complete convergence was introduced by Hsu and Robbins [1]. A sequence $\{X_n, n\ge 1\}$ of random variables is said to converge completely to the constant $\theta$ if
$$\sum_{n=1}^{\infty} P\bigl(|X_n-\theta|>\epsilon\bigr)<\infty \quad\text{for all } \epsilon>0.$$
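For illustration (this elementary computation is not part of the original paper and uses the stronger assumption $EX_1^4<\infty$), Markov's inequality already gives complete convergence of the sample mean of i.i.d. mean-zero random variables to 0:
$$P\Bigl(\Bigl|\frac{1}{n}\sum_{i=1}^{n}X_i\Bigr|>\epsilon\Bigr)\le \frac{E\bigl(\sum_{i=1}^{n}X_i\bigr)^{4}}{n^{4}\epsilon^{4}}=\frac{nEX_1^{4}+3n(n-1)(EX_1^{2})^{2}}{n^{4}\epsilon^{4}}\le \frac{C}{n^{2}\epsilon^{4}},$$
which is summable in $n$. The theorem of Hsu and Robbins requires only $EX_1^{2}<\infty$.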

Hsu and Robbins [1] proved that the sequence of arithmetic means of i.i.d. random variables converges completely to the expected value if the variance of the summands is finite. Erdös [2] proved the converse.

The result of Hsu, Robbins, and Erdös has been generalized and extended in several directions. Baum and Katz [3] proved that if $\{X_n, n\ge 1\}$ is a sequence of i.i.d. random variables with $EX_1=0$ and $1\le p<2$, $t\ge 1$, then $E|X_1|^{pt}<\infty$ is equivalent to
$$\sum_{n=1}^{\infty} n^{t-2} P\Bigl(\Bigl|\sum_{i=1}^{n}X_i\Bigr|>\epsilon n^{1/p}\Bigr)<\infty \quad\text{for all } \epsilon>0. \tag{1.1}$$
Chow [4] generalized the result of Baum and Katz [3] by showing the following complete moment convergence. If $\{X_n, n\ge 1\}$ is a sequence of i.i.d. random variables with $EX_1=0$ and $E(|X_1|^{pt}+|X_1|\log(1+|X_1|))<\infty$ for some $0<p<2$, $t\ge 1$, and $pt\ge 1$, then
$$\sum_{n=1}^{\infty} n^{t-2-1/p} E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_i\Bigr|-\epsilon n^{1/p}\Bigr)_+<\infty \quad\text{for all } \epsilon>0, \tag{1.2}$$
where $x_+=\max\{x,0\}$. Note that (1.2) implies (1.1). Li and Spătaru [5] gave a refinement of the result of Baum and Katz [3] as follows. Let $\{X_n, n\ge 1\}$ be a sequence of i.i.d. random variables with $EX_1=0$, and let $0<p<2$, $t\ge 1$, $q>0$, and $pt\ge 1$. Then
$$\begin{cases} E|X_1|^{q}<\infty & \text{if } q>pt,\\ E|X_1|^{pt}\log(1+|X_1|)<\infty & \text{if } q=pt,\\ E|X_1|^{pt}<\infty & \text{if } q<pt, \end{cases} \tag{1.3}$$
if and only if
$$\int_{\epsilon}^{\infty}\sum_{n=1}^{\infty} n^{t-2} P\Bigl(\Bigl|\sum_{i=1}^{n}X_i\Bigr|>x^{1/q}n^{1/p}\Bigr)\,dx<\infty \quad\text{for all } \epsilon>0.$$
Recently, Chen and Wang [6] proved that for any $q>0$, any sequences $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ of positive real numbers, and any sequence $\{Z_n, n\ge 1\}$ of random variables,
$$\int_{\epsilon}^{\infty}\sum_{n=1}^{\infty} b_n P\bigl(|Z_n|>x^{1/q}a_n\bigr)\,dx<\infty \quad\text{for all } \epsilon>0$$
and
$$\sum_{n=1}^{\infty} b_n a_n^{-q} E\bigl(|Z_n|-\epsilon a_n\bigr)_+^{q}<\infty \quad\text{for all } \epsilon>0$$
are equivalent. Therefore, if $\{X_n, n\ge 1\}$ is a sequence of i.i.d. random variables with $EX_1=0$ and $0<p<2$, $t\ge 1$, $q>0$, and $pt\ge 1$, then the moment condition (1.3) is equivalent to
$$\sum_{n=1}^{\infty} n^{t-2-q/p} E\Bigl(\Bigl|\sum_{i=1}^{n}X_i\Bigr|-\epsilon n^{1/p}\Bigr)_+^{q}<\infty \quad\text{for all } \epsilon>0. \tag{1.4}$$

When $q=1$, the complete $q$th moment convergence (1.4) reduces to complete moment convergence.

The complete $q$th moment convergence for dependent random variables has been established by many authors. Chen and Wang [7] showed that (1.3) and (1.4) are equivalent for φ-mixing random variables. Zhou and Lin [8] established complete $q$th moment convergence theorems for moving average processes of φ-mixing random variables. Wu et al. [9] obtained complete $q$th moment convergence results for arrays of rowwise $\rho^{*}$-mixing random variables.

The purpose of this paper is to provide sets of sufficient conditions for complete $q$th moment convergence of the form
$$\sum_{n=1}^{\infty} b_n a_n^{-q} E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|-\epsilon a_n\Bigr)_+^{q}<\infty \quad\text{for all } \epsilon>0, \tag{1.5}$$
where $q\ge 1$, $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ are sequences of positive real numbers, and $\{X_{ni}, 1\le i\le n, n\ge 1\}$ is an array of random variables satisfying Marcinkiewicz-Zygmund and Rosenthal type inequalities. When $q=1$, similar results were established by Sung [10]. From our results, we can easily obtain the results of Chen and Wang [7] and Wu et al. [9].
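To see how (1.5) covers the results recalled above, take $X_{ni}=X_i$, $a_n=n^{1/p}$, and $b_n=n^{t-2}$ (the choice made in Corollary 3.1 below). Then $b_na_n^{-q}=n^{t-2-q/p}$ and (1.5) becomes
$$\sum_{n=1}^{\infty} n^{t-2-q/p} E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_i\Bigr|-\epsilon n^{1/p}\Bigr)_+^{q}<\infty \quad\text{for all } \epsilon>0,$$
which is a maximal version of (1.4) and reduces to (1.2) when $q=1$.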

2 Main results

In this section, we give sets of sufficient conditions for complete $q$th moment convergence (1.5). The following theorem gives sufficient conditions under the assumption that the array $\{X_{ni}, 1\le i\le n, n\ge 1\}$ satisfies a Marcinkiewicz-Zygmund type inequality.

Theorem 2.1 Let $1\le q<2$ and let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of random variables with $EX_{ni}=0$ and $E|X_{ni}|^{q}<\infty$ for $1\le i\le n$ and $n\ge 1$. Let $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold:
  (i) for some $s$ ($1\le q<s\le 2$), there exists a positive function $\alpha_s(x)$ such that
$$E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|^{s}\le \alpha_s(n)\sum_{i=1}^{n}E\bigl|X_{ni}(x)\bigr|^{s} \quad\text{for all } x\ge a_n^{q}, \tag{2.1}$$
where $X_{ni}(x)=X_{ni}I(|X_{ni}|\le x^{1/q})+x^{1/q}I(X_{ni}>x^{1/q})-x^{1/q}I(X_{ni}<-x^{1/q})$;
  (ii) $\displaystyle\sum_{n=1}^{\infty} b_n a_n^{-s}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)<\infty$;
  (iii) $\displaystyle\sum_{n=1}^{\infty} b_n a_n^{-q}\bigl(1+\alpha_s(n)\bigr)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n)<\infty$;
  (iv) $\displaystyle\sum_{i=1}^{n}E|X_{ni}|I(|X_{ni}|>a_n)/a_n\to 0$.

Then (1.5) holds.

Proof It is obvious that
$$\begin{aligned} \sum_{n=1}^{\infty} b_n a_n^{-q} E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|-\epsilon a_n\Bigr)_+^{q} &=\sum_{n=1}^{\infty} b_n a_n^{-q}\int_{0}^{\infty}P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n+x^{1/q}\Bigr)\,dx\\ &\le \sum_{n=1}^{\infty} b_n a_n^{-q}\Bigl\{\int_{0}^{a_n^{q}}P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n\Bigr)\,dx+\int_{a_n^{q}}^{\infty}P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>x^{1/q}\Bigr)\,dx\Bigr\}\\ &:=I_1+I_2. \end{aligned}$$
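(For the reader's convenience, the first equality is the elementary tail-integral identity: for a nonnegative random variable $Y$ and constants $\epsilon, a>0$,
$$E(Y-\epsilon a)_+^{q}=\int_{0}^{\infty}P\bigl((Y-\epsilon a)_+^{q}>x\bigr)\,dx=\int_{0}^{\infty}P\bigl(Y>\epsilon a+x^{1/q}\bigr)\,dx,$$
applied with $Y=\max_{1\le k\le n}|\sum_{i=1}^{k}X_{ni}|$ and $a=a_n$. The inequality then follows since $P(Y>\epsilon a_n+x^{1/q})\le P(Y>\epsilon a_n)$ for $0<x\le a_n^{q}$ and $P(Y>\epsilon a_n+x^{1/q})\le P(Y>x^{1/q})$ for $x>a_n^{q}$.)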
We first show that $I_1<\infty$. For $1\le i\le n$ and $n\ge 1$, define
$$X_{ni}'=X_{ni}I(|X_{ni}|\le a_n)+a_nI(X_{ni}>a_n)-a_nI(X_{ni}<-a_n),\qquad X_{ni}''=X_{ni}-X_{ni}'.$$
Then we have by $EX_{ni}=0$, Markov's inequality, and (i) that
$$\begin{aligned} P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n\Bigr) &=P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}'-EX_{ni}'+X_{ni}''-EX_{ni}''\bigr)\Bigr|>\epsilon a_n\Bigr)\\ &\le P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}'-EX_{ni}'\bigr)\Bigr|>\epsilon a_n/2\Bigr)+P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}''-EX_{ni}''\bigr)\Bigr|>\epsilon a_n/2\Bigr)\\ &\le 2^{s}\epsilon^{-s}a_n^{-s}E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}'-EX_{ni}'\bigr)\Bigr|^{s}+2\epsilon^{-1}a_n^{-1}E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}''-EX_{ni}''\bigr)\Bigr|\\ &\le 2^{s}\epsilon^{-s}a_n^{-s}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}'|^{s}+4\epsilon^{-1}a_n^{-1}\sum_{i=1}^{n}E|X_{ni}''|\\ &\le 2^{s}\epsilon^{-s}a_n^{-s}\alpha_s(n)\sum_{i=1}^{n}\bigl(E|X_{ni}|^{s}I(|X_{ni}|\le a_n)+a_n^{s}P(|X_{ni}|>a_n)\bigr)+4\epsilon^{-1}a_n^{-1}\sum_{i=1}^{n}E|X_{ni}|I(|X_{ni}|>a_n)\\ &\le 2^{s}\epsilon^{-s}a_n^{-s}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)+2^{s}\epsilon^{-s}a_n^{-q}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n)\\ &\quad+4\epsilon^{-1}a_n^{-q}\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n). \end{aligned}$$
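None of the steps above is new, but we record the elementary bounds behind them. The constant 4 comes from
$$E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}''-EX_{ni}''\bigr)\Bigr|\le \sum_{i=1}^{n}E|X_{ni}''-EX_{ni}''|\le 2\sum_{i=1}^{n}E|X_{ni}''|\le 2\sum_{i=1}^{n}E|X_{ni}|I(|X_{ni}|>a_n),$$
since $|X_{ni}''|\le|X_{ni}|I(|X_{ni}|>a_n)$; the last inequality of the chain uses $a_n^{s}P(|X_{ni}|>a_n)\le a_n^{s-q}E|X_{ni}|^{q}I(|X_{ni}|>a_n)$ and, because $q\ge 1$, $a_n^{-1}E|X_{ni}|I(|X_{ni}|>a_n)\le a_n^{-q}E|X_{ni}|^{q}I(|X_{ni}|>a_n)$.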
It follows that
$$\begin{aligned} I_1&=\sum_{n=1}^{\infty} b_n P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n\Bigr)\\ &\le 2^{s}\epsilon^{-s}\sum_{n=1}^{\infty} b_n a_n^{-s}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)+\sum_{n=1}^{\infty} b_n a_n^{-q}\bigl(2^{s}\epsilon^{-s}\alpha_s(n)+4\epsilon^{-1}\bigr)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n). \end{aligned}$$

Hence $I_1<\infty$ by (ii) and (iii).

We next show that $I_2<\infty$. By the definition of $X_{ni}(x)$, we have that
$$P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>x^{1/q}\Bigr)\le \sum_{i=1}^{n}P\bigl(|X_{ni}|>x^{1/q}\bigr)+P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}(x)\Bigr|>x^{1/q}\Bigr).$$
We also have by $EX_{ni}=0$ and (iv) that
$$\sup_{x\ge a_n^{q}}\max_{1\le k\le n}x^{-1/q}\Bigl|\sum_{i=1}^{k}EX_{ni}(x)\Bigr|=\sup_{x\ge a_n^{q}}\max_{1\le k\le n}x^{-1/q}\Bigl|\sum_{i=1}^{k}E\bigl(X_{ni}-X_{ni}(x)\bigr)\Bigr|\le \sup_{x\ge a_n^{q}}x^{-1/q}\sum_{i=1}^{n}E|X_{ni}|I\bigl(|X_{ni}|>x^{1/q}\bigr)\le a_n^{-1}\sum_{i=1}^{n}E|X_{ni}|I(|X_{ni}|>a_n)\to 0.$$
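Spelling out the reduction that follows: by the display above, for all sufficiently large $n$ and all $x\ge a_n^{q}$ we have $\max_{1\le k\le n}|\sum_{i=1}^{k}EX_{ni}(x)|\le x^{1/q}/2$, so that
$$P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}(x)\Bigr|>x^{1/q}\Bigr)\le P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|>x^{1/q}/2\Bigr);$$
the finitely many remaining values of $n$ contribute finite terms since $E|X_{ni}|^{q}<\infty$.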
Hence to prove that $I_2<\infty$, it suffices to show that
$$I_3:=\sum_{n=1}^{\infty} b_n a_n^{-q}\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty}P\bigl(|X_{ni}|>x^{1/q}\bigr)\,dx<\infty,\qquad I_4:=\sum_{n=1}^{\infty} b_n a_n^{-q}\int_{a_n^{q}}^{\infty}P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|>x^{1/q}/2\Bigr)\,dx<\infty.$$
If $x>a_n^{q}$, then $P(|X_{ni}|>x^{1/q})=P(|X_{ni}|I(|X_{ni}|>a_n)>x^{1/q})$ and so
$$\int_{a_n^{q}}^{\infty}P\bigl(|X_{ni}|>x^{1/q}\bigr)\,dx=\int_{a_n^{q}}^{\infty}P\bigl(|X_{ni}|I(|X_{ni}|>a_n)>x^{1/q}\bigr)\,dx\le \int_{0}^{\infty}P\bigl(|X_{ni}|I(|X_{ni}|>a_n)>x^{1/q}\bigr)\,dx=E|X_{ni}|^{q}I(|X_{ni}|>a_n),$$
which implies that
$$I_3\le \sum_{n=1}^{\infty} b_n a_n^{-q}\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n).$$

Hence $I_3<\infty$ by (iii).

Finally, we show that $I_4<\infty$. We get by Markov's inequality and (i) that
$$\begin{aligned} I_4&\le 2^{s}\sum_{n=1}^{\infty} b_n a_n^{-q}\int_{a_n^{q}}^{\infty}x^{-s/q}E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|^{s}\,dx\\ &\le 2^{s}\sum_{n=1}^{\infty} b_n a_n^{-q}\alpha_s(n)\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty}x^{-s/q}E|X_{ni}(x)|^{s}\,dx\\ &=2^{s}\sum_{n=1}^{\infty} b_n a_n^{-q}\alpha_s(n)\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty}x^{-s/q}\bigl(E|X_{ni}|^{s}I(|X_{ni}|\le x^{1/q})+x^{s/q}P(|X_{ni}|>x^{1/q})\bigr)\,dx\\ &=2^{s}\sum_{n=1}^{\infty} b_n a_n^{-q}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)\int_{a_n^{q}}^{\infty}x^{-s/q}\,dx\\ &\quad+2^{s}\sum_{n=1}^{\infty} b_n a_n^{-q}\alpha_s(n)\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty}x^{-s/q}E|X_{ni}|^{s}I(a_n<|X_{ni}|\le x^{1/q})\,dx\\ &\quad+2^{s}\sum_{n=1}^{\infty} b_n a_n^{-q}\alpha_s(n)\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty}P(|X_{ni}|>x^{1/q})\,dx\\ &:=I_5+I_6+I_7. \end{aligned}$$
Using a simple integral and Fubini's theorem, we obtain that
$$I_5+I_6\le \frac{2^{s}q}{s-q}\sum_{n=1}^{\infty} b_n a_n^{-s}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)+\frac{2^{s}q}{s-q}\sum_{n=1}^{\infty} b_n a_n^{-q}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n).$$
Similarly to $I_3$,
$$I_7\le 2^{s}\sum_{n=1}^{\infty} b_n a_n^{-q}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n).$$

Hence $I_4<\infty$ by (ii) and (iii). □
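For completeness, the "simple integral" and the Fubini step used for $I_5$ and $I_6$ are (recall $s>q$):
$$\int_{a_n^{q}}^{\infty}x^{-s/q}\,dx=\frac{q}{s-q}\,a_n^{q-s},\qquad \int_{a_n^{q}}^{\infty}x^{-s/q}E|X_{ni}|^{s}I(a_n<|X_{ni}|\le x^{1/q})\,dx=E\Bigl[|X_{ni}|^{s}I(|X_{ni}|>a_n)\int_{|X_{ni}|^{q}}^{\infty}x^{-s/q}\,dx\Bigr]=\frac{q}{s-q}\,E|X_{ni}|^{q}I(|X_{ni}|>a_n).$$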

The next theorem gives sufficient conditions for complete $q$th moment convergence (1.5) under the assumption that the array $\{X_{ni}, 1\le i\le n, n\ge 1\}$ satisfies a Rosenthal type inequality.

Theorem 2.2 Let $q\ge 1$ and let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of random variables with $EX_{ni}=0$ and $E|X_{ni}|^{q}<\infty$ for $1\le i\le n$ and $n\ge 1$. Let $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold:
  (i) for some $s>\max\{2, 2q/r\}$ ($r$ is the same as in (v)), there exist positive functions $\beta_s(x)$ and $\gamma_s(x)$ such that
$$E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|^{s}\le \beta_s(n)\sum_{i=1}^{n}E|X_{ni}(x)|^{s}+\gamma_s(n)\Bigl(\sum_{i=1}^{n}E|X_{ni}(x)|^{2}\Bigr)^{s/2} \quad\text{for all } x\ge a_n^{q}, \tag{2.2}$$
where $X_{ni}(x)=X_{ni}I(|X_{ni}|\le x^{1/q})+x^{1/q}I(X_{ni}>x^{1/q})-x^{1/q}I(X_{ni}<-x^{1/q})$;
  (ii) $\displaystyle\sum_{n=1}^{\infty} b_n a_n^{-s}\beta_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)<\infty$;
  (iii) $\displaystyle\sum_{n=1}^{\infty} b_n a_n^{-q}\bigl(1+\beta_s(n)\bigr)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n)<\infty$;
  (iv) $\displaystyle\sum_{i=1}^{n}E|X_{ni}|I(|X_{ni}|>a_n)/a_n\to 0$;
  (v) $\displaystyle\sum_{n=1}^{\infty} b_n \gamma_s(n)\Bigl(\sum_{i=1}^{n}a_n^{-r}E|X_{ni}|^{r}\Bigr)^{s/2}<\infty$ for some $0<r\le 2$.

Then (1.5) holds.

Proof The proof is similar to that of Theorem 2.1. As in the proof of Theorem 2.1,
$$\sum_{n=1}^{\infty} b_n a_n^{-q} E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|-\epsilon a_n\Bigr)_+^{q}\le \sum_{n=1}^{\infty} b_n a_n^{-q}\Bigl\{\int_{0}^{a_n^{q}}P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n\Bigr)\,dx+\int_{a_n^{q}}^{\infty}P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>x^{1/q}\Bigr)\,dx\Bigr\}:=J_1+J_2.$$
Similarly to $I_1$ in the proof of Theorem 2.1, we have by the $c_r$-inequality that
$$\begin{aligned} P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n\Bigr) &\le 2^{s}\epsilon^{-s}a_n^{-s}\beta_s(n)\sum_{i=1}^{n}\bigl(E|X_{ni}|^{s}I(|X_{ni}|\le a_n)+a_n^{s}P(|X_{ni}|>a_n)\bigr)\\ &\quad+2^{s}\epsilon^{-s}a_n^{-s}\gamma_s(n)\Bigl(\sum_{i=1}^{n}\bigl(E|X_{ni}|^{2}I(|X_{ni}|\le a_n)+a_n^{2}P(|X_{ni}|>a_n)\bigr)\Bigr)^{s/2}+4\epsilon^{-1}a_n^{-1}\sum_{i=1}^{n}E|X_{ni}|I(|X_{ni}|>a_n)\\ &\le 2^{s}\epsilon^{-s}a_n^{-s}\beta_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)+\bigl(2^{s}\epsilon^{-s}\beta_s(n)+4\epsilon^{-1}\bigr)a_n^{-q}\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n)\\ &\quad+2^{3s/2-1}\epsilon^{-s}\gamma_s(n)\Bigl(\sum_{i=1}^{n}a_n^{-r}E|X_{ni}|^{r}I(|X_{ni}|\le a_n)\Bigr)^{s/2}+2^{3s/2-1}\epsilon^{-s}\gamma_s(n)\Bigl(\sum_{i=1}^{n}a_n^{-r}E|X_{ni}|^{r}I(|X_{ni}|>a_n)\Bigr)^{s/2}. \end{aligned}$$
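The constant $2^{3s/2-1}$ arises from the $c_r$-inequality for the exponent $s/2>1$: for $a,b\ge 0$,
$$(a+b)^{s/2}\le 2^{s/2-1}\bigl(a^{s/2}+b^{s/2}\bigr),$$
applied after bounding, for $0<r\le 2$, $a_n^{-2}E|X_{ni}|^{2}I(|X_{ni}|\le a_n)\le a_n^{-r}E|X_{ni}|^{r}I(|X_{ni}|\le a_n)$ and $P(|X_{ni}|>a_n)\le a_n^{-r}E|X_{ni}|^{r}I(|X_{ni}|>a_n)$, so that $2^{s}\cdot 2^{s/2-1}=2^{3s/2-1}$.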

Hence $J_1<\infty$ by (ii), (iii), and (v).

As in the proof of Theorem 2.1, to prove that $J_2<\infty$, it suffices to show that
$$J_3:=\sum_{n=1}^{\infty} b_n a_n^{-q}\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty}P\bigl(|X_{ni}|>x^{1/q}\bigr)\,dx<\infty,\qquad J_4:=\sum_{n=1}^{\infty} b_n a_n^{-q}\int_{a_n^{q}}^{\infty}P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|>x^{1/q}/2\Bigr)\,dx<\infty.$$

The proof of $J_3<\infty$ is the same as that of $I_3$ in the proof of Theorem 2.1.

For $J_4$, we have by Markov's inequality and (i) that
$$\begin{aligned} J_4&\le 2^{s}\sum_{n=1}^{\infty} b_n a_n^{-q}\int_{a_n^{q}}^{\infty}x^{-s/q}E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|^{s}\,dx\\ &\le 2^{s}\sum_{n=1}^{\infty} b_n a_n^{-q}\int_{a_n^{q}}^{\infty}x^{-s/q}\Bigl\{\beta_s(n)\sum_{i=1}^{n}E|X_{ni}(x)|^{s}+\gamma_s(n)\Bigl(\sum_{i=1}^{n}E|X_{ni}(x)|^{2}\Bigr)^{s/2}\Bigr\}\,dx:=J_5+J_6. \end{aligned}$$
Similarly to $I_4$ in the proof of Theorem 2.1, we get that
$$J_5\le \frac{2^{s}q}{s-q}\sum_{n=1}^{\infty} b_n a_n^{-s}\beta_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)+2^{s}\Bigl(\frac{q}{s-q}+1\Bigr)\sum_{n=1}^{\infty} b_n a_n^{-q}\beta_s(n)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n).$$

Hence $J_5<\infty$ by (ii) and (iii).

Finally, we show that $J_6<\infty$. By the $c_r$-inequality,
$$\begin{aligned} J_6&=2^{s}\sum_{n=1}^{\infty} b_n a_n^{-q}\gamma_s(n)\int_{a_n^{q}}^{\infty}x^{-s/q}\Bigl(\sum_{i=1}^{n}\bigl(E|X_{ni}|^{2}I(|X_{ni}|\le x^{1/q})+x^{2/q}P(|X_{ni}|>x^{1/q})\bigr)\Bigr)^{s/2}\,dx\\ &\le 2^{3s/2-1}\sum_{n=1}^{\infty} b_n a_n^{-q}\gamma_s(n)\int_{a_n^{q}}^{\infty}x^{-s/q}\Bigl(\sum_{i=1}^{n}E|X_{ni}|^{2}I(|X_{ni}|\le x^{1/q})\Bigr)^{s/2}\,dx+2^{3s/2-1}\sum_{n=1}^{\infty} b_n a_n^{-q}\gamma_s(n)\int_{a_n^{q}}^{\infty}\Bigl(\sum_{i=1}^{n}P(|X_{ni}|>x^{1/q})\Bigr)^{s/2}\,dx\\ &\le 2^{3s/2-1}\sum_{n=1}^{\infty} b_n a_n^{-q}\gamma_s(n)\int_{a_n^{q}}^{\infty}x^{-s/q}\Bigl(\sum_{i=1}^{n}E|X_{ni}|^{r}x^{(2-r)/q}\Bigr)^{s/2}\,dx+2^{3s/2-1}\sum_{n=1}^{\infty} b_n a_n^{-q}\gamma_s(n)\int_{a_n^{q}}^{\infty}\Bigl(\sum_{i=1}^{n}x^{-r/q}E|X_{ni}|^{r}\Bigr)^{s/2}\,dx\\ &=2^{3s/2}\,\frac{2q}{rs-2q}\sum_{n=1}^{\infty} b_n \gamma_s(n)\Bigl(\sum_{i=1}^{n}a_n^{-r}E|X_{ni}|^{r}\Bigr)^{s/2}. \end{aligned}$$
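In the last equality, both integrands reduce to the same power of $x$, and the integral converges because $s>2q/r$:
$$x^{-s/q}\bigl(x^{(2-r)/q}\bigr)^{s/2}=\bigl(x^{-r/q}\bigr)^{s/2}=x^{-rs/(2q)},\qquad \int_{a_n^{q}}^{\infty}x^{-rs/(2q)}\,dx=\frac{2q}{rs-2q}\,a_n^{q-rs/2},$$
and $a_n^{-q}\cdot a_n^{q-rs/2}\bigl(\sum_{i=1}^{n}E|X_{ni}|^{r}\bigr)^{s/2}=\bigl(\sum_{i=1}^{n}a_n^{-r}E|X_{ni}|^{r}\bigr)^{s/2}$.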

Hence $J_6<\infty$ by (v). □

Remark 2.1 Marcinkiewicz-Zygmund and Rosenthal type inequalities hold for dependent random variables as well as independent random variables.
  (1) Let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of rowwise negatively associated random variables. Then, for $1<s\le 2$, (2.1) holds for $\alpha_s(n)=2^{s}\cdot 2^{3-s}=8$. For $s>2$, (2.2) holds for $\beta_s(n)=2^{s}\cdot 2(15s/\log s)^{s}$ and $\gamma_s(n)=2(15s/\log s)^{s}$ (see Shao [11]). Note that $\alpha_s(n)$ and $\beta_s(n)$ are multiplied by the factor $2^{s}$ since, by the $c_r$- and Jensen inequalities, $E|X_{ni}(x)-EX_{ni}(x)|^{s}\le 2^{s-1}\bigl(E|X_{ni}(x)|^{s}+|EX_{ni}(x)|^{s}\bigr)\le 2^{s}E|X_{ni}(x)|^{s}$.
  (2) Let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of rowwise negatively orthant dependent random variables. By Corollary 2.2 of Asadian et al. [12] and Theorem 3 of Móricz [13], (2.1) holds for $\alpha_s(n)=C_1(\log n)^{s}$, and (2.2) holds for $\beta_s(n)=C_2(\log n)^{s}$ and $\gamma_s(n)=C_2(\log n)^{s}$, where $C_1$ and $C_2$ are constants depending only on $s$.
  (3) Let $\{X_n, n\ge 1\}$ be a sequence of identically distributed φ-mixing random variables. Set $X_{ni}=X_i$ for $1\le i\le n$ and $n\ge 1$. By Shao's [14] result, (2.2) holds for a constant function $\beta_s(x)$ and a slowly varying function $\gamma_s(x)$. In particular, if $\sum_{n=1}^{\infty}\varphi^{1/2}(2^{n})<\infty$, then (2.2) holds for some constant functions $\beta_s(x)$ and $\gamma_s(x)$.
  (4) Let $\{X_n, n\ge 1\}$ be a sequence of identically distributed ρ-mixing random variables. Set $X_{ni}=X_i$ for $1\le i\le n$ and $n\ge 1$. By Shao's [15] result, (2.2) holds for some slowly varying functions $\beta_s(x)$ and $\gamma_s(x)$. In particular, if $\sum_{n=1}^{\infty}\rho^{2/s}(2^{n})<\infty$, then (2.2) holds for some constant functions $\beta_s(x)$ and $\gamma_s(x)$.
  (5) Let $\{X_n, n\ge 1\}$ be a sequence of $\rho^{*}$-mixing random variables. Set $X_{ni}=X_i$ for $1\le i\le n$ and $n\ge 1$. By the result of Utev and Peligrad [16], (2.2) holds for some constant functions $\beta_s(x)$ and $\gamma_s(x)$.

3 Corollaries

In this section, we establish some complete $q$th moment convergence results by using the results obtained in the previous section.

Corollary 3.1 (Chen and Wang [7])

Let $\{X_n, n\ge 1\}$ be a sequence of identically distributed φ-mixing random variables with $EX_1=0$, and let $t\ge 1$, $0<p<2$, $q\ge 1$, and $pt\ge 1$. Assume that (1.3) holds. Furthermore, suppose that
$$\sum_{n=1}^{\infty}\varphi^{1/2}(2^{n})<\infty$$
if $t=1$ and $\max\{q,pt\}<2$. Then
$$\sum_{n=1}^{\infty} n^{t-2-q/p}E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_i\Bigr|-\epsilon n^{1/p}\Bigr)_+^{q}<\infty \quad\text{for all } \epsilon>0.$$

Proof Let $a_n=n^{1/p}$ and $b_n=n^{t-2}$ for $n\ge 1$, and let $X_{ni}=X_i$ for $1\le i\le n$ and $n\ge 1$. Then, for $s\ge 2$, (2.2) holds for a constant function $\beta_s(x)$ and a slowly varying function $\gamma_s(x)$ (see Remark 2.1(3)). Under the additional condition that $\sum_{n=1}^{\infty}\varphi^{1/2}(2^{n})<\infty$, (2.2) holds for some constant functions $\beta_s(x)$ and $\gamma_s(x)$. In particular, for $s=2$, (2.1) holds for a constant function $\alpha_s(x)$ under this additional condition.

By a standard method, we have that
$$\begin{aligned} &\sum_{n=1}^{\infty} n^{t-1-s/p}E|X_1|^{s}I(|X_1|\le n^{1/p})\le CE|X_1|^{pt} \quad\text{if } pt<s,\\ &\sum_{n=1}^{\infty} n^{t-1-q/p}E|X_1|^{q}I(|X_1|>n^{1/p})\le \begin{cases} CE|X_1|^{q} & \text{if } q>pt,\\ CE|X_1|^{pt}\log(1+|X_1|) & \text{if } q=pt,\\ CE|X_1|^{pt} & \text{if } q<pt, \end{cases}\\ &n^{1-1/p}E|X_1|I(|X_1|>n^{1/p})\le n^{1-t}E|X_1|^{pt}I(|X_1|>n^{1/p}) \quad\text{if } pt\ge 1, \end{aligned}$$
where $C$ is a positive constant which is not necessarily the same in each appearance. Hence, the conditions (i)-(iv) of Theorem 2.2 hold if we take $s>\max\{pt,2,2q/r\}$. Under the additional conditions that $\max\{q,pt\}<2$ and $\sum_{n=1}^{\infty}\varphi^{1/2}(2^{n})<\infty$, all conditions of Theorem 2.1 hold if we take $s=2$. Therefore, the result follows from Theorems 2.1 and 2.2 once we show that the condition (v) of Theorem 2.2 holds when $t>1$ or $\max\{q,pt\}\ge 2$. To do this, we take $r=2$ if $\max\{q,pt\}\ge 2$ and $r=\max\{q,pt\}$ if $\max\{q,pt\}<2$. If $t>1$ or $\max\{q,pt\}\ge 2$, then $r>p$ and so we can choose $s>2$ large enough such that $t-1+(1-r/p)s/2<0$. Then
$$\sum_{n=1}^{\infty} b_n\gamma_s(n)\Bigl(\sum_{i=1}^{n}a_n^{-r}E|X_{ni}|^{r}\Bigr)^{s/2}=\bigl(E|X_1|^{r}\bigr)^{s/2}\sum_{n=1}^{\infty}\gamma_s(n)\,n^{t-2+(1-r/p)s/2}<\infty.$$

Hence the condition (v) of Theorem 2.2 holds. □
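As a numerical check with illustrative values (not taken from the paper), let $p=1$, $q=1$, and $t=3/2$, so that $pt=3/2$, $\max\{q,pt\}=3/2<2$, and $t>1$. Then $r=\max\{q,pt\}=3/2>p$, the requirement $t-1+(1-r/p)s/2<0$ becomes $1/2-s/4<0$, i.e., $s>2$, and $s>\max\{pt,2,2q/r\}=2$ as well; for any such $s$ the exponent in the last display is
$$t-2+(1-r/p)s/2=-\tfrac{1}{2}-\tfrac{s}{4}<-1,$$
so the series converges for slowly varying $\gamma_s$.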

Let $\{\Psi_n(x), n\ge 1\}$ be a sequence of positive even functions satisfying
$$\frac{\Psi_n(|x|)}{|x|^{q}}\uparrow \quad\text{and}\quad \frac{\Psi_n(|x|)}{|x|^{s}}\downarrow \quad\text{as } |x|\uparrow \tag{3.1}$$
for some $1\le q<s$.
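A standard example (not from the paper): $\Psi_n(x)=|x|^{p}$ with $q\le p\le s$ satisfies the monotonicity conditions in (3.1), since
$$\frac{\Psi_n(|x|)}{|x|^{q}}=|x|^{p-q}\uparrow \quad\text{and}\quad \frac{\Psi_n(|x|)}{|x|^{s}}=|x|^{p-s}\downarrow \quad\text{as } |x|\uparrow.$$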

Corollary 3.2 Let $\{\Psi_n(x), n\ge 1\}$ be a sequence of positive even functions satisfying (3.1) for some $1\le q<s\le 2$. Let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of random variables satisfying $EX_{ni}=0$ for $1\le i\le n$ and $n\ge 1$, and (2.1) for some constant function $\alpha_s(x)$. Let $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold:
  (i) $\displaystyle\sum_{n=1}^{\infty} b_n\sum_{i=1}^{n}E\Psi_i(|X_{ni}|)/\Psi_i(a_n)<\infty$;
  (ii) $\displaystyle\sum_{i=1}^{n}E\Psi_i(|X_{ni}|)/\Psi_i(a_n)\to 0$.

Then (1.5) holds.

Proof First note by $\Psi_i(|x|)/|x|^{q}\uparrow$ that $\Psi_i(|x|)$ is an increasing function. Since $\Psi_i(|x|)/|x|^{s}\downarrow$,
$$\frac{|X_{ni}|^{s}I(|X_{ni}|\le a_n)}{a_n^{s}}\le \frac{\Psi_i\bigl(|X_{ni}|I(|X_{ni}|\le a_n)\bigr)}{\Psi_i(a_n)}\le \frac{\Psi_i(|X_{ni}|)}{\Psi_i(a_n)}.$$
Since $q\ge 1$ and $\Psi_i(|x|)/|x|^{q}\uparrow$,
$$\frac{|X_{ni}|I(|X_{ni}|>a_n)}{a_n}\le \frac{|X_{ni}|^{q}I(|X_{ni}|>a_n)}{a_n^{q}}\le \frac{\Psi_i\bigl(|X_{ni}|I(|X_{ni}|>a_n)\bigr)}{\Psi_i(a_n)}\le \frac{\Psi_i(|X_{ni}|)}{\Psi_i(a_n)}.$$

It follows that all conditions of Theorem 2.1 are satisfied and so the result follows from Theorem 2.1. □
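Spelling this out: taking expectations in the two displays and using conditions (i) and (ii) with $\alpha_s(x)\equiv\alpha_s$ constant,
$$\sum_{n=1}^{\infty} b_n a_n^{-s}\alpha_s\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)\le \alpha_s\sum_{n=1}^{\infty} b_n\sum_{i=1}^{n}\frac{E\Psi_i(|X_{ni}|)}{\Psi_i(a_n)}<\infty,$$
which is condition (ii) of Theorem 2.1; conditions (iii) and (iv) of Theorem 2.1 follow in the same way from the second display together with (i) and (ii).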

Corollary 3.3 Let $\{\Psi_n(x), n\ge 1\}$ be a sequence of positive even functions satisfying (3.1) for some $q\ge 1$ and $s>\max\{2,q\}$. Let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of random variables satisfying $EX_{ni}=0$ for $1\le i\le n$ and $n\ge 1$, and (2.2) for some constant functions $\beta_s(x)$ and $\gamma_s(x)$. Let $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold:
  (i) $\displaystyle\sum_{n=1}^{\infty} b_n\sum_{i=1}^{n}E\Psi_i(|X_{ni}|)/\Psi_i(a_n)<\infty$;
  (ii) $\displaystyle\sum_{i=1}^{n}E\Psi_i(|X_{ni}|)/\Psi_i(a_n)\to 0$;
  (iii) $\displaystyle\sum_{n=1}^{\infty} b_n\Bigl(\sum_{i=1}^{n}a_n^{-2}E|X_{ni}|^{2}\Bigr)^{s/2}<\infty$.

Then (1.5) holds.

Proof The proof is similar to that of Corollary 3.2. By the proof of Corollary 3.2 and the condition (iii), all conditions of Theorem 2.2 are satisfied and so the result follows from Theorem 2.2. □

Remark 3.1 When $b_n=1$ for $n\ge 1$, the condition (i) of Corollaries 3.2 and 3.3 reduces to the condition $\sum_{n=1}^{\infty}\sum_{i=1}^{n}E\Psi_i(|X_{ni}|)/\Psi_i(a_n)<\infty$, and so the condition (ii) of Corollaries 3.2 and 3.3 follows from this reduced condition. For a sequence of $\rho^{*}$-mixing random variables, (2.1) holds for some constant function $\alpha_s(x)$ if $s=2$, and (2.2) holds for some constant functions $\beta_s(x)$ and $\gamma_s(x)$ if $s>2$ (see Remark 2.1(5)). Wu et al. [9] proved Corollaries 3.2 and 3.3 when $b_n=1$ for $n\ge 1$ and $\{X_{ni}\}$ is an array of rowwise $\rho^{*}$-mixing random variables.

Declarations

Acknowledgements

This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (2010-0013131).

Authors’ Affiliations

(1)
Department of Applied Mathematics, Pai Chai University, Taejon, South Korea

References

  1. Hsu PL, Robbins H: Complete convergence and the law of large numbers. Proc. Natl. Acad. Sci. USA 1947, 33: 25–31. 10.1073/pnas.33.2.25
  2. Erdös P: On a theorem of Hsu and Robbins. Ann. Math. Stat. 1949, 20: 286–291. 10.1214/aoms/1177730037
  3. Baum LE, Katz M: Convergence rates in the law of large numbers. Trans. Am. Math. Soc. 1965, 120: 108–123. 10.1090/S0002-9947-1965-0198524-1
  4. Chow YS: On the rate of moment convergence of sample sums and extremes. Bull. Inst. Math. Acad. Sin. 1988, 16: 177–201.
  5. Li D, Spătaru A: Refinement of convergence rates for tail probabilities. J. Theor. Probab. 2005, 18: 933–947. 10.1007/s10959-005-7534-2
  6. Chen PY, Wang DC: Convergence rates for probabilities of moderate deviations for moving average processes. Acta Math. Sin. 2008, 24: 611–622. 10.1007/s10114-007-6062-7
  7. Chen PY, Wang DC: Complete moment convergence for sequences of identically distributed φ-mixing random variables. Acta Math. Sin. 2010, 26: 679–690. 10.1007/s10114-010-7625-6
  8. Zhou XC, Lin JG: Complete q-moment convergence of moving average processes under φ-mixing assumption. J. Math. Res. Expo. 2011, 31: 687–697.
  9. Wu Y, Wang C, Volodin A: Limiting behavior for arrays of rowwise ρ*-mixing random variables. Lith. Math. J. 2012, 52: 214–221. 10.1007/s10986-012-9168-2
  10. Sung SH: Moment inequalities and complete moment convergence. J. Inequal. Appl. 2009, 2009: Article ID 271265
  11. Shao QM: A comparison theorem on moment inequalities between negatively associated and independent random variables. J. Theor. Probab. 2000, 13: 343–356. 10.1023/A:1007849609234
  12. Asadian N, Fakoor V, Bozorgnia A: Rosenthal's type inequalities for negatively orthant dependent random variables. J. Iran. Stat. Soc. 2006, 5: 69–75.
  13. Móricz F: Moment inequalities and the strong laws of large numbers. Z. Wahrscheinlichkeitstheor. Verw. Geb. 1976, 35: 299–314. 10.1007/BF00532956
  14. Shao QM: A moment inequality and its applications. Acta Math. Sin. Chin. Ser. 1988, 31: 736–747.
  15. Shao QM: Maximal inequalities for partial sums of ρ-mixing sequences. Ann. Probab. 1995, 23: 948–965. 10.1214/aop/1176988297
  16. Utev S, Peligrad M: Maximal inequalities and an invariance principle for a class of weakly dependent random variables. J. Theor. Probab. 2003, 16: 101–115. 10.1023/A:1022278404634

Copyright

© Sung; licensee Springer 2013

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
