Convergence for sums of i.i.d. random variables under sublinear expectations

In this paper we obtain equivalent conditions for the complete moment convergence of the maximum of partial weighted sums of independent identically distributed random variables in sublinear expectation spaces. The results extend the corresponding equivalent conditions for complete moment convergence of the maximum from the classical (linear) expectation setting.


Introduction
Peng [7,8] introduced the important concept of the sublinear expectation space to study uncertainty in probabilities and distributions. These seminal works have attracted many researchers to the study of inequalities and limit theorems under sublinear expectations. Zhang [12][13][14] obtained important inequalities, including exponential and Rosenthal-type inequalities, and studied Donsker's invariance principle under sublinear expectations. Inspired by the works of Zhang [12][13][14][15], Huang and Wu [5] and Zhong and Wu [16] studied some limit theorems in sublinear expectation spaces. Recently, under sublinear expectations, Wu [10] proved precise asymptotics for complete integral convergence, and Xu and Cheng [11] established precise asymptotics in the law of the iterated logarithm. For more limit theorems under sublinear expectations, the interested reader may refer to Chen [1], Xu [2], Hu et al. [3], Hu and Yang [4], and the references therein.
Recently, Meng et al. [6] studied the convergence of sums of asymptotically almost negatively associated random variables. For references on complete convergence in the linear expectation setting, the interested reader may refer to Meng et al. [6], Shen and Wu [9], and the references therein. The work of Meng et al. [6] motivates us to ask whether the equivalent conditions for complete moment convergence of the maximum of partial weighted sums of independent identically distributed random variables remain valid under sublinear expectations. In this paper we show that they do, thereby complementing the results of Meng et al. [6] by extending them to the sublinear expectation setting.
We organize the rest of this paper as follows. In the next section, we recall necessary notions, concepts, and relevant properties and present necessary lemmas under sublinear expectations. In Sect. 3, we present our main results, Theorems 3.1 and 3.2, whose proofs are given in Sect. 4.

Preliminaries
We use the notation of Peng [8]. Let (Ω, F) be a measurable space, and let H be a subset of all random variables on (Ω, F) such that I_A ∈ H for every A ∈ F, and such that X_1, . . . , X_n ∈ H implies ϕ(X_1, . . . , X_n) ∈ H for each ϕ ∈ C_{l,Lip}(R^n), the space of locally Lipschitz functions ϕ satisfying |ϕ(x) − ϕ(y)| ≤ C(1 + |x|^m + |y|^m)|x − y| for all x, y ∈ R^n, for some C > 0 and m ∈ N depending on ϕ.
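For completeness, we recall that a sublinear expectation E : H → R in the sense of Peng [8] is a functional satisfying the following four standard axioms (stated here as background):

```latex
\mathbb{E}[X] \le \mathbb{E}[Y] \quad \text{whenever } X \le Y,
  \tag{monotonicity} \\
\mathbb{E}[c] = c \quad \text{for all constants } c \in \mathbb{R},
  \tag{constant preservation} \\
\mathbb{E}[X + Y] \le \mathbb{E}[X] + \mathbb{E}[Y],
  \tag{sub-additivity} \\
\mathbb{E}[\lambda X] = \lambda\, \mathbb{E}[X] \quad \text{for } \lambda \ge 0.
  \tag{positive homogeneity}
```

The triple (Ω, H, E) is then called a sublinear expectation space.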
In this paper, given a sublinear expectation space (Ω, H, E), we define the capacity V(A) := E[I_A] for A ∈ F. Clearly, V is subadditive. The Choquet expectation C_V is defined by C_V(X) := ∫_0^∞ V(X > x) dx + ∫_{−∞}^0 (V(X > x) − 1) dx. Given two random vectors X = (X_1, . . . , X_m) ∈ H^m and Y = (Y_1, . . . , Y_n) ∈ H^n, Y is said to be independent of X if E[ϕ(X, Y)] = E[E[ϕ(x, Y)]|_{x=X}] for each ϕ ∈ C_{l,Lip}(R^{m+n}), whenever the expectations involved are finite. A sequence {X_n}_{n=1}^∞ is said to be a sequence of independent random variables if X_{n+1} is independent of (X_1, . . . , X_n) for each n ≥ 1.
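A canonical finite example of a sublinear expectation is the upper expectation over a family of probability measures, E[X] = max_{P ∈ P} E_P[X], whose associated capacity V(A) = E[I_A] is the upper probability of A. A minimal numerical sketch (the two-point sample space, the family `P_FAMILY`, and the payoff values are ours, for illustration only):

```python
# Upper expectation over a finite family of probability vectors on a
# two-point space {0, 1}: E[X] = max_P (P(0) X(0) + P(1) X(1)).
P_FAMILY = [
    (0.5, 0.5),  # a "fair" measure
    (0.3, 0.7),  # a "biased" measure
]

def upper_E(x):
    """Upper expectation of the payoff vector x over P_FAMILY."""
    return max(p0 * x[0] + p1 * x[1] for p0, p1 in P_FAMILY)

X = (1.0, -1.0)
Y = (-1.0, 1.0)

# Sub-additivity: E[X + Y] <= E[X] + E[Y].  Here the inequality is
# strict because different measures attain the two maxima.
Z = (X[0] + Y[0], X[1] + Y[1])  # = (0.0, 0.0)
assert upper_E(Z) <= upper_E(X) + upper_E(Y)

# The capacity V(A) = E[I_A] is the upper probability of A:
V_outcome0 = upper_E((1.0, 0.0))  # indicator of the first outcome
```

Note that upper_E is sublinear but not linear: upper_E(X) = 0.0 while upper_E(−X) = upper_E(Y) = 0.4, so in general E[X] ≠ −E[−X].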
Let X_1 and X_2 be two n-dimensional random vectors defined, respectively, on sublinear expectation spaces (Ω_1, H_1, E_1) and (Ω_2, H_2, E_2). They are said to be identically distributed if E_1[ψ(X_1)] = E_2[ψ(X_2)] for every Borel-measurable function ψ such that ψ(X_1) ∈ H_1 and ψ(X_2) ∈ H_2, whenever the sublinear expectations are finite. A sequence {X_n}_{n=1}^∞ is said to be identically distributed if X_i and X_1 are identically distributed for each i ≥ 1.
Throughout this paper, we assume that E is countably subadditive, that is, E(X) ≤ Σ_{n=1}^∞ E(X_n) whenever X ≤ Σ_{n=1}^∞ X_n with X, X_n ∈ H and X ≥ 0, X_n ≥ 0, n = 1, 2, . . . . By C we denote a positive constant, which may differ from place to place; I(A) or I_A stands for the indicator function of A; a_n ≲ b_n means that there exists a constant C > 0 such that a_n ≤ C b_n for all sufficiently large n, and a_n ≈ b_n means that a_n ≲ b_n and b_n ≲ a_n. We write log x := ln max{e, x}.
To establish our results, we need the following lemmas.

Lemma 2.4 Let Y be a random variable on a sublinear expectation space (Ω, H, E). Then for any α > 0, γ > 0, and β < −1,

(ii) By Lemma 2.3 and the proof of Lemma 2.2 in Zhong and Wu [16] we have

Proof We borrow the idea from Shen and Wu [9]. Write A_k = (|X_k| > x). Without loss of generality, we may assume that β_n > 0. Since

which immediately yields (2.4). The proof is finished.

Main results
Throughout the rest of this paper, we assume that {X_n, n ≥ 0} is a sequence of independent random variables, identically distributed as X, on a sublinear expectation space (Ω, H, E) with E(X_i) = −E(−X_i) = 0 for i = 0, 1, 2, . . . . Our main results are as follows.
Taking α = 1 in Theorem 3.2, we obtain the following corollary.
Remark 3.1 Under the same assumptions as in Theorem 3.1, we obtain, for all ε > 0,

Proofs of Theorems 3.1 and 3.2
Proof of Theorem 3.1 Here we borrow the idea of the proof of Theorem 16 in Meng et al. [6]. We first prove that (3.1) implies (3.2). For all 1 ≤ i ≤ n, n ≥ 1, write

E[|b_ni X_i| I(|b_ni X_i| > 1)] =: I + II + III.
Thus, to prove (3.2), it suffices to show that I < ∞, II < ∞, and III < ∞. We first prove I < ∞. For fixed n ≥ 1, since {Y_ni − E[Y_ni], 1 ≤ i ≤ n} is a sequence of independent identically distributed random variables on the sublinear expectation space (Ω, H, E), for M ≥ 2, combining Lemma 2.2, the C_r inequality, Markov's inequality, and Jensen's inequality under sublinear expectations yields
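For reference, the standard forms of the inequalities invoked here, under a sublinear expectation E with capacity V(A) = E[I_A], are as follows (stated as background, with generic exponents p, r not tied to the constants of this paper):

```latex
% Markov's inequality under the capacity V (by monotonicity of E):
\mathbb{V}\bigl(|X| \ge x\bigr) \le \frac{\mathbb{E}\bigl[|X|^{p}\bigr]}{x^{p}},
  \qquad x > 0,\ p > 0, \\[4pt]
% the C_r inequality (a pointwise inequality, preserved by monotonicity):
\mathbb{E}\bigl[|X + Y|^{r}\bigr]
  \le C_r \bigl(\mathbb{E}\bigl[|X|^{r}\bigr] + \mathbb{E}\bigl[|Y|^{r}\bigr]\bigr),
  \qquad C_r = \max\{1, 2^{\,r-1}\}, \\[4pt]
% and Jensen's inequality for r \ge 1:
\bigl(\mathbb{E}\bigl[|X|\bigr]\bigr)^{r} \le \mathbb{E}\bigl[|X|^{r}\bigr].
```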
Proof of Corollary 3.1 The proof is similar to that of Theorem 16 in Meng et al. [6]. For the reader's convenience, we give a brief explanation. For 0 ≤ i ≤ n − 1, n ≥ 1, write

Y_ni = −(1/b_ni) I(b_ni X_i < −1) + X_i I(|b_ni X_i| ≤ 1) + (1/b_ni) I(b_ni X_i > 1).
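For b_ni > 0, the truncation Y_ni above is simply X_i clipped to the interval [−1/b_ni, 1/b_ni]. A minimal numerical sketch of this equivalence (the function name `truncate` and the sample values are ours, for illustration only):

```python
def truncate(x, b):
    """The three-indicator truncation from the proof:
    Y = -1/b if b*x < -1;  Y = x if |b*x| <= 1;  Y = 1/b if b*x > 1.

    Assumes b > 0, in which case this equals clipping x to [-1/b, 1/b].
    """
    if b * x < -1:
        return -1.0 / b
    if b * x > 1:
        return 1.0 / b
    return x

# The indicator form agrees with plain clipping at the level +/- 1/b:
b = 0.5
for x in [-5.0, -2.0, 0.3, 2.0, 7.0]:
    clipped = max(-1.0 / b, min(x, 1.0 / b))
    assert truncate(x, b) == clipped
```

Writing the truncation this way makes |Y_ni| ≤ 1/b_ni pointwise, which is what allows the moment bounds for I, II, and III in the proof.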
The rest of the proof is similar to that of Theorem 3.1 and is therefore omitted.