Convergence for sums of i.i.d. random variables under sublinear expectations

In this paper, we establish equivalent conditions for the complete moment convergence of the maximum of partial weighted sums of independent, identically distributed random variables in sublinear expectation spaces. As applications, Baum-Katz type results for the maximum of partial weighted sums of independent, identically distributed random variables are obtained under sublinear expectations. These results extend the known equivalent conditions for complete moment convergence of the maximum from classical linear expectation spaces to the sublinear setting.


Introduction
Peng [7,8] initiated the concept of the sublinear expectation space to study uncertainty in probabilities and distributions. These seminal works prompted extensive research on inequalities and limit theorems under sublinear expectations. Zhang [12,13,14] obtained important inequalities, including exponential inequalities and Rosenthal's inequalities, and studied Donsker's invariance principle under sublinear expectations. Inspired by the works of Zhang [12,13,14,15], Huang and Wu [5] and Zhong and Wu [16] studied further limit theorems in sublinear expectation spaces. Recently, under sublinear expectations, Wu [10] proved precise asymptotics for complete integral convergence, and Xu and Cheng [11] established precise asymptotics in the law of the iterated logarithm. For more limit theorems under sublinear expectations, the interested reader may refer to Chen [1], Xu [2], Hu et al. [3], Hu and Yang [4], and the references therein.
Recently, Meng et al. [6] studied convergence for sums of asymptotically almost negatively associated random variables. For references on complete convergence in linear expectation spaces, the interested reader may refer to Meng et al. [6], Shen and Wu [9], and the references therein. The work of Meng et al. [6] motivated us to ask whether the equivalent conditions for complete moment convergence of the maximum of partial weighted sums of independent, identically distributed random variables remain valid under sublinear expectations. In this paper we show that they do, which extends the results of Meng et al. [6] to the sublinear expectation setting.
We organize the rest of this paper as follows. In the next section, we recall the necessary notions, concepts, and relevant properties, and present the lemmas needed under sublinear expectations. In Section 3, we state our main results, Theorems 3.1 and 3.2, whose proofs are given in Section 4.

Preliminaries
We use the notation of Peng [8]. Suppose that (Ω, F) is a given measurable space. Let H be a subset of all random variables on (Ω, F) such that I_A ∈ H for every A ∈ F, and such that X_1, …, X_n ∈ H implies ϕ(X_1, …, X_n) ∈ H for each locally Lipschitz function ϕ ∈ C_{l,Lip}(R^n). A sublinear expectation E on H is a functional E : H → R satisfying the following properties: for all X, Y ∈ H,
(a) monotonicity: X ≥ Y implies E[X] ≥ E[Y];
(b) constant preserving: E[c] = c for all c ∈ R;
(c) positive homogeneity: E[λX] = λE[X] for all λ ≥ 0;
(d) subadditivity: E[X + Y] ≤ E[X] + E[Y].
The triple (Ω, H, E) is called a sublinear expectation space. Here, we refer to (a)-(d) as the monotonicity, constant preserving, positive homogeneity, and subadditivity of E[·], respectively.
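To see why such functionals arise naturally, recall Peng's representation theorem [8]: every sublinear expectation can be written as an upper expectation over a family of linear expectations. As a minimal sketch, consider a hypothetical two-element family {E_{θ_1}, E_{θ_2}} of linear expectations, chosen purely for illustration, and set
\[
  E[X] := \max\bigl(E_{\theta_1}[X],\ E_{\theta_2}[X]\bigr).
\]
Then subadditivity (d) holds because
\[
  E[X+Y] = \max_{i=1,2}\bigl(E_{\theta_i}[X] + E_{\theta_i}[Y]\bigr)
  \le \max_{i=1,2} E_{\theta_i}[X] + \max_{i=1,2} E_{\theta_i}[Y]
  = E[X] + E[Y],
\]
while monotonicity (a), constant preserving (b), and positive homogeneity (c) are inherited directly from each linear E_{θ_i}.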
In this paper, given a sublinear expectation space (Ω, H, E), we define a capacity by
\[
  V(A) := E[I_A], \qquad \forall A \in F.
\]
Clearly V is a sub-additive capacity. We define the Choquet expectation C_V by
\[
  C_V[X] := \int_0^{\infty} V(X \ge t)\,dt + \int_{-\infty}^{0} \bigl(V(X \ge t) - 1\bigr)\,dt.
\]
A random vector Y = (Y_1, …, Y_n) is said to be independent of another random vector X = (X_1, …, X_m) under E if for each test function ψ ∈ C_{l,Lip}(R^m × R^n) we have
\[
  E[\psi(X, Y)] = E\bigl[\,E[\psi(x, Y)]\big|_{x=X}\,\bigr],
\]
whenever \bar{ψ}(x) := E[|ψ(x, Y)|] < ∞ for each x and E[|\bar{ψ}(X)|] < ∞ (cf. Definition 2.5 in Chen [1]). {X_n}_{n=1}^∞ is said to be a sequence of independent random variables if X_{n+1} is independent of (X_1, …, X_n) for each n ≥ 1.
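Two elementary consequences of these definitions are used repeatedly in what follows; we record them here as a sketch. Since I_{{|X| ≥ x}} ≤ |X|^p / x^p pointwise, monotonicity and positive homogeneity yield a Markov-type inequality, and for a nonnegative integrand the second integral in the definition of C_V vanishes:
\[
  V(|X| \ge x) = E\bigl[I_{\{|X| \ge x\}}\bigr] \le \frac{E[|X|^p]}{x^p}, \qquad x > 0,\ p > 0,
\]
\[
  C_V[|X|^p] = \int_0^{\infty} V(|X|^p \ge t)\,dt .
\]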
Assume that X_1 and X_2 are two n-dimensional random vectors defined, respectively, in sublinear expectation spaces (Ω_1, H_1, E_1) and (Ω_2, H_2, E_2). They are said to be identically distributed if
\[
  E_1[\psi(X_1)] = E_2[\psi(X_2)]
\]
for every Borel-measurable function ψ such that ψ(X_1) ∈ H_1 and ψ(X_2) ∈ H_2, whenever the sublinear expectations are finite. {X_n}_{n=1}^∞ is said to be identically distributed if, for each i ≥ 1, X_i and X_1 are identically distributed.
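For instance, taking ψ(x) = |x|^p with p ≥ 1 (a Borel-measurable, locally Lipschitz choice of test function) shows that identically distributed random variables share all such moments:
\[
  E_1[|X_1|^p] = E_2[|X_2|^p], \qquad p \ge 1,
\]
whenever these expectations are finite.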
Throughout this paper we assume that E is countably sub-additive, i.e.,
\[
  E(X) \le \sum_{n=1}^{\infty} E(X_n) \quad \text{whenever} \quad X \le \sum_{n=1}^{\infty} X_n,
\]
with X, X_n ∈ H and X ≥ 0, X_n ≥ 0, n = 1, 2, …. C represents a positive constant, which may differ from line to line. I(A) or I_A stands for the indicator function of A; a_n ≪ b_n means that there exists a constant C > 0 such that a_n ≤ C b_n for n large enough; and a_n ≈ b_n means that both a_n ≪ b_n and b_n ≪ a_n. We write log x for ln max{e, x}.
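Countable sub-additivity of E passes to V (take X = I_{∪_n A_n} and X_n = I_{A_n}), which is exactly what makes Borel-Cantelli type arguments available under capacities; the following is a sketch of the standard deduction. If Σ_{n=1}^∞ V(A_n) < ∞, then for each m,
\[
  V\Bigl(\limsup_{n\to\infty} A_n\Bigr)
  \le V\Bigl(\bigcup_{n=m}^{\infty} A_n\Bigr)
  \le \sum_{n=m}^{\infty} V(A_n) \longrightarrow 0 \qquad (m \to \infty),
\]
so that V(lim sup_n A_n) = 0.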

Lemma 2.2 (cf. Corollary 2.2 and Theorem 2.3 in Zhang [14]) Suppose that

Then for any α > 0, γ > 0, and β < −1,

Proof (i) By Lemma 4.5 in Zhang [13],

(ii) By Lemma 4.5 in Zhang [13] and the proof of Lemma 2.2 in Zhong and Wu [16],

Lemma 2.5 Let {X_n; n ≥ 1} be a sequence of independent random variables in the sublinear expectation space (Ω, H, E). Then condition (2.5), holding for all x > 0, implies that there exists a constant C such that (2.6) holds for all x > 0 and all n large enough.

Proof We borrow the idea from Shen and Wu [9]. Write A_k = {|a_{nk}X_k| > x} and

Without loss of generality, we may assume that β_n > 0. Since

(2.7)

By (2.7), the independence of I(A_k), k = 1, …, n, the subadditivity of the sublinear expectation, and Hölder's inequality under sublinear expectations, we conclude that

which, combined with (2.5), yields (2.6) immediately. The proof is finished.
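For completeness, we note that the Hölder inequality invoked in the last step takes the same form as in the classical case and follows from subadditivity together with Young's inequality (see, e.g., Peng [8]):
\[
  E[|XY|] \le \bigl(E[|X|^p]\bigr)^{1/p} \bigl(E[|Y|^q]\bigr)^{1/q},
  \qquad p, q > 1, \quad \frac{1}{p} + \frac{1}{q} = 1 .
\]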

Main results
Throughout the rest of this paper, we assume that {X_n, n ≥ 1} is a sequence of independent random variables, identically distributed as X, in the sublinear expectation space (Ω, H, E) with E(X_i) = −E(−X_i) = 0, i = 1, 2, …; that is, each X_i has both upper and lower expectation equal to zero. Our main results are as follows.
Consequently, the proof of Theorem 3.1 is finished.
Proof [Proof of Corollary 3.1] The proof parallels that of Theorem 3.1, following the argument of Theorem 16 in Meng et al. [6], and is therefore omitted.
Proof [Proof of Theorem 3.2] Taking b_{ni} = A_{n−i}^{α−1}/A_n^{α}, 0 ≤ i ≤ n, n ≥ 1, and arguing as in the proof of Theorem 18 in Meng et al. [6], we deduce that the assumptions of Corollary 3.1 are satisfied. Hence (3.8) follows from (3.4). The proof of Theorem 3.2 is finished.
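The weights above are the classical Cesàro weights. Assuming the standard Cesàro-mean notation A_n^{α} = \binom{n+α}{n} (an assumption made here only for illustration), for α > 0 the weights b_{ni} are positive, sum to one by the classical identity Σ_{k=0}^{n} A_k^{α−1} = A_n^{α}, and A_n^{α} grows like n^{α}:
\[
  A_n^{\alpha} = \binom{n+\alpha}{n} = \frac{(\alpha+1)(\alpha+2)\cdots(\alpha+n)}{n!},
  \qquad
  \sum_{i=0}^{n} b_{ni} = \frac{1}{A_n^{\alpha}} \sum_{k=0}^{n} A_k^{\alpha-1} = 1,
  \qquad
  A_n^{\alpha} \approx \frac{n^{\alpha}}{\Gamma(\alpha+1)} .
\]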