Convergence for sums of i.i.d. random variables under sublinear expectations

Abstract

In this paper, we obtain equivalent conditions for the complete moment convergence of the maximum of partial weighted sums of independent, identically distributed random variables in sublinear expectation spaces. The results extend the corresponding equivalent conditions for complete moment convergence of the maximum from the classical linear expectation setting.

1 Introduction

Peng [7, 8] introduced the important concept of the sublinear expectation space to study uncertainty in probabilities and distributions. The seminal works of Peng [7, 8] stimulated the study of inequalities and limit theorems under sublinear expectations. Zhang [12–14] obtained important inequalities, including exponential and Rosenthal-type inequalities, and studied Donsker’s invariance principle under sublinear expectations. Inspired by the works of Zhang [12–15], Huang and Wu [5] and Zhong and Wu [16] studied limit theorems in sublinear expectation spaces. Recently, under sublinear expectations, Wu [10] proved precise asymptotics for complete integral convergence, and Xu and Cheng [11] established precise asymptotics in the law of the iterated logarithm. For more limit theorems under sublinear expectations, the interested reader can refer to Chen [1], Gao and Xu [2], Hu et al. [3], Hu and Yang [4], and the references therein.

Recently, Meng et al. [6] studied convergence for sums of asymptotically almost negatively associated random variables. For references on complete convergence in the linear expectation setting, the interested reader can refer to Meng et al. [6], Shen and Wu [9], and the references therein. The work of Meng et al. [6] motivates us to ask whether the equivalent conditions for complete moment convergence of the maximum of partial weighted sums of independent, identically distributed random variables remain valid under sublinear expectations. In this paper we show that they do, which complements the results of Meng et al. [6] by extending them to the sublinear expectation setting.

We organize the rest of this paper as follows. In the next section, we recall the necessary notions and properties and present the lemmas needed under sublinear expectations. In Sect. 3, we present our main results, Theorems 3.1 and 3.2, whose proofs are given in Sect. 4.

2 Preliminaries

We use the notation of Peng [8]. Let \((\Omega ,\mathcal{F})\) be a measurable space, and let \(\mathcal{H}\) be a set of random variables on \((\Omega ,\mathcal{F})\) such that \(I_{A}\in \mathcal{H}\) for every \(A\in \mathcal{F}\) and such that \(X_{1},\ldots ,X_{n}\in \mathcal{H}\) implies \(\varphi (X_{1},\ldots ,X_{n})\in \mathcal{H}\) for each locally Lipschitz function \(\varphi \in \mathcal{C}_{l,\mathrm{Lip}}(\mathbb{R}^{n})\), that is, each φ satisfying

$$ \bigl\vert \varphi (\mathbf{x})-\varphi (\mathbf{y}) \bigr\vert \le C \bigl(1+ \vert \mathbf{x} \vert ^{m}+ \vert \mathbf{y} \vert ^{m} \bigr) \vert \mathbf{x}-\mathbf{y} \vert ,\quad \forall \mathbf{x}, \mathbf{y}\in \mathbb{R}^{n}, $$

for some \(C>0\) and \(m\in \mathbb{N}\), which depend on φ.
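
For example, the function \(\varphi (x)=x^{2}\) belongs to \(\mathcal{C}_{l,\mathrm{Lip}}(\mathbb{R})\) with \(C=1\) and \(m=1\), since

$$ \bigl\vert x^{2}-y^{2} \bigr\vert = \vert x+y \vert \vert x-y \vert \le \bigl(1+ \vert x \vert + \vert y \vert \bigr) \vert x-y \vert . $$

More generally, every polynomial on \(\mathbb{R}^{n}\) lies in \(\mathcal{C}_{l,\mathrm{Lip}}(\mathbb{R}^{n})\), whereas functions of superpolynomial growth, such as \(\mathrm{e}^{x}\), do not.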

Definition 2.1

A sublinear expectation \(\mathbb{E}\) on \(\mathcal{H}\) is a functional \(\mathbb{E}:\mathcal{H}\to \bar{\mathbb{R}}:=[-\infty ,\infty ]\) satisfying the following properties: for all \(X,Y\in \mathcal{H}\), we have

  1. (a)

    If \(X\ge Y\), then \(\mathbb{E}[X]\ge \mathbb{E}[Y]\);

  2. (b)

    \(\mathbb{E}[c]=c\), \(\forall c\in \mathbb{R}\);

  3. (c)

    \(\mathbb{E}[\lambda X]=\lambda \mathbb{E}[X]\), \(\forall \lambda \ge 0\);

  4. (d)

    \(\mathbb{E}[X+Y]\le \mathbb{E}[X]+\mathbb{E}[Y]\) whenever \(\mathbb{E}[X]+\mathbb{E}[Y]\) is not of the form \(\infty -\infty \) or \(-\infty +\infty \).

We refer to (a)–(d) as monotonicity, constant preservation, positive homogeneity, and subadditivity of \(\mathbb{E}[\cdot ]\), respectively.
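
A canonical example of a sublinear expectation is the upper expectation over a family \(\mathcal{P}\) of probability measures on \((\Omega ,\mathcal{F})\),

$$ \mathbb{E}[X]:=\sup_{P\in \mathcal{P}}E_{P}[X],\quad X\in \mathcal{H}. $$

Properties (a)–(c) are inherited from each \(E_{P}\), and (d) holds since the supremum of a sum is at most the sum of the suprema (cf. Peng [8]).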

A set function \(V:\mathcal{F}\to [0,1]\) is called a capacity if

  1. (a)

    \(V(\emptyset )=0\), \(V(\Omega )=1\);

  2. (b)

    \(V(A)\le V(B)\), \(A\subset B\), \(A,B\in \mathcal{F}\).

A capacity V is said to be subadditive if \(V(A\cup B)\le V(A)+V(B)\), \(A,B\in \mathcal{F}\).

In this paper, given a sublinear expectation space \((\Omega , \mathcal{H}, \mathbb{E})\), we set the capacity \(\mathbb{V}(A):=\mathbb{E}[I_{A}]\) for \(A\in \mathcal{F}\). Clearly, \(\mathbb{V}\) is subadditive. We define the Choquet expectation \(\mathcal{C}_{\mathbb{V}}\) by

$$ \mathcal{C}_{\mathbb{V}}(X):= \int _{0}^{\infty }\mathbb{V}(X>x) \,\mathrm{d}x + \int _{-\infty }^{0} \bigl(\mathbb{V}(X>x)-1 \bigr) \,\mathrm{d}x. $$
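
For instance, for an indicator \(X=I_{A}\) with \(A\in \mathcal{F}\), we have \(\mathbb{V}(I_{A}>x)=\mathbb{V}(A)\) for \(0\le x<1\) and \(\mathbb{V}(I_{A}>x)=0\) for \(x\ge 1\), so

$$ \mathcal{C}_{\mathbb{V}}(I_{A})= \int _{0}^{1}\mathbb{V}(A) \,\mathrm{d}x=\mathbb{V}(A)=\mathbb{E}[I_{A}]. $$

For general \(X\in \mathcal{H}\), however, \(\mathcal{C}_{\mathbb{V}}(X)\) and \(\mathbb{E}[X]\) need not coincide (cf. Lemma 2.3 below).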

Given two random vectors \(\mathbf{X}=(X_{1},\ldots , X_{m})\), \(X_{i}\in \mathcal{H}\), and \(\mathbf{Y}=(Y_{1},\ldots ,Y_{n})\), \(Y_{i}\in \mathcal{H}\), on \((\Omega , \mathcal{H}, \mathbb{E})\), \(\mathbf{Y}\) is said to be independent of \(\mathbf{X}\) if for each Borel-measurable function ψ on \(\mathbb{R}^{m}\times \mathbb{R}^{n}\) such that \(\psi (\mathbf{X},\mathbf{Y})\in \mathcal{H}\) and \(\psi (\mathbf{x},\mathbf{Y})\in \mathcal{H}\) for each \(\mathbf{x}\in \mathbb{R}^{m}\), we have \(\mathbb{E}[\psi (\mathbf{X},\mathbf{Y})]=\mathbb{E}[\mathbb{E}[\psi (\mathbf{x},\mathbf{Y})]|_{\mathbf{x}=\mathbf{X}}]\) whenever \(\mathbb{E}[|\psi (\mathbf{x},\mathbf{Y})|]<\infty \) for each \(\mathbf{x}\) and \(\mathbb{E}[|\mathbb{E}[\psi (\mathbf{x},\mathbf{Y})]|_{\mathbf{x}=\mathbf{X}}|]<\infty \) (cf. Definition 2.5 in Chen [1]). \(\{X_{n}\}_{n=1}^{\infty }\) is said to be a sequence of independent random variables if \(X_{n+1}\) is independent of \((X_{1},\ldots ,X_{n})\) for each \(n\ge 1\).

Let \(\mathbf{X}_{1}\) and \(\mathbf{X}_{2}\) be two n-dimensional random vectors defined in sublinear expectation spaces \((\Omega _{1},\mathcal{H}_{1},\mathbb{E}_{1})\) and \((\Omega _{2},\mathcal{H}_{2},\mathbb{E}_{2})\), respectively. They are said to be identically distributed if for every Borel-measurable function ψ such that \(\psi (\mathbf{X}_{1})\in \mathcal{H}_{1}\) and \(\psi (\mathbf{X}_{2})\in \mathcal{H}_{2}\),

$$ \mathbb{E}_{1} \bigl[\psi (\mathbf{X}_{1}) \bigr]= \mathbb{E}_{2} \bigl[\psi ( \mathbf{X}_{2}) \bigr] $$

whenever the sublinear expectations are finite. \(\{X_{n}\}_{n=1}^{\infty }\) is said to be identically distributed if for each \(i\ge 1\), \(X_{i}\) and \(X_{1}\) are identically distributed.

Throughout this paper, we assume that \(\mathbb{E}\) is countably subadditive, that is, \(\mathbb{E}(X)\le \sum_{n=1}^{\infty }\mathbb{E}(X_{n})\) whenever \(X\le \sum_{n=1}^{\infty }X_{n}\), \(X,X_{n}\in \mathcal{H}\), and \(X\ge 0\), \(X_{n}\ge 0\), \(n=1,2,\ldots \) . By C we denote positive constants, which may differ in different places; \(I(A)\) or \(I_{A}\) stands for the indicator function of A; \(a_{n}\ll b_{n}\) means that there exists a constant \(C>0\) such that \(a_{n}\le C b_{n}\) for n large enough, and \(a_{n}\approx b_{n}\) means that \(a_{n}\ll b_{n}\) and \(b_{n}\ll a_{n}\). We denote \(\log x:=\ln \max \{\mathrm{e}, x\}\).
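
In particular, the countable subadditivity of \(\mathbb{E}\) passes to \(\mathbb{V}\): if \(A\subset \bigcup_{n=1}^{\infty }A_{n}\) with \(A,A_{n}\in \mathcal{F}\), then \(I_{A}\le \sum_{n=1}^{\infty }I_{A_{n}}\), and hence

$$ \mathbb{V}(A)=\mathbb{E}[I_{A}]\le \sum_{n=1}^{\infty }\mathbb{E}[I_{A_{n}}]= \sum_{n=1}^{\infty }\mathbb{V}(A_{n}). $$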

To establish our results, we need the following lemmas.

Lemma 2.1

Let Y be a random variable in a sublinear expectation space \((\Omega ,\mathcal{H},\mathbb{E})\). Then for any \(\alpha >0\), \(\gamma >0\), and \(\beta >-1\),

$$\begin{aligned}& (\mathrm{i})\quad \int _{1}^{\infty }u^{\beta } \mathcal{C}_{ \mathbb{V}} \bigl( \vert Y \vert ^{\alpha }I \bigl( \vert Y \vert >u^{\gamma } \bigr) \bigr)\,\mathrm{d}u\le C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/\gamma +\alpha } \bigr), \\& (\mathrm{ii})\quad \int _{1}^{\infty }u^{\beta }\ln (u) \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{\alpha }I \bigl( \vert Y \vert >u^{\gamma } \bigr) \bigr)\,\mathrm{d}u \le C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/\gamma +\alpha }\ln \bigl(1+ \vert Y \vert \bigr) \bigr). \end{aligned}$$

Proof

(i)

$$ \begin{aligned} & \int _{1}^{\infty }u^{\beta } \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{\alpha }I \bigl( \vert Y \vert >u^{ \gamma } \bigr) \bigr)\,\mathrm{d}u \\ &\quad = \int _{1}^{\infty }u^{\beta } \biggl( \int _{0}^{u^{\gamma }}\mathbb{V} \bigl( \vert Y \vert >u^{ \gamma } \bigr)\alpha t^{\alpha -1}\,\mathrm{d}t+ \int _{u^{\gamma }}^{\infty } \mathbb{V} \bigl( \vert Y \vert ^{\alpha }>t^{\alpha } \bigr)\alpha t^{\alpha -1} \,\mathrm{d}t \biggr)\,\mathrm{d}u \\ &\quad = \int _{1}^{\infty }u^{\beta +\alpha \gamma }\mathbb{V} \bigl( \vert Y \vert >u^{\gamma } \bigr) \,\mathrm{d}u+ \int _{1}^{\infty }\alpha t^{\alpha -1}\mathbb{V} \bigl( \vert Y \vert >t \bigr) \biggl( \int _{1}^{t^{1/\gamma }}u^{\beta }\,\mathrm{d}u \biggr) \,\mathrm{d}t \\ &\quad \le C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/\gamma +\alpha } \bigr)+C \int _{1}^{\infty }t^{\alpha -1+(\beta +1)/\gamma }\mathbb{V} \bigl( \vert Y \vert >t \bigr) \,\mathrm{d}t \\ &\quad \le C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/\gamma +\alpha } \bigr). \end{aligned} $$

(ii) As in the proof of Lemma 2.2 in Zhong and Wu [16], let \(Z(x)=x^{(\beta +1)/\gamma +\alpha }\ln (1+x)\), and let \(Z^{-1}(x)\) be the inverse function of \(Z(x)\). Then

$$\begin{aligned} & \int _{1}^{\infty }u^{\beta }\ln (u) \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{ \alpha }I \bigl( \vert Y \vert >u^{\gamma } \bigr) \bigr)\,\mathrm{d}u \\ &\quad = \int _{1}^{\infty }u^{\beta }\ln (u) \biggl( \int _{0}^{u^{\gamma }} \mathbb{V} \bigl( \vert Y \vert >u^{\gamma } \bigr)\alpha t^{\alpha -1}\,\mathrm{d}t+ \int _{u^{ \gamma }}^{\infty }\mathbb{V} \bigl( \vert Y \vert ^{\alpha }>t^{\alpha } \bigr)\alpha t^{ \alpha -1}\,\mathrm{d}t \biggr)\,\mathrm{d}u \\ &\quad = \int _{1}^{\infty }u^{\beta +\alpha \gamma }\ln (u)\mathbb{V} \bigl( \vert Y \vert >u^{ \gamma } \bigr)\,\mathrm{d}u+ \int _{1}^{\infty }\alpha t^{\alpha -1} \mathbb{V} \bigl( \vert Y \vert >t \bigr) \biggl( \int _{1}^{t^{1/\gamma }}u^{\beta }\ln (u) \,\mathrm{d}u \biggr)\,\mathrm{d}t \\ &\quad \approx \int _{1}^{\infty } \bigl(x^{(\beta +1)+\alpha \gamma -\gamma } \ln \bigl(1+x^{\gamma } \bigr)+x^{(\beta +1)+\alpha \gamma -\gamma } \bigr)x^{ \gamma -1} \mathbb{V} \bigl( \vert Y \vert >x^{\gamma } \bigr)\,\mathrm{d}x \\ &\qquad {}+C \int _{1}^{\infty }t^{\alpha -1+(\beta +1)/\gamma }\ln (t) \mathbb{V} \bigl( \vert Y \vert >t \bigr)\,\mathrm{d}t \\ &\quad \approx \int _{1}^{\infty }\mathbb{V} \bigl( \vert Y \vert >x \bigr)\,\mathrm{d}Z(x)+C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/\gamma +\alpha }\ln \bigl(1+ \vert Y \vert \bigr) \bigr) \\ &\quad \le C \int _{1}^{\infty }\mathbb{V} \bigl( \vert Y \vert >Z^{-1}(x) \bigr) \,\mathrm{d}x+C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/\gamma + \alpha }\ln \bigl(1+ \vert Y \vert \bigr) \bigr) \\ &\quad \le C \int _{1}^{\infty }\mathbb{V} \bigl( \vert Y \vert ^{(\beta +1)/\gamma+\alpha } \ln \bigl(1+ \vert Y \vert \bigr)>x \bigr)\,\mathrm{d}x+C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{( \beta +1)/\gamma +\alpha }\ln \bigl(1+ \vert Y \vert \bigr) \bigr) \\ &\quad \le C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/\gamma+\alpha } \ln \bigl(1+ \vert Y \vert \bigr) \bigr)+C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/\gamma +\alpha }\ln \bigl(1+ \vert Y \vert \bigr) \bigr) \\ &\quad \le C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/\gamma +\alpha } \ln \bigl(1+ \vert Y \vert \bigr) \bigr). \end{aligned}$$

 □
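
A special case of Lemma 2.1(i) is used repeatedly in Sect. 4: reading \(|Y|^{\alpha }I(|Y|>u^{\gamma })\) with \(\alpha =0\) as \(I(|Y|>u^{\gamma })\) (the argument above adapts directly to this case), with \(\gamma =1\) we obtain

$$ \int _{1}^{\infty }u^{\beta }\mathbb{V} \bigl( \vert Y \vert >u \bigr) \,\mathrm{d}u\le C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{\beta +1} \bigr), $$

and similarly for part (ii) with an additional logarithmic factor.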

Denote \(S_{k}=X_{1}+\cdots +X_{k}\), \(S_{0}=0\).

Lemma 2.2

(cf. Corollary 2.2 and Theorem 2.3 in Zhang [14])

Suppose that \(X_{k+1}\) is independent of \((X_{1},\ldots ,X_{k})\) for \(k=1,\ldots ,n-1\) in a sublinear expectation space \((\Omega ,\mathcal{H},\mathbb{E})\), and that \(\mathbb{E}(X_{i})\le 0\) for \(i=1,\ldots ,n\). Then

$$\begin{aligned}& \mathbb{E} \Bigl[ \Bigl\vert \max_{k\le n}(S_{n}-S_{k}) \Bigr\vert ^{M} \Bigr]\le 2^{2-M}\sum _{k=1}^{n}\mathbb{E} \bigl[ \vert X_{k} \vert ^{M} \bigr]\quad \textit{for }1\le M\le 2, \end{aligned}$$
(2.1)
$$\begin{aligned}& \mathbb{E} \Bigl[ \Bigl\vert \max_{k\le n}(S_{n}-S_{k}) \Bigr\vert ^{M} \Bigr]\le C_{M} \Biggl\{ \sum _{k=1}^{n}\mathbb{E} \bigl[ \vert X_{k} \vert ^{M} \bigr]+ \Biggl(\sum _{k=1}^{n}\mathbb{E} \bigl[ \vert X_{k} \vert ^{2} \bigr] \Biggr)^{M/2} \Biggr\} \quad \textit{for }M\ge 2. \end{aligned}$$
(2.2)
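
In the proofs below, Lemma 2.2 is applied to maxima of partial sums via the elementary bound (writing \(Z_{i}\) for the summands)

$$ \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}Z_{i} \Biggr\vert \le \Biggl\vert \sum_{i=1}^{n}Z_{i} \Biggr\vert +\max_{1\le j\le n} \Biggl\vert \sum_{i=j+1}^{n}Z_{i} \Biggr\vert \le 2\max_{0\le j\le n} \Biggl\vert \sum_{i=j+1}^{n}Z_{i} \Biggr\vert , $$

which explains the passage from \(\max_{1\le j\le n}\vert \sum_{i=1}^{j}\cdot \vert \) to \(\max_{0\le j\le n}\vert \sum_{i=j+1}^{n}\cdot \vert \) in the proof of Theorem 3.1, the factor \(2^{M}\) being absorbed into C.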

We also cite Lemma 4.5(iii) in Zhang [13] as follows.

Lemma 2.3

(Lemma 4.5 (iii) in Zhang [13])

If \(\mathbb{E}\) is countably subadditive and \(\mathcal{C}_{\mathbb{V}}(|X|)<\infty \), then

$$ \mathbb{E} \bigl( \vert X \vert \bigr)\le \mathcal{C}_{\mathbb{V}} \bigl( \vert X \vert \bigr)< \infty . $$

Lemma 2.4

Let Y be a random variable in a sublinear expectation space \((\Omega ,\mathcal{H},\mathbb{E})\). Then for any \(\alpha >0\), \(\gamma >0\), and \(\beta <-1\),

$$\begin{aligned}& \begin{gathered} (\mathrm{i})\quad \int _{1}^{\infty }u^{\beta } \mathbb{E} \bigl[ \vert Y \vert ^{\alpha }I \bigl( \vert Y \vert \le u^{\gamma } \bigr) \bigr]\,\mathrm{d}u \le C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/\gamma +\alpha } \bigr), \\ (\mathrm{ii})\quad \int _{1}^{\infty }u^{\beta }\ln (u) \mathbb{E} \bigl[ \vert Y \vert ^{\alpha }I \bigl( \vert Y \vert \le u^{\gamma } \bigr) \bigr]\,\mathrm{d}u \le C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/\gamma +\alpha } \ln \bigl(1+ \vert Y \vert \bigr) \bigr). \end{gathered} \end{aligned}$$
(2.3)

Proof

(i) By Lemma 2.3 we have

$$\begin{aligned}& \int _{1}^{\infty }u^{\beta }\mathbb{E} \bigl[ \vert Y \vert ^{\alpha }I \bigl( \vert Y \vert \le u^{ \gamma } \bigr) \bigr]\,\mathrm{d}u \\& \quad \le \int _{1}^{\infty }u^{\beta } \int _{0}^{u^{\gamma }}\mathbb{V} \bigl( \vert Y \vert I \bigl( \vert Y \vert \le u^{\gamma } \bigr)>t \bigr)\alpha t^{\alpha -1} \,\mathrm{d}t\,\mathrm{d}u \\& \quad \le C \int _{0}^{\infty }\mathbb{V} \bigl( \vert Y \vert >t \bigr) t^{\alpha -1} \int _{1 \bigvee t^{1/\gamma }}^{\infty }u^{\beta }\,\mathrm{d}u \,\mathrm{d}t \\& \quad \le C \int _{0}^{\infty }\mathbb{V} \bigl( \vert Y \vert >t \bigr)t^{\alpha -1+(\beta +1)/ \gamma }\,\mathrm{d}t\le C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{(\beta +1)/ \gamma +\alpha } \bigr). \end{aligned}$$

(ii) By Lemma 2.3 and the proof of Lemma 2.2 in Zhong and Wu [16] we have

$$\begin{aligned}& \int _{1}^{\infty }u^{\beta }\ln (u)\mathbb{E} \bigl[ \vert Y \vert ^{\alpha }I \bigl( \vert Y \vert \le u^{\gamma } \bigr) \bigr]\,\mathrm{d}u \\& \quad \le \int _{1}^{\infty }u^{\beta }\ln (u) \int _{0}^{u^{\gamma }} \mathbb{V} \bigl( \vert Y \vert I \bigl( \vert Y \vert \le u^{\gamma } \bigr)>t \bigr)\alpha t^{\alpha -1} \,\mathrm{d}t\,\mathrm{d}u \\& \quad \le C \int _{0}^{\infty }\mathbb{V} \bigl( \vert Y \vert >t \bigr) t^{\alpha -1} \int _{1 \bigvee t^{1/\gamma }}^{\infty }u^{\beta }\ln (u)\,\mathrm{d}u \,\mathrm{d}t \\& \quad \le C \int _{0}^{\infty }\mathbb{V} \bigl( \vert Y \vert >t \bigr)t^{\alpha -1+(\beta +1)/ \gamma }\ln (t+1)\,\mathrm{d}t\le C \mathcal{C}_{\mathbb{V}} \bigl( \vert Y \vert ^{( \beta +1)/\gamma +\alpha }\ln \bigl(1+ \vert Y \vert \bigr) \bigr). \end{aligned}$$

 □

Lemma 2.5

Let \(\{X_{n};n\ge 1\}\) be a sequence of independent random variables in a sublinear expectation space \((\Omega ,\mathcal{H},\mathbb{E})\). Then for all \(n\ge 1\) and \(x>0\),

$$\begin{aligned} & \Bigl[1-\mathbb{V} \Bigl(\max _{1\le j\le n} \vert X_{j} \vert >x \Bigr) \Bigr]^{2}\sum_{j=1}^{n} \mathbb{V} \bigl( \vert X_{j} \vert >x \bigr)\le 4\mathbb{V} \Bigl( \max_{1\le j\le n} \vert X_{j} \vert >x \Bigr). \end{aligned}$$
(2.4)

Proof

We borrow the idea from Shen and Wu [9]. Write \(A_{k}=(|X_{k}|>x)\) and

$$ \beta _{n}=1-\mathbb{V} \bigl(\cup _{k=1}^{n}A_{k} \bigr)=1-\mathbb{V} \Bigl( \max_{1\le j\le n} \vert X_{j} \vert >x \Bigr). $$

Without loss of generality, we may assume that \(\beta _{n}>0\). Since \(\{I(|X_{k}|>x)-\mathbb{E}I(|X_{k}|>x), 1\le k\le n\}\) is a sequence of independent random variables under sublinear expectations, combining \(C_{r}\)’s inequality and Lemma 2.2 results in

$$\begin{aligned} \mathbb{E} \Biggl[\sum _{k=1}^{n} \bigl(I(A_{k})- \mathbb{E}I(A_{k}) \bigr) \Biggr]^{2} \le& \sum _{k=1}^{n}\mathbb{E} \bigl[ \bigl(I(A_{k})- \mathbb{E}I(A_{k}) \bigr)^{2} \bigr] \\ \le &2\sum_{k=1}^{n} \bigl(\mathbb{E} \bigl[I(A_{k})^{2} \bigr]+ \bigl(\mathbb{V}(A_{k}) \bigr)^{2} \bigr)\le 4\sum_{k=1}^{n} \mathbb{V}(A_{k}). \end{aligned}$$
(2.5)

By (2.5), the independence of \(I(A_{k})\), \(k=1,\ldots , n\), the subadditivity of sublinear expectations, Hölder’s inequality under sublinear expectations, and the equality \(\mathbb{E}(X+c)=\mathbb{E}(X)+c\) for constants c and \(X\in \mathcal{H}\), we conclude that

$$\begin{aligned} \sum_{k=1}^{n} \mathbb{V}(A_{k})&=\sum_{k=1}^{n} \mathbb{E} \bigl[I(A_{k}) \bigr] =\sum_{k=1}^{n-2}\mathbb{E}[I(A_{k})]+\mathbb{E} [I(A_{n-1})+\mathbb{E} [I(A_{n}) ] ] \\ &=\sum_{k=1}^{n-2}\mathbb{E} \bigl[I(A_{k}) \bigr]+\mathbb{E} \bigl[I(A_{n-1})+I(A_{n}) \bigr]=\cdots =\mathbb{E} \Biggl[I(A_{1})+\mathbb{E} \Biggl[\sum _{k=2}^{n}I(A_{k}) \Biggr] \Biggr] \\ &=\mathbb{E} \Biggl[\sum_{k=1}^{n}I(A_{k}) \Biggr]=\mathbb{E} \Biggl[ \sum_{k=1}^{n}I(A_{k})I \Biggl(\bigcup_{j=1}^{n}A_{j} \Biggr) \Biggr] \\ &\le \mathbb{E} \Biggl[\sum_{k=1}^{n} \bigl(I(A_{k})-\mathbb{E}I(A_{k}) \bigr)I \Biggl(\bigcup _{j=1}^{n}A_{j} \Biggr) \Biggr]+\sum_{k=1}^{n} \mathbb{V}(A_{k})\mathbb{V} \Biggl(\bigcup _{j=1}^{n}A_{j} \Biggr) \\ &\le \Biggl[\mathbb{E} \Biggl(\sum_{k=1}^{n} \bigl(I(A_{k})-\mathbb{E}I(A_{k}) \bigr) \Biggr)^{2} \mathbb{E} \Biggl(I \Biggl(\bigcup _{j=1}^{n}A_{j} \Biggr) \Biggr) \Biggr]^{1/2}+(1-\beta _{n})\sum _{k=1}^{n}\mathbb{V}(A_{k}) \\ &\le \Biggl[\frac{4(1-\beta _{n})}{\beta _{n}}\beta _{n}\sum _{k=1}^{n} \mathbb{V}(A_{k}) \Biggr]^{1/2}+(1-\beta _{n})\sum _{k=1}^{n} \mathbb{V}(A_{k}) \\ &\le \frac{1}{2} \Biggl[\frac{4(1-\beta _{n})}{\beta _{n}}+\beta _{n} \sum_{k=1}^{n} \mathbb{V}(A_{k}) \Biggr]+(1-\beta _{n})\sum_{k=1}^{n} \mathbb{V}(A_{k}), \end{aligned}$$

which immediately results in (2.4): rearranging gives \((\beta _{n}/2)\sum_{k=1}^{n}\mathbb{V}(A_{k})\le 2(1-\beta _{n})/\beta _{n}\), that is, \(\beta _{n}^{2}\sum_{k=1}^{n}\mathbb{V}(A_{k})\le 4(1-\beta _{n})=4\mathbb{V} (\max_{1\le j\le n} \vert X_{j} \vert >x )\). The proof is finished. □

3 Main results

Throughout the rest of this paper, we assume that \(\{X_{n},n\ge 0\}\) is a sequence of independent random variables, identically distributed as X, in a sublinear expectation space \((\Omega ,\mathcal{H},\mathbb{E})\) with \(\mathbb{E}(X_{i})=-\mathbb{E}(-X_{i})= 0\), \(i=0,1,2,\ldots \) . Our main results are as follows.

Theorem 3.1

Let \(\beta >-1\) and \(r>1\). Let \(\{b_{ni}\approx (i/n)^{\beta }(1/n), 1\le i\le n, n\ge 1\}\) satisfy \(\sum_{i=1}^{n}b_{ni}=1\) for all \(n\ge 1\), and assume that

$$\begin{aligned} \textstyle\begin{cases} \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{(r-1)/(1+\beta )} )< \infty & \textit{for }-1< \beta < -1/r; \\ \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r}\ln (1+ \vert X \vert ) )< \infty & \textit{for }\beta =-1/r; \\ \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r} )< \infty & \textit{for }\beta >-1/r. \end{cases}\displaystyle \end{aligned}$$
(3.1)

Then for all \(\varepsilon >0\),

$$\begin{aligned}& \sum_{n=1}^{\infty }n^{r-2} \mathbb{E} \Biggl\{ \Biggl(\max_{1\le j \le n} \Biggl\vert \sum _{i=1}^{j}b_{ni}X_{i} \Biggr\vert -\varepsilon \Biggr)^{+} \Biggr\} < \infty , \end{aligned}$$
(3.2)
$$\begin{aligned}& \sum_{n=1}^{\infty }n^{r-2} \mathbb{V} \Biggl\{ \max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}b_{ni}X_{i} \Biggr\vert >\varepsilon \Biggr\} < \infty . \end{aligned}$$
(3.3)

Property (3.2) also implies (3.1).
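
Note that the three moment conditions in (3.1) are consistent at the boundary: for \(\beta =-1/r\) the exponent in the first case satisfies

$$ \frac{r-1}{1+\beta }=\frac{r-1}{1-1/r}=r, $$

so the logarithmic factor in the second case marks the transition between the two power regimes.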

Corollary 3.1

Let \(\beta >-1\) and \(r>1\). Assume that \(\{b_{ni}\approx [(n-i)/n]^{\beta }(1/n), 0\le i\le n-1, n\ge 1\}\) satisfies \(\sum_{i=0}^{n-1}b_{ni}=1\) for all \(n\ge 1\). Let (3.1) hold. Then for all \(\varepsilon >0\),

$$\begin{aligned}& \sum_{n=1}^{\infty }n^{r-2} \mathbb{E} \Biggl\{ \Biggl(\max_{0\le j \le n-1} \Biggl\vert \sum _{i=0}^{j}b_{ni}X_{i} \Biggr\vert -\varepsilon \Biggr)^{+} \Biggr\} < \infty , \end{aligned}$$
(3.4)
$$\begin{aligned}& \sum_{n=1}^{\infty }n^{r-2} \mathbb{V} \Biggl\{ \max_{0\le j\le n-1} \Biggl\vert \sum _{i=0}^{j}b_{ni}X_{i} \Biggr\vert >\varepsilon \Biggr\} < \infty . \end{aligned}$$
(3.5)

Property (3.4) also implies (3.1).

For \(\alpha >0\), we define the Cesàro coefficients

$$ A_{n}^{\alpha }:=\frac{(\alpha +1)(\alpha +2)\cdots (\alpha +n)}{n!},\quad n=1,2, \ldots , A_{0}^{\alpha }=1. $$
(3.6)
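
Writing \(A_{n}^{\alpha }=\frac{\Gamma (n+\alpha +1)}{\Gamma (\alpha +1)\Gamma (n+1)}\), Stirling's formula gives \(\Gamma (n+\alpha +1)/\Gamma (n+1)\sim n^{\alpha }\), and hence

$$ A_{n}^{\alpha }\approx \frac{n^{\alpha }}{\Gamma (\alpha +1)}, $$

an asymptotic relation used in the proof of Theorem 3.2.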

Theorem 3.2

Let \(0<\alpha \le 1\) and \(r>1\). Assume that

$$\begin{aligned} \textstyle\begin{cases} \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{(r-1)/\alpha } )< \infty & \textit{ for $0< \alpha < 1-1/r$;} \\ \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r}\ln (1+ \vert X \vert ) )< \infty & \textit{ for $\alpha =1-1/r$;} \\ \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r} )< \infty & \textit{ for $\alpha >1-1/r$.} \end{cases}\displaystyle \end{aligned}$$
(3.7)

Then for all \(\varepsilon >0\),

$$\begin{aligned}& \sum_{n=1}^{\infty }n^{r-2} \mathbb{E} \Biggl\{ \Biggl(\max_{0\le j \le n} \Biggl\vert \sum _{i=0}^{j}A_{n-i}^{\alpha -1}X_{i}/A_{n}^{\alpha } \Biggr\vert -\varepsilon \Biggr)^{+} \Biggr\} < \infty , \end{aligned}$$
(3.8)
$$\begin{aligned}& \sum_{n=1}^{\infty }n^{r-2} \mathbb{V} \Biggl\{ \max_{0\le j\le n} \Biggl\vert \sum _{i=0}^{j}A_{n-i}^{\alpha -1}X_{i}/A_{n}^{\alpha } \Biggr\vert > \varepsilon \Biggr\} < \infty . \end{aligned}$$
(3.9)

Property (3.8) also implies (3.7).

In Theorem 3.2, taking \(\alpha =1\), we get the following corollary.

Corollary 3.2

Let \(r>1\). Suppose \(\mathcal{C}_{\mathbb{V}}(|X|^{r})<\infty \). Then for all \(\varepsilon >0\),

$$\begin{aligned}& \sum_{n=1}^{\infty }n^{r-2} \mathbb{E} \Biggl\{ \Biggl(\max_{1\le j \le n} \Biggl\vert \frac{1}{n}\sum_{i=1}^{j}X_{i} \Biggr\vert -\varepsilon \Biggr)^{+} \Biggr\} < \infty , \end{aligned}$$
(3.10)
$$\begin{aligned}& \sum_{n=1}^{\infty }n^{r-2} \mathbb{V} \Biggl\{ \max_{1\le j\le n} \Biggl\vert \frac{1}{n}\sum_{i=1}^{j}X_{i} \Biggr\vert >\varepsilon \Biggr\} < \infty . \end{aligned}$$
(3.11)

Property (3.10) also implies \(\mathcal{C}_{\mathbb{V}}(|X|^{r})<\infty \).
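
Indeed, for \(\alpha =1\) we have \(A_{n-i}^{0}=1\) and \(A_{n}^{1}=n+1\), so the weighted sum in (3.8) reduces, up to an index shift, to the Cesàro mean

$$ \sum_{i=0}^{j}A_{n-i}^{\alpha -1}X_{i}/A_{n}^{\alpha }= \frac{1}{n+1}\sum_{i=0}^{j}X_{i}, $$

which yields (3.10) and (3.11).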

Remark 3.1

Under the same assumptions as in Theorem 3.1, since \((x-\varepsilon /2)^{+}\ge (\varepsilon /2)I(x>\varepsilon )\), we obtain for all \(\varepsilon >0\)

$$\begin{aligned} \infty &>\sum_{n=1}^{\infty }n^{r-2} \mathbb{E} \Biggl\{ \Biggl(\max_{1 \le j\le n} \Biggl\vert \sum _{i=1}^{j}b_{ni}X_{i} \Biggr\vert -\varepsilon /2 \Biggr)^{+} \Biggr\} \\ &\ge \sum_{n=1}^{\infty }\varepsilon n^{r-2}\mathbb{V} \Biggl(\max_{1 \le j\le n} \Biggl\vert \sum_{i=1}^{j}b_{ni}X_{i} \Biggr\vert >\varepsilon \Biggr)/2 \\ &=(\varepsilon /2)\sum_{n=1}^{\infty }n^{r-2} \mathbb{V} \Biggl(\max_{1 \le j\le n} \Biggl\vert \sum _{i=1}^{j}b_{ni}X_{i} \Biggr\vert >\varepsilon \Biggr). \end{aligned}$$
(3.12)

By (3.12) we can deduce that (3.2) implies (3.3). Similarly, (3.4) implies (3.5). Hence our results extend those of Meng et al. [6] to the sublinear expectation setting.

4 Proofs of Theorems 3.1 and 3.2

Proof of Theorem 3.1

Here we borrow the idea of the proof of Theorem 16 in Meng et al. [6]. We first prove that (3.1) implies (3.2). For all \(1\le i\le n\), \(n\ge 1\), write

$$ Y_{ni}=-\frac{1}{b_{ni}}I(b_{ni}X_{i}< -1)+X_{i}I \bigl( \vert b_{ni}X_{i} \vert \le 1 \bigr)+ \frac{1}{b_{ni}}I(b_{ni}X_{i}>1). $$
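
Since \(b_{ni}>0\), \(Y_{ni}\) is simply \(X_{i}\) truncated at level \(1/b_{ni}\):

$$ Y_{ni}= \biggl(-\frac{1}{b_{ni}} \biggr)\vee \biggl(X_{i}\wedge \frac{1}{b_{ni}} \biggr),\qquad \vert Y_{ni} \vert \le \vert X_{i} \vert \wedge \frac{1}{b_{ni}}. $$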

Since \(\mathbb{E}(X_{i})=-\mathbb{E}(-X_{i})= 0\), we conclude that

$$ \begin{aligned} \sum_{i=1}^{j}b_{ni}X_{i} \le {}&\sum_{i=1}^{j}b_{ni}(Y_{ni}- \mathbb{E}Y_{ni})+\sum_{i=1}^{j} \bigl[ \vert b_{ni}X_{i} \vert I \bigl( \vert b_{ni}X_{i} \vert >1 \bigr)+ \mathbb{E} \vert b_{ni}X_{i} \vert I \bigl( \vert b_{ni}X_{i} \vert >1 \bigr) \bigr] \\ &{} +\sum_{i=1}^{j} \bigl[I(b_{ni}X_{i}< -1)+I(b_{ni}X_{i}>1) \bigr]+\sum_{i=1}^{j}\mathbb{V} \bigl( \vert b_{ni}X_{i} \vert >1 \bigr) \end{aligned} $$

and

$$\begin{aligned} \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}b_{ni}X_{i} \Biggr\vert \le& \max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}b_{ni}(Y_{ni}- \mathbb{E}Y_{ni}) \Biggr\vert +\sum_{i=1}^{n} \bigl[I \bigl( \vert b_{ni}X_{i} \vert >1 \bigr)+ \mathbb{V} \bigl( \vert b_{ni}X_{i} \vert >1 \bigr) \bigr] \\ &{}+\sum_{i=1}^{n} \bigl[ \vert b_{ni}X_{i} \vert I \bigl( \vert b_{ni}X_{i} \vert >1 \bigr)+\mathbb{E} \vert b_{ni}X_{i} \vert I \bigl( \vert b_{ni}X_{i} \vert >1 \bigr) \bigr]. \end{aligned}$$
(4.1)

Then

$$ \begin{aligned} \sum_{n=1}^{\infty }n^{r-2} \mathbb{E} \Biggl(\max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}b_{ni}X_{i} \Biggr\vert -\varepsilon \Biggr)^{+} \le{}& C\sum _{n=1}^{\infty }n^{r-2}\mathbb{E} \Biggl[\max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}b_{ni}(Y_{ni}- \mathbb{E}Y_{ni}) \Biggr\vert - \varepsilon \Biggr]^{+} \\ &{} +C\sum_{n=1}^{\infty }n^{r-2} \sum_{i=1}^{n}\mathbb{V} \bigl( \vert b_{ni}X_{i} \vert >1 \bigr) \\ &{} +C\sum_{n=1}^{\infty }n^{r-2} \sum_{i=1}^{n}\mathbb{E} \vert b_{ni}X_{i} \vert I \bigl( \vert b_{ni}X_{i} \vert >1 \bigr) \\ =:{}&\mathrm{I}+\mathrm{II}+\mathrm{III}. \end{aligned} $$

Thus, to prove (3.2), we need to establish \(\mathrm{I}<\infty \), \(\mathrm{II}<\infty \), and \(\mathrm{III}<\infty \). We first prove \(\mathrm{I}<\infty \). For fixed \(n\ge 1\), since \(\{Y_{ni}-\mathbb{E}Y_{ni},1\le i\le n\}\) is a sequence of independent identically distributed random variables in the sublinear expectation space \((\Omega ,\mathcal{H},\mathbb{E})\), for \(M\ge 2\), combining Lemma 2.2, \(C_{r}\)’s inequality, Markov’s inequality, and Jensen’s inequality under sublinear expectations results in

$$\begin{aligned} \mathrm{I}&\le C\sum_{n=1}^{\infty }n^{r-2} \int _{0}^{\infty } \mathbb{V} \Biggl(\max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}b_{ni}(Y_{ni}- \mathbb{E}Y_{ni}) \Biggr\vert >t+\varepsilon \Biggr)\,\mathrm{d}t \\ &\le C\sum_{n=1}^{\infty }n^{r-2} \int _{0}^{\infty } \frac{1}{(t+\varepsilon )^{M}}\mathbb{E} \Biggl[\max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}b_{ni}(Y_{ni}- \mathbb{E}Y_{ni}) \Biggr\vert ^{M} \Biggr]\,\mathrm{d}t \\ &\le C\sum_{n=1}^{\infty }n^{r-2} \int _{0}^{\infty } \frac{1}{(t+\varepsilon )^{M}}\mathbb{E} \Biggl[\max_{0\le j\le n} \Biggl\vert \sum _{i=j+1}^{n}b_{ni}(Y_{ni}- \mathbb{E}Y_{ni}) \Biggr\vert ^{M} \Biggr]\,\mathrm{d}t \\ &\le C\sum_{n=1}^{\infty }n^{r-2} \mathbb{E} \Biggl[\max_{0\le j\le n} \Biggl\vert \sum _{i=j+1}^{n}b_{ni}(Y_{ni}- \mathbb{E}Y_{ni}) \Biggr\vert ^{M} \Biggr] \\ &\le C\sum_{n=1}^{\infty }n^{r-2} \Biggl[\sum_{i=1}^{n}\mathbb{E} \bigl(b_{ni}^{M} \vert Y_{ni}- \mathbb{E}Y_{ni} \vert ^{M} \bigr)+ \Biggl(\sum _{i=1}^{n}b_{ni}^{2} \mathbb{E} \vert Y_{ni}- \mathbb{E}Y_{ni} \vert ^{2} \Biggr)^{M/2} \Biggr] \\ &\le C\sum_{n=1}^{\infty }n^{r-2} \sum_{i=1}^{n}\mathbb{E} \bigl(b_{ni}^{M} \vert Y_{ni} \vert ^{M} \bigr)+C \sum_{n=1}^{\infty }n^{r-2} \Biggl(\sum_{i=1}^{n}\mathbb{E} \bigl( \vert b_{ni}Y_{ni} \vert ^{2} \bigr) \Biggr)^{M/2}=:\mathrm{I}_{1}+\mathrm{I}_{2}. \\ \mathrm{I}_{1}&\le C\sum _{n=1}^{\infty }n^{r-2}\sum _{i=1}^{n} \mathbb{V} \bigl( \vert b_{ni}X \vert >1 \bigr)+C\sum_{n=1}^{\infty }n^{r-2} \sum_{i=1}^{n}\mathbb{E} \vert b_{ni}X \vert ^{M}I \bigl( \vert b_{ni}X \vert \le 1 \bigr)=: \mathrm{I}_{11}+\mathrm{I}_{12}. \end{aligned}$$

By \(b_{ni}\approx (i/n)^{\beta }(1/n)\), Lemma 2.1 (or Lemma 2.2 in Zhong and Wu [16]), and (3.1) we see that

$$\begin{aligned} \mathrm{I}_{11} \le &C\sum _{n=1}^{\infty }n^{r-2}\sum _{i=1}^{n} \mathbb{V} \bigl( \vert X_{i} \vert >Cn^{1+\beta }i^{-\beta } \bigr) \\ \le & C \int _{1}^{\infty }x^{r-2} \int _{1}^{x}\mathbb{V} \bigl( \vert X \vert >Cx^{1+ \beta }y^{-\beta } \bigr)\,\mathrm{d}y\,\mathrm{d}x \\ &\text{(setting }s=x^{1+\beta }y^{-\beta }, t=y\text{)} \\ \approx &\textstyle\begin{cases} C\int _{1}^{\infty }s^{\frac{r-1}{1+\beta }-1}\mathbb{V} ( \vert X \vert >Cs )\,\mathrm{d}s& \text{ for $-1< \beta < -1/r$;} \\ C\int _{1}^{\infty }s^{r-1}\ln (s)\mathbb{V} ( \vert X \vert >Cs ) \,\mathrm{d}s & \text{ for $\beta =-1/r$;} \\ C\int _{1}^{\infty }s^{r-1}\mathbb{V} ( \vert X \vert >Cs )\,\mathrm{d}s & \text{ for $\beta >-1/r$;} \end{cases}\displaystyle \\ &\text{ (taking }\beta =\frac{r-1}{1+\beta }-1\text{ or }\beta =r-1, \alpha =0, \gamma =1\text{ in Lemma 2.1} \\ &\text{ and using Lemma 2.3)} \\ \le & \textstyle\begin{cases} C \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{(r-1)/(1+\beta )} )< \infty & \text{ for $-1< \beta < -1/r$;} \\ C\mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r}\ln (1+ \vert X \vert ) )< \infty & \text{ for $\beta =-1/r$;} \\ C\mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r} )< \infty & \text{ for $\beta >-1/r$.} \end{cases}\displaystyle \end{aligned}$$
(4.2)

Taking M large enough that \((r-1)/(1+\beta )-1-M<-1\) and \(r-1-M<-1\), we combine Lemma 2.4 and (3.1) to obtain

$$ \begin{aligned} \mathrm{I}_{12}&=C\sum _{n=1}^{\infty }n^{r-2}\sum _{i=1}^{n}n^{-M(1+ \beta )}i^{M\beta } \mathbb{E} \bigl( \vert X \vert ^{M}I \bigl( \vert X \vert \le Cn^{1+\beta }i^{- \beta } \bigr) \bigr) \\ &\approx C \int _{1}^{\infty }x^{r-2} \int _{1}^{x}x^{-M(1+\beta )}y^{M \beta } \mathbb{E} \bigl( \vert X \vert ^{M}I \bigl( \vert X \vert \le Cx^{1+\beta }y^{-\beta } \bigr) \bigr)\,\mathrm{d}y\,\mathrm{d}x \\ &\approx \textstyle\begin{cases} C\int _{1}^{\infty }s^{\frac{r-1}{1+\beta }-1-M}\mathbb{E} ( \vert X \vert ^{M}I( \vert X \vert \le Cs) )\,\mathrm{d}s& \text{ for $-1< \beta < -1/r$;} \\ C\int _{1}^{\infty }s^{r-1-M}\ln (s)\mathbb{E} ( \vert X \vert ^{M}I( \vert X \vert \le Cs) )\,\mathrm{d}s & \text{ for $\beta =-1/r$;} \\ C\int _{1}^{\infty }s^{r-1-M}\mathbb{E} ( \vert X \vert ^{M}I( \vert X \vert \le Cs) )\,\mathrm{d}s & \text{ for $\beta >-1/r$;} \end{cases}\displaystyle \\ &\text{ (taking $\beta =\frac{r-1}{1+\beta }-1-M$ or $\beta =r-1-M$, $\alpha =M$, $\gamma =1$ in Lemma 2.4 and}\\ &\text{ using Lemma 2.3)} \\ &\le \textstyle\begin{cases} C \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{(r-1)/(1+\beta )} )< \infty & \text{ for $-1< \beta < -1/r$;} \\ C\mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r}\ln (1+ \vert X \vert ) )< \infty & \text{ for $\beta =-1/r$;} \\ C\mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r} )< \infty & \text{ for $\beta >-1/r$.} \end{cases}\displaystyle \end{aligned} $$

We next prove \(\mathrm{I}_{2}<\infty \). By \(C_{r}\)’s inequality we see that

$$\begin{aligned} \mathrm{I}_{2} =&C\sum _{n=1}^{\infty }n^{r-2} \Biggl[\sum _{i=1}^{n} \mathbb{V} \bigl( \vert b_{ni}X_{i} \vert >1 \bigr)+\sum _{i=1}^{n}\mathbb{E} \vert b_{ni}X_{i} \vert ^{2}I \bigl( \vert b_{ni}X_{i} \vert \le 1 \bigr) \Biggr]^{M/2} \\ \le &C\sum_{n=1}^{\infty }n^{r-2} \Biggl[\sum_{i=1}^{n}\mathbb{V} \bigl( \vert b_{ni}X \vert >1 \bigr)+\sum_{i=1}^{n} \mathbb{E} \vert b_{ni}X \vert ^{2}I \bigl( \vert b_{ni}X \vert \le 1 \bigr) \Biggr]^{M/2} \\ \le &C\sum_{n=1}^{\infty }n^{r-2} \Biggl[\sum_{i=1}^{n}\mathbb{V} \bigl( \vert b_{ni}X \vert >1 \bigr) \Biggr]^{M/2}+C\sum _{n=1}^{\infty }n^{r-2} \Biggl[\sum _{i=1}^{n}\mathbb{E} \vert b_{ni}X \vert ^{2}I \bigl( \vert b_{ni}X \vert \le 1 \bigr) \Biggr]^{M/2} \\ =&:\mathrm{I}_{21}+\mathrm{I}_{22}. \end{aligned}$$
(4.3)

Taking M large enough that \(r-2-(r-1)M/2<-1\), noting that for \(\beta =-1/r\) condition (3.1) implies \(\mathcal{C}_{\mathbb{V}}(|X|^{r})<\infty \) as well, and combining Markov’s inequality under sublinear expectations with (3.1), we obtain

$$ \begin{aligned} \mathrm{I}_{21}&\approx C\sum _{n=1}^{\infty }n^{r-2} \Biggl[\sum _{i=1}^{n} \mathbb{V} \bigl( \vert X \vert >Cn^{1+\beta }i^{-\beta } \bigr) \Biggr]^{M/2} \\ &\le \textstyle\begin{cases} C\sum_{n=1}^{\infty }n^{r-2} [\sum_{i=1}^{n} ( \frac{i^{\beta }}{n^{1+\beta }} )^{(r-1)/(1+\beta )} ]^{M/2}& \text{ for $-1< \beta < -1/r$;} \\ C\sum_{n=1}^{\infty }n^{r-2} [\sum_{i=1}^{n} ( \frac{i^{\beta }}{n^{1+\beta }} )^{r} ]^{M/2} & \text{ for $\beta =-1/r$;} \\ C\sum_{n=1}^{\infty }n^{r-2} [\sum_{i=1}^{n} ( \frac{i^{\beta }}{n^{1+\beta }} )^{r} ]^{M/2} & \text{ for $\beta >-1/r$;} \end{cases}\displaystyle \\ &\le \textstyle\begin{cases} C\sum_{n=1}^{\infty }n^{r-2-(r-1)M/2}< \infty & \text{ for $-1< \beta < -1/r$;} \\ C\sum_{n=1}^{\infty }n^{r-2-(r-1)M/2}(\ln n)^{M/2}< \infty & \text{ for $\beta =-1/r$;} \\ C\sum_{n=1}^{\infty }n^{r-2-(r-1)M/2}< \infty & \text{ for $\beta >-1/r$.} \end{cases}\displaystyle \end{aligned} $$

We next establish \(\mathrm{I}_{22}<\infty \) in the following two cases.

(i) If \(1< r<2\), then taking M large enough that \(r-2-Mr(1+\beta )/2<-1\) and \(r-2-M(r-1)/2<-1\), and noting that \(\mathbb{E}(|X|^{r})\le \mathcal{C}_{\mathbb{V}}(|X|^{r})<\infty \) in all three cases of (3.1), we see that

$$ \begin{aligned} \mathrm{I}_{22}&\le C\sum _{n=1}^{\infty }n^{r-2} \Biggl(\sum _{i=1}^{n}b_{ni}^{r} \Biggr)^{M/2} \\ &\approx C\sum_{n=1}^{\infty }n^{r-2} \Biggl(\sum_{i=1}^{n}n^{-r(1+ \beta )}i^{r\beta } \Biggr)^{M/2} \\ &\approx \textstyle\begin{cases} C\sum_{n=1}^{\infty }n^{r-2-Mr(1+\beta )/2}< \infty & \text{ for $-1< \beta < -1/r$;} \\ C\sum_{n=1}^{\infty }n^{r-2-(r-1)M/2}(\ln n)^{M/2}< \infty & \text{ for $\beta =-1/r$;} \\ C\sum_{n=1}^{\infty }n^{r-2-(r-1)M/2}< \infty & \text{ for $\beta >-1/r$.} \end{cases}\displaystyle \end{aligned} $$

(ii) If \(r\ge 2\), then (3.1) yields \(\mathbb{E}(X^{2})<\infty \). Taking M large enough to satisfy \(r-2-M(1+\beta )<-1\) and \(r-2-M/2<-1\), we deduce that

$$\begin{aligned} \mathrm{I}_{22}&\le C\sum _{n=1}^{\infty }n^{r-2} \Biggl(\sum _{i=1}^{n}b_{ni}^{2} \Biggr)^{M/2} \\ &\approx C\sum_{n=1}^{\infty }n^{r-2} \Biggl(\sum_{i=1}^{n}n^{-2(1+ \beta )}i^{2\beta } \Biggr)^{M/2} \\ &\approx \textstyle\begin{cases} C\sum_{n=1}^{\infty }n^{r-2-M(1+\beta )}< \infty & \text{ for $-1< \beta < -1/2$;} \\ C\sum_{n=1}^{\infty }n^{r-2-M/2}(\ln n)^{M/2}< \infty & \text{ for $\beta =-1/2$;} \\ C\sum_{n=1}^{\infty }n^{r-2-M/2}< \infty & \text{ for $\beta >-1/2$.} \end{cases}\displaystyle \end{aligned}$$

By the proof of \(\mathrm{I}_{11}<\infty \) we see that \(\mathrm{II}<\infty \). We finally establish \(\mathrm{III}<\infty \). Since \(b_{ni}\approx (i/n)^{\beta }(1/n)\), combining Lemmas 2.1 and 2.3 results in

$$ \begin{aligned} \mathrm{III}&\le C\sum _{n=1}^{\infty }n^{r-2}\sum _{i=1}^{n}n^{-(1+ \beta )}i^{\beta } \mathbb{E} \bigl( \vert X \vert I \bigl( \vert X \vert >Cn^{1+\beta }i^{-\beta } \bigr) \bigr) \\ &\le C \int _{1}^{\infty }x^{r-2} \int _{1}^{x}x^{-(1+\beta )}y^{ \beta } \mathcal{C}_{\mathbb{V}} \bigl( \vert X \vert I \bigl( \vert X \vert >Cx^{1+\beta }y^{-\beta } \bigr) \bigr)\,\mathrm{d}y\,\mathrm{d}x \\ &\text{ (setting $s=x^{1+\beta }y^{-\beta }$, $t=y$)} \\ &\approx \textstyle\begin{cases} C\int _{1}^{\infty }s^{\frac{r-1}{1+\beta }-2}\mathcal{C}_{\mathbb{V}} ( \vert X \vert I( \vert X \vert >Cs) )\,\mathrm{d}s& \text{ for $-1< \beta < -1/r$;} \\ C\int _{1}^{\infty }s^{r-2}\ln (s)\mathcal{C}_{\mathbb{V}} ( \vert X \vert I( \vert X \vert >Cs) )\,\mathrm{d}s & \text{ for $\beta =-1/r$;} \\ C\int _{1}^{\infty }s^{r-2}\mathcal{C}_{\mathbb{V}} ( \vert X \vert I( \vert X \vert >Cs) )\,\mathrm{d}s & \text{ for $\beta >-1/r$;} \end{cases}\displaystyle \\ &\le \textstyle\begin{cases} C \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{(r-1)/(1+\beta )} )< \infty & \text{ for $-1< \beta < -1/r$;} \\ C\mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r}\ln (1+ \vert X \vert ) )< \infty & \text{ for $\beta =-1/r$;} \\ C\mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r} )< \infty & \text{ for $\beta >-1/r$.} \end{cases}\displaystyle \end{aligned} $$

We now prove that (3.2) implies (3.1). By Remark 3.1 we see that (3.2) gives

$$\begin{aligned} \sum_{n=1}^{\infty }n^{r-2} \mathbb{V} \Biggl(\max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}b_{ni}X_{i} \Biggr\vert >\varepsilon \Biggr)< \infty . \end{aligned}$$
(4.4)

Observing that

$$\begin{aligned} \max_{1\le j\le n} \vert b_{nj}X_{j} \vert \le 2\max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}b_{ni}X_{i} \Biggr\vert , \end{aligned}$$
(4.5)

by (4.4), applied with \(\varepsilon /2\) in place of ε, we obtain

$$\begin{aligned}& \sum_{n=1}^{\infty }n^{r-2} \mathbb{V} \Bigl(\max_{1\le j\le n} \vert b_{nj}X_{j} \vert > \varepsilon \Bigr)< \infty , \end{aligned}$$
(4.6)
$$\begin{aligned}& \mathbb{V} \Bigl(\max_{1\le j\le n} \vert b_{nj}X_{j} \vert >\varepsilon \Bigr) \rightarrow 0 \quad \text{as }n\rightarrow \infty . \end{aligned}$$
(4.7)

By Lemma 2.5 and (4.7) we see that, for all sufficiently large n,

$$\begin{aligned} \sum_{i=1}^{n} \mathbb{V} \bigl( \vert b_{ni}X_{i} \vert > \varepsilon \bigr) \le C\mathbb{V} \Bigl(\max_{1\le j\le n} \vert b_{nj}X_{j} \vert >\varepsilon \Bigr). \end{aligned}$$
(4.8)

Hence combining (4.8) and (4.6) results in

$$\begin{aligned} \sum_{n=1}^{\infty }n^{r-2} \sum_{i=1}^{n}\mathbb{V} \bigl( \vert b_{ni}X_{i} \vert > \varepsilon \bigr)< \infty . \end{aligned}$$
(4.9)

As in the proof of \(\mathrm{I}_{11}<\infty \), we obtain

$$ \begin{aligned} \infty &>\sum_{n=1}^{\infty }n^{r-2} \sum_{i=1}^{n}\mathbb{V} \bigl( \vert b_{ni}X_{i} \vert > \varepsilon \bigr) \\ &\approx \int _{1}^{\infty }x^{r-2} \int _{1}^{x}\mathbb{V} \bigl( \vert X \vert >C\varepsilon x^{1+\beta }y^{-\beta } \bigr)\,\mathrm{d}y \,\mathrm{d}x \\ &\approx \textstyle\begin{cases} \int _{1}^{\infty }s^{\frac{r-1}{1+\beta }-1}\mathbb{V} ( \vert X \vert >Cs )\,\mathrm{d}s& \text{ for $-1< \beta < -1/r$;} \\ \int _{1}^{\infty }s^{r-1}\ln (s)\mathbb{V} ( \vert X \vert >Cs ) \,\mathrm{d}s & \text{ for $\beta =-1/r$;} \\ \int _{1}^{\infty }s^{r-1}\mathbb{V} ( \vert X \vert >Cs )\,\mathrm{d}s & \text{ for $\beta >-1/r$;} \end{cases}\displaystyle \\ &\approx \textstyle\begin{cases} \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{(r-1)/(1+\beta )} )< \infty & \text{ for $-1< \beta < -1/r$;} \\ \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r}\ln (1+ \vert X \vert ) )< \infty & \text{ for $\beta =-1/r$;} \\ \mathcal{C}_{\mathbb{V}} ( \vert X \vert ^{r} )< \infty & \text{ for $\beta >-1/r$.} \end{cases}\displaystyle \end{aligned} $$

Consequently, (3.1) holds, which finishes the proof of Theorem 3.1. □

Proof of Corollary 3.1

The proof is similar to that of Theorem 16 in Meng et al. [6]. For the reader’s convenience, we give a brief sketch. For \(0\le i\le n-1\), \(n\ge 1\), write

$$ Y_{ni}=-\frac{1}{b_{ni}}I(b_{ni}X_{i}< -1)+X_{i}I \bigl( \vert b_{ni}X_{i} \vert \le 1 \bigr)+ \frac{1}{b_{ni}}I(b_{ni}X_{i}>1). $$

As in the proof of Theorem 3.1, we obtain

$$ \begin{aligned} \sum_{n=1}^{\infty }n^{r-2} \mathbb{E} \Biggl(\max_{0\le j\le n-1} \Biggl\vert \sum _{i=0}^{j}b_{ni}X_{i} \Biggr\vert -\varepsilon \Biggr)^{+} \le{} & C\sum _{n=1}^{\infty }n^{r-2}\mathbb{E} \Biggl[\max _{0\le j\le n-1} \Biggl\vert \sum_{i=0}^{j}b_{ni}(Y_{ni}- \mathbb{E}Y_{ni}) \Biggr\vert - \varepsilon \Biggr]^{+} \\ &{}+C\sum_{n=1}^{\infty }n^{r-2}\sum _{i=0}^{n-1}\mathbb{V} \bigl( \vert b_{ni}X_{i} \vert >1 \bigr) \\ &{}+C\sum_{n=1}^{\infty }n^{r-2}\sum _{i=0}^{n-1}\mathbb{E} \vert b_{ni}X_{i} \vert I \bigl( \vert b_{ni}X_{i} \vert >1 \bigr) \\ =:{}&\mathrm{I}+\mathrm{II}+\mathrm{III}. \end{aligned} $$

For instance, combining Lemmas 2.1 and 2.3 results in

$$ \begin{aligned} \mathrm{III}&\le C\sum _{n=1}^{\infty }n^{r-2}\sum _{i=0}^{n-1}n^{-(1+ \beta )}(n-i)^{\beta } \mathbb{E} \bigl( \vert X \vert I \bigl( \vert X \vert >Cn^{1+\beta }(n-i)^{- \beta } \bigr) \bigr) \\ &\le C\sum_{n=1}^{\infty }n^{r-2} \sum_{k=1}^{n}n^{-(1+ \beta )}k^{\beta } \mathbb{E} \bigl( \vert X \vert I \bigl( \vert X \vert >Cn^{1+\beta }k^{-\beta } \bigr) \bigr)< \infty . \end{aligned} $$

The rest of the proof is similar to that of Theorem 3.1 and is omitted. □

Proof of Theorem 3.2

The proof is similar to that of Theorem 18 in Meng et al. [6]. Taking \(b_{ni}=A_{n-i}^{\alpha -1}/A_{n}^{\alpha }\), \(0\le i\le n\), \(n\ge 1\), we note that for \(\alpha >-1\), \(A_{n}^{\alpha }\approx n^{\alpha }/\Gamma (\alpha +1)\). Hence for \(\alpha >0\) we see that

$$\begin{aligned} b_{ni}\approx n^{-\alpha }(n-i)^{\alpha -1}, \quad 0\le i< n,\qquad b_{nn}\approx n^{- \alpha }. \end{aligned}$$
(4.10)

From \(A_{n}^{\alpha }=\sum_{i=0}^{n}A_{n-i}^{\alpha -1}\) it follows that

$$\begin{aligned} \sum_{i=0}^{n}b_{ni}=1. \end{aligned}$$
(4.11)

It follows from (4.10) and (4.11) that the assumptions of Corollary 3.1 are satisfied with \(\beta =\alpha -1\) (the extra term \(b_{nn}X_{n}\), with \(b_{nn}\approx n^{-\alpha }\), is handled in the same way). Hence (3.8) and (3.9) follow from (3.4) and (3.5). The proof of Theorem 3.2 is finished. □

Availability of data and materials

No data were used to support this study.

References

  1. Chen, Z.J.: Strong laws of large numbers for sub-linear expectations. Sci. China Math. 59(5), 945–954 (2016)


  2. Gao, F.Q., Xu, M.Z.: Large deviations and moderate deviations for independent random variables under sublinear expectations. Sci. Sin., Math. 41(4), 337–352 (2011) (in Chinese)


  3. Hu, F., Chen, Z.J., Zhang, D.F.: How big are the increments of G-Brownian motion. Sci. China Math. 57(8), 1686–1700 (2014)


  4. Hu, Z.C., Yang, Y.Z.: Some inequalities and limit theorems under sublinear expectations. Acta Math. Appl. Sin. Engl. Ser. 33(2), 451–462 (2017)


  5. Huang, W.H., Wu, P.Y.: Strong laws of large numbers for general random variables in sublinear expectation spaces. J. Inequal. Appl. 2019, Article ID 143 (2019)


  6. Meng, B., Wang, D.C., Wu, Q.Y.: Convergence for sums of asymptotically almost negatively associated random variables. Chinese J. Appl. Probab. Statist. 35(4), 345–359 (2019)


  7. Peng, S.G.: G-expectation, G-Brownian motion and related stochastic calculus of Itô type. Stoch. Anal. Appl. 2(4), 541–567 (2007)


  8. Peng, S.G.: Nonlinear expectations and stochastic calculus under uncertainty (2010) arXiv:1002.4546v1

  9. Shen, A.T., Wu, R.C.: Strong convergence for sequences of asymptotically almost negatively associated random variables. Stochastics 86(2), 291–303 (2014)


  10. Wu, Q.Y.: Precise asymptotics for complete integral convergence under sublinear expectations. Math. Probl. Eng. 2020, Article ID 3145935 (2020)


  11. Xu, M.Z., Cheng, K.: Precise asymptotics in the law of the iterated logarithm under sublinear expectations. Math. Probl. Eng. 2021, Article ID 6691857 (2021)


  12. Zhang, L.X.: Donsker’s invariance principle under the sub-linear expectation with an application to Chung’s law of the iterated logarithm. Commun. Math. Stat. 3(2), 187–214 (2015)


  13. Zhang, L.X.: Exponential inequalities under sub-linear expectations with applications. Sci. China Math. 59(12), 2503–2526 (2016)


  14. Zhang, L.X.: Rosenthal’s inequalities for independent and negatively dependent random variables under sub-linear expectations with applications. Sci. China Math. 59(4), 759–768 (2016)


  15. Zhang, L.X.: Strong limit theorems for extended independent and extended negatively dependent random variables under non-linear expectations (2016) arXiv:1608.0071v1

  16. Zhong, H.Y., Wu, Q.Y.: Complete convergence and complete moment convergence for weighted sums of extended negatively dependent random variables under sub-linear expectation. J. Inequal. Appl. 2017, Article ID 261 (2017). https://doi.org/10.1186/s13660-017-1538-1



Acknowledgements

The authors are very grateful to the referees, whose comments helped to correct some mistakes and improve the quality of the paper.

Funding

This research was supported by Doctoral Scientific Research Starting Foundation of Jingdezhen Ceramic University (No. 102/01003002031), Scientific Program of Department of Education of Jiangxi Province of China (Nos. GJJ190732, GJJ180737), Natural Science Foundation Program of Jiangxi Province 20202BABL211005, and National Natural Science Foundation of China (No. 61662037).

Author information


Contributions

Both authors contributed equally and read and approved the final manuscript.

Corresponding author

Correspondence to Mingzhou Xu.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.



Cite this article

Xu, M., Cheng, K. Convergence for sums of i.i.d. random variables under sublinear expectations. J Inequal Appl 2021, 157 (2021). https://doi.org/10.1186/s13660-021-02692-x
