Central limit theorems for sub-linear expectation under the Lindeberg condition
Journal of Inequalities and Applications volume 2018, Article number: 316 (2018)
Abstract
In this paper, we investigate central limit theorems for sub-linear expectation for a sequence of independent random variables without the assumption of identical distribution. We first give a bound on the distance between the normalized sum distribution and the G-normal distribution, which can be used to derive the central limit theorem for sub-linear expectation under the Lindeberg condition. We then obtain the central limit theorem for capacity under the Lindeberg condition, as well as the central limit theorem for capacity for summability methods under the Lindeberg condition.
1 Introduction
Peng [15] put forward the theory of sub-linear expectation to describe probability uncertainties in statistics and economics that are difficult to handle with classical probability theory. There has been increasing interest in sub-linear expectation (see, for example, [1, 2, 4, 11, 18, 26]).
The classical central limit theorem (CLT for short) is a fundamental result in probability theory. Peng [16] initiated the CLT for sub-linear expectation for a sequence of i.i.d. random variables with finite \((2+\alpha)\)-moments for some \(\alpha >0\). The CLT for sub-linear expectation has since developed considerably. Hu and Zhang [10] obtained a CLT for capacity. Li and Shi [13] obtained a CLT for sub-linear expectation without the assumption of identical distribution. Hu [9] extended Peng’s CLT by weakening the assumptions on the test functions. Zhang and Chen [21] derived a weighted CLT for sub-linear expectation. Hu and Zhou [12] presented some multi-dimensional CLTs without the assumption of identical distribution. Li [14] proved a CLT for sub-linear expectation for a sequence of m-dependent random variables. Rokhlin [19] gave a CLT under the Lindeberg condition in the classical probability setting with variance uncertainty. Zhang [22] obtained a CLT for sub-linear expectation under a moment condition weaker than \((2+\alpha)\)-moments. Zhang [23] established a martingale CLT and a functional CLT for sub-linear expectation under the Lindeberg condition.
The purpose of this paper is to investigate CLTs for sub-linear expectation for a sequence of independent random variables without the assumption of identical distribution. First, we give a bound on the distance between the normalized sum distribution \(\mathbb{E}[\varphi (\frac{S_{n}}{\overline{B}_{n}})]\) and the G-normal distribution \(\widetilde{\mathbb{E}}[\varphi (\xi)]\), where \(\xi \sim \mathcal{N}( \{0\};[\underline{\sigma }^{2},1])\). It can be used to derive the CLT for sub-linear expectation under the Lindeberg condition directly, which coincides with the result in Zhang [23]. Unlike the classical case, when choosing \(\underline{B}_{n}\) as the normalizing factor, we can also obtain a bound on the distance between the normalized sum distribution \(\mathbb{E}[\varphi (\frac{S_{n}}{\underline{B}_{n}})]\) and the corresponding G-normal distribution \(\widetilde{\mathbb{E}}[\varphi ( \eta)]\), where \(\eta \sim \mathcal{N}(\{0\};[1,\overline{\sigma }^{2}])\). Second, we obtain a CLT for capacity under the Lindeberg condition, which extends the CLT for capacity for a sequence of i.i.d. random variables in Hu and Zhang [10]. We also study the CLT for capacity for summability methods under the Lindeberg condition. The regular summability method is an important subject in functional analysis, and in recent years summability methods have been found to play an important role in the study of statistical convergence (see [5,6,7, 20]). So it is meaningful to investigate the CLT for capacity for summability methods.
This paper is organized as follows. In Sect. 2, we recall some basic concepts and lemmas related to the main results. In Sect. 3, we give a bound on the distance between the normalized sum distribution and G-normal distribution. In Sect. 4, we prove a CLT for capacity under the Lindeberg condition. In Sect. 5, we show a CLT for capacity for summability methods under the Lindeberg condition.
2 Basic concepts and lemmas
This paper works within the sub-linear expectation framework established by Peng [15,16,17,18]. Let \((\varOmega,\mathcal{F})\) be a given measurable space. Let \(\mathcal{H}\) be a linear space of real functions defined on Ω such that if \(X_{1},X_{2},\ldots,X_{n}\in \mathcal{H}\) then \(\varphi (X_{1},X_{2},\ldots,X_{n})\in \mathcal{H}\) for each \(\varphi \in C_{l,\mathrm{Lip}}(\mathbb{R}^{n})\), where \(C_{l,\mathrm{Lip}}(\mathbb{R}^{n})\) denotes the linear space of locally Lipschitz continuous functions φ satisfying

$$ \bigl\vert \varphi (\boldsymbol{x})-\varphi (\boldsymbol{y}) \bigr\vert \leq C\bigl(1+ \vert \boldsymbol{x} \vert ^{m}+ \vert \boldsymbol{y} \vert ^{m}\bigr) \vert \boldsymbol{x}-\boldsymbol{y} \vert,\quad \forall \boldsymbol{x},\boldsymbol{y}\in \mathbb{R}^{n}, $$

for some \(C>0\), \(m\in \mathbb{N}\) depending on φ. \(\mathcal{H}\) contains all \(I_{A}\) where \(A\in \mathcal{F}\). We also denote by \(C_{b,\mathrm{Lip}}(\mathbb{R}^{n})\) the linear space of bounded Lipschitz continuous functions φ satisfying

$$ \bigl\vert \varphi (\boldsymbol{x}) \bigr\vert \leq C,\qquad \bigl\vert \varphi (\boldsymbol{x})-\varphi (\boldsymbol{y}) \bigr\vert \leq C \vert \boldsymbol{x}-\boldsymbol{y} \vert,\quad \forall \boldsymbol{x},\boldsymbol{y}\in \mathbb{R}^{n}, $$

for some \(C>0\).
Definition 2.1
A functional \(\mathbb{E}\): \(\mathcal{H} \rightarrow \overline{ \mathbb{R}}\) is said to be a sub-linear expectation if it satisfies, for all \(X,Y\in \mathcal{H}\):
- (a) Monotonicity: \(X\geq Y\) implies \(\mathbb{E}[X]\geq \mathbb{E}[Y]\).
- (b) Constant preserving: \(\mathbb{E}[c]=c\), \(\forall c\in \mathbb{R}\).
- (c) Positive homogeneity: \(\mathbb{E}[\lambda X]=\lambda \mathbb{E}[X]\), \(\forall \lambda \geq 0\).
- (d) Sub-additivity: \(\mathbb{E}[X+Y]\leq \mathbb{E}[X]+\mathbb{E}[Y]\) whenever \(\mathbb{E}[X]+\mathbb{E}[Y]\) is well defined.
The triple \((\varOmega, \mathcal{H}, \mathbb{E})\) is called a sub-linear expectation space.
Remark 2.1
The sub-linear expectation \(\mathbb{E}[\cdot ]\) satisfies translation invariance: \(\mathbb{E}[X+c]=\mathbb{E}[X]+c\), \(\forall c\in \mathbb{R}\).
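An illustrative sketch (an assumption of this note, not from the paper): a sub-linear expectation can be realized as an upper expectation \(\mathbb{E}[X]=\max_{P}E_{P}[X]\) over a family of probability measures. The following Python snippet builds one on a hypothetical 6-point sample space with two candidate measures and checks properties (a)–(d) of Definition 2.1 together with the translation invariance of Remark 2.1.

```python
# Sketch: a sub-linear expectation as the maximum of finitely many
# linear expectations, on a hypothetical 6-point sample space.
import numpy as np

P = np.array([[0.1, 0.2, 0.2, 0.2, 0.2, 0.1],   # two candidate measures
              [0.3, 0.1, 0.1, 0.1, 0.1, 0.3]])

def E(X):
    """Sub-linear expectation: maximum of the linear expectations."""
    return max(float(p @ X) for p in P)

X = np.array([1.0, -2.0, 0.5, 3.0, -1.0, 2.0])
Y = np.array([0.5, -2.5, 0.0, 2.0, -1.0, 1.5])  # Y <= X pointwise

assert E(X) >= E(Y)                              # (a) monotonicity
assert E(np.full(6, 3.0)) == 3.0                 # (b) constant preserving
assert abs(E(2.5 * X) - 2.5 * E(X)) < 1e-12     # (c) positive homogeneity
assert E(X + Y) <= E(X) + E(Y) + 1e-12          # (d) sub-additivity
assert abs(E(X + 7.0) - (E(X) + 7.0)) < 1e-12   # Remark 2.1
```

Each property follows because every \(E_{P}\) is linear and the maximum of linear functionals is monotone, positively homogeneous, and sub-additive.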
Definition 2.2
([3])
A set function \(V:\mathcal{F}\rightarrow [0,1]\) is called a capacity if it satisfies
- (a) \(V(\emptyset)=0\), \(V(\varOmega)=1\).
- (b) \(V(A)\leq V(B)\) whenever \(A\subset B\), \(A,B\in \mathcal{F}\).
Definition 2.3
For a capacity V, a set A is called a polar set if \(V(A)=0\). We say a property holds “quasi-surely” (q.s.) if it holds outside a polar set.
Definition 2.4
A sub-linear expectation \(\mathbb{E}:\mathcal{H}\rightarrow \overline{ \mathbb{R}}\) is said to be continuous if it satisfies:
- (1) Continuity from below: \(X_{n}\uparrow X\) implies \(\mathbb{E}[X_{n}]\uparrow \mathbb{E}[X]\), where \(0\leq X_{n}\), \(X\in \mathcal{H}\).
- (2) Continuity from above: \(X_{n}\downarrow X\) implies \(\mathbb{E}[X_{n}]\downarrow \mathbb{E}[X]\), where \(0\leq X_{n}\), \(X\in \mathcal{H}\).
A capacity \(V:\mathcal{F}\rightarrow [0,1]\) is said to be continuous if it satisfies:
- (1) Continuity from below: \(A_{n}\uparrow A\) implies \(V(A_{n})\uparrow V(A)\), where \(A_{n},A\in \mathcal{F}\).
- (2) Continuity from above: \(A_{n}\downarrow A\) implies \(V(A_{n})\downarrow V(A)\), where \(A_{n},A\in \mathcal{F}\).
The conjugate expectation \(\mathcal{E}\) of a sub-linear expectation \(\mathbb{E}\) is defined by

$$ \mathcal{E}[X]:=-\mathbb{E}[-X],\quad \forall X\in \mathcal{H}. $$
Obviously, for all \(X\in \mathcal{H}\), \(\mathcal{E}[X]\leq \mathbb{E}[X]\). A pair of capacities can be induced as follows: \(\mathbb{V}(A):=\mathbb{E}[I_{A}]\), \(v(A):=\mathcal{E}[I _{A}]=1-\mathbb{V}(A^{c})\), \(\forall A\in \mathcal{F}\).
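A small numeric sketch (assumption: a hypothetical two-measure family on a 6-point space, not from the paper) illustrating the conjugate expectation \(\mathcal{E}[X]=-\mathbb{E}[-X]\) and the induced pair of capacities \(\mathbb{V}\), v.

```python
# Sketch: conjugate (lower) expectation and the induced capacities
# for an upper expectation over two hypothetical measures.
import numpy as np

P = np.array([[0.1, 0.2, 0.2, 0.2, 0.2, 0.1],
              [0.3, 0.1, 0.1, 0.1, 0.1, 0.3]])

def E(X):                      # sub-linear (upper) expectation
    return max(float(p @ X) for p in P)

def cE(X):                     # conjugate (lower) expectation: -E[-X]
    return -E(-X)

X = np.array([1.0, -2.0, 0.5, 3.0, -1.0, 2.0])
assert cE(X) <= E(X)           # cE[X] <= E[X] for all X

IA = np.array([1.0, 0.0, 0.0, 1.0, 0.0, 0.0])   # indicator of A = {0, 3}
V_A = E(IA)                    # V(A) := E[I_A]   -> 0.4 here
v_A = cE(IA)                   # v(A) := cE[I_A]  -> 0.3 here
assert abs(v_A - (1.0 - E(1.0 - IA))) < 1e-12   # v(A) = 1 - V(A^c)
assert v_A <= V_A
```

In this representation \(\mathbb{V}\) is the maximal probability of A over the family and v the minimal one, which is exactly the duality \(v(A)=1-\mathbb{V}(A^{c})\).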
Definition 2.5
(Independence) \(\boldsymbol{Y}=(Y_{1},\ldots,Y_{n})\) (\(Y_{i} \in \mathcal{H}\)) is said to be independent of \(\boldsymbol{X}=(X_{1},\ldots,X _{m})\) (\(X_{i}\in \mathcal{H}\)) if, for each test function \(\varphi \in C_{l,\mathrm{Lip}}(\mathbb{R}^{m} \times \mathbb{R}^{n})\),

$$ \mathbb{E}\bigl[\varphi (\boldsymbol{X},\boldsymbol{Y})\bigr]=\mathbb{E}\bigl[\mathbb{E}\bigl[\varphi (\boldsymbol{x},\boldsymbol{Y})\bigr]\big|_{\boldsymbol{x}=\boldsymbol{X}}\bigr], $$

whenever the sub-linear expectations are finite.
\(\{X_{n}\}_{n=1}^{\infty }\) is said to be a sequence of independent random variables if \(X_{n+1}\) is independent of \((X_{1},\ldots,X_{n})\) for each \(n\geq 1\).
Let X be an n-dimensional random variable on a sub-linear expectation space \((\varOmega, \mathcal{H}, \mathbb{E})\). We define a functional on \(C_{l,\mathrm{Lip}}(\mathbb{R}^{n})\) by

$$ \mathbb{F}_{\boldsymbol{X}}[\varphi ]:=\mathbb{E}\bigl[\varphi (\boldsymbol{X})\bigr],\quad \forall \varphi \in C_{l,\mathrm{Lip}}\bigl(\mathbb{R}^{n}\bigr). $$
Then \(\mathbb{F}_{\boldsymbol{X}}[\cdot ]\) can be regarded as the distribution of X under \(\mathbb{E}\) and it characterizes the uncertainty of the distribution of X.
Definition 2.6
(Identical distribution) Two n-dimensional random variables \(\boldsymbol{X}_{1}\), \(\boldsymbol{X}_{2}\) on respective sub-linear expectation spaces \((\varOmega_{1}, \mathcal{H}_{1}, \mathbb{E}_{1})\) and \((\varOmega_{2}, \mathcal{H}_{2}, \mathbb{E}_{2})\) are called identically distributed, denoted by \(\boldsymbol{X}_{1}\overset{d}{=} \boldsymbol{X}_{2}\), if, for each \(\varphi \in C_{l,\mathrm{Lip}}(\mathbb{R}^{n})\),

$$ \mathbb{E}_{1}\bigl[\varphi (\boldsymbol{X}_{1})\bigr]=\mathbb{E}_{2}\bigl[\varphi (\boldsymbol{X}_{2})\bigr], $$

whenever the sub-linear expectations are finite.
Definition 2.7
A one-dimensional random variable ξ on a sub-linear expectation space \((\varOmega, \mathcal{H}, \mathbb{E})\) is said to be G-normal distributed, denoted by \(\xi \sim \mathcal{N}(\{0\};[\underline{ \sigma }^{2},\overline{\sigma }^{2}])\), if for any \(\varphi \in C_{l,\mathrm{Lip}}( \mathbb{R})\) the function defined by

$$ u(t,x):=\mathbb{E}\bigl[\varphi (x+\sqrt{t}\xi)\bigr] $$

is the unique viscosity solution of the following parabolic partial differential equation (PDE) defined on \([0,\infty)\times \mathbb{R}\):

$$ \partial_{t} u-G(\partial_{xx} u)=0,\qquad u(0,x)=\varphi (x), $$

where \(G(a)=\frac{1}{2}a^{+}\overline{\sigma }^{2}-\frac{1}{2}a^{-}\underline{ \sigma }^{2}\), \(a\in \mathbb{R}\).
Remark 2.2
The G-normal distributed random variable ξ satisfies: \(a\xi +b\overline{\xi }\overset{d}{=}\sqrt{a^{2}+b^{2}}\xi\), \(\forall a,b\geq 0\), where \(\overline{\xi }\overset{d}{=}\xi \) and ξ̅ is independent of ξ. This implies \(\mathbb{E}[\xi ]= \mathbb{E}[-\xi ]=0\).
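The G-normal expectation can be approximated operationally: in a discrete-time scheme an adversary chooses each step's volatility in \(\{\underline{\sigma },\overline{\sigma }\}\), mirroring the two regimes of \(G(a)\). The following dynamic-programming sketch (an illustration assumed here, not code from the paper) approximates \(\widetilde{\mathbb{E}}[\varphi (\xi)]\) on a binary tree; for convex φ the adversary always takes σ̄, for concave φ always σ̲, so the two test payoffs below are computed exactly.

```python
# Sketch: approximate E~[phi(xi)] for xi ~ N({0}; [sig_lo^2, sig_hi^2])
# by backward dynamic programming over a non-recombining binary tree.
# At each of n steps the adversary picks the volatility maximizing the
# continuation value, matching G(a) = (a^+ sig_hi^2 - a^- sig_lo^2)/2.

def g_normal_expectation(phi, sig_lo, sig_hi, n=8):
    dt = 1.0 / n
    sqdt = dt ** 0.5

    def value(k, x):
        if k == n:                    # terminal payoff
            return phi(x)
        # sup over the volatility, then a symmetric +/- step
        return max(0.5 * (value(k + 1, x + s * sqdt) +
                          value(k + 1, x - s * sqdt))
                   for s in (sig_lo, sig_hi))

    return value(0, 0.0)

# phi(x) = x^2 is convex: sig_hi is chosen at every step and the value
# is exactly sig_hi^2; phi(x) = -x^2 is concave and gives -sig_lo^2.
assert abs(g_normal_expectation(lambda x: x * x, 0.5, 1.0) - 1.0) < 1e-9
assert abs(g_normal_expectation(lambda x: -x * x, 0.5, 1.0) + 0.25) < 1e-9
```

For general φ the scheme only converges as \(n\rightarrow \infty \); the quadratic payoffs are special in that each backward step adds exactly \(\overline{\sigma }^{2}\,dt\) (resp. subtracts \(\underline{\sigma }^{2}\,dt\)), so the value is exact for every n.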
Next we recall the definition of G-expectation. Let \(\widetilde{\varOmega }=C[0,\infty)\) be the space of all \(\mathbb{R}\)-valued continuous paths \((\omega_{t})_{t\geq 0}\) with \(\omega_{0}=0\), equipped with the distance

$$ \rho \bigl(\omega^{1},\omega^{2}\bigr):=\sum_{i=1}^{\infty }2^{-i}\Bigl[\Bigl(\max_{t\in [0,i]} \bigl\vert \omega^{1}_{t}-\omega^{2}_{t} \bigr\vert \Bigr)\wedge 1\Bigr]. $$

Denote \(W_{t}(\omega):=\omega_{t}\) for each \(\omega \in \widetilde{\varOmega }\) and

$$ L_{ip}(\widetilde{\varOmega }):=\bigl\{ \varphi (W_{t_{1}},\ldots,W_{t_{n}}): n\geq 1, t_{1},\ldots,t_{n}\in [0,\infty), \varphi \in C_{l,\mathrm{Lip}}\bigl(\mathbb{R}^{n}\bigr)\bigr\} . $$

For each given monotonic and sub-linear function \(G(a)=\frac{1}{2}a ^{+}\overline{\sigma }^{2}-\frac{1}{2}a^{-}\underline{\sigma }^{2}\), \(0\leq \underline{\sigma }\leq \overline{\sigma }<\infty \), \(a\in \mathbb{R}\), let the canonical process \((W_{t})_{t\geq 0}\) be a G-Brownian motion on a G-expectation space \((\widetilde{\varOmega },L _{ip}(\widetilde{\varOmega }),\widetilde{\mathbb{E}})\). That is, for \(0\leq t_{1}\leq \cdots \leq t_{n}<\infty \),

$$ \widetilde{\mathbb{E}}\bigl[\varphi (W_{t_{1}},\ldots,W_{t_{n-1}},W_{t_{n}}-W_{t_{n-1}})\bigr]=\widetilde{\mathbb{E}}\bigl[\psi (W_{t_{1}},\ldots,W_{t_{n-1}})\bigr], $$

where \(\psi (x_{1},\ldots,x_{n-1})=\widetilde{\mathbb{E}}[\varphi (x_{1},\ldots,x _{n-1},\sqrt{t_{n}-t_{n-1}}W_{1})]\), \(W_{1}\sim \mathcal{N}(\{0\};[\underline{ \sigma }^{2},\overline{\sigma }^{2}])\).
For each \(p\geq 1\), we denote \(L_{G}^{p}(\widetilde{\varOmega })\) as the completion of \(L_{ip}(\widetilde{\varOmega })\) under the norm \(\|X\|_{L _{G}^{p}}:=(\widetilde{\mathbb{E}}[|X|^{p}])^{1/p}\). Then the G-expectation \(\widetilde{\mathbb{E}}\) can be continuously extended to \((\widetilde{\varOmega },L_{G}^{p}(\widetilde{\varOmega }))\). We still denote the extended G-expectation space by \((\widetilde{\varOmega },L_{G}^{p}( \widetilde{\varOmega }),\widetilde{\mathbb{E}})\).
Proposition 2.1
There exists a weakly compact set of probability measures \(\mathcal{P}\) on \((\widetilde{\varOmega },\mathcal{B}( \widetilde{\varOmega }))\) such that

$$ \widetilde{\mathbb{E}}[X]=\max_{P\in \mathcal{P}}E_{P}[X],\quad \forall X\in L_{G}^{1}(\widetilde{\varOmega }), $$

where \(\mathcal{B}(\widetilde{\varOmega })\) denotes the Borel σ-algebra of Ω̃. We say that \(\mathcal{P}\) represents \(\widetilde{\mathbb{E}}\).
Given a G-expectation space \((\widetilde{\varOmega },L_{G}^{1}( \widetilde{\varOmega }),\widetilde{\mathbb{E}})\), we can define a pair of capacities:

$$ \widetilde{\mathbb{V}}(A):=\sup_{P\in \mathcal{P}}P(A),\qquad \widetilde{v}(A):=\inf_{P\in \mathcal{P}}P(A),\quad \forall A\in \mathcal{B}(\widetilde{\varOmega }). $$
Obviously, by Proposition 2.1, \(\widetilde{\mathbb{E}}[\cdot ]\) and \(\widetilde{\mathbb{V}}(\cdot)\) are continuous from below.
Definition 2.8
A sub-linear expectation \(\mathbb{E}\) is said to be regular if, for each sequence \(\{X_{n}\}_{n=1}^{\infty }\) satisfying \(X_{n}\downarrow 0\), we have \(\mathbb{E}[X_{n}]\downarrow 0\).
Lemma 2.1
- (1) For any closed sets \(F_{n}\downarrow F\), it holds that \(\widetilde{\mathbb{V}}(F_{n})\downarrow \widetilde{\mathbb{V}}(F)\).
- (2) The G-expectation \(\widetilde{\mathbb{E}}[\cdot ]\) is regular.
Hu et al. [11] indicated that G-Brownian motion does not converge to any single point in probability under capacity \(\widetilde{\mathbb{V}}\) as follows.
Lemma 2.2
Given a G-expectation space \((\widetilde{\varOmega },L_{G}^{2}( \widetilde{\varOmega }),\widetilde{\mathbb{E}})\), for any fixed \(a\in \mathbb{R}\), it holds that
In particular, the above equation holds for G-normal distribution \(W_{1}\).
Lemma 2.3
([8])
\(\mathbb{E}[|X|]<\infty \) implies \(|X|<\infty \) q.s., i.e., \(\mathbb{V}(|X|=\infty)=0\).
The following Rosenthal’s inequality under sub-linear expectation was obtained by Zhang [24].
Proposition 2.2
Assume that \(\{X_{n}\}_{n=1}^{\infty }\) is a sequence of independent random variables. Denote \(S_{n}:=\sum_{i=1}^{n} X_{i}\). Then, for any \(p\geq 2\), we have

$$ \mathbb{E}\Bigl[\max_{k\leq n} \vert S_{k} \vert ^{p}\Bigr]\leq C_{p}\Biggl\{ \sum_{i=1}^{n}\mathbb{E}\bigl[ \vert X_{i} \vert ^{p}\bigr]+\Biggl(\sum_{i=1}^{n}\mathbb{E}\bigl[X_{i}^{2}\bigr]\Biggr)^{p/2}+\Biggl(\sum_{i=1}^{n}\bigl[\bigl(\mathbb{E}[-X_{i}]\bigr)^{+}+\bigl(\mathbb{E}[X_{i}]\bigr)^{+}\bigr]\Biggr)^{p}\Biggr\} , $$ (2.1)

where \(C_{p}\) is a positive constant depending on p.
Lemma 2.4
Assume that \(\mathbb{E}\) is continuous from below and \(\lim_{n\rightarrow \infty }X_{n}=X\). Then

$$ \mathbb{E}[X]\leq \liminf_{n\rightarrow \infty }\mathbb{E}[X_{n}]. $$

If we further assume that \(\mathbb{E}\) is continuous, then

$$ \lim_{n\rightarrow \infty }\mathbb{E}[X_{n}]=\mathbb{E}[X]. $$
Proof
Since \(\inf_{i\geq n}X_{i}\) is non-decreasing in n, we have
If \(\mathbb{E}\) is continuous, by noting that \(\sup_{i\geq n}X_{i}\) is non-increasing in n, we have
Thus \(\lim_{n\rightarrow \infty }\mathbb{E}[X_{n}]=\mathbb{E}[X]\). □
Lemma 2.5
Assume that \(\mathbb{E}\) is continuous from below and regular. Let \(\{X_{n}\}_{n=1}^{\infty }\) be a sequence of independent random variables with \(\mathbb{E}[X_{n}]=\mathcal{E}[X_{n}]=0\) for any \(n\geq 1\) and \(\sum_{i=1}^{\infty }\mathbb{E}[X_{i}^{2}]<\infty \). Then \(S:=\sum_{i=1}^{\infty }X_{i}\) converges q.s. under capacity \(\mathbb{V}\) and, for any \(p\geq 2\), we have

$$ \mathbb{E}\bigl[ \vert S \vert ^{p}\bigr]\leq C_{p}\Biggl\{ \sum_{i=1}^{\infty }\mathbb{E}\bigl[ \vert X_{i} \vert ^{p}\bigr]+\Biggl(\sum_{i=1}^{\infty }\mathbb{E}\bigl[X_{i}^{2}\bigr]\Biggr)^{p/2}\Biggr\} . $$ (2.2)
Proof
One can refer to Zhang and Lin [25] for the proof of the convergence of S. Now we prove (2.2). By \(\mathbb{E}[X_{n}]=\mathcal{E}[X_{n}]=0\), taking \(\limsup_{n\rightarrow \infty }\) on both sides of (2.1), we have
On the other hand,
Note that \(\lim_{n\rightarrow \infty }S_{n}=S\). By Lemma 2.4 we have
Combining the above inequalities, we have
□
Throughout the rest of this paper, let \(\{X_{n}\}_{n=1}^{\infty }\) be a sequence of independent random variables on a sub-linear expectation space \((\varOmega,\mathcal{H},\mathbb{E})\) with \(\mathbb{E}[X_{n}]= \mathcal{E}[X_{n}]=0\), \(\mathbb{E}[X_{n}^{2}]=\overline{\sigma }_{n} ^{2}\), \(\mathcal{E}[X_{n}^{2}]=\underline{\sigma }_{n}^{2}\), \(0<\underline{\sigma }_{n}\leq \overline{\sigma }_{n}<\infty \). Denote \(S_{n}:=\sum_{i=1}^{n} X_{i}\), \(\overline{B}_{n}^{2}:=\sum_{i=1}^{n} \overline{ \sigma }_{i}^{2}\), and \(\underline{B}_{n}^{2}:=\sum_{i=1}^{n} \underline{\sigma}_{i}^{2}\). The symbol C denotes an arbitrary positive constant and may take different values in different places.
Zhang [23] obtained the following CLT for sub-linear expectation under the Lindeberg condition as a corollary of the martingale CLT for sub-linear expectation.
Theorem 2.1
Let ξ be G-normal distributed on a G-expectation space \((\widetilde{\varOmega },L_{G}^{2}(\widetilde{\varOmega }), \widetilde{\mathbb{E}})\) with \(\xi \sim \mathcal{N}(\{0\};[\underline{ \sigma }^{2},1])\), \(0<\underline{\sigma }\leq 1\). Assume that
- (1)
$$ \lim_{n\rightarrow \infty }\frac{1}{\overline{B}_{n}^{2}}\sum_{i=1} ^{n} \bigl\vert \underline{\sigma }^{2}\cdot \overline{ \sigma }_{i}^{2}-\underline{ \sigma }_{i}^{2} \bigr\vert =0. $$ (2.3)
- (2) For any \(\varepsilon >0\),
$$ \lim_{n\rightarrow \infty }\frac{1}{\overline{B}_{n}^{2}}\sum_{i=1} ^{n} \mathbb{E}\bigl[ \vert X_{i} \vert ^{2}I \bigl( \vert X_{i} \vert >\varepsilon \overline{B}_{n} \bigr)\bigr]=0. $$ (2.4)
Then, for any \(\varphi \in C_{b,\mathrm{Lip}}(\mathbb{R})\),

$$ \lim_{n\rightarrow \infty }\mathbb{E}\biggl[\varphi \biggl(\frac{S_{n}}{\overline{B}_{n}}\biggr)\biggr]=\widetilde{\mathbb{E}}\bigl[\varphi (\xi)\bigr]. $$ (2.5)
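A hedged numerical check with hypothetical data: for uniformly bounded random variables, a Lindeberg-type condition such as (2.4) holds automatically whenever \(\overline{B}_{n}\rightarrow \infty \), because \(\varepsilon \overline{B}_{n}\) eventually exceeds the bound, after which every summand \(\mathbb{E}[|X_{i}|^{2}I(|X_{i}|>\varepsilon \overline{B}_{n})]\) is zero. The variance sequence below is invented for illustration.

```python
# Sketch: locate the index from which the Lindeberg sum is exactly 0
# for a hypothetical bounded sequence with upper variances sigma_bar_i^2.
import math

sigma_bar2 = [(1.0 + 0.5 * math.sin(i)) ** 2 for i in range(1, 2001)]
bound, eps = 3.0, 0.1          # assume |X_i| <= 3 quasi-surely

B2, first_n = 0.0, None
for n, s2 in enumerate(sigma_bar2, start=1):
    B2 += s2                   # B_n^2 = sum of upper variances
    if first_n is None and eps * math.sqrt(B2) > bound:
        first_n = n            # from here on every indicator vanishes

assert first_n is not None and first_n <= 2000
```

The interesting cases of (2.4) are of course unbounded variables, where the truncated second moments must be controlled directly; the bounded case above only shows why the condition is a genuine restriction on the tails rather than on the variances.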
3 The bound on the distance between the normalized sum distribution and G-normal distribution
The following theorem gives a bound on the distance between the normalized sum distribution \(\mathbb{E}[\varphi (\frac{S_{n}}{ \overline{B}_{n}})]\) and G-normal distribution \(\widetilde{\mathbb{E}}[\varphi (\xi)]\) where \(\xi \sim \mathcal{N}( \{0\};[\underline{\sigma }^{2},1])\).
Theorem 3.1
Let ξ be G-normal distributed on a G-expectation space \((\widetilde{\varOmega },L_{G}^{2}(\widetilde{\varOmega }), \widetilde{\mathbb{E}})\) with \(\xi \sim \mathcal{N}(\{0\};[\underline{ \sigma }^{2},1])\), \(0<\underline{\sigma }\leq 1\).
Then, for any fixed \(\varphi \in C_{b,\mathrm{Lip}}(\mathbb{R})\) and any \(h>0\), \(0<\varepsilon <1\), there exist some \(0<\alpha < 1\), \(C>0\), and \(C_{h}>0\) (a positive constant depending on h) such that
Remark 3.1
By Theorem 3.1 we can derive Theorem 2.1. If (2.3) and (2.4) hold, taking \(n\rightarrow \infty \), \(\varepsilon \rightarrow 0\), and \(h\rightarrow 0\) in turn on both sides of (3.1), we can get (2.5).
Proof
For any fixed \(\varphi \in C_{b,\mathrm{Lip}}(\mathbb{R})\) and any \(h>0\), let \(V(t,x)=\widetilde{\mathbb{E}}[\varphi (x+\sqrt{1+h-t}\xi)]\). By Definition 2.7, we have that V is the unique viscosity solution of the following parabolic PDE:
Let \(X_{i}^{(n)}=(-\overline{B}_{n})\vee (X_{i} \wedge \overline{B} _{n})\), \(S_{i}^{(n)}=\frac{1}{\overline{B}_{n}}\sum_{j=1}^{i} X_{j} ^{(n)}\), \(S_{0}^{(n)}=0\), \(\delta_{i}^{(n)}=\frac{1}{\overline{B}_{n} ^{2}}\sum_{j=1}^{i} \overline{\sigma }_{j}^{2}\), \(\delta_{0}^{(n)}=0\) for each \(i=1,2,\ldots,n\). Then we have
Since \(\varphi \in C_{b,\mathrm{Lip}}(\mathbb{R})\), for any \(0<\varepsilon <1\), it holds that
and
Then
So it suffices to bound \(|\mathbb{E}[V(1,S_{n}^{(n)})]-V(0,0)|\).
where \(I_{i}^{(n)}\) and \(J_{i}^{(n)}\) are obtained by Taylor expansion:
Since \(X_{i+1}\) is independent of \(S_{i}^{(n)}\), we have
Similarly, we can also have \(\mathbb{E}[-\partial_{x} V(\delta_{i} ^{(n)},S_{i}^{(n)})X_{i+1}]=0\). It follows that
By the interior regularity of V (see Peng [18]), it holds that
which implies \(\partial_{t} V\), \(\partial_{x} V\), and \(\partial_{xx}V\) are uniformly \(\frac{\alpha }{2}\)-Hölder continuous in t and α-Hölder continuous in x on \([0,1]\times \mathbb{R}\). For any \(n\geq 1\) and \(i\leq n\), it holds that
By Proposition 2.2, we have
So we have \(\mathbb{E}[|\partial_{xx}V(\delta_{i}^{(n)},S_{i}^{(n)})|] \leq C_{h}\). Similarly \(\mathbb{E}[|\partial_{x}V(\delta_{i}^{(n)},S _{i}^{(n)})|]\leq C_{h}\). Then
On the other hand,
So we have
Since \(|X_{i+1}^{(n)}|^{2}-X_{i+1}^{2}\) is independent of \(\partial _{xx}V(\delta_{i}^{(n)},S_{i}^{(n)})\) and \(X_{i+1}^{(n)}-X_{i+1}\) is independent of \(\partial_{x} V(\delta_{i}^{(n)},S_{i}^{(n)})\), we have
On the other hand,
Then
For any \(0<\varepsilon <1\), we have
By Hölder’s inequality under sub-linear expectation, we have
Combining (3.7), (3.8), and (3.9), we have
By (3.4), (3.5), and (3.10), it holds that
Thus we obtain (3.1). □
By a similar method, we can obtain a bound on the distance between the normalized sum distribution \(\mathbb{E}[\varphi (\frac{S_{n}}{ \underline{B}_{n}})]\) and the corresponding G-normal distribution \(\widetilde{\mathbb{E}}[\varphi (\eta)]\). It can also be used to derive the CLT with normalizing factor \(\underline{B}_{n}\). We only state the theorem and omit the proof.
Theorem 3.2
Let η be G-normal distributed on a G-expectation space \((\widetilde{\varOmega },L_{G}^{2}(\widetilde{\varOmega }), \widetilde{\mathbb{E}})\) with \(\eta \sim \mathcal{N}(\{0\};[1,\overline{ \sigma }^{2}])\), \(\overline{\sigma }\geq 1\).
Then, for any fixed \(\varphi \in C_{b,\mathrm{Lip}}(\mathbb{R})\) and any \(h>0\), \(0<\varepsilon <1\), there exist some \(0<\alpha < 1\), \(C>0\), and \(C_{h}>0\) (a constant depending on h) such that
If we further assume that
- (1)
$$ \lim_{n\rightarrow \infty }\frac{1}{\underline{B}_{n}^{2}}\sum_{i=1} ^{n} \bigl\vert \overline{\sigma }^{2}\cdot \underline{ \sigma }_{i}^{2}-\overline{ \sigma }_{i}^{2} \bigr\vert =0. $$
- (2) For any \(\varepsilon >0\),
$$ \lim_{n\rightarrow \infty }\frac{1}{\underline{B}_{n}^{2}}\sum_{i=1} ^{n} \mathbb{E}\bigl[ \vert X_{i} \vert ^{2}I \bigl( \vert X_{i} \vert >\varepsilon \underline{B}_{n} \bigr)\bigr]=0. $$
Then, for any \(\varphi \in C_{b,\mathrm{Lip}}(\mathbb{R})\),

$$ \lim_{n\rightarrow \infty }\mathbb{E}\biggl[\varphi \biggl(\frac{S_{n}}{\underline{B}_{n}}\biggr)\biggr]=\widetilde{\mathbb{E}}\bigl[\varphi (\eta)\bigr]. $$
4 Central limit theorem for capacity
The following theorem is the CLT for capacity under the Lindeberg condition.
Theorem 4.1
Assume that
- (1) ξ is G-normal distributed on a G-expectation space \((\widetilde{\varOmega },L_{G}^{2}(\widetilde{\varOmega }), \widetilde{\mathbb{E}})\) with \(\xi \sim \mathcal{N}(\{0\};[\underline{ \sigma }^{2},1])\), \(0<\underline{\sigma }\leq 1\), and
$$ \lim_{n\rightarrow \infty }\frac{1}{\overline{B}_{n}^{2}}\sum_{i=1} ^{n} \bigl\vert \underline{\sigma }^{2}\cdot \overline{ \sigma }_{i}^{2}-\underline{ \sigma }_{i}^{2} \bigr\vert =0. $$ (4.1)
- (2) For any \(\varepsilon >0\),
$$ \lim_{n\rightarrow \infty }\frac{1}{\overline{B}_{n}^{2}}\sum_{i=1} ^{n} \mathbb{E}\bigl[ \vert X_{i} \vert ^{2}I \bigl( \vert X_{i} \vert >\varepsilon \overline{B}_{n} \bigr)\bigr]=0. $$ (4.2)
Then, for any \(a\in \mathbb{R}\),

$$ \lim_{n\rightarrow \infty }\mathbb{V}\biggl(\frac{S_{n}}{\overline{B}_{n}}\leq a\biggr)=\widetilde{\mathbb{V}}(\xi \leq a). $$
Proof
For any fixed \(\varepsilon >0\), define

$$ f(x):= \textstyle\begin{cases} 1, & x\leq a, \\ 1-\frac{x-a}{\varepsilon }, & a< x< a+\varepsilon, \\ 0, & x\geq a+\varepsilon, \end{cases} $$

and

$$ g(x):=f(x+\varepsilon). $$

It is easy to verify that \(f,g\in C_{b,\mathrm{Lip}}(\mathbb{R})\) and \(g(x)\leq I(x\leq a)\leq f(x)\). It follows that
By Theorem 2.1 we have
Then
Note that
which implies
By the arbitrariness of \(\varepsilon >0\) and Lemma 2.2, we have
which implies
Similarly, we can also obtain
That is,
□
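The smoothing step in this proof sandwiches the indicator \(I(x\leq a)\) between bounded Lipschitz functions. The piecewise-linear pair below is one standard choice (an assumption of this sketch, with hypothetical values of a and ε), and the sandwich and Lipschitz properties are checked numerically on a grid.

```python
# Sketch: f decreases linearly from 1 to 0 on [a, a + eps]; g(x) = f(x + eps)
# drops on [a - eps, a]. Then g <= I(. <= a) <= f pointwise and both
# functions are (1/eps)-Lipschitz, hence lie in C_{b,Lip}(R).
a, eps = 0.3, 0.05             # hypothetical threshold and smoothing width

def f(x):
    if x <= a:
        return 1.0
    if x >= a + eps:
        return 0.0
    return 1.0 - (x - a) / eps

def g(x):
    return f(x + eps)

xs = [a + 0.001 * k for k in range(-200, 201)]   # grid around a
for x in xs:
    indicator = 1.0 if x <= a else 0.0
    assert g(x) <= indicator <= f(x)             # the sandwich

# Lipschitz constant is 1/eps (checked on consecutive grid points)
slope = max(abs(f(u) - f(w)) / 0.001 for u, w in zip(xs, xs[1:]))
assert slope <= 1.0 / eps + 1e-6
```

As \(\varepsilon \rightarrow 0\) both f and g converge pointwise to the indicator away from the jump, which is why the argument needs \(\widetilde{\mathbb{V}}(\xi =a)\)-type degeneracy control (Lemma 2.2) to close the gap at \(x=a\).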
Remark 4.1
By a similar method, we can also obtain the CLT for capacity for the normalized sum \(S_{n}/\underline{B}_{n}\). We omit the details here.
5 Central limit theorem for summability methods
Let \(c_{i}(\lambda)\) be continuous functions on \((0,\infty)\) (or let λ take values only in \(\mathbb{N}^{*}\)). Assume that \(0\leq c_{i}( \lambda)\leq 1\) and, for any \(\lambda >0\),
where \(\lim_{\lambda \rightarrow \infty }\theta (\lambda)=0\). Denote \(\overline{B}_{\lambda }^{2}=\sum_{i=1}^{\infty }c_{i}(\lambda)^{2}\overline{ \sigma }_{i}^{2}\), \(\underline{B}_{\lambda }^{2}=\sum_{i=1}^{\infty }c _{i}(\lambda)^{2}\underline{\sigma }_{i}^{2}\), where \(\overline{B} _{\lambda }^{2}<\infty \) for any \(\lambda >0\). If \(\mathbb{E}\) is continuous from below and regular, then by Lemma 2.5, \(S_{\lambda }:=\sum_{i=1}^{\infty }c_{i}(\lambda)X_{i}\) is well defined q.s. under capacity \(\mathbb{V}\).
Theorem 5.1
Given a sub-linear expectation space \((\varOmega,\mathcal{H},\mathbb{E})\) such that \(\mathbb{E}\) is continuous from below and regular, assume that
- (1) ξ is G-normal distributed on a G-expectation space \((\widetilde{\varOmega },L_{G}^{2}(\widetilde{\varOmega }), \widetilde{\mathbb{E}})\) with \(\xi \sim \mathcal{N}(\{0\};[\underline{ \sigma }^{2},1])\), \(0<\underline{\sigma }\leq 1\), and
$$ \lim_{\lambda \rightarrow \infty }\frac{1}{\overline{B}_{\lambda } ^{2}}\sum_{i=1}^{\infty }c_{i}^{2}( \lambda) \bigl\vert \underline{\sigma }^{2} \cdot \overline{\sigma }_{i}^{2}-\underline{\sigma }_{i}^{2} \bigr\vert =0. $$ (5.1)
- (2) For any \(\varepsilon >0\),
$$ \lim_{\lambda \rightarrow \infty }\frac{1}{\overline{B}_{\lambda } ^{2}}\sum_{i=1}^{\infty }c_{i}^{2}( \lambda)\mathbb{E}\bigl[ \vert X_{i} \vert ^{2}I\bigl(c _{i}(\lambda) \vert X_{i} \vert >\varepsilon \overline{B}_{\lambda }\bigr)\bigr]=0. $$ (5.2)
Then, for any \(a\in \mathbb{R}\),

$$ \lim_{\lambda \rightarrow \infty }\mathbb{V}\biggl(\frac{S_{\lambda }}{\overline{B}_{\lambda }}\leq a\biggr)=\widetilde{\mathbb{V}}(\xi \leq a). $$
Proof
Denote
Note that \(\lim_{N\rightarrow \infty } \overline{B}_{\lambda,N}^{2}= \overline{B}_{\lambda }^{2}\). For any \(k>1\), we can choose N sufficiently large such that \(\overline{B}_{\lambda,N}^{2}\geq \overline{B}_{\lambda }^{2}/k^{2}\).
For any \(a>0\), \(t>0\), and \(X,Y\in \mathcal{H}\), it holds that
Then
Hence
For any \(\eta >0\), let
and
It is easy to verify that \(f,g\in C_{b,\mathrm{Lip}}(\mathbb{R})\) and \(g(x)\leq I(x\leq k(a+t))\leq f(x)\). By the argument in the proof of Theorem 4.1, we have
By (3.1), for any \(0<\varepsilon <1\), we have
Noting that \(\overline{B}_{\lambda,N}^{2}\geq \overline{B}_{\lambda } ^{2}/k^{2}\), we have
In addition, by Lemma 2.5 we have
Let \(t=(\mathbb{E}[|S^{*}_{\lambda,N}|^{2}])^{\frac{1}{3}}\). We have \(\lim_{N\rightarrow \infty }t=0\) and
So we have
By (5.1), (5.2), Lemma 2.1, and Lemma 2.2, letting \(t=(\mathbb{E}[|S ^{*}_{\lambda,N}|^{2}])^{\frac{1}{3}}\), \(N\rightarrow \infty \), \(\lambda \rightarrow \infty \), \(\varepsilon \rightarrow 0\), and \(\eta \rightarrow 0\) in turn, we have
By the arbitrariness of \(k>1\) and \(h>0\), we get
On the other hand, for any \(a>0\), \(0< t< a\), and \(X,Y\in \mathcal{H}\),
Then
Hence
By the same method as before, we have
Letting \(t=(\mathbb{E}[|S^{*}_{\lambda,N}|^{2}])^{\frac{1}{3}}\), \(N\rightarrow \infty \), \(\lambda \rightarrow \infty \), \(\varepsilon \rightarrow 0\), and \(\eta \rightarrow 0\) in turn, we have
By the arbitrariness of \(h>0\), we have
Combining (5.5) with (5.4), we obtain
Similarly, we can also have
This is equivalent to
□
References
Artzner, P., Delbaen, F., Eber, J.M., Heath, D.: Coherent measures of risk. Math. Finance 9(3), 203–228 (1999)
Chen, Z., Wu, P., Li, B.: A strong law of large numbers for non-additive probabilities. Int. J. Approx. Reason. 54(3), 365–377 (2013)
Choquet, G.: Theory of capacities. Ann. Inst. Fourier 5, 131–295 (1954)
Denis, L., Hu, M., Peng, S.: Function spaces and capacity related to a sublinear expectation: application to G-Brownian motion paths. Potential Anal. 34(2), 139–161 (2011)
Embrechts, P., Maejima, M.: The central limit theorem for summability methods of iid random variables. Z. Wahrscheinlichkeitstheor. Verw. Geb. 68(2), 191–204 (1984)
Freedman, A., Sember, J.: Densities and summability. Pac. J. Math. 95(2), 293–305 (1981)
Fridy, J.A.: On statistical convergence. Analysis 5, 301–313 (1985)
Hu, C.: A strong law of large numbers for sub-linear expectation under a general moment condition. Stat. Probab. Lett. 119, 248–258 (2016)
Hu, F.: Moment bounds for IID sequences under sublinear expectations. Sci. China Math. 54(10), 2155–2160 (2011)
Hu, F., Zhang, D.: Central limit theorem for capacities. C. R. Math. 348(19–20), 1111–1114 (2010)
Hu, M., Wang, F., Zheng, G.: Quasi-continuous random variables and processes under the G-expectation framework. Stoch. Process. Appl. 126(8), 2367–2387 (2016)
Hu, Z., Zhou, L.: Multi-dimensional central limit theorems and laws of large numbers under sublinear expectations. Acta Math. Sin. Engl. Ser. 31(2), 305–318 (2015)
Li, M., Shi, Y.: A general central limit theorem under sublinear expectations. Sci. China Math. 53(8), 1989–1994 (2010)
Li, X.: A central limit theorem for m-dependent random variables under sublinear expectations. Acta Math. Appl. Sin. Engl. Ser. 31(2), 435–444 (2015)
Peng, S.: G-expectation, G-Brownian motion and related stochastic calculus of Itô type. In: Stochastic Analysis and Applications, pp. 541–567. Springer, Berlin (2007)
Peng, S.: Law of large numbers and central limit theorem under nonlinear expectations (2007). arXiv preprint math/0702358
Peng, S.: Survey on normal distributions, central limit theorem, Brownian motion and the related stochastic calculus under sublinear expectations. Sci. China Ser. A, Math. 52(7), 1391–1411 (2009)
Peng, S.: Nonlinear expectations and stochastic calculus under uncertainty (2010). arXiv preprint arXiv:1002.4546
Rokhlin, D.B.: Central limit theorem under variance uncertainty. Electron. Commun. Probab. 20, 10 (2015)
Wang, Q., Su, C.: Non-uniform Berry–Essen distance for summability methods with applications. Acta Math. Appl. Sin. 16(3), 383–395 (1993) (in Chinese)
Zhang, D., Chen, Z.: A weighted central limit theorem under sublinear expectations. Commun. Stat., Theory Methods 43(3), 566–577 (2014)
Zhang, L.: Exponential inequalities under the sub-linear expectations with applications to laws of the iterated logarithm. Sci. China Math. 59(12), 2503–2526 (2016)
Zhang, L.: Lindeberg’s central limit theorems for martingale like sequences under nonlinear expectations (2016). arXiv preprint arXiv:1611.01619
Zhang, L.: Rosenthal’s inequalities for independent and negatively dependent random variables under sub-linear expectations with applications. Sci. China Math. 59(4), 751–768 (2016)
Zhang, L., Lin, J.: Marcinkiewicz’s strong law of large numbers for non-additive expectation (2017). arXiv preprint arXiv:1703.00604
Zhong, H., Wu, Q.: Complete convergence and complete moment convergence for weighted sums of extended negatively dependent random variables under sub-linear expectation. J. Inequal. Appl. 2017, Article ID 261 (2017)
Acknowledgements
The author would like to thank the editor and referees for their valuable comments.
Funding
Not applicable.
Contributions
The author organized and wrote this paper. The author read and approved the final manuscript.
Ethics declarations
Competing interests
The author declares that there are no competing interests.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Hu, C. Central limit theorems for sub-linear expectation under the Lindeberg condition. J Inequal Appl 2018, 316 (2018). https://doi.org/10.1186/s13660-018-1901-x