Concentration inequalities for upper probabilities
Journal of Inequalities and Applications volume 2020, Article number: 151 (2020)
Abstract
In this paper, we obtain a Bernstein-type concentration inequality and McDiarmid’s inequality under upper probabilities for exponentially independent random variables. In contrast to the classical results, our inequalities hold under a family of probability measures rather than a single probability measure. As applications, we obtain the convergence rates of the law of large numbers and of a Marcinkiewicz–Zygmund-type law of large numbers for random variables in upper expectation spaces.
1 Introduction
Concentration inequalities are useful technical tools for studying limit theory in classical probability and statistics: they bound the probability that a random variable deviates from some value. The law of large numbers, the central limit theorem, and the law of the iterated logarithm can all be regarded as consequences of concentration inequalities. The Bernstein-type inequality plays an especially important role among concentration inequalities, as it bounds the deviation of a sum of independent random variables from its mean, while McDiarmid’s inequality bounds the deviations of Doob martingales in a probability space. More precisely, let Ω be a sample space, \(\mathcal{F}\) be a Borel field of subsets of Ω, and P be a probability measure on \(\mathcal{F}\). Suppose that \(X_{1}, X_{2}, \ldots , X_{n}, \ldots \) is a sequence of independent, zero-mean random variables defined on the probability space \((\varOmega , \mathcal{F}, \mathrm{P})\). Denote by \(S_{n}\) the partial sum of this sequence, namely \(S_{n} \triangleq \sum_{k=1}^{n} X_{k}\), and set \(v_{n}^{2} \triangleq \frac{1}{n} \sum_{k=1}^{n} \mathrm{E}[X_{k}^{2}]\). For any \(x > 0\), Bernstein proved in [2] that
under the standard Bernstein condition which supposes that there exists a positive constant c, for any \(1 \leq k \leq n\) and any integer \(p \geq 3\),
Since then, several refinements and special cases of Bernstein’s inequality have been established, such as Hoeffding’s inequality [1] and Bennett’s inequality [1]. In particular, Rio [21] improved (1.1) under the weaker condition
McDiarmid’s inequality was first proved in [17] by using martingale theory and reproved by Ying in [23]. This inequality shows that, for any \(\varepsilon >0\), any \(f:\mathbb{R}^{n} \rightarrow \mathbb{R}\) with bounded differences \(\{c_{k}\}_{k=1}^{n}\),
A function f with bounded differences \(\{c_{k}\}_{k=1}^{n}\) means that
\[\sup_{x_{1},\ldots ,x_{n},x_{k}'} \bigl|f(x_{1},\ldots ,x_{k},\ldots ,x_{n})-f(x_{1},\ldots ,x_{k}',\ldots ,x_{n})\bigr| \leq c_{k}\]
for all \(1 \leq k \leq n\).
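To make the classical statements concrete, the following Monte Carlo sketch (not from the paper; sample sizes and thresholds are illustrative choices) checks the standard bounded-variable form of Bernstein’s inequality, \(\mathrm{P}(S_{n} \geq x) \leq \exp (-x^{2}/(2(\sum_{k=1}^{n}\mathrm{E}[X_{k}^{2}] + Mx/3)))\) for \(|X_{k}| \leq M\), on Rademacher variables.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 200, 200_000

# Rademacher variables X_k in {-1, +1}: zero mean, E[X_k^2] = 1, |X_k| <= M = 1.
# S_n has the same law as 2*Binomial(n, 1/2) - n.
S = 2.0 * rng.binomial(n, 0.5, size=trials) - n

sigma2, M = float(n), 1.0  # sum of variances, uniform bound
checks = {}
for x in (20.0, 30.0, 40.0):
    empirical = float((S >= x).mean())
    bound = float(np.exp(-x**2 / (2.0 * (sigma2 + M * x / 3.0))))
    checks[x] = (empirical, bound)
    print(f"x={x:5.1f}  empirical tail={empirical:.5f}  Bernstein bound={bound:.5f}")
```

The empirical tail sits well below the bound, and both decay as x grows, which is exactly the exponential concentration the inequality expresses.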
Motivated by model uncertainty, the theory of nonlinear expectations and nonadditive probabilities has developed rapidly. Nonlinear expectations have been used in a wide range of realistic situations, such as risk measures in finance and statistical uncertainty in decisions. The general framework of the nonlinear expectation space was proposed by Peng [20], who established the fundamental theory of sublinear expectation spaces and redefined the concepts of independence, identical distribution, the law of large numbers, and the central limit theorem in this setting. Recently, several researchers have obtained large deviation principles under the nonlinear framework. Chen and Xiong [8] studied the large deviation principle for diffusion processes under the g-expectation; Hu [10] obtained the upper bound of Cramér’s theorem for capacities; Gao and Xu [11] proved the large deviation principle for independent random variables in a sublinear expectation space; Chen and Feng [24] established the large deviation principle for negatively dependent random variables under sublinear expectations. Many authors have also studied the corresponding inequalities in the sublinear framework, including upper expectation spaces, where upper expectations are typical sublinear expectations generated by a family of probabilities. For instance, Chen et al.
[7] presented several elementary inequalities in an upper expectation space, including Hölder’s, Chebyshev’s, and Jensen’s inequalities; Wu [22] proved maximal inequalities, exponential inequalities, and the Marcinkiewicz–Zygmund inequality for partial sums of independent random variables in an upper expectation space; Zhang [25] showed Rosenthal’s inequalities for negatively dependent random variables; Zhang [26] proved Kolmogorov-type exponential inequalities for partial sums of independent as well as negatively dependent random variables under sublinear expectations; Huang and Wu [14] obtained the equivalence between the Kolmogorov maximal inequality and the Hájek–Rényi maximal inequality, in both moment and capacity forms, in sublinear expectation spaces. All of these inequalities can be used to investigate limit theory. More detailed results under sublinear expectations can be found in [3–5, 7–9, 12, 16, 18–20] and the references therein.
As is well known, concentration inequalities play an important role in limit theory, especially for large deviation principles and convergence rates. However, there are few results on the convergence rates of limit theorems in sublinear expectation spaces. Our motivation is to study the Bernstein-type inequality (1.1) under the Rio–Bernstein condition (1.2) and McDiarmid’s inequality (1.3) in upper expectation spaces, and then to apply them to improve several types of laws of large numbers and to obtain convergence rates.
The paper is organized as follows. We recall some preliminary definitions and notations about upper probabilities and sublinear expectation spaces in Sect. 2. Section 3 studies the Bernstein-type inequality under the Rio–Bernstein condition and McDiarmid’s inequality for upper probabilities, which are then used to discuss the convergence rates of the laws of large numbers in Sect. 4.
2 Preliminaries
Let \((\varOmega , \mathcal{F})\) be a measurable space and \(\mathcal{H}\) be a linear space of random variables defined on Ω. In this paper, we suppose that \(\mathcal{H}\) satisfies \(c \in \mathcal{H}\) for each constant c and \(|X| \in \mathcal{H}\) if \(X \in \mathcal{H}\).
Definition 2.1
([20, Definition 1.1.1])
A sublinear expectation \(\mathbb{E}\) is a functional \(\mathbb{E}: \mathcal{H} \rightarrow \mathbb{R}\) satisfying
- (1) Monotonicity: \(X \geq Y\) implies \(\mathbb{E}[X] \geq \mathbb{E}[Y]\).
- (2) Constant preserving: \(\mathbb{E}[c]=c\) for \(c \in \mathbb{R}\).
- (3) Subadditivity: For each \(X, Y \in \mathcal{H}\), \(\mathbb{E}[X+Y] \leq \mathbb{E}[X] + \mathbb{E}[Y]\).
- (4) Positive homogeneity: \(\mathbb{E}[\lambda X] = \lambda \mathbb{E}[X]\) for \(\lambda \geq 0\).
The triplet \((\varOmega ,\mathcal{H},\mathbb{E})\) is called a sublinear expectation space. Generally, we consider the following sublinear expectation space \((\varOmega ,\mathcal{H},\mathbb{E})\): if \(X_{1}, \ldots , X_{n} \in \mathcal{H}\), then \(\varphi (X_{1},\ldots ,X_{n}) \in \mathcal{H}\) for each \(\varphi \in C_{l,\mathrm{Lip}}(\mathbb{R}^{n})\), where \(C_{l,\mathrm{Lip}}(\mathbb{R}^{n})\) denotes the linear space of functions φ satisfying the following local Lipschitz condition:
\[\bigl|\varphi (x)-\varphi (y)\bigr| \leq C\bigl(1+|x|^{m}+|y|^{m}\bigr)|x-y| \quad \text{for all } x,y \in \mathbb{R}^{n},\]
for some constant \(C > 0\) and integer \(m \in \mathbb{N}\) depending on φ.
In this case \(X=(X_{1},\ldots ,X_{n})\) is called an n-dimensional random vector, denoted by \(X \in \mathcal{H}^{n}\).
Definition 2.2
([20, Definition 1.3.1, Proposition 1.3.2])
Let \(X_{1}\) and \(X_{2}\) be two n-dimensional random vectors defined on nonlinear expectation spaces \((\varOmega _{1},\mathcal{H}_{1},\mathbb{E}_{1})\) and \((\varOmega _{2},\mathcal{H}_{2},\mathbb{E}_{2})\), respectively. They are called identically distributed, denoted by \(X_{1} \overset{d}{=} X_{2}\), if
\[\mathbb{E}_{1}\bigl[\varphi (X_{1})\bigr]=\mathbb{E}_{2}\bigl[\varphi (X_{2})\bigr] \quad \text{for all } \varphi \in C_{l,\mathrm{Lip}}\bigl(\mathbb{R}^{n}\bigr).\]
Definition 2.3
([20, Definition 1.3.11])
In a sublinear expectation space \((\varOmega ,\mathcal{H},\mathbb{E})\), a random vector \(Y \in \mathcal{H}^{n}\) is said to be independent of another random vector \(X \in \mathcal{H}^{m}\) under \(\mathbb{E}[\cdot ]\) if, for each function \(\varphi \in C_{l,\mathrm{Lip}}(\mathbb{R}^{m+n})\), we have
\[\mathbb{E}\bigl[\varphi (X,Y)\bigr]=\mathbb{E}\bigl[\mathbb{E}\bigl[\varphi (x,Y)\bigr]\big|_{x=X}\bigr].\]
Remark 2.1
It is important to observe that, under a nonlinear expectation, the fact that Y is independent of X does not in general imply that X is independent of Y. An example constructed in [20, Example 1.3.15] shows this asymmetry. In fact, Hu and Li [13, Theorem 15] showed that if two nontrivial random variables X and Y in a sublinear expectation space are such that X is independent of Y and Y is independent of X, then X and Y must be maximally distributed.
Let \(\mathcal{M}\) be the collection of all probability measures on \((\varOmega ,\mathcal{F})\). For any given nonempty subset \(\mathcal{P} \subseteq \mathcal{M}\), define
\[\mathbb{V}(A) \triangleq \sup_{\mathrm{P} \in \mathcal{P}} \mathrm{P}(A), \qquad \nu (A) \triangleq \inf_{\mathrm{P} \in \mathcal{P}} \mathrm{P}(A), \quad A \in \mathcal{F},\]
as the upper probability and the lower probability, respectively. Obviously, \(\mathbb{V}\) and ν are conjugate to each other, that is,
\[\mathbb{V}(A) = 1 - \nu \bigl(A^{c}\bigr) \quad \text{for any } A \in \mathcal{F},\]
where \(A^{c}\) is the complement of A.
The upper expectation \(\mathbb{E}[\cdot ]\) and the lower expectation \(\mathcal{E}[\cdot ]\) generated by \(\mathcal{P}\) are defined respectively as follows [15]:
\[\mathbb{E}[X] \triangleq \sup_{\mathrm{P} \in \mathcal{P}} \mathrm{E}_{\mathrm{P}}[X], \qquad \mathcal{E}[X] \triangleq \inf_{\mathrm{P} \in \mathcal{P}} \mathrm{E}_{\mathrm{P}}[X]\]
for each \(X \in L^{0}(\varOmega )\), where \(L^{0}(\varOmega )\) is the space of all \(X \in \mathcal{H}\) such that \(\mathrm{E}_{\mathrm{P}}[X]\) exists for each \(\mathrm{P} \in \mathcal{P}\). In this case, \((\varOmega ,\mathcal{H},\mathbb{E})\) is called an upper expectation space. It is easy to check that the upper expectation \(\mathbb{E}[\cdot ]\) is also a sublinear expectation. We also consider random variables in the following spaces:
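As a toy illustration (not from the paper), the upper and lower expectations generated by a finite family \(\mathcal{P}\) on a three-point sample space can be computed directly; the three measures below are arbitrary illustrative choices.

```python
import numpy as np

values = np.array([-1.0, 0.0, 1.0])   # values taken by a random variable X
family = [                             # a finite family P of probability measures
    np.array([0.20, 0.60, 0.20]),
    np.array([0.25, 0.50, 0.25]),
    np.array([0.10, 0.70, 0.20]),
]

def upper_exp(x):
    """Upper expectation: sup over P in the family of E_P[X]."""
    return max(float(p @ x) for p in family)

def lower_exp(x):
    """Lower expectation: inf over P in the family of E_P[X]."""
    return min(float(p @ x) for p in family)

print(upper_exp(values), lower_exp(values))
```

One checks directly that `upper_exp` is monotone, constant preserving, subadditive, and positively homogeneous, i.e., a sublinear expectation, and that `lower_exp(x) == -upper_exp(-x)`, the conjugacy used throughout the paper.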
At the end of this section, we give an example of a sublinear expectation [20, Example 1.1.5].
Example
In a game a gambler randomly picks a ball from an urn containing W white, B black, and Y yellow balls. The owner of the urn, who is the banker of the game, does not tell the gambler the exact numbers W, B, and Y. He/She only ensures that \(W + B + Y = 100\) and \(W = B \in [20, 25]\). Let ξ be a random variable defined by
\[\xi = \textstyle\begin{cases} 1 & \text{if the ball is white}, \\ 0 & \text{if the ball is yellow}, \\ -1 & \text{if the ball is black}. \end{cases}\]
We know that the distribution of ξ is
\[\begin{pmatrix} -1 & 0 & 1 \\ \frac{W}{100} & \frac{100-2W}{100} & \frac{W}{100} \end{pmatrix} \quad \text{with } W \in [20,25].\]
Thus the robust expectation of ξ is
\[\mathbb{E}\bigl[\varphi (\xi )\bigr] = \sup_{W \in [20,25]} \biggl\{ \frac{W}{100}\varphi (1) + \frac{100-2W}{100}\varphi (0) + \frac{W}{100}\varphi (-1) \biggr\}.\]
In particular, \(\mathbb{E}[\xi ]=0\).
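A small numerical sketch of this robust expectation (assuming, as in Peng’s example, that ξ pays 1 on white, 0 on yellow, and −1 on black): since each expectation is affine in W, a grid that contains the endpoints of \([20,25]\) attains the supremum.

```python
def robust_expectation(phi, n_grid=501):
    """sup over W = B in [20, 25] (with Y = 100 - 2W) of E_P[phi(xi)]."""
    best = -float("inf")
    for i in range(n_grid):
        w = 20.0 + 5.0 * i / (n_grid - 1)
        e = (w * phi(1) + (100.0 - 2.0 * w) * phi(0) + w * phi(-1)) / 100.0
        best = max(best, e)
    return best

print(robust_expectation(lambda v: v))       # E[xi]
print(robust_expectation(lambda v: v * v))   # E[xi^2], attained at W = 25
```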
3 Concentration inequalities for upper probabilities
The following theorem gives the Bernstein-type inequality in an upper probability space.
Theorem 3.1
Let \(\{X_{i}\}_{i=1}^{\infty } \subset \mathcal{L}\) be a sequence of random variables in an upper expectation space \((\varOmega ,\mathcal{H},\mathbb{E})\). Assume that \(e^{X} \in \mathcal{H}\) if \(X \in \mathcal{H}\), and that for any given integer \(n \geq 1\) and all \(t \geq 0\),
Denote
Suppose that there exists a constant \(c > 0\) such that, for any integer \(p \geq 3\),
Then, for any \(x > 0\),
Proof
Notice that
Meanwhile, for any \(x \in \mathbb{R}\), note the fact that
where \(x^{+} \) denotes \(0 \vee x\). That is,
Because of the monotonicity, constant-preserving property, positive homogeneity, and countable subadditivity of \(\mathbb{E}\), we have
It follows from (3.2) that
In addition, according to Jensen’s inequality [7, Proposition 2.1],
namely, \(0< ct<1 \) in (3.5).
For all \(t > 0\), by (3.1),
Thus, taking \(t= \frac{x}{b_{n}^{2}+cx}\) in (3.5) and (3.6), together with (3.4), we get
The proof is completed. □
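In the classical Chernoff computation, the choice \(t = x/(b_{n}^{2}+cx)\) yields a bound of the form \(\exp (-nx^{2}/(2(b_{n}^{2}+cx)))\). Assuming the conclusion (3.3) takes this form, the sketch below (an illustration, not the paper’s code) evaluates it for uniformly bounded variables, where by Remark 3.2 one may take \(c = M\), and compares it with a Monte Carlo tail estimate under a single measure.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 100, 50_000

# X_i uniform on [-1, 1]: zero mean, E[X_i^2] = 1/3, |X_i| <= M = 1.
means = rng.uniform(-1.0, 1.0, size=(trials, n)).mean(axis=1)  # samples of S_n/n

b2, c = 1.0 / 3.0, 1.0  # b_n^2 = (1/n) * sum of E[X_i^2]; c = M by Remark 3.2

def bernstein_bound(n, b2, c, x):
    """Assumed Bernstein-type bound on V(S_n/n >= x)."""
    return float(np.exp(-n * x**2 / (2.0 * (b2 + c * x))))

checks = {x: (float((means >= x).mean()), bernstein_bound(n, b2, c, x))
          for x in (0.10, 0.15, 0.20)}
print(checks)
```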
Remark 3.1
The exponential independence in [6, Definition 2.3], which is similar to (3.1), is defined for all bounded Lipschitz functions.
Remark 3.2
In particular, when \(X_{i} \leq M\) uniformly, the Rio–Bernstein condition is satisfied with \(c = M\), and the conclusion holds. A similar result for \(\{X_{i}\}_{i=1}^{\infty } \subset \mathcal{L}^{\infty }\) with different upper bounds was proved by Wu [22].
Remark 3.3
Suppose that \(\{-X_{i}\}_{i=1}^{\infty }\) satisfies the conditions of Theorem 3.1. More precisely, for any fixed n, there exists \(c > 0\) such that, for all integers \(p > 2\),
Then we have
where \(\tilde{b}^{2}_{n} \triangleq \frac{1}{n} \sum_{i=1}^{n} \mathbb{E}[(X_{i}-\mathcal{E}[X_{i}])^{2}]\). This inequality could be regarded as the other side of the Bernstein-type inequality.
Particularly, if \(\mathbb{E}[X_{i}] = \mathcal{E}[X_{i}] = 0\) for any \(i \geq 1\) and
the result reduces to the two-sided inequality, i.e.,
where \(b^{2}_{n} = \tilde{b}^{2}_{n} = \frac{1}{n} \sum_{i=1}^{n} \mathbb{E}[X_{i}^{2}]\).
Next we provide an example (with \(\mathbb{E}[X_{i}] \neq 0\)) for which assumption (3.2) holds.
Example
Reconsider the example in Sect. 2. Let \(W + B = 90\) and \(W \in [30, 60]\). Let ξ be a random variable defined by
\[\xi = \textstyle\begin{cases} 1 & \text{if the ball is white}, \\ 0 & \text{if the ball is black}. \end{cases}\]
We know that the distribution of ξ is
\[\begin{pmatrix} 0 & 1 \\ \frac{90-W}{90} & \frac{W}{90} \end{pmatrix} \quad \text{with } W \in [30,60].\]
Thus the robust expectation of ξ is
\[\mathbb{E}\bigl[\varphi (\xi )\bigr] = \sup_{W \in [30,60]} \biggl\{ \frac{W}{90}\varphi (1) + \frac{90-W}{90}\varphi (0) \biggr\},\]
and \(\mathbb{E}[\xi ]=2/3\), \(\mathbb{E}[(\xi -\mathbb{E}[\xi ])^{2}]=1/3\), \(\mathbb{E} [ ((\xi -\mathbb{E}[\xi ])^{+} )^{p} ] = 2/3^{p+1}\) for every \(p \geqslant 3\). Then we define a sequence of random variables which are identically distributed with ξ. By some simple computations, Rio–Bernstein condition (3.2) holds for \(c=1/3\).
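These moments can be verified exactly with rational arithmetic. The sketch below assumes ξ equals 1 on white and 0 on black (which reproduces the stated values) and reads condition (3.2) as \(\mathbb{E}[((\xi -\mathbb{E}[\xi ])^{+})^{p}] \leq \frac{p!}{2}c^{p-2}\,\mathbb{E}[(\xi -\mathbb{E}[\xi ])^{2}]\); both are our assumptions about the notation, not statements from the paper.

```python
from fractions import Fraction
from math import factorial

# Under P_W: P(xi = 1) = W/90 and P(xi = 0) = 1 - W/90, with W in [30, 60].
ws = range(30, 61)  # expectations are affine in W, so integer points suffice

def upper(phi):
    """Upper expectation over the family {P_W : W in [30, 60]}."""
    return max(Fraction(w, 90) * phi(1) + (1 - Fraction(w, 90)) * phi(0)
               for w in ws)

mean_up = upper(lambda v: Fraction(v))                   # E[xi] = 2/3
var_up = upper(lambda v: (Fraction(v) - mean_up) ** 2)   # E[(xi - E[xi])^2] = 1/3

def pos_moment(p):
    """E[((xi - E[xi])^+)^p], which should equal 2 / 3^(p+1)."""
    plus = lambda t: t if t > 0 else Fraction(0)
    return upper(lambda v: plus(Fraction(v) - mean_up) ** p)

c = Fraction(1, 3)
rio_holds = all(pos_moment(p) <= Fraction(factorial(p), 2) * c ** (p - 2) * var_up
                for p in range(3, 9))
print(mean_up, var_up, pos_moment(3), rio_holds)
```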
Before presenting McDiarmid’s inequality in the upper expectation space, we recall the definition of a function with bounded differences and a crucial lemma whose proof can be found in [12, Lemma 3.1].
Definition 3.1
We say that \(f:\mathbb{R}^{n} \rightarrow \mathbb{R}\) is a function with bounded differences \(\{c_{k}\}_{k=1}^{n}\) if
\[\sup_{x_{1},\ldots ,x_{n},x_{k}'} \bigl|f(x_{1},\ldots ,x_{k},\ldots ,x_{n})-f(x_{1},\ldots ,x_{k}',\ldots ,x_{n})\bigr| \leq c_{k}\]
for all \(1 \leq k \leq n\), where \(\{c_{k}\}_{k=1}^{n}\) is a finite sequence of bounded numbers.
Lemma 3.1
([12, Lemma 3.1])
If a random variable X satisfies \(\mathbb{E}[X] \leq 0\) and \(m \leq X \leq M\), \(m, M \in \mathbb{R}\), then for all \(h > 0\),
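In the classical single-measure setting this Hoeffding-type lemma reads \(\mathrm{E}[e^{hX}] \leq e^{h^{2}(M-m)^{2}/8}\). Assuming that form, the sketch below verifies it numerically for an illustrative two-point distribution with zero mean (the distribution and the grid of h values are our choices).

```python
import numpy as np

m, M = -1.0, 2.0
p = -m / (M - m)  # P(X = M); chosen so that E[X] = p*M + (1 - p)*m = 0

hs = np.linspace(0.01, 3.0, 50)
mgf = p * np.exp(hs * M) + (1.0 - p) * np.exp(hs * m)  # E[e^{hX}]
hoeffding = np.exp(hs**2 * (M - m) ** 2 / 8.0)          # claimed upper bound

lemma_holds = bool(np.all(mgf <= hoeffding))
print(lemma_holds)
```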
The following result gives McDiarmid’s inequality in upper expectation spaces.
Theorem 3.2
Let \(\{X_{i}\}_{i=1}^{\infty }\) be a sequence of random variables in an upper expectation space \((\varOmega ,\mathcal{H},\mathbb{E})\) such that, for any given integer \(n \geq 1\), \(X_{n+1}\) is independent of \((X_{1},\ldots ,X_{n})\). Suppose that \(\varphi : \mathbb{R}^{n} \rightarrow \mathbb{R}\) is a local Lipschitz function with bounded differences \(\{c_{k}\}_{k=1}^{n}\). Then, for any \(\varepsilon > 0\), it holds that
Proof
Note that
where
Applying Chebyshev’s inequality [7, Proposition 2.3], we obtain, for any \(h > 0\),
Denote
By the definition of \(g_{k}(x_{1},\ldots ,x_{k})\), we have
In fact, by the monotonicity and subadditivity of \(\mathbb{E}[\cdot ]\),
Then due to the independence, it follows that
Choosing \(h=\frac{4\varepsilon }{\sum_{k=1}^{n} c_{k}^{2}}\), McDiarmid’s inequality (3.7) holds. □
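Combining \(h = 4\varepsilon /\sum_{k=1}^{n} c_{k}^{2}\) with a Hoeffding-type factor \(e^{h^{2}c_{k}^{2}/8}\) per coordinate recovers the classical form \(\exp (-2\varepsilon ^{2}/\sum_{k=1}^{n} c_{k}^{2})\). Assuming (3.7) takes this form, the sketch below checks it by Monte Carlo for the sample mean of bounded variables under a single measure (all parameters are illustrative).

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 100, 50_000

# phi(x_1, ..., x_n) = mean of n values in [0, 1]; changing one coordinate
# moves the mean by at most c_k = 1/n, so sum of c_k^2 is 1/n.
means = rng.uniform(0.0, 1.0, size=(trials, n)).mean(axis=1)
sum_ck2 = n * (1.0 / n) ** 2

checks = {eps: (float((means - 0.5 >= eps).mean()),
                float(np.exp(-2.0 * eps**2 / sum_ck2)))
          for eps in (0.05, 0.08, 0.10)}
print(checks)
```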
4 Applications
As applications of the Bernstein-type inequality, we obtain the following strong laws of large numbers.
Corollary 4.1
Let \(\{X_{i}\}_{i=1}^{\infty }\) and \(\{-X_{i}\}_{i=1}^{\infty }\) both satisfy the conditions of Theorem 3.1. Suppose that there exists a constant M such that \(b_{n}^{2} \leq M\) and \(\tilde{b}_{n}^{2} \leq M\) uniformly in n. For every \(i \geq 1\), \(\mathbb{E}[X_{i}] = \overline{\mu }\), \(\mathcal{E}[X_{i}] =- \mathbb{E}[-X_{i}] = \underline{\mu }\), and set \(S_{n}= \sum_{i=1}^{n} X_{i}\). Then we have
Moreover,
Proof
By the lower continuity and subadditivity of \(\mathbb{V}\), it is obvious that result (4.1) is equivalent to the conjunction of
and
for any \(\varepsilon > 0\). Given any \(\varepsilon > 0\), a direct result of Bernstein-type inequality (3.3) is
Thus, we obtain
It follows from the Borel–Cantelli lemma [7, Lemma 2.2] that
Then
More precisely, by (4.4),
and
Similarly, by applying the other side of the Bernstein-type inequality, we can get
and
Together with the subadditivity of \(\mathbb{V}\), (4.2) and (4.3) follow. □
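The corollary can be illustrated by a simulation sketch (with assumed, illustrative dynamics): each \(X_{i}\) receives a mean chosen adversarially inside \([\underline{\mu },\overline{\mu }]\), mimicking a different measure of the family at every step, and the sample mean still eventually lies in \([\underline{\mu }-\varepsilon ,\overline{\mu }+\varepsilon ]\).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
mu_low, mu_up = 0.2, 0.5  # lower and upper means

# Adversarial means inside [mu_low, mu_up], one per step, plus bounded noise.
mu = rng.uniform(mu_low, mu_up, size=n)
X = mu + rng.uniform(-0.5, 0.5, size=n)

sample_mean = float(X.mean())
eps = 0.05
inside = mu_low - eps <= sample_mean <= mu_up + eps
print(sample_mean, inside)
```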
Remark 4.1
The exponential moment condition is stronger than necessary for result (4.1). In fact, Chen et al. proved the strong law of large numbers (4.1) in upper expectation spaces in [7, Theorem 3.1] and [4, Theorem 3.1] under different conditions, without the exponential moment condition. We strengthen the conditions in order to apply the Bernstein-type inequality and obtain results (4.2) and (4.3). Formula (4.2) gives the convergence rate of (4.1). In classical probability theory, (4.3) characterizes complete convergence and its precise asymptotics.
Corollary 4.2
(Marcinkiewicz–Zygmund-type law of large numbers)
Let \(\{X_{i}\}_{i=1}^{\infty }\) be the sequence in Corollary 4.1. For any \(1< r< 2\), we have
where \(\bar{S}_{n} = \sum_{i=1}^{n} (X_{i}-\mathbb{E}[X_{i}])\) and \(\tilde{S}_{n} = \sum_{i=1}^{n} (X_{i}-\mathcal{E}[X_{i}])\).
Proof
For any given \(1 < r< 2\) and \(\varepsilon > 0\), taking \(x = n^{1/r-1}\varepsilon \) in Theorem 3.1, we have
Thus, we get
Similarly to Corollary 4.1, we can get
Moreover, for n large enough,
Result (4.6) is obvious. □
Remark 4.2
A result similar to (4.5) was obtained by Lan [16, Theorem 4.4] under a weaker moment condition. Result (4.6) is our main result.
Corollary 4.3
Suppose that \(\{X_{i}\}_{i=1}^{\infty }\) satisfies the conditions of Corollary 4.1. Then, for any \(\varepsilon > 0\), we have
and
Moreover,
and
Proof
It is a straightforward result of Theorem 3.1 by taking \(x=\frac{1}{\sqrt{n}}\cdot \varepsilon \), and we omit the proof. □
References
Bercu, B., Delyon, B., Rio, E.: Concentration Inequalities for Sums and Martingales. Springer, New York (2015)
Bernstein, S.: Theory of Probability. Moscow (1927)
Chen, Z.: Strong laws of large numbers for capacities. Math. 46(15), 7529–7545 (2010)
Chen, Z.: Strong laws of large numbers for sub-linear expectations. Sci. China Math. 59(5), 945–954 (2016)
Chen, Z., Hu, F.: A law of the iterated logarithm under sublinear expectations. J. Financ. Eng. 1(02), 1450015 (2014)
Chen, Z., Huang, W., Wu, P.: Extension of the strong law of large numbers for capacities. Math. Control Relat. Fields 9(1), 175–190 (2019)
Chen, Z., Wu, P., Li, B.: A strong law of large numbers for non-additive probabilities. Int. J. Approx. Reason. 54(3), 365–377 (2013)
Chen, Z., Xiong, J.: Large deviation principle for diffusion processes under a sublinear expectation. Sci. China Math. 55(11), 2205–2216 (2012)
Denis, L., Hu, M., Peng, S.: Function spaces and capacity related to a sublinear expectation: application to g-Brownian motion paths. Potential Anal. 34(2), 139–161 (2011)
Hu, F.: On Cramér’s theorem for capacities. C. R. Math. 348(17–18), 1009–1013 (2010)
Gao, F., Xu, M.: Large deviations and moderate deviations for independent random variables under sublinear expectations. Sci. Sin., Math. 41(4), 337–352 (2011)
Hu, C.: Strong laws of large numbers for sublinear expectation under controlled 1st moment condition. Chin. Ann. Math., Ser. B 39(5), 791–804 (2018)
Hu, M., Li, X.: Independence under the G-expectation framework. J. Theor. Probab. 27(3), 1011–1020 (2014)
Huang, W., Wu, P.: Strong laws of large numbers for general random variables in sublinear expectation spaces. J. Inequal. Appl. 2019(1), 143 (2019)
Huber, P., Strassen, V.: Minimax tests and the Neyman–Pearson lemma for capacities. Ann. Stat. 1(2), 251–263 (1973)
Lan, Y., Zhang, N.: Strong limit theorems for weighted sums of negatively associated random variables in nonlinear probability (2017) arXiv:1706.05788
McDiarmid, C.: On the method of bounded differences. Surv. Comb. 141(1), 148–188 (1989)
Peng, S.: G-expectation, G-Brownian motion and related stochastic calculus of Itô type. Stoch. Anal. Appl. 2(4), 541–567 (2006)
Peng, S.: Law of large numbers and central limit theorem under nonlinear expectations. Preprint (2007)
Peng, S.: Nonlinear Expectations and Stochastic Calculus Under Uncertainty. Springer, Berlin (2019)
Rio, E.: Inégalités exponentielles et inégalités de concentration (2009)
Wu, P.: Inequalities for independent random variables on upper expectation space. Chinese J. Appl. Probab. Statist. 31(1), 1–10 (2015)
Ying, Y.: McDiarmid’s inequalities of Bernstein and Bennett forms. City University of Hong Kong (2004)
Chen, Z., Feng, X.: Large deviation for negatively dependent random variables under sublinear expectation. Commun. Stat. 45(2), 400–412 (2015)
Zhang, L.: Rosenthal’s inequalities for independent and negatively dependent random variables under sub-linear expectations with applications. Sci. China Math. 59(4), 751–768 (2016)
Zhang, L.: Exponential inequalities under the sub-linear expectations with applications to laws of the iterated logarithm. Sci. China Math. 59(12), 2503–2526 (2016)
Acknowledgements
The author would like to thank the editor and referees for their valuable comments.
Availability of data and materials
Data sharing not applicable to this paper as no data sets were generated or analyzed during the current study.
Funding
The work is supported by the National Natural Science Foundation of China (Grant Nos.11601280 and 11701331).
Author information
Contributions
The author organized and wrote this paper. The author read and approved the final manuscript.
Ethics declarations
Competing interests
The author declares that there are no competing interests.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Tan, Y. Concentration inequalities for upper probabilities. J Inequal Appl 2020, 151 (2020). https://doi.org/10.1186/s13660-020-02417-6
Keywords
- Concentration inequality
- Law of large numbers
- Upper probability
- Sublinear expectation