 Research
 Open Access
Quantile Jensen’s inequalities
Journal of Inequalities and Applications volume 2021, Article number: 86 (2021)
Abstract
Quantiles of a random variable are crucial quantities that provide more delicate information about a distribution than the mean, the median, and so on. We establish Jensen’s inequality for the q-quantile (\(q\geq 0.5\)) of a random variable, which includes as a special case Merkle (Stat. Probab. Lett. 71(3):277–281, 2005), where Jensen’s inequality for the median (i.e., \(q=0.5\)) was given. We also derive a refined inequality for the case \(q<0.5\). An application to the confidence interval of parameters in a pivotal quantity is also considered, by virtue of a rigorous description of the relationship between quantiles and intervals that carry the required probability.
Introduction and preliminaries
Inequalities play a central role in all branches of mathematics, especially in approximation theory, as illustrated by monographs such as Hardy et al. [2] and Kazarinoff [3]. The famous Jensen inequality, which applies to convex functions, is one of the most useful inequalities in probability and statistics. A function \(f(x)\) defined on \(\mathbb{R}\) is said to be convex if \(f(\lambda x+(1-\lambda )y)\leqslant \lambda f(x)+(1-\lambda )f(y)\) for all x and y and all \(0\leqslant \lambda \leqslant 1\); a function \(f(x)\) is said to be concave if \(-f(x)\) is convex. Applied with the expectation operator, Jensen’s inequality states that, for any real-valued random variable X with finite expectation \(EX\) and any convex function f, \(f(EX)\leqslant Ef(X)\) holds. Equality holds if and only if, for every line \(a+bx\) that is tangent to \(f(x)\) at \(x=EX\), \(P(f(X)=a+bX)=1\). Applying Jensen’s inequality to concave functions yields a converse inequality: if f is concave, then \(f(EX)\geqslant Ef(X)\).
There are some interesting examples of Jensen’s inequality. One immediate application with \(f(x)=x^{2}\) shows that \(EX^{2}\geqslant (EX)^{2}\) for any real-valued random variable; this also follows from \(\operatorname{Var}(X)=EX^{2}-(EX)^{2}\geqslant 0\). Also, since \(1/x\) is convex on \((0, \infty )\), we have \(E(1/X)\geqslant 1/EX\) whenever \(X>0\) almost surely. Other applications of Jensen’s inequality can be found in almost every textbook on probability and statistics, for example, in proving an inequality between any two of the three classical means (see Casella and Berger [4]) and in proving the relationship between convergence in the rth mean and convergence in the sth mean with \(0< s< r\) (see Serfling [5], p. 7). Jensen’s inequality also plays a significant role in applied mathematics (see Mitrinović, Pečarić, and Fink [6]; Malamud [7]), information theory (see Dragomir [8]; Budimir et al. [9]), and the pricing theory of financial derivatives (see Hull [10]). Various generalizations and variants of Jensen’s inequality can be found, for example, in To and Yip [11], Rigler et al. [12], and Agnew and Pecaric [13].
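As a quick numerical illustration (not part of the paper’s argument), the two inequalities above can be checked directly on a small, hypothetical three-point distribution:

```python
# Sanity check of Jensen's inequality for f(x) = x^2 and f(x) = 1/x
# on a hypothetical three-point distribution with X > 0.
xs = [1.0, 2.0, 4.0]      # support of X
ps = [0.5, 0.25, 0.25]    # probabilities (sum to 1)

ex = sum(p * x for p, x in zip(ps, xs))        # E[X]   = 2.0
ex2 = sum(p * x * x for p, x in zip(ps, xs))   # E[X^2] = 5.5
einv = sum(p / x for p, x in zip(ps, xs))      # E[1/X] = 0.6875

assert ex2 >= ex ** 2       # E[X^2] >= (E[X])^2
assert einv >= 1.0 / ex     # E[1/X] >= 1/E[X] since X > 0 a.s.
```

Both gaps are strict here (5.5 > 4 and 0.6875 > 0.5) because X is not degenerate.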
Another analogue of Jensen’s inequality was given by Merkle [1], with the median in lieu of the expectation operator; moreover, Merkle [14] generalized the median Jensen’s inequality to the multivariate case, and Kwiecien and Gather [15] proved an analogous Jensen’s inequality with the expectation operator replaced by the Tukey median operator.
As the median is the special case of a quantile with \(q=1/2\), it is natural to ask whether Jensen’s inequality still holds for a general q-quantile. As far as we know, no relevant work is available in the literature. The present paper makes three contributions. Firstly, we describe the relationship between quantiles and intervals that carry the corresponding probability when \(q\ge 1/2\); we also show that this relationship is violated when \(q< 1/3\), and for \(1/3\le q<1/2\) we illustrate with several examples that the relationship may or may not hold.
Secondly, we show rigorously that Jensen’s inequality with the quantile operator and C-functions (defined below), a class more general than convex functions, still holds for the q-quantile (\(q\geqslant 1/2\)) of any random variable X; although quantile Jensen’s inequality fails when \(q<1/2\), a refinement of the corresponding inequality is derived in that case. Thirdly, using the relationship from the first contribution, we establish a rigorous characterization of the confidence interval of parameters in any pivotal quantity.
The remainder of the paper is arranged as follows. Two crucial lemmas that are fundamental for the theoretical development in the following sections are presented in Sect. 2. Section 3 presents our main results, including quantile Jensen’s inequality for \(q\geqslant 1/2\) and its refinement for \(q<1/2\). Section 4 applies the assertions developed in Sect. 2 to construct a confidence interval from a pivotal quantity. Some concluding remarks are given in Sect. 5.
Two crucial lemmas
Let X be a random variable defined on some probability space \((\Omega, {\mathcal{F}}, P)\). By definition, a q-quantile (\(q\in [0,1]\)) of X is any real number \(\mu _{q}\) that satisfies the inequalities

\(P(X\leqslant \mu _{q})\geqslant q \quad \text{and} \quad P(X\geqslant \mu _{q})\geqslant 1-q.\)  (1)
For any random variable X, the q-quantile is either unique, or there are infinitely many q-quantiles, all of which lie in a closed bounded interval \([a, b]\). One significant feature of quantiles is that \(\mu _{q}\) is nondecreasing in q, that is, \(\mu _{q}\leqslant \mu _{q'}\) whenever \(q\leqslant q'\).
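For a finite discrete distribution, the set of q-quantiles in definition (1) can be computed mechanically. The helper below is an illustrative sketch (not from the paper); it returns the closed interval \([a,b]\) of all q-quantiles, which collapses to a single point when the quantile is unique:

```python
def quantile_set(xs, ps, q, tol=1e-12):
    """Return (a, b), the closed interval of all q-quantiles of a finite
    discrete distribution with sorted support xs and probabilities ps,
    i.e. all m with P(X <= m) >= q and P(X >= m) >= 1 - q."""
    cdf = 0.0
    for i, x in enumerate(xs):
        cdf += ps[i]
        if cdf >= q - tol:
            # CDF jumps strictly past q at x (or x is the largest support
            # point): the q-quantile is unique.
            if cdf > q + tol or i == len(xs) - 1:
                return (x, x)
            # CDF hits q exactly: every point up to the next support
            # point is also a q-quantile.
            return (x, xs[i + 1])
    raise ValueError("q must lie in (0, 1]")

# A fair coin on {0, 1}: the median is any point of [0, 1],
# while the 0.75-quantile is unique.
assert quantile_set([0, 1], [0.5, 0.5], 0.5) == (0, 1)
assert quantile_set([0, 1], [0.5, 0.5], 0.75) == (1, 1)
```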
In the sequel we refer to all intervals of the form \((-\infty, a]\) and \([b, \infty )\), for real numbers a and b, as closed half lines.
Lemma 1
Let I be either a closed half line or a closed interval on \(\mathbb{R}\) and \(q\geqslant \frac{1}{2}\). We have:

(i)
If \(P(X\in I)=q\), then I contains both a q-quantile and a \((1-q)\)-quantile of X; additionally, for any η with \(1-q<\eta <q\), I contains all η-quantiles of X.

(ii)
Suppose that the set S possesses the following property: if J is any closed interval that has S as a proper subset, that is, \(S\subset J\), then \(P(X\in J)>q\). Then S includes all η-quantiles of X, where \(1-q\leqslant \eta \leqslant q\).
In the special case \(q=\frac{1}{2}\), the lemma reduces to Lemma 1.2 of Merkle [1]. Accordingly, we shall concentrate on the case \(q>\frac{1}{2}\).
Proof
(i) Note that, if \(I=(-\infty,b]\), then clearly b is a q-quantile of X. If \(I=[a,\infty )\), then for any \(\xi < a\), ξ is not a q-quantile of X since \(P(X\leqslant \xi )\leqslant P(X< a)=1-q< q\). Therefore, all q-quantiles of X are in \(I=[a,\infty )\).
Moreover, let \(I=[a,b]\) and suppose that there is no q-quantile in I. If \(\mu _{q}>b\), then \(P(X\leqslant b)\geqslant P(X\in I)=q\) and \(P(X\geqslant b)\geqslant P(X\geqslant \mu _{q})\geqslant 1-q\), implying that b is a q-quantile of X, a contradiction; if \(\mu _{q}< a\), then \(P(X\leqslant a)\geqslant P(X\leqslant \mu _{q})\geqslant q\) and \(P(X\geqslant a)\geqslant P(X\in I)=q\geqslant 1-q\), so that a is a q-quantile too, another contradiction. In conclusion, there exists at least one q-quantile in I.
Next, given \(P(X\in I)=q\) and \(q>\frac{1}{2}\), to prove the existence of \(\mu _{1-q}\in I\), consider the three possibilities for I. If \(I=(-\infty, b]\), then for any \(\xi >b\), ξ cannot be a \((1-q)\)-quantile; indeed, \(P(X\geq \xi )\leq P(X>b)=1-q< q=1-(1-q)\). If \(I=[a,\infty )\), then a obviously is a \((1-q)\)-quantile. Thirdly, if \(I=[a,b]\) and there is a \((1-q)\)-quantile \(\mu _{1-q}\) with \(\mu _{1-q}>b\), then \(P(X\le b)\ge P(X\in I)=q>1-q\) and \(P(X\ge b)\ge P(X\ge \mu _{1-q})\ge q\); thus b is a \((1-q)\)-quantile too. Similarly, if \(\mu _{1-q}\) is a \((1-q)\)-quantile with \(\mu _{1-q}< a\), then a is a \((1-q)\)-quantile for the same reason.
Additionally, because I contains both a \((1-q)\)-quantile and a q-quantile, the assertion for \(1-q<\eta <q\) follows immediately from the monotonicity of the quantile function.
(ii) Firstly, we prove that S contains all q-quantiles of X. Suppose that \(\mu _{q}\) is a q-quantile of X and \(\mu _{q}\notin S\). Then we can find a closed interval J such that \(J\supset S\) and \(\mu _{q}\notin J\). Such a J is disjoint from one of the sets \(A=(-\infty, \mu _{q}]\) and \(B=[\mu _{q},\infty )\); since \(P(X\in J)>q\) by assumption, either \(P(X\in A)<1-q< q\) or \(P(X\in B)<1-q\), which implies that \(\mu _{q}\) is not a q-quantile of X, a contradiction. Therefore, all q-quantiles of X are in S.
Next, in a similar fashion, one shows that (ii) holds for \(\eta =1-q\). Finally, by the monotonicity of the quantile function, (ii) holds for \(1-q<\eta < q\). □
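Lemma 1(i) can be illustrated numerically (an illustration only, not part of the proof): take X standard normal, \(q=0.7\), and an interval I with \(P(X\in I)=0.7\); both the 0.7- and 0.3-quantiles should then fall inside I. Only the Python standard library is used below:

```python
from statistics import NormalDist

nd = NormalDist()                 # X ~ N(0, 1)
q = 0.7
# I = [a, b] with P(X in I) = 0.8 - 0.1 = 0.7 = q
a, b = nd.inv_cdf(0.1), nd.inv_cdf(0.8)
mu_q = nd.inv_cdf(q)              # 0.7-quantile, about 0.524
mu_1q = nd.inv_cdf(1 - q)         # 0.3-quantile, about -0.524
assert a <= mu_1q <= mu_q <= b    # both quantiles lie in I, as Lemma 1(i) asserts
```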
Since the assertions of Lemma 1 rest on the condition \(q\geqslant \frac{1}{2}\), it is natural to ask what happens when \(q<\frac{1}{2}\). Intuitively, if \(P(X\in I)=q<\frac{1}{2}\), then the set I is relatively “small” (the measure of the entire real line \(\mathbb{R}\) under \(P(X\in \cdot )\) being one), whereas the complement of I is large enough to possibly contain the quantiles. By contrast, a set I with \(P(X\in I)=q\geqslant \frac{1}{2}\) is relatively “large”, which is the key to validating the assertions of Lemma 1. Thus, we cannot expect conclusions similar to those of Lemma 1. Nevertheless, we have the following lemma.
Lemma 2
Let I be any interval or closed half line on \(\mathbb{R}\) and \(0< q<\frac{1}{3}\). If \(P(X\in I)=q\), then at least one of \(\mu _{q}\) and \(\mu _{1-q}\) is not in I.
Proof
Since \(1-q>q\), we have \(\mu _{1-q}\ge \mu _{q}\). It follows from the definition that

\(P (X\in [\mu _{q}, \mu _{1-q}] )\geqslant P(X\geqslant \mu _{q})+P(X\leqslant \mu _{1-q})-1\geqslant (1-q)+(1-q)-1=1-2q>q.\)
Thus, the interval \([\mu _{q}, \mu _{1-q}]\) cannot be covered by I, and hence at least one of \(\mu _{q}\) and \(\mu _{1-q}\) is not in I, which finishes the proof. □
This lemma shows that when \(q<\frac{1}{3}\) the conclusion of Lemma 1 is no longer valid; \(q<\frac{1}{3}\) is thus a sufficient condition for the failure of the preceding lemma. However, when \(\frac{1}{3}\le q<\frac{1}{2}\), both positive and negative examples exist.
Example 2.1
Suppose that \(X\sim \operatorname{Unif}[0,1]\). Let \(q=0.4\) and \(I=[0.3,0.7]\). Then both \(\mu _{0.4}=0.4\) and \(\mu _{0.6}=0.6\) are included in I.
Example 2.2
Suppose that \(X\sim \operatorname{Unif}[0,2]\); let \(q=0.4\) and \(I=[0.1,0.9]\). Then \(\mu _{0.4}=0.8\in I\), but \(\mu _{0.6}=1.2\notin I\); for another choice \(I=[1.1,1.9]\), \(\mu _{0.4}=0.8\notin I\), but \(\mu _{0.6}=1.2\in I\).
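Both examples can be verified directly, since the q-quantile of \(\operatorname{Unif}[0, L]\) is qL:

```python
# Example 2.1: X ~ Unif[0, 1], q = 0.4, I = [0.3, 0.7]
q = 0.4
mu_q, mu_1q = q * 1.0, (1 - q) * 1.0                  # 0.4 and 0.6
assert 0.3 <= mu_q <= 0.7 and 0.3 <= mu_1q <= 0.7     # both quantiles in I

# Example 2.2: X ~ Unif[0, 2], quantiles mu_0.4 = 0.8 and mu_0.6 = 1.2
mu_q, mu_1q = q * 2.0, (1 - q) * 2.0
assert 0.1 <= mu_q <= 0.9 and not 0.1 <= mu_1q <= 0.9   # only mu_0.4 in [0.1, 0.9]
assert not 1.1 <= mu_q <= 1.9 and 1.1 <= mu_1q <= 1.9   # only mu_0.6 in [1.1, 1.9]
```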
It follows from Lemmas 1 and 2 and Examples 2.1 and 2.2 that the unit interval may be divided into three subintervals \([0, \frac{1}{3})\), \([\frac{1}{3}, \frac{1}{2})\), and \([\frac{1}{2},1]\). On the last subinterval, both q-quantiles and \((1-q)\)-quantiles are contained in any closed half line or closed interval that has probability \(q\in [\frac{1}{2},1]\); on the first subinterval \([0, \frac{1}{3})\) this fact fails to hold; and on the middle subinterval both positive and negative conclusions may occur.
Quantile Jensen’s inequality
Quantile Jensen’s inequality is established below for C-functions. Recall that the C-functions are real-valued functions f defined on \(\mathbb{R}\) such that, for any \(u\in \mathbb{R}\), the set

\(f^{-1} ((-\infty, u] )= \{x\in \mathbb{R}: f(x)\leqslant u \}\)

is a closed interval (possibly a closed half line), a singleton, or an empty set.
Note that a function f is lower semicontinuous if \(f^{-1}((-\infty, u])\) is a closed set for every \(u\in \mathbb{R}\). Therefore a C-function is lower semicontinuous, and hence \(f(x)\le \liminf_{y\to x}f(y)\) for any \(x\in \mathbb{R}\). In addition, every C-function \(f(x)\) has finite left and right limits at any x, and \(f(x)\le \min (f(x_{-}), f(x_{+}))\).
On the other hand, every convex function on \(\mathbb{R}\) is a C-function, as is every monotone continuous function on \(\mathbb{R}\). Similarly, for any fixed a, every continuous function that is nonincreasing on \((-\infty, a)\) and nondecreasing on \((a,\infty )\) is a C-function; this covers, e.g., all loss functions used in statistics. More discussion can be found in Merkle [1].
Theorem 1
(Jensen’s inequality for quantile)
Let g be a C-function and X be any real random variable, and suppose that \(q\geqslant \frac{1}{2}\). If \(\mu _{q}^{X}\), the q-quantile of X, is unique, then

\(g (\mu _{q}^{X} )\leqslant \mu ^{g(X)}_{q},\)  (2)
where \(\mu ^{g(X)}_{q}\) is any q-quantile of \(g(X)\). Conversely, if \(\mu ^{g(X)}_{q}\) is unique, then (2) holds for any q-quantile of X. Generally speaking, for any q-quantile of \(g(X)\), there exists a q-quantile of X such that (2) holds.
Proof
Let \(\mu ^{g(X)}_{q}\) be a q-quantile of \(g(X)\), and define \(I=g^{-1} ((-\infty,\mu ^{g(X)}_{q}] )\). Then

\(P(X\in I)=P (g(X)\leqslant \mu ^{g(X)}_{q} )\geqslant q,\)
and by Lemma 1 there is a q-quantile \(\mu _{q}^{X}\) in I, which implies \(g (\mu _{q}^{X} )\leqslant \mu ^{g(X)}_{q}\). This proves that for any q-quantile of \(g(X)\) there exists a q-quantile of X such that (2) holds. In particular, if \(\mu _{q}^{X}\) is unique, then (2) holds for any q-quantile of \(g(X)\). Now suppose that \(\mu ^{g(X)}_{q}\) is unique. We only need to verify that the interval I has the property described in (ii) of Lemma 1. Indeed, if not, then there is a closed interval J such that I is a proper subset of J and \(P(X\in J)=q\). Without loss of generality, assume \(I=(-\infty,b]\) and \(J=(-\infty,c]\), where \(c>b\). Then there is \(x_{0}\) such that \(x_{0}\notin I\), \(x_{0}\in J\), and \(g(x_{0})>\mu ^{g(X)}_{q}\). Let \(M=\mu ^{g(X)}_{q}+ (g(x_{0})-\mu ^{g(X)}_{q} )/2\). By the lower semicontinuity of g, the closed interval \(I_{1}=g^{-1}((-\infty,M])\) is contained in J, which also means

\(q\leqslant P (g(X)\leqslant M )=P(X\in I_{1})\leqslant P(X\in J)=q.\)
Consequently, M is a q-quantile of \(g(X)\), contradicting the uniqueness of \(\mu ^{g(X)}_{q}\). This completes the proof of Theorem 1. □
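A Monte Carlo illustration of Theorem 1 (a hypothetical setup, assuming NumPy is available): take X standard normal, the C-function \(g(x)=x^{2}\), and \(q=0.9\); the empirical quantiles should satisfy \(g(\mu _{q}^{X})\leqslant \mu ^{g(X)}_{q}\).

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200_000)       # samples of X ~ N(0, 1)
q = 0.9
mu_x = np.quantile(x, q)           # empirical q-quantile of X (about 1.28)
mu_gx = np.quantile(x ** 2, q)     # empirical q-quantile of g(X) (about 2.71)
assert mu_x ** 2 <= mu_gx          # quantile Jensen's inequality for q >= 1/2
```

The inequality is strict here because \(g\) is not monotone on the support of X; for an increasing g the two sides would coincide.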
Remark 1
The theorem indicates that, for \(q\ge \frac{1}{2}\), the quantile operator \(\mu ^{X}_{q}\), together with any C-function, satisfies Jensen’s inequality, which substantially extends the existing literature; unfortunately, this cannot be guaranteed when \(q<\frac{1}{2}\), as illustrated in the following remark.
Remark 2
Theorem 1 does not hold in the case \(q<\frac{1}{2}\). For example, let \(g(x)=x^{2}\) and let X be a discrete random variable with distribution
X  −3  −2  −1  0  1  2  3 

Pr  \(\frac{1}{4}\)  \(\frac{1}{4}\)  \(\frac{1}{4}\)  \(\frac{1}{16}\)  \(\frac{1}{16}\)  \(\frac{1}{16}\)  \(\frac{1}{16}\) 
It is easy to check that \(\{\mu _{0.25}\}=[-3,-2]\), \(\{\mu _{0.5}\}=[-2,-1]\), \(\{\mu _{0.75}\}=[-1,0]\) and \(\{\mu ^{X^{2}}_{0.25}\}=\{1\}\), \(\{\mu ^{X^{2}}_{0.5}\}=\{4\}\), \(\{\mu ^{X^{2}}_{0.75}\}=\{9\}\). Then, for any 0.25-quantile of X, we have \((\mu _{0.25})^{2}>\mu ^{X^{2}}_{0.25}\), i.e., Theorem 1 does not hold for \(q=0.25\).
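The quantile sets in this remark can be checked mechanically; the script below is an illustration (not from the paper) using exact rational arithmetic:

```python
from fractions import Fraction as F

def quantile_set(xs, ps, q):
    """Closed interval (a, b) of all q-quantiles of a finite discrete
    distribution; xs is the sorted support, ps exact probabilities."""
    cdf = F(0)
    for i, x in enumerate(xs):
        cdf += ps[i]
        if cdf >= q:
            if cdf > q or i == len(xs) - 1:
                return (x, x)        # unique quantile
            return (x, xs[i + 1])    # CDF hits q exactly: an interval
    raise ValueError("q must lie in (0, 1]")

xs = [-3, -2, -1, 0, 1, 2, 3]
ps = [F(1, 4)] * 3 + [F(1, 16)] * 4
assert quantile_set(xs, ps, F(1, 4)) == (-3, -2)   # {mu_0.25} = [-3, -2]

# Distribution of X^2: P(0) = 1/16, P(1) = P(4) = P(9) = 5/16
ys = [0, 1, 4, 9]
qs = [F(1, 16), F(5, 16), F(5, 16), F(5, 16)]
assert quantile_set(ys, qs, F(1, 4)) == (1, 1)     # mu_0.25 of X^2 is 1

# x^2 is decreasing on [-3, -2], so its minimum over the quantile
# interval is attained at -2 and equals 4 > 1: Theorem 1 fails here.
assert min(x ** 2 for x in (-3, -2)) > 1
```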
Although Theorem 1 may fail in the case \(q<\frac{1}{2}\), it is interesting that a similar inequality for the q-quantile of X can still be constructed in this case.
Corollary 1
Let g be a C-function and X be any real random variable, and suppose that \(q<\frac{1}{2}\). If \(\mu _{q}^{X}\) is unique, then

\(g (\mu _{q}^{X} )\leqslant \mu ^{g(X)}_{1-q},\)  (3)
where \(\mu ^{g(X)}_{1-q}\) is any \((1-q)\)-quantile of \(g(X)\). If \(\mu ^{g(X)}_{1-q}\) is unique, then (3) holds for any q-quantile of X. Generally speaking, for any \((1-q)\)-quantile of \(g(X)\), there exists a q-quantile of X such that (3) holds.
Proof
By noting that \(\mu _{q}^{X}=-\mu ^{-X}_{1-q}\) and that if \(g(x)\) is a C-function, then \(h(x)=g(-x)\) is also a C-function, we obtain, applying Theorem 1 to h and the random variable \(-X\) with \(1-q\geqslant \frac{1}{2}\),

\(g (\mu _{q}^{X} )=g (-\mu ^{-X}_{1-q} )=h (\mu ^{-X}_{1-q} )\leqslant \mu ^{h(-X)}_{1-q}=\mu ^{g(X)}_{1-q}.\)
□
Remark 3
Assertion (3) can be reinforced as

\(\max \{g (\mu _{q}^{X} ), g (\mu _{1-q}^{X} ) \}\leqslant \mu ^{g(X)}_{1-q}\)

by virtue of Theorem 1 and \(1-q>1/2\). Furthermore, if \(g(\cdot )\) happens to be an increasing function, one has \(g (\mu _{q}^{X} )\leqslant g (\mu _{1-q}^{X} ) \leqslant \mu ^{g(X)}_{1-q}\).
To illustrate with a concrete example, let X be the discrete random variable defined in Remark 2. It is easy to check that, for all \(\mu _{0.25}\), we have \((\mu _{0.25})^{2}\leqslant \mu ^{X^{2}}_{0.75}\).
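This check can be carried out directly (an illustration only): every 0.25-quantile of X lies in \([-3,-2]\), and its square is at most \(9=\mu ^{X^{2}}_{0.75}\).

```python
# mu_0.25 of X ranges over [-3, -2]; mu_0.75 of X^2 equals 9 (Remark 2).
# Check g(mu_q) <= mu_{1-q}^{g(X)} for g(x) = x^2 and q = 0.25: the
# largest square over [-3, -2] is attained at the endpoint -3.
worst = max(x ** 2 for x in (-3.0, -2.0))   # = 9.0
assert worst <= 9                           # Corollary 1 / Remark 3 holds
```

Note that the bound is attained, so (3) is sharp for this distribution.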
Confidence intervals
In statistical inference, in order to draw standard inferences about an unknown parameter θ, it is important to construct a confidence interval for it.
The first step in constructing a confidence interval is to find a pivotal quantity for θ. Given a random sample \(X_{1},\ldots, X_{n}\) of size n, an expression \(Y=Q(X_{1},\ldots, X_{n};\theta )\) is called a pivotal quantity for θ if the distribution of Y does not depend on θ. For example, let \(X_{1},\ldots, X_{n}\) be a sample drawn from a population \(X\sim N(\mu, \sigma ^{2})\), and let X̄ and \(S^{2}\) be the sample mean and sample variance. Then

\(Y_{1}=\frac{\bar{X}-\mu }{S/\sqrt{n}}\sim t_{n-1} \quad \text{and} \quad Y_{2}=\frac{(n-1)S^{2}}{\sigma ^{2}}\sim \chi ^{2}_{n-1}\)
are pivotal quantities for μ and \(\sigma ^{2}\), respectively. For more discussion of pivotal quantities, see Casella and Berger [4], p. 427.
The second step is that, for a specified value of \(\alpha \in (0,1)\), we find numbers a and b, depending on α but not on θ, such that

\(P (a\leqslant Q(X_{1},\ldots, X_{n};\theta )\leqslant b )=1-\alpha.\)
Then the \(1-\alpha \) confidence set for θ can be constructed as

\(C_{\theta }(X_{1},\ldots, X_{n})= \{\theta: a\leqslant Q(X_{1},\ldots, X_{n};\theta )\leqslant b \}.\)
Often we can explicitly find the lower and upper bounds of the interval \(C_{\theta }\), \(\ell _{n}\equiv \ell (X_{1}, \ldots, X_{n})\) and \(L_{n}\equiv L(X_{1}, \ldots, X_{n})\), such that \(C_{\theta }(X_{1}, \ldots, X_{n})=[\ell _{n}, L_{n}]\), and therefore

\(P (\ell _{n}\leqslant \theta \leqslant L_{n} )=1-\alpha,\)
which gives a confidence interval for θ at the \(100(1-\alpha )\%\) confidence level.
Take \(Y_{2}\) above as an example. Let \(\chi _{n-1,\alpha }^{2}\) denote the α-quantile of the \(\chi ^{2}_{n-1}\) distribution. Then

\(P (\chi ^{2}_{n-1,\alpha /2}\leqslant \frac{(n-1)S^{2}}{\sigma ^{2}} \leqslant \chi ^{2}_{n-1,1-\alpha /2} )=1-\alpha,\)
from which we have

\(\frac{(n-1)S^{2}}{\chi ^{2}_{n-1,1-\alpha /2}}\leqslant \sigma ^{2}\leqslant \frac{(n-1)S^{2}}{\chi ^{2}_{n-1,\alpha /2}}.\)
Thus, the confidence interval for \(\sigma ^{2}\) is \([\ell _{n}, L_{n}]\), where both the lower and upper bounds

\(\ell _{n}=\frac{(n-1)S^{2}}{\chi ^{2}_{n-1,1-\alpha /2}}, \qquad L_{n}=\frac{(n-1)S^{2}}{\chi ^{2}_{n-1,\alpha /2}}\)

depend on the sample only.
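Assuming SciPy is available, this interval can be computed and its coverage checked by simulation; the sketch below uses hypothetical parameter values (\(n=20\), \(\sigma ^{2}=4\), \(\alpha =0.05\)):

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
n, sigma2, alpha = 20, 4.0, 0.05
lo_q = chi2.ppf(alpha / 2, df=n - 1)        # chi^2_{n-1, alpha/2}
hi_q = chi2.ppf(1 - alpha / 2, df=n - 1)    # chi^2_{n-1, 1-alpha/2}

trials, hits = 4000, 0
for _ in range(trials):
    x = rng.normal(0.0, np.sqrt(sigma2), size=n)
    s2 = x.var(ddof=1)                      # sample variance S^2
    ell = (n - 1) * s2 / hi_q               # lower bound l_n
    L = (n - 1) * s2 / lo_q                 # upper bound L_n
    hits += ell <= sigma2 <= L

assert abs(hits / trials - (1 - alpha)) < 0.02   # empirical coverage near 95%
```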
Obviously, the confidence interval produced by the procedure above is not unique, because one can find another interval on which the pivotal quantity has the same probability. However, on the basis of the relationship between intervals and quantiles in Lemma 1, the characterization of the confidence interval for a pivotal quantity given below is unique.
Theorem 2
(A characterization of the \(1-\alpha \) confidence interval for a pivotal quantity \(Q(X;\theta )\) with \(\alpha <0.5\); usually \(\alpha =0.01, 0.05, 0.1\))
For any pivotal quantity \(Q(X;\theta )\), the set of its \(1-\alpha \) confidence interval is the intersection of all closed intervals I with the following property: if J is any closed interval that contains I as a proper subset, then \(P(Q(X;\theta )\in J)>1-\frac{\alpha }{2}\).
Proof
Let \(I_{q}\) denote the set of all q-quantiles of \(Q(X;\theta )\), and let Ĩ denote the intersection of all closed intervals I with the property stated in the theorem. We only need to show that \(\bigcup_{\alpha /2\leqslant q\leqslant 1-\alpha /2}I_{q}=\tilde{I}\). In fact, (ii) of Lemma 1 shows that \(I_{q}\subset \tilde{I}\) for any \(\alpha /2\leqslant q\leqslant 1-\alpha /2\).
On the other hand, write \(I_{\alpha /2}=[a,b]\) and \(I_{1-\alpha /2}=[c,d]\), where each of \(I_{\alpha /2}\) and \(I_{1-\alpha /2}\) may be a closed interval or a singleton. The intervals \([a,\infty )\) and \((-\infty,d]\) both have the stated property, and \([a,d]=[a,\infty )\cap (-\infty,d]\) is the intersection of two closed intervals with that property. Therefore, \(\tilde{I}\subset [a,d]\subset \bigcup_{\alpha /2\leqslant q\leqslant 1-\alpha /2}I_{q}\). Thus, we complete the proof. □
Remark 4
The key point in the proof of Theorem 2 is the use of the assertion in Lemma 1, whose condition \(q\geqslant \frac{1}{2}\) is fulfilled automatically since \(1-\alpha /2\geqslant \frac{1}{2}\) for any \(\alpha \in [0,1]\). Theorem 2 gives a characterization of the confidence interval for parameters in a pivotal quantity. However, extending this characterization to higher dimensions remains a challenge because of the shape restriction.
Concluding remarks
This paper has established two crucial lemmas that help prove quantile Jensen’s inequality and construct confidence intervals for parameters in pivotal quantities. All these results, however, concern univariate random variables; we shall study the corresponding theory for multivariate random variables in future work.
Availability of data and materials
Not applicable.
References
 1.
Merkle, M.: Jensen’s inequality for medians. Stat. Probab. Lett. 71(3), 277–281 (2005)
 2.
Hardy, G.H., Littlewood, J.E., Pólya, G.: Inequalities. Cambridge University Press, London (1934)
 3.
Kazarinoff, N.D.: Analytic Inequalities. Dover, New York (2003)
 4.
Casella, G., Berger, R.: Statistical Inference, 2nd edn. China Machine Press, Beijing (2002)
 5.
Serfling, R.J.: Approximation Theorems of Mathematical Statistics. Wiley, New York (2002)
 6.
Mitrinović, D.S., Pečarić, J.E., Fink, A.M.: Convex functions and Jensen’s inequality. In: Classical and New Inequalities in Analysis Mathematics and Its Applications (East European Series), vol. 61, pp. 1–19. Springer, Dordrecht (1993)
 7.
Malamud, S.M.: A converse to the Jensen inequality, its matrix extensions and inequalities for minors and eigenvalues. Linear Algebra Appl. 332, 19–41 (2001)
 8.
Dragomir, S.S.: A converse result for Jensen’s discrete inequality via Grüss inequality and applications in information theory. An. Univ. Oradea, Fasc. Mat. 7, 178–189 (1999)
 9.
Budimir, I., Dragomir, S.S., Pecaric, J.: Further reverse results for Jensen’s discrete inequality and applications in information theory. J. Inequal. Pure Appl. Math. 2(1), 5 (2000)
 10.
Hull, J.: Options, Futures, and Other Derivatives, 10th edn. Pearson, New York (2017)
 11.
To, T.O., Yip, K.W.: A generalized Jensen’s inequality. Pac. J. Math. 58(1), 255–259 (1975)
 12.
Rigler, A.K., Trimble, S.Y., Varga, R.S.: Sharp lower bounds for a generalized Jensen inequality. Rocky Mt. J. Math. 19(1), 353–374 (1989)
 13.
Agnew, R.A., Pecaric, J.E.: Generalized multivariate Jensentype inequality. J. Inequal. Pure Appl. Math. 7(4), 227–237 (2013)
 14.
Merkle, M.: Jensen’s inequality for multivariate medians. J. Math. Anal. Appl. 370(1), 258–269 (2010)
 15.
Kwiecien, R., Gather, U.: Jensen’s inequality for the Tukey median. Statistics and Probability Letters, Technical Reports (2007)
Acknowledgements
The authors are grateful to the referee for very useful comments.
Funding
This work was supported by the National Natural Science Foundation of China under Grant 71901222, Grant 71974204, and “the Fundamental Research Funds for the Central Universities”, Zhongnan University of Economics and Law under Grant 2722020PY040, Grant 2722020JX005.
Author information
Contributions
The authors contributed equally in writing the final version of this article. All authors read and approved the final manuscript.
Ethics declarations
Competing interests
The authors declare that there are no competing interests regarding the publication of this paper.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Zhao, M., Yang, X., He, Q. et al. Quantile Jensen’s inequalities. J. Inequal. Appl. 2021, 86 (2021). https://doi.org/10.1186/s13660-021-02622-x
Keywords
 C-function
 Confidence interval
 Jensen’s inequality
 Quantile