Some new estimates of the ‘Jensen gap’
Journal of Inequalities and Applications volume 2016, Article number: 39 (2016)
Abstract
Let \(( \Omega,\mu ) \) be a probability measure space. We consider the so-called ‘Jensen gap’

$$ J ( \varphi,\mu,f ) :=\int_{\Omega}\varphi ( f )\,d\mu-\varphi \biggl( \int_{\Omega}f\,d\mu \biggr) $$

for some classes of functions φ. Several new estimates and equalities are derived and compared with other results of this type. In particular, the case when φ has a Taylor expansion is treated and the corresponding discrete results are pointed out.
1 Introduction
Let \(( \Omega,\mu ) \) be a probability measure space, i.e. \(\mu ( \Omega ) =1\), and let f be a μ-measurable function on Ω. If φ is convex, then Jensen’s inequality

$$ \varphi \biggl( \int_{\Omega}f\,d\mu \biggr) \leq\int_{\Omega }\varphi ( f )\,d\mu $$
(1.1)

holds. This inequality can be traced back to Jensen’s original papers [1, 2] and is one of the most fundamental mathematical inequalities. One reason is that a great number of classical inequalities can be derived from (1.1); see e.g. [3] and the references given therein. The inequality (1.1) cannot in general be improved, since equality holds in (1.1) when \(\varphi ( x ) \equiv x\). However, for special classes of functions, (1.1) can be sharpened, e.g. by giving lower estimates of the so-called ‘Jensen gap’

$$ J ( \varphi,\mu,f ) :=\int_{\Omega}\varphi ( f )\,d\mu-\varphi \biggl( \int_{\Omega}f\,d\mu \biggr), $$

thus obtaining refined versions of (1.1).
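For a discrete probability measure, the Jensen gap and inequality (1.1) can be illustrated in a few lines of Python (a minimal sketch; the function name `jensen_gap` is ours):

```python
import math

def jensen_gap(phi, weights, values):
    """Discrete Jensen gap: sum(w_i * phi(x_i)) - phi(sum(w_i * x_i)).

    The weights must form a probability measure (nonnegative, summing to 1).
    """
    assert abs(sum(weights) - 1.0) < 1e-12 and all(w >= 0 for w in weights)
    mean = sum(w * x for w, x in zip(weights, values))
    return sum(w * phi(x) for w, x in zip(weights, values)) - phi(mean)

weights = [0.2, 0.3, 0.5]
values = [0.5, 1.0, 2.0]

# For convex phi the gap is nonnegative, which is exactly (1.1) ...
gap_exp = jensen_gap(math.exp, weights, values)
# ... and for phi(x) = x the gap vanishes, so (1.1) cannot be improved in general.
gap_id = jensen_gap(lambda x: x, weights, values)
print(gap_exp, gap_id)
```
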
We give a few examples of such results.
Example 1
(see [4])
Let φ be a superquadratic function, i.e. \(\varphi: [ 0,\infty ) \rightarrow \mathbb{R} \) is such that there exists a constant \(C ( x ) \), \(x\geq0\), such that

$$ \varphi ( y ) \geq\varphi ( x ) +C ( x ) ( y-x ) +\varphi \bigl( \vert y-x\vert \bigr) $$

for \(y\geq0\). For such functions we have the following estimate of the Jensen gap:

$$ J ( \varphi,\mu,f ) \geq\int_{\Omega}\varphi \biggl( \biggl\vert f ( s ) -\int_{\Omega}f\,d\mu \biggr\vert \biggr)\,d\mu ( s ) . $$
Example 2
We say that a function \(K ( x ) \) is γ-superconvex if \(\varphi ( x ) :=x^{-\gamma }K ( x ) \) is convex. If φ is a differentiable, convex, increasing function and \(\varphi ( 0 ) = \lim_{z\rightarrow0+} z\varphi^{\prime} ( z ) =0\), then we have the following estimate of the Jensen gap:
for \(z= \int_{\Omega}f ( s )\,d\mu ( s ) >0\) and \(f\geq 0\), \(f^{\gamma}\) when \(\gamma\geq0 \) are integrable functions on the probability measure space \(( \Omega,\mu ) \).
Remark 1
By using the results in Examples 1 and 2 it is possible to derive Hardy-type inequalities with other ‘breaking points’ (the point where the inequality reverses) than the usual breaking point \(p=1\). See [5, 7, 8] and [9].
Remark 2
In the recent paper [6] it was proved that the notion of γ-superconvexity makes sense also for the case \(-1\leq\gamma \leq0\), and in fact this was even used there to derive some new two-sided Jensen-type inequalities.
Example 3
(see [10])
In his paper, Walker studied the Jensen gap for the special case where f is the identity function, i.e. for \(J ( \varphi,\mu ) :=J ( \varphi,\mu,x ) \), and found an estimate of the type
where the positive constant \(C=C ( \varphi,\mu ) \) is easily computed.
In his paper it was assumed that ϕ admits a Taylor power series representation \(\phi ( x ) =\sum_{n=0}^{\infty }a_{n}x^{n}\), \(a_{n}\geq0\), \(n=0,1,2,\ldots\) , \(0< x\leq A<\infty\). In another recent paper, Dragomir [11] derived some other Jensen integral inequalities for this power series case. A comparison between these two results and our results is given in our concluding remarks.
Inspired by these results, we derive some new results of the same type. In Theorem 1 we get an estimate like that of Walker in [10] but for the general case of \(J ( \varphi,\mu,f ) \). In Theorem 2 we prove another complement of the Walker result by considering the Jensen functional
and get an estimate for this Jensen gap which even reduces to equality for \(\alpha=N\), \(N=2,3,\ldots\) . By using this result it is possible to derive a similar equality for the Jensen gap whenever it can be represented by a Taylor power series (see Theorem 3).
In Section 3 we show that our lower bound of the Jensen gap is better than the lower bound in [11] when the function that we deal with has a Taylor series expansion with non-negative coefficients. Moreover, we prove that by our technique we can in such cases derive also upper bounds and not only lower bounds as in [10].
2 The main results
Our first main result reads as follows.
Theorem 1
Let \(\phi: [ 0,A ) \rightarrow \mathbb{R} \), \(0< A\leq\infty\), have a Taylor power series representation \(\phi ( x ) =\sum_{n=0}^{\infty }a_{n}x^{n}\) on \([ 0,A ) \).
Let φ be a convex increasing function on \([ 0,A ) \) that is related to ϕ by

$$ \varphi ( x ) =\frac{\phi ( x ) -\phi ( 0 ) }{x}. $$
(a) If \(f\geq0\) and f, \(f^{2}\), and \(\phi\circ f\) are integrable functions on Ω, \(z=\int_{\Omega}f\,d\mu>0\), where μ is a probability measure on Ω, then

$$ \int_{\Omega}\phi ( f )\,d\mu-\phi ( z ) \geq \varphi^{\prime} ( z ) \biggl( \int_{\Omega}f^{2}\,d\mu -z^{2} \biggr) \geq0. $$

In other words,

$$ J ( \phi,\mu,f ) \geq\sum_{n=0}^{\infty} ( n+1 ) a_{n+2}z^{n} \biggl( \int_{\Omega}f^{2}\,d\mu-z^{2} \biggr) \geq0. $$
(b) For \(\overline{x}=\sum_{i=1}^{m}\alpha_{i}x_{i}\), \(\sum_{i=1}^{m}\alpha_{i}=1\), \(0\leq\alpha_{i}\leq1\), \(0\leq x_{i}< A\), \(i=1,\ldots,m\), it yields

$$ \sum_{i=1}^{m}\alpha_{i}\phi ( x_{i} ) -\phi ( \overline{x} ) \geq\varphi^{\prime} ( \overline {x} ) \Biggl( \sum_{i=1}^{m}\alpha_{i}x_{i}^{2}-\overline {x}^{2} \Biggr) \geq0. $$

In other words,

$$ \sum_{i=1}^{m}\alpha_{i}\phi ( x_{i} ) -\phi ( \overline{x} ) \geq\sum_{n=0}^{\infty} ( n+1 ) a_{n+2}\overline{x}^{n} \Biggl( \sum_{i=1}^{m}\alpha_{i}x_{i}^{2}-\overline{x}^{2} \Biggr) \geq0. $$
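As a numerical sanity check of the lower bound that the proof of Theorem 1 yields on our reading, namely \(\sum_{i}\alpha_{i}\phi ( x_{i} ) -\phi ( \overline{x} ) \geq\varphi^{\prime} ( \overline{x} ) ( \sum_{i}\alpha _{i}x_{i}^{2}-\overline{x}^{2} ) \) with \(\varphi ( x ) = ( \phi ( x ) -\phi ( 0 ) ) /x\), here is a sketch for the hypothetical choice ϕ = exp (so \(a_{n}=1/n!\geq0\)):

```python
import math

# Discrete probability measure (alpha_i) and nonnegative values (x_i),
# i.e. part (b) of Theorem 1, with phi(x) = exp(x) so that a_n = 1/n! >= 0.
alpha = [0.25, 0.25, 0.5]
x = [0.2, 1.0, 2.5]

z = sum(a * v for a, v in zip(alpha, x))                      # z = xbar > 0
gap = sum(a * math.exp(v) for a, v in zip(alpha, x)) - math.exp(z)

# varphi(x) = (phi(x) - phi(0))/x; for phi = exp this is (e^x - 1)/x, and
# its derivative at z is ((z - 1)e^z + 1)/z^2 (worked out for this example).
varphi_prime = ((z - 1.0) * math.exp(z) + 1.0) / z**2
second_moment = sum(a * v * v for a, v in zip(alpha, x))
lower_bound = varphi_prime * (second_moment - z**2)

print(gap, lower_bound)   # expect 0 <= lower_bound <= gap
```
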
Proof
For \(\phi ( x ) =\sum_{n=0}^{\infty}a_{n}x^{n}\), \(0\leq x< A\), define \(\psi: [ 0,A ) \rightarrow \mathbb{R} _{+}\) by \(\psi ( x ) =\phi ( x ) -\phi ( 0 ) =\sum_{n=0}^{\infty}a_{n+1}x^{n+1}\), \(0\leq x< A\), and let \(\varphi ( x ) =\frac{\psi ( x ) }{x}\), i.e. \(x\varphi ( x ) =\psi ( x ) \), \(0\leq x< A\). Then \(\psi ( x ) \) is a 1-quasiconvex function (see [6]), \(\varphi ( x ) =\sum_{n=0}^{\infty}a_{n+1}x^{n}\), \(0\leq x< A\), and \(\varphi ^{\prime} ( x ) =\sum_{n=0}^{\infty} ( n+1 ) a_{n+2}x^{n}\).
The functions ϕ, ψ, φ, and \(\varphi^{\prime}\) are differentiable on \([ 0,A ) \). From the convexity of \(\varphi ( x ) \) we have

$$ \varphi ( y ) \geq\varphi ( x ) +\varphi ^{\prime} ( x ) ( y-x ) $$

and, therefore,

$$ \psi ( y ) =y\varphi ( y ) \geq y\varphi ( x ) +y\varphi^{\prime} ( x ) ( y-x ) . $$
Since \(\psi ( x ) =\phi ( x ) -\phi ( 0 ) \) we get

$$ \phi ( y ) \geq\phi ( 0 ) +y\varphi ( x ) +y\varphi^{\prime} ( x ) ( y-x ) . $$
Now using this inequality with \(x=z\), \(y=f\), and integrating, we find that

$$ \int_{\Omega}\phi ( f )\,d\mu\geq\phi ( 0 ) +z\varphi ( z ) +\varphi^{\prime} ( z ) \int_{\Omega }f ( f-z )\,d\mu=\phi ( z ) +\varphi^{\prime } ( z ) \biggl( \int_{\Omega}f^{2}\,d\mu-z^{2} \biggr) . $$
In the last inequality we have used \(z=\int_{\Omega}f\,d\mu>0\) and the fact that φ is convex and increasing, where \(\varphi ( z ) =\frac{\phi ( z ) -\phi ( 0 ) }{z}\).
Hence (a) is proved and since (b) is just a special case of (a), the proof is complete. □
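The series manipulations at the start of the proof are easy to check numerically. The following sketch does so for the hypothetical choice ϕ = exp (so \(a_{n}=1/n!\)), comparing truncated series for ψ, φ, and \(\varphi^{\prime}\) with their closed forms:

```python
import math

N_TERMS = 40
a = [1.0 / math.factorial(n) for n in range(N_TERMS)]   # phi(x) = e^x

def psi_series(x):
    # psi(x) = phi(x) - phi(0) = sum_{n>=0} a_{n+1} x^{n+1}
    return sum(a[n + 1] * x ** (n + 1) for n in range(N_TERMS - 1))

def varphi_series(x):
    # varphi(x) = psi(x)/x = sum_{n>=0} a_{n+1} x^n
    return sum(a[n + 1] * x ** n for n in range(N_TERMS - 1))

def varphi_prime_series(x):
    # varphi'(x) = sum_{n>=0} (n+1) a_{n+2} x^n
    return sum((n + 1) * a[n + 2] * x ** n for n in range(N_TERMS - 2))

x = 1.3
assert abs(psi_series(x) - (math.exp(x) - 1.0)) < 1e-12
assert abs(varphi_series(x) - (math.exp(x) - 1.0) / x) < 1e-12
assert abs(varphi_prime_series(x) - ((x - 1) * math.exp(x) + 1) / x**2) < 1e-12
print("series identities check out at x =", x)
```
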
For the proof of our next main result we need the following lemma, which is also of independent interest.
Lemma 1
Let φ be a differentiable function on \(I\subset \mathbb{R} \), and let \(x,y\in I\). Then, for \(N=2,3,\ldots\) ,
In particular, for \(N=2\) we have
Proof
A simple calculation shows that (2.2) holds, i.e., that (2.1) holds for \(N=2\). For \(N=3\) (2.1) reads
Moreover, it is easy to verify the identity
By using (2.4) together with (2.2) and making some straightforward calculations we obtain (2.3). The general proof follows in the same way using induction and the more general (than (2.4)) identity
□
Now we are ready to state our next main result.
Theorem 2
Let μ be a probability measure on \(\Omega= (0,\infty)\), \(z=\int_{\Omega}y\,d\mu ( y ) >0\), and \(N=2,3,\ldots\) . Then the refined Jensen-type inequality
holds for any \(\alpha\geq N\). Moreover, for \(N-1<\alpha\leq N\) (2.5) holds in the reversed direction. In particular, for \(\alpha=N\) we have equality in (2.5).
Proof
A convex differentiable function \(\varphi ( x ) \) is characterized by

$$ \varphi ( y ) \geq\varphi ( x ) +\varphi ^{\prime} ( x ) ( y-x ) , $$
and this inequality holds in the reversed direction if \(\varphi ( x ) \) is concave. For \(\varphi ( x ) =x\) we have equality. Therefore, when \(\varphi ( x ) \) is convex it yields
Hence in view of Lemma 1 we find that
By using this inequality with the convex function \(\varphi ( x ) =x^{\alpha-N+1}\), \(x\geq0\), \(\alpha\geq N\), we obtain
Now, choosing \(x=z\), integrating over Ω, and using the fact that \(\int_{\Omega} ( y-z )\,d\mu ( y ) =0\), we obtain (2.5). For the reversed inequality we use the concave function \(\varphi ( x ) =x^{\alpha-N+1}\), \(( N-1 ) <\alpha\leq N\), and all the inequalities above reverse. For \(\alpha=N\) we get equality, so the proof is complete. □
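The equality case \(\alpha=N\) rests on an elementary algebraic identity (our reconstruction of the type of identity used in Lemma 1; a sketch): \(y^{N}-x^{N}-Nx^{N-1} ( y-x ) = ( y-x ) ^{2}\sum_{k=1}^{N-1}\sum_{j=0}^{k-1}y^{j}x^{N-2-j}\); integrating in y with \(x=z\) and using \(\int_{\Omega} ( y-z )\,d\mu ( y ) =0\) turns the left-hand side into the Jensen gap for \(\varphi ( x ) =x^{N}\). It can be checked directly:

```python
def lhs(y, x, N):
    # y^N - x^N - N*x^(N-1)*(y - x): the gap term before integration
    return y ** N - x ** N - N * x ** (N - 1) * (y - x)

def rhs(y, x, N):
    # (y - x)^2 * sum_{k=1}^{N-1} sum_{j=0}^{k-1} y^j * x^(N-2-j)
    s = sum(y ** j * x ** (N - 2 - j) for k in range(1, N) for j in range(k))
    return (y - x) ** 2 * s

for N in (2, 3, 4, 5, 6):
    for y, x in [(0.5, 2.0), (3.0, 1.5), (1.0, 1.0)]:
        assert abs(lhs(y, x, N) - rhs(y, x, N)) < 1e-9
print("identity verified for N = 2..6")
```
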
Corollary 1
Let \(x_{i}\geq0\), \(\alpha_{i}\geq0\), \(i=1,2,\ldots,m\), \(\sum_{i=1}^{m}\alpha_{i}=1\), and \(\overline{x}=\sum_{i=1}^{m}\alpha _{i}x_{i}\). Then, for \(N=2,3,\ldots\) ,
holds for any \(\alpha\geq N\). Moreover, for \(N-1<\alpha\leq N\) (2.6) holds in the reversed direction. In particular, for \(\alpha =N\), (2.6) reduces to an equality.
Our final main result reads as follows.
Theorem 3
Let \(0< A\leq\infty\) and let \(\phi: ( 0,A ] \rightarrow \mathbb{R} \) have a Taylor expansion \(\phi ( x ) =\sum_{n=0}^{\infty }a_{n}x^{n}\), on \(( 0,A ] \). If μ is a probability measure on \(( 0,A ] \) and \(z=\int_{0}^{A}x\,d\mu ( x ) >0\), then
Proof
We note that
Obviously, \(\int_{0}^{A} ( x^{n}-z^{n} )\,d\mu=0\), for \(n=0,1\), and hence (2.7) follows from the equality cases in (2.5) in Theorem 2, i.e. when \(\alpha=N=2,3,\ldots\) .
The proof is complete. □
Corollary 2
Let \(0< A\leq\infty\) and let \(\phi: [ 0,A ) \rightarrow \mathbb{R} \) have a Taylor expansion \(\phi ( x ) =\sum_{n=0}^{\infty}a_{n}x^{n}\) on \([ 0,A ) \). If \(\overline{x}=\sum_{i=1}^{m}\alpha_{i}x_{i}\), \(\sum_{i=1}^{m}\alpha_{i}=1\), \(0\leq \alpha_{i}\leq1\), \(0\leq x_{i}< A\), \(i=1,2,\ldots,m\), then
Corollary 3
Let \(0< a< b<\infty\), and let μ be a probability measure on \(( a,b ) \). Then, with \(z=\int_{a}^{b}x\,d\mu\), we have the following estimate of the Jensen gap \(J_{N}:=\int_{a}^{b}x^{N}\,d\mu- ( \int_{a}^{b}x\,d\mu ) ^{N}\), \(N=2,3,\ldots\) :

$$ \frac{N ( N-1 ) }{2}a^{N-2}\int_{a}^{b} ( x-z ) ^{2}\,d\mu ( x ) \leq J_{N}\leq\frac{N ( N-1 ) }{2}b^{N-2}\int_{a}^{b} ( x-z ) ^{2}\,d\mu ( x ) . $$
Proof
We use Theorem 2 with \(\alpha=N\) and find that
We note that if \(a< x< b\), then \(a< z< b \) so that \(a^{N-2}\leq x^{k-1}z^{N-k-1}\leq b^{N-2}\). Moreover, \(\sum_{k=1}^{N-1} ( N-k ) =\frac{N ( N-1 ) }{2}\) and
so (2.8) is proved. □
Remark 3
For the case \(N=2\) both inequalities in (2.8) reduce to equalities. Moreover, for the discrete case we have: If \(0< a< x_{i}< b\), \(\alpha_{i}\geq0\), \(i=1,2,\ldots,m\), \(\sum_{i=1}^{m}\alpha_{i}=1\), \(\overline{x}=\sum_{i=1}^{m}\alpha_{i}x_{i}\), then, for \(N=2,3,\ldots\) ,
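The two-sided estimate of Corollary 3 and its discrete counterpart in Remark 3 can be sanity checked numerically. The sketch below assumes the reading \(\frac{N ( N-1 ) }{2}a^{N-2}V\leq J_{N}\leq\frac{N ( N-1 ) }{2}b^{N-2}V\), where V is the variance \(\sum_{i}\alpha_{i} ( x_{i}-\overline{x} ) ^{2}\):

```python
# Discrete probability measure with all x_i inside (a, b)
alpha = [0.1, 0.4, 0.3, 0.2]
x = [0.5, 1.0, 2.0, 3.5]
a_lo, b_hi = 0.4, 3.6

xbar = sum(w * v for w, v in zip(alpha, x))
var = sum(w * (v - xbar) ** 2 for w, v in zip(alpha, x))

for N in (2, 3, 4, 5):
    J_N = sum(w * v ** N for w, v in zip(alpha, x)) - xbar ** N
    c = N * (N - 1) / 2
    lower = c * a_lo ** (N - 2) * var
    upper = c * b_hi ** (N - 2) * var
    assert lower - 1e-9 <= J_N <= upper + 1e-9
    if N == 2:
        # for N = 2 both bounds collapse to the variance (Remark 3)
        assert abs(J_N - var) < 1e-9
print("two-sided bounds hold for N = 2..5")
```
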
3 Final remarks and examples
In this section we present some recent interesting results of Dragomir [11] and Walker [10]. Moreover, we point out the corresponding special cases of our results and compare these results with those of [11] and [10].
Example 4
In Dragomir’s paper [11], Theorem 2, it was proved that for
which converges on \(0< x< R\leq\infty\), the following lower bound of the Jensen gap holds:
when \(( \Omega,\mu ) \) is a probability measure space, \(f\geq 0\), and f, \(f^{2}\), and \(\phi\circ f\) are integrable on Ω and \(\int_{\Omega}f\,d\mu>0\).
Example 5
In Theorem 1 we proved that for convex increasing functions we get the inequalities
A function satisfying (3.1) is convex and increasing, so Theorem 1 applies, which means that we obtain the inequalities in (3.3).
Remark 4
It is easily computed that when ϕ is of the form (3.1), then
holds, and from this we conclude that our bound in (3.3), when (3.1) is satisfied, is stronger than Dragomir’s (3.2). Indeed,
and
and our claim is obvious.
Example 6
In Theorem 3.1 in Walker’s paper [10], a lower bound for the Jensen gap is given for a function ϕ that satisfies (3.1):
where
when μ is a probability measure defined on \(\Omega= ( 0,R ) \) and \(\mu_{2}\) is μ restricted and normalized to \(( 1,R ) \).
More generally, in Section 4 in [10], \(\mu ( 1,R ) \) was replaced by \(\mu ( a,R ) \) and we have
where
when \(\mu_{a}\) is μ restricted and normalized to \(\Omega= ( a,R ) \).
From Corollary 3 and Remark 3 we easily get the following.
Example 7
Let \(0< A\leq\infty\) and let \(\phi: ( 0,A ] \rightarrow \mathbb{R} \) have a Taylor expansion \(\phi ( x ) =\sum_{n=0}^{\infty }a_{n}x^{n}\), \(a_{n}\geq0\), \(n=2,3,\ldots\) , on \(( 0,A ] \). If μ is a probability measure on \(( 0,A ] \), \(0\leq a< b\leq A \), and \(z=\int_{0}^{A}x\,d\mu ( x ) >0\), then
Moreover, for the discrete case we have: If \(0< a< x_{i}< b\), \(\alpha _{i}\geq 0\), \(i=1,2,\ldots,m\), \(\sum_{i=1}^{m}\alpha_{i}=1\), \(\overline{x}=\sum_{i=1}^{m}\alpha_{i}x_{i}\), then, for \(N=2,3,\ldots\) ,
Remark 5
The lower bound in (3.5) coincides with that in (3.6) when \(a=1\). The lower bound in (3.6) is better than that in (3.5) when \(a<1\), but Walker’s bound (3.5) is better than (3.6) for \(a>1\). It seems not to be possible to derive an upper bound like that in (3.5) by using the method in [10].
References
Jensen, JLWV: Om konvexe Funktioner og Uligheder mellem Middelværdier. Nyt Tidsskr. Math. 16B, 49-69 (1905) (in Danish)
Jensen, JLWV: Sur les fonctions convexes et les inégalités entre les valeurs moyennes. Acta Math. 30, 175-193 (1906) (in French)
Persson, L-E, Samko, N: Inequalities and Convexity, Operator Theory: Advances and Applications, vol. 242, 29 pp. Birkhäuser, Basel (2014)
Abramovich, S, Jameson, G, Sinnamon, G: Refining of Jensen’s inequality. Bull. Math. Soc. Sci. Math. Roum. 47, 3-14 (2004)
Abramovich, S, Persson, L-E: Some new scales of refined Hardy type inequalities via functions related to superquadracity. Math. Inequal. Appl. 16, 679-695 (2013)
Abramovich, S, Persson, L-E, Samko, N: On γ-quasiconvexity, superquadracity and two sided reversed Jensen type inequalities. Math. Inequal. Appl. 18(2), 615-627 (2015)
Oguntuase, J, Persson, L-E: Refinement of Hardy’s inequalities via superquadratic and subquadratic functions. J. Math. Anal. Appl. 339, 1305-1312 (2008)
Abramovich, S, Persson, L-E: Some new refined Hardy type inequalities with breaking points \(p=2\) and \(p=3\). In: Proceedings of the IWOTA 2011, vol. 236, pp. 1-10. Birkhäuser, Basel (2014)
Abramovich, S, Persson, L-E, Samko, N: Some new scales of refined Jensen and Hardy type inequalities. Math. Inequal. Appl. 17, 1105-1114 (2014)
Walker, SG: On a lower bound for Jensen inequality. SIAM J. Math. Anal. (to appear)
Dragomir, SS: Jensen integral inequality for power series with nonnegative coefficients and applications. RGMIA Res. Rep. Collect. 17, 42 (2014)
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
The authors have on equal levels discussed, posed research questions, formulated theorems, and made proofs in this paper. Both authors have read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
About this article
Cite this article
Abramovich, S., Persson, LE. Some new estimates of the ‘Jensen gap’. J Inequal Appl 2016, 39 (2016). https://doi.org/10.1186/s13660-016-0985-4