On Some Improvements of the Jensen Inequality with Some Applications
Journal of Inequalities and Applications volume 2009, Article number: 323615 (2009)
An improvement of the Jensen inequality for convex and monotone functions is given, as well as various applications to means. Similar results for related inequalities of the Jensen type are also obtained. Some applications of Cauchy means and of the Jensen inequality are discussed as well.
The well-known Jensen inequality for convex functions reads as follows.

Theorem 1.1. If $(\Omega, \mathcal{A}, \mu)$ is a probability space and if $g \colon \Omega \to \mathbb{R}$ is $\mu$-integrable and such that $g(\omega) \in (a,b)$ for all $\omega \in \Omega$, then

$$f\left(\int_{\Omega} g \, d\mu\right) \le \int_{\Omega} f(g) \, d\mu \qquad (1.1)$$

is valid for any convex function $f \colon (a,b) \to \mathbb{R}$. In the case when $f$ is strictly convex on $(a,b)$, one has equality in (1.1) if and only if $g$ is constant almost everywhere on $\Omega$.
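As a quick numerical illustration (not part of the original argument), the following sketch checks (1.1) for a discrete probability measure; the weights, the values of $g$, and the convex choice $f = \exp$ are arbitrary.

```python
import numpy as np

# Discrete probability space: mu({omega_i}) = w_i and g(omega_i) = g_i.
w = np.array([0.2, 0.5, 0.3])   # probability weights, sum to 1
g = np.array([1.0, 2.5, 4.0])   # values of g
f = np.exp                      # a convex function

lhs = f(np.dot(w, g))           # f( integral of g d(mu) )
rhs = np.dot(w, f(g))           # integral of f(g) d(mu)
assert lhs <= rhs               # Jensen's inequality (1.1)
```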
Here and in the whole paper we suppose that all integrals exist. By considering differences of (1.1), Anwar and Pečarić proved in [1] an interesting result on log-convexity. We can state this result for integrals as follows.
Theorem 1.2. Let $(\Omega, \mathcal{A}, \mu)$ be a probability space and let $g \colon \Omega \to \mathbb{R}$ be $\mu$-integrable and such that $g(\omega) > 0$ for all $\omega \in \Omega$. Define

$$\Lambda_t = \begin{cases} \dfrac{1}{t(t-1)} \left( \displaystyle\int_{\Omega} g^t \, d\mu - \left( \int_{\Omega} g \, d\mu \right)^{t} \right), & t \neq 0, 1, \\[1ex] \log \displaystyle\int_{\Omega} g \, d\mu - \int_{\Omega} \log g \, d\mu, & t = 0, \\[1ex] \displaystyle\int_{\Omega} g \log g \, d\mu - \int_{\Omega} g \, d\mu \cdot \log \int_{\Omega} g \, d\mu, & t = 1, \end{cases} \qquad (1.2)$$

and let $\Lambda_t$ be positive. Then $t \mapsto \Lambda_t$ is log-convex, that is, for $r < s < t$ the following is valid:

$$\Lambda_s^{\,t-r} \le \Lambda_r^{\,t-s} \, \Lambda_t^{\,s-r}. \qquad (1.3)$$
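A minimal numerical sketch of Theorem 1.2 (again for an arbitrary discrete measure, not from the paper): it evaluates $\Lambda_t$ as in (1.2) and tests (1.3) for one sample triple $r < s < t$.

```python
import numpy as np

w = np.array([0.2, 0.5, 0.3])   # discrete probability weights
g = np.array([1.0, 2.5, 4.0])   # positive values of g

def Lam(t):
    """Lambda_t from (1.2) for the discrete measure above."""
    if t == 0:
        return np.log(np.dot(w, g)) - np.dot(w, np.log(g))
    if t == 1:
        return np.dot(w, g * np.log(g)) - np.dot(w, g) * np.log(np.dot(w, g))
    return (np.dot(w, g**t) - np.dot(w, g)**t) / (t * (t - 1))

r, s, t = 0.5, 2.0, 3.0         # any r < s < t
assert Lam(s)**(t - r) <= Lam(r)**(t - s) * Lam(t)**(s - r)   # (1.3)
```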
The following improvement of (1.1) was obtained in [2].
Theorem 1.3. Let the conditions of Theorem 1.1 be fulfilled. Then

$$\int_{\Omega} f(g) \, d\mu - f(\bar{g}) \ge \left| \int_{\Omega} \left| f(g) - f(\bar{g}) \right| d\mu - \left| f'_{+}(\bar{g}) \right| \int_{\Omega} \left| g - \bar{g} \right| d\mu \right|, \qquad (1.4)$$

where $f'_{+}$ represents the right-hand derivative of $f$ and

$$\bar{g} = \int_{\Omega} g \, d\mu.$$

If $f$ is concave, then the left-hand side of (1.4) should be $f(\bar{g}) - \int_{\Omega} f(g) \, d\mu$.
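The refinement (1.4) can also be probed numerically; in the hedged sketch below (arbitrary discrete data, $f(x) = x^2$), the Jensen gap indeed dominates the right-hand side of (1.4).

```python
import numpy as np

w = np.array([0.1, 0.6, 0.3])                 # probability weights
g = np.array([1.0, 2.0, 5.0])                 # values of g
f, fprime = lambda x: x**2, lambda x: 2 * x   # convex f and its derivative

gbar = np.dot(w, g)
jensen_gap = np.dot(w, f(g)) - f(gbar)
refinement = abs(np.dot(w, np.abs(f(g) - f(gbar)))
                 - abs(fprime(gbar)) * np.dot(w, np.abs(g - gbar)))
assert jensen_gap >= refinement               # inequality (1.4)
```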
In this paper, we give another proof and an extension of Theorem 1.2, as well as improvements of Theorem 1.3 for monotone convex functions, with some applications. We also give applications of the Jensen inequality to divergence measures in information theory and to the related Cauchy means.
2. Another Proof and Extension of Theorem 1.2
In fact, Theorem 1.2, in its discrete version, was first initiated by Simić in [3].
Moreover, in his proof he used convex functions defined on $(0, \infty)$ (see [3, Theorem 1]), namely the function

$$f(x) = u^2 \, \frac{x^s}{s(s-1)} + 2uw \, \frac{x^{\frac{s+t}{2}}}{\frac{s+t}{2} \left( \frac{s+t}{2} - 1 \right)} + w^2 \, \frac{x^t}{t(t-1)}, \quad x > 0, \qquad (2.1)$$

where $u$ and $w$ are real and $s, t, \frac{s+t}{2} \neq 0, 1$.
In [4] we have given a correct proof by using an extension of (2.1) to a larger domain.
Moreover, we can give another proof in which we use only (2.1), but without using convexity results as in [4].
Proof of Theorem 1.2.
Consider the function $f$ defined, as in [3], by (2.1). For $x > 0$ we have

$$f''(x) = u^2 x^{s-2} + 2uw \, x^{\frac{s+t}{2} - 2} + w^2 x^{t-2} = \left( u \, x^{\frac{s-2}{2}} + w \, x^{\frac{t-2}{2}} \right)^2 \ge 0, \qquad (2.2)$$

that is, $f$ is convex. Note that, for $t \neq 0, 1$,

$$\Lambda_t = \int_{\Omega} \frac{g^t}{t(t-1)} \, d\mu - \frac{1}{t(t-1)} \left( \int_{\Omega} g \, d\mu \right)^{t},$$

so by using (1.1) for the function $f$ we get

$$u^2 \Lambda_s + 2uw \, \Lambda_{\frac{s+t}{2}} + w^2 \Lambda_t \ge 0. \qquad (2.3)$$

Therefore, (2.3) is valid for all real $u$ and $w$. Now, since the left-hand side of (2.3) is a quadratic form in $u$ and $w$, its nonnegativity yields

$$\Lambda_{\frac{s+t}{2}}^{2} \le \Lambda_s \, \Lambda_t. \qquad (2.4)$$

Since $\lim_{t \to 0} \Lambda_t = \Lambda_0$ and $\lim_{t \to 1} \Lambda_t = \Lambda_1$, we also have that (2.4) is valid for $s, t \in \{0, 1\}$. So $\Lambda_t$ is a log-convex function in the Jensen sense on $\mathbb{R}$.

Moreover, continuity of $t \mapsto \Lambda_t$ implies log-convexity, that is, the following is valid for $r < s < t$:

$$\Lambda_s^{\,t-r} \le \Lambda_r^{\,t-s} \, \Lambda_t^{\,s-r}. \qquad (2.5)$$
Let us note that this result was used in [4] to obtain the corresponding Cauchy means. Moreover, we can extend the above result.
Theorem 2.1. Let the conditions of Theorem 1.2 be fulfilled and let $t_1, \dots, t_n$ be real numbers. Then

$$\det \left[ \Lambda_{\frac{t_i + t_j}{2}} \right]_{i,j=1}^{k} \ge 0, \quad k = 1, \dots, n, \qquad (2.6)$$

where $\det \left[ a_{ij} \right]_{i,j=1}^{k}$ denotes the determinant of order $k$ with elements $a_{ij} = \Lambda_{\frac{t_i + t_j}{2}}$.
Consider the function

$$f(x) = \sum_{i,j=1}^{n} u_i u_j \, \frac{x^{\frac{t_i + t_j}{2}}}{\frac{t_i + t_j}{2} \left( \frac{t_i + t_j}{2} - 1 \right)}, \qquad (2.7)$$

for $u_i \in \mathbb{R}$, real $t_i$ with $\frac{t_i + t_j}{2} \neq 0, 1$, and $x > 0$.

So, it holds that

$$f''(x) = \sum_{i,j=1}^{n} u_i u_j \, x^{\frac{t_i + t_j}{2} - 2} = \left( \sum_{i=1}^{n} u_i \, x^{\frac{t_i - 2}{2}} \right)^2 \ge 0. \qquad (2.8)$$

So $f$ is a convex function and, as a consequence of (1.1), one has

$$\sum_{i,j=1}^{n} u_i u_j \, \Lambda_{\frac{t_i + t_j}{2}} \ge 0. \qquad (2.9)$$

Therefore, the matrix $\left[ \Lambda_{\frac{t_i + t_j}{2}} \right]_{i,j=1}^{n}$ (the matrix with elements $\Lambda_{\frac{t_i + t_j}{2}}$) is positive semidefinite, and (2.6) is valid for $\frac{t_i + t_j}{2} \neq 0, 1$. Moreover, since $\Lambda_t$ is continuous for all $t$, (2.6) is valid for all real $t_1, \dots, t_n$.
In Theorem 2.1, if we set $n = 2$, $t_1 = s$, and $t_2 = t$, we get Theorem 1.2.
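The positive semidefiniteness asserted in Theorem 2.1 is easy to probe numerically. The sketch below (an illustration with arbitrary data, not part of the paper) forms the matrix $\left[\Lambda_{(t_i+t_j)/2}\right]$ for a discrete measure and checks its eigenvalues and the determinants in (2.6) up to floating-point error.

```python
import numpy as np

w = np.array([0.2, 0.5, 0.3])   # discrete probability weights
g = np.array([1.0, 2.5, 4.0])   # positive values of g

def Lam(t):                     # Lambda_t from (1.2), here with t != 0, 1
    return (np.dot(w, g**t) - np.dot(w, g)**t) / (t * (t - 1))

ts = [0.5, 2.0, 3.0, 4.5]       # arbitrary exponents t_1, ..., t_n
M = np.array([[Lam((a + b) / 2) for b in ts] for a in ts])

assert np.all(np.linalg.eigvalsh(M) >= -1e-9)    # positive semidefinite
for k in range(1, len(ts) + 1):                  # determinants in (2.6)
    assert np.linalg.det(M[:k, :k]) >= -1e-9
```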
3. Improvements of the Jensen Inequality for Monotone Convex Function
In this section and in the following one, we denote $\bar{g} = \int_{\Omega} g \, d\mu$ and $\bar{f} = \int_{\Omega} f(g) \, d\mu$.
Theorem 3.1. If $(\Omega, \mathcal{A}, \mu)$ is a probability space and if $g$ is such that $g(\omega) \in (a,b)$ for $\omega \in \Omega$, and if $g(\omega) \le \bar{g}$ exactly for $\omega \in \Omega_1$ ($\Omega_1$ is measurable, i.e., $\Omega_1 = \{ \omega \in \Omega : g(\omega) \le \bar{g} \} \in \mathcal{A}$), then

$$\bar{f} - f(\bar{g}) \ge \left| \, \left| \bar{f} - 2 \int_{\Omega_1} f(g) \, d\mu - \left( 1 - 2\mu(\Omega_1) \right) f(\bar{g}) \right| - 2 \left| f'_{+}(\bar{g}) \right| \left( \mu(\Omega_1) \, \bar{g} - \int_{\Omega_1} g \, d\mu \right) \right| \qquad (3.1)$$

for every monotone convex function $f$ on $(a,b)$. If $f$ is monotone concave, then the left-hand side of (3.1) should be $f(\bar{g}) - \bar{f}$.
Consider the case when $f$ is nondecreasing on $(a,b)$. Then $f(g(\omega)) \le f(\bar{g})$ for $\omega \in \Omega_1$ and $f(g(\omega)) \ge f(\bar{g})$ for $\omega \in \Omega \setminus \Omega_1$, so that

$$\int_{\Omega} \left| f(g) - f(\bar{g}) \right| d\mu = \bar{f} - 2 \int_{\Omega_1} f(g) \, d\mu - \left( 1 - 2\mu(\Omega_1) \right) f(\bar{g}), \qquad (3.3)$$

$$\int_{\Omega} \left| g - \bar{g} \right| d\mu = 2 \left( \mu(\Omega_1) \, \bar{g} - \int_{\Omega_1} g \, d\mu \right). \qquad (3.4)$$
Now from (1.4), (3.3), and (3.4) we get (3.1).
The case when $f$ is nonincreasing can be treated in a similar way.
Of course, a discrete inequality is a simple consequence of Theorem 3.1.

Corollary 3.2. Let $f$ be a monotone convex function, $x_i \in (a,b)$, and $p_i > 0$ with $\sum_{i=1}^{n} p_i = 1$. If $x_i \le \bar{x} = \sum_{k=1}^{n} p_k x_k$ exactly for $i \in I_1 \subseteq \{1, \dots, n\}$, then

$$\sum_{i=1}^{n} p_i f(x_i) - f(\bar{x}) \ge \left| \, \left| \sum_{i=1}^{n} p_i f(x_i) - 2 \sum_{i \in I_1} p_i f(x_i) - \left( 1 - 2 \sum_{i \in I_1} p_i \right) f(\bar{x}) \right| - 2 \left| f'_{+}(\bar{x}) \right| \left( \bar{x} \sum_{i \in I_1} p_i - \sum_{i \in I_1} p_i x_i \right) \right|. \qquad (3.5)$$

If $f$ is monotone concave, then the left-hand side of (3.5) should be $f(\bar{x}) - \sum_{i=1}^{n} p_i f(x_i)$.
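A hedged numerical sketch of (3.5), with arbitrary weights and nodes and the monotone convex choice $f = \exp$:

```python
import numpy as np

p = np.array([0.1, 0.2, 0.3, 0.4])   # positive weights, sum to 1
x = np.array([0.5, 1.0, 2.0, 3.0])   # nodes x_i
f, fprime = np.exp, np.exp           # monotone convex f and f'

xbar = np.dot(p, x)
low = x <= xbar                      # index set I_1
inner = abs(np.dot(p, f(x)) - 2 * np.dot(p[low], f(x[low]))
            - (1 - 2 * p[low].sum()) * f(xbar))
rhs = abs(inner - 2 * fprime(xbar)
          * (xbar * p[low].sum() - np.dot(p[low], x[low])))
assert np.dot(p, f(x)) - f(xbar) >= rhs   # inequality (3.5)
```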
The following improvement of the Hermite-Hadamard inequality is valid (cf. [5]).

Theorem 3.3. Let $f \colon [a,b] \to \mathbb{R}$ be differentiable convex. Then

(i) the inequality

$$\frac{1}{b-a} \int_a^b f(x) \, dx - f\left( \frac{a+b}{2} \right) \ge \left| \frac{1}{b-a} \int_a^b \left| f(x) - f\left( \frac{a+b}{2} \right) \right| dx - \left| f'\left( \frac{a+b}{2} \right) \right| \frac{b-a}{4} \right| \qquad (3.7)$$

holds. If $f$ is differentiable concave, then the left-hand side of (3.7) should be $f\left( \frac{a+b}{2} \right) - \frac{1}{b-a} \int_a^b f(x) \, dx$;
(ii) if $f$ is in addition monotone, then the inequality

$$\frac{1}{b-a} \int_a^b f(x) \, dx - f\left( \frac{a+b}{2} \right) \ge \left| \, \frac{1}{b-a} \left| \int_{\frac{a+b}{2}}^{b} f(x) \, dx - \int_{a}^{\frac{a+b}{2}} f(x) \, dx \right| - \left| f'\left( \frac{a+b}{2} \right) \right| \frac{b-a}{4} \right| \qquad (3.8)$$

holds. If $f$ is differentiable and monotone concave, then the left-hand side of (3.8) should be $f\left( \frac{a+b}{2} \right) - \frac{1}{b-a} \int_a^b f(x) \, dx$.
Setting $\Omega = [a,b]$, $g(x) = x$, and $d\mu = \frac{dx}{b-a}$ in (1.4), we get (3.7).

Setting $\Omega = [a,b]$, $g(x) = x$, $d\mu = \frac{dx}{b-a}$, and $\Omega_1 = \left[ a, \frac{a+b}{2} \right]$ in (3.1), we get (3.8).
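As before, a small numerical sketch (with the arbitrary choices $[a,b] = [0,2]$ and $f = \exp$) illustrates the refined Hermite-Hadamard bound (3.7); the integrals are approximated by the trapezoidal rule.

```python
import numpy as np

a, b = 0.0, 2.0
x = np.linspace(a, b, 20001)      # integration grid
f, fprime = np.exp, np.exp        # differentiable convex f and f'
m = (a + b) / 2

avg = np.trapz(f(x), x) / (b - a)                 # mean value of f on [a, b]
gap = avg - f(m)                                  # Hermite-Hadamard gap
rhs = abs(np.trapz(np.abs(f(x) - f(m)), x) / (b - a)
          - abs(fprime(m)) * (b - a) / 4)
assert gap >= rhs                                 # inequality (3.7)
```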
4. Improvements of the Levinson Inequality
Theorem 4.1. If the third derivative of $f$ exists and is nonnegative on $(0, 2a)$, then for $x_k \in (0, a)$ and $p_k > 0$ with $\sum_{k=1}^{n} p_k = 1$, $k = 1, \dots, n$, one has:

(i) with $F(x) = f(2a - x) - f(x)$ and $\bar{x} = \sum_{k=1}^{n} p_k x_k$,

$$\sum_{k=1}^{n} p_k f(2a - x_k) - f(2a - \bar{x}) - \sum_{k=1}^{n} p_k f(x_k) + f(\bar{x}) \ge \left| \sum_{k=1}^{n} p_k \left| F(x_k) - F(\bar{x}) \right| - \left| F'_{+}(\bar{x}) \right| \sum_{k=1}^{n} p_k \left| x_k - \bar{x} \right| \right|; \qquad (4.1)$$

(ii) if $F$ is monotone and $x_k \le \bar{x}$ exactly for $k \in I_1$, then

$$\sum_{k=1}^{n} p_k F(x_k) - F(\bar{x}) \ge \left| \, \left| \sum_{k=1}^{n} p_k F(x_k) - 2 \sum_{k \in I_1} p_k F(x_k) - \left( 1 - 2 \sum_{k \in I_1} p_k \right) F(\bar{x}) \right| - 2 \left| F'_{+}(\bar{x}) \right| \left( \bar{x} \sum_{k \in I_1} p_k - \sum_{k \in I_1} p_k x_k \right) \right|. \qquad (4.2)$$
As, for a function $f$ with $f''' \ge 0$ (a $3$-convex function), the function $F(x) = f(2a - x) - f(x)$ is convex on $(0, a)$ (indeed, $F''(x) = f''(2a - x) - f''(x) \ge 0$ for $x < a$, since $f''$ is nondecreasing), by setting $f = F$ in the discrete case of [2, Theorem 2] we get (4.1).

As $F$ is monotone convex, by setting $f = F$ in (3.5) we get (4.2).
Ky Fan Inequality
Let $x_i \in \left( 0, \frac{1}{2} \right]$, $i = 1, \dots, n$, and let $w_i > 0$ with $\sum_{i=1}^{n} w_i = 1$. We denote by $G_n$ and $A_n$ the weighted geometric and arithmetic means of $x_1, \dots, x_n$, respectively, that is,

$$G_n = \prod_{i=1}^{n} x_i^{w_i}, \qquad A_n = \sum_{i=1}^{n} w_i x_i, \qquad (4.3)$$

and also by $G_n'$ and $A_n'$ the geometric and arithmetic means of $1 - x_1, \dots, 1 - x_n$, respectively, that is,

$$G_n' = \prod_{i=1}^{n} (1 - x_i)^{w_i}, \qquad A_n' = \sum_{i=1}^{n} w_i (1 - x_i). \qquad (4.4)$$
The following remarkable inequality, due to Ky Fan, is valid [7, page 5]:

$$\frac{G_n}{G_n'} \le \frac{A_n}{A_n'}, \qquad (4.5)$$

with equality if and only if $x_1 = \cdots = x_n$.
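A hedged numerical check of (4.5), with arbitrary data $x_i \in \left(0, \frac{1}{2}\right]$ and weights $w_i$:

```python
import numpy as np

x = np.array([0.10, 0.30, 0.45])   # x_i in (0, 1/2]
w = np.array([0.2, 0.5, 0.3])      # positive weights, sum to 1

G, A = np.prod(x**w), np.dot(w, x)                # means of the x_i, (4.3)
Gp, Ap = np.prod((1 - x)**w), np.dot(w, 1 - x)    # means of 1 - x_i, (4.4)
assert G / Gp <= A / Ap                           # Ky Fan inequality (4.5)
```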
Inequality (4.5) has evoked the interest of several mathematicians, and in numerous articles new proofs, extensions, refinements, and various related results have been published; see the survey [6].
The following improvement of the Ky Fan inequality is valid.

Let $x_i$, $w_i$, $G_n$, $A_n$, $G_n'$, and $A_n'$ be as defined earlier. Then the following inequalities are valid:

$$\log \frac{A_n \, G_n'}{A_n' \, G_n} \ge \left| \sum_{i=1}^{n} w_i \left| \log \frac{1 - x_i}{x_i} - \log \frac{A_n'}{A_n} \right| - \frac{1}{A_n A_n'} \sum_{i=1}^{n} w_i \left| x_i - A_n \right| \right|, \qquad (4.6)$$

together with the analogous refinement (4.7) obtained from (4.2).
Setting $f(x) = \log x$ and $a = \frac{1}{2}$ in (4.1), we get (4.6).

Consider $f(x) = \log \frac{1-x}{x}$; then $f$ is strictly monotone (decreasing) and convex on the interval $\left( 0, \frac{1}{2} \right]$ and has derivative

$$f'(x) = -\frac{1}{x(1-x)}. \qquad (4.8)$$

Then the application of inequality (4.2) to this function is given by (4.9).
From (4.9) we get (4.7).
5. On Some Inequalities for Csiszár Divergence Measures
Let $(\Omega, \mathcal{A})$ be a measure space satisfying $\left| \mathcal{A} \right| > 2$ and let $\lambda$ be a $\sigma$-finite measure on $\mathcal{A}$ with values in $[0, \infty]$. Let $\mathcal{P}$ be the set of all probability measures on the measurable space $(\Omega, \mathcal{A})$ which are absolutely continuous with respect to $\lambda$. For $P, Q \in \mathcal{P}$, let $p = \frac{dP}{d\lambda}$ and $q = \frac{dQ}{d\lambda}$ denote the Radon-Nikodym derivatives of $P$ and $Q$ with respect to $\lambda$, respectively.
Let $f \colon (0, \infty) \to \mathbb{R}$ be convex. Then

$$D_f(Q, P) = \int_{\Omega} p \, f\!\left( \frac{q}{p} \right) d\lambda \qquad (5.1)$$

is called the $f$-divergence of the probability distributions $Q$ and $P$.
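For discrete distributions (counting measure $\lambda$), (5.1) reduces to a weighted sum, which the following hedged sketch implements; the convention $D_f(Q,P) = \int p \, f(q/p) \, d\lambda$ matches (5.1), and the sample distributions are arbitrary.

```python
import numpy as np

def f_divergence(f, q, p):
    """Discrete f-divergence D_f(q, p) = sum_i p_i * f(q_i / p_i), as in
    (5.1) with counting measure; p and q are strictly positive vectors."""
    q, p = np.asarray(q, float), np.asarray(p, float)
    return float(np.dot(p, f(q / p)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(f_divergence(lambda u: u * np.log(u), q, p))  # Kullback-Leibler distance
```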
We give some important $f$-divergences, which play a significant role in information theory and statistics.
(i) The class of $\chi^{\alpha}$-divergences: the $f$-divergences in this class are generated by the family of functions

$$f_{\alpha}(u) = \left| u - 1 \right|^{\alpha}, \quad \alpha \ge 1. \qquad (5.2)$$

For $\alpha = 1$, it gives the total variation distance

$$V(Q, P) = \int_{\Omega} \left| q - p \right| d\lambda. \qquad (5.3)$$

For $\alpha = 2$, it gives the Karl Pearson $\chi^2$-divergence

$$\chi^2(Q, P) = \int_{\Omega} \frac{(q - p)^2}{p} \, d\lambda. \qquad (5.4)$$
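A self-contained sketch confirming that the generators in (5.2) reproduce the closed forms (5.3) and (5.4) on arbitrary discrete data:

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
Df = lambda f: float(np.dot(p, f(q / p)))        # D_f(q, p) as in (5.1)

tv = Df(lambda u: np.abs(u - 1))                 # alpha = 1 in (5.2)
chi2 = Df(lambda u: (u - 1)**2)                  # alpha = 2 in (5.2)
assert np.isclose(tv, np.sum(np.abs(q - p)))     # total variation (5.3)
assert np.isclose(chi2, np.sum((q - p)**2 / p))  # Pearson chi^2 (5.4)
```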
(ii) The $\alpha$-order Rényi entropy: for $\alpha > 1$, let

$$f_{\alpha}(u) = u^{\alpha}. \qquad (5.5)$$

Then $D_{f_\alpha}$ gives the $\alpha$-order entropy

$$D_{f_\alpha}(Q, P) = \int_{\Omega} q^{\alpha} p^{1 - \alpha} \, d\lambda. \qquad (5.6)$$
(iii) Harmonic distance: let

$$f(u) = \frac{2u}{u + 1}. \qquad (5.7)$$

Then $D_f$ gives the harmonic distance

$$D_f(Q, P) = \int_{\Omega} \frac{2 q p}{q + p} \, d\lambda. \qquad (5.8)$$
(iv) For $f(u) = u \log u$, the $f$-divergence functional gives rise to the Kullback-Leibler distance [10]:

$$KL(Q, P) = \int_{\Omega} q \log \frac{q}{p} \, d\lambda. \qquad (5.9)$$
(v) The Dichotomy class: this class is generated by the family of functions $f_{\alpha} \colon (0, \infty) \to \mathbb{R}$,

$$f_{\alpha}(u) = \begin{cases} u - 1 - \log u, & \alpha = 0, \\[0.5ex] \dfrac{1}{\alpha(1-\alpha)} \left( \alpha u + 1 - \alpha - u^{\alpha} \right), & \alpha \in \mathbb{R} \setminus \{0, 1\}, \\[0.5ex] 1 - u + u \log u, & \alpha = 1. \end{cases} \qquad (5.10)$$

This class gives, for particular values of $\alpha$, some important divergences; for instance, for $\alpha = \frac{1}{2}$ we have the Hellinger distance $2 \int_{\Omega} \left( \sqrt{q} - \sqrt{p} \right)^2 d\lambda$. Some other divergences for this class are given by

$$D_{f_{\alpha}}(q, p) = \int_{\Omega} p \, f_{\alpha}\!\left( \frac{q}{p} \right) d\lambda, \qquad (5.11)$$

where $p$ and $q$ are positive integrable functions with

$$\int_{\Omega} p \, d\lambda = \int_{\Omega} q \, d\lambda. \qquad (5.12)$$
There are various other divergences used in problems of information theory and statistics, such as Arimoto-type divergences, Matusita's divergence, and the Puri-Vincze divergences (cf. [12–14]). An application of Theorem 1.1 is the following result given by Csiszár and Körner (cf. [15]).
Theorem 5.1. Let $f \colon (0, \infty) \to \mathbb{R}$ be convex, and let $p$ and $q$ be positive integrable functions with $\int_{\Omega} p \, d\lambda = \int_{\Omega} q \, d\lambda$. Then the following inequality is valid:

$$\int_{\Omega} p \, f\!\left( \frac{q}{p} \right) d\lambda \ge f(1) \int_{\Omega} p \, d\lambda. \qquad (5.13)$$
By substituting $g = \frac{q}{p}$ and $d\mu = \frac{p \, d\lambda}{\int_{\Omega} p \, d\lambda}$ in Theorem 1.1, we get (5.13).
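A hedged numerical sketch of (5.13), with arbitrary positive vectors of equal total mass standing in for $p$ and $q$:

```python
import numpy as np

p = np.array([1.0, 0.6, 0.4])        # positive, not necessarily normalized
q = np.array([0.8, 0.8, 0.4])        # same total mass as p
assert np.isclose(p.sum(), q.sum())

f = lambda u: u * np.log(u)          # a convex function on (0, oo)
lhs = np.dot(p, f(q / p))            # D_f(q, p)
rhs = f(1.0) * p.sum()               # right-hand side of (5.13)
assert lhs >= rhs
```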
A similar consequence of Theorems 1.2 and 2.1 in information theory, for the divergence measures discussed above, is the following result.
Theorem 5.2. Let $p$ and $q$ be positive integrable functions with $\int_{\Omega} p \, d\lambda = \int_{\Omega} q \, d\lambda$. Define the function

$$\tilde{\Lambda}_t = \begin{cases} \dfrac{1}{t(t-1)} \left( \displaystyle\int_{\Omega} p^{1-t} q^{t} \, d\lambda - \int_{\Omega} p \, d\lambda \right), & t \neq 0, 1, \\[1ex] \displaystyle\int_{\Omega} p \log \frac{p}{q} \, d\lambda, & t = 0, \\[1ex] \displaystyle\int_{\Omega} q \log \frac{q}{p} \, d\lambda, & t = 1, \end{cases} \qquad (5.14)$$

and let $\tilde{\Lambda}_t$ be positive. Then

(i) it holds that

$$\det \left[ \tilde{\Lambda}_{\frac{t_i + t_j}{2}} \right]_{i,j=1}^{k} \ge 0, \quad k = 1, \dots, n, \qquad (5.15)$$

where $\det \left[ a_{ij} \right]_{i,j=1}^{k}$ denotes the determinant of order $k$ with elements $a_{ij} = \tilde{\Lambda}_{\frac{t_i + t_j}{2}}$ and $t_1, \dots, t_n$ are real numbers;

(ii) $t \mapsto \tilde{\Lambda}_t$ is log-convex.
As we said, in [4] we defined new means of the Cauchy type; here we give an application of these means to divergence measures in the following definition.
Definition 5.3. Let $p$ and $q$ be positive integrable functions with $\int_{\Omega} p \, d\lambda = \int_{\Omega} q \, d\lambda$. The mean $M_{t,s}$ is defined as

$$M_{t,s} = \left( \frac{\tilde{\Lambda}_t}{\tilde{\Lambda}_s} \right)^{\frac{1}{t-s}}, \qquad (5.16)$$

where $t, s \in \mathbb{R}$ and $t \neq s$, together with the limiting value

$$M_{t,t} = \lim_{s \to t} M_{t,s}, \qquad (5.17)$$

where $t \in \mathbb{R}$ and $\tilde{\Lambda}_t$ is given by (5.14).
Theorem 5.4. Let $t, s, u, v$ be nonnegative reals such that $t \le u$ and $s \le v$. Then

$$M_{t,s} \le M_{u,v}. \qquad (5.18)$$
By using the log-convexity of $\tilde{\Lambda}_t$, we get the following result for $t \le u$ and $s \le v$ such that $t \neq s$ and $u \neq v$:

$$\left( \frac{\tilde{\Lambda}_t}{\tilde{\Lambda}_s} \right)^{\frac{1}{t-s}} \le \left( \frac{\tilde{\Lambda}_u}{\tilde{\Lambda}_v} \right)^{\frac{1}{u-v}}. \qquad (5.19)$$

Also, for $t = s$ or $u = v$, we consider the limiting case, and the result follows from the continuity of $\tilde{\Lambda}_t$.
An application of Theorem 1.3 to divergence measures is the following result, given in [16].
Theorem 5.5. Let $f$ be a differentiable convex function on $(0, \infty)$ and let $p$ and $q$ be positive integrable functions with $\int_{\Omega} p \, d\lambda = \int_{\Omega} q \, d\lambda$. Then

$$\int_{\Omega} p \, f\!\left( \frac{q}{p} \right) d\lambda - f(1) \int_{\Omega} p \, d\lambda \ge \left| \int_{\Omega} p \left| f\!\left( \frac{q}{p} \right) - f(1) \right| d\lambda - \left| f'(1) \right| \int_{\Omega} \left| q - p \right| d\lambda \right|. \qquad (5.20)$$
By substituting $g = \frac{q}{p}$ and $d\mu = \frac{p \, d\lambda}{\int_{\Omega} p \, d\lambda}$ in Theorem 1.3, we get (5.20).
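A hedged numerical sketch of (5.20), with arbitrary equal-mass data and $f(u) = u \log u$, so that $f(1) = 0$, $f'(1) = 1$, and the divergence is of Kullback-Leibler type:

```python
import numpy as np

p = np.array([1.0, 0.6, 0.4])
q = np.array([0.7, 0.9, 0.4])        # same total mass as p
f, fprime = lambda u: u * np.log(u), lambda u: np.log(u) + 1.0

gap = np.dot(p, f(q / p)) - f(1.0) * p.sum()
rhs = abs(np.dot(p, np.abs(f(q / p) - f(1.0)))
          - abs(fprime(1.0)) * np.abs(q - p).sum())
assert gap >= rhs                    # inequality (5.20)
```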
Theorem 5.6. Let $f$ be a differentiable monotone convex function on $(0, \infty)$, let $p$ and $q$ be positive integrable functions with $\int_{\Omega} p \, d\lambda = \int_{\Omega} q \, d\lambda$, and let

$$\Omega_1 = \left\{ \omega \in \Omega : q(\omega) \le p(\omega) \right\}. \qquad (5.21)$$

Then

$$\int_{\Omega} p \, f\!\left( \frac{q}{p} \right) d\lambda - f(1) \int_{\Omega} p \, d\lambda \ge \left| \, \left| \int_{\Omega} p \, f\!\left( \frac{q}{p} \right) d\lambda - 2 \int_{\Omega_1} p \, f\!\left( \frac{q}{p} \right) d\lambda - \left( \int_{\Omega} p \, d\lambda - 2 \int_{\Omega_1} p \, d\lambda \right) f(1) \right| - 2 \left| f'(1) \right| \int_{\Omega_1} (p - q) \, d\lambda \right|. \qquad (5.22)$$

By substituting $g = \frac{q}{p}$ and $d\mu = \frac{p \, d\lambda}{\int_{\Omega} p \, d\lambda}$ in Theorem 3.1, we get (5.22).
Corollary 5.7. It holds that

$$\int_{\Omega} p \log \frac{p}{q} \, d\lambda \ge \left| \, \left| \int_{\Omega} p \log \frac{p}{q} \, d\lambda - 2 \int_{\Omega_1} p \log \frac{p}{q} \, d\lambda \right| - 2 \int_{\Omega_1} (p - q) \, d\lambda \right|, \qquad (5.23)$$

with $p$, $q$, and $\Omega_1$ as in Theorem 5.6.

The proof follows by setting $f(u) = -\log u$ in Theorem 5.6.
Corollary 5.8. Let $D_{f_{\alpha}}$ be as given in (5.11). Then:

(i) for $\alpha = 0$, one has (5.24);

(ii) for $\alpha \in \mathbb{R} \setminus \{0, 1\}$, one has (5.25);

(iii) for $\alpha = 1$, one has (5.26);

with $p$, $q$, and $\Omega_1$ as in Theorem 5.6.

The proof follows by setting $f$ to be $f_{\alpha}$, as given in (5.10), in Theorem 3.1.
[1] Anwar M, Pečarić J: On logarithmic convexity for differences of power means and related results. Mathematical Inequalities & Applications 2009, 12(1): 81–90.
[2] Hussain S, Pečarić J: An improvement of Jensen's inequality with some applications. Asian-European Journal of Mathematics 2009, 2(1): 85–94. 10.1142/S179355710900008X
[3] Simić S: On logarithmic convexity for differences of power means. Journal of Inequalities and Applications 2007, 2007: 8 pages.
[4] Anwar M, Pečarić J: New means of Cauchy's type. Journal of Inequalities and Applications 2008, 2008: 10 pages.
[5] Dragomir SS, McAndrew A: Refinements of the Hermite-Hadamard inequality for convex functions. Journal of Inequalities in Pure and Applied Mathematics 2005, 6(2), article 140: 6 pages.
[6] Alzer H: The inequality of Ky Fan and related results. Acta Applicandae Mathematicae 1995, 38(3): 305–354. 10.1007/BF00996150
[7] Beckenbach EF, Bellman R: Inequalities, Ergebnisse der Mathematik und ihrer Grenzgebiete, N. F., Volume 30. Springer, Berlin, Germany; 1961: xii+198.
[8] Csiszár I: Information measures: a critical survey. In: Transactions of the 7th Prague Conference on Information Theory, Statistical Decision Functions and the 8th European Meeting of Statisticians. Academia, Prague, Czech Republic; 1978: 73–86.
[9] Pardo MC, Vajda I: On asymptotic properties of information-theoretic divergences. IEEE Transactions on Information Theory 2003, 49(7): 1860–1868. 10.1109/TIT.2003.813509
[10] Kullback S, Leibler RA: On information and sufficiency. Annals of Mathematical Statistics 1951, 22: 79–86. 10.1214/aoms/1177729694
[11] Cressie N, Read TRC: Multinomial goodness-of-fit tests. Journal of the Royal Statistical Society, Series B 1984, 46(3): 440–464.
[12] Kafka P, Österreicher F, Vincze I: On powers of f-divergences defining a distance. Studia Scientiarum Mathematicarum Hungarica 1991, 26(4): 415–422.
[13] Liese F, Vajda I: Convex Statistical Distances, Teubner Texts in Mathematics, Volume 95. BSB B. G. Teubner Verlagsgesellschaft, Leipzig, Germany; 1987: 224.
[14] Österreicher F, Vajda I: A new class of metric divergences on probability spaces and its applicability in statistics. Annals of the Institute of Statistical Mathematics 2003, 55(3): 639–653. 10.1007/BF02517812
[15] Csiszár I, Körner J: Information Theory: Coding Theorems for Discrete Memoryless Systems, Probability and Mathematical Statistics. Academic Press, New York, NY, USA; 1981: xi+452.
[16] Anwar M, Hussain S, Pečarić J: Some inequalities for Csiszár-divergence measures. International Journal of Mathematical Analysis 2009, 3(26): 1295–1304.
This research work is funded by the Higher Education Commission Pakistan. The research of the fourth author is supported by the Croatian Ministry of Science, Education and Sports under the Research Grants 117-1170889-0888.
Cite this article
Adil Khan, M., Anwar, M., Jakšetić, J. et al. On Some Improvements of the Jensen Inequality with Some Applications. J Inequal Appl 2009, 323615 (2009). https://doi.org/10.1155/2009/323615
- Convex Function
- Divergence Measure
- Hellinger Distance
- Renyi Entropy
- Jensen Inequality