
On Some Improvements of the Jensen Inequality with Some Applications

Abstract

An improvement of the Jensen inequality for convex and monotone functions is given, as well as various applications for means. Similar results for related inequalities of the Jensen type are also obtained. Also, some applications of the Cauchy means and the Jensen inequality are discussed.

1. Introduction

The well-known Jensen inequality for convex functions is given as follows.

Theorem 1.1.

If $(\Omega, \mathcal{A}, \mu)$ is a probability space and if $f \in L^{1}(\mu)$ is such that $f(\omega) \in [a,b]$ for all $\omega \in \Omega$, then

$$\varphi\left(\int_\Omega f \, d\mu\right) \le \int_\Omega \varphi(f) \, d\mu \tag{1.1}$$

is valid for any convex function $\varphi \colon [a,b] \to \mathbb{R}$. In the case when $\varphi$ is strictly convex on $[a,b]$, one has equality in (1.1) if and only if $f$ is constant almost everywhere on $\Omega$.

Here and in the whole paper we suppose that all integrals exist. By considering the difference of the two sides of (1.1) in functional form, Anwar and Pečarić proved in [1] an interesting result on log-convexity. We can state this result for integrals as follows.

Theorem 1.2.

Let $(\Omega, \mathcal{A}, \mu)$ be a probability space and let $f$ be such that $f(\omega) > 0$ for all $\omega \in \Omega$. Define

$$\Lambda_t = \begin{cases} \dfrac{1}{t(t-1)} \left[ \displaystyle\int_\Omega f^{t} \, d\mu - \left( \displaystyle\int_\Omega f \, d\mu \right)^{\!t} \right], & t \neq 0, 1, \\[8pt] \log \displaystyle\int_\Omega f \, d\mu - \displaystyle\int_\Omega \log f \, d\mu, & t = 0, \\[8pt] \displaystyle\int_\Omega f \log f \, d\mu - \displaystyle\int_\Omega f \, d\mu \cdot \log \displaystyle\int_\Omega f \, d\mu, & t = 1, \end{cases} \tag{1.2}$$

and let $\Lambda_t$ be positive. Then $t \mapsto \Lambda_t$ is log-convex, that is, for $-\infty < r < s < t < \infty$ the following is valid:

$$\Lambda_s^{\,t-r} \le \Lambda_r^{\,t-s} \Lambda_t^{\,s-r}. \tag{1.3}$$
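As a small worked instance of (1.3), take $r = 2$, $s = 3$, $t = 4$, so that the inequality reduces to $\Lambda_3^{2} \le \Lambda_2 \Lambda_4$; the two-point space below is chosen purely for this illustration.

```latex
% Illustrative check of (1.3) with (r,s,t) = (2,3,4), i.e. \Lambda_3^2 \le \Lambda_2\Lambda_4,
% on the (hypothetical) two-point space \Omega = \{1,2\},
% \mu(\{1\}) = \mu(\{2\}) = 1/2, f(1) = 1, f(2) = 2, so that \int_\Omega f\,d\mu = 3/2:
\Lambda_2 = \tfrac{1}{2}\Big[\tfrac{1+4}{2}-\big(\tfrac{3}{2}\big)^{2}\Big] = \tfrac{1}{8},\qquad
\Lambda_3 = \tfrac{1}{6}\Big[\tfrac{1+8}{2}-\big(\tfrac{3}{2}\big)^{3}\Big] = \tfrac{3}{16},\qquad
\Lambda_4 = \tfrac{1}{12}\Big[\tfrac{1+16}{2}-\big(\tfrac{3}{2}\big)^{4}\Big] = \tfrac{55}{192},
% and indeed \Lambda_3^2 = 9/256 \approx 0.0352 \le \Lambda_2\Lambda_4 = 55/1536 \approx 0.0358.
```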

The following improvement of (1.1) was obtained in [2].

Theorem 1.3.

Let the conditions of Theorem 1.1 be fulfilled. Then

$$\int_\Omega \varphi(f) \, d\mu - \varphi(\bar f) \ge \left| \int_\Omega \big| \varphi(f) - \varphi(\bar f) \big| \, d\mu - \big| \varphi'_{+}(\bar f) \big| \int_\Omega \big| f - \bar f \big| \, d\mu \right|, \tag{1.4}$$

where $\varphi'_{+}$ represents the right-hand derivative of $\varphi$ and

$$\bar f = \int_\Omega f \, d\mu. \tag{1.5}$$

If $\varphi$ is concave, then the left-hand side of (1.4) should be $\varphi(\bar f) - \int_\Omega \varphi(f) \, d\mu$.

In this paper, we give another proof and an extension of Theorem 1.2, as well as improvements of Theorem 1.3 for monotone convex functions, with some applications. We also give applications of the Jensen inequality to divergence measures in information theory and to the related Cauchy means.

2. Another Proof and Extension of Theorem 1.2

In fact, Theorem 1.2, in the discrete case and for exponents different from 0 and 1, was first initiated by Simić in [3].

Moreover, in his proof he has used convex functions defined on $[0,\infty)$ (see [3, Theorem 1]). In his proof, he has used the following function:

$$h(x) = \frac{u^{2}}{r(r-1)}\, x^{r} + \frac{2uw}{s(s-1)}\, x^{s} + \frac{w^{2}}{t(t-1)}\, x^{t}, \qquad s = \frac{r+t}{2}, \tag{2.1}$$

where $u$ and $w$ are real with $r, s, t \neq 0, 1$.

In [1] we have given a correct proof by using an extension of (2.1), so that it is defined on $[0,\infty)$.

Moreover, we can give another proof, so that we use only (2.1), but without using convexity on $[0,\infty)$ as in [3].

Proof of Theorem 1.2.

Consider the function $h$ defined, as in [3], by (2.1).

Now

$$h''(x) = u^{2} x^{r-2} + 2uw\, x^{s-2} + w^{2} x^{t-2} = \left( u\, x^{\frac{r-2}{2}} + w\, x^{\frac{t-2}{2}} \right)^{2} \ge 0, \qquad x > 0, \tag{2.2}$$

that is, $h$ is convex. By using (1.1) we get

$$u^{2} \Lambda_r + 2uw\, \Lambda_s + w^{2} \Lambda_t \ge 0. \tag{2.3}$$

Therefore, (2.3) is valid for all $u, w \in \mathbb{R}$. Now, since the left-hand side of (2.3) is a quadratic form in $u$ and $w$, by the nonnegativity of it, one has

$$\Lambda_s^{2} \le \Lambda_r \Lambda_t, \qquad s = \frac{r+t}{2}. \tag{2.4}$$

Since we have $\lim_{t \to 0} \Lambda_t = \Lambda_0$ and $\lim_{t \to 1} \Lambda_t = \Lambda_1$, we also have that (2.4) is valid for $r, t \in \mathbb{R}$. So $t \mapsto \Lambda_t$ is a log-convex function in the Jensen sense on $\mathbb{R}$.

Moreover, continuity of $t \mapsto \Lambda_t$ implies log-convexity, that is, the following is valid for $-\infty < r < s < t < \infty$:

$$\Lambda_s^{\,t-r} \le \Lambda_r^{\,t-s} \Lambda_t^{\,s-r}. \tag{2.5}$$

Let us note that this result was used in [4] to obtain the corresponding Cauchy means. Moreover, we can extend the above result.

Theorem 2.1.

Let the conditions of Theorem 1.2 be fulfilled and let $r_1, \dots, r_n$ be real numbers. Then

$$\det \left[ \Lambda_{\frac{r_i + r_j}{2}} \right]_{i,j=1}^{k} \ge 0, \qquad k = 1, \dots, n, \tag{2.6}$$

where $\det [\Lambda_{\frac{r_i + r_j}{2}}]_{i,j=1}^{k}$ denotes the determinant of order $k$ with elements $\Lambda_{\frac{r_i + r_j}{2}}$, $i, j = 1, \dots, k$.

Proof.

Consider the function

$$h(x) = \sum_{i,j=1}^{n} \frac{u_i u_j}{p_{ij}(p_{ij} - 1)}\, x^{p_{ij}} \tag{2.7}$$

for $u_i \in \mathbb{R}$, $p_{ij} = \frac{r_i + r_j}{2}$, and $x > 0$.

So, it holds that

$$h''(x) = \sum_{i,j=1}^{n} u_i u_j\, x^{p_{ij} - 2} = \left( \sum_{i=1}^{n} u_i\, x^{\frac{r_i - 2}{2}} \right)^{2} \ge 0. \tag{2.8}$$

So $h$ is a convex function, and as a consequence of (1.1), one has

$$\sum_{i,j=1}^{n} u_i u_j \Lambda_{\frac{r_i + r_j}{2}} \ge 0. \tag{2.9}$$

Therefore, the matrix $\big[ \Lambda_{\frac{r_i + r_j}{2}} \big]_{i,j=1}^{n}$ is positive semidefinite, and (2.6) is valid for $p_{ij} \neq 0, 1$. Moreover, since we have continuity of $t \mapsto \Lambda_t$ for all $t \in \mathbb{R}$, (2.6) is valid for all real $r_1, \dots, r_n$.

Remark 2.2.

In Theorem 2.1, if we set $n = 2$, we get Theorem 1.2.
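In coordinates, the case $n = 2$, $r_1 = r$, $r_2 = t$ of (2.6) reads as follows (a direct expansion of the determinant):

```latex
% n = 2 in (2.6): the determinant of order 2 must be nonnegative,
\det\begin{pmatrix}
\Lambda_{r} & \Lambda_{\frac{r+t}{2}}\\[2pt]
\Lambda_{\frac{r+t}{2}} & \Lambda_{t}
\end{pmatrix}
= \Lambda_{r}\Lambda_{t} - \Lambda_{\frac{r+t}{2}}^{2} \;\ge\; 0,
% which is exactly (2.4), i.e., log-convexity of t -> \Lambda_t in the
% Jensen sense; together with continuity this yields (1.3), as in the
% proof of Theorem 1.2 above.
```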

3. Improvements of the Jensen Inequality for Monotone Convex Function

In this section and in the following section, we denote $\bar f = \int_\Omega f \, d\mu$ and by $\varphi'_{+}$ the right-hand derivative of $\varphi$, as in Theorem 1.3.

Theorem 3.1.

If $(\Omega, \mathcal{A}, \mu)$ is a probability space, if $f \in L^{1}(\mu)$ is such that $f(\omega) \in [a,b]$ for $\omega \in \Omega$, and if $f(\omega) \le \bar f$ for $\omega \in \Omega_1$ and $f(\omega) > \bar f$ for $\omega \in \Omega \setminus \Omega_1$ ($\Omega_1$ is measurable, i.e., $\Omega_1 \in \mathcal{A}$), then

$$\int_\Omega \varphi(f) \, d\mu - \varphi(\bar f) \ge \left| \big| K(f, \varphi) \big| - 2 \big| \varphi'_{+}(\bar f) \big| \left( \bar f \mu(\Omega_1) - \int_{\Omega_1} f \, d\mu \right) \right|, \tag{3.1}$$

where

$$K(f, \varphi) = \int_\Omega \varphi(f) \, d\mu - \varphi(\bar f) + 2 \varphi(\bar f) \mu(\Omega_1) - 2 \int_{\Omega_1} \varphi(f) \, d\mu, \tag{3.2}$$

for any monotone convex function $\varphi \colon [a,b] \to \mathbb{R}$. If $\varphi$ is monotone concave, then the left-hand side of (3.1) should be $\varphi(\bar f) - \int_\Omega \varphi(f) \, d\mu$.

Proof.

Consider the case when $\varphi$ is nondecreasing on $[a,b]$. Then

$$\int_\Omega \big| \varphi(f) - \varphi(\bar f) \big| \, d\mu = \int_\Omega \varphi(f) \, d\mu - \varphi(\bar f) + 2 \varphi(\bar f) \mu(\Omega_1) - 2 \int_{\Omega_1} \varphi(f) \, d\mu = K(f, \varphi). \tag{3.3}$$

Similarly,

$$\int_\Omega \big| f - \bar f \big| \, d\mu = 2 \left( \bar f \mu(\Omega_1) - \int_{\Omega_1} f \, d\mu \right). \tag{3.4}$$

Now from (1.4), (3.3), and (3.4) we get (3.1).

The case when $\varphi$ is nonincreasing can be treated in a similar way.

Of course a discrete inequality is a simple consequence of Theorem 3.1.

Theorem 3.2.

Let $\varphi \colon [a,b] \to \mathbb{R}$ be a monotone convex function, $x_1, \dots, x_n \in [a,b]$, and let $p_1, \dots, p_n > 0$ with $\sum_{i=1}^{n} p_i = 1$ and $\bar x = \sum_{i=1}^{n} p_i x_i$. If $x_i \le \bar x$ for $i \in I_1 \subseteq \{1, \dots, n\}$ and $x_i > \bar x$ for $i \notin I_1$, then

$$\sum_{i=1}^{n} p_i \varphi(x_i) - \varphi(\bar x) \ge \left| \Big| \sum_{i=1}^{n} p_i \varphi(x_i) - \varphi(\bar x) + 2 \varphi(\bar x) \sum_{i \in I_1} p_i - 2 \sum_{i \in I_1} p_i \varphi(x_i) \Big| - 2 \big| \varphi'_{+}(\bar x) \big| \Big( \bar x \sum_{i \in I_1} p_i - \sum_{i \in I_1} p_i x_i \Big) \right|. \tag{3.5}$$

If $\varphi$ is monotone concave, then the left-hand side of (3.5) should be

$$\varphi(\bar x) - \sum_{i=1}^{n} p_i \varphi(x_i). \tag{3.6}$$

The following improvement of the Hermite-Hadamard inequality is valid [5].

Corollary 3.3.

Let $\varphi \colon [a,b] \to \mathbb{R}$ be differentiable convex. Then

(i) the inequality

$$\frac{1}{b-a} \int_a^b \varphi(x) \, dx - \varphi\Big( \frac{a+b}{2} \Big) \ge \left| \frac{1}{b-a} \int_a^b \Big| \varphi(x) - \varphi\Big( \frac{a+b}{2} \Big) \Big| \, dx - \Big| \varphi'\Big( \frac{a+b}{2} \Big) \Big| \, \frac{b-a}{4} \right| \tag{3.7}$$

holds. If $\varphi$ is differentiable concave, then the left-hand side of (3.7) should be $\varphi\big( \frac{a+b}{2} \big) - \frac{1}{b-a} \int_a^b \varphi(x) \, dx$;

(ii) if $\varphi$ is monotone, then the inequality

$$\frac{1}{b-a} \int_a^b \varphi(x) \, dx - \varphi\Big( \frac{a+b}{2} \Big) \ge \left| \frac{1}{b-a} \left| \int_{\frac{a+b}{2}}^{b} \varphi(x) \, dx - \int_{a}^{\frac{a+b}{2}} \varphi(x) \, dx \right| - \Big| \varphi'\Big( \frac{a+b}{2} \Big) \Big| \, \frac{b-a}{4} \right| \tag{3.8}$$

holds. If $\varphi$ is differentiable and monotone concave, then the left-hand side of (3.8) should be $\varphi\big( \frac{a+b}{2} \big) - \frac{1}{b-a} \int_a^b \varphi(x) \, dx$.

Proof.

  1. (i)

    Setting $\Omega = [a,b]$, $d\mu = \frac{dx}{b-a}$, and $f(x) = x$ in (1.4), we get (3.7).

  2. (ii)

    Setting $\Omega = [a,b]$, $d\mu = \frac{dx}{b-a}$, $f(x) = x$, and $\Omega_1 = \big[ a, \frac{a+b}{2} \big]$ in (3.1), we get (3.8).
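A quick numerical check of (3.7), with $\varphi(x) = e^{x}$ on $[0,1]$ chosen here only for illustration:

```latex
% Check of (3.7) for \varphi(x) = e^x, [a,b] = [0,1], so (a+b)/2 = 1/2:
% left-hand side:
\int_0^1 e^{x}\,dx - e^{1/2} = (e - 1) - e^{1/2} \approx 0.0696,
% the two terms inside the absolute value on the right-hand side:
\int_0^1 \big|e^{x} - e^{1/2}\big|\,dx = 1 + e - 2e^{1/2} \approx 0.4208,
\qquad e^{1/2}\cdot\tfrac{1-0}{4} \approx 0.4122,
% so (3.7) reads 0.0696 \ge |0.4208 - 0.4122| = 0.0086, a strict refinement
% of the Hermite-Hadamard midpoint inequality for this \varphi.
```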

4. Improvements of the Levinson Inequality

Theorem 4.1.

If the third derivative of $f$ exists and is nonnegative on $(0, 2a)$, then for $x_1, \dots, x_n \in (0, a)$ and $p_1, \dots, p_n > 0$ with $\sum_{i=1}^{n} p_i = 1$, one has

  1. (i)
    (4.1)

(ii) if, in addition, $F(x) = f(2a - x) - f(x)$ is monotone and $x_i \le \bar x = \sum_{i=1}^{n} p_i x_i$ for $i \in I_1$, then

(4.2)

Proof.

  1. (i)

    As for a 3-convex function $f$ the function $F(x) = f(2a - x) - f(x)$ is convex on $(0, a)$, so by setting $\varphi = F$ in the discrete case of [2, Theorem 2], we get (4.1).

  2. (ii)

    As $F$ is monotone convex, so by setting $\varphi = F$ in (3.5), we get (4.2).

Ky Fan Inequality

Let $x_1, \dots, x_n \in (0, \frac{1}{2}]$ and $p_1, \dots, p_n > 0$ be such that $\sum_{i=1}^{n} p_i = 1$. We denote by $G_n$ and $A_n$ the weighted geometric and arithmetic means of $x_1, \dots, x_n$, respectively, that is,

$$G_n = \prod_{i=1}^{n} x_i^{p_i}, \qquad A_n = \sum_{i=1}^{n} p_i x_i, \tag{4.3}$$

and also by $G_n'$ and $A_n'$ the geometric and arithmetic means of $1 - x_1, \dots, 1 - x_n$, respectively, that is,

$$G_n' = \prod_{i=1}^{n} (1 - x_i)^{p_i}, \qquad A_n' = \sum_{i=1}^{n} p_i (1 - x_i). \tag{4.4}$$

The following remarkable inequality, due to Ky Fan, is valid [6, page 5]:

$$\frac{G_n}{G_n'} \le \frac{A_n}{A_n'}, \tag{4.5}$$

with equality sign if and only if $x_1 = \cdots = x_n$.
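For instance (a numerical illustration, with the values chosen here only for this check): take $n = 2$, $x_1 = 0.2$, $x_2 = 0.4$, and $p_1 = p_2 = \frac{1}{2}$.

```latex
% Ky Fan inequality (4.5) for x = (0.2, 0.4), equal weights:
G_2 = \sqrt{0.2 \cdot 0.4} \approx 0.2828, \qquad A_2 = 0.3,
\qquad
G_2' = \sqrt{0.8 \cdot 0.6} \approx 0.6928, \qquad A_2' = 0.7,
% so that
\frac{G_2}{G_2'} \approx 0.4082 \;\le\; \frac{A_2}{A_2'} \approx 0.4286,
% with strict inequality since x_1 \ne x_2.
```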

Inequality (4.5) has evoked the interest of several mathematicians, and in numerous articles new proofs, extensions, refinements, and various related results have been published [7].

The following improvement of Ky Fan inequality is valid [2].

Corollary 4.2.

Let $x_i$, $p_i$, $G_n$, $A_n$, $G_n'$, and $A_n'$ be as defined earlier. Then the following inequalities are valid:

  1. (i)
    (4.6)
  2. (ii)
    (4.7)

Proof.

  1. (i)

    Setting $f(t) = \log t$ and $a = \frac{1}{2}$ in (4.1), we get (4.6).

  2. (ii)

    Consider $F(t) = \log \frac{1-t}{t}$ for $t \in (0, \frac{1}{2}]$; then $F$ is strictly monotone convex on the interval $(0, \frac{1}{2}]$ and has derivative

    $$F'(t) = -\frac{1}{t(1-t)}. \tag{4.8}$$

Then the application of inequality (4.2) to this function is given by

(4.9)

From (4.9) we get (4.7).

5. On Some Inequalities for Csiszár Divergence Measures

Let $(\Omega, \mathcal{A})$ be a measurable space satisfying $|\mathcal{A}| > 2$ and let $\mu$ be a $\sigma$-finite measure on $\mathcal{A}$ with values in $[0, \infty]$. Let $\mathcal{P}$ be the set of all probability measures on the measurable space $(\Omega, \mathcal{A})$ which are absolutely continuous with respect to $\mu$. For $P, Q \in \mathcal{P}$, let $p = \frac{dP}{d\mu}$ and $q = \frac{dQ}{d\mu}$ denote the Radon-Nikodym derivatives of $P$ and $Q$ with respect to $\mu$, respectively.

Csiszár introduced the concept of $f$-divergence for a convex function $f \colon [0, \infty) \to \mathbb{R}$ that is continuous at $0$ as follows (cf. [8]; see also [9]).

Definition 5.1.

Let $P, Q \in \mathcal{P}$. Then

$$D_f(Q, P) = \int_\Omega p(\omega)\, f\!\left( \frac{q(\omega)}{p(\omega)} \right) d\mu(\omega) \tag{5.1}$$

is called the $f$-divergence of the probability distributions $Q$ and $P$.

We give some important $f$-divergences, which play a significant role in information theory and statistics.

  1. (i)

    The class of $\chi^{\alpha}$-divergences: the $f$-divergences in this class are generated by the family of functions

    $$f_\alpha(u) = |u - 1|^{\alpha}, \qquad \alpha \ge 1. \tag{5.2}$$

For $\alpha = 1$, it gives the total variation distance:

$$V(Q, P) = \int_\Omega |q - p| \, d\mu. \tag{5.3}$$

For $\alpha = 2$, it gives the Karl Pearson $\chi^{2}$-divergence:

$$\chi^{2}(Q, P) = \int_\Omega \frac{(q - p)^{2}}{p} \, d\mu. \tag{5.4}$$
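Both formulas follow directly from Definition 5.1; the computation below simply unpacks (5.1) for $f_1$ and $f_2$ from (5.2):

```latex
% \alpha = 1 and \alpha = 2 in (5.2), inserted into (5.1):
D_{f_1}(Q,P) = \int_\Omega p \left| \frac{q}{p} - 1 \right| d\mu = \int_\Omega |q - p| \, d\mu,
\qquad
D_{f_2}(Q,P) = \int_\Omega p \left( \frac{q}{p} - 1 \right)^{2} d\mu
             = \int_\Omega \frac{(q - p)^{2}}{p} \, d\mu .
```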

(ii) The $\alpha$-order Rényi entropy: for $\alpha > 1$, let

$$f_\alpha(u) = u^{\alpha}. \tag{5.5}$$

Then $D_{f_\alpha}$ gives the $\alpha$-order entropy

$$R_\alpha(Q, P) = \int_\Omega q^{\alpha} p^{1 - \alpha} \, d\mu. \tag{5.6}$$
  1. (iii)

    Harmonic distance: let

    $$f(u) = \frac{2u}{u + 1}. \tag{5.7}$$

Then $D_f$ gives the harmonic distance

$$D_H(Q, P) = \int_\Omega \frac{2pq}{p + q} \, d\mu. \tag{5.8}$$
  1. (iv)

    Kullback-Leibler: let

    $$f(u) = u \log u. \tag{5.9}$$

Then the $f$-divergence functional gives rise to the Kullback-Leibler distance [10]:

$$KL(Q, P) = \int_\Omega q \log \frac{q}{p} \, d\mu. \tag{5.10}$$
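Again this is just Definition 5.1 written out, a one-line check of (5.10) from (5.9):

```latex
% f(u) = u log u in (5.1):
D_f(Q,P) = \int_\Omega p \cdot \frac{q}{p} \log \frac{q}{p} \, d\mu
         = \int_\Omega q \log \frac{q}{p} \, d\mu = KL(Q,P).
```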

A one-parametric generalization of the Kullback-Leibler relative information [10] was studied in a different way by Cressie and Read [11].

  1. (v)

    The dichotomy class: this class is generated by the family of functions $f_\alpha \colon (0, \infty) \to \mathbb{R}$,

$$f_\alpha(u) = \begin{cases} u - 1 - \log u, & \alpha = 0, \\[4pt] \dfrac{1}{\alpha(1 - \alpha)} \big[ \alpha u + 1 - \alpha - u^{\alpha} \big], & \alpha \in \mathbb{R} \setminus \{0, 1\}, \\[4pt] 1 - u + u \log u, & \alpha = 1. \end{cases} \tag{5.11}$$

This class gives, for particular values of $\alpha$, some important divergences. For instance, for $\alpha = \frac{1}{2}$ we have the Hellinger distance, and some other divergences for this class are given by

$$D_{f_\alpha}(Q, P) = \frac{1}{\alpha(1 - \alpha)} \int_\Omega \big[ \alpha q + (1 - \alpha) p - q^{\alpha} p^{1 - \alpha} \big] \, d\mu, \qquad \alpha \neq 0, 1, \tag{5.12}$$

where $p$ and $q$ are positive integrable functions with $\int_\Omega p \, d\mu = \int_\Omega q \, d\mu = 1$.
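In particular, the case $\alpha = \frac{1}{2}$ gives (twice) the squared Hellinger distance; the computation below unpacks (5.11) at $\alpha = \frac{1}{2}$:

```latex
% alpha = 1/2 in (5.11):
f_{1/2}(u) = \frac{1}{\tfrac12 \cdot \tfrac12}\Big[ \tfrac{u}{2} + \tfrac{1}{2} - \sqrt{u} \Big]
           = 2\big(\sqrt{u} - 1\big)^{2},
% so by (5.1)
D_{f_{1/2}}(Q,P) = 2 \int_\Omega p \Big( \sqrt{\tfrac{q}{p}} - 1 \Big)^{2} d\mu
                 = 2 \int_\Omega \big( \sqrt{q} - \sqrt{p} \big)^{2} d\mu .
```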

There are various other divergences used in problems in information theory and statistics, such as the Arimoto-type divergences, Matusita's divergence, and the Puri-Vincze divergences (cf. [12–14]). An application of Theorem 1.1 is the following result given by Csiszár and Körner (cf. [15]).

Theorem 5.2.

Let $f$ be convex, and let $p$ and $q$ be positive integrable functions. Then the following inequality is valid:

$$\int_\Omega p\, f\!\left( \frac{q}{p} \right) d\mu \ge P f\!\left( \frac{Q}{P} \right), \tag{5.13}$$

where $P = \int_\Omega p \, d\mu$ and $Q = \int_\Omega q \, d\mu$.

Proof.

By substituting $d\nu = \frac{p}{P} \, d\mu$ and $g = \frac{q}{p}$ in Theorem 1.1, we get (5.13).
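Written out, the substitution works as follows ($\nu$ is the auxiliary probability measure introduced in the proof above):

```latex
% d\nu = (p/P) d\mu is a probability measure, since \nu(\Omega) = P/P = 1;
% Jensen's inequality (1.1) applied to g = q/p gives
f\Big( \int_\Omega g \, d\nu \Big) \le \int_\Omega f(g) \, d\nu
\;\Longleftrightarrow\;
f\Big( \frac{Q}{P} \Big) \le \frac{1}{P} \int_\Omega p\, f\Big( \frac{q}{p} \Big) d\mu,
% and multiplying both sides by P > 0 yields (5.13).
```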

A similar consequence of Theorems 1.2 and 2.1 in information theory, for the divergence measures discussed above, is the following result.

Theorem 5.3.

Let $p$ and $q$ be positive integrable functions with $P = \int_\Omega p \, d\mu$ and $Q = \int_\Omega q \, d\mu$. Define the function

$$\Gamma_t = \frac{1}{t(t-1)} \left[ \int_\Omega q^{t} p^{1-t} \, d\mu - Q^{t} P^{1-t} \right], \qquad t \neq 0, 1 \tag{5.14}$$

(with the limiting values taken at $t = 0$ and $t = 1$), and let $\Gamma_t$ be positive. Then

(i) it holds that

$$\det \left[ \Gamma_{\frac{r_i + r_j}{2}} \right]_{i,j=1}^{k} \ge 0, \qquad k = 1, \dots, n, \tag{5.15}$$

where $\det [\Gamma_{\frac{r_i + r_j}{2}}]_{i,j=1}^{k}$ denotes the determinant of order $k$ with elements $\Gamma_{\frac{r_i + r_j}{2}}$ for real numbers $r_1, \dots, r_n$;

(ii) $t \mapsto \Gamma_t$ is log-convex.

As we said, in [4] we defined new means of the Cauchy type; here we give an application of these means to divergence measures in the following definition.

Definition 5.4.

Let $p$ and $q$ be positive integrable functions with $P = \int_\Omega p \, d\mu$ and $Q = \int_\Omega q \, d\mu$. The mean $M_{t,s}$ is defined as

$$M_{t,s} = \left( \frac{\Gamma_t}{\Gamma_s} \right)^{\frac{1}{t-s}}, \tag{5.16}$$

where $t, s \in \mathbb{R}$ and $t \neq s$,

$$M_{t,t} = \exp\left( \frac{\frac{d}{dt} \Gamma_t}{\Gamma_t} \right), \tag{5.17}$$

where $t = s$, with $\Gamma_t$ as in Theorem 5.3.

Theorem 5.5.

Let $t, s, u, v$ be nonnegative reals such that $t \le u$ and $s \le v$. Then

$$M_{t,s} \le M_{u,v}. \tag{5.18}$$

Proof.

By using the log-convexity of $t \mapsto \Gamma_t$, we get the following result for $t, s, u, v$ such that $t \neq s$ and $u \neq v$:

$$\left( \frac{\Gamma_t}{\Gamma_s} \right)^{\frac{1}{t-s}} \le \left( \frac{\Gamma_u}{\Gamma_v} \right)^{\frac{1}{u-v}}. \tag{5.19}$$

Also, for $t = s$ or $u = v$ we consider the limiting case, and the result follows from the continuity of $\Gamma_t$.

An application of Theorem 1.3 to divergence measures is the following result, given in [16].

Theorem 5.6.

Let $f$ be a differentiable convex function on $(0, \infty)$. Then

$$\int_\Omega p\, f\!\left( \frac{q}{p} \right) d\mu - P f\!\left( \frac{Q}{P} \right) \ge \left| \int_\Omega p \left| f\!\left( \frac{q}{p} \right) - f\!\left( \frac{Q}{P} \right) \right| d\mu - \left| f'\!\left( \frac{Q}{P} \right) \right| \int_\Omega \left| q - \frac{Q}{P}\, p \right| d\mu \right|, \tag{5.20}$$

where

$$P = \int_\Omega p \, d\mu, \qquad Q = \int_\Omega q \, d\mu. \tag{5.21}$$

Proof.

By substituting $d\nu = \frac{p}{P} \, d\mu$ and $g = \frac{q}{p}$ in Theorem 1.3, we get (5.20).

Theorem 5.7.

Let $f$ be a differentiable monotone convex function on $(0, \infty)$, and let $q(\omega) \le \frac{Q}{P}\, p(\omega)$ for $\omega \in \Omega_1$ ($\Omega_1 \in \mathcal{A}$). Then

$$\int_\Omega p\, f\!\left( \frac{q}{p} \right) d\mu - P f\!\left( \frac{Q}{P} \right) \ge \left| \big| K(p, q, f) \big| - 2 \left| f'\!\left( \frac{Q}{P} \right) \right| \left( \frac{Q}{P} \int_{\Omega_1} p \, d\mu - \int_{\Omega_1} q \, d\mu \right) \right|, \tag{5.22}$$

where

$$K(p, q, f) = \int_\Omega p\, f\!\left( \frac{q}{p} \right) d\mu - P f\!\left( \frac{Q}{P} \right) + 2 f\!\left( \frac{Q}{P} \right) \int_{\Omega_1} p \, d\mu - 2 \int_{\Omega_1} p\, f\!\left( \frac{q}{p} \right) d\mu \tag{5.23}$$

and $P$ and $Q$ are as in Theorem 5.6.

Proof.

By substituting $d\nu = \frac{p}{P} \, d\mu$ and $g = \frac{q}{p}$ in Theorem 3.1, we get (5.22).

Corollary 5.8.

It holds that

(5.24)

where

(5.25)

and $P$, $Q$, and $\Omega_1$ are as in Theorem 5.7.

Proof.

The proof follows by setting $f(u) = u \log u$ in Theorem 5.7.

Corollary 5.9.

Let $f_\alpha$ be as given in (5.11). Then

(i) for $\alpha = 0$ one has

(5.26)

(ii) for $\alpha \in \mathbb{R} \setminus \{0, 1\}$ one has

(5.27)

(iii) for $\alpha = 1$ one has

(5.28)

where

(5.29)

and $P$, $Q$, and $\Omega_1$ are as in Theorem 5.7.

Proof.

The proof follows by setting $f$ to be $f_\alpha$, as given in (5.11), in Theorem 3.1.

References

  1. Anwar M, Pečarić J: On logarithmic convexity for differences of power means and related results. Mathematical Inequalities & Applications 2009, 12(1): 81–90.

  2. Hussain S, Pečarić J: An improvement of Jensen's inequality with some applications. Asian-European Journal of Mathematics 2009, 2(1): 85–94. doi:10.1142/S179355710900008X

  3. Simić S: On logarithmic convexity for differences of power means. Journal of Inequalities and Applications 2007, 2007: 8 pages.

  4. Anwar M, Pečarić J: New means of Cauchy's type. Journal of Inequalities and Applications 2008, 2008: 10 pages.

  5. Dragomir SS, McAndrew A: Refinements of the Hermite-Hadamard inequality for convex functions. Journal of Inequalities in Pure and Applied Mathematics 2005, 6(2), article 140: 6 pages.

  6. Alzer H: The inequality of Ky Fan and related results. Acta Applicandae Mathematicae 1995, 38(3): 305–354. doi:10.1007/BF00996150

  7. Beckenbach EF, Bellman R: Inequalities, Ergebnisse der Mathematik und ihrer Grenzgebiete, N. F., Volume 30. Springer, Berlin, Germany; 1961: xii+198.

  8. Csiszár I: Information measures: a critical survey. In: Transactions of the 7th Prague Conference on Information Theory, Statistical Decision Functions and the 8th European Meeting of Statisticians. Academia, Prague, Czech Republic; 1978: 73–86.

  9. Pardo MC, Vajda I: On asymptotic properties of information-theoretic divergences. IEEE Transactions on Information Theory 2003, 49(7): 1860–1868. doi:10.1109/TIT.2003.813509

  10. Kullback S, Leibler RA: On information and sufficiency. Annals of Mathematical Statistics 1951, 22: 79–86. doi:10.1214/aoms/1177729694

  11. Cressie N, Read TRC: Multinomial goodness-of-fit tests. Journal of the Royal Statistical Society, Series B 1984, 46(3): 440–464.

  12. Kafka P, Österreicher F, Vincze I: On powers of f-divergences defining a distance. Studia Scientiarum Mathematicarum Hungarica 1991, 26(4): 415–422.

  13. Liese F, Vajda I: Convex Statistical Distances, Teubner Texts in Mathematics, Volume 95. BSB B. G. Teubner Verlagsgesellschaft, Leipzig, Germany; 1987: 224.

  14. Österreicher F, Vajda I: A new class of metric divergences on probability spaces and its applicability in statistics. Annals of the Institute of Statistical Mathematics 2003, 55(3): 639–653. doi:10.1007/BF02517812

  15. Csiszár I, Körner J: Information Theory: Coding Theorems for Discrete Memoryless Systems, Probability and Mathematical Statistics. Academic Press, New York, NY, USA; 1981: xi+452.

  16. Anwar M, Hussain S, Pečarić J: Some inequalities for Csiszár-divergence measures. International Journal of Mathematical Analysis 2009, 3(26): 1295–1304.


Acknowledgments

This research work is funded by the Higher Education Commission Pakistan. The research of the fourth author is supported by the Croatian Ministry of Science, Education and Sports under the Research Grants 117-1170889-0888.

Author information

Corresponding author

Correspondence to M. Adil Khan.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Adil Khan, M., Anwar, M., Jakšetić, J. et al. On Some Improvements of the Jensen Inequality with Some Applications. J Inequal Appl 2009, 323615 (2009). https://doi.org/10.1155/2009/323615
