Some inequalities on generalized entropies
Journal of Inequalities and Applications volume 2012, Article number: 226 (2012)
Abstract
We give several inequalities on generalized entropies involving Tsallis entropies, using some inequalities obtained by the improvements of Young’s inequality. We also give a generalized Han’s inequality.
MSC: 26D15, 94A17.
1 Introduction
Generalized entropies have been studied by many researchers (we refer the interested reader to [1, 2]). The Rényi [3] and Tsallis [4] entropies are well known as one-parameter generalizations of Shannon's entropy, and have been intensively studied not only in classical statistical physics [5–7], but also in quantum physics in relation to entanglement [8–11]. The Tsallis entropy is a natural one-parameter extension of the Shannon entropy, hence it can be applied to known models which describe systems of great interest in atomic physics [12]. However, to the best of our knowledge, the physical relevance of the parameter of the Tsallis entropy has been highly debated and has not been completely clarified yet; the parameter is considered as a measure of the non-extensivity of the system under consideration. One of the authors of the present paper has studied the Tsallis entropy and the Tsallis relative entropy from the mathematical point of view. Firstly, fundamental properties of the Tsallis relative entropy were discussed in [13]. The uniqueness theorem for the Tsallis entropy and the Tsallis relative entropy was studied in [14]. Following this result, an axiomatic characterization of a two-parameter extended relative entropy was given in [15]. In [16], information-theoretical properties of the Tsallis entropy and some inequalities for conditional and joint Tsallis entropies were derived. These entropies are used again in the present paper to derive a generalized Han's inequality. In [17], matrix trace inequalities for the Tsallis entropy were studied, and in [18], the maximum entropy principle for the Tsallis entropy and the minimization of the Fisher information in Tsallis statistics were studied. Quite recently, we provided mathematical inequalities for some divergences in [19], considering that the study of such inequalities is important for the development of new entropies.
In this paper, we define a further generalized entropy based on the Tsallis and Rényi entropies and study its mathematical properties by the use of scalar inequalities, in order to develop the theory of entropies.
We start from the weighted quasilinear mean for some continuous and strictly monotonic function $\psi$, defined by

$$M_\psi(x_1, x_2, \ldots, x_n) := \psi^{-1}\left(\sum_{j=1}^n w_j \psi(x_j)\right),$$

where $w_j > 0$ with $\sum_{j=1}^n w_j = 1$ and $x_j > 0$ for $j = 1, \ldots, n$. If we take $\psi(x) = x$, then $M_\psi$ coincides with the weighted arithmetic mean $\sum_{j=1}^n w_j x_j$. If we take $\psi(x) = \log x$, then $M_\psi$ coincides with the weighted geometric mean $\prod_{j=1}^n x_j^{w_j}$.
If $\psi(x) = x$, $w_j = p_j$ and $x_j = \ln_q \frac{1}{p_j}$, then $M_\psi$ is equal to the Tsallis entropy [4]:

$$H_q(X) := \sum_{j=1}^n p_j \ln_q \frac{1}{p_j} = -\sum_{j=1}^n p_j^q \ln_q p_j \quad (q \neq 1),$$

where $\{p_1, \ldots, p_n\}$ is a probability distribution with $p_j > 0$ for all $j = 1, \ldots, n$, and the q-logarithmic function for $x > 0$ is defined by

$$\ln_q x := \frac{x^{1-q} - 1}{1-q} \quad (q \neq 1),$$

which uniformly converges to the usual logarithmic function in the limit $q \to 1$. Therefore, the Tsallis entropy converges to the Shannon entropy in the limit $q \to 1$:

$$\lim_{q \to 1} H_q(X) = H_1(X) := -\sum_{j=1}^n p_j \log p_j.$$
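As a quick numerical illustration (a minimal Python sketch; the function names are ours, not from the paper), the q-logarithm and the Tsallis entropy can be coded directly from the definitions above, and letting $q \to 1$ recovers the Shannon entropy:

```python
import math

def ln_q(x, q):
    """q-logarithm: ln_q(x) = (x^(1-q) - 1) / (1-q); reduces to log(x) as q -> 1."""
    if q == 1:
        return math.log(x)
    return (x**(1 - q) - 1) / (1 - q)

def tsallis_entropy(p, q):
    """Tsallis entropy H_q(X) = sum_j p_j * ln_q(1/p_j)."""
    return sum(pj * ln_q(1 / pj, q) for pj in p)

def shannon_entropy(p):
    return -sum(pj * math.log(pj) for pj in p)

p = [0.5, 0.3, 0.2]
# As q -> 1 the Tsallis entropy approaches the Shannon entropy.
print(abs(tsallis_entropy(p, 1.0001) - shannon_entropy(p)) < 1e-3)  # True
```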
Thus, we find that the Tsallis entropy is one of the generalizations of the Shannon entropy. It is known that the Rényi entropy [3] is also a generalization of the Shannon entropy. Here, we review the quasilinear entropy [1] as another generalization of the Shannon entropy. For a continuous and strictly monotonic function φ on $(0, 1]$, the quasilinear entropy is given by

$$I_\phi(X) := -\log \phi^{-1}\left(\sum_{j=1}^n p_j \phi(p_j)\right). \quad (4)$$

If we take $\phi(x) = \log x$ in (4), then we have $I_{\log}(X) = H_1(X)$. We may redefine the quasilinear entropy by

$$I^\psi(X) := \psi^{-1}\left(\sum_{j=1}^n p_j \psi\left(\log \frac{1}{p_j}\right)\right) \quad (5)$$

for a continuous and strictly monotonic function ψ on $(0, \infty)$. If we take $\psi(x) = x$ in (5), we have $I^{id}(X) = H_1(X)$. The case $\psi(x) = 2^{(1-q)x}$ ($q \neq 1$) is also useful in practice, since we recapture the Rényi entropy, namely $I^{2^{(1-q)x}}(X) = R_q(X)$, where the Rényi entropy [3] is defined by

$$R_q(X) := \frac{1}{1-q} \log \sum_{j=1}^n p_j^q \quad (q \ge 0,\ q \neq 1).$$
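The representation of the Rényi entropy as a quasilinear entropy can be checked numerically. The sketch below uses the natural-base variant $\psi(x) = e^{(1-q)x}$ of the function above; all names are our own:

```python
import math

def renyi_entropy(p, q):
    """Rényi entropy R_q(X) = log(sum_j p_j^q) / (1 - q), natural log."""
    return math.log(sum(pj**q for pj in p)) / (1 - q)

def quasilinear_entropy(p, q):
    """Quasilinear entropy I^psi with psi(x) = exp((1-q) x):
    psi^{-1}( sum_j p_j * psi(log(1/p_j)) )."""
    psi = lambda x: math.exp((1 - q) * x)
    psi_inv = lambda y: math.log(y) / (1 - q)
    return psi_inv(sum(pj * psi(math.log(1 / pj)) for pj in p))

p = [0.6, 0.3, 0.1]
q = 0.5
print(abs(renyi_entropy(p, q) - quasilinear_entropy(p, q)) < 1e-12)  # True
```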
From a viewpoint of application to source coding, the relation between the weighted quasilinear mean and the Rényi entropy has been studied in Chapter 5 of [1] in the following way.
Theorem A ([1])
For all real numbers $t > -1$ ($t \neq 0$) and integers $n \ge 2$, there exists a code with codeword lengths $l_1, \ldots, l_n$ such that

$$R_{1/(1+t)}(X) \le M_{\psi_t}(l_1, \ldots, l_n) < R_{1/(1+t)}(X) + 1,$$

where the exponential function $\psi_t(x) := 2^{tx}$ is defined on $[0, \infty)$ and the weighted quasilinear mean $M_{\psi_t}$ is taken with weights $w_j = p_j$.
By simple calculations, we find that

$$\lim_{t \to 0} M_{\psi_t}(l_1, \ldots, l_n) = \sum_{j=1}^n p_j l_j$$

and

$$\lim_{t \to 0} R_{1/(1+t)}(X) = H_1(X).$$

Therefore, Theorem A appears as a generalization of the famous Shannon source coding theorem:

$$H_1(X) \le \sum_{j=1}^n p_j l_j < H_1(X) + 1.$$
Motivated by the above results and recent advances in Tsallis entropy theory, we investigate mathematical results for generalized entropies involving Tsallis entropies and quasilinear entropies, using some inequalities obtained by improvements of Young's inequality.
Definition 1.1 For a continuous and strictly monotonic function ψ and two probability distributions $p = \{p_j\}$ and $r = \{r_j\}$ with $p_j > 0$, $r_j > 0$ for all $j = 1, \ldots, n$, the quasilinear relative entropy is defined by

$$D^\psi(p\|r) := -\psi^{-1}\left(\sum_{j=1}^n p_j \psi\left(\log \frac{r_j}{p_j}\right)\right).$$

The quasilinear relative entropy coincides with the Shannon relative entropy if $\psi(x) = x$, i.e.,

$$D^{id}(p\|r) = -\sum_{j=1}^n p_j \log \frac{r_j}{p_j} = \sum_{j=1}^n p_j \log \frac{p_j}{r_j} = D_1(p\|r).$$
We denote by $R_q(p\|r)$ the Rényi relative entropy [3] defined by

$$R_q(p\|r) := \frac{1}{q-1} \log \sum_{j=1}^n p_j^q r_j^{1-q} \quad (q \ge 0,\ q \neq 1).$$

This is another particular case of the quasilinear relative entropy, namely for $\psi(x) = 2^{(1-q)x}$ we have $D^{2^{(1-q)x}}(p\|r) = R_q(p\|r)$.
We denote by

$$D_q(p\|r) := -\sum_{j=1}^n p_j \ln_q \frac{r_j}{p_j} = \sum_{j=1}^n \frac{p_j - p_j^q r_j^{1-q}}{1-q}$$

the Tsallis relative entropy, which converges to the usual relative entropy (divergence, Kullback-Leibler information) in the limit $q \to 1$:

$$\lim_{q \to 1} D_q(p\|r) = D_1(p\|r) = \sum_{j=1}^n p_j \log \frac{p_j}{r_j}.$$
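A short numerical check (helper names ours) that the Tsallis relative entropy is nonnegative and tends to the Kullback-Leibler divergence as $q \to 1$:

```python
import math

def tsallis_relative_entropy(p, r, q):
    """D_q(p||r) = sum_j (p_j - p_j^q r_j^(1-q)) / (1 - q), q != 1."""
    return sum((pj - pj**q * rj**(1 - q)) / (1 - q) for pj, rj in zip(p, r))

def kl_divergence(p, r):
    return sum(pj * math.log(pj / rj) for pj, rj in zip(p, r))

p = [0.5, 0.3, 0.2]
r = [0.2, 0.5, 0.3]
# D_q(p||r) is nonnegative and tends to the K-L divergence as q -> 1.
print(tsallis_relative_entropy(p, r, 0.5) >= 0)                                  # True
print(abs(tsallis_relative_entropy(p, r, 1.0001) - kl_divergence(p, r)) < 1e-3)  # True
```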
See [2, 5–7, 13–20] and references therein for recent advances and applications of the Tsallis entropy. We easily find that the Tsallis relative entropy is a special case of the Csiszár f-divergence [21–23], defined for a convex function f on $(0, \infty)$ with $f(1) = 0$ by

$$D_f(p\|r) := \sum_{j=1}^n r_j f\left(\frac{p_j}{r_j}\right),$$

since $f_q(t) := \frac{t - t^q}{1-q}$ is convex on $(0, \infty)$, vanishes at $t = 1$ and

$$D_{f_q}(p\|r) = D_q(p\|r).$$
Furthermore, we define the dual function $f^*$ with respect to a convex function f by

$$f^*(u) := u f\left(\frac{1}{u}\right)$$

for $u > 0$. Then the function $f^*$ is also convex on $(0, \infty)$. In addition, we define the f-divergence for incomplete probability distributions $\{p_j\}$ and $\{r_j\}$, where $\sum_{j=1}^n p_j \le 1$ and $\sum_{j=1}^n r_j \le 1$, in the same way:

$$D_f(p\|r) := \sum_{j=1}^n r_j f\left(\frac{p_j}{r_j}\right).$$
On the other hand, studies on refinements of Young's inequality have made great progress in the papers [24–35]. In the present paper, we give some inequalities on Tsallis entropies by applying two types of inequalities obtained in [29, 32]. In addition, we give a generalized Han's inequality for the Tsallis entropy in the final section.
2 Tsallis quasilinear entropy and Tsallis quasilinear relative entropy
As an analogy with (5), we may define the following entropy.
Definition 2.1 For a continuous and strictly monotonic function ψ and $q > 0$ with $q \neq 1$, the Tsallis quasilinear entropy (q-quasilinear entropy) is defined by

$$I_q^\psi(X) := \psi^{-1}\left(\sum_{j=1}^n p_j \psi\left(\ln_q \frac{1}{p_j}\right)\right),$$

where $\{p_1, \ldots, p_n\}$ is a probability distribution with $p_j > 0$ for all $j = 1, \ldots, n$.
We notice that if ψ does not depend on q, then $\lim_{q \to 1} I_q^\psi(X) = I^\psi(X)$.
For $q > 0$ with $q \neq 1$ and $x$ with $1 + (1-q)x > 0$, we define the q-exponential function as the inverse function of the q-logarithmic function by

$$\exp_q(x) := \left(1 + (1-q)x\right)^{\frac{1}{1-q}},$$

if $1 + (1-q)x > 0$; otherwise it is undefined. If we take $\psi(x) = x$, then we have $I_q^{id}(X) = H_q(X)$. Furthermore, we have $\ln_q \frac{1}{p_j} \ge 0$ for $0 < p_j \le 1$, which leads to the following proposition.
Proposition 2.2 The Tsallis quasilinear entropy is nonnegative:

$$I_q^\psi(X) \ge 0.$$

Proof We assume that ψ is an increasing function. Then we have $\psi\left(\ln_q \frac{1}{p_j}\right) \ge \psi(0)$ from $\ln_q \frac{1}{p_j} \ge 0$ for $0 < p_j \le 1$, for all $j = 1, \ldots, n$. Thus, we have $\sum_{j=1}^n p_j \psi\left(\ln_q \frac{1}{p_j}\right) \ge \psi(0)$, which implies $I_q^\psi(X) \ge \psi^{-1}(\psi(0)) = 0$, since $\psi^{-1}$ is also increasing. For the case that ψ is a decreasing function, we can prove it similarly. □
We note here that the q-exponential function gives us the following connection between the Rényi entropy and the Tsallis entropy [36]:

$$R_q(X) = \log \exp_q\left(H_q(X)\right). \quad (16)$$

We should note here that $\exp_q(H_q(X))$ is always defined, since we have

$$1 + (1-q) H_q(X) = \sum_{j=1}^n p_j^q > 0.$$
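The connection between the Rényi and Tsallis entropies via the q-exponential can be verified numerically; a minimal sketch with our own helper names:

```python
import math

def tsallis_entropy(p, q):
    """H_q(X) = (sum_j p_j^q - 1) / (1 - q)."""
    return (sum(pj**q for pj in p) - 1) / (1 - q)

def renyi_entropy(p, q):
    return math.log(sum(pj**q for pj in p)) / (1 - q)

def exp_q(x, q):
    """q-exponential, defined when 1 + (1-q)x > 0."""
    return (1 + (1 - q) * x) ** (1 / (1 - q))

p = [0.4, 0.4, 0.2]
q = 2.0
# Identity (16): R_q(X) = log exp_q(H_q(X)).
print(abs(renyi_entropy(p, q) - math.log(exp_q(tsallis_entropy(p, q), q))) < 1e-12)  # True
```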
From (16), we have the following proposition.
Proposition 2.3 Let $G = \{G_1, G_2, \ldots, G_m\}$ be a partition of $\{1, 2, \ldots, n\}$ and put $h_i := \sum_{j \in G_i} p_j$. Then we have

$$R_q(h_1, \ldots, h_m) \le R_q(p_1, \ldots, p_n).$$

Proof We use the generalized Shannon additivity (which is often called q-additivity) for the Tsallis entropy (see [14] for example):

$$H_q(p_{11}, \ldots, p_{m n_m}) = H_q(h_1, \ldots, h_m) + \sum_{i=1}^m h_i^q H_q\left(\frac{p_{i1}}{h_i}, \ldots, \frac{p_{i n_i}}{h_i}\right), \quad (19)$$

where $h_i = \sum_{j=1}^{n_i} p_{ij}$ ($i = 1, \ldots, m$; $j = 1, \ldots, n_i$). Thus, we have

$$H_q(p_{11}, \ldots, p_{m n_m}) \ge H_q(h_1, \ldots, h_m),$$

since the second term of the right-hand side in (19) is nonnegative because of the nonnegativity of the Tsallis entropy. Thus, we have

$$\log \exp_q\left(H_q(p_{11}, \ldots, p_{m n_m})\right) \ge \log \exp_q\left(H_q(h_1, \ldots, h_m)\right),$$

since $\log \exp_q(x)$ is a monotone increasing function. Hence, the inequality

$$R_q(h_1, \ldots, h_m) \le R_q(p_1, \ldots, p_n)$$

holds, which proves the present proposition. □
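The generalized Shannon additivity used in the proof above can be illustrated numerically for a distribution split into two groups (a sketch; names ours):

```python
def tsallis_entropy(p, q):
    return (sum(pj**q for pj in p) - 1) / (1 - q)

# A distribution split into two groups, illustrating the generalized
# Shannon additivity (q-additivity).
p = [0.1, 0.2, 0.3, 0.4]
groups = [[0.1, 0.2], [0.3, 0.4]]
q = 0.5

h = [sum(g) for g in groups]            # coarse-grained (group) probabilities
lhs = tsallis_entropy(p, q)
rhs = tsallis_entropy(h, q) + sum(
    hi**q * tsallis_entropy([pj / hi for pj in g], q)
    for hi, g in zip(h, groups)
)
print(abs(lhs - rhs) < 1e-12)           # True: q-additivity holds exactly
print(lhs >= tsallis_entropy(h, q))     # True: coarse-graining cannot increase H_q
```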
Definition 2.4 For a continuous and strictly monotonic function ψ and two probability distributions $p = \{p_j\}$ and $r = \{r_j\}$ with $p_j > 0$, $r_j > 0$ for all $j = 1, \ldots, n$, the Tsallis quasilinear relative entropy is defined by

$$D_q^\psi(p\|r) := -\psi^{-1}\left(\sum_{j=1}^n p_j \psi\left(\ln_q \frac{r_j}{p_j}\right)\right).$$

For $\psi(x) = x$, the Tsallis quasilinear relative entropy becomes the Tsallis relative entropy, that is,

$$D_q^{id}(p\|r) = -\sum_{j=1}^n p_j \ln_q \frac{r_j}{p_j} = D_q(p\|r),$$
and in the limit $q \to 1$, if ψ does not depend on q, we have

$$\lim_{q \to 1} D_q^\psi(p\|r) = D^\psi(p\|r).$$
We give a sufficient condition for the nonnegativity of the Tsallis quasilinear relative entropy.
Proposition 2.5 If ψ is a concave increasing function or a convex decreasing function, then the Tsallis quasilinear relative entropy is nonnegative:

$$D_q^\psi(p\|r) \ge 0.$$

Proof We firstly assume that ψ is a concave increasing function. The concavity of ψ shows that we have

$$\sum_{j=1}^n p_j \psi\left(\ln_q \frac{r_j}{p_j}\right) \le \psi\left(\sum_{j=1}^n p_j \ln_q \frac{r_j}{p_j}\right).$$

From the assumption, $\psi^{-1}$ is also increasing, so that we have

$$\psi^{-1}\left(\sum_{j=1}^n p_j \psi\left(\ln_q \frac{r_j}{p_j}\right)\right) \le \sum_{j=1}^n p_j \ln_q \frac{r_j}{p_j}.$$

Therefore, we have $D_q^\psi(p\|r) \ge -\sum_{j=1}^n p_j \ln_q \frac{r_j}{p_j} = D_q(p\|r) \ge 0$, since the concavity of $\ln_q$ implies $\sum_{j=1}^n p_j \ln_q \frac{r_j}{p_j} \le \ln_q \sum_{j=1}^n r_j = \ln_q 1 = 0$. For the case that ψ is a convex decreasing function, we can prove nonnegativity of the Tsallis quasilinear relative entropy similarly. □
Remark 2.6 The following two functions satisfy the sufficient condition in the above proposition.

- (i) $\psi(x) = \ln_q x$ for $q > 0$, $x > 0$ (concave and increasing);

- (ii) $\psi(x) = 2^{(1-q)x}$ for $q > 1$ (convex and decreasing).
It is notable that the following identity holds:

$$R_q(p\|r) = -\log \exp_q\left(-D_q(p\|r)\right). \quad (24)$$

We should note here that $\exp_q(-D_q(p\|r))$ is always defined, since we have

$$1 - (1-q) D_q(p\|r) = \sum_{j=1}^n p_j^q r_j^{1-q} > 0.$$
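A numerical check of this identity between the Rényi and Tsallis relative entropies (helper names ours):

```python
import math

def tsallis_relative_entropy(p, r, q):
    return (1 - sum(pj**q * rj**(1 - q) for pj, rj in zip(p, r))) / (1 - q)

def renyi_relative_entropy(p, r, q):
    return math.log(sum(pj**q * rj**(1 - q) for pj, rj in zip(p, r))) / (q - 1)

def exp_q(x, q):
    return (1 + (1 - q) * x) ** (1 / (1 - q))

p = [0.5, 0.3, 0.2]
r = [0.3, 0.3, 0.4]
q = 2.0
# Identity (24): R_q(p||r) = -log exp_q(-D_q(p||r)).
lhs = renyi_relative_entropy(p, r, q)
rhs = -math.log(exp_q(-tsallis_relative_entropy(p, r, q), q))
print(abs(lhs - rhs) < 1e-12)  # True
```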
We also find that (24) implies the monotonicity of the Rényi relative entropy.
Proposition 2.7 Under the same assumptions as in Proposition 2.3, putting also $g_i := \sum_{j \in G_i} r_j$, for $q \ge 0$ with $q \neq 1$ we have

$$R_q(h_1, \ldots, h_m \| g_1, \ldots, g_m) \le R_q(p_1, \ldots, p_n \| r_1, \ldots, r_n).$$

Proof We recall that the Tsallis relative entropy is a special case of the f-divergence, so that it has the same properties as the f-divergence. Since $-\log \exp_q(-x)$ is a monotone increasing function of x and the f-divergence has a monotonicity [21, 23], we have

$$D_q(h_1, \ldots, h_m \| g_1, \ldots, g_m) \le D_q(p_1, \ldots, p_n \| r_1, \ldots, r_n),$$

which, combined with (24), proves the statement. □
3 Inequalities for Tsallis quasilinear entropy and f-divergence
In this section, we give inequalities for the Tsallis quasilinear entropy and the f-divergence. For this purpose, we review the results obtained in [29] as one of the generalizations of refined Young's inequality.
Proposition 3.1 ([29])
For two probability vectors and such that , , and such that , we have
where
for a continuous increasing function and a function such that
for any and any .
We have the following inequalities on the Tsallis quasilinear entropy and Tsallis entropy.
Theorem 3.2 For , a continuous and strictly monotonic function ψ on and a probability distribution with for all , we have
Proof If we take the uniform distribution in Proposition 3.1, then we have
(which coincides with Theorem 3.3 in [29]). In the inequalities (29), we put and for any , then we obtain the statement. □
Corollary 3.3 For and a probability distribution with for all , we have
Proof Put $\psi(x) = x$ in Theorem 3.2. □
Remark 3.4 Corollary 3.3 improves the well-known inequalities $0 \le H_q(X) \le \ln_q n$. If we take the limit $q \to 1$, the inequalities (30) recover Proposition 1 in [25].
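The well-known bounds $0 \le H_q(X) \le \ln_q n$, with equality on the right for the uniform distribution, are easy to check numerically (a sketch; names ours):

```python
def ln_q(x, q):
    return (x**(1 - q) - 1) / (1 - q)

def tsallis_entropy(p, q):
    return (sum(pj**q for pj in p) - 1) / (1 - q)

# Bounds 0 <= H_q(X) <= ln_q(n), with equality on the right
# for the uniform distribution u.
p = [0.7, 0.2, 0.1]
u = [1 / 3] * 3
for q in (0.5, 2.0, 3.0):
    assert 0 <= tsallis_entropy(p, q) <= ln_q(3, q)
    assert abs(tsallis_entropy(u, q) - ln_q(3, q)) < 1e-12
print("bounds verified")
```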
We also have the following inequalities.
Theorem 3.5 For two probability distributions and , and an incomplete probability distribution with , we have
Proof Put in Proposition 3.1 with . Since we have the relation
we have the statement. □
Corollary 3.6 ([25])
Under the same assumption as in Theorem 3.5, we have
Proof If we take in Theorem 3.5, then we have
Since and , we also have
□
4 Inequalities for Tsallis entropy
We firstly give Lagrange’s identity [37], to establish an alternative generalization of refined Young’s inequality.
Lemma 4.1 (Lagrange’s identity)
For two vectors $a = (a_1, \ldots, a_n)$ and $b = (b_1, \ldots, b_n)$, we have

$$\left(\sum_{k=1}^n a_k^2\right)\left(\sum_{k=1}^n b_k^2\right) - \left(\sum_{k=1}^n a_k b_k\right)^2 = \sum_{1 \le j < k \le n} \left(a_j b_k - a_k b_j\right)^2.$$
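Lagrange's identity is straightforward to verify numerically for sample vectors (a minimal sketch):

```python
# Numerical check of Lagrange's identity for two sample vectors.
a = [1.0, 2.0, 3.0, 4.0]
b = [2.0, -1.0, 0.5, 3.0]
n = len(a)

lhs = sum(x * x for x in a) * sum(y * y for y in b) - sum(x * y for x, y in zip(a, b)) ** 2
rhs = sum((a[j] * b[k] - a[k] * b[j]) ** 2 for j in range(n) for k in range(j + 1, n))
print(abs(lhs - rhs) < 1e-9)  # True
```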
Theorem 4.2 Let $f : I \to \mathbb{R}$ be a twice differentiable function such that there exist real constants m and M so that $m \le f''(x) \le M$ for any $x \in I$. Then we have

$$\frac{m}{2} \sum_{1 \le j < k \le n} p_j p_k (x_j - x_k)^2 \le \sum_{j=1}^n p_j f(x_j) - f\left(\sum_{j=1}^n p_j x_j\right) \le \frac{M}{2} \sum_{1 \le j < k \le n} p_j p_k (x_j - x_k)^2,$$

where $p_j > 0$ with $\sum_{j=1}^n p_j = 1$ and $x_j \in I$ for all $j = 1, \ldots, n$.

Proof We consider the function g defined by $g(x) := f(x) - \frac{m}{2} x^2$. Since we have $g''(x) = f''(x) - m \ge 0$, g is a convex function. Applying Jensen's inequality, we thus have

$$g\left(\sum_{j=1}^n p_j x_j\right) \le \sum_{j=1}^n p_j g(x_j), \quad (34)$$

where $p_j > 0$ with $\sum_{j=1}^n p_j = 1$ and $x_j \in I$ for all $j = 1, \ldots, n$. From the inequality (34), we have

$$f\left(\sum_{j=1}^n p_j x_j\right) \le \sum_{j=1}^n p_j f(x_j) - \frac{m}{2}\left(\sum_{j=1}^n p_j x_j^2 - \left(\sum_{j=1}^n p_j x_j\right)^2\right) = \sum_{j=1}^n p_j f(x_j) - \frac{m}{2} \sum_{1 \le j < k \le n} p_j p_k (x_j - x_k)^2.$$

In the above calculations, we used Lemma 4.1 (with $a_k = \sqrt{p_k}\, x_k$ and $b_k = \sqrt{p_k}$). Thus, we proved the first part of the inequalities. Similarly, one can prove the second part of the inequalities putting the function h defined by $h(x) := \frac{M}{2} x^2 - f(x)$. We omit the details. □
Lemma 4.3 For with and , and with , we have
Proof We denote
The left-side term becomes
Similarly, a straightforward computation yields
This concludes the proof. □
Corollary 4.4 Under the assumptions of Theorem 4.2, we have
Remark 4.5 Corollary 4.4 gives a similar form to Cartwright-Field's inequality [38]:

$$\frac{1}{2M} \sum_{j=1}^n \lambda_j (x_j - A_n)^2 \le A_n - G_n \le \frac{1}{2m} \sum_{j=1}^n \lambda_j (x_j - A_n)^2,$$

where $\lambda_j > 0$ with $\sum_{j=1}^n \lambda_j = 1$, $A_n := \sum_{j=1}^n \lambda_j x_j$, $G_n := \prod_{j=1}^n x_j^{\lambda_j}$, and $m := \min\{x_1, \ldots, x_n\}$, $M := \max\{x_1, \ldots, x_n\}$.
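A numerical spot-check of the Cartwright-Field inequality for one weighted data set (a sketch; variable names ours):

```python
import math

# Cartwright-Field inequality for one sample weighted data set
# (weights lam sum to 1, all x_j > 0).
lam = [0.2, 0.5, 0.3]
x = [1.0, 2.0, 4.0]

A = sum(l * xi for l, xi in zip(lam, x))              # weighted arithmetic mean
G = math.prod(xi**l for l, xi in zip(lam, x))         # weighted geometric mean
m, M = min(x), max(x)
V = sum(l * (xi - A) ** 2 for l, xi in zip(lam, x))   # weighted variance about A

print(V / (2 * M) <= A - G <= V / (2 * m))  # True
```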
We also have the following inequalities for the Tsallis entropy.
Theorem 4.6 For two probability distributions and with , and , we have
where and are positive numbers depending on the parameter and satisfying and for all .
Proof Applying Theorem 4.2 for the convex function and , we have
since the second derivative of is . Putting for all in the inequalities (39), it follows
From the inequalities (39) and (40), we have the statement. □
Remark 4.7 The first part of the inequalities (40) gives another improvement of the well-known inequality $D_q(p\|r) \ge 0$.
Corollary 4.8 For two probability distributions and with , and , we have
where and are positive numbers satisfying and for all .
Proof Take the limit $q \to 1$ in Theorem 4.6. □
Remark 4.9 The second part of the inequalities (41) gives the reverse inequality for the so-called information inequality [[39], Theorem 2.6.3]

$$\sum_{j=1}^n p_j \log \frac{p_j}{r_j} \ge 0, \quad (42)$$

which is equivalent to the non-negativity of the relative entropy: $D_1(p\|r) \ge 0$.
Using the inequality (42), we derive the following result.
Proposition 4.10 For two probability distributions and with , and , we have
Proof In the inequality (42), we put and which satisfy . Then we have the present proposition. □
5 A generalized Han’s inequality
In order to state our result, we give the definitions of the Tsallis conditional entropy and the Tsallis joint entropy.
For the conditional probability $p(x_i | y_j)$ and the joint probability $p(x_i, y_j)$, we define the Tsallis conditional entropy and the Tsallis joint entropy by

$$H_q(X|Y) := -\sum_{i,j} p(x_i, y_j)^q \ln_q p(x_i | y_j) \quad (q \neq 1)$$

and

$$H_q(X, Y) := -\sum_{i,j} p(x_i, y_j)^q \ln_q p(x_i, y_j) \quad (q \neq 1).$$
We summarize briefly the following chain rules representing relations between the Tsallis conditional entropy and the Tsallis joint entropy.
Proposition 5.2 ([16]) Assume that X and Y are random variables. Then

$$H_q(X, Y) = H_q(X) + H_q(Y|X).$$
Proposition 5.2 implies the following propositions.
Proposition 5.3 ([16])
Suppose $X_1, X_2, \ldots, X_n$ are random variables. Then

$$H_q(X_1, X_2, \ldots, X_n) = \sum_{i=1}^n H_q(X_i | X_1, \ldots, X_{i-1}).$$
Proposition 5.4 ([16]) For $q \ge 1$ and two random variables X and Y, we have the following inequality:

$$H_q(X|Y) \le H_q(X).$$
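The chain rule of Proposition 5.2 and the conditioning inequality of Proposition 5.4 can be illustrated on a toy joint distribution (a Python sketch; helper names ours):

```python
def ln_q(x, q):
    return (x**(1 - q) - 1) / (1 - q)

# Toy joint distribution p(x_i, y_j) on a 2x2 alphabet.
pxy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}
q = 2.0

px = {x: sum(v for (a, _), v in pxy.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in pxy.items() if b == y) for y in (0, 1)}

H_X = -sum(p**q * ln_q(p, q) for p in px.values())
H_XY = -sum(p**q * ln_q(p, q) for p in pxy.values())
# Conditional entropies weight each term by p(x,y)^q.
H_Y_given_X = -sum(v**q * ln_q(v / px[x], q) for (x, y), v in pxy.items())
H_X_given_Y = -sum(v**q * ln_q(v / py[y], q) for (x, y), v in pxy.items())

print(abs(H_XY - (H_X + H_Y_given_X)) < 1e-12)  # chain rule: True
print(H_X_given_Y <= H_X)  # conditioning reduces entropy for q >= 1: True
```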
Consequently, we have the following self-bounding property of the Tsallis joint entropy.
Theorem 5.5 (Generalized Han’s inequality)
Let $X_1, X_2, \ldots, X_n$ be random variables. Then for $q \ge 1$, we have the following inequality:

$$H_q(X_1, X_2, \ldots, X_n) \le \frac{1}{n-1} \sum_{i=1}^n H_q(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n).$$
Proof Since the Tsallis joint entropy has a symmetry $H_q(X, Y) = H_q(Y, X)$, we have

$$H_q(X_1, \ldots, X_n) = H_q(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n) + H_q(X_i | X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n) \le H_q(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n) + H_q(X_i | X_1, \ldots, X_{i-1})$$

by the use of Proposition 5.2 and Proposition 5.4. Summing both sides over i from 1 to n, we have

$$n H_q(X_1, \ldots, X_n) \le \sum_{i=1}^n H_q(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n) + H_q(X_1, \ldots, X_n),$$

due to Proposition 5.3. Therefore, we have the present proposition. □
Remark 5.6 Theorem 5.5 recovers the original Han's inequality [41, 42] if we take the limit $q \to 1$.
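The generalized Han's inequality can be spot-checked numerically for three binary random variables and $q = 2$ (a sketch under the definitions above; names ours):

```python
import itertools

def ln_q(x, q):
    return (x**(1 - q) - 1) / (1 - q)

def joint_tsallis(p, q):
    """H_q of a joint distribution given as {outcome-tuple: probability}."""
    return -sum(v**q * ln_q(v, q) for v in p.values() if v > 0)

def marginalize_out(p, i):
    """Remove coordinate i from each outcome tuple and sum probabilities."""
    out = {}
    for key, v in p.items():
        k = key[:i] + key[i + 1:]
        out[k] = out.get(k, 0.0) + v
    return out

# A (non-uniform) joint distribution of three binary variables.
probs = [0.05, 0.1, 0.15, 0.1, 0.2, 0.05, 0.25, 0.1]
p = dict(zip(itertools.product((0, 1), repeat=3), probs))

q, n = 2.0, 3
lhs = joint_tsallis(p, q)
rhs = sum(joint_tsallis(marginalize_out(p, i), q) for i in range(n)) / (n - 1)
print(lhs <= rhs)  # True
```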
6 Conclusion
We gave improvements of Young's inequality for scalars. Using these results, we gave several inequalities on generalized entropies involving Tsallis entropies. We also provided a generalized Han's inequality, based on the Tsallis conditional entropy and the Tsallis joint entropy.
References
Aczél J, Daróczy Z: On Measures of Information and Their Characterizations. Academic Press, San Diego; 1975.
Furuichi S: Tsallis entropies and their theorems, properties and applications. In Aspects of Optical Sciences and Quantum Information. Edited by: Abdel-Aty M. Research Signpost, Trivandrum; 2007:1–86.
Rényi A: On measures of entropy and information. 1. In Proc. 4th Berkeley Symp., Mathematical and Statistical Probability. University of California Press, Berkeley; 1961:547–561.
Tsallis C: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52: 479–487. 10.1007/BF01016429
Tsallis C, et al.: Nonextensive Statistical Mechanics and Its Applications. Edited by: Abe S, Okamoto Y. Springer, Berlin; 2001. See also the comprehensive list of references at http://tsallis.cat.cbpf.br/biblio.htm
Tsallis C: Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World. Springer, Berlin; 2009.
Tsallis C: Entropy. In Encyclopedia of Complexity and Systems Science. Springer, Berlin; 2009.
Sebawe Abdalla M, Thabet L: Nonclassical properties of a model for modulated damping under the action of an external force. Appl. Math. Inf. Sci. 2011, 5: 570–588.
El-Barakaty A, Darwish M, Obada A-SF: Purity loss for a cooper pair box interacting dispersively with a nonclassical field under phase damping. Appl. Math. Inf. Sci. 2011, 5: 122–131.
Sun L-H, Li G-X, Ficek Z: Continuous variables approach to entanglement creation and processing. Appl. Math. Inf. Sci. 2010, 4: 315–339.
Ficek Z: Quantum entanglement processing with atoms. Appl. Math. Inf. Sci. 2009, 3: 375–393.
Furuichi S: A note on a parametrically extended entanglement-measure due to Tsallis relative entropy. Information 2006, 9: 837–844.
Furuichi S, Yanagi K, Kuriyama K: Fundamental properties of Tsallis relative entropy. J. Math. Phys. 2004, 45: 4868–4877. 10.1063/1.1805729
Furuichi S: On uniqueness theorems for Tsallis entropy and Tsallis relative entropy. IEEE Trans. Inf. Theory 2005, 51: 3638–3645.
Furuichi S: An axiomatic characterization of a two-parameter extended relative entropy. J. Math. Phys. 2010., 51: Article ID 123302
Furuichi S: Information theoretical properties of Tsallis entropies. J. Math. Phys. 2006., 47: Article ID 023302
Furuichi S: Matrix trace inequalities on Tsallis entropies. J. Inequal. Pure Appl. Math. 2008., 9(1): Article ID 1
Furuichi S: On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics. J. Math. Phys. 2009., 50: Article ID 013303
Furuichi S, Mitroi F-C: Mathematical inequalities for some divergences. Physica A 2012, 391: 388–400. 10.1016/j.physa.2011.07.052
Furuichi S: Inequalities for Tsallis relative entropy and generalized skew information. Linear Multilinear Algebra 2012, 59: 1143–1158.
Csiszár I: Axiomatic characterizations of information measures. Entropy 2008, 10: 261–273. 10.3390/e10030261
Csiszár I: Information measures: a critical survey. In Transactions of the Seventh Prague Conference on Information Theory, Statistical Decision Functions, Random Processes. Reidel, Dordrecht; 1978:73–86.
Csiszár I, Shields PC: Information theory and statistics: a tutorial. Found. Trends Commun. Inf. Theory 2004, 1: 417–528. 10.1561/0100000004
Bobylev, NA, Krasnoselsky, MA: Extremum analysis (degenerate cases), Moscow. Preprint (1981), 52 pages (in Russian)
Dragomir S: Bounds for the normalised Jensen functional. Bull. Aust. Math. Soc. 2006, 74: 471–478. 10.1017/S000497270004051X
Kittaneh F, Manasrah Y: Improved Young and Heinz inequalities for matrices. J. Math. Anal. Appl. 2010, 361: 262–269.
Aldaz JM: Self-improvement of the inequality between arithmetic and geometric means. J. Math. Inequal. 2009, 3: 213–216.
Aldaz JM: Comparison of differences between arithmetic and geometric means. Tamkang J. Math. 2011, 42: 445–451.
Mitroi FC: About the precision in Jensen-Steffensen inequality. An. Univ. Craiova, Ser. Math. Comput. Sci 2010, 37: 73–84.
Furuichi S: On refined Young inequalities and reverse inequalities. J. Math. Inequal. 2011, 5: 21–31.
Furuichi S: Refined Young inequalities with Specht’s ratio. J. Egypt. Math. Soc. 2012, 20: 46–49. 10.1016/j.joems.2011.12.010
Minculete N: A result about Young inequality and several applications. Sci. Magna 2011, 7: 61–68.
Minculete N: A refinement of the Kittaneh-Manasrah inequality. Creat. Math. Inform. 2011, 20: 157–162.
Furuichi S, Minculete N: Alternative reverse inequalities for Young’s inequality. J. Math. Inequal. 2011, 5: 595–600.
Minculete N, Furuichi S: Several applications of Cartwright-Field’s inequality. Int. J. Pure Appl. Math. 2011, 71: 19–30.
Masi M: A step beyond Tsallis and Rényi entropies. Phys. Lett. A 2005, 338: 217–224. 10.1016/j.physleta.2005.01.094
Weisstein EW: CRC Concise Encyclopedia of Mathematics. 2nd edition. CRC Press, Boca Raton; 2003.
Cartwright DI, Field MJ: A refinement of the arithmetic mean-geometric mean inequality. Proc. Am. Math. Soc. 1978, 71: 36–38. 10.1090/S0002-9939-1978-0476971-2
Cover TM, Thomas JA: Elements of Information Theory. 2nd edition. Wiley, New York; 2006.
Daróczy Z: General information functions. Inf. Control 1970, 16: 36–51. 10.1016/S0019-9958(70)80040-7
Han T: Nonnegative entropy measures of multivariate symmetric correlations. Inf. Control 1978, 36: 133–156. 10.1016/S0019-9958(78)90275-9
Boucheron S, Lugosi G, Bousquet O: Concentration inequalities. In Advanced Lectures on Machine Learning. Springer, Berlin; 2003:208–240.
Acknowledgements
We would like to thank the anonymous reviewer for providing valuable comments to improve the manuscript. The author SF was supported in part by the Japanese Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Encouragement of Young Scientists (B), 20740067. The author NM was supported in part by the Romanian Ministry of Education, Research and Innovation through the PNII Idei project 842/2008. The author FCM was supported by CNCSIS Grant 420/2008.
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
The work presented here was carried out in collaboration between all authors. The study was initiated by SF. The author SF also played the role of the corresponding author. All authors contributed equally and significantly in writing this article. All authors have contributed to, seen and approved the manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite this article
Furuichi, S., Minculete, N. & Mitroi, FC. Some inequalities on generalized entropies. J Inequal Appl 2012, 226 (2012). https://doi.org/10.1186/1029-242X-2012-226