Some inequalities on generalized entropies
© Furuichi et al.; licensee Springer 2012
Received: 18 February 2012
Accepted: 25 September 2012
Published: 9 October 2012
Abstract
We give several inequalities on generalized entropies involving Tsallis entropies, using inequalities obtained through improvements of Young's inequality. We also give a generalized Han's inequality.
Keywords: refined Young's inequality; Tsallis entropy; f-divergence; quasilinear entropy; Han's inequality
1 Introduction
Generalized entropies have been studied by many researchers (we refer the interested reader to [1, 2]). The Rényi [3] and Tsallis [4] entropies are well known as one-parameter generalizations of Shannon's entropy, and they have been intensively studied not only in classical statistical physics [5–7], but also in quantum physics in relation to entanglement [8–11]. The Tsallis entropy is a natural one-parameter extension of the Shannon entropy, so it can be applied to known models which describe systems of great interest in atomic physics [12]. However, to the best of our knowledge, the physical relevance of the parameter of the Tsallis entropy has been highly debated and has not been completely clarified yet; the parameter is usually regarded as a measure of the non-extensivity of the system under consideration. One of the authors of the present paper has studied the Tsallis entropy and the Tsallis relative entropy from the mathematical point of view. First, fundamental properties of the Tsallis relative entropy were discussed in [13]. The uniqueness theorem for the Tsallis entropy and the Tsallis relative entropy was studied in [14]. Following this result, an axiomatic characterization of a two-parameter extended relative entropy was given in [15]. In [16], information theoretical properties of the Tsallis entropy and some inequalities for conditional and joint Tsallis entropies were derived; these entropies are used again in the present paper to derive a generalized Han's inequality. In [17], matrix trace inequalities for the Tsallis entropy were studied, and in [18], the maximum entropy principle for the Tsallis entropy and the minimization of the Fisher information in Tsallis statistics were studied. Quite recently, we provided mathematical inequalities for some divergences in [19], considering that the study of mathematical inequalities is important for the development of new entropies. In this paper, we define a further generalized entropy based on the Tsallis and Rényi entropies and study its mathematical properties by means of scalar inequalities, in order to develop the theory of entropies.
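As a concrete reference point for what follows, here is a minimal numerical sketch, assuming the standard definitions of the Tsallis entropy $H_q(X) = \bigl(\sum_j p_j^q - 1\bigr)/(1-q) = \sum_j p_j \ln_q(1/p_j)$ and the $q$-logarithm $\ln_q(x) \equiv (x^{1-q}-1)/(1-q)$; all function names are ours, not from the paper.

```python
# A minimal numerical sketch (ours): the standard Tsallis entropy
# H_q(X) = (sum_j p_j^q - 1)/(1 - q) recovers the Shannon entropy as q -> 1.
import math

def tsallis_entropy(p, q):
    """Tsallis entropy of a probability vector p, for q >= 0, q != 1."""
    return (sum(pj ** q for pj in p) - 1.0) / (1.0 - q)

def shannon_entropy(p):
    """Shannon entropy with natural logarithm."""
    return -sum(pj * math.log(pj) for pj in p if pj > 0)

p = [0.5, 0.3, 0.2]
for q in (0.5, 0.999, 1.001, 2.0):
    print(f"H_{q}(p) = {tsallis_entropy(p, q):.6f}")
print(f"Shannon  = {shannon_entropy(p):.6f}")  # matches H_q for q near 1
```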
For a continuous and strictly monotonic function ψ on $(0,\infty)$, the weighted quasilinear mean is defined by
$$M_\psi(x_1,\dots,x_n) \equiv \psi^{-1}\Biggl(\sum_{j=1}^n p_j\,\psi(x_j)\Biggr),$$
where $p_j \ge 0$, $\sum_{j=1}^n p_j = 1$, and $x_j > 0$ for $j = 1,\dots,n$. If we take $\psi(x) = x$, then $M_\psi(x_1,\dots,x_n)$ coincides with the weighted arithmetic mean $\sum_{j=1}^n p_j x_j$. If we also take $\psi(x) = \log x$, then $M_\psi(x_1,\dots,x_n)$ coincides with the weighted geometric mean $\prod_{j=1}^n x_j^{p_j}$.
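A short sketch (ours) of the weighted quasilinear mean in the Kolmogorov-Nagumo form reconstructed above; `psi_inv` must invert `psi` on the range used.

```python
# Sketch (ours): weighted quasilinear mean M_psi and its two classical cases.
import math

def quasilinear_mean(x, p, psi, psi_inv):
    """M_psi(x) = psi^{-1}(sum_j p_j psi(x_j)) for weights p summing to 1."""
    return psi_inv(sum(pj * psi(xj) for pj, xj in zip(p, x)))

x, p = [1.0, 4.0, 9.0], [0.2, 0.3, 0.5]
arithmetic = quasilinear_mean(x, p, lambda t: t, lambda t: t)  # psi(x) = x
geometric = quasilinear_mean(x, p, math.log, math.exp)         # psi(x) = log x
print(arithmetic)                                   # sum_j p_j x_j = 5.9
print(geometric, math.prod(xj ** pj for xj, pj in zip(x, p)))  # equal values
```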
From the viewpoint of applications to source coding, the relation between the weighted quasilinear mean and the Rényi entropy has been studied in Chapter 5 of [1] in the following way.
Theorem A ([1]) The quasilinear entropy $\psi^{-1}\bigl(\sum_{j=1}^n p_j\,\psi(\log(1/p_j))\bigr)$ coincides with the Shannon entropy for $\psi(x) = x$ and with the Rényi entropy $R_q(X) \equiv \frac{1}{1-q}\log\sum_{j=1}^n p_j^q$ for $\psi(x) = e^{(1-q)x}$, where the exponential function is defined on $\mathbb{R}$.
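Independently of the exact wording of Theorem A (reconstructed above), the underlying identity is classical: with the Kolmogorov-Nagumo function $\psi(x) = e^{(1-q)x}$ and natural logarithms throughout, the quasilinear mean of $\log(1/p_j)$ equals the Rényi entropy. A short numerical check (function names ours):

```python
# Sketch (ours): quasilinear entropy with psi(x) = exp((1-q)x) equals R_q.
import math

def renyi_entropy(p, q):
    return math.log(sum(pj ** q for pj in p)) / (1.0 - q)

def quasilinear_entropy(p, q):
    psi = lambda x: math.exp((1.0 - q) * x)
    psi_inv = lambda y: math.log(y) / (1.0 - q)
    return psi_inv(sum(pj * psi(math.log(1.0 / pj)) for pj in p))

p, q = [0.5, 0.3, 0.2], 2.0
print(renyi_entropy(p, q), quasilinear_entropy(p, q))  # equal values
```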
Motivated by the above results and by recent advances in Tsallis entropy theory, we investigate mathematical results for generalized entropies involving Tsallis entropies and quasilinear entropies, using inequalities obtained through improvements of Young's inequality.
On the other hand, studies on refinements of Young's inequality have made great progress in the papers [24–35]. In the present paper, we give some inequalities on Tsallis entropies by applying two types of inequalities obtained in [29, 32]. In addition, we give a generalized Han's inequality for the Tsallis entropy in the final section.
2 Tsallis quasilinear entropy and Tsallis quasilinear relative entropy
By analogy with (5), we may define the following entropy:
$$I_q^\psi(X) \equiv \ln_q \psi^{-1}\Biggl(\sum_{j=1}^n p_j\,\psi(1/p_j)\Biggr) \qquad (q \ge 0,\ q \ne 1),$$
where $\{p_1,\dots,p_n\}$ is a probability distribution with $p_j > 0$ for all $j = 1,\dots,n$.
We notice that if ψ does not depend on q, then $\lim_{q \to 1} I_q^\psi(X) = \log \psi^{-1}\bigl(\sum_{j=1}^n p_j\,\psi(1/p_j)\bigr)$, since $\lim_{q \to 1} \ln_q x = \log x$.
Proof We assume that ψ is an increasing function. Then we have $\psi(1/p_j) \ge \psi(1)$ from $1/p_j \ge 1$ for all $j = 1,\dots,n$. Thus, we have $\sum_{j=1}^n p_j\,\psi(1/p_j) \ge \psi(1)$, which implies $I_q^\psi(X) \ge \ln_q \psi^{-1}(\psi(1)) = \ln_q 1 = 0$, since $\psi^{-1}$ is also increasing and $\ln_q$ is increasing. For the case that ψ is a decreasing function, we can prove it similarly. □
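To make the reconstructed definition concrete: under the assumption that the Tsallis quasilinear entropy takes the form $I_q^\psi(X) = \ln_q \psi^{-1}\bigl(\sum_j p_j\,\psi(1/p_j)\bigr)$, the choice $\psi(x) = x^{1-q}$ recovers the Tsallis entropy $H_q(X)$. The following sketch (names ours) checks this numerically.

```python
# Sketch (ours): with psi(x) = x^(1-q), the Tsallis quasilinear entropy
# reduces to the Tsallis entropy H_q.
def ln_q(x, q):
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def tsallis_quasilinear(p, q, psi, psi_inv):
    return ln_q(psi_inv(sum(pj * psi(1.0 / pj) for pj in p)), q)

p, q = [0.5, 0.3, 0.2], 1.5
psi = lambda x: x ** (1.0 - q)
psi_inv = lambda y: y ** (1.0 / (1.0 - q))
H_q = (sum(pj ** q for pj in p) - 1.0) / (1.0 - q)   # Tsallis entropy
print(tsallis_quasilinear(p, q, psi, psi_inv), H_q)  # equal values
```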
From (16), we have the following proposition.
holds, which proves the present proposition. □
We give a sufficient condition for the nonnegativity of the Tsallis quasilinear relative entropy.
Proof We first assume that ψ is a concave increasing function. The concavity of ψ gives, by Jensen's inequality, $\sum_j p_j\,\psi(x_j) \le \psi\bigl(\sum_j p_j x_j\bigr)$, which is equivalent to $\psi^{-1}\bigl(\sum_j p_j\,\psi(x_j)\bigr) \le \sum_j p_j x_j$, since $\psi^{-1}$ is also increasing by the assumption. Applying this with the ratios of the two distributions, whose weighted sum equals 1, and using that $\ln_q$ is increasing with $\ln_q 1 = 0$, we obtain the nonnegativity. For the case that ψ is a convex decreasing function, we can similarly prove the nonnegativity of the Tsallis quasilinear relative entropy. □
for , .
for , .
We also find that (24) implies the monotonicity of the Rényi relative entropy.
which proves the statement. □
3 Inequalities for Tsallis quasilinear entropy and f-divergence
In this section, we give inequalities for the Tsallis quasilinear entropy and the f-divergence. For this purpose, we review the result obtained in  as one of the generalizations of the refined Young inequality.
Proposition 3.1 ()
for any $a, b > 0$ and any $\lambda \in [0, 1]$.
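The displayed inequalities of Proposition 3.1 are not fully recoverable here. As a representative refinement of Young's inequality of the type this section relies on, the sketch below (ours) numerically checks the Kittaneh-Manasrah bounds [26] and their reverse counterpart: $r(\sqrt a - \sqrt b)^2 \le \lambda a + (1-\lambda)b - a^\lambda b^{1-\lambda} \le R(\sqrt a - \sqrt b)^2$, with $r = \min\{\lambda, 1-\lambda\}$ and $R = \max\{\lambda, 1-\lambda\}$.

```python
# Sketch (ours): random checks of the Kittaneh-Manasrah refinement of
# Young's inequality and its reverse.
import math
import random

random.seed(0)
for _ in range(10000):
    a, b = random.uniform(0.01, 10.0), random.uniform(0.01, 10.0)
    lam = random.random()
    gap = lam * a + (1 - lam) * b - a ** lam * b ** (1 - lam)
    s = (math.sqrt(a) - math.sqrt(b)) ** 2
    assert min(lam, 1 - lam) * s - 1e-9 <= gap <= max(lam, 1 - lam) * s + 1e-9
print("Kittaneh-Manasrah bounds hold on all random samples")
```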
We have the following inequalities on the Tsallis quasilinear entropy and Tsallis entropy.
(which coincides with Theorem 3.3 in ). In the inequalities (29), we put and for any , then we obtain the statement. □
Proof Put in Theorem 3.2. □
Remark 3.4 Corollary 3.3 improves the well-known inequalities $0 \le H_q(X) \le \ln_q n$. If we take the limit $q \to 1$, the inequalities (30) recover Proposition 1 in .
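The refined bounds of Corollary 3.3 are not displayed here, but the classical bounds they improve, $0 \le H_q(X) \le \ln_q n$ (with equality on the right at the uniform distribution), are easy to check numerically; a sketch (ours):

```python
# Sketch (ours): the classical maximum-entropy bounds for the Tsallis entropy.
import random

def ln_q(x, q):
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def tsallis_entropy(p, q):
    return (sum(pj ** q for pj in p) - 1.0) / (1.0 - q)

random.seed(1)
q, n = 2.0, 5
for _ in range(1000):
    w = [random.random() for _ in range(n)]
    p = [wi / sum(w) for wi in w]
    assert -1e-12 <= tsallis_entropy(p, q) <= ln_q(n, q) + 1e-12
print(tsallis_entropy([1.0 / n] * n, q), ln_q(n, q))  # equal at uniform
```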
We also have the following inequalities.
we have the statement. □
Corollary 3.6 ()
4 Inequalities for Tsallis entropy
We first give Lagrange's identity [37], to establish an alternative generalization of the refined Young inequality.
Lemma 4.1 (Lagrange's identity) For real numbers $a_j, b_j$ $(j = 1, \dots, n)$,
$$\Biggl(\sum_{j=1}^n a_j^2\Biggr)\Biggl(\sum_{j=1}^n b_j^2\Biggr) - \Biggl(\sum_{j=1}^n a_j b_j\Biggr)^2 = \sum_{1 \le i < j \le n} (a_i b_j - a_j b_i)^2.$$
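Lagrange's identity is elementary to verify numerically; the sketch below (ours) checks it on random data.

```python
# Sketch (ours): numerical verification of Lagrange's identity (Lemma 4.1).
import random

random.seed(2)
n = 6
a = [random.uniform(-5.0, 5.0) for _ in range(n)]
b = [random.uniform(-5.0, 5.0) for _ in range(n)]
lhs = (sum(x * x for x in a) * sum(y * y for y in b)
       - sum(x * y for x, y in zip(a, b)) ** 2)
rhs = sum((a[i] * b[j] - a[j] * b[i]) ** 2
          for i in range(n) for j in range(i + 1, n))
print(abs(lhs - rhs) < 1e-9)  # True
```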
where $p_j \ge 0$ with $\sum_{j=1}^n p_j = 1$ and $x_j > 0$ for all $j = 1, \dots, n$.
In the above calculations, we used Lemma 4.1. Thus, we have proved the first part of the inequalities. Similarly, one can prove the second part of the inequalities by using the function defined by . We omit the details. □
This concludes the proof. □
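Although the precise constants of the generalization above are not shown here, its two-variable ancestor, the Cartwright-Field inequality [38], can be stated and checked directly; the following sketch (ours) does so on random samples.

```python
# Sketch (ours): random checks of the Cartwright-Field inequality,
#   l(1-l)/2 * (a-b)^2 / max{a,b} <= l*a + (1-l)*b - a^l * b^(1-l)
#                                 <= l(1-l)/2 * (a-b)^2 / min{a,b}.
import random

random.seed(3)
for _ in range(10000):
    a, b = random.uniform(0.01, 10.0), random.uniform(0.01, 10.0)
    lam = random.random()
    gap = lam * a + (1 - lam) * b - a ** lam * b ** (1 - lam)
    c = 0.5 * lam * (1 - lam) * (a - b) ** 2
    assert c / max(a, b) - 1e-9 <= gap <= c / min(a, b) + 1e-9
print("Cartwright-Field bounds hold on all random samples")
```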
where for all and , and .
We also have the following inequalities for the Tsallis entropy.
where and are positive numbers depending on the parameter and satisfying and for all .
From the inequalities (39) and (40), we have the statement. □
Remark 4.7 The first part of the inequalities (40) gives another improvement of the well-known inequalities .
where and are positive numbers satisfying and for all .
Proof Take the limit $q \to 1$ in Theorem 4.6. □
Using the inequality (42), we derive the following result.
Proof In the inequality (42), we put and which satisfy . Then we have the present proposition. □
5 A generalized Han’s inequality
In order to state our result, we give the definitions of the Tsallis conditional entropy and the Tsallis joint entropy.
We summarize briefly the following chain rules representing relations between the Tsallis conditional entropy and the Tsallis joint entropy.
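As a concrete illustration of the chain rule $H_q(X,Y) = H_q(X) + H_q(Y|X)$, and assuming the definitions of [16] with $H_q(Y|X) \equiv \sum_x p(x)^q H_q(Y|X=x)$, the following sketch (ours) verifies it on a small joint distribution.

```python
# Sketch (ours): Tsallis chain rule H_q(X,Y) = H_q(X) + H_q(Y|X), assuming
# the conditional Tsallis entropy of [16].
def tsallis_entropy(p, q):
    return (sum(v ** q for v in p) - 1.0) / (1.0 - q)

joint = [[0.2, 0.1], [0.15, 0.25], [0.05, 0.25]]  # joint[i][j] = p(X=i, Y=j)
q = 2.0
px = [sum(row) for row in joint]                  # marginal of X
H_joint = tsallis_entropy([v for row in joint for v in row], q)
H_x = tsallis_entropy(px, q)
H_y_given_x = sum(px[i] ** q * tsallis_entropy([v / px[i] for v in joint[i]], q)
                  for i in range(len(joint)))
print(H_joint, H_x + H_y_given_x)  # equal values
```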
Proposition 5.2 implies the following propositions.
Proposition 5.3 ()
Consequently, we have the following self-bounding property of the Tsallis joint entropy.
Theorem 5.5 (Generalized Han’s inequality)
due to Proposition 5.3. Therefore, we have the present theorem. □
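A numerical sanity check of the generalized Han inequality in the form $(n-1)H_q(X_1,\dots,X_n) \le \sum_{i=1}^n H_q(X_1,\dots,X_{i-1},X_{i+1},\dots,X_n)$; we assume the Tsallis range $q \ge 1$, consistent with the use of Proposition 5.3 above. The sketch (ours) tests random joint distributions of three binary variables at $q = 2$.

```python
# Sketch (ours): random checks of the generalized Han inequality for q = 2.
import itertools
import random

def tsallis_entropy(p, q):
    return (sum(v ** q for v in p) - 1.0) / (1.0 - q)

random.seed(4)
q, n = 2.0, 3
for _ in range(1000):
    w = {k: random.random() for k in itertools.product((0, 1), repeat=n)}
    z = sum(w.values())
    p = {k: v / z for k, v in w.items()}           # random joint distribution
    full = tsallis_entropy(list(p.values()), q)
    dropped_sum = 0.0
    for i in range(n):                             # marginalize out variable i
        marginal = {}
        for k, v in p.items():
            key = k[:i] + k[i + 1:]
            marginal[key] = marginal.get(key, 0.0) + v
        dropped_sum += tsallis_entropy(list(marginal.values()), q)
    assert (n - 1) * full <= dropped_sum + 1e-9
print("generalized Han inequality holds on all random samples")
```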
6 Conclusion
We gave an improvement of Young's inequality for scalars. Using this result, we derived several inequalities on generalized entropies involving Tsallis entropies. We also provided a generalized Han's inequality, based on the Tsallis conditional entropy and the Tsallis joint entropy.
Acknowledgements
We would like to thank the anonymous reviewer for providing valuable comments to improve the manuscript. The author SF was supported in part by the Japanese Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Encouragement of Young Scientists (B), 20740067. The author NM was supported in part by the Romanian Ministry of Education, Research and Innovation through the PNII Idei project 842/2008. The author FCM was supported by CNCSIS Grant 420/2008.
References
1. Aczél J, Daróczy Z: On Measures of Information and Their Characterizations. Academic Press, San Diego; 1975.
2. Furuichi S: Tsallis entropies and their theorems, properties and applications. In Aspects of Optical Sciences and Quantum Information. Edited by Abdel-Aty M. Research Signpost, Trivandrum; 2007:1–86.
3. Rényi A: On measures of entropy and information. In Proc. 4th Berkeley Symp. on Mathematical Statistics and Probability, vol. 1. University of California Press, Berkeley; 1961:547–561.
4. Tsallis C: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52: 479–487. doi:10.1007/BF01016429
5. Tsallis C, et al.: Nonextensive Statistical Mechanics and Its Applications. Edited by Abe S, Okamoto Y. Springer, Berlin; 2001. See also the comprehensive list of references at http://tsallis.cat.cbpf.br/biblio.htm
6. Tsallis C: Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World. Springer, Berlin; 2009.
7. Tsallis C: Entropy. In Encyclopedia of Complexity and Systems Science. Springer, Berlin; 2009.
8. Sebawe Abdalla M, Thabet L: Nonclassical properties of a model for modulated damping under the action of an external force. Appl. Math. Inf. Sci. 2011, 5: 570–588.
9. El-Barakaty A, Darwish M, Obada A-SF: Purity loss for a Cooper pair box interacting dispersively with a nonclassical field under phase damping. Appl. Math. Inf. Sci. 2011, 5: 122–131.
10. Sun L-H, Li G-X, Ficek Z: Continuous variables approach to entanglement creation and processing. Appl. Math. Inf. Sci. 2010, 4: 315–339.
11. Ficek Z: Quantum entanglement processing with atoms. Appl. Math. Inf. Sci. 2009, 3: 375–393.
12. Furuichi S: A note on a parametrically extended entanglement-measure due to Tsallis relative entropy. Information 2006, 9: 837–844.
13. Furuichi S, Yanagi K, Kuriyama K: Fundamental properties of Tsallis relative entropy. J. Math. Phys. 2004, 45: 4868–4877. doi:10.1063/1.1805729
14. Furuichi S: On uniqueness theorems for Tsallis entropy and Tsallis relative entropy. IEEE Trans. Inf. Theory 2005, 51: 3638–3645.
15. Furuichi S: An axiomatic characterization of a two-parameter extended relative entropy. J. Math. Phys. 2010, 51: Article ID 123302.
16. Furuichi S: Information theoretical properties of Tsallis entropies. J. Math. Phys. 2006, 47: Article ID 023302.
17. Furuichi S: Matrix trace inequalities on Tsallis entropies. J. Inequal. Pure Appl. Math. 2008, 9(1): Article ID 1.
18. Furuichi S: On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics. J. Math. Phys. 2009, 50: Article ID 013303.
19. Furuichi S, Mitroi F-C: Mathematical inequalities for some divergences. Physica A 2012, 391: 388–400. doi:10.1016/j.physa.2011.07.052
20. Furuichi S: Inequalities for Tsallis relative entropy and generalized skew information. Linear Multilinear Algebra 2012, 59: 1143–1158.
21. Csiszár I: Axiomatic characterizations of information measures. Entropy 2008, 10: 261–273. doi:10.3390/e10030261
22. Csiszár I: Information measures: a critical survey. In Transactions of the Seventh Prague Conference on Information Theory, Statistical Decision Functions, Random Processes. Reidel, Dordrecht; 1978:73–86.
23. Csiszár I, Shields PC: Information theory and statistics: a tutorial. Found. Trends Commun. Inf. Theory 2004, 1: 417–528. doi:10.1561/0100000004
24. Bobylev NA, Krasnoselsky MA: Extremum analysis (degenerate cases). Preprint, Moscow (1981), 52 pages (in Russian).
25. Dragomir S: Bounds for the normalised Jensen functional. Bull. Aust. Math. Soc. 2006, 74: 471–478. doi:10.1017/S000497270004051X
26. Kittaneh F, Manasrah Y: Improved Young and Heinz inequalities for matrices. J. Math. Anal. Appl. 2010, 361: 262–269.
27. Aldaz JM: Self-improvement of the inequality between arithmetic and geometric means. J. Math. Inequal. 2009, 3: 213–216.
28. Aldaz JM: Comparison of differences between arithmetic and geometric means. Tamkang J. Math. 2011, 42: 445–451.
29. Mitroi FC: About the precision in Jensen-Steffensen inequality. An. Univ. Craiova, Ser. Math. Comput. Sci. 2010, 37: 73–84.
30. Furuichi S: On refined Young inequalities and reverse inequalities. J. Math. Inequal. 2011, 5: 21–31.
31. Furuichi S: Refined Young inequalities with Specht's ratio. J. Egypt. Math. Soc. 2012, 20: 46–49. doi:10.1016/j.joems.2011.12.010
32. Minculete N: A result about Young inequality and several applications. Sci. Magna 2011, 7: 61–68.
33. Minculete N: A refinement of the Kittaneh-Manasrah inequality. Creat. Math. Inform. 2011, 20: 157–162.
34. Furuichi S, Minculete N: Alternative reverse inequalities for Young's inequality. J. Math. Inequal. 2011, 5: 595–600.
35. Minculete N, Furuichi S: Several applications of Cartwright-Field's inequality. Int. J. Pure Appl. Math. 2011, 71: 19–30.
36. Masi M: A step beyond Tsallis and Rényi entropies. Phys. Lett. A 2005, 338: 217–224. doi:10.1016/j.physleta.2005.01.094
37. Weisstein EW: CRC Concise Encyclopedia of Mathematics. 2nd edition. CRC Press, Boca Raton; 2003.
38. Cartwright DI, Field MJ: A refinement of the arithmetic mean-geometric mean inequality. Proc. Am. Math. Soc. 1978, 71: 36–38. doi:10.1090/S0002-9939-1978-0476971-2
39. Cover TM, Thomas JA: Elements of Information Theory. 2nd edition. Wiley, New York; 2006.
40. Daróczy Z: General information functions. Inf. Control 1970, 16: 36–51. doi:10.1016/S0019-9958(70)80040-7
41. Han T: Nonnegative entropy measures of multivariate symmetric correlations. Inf. Control 1978, 36: 133–156. doi:10.1016/S0019-9958(78)90275-9
42. Boucheron S, Lugosi G, Bousquet O: Concentration inequalities. In Advanced Lectures on Machine Learning. Springer, Berlin; 2003:208–240.
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.