
# Some inequalities on generalized entropies

Shigeru Furuichi^{1}, Nicuşor Minculete^{2} and Flavia-Corina Mitroi^{3}

*Journal of Inequalities and Applications* **2012**:226

https://doi.org/10.1186/1029-242X-2012-226

© Furuichi et al.; licensee Springer 2012

**Received:** 18 February 2012 · **Accepted:** 25 September 2012 · **Published:** 9 October 2012

## Abstract

We give several inequalities on generalized entropies involving Tsallis entropies, using some inequalities obtained by the improvements of Young’s inequality. We also give a generalized Han’s inequality.

**MSC:**26D15, 94A17.

## Keywords

- refined Young’s inequality
- Tsallis entropy
- *f*-divergence
- quasilinear entropy
- Han’s inequality

## 1 Introduction

Generalized entropies have been studied by many researchers (we refer the interested reader to [1, 2]). The Rényi [3] and Tsallis [4] entropies are well known as one-parameter generalizations of Shannon's entropy, and have been intensively studied not only in classical statistical physics [5–7], but also in quantum physics in relation to entanglement [8–11]. The Tsallis entropy is a natural one-parameter extension of the Shannon entropy, hence it can be applied to known models which describe systems of great interest in atomic physics [12]. However, to the best of our knowledge, the physical relevance of the parameter of the Tsallis entropy has been highly debated and has not been completely clarified yet; the parameter is usually regarded as a measure of the non-extensivity of the system under consideration. One of the authors of the present paper has studied the Tsallis entropy and the Tsallis relative entropy from a mathematical point of view. Firstly, fundamental properties of the Tsallis relative entropy were discussed in [13]. The uniqueness theorem for the Tsallis entropy and Tsallis relative entropy was studied in [14]. Following this result, an axiomatic characterization of a two-parameter extended relative entropy was given in [15]. In [16], information theoretical properties of the Tsallis entropy and some inequalities for conditional and joint Tsallis entropies were derived; these entropies are used again in the present paper to derive the generalized Han's inequality. In [17], matrix trace inequalities for the Tsallis entropy were studied, and in [18], the maximum entropy principle for the Tsallis entropy and the minimization of the Fisher information in Tsallis statistics were studied. Quite recently, we provided mathematical inequalities for some divergences in [19], considering that the study of such inequalities is important for the development of new entropies.
In this paper, we define a further generalized entropy based on the Tsallis and Rényi entropies and study its mathematical properties by the use of scalar inequalities, in order to develop the theory of entropies.

For a continuous and strictly monotonic function *ψ* on an interval *I*, the weighted quasilinear mean is defined by $M_{\psi}({x}_{1},{x}_{2},\dots ,{x}_{n})\equiv {\psi}^{-1}({\sum}_{j=1}^{n}{p}_{j}\psi ({x}_{j}))$, where ${\sum}_{j=1}^{n}{p}_{j}=1$, ${p}_{j}>0$, ${x}_{j}\in I$ for $j=1,2,\dots ,n$ and $n\in \mathbb{N}$. If we take $\psi (x)=x$, then ${M}_{\psi}({x}_{1},{x}_{2},\dots ,{x}_{n})$ coincides with the weighted arithmetic mean $A({x}_{1},{x}_{2},\dots ,{x}_{n})\equiv {\sum}_{j=1}^{n}{p}_{j}{x}_{j}$. If instead we take $\psi (x)=log(x)$, then ${M}_{\psi}({x}_{1},{x}_{2},\dots ,{x}_{n})$ coincides with the weighted geometric mean $G({x}_{1},{x}_{2},\dots ,{x}_{n})\equiv {\prod}_{j=1}^{n}{x}_{j}^{{p}_{j}}$.
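The two special cases above can be spot-checked numerically. The following Python sketch (the helper name `quasilinear_mean` is ours, not from [1]) verifies that $\psi(x)=x$ and $\psi(x)=log(x)$ recover the weighted arithmetic and geometric means:

```python
import math

def quasilinear_mean(psi, psi_inv, p, x):
    # M_psi(x) = psi^{-1}( sum_j p_j * psi(x_j) )
    return psi_inv(sum(pj * psi(xj) for pj, xj in zip(p, x)))

p = [0.2, 0.3, 0.5]
x = [1.0, 4.0, 9.0]

arithmetic = quasilinear_mean(lambda t: t, lambda t: t, p, x)
geometric = quasilinear_mean(math.log, math.exp, p, x)

assert abs(arithmetic - sum(pj * xj for pj, xj in zip(p, x))) < 1e-12
assert abs(geometric - math.prod(xj ** pj for pj, xj in zip(p, x))) < 1e-12
```

As expected from the AM-GM inequality, the arithmetic value dominates the geometric one on any such example.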

The Tsallis entropy [4] is defined for $q\ge 0$, $q\ne 1$ by ${H}_{q}({p}_{1},{p}_{2},\dots ,{p}_{n})\equiv -{\sum}_{j=1}^{n}{p}_{j}^{q}{ln}_{q}{p}_{j}$, where the *q*-logarithmic function for $x>0$ is defined by ${ln}_{q}(x)\equiv \frac{{x}^{1-q}-1}{1-q}$, which uniformly converges to the usual logarithmic function $log(x)$ in the limit $q\to 1$. Therefore, the Tsallis entropy converges to the Shannon entropy in the limit $q\to 1$:
$${lim}_{q\to 1}{H}_{q}({p}_{1},{p}_{2},\dots ,{p}_{n})=-{\sum}_{j=1}^{n}{p}_{j}log{p}_{j}\equiv {H}_{1}({p}_{1},{p}_{2},\dots ,{p}_{n}).$$
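The convergence of the Tsallis entropy to the Shannon entropy can be checked numerically. A Python sketch (helper names `ln_q` and `tsallis_entropy` are ours), using the equivalent form $H_q(p)=\sum_j p_j\,{ln}_q(1/p_j)$:

```python
import math

def ln_q(x, q):
    # q-logarithm; reduces to log(x) as q -> 1
    return math.log(x) if q == 1 else (x ** (1 - q) - 1) / (1 - q)

def tsallis_entropy(p, q):
    # H_q(p) = -sum_j p_j^q ln_q(p_j) = sum_j p_j ln_q(1/p_j)
    return -sum(pj ** q * ln_q(pj, q) for pj in p)

p = [0.1, 0.2, 0.3, 0.4]
shannon = -sum(pj * math.log(pj) for pj in p)

# H_q approaches the Shannon entropy as q -> 1 from either side
assert abs(tsallis_entropy(p, 1.0001) - shannon) < 1e-3
assert abs(tsallis_entropy(p, 0.9999) - shannon) < 1e-3
```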

For a continuous and strictly monotonic function *ϕ* on $(0,1]$, the quasilinear entropy is given by

where *ψ* is a continuous and strictly monotonic function on $(0,\mathrm{\infty})$. If we take $\psi (x)=log(x)$ in (5), we have ${I}_{1}^{log}({p}_{1},{p}_{2},\dots ,{p}_{n})={H}_{1}({p}_{1},{p}_{2},\dots ,{p}_{n})$. The case $\psi (x)={x}^{1-q}$ is also useful in practice, since we recapture the Rényi entropy, namely ${I}_{1}^{{x}^{1-q}}({p}_{1},{p}_{2},\dots ,{p}_{n})={R}_{q}({p}_{1},{p}_{2},\dots ,{p}_{n})$, where the Rényi entropy [3] is defined by
$${R}_{q}({p}_{1},{p}_{2},\dots ,{p}_{n})\equiv \frac{1}{1-q}log{\sum}_{j=1}^{n}{p}_{j}^{q}\phantom{\rule{1em}{0ex}}(q\ge 0,q\ne 1).$$
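Assuming the standard closed form of the quasilinear entropy, $I_1^{\psi}(p)=log\,\psi^{-1}({\sum}_{j}p_j\psi(1/p_j))$ (the displayed equation (5) is not reproduced above, so this is our assumption), the identity $I_1^{x^{1-q}}=R_q$ can be verified numerically:

```python
import math

q = 0.5
p = [0.1, 0.2, 0.3, 0.4]

# quasilinear entropy with psi(x) = x^{1-q}:
# I_1^psi(p) = log( psi^{-1}( sum_j p_j * psi(1/p_j) ) )
psi = lambda x: x ** (1 - q)
psi_inv = lambda y: y ** (1 / (1 - q))
quasilinear = math.log(psi_inv(sum(pj * psi(1 / pj) for pj in p)))

# Renyi entropy R_q(p) = (1/(1-q)) * log( sum_j p_j^q )
renyi = math.log(sum(pj ** q for pj in p)) / (1 - q)

assert abs(quasilinear - renyi) < 1e-12
```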

From the viewpoint of application to source coding, the relation between the weighted quasilinear mean and the Rényi entropy has been studied in Chapter 5 of [1] in the following way.

**Theorem A** ([1]) *For all real numbers* $q>0$ *and integers* $D>1$, *there exists a code* $({x}_{1},{x}_{2},\dots ,{x}_{n})$ *such that*

*where the exponential function* ${D}^{\frac{1-q}{q}x}$ *is defined on* $[1,\mathrm{\infty})$.

Motivated by the above results and by recent advances in Tsallis entropy theory, we investigate mathematical results for generalized entropies involving Tsallis entropies and quasilinear entropies, using some inequalities obtained as improvements of Young's inequality.

**Definition 1.1** For a continuous and strictly monotonic function *ψ* on $(0,\mathrm{\infty})$ and two probability distributions $\{{p}_{1},{p}_{2},\dots ,{p}_{n}\}$ and $\{{r}_{1},{r}_{2},\dots ,{r}_{n}\}$ with ${p}_{j}>0$, ${r}_{j}>0$ for all $j=1,2,\dots ,n$, the quasilinear relative entropy is defined by

*i.e.*,

These relative entropies are special cases of the *f*-divergence [21–23], defined for a convex function *f* on $(0,\mathrm{\infty})$ with $f(1)=0$ by

In particular, we recapture the Tsallis relative entropy by defining *f* by

One may also define the *f*-divergence for *incomplete* probability distributions $\{{a}_{1},{a}_{2},\dots ,{a}_{n}\}$ and $\{{b}_{1},{b}_{2},\dots ,{b}_{n}\}$, where ${a}_{i}>0$ and ${b}_{i}>0$, in the following way:
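As an illustration, and assuming the common convention ${D}_{f}(p\parallel r)={\sum}_{j}{r}_{j}f({p}_{j}/{r}_{j})$ for the *f*-divergence (the displayed definition is not reproduced above), the convex choice $f(x)=xlogx$ with $f(1)=0$ recovers the Kullback-Leibler divergence:

```python
import math

def f_divergence(f, p, r):
    # common convention: D_f(p || r) = sum_j r_j * f(p_j / r_j)
    return sum(rj * f(pj / rj) for pj, rj in zip(p, r))

p = [0.1, 0.2, 0.3, 0.4]
r = [0.25, 0.25, 0.25, 0.25]

# f(x) = x*log(x) is convex with f(1) = 0 and recovers the KL divergence
kl = f_divergence(lambda x: x * math.log(x), p, r)
assert abs(kl - sum(pj * math.log(pj / rj) for pj, rj in zip(p, r))) < 1e-12
assert kl >= 0  # nonnegativity, by Jensen's inequality and f(1) = 0
```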

On the other hand, studies on refinements of Young's inequality have made great progress in the papers [24–35]. In the present paper, we give some inequalities on Tsallis entropies by applying two types of inequalities obtained in [29, 32]. In addition, we give the generalized Han's inequality for the Tsallis entropy in the final section.

## 2 Tsallis quasilinear entropy and Tsallis quasilinear relative entropy

By analogy with (5), we may define the following entropy.

**Definition 2.1** For a continuous and strictly monotonic function *ψ* on $(0,\mathrm{\infty})$ and $q\ge 0$ with $q\ne 1$, the Tsallis quasilinear entropy (*q*-quasilinear entropy) is defined by

where $\{{p}_{1},{p}_{2},\dots ,{p}_{n}\}$ is a probability distribution with ${p}_{j}>0$ for all $j=1,2,\dots ,n$.

We notice that if *ψ* does not depend on *q*, then ${lim}_{q\to 1}{I}_{q}^{\psi}({p}_{1},{p}_{2},\dots ,{p}_{n})={I}_{1}^{\psi}({p}_{1},{p}_{2},\dots ,{p}_{n})$.

We define the *q*-exponential function as the inverse function of the *q*-logarithmic function by ${exp}_{q}(x)\equiv {\{1+(1-q)x\}}^{1/(1-q)}$ if $1+(1-q)x>0$; otherwise it is undefined. If we take $\psi (x)={ln}_{q}(x)$, then we have ${I}_{q}^{{ln}_{q}}({p}_{1},{p}_{2},\dots ,{p}_{n})={H}_{q}({p}_{1},{p}_{2},\dots ,{p}_{n})$. Furthermore, we have

**Proposition 2.2** *The Tsallis quasilinear entropy is nonnegative*:

*Proof* We assume that *ψ* is an increasing function. Then we have $\psi (\frac{1}{{p}_{j}})\ge \psi (1)$ from $\frac{1}{{p}_{j}}\ge 1$ for ${p}_{j}>0$ for all $j=1,2,\dots ,n$. Thus, we have ${\sum}_{j=1}^{n}{p}_{j}\psi (\frac{1}{{p}_{j}})\ge \psi (1)$ which implies ${\psi}^{-1}({\sum}_{j=1}^{n}{p}_{j}\psi (\frac{1}{{p}_{j}}))\ge 1$, since ${\psi}^{-1}$ is also increasing. For the case that *ψ* is a decreasing function, we can prove it similarly. □
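Proposition 2.2 can be spot-checked numerically. The following sketch assumes the closed form ${I}_{q}^{\psi}(p)={ln}_{q}{\psi}^{-1}({\sum}_{j}{p}_{j}\psi (1/{p}_{j}))$ (consistent with the case $\psi={ln}_{q}$ recovering ${H}_{q}$ above); the helper names are ours:

```python
import math

def ln_q(x, q):
    return math.log(x) if q == 1 else (x ** (1 - q) - 1) / (1 - q)

def exp_q(x, q):
    # inverse of ln_q, defined when 1 + (1-q)x > 0
    return math.exp(x) if q == 1 else (1 + (1 - q) * x) ** (1 / (1 - q))

def tsallis_quasilinear(p, q, psi, psi_inv):
    # assumed closed form: I_q^psi(p) = ln_q( psi^{-1}( sum_j p_j psi(1/p_j) ) )
    return ln_q(psi_inv(sum(pj * psi(1 / pj) for pj in p)), q)

p = [0.1, 0.2, 0.3, 0.4]
q = 1.5

# psi = ln_q recovers the Tsallis entropy H_q
via_psi = tsallis_quasilinear(p, q, lambda x: ln_q(x, q), lambda y: exp_q(y, q))
h_q = sum(pj * ln_q(1 / pj, q) for pj in p)
assert abs(via_psi - h_q) < 1e-12

# nonnegativity (Proposition 2.2) for another increasing psi, e.g. psi(x) = x**3
assert tsallis_quasilinear(p, q, lambda x: x ** 3, lambda y: y ** (1 / 3)) >= 0
```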

The *q*-exponential function gives us the following connection between the Rényi entropy and the Tsallis entropy [36]:

From (16), we have the following proposition.

**Proposition 2.3** *Let* $\mathcal{A}\equiv \{{\mathcal{A}}_{i}:i=1,2,\dots ,k\}$ *be a partition of* $\{1,2,\dots ,n\}$ *and put* ${p}_{i}^{\mathcal{A}}\equiv {\sum}_{j\in {\mathcal{A}}_{i}}{p}_{j}$. *Then we have*

*Proof* We use the generalized Shannon additivity (which is often called *q*-additivity) for the Tsallis entropy (see [14] for example):

holds, which proves the present proposition. □

**Definition 2.4** For a continuous and strictly monotonic function *ψ* on $(0,\mathrm{\infty})$ and two probability distributions $\{{p}_{1},{p}_{2},\dots ,{p}_{n}\}$ and $\{{r}_{1},{r}_{2},\dots ,{r}_{n}\}$ with ${p}_{j}>0$, ${r}_{j}>0$ for all $j=1,2,\dots ,n$, the Tsallis quasilinear relative entropy is defined by

We give a sufficient condition for the nonnegativity of the Tsallis quasilinear relative entropy.

**Proposition 2.5** *If* *ψ* *is a concave increasing function or a convex decreasing function*, *then the Tsallis quasilinear relative entropy is nonnegative*:

*Proof* We first assume that *ψ* is a concave increasing function. The concavity of *ψ* shows that $\psi ({\sum}_{j=1}^{n}{p}_{j}\frac{{r}_{j}}{{p}_{j}})\ge {\sum}_{j=1}^{n}{p}_{j}\psi (\frac{{r}_{j}}{{p}_{j}})$, which is equivalent to $\psi (1)\ge {\sum}_{j=1}^{n}{p}_{j}\psi (\frac{{r}_{j}}{{p}_{j}})$. From the assumption, ${\psi}^{-1}$ is also increasing, so we have $1\ge {\psi}^{-1}({\sum}_{j=1}^{n}{p}_{j}\psi (\frac{{r}_{j}}{{p}_{j}}))$. Therefore, $-{ln}_{q}{\psi}^{-1}({\sum}_{j=1}^{n}{p}_{j}\psi (\frac{{r}_{j}}{{p}_{j}}))\ge 0$, since ${ln}_{q}x$ is increasing and ${ln}_{q}(1)=0$. For the case that *ψ* is a convex decreasing function, the nonnegativity of the Tsallis quasilinear relative entropy can be proved similarly. □
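Numerically, with the concave increasing choice $\psi={ln}_{q}$, and assuming the closed form ${D}_{q}^{\psi}(p\parallel r)=-{ln}_{q}{\psi}^{-1}({\sum}_{j}{p}_{j}\psi ({r}_{j}/{p}_{j}))$ used in the proof above (the displayed definition is not reproduced), nonnegativity can be checked as follows; the helper names are ours:

```python
import math

def ln_q(x, q):
    return math.log(x) if q == 1 else (x ** (1 - q) - 1) / (1 - q)

def exp_q(x, q):
    return math.exp(x) if q == 1 else (1 + (1 - q) * x) ** (1 / (1 - q))

def tsallis_quasilinear_relative(p, r, q, psi, psi_inv):
    # assumed form: D_q^psi(p||r) = -ln_q( psi^{-1}( sum_j p_j psi(r_j/p_j) ) )
    return -ln_q(psi_inv(sum(pj * psi(rj / pj) for pj, rj in zip(p, r))), q)

p = [0.1, 0.2, 0.3, 0.4]
r = [0.25, 0.25, 0.25, 0.25]
q = 0.8

# psi = ln_q is concave and increasing, so Proposition 2.5 applies
d = tsallis_quasilinear_relative(p, r, q,
                                 lambda x: ln_q(x, q),
                                 lambda y: exp_q(y, q))
assert d >= 0
```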

**Remark 2.6**The following two functions satisfy the sufficient condition in the above proposition.

- (i) $\psi (x)={ln}_{q}x$ for $q\ge 0$, $q\ne 1$.
- (ii) $\psi (x)={x}^{1-q}$ for $q\ge 0$, $q\ne 1$.

We also find that (24) implies the monotonicity of the Rényi relative entropy.

**Proposition 2.7** *Under the same assumptions as in Proposition* 2.3, *with* ${r}_{i}^{\mathcal{A}}\equiv {\sum}_{j\in {\mathcal{A}}_{i}}{r}_{j}$, *we have*

*Proof* We recall that the Tsallis relative entropy is a special case of the *f*-divergence, so it shares the properties of the *f*-divergence. Since ${exp}_{2-q}$ is a monotone increasing function for $0\le q\le 2$ and the *f*-divergence has a monotonicity [21, 23], we have

which proves the statement. □

## 3 Inequalities for Tsallis quasilinear entropy and *f*-divergence

In this section, we give inequalities for the Tsallis quasilinear entropy and the *f*-divergence. For this purpose, we review a result obtained in [29] as one of the generalizations of the refined Young inequality.

**Proposition 3.1** ([29]) *For two probability vectors* $\mathbf{p}=\{{p}_{1},{p}_{2},\dots ,{p}_{n}\}$ *and* $\mathbf{r}=\{{r}_{1},{r}_{2},\dots ,{r}_{n}\}$ *such that* ${p}_{j}>0$, ${r}_{j}>0$, ${\sum}_{j=1}^{n}{p}_{j}={\sum}_{j=1}^{n}{r}_{j}=1$, *and* $\mathbf{x}=\{{x}_{1},{x}_{2},\dots ,{x}_{n}\}$ *such that* ${x}_{i}\ge 0$, *we have*

*where*

*for a continuous increasing function* $\psi :I\to I$ *and a function* $f:I\to J$ *such that*

*for any* $a,b\in I$ *and any* $\lambda \in [0,1]$.

We have the following inequalities on the Tsallis quasilinear entropy and Tsallis entropy.

**Theorem 3.2** *For* $q\ge 0$, *a continuous and strictly monotonic function* *ψ* *on* $(0,\mathrm{\infty})$, *and a probability distribution* $\{{r}_{1},{r}_{2},\dots ,{r}_{n}\}$ *with* ${r}_{j}>0$ *for all* $j=1,2,\dots ,n$, *we have*

*Proof*If we take the uniform distribution $\mathbf{p}=\{\frac{1}{n},\dots ,\frac{1}{n}\}\equiv \mathbf{u}$ in Proposition 3.1, then we have

(which coincides with Theorem 3.3 in [29]). Putting $f(x)=-{ln}_{q}(x)$ and ${x}_{j}=\frac{1}{{r}_{j}}$ for any $j=1,2,\dots ,n$ in the inequalities (29), we obtain the statement. □

**Corollary 3.3** *For* $q\ge 0$ *and a probability distribution* $\{{r}_{1},{r}_{2},\dots ,{r}_{n}\}$ *with* ${r}_{j}>0$ *for all* $j=1,2,\dots ,n$, *we have*

*Proof* Put $\psi (x)=x$ in Theorem 3.2. □

**Remark 3.4** Corollary 3.3 improves the well-known inequalities $0\le {H}_{q}({r}_{1},{r}_{2},\dots ,{r}_{n})\le {ln}_{q}n$. If we take the limit $q\to 1$, the inequalities (30) recover Proposition 1 in [25].
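The well-known bounds $0\le {H}_{q}\le {ln}_{q}n$ mentioned in Remark 3.4 can be spot-checked numerically (a sketch with our own helper names), including the equality case at the uniform distribution:

```python
import math

def ln_q(x, q):
    return math.log(x) if q == 1 else (x ** (1 - q) - 1) / (1 - q)

def tsallis_entropy(p, q):
    # H_q(p) = sum_j p_j ln_q(1/p_j)
    return sum(pj * ln_q(1 / pj, q) for pj in p)

p = [0.05, 0.15, 0.35, 0.45]
n = len(p)
for q in (0.5, 1.0, 2.0):
    # 0 <= H_q(p) <= ln_q(n) ...
    assert 0 <= tsallis_entropy(p, q) <= ln_q(n, q)
    # ... with equality on the right at the uniform distribution
    u = [1 / n] * n
    assert abs(tsallis_entropy(u, q) - ln_q(n, q)) < 1e-12
```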

We also have the following inequalities.

**Theorem 3.5** *For two probability distributions* $\mathbf{p}=\{{p}_{1},{p}_{2},\dots ,{p}_{n}\}$ *and* $\mathbf{r}=\{{r}_{1},{r}_{2},\dots ,{r}_{n}\}$, *and an incomplete probability distribution* $\mathbf{t}=\{{t}_{1},{t}_{2},\dots ,{t}_{n}\}$ *with* ${t}_{j}\equiv \frac{{p}_{j}^{2}}{{r}_{j}}$, *we have*

*Proof*Put ${x}_{j}=\frac{{p}_{j}}{{r}_{j}}$ in Proposition 3.1 with $\psi (x)=x$. Since we have the relation

we have the statement. □

**Corollary 3.6** ([25]) *Under the same assumptions as in Theorem* 3.5, *we have*

*Proof*If we take $f(x)=-log(x)$ in Theorem 3.5, then we have

□

## 4 Inequalities for Tsallis entropy

We first state Lagrange's identity [37], which we use to establish an alternative generalization of the refined Young inequality.

**Lemma 4.1** (Lagrange's identity) *For two vectors* $\{{a}_{1},{a}_{2},\dots ,{a}_{n}\}$ *and* $\{{b}_{1},{b}_{2},\dots ,{b}_{n}\}$, *we have*
$$\left({\sum}_{j=1}^{n}{a}_{j}^{2}\right)\left({\sum}_{j=1}^{n}{b}_{j}^{2}\right)-{\left({\sum}_{j=1}^{n}{a}_{j}{b}_{j}\right)}^{2}={\sum}_{1\le i<j\le n}{({a}_{i}{b}_{j}-{a}_{j}{b}_{i})}^{2}.$$

**Theorem 4.2** *Let* $f:I\to \mathbb{R}$ *be a twice differentiable function such that there exist real constants* *m* *and* *M* *with* $0\le m\le {f}^{\mathrm{\prime}\mathrm{\prime}}(x)\le M$ *for any* $x\in I$. *Then we have*

*where* ${p}_{j}>0$ *with* ${\sum}_{j=1}^{n}{p}_{j}=1$ *and* ${x}_{j}\in I$ *for all* $j=1,2,\dots ,n$.

*Proof* We consider the function $g:I\to \mathbb{R}$ defined by $g(x)\equiv f(x)-\frac{m}{2}{x}^{2}$. Since ${g}^{\mathrm{\prime}\mathrm{\prime}}(x)={f}^{\mathrm{\prime}\mathrm{\prime}}(x)-m\ge 0$, *g* is a convex function. Applying Jensen's inequality, we thus have

In the above calculations, we used Lemma 4.1. Thus, we proved the first part of the inequalities. The second part follows similarly, using the function $h:I\to \mathbb{R}$ defined by $h(x)\equiv \frac{M}{2}{x}^{2}-f(x)$. We omit the details. □
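From the proof, the resulting two-sided bound on the Jensen gap reads $\frac{m}{2}{\sum}_{i<j}{p}_{i}{p}_{j}{({x}_{i}-{x}_{j})}^{2}\le {\sum}_{j}{p}_{j}f({x}_{j})-f({\sum}_{j}{p}_{j}{x}_{j})\le \frac{M}{2}{\sum}_{i<j}{p}_{i}{p}_{j}{({x}_{i}-{x}_{j})}^{2}$ (our reconstruction; the display in Theorem 4.2 is not reproduced above). A numerical spot check with $f(x)=-log(x)$ on $I=[1,2]$:

```python
import math
from itertools import combinations

# f(x) = -log(x) on I = [1, 2]: f''(x) = 1/x^2, so m = 1/4 and M = 1
f = lambda x: -math.log(x)
m, M = 0.25, 1.0

p = [0.2, 0.3, 0.5]
x = [1.0, 1.5, 2.0]

jensen_gap = (sum(pj * f(xj) for pj, xj in zip(p, x))
              - f(sum(pj * xj for pj, xj in zip(p, x))))
spread = sum(p[i] * p[j] * (x[i] - x[j]) ** 2
             for i, j in combinations(range(len(x)), 2))

assert m / 2 * spread <= jensen_gap <= M / 2 * spread
```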

**Lemma 4.3** *For* $\{{p}_{1},{p}_{2},\dots ,{p}_{n}\}$ *with* ${p}_{j}>0$ *and* ${\sum}_{j=1}^{n}{p}_{j}=1$, *and* $\{{x}_{1},{x}_{2},\dots ,{x}_{n}\}$ *with* ${x}_{j}>0$, *we have*

*Proof*We denote

This concludes the proof. □

**Corollary 4.4** *Under the assumptions of Theorem* 4.2, *we have*

**Remark 4.5** Corollary 4.4 has a form similar to that of Cartwright-Field's inequality [38]:

where ${p}_{j}>0$ for all $j=1,2,\dots ,n$ and ${\sum}_{j=1}^{n}{p}_{j}=1$, ${m}^{\mathrm{\prime}}\equiv min\{{x}_{1},{x}_{2},\dots ,{x}_{n}\}>0$ and ${M}^{\mathrm{\prime}}\equiv max\{{x}_{1},{x}_{2},\dots ,{x}_{n}\}$.
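Cartwright-Field's inequality itself can be spot-checked numerically. The display above is not reproduced, so we use the standard form $\frac{1}{2{M}^{\mathrm{\prime}}}{\sum}_{j}{p}_{j}{({x}_{j}-A)}^{2}\le A-G\le \frac{1}{2{m}^{\mathrm{\prime}}}{\sum}_{j}{p}_{j}{({x}_{j}-A)}^{2}$, with $A$ and $G$ the weighted arithmetic and geometric means:

```python
import math

p = [0.2, 0.3, 0.5]
x = [1.0, 2.0, 4.0]

A = sum(pj * xj for pj, xj in zip(p, x))          # weighted arithmetic mean
G = math.prod(xj ** pj for pj, xj in zip(p, x))   # weighted geometric mean
variance = sum(pj * (xj - A) ** 2 for pj, xj in zip(p, x))
m_prime, M_prime = min(x), max(x)

# Cartwright-Field: variance/(2 M') <= A - G <= variance/(2 m')
assert variance / (2 * M_prime) <= A - G <= variance / (2 * m_prime)
```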

We also have the following inequalities for the Tsallis entropy.

**Theorem 4.6** *For two probability distributions* $\{{p}_{1},{p}_{2},\dots ,{p}_{n}\}$ *and* $\{{r}_{1},{r}_{2},\dots ,{r}_{n}\}$ *with* ${p}_{j}>0$, ${r}_{j}>0$ *and* ${\sum}_{j=1}^{n}{p}_{j}={\sum}_{j=1}^{n}{r}_{j}=1$, *we have*

*where* ${m}_{q}$ *and* ${M}_{q}$ *are positive numbers depending on the parameter* $q\ge 0$ *and satisfying* ${m}_{q}\le q{r}_{j}^{-q-1}\le {M}_{q}$ *and* ${m}_{q}\le q{p}_{j}^{-q-1}\le {M}_{q}$ *for all* $j=1,2,\dots ,n$.

*Proof*Applying Theorem 4.2 for the convex function $-{ln}_{q}(x)$ and ${x}_{j}=\frac{1}{{r}_{j}}$, we have

From the inequalities (39) and (40), we have the statement. □

**Remark 4.7** The first part of the inequalities (40) gives another improvement of the well-known inequalities $0\le {H}_{q}({r}_{1},{r}_{2},\dots ,{r}_{n})\le {ln}_{q}n$.

**Corollary 4.8** *For two probability distributions* $\{{p}_{1},{p}_{2},\dots ,{p}_{n}\}$ *and* $\{{r}_{1},{r}_{2},\dots ,{r}_{n}\}$ *with* ${p}_{j}>0$, ${r}_{j}>0$ *and* ${\sum}_{j=1}^{n}{p}_{j}={\sum}_{j=1}^{n}{r}_{j}=1$, *we have*

*where* ${m}_{1}$ *and* ${M}_{1}$ *are positive numbers satisfying* ${m}_{1}\le {r}_{j}^{-2}\le {M}_{1}$ *and* ${m}_{1}\le {p}_{j}^{-2}\le {M}_{1}$ *for all* $j=1,2,\dots ,n$.

*Proof* Take the limit $q\to 1$ in Theorem 4.6. □

**Remark 4.9** The second part of the inequalities (41) gives the reverse inequality for the so-called information inequality [39, Theorem 2.6.3]

Using the inequality (42), we derive the following result.

**Proposition 4.10** *For two probability distributions* $\{{p}_{1},{p}_{2},\dots ,{p}_{n}\}$ *and* $\{{r}_{1},{r}_{2},\dots ,{r}_{n}\}$ *with* $0<{p}_{j}<1$, $0<{r}_{j}<1$ *and* ${\sum}_{j=1}^{n}{p}_{j}={\sum}_{j=1}^{n}{r}_{j}=1$, *we have*

*Proof* In the inequality (42), we replace ${p}_{j}$ and ${r}_{j}$ by $\frac{1-{p}_{j}}{n-1}$ and $\frac{1-{r}_{j}}{n-1}$ respectively, which satisfy ${\sum}_{j=1}^{n}\frac{1-{p}_{j}}{n-1}={\sum}_{j=1}^{n}\frac{1-{r}_{j}}{n-1}=1$. Then we have the present proposition. □

## 5 A generalized Han’s inequality

In order to state our result, we give the definitions of the Tsallis conditional entropy and the Tsallis joint entropy.

We summarize briefly the following chain rules representing relations between the Tsallis conditional entropy and the Tsallis joint entropy.

**Proposition 5.2** *Assume that* **x** *and* **y** *are random variables*. *Then*

Proposition 5.2 implies the following propositions.

**Proposition 5.3** ([16]) *Suppose* ${\mathbf{x}}_{1},{\mathbf{x}}_{2},\dots ,{\mathbf{x}}_{n}$ *are random variables*. *Then*

**Proposition 5.4** *For* $q\ge 1$ *and two random variables* **x** *and* **y**, *we have the following inequality*:

Consequently, we have the following self-bounding property of the Tsallis joint entropy.

**Theorem 5.5** (Generalized Han's inequality) *Let* ${\mathbf{x}}_{1},{\mathbf{x}}_{2},\dots ,{\mathbf{x}}_{n}$ *be random variables*. *Then for* $q\ge 1$, *we have the following inequality*:

*Proof*Since the Tsallis joint entropy has a symmetry ${H}_{q}(\mathbf{x},\mathbf{y})={H}_{q}(\mathbf{y},\mathbf{x})$, we have

Summing over *i* from 1 to *n*, we have

due to Proposition 5.3. Therefore, we have the present proposition. □

**Remark 5.6** Theorem 5.5 recovers the original Han’s inequality [41, 42] if we take the limit as $q\to 1$.
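In the limit $q\to 1$, the statement reduces to the classical Han inequality $H({\mathbf{x}}_{1},\dots ,{\mathbf{x}}_{n})\le \frac{1}{n-1}{\sum}_{i=1}^{n}H({\mathbf{x}}_{1},\dots ,{\mathbf{x}}_{i-1},{\mathbf{x}}_{i+1},\dots ,{\mathbf{x}}_{n})$, which the following sketch (our own helpers, natural logarithm) checks on a small joint distribution:

```python
import math
from itertools import product

def H(dist):
    # Shannon entropy (natural log) of a distribution given as {outcome: prob}
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

# an arbitrary joint distribution of three binary random variables
probs = [0.02, 0.08, 0.10, 0.05, 0.20, 0.15, 0.25, 0.15]
joint = dict(zip(product((0, 1), repeat=3), probs))

def marginal(joint, keep):
    # marginal distribution over the coordinates listed in `keep`
    out = {}
    for outcome, pr in joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + pr
    return out

n = 3
lhs = H(joint)
rhs = sum(H(marginal(joint, [j for j in range(n) if j != i]))
          for i in range(n)) / (n - 1)
assert lhs <= rhs  # Han's inequality
```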

## 6 Conclusion

We gave an improvement of Young’s inequalities for scalar numbers. Using this result, we gave several inequalities on generalized entropies involving Tsallis entropies. We also provided a generalized Han’s inequality, based on the conditional Tsallis entropy and the joint Tsallis entropy.

## Declarations

### Acknowledgements

We would like to thank the anonymous reviewer for providing valuable comments to improve the manuscript. The author SF was supported in part by the Japanese Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Encouragement of Young Scientists (B), 20740067. The author NM was supported in part by the Romanian Ministry of Education, Research and Innovation through the PNII Idei project 842/2008. The author FCM was supported by CNCSIS Grant 420/2008.


## References

1. Aczél J, Daróczy Z: *On Measures of Information and Their Characterizations*. Academic Press, San Diego; 1975.
2. Furuichi S: Tsallis entropies and their theorems, properties and applications. In *Aspects of Optical Sciences and Quantum Information*. Edited by: Abdel-Aty M. Research Signpost, Trivandrum; 2007:1–86.
3. Rényi A: On measures of entropy and information. In *Proc. 4th Berkeley Symp. on Mathematical Statistics and Probability*. University of California Press, Berkeley; 1961:547–561.
4. Tsallis C: Possible generalization of Boltzmann-Gibbs statistics. *J. Stat. Phys.* 1988, 52:479–487. doi:10.1007/BF01016429
5. Tsallis C, et al.: *Nonextensive Statistical Mechanics and Its Applications*. Edited by: Abe S, Okamoto Y. Springer, Berlin; 2001. See also the comprehensive list of references at http://tsallis.cat.cbpf.br/biblio.htm
6. Tsallis C: *Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World*. Springer, Berlin; 2009.
7. Tsallis C: Entropy. In *Encyclopedia of Complexity and Systems Science*. Springer, Berlin; 2009.
8. Sebawe Abdalla M, Thabet L: Nonclassical properties of a model for modulated damping under the action of an external force. *Appl. Math. Inf. Sci.* 2011, 5:570–588.
9. El-Barakaty A, Darwish M, Obada A-SF: Purity loss for a Cooper pair box interacting dispersively with a nonclassical field under phase damping. *Appl. Math. Inf. Sci.* 2011, 5:122–131.
10. Sun L-H, Li G-X, Ficek Z: Continuous variables approach to entanglement creation and processing. *Appl. Math. Inf. Sci.* 2010, 4:315–339.
11. Ficek Z: Quantum entanglement processing with atoms. *Appl. Math. Inf. Sci.* 2009, 3:375–393.
12. Furuichi S: A note on a parametrically extended entanglement-measure due to Tsallis relative entropy. *Information* 2006, 9:837–844.
13. Furuichi S, Yanagi K, Kuriyama K: Fundamental properties of Tsallis relative entropy. *J. Math. Phys.* 2004, 45:4868–4877. doi:10.1063/1.1805729
14. Furuichi S: On uniqueness theorems for Tsallis entropy and Tsallis relative entropy. *IEEE Trans. Inf. Theory* 2005, 51:3638–3645.
15. Furuichi S: An axiomatic characterization of a two-parameter extended relative entropy. *J. Math. Phys.* 2010, 51: Article ID 123302.
16. Furuichi S: Information theoretical properties of Tsallis entropies. *J. Math. Phys.* 2006, 47: Article ID 023302.
17. Furuichi S: Matrix trace inequalities on Tsallis entropies. *J. Inequal. Pure Appl. Math.* 2008, 9(1): Article ID 1.
18. Furuichi S: On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics. *J. Math. Phys.* 2009, 50: Article ID 013303.
19. Furuichi S, Mitroi F-C: Mathematical inequalities for some divergences. *Physica A* 2012, 391:388–400. doi:10.1016/j.physa.2011.07.052
20. Furuichi S: Inequalities for Tsallis relative entropy and generalized skew information. *Linear Multilinear Algebra* 2012, 59:1143–1158.
21. Csiszár I: Axiomatic characterizations of information measures. *Entropy* 2008, 10:261–273. doi:10.3390/e10030261
22. Csiszár I: Information measures: a critical survey. In *Transactions of the Seventh Prague Conference on Information Theory, Statistical Decision Functions, Random Processes*. Reidel, Dordrecht; 1978:73–86.
23. Csiszár I, Shields PC: Information theory and statistics: a tutorial. *Found. Trends Commun. Inf. Theory* 2004, 1:417–528. doi:10.1561/0100000004
24. Bobylev NA, Krasnoselsky MA: Extremum analysis (degenerate cases). Preprint, Moscow; 1981. 52 pages (in Russian).
25. Dragomir S: Bounds for the normalised Jensen functional. *Bull. Aust. Math. Soc.* 2006, 74:471–478. doi:10.1017/S000497270004051X
26. Kittaneh F, Manasrah Y: Improved Young and Heinz inequalities for matrices. *J. Math. Anal. Appl.* 2010, 361:262–269.
27. Aldaz JM: Self-improvement of the inequality between arithmetic and geometric means. *J. Math. Inequal.* 2009, 3:213–216.
28. Aldaz JM: Comparison of differences between arithmetic and geometric means. *Tamkang J. Math.* 2011, 42:445–451.
29. Mitroi FC: About the precision in Jensen-Steffensen inequality. *An. Univ. Craiova, Ser. Mat. Comput. Sci.* 2010, 37:73–84.
30. Furuichi S: On refined Young inequalities and reverse inequalities. *J. Math. Inequal.* 2011, 5:21–31.
31. Furuichi S: Refined Young inequalities with Specht's ratio. *J. Egypt. Math. Soc.* 2012, 20:46–49. doi:10.1016/j.joems.2011.12.010
32. Minculete N: A result about Young inequality and several applications. *Sci. Magna* 2011, 7:61–68.
33. Minculete N: A refinement of the Kittaneh-Manasrah inequality. *Creat. Math. Inform.* 2011, 20:157–162.
34. Furuichi S, Minculete N: Alternative reverse inequalities for Young's inequality. *J. Math. Inequal.* 2011, 5:595–600.
35. Minculete N, Furuichi S: Several applications of Cartwright-Field's inequality. *Int. J. Pure Appl. Math.* 2011, 71:19–30.
36. Masi M: A step beyond Tsallis and Rényi entropies. *Phys. Lett. A* 2005, 338:217–224. doi:10.1016/j.physleta.2005.01.094
37. Weisstein EW: *CRC Concise Encyclopedia of Mathematics*. 2nd edition. CRC Press, Boca Raton; 2003.
38. Cartwright DI, Field MJ: A refinement of the arithmetic mean-geometric mean inequality. *Proc. Am. Math. Soc.* 1978, 71:36–38. doi:10.1090/S0002-9939-1978-0476971-2
39. Cover TM, Thomas JA: *Elements of Information Theory*. 2nd edition. Wiley, New York; 2006.
40. Daróczy Z: General information functions. *Inf. Control* 1970, 16:36–51. doi:10.1016/S0019-9958(70)80040-7
41. Han T: Nonnegative entropy measures of multivariate symmetric correlations. *Inf. Control* 1978, 36:133–156. doi:10.1016/S0019-9958(78)90275-9
42. Boucheron S, Lugosi G, Bousquet O: Concentration inequalities. In *Advanced Lectures on Machine Learning*. Springer, Berlin; 2003:208–240.

## Copyright

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.