
A necessary and sufficient condition for the inequality of generalized weighted means

Abstract

We present in this paper a necessary and sufficient condition to establish the inequality between generalized weighted means that share the same sequence of numbers but differ in their weights. We first present a sufficient condition, and then obtain the more general, necessary and sufficient, condition. Our results were motivated by an inequality, involving harmonic means, found in the study of the multiple importance sampling Monte Carlo technique. We present new proofs of Chebyshev’s sum inequality, the Cauchy-Schwarz inequality, and the rearrangement inequality, and derive several interesting inequalities, some of them related to the Shannon entropy, to the Tsallis and Rényi entropies with different entropic indices, and to the logsumexp mean. These inequalities are obtained as particular cases of our general inequality, and they show the potential and practical interest of our approach. We also show the relationship of our inequality to sequence majorization.

1 Introduction

In our research on the multiple importance sampling Monte Carlo integration problem [13], we were confronted with several inequalities relating harmonic means, which were either described in the literature [47] or easy to prove from it. However, we were unable to find in the literature the following inequality (we give an interpretation of this inequality in an Appendix), which we conjectured to be true.

Conjecture 1

For \(\{ b_{k} \}\) a sequence of M strictly positive numbers, and \(C \ge0\), we have

$$ \frac{\mathcal{H}(\{b_{k} (b_{k}+C)\})}{\mathcal{H}(\{b_{k}\})} \le\frac{\mathcal{H}(\{(b_{k}+C) (b_{k}+C)\})}{\mathcal{H}(\{b_{k}+C\})}, $$
(1)

where \(\mathcal{H}\) stands for harmonic mean.
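Before proving it, the conjecture can be stress-tested numerically. The following is a quick sanity check (not a proof); the `hmean` helper and the random trial harness are our own illustration, not part of the paper:

```python
import random

def hmean(xs):
    """Harmonic mean of a sequence of strictly positive numbers."""
    return len(xs) / sum(1.0 / x for x in xs)

def conjecture1_holds(b, C):
    """Check inequality (1) for one sequence b and one shift C >= 0,
    with a small relative tolerance for floating-point noise."""
    lhs = hmean([x * (x + C) for x in b]) / hmean(b)
    rhs = hmean([(x + C) * (x + C) for x in b]) / hmean([x + C for x in b])
    return lhs <= rhs * (1 + 1e-9)

rng = random.Random(0)
trials = [([rng.uniform(0.1, 10.0) for _ in range(rng.randint(2, 8))],
           rng.uniform(0.0, 5.0))
          for _ in range(1000)]
print(all(conjecture1_holds(b, C) for b, C in trials))  # True
```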

Observe that if we use the following weights:

$$ \alpha_{k} = \frac{\mathcal{H}(\{b_{k}+C\})}{M (b_{k}+C)}, \quad\mbox{where } \sum _{k=1}^{M} \alpha _{k} =1, $$
(2)

and

$$ \alpha'_{k} = \frac{\mathcal{H}(\{b_{k}\})}{ M b_{k}},\quad \mbox{where } \sum_{k=1}^{M} \alpha'_{k} =1 $$
(3)

then equation (1) can be written as an inequality between weighted harmonic means,

$$ \mathcal{H}\bigl(\bigl\{ (b_{k}+C)/\alpha'_{k} \bigr\} \bigr) \le\mathcal{H}\bigl(\bigl\{ (b_{k}+C)/\alpha_{k} \bigr\} \bigr). $$
(4)

Observe also that

$$ b_{i} \le b_{j} \quad\Rightarrow\quad \frac{\alpha'_{i}}{ \alpha'_{j}} = \frac{b_{j}}{ b_{i}} \ge\frac{b_{j} + C}{ b_{i} + C} = \frac{\alpha_{i}}{\alpha_{j}}, $$
(5)

and as \(b_{i} \le b_{j} \Leftrightarrow b_{i}+C \le b_{j}+C\) we can state that, for any sequence \(\{ b_{k} \}\) of strictly positive numbers, the weights \(\{\alpha_{k}\}\) and \(\{\alpha'_{k}\}\) in equations (2) and (3), with \(C\ge0\), fulfill the following condition:

$$ \forall(b_{i},b_{j}),\quad b_{i} \le b_{j} \quad\Rightarrow \quad\alpha'_{i} / \alpha'_{j} \ge \alpha_{i}/ \alpha_{j}. $$
(6)

We will see in the next section that this condition is sufficient for Conjecture 1 to be true.

2 Results: inequalities for generalized weighted mean

2.1 A new inequality for generalized weighted mean

By taking the \(\{ \alpha_{k} \}\) to be any weights obeying equation (6) we generalize Conjecture 1 to the following theorem.

Theorem 1

Consider a sequence of M strictly positive numbers \(\{ b_{k} \}\) and strictly positive weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\), \(\sum_{k} \alpha_{k} =1, \sum_{k} \alpha'_{k} =1\). Consider that the weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\) obey the following condition:

$$ \forall(b_{i},b_{j}), \quad b_{i} \le b_{j}\quad \Rightarrow\quad\alpha'_{i} / \alpha'_{j} \ge \alpha_{i}/ \alpha_{j}. $$
(7)

Then the following inequality holds:

$$ \frac{1}{\sum_{k=1}^{M} \alpha'_{k}/b_{k}} \le\frac{1}{\sum_{k=1}^{M} \alpha_{k}/b_{k}}, $$
(8)

or, equivalently,

$$ \mathcal{H}\bigl({b_{k}/\alpha'_{k}} \bigr) \le\mathcal{H}({b_{k}/\alpha_{k}}) . $$
(9)

Without loss of generality we can assume the sequence \(\{ b_{k} \}\) is given in increasing order. In that case condition (7) is equivalent to

$$ \forall(i,j),\quad 1 \le i,j \le M, i \le j \quad\Rightarrow\quad \alpha'_{i} / \alpha '_{j} \ge \alpha_{i}/ \alpha_{j}. $$
(10)

Let us prove first the following lemma.

Lemma 1

Condition (7) implies that

$$\begin{aligned} &\alpha'_{1} \ge\alpha_{1}, \\ &\alpha'_{M} \le\alpha_{M}. \end{aligned}$$
(11)

Proof

Effectively, using equation (7),

$$\begin{aligned} 1 =& \alpha'_{1}+ \cdots+ \alpha'_{M} \ge \frac{\alpha_{1} \alpha'_{M}}{ \alpha_{M}}+ \cdots+ \frac{\alpha_{M-1} \alpha'_{M}}{ \alpha_{M}}+\alpha '_{M} \\ =& \frac{\alpha'_{M}}{\alpha_{M}} (\alpha_{1} + \cdots+ \alpha_{M}) = \frac {\alpha'_{M}}{\alpha_{M}}. \end{aligned}$$
(12)

In a similar way we can show that \(\alpha'_{1} \ge\alpha_{1}\). □

We prove now Theorem 1 under the \(\{b_{k}\}\)-increasing condition:

Proof

Now, if we prove that

$$ \sum_{k=1}^{M} \frac{\alpha_{k}}{b_{k}} \le\sum_{k=1}^{M} \frac{\alpha'_{k}}{b_{k}}, $$
(13)

we will have proved equation (8).

Proving equation (13) for \(M=2\) is easy:

$$\begin{aligned} \frac{\alpha'_{1}}{b_{1}}+\frac{\alpha'_{2}}{b_{2}} - \biggl(\frac{\alpha _{1}}{b_{1}}+ \frac{\alpha_{2}}{b_{2}} \biggr) &= \frac{\alpha'_{1}-\alpha_{1}}{b_{1}}+\frac{\alpha'_{2}-\alpha_{2}}{b_{2}} \ge \frac{\alpha'_{1}-\alpha_{1}+ \alpha'_{2}-\alpha_{2}}{b_{2}} = 0, \end{aligned}$$
(14)

where the inequality is obtained because from equation (11), \(\alpha'_{1} - \alpha_{1} \ge0\) and \(0 < b_{1} \le b_{2} \).

Let us use induction for \(M > 2\), i.e., assume that equation (13) holds for \(M-1\), \(M\ge3\), and take as weights \(\{\frac{\alpha_{k}}{(1-\alpha_{M})}\}\) and \(\{\frac{\alpha'_{k}}{(1-\alpha '_{M})}\}\) (they add to 1 as \(\sum_{k=1}^{M-1} \alpha_{k} = 1-\alpha_{M}\), and \(\sum_{k=1}^{M-1} \alpha'_{k} = 1-\alpha'_{M}\)). These weights fulfill condition (7), and thus

$$ \frac{1}{(1-\alpha_{M})} \biggl(\frac{\alpha_{1}}{b_{1}} +\cdots+ \frac{\alpha _{M-1}}{b_{M-1}}\biggr) \le\frac{1}{(1-\alpha'_{M})} \biggl( \frac{\alpha'_{1}}{b_{1}} + \cdots+ \frac{\alpha'_{M-1}}{b_{M-1}} \biggr). $$
(15)

Cross-multiplying by the (positive) factors \(1-\alpha_{M}\) and \(1-\alpha'_{M}\), and adding \(\alpha'_{M}/b_{M}\) to both sides,

$$\begin{aligned} & \bigl(1-\alpha'_{M}\bigr) \biggl( \frac{\alpha_{1}}{b_{1}} +\cdots+ \frac{\alpha _{M-1}}{b_{M-1}}\biggr) + \frac{\alpha'_{M}}{b_{M}} \\ &\quad\le (1-\alpha_{M}) \biggl( \frac{\alpha'_{1}}{b_{1}} +\cdots+ \frac{\alpha '_{M-1}}{b_{M-1}} \biggr) + \frac{\alpha'_{M}}{b_{M}} . \end{aligned}$$
(16)

After reorganizing terms, and adding and subtracting \(\alpha_{M}/b_{M}\), we obtain

$$\begin{aligned} & \frac{\alpha'_{1}}{b_{1}} +\cdots+ \frac{\alpha'_{M}}{b_{M}} - \biggl( \frac{\alpha_{1}}{b_{1}} +\cdots+ \frac{\alpha_{M}}{b_{M}} \biggr) \\ &\quad\ge- \frac {\alpha_{M}}{b_{M}}+ \frac{\alpha'_{M}}{b_{M}} + \frac{ \alpha_{M} \alpha'_{1}- \alpha'_{M} \alpha_{1} }{b_{1}} + \cdots+ \frac{ \alpha_{M} \alpha'_{M-1}- \alpha'_{M} \alpha_{M-1}}{b_{M-1}}. \end{aligned}$$
(17)

Observe now that both by the condition on the weights

$$\alpha_{M} \alpha'_{1}- \alpha'_{M} \alpha_{1} \ge0,\qquad \ldots,\qquad \alpha_{M} \alpha '_{M-1}- \alpha'_{M} \alpha_{M-1} \ge0 $$

and by the ordering of the \(\{b_{k}\}\), \(0 < b_{1} \le b_{2}\le \cdots\le b_{M}\), we can write

$$\begin{aligned} &{-} \frac{\alpha_{M}}{b_{M}}+ \frac{\alpha'_{M}}{b_{M}} + \frac{ \alpha_{M} \alpha'_{1}- \alpha'_{M} \alpha_{1} }{b_{1}} + \cdots+ \frac{ \alpha_{M} \alpha'_{M-1}- \alpha'_{M} \alpha _{M-1}}{b_{M-1}} \\ &\quad\ge - \frac{\alpha_{M}}{b_{M}}+ \frac{\alpha'_{M}}{b_{M}} + \frac{ \alpha_{M} \alpha'_{1}- \alpha'_{M} \alpha_{1} + \cdots+ \alpha_{M} \alpha'_{M-1}- \alpha'_{M} \alpha_{M-1}}{b_{M}} = 0, \end{aligned}$$
(18)

proving thus equation (13) for any M. □
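A numerical illustration of Theorem 1 may be useful here. For \(\{b_{k}\}\) in increasing order, condition (7) holds exactly when the ratios \(\alpha'_{k}/\alpha_{k}\) are non-increasing, which is how the sketch below (our own test harness, not from the paper) generates admissible weight pairs:

```python
import random

def weighted_hmean(b, w):
    """Weighted harmonic mean 1 / sum(w_k / b_k), with sum(w) == 1."""
    return 1.0 / sum(wk / bk for wk, bk in zip(w, b))

def random_weight_pair(M, rng):
    """Weights (alpha, alpha_prime) with alpha'_k / alpha_k non-increasing,
    so condition (7) holds for an increasing sequence b."""
    alpha = [rng.uniform(0.1, 1.0) for _ in range(M)]
    ratios = sorted((rng.uniform(0.1, 1.0) for _ in range(M)), reverse=True)
    alpha_p = [a * r for a, r in zip(alpha, ratios)]
    sa, sp = sum(alpha), sum(alpha_p)
    return [a / sa for a in alpha], [a / sp for a in alpha_p]

rng = random.Random(1)
ok = True
for _ in range(1000):
    M = rng.randint(2, 8)
    b = sorted(rng.uniform(0.1, 10.0) for _ in range(M))
    alpha, alpha_p = random_weight_pair(M, rng)
    # Inequality (8): the alpha'-weighted harmonic mean is the smaller one.
    ok = ok and weighted_hmean(b, alpha_p) <= weighted_hmean(b, alpha) * (1 + 1e-9)
print(ok)  # True
```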

Corollary 1

A sufficient condition for strict inequality in equation (9) is that \(b_{1} < b_{M}\) (i.e., the non-trivial case) and \(\alpha_{M} \alpha'_{1}- \alpha'_{M} \alpha_{1} >0 \).

Proof

Observe that \(\alpha_{M} \alpha'_{1}- \alpha'_{M} \alpha_{1} = 0\) would imply, from equation (11), that \(\alpha_{1} = \alpha'_{1}\) and \(\alpha_{M} = \alpha'_{M}\), and applying condition (7) with \(i=1\) we would obtain \(\alpha_{j} \ge\alpha'_{j}\) for all \(j \ge1\). Using the fact that \(\sum_{j} \alpha_{j} = \sum_{j} \alpha'_{j}=1\) we immediately arrive at \(\alpha_{j} = \alpha'_{j}\) for all j, thus the trivial case. If we exclude this trivial case the inequality (13) is strict. □

Corollary 2

Given a strictly positive sequence \(\{ b_{k} \}\), and \(k_{1} >0\), \(k_{2},C_{1}, C_{2} \ge0\), we have

$$ \frac{\mathcal{H}( \{ b_{k} (k_{1} b_{k} + C_{1}) \})}{\mathcal{H}( \{ b_{k} \})} \le \frac{\mathcal{H}( \{ (k_{2} b_{k} + C_{2}) (k_{1} b_{k} + C_{1}) \})}{\mathcal{H}( \{ (k_{2} b_{k} + C_{2}) \})}. $$
(19)

Proof

Consider the strictly positive sequence \(\{ (k_{1} b_{k} + C_{1}) \}\) and the weights

$$\begin{aligned} &\alpha'_{i} =\frac{\mathcal{H}( \{ b_{k} \})}{M b_{i}} , \\ &\alpha_{i} = \frac{\mathcal{H}( \{ (k_{2} b_{k} + C_{2}) \} )}{M (k_{2} b_{i} + C_{2})}, \end{aligned}$$
(20)

then we have

$$ \begin{aligned} &\forall(i,j), \quad \bigl( (k_{1} b_{i} + C_{1}) \le(k_{1} b_{j} + C_{1}) \bigr) \quad\Leftrightarrow\quad(b_{i} \le b_{j}) \\ &\quad\Rightarrow\quad \frac{\alpha'_{i}}{\alpha'_{j}} = \frac{b_{j}}{b_{i}} \ge \frac {(k_{2} b_{j} + C_{2}) }{ (k_{2} b_{i} + C_{2})} = \frac{\alpha_{i}}{\alpha_{j}}. \end{aligned} $$
(21)

 □

Observe that Conjecture 1 is a particular case of equation (19) when \(k_{1}=k_{2}=1\), \(C_{1}=C_{2}=C\).

Other interesting cases are when \(k_{1} = 1, k_{2} = 0\):

$$ \mathcal{H}\bigl( \bigl\{ b_{k} ( b_{k} + C) \bigr\} \bigr) \le\mathcal{H}\bigl( \{ b_{k} \}\bigr) \mathcal{H}\bigl( \bigl\{ (b_{k} + C) \bigr\} \bigr), $$
(22)

and when \(k_{1} = 1, k_{2} = 0, C_{1} = 0\):

$$ \mathcal{H}\bigl( \bigl\{ b_{k}^{2} \bigr\} \bigr) \le \bigl( \mathcal{H}\bigl( \{ b_{k} \}\bigr) \bigr)^{2}, $$
(23)

a result that can be immediately derived from the inequality between power means with orders −1 and −2 [6]. Finally, taking \(k_{1} = 1, k_{2} = 1, C_{1} = 0\):

$$ \frac{\mathcal{H}( \{ b_{k}^{2} \})}{\mathcal{H}( \{ b_{k} \})} \le\frac{\mathcal{H}( \{ b_{k} (b_{k} + C)\})}{\mathcal{H}( \{ (b_{k} + C) \})} . $$
(24)
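Corollary 2 can also be checked numerically. In the sketch below (our own harness) we draw \(k_{2}\) strictly positive so that \(k_{2} b_{k} + C_{2}\) never vanishes; the corollary allows \(k_{2} = 0\) provided \(C_{2} > 0\):

```python
import random

def hmean(xs):
    """Harmonic mean of a sequence of strictly positive numbers."""
    return len(xs) / sum(1.0 / x for x in xs)

def corollary2_holds(b, k1, k2, C1, C2):
    """Check inequality (19) with a small relative tolerance."""
    lhs = hmean([x * (k1 * x + C1) for x in b]) / hmean(b)
    rhs = (hmean([(k2 * x + C2) * (k1 * x + C1) for x in b])
           / hmean([k2 * x + C2 for x in b]))
    return lhs <= rhs * (1 + 1e-9)

rng = random.Random(2)
ok = all(
    corollary2_holds([rng.uniform(0.1, 10.0) for _ in range(6)],
                     k1=rng.uniform(0.1, 3.0), k2=rng.uniform(0.1, 3.0),
                     C1=rng.uniform(0.0, 5.0), C2=rng.uniform(0.0, 5.0))
    for _ in range(1000)
)
print(ok)  # True
```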

Corollary 3

Given a strictly positive sequence \(\{ b_{k} \}\), for any γ, we have

$$ \mathcal{H}\bigl( \bigl\{ b_{k}^{\gamma+1} \bigr\} \bigr) \mathcal{H}\bigl( \bigl\{ b_{k}^{\gamma-1} \bigr\} \bigr) \le \bigl( \mathcal{H}\bigl( \bigl\{ b_{k}^{\gamma} \bigr\} \bigr) \bigr)^{2}. $$
(25)

Proof

Taking as weights

$$\begin{aligned} &\alpha'_{k} = \frac{\mathcal{H}( \{ b_{k}^{\gamma}\})}{M b_{k}^{\gamma}}, \\ &\alpha_{k} = \frac{\mathcal{H}( \{ b_{k}^{\gamma-1} \})}{M b_{k}^{\gamma-1}}, \end{aligned}$$
(26)

we have

$$\begin{aligned} b_{i} \le b_{j} \quad\Rightarrow\quad \frac{\alpha'_{i}}{\alpha'_{j}} = \frac{b_{j}^{\gamma}}{b_{i}^{\gamma}} \ge\frac{b_{j}^{\gamma-1}}{b_{i}^{\gamma-1}} = \frac{\alpha _{i}}{\alpha_{j}}. \end{aligned}$$
(27)

 □

Observe that for \(\gamma= 1\) we reproduce again equation (23). Observe also that taking \(\gamma= 0\) we get the well-known inequality

$$\begin{aligned} \mathcal{H}\bigl( \{ b_{k} \}\bigr) \le \mathcal{A} \bigl( \{ b_{k} \}\bigr), \end{aligned}$$
(28)

where \(\mathcal{A}( \{ b_{k} \}) = ( \mathcal{H}( \{ b_{k}^{-1} \}) )^{-1}\) is the arithmetic mean of \(\{ b_{k} \}\).
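Corollary 3 holds for any real γ, including negative values, which a short random check (our own illustration) confirms:

```python
import random

def hmean(xs):
    """Harmonic mean of a sequence of strictly positive numbers."""
    return len(xs) / sum(1.0 / x for x in xs)

rng = random.Random(3)
ok = True
for _ in range(1000):
    b = [rng.uniform(0.2, 5.0) for _ in range(rng.randint(2, 8))]
    g = rng.uniform(-3.0, 3.0)   # Corollary 3 holds for any gamma
    lhs = hmean([x ** (g + 1) for x in b]) * hmean([x ** (g - 1) for x in b])
    rhs = hmean([x ** g for x in b]) ** 2
    ok = ok and lhs <= rhs * (1 + 1e-9)
print(ok)  # True
```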

Corollary 4

Given a strictly positive sequence \(\{ b_{k} \}\), and \(\gamma_{1},\gamma _{2}\), such that \(\gamma_{1}, \gamma_{2} \ge0\), the following inequality holds:

$$ \mathcal{H}\bigl( \bigl\{ b_{k}^{\gamma_{1}+\gamma_{2}} \bigr\} \bigr) \le \mathcal{H}\bigl( \bigl\{ b_{k}^{\gamma _{1}} \bigr\} \bigr) \mathcal{H}\bigl( \bigl\{ b_{k}^{\gamma_{2}} \bigr\} \bigr). $$
(29)

Proof

Consider the sequence \(\{b_{k}^{\gamma_{1}} \}\), and take as weights

$$\begin{aligned} &\alpha'_{k} = \frac{\mathcal{H}( \{ b_{k}^{\gamma_{2}} \})}{ M b_{k}^{\gamma _{2}}}, \\ &\alpha_{k} = 1/ M , \end{aligned}$$
(30)

we have

$$\begin{aligned} b_{i} \le b_{j} \quad\Leftrightarrow\quad b_{i}^{\gamma_{1}} \le b_{j}^{\gamma_{1}} \quad\Rightarrow\quad \frac{\alpha'_{i}}{\alpha'_{j}} = \frac{b_{j}^{\gamma _{2}}}{b_{i}^{\gamma_{2}}} \ge1 = \frac{\alpha_{i}}{\alpha_{j}}. \end{aligned}$$
(31)

 □
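As with the previous corollaries, inequality (29) is easy to probe numerically; the harness below is our own sketch, drawing \(\gamma_{1}, \gamma_{2} \ge 0\) at random:

```python
import random

def hmean(xs):
    """Harmonic mean of a sequence of strictly positive numbers."""
    return len(xs) / sum(1.0 / x for x in xs)

rng = random.Random(13)
ok = True
for _ in range(1000):
    b = [rng.uniform(0.2, 5.0) for _ in range(rng.randint(2, 8))]
    g1, g2 = rng.uniform(0.0, 3.0), rng.uniform(0.0, 3.0)  # both non-negative
    lhs = hmean([x ** (g1 + g2) for x in b])
    rhs = hmean([x ** g1 for x in b]) * hmean([x ** g2 for x in b])
    ok = ok and lhs <= rhs * (1 + 1e-9)
print(ok)  # True
```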

Let us see now that Theorem 1, for the weighted harmonic mean, also holds for the weighted arithmetic mean.

Theorem 2

Consider a sequence of M strictly positive numbers \(\{ b_{k} \}\) and strictly positive weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\), \(\sum_{k} \alpha_{k} =1, \sum_{k} \alpha'_{k} =1\). If the weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\) obey the following condition:

$$ \forall(b_{i},b_{j}),\quad b_{i} \le b_{j} \quad\Rightarrow\quad\alpha'_{i} / \alpha'_{j} \ge \alpha_{i}/ \alpha_{j}, $$
(32)

then the following inequality holds:

$$ \sum_{k} \alpha'_{k} b_{k} \le\sum_{k} \alpha_{k} b_{k}, $$
(33)

or equivalently

$$ \mathcal{A}\bigl( \bigl\{ \alpha'_{k} b_{k}\bigr\} \bigr) \le\mathcal{A}\bigl(\{\alpha_{k} b_{k}\}\bigr). $$
(34)

Proof

As \(\mathcal{A}(\{\alpha_{k} b_{k}\}) = ( \mathcal{H}(\{b_{k}^{-1}/\alpha_{k} \} ) )^{-1}\) it is enough to prove

$$ \mathcal{H}\bigl(\bigl\{ b_{k}^{-1}/ \alpha_{k} \bigr\} \bigr) \le\mathcal{H}\bigl(\bigl\{ b_{k}^{-1}/ \alpha'_{k} \bigr\} \bigr). $$
(35)

Observe now that

$$ b_{j}^{-1} \le b_{i}^{-1} \quad \Leftrightarrow\quad b_{i} \le b_{j} \quad\Rightarrow \quad\alpha '_{i} / \alpha'_{j} \ge\alpha_{i}/ \alpha_{j} \quad\Leftrightarrow\quad \alpha_{j} / \alpha_{i} \ge\alpha'_{j}/ \alpha'_{i}, $$
(36)

and thus we can apply Theorem 1 to the sequence \(\{ b_{k}^{-1} \}\) with switched weights (i.e. with \(\{ \alpha_{k}' \}\) and \(\{ \alpha_{k} \}\) instead of \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\)) to obtain equation (35). □

Remark 1

We can relax the positivity condition on the sequence \(\{ b_{k} \}\) in Theorem 2. Indeed, for any C,

$$ \biggl( \sum_{k} \alpha'_{k} b_{k} \le\sum _{k} \alpha_{k} b_{k} \biggr) \quad \Leftrightarrow\quad \biggl( \sum_{k} \alpha'_{k} (b_{k}+C) \le\sum _{k} \alpha_{k} (b_{k}+C) \biggr) $$
(37)

and

$$ \forall(b_{i},b_{j}),\quad b_{i} \le b_{j} \quad\Leftrightarrow\quad b_{i}+C \le b_{j}+C. $$
(38)
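Theorem 2, together with Remark 1, can be illustrated numerically; the weight construction below (our own harness) uses the fact that, for an increasing sequence, condition (32) amounts to the ratios \(\alpha'_{k}/\alpha_{k}\) being non-increasing:

```python
import random

def random_weight_pair(M, rng):
    """(alpha, alpha_prime) with alpha'_k / alpha_k non-increasing,
    so condition (32) holds for an increasing sequence b."""
    alpha = [rng.uniform(0.1, 1.0) for _ in range(M)]
    ratios = sorted((rng.uniform(0.1, 1.0) for _ in range(M)), reverse=True)
    alpha_p = [a * r for a, r in zip(alpha, ratios)]
    sa, sp = sum(alpha), sum(alpha_p)
    return [a / sa for a in alpha], [a / sp for a in alpha_p]

rng = random.Random(4)
ok = True
for _ in range(1000):
    M = rng.randint(2, 8)
    # Per Remark 1, the b_k need not be positive for Theorem 2.
    b = sorted(rng.uniform(-5.0, 10.0) for _ in range(M))
    alpha, alpha_p = random_weight_pair(M, rng)
    lhs = sum(w * x for w, x in zip(alpha_p, b))  # alpha'-weighted mean
    rhs = sum(w * x for w, x in zip(alpha, b))    # alpha-weighted mean
    ok = ok and lhs <= rhs + 1e-9
print(ok)  # True
```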

The next corollary is the equivalent of Corollary 1.

Corollary 5

A sufficient condition for strict inequality in equation (33) is that \(b_{1} < b_{M}\) (i.e., the non-trivial case) and \(\alpha_{M} \alpha'_{1}- \alpha'_{M} \alpha_{1} >0\).

Proof

From equation (35) and equation (36), we apply Corollary 1 to the sequence \(\{b_{k}^{-1}\}\) with weights \(\{ \alpha_{k}' \}\) and \(\{ \alpha_{k} \}\). As the ordering is the reverse of \(\{b_{k}\}\), and weights are switched, \(\alpha_{M}\) would take the role of \(\alpha'_{1}\) and vice versa, and the same with \(\alpha'_{M}\) and \(\alpha_{1}\). □

Corollary 6

Chebyshev’s sum inequality

Given the sequences \(\{x_{1} \ge x_{2} \ge\cdots\ge x_{M} \ge0 \}, \{ y_{1} \ge y_{2} \ge\cdots\ge y_{M} \ge0 \}\) the following inequality holds:

$$ \mathcal{A}\bigl( \{ x_{k} y_{k} \}\bigr) \ge \mathcal{A}\bigl( \{ x_{k} \}\bigr) \mathcal{A}\bigl( \{ y_{k} \}\bigr). $$
(39)

Proof

Consider first the largest index \(M^{\star} \le M\) such that all \(x_{k}, y_{k}\), \(k \le M^{\star}\), are strictly positive, the sequence \(\{x_{k}\}_{k=1}^{M^{\star}} \), and the weights

$$\begin{aligned} &\alpha'_{k} = 1/M^{\star}, \\ &\alpha_{k} = \frac{y_{k}}{\sum_{k=1}^{M^{\star}} y_{k}} . \end{aligned}$$
(40)

As both sequences \(\{x_{k}\}_{k=1}^{M^{\star}} \) and \(\{y_{k}\} _{k=1}^{M^{\star}} \) are in the same (decreasing) order, we have

$$\begin{aligned} x_{i} \le x_{j} \quad\Leftrightarrow\quad y_{i} \le y_{j} \quad\Rightarrow\quad\frac{\alpha '_{i}}{\alpha'_{j}} = 1 \ge\frac{\alpha_{i}}{\alpha_{j}} = \frac{y_{i}}{y_{j}}, \end{aligned}$$
(41)

and we can apply Theorem 2 to obtain

$$\begin{aligned} &\frac{\sum_{k=1}^{M^{\star}} x_{k} y_{k}}{ \sum_{k=1}^{M^{\star}} y_{k}} \ge \frac{\sum_{k=1}^{M^{\star}} x_{k}}{M^{\star}}, \\ &\sum_{k=1}^{M^{\star}} x_{k} y_{k} \ge \frac{1}{M^{\star}} \sum_{k=1}^{M^{\star}} x_{k} \sum_{k=1}^{M^{\star}} y_{k}. \end{aligned}$$
(42)

Suppose now, without loss of generality, that the minimum index \(M^{\star}\) corresponds to the sequence \(\{x_{k}\}\). We can write

$$\begin{aligned} \sum_{k=1}^{M} x_{k} y_{k} = \sum_{k=1}^{M^{\star}} x_{k} y_{k} \ge& \frac {1}{M^{\star}} \sum _{k=1}^{M} x_{k} \sum _{k=1}^{M^{\star}} y_{k} \ge \frac{1}{M} \sum_{k=1}^{M} x_{k} \sum _{k=1}^{M} y_{k} , \end{aligned}$$
(43)

where the last inequality happens because the \(\{y_{k}\}\) is a decreasing sequence. □
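A direct numerical check of Chebyshev’s sum inequality (our own illustration, with both sequences sorted in decreasing order):

```python
import random

def amean(xs):
    """Arithmetic mean."""
    return sum(xs) / len(xs)

rng = random.Random(5)
ok = True
for _ in range(1000):
    M = rng.randint(2, 10)
    # Both sequences non-negative and sorted in the same (decreasing) order.
    x = sorted((rng.uniform(0.0, 10.0) for _ in range(M)), reverse=True)
    y = sorted((rng.uniform(0.0, 10.0) for _ in range(M)), reverse=True)
    lhs = amean([a * c for a, c in zip(x, y)])
    ok = ok and lhs >= amean(x) * amean(y) - 1e-9
print(ok)  # True
```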

Corollary 6 can easily be extended to the following one.

Corollary 7

Chebyshev’s sum inequality extended

Given the positive sequences \(\{x_{1}, x_{2}, \ldots, x_{M} \}, \{y_{1} , y_{2}, \ldots, y_{M} \}\), then the following holds:

If both sequences are sorted in the same order then

$$ \mathcal{A}\bigl( \{ x_{k} y_{k} \}\bigr) \ge \mathcal{A}\bigl( \{ x_{k} \}\bigr) \mathcal{A}\bigl( \{ y_{k} \}\bigr). $$
(44)

If in addition they are strictly positive

$$ \mathcal{A}\bigl( \{ x_{k} /y_{k} \}\bigr) \le \mathcal{A}\bigl( \{ x_{k} \}\bigr) \mathcal{A}\bigl( \{ 1/ y_{k} \}\bigr). $$
(45)

If both sequences are sorted in opposite order then

$$ \mathcal{A}\bigl( \{ x_{k} y_{k} \}\bigr) \le \mathcal{A}\bigl( \{ x_{k} \}\bigr) \mathcal{A}\bigl( \{ y_{k} \}\bigr). $$
(46)

If in addition they are strictly positive

$$ \mathcal{A}\bigl( \{ x_{k} / y_{k} \}\bigr) \ge \mathcal{A}\bigl( \{ x_{k} \}\bigr) \mathcal{A}\bigl( \{ 1/ y_{k} \}\bigr). $$
(47)
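All four variants of Corollary 7 can be exercised in one random harness (our own sketch); reversing one sequence turns the same-order cases (44)-(45) into the opposite-order cases (46)-(47):

```python
import random

def amean(xs):
    """Arithmetic mean."""
    return sum(xs) / len(xs)

rng = random.Random(12)
tol = 1e-9
ok = True
for _ in range(1000):
    M = rng.randint(2, 10)
    # Strictly positive, so the division variants (45) and (47) also apply.
    x = sorted(rng.uniform(0.1, 10.0) for _ in range(M))
    y = sorted(rng.uniform(0.1, 10.0) for _ in range(M))
    same = (amean([a * c for a, c in zip(x, y)]) >= amean(x) * amean(y) - tol
            and amean([a / c for a, c in zip(x, y)])
                <= amean(x) * amean([1 / c for c in y]) + tol)
    y_rev = list(reversed(y))   # now x and y_rev are in opposite order
    opp = (amean([a * c for a, c in zip(x, y_rev)])
               <= amean(x) * amean(y_rev) + tol
           and amean([a / c for a, c in zip(x, y_rev)])
               >= amean(x) * amean([1 / c for c in y_rev]) - tol)
    ok = ok and same and opp
print(ok)  # True
```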

Corollary 8

Cauchy-Schwarz-Buniakowski inequality

Given the sequences \(\{x_{1}, x_{2}, \ldots, x_{M} \}, \{ y_{1} , y_{2}, \ldots, y_{M} \}\) the following inequality holds:

$$ \biggl( \sum_{k} x_{k}^{2} \biggr) \biggl( \sum_{k} y_{k}^{2} \biggr) \ge \biggl( \sum _{k} x_{k} y_{k} \biggr)^{2} . $$
(48)

Proof

Consider first the sequences of absolute values \(\{|x_{1}|, |x_{2}|, \ldots , |x_{M}| \}, \{|y_{1}| , |y_{2}|, \ldots, |y_{M}| \}\). Reorder the sequences so that the zero values are at the end, and let \(M^{\star}\) be the minimum number of non-zero values. Consider now the sequence \(\{ |x_{k}|/|y_{k}| \} \) and the weights

$$\begin{aligned} &\alpha'_{k}= \frac{ |y_{k}|^{2}}{\sum_{k=1}^{M^{\star}} |y_{k}|^{2}} , \\ &\alpha_{k} = \frac{ |x_{k}| |y_{k}|}{\sum_{k=1}^{M^{\star}} |x_{k}| |y_{k}|}. \end{aligned}$$
(49)

Trivially, \(\frac{|x_{i}|}{|y_{i}|} \le \frac{|x_{j}|}{|y_{j}|} \Leftrightarrow\alpha'_{i}/ \alpha'_{j} = \frac{ |y_{i}|^{2} }{ |y_{j}|^{2}} \ge \frac{|x_{i}| |y_{i}| }{ |x_{j}| |y_{j}|}= \alpha_{i}/ \alpha_{j}\), and we can apply Theorem 2,

$$\begin{aligned}& \Biggl( \sum_{k=1}^{M^{\star}} |x_{k}| |y_{k}| \Biggr)^{2} \le \Biggl( \sum _{k=1}^{M^{\star}} |x_{k}|^{2} \Biggr) \Biggl( \sum_{k=1}^{M^{\star}} |y_{k}|^{2} \Biggr), \\& \begin{aligned}[b] \Biggl( \sum_{k=1}^{M} x_{k} y_{k} \Biggr)^{2} &\le \Biggl( \sum _{k=1}^{M} |x_{k}| |y_{k}| \Biggr)^{2} = \Biggl( \sum_{k=1}^{M^{\star}} |x_{k}| |y_{k}| \Biggr)^{2} \le \Biggl( \sum _{k=1}^{M^{\star}} |x_{k}|^{2} \Biggr) \Biggl( \sum_{k=1}^{M^{\star}} |y_{k}|^{2} \Biggr) \\ &\le \Biggl( \sum_{k=1}^{M} |x_{k}|^{2} \Biggr) \Biggl( \sum _{k=1}^{M} |y_{k}|^{2} \Biggr) = \Biggl( \sum_{k=1}^{M} x_{k}^{2} \Biggr) \Biggl( \sum_{k=1}^{M} y_{k}^{2} \Biggr). \end{aligned} \end{aligned}$$
(50)

 □

As \(\mathcal{A}( \{ b_{k}^{\gamma} \}) = 1/\mathcal{H}( \{ b_{k}^{-\gamma} \})\) and reciprocally, Corollary 6, applied to the sequences \(\{ b_{k}^{\gamma_{1}} \}, \{ b_{k}^{\gamma_{2}} \}\), \(\gamma_{1}, \gamma_{2} \ge0\), together with Corollary 4, allows us to establish the following corollary.

Corollary 9

Given a strictly positive sequence \(\{ b_{k} \}\), and \(\gamma_{1},\gamma _{2}\) both positive or both negative, the following inequalities hold:

$$\begin{aligned} &\mathcal{A}\bigl( \bigl\{ b_{k}^{\gamma_{1}+\gamma_{2}} \bigr\} \bigr) \ge \mathcal{A}\bigl( \bigl\{ b_{k}^{\gamma _{1}} \bigr\} \bigr) \mathcal{A}\bigl( \bigl\{ b_{k}^{\gamma_{2}} \bigr\} \bigr), \\ &\mathcal{H}\bigl( \bigl\{ b_{k}^{\gamma_{1}+\gamma_{2}} \bigr\} \bigr) \le \mathcal{H}\bigl( \bigl\{ b_{k}^{\gamma _{1}} \bigr\} \bigr) \mathcal{H} \bigl( \bigl\{ b_{k}^{\gamma_{2}} \bigr\} \bigr). \end{aligned}$$
(51)

Observe that we can only guarantee that the inequalities (51) hold when \(\gamma_{1}, \gamma_{2}\) have the same sign. For instance, taking \(\gamma_{1}= \gamma= -\gamma_{2}\), equation (51) would read \(\mathcal{H}( \{ b_{k}^{\gamma} \}) \ge \mathcal{A}( \{ b_{k}^{\gamma} \})\), which is false in general (in fact the reverse inequality holds).

Consider now \(H(\{p_{k}\})= -\sum_{k} p_{k}\log p_{k} \), the Shannon entropy of \(\{p_{k}\}\).

Corollary 10

For any probability distribution \(\{p_{k}\}\) the following inequality holds:

$$ H\bigl(\{p_{k}\}\bigr) \le\mathcal{A}\bigl(\{-\log p_{k}\}\bigr), $$
(52)

where, if \(p_{i} =0\) for some i, we use the conventions \(p_{i}\log p_{i} =\lim_{p_{i} \rightarrow0} p_{i}\log p_{i} = 0\) and \(-\log p_{i} = \lim_{p_{i} \rightarrow 0} (-\log p_{i}) = + \infty\).

Proof

The limiting cases are trivial, thus let us assume \(\forall i, p_{i} >0\). As the logarithm function is increasing, given the sequence \(\{\log p_{k}\}\), and the weights

$$\alpha'_{k} = 1/ M $$

and \(\alpha_{k} = p_{k} \), we have

$$\begin{aligned} p_{i} \le p_{j} \quad\Leftrightarrow\quad \log p_{i} \le\log p_{j} \quad\Rightarrow\quad \frac {\alpha'_{i}}{\alpha'_{j}} = 1 \ge\frac{p_{i}}{p_{j}} = \frac{\alpha _{i}}{\alpha_{j}} \end{aligned}$$
(53)

and we can apply Theorem 2. □

In information theory [8], the value \(-\log p_{i}\) is considered the information of outcome i; thus Corollary 10 says that the expected value of the information is less than or equal to its unweighted average.

Corollary 11

For any strictly positive probability distribution \(\{p_{k}\}\) the following inequality holds:

$$ H\biggl(\biggl\{ \frac{p_{k}^{-1}}{ \sum_{k} p_{k}^{-1}}\biggr\} \biggr) + H\bigl( \{p_{k}\}\bigr) \le\log \biggl( \sum_{k} p_{k}^{-1} \biggr). $$
(54)

Proof

We apply Theorem 2 to the sequence \(\{\log p_{k}\}\), with weights

$$\alpha'_{k} = \frac{p_{k}^{-1}}{ \sum_{k} p_{k}^{-1}} $$

and \(\alpha_{k} = p_{k} \),

$$\begin{aligned} &\sum_{k} \frac{p_{k}^{-1}}{ \sum_{k} p_{k}^{-1}} \log p_{k} \le \sum_{k} p_{k} \log p_{k}, \\ &{-} \sum_{k} \frac{p_{k}^{-1}}{ \sum_{k} p_{k}^{-1}} \log p_{k}^{-1} \le \sum_{k} p_{k} \log p_{k}, \\ &{-} \sum_{k} \frac{p_{k}^{-1}}{ \sum_{k} p_{k}^{-1}} \log p_{k}^{-1} + \log \biggl( \sum_{k} p_{k}^{-1} \biggr)\le \sum_{k} p_{k} \log p_{k} + \log \biggl( \sum _{k} p_{k}^{-1} \biggr), \\ &{-} \sum_{k} \frac{p_{k}^{-1}}{ \sum_{k} p_{k}^{-1}} \log \frac{p_{k}^{-1}}{ \sum_{k} p_{k}^{-1}} - \sum_{k} p_{k} \log p_{k} \le \log \biggl( \sum_{k} p_{k}^{-1} \biggr). \end{aligned}$$
(55)

 □
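Inequality (54) relates the Shannon entropies of a distribution and of its normalized inverse; a random numerical check (our own illustration, restricted to strictly positive distributions as the corollary requires):

```python
import math
import random

def shannon(p):
    """Shannon entropy (natural log) of a strictly positive distribution."""
    return -sum(pk * math.log(pk) for pk in p)

rng = random.Random(6)
ok = True
for _ in range(1000):
    raw = [rng.uniform(0.05, 1.0) for _ in range(rng.randint(2, 8))]
    total = sum(raw)
    p = [r / total for r in raw]
    Z = sum(1.0 / pk for pk in p)          # sum of inverse probabilities
    q = [(1.0 / pk) / Z for pk in p]       # normalized inverse distribution
    ok = ok and shannon(q) + shannon(p) <= math.log(Z) + 1e-9
print(ok)  # True
```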

The Tsallis entropy with entropic index q of probability distribution \(\{p_{k}\}\) [9, 10] is defined as

$$ S_{q} \bigl(\{p_{k}\}\bigr) = \frac{1}{q-1} \Biggl( 1- \sum_{k=1}^{M} p_{k}^{q} \Biggr). $$
(56)

Corollary 12

For any probability distribution \(\{p_{k}\}\), and \(q, r\) both positive or both negative, the following inequality holds:

$$ \bigl(1-(q-1) S_{q} \bigl(\{p_{k}\}\bigr) \bigr) \bigl(1-(r-1) S_{r} \bigl(\{p_{k}\}\bigr)\bigr) \le M \bigl(1-(r+q-1) S_{r+q} \bigl(\{p_{k}\}\bigr)\bigr). $$
(57)

Proof

From the definition of the Tsallis entropy, equation (56), we have

$$ \sum_{k=1}^{M} p_{k}^{q} = 1 - (q-1) S_{q} \bigl(\{p_{k} \}\bigr). $$
(58)

Consider first both \(q, r\) positive. For simplicity, let us assume that the null \(p_{k}\) members of the sequence are the last ones. Applying Theorem 2 to the sequence \(\{ p_{k}^{q} \}_{k=1}^{M^{\star}}\) of the \(M^{\star} \le M\) strictly positive terms, with weights

$$\begin{aligned} &\alpha'_{k} = \frac{1}{M^{\star}} , \\ &\alpha_{k} = \frac{p_{k}^{r}}{\sum_{k=1}^{M^{\star}} p_{k}^{r}}, \end{aligned}$$
(59)

we obtain

$$\begin{aligned} &\frac{1}{M^{\star}} \sum_{k=1}^{M^{\star}} p_{k}^{q} \le \frac{1}{ ( \sum_{k=1}^{M^{\star}} p_{k}^{r} )} \sum _{k=1}^{M^{\star}} p_{k}^{q+r} , \\ & \Biggl( \sum_{k=1}^{M^{\star}} p_{k}^{q} \Biggr) \Biggl( \sum_{k=1}^{M^{\star}} p_{k}^{r} \Biggr) \le M^{\star} \sum _{k=1}^{M^{\star}} p_{k}^{q+r} , \\ & \Biggl( \sum_{k=1}^{M} p_{k}^{q} \Biggr) \Biggl( \sum_{k=1}^{M} p_{k}^{r} \Biggr) = \Biggl( \sum _{k=1}^{M^{\star}} p_{k}^{q} \Biggr) \Biggl( \sum_{k=1}^{M^{\star}} p_{k}^{r} \Biggr) \le M^{\star} \sum_{k=1}^{M^{\star}} p_{k}^{q+r} \le M \sum_{k=1}^{M} p_{k}^{q+r}, \end{aligned}$$
(60)

where in the last inequality in equation (60) we have expanded the sums with all the null \(p_{k}\) probabilities. Using now equation (58) we obtain equation (57). Consider now both \(q, r\) negative. For the sake of simplicity, we assume all the \(p_{k} >0\); otherwise we proceed as in the \(q, r\) positive case. Applying Theorem 1, equation (13), to the sequence \(\{p_{k}^{-|q|}\}\) with weights

$$\begin{aligned} &\alpha'_{k} = \frac{p_{k}^{r}}{\sum_{k=1}^{M} p_{k}^{r}} , \\ &\alpha_{k} = \frac{1}{M}, \end{aligned}$$
(61)

we obtain again equation (60) and using equation (58) we obtain equation (57). □
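Inequality (57) can be probed numerically for entropic indices of either common sign; the harness below is our own sketch and skips indices near the singular value 1 of the Tsallis definition:

```python
import random

def tsallis(p, q):
    """Tsallis entropy S_q, equation (56), for entropic index q != 1."""
    return (1.0 - sum(pk ** q for pk in p)) / (q - 1.0)

rng = random.Random(7)
ok = True
for _ in range(1000):
    raw = [rng.uniform(0.05, 1.0) for _ in range(rng.randint(2, 8))]
    total = sum(raw)
    p = [r / total for r in raw]
    M = len(p)
    sign = rng.choice([1.0, -1.0])                      # q, r of the same sign
    q, r = sign * rng.uniform(0.1, 2.0), sign * rng.uniform(0.1, 2.0)
    if min(abs(q - 1), abs(r - 1), abs(q + r - 1)) < 1e-6:
        continue  # avoid indices where the Tsallis definition is singular
    lhs = (1 - (q - 1) * tsallis(p, q)) * (1 - (r - 1) * tsallis(p, r))
    rhs = M * (1 - (q + r - 1) * tsallis(p, q + r))
    ok = ok and lhs <= rhs * (1 + 1e-9)
print(ok)  # True
```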

The Rényi entropy of order \(\beta\ge0\) [11] is defined as

$$ H_{\beta} \bigl(\{p_{k}\}\bigr) = \frac{1}{1- \beta} \log \Biggl( \sum_{k=1}^{M} p_{k}^{\beta} \Biggr). $$
(62)

Corollary 13

For any probability distribution \(\{p_{k}\}\), and any \(\beta, \gamma\ge 0\) the following inequality holds:

$$ (1-\beta) H_{\beta} \bigl(\{p_{k}\}\bigr)+ (1- \gamma) H_{\gamma} \bigl(\{p_{k}\}\bigr)\le \bigl(1-(\beta+\gamma) \bigr) H_{\beta+\gamma} \bigl(\{p_{k}\}\bigr)+ \log M. $$
(63)

Proof

For the sake of simplicity, we assume all the \(p_{k} >0\); otherwise we proceed as in Corollary 12. Using the weights in equation (59) with \(r = \gamma\) and \(M^{\star} = M\), and applying Theorem 2, we obtain a result similar to equation (60),

$$\begin{aligned} \Biggl( \sum_{k=1}^{M} p_{k}^{\beta} \Biggr) \Biggl( \sum_{k=1}^{M} p_{k}^{\gamma} \Biggr) \le M \sum_{k=1}^{M} p_{k}^{\beta+\gamma}. \end{aligned}$$
(64)

As the logarithm is an increasing function

$$\begin{aligned} \log \Biggl( \sum_{k=1}^{M} p_{k}^{\beta} \Biggr) + \log \Biggl( \sum _{k=1}^{M} p_{k}^{\gamma} \Biggr) \le \log M + \log \Biggl( \sum_{k=1}^{M} p_{k}^{\beta +\gamma} \Biggr), \end{aligned}$$
(65)

by substituting

$$ \log \Biggl( \sum_{k=1}^{M} p_{k}^{\beta} \Biggr) = ({1- \beta} ) H_{\beta} \bigl( \{p_{k}\}\bigr), $$
(66)

we obtain the result. □
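Inequality (63) is the logarithmic form of equation (64), and a direct numerical check (our own illustration, skipping orders near the singular value 1) is straightforward:

```python
import math
import random

def renyi(p, beta):
    """Rényi entropy of order beta >= 0, beta != 1, equation (62)."""
    return math.log(sum(pk ** beta for pk in p)) / (1.0 - beta)

rng = random.Random(8)
ok = True
for _ in range(1000):
    raw = [rng.uniform(0.05, 1.0) for _ in range(rng.randint(2, 8))]
    total = sum(raw)
    p = [r / total for r in raw]
    M = len(p)
    beta, gamma = rng.uniform(0.0, 3.0), rng.uniform(0.0, 3.0)
    if min(abs(beta - 1), abs(gamma - 1), abs(beta + gamma - 1)) < 1e-6:
        continue  # avoid the singular order 1 in the definition
    lhs = (1 - beta) * renyi(p, beta) + (1 - gamma) * renyi(p, gamma)
    rhs = (1 - (beta + gamma)) * renyi(p, beta + gamma) + math.log(M)
    ok = ok and lhs <= rhs + 1e-9
print(ok)  # True
```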

Let us see now that Theorem 1 extends to a weighted geometric mean.

Theorem 3

Consider a sequence of M strictly positive numbers \(\{ b_{k} \}\) and strictly positive weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\), \(\sum_{k} \alpha_{k} =1, \sum_{k} \alpha'_{k} =1\). Consider that the weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\) obey the following condition:

$$ \forall(b_{i},b_{j}),\quad b_{i} \le b_{j} \quad\Rightarrow\quad\alpha'_{i} / \alpha'_{j} \ge \alpha_{i}/ \alpha_{j}. $$
(67)

Then the following inequality holds:

$$ \Pi_{k} b_{k}^{ \alpha'_{k}} \le \Pi_{k} b_{k}^{ \alpha_{k}}. $$
(68)

Proof

Taking logarithms in both sides of equation (68), as \(\log(x)\) is an increasing function, equation (68) is equivalent to

$$ \sum_{k} { \alpha'_{k}} \log b_{k} \le\sum_{k} { \alpha_{k}} \log b_{k} . $$
(69)

As \(b_{i} \le b_{j} \Leftrightarrow\log b_{i} \le\log b_{j}\), we could apply Theorem 2 except for the fact that the sequence \(\{ \log b_{k} \} \) can contain negative numbers. But this is not a problem, by Remark 1 to Theorem 2. □
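The weighted geometric mean version can be checked with the same ratio-based weight construction used for Theorems 1 and 2 (our own harness; the geometric means are computed as exponentials of weighted log-sums):

```python
import math
import random

def random_weight_pair(M, rng):
    """(alpha, alpha_prime) with alpha'_k / alpha_k non-increasing,
    so condition (67) holds for an increasing sequence b."""
    alpha = [rng.uniform(0.1, 1.0) for _ in range(M)]
    ratios = sorted((rng.uniform(0.1, 1.0) for _ in range(M)), reverse=True)
    alpha_p = [a * r for a, r in zip(alpha, ratios)]
    sa, sp = sum(alpha), sum(alpha_p)
    return [a / sa for a in alpha], [a / sp for a in alpha_p]

def weighted_gmean(b, w):
    """Weighted geometric mean prod(b_k ** w_k), via logs."""
    return math.exp(sum(wk * math.log(x) for wk, x in zip(w, b)))

rng = random.Random(9)
ok = True
for _ in range(1000):
    M = rng.randint(2, 8)
    b = sorted(rng.uniform(0.1, 10.0) for _ in range(M))
    alpha, alpha_p = random_weight_pair(M, rng)
    ok = ok and weighted_gmean(b, alpha_p) <= weighted_gmean(b, alpha) * (1 + 1e-9)
print(ok)  # True
```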

Let us see now that Theorem 1 also extends to weighted generalized (or power) mean:

Theorem 4

Consider a sequence of M strictly positive numbers \(\{ b_{k} \}\) and strictly positive weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\), \(\sum_{k} \alpha_{k} =1, \sum_{k} \alpha'_{k} =1\). Consider that the weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\) obey the following condition:

$$ \forall(b_{i},b_{j}),\quad b_{i} \le b_{j} \quad\Rightarrow\quad\alpha'_{i} / \alpha'_{j} \ge \alpha_{i}/ \alpha_{j}. $$
(70)

Then the following inequality holds:

$$ \biggl( \sum_{k} { \alpha'_{k}} b_{k}^{p} \biggr)^{1/p} \le \biggl( \sum_{k} { \alpha_{k}} b_{k}^{p} \biggr)^{1/p} $$
(71)

for any \(p \ne0\) (for \(p=0\) the power mean is defined as the weighted geometric mean).

Proof

Observe that the power function is increasing when the exponent is positive and decreasing when it is negative. Assume first \(p > 0\); then equation (71) is equivalent to

$$ \sum_{k} { \alpha'_{k}} b_{k}^{p} \le \sum_{k} { \alpha_{k}} b_{k}^{p} . $$
(72)

But as \(b_{i} \le b_{j} \Leftrightarrow b_{i}^{p} \le b_{j}^{p} \) we just apply Theorem 2.

When \(p < 0\) we just apply Theorem 1, equation (13), to the sequence \(\{b_{k}^{-p}\}\); raising to the (negative) power \(1/p\) then reverses the inequality. □
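Both the positive and negative exponent cases of Theorem 4 can be exercised in one random check (our own harness, with the same admissible-weight construction as before):

```python
import random

def random_weight_pair(M, rng):
    """(alpha, alpha_prime) with alpha'_k / alpha_k non-increasing,
    so condition (70) holds for an increasing sequence b."""
    alpha = [rng.uniform(0.1, 1.0) for _ in range(M)]
    ratios = sorted((rng.uniform(0.1, 1.0) for _ in range(M)), reverse=True)
    alpha_p = [a * r for a, r in zip(alpha, ratios)]
    sa, sp = sum(alpha), sum(alpha_p)
    return [a / sa for a in alpha], [a / sp for a in alpha_p]

def power_mean(b, w, p):
    """Weighted power mean of order p != 0."""
    return sum(wk * x ** p for wk, x in zip(w, b)) ** (1.0 / p)

rng = random.Random(10)
ok = True
for _ in range(1000):
    M = rng.randint(2, 8)
    b = sorted(rng.uniform(0.1, 10.0) for _ in range(M))
    alpha, alpha_p = random_weight_pair(M, rng)
    p = rng.choice([-1.0, 1.0]) * rng.uniform(0.1, 3.0)  # p of either sign
    ok = ok and power_mean(b, alpha_p, p) <= power_mean(b, alpha, p) * (1 + 1e-9)
print(ok)  # True
```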

Theorem 5 below extends Theorem 1 to the quasi-arithmetic or Kolmogorov generalized weighted mean.

Theorem 5

Let \(f(x)\) be an invertible strictly positive monotonic function, with inverse function \(f^{-1}(x)\). Consider a sequence of M strictly positive numbers \(\{ b_{k} \}\) and strictly positive weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\), \(\sum_{k} \alpha _{k} =1, \sum_{k} \alpha'_{k} =1\). Consider that the weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\) obey the following condition:

$$ \forall(b_{i},b_{j}),\quad b_{i} \le b_{j} \quad\Rightarrow\quad\alpha'_{i} / \alpha'_{j} \ge \alpha_{i}/ \alpha_{j}. $$
(73)

Then the following inequality holds:

$$ f^{-1} \Bigl( \sum\alpha'_{k} f(b_{k}) \Bigr) \le f^{-1} \Bigl( \sum\alpha _{k} f(b_{k}) \Bigr). $$
(74)

Proof

Consider first \(f(x)\) increasing. Then we can apply Theorem 2 to the increasing sequence \(\{{f( b_{k})} \}\):

$$ \sum\alpha'_{k} f(b_{k}) \le\sum\alpha_{k} f(b_{k}), $$
(75)

and the result follows by applying \(f^{-1}(x)\), which is also increasing, to both terms of equation (75).

Consider now \(f(x)\) decreasing. The sequence \(\{\frac{1}{f( b_{k})} \}\) is increasing, and we can apply Theorem 1:

$$ \sum\alpha'_{k} \frac{1}{\frac{1}{f( b_{k})}} \ge\sum\alpha_{k} \frac {1}{\frac{1}{f( b_{k})}}, $$
(76)

and the result follows by applying \(f^{-1}(x)\), which is also decreasing, to both terms of equation (76). □

The mean \(\operatorname{logsumexp} (\{ b_{k} \})\) is defined as

$$ \operatorname{logsumexp} \bigl(\{ b_{k} \}\bigr) = \log \biggl( \sum_{k} {e}^{b_{k} } \biggr). $$
(77)

We can state then the following corollary to Theorem 5.

Corollary 14

Given a strictly positive sequence \(\{ b_{k} \}, 1 \le k \le M\), we have

$$ 2 \operatorname{logsumexp} \bigl(\{ b_{k} \}\bigr) \le \operatorname{logsumexp} \bigl(\{ 2 b_{k} \}\bigr) + \log M. $$
(78)

Proof

Observe that we can write

$$ \log \biggl( \frac{ \sum_{k} {e}^{b_{k} }}{M} \biggr) = \operatorname{logsumexp} \bigl(\{ b_{k} \}\bigr) - \log M, $$
(79)

which is a Kolmogorov mean with \(f(x) = e^{x}\). Applying now Theorem 5 to this mean with weights

$$\begin{aligned} &\alpha'_{k} = 1/ M, \\ &\alpha_{k} = \frac{e^{b_{k}}}{ \sum_{k} e^{b_{k}}}, \end{aligned}$$
(80)

we obtain

$$\begin{aligned} &\log \biggl( \frac{ \sum_{k} {e}^{b_{k} }}{M} \biggr) \le \log \biggl( \frac{ \sum_{k} e^{2b_{k}}}{ \sum_{k} e^{b_{k}}} \biggr), \\ &\log \biggl( \sum_{k} {e}^{b_{k} } \biggr) - \log M \le \log \biggl( \sum_{k} e^{2b_{k}} \biggr) - \log \biggl( \sum_{k} {e}^{b_{k} } \biggr) , \\ &2 \log \biggl( \sum_{k} {e}^{b_{k} } \biggr) \le \log \biggl( \sum_{k} e^{2b_{k}} \biggr) + \log M, \end{aligned}$$
(81)

which, by the definition of the logsumexp function, equation (77), is precisely equation (78). □
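Inequality (78) admits a direct numerical check; in the sketch below (our own illustration) `logsumexp` uses the standard max-shift for numerical stability, which does not change its value:

```python
import math
import random

def logsumexp(b):
    """log(sum(exp(b_k))), equation (77), with the usual max-shift."""
    m = max(b)
    return m + math.log(sum(math.exp(x - m) for x in b))

rng = random.Random(11)
ok = True
for _ in range(1000):
    b = [rng.uniform(0.1, 10.0) for _ in range(rng.randint(2, 10))]
    M = len(b)
    ok = ok and 2 * logsumexp(b) <= logsumexp([2 * x for x in b]) + math.log(M) + 1e-9
print(ok)  # True
```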

Corollary 15

Given a strictly positive sequence \(\{ b_{k} \}, 1 \le k \le M\), we have

$$ \operatorname{logsumexp} \bigl(\{ -b_{k} \}\bigr) + \operatorname{logsumexp} \bigl(\{ b_{k} \}\bigr) \ge2 \log M. $$
(82)

Proof

Apply Theorem 5 to the Kolmogorov mean with \(f(x) = e^{x}\) with weights

$$ \begin{aligned}[t] &\alpha'_{k} = \frac{e^{-b_{k}}}{ \sum_{k} e^{-b_{k}}} ,\\ &\alpha_{k} = 1/ M. \end{aligned} $$
(83)

 □

Observe now that, for \(\{ b_{k} \}\) sorted in increasing order, the condition \(\alpha'_{i} / \alpha'_{j} \ge\alpha_{i}/ \alpha_{j}\) appearing in Theorems 1-5 is equivalent to the condition of decreasing quotients:

$$ \alpha'_{1} / \alpha_{1} \ge \alpha'_{2}/ \alpha_{2} \ge\cdots\ge \alpha'_{M}/ \alpha_{M}. $$
(84)
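
The decreasing-quotients condition of equation (84) is straightforward to test for given weight sequences; in the following Python sketch the function name is our own:

```python
def has_decreasing_quotients(ap, a, tol=1e-12):
    """Check alpha'_1/alpha_1 >= alpha'_2/alpha_2 >= ... (equation (84))."""
    r = [x / y for x, y in zip(ap, a)]
    return all(r[k] >= r[k + 1] - tol for k in range(len(r) - 1))

# quotients 2.5 >= 1.0 >= 0.4: condition holds
assert has_decreasing_quotients([0.5, 0.3, 0.2], [0.2, 0.3, 0.5])
# quotients 0.4 <= 1.0 <= 2.5: condition fails
assert not has_decreasing_quotients([0.2, 0.3, 0.5], [0.5, 0.3, 0.2])
```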

Thus we can immediately extend the previous Theorems 1-5 to the following one.

Theorem 6

Let \(\mathcal{M} (\{ b_{k} \}, \{ \alpha_{k} \})\) be any of the means appearing in Theorems 1-5, of a sequence \(\{ b_{k} \}\) of M strictly positive numbers, with \(\{ \alpha _{k} \}\), \(\{ \alpha'_{k} \}\) strictly positive weights obeying the condition

$$ \forall(b_{i},b_{j}),\quad b_{i} \le b_{j} \quad\Rightarrow\quad\alpha'_{i} / \alpha'_{j} \ge \alpha_{i}/ \alpha_{j}. $$
(85)

Then, for any subsequence of \(M^{\star}\) numbers \(\{ b_{l} \}\) of \(\{ b_{k} \}\), \(M^{\star} \le M\), with their corresponding normalized weights \(\{ \alpha_{l} \}\), \(\{ \alpha '_{l} \}\), we have

$$ \mathcal{M}\bigl(\{ b_{l} \}, \bigl\{ \alpha'_{l} \bigr\} \bigr) \le\mathcal{M}\bigl(\{ b_{l} \}, \{ \alpha _{l} \}\bigr). $$
(86)

Finally, we consider the following theorem.

Theorem 7

Let \(f(x)\) be an invertible strictly positive monotonic function, with inverse function \(f^{-1}(x)\). Consider a sequence of M strictly positive numbers \(\{ b_{k} \}\) and functions \(g(x), h(x), h'(x)\) (here \(h'\) denotes a second function, not the derivative of h) strictly positive on the domain \([ \min_{k} \{ b_{k} \}, \max_{k} \{ b_{k} \}]\), and such that on the same domain \(g(x)\) is increasing and \(h'(x)/h(x)\) is decreasing. Then the following inequality holds:

$$ f^{-1} \biggl( \frac{\sum_{k} h'(b_{k}) f(g(b_{k}))}{\sum_{k} h'(b_{k})} \biggr) \le f^{-1} \biggl( \frac{\sum_{k} h(b_{k}) f(g(b_{k}))}{\sum_{k} h(b_{k})} \biggr). $$
(87)

Proof

Apply Theorem 5 to the sequence \(\{ g(b_{k}) \}\) with weights \(\alpha'_{k}= \frac{h'(b_{k})}{\sum_{j} h'(b_{j})} \) and \(\alpha_{k} = \frac{h(b_{k})}{\sum_{j} h(b_{j})}\). □

Observe that Corollaries 3-15 can be considered as applications of Theorem 7.
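
As a concrete instance of Theorem 7, take \(f(x)=e^{x}\), \(g(x)=x\), \(h(x)=x\), and \(h'(x)=1\), so that \(h'(x)/h(x)=1/x\) is decreasing on a positive domain; inequality (87) can then be checked numerically (an illustration of ours):

```python
import math
import random

random.seed(1)
b = [random.uniform(0.5, 3.0) for _ in range(6)]
f, finv = math.exp, math.log
g = lambda x: x        # increasing on the domain
h = lambda x: x        # h'(x)/h(x) = 1/x is decreasing for positive x
hp = lambda x: 1.0     # hp plays the role of h'

# both sides of inequality (87)
lhs = finv(sum(hp(x) * f(g(x)) for x in b) / sum(hp(x) for x in b))
rhs = finv(sum(h(x) * f(g(x)) for x in b) / sum(h(x) for x in b))
assert lhs <= rhs + 1e-12
```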

2.2 Relationship to majorization

Consider the sequences \(\{x_{k}\}\), \(\{y_{k} \}\), and renumber the indices so that \(\{x_{1} \ge x_{2} \ge\cdots\ge x_{M} \ge0 \}, \{y_{1} \ge y_{2} \ge \cdots\ge y_{M} \ge0 \}\). The sequence \(\{ x_{k} \}\) is said to majorize the sequence \(\{ y_{k} \}\) [5, 12], written \(\{ x_{k} \} \succ\{ y_{k} \}\), when the following inequalities hold:

$$\begin{aligned} &x_{1} \ge y_{1}, \\ &x_{1} + x_{2} \ge y_{1} + y_{2}, \\ &\cdots \\ &x_{1} + x_{2} + \cdots+ x_{M-1} \ge y_{1} + y_{2} +\cdots+ y_{M-1}, \\ &x_{1} + x_{2} + \cdots+ x_{M-1}+ x_{M} = y_{1} + y_{2}+ \cdots+ y_{M-1} + y_{M}. \end{aligned}$$
(88)
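
For experimentation, the majorization conditions of equation (88) can be packaged as a small test; in this Python sketch the function name is ours:

```python
def majorizes(x, y, tol=1e-12):
    """Return True when {x_k} majorizes {y_k} in the sense of equation (88)."""
    xs, ys = sorted(x, reverse=True), sorted(y, reverse=True)
    if abs(sum(xs) - sum(ys)) > tol:       # equal totals are required
        return False
    px = py = 0.0
    for a, b in zip(xs[:-1], ys[:-1]):     # partial sums of the decreasing orders
        px += a
        py += b
        if px < py - tol:
            return False
    return True

# the example pair used in the text
assert majorizes([3, 2, 1, 1], [2, 2, 2, 1])
assert not majorizes([2, 2, 2, 1], [3, 2, 1, 1])
```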

In general, sequences \(\{\alpha'_{k}\}, \{\alpha_{k}\}\) fulfilling the condition in equation (7) do not majorize each other, and we can find sequences that majorize each other but do not fulfill equation (7): consider, for instance, the sequences \(\{ 3,2,1,1\}\) and \(\{2,2,2,1\}\); they do not fulfill equation (7), yet \(\{3,2,1,1\} \succ\{2,2,2,1 \}\). We now determine when the two conditions, majorization and equation (7), coincide. Let us first prove the following lemma.

Lemma 2

Consider the sequences of M strictly positive weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\), \(\sum_{k} \alpha_{k} =1, \sum_{k} \alpha'_{k} =1\). If for any sequence of M strictly positive numbers in increasing order \(\{ b_{k} \}\) the following inequality holds:

$$ \sum_{k} \alpha'_{k} b_{k} \le\sum_{k} \alpha_{k} b_{k}, $$
(89)

then the following inequalities also hold:

$$\begin{aligned} &\alpha'_{1} \ge \alpha_{1}, \\ &\alpha'_{1}+ \alpha'_{2} \ge \alpha_{1} + \alpha_{2}, \\ &\vdots \\ &\alpha'_{1}+ \cdots+ \alpha'_{M-1} \ge \alpha_{1} + \cdots+ \alpha _{M-1}, \\ &\alpha'_{1}+ \cdots+ \alpha'_{M-1} + \alpha'_{M} = \alpha_{1} + \cdots + \alpha_{M-1} + \alpha_{M}, \\ &\alpha_{M} \ge \alpha'_{M}, \\ &\alpha_{M}+ \alpha_{M-1} \ge \alpha'_{M}+ \alpha'_{M-1}, \\ &\vdots \\ &\alpha_{M}+ \cdots+ \alpha_{2} \ge \alpha'_{M}+ \cdots+ \alpha'_{2}. \end{aligned}$$
(90)

Proof

Consider the increasing sequence \(\{b_{1},\ldots,b_{1},b_{M},\ldots,b_{M}\}\), \(b_{1} < b_{M}\), where \(b_{1}\) is written l times and \(b_{M}\) is written \(M-l\) times, and denote \(\mathbf{L} = \alpha_{1}+\cdots+\alpha_{l}\), \(\mathbf{L}' = \alpha'_{1}+\cdots+\alpha'_{l}\). Since \(\alpha_{l+1}+\cdots+\alpha_{M} = 1-\mathbf{L}\) and \(\alpha'_{l+1}+\cdots+\alpha'_{M} = 1-\mathbf{L}'\), the inequality (89) gives

$$\mathbf{L}' b_{1} + \bigl(1-\mathbf{L}' \bigr) b_{M} - \mathbf{L} b_{1} - (1- \mathbf{L}) b_{M} \le0, $$

i.e.,

$$\bigl(\mathbf{L}' -\mathbf{L}\bigr) (b_{1}-b_{M}) \le0 \quad\Rightarrow\quad\mathbf{L}' \ge \mathbf{L}. $$

This proves the first \(M-1\) inequalities. Observe now that

$$\mathbf{L}' \ge \mathbf{L}\quad \Rightarrow \quad 1 - \mathbf{L} \ge 1-\mathbf{L}', $$

and this accounts for the last \(M-1\) inequalities. □

We can then state Theorem 8.

Theorem 8

Consider the sequences of M increasing strictly positive weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\), \(\sum_{k} \alpha _{k} =1, \sum_{k} \alpha'_{k} =1\), with the following condition (equation (10)):

$$\begin{aligned} \forall(i,j),\quad 1 \le i,j \le M, i \le j\quad \Rightarrow \quad\alpha'_{i} / \alpha '_{j} \ge\alpha_{i}/ \alpha_{j}, \end{aligned}$$
(91)

or equivalently

$$\forall(i,j),\quad 1 \le i,j \le M, i \le j \quad \Rightarrow\quad \alpha'_{i} / \alpha _{i} \ge \alpha'_{j}/ \alpha_{j}, $$

then the following majorization holds:

$$\begin{aligned} \{ \alpha_{k} \} \succ\bigl\{ \alpha'_{k} \bigr\} . \end{aligned}$$
(92)

Proof

Observe first that equation (91) (equation (10)) is equivalent to equation (7) for sequences of M increasing strictly positive numbers \(\{ b_{k} \}\). We can then apply Theorem 1 and obtain condition (89) of Lemma 2. It then suffices to apply Lemma 2, which guarantees that the majorization inequalities, equation (88), are fulfilled for the decreasing sequences \(\{ \alpha_{M+1-k} \}, \{ \alpha '_{M+1-k}\}\). □
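
Theorem 8 can be illustrated numerically with the increasing weights \(\alpha_{k} \propto k\) and \(\alpha'_{k} \propto \sqrt{k}\) (an example of ours): the ratio \(\alpha'_{k}/\alpha_{k} \propto 1/\sqrt{k}\) is decreasing, and the partial sums confirm the predicted majorization:

```python
import math

M = 6
raw_a = list(range(1, M + 1))                      # increasing weights, alpha_k ~ k
raw_ap = [math.sqrt(k) for k in range(1, M + 1)]   # increasing weights, alpha'_k ~ sqrt(k)
a = [x / sum(raw_a) for x in raw_a]
ap = [x / sum(raw_ap) for x in raw_ap]
# alpha'_k/alpha_k ~ 1/sqrt(k) decreases, so condition (91) holds and Theorem 8
# predicts that {alpha_k} majorizes {alpha'_k}: check the partial sums of the
# decreasing rearrangements, as in equation (88)
da, dap = sorted(a, reverse=True), sorted(ap, reverse=True)
pa = pap = 0.0
for k in range(M - 1):
    pa += da[k]
    pap += dap[k]
    assert pa >= pap - 1e-12
```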

Observe that the weights in Corollary 6 are such that \(\{ \alpha_{k} \} \succ\{ \alpha'_{k}\}\), and in this way Corollary 6 can be proved by direct application of Lemma 1 in [12].

A similar theorem can be proved for decreasing weights.

Theorem 9

Consider the sequences of M decreasing strictly positive weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\), \(\sum_{k} \alpha _{k} =1, \sum_{k} \alpha'_{k} =1\), with the following condition (equation (10)):

$$\begin{aligned} \forall(i,j),\quad 1 \le i,j \le M, i \le j \quad\Rightarrow \quad\alpha'_{i} / \alpha '_{j} \ge\alpha_{i}/ \alpha_{j}, \end{aligned}$$
(93)

or equivalently

$$\forall(i,j),\quad 1 \le i,j \le M, i \le j \quad \Rightarrow\quad \alpha'_{i} / \alpha _{i} \ge \alpha'_{j}/ \alpha_{j}, $$

then the following majorization holds:

$$\begin{aligned} \bigl\{ \alpha'_{k} \bigr\} \succ\{ \alpha_{k}\} . \end{aligned}$$
(94)

Proof

The same proof as for Theorem 8 holds. □

Theorem 10

Consider a convex function \(f(x)\), and the sequences of M strictly positive weights \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \} \), \(\sum_{k} \alpha_{k} =1, \sum_{k} \alpha'_{k} =1\), with the following condition (equation (10)):

$$\begin{aligned} \forall(i,j),\quad 1 \le i,j \le M, i \le j\quad \Rightarrow \quad\alpha'_{i} / \alpha '_{j} \ge\alpha_{i}/ \alpha_{j}, \end{aligned}$$
(95)

or equivalently

$$\forall(i,j),\quad 1 \le i,j \le M, i \le j \quad \Rightarrow\quad \alpha'_{i} / \alpha _{i} \ge \alpha'_{j}/ \alpha_{j}, $$

then the following holds:

If both \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\) are increasing then

$$ \sum_{k} f(\alpha_{k}) \ge\sum_{k} f\bigl(\alpha'_{k} \bigr). $$
(96)

If both \(\{ \alpha_{k} \}\) and \(\{ \alpha'_{k} \}\) are decreasing then

$$ \sum_{k} f\bigl( \alpha'_{k}\bigr) \ge\sum_{k} f(\alpha_{k}). $$
(97)

Proof

It is enough to apply Theorems 8 and 9 together with the Hardy-Littlewood-Pólya theorem [5, 13] on majorization. □
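
As an illustration of Theorem 10, take the increasing weights \(\alpha_{k} \propto k\), \(\alpha'_{k} \propto \sqrt{k}\) (so \(\alpha'_{k}/\alpha_{k}\) is decreasing) and the convex function \(f(x)=x^{2}\); inequality (96) can then be checked directly (a numerical illustration of ours):

```python
import math

M = 6
raw_a = list(range(1, M + 1))                      # alpha_k ~ k, increasing
raw_ap = [math.sqrt(k) for k in range(1, M + 1)]   # alpha'_k ~ sqrt(k), increasing
a = [x / sum(raw_a) for x in raw_a]
ap = [x / sum(raw_ap) for x in raw_ap]
# alpha'_k/alpha_k ~ 1/sqrt(k) decreases, so equation (95) holds; with the
# convex f(x) = x^2, equation (96) predicts sum f(alpha_k) >= sum f(alpha'_k)
sa = sum(x ** 2 for x in a)
sap = sum(x ** 2 for x in ap)
assert sa >= sap - 1e-12
```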

Theorem 11

Given the sequences of strictly positive numbers \(\{ x_{1} , x_{2} , \ldots, x_{M} \}\), \(\{y_{1}, y_{2} , \ldots, y_{M} \}\) obeying the conditions \(i \le j \Rightarrow x_{i}/y_{i} \ge x_{j}/y_{j}\), and \(\sum_{k} x_{k} = \sum_{k} y_{k}\), then if both sequences are increasing

$$ \{ y_{k} \} \succ\{ x_{k} \}, $$
(98)

and if both sequences are decreasing

$$ \{ x_{k} \} \succ\{ y_{k} \}. $$
(99)

Proof

It is enough to normalize the sequences and apply Theorems 8 and 9, taking into account that majorization is invariant to a change in scale. □

Theorem 12

Given the sequences of numbers \(\{x_{1} , x_{2} , \ldots , x_{M} \}\), \(\{y_{1}, y_{2} , \ldots, y_{M} \}\) obeying the condition \(\sum_{k} x_{k} = \sum_{k} y_{k}\), let \(\{x_{1}+C , x_{2}+C , \ldots, x_{M}+C \} \), \(\{y_{1}+C, y_{2}+C , \ldots, y_{M}+C \}\) be the sequences translated by a positive constant C such that, for all k, \(x_{k}+C >0, y_{k}+C >0\). If the new sequences obey the condition \(i \le j \Rightarrow(x_{i}+C)/(y_{i}+C) \ge(x_{j}+C)/(y_{j}+C)\), then if both sequences are increasing

$$ \{ y_{k} \} \succ\{ x_{k} \}, $$
(100)

and if both sequences are decreasing

$$ \{ x_{k} \} \succ\{ y_{k} \}. $$
(101)

Proof

It is enough to normalize the translated sequences and apply Theorems 8 and 9, taking into account that majorization is invariant to a change of scale and a translation. □

Theorem 13

Given the sequences of numbers \(\{x_{1} , x_{2} , \ldots , x_{M} \}\), \(\{y_{1}, y_{2} , \ldots, y_{M} \}\) obeying the conditions \(i \le j \Rightarrow x_{i}/y_{i} \ge x_{j}/y_{j}\), \(i \le j \Rightarrow (x_{i}-y_{i}) \ge(x_{j}-y_{j})\), and \(\sum_{k} x_{k} = \sum_{k} y_{k}\), then if both sequences are increasing

$$ \{ y_{k} \} \succ\{ x_{k} \}, $$
(102)

and if both sequences are decreasing

$$ \{ x_{k} \} \succ\{ y_{k} \}. $$
(103)

Proof

Using the first two conditions, we have, for \(i \le j\) and for any positive constant C,

$$x_{i} y_{j} - x_{j} y_{i} + C \bigl( ( x_{i} - y_{i} ) - (x_{j} -y_{j}) \bigr) \ge0 \quad\Leftrightarrow\quad(x_{i}+C)/(y_{i}+C) \ge(x_{j}+C)/(y_{j}+C) . $$

We take C such that, for all k, \(x_{k}+C >0, y_{k}+C >0\), and we apply Theorem 12. □

Results for convex functions, similar to those in Theorem 10, can also be obtained for Theorems 11, 12, and 13.

2.3 Not a necessary condition

We can see with counterexamples that the sufficient condition of equation (7), appearing in Theorems 1-5, is not a necessary condition for the inequality of the means for every strictly positive sequence \(\{b_{k}\}\) when \(M\ge3\) (although it is easy to prove that it is a necessary condition for \(M=2\)).

Using the (unnormalized) weights \(\{\alpha_{k}\}= \{ 1,2,\ldots,1,2\}\), \(\{\alpha'_{k}\}= \{2,1,\ldots,2,1\}\) for even \(M \ge4\), and \(\{\alpha_{k}\}= \{1,2,\ldots,1,2,2\}\), \(\{\alpha'_{k}\}= \{2,1,\ldots,2,1,2\}\) for odd \(M\ge3\), we can see that equation (32) does not hold, while equation (33) holds for any strictly positive increasing sequence \(\{b_{k}\}\). For instance, for M even,

$$ 2 b_{1} + b_{2} + \cdots +2 b_{M-1} + b_{M} \le b_{1} + 2 b_{2} + \cdots + b_{M-1} + 2 b_{M} , $$
(104)

because, as the \(\{b_{k}\}\) are in increasing order, \(2 b_{1} + b_{2} \le b_{1} + 2 b_{2}, \ldots, 2 b_{M-1} + b_{M} \le b_{M-1} + 2 b_{M}\).
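
The counterexample can be confirmed numerically for \(M=4\): the condition of decreasing quotients fails, while the weighted arithmetic means satisfy the inequality for every increasing positive sequence tried (a sanity check of ours):

```python
import random

random.seed(2)
a = [1, 2, 1, 2]     # unnormalized {alpha_k}
ap = [2, 1, 2, 1]    # unnormalized {alpha'_k}
# the decreasing-quotients condition fails: ap/a = [2, 0.5, 2, 0.5]
r = [x / y for x, y in zip(ap, a)]
assert any(r[k] < r[k + 1] for k in range(3))
# yet the weighted arithmetic-mean inequality holds for increasing positive {b_k}
for _ in range(100):
    b = sorted(random.uniform(0.1, 10.0) for _ in range(4))
    lhs = sum(w * x for w, x in zip(ap, b)) / sum(ap)
    rhs = sum(w * x for w, x in zip(a, b)) / sum(a)
    assert lhs <= rhs + 1e-12
```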

We leave it to the reader to check the analogous counterexamples for the other means considered in this paper.

3 Results: a necessary and sufficient condition

We will see in this section that the condition found in Lemma 2 is a necessary and sufficient condition.

Theorem 14

Consider the sequences of M numbers \(\{ x_{k} \}\) and \(\{ y_{k} \}\), \(\sum_{k} x_{k} =\sum_{k} y_{k} \). Then propositions (1) to (4) are equivalent:

(1) for any sequence of M numbers in increasing order \(\{ z_{k}\}\) the following inequality holds:

$$ \sum_{k=1}^{M} x_{k} z_{k} \le\sum_{k=1}^{M} y_{k} z_{k}, $$
(105)

(2) for any sequence of M numbers in increasing order \(\{ z_{k}\}\) the following inequality holds:

$$ \sum_{k=1}^{M} y_{M-k+1} z_{k} \le\sum_{k=1}^{M} x_{M-k+1} z_{k}, $$
(106)

(3) the following inequalities hold:

$$ \begin{aligned}[b] &x_{1} \ge y_{1}, \\ &x_{1}+ x_{2} \ge y_{1} + y_{2} , \\ &\vdots \\ &x_{1}+ \cdots+ x_{M-1} \ge y_{1} + \cdots+ y_{M-1} , \\ &x_{1}+ \cdots+ x_{M-1} + x_{M} = y_{1} + \cdots+ y_{M-1} + y_{M}, \end{aligned} $$
(107)

(4) the following inequalities hold:

$$\begin{aligned} &y_{M} \ge x_{M}, \\ &y_{M}+ y_{M-1} \ge x_{M}+ x_{M-1}, \\ &\vdots \\ &y_{M}+ \cdots+ y_{2} \ge x_{M}+ \cdots+ x_{2}, \\ &y_{1} + \cdots+ y_{M-1} + y_{M} = x_{1}+ \cdots+ x_{M-1} + x_{M} . \end{aligned}$$
(108)

Proof

Let us see that proposition (1) implies (3) and (4). The proof is similar to the one in Lemma 2. Consider the increasing sequence \(\{z_{1},\ldots, z_{1}, z_{M}, \ldots, z_{M}\} \), \(z_{1} < z_{M}\), where \(z_{1}\) is written l times and \(z_{M}\) is written \(M-l\) times, and denote \(\mathbf{L}' = x_{1}+\cdots+x_{l}\), \(\mathbf{L} = y_{1}+\cdots+y_{l}\), and \(C = \sum_{k} x_{k} = \sum_{k} y_{k}\). Since \(x_{l+1}+\cdots+x_{M} = C- \mathbf{L}'\) and \(y_{l+1}+\cdots+y_{M} = C-\mathbf{L}\), the inequality (105) gives

$$\mathbf{L}' z_{1} + \bigl(C-\mathbf{L}' \bigr) z_{M} - \mathbf{L} z_{1} - (C- \mathbf{L}) z_{M} \le0, $$

i.e.,

$$\bigl(\mathbf{L}' -\mathbf{L}\bigr) (z_{1}-z_{M}) \le0 \quad \Rightarrow\quad \mathbf{L}' \ge \mathbf{L}. $$

This proves that (1) implies (3). To prove that (1) implies (4) observe that

$$\mathbf{L}' \ge\mathbf{L} \quad \Leftrightarrow\quad C- \mathbf{L} \ge C-\mathbf{L}'. $$

This also proves that (3) implies (4) and (4) implies (3).

To prove that (2) implies (3) and (4), consider the sequence \(\{z_{1},\ldots , z_{1}, z_{M}, \ldots, z_{M}\}\), \(z_{1} < z_{M}\), where \(z_{M}\) is written l times, and keep the same definitions as before for \(\mathbf{L}, \mathbf{L}'\); then equation (106) gives

$$(C-\mathbf{L}) z_{1} + \mathbf{L} z_{M} - \bigl(C- \mathbf{L}'\bigr) z_{1} - \mathbf{L}' z_{M} \le0, $$

and we proceed as above.

Let us see now that (3) implies (1). The proof can be found in the proof of Lemma 1 by Marjanović and Kadelburg [12], where it was used to show the majorization between two decreasing sequences weighting a third decreasing one. For the sake of completeness we repeat it here. Define, for \(1 \le k \le M\), \(A_{k} = \sum_{j=1}^{k} y_{j}, A'_{k} = \sum_{j=1}^{k} x_{j}\), and \(A_{0} = A'_{0} = 0\). Then

$$\begin{aligned} \sum_{k=1}^{M} y_{k} z_{k} - \sum_{k=1}^{M} x_{k} z_{k} &= \sum_{k=1}^{M} (y_{k} - x_{k}) z_{k} \\ &= \sum_{k=1}^{M} \bigl(A_{k}-A_{k-1} - {A'}_{k}+{A'}_{k-1}\bigr) z_{k} \\ &= \sum_{k=1}^{M} \bigl(A_{k}-{A'}_{k} \bigr) z_{k} - \sum_{k=1}^{M} \bigl({A}_{k-1}-{A'}_{k-1}\bigr) z_{k} \\ &= \sum_{k=1}^{M-1} \bigl(A_{k}-{A'}_{k} \bigr) z_{k} - \sum_{k=0}^{M-1} \bigl({A}_{k}-{A'}_{k}\bigr) z_{k+1} \\ &= \sum_{k=1}^{M-1} \bigl(A_{k}-{A'}_{k} \bigr) z_{k} - \sum_{k=1}^{M-1} \bigl({A}_{k}-{A'}_{k}\bigr) z_{k+1} \\ &= \sum_{k=1}^{M-1} \bigl(A_{k}-{A'}_{k} \bigr) (z_{k} - z_{k+1}) \ge0, \end{aligned}$$
(109)

as (3) implies that, for all k, \(A'_{k} - A_{k} \ge0 \), and \(\{ z_{k} \}\) is an increasing sequence.

Repeating the proof for the sequences \(\{y_{M-k+1}\}\) and \(\{x_{M-k+1}\} \) we find that (4) implies (2). □
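
The equivalence can be exercised numerically: the sequences \(\{3,2,1,1\}\) and \(\{2,2,2,1\}\) satisfy proposition (3), and inequality (105) then holds for random increasing \(\{z_{k}\}\) (a sanity check of ours, not a proof):

```python
import random

random.seed(3)
x = [3, 2, 1, 1]   # partial sums 3, 5, 6, 7
y = [2, 2, 2, 1]   # partial sums 2, 4, 6, 7: proposition (3) holds
for _ in range(100):
    z = sorted(random.uniform(-5.0, 5.0) for _ in range(4))
    # proposition (1), equation (105)
    lhs = sum(xk * zk for xk, zk in zip(x, z))
    rhs = sum(yk * zk for yk, zk in zip(y, z))
    assert lhs <= rhs + 1e-12
```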

Remark

The case when the sequences \(\{x_{k}\}, \{y_{k}\}\) are strictly positive and \(\sum_{k} x_{k} = \sum_{k} y_{k} = 1\) is of particular interest: it proves that the condition in Lemma 2 is necessary and sufficient, and that it can be extended to harmonic, geometric, and power means, in the same way as Theorem 1 has been extended.

Corollary 16

Rearrangement inequality

Given the sequences of real numbers \(\{x_{1} , x_{2} , \ldots, x_{M} \}\), \(\{y_{1}, y_{2} , \ldots, y_{M} \} \), then the maximum of their crossed sum is obtained when both are paired in the same (increasing or decreasing) order, while the minimum is obtained when both are paired in inverse order.

Proof

Without loss of generality, suppose that \(\{x_{1} , x_{2} , \ldots, x_{M} \}\), \(\{y_{1}, y_{2} , \ldots, y_{M} \}\) are in increasing order. Let \(\{y_{1}^{\star}, y_{2}^{\star} , \ldots, y_{M}^{\star} \}\) be any rearrangement of the sequence \(\{y_{1}, y_{2} , \ldots, y_{M} \}\). The sequences \(\{y_{1}^{\star}, y_{2}^{\star} , \ldots, y_{M}^{\star} \}\) and \(\{y_{1}, y_{2} , \ldots, y_{M} \}\) obey proposition (3) from Theorem 14; thus they obey proposition (1),

$$ \sum_{k} x_{k} y_{k}^{\star} \le\sum_{k} x_{k} y_{k}, $$
(110)

and proposition (2),

$$ \sum_{k} x_{k} y_{M-k+1} \le\sum_{k} x_{k} y_{k}^{\star}. $$
(111)

 □
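
Corollary 16 can be checked exhaustively for small M by enumerating all pairings (an illustration of ours):

```python
from itertools import permutations

x = [1, 2, 3, 4]          # increasing
y = [2, 3, 5, 7]          # increasing
sums = [sum(a * b for a, b in zip(x, p)) for p in permutations(y)]
# same-order pairing attains the maximum, reversed-order pairing the minimum
assert max(sums) == sum(a * b for a, b in zip(x, y))
assert min(sums) == sum(a * b for a, b in zip(x, reversed(y)))
```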

4 Conclusions

Motivated by the need to prove an inequality that appeared in our research on Monte Carlo multiple importance sampling (MIS), we first identified a sufficient condition for an inequality between generalized weighted means that share the sequence of numbers but differ in the weights. We then obtained a necessary and sufficient condition. We have given new proofs for Chebyshev's sum, the Cauchy-Schwarz, and the rearrangement inequalities, as well as for other interesting inequalities, and we obtained results for the Shannon, the Tsallis, and the Rényi entropies and for the logsumexp mean. We also showed the relationship of our condition to majorization.

References

  1. Veach, E, Guibas, LJ: Optimally combining sampling techniques for Monte Carlo rendering. In: Proceedings of the 22nd Annual Conference on Computer Graphics and Interactive Techniques. SIGGRAPH ’95, pp. 419-428. ACM, New York (1995). doi:10.1145/218380.218498


  2. Havran, V, Sbert, M: Optimal combination of techniques in multiple importance sampling. In: Proceedings of the 13th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry. VRCAI ’14, pp. 141-150. ACM, New York (2014). doi:10.1145/2670473.2670496


  3. Havran, V, Sbert, M: Optimal Combination of Techniques in Multiple Importance Sampling. Technical Report Series of DCGI CS-TR-DCGI-2014-2, Department of Computer Graphics and Interaction, Czech Technical University, FEE (August 2014)

  4. Bullen, PS: Handbook of Means and Their Inequalities. Springer, Dordrecht (2003)


  5. Hardy, GH, Littlewood, JE, Pólya, G: Inequalities. Cambridge Mathematical Library. Cambridge University Press, Cambridge (1952). https://books.google.es/books?id=t1RCSP8YKt8C


  6. Korovkin, PP: Inequalities. Little Mathematics Library. Mir Publishers, Moscow (1975). https://archive.org/details/InequalitieslittleMathematicsLibrary


  7. Hoehn, L, Niven, I: Averages on the move. Math. Mag. 58(3), 151-156 (1985)


  8. Cover, TM, Thomas, JA: Elements of Information Theory. Wiley, New York (2006)


  9. Tsallis, C: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 52(1/2), 479-487 (1988)


  10. Tsallis, C: Generalized entropy-based criterion for consistent testing. Phys. Rev. E 58, 1442-1445 (1998)


  11. Rényi, A: On measures of entropy and information. In: Proc. Fourth Berkeley Symp. Math. Stat. and Probability’ 60, vol. 1, pp. 547-561. University of California Press, Berkeley (1961)


  12. Marjanović, MM, Kadelburg, Z: A proof of Chebyshev’s inequality. The Teaching of Mathematics X(2), 107-108 (2007)


  13. Karamata, J: Sur une inégalité rélative aux fonctions convexes. Publ. Math. Univ. Belgrade 1, 145-148 (1932)


  14. Rubinstein, RY, Kroese, DP: Simulation and the Monte Carlo Method. Wiley Series in Probability and Statistics. Wiley, New York (2008). http://books.google.com.au/books?id=1-ffZVmazvwC


  15. Kalos, MH, Whitlock, PA: Monte Carlo Methods: Basics. Monte Carlo Methods. Wiley, New York (1986). http://books.google.com.au/books?id=87UdAQAAMAAJ



Acknowledgements

The authors are funded in part by grants TIN2013-47276-C6-1-R from Spanish Government and by grant number 2014-SGR-1232 from Catalan Government. The authors acknowledge the comments by David Juher to an earlier draft, and comments by anonymous reviewers that helped to improve the final version of the paper and suggested simplified proofs for Lemma 2 and Theorem 14.

Author information


Correspondence to Mateu Sbert.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

The first author introduced the main ideas, obtained the theorems and proofs, and wrote the paper, the second author found counterexamples, checked the proofs, and the main classical bibliography, and checked the draft for legibility. Both authors discussed at length the several drafts of the paper and how to improve it. Both authors read and approved the final manuscript.

Appendix:  Interpretation of Conjecture 1


Conjecture 1 has the following interpretation in terms of variances of Monte Carlo estimators [14, 15]. Suppose we have M independent estimators with variances \(v_{k} = b_{k}+C\), all of them with the same expected value μ, \(C > \mu^{2} \), and we draw a fixed number of samples from them, distributed among the estimators proportionally to \(\alpha_{k}\), \(\sum_{k} \alpha_{k} = 1\). The variance of the optimal linear combination of the estimators when we distribute the samples according to the \(\{ \alpha_{k}\}\) weights can be shown to be [2, 3]:

$$ \frac{\mathcal{H}(\{v_{k}/\alpha_{k}\})}{M} = \frac{\mathcal{H}(\{(b_{k}+C)/\alpha _{k}\})}{M} . $$
(112)

Thus Conjecture 1 means that, for estimators with variances \(v_{k} = b_{k}+C\), sampling the estimators using the weights in equation (3) is better than using the weights in equation (2).
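
Conjecture 1 itself (proved in the body of the paper) is easy to test numerically; the following Python sketch is a sanity check of ours only:

```python
import random

def harmonic_mean(v):
    """Harmonic mean of a sequence of strictly positive numbers."""
    return len(v) / sum(1.0 / x for x in v)

random.seed(4)
for _ in range(100):
    M = random.randint(2, 8)
    b = [random.uniform(0.1, 10.0) for _ in range(M)]
    C = random.uniform(0.0, 10.0)
    # both sides of inequality (1)
    lhs = harmonic_mean([x * (x + C) for x in b]) / harmonic_mean(b)
    rhs = harmonic_mean([(x + C) ** 2 for x in b]) / harmonic_mean([x + C for x in b])
    assert lhs <= rhs + 1e-9
```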

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.



Cite this article

Sbert, M., Poch, J. A necessary and sufficient condition for the inequality of generalized weighted means. J Inequal Appl 2016, 292 (2016). https://doi.org/10.1186/s13660-016-1233-7
