Generalizations of Levinson type inequalities via new Green functions with applications to information theory

In this paper, four new Green functions are used to generalize Levinson-type inequalities for the class of 3-convex functions. The f-divergence, Rényi entropy, Rényi divergence, Shannon entropy, and the Zipf–Mandelbrot law are then used to apply the main results to information theory.


Introduction and preliminaries
Information theory is a branch of science concerned with the storage, quantification, and transmission of data. Information is difficult to quantify since it is an abstract object. In the first of two important and essential theorems, Claude Shannon [1], who created the field of information theory, presented entropy as the fundamental unit of information in 1948. The probability density function may also be used to measure information. The distance between two probability distributions is calculated using a divergence measure, a concept in probability theory that is used to overcome a number of problems. Divergence measures, which compare two probability distributions, are employed in statistics and information theory and are well described in the literature. Sensor networks [2], finance [3], economics [4], and the approximation of probability distributions [5] are all areas where information and divergence measures are highly valuable and play a significant role.
In [6], Adeel et al. generalized the Levinson inequality and gave fruitful results in information theory. Khan et al. [7] used Abel–Gontscharoff interpolation to present Levinson-type inequalities for convex functions of higher order. Adeel et al. [8] calculated the Shannon entropy and estimated the f-divergence by utilising new Lidstone polynomials and Green functions in association with Levinson-type inequalities. By using the Hermite interpolating polynomial, Khan et al. [9] obtained a Levinson-type inequality for convex functions of higher order and provided estimates for the Shannon entropy and f-divergence. In [10], Adeel et al. used Bullen-type inequalities to estimate different entropies and the f-divergence via Fink's identity. Khan et al. [11] gave various entropy results related to Levinson-type inequalities using Green functions and also presented results for the Hermite interpolating polynomial. For 2n-convex functions, Khan et al. [12] generalized Levinson-type inequalities by applying the Lidstone interpolating polynomial. In [13], Adeel et al. used Green functions to obtain generalized Levinson-type inequalities via the Montgomery identity; they also found bounds for different entropies and divergences. However, in [6, 11, 13], all the generalizations and results are proved using only two Green functions. In this study, four newly defined 3-convex Green functions are used to generalize the Levinson-type inequalities, and bounds for different entropies are given.
Higher-order convex functions are defined using divided differences. The nth-order divided difference is used to define a real-valued convex function of higher order in the following formulation (see [14, p. 15]).
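Since higher-order convexity is characterized through divided differences, a short numerical sketch may help fix ideas. It assumes the standard recursive definition of the nth-order divided difference [x0, . . ., xn; f] and the usual convention that f is 3-convex when every third-order divided difference over four distinct points is non-negative; the function names below are illustrative, not from the paper.

```python
def divided_difference(f, xs):
    """Recursive nth-order divided difference [x0, ..., xn; f]
    over the distinct nodes xs = (x0, ..., xn)."""
    if len(xs) == 1:
        return f(xs[0])
    return (divided_difference(f, xs[1:]) - divided_difference(f, xs[:-1])) / (xs[-1] - xs[0])

def looks_3_convex(f, points):
    """Check [x0, x1, x2, x3; f] >= 0 over sliding windows of four sorted points.
    A non-negative value on every window is consistent with 3-convexity there."""
    pts = sorted(points)
    return all(divided_difference(f, pts[i:i + 4]) >= 0 for i in range(len(pts) - 3))

# f(x) = x**4 has f'''(x) = 24x, so it is 3-convex on [0, inf):
print(looks_3_convex(lambda x: x**4, [0.5 * k for k in range(10)]))  # True
```

For smooth f, the third-order divided difference equals f‴(ξ)/3! at some intermediate point ξ, which is why the sign of f‴ governs 3-convexity.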
Let g = [ê1, ê2] ⊂ R, ê1 < ê2, and λ = 1, . . ., 4. In [20], Pečarić et al. defined a new type of Green functions Ĝλ : g × g → R, given in (14)–(17). Utilising these Green functions, they also proved Abel–Gontscharoff-type identities. This work is arranged as follows: in Sect. 2, a new type of 3-convex Green functions is defined, and identities related to these 3-convex Green functions are established; additionally, this new class of 3-convex Green functions is used to modify Levinson's inequality for 3-convex functions. In Sect. 3, the f-divergence, Rényi entropy, Rényi divergence, Zipf–Mandelbrot law, and Shannon entropy are used to apply the results to information theory.

Main results
Firstly, we define the new type of 3-convex Green functions with graphs, and using these Green functions a lemma is stated. Then results related to Levinson-type inequalities using the new Green functions are presented.
Lemma 2.1 Let a function f be defined on g such that f‴ exists, and let Gλ (λ = 1, . . ., 4) be the two-point right focal problem-type Green functions given by (14)–(17). Then the following identities hold: Proof The above identities can be proved by employing the same integration approach, so we give the proof of (18) only.
Proof Let k = 4 and, using (21) in (24), we have the stated result. Similar steps are followed to get (23).
Proof Applying Theorem 2.3 under the stated conditions, we get the desired result.

Applications to information theory
Levinson-type inequalities play an essential role in generalizing inequalities for the divergence between probability distributions. In [6, 11, 13], Adeel et al. applied their findings to information theory by using two 3-convex Green functions. In this section, the key findings of Sect. 2 are linked to information theory via the f-divergence, Rényi entropy and divergence, Shannon entropy, and Zipf–Mandelbrot law using the newly defined 3-convex Green functions (14)–(17).
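Before the formal statements, a minimal sketch of the classical Csiszár f-divergence may be useful: D_f(p, q) = Σ_τ q_τ f(p_τ/q_τ) for a convex generator f. The paper's exact normalization and notation may differ; the names below are illustrative only.

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(p, q) = sum_t q_t * f(p_t / q_t)
    for strictly positive distributions p and q."""
    return sum(qt * f(pt / qt) for pt, qt in zip(p, q))

p = [0.2, 0.5, 0.3]
q = [0.3, 0.4, 0.3]

# With the convex generator f(t) = t*log(t), D_f reduces to the
# Kullback-Leibler divergence, which is non-negative by Jensen's inequality.
kl = f_divergence(p, q, lambda t: t * math.log(t))
print(kl)
```

Since f is convex, Jensen's inequality gives D_f(p, q) ≥ f(1), with equality when p = q, which is the structural fact the divergence estimates in this section exploit.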
(i) For each 3-convex and continuous function f, with yτ = sτ uτ in Theorem 2.3, we get the required results. (ii) Similarly, in Remark 2.1, take the constant of integration equal to zero in the second part of the piecewise function G2, and replace ê2 with ê1 and φ with ϑ; then the results of Theorem 3.1 coincide with [6, p. 12, Theorem 6].

Shannon entropy
Definition 3.3 (See [25]) For a positive probability distribution l = (l1, . . ., lη), the Shannon entropy is given by S = −∑_{τ=1}^{η} lτ log(lτ), and S is as defined in (44). If log has base less than 1, then (47) and (46) are reversed.
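The definition above can be computed directly; here is a minimal sketch with the log base as a parameter, matching the base-dependent cases in the text (names are illustrative):

```python
import math

def shannon_entropy(l, base=math.e):
    """Shannon entropy S = -sum_t l_t * log_base(l_t)
    for a positive probability distribution l."""
    return -sum(lt * math.log(lt, base) for lt in l)

# The uniform distribution maximizes entropy: log2(4) = 2 bits for 4 outcomes.
uniform = [0.25] * 4
print(shannon_entropy(uniform, base=2))  # 2.0
```

Any skewed distribution on the same four outcomes gives a strictly smaller value, illustrating why entropy bounds of the kind proved here are informative.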

Rényi divergence and entropy
In [26], the Rényi divergence and the α-order Rényi entropy are defined. These definitions are also valid for non-negative probability distributions.
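For concreteness, the standard forms are H_α(p) = (1/(1−α)) log ∑_τ p_τ^α and D_α(p‖q) = (1/(α−1)) log ∑_τ p_τ^α q_τ^{1−α} for α ≥ 0, α ≠ 1; the paper's exact notation may differ, and the names below are illustrative. A sketch:

```python
import math

def renyi_entropy(p, alpha, base=math.e):
    """alpha-order Renyi entropy (alpha >= 0, alpha != 1)."""
    return math.log(sum(pt ** alpha for pt in p), base) / (1.0 - alpha)

def renyi_divergence(p, q, alpha, base=math.e):
    """alpha-order Renyi divergence (alpha >= 0, alpha != 1)."""
    return math.log(sum(pt ** alpha * qt ** (1.0 - alpha)
                        for pt, qt in zip(p, q)), base) / (alpha - 1.0)

p = [0.2, 0.5, 0.3]
# As alpha -> 1, the Renyi entropy tends to the Shannon entropy.
print(renyi_entropy(p, 0.999))
```

Note that D_α(p‖p) = 0 for every admissible α, since the sum collapses to ∑_τ p_τ = 1.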

Theorem 3.2 Let hypothesis E hold and
If either α > 1 and the base of log is greater than 1, or α ∈ [0, 1) and the base of log is less than 1. If either the base of log is less than 1 and α > 1, or α ∈ [0, 1) and log has base greater than 1, then (52) and (53) are reversed.
Proof We prove the case α ∈ [0, 1) with the base of log greater than 1; the other cases can be proved in a similar manner.

Zipf-Mandelbrot law
Zipf's law is given as follows (see [27]).
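Concretely, the Zipf–Mandelbrot probability mass at rank k (out of N ranks, with shift q ≥ 0 and exponent s > 0) is (k + q)^{−s} / H_{N,q,s}, where H_{N,q,s} = ∑_{j=1}^{N} (j + q)^{−s} is the generalized harmonic normalizer; taking q = 0 recovers the plain Zipf law. A sketch under these standard conventions (names illustrative):

```python
def zipf_mandelbrot(N, q, s):
    """Zipf-Mandelbrot pmf over ranks k = 1, ..., N with shift q >= 0
    and exponent s > 0, normalized by the generalized harmonic sum."""
    H = sum((j + q) ** -s for j in range(1, N + 1))
    return [((k + q) ** -s) / H for k in range(1, N + 1)]

pmf = zipf_mandelbrot(N=10, q=1.5, s=1.2)
print(sum(pmf))  # normalizes to 1
```

Because (k + q)^{−s} is strictly decreasing in k, the resulting probabilities decay monotonically with rank, which is the qualitative shape Zipf-type laws are meant to capture.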

Figure 1 Graph of the Green functions Gλ for different values of φ and ϑ

Remark 3.1 (i) In Remark 2.1, put ê2 = ê1 and take the constant of integration equal to zero in the first part of the piecewise function G1; then the results of Theorem 3.1 coincide with [6, p. 12, Theorem 6].

Remark 3.4 If all the assumptions of parts (i) and (ii) of Remark 3.1 are applied to Corollary 3.2, then the results of Corollary 3.2 coincide with [6, p. 16, Corollary 7].
Remark 3.2 (i) In Remark 2.1, put ê2 = ê1 and take the constant of integration equal to zero in the first part of the piecewise function G1; then the results of Corollary 3.1 coincide with [6, p. 13, Corollary 6]. (ii) Take the constant of integration equal to zero in the second part of the piecewise function G2, and replace ê2 with ê1 and φ with ϑ in Remark 2.1; then the results of Corollary 3.1 coincide with [6, p. 13, Corollary 6].
Hence we have the required results by Remark 2.2.