Physica A: Statistical Mechanics and its Applications
Mathematical inequalities for some divergences
Highlights
► We introduce some extended divergences combining the Jeffreys and Tsallis divergences.
► We then give some new inequalities for them.
► We also give lower bounds for the extended Fermi–Dirac and Bose–Einstein divergences.
► Finally, we establish some inequalities for several divergences via Young's inequality.
Introduction
For the study of multifractals, Tsallis [1] introduced in 1988 a one-parameter extension of Shannon entropy,
$$T_q(p) \equiv -\sum_{j=1}^n p_j^q \ln_q p_j, \qquad (1)$$
where $p=(p_1,\dots,p_n)$ is a probability distribution with $p_j>0$ for all $j$, and the $q$-logarithmic function for $x>0$ is defined by
$$\ln_q x \equiv \frac{x^{1-q}-1}{1-q} \qquad (q \neq 1),$$
which uniformly converges to the usual logarithmic function in the limit $q\to1$. Therefore Tsallis entropy converges to Shannon entropy in the limit $q\to1$:
$$\lim_{q\to1} T_q(p) = S(p) \equiv -\sum_{j=1}^n p_j \log p_j. \qquad (2)$$
It is also known that Rényi entropy [2],
$$R_q(p) \equiv \frac{1}{1-q}\log\sum_{j=1}^n p_j^q, \qquad (3)$$
is a one-parameter extension of Shannon entropy.
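As a quick numerical sanity check (an illustrative sketch, not taken from the paper; the distribution p below is arbitrary), the $q$-logarithm and Tsallis entropy can be implemented directly, and the convergence to the natural logarithm and to Shannon entropy as $q\to1$ observed:

```python
import math

def ln_q(x, q):
    """q-logarithm: (x**(1 - q) - 1) / (1 - q); reduces to ln(x) as q -> 1."""
    if q == 1:
        return math.log(x)
    return (x ** (1 - q) - 1) / (1 - q)

def tsallis_entropy(p, q):
    """Tsallis entropy T_q(p) = -sum_j p_j**q * ln_q(p_j)."""
    return -sum(pj ** q * ln_q(pj, q) for pj in p)

def shannon_entropy(p):
    """Shannon entropy S(p) = -sum_j p_j * log(p_j)."""
    return -sum(pj * math.log(pj) for pj in p)

p = [0.5, 0.3, 0.2]  # an arbitrary example distribution
for q in (0.9, 0.99, 0.999):
    print(q, tsallis_entropy(p, q))
print("Shannon:", shannon_entropy(p))
```

As $q$ approaches 1 the printed values approach the Shannon entropy of $p$.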
For two probability distributions $p$ and $r$ we have divergences based on the quantities (1), (3). We denote by
$$D_q(p\|r) \equiv -\sum_{j=1}^n p_j \ln_q \frac{r_j}{p_j}$$
the Tsallis relative entropy. Tsallis relative entropy converges to the usual relative entropy (divergence, Kullback–Leibler information) in the limit $q\to1$:
$$\lim_{q\to1} D_q(p\|r) = D_1(p\|r) \equiv \sum_{j=1}^n p_j \log\frac{p_j}{r_j}.$$
We also denote by $R_q(p\|r)$ the Rényi relative entropy [2], defined by
$$R_q(p\|r) \equiv \frac{1}{q-1}\log\sum_{j=1}^n p_j^q r_j^{1-q}.$$
Obviously $\lim_{q\to1} R_q(p\|r) = D_1(p\|r)$.
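These limit relations can likewise be checked numerically; the following sketch (illustrative code with arbitrary example distributions, not from the paper) implements $D_q$, the Kullback–Leibler divergence, and the Rényi relative entropy:

```python
import math

def kl(p, r):
    """Kullback-Leibler divergence D_1(p||r) = sum_j p_j * log(p_j / r_j)."""
    return sum(pj * math.log(pj / rj) for pj, rj in zip(p, r))

def tsallis_rel_entropy(p, r, q):
    """Tsallis relative entropy D_q(p||r) = -sum_j p_j * ln_q(r_j / p_j)."""
    if q == 1:
        return kl(p, r)
    return -sum(pj * ((rj / pj) ** (1 - q) - 1) / (1 - q) for pj, rj in zip(p, r))

def renyi_rel_entropy(p, r, q):
    """Renyi relative entropy R_q(p||r) = log(sum_j p_j**q * r_j**(1-q)) / (q - 1)."""
    return math.log(sum(pj ** q * rj ** (1 - q) for pj, rj in zip(p, r))) / (q - 1)

p = [0.5, 0.3, 0.2]
r = [0.2, 0.5, 0.3]
# Both one-parameter relative entropies approach D_1(p||r) as q -> 1.
for q in (0.99, 0.999):
    print(q, tsallis_rel_entropy(p, r, q), renyi_rel_entropy(p, r, q))
print("KL:", kl(p, r))
```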
The divergences can be considered generalizations of the entropies in the sense that Shannon entropy is recovered from the divergence taken with respect to the uniform distribution $u=(1/n,\dots,1/n)$, namely $S(p)=\log n - D_1(p\|u)$. Therefore the study of divergences is important for the development of information science. In this paper, we study several mathematical inequalities related to some generalized divergences.
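The recovery of Shannon entropy from the divergence with respect to the uniform distribution, $S(p)=\log n - D_1(p\|u)$, can be verified in a few lines (an illustrative sketch with an arbitrary example distribution):

```python
import math

def kl(p, r):
    """Kullback-Leibler divergence D_1(p||r)."""
    return sum(pj * math.log(pj / rj) for pj, rj in zip(p, r))

def shannon(p):
    """Shannon entropy S(p)."""
    return -sum(pj * math.log(pj) for pj in p)

p = [0.5, 0.3, 0.2]          # an arbitrary example distribution
u = [1.0 / len(p)] * len(p)  # the uniform distribution on n = 3 points
# D_1(p||u) = log n - S(p), so S(p) is recovered as log n - D_1(p||u).
print(shannon(p), math.log(len(p)) - kl(p, u))
```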
Section snippets
Two parameter entropies and divergences
In this section and throughout the rest of the paper we consider $p=(p_1,\dots,p_n)$ and $r=(r_1,\dots,r_n)$ with $p_j>0$ and $r_j>0$ for all $j$ to be probability distributions.
We start by introducing the Tsallis quasilinear entropies; that is, we extend the quasilinear entropies using the Tsallis recipe.
Definition 2.1 For a continuous and strictly monotonic function $\psi$ on $(0,\infty)$ and $q\neq1$ (the nonextensivity parameter), the Tsallis quasilinear entropy ($q$-quasilinear entropy) is defined by …
In …
Tsallis type divergences
In what follows, the Jeffreys and Jensen–Shannon divergences are extended in the context of Tsallis theory. We first review their definitions.
Definition 3.1 The Jeffreys divergence is defined by
$$J(p,r) \equiv D_1(p\|r) + D_1(r\|p),$$
and the Jensen–Shannon divergence is defined as [18], [19]
$$JS(p,r) \equiv \frac12 D_1\!\left(p \,\Big\|\, \frac{p+r}{2}\right) + \frac12 D_1\!\left(r \,\Big\|\, \frac{p+r}{2}\right).$$
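A small numerical sketch (illustrative, with an arbitrary pair of distributions) exhibits the basic properties of these two divergences: both are symmetric in $p$ and $r$, and the Jensen–Shannon divergence is bounded above by $\log 2$:

```python
import math

def kl(p, r):
    """Kullback-Leibler divergence D_1(p||r)."""
    return sum(pj * math.log(pj / rj) for pj, rj in zip(p, r))

def jeffreys(p, r):
    """Jeffreys divergence J(p, r) = D_1(p||r) + D_1(r||p)."""
    return kl(p, r) + kl(r, p)

def jensen_shannon(p, r):
    """Jensen-Shannon divergence: average KL to the midpoint (p + r)/2."""
    m = [(pj + rj) / 2 for pj, rj in zip(p, r)]
    return 0.5 * kl(p, m) + 0.5 * kl(r, m)

p = [0.5, 0.3, 0.2]
r = [0.2, 0.5, 0.3]
print("J :", jeffreys(p, r))
print("JS:", jensen_shannon(p, r))
```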
Analogously we may define the following divergences.
Definition 3.2 The Jeffreys–Tsallis divergence is
$$J_q(p,r) \equiv D_q(p\|r) + D_q(r\|p),$$
and the Jensen–Shannon–Tsallis divergence is
$$JS_q(p,r) \equiv \frac12 D_q\!\left(p \,\Big\|\, \frac{p+r}{2}\right) + \frac12 D_q\!\left(r \,\Big\|\, \frac{p+r}{2}\right).$$
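Assuming the natural reading of Definition 3.2 (the classical definitions with $D_1$ replaced by the Tsallis relative entropy $D_q$; the displayed formulas are omitted in this snippet), a numerical sketch confirms that both quantities tend to their classical counterparts as $q\to1$:

```python
import math

def kl(p, r):
    return sum(pj * math.log(pj / rj) for pj, rj in zip(p, r))

def d_q(p, r, q):
    """Tsallis relative entropy D_q(p||r); reduces to KL at q = 1."""
    if q == 1:
        return kl(p, r)
    return -sum(pj * ((rj / pj) ** (1 - q) - 1) / (1 - q) for pj, rj in zip(p, r))

def jeffreys_tsallis(p, r, q):
    """Assumed form: J_q(p, r) = D_q(p||r) + D_q(r||p)."""
    return d_q(p, r, q) + d_q(r, p, q)

def jensen_shannon_tsallis(p, r, q):
    """Assumed form: average D_q to the midpoint distribution (p + r)/2."""
    m = [(pj + rj) / 2 for pj, rj in zip(p, r)]
    return 0.5 * d_q(p, m, q) + 0.5 * d_q(r, m, q)

p = [0.5, 0.3, 0.2]
r = [0.2, 0.5, 0.3]
for q in (0.9, 0.99, 1):
    print(q, jeffreys_tsallis(p, r, q), jensen_shannon_tsallis(p, r, q))
```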
We find that …
Fermi–Dirac and Bose–Einstein type divergences
Fermi–Dirac and Bose–Einstein divergences are often relevant in physics problems involving elementary particles that obey the corresponding statistics, namely fermions and bosons, respectively. As a one-parameter extension of the Fermi–Dirac entropy and the Bose–Einstein entropy (see also Refs. [22], [23], [24], [25]), that is, of
$$S_{FD}(p) \equiv -\sum_{j=1}^n p_j \log p_j - \sum_{j=1}^n (1-p_j)\log(1-p_j)$$
and
$$S_{BE}(p) \equiv -\sum_{j=1}^n p_j \log p_j + \sum_{j=1}^n (1+p_j)\log(1+p_j),$$
the Fermi–Dirac–Tsallis entropy was introduced in Ref. [23], and physical phenomena for power-law …
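For concreteness, the classical Fermi–Dirac and Bose–Einstein entropies (written here in their standard forms as an illustrative sketch; the snippet above omits the displayed formulas, and the distribution p is arbitrary with all $p_j<1$) can be computed directly:

```python
import math

def fermi_dirac_entropy(p):
    """S_FD(p) = -sum_j [p_j log p_j + (1 - p_j) log(1 - p_j)], for 0 < p_j < 1."""
    return -sum(pj * math.log(pj) + (1 - pj) * math.log(1 - pj) for pj in p)

def bose_einstein_entropy(p):
    """S_BE(p) = -sum_j p_j log p_j + sum_j (1 + p_j) log(1 + p_j)."""
    return sum(-pj * math.log(pj) + (1 + pj) * math.log(1 + pj) for pj in p)

p = [0.5, 0.3, 0.2]  # an arbitrary example distribution with all p_j < 1
print("S_FD:", fermi_dirac_entropy(p))
print("S_BE:", bose_einstein_entropy(p))
```

Both quantities are strictly positive here, since each summand adds a nonnegative correction to the Shannon term.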
Young’s inequality and Tsallis entropies with finite sum
We establish further inequalities involving Tsallis entropy and Tsallis relative entropy by applying Young's inequality.
Lemma 5.1 (Young's inequality) Let $a,b>0$ and $p,q\in\mathbb{R}\setminus\{0\}$ such that $\frac1p+\frac1q=1$. If $p>1$ (then $q>1$), one has $ab \le \frac{a^p}{p}+\frac{b^q}{q}$; if $p<1$ (then $q<1$), the inequality is reversed: $ab \ge \frac{a^p}{p}+\frac{b^q}{q}$.
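Both branches of Lemma 5.1 can be spot-checked numerically over a grid of values (an illustrative sketch; the conjugate exponent $q$ is computed from $\frac1p+\frac1q=1$):

```python
def young_rhs(a, b, p):
    """Right-hand side a**p / p + b**q / q with conjugate exponent q = p / (p - 1)."""
    q = p / (p - 1)
    return a ** p / p + b ** q / q

for a in (0.5, 1.0, 2.0, 3.7):
    for b in (0.3, 1.0, 2.5):
        # p > 1: ab <= a^p/p + b^q/q
        assert a * b <= young_rhs(a, b, 2.0) + 1e-12
        assert a * b <= young_rhs(a, b, 3.0) + 1e-12
        # p < 1 (p != 0): the inequality is reversed
        assert a * b >= young_rhs(a, b, 0.5) - 1e-12
        assert a * b >= young_rhs(a, b, -1.0) - 1e-12
print("all checks passed")
```

Equality holds at $a=b=1$ for any admissible $p$, which is a useful edge case to keep in such checks.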
Lemma 5.2 Let … satisfying …. If … and …, or if … and …, then …. Let … satisfying …. If … and …, or if … and …, then ….

Proof Using Lemma 5.1, we obtain …. Lemma 5.1 also leads to ….
Concluding remarks
As we have seen, we established a generalized inequality for a new quantity defined via the Jeffreys divergence and Tsallis entropy. In addition, we gave lower bounds for the one-parameter extended Fermi–Dirac and Bose–Einstein divergences. These lower bounds are sharper than the trivial bound provided by nonnegativity. The nonnegativity of the Tsallis divergence was applied to obtain the maximum entropy principle in Ref. [29]. Therefore one could check whether the nonnegativity of the one-parameter extended …
Acknowledgments
The author (S.F.) was supported in part by the Japanese Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Encouragement of Young Scientists (B), 20740067. The author (F.-C. M.) was supported by CNCSIS Grant 420/2008.
References (30)
A step beyond Tsallis and Rényi entropies, Phys. Lett. A (2005)
Multiplicative duality, q-triplet and (μ,ν,q)-relation derived from the one-to-one correspondence between the (μ,ν)-multinomial coefficient and Tsallis entropy, Physica A (2008)
Mathematical structures derived from the q-multinomial coefficient in Tsallis statistics, Physica A (2006)
On the cut-off prescriptions associated with power-law generalized thermostatistics, Phys. Lett. A (2005)
A fractal approach to entropy and distribution functions, Phys. Lett. A (1993)
Thermodynamic consistency of the q-deformed Fermi–Dirac distribution in nonextensive thermostatics, Phys. Lett. A (2010)
Possible generalization of Boltzmann–Gibbs statistics, J. Stat. Phys. (1988)
On measures of entropy and information
New nonadditive measures of inaccuracy, J. Math. Sci. (1975)
New nonadditive measures of relative information, J. Comb. Inform. Syst. Sci. (1977)
Two-parameter generalization of the logarithm and exponential functions and Boltzmann–Gibbs–Shannon entropy, J. Math. Phys.
Information-theoretic considerations on estimation problems, Inform. Control
The -norm information measure, Inform. Control