Escort entropies and divergences and related canonical distribution
Highlights
- Two-parameter entropies are derived from q-entropies and escort distributions.
- The related canonical distribution is derived.
- This connects and extends known results in nonextensive statistics.
Introduction
Rényi and Tsallis entropies extend the standard Shannon–Boltzmann entropy, enabling the construction of generalized thermostatistics that include the standard one as a special case. The thermodynamics derived from Tsallis entropy, nonextensive thermodynamics, has received considerable attention, and there is a wide variety of applications where experiments, numerical results and analytical derivations agree well with the new formalism [1]. Some physical applications of the generalized entropies, including statistics of cosmic rays, defect turbulence, optical lattices, systems with long-range interactions, superstatistical systems, etc., can be found in the recent review [2] and references therein. In the extended thermodynamics, it has been found particularly useful to use generalized moments [3], [4]. These moments are computed with respect to a deformed version of the density at hand, which is called its escort distribution [5], [6]. Actually, several types of constraints [7], [8], [9] have been used in order to derive the canonical distributions: a first type of constraint is expressed as standard linear mean values, while a second type is given as generalized escort mean values. In both cases, the related canonical distributions are expressed as q-Gaussian distributions, but with two opposite exponents, and both solutions reduce to a standard Gaussian distribution in the limit q → 1. These q-Gaussian distributions can exhibit a power-law behavior, with a remarkable agreement with experimental data; see for instance [1], [10], [11], and references therein. These distributions are also analytical solutions of actual physical problems, e.g. [12], [13], [14]. There have been numerous discussions regarding the choice of a ‘correct’ form of the constraints, either as a standard mean or an escort mean, and on the connections between the solutions and the associated thermodynamics. In particular, dualities and equivalences between the two settings have been described.
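As a quick numerical illustration of the power-law behavior mentioned above, the sketch below implements the deformed q-exponential and the (unnormalized) q-Gaussian profile exp_q(−βx²). The helper names and the choice β = 1 are ours, for illustration only; the Letter's normalization conventions are not reproduced here.

```python
import numpy as np

def q_exponential(x, q):
    """Deformed exponential exp_q(x) = [1 + (1-q) x]_+^(1/(1-q));
    reduces to exp(x) when q = 1."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)  # [.]_+ truncation
    return base ** (1.0 / (1.0 - q))

def q_gaussian_unnormalized(x, q, beta=1.0):
    """Unnormalized q-Gaussian profile exp_q(-beta * x^2);
    for q > 1 the tails decay as a power law instead of exponentially."""
    return q_exponential(-beta * np.asarray(x, float) ** 2, q)

x = np.array([0.0, 1.0, 10.0])
print(q_gaussian_unnormalized(x, 1.0))  # standard Gaussian values exp(-x^2)
print(q_gaussian_unnormalized(x, 2.0))  # Cauchy-like tail 1 / (1 + x^2)
```

For q = 2 the profile is exactly 1/(1 + βx²), a Cauchy-type density once normalized, which is the simplest example of the power-law tails noted in the text.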
This is precisely the context of the present Letter, where we propose a simple connection between these different formulations and between the related canonical distributions. More specifically, we suggest a simple way to combine the originally distinct concepts of entropy and escort distributions into a single two-parameter entropy. Then, we propose to look at an associated extended maximum entropy problem. This approach includes the two aforementioned formulations as special cases. Exploiting the nonnegativity of the associated divergence, we derive the expression of the canonical distribution for an observable given as an escort mean value. We show that this canonical distribution extends, and smoothly connects, the results obtained in nonextensive thermodynamics for the standard and generalized mean value constraints.
We begin by recalling the context and definitions in Section 2. Then, the combined escort divergences and entropies are introduced and discussed in Section 3. The related maximum entropy problem and its solution are described in Section 4. Finally, in Section 5, we illustrate the results in the case of a two-level system.
Section snippets
Main definitions
Let us recall that if f and g are two probability densities defined with respect to a common measure μ, then, for a parameter q ≠ 1 called the entropic index, the Tsallis information divergence is defined by

D_q(f\|g) = \frac{1}{q-1}\left(\int f^{q} g^{1-q}\,\mathrm{d}\mu - 1\right),

and similarly, the Rényi information divergence is defined by

\tilde{D}_q(f\|g) = \frac{1}{q-1}\log \int f^{q} g^{1-q}\,\mathrm{d}\mu,

provided, in both cases, that the integral is finite. By l'Hôpital's rule, both the Tsallis and Rényi information divergences reduce to the Kullback–Leibler divergence in the limit q → 1.
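For discrete distributions the two divergences are easy to check numerically. The sketch below (the function names are our own) implements both definitions and illustrates the q → 1 reduction to the Kullback–Leibler divergence.

```python
import numpy as np

def tsallis_divergence(f, g, q):
    """Tsallis divergence D_q(f||g) = (sum f^q g^(1-q) - 1) / (q - 1)."""
    f, g = np.asarray(f, float), np.asarray(g, float)
    if np.isclose(q, 1.0):  # q -> 1 limit: Kullback-Leibler divergence
        return float(np.sum(f * np.log(f / g)))
    return float((np.sum(f**q * g**(1.0 - q)) - 1.0) / (q - 1.0))

def renyi_divergence(f, g, q):
    """Renyi divergence D_q(f||g) = log(sum f^q g^(1-q)) / (q - 1)."""
    f, g = np.asarray(f, float), np.asarray(g, float)
    if np.isclose(q, 1.0):
        return float(np.sum(f * np.log(f / g)))
    return float(np.log(np.sum(f**q * g**(1.0 - q))) / (q - 1.0))

f = np.array([0.2, 0.5, 0.3])
g = np.array([0.4, 0.4, 0.2])
kl = float(np.sum(f * np.log(f / g)))  # Kullback-Leibler reference value
# Both divergences approach the KL divergence as q -> 1
print(tsallis_divergence(f, g, 1.001), renyi_divergence(f, g, 0.999), kl)
```

Both functions are nonnegative and vanish only when f = g, which is the property exploited later to derive the canonical distribution.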
The divergences and entropies
In complement to the dualities mentioned above, we will show that there is a continuum of q-Gaussians, solutions of an extended maximum entropy problem, that smoothly connects the problems (7), (8) and their solutions (9), (10). The basic idea is to mix the concepts of q-entropies and escort distributions into a single quantity. This leads us to a simple extension of the Rényi (Tsallis) information divergence and entropy to a two-parameter case. Interestingly, the generalized Rényi
The maximum entropy problem
With the previous definitions at hand, we can now consider the following extended maximum entropy problem, which includes the previous problems as particular cases. The usual procedure for handling such a variational problem is the technique of Lagrangian multipliers, e.g. [7], [8]. However, even though the objective functional is strictly concave (for the relevant range of the entropic indices), the constraint set is not convex, and the uniqueness and nature of the maximum cannot be
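For orientation, here is a hedged sketch of the Lagrangian route in the familiar single-index Tsallis case with one escort-mean constraint (the classical Tsallis–Mendes–Plastino setting, not the Letter's two-parameter functional):

```latex
% Single-index Tsallis functional with normalization and escort-mean constraints:
\mathcal{L}[p] = \frac{1-\int p^{q}\,\mathrm{d}\mu}{q-1}
  - \gamma\left(\int p\,\mathrm{d}\mu - 1\right)
  - \beta\left(\frac{\int x\,p^{q}\,\mathrm{d}\mu}{\int p^{q}\,\mathrm{d}\mu} - m\right)
% Stationarity yields a q-exponential, with a renormalized multiplier:
p(x) \propto \left[\,1-(1-q)\,\tilde{\beta}\,(x-m)\right]_{+}^{\frac{1}{1-q}},
\qquad \tilde{\beta}=\frac{\beta}{\int p^{q}\,\mathrm{d}\mu}
```

As the text notes, because the constraint set is not convex, stationarity alone does not settle uniqueness; this is why the Letter instead exploits the nonnegativity of the associated divergence.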
The case of a two-level system
Finally, we close this Letter with the example of a two-level system, with eigenenergies 0 and 1. Although the general maximum entropy problem (18) does not usually lead to closed-form expressions for the entropies, see e.g. [25], we can still obtain an explicit solution for the simple two-level system. For this system, the measure at hand charges the two levels, μ = δ₀ + δ₁, where δₓ denotes the Dirac mass at x. Then, the maximum entropy distribution is the discrete distribution with
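A small numerical sketch of the two-level case: for levels 0 and 1, an escort-mean energy constraint by itself pins down the distribution, since normalization plus the escort mean give two equations for two probabilities. The helper names below are ours, and the sketch uses only the ordinary index q, not the Letter's two-parameter entropy.

```python
import numpy as np

def two_level_escort_distribution(u, q):
    """Distribution (p0, p1) on energy levels 0 and 1 whose escort-mean
    energy p1^q / (p0^q + p1^q) equals u, for 0 < u < 1."""
    r = (u / (1.0 - u)) ** (1.0 / q)  # ratio p1 / p0 implied by the constraint
    p0 = 1.0 / (1.0 + r)
    return np.array([p0, 1.0 - p0])

def escort_mean_energy(p, q):
    """Escort mean of the energy (levels 0 and 1): the weight of level 1
    under the escort distribution p^q / sum(p^q)."""
    w = p ** q
    return w[1] / w.sum()

p = two_level_escort_distribution(0.2, q=0.7)
print(p, escort_mean_energy(p, 0.7))
```

Solving the constraint for the ratio p1/p0 = (u/(1−u))^(1/q) shows directly how the escort index q deforms the occupation probabilities relative to the ordinary-mean case q = 1.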
Conclusions
In this Letter, we have suggested a possible extension of the standard q-divergences and entropies by considering a combination of the concepts of q-divergences and of escort distributions. This leads to two-parameter information measures that recover the classical ones as special cases, and coincide with some recently proposed measures [21]. Further work should examine the general properties of these information measures. We have introduced a general maximum entropy problem and derived the
References (25)
- et al., Physica A (1998)
- et al., Physica A (2000)
- et al., Physica A (2009)
- et al., Physica A (2007)
- et al., Phys. Lett. A (2005)
- Chaos Solitons Fractals (2002)
- et al., Physica A (2008)
- et al., J. Multivar. Anal. (2008)
- Inf. Sci. (2008)
- Introduction to Nonextensive Statistical Mechanics (2009)
- Contemp. Physics
- Phys. Rev. E