Physics Letters A

Volume 375, Issue 33, 1 August 2011, Pages 2969-2973

Escort entropies and divergences and related canonical distribution

https://doi.org/10.1016/j.physleta.2011.06.057

Abstract

We discuss two families of two-parameter entropies and divergences, derived from the standard Rényi and Tsallis entropies and divergences. These new divergences and entropies are obtained as divergences and entropies of escort distributions. Exploiting the nonnegativity of the divergences, we derive the expression of the canonical distribution associated with the new entropies and an observable given as an escort-mean value. We show that this canonical distribution extends, and smoothly connects, the results obtained in nonextensive thermodynamics for the standard and generalized mean value constraints.

Highlights

► Two-parameter entropies are derived from q-entropies and escort distributions.
► The related canonical distribution is derived.
► This connects and extends known results in nonextensive statistics.

Introduction

Rényi and Tsallis entropies extend the standard Shannon–Boltzmann entropy, making it possible to build generalized thermostatistics that include the standard one as a special case. The thermodynamics derived from Tsallis entropy, nonextensive thermodynamics, has received considerable attention, and there is a wide variety of applications where experiments, numerical results and analytical derivations agree well with the new formalism [1]. Some physical applications of the generalized entropies, including statistics of cosmic rays, defect turbulence, optical lattices, systems with long-range interactions, superstatistical systems, etc., can be found in the recent review [2] and references therein. In the extended thermodynamics, it has been found particularly useful to use generalized moments [3], [4]. These moments are computed with respect to a deformed version of the density at hand, which is called its escort distribution [5], [6]. Actually, several types of constraints [7], [8], [9] have been used in order to derive the canonical distributions: a first type of constraint is expressed as a standard linear mean value, while a second type is given as a generalized escort mean value. In both cases, the related canonical distributions are expressed as q-Gaussian distributions, but with two opposite exponents, and both solutions reduce to a standard Gaussian distribution in the q=1 case. These q-Gaussian distributions can exhibit a power-law behavior, with a remarkable agreement with experimental data; see for instance [1], [10], [11], and references therein. These distributions are also analytical solutions of actual physical problems, e.g. [12], [13], [14]. There have been numerous discussions regarding the choice of a ‘correct’ form of the constraints, either as a standard mean or as an escort average, and on the connections between the solutions and the associated thermodynamics. In particular, dualities and equivalences between the two settings have been described.
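For concreteness, the q-Gaussians mentioned above are built on the q-exponential function. A minimal numerical sketch (the function name and conventions below are ours, not the Letter's):

```python
import numpy as np

def q_exponential(x, q):
    """q-exponential: exp_q(x) = [1 + (1 - q) x]_+^{1/(1-q)}; recovers exp(x) as q -> 1."""
    x = np.asarray(x, dtype=float)
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

# A q-Gaussian is proportional to exp_q(-beta * x**2): power-law tails for
# q > 1, compact support for q < 1, and the standard Gaussian at q = 1.
x = np.linspace(-3.0, 3.0, 7)
print(q_exponential(-x**2, q=1.5))
print(q_exponential(-x**2, q=1.0))
```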

This is precisely the context of the present Letter, where we propose a simple connection between these different formulations and between the related canonical distributions. More specifically, we suggest a simple way to combine the originally distinct concepts of entropy and escort distributions into a single two-parameter (a,λ)-entropy. Then, we propose to look at an associated extended maximum entropy problem. This approach includes the two aforementioned formulations as special cases. Exploiting the nonnegativity of the associated divergence, we derive the expression of the canonical distribution for an observable given as an escort-mean value. We show that this canonical distribution extends, and smoothly connects, the results obtained in nonextensive thermodynamics for the standard and generalized mean value constraints.

We begin by recalling the context and definitions in Section 2. Then, the combined escort divergences and entropies are introduced and discussed in Section 3. The related maximum entropy problem and its solution are described in Section 4. Finally, in Section 5, we illustrate the results in the case of a two-level system.


Main definitions

Let us recall that if f and g are two probability densities defined with respect to a common measure μ, then for a parameter q>0, called the entropic index, the Tsallis information divergence is defined by
$$D_q^{(T)}(f\|g) = \frac{1}{q-1}\left(\int f(x)^q\, g(x)^{1-q}\, d\mu(x) - 1\right),$$
and similarly, the Rényi information divergence is defined by
$$D_q^{(R)}(f\|g) = \frac{1}{q-1}\log \int f(x)^q\, g(x)^{1-q}\, d\mu(x),$$
provided, in both cases, that the integral is finite. By lʼHospitalʼs rule, both the Tsallis and Rényi information divergences reduce to the Kullback–Leibler divergence as q→1.
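A direct numerical transcription of these two definitions, with discrete sums in place of the integrals (the probability vectors below are illustrative):

```python
import numpy as np

def tsallis_divergence(f, g, q):
    """Tsallis divergence D_q^(T)(f||g) for discrete distributions."""
    f, g = np.asarray(f, float), np.asarray(g, float)
    return (np.sum(f**q * g**(1.0 - q)) - 1.0) / (q - 1.0)

def renyi_divergence(f, g, q):
    """Renyi divergence D_q^(R)(f||g) for discrete distributions."""
    f, g = np.asarray(f, float), np.asarray(g, float)
    return np.log(np.sum(f**q * g**(1.0 - q))) / (q - 1.0)

f = np.array([0.2, 0.5, 0.3])
g = np.array([0.4, 0.4, 0.2])
for q in (0.5, 0.999, 1.001, 2.0):
    print(q, tsallis_divergence(f, g, q), renyi_divergence(f, g, q))
# Both approach the Kullback-Leibler divergence as q -> 1:
print("KL:", np.sum(f * np.log(f / g)))
```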

The (a,λ) divergences and entropies

In complement to the dualities mentioned above, we will show that there is a continuum of q-Gaussians, solutions of an extended maximum entropy problem, that smoothly connects the problems (7), (8) and their solutions (9), (10). The basic idea is to mix the concepts of q-entropies and escort distributions into a single quantity. This leads us to a simple extension of the Rényi (Tsallis) information divergence and entropy to a two-parameter case. Interestingly, the generalized (a,λ)-Rényi divergence coincides with information measures recently proposed in the literature [21].
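One natural way to read this combination, consistent with the abstract (the Letter's precise definition is given in the full text): apply the λ-Rényi divergence to the order-a escort distributions. A hedged sketch under that reading:

```python
import numpy as np

def escort(f, a):
    """Order-a escort of a discrete distribution: f**a / sum(f**a)."""
    fa = np.asarray(f, float) ** a
    return fa / fa.sum()

def renyi_divergence(f, g, lam):
    f, g = np.asarray(f, float), np.asarray(g, float)
    return np.log(np.sum(f**lam * g**(1.0 - lam))) / (lam - 1.0)

def escort_renyi_divergence(f, g, a, lam):
    """Two-parameter (a, lam) divergence, read here as the lam-Renyi
    divergence of the order-a escorts; a = 1 recovers the standard case."""
    return renyi_divergence(escort(f, a), escort(g, a), lam)

f = np.array([0.2, 0.5, 0.3])
g = np.array([0.4, 0.4, 0.2])
print(escort_renyi_divergence(f, g, a=1.0, lam=2.0))   # equals D_2^(R)(f||g)
print(escort_renyi_divergence(f, g, a=0.8, lam=2.0))   # genuinely two-parameter
```

Since escort distributions are themselves probability distributions, nonnegativity is inherited from the standard Rényi divergence, which is the property exploited below to derive the canonical distribution.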

The (a,λ) maximum entropy problem

With the previous definitions at hand, we can now consider the following extended maximum entropy problem:
$$H_{a,\lambda}(m) = \max_f \left\{ H_{a,\lambda}[f] : m_{p,a}[f] = m \ \text{and} \ m_{0,1}[f] = 1 \right\},$$
which includes the previous problems as particular cases. The usual procedure for handling such a variational problem is the technique of Lagrange multipliers, e.g. [7], [8]. However, even though the objective functional is strictly concave (here for suitable values of a and λ), the constraint set is not convex, and the uniqueness and nature of the maximum cannot be guaranteed by this standard approach.
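The non-convexity of the constraint set matters in practice: a generic solver only returns a stationary point. A minimal sketch on a discrete grid, using the Tsallis entropy with an escort-mean constraint as a representative special case (the grid, entropic index and target value are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

x = np.linspace(0.0, 1.0, 51)    # discrete support (illustrative)
q, m_target = 1.5, 0.3           # entropic index and escort-mean target

def neg_tsallis_entropy(p):
    # Tsallis entropy S_q(p) = (1 - sum p**q) / (q - 1), negated for minimization
    return -(1.0 - np.sum(p**q)) / (q - 1.0)

def escort_mean(p):
    # mean of x under the order-q escort of p
    pq = p**q
    return np.sum(x * pq) / np.sum(pq)

constraints = ({'type': 'eq', 'fun': lambda p: np.sum(p) - 1.0},
               {'type': 'eq', 'fun': lambda p: escort_mean(p) - m_target})
p0 = np.full(x.size, 1.0 / x.size)
res = minimize(neg_tsallis_entropy, p0, method='SLSQP',
               bounds=[(1e-12, 1.0)] * x.size, constraints=constraints)
print(res.success, escort_mean(res.x))  # stationary point; global optimality not certified
```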

The case of a two-level system

Finally, we close this Letter with the example of a two-level system, with eigenenergies 0 and 1. Although the general (a,λ) maximum entropy problem (18) usually does not lead to closed-form expressions for the entropies H_{a,λ}(m), see e.g. [25], we can still obtain an explicit solution for this simple two-level system. For this system, the measure at hand charges the two levels, with μ=δ_0+δ_1, where δ_x denotes the Dirac mass at x. Then, the maximum entropy distribution is the discrete distribution with respective weights 1−p and p on the two levels.
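For this two-level case everything can be scanned explicitly in p. A sketch under our reading of H_{a,λ} as the λ-Rényi entropy of the order-a escort (the Letter's exact functional is in the full text; parameter values are illustrative):

```python
import numpy as np

def escort(f, a):
    fa = f**a
    return fa / fa.sum()

def renyi_entropy(f, lam):
    """Renyi entropy of order lam of a discrete distribution."""
    return np.log(np.sum(f**lam)) / (1.0 - lam)

a, lam = 0.8, 1.5                          # illustrative parameter values
p = np.linspace(1e-6, 1.0 - 1e-6, 2001)    # weight of the level with energy 1
m = p**a / (p**a + (1.0 - p)**a)           # escort mean energy (levels 0 and 1)
H = np.array([renyi_entropy(escort(np.array([1.0 - pi, pi]), a), lam) for pi in p])

# m is monotone in p, so the constraint pins p (and hence the entropy) uniquely:
i = np.argmin(np.abs(m - 0.3))             # e.g. escort mean m = 0.3
print("p =", p[i], " H_(a,lam)(m=0.3) ~", H[i])
```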

Conclusions

In this Letter, we have suggested a possible extension of the standard q-divergences and entropies by considering a combination of the concepts of q-divergences and of escort distributions. This leads to two-parameter information measures that recover the classical ones as special cases, and coincide with some recently proposed measures [21]. Further work should examine the general properties of these information measures. We have introduced a general maximum entropy problem and derived the expression of the associated canonical distribution, which connects and extends known results in nonextensive statistics.

References (25)

  • C. Tsallis et al., Physica A (1998)
  • S. Martínez et al., Physica A (2000)
  • C. Vignat et al., Physica A (2009)
  • F. Pennini et al., Physica A (2007)
  • T. Wada et al., Phys. Lett. A (2005)
  • J. Naudts, Chaos Solitons Fractals (2002)
  • H. Suyari et al., Physica A (2008)
  • H. Fujisawa et al., J. Multivar. Anal. (2008)
  • J.F. Bercher, Inf. Sci. (2008)
  • C. Tsallis, Introduction to Nonextensive Statistical Mechanics (2009)
  • C. Beck, Contemp. Phys. (2009)
  • S. Abe et al., Phys. Rev. E (2005)