Mathematical inequalities for some divergences

https://doi.org/10.1016/j.physa.2011.07.052

Abstract

Some natural phenomena deviate from standard statistical behavior, and their study has increased the interest in obtaining new definitions of information measures. However, the procedure for deriving the best definition of the entropy of a given dynamical system remains unknown. In this paper, we introduce some parametric extended divergences combining the Jeffreys divergence and the Tsallis entropy defined by generalized logarithmic functions, which lead to new inequalities. In addition, we give lower bounds for one-parameter extended Fermi–Dirac and Bose–Einstein divergences. Finally, we establish some inequalities for the Tsallis entropy, the Tsallis relative entropy and some divergences by the use of Young's inequality.

Highlights

► We introduce some extended divergences combining the Jeffreys divergence and the Tsallis entropy. ► We then give some new inequalities for them. ► We also give lower bounds for extended Fermi–Dirac and Bose–Einstein divergences. ► Finally, we establish some inequalities for some divergences via Young's inequality.

Introduction

For the study of multifractals, Tsallis [1] introduced in 1988 a one-parameter extension of the Shannon entropy,
$$H_q(p) \equiv -\sum_{j=1}^n p_j^q \ln_q p_j = \sum_{j=1}^n p_j \ln_q \frac{1}{p_j}, \qquad (q \ge 0,\ q \ne 1),$$
where $p=\{p_1,p_2,\ldots,p_n\}$ is a probability distribution with $p_j>0$ for all $j=1,2,\ldots,n$, and the $q$-logarithmic function is defined for $x>0$ by
$$\ln_q(x) \equiv \frac{x^{1-q}-1}{1-q},$$
which converges uniformly to the usual logarithmic function $\log(x)$ in the limit $q\to 1$. Therefore the Tsallis entropy converges to the Shannon entropy in the limit $q\to 1$:
$$\lim_{q\to 1} H_q(p) = H_1(p) \equiv -\sum_{j=1}^n p_j \log p_j.$$
It is also known that the Rényi entropy [2],
$$R_q(p) \equiv \frac{1}{1-q}\log\Big(\sum_{j=1}^n p_j^q\Big),$$
is a one-parameter extension of the Shannon entropy.
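As a quick numerical illustration of these definitions (ours, not part of the original paper), the following Python sketch computes the $q$-logarithm, the Tsallis entropy and the Rényi entropy, and checks that both entropies approach the Shannon entropy as $q\to 1$; the function names and the sample distribution are our own choices.

```python
import numpy as np

def ln_q(x, q):
    """q-logarithm: (x**(1-q) - 1)/(1-q), reducing to log(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x**(1.0 - q) - 1.0) / (1.0 - q)

def tsallis_entropy(p, q):
    """H_q(p) = sum_j p_j * ln_q(1/p_j)."""
    p = np.asarray(p, dtype=float)
    return np.sum(p * ln_q(1.0 / p, q))

def renyi_entropy(p, q):
    """R_q(p) = log(sum_j p_j**q) / (1 - q)."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p**q)) / (1.0 - q)

p = [0.5, 0.3, 0.2]                      # sample distribution (our choice)
shannon = -np.sum(np.asarray(p) * np.log(p))
for q in (0.5, 0.99, 1.01, 2.0):
    print(q, tsallis_entropy(p, q), renyi_entropy(p, q))
print("Shannon:", shannon)               # both columns approach this value as q -> 1
```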

For two probability distributions $p=\{p_1,p_2,\ldots,p_n\}$ and $r=\{r_1,r_2,\ldots,r_n\}$, divergences based on the above quantities are available. We denote by
$$D_q(p\|r) \equiv \sum_{j=1}^n p_j^q\,(\ln_q p_j - \ln_q r_j) = -\sum_{j=1}^n p_j \ln_q \frac{r_j}{p_j}$$
the Tsallis relative entropy. It converges to the usual relative entropy (divergence, Kullback–Leibler information) in the limit $q\to 1$:
$$\lim_{q\to 1} D_q(p\|r) = D_1(p\|r) \equiv \sum_{j=1}^n p_j\,(\log p_j - \log r_j).$$
We also denote by $R_q(p\|r)$ the Rényi relative entropy [2], defined by
$$R_q(p\|r) \equiv \frac{1}{q-1}\log\Big(\sum_{j=1}^n p_j^q r_j^{1-q}\Big).$$
Obviously $\lim_{q\to 1} R_q(p\|r) = D_1(p\|r)$.
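A small self-contained sketch (our own illustration, with hypothetical function names) of the Tsallis and Rényi relative entropies, checking numerically that both reduce to the Kullback–Leibler divergence as $q$ approaches 1:

```python
import numpy as np

def tsallis_relative_entropy(p, r, q):
    """D_q(p||r) = -sum_j p_j * ln_q(r_j / p_j)."""
    p, r = np.asarray(p, float), np.asarray(r, float)
    if np.isclose(q, 1.0):
        return np.sum(p * (np.log(p) - np.log(r)))       # Kullback-Leibler divergence
    ln_q = lambda x: (x**(1.0 - q) - 1.0) / (1.0 - q)
    return -np.sum(p * ln_q(r / p))

def renyi_relative_entropy(p, r, q):
    """R_q(p||r) = log(sum_j p_j**q * r_j**(1-q)) / (q - 1)."""
    p, r = np.asarray(p, float), np.asarray(r, float)
    if np.isclose(q, 1.0):
        return np.sum(p * (np.log(p) - np.log(r)))
    return np.log(np.sum(p**q * r**(1.0 - q))) / (q - 1.0)

p, r = [0.5, 0.3, 0.2], [0.2, 0.5, 0.3]                  # sample distributions
for q in (0.5, 0.999, 1.001, 2.0):
    print(q, tsallis_relative_entropy(p, r, q), renyi_relative_entropy(p, r, q))
```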

The divergences can be considered a generalization of the entropies in the sense that the Shannon entropy can be recovered from the divergence as $H_1(p) = \log n - D_1(p\|u)$ for the uniform distribution $u=\{1/n,1/n,\ldots,1/n\}$. Therefore the study of divergences is important for the development of information science. In this paper, we study several mathematical inequalities related to some generalized divergences.
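As a sanity check (ours, not the paper's), one can verify numerically that $\log n - D_1(p\|u)$ coincides with the Shannon entropy when the reference distribution is uniform:

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])                 # sample distribution (our choice)
n = len(p)
u = np.full(n, 1.0 / n)                       # uniform distribution

kl = np.sum(p * (np.log(p) - np.log(u)))      # D_1(p||u)
shannon = -np.sum(p * np.log(p))              # H_1(p)
print(np.log(n) - kl, shannon)                # the two values coincide
```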


Two-parameter entropies and divergences

In this section and throughout the rest of the paper, we consider $p=\{p_1,p_2,\ldots,p_n\}$ and $r=\{r_1,r_2,\ldots,r_n\}$, with $p_j>0$, $r_j>0$ for all $j=1,2,\ldots,n$, to be probability distributions.

We start by introducing the Tsallis quasilinear entropies; that is, we extend the quasilinear entropies using the Tsallis recipe.

Definition 2.1

For a continuous and strictly monotonic function $\psi$ on $(0,\infty)$ and $r \ge 0$ with $r \ne 1$ (the nonextensivity parameter), the Tsallis quasilinear entropy ($r$-quasilinear entropy) is defined by
$$I_r^{\psi}(p) \equiv \ln_r \psi^{-1}\Big(\sum_{j=1}^n p_j\,\psi\Big(\frac{1}{p_j}\Big)\Big).$$
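The following sketch (our illustration; the particular choice $\psi(x)=\log x$ is ours) evaluates the $r$-quasilinear entropy and checks that for this $\psi$ it equals $\ln_r\!\big(e^{H_1(p)}\big)$, which tends to the Shannon entropy $H_1(p)$ as $r \to 1$:

```python
import numpy as np

def ln_r(x, r):
    """r-logarithm, reducing to the natural logarithm as r -> 1."""
    return np.log(x) if np.isclose(r, 1.0) else (x**(1.0 - r) - 1.0) / (1.0 - r)

def quasilinear_entropy(p, r, psi, psi_inv):
    """I_r^psi(p) = ln_r( psi^{-1}( sum_j p_j * psi(1/p_j) ) )."""
    p = np.asarray(p, float)
    return ln_r(psi_inv(np.sum(p * psi(1.0 / p))), r)

p = [0.5, 0.3, 0.2]                                   # sample distribution (our choice)
shannon = -np.sum(np.asarray(p) * np.log(p))
for r in (0.5, 0.999, 2.0):
    val = quasilinear_entropy(p, r, np.log, np.exp)   # psi = log, psi^{-1} = exp
    print(r, val, ln_r(np.exp(shannon), r))           # the two columns agree
```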

In

Tsallis type divergences

In what follows, the Jeffreys and Jensen–Shannon divergences are extended in the context of Tsallis theory. We first review their definitions.

Definition 3.1

[18], [19]

The Jeffreys divergence is defined by
$$J_1(p\|r) \equiv D_1(p\|r) + D_1(r\|p),$$
and the Jensen–Shannon divergence is defined as
$$JS_1(p\|r) \equiv \frac{1}{2} D_1\Big(p\,\Big\|\,\frac{p+r}{2}\Big) + \frac{1}{2} D_1\Big(r\,\Big\|\,\frac{p+r}{2}\Big).$$
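A minimal numerical sketch of these two classical divergences (function names ours), built on the Kullback–Leibler divergence $D_1$:

```python
import numpy as np

def kl(p, r):
    """D_1(p||r) = sum_j p_j * (log p_j - log r_j)."""
    p, r = np.asarray(p, float), np.asarray(r, float)
    return np.sum(p * (np.log(p) - np.log(r)))

def jeffreys(p, r):
    """J_1(p||r) = D_1(p||r) + D_1(r||p)."""
    return kl(p, r) + kl(r, p)

def jensen_shannon(p, r):
    """JS_1(p||r) = (1/2) D_1(p||m) + (1/2) D_1(r||m), with m = (p + r)/2."""
    m = (np.asarray(p, float) + np.asarray(r, float)) / 2.0
    return 0.5 * kl(p, m) + 0.5 * kl(r, m)

p, r = [0.5, 0.3, 0.2], [0.2, 0.5, 0.3]        # sample distributions
print(jeffreys(p, r), jensen_shannon(p, r))    # both nonnegative; JS is bounded by log 2
```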

Analogously we may define the following divergences.

Definition 3.2

The Jeffreys–Tsallis divergence is
$$J_r(p\|r) \equiv D_r(p\|r) + D_r(r\|p),$$
and the Jensen–Shannon–Tsallis divergence is
$$JS_r(p\|r) \equiv \frac{1}{2} D_r\Big(p\,\Big\|\,\frac{p+r}{2}\Big) + \frac{1}{2} D_r\Big(r\,\Big\|\,\frac{p+r}{2}\Big).$$
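For completeness, a self-contained sketch (our own, with hypothetical function names) of the Tsallis analogues, obtained by using the Tsallis relative entropy $D_r$ in place of $D_1$:

```python
import numpy as np

def tsallis_rel(p, r_dist, r):
    """D_r(p||r_dist) = -sum_j p_j * ln_r(r_dist_j / p_j)."""
    p, r_dist = np.asarray(p, float), np.asarray(r_dist, float)
    if np.isclose(r, 1.0):
        return np.sum(p * (np.log(p) - np.log(r_dist)))
    return -np.sum(p * ((r_dist / p)**(1.0 - r) - 1.0) / (1.0 - r))

def jeffreys_tsallis(p, r_dist, r):
    """J_r(p||r_dist) = D_r(p||r_dist) + D_r(r_dist||p)."""
    return tsallis_rel(p, r_dist, r) + tsallis_rel(r_dist, p, r)

def jensen_shannon_tsallis(p, r_dist, r):
    """JS_r(p||r_dist) with the mixture m = (p + r_dist)/2."""
    m = (np.asarray(p, float) + np.asarray(r_dist, float)) / 2.0
    return 0.5 * tsallis_rel(p, m, r) + 0.5 * tsallis_rel(r_dist, m, r)

p, r_dist = [0.5, 0.3, 0.2], [0.2, 0.5, 0.3]
for r in (0.5, 0.999, 2.0):
    print(r, jeffreys_tsallis(p, r_dist, r), jensen_shannon_tsallis(p, r_dist, r))
```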

We find that Jr(p

Fermi–Dirac and Bose–Einstein type divergences

Fermi–Dirac and Bose–Einstein divergences are often relevant in physics problems related to the elementary particles obeying these statistics, namely fermions and bosons, respectively. As a one-parameter extension of the Fermi–Dirac entropy and the Bose–Einstein entropy (see also Refs. [22], [23], [24], [25]), that is, of
$$I_1^{FD}(p) \equiv -\sum_{j=1}^n p_j \log p_j - \sum_{j=1}^n (1-p_j)\log(1-p_j)$$
and
$$I_1^{BE}(p) \equiv -\sum_{j=1}^n p_j \log p_j + \sum_{j=1}^n (1+p_j)\log(1+p_j),$$
the Fermi–Dirac–Tsallis entropy was introduced in Ref. [23] and physical phenomena for power-law
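A short numerical sketch (ours) of the classical Fermi–Dirac and Bose–Einstein entropies defined above; the sample distribution is our choice and must satisfy $0 < p_j < 1$ so that the $\log(1-p_j)$ term is defined:

```python
import numpy as np

def fermi_dirac_entropy(p):
    """I_1^FD(p) = -sum p_j log p_j - sum (1 - p_j) log(1 - p_j)."""
    p = np.asarray(p, float)
    return -np.sum(p * np.log(p)) - np.sum((1.0 - p) * np.log(1.0 - p))

def bose_einstein_entropy(p):
    """I_1^BE(p) = -sum p_j log p_j + sum (1 + p_j) log(1 + p_j)."""
    p = np.asarray(p, float)
    return -np.sum(p * np.log(p)) + np.sum((1.0 + p) * np.log(1.0 + p))

p = [0.5, 0.3, 0.2]                    # sample distribution, all entries in (0, 1)
print(fermi_dirac_entropy(p), bose_einstein_entropy(p))
```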

Young’s inequality and Tsallis entropies with finite sum

We establish further inequalities involving the Tsallis entropy and the Tsallis relative entropy by applying Young's inequality.

Lemma 5.1 (Young's inequality)

Let $m, n \ge 0$ and $p, q \in \mathbb{R}$ such that $\frac{1}{p}+\frac{1}{q}=1$. If $p<0$ (then $0<q<1$) or $0<p<1$ (then $q<0$), then one has
$$\frac{m^p}{p} + \frac{n^q}{q} \le mn.$$
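A quick random check (ours) of this reverse form of Young's inequality: for conjugate exponents with $0<p<1$ (hence $q<0$), the left-hand side never exceeds $mn$:

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(10_000):
    p = rng.uniform(0.05, 0.95)        # 0 < p < 1, hence q < 0
    q = p / (p - 1.0)                  # conjugate exponent: 1/p + 1/q = 1
    m, n = rng.uniform(0.01, 10.0, size=2)
    lhs = m**p / p + n**q / q
    assert lhs <= m * n + 1e-9         # reverse Young's inequality
print("reverse Young's inequality held in all sampled cases")
```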

Lemma 5.2

  • (i)

    Let $p, q \in \mathbb{R}$ satisfy $\frac{1}{1-p}+\frac{1}{1-q}=1$. If $p>1$ and $0<q<1$, or if $0<p<1$ and $q>1$, then $\ln_p x + \ln_q y \le xy - 1$.

  • (ii)

    Let $p, q \in \mathbb{R}$ satisfy $\frac{1}{p-1}+\frac{1}{q-1}=1$. If $p<1$ and $1<q<2$, or if $1<p<2$ and $q<1$, then $\ln_p \frac{1}{x} + \ln_q \frac{1}{y} \ge -xy + 1$.

Proof

  • (i)

    Using Lemma 5.1, we obtain $\ln_p x + \ln_q y = \frac{x^{1-p}-1}{1-p} + \frac{y^{1-q}-1}{1-q} \le xy - 1$.

  • (ii)

    Lemma 5.1 leads to $\ln_p \frac{1}{x} + \ln_q$
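The following random check (ours) illustrates part (i): for conjugate exponents satisfying $\frac{1}{1-p}+\frac{1}{1-q}=1$ with $p>1$ and $0<q<1$, the sum of the two generalized logarithms stays below $xy-1$:

```python
import numpy as np

def ln_t(x, t):
    """t-logarithm: (x**(1-t) - 1)/(1 - t)."""
    return (x**(1.0 - t) - 1.0) / (1.0 - t)

rng = np.random.default_rng(1)
for _ in range(10_000):
    q = rng.uniform(0.05, 0.95)            # 0 < q < 1
    p = 1.0 / q                            # then p > 1 and 1/(1-p) + 1/(1-q) = 1
    x, y = rng.uniform(0.01, 10.0, size=2)
    assert ln_t(x, p) + ln_t(y, q) <= x * y - 1.0 + 1e-9
print("Lemma 5.2 (i) held in all sampled cases")
```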

Concluding remarks

As we have seen, we established a generalized inequality for a new quantity defined by the Jeffreys divergence and the Tsallis entropy. In addition, we gave lower bounds for the one-parameter extended Fermi–Dirac and Bose–Einstein divergences. These lower bounds give sharper inequalities than their nonnegativity. The nonnegativity of the Tsallis divergence was applied to obtain the maximum entropy principle in Ref. [29]. Therefore one could also check whether the nonnegativity of the one-parameter extended

Acknowledgments

The author (S.F.) was supported in part by the Japanese Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Encouragement of Young Scientists (B), 20740067. The author (F.-C. M.) was supported by CNCSIS Grant 420/2008.

References (30)

  • V. Schwämmle et al., Two-parameter generalization of the logarithm and exponential functions and Boltzmann–Gibbs–Shannon entropy, J. Math. Phys. (2007)
  • S. Arimoto, Information-theoretic considerations on estimation problems, Inform. Control (1971)
  • E. Boekee et al., The R-norm information measure, Inform. Control (1980)
  • E. Aktürk, G.B. Bağci, R. Sever, Is Sharma–Mittal entropy really a step beyond Tsallis and Rényi entropies,...
  • M. Masi, Generalized information-entropy measures and Fisher information,...