
On one extreme value problem for entropy and error probability


Abstract

The problem of determining both the maximum and minimum entropy of a random variable Y, as well as the maximum absolute value of the difference between the entropies of Y and another random variable X, is considered under the condition that the probability distribution of X is fixed and the error probability (i.e., the probability that the values of X and Y do not coincide) is given. An exact expression for the minimum entropy of Y is found. Conditions under which the entropy of Y attains its maximum value are pointed out. In the remaining cases, lower and upper bounds are obtained for the maximum entropy of Y and for the maximum absolute value of the difference between the entropies of Y and X.
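To make the extremal problem concrete, the following is a minimal numerical sketch (not from the paper) that brute-forces the extremes of H(Y) for a small alphabet. It relies on the standard maximal-coupling fact that the total variation distance between the distributions of X and Y never exceeds the error probability P(X ≠ Y), so candidate marginals for Y are searched over the total-variation ball of radius ε around the fixed distribution of X. The distribution p and the error probability eps below are hypothetical, and for large ε this search set can be slightly larger than the set of marginals achievable with error exactly ε.

```python
import itertools
import numpy as np

def entropy(q):
    """Shannon entropy in bits, ignoring zero masses."""
    q = q[q > 0]
    return float(-(q * np.log2(q)).sum())

def extreme_entropies(p, eps, steps=200):
    """Grid-search min/max H(Y) over marginals q with d_TV(p, q) <= eps.

    Since d_TV(P_X, P_Y) <= P(X != Y) for any coupling, the total-variation
    ball of radius eps around p contains every achievable marginal of Y.
    """
    grid = np.linspace(0.0, 1.0, steps + 1)
    best_min, best_max = np.inf, -np.inf
    # enumerate candidate marginals q on the probability simplex
    for head in itertools.product(grid, repeat=len(p) - 1):
        tail = 1.0 - sum(head)
        if tail < -1e-12:
            continue
        q = np.array(head + (max(tail, 0.0),))
        if 0.5 * np.abs(p - q).sum() <= eps + 1e-12:  # d_TV constraint
            h = entropy(q)
            best_min, best_max = min(best_min, h), max(best_max, h)
    return best_min, best_max

p = np.array([0.7, 0.2, 0.1])  # hypothetical fixed distribution of X
eps = 0.1                      # given error probability P(X != Y)
h_min, h_max = extreme_entropies(p, eps)
hx = entropy(p)
print(f"H(X) = {hx:.4f} bits")
print(f"min H(Y) ~ {h_min:.4f}, max H(Y) ~ {h_max:.4f} bits")
print(f"max |H(Y) - H(X)| ~ {max(hx - h_min, h_max - hx):.4f} bits")
```

The paper's contribution is precisely to replace this kind of brute force with an exact expression for the minimum and with closed-form bounds for the maximum.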



Author information


Corresponding author

Correspondence to V. V. Prelov.

Additional information

Original Russian Text © V.V. Prelov, 2014, published in Problemy Peredachi Informatsii, 2014, Vol. 50, No. 3, pp. 3–18.

Supported in part by the Russian Foundation for Basic Research, project no. 12-01-00905-a.


About this article


Cite this article

Prelov, V.V. On one extreme value problem for entropy and error probability. Probl Inf Transm 50, 203–216 (2014). https://doi.org/10.1134/S003294601403016


