Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4123)

Abstract

Among the most investigated parameters for noisy channels are code size, error probability in decoding, and block length; rate, capacity, and the reliability function; delay and complexity of coding. There are several statements about connections between these quantities. They carry names like “coding theorem”, “converse theorem” (weak, strong, ...), “direct theorem”, “capacity theorem”, “lower bound”, “upper bound”, etc. Analogous notions exist for source coding.
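To make the interplay of these parameters concrete (this example is illustrative and not part of the chapter itself), consider the textbook case of a binary symmetric channel with crossover probability p, whose capacity is the classical C = 1 − h(p) with h the binary entropy. A minimal sketch:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy h(p) in bits, with the convention h(0) = h(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity C = 1 - h(p) of a binary symmetric channel, crossover p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # noiseless channel: capacity 1 bit per use
print(bsc_capacity(0.5))  # output independent of input: capacity 0
```

The coding theorem asserts that rates below C are achievable with vanishing error probability as block length grows; the (strong) converse asserts that rates above C force the error probability to 1.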

This note became necessary after the author noticed that Information Theory suffers from a lack of precision in its terminology. Its purpose is to open a discussion of this situation with the goal of gaining more clarity.

There is also some confusion concerning the scopes of analytical and combinatorial methods in probabilistic coding theory, particularly in the theory of identification. We present a covering (or approximation) lemma for hypergraphs which, in particular, makes strong converse proofs in this area transparent and dramatically simplifies them.
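The lemma itself is not reproduced in this preview. As a purely illustrative sketch of the covering idea for hypergraphs (the classical greedy covering heuristic, not the author's lemma), one might write:

```python
def greedy_cover(vertices, hyperedges):
    """Greedily select hyperedges until every vertex is covered.

    hyperedges is a list of sets of vertices. Returns the indices of the
    chosen edges. The classical greedy bound: the cover found has size at
    most (1 + ln |V|) times the optimum, whenever a cover exists.
    """
    uncovered = set(vertices)
    chosen = []
    while uncovered:
        # pick the edge covering the most still-uncovered vertices
        best = max(range(len(hyperedges)),
                   key=lambda i: len(uncovered & hyperedges[i]))
        if not uncovered & hyperedges[best]:
            raise ValueError("no cover exists")
        chosen.append(best)
        uncovered -= hyperedges[best]
    return chosen

edges = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
print(greedy_cover({1, 2, 3, 4, 5, 6}, edges))  # -> [0, 2]
```

Covering arguments of this flavor replace analytical estimates by combinatorial ones: to prove a strong converse one approximates the relevant hypergraph by a small family of edges and counts, rather than bounding probabilities directly.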




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

Cite this chapter

Ahlswede, R. (2006). On Concepts of Performance Parameters for Channels. In: Ahlswede, R., et al. General Theory of Information Transfer and Combinatorics. Lecture Notes in Computer Science, vol 4123. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11889342_40


  • Print ISBN: 978-3-540-46244-6

  • Online ISBN: 978-3-540-46245-3
