Abstract
Among the most investigated parameters for noisy channels are code size, decoding error probability, and block length; rate, capacity, and the reliability function; and delay and complexity of coding. There are several statements about connections between these quantities, carrying names like “coding theorem”, “converse theorem” (weak, strong, ...), “direct theorem”, “capacity theorem”, “lower bound”, “upper bound”, etc. Analogous notions exist for source coding.
This note became necessary after the author noticed that Information Theory suffers from a lack of precision in its terminology. Its purpose is to open a discussion of this situation, with the goal of gaining more clarity.
There is also some confusion concerning the respective scopes of analytical and combinatorial methods in probabilistic coding theory, particularly in the theory of identification. We present a covering (or approximation) lemma for hypergraphs which, in particular, makes strong converse proofs in this area transparent and dramatically simplifies them.
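The abstract does not reproduce the covering lemma itself, so as a rough orientation only, the following sketch illustrates the generic combinatorial idea behind covering a hypergraph: selecting hyperedges greedily until every vertex is covered. All function and variable names here are hypothetical, and this is not the chapter's lemma, merely the flavor of combinatorial tool it formalizes.

```python
def greedy_cover(vertices, hyperedges):
    """Pick hyperedges greedily until every vertex is covered.

    vertices:   set of vertices of the hypergraph
    hyperedges: list of sets, each a hyperedge (subset of vertices)
    Returns the list of chosen hyperedges.
    """
    uncovered = set(vertices)
    chosen = []
    while uncovered:
        # choose the hyperedge covering the most still-uncovered vertices
        best = max(hyperedges, key=lambda e: len(e & uncovered))
        if not (best & uncovered):
            raise ValueError("vertices not coverable by the given hyperedges")
        chosen.append(best)
        uncovered -= best
    return chosen

if __name__ == "__main__":
    V = {1, 2, 3, 4, 5}
    E = [{1, 2, 3}, {3, 4}, {4, 5}, {2, 5}]
    print(greedy_cover(V, E))
```

The greedy rule yields a cover whose size is within a logarithmic factor of optimal; the lemma in the chapter concerns approximation guarantees of this combinatorial kind, applied to the hypergraphs arising in identification.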
© 2006 Springer-Verlag Berlin Heidelberg
Ahlswede, R. (2006). On Concepts of Performance Parameters for Channels. In: Ahlswede, R., et al. General Theory of Information Transfer and Combinatorics. Lecture Notes in Computer Science, vol 4123. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11889342_40
Print ISBN: 978-3-540-46244-6
Online ISBN: 978-3-540-46245-3