Book contents
- Frontmatter
- Contents
- List of contributors
- Preface
- 1 Introduction
- 2 Error-detecting codes
- 3 Repetition and Hamming codes
- 4 Data compression: efficient coding of a random message
- 5 Entropy and Shannon's Source Coding Theorem
- 6 Mutual information and channel capacity
- 7 Approaching the Shannon limit by turbo coding
- 8 Other aspects of coding theory
- References
- Index
7 - Approaching the Shannon limit by turbo coding
Summary
Information Transmission Theorem
The reliable transmission of information-bearing signals over a noisy communication channel is at the heart of what we call communication. Information theory, founded by Claude E. Shannon in 1948 [Sha48], provides a mathematical framework for the theory of communication. It describes the fundamental limits to how efficiently one can encode information and still be able to recover it with negligible loss.
At its inception, the main role of information theory was to provide the engineering and scientific communities with a mathematical framework for the theory of communication by establishing the fundamental limits on the performance of various communication systems. Its birth was marked by the publication of the work of Claude E. Shannon, who showed that it is possible to send information-bearing signals at a fixed code rate through a noisy communication channel with an arbitrarily small error probability, as long as the code rate is below a certain fixed quantity that depends on the channel characteristics [Sha48]; he “baptized” this quantity with the name of channel capacity (see the discussion in Chapter 6). He further proclaimed that random sources – such as speech, music, or image signals – possess an irreducible complexity beyond which they cannot be compressed without distortion. He called this complexity the source entropy (see the discussion in Chapter 5). He went on to assert that if a source has an entropy that is less than the capacity of a communication channel, then asymptotically error-free transmission of the source over the channel can be achieved.
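The condition in this theorem can be checked numerically. The following minimal Python sketch (not from the book) computes the entropy of a binary source and the capacity of a binary symmetric channel, the two quantities compared in Chapters 5 and 6; the particular probabilities 0.1 and 0.05 are illustrative assumptions, not values taken from the text.

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy function H_b(p) in bits (with 0·log 0 taken as 0)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability eps."""
    return 1.0 - binary_entropy(eps)

# Illustrative numbers (assumed for this sketch): a binary source emitting 1s
# with probability 0.1 has entropy H ≈ 0.469 bits per symbol; a BSC with
# crossover probability 0.05 has capacity C ≈ 0.714 bits per channel use.
# Since H < C, Shannon's theorem promises that asymptotically error-free
# transmission is possible, although it does not say how to build the code.
source_entropy = binary_entropy(0.1)
capacity = bsc_capacity(0.05)
print(f"H = {source_entropy:.3f} bits/symbol, C = {capacity:.3f} bits/use")
print("Reliable transmission possible:", source_entropy < capacity)
```

Turbo codes, the subject of this chapter, are one practical construction that comes very close to this capacity limit.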
- Type: Chapter
- Information: A Student's Guide to Coding and Information Theory, pp. 143–166
- Publisher: Cambridge University Press
- Print publication year: 2012