Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Acknowledgments
- Notation
- 1 Introduction
- Part I Preliminaries
- Part II Single-Hop Networks
- 4 Multiple Access Channels
- 5 Degraded Broadcast Channels
- 6 Interference Channels
- 7 Channels with State
- 8 General Broadcast Channels
- 9 Gaussian Vector Channels
- 10 Distributed Lossless Compression
- 11 Lossy Compression with Side Information
- 12 Distributed Lossy Compression
- 13 Multiple Description Coding
- 14 Joint Source–Channel Coding
- Part III Multihop Networks
- Part IV Extensions
- Appendices
- Bibliography
- Common Symbols
- Author Index
- Subject Index
10 - Distributed Lossless Compression
from Part II - Single-Hop Networks
Published online by Cambridge University Press: 05 June 2012
Summary
In this chapter, we begin the discussion of communication of uncompressed sources over multiple noiseless links. We consider the limits on lossless compression of separately encoded sources, which is motivated by distributed sensing problems. For example, consider a sensor network for measuring the temperature at different locations across a city. Suppose that each sensor node compresses its measurement and transmits it to a common base station via a noiseless link. What is the minimum total transmission rate needed so that the base station can losslessly recover the measurements from all the sensors? If the sensor measurements are independent of each other, then the answer is straightforward: each sensor compresses its measurement to the entropy of its respective temperature process, and the limit on the total rate is the sum of the individual entropies. The temperature processes at the sensors, however, can be highly correlated. Can such correlation be exploited to achieve a lower rate than the sum of the individual entropies? Slepian and Wolf showed that the total rate can indeed be reduced to the joint entropy of the processes; that is, the limit on distributed lossless compression is the same as that on centralized compression, where the sources are jointly encoded. The achievability proof of this surprising result uses the new idea of random binning.
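To make the potential savings concrete, here is a minimal numerical sketch (not from the text) using the standard doubly symmetric binary source: X1 is Bernoulli(1/2) and X2 = X1 ⊕ Z, where Z is Bernoulli(p) noise independent of X1. The value p = 0.1 is an arbitrary choice for illustration.

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Doubly symmetric binary source: X1 ~ Bernoulli(1/2) and X2 = X1 xor Z,
# with Z ~ Bernoulli(p) independent of X1, so X2 is also Bernoulli(1/2).
p = 0.1  # illustrative correlation parameter (an assumption, not from the text)

H_X1 = 1.0              # H(X1) = 1 bit: X1 is uniform on {0, 1}
H_X2 = 1.0              # H(X2) = 1 bit: X2 is uniform by symmetry
H_joint = H_X1 + h2(p)  # chain rule: H(X1, X2) = H(X1) + H(X2 | X1) = 1 + h2(p)

print(f"sum of individual entropies: {H_X1 + H_X2:.3f} bits per symbol pair")
print(f"joint entropy (distributed limit): {H_joint:.3f} bits per symbol pair")
```

Running this prints 2.000 and roughly 1.469, so for p = 0.1 distributed Slepian–Wolf coding saves about a quarter of the rate relative to compressing each source to its own entropy.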
We then consider lossless source coding with helpers. Suppose that the base station in our sensor network example wishes to recover the temperature measurements from only a subset of the sensors while using the information sent by the rest of the sensor nodes to help achieve this goal. What is the optimal tradeoff between the rates from the different sensors? We establish the optimal rate region for the case of a single helper node.
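For reference, the single-helper rate region established in this chapter (due to Ahlswede–Körner and Wyner) can be sketched as follows; here X1 is the source to be recovered, X2 is the helper's observation, and U is an auxiliary random variable.

```latex
% One-helper rate region (a sketch, stated from memory of the standard
% result): the set of rate pairs (R_1, R_2) such that
\begin{align*}
  R_1 &\ge H(X_1 \mid U), \\
  R_2 &\ge I(X_2; U)
\end{align*}
% for some conditional pmf p(u | x_2), where the cardinality bound
% |\mathcal{U}| \le |\mathcal{X}_2| + 1 suffices.
```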
In Chapter 20, we continue the discussion of distributed lossless source coding by considering more general networks modeled by graphs.
Distributed Lossless Source Coding for a 2-DMS
Consider the distributed compression system depicted in Figure 10.1, where two sources X1 and X2 are separately encoded (described) at rates R1 and R2, respectively, and the descriptions are communicated over noiseless links to a decoder that wishes to recover both sources losslessly. What is the set of simultaneously achievable description rate pairs (R1, R2)?
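As the chapter goes on to show, the answer is the Slepian–Wolf rate region, sketched below in LaTeX:

```latex
% Slepian--Wolf optimal rate region for a 2-DMS (X_1, X_2):
\begin{align*}
  R_1       &\ge H(X_1 \mid X_2), \\
  R_2       &\ge H(X_2 \mid X_1), \\
  R_1 + R_2 &\ge H(X_1, X_2)
\end{align*}
```

In particular, the minimum sum rate H(X1, X2) matches the centralized compression limit discussed in the summary above.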
- Type: Chapter
- Information: Network Information Theory, pp. 258–273
- Publisher: Cambridge University Press
- Print publication year: 2011