Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Notation
- Contributors
- 1 Introduction to Information Theory and Data Science
- 2 An Information-Theoretic Approach to Analog-to-Digital Compression
- 3 Compressed Sensing via Compression Codes
- 4 Information-Theoretic Bounds on Sketching
- 5 Sample Complexity Bounds for Dictionary Learning from Vector- and Tensor-Valued Data
- 6 Uncertainty Relations and Sparse Signal Recovery
- 7 Understanding Phase Transitions via Mutual Information and MMSE
- 8 Computing Choice: Learning Distributions over Permutations
- 9 Universal Clustering
- 10 Information-Theoretic Stability and Generalization
- 11 Information Bottleneck and Representation Learning
- 12 Fundamental Limits in Model Selection for Modern Data Analysis
- 13 Statistical Problems with Planted Structures: Information-Theoretical and Computational Limits
- 14 Distributed Statistical Inference with Compressed Data
- 15 Network Functional Compression
- 16 An Introductory Guide to Fano’s Inequality with Applications in Statistical Estimation
- Index
- References
15 - Network Functional Compression
Published online by Cambridge University Press: 22 March 2021
Summary
We study the compression of sources located at the nodes of a network when the receiver(s) seek to compute a function of those sources rather than the sources themselves. The rate region of this problem has previously been characterized only under restrictive assumptions; we present results that significantly relax them. For a one-stage tree network, we characterize the rate region via a necessary and sufficient condition on any achievable coloring-based coding scheme, the coloring connectivity condition, and we propose a modularized coding scheme based on graph colorings that performs arbitrarily closely to the derived rate lower bounds. For a general tree network, we provide a rate lower bound based on graph entropies and show that it is tight for independent sources. We show that, in a general tree network with independent sources, achieving the rate lower bound in general requires intermediate nodes to perform computations; however, for a family of functions and random variables that we call chain-rule proper sets, schemes with no computation at intermediate nodes perform arbitrarily closely to the rate lower bound. Finally, we consider the practicalities of coloring-based coding schemes and propose an efficient algorithm for computing a minimum-entropy coloring of a characteristic graph.
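To make the graph-coloring machinery in the summary concrete, here is a purely illustrative sketch, not the chapter's algorithm: exact minimum-entropy coloring is NP-hard in general, so this uses a simple greedy heuristic, and all function and variable names below are ours. It builds the characteristic graph of one source with respect to a second source and a target function, then colors it.

```python
import math
from itertools import combinations

def characteristic_graph(X1, X2, p, f):
    """Edge set of the characteristic graph of X1 with respect to X2, f, and
    joint pmf p: connect x and x' when some y with p(x,y)>0 and p(x',y)>0
    yields f(x,y) != f(x',y), i.e., the decoder must tell x and x' apart."""
    edges = set()
    for x, xp in combinations(X1, 2):
        if any(p.get((x, y), 0) > 0 and p.get((xp, y), 0) > 0
               and f(x, y) != f(xp, y) for y in X2):
            edges.add((x, xp))
    return edges

def greedy_min_entropy_coloring(X1, edges, marginal):
    """Greedy heuristic: visit vertices in order of decreasing probability and
    place each in the heaviest color class containing none of its neighbors,
    opening a new class only when forced."""
    nbrs = {x: set() for x in X1}
    for x, xp in edges:
        nbrs[x].add(xp)
        nbrs[xp].add(x)
    classes = []  # each entry: [set of vertices, probability mass]
    color = {}
    for x in sorted(X1, key=marginal.get, reverse=True):
        feasible = [i for i, (cls, _) in enumerate(classes) if not (cls & nbrs[x])]
        if feasible:
            i = max(feasible, key=lambda j: classes[j][1])
        else:
            classes.append([set(), 0.0])
            i = len(classes) - 1
        classes[i][0].add(x)
        classes[i][1] += marginal[x]
        color[x] = i
    return color, classes

def coloring_entropy(classes):
    """Entropy H(c(X)) in bits of the color of a random vertex."""
    return -sum(m * math.log2(m) for _, m in classes if m > 0)

# Example: f(x, y) = (x + y) mod 2 with X1, X2 uniform on {0, 1, 2, 3}.
# Only the parity of x matters, so two color classes (1 bit) suffice,
# versus the 2 bits needed to transmit x itself.
X1 = X2 = [0, 1, 2, 3]
p = {(x, y): 1 / 16 for x in X1 for y in X2}
f = lambda x, y: (x + y) % 2
marginal = {x: sum(p[(x, y)] for y in X2) for x in X1}
color, classes = greedy_min_entropy_coloring(
    X1, characteristic_graph(X1, X2, p, f), marginal)
print(color)                      # {0: 0, 1: 1, 2: 0, 3: 1}
print(coloring_entropy(classes))  # 1.0
```

The point of the construction is that distinct colors are needed only for source values the decoder must distinguish to compute f; in the coloring-based schemes of this literature, encoding the color of the source instead of the source itself, followed by entropy coding of the colors, is what allows rates approaching the graph-entropy bounds mentioned above.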
- Type: Chapter
- Information: Information-Theoretic Methods in Data Science, pp. 455–486
- Publisher: Cambridge University Press
- Print publication year: 2021