Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Notation
- Contributors
- 1 Introduction to Information Theory and Data Science
- 2 An Information-Theoretic Approach to Analog-to-Digital Compression
- 3 Compressed Sensing via Compression Codes
- 4 Information-Theoretic Bounds on Sketching
- 5 Sample Complexity Bounds for Dictionary Learning from Vector- and Tensor-Valued Data
- 6 Uncertainty Relations and Sparse Signal Recovery
- 7 Understanding Phase Transitions via Mutual Information and MMSE
- 8 Computing Choice: Learning Distributions over Permutations
- 9 Universal Clustering
- 10 Information-Theoretic Stability and Generalization
- 11 Information Bottleneck and Representation Learning
- 12 Fundamental Limits in Model Selection for Modern Data Analysis
- 13 Statistical Problems with Planted Structures: Information-Theoretical and Computational Limits
- 14 Distributed Statistical Inference with Compressed Data
- 15 Network Functional Compression
- 16 An Introductory Guide to Fano’s Inequality with Applications in Statistical Estimation
- Index
- References
2 - An Information-Theoretic Approach to Analog-to-Digital Compression
Published online by Cambridge University Press: 22 March 2021
Summary
Processing, storing, and communicating information that originates as an analog phenomenon involve converting that information to bits. This conversion can be described as the combined effect of sampling and quantization: the analog signal is first sampled, so that it is represented by a set of discrete-time samples, and these samples are then quantized to a finite number of bits. Traditionally, the two operations are considered separately. The sampler is designed to minimize the information loss due to sampling, based on prior assumptions about the continuous-time input, while the quantizer is designed to represent the samples as accurately as possible, subject to a constraint on the number of bits available for the representation. The goal of this chapter is to revisit this paradigm by considering the joint effect of the two operations and to illuminate the dependence between them.
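To make the two-stage pipeline concrete, here is a minimal sketch of sampling followed by uniform quantization. All parameters (the 10 Hz sinusoid standing in for the analog source, the 1 kHz sampling rate, and the 4-bit mid-rise quantizer) are illustrative assumptions, not values taken from the chapter.

```python
# Sketch of the classical sample-then-quantize pipeline.
# Signal and quantizer parameters below are assumed for illustration.
import numpy as np

# "Analog" source: a bandlimited test signal evaluated where needed.
f_sig = 10.0                                   # signal frequency (Hz), assumed
x = lambda t: np.sin(2 * np.pi * f_sig * t)

# Step 1: sampling -- represent the signal by discrete-time samples.
fs = 1000.0                                    # sampling rate (Hz), well above Nyquist
t = np.arange(0.0, 1.0, 1.0 / fs)
samples = x(t)

# Step 2: quantization -- map each sample to one of 2**b mid-rise levels.
b = 4                                          # bits per sample, assumed
levels = 2 ** b
amp = 1.0                                      # known peak amplitude of the test signal
step = 2 * amp / levels
idx = np.clip(np.floor(samples / step), -levels // 2, levels // 2 - 1)
quantized = step * (idx + 0.5)

# Distortion introduced by the bit constraint (mean squared error),
# close to the step**2 / 12 value predicted for fine uniform quantization.
mse = np.mean((samples - quantized) ** 2)
print(f"{b}-bit quantizer at fs = {fs:.0f} Hz: MSE = {mse:.2e}")
```

In this sketch the sampler and quantizer are designed independently, which is exactly the separation the chapter revisits: a joint design can trade sampling rate against bits per sample under a fixed total bit budget.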
Type: Chapter
Information: Information-Theoretic Methods in Data Science, pp. 44-71
Publisher: Cambridge University Press
Print publication year: 2021