Book contents
- Frontmatter
- Dedication
- Contents
- Acknowledgements
- Note to the Reader
- Interdependence of Chapters
- Introduction
- 1 Fundamental Functional Equations
- 2 Shannon Entropy
- 3 Relative Entropy
- 4 Deformations of Shannon Entropy
- 5 Means
- 6 Species Similarity and Magnitude
- 7 Value
- 8 Mutual Information and Metacommunities
- 9 Probabilistic Methods
- 10 Information Loss
- 11 Entropy Modulo a Prime
- 12 The Categorical Origins of Entropy
- Appendix A The Categorical Origins of Entropy
- Appendix B Summary of Conditions
- References
- Index of Notation
- Index
11 - Entropy Modulo a Prime
Published online by Cambridge University Press: 21 April 2021
Summary
Ordinary probabilities are real numbers, and ordinary entropy is a real number too. Building on ideas of Kontsevich, we develop an analogue of entropy in which both the probabilities and entropy itself are integers modulo a prime number p. While the formula for entropy mod p is quite unlike the formula for real entropy, we prove characterization theorems for entropy mod p and information loss mod p that are very closely analogous to the theorems over the real numbers, thus justifying the definition. We also establish a sense in which entropy mod p is the residue mod p of real entropy.
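The chapter's definition is not reproduced on this page, but the idea can be illustrated with a short sketch. The code below assumes (as an assumption, not a quotation from the chapter) that the entropy mod p of a distribution whose probabilities lie in Z/pZ and sum to 1 can be computed as H_p = (1 − Σ aᵢᵖ)/p mod p, where the aᵢ are integer representatives of the probabilities chosen to sum to exactly 1; Fermat's little theorem then makes the numerator divisible by p, and the result is independent of the chosen representatives. The sign and normalisation used in the book may differ.

```python
from fractions import Fraction

def entropy_mod_p(probs, p):
    """Entropy mod p of a distribution given by rationals whose
    denominators are prime to p (so they define elements of Z/pZ).

    Assumed formula (not quoted from the chapter): lift the probabilities
    to integers a_1, ..., a_n summing to exactly 1, then
        H_p = (1 - sum(a_i ** p)) / p   (mod p).
    Fermat's little theorem makes the numerator divisible by p, and the
    answer does not depend on the chosen lifts.
    """
    probs = [Fraction(q) for q in probs]
    # Reduce each probability to its residue in Z/pZ.
    residues = [q.numerator * pow(q.denominator, -1, p) % p for q in probs]
    assert sum(residues) % p == 1, "probabilities must sum to 1 mod p"
    # Adjust one lift by a multiple of p so the lifts sum to exactly 1.
    a = list(residues)
    a[0] -= sum(a) - 1
    numerator = 1 - sum(ai ** p for ai in a)
    return (numerator // p) % p

# Example under the stated assumption: p = 3 and the distribution (1/2, 1/2),
# which reduces to (2, 2) in Z/3Z.
print(entropy_mod_p([Fraction(1, 2), Fraction(1, 2)], 3))  # prints 1
```

With these assumptions, the sketch returns 1 for the distribution (1/2, 1/2) mod 3; the point is only to show how probabilities in Z/pZ can be lifted to integers and fed into a single integer formula, in contrast to the logarithmic formula for real entropy.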
- Type: Chapter
- Information: Entropy and Diversity: The Axiomatic Approach, pp. 343–367
- Publisher: Cambridge University Press
- Print publication year: 2021