Book contents
- Frontmatter
- Dedication
- Contents
- Acknowledgements
- Note to the Reader
- Interdependence of Chapters
- Introduction
- 1 Fundamental Functional Equations
- 2 Shannon Entropy
- 3 Relative Entropy
- 4 Deformations of Shannon Entropy
- 5 Means
- 6 Species Similarity and Magnitude
- 7 Value
- 8 Mutual Information and Metacommunities
- 9 Probabilistic Methods
- 10 Information Loss
- 11 Entropy Modulo a Prime
- 12 The Categorical Origins of Entropy
- Appendix A The Categorical Origins of Entropy
- Appendix B Summary of Conditions
- References
- Index of Notation
- Index
9 - Probabilistic Methods
Published online by Cambridge University Press: 21 April 2021
Summary
Probability theory can be used to solve certain entirely deterministic functional equations, using a technique pioneered by Aubrun and Nechita. We give some background on moment generating functions, large deviations, and convex conjugates, then state Aubrun and Nechita’s multiplicative characterization of the p-norms (giving a variant of their proof). We then use this theorem on norms to give a multiplicative characterization of the power means.
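The multiplicative property at the heart of Aubrun and Nechita's characterization can be checked numerically. The sketch below is illustrative only (the function names are not from the text): it verifies that the p-norm ||x||_p = (Σ|x_i|^p)^(1/p) is multiplicative with respect to the tensor (Kronecker) product of vectors, which is the defining condition in their theorem.

```python
import itertools

def p_norm(x, p):
    """The p-norm ||x||_p = (sum |x_i|^p)^(1/p), for p > 0."""
    return sum(abs(xi) ** p for xi in x) ** (1.0 / p)

def tensor(x, y):
    """Tensor (Kronecker) product of two vectors: all pairwise products x_i * y_j."""
    return [xi * yj for xi, yj in itertools.product(x, y)]

# Multiplicativity: ||x (x) y||_p = ||x||_p * ||y||_p,
# since sum over (i, j) of |x_i y_j|^p factorizes into a product of sums.
x = [1.0, 2.0, 3.0]
y = [0.5, 4.0]
p = 2.5
lhs = p_norm(tensor(x, y), p)
rhs = p_norm(x, p) * p_norm(y, p)
assert abs(lhs - rhs) < 1e-9
```

The same factorization argument underlies the chapter's multiplicative characterization of the power means, where the sum is weighted by a probability distribution.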
In: *Entropy and Diversity: The Axiomatic Approach*, pp. 303–328. Publisher: Cambridge University Press. Print publication year: 2021.