Book contents
- Frontmatter
- Dedication
- Contents
- Acknowledgements
- Note to the Reader
- Interdependence of Chapters
- Introduction
- 1 Fundamental Functional Equations
- 2 Shannon Entropy
- 3 Relative Entropy
- 4 Deformations of Shannon Entropy
- 5 Means
- 6 Species Similarity and Magnitude
- 7 Value
- 8 Mutual Information and Metacommunities
- 9 Probabilistic Methods
- 10 Information Loss
- 11 Entropy Modulo a Prime
- 12 The Categorical Origins of Entropy
- Appendix A The Categorical Origins of Entropy
- Appendix B Summary of Conditions
- References
- Index of Notation
- Index
8 - Mutual Information and Metacommunities
Published online by Cambridge University Press: 21 April 2021
Summary
We give a short introduction to some classical information-theoretic quantities: joint entropy, conditional entropy and mutual information. We then interpret their exponentials ecologically, as meaningful measures of subcommunities of a larger metacommunity. These subcommunity and metacommunity measures have excellent logical properties, as we establish. We also show how all these quantities can be presented in terms of relative entropy and the value measures of the previous chapter.
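The quantities named in this summary can be illustrated concretely. The sketch below (not taken from the book; the toy distribution and variable names are invented for illustration) computes joint entropy, conditional entropy and mutual information from a small joint distribution, then exponentiates them to obtain the "effective number" style measures that the chapter interprets ecologically.

```python
# A minimal sketch, assuming a 2x2 joint distribution p[i][j] over
# two variables. Natural logarithms are an arbitrary choice here.
from math import log, exp

def entropy(dist):
    """Shannon entropy -sum p*log(p), skipping zero entries."""
    return -sum(p * log(p) for p in dist if p > 0)

# Toy joint distribution (hypothetical, chosen for illustration).
joint = [[0.125, 0.125], [0.25, 0.5]]

flat = [p for row in joint for p in row]
px = [sum(row) for row in joint]                       # marginal of X
py = [sum(row[j] for row in joint) for j in range(2)]  # marginal of Y

H_xy = entropy(flat)               # joint entropy H(X, Y)
H_x, H_y = entropy(px), entropy(py)
H_x_given_y = H_xy - H_y           # conditional entropy H(X | Y)
I_xy = H_x + H_y - H_xy            # mutual information I(X; Y)

# Exponentials turn entropies into effective counts, the form in
# which the chapter reads them as diversity-like measures.
print(exp(H_xy), exp(H_x_given_y), exp(I_xy))
```

The identities used above (the chain rule H(X | Y) = H(X, Y) - H(Y) and I(X; Y) = H(X) + H(Y) - H(X, Y)) are the standard ones from information theory; the ecological interpretation of their exponentials is developed in the chapter itself.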
Entropy and Diversity: The Axiomatic Approach, pp. 257–302. Publisher: Cambridge University Press. Print publication year: 2021.