Book contents
- Frontmatter
- Contents
- Preface
- 1 Thermodynamic entropy
- 2 Statistical entropy
- 3 Entropy of classical systems
- 4 Entropy of quantized systems
- 5 Entropy of a non-isolated system
- 6 Entropy of fermion systems
- 7 Entropy of systems of bosons
- 8 Entropy of information
- Epilogue
- Appendix I Physical constants and standard definitions
- Appendix II Formulary
- Appendix III Glossary
- Appendix IV Time line
- Appendix V Answers to problems
- Appendix VI Annotated further reading
- Index
Preface
The mathematician John von Neumann once urged the information theorist Claude Shannon to assign the name entropy to the measure of uncertainty Shannon had been investigating. After all, a structurally identical measure with the name entropy had long been an element of statistical mechanics. Furthermore, “No one really knows what entropy really is, so in a debate you will always have the advantage.” Most of us love clever one-liners and allow each other to bend the truth in making them. But von Neumann was wrong about entropy. Many people have understood the concept of entropy since it was first discovered 150 years ago.
Actually, scientists have no choice but to understand entropy because the concept describes an important aspect of reality. We know how to calculate and how to measure the entropy of a physical system. We know how to use entropy to solve problems and to place limits on processes. We understand the role of entropy in thermodynamics and in statistical mechanics. We also understand the parallelism between the entropy of physics and chemistry and the entropy of information theory.
But von Neumann’s witticism contains a kernel of truth: entropy is difficult, if not impossible, to visualize. Consider, by contrast, how we invest the energy of a rod of iron with meaning. We imagine the rod broken into its smallest parts, atoms of iron, and compare the energy of each atom to that of a macroscopic, massive object attached to a network of springs that model the atom’s interactions with its nearest neighbors. The object’s energy is then the sum of its kinetic and potential energies, types of energy that can be studied in elementary physics laboratories. Finally, the energy of the entire system is the sum of the energies of its parts.
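This additivity is easy to make concrete. The following minimal sketch (in Python) models the mass-and-spring picture as a short one-dimensional chain of atoms coupled to nearest neighbors by springs; the atomic mass is that of iron, but the spring constant, displacements, and velocities are illustrative assumptions, not measured values. The total energy is just the kinetic sum plus the spring-potential sum, exactly the part-by-part bookkeeping that has no analogue for entropy.

```python
import numpy as np

# Toy mass-and-spring model of a 1D chain of iron atoms.
m = 9.27e-26   # mass of one iron atom (kg)
k = 50.0       # nearest-neighbor spring constant (N/m): an assumed, illustrative value

rng = np.random.default_rng(0)
n = 10                                   # number of atoms in the toy chain
x = rng.normal(0.0, 1e-12, size=n)       # illustrative displacements from lattice sites (m)
v = rng.normal(0.0, 1e2, size=n)         # illustrative atomic velocities (m/s)

kinetic = 0.5 * m * np.sum(v**2)              # sum of (1/2) m v^2 over all atoms
potential = 0.5 * k * np.sum(np.diff(x)**2)   # (1/2) k (x_{i+1} - x_i)^2 for each spring
total = kinetic + potential                   # energy of the whole = sum of the parts

print(f"kinetic = {kinetic:.3e} J, potential = {potential:.3e} J, total = {total:.3e} J")
```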
A Student's Guide to Entropy, pp. ix–xii. Cambridge University Press, 2013.