Book contents
- Frontmatter
- Contents
- Preface
- 1 Thermodynamic entropy
- 2 Statistical entropy
- 3 Entropy of classical systems
- 4 Entropy of quantized systems
- 5 Entropy of a non-isolated system
- 6 Entropy of fermion systems
- 7 Entropy of systems of bosons
- 8 Entropy of information
- Epilogue
- Appendix I Physical constants and standard definitions
- Appendix II Formulary
- Appendix III Glossary
- Appendix IV Time line
- Appendix V Answers to problems
- Appendix VI Annotated further reading
- Index
8 - Entropy of information
Published online by Cambridge University Press: 05 September 2014
Summary
Messages and message sources
Information technologies are as old as the first recorded messages, but not until the twentieth century did engineers and scientists begin to quantify something they called "information." Yet the word "information" poorly describes the concept the first information theorists quantified. Of course, specialists have every right to take words in common use and give them new meanings. Isaac Newton, for instance, defined force and work in ways useful in his theory of dynamics. But a well-chosen name is one whose special, technical meaning does not clash with its range of common meanings. Curiously, the "information" of information theory violates this commonsense rule.
Compare the opening phrase of Dickens's A Tale of Two Cities, "It was the best of times, it was the worst of times …", with the following sequence of 50 letters, spaces, and a comma: "eon jhktsiwnsho d ri nwfnn ti losabt,tob euffr te …", formed by taking the character in the tenth position on each of the first 50 pages of the same book. To me the first is richly associative; the second means nothing. The first has meaning and form; the second does not. Yet these two phrases could be said to carry the same information content, because they have the same source: each is a sequence of 50 characters taken from English text.
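A small worked example can make that claim concrete: in information theory, the information content of a message depends only on the statistics of its source, not on the message's meaning. The minimal Python sketch below assumes the standard Shannon measure, H = -sum_i p_i log2(p_i) bits per character; the letter-frequency table is a rough, illustrative approximation of English text, not data from the book.

```python
import math

# Approximate relative frequencies of characters in English text
# (letters plus space). These values are illustrative only.
freq = {
    ' ': 0.18, 'e': 0.10, 't': 0.075, 'a': 0.065, 'o': 0.062,
    'i': 0.058, 'n': 0.057, 's': 0.053, 'h': 0.05, 'r': 0.05,
    'd': 0.035, 'l': 0.033, 'u': 0.023, 'c': 0.023, 'm': 0.02,
    'w': 0.02, 'f': 0.018, 'g': 0.016, 'y': 0.016, 'p': 0.016,
    'b': 0.012, 'v': 0.008, 'k': 0.006, 'j': 0.001, 'x': 0.001,
    'q': 0.001, 'z': 0.001,
}

# Shannon entropy of the source, in bits per character:
# H = -sum_i p_i * log2(p_i)
H = -sum(p * math.log2(p) for p in freq.values())

# Any 50-character message drawn from this source -- Dickens's
# opening phrase or the scrambled sequence alike -- carries the
# same expected information content, 50 * H bits, because that
# quantity depends only on the source, not on the message.
print(f"H = {H:.2f} bits per character")
print(f"50-character message: {50 * H:.1f} bits")
```

On this accounting, the meaningful and the meaningless sequence are indistinguishable: both are 50 characters drawn from the same English-text source, so both carry the same expected number of bits.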
- Type: Chapter
- Information: A Student's Guide to Entropy, pp. 140-158
- Publisher: Cambridge University Press
- Print publication year: 2013