1 - Introduction
Published online by Cambridge University Press: 05 July 2012
Summary
This book has two parts: the first summarizes the facts of coding and information theory needed to understand the essence of estimation and statistics, and the second describes a new theory of estimation, one that also covers a good part of statistics. After all, both estimation and statistics are about extracting information from often chaotic-looking data in order to learn what makes the data behave the way they do. The first part, together with the outline of algorithmic information in Appendix A, is meant for the statistician who wants to understand his or her discipline rather than just learn a bag of tricks and programs to apply to various data. Such tricks are not based on any theory and do not stand up to critical examination, although some of them can be quite useful and provide solutions to important statistical problems.
The word information has many meanings, two of which were formalized by Shannon. The first is fundamental in communication: it is simply the number of messages, strings of symbols, either to be stored or to be sent over some communication channel; the practical question is the size of the storage device needed or the time it takes to send the messages. The second meaning is a measure of the strength of a statistical property a string has, which is fundamental in statistics and very different from the meaning in communication.
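Shannon's first meaning can be made concrete by counting the bits an ideal code needs for a string. The sketch below (an illustrative choice, not the book's construction) computes this from the string's empirical symbol frequencies:

```python
import math
from collections import Counter

def ideal_code_length(s: str) -> float:
    """Bits an ideal (Shannon) code needs to store or transmit s,
    using the string's own empirical symbol frequencies as the model.
    A symbol occurring with relative frequency p costs -log2(p) bits."""
    counts = Counter(s)
    n = len(s)
    return sum(c * -math.log2(c / n) for c in counts.values())

# A highly regular string needs fewer bits than a varied one of the
# same length: "abcdefgh" costs 3 bits per symbol, 24 bits in total.
print(ideal_code_length("aaaaaaab"))
print(ideal_code_length("abcdefgh"))
```

The contrast between the two calls illustrates the communication-theoretic sense of information: the regular string is cheaper to store or send, regardless of what statistical property it may exhibit.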
- Type: Chapter
- Information: Optimal Estimation of Parameters, pp. 1-8. Publisher: Cambridge University Press. Print publication year: 2012.