A - Elements of algorithmic information
Published online by Cambridge University Press: 05 July 2012
Summary
Although the material in this appendix is not used directly in the rest of the book, it is profoundly important for understanding the very essence of randomness and complexity, which are fundamental to probability and statistics. Algorithmic information, or complexity, theory is founded on the theory of recursive functions, whose roots go back to the logicians Gödel, Kleene, Church, and above all to Turing, who described a universal computer, the Turing machine, whose computing capability is no less than that of the latest supercomputers. The field of recursiveness has grown into an extensive branch of mathematics, of which we give just the very basics relevant to statistics. For a comprehensive treatment we refer the reader to [26].
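To make the notion of a Turing machine concrete, the following is a minimal sketch of a single-tape machine simulator. It is not taken from the appendix; the transition table, state names, and the unary-increment example are hypothetical illustrations only.

```python
# A minimal sketch of a single-tape Turing machine simulator.
# Hypothetical example: the transition table below implements unary
# increment and is not drawn from the book's appendix.

def run_turing_machine(transitions, tape, state="q0", blank="_", max_steps=10_000):
    """Run a single-tape Turing machine.

    transitions: dict mapping (state, symbol) -> (new_state, new_symbol, move),
                 where move is -1 (step left) or +1 (step right).
    The machine halts when no transition is defined for the current
    (state, symbol) pair, or after max_steps steps.
    """
    cells = dict(enumerate(tape))      # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in transitions:
            break                      # halt: no applicable rule
        state, cells[head], move = transitions[(state, symbol)]
        head += move
    # Read off the tape contents in positional order, dropping blanks.
    return "".join(s for _, s in sorted(cells.items())).strip(blank)

# Transition table: scan right over a unary string of 1s, write one more
# '1' at the first blank cell, then halt.
INCREMENT = {
    ("q0", "1"): ("q0", "1", +1),      # skip over existing 1s
    ("q0", "_"): ("halt", "1", +1),    # write a 1 at the first blank
}
```

For example, running `run_turing_machine(INCREMENT, "111")` yields `"1111"`. The point of the sketch is that the machine's "program" is nothing but a finite table of such rules, which is exactly the primitive character of Turing-machine programs alluded to below.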
The field is somewhat peculiar in that the derivations and proofs of the basic results can be carried out in two very different manners. First, since the partial recursive functions can be axiomatized, as is common in other branches of mathematics, the proofs are of the familiar kind. But since the same set of functions is also defined in terms of a computer, the Turing machine, a proof can instead take the form of a program. The programs of that machine, being binary strings, are very primitive, like the machine-language programs of modern computers, and they tend to be long, hard to read, and hard to understand. To shorten them and make them more comprehensible, an appeal is often made to intuition, and the details are left for the reader to fill in.
- Optimal Estimation of Parameters, pp. 144–152. Publisher: Cambridge University Press. Print publication year: 2012.