Preface
Published online by Cambridge University Press: 05 July 2012
Summary
I have a long-standing interest in estimation, which began with attempts to control industrial processes. It did not take long to realize that the control part is easy if you know the behavior of the process you want to control, which means that the real problem is estimation. When I was asked by the Information Theory Society to give the 2009 Shannon Lecture, I thought of giving a coherent survey of estimation theory. However, during the year given to prepare the talk I found that this was not possible, because there was no coherent theory of estimation. There was a collection of facts and results, but they were isolated, with little to connect them.

To my surprise this applied even to the works of some of the greatest names in statistics, such as Fisher, Cramér, and Rao, which I had been familiar with for decades but had never questioned until now, when I was more or less forced to do so. As an example, the famous maximum likelihood estimator due to Fisher [12] had virtually no formal justification. The celebrated Cramér–Rao inequality gives it a non-asymptotic justification only for special models, and for more general parametric models only an asymptotic one. Clearly, no workable theory should be founded on asymptotic behavior. About the value of asymptotics, we quote Keynes' famous quip that “asymptotically we all shall be dead.”
*Optimal Estimation of Parameters*, pp. vii–viii. Publisher: Cambridge University Press. Print publication year: 2012.