Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Acknowledgments
- Notation
- Part I Classic Statistical Inference
- Part II Early Computer-Age Methods
- 6 Empirical Bayes
- 7 James–Stein Estimation and Ridge Regression
- 8 Generalized Linear Models and Regression Trees
- 9 Survival Analysis and the EM Algorithm
- 10 The Jackknife and the Bootstrap
- 11 Bootstrap Confidence Intervals
- 12 Cross-Validation and Cp Estimates of Prediction Error
- 13 Objective Bayes Inference and MCMC
- 14 Statistical Inference and Methodology in the Postwar Era
- Part III Twenty-First-Century Topics
- Epilogue
- References
- Author Index
- Subject Index
13 - Objective Bayes Inference and MCMC
from Part II - Early Computer-Age Methods
Published online by Cambridge University Press: 05 July 2016
Summary
From its very beginnings, Bayesian inference exerted a powerful influence on statistical thinking. The notion of a single coherent methodology employing only the rules of probability to go from assumption to conclusion was and is immensely attractive. For 200 years, however, two impediments stood between Bayesian theory's philosophical attraction and its practical application.
- In the absence of relevant past experience, the choice of a prior distribution introduces an unwanted subjective element into scientific inference.
- Bayes’ rule (3.5) looks simple enough, but carrying out the numerical calculation of a posterior distribution often involves intricate higher-dimensional integrals (a minimal sketch follows below).
The two impediments fit neatly into the dichotomy of Chapter 1, the first being inferential and the second algorithmic.
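To make the second, algorithmic impediment concrete, here is a minimal sketch of Bayes’ rule computed by brute-force grid integration. The setup is hypothetical and not from the text: a N(0, 1) prior and a single N(theta, 1) observation. In one dimension the normalizing integral is a simple sum; a grid of k points per coordinate needs k**d points in d dimensions, which is exactly the burden described above.

```python
import numpy as np

# Bayes' rule by brute-force grid integration (hypothetical example):
# posterior(theta | x) is proportional to prior(theta) * likelihood(x | theta).
x = 1.2                                     # one observed data point
theta = np.linspace(-5.0, 5.0, 1001)        # grid over the parameter
dtheta = theta[1] - theta[0]

prior = np.exp(-0.5 * theta**2)             # N(0, 1) prior, unnormalized
likelihood = np.exp(-0.5 * (x - theta)**2)  # N(theta, 1) likelihood

unnormalized = prior * likelihood
# The normalizing constant is the integral that becomes intractable in
# high dimensions; here it is just a one-dimensional Riemann sum.
posterior = unnormalized / (unnormalized.sum() * dtheta)

posterior_mean = (theta * posterior).sum() * dtheta
print(posterior_mean)                       # analytically x/2 = 0.6 here
```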
A renewed cycle of Bayesian enthusiasm took hold in the 1960s, at first concerned mainly with coherent inference. Building on work by Bruno de Finetti and L. J. Savage, statisticians constructed a principled theory of subjective probability: the Bayesian statistician, by the careful elicitation of prior knowledge, utility, and belief, arrives at the correct subjective prior distribution for the problem at hand. Subjective Bayesianism is particularly appropriate for individual decision making, say for the business executive trying to choose the best investment in the face of uncertain information.
It is less appropriate for scientific inference, where the sometimes skeptical world of science puts a premium on objectivity. An answer came from the school of objective Bayes inference. Following the approach of Laplace and Jeffreys, as discussed in Section 3.2, its adherents sought to fashion objective, or “uninformative,” prior distributions that were in some sense unbiased in their effects upon the data analysis.
In what came as a surprise to the Bayes community, the objective school has been the most successful in bringing Bayesian ideas to bear on scientific data analysis. Of the 24 articles in the December 2014 issue of the Annals of Applied Statistics, 8 employed Bayesian analysis, predominantly based on objective priors.
This is where electronic computation enters the story. Commencing in the 1980s, dramatic steps forward were made in the numerical calculation of high-dimensional Bayes posterior distributions. Markov chain Monte Carlo (MCMC) is the generic name for modern posterior computation algorithms.
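As an illustration of the idea, here is a minimal random-walk Metropolis sampler, one of the simplest MCMC algorithms, applied to the same hypothetical normal prior/likelihood posterior sketched above; the data, step size, and sample counts are illustrative assumptions, not prescriptions from the text.

```python
import numpy as np

# Random-walk Metropolis: the simplest MCMC scheme. The target is a
# hypothetical unnormalized posterior: N(0, 1) prior times a normal
# likelihood for simulated observations.
rng = np.random.default_rng(0)
data = rng.normal(loc=1.5, scale=1.0, size=20)   # simulated observations

def log_posterior(theta):
    log_prior = -0.5 * theta**2                   # N(0, 1) prior
    log_lik = -0.5 * np.sum((data - theta) ** 2)  # N(theta, 1) likelihood
    return log_prior + log_lik

def metropolis(n_samples, step=0.5, theta0=0.0):
    samples = np.empty(n_samples)
    theta, logp = theta0, log_posterior(theta0)
    for i in range(n_samples):
        proposal = theta + step * rng.normal()    # symmetric random-walk proposal
        logp_new = log_posterior(proposal)
        # Accept with probability min(1, posterior ratio), on the log scale.
        if np.log(rng.uniform()) < logp_new - logp:
            theta, logp = proposal, logp_new
        samples[i] = theta
    return samples

draws = metropolis(10_000)
print(draws[2_000:].mean())  # posterior mean estimate after burn-in
```

Note that the sampler never evaluates the normalizing integral: only posterior ratios enter the accept/reject step, which is what makes MCMC feasible in high dimensions where grid integration fails.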
Computer Age Statistical Inference: Algorithms, Evidence, and Data Science, pp. 233–263. Publisher: Cambridge University Press. Print publication year: 2016.