Book contents
- Frontmatter
- Contents
- Preface to the Second Edition
- Preface to the First Edition
- 1 Algorithms and Computers
- 2 Computer Arithmetic
- 3 Matrices and Linear Equations
- 4 More Methods for Solving Linear Equations
- 5 Regression Computations
- 6 Eigenproblems
- 7 Functions: Interpolation, Smoothing, and Approximation
- 8 Introduction to Optimization and Nonlinear Equations
- 9 Maximum Likelihood and Nonlinear Regression
- 10 Numerical Integration and Monte Carlo Methods
- 11 Generating Random Variables from Other Distributions
- 12 Statistical Methods for Integration and Monte Carlo
- 13 Markov Chain Monte Carlo Methods
- 14 Sorting and Fast Algorithms
- Author Index
- Subject Index
- References
13 - Markov Chain Monte Carlo Methods
Published online by Cambridge University Press: 01 June 2011
Summary
Introduction
One of the main advantages of Monte Carlo integration is a rate of convergence that is unaffected by increasing dimension, but a more important advantage for statisticians is the familiarity of the technique and its tools. Although Markov chain Monte Carlo (MCMC) methods are designed to integrate high-dimensional functions, the ability to exploit distributional tools makes these methods much more appealing to statisticians. In contrast to importance sampling, which produces independent but weighted observations, MCMC methods produce unweighted observations that are no longer independent; rather, the observations form a Markov chain with a stationary distribution, and so time-series methods are needed for their analysis. The emphasis here will be on using MCMC methods for Bayesian problems, with the goal of generating a series of observations whose stationary distribution π(t) is proportional to the unnormalized posterior p(t). Standard statistical methods can then be used to gain information about the posterior.
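Because consecutive MCMC draws are correlated, the usual s/√n formula understates the Monte Carlo standard error of a posterior mean. As a minimal sketch of one standard time-series tool for this problem (the function name `batch_means_se` and the default of 30 batches are assumptions made for this illustration, not the book's code), the batch-means method cuts the chain into long contiguous batches and uses the spread of the batch means:

```python
import numpy as np

def batch_means_se(chain, n_batches=30):
    """Batch-means standard error for the mean of a correlated MCMC chain.

    The chain is cut into contiguous batches; if the batches are long
    enough to be nearly independent, the sample variance of the batch
    means estimates the Monte Carlo variance of the overall mean.
    """
    chain = np.asarray(chain, dtype=float)
    m = len(chain) // n_batches                  # batch length
    batches = chain[: m * n_batches].reshape(n_batches, m)
    batch_means = batches.mean(axis=1)
    return batch_means.std(ddof=1) / np.sqrt(n_batches)
```

Other standard time-series choices include fitting an autoregressive model to the chain or using a spectral (window) estimate of the variance at frequency zero.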
The two general approaches covered in this chapter are known as Gibbs sampling and the Metropolis–Hastings algorithm, although the former can be written as a special case of the latter. Gibbs sampling shows the potential of MCMC methods for Bayesian problems with hierarchical structure, also known as random effects or variance components models.
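To make the two approaches concrete before the details, here is a minimal Python sketch of each (illustrative only, not the book's code; the function names, the Gaussian random-walk proposal, and the bivariate-normal Gibbs example are assumptions made for this sketch). Note that Metropolis–Hastings needs only the unnormalized posterior, matching π(t) ∝ p(t) above:

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_hastings(log_p, t0, n_draws, step=1.0):
    """Random-walk Metropolis-Hastings for a target proportional to exp(log_p).

    Only the unnormalized log posterior is needed: the normalizing
    constant cancels in the acceptance ratio, and the symmetric proposal
    cancels the Hastings correction.
    """
    draws = np.empty(n_draws)
    t, lp = t0, log_p(t0)
    for i in range(n_draws):
        cand = t + step * rng.standard_normal()   # symmetric proposal
        lp_cand = log_p(cand)
        if np.log(rng.uniform()) < lp_cand - lp:  # accept w.p. min(1, ratio)
            t, lp = cand, lp_cand
        draws[i] = t                              # repeat t if rejected
    return draws

def gibbs_bivariate_normal(rho, n_draws):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is a known normal, e.g. x | y ~ N(rho*y, 1 - rho^2),
    so the sampler alternates exact draws from the two conditionals.
    """
    s = np.sqrt(1.0 - rho**2)                     # conditional std. dev.
    draws = np.empty((n_draws, 2))
    x = y = 0.0
    for i in range(n_draws):
        x = rho * y + s * rng.standard_normal()   # draw x | y
        y = rho * x + s * rng.standard_normal()   # draw y | x
        draws[i] = (x, y)
    return draws

# Unnormalized standard-normal target: p(t) proportional to exp(-t^2/2)
mh_chain = metropolis_hastings(lambda t: -0.5 * t**2, t0=0.0, n_draws=5000)
gibbs_chain = gibbs_bivariate_normal(rho=0.9, n_draws=5000)
```

Viewed as a Metropolis–Hastings algorithm, the Gibbs sampler proposes each component from its full conditional distribution, a choice that makes the acceptance probability identically one; this is the sense in which Gibbs sampling is a special case.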
- Type: Chapter
- Information: Numerical Methods of Statistics, pp. 375–402. Publisher: Cambridge University Press. Print publication year: 2011.