Book contents
- Frontmatter
- Contents
- Preface
- Guide to Notation
- 1 Introduction
- 2 Parametric Regression
- 3 Scatterplot Smoothing
- 4 Mixed Models
- 5 Automatic Scatterplot Smoothing
- 6 Inference
- 7 Simple Semiparametric Models
- 8 Additive Models
- 9 Semiparametric Mixed Models
- 10 Generalized Parametric Regression
- 11 Generalized Additive Models
- 12 Interaction Models
- 13 Bivariate Smoothing
- 14 Variance Function Estimation
- 15 Measurement Error
- 16 Bayesian Semiparametric Regression
- 17 Spatially Adaptive Smoothing
- 18 Analyses
- 19 Epilogue
- A Technical Complements
- B Computational Issues
- Bibliography
- Author Index
- Notation Index
- Example Index
- Subject Index
16 - Bayesian Semiparametric Regression
Published online by Cambridge University Press: 06 July 2010
Summary
Introduction
Classical statistics treats parameters as fixed unknown quantities. Bayesian statistics is based on a different philosophy; parameters are treated as random variables. The probability distribution of a parameter characterizes knowledge about the parameter's value, and this distribution changes as new data are acquired. The mixed models of classical statistics have a Bayesian flavor because some parameters are treated as random. However, in a mixed model both the fixed effects and the variance components are treated as nonrandom unknowns. Bayesians go one step beyond mixed models in that they treat all parameters as random. In this chapter we take the mixed model formulation of Section 4.9 and extend it to a fully Bayesian model.
Bayesian statistics differs from classical statistics in two important respects:
(1) the use of the prior distribution to characterize knowledge of the parameter values prior to data collection; and
(2) the use of the posterior distribution – that is, the conditional distribution of the parameters given the data – as the basis of inference.
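As a small illustration of point (2) — my own sketch, not an example from the book — the posterior for a discrete parameter follows directly from Bayes' rule: posterior is proportional to prior times likelihood, then normalized. Here the parameter is a coin's heads probability restricted to three candidate values (the values and the uniform prior are hypothetical choices for illustration):

```python
# Sketch (not from the text): Bayes' rule for a discrete parameter.
# The heads probability theta is one of three candidate values; the prior
# over those candidates is updated after observing y heads in n tosses.
from math import comb

thetas = [0.3, 0.5, 0.7]   # hypothetical candidate parameter values
prior = [1/3, 1/3, 1/3]    # uniform prior over the candidates

def posterior(y, n):
    """Posterior proportional to prior times binomial likelihood, normalized."""
    unnorm = [p * comb(n, y) * t**y * (1 - t)**(n - y)
              for p, t in zip(prior, thetas)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

post = posterior(7, 10)    # observe 7 heads in 10 tosses
# The posterior mass shifts toward theta = 0.7, the candidate value
# most consistent with the observed data.
```

The same proportionality drives every Bayesian update; conjugate families (used below for the coin example) simply make the normalization available in closed form.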
Some statisticians are uneasy about the use of priors, but priors used with care are quite sensible. In some situations, we might have strong prior beliefs that will influence our analysis. For example, suppose we needed to estimate the probability that a toss of a coin comes up heads.
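To continue the coin illustration with a hedged sketch (my example, not the book's): with a conjugate Beta(a, b) prior on the heads probability, observing y heads in n tosses yields a Beta(a + y, b + n − y) posterior, so strong or vague prior beliefs are encoded simply through the choice of a and b:

```python
# Sketch (assumption: a conjugate Beta-Binomial analysis, not the book's code).
# Beta(a, b) prior on the heads probability; data are y heads in n tosses.

def beta_posterior(a, b, y, n):
    """Conjugate update: Beta(a, b) prior -> Beta(a + y, b + n - y) posterior."""
    return a + y, b + (n - y)

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Strong prior belief that the coin is fair: Beta(50, 50) concentrates near 0.5.
a_post, b_post = beta_posterior(50, 50, 7, 10)
# Posterior mean 57/110, barely moved from 0.5: the prior dominates the data.

# A vague Beta(1, 1) (uniform) prior lets the same 10 tosses dominate instead.
a_vague, b_vague = beta_posterior(1, 1, 7, 10)
# Posterior mean 8/12, pulled strongly toward the observed proportion 0.7.
```

The contrast between the two priors is the point of the coin example: the posterior blends prior belief with data, and the weight on each depends on how informative the prior is relative to the sample size.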
Semiparametric Regression, pp. 276–292. Publisher: Cambridge University Press. Print publication year: 2003.