Book contents
- Frontmatter
- Dedication
- Epigraph
- Contents
- Preface
- Acknowledgements
- Expanded Note for Instructors
- Part I Concepts from Modeling, Inference, and Computing
- Part II Statistical Models
- 6 Regression Models
- 7 Mixture Models
- 8 Hidden Markov Models
- 9 State-Space Models
- 10 Continuous Time Models
- Part III Appendices
- Index
- Back Cover
7 - Mixture Models
from Part II - Statistical Models
Published online by Cambridge University Press: 17 August 2023
Summary
In this chapter we introduce the clustering problem and use it to motivate mixture models. We start by describing clustering in a frequentist paradigm and introduce the relevant likelihoods and latent variables. We then discuss properties of the likelihoods, including invariance with respect to label swapping. Next, we expand this discussion to describe clustering and mixture models more generally within a Bayesian paradigm. This allows us to introduce Dirichlet priors, used in inferring the weight we ascribe to each cluster component from which the data are drawn. Finally, we describe the infinite mixture model and Dirichlet process priors within the Bayesian nonparametric paradigm, appropriate for the analysis of uncharacterized data that may contain an unspecified number of clusters.
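The generative picture the summary describes — Dirichlet-distributed weights over cluster components, latent assignments, and observations drawn from the assigned component — can be sketched as follows. This is a minimal illustration, not the chapter's own code; the number of components `K`, the concentration `alpha`, and the component means and standard deviations are arbitrary assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (assumed values, not from the chapter)
K = 3                  # number of mixture components
alpha = np.ones(K)     # symmetric Dirichlet concentration parameter
N = 500                # number of observations

# Draw mixture weights from a Dirichlet prior
pi = rng.dirichlet(alpha)

# Arbitrary component parameters for a 1D Gaussian mixture
mu = np.array([-3.0, 0.0, 4.0])
sigma = np.array([0.5, 1.0, 0.8])

# Latent cluster assignments, then observations from each assigned component
z = rng.choice(K, size=N, p=pi)
x = rng.normal(mu[z], sigma[z])
```

Note that permuting the component labels (and the entries of `pi`, `mu`, `sigma` accordingly) leaves the distribution of `x` unchanged, which is the label-swapping invariance mentioned above.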
Data Modeling for the Sciences: Applications, Basics, Computations, pp. 245–263. Publisher: Cambridge University Press. Print publication year: 2023.