Book contents
- Frontmatter
- Contents
- Preface
- 1 Probability basics
- 2 Estimation and uncertainty
- 3 Statistical models and inference
- 4 Linear models, least squares, and maximum likelihood
- 5 Parameter estimation: single parameter
- 6 Parameter estimation: multiple parameters
- 7 Approximating distributions
- 8 Monte Carlo methods for inference
- 9 Parameter estimation: Markov Chain Monte Carlo
- 10 Frequentist hypothesis testing
- 11 Model comparison
- 12 Dealing with more complicated problems
- References
- Index
1 - Probability basics
Published online by Cambridge University Press: 12 July 2017
Summary
This chapter reviews the basic ideas of probability and statistics that we will use in the rest of this book. I will outline some fundamental concepts, introduce some of the most common discrete and continuous probability distributions, and work through a few examples. But we'll start with an example that illustrates the role of information in solving problems.
The three doors problem
You are a contestant in a game show, presented with three closed doors. Behind one of the doors – chosen at random without your knowledge – is a prize car. Behind each of the other doors is a goat. Your objective is to reveal and thus win the car (the assumption is you're less keen on goats). You first select a door at random, which you do not open. The game show host, who knows where the car is, then opens one of the other doors to reveal a goat (she would never show you the car). She then gives you the opportunity to change your choice of door to the other closed one. Do you change?
This is a classic inference problem, also known as the Monty Hall problem. If you've not encountered it before – and even if you have – do think about it before reading on.
Your initial probability of winning (before any door is opened) is 1/3. The question is whether this probability is changed by the actions of the game show host. It appears that her actions don't change anything. After all, she will always show you a goat. In that case you may think that your probability of winning remains at 1/3 whether or not you change doors. Or you may think that once the host has revealed a goat, then because there are only two doors left, your chance of winning has changed to 1/2 whether you change or not. In either of these cases changing doors does not improve your chances of winning.
In fact, if you change to the other closed door your chance of winning increases to 2/3. This may seem counter-intuitive, but it is explained by how the host behaves. If you initially selected the door with the car behind it (probability 1/3), then her choice of which door to open is random, and switching loses. But if you initially selected a door with a goat behind it (probability 2/3), she is forced to open the only other goat door, and switching then wins for certain. Averaging over your initial choice, switching wins with probability 2/3, while sticking wins with probability only 1/3.
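If the argument still feels counter-intuitive, a quick Monte Carlo simulation makes it concrete. The sketch below (plain Python, using only the standard library; the function name and parameters are my own, not from the book) plays the game many times under each strategy and counts wins.

```python
import random

def simulate_three_doors(n_trials=100_000, switch=True, seed=1):
    """Estimate the win probability in the three doors game.

    Each trial: the car is placed behind a random door, the contestant
    picks a door at random, the host opens a remaining door that hides
    a goat (choosing at random if both remaining doors hide goats),
    and the contestant either sticks or switches to the other closed door.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # Host opens a door that is neither the contestant's pick nor the car.
        host = rng.choice([d for d in (0, 1, 2) if d != pick and d != car])
        if switch:
            # Switch to the one door that is neither the pick nor the opened one.
            pick = next(d for d in (0, 1, 2) if d != pick and d != host)
        wins += pick == car
    return wins / n_trials
```

Running `simulate_three_doors(switch=True)` gives an estimate close to 2/3, and `simulate_three_doors(switch=False)` an estimate close to 1/3, in line with the argument above.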
- Practical Bayesian Inference: A Primer for Physical Scientists, pp. 4–35. Publisher: Cambridge University Press. Print publication year: 2017.