Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- 2 Uncertainty and approximation
- 3 Simple illustrations
- 4 Discrete data
- 5 Regression with continuous responses
- 6 Some case studies
- 7 Further topics
- 8 Likelihood approximations
- 9 Numerical implementation
- 10 Problems and further results
- A Some numerical techniques
- References
- Example index
- Name index
- Index
2 - Uncertainty and approximation
Published online by Cambridge University Press: 19 November 2009
Summary
Introduction
In the examples in later chapters we use parametric models almost exclusively. These models are used to incorporate a key element of statistical thinking: the explicit recognition of uncertainty. In frequentist settings imprecise knowledge about the value of a single parameter is typically expressed through a collection of confidence intervals, or equivalently by computation of the P-values associated with a set of hypotheses. If prior information is available then Bayes' theorem can be employed to perform posterior inference.
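For reference, writing π(θ) for the prior density (notation introduced here only for illustration) and f(y; θ) for the model density defined below, Bayes' theorem gives the posterior density as

$$
\pi(\theta \mid y) = \frac{\pi(\theta)\, f(y; \theta)}{\int \pi(\vartheta)\, f(y; \vartheta)\, d\vartheta} \;\propto\; \pi(\theta)\, f(y; \theta),
$$

so posterior inference amounts to normalizing the product of prior and likelihood.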
In almost every realistic setting, uncertainty is gauged using approximations, the most common of which rely on the application of the central limit theorem to quantities derived from the likelihood function. Not only does likelihood provide a powerful and very general framework for inference, but the resulting statements have many desirable properties.
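In its most familiar form this first-order theory gives, under the usual regularity conditions and in notation spelled out precisely in Chapter 8, the normal approximation

$$
\hat\theta \;\overset{\cdot}{\sim}\; N\bigl(\theta,\; j(\hat\theta)^{-1}\bigr),
\qquad
j(\theta) = -\frac{d^{2}\ell(\theta)}{d\theta^{2}},
$$

where ℓ(θ) is the log likelihood, θ̂ the maximum likelihood estimate and j(θ) the observed information; it underlies the familiar confidence intervals of the form θ̂ ± z_{1−α/2} j(θ̂)^{−1/2}.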
In this chapter we provide a brief overview of the main approximations for likelihood inference. We present both first order and higher order approximations: first order approximations are derived from limiting distributions, and higher order approximations from further analysis of the limiting process. A minimal amount of theory is given to structure the discussion of the examples in Chapters 3 to 7; more detailed discussion of asymptotic theory is given in Chapter 8.
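To indicate what 'higher order' means here, in outline and with full definitions deferred to Chapter 8: inference for a scalar θ may be based on the signed likelihood root and its modified form,

$$
r(\theta) = \operatorname{sign}(\hat\theta - \theta)\,\bigl[\,2\{\ell(\hat\theta) - \ell(\theta)\}\,\bigr]^{1/2},
\qquad
r^{*}(\theta) = r(\theta) + \frac{1}{r(\theta)} \log\frac{q(\theta)}{r(\theta)},
$$

where q(θ) is a suitably chosen statistic. The standard normal approximation to r has first order error, while the approximation to r* is accurate to higher order and remains useful in small samples.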
Scalar parameter
In the simplest situation, observations y_1, …, y_n are treated as a realization of independent identically distributed random variables Y_1, …, Y_n whose probability density function f(y; θ) depends on an unknown scalar parameter θ.
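As a concrete, minimal sketch of this setup (the exponential model, the simulated data and all names below are our own illustration, not taken from the text), the following Python code computes the maximum likelihood estimate of a rate parameter θ and a first-order 95% confidence interval based on the observed information:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Simulated sample: y_1, ..., y_n assumed i.i.d. exponential with rate theta
rng = np.random.default_rng(1)
y = rng.exponential(scale=1 / 2.0, size=20)   # true theta = 2 (illustration only)

def negloglik(theta):
    # Negative log likelihood for f(y; theta) = theta * exp(-theta * y)
    return -(len(y) * np.log(theta) - theta * y.sum())

# Maximum likelihood estimate (available in closed form as 1 / mean(y),
# but obtained numerically here to mirror the general case)
fit = minimize_scalar(negloglik, bounds=(1e-6, 100.0), method="bounded")
theta_hat = fit.x

# Observed information j(theta_hat) = -d^2 log L / d theta^2 = n / theta_hat^2
j_hat = len(y) / theta_hat**2

# First-order (Wald) 95% interval: theta_hat +/- 1.96 / sqrt(j(theta_hat))
se = 1 / np.sqrt(j_hat)
print(f"theta_hat = {theta_hat:.3f}, 95% interval = "
      f"({theta_hat - 1.96 * se:.3f}, {theta_hat + 1.96 * se:.3f})")
```

In this simple model both the estimate and the interval are available in closed form; the numerical route is shown because it mirrors what is done in less tractable models, where the same log likelihood also feeds the higher order quantities discussed above.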
- Type: Chapter
- Information: Applied Asymptotics: Case Studies in Small-Sample Statistics, pp. 5-16. Publisher: Cambridge University Press. Print publication year: 2007.