Book contents
- Frontmatter
- Contents
- List of abbreviations and acronyms
- Preface
- Acknowledgments
- 1 Introduction
- Part I Probability, random variables, and statistics
- Part II Transform methods, bounds, and limits
- Part III Random processes
- Part IV Statistical inference
- 18 Estimation and decision theory
- 19 Estimation algorithms
- Part V Applications and advanced topics
- References
- Index
19 - Estimation algorithms
from Part IV - Statistical inference
Published online by Cambridge University Press: 05 June 2012
Summary
In this chapter we study statistical methods for estimating parameters and procedures for testing the goodness of fit of a model to experimental data. We are primarily concerned with computational algorithms for these methods and procedures. The expectation-maximization (EM) algorithm for maximum-likelihood estimation is discussed in detail.
Classical numerical methods for estimation
As we stated earlier, it is often the case that a maximum-likelihood estimate (MLE) cannot be found analytically. Thus, numerical methods for computing the MLE are important. Finding the maximum of a likelihood function is an optimization problem, and many general-purpose optimization algorithms and software packages exist. In this section and the next we discuss several important methods that are pertinent to maximizing a likelihood function: the method of moments, the minimum χ² method, the minimum Kullback–Leibler divergence method, and the Newton–Raphson algorithm. In Section 19.2 we give a full account of the EM algorithm, because of its relatively recent development and its growing range of applications in signal processing and other science and engineering fields.
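As a concrete illustration of the Newton–Raphson algorithm applied to maximum-likelihood estimation, the sketch below (not taken from the text) finds the MLE of the location parameter of a Cauchy distribution, a standard case where no closed-form estimator exists. The iteration updates θ by θ ← θ − ℓ′(θ)/ℓ″(θ), where ℓ is the log-likelihood; the function names and the choice of the Cauchy model are this example's assumptions.

```python
def cauchy_score(theta, data):
    # First derivative of the Cauchy log-likelihood (the score function):
    # d/dtheta sum_i -log(1 + (x_i - theta)^2)
    return sum(2 * (x - theta) / (1 + (x - theta) ** 2) for x in data)

def cauchy_score_deriv(theta, data):
    # Second derivative of the Cauchy log-likelihood
    return sum(2 * ((x - theta) ** 2 - 1) / (1 + (x - theta) ** 2) ** 2
               for x in data)

def newton_raphson_mle(data, tol=1e-10, max_iter=100):
    # Start from the sample median, a robust initial guess for the
    # Cauchy location parameter; a poor start can make Newton's method diverge.
    theta = sorted(data)[len(data) // 2]
    for _ in range(max_iter):
        step = cauchy_score(theta, data) / cauchy_score_deriv(theta, data)
        theta -= step
        if abs(step) < tol:
            break
    return theta
```

For the symmetric sample [1.0, 2.0, 3.0] the score vanishes at θ = 2 by symmetry, so the iteration converges there at once; for asymmetric data the loop typically settles in a handful of iterations on a root of the score equation.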
Method of moments
This method is typically used to estimate unknown parameters of a distribution function by equating the sample mean, sample variance, and higher sample moments calculated from the data to the corresponding moments expressed in terms of the parameters of interest.
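As a minimal sketch of this idea (an illustrative example, not from the text), consider the gamma distribution with shape k and scale θ, for which E[X] = kθ and Var[X] = kθ². Equating these to the sample mean and variance and solving gives θ̂ = s²/x̄ and k̂ = x̄²/s²; the function name and the gamma example are this sketch's assumptions.

```python
import random

def gamma_mom(data):
    # Method-of-moments estimates for a gamma distribution:
    # match the sample mean and variance to E[X] = k*theta and
    # Var[X] = k*theta**2, then solve for k and theta.
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / n
    theta_hat = var / mean       # scale estimate: s^2 / x_bar
    k_hat = mean ** 2 / var      # shape estimate: x_bar^2 / s^2
    return k_hat, theta_hat

# Illustrative check on synthetic data with (hypothetical) true values
# k = 3, theta = 2; the estimates should land near these for large n.
random.seed(0)
sample = [random.gammavariate(3.0, 2.0) for _ in range(5000)]
k_hat, theta_hat = gamma_mom(sample)
```

Matching two moments suffices here because the gamma family has exactly two unknown parameters; distributions with more parameters require equating correspondingly higher moments.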
- Probability, Random Processes, and Statistical Analysis: Applications to Communications, Signal Processing, Queueing Theory and Mathematical Finance, pp. 554–570. Publisher: Cambridge University Press. Print publication year: 2011.