Book contents
- Frontmatter
- Contents
- List of Illustrations
- Computer Code Used in the Examples
- Preface
- PART ONE Maximum Likelihood
- 1 The Maximum Likelihood Principle
- 2 Properties of Maximum Likelihood Estimators
- 3 Numerical Estimation Methods
- 4 Hypothesis Testing
- PART TWO Regression Models
- PART THREE Other Estimation Methods
- PART FOUR Stationary Time Series
- PART FIVE Nonstationary Time Series
- PART SIX Nonlinear Time Series
- Appendix A Change of Variable in Density Functions
- Appendix B The Lag Operator
- Appendix C FIML Estimation of a Structural Model
- Appendix D Additional Nonparametric Results
- References
- Author Index
- Subject Index
2 - Properties of Maximum Likelihood Estimators
from PART ONE - Maximum Likelihood
Published online by Cambridge University Press: 05 January 2013
Summary
Introduction
Under certain conditions, known as regularity conditions, the maximum likelihood estimator introduced in Chapter 1 possesses a number of important statistical properties, and the aim of this chapter is to derive these properties. In large samples, the maximum likelihood estimator is consistent, efficient and normally distributed. In small samples, it satisfies an invariance property, is a function of sufficient statistics and, in some but not all cases, is unbiased and unique. As the derivation of analytical expressions for the finite-sample distributions of the maximum likelihood estimator is generally complicated, computationally intensive methods based on Monte Carlo simulations or series expansions are used to examine some of these properties.
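As a concrete illustration of how Monte Carlo simulation can be used to examine these finite-sample properties, the following sketch (an illustration with arbitrary settings, not code from the book) simulates repeated samples from an exponential distribution, whose maximum likelihood estimator has the closed form one over the sample mean, and records the estimate in each replication.

```python
# Monte Carlo study of the finite-sample properties of a maximum likelihood
# estimator: a minimal sketch with illustrative settings, not the book's code.
# For the exponential distribution with rate theta0 the MLE is 1 / sample mean.
import numpy as np

rng = np.random.default_rng(0)
theta0 = 2.0            # true parameter (illustrative choice)
replications = 5000     # number of Monte Carlo samples

for T in (10, 50, 200, 1000):
    # Draw 'replications' samples of size T and compute the MLE in each one.
    samples = rng.exponential(scale=1.0 / theta0, size=(replications, T))
    theta_hat = 1.0 / samples.mean(axis=1)

    bias = theta_hat.mean() - theta0
    # Under the regularity conditions, sqrt(T)*(theta_hat - theta0) is
    # approximately N(0, theta0^2) for large T.
    scaled = np.sqrt(T) * (theta_hat - theta0)
    print(f"T={T:5d}  bias={bias:+.4f}  "
          f"var of sqrt(T)*(theta_hat - theta0) = {scaled.var():.3f}  "
          f"(asymptotic value {theta0**2:.3f})")
```

Running the sketch shows the bias shrinking towards zero and the variance of the scaled estimator settling near the asymptotic value as T grows, which is the behaviour the consistency and asymptotic normality results of this chapter formalise.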
The maximum likelihood estimator encompasses many other estimators often used in econometrics, including ordinary least squares and instrumental variables (Chapter 5), nonlinear least squares (Chapter 6), the Cochrane-Orcutt method for the autocorrelated regression model (Chapter 7), weighted least squares estimation of heteroskedastic regression models (Chapter 8) and the Johansen procedure for cointegrated nonstationary time series models (Chapter 18).
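To see in what sense the maximum likelihood estimator encompasses ordinary least squares, the sketch below (again an illustration with arbitrary data-generating values, not the book's code) maximises the normal log-likelihood of a linear regression numerically and compares the result with the closed-form least squares estimates.

```python
# Sketch showing that OLS estimates of a linear regression with normal errors
# coincide with the maximum likelihood estimates (illustrative values only).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
T = 200
x = rng.normal(size=T)
y = 1.0 + 0.5 * x + rng.normal(scale=0.8, size=T)   # beta0 = 1.0, beta1 = 0.5
X = np.column_stack([np.ones(T), x])

# Ordinary least squares via the normal equations.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Negative log-likelihood of the normal regression model, parameterised with
# log(sigma) so the optimiser works on an unconstrained space.
def negloglik(params):
    beta, log_sigma = params[:2], params[2]
    sigma = np.exp(log_sigma)
    resid = y - X @ beta
    return 0.5 * T * np.log(2 * np.pi * sigma**2) + 0.5 * np.sum(resid**2) / sigma**2

result = minimize(negloglik, x0=np.zeros(3), method="BFGS")
beta_mle = result.x[:2]

print("OLS :", beta_ols)
print("MLE :", beta_mle)   # the two sets of coefficient estimates agree to numerical precision
```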
Preliminaries
Before deriving the formal properties of the maximum likelihood estimator, four important preliminary concepts are reviewed. The first presents some stochastic models of time series and briefly discusses their properties. The second is concerned with the convergence of a sample average to its population mean as T →∞, known as the weak law of large numbers. The third identifies the scaling factor ensuring convergence of scaled random variables to nondegenerate distributions.
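A small simulation, again only a sketch with illustrative settings, shows the second and third ideas at once: the sample mean of independent draws converges to the population mean as T grows (the weak law of large numbers), while the square-root-of-T-scaled deviation of the sample mean from the population mean neither collapses to zero nor explodes, i.e. it converges to a nondegenerate distribution.

```python
# Sketch of the weak law of large numbers and the sqrt(T) scaling factor
# (illustrative settings, not code from the book). Draws are from a
# chi-squared(3) distribution, with population mean 3 and variance 6.
import numpy as np

rng = np.random.default_rng(2)
mu, var = 3.0, 6.0
replications = 5000

for T in (10, 100, 1000, 10000):
    samples = rng.chisquare(df=3, size=(replications, T))
    ybar = samples.mean(axis=1)

    # Weak law of large numbers: ybar concentrates around mu as T grows.
    # Scaling factor: sqrt(T)*(ybar - mu) keeps a stable, nondegenerate spread.
    print(f"T={T:6d}  mean |ybar - mu| = {np.abs(ybar - mu).mean():.4f}  "
          f"var of sqrt(T)*(ybar - mu) = {(np.sqrt(T) * (ybar - mu)).var():.3f}  "
          f"(population variance {var:.3f})")
```

The unscaled deviations shrink towards zero while the scaled deviations maintain a variance close to the population variance, which is the nondegenerate limit referred to in the text.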
Type: Chapter
Information: Econometric Modelling with Time Series: Specification, Estimation and Testing, pp. 33-86
Publisher: Cambridge University Press
Print publication year: 2012