Book contents
- Frontmatter
- Contents
- Preface
- Guide to Notation
- 1 Introduction
- 2 Parametric Regression
- 3 Scatterplot Smoothing
- 4 Mixed Models
- 5 Automatic Scatterplot Smoothing
- 6 Inference
- 7 Simple Semiparametric Models
- 8 Additive Models
- 9 Semiparametric Mixed Models
- 10 Generalized Parametric Regression
- 11 Generalized Additive Models
- 12 Interaction Models
- 13 Bivariate Smoothing
- 14 Variance Function Estimation
- 15 Measurement Error
- 16 Bayesian Semiparametric Regression
- 17 Spatially Adaptive Smoothing
- 18 Analyses
- 19 Epilogue
- A Technical Complements
- B Computational Issues
- Bibliography
- Author Index
- Notation Index
- Example Index
- Subject Index
12 - Interaction Models
Published online by Cambridge University Press: 06 July 2010
Introduction
The additive models of Chapters 8 and 11 have many attractive features. The joint effect of all the predictor variables upon the response is expressed as a sum of individual effects. These individual effects show how the expected response varies as any single predictor varies with the others held fixed at arbitrary values; because of the additivity, the effect of one predictor does not depend on the values at which the others are fixed. Thus, the individual component functions can be plotted separately to visualize the effect of each predictor, and these functions – taken together – allow us to understand the joint effects of all the predictors upon the expected response. If, for example, we wish to find conditions under which the expected response is maximized, then we need only maximize separately each of the component functions of the additive model. In summary, it is extremely convenient whenever an additive model provides an accurate summary of the data.
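The claim that an additive model can be maximized componentwise is easy to check numerically. The sketch below uses two hypothetical component functions (not from the text) and confirms that maximizing each separately agrees with a joint grid search over both predictors:

```python
import numpy as np

# Hypothetical component functions of an additive model
# E[y | x1, x2] = f1(x1) + f2(x2)  (illustrative choices only).
def f1(x1):
    return -(x1 - 0.3) ** 2

def f2(x2):
    return np.sin(np.pi * x2)

grid = np.linspace(0.0, 1.0, 201)

# Maximize each component function separately ...
x1_star = grid[np.argmax(f1(grid))]
x2_star = grid[np.argmax(f2(grid))]

# ... and compare with a joint grid search over (x1, x2).
X1, X2 = np.meshgrid(grid, grid, indexing="ij")
i, j = np.unravel_index(np.argmax(f1(X1) + f2(X2)), X1.shape)

# Because the model is additive, the two approaches agree.
assert np.isclose(x1_star, grid[i]) and np.isclose(x2_star, grid[j])
```

The same separation applies to plotting: each component can be examined on its own axis without reference to the values of the other predictors.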
However, there are no guarantees that an additive model will provide a satisfactory fit in any given situation. Nonadditivity means that, as one predictor is varied, the effect on the expected response depends on the fixed values of the other predictors. A deviation from additivity is called an interaction.
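A minimal numerical illustration of this definition, using the hypothetical surface f(x1, x2) = x1 * x2 (chosen here for illustration, not taken from the text): under additivity, the change in the expected response as x1 moves from 0 to 1 would be the same no matter where x2 is held fixed, but for this surface it is not.

```python
# Hypothetical surface with an interaction: f(x1, x2) = x1 * x2.
def f(x1, x2):
    return x1 * x2

# Effect of moving x1 from 0 to 1, at two fixed values of x2:
effect_x2_low = f(1.0, 0.0) - f(0.0, 0.0)    # x2 fixed at 0
effect_x2_high = f(1.0, 1.0) - f(0.0, 1.0)   # x2 fixed at 1

# Under an additive model these two differences would be equal;
# their inequality here is exactly what "interaction" means.
assert effect_x2_low != effect_x2_high
```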
- Semiparametric Regression, pp. 223–237. Publisher: Cambridge University Press. Print publication year: 2003.