Theory for high-order bounds in functional principal components analysis
Published online by Cambridge University Press: 01 January 2009
Abstract
Functional data analysis, or FDA, is a relatively new and rapidly growing area of statistics. A substantial part of the interest in the field derives from new types of data that are generated through the application of new technologies. Statistical methodologies, such as linear regression, which are effectively finite-dimensional in conventional statistical settings, become infinite-dimensional in the context of functional data. As a result, the convergence rates of estimators based on functional data can be relatively slow, and so there is substantial interest in methods for dimension reduction, such as principal components analysis (PCA). However, although the statistical development of PCA for FDA has been underway for approximately two decades, relatively high-order theoretical arguments have been largely absent. This makes it difficult to assess the impact that, for example, eigenvalue spacings have on properties of eigenvalue estimators, or to develop concise first-order limit theory for linear functional regression. This paper shows how to overcome these hurdles. It develops rigorous arguments that underpin stochastic expansions of estimators of eigenvalues and eigenfunctions, and shows how to use them to answer statistical questions. The theory is based on arguments from operator theory, made more challenging by the requirement of statisticians that closeness of functions be measured in the L∞, rather than L2, metric. The statistical implications of the properties we develop have been discussed elsewhere, but the theoretical arguments that lie behind them have not been presented before.
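To make the objects discussed above concrete, the following is a minimal, hypothetical Python sketch of empirical functional PCA: curves are discretized on a common grid, the sample covariance operator is formed, and its eigenvalues and eigenfunctions are estimated by matrix eigendecomposition. The function name `empirical_fpca` and the simulated data are illustrative assumptions; this is the standard estimator whose high-order properties the paper analyses, not the paper's theoretical argument itself.

```python
# Hypothetical sketch: estimate eigenvalues/eigenfunctions of the sample
# covariance operator by discretizing curves on a grid and diagonalizing
# the resulting matrix. Illustrative only; not the paper's development.
import numpy as np

def empirical_fpca(X, grid):
    """X: (n, m) array of n curves observed on a common grid of m points."""
    n, m = X.shape
    h = (grid[-1] - grid[0]) / (m - 1)        # grid spacing (assumed uniform)
    Xc = X - X.mean(axis=0)                   # centre the observed curves
    K_hat = (Xc.T @ Xc) / n                   # discretized sample covariance K(u, v)
    # Eigendecomposition of the discretized operator; the weight h converts
    # matrix eigenvalues into (approximate) operator eigenvalues.
    evals, evecs = np.linalg.eigh(K_hat * h)
    order = np.argsort(evals)[::-1]           # sort eigenvalues decreasingly
    theta_hat = evals[order]                  # estimated eigenvalues
    psi_hat = evecs[:, order] / np.sqrt(h)    # eigenfunctions, L2-normalized
    return theta_hat, psi_hat

# Usage: smooth random curves with three principal components on [0, 1].
rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 101)
scores = rng.standard_normal((200, 3)) * np.array([2.0, 1.0, 0.5])
basis = np.stack([np.sin((j + 1) * np.pi * grid) for j in range(3)])
X = scores @ basis
theta_hat, psi_hat = empirical_fpca(X, grid)
print(theta_hat[:3])   # leading eigenvalue estimates
```

The spacings between the leading values of `theta_hat` are exactly the eigenvalue gaps whose effect on estimator behaviour the paper's expansions quantify.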
- Type: Research Article
- Mathematical Proceedings of the Cambridge Philosophical Society, Volume 146, Issue 1, January 2009, pp. 225-256
- Copyright © Cambridge Philosophical Society 2008