Published online by Cambridge University Press: 15 January 2008
An unknown constant matrix M is observed with additive random error. The basic problem considered is to devise an estimator of M that trades off bias against variance so as to achieve relatively low quadratic risk. This paper develops an adaptive total least squares estimator and an adaptive total shrinkage estimator of M that minimize estimated risk over certain large classes of linear estimators. It is shown that the asymptotic risk of the adaptive total least squares estimator is the smallest attainable among reduced-rank total least squares fits to the data matrix. The asymptotic risk of the adaptive total shrinkage estimator is shown to be smaller still. A close link is established between total shrinkage and the Efron–Morris estimator of M. In the asymptotics, the row dimension of M tends to infinity while the column dimension stays fixed. The risks converge uniformly when the signal-to-noise ratio and the measurement error variance are both bounded. A second problem treated is estimation of M under the assumption that a linear relation holds among its columns. In this formulation of the errors-in-variables linear regression model, rank-constrained adaptive total least squares asymptotically dominates the usual total least squares estimator of M, and rank-constrained adaptive total shrinkage is better still.

This research was supported in part by National Science Foundation Grant DMS 0404547.
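The two building blocks named in the abstract can be made concrete. Writing the observation model as X = M + E, where X is the n x p data matrix and E is additive noise, a reduced-rank total least squares fit truncates the singular value decomposition of X at some rank k, while total shrinkage instead scales each singular value toward zero, as in the positive-part Efron–Morris estimator. The sketch below is illustrative only: it assumes iid noise with a known variance sigma2 and substitutes a simple Cp-style surrogate (residual singular-value energy plus sigma2 times the parameter count k(n + p - k) of a rank-k matrix) for the paper's estimated-risk criterion, which minimizes over larger classes of linear estimators.

import numpy as np

def adaptive_fits(X, sigma2):
    """Illustrative reduced-rank TLS and singular-value shrinkage fits.

    X      : (n, p) data matrix, modeled as X = M + E with iid noise of
             variance sigma2 (taken as known here purely for illustration).
    Returns the selected rank, the rank-truncated fit minimizing a
    surrogate risk, and a positive-part Efron-Morris shrinkage fit.
    """
    n, p = X.shape
    U, lam, Vt = np.linalg.svd(X, full_matrices=False)

    # Reduced-rank total least squares: keep the top-k singular values.
    # Rank k minimizes the Cp-style surrogate
    #   sum of discarded lam_i^2  +  sigma2 * k * (n + p - k),
    # where k*(n+p-k) is the number of free parameters of a rank-k matrix.
    # (A stand-in for the paper's estimated risk, not a copy of it.)
    risks = [np.sum(lam[k:] ** 2) + sigma2 * k * (n + p - k)
             for k in range(p + 1)]
    k_hat = int(np.argmin(risks))
    tls_fit = (U[:, :k_hat] * lam[:k_hat]) @ Vt[:k_hat, :]

    # Positive-part Efron-Morris shrinkage: scale each singular value by
    # max(0, 1 - (n - p - 1) * sigma2 / lam_i^2), the classical estimator
    # the abstract links to total shrinkage.  The floor on lam^2 only
    # guards against division by zero for rank-deficient X.
    lam_sq = np.maximum(lam ** 2, np.finfo(float).tiny)
    shrink = np.clip(1.0 - (n - p - 1) * sigma2 / lam_sq, 0.0, None)
    shrink_fit = (U * (lam * shrink)) @ Vt

    return k_hat, tls_fit, shrink_fit

A small synthetic check, with a rank-2 signal whose row dimension is large relative to the fixed column dimension, mirroring the paper's asymptotic regime:

rng = np.random.default_rng(0)
n, p = 200, 5
M = rng.standard_normal((n, 2)) @ rng.standard_normal((2, p))  # rank-2 signal
X = M + rng.standard_normal((n, p))                            # sigma2 = 1
k_hat, tls_fit, shrink_fit = adaptive_fits(X, sigma2=1.0)
print(k_hat,
      np.sum((tls_fit - M) ** 2),     # loss of the rank-truncated fit
      np.sum((shrink_fit - M) ** 2))  # loss of the shrinkage fit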