
Model selection and estimation of a component in additive regression

Published online by Cambridge University Press:  28 November 2013

Xavier Gendre*
Affiliation:
Institut de Mathématiques de Toulouse, Équipe de Statistique et Probabilités, Université Paul Sabatier, 31000 Toulouse, France. [email protected]

Abstract

Let Y ∈ ℝⁿ be a random vector with mean s and covariance matrix σ²Pₙ ᵗPₙ, where Pₙ is some known n × n matrix. We construct a statistical procedure to estimate s as well as possible under a moment condition on Y or a Gaussian hypothesis. Both cases are developed for known or unknown σ². Our approach is free from any prior assumption on s and is based on non-asymptotic model selection methods. Given some collection of linear spaces {Sₘ, m ∈ ℳ}, we consider, for any m ∈ ℳ, the least-squares estimator ŝₘ of s in Sₘ. Considering a penalty function that is not linear in the dimensions of the Sₘ's, we select some m̂ ∈ ℳ in order to get an estimator ŝ with a quadratic risk as close as possible to the minimal one among the risks of the ŝₘ's. Non-asymptotic oracle-type inequalities and minimax convergence rates are proved for ŝ. Special attention is given to the estimation of a non-parametric component in additive models. Finally, we carry out a simulation study in order to illustrate the performance of our estimators in practice.
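In schematic form, the procedure described above first computes the least-squares estimators and then selects a model by penalized empirical risk. The abstract does not specify the penalty, so in the sketch below pen(m) is only a placeholder for the penalty constructed in the paper (non-linear in dim Sₘ), and C and Rₙ are generic constants and remainder terms, not taken from the article:

\[
\hat{s}_m \;=\; \operatorname*{argmin}_{t \in S_m} \|Y - t\|_n^2,
\qquad
\hat{m} \;\in\; \operatorname*{argmin}_{m \in \mathcal{M}} \Big\{ \|Y - \hat{s}_m\|_n^2 + \mathrm{pen}(m) \Big\},
\qquad
\hat{s} \;=\; \hat{s}_{\hat{m}}.
\]

An oracle-type inequality as announced in the abstract then takes the schematic form

\[
\mathbb{E}\big[\|\hat{s} - s\|_n^2\big] \;\le\; C \,\inf_{m \in \mathcal{M}} \mathbb{E}\big[\|\hat{s}_m - s\|_n^2\big] \;+\; R_n,
\]

for some constant C ≥ 1 and a remainder term Rₙ; the exact statements and constants are given in the article.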

Type: Research Article
Copyright: © EDP Sciences, SMAI, 2013


References

Akaike, H., Statistical predictor identification. Ann. Inst. Stat. Math. 22 (1970) 203–217.
Arlot, S., Choosing a penalty for model selection in heteroscedastic regression. Preprint arXiv:0812.3141v2 (2010).
Arlot, S. and Massart, P., Data-driven calibration of penalties for least-squares regression. J. Mach. Learn. Res. 10 (2009) 245–279.
Baraud, Y., Model selection for regression on a fixed design. Probab. Theory Related Fields 117 (2000) 467–493.
Baraud, Y., Model selection for regression on a random design. ESAIM: Probab. Stat. 6 (2002) 127–146.
Baraud, Y., Comte, F. and Viennet, G., Adaptive estimation in autoregression or β-mixing regression via model selection. Ann. Stat. 29 (2001) 839–875.
Birgé, L. and Massart, P., From model selection to adaptive estimation, in Festschrift for Lucien Le Cam: Research Papers in Probability and Statistics (1997) 55–87.
Birgé, L. and Massart, P., Minimum contrast estimators on sieves: exponential bounds and rates of convergence. Bernoulli 4 (1998) 329–375.
Birgé, L. and Massart, P., An adaptive compression algorithm in Besov spaces. Constr. Approx. 16 (2000) 1–36.
Birgé, L. and Massart, P., Gaussian model selection. J. Eur. Math. Soc. 3 (2001) 203–268.
Birgé, L. and Massart, P., Minimal penalties for Gaussian model selection. Probab. Theory Related Fields 138 (2007) 33–73.
Breiman, L. and Friedman, J.H., Estimating optimal transformations for multiple regression and correlation (with discussion). J. Amer. Stat. Assoc. 80 (1985) 580–619.
Brunel, E. and Comte, F., Adaptive nonparametric regression estimation in presence of right censoring. Math. Methods Stat. 15 (2006) 233–255.
Brunel, E. and Comte, F., Model selection for additive regression models in the presence of censoring, Chap. 1 in Mathematical Methods in Survival Analysis, Reliability and Quality of Life. Wiley (2008) 17–31.
Buja, A., Hastie, T.J. and Tibshirani, R.J., Linear smoothers and additive models (with discussion). Ann. Stat. 17 (1989) 453–555.
Comte, F. and Rozenholc, Y., Adaptive estimation of mean and volatility functions in (auto-)regressive models. Stoch. Process. Appl. 97 (2002) 111–145.
Gendre, X., Simultaneous estimation of the mean and the variance in heteroscedastic Gaussian regression. Electron. J. Stat. 2 (2008) 1345–1372.
Härdle, W., Müller, M., Sperlich, S. and Werwatz, A., Nonparametric and Semiparametric Models. Springer (2004).
Hastie, T.J. and Tibshirani, R.J., Generalized Additive Models. Chapman and Hall (1990).
Horn, R.A. and Johnson, C.R., Matrix Analysis. Cambridge University Press (1990).
Laurent, B., Loubes, J.M. and Marteau, C., Testing inverse problems: a direct or an indirect problem? J. Stat. Plann. Inference 141 (2011) 1849–1861.
Laurent, B. and Massart, P., Adaptive estimation of a quadratic functional by model selection. Ann. Stat. 28 (2000) 1302–1338.
Leontief, W., Introduction to a theory of the internal structure of functional relationships. Econometrica 15 (1947) 361–373.
Linton, O. and Nielsen, J.P., A kernel method of estimating structured nonparametric regression based on marginal integration. Biometrika 82 (1995) 93–101.
Mallows, C.L., Some comments on Cp. Technometrics 15 (1973) 661–675.
Mammen, E., Linton, O. and Nielsen, J.P., The existence and asymptotic properties of a backfitting projection algorithm under weak conditions. Ann. Stat. 27 (1999) 1443–1490.
Massart, P., Concentration Inequalities and Model Selection, vol. 1896 of Lect. Notes Math. Lectures from the 33rd Summer School on Probability Theory held in Saint-Flour, July 6–23, 2003. Springer, Berlin (2007).
McQuarrie, A.D.R. and Tsai, C.L., Regression and Time Series Model Selection. River Edge, NJ (1998).
Meier, L., van de Geer, S. and Bühlmann, P., High-dimensional additive modeling. Ann. Stat. 37 (2009) 3779–3821.
Opsomer, J. and Ruppert, D., Fitting a bivariate additive model by local polynomial regression. Ann. Stat. 25 (1997) 186–211.
Petrov, V.V., Limit Theorems of Probability Theory: Sequences of Independent Random Variables. Oxford Studies Probab. 4 (1995).
Ravikumar, P.D., Liu, H., Lafferty, J.D. and Wasserman, L.A., Sparse additive models. J. Royal Statist. Soc. 71 (2009) 1009–1030.
Robin, S., Rodolphe, F. and Schbath, S., DNA, Words and Models. Cambridge University Press (2005).
Ruppert, D. and Wand, M.P., Multivariate locally weighted least squares regression. Ann. Stat. 22 (1994) 1346–1370.
Scheffé, H., The Analysis of Variance. Wiley-Interscience (1959).
Severance-Lossin, E. and Sperlich, S., Estimation of derivatives for additive separable models. Statistics 33 (1999) 241–265.
Stone, C.J., Additive regression and other nonparametric models. Ann. Stat. 14 (1985) 590–606.
Tjøstheim, D. and Auestad, B., Nonparametric identification of nonlinear time series: selecting significant lags. J. Amer. Stat. Assoc. 89 (1994) 1410–1430.
von Bahr, B. and Esseen, C.G., Inequalities for the rth absolute moment of a sum of random variables, 1 ≤ r ≤ 2. Ann. Math. Stat. 36 (1965) 299–303.