Published online by Cambridge University Press: 01 February 2009
It is well known that in standard linear regression models with independent and identically distributed data and homoskedasticity, adding "irrelevant regressors" hurts (asymptotic) efficiency unless they are orthogonal to the remaining regressors. However, we find that under (conditional) heteroskedasticity one can always construct "irrelevant regressors" whose addition to the model yields an estimator that attains the asymptotic variance of the generalized least squares estimator.
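To make the claim concrete, the following is a minimal Monte Carlo sketch, under assumptions not stated in the abstract: a scalar regressor x, a known skedastic function sigma2(x) = 0.5 + x^2, and one particular construction of the "irrelevant" regressor, w = x(1 - c/sigma2(x)) with c = E[x^2/sigma^2]/E[x^2/sigma^4]. Because w is a deterministic function of x, its true coefficient is zero, yet partialling it out leaves a residual proportional to x/sigma2(x), which drives the efficiency gain in this illustration. The construction and all design choices (distribution of x, variance function, the constant c) are assumptions added here, not necessarily those of the paper.

```python
# Monte Carlo sketch: adding an "irrelevant" regressor w (true coefficient zero)
# can bring the OLS coefficient on x down to the GLS asymptotic variance under
# heteroskedasticity. Illustrative construction only; not the paper's own.
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0
n, n_rep = 2_000, 5_000

def sigma2(x):
    # assumed conditional variance function (heteroskedasticity)
    return 0.5 + x ** 2

# population constant c = E[x^2/sigma^2] / E[x^2/sigma^4],
# approximated by a large auxiliary draw from the same design
x_big = rng.standard_normal(1_000_000)
c = np.mean(x_big**2 / sigma2(x_big)) / np.mean(x_big**2 / sigma2(x_big)**2)

ols, gls, aug = [], [], []
for _ in range(n_rep):
    x = rng.standard_normal(n)
    y = beta * x + np.sqrt(sigma2(x)) * rng.standard_normal(n)

    # OLS of y on x alone
    ols.append((x @ y) / (x @ x))

    # infeasible GLS (weighted least squares with weights 1/sigma2(x))
    wt = 1.0 / sigma2(x)
    gls.append(((x * wt) @ y) / ((x * wt) @ x))

    # OLS of y on (x, w): w is a function of x, so its true coefficient is zero
    w = x * (1.0 - c / sigma2(x))
    X = np.column_stack([x, w])
    coef = np.linalg.lstsq(X, y, rcond=None)[0]
    aug.append(coef[0])

for name, est in [("OLS", ols), ("GLS", gls), ("OLS + irrelevant w", aug)]:
    print(f"{name:>20}: n * variance = {n * np.var(est):.3f}")
```

In this simulation the scaled variance of the augmented OLS coefficient on x should be close to that of (infeasible) GLS and strictly below that of plain OLS, matching the abstract's claim for this particular design.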