It has been known for some time that certain least-squares problems are “ill-conditioned”, and that it is therefore difficult to compute an accurate solution. The degree of ill-conditioning depends on the basis chosen for the subspace in which it is desired to find an approximation. This paper characterizes the degree of ill-conditioning, for a general inner-product space, in terms of the basis.
The results are applied to least-squares polynomial approximation. It is shown, for example, that the powers {1, z, z², …} are a universally bad choice of basis. In this case, the condition numbers of the associated matrices of the normal equations grow at least as fast as 4ⁿ, where n is the degree of the approximating polynomial.
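As a minimal numerical sketch (Python/NumPy, not part of the paper), assuming the interval [0, 1] with the usual L² inner product: the normal-equations matrix for the powers basis is then the Hilbert matrix, and its condition number can be seen to outpace 4ⁿ, whereas an orthonormal basis of the same subspace would give the identity matrix.

```python
import numpy as np

# Normal-equations (Gram) matrix for the basis 1, z, ..., z^n on [0, 1]
# with <f, g> = integral_0^1 f(z) g(z) dz: entry (i, j) = 1 / (i + j + 1),
# i.e. the Hilbert matrix.  The interval and inner product are an assumed,
# illustrative choice; an orthonormal basis of the same subspace would
# have the identity as its Gram matrix (condition number 1).
def gram_powers(n):
    return np.array([[1.0 / (i + j + 1) for j in range(n + 1)]
                     for i in range(n + 1)])

for n in range(2, 12):
    cond = np.linalg.cond(gram_powers(n))
    print(f"degree {n:2d}: cond = {cond:10.3e}   4^n = {4.0**n:10.3e}")
```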
Analogous results are given for the problem of finite interpolation, which is closely related to the least-squares problem.
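A hedged sketch of the interpolation analogue (again not from the paper): interpolation in the powers basis leads to a Vandermonde system, whose condition number also grows rapidly with the degree. The nodes below, equispaced on [-1, 1], are an assumed, illustrative choice.

```python
import numpy as np

# Finite interpolation in the powers basis: solve V a = f(z) for the
# coefficients a, where V is the Vandermonde matrix whose columns are
# 1, z, z^2, ..., z^n evaluated at the nodes.
for n in range(2, 16, 2):
    z = np.linspace(-1.0, 1.0, n + 1)        # assumed equispaced nodes
    V = np.vander(z, increasing=True)        # columns 1, z, ..., z^n
    print(f"degree {n:2d}: cond(V) = {np.linalg.cond(V):.3e}")
```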
Applications of the results are given to two algorithms—the Method of Moments for solving linear equations and Krylov's Method for computing the characteristic polynomial of a matrix.
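As a rough sketch of how the ill-conditioning enters Krylov's method (a Python/NumPy illustration under assumptions, not the paper's development): the characteristic-polynomial coefficients are recovered from the Krylov matrix [b, Ab, …, Aⁿ⁻¹b], whose columns play the role of the powers basis and which therefore tends to be badly conditioned. The test matrix and starting vector below are arbitrary illustrative choices.

```python
import numpy as np

def krylov_charpoly(A, b):
    """Krylov's method: find c_0, ..., c_{n-1} such that
    A^n b + c_{n-1} A^{n-1} b + ... + c_0 b = 0,
    by solving K c = -A^n b with K = [b, Ab, ..., A^{n-1} b].
    If b is not confined to a proper invariant subspace, the monic
    polynomial with these coefficients annihilates b; for a matrix with
    distinct eigenvalues it is the characteristic polynomial."""
    n = A.shape[0]
    cols = [b]
    for _ in range(n):
        cols.append(A @ cols[-1])
    K = np.column_stack(cols[:n])      # Krylov matrix [b, Ab, ..., A^{n-1} b]
    rhs = -cols[n]                     # -A^n b
    print("cond(K) =", np.linalg.cond(K))
    return np.linalg.solve(K, rhs)     # ascending coefficients c_0 .. c_{n-1}

# Illustrative example: random symmetric matrix, compared with numpy's
# characteristic polynomial (np.poly returns coefficients highest power first).
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8)); A = (A + A.T) / 2
b = rng.standard_normal(8)
c = krylov_charpoly(A, b)
print("Krylov coefficients:", c[::-1])
print("np.poly(A)[1:]     :", np.poly(A)[1:])
```

The printed condition number of K grows quickly with the matrix order, which is one way the basis result above bears on the method.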