Open Access
Approximation of Least Squares Regression on Nested Subspaces
Dennis D. Cox
Ann. Statist. 16(2): 713-732 (June, 1988). DOI: 10.1214/aos/1176350830

Abstract

For a regression model $y_i = \theta(x_i) + \varepsilon_i$, the unknown function $\theta$ is estimated by least squares on a subspace $\Lambda_m = \operatorname{span}\{\psi_1, \psi_2, \cdots, \psi_m\}$, where the basis functions $\psi_i$ are predetermined and $m$ is varied. Assuming that the design is suitably approximated by an asymptotic design measure, a general method is presented for approximating the bias and variance in a scale of Hilbertian norms natural to the problem. The general theory is illustrated with two examples: truncated Fourier series regression and polynomial regression. For these examples, we give rates of convergence of derivative estimates in (weighted) $L_2$ norms and establish consistency in supremum norm.
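The estimator described in the abstract is ordinary least squares projected onto the $m$-dimensional span of fixed basis functions, with $m$ varied to trade bias against variance. The sketch below is not from the paper; it illustrates the setup for the truncated Fourier series example, using an assumed cosine basis and illustrative function names.

```python
# A minimal sketch of least squares regression on nested subspaces
# Lambda_3 in Lambda_6 in Lambda_12, with a truncated cosine basis.
# Basis choice, names, and the test function theta are assumptions.
import numpy as np

def design_matrix(x, m):
    """Columns psi_1, ..., psi_m of a cosine basis evaluated at x."""
    # psi_1 = 1; psi_j(x) = sqrt(2) cos((j - 1) pi x) for j >= 2
    cols = [np.ones_like(x)]
    cols += [np.sqrt(2.0) * np.cos(j * np.pi * x) for j in range(1, m)]
    return np.column_stack(cols)

def fit_on_subspace(x, y, m):
    """Least squares estimate of theta on Lambda_m = span{psi_1, ..., psi_m}."""
    Psi = design_matrix(x, m)
    coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)
    return lambda t: design_matrix(np.asarray(t), m) @ coef

# Illustrative data: y_i = theta(x_i) + eps_i with theta(x) = x^2 on (0, 1].
rng = np.random.default_rng(0)
n = 200
x = np.arange(1, n + 1) / n              # regular design, approximated by
eps = 0.1 * rng.standard_normal(n)       # the uniform asymptotic design measure
y = x**2 + eps

for m in (3, 6, 12):                     # nested subspaces of growing dimension
    theta_hat = fit_on_subspace(x, y, m)
    resid = y - theta_hat(x)
    print(m, np.mean(resid**2))          # in-sample MSE falls as the bias shrinks
```

For polynomial regression the same scheme applies with $\psi_j$ taken as polynomials of degree $j - 1$; in practice an orthogonal polynomial family keeps the design matrix well conditioned as $m$ grows.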

Citation


Dennis D. Cox. "Approximation of Least Squares Regression on Nested Subspaces." Ann. Statist. 16(2): 713-732, June 1988. https://doi.org/10.1214/aos/1176350830

Information

Published: June, 1988
First available in Project Euclid: 12 April 2007

zbMATH: 0669.62047
MathSciNet: MR947572
Digital Object Identifier: 10.1214/aos/1176350830

Subjects:
Primary: 62J05
Secondary: 41A10, 62F12

Keywords: bias approximation, model selection, nonparametric regression, orthogonal polynomials, polynomial regression, rates of convergence, regression

Rights: Copyright © 1988 Institute of Mathematical Statistics
