Abstract
A linear regression model assumes that the regression function E(Y|X) is linear in the inputs X_1, ..., X_p. Linear models were largely developed in the precomputer age of statistics, but even in today's computer era there are still good reasons to study and use them. They are simple and often provide an adequate and interpretable description of how the inputs affect the output. For prediction purposes they can sometimes outperform fancier nonlinear models, especially in situations with small numbers of training cases, low signal-to-noise ratio, or sparse data. Finally, linear methods can be applied to transformations of the inputs, and this considerably expands their scope. These generalizations are sometimes called basis-function methods, and are discussed in Chapter 5.
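As a minimal illustration of the setting the abstract describes (not code from the chapter itself), the sketch below fits a linear model by ordinary least squares with NumPy; the variable names and the simulated data are assumptions for the example.

```python
import numpy as np

# Sketch: fit f(X) = beta_0 + beta_1*X_1 + beta_2*X_2 by ordinary
# least squares on simulated data (all names/values are illustrative).
rng = np.random.default_rng(0)
n, p = 50, 2
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, 2.0, -3.0])           # intercept plus two slopes
y = beta_true[0] + X @ beta_true[1:] + 0.01 * rng.normal(size=n)

Xb = np.hstack([np.ones((n, 1)), X])             # prepend an intercept column
beta_hat, *_ = np.linalg.lstsq(Xb, y, rcond=None)

print(np.round(beta_hat, 2))                     # estimates close to beta_true
```

Applying the same solver to transformed inputs (e.g. replacing a column of X with its square or logarithm) is exactly the basis-function generalization the abstract mentions: the model stays linear in the coefficients even when it is nonlinear in the original inputs.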
© 2001 Springer Science+Business Media New York
Cite this chapter
Hastie, T., Friedman, J., Tibshirani, R. (2001). Linear Methods for Regression. In: The Elements of Statistical Learning. Springer Series in Statistics. Springer, New York, NY. https://doi.org/10.1007/978-0-387-21606-5_3
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4899-0519-2
Online ISBN: 978-0-387-21606-5