lreg - Linear, Ridge Regression, and Principal Component Analysis

Linear, Ridge Regression, and Principal Component Analysis

Jia Li
Department of Statistics
The Pennsylvania State University
Email: [email protected]
http://www.stat.psu.edu/~jiali
Introduction to Regression

- Input vector: X = (X_1, X_2, ..., X_p).
- Output Y is real-valued.
- Predict Y from X by f(X) so that the expected loss E[L(Y, f(X))] is minimized.
- Squared loss: L(Y, f(X)) = (Y - f(X))^2.
- The optimal predictor is

      f*(X) = argmin_{f(X)} E(Y - f(X))^2 = E(Y | X).

- The function E(Y | X) is the regression function.
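The slide states the last equality without proof. A short justification, not part of the original slide but a standard argument: split the error into two pieces,

    E(Y - f(X))^2 = E[(Y - E(Y | X)) + (E(Y | X) - f(X))]^2
                  = E(Y - E(Y | X))^2 + E(E(Y | X) - f(X))^2,

where the cross term vanishes after first conditioning on X. Only the second term depends on f, and it is driven to zero by choosing f(X) = E(Y | X), so the conditional mean minimizes the expected squared loss.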
Example

The number of active physicians in a Standard Metropolitan Statistical Area (SMSA), denoted by Y, is expected to be related to total population (X_1, measured in thousands), land area (X_2, measured in square miles), and total personal income (X_3, measured in millions of dollars). Data are collected for 141 SMSAs, as shown in the following table.

    i:     1      2      3      ...  139    140    141
    X_1:   9387   7031   7017   ...  233    232    231
    X_2:   1348   4069   3719   ...  1011   813    654
    X_3:   72100  52737  54542  ...  1337   1589   1148
    Y:     25627  15389  13326  ...  264    371    140

Goal: Predict Y from X_1, X_2, and X_3.
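As a concrete illustration, not part of the original slides, the prediction task can be set up in a few lines of NumPy. Only the six SMSAs visible in the table are used here; the actual analysis would use all 141 rows.

    # Least-squares fit of Y on X1, X2, X3 for the six SMSAs shown above
    # (illustration only; the full data set has 141 rows).
    import numpy as np

    X1 = [9387, 7031, 7017, 233, 232, 231]        # total population (thousands)
    X2 = [1348, 4069, 3719, 1011, 813, 654]       # land area (square miles)
    X3 = [72100, 52737, 54542, 1337, 1589, 1148]  # personal income (millions of dollars)
    Y  = [25627, 15389, 13326, 264, 371, 140]     # number of active physicians

    # Input matrix with a leading column of ones for the intercept beta_0.
    X = np.column_stack([np.ones(len(Y)), X1, X2, X3])
    beta, *_ = np.linalg.lstsq(X, np.array(Y, dtype=float), rcond=None)
    print(beta)  # estimated coefficients (beta_0, beta_1, beta_2, beta_3)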
Linear Methods

- The linear regression model:

      f(X) = β_0 + Σ_{j=1}^p X_j β_j.

- What if the model is not true?
  - It is a good approximation.
  - Because of the lack of training data and/or smarter algorithms, it is the most we can extract robustly from the data.
- Comments on the inputs X_j:
  - Quantitative inputs.
  - Transformations of quantitative inputs, e.g., log(·), √(·).
  - Basis expansions: X_2 = X_1^2, X_3 = X_1^3, X_3 = X_1 · X_2 (see the sketch below).
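A minimal sketch, not from the slides, of how such derived inputs are built as ordinary feature columns so that the model stays linear in the coefficients β_j. The arrays x1 and x2 are made-up raw inputs used only for illustration.

    # Derived inputs: the model remains linear in beta even though the
    # features are nonlinear functions of the raw measurements.
    import numpy as np

    x1 = np.array([1.0, 2.0, 3.0, 4.0])  # hypothetical raw input
    x2 = np.array([0.5, 1.5, 2.5, 3.5])  # hypothetical raw input

    features = np.column_stack([
        x1,           # quantitative input
        np.log(x1),   # transformation log(.)
        np.sqrt(x2),  # transformation sqrt(.)
        x1 ** 2,      # basis expansion X_1^2
        x1 ** 3,      # basis expansion X_1^3
        x1 * x2,      # interaction X_1 * X_2
    ])
    # Each column is just another X_j in f(X) = beta_0 + sum_j X_j beta_j.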
Estimation

- The issue of finding the regression function E(Y | X) is converted to estimating β_j, j = 0, 1, ..., p.
- Training data: {(x_1, y_1), (x_2, y_2), ..., (x_N, y_N)}, where x_i = (x_{i1}, x_{i2}, ..., x_{ip}).
- Denote β = (β_0, β_1, ..., β_p)^T.
- The loss function E(Y - f(X))^2 is approximated by the empirical loss RSS(β)/N:

      RSS(β) = Σ_{i=1}^N (y_i - f(x_i))^2 = Σ_{i=1}^N (y_i - β_0 - Σ_{j=1}^p x_{ij} β_j)^2.
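The minimizer of RSS is not derived in this excerpt, so the following is only a standard sketch that assumes the usual normal-equations solution (X^T X)^{-1} X^T y with an invertible X^T X; X here is the N × (p + 1) input matrix defined on the next slide.

    import numpy as np

    def rss(beta, X, y):
        """Residual sum of squares: sum_i (y_i - x_i^T beta)^2."""
        r = y - X @ beta
        return float(r @ r)

    def least_squares(X, y):
        """Minimize RSS(beta) by solving the normal equations
        X^T X beta = X^T y (assumes X^T X is invertible)."""
        return np.linalg.solve(X.T @ X, X.T @ y)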
Notation

- The input matrix X of dimension N × (p + 1):

      X = | 1  x_{1,1}  x_{1,2}  ...  x_{1,p} |
          | 1  x_{2,1}  x_{2,2}  ...  x_{2,p} |
          | ...  ...    ...      ...  ...     |
          | 1  x_{N,1}  x_{N,2}  ...  x_{N,p} |
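A small sketch, not from the slides, of constructing this matrix from a raw N × p data array; the helper name design_matrix is made up for illustration.

    import numpy as np

    def design_matrix(X_raw):
        """Prepend a column of ones to an N x p array, giving the
        N x (p + 1) input matrix whose first column multiplies beta_0."""
        N = X_raw.shape[0]
        return np.column_stack([np.ones(N), X_raw])

With this convention, the fitted values are simply the matrix-vector product of X and β = (β_0, β_1, ..., β_p)^T.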
