PSTAT 127: Ordinary Least Squares Review, followed by Ridge Regression Introduction
Feb 26, 2018

In this set of notes, we first review ordinary least squares estimation in the linear model with zero-mean, homoskedastic errors. Then we introduce a different estimation approach, ridge regression, which can give lower mean squared error (MSE). Ridge regression is an example of a shrinkage estimation approach, based on a penalization criterion.
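As a preview, the following is a minimal numerical sketch (not from the notes; the simulated data, seed, and penalty weight lam are illustrative assumptions) contrasting the OLS estimator with the standard ridge estimator obtained by minimizing the penalized criterion $\|y - Xb\|^2 + \lambda \|b\|^2$.

```python
# Illustrative sketch: OLS vs. ridge estimates on simulated data.
# The design, true coefficients, and penalty weight `lam` are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.0, 0.5, 0.0])
y = X @ beta_true + rng.normal(scale=1.0, size=n)

# OLS: minimize ||y - X b||^2, giving b_ols = (X'X)^{-1} X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge: minimize ||y - X b||^2 + lam * ||b||^2, giving b_ridge = (X'X + lam I)^{-1} X'y
lam = 5.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

print("OLS:  ", np.round(beta_ols, 3))
print("Ridge:", np.round(beta_ridge, 3))  # coefficients shrunk toward zero
```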
Recall: Estimation in the Linear Model

Suppose $Y = X\beta + \epsilon$ with $\epsilon \sim (0, \sigma^2 I)$. Aim: estimate $\beta$.

One of the estimation approaches you studied is ordinary least squares estimation (OLS).

(Ordinary) Least squares: find $\hat{\beta}$ to minimize the residual sum of squares
$$ e'e = \sum_{i=1}^{n} e_i^2 = \left\| y - X\hat{\beta} \right\|^2, $$
where $e_i = y_i - X[i,]\hat{\beta}$ and $e = y - X\hat{\beta}$ for estimate vector $\hat{\beta}$, a function of the observations $y$.
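A minimal sketch of this criterion in code (the simulated data and seed are illustrative assumptions, not from the notes): the least-squares fit minimizes $\|y - X\hat{\beta}\|^2$, and the residual sum of squares is $e'e$.

```python
# Illustrative sketch: OLS fit and the residual sum of squares it minimizes.
import numpy as np

rng = np.random.default_rng(1)
n, p = 30, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

# Least-squares fit; lstsq minimizes ||y - X b||^2 directly.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

e = y - X @ beta_hat   # residual vector e = y - X beta_hat
rss = e @ e            # e'e = sum_i e_i^2
print("beta_hat:", np.round(beta_hat, 3))
print("RSS:", round(float(rss), 3))
```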
Properties of Ordinary Least Squares Estimates

If $E[\epsilon] = 0$, then
$$ E\left[\hat{\beta}\right] = (X'X)^{-1} X' E[Y] = (X'X)^{-1} X' X \beta = \beta, $$
so the OLS $\hat{\beta}$ is an unbiased estimator of $\beta$.

Also, $\mathrm{Var}[Y] = \mathrm{Var}[Y - X\beta] = \mathrm{Var}[\epsilon]$. So, if $\mathrm{Var}(\epsilon) = \sigma^2 I_n$, then
$$ \mathrm{Var}\left[\hat{\beta}\right] = \mathrm{Var}\left[ (X'X)^{-1} X' Y \right] = (X'X)^{-1} X' \mathrm{Var}(Y) X (X'X)^{-1} = \sigma^2 (X'X)^{-1}. $$
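A quick simulation check of these two properties (illustrative only; the fixed design, true $\beta$, $\sigma$, and number of replications are assumptions): averaging $\hat{\beta}$ over many replications should approximate $\beta$, and the empirical covariance of $\hat{\beta}$ should approximate $\sigma^2 (X'X)^{-1}$.

```python
# Illustrative simulation: unbiasedness and Var[beta_hat] = sigma^2 (X'X)^{-1}.
import numpy as np

rng = np.random.default_rng(2)
n, p, sigma = 40, 2, 1.5
X = rng.normal(size=(n, p))          # fixed design across replications
beta = np.array([1.0, -1.0])
XtX_inv = np.linalg.inv(X.T @ X)

reps = 20000
estimates = np.empty((reps, p))
for r in range(reps):
    y = X @ beta + rng.normal(scale=sigma, size=n)   # zero-mean, homoskedastic errors
    estimates[r] = XtX_inv @ X.T @ y                 # beta_hat = (X'X)^{-1} X'y

print("mean of beta_hat:", np.round(estimates.mean(axis=0), 3))    # close to beta
print("empirical cov:\n", np.round(np.cov(estimates.T), 4))
print("sigma^2 (X'X)^{-1}:\n", np.round(sigma**2 * XtX_inv, 4))
```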
