VER. 9/25/2012. © P. KOLM

Algebraic Properties of OLS

The sum of the OLS residuals is zero
o Thus, the sample average of the OLS residuals is zero as well

The sample covariance between the regressors and the OLS residuals is zero

The OLS regression line always goes through the mean of the sample, (\bar{x}, \bar{y}) (For you: Why?)

Important Terminology

Recall that for any value of x = x_i we obtain a fitted value for y,
\hat{y}_i = \hat{b}_0 + \hat{b}_1 x_i

We can think of each observation y_i as being made up of an explained part and an unexplained part,
y_i = \hat{y}_i + \hat{u}_i = explained part + unexplained part

We then define the following:

\sum (y_i - \bar{y})^2 is the total sum of squares (SST)

\sum (\hat{y}_i - \bar{y})^2 is the explained sum of squares (SSE)

\sum \hat{u}_i^2 is the residual sum of squares (SSR)

Fact: SST = SSE + SSR

Proof that SST = SSE + SSR

We show simply by calculating (For you: Verify!)

SST = \sum (y_i - \bar{y})^2
    = \sum [ (y_i - \hat{y}_i) + (\hat{y}_i - \bar{y}) ]^2
    = \sum [ \hat{u}_i + (\hat{y}_i - \bar{y}) ]^2
    = \sum \hat{u}_i^2 + 2 \sum \hat{u}_i (\hat{y}_i - \bar{y}) + \sum (\hat{y}_i - \bar{y})^2
    = SSR + 2 \sum \hat{u}_i (\hat{y}_i - \bar{y}) + SSE
    = SSR + SSE

as we know that \sum \hat{u}_i (\hat{y}_i - \bar{y}) = 0
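The slides contain no code, but the algebraic properties of OLS and the SST = SSE + SSR decomposition are easy to verify numerically. A minimal sketch in Python, assuming numpy is available; the simulated data, seed, and coefficient values are hypothetical illustrations, not from the lecture:

```python
import numpy as np

# Hypothetical simulated data (illustration only; not from the slides)
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 0.5 + 1.5 * x + rng.normal(size=100)

# OLS estimates via the standard two-variable formulas
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

y_hat = b0 + b1 * x    # fitted values
u_hat = y - y_hat      # OLS residuals

# Algebraic properties of OLS
assert np.isclose(u_hat.sum(), 0.0)               # residuals sum to zero
assert np.isclose(np.sum(x * u_hat), 0.0)         # zero sample covariance with the regressor
assert np.isclose(b0 + b1 * x.mean(), y.mean())   # line passes through (x-bar, y-bar)

# Sum-of-squares decomposition
sst = np.sum((y - y.mean()) ** 2)
sse = np.sum((y_hat - y.mean()) ** 2)
ssr = np.sum(u_hat ** 2)
assert np.isclose(sst, sse + ssr)                 # SST = SSE + SSR

# Two-variable case: R^2 equals the squared sample correlation
r2 = 1.0 - ssr / sst
assert np.isclose(r2, np.corrcoef(x, y)[0, 1] ** 2)
```

All assertions hold up to floating-point tolerance, which is the numerical counterpart of the exact algebraic identities above.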
“Goodness-of-Fit”

How do we think about how well our sample regression line fits our sample data?

We compute the fraction of the total sum of squares (SST) that is explained by the model, and call this the R-squared of the regression:
R^2 = SSE/SST = 1 - SSR/SST

Note that 0 \le R^2 \le 1

A higher R^2 implies a better-fitting model (but not necessarily a better model!)
In the two-variable case R^2 = r_{xy}^2, the square of the sample correlation between x and y. However, in the general case this is no longer true.

Cautionary note: Don't just rely on the R-squared when evaluating a regression
o We will come back to this in later lectures

References
Wooldridge, J. M. (2009). Introductory Econometrics: A Modern Approach. South-Western Pub.

Footnotes

1 This uses the identities (for you: show these!)

\sum_{i=1}^{n} (x_i - \bar{x})^2 = \sum_{i=1}^{n} x_i^2 - n \bar{x}^2 = \sum_{i=1}^{n} x_i (x_i - \bar{x})

\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) = \sum_{i=1}^{n} x_i (y_i - \bar{y}) = \sum_{i=1}^{n} (x_i - \bar{x}) y_i = \sum_{i=1}^{n} x_i y_i - n \bar{x} \bar{y}
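These footnote identities can likewise be checked numerically. A small sketch, assuming numpy and arbitrary hypothetical data:

```python
import numpy as np

# Hypothetical data for a numerical check of the identities
rng = np.random.default_rng(2)
x = rng.normal(size=20)
y = rng.normal(size=20)
xb, yb = x.mean(), y.mean()
n = len(x)

# First identity: three equivalent forms of the sum of squared deviations
lhs = np.sum((x - xb) ** 2)
assert np.isclose(lhs, np.sum(x ** 2) - n * xb ** 2)
assert np.isclose(lhs, np.sum(x * (x - xb)))

# Second identity: four equivalent forms of the cross-product sum
cross = np.sum((x - xb) * (y - yb))
assert np.isclose(cross, np.sum(x * (y - yb)))
assert np.isclose(cross, np.sum((x - xb) * y))
assert np.isclose(cross, np.sum(x * y) - n * xb * yb)
```

Each form follows from expanding the products and using \sum (x_i - \bar{x}) = 0, which is why terms involving a bare deviation sum drop out.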