Econ 399 Chapter 3e
3.4 The Components of the OLS Variances: Multicollinearity

We see in (3.51) that the variance of β̂_j depends on three factors, σ², SST_j and R_j²:

  Var(β̂_j) = σ² / [SST_j (1 − R_j²)]    (3.51)

1) The error variance, σ²

Larger error variance → larger OLS variance
-more "noise" in the equation makes it more difficult to accurately estimate the partial effects of the variables
-one can reduce the error variance by adding (valid) explanatory variables to the equation
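Not part of the original slides, but as a concrete illustration, here is a minimal Python sketch of how the three components in (3.51) combine. All numbers are hypothetical and the helper ols_slope_variance is invented for this example; it simply shows that doubling the error variance doubles Var(β̂_j), everything else held fixed.

```python
# Minimal sketch of (3.51): Var(beta_j_hat) = sigma^2 / (SST_j * (1 - R_j^2)).
# All numbers below are hypothetical, purely for illustration.

def ols_slope_variance(sigma2, sst_j, r2_j):
    """Sampling variance of beta_j_hat built from its three components."""
    return sigma2 / (sst_j * (1.0 - r2_j))

# Doubling the error variance doubles Var(beta_j_hat), everything else equal.
print(ols_slope_variance(sigma2=4.0, sst_j=200.0, r2_j=0.5))  # 0.04
print(ols_slope_variance(sigma2=8.0, sst_j=200.0, r2_j=0.5))  # 0.08
```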
3.4 The Components of the OLS Variances: Multicollinearity

2) The total sample variation in x_j, SST_j

Larger variation in x_j → smaller variance of β̂_j
-increasing the sample size keeps increasing SST_j, since
  SST_j = Σ_i (x_ij − x̄_j)²
-this still assumes that we have a random sample
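A small simulated illustration of this point (not from the slides; the data are random draws and sst is a helper defined here): SST_j keeps growing roughly in proportion to the sample size, which shrinks Var(β̂_j) through (3.51).

```python
# Sketch: SST_j = sum_i (x_ij - xbar_j)^2 grows roughly linearly with n.
import numpy as np

rng = np.random.default_rng(0)

def sst(x):
    """Total sample variation in x: sum of squared deviations from the mean."""
    return float(np.sum((x - x.mean()) ** 2))

for n in (50, 500, 5000):
    x_j = rng.normal(loc=0.0, scale=2.0, size=n)  # hypothetical regressor draws
    print(n, round(sst(x_j)))  # roughly n * Var(x_j), so it keeps increasing with n
```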
3.4 The Components of the OLS Variances: Multicollinearity

3) Linear relationships among the x variables: R_j²

Larger correlation among the x's → bigger variance of β̂_j
-R_j² is the most difficult component to understand
-R_j² differs from the typical R² in that it measures the goodness of fit of the auxiliary regression
  x̂_ij = φ_0 + φ_1 x_i1 + φ_2 x_i2 + ... + φ_k x_ik
-where x_j itself is not included as an explanatory variable; it is the dependent variable, regressed on all of the other x's
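To make the auxiliary regression concrete, here is a small simulated sketch (not from the slides; the variable names and data are invented): R_j² is simply the usual R² from regressing one regressor on an intercept and the other regressors.

```python
# Sketch of the auxiliary regression behind R_j^2: regress x_j on an intercept
# and the *other* explanatory variables, then take the usual R^2 of that fit.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(scale=0.2, size=n)  # x2 is strongly related to x1
x3 = rng.normal(size=n)

def r_squared_j(xj, others):
    """R^2 from regressing x_j on an intercept and the other regressors."""
    X = np.column_stack([np.ones(len(xj))] + list(others))
    coef, *_ = np.linalg.lstsq(X, xj, rcond=None)
    resid = xj - X @ coef
    return 1.0 - (resid @ resid) / np.sum((xj - xj.mean()) ** 2)

print(r_squared_j(x2, [x1, x3]))  # close to 1 (about 0.94): most of x2's
                                  # variation is explained by the other x's
```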
3.4 The Components of the OLS Variances: Multicollinearity

3) Linear relationships among the x variables: R_j²

-In general, R_j² is the proportion of the total variation in x_j that is explained by the other independent variables
-if R_j² = 1, MLR.3 (and OLS) fails due to perfect multicollinearity (x_j is a perfect linear combination of the other x's)

Note that Var(β̂_j) → ∞ as R_j² → 1

-high (but not perfect) correlation between independent variables is MULTICOLLINEARITY
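As a quick numerical illustration (not part of the slides), the factor 1/(1 − R_j²) in (3.51), commonly called the variance inflation factor, grows without bound as R_j² approaches 1:

```python
# Sketch: holding sigma^2 and SST_j fixed, Var(beta_j_hat) is proportional to
# 1 / (1 - R_j^2), which explodes as R_j^2 -> 1 (perfect multicollinearity).
for r2_j in (0.0, 0.5, 0.9, 0.99, 0.999):
    inflation = 1.0 / (1.0 - r2_j)
    print(f"R_j^2 = {r2_j:5}: variance inflated by a factor of {inflation:g}")
```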
3.4 Multicollinearity

-Note that an R_j² close to 1 DOES NOT violate MLR.3
-unfortunately, the "problem" of multicollinearity is hard to define
-no value of R_j² is accepted as being "too high"
-a high R_j² can always be offset by a high SST_j or a low σ²
-ultimately, what matters is how big β̂_j is relative to its standard deviation
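A small hypothetical example of the offset described above (not from the slides; all numbers are made up): the two settings below have very different R_j² yet give the same Var(β̂_j), because the larger SST_j compensates for the higher R_j².

```python
# Sketch: a high R_j^2 need not be a problem if SST_j is large enough.
# Both (hypothetical) settings below give the same Var(beta_j_hat).

def ols_slope_variance(sigma2, sst_j, r2_j):
    return sigma2 / (sst_j * (1.0 - r2_j))

print(ols_slope_variance(sigma2=4.0, sst_j=100.0,  r2_j=0.10))  # ~0.044
print(ols_slope_variance(sigma2=4.0, sst_j=1000.0, r2_j=0.91))  # ~0.044
```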
3.4 Multicollinearity

-Ceteris paribus, it is best to have little correlation between x_j and all of the other independent variables
-dropping independent variables will reduce multicollinearity
-but if these variables are valid (belong in the model), dropping them creates bias
-multicollinearity can always be fought by collecting more data
-sometimes multicollinearity is due to over-specifying the independent variables:
3.4 Multicollinearity Example

-In a study of heart disease, our economic model