Week 9 - Econ 508: Issues beyond heteroskedasticity and autocorrelation


Econ 508: Issues beyond heteroskedasticity and autocorrelation
Juan Fung (MSPE)
April 22, 2011

Outline

- More violations of the classical assumptions
- Qualitative explanatory variables
- Qualitative dependent variables
- Dynamic models
- Other issues

More violations of the classical assumptions: Multicollinearity

Recall the assumption that the X_i's are independent. Consider two cases,

    X_i = φ_0 + φ_1 Z_i            (1)
    X_i = φ_0 + φ_1 Z_i + ν_i,     (2)

where ν_i is a white-noise error. Suppose you want to estimate

    Y_i = β_0 + β_1 X_i + β_2 Z_i + ε_i.

In case (1), X_i is an exact linear function of Z_i. This violates independence, and the OLS estimators do not exist. In case (2), X_i and Z_i have only an approximate linear relationship. No assumption is violated, so the OLS estimators exist; moreover, the Gauss-Markov theorem holds.

Multicollinearity: why it matters

If we don't have exact multicollinearity, then what's the problem? Note that the estimated variance of β̂_1 can be written as

    Var(β̂_1) = (1/n) · (s²_y / s²_x) · (1 − R²) / (1 − R²_x),

where s²_y is the estimated variance of Y, s²_x is the estimated variance of X, R² is the coefficient of determination of the full regression, and R²_x is the coefficient of determination from regressing X on all other explanatory variables (in this case, Z).

a. More variation in X (higher s²_x) reduces the variance of β̂_1.
b. More collinearity (higher R²_x) increases the variance of β̂_1.

So high collinearity implies high variance of β̂_1, and hence misleading inference.
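The contrast between the two cases can be seen numerically. Below is a minimal sketch (the simulated data, coefficient values, and variable names are illustrative, not from the lecture): in case (1) the cross-product matrix X'X is singular, so the OLS normal equations have no unique solution; in case (2) a little white noise ν restores full rank and OLS goes through.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
z = rng.normal(size=n)

# Case (1): x is an exact linear function of z, so the column of x
# equals 2*(constant column) + 0.5*(z column) and X'X is singular.
x_exact = 2.0 + 0.5 * z
X1 = np.column_stack([np.ones(n), x_exact, z])
print(np.linalg.matrix_rank(X1.T @ X1))  # 2, not 3: no unique OLS solution

# Case (2): x = 2 + 0.5 z + nu with white-noise nu; X'X is invertible.
x_approx = 2.0 + 0.5 * z + rng.normal(scale=0.1, size=n)
X2 = np.column_stack([np.ones(n), x_approx, z])
y = 1.0 + 1.0 * x_approx + 1.0 * z + rng.normal(size=n)
beta_hat, *_ = np.linalg.lstsq(X2, y, rcond=None)
# OLS is well defined, but beta_1-hat and beta_2-hat are imprecise
# because x and z share almost all of their variation.
print(beta_hat)
```

Note that in case (2) the estimator exists and is BLUE, yet its variance is large: exactly the point of the variance formula above.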
Detecting multicollinearity

High multicollinearity arises because the sample does not give OLS enough independent variation in X to calculate its effect on Y. It is a sample problem. Several methods exist for "diagnosing" multicollinearity:

1. Coefficients have the wrong sign.
2. High R² and F-statistic, but low t-statistics.
3. Small changes in the data result in large changes in the estimates.
4. Partial correlation coefficients.
5. The variance inflation factor, VIF_x = 1 / (1 − R²_x).

None of these is necessary or sufficient.

Dealing with multicollinearity

Since it is a sample problem, you can try to increase variation by incorporating additional information:

a. Get more data.
b. Re-specify the model. Specify the relationship between X and Z, or among β_1, β_2. Use theory. ...
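The variance inflation factor from diagnostic (5) is easy to compute by hand: regress X on the other explanatory variables, take the R²_x, and form 1/(1 − R²_x). A short sketch (simulated data and the helper name `vif` are illustrative, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
z = rng.normal(size=n)

def vif(x, others):
    """VIF of regressor x: regress x on the other regressors (plus a
    constant), compute R^2_x from the residuals, return 1/(1 - R^2_x)."""
    W = np.column_stack([np.ones(len(x))] + list(others))
    resid = x - W @ np.linalg.lstsq(W, x, rcond=None)[0]
    r2_x = 1.0 - resid.var() / x.var()
    return 1.0 / (1.0 - r2_x)

# Mild vs. severe (but not exact) collinearity between x and z.
x_mild = z + rng.normal(scale=2.0, size=n)
x_severe = z + rng.normal(scale=0.1, size=n)
print(vif(x_mild, [z]))    # modest: little variance inflation
print(vif(x_severe, [z]))  # large: Var(beta_1-hat) is blown up
```

A common rule of thumb treats VIF above roughly 10 as a sign of problematic collinearity, but as the slide notes, no single diagnostic is necessary or sufficient.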