Econ 508: Issues beyond heteroskedasticity and autocorrelation
Juan Fung (MSPE)
April 22, 2011

Outline

- More violations of the classical assumptions
- Qualitative explanatory variables
- Qualitative dependent variables
- Dynamic models
- Other issues

More violations of the classical assumptions: Multicollinearity

Recall the assumption that the regressors X_i are linearly independent. Consider two cases,

  X_i = α_0 + α_1 Z_i        (1)
  X_i = α_0 + α_1 Z_i + ν_i, (2)

where ν_i is a white-noise error. Suppose you want to estimate

  Y_i = β_0 + β_1 X_i + β_2 Z_i + ε_i.

In case (1), X_i is an exact linear function of Z_i. This violates linear independence, and the OLS estimators do not exist. In case (2), X_i and Z_i have only an approximate linear relationship. No assumptions are violated, so the OLS estimators exist; moreover, Gauss-Markov holds.

If we don't have exact multicollinearity, then what's the problem? Note that the estimated variance of β̂_1 can be written as

  Var(β̂_1) = (1/n) (s²_y / s²_x) (1 − R²) / (1 − R²_x),

where s²_y is the estimated variance of Y, s²_x is the estimated variance of X, R² is the coefficient of determination of the full regression, and R²_x is the coefficient of determination from regressing X on all other explanatory variables (in this case, Z).

a. More variation in X (higher s²_x) reduces the variance of β̂_1.
b. More collinearity (higher R²_x) increases the variance of β̂_1.

So high collinearity implies high variance, and hence misleading inference.
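The variance formula above can be checked by simulation. The sketch below is illustrative (the data-generating process, coefficients, and seed are all made up for the example): it fits the regression of Y on X and Z by OLS twice, once with mild collinearity between X and Z and once with near-exact collinearity, and compares the resulting standard errors of β̂_1.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

def slope_se(collinearity_noise):
    # Hypothetical DGP: X = Z + noise, so shrinking the noise raises R²_x.
    z = rng.normal(size=n)
    x = z + collinearity_noise * rng.normal(size=n)
    y = 1.0 + 2.0 * x + 3.0 * z + rng.normal(size=n)
    A = np.column_stack([np.ones(n), x, z])
    beta, res, rank, _ = np.linalg.lstsq(A, y, rcond=None)
    sigma2 = res[0] / (n - 3)              # estimate of the error variance
    cov = sigma2 * np.linalg.inv(A.T @ A)  # OLS covariance matrix
    return np.sqrt(cov[1, 1])              # standard error of beta_1

se_low = slope_se(1.0)    # mild collinearity between X and Z
se_high = slope_se(0.05)  # near-exact collinearity
print(se_low < se_high)   # True: collinearity inflates Var(beta_1)
```

Nothing in the model is misspecified in either run; only the independent variation in X changes, which is exactly the sample problem the slides describe.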
More violations of the classical assumptions: Detecting multicollinearity

High multicollinearity arises because OLS is not given enough independent variation in X to calculate its effect on Y. It is a sample problem. Several methods exist for diagnosing multicollinearity:

1. Coefficients have the wrong sign.
2. High R² and F statistic, but low t statistics.
3. Small changes in the data result in large changes in the estimates.
4. Partial correlation coefficients.
5. The variance inflation factor, VIF_x = 1 / (1 − R²_x).

None of these is necessary or sufficient.

More violations of the classical assumptions: Dealing with multicollinearity

Since it is a sample problem, you can try to increase variation by incorporating additional information:

a. Get more data.
b. Respecify the model: specify the relationship between X and Z, or among β_1, β_2. Use theory. ...
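The VIF in item 5 follows directly from its definition: regress X on the other explanatory variables, take the R² of that auxiliary regression, and compute 1 / (1 − R²_x). A minimal numpy sketch on simulated data (the DGP and seed are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
z = rng.normal(size=n)
x = z + 0.1 * rng.normal(size=n)  # X highly collinear with Z (made-up data)

# Auxiliary regression of X on the other regressors (here, a constant and Z).
A = np.column_stack([np.ones(n), z])
coef, res, rank, _ = np.linalg.lstsq(A, x, rcond=None)
r2_x = 1.0 - res[0] / np.sum((x - x.mean()) ** 2)

vif = 1.0 / (1.0 - r2_x)
print(vif > 10)  # True here; VIF > 10 is a common rule of thumb for trouble
```

With several regressors, the same auxiliary regression is run once per explanatory variable, each time on all the others.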