ECON301_Handout_10_1213_02


3. Also because of consequence 1, the t ratio of one or more coefficients tends to be statistically insignificant.

ECON 301 - Introduction to Econometrics I, May 2013, METU - Department of Economics, Instructor: Dr. Ozan ERUYGUR, e-mail: , Lecture Notes

4. Although the t ratio of one or more coefficients is statistically insignificant, R², the overall measure of goodness of fit, can be very high.

5. Coefficients may have the wrong signs or implausible magnitudes.

6. The OLS estimators and their standard errors can be sensitive to small changes in the data. Check how stable the coefficients are when different samples are used. For example, randomly divide your sample in two; if the coefficients differ dramatically, multicollinearity may be a problem. Alternatively, try a slightly different specification of the model using the same data and see whether seemingly "innocuous" changes (adding a variable, dropping a variable, using a different form of a variable) produce big shifts. In particular, as variables are added, look for changes in the signs of effects (e.g. switches from positive to negative) that seem theoretically questionable. Such changes may make sense if you believe suppressor effects are present, but otherwise they may indicate multicollinearity.
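The sample-splitting check in item 6 can be sketched as follows. This is not from the handout; it is a minimal numpy illustration on simulated data (all variable names and the near-collinearity setup are hypothetical), fitting OLS separately on two random halves of the sample and comparing the coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # hypothetical regressor, nearly collinear with x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])  # design matrix with intercept

# Randomly split the sample in two and fit OLS on each half.
idx = rng.permutation(n)
half1, half2 = idx[: n // 2], idx[n // 2 :]
b1, *_ = np.linalg.lstsq(X[half1], y[half1], rcond=None)
b2, *_ = np.linalg.lstsq(X[half2], y[half2], rcond=None)

print("half 1 coefficients:", b1)
print("half 2 coefficients:", b2)
print("max absolute difference:", np.max(np.abs(b1 - b2)))
```

With near-collinear regressors, the slope estimates on x1 and x2 tend to swing noticeably between the two halves even though their sum is estimated stably, which is exactly the instability the handout warns about.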
3. Detection of Multicollinearity

1. High R² but few significant t ratios. This is the "classic" symptom of multicollinearity. If R² is high, say in excess of 0.8, the F test will in most cases reject the hypothesis that the partial slope coefficients are simultaneously equal to zero, yet the individual t tests will show that none, or very few, of the partial slope coefficients are statistically different from zero.

2. High pair-wise correlations among regressors. Look for high pair-wise correlation coefficients, i.e. examine the correlation matrix of the regressand and the regressors:

$$
R =
\begin{pmatrix}
1 & r_{y1} & r_{y2} & \cdots & r_{yk} \\
r_{1y} & 1 & r_{12} & \cdots & r_{1k} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
r_{ky} & r_{k1} & r_{k2} & \cdots & 1
\end{pmatrix}
$$

A suggested rule of thumb is that if the pair-wise or zero-order correlation coefficient between two regressors is high, say in excess of 0.8, then multicollinearity is a serious problem. However, zero-order correlation coefficients can be misleading in models involving more than two X variables, since it is possible to have low zero-order correlations and yet find high multicollinearity.
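Both detection checks above can be demonstrated in a few lines. This is not from the handout; it is a minimal numpy sketch on simulated data (the regressors and noise scales are hypothetical assumptions), computing R², the individual t ratios, and the pair-wise correlation of the regressors by hand from the standard OLS formulas.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 30
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)   # hypothetical regressor, almost identical to x1
y = 2.0 + 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
k = X.shape[1]

# OLS: beta = (X'X)^{-1} X'y ; se from s^2 * diag((X'X)^{-1})
beta = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta
s2 = resid @ resid / (n - k)
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
t = beta / se
r2 = 1.0 - (resid @ resid) / np.sum((y - y.mean()) ** 2)

print("R^2:", r2)                # typically high here: the joint fit is good
print("t ratios:", t)            # slope t ratios tend to be small despite high R^2
print("corr(x1, x2):", np.corrcoef(x1, x2)[0, 1])  # near 1: classic pair-wise warning sign
```

The inflated standard errors on the individual slopes (while the regression as a whole fits well) reproduce symptom 1, and the near-unit pair-wise correlation reproduces symptom 2.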
