ECON301_Handout_10_1213_02

5. Coefficients have the wrong signs or implausible magnitudes.

6. The OLS estimators and their standard errors are sensitive to small changes in the data. Check how stable the coefficients are when different samples are used. For example, randomly divide the sample in two: if the coefficients differ dramatically between the two halves, multicollinearity may be a problem. Alternatively, try a slightly different specification of the model on the same data and see whether seemingly "innocuous" changes (adding a variable, dropping a variable, using a different form of a variable) produce big shifts. In particular, as variables are added, look for changes in the signs of effects (e.g. switches from positive to negative) that seem theoretically questionable. Such changes may make sense if suppressor effects are present, but otherwise they may indicate multicollinearity.
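The sample-splitting stability check in item 6 can be sketched as follows. This is a minimal illustration on simulated data, not part of the lecture notes: all variable names and the data-generating process are hypothetical.

```python
# Sketch of the stability check: split the sample in two, fit OLS on
# each half, and compare the coefficient vectors. Simulated data only.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)        # nearly collinear with x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])  # intercept + regressors

# Randomly divide the sample in two and fit OLS on each half.
idx = rng.permutation(n)
half_a, half_b = idx[: n // 2], idx[n // 2 :]
beta_a, *_ = np.linalg.lstsq(X[half_a], y[half_a], rcond=None)
beta_b, *_ = np.linalg.lstsq(X[half_b], y[half_b], rcond=None)

# Under strong multicollinearity the individual slope estimates can
# swing dramatically between halves even when the overall fit is good.
print("half A coefficients:", beta_a)
print("half B coefficients:", beta_b)
```

If the two coefficient vectors diverge far more than sampling error alone would suggest, that instability is consistent with multicollinearity.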
ECON 301 - Introduction to Econometrics I, May 2013
METU - Department of Economics
Instructor: Dr. Ozan ERUYGUR, e-mail: [email protected]
Lecture Notes 8

3. Detection of Multicollinearity

1. High R² but few significant t ratios. This is the "classic" symptom of multicollinearity. If R² is high, say in excess of 0.8, the F test will in most cases reject the hypothesis that the partial slope coefficients are simultaneously equal to zero, yet the individual t tests will show that none, or very few, of the partial slope coefficients are statistically different from zero.

2. High pair-wise correlations among regressors. Look for high pair-wise correlation coefficients, i.e. examine the correlation matrix of the dependent variable and the regressors:

$$
R = \begin{bmatrix}
1 & r_{y1} & r_{y2} & \cdots & r_{yk} \\
r_{1y} & 1 & r_{12} & \cdots & r_{1k} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
r_{ky} & r_{k1} & r_{k2} & \cdots & 1
\end{bmatrix}
$$

Another suggested rule of thumb is that if the pair-wise (zero-order) correlation coefficient between two regressors is high, say in excess of 0.8, then multicollinearity is a serious problem. However, zero-order correlation coefficients can be misleading in models involving more than two X variables, since it is possible to have low zero-order correlations and yet find high multicollinearity.
In other words, a high pairwise correlation signals a problem, but a low pairwise correlation is not conclusive evidence of the absence of multicollinearity. In such situations, one may need to examine the partial correlation coefficients.

3. Examination of partial correlations. The partial correlation is the correlation coefficient between any two regressors holding the remaining regressors constant. It can be calculated by regressing each of the two X variables, say X2 and X3, on the remaining X variables, say X4 and X5, and then computing the correlation coefficient between the two sets of residuals. Multicollinearity may be present if this partial correlation coefficient is high.
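The residual-based recipe for the partial correlation of X2 and X3 given X4 and X5 can be sketched directly. The data below are simulated so that the zero-order correlation between X2 and X3 is modest while their partial correlation is high, illustrating why zero-order correlations can mislead.

```python
# Partial correlation via residuals: regress X2 and X3 on the
# remaining regressors (X4, X5), then correlate the residuals.
import numpy as np

rng = np.random.default_rng(2)
n = 300
x4 = rng.normal(size=n)
x5 = rng.normal(size=n)
common = rng.normal(size=n)                   # shared component of x2 and x3
x2 = x4 + common + 0.1 * rng.normal(size=n)
x3 = x5 + common + 0.1 * rng.normal(size=n)

Z = np.column_stack([np.ones(n), x4, x5])     # controls, with intercept

def residualize(v, Z):
    """Residuals from an OLS regression of v on Z."""
    coef, *_ = np.linalg.lstsq(Z, v, rcond=None)
    return v - Z @ coef

e2 = residualize(x2, Z)
e3 = residualize(x3, Z)
partial_r = np.corrcoef(e2, e3)[0, 1]
zero_order = np.corrcoef(x2, x3)[0, 1]

print("zero-order corr(x2, x3)        :", zero_order)
print("partial corr(x2, x3 | x4, x5)  :", partial_r)
```

Here the partial correlation is far larger than the zero-order one, because conditioning on X4 and X5 strips away the independent variation and exposes the shared component, which is precisely the case the notes warn about.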
