3. Also because of consequence 1, the t-ratio of one or more coefficients tends to be statistically insignificant.
ECON 301 - Introduction to Econometrics I, May 2013
METU - Department of Economics
Instructor: Dr. Ozan ERUYGUR, e-mail: [email protected]
Lecture Notes 7

4. Although the t-ratio of one or more coefficients is statistically insignificant, R², the overall measure of goodness of fit, can be very high.

5. Coefficients may have wrong signs or implausible magnitudes.

6. The OLS estimators and their standard errors can be sensitive to small changes in the data. Check how stable the coefficients are when different samples are used. For example, you might randomly divide your sample in two; if the coefficients differ dramatically, multicollinearity may be a problem. Alternatively, try a slightly different specification of the model using the same data, and see whether seemingly "innocuous" changes (adding a variable, dropping a variable, using a different form of a variable) produce big shifts. In particular, as variables are added, look for changes in the signs of effects (e.g. switches from positive to negative) that seem theoretically questionable. Such changes may make sense if you believe suppressor effects are present, but otherwise they may indicate multicollinearity.
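The split-sample stability check in point 6 can be sketched in a few lines. This is an illustrative simulation, not part of the notes: the regressor names and the data-generating process (x2 built as a near copy of x1) are assumptions chosen to make collinearity visible.

```python
# Sketch of a split-sample stability check for multicollinearity.
# The simulated data and variable names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two highly collinear regressors: x2 is nearly a copy of x1.
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)
y = 1.0 + 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

def ols_coefs(y, X):
    """OLS estimates (intercept first) via least squares."""
    Z = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta

# Randomly divide the sample in two and re-estimate on each half.
idx = rng.permutation(n)
half1, half2 = idx[: n // 2], idx[n // 2 :]
X = np.column_stack([x1, x2])
b1 = ols_coefs(y[half1], X[half1])
b2 = ols_coefs(y[half2], X[half2])

print("half 1 coefficients:", np.round(b1, 2))
print("half 2 coefficients:", np.round(b2, 2))
# With near-duplicate regressors, the individual slope estimates can
# swing dramatically between halves, even though their sum stays
# close to the true combined effect (2.0 + 0.5 = 2.5).
```

Note that only the individual slopes are unstable: the sum of the two slopes is well identified because the data cannot distinguish x1 from x2, which is exactly the symptom the check is designed to expose.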
3. Detection of Multicollinearity

1. High R² but few significant t-ratios. This is the "classic" symptom of multicollinearity. If R² is high, say in excess of 0.8, the F test will in most cases reject the hypothesis that the partial slope coefficients are simultaneously equal to zero, but the individual t tests will show that none, or very few, of the partial slope coefficients are statistically different from zero.

2. High pair-wise correlations among regressors. Look for high pair-wise correlation coefficients, i.e. examine the correlation matrix for the regressors:

R = \begin{bmatrix}
1 & r_{y1} & r_{y2} & \cdots & r_{yk} \\
r_{1y} & 1 & r_{12} & \cdots & r_{1k} \\
r_{2y} & r_{21} & 1 & \cdots & r_{2k} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
r_{ky} & r_{k1} & r_{k2} & \cdots & 1
\end{bmatrix}

Another suggested rule of thumb is that if the pair-wise or zero-order correlation coefficient between two regressors is high, say in excess of 0.8, then multicollinearity is a serious problem. However, zero-order correlation coefficients can be misleading in models involving more than two X variables, since it is possible to have low zero-order correlations and yet find high multicollinearity.
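The pairwise-correlation check above can be sketched as follows. The simulated regressors are an assumption made for illustration; only the 0.8 threshold comes from the notes.

```python
# Sketch of the 0.8 rule of thumb on a zero-order correlation matrix.
# The simulated regressors are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 500

x1 = rng.normal(size=n)
x2 = 0.95 * x1 + rng.normal(scale=0.3, size=n)  # collinear with x1
x3 = rng.normal(size=n)                          # unrelated regressor

X = np.column_stack([x1, x2, x3])
R = np.corrcoef(X, rowvar=False)  # zero-order correlation matrix

print(np.round(R, 2))

# Flag regressor pairs whose |r| exceeds the 0.8 rule of thumb.
k = X.shape[1]
flagged = [(i + 1, j + 1)
           for i in range(k) for j in range(i + 1, k)
           if abs(R[i, j]) > 0.8]
print("suspect pairs:", flagged)
```

As the notes caution, a clean correlation matrix is not conclusive: with more than two regressors, a near-linear combination of several variables can produce severe multicollinearity even when every pairwise correlation is modest.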