# Lecture 2


It is also the case that the incorrect omission of a set of variables may result in the estimated regression model failing one or more of the tests which we discuss below.

## A DIGRESSION: MULTICOLLINEARITY

Multicollinearity (MC) exists whenever there is a non-zero correlation between two regressors (or linear combinations of regressors) in the model being estimated. Given that the likelihood of all variables in X being perfectly uncorrelated with one another is close to zero, MC nearly always exists in applied research.

In the extreme case, perfect multicollinearity is said to exist when two regressors (or linear combinations of regressors) exhibit perfect correlation in the sample data set. In this case the estimator breaks down, because a required matrix inverse cannot be obtained. Intuitively, parameter estimates are unobtainable because OLS cannot, in this extreme case, identify the contribution that any individual variable makes to explaining the dependent variable.

The more common case of less-than-perfect multicollinearity is sometimes described as a "problem" when the degree of correlation is high, but that description is very misleading. Provided the assumptions of the LRM are satisfied, multicollinearity does not affect the properties of the OLS estimator: even where it exists, OLS is unbiased and efficient, the standard errors are correct, and t and F tests remain valid (subject, as always, to the caveat that the assumptions of the LRM are satisfied). However, high correlation between regressors tends to make the standard errors of the estimators large (relative to what they would be if the regressors were only weakly correlated). As a result, confidence intervals tend to be wide, and the probability of a Type 2 error (failing to reject a false null) tends to be high; in other words, hypothesis tests have low power. In summary, multicollinearity adversely affects the precision of our estimation and testing.
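Both cases above can be illustrated with a small simulation. The following is a sketch using NumPy; the sample size, correlation levels, and coefficient values are illustrative assumptions, not part of the lecture. It first shows that under perfect collinearity X'X loses full rank (so its inverse does not exist), and then that OLS standard errors grow as the correlation between regressors rises.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # illustrative sample size

# Perfect collinearity: the third column is an exact multiple of the second,
# so X'X has rank 2 rather than full rank 3 and cannot be inverted.
x1 = rng.standard_normal(n)
X_perfect = np.column_stack([np.ones(n), x1, 2.0 * x1])
rank_perfect = np.linalg.matrix_rank(X_perfect.T @ X_perfect)
print("rank of X'X under perfect collinearity:", rank_perfect)  # 2 < 3

# Less-than-perfect collinearity: OLS still works, but the standard error of
# each slope estimate rises with the correlation (rho) between the regressors.
ses = {}
for rho in (0.0, 0.9, 0.99):
    z1 = rng.standard_normal(n)
    z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
    X = np.column_stack([np.ones(n), z1, z2])
    y = X @ np.array([1.0, 0.5, 0.5]) + rng.standard_normal(n)
    b, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS coefficients
    resid = y - X @ b
    s2 = resid @ resid / (n - X.shape[1])       # unbiased error-variance estimate
    se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
    ses[rho] = se[1]
    print(f"rho={rho}: se(b1) = {se[1]:.3f}")
```

Note that in the second part the estimates remain unbiased at every value of rho; only their precision deteriorates, which is exactly the point made above.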
This is of course "undesirable", but it is not a problem per se: it does not invalidate the use of OLS or any of the tests we might wish to perform.

Can anything be done to avoid multicollinearity? In general, no. Multicollinearity is a property of the data we use; unless we are willing to discard that data, it cannot be "avoided". Increasing the sample size may reduce collinearity, but this raises the question of why the larger data set was not used in the first place. Alternatively, it may be possible to reparameterise the model so that the correlation between members of the re-parameterised data set is lower than that between the original variables. For example, regressions involving mixtures of differences and levels will tend to exhibit lower collinearity than regressions among levels of variables alone. (This is one reason, albeit not the main one, why an ECM parameterisation may be preferable to a levels-only parameterisation.)
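The levels-versus-differences point can be sketched numerically. In the example below (an illustration only; the common trend, noise scale, and series names are assumptions, not from the lecture), two trending series are highly correlated in levels because they share the trend, while their first differences are nearly uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(200, dtype=float)

# Two "levels" series sharing a common deterministic trend plus independent noise.
x = t + 5.0 * rng.standard_normal(200)
w = t + 5.0 * rng.standard_normal(200)

# Correlation in levels is driven by the shared trend; differencing removes it.
corr_levels = np.corrcoef(x, w)[0, 1]
corr_diffs = np.corrcoef(np.diff(x), np.diff(w))[0, 1]
print(f"correlation in levels:      {corr_levels:.3f}")
print(f"correlation in differences: {corr_diffs:.3f}")
```

A regression mixing differences with levels therefore inherits less of this trend-induced collinearity, which is the mechanical reason such reparameterisations can help.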

