Multicollinearity notes

Multicollinearity – a simplified example

Assume that we have the following model:

(1)  $Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + u_i$,

where the two regressors are linearly associated:

(2)  $X_{2i} = \lambda X_{1i} + v_i$.

The degree to which X_1 and X_2 are linearly associated depends on the terms v_i. If the v_i are relatively large, then most of X_2 constitutes new information; if most of X_2 is a linear combination of X_1, then X_2 provides little new information. Let's see how this relationship affects the OLS estimates and standard errors. First we make some reasonable assumptions about the term we have designated new information, v_i:

$\sum v_i = 0; \quad \sum v_i^2 \neq 0; \quad \text{and} \quad \sum x_{1i} v_i = 0$.

None of these assumptions is troubling. The first follows directly from the fact that deviations sum to zero, the second simply says that not all v_i can be zero (if they were, we would have perfect multicollinearity), and the third says that this new information is independent of the variable X_1.
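The effect on the standard errors can be sketched numerically. The simulation below is an illustration, not part of the original notes: it generates X_2 = lambda*X_1 + v_i for two cases (v_i relatively large versus v_i small), fits OLS on each, and compares the estimated standard error of beta_2. The sample size, coefficient values, and the helper name se_beta2 are all assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)

def se_beta2(v_scale):
    """Standard error of the OLS estimate of beta_2 when
    X2 = lambda*X1 + v, with v drawn at scale v_scale.
    A smaller v_scale means X2 carries less new information."""
    v = rng.normal(scale=v_scale, size=n)
    v -= v.mean()                      # enforce sum(v_i) = 0
    x2 = 2.0 * x1 + v                  # equation (2) with lambda = 2
    X = np.column_stack([np.ones(n), x1, x2])
    y = 1.0 + 0.5 * x1 + 0.5 * x2 + rng.normal(size=n)  # equation (1)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - 3)   # unbiased error-variance estimate
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return np.sqrt(cov[2, 2])          # standard error of beta_2

se_large_v = se_beta2(1.0)    # substantial new information in X2
se_small_v = se_beta2(0.05)   # X2 nearly a linear combination of X1
print(se_small_v > se_large_v)  # near-collinearity inflates the SE
```

As the v_i shrink toward zero, X_2 approaches an exact linear combination of X_1 and the standard error of beta_2 blows up, which is the central point of the example above.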