# Multicollinearity notes – a simplified example


Assume that we have the following model:

(1) $Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + u_i$,

where the two explanatory variables are linearly associated:

(2) $X_{2i} = \lambda X_{1i} + v_i$.

The degree to which $X_1$ and $X_2$ are linearly associated depends upon the terms $v_i$. If the $v_i$ are relatively large, then most of $X_2$ constitutes *new information*; if most of $X_2$ is a linear combination of $X_1$, then $X_2$ provides little new information. Let's see how this relationship affects the OLS estimates and standard errors.

First we make some reasonable assumptions about the term that we've designated *new information*, $v_i$:

$\sum v_i = 0; \qquad \sum v_i^2 \neq 0; \qquad \sum x_{1i} v_i = 0$.

None of these assumptions is troubling. The first follows directly from the fact that the deviations sum to zero; the second simply says that not all $v_i$ can be zero (if they were, we would have perfect multicollinearity); and the final one says that this new information is independent of the variable $X_1$.
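The effect described above can be illustrated numerically. The following is a minimal sketch (not from the original notes) that simulates the model in (1)–(2) with NumPy and compares the estimated standard error of $\hat\beta_1$ when the $v_i$ are large versus when they are small; the coefficient values, $\lambda = 2$, and the noise scales are hypothetical choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
lam = 2.0  # hypothetical lambda in X2 = lam*X1 + v

def se_beta1(x1, x2, beta=(1.0, 0.5, -0.3), sigma_u=1.0):
    """Generate Y = b0 + b1*X1 + b2*X2 + u, fit OLS, and return
    the estimated standard error of b1 via s^2 * (X'X)^{-1}."""
    y = beta[0] + beta[1] * x1 + beta[2] * x2 + rng.normal(0, sigma_u, n)
    X = np.column_stack([np.ones(n), x1, x2])
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y
    resid = y - X @ b
    s2 = resid @ resid / (n - 3)        # unbiased error variance
    return np.sqrt(s2 * XtX_inv[1, 1])  # se of the X1 coefficient

x1 = rng.normal(0, 1, n)
# Large v: X2 carries plenty of new information
se_large_v = se_beta1(x1, lam * x1 + rng.normal(0, 1.0, n))
# Small v: X2 is almost an exact linear combination of X1
se_small_v = se_beta1(x1, lam * x1 + rng.normal(0, 0.05, n))
print(se_large_v, se_small_v)
```

As the variance of $v$ shrinks toward zero, $X_2$ approaches an exact linear function of $X_1$ and the standard error of $\hat\beta_1$ blows up, which is exactly the mechanism the notes develop.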
