Lecture 12 slides (Multicollinearity)


Multicollinearity

We can think about the problem of multicollinearity from two perspectives:
- the difficulty caused by shared explanatory power between two covariates
- the properties of the variance-covariance estimate of $\hat{\beta}$, summarized via an eigenvalue decomposition

Lecture 12 – p. 1/32

Multicollinearity

Partition the $X$ matrix into columns:
$$X = \begin{bmatrix} \mathbf{1} & X_1 & X_2 & \cdots & X_p \end{bmatrix}$$
Now center and scale each column by subtracting the mean $\bar{x}_j$ and dividing by
$$\sqrt{S_{jXX}} = \sqrt{\sum_{i=1}^{n} (x_{ij} - \bar{x}_j)^2}$$

Lecture 12 – p. 2/32

Multicollinearity

Centering and scaling the columns means that $X^t X$ becomes a simple correlation matrix, i.e.
$$X^t X = \begin{bmatrix}
1 & 0 & 0 & \cdots & 0 \\
0 & 1 & r_{12} & \cdots & r_{1p} \\
0 & r_{21} & 1 & \cdots & r_{2p} \\
\vdots & \vdots & & \ddots & \vdots \\
0 & r_{p1} & r_{p2} & \cdots & 1
\end{bmatrix}$$
$r_{jk}$ is the correlation between the $j$-th and $k$-th covariates (an empirical correlation, since the $X$ variables are not considered random)

Lecture 12 – p. 3/32

Multicollinearity

We can remove the column of ones from the design matrix and write $y = \beta_0 \mathbf{1} + X^{*} \beta^{*} + \varepsilon$, which gives:
$$X^{*t} X^{*} = \begin{bmatrix}
1 & r_{12} & r_{13} & \cdots & r_{1p} \\
r_{21} & 1 & r_{23} & \cdots & r_{2p} \\
r_{31} & r_{32} & 1 & \cdots & r_{3p} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
r_{p1} & r_{p2} & r_{p3} & \cdots & 1
\end{bmatrix}$$
Either form can be used to describe collinearity, depending on the context.

Lecture 12 – p. 4/32

Multicollinearity

From a practical standpoint, it is clear why having two highly related covariates will cause problems in the regression.
Imagine regressing someone's weight on their height in inches (measured on Monday) and their height in centimetres (measured on Wednesday).
The two height measurements will be very close to each other. Writing the regression equation as:
$$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \varepsilon_i$$
we see that this model will be very close to:
$$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 (2.54)(x_{i1} + \delta_i) + \varepsilon_i$$

Lecture 12 – p. 5/32

Multicollinearity

It will be very difficult to estimate $\beta_1$ and $\beta_2$, as they measure roughly the same relationship between height and weight (just scaled differently, with a tiny bit of measurement error $\delta_i$).
In fact, if $\delta_i = 0$, we get
$$y_i = \beta_0 + (\beta_1 + 2.54\,\beta_2)\, x_{i1} + \varepsilon_i$$
$\beta_1$ and $\beta_2$ are non-identifiable in this case: we can vary $\beta_1$ by some amount and, as long as we make the compensating adjustment to $\beta_2$, the predicted values will not change.
Therefore, the data carry very little information about the parameters individually, only about the combination $\beta_1 + 2.54\,\beta_2$.

Lecture 12 – p. 6/32

Multicollinearity

Conversely, this says that two covariates should be as linearly independent as possible (i.e. ...
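The centering-and-scaling step on slides 2–4 is easy to verify numerically. The following is a minimal sketch, assuming numpy; the sample size, dimension, and seed are invented for illustration and are not from the slides. It confirms that after subtracting $\bar{x}_j$ and dividing by $\sqrt{S_{jXX}}$, the cross-product matrix $X^{*t}X^{*}$ is exactly the matrix of pairwise correlations $r_{jk}$:

```python
# Minimal numerical check of the centering-and-scaling claim (invented data).
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))             # raw covariates, no column of ones

Xc = X - X.mean(axis=0)                 # subtract each column mean x_bar_j
scale = np.sqrt((Xc ** 2).sum(axis=0))  # sqrt(S_jXX) for each column
Xs = Xc / scale                         # the centered-and-scaled X*

R = Xs.T @ Xs                           # claimed to be the correlation matrix
print(np.allclose(R, np.corrcoef(X, rowvar=False)))  # True
print(np.round(np.diag(R), 3))          # all 1's on the diagonal
```

Note that the scaling divides by the root sum of squares of the centered column rather than by the sample standard deviation; that choice is exactly what makes the cross-products come out as correlations with no extra factor of $n-1$.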
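The height example on slides 5–6 can likewise be simulated. Below is a sketch, again assuming numpy, in which the sample size, true coefficients, and noise levels are all made-up values rather than anything from the slides. It shows both symptoms the slides describe: the individual coefficient estimates are erratic while the combination $\beta_1 + 2.54\,\beta_2$ is well determined, and the eigenvalue decomposition mentioned on slide 1 flags the problem through a near-zero eigenvalue of the correlation matrix:

```python
# Hypothetical simulation of the height-in-inches vs height-in-cm example.
import numpy as np

rng = np.random.default_rng(1)
n = 200
inches = rng.normal(68, 3, size=n)                     # x_i1: height in inches
cm = 2.54 * (inches + rng.normal(0, 0.05, size=n))     # x_i2: 2.54*(x_i1 + delta_i)
weight = 40 + 2.0 * inches + rng.normal(0, 5, size=n)  # true effect via inches only

X = np.column_stack([np.ones(n), inches, cm])
beta, *_ = np.linalg.lstsq(X, weight, rcond=None)      # OLS fit

print(beta[1], beta[2])              # individually erratic estimates
print(beta[1] + 2.54 * beta[2])      # ~2.0: the identifiable combination

# Eigenvalues of the 2x2 correlation matrix: one near 2, one near 0,
# i.e. the two covariates span essentially a single direction.
R = np.corrcoef(np.column_stack([inches, cm]), rowvar=False)
print(np.linalg.eigvalsh(R))
```

Rerunning with a different seed moves `beta[1]` and `beta[2]` around substantially while leaving their combination essentially unchanged, which is the non-identifiability argument of slide 6 in numerical form.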