Section 6 - Econ 140
GSIs: Hedvig, Tarso, Xiaoyu*

* Many thanks to previous GSIs, Edson Severnini and Raymundo M. Campos-Vazquez, as this note is based on theirs. All errors are ours.

1 Review: Multiple Regression (continued)

1.1 Set-up (reminder)

    Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + \dots + \beta_K X_{Ki} + u_i,    (1)

where \beta_k is the partial (marginal) effect of X_k on Y; in other words, it shows the change in Y resulting from a one-unit increase in X_k, holding all other regressors constant (the ceteris paribus effect). You can think of the intercept as the slope coefficient on a regressor X_{0i} that takes the value 1 for all observations.

1.2 Goodness-of-fit in the MLR model

R^2: fraction of the sample variance of Y_i explained (or predicted) by the regressors.
Formula:

    R^2 = 1 - \frac{SSR}{TSS}

In multiple regression, the R^2 increases whenever a regressor is added, unless the estimated coefficient on the added regressor is exactly zero.

Adjusted R^2 (or \bar{R}^2): modified version of the R^2 that does not necessarily increase when a new regressor is added.
Formula:

    \bar{R}^2 = 1 - \frac{N-1}{N-K-1}\,\frac{SSR}{TSS} = 1 - \frac{s^2_{\hat{u}}}{s^2_Y}

Three useful things about \bar{R}^2:
1. \frac{N-1}{N-K-1} is always greater than 1 ⇒ \bar{R}^2 is always less than R^2.
2. Adding a regressor has two opposite effects on \bar{R}^2: SSR falls, which increases it, but \frac{N-1}{N-K-1} increases, which decreases it.
3. \bar{R}^2 can be negative: this happens when the regressors, taken together, reduce the sum of squared residuals by such a small amount that the reduction fails to offset the factor \frac{N-1}{N-K-1}.

The standard error of the regression is

    SER = \sqrt{\frac{SSR}{N-K-1}},

where K is the number of explanatory variables and the additional 1 in the denominator accounts for the estimated constant.

1.3 Multicollinearity

1.3.1 Definition

Presence of linear correlation between the explanatory variables.

1.3.2 Imperfect multicollinearity

Imperfect multicollinearity arises when one of the regressors is very highly correlated - but not perfectly correlated - with the other regressors. In other words, there is a linear function of the regressors that is highly correlated with another regressor.
Consequence: higher SE(\hat{\beta}_{OLS}) ⇒ it is easier to fail to reject the null hypothesis.
Remark: Imperfect multicollinearity does not pose any problem for the theory of the OLS estimators. Indeed, one purpose of OLS is to sort out the independent influences of the various regressors when these regressors are potentially correlated.

1.3.3 Extreme case: perfect collinearity

Perfect collinearity arises when one of the regressors is perfectly correlated with the other regressors. In other words, one regressor can be written as an exact linear function of the others.
Consequence: Let Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + \dots + \beta_K X_{Ki} + u_i be the regression model, and suppose, for example, that X_{1i} = 1 + X_{2i} + X_{3i} + \dots + X_{Ki}. Then

    ⇒ Y_i = \beta_0 + \beta_1 (1 + X_{2i} + X_{3i} + \dots + X_{Ki}) + \beta_2 X_{2i} + \dots + \beta_K X_{Ki} + u_i
    ⇒ Y_i = (\beta_0 + \beta_1) + (\beta_1 + \beta_2) X_{2i} + \dots + (\beta_1 + \beta_K) X_{Ki} + u_i
    ⇒ Y_i = \gamma ...
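To make the contrast between Sections 1.3.2 and 1.3.3 concrete, here is a minimal numerical sketch in Python (our own illustration with simulated data; the variable names, sample size, and coefficient values are arbitrary). With an exactly collinear regressor, X'X is singular and OLS cannot be computed; with a nearly collinear regressor, OLS works but the standard errors are inflated.

```python
# Illustrative sketch (simulated data): perfect vs. imperfect multicollinearity.
import numpy as np

np.random.seed(1)
N = 200
X1 = np.random.normal(size=N)

# Perfect collinearity: X2 is an exact linear function of the constant and X1,
# so X'X has rank 2 instead of 3 and cannot be inverted.
X2_perfect = 1 + 2 * X1
X_perfect = np.column_stack([np.ones(N), X1, X2_perfect])
print("rank of X'X under perfect collinearity:",
      np.linalg.matrix_rank(X_perfect.T @ X_perfect))    # 2, not 3

# Imperfect (near-perfect) collinearity: X2 is X1 plus a little noise.
X2_near = X1 + 0.01 * np.random.normal(size=N)
Y = 1.0 + 0.5 * X1 + 0.5 * X2_near + np.random.normal(size=N)
X = np.column_stack([np.ones(N), X1, X2_near])

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)          # OLS still computable
resid = Y - X @ beta_hat
s2 = resid @ resid / (N - 3)                          # estimated Var(u)
se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))    # homoskedastic SEs
print("SE(beta_1_hat) with a nearly collinear regressor:", se[1])
```

Replacing X2_near with a regressor drawn independently of X1 and rerunning the last block shows SE(beta_1_hat) falling sharply, which is the "higher SE ⇒ easier to fail to reject" consequence noted in Section 1.3.2.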
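Similarly, the goodness-of-fit formulas in Section 1.2 can be checked by hand. The sketch below (again our own, with simulated data and arbitrary names) computes R^2, \bar{R}^2, and the SER directly from SSR and TSS for a regression with K = 2 regressors.

```python
# Illustrative sketch (simulated data): R^2, adjusted R^2, and SER by hand.
import numpy as np

np.random.seed(0)
N, K = 200, 2                               # sample size, number of regressors
X1 = np.random.normal(size=N)
X2 = np.random.normal(size=N)
Y = 1.0 + 0.5 * X1 - 0.3 * X2 + np.random.normal(size=N)

X = np.column_stack([np.ones(N), X1, X2])   # design matrix including constant
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
resid = Y - X @ beta_hat

SSR = np.sum(resid ** 2)                    # sum of squared residuals
TSS = np.sum((Y - Y.mean()) ** 2)           # total sum of squares

R2 = 1 - SSR / TSS
R2_adj = 1 - (N - 1) / (N - K - 1) * SSR / TSS
SER = np.sqrt(SSR / (N - K - 1))
print(f"R^2 = {R2:.3f}, adjusted R^2 = {R2_adj:.3f}, SER = {SER:.3f}")
```

Appending a pure-noise regressor to X (and setting K = 3) will weakly raise R^2 but can lower \bar{R}^2, which is point 2 in the list of useful facts above.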