Chapter 3

Contents

1  OLS and multiple regression
   1.1  OLS estimators
2  Multicollinearity
   2.1  Estimator properties
   2.2  Multicollinearity
3  Testing
   3.1  t tests
   3.2  F tests

1  OLS and multiple regression

1.1  OLS estimators

So far we have used simple regression: one regressor. It will usually be the case that we think multiple variables help explain the dependent variable:

\[
Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + \dots + \beta_k X_{ki} + u_i \qquad \text{(true model)}
\]

This gives k parameters to estimate.

Model specification

With many potential explanatory variables, we must decide which to include on the right-hand side of our regression model. We will deal with this later. For this chapter, we assume that our assumption regarding the form of the true model is correct, i.e. we know which regressors should be included.

Example: explaining individual earnings

We might think that work experience as well as education helps determine earnings:

\[
\text{EARNINGS}_i = \beta_1 + \beta_2 S_i + \beta_3 \text{EXP}_i + u_i \tag{3.1}
\]

EARNINGS is hourly earnings and S is highest grade completed, as before; EXP is years spent working after education.

Just as with simple regression, we have OLS estimators of our \(\beta_i\) parameters. The idea is conceptually identical to before: the OLS estimators \(b_i\) of the (assumed) true model parameters \(\beta_i\) are those that minimise the residual sum of squares RSS:

\[
\text{RSS} = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( \hat{Y}_i - Y_i \right)^2 = \sum_{i=1}^{n} \left( [\,b_1 + b_2 X_{2i} + \dots + b_k X_{ki}\,] - Y_i \right)^2
\]

Minimising the RSS with respect to the k (unknown) \(b_i\) coefficients gives us k first-order conditions:

\[
\frac{\partial\, \text{RSS}}{\partial b_1} = 0, \quad \frac{\partial\, \text{RSS}}{\partial b_2} = 0, \quad \dots, \quad \frac{\partial\, \text{RSS}}{\partial b_k} = 0
\]

We have k equations in k unknowns (the \(b_i\)), so we can solve for the \(b_i\) to get our OLS estimators in terms of the X and Y sample data.

IMPORTANT: while the principle underlying the derivation of OLS is unchanged when we move from simple to multiple regression, the expressions for the estimators are NOT the same. For example, with two regressors \(X_2\) and \(X_3\), it is not the case that \(b_2\) is the same as before:

\[
b_2 \neq \frac{\sum_{i=1}^{n} (X_{2i} - \bar{X}_2)(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_{2i} - \bar{X}_2)^2}
\]

For the actual \(b_2\) expression with two regressors, see equation (3.11) in the book (a standard version of it is sketched below). It is nasty, and you do not need to know it. The calculation is better done using matrix algebra, which you don't need to know either; a short numerical sketch of that route also follows below.

Why do our OLS estimator expressions differ?...
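For completeness, here is a standard form of the two-regressor result, writing lowercase letters for deviations from sample means. This should correspond to the book's equation (3.11), though the notation there may differ, so treat it as a reference sketch rather than a quotation:

```latex
% Two-regressor OLS slope estimator, in deviations-from-means form.
% Notation (an assumption, to be checked against the book's eq. (3.11)):
%   x_{2i} = X_{2i} - \bar{X}_2,  x_{3i} = X_{3i} - \bar{X}_3,  y_i = Y_i - \bar{Y}
\[
b_2 =
\frac{\left(\sum_i x_{2i} y_i\right)\left(\sum_i x_{3i}^2\right)
    - \left(\sum_i x_{3i} y_i\right)\left(\sum_i x_{2i} x_{3i}\right)}
     {\left(\sum_i x_{2i}^2\right)\left(\sum_i x_{3i}^2\right)
    - \left(\sum_i x_{2i} x_{3i}\right)^2}
\]
```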
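The matrix-algebra route mentioned above can be sketched numerically. Below is a minimal illustration using NumPy and simulated data for the earnings model (3.1); the data-generating process and the "true" parameter values are invented for this example and are not taken from the book:

```python
# A minimal sketch of multiple-regression OLS for the earnings example,
# using simulated data (hypothetical, not the book's dataset) and the
# matrix-algebra solution of the normal equations.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Invented data-generating process: the "true" parameters (2.0, 1.5, 0.4)
# are chosen arbitrarily for illustration.
S = rng.integers(8, 21, size=n)          # highest grade completed
EXP = rng.uniform(0, 30, size=n)         # years of work experience
u = rng.normal(0, 5, size=n)             # disturbance term
EARNINGS = 2.0 + 1.5 * S + 0.4 * EXP + u

# Design matrix: a column of ones for the intercept, then the regressors.
X = np.column_stack([np.ones(n), S, EXP])

# Solving the k first-order conditions dRSS/db_i = 0 amounts to solving
# the normal equations (X'X) b = X'Y.
b = np.linalg.solve(X.T @ X, X.T @ EARNINGS)
print(b)  # roughly (2.0, 1.5, 0.4), up to sampling noise
```

Solving \((X'X)b = X'Y\) is exactly the matrix form of the k first-order conditions above, which is why the matrix approach scales painlessly to any number of regressors.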
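Finally, a numerical sketch of why the simple-regression expression for \(b_2\) no longer applies: when \(X_2\) and \(X_3\) are correlated, regressing Y on \(X_2\) alone attributes part of \(X_3\)'s effect to \(X_2\), so the two formulas give different answers. Again, all numbers here are invented:

```python
# Demonstration that the simple-regression formula for b2 differs from the
# multiple-regression estimate when the regressors are correlated.
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Invented data: X3 is deliberately correlated with X2.
X2 = rng.normal(0, 1, size=n)
X3 = 0.8 * X2 + rng.normal(0, 0.6, size=n)
Y = 1.0 + 2.0 * X2 + 3.0 * X3 + rng.normal(0, 1, size=n)

# The simple-regression formula applied to X2 alone.
b2_simple = ((X2 - X2.mean()) * (Y - Y.mean())).sum() / ((X2 - X2.mean()) ** 2).sum()

# Multiple regression of Y on X2 and X3 via the normal equations.
X = np.column_stack([np.ones(n), X2, X3])
b2_multiple = np.linalg.solve(X.T @ X, X.T @ Y)[1]

print(round(b2_simple, 2))    # well above 2: it absorbs part of X3's effect
print(round(b2_multiple, 2))  # close to the true value 2.0
```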