lecture+9_complete - 1 Multiple Regression Model:...

Multiple Regression Model: Estimation

y = b0 + b1 x1 + b2 x2 + ... + bk xk + u

OUTLINE

1. Motivation for the Multiple Regression Model (MRM)
2. OLS Estimator in the MRM
3. Changing more than one independent variable simultaneously
4. Algebraic properties of OLS
5. Goodness of fit
6. Unbiasedness of OLS

1. MULTIPLE REGRESSION MODEL (MRM)

Model: y = b0 + b1 x1 + b2 x2 + ... + bk xk + u

y = dependent variable
x1, x2, ..., xk = k regressors / independent variables / explanatory variables
b0 = one intercept parameter
b1, b2, ..., bk = k slope parameters
So k = number of independent variables, and k + 1 = number of unknown b parameters.
u = disturbance / unobservables / error term (all factors other than x1, x2, ..., xk that affect y)

The key assumption in the MRM is:

E(u | x1, x2, ..., xk) = 0

That is, the error u is mean independent of all the independent variables (the x's): the zero conditional mean assumption.

1. MOTIVATION FOR THE MRM

There are several reasons why the MRM is more useful than the SLR:

1. We can measure the causal (ceteris paribus) effects of more than one variable.
2. Better R^2: allows us to build better models.
3. We can correct for omitted variable bias.
4. We can incorporate general (nonlinear) functional-form relationships between two variables, which allows more flexibility.

Example 1: Determinants of College GPA

Suppose that the true model in the population is:

colGPA = b0 + b1 hsGPA + b2 SAT + u, such that E(u | hsGPA, SAT) = 0

By estimating this model we can measure the ceteris paribus effects of two variables, hsGPA and SAT (as opposed to the SLR, where we could get the ceteris paribus effect of only one variable).

Suppose that, instead, we estimate the SLR:

colGPA = b0 + b1 hsGPA + v

It is clear that v = b2 SAT + u. Therefore, in general, v is not mean independent of hsGPA, because SAT scores are likely to be correlated with high school performance (hsGPA); that is, the error term v of the SLR model is correlated with the regressor hsGPA. So assumption 2 discussed in lecture 7 is violated. This implies that OLS estimation of the SLR will yield a biased estimator of b1, i.e. we have an omitted variable bias.

Example 2: A Wage Equation

Suppose that the true model in the population is:

wage = b0 + b1 educ + b2 exper + b3 exper^2 + u, such that E(u | educ, exper, exper^2) = 0

By estimating this model we can measure the ceteris paribus effects of two variables, education and experience (as opposed to the SLR, where we could get the ceteris paribus effect of only one variable). Note that the regressor exper^2 is exper * exper, so the model permits a nonlinear (quadratic) relationship between an individual's wage and her labor market experience, i.e. it allows for more flexibility. ...
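As a concrete illustration of the OLS estimator in the MRM, the following sketch fits y = b0 + b1 x1 + b2 x2 + u on simulated data. The coefficient values (1.0, 2.0, -0.5), sample size, and use of NumPy are illustrative assumptions, not from the lecture:

```python
import numpy as np

# Simulate data satisfying the zero conditional mean assumption:
# u is drawn independently of x1 and x2, so E(u | x1, x2) = 0.
rng = np.random.default_rng(0)
n = 1000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
u = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + u   # made-up "true" parameters

# Design matrix with a column of ones for the intercept b0,
# so k = 2 regressors and k + 1 = 3 unknown parameters.
X = np.column_stack([np.ones(n), x1, x2])

# OLS solves min_b ||y - Xb||^2; lstsq computes b_hat = (X'X)^{-1} X'y stably.
b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(b_hat)  # estimates should be close to the true (1.0, 2.0, -0.5)
```

With mean-independent errors, the OLS estimates land near the true parameters, which is the unbiasedness result previewed in the outline.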
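The omitted variable bias of Example 1 can also be demonstrated by simulation. All numbers below (the true coefficients and the strength of the hsGPA-SAT correlation) are made-up assumptions for illustration; the point is that the short regression's slope on hsGPA absorbs part of SAT's effect:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
hsGPA = rng.normal(3.0, 0.4, size=n)
# SAT is built to be correlated with hsGPA (hypothetical strength: 150 pts/GPA pt)
SAT = 500 + 150 * (hsGPA - 3.0) + rng.normal(0, 50, size=n)
u = rng.normal(0, 0.3, size=n)
colGPA = 1.0 + 0.4 * hsGPA + 0.001 * SAT + u  # made-up "true" model

def ols(cols, y):
    """OLS with an intercept; cols is a list of regressor arrays."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    return np.linalg.lstsq(X, y, rcond=None)[0]

b_full = ols([hsGPA, SAT], colGPA)  # MRM: slope on hsGPA near the true 0.4
b_short = ols([hsGPA], colGPA)      # SLR: v = b2*SAT + u is correlated with hsGPA
print(b_full[1], b_short[1])        # the SLR slope is biased upward
```

Here the bias equals b2 times the slope of SAT on hsGPA (0.001 * 150 = 0.15 in this simulation), so the short regression's coefficient drifts toward roughly 0.55 instead of 0.4.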
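The quadratic term in Example 2 means the marginal effect of experience is not constant: d(wage)/d(exper) = b2 + 2*b3*exper. A minimal sketch on simulated data (all coefficient values are hypothetical) shows how to estimate the model and evaluate that effect at different experience levels:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
educ = rng.uniform(8, 18, size=n)
exper = rng.uniform(0, 40, size=n)
u = rng.normal(0, 1.0, size=n)
# Hypothetical true model: returns to experience are positive but diminishing,
# so the coefficient on exper^2 is negative.
wage = 1.0 + 0.6 * educ + 0.3 * exper - 0.005 * exper**2 + u

# Include exper**2 as just another regressor: the model stays linear in parameters.
X = np.column_stack([np.ones(n), educ, exper, exper**2])
b = np.linalg.lstsq(X, wage, rcond=None)[0]

# Marginal effect of experience: b2 + 2*b3*exper, which shrinks as exper grows.
effect_at_5 = b[2] + 2 * b[3] * 5
effect_at_30 = b[2] + 2 * b[3] * 30
print(effect_at_5, effect_at_30)
```

Note the model is still estimated by ordinary OLS: "linear regression" refers to linearity in the b parameters, not in the regressors themselves.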

This note was uploaded on 02/29/2012 for the course ECONOMICS 220:322 taught by Professor Otusbo during the Spring '10 term at Rutgers.
