ECON 103, Lecture 12B: Internal Validity of Multiple Regression Analysis
Maria Casanova
May 14th (version 0)

Requirements for this lecture: Chapter 9 of Stock and Watson

0. Introduction

Studies based on regression analysis are internally valid if the estimated regression coefficients:
- are unbiased,
- are consistent, and
- have associated standard errors that yield confidence intervals with the desired confidence level.

In this lecture and the next:
- We will see 5 reasons why the OLS estimator of the multiple regression coefficients might be biased.
- We will investigate the circumstances that lead to inconsistent standard errors.

Next we will examine in detail the 5 main reasons why the OLS estimator may be biased. All 5 sources of bias arise because the explanatory variable(s) is (are) correlated with the error term in the population regression, violating the first Least Squares Assumption:

Ass. 1: $E(u_i \mid X_i) = 0 \;\Rightarrow\; \mathrm{Cov}(u_i, X_i) = 0$

In none of the 5 cases will the bias go to 0 as we increase the sample size, so the OLS estimator will also be inconsistent.

The potential sources of bias are:
1. Omitted variable bias
2. Misspecification of the functional form of the regression function
3. Imprecise measurement of the independent variable (errors in variables)
4. Sample selection
5. Simultaneous causality

3. Errors-in-variables bias

The errors-in-variables bias arises when the independent variable is measured with error. Suppose that we have a model with one single regressor:

$Y_i = \beta_0 + \beta_1 X_i + u_i$    (1)

When we try to estimate the model from the data, we don't observe the variable $X$ directly, but only an imprecise measure of $X$ (let's call it $\tilde{X}$).
So the model we are estimating is:

$Y_i = \beta_0 + \beta_1 \tilde{X}_i + v_i$    (2)

Notice that equation (1) can be rewritten as:

$Y_i = \beta_0 + \beta_1 \tilde{X}_i + \beta_1 X_i - \beta_1 \tilde{X}_i + u_i$

Hence we have that:

$v_i = \beta_1 (X_i - \tilde{X}_i) + u_i$

This creates a correlation between $\tilde{X}$, the regressor in (2), and $v$, the error term in (2):

$\mathrm{Corr}(\tilde{X}_i, v_i) = \mathrm{Corr}\big(\tilde{X}_i,\; \beta_1 (X_i - \tilde{X}_i) + u_i\big) \neq 0$

Thus the OLS estimator $\hat{\beta}_1$ will be biased and inconsistent.
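The resulting bias can be seen in a short simulation (not part of the lecture; the parameter values and variable names are illustrative). With classical measurement error, the OLS slope from regressing $Y$ on $\tilde{X}$ converges to $\beta_1 \sigma_X^2 / (\sigma_X^2 + \sigma_w^2)$, so with equal variances the estimate settles near half the true slope, however large the sample:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
beta0, beta1 = 1.0, 2.0          # true coefficients (illustrative)

X = rng.normal(0.0, 1.0, n)      # true regressor, variance 1
u = rng.normal(0.0, 1.0, n)      # regression error
Y = beta0 + beta1 * X + u        # population model (1)

w = rng.normal(0.0, 1.0, n)      # classical measurement error, variance 1
X_tilde = X + w                  # observed, mismeasured regressor

def ols_slope(x, y):
    # OLS slope estimator: sample Cov(x, y) / sample Var(x)
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

b_true = ols_slope(X, Y)         # close to beta1 = 2
b_meas = ols_slope(X_tilde, Y)   # attenuated toward beta1 * 1/(1+1) = 1
print(b_true, b_meas)
```

Increasing `n` tightens both estimates around their limits but does not shrink the gap between `b_meas` and `beta1`, which is exactly the inconsistency the derivation above predicts.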