# ECON 103, Lecture 12B: Internal Validity of Multiple Regression Analysis

*Maria Casanova, May 14th (version 0)*

Requirements for this lecture: Chapter 9 of Stock and Watson.

## 0. Introduction

Studies based on regression analysis are internally valid if the estimated regression coefficients:

- are unbiased,
- are consistent, and
- have associated standard errors that yield confidence intervals with the desired confidence level.

In this lecture and the next:

- We will see 5 reasons why the OLS estimator of the multiple regression coefficients might be biased.
- We will investigate the circumstances that lead to inconsistent standard errors.

All 5 sources of bias arise because the explanatory variable(s) is (are) correlated with the error term in the population regression, violating the first Least Squares Assumption:

$$\text{Ass. 1:} \quad E(\varepsilon_i \mid X_i) = 0 \;\Longrightarrow\; \mathrm{Cov}(\varepsilon_i, X_i) = 0$$

In none of the 5 cases will the bias go to 0 as we increase the sample size, so the OLS estimator will also be inconsistent.

The potential sources of bias are:

1. Omitted variable bias
2. Misspecification of the functional form of the regression function
3. Imprecise measurement of the independent variable ("errors in variables")
4. Sample selection
5. Simultaneous causality

## 3. Errors-in-variables bias

The errors-in-variables bias arises when the independent variable is measured with error.

Suppose that we have a model with a single regressor:

$$Y_i = \beta_0 + \beta_1 X_i + u_i \tag{1}$$

When we estimate the model from the data, we do not observe the variable $X$ directly, but only an imprecise measure of it, which we call $\tilde{X}$. So the model we are actually estimating is:

$$Y_i = \beta_0 + \beta_1 \tilde{X}_i + \varepsilon_i \tag{2}$$

Notice that equation (1) can be rewritten as:

$$Y_i = \beta_0 + \beta_1 X_i + \beta_1 \tilde{X}_i - \beta_1 \tilde{X}_i + u_i$$

Rearranging terms, we have:

$$Y_i = \beta_0 + \beta_1 \tilde{X}_i + \beta_1 X_i - \beta_1 \tilde{X}_i + u_i$$

Hence the error term in (2) is:

$$\varepsilon_i = \beta_1 X_i - \beta_1 \tilde{X}_i + u_i$$

This creates a correlation between the regressor in (2) and $\varepsilon$, the error term in (2):

$$\mathrm{Corr}(\tilde{X}_i, \varepsilon_i) = \mathrm{Corr}(\tilde{X}_i,\; \beta_1 X_i - \beta_1 \tilde{X}_i + u_i) \neq 0$$

Thus the OLS estimator $\hat{\beta}_1$ will be biased and inconsistent.
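The bias above can be illustrated with a short simulation. The sketch below is not part of the lecture; the variable names and the chosen variances are assumptions for illustration. It uses the classical measurement-error model $\tilde{X}_i = X_i + w_i$, with noise $w_i$ independent of $X_i$ and $u_i$. Under these assumptions (Stock and Watson, Chapter 9), the OLS slope on the mismeasured regressor converges to $\beta_1 \sigma_X^2 / (\sigma_X^2 + \sigma_w^2)$, i.e. it is attenuated toward zero.

```python
# Illustrative sketch (assumed parameter values, not from the lecture):
# classical errors-in-variables with X_tilde = X + w.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
beta0, beta1 = 1.0, 2.0
sigma_x, sigma_w = 1.0, 1.0   # equal variances -> attenuation factor 1/2

x = rng.normal(0.0, sigma_x, n)            # true regressor X
u = rng.normal(0.0, 1.0, n)                # regression error u
y = beta0 + beta1 * x + u                  # model (1)
x_tilde = x + rng.normal(0.0, sigma_w, n)  # mismeasured regressor X_tilde

# OLS slope in a single-regressor model: cov(Y, regressor) / var(regressor)
slope_true = np.cov(y, x)[0, 1] / np.var(x)
slope_meas = np.cov(y, x_tilde)[0, 1] / np.var(x_tilde)
attenuation = sigma_x**2 / (sigma_x**2 + sigma_w**2)

print(f"slope using true X:    {slope_true:.3f}")   # close to beta1
print(f"slope using noisy X~:  {slope_meas:.3f}")   # close to beta1 * 1/2
print(f"predicted plim:        {beta1 * attenuation:.3f}")
```

Increasing `n` does not shrink the gap between `slope_meas` and `beta1`: the estimator is inconsistent, not merely noisy, exactly as the derivation above shows.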