# Econ103_spring11_lec9 - ECON 103 Lecture 9 Multiple...

ECON 103, Lecture 9: Multiple Regression II
Maria Casanova
April 28 (version 0)

## 1. Introduction

General multiple regression model:

Y_i = β_0 + β_1 X_1i + β_2 X_2i + ... + β_k X_ki + u_i

We have:

- k regressors: X_1i, X_2i, ..., X_ki
- k slope coefficients (parameters): β_1, β_2, ..., β_k

Each slope coefficient β_j measures the effect of a one-unit change in the corresponding regressor X_ji, holding all else (i.e. the other regressors) constant. β_0 is the intercept, as before. The regression error u_i still captures omitted variables (though hopefully fewer of them, since we are including more regressors).

Outline:

- OLS assumptions for multiple regression
  - New assumption: no perfect multicollinearity between the regressors
- Estimation
- Formally adding % Still Learning English to our regression of Test Scores on STR
- Dummy variables in multiple regression
- Measures of goodness of fit
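The model above can be estimated by ordinary least squares. A minimal sketch in Python, using simulated data (the true coefficients 2.0, 1.5, and -0.5 are made up for illustration, not from the lecture):

```python
import numpy as np

# Simulate data for Y_i = b0 + b1*X_1i + b2*X_2i + u_i
# with (b0, b1, b2) = (2.0, 1.5, -0.5), chosen arbitrarily.
rng = np.random.default_rng(0)
n = 500
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
u = rng.normal(size=n)               # regression error
Y = 2.0 + 1.5 * X1 - 0.5 * X2 + u

# Design matrix: a column of ones for the intercept, then the regressors.
X = np.column_stack([np.ones(n), X1, X2])

# OLS estimates via least squares.
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_hat)  # close to [2.0, 1.5, -0.5]
```

With n = 500 observations the estimates land close to the true coefficients; each additional regressor simply adds a column to the design matrix.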

## 2. OLS Assumptions for Multiple Regression

As in the simple regression model, we need to make some assumptions in order to estimate the coefficients β_0, β_1, ..., β_k. The first three are very similar to our previous set of assumptions.

1. E(u_i | X_1i = x_1i, X_2i = x_2i, ..., X_ki = x_ki) = 0. In words, the expectation of u_i is zero regardless of the values of the k regressors.
2. (X_1i, X_2i, ..., X_ki, Y_i) are independently and identically distributed (i.i.d.). This holds with random sampling.
3. (X_1i, X_2i, ..., X_ki, Y_i) have finite fourth moments. That is, large outliers are unlikely (which is generally true in economic data).

We also need a fourth assumption in the multiple regression model, which addresses how the various X_ji's are related to each other.

4. The regressors (X_1i, X_2i, ..., X_ki) are not perfectly multicollinear. This means that none of the regressors can be written as a perfect linear function of the other regressors.

Assumption 4 is rarely violated in practice, and when it is, it is typically by accident. However, if the correlation between any two regressors is "high", that will also be problematic. (We will discuss this later.)
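Assumption 1 can be illustrated with a quick simulation (hypothetical data, purely for intuition): when the error is drawn independently of the regressors, its average is close to zero in any subsample defined by the regressors' values.

```python
import numpy as np

# Assumption 1: E(u_i | X_1i, ..., X_ki) = 0, i.e. the error has mean zero
# at every value of the regressors. Here u is drawn independently of X1,
# so the assumption holds by construction.
rng = np.random.default_rng(1)
n = 100_000
X1 = rng.normal(size=n)
u = rng.normal(size=n)

# The mean of u should be near zero in both the low-X1 and high-X1 groups.
low = u[X1 < 0].mean()
high = u[X1 >= 0].mean()
print(low, high)  # both close to 0
```

If instead u were correlated with X1 (say, an omitted variable related to X1), these two conditional means would differ systematically, and OLS would be biased.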
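A small sketch of why perfect multicollinearity is fatal for estimation (hypothetical data; the construction X3 = 2·X1 + X2 is made up for illustration): the design matrix loses a rank, so the OLS coefficients are not separately identified.

```python
import numpy as np

# Perfect multicollinearity: one regressor is an exact linear function
# of the others. Here X3 = 2*X1 + X2.
rng = np.random.default_rng(2)
n = 200
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
X3 = 2 * X1 + X2                     # perfect linear combination

# Design matrix with intercept column plus the three regressors.
X = np.column_stack([np.ones(n), X1, X2, X3])

# The matrix has 4 columns but rank 3: the X3 column adds no information,
# so (X'X) is singular and the OLS formula (X'X)^{-1} X'Y breaks down.
print(np.linalg.matrix_rank(X))  # 3, not 4
```

In practice, software either drops one of the offending regressors or refuses to run; the fix is to remove the redundant variable.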

Example of perfect multicollinearity and why it is a problem: we have a sample of grades (Y_i), and we interview the students in order to measure the number of hours that they studied for the final exam (X_1i).