ECON 103, Lecture 9: Multiple Regression II
Maria Casanova
April 28 (version 0)

1. Introduction

General multiple regression model:

    Y_i = β_0 + β_1 X_1i + β_2 X_2i + ... + β_k X_ki + u_i

We have k regressors, X_1i, X_2i, ..., X_ki, and k slope coefficients (parameters), β_1, β_2, ..., β_k.

Each slope coefficient β_j measures the effect of a one-unit change in the corresponding regressor X_ji, holding everything else (in particular, the other regressors) constant.

β_0 is the intercept, as before.

The regression error u_i still captures omitted variables (but hopefully fewer of them, since we are including more regressors).

Outline:

- OLS assumptions for multiple regression
- New assumption: no perfect multicollinearity between regressors
- Estimation
- Formally adding % Still Learning English to our regression of Test Scores on STR
- Dummy variables in multiple regression
- Measures of goodness of fit

2. OLS Assumptions for Multiple Regression

As in the simple regression model, we need to make some assumptions in order to estimate the coefficients β_0, β_1, ..., β_k. The first three are very similar to our previous set of assumptions.

1. E(u_i | X_1i = x_1i, X_2i = x_2i, ..., X_ki = x_ki) = 0. In words, the expectation of u_i is zero regardless of the values of the k regressors.
2. (X_1i, X_2i, ..., X_ki, Y_i) are independently and identically distributed (i.i.d.). This holds with random sampling.
3. (X_1i, X_2i, ..., X_ki, Y_i) have finite fourth moments. That is, large outliers are unlikely (this is generally true in economic data).

We also need a fourth assumption in the multiple regression model. This fourth assumption addresses how the various X_ji's are related to each other.

4. The regressors (X_1i, X_2i, ..., X_ki) are not perfectly multicollinear.
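The general model above can be estimated by OLS just as in the simple regression case. A minimal sketch on simulated data (the sample size, true coefficients, and variable names are all illustrative assumptions, not from the lecture), using numpy's least-squares solver:

```python
import numpy as np

# Simulate data from a known two-regressor model, then recover the
# coefficients by OLS. All numbers here are illustrative.
rng = np.random.default_rng(0)
n = 5000
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
u = rng.normal(size=n)                      # regression error u_i
Y = 2.0 + 1.5 * X1 - 0.8 * X2 + u           # true model: beta = (2, 1.5, -0.8)

# Design matrix with a column of ones for the intercept beta_0.
X = np.column_stack([np.ones(n), X1, X2])
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_hat)  # estimates of (beta_0, beta_1, beta_2), close to the truth
```

With a large sample, the estimates land near the true values (2, 1.5, -0.8), which is what the assumptions below are designed to guarantee.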
Assumption 4 means that none of the regressors can be written as an exact linear function of only the other regressors. Assumption 4 is rarely violated in practice, and when it is, it is typically by accident. However, if the correlation between any two regressors is high (even if not perfect), that will also be problematic. (We will discuss this later.)

Example of perfect multicollinearity and why it is a problem: we have a sample of grades (Y_i), and we interview the students in order to measure the number of hours that they studied for the final exam (X_1i)....
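The problem can also be seen numerically. A minimal sketch (with made-up regressors, not the lecture's grades example): if one regressor is an exact linear function of the others, the design matrix loses full column rank, so X'X is singular and the OLS normal equations have no unique solution.

```python
import numpy as np

# Construct a regressor that is a perfect linear function of two others.
# The data are simulated; the specific combination 2*X1 + X2 is arbitrary.
rng = np.random.default_rng(1)
n = 100
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
X3 = 2.0 * X1 + X2           # perfectly multicollinear with X1 and X2

# Design matrix: intercept, X1, X2, X3 -> 4 columns but only rank 3.
X = np.column_stack([np.ones(n), X1, X2, X3])
rank = np.linalg.matrix_rank(X)
print(rank, X.shape[1])      # rank is 3, strictly less than 4 columns
```

Because the rank is below the number of columns, there are infinitely many coefficient vectors that fit the data equally well; OLS cannot separate the effect of X3 from the effects of X1 and X2.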
This note was uploaded on 09/23/2011 for the course ECON 103 taught by Professor Sandra Black during the Spring '07 term at UCLA.