# ECO375H_Slides_4 - Lecture 4: Multiple Regression Analysis: Statistical Properties


Lecture 4: Multiple Regression Analysis: Statistical Properties. Junichi Suzuki, University of Toronto, October 1st, 2009.

**Announcement**

- Important change: the 2nd problem set is due October 16th. It will be available by the next lecture (tomorrow).
- See the comments from Rebecca (available on Blackboard).
- Information Session on Applying to Graduate School: October 15th (Th), 4:10-5:30pm, SS2118.
**Today's Goals**

- Wrap up Chapter 3
- Respond to a question from a student
- Prove the "partialling out" interpretation of multiple regression
- Statistical properties of the OLS estimator (OLSE):
  - Assumptions
  - Expected value
  - Variance
  - Efficiency (Gauss-Markov Theorem)
- In each case, consider both:
  - why the good properties hold
  - what happens if some assumptions do not hold, paying special attention to the omitted variable problem

**What OLS Estimates Imply: Question**

Consider the following regression and the corresponding sample regression:

$$\ln w = \beta_0 + \beta_1\, educ + u, \qquad \widehat{\ln w} = \hat\beta_0 + \hat\beta_1\, educ$$

$\beta_1$ represents the percent increase in salary from one additional year of education:

$$\frac{dw}{d\,educ} = \beta_1 w$$

This implies that the percent increase from two additional years of education is

$$\frac{dw}{d\,educ} = 2\beta_1 w$$

Why not $\left[(1+\beta_1)^2 - 1\right] w$?
**What OLS Estimates Imply: Answer**

Go back to the original equation:

$$w = \exp(\beta_0 + \beta_1\, educ + u)$$

$$\frac{w_1}{w_0} - 1 = \frac{\exp(\beta_0 + \beta_1(educ + 2) + u)}{\exp(\beta_0 + \beta_1\, educ + u)} - 1 = \exp(2\beta_1) - 1$$

When $\beta_1 = 0.05$:

$$\begin{cases} \exp(2\beta_1) - 1 = 10.52\% \\ 2\beta_1 = 10.00\% \\ (1+\beta_1)^2 - 1 = 10.25\% \end{cases}$$

The last two are good approximations.
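The three numbers on the slide are easy to verify directly. A minimal check (variable names are ours, chosen for illustration):

```python
import math

beta1 = 0.05  # return to one year of education, from the slide

exact = math.exp(2 * beta1) - 1     # exact two-year effect: exp(2*b1) - 1
linear = 2 * beta1                  # log-approximation: 2*b1
compound = (1 + beta1) ** 2 - 1     # compounding approximation: (1+b1)^2 - 1

print(f"exp(2*b1) - 1  = {exact:.4%}")     # 10.5171%
print(f"2*b1           = {linear:.4%}")    # 10.0000%
print(f"(1+b1)^2 - 1   = {compound:.4%}")  # 10.2500%
```

This confirms that for a small $\beta_1$ both approximations sit close to the exact value, with the compounding version slightly closer.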

**Proving the "Partialling Out" Interpretation of MR**

Consider a regression of $y$ on $x = (x_1, x_2, \ldots, x_k)$:

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_k x_k + u$$

We want to show

$$\hat\beta_j = \frac{\sum_{i=1}^n \hat r_{ij}\, y_i}{\sum_{i=1}^n \hat r_{ij}^2}$$

where $\hat r_{ij}$ is the residual from the regression of $x_j$ on the other regressors:

$$\hat x_{ij} = \hat\delta_0 + \sum_{l \neq j} \hat\delta_l x_{il}, \qquad \hat r_{ij} = x_{ij} - \hat x_{ij}$$

Note that the first-order conditions of this auxiliary regression imply

$$\sum_{i=1}^n \hat r_{ij}\, \hat x_{ij} = \sum_{i=1}^n \hat r_{ij} \left( \hat\delta_0 + \sum_{l \neq j} \hat\delta_l x_{il} \right) = 0$$
**Proving the "Partialling Out" Interpretation of MR (cont.)**

Consider the first-order condition for $\hat\beta_j$ and substitute $x_{ij} = \hat x_{ij} + \hat r_{ij}$:

$$\begin{aligned}
0 &= \sum_{i=1}^n x_{ij} \left( y_i - \hat\beta_0 - \hat\beta_1 x_{i1} - \cdots - \hat\beta_k x_{ik} \right) \\
&= \sum_{i=1}^n \left( \hat x_{ij} + \hat r_{ij} \right) \left( y_i - \hat\beta_0 - \hat\beta_1 x_{i1} - \cdots - \hat\beta_k x_{ik} \right) \\
&= \sum_{i=1}^n \left( \hat\delta_0 + \sum_{l \neq j} \hat\delta_l x_{il} \right) \hat u_i + \sum_{i=1}^n \hat r_{ij}\, y_i - \hat\beta_j \sum_{i=1}^n \hat r_{ij}\, x_{ij} \\
&= \sum_{i=1}^n \hat r_{ij}\, y_i - \hat\beta_j \sum_{i=1}^n \hat r_{ij} \left( \hat x_{ij} + \hat r_{ij} \right) \\
&= \sum_{i=1}^n \hat r_{ij}\, y_i - \hat\beta_j \sum_{i=1}^n \hat r_{ij}^2
\end{aligned}$$

The first term vanishes because the OLS residuals $\hat u_i$ are orthogonal to the constant and to every regressor; the $\hat r_{ij}$ terms involving $\hat\beta_0$ and the $x_{il}$ with $l \neq j$ drop out because $\hat r_{ij}$, being a residual from a regression on those variables and a constant, is orthogonal to all of them. Solving the last line for $\hat\beta_j$ gives the claimed formula.
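The algebra above can be checked numerically: the coefficient on $x_j$ from the full multiple regression equals the coefficient from regressing $y$ on the partialled-out residual $\hat r_j$ alone. A sketch on simulated data (the data-generating parameters are arbitrary choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 3
X = rng.normal(size=(n, k))
y = 1.0 + X @ np.array([0.5, -2.0, 3.0]) + rng.normal(size=n)

# Full multiple regression of y on a constant and all k regressors
Z = np.column_stack([np.ones(n), X])
beta_full = np.linalg.lstsq(Z, y, rcond=None)[0]

# Partialling out the first regressor (j = 1 in the slide's notation):
# regress x_j on a constant and the other regressors, keep the residuals
j = 0
others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
delta = np.linalg.lstsq(others, X[:, j], rcond=None)[0]
r = X[:, j] - others @ delta

# The slide's formula: beta_j = sum(r_i * y_i) / sum(r_i^2)
beta_j = (r @ y) / (r @ r)

print(beta_full[1], beta_j)  # the two estimates coincide
```

This equality is exact (up to floating point), not an approximation; it is the Frisch-Waugh result the slide is proving.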

**Statistical Properties of OLSE**
**Gauss-Markov Assumptions**

- MLR.1 (Linear in parameters): $y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_k x_k + u$
- MLR.2 (Random sampling): $\{(x_{i1}, x_{i2}, \ldots, x_{ik}, y_i)\}_{i=1}^n$ is a random sample from the population model
- MLR.3 (No perfect collinearity): in the sample, none of the independent variables is constant, and there are no exact linear relationships among them
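MLR.3 can be checked mechanically: OLS needs the design matrix $X$ to have full column rank, so that $X'X$ is invertible. A small illustration of how an exact linear relationship breaks this (the variables here are simulated for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 2 * x1 - x2  # exact linear combination of x1 and x2 -> MLR.3 fails

X_ok = np.column_stack([np.ones(n), x1, x2])       # constant, x1, x2
X_bad = np.column_stack([np.ones(n), x1, x2, x3])  # adds the redundant x3

# Full column rank holds for X_ok but not X_bad
print(np.linalg.matrix_rank(X_ok))   # 3: equals the number of columns
print(np.linalg.matrix_rank(X_bad))  # 3: less than 4 -> perfect collinearity
```

With `X_bad`, the normal equations have no unique solution: OLS cannot separately identify the coefficients on `x1`, `x2`, and `x3`.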


## This note was uploaded on 10/16/2009 for the course ECON ECO375, taught by Professor Junichi Suzuki during the Fall '09 term at the University of Toronto.

