
# ETW2410 Week 3 (Chp 3) - Multiple Regression Analysis



### Estimation

The population model with $k$ explanatory variables is

$$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_k x_{ik} + u_i$$
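Concretely, the OLS estimates solve the normal equations $(X'X)\hat\beta = X'y$. Below is a minimal pure-Python sketch for the case $k = 2$; the data, variable names, and values are all hypothetical, chosen only for illustration:

```python
# Hypothetical data: n = 5 observations, k = 2 regressors.
x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [2.0, 1.0, 4.0, 3.0, 5.0]
y = [3.1, 4.2, 8.0, 8.9, 12.1]
n = len(y)

# Design matrix columns: a constant, x1, and x2.
X = [[1.0, x1[i], x2[i]] for i in range(n)]
p = 3  # k + 1 parameters

# Build the normal equations (X'X) beta = X'y.
XtX = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(p)] for r in range(p)]
Xty = [sum(X[i][r] * y[i] for i in range(n)) for r in range(p)]

# Solve by Gauss-Jordan elimination (fine for this tiny, well-conditioned system).
A = [row[:] + [rhs] for row, rhs in zip(XtX, Xty)]
for col in range(p):
    piv = A[col][col]
    for j in range(col, p + 1):
        A[col][j] /= piv
    for r in range(p):
        if r != col:
            f = A[r][col]
            for j in range(col, p + 1):
                A[r][j] -= f * A[col][j]

beta = [A[r][p] for r in range(p)]
print(beta)  # [beta0_hat, beta1_hat, beta2_hat]
```

In practice one would use a library routine (e.g. `numpy.linalg.lstsq` or `statsmodels`); the hand-rolled normal equations are shown only because they make the $k+1$ first-order conditions explicit.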


### Learning Objectives

At the end of this chapter, students will be able to:

1. identify the advantages of multiple regression over simple linear regression;
2. estimate and interpret the parameters of the multiple regression model using the method of ordinary least squares (OLS);
3. identify the statistical properties of the OLS estimators, including the concepts of unbiasedness and efficiency.
### Parallels with Simple Regression

- $\beta_0$ is still the intercept.
- $\beta_1$ to $\beta_k$ are all called slope parameters.
- $u$ is still the error term (or disturbance).
- We still need a zero conditional mean assumption, now $E(u \mid x_1, x_2, \ldots, x_k) = 0$, so that $u$ is uncorrelated with each of the $k$ regressors.
- OLS still minimizes the sum of squared residuals, which now yields $k+1$ first-order conditions: the sample analogues of $E(u) = 0$ (one condition) and $E(x_j u) = 0$ for $j = 1, 2, \ldots, k$ ($k$ conditions).
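In sample terms, minimizing the sum of squared residuals with respect to the $k+1$ coefficients gives these $k+1$ first-order conditions:

```latex
\sum_{i=1}^{n}\bigl(y_i - \hat\beta_0 - \hat\beta_1 x_{i1} - \cdots - \hat\beta_k x_{ik}\bigr) = 0, \\
\sum_{i=1}^{n} x_{ij}\bigl(y_i - \hat\beta_0 - \hat\beta_1 x_{i1} - \cdots - \hat\beta_k x_{ik}\bigr) = 0,
\qquad j = 1, \ldots, k.
```

The first condition forces the residuals to sum to zero; the remaining $k$ force the residuals to be uncorrelated (in sample) with each regressor.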


### Interpreting Multiple Regression

The fitted equation is

$$\hat{y} = \hat\beta_0 + \hat\beta_1 x_1 + \hat\beta_2 x_2 + \cdots + \hat\beta_k x_k,$$

so

$$\Delta\hat{y} = \hat\beta_1 \Delta x_1 + \hat\beta_2 \Delta x_2 + \cdots + \hat\beta_k \Delta x_k.$$

Holding $x_2, \ldots, x_k$ fixed ($\Delta x_2 = \cdots = \Delta x_k = 0$) implies that

$$\Delta\hat{y} = \hat\beta_1 \Delta x_1,$$

so each $\hat\beta_j$ has a *ceteris paribus* interpretation: the predicted effect on $y$ of a one-unit change in $x_j$, holding all other $x$'s constant.
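The ceteris paribus reading can be checked directly from the fitted equation. A tiny sketch with made-up coefficient estimates (the values of `b0`, `b1`, `b2` are hypothetical, not taken from any real regression):

```python
# Hypothetical fitted coefficients from some k = 2 regression.
b0, b1, b2 = 0.5, 1.2, 0.9

def predict(x1, x2):
    """Fitted value y_hat = b0 + b1*x1 + b2*x2."""
    return b0 + b1 * x1 + b2 * x2

# Raise x1 by one unit while holding x2 fixed.
base = predict(3.0, 4.0)
bumped = predict(4.0, 4.0)
print(bumped - base)  # the predicted effect equals b1
```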
### A "Partialling Out" Interpretation

Consider the case where $k = 2$, i.e.

$$\hat{y} = \hat\beta_0 + \hat\beta_1 x_1 + \hat\beta_2 x_2.$$

Then

$$\hat\beta_1 = \frac{\sum_{i=1}^{n} \hat r_{i1} y_i}{\sum_{i=1}^{n} \hat r_{i1}^2},$$

where the $\hat r_{i1}$ are the residuals from the estimated regression of $x_1$ on $x_2$, $\hat x_{i1} = \hat\gamma_0 + \hat\gamma_1 x_{i2}$.


### "Partialling Out" Continued

The previous equation implies that regressing $y$ on $x_1$ and $x_2$ gives the same coefficient on $x_1$ as regressing $y$ on the residuals from a regression of $x_1$ on $x_2$. In other words, two-step estimation (two simple regressions) gives the same estimate as multiple regression estimation. Only the part of $x_{i1}$ that is uncorrelated with $x_{i2}$ is being related to $y_i$, so we are estimating the effect of $x_1$ on $y$ after $x_2$ has been "partialled out".

What is the relevance of this result? The residuals $\hat r_{i1}$ are the part of $x_{i1}$ that is uncorrelated with $x_{i2}$; equivalently, they are $x_{i1}$ after the effects of $x_{i2}$ have been partialled out, or netted out. Thus $\hat\beta_1$ measures the sample relationship between $y$ and $x_1$ after $x_2$ has been partialled out.
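The two-step equivalence can be verified numerically with nothing more than two simple regressions. A pure-Python sketch on made-up data (all names and values hypothetical):

```python
# Hypothetical data for a two-regressor model.
x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [2.0, 1.0, 4.0, 3.0, 5.0]
y = [3.1, 4.2, 8.0, 8.9, 12.1]
n = len(y)

def simple_ols(x, z):
    """Intercept and slope from a simple regression of z on x."""
    xbar = sum(x) / len(x)
    zbar = sum(z) / len(z)
    slope = (sum((xi - xbar) * (zi - zbar) for xi, zi in zip(x, z))
             / sum((xi - xbar) ** 2 for xi in x))
    return zbar - slope * xbar, slope

# Step 1: regress x1 on x2 and keep the residuals r_hat.
g0, g1 = simple_ols(x2, x1)
r_hat = [x1[i] - (g0 + g1 * x2[i]) for i in range(n)]

# Step 2: regress y on r_hat. No intercept is needed because the
# residuals sum to zero, so beta1_hat = sum(r*y) / sum(r^2).
beta1_hat = sum(r_hat[i] * y[i] for i in range(n)) / sum(r ** 2 for r in r_hat)
print(beta1_hat)  # matches the coefficient on x1 from the full multiple regression
```

By construction `r_hat` sums to zero and is uncorrelated with `x2` in sample, which is exactly why `beta1_hat` picks up only the part of $x_1$ not explained by $x_2$.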



