Econ 399 Chapter 3b

3.2 OLS Fitted Values and Residuals
- After obtaining OLS estimates, we can then obtain fitted or predicted values for y:

(3.20)  \hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_{i1} + \hat{\beta}_2 x_{i2} + \cdots + \hat{\beta}_k x_{ik}

- Given our actual and predicted values for y, as in the simple regression case we can obtain residuals:

(3.21)  \hat{u}_i = y_i - \hat{y}_i

- A positive \hat{u}_i indicates underprediction (y_i > \hat{y}_i) and a negative \hat{u}_i indicates overprediction (y_i < \hat{y}_i).
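As a minimal sketch, the fitted values (3.20) and residuals (3.21) can be computed with NumPy; the data below are simulated purely for illustration, and all variable names are made up:

```python
import numpy as np

# Simulated data: n = 50 observations, k = 2 regressors (illustrative only)
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# Design matrix with a column of ones for the intercept
X = np.column_stack([np.ones(n), x1, x2])

# OLS estimates via least squares
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Fitted values (3.20) and residuals (3.21)
y_hat = X @ beta_hat
u_hat = y - y_hat
```

Observations with positive `u_hat` are underpredicted (y above the fitted line) and those with negative `u_hat` are overpredicted, matching the text.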
3.2 OLS Fitted Values and Residuals
- We can extend the single-variable case to obtain important properties for fitted values and residuals:
1) The sample average of the residuals is zero, therefore \bar{y} = \bar{\hat{y}}.
2) The sample covariance between each independent variable and the OLS residuals is zero.
- Therefore the sample covariance between the OLS fitted values and the OLS residuals is zero,
- since the fitted values come from our independent variables and OLS estimates.
3.2 OLS Fitted Values and Residuals
3) The point (\bar{x}_1, \bar{x}_2, \ldots, \bar{x}_k, \bar{y}) is always on the OLS regression line:

\bar{y} = \hat{\beta}_0 + \hat{\beta}_1 \bar{x}_1 + \hat{\beta}_2 \bar{x}_2 + \cdots + \hat{\beta}_k \bar{x}_k

Notes: These properties come from the FOC's in (3.13):
- the first FOC says that the sum of the residuals is zero and proves (1)
- the rest of the FOC's imply zero covariance between the independent variables and \hat{u} (2)
- (3) follows from (1)
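The three properties above can be checked numerically. A sketch with simulated data (all names and coefficients are illustrative, not from the text):

```python
import numpy as np

# Simulated data for illustration
rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 0.5 + 1.5 * x1 + 2.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])          # design matrix with intercept
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS estimates
y_hat = X @ beta_hat
u_hat = y - y_hat

# (1) the residuals average to zero, so mean(y) = mean(y_hat)
avg_resid = u_hat.mean()

# (2) each regressor (and hence the fitted values) has zero
#     sample covariance with the residuals
cross = X.T @ u_hat

# (3) the point of sample means lies on the OLS regression line
line_at_means = beta_hat @ np.array([1.0, x1.mean(), x2.mean()])
```

Here `avg_resid` and every entry of `cross` come out as numerical zeros, and `line_at_means` equals the sample mean of y, confirming (1)–(3).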
3.2 “Partialling Out”
- In multiple regression analysis, we don’t need explicit formulas to obtain the OLS estimates of \beta_j.
- However, explicit formulas can reveal interesting properties.
- In the 2 independent variable case:

(3.22)  \hat{\beta}_1 = \frac{\sum_i \hat{r}_{i1} y_i}{\sum_i \hat{r}_{i1}^2}

- where the \hat{r}_{i1} are the residuals from regressing x_1 on x_2,
- i.e., the regression: \hat{x}_1 = \hat{\delta}_0 + \hat{\delta}_1 x_2
3.2 “Partialling Out”
- The \hat{r}_{i1} are the part of x_{i1} that is uncorrelated with x_{i2}.
- \hat{r}_{i1} is equivalent to x_{i1} after x_{i2}’s effects have been “partialled out” or “netted out”.
- Thus \hat{\beta}_1 measures x_1’s effect on y after x_2 has been partialled out.
- In a regression with k variables, the residuals come from a regression of x_1 on ALL the other x’s.
- In this case \hat{\beta}_1 measures x_1’s effect on y after all the other x’s have been partialled out.
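The two-step “partialling out” recipe can be sketched in NumPy: regress x_1 on x_2, keep the residuals, and apply (3.22). The result matches the slope on x_1 from the full multiple regression. Data and names are simulated for illustration:

```python
import numpy as np

# Simulated data where x1 is correlated with x2 (illustrative only)
rng = np.random.default_rng(2)
n = 200
x2 = rng.normal(size=n)
x1 = 0.8 * x2 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

# Full multiple regression of y on (1, x1, x2)
X = np.column_stack([np.ones(n), x1, x2])
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
b1_full = beta_full[1]

# Step 1: regress x1 on (1, x2) and keep the residuals r_hat,
# the part of x1 uncorrelated with x2
Z = np.column_stack([np.ones(n), x2])
delta, *_ = np.linalg.lstsq(Z, x1, rcond=None)
r_hat = x1 - Z @ delta

# Step 2: beta1_hat = sum(r_hat * y) / sum(r_hat**2), as in (3.22)
b1_partial = (r_hat @ y) / (r_hat @ r_hat)
```

`b1_full` and `b1_partial` agree to machine precision, which is exactly the partialling-out result stated above.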
3.2 Comparing Simple and Multiple