Part 3: Least Squares Algebra
- When a column of X is regressed on X: MX = 0. (This result is fundamental!)
- How do we interpret this result in terms of residuals? When a column of X is regressed on X, we get a perfect fit and zero residuals.
- (Therefore) My = MXb + Me = Me = e. (You should be able to prove this.)
- y = Py + My, where P = X(X'X)^-1 X' = I - M, and PM = MP = 0.
- Py is the projection of y into the column space of X.
18/26
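These identities are easy to check numerically. Below is a minimal NumPy sketch (not part of the original slides) using simulated data; the names y, X, b, e, P, M mirror the slide notation, and the sample size, coefficients, and seed are arbitrary choices for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    n, K = 100, 4
    X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])  # constant plus regressors
    y = X @ np.array([1.0, 0.5, -0.3, 2.0]) + rng.normal(size=n)

    b = np.linalg.solve(X.T @ X, X.T @ y)      # least squares coefficients
    e = y - X @ b                              # residuals
    P = X @ np.linalg.inv(X.T @ X) @ X.T       # projection onto the column space of X
    M = np.eye(n) - P                          # residual maker

    print(np.allclose(M @ X, 0))               # MX = 0
    print(np.allclose(M @ y, e))               # My = e
    print(np.allclose(P @ M, 0))               # PM = 0
    print(np.allclose(P @ y + M @ y, y))       # y = Py + My

Each check should print True, confirming that My reproduces the residuals and that P and M split y into orthogonal parts.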
Part 3: Least Squares Algebra
The M Matrix
- M = I - X(X'X)^-1 X' is an n×n matrix.
- M is symmetric: M = M'.
- M is idempotent: M·M = M (just multiply it out).
- M is singular: M^-1 does not exist. (We will prove this later as a side result in another derivation.)
19/26
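A quick numerical confirmation of the three properties (again a sketch with simulated data, not from the slides; the simulated X and the seed are arbitrary):

    import numpy as np

    rng = np.random.default_rng(1)
    n, K = 50, 3
    X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
    M = np.eye(n) - X @ np.linalg.inv(X.T @ X) @ X.T

    print(np.allclose(M, M.T))          # symmetric: M = M'
    print(np.allclose(M @ M, M))        # idempotent: MM = M
    print(np.linalg.matrix_rank(M))     # rank is n - K < n, so M is singular

The printed rank is n - K (47 here), strictly less than n, which is the singularity noted on the slide.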
Part 3: Least Squares Algebra
Results when X Contains a Constant Term
- X = [1, x2, ..., xK]. The first column of X is a column of ones.
- Since X'e = 0, x1'e = 0: the residuals sum to zero.
- Define i = [1,1,...,1]', a column of n ones. Then i'y = Σi yi = nȳ.
- From y = Xb + e, i'y = i'Xb + i'e = i'Xb, which implies (after dividing by n) ȳ = x̄'b: the regression line passes through the means.
- These results do not apply if the model has no constant term.
20/26
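Both consequences of the constant term can be illustrated with a few lines of NumPy (an illustrative sketch, not from the slides; ybar and xbar are hypothetical names for the sample means of y and of the columns of X):

    import numpy as np

    rng = np.random.default_rng(2)
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # first column is ones
    y = X @ np.array([2.0, 1.0, -1.0]) + rng.normal(size=n)

    b = np.linalg.solve(X.T @ X, X.T @ y)
    e = y - X @ b

    print(np.isclose(e.sum(), 0))        # i'e = 0: residuals sum to zero
    ybar = y.mean()
    xbar = X.mean(axis=0)
    print(np.isclose(ybar, xbar @ b))    # ybar = xbar'b: line passes through the means

Dropping the column of ones from X makes both checks fail in general, which is the point of the last bullet.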
Part 3: Least Squares Algebra
Least Squares Algebra
21/26
Part 3: Least Squares Algebra
Least Squares
22/26
Part 3: Least Squares Algebra
Residuals
23/26
Part 3: Least Squares Algebra
Least Squares Residuals
24/26
Part 3: Least Squares Algebra
Least Squares Algebra-3
- M = I - X(X'X)^-1 X' is n×n, potentially huge.
- e = My: the residual maker applied to y gives the least squares residuals.
25/26
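Because M is n×n, it is rarely formed explicitly in practice; the residuals e = My can be obtained from a K×K solve and two matrix-vector products instead. The following sketch reflects that reading of the slide (it is not code from the notes):

    import numpy as np

    rng = np.random.default_rng(3)
    n, K = 10_000, 5
    X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
    y = X @ rng.normal(size=K) + rng.normal(size=n)

    # e = My = y - X(X'X)^-1 X'y, computed without building the n x n matrix M
    b = np.linalg.solve(X.T @ X, X.T @ y)    # K x K system instead of an n x n matrix
    e = y - X @ b

    print(e.shape, np.allclose(X.T @ e, 0))  # X'e = 0, as required of least squares residuals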
Part 3: Least Squares Algebra
Least Squares Algebra-4
- MX = 0
26/26