04 Proof of the Gauss-Markov Theorem
Statistics 511, Department of Statistics, Iowa State University (© 2011)

Gauss-Markov Theorem: The OLS estimator, $C\hat{\beta}$, is the unique best linear unbiased estimator (BLUE) of $C\beta$ in the Gauss-Markov model
$$y = X\beta + \varepsilon, \qquad \varepsilon \sim N(0, \sigma^2 I).$$

We need to show that $\mathrm{Var}(C\hat{\beta})$ is strictly less than the variance of any other linear unbiased estimator of $C\beta$, for all $\beta \in \mathbb{R}^p$ and $\sigma^2 \in \mathbb{R}^{+}$.

Outline of the proof:
- Consider some other linear unbiased estimator, $dy$.
- Write $\mathrm{Var}(dy)$ as $\mathrm{Var}[(dy - C\hat{\beta}) + C\hat{\beta}]$.
- Show that $\mathrm{Cov}(dy - C\hat{\beta},\, C\hat{\beta}) = 0$ and $\mathrm{Var}(dy - C\hat{\beta}) > 0$ unless $dy = C\hat{\beta}$.
- Hence $\mathrm{Var}(dy) > \mathrm{Var}(C\hat{\beta})$ unless $dy = C\hat{\beta}$.

Proof. Suppose $dy$ is any linear unbiased estimator other than the OLS estimator $C\hat{\beta}$. We need to show $\mathrm{Var}(dy) > \mathrm{Var}(C\hat{\beta})$. The two variances can be related by writing
$$\mathrm{Var}(dy) = \mathrm{Var}(dy - C\hat{\beta} + C\hat{\beta}) = \mathrm{Var}(dy - C\hat{\beta}) + \mathrm{Var}(C\hat{\beta}) + 2\,\mathrm{Cov}(dy - C\hat{\beta},\, C\hat{\beta}).$$
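The preview cuts off before establishing the key covariance step from the outline, $\mathrm{Cov}(dy - C\hat{\beta},\, C\hat{\beta}) = 0$. A minimal worked sketch of that step follows, assuming $X$ has full column rank so that $(X'X)^{-1}$ exists (the full notes may instead work with a generalized inverse); the symbol $a$ is a label introduced here for convenience, not necessarily the notes' notation.

Unbiasedness of $dy$ means $\mathrm{E}(dy) = dX\beta = C\beta$ for all $\beta \in \mathbb{R}^p$, hence $dX = C$. Writing $C\hat{\beta} = C(X'X)^{-1}X'y = ay$ with $a = C(X'X)^{-1}X'$,
\begin{align*}
\mathrm{Cov}(dy - C\hat{\beta},\, C\hat{\beta})
  &= \mathrm{Cov}\bigl((d - a)y,\, ay\bigr) \\
  &= (d - a)\,\mathrm{Var}(y)\,a' \\
  &= \sigma^2 (d - a)\,X(X'X)^{-1}C' \\
  &= \sigma^2 (dX - aX)(X'X)^{-1}C' \\
  &= \sigma^2 (C - C)(X'X)^{-1}C' = 0,
\end{align*}
using $dX = C$ (unbiasedness of $dy$) and $aX = C(X'X)^{-1}X'X = C$.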
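Not from the notes, but a possible companion to the algebra: a short numerical sketch in Python/NumPy that illustrates the theorem's conclusion. It constructs one alternative linear unbiased estimator $dy$ by perturbing the OLS coefficient matrix with rows orthogonal to the column space of $X$, then checks that $\mathrm{Var}(dy) - \mathrm{Var}(C\hat{\beta})$ is positive semidefinite. Every concrete choice (the dimensions, $X$, $C$, $\sigma^2$, and the particular perturbation) is a hypothetical example, not anything taken from the course material.

import numpy as np

# Illustrative check of the Gauss-Markov theorem (hypothetical numbers throughout).
rng = np.random.default_rng(0)
n, p = 10, 3
X = rng.normal(size=(n, p))            # design matrix, assumed full column rank
C = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, -1.0]])       # target is C beta (two linear combinations)
sigma2 = 4.0

XtX_inv = np.linalg.inv(X.T @ X)
a = C @ XtX_inv @ X.T                  # OLS estimator: C beta-hat = a y
M = np.eye(n) - X @ XtX_inv @ X.T      # projector onto the orthogonal complement of col(X)

# Any d with d X = C yields a linear unbiased estimator d y.  Adding rows lying in
# the orthogonal complement of col(X) keeps d X = a X = C but changes the estimator.
d = a + rng.normal(size=(C.shape[0], n)) @ M

var_ols   = sigma2 * a @ a.T           # Var(C beta-hat) = sigma^2 a a'
var_other = sigma2 * d @ d.T           # Var(d y)        = sigma^2 d d'

print(np.allclose(d @ X, C))                                 # unbiasedness: d X = C
print(np.round(np.linalg.eigvalsh(var_other - var_ols), 6))  # eigenvalues >= 0 (PSD)

The eigenvalues of the difference are nonnegative, matching the claim that $\mathrm{Var}(dy)$ exceeds $\mathrm{Var}(C\hat{\beta})$ unless $dy = C\hat{\beta}$.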
