Proof of the Gauss-Markov Theorem

Gauss-Markov Theorem: The OLS estimator, $C\hat{\beta}$, is the unique BLUE of $C\beta$ in the GM model

$y = X\beta + \varepsilon, \quad \varepsilon \sim N(0, \sigma^2 I).$

We need to show that $\mathrm{Var}(C\hat{\beta})$ is strictly less than the variance of any other linear unbiased estimator of $C\beta$, for all $\beta \in \mathbb{R}^p$ and all $\sigma^2 \in \mathbb{R}^+$.

Outline of the proof:
- Consider some other linear unbiased estimator, $Dy$.
- Write $\mathrm{Var}(Dy)$ as $\mathrm{Var}[(Dy - C\hat{\beta}) + C\hat{\beta}]$.
- Show that $\mathrm{Cov}(Dy - C\hat{\beta},\, C\hat{\beta}) = 0$ and $\mathrm{Var}(Dy - C\hat{\beta}) > 0$ unless $Dy = C\hat{\beta}$.
- Hence $\mathrm{Var}(Dy) > \mathrm{Var}(C\hat{\beta})$ unless $Dy = C\hat{\beta}$.

Copyright © 2011 Dept. of Statistics (Iowa State University), Statistics 511

Suppose $Dy$ is any linear unbiased estimator other than the OLS estimator $C\hat{\beta}$. We need to show $\mathrm{Var}(Dy) > \mathrm{Var}(C\hat{\beta})$. We can relate the two variances by writing

$\mathrm{Var}(Dy) = \mathrm{Var}(Dy - C\hat{\beta} + C\hat{\beta}) = \mathrm{Var}(Dy - C\hat{\beta}) + \mathrm{Var}(C\hat{\beta}) + 2\,\mathrm{Cov}(Dy - C\hat{\beta},\, C\hat{\beta}).$
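The two key facts in the outline — the cross-covariance vanishes and the excess variance of any other linear unbiased estimator is nonnegative — can be checked numerically. The sketch below (not from the slides; all variable names such as `A_ols`, `W`, and `G` are mine) takes $C = I$ for simplicity, so the target is $\beta$ itself, and builds an alternative unbiased estimator $Dy$ with $D = (X'X)^{-1}X' + W$, where $WX = 0$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 8, 3
X = rng.standard_normal((n, p))
sigma2 = 1.0

# OLS coefficient map: beta_hat = A_ols @ y, where A_ols = (X'X)^{-1} X'
A_ols = np.linalg.solve(X.T @ X, X.T)

# Another linear unbiased estimator Dy: D = A_ols + W with W X = 0,
# built from an arbitrary matrix G and the residual projector I - P_X.
P = X @ A_ols                                  # hat matrix P_X
G = rng.standard_normal((p, n))
W = G @ (np.eye(n) - P)                        # W X = 0 since (I - P_X) X = 0
D = A_ols + W
assert np.allclose(D @ X, np.eye(p))           # unbiasedness: D X = I

# Variances under Var(y) = sigma^2 I:
#   Var(beta_hat) = sigma^2 A A',  Var(Dy) = sigma^2 D D'
V_ols = sigma2 * (A_ols @ A_ols.T)
V_D = sigma2 * (D @ D.T)

# Cov(Dy - beta_hat, beta_hat) = sigma^2 W A_ols' should be the zero matrix
cov_cross = sigma2 * (W @ A_ols.T)
assert np.allclose(cov_cross, 0)

# Excess variance Var(Dy) - Var(beta_hat) = sigma^2 W W' is positive semidefinite
excess = V_D - V_ols
assert np.all(np.linalg.eigvalsh(excess) >= -1e-10)
```

Because the cross term is zero, the decomposition collapses to $\mathrm{Var}(Dy) = \mathrm{Var}(Dy - \hat{\beta}) + \mathrm{Var}(\hat{\beta})$, and the excess $\sigma^2 WW'$ is strictly positive (in the matrix sense) whenever $W \neq 0$, i.e., whenever $Dy \neq \hat{\beta}$.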