Proof of the Gauss-Markov Theorem

Gauss-Markov Theorem: The OLS estimator $C\hat{\beta}$ is the unique BLUE of $C\beta$ in the GM model

$$y = X\beta + \epsilon, \qquad \epsilon \sim N(0, \sigma^2 I).$$

We need to show that $Var(C\hat{\beta})$ is strictly less than the variance of any other linear unbiased estimator of $C\beta$, for all $\beta \in \mathbb{R}^p$ and $\sigma^2 \in \mathbb{R}^+$.

Outline of the proof:

- Consider some other linear unbiased estimator, $dy$.
- Write $Var(dy)$ as $Var[(dy - C\hat{\beta}) + C\hat{\beta}]$.
- Show $Cov(dy - C\hat{\beta}, C\hat{\beta}) = 0$, and $Var(dy - C\hat{\beta}) > 0$ unless $dy = C\hat{\beta}$.
- Hence $Var(dy) > Var(C\hat{\beta})$ unless $dy = C\hat{\beta}$.

Copyright (c) 2011 Dept. of Statistics (Iowa State University), Statistics 511

Proof of the Gauss-Markov Theorem (continued)

Suppose $dy$ is any linear unbiased estimator other than the OLS estimator $C\hat{\beta}$. We need to show $Var(dy) > Var(C\hat{\beta})$. We can relate the two variances by writing

$$Var(dy) = Var(dy - C\hat{\beta} + C\hat{\beta}) = Var(dy - C\hat{\beta}) + Var(C\hat{\beta}) + 2\,Cov(dy - C\hat{\beta}, C\hat{\beta}).$$
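The two key facts in the outline can be checked numerically: for any competing linear unbiased estimator $dy$, the weight vector $d$ must differ from the OLS weights by a vector orthogonal to the column space of $X$, which forces the cross-covariance to zero and can only add variance. The sketch below is illustrative only; the design matrix, the choice of $c$, and the perturbation vector `a` are all hypothetical, not from the notes.

```python
import numpy as np

# Hypothetical small GM setup: n = 6 observations, p = 2 (intercept + slope).
n, p = 6, 2
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
c = np.array([0.0, 1.0])  # target estimand c'beta (the slope)
sigma2 = 1.0              # Var(y) = sigma^2 * I

XtX_inv = np.linalg.inv(X.T @ X)
d_ols = c @ XtX_inv @ X.T  # OLS weights: c'(X'X)^{-1}X', so d_ols @ y = c'beta_hat

# Build a competing linear unbiased estimator d'y: add a vector `a`
# orthogonal to col(X), so d'X = c' still holds (unbiasedness preserved).
Q, _ = np.linalg.qr(X, mode="complete")
a = Q[:, p]                # unit vector in the null space of X'
d = d_ols + 0.5 * a

# Both weight vectors satisfy w'X = c', i.e. both estimators are unbiased.
assert np.allclose(d_ols @ X, c) and np.allclose(d @ X, c)

# Under Var(y) = sigma^2 I, Var(w'y) = sigma^2 * w'w.
var_ols = sigma2 * (d_ols @ d_ols)  # = sigma^2 * c'(X'X)^{-1}c
var_d = sigma2 * (d @ d)

# Cov(dy - c'beta_hat, c'beta_hat) = sigma^2 * (d - d_ols)'d_ols,
# which is 0 because (d - d_ols) is orthogonal to col(X) and d_ols lies in it.
cov_cross = sigma2 * ((d - d_ols) @ d_ols)

print(var_ols, var_d, cov_cross)
```

Since `d - d_ols` is orthogonal to `d_ols`, the printed values show `var_d = var_ols + sigma2 * 0.25` and a cross-covariance of zero, matching the decomposition $Var(dy) = Var(dy - C\hat{\beta}) + Var(C\hat{\beta})$.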