# Proof of the Gauss-Markov Theorem


**Gauss-Markov Theorem.** The OLS estimator $C\hat{\beta}$ is the unique best linear unbiased estimator (BLUE) of $C\beta$ in the Gauss-Markov model

$$y = X\beta + \varepsilon, \qquad \varepsilon \sim N(\mathbf{0}, \sigma^2 I).$$

We need to show that $\mathrm{Var}(C\hat{\beta})$ is strictly less than the variance of any other linear unbiased estimator of $C\beta$, for all $\beta \in \mathbb{R}^p$ and $\sigma^2 \in \mathbb{R}^+$.

Outline of the proof:

- Consider some other linear unbiased estimator, $dy$.
- Write $\mathrm{Var}(dy)$ as $\mathrm{Var}[(dy - C\hat{\beta}) + C\hat{\beta}]$.
- Show that $\mathrm{Cov}(dy - C\hat{\beta},\, C\hat{\beta}) = 0$ and that $\mathrm{Var}(dy - C\hat{\beta}) > 0$ unless $dy = C\hat{\beta}$.
- Hence $\mathrm{Var}(dy) > \mathrm{Var}(C\hat{\beta})$ unless $dy = C\hat{\beta}$.

Copyright © 2011 Dept. of Statistics (Iowa State University), Statistics 511.

Suppose $dy$ is any linear unbiased estimator other than the OLS estimator $C\hat{\beta}$. We need to show $\mathrm{Var}(dy) > \mathrm{Var}(C\hat{\beta})$. We can relate the two variances by writing

$$\mathrm{Var}(dy) = \mathrm{Var}(dy - C\hat{\beta} + C\hat{\beta}) = \mathrm{Var}(dy - C\hat{\beta}) + \mathrm{Var}(C\hat{\beta}) + 2\,\mathrm{Cov}(dy - C\hat{\beta},\, C\hat{\beta}).$$
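As a numerical sanity check of the argument above (not part of the original notes), here is a short Python sketch using NumPy for the scalar case, where $C$ is a single row vector $c'$. It builds the OLS weight vector $a_{\text{ols}}$ with $c'\hat{\beta} = a_{\text{ols}}'y$, constructs a competing linear unbiased estimator by adding a component orthogonal to the column space of $X$ (which preserves unbiasedness, since $a'X = c'$ still holds), and verifies the variance decomposition. The particular `X`, `c`, `z`, and `sigma2` are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 3
X = rng.normal(size=(n, p))           # design matrix (full column rank a.s.)
c = np.array([1.0, -2.0, 0.5])        # estimand is c' beta
sigma2 = 1.0                          # error variance

# OLS weights: c' beta_hat = a_ols' y  with  a_ols = X (X'X)^{-1} c
XtX_inv = np.linalg.inv(X.T @ X)
a_ols = X @ XtX_inv @ c

# A competing linear unbiased estimator: add a component orthogonal to
# col(X). Any a = a_ols + (I - P) z still satisfies a' X = c'.
P = X @ XtX_inv @ X.T                 # orthogonal projection onto col(X)
z = rng.normal(size=n)
w = (np.eye(n) - P) @ z               # component outside col(X)
a_alt = a_ols + w

# Both estimators are unbiased: a' X = c'
assert np.allclose(a_ols @ X, c)
assert np.allclose(a_alt @ X, c)

# Var(a'y) = sigma^2 ||a||^2; the cross term vanishes because a_ols
# lies in col(X) while w is orthogonal to it (the Cov(...) = 0 step).
var_ols = sigma2 * a_ols @ a_ols
var_alt = sigma2 * a_alt @ a_alt
print(var_ols < var_alt)              # prints True
```

The inequality is strict here because `w` is nonzero with probability one; `var_alt` exceeds `var_ols` by exactly $\sigma^2\lVert w\rVert^2$, mirroring the decomposition in the displayed equation.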