STAT 100B HW III solution

Problem 1: Suppose $X_1, \dots, X_n \sim f(x)$ independently. Let $\mu = E[X]$ and $\sigma^2 = \mathrm{Var}[X]$. Let
$$\bar{X} = \frac{1}{n} \sum_{i=1}^n X_i \quad \text{and} \quad s^2 = \frac{1}{n-1} \sum_{i=1}^n (X_i - \bar{X})^2.$$
Prove that $E[\bar{X}] = \mu$ and $E[s^2] = \sigma^2$.

A: $E[\bar{X}] = E\!\left[\frac{1}{n} \sum_{i=1}^n X_i\right] = \frac{1}{n} \sum_{i=1}^n E[X_i] = \frac{n\mu}{n} = \mu$.

For $s^2$, use the identity $\sum_{i=1}^n (X_i - \bar{X})^2 = \sum_{i=1}^n (X_i - \mu)^2 - n(\bar{X} - \mu)^2$ (the cross term vanishes because $\sum_{i=1}^n (X_i - \bar{X}) = 0$) together with $E(\bar{X} - \mu)^2 = \mathrm{Var}(\bar{X}) = \sigma^2/n$:
\begin{align}
E[s^2] &= E\!\left[\frac{1}{n-1} \sum_{i=1}^n (X_i - \bar{X})^2\right] \tag{1}\\
&= \frac{1}{n-1} E\!\left[\sum_{i=1}^n (X_i - \mu)^2 - n(\bar{X} - \mu)^2\right] \tag{2}\\
&= \frac{1}{n-1} \left[\sum_{i=1}^n E(X_i - \mu)^2 - n\,E(\bar{X} - \mu)^2\right] \tag{3}\\
&= \frac{1}{n-1} \left(n\sigma^2 - n \cdot \frac{\sigma^2}{n}\right) = \sigma^2. \tag{4}
\end{align}

Problem 2: Suppose $Y_i = x_i \beta_{\mathrm{true}} + \epsilon_i$, $i = 1, \dots, n$, where the $x_i$ are fixed and $\beta_{\mathrm{true}}$ is an unknown constant. The $\epsilon_i$ are random errors with $E[\epsilon_i] = 0$ and $\mathrm{Var}[\epsilon_i] = \sigma^2$. Suppose we want to estimate $\beta_{\mathrm{true}}$ by $\hat{\beta} = \sum_{i=1}^n w_i Y_i$, where $(w_i,\, i = 1, \dots, n)$ may depend on $(x_i,\, i = 1, \dots, n)$. If we want $\hat{\beta}$ to be unbiased with minimum variance, what should the values of $w_i$, $i = 1, \dots, n$, be?

A: $E[\hat{\beta}] = \sum_{i=1}^n w_i E[Y_i] = \sum_{i=1}^n w_i E[x_i \beta_{\mathrm{true}} + \epsilon_i] = \sum_{i=1}^n w_i x_i \beta_{\mathrm{true}}$, and this equals $\beta_{\mathrm{true}}$ for every value of $\beta_{\mathrm{true}}$ exactly when $\sum_{i=1}^n w_i x_i = 1$. So unbiasedness requires $\sum_{i=1}^n w_i x_i = 1$.
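Assuming the errors $\epsilon_i$ are uncorrelated (the usual reading of this setup), $\mathrm{Var}[\hat{\beta}] = \sum_{i=1}^n w_i^2\, \mathrm{Var}[Y_i] = \sigma^2 \sum_{i=1}^n w_i^2$, so the remaining task is to minimize $\sum_{i=1}^n w_i^2$ subject to $\sum_{i=1}^n w_i x_i = 1$. Below is a minimal sketch of that constrained minimization via a standard Lagrange-multiplier argument; the multiplier $\lambda$ and the closed-form weights are worked out here as an illustration.
\begin{align*}
\mathcal{L}(w_1, \dots, w_n, \lambda) &= \sum_{i=1}^n w_i^2 - \lambda \left( \sum_{i=1}^n w_i x_i - 1 \right), \\
\frac{\partial \mathcal{L}}{\partial w_i} = 2 w_i - \lambda x_i = 0 &\;\Rightarrow\; w_i = \frac{\lambda x_i}{2}, \\
\sum_{i=1}^n w_i x_i = \frac{\lambda}{2} \sum_{i=1}^n x_i^2 = 1 &\;\Rightarrow\; \lambda = \frac{2}{\sum_{j=1}^n x_j^2} \;\Rightarrow\; w_i = \frac{x_i}{\sum_{j=1}^n x_j^2}.
\end{align*}
With these weights, $\hat{\beta} = \sum_{i=1}^n x_i Y_i \big/ \sum_{j=1}^n x_j^2$, which is the least squares estimator for this no-intercept model, and $\mathrm{Var}[\hat{\beta}] = \sigma^2 \big/ \sum_{j=1}^n x_j^2$.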
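As a quick numerical sanity check, the short Python/NumPy script below simulates both results (a hypothetical illustration: the normal distribution, sample size, parameter values, and covariate grid are arbitrary choices, and the weights follow the $w_i = x_i / \sum_j x_j^2$ form sketched above).

import numpy as np

rng = np.random.default_rng(0)
n, reps = 10, 200_000
mu, sigma, beta_true = 2.0, 3.0, 1.5   # illustrative parameter values
x = np.linspace(1.0, 2.0, n)           # fixed covariates (arbitrary grid)

# Problem 1: the sample mean and the (n-1)-divisor sample variance are unbiased.
X = rng.normal(mu, sigma, size=(reps, n))
print(X.mean(axis=1).mean())           # close to mu = 2.0
print(X.var(axis=1, ddof=1).mean())    # close to sigma^2 = 9.0

# Problem 2: w_i = x_i / sum_j x_j^2 gives an unbiased estimator of beta_true.
w = x / np.sum(x**2)
Y = beta_true * x + rng.normal(0.0, sigma, size=(reps, n))
beta_hat = Y @ w
print(beta_hat.mean())                 # close to beta_true = 1.5
print(beta_hat.var())                  # close to sigma^2 / sum_j x_j^2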