STAT 100B: Homework 5 Solutions
Denise Tsai
March 3, 2011

1. For the simplest regression model $y_i = \beta x_i + \epsilon_i$, $i = 1, \ldots, n$, suppose the $x_i$ are fixed, $E[\epsilon_i] = 0$, $\mathrm{Var}[\epsilon_i] = \sigma^2$, and the $\epsilon_i$ are independent. Consider the estimator $\hat\beta = \sum_{i=1}^n w_i y_i / \sum_{i=1}^n w_i x_i$, where $w_i$ may depend on $x_i$, but not on $y_i$.

(a) Find $E[\hat\beta]$ and $\mathrm{Var}[\hat\beta]$.

Solution: Since the denominator $\sum_{i=1}^n w_i x_i$ is a fixed constant,
$$E(\hat\beta) = E\left[\frac{\sum_{i=1}^n w_i y_i}{\sum_{i=1}^n w_i x_i}\right] = \frac{\sum_{i=1}^n E[w_i(\beta x_i + \epsilon_i)]}{\sum_{i=1}^n w_i x_i} = \frac{\sum_{i=1}^n (w_i \beta x_i + 0)}{\sum_{i=1}^n w_i x_i} \quad \text{since } E(w_i \epsilon_i) = w_i E(\epsilon_i) = 0$$
$$= \frac{\beta \sum_{i=1}^n w_i x_i}{\sum_{i=1}^n w_i x_i} = \beta.$$
Similarly, using the independence of the $\epsilon_i$,
$$\mathrm{Var}(\hat\beta) = \mathrm{Var}\left[\frac{\sum_{i=1}^n w_i y_i}{\sum_{i=1}^n w_i x_i}\right] = \frac{\sum_{i=1}^n \mathrm{Var}[w_i(\beta x_i + \epsilon_i)]}{\left(\sum_{i=1}^n w_i x_i\right)^2} = \frac{\sum_{i=1}^n (0 + w_i^2 \sigma^2)}{\left(\sum_{i=1}^n w_i x_i\right)^2} \quad \text{since } \mathrm{Var}(w_i \epsilon_i) = w_i^2\,\mathrm{Var}(\epsilon_i) = w_i^2 \sigma^2$$
$$= \frac{\sigma^2 \sum_{i=1}^n w_i^2}{\left(\sum_{i=1}^n w_i x_i\right)^2}.$$

(b) Show that $\mathrm{Var}[\hat\beta]$ is minimized when $w_i \propto x_i$. Calculate the minimum.

Solution: Since $\sigma^2 > 0$, minimizing $\sigma^2 \sum_{i=1}^n w_i^2 / \left(\sum_{i=1}^n w_i x_i\right)^2$ is equivalent to minimizing $\sum_{i=1}^n w_i^2 / \left(\sum_{i=1}^n w_i x_i\right)^2$. Writing $\vec w = (w_1, \ldots, w_n)$ and $\vec x = (x_1, \ldots, x_n)$, and letting $\theta$ be the angle between them,
$$\frac{\sum_{i=1}^n w_i^2}{\left(\sum_{i=1}^n w_i x_i\right)^2} = \frac{\|\vec w\|^2}{\langle \vec w, \vec x\rangle^2} = \frac{\|\vec w\|^2}{\|\vec w\|^2 \|\vec x\|^2 \cos^2\theta} = \frac{1}{\|\vec x\|^2 \cos^2\theta} \ge \frac{1}{\|\vec x\|^2}.$$
$\mathrm{Var}(\hat\beta)$ is minimized when $\cos^2\theta = 1 \Rightarrow \theta = 0$ or $\theta = \pi \Rightarrow \vec w$ and $\vec x$ are on the same line $\Rightarrow \vec w \propto \vec x$ ($w_i \propto x_i$). The minimum is
$$\mathrm{Var}(\hat\beta) = \frac{\sigma^2}{\|\vec x\|^2} = \frac{\sigma^2}{\sum_{i=1}^n x_i^2}.$$

2. For the simple regression $y_i = \beta_0 + \beta_1 x_i + \epsilon_i$, $i = 1, \ldots, n$, define the sample variance of $(x_i,\ i = 1, \ldots, n)$ as $V_x = \sum \ldots$
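The unbiasedness and variance-minimization results in problem 1 can be checked numerically. Below is a minimal Monte Carlo sketch; the values $\beta = 2$, $\sigma = 0.5$, the particular $x_i$, and the helper `beta_hat` are illustrative assumptions, not part of the problem.

```python
import numpy as np

# Illustrative parameters (not from the problem statement).
rng = np.random.default_rng(0)
beta, sigma = 2.0, 0.5
x = np.array([0.5, 1.0, 1.5, 2.0, 3.0])  # fixed design points
n_reps = 200_000                          # number of simulated samples

def beta_hat(w, eps):
    """Estimator sum(w_i y_i) / sum(w_i x_i), vectorized over samples."""
    y = beta * x + eps          # shape (n_reps, n)
    return (y @ w) / (w @ x)

# Independent errors with mean 0 and variance sigma^2.
eps = rng.normal(0.0, sigma, size=(n_reps, x.size))

est_opt = beta_hat(x, eps)                  # optimal weights w_i = x_i
est_unif = beta_hat(np.ones_like(x), eps)   # uniform weights w_i = 1

print(est_opt.mean(), est_unif.mean())  # both close to beta: unbiased for any w
print(est_opt.var(), sigma**2 / (x**2).sum())  # empirical vs. sigma^2 / sum(x_i^2)
print(est_unif.var() >= est_opt.var())  # uniform weights never beat w_i = x_i
```

With these weights, the simulated variance of `est_opt` should match the closed-form minimum $\sigma^2 / \sum_i x_i^2$, while any other weighting (here, uniform) yields a larger variance.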
This note was uploaded on 09/20/2011 for the course STAT 100B taught by Professor Wu during the Winter '11 term at UCLA.