CHAPTER 9

Solution to Problem 9.1. Let $X_i$ denote the random homework time for the $i$th week, $i = 1, \ldots, 5$. We have the observation vector $X = x$, where $x = (10, 14, 18, 8, 20)$. In view of the independence of the $X_i$, for $\theta > 0$ the likelihood function is
$$f_X(x; \theta) = f_{X_1}(x_1; \theta) \cdots f_{X_5}(x_5; \theta) = \theta e^{-x_1 \theta} \cdots \theta e^{-x_5 \theta} = \theta^5 e^{-(x_1 + \cdots + x_5)\theta} = \theta^5 e^{-(10+14+18+8+20)\theta} = \theta^5 e^{-70\theta}.$$
To derive the ML estimate, we set to 0 the derivative of $f_X(x; \theta)$ with respect to $\theta$, obtaining
$$\frac{d}{d\theta}\bigl(\theta^5 e^{-70\theta}\bigr) = 5\theta^4 e^{-70\theta} - 70\,\theta^5 e^{-70\theta} = (5 - 70\theta)\,\theta^4 e^{-70\theta} = 0.$$
Therefore,
$$\hat{\theta} = \frac{5}{70} = \frac{5}{x_1 + \cdots + x_5}.$$

Solution to Problem 9.2. (a) Let the random variable $N$ be the number of tosses until the $k$th head. The likelihood function is the Pascal PMF of order $k$:
$$p_N(n; \theta) = \binom{n-1}{k-1} \theta^k (1-\theta)^{n-k}, \qquad n = k, k+1, \ldots$$
We maximize the likelihood by setting its derivative with respect to $\theta$ to zero:
$$0 = k \binom{n-1}{k-1} (1-\theta)^{n-k} \theta^{k-1} - (n-k) \binom{n-1}{k-1} (1-\theta)^{n-k-1} \theta^k,$$
which yields the ML estimator
$$\hat{\Theta}_1 = \frac{k}{N}.$$
Note that $\hat{\Theta}_1$ is just the fraction of heads observed in $N$ tosses.

(b) In this case, $n$ is a fixed integer and $K$ is a random variable. The PMF of $K$ is binomial:
$$p_K(k; \theta) = \binom{n}{k} \theta^k (1-\theta)^{n-k}, \qquad k = 0, 1, 2, \ldots, n.$$
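The closed-form ML estimate of Problem 9.1 can be sanity-checked numerically. The sketch below (assuming NumPy is available; the search-grid bounds and resolution are arbitrary choices, not part of the problem) evaluates the exponential log-likelihood over a grid of $\theta$ values and confirms that the maximizer agrees with $5/(x_1 + \cdots + x_5)$.

```python
import numpy as np

# Observed homework times from Problem 9.1
x = np.array([10.0, 14.0, 18.0, 8.0, 20.0])

# Log-likelihood of the i.i.d. exponential sample:
# log f_X(x; theta) = 5*log(theta) - (x_1 + ... + x_5)*theta
theta = np.linspace(1e-4, 0.5, 100_000)   # arbitrary search grid
loglik = len(x) * np.log(theta) - x.sum() * theta

theta_hat = theta[np.argmax(loglik)]
print(theta_hat)       # maximizer of the log-likelihood on the grid
print(5 / x.sum())     # closed-form ML estimate from the solution
```

Maximizing the log-likelihood rather than the likelihood itself avoids underflow and gives the same maximizer, since the logarithm is monotone.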
For given $n$ and $k$, this is a constant multiple of the PMF in part (a), so the same calculation yields the estimator
$$\hat{\Theta}_2 = \frac{K}{n}.$$
We observe that the ML estimator is again the fraction of heads in the observed trials. Note that although parts (a) and (b) involve different experiments and different random variables, the ML estimates obtained are similar. However, it can be shown that $\hat{\Theta}_2$ is unbiased [since $\mathbf{E}[\hat{\Theta}_2] = \mathbf{E}[K]/n = \theta \cdot n/n = \theta$], whereas $\hat{\Theta}_1$ is not [since $\mathbf{E}[1/N] \neq 1/\mathbf{E}[N]$].

Solution to Problem 9.3. (a) Let $s$ be the sum of all the ball numbers. Then for all $i$,
$$\mathbf{E}[X_i] = \frac{s}{k}, \qquad \mathbf{E}[Y_i] = \frac{s}{k}.$$
We have
$$\mathbf{E}[\hat{S}] = \mathbf{E}\left[\frac{k}{n} \sum_{i=1}^{n} X_i\right] = \frac{k}{n} \sum_{i=1}^{n} \mathbf{E}[X_i] = \frac{k}{n} \sum_{i=1}^{n} \frac{s}{k} = s,$$
so $\hat{S}$ is an unbiased estimator of $s$. Similarly, $\mathbf{E}[\tilde{S}] = s$. Finally, let
$$L = \frac{S}{k} = \frac{1}{N} \sum_{j=1}^{N} Y_j.$$
Then, conditioning on the random number of terms $N$,
$$\mathbf{E}[L] = \sum_{n} \mathbf{E}[L \mid N = n]\, p_N(n) = \sum_{n} \mathbf{E}\left[\frac{1}{n} \sum_{i=1}^{n} Y_i \,\Big|\, N = n\right] p_N(n) = \sum_{n} \mathbf{E}[Y_1]\, p_N(n) = \mathbf{E}[Y_1] = \frac{s}{k},$$
so that $\mathbf{E}[S] = k\,\mathbf{E}[L] = s$, and $S$ is an unbiased estimator of $s$.

(b) We have
$$\operatorname{var}(\hat{S}) = \frac{k^2}{n} \operatorname{var}(X_1), \qquad \operatorname{var}(\tilde{S}) = \frac{k^2}{m} \operatorname{var}(Y_1).$$
Thus,
$$\operatorname{var}(\hat{S}) = \frac{k^2}{n} \operatorname{var}(X_1) = \frac{k^2}{n p^2}\bigl(p\,\mathbf{E}[Y_1^2] - p^2 (\mathbf{E}[Y_1])^2\bigr) = \frac{k^2}{n}\left(\frac{1}{p}\,\mathbf{E}[Y_1^2] - (\mathbf{E}[Y_1])^2\right) = \frac{k^2}{n}\left(\operatorname{var}(Y_1) + \frac{1-p}{p}\,\mathbf{E}[Y_1^2]\right) = \frac{k^2}{n}\operatorname{var}(Y_1)\left(1 + \frac{r(1-p)}{p}\right) = \operatorname{var}(\tilde{S}) \cdot \frac{m}{n} \cdot \frac{p + r(1-p)}{p}.$$
It follows that when $m = n$,
$$\frac{\operatorname{var}(\tilde{S})}{\operatorname{var}(\hat{S})} = \frac{p}{p + r(1-p)}.$$
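The unbiasedness claims of Problem 9.3(a) for the estimators $\hat{S} = (k/n)\sum_i X_i$ (draws with replacement) and $\tilde{S} = (k/m)\sum_i Y_i$ (draws without replacement) can be illustrated with a quick Monte Carlo sketch. The ball numbers, sample sizes, and trial count below are hypothetical choices for illustration, not taken from the problem statement.

```python
import numpy as np

rng = np.random.default_rng(0)

balls = np.array([3.0, 7.0, 1.0, 9.0, 4.0, 6.0])  # hypothetical ball numbers
k, s = len(balls), balls.sum()                    # k balls, true sum s
n = m = 4                                         # draws per estimate
trials = 50_000

# S_hat: (k/n) times the sum of n draws WITH replacement
S_hat = np.array([(k / n) * rng.choice(balls, size=n, replace=True).sum()
                  for _ in range(trials)])
# S_tilde: (k/m) times the sum of m draws WITHOUT replacement
S_til = np.array([(k / m) * rng.choice(balls, size=m, replace=False).sum()
                  for _ in range(trials)])

# Both empirical means should be close to the true sum s; in this setup
# the without-replacement estimator also shows the smaller spread.
print(s, S_hat.mean(), S_til.mean())
print(S_hat.var(), S_til.var())
```

Both estimators average out to $s$, consistent with part (a); the variance comparison depends on the sampling mechanism and is the subject of part (b).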
