# CHAPTER 9


**Solution to Problem 9.1.** Let $X_i$ denote the random homework time for the $i$th week, $i = 1, \ldots, 5$. We have the observation vector $X = x$, where $x = (10, 14, 18, 8, 20)$. In view of the independence of the $X_i$, for $\theta \in [0, 1]$, the likelihood function is

$$f_X(x; \theta) = f_{X_1}(x_1; \theta) \cdots f_{X_5}(x_5; \theta) = \theta e^{-x_1 \theta} \cdots \theta e^{-x_5 \theta} = \theta^5 e^{-(x_1 + \cdots + x_5)\theta} = \theta^5 e^{-(10+14+18+8+20)\theta} = \theta^5 e^{-70\theta}.$$

To derive the ML estimate, we set to 0 the derivative of $f_X(x; \theta)$ with respect to $\theta$, obtaining

$$\frac{d}{d\theta}\left(\theta^5 e^{-70\theta}\right) = 5\theta^4 e^{-70\theta} - 70\,\theta^5 e^{-70\theta} = (5 - 70\theta)\,\theta^4 e^{-70\theta} = 0.$$

Therefore,

$$\hat\theta = \frac{5}{70} = \frac{5}{x_1 + \cdots + x_5}.$$

**Solution to Problem 9.2.** (a) Let the random variable $N$ be the number of tosses until the $k$th head. The likelihood function is the Pascal PMF of order $k$:

$$p_N(n; \theta) = \binom{n-1}{k-1} \theta^k (1 - \theta)^{n-k}, \qquad n = k, k+1, \ldots$$

We maximize the likelihood by setting its derivative with respect to $\theta$ to zero:

$$0 = k \binom{n-1}{k-1} (1 - \theta)^{n-k} \theta^{k-1} - (n - k) \binom{n-1}{k-1} (1 - \theta)^{n-k-1} \theta^k,$$

which yields the ML estimator

$$\hat\Theta_1 = \frac{k}{N}.$$

Note that $\hat\Theta_1$ is just the fraction of heads observed in $N$ tosses.

(b) In this case, $n$ is a fixed integer and $K$ is a random variable. The PMF of $K$ is binomial:

$$p_K(k; \theta) = \binom{n}{k} \theta^k (1 - \theta)^{n-k}, \qquad k = 0, 1, \ldots, n.$$
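As a sanity check on the estimate in Problem 9.1, a short numerical sketch (the grid search and all variable names are ours, not from the text) confirms that the exponential log-likelihood peaks at $\hat\theta = 5/70$:

```python
import math

# Observed weekly homework times from Problem 9.1.
x = [10, 14, 18, 8, 20]

def log_likelihood(theta, data):
    """Log-likelihood of i.i.d. Exponential(theta) samples:
    n*log(theta) - theta*sum(data)."""
    return len(data) * math.log(theta) - theta * sum(data)

# Closed-form ML estimate: number of samples over their sum (5/70).
theta_hat = len(x) / sum(x)

# A coarse grid search over (0, 1) should land on the same value.
grid = [i / 10000 for i in range(1, 10000)]
theta_grid = max(grid, key=lambda t: log_likelihood(t, x))
assert abs(theta_grid - theta_hat) < 1e-3
```

Maximizing the log-likelihood rather than the likelihood itself gives the same maximizer, since $\log$ is increasing.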

For given $n$ and $k$, this is a constant multiple of the PMF in part (a), so the same calculation yields the estimator

$$\hat\Theta_2 = \frac{K}{n}.$$

We observe that the ML estimator is again the fraction of heads in the observed trials. Note that although parts (a) and (b) involve different experiments and different random variables, the ML estimates obtained are similar. However, it can be shown that $\hat\Theta_2$ is unbiased [since $E[\hat\Theta_2] = E[K]/n = \theta n / n = \theta$], whereas $\hat\Theta_1$ is not [since $E[1/N] \neq 1/E[N]$].

**Solution to Problem 9.3.** (a) Let $s$ be the sum of all the ball numbers. Then, for all $i$,

$$E[X_i] = \frac{s}{k}, \qquad E[Y_i] = \frac{s}{k}.$$

We have

$$E[\hat S] = E\left[\frac{k}{n} \sum_{i=1}^{n} X_i\right] = \frac{k}{n} \sum_{i=1}^{n} E[X_i] = \frac{k}{n} \sum_{i=1}^{n} \frac{s}{k} = s,$$

so $\hat S$ is an unbiased estimator of $s$. Similarly, $E[\tilde S] = s$. Finally, let

$$L = \frac{\overline S}{k} = \frac{1}{N} \sum_{j=1}^{N} Y_j.$$

We have

$$E[L] = \sum_{n} E[L \mid N = n]\, p_N(n) = \sum_{n} E\left[\frac{1}{n} \sum_{i=1}^{n} Y_i \,\Big|\, N = n\right] p_N(n) = \sum_{n} E[Y_1]\, p_N(n) = E[Y_1] = \frac{s}{k},$$

so that $E[\overline S] = k\,E[L] = s$, and $\overline S$ is an unbiased estimator of $s$.
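The concluding bias claim in Problem 9.2 can be checked by simulation. A minimal Monte Carlo sketch (the values $\theta = 0.3$, $k = 5$, $n = 40$ are illustrative choices of ours, not from the text):

```python
import random

random.seed(0)
theta, k, n, trials = 0.3, 5, 40, 20000  # illustrative parameters

# Part (b): K ~ Binomial(n, theta); the estimator K/n is unbiased.
est2 = [sum(random.random() < theta for _ in range(n)) / n
        for _ in range(trials)]

# Part (a): N = number of tosses until the k-th head (Pascal PMF);
# the estimator k/N is biased upward.
est1 = []
for _ in range(trials):
    tosses = heads = 0
    while heads < k:
        tosses += 1
        heads += random.random() < theta
    est1.append(k / tosses)

mean2 = sum(est2) / trials  # close to theta
mean1 = sum(est1) / trials  # exceeds theta: E[1/N] > 1/E[N] by Jensen's inequality
```

Since $1/N$ is a convex function of $N$, Jensen's inequality gives $E[1/N] > 1/E[N]$, which is why the empirical mean of $k/N$ overshoots $\theta$.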