7-12 Solutions Manual for Statistical Inference

7.27 a. The log likelihood is
\[
\log L = \sum_{i=1}^n\left(-\beta\tau_i + y_i\log(\beta\tau_i) - \tau_i + x_i\log(\tau_i) - \log y_i! - \log x_i!\right)
\]
and differentiation gives
\[
\frac{\partial\log L}{\partial\beta} = \sum_{i=1}^n\left(-\tau_i + \frac{y_i}{\beta}\right) = 0
\quad\Rightarrow\quad
\hat\beta = \frac{\sum_{i=1}^n y_i}{\sum_{i=1}^n \tau_i},
\]
\[
\frac{\partial\log L}{\partial\tau_j} = -\beta + \frac{y_j}{\tau_j} - 1 + \frac{x_j}{\tau_j} = 0
\quad\Rightarrow\quad
\tau_j = \frac{x_j + y_j}{1+\beta},
\]
\[
\sum_{j=1}^n \tau_j = \sum_{j=1}^n \frac{x_j + y_j}{1+\beta}.
\]
Combining these expressions yields $\hat\beta = \sum_{j=1}^n y_j\big/\sum_{j=1}^n x_j$ and $\hat\tau_j = \dfrac{x_j + y_j}{1+\hat\beta}$.
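The complete-data MLEs of part (a) can be checked numerically. The sketch below uses made-up data $(x_i, y_i)$ (the values are illustrative assumptions, not from the text) and verifies that the closed-form estimates zero both score equations derived above.

```python
# Numerical check of the complete-data MLEs in part (a), with
# hypothetical data; the formulas are those derived above.
x = [4.0, 2.0, 6.0, 3.0]
y = [3.0, 5.0, 2.0, 7.0]

beta_hat = sum(y) / sum(x)                      # beta-hat = sum(y) / sum(x)
tau_hat = [(xj + yj) / (1 + beta_hat) for xj, yj in zip(x, y)]

# Stationarity conditions of the log likelihood:
#   d/d(beta):  sum_i (-tau_i + y_i/beta)         = 0
#   d/d(tau_j): -beta + y_j/tau_j - 1 + x_j/tau_j = 0
score_beta = sum(-t + yj / beta_hat for t, yj in zip(tau_hat, y))
score_tau = [-beta_hat + yj / t - 1 + xj / t
             for xj, yj, t in zip(x, y, tau_hat)]
print(score_beta, score_tau)   # all approximately 0
```

Both scores vanish identically: the $\tau_j$ equation reduces to $-(1+\beta) + (x_j+y_j)/\tau_j = 0$, which the formula for $\hat\tau_j$ satisfies by construction.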
b. The stationary point of the EM algorithm will satisfy
\[
\hat\beta = \frac{\sum_{i=1}^n y_i}{\hat\tau_1 + \sum_{i=2}^n x_i}, \qquad
\hat\tau_1 = \frac{\hat\tau_1 + y_1}{\hat\beta + 1}, \qquad
\hat\tau_j = \frac{x_j + y_j}{\hat\beta + 1}, \quad j = 2, \ldots, n.
\]
The second equation yields $\hat\tau_1 = y_1/\hat\beta$, and substituting this into the first equation yields $\hat\beta = \sum_{j=2}^n y_j\big/\sum_{j=2}^n x_j$. Summing over $j$ in the third equation and substituting $\hat\beta = \sum_{j=2}^n y_j\big/\sum_{j=2}^n x_j$ shows us that $\sum_{j=2}^n \hat\tau_j = \sum_{j=2}^n x_j$, and plugging this into the first equation gives the desired expression for $\hat\beta$. The other two equations in (7.2.16) are then obviously satisfied.

c. The expression for $\hat\beta$ was derived in part (b), as were the expressions for $\hat\tau_i$.

7.29 a. The joint density is the product of the individual densities.

b. The log likelihood is
\[
\log L = \sum_{i=1}^n\left(-\beta m\lambda_i + y_i\log(\beta m\lambda_i) - m\lambda_i + x_i\log(m\lambda_i) - \log y_i! - \log x_i!\right)
\]
and setting $\partial\log L/\partial\beta = 0$ and $\partial\log L/\partial\lambda_j = 0$ gives
\[
\hat\beta = \frac{\sum_{i=1}^n y_i}{\sum_{i=1}^n m\lambda_i}, \qquad
\hat\lambda_j = \frac{x_j + y_j}{(1+\beta)m}.
\]
Since $\sum_j \lambda_j = 1$, the first equation gives $\hat\beta = \sum_{i=1}^n y_i/m$. Also, $\sum_j \hat\lambda_j = \sum_j (y_j + x_j)/[(1+\hat\beta)m] = 1$, which implies that $(1+\hat\beta)m = \sum_j (y_j + x_j)$; combining these shows $m = \sum_i x_i$, so that $\hat\beta = \sum_i y_i\big/\sum_i x_i$ and $\hat\lambda_j = (x_j + y_j)\big/\sum_i (y_i + x_i)$.

c. In the likelihood function we can ignore the factorial terms, and the expected complete-data likelihood is obtained on the $r$th iteration by replacing $x_1$ with $E\!\left(X_1 \mid \hat\lambda_1^{(r)}\right) = m\hat\lambda_1^{(r)}$. Substituting this into the MLEs of part (b) gives the EM sequence.
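The EM iteration of Exercise 7.27 is short enough to run directly. The sketch below (an illustration, not from the text; the data and the helper `em_727` are assumptions) alternates the E-step, replacing the missing $x_1$ by $E[X_1 \mid \tau_1^{(r)}] = \tau_1^{(r)}$, with the M-step given by the complete-data MLEs of part (a), and converges to the stationary point characterized in part (b): $\hat\beta = \sum_{j\ge 2} y_j/\sum_{j\ge 2} x_j$ and $\hat\tau_1 = y_1/\hat\beta$.

```python
# EM for the model of Exercise 7.27: Y_i ~ Poisson(beta * tau_i),
# X_i ~ Poisson(tau_i), with x_1 unobserved. Data are hypothetical.

def em_727(y, x_obs, n_iter=200):
    """y: all of y_1..y_n; x_obs: the observed x_2..x_n (x_1 is missing)."""
    beta, tau1 = 1.0, 1.0                         # arbitrary starting values
    for _ in range(n_iter):
        x1_hat = tau1                             # E-step: E[X_1 | tau_1] = tau_1
        beta = sum(y) / (x1_hat + sum(x_obs))     # M-step: complete-data MLEs
        tau1 = (x1_hat + y[0]) / (1.0 + beta)     #         from part (a)
    taus = [tau1] + [(xj + yj) / (1.0 + beta) for xj, yj in zip(x_obs, y[1:])]
    return beta, taus

y = [3, 5, 2, 7]       # hypothetical y_1..y_4
x_obs = [4, 3, 6]      # hypothetical x_2..x_4
beta_hat, tau_hat = em_727(y, x_obs)

# Part (b): the stationary point satisfies
#   beta_hat = sum(y_2..y_n) / sum(x_2..x_n)  and  tau_1 = y_1 / beta_hat
print(beta_hat, sum(y[1:]) / sum(x_obs))   # both approximately 14/13
print(tau_hat[0], y[0] / beta_hat)
```

At the fixed point the second stationary equation forces $\hat\beta\hat\tau_1 = y_1$, which is exactly what the printed values confirm.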