Stat 411 Homework 05 Solutions

1. (a) The likelihood function looks like
\[
L(\theta_1, \theta_2) = \prod_{i=1}^n f(X_i) \propto \theta_2^{-n/2}\, e^{-(1/2\theta_2)\sum_{i=1}^n (\log X_i - \theta_1)^2}.
\]
Taking a natural logarithm gives
\[
\ell(\theta_1, \theta_2) = \mathrm{const} - \frac{n}{2}\log\theta_2 - \frac{1}{2\theta_2}\sum_{i=1}^n (\log X_i - \theta_1)^2.
\]
Differentiating with respect to \(\theta_1\) and \(\theta_2\) separately and setting equal to zero gives the likelihood equations:
\[
0 = \frac{\partial \ell}{\partial\theta_1}(\theta_1, \theta_2) = \frac{1}{\theta_2}\sum_{i=1}^n (\log X_i - \theta_1),
\qquad
0 = \frac{\partial \ell}{\partial\theta_2}(\theta_1, \theta_2) = -\frac{n}{2\theta_2} + \frac{1}{2\theta_2^2}\sum_{i=1}^n (\log X_i - \theta_1)^2.
\]
The solution to the first equation is \(\hat\theta_1 = \frac{1}{n}\sum_{i=1}^n \log X_i\), irrespective of the value of \(\theta_2\). With the solution \(\hat\theta_1\) plugged in to the second equation, the solution for \(\theta_2\) is clearly \(\hat\theta_2 = \frac{1}{n}\sum_{i=1}^n (\log X_i - \hat\theta_1)^2\). This is just as if we had originally made the transformation \(Y = \log X\) and done the calculations with \(Y_1, \dots, Y_n\) iid \(N(\theta_1, \theta_2)\); that is, the MLEs would be
\[
\hat\theta_1 = \bar Y \quad\text{and}\quad \hat\theta_2 = \frac{1}{n}\sum_{i=1}^n (Y_i - \bar Y)^2, \qquad Y_i = \log X_i,\quad i = 1, \dots, n.
\]

(b) Towards the Fisher information matrix, we need various derivatives of \(\log f(x)\):
\[
\frac{\partial}{\partial\theta_1}\log f(x) = \frac{\log x - \theta_1}{\theta_2},
\qquad
\frac{\partial}{\partial\theta_2}\log f(x) = -\frac{1}{2\theta_2} + \frac{(\log x - \theta_1)^2}{2\theta_2^2},
\]
\[
\frac{\partial^2}{\partial\theta_1^2}\log f(x) = -\frac{1}{\theta_2},
\qquad
\frac{\partial^2}{\partial\theta_2^2}\log f(x) = \frac{1}{2\theta_2^2} - \frac{(\log x - \theta_1)^2}{\theta_2^3},
\qquad
\frac{\partial^2}{\partial\theta_1\,\partial\theta_2}\log f(x) = -\frac{\log x - \theta_1}{\theta_2^2}.
\]
...
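The closed-form MLEs from part (a) can be checked numerically: draw \(X_i\) with \(\log X_i \sim N(\theta_1, \theta_2)\) and compute the sample mean and the \(1/n\)-scaled sample variance of the logs. This is a minimal illustrative sketch (not part of the original solution); the function name `lognormal_mles` and the simulation parameters are made up for the example.

```python
import math
import random

def lognormal_mles(xs):
    """MLEs for (theta1, theta2) when log X_i ~ N(theta1, theta2)."""
    ys = [math.log(x) for x in xs]
    n = len(ys)
    # MLE of theta1: sample mean of Y_i = log X_i
    theta1_hat = sum(ys) / n
    # MLE of theta2: 1/n (not 1/(n-1)) sum of squared deviations
    theta2_hat = sum((y - theta1_hat) ** 2 for y in ys) / n
    return theta1_hat, theta2_hat

# Simulate lognormal data with known parameters (values chosen for illustration)
random.seed(0)
theta1, theta2 = 1.0, 0.25
xs = [math.exp(random.gauss(theta1, math.sqrt(theta2))) for _ in range(100_000)]

t1, t2 = lognormal_mles(xs)
print(t1, t2)  # should be close to 1.0 and 0.25
```

With a large sample the estimates land near the true \((\theta_1, \theta_2)\), consistent with the log-transform argument in the solution: the problem reduces to ordinary normal MLEs for the \(Y_i\).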
This note was uploaded on 03/12/2012 for the course STAT 411 taught by Professor Staff during the Spring '08 term at Ill. Chicago.