HOMEWORK #3 SOLUTIONS, IE 121

7-19. For a Poisson random variable, $f(x) = \dfrac{e^{-\lambda}\lambda^{x}}{x!}$.

$$L(\lambda) = \prod_{i=1}^{n} \frac{e^{-\lambda}\lambda^{x_i}}{x_i!} = \frac{e^{-n\lambda}\,\lambda^{\sum_{i=1}^{n} x_i}}{\prod_{i=1}^{n} x_i!}$$

$$\ln L(\lambda) = -n\lambda + \sum_{i=1}^{n} x_i \ln\lambda - \sum_{i=1}^{n} \ln x_i!$$

$$\frac{d\ln L(\lambda)}{d\lambda} = -n + \frac{1}{\lambda}\sum_{i=1}^{n} x_i = 0$$

$$\hat{\lambda} = \frac{\sum_{i=1}^{n} x_i}{n} = \bar{x}$$
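As a quick numerical sanity check (not part of the original solution), the sketch below simulates Poisson data and confirms that the sample mean maximizes the log-likelihood derived above. The rate $\lambda = 3$, the sample size, and the helper names are illustrative assumptions.

```python
import math
import random

random.seed(1)

# ln L(lam) = -n*lam + sum(x)*ln(lam) - sum(ln x_i!), using ln x! = lgamma(x+1)
def poisson_log_lik(lam, xs):
    return (-len(xs) * lam
            + sum(xs) * math.log(lam)
            - sum(math.lgamma(x + 1) for x in xs))

# Draw one Poisson(lam) variate by inversion of the CDF.
def poisson_draw(lam):
    u, k, p = random.random(), 0, math.exp(-lam)
    c = p
    while u > c:
        k += 1
        p *= lam / k
        c += p
    return k

xs = [poisson_draw(3.0) for _ in range(500)]
lam_hat = sum(xs) / len(xs)  # the MLE from the derivation: the sample mean

# The log-likelihood at lam_hat should beat nearby values of lambda.
assert all(poisson_log_lik(lam_hat, xs) >= poisson_log_lik(lam_hat + d, xs)
           for d in (-0.1, -0.01, 0.01, 0.1))
print(lam_hat)  # close to the true rate 3.0
```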
7-26.
a) Because $\hat{a} \le a$ for every sample, $E(\hat{a})$ cannot equal $a$.
b) Yes: $E(\hat{a})$ is less than $a$ by the factor $\frac{n}{n+1}$. As $n \to \infty$, $\frac{n}{n+1} \to 1$ and $E(\hat{a}) \to a$.
c) $\frac{n+1}{n}\hat{a}$ is unbiased, because $E\!\left(\frac{n+1}{n}\hat{a}\right) = \frac{n+1}{n}E(\hat{a}) = \frac{n+1}{n}\cdot\frac{n}{n+1}\,a = a$.
d) $F_Y(y) = P(Y \le y) = P(X_1 \le y, X_2 \le y, \ldots, X_n \le y) = P(X_1 \le y)P(X_2 \le y)\cdots P(X_n \le y) = (y/a)^n$ for $0 \le y \le a$.

$$f_Y(y) = F_Y'(y) = \frac{n y^{n-1}}{a^n} \text{ for } 0 \le y \le a, \qquad 0 \text{ otherwise.}$$

The maximum likelihood estimator for $a$ is $Y = \max(X_i)$. To show that this MLE is biased, we show that $E(Y) \ne a$:

$$E(Y) = \int_0^a y\,\frac{n y^{n-1}}{a^n}\,dy = \frac{n y^{n+1}}{a^n (n+1)}\Big|_0^a = \frac{n}{n+1}\,a$$

Thus $E(Y) \ne a$, and the MLE of $a$ is biased.
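A short simulation (an illustrative addition, with $a = 5$ and $n = 4$ chosen arbitrarily) can confirm part (d): the sample maximum averages to $\frac{n}{n+1}a$, while rescaling by $\frac{n+1}{n}$ removes the bias.

```python
import random

random.seed(2)

# X_i ~ Uniform(0, a); the MLE of a is Y = max(X_i), with E(Y) = n*a/(n+1).
a, n, reps = 5.0, 4, 20000
ys = [max(random.uniform(0, a) for _ in range(n)) for _ in range(reps)]
mean_y = sum(ys) / reps

print(mean_y)                 # close to n*a/(n+1) = 4.0, i.e. below a = 5.0
print((n + 1) / n * mean_y)   # the rescaled estimator averages close to a = 5.0
```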
7-35. With $\sigma = 3.5$ psi and $n = 6$, $\sigma_{\bar{X}} = \dfrac{3.5}{\sqrt{6}} = 1.429$, and $\mu = 75.5$ psi.

$$P(\bar{X} \ge 75.75) = P\!\left(Z \ge \frac{75.75 - 75.5}{1.429}\right) = P(Z \ge 0.175) = 1 - P(Z \le 0.175) = 1 - 0.56945 = 0.43055$$

7-51. $X \sim N(50, 144)$
With $n = 36$, $\sigma_{\bar{X}} = \dfrac{12}{\sqrt{36}} = 2$:

$$P(47 \le \bar{X} \le 53) = P\!\left(\frac{47 - 50}{2} \le Z \le \frac{53 - 50}{2}\right) = P(-1.5 \le Z \le 1.5) = P(Z \le 1.5) - P(Z \le -1.5) = 0.9332 - 0.0668 = 0.8664$$
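Both normal-probability answers above can be checked numerically with the standard normal CDF, $\Phi(z) = \frac{1}{2}\left(1 + \operatorname{erf}(z/\sqrt{2})\right)$; this sketch is an added verification, not part of the original solutions.

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# 7-35: sigma_xbar = 3.5/sqrt(6), P(Xbar >= 75.75)
z35 = (75.75 - 75.5) / (3.5 / sqrt(6))
p35 = 1.0 - phi(z35)

# 7-51: sigma_xbar = 12/sqrt(36) = 2, P(47 <= Xbar <= 53)
p51 = phi((53 - 50) / 2.0) - phi((47 - 50) / 2.0)

print(p35)  # close to the 0.43055 computed above
print(p51)  # close to the 0.8664 computed above
```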
7-57.

$$V(\bar{X}) = V[a\bar{X}_1 + (1-a)\bar{X}_2] = a^2 V(\bar{X}_1) + (1-a)^2 V(\bar{X}_2) = a^2\frac{\sigma^2}{n_1} + (1 - 2a + a^2)\frac{\sigma^2}{n_2}$$

$$= \frac{a^2\sigma^2}{n_1} + \frac{\sigma^2}{n_2} - \frac{2a\sigma^2}{n_2} + \frac{a^2\sigma^2}{n_2} = \left(n_2 a^2 + n_1 - 2n_1 a + n_1 a^2\right)\frac{\sigma^2}{n_1 n_2}$$

Setting the derivative to zero:

$$\frac{dV(\bar{X})}{da} = \frac{\sigma^2}{n_1 n_2}\left(2n_2 a - 2n_1 + 2n_1 a\right) = 0$$

$$2a(n_2 + n_1) = 2n_1 \quad\Rightarrow\quad a = \frac{n_1}{n_1 + n_2}$$

7-70. For $V = k\sum_{i=1}^{n-1}(X_{i+1} - X_i)^2$ to be an unbiased estimator of $\sigma^2$:

$$E(V) = k\sum_{i=1}^{n-1}\left[E(X_{i+1}^2) + E(X_i^2) - 2E(X_i X_{i+1})\right] = k\sum_{i=1}^{n-1}\left(\sigma^2 + \mu^2 + \sigma^2 + \mu^2 - 2\mu^2\right) = k(n-1)\,2\sigma^2$$

Therefore, $E(V) = \sigma^2$ requires $k = \dfrac{1}{2(n-1)}$.
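The last two results can be verified numerically; the sketch below is an added check with arbitrarily chosen values ($n_1 = 10$, $n_2 = 30$, and a normal population with $\mu = 5$, $\sigma = 2$), not part of the original solutions.

```python
import random

random.seed(3)

# 7-57: V(Xbar) is proportional to a^2/n1 + (1-a)^2/n2, minimized
# at a = n1/(n1 + n2).
n1, n2 = 10, 30
v = lambda a: a * a / n1 + (1 - a) ** 2 / n2
a_star = n1 / (n1 + n2)  # 0.25 for these sample sizes
assert all(v(a_star) <= v(a_star + d) for d in (-0.1, -0.01, 0.01, 0.1))

# 7-70: V = k * sum (X_{i+1} - X_i)^2 with k = 1/(2(n-1)) should average
# to sigma^2 over many normal samples.
mu, sigma, n, reps = 5.0, 2.0, 8, 20000
k = 1.0 / (2 * (n - 1))
total = 0.0
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    total += k * sum((xs[i + 1] - xs[i]) ** 2 for i in range(n - 1))
est = total / reps
print(est)  # close to sigma^2 = 4.0
```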
Spring '08, Perevalov