p3soln10 - STAT 330 SOLUTIONS PART III

46.(a) Write $U_i = X_i - \bar{X}$ and $t_i = s_i - \bar{s} + \dfrac{s}{n}$. Note that
$$\sum_{i=1}^{n}\left(X_i-\bar{X}\right)=\sum_{i=1}^{n}X_i-n\bar{X}=0
\qquad\text{and}\qquad
\sum_{i=1}^{n}\left(s_i-\bar{s}\right)=0.$$
Therefore
$$\begin{aligned}
\sum_{i=1}^{n}t_iX_i
&=\sum_{i=1}^{n}\left(s_i-\bar{s}+\frac{s}{n}\right)\left(X_i-\bar{X}+\bar{X}\right)\\
&=\sum_{i=1}^{n}\left[s_i\left(X_i-\bar{X}\right)+\left(\frac{s}{n}-\bar{s}\right)\left(X_i-\bar{X}\right)+\left(s_i-\bar{s}\right)\bar{X}+\frac{s}{n}\bar{X}\right]\\
&=\sum_{i=1}^{n}s_iU_i+\left(\frac{s}{n}-\bar{s}\right)\sum_{i=1}^{n}\left(X_i-\bar{X}\right)+\bar{X}\sum_{i=1}^{n}\left(s_i-\bar{s}\right)+s\bar{X}\\
&=\sum_{i=1}^{n}s_iU_i+s\bar{X}. \qquad (1)
\end{aligned}$$
Also, since $X_1,X_2,\ldots,X_n$ are independent $N\!\left(\mu,\sigma^{2}\right)$ random variables,
$$E\left[\exp\left(\sum_{i=1}^{n}t_iX_i\right)\right]
=\prod_{i=1}^{n}E\left[\exp\left(t_iX_i\right)\right]
=\prod_{i=1}^{n}\exp\left(\mu t_i+\tfrac{1}{2}\sigma^{2}t_i^{2}\right)
=\exp\left(\mu\sum_{i=1}^{n}t_i+\tfrac{1}{2}\sigma^{2}\sum_{i=1}^{n}t_i^{2}\right). \qquad (2)$$
Therefore by (1) and (2),
$$E\left[\exp\left(\sum_{i=1}^{n}s_iU_i+s\bar{X}\right)\right]
=E\left[\exp\left(\sum_{i=1}^{n}t_iX_i\right)\right]
=\exp\left(\mu\sum_{i=1}^{n}t_i+\tfrac{1}{2}\sigma^{2}\sum_{i=1}^{n}t_i^{2}\right). \qquad (3)$$

46.(b)
$$\sum_{i=1}^{n}t_i=\sum_{i=1}^{n}\left(s_i-\bar{s}+\frac{s}{n}\right)=\sum_{i=1}^{n}\left(s_i-\bar{s}\right)+n\cdot\frac{s}{n}=0+s=s \qquad (4)$$
and
$$\begin{aligned}
\sum_{i=1}^{n}t_i^{2}
&=\sum_{i=1}^{n}\left(s_i-\bar{s}+\frac{s}{n}\right)^{2}\\
&=\sum_{i=1}^{n}\left(s_i-\bar{s}\right)^{2}+2\cdot\frac{s}{n}\sum_{i=1}^{n}\left(s_i-\bar{s}\right)+\sum_{i=1}^{n}\left(\frac{s}{n}\right)^{2}\\
&=\sum_{i=1}^{n}\left(s_i-\bar{s}\right)^{2}+0+\frac{s^{2}}{n}
=\sum_{i=1}^{n}\left(s_i-\bar{s}\right)^{2}+\frac{s^{2}}{n}. \qquad (5)
\end{aligned}$$
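Identities (4) and (5) are purely algebraic, so they can be verified numerically. The sketch below (not part of the solution; the test values for the $s_i$ and $s$ are arbitrary) checks both identities for a random choice of inputs:

```python
import numpy as np

# With t_i = s_i - s_bar + s/n, identities (4) and (5) say:
#   sum(t_i)   = s
#   sum(t_i^2) = sum((s_i - s_bar)^2) + s^2/n
rng = np.random.default_rng(0)
n = 7
s_vec = rng.normal(size=n)   # arbitrary s_1, ..., s_n
s = 2.5                      # arbitrary scalar s
s_bar = s_vec.mean()
t = s_vec - s_bar + s / n

assert np.isclose(t.sum(), s)                                           # identity (4)
assert np.isclose((t**2).sum(), ((s_vec - s_bar)**2).sum() + s**2 / n)  # identity (5)
```

Because both identities hold for every choice of $s_1,\ldots,s_n$ and $s$, any seed and any $n$ will pass.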
46.(c)
$$\begin{aligned}
M(s_1,\ldots,s_n,s)
&=E\left[\exp\left(\sum_{i=1}^{n}s_iU_i+s\bar{X}\right)\right]
=E\left[\exp\left(\sum_{i=1}^{n}t_iX_i\right)\right]\\
&=\exp\left(\mu\sum_{i=1}^{n}t_i+\frac{\sigma^{2}}{2}\sum_{i=1}^{n}t_i^{2}\right) \quad\text{by (3)}\\
&=\exp\left\{\mu s+\frac{1}{2}\sigma^{2}\left[\sum_{i=1}^{n}\left(s_i-\bar{s}\right)^{2}+\frac{s^{2}}{n}\right]\right\} \quad\text{by (4) and (5)}\\
&=\exp\left[\mu s+\frac{1}{2}\sigma^{2}\cdot\frac{s^{2}}{n}\right]\exp\left[\frac{1}{2}\sigma^{2}\sum_{i=1}^{n}\left(s_i-\bar{s}\right)^{2}\right]
\end{aligned}$$

46.(d) Since
$$M_{\bar{X}}(s)=M(0,\ldots,0,s)=\exp\left[\mu s+\frac{1}{2}\sigma^{2}\cdot\frac{s^{2}}{n}\right]$$
and
$$M_{U}(s_1,\ldots,s_n)=M(s_1,\ldots,s_n,0)=\exp\left[\frac{1}{2}\sigma^{2}\sum_{i=1}^{n}\left(s_i-\bar{s}\right)^{2}\right],$$
we have
$$M(s_1,\ldots,s_n,s)=M_{\bar{X}}(s)\,M_{U}(s_1,\ldots,s_n).$$
By the Independence Theorem for m.g.f.'s, $\bar{X}$ and $U=(U_1,\ldots,U_n)$ are independent random variables, and by Corollary 3.4.3, $\bar{X}$ and $\sum_{i=1}^{n}U_i^{2}=\sum_{i=1}^{n}\left(X_i-\bar{X}\right)^{2}$ are independent random variables.
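The conclusion of 46.(d), that $\bar{X}$ and $\sum_{i=1}^{n}(X_i-\bar{X})^{2}$ are independent for normal samples, can be illustrated by simulation. The following sketch (not part of the solution; the values of $\mu$, $\sigma$, $n$, and the replication count are arbitrary) checks that the empirical correlation between the two statistics is near zero:

```python
import numpy as np

# Simulate many N(mu, sigma^2) samples of size n; independence of Xbar and
# sum((X_i - Xbar)^2) implies their correlation across replications is 0.
rng = np.random.default_rng(1)
mu, sigma, n, reps = 3.0, 2.0, 10, 200_000
X = rng.normal(mu, sigma, size=(reps, n))
xbar = X.mean(axis=1)
ss = ((X - xbar[:, None])**2).sum(axis=1)   # sum of U_i^2 per sample
corr = np.corrcoef(xbar, ss)[0, 1]
print(corr)   # empirical correlation, close to 0
assert abs(corr) < 0.01
```

Zero correlation is of course weaker than independence, but for this normal-theory result the simulation is a useful quick check; for non-normal data the two statistics are generally dependent and the check would fail.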
47.
$$\begin{aligned}
G_n(y)&=P(Y_n\le y)=P\left(\min(X_1,\ldots,X_n)\le y\right)\\
&=1-P(X_1>y,\ldots,X_n>y)\\
&=1-\prod_{i=1}^{n}P(X_i>y) \quad\text{since the $X_i$'s are independent random variables}\\
&=1-\left[\int_{y}^{\infty}e^{-(x-\theta)}\,dx\right]^{n}\\
&=1-e^{-n(y-\theta)}, \qquad y>\theta. \qquad (6)
\end{aligned}$$
Since
$$\lim_{n\to\infty}G_n(y)=\begin{cases}1 & \text{if } y>\theta\\[2pt] 0 & \text{if } y<\theta,\end{cases}$$
therefore $Y_n\xrightarrow{p}\theta$, and by the Limit Theorems $U_n=Y_n/\theta\xrightarrow{p}1$.

Now
$$P(V_n\le v)=P\left(n(Y_n-\theta)\le v\right)=P\left(Y_n\le \frac{v}{n}+\theta\right)
=1-e^{-n(v/n+\theta-\theta)} \quad\text{using (6)}
=1-e^{-v}, \qquad v\ge 0,$$
which is the c.d.f. of an EXP(1) random variable. Therefore $V_n\sim\text{EXP}(1)$ for $n=1,2,\ldots$, which implies $V_n\xrightarrow{D}V\sim\text{EXP}(1)$.

Since
$$P(W_n\le w)=P\left(n^{2}(Y_n-\theta)\le w\right)=P\left(Y_n\le \frac{w}{n^{2}}+\theta\right)
=1-e^{-n(w/n^{2}+\theta-\theta)}=1-e^{-w/n}, \qquad w\ge 0,$$
therefore $\lim_{n\to\infty}P(W_n\le w)=0$ for all $w<\infty$, which is not a c.d.f. Therefore $W_n$ has no limiting distribution.
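The result that $V_n=n(Y_n-\theta)$ is exactly EXP(1) for every $n$ can be checked by simulation. The sketch below (not part of the solution; $\theta$, $n$, and the replication count are arbitrary) simulates minima of shifted exponentials and compares the empirical mean and variance of $V_n$ to the EXP(1) values, both equal to 1:

```python
import numpy as np

# Y_n = min of n draws from the density e^{-(x - theta)}, x > theta
# (a shifted unit exponential). Then V_n = n(Y_n - theta) ~ EXP(1) exactly.
rng = np.random.default_rng(2)
theta, n, reps = 1.5, 8, 200_000
X = theta + rng.exponential(1.0, size=(reps, n))
V = n * (X.min(axis=1) - theta)
print(V.mean(), V.var())   # both close to 1, since EXP(1) has mean = variance = 1
assert abs(V.mean() - 1) < 0.02
assert abs(V.var() - 1) < 0.05
```

Since the distribution of $V_n$ does not depend on $n$ at all, any sample size $n$ gives the same result, which is exactly why the convergence in distribution is immediate.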
48. We first note that
$$\begin{aligned}
P(Y_n\le y)&=P\left(\max(X_1,\ldots,X_n)\le y\right)=P(X_1\le y,\ldots,X_n\le y)\\
&=\prod_{i=1}^{n}P(X_i\le y) \quad\text{since the $X_i$'s are independent random variables}\\
&=\prod_{i=1}^{n}F(y)=\left[F(y)\right]^{n}, \qquad y<\infty. \qquad (7)
\end{aligned}$$
Since $F$ is a c.d.f., $F$ takes on values between 0 and 1. Therefore the function $n\left[1-F(\cdot)\right]$ takes on values between 0 and $n$. $G_n(z)=P(Z_n\le z)$, the c.d.f.
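Equation (7) is easy to check numerically with Uniform(0,1) observations, where $F(y)=y$ and hence $P(Y_n\le y)=y^{n}$. The sketch below (not part of the solution; $n$, $y$, and the replication count are arbitrary) compares the empirical probability to $[F(y)]^n$:

```python
import numpy as np

# Check (7) for Uniform(0,1): F(y) = y, so P(max(X_1,...,X_n) <= y) = y^n.
rng = np.random.default_rng(3)
n, reps, y = 5, 200_000, 0.8
maxima = rng.uniform(size=(reps, n)).max(axis=1)
emp = (maxima <= y).mean()
print(emp, y**n)   # empirical probability vs F(y)^n = 0.8**5 = 0.32768
assert abs(emp - y**n) < 0.01
```

The same comparison works for any continuous $F$ by replacing the uniform draws and evaluating $F(y)^n$ directly.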

This note was uploaded on 07/18/2011 for the course STAT 330 taught by Professor Paula Smith during the Spring '08 term at Waterloo.
