HW11b

1. Let $X_n \sim \mathrm{Beta}(n, n)$, $n = 1, 2, 3, \dots$. Show that $X_n \xrightarrow{P} 1/2$ [i.e., show that $P(|X_n - 1/2| > \epsilon) \to 0$ as $n \to \infty$ for every $\epsilon > 0$. Use Chebyshev's inequality.] (See the sketch for Problem 1 after the problem list.)

2. Slutsky's theorem states the following: if $X_n \xrightarrow{D} X$ and $Y_n \xrightarrow{P} a$, where $a \neq 0$ is a constant, then $X_n / Y_n \xrightarrow{D} X/a$ (here $Z_n \xrightarrow{D} Z$ means that $P(Z_n \le z) \to P(Z \le z)$ for every $z \in (-\infty, \infty)$ at which the cdf of $Z$ is continuous). Use this to prove that for $X_n \sim \mathrm{Beta}(n, n)$ one has $\sqrt{n}\,(X_n - 1/2) \xrightarrow{D} \mathrm{Normal}(0, \sigma^2)$. Find the value of $\sigma^2$. [Hint: Use the fact that $X_n = W_n / (W_n + V_n)$, where $W_n$ and $V_n$ are independent $\mathrm{Gamma}(n, 1)$ variables, each of which can be written as the sum of $n$ independent exponential random variables!] (See the sketch for Problem 2 after the problem list.)

3. Let $X_1, X_2, \dots, X_n \sim \mathrm{Uniform}(0, \theta)$, where $\theta > 0$. Can you give formulas for $L_n = L(X_1, \dots, X_n)$ and $U_n = U(X_1, \dots, X_n)$ such that $P(L_n < \theta < U_n) = 0.95$? [Hint: Look at the distribution of $M_n = \max(X_1, \dots, X_n)$; the central limit theorem is of NO help here!] (See the sketch for Problem 3 after the problem list.)
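For Problem 1, a minimal numerical sketch (not part of the assignment, and not a substitute for the Chebyshev argument): it evaluates $P(|X_n - 1/2| > \epsilon)$ exactly from the Beta cdf and prints the Chebyshev bound $\mathrm{Var}(X_n)/\epsilon^2$ alongside it. The value of $\epsilon$, the grid of $n$, and the use of scipy are illustrative assumptions.

```python
# Sketch for Problem 1: how P(|X_n - 1/2| > eps) behaves as n grows.
# eps and the n-grid are arbitrary illustrative choices.
from scipy.stats import beta

eps = 0.05  # any fixed epsilon > 0 would do

for n in [1, 10, 100, 1000, 10000]:
    dist = beta(n, n)                               # X_n ~ Beta(n, n)
    # exact two-sided tail probability around 1/2
    tail = dist.cdf(0.5 - eps) + dist.sf(0.5 + eps)
    # Chebyshev bound P(|X_n - E X_n| > eps) <= Var(X_n) / eps^2, with E X_n = 1/2 by symmetry
    bound = dist.var() / eps**2
    print(f"n={n:6d}  exact tail={tail:.3e}  Chebyshev bound={bound:.3e}")
```

Both columns should shrink toward 0 as $n$ grows, which is exactly the convergence in probability you are asked to prove.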
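For Problem 2, a simulation sketch of the hint's construction: it builds $X_n = W_n/(W_n + V_n)$ from sums of independent Exponential(1) variables and reports the sample mean and variance of $\sqrt{n}\,(X_n - 1/2)$, which you can compare with the $\sigma^2$ you derive. The choices of $n$, the number of replications, and the seed are arbitrary.

```python
# Sketch for Problem 2: simulate sqrt(n) * (X_n - 1/2) using the hint's representation.
# n, reps, and the seed are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 50_000

# W_n and V_n: sums of n independent Exponential(1) variables, i.e. Gamma(n, 1)
W = rng.exponential(scale=1.0, size=(reps, n)).sum(axis=1)
V = rng.exponential(scale=1.0, size=(reps, n)).sum(axis=1)

X = W / (W + V)               # X_n ~ Beta(n, n)
Z = np.sqrt(n) * (X - 0.5)    # approximately Normal(0, sigma^2) for large n

print("sample mean    :", Z.mean())   # should be close to 0
print("sample variance:", Z.var())    # compare with the sigma^2 you derive
```

A histogram of `Z` against a normal density with that sample variance is a quick visual check of the limiting normality.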
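For Problem 3, the sketch below only explores the hint rather than giving $L_n$ and $U_n$: it simulates $M_n = \max(X_1, \dots, X_n)$ for $\mathrm{Uniform}(0, \theta)$ data and compares the empirical cdf of $M_n/\theta$ at a few points with $t^n$, the candidate cdf suggested by independence. The values of $\theta$, $n$, the replication count, and the check points are arbitrary.

```python
# Sketch for Problem 3: the distribution of M_n = max(X_1, ..., X_n) under Uniform(0, theta).
# theta, n, reps, and the check points are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 3.0, 20, 200_000

samples = rng.uniform(0.0, theta, size=(reps, n))
M = samples.max(axis=1)                     # one M_n per replication

for t in [0.80, 0.90, 0.95, 0.99]:
    empirical = np.mean(M / theta <= t)     # empirical P(M_n / theta <= t)
    candidate = t**n                        # prod_i P(X_i <= t * theta) by independence
    print(f"t={t:.2f}  empirical={empirical:.4f}  t^n={candidate:.4f}")
```

Pinning down the distribution of $M_n/\theta$ is the step the hint points to.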

