Statistics 265: Elements of Probability Theory
Fall Term 2000, Assignment 6
Due: Tuesday, December 5, 2000
Solutions

6.2 Let Y be a random variable with a density function given by

    f(y) = (3/2) y²  for -1 ≤ y ≤ 1,  and 0 otherwise.

a. Find the density function of U1 = 3Y.
b. Find the density function of U2 = 3 - Y.
c. Find the density function of U3 = Y².

Solution: The distribution function of Y is

    F(y) = P(Y ≤ y) = ∫_{-∞}^{y} f(t) dt = ∫_{-1}^{y} (3/2) t² dt = (1/2)(1 + y³)  for -1 ≤ y ≤ 1.

a. If U1 = 3Y, then

    F_{U1}(u) = P(3Y ≤ u) = P(Y ≤ u/3) = F(u/3) = (1/2)(1 + u³/27)  for -3 ≤ u ≤ 3.

Differentiating, we have

    f_{U1}(u) = u²/18  for -3 ≤ u ≤ 3,  and 0 otherwise.

b. If U2 = 3 - Y, then

    F_{U2}(u) = P(3 - Y ≤ u) = P(Y ≥ 3 - u) = 1 - P(Y < 3 - u) = 1 - F(3 - u) = (1/2)[1 - (3 - u)³]

for -1 ≤ 3 - u ≤ 1, that is, 2 ≤ u ≤ 4. Differentiating, we have

    f_{U2}(u) = (3/2)(3 - u)²  for 2 ≤ u ≤ 4,  and 0 otherwise.

c. If U3 = Y², then

    F_{U3}(u) = P(Y² ≤ u) = P(-√u ≤ Y ≤ √u) = P(Y ≤ √u) - P(Y < -√u)
              = F(√u) - F(-√u) = (1/2)(1 + u^{3/2}) - (1/2)(1 - u^{3/2}) = u^{3/2}

for 0 ≤ u ≤ 1. Differentiating, we have

    f_{U3}(u) = (3/2) u^{1/2}  for 0 ≤ u ≤ 1,  and 0 otherwise.

6.8 The total time from arrival to completion of service at a fast-food outlet, Y1, and the time spent waiting in line before arriving at the service window, Y2, were given in Exercise 5.9 with joint density function

    f(y1, y2) = e^{-y1}  for 0 ≤ y2 ≤ y1 < ∞,  and 0 otherwise.

Another random variable of interest is U = Y1 - Y2, the time spent at the service window.
a. Find the probability density function for U.
b. Find E(U) and V(U). Compare your answers with the results of Exercise 5.68.

Solution:
a. [Figure: the event y1 - y2 ≤ u, drawn over the region where y1 and y2 are positive and y2 ≤ y1, is the strip between the lines y1 = y2 and y1 = y2 + u.]

The distribution function of U = Y1 - Y2 is

    F_U(u) = P(Y1 - Y2 ≤ u) = ∫_0^∞ ∫_{y2}^{y2+u} e^{-y1} dy1 dy2 = ∫_0^∞ [e^{-y2} - e^{-(y2+u)}] dy2 = 1 - e^{-u}

for 0 ≤ u < ∞. Differentiating, we have

    f_U(u) = e^{-u}  for 0 ≤ u < ∞,  and 0 otherwise.

b. The random variable U = Y1 - Y2 has a gamma distribution with parameters α = 1 and β = 1. Therefore, E(U) = αβ = 1 and V(U) = αβ² = 1.

6.12 In Exercise 4.7, we determined that

    f(y) = b/y²  for y ≥ b,  and 0 otherwise,

is a bona fide probability density function for a random variable Y. Assuming b is a known constant and U has a uniform distribution on the interval [0, 1], transform U to obtain a random variable with the same distribution as Y.

Solution: The distribution function of Y is given by

    F_Y(y) = ∫_{-∞}^{y} f(t) dt = ∫_b^y (b/t²) dt = 1 - b/y  for y ≥ b.

Since U has a uniform distribution on [0, 1], P(U ≤ u) = u for 0 ≤ u ≤ 1, and we want to find a function G such that Y = G(U) with F_Y(y) = 1 - b/y for y ≥ b. Now,

    F_Y(y) = P(Y ≤ y) = P(G(U) ≤ y) = P(U ≤ G^{-1}(y)) = P(U ≤ u) = u  for 0 ≤ u ≤ 1,

since U has a uniform distribution on [0, 1]. Therefore, we want

    u = 1 - b/y,  or  y = b/(1 - u),

that is, Y = G(U) = b/(1 - U).

6.16 Let the random variable Y possess a uniform distribution on the interval [0, 1].
a. Derive the distribution of the random variable W = Y².
b. Derive the distribution of the random variable W = √Y.

Solution:
a. If W = Y², then

    F_W(w) = P(W ≤ w) = P(Y² ≤ w) = P(-√w ≤ Y ≤ √w) = P(Y ≤ √w) = ∫_0^{√w} 1 dy = √w

for 0 ≤ w ≤ 1. Differentiating, the density function of W = Y² is

    f_W(w) = (1/2) w^{-1/2}  for 0 < w < 1,  and 0 otherwise.

b. If W = √Y, then

    F_W(w) = P(W ≤ w) = P(√Y ≤ w) = P(Y ≤ w²) = ∫_0^{w²} 1 dy = w²

for 0 ≤ w ≤ 1. Differentiating, the density function of W = √Y is

    f_W(w) = 2w  for 0 < w < 1,  and 0 otherwise.
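The transformations in Exercises 6.2-6.16 are easy to spot-check numerically. Below is a minimal simulation sketch, not part of the original assignment, for the inverse-transform result of Exercise 6.12, Y = G(U) = b/(1 - U); the value b = 2, the sample size, and the random seed are arbitrary choices. The empirical distribution function of the transformed sample should agree with F_Y(y) = 1 - b/y.

# Simulation sketch (not part of the original solutions): checks the
# inverse-transform result of Exercise 6.12, Y = G(U) = b / (1 - U).
# The constant b = 2.0, the sample size, and the seed are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
b = 2.0
u = rng.uniform(0.0, 1.0, size=200_000)
y = b / (1.0 - u)                      # transformed sample

# Compare the empirical CDF with F_Y(y) = 1 - b/y at a few points y >= b.
for y0 in (2.5, 4.0, 10.0):
    empirical = np.mean(y <= y0)
    theoretical = 1.0 - b / y0
    print(f"y = {y0:5.1f}: empirical {empirical:.4f}, theoretical {theoretical:.4f}")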
6.22 The Weibull density function is given by

    f(y) = (1/α) m y^{m-1} e^{-y^m/α}  for 0 < y < ∞,  and 0 otherwise,

where α and m are positive constants. This density function is often used as a model for the lengths of life of physical systems. Suppose Y has the Weibull distribution just given.
a. Find the density function of U = Y^m.
b. Find E(Y^k) for any positive integer k.

Solution:
a. Let U = Y^m. Using the transformation approach, we have Y = U^{1/m} and

    dy/du = (1/m) u^{(1/m) - 1} = (1/m) u^{-(m-1)/m},

so that

    g_U(u) = (1/α) m (u^{1/m})^{m-1} e^{-u/α} · (1/m) u^{-(m-1)/m} = (1/α) e^{-u/α}  for u > 0;

that is, U has an exponential distribution with mean α.

b. If Y has the Weibull distribution given above and k is a positive integer, then

    E(Y^k) = E(U^{k/m}) = ∫_0^∞ u^{k/m} (1/α) e^{-u/α} du = (1/α) Γ(k/m + 1) α^{k/m + 1} = Γ(k/m + 1) α^{k/m}.

Note that the integrand is the density (except for constants) of a gamma variable with parameters k/m + 1 and α, so the integration can be done by choosing the necessary constants.

6.24 Let Y have a uniform [0, 1] distribution. Show that U = -2 ln Y has an exponential distribution with mean 2.

Solution: The density function of Y is given by f(y) = 1 for 0 ≤ y ≤ 1, and since u = -2 ln y, we have y = e^{-u/2}. Now

    dy/du = -(1/2) e^{-u/2},

so that

    f_U(u) = f(y) |dy/du| = 1 · (1/2) e^{-u/2} = (1/2) e^{-u/2}  for u > 0,

which is the density function of an exponential distribution with mean β = 2.

6.25 The speed of a molecule in a gas at equilibrium is a random variable V whose density function is given by

    f(v) = a v² e^{-b v²}  for v > 0,

where b = m/(2kT), and k, T, and m denote Boltzmann's constant, the absolute temperature, and the mass of the molecule, respectively.
a. Derive the distribution of W = (1/2) m V², the kinetic energy of the molecule.
b. Find E(W).

Solution:
a. If W = mV²/2, then V = √(2W/m) and

    dv/dw = (1/2) √(2/m) w^{-1/2} = 1/√(2mw).

Therefore,

    f_W(w) = a (2w/m) e^{-b(2w/m)} · 1/√(2mw) = (a√2 / m^{3/2}) w^{1/2} e^{-w/(kT)},

and since this density must integrate to 1, and since the variable part of the density is that of a gamma variable with α = 3/2 and β = kT, the constant a must be chosen so that

    a√2 / m^{3/2} = 1 / [Γ(3/2) (kT)^{3/2}],

and the density is

    f_W(w) = w^{1/2} e^{-w/(kT)} / [Γ(3/2) (kT)^{3/2}]  for w ≥ 0.

b. For a gamma-type random variable, E(W) = αβ = (3/2) kT.

6.32 Suppose that Y1 and Y2 are independent, standard normal random variables. Find the density function of U = Y1² + Y2².

Solution: Since Y1 and Y2 are independent standard normal random variables, the moment-generating functions of Y1² and Y2² are

    m_{Y1²}(t) = (1 - 2t)^{-1/2}  and  m_{Y2²}(t) = (1 - 2t)^{-1/2},

and therefore

    m_U(t) = m_{Y1²}(t) · m_{Y2²}(t) = (1 - 2t)^{-1},

which is the moment-generating function of a gamma random variable with α = 1 and β = 2. By the uniqueness theorem, U has a gamma distribution with α = 1 and β = 2; equivalently, U has a χ² distribution with 2 degrees of freedom.

6.45 Show that, if Y1 has a χ² distribution with ν1 degrees of freedom and Y2 has a χ² distribution with ν2 degrees of freedom, then U = Y1 + Y2 has a χ² distribution with ν1 + ν2 degrees of freedom, provided that Y1 and Y2 are independent.

Solution: From Example 4.13, with α = νi/2 and β = 2, the moment-generating functions are

    m_{Yi}(t) = (1 - 2t)^{-νi/2}  for i = 1, 2.

Since Y1 and Y2 are independent,

    m_U(t) = m_{Y1}(t) · m_{Y2}(t) = (1 - 2t)^{-(ν1 + ν2)/2},

which is the moment-generating function of a χ² random variable with ν1 + ν2 degrees of freedom. The result now follows by the uniqueness theorem for moment-generating functions.
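As a sanity check on Exercise 6.32 (and, implicitly, 6.45), the following simulation sketch, which is not part of the original solutions, draws two independent standard normal samples and verifies that U = Y1² + Y2² behaves like a gamma variable with α = 1 and β = 2, i.e. a χ² variable with 2 degrees of freedom; the sample size and seed are arbitrary choices.

# Simulation sketch (not in the original solutions): checks Exercise 6.32,
# that U = Y1^2 + Y2^2 for independent standard normals Y1, Y2 behaves like
# a gamma(alpha = 1, beta = 2) variable, i.e. chi-square with 2 df.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
y1 = rng.standard_normal(n)
y2 = rng.standard_normal(n)
u = y1**2 + y2**2

# A gamma(1, 2) variable has mean 2, variance 4, and CDF 1 - exp(-u/2).
print("sample mean    :", u.mean(), "(theory: 2)")
print("sample variance:", u.var(), "(theory: 4)")
for u0 in (1.0, 2.0, 5.0):
    print(f"P(U <= {u0}): empirical {np.mean(u <= u0):.4f}, "
          f"theoretical {1 - np.exp(-u0 / 2):.4f}")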
6.46 Let Y1 and Y2 be independent normal random variables, each with mean 0 and variance σ². Define U1 = Y1 + Y2 and U2 = Y1 - Y2. Show that U1 and U2 are independent normal random variables, each with mean 0 and variance 2σ². Hint: If (U1, U2) has a joint moment-generating function m(t1, t2), then U1 and U2 are independent if and only if m(t1, t2) = m_{U1}(t1) · m_{U2}(t2).

Solution: We have

    m(t1, t2) = E[e^{t1(Y1 + Y2) + t2(Y1 - Y2)}] = E[e^{Y1(t1 + t2) + Y2(t1 - t2)}]
              = E[e^{(t1 + t2)Y1}] · E[e^{(t1 - t2)Y2}] = m_{Y1}(t1 + t2) · m_{Y2}(t1 - t2)
              = e^{(σ²/2)(t1 + t2)²} · e^{(σ²/2)(t1 - t2)²} = e^{σ² t1²} · e^{σ² t2²}
              = m_{U1}(t1) · m_{U2}(t2),

where the third equality uses the independence of Y1 and Y2. Since the joint moment-generating function factors into the product of the marginal moment-generating functions, each of which is the moment-generating function of a normal random variable with mean 0 and variance 2σ², the random variables U1 and U2 are independent normal random variables, each with mean 0 and variance 2σ².
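A quick numerical illustration of Exercise 6.46, again not part of the original solutions: for an arbitrarily chosen σ, the simulated U1 = Y1 + Y2 and U2 = Y1 - Y2 should each have sample variance near 2σ² and sample correlation near 0; since (U1, U2) is jointly normal, zero correlation corresponds to independence.

# Simulation sketch (not in the original solutions): checks Exercise 6.46 for
# an arbitrary choice of sigma. U1 = Y1 + Y2 and U2 = Y1 - Y2 should each have
# mean 0 and variance 2*sigma^2, and their sample correlation should be near 0.
import numpy as np

rng = np.random.default_rng(2)
sigma = 1.5                       # assumed value, not from the assignment
n = 200_000
y1 = rng.normal(0.0, sigma, size=n)
y2 = rng.normal(0.0, sigma, size=n)
u1, u2 = y1 + y2, y1 - y2

print("Var(U1):", u1.var(), " Var(U2):", u2.var(), "(theory:", 2 * sigma**2, ")")
print("Corr(U1, U2):", np.corrcoef(u1, u2)[0, 1], "(theory: 0)")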