(b) Using the chain rule, we have

\[
E[Y] = \frac{d}{ds} M_Y(s) \bigg|_{s=0}
= \frac{d}{ds} M_X(s) \bigg|_{s=0} \cdot \lambda e^{\lambda (M_X(s) - 1)} \bigg|_{s=0}
= \frac{1}{2} \cdot \lambda = \frac{\lambda}{2},
\]

where we have used the fact that M_X(0) = 1.

(c) From the law of iterated expectations we obtain

\[
E[Y] = E\big[ E[Y \mid N] \big] = E\big[ N E[X] \big] = E[N] E[X] = \frac{\lambda}{2}.
\]

Solution to Problem 4.42. Take X and Y to be normal with means 1 and 2, respectively, and very small variances. Consider the random variable that takes the value of X with some probability p and the value of Y with probability 1 - p. This random variable takes values near 1 and 2 with relatively high probability, but takes values near its mean (which is 2 - p) with relatively low probability. Thus, this random variable is not normal.

Now let N be a random variable taking only the values 1 and 2 with probabilities p and 1 - p, respectively. The sum of a number N of independent normal random variables with mean equal to 1 and very small variance is a mixture of the type discussed above, which is not normal.

Solution to Problem 4.43. (a) Using the total probability theorem, we have

\[
P(X > 4) = \sum_{k=0}^{4} P(k \text{ lights are red}) \, P(X > 4 \mid k \text{ lights are red}).
\]

We have

\[
P(k \text{ lights are red}) = \binom{4}{k} \left( \frac{1}{2} \right)^4 .
\]

The conditional PDF of X, given that k lights are red, is normal with mean k minutes and standard deviation (1/2)√k. Thus, X is a mixture of normal random variables, and the transform associated with its (unconditional) PDF is the corresponding mixture of the transforms associated with the (conditional) normal PDFs. However, X is not normal, because a mixture of normal PDFs need not be normal. The probability P(X > 4 | k lights are red) can be computed from the normal tables for each k, and P(X > 4) is obtained by substituting the results in the total probability formula above.

(b) Let K be the number of traffic lights that are found to be red. We can view X as the sum of K independent normal random variables. Thus the transform associated with X can be found by replacing, in the binomial transform M_K(s) = (1/2 + (1/2) e^s)^4, each occurrence of e^s with the normal transform corresponding to μ = 1 and σ = 1/2. Thus

\[
M_X(s) = \left( \frac{1}{2} + \frac{1}{2} \, e^{\frac{(1/2)^2 s^2}{2} + s} \right)^4 .
\]

Note that by using the formula for the transform, we cannot easily obtain the probability P(X > 4).
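The chain-rule and iterated-expectations calculations in (b) and (c) both give E[Y] = λ/2 for a random sum Y = X_1 + ... + X_N with N Poisson with parameter λ and E[X] = 1/2. As a quick numerical sanity check, here is a minimal Monte Carlo sketch; the particular choices X_i uniform on [0, 1] (so that E[X] = 1/2) and λ = 3 are illustrative assumptions, not part of the original problem.

    import numpy as np

    # Illustrative assumptions (not from the problem statement):
    # N ~ Poisson(lam), and each X_i ~ Uniform(0, 1) so that E[X] = 1/2.
    rng = np.random.default_rng(0)
    lam = 3.0
    trials = 200_000

    n = rng.poisson(lam, size=trials)                   # number of terms in each random sum
    y = np.array([rng.uniform(0.0, 1.0, size=k).sum() for k in n])

    print("Monte Carlo E[Y]:", y.mean())                # should be close to lam / 2
    print("Formula lam / 2 :", lam / 2)

With these illustrative parameters the empirical mean should land close to λ/2 = 1.5, matching both derivations.
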
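The mixture argument in Problem 4.42 can also be seen numerically: the sketch below samples the two-component mixture and checks how much probability mass sits near the component means 1 and 2 versus near the overall mean 2 - p. The values p = 0.4, the common standard deviation 0.05, and the window width are illustrative assumptions only.

    import numpy as np

    rng = np.random.default_rng(1)
    p, sigma, trials = 0.4, 0.05, 100_000     # illustrative choices, not from the problem

    # With probability p draw from N(1, sigma^2), otherwise from N(2, sigma^2).
    use_first = rng.random(trials) < p
    samples = np.where(use_first,
                       rng.normal(1.0, sigma, trials),
                       rng.normal(2.0, sigma, trials))

    mixture_mean = 2.0 - p                    # p * 1 + (1 - p) * 2
    window = 0.1
    print("P(|mix - 1|    < 0.1):", np.mean(np.abs(samples - 1.0) < window))
    print("P(|mix - 2|    < 0.1):", np.mean(np.abs(samples - 2.0) < window))
    print("P(|mix - mean| < 0.1):", np.mean(np.abs(samples - mixture_mean) < window))
    # A normal random variable concentrates mass near its mean; this mixture
    # puts almost none there, so it cannot be normal.
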

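For Problem 4.43(a), the total probability formula can be evaluated numerically instead of with normal tables. The sketch below follows the solution's conditional model: given k red lights, X is normal with mean k and standard deviation (1/2)√k, and X = 0 when k = 0 (a degenerate case handled separately); the standard normal tail is computed with the complementary error function. Treat it as an illustrative check rather than part of the original solution.

    import math

    def normal_tail(z):
        # P(Z > z) for a standard normal Z, via the complementary error function.
        return 0.5 * math.erfc(z / math.sqrt(2.0))

    total = 0.0
    for k in range(5):
        p_k = math.comb(4, k) * 0.5 ** 4               # P(k lights are red)
        if k == 0:
            tail = 0.0                                  # no red lights: X = 0 < 4
        else:
            mean, std = float(k), 0.5 * math.sqrt(k)    # X | k red ~ N(k, ((1/2) sqrt(k))^2)
            tail = normal_tail((4.0 - mean) / std)      # P(X > 4 | k lights are red)
        total += p_k * tail

    print("P(X > 4) =", total)

The dominant term comes from k = 4, which alone contributes (1/16)(1/2) ≈ 0.031.
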

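A quick consistency check on the transform in Problem 4.43(b): its derivative at s = 0 should equal E[X] = E[K] · μ = 4 · (1/2) · 1 = 2. The sketch below estimates M_X'(0) with a symmetric finite difference; the step size h is an arbitrary illustrative choice.

    import math

    def M_X(s):
        # Transform from part (b): (1/2 + (1/2) exp(sigma^2 s^2 / 2 + mu s))^4,
        # with mu = 1 and sigma = 1/2 for the waiting time at each red light.
        mu, sigma = 1.0, 0.5
        return (0.5 + 0.5 * math.exp(sigma ** 2 * s ** 2 / 2.0 + mu * s)) ** 4

    h = 1e-5
    derivative_at_zero = (M_X(h) - M_X(-h)) / (2.0 * h)
    print("M_X'(0) =", derivative_at_zero)    # should be close to E[X] = 2
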
Solution to Problem 4.44. (a) Using the random sum formulas, we have

\[
E[N] = E[M] \, E[K], \qquad
\mathrm{var}(N) = E[M] \, \mathrm{var}(K) + \big( E[K] \big)^2 \mathrm{var}(M).
\]

(b) Using the random sum formulas and the results of part (a), we have

\[
E[Y] = E[N] \, E[X] = E[M] \, E[K] \, E[X],
\]
\[
\mathrm{var}(Y) = E[N] \, \mathrm{var}(X) + \big( E[X] \big)^2 \mathrm{var}(N)
= E[M] \, E[K] \, \mathrm{var}(X) + \big( E[X] \big)^2 \Big( E[M] \, \mathrm{var}(K) + \big( E[K] \big)^2 \mathrm{var}(M) \Big).
\]

(c) Let N denote the total number of widgets in the crate, and let X_i denote the weight of the i-th widget. The total weight of the crate is

\[
Y = X_1 + \cdots + X_N, \qquad \text{with} \qquad N = K_1 + \cdots + K_M,
\]

so the framework of part (b) applies. We have

\[
E[M] = \frac{1}{p}, \qquad \mathrm{var}(M) = \frac{1 - p}{p^2}, \qquad \text{(geometric formulas)},
\]
\[
E[K] = \mu, \qquad \mathrm{var}(K) = \mu, \qquad \text{(Poisson formulas)},
\]
\[
E[X] = \frac{1}{\lambda}, \qquad \mathrm{var}(X) = \frac{1}{\lambda^2}, \qquad \text{(exponential formulas)}.
\]
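Combining the formulas of part (b) with the moments listed in part (c) gives, for instance, E[Y] = E[M] E[K] E[X] = μ/(pλ). The Monte Carlo sketch below checks the mean and variance formulas against simulation; the specific parameter values p = 0.3, μ = 2, and λ = 0.5 are assumptions made only for this check.

    import numpy as np

    rng = np.random.default_rng(2)
    p, mu, lam = 0.3, 2.0, 0.5              # illustrative parameters, not from the problem
    trials = 100_000

    weights = np.empty(trials)
    for t in range(trials):
        m = rng.geometric(p)                                    # M ~ geometric(p)
        k = rng.poisson(mu, size=m)                             # K_1, ..., K_M ~ Poisson(mu)
        n = int(k.sum())                                        # N = K_1 + ... + K_M
        weights[t] = rng.exponential(1.0 / lam, size=n).sum()   # Y = X_1 + ... + X_N

    # Formulas from parts (a) and (b) with the moments from part (c).
    E_M, var_M = 1.0 / p, (1.0 - p) / p ** 2
    E_K, var_K = mu, mu
    E_X, var_X = 1.0 / lam, 1.0 / lam ** 2
    var_N = E_M * var_K + E_K ** 2 * var_M
    E_Y = E_M * E_K * E_X
    var_Y = E_M * E_K * var_X + E_X ** 2 * var_N

    print("E[Y]   simulation:", weights.mean(), " formula:", E_Y)
    print("var(Y) simulation:", weights.var(),  " formula:", var_Y)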