Probability and Random Variables – Solutions

Problem Solutions – Chapter 6

Problem 6.1.1 Solution
The random variable X_33 is a Bernoulli random variable that indicates the result of flip 33. The PMF of X_33 is
\[
P_{X_{33}}(x) = \begin{cases} 1-p & x = 0 \\ p & x = 1 \\ 0 & \text{otherwise} \end{cases} \tag{1}
\]
Note that each X_i has expected value E[X] = p and variance Var[X] = p(1 - p). The random variable Y = X_1 + ... + X_100 is the number of heads in 100 coin flips. Hence, Y has the binomial PMF
\[
P_Y(y) = \begin{cases} \binom{100}{y} p^y (1-p)^{100-y} & y = 0, 1, \ldots, 100 \\ 0 & \text{otherwise} \end{cases} \tag{2}
\]
Since the X_i are independent, by Theorems 6.1 and 6.3 the mean and variance of Y are
\[
E[Y] = 100\,E[X] = 100p, \qquad \operatorname{Var}[Y] = 100\operatorname{Var}[X] = 100\,p(1-p) \tag{3}
\]

Problem 6.1.2 Solution
Let Y = X_1 - X_2.

(a) Since Y = X_1 + (-X_2), Theorem 6.1 says that the expected value of the difference is
\[
E[Y] = E[X_1] + E[-X_2] = E[X] - E[X] = 0 \tag{1}
\]

(b) By Theorem 6.2, and because X_1 and X_2 are independent (so their covariance is zero), the variance of the difference is
\[
\operatorname{Var}[Y] = \operatorname{Var}[X_1] + \operatorname{Var}[-X_2] = 2\operatorname{Var}[X] \tag{2}
\]
(The results of Problems 6.1.1 and 6.1.2 are checked numerically in the first sketch following Problem 6.1.3.)

Problem 6.1.3 Solution
(a) The PMF of N_1, the number of phone calls needed to obtain the correct answer, can be determined by observing that if the correct answer is given on the nth call, then the previous n - 1 calls must have given wrong answers, so that
\[
P_{N_1}(n) = \begin{cases} (3/4)^{n-1} (1/4) & n = 1, 2, \ldots \\ 0 & \text{otherwise} \end{cases} \tag{1}
\]

(b) N_1 is a geometric random variable with parameter p = 1/4. In Theorem 2.5, the mean of a geometric random variable is found to be 1/p. For our case, E[N_1] = 4.

(c) Using the same logic as in part (a), we recognize that for the fourth correct answer to arrive on the nth call, the previous n - 1 calls must have contained exactly 3 correct answers. N_4 is therefore described by a Pascal random variable:
\[
P_{N_4}(n) = \begin{cases} \binom{n-1}{3} (3/4)^{n-4} (1/4)^4 & n = 4, 5, \ldots \\ 0 & \text{otherwise} \end{cases} \tag{2}
\]

(d) Using the hint given in the problem statement, we can find the mean of N_4 by summing the means of 4 identically distributed geometric random variables, each with mean 4. This gives E[N_4] = 4 E[N_1] = 16.
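
The following short simulation, which is not part of the original solution set, cross-checks Problems 6.1.1 and 6.1.2 by comparing sample moments against the formulas above. The coin bias p = 0.3, the trial count, and the random seed are arbitrary illustration choices.

```python
# Monte Carlo check of Problems 6.1.1 and 6.1.2 (illustrative sketch;
# p = 0.3, the trial count, and the seed are arbitrary, not from the text).
import numpy as np

rng = np.random.default_rng(seed=0)
p = 0.3           # assumed coin bias
trials = 200_000  # number of simulated experiments

# Problem 6.1.1: Y = X_1 + ... + X_100, the number of heads in 100 flips.
flips = rng.random((trials, 100)) < p   # Bernoulli(p) indicators
Y = flips.sum(axis=1)
print("E[Y]   sim:", Y.mean(), " theory:", 100 * p)
print("Var[Y] sim:", Y.var(),  " theory:", 100 * p * (1 - p))

# Problem 6.1.2: Y = X_1 - X_2 for independent, identically distributed
# X_1, X_2 (here Bernoulli(p) again, purely as a concrete example).
X1 = rng.random(trials) < p
X2 = rng.random(trials) < p
D = X1.astype(float) - X2.astype(float)
print("E[X1-X2]   sim:", D.mean(), " theory: 0")
print("Var[X1-X2] sim:", D.var(),  " theory:", 2 * p * (1 - p))
```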
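
Problem 6.1.3 can be checked the same way. The sketch below relies on NumPy's geometric sampler, which returns the number of trials up to and including the first success; the trial count and seed are again arbitrary choices.

```python
# Simulation sketch for Problem 6.1.3: each call is answered correctly
# with probability 1/4; N1 is the call index of the first correct answer,
# N4 the call index of the fourth. (Trial count and seed are arbitrary.)
import numpy as np

rng = np.random.default_rng(seed=1)
p_correct = 0.25
trials = 100_000

# Geometric(p) samples: trials up to and including the first success.
N1 = rng.geometric(p_correct, size=trials)

# Pascal (negative binomial) waiting time for the 4th success, built as
# the sum of 4 independent geometric waiting times, matching the hint
# used in part (d).
N4 = rng.geometric(p_correct, size=(trials, 4)).sum(axis=1)

print("E[N1] sim:", N1.mean(), " theory:", 1 / p_correct)  # 4
print("E[N4] sim:", N4.mean(), " theory:", 4 / p_correct)  # 16
```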

Problem 6.1.4 Solution
We can solve this problem using Theorem 6.2, which says that
\[
\operatorname{Var}[W] = \operatorname{Var}[X] + \operatorname{Var}[Y] + 2\operatorname{Cov}[X, Y] \tag{1}
\]
The first two moments of X are
\[
E[X] = \int_0^1 \int_0^{1-x} 2x \, dy \, dx = \int_0^1 2x(1-x) \, dx = 1/3 \tag{2}
\]
\[
E[X^2] = \int_0^1 \int_0^{1-x} 2x^2 \, dy \, dx = \int_0^1 2x^2(1-x) \, dx = 1/6 \tag{3}
\]
Thus the variance of X is Var[X] = E[X^2] - (E[X])^2 = 1/6 - 1/9 = 1/18. By symmetry, it should be apparent that E[Y] = E[X] = 1/3 and Var[Y] = Var[X] = 1/18. To find the covariance, we first find the correlation
\[
E[XY] = \int_0^1 \int_0^{1-x} 2xy \, dy \, dx = \int_0^1 x(1-x)^2 \, dx = 1/12 \tag{4}
\]
The covariance is
\[
\operatorname{Cov}[X, Y] = E[XY] - E[X] E[Y] = 1/12 - (1/3)^2 = -1/36 \tag{5}
\]
Finally, the variance of the sum W = X + Y is
\[
\operatorname{Var}[W] = \operatorname{Var}[X] + \operatorname{Var}[Y] + 2\operatorname{Cov}[X, Y] = 2/18 - 2/36 = 1/18 \tag{6}
\]
For this specific problem, it is arguable whether it would be easier to find Var[W] by first deriving the CDF and PDF of W. In particular, for 0 <= w <= 1,
\[
F_W(w) = P[X + Y \le w] = \int_0^w \int_0^{w-x} 2 \, dy \, dx = \int_0^w 2(w-x) \, dx = w^2 \tag{7}
\]
Hence, by taking the derivative of the CDF, the PDF of W is
\[
f_W(w) = \begin{cases} 2w & 0 \le w \le 1 \\ 0 & \text{otherwise} \end{cases} \tag{8}
\]
From the PDF, the first and second moments of W are
\[
E[W] = \int_0^1 2w^2 \, dw = 2/3, \qquad E[W^2] = \int_0^1 2w^3 \, dw = 1/2 \tag{9}
\]
The variance of W is Var[W] = E[W^2] - (E[W])^2 = 1/2 - 4/9 = 1/18. Not surprisingly, we get the same answer both ways.
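
As an optional cross-check of these moment calculations (not part of the original solution), the double integrals can be evaluated numerically. The sketch below assumes SciPy is available and that the joint PDF is f_{X,Y}(x,y) = 2 on the triangle 0 <= y <= 1 - x, 0 <= x <= 1, as the integration limits above imply.

```python
# Numerical evaluation of the moments in Problem 6.1.4 (sketch; assumes
# SciPy is installed and f_{X,Y}(x,y) = 2 on the triangle x,y >= 0, x+y <= 1).
from scipy.integrate import dblquad

# dblquad integrates func(y, x) for x in [a, b], y in [gfun(x), hfun(x)].
def expect(g):
    val, _ = dblquad(lambda y, x: 2 * g(x, y), 0, 1,
                     lambda x: 0.0, lambda x: 1 - x)
    return val

EX  = expect(lambda x, y: x)       # expected: 1/3
EX2 = expect(lambda x, y: x * x)   # expected: 1/6
EXY = expect(lambda x, y: x * y)   # expected: 1/12

var_X  = EX2 - EX**2               # expected: 1/18
cov_XY = EXY - EX * EX             # E[Y] = E[X] by symmetry; expected: -1/36
var_W  = 2 * var_X + 2 * cov_XY    # expected: 1/18

print("E[X] =", EX, " Var[X] =", var_X)
print("Cov[X,Y] =", cov_XY, " Var[W] =", var_W)
```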
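
A Monte Carlo check of the CDF route is also straightforward: sample (X, Y) uniformly on the same triangle and compare the sample moments of W = X + Y with E[W] = 2/3 and Var[W] = 1/18. This is an illustrative sketch; the sample size and seed are arbitrary choices.

```python
# Monte Carlo cross-check of the distribution of W = X + Y from
# Problem 6.1.4 (sketch; sample size and seed are arbitrary choices).
import numpy as np

rng = np.random.default_rng(seed=2)
n = 500_000

# Uniform points on the triangle x >= 0, y >= 0, x + y <= 1 via the
# standard reflection trick on the unit square.
u1 = rng.random(n)
u2 = rng.random(n)
flip = u1 + u2 > 1
x = np.where(flip, 1 - u1, u1)
y = np.where(flip, 1 - u2, u2)

w = x + y
print("E[W]   sim:", w.mean(), " theory:", 2 / 3)
print("Var[W] sim:", w.var(),  " theory:", 1 / 18)
# F_W(0.5) should be close to 0.5**2 = 0.25
print("F_W(0.5) sim:", np.mean(w <= 0.5), " theory:", 0.25)
```

Up to quadrature and sampling error, both sketches should reproduce the value 1/18 obtained analytically above.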