Probset11sol


Probability and Stochastic Processes
Homework 10 Solutions, Spring 2010

Problem Solutions: Yates and Goodman, 6.1.1, 6.1.2, 6.1.4, 6.2.1, 6.2.4, 6.2.5, 6.3.1, 6.3.3, 6.4.1, 6.4.4, 6.4.5, 6.4.6, 6.5.1, 6.5.2, and 6.5.3

Problem 6.1.1 Solution
The random variable X_33 is a Bernoulli random variable that indicates the result of flip 33. The PMF of X_33 is

    P_{X_{33}}(x) = \begin{cases} 1 - p & x = 0 \\ p & x = 1 \\ 0 & \text{otherwise} \end{cases}    (1)

Note that each X_i has expected value E[X] = p and variance Var[X] = p(1 - p). The random variable Y = X_1 + ... + X_100 is the number of heads in 100 coin flips. Hence, Y has the binomial PMF

    P_Y(y) = \begin{cases} \binom{100}{y} p^y (1 - p)^{100 - y} & y = 0, 1, \ldots, 100 \\ 0 & \text{otherwise} \end{cases}    (2)

Since the X_i are independent, by Theorems 6.1 and 6.3, the mean and variance of Y are

    E[Y] = 100 E[X] = 100p,    Var[Y] = 100 Var[X] = 100 p(1 - p)    (3)

Problem 6.1.2 Solution
Let Y = X_1 - X_2, where X_1 and X_2 are iid with mean E[X] and variance Var[X].

(a) Since Y = X_1 + (-X_2), Theorem 6.1 says that the expected value of the difference is

    E[Y] = E[X_1] + E[-X_2] = E[X] - E[X] = 0    (1)

(b) Since X_1 and X_2 are independent, Theorem 6.2 says the variance of the difference is

    Var[Y] = Var[X_1] + Var[-X_2] = 2 Var[X]    (2)

Problem 6.1.4 Solution
We can solve this problem using Theorem 6.2, which says that

    Var[W] = Var[X] + Var[Y] + 2 Cov[X, Y]    (1)

From the joint PDF given in the problem, f_{X,Y}(x, y) = 2 on the triangle 0 <= x, 0 <= y, x + y <= 1, the first two moments of X are

    E[X] = \int_0^1 \int_0^{1-x} 2x \, dy \, dx = \int_0^1 2x(1 - x) \, dx = 1/3    (2)

    E[X^2] = \int_0^1 \int_0^{1-x} 2x^2 \, dy \, dx = \int_0^1 2x^2(1 - x) \, dx = 1/6    (3)

Thus the variance of X is Var[X] = E[X^2] - (E[X])^2 = 1/6 - 1/9 = 1/18. By symmetry, it should be apparent that E[Y] = E[X] = 1/3 and Var[Y] = Var[X] = 1/18.
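As a quick numerical sanity check (not part of the original solution), the binomial mean and variance in (3) of Problem 6.1.1 can be verified directly from the PMF in (2); the value p = 0.3 below is an arbitrary illustrative choice.

```python
import math

p = 0.3   # illustrative heads probability; any p in (0, 1) works
n = 100

# Binomial PMF: P_Y(y) = C(n, y) * p^y * (1 - p)^(n - y)
pmf = [math.comb(n, y) * p**y * (1 - p) ** (n - y) for y in range(n + 1)]

# Compute E[Y] and Var[Y] directly from the PMF
mean = sum(y * pmf[y] for y in range(n + 1))
var = sum((y - mean) ** 2 * pmf[y] for y in range(n + 1))

print(mean, var)  # should match 100*p = 30.0 and 100*p*(1-p) = 21.0
```

Summing against the exact PMF (rather than simulating) reproduces E[Y] = 100p and Var[Y] = 100p(1 − p) to floating-point precision.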
To find the covariance, we first find the correlation

    E[XY] = \int_0^1 \int_0^{1-x} 2xy \, dy \, dx = \int_0^1 x(1 - x)^2 \, dx = 1/12    (5)

The covariance is

    Cov[X, Y] = E[XY] - E[X] E[Y] = 1/12 - (1/3)^2 = -1/36    (6)

Finally, the variance of the sum W = X + Y is

    Var[W] = Var[X] + Var[Y] + 2 Cov[X, Y] = 2/18 + 2(-1/36) = 1/18    (7)

For this specific problem, it is arguable whether it would be easier to find Var[W] by first deriving the CDF and PDF of W. In particular, for 0 <= w <= 1,

    F_W(w) = P[X + Y <= w] = \int_0^w \int_0^{w-x} 2 \, dy \, dx = \int_0^w 2(w - x) \, dx = w^2    (8)

Hence, by taking the derivative of the CDF, the PDF of W is

    f_W(w) = \begin{cases} 2w & 0 <= w <= 1 \\ 0 & \text{otherwise} \end{cases}    (9)

From the PDF, the first and second moments of W are

    E[W] = \int_0^1 2w^2 \, dw = 2/3,    E[W^2] = \int_0^1 2w^3 \, dw = 1/2    (10)

The variance of W is Var[W] = E[W^2] - (E[W])^2 = 1/2 - 4/9 = 1/18. Not surprisingly, we get the same answer both ways.
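The moments of Problem 6.1.4 can also be checked by Monte Carlo (again, a verification sketch rather than part of the original solution): sample points uniformly on the triangle x + y <= 1 by rejection from the unit square, then estimate E[X], Cov[X, Y], and Var[W]. The sample size and seed below are arbitrary choices.

```python
import random

random.seed(0)
N = 200_000  # illustrative sample size

# Rejection sampling: a uniform point on the unit square lands in the
# triangle {x + y <= 1} with probability 1/2; keeping only those points
# gives draws from the joint PDF f(x, y) = 2 on the triangle.
xs, ys = [], []
while len(xs) < N:
    x, y = random.random(), random.random()
    if x + y <= 1:
        xs.append(x)
        ys.append(y)

ex = sum(xs) / N                                 # estimate of E[X]  = 1/3
ey = sum(ys) / N                                 # estimate of E[Y]  = 1/3
exy = sum(x * y for x, y in zip(xs, ys)) / N     # estimate of E[XY] = 1/12
cov = exy - ex * ey                              # estimate of Cov[X,Y] = -1/36

ws = [x + y for x, y in zip(xs, ys)]
ew = sum(ws) / N                                 # estimate of E[W]   = 2/3
varw = sum((w - ew) ** 2 for w in ws) / N        # estimate of Var[W] = 1/18

print(ex, cov, varw)
```

With N = 200,000 samples, the estimates agree with the analytical values 1/3, −1/36, and 1/18 to within a few thousandths, confirming both the covariance route of (7) and the direct CDF route of (8)–(10).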

This note was uploaded on 02/13/2011 for the course 332 226 taught by Professor Staff during the Spring '08 term at Rutgers.
