Stochastic Processes nctuee07f Homework 1 Solutions

1. Differentiating the CDF and applying the total probability theorem,
\[
f_X(x) = \frac{d}{dx} P[X \le x]
       = \frac{d}{dx} \sum_n P[X \le x \mid Y = n]\, P[Y = n]
       = \sum_n \frac{d}{dx} P[X \le x \mid Y = n]\, P[Y = n]
       = \sum_n f_{X|Y}(x \mid Y = n)\, P[Y = n].
\]
(A numerical check of this mixture form is sketched after problem 2 below.)

2. The probability that S = -1, given that we have observed Y = y, is P[S = -1 | Y = y]. According to the formula on page 1-12 of Topic 1, it can be calculated as
\[
P[S = -1 \mid Y = y]
 = \frac{P[S = -1]\, f_{Y|S}(y \mid S = -1)}{P[S = 1]\, f_{Y|S}(y \mid S = 1) + P[S = -1]\, f_{Y|S}(y \mid S = -1)}.
\]
With the given received-signal model, the conditional density of Y given S = -1 is Gaussian with mean -1 and variance \(\sigma^2\),
\[
f_{Y|S}(y \mid S = -1) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(y+1)^2}{2\sigma^2}},
\]
and, similarly,
\[
f_{Y|S}(y \mid S = 1) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(y-1)^2}{2\sigma^2}}.
\]
With equally likely symbols, P[S = 1] = P[S = -1] = 1/2, substituting these densities and cancelling the common factor \(\tfrac{1}{2\sqrt{2\pi\sigma^2}} e^{-(y^2+1)/(2\sigma^2)}\) gives
\[
P[S = -1 \mid Y = y] = \frac{e^{-y/\sigma^2}}{e^{y/\sigma^2} + e^{-y/\sigma^2}}.
\]
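As a sanity check on the result of problem 2 (this sketch is not part of the original solution), one can simulate the usual received-signal model Y = S + N with N ~ Normal(0, σ²), equally likely symbols, and an arbitrary choice σ = 0.8, and compare the empirical frequency of S = -1 among samples with Y close to y against the closed-form posterior above.

```python
# Monte Carlo sanity check for problem 2 (hypothetical parameters: sigma = 0.8,
# equally likely symbols, Y = S + N with N ~ Normal(0, sigma^2)).
import numpy as np

rng = np.random.default_rng(0)
sigma, n = 0.8, 2_000_000

s = rng.choice([-1, 1], size=n)                     # equally likely symbols
y_samples = s + sigma * rng.standard_normal(n)      # received signal Y = S + N

for y in (-1.0, 0.0, 0.5, 1.0):
    window = np.abs(y_samples - y) < 0.02           # samples with Y close to y
    empirical = np.mean(s[window] == -1)            # fraction of those with S = -1
    closed_form = np.exp(-y / sigma**2) / (np.exp(y / sigma**2) + np.exp(-y / sigma**2))
    print(f"y = {y:+.1f}: empirical {empirical:.3f}, closed form {closed_form:.3f}")
```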
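The mixture-density formula in problem 1 can be checked the same way. The example below is hypothetical (the pmf of Y and the conditional Gaussian densities are arbitrary choices, not from the problem): a histogram of simulated X values is compared against \(\sum_n f_{X|Y}(x \mid Y = n) P[Y = n]\).

```python
# Numerical check for problem 1 on a hypothetical example: Y in {0, 1, 2} with
# P[Y = n] = 0.5, 0.3, 0.2, and X | Y = n ~ Normal(mean n, variance 1).
import numpy as np

rng = np.random.default_rng(1)
p_y = np.array([0.5, 0.3, 0.2])
means = np.array([0.0, 1.0, 2.0])

# Simulate X by drawing Y first and then X given Y, and histogram the result.
y = rng.choice(3, size=500_000, p=p_y)
x = means[y] + rng.standard_normal(y.size)
hist, edges = np.histogram(x, bins=50, range=(-4.0, 6.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

def normal_pdf(t, mean):
    # Gaussian density with the given mean and standard deviation 1.
    return np.exp(-0.5 * (t - mean) ** 2) / np.sqrt(2.0 * np.pi)

# Mixture density given by the total-probability formula of problem 1.
f_x = sum(p_y[n] * normal_pdf(centers, means[n]) for n in range(3))
print("max |histogram - mixture formula| =", np.max(np.abs(hist - f_x)))  # small
```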
3. (a) This identity holds true for both discrete and continuous random variables. Consider discrete X and Y:
\[
E\big[E[X \mid Y]\big]
 = \sum_y E[X \mid Y = y]\, p_Y(y)
 = \sum_y \Big( \sum_x x\, p_{X|Y}(x \mid y) \Big) p_Y(y)
 = \sum_x x \Big( \sum_y p_{X|Y}(x \mid y)\, p_Y(y) \Big)
 = \sum_x x \Big( \sum_y p_{X,Y}(x, y) \Big)
 = \sum_x x\, p_X(x)
 = E[X].
\]

(b) The result of this problem is intuitively obvious: given that Y = y, there is nothing random about h(Y), so it acts as a deterministic (non-random) value and can be pulled out of the expectation. But we still need to justify this mathematically. We consider the discrete case here; the continuous case can be shown in a similar fashion. Let X and Y be two discrete random variables. From the rule of expected value, the conditional expectation is
\[
E\big[g(X)\, h(Y) \,\big|\, Y = y\big]
 = \sum_{x, y'} g(x)\, h(y')\, p_{X,Y \mid Y}(x, y' \mid y)
 = \sum_{x, y'} g(x)\, h(y')\, \frac{P[X = x,\, Y = y',\, Y = y]}{P[Y = y]},
\]
where
\[
P[X = x,\, Y = y',\, Y = y] =
\begin{cases}
P[X = x,\, Y = y] & \text{if } y' = y,\\
0 & \text{otherwise.}
\end{cases}
\]
Thus only the terms with y' = y contribute, which gives
\[
E\big[g(X)\, h(Y) \,\big|\, Y = y\big]
 = \sum_x g(x)\, h(y)\, \frac{P[X = x,\, Y = y]}{P[Y = y]}
 = h(y) \sum_x g(x)\, \frac{P[X = x,\, Y = y]}{P[Y = y]}
 = h(y) \sum_x g(x)\, p_{X|Y}(x \mid y)
 = h(y)\, E\big[g(X) \,\big|\, Y = y\big].
\]
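To illustrate part (a) numerically, the sketch below (not part of the original solution; the model Y ~ Uniform(0, 1), X | Y = y ~ Normal(y, 1) is an arbitrary choice) estimates the inner conditional expectation by binning on Y and confirms that averaging it over Y reproduces E[X].

```python
# Monte Carlo illustration of problem 3(a): E[E[X|Y]] = E[X].
# Hypothetical model: Y ~ Uniform(0, 1) and X | Y = y ~ Normal(y, 1),
# so both sides should be close to E[Y] = 0.5.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
y = rng.uniform(0.0, 1.0, size=n)
x = y + rng.standard_normal(n)                        # X = Y + standard Gaussian noise

# Estimate E[X | Y] by averaging X within 20 equal-width bins of Y, then take
# the outer expectation by weighting each conditional mean by its bin frequency.
bins = np.digitize(y, np.linspace(0.0, 1.0, 21)) - 1
cond_mean = np.array([x[bins == b].mean() for b in range(20)])
weights = np.array([np.mean(bins == b) for b in range(20)])

print("E[E[X|Y]] ~", np.sum(cond_mean * weights))     # about 0.5
print("E[X]      ~", x.mean())                        # about 0.5
```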
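Part (b) can be checked in the same spirit on a small discrete joint pmf. Everything below (the pmf, g, and h) is an arbitrary illustration rather than data from the problem: for each y, the sample mean of g(X)h(Y) over draws with Y = y should agree with h(y) E[g(X) | Y = y] computed from the pmf.

```python
# Simulation check of problem 3(b): E[g(X)h(Y) | Y = y] = h(y) E[g(X) | Y = y],
# using a small hypothetical joint pmf and arbitrary functions g and h.
import numpy as np

rng = np.random.default_rng(3)

# p_{X,Y}(x, y): rows are x in {0, 1, 2}, columns are y in {0, 1}.
p_xy = np.array([[0.10, 0.20],
                 [0.25, 0.15],
                 [0.05, 0.25]])
g = lambda x: x ** 2
h = lambda y: 3 * y + 1

# Draw (X, Y) pairs from the joint pmf.
flat = rng.choice(p_xy.size, size=200_000, p=p_xy.ravel())
x, y = np.unravel_index(flat, p_xy.shape)

for yv in (0, 1):
    sel = (y == yv)
    empirical = np.mean(g(x[sel]) * h(y[sel]))                  # E[g(X)h(Y) | Y = yv] from samples
    p_x_given_y = p_xy[:, yv] / p_xy[:, yv].sum()
    formula = h(yv) * np.sum(g(np.arange(3)) * p_x_given_y)     # h(y) E[g(X) | Y = yv]
    print(f"y = {yv}: empirical {empirical:.3f}, h(y)E[g(X)|Y=y] {formula:.3f}")
```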