# Stochastic Processes (nctuee07f): Homework 2


Stochastic Processes nctuee07f, Homework 2. Due on 11/13/2007 (Tuesday), in class.

1. Understand Theorem 3.2 on page 65 of Gallager's note. Then do Exercise 3.2 in Gallager's note.

2. Let x, y, and z be collectively jointly Gaussian random vectors; that is, the elements of the random vector [x^T, y^T, z^T]^T are jointly Gaussian.

   (a) If y and z are statistically independent, show that E[x | y, z] = E[x | y] + E[x | z] - m_x, where m_x = E[x].

   (b) If y and z are not necessarily statistically independent, show that E[x | y, z] = E[x | y, ẑ], where ẑ = z - E[z | y].

3. Let X1 and X2 be two standard Gaussian random variables, and let

   Y1 = X1 + X2
   Y2 = X1 - X2
   Y3 = 3 X1 + X2.

   Compute the conditional pdf f_{Y1 | Y2, Y3}(y1 | y2, y3) of Y1 given [Y2 Y3]^T, and compute its mean.

4. In a common communication model, the transmitted signal x and the received signal y are related by y = Hx + z, where H is an n-by-m matrix that is not random, x is jointly Gaussian with distribution N(0, K_x), and z is additive Gaussian noise with distribution N(0, K_z). The noise is independent of the signal x.

   (a) Let u = [x^T, y^T]^T. Show that u is jointly Gaussian.

   (b) Find a simple condition on H, K_x, and K_z under which the covariance matrix K_u of u is invertible.

   (c) Find the conditional distribution of the input x given the output y = y.

5. Show that a circularly symmetric complex Gaussian random variable must have i.i.d. real and imaginary components.
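For problem 2(a), the identity E[x | y, z] = E[x | y] + E[x | z] - m_x can be sanity-checked at the level of regression coefficients: when y and z are uncorrelated, the joint conditioning weights decouple into the separate ones. A minimal numerical sketch (the scalar construction, the constants a, b, and the variances below are hypothetical choices of mine, not from the problem set; all variables are taken zero-mean so the m_x term vanishes):

```python
import numpy as np

# Hypothetical scalar example: (x, y, z) jointly Gaussian, y and z independent.
# Build x = a*y + b*z + w with independent y, z, w, so that Cov(y, z) = 0.
a, b = 0.7, -1.3
vy, vz, vw = 2.0, 3.0, 0.5
K = np.array([
    [a*a*vy + b*b*vz + vw, a*vy, b*vz],   # Var(x), Cov(x,y), Cov(x,z)
    [a*vy,                 vy,   0.0 ],
    [b*vz,                 0.0,  vz  ],
])

# Joint conditioning on (y, z): E[x | y, z] = w_joint @ [y, z] (zero means).
w_joint = K[0, 1:] @ np.linalg.inv(K[1:, 1:])
# Separate conditioning: E[x | y] = w_y * y and E[x | z] = w_z * z.
w_y = K[0, 1] / K[1, 1]
w_z = K[0, 2] / K[2, 2]

# With zero means, E[x|y] + E[x|z] - m_x = w_y*y + w_z*z, so the identity
# holds iff the coefficient vectors agree.
print(w_joint, [w_y, w_z])
```

Because K[1:, 1:] is diagonal when Cov(y, z) = 0, the joint weights reduce to [Cov(x,y)/Var(y), Cov(x,z)/Var(z)], which is exactly the sum of the two separate conditional means.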
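Problem 3 can be checked numerically with the standard Gaussian conditioning formulas; the sketch below assumes X1 and X2 are independent (the problem says "standard" but does not spell out independence, so that is an assumption here), in which case Cov(Y) = A A^T for the linear map A defining (Y1, Y2, Y3):

```python
import numpy as np

# Y1 = X1 + X2, Y2 = X1 - X2, Y3 = 3*X1 + X2, with X1, X2 assumed
# independent standard Gaussians, so Cov(Y) = A @ A.T.
A = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [3.0,  1.0]])
K = A @ A.T

# Partition for Y1 versus (Y2, Y3) and apply Gaussian conditioning:
#   E[Y1 | y2, y3] = K12 K22^{-1} [y2, y3]^T
#   Var[Y1 | Y2, Y3] = K11 - K12 K22^{-1} K12^T
K11 = K[0, 0]     # Var(Y1) = 2
K12 = K[0, 1:]    # Cov(Y1, [Y2, Y3]) = [0, 4]
K22 = K[1:, 1:]   # Cov([Y2, Y3])

w = K12 @ np.linalg.inv(K22)   # regression coefficients, [-0.5, 0.5]
cond_var = K11 - w @ K12       # conditional variance, 0

print(w, cond_var)
```

The conditional variance comes out as zero: indeed (Y3 - Y2)/2 = X1 + X2 = Y1 exactly, so the conditional "pdf" is degenerate, a point mass at y1 = (y3 - y2)/2, and the conditional mean is (y3 - y2)/2.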
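For problem 4(c), the standard jointly-Gaussian conditioning result gives x | y = y0 as Gaussian with mean K_x H^T (H K_x H^T + K_z)^{-1} y0 and covariance K_x - K_x H^T (H K_x H^T + K_z)^{-1} H K_x. A minimal sketch with a Monte Carlo sanity check (the particular H, K_x, and K_z below are hypothetical examples of mine, not from the problem set):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: n = 3 observations, m = 2 signal components.
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
Kx = np.array([[2.0, 0.5],
               [0.5, 1.0]])   # signal covariance (positive definite)
Kz = 0.1 * np.eye(3)          # noise covariance

# Conditional distribution of x given y = y0:
#   E[x | y0]   = Kx H^T (H Kx H^T + Kz)^{-1} y0
#   Cov[x | y]  = Kx - Kx H^T (H Kx H^T + Kz)^{-1} H Kx
Ky = H @ Kx @ H.T + Kz
G = Kx @ H.T @ np.linalg.inv(Ky)   # MMSE estimator matrix
Kcond = Kx - G @ H @ Kx

# Monte Carlo check: the error x - G y should have covariance Kcond.
n_samples = 100_000
L = np.linalg.cholesky(Kx)
x = L @ rng.standard_normal((2, n_samples))
z = np.sqrt(0.1) * rng.standard_normal((3, n_samples))
y = H @ x + z
err = x - G @ y
emp_cov = err @ err.T / n_samples

print(np.round(Kcond, 3))
print(np.round(emp_cov, 3))
```

This also hints at part (b): K_u is built from K_x and H K_x H^T + K_z, so invertibility hinges on those blocks (e.g. K_z positive definite makes H K_x H^T + K_z invertible regardless of H).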

This note was uploaded on 11/28/2010 for the course EE 301, taught by Professor Gfung during the Winter '10 term at National Chiao Tung University.
