# Problem Set 3


ECE 1520 Data Communications, Fall 2011
Problem Set 3
Due in class, Monday, Oct. 12th, 2011.

1. I had asked you to do this in class: Find the differential entropy of a zero-mean complex Gaussian random variable with variance σ². Using this, find the mutual information of a complex AWGN channel, Y = X + N, where X ~ CN(0, E_x) and N ~ CN(0, σ²). This is the capacity of the complex AWGN channel.

2. Show that the differential entropy of a length-n Gaussian random vector X ~ N(0, C) is h(X) = (1/2) log₂[(2πe)ⁿ |C|], where |C| denotes the determinant of the covariance matrix C.

3. P6.43: A channel has 3 possible inputs, X = a, b, or c, and two possible outputs, Y = 1 or 2. We know that P(Y = 2 | X = a) = 0 and P(Y = 1 | X = b) = P(Y = 1 | X = c) = 1/2. What is the optimal input distribution and the overall capacity of this channel?
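(Not part of the assignment, but the answer to Problem 3 can be sanity-checked numerically. The sketch below is my own, in plain Python: it fills in the remaining transition probabilities by row normalization, since each row of the transition matrix must sum to 1, and does a coarse grid search over input distributions on the simplex.)

```python
import math

# Transition probabilities P(y|x) for Problem 3.
# Given: P(Y=2|X=a) = 0 and P(Y=1|X=b) = P(Y=1|X=c) = 1/2;
# the remaining entries follow because each row sums to 1.
P = {
    'a': {1: 1.0, 2: 0.0},
    'b': {1: 0.5, 2: 0.5},
    'c': {1: 0.5, 2: 0.5},
}

def mutual_information(px):
    """I(X;Y) in bits for an input distribution px over {'a','b','c'}."""
    py = {y: sum(px[x] * P[x][y] for x in px) for y in (1, 2)}
    i = 0.0
    for x in px:
        for y in (1, 2):
            if px[x] > 0 and P[x][y] > 0:
                i += px[x] * P[x][y] * math.log2(P[x][y] / py[y])
    return i

# Coarse grid search over the probability simplex (for intuition only;
# the exact optimum should come from the analysis asked for above).
best_i, best_px = -1.0, None
steps = 200
for ia in range(steps + 1):
    for ib in range(steps - ia + 1):
        pa, pb = ia / steps, ib / steps
        pc = 1.0 - pa - pb
        px = {'a': pa, 'b': pb, 'c': pc}
        i = mutual_information(px)
        if i > best_i:
            best_i, best_px = i, px

print(best_px, round(best_i, 4))
```

Because b and c have identical rows (the hint), only the total probability P(b) + P(c) matters, so the search finds a whole family of maximizers with P(a) = 3/5 and capacity log₂(5) − 2 ≈ 0.3219 bits.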

Hint: inputs b and c have the same transition probabilities.

4. P6.60: Slightly different from what I had said I would put on the HW in class: In a binary erasure channel, bits "disappear" with probability p (output state 'e'); otherwise bits are received without error, i.e., P(Y = 1 | X = 1) = 1 − p = P(Y = 0 | X = 0) and P(Y = e | X = 0) = P(Y = e | X = 1) = p. If P(X = 0) = α, determine the mutual information I(X; Y) as a function of α, then the value of α that maximizes I(X; Y), and hence the channel capacity.
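(Again optional: Problem 4 can be checked numerically. This sketch is my own; the erasure probability p = 0.3 is an arbitrary example value, not specified in the problem. It computes I(X;Y) directly from the joint distribution and scans over α.)

```python
import math

def h2(q):
    """Binary entropy function in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def bec_mi(alpha, p):
    """I(X;Y) in bits for a binary erasure channel with P(X=0)=alpha
    and erasure probability p, computed from the joint distribution."""
    trans = {  # P(y|x) as stated in Problem 4
        0: {0: 1 - p, 'e': p, 1: 0.0},
        1: {0: 0.0, 'e': p, 1: 1 - p},
    }
    px = {0: alpha, 1: 1 - alpha}
    py = {y: sum(px[x] * trans[x][y] for x in px) for y in (0, 'e', 1)}
    i = 0.0
    for x in px:
        for y in (0, 'e', 1):
            if px[x] > 0 and trans[x][y] > 0:
                i += px[x] * trans[x][y] * math.log2(trans[x][y] / py[y])
    return i

p = 0.3  # example erasure probability (hypothetical; the problem leaves p symbolic)
alphas = [k / 1000 for k in range(1001)]
best_alpha = max(alphas, key=lambda a: bec_mi(a, p))
print(best_alpha, round(bec_mi(best_alpha, p), 4))
```

The scan agrees with the closed form one should derive analytically: I(X;Y) = (1 − p)·h2(α), which is maximized at α = 1/2, giving capacity 1 − p.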