EECS 229A Spring 2007
Solutions to Homework 5

1. Problem 8.8 on pg. 258 of the text.

Solution: Channel with uniformly distributed noise.

Here the output is $Y = X + Z$, where the input $X$ takes values in $\{0, \pm 1, \pm 2\}$ and the noise $Z$ is uniformly distributed on $(-1, 1)$, independent of $X$. Consider a probability distribution $(\alpha_i, \, i = 0, \pm 1, \pm 2)$ on $X$, where $\sum_i \alpha_i = 1$. This results in $Y$ having the density
$$
f_Y(y) = \begin{cases}
\frac{1}{2}\alpha_{-2} & \text{if } y \in (-3, -2) \\
\frac{1}{2}(\alpha_{-2} + \alpha_{-1}) & \text{if } y \in (-2, -1) \\
\frac{1}{2}(\alpha_{-1} + \alpha_{0}) & \text{if } y \in (-1, 0) \\
\frac{1}{2}(\alpha_{0} + \alpha_{1}) & \text{if } y \in (0, 1) \\
\frac{1}{2}(\alpha_{1} + \alpha_{2}) & \text{if } y \in (1, 2) \\
\frac{1}{2}\alpha_{2} & \text{if } y \in (2, 3).
\end{cases}
$$
Since $f_Y$ is constant on each of these six unit-length intervals, the differential entropy $h(Y)$ equals the entropy of the probability distribution on 6 points given by
$$
\left( \tfrac{1}{2}\alpha_{-2}, \; \tfrac{1}{2}(\alpha_{-2} + \alpha_{-1}), \; \tfrac{1}{2}(\alpha_{-1} + \alpha_{0}), \; \tfrac{1}{2}(\alpha_{0} + \alpha_{1}), \; \tfrac{1}{2}(\alpha_{1} + \alpha_{2}), \; \tfrac{1}{2}\alpha_{2} \right).
$$
The largest this can be is $\log 6$, with equality achieved when
$$
\alpha_{-2} = \alpha_{0} = \alpha_{2} = \tfrac{1}{3}, \qquad \alpha_{-1} = \alpha_{1} = 0.
$$
We also note that the conditional differential entropy $h(Y \mid X)$ does not depend on the probability distribution of $X$: it equals $h(Z) = \log 2$, the differential entropy of the uniform density on $(-1, 1)$. Hence the capacity of the channel is
$$
C = \max_{p(x)} \left( h(Y) - h(Y \mid X) \right) = \log 6 - \log 2 = \log 3.
$$
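As a quick numerical sanity check (not part of the original solution), the following Python sketch evaluates $I(X;Y) = h(Y) - \log 2$ for a given input distribution, using the fact that $f_Y$ is piecewise constant on the six unit intervals; the maximizing distribution $(\tfrac{1}{3}, 0, \tfrac{1}{3}, 0, \tfrac{1}{3})$ should attain $\log 3 \approx 1.0986$ nats. The function name and example inputs are illustrative only.

```python
import numpy as np

def mutual_information(alpha):
    """I(X;Y) in nats for Y = X + Z, where X in {-2,...,2} has pmf alpha
    (ordered alpha_{-2},...,alpha_2) and Z ~ Uniform(-1,1) independent of X."""
    a = np.asarray(alpha, dtype=float)
    # Mass of Y in each unit interval (-3,-2), (-2,-1), ..., (2,3);
    # the density equals this mass on the interval, since each has length 1.
    p = 0.5 * np.array([a[0], a[0] + a[1], a[1] + a[2],
                        a[2] + a[3], a[3] + a[4], a[4]])
    h_Y = -np.sum(p[p > 0] * np.log(p[p > 0]))  # discrete entropy = h(Y)
    return h_Y - np.log(2.0)                    # h(Y|X) = h(Z) = log 2

print(mutual_information([1/3, 0, 1/3, 0, 1/3]))  # ~1.0986 = log 3
print(mutual_information([0.2] * 5))              # strictly smaller
```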
2. Problem 9.3 on pg. 291 of the text.

Solution: Output power constraint.

We would expect the capacity of the channel to be given by
$$
C = \max_{p(x)} I(X;Y),
$$
where the maximum is taken over all distributions $p(x)$ on $X$ for which the output $Y = X + Z$, with $Z$ independent of $X$ and $Z \sim N(0, \sigma^2)$, satisfies the output power constraint $E[Y^2] \le P$. Assuming this is true, we can write, for any such choice of $p(x)$, and assuming that $P \ge \sigma^2$,
$$
I(X;Y) = h(Y) - h(Y \mid X) = h(Y) - \tfrac{1}{2} \log 2\pi e \sigma^2 \overset{(a)}{\le} \tfrac{1}{2} \log 2\pi e P - \tfrac{1}{2} \log 2\pi e \sigma^2 = \tfrac{1}{2} \log \frac{P}{\sigma^2},
$$
where step (a) comes from applying the output power constraint together with the fact that a Gaussian maximizes differential entropy among all densities with a given bound on the second moment. If $P < \sigma^2$ it is impossible to meet the output power constraint, so the capacity must be 0 in this case. We thus get the bound
$$
C \le \left( \tfrac{1}{2} \log \frac{P}{\sigma^2} \right)^+,
$$
where $(u)^+$ denotes $\max(u, 0)$ for a real number $u$. This bound can be achieved when $P \ge \sigma^2$ by choosing $p(x)$ to be the Gaussian distribution with mean 0 and variance $P - \sigma^2$, so that $Y \sim N(0, P)$. Hence we have
$$
C = \left( \tfrac{1}{2} \log \frac{P}{\sigma^2} \right)^+. \tag{1}
$$
This discussion depended only on the assumption that
$$
C \le \max_{p(x)} I(X;Y), \tag{2}
$$
since the achievability of the expression in equation (1) may be proved by a random coding argument, as was done for the AWGN channel with the traditional input power constraint. The truth of the bound in equation (2) follows immediately from equation (9.45) on pg. 269 of the text, which was derived using Fano's inequality and applies irrespective of the nature of the constraints imposed on the communication process.
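As an illustrative sketch (not from the text), the Python snippet below checks by Monte Carlo that the capacity-achieving input $X \sim N(0, P - \sigma^2)$ meets the output power constraint $E[Y^2] \le P$ with equality, and evaluates the resulting capacity formula; the values of P and sigma2 are arbitrary examples.

```python
import numpy as np

rng = np.random.default_rng(0)
P, sigma2 = 4.0, 1.0    # example values with P >= sigma^2
n = 1_000_000

x = rng.normal(0.0, np.sqrt(P - sigma2), n)  # capacity-achieving input
z = rng.normal(0.0, np.sqrt(sigma2), n)      # noise, independent of x
y = x + z                                    # output Y ~ N(0, P)

print("E[Y^2] =", np.mean(y**2), "; constraint P =", P)

# C = (1/2) log(P / sigma^2), in nats and bits
C = 0.5 * np.log(P / sigma2)
print("C =", C, "nats =", C / np.log(2), "bits")
```

With these example values, $C = \tfrac{1}{2}\log 4 = \log 2$ nats, i.e. exactly 1 bit per channel use.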
3. Problem 9.4 on pg. 291 of the text.

Solution: Exponential noise channels.

Strictly speaking, the claim of this problem is false, as we will see at the end of this discussion. What was intended was to also impose the condition that the input be nonnegative, in which case the claim is true.