hw4sol2 - Solutions to Homework Set #4: Channel and Source Coding


1. Lossless source coding with side information.

Consider lossless source coding with side information that is available at both the encoder and the decoder, where the source and side-information pairs $(X_i, Y_i)$ are i.i.d. $\sim P_{X,Y}(x,y)$.

[Figure 1: Lossless source coding with side information at the encoder and decoder. The encoder maps $(X^n, Y^n)$ to an index $f(X^n, Y^n) \in \{1, 2, \ldots, 2^{nR}\}$; the decoder outputs $\hat{X}^n(f(X^n, Y^n), Y^n)$.]

Show that a rate $R < H(X|Y)$ is not achievable, and interpret the result.

Hint: Let $T$ denote $f(X^n, Y^n)$. Consider

$nR \ge H(T) \ge H(T \mid Y^n),$   (1)

and use steps similar to those used in class to prove the converse when side information was not available, including Fano's inequality.

Solution: Sketch of the solution (please fill in the explanation for each step; one possible set of justifications follows these solutions):

$nR \ge H(T) \ge H(T \mid Y^n) \ge I(X^n; T \mid Y^n) = H(X^n \mid Y^n) - H(X^n \mid T, Y^n) \ge nH(X \mid Y) - n\epsilon_n,$

where $\epsilon_n \to 0$ as $n \to \infty$. Dividing by $n$ and letting $n \to \infty$ gives $R \ge H(X|Y)$, so no rate below $H(X|Y)$ is achievable. Interpretation: even when the encoder also sees $Y^n$, one cannot compress below $H(X|Y)$ bits per symbol, the same limit that holds with side information at the decoder alone.

2. Preprocessing the output.

One is given a communication channel with transition probabilities $p(y|x)$ and channel capacity $C = \max_{p(x)} I(X;Y)$. A helpful statistician preprocesses the output by forming $\tilde{Y} = g(Y)$, yielding a channel $p(\tilde{y}|x)$. He claims that this will strictly improve the capacity.

(a) Show that he is wrong.

(b) Under what conditions does he not strictly decrease the capacity?

Solution: Preprocessing the output.

(a) The statistician calculates $\tilde{Y} = g(Y)$. Since $X \to Y \to \tilde{Y}$ forms a Markov chain, we can apply the data processing inequality. Hence for every distribution on $x$,

$I(X; \tilde{Y}) \le I(X; Y).$

Let $\tilde{p}(x)$ be the distribution on $x$ that maximizes $I(X; \tilde{Y})$. Then

$\tilde{C} = \max_{p(x)} I(X; \tilde{Y}) = I(X; \tilde{Y})\big|_{p(x)=\tilde{p}(x)} \le I(X; Y)\big|_{p(x)=\tilde{p}(x)} \le \max_{p(x)} I(X; Y) = C.$

Thus the helpful suggestion is wrong: processing the output does not increase capacity. (A small numerical illustration appears after these solutions.)

(b) The capacity does not decrease only if equality holds throughout the chain of inequalities above, and in particular in the data processing inequality; that is, for the maximizing input distribution, $X \to \tilde{Y} \to Y$ must also form a Markov chain. Thus $\tilde{Y}$ should be a sufficient statistic for $X$.

3. The Z-channel.

The Z-channel has binary input and output alphabets and transition probabilities $p(y|x)$ given by the matrix

$Q = \begin{bmatrix} 1 & 0 \\ 1/2 & 1/2 \end{bmatrix}, \quad x, y \in \{0, 1\},$

where the entry in row $x$, column $y$ is $p(y|x)$. Find the capacity of the Z-channel and the maximizing input probability distribution.

Solution: The Z-channel.

First we express $I(X;Y)$, the mutual information between the input and output of the Z-channel, as a function of $\alpha = \Pr(X = 1)$, writing $H_b(\cdot)$ for the binary entropy function:

$H(Y \mid X) = \Pr(X=0) \cdot 0 + \Pr(X=1) \cdot 1 = \alpha$

$H(Y) = H_b(\Pr(Y=1)) = H_b(\alpha/2)$

$I(X;Y) = H(Y) - H(Y \mid X) = H_b(\alpha/2) - \alpha.$

Since $I(X;Y)$ is strictly concave in $\alpha$ (why?) and $I(X;Y) = 0$ at both $\alpha = 0$ and $\alpha = 1$, the maximum mutual information is attained at an interior point. Setting the derivative $\frac{1}{2}\log_2\frac{1-\alpha/2}{\alpha/2} - 1$ to zero gives $\alpha^* = 2/5$, and hence

$C = H_b(1/5) - \tfrac{2}{5} = \log_2 5 - 2 \approx 0.3219$ bits per channel use.

(A numerical check appears after these solutions.)
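Appendix to Problem 1. The sketch leaves the justification of each step to the reader. One possible set of justifications (an editorial sketch, assuming the usual block-coding setup in which the decoder's estimate $\hat{X}^n$ is a function of $(T, Y^n)$ and the error probability $P_e^{(n)} = \Pr(\hat{X}^n \ne X^n)$ vanishes):

$nR \ge H(T)$: $T$ takes at most $2^{nR}$ values, and entropy is bounded by the log of the alphabet size.
$H(T) \ge H(T \mid Y^n)$: conditioning reduces entropy.
$H(T \mid Y^n) \ge I(X^n; T \mid Y^n)$: since $I(X^n; T \mid Y^n) = H(T \mid Y^n) - H(T \mid X^n, Y^n) \le H(T \mid Y^n)$.
$I(X^n; T \mid Y^n) = H(X^n \mid Y^n) - H(X^n \mid T, Y^n)$: definition of conditional mutual information.
$H(X^n \mid Y^n) = nH(X \mid Y)$: the pairs $(X_i, Y_i)$ are i.i.d.
$H(X^n \mid T, Y^n) \le n\epsilon_n$: Fano's inequality, with $n\epsilon_n = 1 + nP_e^{(n)} \log |\mathcal{X}|$, so $\epsilon_n \to 0$.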
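Appendix to Problem 2. A minimal numerical illustration of the data processing inequality, assuming an example binary-input, ternary-output channel of my choosing (the matrix W, the merging map g, and the function name are illustrative, not from the original solution):

    import numpy as np

    def mutual_information(p_x, W):
        # I(X;Y) in bits, where W[x, y] = p(y | x) and p_x is the input distribution.
        p_xy = p_x[:, None] * W                 # joint distribution p(x, y)
        p_y = p_xy.sum(axis=0)                  # output marginal p(y)
        denom = p_x[:, None] * p_y[None, :]     # product of marginals p(x)p(y)
        mask = p_xy > 0                         # skip zero-probability terms
        return float((p_xy[mask] * np.log2(p_xy[mask] / denom[mask])).sum())

    # Assumed example channel p(y|x) with three output symbols.
    W = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.1, 0.8]])
    # Deterministic preprocessor g merging outputs 1 and 2: G[y, yt] = 1{g(y) = yt}.
    G = np.array([[1, 0],
                  [0, 1],
                  [0, 1]])
    W_tilde = W @ G                             # channel from X to g(Y)

    p_x = np.array([0.5, 0.5])
    print(mutual_information(p_x, W))           # I(X; Y)
    print(mutual_information(p_x, W_tilde))     # I(X; g(Y)) -- never larger, by the DPI

Running this shows I(X; g(Y)) strictly below I(X; Y), since merging outputs 1 and 2 here discards information about X.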
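Appendix to Problem 3. A short numerical check of the Z-channel result, a sketch assuming NumPy: it maximizes $H_b(\alpha/2) - \alpha$ over a grid of $\alpha$ values and compares with the closed form.

    import numpy as np

    def hb(p):
        # Binary entropy function in bits, for 0 < p < 1.
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    alpha = np.linspace(1e-6, 1 - 1e-6, 200001)   # grid over alpha = Pr(X = 1)
    I = hb(alpha / 2) - alpha                     # I(X;Y) for the Z-channel
    k = I.argmax()
    print(alpha[k])           # ~0.4, i.e. alpha* = 2/5
    print(I[k])               # ~0.3219 bits
    print(np.log2(5) - 2)     # closed-form capacity, for comparison

The grid maximum agrees with the analytic answer $\alpha^* = 2/5$, $C = \log_2 5 - 2$.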