EE 376A/Stat 376A Information Theory
Prof. T. Weissman
Friday, March 17, 2006

Solutions to Practice Final Problems

These problems are sampled from a couple of the actual finals from previous years.

1. (20 points) Errors and erasures. Consider a binary symmetric channel (BSC) with crossover probability p. [Channel diagram: input 0 goes to output 0 and input 1 to output 1 with probability 1 − p each, and crosses over to the other output with probability p.] A helpful genie who knows the locations of all bit flips offers to convert flipped bits into erasures. In other words, the genie can transform the BSC into a binary erasure channel (BEC). Would you use his power? Be specific.

Solution: Errors and erasures. Although it is very tempting to accept the genie's offer, on second thought one realizes that it is disadvantageous to convert the bit flips into erasures when p is large. For example, when p = 1 the original BSC is noiseless (every bit is flipped, so the output determines the input), while the helpful genie would erase every single bit coming out of the channel. The capacity C1(p) of the binary symmetric channel with crossover probability p is 1 − H(p), while the capacity C2(p) of the binary erasure channel with erasure probability p is 1 − p. One should convert the BSC into a BEC only if C1(p) ≤ C2(p), that is, only if p ≤ p* ≈ 0.7729, where p* solves H(p*) = p*. (See Figure 1.)

[Figure 1: plots of 1 − H(p) and 1 − p versus p on [0, 1]; the two curves cross at p*.]

2. (20 points) Code constraint. What is the capacity of a BSC(p) under the constraint that each of the codewords has a proportion of 1s less than or equal to α, i.e.,

    (1/n) Σ_{i=1}^{n} X_i(w) ≤ α,  for w ∈ {1, 2, ..., 2^{nR}}?

(Pay attention when α > 1/2.)

Solution: Code constraint. Using an argument similar to the one for the capacity of Gaussian channels under the power constraint P, we find that the capacity C of a BSC(p) under the proportion constraint is

    C = max_{p(x): EX ≤ α} I(X; Y).
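The threshold p* can be checked numerically: the genie's offer is worth taking exactly when 1 − H(p) ≤ 1 − p, i.e., when H(p) ≥ p. The following Python sketch is not part of the original solutions; the function names are illustrative.

```python
import math

def h(p):
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity 1 - H(p) of a BSC with crossover probability p."""
    return 1 - h(p)

def bec_capacity(p):
    """Capacity 1 - p of a BEC with erasure probability p."""
    return 1 - p

# Bisection on (1/2, 1) for the root of H(p) = p.  For p <= 1/2 we
# always have H(p) >= p, so accepting the genie's offer is safe there.
lo, hi = 0.5, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if h(mid) > mid:      # BEC still at least as good: root lies to the right
        lo = mid
    else:
        hi = mid
p_star = (lo + hi) / 2
print(round(p_star, 4))   # approximately 0.7729
```

Accept the offer precisely for p ≤ p*; beyond that point 1 − H(p) climbs back toward 1 while 1 − p keeps falling.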
Now, under the Bernoulli(α) input distribution with α ≤ 1/2, we have

    I(X; Y) = H(Y) − H(Y | X) = H(Y) − H(Z | X) = H(Y) − H(Z) = H(α ∗ p) − H(p),   (1)

where α ∗ p = (1 − α)p + α(1 − p). (Expanding I(X; Y) = H(X) − H(X | Y) = H(X) − H(Z | Y) instead is far more complicated, since Z and Y are correlated.) Now when α > 1/2, we have

    max_{α′ ≤ α} H(α′ ∗ p) − H(p) = 1 − H(p),

with the capacity-achieving α* = 1/2. On the other hand, when α ≤ 1/2, α* = α achieves the maximum of (1); hence

    C = H(α ∗ p) − H(p).

3. (20 points) Partition. Let (X, Y) denote height and weight. Let [Y] be Y rounded off to the nearest pound.

(a) Which is greater, I(X; Y) or I(X; [Y])?

(b) Why?

Solution: Partition.

(a) I(X; Y) ≥ I(X; [Y]).

(b) Data processing inequality: [Y] is a deterministic function of Y, so X → Y → [Y] forms a Markov chain.
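The constrained capacity C = H(min(α, 1/2) ∗ p) − H(p) from Problem 2 can be evaluated directly. A minimal Python sketch (illustrative names, not part of the original solutions); a brute-force sweep over feasible Bernoulli(a) inputs confirms that a = α is optimal when α ≤ 1/2:

```python
import math

def h(p):
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def constrained_bsc_capacity(alpha, p):
    """Capacity of BSC(p) when each codeword's proportion of 1s is <= alpha.

    Per the solution: the optimal Bernoulli(a) input uses a = min(alpha, 1/2),
    giving C = H(a * p) - H(p), where a * p = (1 - a)p + a(1 - p).
    """
    a = min(alpha, 0.5)
    a_conv_p = (1 - a) * p + a * (1 - p)   # binary convolution a * p
    return h(a_conv_p) - h(p)

p = 0.1
# With alpha >= 1/2 the constraint is inactive: we recover 1 - H(p).
assert abs(constrained_bsc_capacity(0.9, p) - (1 - h(p))) < 1e-12

# For alpha = 0.3, sweep all feasible Bernoulli(a) inputs: the maximum
# of H(a * p) - H(p) over a in [0, alpha] is attained at a = alpha.
alpha = 0.3
best = max(h((1 - a) * p + a * (1 - p)) - h(p)
           for a in [i / 1000 for i in range(301)])
assert abs(best - constrained_bsc_capacity(alpha, p)) < 1e-9
```

The sweep works because H(a ∗ p) is increasing in a on [0, 1/2]: a ∗ p moves from p toward 1/2, where the output entropy peaks.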
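The data processing inequality in Problem 3 can also be illustrated numerically: rounding Y merges columns of the joint pmf, which can only shrink the mutual information. A toy Python check with made-up height/weight numbers (not from the original solutions):

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as a dict {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Toy joint distribution of height X and weight Y (illustrative numbers).
joint = {
    ('short', 120.4): 0.20, ('short', 120.6): 0.15, ('short', 150.2): 0.05,
    ('tall',  120.4): 0.05, ('tall',  150.2): 0.30, ('tall',  150.4): 0.25,
}

# [Y]: round Y to the nearest pound, a deterministic function of Y,
# so X -> Y -> [Y] is a Markov chain and I(X;[Y]) <= I(X;Y).
quantized = {}
for (x, y), p in joint.items():
    key = (x, round(y))
    quantized[key] = quantized.get(key, 0.0) + p

assert mutual_information(quantized) <= mutual_information(joint) + 1e-12
```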