# hw7sol: EE 376A Information Theory, Prof. T. Cover, Handout #24


EE 376A Handout #24: Information Theory
Thursday, March 10, 2011
Prof. T. Cover

Solutions to Homework Set #7

1. Source and channel. We wish to encode a Bernoulli(α) process V1, V2, ... for transmission over a binary symmetric channel with crossover probability p:

    V^n (e.g. 011011101) → X^n(V^n) → BSC(p) → Y^n → V̂^n

[Figure: the BSC(p) maps input 0 to output 0 and input 1 to output 1 with probability 1 − p, and flips the bit with probability p.]

Find conditions on α and p so that the probability of error P(V̂^n ≠ V^n) can be made to go to zero as n → ∞.

Solution: Source and channel. Suppose we want to send a binary i.i.d. Bernoulli(α) source over a binary symmetric channel with crossover probability p. By the source-channel separation theorem, in order to achieve a probability of error that vanishes asymptotically, i.e. P(V̂^n ≠ V^n) → 0, we need the entropy of the source to be less than the capacity of the channel: H(α) < 1 − H(p). Hence

    H(α) + H(p) < 1,

or, equivalently (since α^α (1−α)^(1−α) p^p (1−p)^(1−p) = 2^−(H(α)+H(p))),

    α^α (1−α)^(1−α) p^p (1−p)^(1−p) > 1/2.

2. Cascaded BSCs. Consider the two discrete memoryless channels (X, p1(y|x), Y) and (Y, p2(z|y), Z). Let p1(y|x) and p2(z|y) be binary symmetric channels with crossover probabilities λ1 and λ2, respectively.

[Figure: cascade of two BSCs, X → BSC(λ1) → Y → BSC(λ2) → Z; each channel maps 0 → 0 and 1 → 1 with probability 1 − λi and flips the bit with probability λi.]

(a) What is the capacity C1 of p1(y|x)?

(b) What is the capacity C2 of p2(z|y)?

(c) We now cascade these channels, so that p3(z|x) = Σ_y p1(y|x) p2(z|y). What is the capacity C3 of p3(z|x)? Show that C3 ≤ min{C1, C2}.

(d) Now let us actively intervene between channels 1 and 2, rather than passively transmitting y^n. What is the capacity of channel 1 followed by channel 2 if you are allowed to decode the output y^n of channel 1 and then re-encode it as ỹ^n for transmission over channel 2? (Think W → x^n(W) → Y^n → ỹ^n(Y^n) → Z^n → Ŵ.)

(e) What is the capacity of the cascade in part (c) if the receiver can view both Y and Z?

Solution: Cascaded BSCs.

(a) C1 is just the capacity of a BSC(λ1). Thus, C1 = 1 − H(λ1).

(b) Similarly, C2 = 1 − H(λ2).
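The entropy and capacity conditions above lend themselves to a quick numerical check. Below is a minimal Python sketch, not part of the handout: the function names and the sample values of α, p, λ1, λ2 are illustrative. For the cascade it uses the standard fact that BSC(λ1) followed by BSC(λ2) is itself a BSC whose crossover probability is the chance that exactly one of the two channels flips the bit.

```python
from math import log2

def H(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

# Problem 1: a Bernoulli(alpha) source is reliably transmissible over a
# BSC(p) iff H(alpha) < 1 - H(p), i.e. H(alpha) + H(p) < 1.
def transmissible(alpha, p):
    return H(alpha) + H(p) < 1

# Equivalent product form: the left-hand side equals 2^-(H(alpha) + H(p)),
# so the condition becomes  alpha^alpha (1-alpha)^(1-alpha) p^p (1-p)^(1-p) > 1/2.
def product_form(alpha, p):
    return (alpha**alpha * (1 - alpha)**(1 - alpha)
            * p**p * (1 - p)**(1 - p)) > 0.5

# Problem 2: cascading BSC(l1) and BSC(l2) gives a BSC whose crossover
# probability is P(exactly one of the two channels flips the bit).
def cascade_crossover(l1, l2):
    return l1 * (1 - l2) + (1 - l1) * l2

def bsc_capacity(lam):
    return 1 - H(lam)

if __name__ == "__main__":
    # Illustrative values, not taken from the handout.
    assert transmissible(0.1, 0.05) and not transmissible(0.4, 0.2)
    assert transmissible(0.1, 0.05) == product_form(0.1, 0.05)

    l1, l2 = 0.1, 0.15
    C1, C2 = bsc_capacity(l1), bsc_capacity(l2)
    C3 = bsc_capacity(cascade_crossover(l1, l2))
    assert C3 <= min(C1, C2)  # consistent with the data-processing inequality
```

Running the checks for a few parameter choices confirms both the equivalence of the two forms of the Problem 1 condition and that the cascade capacity never exceeds min{C1, C2}.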