ECE 534: Elements of Information Theory, Fall 2010
Homework 7 Solutions
by Kenneth Palacio Baus
October 24, 2010

1. Problem 7.23. Binary multiplier channel

(a) Consider the channel Y = XZ, where X and Z are independent binary random variables that take on values 0 and 1. Z is Bernoulli(α) [i.e., P(Z = 1) = α]. Find the capacity of this channel and the maximizing distribution on X.

    X   Z   Y
    0   0   0
    0   1   0
    1   0   0
    1   1   1

Table 1: Possible values for Y = XZ.

Define the distribution of X as follows:

    p(x) = 1 - p   for x = 0,
           p       for x = 1.

Then we have the following distribution for Y:

    p(y) = 1 - pα  for y = 0,
           pα      for y = 1.

So we can compute the capacity as:

    C = max_{p(x)} I(X;Y)                                                (1)
    I(X;Y) = H(Y) - H(Y|X)                                               (2)
           = H(pα) - [P(X = 0) H(Y|X = 0) + P(X = 1) H(Y|X = 1)]         (3)
           = H(pα) - p H(α),                                             (4)

since H(Y|X = 0) = 0 and H(Y|X = 1) = H(Z) = H(α).

Now we need to find the parameter p which maximizes the mutual information. We take the approach of setting the derivative equal to zero:

    C = H(pα) - p H(α)                                                   (5)
      = -pα log_2(pα) - (1 - pα) log_2(1 - pα) - p H(α)                  (6)
    0 = d/dp [-pα log_2(pα) - (1 - pα) log_2(1 - pα) - p H(α)]           (7)
      = -α log_2(pα) - α log_2(e) + α log_2(1 - pα) + α log_2(e) - H(α)  (8)
      = -α log_2(pα) + α log_2(1 - pα) - H(α)                            (9)
    H(α) = -α [log_2(pα) - log_2(1 - pα)]                                (10)
    2^{H(α)/α} = 2^{log_2((1 - pα)/(pα))} = (1 - pα)/(pα)                (11)
    pα = (1 - pα) 2^{-H(α)/α}                                            (12)
    pα (1 + 2^{-H(α)/α}) = 2^{-H(α)/α}                                   (13)

We obtain:

    pα = 2^{-H(α)/α} / (1 + 2^{-H(α)/α})                                 (14)
       = 1 / (2^{H(α)/α} + 1),                                           (15)

so the maximizing distribution on X has p = P(X = 1) = 1/(α (2^{H(α)/α} + 1)). Then we can compute the capacity:

    C = H(pα) - p H(α)                                                   (16)
      = H( 1/(2^{H(α)/α} + 1) ) - H(α)/(α (2^{H(α)/α} + 1)),             (17)

which simplifies to C = log_2(1 + 2^{-H(α)/α}).

(b) Now suppose that the receiver can observe Z as well as Y. What is the capacity?

If we observe Z and Y, the expression for the capacity is:

    C = max_{p(x)} I(X;Y,Z)                                              (18)
    I(X;Y,Z) = I(X;Z) + I(X;Y|Z).                                        (19)

I(X;Z) = 0 since X and Z are independent.

    I(X;Y|Z) = H(Y|Z) - H(Y|X,Z).                                        (20)

H(Y|X,Z) = 0 since, given X and Z, there is no uncertainty in Y. Therefore:

    I(X;Y|Z) = H(Y|Z)                                                    (21)
             = P(Z = 0) H(Y|Z = 0) + P(Z = 1) H(Y|Z = 1)                 (22)
             = P(Z = 1) H(Y|Z = 1)                                       (23)
             = α H(X)                                                    (24)
             = α H(p).                                                   (25)

Then the capacity:

    C = max_{p(x)} I(X;Y,Z)                                              (26)
      = max_p α H(p)                                                     (27)
      = α,                                                               (28)

since H(p) is maximized at p = 1/2, where H(1/2) = 1.

2. Problem 7.28. Choice of channels. Find the capacity C of the union of two channels (X_1, p_1(y_1|x_1), Y_1) and (X_2, p_2(y_2|x_2), Y_2), where, at each time, one can send a symbol over channel 1 or over channel 2 but not both. Assume that the output alphabets are distinct and do not intersect. ...
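As a quick numerical sanity check (not part of the original solution), the Problem 7.23 answers can be compared against a brute-force maximization of I(X;Y) = H(pα) - pH(α) over p. The helper names below (`H`, `capacity_closed_form`, `capacity_grid`, `p_star`) are ad hoc, and the simplified closed form C = log2(1 + 2^(-H(α)/α)) is the algebraic simplification of equation (17):

```python
import math

def H(q):
    """Binary entropy in bits, with H(0) = H(1) = 0."""
    if q <= 0.0 or q >= 1.0:
        return 0.0
    return -q * math.log2(q) - (1.0 - q) * math.log2(1.0 - q)

def capacity_closed_form(alpha):
    """Part (a) capacity in closed form: log2(1 + 2^(-H(alpha)/alpha))."""
    return math.log2(1.0 + 2.0 ** (-H(alpha) / alpha))

def capacity_grid(alpha, steps=100000):
    """Brute-force maximum over p of I(X;Y) = H(p*alpha) - p*H(alpha)."""
    return max(H((i / steps) * alpha) - (i / steps) * H(alpha)
               for i in range(steps + 1))

alpha = 0.7
# Maximizing input distribution from (15): p = 1/(alpha*(2^(H(alpha)/alpha)+1)).
p_star = 1.0 / (alpha * (2.0 ** (H(alpha) / alpha) + 1.0))

# Closed form, grid search, and I(X;Y) evaluated at p_star should all agree.
print(capacity_closed_form(alpha))
print(capacity_grid(alpha))
print(H(p_star * alpha) - p_star * H(alpha))

# Part (b): with Z observed at the receiver, C = max_p alpha*H(p) = alpha.
print(alpha * H(0.5))
```

With 100,000 grid points the brute-force value matches the closed form to well below 1e-6, and the part (b) check returns exactly α, since H(1/2) = 1.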