ECE 534: Elements of Information Theory, Fall 2010
Homework 7 Solutions – all by Kenneth Palacio Baus
October 24, 2010

1. Problem 7.23. Binary multiplier channel

(a) Consider the channel Y = XZ, where X and Z are independent binary random variables that take on values 0 and 1. Z is Bernoulli(α) [i.e., P(Z = 1) = α]. Find the capacity of this channel and the maximizing distribution on X.

    X   Z   Y
    0   0   0
    0   1   0
    1   0   0
    1   1   1

Table 1: Possible values for Y = XZ.

Define the distribution of X as follows:

    p(x) = 1 - p   for x = 0
           p       for x = 1

Then we have the following distribution for Y:

    p(y) = 1 - αp  for y = 0
           αp      for y = 1

So we can compute the capacity as:

    C = max_{p(x)} I(X; Y)                                          (1)
      = H(Y) - H(Y|X)                                               (2)
      = H(αp) - [P(X = 0) H(Y|X = 0) + P(X = 1) H(Y|X = 1)]         (3)
      = H(αp) - p H(α),                                             (4)

since H(Y|X = 0) = 0 and H(Y|X = 1) = H(Z) = H(α).
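The closed form I(X; Y) = H(αp) - p H(α) in equation (4) can be sanity-checked numerically. Below is a minimal Python sketch (the helper names Hb and mutual_information are mine, not from the solution) that builds the joint distribution of (X, Y) for Y = XZ and compares the mutual information computed from the joint against the closed form:

```python
import itertools
from math import log2

def Hb(q):
    """Binary entropy in bits; H(0) = H(1) = 0."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

def mutual_information(p, alpha):
    """I(X;Y) for Y = X*Z with X ~ Bern(p), Z ~ Bern(alpha), from the joint p(x, y)."""
    joint = {}
    for x, z in itertools.product([0, 1], repeat=2):
        prob = (p if x else 1 - p) * (alpha if z else 1 - alpha)
        joint[(x, x * z)] = joint.get((x, x * z), 0.0) + prob
    px = {x: sum(v for (xx, _), v in joint.items() if xx == x) for x in (0, 1)}
    py = {y: sum(v for (_, yy), v in joint.items() if yy == y) for y in (0, 1)}
    return sum(v * log2(v / (px[x] * py[y]))
               for (x, y), v in joint.items() if v > 0)

# Equation (4): I(X;Y) = H(alpha*p) - p*H(alpha), for arbitrary test values
p, alpha = 0.6, 0.3
closed_form = Hb(alpha * p) - p * Hb(alpha)
assert abs(mutual_information(p, alpha) - closed_form) < 1e-12
```

The agreement holds for any p and α in (0, 1), since H(Y) = H(αp) and H(Y|X) = p H(α) exactly.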
Now we need to find the parameter p that maximizes the mutual information. We take the approach of setting the derivative to zero:

    C = H(αp) - p H(α)                                              (5)
      = -αp log2(αp) - (1 - αp) log2(1 - αp) - p H(α)               (6)

    0 = d/dp [-αp log2(αp) - (1 - αp) log2(1 - αp) - p H(α)]        (7)
      = -α log2(αp) - α log2(e) + α log2(1 - αp) + α log2(e) - H(α) (8)
      = -α log2(αp) + α log2(1 - αp) - H(α)                         (9)

    H(α) = -α (log2(αp) - log2(1 - αp))                             (10)
    2^{H(α)/α} = (1 - αp) / (αp)                                    (11)
    αp = (1 - αp) 2^{-H(α)/α}                                       (12)
    αp (1 + 2^{-H(α)/α}) = 2^{-H(α)/α}                              (13)

We obtain:

    p = 2^{-H(α)/α} / [(1 + 2^{-H(α)/α}) α]                         (14)
      = 1 / [(2^{H(α)/α} + 1) α]                                    (15)

Then we can compute the capacity:

    C = H(αp) - p H(α)                                              (16)
      = H(1 / (2^{H(α)/α} + 1)) - H(α) / [(2^{H(α)/α} + 1) α]       (17)

(b) Now suppose that the receiver can observe Z as well as Y. What is the capacity?

If we observe Z and Y, the expression for the capacity is:

    C = max_{p(x)} I(X; Y, Z)                                       (18)
    I(X; Y, Z) = I(X; Z) + I(X; Y|Z)                                (19)

I(X; Z) = 0 since X and Z are independent.

    I(X; Y|Z) = H(Y|Z) - H(Y|X, Z)                                  (20)
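The optimizer in equation (15) and the capacity in equation (17) can be verified against a brute-force grid search. A short Python check (helper names are mine; it assumes α is large enough that the unconstrained optimum p* lies in [0, 1]):

```python
from math import log2

def Hb(q):
    """Binary entropy in bits; H(0) = H(1) = 0."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

def capacity(alpha):
    """Closed-form optimal p and capacity from equations (15) and (17)."""
    t = 2 ** (Hb(alpha) / alpha)          # t = 2^{H(alpha)/alpha}
    p_star = 1.0 / (alpha * (t + 1.0))    # eq. (15)
    C = Hb(1.0 / (t + 1.0)) - Hb(alpha) / (alpha * (t + 1.0))  # eq. (17)
    return p_star, C

alpha = 0.7
p_star, C = capacity(alpha)

# Brute-force check: p_star should maximize H(alpha*p) - p*H(alpha) over [0, 1]
f = lambda p: Hb(alpha * p) - p * Hb(alpha)
grid = [i / 10000 for i in range(10001)]
best = max(grid, key=f)
assert abs(best - p_star) < 1e-3
assert abs(f(p_star) - C) < 1e-9
```

For small α the formula can give p* > 1, in which case the maximum is attained at the boundary p = 1; the check above avoids that regime.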
H(Y|X, Z) = 0, since given X and Z there is no uncertainty in Y.

    I(X; Y|Z) = H(Y|Z)                                              (21)
              = P(Z = 0) H(Y|Z = 0) + P(Z = 1) H(Y|Z = 1)           (22)
              = P(Z = 1) H(Y|Z = 1)                                 (23)
              = α H(X)                                              (24)
              = α H(p)                                              (25)

Then the capacity:

    C = max_{p(x)} I(X; Y, Z)                                       (26)
      = max_p α H(p)                                                (27)
      = α,                                                          (28)

since H(p) is maximized by p = 1/2.
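Equation (25) and the resulting capacity C = α can also be confirmed numerically. The sketch below (helper names I_x_yz and Hb are illustrative) treats the pair (Y, Z) as a single output and computes I(X; Y, Z) from the joint distribution:

```python
import itertools
from math import log2

def Hb(q):
    """Binary entropy in bits; H(0) = H(1) = 0."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

def I_x_yz(p, alpha):
    """I(X; Y,Z) for Y = X*Z, with the pair (y, z) treated as one output symbol."""
    joint = {}
    for x, z in itertools.product([0, 1], repeat=2):
        prob = (p if x else 1 - p) * (alpha if z else 1 - alpha)
        joint[(x, (x * z, z))] = joint.get((x, (x * z, z)), 0.0) + prob
    px = {x: sum(v for (xx, _), v in joint.items() if xx == x) for x in (0, 1)}
    pyz = {}
    for (_, yz), v in joint.items():
        pyz[yz] = pyz.get(yz, 0.0) + v
    return sum(v * log2(v / (px[x] * pyz[yz]))
               for (x, yz), v in joint.items() if v > 0)

alpha = 0.4
# Equation (25): I(X; Y,Z) = alpha * H(p) for every p ...
for p in (0.2, 0.5, 0.9):
    assert abs(I_x_yz(p, alpha) - alpha * Hb(p)) < 1e-12
# ... maximized at p = 1/2, giving C = alpha
assert abs(I_x_yz(0.5, alpha) - alpha) < 1e-12
```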
2. Problem 7.28. Choice of channels. Find the capacity C of the union of two channels (X1, p1(y1|x1), Y1) and (X2, p2(y2|x2), Y2), where at each time one can send a symbol over channel 1 or channel 2 but not both. Assume that the output alphabets are distinct and do not intersect.

(a) Show that 2^C = 2^{C1} + 2^{C2}. Thus, 2^C is the effective alphabet size of a channel with capacity C.

Solution: In this communication system we can choose between two sub-channels with a certain probability; let's call it λ. We can define a Bernoulli(λ) random variable:

    Q = 1, use sub-channel 1, with probability λ
        2, use sub-channel 2, with probability 1 - λ

So we can see the input of the channel as X = (Q, X_Q). We also have that, since Y1 and Y2 do not intersect, Q = f(Y), so we can write:

    I(X; Y, Q) = I(X_Q, Q; Y, Q)                                          (29)
               = I(Y, Q; Q) + I(Y, Q; X_Q | Q)                            (30)
               = I(Q; Q) + I(Q; Y | Q) + I(Y; X_Q | Q)                    (31)
               = H(Q) - H(Q|Q) + H(Q|Q) - H(Q|Y, Q) + I(Y; X_Q | Q)       (32)
               = H(Q) + I(Y; X_Q | Q)                                     (33)
               = H(λ) + λ I(Y; X_Q | Q = 1) + (1 - λ) I(Y; X_Q | Q = 2)   (34)
               = H(λ) + λ I(Y1; X1) + (1 - λ) I(Y2; X2)                   (35)

Capacity follows from the mutual information:

    C = max_{λ, p1(x1), p2(x2)} [H(λ) + λ I(X1; Y1) + (1 - λ) I(X2; Y2)]  (36)
      = max_λ [H(λ) + λ C1 + (1 - λ) C2]                                  (37)

Setting the derivative with respect to λ to zero gives log2((1 - λ)/λ) + C1 - C2 = 0, so λ* = 2^{C1} / (2^{C1} + 2^{C2}). Substituting back:

    C = log2(2^{C1} + 2^{C2}),                                            (38)

that is, 2^C = 2^{C1} + 2^{C2}, as required.
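The claimed identity 2^C = 2^{C1} + 2^{C2} can be checked numerically by maximizing H(λ) + λC1 + (1 - λ)C2 over λ on a grid. A small Python sketch (union_capacity is my name; the values C1 = 1, C2 = 2 are example capacities, e.g. a noiseless binary channel and a noiseless 4-ary channel):

```python
from math import log2

def Hb(q):
    """Binary entropy in bits; H(0) = H(1) = 0."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

def union_capacity(C1, C2):
    """Numerically maximize H(lambda) + lambda*C1 + (1 - lambda)*C2 over a fine grid."""
    grid = [i / 100000 for i in range(100001)]
    return max(Hb(l) + l * C1 + (1 - l) * C2 for l in grid)

C1, C2 = 1.0, 2.0
C = union_capacity(C1, C2)
# identity: 2^C = 2^C1 + 2^C2, i.e. C = log2(2 + 4) = log2(6)
assert abs(C - log2(2 ** C1 + 2 ** C2)) < 1e-6
```

The optimizer is λ* = 2^{C1} / (2^{C1} + 2^{C2}) = 1/3 in this example: the sender uses the higher-capacity sub-channel more often, but not exclusively, because randomizing over Q itself carries H(λ) bits.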