# Binary Multiplier Channel - Information Theory and Coding


Information Theory and Coding, EPFL, Winter Semester 2009/2010
Prof. Suhas Diggavi
Handout # 23, Thursday, 17 December, 2009
Solutions: Homework Set # 6

## Problem 1 (Cascade Network)

(a) The capacity of the channel is

$$C = \max_{P_X} I(X; V).$$

From the problem setup we have the Markov chain $X \to Y \to U \to V$. By the data processing inequality, $I(X;V) \le I(X;Y)$ and $I(X;V) \le I(U;V)$. Taking the maximum over $P_X$ on both sides of the first inequality gives

$$C = \max_{P_X} I(X;V) \le \max_{P_X} I(X;Y) = C(p).$$

Showing $C \le C(q)$ is a little trickier and requires two steps of maximization. Again using $I(X;V) \le I(U;V)$, we can write

$$C = \max_{P_X} I(X;V) \le I(U;V)\Big|_{P_U^*} \le \max_{P_U} I(U;V) = C(q),$$

where the middle term is evaluated at the distribution $P_U^*$ induced on $U$ by choosing $P_X$ to be the maximizer of $I(X;V)$. Combining the two bounds,

$$C \le \min[C(p),\, C(q)].$$

Note that this argument works for any cascade of two channels, not only binary symmetric channels.

(b) In this case, when there is no processing at the relay ($U = Y$), the overall channel from $X$ to $V$ can be regarded as a new binary symmetric channel with some new transition probability. To find it, we proceed as follows:

$$
\begin{aligned}
P[V=0 \mid X=0] &= \sum_{i \in \{0,1\}} P[V=0,\, Y=i \mid X=0] && (1) \\
&= \sum_{i \in \{0,1\}} P[V=0 \mid X=0,\, Y=i] \cdot P[Y=i \mid X=0] && (2) \\
&= \sum_{i \in \{0,1\}} P[V=0 \mid Y=i] \cdot P[Y=i \mid X=0] \\
&= (1-p)(1-q) + pq,
\end{aligned}
$$
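As a quick numerical sanity check of the bound in part (a), the sketch below (hypothetical helper names `h2` and `bsc_capacity`; the example values p = 0.1 and q = 0.2 are assumptions, not from the problem) computes the BSC capacity $C(p) = 1 - h_2(p)$ for each link and the resulting upper bound $\min[C(p), C(q)]$:

```python
import math

def h2(p):
    """Binary entropy function h2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a BSC with crossover probability p: C(p) = 1 - h2(p)."""
    return 1.0 - h2(p)

# Upper bound on the cascade capacity from part (a): C <= min[C(p), C(q)]
p, q = 0.1, 0.2
bound = min(bsc_capacity(p), bsc_capacity(q))
```

Since $h_2$ is increasing on $[0, 1/2]$, the noisier link (here the assumed $q = 0.2$) determines the bound.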


where (1) follows from the chain rule of probability and (2) follows from the Markov chain. Similarly,

$$
\begin{aligned}
P[V=0 \mid X=1] &= \sum_{i \in \{0,1\}} P[V=0,\, Y=i \mid X=1] \\
&= \sum_{i \in \{0,1\}} P[V=0 \mid X=1,\, Y=i] \cdot P[Y=i \mid X=1] \\
&= \sum_{i \in \{0,1\}} P[V=0 \mid Y=i] \cdot P[Y=i \mid X=1] \\
&= p(1-q) + (1-p)q.
\end{aligned}
$$

The same method gives $P[V=1 \mid X=0] = p(1-q) + (1-p)q$ and $P[V=1 \mid X=1] = (1-p)(1-q) + pq$, confirming that the overall channel is a binary symmetric channel with crossover probability $p(1-q) + (1-p)q$. Its capacity is therefore

$$C = 1 - h_2\big(p(1-q) + (1-p)q\big).$$

(c) In this part we assume that the relay can do some processing. The scheme we suggest is as follows. Fix any rate $r < \min[C(p), C(q)]$. The source $S$ first encodes its data with a channel code of rate $r$ and sends it over the first channel. The relay waits until it has received the whole block and then decodes it; because the source transmits at a rate below the capacity of the first channel ($r < C(p)$), the decoding error probability can be made arbitrarily small. After decoding, the relay re-encodes the information with a rate-$r$ code designed for the second channel and sends it to the destination $D$. Again, because $r < C(q)$, the destination can decode the data with arbitrarily small probability of error.
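To illustrate parts (b) and (c) numerically, a minimal sketch (again assuming example values p = 0.1 and q = 0.2, which are not given in the problem) compares the capacity of the unprocessed cascade with the rate achievable by decoding and re-encoding at the relay:

```python
import math

def h2(p):
    """Binary entropy function h2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p, q = 0.1, 0.2

# Part (b): effective crossover probability of the cascade when U = Y
cross = p * (1 - q) + (1 - p) * q   # p(1-q) + (1-p)q
c_forward = 1.0 - h2(cross)         # capacity without relay processing

# Part (c): with decode-and-re-encode, any rate below min[C(p), C(q)] is achievable
c_relay = min(1.0 - h2(p), 1.0 - h2(q))

# Forwarding raw bits compounds the noise, so it does strictly worse here
assert c_forward < c_relay
```

This matches the intuition of part (c): letting the noise of the two channels accumulate (part (b)) is strictly worse than cleaning up the errors at the relay before the second hop.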

