Information Theory and Coding    EPFL Winter Semester 2009/2010
Prof. Suhas Diggavi    Handout # 23, Thursday, 17 December, 2009

Solutions: Homework Set # 6

Problem 1 (Cascade Network)

(a) We know that the capacity of the channel is

    C = max_{P_X} I(X; V).

From the problem setup we observe the following Markov chain:

    X ↔ Y ↔ U ↔ V.

By the data processing inequality, I(X; V) ≤ I(X; Y) and I(X; V) ≤ I(U; V). We can then proceed as follows. Since I(X; V) ≤ I(X; Y), we can maximize both sides over P_X to obtain

    C = max_{P_X} I(X; V) ≤ max_{P_X} I(X; Y) = C(p).

Showing C ≤ C(q) is a little trickier and must be done in two maximization steps. Again I(X; V) ≤ I(U; V), so we can write

    C = max_{P_X} I(X; V) ≤ I(U; V)   (evaluated at the distribution P′_U induced by choosing P*_X to be the maximizer of I(X; V))
      ≤ max_{P_U} I(U; V) = C(q).

Combining the two bounds, we finally have

    C ≤ min[C(p), C(q)].

Note that the above argument works for any cascade of two channels, not only binary symmetric channels.

(b) In this case there is no processing at the relay, U = Y, so the overall channel from X to V can be regarded as a new binary symmetric channel with some new crossover probability. To find the transition probabilities of the overall channel we proceed as follows:

    P[V = 0 | X = 0] = Σ_{i ∈ {0,1}} P[V = 0, Y = i | X = 0]                      (1)
                     = Σ_{i ∈ {0,1}} P[V = 0 | X = 0, Y = i] · P[Y = i | X = 0]   (2)
                     = Σ_{i ∈ {0,1}} P[V = 0 | Y = i] · P[Y = i | X = 0]
                     = (1 − p)(1 − q) + pq,

where (1) follows from the law of total probability, (2) follows from the chain rule for probability, and the last step follows from the Markov chain X ↔ Y ↔ U ↔ V.
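As a quick numerical sanity check (my own addition, not part of the handout), the Python sketch below computes the probability P[V = 0 | X = 0] = (1 − p)(1 − q) + pq just derived, forms the corresponding crossover probability, and verifies the bound from part (a) that the cascade capacity cannot exceed min[C(p), C(q)]. The function names are illustrative choices.

```python
import math

def h2(x):
    """Binary entropy in bits; h2(0) = h2(1) = 0 by convention."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def bsc_capacity(eps):
    """Capacity C(eps) = 1 - h2(eps) of a binary symmetric channel."""
    return 1.0 - h2(eps)

def cascade_correct(p, q):
    """P[V = 0 | X = 0] for two cascaded BSCs: (1-p)(1-q) + pq."""
    return (1 - p) * (1 - q) + p * q

p, q = 0.1, 0.2
crossover = 1.0 - cascade_correct(p, q)   # equals p(1-q) + (1-p)q
C_cascade = bsc_capacity(crossover)       # capacity without relay processing

# Part (a): the cascade can never beat the weaker of the two links.
assert C_cascade <= min(bsc_capacity(p), bsc_capacity(q))
print(f"crossover = {crossover:.3f}, C' = {C_cascade:.4f}")
```

For p = 0.1 and q = 0.2 the effective crossover is 0.26, strictly worse than either link alone, which matches the intuition behind part (a).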
Then we can write

    P[V = 0 | X = 1] = Σ_{i ∈ {0,1}} P[V = 0, Y = i | X = 1]
                     = Σ_{i ∈ {0,1}} P[V = 0 | X = 1, Y = i] · P[Y = i | X = 1]
                     = Σ_{i ∈ {0,1}} P[V = 0 | Y = i] · P[Y = i | X = 1]
                     = p(1 − q) + (1 − p) q.

Similarly, P[V = 1 | X = 0] and P[V = 1 | X = 1] can be found by the same method, and by symmetry the overall channel is again a binary symmetric channel with crossover probability p(1 − q) + (1 − p) q. The capacity in this case is therefore

    C′ = 1 − h₂( p(1 − q) + (1 − p) q ).

(c) In this part we assume the relay can do some processing. The scheme we suggest is as follows. Let r = min[C(p), C(q)]. First, the source S uses a channel code of rate r to encode its data and sends it over the first channel. The relay then waits until it has received the whole block, and decodes it. Because the source sends information at a rate not exceeding the capacity of the first channel (r ≤ C(p)), the decoding error can be made as small as desired. ...
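The gain from processing at the relay in part (c) can be illustrated with a toy Monte Carlo simulation (my own addition, not from the handout). A 5-fold repetition code with majority decoding stands in for the capacity-achieving code of the argument above; all function names and parameters here are illustrative. We compare the relay forwarding its raw observations (part (b)) against decode-and-forward (part (c)).

```python
import random

def bsc(bits, eps, rng):
    """Pass a list of bits through a BSC with crossover probability eps."""
    return [b ^ (rng.random() < eps) for b in bits]

def majority(bits):
    """Majority-vote decoder for a repetition code (odd length)."""
    return int(sum(bits) > len(bits) // 2)

def end_to_end_error(decode_at_relay, n_trials=20000, reps=5, p=0.1, q=0.1, seed=0):
    """Estimate the end-to-end bit error of the cascade X -> Y -> V.

    Each data bit is sent with a reps-fold repetition code (a toy stand-in
    for the capacity-achieving code of part (c)). If decode_at_relay is
    True, the relay decodes and re-encodes before the second hop;
    otherwise it forwards its noisy observations unchanged."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_trials):
        bit = rng.randrange(2)
        y = bsc([bit] * reps, p, rng)          # first BSC
        if decode_at_relay:
            y = [majority(y)] * reps           # relay decodes, re-encodes
        v = bsc(y, q, rng)                     # second BSC
        errors += (majority(v) != bit)
    return errors / n_trials

forward_only = end_to_end_error(decode_at_relay=False)
decode_forward = end_to_end_error(decode_at_relay=True)
assert decode_forward < forward_only   # processing at the relay helps
print(forward_only, decode_forward)
```

The repetition code has rate 1/5, far below capacity, so this does not demonstrate achievability of min[C(p), C(q)]; it only shows the qualitative point that cleaning up the noise at the relay outperforms letting the two channels' errors accumulate.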