ECE 534: Elements of Information Theory, Fall 2010
Homework 6 Solutions
October 19, 2010

1. Problem 7.2. Additive noise channel (Matteo Carminati). Find the channel capacity of the following discrete memoryless channel (the figure shows an additive noise channel, Y = X + Z), where Pr{Z = 0} = Pr{Z = a} = 1/2. The alphabet for X is {0, 1}. Assume that Z is independent of X. Observe that the channel capacity depends on the value of a.

Solution: The channel capacity takes one of two values, according to the value of a.

• If a is different from 1 and -1, the outputs of the channel do not overlap: the outputs reachable from X = 0, namely {0, a}, are disjoint from those reachable from X = 1, namely {1, 1 + a}, so X can be recovered exactly from Y. The channel is therefore a noisy channel with nonoverlapping outputs, and its capacity is 1 bit, since

    C = max I(X; Y)
      = max [H(X) - H(X|Y)]   (mutual information definition)
      = max H(X)              (H(X|Y) = 0, since Y determines X)
      = 1 bit                 (maximum entropy of a binary variable, achieved by a uniform input).

• In the second case, a = 1 or a = -1, the outputs of the channel can overlap: if a = 1, Y = 1 can arise both from X = 0 (with Z = 1) and from X = 1 (with Z = 0); if a = -1, Y = 0 can arise from either input. Let us compute the channel capacity for a = 1; the same value, by a symmetric argument, is obtained for a = -1. The channel treats its two inputs symmetrically, so the maximizing input distribution is uniform, p(x = 0) = p(x = 1) = 1/2, which gives p(y = 0) = 1/4, p(y = 1) = 1/2, p(y = 2) = 1/4. Then

    H(Y) = -p(y = 0) log2 p(y = 0) - p(y = 1) log2 p(y = 1) - p(y = 2) log2 p(y = 2)
         = -(2/4) log2 (1/4) - (1/2) log2 (1/2)
         = 1 + 1/2 = 3/2

    H(Y|X) = p(x = 0) H(Y|X = 0) + p(x = 1) H(Y|X = 1)
           = (1/2)(1) + (1/2)(1) = 1,

where H(Y|X = x) = H(Z) = 1 bit because, given either input, Y is uniform over two values. Thus

    C = max I(X; Y) = max [H(Y) - H(Y|X)] = 3/2 - 1 = 1/2 bit.

2. Problem 7.7. Cascade of binary symmetric channels (Davide Basilio Bartolini). Show that a cascade of n identical independent binary symmetric channels,

    X → BSC → X_1 → ... → X_{n-1} → BSC → X_n,

each with raw error probability p, is equivalent to a single BSC with error probability (1/2)(1 - (1 - 2p)^n), and hence that lim_{n→∞} I(X; X_n) = 0 if p ≠ 0, 1. No encoding or decoding takes place at the intermediate terminals X_1, ..., X_{n-1}. Thus, the capacity of the cascade tends to zero.

Solution: The conditional probability distribution p(y|x) of each of the BSCs may be expressed by the transition probability matrix A, given by

    A = [ 1 - p     p   ]
        [   p     1 - p ].

The transition matrix for the cascade of n channels is A_n = A^n; it is possible to exploit the singular...
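As a numerical sanity check on the Problem 7.2 answer, here is a minimal Python sketch (an added illustration, not part of the original solutions; it assumes only numpy is available). It maximizes I(X; Y) over the input distribution Pr{X = 0} = q by grid search and recovers C ≈ 1 bit when a is not 1 or -1 and C ≈ 1/2 bit when a = ±1:

    # Grid-search check of Problem 7.2: capacity of Y = X + Z with
    # Pr{Z=0} = Pr{Z=a} = 1/2 and X taking values in {0, 1}.
    import numpy as np

    def mutual_information(q, a):
        """I(X;Y) in bits when Pr{X=0} = q and the noise offset is a."""
        px = {0: q, 1: 1.0 - q}
        joint = {}                      # joint distribution p(x, y)
        for x, p_x in px.items():
            for z in (0.0, a):          # Z is 0 or a, each with probability 1/2
                y = x + z
                joint[(x, y)] = joint.get((x, y), 0.0) + 0.5 * p_x
        py = {}                         # marginal p(y)
        for (x, y), pr in joint.items():
            py[y] = py.get(y, 0.0) + pr
        return sum(pr * np.log2(pr / (px[x] * py[y]))
                   for (x, y), pr in joint.items() if pr > 0)

    for a in (2.0, 0.5, 1.0, -1.0):     # illustrative values of a
        capacity = max(mutual_information(q, a)
                       for q in np.linspace(0.01, 0.99, 99))
        print(f"a = {a:5}: C ~ {capacity:.4f} bits")

This prints C ~ 1.0000 for a = 2.0 and a = 0.5 (nonoverlapping outputs) and C ~ 0.5000 for a = 1.0 and a = -1.0, with the maximum attained at the uniform input q = 1/2, as derived above.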
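The closed form claimed in Problem 7.7 can likewise be checked numerically, even though the derivation is cut off in the preview above. A minimal sketch (again assuming numpy; the choice p = 0.1 and the values of n are illustrative): it compares the off-diagonal entry of A^n with (1 - (1 - 2p)^n)/2 and reports the cascade's mutual information 1 - H(p_n) at a uniform input.

    # Check of Problem 7.7: the crossover probability of the n-stage
    # cascade, read off A^n, should equal (1 - (1 - 2p)^n)/2, and the
    # capacity 1 - H(p_n) should tend to 0.
    import numpy as np

    def h2(x):
        """Binary entropy in bits."""
        if x <= 0.0 or x >= 1.0:
            return 0.0
        return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

    p = 0.1
    A = np.array([[1 - p, p],
                  [p, 1 - p]])               # one BSC(p) stage

    for n in (1, 5, 20, 100):
        An = np.linalg.matrix_power(A, n)      # n-stage cascade
        pn = An[0, 1]                          # effective crossover probability
        closed = 0.5 * (1 - (1 - 2 * p) ** n)  # closed form from the problem
        print(f"n = {n:3d}: A^n gives {pn:.6f}, closed form {closed:.6f}, "
              f"I(X;X_n) at uniform input = {1 - h2(pn):.6f}")

For p = 0.1 the two crossover values agree to machine precision, and I(X; X_n) falls from about 0.531 bits at n = 1 toward 0 as n grows, consistent with the capacity of the cascade tending to zero.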