EE 478 Multiple User Information Theory
Handout #11
October 14, 2008

Homework Set #2 Solutions

1. Solution:

(a) We need to show that for any $B_1$, $B_2$ and $\lambda \in [0,1]$,
$$\lambda C(B_1) + (1-\lambda) C(B_2) \le C\bigl(\lambda B_1 + (1-\lambda) B_2\bigr).$$
Let $X_1 \sim p_1(x)$ achieve $C(B_1)$ and $X_2 \sim p_2(x)$ achieve $C(B_2)$. Define $Q \sim \mathrm{Bern}(\lambda)$ and
$$X = \begin{cases} X_1, & Q = 1, \\ X_2, & Q = 0. \end{cases} \tag{1}$$
By this definition, $p(x) = \lambda p_1(x) + (1-\lambda) p_2(x)$, and
$$\mathrm{E}\, b(X) = \sum_{x \in \mathcal{X}} p(x)\, b(x) = \lambda \sum_{x \in \mathcal{X}} p_1(x)\, b(x) + (1-\lambda) \sum_{x \in \mathcal{X}} p_2(x)\, b(x) = \lambda B_1 + (1-\lambda) B_2.$$
On the other hand, by the chain rule,
$$I(X, Q; Y) = I(Q; Y) + I(X; Y \mid Q) = I(X; Y) + I(Q; Y \mid X).$$
But $Q \to X \to Y$ form a Markov chain, and therefore $I(Q; Y \mid X) = 0$ and $I(X; Y) \ge I(X; Y \mid Q)$. Using this observation, and the definition of $C(\lambda B_1 + (1-\lambda) B_2)$ as the maximum mutual information between $X$ and $Y$ among all distributions of $X$ that satisfy $\mathrm{E}\, b(X) \le \lambda B_1 + (1-\lambda) B_2$, yields
$$C\bigl(\lambda B_1 + (1-\lambda) B_2\bigr) \ge I(X; Y) \ge I(X; Y \mid Q) = \lambda I(X_1; Y) + (1-\lambda) I(X_2; Y) = \lambda C(B_1) + (1-\lambda) C(B_2).$$

(b) We need to show that $\sum_{i=1}^n b(x_i(w)) \le nB$. Note that the codewords are chosen from the typical set of an input distribution $p(x)$ with $\mathrm{E}\, b(X) \le B(1-2\epsilon)$, so the empirical pmf $\pi(x \mid x^n(w))$ of each codeword satisfies $\pi(x \mid x^n(w)) \le p(x) + \epsilon p(x)$. Therefore,
$$\sum_{i=1}^n b(x_i(w)) = n \sum_{x \in \mathcal{X}} \pi(x \mid x^n(w))\, b(x) \le n \sum_{x \in \mathcal{X}} \bigl(p(x) + \epsilon p(x)\bigr) b(x) = n(1+\epsilon)\, \mathrm{E}\, b(X) \le nB(1-2\epsilon)(1+\epsilon) = nB(1 - \epsilon - 2\epsilon^2) < nB.$$
The probability of error analysis is exactly the same as for the channel with no cost constraint.

2. Solution: Consider
$$I(X; X+Z^*) = h(X+Z^*) - h(X+Z^* \mid X) = h(X+Z^*) - h(Z^*) \le h(X^*+Z^*) - h(Z^*) = I(X^*; X^*+Z^*),$$
where the inequality follows from the fact that, given the variance, differential entropy is maximized by the normal distribution. To prove the other inequality, we use the entropy power inequality
$$2^{2h(X+Z)} \ge 2^{2h(X)} + 2^{2h(Z)}.$$
$$\begin{aligned}
I(X^*; X^*+Z) &= h(X^*+Z) - h(X^*+Z \mid X^*) \\
&= h(X^*+Z) - h(Z) \\
&= \tfrac12 \log 2^{2h(X^*+Z)} - h(Z) \\
&\ge \tfrac12 \log\bigl(2^{2h(X^*)} + 2^{2h(Z)}\bigr) - h(Z) \\
&= \tfrac12 \log\bigl(2\pi e P + 2^{2h(Z)}\bigr) - \tfrac12 \log 2^{2h(Z)} \\
&= \tfrac12 \log\Bigl(1 + \frac{2\pi e P}{2^{2h(Z)}}\Bigr) \\
&\ge \tfrac12 \log\Bigl(1 + \frac{2\pi e P}{2^{2h(Z^*)}}\Bigr) \\
&= \tfrac12 \log\Bigl(1 + \frac{P}{N}\Bigr) = I(X^*; X^*+Z^*),
\end{aligned}$$
where the first inequality is the entropy power inequality and the last holds because $h(Z) \le h(Z^*) = \tfrac12 \log(2\pi e N)$.
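As a sanity check on the first inequality of Problem 2, we can compare $I(X; X+Z^*)$ for a non-Gaussian input against the Gaussian value $\tfrac12 \log(1 + P/N)$. The sketch below is illustrative and not part of the handout; the binary input and the values $P = 1$, $N = 0.5$ are assumptions chosen for the example.

```python
# Numerical sketch (illustrative, not from the handout): for Gaussian
# noise Z* ~ N(0, N), a non-Gaussian input X with E[X^2] = P yields
# I(X; X+Z*) <= I(X*; X*+Z*) = (1/2) log(1 + P/N)  (in nats here).
import numpy as np

P, N = 1.0, 0.5  # assumed example values

# Binary input X = +/- sqrt(P) with probability 1/2 each, so E[X^2] = P.
# Y = X + Z* is then a two-component Gaussian mixture.
def f_Y(y):
    g = lambda m: np.exp(-(y - m) ** 2 / (2 * N)) / np.sqrt(2 * np.pi * N)
    return 0.5 * (g(np.sqrt(P)) + g(-np.sqrt(P)))

# Differential entropy h(X+Z*) by Riemann-sum integration of -f log f.
y = np.linspace(-12.0, 12.0, 400001)
dy = y[1] - y[0]
fy = f_Y(y)
h_Y = -np.sum(fy * np.log(np.maximum(fy, 1e-300))) * dy
h_Z = 0.5 * np.log(2 * np.pi * np.e * N)  # h(Z*) for Gaussian noise

I_binary = h_Y - h_Z               # I(X; X+Z*) for the binary input
I_gauss = 0.5 * np.log(1 + P / N)  # I(X*; X*+Z*) for Gaussian input
print(I_binary <= I_gauss)  # True: the Gaussian input is optimal
```

At this SNR the binary input loses roughly a tenth of a nat relative to the Gaussian input, consistent with the inequality proved above.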
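The concavity established in Problem 1(a) can also be checked numerically for a concrete channel. The sketch below is my own illustration, not from the handout: it assumes a BSC with crossover probability 0.1 and input cost $b(0)=0$, $b(1)=1$ (so $\mathrm{E}\, b(X) = P(X=1)$), and evaluates $C(B)$ by grid search over input distributions.

```python
# Numerical check (assumed example, not from the handout) that the
# capacity-cost function C(B) of a BSC with input cost b(0)=0, b(1)=1
# satisfies lam*C(B1) + (1-lam)*C(B2) <= C(lam*B1 + (1-lam)*B2).
import numpy as np

q = 0.1  # BSC crossover probability (illustrative choice)

def H(p):
    """Binary entropy in bits; clipping makes H(0) = H(1) = 0."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def capacity_cost(B, grid=2001):
    """C(B) = max_{P(X=1) <= B} I(X;Y) for the BSC, by grid search."""
    p = np.linspace(0.0, min(B, 1.0), grid)   # candidate P(X=1)
    py1 = p * (1 - q) + (1 - p) * q           # resulting P(Y=1)
    return float(np.max(H(py1) - H(q)))       # I(X;Y) = H(Y) - H(q)

B1, B2, lam = 0.05, 0.4, 0.3
lhs = lam * capacity_cost(B1) + (1 - lam) * capacity_cost(B2)
rhs = capacity_cost(lam * B1 + (1 - lam) * B2)
print(lhs <= rhs + 1e-9)  # True: C is concave on this example
```

This mirrors the time-sharing argument: splitting the block between the two optimal input distributions attains the left-hand side, so the optimum under the averaged cost can only be larger.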
This note was uploaded on 10/24/2011 for the course ECE 571, taught by Professor Kelly during the Spring '11 term at the University of Illinois, Urbana-Champaign.