# EE 478 Multiple User Information Theory, Handout #17

**EE 478: Multiple User Information Theory** — wait, no em-dash — **EE 478 Multiple User Information Theory**, Handout #17, October 28, 2008

**Homework Set #4 Solutions**

1. **Solution:** We need to bound the probability of the error events described in the lecture notes. Note that

$$
\begin{aligned}
&P\bigl\{(\tilde{u}_1^n, U_2^n, X_1^n(\tilde{u}_1^n), X_2^n(U_2^n), Y^n) \in T_\epsilon^{(n)} \text{ for some } \tilde{u}_1^n \neq U_1^n\bigr\} \\
&\quad= \sum_{(u_1^n, u_2^n)} p(u_1^n, u_2^n) \sum_{\substack{\tilde{u}_1^n \neq u_1^n \\ (\tilde{u}_1^n, u_2^n) \in T_\epsilon^{(n)}}} P\bigl\{(\tilde{u}_1^n, u_2^n, X_1^n(\tilde{u}_1^n), X_2^n(u_2^n), Y^n) \in T_\epsilon^{(n)}\bigr\} \\
&\quad\le \sum_{(u_1^n, u_2^n)} p(u_1^n, u_2^n) \sum_{\substack{\tilde{u}_1^n \neq u_1^n \\ (\tilde{u}_1^n, u_2^n) \in T_\epsilon^{(n)}}} 2^{-n(I(U_1, X_1; Y \mid U_2, X_2) - 3\delta(\epsilon))} \\
&\quad\le \sum_{(u_1^n, u_2^n)} p(u_1^n, u_2^n)\, \bigl|T_\epsilon^{(n)}(U_1 \mid u_2^n)\bigr|\, 2^{-n(I(U_1, X_1; Y \mid U_2, X_2) - 3\delta(\epsilon))} \\
&\quad\le \sum_{(u_1^n, u_2^n)} p(u_1^n, u_2^n)\, 2^{n(H(U_1 \mid U_2) + \delta(\epsilon))}\, 2^{-n(I(U_1, X_1; Y \mid U_2, X_2) - 3\delta(\epsilon))} \\
&\quad= 2^{n(H(U_1 \mid U_2) - I(U_1, X_1; Y \mid U_2, X_2) + 4\delta(\epsilon))},
\end{aligned}
$$

which goes to zero as $n \to \infty$ if $H(U_1 \mid U_2) < I(U_1, X_1; Y \mid U_2, X_2)$. Moreover, $I(U_1, X_1; Y \mid U_2, X_2) = I(X_1; Y \mid U_2, X_2)$, since $(U_1, U_2) \to (X_1, X_2) \to Y$ form a Markov chain. Similarly, it can be shown that

$$
P\bigl\{(U_1^n, \tilde{u}_2^n, X_1^n(U_1^n), X_2^n(\tilde{u}_2^n), Y^n) \in T_\epsilon^{(n)} \text{ for some } \tilde{u}_2^n \neq U_2^n\bigr\} \to 0
$$

if $H(U_2 \mid U_1) < I(X_2; Y \mid U_1, X_1)$.
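The simplification $I(U_1, X_1; Y \mid U_2, X_2) = I(X_1; Y \mid U_2, X_2)$ is a standard chain-rule step; spelled out for completeness, it reads:

```latex
% Chain rule: split off the X_1 term first.
I(U_1, X_1; Y \mid U_2, X_2)
  = I(X_1; Y \mid U_2, X_2) + I(U_1; Y \mid U_2, X_1, X_2).
% Since (U_1, U_2) -> (X_1, X_2) -> Y is a Markov chain,
% Y is conditionally independent of U_1 given (X_1, X_2), hence
I(U_1; Y \mid U_2, X_1, X_2) = 0,
% leaving
I(U_1, X_1; Y \mid U_2, X_2) = I(X_1; Y \mid U_2, X_2).
```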
For the last part of the problem, we have

$$
\begin{aligned}
&P\bigl\{(\tilde{u}_1^n, \tilde{u}_2^n, X_1^n(\tilde{u}_1^n), X_2^n(\tilde{u}_2^n), Y^n) \in T_\epsilon^{(n)} \text{ for some } \tilde{u}_1^n \neq U_1^n,\ \tilde{u}_2^n \neq U_2^n\bigr\} \\
&\quad= \sum_{(u_1^n, u_2^n)} p(u_1^n, u_2^n) \sum_{\substack{\tilde{u}_1^n \neq u_1^n,\ \tilde{u}_2^n \neq u_2^n \\ (\tilde{u}_1^n, \tilde{u}_2^n) \in T_\epsilon^{(n)}}} P\bigl\{(\tilde{u}_1^n, \tilde{u}_2^n, X_1^n(\tilde{u}_1^n), X_2^n(\tilde{u}_2^n), Y^n) \in T_\epsilon^{(n)}\bigr\} \\
&\quad\le \sum_{(u_1^n, u_2^n)} p(u_1^n, u_2^n) \sum_{\substack{\tilde{u}_1^n \neq u_1^n,\ \tilde{u}_2^n \neq u_2^n \\ (\tilde{u}_1^n, \tilde{u}_2^n) \in T_\epsilon^{(n)}}} 2^{-n(I(U_1, U_2, X_1, X_2; Y) - 2\delta(\epsilon))} \\
&\quad\le \sum_{(u_1^n, u_2^n)} p(u_1^n, u_2^n)\, \bigl|T_\epsilon^{(n)}(U_1, U_2)\bigr|\, 2^{-n(I(U_1, U_2, X_1, X_2; Y) - 2\delta(\epsilon))} \\
&\quad\le \sum_{(u_1^n, u_2^n)} p(u_1^n, u_2^n)\, 2^{n(H(U_1, U_2) + \delta(\epsilon))}\, 2^{-n(I(U_1, U_2, X_1, X_2; Y) - 2\delta(\epsilon))} \\
&\quad= 2^{n(H(U_1, U_2) - I(U_1, U_2, X_1, X_2; Y) + 3\delta(\epsilon))},
\end{aligned}
$$

which goes to zero if $H(U_1, U_2) < I(U_1, U_2, X_1, X_2; Y)$. Moreover, $I(U_1, U_2, X_1, X_2; Y) = I(X_1, X_2; Y)$, since $(U_1, U_2) \to (X_1, X_2) \to Y$ form a Markov chain. Therefore the probability goes to zero if $H(U_1, U_2) < I(X_1, X_2; Y)$.
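To see concretely how bounds of this form vanish, here is a small numeric sketch. The values `H_cond`, `I_cond`, and `delta` are hypothetical stand-ins for $H(U_1, U_2)$, $I(X_1, X_2; Y)$, and $\delta(\epsilon)$; they are not taken from the problem.

```python
def error_bound(n, H_cond, I_cond, delta):
    """Evaluate the upper bound 2^{n(H - I + 3*delta)} from the
    derivation above for hypothetical numeric values."""
    return 2.0 ** (n * (H_cond - I_cond + 3 * delta))

# Hypothetical values satisfying H(U1, U2) < I(X1, X2; Y);
# the exponent H - I + 3*delta is then negative for small delta.
H_cond, I_cond, delta = 0.40, 0.70, 0.01
for n in (10, 100, 1000):
    print(n, error_bound(n, H_cond, I_cond, delta))
```

The bound decays exponentially in the blocklength, which is the whole point of the typicality argument: any fixed gap between the entropy and the mutual information eventually dominates the $\delta(\epsilon)$ slack.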

2. **Solution:** (a) Consider

$$
I(U; Y_2) = H(Y_2) - H(Y_2 \mid U) \le 1 - H(Y_2 \mid U).
$$

We find an upper bound and a lower bound for $H(Y_2 \mid U)$. This term is upper bounded by 1, and lower bounded by

$$
H(Y_2 \mid U) \ge H(Y_2 \mid U, X) = H(X \oplus Z_1 \oplus Z_2' \mid U, X) = H(Z_1 \oplus Z_2') = H(p_1 * \alpha) = H(p_2).
$$

Hence $I(U; Y_2) \le 1 - H(p_2)$.
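As a quick numerical sketch of this bound: the cascade of two BSCs with crossover probabilities $p_1$ and $\alpha$ behaves like a single BSC with crossover $p_2 = p_1 * \alpha = p_1(1-\alpha) + \alpha(1-p_1)$, and the bound on $I(U; Y_2)$ follows from the binary entropy of $p_2$. The parameter values below are hypothetical, since the preview does not give the problem's numbers.

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def conv(p, a):
    """Binary convolution p * a = p(1-a) + a(1-p): the effective
    crossover probability of two BSCs in cascade."""
    return p * (1 - a) + a * (1 - p)

# Hypothetical channel parameters (not from the problem statement):
p1, alpha = 0.1, 0.2
p2 = conv(p1, alpha)        # effective crossover seen by receiver 2
bound = 1 - h2(p2)          # I(U; Y2) <= 1 - H(p2)
print(p2, bound)
```

Since $p_2 > p_1$ for $\alpha \in (0, 1/2)$, the bound for the degraded receiver is strictly smaller than $1 - H(p_1)$, as expected for a physically degraded broadcast channel.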