EE 478 Handout #22
Multiple User Information Theory
November 13, 2008

Homework Set #6 Solutions

1. Solution:

(a) Codebook generation: For each $m_0 \in [1 : 2^{nR_0}]$, $m_{01} \in [1 : 2^{nR_{01}}]$, and $m_{02} \in [1 : 2^{nR_{02}}]$, generate $u_0^n(m_0, m_{01}, m_{02}) \sim \prod_{i=1}^n p_{U_0}(u_{0i})$. Choose $\tilde{R}_{11} > R_{11}$ and $\tilde{R}_{22} > R_{22}$, and for each $(m_0, m_{01}, m_{02})$ generate a Marton table as described in the lecture notes, i.e., for each $l_{11} \in [1 : 2^{n\tilde{R}_{11}}]$ generate $u_1^n(l_{11} \mid m_0, m_{01}, m_{02}) \sim \prod_{i=1}^n p_{U_1|U_0}(u_{1i} \mid u_{0i})$, and for each $l_{22} \in [1 : 2^{n\tilde{R}_{22}}]$ generate $u_2^n(l_{22} \mid m_0, m_{01}, m_{02}) \sim \prod_{i=1}^n p_{U_2|U_0}(u_{2i} \mid u_{0i})$. Divide the $2^{n\tilde{R}_{11}}$ sequences $u_1^n$ into $2^{nR_{11}}$ equal-size bins, and divide the $2^{n\tilde{R}_{22}}$ sequences $u_2^n$ into $2^{nR_{22}}$ equal-size bins. For each message triple $(m_0, m_1, m_2)$, with $m_1 = (m_{01}, m_{11})$ and $m_2 = (m_{02}, m_{22})$, define
$$
B(m_0, m_1, m_2) = \Big\{ \big(u_0^n(m_0, m_{01}, m_{02}),\, u_1^n(l_{11} \mid m_0, m_{01}, m_{02}),\, u_2^n(l_{22} \mid m_0, m_{01}, m_{02})\big) \in T_\epsilon^{(n)}(U_0, U_1, U_2) :\; (m_{11}-1)\, 2^{n(\tilde{R}_{11} - R_{11})} + 1 \le l_{11} \le m_{11}\, 2^{n(\tilde{R}_{11} - R_{11})},\;\; (m_{22}-1)\, 2^{n(\tilde{R}_{22} - R_{22})} + 1 \le l_{22} \le m_{22}\, 2^{n(\tilde{R}_{22} - R_{22})} \Big\}.
$$
For each message $(m_0, m_1, m_2)$, look into $B(m_0, m_1, m_2)$; if it is nonempty, choose an arbitrary triple $(u_0^n, u_1^n, u_2^n) \in B(m_0, m_1, m_2)$ and generate $x^n \sim \prod_{i=1}^n p_{X|U_0, U_1, U_2}(x_i \mid u_{0i}, u_{1i}, u_{2i})$. If it is empty, choose an arbitrary sequence $x^n$.

Decoding: Decoder $i$ looks for some $(\hat{m}_0, \hat{m}_{01}, \hat{m}_{02}, l_{ii})$ such that $\big(u_0^n(\hat{m}_0, \hat{m}_{01}, \hat{m}_{02}),\, u_i^n(l_{ii} \mid \hat{m}_0, \hat{m}_{01}, \hat{m}_{02}),\, y_i^n\big) \in T_\epsilon^{(n)}$. Finally, it declares the bin index of $l_{ii}$ as the message estimate $\hat{m}_{ii}$.

Analysis of the probability of error: Assume without loss of generality that $m_0 = m_{01} = m_{02} = m_{11} = m_{22} = 1$. The probability of decoding error can be upper bounded as
$$
P_e^{(n)} \le P(E_1) + P(E_2 \cap E_1^c) + P(E_{01} \cap E_1^c) + P(E_{11} \cap E_1^c) + P(E_{02} \cap E_1^c) + P(E_{22} \cap E_1^c),
$$
where
$$
\begin{aligned}
E_1 &= \big\{ \big(U_0^n(1,1,1), U_1^n(1,1,1), U_2^n(1,1,1), X^n(1,1,1), Y_1^n, Y_2^n\big) \notin T_\epsilon^{(n)} \big\},\\
E_{01} &= \big\{ \big(U_0^n(m_0, m_{01}, m_{02}), U_1^n(l_{11} \mid m_0, m_{01}, m_{02}), Y_1^n\big) \in T_\epsilon^{(n)} \text{ for some } (m_0, m_{01}, m_{02}) \neq (1,1,1) \text{ and some } l_{11} \big\},\\
E_{11} &= \big\{ \big(U_0^n(1,1,1), U_1^n(l_{11} \mid 1,1,1), Y_1^n\big) \in T_\epsilon^{(n)} \text{ for some } l_{11} \text{ in a bin } m_{11} \neq 1 \big\},\\
E_{02} &= \big\{ \big(U_0^n(m_0, m_{01}, m_{02}), U_2^n(l_{22} \mid m_0, m_{01}, m_{02}), Y_2^n\big) \in T_\epsilon^{(n)} \text{ for some } (m_0, m_{01}, m_{02}) \neq (1,1,1) \text{ and some } l_{22} \big\},\\
E_{22} &= \big\{ \big(U_0^n(1,1,1), U_2^n(l_{22} \mid 1,1,1), Y_2^n\big) \in T_\epsilon^{(n)} \text{ for some } l_{22} \text{ in a bin } m_{22} \neq 1 \big\}.
\end{aligned}
$$
...
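The binning constraint in the definition of $B(m_0, m_1, m_2)$ is simple index arithmetic: the $2^{n\tilde{R}_{kk}}$ auxiliary indices are split into $2^{nR_{kk}}$ consecutive blocks of size $2^{n(\tilde{R}_{kk} - R_{kk})}$, and encoding searches the block pair selected by $(m_{11}, m_{22})$ for a jointly typical triple. The Python sketch below (not part of the handout) illustrates this bookkeeping with toy parameters; the names `is_jointly_typical`, `u1_table`, and `u2_table` are hypothetical placeholders standing in for the typicality test $T_\epsilon^{(n)}(U_0, U_1, U_2)$ and one row of the Marton table.

```python
# Index bookkeeping for the Marton binning step, with toy parameters.
# Everything below is an illustrative sketch, not the handout's code.

n = 4                          # toy blocklength (chosen so all counts are integers)
R11, R11_tilde = 0.5, 1.0      # requires R~_11 > R_11
R22, R22_tilde = 0.5, 1.0      # requires R~_22 > R_22


def bin_range(m, R, R_tilde, n):
    """Indices l in bin m: (m-1)*2^{n(R~-R)} + 1 <= l <= m*2^{n(R~-R)}."""
    step = int(round(2 ** (n * (R_tilde - R))))
    return range((m - 1) * step + 1, m * step + 1)


def is_jointly_typical(u0, u1, u2):
    """Hypothetical placeholder for the test (u0, u1, u2) in T_eps^(n)(U0, U1, U2)."""
    raise NotImplementedError


def marton_encode(m11, m22, u0, u1_table, u2_table):
    """Search the bin pair selected by (m11, m22) for a jointly typical triple.

    u1_table[l11] and u2_table[l22] are assumed to hold the codewords
    u_1^n(l11 | m0, m01, m02) and u_2^n(l22 | m0, m01, m02) of one Marton table.
    Returns the first jointly typical (l11, l22), or None if the bin pair
    contains none (the scheme then transmits an arbitrary x^n).
    """
    for l11 in bin_range(m11, R11, R11_tilde, n):
        for l22 in bin_range(m22, R22, R22_tilde, n):
            if is_jointly_typical(u0, u1_table[l11], u2_table[l22]):
                return l11, l22
    return None
```

For these toy values, `bin_range(2, R11, R11_tilde, n)` returns the indices 5 through 8, matching $(m_{11}-1)2^{n(\tilde{R}_{11}-R_{11})}+1 \le l_{11} \le m_{11} 2^{n(\tilde{R}_{11}-R_{11})}$ with $m_{11}=2$. By the mutual covering lemma, the search succeeds with high probability whenever $(\tilde{R}_{11} - R_{11}) + (\tilde{R}_{22} - R_{22}) > I(U_1; U_2 \mid U_0)$, which is the role the excess rates $\tilde{R}_{kk} - R_{kk}$ play in the rest of the achievability argument.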