ECE 534: Elements of Information Theory
Solutions to Midterm Exam (Spring 2006)

Problem 1 [20 pts.]
A discrete memoryless source has an alphabet of three letters, x_i, i = 1, 2, 3, with probabilities 0.4, 0.4, and 0.2, respectively.
(a) Find the binary Huffman code for this source and determine the average number of bits needed per source letter.
(b) Suppose two letters at a time are encoded into a binary sequence. Find the Huffman code and the average number of bits needed per source letter.

Solution:
(a) One possible Huffman code is C(1) = 0, C(2) = 10, and C(3) = 11. The average number of bits per source letter is 0.4(1) + (0.4 + 0.2)(2) = 1.6.
(b) One possible code construction is C(11) = 000, C(12) = 001, C(13) = 100, C(21) = 010, C(22) = 011, C(23) = 110, C(31) = 111, C(32) = 1010, and C(33) = 1011. Every codeword has length 3 except C(32) and C(33), which have length 4 and probabilities 0.08 and 0.04, so the average length per pair is 3 + (0.08 + 0.04) = 3.12 bits, and the average number of bits per source letter is 3.12/2 = 1.56.

Problem 2 [20 pts.]
A source X produces letters from a three-symbol alphabet with the probability assignment P_X(0) = 1/4, P_X(1) = 1/4, and P_X(2) = 1/2. Each source letter x is transmitted simultaneously through two channels, with outputs y and z and transition probabilities P(y|x) and P(z|x) indicated below:

[Figure: transition diagrams for the two channels, P(y|x) and P(z|x); the diagrams are not recoverable from this text preview.]

Calculate H(X), H(Y), H(Z), H(Y,Z), I(X;Y), I(X;Z), I(X;Y|Z), and I(X;Y,Z). ...
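As a sanity check on the Problem 1 answers (not part of the original solution), the expected Huffman codeword length can be computed without building the code explicitly, using the standard identity that each merge in Huffman's algorithm contributes its merged probability once to the expected length. A minimal Python sketch:

```python
import heapq
import math

def huffman_avg_length(probs):
    """Expected binary-Huffman codeword length for a probability list.

    Each time two subtrees are merged, their combined probability is
    added once to the expected length, so summing the merge weights
    yields sum(p_i * len(code_i)) for an optimal code.
    """
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

p = [0.4, 0.4, 0.2]

# Problem 1(a): one letter at a time -> about 1.6 bits per letter
print(huffman_avg_length(p))

# Problem 1(b): i.i.d. pairs of letters -> about 3.12/2 = 1.56 bits per letter
pairs = [pi * pj for pi in p for pj in p]
print(huffman_avg_length(pairs) / 2)

# Entropy lower bound on any code: H(X) = -sum p log2 p, about 1.52 bits
print(-sum(q * math.log2(q) for q in p))
```

The pair code in part (b) gets closer to the entropy lower bound than the single-letter code, which is the point of encoding blocks of letters.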
This note was uploaded on 10/24/2011 for the course ELECTRICAL ECE 571 taught by Professor Kelly during the Spring '11 term at University of Illinois, Urbana Champaign.