HW5s[1] - ECE 534: Elements of Information Theory, Fall...


ECE 534: Elements of Information Theory, Fall 2010
Homework 5 - BONUS SOLUTIONS
All by Kenneth S. Palacio Baus (grader's comment: great job!!!)
October 2, 2010

1. Problem 5.8. Huffman coding. Consider the random variable:

       x_i      x_1    x_2    x_3    x_4    x_5    x_6    x_7
       p(x_i)   0.49   0.26   0.12   0.04   0.04   0.03   0.02

   Solution:

   (a) Find a binary Huffman code for X.

       Figure 1: Binary Huffman code, prob. 5.8

   (b) Find the expected code length for this encoding. The code of Figure 1 assigns
       codeword lengths (1, 2, 3, 5, 5, 5, 5), so

       L(C) = 0.49(1) + 0.26(2) + 0.12(3) + 0.04(5) + 0.04(5) + 0.03(5) + 0.02(5)   (1)
            = 2.02 bits                                                             (2)

   (c) Find a ternary Huffman code for X.

       Figure 2: Ternary Huffman code, prob. 5.8

2. Problem 5.12. Shannon codes and Huffman codes. Consider a random variable X that
   takes on four values with probabilities (1/3, 1/3, 1/4, 1/12).

   (a) Construct a Huffman code for this random variable.

       Figure 3: Binary Huffman code (1), prob. 5.12a
       Figure 4: Binary Huffman code (2), prob. 5.12a

   (b) Show that there exist two different sets of optimal lengths for the codewords;
       namely, show that the codeword length assignments (1, 2, 3, 3) and (2, 2, 2, 2)
       are both optimal.

       For lengths (1, 2, 3, 3):

       L(C_1) = 1(1/3) + 2(1/3) + 3(1/4) + 3(1/12)   (3)
              = 2                                    (4)

       Kraft inequality test:

       2^(-1) + 2^(-2) + 2^(-3) + 2^(-3) <= 1        (5)
       1 = 1                                         (6)

       For lengths (2, 2, 2, 2):

       L(C_2) = 2(1/3) + 2(1/3) + 2(1/4) + 2(1/12)   (7)
              = 2                                    (8)

       Kraft inequality test:

       2^(-2) + 2^(-2) + 2^(-2) + 2^(-2) <= 1        (9)
       1 = 1                                         (10)

       The two codes have the same expected length of 2 bits, and both satisfy the
       Kraft inequality with equality. Since the Huffman procedure in (a) produces an
       optimal code with these lengths, both assignments are optimal.
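The expected-length calculations above (2.02 bits for Problem 5.8, 2 bits for Problem 5.12) can be cross-checked with a short Python sketch. This script is not part of the original solutions; `huffman_lengths` is a helper name introduced here for illustration.

```python
import heapq
from fractions import Fraction

def huffman_lengths(probs):
    """Return binary Huffman codeword lengths, one per input symbol."""
    # Heap entries: (probability, tie-breaking id, symbol indices in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # merge the two least likely subtrees
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:                 # every symbol in the merge goes one level deeper
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

# Problem 5.8: lengths (1, 2, 3, 5, 5, 5, 5), expected length 2.02 bits
p58 = [0.49, 0.26, 0.12, 0.04, 0.04, 0.03, 0.02]
l58 = huffman_lengths(p58)
print(l58, sum(p * l for p, l in zip(p58, l58)))

# Problem 5.12: exact arithmetic with Fraction; expected length is exactly 2 bits
p512 = [Fraction(1, 3), Fraction(1, 3), Fraction(1, 4), Fraction(1, 12)]
l512 = huffman_lengths(p512)
print(l512, sum(p * l for p, l in zip(p512, l512)))
```

Depending on how ties between equal probabilities are broken, the Problem 5.12 run may return either of the two optimal assignments, which is exactly the point of part (b): both achieve the minimum expected length of 2 bits.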
   (c) Conclude that there are optimal codes with codeword lengths for some symbols
       that exceed the Shannon code length ⌈log2(1/p(x))⌉.

       First we compute the Shannon code lengths for the distribution:

       p_i                 1/3    1/3    1/4    1/12
       log2(1/p_i)         1.58   1.58   2      3.58
       ⌈log2(1/p_i)⌉       2      2      2      4

       Table 1: Computation of Shannon code lengths, problem 5.12

       We can compare the lengths for each x_i:

       x_i    p(x_i)   Opt. code 1   Opt. code 2   Shannon
       x_1    1/3      1             2             2
       x_2    1/3      2             2             2
       x_3    1/4      3             2             2
       x_4    1/12     3             2             4

       Table 2: Lengths for code 1, code 2 and the Shannon code, problem 5.12

       From Table 2, the codeword for x_3 has length 3 in the optimal Huffman code 1,
       while its Shannon code length is only 2. Since the Huffman code is optimal,
       this shows that an optimal code can assign some symbol a codeword longer than
       the Shannon length ⌈log2(1/p(x))⌉.
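The Shannon lengths in Table 1 and the Kraft sums from part (b) can be reproduced in a few lines. Again, this is my own sketch rather than part of the original solutions; exact rational arithmetic via `Fraction` confirms the Kraft sums equal 1 exactly.

```python
import math
from fractions import Fraction

probs = [Fraction(1, 3), Fraction(1, 3), Fraction(1, 4), Fraction(1, 12)]

# Shannon code lengths: ceil(log2(1/p)), as in Table 1
shannon = [math.ceil(math.log2(1 / p)) for p in probs]
print(shannon)  # [2, 2, 2, 4]

# Both optimal assignments have expected length 2 and Kraft sum exactly 1
for lengths in ([1, 2, 3, 3], [2, 2, 2, 2]):
    kraft = sum(Fraction(1, 2) ** l for l in lengths)
    avg = sum(p * l for p, l in zip(probs, lengths))
    print(lengths, "Kraft sum:", kraft, "expected length:", avg)
```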

This note was uploaded on 01/19/2012 for the course ECE 534 taught by Professor Natashadevroye during the Fall '10 term at Ill. Chicago.
