# ECE 534: Elements of Information Theory, Fall 2010 — Homework 5, Bonus Solutions

Solutions by Kenneth S. Palacio Baus. October 2, 2010.

## 1. Problem 5.8: Huffman coding

Consider the random variable:

| x    | x1   | x2   | x3   | x4   | x5   | x6   | x7   |
|------|------|------|------|------|------|------|------|
| p(x) | 0.49 | 0.26 | 0.12 | 0.04 | 0.04 | 0.03 | 0.02 |

**(a) Find a binary Huffman code for X.**

[Figure 1: Binary Huffman code, prob. 5.8]

One optimal set of codeword lengths is (1, 2, 3, 5, 5, 5, 5).

**(b) Find the expected code length for this encoding.**

L(C) = 0.49(1) + 0.26(2) + 0.12(3) + 0.04(5) + 0.04(5) + 0.03(5) + 0.02(5)
     = 2.02 bits

(Note that this is consistent with the entropy bound: H(X) ≈ 2.01 bits ≤ L(C).)

**(c) Find a ternary Huffman code for X.**

[Figure 2: Ternary Huffman code, prob. 5.8]
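The binary Huffman construction in part (a) can be checked numerically. The sketch below is a generic heap-based Huffman length computation (my own illustration, not code from the solution set): it repeatedly merges the two least-probable nodes and tracks how deep each symbol ends up.

```python
import heapq

def huffman_lengths(probs):
    """Binary Huffman codeword lengths, via repeated merging of the
    two least-probable nodes (hypothetical helper for illustration)."""
    # Heap entries: (probability, tiebreak id, symbol indices in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, t, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # every symbol below the merge gains one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

probs = [0.49, 0.26, 0.12, 0.04, 0.04, 0.03, 0.02]
lengths = huffman_lengths(probs)
expected = sum(p * l for p, l in zip(probs, lengths))
print(lengths)               # [1, 2, 3, 5, 5, 5, 5]
print(round(expected, 2))    # 2.02
```

The merge order matches the hand construction: 0.02+0.03 → 0.05, then 0.04+0.04 → 0.08, 0.05+0.08 → 0.13, 0.12+0.13 → 0.25, 0.25+0.26 → 0.51, 0.49+0.51 → 1.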

## 2. Problem 5.12: Shannon codes and Huffman codes

Consider a random variable X that takes on four values with probabilities (1/3, 1/3, 1/4, 1/12).

**(a) Construct a Huffman code for this random variable.**

[Figure 3: Binary Huffman code (1), prob. 5.12a]
[Figure 4: Binary Huffman code (2), prob. 5.12a]

**(b) Show that there exist two different sets of optimal lengths for the codewords; namely, show that codeword length assignments (1, 2, 3, 3) and (2, 2, 2, 2) are both optimal.**

For lengths (1, 2, 3, 3):

L(C1) = 1(1/3) + 2(1/3) + 3(1/4) + 3(1/12) = 2

Kraft inequality test:

2^(-1) + 2^(-2) + 2^(-3) + 2^(-3) = 1 ≤ 1

For lengths (2, 2, 2, 2):

L(C2) = 2(1/3) + 2(1/3) + 2(1/4) + 2(1/12) = 2

Kraft inequality test:

2^(-2) + 2^(-2) + 2^(-2) + 2^(-2) = 1 ≤ 1
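Both length assignments are easy to verify mechanically. This sketch (my own check, using exact arithmetic via `fractions`) computes the expected length and the Kraft sum for each assignment:

```python
from fractions import Fraction

p = [Fraction(1, 3), Fraction(1, 3), Fraction(1, 4), Fraction(1, 12)]

results = []
for lengths in [(1, 2, 3, 3), (2, 2, 2, 2)]:
    # Expected codeword length: L(C) = sum_i p_i * l_i
    L = sum(pi * li for pi, li in zip(p, lengths))
    # Kraft sum: sum_i 2^(-l_i), must be <= 1 for a binary prefix code
    kraft = sum(Fraction(1, 2 ** li) for li in lengths)
    results.append((L, kraft))
    print(lengths, L, kraft)   # both give L = 2 and Kraft sum = 1
```

Since both assignments achieve the same expected length and saturate the Kraft inequality, both are optimal prefix codes.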
The two codes have the same expected length of 2 bits, and both satisfy the Kraft inequality with equality, so both are optimal.

**(c) Conclude that there are optimal codes with codeword lengths for some symbols that exceed the Shannon code length ⌈log2(1/p(x))⌉.**

First we compute the Shannon code lengths for the distribution:

| p_i             | 1/3  | 1/3  | 1/4 | 1/12 |
|-----------------|------|------|-----|------|
| log2(1/p(x))    | 1.58 | 1.58 | 2   | 3.58 |
| ⌈log2(1/p(x))⌉  | 2    | 2    | 2   | 4    |

Table 1: Computation of Shannon code lengths, problem 5.12

We can compare the lengths for each x_i:

| x_i | p(x_i) | Opt. code 1 | Opt. code 2 | Shannon |
|-----|--------|-------------|-------------|---------|
| x1  | 1/3    | 1           | 2           | 2       |
| x2  | 1/3    | 2           | 2           | 2       |
| x3  | 1/4    | 3           | 2           | 2       |
| x4  | 1/12   | 3           | 2           | 4       |

Table 2: Lengths for code 1, code 2 and the Shannon code, problem 5.12

From Table 2, the optimal Huffman code 1 assigns x3 a codeword of length 3, while the Shannon code assigns it length 2. The expected length of a Huffman code is minimal, so it is no greater than that of the Shannon code. Nevertheless, as this problem shows, an optimal code may give some individual symbol a codeword longer than its Shannon length ⌈log2(1/p(x))⌉.
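The Shannon lengths in Table 1 follow directly from l_i = ⌈log2(1/p_i)⌉. A short sketch (my own check, not part of the original solution) reproduces them and the comparison:

```python
import math
from fractions import Fraction

p = [Fraction(1, 3), Fraction(1, 3), Fraction(1, 4), Fraction(1, 12)]

# Shannon code lengths: l_i = ceil(log2(1 / p_i))
shannon = [math.ceil(math.log2(pi.denominator / pi.numerator)) for pi in p]
print(shannon)                       # [2, 2, 2, 4]

huffman1 = [1, 2, 3, 3]              # optimal code 1 from part (b)
# x3 gets a longer codeword from the optimal code than from the Shannon code
print(huffman1[2], ">", shannon[2])  # 3 > 2

# Yet the Huffman expected length (2 bits) is still below the Shannon code's
L_shannon = sum(pi * li for pi, li in zip(p, shannon))
print(L_shannon)                     # 13/6, about 2.17 bits
```

So per-symbol comparisons can go either way (x4 is shorter under code 2 than under the Shannon code), but the expected length of the optimal code never exceeds that of the Shannon code.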