ECE 1502 Information Theory
Problem Set 3 Solutions¹
February 15, 2006

5.12 Shannon codes and Huffman codes.

(a) Applying the Huffman algorithm gives the following table:

    Code   Symbol   Probability
    0      1        1/3   1/3   2/3   1
    11     2        1/3   1/3   1/3
    101    3        1/4   1/3
    100    4        1/12

which gives codeword lengths of 1, 2, 3, 3 for the four codewords.

(b) Both sets of lengths, (1, 2, 3, 3) and (2, 2, 2, 2), satisfy the Kraft inequality, and both achieve the same expected length (2 bits) for the above distribution. Therefore both are optimal.

(c) The symbol with probability 1/4 has a Huffman codeword of length 3, which is greater than ⌈log(1/p)⌉ = 2. Thus the Huffman codeword for a particular symbol may be longer than the Shannon codeword for that symbol. On average, however, the Huffman code cannot be longer than the Shannon code.

5.15 Codes.

(a) No, the code is not instantaneous, since the first codeword, 0, is a prefix of the second codeword, 01.

(b) Yes, the code is uniquely decodable. Given a sequence of codewords, first isolate occurrences of 01 (i.e., find all the ones) and then parse the rest into 0s.

(c) Yes, all uniquely decodable codes are non-singular.

5.19 Average length of an optimal code. The longest possible codeword in an optimal binary code has n − 1 binary digits; this corresponds to a completely unbalanced code tree in which every codeword has a different length. Using a D-ary alphabet for codewords can only decrease the maximum length. Since we know the maximum possible codeword length, there are only a finite number of candidate codes to consider. For each candidate code C, the average codeword length is determined by the probability distribution p_1, p_2, …, p_n:

    L(C) = Σ_{i=1}^{n} p_i ℓ_i.

This is a linear, and therefore continuous, function of p_1, p_2, …, p_n.
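As a check on parts (a) and (b) of 5.12, the computation can be sketched in a few lines of Python. This is an illustrative helper, not from the text: the name `huffman_lengths` and the tie-breaking rule (preferring previously merged nodes, which reproduces the 1, 2, 3, 3 assignment rather than the equally optimal 2, 2, 2, 2) are choices made here.

```python
import heapq
from fractions import Fraction

def huffman_lengths(probs):
    """Optimal binary codeword lengths via Huffman's algorithm.
    Ties between equal probabilities are broken in favour of
    previously merged nodes (negative tie-breaker keys)."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tie = -1                        # merged nodes pop first among equal probs
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:           # every symbol under a merge gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
        tie -= 1
    return lengths

probs = [Fraction(1, 3), Fraction(1, 3), Fraction(1, 4), Fraction(1, 12)]
lengths = huffman_lengths(probs)
print(sorted(lengths))                               # [1, 2, 3, 3]
print(sum(p * l for p, l in zip(probs, lengths)))    # 2
# Both length sets of part (b) meet the Kraft inequality with equality:
print(sum(Fraction(1, 2**l) for l in [1, 2, 3, 3]))  # 1
print(sum(Fraction(1, 2**l) for l in [2, 2, 2, 2]))  # 1
```

Using `Fraction` keeps the arithmetic exact, so the expected length comes out as exactly 2 bits rather than a floating-point approximation.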
The optimal code is the candidate code with minimum L, and its length is the minimum of a finite number of continuous functions; it is therefore itself a continuous function of p_1, p_2, …, p_n.

5.21 Optimal codes for uniform distributions.

(a) For uniformly probable codewords, there exists an optimal binary variable-length prefix code such that the longest and shortest codewords differ by at most one bit. Suppose instead that two codewords differ in length by 2 bits or more. Call m_s the message with the shorter codeword C_s and m_ℓ the message with the longer codeword C_ℓ. Change the codewords for these two messages so that the new codeword C′_s is the old C_s with a zero appended (C′_s = C_s0) and C′_ℓ is the old C_s with a one appended (C′_ℓ = C_s1). C′_s and C′_ℓ are legitimate codewords, since no other codeword contained C_s as a prefix (by definition of a prefix code), so no other codeword can contain C′_s or C′_ℓ as a prefix. The length of the codeword for m_s increases by 1, and the length of the codeword for m_ℓ decreases by at least 1. Since these messages are equally likely, L′ ≤ L. By this method we can transform any optimal code into an optimal code in which the longest and shortest codewords differ by at most one bit.

¹ Solutions to problems from the text are supplied courtesy of Joy A. Thomas.
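The single transformation step in the 5.21 argument can be sketched and checked numerically. The helper names `is_prefix_free` and `shorten_gap`, and the example code below, are illustrative choices, not from the text:

```python
def is_prefix_free(code):
    """True iff no codeword is a proper prefix of another."""
    return not any(a != b and b.startswith(a)
                   for a in code for b in code)

def shorten_gap(code):
    """One step of the 5.21 argument: if the longest and shortest
    codewords differ by >= 2 bits, replace them by C_s0 and C_s1."""
    c_s = min(code, key=len)
    c_l = max(code, key=len)
    if len(c_l) - len(c_s) < 2:
        return code
    kept = [c for c in code if c not in (c_s, c_l)]
    return kept + [c_s + "0", c_s + "1"]

# A hypothetical prefix code for 5 messages with a length gap of 3 bits:
code = ["0", "10", "110", "1110", "1111"]
new_code = shorten_gap(code)
print(is_prefix_free(new_code))             # True
avg = lambda c: sum(map(len, c)) / len(c)   # expected length, uniform probs
print(avg(code), avg(new_code))             # 2.8 2.6
```

With equiprobable messages the expected length is just the mean codeword length, and one transformation step never increases it; repeating the step until the gap is at most one bit yields the code claimed in part (a).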