Information theory—homework exercises
Edited by: Gábor Lugosi

1 Entropy, source coding

Problem 1 (Alternative definition of unique decodability)
A code $f : \mathcal{X} \to \mathcal{Y}^*$ is called uniquely decodable if for any messages $u = u_1 \cdots u_k$ and $v = v_1 \cdots v_k$ (where $u_1, v_1, \ldots, u_k, v_k \in \mathcal{X}$) with $f(u_1) f(u_2) \cdots f(u_k) = f(v_1) f(v_2) \cdots f(v_k)$, we have $u_i = v_i$ for all $i$. That is, as opposed to the definition given in class, here unique decodability is only required for pairs of messages of the same length. Prove that the two definitions are equivalent.

Problem 2 (Average length of the optimal code)
Show that the expected codeword length of the optimal binary code may be arbitrarily close to $H(X) + 1$. More precisely, for any small $\varepsilon > 0$, construct a distribution on the source alphabet $\mathcal{X}$ such that the average codeword length of the optimal binary code satisfies $\mathbf{E}|f(X)| > H(X) + 1 - \varepsilon$.

Problem 3 (Equality in Kraft's inequality)
A prefix code $f$ is called full if it loses its prefix property when any new codeword is added to it. A string $x$ is called undecodable if it is impossible to construct a sequence of codewords such that $x$ is a prefix of their concatenation. Show that the following three statements are equivalent:
(a) $f$ is full;
(b) there is no undecodable string with respect to $f$;
(c) $\sum_{i=1}^{n} s^{-l_i} = 1$, where $s$ is the cardinality of the code alphabet, $l_i$ is the length of the $i$th codeword, and $n$ is the number of codewords.

Problem 4 (Shannon-Fano code)
Consider the following code construction. Order the elements of the source alphabet $\mathcal{X}$ according to their decreasing probabilities: $p(x_1) \ge p(x_2) \ge \cdots \ge p(x_n) > 0$. Introduce the numbers $w_i$ as follows: $w_1 = 0$ and $w_i = \sum_{j=1}^{i-1} p(x_j)$ for $i = 2, \ldots, n$. Consider the binary expansion of each $w_i$, and write it down up to the first bit at which it differs from the expansions of all the other $w_j$ ($j \ne i$). Thus we obtain $n$ finite strings. Define the binary codeword $f(x_i)$ as the bits of this truncated expansion of $w_i$ following the binary point. Prove that the codeword lengths of the obtained code satisfy $|f(x_i)| < -\log p(x_i) + 1$. Therefore, the expected codeword length is smaller than the entropy plus one.
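The construction in Problem 4 can be sketched in a few lines of Python (not part of the original exercise). The sketch uses the common variant that truncates the expansion of $w_i$ to exactly $\lceil -\log_2 p(x_i) \rceil$ bits; the rule in Problem 4 may stop earlier, but this length always suffices to distinguish $w_i$ from every other $w_j$, since $w_{i+1} - w_i = p(x_i) \ge 2^{-l_i}$. Exact rational arithmetic is used so the bit expansion is not corrupted by floating-point rounding.

```python
from fractions import Fraction
from math import ceil, log2

def shannon_code(probs):
    """Shannon codewords for probabilities sorted in decreasing order.

    Codeword i is the binary expansion of the cumulative sum
    w_i = p_1 + ... + p_{i-1}, truncated to ceil(-log2 p_i) bits.
    """
    codewords = []
    w = Fraction(0)
    for p in probs:
        length = ceil(-log2(p))      # l_i = ceil(log 1/p_i)
        bits, frac = [], w
        for _ in range(length):      # bits of w_i after the binary point
            frac *= 2
            bits.append(str(int(frac)))
            frac -= int(frac)
        codewords.append("".join(bits))
        w += p
    return codewords

# the five-symbol distribution from Problem 7, as exact fractions
probs = [Fraction(n, 100) for n in (40, 35, 10, 10, 5)]
print(shannon_code(probs))   # ['00', '01', '1100', '1101', '11110']
```

Note that the resulting codewords form a prefix code, as the problem's bound predicts.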
Problem 5 (Bad codes)
Which of the following binary codes cannot be a Huffman code for any distribution? Justify your answer.
(a) 0, 10, 111, 101
(b) 00, 010, 011, 10, 110
(c) 1, 000, 001, 010, 011

Problem 6
Assume that the probability of each element of the source alphabet $\mathcal{X} = \{x_1, \ldots, x_n\}$ is of the form $2^{-i}$, where $i$ is a positive integer. Prove that the Shannon-Fano code is optimal. Show that the average codeword length of a binary Huffman code equals the entropy if and only if the distribution is of the described form.

Problem 7
Assume that the source alphabet $\mathcal{X}$ has five elements with the corresponding probabilities $0.4,\ 0.35,\ 0.1,\ 0.1,\ 0.05$. Determine the entropy of the source. Construct a Shannon-Fano code the way it is described in Problem 4, and construct a binary prefix code with codeword lengths $l_i = \lceil -\log p(x_i) \rceil$ using the binary tree representation shown in class. What is the average codeword length?
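As a numerical sanity check for Problem 7 (not a substitute for the derivation the exercise asks for), the entropy and the average length of a code with $l_i = \lceil -\log_2 p(x_i) \rceil$ can be computed directly:

```python
import math

probs = [0.4, 0.35, 0.1, 0.1, 0.05]

H = -sum(p * math.log2(p) for p in probs)            # source entropy in bits
lengths = [math.ceil(-math.log2(p)) for p in probs]  # l_i = ceil(-log2 p_i)
avg = sum(p * l for p, l in zip(probs, lengths))     # expected codeword length

print(f"H(X)    = {H:.4f} bits")   # about 1.9394
print(f"E|f(X)| = {avg:.2f} bits")
assert H <= avg < H + 1            # consistent with the bound from Problem 4
```

The gap between the average length and the entropy illustrates the "entropy plus one" bound from Problem 4; it does not mean this code is optimal for the given distribution.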