University of Illinois at Chicago
Department of Electrical and Computer Engineering
ECE 534: Information Theory, Fall 2009

Midterm 1 - Solutions

NAME:

This exam has 4 questions, each of which is worth 15 points. You will be given the full class time: 75 minutes. Use it wisely! Many of the problems have short answers; try to find shortcuts. You may bring and use one 8.5x11 double-sided crib sheet. No other notes or books are permitted. No calculators are permitted. Talking, passing notes, copying, and all other forms of cheating are forbidden. Make sure you explain your answers in a way that illustrates your understanding of the problem. Ideas are important, not just the calculation. Partial marks will be given. Write all answers directly on this exam.

1. (15 points) True or false and short answer. Brief explanations (rather than lengthy proofs) suffice.

(a) (5 points) Which of the following sequences of codeword lengths cannot be the codeword lengths of a 3-ary (ternary, D = 3) Huffman code?

(i) (1,1,2,2,3,3,3)
(ii) (1,1,2,2,3,3)
(iii) (1,1,2,2,3)
(iv) (1,2,2,2,2,2,2)
(v) (1,2,2,2,2)

Solution: The easiest way to see which of these can be ternary Huffman codeword lengths is to try to construct the Huffman tree. Attempting the construction shows that (iii) and (v) are impossible: in each case a codeword is left without siblings and can be shortened/pruned, reducing both profiles to (1,1,2,2,2). (A numeric Kraft-sum check appears after these solutions.)

(i) Yes (ii) Yes (iii) No (iv) Yes (v) No

(b) (6 points) We have defined the mutual information I(X;Y) between two random variables X and Y. Let us *try* to define the mutual information between three random variables X, Y, and Z as

    I(X;Y;Z) = I(X;Y) - I(X;Y|Z).

(i) Is this definition symmetric in its arguments? Prove why or give an example of why not.
(ii) Is I(X;Y;Z) positive? Prove why or give an example of why not.
(iii) True or false: I(X;Y;Z) = H(X,Y,Z) - H(X) - H(Y) - H(Z) + I(X;Y) + I(Y;Z) + I(Z;X).
(iv) True or false: I(X;Y;Z) = H(X,Y,Z) - H(X,Y) - H(Y,Z) - H(Z,X) + H(X) + H(Y) + H(Z).

Solution:

(i) Yes, it is symmetric, as can be seen from (iii) and (iv): both expansions will be shown to be true, and each is symmetric in its arguments.

(ii) No, it is not necessarily positive. Take X and Y to be independent fair coin flips and let Z = X + Y mod 2. Then I(X;Y) = 0, but

    I(X;Y|Z) = H(X|Z) - H(X|Y,Z) = 1 - 0 = 1,

since X is independent of Z (so H(X|Z) = H(X) = 1) while X is determined by (Y,Z) (so H(X|Y,Z) = 0). Hence I(X;Y;Z) = 0 - 1 = -1 < 0.

(iii) True. Expanding the definition and using H(X|Y) = H(X,Y) - H(Y), and similarly for the other conditional entropies:

    I(X;Y;Z) = I(X;Y) - I(X;Y|Z)
             = H(X) - H(X|Y) - H(X|Z) + H(X|Y,Z)
             = H(X) - H(X,Y) + H(Y) - H(X,Z) + H(Z) + H(X,Y,Z) - H(Y,Z)
             = H(X,Y,Z) + (H(X) + H(Y) - H(X,Y)) + (H(X) + H(Z) - H(X,Z))
               + (H(Y) + H(Z) - H(Y,Z)) - H(X) - H(Y) - H(Z)
             = H(X,Y,Z) + I(X;Y) + I(X;Z) + I(Y;Z) - H(X) - H(Y) - H(Z).

(iv) True. Substituting I(X;Y) = H(X) + H(Y) - H(X,Y), I(Y;Z) = H(Y) + H(Z) - H(Y,Z), and I(Z;X) = H(Z) + H(X) - H(Z,X) into the expansion in (iii) gives exactly H(X,Y,Z) - H(X,Y) - H(Y,Z) - H(Z,X) + H(X) + H(Y) + H(Z).
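As a sanity check for part (a) (not part of the original solutions), here is a minimal Python sketch. It uses the standard fact that a D-ary Huffman tree on m symbols is built after padding with k dummy symbols, where k is the smallest non-negative integer with (m + k) congruent to 1 mod (D - 1); when no dummies are needed the tree is complete, so the Kraft sum of the lengths must equal exactly 1. The profile labels and helper code are illustrative.

from fractions import Fraction

D = 3
profiles = {
    "(i)":   (1, 1, 2, 2, 3, 3, 3),
    "(ii)":  (1, 1, 2, 2, 3, 3),
    "(iii)": (1, 1, 2, 2, 3),
    "(iv)":  (1, 2, 2, 2, 2, 2, 2),
    "(v)":   (1, 2, 2, 2, 2),
}

for name, lengths in profiles.items():
    # exact Kraft sum: sum over codewords of D^(-length)
    kraft = sum(Fraction(1, D ** l) for l in lengths)
    m = len(lengths)
    # dummy symbols needed so that (m + k) % (D - 1) == 1
    k = (1 - m) % (D - 1)
    if k == 0:
        verdict = "possible" if kraft == 1 else "impossible: complete tree needs Kraft sum 1"
    else:
        verdict = f"check by hand: {k} dummy leaf must fill the gap {1 - kraft}"
    print(f"{name} {lengths}: Kraft sum = {kraft}, {verdict}")

This flags (iii) and (v) as impossible (odd symbol count, Kraft sum below 1) and confirms (i) and (iv). Profile (ii) has an even symbol count; its gap of 1/27 is exactly one length-3 dummy leaf, consistent with the answer Yes.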
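The counterexample in (b)(ii) and the two expansions in (iii) and (iv) can also be verified numerically. The sketch below (again illustrative; the helper names H and marginal are not from the exam) enumerates the joint distribution of two fair coins X, Y and Z = X XOR Y:

from itertools import product
from math import log2

def H(pmf):
    # entropy in bits of a dict mapping outcomes to probabilities
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# X, Y independent fair coins; Z = X XOR Y; each (x, y) pair has mass 1/4
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def marginal(idx):
    # marginal pmf over the coordinates listed in idx
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

HX, HY, HZ = (H(marginal((i,))) for i in (0, 1, 2))
HXY, HYZ, HZX = H(marginal((0, 1))), H(marginal((1, 2))), H(marginal((2, 0)))
HXYZ = H(joint)

I_XY = HX + HY - HXY                   # 0: X and Y are independent
I_XY_given_Z = HZX + HYZ - HZ - HXYZ   # 1: given Z, X determines Y

print("I(X;Y;Z)        =", I_XY - I_XY_given_Z)                    # -1.0
print("expansion (iii) =", HXYZ - HX - HY - HZ + I_XY
      + (HY + HZ - HYZ) + (HZ + HX - HZX))                         # -1.0
print("expansion (iv)  =", HXYZ - HXY - HYZ - HZX + HX + HY + HZ)  # -1.0

All three lines print -1.0, confirming both that the tentative definition can be negative and that the expansions in (iii) and (iv) agree on this example.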