hw3_sol: EE595A Introduction to Information Theory, Winter 2004

EE595A Introduction to Information Theory          University of Washington, Winter 2004
Dept. of Electrical Engineering
Handout 6: Problem Set 3: Solutions
Prof: Jeff A. Bilmes <[email protected]>
Lecture 10, Feb 27, 2004

6.1 Problems from Text Book

Problem 4.8 (Pairwise independence). Let X_1, X_2, ..., X_{n-1} be i.i.d. random variables taking values in {0, 1}, with Pr{X_i = 1} = 1/2. Let X_n = 1 if sum_{i=1}^{n-1} X_i is odd and X_n = 0 otherwise. Let n >= 3.

1. Show that X_i and X_j are independent for i != j, where i, j in {1, 2, ..., n}.
2. Find H(X_i, X_j) for i != j.
3. Find H(X_1, X_2, ..., X_n). Is this equal to n H(X_1)?

Solution 4.8 (Pairwise independence). X_1, X_2, ..., X_{n-1} are i.i.d. Bernoulli(1/2) random variables. We first prove by induction that for any k <= n-1, the probability that sum_{i=1}^{k} X_i is odd is 1/2. Clearly this is true for k = 1. Assume it is true for k-1, and let S_k = sum_{i=1}^{k} X_i. Then

    P(S_k odd) = P(S_{k-1} odd) P(X_k = 0) + P(S_{k-1} even) P(X_k = 1)    (6.1)
               = (1/2)(1/2) + (1/2)(1/2)                                   (6.2)
               = 1/2.                                                      (6.3)

Hence for all k <= n-1, the probability that S_k is odd equals the probability that it is even, and therefore

    P(X_n = 1) = P(X_n = 0) = 1/2.    (6.4)

1. When i and j are both less than n, X_i and X_j are independent by assumption, so the only case to check is j = n. Taking i = 1 without loss of generality,

    P(X_1 = 1, X_n = 1) = P(X_1 = 1, sum_{i=2}^{n-1} X_i even)     (6.5)
                        = P(X_1 = 1) P(sum_{i=2}^{n-1} X_i even)   (6.6)
                        = (1/2)(1/2)                               (6.7)
                        = P(X_1 = 1) P(X_n = 1),                   (6.8)

and similarly for the other possible values of the pair (X_1, X_n). Hence X_1 and X_n are independent.

2. Since X_i and X_j are independent and uniformly distributed on {0, 1},

    H(X_i, X_j) = H(X_i) + H(X_j) = 1 + 1 = 2 bits.    (6.9)
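The parity argument above can be checked by exhaustive enumeration. The following Python sketch (not part of the original handout; the helper name joint_entropy is our own) enumerates all 2^{n-1} outcomes for a small n and confirms that the pair (X_1, X_n) is uniform on {0, 1}^2, so X_1 and X_n are independent and H(X_1, X_n) = 2 bits as in (6.9).

```python
from itertools import product
from math import log2

def joint_entropy(counts):
    """Entropy in bits of the empirical distribution given by a dict of counts."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values() if c)

n = 4  # any n >= 3 works; X_1, ..., X_{n-1} are i.i.d. Bernoulli(1/2)
pair_counts = {}  # joint counts of the pair (X_1, X_n)
for xs in product((0, 1), repeat=n - 1):
    x_n = sum(xs) % 2  # X_n = 1 iff X_1 + ... + X_{n-1} is odd
    key = (xs[0], x_n)
    pair_counts[key] = pair_counts.get(key, 0) + 1

# Each of the four pairs (0,0), (0,1), (1,0), (1,1) occurs equally often,
# so (X_1, X_n) is uniform: X_1 and X_n are independent, matching (6.5)-(6.8).
print(pair_counts)
print(joint_entropy(pair_counts))  # 2.0, matching (6.9)
```

The same enumeration with the full n-tuple in place of the pair would show that the joint distribution is uniform over the 2^{n-1} even-parity strings, which is the content of part 3.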

3. By the chain rule and the independence of X_1, X_2, ..., X_{n-1}, we have

    H(X_1, X_2, ..., X_n) = H(X_1, X_2, ..., X_{n-1}) + H(X_n | X_{n-1}, ..., X_1)    (6.10)
                          = sum_{i=1}^{n-1} H(X_i) + 0                                (6.11)
                          = n - 1,                                                    (6.12)

since X_n is a function of the previous X_i's, so that H(X_n | X_{n-1}, ..., X_1) = 0. The total entropy is not n, which is what would be obtained if the X_i's were all independent. This example illustrates that pairwise independence does not imply complete independence.

Problem 5.6 (Bad codes). Which of these codes cannot be Huffman codes for any probability assignment?

1. {0, 10, 11}.
2. {00, 01, 10, 110}.
3. {01, 10}.

Solution 5.6 (Bad codes).

1. {0, 10, 11} is a Huffman code for the distribution (1/2, 1/4, 1/4).
2. The code {00, 01, 10, 110} can be shortened to {00, 01, 10, 11} without losing its instantaneous (prefix-free) property, and is therefore not optimal, so it cannot be a Huffman code. Alternatively, it is not a Huffman code because it has a unique longest codeword: in a Huffman code the two longest codewords always have the same length.
3. The code {01, 10} can be shortened to {0, 10} while remaining prefix-free, so it is not optimal and cannot be a Huffman code. (A Huffman code for two symbols is {0, 1}.)
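The unique-longest-codeword argument in part 2 of Solution 5.6 can be illustrated directly. The sketch below (not from the handout; huffman_lengths is our own illustrative helper) builds Huffman codeword lengths with Python's heapq and shows that the longest length always appears at least twice, so no probability assignment yields the length multiset {2, 2, 2, 3} of code 2.

```python
import heapq

def huffman_lengths(probs):
    """Return the sorted codeword lengths of a Huffman code for probs."""
    # Heap entries: (subtree probability, tie-breaking counter, symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)  # two least-probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1              # merged symbols move one level deeper
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return sorted(lengths)

# (1/2, 1/4, 1/4) gives lengths {1, 2, 2}, matching the code {0, 10, 11} in part 1.
print(huffman_lengths([0.5, 0.25, 0.25]))   # [1, 2, 2]

# The final merge always pairs two subtrees, so the maximum length occurs
# at least twice; lengths like {2, 2, 2, 3} are impossible for any distribution.
print(huffman_lengths([0.4, 0.3, 0.2, 0.1]))  # [1, 2, 3, 3]
```

The integer counter in each heap entry breaks probability ties deterministically and keeps heapq from ever comparing the list elements of the tuples.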