EE 376A/Stat 376A Handout #5
Information Theory Thursday, January 8, 2009
Prof. T. Cover Due Thursday, January 15, 2009

Homework Set #1

1. Entropy of Hamming code. Consider information bits X1, X2, X3, X4 ∈ {0, 1} chosen at random, and check bits X5, X6, X7 chosen to make the parity of each circle even.

[Figure: Venn diagram of three overlapping circles containing the bits X1, ..., X7, followed by a pair of diagrams showing an example encoding.]

That is, 1011 becomes 1011010.

(a) What is the entropy H(X1, X2, ..., X7)?

Now we make an error (or not) in one of the bits (or none). Let Y = X ⊕ e, where e is equally likely to be (1, 0, ..., 0), (0, 1, 0, ..., 0), ..., (0, 0, ..., 0, 1), or (0, 0, ..., 0), and e is independent of X.

(b) What is the entropy of Y?

(c) Suppose Y = 0101110. Given that 1 or fewer errors were made, what is (X1, X2, X3, X4)?

(d) What is H(X | Y)?

(e) What is I(X; Y)?

2. Entropy of functions of a random variable. Let X be a discrete random variable. Show that the entropy of a function of X is less than or equal to the entropy of...
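As a sanity check on the encoding in Problem 1, the parity rules can be sketched in code. The three parity equations below are one assignment consistent with the handout's example (1011 becomes 1011010); the handout itself defines the check bits via the Venn-diagram circles, so treat the specific equations as an assumption.

```python
def hamming74_encode(info):
    """Encode 4 information bits into a 7-bit Hamming codeword.

    The parity equations are an assumed assignment consistent with the
    handout's example encoding; the handout defines X5, X6, X7 as the
    bits that make the parity of each Venn-diagram circle even.
    """
    x1, x2, x3, x4 = info
    x5 = x2 ^ x3 ^ x4  # check bit for the first circle
    x6 = x1 ^ x3 ^ x4  # check bit for the second circle
    x7 = x1 ^ x2 ^ x4  # check bit for the third circle
    return [x1, x2, x3, x4, x5, x6, x7]

# The handout's example: 1011 encodes to 1011010.
print(hamming74_encode([1, 0, 1, 1]))  # [1, 0, 1, 1, 0, 1, 0]
```

Since the check bits are determined by the information bits, the map from (X1, ..., X4) to (X1, ..., X7) is one-to-one, which is the key observation for part (a).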