# hw1 - EE 376A/Stat 376A Handout #5: Information Theory


EE 376A/Stat 376A Handout #5
Information Theory — Thursday, January 8, 2009
Prof. T. Cover — Due Thursday, January 15, 2009

## Homework Set #1

1. **Entropy of Hamming code.** Consider information bits X1, X2, X3, X4 ∈ {0, 1} chosen at random, and check bits X5, X6, X7 chosen to make the parity of each circle even.

   *[Figure: Venn diagram of three overlapping circles containing the bits X1 through X7; each check bit makes the parity of its circle even.]*

   That is, 1011 becomes 1011010.

   (a) What is the entropy H(X1, X2, ..., X7)?

   Now we make an error (or not) in one of the bits (or none). Let Y = X ⊕ e, where e is equally likely to be (1, 0, ..., 0), (0, 1, 0, ..., 0), ..., (0, 0, ..., 0, 1), or (0, 0, ..., 0), and e is independent of X.

   (b) What is the entropy of Y?

   (c) Suppose Y = 0101110. Given that at most one error was made, what is (X1, X2, X3, X4)?

   (d) What is H(X | Y)?

   (e) What is I(X; Y)?

2. **Entropy of functions of a random variable.** Let X be a discrete random variable. Show that the entropy of a function of X is less than or equal to the entropy of X.
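To make the encoding in Problem 1 concrete, here is a minimal sketch of a (7,4) Hamming encoder and a one-error-correcting decoder. The handout defines the check bits via the Venn-diagram circles; the specific parity equations below (X5 = X2⊕X3⊕X4, X6 = X1⊕X3⊕X4, X7 = X1⊕X2⊕X4) are an assumption inferred from the stated example 1011 → 1011010, and the function names are illustrative only.

```python
# Sketch of the (7,4) Hamming code from Problem 1.
# Parity equations are inferred from the example 1011 -> 1011010;
# the handout defines them via the Venn-diagram circles.
from itertools import product

def encode(x):
    """Append check bits X5, X6, X7 to information bits X1..X4."""
    x1, x2, x3, x4 = x
    x5 = x2 ^ x3 ^ x4   # parity of the circle containing X2, X3, X4
    x6 = x1 ^ x3 ^ x4   # parity of the circle containing X1, X3, X4
    x7 = x1 ^ x2 ^ x4   # parity of the circle containing X1, X2, X4
    return [x1, x2, x3, x4, x5, x6, x7]

def decode(y):
    """Recover the information bits, assuming at most one bit error.

    Brute-force nearest-codeword search: since the code's minimum
    distance is 3, at most one codeword lies within Hamming distance
    1 of y, so the answer is unique.
    """
    for x in product([0, 1], repeat=4):
        c = encode(list(x))
        if sum(a != b for a, b in zip(c, y)) <= 1:
            return list(x)
    return None  # more than one error: not correctable

print(encode([1, 0, 1, 1]))  # -> [1, 0, 1, 1, 0, 1, 0], matching the handout
```

A useful sanity check is the round trip: encode any 4-bit word, flip a single bit of the codeword, and `decode` still returns the original information bits, which is exactly the situation parts (c) and (d) reason about.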

## This note was uploaded on 02/05/2012 for the course EE308 taught by Professor B. K. Dey during the Spring '09 term at IIT Bombay.
