EE 376A Information Theory
Prof. T. Weissman
Thursday, January 21, 2010

Homework Set #2
(Due: Thursday, January 28, 2010)

1. Prove that

   (a) Data processing decreases entropy: if Y = f(X), then H(Y) ≤ H(X).
       [Hint: expand H(f(X), X) in two different ways.]

   (b) Data processing on side information increases entropy: if Y = f(X), then
       H(Z | X) ≤ H(Z | Y).

   (c) Assume Y and Z are conditionally independent given X, denoted Y − X − Z.
       In other words, P{Y = y | X = x, Z = z} = P{Y = y | X = x} for all
       x ∈ X, y ∈ Y, z ∈ Z. Prove that H(Z | X) ≤ H(Z | Y).

2. Entropy of a disjoint mixture. Let X1 and X2 be discrete random variables drawn
   according to probability mass functions p1(·) and p2(·) over the respective
   alphabets X1 = {1, 2, ..., m} and X2 = {m + 1, ..., n}. Let θ be the result of a
   biased coin flip, i.e., P{θ = 1} = α and P{θ = 0} = 1 − α. X1, X2 and θ are
   mutually independent. ...
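Not part of the assignment: below is a minimal numerical sanity check, in Python, of the
inequality in 1(a) and of the usual entropy decomposition for the disjoint mixture in
Problem 2. The preview cuts off before Problem 2 states its actual question, so the
identity checked here is only the standard one for this construction; the pmfs, the
merging map f, and α = 0.3 are arbitrary illustrative choices.

import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector p (zero entries ignored)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# --- Problem 1(a): H(f(X)) <= H(X) for a deterministic function f ---
# Illustrative pmf on {0,1,2,3} and a non-injective f that merges symbols.
p_x = np.array([0.4, 0.3, 0.2, 0.1])
f = {0: 0, 1: 0, 2: 1, 3: 1}
p_y = np.zeros(2)
for x, px in enumerate(p_x):
    p_y[f[x]] += px
assert entropy(p_y) <= entropy(p_x) + 1e-12

# --- Problem 2: entropy of a disjoint mixture ---
# Because the alphabets are disjoint, X determines theta, so
# H(X) = H(theta) + H(X | theta) = H_b(alpha) + alpha*H(X1) + (1-alpha)*H(X2).
alpha = 0.3
p1 = np.array([0.5, 0.25, 0.25])   # illustrative pmf of X1 on {1, ..., m}
p2 = np.array([0.6, 0.4])          # illustrative pmf of X2 on {m+1, ..., n}
p_mix = np.concatenate([alpha * p1, (1 - alpha) * p2])
lhs = entropy(p_mix)
rhs = entropy([alpha, 1 - alpha]) + alpha * entropy(p1) + (1 - alpha) * entropy(p2)
assert abs(lhs - rhs) < 1e-12

print(f"H(f(X)) = {entropy(p_y):.3f} <= H(X) = {entropy(p_x):.3f}")
print(f"H(X) = {lhs:.3f} = H_b(alpha) + alpha*H(X1) + (1-alpha)*H(X2) = {rhs:.3f}")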