Harvard SEAS ES250 – Information Theory
Homework 1 (Due Date: Oct. 2, 2007)

1. Let p(x, y) be given by

            Y = 0   Y = 1
    X = 0    1/3     1/3
    X = 1     0      1/3

   Evaluate the following expressions:
   (a) H(X), H(Y)
   (b) H(X|Y), H(Y|X)
   (c) H(X, Y)
   (d) H(Y) - H(Y|X)
   (e) I(X; Y)
   (f) Draw a Venn diagram for the quantities in (a) through (e).

2. Entropy of functions of a random variable

   (a) Let X be a discrete random variable. Show that the entropy of a function of X is less than or equal to the entropy of X by justifying the following steps:

       H(X, g(X)) (a)=  H(X) + H(g(X)|X)
                  (b)=  H(X)
       H(X, g(X)) (c)=  H(g(X)) + H(X|g(X))
                  (d)>= H(g(X))

       Thus H(g(X)) <= H(X).

   (b) Let Y = X^7, where X is a random variable taking on positive and negative integer values. What is the relationship of H(X) and H(Y)? What if Y = cos(πX/3)?

3. One wishes to identify a random object X ~ p(x). A question Q ~ r(q) is asked at random according to
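The quantities asked for in Problem 1, and the inequality H(g(X)) <= H(X) from Problem 2, can be checked numerically. The sketch below is not part of the assignment; it is a minimal Python illustration using the joint table from Problem 1, with a helper name (H) and an example distribution for Problem 2(b) chosen here for demonstration.

```python
import math

# Joint distribution from Problem 1: p(x, y) for x, y in {0, 1}
p = {(0, 0): 1/3, (0, 1): 1/3, (1, 0): 0.0, (1, 1): 1/3}

def H(dist):
    """Shannon entropy in bits of a dict of probabilities; 0*log 0 := 0."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Marginals of X and Y
px = {x: sum(p[(x, y)] for y in (0, 1)) for x in (0, 1)}
py = {y: sum(p[(x, y)] for x in (0, 1)) for y in (0, 1)}

Hx, Hy, Hxy = H(px), H(py), H(p)
Hx_given_y = Hxy - Hy          # chain rule: H(X|Y) = H(X,Y) - H(Y)
Hy_given_x = Hxy - Hx
Ixy = Hx + Hy - Hxy            # I(X;Y) = H(X) + H(Y) - H(X,Y)

print(f"H(X)   = {Hx:.4f} bits")          # log2(3) - 2/3 ≈ 0.9183
print(f"H(Y)   = {Hy:.4f} bits")          # same as H(X) by symmetry
print(f"H(X,Y) = {Hxy:.4f} bits")         # log2(3) ≈ 1.5850
print(f"H(X|Y) = {Hx_given_y:.4f} bits")  # 2/3
print(f"I(X;Y) = {Ixy:.4f} bits")         # also equals H(Y) - H(Y|X)

# Problem 2(b), illustrated with a hypothetical X uniform on {1,...,6}:
# Y = X^7 is one-to-one on the integers, so H(Y) = H(X); Z = cos(pi*X/3)
# is many-to-one, so H(Z) <= H(X). Rounding groups equal cosine values
# despite floating-point noise.
pz = {}
for x in range(1, 7):
    z = round(math.cos(math.pi * x / 3), 9)
    pz[z] = pz.get(z, 0.0) + 1/6
print(f"H(X) = {math.log2(6):.4f}, H(cos(piX/3)) = {H(pz):.4f} bits")
```

Note that I(X;Y) = H(Y) - H(Y|X), so parts (d) and (e) of Problem 1 give the same number; the print statements make this easy to confirm.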
