EE 376A Information Theory                                    Prof. T. Weissman
Thursday, Feb. 4th, 2010

Solution, Homework Set #3

1. Venn diagrams. Consider the following quantity:

       I(X;Y;Z) = I(X;Y) - I(X;Y|Z).

This quantity is symmetric in X, Y and Z, despite the preceding asymmetric definition. Unfortunately, I(X;Y;Z) is not necessarily nonnegative. Find X, Y and Z such that I(X;Y;Z) < 0, and prove the following two identities:

(a) I(X;Y;Z) = H(X,Y,Z) - H(X) - H(Y) - H(Z) + I(X;Y) + I(Y;Z) + I(Z;X)

(b) I(X;Y;Z) = H(X,Y,Z) - H(X,Y) - H(Y,Z) - H(Z,X) + H(X) + H(Y) + H(Z)

The first identity can be understood using the Venn diagram analogy for entropy and mutual information. The second identity follows easily from the first.

Solutions: Venn diagrams.

For an example with I(X;Y;Z) < 0, let X and Y be independent Bernoulli(1/2) random variables and let Z = X ⊕ Y. Then I(X;Y) = 0 since X and Y are independent, but given Z each of X and Y determines the other, so I(X;Y|Z) = 1 bit and hence I(X;Y;Z) = 0 - 1 = -1 < 0.

To show the first identity:

    I(X;Y;Z) = I(X;Y) - I(X;Y|Z)                                  by definition
             = I(X;Y) - (I(X;Y,Z) - I(X;Z))                       by the chain rule
             = I(X;Y) + I(X;Z) - I(X;Y,Z)
             = I(X;Y) + I(X;Z) - (H(X) + H(Y,Z) - H(X,Y,Z))
             = I(X;Y) + I(X;Z) - H(X) + H(X,Y,Z) - H(Y,Z)
             = I(X;Y) + I(X;Z) - H(X) + H(X,Y,Z) - (H(Y) + H(Z) - I(Y;Z))
             = I(X;Y) + I(X;Z) + I(Y;Z) + H(X,Y,Z) - H(X) - H(Y) - H(Z).

To show the second identity, simply substitute for I(X;Y), I(X;Z), and I(Y;Z) using equations of the form

    I(X;Y) = H(X) + H(Y) - H(X,Y).

These two identities show that I(X;Y;Z) is a symmetric (but not necessarily nonnegative) function of three random variables.
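As a numerical sanity check (a minimal sketch added alongside the solution, not part of the original; the helper names H and marginal are introduced only for this illustration), the following Python snippet builds the joint distribution of the X ⊕ Y example above, computes every entropy directly, and confirms that the definition and both identities all evaluate to -1:

```python
import itertools
import math

def H(p):
    """Entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(joint, idx):
    """Marginalize a joint distribution {(x, y, z): prob} onto coordinates idx."""
    out = {}
    for outcome, q in joint.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + q
    return out

# X, Y i.i.d. Bernoulli(1/2) and Z = X XOR Y: four equally likely triples.
joint = {(x, y, x ^ y): 0.25 for x, y in itertools.product((0, 1), repeat=2)}

Hxyz = H(joint)
Hx, Hy, Hz = (H(marginal(joint, (i,))) for i in range(3))
Hxy, Hyz, Hzx = (H(marginal(joint, ij)) for ij in ((0, 1), (1, 2), (2, 0)))

Ixy = Hx + Hy - Hxy              # I(X;Y)   = 0
Iyz = Hy + Hz - Hyz              # I(Y;Z)   = 0
Izx = Hz + Hx - Hzx              # I(Z;X)   = 0
Ixy_z = Hzx + Hyz - Hz - Hxyz    # I(X;Y|Z) = 1

print(Ixy - Ixy_z)                            # definition:   -1.0
print(Hxyz - Hx - Hy - Hz + Ixy + Iyz + Izx)  # identity (a): -1.0
print(Hxyz - Hxy - Hyz - Hzx + Hx + Hy + Hz)  # identity (b): -1.0
```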
2. Conditional entropy. Under what conditions does H(X|g(Y)) = H(X|Y)?

Solutions: Conditional entropy.

If H(X|g(Y)) = H(X|Y), then H(X) - H(X|g(Y)) = H(X) - H(X|Y), i.e., I(X;g(Y)) = I(X;Y). This is the condition for equality in the data processing inequality. From the derivation of that inequality, we have equality iff X → g(Y) → Y forms a Markov chain. Hence H(X|g(Y)) = H(X|Y) iff X → g(Y) → Y. This condition includes many special cases, such as g being one-to-one, and X and Y being independent. However, these two special cases do not exhaust all the possibilities.
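To make the last remark concrete, here is a small sketch (an illustration added beyond the original text; the particular distribution is an assumption chosen for the example) in which g is not one-to-one and X and Y are not independent, yet H(X|g(Y)) = H(X|Y) because X depends on Y only through g(Y):

```python
import math
from collections import defaultdict

def cond_entropy(joint):
    """H(A|B) in bits for a joint distribution given as {(a, b): probability}."""
    pb = defaultdict(float)
    for (_, b), q in joint.items():
        pb[b] += q
    return -sum(q * math.log2(q / pb[b]) for (_, b), q in joint.items() if q > 0)

def g(y):
    return y % 2  # not one-to-one on {0, 1, 2, 3}

# Y is uniform on {0, 1, 2, 3}; X equals the parity g(Y), flipped with
# probability 0.1 independently of everything else, so X -> g(Y) -> Y
# is a Markov chain even though X and Y are clearly dependent.
joint_xy = {}
joint_xg = defaultdict(float)
for y in range(4):
    for x in (0, 1):
        p = 0.25 * (0.9 if x == g(y) else 0.1)
        joint_xy[(x, y)] = p
        joint_xg[(x, g(y))] += p

print(cond_entropy(joint_xy))  # H(X|Y)    ~ 0.469 bits
print(cond_entropy(joint_xg))  # H(X|g(Y)) ~ 0.469 bits, equal as claimed
```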