Module 2, Lecture 2
Fundamental Concepts: Entropy, Relative Entropy, Mutual Information
G.L. Heileman

Entropy

Definition (Entropy). The entropy of a RV X ~ p(x) is

    H(X) = \sum_{x \in \mathcal{X}} p(x) \log \frac{1}{p(x)} = E\left[ \log \frac{1}{p(X)} \right].

Properties of H(X):
1. H(X) \ge 0, with equality when p(x) = 1 for some x \in \mathcal{X}.
2. H(X) \le \log |\mathcal{X}|, with equality when p(x) = 1/|\mathcal{X}| for all x \in \mathcal{X}.
3. H_b(X) = (\log_b 2) H(X), where H(X) denotes the entropy computed with base-2 logarithms.

Proof:
1. 0 \le p(x) \le 1 implies \log \frac{1}{p(x)} \ge 0.
2. We'll prove this momentarily.
3. Follows from the change-of-base identity \log_b c = (\log_b a)(\log_a c).

Binary Entropy Function

[Figure: the binary entropy function H(p, 1-p) plotted against p, attaining its maximum value of 1 at p = 1/2.]

Entropy

Theorem. H(X) \le \log |\mathcal{X}|, with equality when p(x) = 1/|\mathcal{X}| for all x \in \mathcal{X}.

Proof: Consider H(1/n, ..., 1/n) - H(p_1, ..., p_n), where the p_i's are arbitrary probabilities satisfying the probability axioms. We will show that this quantity is \ge 0 for any valid choice of the p_i's. To do this, rewrite it as

    H(1/n, ..., 1/n) - H(p_1, ..., p_n)
        = \log_b n + \sum_{i=1}^{n} p_i \log_b p_i
        = (\log_b e) \log_e n + \sum_{i=1}^{n} p_i (\log_e p_i)(\log_b e)
        = \log_b e \left( \ln n + \sum_{i=1}^{n} p_i \ln p_i \right)
        = \log_b e \left( \ln n \sum_{i=1}^{n} p_i + \sum_{i=1}^{n} p_i \ln p_i \right)    (since \sum_{i} p_i = 1)
        = \log_b e \sum_{i=1}^{n} p_i \ln(n p_i).

Now, using the fact that \ln \frac{1}{x} \ge 1 - x for all x > 0 (applied with x = 1/(n p_i)), we can write

    H(1/n, ..., 1/n) - H(p_1, ..., p_n) \ge \log_b e \sum_{i=1}^{n} p_i \left( 1 - \frac{1}{n p_i} \right)
        = \log_b e \left( \sum_{i=1}^{n} p_i - \sum_{i=1}^{n} \frac{1}{n} \right) = 0.

Thus, equiprobable events produce maximum entropy.

Joint Entropy

Definition (Joint Entropy). The joint entropy of a pair of RVs (X, Y) ~ p(x, y) is

    H(X, Y) = \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log \frac{1}{p(x, y)} = E\left[ \log \frac{1}{p(X, Y)} \right].

When X and Y are independent, p(x, y) = p(x) p(y), and

    H(X, Y) = \sum_{x, y} p(x) p(y) \left( \log \frac{1}{p(x)} + \log \frac{1}{p(y)} \right)
            = \sum_{y} p(y) \sum_{x} p(x) \log \frac{1}{p(x)} + \sum_{x} p(x) \sum_{y} p(y) \log \frac{1}{p(y)}
            = H(X) + H(Y).

Conditional Entropy

The conditional entropy of X given that Y = y is observed can be written as

    H(X \mid Y = y) = \sum_{x} p(x \mid Y = y) \log \frac{1}{p(x \mid Y = y)}.
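
As a concrete illustration of the entropy definition and of properties 2 and 3 above, here is a minimal Python sketch; the distributions named uniform and skewed are made-up example values, not taken from the lecture.

import math

def entropy(p, base=2):
    # H(X) = sum_x p(x) * log(1/p(x)) in the given base; terms with p(x) = 0
    # contribute nothing, matching the convention 0 * log(1/0) = 0.
    return sum(px * math.log(1.0 / px, base) for px in p if px > 0)

# Property 2: H(X) <= log|X|, with equality for the uniform distribution.
uniform = [0.25, 0.25, 0.25, 0.25]   # hypothetical example values
skewed  = [0.7, 0.1, 0.1, 0.1]       # hypothetical example values
print(entropy(uniform))              # 2.0, which equals log2(4) = log|X|
print(entropy(skewed))               # about 1.357, strictly less than 2.0

# Property 3: H_b(X) = (log_b 2) * H(X), e.g. converting bits to nats (b = e).
print(entropy(skewed, base=math.e))      # entropy in nats
print(math.log(2) * entropy(skewed))     # same value via the identity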
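
The proof of the theorem shows that H(1/n, ..., 1/n) - H(p_1, ..., p_n) \ge 0 for every valid distribution. The following spot-check of that gap on randomly generated distributions is purely illustrative and not part of the lecture.

import math
import random

def entropy_nats(p):
    # Entropy in nats: sum_i p_i * ln(1/p_i), skipping zero-probability terms.
    return sum(pi * math.log(1.0 / pi) for pi in p if pi > 0)

random.seed(0)
n = 5
for _ in range(3):
    weights = [random.random() for _ in range(n)]
    p = [w / sum(weights) for w in weights]   # a random valid distribution
    gap = math.log(n) - entropy_nats(p)       # H(1/n,...,1/n) - H(p), in nats
    print(gap >= 0.0)                         # prints True each time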
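
Finally, the additivity H(X, Y) = H(X) + H(Y) for independent X and Y derived above can be checked numerically; the marginal distributions below are hypothetical example values.

import math

def entropy_bits(probs):
    # H = sum p * log2(1/p) over the probabilities of a (possibly joint) distribution.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

px = [0.5, 0.3, 0.2]                      # hypothetical marginal for X
py = [0.6, 0.4]                           # hypothetical marginal for Y
pxy = [a * b for a in px for b in py]     # joint distribution under independence

print(entropy_bits(pxy))                           # H(X, Y)
print(entropy_bits(px) + entropy_bits(py))         # H(X) + H(Y): equal up to float error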