Information Theory and Coding — HW 3

V. Balakrishnan
Department of ECE, Johns Hopkins University
October 1, 2006

1 Fano's inequality

Let us first maximize H(p) subject to

    p_1 = 1 - P_e   and   \sum_{i=2}^{m} p_i = P_e.

So we would have the unconstrained Lagrangian

    -(1 - P_e) \log(1 - P_e) - \sum_{i=2}^{m} p_i \log p_i - \lambda \left( \sum_{i=2}^{m} p_i - P_e \right).

Differentiating with respect to p_i for i > 1, we get p_i = K, where K is a constant; combined with the constraint, this means

    p_i = \frac{P_e}{m - 1}, \quad i = 2, \dots, m.

So we have

    -(1 - P_e) \log(1 - P_e) - \sum_{i=2}^{m} p_i \log p_i \le -(1 - P_e) \log(1 - P_e) - P_e \log \frac{P_e}{m - 1},

which gives

    H(P_e) + P_e \log(m - 1) \ge H(p).

Using H(P_e) \le 1, this can be weakened to

    P_e \ge \frac{H(p) - 1}{\log(m - 1)},

where

    H(p) = -\sum_{i=1}^{m} p_i \log p_i.

2 Logical order of ideas

2.1 Part a

The conditional version of mutual information is derived from the conditional version of entropy. Though the chain rule for relative entropy is derived independently, it is not as widely used (nor does it have as many practical implications) as the chain rules for mutual information and entropy. So the order would be:

1. chain rule for H(X_1, X_2, \dots, X_n)
2. chain rule for I(X_1, X_2, \dots, X_n; Y)
3. chain rule for D(p(x_1, \dots, x_n) \| q(x_1, \dots, x_n))

2.2 Part b

Jensen's inequality is the strongest: nonnegativity of relative entropy follows from it, and nonnegativity of mutual information follows in turn. This is because we derive the fact that I(X; Y) \ge 0 by rewriting it as D(p(x, y) \| p(x) p(y)). So the order is:

1. Jensen's inequality
2. D(f \| g) \ge 0
3. I(X; Y) \ge 0

3 Entropy of a missorted file

Let X be the resulting permutation of the numbers 1, 2, \dots, n. We can observe the following distribution:

    P(X) = 1/n^2   Case I:   X(i) = j, X(j) = i, and X(k) = k for k \ne i, j, with |i - j| > 1
    P(X) = 2/n^2   Case II:  X(i) = j, X(j) = i, and X(k) = k for k \ne i, j, with |i - j| = 1
    P(X) = 1/n     Case III: X(k) = k for all k

Imagine we have the n numbers placed in a row, with n + 1 dots surrounding them; simply speaking,

    (dot) 1 (dot) 2 (dot) 3 ...
(dot) n (dot). If we pick a number, we don't want to place that number in any of the two adjacent dots (to not...
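The preview cuts off before the rest of the argument, but the probability values listed above (1/n, 2/n^2, 1/n^2) can be reproduced by brute force under one natural reading of the problem: remove one of the n numbers uniformly at random and reinsert it at a uniformly random position. That model, and all function names below, are assumptions for illustration, not part of the original solution. A minimal sketch:

```python
from collections import Counter
from fractions import Fraction
from math import log2

def missorted_distribution(n):
    """Enumerate the n*n equally likely (number, slot) outcomes of the
    assumed model: remove the number at position i, reinsert it at slot j.
    Returns a map from each resulting permutation (as a tuple) to its
    exact probability."""
    counts = Counter()
    base = list(range(1, n + 1))
    for i in range(n):           # which position is removed
        for j in range(n):       # which slot it is reinserted into
            row = base[:i] + base[i + 1:]
            row.insert(j, base[i])
            counts[tuple(row)] += 1
    total = n * n
    return {perm: Fraction(c, total) for perm, c in counts.items()}

def entropy_bits(dist):
    """H(X) = -sum p log2 p over the support of the distribution."""
    return -sum(float(p) * log2(float(p)) for p in dist.values())

dist = missorted_distribution(5)
# Distinct probabilities: expect exactly 1/n^2, 2/n^2, and 1/n, as in the cases above
print(sorted(set(dist.values())))
print(entropy_bits(dist))
```

Under this model the identity arises n ways (probability 1/n), each adjacent swap two ways (2/n^2), and every other reachable permutation one way (1/n^2), matching the three listed values.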
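The maximization in Section 1 can also be sanity-checked numerically: for any distribution p over m symbols with p_1 = 1 - P_e, we should have H(p) <= H(P_e) + P_e log(m - 1), with equality when the remaining mass is spread uniformly as p_i = P_e/(m - 1). A hedged sketch (function names and the random test distributions are illustrative):

```python
import random
from math import log2

def entropy(p):
    """H(p) = -sum p_i log2 p_i, skipping zero-probability terms."""
    return -sum(q * log2(q) for q in p if q > 0)

def fano_upper_bound(p):
    """H(P_e) + P_e log2(m - 1), where P_e = 1 - p[0] and m = len(p)."""
    m = len(p)
    pe = 1.0 - p[0]
    return entropy([pe, 1.0 - pe]) + pe * log2(m - 1)

random.seed(0)
for _ in range(1000):
    m = random.randint(2, 10)
    raw = [random.random() for _ in range(m)]
    s = sum(raw)
    p = [x / s for x in raw]
    # The bound from Section 1 should hold for every distribution.
    assert entropy(p) <= fano_upper_bound(p) + 1e-9
print("Fano bound held on 1000 random distributions")
```

Equality at the maximizer can be checked the same way: for p = [1 - P_e] + [P_e/(m-1)]*(m-1), entropy(p) coincides with fano_upper_bound(p) up to floating-point error.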
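The key step in Part b — that I(X; Y) >= 0 because it is the relative entropy D(p(x,y) || p(x)p(y)) — is likewise easy to check numerically. A small sketch; the joint distributions below are made-up examples:

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) computed directly as D(p(x,y) || p(x)p(y)) for a joint pmf
    given as a 2-D list of probabilities."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * log2(pxy / (px[i] * py[j]))
    return mi

# A dependent joint distribution: I(X;Y) should be strictly positive.
joint = [[0.3, 0.1],
         [0.2, 0.4]]
print(mutual_information(joint))

# A product (independent) joint distribution: I(X;Y) should be 0.
indep = [[0.5 * 0.6, 0.5 * 0.4],
         [0.5 * 0.6, 0.5 * 0.4]]
print(mutual_information(indep))
```

This mirrors the logical order above: Jensen's inequality gives D >= 0, and I(X;Y) >= 0 is the special case D(p(x,y) || p(x)p(y)).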