EE 376B: Information Theory                    Handout #11
Prof. T. Cover                                 Tuesday, May 17, 2011
Prepared by T.A. Gowtham Kumar

Solutions to Homework Set #4

1. Maximum Entropy and Counting

Let X = {1, 2, ..., m}. Show that the number of sequences x^n ∈ X^n satisfying (1/n) Σ_{i=1}^n g(x_i) ≥ α is approximately equal to 2^{nH*}, to first order in the exponent, for n sufficiently large, where

    H* = max_{P : Σ_i P(i) g(i) ≥ α} H(P).

Solution: Maximum Entropy and Counting

The key observation is that there are only polynomially many types, whereas there are exponentially many sequences of each type. Let P* be the entropy-maximizing distribution.

Lower bound: Among the rational types with denominator n that satisfy the given constraint, let P*_n be the one that minimizes ‖P*_n − P*‖. There are at least 2^{n(H(P*_n) − ε_n)} sequences of type P*_n. As n → ∞, ‖P*_n − P*‖ → 0, and since H(p) is a continuous function of p, it follows that |H(P*_n) − H(P*)| → 0. Hence there are at least 2^{n(H(P*) − ε_n)} sequences satisfying the constraint, where ε_n → 0 as n → ∞.

Upper bound: There are only polynomially many types: the total number of types is at most (n+1)^m. Therefore the total number of sequences is upper bounded by

    Σ_{P_n : E_{P_n} g(X) ≥ α} 2^{n(H(P_n) + ε_n)} ≤ (n+1)^m 2^{n(H(P*) + ε_n)} = 2^{nε'_n} 2^{n(H(P*) + ε_n)} = 2^{n(H(P*) + ε''_n)},

where ε'_n, ε''_n → 0 as n → ∞. This completes the proof.

2. Counting states

Suppose a die X takes values in {1, 2, ..., 6} and EX = 5.

(a) Find the maximum entropy pmf p*(x) subject to this condition.

(b) Let X_1, X_2, ... be i.i.d. ∼ p*(x). It is now observed that the frequency of occurrence of 1 is twice the frequency of occurrence of 6. Assuming n large, find the conditional distribution of the state of the first die X_1 given this observation.
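The sandwich argument in Problem 1 can be checked by brute force on a small instance. The particular numbers below (alphabet {1, 2, 3}, g(x) = x, α = 2.5, n = 12) are our own illustration, not from the handout:

```python
import itertools
import math

# Hypothetical small instance (not from the handout): alphabet {1,2,3},
# g(x) = x, constraint (1/n) * sum g(x_i) >= alpha with alpha = 2.5.
m, alpha, n = 3, 2.5, 12

def mean(lam):
    """E[X] under the tilted pmf p(x) proportional to exp(lam * x)."""
    w = [math.exp(lam * x) for x in range(1, m + 1)]
    return sum(x * wx for x, wx in zip(range(1, m + 1), w)) / sum(w)

# The maximum-entropy P* has the form p(x) ~ exp(lam * x); E[X] is
# increasing in lam, so bisect until the mean constraint is tight.
lo, hi = 0.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean(mid) < alpha:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2
w = [math.exp(lam * x) for x in range(1, m + 1)]
p_star = [wx / sum(w) for wx in w]
H_star = -sum(px * math.log2(px) for px in p_star)  # H* in bits

# Brute-force count of sequences meeting the constraint.
count = sum(1 for seq in itertools.product(range(1, m + 1), repeat=n)
            if sum(seq) >= alpha * n)
exponent = math.log2(count) / n

print(f"H* = {H_star:.4f} bits, (1/n) log2(count) = {exponent:.4f}")
```

Even at n = 12 (enumerating 3^12 = 531,441 sequences), the empirical exponent is within the polynomial slack m·log2(n+1)/n of H*, and the gap closes as n grows, consistent with the proof above.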
Solution: Counting states

(a) The maximum entropy distribution has the form

    p*(x) = exp(λ0 + λ1 x) = exp(λ1 x) / Σ_{x'=1}^{6} exp(λ1 x').

We determine the constant λ1 so that the constraint EX = 5 is satisfied, i.e., we solve

    Σ_{x=1}^{6} x exp(λ1 x) / Σ_{x=1}^{6} exp(λ1 x) = 5.

Using a numerical solver (for instance, fsolve in Matlab), we find λ1 = 0.6296, which corresponds to the distribution

    p* = (0.0205, 0.0385, 0.0723, 0.1357, 0.2548, 0.4781).

(b) By the conditional limit theorem, the most likely conditional distribution of X_1 is the distribution q* that minimizes D(q ‖ p*) over all q satisfying the given constraint. As in the previous homework, we express the linear constraint on q(x) as an expected-value constraint: q(1) = 2 q(6), i.e., q(1) − 2 q(6) = 0. This can be written as E g(X) = 0, where g(1) = 1, g(6) = −2, and g(i) = 0 for i = 2, 3, 4, 5. ...
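Both parts can be reproduced numerically. The sketch below uses bisection in Python rather than Matlab's fsolve; for part (b) it uses the standard exponential-tilt form of the minimizer, q*(x) ∝ p*(x) exp(μ g(x)), which the truncated preview does not show, so treat that step as our completion of the argument rather than the handout's:

```python
import math

# Part (a): maximum-entropy pmf p*(x) = exp(lam1 * x) / Z on {1,...,6}
# subject to EX = 5.
def mean(lam):
    w = [math.exp(lam * x) for x in range(1, 7)]
    return sum(x * wx for x, wx in zip(range(1, 7), w)) / sum(w)

# E[X] is strictly increasing in lam, so bisection stands in for fsolve.
lo, hi = 0.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean(mid) < 5:
        lo = mid
    else:
        hi = mid
lam1 = (lo + hi) / 2
w = [math.exp(lam1 * x) for x in range(1, 7)]
p_star = [wx / sum(w) for wx in w]
print(f"lam1 = {lam1:.4f}")           # ~0.6296, matching the handout
print([round(p, 4) for p in p_star])  # ~[0.0205, 0.0385, ..., 0.4781]

# Part (b): minimize D(q || p*) subject to E_q g(X) = 0 with
# g = (1, 0, 0, 0, 0, -2).  The minimizer is the exponential tilt
# q*(x) ~ p*(x) exp(mu * g(x)); imposing q(1) = 2 q(6) gives the
# closed form exp(3 mu) = 2 p*(6) / p*(1).
mu = math.log(2 * p_star[5] / p_star[0]) / 3
g = [1, 0, 0, 0, 0, -2]
t = [p * math.exp(mu * gx) for p, gx in zip(p_star, g)]
q_star = [tx / sum(t) for tx in t]
print([round(q, 4) for q in q_star])
```

The resulting q* satisfies q*(1) = 2 q*(6) exactly, and by the conditional limit theorem it is (asymptotically) the conditional distribution of X_1 given the observation.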
This note was uploaded on 10/05/2011 for the course EE 376B at Stanford.