EE 376B/Stat 376B                                         Handout #10
Information Theory                          Thursday, April 27, 2006
Prof. T. Cover

Solutions to Homework Set #2

1. Maximum entropy with marginals.
   What is the maximum entropy probability mass function p(x, y) with the
   following marginals? You may wish to guess and verify a more general result.

              y1      y2      y3
     x1      p11     p12     p13     1/2
     x2      p21     p22     p23     1/4
     x3      p31     p32     p33     1/4
             2/3     1/6     1/6

   Solution: Maximum entropy with marginals.
   Given the marginal distributions of X and Y, the entropies H(X) and H(Y)
   are fixed. We may write

       H(X, Y) = H(X) + H(Y|X) <= H(X) + H(Y),                    (1)

   with equality if and only if X and Y are independent. Hence the maximum
   value of H(X, Y) is H(X) + H(Y), and it is attained by choosing the joint
   distribution to be the product distribution, i.e.,

              y1      y2      y3
     x1      1/3     1/12    1/12    1/2
     x2      1/6     1/24    1/24    1/4
     x3      1/6     1/24    1/24    1/4
             2/3     1/6     1/6

   This problem can also be solved using the maximum entropy distribution of
   Theorem 11.1.1, taking the r_i(x, y) to be indicator functions on x and y
   for each of the six marginal constraints, and recognizing that the
   solution is the product distribution.
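As a quick numerical check of the argument above, the following sketch (not part of the original solution; the helper `entropy` and the perturbed joint `other` are illustrative choices) verifies that the product distribution achieves H(X) + H(Y), and that perturbing the joint while preserving the marginals strictly lowers the entropy:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, with the convention 0 log 0 = 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Marginals from the problem statement.
px = np.array([1/2, 1/4, 1/4])
py = np.array([2/3, 1/6, 1/6])

# Product (independent) joint distribution: the claimed maximizer.
joint = np.outer(px, py)

# H(X,Y) for the product distribution equals H(X) + H(Y).
assert np.isclose(entropy(joint), entropy(px) + entropy(py))

# Any other joint with the same marginals has strictly lower entropy.
# Shift a little mass inside a 2x2 block; row and column sums are preserved.
other = joint.copy()
eps = 1/24
other[0, 0] += eps; other[0, 1] -= eps
other[1, 0] -= eps; other[1, 1] += eps
assert np.allclose(other.sum(axis=1), px)
assert np.allclose(other.sum(axis=0), py)

print(entropy(joint))                    # maximum entropy H(X) + H(Y)
print(entropy(other) < entropy(joint))   # True: dependence lowers H(X,Y)
```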