EE 670: Homework #1 Solution
Prof. Uf Tureli, Stevens Institute of Technology
email/tel/fax: [email protected], 201.216.5603/8246
Note: Cover and Thomas, Q. 2.5, 2.11, 2.19, 2.23, 3.3

1. Question: Entropy of functions of a random variable. Let X be a random variable. Show that the entropy of a function of X is less than or equal to the entropy of X by justifying the following steps:

   H(X, g(X)) = H(X) + H(g(X) | X) = H(X);
   H(X, g(X)) = H(g(X)) + H(X | g(X)) >= H(g(X)).   (1)

Solution: Entropy of functions of a random variable.
(a) H(X, g(X)) = H(X) + H(g(X) | X) by the chain rule for entropies.
(b) H(g(X) | X) = 0, since for any particular value of X, g(X) is fixed, and therefore H(g(X) | X) = sum_x p(x) H(g(X) | X = x) = sum_x p(x) * 0 = 0.
(c) H(X, g(X)) = H(g(X)) + H(X | g(X)) by the chain rule.
(d) H(X | g(X)) >= 0, with equality iff X is a function of g(X), i.e., g(.) is one-to-one. Hence H(X, g(X)) >= H(g(X)).
Combining parts (b) and (d), we obtain H(X) >= H(g(X)). (A numeric check of this inequality appears in the first sketch below.)

2. Question: Average entropy. Let H(p) = -p log2 p - (1-p) log2 (1-p) be the binary entropy function.
(a) Evaluate H(1/4) using the fact that log2 3 ~= 1.584.

Solution: We can generate two bits of information by picking one of four equally likely alternatives. This selection can be made in two steps. First, we decide whether the first outcome occurs; since this has probability 1/4, this step generates H(1/4) bits of information. If the first outcome does not occur (probability 3/4), we then pick one of the three remaining equally likely alternatives, which generates log2 3 bits. Hence

   H(1/4) + (3/4) log2 3 = 2,

so H(1/4) = 2 - (3/4) log2 3 ~= 2 - (3/4)(1.584) = 0.812 bits. (A quick numeric check appears in the second sketch below.)
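As a sanity check on Problem 1 (not part of the original assignment), here is a minimal Python sketch verifying H(g(X)) <= H(X) for one concrete case. The distribution p_X and the collapsing function g below are arbitrary illustrative choices, not taken from the problem:

    # Numeric check: for a deterministic g, H(g(X)) <= H(X),
    # with strict inequality when g is not one-to-one.
    from collections import defaultdict
    from math import log2

    def entropy(dist):
        """Shannon entropy (in bits) of a dict mapping outcomes to probabilities."""
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    # X takes four values; g merges two of them, so g is not one-to-one.
    p_X = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}
    g = lambda x: min(x, 2)          # g(2) = g(3) = 2

    # Push the distribution of X forward through g to get the distribution of g(X).
    p_gX = defaultdict(float)
    for x, p in p_X.items():
        p_gX[g(x)] += p

    print(entropy(p_X))         # H(X) = 1.75 bits
    print(entropy(dict(p_gX)))  # H(g(X)) = H(0.5, 0.25, 0.25) = 1.5 bits <= 1.75

Because g merges the outcomes 2 and 3, some information about X is lost, which is exactly why H(X | g(X)) > 0 and the inequality in part (d) is strict here.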
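Similarly, a quick Python check of Problem 2(a), again illustrative rather than part of the original solution, confirms that the direct definition and the two-step derivation give the same value:

    # Numeric check: H(1/4) from the definition and from 2 - (3/4) log2(3).
    from math import log2

    def binary_entropy(p):
        """H(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
        return -p * log2(p) - (1 - p) * log2(1 - p)

    print(binary_entropy(0.25))   # 0.8112... bits, from the definition
    print(2 - 0.75 * log2(3))     # same value, matching the derivation
    print(2 - 0.75 * 1.584)       # 0.812, using the stated approximation log2 3 ~= 1.584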