EE376B/Stat 376B                                        Handout #5
Information Theory                          Tuesday, April 12, 2011
Prof. T. Cover

Solutions to Homework Set #1

1. Differential entropy.
   Evaluate the differential entropy h(X) = -\int f \ln f for the following:

   (a) The Laplace density, f(x) = \frac{1}{2}\lambda e^{-\lambda |x|}. Relate this to the entropy of the exponential density \lambda e^{-\lambda x}, x \ge 0.

   (b) The sum of X_1 and X_2, where X_1 and X_2 are independent normal random variables with means \mu_i and variances \sigma_i^2, i = 1, 2.

   Solution: Differential entropy.

   (a) Laplace density. The Laplace density is a two-sided exponential density: it is an equal-weight mixture of the exponential density f(x) = \lambda e^{-\lambda x}, x \ge 0, and its mirror image f(-x), with disjoint supports. Each side therefore contributes the differential entropy of the exponential, and one extra bit is needed to specify the side. Hence

       h\left(\tfrac{1}{2}\lambda e^{-\lambda |x|}\right) = \tfrac{1}{2} h(f(x)) + \tfrac{1}{2} h(f(-x)) + H\left(\tfrac{1}{2}\right)      (1)
                                                          = \log \frac{e}{\lambda} + \log 2 \text{ bits}                                  (2)
                                                          = \log \frac{2e}{\lambda} \text{ bits}.                                         (3)

   (b) Sum of two independent normal distributions. The sum of two independent normal random variables is again normal, so applying the result derived in class for the normal distribution, with X_1 + X_2 \sim N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2),

       h(f) = \tfrac{1}{2} \log 2\pi e (\sigma_1^2 + \sigma_2^2) \text{ bits}.      (4)

2. Maximum entropy.
   A die comes up 6 twice as often as it comes up 1. What is the maximum-entropy distribution (p_1, p_2, \ldots, p_6)?

   Solution: We have the constraint p_6 = 2 p_1, i.e. p_6 - 2 p_1 = 0. This can be written as the constraint

       E f(X) = 0,      (5)

   where

       f(x) = \begin{cases} 1, & x = 6, \\ -2, & x = 1, \\ 0, & \text{otherwise.} \end{cases}

   The maximum-entropy distribution has the form p^*(x) = \exp(\lambda + \lambda_1 f(x)), where \lambda and \lambda_1 are determined by the constraints \sum_x p(x) = 1 and E f(X) = 0. From eqn. (5),

       \exp(\lambda + \lambda_1) - 2 \exp(\lambda - 2\lambda_1) = 0
       \exp(\lambda + \lambda_1) = 2 \exp(\lambda - 2\lambda_1)
       \exp(3\lambda_1) = 2
       \lambda_1 = \tfrac{1}{3} \ln 2.

   Finding \lambda is equivalent to normalizing the distribution. Thus

       p^*(x) = \begin{cases}
           \dfrac{\exp(\lambda_1)}{\exp(\lambda_1) + \exp(-2\lambda_1) + 4}, & x = 6, \\
           \dfrac{\exp(-2\lambda_1)}{\exp(\lambda_1) + \exp(-2\lambda_1) + 4}, & x = 1, \\
           \dfrac{1}{\exp(\lambda_1) + \exp(-2\lambda_1) + 4}, & \text{otherwise,}
       \end{cases}

   which satisfies p_6 = 2 p_1 as required. Numerical sanity checks of Problems 1(a), 2, and 3 are sketched after Problem 3 below.

3. Maximum entropy of atmosphere.
   Maximize h(Z, V_x, V_y, V_z), Z \ge 0, (V_x, V_y, V_z) \in \mathbb{R}^3, subject to the energy constraint

       E\left[\tfrac{1}{2} m \|V\|^2 + m g Z\right] = E_0.

   Show that the resulting distribution yields

       E\left[\tfrac{1}{2} m \|V\|^2\right] = \tfrac{3}{5} E_0, \qquad E[m g Z] = \tfrac{2}{5} E_0.

   Thus 2/5 of the energy is stored in the potential field, regardless of its strength g. ...
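As a quick numerical sanity check of Problems 1(a) and 2, the following minimal Python sketch estimates the differential entropy of the Laplace density by Monte Carlo as E[-log2 f(X)] and compares it with the closed form log(2e/lambda), then constructs the maximum-entropy die pmf and verifies that it sums to one with p_6 = 2 p_1. The rate lambda = 1.5, the sample size, and the random seed are arbitrary illustrative choices, not values from the handout.

import numpy as np

rng = np.random.default_rng(0)

# Problem 1(a): differential entropy of the Laplace density.
lam = 1.5          # illustrative rate parameter (assumption, not from the handout)
n = 1_000_000      # Monte Carlo sample size
# numpy parameterizes the Laplace by scale b, where f(x) = (1/2b) exp(-|x|/b); here b = 1/lam.
x = rng.laplace(loc=0.0, scale=1.0 / lam, size=n)
h_mc = np.mean(-np.log2(0.5 * lam * np.exp(-lam * np.abs(x))))  # estimate of E[-log2 f(X)]
h_closed = np.log2(2 * np.e / lam)                              # log(2e/lam) bits
print(f"Laplace h: Monte Carlo {h_mc:.4f} bits, closed form {h_closed:.4f} bits")

# Problem 2: maximum-entropy die with p6 = 2 p1.
lam1 = np.log(2) / 3                       # lambda_1 = (1/3) ln 2
Z = np.exp(lam1) + np.exp(-2 * lam1) + 4   # normalizer; faces 2..5 contribute exp(0) = 1 each
p = np.full(6, 1.0 / Z)                    # p[k-1] = P(die shows k)
p[5] = np.exp(lam1) / Z                    # p6
p[0] = np.exp(-2 * lam1) / Z               # p1
print("die pmf:", np.round(p, 4))
print("sum =", round(p.sum(), 6), " p6/p1 =", round(p[5] / p[0], 6))  # expect 1 and 2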
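The claimed 3/5 and 2/5 fractions in Problem 3 can also be checked by simulation, under the assumption that the maximizing density has the usual exponential-family (Boltzmann) form f(z, v) proportional to exp(-beta (1/2 m ||v||^2 + m g z)) for z >= 0, v in R^3; this form is assumed here, not derived. Under that assumption each velocity component is N(0, 1/(beta m)) and Z is exponential with mean 1/(beta m g). The constants m, g, beta, the sample size, and the seed below are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(1)

# Illustrative constants (assumptions, not values from the handout).
m, g, beta = 2.0, 9.8, 0.7
n = 1_000_000

# Sample from the assumed Boltzmann-form density:
# velocity components are N(0, 1/(beta*m)); height Z is exponential with mean 1/(beta*m*g).
v = rng.normal(0.0, np.sqrt(1.0 / (beta * m)), size=(n, 3))
z = rng.exponential(scale=1.0 / (beta * m * g), size=n)

kinetic = 0.5 * m * np.sum(v**2, axis=1)   # (1/2) m ||V||^2
potential = m * g * z                      # m g Z
total = np.mean(kinetic) + np.mean(potential)

print("kinetic / total   =", round(np.mean(kinetic) / total, 4))    # expect 3/5 = 0.6
print("potential / total =", round(np.mean(potential) / total, 4))  # expect 2/5 = 0.4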