Harvard SEAS ES250 – Information Theory
Homework 5 (Due: Nov. 20, 2007)

1. Suppose that (X, Y, Z) are jointly Gaussian and that X → Y → Z forms a Markov chain. Let X and Y have correlation coefficient ρ1, and let Y and Z have correlation coefficient ρ2. Find I(X; Z).

2. Evaluate the differential entropy h(X) = -∫ f ln f for the following cases:
(a) The Laplace density, f(x) = (1/2) λ e^(-λ|x|). Relate this to the entropy of the exponential density λ e^(-λx), x ≥ 0.
(b) The sum of X1 and X2, where X1 and X2 are independent normal random variables with means μi and variances σi², i = 1, 2.

3. We wish to show that any density f0 can be considered to be a maximum entropy density. Let f0(x) be a density, and consider the problem of maximizing h(f) subject to the constraint ∫ f(x) r(x) dx = α, where r(x) = ln f0(x). Show that there is a choice of α, α = α0, such that the maximizing distribution is f*(x) = f0(x). Thus f0(x) is indeed a maximum entropy density under the constraint ∫ f ln f0 = α0.

4. Let Y = X1 + X2. Find the maximum entropy (over all distributions on X1 and X2) of Y under the constraints E[X1²] = P1 and E[X2²] = P2,
(a) if X1 and X2 are independent;
(b) if X1 and X2 are allowed to be dependent.

5. Let the input random variable X to a channel be uniformly distributed over the interval -1/2 ≤ x ≤ +1/2. Let the output of the channel be Y = X + Z, where the noise random variable Z is uniformly distributed over the interval -a/2 ≤ z ≤ +a/2.
(a) Find I(X; Y) as a function of a.
(b) For a = 1, find the capacity of the channel when the input X is peak-limited; that is, the range of X is limited to -1/2 ≤ x ≤ +1/2. What probability distribution on X maximizes the mutual information I(X; Y)?
(c) [Optional] Find the capacity of the channel for all values of a, again assuming that the range of X is limited to -1/2 ≤ x ≤ +1/2.
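A hint for checking problem 1: for two jointly Gaussian variables with correlation coefficient ρ, the mutual information is -½ ln(1 - ρ²) nats, and under the Markov chain X → Y → Z one can argue that corr(X, Z) factors as ρ1 ρ2 (a claim you should justify in your solution). A minimal Python sanity check of the resulting candidate answer:

```python
import math

def gaussian_mi(rho):
    """Mutual information (nats) between two jointly Gaussian
    random variables with correlation coefficient rho."""
    return -0.5 * math.log(1.0 - rho * rho)

# Assumption to verify: under X -> Y -> Z, corr(X, Z) = rho1 * rho2,
# so a candidate answer for problem 1 is gaussian_mi(rho1 * rho2).
rho1, rho2 = 0.8, 0.5
candidate = gaussian_mi(rho1 * rho2)

# Sanity checks: MI vanishes when either link is uncorrelated, and by
# data processing it cannot exceed the MI across the weaker link.
assert gaussian_mi(0.0) == 0.0
assert candidate <= min(gaussian_mi(rho1), gaussian_mi(rho2))
```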

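For problem 2(a), the integral separates as h(f) = -ln(λ/2) + λ E|X|, and E|X| = 1/λ for the Laplace density. A short numeric sketch of how the result relates to the exponential entropy (formulas in nats; derive both before trusting them):

```python
import math

def laplace_entropy(lam):
    # h = -ln(lam/2) + lam * E|X|, with E|X| = 1/lam,
    # giving 1 + ln(2/lam) nats.
    return 1.0 + math.log(2.0 / lam)

def exponential_entropy(lam):
    # Standard result for the density lam * exp(-lam * x), x >= 0:
    # h = 1 - ln(lam) nats.
    return 1.0 - math.log(lam)

# The Laplace density is a two-sided (symmetrized) exponential; its
# entropy exceeds the exponential's by exactly ln 2 -- one extra nat's
# worth of uncertainty, ln 2, for the sign of X.
lam = 1.7
gap = laplace_entropy(lam) - exponential_entropy(lam)
assert abs(gap - math.log(2.0)) < 1e-12
```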

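For problem 4, the key fact is that the zero-mean Gaussian maximizes differential entropy under a second-moment constraint, so the problem reduces to finding the largest achievable E[Y²]. A hedged helper for checking candidate variances (P1 + P2 in the independent case, and up to (√P1 + √P2)² when dependence is allowed — both claims are yours to verify):

```python
import math

def gaussian_diff_entropy(var):
    """Differential entropy (nats) of a Gaussian with variance var:
    0.5 * ln(2 * pi * e * var)."""
    return 0.5 * math.log(2.0 * math.pi * math.e * var)

P1, P2 = 3.0, 5.0
# Candidate second moments of Y = X1 + X2 under E[X1^2] = P1, E[X2^2] = P2
# (assumptions to justify in the solution, not established facts here):
var_independent = P1 + P2                            # cross term vanishes
var_dependent = (math.sqrt(P1) + math.sqrt(P2)) ** 2  # fully correlated

# Allowing dependence can only enlarge E[Y^2], hence the entropy bound.
assert gaussian_diff_entropy(var_dependent) >= gaussian_diff_entropy(var_independent)
```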