Harvard SEAS ES250 – Information Theory
Homework 5 (Due Date: Nov. 20, 2007)

1. Suppose that $(X, Y, Z)$ are jointly Gaussian and that $X \to Y \to Z$ forms a Markov chain. Let $X$ and $Y$ have correlation coefficient $\rho_1$ and let $Y$ and $Z$ have correlation coefficient $\rho_2$. Find $I(X; Z)$.

2. Evaluate the differential entropy $h(X) = -\int f \ln f$ for the following cases:
   (a) The Laplace density, $f(x) = \frac{1}{2}\lambda e^{-\lambda|x|}$. Relate this to the entropy of the exponential density $\lambda e^{-\lambda x}$, $x \ge 0$. (A numerical sketch for cross-checking your answer appears after the problem list.)
   (b) The sum of $X_1$ and $X_2$, where $X_1$ and $X_2$ are independent normal random variables with means $\mu_i$ and variances $\sigma_i^2$, $i = 1, 2$.

3. We wish to show that any density $f_0$ can be considered to be a maximum entropy density. Let $f_0(x)$ be a density, and consider the problem of maximizing $h(f)$ subject to the constraint
      $\int f(x)\, r(x)\, dx = \alpha,$
   where $r(x) = \ln f_0(x)$. Show that there is a choice of $\alpha$, $\alpha = \alpha_0$, such that the maximizing distribution is $f^*(x) = f_0(x)$. Thus $f_0(x)$ is indeed a maximum entropy density under the constraint $\int f \ln f_0 = \alpha_0$.

4. Let $Y = X_1 + X_2$. Find the maximum entropy (over all distributions on $X_1$ and $X_2$) of $Y$ under the constraints $E[X_1^2] = P_1$, $E[X_2^2] = P_2$:
   (a) if $X_1$ and $X_2$ are independent;
   (b) if $X_1$ and $X_2$ are allowed to be dependent.
5. Let the input random variable $X$ to a channel be uniformly distributed over the interval $-\frac{1}{2} \le x \le +\frac{1}{2}$. Let the output of the channel be $Y = X + Z$, where the noise random variable $Z$ is uniformly distributed over the interval $-\frac{a}{2} \le z \le +\frac{a}{2}$ and is independent of $X$.
   (a) Find $I(X; Y)$ as a function of $a$. (See the second numerical sketch following the problem list.)
   (b) For $a = 1$, find the capacity of the channel when the input $X$ is peak-limited; that is, the range of $X$ is limited to $-\frac{1}{2} \le x \le +\frac{1}{2}$. What probability distribution on $X$ maximizes the mutual information $I(X; Y)$?
   (c) [Optional] Find the capacity of the channel for all values of $a$, again assuming that the range of $X$ is limited to $-\frac{1}{2} \le x \le +\frac{1}{2}$.
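
Numerical sketch for Problem 2(a). This is an illustrative cross-check, not part of the assignment: once you derive closed forms for the two entropies, a quadrature should reproduce them. It assumes NumPy and SciPy are available; the test value of lambda is arbitrary.

# Cross-check for Problem 2(a): differential entropy (in nats) of the
# Laplace density f(x) = (1/2)*lam*exp(-lam*|x|) versus the exponential
# density lam*exp(-lam*x), x >= 0. Illustrative only; lam = 1.7 is arbitrary.
import numpy as np
from scipy.integrate import quad

lam = 1.7

def laplace_pdf(x):
    return 0.5 * lam * np.exp(-lam * abs(x))

def exp_pdf(x):
    return lam * np.exp(-lam * x)

# h(f) = -integral of f*ln(f); integrate over a range wide enough that the
# tails are negligible. points=[0.0] tells quad about the kink of |x| at 0.
h_laplace, _ = quad(lambda x: -laplace_pdf(x) * np.log(laplace_pdf(x)),
                    -50, 50, points=[0.0])
h_exp, _ = quad(lambda x: -exp_pdf(x) * np.log(exp_pdf(x)), 0, 50)

print(f"h(Laplace)     = {h_laplace:.6f} nats")
print(f"h(exponential) = {h_exp:.6f} nats")
print(f"difference     = {h_laplace - h_exp:.6f} nats")

Running this for a few values of lambda suggests how the two entropies are related; the problem asks you to establish that relation analytically.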
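
Numerical sketch for Problem 5(a), under the same caveats (assumes NumPy/SciPy; the function names and test values of $a$ are my own). It relies on the standard decomposition $I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(Z)$, valid here because $Z$ is independent of $X$, and evaluates $h(Y)$ from the convolution of the two uniform densities.

# Cross-check for Problem 5(a): I(X;Y) in nats for X ~ Unif[-1/2, 1/2],
# Z ~ Unif[-a/2, a/2] independent of X, and Y = X + Z.
# Uses I(X;Y) = h(Y) - h(Z), with h(Z) = ln(a) for uniform Z.
import numpy as np
from scipy.integrate import quad

def mutual_information(a):
    def f_Y(y):
        # Convolution of the two uniform densities: the overlap of the noise
        # window [y - a/2, y + a/2] with the input support [-1/2, 1/2], over a.
        overlap = min(0.5, y + a / 2) - max(-0.5, y - a / 2)
        return max(overlap, 0.0) / a

    def integrand(y):
        p = f_Y(y)
        return -p * np.log(p) if p > 0.0 else 0.0

    # The density of Y is piecewise linear; tell quad where its kinks are.
    kinks = sorted({-abs(1 - a) / 2, 0.0, abs(1 - a) / 2})
    h_Y, _ = quad(integrand, -(1 + a) / 2, (1 + a) / 2, points=kinks)
    return h_Y - np.log(a)  # subtract h(Z) = ln(a)

for a in (0.5, 1.0, 2.0):
    print(f"a = {a}: I(X;Y) ~ {mutual_information(a):.6f} nats")

Note that this sketch only evaluates the uniform input of part (a); for part (b) you would compare the $a = 1$ value against other peak-limited input distributions.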