... of Y under the constraints E[X_1^2] = P_1 and E[X_2^2] = P_2,
(a) if X_1 and X_2 are independent;
(b) if X_1 and X_2 are allowed to be dependent.

5. Let the input random variable X to a channel be uniformly distributed over the interval -1/2 <= x <= +1/2. Let the output of the channel be Y = X + Z, where the noise random variable Z is uniformly distributed over the interval -a/2 <= z <= +a/2.
(a) Find I(X; Y) as a function of a.
(b) For a = 1, find the capacity of the channel when the input X is peak-limited; that is, the range of X is limited to -1/2 <= x <= +1/2. What probability distribution on X maximizes the mutual information I(X; Y)?
(c) [Optional] Find the capacity of the channel for all values of a, again assuming that the range of X is limited to -1/2 <= x <= +1/2.
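For part 5(a) with 0 < a <= 1, one way to sanity-check a hand derivation is numerically: the density of Y = X + Z is a trapezoid (flat at height 1 for |y| <= (1-a)/2, with two linear ramps of width a), so h(Y) can be integrated directly and I(X;Y) = h(Y) - h(Z), where h(Z) = ln a nats for Z uniform on [-a/2, a/2]. The sketch below is a numerical check under these assumptions, not the intended analytic answer; the function names are ours.

```python
import math

def h_trapezoid_nats(a: float, n: int = 200_000) -> float:
    """Differential entropy (nats) of the trapezoidal density of
    Y = X + Z, X ~ U(-1/2, 1/2), Z ~ U(-a/2, a/2), assuming 0 < a <= 1.
    The flat middle has height 1, so -f*ln(f) = 0 there; only the two
    symmetric linear ramps, f(t) = t/a for t in [0, a], contribute."""
    dt = a / n
    s = 0.0
    # midpoint rule for -∫_0^a (t/a) ln(t/a) dt (one ramp)
    for i in range(n):
        t = (i + 0.5) * dt
        f = t / a
        s -= f * math.log(f) * dt
    return 2.0 * s  # two symmetric ramps

def mutual_info_nats(a: float) -> float:
    """I(X;Y) = h(Y) - h(Z), with h(Z) = ln(a) for Z ~ U(-a/2, a/2)."""
    return h_trapezoid_nats(a) - math.log(a)

# At a = 1, Y is triangular on [-1, 1]: h(Y) = 1/2 nat, h(Z) = 0,
# so the estimate should be close to 0.5 nats.
print(mutual_info_nats(1.0))
print(mutual_info_nats(0.5))
```

Each ramp integrates in closed form to a/4, so the numerical h(Y) should agree with a/2 nats, giving I(X;Y) = a/2 - ln a nats for a <= 1; the code only confirms that arithmetic.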