Problem Set 4
MATH 778C, Spring 2009, Cooper
Expiration: Thursday April 30

You are awarded up to 25 points per problem, 5 points for submitting solutions in \LaTeX, and 5 points per solution that is used for the answer key. All answers must be fully rigorous: do not assume anything that you are not sure everyone else in the class knew prior to Day 1 of this class. However, you may cite without proof any result proven in class.

1. Two $n$-sided dice with sides labeled 1 through $n$ are rolled, resulting in the i.i.d. random variables $X$ and $Y$. Let $Z = X + Y$. Compute $H(X)$, $H(X,Y)$, $H(Z)$, $H(X \mid Z)$, $I(X;Y)$, and $I(X;Y \mid Z)$. (Note that this last quantity is not $I(X;(Y \mid Z))$.)

Solution (Ser-Wei Fu): Let $f(k) = \sum_{i=1}^{k} i \log i$. Observe that $X$ and $Y$ are uniform random variables, and the entropy of a uniform random variable is the logarithm of the size of its support. Hence
$$H(X) = \log n, \qquad H(X,Y) = 2\log n.$$

For $H(Z)$, observe that the number of pairs $(x,y)$ with a given sum $z$ increases up to $z = n+1$ and decreases afterward, so $p(z) = (z-1)/n^2$ for $2 \le z \le n+1$ and $p(z) = (2n-z+1)/n^2$ for $n+1 \le z \le 2n$. Then
\begin{align*}
H(Z) &= -\sum_z p(z)\log p(z)
      = -\sum_{z=2}^{n} p(z)\log p(z) - \sum_{z=n+1}^{2n} p(z)\log p(z) \\
     &= -\sum_{z=2}^{n} \frac{z-1}{n^2}\log\frac{z-1}{n^2}
        - \sum_{z=n+1}^{2n} \frac{2n-z+1}{n^2}\log\frac{2n-z+1}{n^2} \\
     &= -\sum_{i=1}^{n-1} \frac{i}{n^2}\log\frac{i}{n^2}
        - \sum_{i=1}^{n} \frac{i}{n^2}\log\frac{i}{n^2} \\
     &= 2\sum_{i=1}^{n-1} \frac{i}{n^2}\log n + 2\sum_{i=1}^{n} \frac{i}{n^2}\log n
        - \frac{1}{n^2}\bigl(f(n-1)+f(n)\bigr) \\
     &= 2\log n - \frac{1}{n^2}\bigl(f(n-1)+f(n)\bigr),
\end{align*}
where the last step uses $\sum_{i=1}^{n-1} i + \sum_{i=1}^{n} i = n^2$.

For $H(X \mid Z)$, observe that $(X \mid Z = z)$ is a uniform random variable on a support of size $z-1$ for $z \le n+1$ and $2n-z+1$ for $z \ge n+1$:
\begin{align*}
H(X \mid Z) &= \sum_z p(z)\, H(X \mid Z=z)
             = \sum_{z=2}^{n} p(z)\, H(X \mid Z=z) + \sum_{z=n+1}^{2n} p(z)\, H(X \mid Z=z) \\
            &= \frac{1}{n^2}\left(\sum_{i=1}^{n-1} i\log i + \sum_{i=1}^{n} i\log i\right)
             = \frac{1}{n^2}\bigl(f(n-1)+f(n)\bigr).
\end{align*}

Since $X$ and $Y$ are independent, $I(X;Y) = 0$.

Finally,
$$I(X;Y \mid Z) = H(X \mid Z) - H(X \mid Y,Z) = \frac{1}{n^2}\bigl(f(n-1)+f(n)\bigr),$$
where $H(X \mid Y,Z) = 0$ because given $y$ and $z$ the value of $x$ is fixed.
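As a sanity check (not part of the submitted solution), the closed forms above can be verified numerically for small $n$. The sketch below enumerates the $n^2$ equally likely outcomes, computes $H(Z)$ and $H(X \mid Z)$ by brute force (using $H(X \mid Z) = H(X,Z) - H(Z)$ and the fact that $(X,Z)$ is a bijective function of $(X,Y)$, so $H(X,Z) = 2\log n$), and compares against the formulas; entropies are in bits.

```python
import math
from collections import Counter
from itertools import product

def H(dist):
    """Shannon entropy (base 2) of a distribution given as {value: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def f(k):
    # f(k) = sum_{i=1}^{k} i * log i, as defined in the solution (base 2 here)
    return sum(i * math.log2(i) for i in range(1, k + 1))

n = 6
outcomes = list(product(range(1, n + 1), repeat=2))  # (x, y), each with prob 1/n^2

# Brute-force H(Z) for Z = X + Y
pz = Counter(x + y for x, y in outcomes)
Hz = H({z: c / n**2 for z, c in pz.items()})

# H(X | Z) = H(X, Z) - H(Z), and H(X, Z) = H(X, Y) = 2 log n
Hx_given_z = 2 * math.log2(n) - Hz

# Closed forms from the solution
Hz_formula = 2 * math.log2(n) - (f(n - 1) + f(n)) / n**2
Hx_given_z_formula = (f(n - 1) + f(n)) / n**2

assert abs(Hz - Hz_formula) < 1e-9
assert abs(Hx_given_z - Hx_given_z_formula) < 1e-9
# I(X;Y|Z) = H(X|Z) - H(X|Y,Z), and H(X|Y,Z) = 0, so I(X;Y|Z) = H(X|Z)
print(Hz, Hx_given_z)
```

The same loop with `n` varied confirms the formulas for any die size you care to try.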
Alternatively (the way we originally looked at this):
\begin{align*}
I(X;Y \mid Z) &= I(X;Y,Z) - I(X;Z) \\
              &= H(X) + H(Y,Z) - H(X,Y,Z) - \bigl(H(X) - H(X \mid Z)\bigr) \\
              &= H(Y,Z) - H(X,Y,Z) + H(X \mid Z) \\
              &= H(Z) + H(Y \mid Z) + H(X \mid Z) - H(X,Y,Z) \\
              &= \frac{1}{n^2}\bigl(f(n-1)+f(n)\bigr).
\end{align*}
Notice that $H(X,Y,Z) = 2\log n$, since $(X,Y,Z)$ is another uniform random variable (determined by the uniform pair $(X,Y)$), and $H(Y \mid Z) = H(X \mid Z)$ by symmetry.

2. Consider a channel with binary inputs that has both erasures and errors. Let the probability of error be $\epsilon$ and the probability of erasure be $\alpha$. (Hence, the probability of correct transmission is $1 - \epsilon - \alpha$.) What is the capacity of this channel?

Solution (Aaron Dutle): Suppose $P(0) = p$ and $P(1) = 1-p$ is a maximizing probability distribution for $I(X;Y)$. Since the channel is symmetric with respect to the labels $0, 1$, we get that the distribution $P(0) = 1-p$ and $P(1) = p$ is also a maximizing distribution. Since $I(X;Y)$ is always concave in $p(x)$, we see that the average of these two distributions (which is $p(0) = p(1) = 1/2$) is also a maximizing distribution....
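The preview cuts off after establishing that the uniform input maximizes $I(X;Y)$, so the capacity is $I(X;Y)$ evaluated at $p = 1/2$. That value can be checked numerically; in the sketch below the symbol names for the error and erasure probabilities are my choice (the original's symbols were lost in extraction), and a grid scan over $p$ double-checks that $p = 1/2$ is indeed maximizing.

```python
import math

def h(probs):
    # Entropy in bits of a list of probabilities
    return -sum(q * math.log2(q) for q in probs if q > 0)

def mutual_information(p, err, erase):
    """I(X;Y) for input P(X=0)=p on the binary error-and-erasure channel:
    the input is flipped w.p. err, erased w.p. erase, correct otherwise."""
    ok = 1 - err - erase
    # Output distribution over {0, 1, erasure}
    py = [p * ok + (1 - p) * err,   # Y = 0
          p * err + (1 - p) * ok,   # Y = 1
          erase]                    # Y = erasure
    # H(Y|X) is the same for either input symbol by symmetry
    return h(py) - h([ok, err, erase])

err, erase = 0.05, 0.2
capacity = mutual_information(0.5, err, erase)

# p = 1/2 should maximize I(X;Y) over all binary input distributions
assert all(mutual_information(p / 100, err, erase) <= capacity + 1e-12
           for p in range(1, 100))
print(capacity)
```

As sanity checks, setting the error probability to 0 recovers the binary erasure channel capacity $1 - \alpha$, and setting the erasure probability to 0 recovers the binary symmetric channel capacity $1 - H_b(\epsilon)$.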