hw1-2011: ECE 563, Fall 2011, Homework I

ECE 563, Fall 2011
Homework I
Issued: August 30th, 2011
Due: September 13th, 2011

1. Entropy and Majorization (Entropy after Mixing Symbols). The interpretation of entropy as a measure of uncertainty suggests that "more uniform" distributions have larger entropy. For two distributions P and Q on X with |X| = n, we call P "more uniform" than Q, written P > Q, if for the non-increasing orderings p_1 >= p_2 >= ... >= p_n and q_1 >= q_2 >= ... >= q_n of their probabilities it holds that

    p_1 + ... + p_k <= q_1 + ... + q_k   for every k = 1, ..., n.

Show that P > Q implies H(P) >= H(Q). As a special case, consider what happens to the entropy when two outcomes (say, a and b) of a random variable are "mixed", i.e., the probabilities of symbols a and b are both replaced by their average (p(a) + p(b))/2.

2. Mutual information of heads and tails.
a. Consider a fair coin flip. What is the mutual information between the top side and the bottom side of the coin?
b. A 6-sided fair die is rolled. What is the mutual information between the top side and the bottom side?
c. What is the mutual information between the top side and the front face (the side most facing you)?
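To build intuition for the mixing claim in Problem 1, a minimal numerical sketch (the distribution P below is a hypothetical example, not part of the problem statement) comparing the entropy before and after averaging two probabilities:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical example distribution on three symbols.
P = [0.5, 0.3, 0.2]

# "Mix" the first two outcomes: replace p(a) and p(b) by their average.
avg = (P[0] + P[1]) / 2
P_mixed = [avg, avg, P[2]]

print(entropy(P))        # entropy before mixing
print(entropy(P_mixed))  # entropy after mixing: never smaller
```

Mixing makes the distribution "more uniform" in the majorization sense above, so the entropy cannot decrease; the sketch only illustrates one instance, not the general proof.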
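As a numerical check for parts (a) and (b) of Problem 2, a small sketch that computes I(X;Y) directly from a joint pmf, assuming the standard convention that opposite faces of a die sum to 7:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Fair coin: the bottom side is fully determined by the top side.
coin = {("H", "T"): 0.5, ("T", "H"): 0.5}
print(mutual_information(coin))  # 1 bit

# Fair die: bottom = 7 - top under the opposite-faces-sum-to-7 convention.
die = {(t, 7 - t): 1 / 6 for t in range(1, 7)}
print(mutual_information(die))  # log2(6) ≈ 2.585 bits
```

In both cases one side determines the other, so I(X;Y) = H(X); the check confirms 1 bit for the coin and log2(6) bits for the die.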
