# hw1_soln_fall2011 - Question 1 Entropy and Majorization...

ECE 563, Fall 2011, Homework I Solutions. Parts of the solutions are copied from the solution manual of "Elements of Information Theory" by Cover and Thomas.

## Question 1: Entropy and Majorization. Entropy after Mixing Symbols.

Check the class notes for a complete solution, including the solution for the special case.

## Question 2: Mutual information of heads and tails.

**Part a.** Observe that

I(T; B) = H(B) − H(B | T) = H(B) = log 2 = 1 bit,

since B ~ Bernoulli(1/2) and B is a function of T. Here B and T stand for Bottom and Top respectively.

**Part b.** Observe that the bottom side B is again a function of the top side T, and there are six equally probable possibilities for B. (In fact, B = 7 − T.) Hence

I(T; B) = H(B) − H(B | T) = H(B) = log 6 = 1 + log 3 bits.

**Part c.** Note that, having observed a side F of the cube facing us, there are four equally probable possibilities for the top T. Thus

I(T; F) = H(T) − H(T | F) = log 6 − log 4 = log 3 − 1 bits,

since T has uniform distribution on {1, 2, …, 6}.

## Question 3: Functions of random variables.

Let Y = g(X). Expanding the joint entropy H(X, g(X)) in two ways,

H(X, g(X)) = H(X) + H(g(X) | X) = H(X),

since g(X) is a deterministic function of X, so H(g(X) | X) = 0. On the other hand,

H(X, g(X)) = H(g(X)) + H(X | g(X)) ≥ H(g(X)),

since H(X | g(X)) ≥ 0. Combining the two expansions gives H(g(X)) ≤ H(X).

(a) For n coins, there are 2n + 1 possible situations or "states":

- One of the n coins is heavier.
- One of the n coins is lighter.
- They are all of equal weight.

Each weighing has three possible outcomes: equal, left pan heavier, or right pan heavier. Hence with k weighings there are 3^k possible outcomes, and so we can distinguish between at most 3^k different "states". Hence 2n + 1 ≤ 3^k, or n ≤ (3^k − 1)/2.

Looking at it from an information-theoretic viewpoint, each weighing gives at most log₂ 3 bits of information. There are 2n + 1 possible "states", with a maximum entropy of log₂(2n + 1) bits. Hence in this situation one would require at least log₂(2n + 1) / log₂ 3 weighings to extract enough information to determine the odd coin, which gives the same result as above.

(b) There are many different solutions to this problem, and we will give two.
But perhaps more useful than knowing the solution is knowing how to go about thinking through the problem, and how this relates to information theory. So let's say a few words about the thinking process first, or at least how to start it. Perhaps the first question we are faced with when trying to solve the problem is: "How many coins should we put on each pan in the first weighing?" We can use an information-theoretic analysis to answer this question. Firstly, it's clear that putting a different number of coins on each pan isn't very helpful, so let's put the same number on both. Thus, the most coins we could put on one pan is six, since there are 12 coins in total. Now let's suppose we choose to weigh six coins against six coins. On one hand, it may happen that the pans are balanced, in which case we have solved the problem with merely one weighing. However, it is also possible that the left pan goes down and the right pan goes up. In this case, what information has the weighing given us? We can only conclude that…
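This entropy argument can be made concrete. The sketch below is an illustration of ours, not part of the original solution: assuming the 2n + 1 = 25 "states" of part (a) are equally likely for n = 12 coins, it computes the entropy of the first weighing's outcome when m coins are placed on each pan.

```python
from math import log2

def first_weighing_entropy(m, n=12):
    """Entropy (bits) of the first weighing's outcome with m coins per pan,
    assuming the 2n + 1 'states' of part (a) are equally likely."""
    states = 2 * n + 1
    # The left pan sinks iff a left-pan coin is heavy or a right-pan coin
    # is light: 2m of the states. By symmetry, same for the right pan.
    p_tilt = 2 * m / states
    p_balance = (states - 4 * m) / states
    return -sum(p * log2(p) for p in (p_tilt, p_tilt, p_balance) if p > 0)

for m in range(1, 7):
    print(m, round(first_weighing_entropy(m), 4))
```

Running this shows the entropy peaking at m = 4 (about 1.58 bits, close to the log₂ 3 ≈ 1.585-bit ceiling per weighing), while weighing six against six yields only about 1.20 bits, matching the intuition above that 6-versus-6 is a poor first move.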
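The closed-form answers in Question 2 can likewise be checked numerically. Below is a small sketch (the `mutual_information` helper is ours, not from the solution manual) that computes I(T; B) and I(T; F) for a fair die directly from the joint distributions:

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) in bits, given a joint pmf as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Part (b): top T uniform on {1,...,6}, bottom B = 7 - T.
joint_tb = {(t, 7 - t): 1 / 6 for t in range(1, 7)}

# Part (c): the facing side F can be any face other than the top T or the
# bottom 7 - T, giving 6 * 4 = 24 equally likely (t, f) pairs.
joint_tf = {(t, f): 1 / 24 for t in range(1, 7)
            for f in range(1, 7) if f not in (t, 7 - t)}

print(mutual_information(joint_tb))   # log2(6)  ≈ 2.585 bits
print(mutual_information(joint_tf))   # log2(3) - 1 ≈ 0.585 bits
```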
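Question 3's inequality H(g(X)) ≤ H(X) is also easy to sanity-check numerically. In the sketch below, the choice g(x) = x mod 3 with X uniform on {0, ..., 7} is an arbitrary example of ours:

```python
from collections import Counter
from math import log2

def entropy(probs):
    """H in bits of a distribution given as an iterable of probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

n = 8
h_x = entropy([1 / n] * n)                  # X uniform on {0,...,7}: 3 bits
counts = Counter(x % 3 for x in range(n))   # Y = g(X) = X mod 3 merges symbols
h_y = entropy(c / n for c in counts.values())

print(h_x, h_y)
assert h_y <= h_x                           # H(g(X)) <= H(X)
```

Since g merges several values of X into each value of Y, entropy can only decrease, exactly as the two expansions of H(X, g(X)) in Question 3 predict.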