# EE 376A/Stat 376A Handout #10: Information Theory, Homework Set #2 Solutions


EE 376A/Stat 376A, Information Theory
Handout #10, Tuesday, January 27, 2009
Prof. T. Cover
Solutions prepared by William Wu

**Homework Set #2 Solutions**

**1. Entropy and pairwise independence.**

Let $X, Y, Z$ be three binary Bernoulli($\tfrac{1}{2}$) random variables that are pairwise independent; that is, $I(X;Y) = I(X;Z) = I(Y;Z) = 0$.

(a) Under this constraint, what is the minimum value of $H(X, Y, Z)$?

(b) Give an example achieving this minimum.

(c) Now suppose that $X, Y, Z$ are three random variables, each uniformly distributed over the alphabet $\{1, 2, \ldots, m\}$, and again pairwise independent. What is the minimum value of $H(X, Y, Z)$?

**Solution: Entropy and pairwise independence.**

(a) By pairwise independence and the marginal distributions, $H(X, Y) = H(X) + H(Y) = 2$. Thus

$$
H(X, Y, Z) = H(X, Y) + H(Z \mid X, Y) = H(X) + H(Y) + H(Z \mid X, Y) \geq H(X) + H(Y) = 2,
$$

with equality if and only if $Z$ is a deterministic function of $X$ and $Y$. This minimum is achieved by the example in part (b).

(b) Let $Z = X \oplus Y$, where $\oplus$ denotes XOR. It is easy to check that all the marginal distributions are satisfied, as well as pairwise independence.

(c) Here is one possible solution. Without loss of generality, relabel the alphabet to be $\{0, 1, 2, \ldots, m-1\}$ instead of $\{1, 2, \ldots, m\}$, and let $Z = X + Y \bmod m$. Then

$$
H(X, Y, Z) = H(X) + H(Y) = \log m + \log m = 2 \log m.
$$

One can check that under this construction all the marginal distributions are satisfied and the variables are pairwise independent. For example, to argue pairwise independence between $Z$ and $X$:

$$
I(X; Z) = H(Z) - H(Z \mid X) = \log m - H(X + Y \mid X) = \log m - H(Y \mid X) = \log m - H(Y) = \log m - \log m = 0.
$$

Thus $X$ and $Z$ are independent.

**2. The value of a question.**

Let $X \sim p(x)$, $x = 1, 2, \ldots, m$. We are given a set $S \subseteq \{1, 2, \ldots, m\}$.
We ask whether $X \in S$ and receive the answer

$$
Y = \begin{cases} 1, & \text{if } X \in S, \\ 0, & \text{if } X \notin S. \end{cases}
$$

Suppose $\Pr\{X \in S\} = \alpha$. Find the decrease in uncertainty $H(X) - H(X \mid Y)$. Apparently any set $S$ with a given probability $\alpha$ is as good as any other.

**Solution: The value of a question.**

$$
H(X) - H(X \mid Y) = I(X; Y) = H(Y) - H(Y \mid X) = H(\alpha) - H(Y \mid X) = H(\alpha),
$$

since $Y$ is a deterministic function of $X$, so $H(Y \mid X) = 0$. (Here $H(\alpha)$ denotes the binary entropy function.)

**3. Random questions.**

One wishes to identify a random object $X \sim p(x)$. A question $Q \sim r(q)$ is asked at random according to $r(q)$, resulting in a deterministic answer $A = A(x, q) \in \{a_1, a_2, \ldots\}$. Suppose the object $X$ and the question $Q$ are independent. Then $I(X; Q, A)$ is the uncertainty in $X$ removed by the question-answer pair $(Q, A)$. ...
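The constructions in Problem 1 can be checked numerically by enumerating the joint distribution. The sketch below (plain Python; the alphabet size `m` is an arbitrary choice, and setting `m = 2` recovers the XOR construction of part (b)) verifies that $I(X;Z) = 0$ and that $H(X, Y, Z) = 2 \log m$.

```python
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy in bits of a dict {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

m = 3  # alphabet size; m = 2 recovers the XOR construction of part (b)

# Joint pmf of (X, Y, Z): X, Y independent uniform, Z = X + Y mod m.
joint = {(x, y, (x + y) % m): 1 / m**2 for x, y in product(range(m), repeat=2)}

def marginal(indices):
    """Marginal pmf of the coordinates listed in `indices`."""
    d = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in indices)
        d[key] = d.get(key, 0.0) + p
    return d

# I(X; Z) = H(X) + H(Z) - H(X, Z); zero iff X and Z are independent.
I_xz = entropy(marginal((0,))) + entropy(marginal((2,))) - entropy(marginal((0, 2)))
H_xyz = entropy(joint)

print(f"I(X;Z)    = {I_xz:.6f}")        # ~0: X and Z are pairwise independent
print(f"H(X,Y,Z)  = {H_xyz:.6f} bits")  # equals 2*log2(m)
```

The same loop with the pairs $(0, 1)$ and $(1, 2)$ confirms the other two mutual informations vanish as well.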
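The identity $H(X) - H(X \mid Y) = H(\alpha)$ from Problem 2 can likewise be checked on a concrete distribution. In this sketch the pmf `p` and the question set `S` are arbitrary illustrative choices, not values from the handout.

```python
from math import log2

def H(probs):
    """Shannon entropy in bits of a sequence of probabilities."""
    return -sum(q * log2(q) for q in probs if q > 0)

# Illustrative pmf on {1, 2, 3, 4} and question set S (both arbitrary).
p = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}
S = {1, 3}
alpha = sum(p[x] for x in S)  # Pr{X in S}

# H(X | Y): mixture of the conditional entropies given Y = 1 and Y = 0.
p_in = [p[x] / alpha for x in S]
p_out = [p[x] / (1 - alpha) for x in p if x not in S]
H_X_given_Y = alpha * H(p_in) + (1 - alpha) * H(p_out)

reduction = H(p.values()) - H_X_given_Y  # H(X) - H(X|Y)
binary_entropy = H([alpha, 1 - alpha])   # H(alpha)

print(f"H(X) - H(X|Y) = {reduction:.6f}")
print(f"H(alpha)      = {binary_entropy:.6f}")  # the two agree
```

Changing `p` or `S` changes $\alpha$, but the two printed values always coincide, matching the claim that any set with the same probability is as good as any other.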