# hw1sol - EE 376A Information Theory (Prof. T. Weissman)


EE 376A Information Theory, Prof. T. Weissman
Thursday, January 14, 2010
Homework Set #1 (Due: Thursday, January 21, 2010)

**1. Coin flips.** A fair coin is flipped until the first head occurs. Let $X$ denote the number of flips required.

(a) Find the entropy $H(X)$ in bits. The following expressions may be useful:

$$\sum_{n=0}^{\infty} r^n = \frac{1}{1-r}, \qquad \sum_{n=0}^{\infty} n r^n = \frac{r}{(1-r)^2}.$$

(b) A random variable $X$ is drawn according to this distribution. Find an "efficient" sequence of yes-no questions of the form, "Is $X$ contained in the set $S$?" Compare $H(X)$ to the expected number of questions required to determine $X$.

**Solution:**

(a) The number $X$ of tosses until the first head appears has the geometric distribution with parameter $p = 1/2$: writing $q = 1 - p$,

$$P(X = n) = pq^{n-1}, \qquad n \in \{1, 2, \ldots\}.$$

Hence the entropy of $X$ is

$$
\begin{aligned}
H(X) &= -\sum_{n=1}^{\infty} pq^{n-1} \log\left(pq^{n-1}\right) \\
&= -\left[ \sum_{n=0}^{\infty} pq^{n} \log p + \sum_{n=0}^{\infty} n pq^{n} \log q \right] \\
&= -\frac{p \log p}{1-q} - \frac{pq \log q}{p^2} \\
&= \frac{-p \log p - q \log q}{p} = H(p)/p \text{ bits},
\end{aligned}
$$

where $H(p)$ is the binary entropy function. For $p = 1/2$ this gives $H(X) = H(1/2)/(1/2) = 2$ bits.
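The closed form $H(X) = H(p)/p$ can be sanity-checked numerically by summing the entropy series directly. The sketch below (not part of the assignment; the function names are my own) truncates the infinite sum at a large $n$, which is safe because the tail terms decay geometrically:

```python
import math

def geometric_entropy(p, n_max=10_000):
    """Numerically sum -P(X=n) * log2 P(X=n) for the geometric
    distribution P(X=n) = p * q**(n-1), truncated at n_max terms."""
    q = 1.0 - p
    h = 0.0
    for n in range(1, n_max + 1):
        pn = p * q ** (n - 1)
        if pn > 0.0:  # skip terms that underflow to zero
            h -= pn * math.log2(pn)
    return h

def closed_form(p):
    """The closed form derived above: H(p)/p in bits."""
    q = 1.0 - p
    return (-p * math.log2(p) - q * math.log2(q)) / p

# For a fair coin (p = 1/2), both should give 2 bits.
print(geometric_entropy(0.5))
print(closed_form(0.5))
```

The agreement between the truncated series and $H(p)/p$ holds for any $0 < p < 1$, not just the fair-coin case.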
