

EE 376A Information Theory                                    Prof. T. Weissman
Thursday, January 14, 2010

Homework Set #1 (Due: Thursday, January 21, 2010)

1. Coin flips. A fair coin is flipped until the first head occurs. Let X denote the number of flips required.

   (a) Find the entropy H(X) in bits. The following expressions may be useful:

       sum_{n=0}^∞ r^n = 1/(1 − r),        sum_{n=0}^∞ n r^n = r/(1 − r)^2.

   (b) A random variable X is drawn according to this distribution. Find an "efficient" sequence of yes-no questions of the form, "Is X contained in the set S?" Compare H(X) to the expected number of questions required to determine X.

2. Minimum entropy. What is the minimum value of H(p_1, ..., p_n) = H(p) as p ranges over the set of n-dimensional probability vectors? Find all p's which achieve this minimum.

3. Inequality. Show that ln x ≥ 1 − 1/x for x > 0.

4. Infinite entropy. This problem shows that the entropy of a discrete random variable can be infinite. Let A = sum_{n=2}^∞ (n log^2 n)^{−1}. (It is easy to show that A is finite by bounding the infinite sum by the integral of (x log^2 x)^{−1}.) Show that the integer-valued random variable X defined by

       Pr(X = n) = (A n log^2 n)^{−1}    for n = 2, 3, ...,

   has H(X) = +∞.

5. Markov's inequality for probabilities. Let p(x) be a probability mass function. Prove, for all d ≥ 0,

       Pr{ p(X) ≤ d } · log(1/d) ≤ H(X).        (1)
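As a quick numerical sanity check on Problem 1(a) (not part of the original assignment): for a fair coin, Pr(X = n) = (1/2)^n, and the geometric-series hints give H(X) = 2 bits exactly. A minimal Python sketch:

```python
import math

# X = number of flips of a fair coin until the first head:
# Pr(X = n) = (1/2)**n for n = 1, 2, 3, ...
# Direct entropy computation; truncating at n = 200 leaves a negligible tail.
# The hint sum_{n>=0} n r**n = r/(1-r)**2 at r = 1/2 gives H(X) = 2 exactly.
H = sum(-(0.5 ** n) * math.log2(0.5 ** n) for n in range(1, 200))
print(round(H, 6))  # -> 2.0
```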
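Problems 3 and 5 can likewise be spot-checked numerically before attempting the proofs. The sketch below is illustrative only (a finite check, not a proof); the 8-point alphabet and the grid of test values are arbitrary choices:

```python
import math
import random

# Problem 3: ln x >= 1 - 1/x, checked on a grid of x > 0
# (small tolerance absorbs floating-point rounding at x = 1).
assert all(math.log(x) >= 1 - 1 / x - 1e-12
           for x in (0.01, 0.1, 0.5, 1.0, 2.0, 10.0, 1e3))

# Problem 5: Pr{p(X) <= d} * log(1/d) <= H(X),
# spot-checked on random pmfs over an 8-point alphabet.
random.seed(0)
for _ in range(1000):
    w = [random.random() for _ in range(8)]
    p = [x / sum(w) for x in w]
    H = -sum(q * math.log2(q) for q in p)          # entropy in bits
    for d in (0.01, 0.05, 0.1, 0.3):
        lhs = sum(q for q in p if q <= d) * math.log2(1 / d)
        assert lhs <= H + 1e-9
print("all spot checks passed")
```

The same argument pattern drives the proof of (1): every x with p(x) ≤ d contributes at least p(x) log(1/d) to the entropy sum.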

