EE 376A/Stat 376A Handout #14
Information Theory
Thursday, February 4, 2011
Prof. T. Cover

Solutions to Homework Set #4

1. Horse race. Three horses run a race. A gambler offers 3-for-1 odds on each of the horses. These are fair odds under the assumption that all horses are equally likely to win the race. The true win probabilities are known to be

    p = (p_1, p_2, p_3) = (1/2, 1/4, 1/4).                                (1)

Let b = (b_1, b_2, b_3), b_i ≥ 0, ∑ b_i = 1, be the fraction of wealth invested on each of the horses. The expected log wealth is then

    W(b) = ∑_{i=1}^{3} p_i log(3 b_i).                                    (2)

(a) Maximize this over b to find b* and W*. Thus the wealth achieved in repeated horse races should grow to infinity like 2^{nW*} with probability one.

(b) Show that if instead we put all of the current wealth on horse 1, the most likely winner, on each race, we will eventually go broke with probability one.

Solution: Horse race.

(a) The doubling rate is

    W(b) = ∑_i p_i log(b_i o_i)
         = ∑_i p_i log(3 b_i)
         = ∑_i p_i log 3 + ∑_i p_i log p_i − ∑_i p_i log(p_i / b_i)
         = log 3 − H(p) − D(p‖b)
         ≤ log 3 − H(p),

with equality iff b = p. Hence b* = p = (1/2, 1/4, 1/4) and

    W* = log 3 − H(1/2, 1/4, 1/4) = (1/2) log(9/8) ≈ 0.085.

By the strong law of large numbers,

    S_n = ∏_j 3 b(X_j) = 2^{n · (1/n) ∑_j log 3 b(X_j)} → 2^{n E[log 3 b(X)]} = 2^{n W(b)}.

When b = b*, W(b) = W* and S_n ≐ 2^{n W*} = 2^{0.085 n} ≈ (1.06)^n.

(b) If we put all of the wealth on horse 1 in every race, the probability that we have not gone broke after n races is (1/2)^n. Since this probability goes to zero as n → ∞, the probability of the set of outcomes on which we never go broke is zero, and we go broke with probability 1.

Alternatively, if b = (1, 0, 0), then W(b) = −∞ and

    S_n → 2^{nW} = 0    w.p. 1                                            (3)

by the strong law of large numbers.

2. Can side information make a bad situation worse?
Suppose we have a horse race with outcome X ∈ {1, 2, …, m} and side information Y, where (X, Y) ∼ p(x, y) = p(x) p(y|x). The odds are m-for-1.

(a) Find the growth-optimal strategy b(x) and the associated growth rate of wealth, max_{b(·)} W(b(x), p(x)), for the gambler.

(b) Given side information, what is the growth-optimal b(x|y) and the associated growth rate? What is the improvement ΔW? Call it ΔW_p.

Now suppose another gambler believes (incorrectly) that X ∼ q(x) and that (X, Y) ∼ q(x) p(y|x), i.e., he believes the joint distribution is q(x, y) = q(x) p(y|x). Note that the conditional distribution q(y|x) = p(y|x) is the same as in parts (a) and (b); thus the noise in the observation of Y given X is the same in each version. Only the estimate of the true distribution of X is different. …
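The numbers in Problem 1 can be checked directly. Below is a minimal sketch (Python, chosen only for illustration; the helper `doubling_rate` is a hypothetical name, not from the handout):

```python
import math

def doubling_rate(b, p, odds=3.0):
    """W(b) = sum_i p_i * log2(odds * b_i): expected log-wealth per race."""
    return sum(pi * math.log2(odds * bi) for pi, bi in zip(p, b) if pi > 0)

p = (0.5, 0.25, 0.25)

# Proportional (Kelly) betting b* = p achieves W* = log 3 - H(p) ~ 0.085
w_star = doubling_rate(p, p)
print(round(w_star, 3))        # 0.085

# Wealth after n races grows like 2^(n W*) ~ (1.06)^n
print(round(2 ** w_star, 2))   # 1.06

# Any other bet does worse, by exactly D(p || b); e.g. uniform betting
# against 3-for-1 odds returns log 3 - H(p) - D(p || b) = 0 here.
w_uniform = doubling_rate((1/3, 1/3, 1/3), p)
print(w_uniform < w_star)      # True
```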
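For parts (a) and (b), proportional betting is again optimal: without side information b*(x) = p(x) gives W = log m − H(X), and with side information b*(x|y) = p(x|y) gives W = log m − H(X|Y), so the improvement is ΔW_p = I(X; Y). A numerical sketch of this standard result (the joint distribution below is an illustrative assumption, not from the handout):

```python
import math

# Illustrative joint p(x, y) for a 2-horse race (m = 2) with binary side
# information Y; these numbers are assumptions, not from the problem.
p_xy = {(0, 0): 0.4, (0, 1): 0.1,
        (1, 0): 0.1, (1, 1): 0.4}
m = 2  # odds are m-for-1

def H(dist):
    """Entropy in bits of a probability vector."""
    return -sum(q * math.log2(q) for q in dist if q > 0)

p_x = [sum(v for (x, y), v in p_xy.items() if x == xi) for xi in range(2)]
p_y = [sum(v for (x, y), v in p_xy.items() if y == yi) for yi in range(2)]

# (a) No side information: b*(x) = p(x), growth rate W = log m - H(X)
W_no_side = math.log2(m) - H(p_x)

# (b) With side information: b*(x|y) = p(x|y), W = log m - H(X|Y)
H_X_given_Y = sum(py * H([p_xy[(x, yi)] / py for x in range(2)])
                  for yi, py in enumerate(p_y))
W_side = math.log2(m) - H_X_given_Y

# The improvement equals the mutual information I(X; Y)
delta_W = W_side - W_no_side
I_XY = H(p_x) - H_X_given_Y
print(round(W_side, 3))              # 0.278
print(abs(delta_W - I_XY) < 1e-9)    # True
```

With the true distribution p, the side information can only help (ΔW_p = I(X;Y) ≥ 0); the question's later parts, cut off in this preview, concern what happens under the mistaken belief q(x).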