EE 376B Information Theory
Handout #21, Thursday, June 3, 2010
Prof. T. Cover

Homework Set #7

1. Minimax regret data compression and channel capacity. First consider universal data compression with respect to four source distributions. Let the alphabet be $\mathcal{V} = \{1, 2, 3, 4, e\}$, and for $i = 1, 2, 3, 4$ let $p_i(v)$ put mass $1 - \epsilon$ on $v = i$ and mass $\epsilon$ on $v = e$. We assign word lengths to $\mathcal{V}$ according to $l(v) = \log \frac{1}{p(v)}$, the ideal codeword length with respect to a cleverly chosen probability mass function $p(v)$. The worst-case excess description length (above the entropy of the true distribution) is

   $$\max_i \left( E_{p_i} \log \frac{1}{p(V)} - E_{p_i} \log \frac{1}{p_i(V)} \right) = \max_i D(p_i \| p).$$

   Thus the minimax regret is $R^* = \min_p \max_i D(p_i \| p)$.

   (a) Find $R^*$.

   (b) Find the $p(v)$ achieving $R^*$.

   (c) Compare $R^*$ to the capacity of the erasure channel

   $$\begin{pmatrix}
   1-\epsilon & 0 & 0 & 0 & \epsilon \\
   0 & 1-\epsilon & 0 & 0 & \epsilon \\
   0 & 0 & 1-\epsilon & 0 & \epsilon \\
   0 & 0 & 0 & 1-\epsilon & \epsilon
   \end{pmatrix}$$

   and comment.

Solution: Minimax regret data compression and channel capacity.

(a) The whole trick to this problem is to exploit the duality between universal data compression and channel capacity. Although we don't immediately know how to solve the data-compression problem, we can turn it into a channel-capacity problem whose answer we already know. If we wish to compress a source drawn from any of four possible distributions $p_1(v), p_2(v), p_3(v), p_4(v)$, but we don't know which one, then the minimax regret $R^*$ achievable by a universal data-compression scheme equals the capacity of the channel whose transition matrix has rows that are precisely the probability distributions $p_1, p_2, p_3, p_4$. Since $p_i(v)$ puts mass $1 - \epsilon$ on $v = i$ and mass $\epsilon$ on $v = e$ for $i = 1, 2, 3, 4$, the transition matrix is exactly the one displayed above, which we recognize immediately as a quaternary erasure channel with erasure probability $\epsilon$. A fraction $1 - \epsilon$ of the input symbols survive unerased, so the capacity is $C = (1 - \epsilon)\log 4 = 2(1 - \epsilon)$ bits. Thus the minimax regret is $R^* = 2(1 - \epsilon)$.

(b) The $p(v)$ achieving $R^*$ is the center of the smallest relative-entropy ball that contains the $p_i(v)$. In the dual problem, involving the quaternary erasure channel, this center is the distribution induced on the channel output when we send according to the input distribution that achieves capacity. By symmetry the capacity-achieving input distribution is uniform on $\{1, 2, 3, 4\}$, so the optimal distribution is

   $$p^*(v) = \begin{cases} \frac{1-\epsilon}{4}, & v = 1, 2, 3, 4, \\ \epsilon, & v = e. \end{cases}$$

   One can check directly that every source sits at the same distance from this center: for each $i$,

   $$D(p_i \| p^*) = (1-\epsilon)\log\frac{1-\epsilon}{(1-\epsilon)/4} + \epsilon\log\frac{\epsilon}{\epsilon} = (1-\epsilon)\log 4 = 2(1-\epsilon) = R^*.$$
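The duality in part (a) is also easy to check numerically. The sketch below (not part of the original handout) runs the standard Blahut-Arimoto iteration on the quaternary erasure channel's transition matrix and confirms that the computed capacity, the optimal output distribution, and the distances $D(p_i \| p^*)$ all match the closed-form answers above. The value eps = 0.2 and the function names are illustrative choices, not from the problem.

    import numpy as np

    def kl(p, q):
        # D(p || q) in bits; terms with p(v) = 0 contribute zero.
        m = p > 0
        return float(np.sum(p[m] * np.log2(p[m] / q[m])))

    def blahut_arimoto(P, n_iter=200):
        # Capacity of the discrete memoryless channel whose rows are P,
        # via the Blahut-Arimoto alternating maximization.
        r = np.full(len(P), 1.0 / len(P))        # input distribution, start uniform
        for _ in range(n_iter):
            q = r @ P                            # induced output distribution
            d = np.array([kl(row, q) for row in P])
            r *= np.exp2(d)                      # multiplicative update (base-2 logs)
            r /= r.sum()
        q = r @ P
        return max(kl(row, q) for row in P), q   # capacity and optimal output p*

    eps = 0.2                                    # illustrative erasure probability
    # Rows are p_1, ..., p_4 over the output alphabet {1, 2, 3, 4, e}:
    # mass 1 - eps on v = i and mass eps on the erasure symbol e.
    P = np.hstack([(1 - eps) * np.eye(4), np.full((4, 1), eps)])

    C, p_star = blahut_arimoto(P)
    print("capacity C   :", C)                   # ~ 2(1 - eps) = 1.6 bits
    print("2(1 - eps)   :", 2 * (1 - eps))
    print("p*(v)        :", p_star)              # ((1-eps)/4, ..., (1-eps)/4, eps)
    print("D(p_i || p*) :", [round(kl(row, p_star), 6) for row in P])

All four distances $D(p_i \| p^*)$ come out equal to $2(1-\epsilon)$, which is exactly the equidistance property of the minimax center noted in part (b).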