
EE 376B/Stat 376B Handout #20
Information Theory                          Thursday, May 25, 2006
Prof. T. Cover                              Due Thursday, June 1, 2006

Homework Set #8

1. Universal data compression. Consider three possible source distributions on X:

       Pa = (0.7, 0.2, 0.1),   Pb = (0.1, 0.7, 0.2),   Pc = (0.2, 0.1, 0.7).

   (a) Find the minimum incremental cost of compression

           D* = min_P max_theta D(P_theta || P),

       the associated mass function P = (p1, p2, p3), and the ideal codeword
       lengths l_i = log(1/p_i).
   (b) What is the capacity of the channel whose transition matrix has rows
       Pa, Pb, Pc?

2. Arithmetic coding. Let {X_i} be a binary stationary Markov process with
   transition matrix

       [ 1/3  2/3 ]
       [ 2/3  1/3 ]

   (a) Find F(01110) = Pr{ .X1 X2 X3 X4 X5 < .01110 } = .F1 F2 ...
   (b) How many bits .F1 F2 ... can be known for sure if it is not known how
       X = 01110 continues?

3. Lempel-Ziv. Give the LZ78 parsing and encoding of 00000011010100000110101.

4. Compression of a constant sequence. We are given the constant sequence
   x^n = 111...1.
   (a) Give the LZ78 parsing of this sequence.
   (b) Argue that the number of encoding bits per symbol for this sequence goes
       to zero as n goes to infinity.
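For problem 1, the minimax redundancy D* coincides with the capacity of the "channel" from the parameter theta to X (the redundancy-capacity theorem), and the three rows here are cyclic shifts of one another, so the channel is symmetric and the uniform output distribution is a natural candidate centroid. The following sketch (not the assigned derivation, just a numerical sanity check under that symmetry assumption) verifies that all three divergences to the uniform distribution agree with the symmetric-channel capacity:

```python
import math

rows = {"a": (0.7, 0.2, 0.1), "b": (0.1, 0.7, 0.2), "c": (0.2, 0.1, 0.7)}

def kl(p, q):
    """Relative entropy D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Candidate centroid: uniform, suggested by the cyclic symmetry of the rows.
u = (1/3, 1/3, 1/3)
redundancies = {k: kl(p, u) for k, p in rows.items()}

# For a symmetric channel, C = log|X| - H(row).
H = -sum(p * math.log2(p) for p in rows["a"])
C = math.log2(3) - H
print(redundancies, C)
```

Since every D(P_theta || uniform) equals log 3 - H(P_theta) and all rows are permutations of the same vector, the three redundancies coincide, which is consistent with the uniform distribution being the minimax-optimal P.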
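For problem 2(a), F(01110) is the probability that a length-5 sequence from the chain precedes 01110 in lexicographic (binary-fraction) order. A brute-force sketch, assuming the stationary distribution (1/2, 1/2) that follows from the symmetry of the transition matrix:

```python
from itertools import product

def prob(bits):
    """Probability of a sequence under the stationary binary Markov chain:
    initial symbol has probability 1/2; staying costs 1/3, switching 2/3."""
    p = 0.5
    for prev, cur in zip(bits, bits[1:]):
        p *= 1/3 if prev == cur else 2/3
    return p

x = (0, 1, 1, 1, 0)
# F(x) = Pr{.X1...X5 < .x1...x5}: sum p(y) over the 5-bit strings before x.
# Tuple comparison y < x is lexicographic, matching binary-fraction order.
F = sum(prob(y) for y in product((0, 1), repeat=5) if y < x)
print(F)
```

This gives the lower endpoint of the arithmetic-coding interval for 01110; part (b) then asks how many leading bits of .F1 F2 ... are pinned down by the interval's width.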
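The LZ78 parsing in problem 3 splits the string into the shortest phrases not seen before, each encoded as (index of longest previously seen prefix, new symbol). A minimal parser sketch:

```python
def lz78_parse(s):
    """Return the LZ78 phrases and (index, symbol) code pairs.
    Index 0 denotes the empty phrase; the final phrase may repeat an
    earlier one if the string ends mid-phrase."""
    dictionary = {}          # phrase -> 1-based dictionary index
    phrases, code = [], []
    i = 0
    while i < len(s):
        j = i + 1
        # Extend the candidate phrase while it has been seen before.
        while j <= len(s) and s[i:j] in dictionary:
            j += 1
        phrase = s[i:j]
        if phrase in dictionary:               # ran off the end of s
            code.append((dictionary[phrase], ""))
        else:
            dictionary[phrase] = len(dictionary) + 1
            code.append((dictionary.get(phrase[:-1], 0), phrase[-1]))
        phrases.append(phrase)
        i = j
    return phrases, code

phrases, code = lz78_parse("00000011010100000110101")
print(phrases)   # ['0', '00', '000', '1', '10', '101', '0000', '01', '1010', '1']
```

The encoding step then spends about log2(c) bits per phrase index when c phrases have been parsed, plus one bit for the new symbol.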
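For problem 4, the parsing of the all-ones string is 1, 11, 111, ..., so c phrases cover about c^2/2 symbols, i.e. c grows only like sqrt(2n), and the code length of roughly c(log2 c + 1) bits is vanishing per symbol. A quick numerical illustration of that rate (a sketch using a crude ceil-log2 bound for the index field, not an exact LZ78 codeword count):

```python
import math

def lz78_bits_per_symbol(n):
    """Estimate LZ78 bits/symbol for the all-ones string of length n.
    Phrases are 1, 11, 111, ...; each of the c phrases is charged about
    log2(c+1) bits for a dictionary index plus 1 bit for the new symbol."""
    c, covered = 0, 0
    while covered + c + 1 <= n:   # does the next full phrase still fit?
        c += 1
        covered += c
    if covered < n:               # leftover partial phrase at the end
        c += 1
    bits = c * (math.ceil(math.log2(c + 1)) + 1)
    return bits / n

rates = [lz78_bits_per_symbol(10 ** k) for k in range(2, 7)]
print(rates)
```

The printed rates shrink toward zero as n grows, roughly like sqrt(2/n) log n, which is the behavior part (b) asks you to argue.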

This note was uploaded on 06/10/2008 for the course ECE 376B taught by Professor Tom Cover during the Spring '05 term at Stanford.
