EE 376B/Stat 376B Handout #26
Information Theory
Thursday, June 1, 2006
Prof. T. Cover

Solutions to Practice Final

Each problem is worth 20 points.

1. Kolmogorov complexity

   [Figure: a square on an n × n grid with a rectangle eaten out of one corner.]

   What is the Kolmogorov complexity (to first order) of a square with a rectangle eaten out of a corner? The square and the rectangle lie on an n × n grid, with their edges aligned with the borders of the grid.

   Solution: Kolmogorov complexity

   This question is similar to question 1f of homework set 6. To describe the square, one needs 2 log(n) bits to describe the upper left hand corner and log(n) bits to specify the length of an edge. We can then describe the rectangle by first describing which corner of the square it intersects, which takes log(4) bits, and then specifying the opposing corner of the rectangle, which takes 2 log(n) bits. Altogether,

       K(square with a rectangle eaten out of a corner | n) ≤ 5 log(n) + c.

2. Slepian-Wolf

   Let X1 = U ⊕ Z1 and X2 = U ⊕ Z2, where U ∼ Bern(p), Z1 ∼ Bern(α1), Z2 ∼ Bern(α2), and U, Z1, Z2 are independent. What rates (R1, R2) suffice to describe X1 and X2?

   [Figure: separate encoders send indices i(X1^n) ∈ 2^{nR1} and j(X2^n) ∈ 2^{nR2} to a joint decoder, which outputs (X̂1^n, X̂2^n).]

   Solution: Slepian-Wolf

   We know

       R1 ≥ H(X1 | X2)
       R2 ≥ H(X2 | X1)
       R1 + R2 ≥ H(X1, X2).

   The joint entropy is

       H(X1, X2) = H(pα1α2 + p̄ᾱ1ᾱ2, pα1ᾱ2 + p̄ᾱ1α2, pᾱ1α2 + p̄α1ᾱ2, pᾱ1ᾱ2 + p̄α1α2).

   Then

       H(X1 | X2) = H(X1, X2) − H(X2)
                  = H(pα1α2 + p̄ᾱ1ᾱ2, pα1ᾱ2 + p̄ᾱ1α2, pᾱ1α2 + p̄α1ᾱ2, pᾱ1ᾱ2 + p̄α1α2) − H_B(α2 * p),

   where H_B is the binary entropy function and α2 * p = p̄α2 + pᾱ2. Similarly,

       H(X2 | X1) = H(X1, X2) − H(X1)
                  = H(pα1α2 + p̄ᾱ1ᾱ2, pα1ᾱ2 + p̄ᾱ1α2, pᾱ1α2 + p̄α1ᾱ2, pᾱ1ᾱ2 + p̄α1α2) − H_B(α1 * p).
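The conditional-entropy calculation can be sanity-checked numerically. The following Python sketch (with arbitrarily chosen parameter values; none of this is part of the handout) builds the joint pmf of (X1, X2) directly from the definitions X1 = U ⊕ Z1, X2 = U ⊕ Z2 and computes H(X1 | X2) = H(X1, X2) − H_B(α2 * p):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a finite pmf."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

def hb(q):
    """Binary entropy function H_B(q)."""
    return entropy([q, 1 - q])

def joint_pmf(p, a1, a2):
    """Joint pmf of (X1, X2), where X1 = U xor Z1, X2 = U xor Z2,
    with U ~ Bern(p), Z1 ~ Bern(a1), Z2 ~ Bern(a2) independent."""
    pmf = {(x1, x2): 0.0 for x1 in (0, 1) for x2 in (0, 1)}
    for u in (0, 1):
        for z1 in (0, 1):
            for z2 in (0, 1):
                prob = ((p if u else 1 - p)
                        * (a1 if z1 else 1 - a1)
                        * (a2 if z2 else 1 - a2))
                pmf[(u ^ z1, u ^ z2)] += prob
    return pmf

# Illustrative parameters, chosen arbitrarily for the check.
p, a1, a2 = 0.3, 0.1, 0.2
pmf = joint_pmf(p, a1, a2)

H12 = entropy(pmf.values())              # H(X1, X2)
H2 = hb((1 - p) * a2 + p * (1 - a2))     # H(X2) = H_B(a2 * p)
H1_given_2 = H12 - H2                    # H(X1 | X2)
```

Here the binary convolution α2 * p = p̄α2 + pᾱ2 gives the marginal P(X2 = 1) directly, so H(X2) can be computed without tabulating the marginal from the joint pmf.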
   Plugging back into the Slepian-Wolf region gives

       R1 ≥ H(pα1α2 + p̄ᾱ1ᾱ2, pα1ᾱ2 + p̄ᾱ1α2, pᾱ1α2 + p̄α1ᾱ2, pᾱ1ᾱ2 + p̄α1α2) − H_B(α2 * p)
       R2 ≥ H(pα1α2 + p̄ᾱ1ᾱ2, pα1ᾱ2 + p̄ᾱ1α2, pᾱ1α2 + p̄α1ᾱ2, pᾱ1ᾱ2 + p̄α1α2) − H_B(α1 * p)
       R1 + R2 ≥ H(pα1α2 + p̄ᾱ1ᾱ2, pα1ᾱ2 + p̄ᾱ1α2, pᾱ1α2 + p̄α1ᾱ2, pᾱ1ᾱ2 + p̄α1α2).
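The region itself can also be checked numerically. The Python sketch below (illustrative parameters; not part of the handout) evaluates the four-term joint pmf from the solution and confirms that the corner point (R1, R2) = (H(X1 | X2), H(X2)), corresponding to describing X2 completely and X1 only conditionally, satisfies all three inequalities and meets the sum-rate bound with equality:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a finite pmf."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

def hb(q):
    """Binary entropy function H_B(q)."""
    return entropy([q, 1 - q])

# Illustrative parameters, chosen arbitrarily.
p, a1, a2 = 0.3, 0.1, 0.2
pb, a1b, a2b = 1 - p, 1 - a1, 1 - a2

# The four-term joint pmf of (X1, X2) from the solution.
joint = [p*a1*a2 + pb*a1b*a2b,    # P(X1=0, X2=0)
         p*a1*a2b + pb*a1b*a2,    # P(X1=0, X2=1)
         p*a1b*a2 + pb*a1*a2b,    # P(X1=1, X2=0)
         p*a1b*a2b + pb*a1*a2]    # P(X1=1, X2=1)

H12 = entropy(joint)
H1 = hb(pb * a1 + p * a1b)   # H(X1) = H_B(a1 * p)
H2 = hb(pb * a2 + p * a2b)   # H(X2) = H_B(a2 * p)

# Corner point: send X2 at full rate, X1 at its conditional rate.
R1, R2 = H12 - H2, H2

in_region = (R1 >= H12 - H2 - 1e-9 and   # R1 >= H(X1|X2)
             R2 >= H12 - H1 - 1e-9 and   # R2 >= H(X2|X1), by subadditivity
             R1 + R2 >= H12 - 1e-9)      # sum rate
```

The middle check holds because H(X2) ≥ H(X2 | X1) for any joint distribution, so both corner points of the Slepian-Wolf region are always achievable operating points.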