# hw1 - EE 376B/Stat 376B Information Theory, Prof. T. Cover


EE 376B/Stat 376B, Information Theory, Handout #3
Thursday, April 6, 2005, Prof. T. Cover
Due Thursday, April 13, 2005

## Homework Set #1

1. **Monotonicity of entropy per element.** For a stationary stochastic process $X_1, X_2, \ldots, X_n$, show that

   $$\frac{H(X_1, X_2, \ldots, X_n)}{n} \ge H(X_n \mid X_{n-1}, \ldots, X_1).$$

2. **Entropy rates of Markov chains.**

   (a) Find the entropy rate of the two-state Markov chain with transition matrix

   $$P = \begin{pmatrix} 1 - p_{01} & p_{01} \\ p_{10} & 1 - p_{10} \end{pmatrix}.$$

   (b) What values of $p_{01}, p_{10}$ maximize the entropy rate?

   (c) Find the entropy rate of the two-state Markov chain with transition matrix

   $$P = \begin{pmatrix} 1 - p & p \\ 1 & 0 \end{pmatrix}.$$

   (d) Find the maximum value of the entropy rate of the Markov chain of part (c). We expect that the maximizing value of $p$ should be less than $\frac{1}{2}$, since the 0 state permits more information to be generated than the 1 state.

3. **Second law of thermodynamics.** Let $X_1, X_2, X_3, \ldots$ be a stationary first-order Markov chain. We know that $H(X_n \mid X_1) \ge H(X_{n-1} \mid X_1)$ for $n = 2, 3, \ldots$. Thus, conditional uncertainty about the future grows.
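Not part of the handout, but the entropy-rate questions in problem 2 can be sanity-checked numerically. The sketch below uses the standard formula for the entropy rate of a stationary Markov chain, $H(\mathcal{X}) = \sum_i \pi_i H(P_{i\cdot})$, where $\pi$ is the stationary distribution; the helper name `entropy_rate` and the grid-search approach are my own illustration, not the intended analytical solution.

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate (bits/symbol) of a stationary Markov chain:
    H = sum_i pi_i * H(row i), with pi the stationary distribution."""
    P = np.asarray(P, dtype=float)
    # Stationary distribution: left eigenvector of P for eigenvalue 1,
    # i.e. eigenvector of P.T with the largest (real) eigenvalue.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()
    # Sum the row entropies weighted by pi; 0*log(0) terms are skipped.
    return float(sum(pi[i] * sum(-p * np.log2(p) for p in P[i] if p > 0)
                     for i in range(len(pi))))

# Part (b): the symmetric chain with p01 = p10 = 1/2 gives 1 bit/symbol.
uniform_rate = entropy_rate([[0.5, 0.5], [0.5, 0.5]])

# Part (d): sweep p for P = [[1-p, p], [1, 0]] and locate the maximum.
ps = np.linspace(0.001, 0.999, 9981)
rates = [entropy_rate([[1 - p, p], [1.0, 0.0]]) for p in ps]
best_p = float(ps[int(np.argmax(rates))])
best_rate = float(max(rates))
```

As the problem anticipates, the numerical maximizer for part (d) lands below $\frac{1}{2}$ (near $p \approx 0.382$), consistent with the closed form: the rate $H(p)/(1+p)$ is maximized at $p = (3-\sqrt{5})/2$ with value $\log_2\frac{1+\sqrt{5}}{2} \approx 0.694$ bits.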


