# Lecture 29: Chapter 29, Entropy Rates and Asymptotic Equipartition

Section 29.1 introduces the entropy rate, the asymptotic entropy per time-step of a stochastic process, and shows that it is well-defined; similarly for information, divergence, etc. rates. Section 29.2 proves the Shannon-McMillan-Breiman theorem, a.k.a. the asymptotic equipartition property, a.k.a. the entropy ergodic theorem: asymptotically, almost all sample paths of a stationary ergodic process have the same log-probability per time-step, namely the entropy rate. This leads to the idea of "typical" sequences, in Section 29.2.1. Section 29.3 discusses some aspects of asymptotic likelihood, using the asymptotic equipartition property, and allied results for the divergence rate.

## 29.1 Information-Theoretic Rates

**Definition 376 (Entropy Rate)** The entropy rate of a random sequence $X$ is

$$h(X) \equiv \lim_n \frac{H_\rho[X_1^n]}{n} \qquad (29.1)$$

when the limit exists.

**Definition 377 (Limiting Conditional Entropy)** The limiting conditional entropy of a random sequence $X$ is

$$h'(X) \equiv \lim_n H_\rho[X_n \mid X_1^{n-1}] \qquad (29.2)$$

when the limit exists.

**Lemma 378** For a stationary sequence, $H_\rho[X_n \mid X_1^{n-1}]$ is non-increasing in $n$. Moreover, its limit exists if $X$ takes values in a discrete space.

*Proof:* Because "conditioning reduces entropy", $H_\rho[X_{n+1} \mid X_1^n] \le H_\rho[X_{n+1} \mid X_2^n]$. By stationarity, $H_\rho[X_{n+1} \mid X_2^n] = H_\rho[X_n \mid X_1^{n-1}]$. If $X$ takes discrete values, then conditional entropy is non-negative, and a non-increasing sequence of non-negative real numbers always has a limit. $\Box$

*Remark:* Discrete values are a sufficient condition for the existence of the limit, not a necessary one.

We now need a natural-looking, but slightly technical, result from real analysis.

**Theorem 379 (Cesàro)** For any sequence of real numbers $a_n \to a$, the sequence $b_n = n^{-1} \sum_{i=1}^n a_i$ also converges to $a$.
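As a numerical sketch of Definitions 376–377 and Lemma 378 (not from the notes: the two-state transition matrix `P` and its stationary distribution `pi` are made-up examples), the code below brute-forces $H[X_1^n]$ for a stationary Markov chain and checks that $H[X_1^n]/n$ decreases toward the limiting conditional entropy $h'(X) = \sum_i \pi_i H(P_{i\cdot})$:

```python
from itertools import product
from math import log2

# Made-up example: two-state stationary Markov chain.
P = [[0.9, 0.1],
     [0.4, 0.6]]       # transition probabilities P[i][j] = P(X_{t+1}=j | X_t=i)
pi = [0.8, 0.2]        # stationary distribution: pi P = pi (check: 0.8*0.9 + 0.2*0.4 = 0.8)

def H(dist):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * log2(p) for p in dist if p > 0)

# Limiting conditional entropy h'(X): for a stationary Markov chain,
# H[X_n | X_1^{n-1}] = H[X_n | X_{n-1}] = sum_i pi_i * H(P[i]) for all n >= 2.
h_prime = sum(pi[i] * H(P[i]) for i in range(2))

def joint_entropy(n):
    """H[X_1^n], by brute-force summation over all 2^n sample paths."""
    total = 0.0
    for seq in product(range(2), repeat=n):
        p = pi[seq[0]]
        for a, b in zip(seq, seq[1:]):
            p *= P[a][b]
        total -= p * log2(p)
    return total

for n in (1, 2, 5, 10, 14):
    print(n, joint_entropy(n) / n)   # non-increasing, approaching h_prime
```

The per-step entropies $H[X_1^n]/n$ are Cesàro averages of the conditional entropies, so they converge to the same limit, which is exactly the content of Theorem 380 below.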
*Proof:* For every $\epsilon > 0$, there is an $N(\epsilon)$ such that $|a_n - a| < \epsilon$ whenever $n > N(\epsilon)$. Now take $b_n$ and break it up into two parts, one summing the terms below $N(\epsilon)$, and the other the terms above.

$$
\begin{aligned}
\lim_n |b_n - a| &= \lim_n \left| n^{-1} \sum_{i=1}^n a_i - a \right| & (29.3) \\
&\le \lim_n n^{-1} \sum_{i=1}^n |a_i - a| & (29.4) \\
&\le \lim_n n^{-1} \left( \sum_{i=1}^{N(\epsilon)} |a_i - a| + (n - N(\epsilon))\epsilon \right) & (29.5) \\
&\le \lim_n n^{-1} \left( \sum_{i=1}^{N(\epsilon)} |a_i - a| + n\epsilon \right) & (29.6) \\
&= \epsilon + \lim_n n^{-1} \sum_{i=1}^{N(\epsilon)} |a_i - a| & (29.7) \\
&= \epsilon & (29.8)
\end{aligned}
$$

Since $\epsilon$ was arbitrary, $\lim b_n = a$. $\Box$

**Theorem 380 (Entropy Rate)** For a stationary sequence, if the limiting conditional entropy exists, then it is equal to the entropy rate, $h(X) = h'(X)$.

*Proof:* Start with the chain rule to break the joint entropy into a sum of conditional entropies, use Lemma 378 to identify their limit as $h'(X)$, and then use Cesàro's theorem:

$$
\begin{aligned}
h(X) &= \lim_n \frac{1}{n} H_\rho[X_1^n] & (29.9) \\
&= \lim_n \frac{1}{n} \sum_{i=1}^n H_\rho[X_i \mid X_1^{i-1}] & (29.10) \\
&= h'(X) & (29.11)
\end{aligned}
$$

as required. $\Box$
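The equipartition idea previewed for Section 29.2 can also be illustrated numerically. The sketch below (not from the notes; the Bernoulli parameter `p = 0.3` and sample sizes are arbitrary choices) simulates an i.i.d. Bernoulli process and checks that the per-step negative log-probability of the realized sample path, $-n^{-1}\log_2 P(X_1^n)$, approaches the entropy rate $H(p)$:

```python
import random
from math import log2

random.seed(1)
p = 0.3                                        # arbitrary Bernoulli parameter
Hp = -(p * log2(p) + (1 - p) * log2(1 - p))    # entropy rate of the process, in bits

def minus_log_prob_rate(n):
    """Draw one length-n sample path and return -(1/n) log2 of its probability."""
    x = [1 if random.random() < p else 0 for _ in range(n)]
    logp = sum(log2(p) if xi else log2(1 - p) for xi in x)
    return -logp / n

for n in (100, 10_000, 200_000):
    print(n, minus_log_prob_rate(n))   # approaches Hp as n grows
```

For an i.i.d. process this convergence is just the strong law of large numbers applied to $-\log_2 p(X_i)$; the Shannon-McMillan-Breiman theorem of Section 29.2 extends the same conclusion to all stationary ergodic processes.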

## This note was uploaded on 12/20/2011 for the course STAT 36-754 taught by Professor Schalizi during the Spring '06 term at University of Michigan.

