# Module 8, Lecture 1: Differential Entropy and the Gaussian Channel

G.L. Heileman

## Differential Entropy

**Definition (Differential entropy).** The differential entropy of a continuous RV $X$ with pdf $f(x)$ is

$$
h(X) = -\int_S f(x) \log f(x)\, dx,
$$

where $S$ is the support of $X$, i.e., the subset of $x \in \mathbb{R}$ for which $f(x) > 0$.

Just as in the discrete case, the differential entropy depends only on the probability density of the RV, not on the values the RV may assume; thus it is sometimes written $h(f)$. The differential entropy of a RV exists only if the integral above exists.

**Example.** Consider a zero-mean Gaussian RV $X \sim \Phi$, with pdf

$$
\Phi(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2 / (2\sigma^2)}.
$$

Then

$$
\begin{aligned}
h(\Phi) &= -\int \Phi \ln \Phi \, dx
         = -\int \Phi(x)\left[-\frac{x^2}{2\sigma^2} - \ln\sqrt{2\pi\sigma^2}\right] dx \\
        &= \frac{E[X^2]}{2\sigma^2} + \frac{1}{2}\ln 2\pi\sigma^2
         = \frac{1}{2} + \frac{1}{2}\ln 2\pi\sigma^2 \\
        &= \frac{1}{2}\ln e + \frac{1}{2}\ln 2\pi\sigma^2
         = \frac{1}{2}\ln 2\pi e \sigma^2 \text{ nats},
\end{aligned}
$$

where the second equality uses $E[X^2] = \sigma^2$ for a zero-mean Gaussian. Carrying through the same derivation with base-2 logarithms yields

$$
h(\Phi) = \frac{1}{2}\log 2\pi e \sigma^2 \text{ bits}.
$$

## The AEP for Continuous RVs

**Theorem (AEP).** Let $X_1, \ldots, X_n$ be a sequence of i.i.d. continuous RVs drawn according to pdf $f(x)$. Then

$$
-\frac{1}{n}\log f(X_1, \ldots, X_n) \xrightarrow{\;p\;} E[-\log f(X)] = h(X).
$$
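Both results above can be checked numerically: the closed-form value $\frac{1}{2}\ln 2\pi e \sigma^2$ should agree with a direct numerical evaluation of $-\int \Phi \ln \Phi\, dx$, and by the AEP the empirical average $-\frac{1}{n}\sum_i \ln \Phi(X_i)$ over i.i.d. Gaussian samples should converge to the same number. The sketch below (all function names are illustrative, not from the lecture) demonstrates this:

```python
import math
import random

def gaussian_pdf(x, sigma):
    """Zero-mean Gaussian density Phi(x) with standard deviation sigma."""
    return math.exp(-x * x / (2 * sigma * sigma)) / math.sqrt(2 * math.pi * sigma * sigma)

def h_closed_form(sigma):
    """h(Phi) = (1/2) ln(2 pi e sigma^2) nats (the lecture's result)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma * sigma)

def h_numeric(sigma, lo=-40.0, hi=40.0, n=100_000):
    """Midpoint-rule approximation of -integral of f ln f over a wide interval."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = gaussian_pdf(x, sigma)
        if f > 0.0:  # integrate only over the support, where f(x) > 0
            total -= f * math.log(f) * dx
    return total

def h_aep(sigma, n=100_000, seed=0):
    """AEP estimate: -(1/n) sum_i ln f(X_i) over i.i.d. Gaussian samples."""
    rng = random.Random(seed)
    s = sum(math.log(gaussian_pdf(rng.gauss(0.0, sigma), sigma)) for _ in range(n))
    return -s / n

sigma = 2.0
print(h_closed_form(sigma), h_numeric(sigma), h_aep(sigma))
```

For $\sigma = 1$ the closed form gives $\frac{1}{2}\ln 2\pi e \approx 1.419$ nats; the numerical integral matches to several decimal places, and the Monte Carlo AEP average drifts toward the same value as $n$ grows, at the usual $O(1/\sqrt{n})$ rate.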