Harvard SEAS ES250 – Information Theory

Differential Entropy and Maximum Entropy*

* Based on Cover & Thomas, Chapters 8 and 12.

1 Differential Entropy

1.1 Definitions

Definition. The differential entropy $h(X)$ of a continuous random variable $X$ with density $f(x)$ is defined as
\[
h(X) = -\int_S f(x) \log f(x)\, dx,
\]
where $S$ is the support set of the random variable.

1.2 AEP for Continuous Random Variables

Theorem. Let $X_1, X_2, \ldots, X_n$ be a sequence of random variables drawn i.i.d. according to the density $f(x)$. Then
\[
-\frac{1}{n} \log f(X_1, X_2, \ldots, X_n) \longrightarrow E[-\log f(X)] = h(X) \quad \text{in probability}.
\]

Definition. For $\epsilon > 0$ and any $n$, we define the typical set $A_\epsilon^{(n)}$ with respect to $f(x)$ as follows:
\[
A_\epsilon^{(n)} = \left\{ (x_1, x_2, \ldots, x_n) \in S^n : \left| -\frac{1}{n} \log f(x_1, x_2, \ldots, x_n) - h(X) \right| \le \epsilon \right\},
\]
where $f(x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} f(x_i)$.

Definition. The volume $\operatorname{Vol}(A)$ of a set $A \subset \mathbb{R}^n$ is defined as
\[
\operatorname{Vol}(A) = \int_A dx_1\, dx_2 \cdots dx_n.
\]

Theorem. The typical set $A_\epsilon^{(n)}$ has the following properties:

1. $\Pr(A_\epsilon^{(n)}) > 1 - \epsilon$ for $n$ sufficiently large.
2. $\operatorname{Vol}(A_\epsilon^{(n)}) \le 2^{n(h(X) + \epsilon)}$ for all $n$.
3. $\operatorname{Vol}(A_\epsilon^{(n)}) \ge (1 - \epsilon)\, 2^{n(h(X) - \epsilon)}$ for $n$ sufficiently large.

Theorem. The set $A_\epsilon^{(n)}$ is the smallest-volume set with probability $\ge 1 - \epsilon$, to first order in the exponent.
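Two standard worked examples (they follow Cover & Thomas, Ch. 8, and are not in the notes above) make the definition in Section 1.1 concrete. For $X$ uniform on $[0, a]$, $f(x) = 1/a$ on the support, so
\[
h(X) = -\int_0^a \frac{1}{a} \log \frac{1}{a}\, dx = \log a,
\]
which, unlike discrete entropy, can be negative (whenever $a < 1$). Similarly, a direct calculation for $X \sim \mathcal{N}(0, \sigma^2)$ gives $h(X) = \frac{1}{2} \log(2\pi e \sigma^2)$.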
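The AEP theorem above can also be checked numerically. The following is a minimal Python sketch, not part of the notes: it assumes $X_i \sim \mathcal{N}(0, 1)$ so that the limit $h(X) = \frac{1}{2}\log_2(2\pi e)$ is known in closed form, and takes all logarithms base 2 (entropies in bits), consistent with the $2^{n(h \pm \epsilon)}$ volume bounds.

import numpy as np

rng = np.random.default_rng(0)  # fixed seed, illustrative choice
sigma = 1.0

# Closed-form differential entropy of N(0, sigma^2), in bits.
h_true = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)

def log2_density(x):
    # log2 of the N(0, sigma^2) density, evaluated pointwise.
    return -0.5 * np.log2(2 * np.pi * sigma**2) - x**2 / (2 * sigma**2 * np.log(2))

for n in (10, 100, 10_000):
    x = rng.normal(0.0, sigma, size=n)
    # Normalized log-likelihood -(1/n) log2 f(X_1, ..., X_n); by the AEP
    # this converges to h(X) in probability as n grows.
    h_hat = -np.mean(log2_density(x))
    print(f"n = {n:6d}:  -(1/n) log2 f = {h_hat:.4f}   h(X) = {h_true:.4f}")

Property 2 of the typical-set theorem is the geometric face of the same fact: on $A_\epsilon^{(n)}$ the density is pinned near $2^{-nh(X)}$, which forces the volume of the set to be roughly $2^{nh(X)}$.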

1.3 Relation of Differential Entropy to Discrete Entropy

Assume the density $f(x)$ is continuous, and divide the range of $X$ into bins of length $\Delta$. Then, by the mean value theorem, there exists a point $x_i$ in each bin such that
\[
f(x_i)\, \Delta = \int_{i\Delta}^{(i+1)\Delta} f(x)\, dx.
\]
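Carrying this quantization argument through (as in Cover & Thomas, Theorem 8.3.1) gives $H(X^\Delta) + \log \Delta \to h(X)$ as $\Delta \to 0$, where $X^\Delta$ denotes the quantized variable. The Python sketch below, not part of the notes, checks this for a standard Gaussian; the truncation range $[-10, 10]$ and the use of SciPy's normal CDF are illustrative choices.

import numpy as np
from scipy.stats import norm

# h(X) for X ~ N(0, 1), in bits.
h_true = 0.5 * np.log2(2 * np.pi * np.e)

for delta in (1.0, 0.1, 0.01):
    # Bin (-10, 10), which carries essentially all the probability mass,
    # into intervals of width delta and compute the bin probabilities.
    edges = np.arange(-10.0, 10.0 + delta, delta)
    p = np.diff(norm.cdf(edges))
    p = p[p > 0]
    # Discrete entropy of the quantized variable X^Delta, in bits.
    H = -np.sum(p * np.log2(p))
    print(f"delta = {delta:5.2f}:  H(X^Delta) + log2(delta) = {H + np.log2(delta):.4f}"
          f"   h(X) = {h_true:.4f}")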