HW2_ES250

Harvard SEAS ES250 – Information Theory, Homework 2


Harvard SEAS ES250 – Information Theory
Homework 2 (Due Date: Oct. 16, 2007)

1. An n-dimensional rectangular box with sides X_1, X_2, ..., X_n is to be constructed. The volume is V_n = prod_{i=1}^{n} X_i. The edge length l of an n-cube with the same volume as the random box is l = V_n^(1/n). Let X_1, X_2, ... be i.i.d. uniform random variables over the interval [0, a]. Find lim_{n→∞} V_n^(1/n), and compare it to (E V_n)^(1/n). Clearly the expected edge length does not capture the idea of the volume of the box.

2. Let X_1, X_2, ... be drawn i.i.d. according to the following distribution:

       X_i = 1 with probability 1/2,
             2 with probability 1/4,
             3 with probability 1/4.

   Find the limiting behavior of the product (X_1 X_2 ··· X_n)^(1/n).

3. Let X_1, X_2, ... be an i.i.d. sequence of discrete random variables with entropy H(X). Let

       C_n(t) = { x^n ∈ X^n : p(x^n) ≥ 2^(-nt) }

   denote the subset of n-sequences with probabilities ≥ 2^(-nt).

   (a) Show that |C_n(t)| ≤ 2^(nt).
   (b) For what values of t does P({X^n ∈ C_n(t)}) → 1?

4. Let X_1, X_2, ... be independent, identically distributed random variables drawn according to the probability mass function p(x), x ∈ {1, 2, ..., m}. Thus, p(x_1, x_2, ..., x_n) = prod_{i=1}^{n} p(x_i). We know that -(1/n) log p(X_1, X_2, ..., X_n) → H(X) ...
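The gap in Problem 1 can be seen numerically before proving it. Below is a minimal simulation sketch, not part of the assignment; it assumes NumPy and fixes a = 1 for concreteness. By the strong law of large numbers, (1/n) Σ ln X_i → E[ln X] = ln a − 1, so V_n^(1/n) → a/e ≈ 0.368a, whereas independence gives E V_n = (a/2)^n and hence (E V_n)^(1/n) = a/2.

```python
import numpy as np

rng = np.random.default_rng(0)
a = 1.0          # the interval [0, a]; a = 1 chosen for illustration
n = 100_000      # number of sides, i.e. the dimension of the box

# i.i.d. Uniform[0, a] side lengths
x = rng.uniform(0.0, a, size=n)

# V_n^(1/n) = exp((1/n) * sum(ln X_i)); by the SLLN the exponent
# converges to E[ln X] = ln(a) - 1, so the edge length tends to a/e.
edge = np.exp(np.log(x).mean())

# (E V_n)^(1/n) = a/2, since E X_i = a/2 and the sides are independent.
print(edge, a / np.e, a / 2)
```

The simulated edge length lands near a/e ≈ 0.368, strictly below a/2: the geometric mean of the sides, not their arithmetic mean, governs the volume.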
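Problem 2 yields to the same logarithm trick. The sketch below (NumPy assumed, not part of the assignment) checks that (X_1 ··· X_n)^(1/n) = 2^((1/n) Σ log2 X_i) converges to 2^(E[log2 X]) = 2^(1/4) · 3^(1/4) = 6^(1/4) ≈ 1.565.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# X_i = 1 w.p. 1/2, 2 w.p. 1/4, 3 w.p. 1/4
x = rng.choice([1, 2, 3], size=n, p=[0.5, 0.25, 0.25])

# (X_1 ... X_n)^(1/n) = 2^((1/n) * sum(log2 X_i)); by the SLLN the
# exponent converges to E[log2 X] = (1/2)*0 + (1/4)*1 + (1/4)*log2(3),
# so the product converges to 2^(1/4) * 3^(1/4) = 6^(1/4).
geo_mean = 2.0 ** np.log2(x).mean()
print(geo_mean, 6 ** 0.25)
```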

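The counting bound in Problem 3(a) follows from 1 ≥ Σ_{x^n ∈ C_n(t)} p(x^n) ≥ |C_n(t)| · 2^(-nt). As a sanity check, the sketch below brute-forces a hypothetical Bernoulli(0.3) source with n = 12 and t = 1 (all parameters chosen for illustration; the problem itself is for a general discrete source).

```python
import itertools
import math

p = 0.3                                        # hypothetical Bernoulli source
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # entropy ≈ 0.881 bits
n = 12
t = 1.0                                        # a value of t above H

count = 0      # |C_n(t)|
prob = 0.0     # P(X^n in C_n(t))
for xs in itertools.product([0, 1], repeat=n):
    k = sum(xs)
    px = (p ** k) * ((1 - p) ** (n - k))
    if px >= 2 ** (-n * t):
        count += 1
        prob += px

# Part (a): since each sequence in C_n(t) carries probability at least
# 2^(-nt) and total probability is at most 1, |C_n(t)| <= 2^(nt).
assert count <= 2 ** (n * t)
print(count, 2 ** (n * t), prob)
```

For part (b), the AEP suggests the answer: whenever t > H the typical sequences (probability near 2^(-nH)) eventually all satisfy p(x^n) ≥ 2^(-nt), so the captured probability tends to 1 as n grows; at n = 12 with t = 1 > H ≈ 0.881 the check above already captures most of the mass.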
This note was uploaded on 12/01/2010 for the course ADLAC 1023 at Stanford.
