. . . the minimum value for H(X, Y, Z)?
ECE 563, Information Theory, Fall 2011
University of Illinois, Urbana-Champaign
HW #2
Posted: September 13th, 2011
Due: September 22nd, 2011

2. The value of a question.
Let X ~ p(x), x = 1, 2, . . . , m. We are given a set S ⊆ {1, 2, . . . , m}. We ask whether X ∈ S and receive the answer
Y = 1, if X ∈ S,
    0, if X ∉ S.
Suppose Pr{X ∈ S} = α. Find the decrease in uncertainty H(X) − H(X | Y). Apparently any set S with a given probability α is as good as any other.

Problem 1:

3. Random questions.
One wishes to identify a random object X ~ p(x). A question Q ~ r(q) is asked at random according to r(q). This results in a deterministic answer A = A(x, q) ∈ {a1, a2, . . .}. Suppose the object X and the question Q are independent. Then I(X; Q, A) is the uncertainty in X removed by the question-answer (Q, A).
(a) Show I(X; Q, A) = H(A | Q). Interpret.
(b) Now suppose that two i.i.d. questions Q1, Q2 ~ r(q) are asked, eliciting answers A1 and A2. Show that two questions are less valuable than twice the value of a single question in the sense that I(X; Q1, A1, Q2, A2) ≤ 2I(X; Q1, A1).

4. Bottleneck.
Suppose a (nonstationary) Markov chain starts in one of n states, necks down to k < n states, and then fans back to m > k states. Thus X1 → X2 → X3, X1 ∈ {1, 2, . . . , n}, X2 ∈ {1, 2, . . . , k}, X3 ∈ {1, 2, . . . , m}, and p(x1, x2, x3) = p(x1)p(x2 | x1)p(x3 | x2).
(a) Show that the dependence of X1 and X3 is limited by the bottleneck by proving that I(X1; X3) ≤ log k.
(b) Evaluate I(X1; X3) for k = 1, and conclude that for k = 1 no dependence can survive such a bottleneck.

5. Conditional mutual information.
Consider a sequence of n binary random variables X1, X2, . . . , Xn. Each sequence with an even number of 1's has probability 2^(−(n−1)) and each sequence with an odd number of 1's has probability 0. Find the mutual informations
I(X1; X2), I(X2; X3 | X1), . . . , I(Xn−1; Xn | X1, . . . , Xn−2).

6. Fano's inequality.
Let Pr(X = i) = pi, i = 1, 2, . . . , m, and let p1 ≥ p2 ≥ p3 ≥ · · · ≥ pm. The minimal probability of error predictor of X is X̂ = 1, with resulting probability of error Pe = 1 − p1. Maximize H(p) subject to the constraint 1 − p1 = Pe to find a bound on Pe in terms of H. This is Fano's inequality in the absence of conditioning.
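The bottleneck bound of problem 4(a) lends itself to a quick numerical sanity check. The sketch below builds a random chain X1 → X2 → X3 through a k-state bottleneck and computes I(X1; X3) directly; the state counts and all distributions are assumed for illustration, not taken from the text.

```python
import math
import random

def normalized(ws):
    s = sum(ws)
    return [w / s for w in ws]

def mutual_info_bits(joint, px, py):
    # I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) )
    return sum(pxy * math.log2(pxy / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, pxy in enumerate(row) if pxy > 0)

random.seed(0)
n, k, m = 6, 2, 5  # chain necks down from 6 states to k = 2, fans out to 5

p1 = normalized([random.random() for _ in range(n)])                        # p(x1)
p2_1 = [normalized([random.random() for _ in range(k)]) for _ in range(n)]  # p(x2|x1)
p3_2 = [normalized([random.random() for _ in range(m)]) for _ in range(k)]  # p(x3|x2)

# p(x1, x3) = sum_{x2} p(x1) p(x2|x1) p(x3|x2)
joint13 = [[sum(p1[a] * p2_1[a][b] * p3_2[b][c] for b in range(k))
            for c in range(m)] for a in range(n)]
p3 = [sum(joint13[a][c] for a in range(n)) for c in range(m)]

i13 = mutual_info_bits(joint13, p1, p3)
print(i13 <= math.log2(k) + 1e-9)  # True: I(X1;X3) is capped by log2 k = 1 bit
```

Rerunning with other seeds or other (n, k, m) never produces a violation, which is what the data-processing argument in part (a) predicts.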
Problem 2. 7. Random box size.
An n-dimensional rectangular box with sides X1, X2, X3, . . . , Xn is to be constructed. The volume is Vn = ∏_{i=1}^{n} Xi. The edge length l of an n-cube with the same volume as the random box is l = Vn^(1/n). Let X1, X2, . . . be i.i.d. uniform random variables over the interval [0, a].
Find lim_{n→∞} Vn^(1/n), and compare to (E Vn)^(1/n). Clearly the expected edge length does not capture the idea of the volume of the box.

Problem 3. 10. Entropy of a disjoint mixture.
Let X1 and X2 be discrete random variables drawn according to probability mass functions p1(·) and p2(·) over the respective alphabets X1 = {1, 2, . . . , m} and X2 = {m + 1, . . . , n}. Notice that these sets do not intersect. Let
X = X1, with probability α,
    X2, with probability 1 − α.
(a) Find H(X) in terms of H(X1), H(X2), and α.
(b) Maximize over α to show that 2^H(X) ≤ 2^H(X1) + 2^H(X2) and interpret using the notion that 2^H(X) is the effective alphabet size.
(c) Let X1 and X2 be uniformly distributed over their alphabets. What is the maximizing α and the associated H(X)?

11. Entropy of a random tree.
Consider the following method of generating a random tree with n nodes. First expand the root node: . . .

8. An AEP-like limit and the AEP.
Problem 4.
(a) Let X1, X2, . . . be i.i.d. drawn according to probability mass function p(x). Find
lim_{n→∞} [p(X1, X2, . . . , Xn)]^(1/n).
(b) Let X1, X2, . . . be drawn i.i.d. according to the following distribution:
Xi = 2, with probability 2/5,
     3, with probability 2/5,
     4, with probability 1/5.
Find the limiting behavior of the product
(X1 X2 · · · Xn)^(1/n).
(c) Evaluate the limit of [p(X1, X2, . . . , Xn)]^(1/n) for the distribution in part (b).

ECE 563, Fall 2010
September 7, 2010
HOMEWORK ASSIGNMENT 2
Reading: Sections 2.9-2.10 and Chapter 3 of Cover & Thomas
Due Date: September 16, 2010 (in class)
1. Problem 2.42 (Inequalities) on page 52 of text.
2. Problem 2.32 (Fano) on page 50 of text. (Note: This is similar to Kevin's example that we talked about in class.)
3. Problem 2.39 (Entropy and pairwise independence) on page 52 of text.
4. Problem 3.1 (Markov's Inequality and Chebyshev's Inequality) on page 64 of text.
5. Problem 3.6 (AEP-like limit) on page 66 of text.
6. Problem 3.7 (AEP and source coding) on page 66 of text.

Problem 5.
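Both the random box size and the AEP-like limits above turn on the strong law of large numbers applied to (1/n) Σ log Xi: the geometric mean of i.i.d. positive random variables settles at exp(E log X1). A minimal numerical sketch, assuming Xi uniform on (0, 1) (i.e., the box problem with a = 1) for concreteness:

```python
import math
import random

random.seed(1)
n = 200_000
xs = [random.random() for _ in range(n)]  # i.i.d. Uniform(0,1) edge lengths

# V_n^(1/n) = exp( (1/n) * sum_i log X_i ); compute in log space to avoid underflow
edge = math.exp(sum(math.log(x) for x in xs) / n)

# For comparison, (E V_n)^(1/n) = E[X_1] = 1/2 when the X_i are Uniform(0,1).
print(edge)  # close to exp(E[log X_1]) = exp(-1) ≈ 0.368, noticeably below 0.5
```

The gap between the two values is exactly the point of the box-size problem: the expected volume is dominated by rare large boxes, so (E Vn)^(1/n) overstates the typical edge length.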
7. Let random variables X1, X2, . . . , Xn be independent ~ p(x), x ∈ X. Let Nx denote the number of occurrences of symbol x in a given sequence x1, x2, . . . , xn. The empirical probability mass function is defined by
p̂n(x) = Nx/n, for x ∈ X.
(a) Show that
p(x1, x2, . . . , xn) = ∏_{x ∈ X} p(x)^Nx
and
−(1/n) log p(x1, x2, . . . , xn) = H(p̂n) + D(p̂n ‖ p).
(b) For a given x1, x2, . . . , xn, what is
max_p p(x1, x2, . . . , xn),
where the maximization is over all probability mass functions on X? What probability mass function achieves this maximum likelihood?
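The identity in part (a) can be sanity-checked numerically before proving it. A sketch with an assumed three-symbol pmf (the alphabet and probabilities below are illustrative only, not from the text):

```python
import math
import random
from collections import Counter

random.seed(2)
p = {"a": 0.5, "b": 0.3, "c": 0.2}  # assumed pmf, for illustration only
n = 1000
seq = random.choices(list(p), weights=list(p.values()), k=n)

counts = Counter(seq)
phat = {s: counts[s] / n for s in p}  # empirical pmf \hat{p}_n

# LHS: -(1/n) log2 p(x_1, ..., x_n) = -(1/n) sum_i log2 p(x_i)
lhs = -sum(math.log2(p[x]) for x in seq) / n

# RHS: H(\hat{p}_n) + D(\hat{p}_n || p)
entropy = -sum(q * math.log2(q) for q in phat.values() if q > 0)
divergence = sum(q * math.log2(q / p[s]) for s, q in phat.items() if q > 0)

print(abs(lhs - (entropy + divergence)) < 1e-9)  # True: the identity holds exactly
```

Note that the agreement is exact (up to floating point), not merely asymptotic: the identity holds for every fixed sequence, which is the content of part (a).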
8. In class we proved the following theorem:
Theorem (AEP Converse): Let X1, X2, . . . be i.i.d. with entropy H. For any sequence of sets B(n) ⊆ X^n, if lim_{n→∞} P(B(n)) = 1, then
lim inf_{n→∞} (1/n) log |B(n)| ≥ H.
Problem 6. 9. AEP.
Let (Xi, Yi) be i.i.d. ~ p(x, y). We form the log likelihood ratio of the hypothesis that X and Y are independent vs. the hypothesis that X and Y are dependent. What is the limit of
(1/n) log [ p(X^n) p(Y^n) / p(X^n, Y^n) ] ?
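Since the chain is i.i.d., the normalized log-likelihood ratio is an average of i.i.d. per-symbol terms, so the law of large numbers says it converges to the expectation of a single term. A Monte Carlo sketch with an assumed joint pmf (the pmf below is chosen for demonstration only, not taken from the text):

```python
import math
import random

random.seed(3)
# Assumed joint pmf on {0,1} x {0,1}, chosen only for illustration.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {0: 0.5, 1: 0.5}  # marginal of X
py = {0: 0.5, 1: 0.5}  # marginal of Y

n = 100_000
sample = random.choices(list(pxy), weights=list(pxy.values()), k=n)

# (1/n) log2 [ p(X^n) p(Y^n) / p(X^n, Y^n) ] decomposes into per-symbol terms
ratio = sum(math.log2(px[x] * py[y] / pxy[(x, y)]) for x, y in sample) / n

# Law of large numbers: the average settles near the expectation of one term
expected = sum(p * math.log2(px[x] * py[y] / p) for (x, y), p in pxy.items())
print(ratio, expected)  # the two values nearly coincide for large n
```

For this dependent pmf the limit is strictly negative, reflecting that the independence hypothesis becomes exponentially less likely as n grows.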
This note was uploaded on 10/24/2011 for the course ELECTRICAL ECE 571 taught by Professor Kelly during the Spring '11 term at University of Illinois, Urbana Champaign.