Homework Set #3
ECE 563, Fall 2011
University of Illinois, Urbana-Champaign
Issued: September 22nd, 2011
Due: October 4th, 2011

1. Random walk in a cube.
A bird flies from room to room in a 3 × 3 × 3 cube (equally likely through each interior wall). What is the entropy rate?

2. Entropy of graphs.
Consider a random walk on a (connected) graph with 3 edges.
(a) Which graph has the lowest entropy rate? What is the rate?
(b) Which has the highest entropy rate?

Problem 1: Problem 3.13 in your text, page 68.
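For problems like 1 and 2, a hand-derived entropy rate can be sanity-checked numerically. Below is a minimal sketch (assuming the intended model is the standard simple random walk on the 3 × 3 × 3 grid graph, stepping uniformly through interior walls), using the random-walk-on-a-graph facts that the stationary distribution is proportional to degree and each step is uniform over neighbors:

```python
from itertools import product
from math import log2

# Treat the 27 rooms as nodes of a 3x3x3 grid graph, with an edge through
# every interior wall. For a random walk on an undirected graph the
# stationary distribution is mu_v = d_v / (2E), and each step is uniform
# over the d_v neighbors, so the entropy rate is sum_v mu_v * log2(d_v).
rooms = list(product(range(3), repeat=3))

def degree(r):
    # number of interior walls = axis-aligned neighbors inside the cube
    return sum(1 for axis in range(3) for step in (-1, 1)
               if 0 <= r[axis] + step < 3)

two_E = sum(degree(r) for r in rooms)   # twice the number of interior walls
H = sum(degree(r) / two_E * log2(degree(r)) for r in rooms)
print(f"2E = {two_E}, entropy rate ≈ {H:.4f} bits/step")
```

The degree counts (corners 3, edge rooms 4, face centers 5, center 6) are easy to verify by hand, which makes this a useful cross-check on an analytic answer.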
3. Stationary processes.
Let ..., X_{-1}, X_0, X_1, ... be a stationary (not necessarily Markov) stochastic process. Which of the following statements are true? Prove or provide a counterexample.
(a) H(X_n | X_0) = H(X_{-n} | X_0).
(b) H(X_n | X_0) ≥ H(X_{n-1} | X_0).
(c) H(X_n | X_1, ..., X_{n-1}, X_{n+1}) is nonincreasing in n.
(d) H(X_{n+1} | X_1, ..., X_n, X_{n+2}, ..., X_{2n+1}) is nonincreasing in n.
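Before attempting a proof, statements like (a) can be probed numerically on a concrete stationary process. A sketch, using an arbitrarily chosen two-state stationary Markov chain (an assumption purely for illustration; the problem concerns general stationary processes):

```python
import numpy as np

# Check statement (a) on one concrete stationary process.
# Forward:  Pr{X_n  = j | X_0 = i} = (P^n)_ij.
# Backward, via Bayes' rule and stationarity:
#           Pr{X_-n = j | X_0 = i} = mu_j * (P^n)_ji / mu_i.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
mu = np.array([0.75, 0.25])            # stationary: mu @ P == mu

def cond_entropy(rows):
    # H(target | condition) = sum_i mu_i * H(rows[i])
    return float(sum(m * -np.sum(r[r > 0] * np.log2(r[r > 0]))
                     for m, r in zip(mu, rows)))

n = 5
Pn = np.linalg.matrix_power(P, n)
B = (mu[None, :] * Pn.T) / mu[:, None]  # backward n-step transition matrix

H_fwd = cond_entropy(Pn)   # H(X_n  | X_0)
H_bwd = cond_entropy(B)    # H(X_-n | X_0)
print(f"H(X_n|X_0) = {H_fwd:.6f}, H(X_-n|X_0) = {H_bwd:.6f}")
```

A numerical check of course proves nothing by itself, but agreement (or a counterexample) tells you which of (a)-(d) are worth trying to prove.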
4. Entropy rate.
Let {X_i} be a stationary {0, 1}-valued stochastic process obeying

    X_{k+1} = X_k ⊕ X_{k-1} ⊕ Z_{k+1},

where {Z_i} is Bernoulli(p) and ⊕ denotes mod 2 addition. What is the entropy rate H(X)?
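Whatever closed form you derive for problem 4, a simulation gives a target to check it against. A sketch (the choices p = 0.2, the sample size, and the uniform initialization are all arbitrary assumptions for illustration):

```python
import random
from collections import Counter
from math import log2

# Simulate X_{k+1} = X_k xor X_{k-1} xor Z_{k+1} and estimate the
# empirical conditional entropy H(X_{k+1} | X_{k-1}, X_k), which for a
# stationary process of this form approximates the entropy rate.
random.seed(0)
p, n = 0.2, 300_000
x_prev, x_cur = random.randint(0, 1), random.randint(0, 1)
counts = Counter()
for _ in range(n):
    z = 1 if random.random() < p else 0
    x_next = x_cur ^ x_prev ^ z
    counts[(x_prev, x_cur, x_next)] += 1
    x_prev, x_cur = x_cur, x_next

H = 0.0
for (a, b) in {(k[0], k[1]) for k in counts}:
    ctx = counts[(a, b, 0)] + counts[(a, b, 1)]   # visits to context (a, b)
    for c in (0, 1):
        q = counts[(a, b, c)] / ctx
        if q > 0:
            H += (ctx / n) * (-q * log2(q))
print(f"empirical H(X_k+1 | X_k-1, X_k) ≈ {H:.4f} bits")
```

A natural comparison point is the binary entropy of p, since conditioned on the two previous symbols all remaining uncertainty in the recursion comes from Z.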
5. The past has little to say about the future.
For a stationary stochastic process X_1, X_2, ..., show that

    lim_{n→∞} (1/(2n)) I(X_1, X_2, ..., X_n ; X_{n+1}, X_{n+2}, ..., X_{2n}) = 0.
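One source of intuition for problem 5: for a stationary *Markov* chain (a special case, not the general setting of the problem), the Markov property collapses the mutual information between the two halves to I(X_n; X_{n+1}), a constant in n, so dividing by 2n forces the limit to zero. A small numerical illustration with an arbitrary two-state chain:

```python
from math import log2

# For a stationary Markov chain, I(X_1..X_n ; X_{n+1}..X_2n) reduces to
# I(X_n ; X_{n+1}) = sum_{i,j} mu_i P_ij log2(P_ij / mu_j), which does
# not grow with n. Chain values below are arbitrary.
P = [[0.9, 0.1], [0.3, 0.7]]
mu = [0.75, 0.25]                      # stationary distribution of P

I_pair = sum(mu[i] * P[i][j] * log2(P[i][j] / mu[j])
             for i in (0, 1) for j in (0, 1))
for n in (1, 10, 100, 1000):
    print(f"n = {n:4d}: I/(2n) = {I_pair / (2 * n):.6f}")
```

The general statement needs an argument that works without the Markov property (the chain rule and the existence of the entropy rate are the usual ingredients).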
Problem 5: Problem 4.1 in the text.

Problem 6: Problem 4.12 in the text.

8. Problem 5.30 (Relative entropy is cost of miscoding) on page 151 of text.

9. Problem 5.32 (Bad wine) on page 153 of text.

10. Problem 4.6 (Monotonicity of entropy per element) on page 91 of text.

11. Problem 4.11 (Stationary processes) on page 93 of text.

12. Consider the stationary Markov chain with probability transition matrix

        [ 1/4  1/4  1/2 ]
    P = [ 3/8  3/8  1/4 ]
        [ 3/8  3/8  1/4 ]

where P_ij = Pr{X_2 = j | X_1 = i}.
(a) Find the stationary probability distribution µ.
(b) Find the entropy rate of this process.
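Hand computations for problem 12 can be cross-checked with a few lines of linear algebra. A sketch: find µ as the left eigenvector of P for eigenvalue 1, then apply the standard Markov-chain entropy-rate formula H = Σ_i µ_i H(P_i·):

```python
import numpy as np

# (a) mu is the left eigenvector of P for eigenvalue 1, normalized to a
#     probability vector.
# (b) entropy rate of a stationary Markov chain: sum_i mu_i * H(row i).
P = np.array([[1/4, 1/4, 1/2],
              [3/8, 3/8, 1/4],
              [3/8, 3/8, 1/4]])

eigvals, eigvecs = np.linalg.eig(P.T)
mu = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
mu = mu / mu.sum()                      # normalize (also fixes the sign)

row_entropy = -np.sum(P * np.log2(P), axis=1)   # all entries of P are > 0
H = float(mu @ row_entropy)
print("mu =", np.round(mu, 6))
print(f"entropy rate = {H:.4f} bits")
```

If the eigenvector route feels heavy, note that µ can also be read off by symmetry here (the last two rows of P are identical), which is a good consistency check on the numerical output.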
13. Consider a source that produces an i.i.d. Bernoulli(θ) sequence of random variables.
(a) Write a program (in your favorite language) to compute the arithmetic code corresponding to such a sequence of arbitrary length n.
(b) Write a program to compute the Ziv-Lempel code for a sequence of arbitrary length n. (This should work for any binary sequence of length n.)
(c) Now generate Bernoulli(0.1) sequences of various lengths on the computer and compare the lengths of the corresponding arithmetic and Ziv-Lempel codes. For large n (I'm not sure how large, since I haven't written the programs myself!), the two codes should have approximately the same length.
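For a first pass at part (c)'s comparison, code *lengths* can be estimated without building full encoders. The helpers below (`arithmetic_code_length` and `lz78_phrases` are hypothetical names, crude stand-ins rather than the assigned implementations) use the ideal arithmetic-code length −log2 Pr(x^n) + 2 bits under the i.i.d. Bernoulli(θ) model, and an LZ78-style incremental parsing whose cost is roughly c(log2 c + 1) bits for c phrases:

```python
import random
from math import ceil, log2

def arithmetic_code_length(bits, theta):
    # Ideal arithmetic-code length: ceil(-log2 Pr(x^n)) + 2 bits.
    k = sum(bits)
    return ceil(-(k * log2(theta) + (len(bits) - k) * log2(1 - theta))) + 2

def lz78_phrases(bits):
    # LZ78 incremental parsing: split the sequence into the shortest
    # phrases not seen before; return the phrase count c.
    seen, phrase, count = set(), (), 0
    for b in bits:
        phrase += (b,)
        if phrase not in seen:          # end of a new phrase
            seen.add(phrase)
            count += 1
            phrase = ()
    return count + (1 if phrase else 0)

random.seed(0)
n = 10_000
seq = [1 if random.random() < 0.1 else 0 for _ in range(n)]
c = lz78_phrases(seq)
print("arithmetic bits:", arithmetic_code_length(seq, 0.1))
print("LZ78 bits (approx):", ceil(c * (log2(c) + 1)))
```

The classic worked example is a handy unit test for the parser: 1011010100010 parses into the 7 phrases 1, 0, 11, 01, 010, 00, 10.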