# hw3sol - EE 376A/Stat 376A Handout #13 Information Theory...


EE 376A/Stat 376A Handout #13
Information Theory
Tuesday, February 3, 2011
Prof. T. Cover

Solutions to Homework Set #3

**1. Random walk in a cube.** A bird flies from room to room in a 3×3×3 cube (equally likely through each interior wall). What is the entropy rate?

**Solution: Random walk in a cube.** The entropy rate of a random walk on a graph with equal edge weights is given by equation 4.41 in the text:

$$H(\mathcal{X}) = \log(2E) - H\!\left(\frac{E_1}{2E}, \ldots, \frac{E_m}{2E}\right),$$

where $E_i$ is the number of edges incident to node $i$ and $E$ is the total number of edges. The cube has 8 corner rooms, 12 edge rooms, 6 face rooms, and 1 center room. Corner rooms have 3 edges, edge rooms have 4, face rooms have 5, and the center room has 6. The total degree is $8 \cdot 3 + 12 \cdot 4 + 6 \cdot 5 + 1 \cdot 6 = 108 = 2E$, so $E = 54$. Therefore,

$$H(\mathcal{X}) = \log 108 + 8\left(\frac{3}{108}\log\frac{3}{108}\right) + 12\left(\frac{4}{108}\log\frac{4}{108}\right) + 6\left(\frac{5}{108}\log\frac{5}{108}\right) + 1\left(\frac{6}{108}\log\frac{6}{108}\right) = 2.03 \text{ bits.}$$

**2. Entropy of graphs.** Consider a random walk on a (connected) graph with 3 edges.

(a) Which graph has the lowest entropy rate? What is the rate?
(b) Which has the highest entropy rate?

**Solution: Entropy of graphs.** There are three choices for connected graphs with 3 edges (see Figure 1): the triangle (Graph 1), the star (Graph 2), and the path (Graph 3). The entropy rate is given by

$$H = -\sum_i \mu_i \sum_j P_{ij} \log P_{ij} = \sum_i \frac{W_i}{W} \log W_i,$$

where $W_i$ is the degree of node $i$, $W = \sum_i W_i = 2E = 6$, and the second equality uses the fact that each transition out of node $i$ has probability $1/W_i$.

Figure 1: Graphs with three edges (Graph 1: triangle; Graph 2: star; Graph 3: path).

(a) Graph 1: $\{W_i\} = \{2, 2, 2\}$, so $H = 3\left(\frac{2}{6}\log 2\right) = 1$ bit.
(b) Graph 2: $\{W_i\} = \{1, 1, 1, 3\}$, so $H = \frac{3}{6}\log 3 \approx 0.79$ bits.
(c) Graph 3: $\{W_i\} = \{1, 2, 2, 1\}$, so $H = 2\left(\frac{2}{6}\log 2\right) \approx 0.667$ bits.

Thus, Graph 1 has the highest entropy rate while Graph 3 has the lowest.

**3. Stationary processes.** Let $\ldots, X_{-1}, X_0, X_1, \ldots$ be a stationary (not necessarily Markov) stochastic process. Which of the following statements are true? Prove or provide a counterexample.

(a) $H(X_n \mid X_0) = H(X_{-n} \mid X_0)$.
(b) $H(X_n \mid X_0) \geq H(X_{n-1} \mid X_0)$.
(c) $H(X_n \mid X_1^{n-1}, X_{n+1})$ is nonincreasing in $n$.
(d) $H(X_{n+1} \mid X_1^n, X_{n+2}^{2n+1})$ is nonincreasing in $n$.

**Solution: Stationary processes.**

(a) $H(X_n \mid X_0) = H(X_{-n} \mid X_0)$.
This statement is true, since

$$H(X_n \mid X_0) = H(X_n, X_0) - H(X_0) \qquad (1)$$
$$H(X_{-n} \mid X_0) = H(X_{-n}, X_0) - H(X_0) \qquad (2)$$

and $H(X_n, X_0) = H(X_{-n}, X_0)$ by stationarity, since both are the joint entropy of two samples of the process spaced $n$ apart. (Note that $\Pr(X_n = a \mid X_0 = b) \neq \Pr(X_0 = a \mid X_n = b)$ in general.)

(b) $H(X_n \mid X_0) \geq H(X_{n-1} \mid X_0)$.

This statement is not true in general, though it is true for first-order Markov chains. A simple counterexample is a periodic process with period $N$. Let $X_0, X_1, X_2, \ldots, X_{N-1}$ be i.i.d. Bern$(\frac{1}{2})$ random variables and let $X_{mN+k} = X_k$ for $k = 0, \ldots, N-1$ and $m = \pm 1, \pm 2, \ldots$. Note that this is a stationary process. In this case, for $n = mN$, $H(X_n \mid X_0) = 0$ while $H(X_{n-1} \mid X_0) = H(X_{N-1}) = 1$ bit for $N \geq 2$, since $X_{N-1}$ is independent of $X_0$; hence the inequality fails.
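The entropy rates in Problems 1 and 2 can be sanity-checked numerically. Substituting $E_i/2E$ into the handout's equation 4.41 and simplifying gives $H = \sum_i \frac{E_i}{2E}\log E_i$, which depends only on the degree list. A minimal sketch (the function name `entropy_rate` and the variable names are mine, not from the handout):

```python
import math

def entropy_rate(degrees):
    """Entropy rate (bits) of a random walk on an undirected graph with
    equal edge weights, from the node degree list:
    H = sum_i (d_i / 2E) * log2(d_i), the simplified form of eq. 4.41."""
    total = sum(degrees)  # total degree = 2E
    return sum(d / total * math.log2(d) for d in degrees)

# Problem 1: 3x3x3 cube -- 8 corners (deg 3), 12 edges (deg 4),
# 6 faces (deg 5), 1 center (deg 6)
cube = [3] * 8 + [4] * 12 + [5] * 6 + [6] * 1
print(round(entropy_rate(cube), 2))              # → 2.03

# Problem 2: the three connected graphs with 3 edges
print(entropy_rate([2, 2, 2]))                   # triangle → 1.0
print(round(entropy_rate([1, 1, 1, 3]), 2))      # star → 0.79
print(round(entropy_rate([1, 2, 2, 1]), 3))      # path → 0.667
```

The ordering triangle > star > path matches the conclusion above: the more uniform the degrees (and the higher they are), the higher the entropy rate.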
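The counterexample in (b) can also be verified by exact enumeration: since the whole process is determined by the seed $(X_0, \ldots, X_{N-1})$, the joint distribution of $(X_0, X_n)$ follows from averaging over the $2^N$ equally likely seeds. A small sketch under those assumptions (the helper name is mine):

```python
import math
from itertools import product

def cond_entropy_Xn_given_X0(N, n):
    """H(X_n | X_0) in bits for the periodic process X_{mN+k} = X_k,
    where X_0..X_{N-1} are i.i.d. Bern(1/2). Enumerates all 2^N seeds
    to build the joint distribution of (X_0, X_n)."""
    joint = {}
    for seed in product([0, 1], repeat=N):
        pair = (seed[0], seed[n % N])       # periodicity: X_n = X_{n mod N}
        joint[pair] = joint.get(pair, 0) + 1 / 2**N
    # H(X_n | X_0) = H(X_0, X_n) - H(X_0), and H(X_0) = 1 bit.
    h_joint = -sum(p * math.log2(p) for p in joint.values())
    return h_joint - 1.0

print(cond_entropy_Xn_given_X0(N=2, n=2))   # n = mN: X_n = X_0, so H = 0.0
print(cond_entropy_Xn_given_X0(N=2, n=1))   # n = mN - 1: independent, H = 1.0
```

This exhibits $H(X_n \mid X_0) = 0 < 1 = H(X_{n-1} \mid X_0)$ at $n = mN$, contradicting statement (b).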

## This note was uploaded on 04/05/2011 for the course EE 5368 taught by Professor Staff during the Spring '08 term at UT Arlington.

