EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #26 Thursday, June 1, 2006
Solutions to Practice Final. Each problem is worth 20 points.
1. Kolmogorov complexity. What is the Kolmogorov complexity (to first order) of a squa...
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #19 Tuesday, May 30, 2006
Solutions to Homework Set #7
1. Images. Consider an n × n array x of 0's and 1's. Thus x has n² bits.
(a) Find the Kolmogorov complexity K(x | n) (to ...
(b) ...
(c) ...
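Since K(x | n) is not computable, answers to problems like this one come from exhibiting explicit descriptions. A minimal sketch (not from the handout; the encoding scheme is my own choice for illustration) comparing the trivial n² + c bit description of an arbitrary image with a run-length description that is much shorter for highly regular images:

```python
# Hedged sketch: upper bounds on K(x | n) via explicit descriptions.
# The program "print these n^2 bits" gives K(x | n) <= n^2 + c for any
# image; a structured image admits a far shorter description.

def literal_description_bits(n):
    # Listing the pixels verbatim: n^2 bits plus O(1) program overhead.
    return n * n

def run_length_description_bits(image):
    # One possible (not optimal) description: each maximal run encoded
    # as (bit, length), with each length written in a self-delimiting
    # 2*ceil(log2(L+1)) + 1 bit form.
    import itertools, math
    flat = [b for row in image for b in row]
    bits = 0
    for _, run in itertools.groupby(flat):
        L = len(list(run))
        bits += 1 + 2 * max(1, math.ceil(math.log2(L + 1)))
    return bits

n = 32
all_zero = [[0] * n for _ in range(n)]
print(literal_description_bits(n))            # 1024 bits, valid for any image
print(run_length_description_bits(all_zero))  # a single run: far fewer bits
```

Either count, plus a constant for the decoding program, is a legitimate upper bound on K(x | n); the point is that regular images have short programs.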
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #16 Thursday, May 18, 2006 Due Thursday, May 25, 2006
Homework Set #7
1. Images. Consider an n × n array x of 0's and 1's. Thus x has n² bits.
(a) Find the Kolmogorov complexit...
(b) ...
(c) ...
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #20 Thursday, May 25, 2006 Due Thursday, June 1, 2006
Homework Set #8
1. Universal data compression. Consider three possible source distributions on X, Pa = (0.7, 0.2, 0.1), Pb = (0.1, 0...
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #27 Tuesday, June 6, 2006
Solutions to Homework Set #8
1. Universal data compression. Consider three possible source distributions on X, Pa = (0.7, 0.2, 0.1), Pb = (0.1, 0.7, 0.2), Pc = (...
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #24 Thursday, June 1, 2006 Due Tuesday, June 6, 2006
Homework Set #9
1. Horse race. Three horses run a race. A gambler offers 3-for-1 odds on each horse. These are fair odds under the assu...
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #29 Thursday, June 8, 2006
Solutions to Homework Set #9
1. Horse race. Three horses run a race. A gambler offers 3-for-1 odds on each horse. These are fair odds under the assumption that al...
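For a race with odds o-for-1 on every horse, proportional (Kelly) betting b = p achieves the optimal doubling rate W = Σ pᵢ log₂(o·pᵢ) = log₂ o − H(p). A minimal sketch (the win probabilities below are hypothetical, since the handout's are truncated):

```python
import math

def doubling_rate(p, odds=3.0):
    # Optimal proportional betting b = p on odds-for-1 payoffs gives
    # W = sum_i p_i log2(odds * p_i) = log2(odds) - H(p).
    return sum(pi * math.log2(odds * pi) for pi in p if pi > 0)

p = (0.5, 0.25, 0.25)       # hypothetical win probabilities, not from the handout
print(doubling_rate(p))     # log2(3) - 1.5 ≈ 0.085 bits per race

# Under uniform probabilities the 3-for-1 odds are fair and W = 0.
print(doubling_rate((1/3, 1/3, 1/3)))
```

The second line illustrates the "fair odds" remark in the problem: with p uniform, W = log₂ 3 − log₂ 3 = 0, so no betting strategy grows wealth.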
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #17 Tuesday, May 23, 2006
Solutions to Homework Set #6
1. Slepian-Wolf for deterministically related sources. Find and sketch the Slepian-Wolf rate region for the simultaneous data compres...
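When Y is a deterministic function of X, H(Y | X) = 0, so the Slepian-Wolf region reduces to R₁ ≥ H(X | Y), R₂ ≥ 0, R₁ + R₂ ≥ H(X, Y) = H(X). A sketch of the corner points for one illustrative distribution (chosen here, not taken from the handout):

```python
import math
from collections import Counter

# Illustrative example: X uniform on {0,1,2,3}, Y = X mod 2 a
# deterministic function of X, so H(Y|X) = 0 and the Slepian-Wolf
# region is R1 >= H(X|Y), R2 >= 0, R1 + R2 >= H(X,Y) = H(X).
px = {x: 0.25 for x in range(4)}
H_X = -sum(p * math.log2(p) for p in px.values())   # 2 bits

py = Counter()
for x, p in px.items():
    py[x % 2] += p
H_Y = -sum(p * math.log2(p) for p in py.values())   # 1 bit

H_XY = H_X                 # Y = f(X), so the joint entropy equals H(X)
H_X_given_Y = H_XY - H_Y   # 1 bit
H_Y_given_X = H_XY - H_X   # 0 bits

print(H_X, H_Y, H_X_given_Y, H_Y_given_X)
```

The region is therefore the quadrant above the single corner (R₁, R₂) = (H(X | Y), H(Y)) joined to the axis point (H(X), 0) by the line R₁ + R₂ = H(X).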
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #15 Tuesday, May 16, 2006
Solutions to Homework Set #5
1. Gaussian multiple access. A group of m users, each with power P, is using a Gaussian multiple access channel at capacity, so that...
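The key quantity in this problem is the sum-rate constraint of the Gaussian MAC, Σ Rᵢ ≤ ½ log₂(1 + mP/N): the total rate grows only logarithmically in m, so the per-user rate shrinks. A minimal numeric sketch (the powers below are assumed for illustration; the handout's numbers are truncated):

```python
import math

def mac_sum_capacity(m, P, N=1.0):
    # Sum-rate bound of the m-user Gaussian MAC:
    # sum of rates <= (1/2) log2(1 + m*P/N) bits per transmission.
    return 0.5 * math.log2(1 + m * P / N)

P = 1.0   # assumed per-user power (illustrative)
for m in (1, 10, 100):
    total = mac_sum_capacity(m, P)
    print(m, total, total / m)   # per-user share shrinks as m grows
```

With P = N the single-user capacity is ½ log₂ 2 = 0.5 bit, while at m = 100 each user's equal share of the sum rate is under 0.04 bit per transmission.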
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #14 Thursday, May 11, 2006 Due Thursday, May 18, 2006
Homework Set #6
1. Slepian-Wolf for deterministically related sources. Find and sketch the Slepian-Wolf rate region for the simultaneo...
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #6 Thursday, April 13, 2006 Due Thursday, April 20, 2006
Homework Set #2
1. Maximum entropy with marginals. What is the maximum entropy probability mass function p(x, y) with the followin
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #3 Thursday, April 6, 2006 Due Thursday, April 13, 2006
Homework Set #1
1. Monotonicity of entropy per element. For a stationary stochastic process X1, X2, ..., Xn, show that H(X1,...
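The claim being asked for is that H(X₁, …, Xₙ)/n is nonincreasing in n for a stationary process. A numeric sanity check (the chain below is my own illustrative choice, not from the handout) using a stationary binary Markov chain, where H(X₁, …, Xₙ) = H(X₁) + (n−1) H(X₂ | X₁):

```python
import math

def h2(p):
    # Binary entropy in bits.
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Illustrative stationary chain: symmetric binary Markov, flip prob a,
# stationary distribution (1/2, 1/2).  For a first-order Markov chain
# H(X1,...,Xn) = H(X1) + (n-1) H(X2|X1), so the per-element entropy
# decreases from H(X1) = 1 toward the entropy rate H(X2|X1) = h2(a).
a = 0.1
H1 = 1.0        # H(X1) under the uniform stationary distribution
Hc = h2(a)      # H(X2|X1) ≈ 0.469 bits

per_element = [(H1 + (n - 1) * Hc) / n for n in range(1, 6)]
print(per_element)   # strictly decreasing toward h2(0.1)
```

Of course the homework asks for a proof, not a check; the standard argument combines stationarity with the chain rule and H(Xₙ | X₁, …, Xₙ₋₁) ≤ H(Xₙ₋₁ | X₁, …, Xₙ₋₂).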
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #8 Thursday, April 20, 2006 Due Thursday, April 27, 2006
Homework Set #3
1. Maximum entropy discrete processes. (a) Find the maximum entropy rate binary stochastic process {Xi}, Xi ∈ {0,...
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #10 Thursday, April 27, 2006
Solutions to Homework Set #2
1. Maximum entropy with marginals. What is the maximum entropy probability mass function p(x, y) with the following marginals? Yo
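The principle behind this problem is that subject to fixed marginals, H(X, Y) ≤ H(X) + H(Y) with equality iff X and Y are independent, so the product distribution p(x)p(y) is the maximizer. A sketch with hypothetical marginals (the handout's table is truncated):

```python
import math

def H(dist):
    # Shannon entropy in bits of a flat list of probabilities.
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Hypothetical marginals, chosen for illustration:
p = (0.5, 0.5)   # marginal of X
q = (0.5, 0.5)   # marginal of Y

# The product p(x)p(y) attains the maximum H(X,Y) = H(X) + H(Y);
# any correlated joint with the same marginals has smaller entropy.
product = [pi * qj for pi in p for qj in q]
correlated = (0.4, 0.1, 0.1, 0.4)   # same uniform marginals, but correlated

print(H(product))     # 2.0 bits = H(X) + H(Y)
print(H(correlated))  # < 2.0 bits
```

Both joints have uniform marginals, but only the product distribution reaches H(X) + H(Y) = 2 bits.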
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #9 Thursday, April 27, 2006 Due Thursday, May 4, 2006
Homework Set #4
1. Multiple layer waterfilling. Let C(x) = (1/2) log(1 + x) denote the channel capacity of a Gaussian channel with signal...
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #11 Thursday, April 27, 2006
Solutions to Homework Set #3
1. Maximum entropy discrete processes. (a) Find the maximum entropy rate binary stochastic process {Xi}, Xi ∈ {0, 1}, i = ..., satisf...
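The constraint in the problem statement is truncated, so the sketch below only illustrates the unconstrained case: among symmetric binary Markov chains with flip probability a, the entropy rate is h(a), maximized at a = 1/2, i.e. by the i.i.d. Bernoulli(1/2) process at 1 bit/symbol.

```python
import math

def h2(p):
    # Binary entropy in bits.
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Entropy rate of a symmetric binary Markov chain with flip prob a is
# h2(a); scanning a few values shows the maximum at a = 1/2, which makes
# the process i.i.d. Bernoulli(1/2) with entropy rate 1 bit/symbol.
rates = {a: h2(a) for a in (0.1, 0.3, 0.5, 0.7, 0.9)}
best = max(rates, key=rates.get)
print(best, rates[best])
```

With a nontrivial constraint (as in the actual homework), the maximizer is instead the Markov chain that meets the constraint with maximal h(·), found via Lagrange multipliers.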
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #13 Tuesday, May 9, 2006
Solutions to Homework Set #4
1. Multiple layer waterfilling. Let C(x) = (1/2) log(1 + x) denote the channel capacity of a Gaussian channel with signal to noise ratio...
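The identity behind layered (successive-decoding) schemes is C(S₁/(S₂+N)) + C(S₂/N) = C((S₁+S₂)/N): decode the first layer treating the second as noise, strip it, then decode the second layer cleanly, for the same total as one combined signal. A quick numeric check (powers chosen for illustration, not from the handout):

```python
import math

def C(x):
    # C(x) = (1/2) log2(1 + x): Gaussian channel capacity at SNR x.
    return 0.5 * math.log2(1 + x)

S1, S2, N = 3.0, 2.0, 1.0   # example layer powers and noise (illustrative)

# Layer 1 decoded against noise S2 + N, then layer 2 against N alone:
layered = C(S1 / (S2 + N)) + C(S2 / N)
joint = C((S1 + S2) / N)

print(layered, joint)   # equal up to floating point
```

The identity is just the telescoping of logs: ½ log((S₁+S₂+N)/(S₂+N)) + ½ log((S₂+N)/N) = ½ log((S₁+S₂+N)/N), which is what makes the multi-layer waterfilling argument go through.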
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #12 Thursday, May 4, 2006 Due Thursday, May 11, 2006
Homework Set #5
1. Gaussian multiple access. A group of m users, each with power P, is using a Gaussian multiple access channel at cap...
Preface
Books are individual and idiosyncratic. In trying to understand what makes a good book, there is a limited amount that one can learn from other books; but at least one can read their prefaces, in hope of help. Our own research shows that authors u