EE376B/Stat 376B
Information Theory
Prof. T. Cover
Handout #5
Tuesday, April 12, 2011
Solutions to Homework Set #1
1. Differential entropy. Evaluate the differential entropy h(X) = -∫ f ln f for the following:
(a) The Laplace density, f(x) = (1/2) e^{-|x|}.
EE 376B/Stat 376B
Information Theory
Prof. T. Cover
Handout #20
Saturday, June 5, 2011
Practice Final Exam 2
1. Entropy rate. (10)
Let {Z_n} be i.i.d. N(0, σ²). What is the differential entropy rate h(X) of the
stationary process
X_{n+1} = (1/2) X_n + (1/2) X_{n-1} + Z_{n+1}
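As a numerical sanity check (a sketch, not part of the handout; it assumes the garbled recursion reads X_{n+1} = (1/2)X_n + (1/2)X_{n-1} + Z_{n+1}): since X_{n+1} differs from a deterministic function of the past only by the independent innovation Z_{n+1}, the differential entropy rate is h = (1/2) ln(2πeσ²) nats. The script below recovers this by fitting the AR coefficients with least squares and plugging the residual variance into the Gaussian entropy formula.

```python
# Simulate X_{n+1} = 0.5 X_n + 0.5 X_{n-1} + Z_{n+1} with {Z_n} i.i.d. N(0, sigma^2),
# then estimate the entropy rate h = 0.5 * ln(2*pi*e*sigma^2) from the fitted
# innovation variance.
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.5
n = 200_000

z = rng.normal(0.0, sigma, size=n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] + 0.5 * x[t - 2] + z[t]

# Regress X_t on (X_{t-1}, X_{t-2}) to recover the AR coefficients and sigma^2.
A = np.column_stack([x[1:-1], x[:-2]])
b = x[2:]
coef, *_ = np.linalg.lstsq(A, b, rcond=None)
resid = b - A @ coef
sigma2_hat = resid.var()

h_hat = 0.5 * np.log(2 * np.pi * np.e * sigma2_hat)
h_true = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)
print(coef)           # close to [0.5, 0.5]
print(h_hat, h_true)  # estimated vs. analytic entropy rate, in nats
```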
EE 376B
Information Theory
Prof. T. Cover
Handout #19
Thursday, June 2, 2011
Solutions to practice Final Examination
1. Slepian Wolf (30 pts) We do a coin-flip experiment repeatedly. In each experiment, we keep flipping a biased coin with probability p of getting a head
EE 376B/Stat 376B
Information Theory
Prof. T. Cover
Handout #21
Tuesday, June 5, 2011
Solutions to Practice Final Exam 2
1. Entropy rate. (10)
Let {Z_n} be i.i.d. N(0, σ²). What is the differential entropy rate h(X) of the
stationary process
X_{n+1} = (1/2) X_n + (1/2) X_{n-1} + Z_{n+1}
EE 376A
Information Theory
TTh 11-12:15pm, Building 540, Room 103
Handout #1
Tuesday, March 30, 2011
T. Cover
Information Questionnaire
Name:
Major:
Year: BS, MS, PhD, > PhD
Credit?/Audit?:
EE376A?
Other Relevant courses:
Have you taken Network Information Theory?
EE 376A
Information Theory
T. Cover
Handout #2
Tuesday, March 30, 2011
Information Theory
Course Information
Web Page:
http://www.stanford.edu/class/ee376b/
Instructor:
Prof. Tom Cover
Packard 254
723-4505
cover@stanford.edu
Office hours: Wednesday 2-3 pm.
EE376B
Information Theory
Prof. T. Cover
Handout #3
Thursday, April 5, 2011
Due Thursday, April 12, 2011
Homework Set #1
1. Differential entropy
Evaluate the differential entropy h(X) = -∫ f ln f for the following:
(a) The Laplace density, f(x) = (1/2) e^{-|x|}.
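As a quick check (a sketch, not part of the handout): writing -∫ f ln f = ln 2 ∫ f + ∫ |x| f(x) dx = ln 2 + E|X| gives h(X) = 1 + ln 2 ≈ 1.6931 nats for the unit Laplace density. A direct numerical integration agrees:

```python
# Numerically evaluate h(X) = -int f ln f for the Laplace density f(x) = 0.5*exp(-|x|)
# and compare with the closed form 1 + ln 2 nats.
import numpy as np

x = np.linspace(-40.0, 40.0, 800_001)
dx = x[1] - x[0]
f = 0.5 * np.exp(-np.abs(x))

h = -np.sum(f * np.log(f)) * dx  # Riemann sum for -int f ln f (tails are negligible)
print(h, 1 + np.log(2))          # both approximately 1.6931
```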
EE376B/Stat 376B
Information Theory
Prof. T. Cover
Handout #17
Tuesday, May 31, 2011
Solutions to Homework Set #7
1. Growth rate. Let

    X = (1, a)    with probability 1/2,
        (1, 1/a)  with probability 1/2,

where a > 1. This vector X represents a stock market
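The rest of the problem is cut off, but under the usual reading of such growth-rate problems (assumptions beyond the excerpt: a portfolio b = (b1, b2) with b1 + b2 = 1, b ≥ 0, and growth rate W(b) = E log(bᵀX)), a one-dimensional search suggests the log-optimal split is b* = (1/2, 1/2):

```python
# For X = (1, a) or (1, 1/a), each with probability 1/2, scan portfolios b = (1-b2, b2)
# and maximize W(b) = 0.5*log(b1 + b2*a) + 0.5*log(b1 + b2/a).
import numpy as np

a = 2.0
b2 = np.linspace(0.0, 1.0, 100_001)
b1 = 1.0 - b2

W = 0.5 * np.log(b1 + b2 * a) + 0.5 * np.log(b1 + b2 / a)

best = b2[np.argmax(W)]
print(best)  # close to 0.5: split wealth evenly between cash and the stock
# At b* = (1/2, 1/2) the growth rate simplifies to log((1 + a) / (2*sqrt(a))).
print(W.max(), np.log((1 + a) / (2 * np.sqrt(a))))
```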
EE 376B
Information Theory
Prof. T. Cover
Handout #16
Tuesday, May 31, 2011
Solutions to Homework Set #6
1. One bit quantization of a single Gaussian random variable. Let X ~ N(0, σ²) and let the distortion measure be squared error. Here we do not allow block descriptions
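A numerical sketch (the standard one-bit Lloyd-Max result, not quoted from the handout): describing X ~ N(0, σ²) by its sign and reproducing a·sign(X) gives squared error E(X - a·sign(X))² = σ² - 2aE|X| + a², minimized at a = E|X| = σ√(2/π), with distortion σ²(1 - 2/π). Monte Carlo agrees:

```python
# Check the one-bit quantizer for X ~ N(0, sigma^2): reproduction points +/- sigma*sqrt(2/pi),
# distortion sigma^2 * (1 - 2/pi).
import numpy as np

rng = np.random.default_rng(1)
sigma = 2.0
x = rng.normal(0.0, sigma, size=1_000_000)

a_opt = sigma * np.sqrt(2.0 / np.pi)           # optimal reproduction magnitude E|X|
d_mc = np.mean((x - a_opt * np.sign(x)) ** 2)  # Monte Carlo distortion
d_theory = sigma ** 2 * (1.0 - 2.0 / np.pi)

print(d_mc, d_theory)  # the two values are close
```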
EE 376B/Stat 376B
Information Theory
Prof. T. Cover
Handout #14
Thursday, May 26, 2011
Solutions to Homework Set #5
1. Random program. Will the sun rise tomorrow? Suppose that a random program (symbols i.i.d. uniform over the symbol set) is fed into the n
EE 376B
Information Theory
Prof. T. Cover
Handout #11
Tuesday, May 17, 2011
Prepared by T.A. Gowtham Kumar
Solutions to Homework Set #4
1. Maximum Entropy and Counting. Let X = {1, 2, . . . , m}. Show that the number of sequences x^n ∈ X^n satisfying (1/n)
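The bound is truncated, but it is in the spirit of the method of types. A binary special case (an assumption for illustration, not the handout's exact statement): the number of length-n binary sequences with exactly k ones is C(n, k) ≤ 2^{nH(k/n)}, where H is the binary entropy in bits. This can be checked exhaustively:

```python
# Verify the type-counting bound C(n, k) <= 2^{n H(k/n)} for every k at a fixed n.
import math

def H2(p: float) -> float:
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n = 50
for k in range(n + 1):
    count = math.comb(n, k)
    bound = 2.0 ** (n * H2(k / n))
    assert count <= bound  # the entropy bound holds for every type class
print("C(n,k) <= 2^{n H(k/n)} verified for n =", n)
```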
EE 376B
Information Theory
Prof. T. Cover
Handout #9
Tuesday, May 2, 2011
Solutions to Homework Set #3
1. The cooperative capacity of a multiple access channel.
[Figure 1: Multiple access channel. The encoders map (W1, W2) to codewords X1^n and X2^n, the channel p(y|x1, x2) produces Y^n, and the decoder outputs estimates (W1-hat, W2-hat).]
EE 376B
Information Theory
Prof. T. Cover
Handout #7
Thursday, April 21, 2011
Solutions to Homework Set #2
1. Multiple layer waterfilling. Let C(x) = (1/2) log(1 + x) denote the channel capacity of a Gaussian channel with signal-to-noise ratio x. Show
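The "Show ..." statement is cut off in the excerpt, but a standard layering identity for C(x) = (1/2) log(1 + x) illustrates multiple-layer waterfilling: splitting power S = S1 + S2 over noise N, decoding layer 1 with layer 2 treated as noise and then decoding layer 2 cleanly achieves the full capacity, i.e. C((S1 + S2)/N) = C(S1/(S2 + N)) + C(S2/N). A quick numerical check:

```python
# Verify the successive-decoding identity for the Gaussian capacity function.
import math

def C(x: float) -> float:
    """Capacity of a Gaussian channel at SNR x, in nats per channel use."""
    return 0.5 * math.log(1.0 + x)

S1, S2, N = 3.0, 5.0, 2.0
lhs = C((S1 + S2) / N)              # all power decoded jointly
rhs = C(S1 / (S2 + N)) + C(S2 / N)  # layer 1 sees layer 2 as noise, then layer 2 clean
print(lhs, rhs)                     # equal up to floating point
```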
EE 376B
Information Theory
Prof. T. Cover
Handout #18
Thursday, June 2, 2011
Practice Final Examination
1. Slepian Wolf (30 pts) We do a coin-flip experiment repeatedly. In each experiment, we keep flipping a biased coin with probability p of getting a head
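The statement is cut off. A natural reading (an assumption, not confirmed by the excerpt) is that each experiment records the number of flips until the first head, a Geometric(p) random variable X with P(X = k) = (1-p)^{k-1} p. Its entropy, the Slepian-Wolf rate for losslessly encoding one such source by itself, has the closed form H(p)/p bits, which a direct sum over the pmf confirms:

```python
# Entropy of a Geometric(p) random variable: direct pmf sum vs. the closed form H(p)/p.
import math

def h2(p: float) -> float:
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.3
H = 0.0
for k in range(1, 1000):  # the tail beyond k = 1000 is negligible for p = 0.3
    pk = (1 - p) ** (k - 1) * p
    H -= pk * math.log2(pk)

print(H, h2(p) / p)  # direct sum matches the closed form
```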