EE229A Spring 2011 HW6 solutions
Solution to Problem 1
(a) Let 𝒳 be the set of values that X takes on with non-zero probability. Then

E[X] = Σ_{x ∈ 𝒳} x · Pr(X = x).

If x > E[X] for every x ∈ 𝒳, then the right-hand side is strictly greater than E[X], a contradiction. Hence there exists some x ∈ 𝒳 with x ≤ E[X].
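The argument above can be checked numerically. The pmf below is an arbitrary illustrative example (not from the problem); the check confirms that at least one support point, in particular the minimum, never exceeds the mean.

```python
# Numeric check: some x with Pr(X = x) > 0 satisfies x <= E[X].
# The pmf below is an arbitrary example, not taken from the problem.
support = [1, 3, 7, 10]
probs = [0.4, 0.3, 0.2, 0.1]

E = sum(x * p for x, p in zip(support, probs))  # E[X] = 3.7

# At least one support point is <= E[X]; in fact the minimum always is.
assert any(x <= E for x in support)
assert min(support) <= E
```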
EE 229A: Information Theory and Coding
Final Examination
11:30 - 2:30pm, May 9, 2011
Total number of questions: 6
Total points: 100
Please give clear justifications to all your answers except those for Q. 1. Write all your answers
in the exam booklet. Don't fo
EE229A: Information Theory and Coding
Spring 2011
Homework 3
Lecturer: David Tse
Prob 3.2. AEP and mutual information. Let (X_i, Y_i) be i.i.d. ∼ p(x, y). We form the
log-likelihood ratio of the hypothesis that X and Y are independent vs. the hypothesis
that
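By the AEP, the normalized log-likelihood ratio (1/n) log[ p(X^n, Y^n) / (p(X^n) p(Y^n)) ] converges to I(X; Y). A quick simulation illustrates this; the joint pmf below is an arbitrary example, not part of the problem.

```python
import math, random

# Simulate the normalized log-likelihood ratio and compare it to I(X; Y).
# The joint pmf is an arbitrary illustrative choice.
random.seed(0)
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {0: 0.5, 1: 0.5}
py = {0: 0.5, 1: 0.5}

# Mutual information I(X; Y) in bits.
I = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in pxy.items())

# Empirical (1/n) log[ p(X^n, Y^n) / (p(X^n) p(Y^n)) ] over an i.i.d. sample.
n = 200_000
pairs = random.choices(list(pxy), weights=list(pxy.values()), k=n)
llr = sum(math.log2(pxy[xy] / (px[xy[0]] * py[xy[1]])) for xy in pairs) / n

print(round(I, 3), round(llr, 3))  # the two values should be close
```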
EE229A: Information Theory and Coding
SP11
Homework 2
Lecturer: David Tse
Problem 2.35 Relative entropy is not symmetric: Let the random variable X have
three possible outcomes {a, b, c}. Consider two distributions on this random variable:

Symbol   p(x)   q(x)
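The table is cut off above. Assuming the distributions usually used in this problem, p = (1/2, 1/4, 1/4) and q = (1/3, 1/3, 1/3), a short computation shows the two divergences differ:

```python
import math

# D(p||q) vs. D(q||p) for p = (1/2, 1/4, 1/4), q = (1/3, 1/3, 1/3);
# these distributions are an assumption, since the table above is truncated.
def kl(p, q):
    """Relative entropy D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [1/2, 1/4, 1/4]
q = [1/3, 1/3, 1/3]

print(round(kl(p, q), 4))  # 0.085
print(round(kl(q, p), 4))  # 0.0817
# The two values differ: relative entropy is not symmetric.
```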
EE229A Spring 2011 HW1 solutions
Solution to Problem 2.1
1. The number X of tosses until the first head appears has the geometric
distribution with parameter p = 1/2, where P(X = n) = p q^{n-1}, n ∈ {1, 2, . . .}
and q = 1 - p. Hence the entropy of X is

H(X) = -Σ_{n=1}^∞ p q^{n-1} log(p q^{n-1})
     = -Σ_{n=1}^∞ p q^{n-1} [log p + (n-1) log q]
     = -log p - (q/p) log q = H(p)/p,

which equals 2 bits for p = 1/2.
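The closed form H(X) = H(p)/p can be confirmed by summing the series directly (truncating at a large n, since the tail is negligible):

```python
import math

# Numerically confirm H(X) = H_b(p)/p for a geometric random variable:
# with p = 1/2 this should give 2 bits.
p, q = 0.5, 0.5
H = -sum(p * q**(n - 1) * math.log2(p * q**(n - 1)) for n in range(1, 200))
print(round(H, 6))  # 2.0
```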
EE229A Spring 2011 HW4 solutions
Solution to Problem 5.4
Examples of Huffman codes.
1. The Huffman tree for this distribution is built by repeatedly merging the two
least-likely symbols (first 0.03 + 0.02 = 0.05, then 0.04 + 0.04 = 0.08, and so on):

Codeword   Symbol   Probability
1          x1       0.49
00         x2       0.26
011        x3       0.12
01000      x4       0.04
01001      x5       0.04
01010      x6       0.03
01011      x7       0.02
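The merging procedure above can be reproduced mechanically. The sketch below (not the solution's own construction) rebuilds a binary Huffman code for this distribution with a heap and checks that the codeword lengths come out as 1, 2, 3, 5, 5, 5, 5:

```python
import heapq

# Rebuild a binary Huffman code for the distribution above and
# verify the codeword lengths and the expected code length.
def huffman_lengths(probs):
    """Return Huffman codeword lengths for the given probabilities."""
    heap = [(p, i, (i,)) for i, p in enumerate(probs)]  # (prob, tiebreak, leaves)
    heapq.heapify(heap)
    depth = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        for leaf in a + b:  # every leaf under the merged node gets one level deeper
            depth[leaf] += 1
        heapq.heappush(heap, (p1 + p2, counter, a + b))
        counter += 1
    return depth

probs = [0.49, 0.26, 0.12, 0.04, 0.04, 0.03, 0.02]
lengths = huffman_lengths(probs)
print(lengths)        # [1, 2, 3, 5, 5, 5, 5]
avg = sum(p * l for p, l in zip(probs, lengths))
print(round(avg, 2))  # 2.02, the expected codeword length in bits
```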
Please justify all your answers except for Q. 1. Write all your answers in the separate sheets
provided.
[14] 1. X, Y, Z are three discrete-valued random variables taking values in alphabets 𝒳, 𝒴, 𝒵
respectively. Are these statements necessarily True? If
EE229A: Information Theory and Coding
SP11
Homework 5
Lecturer: David Tse
Problem 7.5 Using two channels at once. Consider two discrete memoryless channels (X_1, p(y_1 | x_1), Y_1) and (X_2, p(y_2 | x_2), Y_2) with capacities C1 and C2 respectively. A
new
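The fragment cuts off before defining the new channel; a common version of this problem sends x_1 and x_2 simultaneously, one through each channel, in which case the combined capacity is C1 + C2. As an illustrative sketch (the choice of binary symmetric channels and their crossover probabilities are assumptions, not part of the problem):

```python
import math

# Illustrative sketch: if both component channels are binary symmetric
# channels (an assumed example), the channel that sends (x1, x2)
# simultaneously has capacity C1 + C2.
def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Capacity of a BSC with crossover probability eps, in bits per use."""
    return 1 - h2(eps)

C1 = bsc_capacity(0.11)
C2 = bsc_capacity(0.2)
print(round(C1 + C2, 4))  # capacity of the combined channel, bits per use
```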
EE229A: Information Theory and Coding
Spring 2011
Homework 9
Lecturer: David Tse
Prob 15.8. Slepian-Wolf for deterministically related sources. Find and sketch the Slepian-Wolf rate region for the simultaneous data compression of (X, Y), where y = f(x) i
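When Y is a deterministic function of X, H(Y | X) = 0, so the Slepian-Wolf region reduces to R_x ≥ H(X|Y), R_y ≥ 0, R_x + R_y ≥ H(X). The sketch below evaluates these corner quantities for a hypothetical example (X uniform on {0,1,2,3} with f(x) = x mod 2, not from the problem):

```python
import math
from collections import defaultdict

# Hypothetical example: X uniform on {0,1,2,3}, Y = f(X) = X mod 2.
# Since Y is a function of X, H(Y|X) = 0 and the Slepian-Wolf region is
# R_x >= H(X|Y), R_y >= 0, R_x + R_y >= H(X).
px = {x: 0.25 for x in range(4)}
f = lambda x: x % 2

pxy = {(x, f(x)): p for x, p in px.items()}
py = defaultdict(float)
for (x, y), p in pxy.items():
    py[y] += p

H = lambda dist: -sum(p * math.log2(p) for p in dist if p > 0)
HX = H(px.values())
HY = H(py.values())
HXY = H(pxy.values())      # joint entropy; equals H(X) since y = f(x)
print(HX, HY, HXY - HY)    # H(X) = 2.0, H(Y) = 1.0, H(X|Y) = 1.0
```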
EE229A: Information Theory and Coding
SP11
Homework 8
Lecturer: David Tse
Problem 10.5 Rate distortion for uniform source with Hamming distortion.
Consider a source X uniformly distributed on the set {1, 2, . . . , m}. Find the rate distortion
function
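The known closed form for this problem is R(D) = log2(m) − H_b(D) − D log2(m − 1) for 0 ≤ D ≤ 1 − 1/m, and 0 beyond. A quick sketch evaluating it (the value m = 4 is an illustrative assumption):

```python
import math

# Evaluate R(D) = log2(m) - H_b(D) - D log2(m-1) for 0 <= D <= 1 - 1/m,
# the known rate distortion function for a uniform source with
# Hamming distortion; m = 4 is an illustrative choice.
def h2(d):
    """Binary entropy in bits."""
    return 0.0 if d in (0, 1) else -d * math.log2(d) - (1 - d) * math.log2(1 - d)

def rate_distortion(D, m):
    if D >= 1 - 1 / m:
        return 0.0
    return math.log2(m) - h2(D) - D * math.log2(m - 1)

m = 4
print(rate_distortion(0.0, m))           # 2.0 : lossless, R = log2(m)
print(round(rate_distortion(0.2, m), 4))
print(rate_distortion(0.75, m))          # 0.0 : at D = 1 - 1/m zero rate suffices
```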
EE229A: Information Theory and Coding
Spring 2011
Homework 7
Lecturer: David Tse
Prob 8.1. Differential entropy. Evaluate the differential entropy h(X) = −∫ f ln f for the
following:
(a) The exponential density, f(x) = λe^{−λx}, x ≥ 0.
(b) The Laplace density, f(x) = (1/2) λe^{−λ|x|}.
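For part (a), the closed form is h(X) = 1 − ln λ nats, so λ = 1 gives 1 nat. A simple numerical integration of −f ln f agrees (the grid and cutoff below are illustrative choices):

```python
import math

# Numeric sketch for part (a): exponential density with rate lambda = 1.
# The closed form h(X) = 1 - ln(lambda) gives 1 nat; a Riemann sum of
# -f ln f over (0, 40] should come out close to that.
lam = 1.0
f = lambda x: lam * math.exp(-lam * x)

dx = 1e-3
xs = [i * dx for i in range(1, int(40 / dx))]
h = -sum(f(x) * math.log(f(x)) * dx for x in xs)
print(round(h, 3))  # close to 1.0 (nats)
```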
Sample Midterm Solutions
The examination is for 120 minutes. The maximum score is 70 points. Your answers
should be unambiguous. Please show all work to allow for the possibility of partial credit.
1. (8 points)
I have a coin in my pocket. It is either fa