472 Tutorial Sheet 2 Solutions to q3 and 4
18 February 2008
3. (C&T 2.9) Let X1 and X2 be discrete random variables drawn according to distributions p1 and p2 from their respective alphabets X1 = {1,
472 Tutorial Sheet 3, Solutions to q1 and 2
27 February 2008
1. (C&T 2.2) Entropy of functions. Let X be a random variable taking on a finite
number of values. What is the general inequality relating H
472 Tutorial Sheet 3, Solution to q3
28 February 2008
3. (C&T 2.8) Drawing with and without replacement. An urn contains r red, w white
and b black balls. Which has higher entropy, drawing k ≥ 2 balls fr
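The question above is cut off, but the comparison it sets up (drawing with versus without replacement) can be illustrated by brute-force enumeration. A minimal sketch with an arbitrarily chosen example urn; since the per-draw marginals are identical under both schemes and only the with-replacement draws are independent, the with-replacement sequence has the larger joint entropy:

```python
import math
from itertools import product, permutations
from collections import defaultdict

def entropy(dist):
    """H in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

balls = ['r'] + ['w'] * 2 + ['b'] * 3   # example urn: 1 red, 2 white, 3 black
k = 2

# With replacement: the k draws are i.i.d.
with_rep = defaultdict(float)
for seq in product(balls, repeat=k):
    with_rep[seq] += 1 / len(balls) ** k

# Without replacement: every ordered selection of k distinct balls is equally likely.
without_rep = defaultdict(float)
n_perm = math.perm(len(balls), k)
for seq in permutations(balls, k):
    without_rep[seq] += 1 / n_perm

print(entropy(with_rep) > entropy(without_rep))  # True
```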
472 Tutorial Sheet 4, Solutions
20 March 2008
1. (C&T 2.14) Entropy of a sum. Let X and Y be random variables that take on values
x1, x2, . . . , xr and y1, y2, . . . , ys respectively. Let Z = X +
472 Tutorial Sheet 6: solution to q2
28 April 2008
2. An alternative divergence is the α-divergence,
Dα(p ‖ q) = α DKL(p ‖ αp + (1 − α)q) + (1 − α) DKL(q ‖ αp + (1 − α)q)    (1)
Unlike the KL divergence, this is symmetric
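Equation (1) can be sanity-checked numerically. A minimal sketch (the helper names `kl` and `alpha_divergence` are my own, and the example distributions are arbitrary) showing that the α = 1/2 case is symmetric while KL itself is not:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D_KL(p || q) in nats (0 log 0 = 0)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def alpha_divergence(p, q, alpha):
    """D_alpha(p || q) = alpha*KL(p || m) + (1-alpha)*KL(q || m),
    where m = alpha*p + (1-alpha)*q, as in equation (1)."""
    m = [alpha * pi + (1 - alpha) * qi for pi, qi in zip(p, q)]
    return alpha * kl(p, m) + (1 - alpha) * kl(q, m)

p = [0.5, 0.25, 0.25]
q = [1/3, 1/3, 1/3]

# At alpha = 1/2 the mixture m is the same under a swap of p and q,
# and the two KL terms exchange roles, so the divergence is symmetric ...
d_pq = alpha_divergence(p, q, 0.5)
d_qp = alpha_divergence(q, p, 0.5)
print(abs(d_pq - d_qp) < 1e-12)  # True

# ... while KL itself is not.
print(abs(kl(p, q) - kl(q, p)) < 1e-12)  # False
```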
472 Tutorial Sheet 5: solutions for q1 and q2
20 April 2008
1. (C&T 2.43b) A fair six-sided die is rolled. What is the mutual information between
the top face of the die and the side most facing
472 Tutorial Sheet 6: solution to q3
28 April 2008
3. For α = 1/2, the α-divergence is known as the Jensen-Shannon divergence. Show this
satisfies
(a) DJS(p1, p2) ≥ 0 with equality if and only if p1 = p2.
(b
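Part (a) can be spot-checked numerically before proving it (this is a sanity check on random distribution pairs, not a proof; the helper names are illustrative):

```python
import math, random

def kl(p, q):
    """D_KL(p || q) in nats, with the 0 log 0 = 0 convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def djs(p, q):
    """Jensen-Shannon divergence: the alpha-divergence at alpha = 1/2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def random_dist(n, rng):
    w = [rng.random() for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

rng = random.Random(0)
# Non-negativity on many random distribution pairs.
assert all(djs(random_dist(4, rng), random_dist(4, rng)) >= 0
           for _ in range(1000))
# Equality when the two distributions coincide.
p = random_dist(4, rng)
assert djs(p, p) == 0
print("ok")
```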
472 Tutorial Sheet 5: solution to question 3
1 May 2008
3. (C&T 2.32) Fano. X and Y are two independent variables with X = {1, 2, 3} and
Y = {a, b, c}. p(1, a) = p(2, b) = p(3, c) = 1/6, all th
472 Tutorial Sheet 6: solution to q1
28 April 2008
1. (C&T 2.37) Let X, Y and Z be three random variables with a joint probability distribution p(x, y, z). The relative entropy between the joint dist
472 Tutorial Sheet 5: solution for q4
28 April 2008
4. (C&T 2.35 and 2.36). Consider two distributions over the set {a, b, c}: p(a) = 1/2 and
p(b) = p(c) = 1/4 and q(a) = q(b) = q(c) = 1/3. Find
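The question is truncated here, but assuming (as in C&T 2.35) it asks for the relative entropy in each direction, both values follow by direct computation:

```python
import math

def kl_bits(p, q):
    """D(p || q) in bits, with the 0 log 0 = 0 convention."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [1/2, 1/4, 1/4]   # p(a), p(b), p(c)
q = [1/3, 1/3, 1/3]   # q uniform

print(kl_bits(p, q))  # ≈ 0.0850 bits  (= (1/2) log2(9/8))
print(kl_bits(q, p))  # ≈ 0.0817 bits  (= 5/3 - log2 3)
# The two directions differ: relative entropy is not symmetric.
```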
472 Tutorial Sheet 2
11 February 2008
1. (C&T 2.6) Find joint random variables X, Y and Z such that
(a) I(X; Y | Z) < I(X; Y)
(b) I(X; Y | Z) > I(X; Y)
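One standard pair of constructions, which need not be the one intended by the sheet: for (a) take X = Y = Z a fair bit, so conditioning on Z removes all shared information; for (b) take X, Y independent fair bits with Z = X XOR Y, so conditioning on Z creates dependence. A sketch verifying both numerically from joint distribution tables:

```python
import math
from collections import defaultdict

def mutual_info(triples, cond=False):
    """I(X;Y) or I(X;Y|Z) in bits from a list of equally likely (x, y, z) triples."""
    n = len(triples)
    pxyz = defaultdict(float)
    for t in triples:
        pxyz[t] += 1 / n
    def marg(keys):
        m = defaultdict(float)
        for (x, y, z), p in pxyz.items():
            m[tuple({'x': x, 'y': y, 'z': z}[k] for k in keys)] += p
        return m
    pxy, px, py = marg('xy'), marg('x'), marg('y')
    pz, pxz, pyz = marg('z'), marg('xz'), marg('yz')
    if not cond:
        return sum(p * math.log2(p / (px[(x,)] * py[(y,)]))
                   for (x, y), p in pxy.items())
    return sum(p * math.log2(p * pz[(z,)] / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), p in pxyz.items())

# (a) X = Y = Z, a fair bit.
a = [(0, 0, 0), (1, 1, 1)]
print(mutual_info(a), mutual_info(a, cond=True))   # 1.0 0.0

# (b) X, Y independent fair bits, Z = X XOR Y.
b = [(x, y, x ^ y) for x in (0, 1) for y in (0, 1)]
print(mutual_info(b), mutual_info(b, cond=True))   # 0.0 1.0
```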
2. (C&T 2.7) Suppose that one has n c
472 Tutorial Sheet 5
27 February 2008
1. (C&T 2.43b) A fair six-sided die is rolled. What is the mutual information between
the top face of the die and the side most facing you?
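One way to model this question (the geometric reading is an assumption): the top face T is uniform on {1, ..., 6}, and given T, the side face S most facing you is uniform over the four faces that are neither T nor its opposite 7 − T. Under that model I(T; S) = log2 6 − log2 4 = log2(3/2):

```python
import math
from collections import defaultdict

# Joint distribution p(t, s): T uniform on 1..6; given T = t, S is uniform
# over the four faces excluding t and its opposite face 7 - t.
pts = defaultdict(float)
for t in range(1, 7):
    for s in range(1, 7):
        if s != t and s != 7 - t:
            pts[(t, s)] += (1 / 6) * (1 / 4)

pt, ps = defaultdict(float), defaultdict(float)
for (t, s), p in pts.items():
    pt[t] += p
    ps[s] += p

mi = sum(p * math.log2(p / (pt[t] * ps[s])) for (t, s), p in pts.items())
print(mi)  # ≈ 0.585 bits = log2(3/2)
```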
2. (C&T 2.29) For ra
472 Tutorial Sheet 8
10 April 2008
1. (C&T 5.2) Let p(Si) = pi for some set of outcomes {S1, S2, . . . , Sn}. The Si's are
uniquely encoded into strings from a D-symbol alphabet in a uniquely
472 Tutorial Sheet 4
20 February 2008
1. (C&T 2.14) Entropy of a sum. Let X and Y be random variables that take on values
x1, x2, . . . , xr and y1, y2, . . . , ys respectively. Let Z = X + Y.
(a)
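The fragment stops at part (a); in C&T 2.14 this part asks one to compare H(Z|X) with H(Y|X). Since, for each fixed x, z = x + y is an invertible function of y, the two conditional entropies coincide, which a small example joint (chosen arbitrarily for illustration) confirms:

```python
import math
from collections import defaultdict

def entropy(dist):
    """H in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def cond_entropy(joint):
    """H(B|A) = H(A,B) - H(A) from a dict {(a, b): p}."""
    pa = defaultdict(float)
    for (a, _), p in joint.items():
        pa[a] += p
    return entropy(joint) - entropy(pa)

# An arbitrary example joint p(x, y); Z = X + Y.
pxy = {(1, 1): 0.1, (1, 2): 0.3, (2, 1): 0.4, (2, 2): 0.2}
pxz = defaultdict(float)
for (x, y), p in pxy.items():
    pxz[(x, x + y)] += p

# Given X = x, z = x + y is a bijection of y, so H(Z|X) = H(Y|X).
print(abs(cond_entropy(pxz) - cond_entropy(pxy)) < 1e-12)  # True
```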
472 Tutorial Sheet 6
19 March 2008
1. (C&T 2.37) Let X, Y and Z be three random variables with a joint probability distribution p(x, y, z). The relative entropy between the joint distribution and the
472 Tutorial Sheet 3
13 February 2008
1. (C&T 2.2) Entropy of functions. Let X be a random variable taking on a finite
number of values. What is the general inequality relating H(X) and H(Y) if
(a)
472 Tutorial Sheet 2 Solutions to q1 and 2
11 February 2008
1. (C&T 2.6) Find joint random variables X, Y and Z such that
(a) I(X; Y | Z) < I(X; Y)
(b) I(X; Y | Z) > I(X; Y)
Solution: So th