Entropy, Relative Entropy and Mutual Information: Exercises
Exercise 2.1: Coin Flips. A fair coin is flipped until the first head occurs. Let $X$ denote the number of flips required.
(a) Find the entropy $H(X)$ in bits. The following expressions may be useful:
$$\sum_{n=1}^{\infty} r^n = \frac{r}{1-r}, \qquad \sum_{n=1}^{\infty} n r^n = \frac{r}{(1-r)^2} \tag{1}$$
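As a quick numerical sanity check (not part of the original exercise), both identities in (1) can be verified by truncating the series at a large $N$; for $r = 0.5$ the right-hand sides are $1$ and $2$:

```python
# Truncated partial sums of the two series in (1), evaluated at r = 0.5.
N = 10_000
r = 0.5

lhs1 = sum(r**n for n in range(1, N + 1))        # sum r^n
lhs2 = sum(n * r**n for n in range(1, N + 1))    # sum n r^n

assert abs(lhs1 - r / (1 - r)) < 1e-9            # r/(1-r)   = 1.0
assert abs(lhs2 - r / (1 - r) ** 2) < 1e-9       # r/(1-r)^2 = 2.0
print(lhs1, lhs2)
```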
(b) A random variable $X$ is drawn according to this distribution. Find an "efficient" sequence of yes-no questions of the form, "Is $X$ contained in the set $S$?" Compare $H(X)$ to the expected number of questions required to determine $X$.
Solution: The distribution of the random variable is $P\{X = i\} = 0.5^i$, $i = 1, 2, \ldots$. Hence,
$$H(X) = -\sum_i p_i \log p_i = -\sum_{i=1}^{\infty} 0.5^i \log\left(0.5^i\right) = -\log(0.5) \sum_{i=1}^{\infty} i \cdot 0.5^i = \frac{0.5}{(1-0.5)^2} = 2 \tag{2}$$
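The value in (2) can be checked numerically, and the same truncated sum also illustrates part (b): asking "Is $X = 1$?", "Is $X = 2$?", … resolves the value $i$ after exactly $i$ questions, so the expected number of questions is $\sum_i i \cdot 0.5^i = 2 = H(X)$. A small sketch (the truncation point $N$ is an arbitrary choice, not from the original solution):

```python
from math import log2

# Entropy of X with P{X = i} = 0.5^i, truncated at N terms (tail is negligible).
N = 200
H = -sum(0.5**i * log2(0.5**i) for i in range(1, N + 1))

# Part (b): the i-th value is determined after i yes-no questions,
# so the expected number of questions is sum_i i * 0.5^i.
E_questions = sum(i * 0.5**i for i in range(1, N + 1))

print(H, E_questions)  # both ≈ 2.0
```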
Exercise 2.3: Minimum entropy. What is the minimum value of $H(p_1, \ldots, p_n) = H(\mathbf{p})$ as $\mathbf{p}$ ranges over the set of $n$-dimensional probability vectors? Find all $\mathbf{p}$'s which achieve this minimum.
Solution: Since $H(\mathbf{p}) \geq 0$ and $\sum_i p_i = 1$, the minimum value of $H(\mathbf{p})$ is 0, which is achieved when $p_i = 1$ for some $i$ and $p_j = 0$ for all $j \neq i$.
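A short sketch (not from the original solution) confirming that the degenerate vectors achieve $H = 0$ while interior points of the simplex do not:

```python
from math import log2
import random

def H(p):
    # Shannon entropy in bits; the term 0 * log 0 is taken as 0.
    return -sum(x * log2(x) for x in p if x > 0)

# A degenerate probability vector achieves the minimum H = 0.
assert H([1.0, 0.0, 0.0]) == 0.0

# A random interior point of the simplex has strictly positive entropy.
w = [random.random() + 1e-12 for _ in range(4)]
p = [x / sum(w) for x in w]
assert H(p) > 0
print(H(p))
```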
Exercise 2.11: Average entropy. Let $H(p) = -p \log_2 p - (1-p) \log_2 (1-p)$ be the binary entropy function.
(a) Evaluate $H(1/4)$.
(b) Calculate the average entropy $H(p)$ when the probability $p$ is chosen uniformly in the range $0 \leq p \leq 1$.
Solution:
(a)
$$H(1/4) = -\tfrac{1}{4} \log_2\!\left(\tfrac{1}{4}\right) - \left(1 - \tfrac{1}{4}\right) \log_2\!\left(1 - \tfrac{1}{4}\right) \approx 0.8113 \tag{3}$$
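Equation (3) is easy to confirm directly from the definition of the binary entropy function:

```python
from math import log2

def H(p):
    # Binary entropy function in bits, for 0 < p < 1.
    return -p * log2(p) - (1 - p) * log2(1 - p)

print(round(H(0.25), 4))  # 0.8113
```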
(b)
$$\bar{H}(p) = E[H(p)] = \int_{-\infty}^{\infty} H(p) f(p)\, dp \tag{4}$$
Now,
$$f(p) = \begin{cases} 1, & 0 \leq p \leq 1 \\ 0, & \text{otherwise,} \end{cases}$$
so, using the symmetry $H(p) = H(1-p)$,
$$\bar{H}(p) = \int_0^1 H(p)\, dp = 2 \int_0^1 -p \log_2 p \, dp = \frac{2}{\ln 2} \left[ \frac{p^2}{4} - \frac{p^2}{2} \ln p \right]_0^1 = \frac{1}{2 \ln 2} \approx 0.7213 \text{ bits.}$$
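The average entropy over $p \sim \mathrm{Uniform}[0,1]$ works out to $1/(2 \ln 2) \approx 0.7213$ bits, which a simple midpoint-rule quadrature (an illustrative sketch; the grid size is an arbitrary choice) reproduces:

```python
from math import log, log2

def H(p):
    # Binary entropy in bits, with H(0) = H(1) = 0 by convention.
    return -p * log2(p) - (1 - p) * log2(1 - p) if 0 < p < 1 else 0.0

# Midpoint-rule approximation of the average of H(p) over Uniform[0, 1].
n = 100_000
avg = sum(H((k + 0.5) / n) for k in range(n)) / n

assert abs(avg - 1 / (2 * log(2))) < 1e-6
print(avg)  # ≈ 0.7213
```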