EE 376A/Stat 376A
Handout #9
Information Theory
Thursday, January 27, 2011
Prof. T. Cover
Homework Set #2 Solutions
1. Entropy and pairwise independence.
   Let X, Y, Z be three binary Bernoulli(1/2) random variables that are pairwise independent; that is, I(X;Y) = I(X;Z) = I(Y;Z) = 0.

   (a) Under this constraint, what is the minimum value for H(X,Y,Z)?
   (b) Give an example achieving this minimum.
   (c) Now suppose that X, Y, Z are three random variables, each uniformly distributed over the alphabet {1, 2, ..., m}. Again, they are pairwise independent. What is the minimum value for H(X,Y,Z)?
Solution: Entropy and pairwise independence.

(a) By pairwise independence and the marginal distributions, H(X,Y) = H(X) + H(Y) = 2. Thus,

      H(X,Y,Z) = H(X,Y) + H(Z|X,Y)
               = H(X) + H(Y) + H(Z|X,Y)
               ≥ H(X) + H(Y)
               = 2,

with equality if and only if Z is a deterministic function of X and Y. This minimum is achieved by the example in part (b).
(b) Let Z = X ⊕ Y, where ⊕ denotes XOR. It is straightforward to check that all the marginal distributions are satisfied, as well as pairwise independence.
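The check is a finite enumeration, so it can be written out directly. A minimal sketch (the helper names `marginal` and `H` are mine, not from the handout):

```python
from itertools import product
from math import log2

# Joint distribution of (X, Y, Z) with X, Y ~ Bernoulli(1/2) independent
# and Z = X XOR Y: four equally likely triples.
triples = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]
p = {t: 1 / len(triples) for t in triples}

def marginal(idx_pair):
    """Marginal distribution of the coordinates in idx_pair."""
    m = {}
    for t, pr in p.items():
        key = tuple(t[i] for i in idx_pair)
        m[key] = m.get(key, 0) + pr
    return m

def H(dist):
    """Entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(pr * log2(pr) for pr in dist.values() if pr > 0)

# Each of the three pairs is uniform on {0,1}^2, hence pairwise independent:
uniform_pair = {(a, b): 0.25 for a, b in product([0, 1], repeat=2)}
for pair in [(0, 1), (0, 2), (1, 2)]:
    assert marginal(pair) == uniform_pair

print(H(p))  # H(X, Y, Z) = 2 bits, matching the minimum from part (a)
```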
(c) Here is one possible solution. Without loss of generality, relabel the alphabet to be {0, 1, 2, ..., m−1} instead of {1, 2, ..., m}. Let Z = X + Y mod m. Then

      H(X,Y,Z) = H(X) + H(Y) = log m + log m = 2 log m.
One can then verify that under this construction, all the marginal distributions are satisfied and pairwise independence holds. For example, to argue pairwise independence of Z and X,

      I(X;Z) = H(Z) − H(Z|X)
             = log m − H(X + Y mod m | X)
             = log m − H(Y|X)
             = log m − H(Y)
             = log m − log m
             = 0.

Thus, X and Z are independent.
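The same enumeration check generalizes to the mod-m construction. A short numerical sketch (the choice m = 3 is illustrative, not from the handout):

```python
from itertools import product
from math import log2, isclose

m = 3  # any modulus works; 3 is just an illustrative choice
# X, Y uniform and independent on {0, ..., m-1}, Z = (X + Y) mod m.
p = {(x, y, (x + y) % m): 1 / m**2 for x, y in product(range(m), repeat=2)}

def H(dist):
    """Entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(pr * log2(pr) for pr in dist.values() if pr > 0)

def marginal(idx):
    """Marginal distribution of the coordinates in idx."""
    out = {}
    for t, pr in p.items():
        k = tuple(t[i] for i in idx)
        out[k] = out.get(k, 0) + pr
    return out

# H(X, Y, Z) = 2 log m, and every pair is uniform on {0,...,m-1}^2,
# hence pairwise independent.
assert isclose(H(p), 2 * log2(m))
for pair in [(0, 1), (0, 2), (1, 2)]:
    pm = marginal(pair)
    assert len(pm) == m**2 and all(isclose(pr, 1 / m**2) for pr in pm.values())
```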
2. The value of a question.
   Let X ∼ p(x), x = 1, 2, ..., m. We are given a set S ⊆ {1, 2, ..., m}. We ask whether X ∈ S and receive the answer

      Y = 1, if X ∈ S,
      Y = 0, if X ∉ S.

   Suppose Pr{X ∈ S} = α. Find the decrease in uncertainty H(X) − H(X|Y).

   Apparently any set S with a given probability α is as good as any other.
Solution: The value of a question.

      H(X) − H(X|Y) = I(X;Y)
                    = H(Y) − H(Y|X)
                    = H(α) − H(Y|X)
                    = H(α),

since H(Y|X) = 0 (Y is a deterministic function of X). The decrease in uncertainty is the binary entropy H(α), which depends on S only through its probability α.
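The identity can be sanity-checked numerically for any particular p(x) and S. A minimal sketch, with a distribution and set chosen purely for illustration:

```python
from math import log2, isclose

# Illustrative choices, not from the handout:
p = [0.5, 0.25, 0.125, 0.125]     # p(x), x = 1, ..., 4
S = {1, 3}                        # the queried set
complement = set(range(1, 5)) - S
alpha = sum(p[x - 1] for x in S)  # Pr{X in S}

def h2(a):
    """Binary entropy H(alpha) in bits."""
    return 0.0 if a in (0.0, 1.0) else -a * log2(a) - (1 - a) * log2(1 - a)

H_X = -sum(q * log2(q) for q in p)

def H_cond(subset, mass):
    """Entropy of X conditioned on landing in subset (p renormalized)."""
    return -sum((p[x - 1] / mass) * log2(p[x - 1] / mass) for x in subset)

# H(X|Y) = alpha * H(X | Y=1) + (1 - alpha) * H(X | Y=0)
H_X_given_Y = alpha * H_cond(S, alpha) + (1 - alpha) * H_cond(complement, 1 - alpha)

assert isclose(H_X - H_X_given_Y, h2(alpha))  # the decrease is exactly H(alpha)
```

Repeating this with any other S of the same probability α gives the same decrease, illustrating the remark in the problem statement.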
3. Random questions.
   One wishes to identify a random object X ∼ p(x). A question Q ∼ r(q) is asked at random according to r(q). This results in a deterministic answer A = A(x,q) ∈ {a1, a2, ...}. Suppose the object X and the question Q are independent. Then I(X;Q,A) is the uncertainty in X removed by the question-answer pair (Q,A).
   (a) Show I(X;Q,A) = H(A|Q). Interpret.
   (b) Now suppose that two i.i.d. questions Q1, Q2 ∼ r(q) are asked, eliciting answers A1 and A2. Show that two questions are less valuable than twice the value of a single question, in the sense that I(X;Q1,A1,Q2,A2) ≤ 2 I(X;Q1,A1).
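The identity in part (a) can be checked numerically on a toy instance. A sketch under assumptions of my own choosing (X uniform on {0,1,2,3}, Q an independent fair bit selecting which bit of X to reveal; none of this is from the handout):

```python
from itertools import product
from math import log2

# Toy instance: X uniform on {0,1,2,3}; Q uniform on {0,1}, independent
# of X; the deterministic answer is the q-th bit of x: A = (x >> q) & 1.
joint = {}  # distribution of (X, Q, A)
for x, q in product(range(4), range(2)):
    a = (x >> q) & 1
    joint[(x, q, a)] = joint.get((x, q, a), 0) + (1 / 4) * (1 / 2)

def H(dist):
    """Entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(pr * log2(pr) for pr in dist.values() if pr > 0)

def marginal(dist, idx):
    """Marginal distribution of the coordinates in idx."""
    out = {}
    for t, pr in dist.items():
        k = tuple(t[i] for i in idx)
        out[k] = out.get(k, 0) + pr
    return out

# I(X; Q, A) = H(X) + H(Q, A) - H(X, Q, A)
I_XQA = H(marginal(joint, (0,))) + H(marginal(joint, (1, 2))) - H(joint)
# H(A | Q) = H(Q, A) - H(Q)
H_A_given_Q = H(marginal(joint, (1, 2))) - H(marginal(joint, (1,)))

print(I_XQA, H_A_given_Q)  # both equal 1 bit: one revealed bit of X
```

Here the question reveals exactly one bit of X, consistent with the interpretation of part (a): the uncertainty removed about X is the uncertainty in the answer given the question.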