EE 376A
Information Theory
Prof. T. Weissman
Thursday, Feb. 4th, 2010
Solution, Homework Set #3
1. Venn diagrams.
Consider the following quantity:

    I(X; Y; Z) = I(X; Y) - I(X; Y | Z).

This quantity is symmetric in X, Y and Z, despite the preceding asymmetric definition.
Unfortunately, I(X; Y; Z) is not necessarily nonnegative. Find X, Y and Z such that
I(X; Y; Z) < 0, and prove the following two identities:
(a) I(X; Y; Z) = H(X, Y, Z) - H(X) - H(Y) - H(Z) + I(X; Y) + I(Y; Z) + I(Z; X)
(b) I(X; Y; Z) = H(X, Y, Z) - H(X, Y) - H(Y, Z) - H(Z, X) + H(X) + H(Y) + H(Z)
The first identity can be understood using the Venn diagram analogy for entropy and
mutual information. The second identity follows easily from the first.
Solutions:
Venn Diagrams.
To show the first identity,

    I(X; Y; Z)
      = I(X; Y) - I(X; Y | Z)                                (by definition)
      = I(X; Y) - (I(X; Y, Z) - I(X; Z))                     (by the chain rule)
      = I(X; Y) + I(X; Z) - I(X; Y, Z)
      = I(X; Y) + I(X; Z) - (H(X) + H(Y, Z) - H(X, Y, Z))
      = I(X; Y) + I(X; Z) - H(X) + H(X, Y, Z) - H(Y, Z)
      = I(X; Y) + I(X; Z) - H(X) + H(X, Y, Z) - (H(Y) + H(Z) - I(Y; Z))
      = I(X; Y) + I(X; Z) + I(Y; Z) + H(X, Y, Z) - H(X) - H(Y) - H(Z).
To show the second identity, simply substitute for I(X; Y), I(X; Z), and I(Y; Z) using
equations of the form I(X; Y) = H(X) + H(Y) - H(X, Y).
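Written out, that substitution into identity (a) goes as follows:

```latex
\begin{align*}
I(X;Y;Z) &= H(X,Y,Z) - H(X) - H(Y) - H(Z) \\
         &\quad + \bigl[H(X)+H(Y)-H(X,Y)\bigr]
               + \bigl[H(Y)+H(Z)-H(Y,Z)\bigr]
               + \bigl[H(Z)+H(X)-H(Z,X)\bigr] \\
         &= H(X,Y,Z) - H(X,Y) - H(Y,Z) - H(Z,X) + H(X) + H(Y) + H(Z),
\end{align*}
```

since each single-variable entropy appears once with a minus sign and twice with a plus sign, leaving a net +H(X) + H(Y) + H(Z). This is exactly identity (b).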
These two identities show that I(X; Y; Z) is a symmetric (but not necessarily
nonnegative) function of three random variables.
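The problem also asks for X, Y, Z with I(X; Y; Z) < 0. A standard example (supplied here as an illustration; the XOR construction is an assumption, not taken from the text above) is X and Y i.i.d. Bernoulli(1/2) with Z = X XOR Y: then I(X; Y) = 0 but I(X; Y | Z) = 1, so I(X; Y; Z) = -1. A short numerical check of this value, and of both identities, in Python:

```python
import itertools
from math import log2

# Joint distribution of (X, Y, Z): X, Y i.i.d. Bernoulli(1/2), Z = X XOR Y.
# (Illustrative example -- this construction is assumed, not from the text.)
p = {}
for x, y in itertools.product([0, 1], repeat=2):
    p[(x, y, x ^ y)] = 0.25

def H(*idx):
    """Joint entropy (in bits) of the coordinates listed in idx."""
    marg = {}
    for outcome, prob in p.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + prob
    return -sum(q * log2(q) for q in marg.values() if q > 0)

def I(a, b):
    """Pairwise mutual information I = H(a) + H(b) - H(a,b)."""
    return H(a) + H(b) - H(a, b)

# Coordinates: 0 = X, 1 = Y, 2 = Z.
# I(X;Y;Z) via identity (a):
ia = H(0, 1, 2) - H(0) - H(1) - H(2) + I(0, 1) + I(1, 2) + I(2, 0)
# I(X;Y;Z) via identity (b):
ib = H(0, 1, 2) - H(0, 1) - H(1, 2) - H(2, 0) + H(0) + H(1) + H(2)
# I(X;Y;Z) via the definition, using
# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z):
idef = I(0, 1) - (H(0, 2) + H(1, 2) - H(2) - H(0, 1, 2))

print(ia, ib, idef)  # all three equal -1.0
```

All three routes agree and give -1, confirming both identities on this example and exhibiting a case where I(X; Y; Z) < 0.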
2. Conditional entropy.
Under what conditions does H(X | g(Y)) = H(X | Y)?
3. Sequence length.
How much information does the length of a sequence give about the content of a
sequence? Suppose we consider a Bernoulli(1/2) process {X_i}.