Introduction to Information Theory (67548)
January 12, 2009
Assignment 4: Gaussian Channel and Differential Entropy
Lecturer: Prof. Michael Werman
Due: Sunday, January 25, 2009
Note: Unless specified otherwise, all entropies and logarithms should be taken with base 2.
Problem 1 Differential Entropy

1. Let X be a continuous random variable with entropy h(X). Let Y be another continuous random variable, defined via the relation Y = aX + c, where a, c are scalars and a ≠ 0. Find an expression for h(Y) as a function of h(X), a, c.
2. Prove the chain rule for differential entropy: if X_1, ..., X_n are continuous random variables, it holds that

    h(X_1, ..., X_n) = Σ_{i=1}^{n} h(X_i | X_1, ..., X_{i-1}).

Conclude that h(X_1, ..., X_n) ≤ Σ_{i=1}^{n} h(X_i).
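The subadditivity bound in part 2 can be illustrated numerically in the jointly Gaussian case, where h(X_1, ..., X_n) = (1/2) log2((2πe)^n det K) for covariance matrix K and each marginal entropy is (1/2) log2(2πe K_ii). This is a sketch, not a proof: the covariance matrix below is a hypothetical example chosen for illustration, and the function names are my own.

```python
import numpy as np

def gaussian_joint_entropy(K):
    """Differential entropy (bits) of a multivariate Gaussian with
    covariance K: h = 0.5 * log2((2*pi*e)^n * det K)."""
    n = K.shape[0]
    return 0.5 * np.log2((2 * np.pi * np.e) ** n * np.linalg.det(K))

def gaussian_marginal_entropy_sum(K):
    """Sum of the marginal entropies h(X_i) = 0.5 * log2(2*pi*e*K_ii)."""
    return sum(0.5 * np.log2(2 * np.pi * np.e * Kii) for Kii in np.diag(K))

# Hypothetical positive-definite covariance with correlated components.
K = np.array([[2.0, 0.8, 0.3],
              [0.8, 1.0, 0.5],
              [0.3, 0.5, 1.5]])

joint = gaussian_joint_entropy(K)
marginal_sum = gaussian_marginal_entropy_sum(K)
# Subadditivity: the joint entropy never exceeds the sum of marginals,
# with equality exactly when K is diagonal (independent components).
print(joint <= marginal_sum)
```

For Gaussians the inequality reduces to Hadamard's inequality, det K ≤ Π_i K_ii, which is one route to the conclusion asked for in part 2.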
Problem 2 Discrete Input, Continuous Output
Consider a channel whose input alphabet is X = {0, ±1, ±2}, and whose output is Y = X + Z, where Z is uniformly distributed over the interval [−1, 1]. Thus, the input of the channel is a discrete random variable, while its output is continuous.
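To get a feel for this mixed discrete/continuous channel, the setup can be simulated directly. The uniform input distribution below is an assumption for illustration only; the problem statement does not fix one.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility
n = 100_000

# Assumed uniform input over the alphabet {0, ±1, ±2}.
x = rng.choice([-2, -1, 0, 1, 2], size=n)
z = rng.uniform(-1.0, 1.0, size=n)   # noise Z ~ Unif[-1, 1]
y = x + z                            # channel output

# The output is continuous, supported on [-3, 3], and always lies
# within distance 1 of the transmitted symbol.
print(y.min() >= -3 and y.max() <= 3)
print(np.all(np.abs(y - x) <= 1))
```

Note that the output only ever confuses neighbouring input symbols: observing Y = 1.5, for instance, is consistent with X = 1 or X = 2 but with no other input.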