Introduction to Information Theory (67548)
January 12, 2009
Assignment 4: Gaussian Channel and Differential Entropy
Lecturer: Prof. Michael Werman
Due: Sunday, January 25, 2009
Note: Unless specified otherwise, all entropies and logarithms should be taken with base 2.
Problem 1
Differential Entropy
1. If $Y = aX + c$, we have that $f_Y(y) = \frac{1}{|a|} f_X\!\left(\frac{y-c}{a}\right)$, where $f_X(\cdot), f_Y(\cdot)$ are the density functions of $X, Y$ respectively. Therefore,
\begin{align*}
h(Y) &= -\int_{y=-\infty}^{\infty} f_Y(y) \log(f_Y(y))\, dy \\
&= -\int_{y=-\infty}^{\infty} \frac{1}{|a|} f_X\!\left(\frac{y-c}{a}\right) \log\!\left(\frac{1}{|a|} f_X\!\left(\frac{y-c}{a}\right)\right) dy \\
&= -\int_{y=-\infty}^{\infty} \frac{1}{|a|} f_X\!\left(\frac{y-c}{a}\right) \log f_X\!\left(\frac{y-c}{a}\right) dy - \int_{y=-\infty}^{\infty} \frac{1}{|a|} f_X\!\left(\frac{y-c}{a}\right) \log\frac{1}{|a|}\, dy \\
&= -\int_{z=-\infty}^{\infty} f_X(z) \log(f_X(z))\, dz - \int_{z=-\infty}^{\infty} f_X(z) \log\frac{1}{|a|}\, dz \\
&= h(X) + \log(|a|),
\end{align*}
where the step from the third line to the fourth was performed with the change of variables $z = (y-c)/a$.
So we see that shifting a random variable $X$ by a constant $c$ does not change its differential entropy, while multiplying by a constant $a$ changes its differential entropy by $\log(|a|)$.
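As a quick numerical sanity check (an illustration added here, not part of the original solution), the closed-form differential entropy of a Gaussian, $h = \frac{1}{2}\log_2(2\pi e\sigma^2)$, confirms the result: scaling by $a$ shifts the entropy by exactly $\log_2|a|$, and the shift $c$ drops out. Variable names below are illustrative.

```python
import math

def gaussian_diff_entropy_bits(sigma):
    """Differential entropy (in bits) of N(mu, sigma^2); the mean is irrelevant."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

# Y = aX + c with X ~ N(0, sigma^2) gives Y ~ N(c, (a*sigma)^2).
a, c, sigma = -3.0, 5.0, 2.0
h_X = gaussian_diff_entropy_bits(sigma)
h_Y = gaussian_diff_entropy_bits(abs(a) * sigma)

# The shift c drops out entirely; the scale contributes log2(|a|).
print(h_Y - h_X)           # equals log2(|a|)
print(math.log2(abs(a)))
```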
2. The chain rule for differential entropy follows directly from the definitions (try it!), and so does the resulting conclusion.
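For reference, the chain rule in question is the standard identity (stated here for completeness; it is not spelled out in the original text):

```latex
h(X_1, X_2, \dots, X_n) = \sum_{i=1}^{n} h(X_i \mid X_1, \dots, X_{i-1})
```

It follows by expanding $f(x_1,\dots,x_n) = \prod_{i} f(x_i \mid x_1,\dots,x_{i-1})$ inside the logarithm of the definition, exactly as in the discrete case.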
Problem 2
Discrete Input, Continuous Output
Recall that $I(X;Y) = h(Y) - h(Y \mid X)$. Given the value of $X$, $Y$ is uniformly distributed in an interval of length 2. Therefore, $h(Y \mid X) = \log 2 = 1$.
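To illustrate the value of $h(Y \mid X)$ numerically (a sketch added here, not part of the original solution; the function name is made up), a Riemann sum of $-\int f \log_2 f$ for a uniform density on an interval of length 2 recovers $\log_2 2 = 1$ bit.

```python
import math

# Given X = x, Y is uniform on an interval of length L = 2; the differential
# entropy of a uniform density on an interval of length L is log2(L).
def uniform_entropy_bits(L, n=10_000):
    """Riemann-sum approximation of -integral of f*log2(f) over the interval."""
    f = 1.0 / L                # constant density on the interval
    dy = L / n
    return sum(-f * math.log2(f) * dy for _ in range(n))

h_given_x = uniform_entropy_bits(2.0)   # close to log2(2) = 1 bit, for every x
print(h_given_x)
```

Since this holds for every value of $x$, averaging over $X$ leaves it unchanged, which is why $h(Y \mid X) = 1$ bit.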