EECS 229A
Spring 2007
Solutions to Homework 5
1. Problem 8.8 on pg. 258 of the text.

Solution: Channel with uniformly distributed noise.
Consider the probability distribution $(\alpha_i,\ i = 0, \pm 1, \pm 2)$ on $X$, where $\sum_i \alpha_i = 1$. This results in $Y$ having the density
$$
f_Y(y) = \begin{cases}
\frac{1}{2}\alpha_{-2} & \text{if } y \in (-3,-2) \\
\frac{1}{2}(\alpha_{-2} + \alpha_{-1}) & \text{if } y \in (-2,-1) \\
\frac{1}{2}(\alpha_{-1} + \alpha_{0}) & \text{if } y \in (-1,0) \\
\frac{1}{2}(\alpha_{0} + \alpha_{1}) & \text{if } y \in (0,1) \\
\frac{1}{2}(\alpha_{1} + \alpha_{2}) & \text{if } y \in (1,2) \\
\frac{1}{2}\alpha_{2} & \text{if } y \in (2,3)
\end{cases}
$$
Since $f_Y$ is constant on each of these six unit-length intervals, the corresponding differential entropy $h(Y)$ equals the entropy of the probability distribution on 6 points given by
$$
\left(\tfrac{1}{2}\alpha_{-2},\ \tfrac{1}{2}(\alpha_{-2} + \alpha_{-1}),\ \tfrac{1}{2}(\alpha_{-1} + \alpha_{0}),\ \tfrac{1}{2}(\alpha_{0} + \alpha_{1}),\ \tfrac{1}{2}(\alpha_{1} + \alpha_{2}),\ \tfrac{1}{2}\alpha_{2}\right).
$$
The largest this can be is $\log 6$, with equality achieved when
$$
\alpha_{-2} = \alpha_0 = \alpha_2 = \tfrac{1}{3}, \qquad \alpha_{-1} = \alpha_1 = 0.
$$
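As a numerical sanity check (not part of the original solution), the following Python sketch evaluates the 6-point distribution for the maximizing choice of the $\alpha_i$ and confirms that its entropy is $\log 6$:

```python
import math

# Maximizing input distribution: alpha_{-2} = alpha_0 = alpha_2 = 1/3,
# alpha_{-1} = alpha_1 = 0 (keys are the indices -2..2).
alpha = {-2: 1/3, -1: 0.0, 0: 1/3, 1: 0.0, 2: 1/3}

# Probabilities of the six unit intervals (-3,-2), ..., (2,3) from the text.
p = [alpha[-2] / 2,
     (alpha[-2] + alpha[-1]) / 2,
     (alpha[-1] + alpha[0]) / 2,
     (alpha[0] + alpha[1]) / 2,
     (alpha[1] + alpha[2]) / 2,
     alpha[2] / 2]

# Discrete entropy in nats, with 0 log 0 treated as 0.
H = -sum(q * math.log(q) for q in p if q > 0)
print(H, math.log(6))  # the six probabilities are all 1/6, so H = log 6
```

With this choice the six interval probabilities are all $1/6$, i.e. $Y$ is uniform on $(-3,3)$, which is exactly why the maximum is attained.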
We also note that the conditional differential entropy $h(Y \mid X)$ does not depend on the probability distribution of $X$: it equals $h(Z) = \log 2$, the differential entropy of the noise, which is uniform on an interval of length 2. Hence the capacity of the channel is $C = \log 6 - \log 2 = \log 3$.
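A short simulation (an illustrative check, not part of the original solution; the nearest-point decoding rule is assumed here) shows why $\log 3$ nats per use are achievable: under the maximizing distribution the inputs $\{-2, 0, 2\}$ produce the disjoint output intervals $(-3,-1)$, $(-1,1)$, $(1,3)$, so $X$ can be recovered from $Y$ without error:

```python
import random

random.seed(0)
errors = 0
trials = 100_000
for _ in range(trials):
    x = random.choice([-2, 0, 2])   # capacity-achieving input support
    z = random.uniform(-1.0, 1.0)   # uniform noise on (-1, 1)
    y = x + z
    # The three output intervals are disjoint, so decoding to the
    # nearest point of {-2, 0, 2} recovers x exactly.
    x_hat = min([-2, 0, 2], key=lambda s: abs(y - s))
    errors += (x_hat != x)
print(errors)  # 0: the channel acts as a noiseless ternary channel
```

In effect the channel with this input restriction is a noiseless channel with three symbols, whose capacity is $\log 3$.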
2. Problem 9.3 on pg. 291 of the text.

Solution: Output power constraint.
We would expect the capacity of the channel to be given by
$$
C = \max_{p(x)} I(X;Y),
$$
where the maximum is taken over all distributions $p(x)$ on $X$ for which the resulting $Y$, given by $Y = X + Z$ with $Z$ independent of $X$ and $Z \sim \mathcal{N}(0, \sigma^2)$, satisfies $E[Y^2] \le P$. Assuming this is true, we can write, for any choice of $p(x)$, and assuming that $P \ge \sigma^2$,
$$
\begin{aligned}
I(X;Y) &= h(Y) - h(Y \mid X) \\
&= h(Y) - \tfrac{1}{2}\log 2\pi e \sigma^2 \\
&\overset{(a)}{\le} \tfrac{1}{2}\log 2\pi e P - \tfrac{1}{2}\log 2\pi e \sigma^2 \\
&= \tfrac{1}{2}\log \frac{P}{\sigma^2},
\end{aligned}
$$
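As a small worked check (with assumed example values $P = 4$, $\sigma^2 = 1$, not from the original), the bound can be evaluated directly from the two Gaussian differential entropies used in the chain above, where equality in $(a)$ holds when $Y \sim \mathcal{N}(0, P)$:

```python
import math

P = 4.0        # output power constraint (assumed example value)
sigma2 = 1.0   # noise variance (assumed example value)

# h(Y) for Y ~ N(0, P), which achieves equality in step (a),
# and h(Y|X) = h(Z) for Z ~ N(0, sigma^2).
h_Y = 0.5 * math.log(2 * math.pi * math.e * P)
h_Z = 0.5 * math.log(2 * math.pi * math.e * sigma2)

C = h_Y - h_Z
print(C, 0.5 * math.log(P / sigma2))  # both equal (1/2) log(P / sigma^2)
```

The $2\pi e$ factors cancel, leaving $C = \tfrac{1}{2}\log(P/\sigma^2)$, consistent with the last line of the derivation.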