ECE 534: Elements of Information Theory, Fall 2010
Homework 12
Solutions (ALL DUE TO Kenneth S. Palacio Baus)
December 1, 2010
1. Problem 15.7. Convexity of capacity region of broadcast channel. Let $\mathcal{C} \subseteq \mathbb{R}^2$ be the capacity region of all achievable rate pairs $R = (R_1, R_2)$ for the broadcast channel. Show that $\mathcal{C}$ is a convex set by using a timesharing argument. Specifically, show that if $R^{(1)}$ and $R^{(2)}$ are achievable, then $\lambda R^{(1)} + (1 - \lambda) R^{(2)}$ is achievable for $0 \le \lambda \le 1$.
Solution:
We have two achievable rate pairs, $R^{(1)} = (R_1^{(1)}, R_2^{(1)})$ and $R^{(2)} = (R_1^{(2)}, R_2^{(2)})$, for which we have two sequences of codes, $((2^{nR_1^{(1)}}, 2^{nR_2^{(1)}}), n)$ and $((2^{nR_1^{(2)}}, 2^{nR_2^{(2)}}), n)$. As in the proof of Theorem 15.3.2 for the multiple-access channel in the textbook, we can apply a similar timesharing argument and construct a third codebook of length $n$ at rate $\lambda R^{(1)} + (1 - \lambda) R^{(2)}$, using the first codebook for the first $\lambda n$ symbols and the second codebook for the last $(1 - \lambda) n$ symbols.
The number of $X_1$ codewords for the new code is
$$2^{n \lambda R_1^{(1)}} \cdot 2^{n (1 - \lambda) R_1^{(2)}} = 2^{n (\lambda R_1^{(1)} + (1 - \lambda) R_1^{(2)})},$$
and the number of $X_2$ codewords for the new code is
$$2^{n \lambda R_2^{(1)}} \cdot 2^{n (1 - \lambda) R_2^{(2)}} = 2^{n (\lambda R_2^{(1)} + (1 - \lambda) R_2^{(2)})}.$$
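As a quick sanity check of the codeword-count arithmetic above, the following sketch verifies the exponent identity numerically. The values of $n$, $\lambda$, and the component rates are illustrative choices, not taken from the problem:

```python
import math

# Sanity check of the timesharing codeword count: concatenating a
# rate-R1a code over the first lambda*n symbols with a rate-R1b code
# over the remaining (1-lambda)*n symbols gives
# 2^(n*(lambda*R1a + (1-lambda)*R1b)) codewords in total.
n, lam = 100, 0.3          # illustrative blocklength and timesharing fraction
R1a, R1b = 0.5, 0.8        # hypothetical rates of the two component codes

product = 2 ** (n * lam * R1a) * 2 ** (n * (1 - lam) * R1b)
combined = 2 ** (n * (lam * R1a + (1 - lam) * R1b))

assert math.isclose(product, combined)
```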
So we have obtained the rate $\lambda R^{(1)} + (1 - \lambda) R^{(2)}$. Recalling that the overall probability of error is less than the sum of the probabilities of error for each of the two segments,
$$P_e^{(n)} \le P_e^{(\lambda n)}(1) + P_e^{((1 - \lambda) n)}(2),$$
we see that the probability of error goes to 0 as $n \to \infty$; hence the rate $\lambda R^{(1)} + (1 - \lambda) R^{(2)}$ is achievable, and $\mathcal{C}$ is convex.
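The union-bound step can be illustrated numerically. The sketch below assumes, purely for illustration, that each component code's error probability decays exponentially in its own blocklength (the exponents E1, E2 are hypothetical, not from the solution), and checks that the combined bound still vanishes as $n$ grows:

```python
import math

# Hypothetical error exponents for the two component codes and the
# timesharing fraction; chosen only to illustrate the union bound
# P_e(n) <= P_e1(lambda*n) + P_e2((1-lambda)*n).
E1, E2, lam = 0.05, 0.02, 0.4

def union_bound(n):
    """Upper bound on the timeshared code's error probability,
    assuming P_e1(m) = exp(-E1*m) and P_e2(m) = exp(-E2*m)."""
    return math.exp(-E1 * lam * n) + math.exp(-E2 * (1 - lam) * n)

bounds = [union_bound(n) for n in (10, 100, 1000)]
assert bounds[0] > bounds[1] > bounds[2]  # bound shrinks with n
assert bounds[2] < 1e-5                   # and becomes negligible
```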
2. Problem 15.11. Converse for the degraded broadcast channel. The following chain of inequalities proves the converse for the degraded discrete memoryless broadcast channel. Provide reasons for each of the labeled inequalities.

Setup for the converse for the degraded broadcast channel capacity:
$$(W_1, W_2) \text{ indep.} \to X^n(W_1, W_2) \to Y_1^n \to Y_2^n$$
Encoding: $f_n : 2^{nR_1} \times 2^{nR_2} \to \mathcal{X}^n$.
Decoding: $g_n : \mathcal{Y}_1^n \to 2^{nR_1}$, $h_n : \mathcal{Y}_2^n \to 2^{nR_2}$.
Let $U_i = (W_2, Y_1^{i-1})$.
Then
$$nR_2 \overset{\text{Fano}}{\le} I(W_2; Y_2^n) \quad (1)$$
$$\overset{(a)}{=} \sum_{i=1}^{n} I(W_2; Y_{2i} \mid Y_2^{i-1}) \quad (2)$$
$$\overset{(b)}{=} \sum_i \left( H(Y_{2i} \mid Y_2^{i-1}) - H(Y_{2i} \mid W_2, Y_2^{i-1}) \right) \quad (3)$$
$$\overset{(c)}{\le} \sum_i \left( H(Y_{2i}) - H(Y_{2i} \mid W_2, Y_2^{i-1}, Y_1^{i-1}) \right) \quad (4)$$
$$\overset{(d)}{=} \sum_i \left( H(Y_{2i}) - H(Y_{2i} \mid W_2, Y_1^{i-1}) \right) \quad (5)$$
$$\overset{(e)}{=} \sum_i I(U_i; Y_{2i}) \quad (6)$$
Solution:
Reasons for each of the labeled inequalities:
(a) is given by the chain rule for mutual information.
(b) corresponds to the definition of conditional mutual information.
(c) conditioning reduces entropy: dropping $Y_2^{i-1}$ from the conditioning in the first term can only increase it, and adding $Y_1^{i-1}$ to the conditioning in the second term can only decrease it.
(d) since $Y_{2i}$ is conditionally independent of $Y_2^{i-1}$ given $Y_1^{i-1}$: by the degradedness of the channel, $Y_2^{i-1}$ depends on the past only through $Y_1^{i-1}$.
(e) follows from the definition $U_i = (W_2, Y_1^{i-1})$ together with the definition of mutual information.
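Step (c) rests on the fact that conditioning reduces entropy, $H(Y \mid X) \le H(Y)$. As a small numeric illustration (the joint pmf below is an arbitrary toy example, not from the problem), this can be checked directly:

```python
import math

# Arbitrary toy joint pmf p[x][y] on {0,1} x {0,1,2}; entries sum to 1.
p = [[0.10, 0.20, 0.05],
     [0.25, 0.10, 0.30]]

def H(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# Marginal entropy H(Y).
p_y = [sum(p[x][y] for x in range(2)) for y in range(3)]
H_y = H(p_y)

# Conditional entropy H(Y|X) = sum_x p(x) * H(Y | X = x).
H_y_given_x = 0.0
for x in range(2):
    p_x = sum(p[x])
    H_y_given_x += p_x * H([p[x][y] / p_x for y in range(3)])

assert H_y_given_x <= H_y + 1e-12  # conditioning reduces entropy
```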
Fall '10, NatashaDevroye