Mathematical methods in communication
2nd Semester 2009
Homework Set #4
Channel and Source coding
1. Lossless source coding with side information.
Consider lossless source coding with side information that is available at the encoder and decoder, where the source X and the side information Y are i.i.d. ∼ P_{X,Y}(x, y).
[Figure: the encoder maps (X^n, Y^n) to an index f(X^n, Y^n) ∈ {1, 2, ..., 2^{nR}}; the decoder outputs the estimate X̂^n(f(X^n, Y^n), Y^n); the side information Y^n is available at both the encoder and the decoder.]
Figure 1: Lossless source coding with side information at the encoder and decoder.
Show that a code with rate R < H(X|Y) cannot be achievable, and interpret the result.
Hint: Let T denote f(X^n, Y^n). Consider

    nR ≥ H(T) ≥ H(T | Y^n),    (1)

and use similar steps, including Fano's inequality, as we used in class to prove the converse where side information was not available.
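As a sanity check on the bound R ≥ H(X|Y), the conditional entropy can be computed numerically for a small joint distribution. The following is an illustrative sketch (the joint pmf below is a made-up example, not part of the problem): for a doubly symmetric binary source where X = Y with probability 0.9, H(X|Y) equals the binary entropy h(0.1) ≈ 0.469 bits, so no lossless code can compress X below about 0.469 bits/symbol once Y is known at both ends.

```python
import numpy as np

def conditional_entropy(p_xy):
    """H(X|Y) in bits, for a joint pmf given as a 2-D array p_xy[x, y]."""
    p_y = p_xy.sum(axis=0)  # marginal of Y
    h = 0.0
    for x in range(p_xy.shape[0]):
        for y in range(p_xy.shape[1]):
            if p_xy[x, y] > 0:
                # -p(x,y) * log2 p(x|y), summed over all (x, y)
                h -= p_xy[x, y] * np.log2(p_xy[x, y] / p_y[y])
    return h

# Example joint pmf (hypothetical): X = Y w.p. 0.9, X != Y w.p. 0.1
p_xy = np.array([[0.45, 0.05],
                 [0.05, 0.45]])
print(conditional_entropy(p_xy))  # ~0.469 bits = binary entropy h(0.1)
```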
2. Preprocessing the output.
One is given a communication channel with transition probabilities p(y|x) and channel capacity

    C = max_{p(x)} I(X; Y).

A helpful statistician preprocesses the output by forming Ỹ = g(Y), yielding a channel p(ỹ|x). He claims that this will strictly improve the capacity.
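Since X — Y — g(Y) forms a Markov chain, the data processing inequality gives I(X; g(Y)) ≤ I(X; Y) for every input distribution, which is the key fact behind this problem. A small numerical sketch (the channel and the merging function g below are hypothetical examples, not from the problem statement):

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits for a joint pmf given as a 2-D array p_xy[x, y]."""
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2((p_xy / (p_x * p_y))[mask])))

# Hypothetical channel: uniform binary input, three output symbols
p_x = np.array([0.5, 0.5])
p_y_given_x = np.array([[0.7, 0.2, 0.1],
                        [0.1, 0.2, 0.7]])
p_xy = p_x[:, None] * p_y_given_x

# Preprocessing g merges output symbols 1 and 2: g(0)=0, g(1)=g(2)=1
g = np.array([0, 1, 1])
p_xgy = np.zeros((2, 2))
for y, gy in enumerate(g):
    p_xgy[:, gy] += p_xy[:, y]

# Data processing inequality: I(X; g(Y)) <= I(X; Y)
print(mutual_information(p_xy), mutual_information(p_xgy))
```

Maximizing both sides over p(x) preserves the inequality, so preprocessing can never increase capacity; it leaves capacity unchanged only when g is invertible on the support of Y (more precisely, when g is a sufficient statistic for X).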