Introduction to Information Theory (67548)
Assignment 5 Solution
Lecturer: Prof. Michael Werman
January 27, 2009
Note: Unless specified otherwise, all entropies and logarithms are taken with base 2.
Problem 1
Another Error Correcting Codes Bound
1. Assume we have three different codewords $x_1, x_2, x_3$. Without loss of generality, we may assume that the first $2n/3$ bits of $x_1$ are all 0's and the first $2n/3$ bits of $x_2$ are all 1's. Now consider codeword $x_3$. Regardless of the values of its last $n/3$ bits, it must have at least $n/3 + 1$ ones among its first $2n/3$ entries in order to have distance more than $2n/3$ from $x_1$. But this implies that the distance between $x_3$ and $x_2$ is less than $2n/3$, a contradiction. Therefore we can have at most 2 codewords (say, the all-0's and all-1's bit strings).
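The impossibility argument above can be checked by brute force for small $n$: a minimal sketch (the function names are my own) that searches all length-$n$ binary words for a triple with pairwise Hamming distance strictly greater than $2n/3$.

```python
# Brute-force check of part 1: for n divisible by 3, no three length-n
# binary words are pairwise at Hamming distance strictly greater than 2n/3.
# Exponential search -- keep n tiny.
from itertools import product

def hamming(a, b):
    """Hamming distance between two equal-length bit tuples."""
    return sum(x != y for x, y in zip(a, b))

def triple_exists(n):
    """True iff some triple of length-n words is pairwise at distance > 2n/3."""
    words = list(product((0, 1), repeat=n))
    bound = 2 * n // 3
    for i, x in enumerate(words):
        for j in range(i + 1, len(words)):
            if hamming(x, words[j]) <= bound:
                continue  # this pair is already too close
            for k in range(j + 1, len(words)):
                if (hamming(x, words[k]) > bound
                        and hamming(words[j], words[k]) > bound):
                    return True
    return False

print(triple_exists(3), triple_exists(6))  # no such triple for n = 3 or n = 6
```

As the proof predicts, the search finds no such triple, while a *pair* at distance greater than $2n/3$ (a word and its complement) always exists.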
2. This is a simple counting argument: the number of ordered pairs $x, y \in C$, $x \neq y$, is $|C|(|C| - 1)$. For each such pair, $d(x, y) \ge d$, from which the inequality follows:
$$\sum_{x, y \in C,\, x \neq y} d(x, y) \;\ge\; |C|(|C| - 1)\, d.$$
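A quick numerical sanity check of this inequality, using an arbitrary small code of my own choosing (an assumption, not from the assignment):

```python
# Check part 2: for any code C with minimum distance d, the |C|(|C|-1)
# ordered pairs each contribute at least d to the pairwise-distance sum,
# so |C|(|C|-1)*d <= sum over ordered pairs of d(x, y).
from itertools import combinations

def hamming(a, b):
    """Hamming distance between two equal-length bit tuples."""
    return sum(x != y for x, y in zip(a, b))

# An arbitrary example code (hypothetical, for illustration only).
C = [(0, 0, 0, 0), (1, 1, 1, 0), (0, 1, 1, 1), (1, 0, 0, 1)]

d = min(hamming(x, y) for x, y in combinations(C, 2))       # minimum distance
total = sum(hamming(x, y) for x in C for y in C if x != y)  # ordered pairs
print(len(C) * (len(C) - 1) * d, "<=", total)
```

Here every ordered pair is counted, so each unordered pair appears twice in `total`.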
3. In each column, any pair of a zero and a one contributes 2 to the sum $\sum_{x, y \in C,\, x \neq y} d(x, y)$ (1 for the ordered pair $(x, y)$ and 1 for the ordered pair $(y, x)$).
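The column-counting step can be verified directly: if column $i$ of the code matrix contains $k_i$ ones out of $|C|$ rows, it contributes $2\,k_i(|C| - k_i)$ to the ordered-pair distance sum. A minimal sketch, reusing the same example code as above (my own choice, not from the assignment):

```python
# Check part 3: summing 2*k*(|C|-k) over columns (k = number of ones in the
# column) reproduces the total ordered-pair Hamming distance, since every
# 0/1 pair within a column is counted once in each order.
C = [(0, 0, 0, 0), (1, 1, 1, 0), (0, 1, 1, 1), (1, 0, 0, 1)]
M = len(C)

# Ordered-pair distance sum, computed directly.
total = sum(sum(a != b for a, b in zip(x, y)) for x in C for y in C if x != y)

# Same sum, computed column by column.
by_columns = 0
for col in zip(*C):                  # iterate over columns of the code matrix
    k = sum(col)                     # number of ones in this column
    by_columns += 2 * k * (M - k)    # each 0/1 pair counted in both orders

print(total, "==", by_columns)
```

This identity is the usual starting point for bounding $\sum_{x \neq y} d(x, y)$ from above, since $k(|C| - k)$ is maximized at $k = |C|/2$.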