(Fall 2007)
Lecture 19: Elias–Bassalygo Bound
October 10, 2007
Lecturer: Atri Rudra
In the last lecture, we saw the $q$-ary version of the Johnson bound on the rate and distance of a code, which we repeat below.
1   Johnson bound
Theorem 1.1 (Johnson Bound). Let $C \subseteq [q]^n$ be a code of distance $d$. If $\rho < J_q(d/n)$, then $C$ is a $(\rho, qdn)$-list decodable code, where the function $J_q(\delta)$ is defined as
$$J_q(\delta) = \left(1 - \frac{1}{q}\right)\left(1 - \sqrt{1 - \frac{q\delta}{q-1}}\right).$$
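As a quick numerical sanity check, the Johnson radius $J_q(\delta)$ can be evaluated directly from the formula above (a minimal Python sketch; the function name is my own, not from the notes):

```python
import math

def johnson_radius(q: int, delta: float) -> float:
    """J_q(delta) = (1 - 1/q) * (1 - sqrt(1 - q*delta/(q-1)))."""
    return (1 - 1 / q) * (1 - math.sqrt(1 - q * delta / (q - 1)))

# Binary case: at the extreme relative distance delta = 1/2,
# the Johnson radius reaches 1/2.
print(johnson_radius(2, 0.5))   # -> 0.5

# For smaller delta, J_2(delta) still exceeds delta/2, i.e. the
# Johnson bound guarantees list decoding beyond half the distance.
print(johnson_radius(2, 0.1))   # strictly greater than 0.05
```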
Recall that the best upper bound on
R
(in terms of
δ
) that we have seen so far is a combination
of the Plotkin and Hamming bounds (see Figure 1).
2   Elias–Bassalygo bound
We begin with the statement of a new upper bound on the rate, called the Elias–Bassalygo bound.
Theorem 2.1 (Elias–Bassalygo bound). Every $q$-ary code of rate $R$, distance $\delta$, and large enough block length satisfies
$$R \le 1 - H_q(J_q(\delta)) + o(1).$$
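Taking $H_q$ to be the usual $q$-ary entropy function, the bound can be computed numerically and compared against the Hamming bound $R \le 1 - H_q(\delta/2) + o(1)$; since $J_q(\delta) \ge \delta/2$, the Elias–Bassalygo bound is at least as strong. A short sketch (function names are illustrative, not from the notes):

```python
import math

def entropy_q(q: int, x: float) -> float:
    """q-ary entropy H_q(x) for 0 < x < 1; H_q(0) = 0."""
    if x == 0:
        return 0.0
    return (x * math.log(q - 1, q)
            - x * math.log(x, q)
            - (1 - x) * math.log(1 - x, q))

def johnson_radius(q: int, delta: float) -> float:
    return (1 - 1 / q) * (1 - math.sqrt(1 - q * delta / (q - 1)))

def elias_bassalygo(q: int, delta: float) -> float:
    """Upper bound on rate: 1 - H_q(J_q(delta))."""
    return 1 - entropy_q(q, johnson_radius(q, delta))

# At delta = 0.3 (binary), the Elias-Bassalygo bound is strictly
# below the Hamming bound 1 - H_2(delta/2).
print(elias_bassalygo(2, 0.3), 1 - entropy_q(2, 0.15))
```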
The proof of the theorem above uses the following lemma:
Lemma 2.2. Given a $q$-ary code $C \subseteq [q]^n$ and $0 \le e \le n$, there exists a Hamming ball of radius $e$ with at least $\frac{|C| \cdot Vol_q(0, e)}{q^n}$ codewords in it.
Proof. We will prove the existence of the required Hamming ball by the probabilistic method. Pick a received word $y \in [q]^n$ uniformly at random. It is easy to check that the expected value of $|B_q(y, e) \cap C|$ is $\frac{|C| \cdot Vol_q(0, e)}{q^n}$: each fixed codeword lies in $B_q(y, e)$ with probability $\frac{Vol_q(0, e)}{q^n}$, and the claim follows by linearity of expectation. (We have seen this argument earlier, when we proved the negative part of the list-decoding capacity.)
This implies the existence of a $y \in [q]^n$ such that
$$|B_q(y, e) \cap C| \ge \frac{|C| \cdot Vol_q(0, e)}{q^n},$$
as desired.
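The averaging argument can be verified exhaustively for a tiny code: summing $|B_q(y, e) \cap C|$ over all $y \in [q]^n$ and dividing by $q^n$ gives exactly $\frac{|C| \cdot Vol_q(0, e)}{q^n}$, so some ball must meet or exceed that average. A sketch (the code $C$ and parameters are made up for illustration):

```python
from itertools import product

q, n, e = 2, 4, 1
C = [(0, 0, 0, 0), (1, 1, 1, 1), (1, 1, 0, 0)]  # toy code over [2]^4

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

# Vol_q(0, e): number of words within distance e of a fixed center.
vol = sum(1 for x in product(range(q), repeat=n)
          if hamming(x, (0,) * n) <= e)

# Average of |B_q(y, e) ∩ C| over all received words y.
total = sum(sum(1 for c in C if hamming(y, c) <= e)
            for y in product(range(q), repeat=n))
avg = total / q**n

# The average equals |C| * Vol_q(0, e) / q^n exactly (double counting).
print(avg, len(C) * vol / q**n)

# Some ball of radius e contains at least the average number of codewords.
best = max(sum(1 for c in C if hamming(y, c) <= e)
           for y in product(range(q), repeat=n))
print(best)
```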