Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007)

Lecture 9: Converse of Shannon's Capacity Theorem
September 17, 2007

Lecturer: Atri Rudra
Scribe: ThanhNhan Nguyen & Atri Rudra

In the last lecture, we stated Shannon's capacity theorem for the BSC, which we restate here:

Theorem 0.1. Let 0 ≤ p < 1/2 be a real number. For every 0 < ε ≤ 1/2 − p, the following statements are true for large enough integer n (here H(·) denotes the binary entropy function):

(i) There exist a real δ > 0, an encoding function E : {0,1}^k → {0,1}^n, and a decoding function D : {0,1}^n → {0,1}^k, where k ≤ ⌊(1 − H(p + ε))n⌋, such that the following holds for every m ∈ {0,1}^k:

    Pr_{noise e of BSC_p} [ D(E(m) + e) ≠ m ] ≤ 2^(−δn).

(ii) If k ≥ ⌈(1 − H(p) + ε)n⌉, then for every pair of encoding and decoding functions E : {0,1}^k → {0,1}^n and D : {0,1}^n → {0,1}^k, the following is true for some m ∈ {0,1}^k:

    Pr_{noise e of BSC_p} [ D(E(m) + e) ≠ m ] ≥ 1/2.

In today's lecture, we will prove part (ii) of Theorem 0.1.

1 Preliminaries

Before we begin with the proof, we will need a few results, which we discuss first.

1.1 Chernoff Bound

The Chernoff bound is a bound on the tail of a certain distribution that will be useful for us. Here we state the version of the Chernoff bound that we will need.

Proposition 1.1. For i = 1, …, n, let X_i be a binary random variable that takes the value 1 with probability p and the value 0 with probability 1 − p. Then the following bounds are true:

(i) Pr[ ∑_{i=1}^{n} X_i ≥ (1 + ε...
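As a quick numerical illustration of the two rate thresholds in Theorem 0.1, the following Python sketch computes k ≤ ⌊(1 − H(p + ε))n⌋ (the achievable regime of part (i)) and k ≥ ⌈(1 − H(p) + ε)n⌉ (the impossible regime of part (ii)). The parameters p = 0.1, ε = 0.05, n = 1000 are illustrative choices, not values from the lecture.

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p log2(p) - (1-p) log2(1-p), with H(0) = H(1) = 0."""
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Illustrative parameters (not from the lecture).
p, eps, n = 0.1, 0.05, 1000

# Part (i): some code with this many message bits decodes reliably.
k_achievable = math.floor((1 - binary_entropy(p + eps)) * n)

# Part (ii): at this many message bits, every code fails on some message.
k_impossible = math.ceil((1 - binary_entropy(p) + eps) * n)

print(k_achievable, k_impossible)
```

Note the gap between the two thresholds: as ε → 0 both approach (1 − H(p))n, which is why 1 − H(p) is called the capacity of BSC_p.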
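The statement of Proposition 1.1 is cut off above, so as a sanity check here is a small simulation against the standard multiplicative form of the upper tail, Pr[∑ X_i ≥ (1 + ε)pn] ≤ e^(−ε²pn/3); the constant 3 in the exponent is the usual textbook choice and is an assumption here, since the lecture's exact constant is not visible.

```python
import math
import random

random.seed(0)

def chernoff_upper_tail(p, eps, n, trials=5000):
    """Estimate Pr[sum of n Bernoulli(p) variables >= (1+eps) p n] by
    simulation, and return it alongside exp(-eps^2 p n / 3), the standard
    multiplicative Chernoff bound (assumed form; the lecture's exact
    statement is truncated)."""
    threshold = (1 + eps) * p * n
    hits = 0
    for _ in range(trials):
        s = sum(1 for _ in range(n) if random.random() < p)
        if s >= threshold:
            hits += 1
    empirical = hits / trials
    bound = math.exp(-eps * eps * p * n / 3)
    return empirical, bound

emp, bound = chernoff_upper_tail(p=0.1, eps=0.5, n=500)
print(emp, bound)  # the empirical tail should sit below the bound
```

Here the mean of the sum is pn = 50, and the event asks for 75 or more successes; the simulation shows this happens far less often than the (rather loose) exponential bound allows.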