
# Lecture 30: Achieving the BSC_p Capacity (II)


Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007)
Lecture 30: Achieving the $\mathrm{BSC}_p$ capacity (II)
Tuesday, November 6, 2007
Lecturer: Atri Rudra — Scribe: Nathan Russell & Atri Rudra

In the last lecture, we began describing our $\mathrm{BSC}_p$ capacity-achieving code $C^*$, which is a concatenated code $C_{\mathrm{out}} \circ C_{\mathrm{in}}$, where $C_{\mathrm{out}}$ and $C_{\mathrm{in}}$ satisfy the following properties:

(i) $C_{\mathrm{out}}$: The outer code has block length $N$ and rate $1 - \varepsilon/2$ over $\mathbb{F}_{2^k}$, with $k = O(\log N)$. Further, the outer code has a unique decoding algorithm $D_{\mathrm{out}}$ that can correct up to a $\gamma$ fraction of worst-case errors in time $T_{\mathrm{out}}(N)$.

(ii) $C_{\mathrm{in}}$: The inner code has dimension $k$, block length $n$, and rate $1 - H(p) - \varepsilon/2$. Further, there is a decoding algorithm $D_{\mathrm{in}}$ that runs in time $T_{\mathrm{in}}(k)$ and has decoding error probability at most $\gamma/2$ over $\mathrm{BSC}_p$.

In today's lecture, we will analyze the properties of $C^*$ and see how to obtain codes $C_{\mathrm{out}}$ and $C_{\mathrm{in}}$ with the desired properties. For the rest of the lecture, we will assume that $p$ is an absolute constant. Note that this implies $k = \Theta(n)$, so we will use $k$ and $n$ interchangeably in our asymptotic bounds. Finally, we will write $\tilde{N} = nN$ for the block length of $C^*$.

## 1 Decoding Error Probability

We begin this section by analyzing the natural decoding algorithm that we saw in the last lecture:

- **Step 1:** Let $y_i' = D_{\mathrm{in}}(y_i)$ for $1 \le i \le N$.
- **Step 2:** Run $D_{\mathrm{out}}$ on $y' = (y_1', \ldots, y_N')$.

By the properties of $D_{\mathrm{in}}$, for any fixed $i$, there is an error at $y_i'$ with probability at most $\gamma/2$. These error events are independent, since the errors introduced by $\mathrm{BSC}_p$ itself are independent by definition. By linearity of expectation, the expected number of errors in $y'$ is at most $\gamma N/2$. Combining independence with this bound on the expectation, the Chernoff bound implies that the probability that the total number of errors exceeds $\gamma N$ is at most $e^{-\gamma N/6}$.
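The two-step decoder above can be sketched in code. This is a minimal toy sketch, not the construction from the lecture: the inner code $C_{\mathrm{in}}$ is stood in for by a 3-fold repetition code with majority-vote decoding, and the outer decoder $D_{\mathrm{out}}$ is passed in as a function (here a trivial identity, since the actual outer code is specified later in the lecture).

```python
def encode_inner(bit, n=3):
    """Toy stand-in for C_in: n-fold repetition of a single bit."""
    return [bit] * n

def decode_inner(block):
    """Toy stand-in for D_in: majority vote over the received block."""
    return 1 if sum(block) > len(block) // 2 else 0

def decode_concatenated(y, n, decode_outer):
    """The natural decoder for a concatenated code.

    Step 1: y'_i = D_in(y_i) for each of the N inner blocks of length n.
    Step 2: run D_out on y' = (y'_1, ..., y'_N).
    """
    N = len(y) // n
    y_prime = [decode_inner(y[i * n:(i + 1) * n]) for i in range(N)]
    return decode_outer(y_prime)
```

With the repetition stand-in, a single bit flip inside any one inner block is corrected by the majority vote before the outer decoder ever sees it.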
Since the decoder $D_{\mathrm{out}}$ fails only when there are more than $\gamma N$ errors, this is also an upper bound on the decoding error probability of $C^*$. Expressed in asymptotic terms, the error probability is $2^{-\Omega(\gamma \tilde{N}/n)}$, where $\tilde{N} = nN$ is the block length of $C^*$.
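As a quick sanity check on the Chernoff calculation, one can simulate $N$ independent inner-decoding failures, each occurring with probability $\gamma/2$, and compare the empirical tail probability against the bound $e^{-\gamma N/6}$. The parameter values below are illustrative assumptions, not values from the lecture:

```python
import math
import random

def empirical_tail(N, gamma, trials=100_000, seed=0):
    """Estimate P[# inner-decoding errors > gamma * N], where each of the
    N inner blocks fails independently with probability gamma / 2."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        errors = sum(rng.random() < gamma / 2 for _ in range(N))
        if errors > gamma * N:
            bad += 1
    return bad / trials

# Illustrative (assumed) parameters: N = 200 outer symbols, gamma = 0.2.
N, gamma = 200, 0.2
chernoff = math.exp(-gamma * N / 6)  # e^{-gamma N / 6}, roughly 1.3e-3 here
print(empirical_tail(N, gamma), "<=", chernoff)
```

The empirical tail sits well below the Chernoff bound, as expected: the bound is loose, but it is the exponential decay in $N$ that matters for the asymptotic claim.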

