Compact course notes
Combinatorics and Optimization 331, Winter 2011
Coding Theory
Professor: D. Jao
Transcribed by: J. Lazovskis
University of Waterloo
April 6, 2011

Contents

1 Introduction
  1.1 Fundamentals
  1.2 Channels
  1.3 Decoding
  1.4 Error detection & correction
2 Finite fields
  2.1 Basics
  2.2 Polynomial rings
3 Linear codes
  3.1 Fundamentals
  3.2 Dual codes and parity-check matrices
4 Cyclic codes
  4.1 Fundamentals
  4.2 Encoding with cyclic codes
  4.3 Burst errors
  4.4 BCH codes
1 Introduction

It is always assumed that the source and the receiver are separated by space and/or time.

1.1 Fundamentals

Definition 1.1.1. An alphabet is a finite set of symbols.

Definition 1.1.2. A word is a finite sequence of symbols from a given alphabet.

Definition 1.1.3. The length of a word is the number of symbols in the word.

Definition 1.1.4. A code is a subset of the set of words in a given alphabet.

Definition 1.1.5. A code word is a word in a particular code.

Definition 1.1.6. A block code is a code in which every code word has the same length.

Definition 1.1.7. The length of a block code is the length of any code word in the block code.

Definition 1.1.8. An [n, M]-code is a block code C of length n with |C| = M.

1.2 Channels

Definition 1.2.1. A channel is a medium over which a symbol is sent.

Definition 1.2.2. A symmetric channel is a channel satisfying the following properties:
  1. Only symbols from a fixed alphabet A are received.
  2. No symbols are deleted, inserted, or transposed.
  3. Each symbol is independently received in error with probability p.

Definition 1.2.3. Given an alphabet A = {a_1, a_2, ..., a_q}, let X_i be the i-th symbol sent, and let Y_i be the i-th symbol received. Then a q-symmetric channel with symbol error probability p has the property that for all 1 ≤ j, k ≤ q,

  P(Y_i = a_k | X_i = a_j) = 1 − p        if j = k
                             p / (q − 1)  if j ≠ k

Definition 1.2.4. A binary symmetric channel is a symmetric channel using only the binary alphabet.

Definition 1.2.5. The information rate of an [n, M]-code defined over an alphabet A of size q is

  r = log_q(M) / n

Definition 1.2.6. Let A be an alphabet with words x, y ∈ A^n. Then the Hamming distance of x and y is defined to be the number of positions in which x and y differ in symbols. It is denoted by d(x, y).

Theorem 1.2.7. [Properties of Hamming distance]
  1. d(x, y) ≥ 0, and d(x, y) = 0 ⟺ x = y
  2. d(x, y) = d(y, x)
  3. d(x, y) + d(y, z) ≥ d(x, z)

Remark 1.2.8. The main goals of coding theory are:
  1. High error-correction capability
  2. High information rate
  3. Efficient encoding and decoding algorithms
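Both the Hamming distance (Definition 1.2.6) and the information rate (Definition 1.2.5) are easy to compute directly from their definitions. A minimal Python sketch (the function names are illustrative, not from the notes), using the binary repetition code {000, 111}, a [3, 2]-code, as an example:

```python
import math

def hamming_distance(x, y):
    """Number of positions in which words x and y differ (Definition 1.2.6)."""
    assert len(x) == len(y), "Hamming distance requires words of equal length"
    return sum(1 for a, b in zip(x, y) if a != b)

def information_rate(n, M, q):
    """Information rate r = log_q(M) / n of an [n, M]-code over a q-symbol alphabet."""
    return math.log(M, q) / n

print(hamming_distance("000", "111"))   # 3
print(information_rate(3, 2, 2))        # 1/3: one information bit per three sent
```

Note the rate 1/3 reflects the trade-off in Remark 1.2.8: repetition buys error-correction capability at the cost of information rate.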
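Definition 1.2.3 specifies the channel's behavior completely, so it can be simulated directly: each symbol is corrupted independently with probability p, and a corrupted symbol becomes each of the other q − 1 symbols with equal probability p/(q − 1). A hypothetical Python sketch (the function name and interface are assumptions for illustration):

```python
import random

def q_symmetric_channel(word, alphabet, p, rng=random):
    """Simulate a q-symmetric channel with symbol error probability p
    (Definition 1.2.3): each symbol independently survives with
    probability 1 - p, and otherwise is replaced by one of the other
    q - 1 symbols, chosen uniformly at random."""
    out = []
    for s in word:
        if rng.random() < p:
            # Error: pick uniformly among the q - 1 other symbols.
            out.append(rng.choice([a for a in alphabet if a != s]))
        else:
            out.append(s)
    return "".join(out)

received = q_symmetric_channel("0110100", "01", p=0.1)
```

With the binary alphabet (q = 2) this is exactly the binary symmetric channel of Definition 1.2.4: an error can only flip the symbol.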
1.3 Decoding

Algorithm 1.3.1. [Incomplete maximum likelihood decoding (IMLD)]
Suppose r ∈ A^n is received. If r ∈ C, accept r. If r ∉ C, then:
  · If there exists a unique c₀ ∈ C such that d(r, c₀) < d(r, c) for all c ∈ C, c ≠ c₀, return c₀.
  · Else reject r.

Algorithm 1.3.2. [Complete maximum likelihood decoding (CMLD)]
Identical to IMLD, except in the last step choose c₀ arbitrarily from
  {c₀ ∈ C | d(r, c₀) ≤ d(r, c) for all c ∈ C, c ≠ c₀}

Theorem 1.3.3. For r ∈ A^n, IMLD outputs the code word c ∈ C that maximizes P(r | c) := P(r is received | c is sent).
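Algorithm 1.3.1 can be sketched directly from its statement. A minimal Python illustration (function names are hypothetical): accept r if it is already a code word; otherwise return the unique nearest code word, rejecting (here, returning None) when the minimum distance is attained by more than one code word. CMLD (Algorithm 1.3.2) would instead break such a tie by an arbitrary choice.

```python
def imld(r, code):
    """Incomplete maximum likelihood decoding (Algorithm 1.3.1).

    Accept r if r is in the code; otherwise return the unique code word
    nearest to r in Hamming distance, or None (reject) on a tie."""
    def d(x, y):  # Hamming distance, Definition 1.2.6
        return sum(1 for a, b in zip(x, y) if a != b)

    if r in code:
        return r
    distances = sorted((d(r, c), c) for c in code)
    if len(distances) > 1 and distances[0][0] == distances[1][0]:
        return None  # no unique nearest code word: reject
    return distances[0][1]

C = {"00000", "11111"}
print(imld("00100", C))          # "00000": unique nearest code word
print(imld("001", {"000", "011"}))  # None: tie at distance 1, so reject
```

By Theorem 1.3.3, returning the nearest code word is the same as returning the code word c maximizing P(r | c), since over a symmetric channel fewer symbol errors are more likely than more.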