CS 151 Complexity Theory                                    Spring 2011

Problem Set 4

Out: April 21                                               Due: April 28

Reminder: you are encouraged to work in groups of two or three; however, you must turn in your own writeup and note with whom you worked. You may consult the course notes and the text (Papadimitriou). Please attempt all problems. To facilitate grading, please turn in each problem on a separate sheet of paper and put your name on each sheet. Do not staple the separate sheets. Problem 4 is optional, for extra credit.

1. Define ZPP' to be the class of all languages decided by a probabilistic Turing Machine running in expected polynomial time. That is, for every language L in ZPP' there is a probabilistic Turing Machine M (with two read-only tapes: the first tape containing the input, and the second tape containing a random bit in every tape square) with the following behavior: on input x in L, M always accepts; on input x not in L, M always rejects; and for every input x,

    E[# steps before M halts] = |x|^O(1).

Show that ZPP' = ZPP.

2. List-decoding of the binary Hadamard code. Throughout this problem, F_2 is the field with 2 elements (addition and multiplication are performed modulo 2). Given a k-bit message m, the associated Hadamard codeword C(m) is obtained by first forming the linear multivariate polynomial

    p_m(x_0, x_1, ..., x_{k-1}) = sum_{i=0}^{k-1} m_i x_i,

and then evaluating that polynomial at all vectors in the space F_2^k:

    C(m) = (p_m(w))_{w in F_2^k}.

Thus the codeword has n = 2^k bits, and the w-th bit is the inner product mod 2 of the k-bit vectors m and w. The bits of a codeword C = C(m) are naturally indexed by F_2^k; we write C_w (with w in F_2^k) to mean the w-th coordinate, which is just p_m(w). Since the distance of the Hadamard code is (1/2)n (by Schwartz-Zippel), unique decoding is only possible from up to (1/4)n errors.
In this problem you will show that efficient list decoding is possible from a received word R that has suffered up to (1/2 - epsilon)n errors....
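The encoding defined in Problem 2 is small enough to sketch directly. The snippet below (a minimal illustration; the helper name hadamard_encode is not from the problem set) enumerates every vector w in F_2^k, computes the inner product <m, w> mod 2 as the w-th codeword bit, and checks on a tiny example that the codeword length is n = 2^k and that two distinct codewords differ in exactly n/2 positions.

```python
def hadamard_encode(m):
    """Encode a k-bit message (list of 0/1 ints) into its 2^k-bit Hadamard codeword."""
    k = len(m)
    codeword = []
    for w in range(2 ** k):  # enumerate all vectors w in F_2^k
        w_bits = [(w >> i) & 1 for i in range(k)]  # coordinates of w
        # The w-th bit is p_m(w) = sum_i m_i * w_i mod 2, i.e. <m, w> mod 2.
        codeword.append(sum(mi * wi for mi, wi in zip(m, w_bits)) % 2)
    return codeword

# By linearity, C(m1) and C(m2) differ exactly where C(m1 + m2) is 1, and any
# nonzero linear form over F_2^k equals 1 on exactly half the space.
c1 = hadamard_encode([1, 0, 1])
c2 = hadamard_encode([0, 1, 1])
dist = sum(a != b for a, b in zip(c1, c2))
print(len(c1), dist)  # prints: 8 4  (n = 2^3, distance n/2)
```

This is only the encoder; the list-decoding task posed in the problem is, of course, the interesting direction.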