6.841 Advanced Complexity Theory    April 7, 2009

Lecture 16

Lecturer: Madhu Sudan    Scribe: Paul Christiano

1 Worst-Case vs. Average-Case Complexity

So far we have dealt mostly with the worst-case complexity of problems. We might also wonder about the average-case complexity of a problem, namely, how difficult the problem is to solve on an instance chosen randomly from some distribution. It may be that for a certain problem hard instances exist but are extremely rare. Many reductions use very specifically constructed instances, so their conclusions may not apply to "average" instances of a problem.

For example, consider the problem of sorting a list of integers. We can define the worst-case complexity as

    T(n) = min_A max_{|x| = n} { running time of A(x) },

where the minimum is taken over all algorithms A such that A(x) always outputs the list x sorted in increasing order. If we have a distribution D over the inputs, we can define the average-case complexity the same way, except that the minimum is taken over all algorithms A such that A(x) outputs a sorted list except on a set of inputs of small probability under D. We can define the worst-case and average-case complexity of any other language or function in the same way.

We could also define several slightly different notions of average-case complexity. For example, we could consider only algorithms A that are correct on all inputs, but measure the expected running time instead of the running time on the worst instance. Notice that this is a strictly weaker notion: if we have an algorithm running in expected polynomial time, we can convert it into one that always runs in polynomial time but is occasionally wrong, by timing the algorithm and terminating it after it has run for a polynomial number of steps (for an appropriately chosen polynomial); by Markov's inequality, the truncated algorithm errs only with small probability. We will always work with the stronger definition.
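The truncation trick just described can be sketched in code. The toy example below is my own construction, not from the lecture: `random_search` is a randomized search whose expected number of probes is polynomial but whose worst case is unbounded, and `truncated_run` cuts it off after a fixed step budget, yielding an algorithm that always halts quickly but occasionally returns a wrong answer.

```python
import random

def random_search(lst, target, rng):
    """Toy Las Vegas algorithm: probe random indices until the target is
    found. The expected number of probes is len(lst) divided by the number
    of occurrences of target, but any single run can take arbitrarily
    long. It yields once per probe so a caller can count its steps."""
    while True:
        i = rng.randrange(len(lst))
        if lst[i] == target:
            yield ("done", i)   # found the target at index i
            return
        yield ("step", None)    # one unit of work, not done yet

def truncated_run(algorithm, budget, default):
    """Run a step-yielding algorithm for at most `budget` steps. If it
    finishes in time, return its answer; otherwise cut it off and return
    a (possibly wrong) default. This trades the expected-time guarantee
    for a worst-case one, at the cost of occasional errors."""
    for steps, (tag, value) in enumerate(algorithm):
        if tag == "done":
            return value
        if steps + 1 >= budget:
            return default  # timed out: the answer may be wrong
    return default
```

If the expected number of steps is t(n), then by Markov's inequality a budget of k * t(n) makes the truncated algorithm wrong with probability at most 1/k.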
Worst-case complexity is a very useful measure for constructing reductions: we are allowed to reduce to instances of an arbitrarily rare special form in order to make a statement about worst-case complexity. If we are dealing with average-case complexity, reductions become much more difficult to describe. Another disadvantage of average-case complexity is that it is only defined once we fix a probability distribution over the inputs, and there is not always a natural way to choose one. For example, to study the average-case complexity of 3SAT we could use the distribution that chooses one clause per variable uniformly at random. But at this clause density a random formula is almost always satisfiable, so simply outputting "satisfiable" would almost always be correct, and the average-case complexity of 3SAT under this distribution would be constant.
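One clause per variable is a clause density of 1, well below the empirically observed 3SAT satisfiability threshold of roughly 4.27 clauses per variable, so random instances at this density are satisfiable with high probability. The sketch below (my own illustration, with hypothetical helper names `random_3cnf` and `satisfiable`) checks this by brute force for small n.

```python
import itertools
import random

def random_3cnf(n_vars, n_clauses, rng):
    """Sample a random 3-CNF formula: each clause picks 3 distinct
    variables and negates each one independently with probability 1/2.
    A literal is a pair (variable index, negated?)."""
    formula = []
    for _ in range(n_clauses):
        picked = rng.sample(range(n_vars), 3)
        formula.append([(v, rng.random() < 0.5) for v in picked])
    return formula

def satisfiable(formula, n_vars):
    """Brute-force SAT check over all 2^n assignments; fine for small n."""
    for assignment in itertools.product([False, True], repeat=n_vars):
        if all(any(assignment[v] != negated for v, negated in clause)
               for clause in formula):
            return True
    return False

# "One clause per variable": n variables and n clauses.
rng = random.Random(0)
n, trials = 10, 200
sat_fraction = sum(
    satisfiable(random_3cnf(n, n, rng), n) for _ in range(trials)
) / trials
```

Running this, `sat_fraction` comes out close to 1, matching the observation that a trivial "satisfiable" answer is almost always correct under this distribution.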