6.841 Advanced Complexity Theory                                April 7, 2009

Lecture 16

Lecturer: Madhu Sudan                                 Scribe: Paul Christiano

1  Worst-Case vs. Average-Case Complexity

So far we have dealt mostly with the worst-case complexity of problems. We might also wonder about the average-case complexity of a problem, namely, how difficult the problem is to solve on an instance chosen randomly from some distribution. It may be that for a certain problem hard instances exist but are extremely rare. Many reductions use very specifically constructed instances, so their conclusions may not apply to average instances from a class.

For example, consider the problem of sorting a list of integers. We can define the worst-case complexity

    T(n) = \min_A \max_{|x| = n} \{ \text{running time of } A(x) \},

where the minimum is taken over all algorithms A such that A(x) always outputs the list x sorted in increasing order. If we have a distribution D over the inputs, we can define the average-case complexity the same way, except that the minimum is taken over all algorithms A such that A(x) outputs a sorted list except on a set of inputs of small probability (one possible formalization appears at the end of these notes). We can define the worst-case and average-case complexity of any other language or function in the same way.

We could also define several slightly different notions of average-case complexity. For example, we could consider only algorithms A that are correct on all inputs, but measure the expected running time instead of the running time on the worst instance. Notice that this is a strictly weaker notion: given an algorithm running in expected polynomial time, we can convert it into one that always runs in polynomial time but is occasionally wrong, by timing the algorithm and terminating it after it has run for a polynomial number of steps (for an appropriately chosen polynomial; a short calculation below makes this precise). We will always work with the stronger definition.

Worst-case complexity is a very useful measure for constructing reductions: to make a statement about worst-case complexity, we are allowed to reduce to instances of an arbitrarily rare special form. For average-case complexity, reductions become much more difficult to describe. Another disadvantage of average-case complexity is that it is defined only once we fix a probability distribution over the inputs, and there is not always a natural way to choose one. For example, to study the average-case complexity of 3SAT we could use the distribution that chooses one clause per variable uniformly at random. But at this density a random formula is satisfiable with overwhelming probability (one clause per variable is far below the satisfiability threshold), so the trivial algorithm that always outputs "satisfiable" would almost always be correct, and the average-case complexity of 3SAT would be constant.
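To fix ideas, here is one way the average-case definition above might be formalized. This particular formalization, including the explicit error parameter \varepsilon, is our addition rather than something stated in the lecture. For a distribution ensemble D = \{D_n\} and an error bound \varepsilon(n):

    T_{D,\varepsilon}(n) = \min_{A \,:\, \Pr_{x \sim D_n}[A(x) \text{ incorrect}] \le \varepsilon(n)} \; \max_{|x| = n} \{ \text{running time of } A(x) \}.

A typical choice is \varepsilon(n) = 1/\mathrm{poly}(n) or \varepsilon(n) = o(1); as \varepsilon shrinks to 0 the definition interpolates back toward the worst-case one.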
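The conversion from expected polynomial time to worst-case polynomial time with occasional errors is a standard Markov-inequality argument; the following sketch is our elaboration of the parenthetical above. Suppose A is correct on all inputs and its running time T_A satisfies \mathbb{E}_{x \sim D_n}[T_A(x)] \le p(n) for some polynomial p. By Markov's inequality,

    \Pr_{x \sim D_n}[\, T_A(x) > k \cdot p(n) \,] \le 1/k.

So the algorithm A' that simulates A for at most k \cdot p(n) steps, outputting an arbitrary answer if A has not yet halted, runs in worst-case polynomial time (for, say, k = n) and is incorrect only on a set of probability at most 1/k under D_n.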
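The claim that the one-clause-per-variable distribution makes 3SAT trivial on average can be checked empirically at small sizes. The sketch below is our illustration, not part of the lecture, and the function names are ours; it samples random formulas with n clauses over n variables and brute-forces satisfiability.

import itertools
import random

def random_3sat(n, m, rng):
    # Each clause picks 3 distinct variables from 1..n, each negated
    # with probability 1/2. Literal +v means variable v, -v its negation.
    clauses = []
    for _ in range(m):
        vs = rng.sample(range(1, n + 1), 3)
        clauses.append(tuple(v if rng.random() < 0.5 else -v for v in vs))
    return clauses

def satisfiable(n, clauses):
    # Brute force over all 2^n assignments; fine for small n.
    for bits in itertools.product([False, True], repeat=n):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

rng = random.Random(0)
n, trials = 12, 100
hits = sum(satisfiable(n, random_3sat(n, n, rng)) for _ in range(trials))
print(f"{hits}/{trials} formulas with {n} clauses on {n} variables are satisfiable")

At this density essentially every sampled formula comes out satisfiable; at much higher clause densities, above the satisfiability threshold, the trivial answer flips to "unsatisfiable", but either way a constant-time algorithm is almost always correct.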