CSE 599d Quantum Computing
Simon's Algorithm
Dave Bacon
Department of Computer Science & Engineering, University of Washington

Bernstein and Vazirani showed the first superpolynomial separation (relative to an oracle) between quantum and bounded-error classical query complexity. Following this result, Simon was able to come up with a problem for which there is an exponential separation. In this lecture we will study this problem.

I. SIMON'S PROBLEM

In Simon's problem we are given a function from n-bit strings to n-bit strings, f : {0,1}^n → {0,1}^n, which is guaranteed to satisfy f(x) = f(y) iff x = y ⊕ s for some fixed string s, where ⊕ denotes bitwise XOR. Simon's problem is then, by querying f(x), to determine whether the function belongs to case (i) s = 0^n or to case (ii) s ≠ 0^n. Actually, Simon's problem is sometimes stated as the problem of being given a function which satisfies the promise and then identifying s. We will continue the tradition of blurring the distinction between these two problems and call both of them Simon's problem.

A. Yao's Principle and a Classical Query Lower Bound for Simon's Problem

Here we will introduce a useful method, known as Yao's principle, for proving lower bounds on algorithms which utilize randomness. Yao's principle makes a connection between randomized algorithms which fail with a certain probability and deterministic algorithms run over distributions on their inputs. Suppose we have an algorithm which computes some function F. Let R_ε(F) denote the minimal complexity over all randomized algorithms which successfully compute F on all inputs, with probability of failing at most ε on any of these inputs. Let D_{μ,ε}(F) denote the minimal complexity of computing F deterministically on at least a 1 − ε fraction of the inputs, weighted by the probability distribution μ on these inputs. Then Yao's principle states that

  R_ε(F) = max_μ D_{μ,ε}(F).  (1)
In other words, if you look at deterministic algorithms, take their performance averaged over their inputs, and choose the worst possible distribution (for the complexity) over these inputs, then this equals the complexity of the best randomized algorithm for computing the function which fails with probability at most ε on every input. Why is this important? Mostly because it means we can lower-bound randomized algorithms with error ε by placing bounds on deterministic algorithms over a particular distribution μ:

  R_ε(F) ≥ D_{μ,ε}(F).  (2)

We won't prove Yao's principle, but some of you may be interested to know that its proof comes from von Neumann's minimax theorem. I point this out because this will make computer scientists happy. And it will also make physicists and mathematicians happy: von Neumann is neutral ground in the turf wars between the scientific disciplines.
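To make the setup concrete before proving the lower bound, the following sketch builds a function satisfying Simon's promise and recovers s classically by searching for a collision, which takes on the order of 2^(n/2) queries by the birthday bound. This is an illustrative sketch, not code from the lecture; the helper names make_simon_oracle and classical_collision_search are hypothetical, and bit strings are encoded as integers 0 … 2^n − 1.

```python
import random

def make_simon_oracle(n, s):
    """Build a random f: {0,1}^n -> {0,1}^n with f(x) = f(y) iff x = y XOR s.
    Bit strings are encoded as integers; s = 0 yields a one-to-one function."""
    values = list(range(2 ** n))
    random.shuffle(values)  # pool of distinct output values
    f = {}
    for x in range(2 ** n):
        if x not in f:
            v = values.pop()
            f[x] = v
            f[x ^ s] = v  # the partner x XOR s shares the same output
    return f

def classical_collision_search(f, n):
    """Query f at random inputs until two collide; then s = x XOR y.
    If every input is queried without a collision, f is one-to-one, so s = 0^n."""
    seen = {}
    inputs = list(range(2 ** n))
    random.shuffle(inputs)
    for x in inputs:
        v = f[x]
        if v in seen:
            return seen[v] ^ x  # collision: f(x) = f(y) with x != y, so s = x XOR y
        seen[v] = x
    return 0  # no collision anywhere: case (i), s = 0^n

n, s = 5, 0b10110
f = make_simon_oracle(n, s)
print(classical_collision_search(f, n))  # recovers s
```

Note that the collision search cannot distinguish case (i) from case (ii) until it has either found a collision or queried a large fraction of the inputs, which is the intuition the Yao-principle argument makes precise.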
This note was uploaded on 11/06/2011 for the course CSE 599 taught by Professor Staff during the Fall '08 term at University of Washington.