…for all states y such that y · s = 0. This means that Fourier sampling (1/√2)(|x0⟩ + |x0 ⊕ s⟩) results in a uniform superposition over the y such that y · s = 0.
Recall that

    H^{⊗n} |x⟩ = (1/√N) Σ_y α_y |y⟩,   where α_y = (−1)^{x·y}

and N = 2^n.
So

    H^{⊗n} (1/√2)(|x0⟩ + |x0 ⊕ s⟩) = (1/√N) Σ_y α_y |y⟩

where

    α_y = (1/√2) ((−1)^{x0·y} + (−1)^{(x0⊕s)·y})
        = (1/√2) (−1)^{x0·y} (1 + (−1)^{s·y}),

since (x0 ⊕ s)·y = x0·y + s·y (mod 2). Now it is easy to see that if s·y = 0, then α_y = ±√2, but if s·y = 1, then α_y = 0. Therefore when we measure the first register, we will measure a y such that y · s = 0; each such y occurs with probability (√2/√N)² = 1/2^{n−1}, i.e. uniformly over the solutions.
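The interference calculation above is easy to verify numerically. Here is a minimal sketch; the particular values of n, x0, and s are arbitrary choices of mine, not from the notes:

```python
# Amplitudes alpha_y of H tensor n applied to (1/sqrt(2))(|x0> + |x0 XOR s>),
# up to the overall 1/sqrt(N) factor: nonzero only when y . s = 0 (mod 2).
from math import sqrt

n = 3
x0, s = 0b011, 0b010             # arbitrary example bit strings

def dot(a, b):
    """Inner product of two bit strings, mod 2."""
    return bin(a & b).count("1") % 2

N = 2 ** n
for y in range(N):
    alpha = ((-1) ** dot(x0, y) + (-1) ** dot(x0 ^ s, y)) / sqrt(2)
    if dot(s, y) == 0:
        assert abs(alpha * alpha - 2) < 1e-9   # alpha_y = +/- sqrt(2)
    else:
        assert alpha == 0                      # destructive interference
```

The second branch is exactly the destructive interference that kills every y with s · y = 1.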
Step 3: Repeat until there are enough such y’s that we can classically solve for s.
The solutions of y · s = y1 s1 + y2 s2 + · · · + yn sn = 0 (mod 2) form an (n − 1)-dimensional subspace of {0, 1}^n. One solution is the trivial y = 0, and the subspace is spanned by n − 1 nontrivial, linearly independent solutions. Moreover, if y1 and y2 are solutions, then (y1 + y2) · s = y1 · s + y2 · s = 0, so linear combinations of solutions are also solutions. This gives us a total of 2^{n−1} y’s such that y · s = 0. To solve for s, we need to find n − 1 nontrivial, linearly independent y such that y · s = 0.
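This counting is easy to check by brute force for small n; a sketch, where the particular n and s are arbitrary examples of mine:

```python
# Enumerate all y with y . s = 0 (mod 2) and confirm they form a
# subspace of size 2^(n-1) that is closed under addition (XOR).
n, s = 4, 0b0110        # arbitrary nonzero example

def dot(a, b):
    """Inner product of two bit strings, mod 2."""
    return bin(a & b).count("1") % 2

solutions = [y for y in range(2 ** n) if dot(y, s) == 0]
assert len(solutions) == 2 ** (n - 1)    # 2^(n-1) solutions, including y = 0
for y1 in solutions:                     # linear combinations stay solutions
    for y2 in solutions:
        assert dot(y1 ^ y2, s) == 0
```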
For example, if s = 010, then y1 = 001 and y2 = 100 are linearly independent solutions to y · s = 0 (and y0 = 000 is the trivial solution). The linear combination y1 + y2 = 101 is also a solution. We need only find two of {y1, y2, y1 + y2} in order to classically solve for s.
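Given enough independent y’s, solving for s is ordinary linear algebra over GF(2). A brute-force sketch using the example above (the helper name recover_s is mine; a real implementation would use Gaussian elimination rather than trying all 2^n candidates):

```python
# Recover s from n-1 linearly independent y's satisfying y . s = 0 (mod 2),
# by checking every nonzero candidate s against all collected equations.

def dot(a, b):
    """Inner product of two bit strings, mod 2."""
    return bin(a & b).count("1") % 2

def recover_s(ys, n):
    """Return the unique nonzero s orthogonal (mod 2) to every y in ys."""
    candidates = [s for s in range(1, 2 ** n)
                  if all(dot(y, s) == 0 for y in ys)]
    assert len(candidates) == 1, "need n-1 independent equations"
    return candidates[0]

# The example from the text: s = 010, found from y1 = 001 and y2 = 100.
print(format(recover_s([0b001, 0b100], 3), "03b"))  # -> 010
```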
How long should we expect this to take? The probability that we fail on the first run is the probability that we find y = 0, which is one value out of 2^{n−1}. So P1 = 1/2^{n−1}, where P1 denotes the probability of failing on the first run. Let’s call the first nontrivial solution y1.
We fail when looking for y2 if we find 0 or y1, so P2 = 2/2^{n−1} = 1/2^{n−2}. When looking for y3, we fail if we find any of {0, y1, y2, y1 + y2}, so P3 = 4/2^{n−1} = 1/2^{n−3}. Carrying on in this way, the probability of failing to find yi is Pi = 2^{i−1}/2^{n−1} = 1/2^{n−i}.
The chance that we fail at some point up to and including the search for yi is, by the union bound, at most 1/2^{n−1} + 1/2^{n−2} + · · · + 1/2^{n−i}. If we push this bound all the way to i = n − 1, we only learn that we fail with probability less than 1 (compute the geometric sum), which is not strong enough. Instead, notice that our probability of failure up to and including i = n − 2 is less than 1/2, so our probability of success through the (n − 2)nd run is greater than 1/2. We find the final linearly independent term on the last run with probability 1/2 (if you don’t believe this, notice that half of the solutions are linear combinations that include y_{n−1}). Finally, our total probability of success is P(success) > 1/2 · 1/2 = 1/4. Therefore, we expect our process to take O(n) steps (our bound says 4n runs of the algorithm should be enough for success).
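The counting argument above can be sanity-checked with a purely classical simulation: stand in for the quantum step by sampling y uniformly from the solution subspace, and count how many runs it takes to collect n − 1 independent y’s. Taking s = 00…01 (so the solutions are exactly the bit strings ending in 0) is my simplifying choice, not from the notes:

```python
# Simulate "repeat until we have n-1 linearly independent y's", where each
# run draws a uniformly random y with y . s = 0 (here s = 0...01, so the
# solutions are exactly the even bit strings).
import random

def add_to_basis(basis, y):
    """Reduce y against a GF(2) basis in echelon form (pivot bit -> row).
    Adds y and returns True if it was linearly independent."""
    while y:
        p = y.bit_length() - 1
        if p not in basis:
            basis[p] = y
            return True
        y ^= basis[p]          # eliminate the pivot bit
    return False

def runs_needed(n, rng):
    basis, runs = {}, 0
    while len(basis) < n - 1:
        runs += 1
        y = rng.randrange(2 ** (n - 1)) << 1   # uniform y with y . s = 0
        add_to_basis(basis, y)
    return runs

n = 10
rng = random.Random(0)
avg = sum(runs_needed(n, rng) for _ in range(500)) / 500
print(avg)   # typically just above n, comfortably below the 4n bound
```

In practice the average sits only a constant above n, so the 4n bound from the analysis is quite loose.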
Simon’s algorithm...
This document was uploaded on 09/22/2013 (Fall ’13).