6.045: Automata, Computability, and Complexity (GITCS)
Class 17, Nancy Lynch

Today: Probabilistic Turing Machines and Probabilistic Time Complexity Classes

We now add a new capability to standard TMs: random choice of moves. This gives rise to new complexity classes: BPP and RP.

Topics:
Probabilistic polynomial-time TMs, BPP and RP
Amplification lemmas
Example 1: Primality testing
Example 2: Branching-program equivalence
Relationships between the classes
Reading: Sipser, Section 10.2

Probabilistic Polynomial-Time Turing Machines, BPP and RP

A probabilistic polynomial-time TM is a new kind of NTM in which each nondeterministic step is a coin flip: the step has exactly 2 next moves, to each of which we assign probability 1/2.

Computation on input w:
To each maximal branch b of the computation tree, we assign a probability Pr(b) = (1/2)^k, where k is the number of coin flips on the branch. The machine has accept and reject states, as for NTMs, so we can talk about the probability of acceptance or rejection on input w:
Probability of acceptance = sum of Pr(b) over all accepting branches b
Probability of rejection = sum of Pr(b) over all rejecting branches b

Example: a tree with branch probabilities 1/4, 1/4, 1/8, 1/8, 1/8, 1/16, 1/16, with accept/reject labels (Acc, Acc, Acc, Acc, Rej, Acc, Rej) added at the leaves:
Probability of acceptance = 1/16 + 1/8 + 1/4 + 1/8 + 1/4 = 13/16
Probability of rejection = 1/16 + 1/8 = 3/16

We consider only TMs that halt (either accept or reject) on every branch, i.e., deciders. So the two probabilities total 1.

Time complexity: worst case over all branches, as usual.

Q: What good are probabilistic TMs?
Random choices can help solve some problems efficiently. They are good for getting estimates that are arbitrarily accurate, depending on the number of random choices.

Example: Monte Carlo estimation of areas, e.g., the integral of a function f. Enclose the graph of f in a rectangle. Repeatedly choose a random point (x, y) in the rectangle, and compare y with f(x). The fraction of trials in which y ≤ f(x), scaled by the area of the rectangle, can be used to estimate the integral of f.
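The Monte Carlo idea above can be sketched in a few lines. This is an illustrative sketch, not part of the lecture; the function name and parameters are invented for the example:

```python
import random

def monte_carlo_integral(f, a, b, y_max, trials=100_000):
    """Estimate the integral of a nonnegative function f on [a, b].

    Encloses the graph of f in the rectangle [a, b] x [0, y_max],
    samples random points in the rectangle, and scales the fraction
    of points falling under the curve by the rectangle's area.
    """
    hits = 0
    for _ in range(trials):
        x = random.uniform(a, b)
        y = random.uniform(0, y_max)
        if y <= f(x):   # point lies on or under the graph of f
            hits += 1
    return (hits / trials) * (b - a) * y_max

# Estimate the integral of f(x) = x^2 on [0, 1]; the exact value is 1/3.
estimate = monte_carlo_integral(lambda x: x * x, 0.0, 1.0, 1.0)
```

As the slide says, accuracy improves with the number of random choices: the standard error of the hit fraction shrinks like 1/sqrt(trials).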
Probabilistic Poly-Time TMs

Random choices can help solve some problems efficiently. We'll see two languages that have efficient probabilistic algorithms.

Q: What does it mean to "estimate" a language? Each w is either in the language or not; what does it mean to approximate a binary decision?
Possible answer: For most inputs w, we always get the right answer, on all branches of the probabilistic computation tree.
Or: For most w, we get the right answer with high probability.
Better answer: For every input w, we get the right answer with high probability.
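The "right answer with high probability for every input" criterion is what the amplification lemmas (covered later in this lecture) exploit: running a machine whose error probability is bounded below 1/2 many times and taking a majority vote drives the error down exponentially in the number of runs. A minimal simulation sketch, using an invented toy "language" (strings of even length) and an invented noisy decider standing in for a probabilistic TM:

```python
import random
from collections import Counter

def noisy_decider(w, error=0.3):
    """Stand-in for a probabilistic TM with two-sided error:
    returns the correct answer for input w with probability 1 - error.
    The toy language here is {w : |w| is even}."""
    correct = (len(w) % 2 == 0)
    return correct if random.random() > error else not correct

def amplified(w, runs=101):
    """Run the noisy decider an odd number of times (so there are
    no ties) and output the majority answer."""
    votes = Counter(noisy_decider(w) for _ in range(runs))
    return votes[True] > votes[False]
```

With per-run error 0.3 and 101 runs, a Chernoff-style bound puts the majority's error probability well below 1/1000, even though each individual run is wrong almost a third of the time.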
This note was uploaded on 12/26/2011 for the course ENGINEERIN 18.400J, taught by Prof. Scott Aaronson during the Spring '11 term at MIT.