NP Hardness & CSPs
CPS 170
Ron Parr

Digression: NP-Hardness
• NP hardness is not an AI topic
• You will not be tested on it, but
• It's important for all computer scientists
• Understanding it will deepen your understanding of AI (and other CS) topics
• Eat your vegetables; they're good for you

NP-Hardness
• Many problems in AI are NP-hard (or worse)
• What does this mean?
• These are some of the hardest problems in CS
• Identifying a problem as NP-hard means:
  – You probably shouldn't waste time trying to find a polynomial time solution
  – If you find a polynomial time solution, either
    • You have a bug
    • Find a place on your shelf for your Turing award
• NP hardness is a major triumph (and failure) for computer science theory

What is the class NP?
• A class of decision problems (Yes/No)
• Solutions can be verified in polynomial time
• Examples:
  – Graph coloring [Figure: map of Australia with regions WA, NT, SA, Q, NSW, V, T]
  – Sortedness: [1 2 3 4 5 8 7]

What is NP completeness?
• All NP-complete problems can be "reduced" to each other in polynomial time
• What is a reduction?
  – Use one problem to solve another
  – A is reduced to B if we can use B to solve A:
    A instance → poly-time transformation → B solver = a poly-time A solver, if B is poly time

Why care about NP-completeness?
• Solving any one NP-complete problem gives you the key to all others
• All NP-complete problems are, in a sense, equivalent
• Insight into solving any one gives you insight into solving a vast array of problems of extraordinary practical and economic significance

Proving NP Completeness
• Want to prove problem C is NP-complete:
  – Show that C is in NP
  – Find a known NP-complete problem reducible to C
• Is graph coloring NP-complete?
  – Prove that graph coloring is in NP: verify a solution in poly time (easy)
  – Reduce a known NP-complete problem to graph coloring: much more challenging (reduction from SAT)

The First NP-Complete Problem (Cook 1971)
• SAT: (X1 ∨ X7 ∨ X13) ∧ (X2 ∨ X12 ∨ X25) ∧ ...
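Membership in NP hinges on the "verified in polynomial time" property mentioned above. A minimal sketch (not from the slides) of a polynomial-time checker for a candidate SAT assignment, assuming a DIMACS-style encoding where each clause is a list of signed integers:

```python
# Checks a candidate assignment against a CNF formula in polynomial time.
# Assumed encoding (DIMACS-style): a clause is a list of nonzero ints;
# literal k means "variable k is true", -k means "variable k is false".

def check_sat(clauses, assignment):
    """assignment maps variable index -> bool; runs in O(total literals)."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )

# (X1 v X7 v X13) ^ (X2 v X12 v X25)
clauses = [[1, 7, 13], [2, 12, 25]]
assignment = {v: False for v in range(1, 26)}
assignment[7] = True
assignment[12] = True
print(check_sat(clauses, assignment))  # True: each clause has a true literal
```

Note the asymmetry: checking a given assignment is easy, while finding one is the NP-complete part.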
• Want to find an assignment to all variables that makes this expression evaluate to true
• NP-complete for clauses of size 3 or greater
• How would you prove this?

What is NP Hardness?
• NP hardness is weaker than NP completeness
• A problem is NP-hard if an NP-complete problem is reducible to it
• NP completeness = NP hardness + NP membership
• Consider the problem #SAT
  – How many satisfying assignments to: (X1 ∨ X7 ∨ X13) ∧ (X2 ∨ X12 ∨ X25) ∧ ...
  – Is this in NP? (Not even a decision problem)
  – Is it NP-hard?

#SAT is NP-hard
• Theorem: #SAT is NP-hard
• Proof: Reduce SAT to #SAT
  – SAT instance → #SAT solver → x
  – If x > 0 return Y, else return N
  – This yields a SAT solver

NP-Completeness Summary
• NP-completeness tells us that a problem belongs to a class of similar, hard problems.
• What if you find that a problem is NP-hard?
  – Look for good approximations
  – Find different measures of complexity
  – Look for tractable subclasses
  – Use heuristics

CSPs
• What is a CSP?
• One view: Search with special goal criteria
• CSP definition (general):
  – Variables X1, …, Xn
  – Variable Xi has domain Di
  – Constraints C1, …, Cm
  – Solution: Each variable gets a value from its domain such that no constraints are violated
• CSP examples…
  – http://www.csplib.org/

Other CSP Examples
• Satisfying curriculum/major requirements
• Sudoku
• Seating arrangements at a party
• LSAT Questions: http://www.lsac.org/pdfs/SamplePTJune.pdf

A Restricted View
• Variables X1, …, Xn
• A binary constraint lists permitted assignments to pairs of variables
• A binary constraint between binary variables is a table of size 4, listing which of the 4 combinations are legal.
• A k-ary constraint lists legal assignments to k variables at a time.
• How large is a k-ary constraint for binary variables?
• Note: More expressive languages are often used.

CSP Example
Graph coloring: [Figure: map of Australia with Western Australia (WA), Northern Territory (NT), South Australia (SA), Queensland (Q), New South Wales (NSW), Victoria (V), Tasmania (T)]
Problem: Assign Red, Green, and Blue so that no 2 adjacent regions have the same color. (3-coloring)

Example Contd.
• Variables: {WA, NT, Q, SA, NSW, V, T}
• Domains: {R, G, B}
• Constraints:
  For WA – NT: {(R,G), (R,B), (G,B), (G,R), (B,R), (B,G)}
• We have a table for each adjacent pair
• Are our constraints binary?
• Can every CSP be viewed as a graph problem?

Constraint Graph
[Figure: constraint graph over WA, NT, SA, Q, NSW, V, T; enumerate all legal combinations of WA and SA (ignoring other regions)]

CSPs as Search
[Figure: search tree over the same constraint graph]
Nodes: Partial assignments
Actions: Make assignments

Backtracking
• Backtracking is the most obvious (and widely used) method for solving CSPs:
  – Search forward by assigning values to variables
  – If stuck, undo the most recent assignment and try again
  – Repeat until success or all combinations tried
• Embellishments
  – Methods for picking next variable to assign
    • Most constrained
    • Least constrained
  – Backjumping

NP-Completeness of CSPs
• Are CSPs in NP?
• Are they NP-hard?
• CSPs and graph coloring are equivalent
  – Convert any graph coloring problem to a CSP
  – Convert any CSP to graph coloring
• Known: Graph coloring is NP-complete
• Therefore, CSPs are NP-complete
• End of the story or just the beginning?

Issues
• What are good heuristics?
  – N.B.: Here we use the term "heuristic" to refer to a procedure for selecting next variables, not an h(x) function as in A*
  – Often good to think of this as a local search
  – Focus on choosing actions carefully, instead of pruning nodes carefully (as in A* or alpha-beta)
• Can we develop heuristics that apply to the entire class of problems, not just specific instances?
• What's the best we can hope for?

Constraint Graphs
• Constraint graphs are important because they capture the structural relationships between the variables
• IMPORTANT CONCEPT: Not all instances of a hard problem class are hard
  – Structural features give insight into hardness
  – Group problems within class by structural features
  – New measure of problem complexity

Node Consistency
[Figure: constraint graph over WA, NT, SA, Q, NSW, V, T]
• Check all nodes for inconsistencies
• For each node, there must exist at least one valid assignment given assignments to neighbors
• Rules out some bad assignments quickly

Arc Consistency
[Figure: same constraint graph]
• Check all arcs for inconsistencies
• For each value at the start, there must exist a consistent value at the terminus
• Catches many inconsistencies
• Can use to iteratively reduce the number of possible assignments to each variable
  (constraint propagation)

Generalized Arc Consistency
[Figure: constraint graph over WA, NT, SA, Q, NSW, V, T]
• k-consistency
  – Consider sets of k variables
  – For each legal setting of a (k-1)-subset
  – Check for a legal setting of the kth variable
• Checks for more distant influences
• Prunes out inconsistent settings
• 1-consistency = node consistency
• 2-consistency = arc consistency
• Is this 3-consistent?

Facts About Arc Consistency
• Strong k-consistency: consistent for all i < k
• What if a graph with n variables is strongly n-consistent? A solution exists!
• What is the worst-case cost of checking n-consistency? O(2^n)

Linear Constraint Structures
X1 – X2 – X3 – X4 – X5 – X6
Are these easy or hard? Suppose our chain is arc consistent…

Properties of Chains
Theorem: Arc-consistent linear constraint graphs are strongly n-consistent.
Proof: Induction on n.
Base: Arc-consistent chains of length 1 are consistent.
I.H.: Arc-consistent chains of length i are strongly i-consistent.
I.S.: Extending an i-step arc-consistent chain by 1 new arc-consistent link produces an (i+1)-link, strongly (i+1)-consistent chain.
Proof of I.S.: Since the last link is arc-consistent, any choice for variable i ensures a consistent choice for i+1. No other variables participate in constraints for i+1.

Properties of Trees
Theorem: Arc-consistent constraint trees are n-consistent.
Proof: Same as the chain case...
Corollary: CSPs with constraint trees are polynomial!
Cool fact: We now have a graph-based test for separating out some of the hard problems from the easy ones.

Variable Elimination
Eliminate WA:
Domain(NT,SA) = {(blue, green), (blue, red), (green, blue), (green, red), (red, blue), (red, green)}

Eliminate Q:
Domain(NT,SA,NSW) = {(blue, green, blue), (blue, red, blue), (red, blue, red), (red, green, red), (green, blue, green), (green, red, green)}

Simplify:
Domain(NT,SA,NSW) = {(blue, green, blue), (blue, red, blue), (red, blue, red), (red, green, red), (green, blue, green), (green, red, green)}
Domain(SA, NSW) = {(blue, green), (blue, red), (green, blue), (green, red), (red, blue), (red, green)}

Finish:
Domain(SA, NSW) = {(blue, green), (blue, red), (green, blue), (green, red), (red, blue), (red, green)}
Can identify all settings of SA, V, NSW for which there is guaranteed to be a consistent setting of the remaining variables.
Q: How do we get the settings of the other variables?

Variable Elimination
Var_elim_CSP_solve(vars, constraints)
  Q = queue of all variables
  i = length(vars) + 1
  While not(empty(Q))
    X = pop(Q)
    Xi = merge(X, neighbors(X))
    Simplify Xi
    remove_from_Q(Q, neighbors(X))
    add_to_Q(Q, Xi)
    i = i + 1
Note: The merge operation can be tricky to implement, depending upon the constraint language.

Variable Elimination Issues
• How expensive is this? Exponential in (size of largest merged variable set – 1).
• Is it sensitive to elimination ordering? Yes!

Variable Elimination Ordering
Is it better to start at the edges and work in, or at the center and work out? Edges!
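The merge step in the pseudocode above can be made concrete for binary constraints. A sketch (the extensional table representation and function names are my own illustration, not from the slides) that reproduces the Domain(NT,SA) table produced by eliminating WA:

```python
# Sketch of one variable-elimination step for binary constraints.
# A constraint is represented extensionally as the set of legal value
# pairs, like the Domain(...) tables above (assumed representation).

from itertools import product

COLORS = {"red", "green", "blue"}
not_equal = {(a, b) for a, b in product(COLORS, COLORS) if a != b}

def eliminate(var_domain, neighbor_constraints):
    """Eliminate a variable: keep the neighbor settings for which some
    value of the eliminated variable satisfies every constraint to it.
    neighbor_constraints: list of sets of (neighbor_value, var_value)."""
    n = len(neighbor_constraints)
    merged = set()
    for neighbor_vals in product(COLORS, repeat=n):
        if any(
            all((nv, v) in c for nv, c in zip(neighbor_vals, neighbor_constraints))
            for v in var_domain
        ):
            merged.add(neighbor_vals)
    return merged

# Eliminate WA, whose neighbors are NT and SA (both constraints "not equal"):
merged_nt_sa = eliminate(COLORS, [not_equal, not_equal])
# Any (NT, SA) pair leaves some color free for WA, so merged_nt_sa has all
# 9 pairs; intersecting with the separate NT-SA constraint gives the table:
domain_nt_sa = merged_nt_sa & not_equal
print(len(domain_nt_sa))  # 6 legal (NT, SA) pairs, matching the slide
```

This also shows where the exponential cost comes from: the loop enumerates all settings of the merged neighbor set.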
Variable Elimination Facts
• You can figure out the cost of a particular elimination ordering without actually constructing the tables
• Finding the optimal elimination ordering is NP-hard
• Good heuristics exist for finding near-optimal orderings
• Another structural complexity measure
• Investment in finding a good ordering can be amortized

CSP Summary
• CSPs are a specialized language for describing certain types of decision problems
• We can formulate special heuristics and methods for problems that can be described in this language
• In general, CSPs are NP-hard – no general, fast solutions on the horizon
• In some cases, we can use structural measures of complexity to figure out which ones are really hard
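As a capstone, the backtracking method described earlier, with the "most constrained variable" embellishment, can be sketched for the Australia coloring CSP. The data layout and names here are my own illustration, not the course's code:

```python
# Minimal backtracking solver for the Australia 3-coloring CSP, using the
# most-constrained-variable heuristic to pick the next variable to assign.
# Assumed representation: adjacency dict; all constraints are "not equal".

ADJ = {
    "WA": ["NT", "SA"],
    "NT": ["WA", "SA", "Q"],
    "SA": ["WA", "NT", "Q", "NSW", "V"],
    "Q": ["NT", "SA", "NSW"],
    "NSW": ["Q", "SA", "V"],
    "V": ["SA", "NSW"],
    "T": [],  # Tasmania touches nothing
}
COLORS = ["red", "green", "blue"]

def legal(var, color, assignment):
    # A color is legal if no already-assigned neighbor uses it.
    return all(assignment.get(n) != color for n in ADJ[var])

def backtrack(assignment):
    if len(assignment) == len(ADJ):
        return assignment
    # Most constrained variable: fewest legal colors remaining.
    var = min(
        (v for v in ADJ if v not in assignment),
        key=lambda v: sum(legal(v, c, assignment) for c in COLORS),
    )
    for color in COLORS:
        if legal(var, color, assignment):
            assignment[var] = color
            if backtrack(assignment) is not None:
                return assignment
            del assignment[var]  # undo the most recent assignment
    return None  # dead end: triggers backtracking in the caller

solution = backtrack({})
print(solution is not None)  # True: a 3-coloring of the map exists
```

Checking a completed assignment against all constraints is the polynomial-time verification step; the search over partial assignments is where the exponential worst case lives.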
This note was uploaded on 02/17/2012 for the course COMPSCI 170 taught by Professor Parr during the Spring '11 term at Duke.