CS 473: Algorithms
Chandra Chekuri
chekuri@cs.illinois.edu
3228 Siebel Center
University of Illinois, Urbana-Champaign, Fall 2009

Chekuri CS473

Part I: Heuristics

Coping with Intractability

Some general things that people do:
- Consider special cases of the problem which may be tractable.
- Run inefficient algorithms (for example, exponential-time algorithms for NP-hard problems) augmented with (very) clever heuristics:
  - stop the algorithm when time/resources run out
  - use massive computational power
- Exploit properties of instances that arise in practice, which may be much easier. Give up on hard instances, which is OK.
- Settle for suboptimal solutions, especially for optimization problems.

NP and EXP
EXP: all problems that have an exponential-time algorithm.

Proposition. NP ⊆ EXP.

Proof. Let X ∈ NP with certifier C. To prove X ∈ EXP, here is an algorithm for X. Given input s: for every t with |t| ≤ p(|s|), run C(s, t); answer "yes" if any one of these calls returns "yes", and otherwise say "no".

Every problem in NP has a brute-force "try all possibilities" algorithm that runs in exponential time.
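As a concrete illustration of such a brute-force algorithm, here is a sketch in Python for Independent Set, trying all vertex subsets of size k. The graph encoding and the function name are illustrative choices, not from the lecture:

```python
from itertools import combinations

def has_independent_set(n, edges, k):
    """Brute force: try every k-subset of the n vertices.

    Exponential time in the worst case, but always correct.
    """
    edge_set = {frozenset(e) for e in edges}
    for subset in combinations(range(n), k):
        # The subset is independent iff no pair inside it is an edge.
        if all(frozenset(p) not in edge_set
               for p in combinations(subset, 2)):
            return True
    return False
```

To search over all sizes, one would simply loop k from n down to 1, which stays within exponential time overall.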
Examples:
- SAT: try all possible truth assignments to the variables.
- Independent Set: try all possible subsets of vertices.
- Vertex Cover: try all possible subsets of vertices.

Improving brute force via intelligent backtracking

Backtrack search: enumeration with bells and whistles to "heuristically" cut down the search space. Works quite well in practice for several problems, especially for small enough problem sizes.

Backtrack Search Algorithm for SAT
Input: CNF formula ϕ on n variables x1, ..., xn and m clauses
Output: Is ϕ satisfiable or not?

1. Pick a variable xi.
2. Let ϕ′ be the CNF formula obtained by setting xi = 0 and simplifying.
3. Run a simple (heuristic) check on ϕ′: it returns "yes", "no", or "not sure".
4. If "not sure", recursively solve ϕ′; if ϕ′ is satisfiable, return "yes".
5. Let ϕ′′ be the CNF formula obtained by setting xi = 1 and simplifying. Run the simple check on ϕ′′; if "not sure", recursively solve ϕ′′; if ϕ′′ is satisfiable, return "yes".
6. Return "no".

A certain part of the search space is pruned.
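The steps above can be sketched in Python. The clause encoding (frozensets of signed integers, where a positive integer is a variable and a negative one its negation) and the variable-choice heuristic are illustrative assumptions, not prescribed by the slides:

```python
from collections import Counter

def simplify(clauses, lit):
    """Set literal `lit` to true: drop satisfied clauses, shrink the rest."""
    out = []
    for c in clauses:
        if lit in c:
            continue              # clause satisfied, drop it
        if -lit in c:
            c = c - {-lit}        # literal falsified, remove it
        out.append(c)
    return out

def quick_check(clauses):
    """The 'obvious test': 'no' on an empty clause, 'yes' if none remain."""
    if not clauses:
        return "yes"
    if any(len(c) == 0 for c in clauses):
        return "no"
    return "not sure"

def backtrack_sat(clauses):
    """Backtrack search for satisfiability of a CNF formula."""
    status = quick_check(clauses)
    if status != "not sure":
        return status == "yes"
    # Heuristic (an assumed choice): branch on the variable
    # occurring in the most clauses.
    counts = Counter(abs(l) for c in clauses for l in c)
    x = counts.most_common(1)[0][0]
    # Try x = 0 (literal -x true), then x = 1 (literal x true).
    return (backtrack_sat(simplify(clauses, -x))
            or backtrack_sat(simplify(clauses, x)))
```

Note how the quick check at the top of each recursive call is exactly what prunes part of the search space.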
Example

[Figure 9.1 from the Dasgupta et al. book: backtracking reveals that the formula
(w ∨ x ∨ y ∨ z), (w ∨ ¬x), (x ∨ ¬y), (y ∨ ¬z), (z ∨ ¬w), (¬w ∨ ¬z)
is not satisfiable.]

From the Dasgupta et al. book: if the clause branched on happens to be a singleton, then at least one of the resulting branches will be terminated. (If there is a tie in choosing subproblems, one reasonable policy is to pick the one lowest in the tree, in the hope that it is close to a satisfying assignment.) See Figure 9.1 for the conclusion of our earlier example. More abstractly, a backtracking algorithm requires a test that looks at a subproblem and quickly declares one of three outcomes, the first being failure: the subproblem has no solution.

Backtrack Search Algorithm for SAT
How do we pick the order of variables? Heuristically! Examples:
- pick the variable that occurs in the most clauses first
- pick the variable that appears in the most size-2 clauses first
- ...

What are quick tests for satisfiability? They depend on known special cases and heuristics. Examples:
- the obvious test: return "no" if there is an empty clause, "yes" if no clauses are left, and otherwise "not sure"
- run the obvious test and, in addition, if all clauses have size 2, run the polynomial-time 2SAT algorithm
- ...

Branch-and-Bound: backtracking for optimization problems
Intelligent backtracking can also be used for optimization problems. Consider a minimization problem.

Notation: for an instance I, opt(I) is the optimum value on I. P0 is the initial instance of the given problem.

- Keep track of the best solution value B found so far. Initialize B to a crude upper bound on opt(P0).
- Let P be a subproblem at some stage of exploration.
- If P is a complete solution, update B.
- Else use a lower-bounding heuristic to quickly/efficiently find a lower bound b on opt(P):
  - if b ≥ B, then prune P;
  - else explore P further by breaking it into subproblems and recursing on them.
- Output the best solution found.
Example: Vertex Cover

Given G = (V, E), find a minimum-sized vertex cover in G.
- Initialize B = n − 1.
- Pick a vertex u. Branch on u: either choose u or discard it.
- Let b1 be a lower bound on G1 = G − u. If 1 + b1 < B, recursively explore G1.
- Let b2 be a lower bound on G2 = G − u − N(u), where N(u) is the set of neighbours of u. If |N(u)| + b2 < B, recursively explore G2.
- Output B.

How do we compute a lower bound? One possibility: solve an LP relaxation, as we saw in a previous lecture.
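This branch-and-bound can be sketched in Python. The slides suggest an LP relaxation for the lower bound; as a simpler substitute (an assumption of this sketch), a greedy maximal matching is used, since every matching edge forces a distinct cover vertex:

```python
def vc_lower_bound(adj):
    """Greedy maximal matching; its size lower-bounds any vertex cover."""
    matched, lb = set(), 0
    for u in adj:
        if u in matched:
            continue
        for v in adj[u]:
            if v not in matched:
                matched.update({u, v})
                lb += 1
                break
    return lb

def remove(adj, drop):
    """Subgraph with the vertices in `drop` deleted."""
    return {u: {v for v in nbrs if v not in drop}
            for u, nbrs in adj.items() if u not in drop}

def bnb(adj, best):
    """Return min(best, optimum vertex cover size of adj)."""
    if all(not nbrs for nbrs in adj.values()):
        return 0                             # no edges left: complete solution
    if vc_lower_bound(adj) >= best:
        return best                          # prune this subproblem
    u = max(adj, key=lambda v: len(adj[v]))  # branch on a max-degree vertex
    # Branch 1: choose u for the cover.
    best = min(best, 1 + bnb(remove(adj, {u}), best - 1))
    # Branch 2: discard u; all of N(u) must then be in the cover.
    nu = adj[u]
    if len(nu) < best:
        best = min(best, len(nu) + bnb(remove(adj, {u} | nu), best - len(nu)))
    return best

def min_vertex_cover_size(adj):
    """adj: dict mapping vertex -> set of neighbours."""
    return bnb(adj, max(len(adj) - 1, 0))    # crude upper bound B = n - 1
```

The initial bound B = n − 1 is always achievable: if any edge were uncovered, both its endpoints would have to be among the excluded vertices, but only one vertex is excluded.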
Local Search

Local search: a simple and broadly applicable heuristic method.
- Start with some arbitrary solution s.
- Let N(s) be the set of solutions in the "neighbourhood" of s, obtained from s via "local" moves/changes.
- If there is a solution s′ ∈ N(s) that is better than s, move to s′ and continue the search from s′.
- Else, stop the search and output s.

Main ingredients in local search:
- Initial solution
- Definition of the neighbourhood of a solution
- Efficient algorithm to find a good solution in the neighbourhood

Example: TSP

TSP: given a complete graph G = (V, E) with c_ij denoting the cost of edge (i, j), compute a Hamiltonian cycle/tour of minimum edge cost.
2-change local search:
- Start with an arbitrary tour s0.
- For a solution s, define s′ to be a neighbour if s′ can be obtained from s by replacing two edges of s with two other edges.
- A solution s has at most O(n²) neighbours, and one can try all of them to find an improvement.

From the Dasgupta et al. book: two tours cannot differ in just one edge (do you see why?), so we consider tours that differ in two edges. We define the 2-change neighborhood of tour s as the set of tours that can be obtained by removing two edges of s and then putting in two other edges. We now have a well-defined local search procedure. How does it measure up under our criteria for algorithms: what is its overall running time, and does it always return the best solution? Embarrassingly, neither of these questions has a satisfactory answer. Each iteration is certainly fast, because a tour has only O(n²) neighbors. However, it is not clear how many iterations will be needed: whether, for instance, there might be an exponential number. Likewise, all we can easily say about the final tour is that it is locally optimal, that is, superior to the tours in its immediate neighborhood. There might be better solutions further away; the range of local moves is simply too limited to improve upon them.
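A minimal sketch of the 2-change (2-opt) procedure in Python; the tour representation (a list of city indices) and the cost-matrix interface are illustrative assumptions:

```python
def tour_cost(tour, cost):
    """Total cost of a cyclic tour given a cost matrix."""
    n = len(tour)
    return sum(cost[tour[i]][tour[(i + 1) % n]] for i in range(n))

def two_change(tour, cost):
    """Repeat 2-change moves until the tour is locally optimal."""
    tour = list(tour)
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue          # these two edges share a vertex
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                # Replacing edges (a,b),(c,d) with (a,c),(b,d)
                # amounts to reversing the segment from b to c.
                if cost[a][c] + cost[b][d] < cost[a][b] + cost[c][d]:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour
```

On a small 4-city instance where the "crossing" tour [0, 2, 1, 3] costs 6, a single 2-change move already reaches the tour of cost 4; as the text notes, on larger instances the output is only guaranteed to be locally optimal.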
TSP: 2-change example

[Figures from the Dasgupta et al. book: an example of a 2-change move, and a final tour that is locally optimal for the 2-change heuristic but not globally optimal; the range of local moves is too limited to improve upon it.]
TSP: 3-change example

From the Dasgupta et al. book: to overcome this, we may try a more generous neighborhood, for instance 3-change, consisting of tours that differ on up to three edges; indeed, the preceding bad case is then fixed. But there is a downside, in that the size of a neighborhood becomes O(n³), making each iteration more expensive. Moreover, there may still be suboptimal local minima, although fewer than before. To avoid these, we would have to go up to 4-change, or higher. Efficiency and quality often turn out to be competing considerations in a local search: efficiency demands neighborhoods that can be searched quickly, but smaller neighborhoods leave more low-quality local optima.

- 3-change local search: swap 3 edges.
- Can define a k-change heuristic where k edges are swapped; this increases the neighbourhood size but makes each local improvement step less efficient.

Local Search Variants
Local search terminates with a local optimum, which may be far from a global optimum. There are many variants that improve on plain local search:
- Randomization and restarts: the initial solution may strongly influence the quality of the final solution, so try many random initial solutions.
- Simulated annealing: a general method where one allows the algorithm to move to worse solutions with some probability. At the beginning this is done more aggressively; then, slowly, the algorithm converges to plain local search. Controlled by a parameter called "temperature".
- Tabu search: store already-visited solutions and do not visit them again (they are "taboo").

Heuristics

Several other heuristics are used in practice:
- Heuristics for solving integer linear programs, such as cutting planes and branch-and-cut, are quite effective.
- Heuristics to solve SAT (SAT solvers) have gained prominence in recent years.
- Genetic algorithms
- ...

Heuristic design is somewhat ad hoc and depends heavily on the problem and on the instances of interest. Rigorous analysis is sometimes possible.

Part II: Finals and Closing Thoughts

Topics for the Final
- Recursion: reduce a problem to smaller instance(s) of itself.
- Divide and Conquer: divide into multiple pieces, solve recursively, and build a solution to the original instance.
- Dynamic Programming: recursion with memoization; runs in polynomial time if the number of subproblems is polynomial.
- Graph Algorithms: DFS/BFS, directed graphs, DAGs, strong components, MST, shortest paths.
- Greedy Algorithms: always think of a proper proof!
- Network Flows and Applications
- Polynomial-time Reductions, P, NP, coNP, NP-Completeness
- Light touch: Linear Programming, Integer Programming, Approximation Algorithms

Topics I wish I had time for

- Data structures: hashing, splay trees, etc.
- Randomization in algorithms
- Basic lower bounds
- More on heuristics and applications
- Experimental evaluation

Theoretical Computer Science

- Algorithms: find efficient ways to solve particular problems.
- Computational Complexity: understand the nature of computation; classification of problems into classes (P, NP, coNP) and their relationships; limits of computation.
- Logic, Languages, and Formal Methods

These form the foundations for computer "science".

The Computational Lens
"The Algorithm: Idiom of Modern Science" by Bernard Chazelle
http://www.cs.princeton.edu/chazelle/pubs/algorithm.html

Computation has gained ground as a fundamental artifact in mathematics and science:
- the nature of proofs, P vs NP, complexity, ...
- quantum computation and information
- computational biology and biological processes, ...

The standard question in math and the sciences: Is there a solution/algorithm?
The new question: Is there an efficient solution/algorithm?

Questions?

Final Exam: Thursday, Dec 17th, 1:30-4:30pm in 1404 Siebel.

Thanks!