CSC 30155: Approximation Algorithms
Wednesday 08/12/10
Dr. Daniel Hughes ([email protected])

Today's Lecture
- Recap of Approximation Algorithms (20 mins)
- The Vertex Cover Algorithm (40 mins)
- Multiprocessor Scheduling Algorithm (30 mins)
- Free time to work on the assignment (20 mins)

Dealing With Complexity
- Many optimization problems, including many that we need to solve in practice, are NP-hard.
- These problems are unlikely to have efficient algorithms, but that does not mean we can avoid solving them.
- If P ≠ NP, we cannot find algorithms that produce optimal solutions to all instances in polynomial time.
- There are, however, three ways to relax these requirements.

Option 1: Heuristics
- A heuristic solves a problem without guaranteeing that its solution can be proven correct. Heuristics let us attack hard problems, but at the expense of accuracy or provability.
- Heuristic solutions typically learn which avenues to pursue and which to disregard by keeping track of past results. Some possibilities are therefore never generated, because they are judged unlikely to lead to a complete solution (reducing complexity).
- Examples:
  - Real-time antivirus scanners use heuristic signatures to look for specific characteristics of viruses and malware.
  - Probabilistic routing of messages in wireless networks.
  - Electronic parsing of human speech.

Aside: Branch and Bound
- Branch and bound systematically enumerates all candidate solutions, efficiently discarding large subsets of fruitless candidates using upper and lower estimated bounds on the quantity being optimized.
- Sometimes (but not very often!) techniques such as branch-and-bound yield algorithms that are not much worse than polynomial time.

Option 2: Probabilistic Analysis
- What if we do not require success on all instances?
- Sometimes we have information about the probability distribution from which inputs are drawn.
- Sometimes we can find a polynomial-time algorithm that finds an optimal solution with high probability on average-case inputs.

Option 3: Approximation
- We use approximation algorithms when we need a solution quickly but can tolerate some deviation from the optimal solution.
- Sometimes we can design a polynomial-time algorithm that is guaranteed to produce, efficiently, a solution not much worse than the optimal one.

Optimization Problems
- Recall that an optimization problem P is defined by four components (I, S, v, goal), where:
  1. I is the set of instances of P.
  2. For each x ∈ I, S(x) is the set of feasible solutions associated with x. ...
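The approximation guarantee described above can be made concrete with the vertex cover problem from today's lecture outline. The sketch below is the classic 2-approximation (repeatedly take both endpoints of an uncovered edge), not necessarily the exact algorithm presented in the lecture; the function name and the edge-list representation are our own illustrative choices.

```python
def approx_vertex_cover(edges):
    """Greedy 2-approximation for vertex cover.

    Scan the edges; whenever an edge has neither endpoint in the
    cover yet, add BOTH endpoints. Every edge ends up covered, and
    the result is at most twice the size of an optimal cover,
    because the chosen edges form a matching and any cover must
    contain at least one endpoint of each matched edge.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover

# Example: the path 1-2-3-4. The optimal cover is {2, 3} (size 2);
# this heuristic may return all four vertices (size 4), which still
# respects the factor-2 guarantee.
print(approx_vertex_cover([(1, 2), (2, 3), (3, 4)]))
```

Note that the algorithm runs in linear time in the number of edges, which is exactly the trade-off Option 3 describes: polynomial running time in exchange for a bounded loss in solution quality.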
This note was uploaded on 05/22/2011 for the course CSC 30155 taught by Professor Garyli during the Spring '11 term at University of Liverpool.