1.204 Lecture 14
Dynamic programming: Job scheduling

Dynamic programming formulation

To formulate a problem as a dynamic program:
- Sort by a criterion that will allow infeasible combinations to be eliminated efficiently
- Choose granularity (integer scale or precision) that allows dominated subsequences to be pruned (see the sketch after this list)
  - Choose coarsest granularity that works for your problem
- Use dynamic programming in fairly constrained problems with tight budgets and bounds
  - If problem is not highly constrained, you will need to apply heuristic constraints to limit the search space
- Choose between multistage graph, set or custom implementation
  - Decide if a sentinel is helpful in set implementation
- Experiment
  - Every problem is a special case, since DP is O(2^n)
  - Can you find special structure that makes your DP fast?
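The "dominated subsequences" bullet is the central pruning idea behind these DP codes: a partial solution can be discarded whenever another one uses no more of the resource and earns at least as much profit. Below is a minimal, hypothetical Java sketch of such a dominance check over (time used, profit) states; the State record and prune method are illustrative assumptions, not the lecture's implementation.

```java
import java.util.*;

// Hypothetical illustration of dominance pruning: a partial solution (state) is
// dominated if another state uses no more time and earns at least as much profit.
public class DominancePruning {
    record State(int timeUsed, int profit) {}

    /** Keep only non-dominated states: sort by time used (ties broken by higher
     *  profit first), then drop any state whose profit does not beat the best
     *  profit seen so far. */
    static List<State> prune(List<State> states) {
        List<State> sorted = new ArrayList<>(states);
        sorted.sort(Comparator.comparingInt(State::timeUsed)
                              .thenComparing(Comparator.comparingInt(State::profit).reversed()));
        List<State> kept = new ArrayList<>();
        int bestProfit = Integer.MIN_VALUE;
        for (State s : sorted) {
            if (s.profit() > bestProfit) {   // strictly better profit despite using more time
                kept.add(s);
                bestProfit = s.profit();
            }
        }
        return kept;
    }

    public static void main(String[] args) {
        List<State> states = List.of(new State(3, 50), new State(2, 60), new State(5, 55));
        System.out.println(prune(states));  // (2,60) dominates (3,50) and (5,55)
    }
}
```

The coarser the granularity of the time scale, the more states collapse onto the same key and the more aggressively this kind of pruning cuts the search space.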
DP examples

- This lecture shows another example
  - Job scheduling, using multistage graph
  - Example of sorting, feasibility, pruning used effectively
  - Example of good software implementation
    - No graph data structure built; solution tree built directly
    - Good but not ideal representation of tree/graph nodes; some nodes are created but not used
    - We don't even consider 2-D arrays, linked lists, etc., which do not scale at all, but which are popular in many texts. Crazy
- Good DP codes are somewhat hard to write; there is much detail to handle and many lurking inefficiencies to combat
  - We will not dwell on the code details, but they are important
- Knapsack problem in next lecture, using sets
  - Example of sorting, feasibility, pruning in different framework
  - Multistage graph doesn't work: too many nodes per stage
  - Object oriented design is big improvement over past codes
    - Be careful: many texts have zillions of inefficient, tiny objects

Job scheduling dynamic program

- Each job to be scheduled is treated as a project with a profit, time required, and deadline
  - We have a single machine over a given time (resource)
  - Use multistage graph formulation from last lecture
- Algorithm pseudocode (a minimal code sketch follows this slide):
  - Sort jobs in deadline order (not profit order as in greedy)
  - Build source node for job 0
  - Consider each job in deadline order:
    - Build set of nodes for next stage (job) for each state (time spent)
    - For current job:
      - Build arc with no time assigned to job
      - If time so far + current job time <= job deadline, build arc with job done
  - Build sink node for artificial last job
  - Trace back solution using predecessor nodes
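The pseudocode maps fairly directly onto code. The following is a minimal Java sketch, assuming each stage's states are keyed by time used and labeled with the best profit found plus a predecessor pointer; the class names (Job, State, JobSchedulingDP) and the relax helper are my assumptions for illustration, not the lecture's implementation. Note that no explicit graph data structure is built: the stages of states are the solution tree, and the "sink" is simply the best-profit state in the final stage.

```java
import java.util.*;

// Minimal sketch of the multistage-graph job scheduling DP described above.
public class JobSchedulingDP {

    static class Job {
        final int profit, time, deadline;
        Job(int profit, int time, int deadline) {
            this.profit = profit; this.time = time; this.deadline = deadline;
        }
    }

    // A node in one stage: time used so far, best profit found, predecessor,
    // and the job (if any) done on the arc into this node.
    static class State {
        final int timeUsed;
        int profit;
        State pred;
        int jobTaken;          // index (in deadline order) of job done, or -1 for a skip
        State(int timeUsed, int profit, State pred, int jobTaken) {
            this.timeUsed = timeUsed; this.profit = profit;
            this.pred = pred; this.jobTaken = jobTaken;
        }
    }

    /** Returns the deadline-sorted indices of the jobs scheduled. */
    static List<Integer> schedule(Job[] jobs) {
        // Sort jobs in deadline order (not profit order as in the greedy method)
        Job[] sorted = jobs.clone();
        Arrays.sort(sorted, Comparator.comparingInt((Job j) -> j.deadline));

        // Source node: no time used, no profit
        Map<Integer, State> stage = new HashMap<>();
        stage.put(0, new State(0, 0, null, -1));

        // Consider each job in deadline order, building the next stage of states
        for (int k = 0; k < sorted.length; k++) {
            Job job = sorted[k];
            Map<Integer, State> next = new HashMap<>();
            for (State s : stage.values()) {
                // Arc 1: skip the job (no time assigned to it)
                relax(next, s.timeUsed, s.profit, s, -1);
                // Arc 2: do the job, only if it still meets its deadline
                if (s.timeUsed + job.time <= job.deadline) {
                    relax(next, s.timeUsed + job.time, s.profit + job.profit, s, k);
                }
            }
            stage = next;
        }

        // Sink: best state in the last stage; trace back through predecessors
        State best = Collections.max(stage.values(),
                                     Comparator.comparingInt((State s) -> s.profit));
        List<Integer> chosen = new ArrayList<>();
        for (State s = best; s != null; s = s.pred) {
            if (s.jobTaken >= 0) chosen.add(s.jobTaken);
        }
        Collections.reverse(chosen);
        return chosen;
    }

    /** Label-correcting update: keep only the best profit for each time-used key. */
    static void relax(Map<Integer, State> stage, int time, int profit, State pred, int jobTaken) {
        State existing = stage.get(time);
        if (existing == null) {
            stage.put(time, new State(time, profit, pred, jobTaken));
        } else if (profit > existing.profit) {
            existing.profit = profit;
            existing.pred = pred;
            existing.jobTaken = jobTaken;
        }
    }

    public static void main(String[] args) {
        Job[] jobs = { new Job(100, 2, 3), new Job(60, 1, 2), new Job(40, 2, 4) };
        System.out.println("Jobs chosen (deadline-sorted indices): " + schedule(jobs));
    }
}
```

Keying each stage by time used means a stage never holds two nodes with the same time label, which is what lets the relax step behave like the shortest path label correction described on the next slide.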
Job scheduling algorithm

- We will label every node in the graph that we encounter with its profit and time used
  - If we find a better path to that node, we update its profit and time labels
- This is exactly the same as the shortest path label correcting algorithm (contrasted in the sketch below)
  - We know this algorithm runs fast
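To make the analogy concrete, here is a small, hypothetical Java sketch contrasting the two update rules: the only difference is the direction of the comparison (minimize a distance label vs. maximize a profit label). The array-based representation and method names are assumptions for illustration; the lecture's node objects carry the same (profit, time, predecessor) labels.

```java
// Hypothetical side-by-side of the two label-correcting updates.
public class LabelCorrecting {
    // Shortest path: relax arc (u, v) with cost c; keep the smaller distance label.
    static boolean relaxDistance(double[] dist, int[] pred, int u, int v, double c) {
        if (dist[u] + c < dist[v]) {                  // found a shorter path to v
            dist[v] = dist[u] + c;
            pred[v] = u;
            return true;
        }
        return false;
    }

    // Job scheduling: relax arc into the state with the same time used;
    // keep the larger profit label.
    static boolean relaxProfit(int[] profit, int[] pred, int from, int to, int arcProfit) {
        if (profit[from] + arcProfit > profit[to]) {  // found a more profitable path
            profit[to] = profit[from] + arcProfit;
            pred[to] = from;
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        double[] dist = {0.0, Double.POSITIVE_INFINITY};
        int[] dpred = {-1, -1};
        relaxDistance(dist, dpred, 0, 1, 2.5);
        System.out.println("dist[1] = " + dist[1]);       // 2.5

        int[] profit = {0, Integer.MIN_VALUE};
        int[] ppred = {-1, -1};
        relaxProfit(profit, ppred, 0, 1, 60);
        System.out.println("profit[1] = " + profit[1]);   // 60
    }
}
```

Because each node is corrected in place whenever a better path reaches it, the number of label updates stays small in practice, which is why the algorithm runs fast on constrained instances.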