Solution 1. CHAPTER 6

Problem 5: We solve this problem by dynamic programming. The key task in developing a dynamic programming algorithm is to characterize optimal solutions in terms of optimal subsolutions. This requires understanding the structure of an optimal solution.

The input string is denoted by Y = y_1 y_2 ... y_n. For notational convenience, we write Y_{i,k} for characters i through k of Y, and specifically Y_k = Y_{1,k} = y_1 ... y_k for the first k characters of Y. We let OPT(k) be the highest total quality of any segmentation of Y_k under the "black box" quality() measure. Then what do we know about OPT(k)?

If k = 0, then Y_k is the empty string, which always has quality quality(Y_0) = 0. If k >= 1, then we have a choice of where to segment. The optimum solution for Y_k must have some last point i < k at which a segmentation is made (this point may be i = 0, expressing that we believe all of Y_k to be one segment). The total quality is quality(Y_{i+1,k}) for this last segment, plus whatever quality the segmentation of the characters Y_i to the left of i contributed. But the segmentation of Y_i must also be optimal; if it were not, then by changing it to the optimum, we would obtain a better solution for Y_k. So we have proved the following:

Recursive Relation: The optimum quality OPT(k) satisfies OPT(0) = 0, and

    OPT(k) = max_{0 <= i < k} ( quality(Y_{i+1,k}) + OPT(i) ).

We can convert this to an actual dynamic programming algorithm in the usual way. We use a[k] to denote the computed quality (which we want to coincide with the optimum OPT(k)), and s[k] for the index of the last segmentation point that gives the optimum value for Y_k.
Algorithm:
    set a[0] := 0
    for k = 1 to n do
        Let 0 <= i* < k be such that it maximizes the quantity a[i] + quality(Y_{i+1,k}).
        Let a[k] := a[i*] + quality(Y_{i*+1,k}), and s[k] := i*.
    end for
    Output a[n] as the quality, and n, s[n], s[s[n]], ..., 0 as segmentation points.

That the algorithm computes the right values a[k] is immediate, as we explicitly compute the same function as OPT(k). (A really formal proof of this would use induction on k.) The output is in fact a corresponding optimal segmentation because we compute each s[k] to be the actual last segmentation point chosen for the optimum for Y_k.
point chosen for the optimum for Yk. The running time of this algorithm is @(ng), as in each of (90%) iterations, we choose
a minimum over @(n) values. Problem 6: Let W : {101, Mg, be the set of ordered words which we
wish to print. Suppose, in the optimal solution the ﬁrst line contains is words then
the rest of lines constitute an optimal solution for the subproblem with the set
{wk+1, ...,wn}, otherwise by replacing the nonoptimal solution of {wk+1, ..., wn}
by an optimal solution into {M1,w2, we obtain a solution which beats the
optimal solution on {101, Mg, ..., mm}, which is a contradiction. Recursive Relation: Let constitute the sum of the squares of the slacks for the optimal solution of {club ..., Then,
Clil : [L i (232% +1) + 07012 if 2;;le +1) + 0n S L : ménj:i,___,i+p,1{[L — +1)+ on]? + OH + 1]} where the ﬁrst p words
can be added to the line.
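The recurrence can be sketched in code as follows (filling C[] right to left; the function name and the (cost, lines) return shape are my own choices):

```python
def wrap(words, L):
    """Minimize the sum of squared slacks over all lines (last line
    included, matching the recurrence above).
    Returns (optimal cost, lines as lists of words)."""
    n = len(words)
    c = [len(w) for w in words]
    INF = float("inf")
    C = [INF] * (n + 1)    # C[i] = optimal cost for words i..n-1 (0-indexed)
    C[n] = 0.0
    nxt = [n] * (n + 1)    # nxt[i] = start of the line after the one at i
    for i in range(n - 1, -1, -1):
        width = 0
        for j in range(i, n):                    # end the first line at word j
            width += c[j] + (1 if j > i else 0)  # word plus one space
            if width > L:
                break
            slack = L - width
            cost = slack * slack + C[j + 1]
            if cost < C[i]:
                C[i], nxt[i] = cost, j + 1
    lines, i = [], 0
    while i < n:
        lines.append(words[i:nxt[i]])
        i = nxt[i]
    return C[0], lines
```

With words of lengths 3, 2, 2, 5 and L = 6, the sketch puts "bb cc" on one line (slack 1) rather than "aaa bb" (slack 0), because the latter strands the remaining words at higher cost.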
Time Complexity: Filling each entry takes O(L) time (since (i + p - 1) - i + 1 = p <= L), and there are O(n) such entries. Hence, the running time is O(Ln).

Problem 14: (a) This part is really more a hint on how to solve the "real" question, part (b). So let us solve a (slightly) more general problem: finding the shortest path P_{i,j} which exists in all of the graphs G_i, ..., G_j. Also, we will allow for the fact that there may not be any path which is in all of these graphs, in which case we define the cost of such a path to be infinity.

The simple intuition is that if a path P is to be in all of G_i, ..., G_j, then each of its edges must be in all of these graphs. In other words, each edge must be in the intersection of the edge sets E_i, ..., E_j. Since that is the only constraint, P is in fact simply the shortest s-t path in the graph (V, E_i intersect ... intersect E_j). Taking the intersection of the edge sets is not difficult, and after that, we can simply run any shortest paths algorithm, like Dijkstra or Bellman-Ford. So formally, our algorithm looks as follows:

(1) Let E = E_i intersect ... intersect E_j.
(2) Run Dijkstra's algorithm on G = (V, E) to find a shortest s-t path.
(3) If there was no s-t path, return infinity; else return the shortest s-t path P and its total length l(P).

(b) We want to characterize the optimum cost OPT(t) up to (and including) time
t in terms of the optimum costs for times t' < t. We can distinguish between two possibilities for the optimum solution up to time t.

Case 1: The s-t path it uses never changes; then it used the same path from time 0 all the way to time t. The path P it uses must be in all graphs G_0, ..., G_t, and, because it is the optimal solution, it must be the cheapest such path. In other words, the optimum solution uses the path P_{0,t}, and its total cost is (t + 1) l(P_{0,t}).

Case 2: The s-t path changes at least once in the optimal solution. Then there must be some last time t' < t such that the path changed after time step t'. That means that the path stayed the same after t', and as argued above, the cheapest path to accomplish that is P_{t'+1,t}. Also, at time t', the optimum paid a changing cost of K. What did the optimum do up until time t'? It must have solved that subproblem optimally, as otherwise, we could again combine a better solution for graphs G_0, ..., G_{t'} with the path P_{t'+1,t}, giving a better solution. So to sum up, the optimum cost in this case is OPT(t) = OPT(t') + K + (t - t') l(P_{t'+1,t}).

Together, these two observations prove the following recursive relation:

Recursive Relation: The optimum cost OPT(t) satisfies the recurrence OPT(0) = l(P_0), and

    OPT(t) = min( (t + 1) l(P_{0,t}),  min_{t' < t} ( OPT(t') + K + (t - t') l(P_{t'+1,t}) ) )   for t > 0.

This recursive relation suggests directly a dynamic programming implementation. Again, we let a[t] denote our variable that is supposed to compute OPT(t), and we let s[t] be the last point after which we changed paths in the optimum solution up to time t. Again, that a[t] corresponds to the right values, and that the s[t] are the right change points, follows by our arguments preceding the recursive relation.

Time Complexity: In each of O(b) iterations, we take a minimum over O(b) values. Computing the gamma[t'] takes one invocation of ShortestSharedPath each.
ShortestSharedPath takes the intersection of O(b) graphs and then runs Dijkstra, so its running time is O(bm + n log n). As we invoke it O(b^2) times, the total running time is O(b^2 (bm + n log n)). In fact, by being a little more efficient, we can pre-compute all the needed intersections in time O(b^2 m), so the running time can be reduced to O(b^2 (m + n log n) + b^2 m) = O(b^2 (m + n log n)). The description of the algorithm is as follows:

(1) Let a[0] := l(P_0) (use Dijkstra in G_0) and s[0] := -1.
(2) for t = 1 to b do
(3)     let t' be such that it minimizes gamma[t'] := a[t'] + K + (t - t') l(P_{t'+1,t})
(4)     if (t + 1) l(P_{0,t}) < gamma[t']
(5)         set a[t] := (t + 1) l(P_{0,t}) and s[t] := -1
(6)     else
(7)         set a[t] := gamma[t'] and s[t] := t'
(8)     end if
(9) end for
(10) Output a[b] as the total cost, b, s[b], s[s[b]], ..., -1 as the points at which the path changes, and P_{s[b]+1,b}, P_{s[s[b]]+1,s[b]}, ... as the corresponding paths.
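The whole of part (b) can be sketched as one self-contained function: precompute the shared shortest-path lengths l(P_{i,j}), then run the DP above. The dict-of-edges graph representation and the function name are my own assumptions; only the total cost a[b] is returned, omitting the change-point traceback for brevity:

```python
import heapq

def min_total_cost(graphs, s, t, K):
    """graphs[0..b] give the edge weights at each time step; K is the cost
    of changing paths. Each graph is a dict: edge (u, v) -> weight.
    Returns the minimum total cost OPT(b)."""
    INF = float("inf")
    b = len(graphs) - 1

    def dijkstra(edges):
        adj = {}
        for (u, v), w in edges.items():
            adj.setdefault(u, []).append((v, w))
        dist, heap = {s: 0}, [(0, s)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == t:
                return d
            if d > dist.get(u, INF):
                continue
            for v, w in adj.get(u, ()):
                if d + w < dist.get(v, INF):
                    dist[v] = d + w
                    heapq.heappush(heap, (d + w, v))
        return INF

    # l[i][j] = length of the shortest s-t path shared by G_i, ..., G_j,
    # computed by growing the intersection one graph at a time.
    l = [[INF] * (b + 1) for _ in range(b + 1)]
    for i in range(b + 1):
        shared = dict(graphs[i])
        for j in range(i, b + 1):
            shared = {e: w for e, w in shared.items() if e in graphs[j]}
            l[i][j] = dijkstra(shared)

    a = [INF] * (b + 1)            # a[u] = OPT(u)
    a[0] = l[0][0]
    for u in range(1, b + 1):
        a[u] = (u + 1) * l[0][u]   # case 1: never change the path
        for tp in range(u):        # case 2: last change after time tp
            a[u] = min(a[u], a[tp] + K + (u - tp) * l[tp + 1][u])
    return a[b]
```

For instance, with two time steps where a cheap direct edge disappears at time 1, a small K makes switching paths optimal, while a large K makes sticking with the shared (longer) path optimal.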
This note was uploaded on 01/22/2010 for the course CSCI 570 taught by Professor Shamsian during the Fall '06 term at USC.