master M:
    x1 = lfoo(a1)@w1
    x2 = lfoo(a2)@w2
    x3 = lfoo(a3)@w3
    x = x1 + x2 + x3    # merge

worker2:
    def lfoo(a2):
        x2 = map(incr, a2)
        return x2

worker3:
    def lfoo(a3):
        x3 = map(incr, a3)
        return x3
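The slide's `lfoo(ai)@wi` notation dispatches each call to a remote worker. A minimal sketch of the same master/worker map in standard Python, using `multiprocessing.Pool` as a stand-in for the slide's `@w1..@w3` remote-call syntax:

```python
# Each partition ai is sent to a worker that maps incr over it;
# the master then merges (concatenates) the partial results.
# multiprocessing.Pool stands in for the slide's @worker dispatch.
from multiprocessing import Pool

def incr(n):
    return n + 1

def lfoo(a):
    # worker body from the slide: map incr over one partition
    return list(map(incr, a))

if __name__ == "__main__":
    a1, a2, a3 = [1, 2], [3, 4], [5, 6]
    with Pool(processes=3) as pool:
        x1, x2, x3 = pool.map(lfoo, [a1, a2, a3])
    x = x1 + x2 + x3          # merge step from the slide
    print(x)                  # [2, 3, 4, 5, 6, 7]
```

Here the merge is list concatenation, which matches an element-wise `map`; a reduction (e.g., a sum) would merge differently.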
W: Split 8
It must be possible to cleanly split the original item set into many partitions, such that each can serve as a unit of parallel processing.
A decision is needed: how many partitions to generate? (What criterion springs to mind?)
[Figure: M splits the work into partition1, partition2, partition3]
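The split step can be sketched in a few lines. This is one reasonable partitioning scheme (near-equal contiguous chunks), not the only answer to the slide's question of how many partitions to generate:

```python
# Divide items into n near-equal contiguous partitions.
# The choice of n (the slide's open question) is left to the caller.
def split(items, n):
    k, r = divmod(len(items), n)
    parts, start = [], 0
    for i in range(n):
        size = k + (1 if i < r else 0)   # spread the remainder evenly
        parts.append(items[start:start + size])
        start += size
    return parts

print(split(list(range(7)), 3))  # [[0, 1, 2], [3, 4], [5, 6]]
```

Common criteria for picking n: the number of available cores, or enough partitions that stragglers can be balanced across workers.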
W: Spawn 9
Since each partition can serve as a unit of parallel processing, spawn parallel processes to work on them.
A decision is needed: how many parallel processes to generate? (Is it likely that we can spawn a process per partition?) (What might happen as the number of processes grows?)
[Figure: the OS spawns p1, p2, p3, one per partition, under master M]
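A sketch of the spawn step, using `concurrent.futures`. It deliberately caps the process count below the partition count, illustrating the slide's question about what happens as the number of processes grows: beyond the core count, extra processes mostly add scheduling and memory overhead, so a bounded pool is the usual answer:

```python
# One task per partition, but only max_workers OS processes:
# the pool queues tasks rather than spawning a process per partition.
from concurrent.futures import ProcessPoolExecutor

def work(partition):
    return [n + 1 for n in partition]

if __name__ == "__main__":
    partitions = [[1, 2], [3, 4], [5, 6]]
    with ProcessPoolExecutor(max_workers=2) as ex:
        results = list(ex.map(work, partitions))
    print(results)  # [[2, 3], [4, 5], [6, 7]]
```

The `with` block also answers "how do we know we need wait no more?": the executor's shutdown waits for all submitted tasks, and `ex.map` returns results in submission order, so order is preserved for the merge.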
W: Process, then Merge 10
It must be possible for the results of individual parallel processes to be merged into the final result. The final result is that which would have been obtained without parallelization.
A decision is needed: how do we pick up the pieces to stitch them together? (How do we know we need wait no more?) (Does order matter?)
[Figure: p1, p2, p3 process partition1, partition2, partition3 into r1, r2, r3, which merge into the result]

Challenges 11
What if the work cannot be divided into completely separate tasks?
How do we assign work units to worker threads?
What if we have more work units than threads?
How do we know all the workers have finished?
How do we aggregate the results at the end?
Each of these challenges characterizes a point at which independent processes must communicate with one another. This may be because they need to synchronize (e.g., wait for one or more workers) or because they need to negotiate access to a shared resource (e.g., who does what before whom).

Synchronize/Sequence (1) 12
P1: fun foo(x,y):
        x = x+1
        y = x
P2: fun bar(x,y):
        y = y+1
        x = x+3
Assume the initial conditions are: x = 6, y = 0.
What would the final conditions be? y ∈ {7,8,10,…}
Without isolation, there is a race condition.

Synchronize/Sequence (2) 13 ...
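The slide's race can be reproduced with two threads sharing x and y. The sketch below adds a `threading.Lock` so that each function's two statements run as a unit; with the lock, only the two serial orders (foo;bar or bar;foo) are possible, whereas removing it reopens the interleavings behind the slide's y ∈ {7,8,10,…}:

```python
# P1/P2 from the slide as two threads over shared state.
# The lock provides the "isolation" the slide says is missing:
# each function's two updates become atomic with respect to the other.
import threading

state = {"x": 6, "y": 0}
lock = threading.Lock()

def foo():
    with lock:               # remove the lock and the outcome may vary
        state["x"] += 1
        state["y"] = state["x"]

def bar():
    with lock:
        state["y"] += 1
        state["x"] += 3

t1 = threading.Thread(target=foo)
t2 = threading.Thread(target=bar)
t1.start(); t2.start()
t1.join(); t2.join()
print(state)  # foo-then-bar gives x=10,y=8; bar-then-foo gives x=10,y=10
```

Note that even with the lock the result is nondeterministic between the two serial outcomes; the lock rules out the interleaved (non-serializable) outcomes, not the scheduling order itself.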
 Spring '10
 the00, a00
