threads-04

Threads
Announcements
Cooperating Processes

Last time we discussed how processes can be independent or work cooperatively. Cooperating processes can be used:
- to gain speedup by overlapping activities or working in parallel
- to better structure an application as a set of cooperating processes
- to share information between jobs

Sometimes processes are structured as a pipeline: each stage produces work for the next stage, which consumes it.
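As a concrete illustration of the pipeline idea, here is a minimal sketch (not from the slides, just one possible realization in C): a producer process hands items to a consumer process through a pipe, so one stage produces work that the next stage consumes.

    /* Sketch: a two-stage pipeline of cooperating processes connected by a pipe. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void) {
        int fd[2];
        if (pipe(fd) == -1) { perror("pipe"); exit(1); }

        pid_t pid = fork();
        if (pid == 0) {                    /* child: consumer stage */
            close(fd[1]);                  /* not writing */
            char buf[64];
            ssize_t n;
            while ((n = read(fd[0], buf, sizeof buf - 1)) > 0) {
                buf[n] = '\0';
                printf("consumer got: %s", buf);
            }
            close(fd[0]);
            return 0;
        }

        /* parent: producer stage */
        close(fd[0]);                      /* not reading */
        for (int i = 0; i < 3; i++) {
            char msg[32];
            int len = snprintf(msg, sizeof msg, "item %d\n", i);
            write(fd[1], msg, len);
        }
        close(fd[1]);                      /* signals end-of-input to the consumer */
        wait(NULL);
        return 0;
    }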
Case for Parallelism

Consider the following code fragment on a dual-core CPU:

    for (k = 0; k < n; k++)
        a[k] = b[k] * c[k] + d[k] * e[k];

Instead, split the loop across two units of execution (pseudocode):

    CreateProcess(fn, 0, n/2);
    CreateProcess(fn, n/2, n);

    fn(l, m)
        for (k = l; k < m; k++)
            a[k] = b[k] * c[k] + d[k] * e[k];
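The CreateProcess(fn, l, m) calls above are pseudocode. One way to make the idea concrete (an illustrative sketch using POSIX threads, not the slide's API) is the following; because the two threads share an address space, both halves of the loop update the same array a.

    /* Sketch: the CreateProcess(fn, l, m) pseudocode expressed with POSIX
     * threads, which share the arrays a, b, c, d, e automatically. */
    #include <pthread.h>
    #include <stddef.h>

    #define N 1000
    static double a[N], b[N], c[N], d[N], e[N];

    struct range { int lo, hi; };

    static void *fn(void *arg) {          /* each thread runs its half-open range */
        struct range *r = arg;
        for (int k = r->lo; k < r->hi; k++)
            a[k] = b[k] * c[k] + d[k] * e[k];
        return NULL;
    }

    int main(void) {
        pthread_t t1, t2;
        struct range r1 = { 0, N / 2 }, r2 = { N / 2, N };
        pthread_create(&t1, NULL, fn, &r1);   /* first half of the loop  */
        pthread_create(&t2, NULL, fn, &r2);   /* second half, in parallel */
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        return 0;
    }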
Case for Parallelism

Consider a web server. For each request it must:
- get the network message from the socket
- get the URL data from disk
- compose the response
- write the response back to the socket

Server connections are fast, but client connections may not be (grandma's modem connection). It takes the server a long time to feed the response to grandma, and while it is doing that it cannot service any more requests.
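A rough sketch of the single-threaded request loop being described (the structure is assumed, not taken from the slides) makes the problem visible: while write() drips the response to a slow client, the loop cannot get back to accept(), so every other request waits.

    /* Sketch: a single-threaded server loop.  The fixed response string stands
     * in for reading URL data from disk and composing a real response. */
    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>

    void serve_forever(int listen_fd) {
        char request[4096];
        const char *response = "HTTP/1.0 200 OK\r\n\r\nhello\n";
        for (;;) {
            int client = accept(listen_fd, NULL, NULL);   /* wait for a connection   */
            read(client, request, sizeof request);        /* get message from socket */
            write(client, response, strlen(response));    /* slow if the client is   */
            close(client);                                /* on grandma's modem; no  */
        }                                                 /* new accept() until done */
    }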
Parallel Programs

To build parallel programs, such as:
- parallel execution on a multiprocessor
- a web server that handles multiple simultaneous requests

we will need to:
- create several processes that can execute in parallel
- cause each to map to the same address space, because they are part of the same computation
- give each its starting address and initial parameters

The OS will then schedule these processes in parallel.
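As one hedged illustration (Linux-specific, and not part of the slides), clone() with CLONE_VM creates a new schedulable task that maps to the caller's address space and starts at a given function with given parameters, which is exactly the recipe listed above.

    /* Sketch: CLONE_VM gives "same address space"; the function pointer and
     * argument are the "starting address and initial parameters". */
    #define _GNU_SOURCE
    #include <sched.h>
    #include <signal.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>

    static int counter = 0;            /* visible to the child because of CLONE_VM */

    static int child_fn(void *arg) {   /* starting address for the new task */
        counter += *(int *)arg;        /* initial parameter */
        return 0;
    }

    int main(void) {
        const size_t STACK_SIZE = 1024 * 1024;
        char *stack = malloc(STACK_SIZE);
        int param = 5;

        /* clone() takes the top of the new stack, since stacks grow downward here */
        pid_t pid = clone(child_fn, stack + STACK_SIZE, CLONE_VM | SIGCHLD, &param);
        if (pid == -1) { perror("clone"); exit(1); }

        waitpid(pid, NULL, 0);
        printf("counter = %d\n", counter);   /* prints 5: the update is shared */
        free(stack);
        return 0;
    }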
Process Overheads

A full process includes numerous things:
- an address space (defining all the code and data pages)
- OS resources and accounting information
- a "thread of control", which defines where the process is currently executing (i.e., the PC and registers)

Creating a new process is costly because of all the structures (e.g., page tables) that must be allocated. Context switching between processes is also costly, with both the implicit and explicit costs we discussed.
Need Something More Lightweight

What is similar in these processes? They all share the same code and data (the address space).