Basic Synchronization Principles

Multiprogramming created an environment for concurrent classic processes. It also made it possible for a programmer to create a group of cooperating processes to work concurrently on a single problem. The resulting computation could even exhibit true parallelism if some of the processes were performing I/O operations while another process was using the processor. If the cooperating processes were executing on a multiprocessor, then multiple processes in the group could execute in parallel. The threads refinement made it possible for a single process to take advantage of parallelism between the processor and the I/O devices, and across CPUs on a multiprocessor. However, multiple cooperating threads introduce the potential for new synchronization problems in software implementations: deadlock, critical sections, and nondeterminacy. These synchronization problems can occur whenever two or more concurrent processes or threads use any shared resource. In this chapter, we will first see how synchronization problems arise in concurrent applications. Then we will look at an abstract mechanism that can be used to solve synchronization problems. Finally, we will discuss ways that an OS can implement the abstract mechanisms.

8.1 Cooperating Processes

As you learned in your early programming courses, a program is a realization of a serial algorithm: it is a step-by-step procedure for accomplishing some information processing task. The science (and art) of designing serial algorithms to specify computations has dominated programming for over half a century. Consequently, computing environments have focused on supporting serial computation. A classic process and a modern thread are both sequential abstractions for executing a serial algorithm.
While popular programming languages such as C and C++ have generally neglected to address concurrency, the underlying hardware technology has systematically moved toward it, through distributed and parallel computing machines. Economic pressures have driven application requirements toward the use of parallel and distributed hardware. This trend is apparent in contemporary management information systems, office computing environments, and numerical applications.

We often rely on synchronization when we meet with other people. Suppose Betty, John, and Pat decide to have a meeting in Betty's office. They must next agree on a time for the meeting. They each consult their calendars to decide when all of them wish to meet. By reaching an agreement about the time of the meeting, they are explicitly committing to synchronize their independent schedules so that they will all be in the same place at the same time. In an espionage movie, synchronization can be finer grained: suppose that a team has a plan for attacking a fort in which each member of the team must be prepared to perform a specific task at exactly the same time, say 5:00, as all the others...