Lecture 38

Announcements
• Exam 3 on Friday; review on Wednesday.

Topic of the day: Parallel programming – an introduction. Not in the book!

The von Neumann Computer
• Memory: holds the instructions and data.
• Processing Unit: processes the information.
• Input: brings external information into the memory.
• Output: produces results for the user.
• Control Unit: manages the computer's activity.
• This is a single-instruction, single-data-stream (SISD) computer.
[Diagram: Memory (RAM) with MAR and MDR, a Processing Unit with ALU and register(s), a Control Unit with PC and IR, and input/output device handlers.]

Introduction to Parallel Computing and Parallel Programming

What is Parallelism?
• Parallelism and parallel problem solving are common.
• Parallel computing is using more than one computer, or a computer with more than one processor, to solve a problem.
• Parallel programming is a strategy for performing a large computational task faster.
  – Task: a logically discrete unit of computational work.
  – A large task can either be performed serially, one step following another, or be decomposed into smaller tasks to be performed simultaneously, i.e., in parallel.
  – The smaller tasks are assigned to multiple worker processors (computers) to work on simultaneously.
  – The worker processors must then be coordinated.

Control Abstraction
• A fork introduces parallelism into the system: control splits into two activities that run simultaneously (for example, "open door" and "drink coffee", each with the details of its own algorithm), one per processor on a machine with two processors.
• A join ends the parallel section: the forked activities recombine into a single flow of control.

Terminology of Parallelism
• Parallelizable problem: a problem that can be divided into parallel parts.
• Parallel tasks: tasks whose computations are (relatively) independent of each other, so that all such tasks can be performed simultaneously with correct results.
• Basic types of parallelism:
  – Task parallelism: different tasks performed on different processors.
  – Data parallelism: each processor performs the same task on different data.
• Synchronization: the temporal coordination of parallel tasks.
  – It involves waiting until two or more tasks reach a specified point before continuing any of them, in order to coordinate information exchange among the tasks.
  – It can consume wall-clock time, because some task(s) can sit idle waiting for other tasks to complete; there are other potential complications as well.
  – It can be a major factor in decreasing the potential parallel speedup.
• Parallel overhead: the execution time spent creating, coordinating, and synchronizing parallel tasks, as opposed to doing useful computation.