The CPU scheduling algorithm used in a time-sharing system allocates a very short period of CPU time one-by-one to each user process, beginning from the first user process and proceeding through the last one, and then starting again from the first one. This short period of time during which a user process gets the attention of the CPU is known as a time slice, time slot, or quantum, and is typically of the order of 10 to 100 milliseconds. That is, when the CPU is allocated to a user process, the process uses the CPU until the allotted time slice expires (the system's clock sends an interrupt signal to the CPU after every time slice), until the process needs to perform some I/O operation, or until the execution of the process completes within this time period. Notice that in a time-sharing system, the CPU is taken away from a running process when the allotted time slice expires, even though the process could continue to run. Hence, the process state diagram of a time-sharing system is as shown in Figure 14.10 (compare this with the process state diagram of Figure 14.5).

Now let us see how the CPU scheduling algorithm described above gives each user the impression of having his/her own computer. Let us assume that the time slice of a time-sharing system is 10 milliseconds. That is, the CPU scheduling algorithm of this system allocates 10 milliseconds to each user process one-by-one in a circular fashion (after the last process, it comes back to the first process). Suppose the processing speed of the system's CPU is of the order of 500 million instructions per second. It can then execute 500 x 10^6 instructions/second x 10 x 10^-3 seconds = 5 x 10^6 = 5 million instructions in 10 milliseconds. This is large enough for substantial progress of a single user process. Now suppose there are 100 user terminals in this time-sharing system and 100 users are simultaneously using the system.
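The circular allocation of time slices described above can be sketched as a small simulation. This is a minimal, hypothetical sketch (the process names and burst times are invented for illustration): each process runs for at most one quantum, and if its work is unfinished when the quantum expires it goes to the back of the ready queue, exactly as in the round-robin scheme described in the text.

```python
from collections import deque

def round_robin(bursts, quantum=10):
    """Simulate round-robin scheduling.

    bursts  -- dict mapping process name to remaining CPU time (ms)
    quantum -- time slice in ms (10 ms, as in the text's example)

    Returns the execution timeline as (process, ms_run) pairs and the
    total elapsed time.
    """
    ready = deque(bursts.items())   # ready queue, in arrival order
    timeline = []
    clock = 0
    while ready:
        name, remaining = ready.popleft()
        run = min(quantum, remaining)   # run until quantum expires or process finishes
        clock += run
        timeline.append((name, run))
        remaining -= run
        if remaining > 0:
            # Quantum expired but the process could continue to run:
            # it is preempted and placed at the back of the queue.
            ready.append((name, remaining))
    return timeline, clock

# Hypothetical workload: P1 needs 25 ms, P2 needs 10 ms, P3 needs 5 ms.
timeline, total = round_robin({"P1": 25, "P2": 10, "P3": 5}, quantum=10)
print(timeline)  # P1 is preempted twice; P2 and P3 finish within one slice
print(total)
```

Note how P1 is preempted when its 10 ms slice expires even though it still has work left, which is the defining difference between this diagram (Figure 14.10) and the non-preemptive one (Figure 14.5).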
Then, if 10 milliseconds is allocated to each user process one-by-one, a particular user gets the CPU's attention once every 10 x 100 milliseconds = 1 second. Since human reaction time is normally of the order of a few seconds, a particular user will not notice any delay.
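The two back-of-the-envelope figures in this example can be checked with a couple of one-line helper functions (the function names are illustrative, not from the text):

```python
def instructions_per_slice(instructions_per_second, quantum_ms):
    """Instructions the CPU executes within one time slice."""
    return instructions_per_second * quantum_ms / 1000

def worst_case_wait_seconds(n_users, quantum_ms):
    """Longest a user waits for the CPU to come back around,
    assuming every other process uses its full quantum."""
    return n_users * quantum_ms / 1000

# 500 million instructions/second, 10 ms slice -> 5 million instructions per slice
print(instructions_per_slice(500_000_000, 10))

# 100 users, 10 ms slice -> the CPU returns to each user once per second
print(worst_case_wait_seconds(100, 10))
```

Because 1 second is below the few-second threshold of human reaction time, each of the 100 users perceives the machine as dedicated to them.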