Problem: Five batch jobs, A through E, arrive at a computer center at almost the same time. They have estimated running times of 10, 6, 2, 4, and 8 minutes. Their (externally determined) priorities are 3, 5, 2, 1, and 4, respectively, with 5 being the highest priority.
Determine the mean process turnaround time. Ignore process-switching overhead. For round-robin scheduling, assume that the system is multiprogrammed and that each job gets its fair share of the CPU.
All jobs are completely CPU bound.

Solution #1: The following solution comes from this page: For round robin, during the first 10 minutes each job gets 1/5 of the CPU. At the end of those 10 minutes, C finishes. During the next 8 minutes, each job gets 1/4 of the CPU, after which time D finishes. Then each of the three remaining jobs gets 1/3 of the CPU for 6 minutes, until B finishes, and so on. The finishing times for the five jobs are 10, 18, 24, 28, and 30 minutes, for an average of 22 minutes.

Solution #2: The following solution comes from Cornell University (it can be found here) and is obviously different from the previous one, even though the problem is stated in exactly the same form (this solution, by the way, makes more sense to me): Remember that the turnaround time is the amount of time that elapses between a job arriving and the job completing. Since we assume that all jobs arrive at time 0, the turnaround time of each job is simply the time at which it completes.
(a) Round Robin: The table below gives a breakdown of which job runs during each one-minute quantum. A * indicates that the job completes during that quantum.

Minute:  1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
Job:     A   B   C   D   E   A   B   C*  D   E   A   B   D   E   A

Minute: 16  17  18  19  20  21  22  23  24  25  26  27  28  29  30
Job:     B   D*  E   A   B   E   A   B*  E   A   E   A   E*  A   A*

The results are different: in the first solution C finishes after 10 minutes, for example, whereas in the second C finishes after 8 minutes.
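For what it's worth, here is a small Python sketch (my own, not from either source) that simulates round robin with a 1-minute quantum, cycling through the jobs in alphabetical order as the table assumes:

    from collections import deque

    # Estimated running times in minutes, in the order given (A..E).
    burst = {"A": 10, "B": 6, "C": 2, "D": 4, "E": 8}

    queue = deque(burst)       # ready queue, alphabetical order
    remaining = dict(burst)
    finish = {}
    time = 0

    # One-minute quantum: run the job at the head of the queue for one
    # minute, then move it to the back unless it has finished.
    while queue:
        job = queue.popleft()
        time += 1
        remaining[job] -= 1
        if remaining[job] == 0:
            finish[job] = time
        else:
            queue.append(job)

    print(finish)                              # {'C': 8, 'D': 17, 'B': 23, 'E': 28, 'A': 30}
    print(sum(finish.values()) / len(finish))  # 21.2

This reproduces the table exactly: completion times of 8, 17, 23, 28, and 30 minutes, for a mean turnaround of 21.2 minutes (note that, like the table, it ignores the priorities).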
Which one is correct, and why? I'm confused. Thanks in advance! – user1073400 Mar 28 at 23:38
The problems are different. The first problem does not specify a time quantum, so you have to assume the quantum is very small compared to a minute. The second problem clearly specifies a one-minute scheduler quantum.
The mystery with the second solution is why it assumes the tasks run in letter order. I can only assume that this is an assumption made throughout the course, and so students would be expected to know to make it here.
You can suppose (just for the sake of the exercise) that the quantum is 1 minute. I know that's not realistic and it should be something like 1 ms, but just to make the calculations easier, the solution is apparently using a 1-minute quantum. If that's the case, is the second solution correct?
As for the first solution, since we're not dealing with a 1-minute quantum, why does C finish after 10 minutes and not after 2 or 3 minutes? Could you provide the formula/reasoning that gives this result, to help me understand? – user1073400 Mar 29 at 0:09

If you assume 1-minute quanta, the second solution is correct (except that it ignores the priorities, but that's another story).
But that's an unreasonable assumption, and you shouldn't make it unless the problem says so. C finishes after 10 minutes because it needs 2 minutes of CPU time and gets 1/5 of the CPU, so it needs 10 minutes of wall time to get 2 minutes of CPU time.
At that time, all processes have gotten 2 minutes of CPU time, and you can calculate how much each one still needs. From then on, they each get 1/4 of the CPU until another process completes. And so on.
– David Schwartz Mar 29 at 0:16

I don't follow your reasoning. It's getting 1/5 of the CPU on average. Since it's round robin, the CPU alternates among the tasks, giving each one a very small quantum. C will finish as soon as it has gotten 2 minutes of CPU time in total. This will happen at 10 minutes, plus or minus a time quantum or so, which is much, much less than a minute. (We understand the numbers are not precisely exact. We have to make many assumptions, such as assuming that switching tasks has zero cost.) So long as all five tasks are ready to run, task C will get 1/5 of each minute of wall time. – David Schwartz Mar 29 at 0:42

When C finishes, the time will be 10 minutes. C will have gotten the CPU for 2 of those 10 minutes, and the other tasks will have gotten it for the other 8 minutes. That time has already elapsed at the 10-minute mark. (We have no idea what order the tasks run in or how long they run without being interrupted, but we don't care. That's all about task-switching overhead, which we are to ignore. So we can act as if they run concurrently.) – David Schwartz Mar 29 at 0:57

The first solution would be the same, unless you're supposed to assume a particular ordering of the tasks (as was done in the second). – David Schwartz Mar 29 at 2:55
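To make this reasoning concrete, here is a small sketch (again my own, just one way to formalize the "very small quantum" model) in which the n unfinished jobs share the CPU equally, so the job with the least remaining work always finishes next:

    # Remaining CPU need of each job, in minutes.
    remaining = {"A": 10, "B": 6, "C": 2, "D": 4, "E": 8}

    time = 0.0
    finish = {}

    # With an infinitesimally small quantum, each of the n unfinished jobs
    # accumulates CPU time at rate 1/n. The job with the least remaining
    # work therefore finishes next, after (remaining work) * n wall minutes.
    while remaining:
        n = len(remaining)
        job = min(remaining, key=remaining.get)
        cpu_needed = remaining[job]
        time += cpu_needed * n              # wall time to deliver cpu_needed at rate 1/n
        for other in remaining:
            remaining[other] -= cpu_needed  # every job ran the same amount
        del remaining[job]
        finish[job] = time

    print(finish)                              # {'C': 10.0, 'D': 18.0, 'B': 24.0, 'E': 28.0, 'A': 30.0}
    print(sum(finish.values()) / len(finish))  # 22.0

This reproduces the first solution's finishing times of 10, 18, 24, 28, and 30 minutes, and its mean turnaround of 22 minutes.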
In fact, there is no such thing as a 'correct' RR algorithm. RR is merely a family of algorithms based on the common concept of scheduling several tasks in a circular order. Implementations may vary (for example, you may consider task priorities or discard them, or you may set the priority as a function of task length, or whatever else).
So the answer is: both algorithms seem to be correct; they are just different.