Round-robin scheduling algorithm (like a game show). There are five processes: 1, 2, 3, 4, 5.
Each time slice is limited to 200 ms. Some of the processes finish the first time they run; others need more than 200 ms.
The question concerns calculating the mean wait time. For example, if process 1 runs for 200 ms, then process 2 waits 200 ms before it gets the CPU.
When calculating the mean wait time, do you divide by 5 (the number of different processes), or by the number of time slices it takes for all of the processes to finish, which would be something like 7 or 8?
My operating systems book and Google are failing me, and my professor is away at a conference. Thanks for any help or links.
Well, it sounds like just 5. Anyone have any input?
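For what it's worth, here is a minimal sketch in Python of how the simulation and the calculation usually go: all processes arrive at time 0, each gets at most a 200 ms quantum per turn, and the mean waiting time is the total waiting time divided by the number of processes (5), not by the number of time slices. The burst times below are made-up example values, not from the original problem.

```python
from collections import deque

def mean_wait_time(bursts, quantum=200):
    """Simulate round-robin scheduling and return the mean waiting time.

    bursts: list of CPU burst times (ms), one per process, all arriving at t=0.
    Waiting time for a process = completion time - burst time (arrival is 0).
    """
    remaining = list(bursts)
    completion = [0] * len(bursts)
    queue = deque(range(len(bursts)))  # ready queue of process indices
    t = 0
    while queue:
        i = queue.popleft()
        run = min(quantum, remaining[i])  # run for a full quantum or until done
        t += run
        remaining[i] -= run
        if remaining[i] > 0:
            queue.append(i)       # not finished: back of the queue
        else:
            completion[i] = t     # finished at the current time
    waits = [completion[i] - bursts[i] for i in range(len(bursts))]
    # Divide by the number of processes, not the number of rounds/slices.
    return sum(waits) / len(bursts)

# Example: 5 processes; some finish within one 200 ms quantum, some don't.
print(mean_wait_time([150, 400, 200, 600, 100]))  # -> 520.0
```

With these example bursts the per-process waits come out to 0, 650, 350, 850, and 750 ms, so the mean is 2600 / 5 = 520 ms. Dividing by 7 or 8 would give you something like "average wait per time slice," which isn't what mean waiting time means in the scheduling literature.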
