Operating Systems question

amdskip

Lifer
Jan 6, 2001
Round-robin scheduling algorithm (like a game show). There are five processes: 1, 2, 3, 4, 5.

Each time slice is limited to 200 ms. Some of the processes finish the first time they run; others need more than 200 ms.

The question concerns calculating the mean wait time. That means if process 1 runs for 200 ms, the wait time for process 2 is 200 ms.

When calculating the mean wait time, do you divide by 5 (the number of different processes), or do you divide by the number of time slices it takes for all of the processes to finish, which would be something like 7 or 8?
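
To make it concrete with made-up numbers (not the actual assignment): suppose the five bursts are 200, 400, 200, 200, and 600 ms, all arriving at time 0. The run order is 1, 2, 3, 4, 5, 2, 5, 5, so the processes complete at 200, 1200, 600, 800, and 1600 ms, and the waits (completion minus burst) are 0, 800, 400, 600, and 1000 ms. That's 2800 ms of total waiting spread over 8 time slices, so is the mean 2800 / 5 = 560 ms or 2800 / 8 = 350 ms?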

My operating systems book and Google are failing me, and my professor is away at a conference. Thanks for any help or links.

Well, it sounds like just 5. Anyone have any input?

amdskip

Lifer
Jan 6, 2001
Yes, it is a homework question, but I'm asking a simple question about the method, not the actual answer. If you can't say anything positive, don't say anything at all.

Markbnj

Elite Member
Moderator Emeritus
Moderator
Sep 16, 2005
www.markbetz.net
The OP was asked by someone in the general hardware forum to move the topic here, and I think it is programming-related, because it's unlikely anyone but a programmer would care about the implementation of this.

OP, I think all you need to do is keep track of the actual wait time between context switches and calculate a running average: total wait time / number of samples.
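
If it helps, here's a minimal Python sketch of that idea. The burst times are made up (only the 200 ms quantum comes from the question), all five processes are assumed to arrive at time 0, and it divides the total waiting time by the number of processes, which is the usual textbook definition of mean waiting time:

from collections import deque

# Minimal round-robin sketch -- assumed burst times, not the actual assignment.
QUANTUM = 200                                       # ms, from the question
bursts = {1: 200, 2: 400, 3: 200, 4: 200, 5: 600}   # pid -> CPU time needed

remaining = dict(bursts)
ready = deque(bursts)              # ready queue holds pids 1..5 in order
finish = {}                        # pid -> completion time
clock = 0

while ready:
    pid = ready.popleft()
    run = min(QUANTUM, remaining[pid])   # run one quantum, or less if it finishes
    clock += run
    remaining[pid] -= run
    if remaining[pid]:
        ready.append(pid)          # not done yet: back of the queue
    else:
        finish[pid] = clock        # done: record completion time

# waiting time = completion - arrival (0 here) - burst time
waits = {pid: finish[pid] - bursts[pid] for pid in bursts}
print(waits)                                  # {1: 0, 2: 800, 3: 400, 4: 600, 5: 1000}
print(sum(waits.values()) / len(waits))       # 2800 / 5 = 560.0

With these sample bursts that comes out to 560 ms. Dividing the same 2800 ms by the 8 quanta instead would give 350 ms, but the per-process average is what OS texts mean by mean waiting time.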