I'm not sure I've understood the concept of round-robin (RR) scheduling correctly.
Let's say I've got three processes ready to run on the CPU:
A - 1st priority - requires 2 minutes of CPU time;
B - 3rd priority - 5 minutes;
C - 2nd priority - 10 minutes;
So, to calculate the average turnaround time "on paper", I can assume a quantum of 1 minute and cycle through the jobs in priority order (A-C-B, A-C-B, and so on).
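For reference, my own working for that 1-minute-quantum case (assuming all three jobs arrive at t=0, the ready queue keeps cycling in priority order A-C-B, and context switches are free): the slices go A C B A, then C and B alternate until B finishes, then C runs alone, so A completes at t=4, B at t=12 and C at t=17, for an average turnaround of (4+12+17)/3 = 11 minutes.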
But one minute is far too big for a "real" quantum, right? If the quantum were a realistic 10-100 milliseconds, would everything be switched so fast that the task order becomes irrelevant? In that case, should I just assume that each job consumes an equal share (1/3) of the CPU while all three are alive, and work from there? For example, A would finish at 2*3 = 6 minutes, B at (5-2)*2 + 6 = 12 minutes, and C at (10-2-3) + 12 = 17 minutes, so the average turnaround time would be (6+12+17)/3 ≈ 11.67 minutes. Or is this approach just ridiculous?
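To sanity-check these numbers, here is a minimal round-robin simulation I sketched in Python (my own illustration, not taken from any textbook). `simulate_rr` is a made-up helper; it assumes all three jobs arrive at t=0, the ready queue cycles in priority order A-C-B, and context switches cost nothing.

```python
from collections import deque

def simulate_rr(quantum_minutes):
    """Round-robin with a fixed quantum; returns {job name: turnaround time in minutes}."""
    # (name, CPU time still needed), queued in priority order A, C, B
    ready = deque([("A", 2.0), ("C", 10.0), ("B", 5.0)])
    clock = 0.0
    turnaround = {}
    while ready:
        name, remaining = ready.popleft()
        used = min(quantum_minutes, remaining)   # run one time slice
        clock += used
        remaining -= used
        if remaining > 1e-9:
            ready.append((name, remaining))      # not finished: back of the queue
        else:
            turnaround[name] = clock             # arrived at t=0, so TAT = finish time
    return turnaround

for q in (1.0, 0.010 / 60):                      # 1-minute quantum vs. 10 ms
    tat = simulate_rr(q)
    avg = sum(tat.values()) / len(tat)
    print({k: round(v, 2) for k, v in tat.items()}, "avg:", round(avg, 2))
```

With the 10 ms quantum it prints completion times of roughly 6, 12 and 17 minutes (average ≈ 11.67), which matches the equal-share estimate, while with the 1-minute quantum it reproduces the timeline above.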