I have a set of programs. Each program consists of many subprograms, one of which has the longest runtime. My goal is to calculate the average ratio of (longest runtime) / (entire program runtime), and I want to know the right way to do so.
| program | longest runtime | entire runtime | ratio |
|---------|-----------------|----------------|-------|
| 1       | 10 secs         | 50 secs        | 0.2   |
| 2       | 5 secs          | 40 secs        | 0.125 |
| 3       | 1 sec           | 10 secs        | 0.1   |
| 4       | 20 secs         | 80 secs        | 0.25  |
| 5       | 15 secs         | 20 secs        | 0.75  |
So I want to see what percentage of the entire runtime the longest runtime takes. There are two ways to do so (both are also computed in the sketch after the calculations below).

1: compute the ratio for each program, then take the average of the ratios:
(0.2 + 0.125 + 0.1 + 0.25 + 0.75) / 5 = 1.425 / 5 = 0.285
2: compute the sum of the longest runtimes, then divide it by the sum of the entire runtimes:
sum_longest = 10 + 5 + 1 + 20 + 15 = 51 secs
sum_entire = 50 + 40 + 10 + 80 + 20 = 200 secs
average = 51 / 200 = 0.255
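For concreteness, here is a minimal Python sketch (using the five example programs from the table above) that computes both quantities:

```python
# Runtimes from the example table above (in seconds).
longest = [10, 5, 1, 20, 15]    # longest subprogram runtime per program
entire  = [50, 40, 10, 80, 20]  # entire program runtime per program

# Way 1: average of the per-program ratios.
ratios = [l / e for l, e in zip(longest, entire)]
mean_of_ratios = sum(ratios) / len(ratios)
print(mean_of_ratios)   # 0.285

# Way 2: ratio of the sums.
ratio_of_sums = sum(longest) / sum(entire)
print(ratio_of_sums)    # 51 / 200 = 0.255
```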
Which way is correct?