Almost always you can just plug in the numbers that common sense tells you make up the worst-case scenario. Unless that worst case is guaranteed to occur only a fixed number of times regardless of N (i.e. all runs of the loop are near-optimal, except for at most 5 runs that aren't, even if we loop a thousand times; this is very rare), just do the math with the worst case.
This goes as far as each individual loop: take the worst case for each. That is right unless there is a link between the loops, e.g. if the inner loop only hits its worst case when some outer loop is guaranteed to hit its best case, and vice versa; in other words, when there is a clear relationship between them.
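As a hypothetical illustration of that exception (this sketch is mine, not taken from the question): multiplying the per-loop worst cases here would overestimate, because the inner loop only hits its worst case on a single pass of the outer loop.

```c
// Hypothetical example: the inner loop's worst case (n iterations) happens
// only on the single pass where i == 0. Multiplying per-loop worst cases
// would suggest O(n*n), but the true total is n + (n - 1) * 1 = 2n - 1
// iterations, i.e. O(n).
void linked_loops(int n) {
    for (int i = 0; i < n; i++) {
        int inner = (i == 0) ? n : 1;   // inner bound depends on the outer loop
        for (int j = 0; j < inner; j++) {
            /* constant-time work */
        }
    }
}
```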
So, let's apply it here:
- `i` runs from 0 to `n*n`; that part is easy.
- `j` runs from 0 to `i` (we can ignore constant factors, so the `/5` part is irrelevant, and we can forget about it).
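For reference, here is the shape I am assuming for the loops in question, reconstructed from the description above rather than copied verbatim from the question:

```c
// Assumed shape of the loops under discussion (a reconstruction, not the
// question's exact code): i goes up to n*n, j goes up to i/5.
void loops_in_question(int n) {
    for (int i = 0; i < n * n; i++) {
        for (int j = 0; j < i / 5; j++) {
            /* constant-time work */
        }
    }
}
```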
The `j` loop's worst case is that `i` is equal to `n*n`, which makes `j` run from 0 to `n*n`. So let's just use that.
This is in fact the right answer: this is a loop that runs for `n*n` iterations (the `j` loop), and this process is repeated `n*n` times (the `i` loop), for a total complexity of `O(n^4)` (ouch, that's gonna fall over pretty quickly as `n` grows).
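If you want to sanity-check that empirically, here is a quick throwaway program (it assumes the `i/5` bound sketched above); it counts how often the inner loop body would run and compares that to `n^4`. The ratio settles toward a constant, which is exactly what `O(n^4)` growth means here.

```c
#include <stdio.h>

// Rough empirical check (assumes the j < i/5 shape sketched above):
// count how many times the inner loop body would run, and compare to n^4.
int main(void) {
    for (long long n = 10; n <= 160; n *= 2) {
        long long count = 0;
        for (long long i = 0; i < n * n; i++) {
            count += i / 5;   // the j loop runs i/5 times on pass i
        }
        long long n4 = n * n * n * n;
        printf("n=%4lld  inner iterations=%12lld  n^4=%12lld  ratio=%.3f\n",
               n, count, n4, (double)count / (double)n4);
        // the ratio approaches roughly 1/10, i.e. growth proportional to n^4
    }
    return 0;
}
```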
If you are unsure whether the math works out, try to find linearity; that guarantees it. In this case, the best case scenario is when `i` is 0, in which case the `j` loop runs only once, vs. the worst case, where the `j` loop runs `n*n` times. Crucially, the characteristic of the `j` loop is linear: just chart out how much work the `j` loop does over the 'lifetime' of the `i` loop, and it's a simple straight line. The first 5 runs of the `j` loop do one iteration each, the next 5 do two, and so on, all the way up to the last run, which does `n*n/5` iterations.
The 'average' of all the `j` loop runs is thus simply the middle of that line: half of `n*n/5`. That is `n*n/10`, and constant factors do not matter, so that's still `n*n`. Hence why this is `O(n^4)`, the same as `for (j = 0; j < n*n; j++)` would have been.
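The same averaging argument written out as a sum, assuming the `i/5` bound and ignoring the rounding from integer division (which only changes constant factors):

$$\sum_{i=0}^{n^2-1} \frac{i}{5} \;=\; \frac{1}{5}\cdot\frac{n^2\,(n^2-1)}{2} \;\approx\; \frac{n^4}{10} \;=\; O(n^4)$$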