
I understand that the innermost for loop is Θ(log n) and that the two outermost for loops together are Θ(n^2), because they form an arithmetic sum. The if-statement is my main problem. Does anyone know how to solve this?

int tally = 0;
for (int i = 1; i < n; i++)
{
   for (int j = i; j < n; j++)
   {
        if (j % i == 0)
        {
            for (int k = 1; k < n; k *= 2)
            {
                tally++;
            }
        }
   }
}
okamiaaron

1 Answer


Edit:
I initially misread the loop order; note that `i` is the outer loop counter and `j` the inner one.

In this case, for a given value of i, j varies from i to n-1, and the if-condition succeeds about n/i times (once for each multiple of i below n).

So the program will reach the innermost loop

n/1 + n/2 + n/3 + ... + n/n

times. This is n times the n-th partial sum of the harmonic series, which grows like ln(n), so the total is about n*ln(n).

Since the innermost loop itself runs log(n) times per entry, its body (`tally++`) executes about n*log(n) * log(n) = n*log^2(n) times.

As you wrote, the two outermost loops contribute O(n^2) comparisons, so the overall complexity is O(n^2 + n*log^2(n)); the first term dominates the second, and the overall complexity is quadratic.

int tally = 0;
for (int i = 1; i < n; i++)
{
   // outer loop: n iterations
   for (int j = i; j < n; j++)
   {
        // if-check executed ~n^2/2 times in total
        if (j % i == 0)
        {
            // if-branch taken ~n*log(n) times in total
            for (int k = 1; k < n; k *= 2)
            {
                // tally++ executed ~n*log(n)*log(n) times in total
                tally++;
            }
        }
   }
}

Old answer (wrong)

This complexity is linked with the sum of the sigma0(n) function (the number-of-divisors function), represented as sequence A006218 (the Dirichlet divisor problem).

We can see that the approximation for the sum of divisor counts for values up to n is

  n * ( log(n) + 2*gamma - 1 ) + O(sqrt(n))

so the average number of successful if-conditions for loop counter j is ~log(j).

MBo
  • Ah, I forgot to mention that I'm trying to find the worst case runtime. Would ~log(j) happen to also be the worst case while being the average case? – okamiaaron Sep 19 '18 at 07:41
  • 1
@okamiaaron I've completely rewritten the solution – MBo Sep 19 '18 at 08:03
  • 3
    @MBo - I really like the in-depth answer (+1), but in this particular case I would call it `N^2 / 2` as the heaviest operation seems to be the modulo `%`. – bobah Sep 19 '18 at 08:07
  • @MBo This definitely cleared things up. But wouldn't n*log^2(n) be the runtime of the second for loop and everything inside it, meaning the overall running time would be (n^2)*log^2(n)? – okamiaaron Sep 19 '18 at 22:43
  • 1
The code after the `if` has runtime `n*log^2(n)`. But out of `~n^2` if-checks, only `~n*log(n)` lead to further calls. I added an estimated-call-count scheme. – MBo Sep 20 '18 at 04:00
  • @MBo I see, thank you for the response. Since the only line of code being repeated is `tally++`, wouldn't we just say that the total runtime is `n*log^2(n)`? Wouldn't the runtime only be `n^2 + n*log^2(n)` if there was code right before or after the if statement? – okamiaaron Sep 20 '18 at 20:14
  • 1
There is a pitfall: the line `if (j % i == 0)` is executed more times. When estimating run time, one should account both for significant code lines (like `tally++`) and for comparison operations (like ifs), swaps, etc. If you have a clear problem statement — count only significant lines — then the answer is `n*log^2(n)`, but in the general case it is `n^2` – MBo Sep 21 '18 at 03:57
  • 1
If I were the instructor, I would be glad to see either solution, as long as it comes with reasoning and considers both approaches (yes, I realize some people expect only the answer from their book) – MBo Sep 21 '18 at 04:01