void f(int n) {
    int array[n];
    if (n == 1)
        return;
    else {
        f(n / 2);
        f(n / 2);
        return;
    }
}

I know that when there is a single f(n/2) call, the time complexity is O(log n), but this function makes two f(n/2) calls. Does it have a time complexity of O((log n)^2)? And is the space complexity the same?

1 Answer


TLDR

Time Complexity = O(n) (or Θ(n))

Space Complexity = O(n)

Explanation

Your function is as follows (the else branch is removed for clarity, as it is redundant):

void f(int n)
{
    int array[n];
    if (n == 1)
        return;
    f(n / 2);
    f(n / 2);
    return;
}
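
(Side note: this compiles as-is under C99 or later, since int array[n] is a variable-length array; the compiler may warn that array is unused. A minimal driver, with an arbitrarily chosen n, could look like this:)

#include <stdio.h>

void f(int n)
{
    int array[n];   /* C99 variable-length array: O(n) space on the stack */
    if (n == 1)
        return;
    f(n / 2);
    f(n / 2);
    return;
}

int main(void)
{
    f(1024);        /* n chosen arbitrarily for this example */
    printf("f finished\n");
    return 0;
}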

Computing Time Complexity

Let us analyze the function line by line to compute its time complexity. Let `T(m)` denote the time taken by the function for an input of size `m`.
void f(int n)       // the whole call takes time T(n)
{
    int array[n];   // just a declaration, so it takes O(1), i.e. constant, time
    if (n == 1)     // the condition check also takes O(1) time
        return;     // this also takes O(1) time
    f(n / 2);       // this call takes time T(n / 2)
    f(n / 2);       // this call takes time T(n / 2)
    return;         // this takes O(1) time
}

Adding up the time taken by all the statements (counting only one of the two return statements, since only one of them executes; the non-recursive statements together contribute constant time), we get

T(n) = 2T(n / 2) + O(1)
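
If you want to sanity-check this recurrence empirically, here is a minimal sketch (the count_calls helper and the loop bounds are mine, not part of your code) that only counts how many times the body of the function runs:

#include <stdio.h>

static long calls = 0;

/* Same recursion shape as f(), but it only counts calls
   instead of declaring the array. */
static void count_calls(int n)
{
    calls++;
    if (n == 1)
        return;
    count_calls(n / 2);
    count_calls(n / 2);
}

int main(void)
{
    for (int n = 1; n <= 1024; n *= 2) {
        calls = 0;
        count_calls(n);
        printf("n = %4d  calls = %5ld  2n - 1 = %5d\n", n, calls, 2 * n - 1);
    }
    return 0;
}

For n a power of two, the count is exactly 2n - 1: the number of calls roughly doubles each time n doubles, which is the signature of Θ(n) total work, not Θ((log n)^2).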

Now, using the Master Theorem (reference: method 3 of https://www.geeksforgeeks.org/analysis-algorithm-set-4-master-method-solving-recurrences/),

we first compute n^(log_b a) = n^(log_2 2) (here a = 2 and b = 2, from the recurrence T(n) = a·T(n / b) + f(n)), which is just n.

Now we see that, out of the three cases of the Master Theorem, we can find a value of c that satisfies the first case:

If f(n) = O(n^c) where c < log_b a, then T(n) = Θ(n^(log_b a))

We can take any c with 0 < c < 1 (for example, c = 1/2) to satisfy this condition, because f(n), which is O(1) here, is bounded from above by O(n^c) for every positive c, and log_b a = log_2 2 = 1.

Also, we see that no value of c works for the other two cases of the Master Theorem. Hence, using the first case of the Master Theorem, we get

T(n) = Θ(n^(log_b a)) = Θ(n^(log_2 2)) = Θ(n^1) = Θ(n)
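
As a cross-check that does not rely on the Master Theorem, you can unroll the recurrence by hand. Writing the constant non-recursive work as c (my notation) and assuming n is a power of two:

T(n) = 2T(n / 2) + c
     = 4T(n / 4) + 2c + c
     = 8T(n / 8) + 4c + 2c + c
     = ...
     = n * T(1) + c * (n / 2 + n / 4 + ... + 2 + 1)
     = n * T(1) + c * (n - 1)
     = Θ(n)

The log n factor you might expect never appears: each call does only constant work besides recursing, and the total number of calls is 2n - 1, so the overall work is linear.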

Computing Space Complexity

We see that our function f(n) makes two calls to f(n / 2). Note, however, that the second call to f(n / 2) starts only after the first call to f(n / 2) has completely finished. Therefore, to compute the space complexity (the amount of memory the program needs at any one time), we do not have to account for both calls simultaneously: by the time the second call runs, the memory used by the first call has already been released and can be reused.

Now, let S(m) be the space complexity of our function for an input of size m. Then we have

S(n) = O(n) + S(n / 2)

since the array declared on the first line of the function takes O(n) space and no other local variables are declared (we omit the constant space for the parameter n).

Also, we know that,

S(n / 2) = O(n / 2) + S(n / 4)

Substituting this value of S(n / 2) in S(n), we get

S(n) = O(n) + O(n / 2) + S(n / 4)

Similarly, repeatedly substituting the values of S(n / 4), S(n / 8), ..., we get

S(n) = O(n) + O(n / 2) + O(n / 4) + ... + O(1) (when n == 1)

For simplicity, let us take all the constant factors involved to be 1; we then get

S(n) = n + n / 2 + n / 4 + ... + 1

S(n) = n * ( 1 + 1 / 2 + 1 / 4 + ...)

We see that ( 1 + 1 / 2 + 1 / 4 + ...) is a Geometric Progression (GP) with a = 1 and r = 1 / 2.

Since the GP is decreasing (and we can bound it from above by the corresponding infinite series), we can use the formula for the sum of an infinite GP to get

( 1 + 1 / 2 + 1 / 4 + ...) = a / (1 - r)

= 1 / (1 - 1/2)

= 1 / (1/2)

= 2

Therefore, S(n) = n * 2 = 2n = O(n)
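
If you want to see this bound concretely, here is a minimal sketch (the f_space helper and the live_bytes/peak_bytes counters are mine, purely for illustration) that mimics f() but, instead of actually declaring int array[n], just tracks how many bytes of array storage would be alive at once:

#include <stdio.h>

static long live_bytes = 0;   /* bytes of array storage currently alive */
static long peak_bytes = 0;   /* maximum alive at any point */

static void f_space(int n)
{
    live_bytes += (long)n * (long)sizeof(int);    /* int array[n] comes into scope */
    if (live_bytes > peak_bytes)
        peak_bytes = live_bytes;

    if (n != 1) {
        f_space(n / 2);   /* finishes (and releases its arrays) ... */
        f_space(n / 2);   /* ... before the second call starts      */
    }

    live_bytes -= (long)n * (long)sizeof(int);    /* array goes out of scope */
}

int main(void)
{
    int n = 1024;
    f_space(n);
    printf("peak = %ld bytes, (2n - 1) ints = %ld bytes\n",
           peak_bytes, (2L * n - 1) * (long)sizeof(int));
    return 0;
}

For n a power of two, the peak is exactly (n + n/2 + ... + 1) = (2n - 1) ints, which matches the 2n bound derived above. (The O(log n) overhead of the call-stack frames themselves is dominated by the arrays, so it does not change the O(n) result.)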

ubaid shaikh