def f3(n):
    if n <= 1:
        return 1
    L = [i for i in range(n)]
    return 2 * f3(n // 2) + 1

I am trying to calculate the time and space complexity of this function, and I have an important question about the last part, 2 * f3(n // 2): do we consider it two recursive calls, or is it just one?

Here's how I'm calculating it: T(n) = n + 2T(n/2) = n + 2[n/2 + 2T(n/4)] = 2n + 4T(n/4) = ... = kn + 2^k * T(n/2^k), which stops when n/2^k = 1, i.e. n = 2^k, so k = log(n). That gives log(n) levels; substituting back gives n*log(n) + n*1, which means a time complexity of O(n log n), and the same for space complexity since we're using a list comprehension.

But now, looking at 2 * f3(n // 2) again: is it the same as f3(n // 2) + f3(n // 2), or am I making a big mistake?

Because if it's only one call to the function, the time and space complexity would be O(log n), I guess.
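One sanity check I thought I could run is to count the calls and the total list-building work myself (just a sketch; f3_counted, call_count, and work are names I made up for this check, not part of the original function):

call_count = 0
work = 0

def f3_counted(n):
    global call_count, work
    call_count += 1
    if n <= 1:
        return 1
    L = [i for i in range(n)]
    work += len(L)                     # ~n elements built at this level
    return 2 * f3_counted(n // 2) + 1

f3_counted(1024)
print(call_count)  # about log2(1024) + 1 = 11 if it's one call per level
print(work)        # 1024 + 512 + ... + 2 = 2046, i.e. roughly 2n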

I would appreciate any explanation, and I'd be happy to hear any feedback and tips about my reasoning.

Pwaol
  • It's one recursive call: the *result* is multiplied by two. – chepner Feb 19 '21 at 19:09
  • @chepner Thanks for the comment. Does multiplying the result affect the time complexity calculations, or can I treat it as normal, i.e. T(n) = n + T(n//2)? – Pwaol Feb 19 '21 at 19:11
  • 1
  • To answer your question, it is one recursive call because the function only gets called once (from a given parent call). However, it's made more complex by the fact that your input is a single integer, so formally you cannot assume that an integer of that size takes constant space or that arithmetic on integers that size takes constant time. – kaya3 Feb 19 '21 at 19:12

1 Answer


The number of recursive calls is O(lg n), but you still need to take into account how much work each of those calls does. That work (computing L) is linear in the size of the input. As a result, the total time complexity is essentially the sum n + n//2 + n//4 + ... + 1, which you should be able to verify as O(n).
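If it helps to see that sum concretely, here's a rough sketch (the total_work helper is purely illustrative, not part of your function):

def total_work(n):
    # cost of the list comprehension at each level of the recursion
    total = 0
    while n > 1:
        total += n
        n //= 2
    return total

print(total_work(1_000_000))  # roughly 2 * 1_000_000, i.e. O(n)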

The space complexity depends on what exactly you need to do with L. You can reduce it by getting rid of L before the recursive call, provided you don't need it during or after that call.
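For example, assuming L isn't needed during or after the recursive call, the idea looks roughly like this (f3_lean is just an illustrative name):

def f3_lean(n):
    if n <= 1:
        return 1
    L = [i for i in range(n)]
    # ... any work that actually needs L happens here ...
    del L                              # release it before recursing
    return 2 * f3_lean(n // 2) + 1

With that change only one level's list is alive at a time, though the overall space is still O(n) because of the top-level list.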

chepner
  • Thanks for the explanation. I messed up calculating that sum when I considered the recursive call as one. But I didn't quite get the idea of the space complexity: if we have L in each recursive call, wouldn't that mean we should count it, so we get O(n lg n) since we have log(n) recursive calls? Or would we get the same sum you mentioned for the space complexity too? Could you elaborate, please? – Pwaol Feb 19 '21 at 19:33
  • 1
  • Each recursive call gets its own `L`, referenced from that call's stack frame. *Cumulatively*, the space complexity is O(n), because the top call uses O(n) space, the first recursive call `O(n//2)`, etc. If you didn't need to hold on to that space through the recursive calls (e.g., you could call `del L` just before the recursion), you would have much less memory in use at any given time (but still O(n) because of the first list). – chepner Feb 19 '21 at 19:43
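One way to see the difference the early `del` makes is to compare peak traced memory for the two variants; here is a sketch using the standard tracemalloc module (f3_keep and f3_drop are illustrative names, and exact numbers depend on the Python build):

import tracemalloc

def f3_keep(n):
    if n <= 1:
        return 1
    L = [i for i in range(n)]
    return 2 * f3_keep(n // 2) + 1   # L stays alive through the recursion

def f3_drop(n):
    if n <= 1:
        return 1
    L = [i for i in range(n)]
    del L                            # released before recursing
    return 2 * f3_drop(n // 2) + 1

for f in (f3_keep, f3_drop):
    tracemalloc.start()
    f(200_000)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    print(f.__name__, peak)          # f3_keep peaks roughly twice as high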