
On page 44 of Cracking the Coding Interview, there is the following algorithm:

```
int f(int n) {
    if (n <= 1) {
        return 1;
    }
    return f(n - 1) + f(n - 1);
}
```

The book says this has time complexity O(2^n) and space complexity O(n). I get the time complexity part, since O(2^n) nodes are created. I don't understand why the space complexity isn't also O(2^n). The book says it's because only O(n) nodes exist at any given time.

How can that be? Wouldn't the call stack hold all 2^n calls by the time we reach the bottom level, f(1)? What am I missing?

Please let me know if I can provide more detail.

Thanks,

Python Developer
  • No, the call stack only reaches a level of `n` calls before recursion stops. At each depth, `n` is one less, so none of the recursive calls can affect the maximum call depth. – paddy Nov 11 '21 at 04:18
  • The second call to `f(n-1)` doesn't take place until after the first one returns. They do not occupy stack space at the same time. And the same at every other level. – Nate Eldredge Nov 11 '21 at 05:19
  • Thank you both! That was helpful, particularly @Nate Eldredge. If you want to designate as the answer, I will accept it! – Python Developer Nov 12 '21 at 01:54
  • I am not convinced by the answer. I posit that @NateEldredge's explanation, "The second call to f(n-1) doesn't take place until after the first one returns," is making an assumption about the way a call stack operates. A naive implementation of a call stack may not follow this assumption. – Hamster Hooey Jan 13 '22 at 07:22

1 Answer


No. The second call to `f(n-1)` doesn't take place until after the first one returns, so they do not occupy stack space at the same time. When the first one returns, its stack space is freed and may be reused for the second call.

The same applies at every level of the recursion. The memory used is proportional to the maximum depth of the call tree, not the overall number of nodes.

Nate Eldredge