
What's the time complexity of the following two functions?

int fun1(int n) {
    if (n > 0)
        return 2 * fun1(n - 1) + 1;
    else
        return 1;
}

int fun2(int n) {
    if (n > 0)
        return fun2(n - 1) + fun2(n - 1) + 1;
    else
        return 1;
}

Obviously for fun2 we write the recurrence as T(n) = 2*T(n-1) + 1, but how do we write the recurrence for fun1?

striker
  • Obviously we must understand that calling the function two times is not the same as calling it once and multiplying the result by a constant, when we are calculating its time complexity. – striker Oct 20 '20 at 13:55

2 Answers


Just a quick look at the code (I may be wrong): fun1 has O(n) time complexity (linear), while fun2 has O(2^n) time complexity (exponential).

When you imagine the levels of recursion, each level of depth doubles the number of recursive calls. So, for n == 10, there is one call of fun2(10), then 2 calls of fun2(9), 4 calls of fun2(8), 8 calls of fun2(7), 16 of fun2(6), 32 of fun2(5), 64 of fun2(4), 128 of fun2(3), 256 of fun2(2), 512 of fun2(1), and 1024 calls of fun2(0). The last of these just return 1.
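
If you want to see the blow-up directly, a minimal counting sketch could look like the following (the call_count counter and the main driver are my additions, not part of the original question):

#include <stdio.h>

static long call_count;   /* number of recursive calls made so far */

int fun2(int n) {
    call_count++;         /* count every invocation */
    if (n > 0)
        return fun2(n - 1) + fun2(n - 1) + 1;
    else
        return 1;
}

int main(void) {
    call_count = 0;
    fun2(10);
    printf("fun2(10) made %ld calls\n", call_count);   /* prints 2047 = 2^11 - 1 */
    return 0;
}

The same instrumentation applied to fun1 reports only 11 calls for n == 10.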

This is a nice example of why you should always think twice when implementing functions like this using recursion. A simple fix (2*fun2(n-1) instead of fun2(n-1) + fun2(n-1)) makes it O(n).
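
For instance, a sketch of the suggested fix (fun2_fixed is just my name for the variant):

int fun2_fixed(int n) {
    if (n > 0)
        return 2 * fun2_fixed(n - 1) + 1;   /* one recursive call instead of two */
    else
        return 1;
}

With a single recursive call per level, the number of calls grows linearly with n.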

This also explains why Fibonacci numbers should not be computed using naive recursion. Frankly, a simple loop without any recursion is much better in that case.
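
For comparison, a loop-based Fibonacci (my own sketch, assuming the convention F(0) = 0, F(1) = 1) runs in O(n) time and O(1) extra space:

/* Iterative Fibonacci: O(n) time, O(1) extra space. */
long fib(int n) {
    long a = 0, b = 1;        /* a = F(i), b = F(i+1) */
    for (int i = 0; i < n; i++) {
        long next = a + b;
        a = b;
        b = next;
    }
    return a;                 /* F(n) */
}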

So, the equation for calculating the time complexity should contain 2^something + something. ;)
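
To make the hint concrete, one way to unroll the recurrence for fun2 is sketched below (the constant 1 stands for the O(1) work done in each call):

$$
\begin{aligned}
T(n) &= 2T(n-1) + 1 \\
     &= 2\bigl(2T(n-2) + 1\bigr) + 1 = 4T(n-2) + 3 \\
     &= \cdots = 2^{n}T(0) + (2^{n} - 1) = O(2^{n}).
\end{aligned}
$$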

pepr
  • Sigh of relief that someone finally understood what I was saying! Thank you so much. Unfortunately, I'm not able to convince anyone else of this fact. – striker Oct 22 '20 at 05:02
  • It is easy to prove it by experiment. Just add any form of `print` to the beginning of the functions `fun1` and `fun2` (something like `"fun1 called\n"`). And execute the functions with the same argument (say 10). The difference will be visible. You can also increment some global variable here to get the resulting number of calls. – pepr Oct 22 '20 at 08:13

You're correct about fun2.

For fun1, think about your usual rules of math, disregarding time complexity: 2*fun1(n-1) = fun1(n-1) + fun1(n-1), unless the rules of multiplication can be redefined, such as in modern analysis (I believe that's the vein of mathematics where that's taught; it's been a while since I was in that class :) ).

So, by the distributive rule, fun1 is effectively the same as fun2 and thus has the same time complexity.

  • hey @Todd Caywood, probably you didn't read my comment. Multiplying the value of f(n-1) by 2 is a constant-time (O(1)) operation, while calling the function f(n-1) two times is not a constant-time operation. If f(n) takes T(n) time, then f(n-1) will take T(n-1) time. Now f(n-1) + f(n-1) will take T(n-1) + T(n-1) = 2*T(n-1) time, but 2*f(n-1) should take T(n-1) time, since multiplying by two is a constant-time operation. I hope you see why f(n-1) + f(n-1) is not the same as 2*f(n-1) in terms of time complexity (see the recurrence sketch after these comments). – striker Oct 20 '20 at 19:04
  • Yes, I understand your logic. The interesting thing about time complexity is that the highest order of n is used as the overall time complexity, so adding something of the same or lesser order doesn't change the time complexity, and multiplying by constants likewise doesn't change it. Due to that, the time complexity of (f(n-1)+f(n-1)) is the same as the time complexity of f(n-1), since they are of the same order. So time taken and time complexity are different. So 2*T(n-1) = O(n) complexity, and T(n-1) is also O(n) complexity. Does that make sense? – Todd Caywood Oct 21 '20 at 17:43
  • Hey Todd Caywood, I fully understand that time taken and time complexity are totally different things! Time complexity is the nature of the increase in execution time with respect to a change in input size. But many of the assumptions in your comment above are wrong. Have a look at this answer: https://stackoverflow.com/a/64452408/6743289 – striker Oct 22 '20 at 05:04
  • Please look at this analysis: https://drive.google.com/file/d/1_6_XTlcgJVFLzO5SoVFN65zd0f7mIzGG/view?usp=drivesdk Note that the "+1" written in the recursive functions should ideally be any constant; I have taken 1 for simplicity. Anyway, it won't affect the Big O. – striker Oct 22 '20 at 07:56
  • Goodness, my eyes deceived me. I read the inner part of fun2 as fun1 + fun1, which would not be recursive with the duplication (and would effectively have the same time complexity as fun1, hence my answer based on that confusion!). I agree with the thought experiment shown by @pepr. Sorry for the confusion in my post :) – Todd Caywood Oct 23 '20 at 13:48
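
To close the loop on the original question, the recurrence that the comments above argue for can be written and solved as a sketch (c stands for the constant cost per call of the comparison, multiplication, and addition):

$$
\begin{aligned}
T_{\text{fun1}}(n) &= T_{\text{fun1}}(n-1) + c = T_{\text{fun1}}(n-2) + 2c = \cdots = T_{\text{fun1}}(0) + nc = O(n), \\
T_{\text{fun2}}(n) &= 2\,T_{\text{fun2}}(n-1) + c = O(2^{n}).
\end{aligned}
$$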