
I am working with a very specific divide and conquer algorithm that always divides a problem with n elements into two subproblems with n/2 - 1 and n/2 + 1 elements.

I am pretty sure the time complexity remains O(n log n), but I wonder how I could formally prove it.
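For concreteness, one algorithm with exactly this splitting pattern could look like the following merge-sort variant (a hypothetical sketch of my own, not from the question; `merge_sort_uneven` is an invented name), which always gives the left part n/2 - 1 elements:

```python
def merge_sort_uneven(a):
    """Merge sort that splits n elements into parts of size n/2 - 1 and n/2 + 1."""
    n = len(a)
    if n <= 3:                # base case: avoids an empty left part when n = 3
        return sorted(a)
    k = n // 2 - 1            # left part gets n/2 - 1 elements
    left = merge_sort_uneven(a[:k])
    right = merge_sort_uneven(a[k:])
    out, i, j = [], 0, 0      # standard linear-time merge
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```

The merge step does ϴ(n) work per call, which matches the "work done per call" considered below.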

1 Answer


Take the "useful work done" at each recursion level to be some function f(n):

T(n) = T(n/2 - 1) + T(n/2 + 1) + f(n)

Let's observe what happens when we repeatedly substitute this back into itself.


  1. T(n) terms:

T(n) = T(n/2 - 1) + T(n/2 + 1) + f(n)
     = [T(n/4 - 3/2) + T(n/4 + 1/2)] + [T(n/4 - 1/2) + T(n/4 + 3/2)] + f(n/2 - 1) + f(n/2 + 1) + f(n)
     = ...

Spot the pattern?

At recursion depth m:

  • There are 2^m recursive calls to T
  • The first term in each parameter for T is n/2^m
  • The second term ranges from -(2^m - 1)/2^{m-1} to +(2^m - 1)/2^{m-1}, in steps of 2/2^{m-1}
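These three claims can be checked mechanically. The following Python sketch (my own verification, not part of the original answer) expands the split with exact rational arithmetic and asserts the call count, offset range, and step size at each depth:

```python
from fractions import Fraction

def offsets(m):
    """Return, for recursion depth m, the constant offsets d such that
    every argument at that depth has the form n/2^m + d.  Going one level
    deeper halves the parent's offset and adds -1 or +1, because
    (n/2^k + d)/2 +/- 1 = n/2^(k+1) + (d/2 +/- 1)."""
    if m == 0:
        return [Fraction(0)]
    return [d / 2 + s for d in offsets(m - 1) for s in (-1, 1)]

for m in range(1, 6):
    offs = sorted(offsets(m))
    assert len(offs) == 2 ** m                             # 2^m calls to T
    assert offs[0] == -Fraction(2 ** m - 1, 2 ** (m - 1))  # lowest offset
    assert offs[-1] == Fraction(2 ** m - 1, 2 ** (m - 1))  # highest offset
    step = Fraction(2, 2 ** (m - 1))                       # claimed step size
    assert all(b - a == step for a, b in zip(offs, offs[1:]))
```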

Thus the sum of all T-terms at each level is given by:

Σ_{i=0}^{2^m - 1} T(n/2^m + (2i - 2^m + 1)/2^{m-1})


  2. f(n) terms:

f(n)
f(n/2 - 1) + f(n/2 + 1)
f(n/4 - 3/2) + f(n/4 + 1/2) + f(n/4 - 1/2) + f(n/4 + 3/2)
...

Look familiar?

The f(n) terms are exactly one recursion level behind the T(n) terms. Therefore adapting the previous expression, we arrive at the following sum:

Σ_{i=0}^{2^{m-1} - 1} f(n/2^{m-1} + (2i - 2^{m-1} + 1)/2^{m-2})

Note, however, that we start with only one f-term, so this sum has an invalid edge case at m = 1. This is simple to rectify - the special-case result for m = 1 is simply f(n).


Combining the above, and summing the f terms for each recursion level, we arrive at the (almost) final expression for T(n):

T(n) = Σ_{i=0}^{2^m - 1} T(n/2^m + (2i - 2^m + 1)/2^{m-1}) + f(n) + Σ_{j=2}^{m} Σ_{i=0}^{2^{j-1} - 1} f(n/2^{j-1} + (2i - 2^{j-1} + 1)/2^{j-2})
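As a sanity check on this expression, the snippet below (an illustrative verification of my own, using f(x) = x as a stand-in workload) unrolls the recurrence level by level and compares the result against the two summations:

```python
from fractions import Fraction

def f(x):
    return x  # stand-in workload: f(x) = x, i.e. Theta(n) per call

def expand(n, m):
    """Unroll the recurrence m levels: return the arguments of the
    remaining T-terms and the total f-work accumulated so far."""
    args, work = [Fraction(n)], Fraction(0)
    for _ in range(m):
        work += sum(f(a) for a in args)               # f-cost of this level
        args = [a / 2 + s for a in args for s in (-1, 1)]
    return args, work

n, m = 1024, 5
args, work = expand(n, m)

# T-terms predicted at depth m by the first summation
pred_args = sorted(Fraction(n, 2 ** m) + Fraction(2 * i - 2 ** m + 1, 2 ** (m - 1))
                   for i in range(2 ** m))
assert sorted(args) == pred_args

# f-terms predicted by f(n) plus the double summation
pred_work = f(Fraction(n)) + sum(
    f(Fraction(n, 2 ** (j - 1)) + Fraction(2 * i - 2 ** (j - 1) + 1, 2 ** (j - 2)))
    for j in range(2, m + 1) for i in range(2 ** (j - 1)))
assert work == pred_work
```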


We next need to find when the first summation for T-terms terminates. Let's assume that is when n ≤ c.

The last call to terminate intuitively has the largest argument, i.e. the call to:

T(n/2^m + (2^m - 1)/2^{m-1})

Setting this argument equal to c, and noting that (2^m - 1)/2^{m-1} approaches 2, gives n/2^m ≈ c - 2, i.e. the recursion terminates at depth m ≈ log_2 n (up to an additive constant).

Therefore the final expression is given by:

T(n) = ϴ(n) + f(n) + Σ_{j=2}^{log_2 n} Σ_{i=0}^{2^{j-1} - 1} f(n/2^{j-1} + (2i - 2^{j-1} + 1)/2^{j-2})

where the first summation has collapsed to ϴ(n): there are 2^m = ϴ(n) base-case calls, each costing ϴ(1).


Back to the original problem, what is f(n)?

You haven't stated what this is, so I can only assume that the amount of work done per call is ϴ(n) (proportional to the array length). Thus:

Σ_{i=0}^{2^{j-1} - 1} (n/2^{j-1} + (2i - 2^{j-1} + 1)/2^{j-2}) = n

since the offset terms cancel by symmetry, and hence:

T(n) = ϴ(n) + ϴ(n) + Σ_{j=2}^{log_2 n} ϴ(n) = ϴ(n log n)
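As an empirical cross-check (my own addition; it assumes f(n) = n and an arbitrary base-case cutoff c = 4), the recurrence can be evaluated exactly and compared against n log_2 n:

```python
from fractions import Fraction
from math import log2

def T(x):
    """Exact cost of T(n) = T(n/2 - 1) + T(n/2 + 1) + n,
    with T(x) = 1 once x <= 4 (an arbitrary cutoff c)."""
    if x <= 4:
        return Fraction(1)
    return T(x / 2 - 1) + T(x / 2 + 1) + x

# If T(n) = Theta(n log n), the ratio T(n)/(n log2 n) stays near a constant
for k in (8, 10, 12, 14):
    n = Fraction(2 ** k)
    ratio = T(n) / (n * log2(n))
    assert 0.7 < float(ratio) < 1.1
```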

Your hypothesis was correct.


Note that even if we had something more general like

T(n) = T(n/2 - a) + T(n/2 + a) + f(n)

where a is some constant not equal to 1, we would still have ϴ(n log n) as the result, since the offset terms a(2i - 2^{j-1} + 1)/2^{j-2} in the above equation cancel out:

Σ_{i=0}^{2^{j-1} - 1} a(2i - 2^{j-1} + 1)/2^{j-2} = (a/2^{j-2}) Σ_{i=0}^{2^{j-1} - 1} (2i - 2^{j-1} + 1) = 0
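This cancellation is elementary to check, e.g. in Python:

```python
# The per-level offset sum vanishes by symmetry for every level j,
# independently of the constant a multiplying it:
for j in range(2, 10):
    assert sum(2 * i - 2 ** (j - 1) + 1 for i in range(2 ** (j - 1))) == 0
```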

meowgoesthedog
  • I finally got it. Thanks! What if the amount of work done is log(n), instead of n (like searching in an ordered array)? I suppose the complexity falls to O(n). How to adjust the final part of the proof ? – Sérgio Mergen Oct 09 '17 at 14:12
  • Replace `f(n)` with `log(n)`. However the solution becomes non-trivial, so we may need to make some approximations to arrive at a final answer. – meowgoesthedog Oct 09 '17 at 14:23