
I've found that there is a lot of controversy about the asymptotic complexity of List.Add(). I suspect the source of it is the worst case, where the underlying array has to resize, which is logically an O(n) operation. However, the array doubles in size each time the list runs out of space, which makes the number of resizes needed for n elements proportional to log(n).

Doesn't that mean that the asymptotic complexity of the Add operation in the average case is O(n/log(n))?

An actual benchmark of List.Add() is below. However, benchmarks are not very expressive for an operation like this - we might run out of memory before any deviation from a straight line (on a logarithmic scale) becomes visible.

[benchmark plot: List.Add() timing vs. number of elements, logarithmic scale]
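To make the doubling itself visible, here is a minimal sketch (not a timing benchmark, just an illustration) that watches Capacity to count how often the backing array is reallocated:

    using System;
    using System.Collections.Generic;

    class CapacityGrowthDemo
    {
        static void Main()
        {
            var list = new List<int>();
            int previousCapacity = list.Capacity;   // 0 for an empty list
            int resizes = 0;

            for (int i = 0; i < 1_000_000; i++)
            {
                list.Add(i);
                if (list.Capacity != previousCapacity)
                {
                    resizes++;
                    Console.WriteLine($"resize #{resizes}: capacity {previousCapacity} -> {list.Capacity} after {list.Count} adds");
                    previousCapacity = list.Capacity;
                }
            }
            // Only about log2(1_000_000) ~ 20 resizes are reported, each doubling the capacity.
        }
    }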

Eugene D. Gubenkov
  • When the array grows, indeed the existing N/2 elements are moved. That's why you think it's log N. But the new, still unused capacity grows exponentially, which counters the effect. The first element is moved log N times; the last N/2 elements are moved 0 or 1 times. – usr May 26 '16 at 21:54

1 Answer


This means that the amortized complexity of List.Add() can be calculated by summing the cost of all the resize operations and then dividing by the total number of additions made to the list.

T(n) = (2 + 4 + 8 + ... + n/2 + n) / n

But note that the summation is a geometric series, so we can do better than the naive bound of n*log(n) (log(n) terms, each at most n):

T(n) < 2n/n = 2 -> T(n) is in O(1)
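Spelled out, using the standard geometric-series identity (assuming for simplicity that n is a power of two):

    2 + 4 + 8 + \dots + \tfrac{n}{2} + n = \sum_{k=1}^{\log_2 n} 2^k = 2n - 2 < 2n

    T(n) = \frac{2n - 2}{n} < 2 \implies T(n) \in O(1)

So no matter how many elements are appended, the copying work averages out to a small constant per Add().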

Note: here I am assuming you mean Add() as appending. Inserting an element at an arbitrary location takes O(n) time, and you will have to account for that as well, which changes the end result from O(1) amortized complexity to O(n) amortized complexity.
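A quick way to see that difference in practice (a rough sketch, not a rigorous benchmark - the exact numbers will vary by machine):

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;

    class AddVsInsertDemo
    {
        static void Main()
        {
            const int n = 50_000;

            var appendList = new List<int>();
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < n; i++)
                appendList.Add(i);              // amortized O(1) per call
            Console.WriteLine($"Add at end:      {sw.ElapsedMilliseconds} ms");

            var frontList = new List<int>();
            sw.Restart();
            for (int i = 0; i < n; i++)
                frontList.Insert(0, i);         // O(Count) per call: shifts every existing element
            Console.WriteLine($"Insert at front: {sw.ElapsedMilliseconds} ms");
        }
    }

Appending n elements stays fast, while inserting each element at the front does roughly n^2/2 element moves in total.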

IrkenInvader
amit