
Is there any real complexity between O(n log* n) and O(n)? I know that O(n √(log* n)) and other similar functions lie between these two, but I mean something original that is not built out of log* n.

A. Mashreghi

1 Answer


Yes, there is. The most famous example is probably the inverse Ackermann function α(n), which grows much more slowly than log* n. It shows up in contexts like the disjoint-set forest data structure, where each operation has an amortized cost of O(α(n)).
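To make the disjoint-set example concrete, here is a minimal Python sketch of the forest with union by rank and path compression, the combination that gives the O(α(n)) amortized bound. The class and method names are mine, not from any particular library:

```python
class DisjointSet:
    def __init__(self, n):
        self.parent = list(range(n))  # each element starts as its own root
        self.rank = [0] * n           # rank: an upper bound on tree height

    def find(self, x):
        # Path compression: repoint every node on the search path at the root.
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        # Union by rank: hang the shallower tree under the deeper one.
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1

ds = DisjointSet(10)
ds.union(1, 2)
ds.union(2, 3)
print(ds.find(1) == ds.find(3))  # True: 1 and 3 are in the same set
```

Either optimization alone only gets you to O(log n) per operation; it's the two together that push the amortized cost all the way down to O(α(n)).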

You can think of log* n as the number of times you need to apply log to n to drop the value down to some fixed constant (say, 2). You can then generalize this to log** n, which is the number of times that you need to apply log* to n to drop the value down to 2. You can then define log*** n, log**** n, log***** n, etc. in similar ways. The value of α(n) is usually given as the number of stars you need to put in log**...* n to get the value down to 2, so it grows much more slowly than the iterated logarithm function.
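These definitions translate almost directly into code. Below is a sketch under the convention above ("apply until the value drops to at most 2"); the helper names iterate, log_star, and alpha are mine, and I'm relying on the fact that CPython's math.log2 accepts arbitrarily large integers:

```python
import math

def iterate(f, n):
    """Count how many applications of f drive n down to at most 2."""
    count = 0
    while n > 2:
        n = f(n)
        count += 1
    return count

def log_star(n):               # log* n: iterate plain log
    return iterate(math.log2, n)

def log_star_star(n):          # log** n: iterate log*
    return iterate(log_star, n)

def alpha(n):
    # alpha(n): how many stars does log**...* need before applying it
    # to n gives a value of at most 2?
    f = log_star
    stars = 1
    while f(n) > 2:
        f = (lambda g: lambda m: iterate(g, m))(f)  # add one more star
        stars += 1
    return stars

n = 2 ** 65536                  # a tower of five 2s: 2^2^2^2^2
print(log_star(n))              # 4 applications of log
print(log_star_star(n))         # 2 applications of log*
print(alpha(n))                 # 2 stars already suffice
```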

Intuitively speaking, you can think of log n as the inverse of exponentiation (repeated multiplication), log* n as the inverse of tetration (repeated exponentiation), log** n as the inverse of pentation (repeated tetration), etc. The Ackermann function effectively applies the n-th order generalization of exponentiation to the number n, so its inverse measures how high a level of iterated exponentiation you need to climb before the values reach n. The result is an unbelievably slowly-growing function.
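For the forward direction, here is a sketch of the two-argument Ackermann–Péter function, one standard formulation; roughly speaking, α inverts the diagonal n ↦ A(n, n):

```python
def ackermann(m, n):
    # Each increment of m climbs one level of the hierarchy:
    # A(1, n) is essentially addition, A(2, n) multiplication,
    # A(3, n) = 2^(n+3) - 3 exponentiation, A(4, n) tetration, ...
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

print(ackermann(3, 3))  # 61
# ackermann(4, 2) is already 2**65536 - 3; this naive recursion
# will never finish computing it.
```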

The most comically slowly-growing function I've ever seen used in a serious context is α*(n), the number of times you need to apply the inverse Ackermann function to a number n to drop it down to some fixed constant. It is almost inconceivable how large an input you'd have to feed this function to get back anything close to, say, 10. If you're curious, the paper that introduced it is worth tracking down.
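With the iterate helper from the earlier sketch, α*(n) is just one more turn of the same crank (again, my naming, not the paper's):

```python
def alpha_star(n):
    # alpha*(n): how many applications of alpha drop n to at most 2?
    return iterate(alpha, n)
```

Under these conventions α(n) is at most 2 for any integer a computer could physically store, so alpha_star returns after a single step on every input you can actually construct, which is exactly the point.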

templatetypedef