If so, can you provide explicit examples? I understand that an algorithm like Quicksort can have O(n log n) expected running time but O(n^2) running time in the worst case. I presume that if the same expected/worst-case distinction applies to Θ, then the above question could be false. Understanding how Θ works will help me understand the relationship between Θ and big-O.
3 Answers
When $n$ is large enough, the algorithm with complexity $\Theta(n)$ will run faster than the algorithm with complexity $\Theta(n^2)$. In fact $n / n^2 \to 0$ as $n \to \infty$. However, there might be (small) values of $n$ for which the $\Theta(n)$ algorithm runs slower than the $\Theta(n^2)$ one.
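To make the crossover concrete, here is a minimal sketch with made-up cost models: the constant factors (100 for the linear algorithm, 1 for the quadratic one) are assumptions chosen purely for illustration, not measurements of any real algorithm.

```python
# Hypothetical cost models; the constants are invented for illustration.
def linear_cost(n):
    return 100 * n   # a Theta(n) algorithm with a large constant factor

def quadratic_cost(n):
    return n * n     # a Theta(n^2) algorithm with a small constant factor

# For small n the quadratic algorithm is cheaper...
assert quadratic_cost(10) < linear_cost(10)      # 100 < 1000
# ...the two costs cross at n = 100 for these constants...
assert linear_cost(100) == quadratic_cost(100)   # 10000 == 10000
# ...and beyond the crossover the linear algorithm wins for good.
assert linear_cost(1000) < quadratic_cost(1000)  # 100000 < 1000000
```

With these constants the quadratic algorithm wins for every $n < 100$, which is exactly the situation the answer describes: asymptotics only tell you what happens past some threshold.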

It's not always faster, only asymptotically faster (as $n$ grows without bound). But past some $n$ — yes, it is always faster. For example, for small $n$ a bubble sort may run faster than quicksort simply because it is simpler (its $\Theta$ has lower constants).
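To illustrate the constant-factor point, here is a sketch that counts comparisons (a proxy for running time) rather than measuring wall-clock time; the early-exit bubble sort and the top-down merge sort below are generic textbook versions, assumed for the sake of the example.

```python
def bubble_sort_comparisons(a):
    """Bubble sort with early exit; returns (sorted list, comparison count)."""
    a = list(a)
    comps = 0
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            comps += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:      # no swaps: the list is already sorted
            break
    return a, comps

def merge_sort_comparisons(a):
    """Top-down merge sort; returns (sorted list, comparison count)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort_comparisons(a[:mid])
    right, cr = merge_sort_comparisons(a[mid:])
    merged, comps, i, j = [], cl + cr, 0, 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged, comps

# On a small, already-sorted input the Theta(n^2) bubble sort does
# fewer comparisons than the Theta(n log n) merge sort:
data = list(range(8))
print(bubble_sort_comparisons(data)[1])  # 7 comparisons (one pass)
print(merge_sort_comparisons(data)[1])   # 12 comparisons
```

This is a best-case input chosen to make the point sharply; on large or shuffled inputs the asymptotics take over and merge sort (or quicksort) wins, as the answer says.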
This has nothing to do with expected/worst cases: selecting a case is a separate problem, unrelated to $\Theta$ or big-O.
As for the relationship between $\Theta$ and big-O: in computer science, big-O is often (mis)used in the sense of $\Theta$, but strictly speaking big-O is a wider class than $\Theta$: it bounds a growing function only from above, while $\Theta$ bounds it from both above and below. E.g. when somebody says that Quicksort has a complexity of O(n log n), they usually mean $\Theta$(n log n).
