
If so can you provide explicit examples? I understand that an algorithm like Quicksort can have O(n log n) expected running time, but O(n^2) in the worse case. I presume that if the same principle of expected/worst case applies to theta, then the above question could be false. Understanding how theta works will help me to understand the relationship between theta and big-O.

Ian R

3 Answers

When $n$ is large enough, the algorithm with complexity $\Theta(n)$ will run faster than the algorithm with complexity $\Theta(n^2)$. In fact $n / n^2 \to 0$ as $n \to \infty$. However, for small values of $n$ the $\Theta(n)$ algorithm may still run slower than the $\Theta(n^2)$ one, e.g. because of larger constant factors hidden by the notation.
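A minimal sketch of this crossover, using two hypothetical cost functions (the constant factors 1000 and 1 are made-up values chosen for illustration, not taken from the answer):

```python
def cost_linear(n):
    """Hypothetical Theta(n) cost with a large constant factor (assumed: 1000)."""
    return 1000 * n

def cost_quadratic(n):
    """Hypothetical Theta(n^2) cost with a small constant factor (assumed: 1)."""
    return n * n

# For small n the quadratic algorithm is cheaper...
print(cost_quadratic(10) < cost_linear(10))      # True: 100 < 10000
# ...but past the crossover point (n = 1000 here) the linear one wins.
print(cost_linear(10**6) < cost_quadratic(10**6))  # True
```

The asymptotic statement only promises that such a crossover point exists, not where it is.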

Emanuele Paolini

It's not always faster, only asymptotically faster (as $n$ grows without bound). But beyond some threshold $n$, yes, it is always faster.

For example, for small $n$ a bubble sort may run faster than quicksort simply because it is simpler (its Θ hides lower constant factors).
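One way to see this effect is to count comparisons rather than wall-clock time. The sketch below (my own illustrative implementations, not from the answer) shows that on a small, already-sorted input, bubble sort with an early-exit check performs fewer comparisons than merge sort, even though its worst case is Θ(n²):

```python
def bubble_sort(a):
    """Bubble sort with early exit; returns (sorted list, comparison count)."""
    a = list(a)
    comparisons = 0
    for i in range(len(a) - 1):
        swapped = False
        for j in range(len(a) - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # no swaps: list already sorted, stop early
            break
    return a, comparisons

def merge_sort(a):
    """Top-down merge sort; returns (sorted list, comparison count)."""
    comparisons = 0
    def sort(xs):
        nonlocal comparisons
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left, right = sort(xs[:mid]), sort(xs[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            comparisons += 1
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:]); merged.extend(right[j:])
        return merged
    return sort(list(a)), comparisons

data = [1, 2, 3, 4, 5, 6, 7, 8]  # small, already-sorted input
_, b = bubble_sort(data)  # 7 comparisons (one early-exit pass)
_, m = merge_sort(data)   # 12 comparisons
```

On random or large inputs the picture reverses, which is exactly what the asymptotic bounds describe.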

This has nothing to do with expected/worst cases: choosing which case to analyze is a separate question from whether the bound is expressed with theta or big-O.

And about the relationship between theta and big-O: in computer science, big-O is often (mis)used in the sense of θ, but in its strict meaning big-O is a wider class than θ: it bounds a growing function only from above, while theta bounds it from both sides. E.g. when somebody says that Quicksort has a complexity of O(n log n), they usually mean θ(n log n) for the average case.
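For reference, the standard textbook definitions (not part of the original answer) make the containment explicit:

$$f(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \le c\, g(n) \ \text{ for all } n \ge n_0$$

$$f(n) \in \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 : c_1\, g(n) \le f(n) \le c_2\, g(n) \ \text{ for all } n \ge n_0$$

So every function in $\Theta(g(n))$ is also in $O(g(n))$, but not conversely: for instance $n \in O(n^2)$ while $n \notin \Theta(n^2)$.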

nullptr

You are on the right track of thought.

The actual runtime of a program can be quite different from its asymptotic bounds. This is a fundamental consequence of the way asymptotic notation is defined.
You can read my answer here for clarification.

Aravind