If I have some algorithm that runs at best n time and at worst n^2 time, is it fair to say that the algorithm is Big Omega (n)?
Does this mean that the algorithm will run at least n (time)? I am just not sure if I have the right idea here.
Thanks.
For Big O, you state the worst-case time as the bound; in your case that is O(n^2). For Big Omega, you state the best-case time as a lower bound; in this case Omega(n).
Also see this guide to Big O and this discussion of Big O and Big Omega.
Yes. If the runtime f(n) is asymptotically bounded below by g(n) = n, then f(n) is BigOmega(n).
Edit: Most of the time, algorithms are analyzed in terms of their worst-case behavior -- which corresponds to "Big O" notation. In your case, the runtime is O(n^2).
But for those rarer occasions when you need to talk about a lower bound, or best-case behavior, "Big Omega" notation is used. In your case, the runtime is always at least n, so it is correct to describe it as BigOmega(n).
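As a concrete illustration (not from the original question): insertion sort is a textbook example of exactly this profile. On already-sorted input it does about n comparisons (best case), and on reverse-sorted input about n^2/2 (worst case), so it is O(n^2) and BigOmega(n). A minimal Python sketch that counts the basic steps:

```python
def insertion_sort(a):
    """Sort `a` in place; return a count of basic steps.

    Best case (already sorted): the inner while-loop never runs,
    so roughly n steps total -- Theta(n).
    Worst case (reverse sorted): the inner loop runs i times for
    each i, roughly n^2/2 steps -- Theta(n^2).
    """
    steps = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right until key's position is found.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
            steps += 1
        a[j + 1] = key
        steps += 1  # one step per outer iteration (placing the key)
    return steps
```

Running it on sorted versus reversed input of size 10 shows the gap: 9 steps for the sorted list, 54 for the reversed one, matching the n versus n^2 growth.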