I would like to know the difference between the complexity of an algorithm and the complexity of a problem, that is, in which respects the two differ.
1 Answer
Good question! Most people make no distinction between the two, which is a big no-no.
Simply put, the complexity of an algorithm is the algorithm's running time. This can be represented in many ways: big O, big Theta, or any of the other Landau notations. The most commonly used is big O notation, which can be applied to analyze the worst-case time complexity of an algorithm as a function of the input size.
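As a minimal sketch of what "worst-case time complexity as a function of input size" means, consider linear search (my example, not the answer's): in the worst case it compares the target against every one of the n elements, so its worst-case running time is O(n).

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent.

    Worst case: target is missing, so every element is compared
    exactly once -- n comparisons for n items, i.e. O(n) time.
    """
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

# The worst case (target absent) touches all n elements:
assert linear_search([3, 1, 4, 1, 5], 9) == -1
assert linear_search([3, 1, 4, 1, 5], 4) == 2
```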
The complexity of a problem is typically a lower bound on any algorithm that solves the problem (wiki is a decent resource here). For example, we can prove that comparison-based sorting has a lower bound of n log n. This means any algorithm that sorts by comparing elements, no matter which algorithm, will take at least n log n time in the worst case. Using Landau notation, we would say this problem takes Ω(n log n) time.
In summary, a problem's complexity is a lower bound, and an algorithm's complexity typically establishes an upper bound. When you find an algorithm whose upper bound matches the problem's lower bound, you've found an asymptotically optimal algorithm!
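To make the "matching bounds" point concrete, here is a sketch of merge sort (a standard example, not one named in the answer): it is a comparison-based sort whose worst-case running time is O(n log n), which matches the Ω(n log n) lower bound for the comparison-sorting problem, so it is asymptotically optimal.

```python
def merge_sort(items):
    """Comparison-based sort running in O(n log n) worst-case time,
    matching the Omega(n log n) lower bound for comparison sorting,
    so merge sort is asymptotically optimal for this problem."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves using pairwise comparisons.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

assert merge_sort([5, 2, 4, 1, 3]) == [1, 2, 3, 4, 5]
```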

- It's not true that "big O notation" "represents the worst-case time complexity of an algorithm as the input size grows towards infinity"; rather, big-O notation represents an asymptotic upper bound on a function. One very common application of it is to worst-case time complexities of algorithms as functions of input size, but the same notation can be used for average-case time complexity, worst-case space complexity, etc., and the other Landau notations can likewise be used for any of these. – ruakh Jun 25 '17 at 02:05
- @ruakh, fair point. I've updated it as an application of big-O, rather than an exact meaning or such. Thank you for pointing that out. – ryan Jun 25 '17 at 02:11