Asymptotic complexity describes how an algorithm's running time or space requirement grows as the input size tends to infinity, and is used to express best-, average-, and worst-case behavior with notation such as Big-O, Big-Ω, and Big-Θ.
Questions tagged [asymptotic-complexity]
796 questions
10 votes, 3 answers
Complexity of the algorithm std::includes in C++
The algorithm std::includes takes two sorted ranges and checks whether set2 is a subset of set1 (i.e. whether every element of set2 is contained in set1).
I wonder why eel.is/c++draft says that the complexity of this algorithm is at most 2·(N1+N2-1)…

MrPisarik (1,260)
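As a minimal illustration of the call the question is about (the wrapper name here is my own, not part of the standard library):

```cpp
#include <algorithm>
#include <vector>

// Wraps std::includes: both ranges must already be sorted.
// The standard caps the work at 2*(N1+N2-1) comparisons, i.e. a single
// merge-style linear pass over both ranges, not a per-element search.
bool isSubset(const std::vector<int>& set1, const std::vector<int>& set2) {
    return std::includes(set1.begin(), set1.end(), set2.begin(), set2.end());
}
```

The 2·(N1+N2-1) bound quoted from eel.is/c++draft reflects that each step of the merge pass may cost up to two comparisons (testing both `a < b` and `b < a`).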
10 votes, 2 answers
Calculating large factorial time complexity
I ran across a problem where I needed to calculate the values of very large factorials. I solved this problem in C++ in two different ways, and I want to know whether my complexity analysis is accurate.
In either method I am representing very large…

Dominic Farolino (1,362)
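One common way to represent such values (a sketch, not necessarily the asker's method) is a little-endian vector of decimal digits, multiplying in each factor with schoolbook arithmetic:

```cpp
#include <string>
#include <vector>

// Multiply a little-endian base-10 digit vector by a small integer m.
// One pass over the digits: O(d) work for d digits.
void multiplyInPlace(std::vector<int>& digits, int m) {
    int carry = 0;
    for (int& d : digits) {
        int prod = d * m + carry;
        d = prod % 10;
        carry = prod / 10;
    }
    while (carry) { digits.push_back(carry % 10); carry /= 10; }
}

// n! as a decimal string. The loop does n multiplications, each linear in
// the current digit count d(i) = O(i log i), so the whole computation is
// O(n^2 log n) digit operations.
std::string bigFactorial(int n) {
    std::vector<int> digits{1};
    for (int i = 2; i <= n; ++i) multiplyInPlace(digits, i);
    std::string s;
    for (auto it = digits.rbegin(); it != digits.rend(); ++it)
        s += static_cast<char>('0' + *it);
    return s;
}
```

The O(i log i) digit count follows from Stirling's approximation: i! has Θ(i log i) decimal digits.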
10 votes, 3 answers
Storing pairwise sums in linear space
If we have two arrays of size n each and want to sort their sums, the naive approach would be to store their sums in O(n^2) space and sort them in O(n^2 log n) time. Suppose we're allowed the same running time of O(n^2 log n); how would we store…

maregor (777)
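One standard trick (a sketch with my own names, assuming non-empty arrays): sort both arrays, then lazily merge the n implicit sorted lists {A[i]+B[j] : j = 0..n-1}, keeping only one candidate per list in a min-heap:

```cpp
#include <algorithm>
#include <functional>
#include <queue>
#include <tuple>
#include <vector>

// Emit all n^2 pairwise sums in sorted order while holding only O(n)
// extra state: a min-heap with one candidate (A[i] + B[j]) per row i.
std::vector<long long> sortedPairwiseSums(std::vector<long long> A,
                                          std::vector<long long> B) {
    std::sort(A.begin(), A.end());
    std::sort(B.begin(), B.end());
    using Entry = std::tuple<long long, std::size_t, std::size_t>;  // (sum, i, j)
    std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> heap;
    for (std::size_t i = 0; i < A.size(); ++i)
        heap.emplace(A[i] + B[0], i, 0);

    std::vector<long long> out;  // collected here only to demonstrate the order
    while (!heap.empty()) {
        auto [sum, i, j] = heap.top();
        heap.pop();
        out.push_back(sum);
        if (j + 1 < B.size()) heap.emplace(A[i] + B[j + 1], i, j + 1);
    }
    return out;  // n^2 pops, each O(log n): O(n^2 log n) total
}
```

In the question's setting you would consume each sum as it is popped rather than collect them, so that only the O(n)-sized heap is ever live.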
10 votes, 3 answers
Big O is for worst-case running time and Ω for best-case, so why is Ω sometimes used for the worst case?
I'm confused: I thought Big O was used for the worst-case running time and Ω for the best case. Can someone please explain?
And isn't (lg n) the best case and (n lg n) the worst case? Or am I misunderstanding something?
Show that the…

jantristanmilan (4,188)
9 votes, 1 answer
Finding the upper bound of a mathematical function (function analysis)
I am trying to understand Big-O notation through a book I have, and it covers Big-O using functions, although I am a bit confused. The book says that f(n) = O(g(n)) where g(n) is an upper bound of f(n). So I understand that means that g(n) gives…

Jude (449)
9 votes, 11 answers
Running time of algorithm A is at least O(n²) - Why is it meaningless?
Why is the statement
The running time of algorithm A is at least O(n²)
meaningless?
And is the statement
The running time of insertion sort is at most O(n²)
correct?
I searched the net but could not find a good explanation.
I have another…

Tanmoy Banerjee (1,433)
8 votes, 3 answers
Asymptotic time complexity of inserting n elements to a binary heap already containing n elements
Suppose we have a binary heap of n elements and wish to insert n more elements (not necessarily one after another). What would be the total time required for this?
I think it's Θ(n log n), since one insertion takes O(log n).
user966892
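The two strategies in play can be sketched with the standard heap algorithms (function names are mine):

```cpp
#include <algorithm>
#include <vector>

// (a) n individual sift-up insertions: O(n log n) worst case.
std::vector<int> insertOneByOne(std::vector<int> heap,
                                const std::vector<int>& extra) {
    for (int x : extra) {
        heap.push_back(x);
        std::push_heap(heap.begin(), heap.end());  // O(log n) per element
    }
    return heap;
}

// (b) append everything, then rebuild with Floyd's bottom-up heapify:
//     O(n) total, asymptotically cheaper than n separate insertions.
std::vector<int> insertByRebuild(std::vector<int> heap,
                                 const std::vector<int>& extra) {
    heap.insert(heap.end(), extra.begin(), extra.end());
    std::make_heap(heap.begin(), heap.end());      // O(n) for the whole range
    return heap;
}

// Helper used to check that either result satisfies the heap property.
bool stillAHeap(const std::vector<int>& v) {
    return std::is_heap(v.begin(), v.end());
}
```

Since the heap already holds n elements and n more arrive at once, rebuilding from scratch beats the Θ(n log n) bound of repeated insertion.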
8 votes, 4 answers
Expected running time vs. worst-case running time
I am studying the randomized-quicksort algorithm. I realized that the running time of this algorithm is always represented as "expected running time".
What is the reason for specifying or using the "expected running time"? Why don't we calculate the…

minyatur (175)
8 votes, 2 answers
How to solve the following recurrence?
I am not familiar with recurrence-solving techniques outside of the master theorem, recursion trees, and the substitution method. I am guessing that solving the following recurrence for a big-O bound does not utilize one of those methods:
T(n) =…

velen (81)
8 votes, 3 answers
Analysis of Algorithms - Find missing Integer in Sorted Array better than O(n)
I am working through an analysis of algorithms class for the first time, and was wondering if anyone could assist with the example below. I believe I have solved it with O(n) complexity, but was wondering if there is a better version that I am not…

Busturdust (2,447)
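Since the array is sorted, the usual improvement is a binary search on the first index where the values stop matching their expected positions. A sketch, assuming the array would be 1, 2, …, n+1 with exactly one value missing (the exact problem statement is truncated above, so this encoding is my assumption):

```cpp
#include <cstddef>
#include <vector>

// If a[mid] still equals mid + 1, the prefix is intact and the gap lies to
// the right; otherwise the gap is at or before mid. O(log n) instead of a
// linear scan.
int findMissing(const std::vector<int>& a) {
    std::size_t lo = 0, hi = a.size();
    while (lo < hi) {
        std::size_t mid = lo + (hi - lo) / 2;
        if (a[mid] == static_cast<int>(mid) + 1)
            lo = mid + 1;   // prefix intact, search the right half
        else
            hi = mid;       // gap already happened
    }
    return static_cast<int>(lo) + 1;  // first position where the pattern broke
}
```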
8 votes, 2 answers
Constants in the formal definition of Big O
I'm revising the formal definitions of Big O and the other associated bounds and something is tripping me up. In the book I'm reading (Skiena) Big O is defined as:
f(n) = O(g(n)) when there exists a constant c such that f(n) is always <= c*g(n) for…

Justin (322)
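A small worked example (my own numbers, not from Skiena) of how the witness constant is chosen:

```latex
\[
f(n) = 3n^2 + 10n \le 3n^2 + 10n^2 = 13n^2
\quad \text{for all } n \ge 1,
\]
so $f(n) = O(n^2)$ with witnesses $c = 13$ and $n_0 = 1$.
The constant is not unique: $c = 4$ also works, since
$3n^2 + 10n \le 4n^2 \iff 10n \le n^2 \iff n \ge 10$,
i.e. with the larger threshold $n_0 = 10$.
```

The definition only asks that *some* pair $(c, n_0)$ exists; different choices of $c$ trade off against the threshold $n_0$.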
8 votes, 3 answers
Solving recurrences
I am trying to solve the following recurrence using a recursion tree: T(n) = 3T(n/3) + n/lg n.
At the first level, (n/3)/log(n/3) + (n/3)/log(n/3) + (n/3)/log(n/3) = n/log(n/3).
At the second level it works out to n/log(n/9).
Can I generalize…

Chaitanya (1,698)
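For what it's worth, summing the levels identified above (a sketch, assuming the pattern continues for the roughly $\log_3 n$ levels of the tree):

```latex
\[
\sum_{i=0}^{\log_3 n - 1} \frac{n}{\lg(n/3^i)}
= \sum_{i=0}^{\log_3 n - 1} \frac{n}{\lg n - i \lg 3}
= \frac{n}{\lg 3} \sum_{j=1}^{\log_3 n} \frac{1}{j}
= \Theta(n \log \log n),
\]
```

where the substitution $j = \log_3 n - i$ turns the level sum into a harmonic series with $\Theta(\log n)$ terms, and $H_m = \Theta(\log m)$ gives the $\log \log n$ factor. Note this recurrence falls outside the master theorem because $n/\lg n$ is not polynomially separated from $n^{\log_3 3} = n$.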
8 votes, 2 answers
Big O of Clojure library functions
Can anyone point me to a resource that lists the Big-O complexity of basic Clojure library functions such as conj, cons, etc.? I know that Big-O would vary depending on the type of the input, but still, is such a resource available? I feel…

Anonymous (739)
7 votes, 2 answers
Java Collection addAll complexity
Is there a Java collection with O(1) rather than O(n) complexity for the addAll operation, or must I implement my own collection? With an efficient linked list, the Collection1.addAll(Collection2) operation should append the second collection to the…

kaizokun (926)
7 votes, 4 answers
Time complexity of function calling another function?
I know how to find the time complexity for almost any option (simple function, function with loops, etc.), but I can't figure out how to determine time complexity of function that is calling another function, specifically if the calling function is…

DataJayden (73)
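The usual rule is to multiply the callee's cost by the number of times it is invoked. A minimal illustration (my own toy functions):

```cpp
// g runs in O(m): a single loop over m iterations.
long long g(int m) {
    long long s = 0;
    for (int j = 0; j < m; ++j) s += j;   // sums 0 + 1 + ... + (m-1)
    return s;
}

// f calls g once per iteration of an O(n) loop, so the O(m) cost of g is
// paid n times: f is O(n * m) overall, not O(n) + O(m).
long long f(int n, int m) {
    long long total = 0;
    for (int i = 0; i < n; ++i)
        total += g(m);
    return total;
}
```

If the argument passed to the callee changes each iteration, the same idea applies but the per-call costs must be summed rather than multiplied by a single bound.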