Questions tagged [big-o]

Big-O notation is used to represent asymptotic upper bounds. It provides a coarse, simplified estimate of a problem's difficulty, giving a rough sense of whether a computation will take seconds or years on a modern computer.

In computer science, it is most commonly used when talking about the time complexity of algorithms, but can also refer to the storage required.

For example, a linear search on an unsorted array of size N is O(N). If we first put the elements into a hash table, the space used is O(N) (Theta(N), to be more precise), but the search time drops to O(1) in the average case.
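
For instance, here is a minimal Python sketch of that trade-off, using the built-in list and set types (a set is hash-based, so it mirrors the hash-table case above):

    items = [5, 3, 9, 1, 7]      # unsorted array of size N

    # Linear search: scans element by element, O(N) time
    found = 7 in items

    # Build a hash-based structure first: O(N) extra space...
    item_set = set(items)
    # ...but membership tests are now O(1) on average
    found = 7 in item_set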

Note that Big-O only gives an upper bound on a function, so an O(N) function is also O(N log N), O(N²), O(N!), and so on. In many cases Big-O is used imprecisely where the tight bound Big-Theta is what is actually meant.
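
For reference, the standard formal definitions behind this distinction are:

    f(n) = O(g(n))      if there exist constants c > 0 and n0 such that
                        f(n) <= c * g(n) for all n >= n0 (upper bound only).

    f(n) = Theta(g(n))  if f(n) = O(g(n)) and g(n) = O(f(n)),
                        i.e. g(n) bounds f(n) from above and below up to constant factors.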

If a complexity is given by a recurrence relation, an analysis can often be carried out via the Master Theorem.
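
In its common simplified form, the theorem covers recurrences T(n) = a*T(n/b) + f(n) with a >= 1 and b > 1. Let c = log_b(a). Then:

    1. If f(n) = O(n^(c - e)) for some e > 0,          T(n) = Theta(n^c).
    2. If f(n) = Theta(n^c),                            T(n) = Theta(n^c * log n).
    3. If f(n) = Omega(n^(c + e)) for some e > 0, and
       a*f(n/b) <= k*f(n) for some k < 1 and all
       sufficiently large n,                            T(n) = Theta(f(n)).

    Example: merge sort satisfies T(n) = 2*T(n/2) + Theta(n), so a = b = 2,
    c = 1 and f(n) = Theta(n^1); case 2 gives T(n) = Theta(n log n).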

Properties

  • Summation
    O(f(n)) + O(g(n)) -> O(max(f(n), g(n)))
    For example: O(n^2) + O(n) = O(n^2)

  • Multiplication by a positive constant
    O(c * f(n)) -> O(f(n))
    For example: O(1000 * n^2) = O(n^2)

  • Multiplication
    O(f(n)) * O(g(n)) -> O(f(n) * g(n))
    For example: O(n^2) * O(n) = O(n^2 * n) = O(n^3)

  • Transitivity
    If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n))
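
Combining these rules, a typical simplification looks like this:

    O(3*n^2 + 1000*n + 50)
      = O(3*n^2) + O(1000*n) + O(50)
      = O(n^2) + O(n) + O(1)        (constant factors dropped)
      = O(n^2)                      (summation keeps the dominant term)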

Groups of Big-O

Complexity    Sample algorithms
O(N!)         Generating all permutations of N items
O(2^N)        Iterating over all subsets of N items
O(N^3)        Enumerating all triplets from N items
O(N^2)        Enumerating all pairs from N items, insertion sort
O(N log N)    Quick sort, merge sort
O(N)          Finding the min, max, or average; iterating over N items
O(log N)      Binary search
O(1)          Accessing an array element by its index
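
As a small illustration of two rows from the table, here is a plain Python sketch (the function names are just for illustration): a linear scan touches every element, while binary search on a sorted list halves the remaining range at each step.

    # O(N): linear search over N unsorted items
    def linear_search(items, target):
        for i, x in enumerate(items):
            if x == target:
                return i
        return -1

    # O(log N): binary search over a sorted list
    def binary_search(sorted_items, target):
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid
            if sorted_items[mid] < target:
                lo = mid + 1          # discard the lower half
            else:
                hi = mid - 1          # discard the upper half
        return -1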

More info

6779 questions

81 votes, 5 answers
Python dictionary keys. "In" complexity
Quick question to mainly satisfy my curiosity on the topic. I am writing some large Python programs with an SQLite database backend and will be dealing with a large number of records in the future, so I need to optimize as much as I can. For a few…
tknickman

75 votes, 8 answers
what does O(N) mean
Possible Duplicate: What is Big O notation? Do you use it? Hi all, fairly basic scalability notation question. I recently received a comment on a post that my Python ordered-list implementation "but beware that your 'ordered set' implementation…
Fire Crow

73 votes, 11 answers
Why is O(n) better than O( nlog(n) )?
I just came across this weird discovery: in normal maths, n*logn would be less than n, because log n is usually less than 1. So why is O(nlog(n)) greater than O(n)? (i.e. why is nlogn considered to take more time than n) Does Big-O follow a…
Ritveak

71 votes, 8 answers
What is the Big-O of a nested loop, where number of iterations in the inner loop is determined by the current iteration of the outer loop?
What is the Big-O time complexity of the following nested loops: for (int i = 0; i < N; i++) { for (int j = i + 1; j < N; j++) { System.out.println("i = " + i + " j = " + j); } } Would it be O(N^2) still?
mmcdole

71 votes, 14 answers
Is the time complexity of the empty algorithm O(0)?
So given the following program: Is the time complexity of this program O(0)? In other words, is 0 O(0)? I thought answering this in a separate question would shed some light on this question. EDIT: Lots of good answers here! We all agree that 0 is…
jyoungdev

71 votes, 3 answers
Big-O of list slicing
Say I have some Python list, my_list which contains N elements. Single elements may be indexed by using my_list[i_1], where i_1 is the index of the desired element. However, Python lists may also be indexed my_list[i_1:i_2] where a "slice" of the…
mjgpy3

68 votes, 8 answers
Which is better: O(n log n) or O(n^2)
Okay, so I have this project I have to do, but I just don't understand it. The thing is, I have 2 algorithms: O(n^2) and O(n*log2n). Anyway, I find out in the project info that if n<100, then O(n^2) is more efficient, but if n>=100, then O(n*log2n)…
user3579272

68 votes, 8 answers
Is list::size() really O(n)?
Recently, I noticed some people mentioning that std::list::size() has a linear complexity. According to some sources, this is in fact implementation dependent as the standard doesn't say what the complexity has to be. The comment in this blog entry…
foraidt

66 votes, 10 answers
Time complexity of nested for-loop
I need to calculate the time complexity of the following code: for (i = 1; i <= n; i++) { for(j = 1; j <= i; j++) { // Some code } } Is it O(n^2)?
yyy

65 votes, 6 answers
Complexity of list.index(x) in Python
I'm referring to this: http://docs.python.org/tutorial/datastructures.html What would be the running time of list.index(x) function in terms of Big O notation?
user734027

64 votes, 6 answers
Space complexity of recursive function
Given the function below: int f(int n) { if (n <= 1) { return 1; } return f(n - 1) + f(n - 1); } I know that the Big O time complexity is O(2^N), because each call calls the function twice. What I don't understand is why the…
George Kagan

62 votes, 6 answers
Can an O(n) algorithm ever exceed O(n^2) in terms of computation time?
Assume I have two algorithms: for (int i = 0; i < n; i++) { for (int j = 0; j < n; j++) { //do something in constant time } } This is naturally O(n^2). Suppose I also have: for (int i = 0; i < 100; i++) { for (int j = 0; j < n; j++) { …
Brian

61 votes, 7 answers
Why is the constant always dropped from big O analysis?
I'm trying to understand a particular aspect of Big O analysis in the context of running programs on a PC. Suppose I have an algorithm that has a performance of O(n + 2). Here if n gets really large the 2 becomes insignificant. In this case it's…
driftwood

61 votes, 4 answers
Why hashmap lookup is O(1) i.e. constant time?
If we look from Java perspective then we can say that hashmap lookup takes constant time. But what about internal implementation? It still would have to search through particular bucket (for which key's hashcode matched) for different matching…
genonymous

60 votes, 2 answers
What is O(log(n!)), O(n!), and Stirling's approximation?
What is O(log(n!)) and O(n!)? I believe it is O(n log(n)) and O(n^n)? Why? I think it has to do with Stirling's approximation, but I don't get the explanation very well. Am I wrong about O(log(n!)) = O(n log(n))? How can the math be explained in…
Jiew Meng