42

I can picture what an algorithm with a complexity of n^c looks like: c is just the number of nested for loops.

for (var i = 0; i < dataset.len; i++) {
    for (var j = 0; j < dataset.len; j++) {
        //do stuff with i and j
    }
}

Log n comes from something that splits the data set in half every time; binary search does this (I'm not entirely sure what the code for it looks like).
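For reference, a typical binary search sketch looks something like this (a minimal Python version; the names are illustrative). Each comparison halves the remaining range, hence O(log n):

```python
# Binary search on a sorted list: each iteration halves the search
# range, so the loop runs O(log n) times.
def binary_search(sorted_data, target):
    lo, hi = 0, len(sorted_data) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_data[mid] == target:
            return mid
        elif sorted_data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found
```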

But what is a simple example of an algorithm that is c^n, or more specifically 2^n? Is O(2^n) based on loops through data? Or how data is split? Or something else entirely?

dlkulp
  • 2,204
  • 5
  • 32
  • 46
  • 1
    n^c?? do you mean n^2 because that is what your example shows... (this is about the first sentence in your question) – Coder Aug 30 '20 at 01:53
  • @JohnD c is just a variable representing the number of nested loops. n^2 would be the example I gave, n^3 would be 3 nested loops, etc. – dlkulp Aug 31 '20 at 01:08

8 Answers

49

Algorithms with running time O(2^N) are often recursive algorithms that solve a problem of size N by recursively solving two smaller problems of size N-1.

This program, for instance, prints out in pseudo-code all the moves necessary to solve the famous "Towers of Hanoi" problem for N disks:

void solve_hanoi(int N, string from_peg, string to_peg, string spare_peg)
{
    if (N<1) {
        return;
    }
    if (N>1) {
        solve_hanoi(N-1, from_peg, spare_peg, to_peg);
    }
    print "move from " + from_peg + " to " + to_peg;
    if (N>1) {
        solve_hanoi(N-1, spare_peg, to_peg, from_peg);
    }
}

Let T(N) be the time it takes for N disks.

We have:

T(1) = O(1)
and
T(N) = O(1) + 2*T(N-1) when N>1

If you repeatedly expand the last term, you get:

T(N) = 3*O(1) + 4*T(N-2)
T(N) = 7*O(1) + 8*T(N-3)
...
T(N) = (2^(N-1)-1)*O(1) + (2^(N-1))*T(1)
T(N) = (2^N - 1)*O(1)
T(N) = O(2^N)

To actually figure this out, you just have to know that certain patterns in the recurrence relation lead to exponential results. Generally, T(N) = ... + C*T(N-1) with C > 1 means O(C^N). See:

https://en.wikipedia.org/wiki/Recurrence_relation

Matt Timmermans
  • 53,709
  • 3
  • 46
  • 87
17

Think about e.g. iterating over all possible subsets of a set. This kind of algorithm is used, for instance, for the generalized knapsack problem.

If you find it hard to understand how iterating over subsets translates to O(2^n), imagine a set of n switches, each of them corresponding to one element of a set. Now, each of the switches can be turned on or off. Think of "on" as being in the subset. Note, how many combinations are possible: 2^n.

If you want to see an example in code, it's usually easier to think about recursion here, but I can't think of any other nice and understandable example right now.
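As a sketch (not part of the original answer), the switch analogy can also be written iteratively by treating each integer from 0 to 2^n - 1 as a bitmask, where bit i "on" means element i is in the subset:

```python
# Enumerate all subsets of a list by treating each integer in
# [0, 2^n) as an n-bit mask: bit i set means element i is included.
def all_subsets(items):
    n = len(items)
    subsets = []
    for mask in range(2 ** n):  # 2^n iterations, one per subset
        subset = [items[i] for i in range(n) if mask & (1 << i)]
        subsets.append(subset)
    return subsets
```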

Marandil
  • 1,042
  • 1
  • 15
  • 31
  • This is actually `O(n * 2^n)` complexity. – Sanket Makani Jun 12 '17 at 11:12
  • @SanketMakani how does iterating over all binary numbers of bit-length `n` correlate to `O(n * 2^n)`? Unless of course you assume incrementing an n-bit number to be `O(n)` (which IMHO is perfectly correct, but *many will disagree*) This is somewhat similar to saying, that iterating over `n` numbers take `O(n log n)` which is, if you count single bit operations, correct, but usually some assumptions are made. – Marandil Jun 12 '17 at 11:33
  • When you iterate over all possible `2^n` numbers, you need to check for every bit of the number to check if an element is present or not in the subset. We consider that to check whether a bit is set or not takes `O(1)` time, Still you need to iterate through all `n` bits so this would take `n` iterations for each of the `2^n` numbers. So total complexity would be `O(n * 2^n)`. – Sanket Makani Jun 12 '17 at 11:45
  • @SanketMakani you are basically repeating the stuff I wrote: "Unless of course you assume incrementing an n-bit number to be O(n)". Still, the argument about iterating over `n` values taking `O(n log n)` holds. – Marandil Jun 12 '17 at 11:57
  • No, I didn't consider that incrementing `n` causes another `O(n)` overhead. Increment is done in `O(1)`. Now consider the example you have given: you would check if the `ith` bulb is `on` or `off` for each of the `2^n` numbers, and it requires a linear loop to check the state of each bulb. That linear loop causes an overhead of `O(n)` for each number, which makes the complexity `O(n * 2^n)`. – Sanket Makani Jun 12 '17 at 12:23
  • Actually, in my example, I don't even check if a bulb is on, I just increment :). [In order to provide an example, ofc. any "inspection" of the state would induce additional overhead, but still it might not even be O(n); as a counter-example think of a check: use log(n) bulbs to determine the index of an object in the set and check if it's on; the complexity of the whole case, excluding the increment, would then be `O(2^n * log(n))`.] – Marandil Jun 12 '17 at 12:53
  • How would you check if bulbs are on or not in `O(log n)`. The best possible I could do is `O(1)` for each bulb so all bulbs requires `n*O(1) = O(n)`. – Sanket Makani Jun 12 '17 at 12:58
  • What if you only need to check for a SUBSET of all bulbs? In this case, `log(n)`. – Marandil Jun 12 '17 at 13:05
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/146424/discussion-between-sanket-makani-and-marandil). – Sanket Makani Jun 12 '17 at 13:07
  • 1
    @Sanket Makani The power set of set S contains 2^n elements. Just iterating the power set of S is of O(2^n) time complexity. However, generating the power set of set S is of O(n * 2^n). – Shaaer May 20 '22 at 15:04
11

Consider that you want to guess the PIN of a smartphone; this PIN is a 4-digit integer. The smallest bit-width that can hold any 4-digit number is 14 bits (2^14 = 16384 > 9999). So, you would have to find the correct 14-bit combination for this PIN among the 2^14 = 16384 possible values!

The only way is brute force. For simplicity, consider a simple 2-bit word that you want to guess right; each bit has 2 possible values, 0 or 1. All the possibilities are:

00
01
10
11

We know that all possibilities of an n-bit word amount to 2^n combinations, so 2^2 gives the 4 combinations we saw above.

The same applies to the 14-bit integer PIN, so guessing the PIN would require you to solve a 2^14 possible outcome puzzle, hence an algorithm of time complexity O(2^n).
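The brute-force idea above can be sketched as a single loop over all 2^n bit patterns (the `secret` value here is purely illustrative):

```python
# Brute-force an n-bit secret by trying all 2^n possible values,
# which is why guessing grows as O(2^n) in the number of bits.
def brute_force(secret, n_bits):
    for guess in range(2 ** n_bits):  # up to 2^n attempts
        if guess == secret:
            return guess
    return None  # secret was not an n-bit value
```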

So, problems where the combinations of elements in a set S differ, and where you have to try all possible combinations to solve the problem, have this O(2^n) time complexity. But the exponentiation base does not have to be 2; in the example above it is 2 because each element (each bit) has two possible values, which will not be the case in other problems.

Another good example of an O(2^n) algorithm is the recursive knapsack, where you try different combinations to maximize the value, and each element in the set has two possible states: taken or not taken.
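A minimal sketch of that take-it-or-leave-it recursion (the names are illustrative, not from the original answer):

```python
# 0/1 knapsack by brute force: each item is either skipped or taken,
# giving two recursive calls per item and O(2^n) time overall.
def knapsack(weights, values, capacity, i=0):
    if i == len(weights):
        return 0
    # Option 1: skip item i
    best = knapsack(weights, values, capacity, i + 1)
    # Option 2: take item i, if it fits
    if weights[i] <= capacity:
        best = max(best, values[i] + knapsack(weights, values,
                                              capacity - weights[i], i + 1))
    return best
```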

The Edit Distance problem is O(3^n) since you have 3 decisions to choose from for each of the n characters of the string: deletion, insertion, or replacement.
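A naive recursion that shows the three-way branching (a sketch; memoization would bring this down to polynomial time):

```python
# Naive edit distance: three recursive branches per position
# (deletion, insertion, replacement), hence O(3^n) without memoization.
def edit_distance(a, b):
    if not a:
        return len(b)  # insert all remaining characters of b
    if not b:
        return len(a)  # delete all remaining characters of a
    if a[0] == b[0]:
        return edit_distance(a[1:], b[1:])  # match, no cost
    return 1 + min(
        edit_distance(a[1:], b),     # deletion
        edit_distance(a, b[1:]),     # insertion
        edit_distance(a[1:], b[1:])  # replacement
    )
```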

Shaaer
  • 527
  • 5
  • 17
2
int Fibonacci(int number)
{
    if (number <= 1) return number;

    return Fibonacci(number - 2) + Fibonacci(number - 1);
}

The work roughly doubles with each addition to the input data set. The growth curve of an O(2^n) function is exponential: starting off very shallow, then rising meteorically. That was my example of O(2^n), but a much better one is this:

public void solve(int n, String start, String auxiliary, String end) {
    if (n == 1) {
        System.out.println(start + " -> " + end);
    } else {
        solve(n - 1, start, end, auxiliary);
        System.out.println(start + " -> " + end);
        solve(n - 1, auxiliary, start, end);
    }
}

This method prints all the moves needed to solve the "Tower of Hanoi" problem. Both examples use recursion to solve the problem and have O(2^n) running time.

darthir21
  • 57
  • 4
  • You should explain why it has exponential complexity - it's not obvious. Also, it's a bad example, because you can easily "fix" this algorithm to have linear complexity - it's as if you wanted to waste processing power on purpose. A better example would show an algorithm that calculates something that is hard/impossible to do fast. – anatolyg Jan 21 '16 at 17:29
0

c^N = all combinations of N elements from a c-sized alphabet.

More specifically 2^N is all numbers representable with N bits.

The common cases are implemented recursively, something like:

vector<int> bits;
int N;

void find_solution(int pos) {
    if (pos == N) {
        check_solution();  // all N bits are assigned; test this combination
        return;
    }
    bits[pos] = 0;
    find_solution(pos + 1);
    bits[pos] = 1;
    find_solution(pos + 1);
}
Sorin
  • 11,863
  • 22
  • 26
0

Here is a code snippet that computes the value sum of every combination of values in a goods array (`values` is a global array variable):

// Two recursive calls per element (include the next element or not),
// giving O(2^n) invocations overall.
fun boom(idx: Int, pre: Int, include: Boolean) {
    if (idx < 0) return
    boom(idx - 1, pre + if (include) values[idx] else 0, true)
    boom(idx - 1, pre + if (include) values[idx] else 0, false)
    println(pre + if (include) values[idx] else 0)
}

As you can see, it's recursive. We can nest loops to get polynomial complexity, and use recursion to get exponential complexity.

boileryao
  • 21
  • 4
0

Here are two simple examples in Python with Big O/Landau O(2^N):

# fibonacci
def fib(num):
    if num == 0 or num == 1:
        return num
    else:
        return fib(num - 1) + fib(num - 2)

num = 10
for i in range(0, num):
    print(fib(i))


# tower of Hanoi
def move(disk, from_rod, to_rod, aux_rod):
    if disk >= 1:
        # move the top disk-1 disks from the source rod to the auxiliary rod
        move(disk - 1, from_rod, aux_rod, to_rod)
        print("Move disk", disk, "from rod", from_rod, "to rod", to_rod)
        move(disk - 1, aux_rod, to_rod, from_rod)

n = 3
move(n, 'A', 'B', 'C')
grepit
  • 21,260
  • 6
  • 105
  • 81
0

Given that a set is a subset of itself, there are 2ⁿ possible subsets for a set with n elements.

Think of it this way: to build a subset, take one element. This element has two possibilities in the subset you're creating: present or absent. The same applies to all the other elements in the set. Multiplying all these possibilities, you arrive at 2ⁿ.

PhiAgent
  • 158
  • 6
  • 1
    There’s a nuance here in that listing all subsets takes more than 2^n time because of the cost of writing out those subsets. The actual time bound is Theta(n * 2^n). – templatetypedef Aug 01 '21 at 01:40