
I have found the time complexity of Prim's algorithm given everywhere as O((V + E) log V) = O(E log V). But looking at the algorithm:

It seems like the time complexity is O(V(log V + E log V)). But if its time complexity is O((V + E) log V), then the nesting would have to be like this:

But the above nesting seems to be wrong.

Sonali
4 Answers

MST-PRIM(G, w, r)
1  for each u ∈ G.V
2       u.key ← ∞
3       u.π ← NIL
4   r.key ← 0
5   Q ← G.V
6   while Q ≠ Ø
7       u ← EXTRACT-MIN(Q)
8       for each v ∈ G.Adjacent[u]
9           if v ∈ Q and w(u, v) < v.key
10              v.π ← u
11              v.key ← w(u, v)

Using a Binary Heap

  1. One call to EXTRACT-MIN(Q) takes O(log V) time using a binary min-heap as the priority queue. The while loop at line 6 executes V times in total, so EXTRACT-MIN(Q) is called V times, and the total cost of these calls is O(V log V).

  2. The for loop at line 8 executes 2E times in total, because the lengths of all adjacency lists sum to 2E in an undirected graph. Line 11 takes O(log V) time, using the DECREASE-KEY operation on the min-heap, and it executes at most 2E times in total. So the total time spent in line 11 is O(2E log V) = O(E log V).

  3. The for loop at line 1 executes V times, so lines 1 to 5 take O(V) time.

The total time complexity of MST-PRIM is the sum of the costs in steps 1 through 3: O(V log V + E log V + V) = O(E log V), since |E| ≥ |V| − 1 in a connected graph.
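
To see this bound in running code, here is a minimal sketch in Python (the function name prim_mst and the adjacency-list format are my own choices, not from CLRS). Python's heapq has no DECREASE-KEY, so this version pushes a fresh entry and skips stale ones on extraction; at most O(E) entries ever enter the heap, so each heap operation is still O(log V) and the overall bound stays O(E log V).

import heapq

def prim_mst(adj, r):
    # adj: dict mapping each vertex to a list of (neighbor, weight) pairs,
    #      with every undirected edge stored in both lists (total length 2E).
    # r:   root vertex. Returns the parent (pi) pointers of the MST.
    key = {u: float('inf') for u in adj}      # lines 1-3: u.key <- infinity, u.pi <- NIL
    parent = {u: None for u in adj}
    key[r] = 0                                # line 4
    done = set()                              # vertices already extracted from Q
    pq = [(0, r)]                             # line 5: the min-priority queue Q
    while pq:                                 # line 6
        k, u = heapq.heappop(pq)              # line 7: EXTRACT-MIN, O(log V)
        if u in done:
            continue                          # stale entry left behind by a key update
        done.add(u)
        for v, w in adj[u]:                   # line 8: runs 2E times over the whole run
            if v not in done and w < key[v]:  # line 9
                parent[v] = u                 # line 10
                key[v] = w                    # line 11
                heapq.heappush(pq, (w, v))    # O(log V) per edge => O(E log V) total
    return parent

For example, prim_mst({1: [(2, 1), (3, 2)], 2: [(1, 1), (3, 3)], 3: [(1, 2), (2, 3)]}, 1) returns {1: None, 2: 1, 3: 1}, i.e. the two cheapest edges.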

Using a Fibonacci Heap

  1. Same as above.
  2. Line 11 takes O(1) amortized time, using DECREASE-KEY on the Fibonacci heap, and it executes a total of 2E times, so the total time is O(E).
  3. Same as above.

So the total time complexity of MST-PRIM is the sum of the costs in steps 1 through 3: O(V log V + E + V) = O(E + V log V).

Zack Light
Tanmoy Banerjee
  • It is all written exactly as in Cormen. Also, it does not answer the question: in step 2 you arrived at E log V, which is executed |V| times, so it totals O(V E log V). That was the question. – ajayv Feb 25 '18 at 08:48
  • @ajayv: Yes, it is written in Cormen, but I have written it in a simpler way so that anyone with a little knowledge of computer programming can understand it. And it is evident from the upvotes that it is helping people understand the concept. As for your second argument, it seems from the question that the OP was having difficulty understanding the analysis, so I tried to provide a clear picture. I did not receive any follow-up question from her, so I could not tell what the actual question was, and you cannot tell me what it was unless you are the OP. – Tanmoy Banerjee Apr 16 '18 at 11:39
  • @ajayv, I might be late to the party, but the second part of your comment seems wrong. The for loop is executed twice for every edge in total, counting the outer loop in. Thus, E log V seems correct. It does not traverse the whole graph at every node, only the adjacent vertices. – Incomputable Mar 12 '19 at 19:35

Your idea seems correct. Let's take the complexity as V(lg V + E lg V). Now notice that the inner for loop actually goes through the vertices adjacent to u, not through all the edges, so in the worst case we can write it as V(lg V + V lg V), which means V lg V + V·V lg V. But for worst-case analysis (dense graphs), V·V is roughly the number of edges E, giving V lg V + E lg V = (V + E) lg V, and since V << E, this is E lg V.
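
As a quick numeric check of that last step (the numbers below are only for illustration): in a complete graph E = V(V − 1)/2, so V·V·lg V and E·lg V differ only by a constant factor of about 2, which big-O notation drops.

import math

V = 1000
E = V * (V - 1) // 2             # edges in a complete (dense) graph on V vertices

naive = V * V * math.log2(V)     # V * V * lg V
tight = E * math.log2(V)         # E * lg V

print(E, naive / tight)          # 499500 and a ratio of about 2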

user3473400
  • Alternatively, `V(lg(V) + e(lg(V)))` expands to `V(lg(V)) + Ve(lg(V))`. Notice the Ve here: it means that for each vertex, we go over the edges associated with it. Hence, we go over all edges, and Ve expands to E. – user3473400 Mar 07 '16 at 15:22

The time complexity of Prim's algorithm is O(V log V + E log V). It seems like you understand how the V log V came to be, so let's skip over that. So where does E log V come from? Let's start by looking at Prim's algorithm's pseudocode:

  | MST-PRIM(Graph, weights, r)
1 |  for each u ∈ Graph.V
2 |       u.key ← ∞
3 |       u.π ← NIL
4 |   r.key ← 0
5 |   Q ← Graph.V
6 |   while Q ≠ Ø
7 |       u ← EXTRACT-MIN(Q)
8 |       for each v ∈ Graph.Adj[u]
9 |           if v ∈ Q and weights(u, v) < v.key
10|               v.π ← u
11|               v.key ← weights(u, v)

Lines 8-11 are executed for every element in Q, and we know that there are V elements in Q (representing the set of all vertices). Line 8's loop iterates through all the neighbors of the currently extracted vertex; we will do the same for the next extracted vertex, and for the one after that. The algorithm never revisits a vertex (like Dijkstra's, it is a greedy algorithm), and it will eventually have us go through each of the connected vertices, exploring all of their neighbors. In other words, this loop ends up going through every edge of the graph twice (2E).

Why twice? Because at some point we come back to a previously explored edge from the other direction, and we cannot rule it out until we have actually checked it. Fortunately, that constant 2 is dropped during the time complexity analysis, so the loop is really just doing O(E) work.

Why wasn't it V*V? You might reach that term if you just consider that we have to check each vertex and its neighbors, and in the worst case the number of neighbors approaches V. Indeed, in a dense graph V*V is on the order of E. But the more accurate description of the work of these two loops is "going through all the edges twice", so we refer to E instead. It is up to the reader to connect how sparse their graph is with this term's time complexity.

Let's look at a small example graph with 4 vertices:

    1--2
    |\ |
    | \|
    3--4

Assume that Q will give us the nodes in the order 1, 2, 3, and then 4.

  • In the first iteration of the outer loop, the inner loop will run 3 times (for 2, 3, and 4).
  • In the second iteration of the outer loop, the inner loop runs 2 times (for 1 and 4).
  • In the third iteration of the outer loop, the inner loop runs 2 times (for 1 and 4).
  • In the last iteration of the outer loop, the inner loop runs 3 times (for 1, 2, and 3).

The total number of iterations was 10, which is twice the number of edges (2 × 5).
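
If you want to verify that count, here is a tiny Python sketch of just the two loops for the example graph above (the variable names are mine; only the iteration count matters here).

# Adjacency lists for the example graph: edges 1-2, 1-3, 1-4, 2-4, 3-4.
adj = {1: [2, 3, 4], 2: [1, 4], 3: [1, 4], 4: [1, 2, 3]}

inner_iterations = 0
for u in (1, 2, 3, 4):           # the order in which Q hands us the vertices
    for v in adj[u]:             # "for each v in Graph.Adj[u]"
        inner_iterations += 1

print(inner_iterations)          # 10, i.e. 2 * |E| with |E| = 5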

Extracting the minimum and updating the key of a neighboring vertex (usually done with a binary heap or a Fibonacci heap, giving at most O(log V) per operation) happen inside these loops, and they occur inside the inner loop often enough that their cost is governed by the number of iterations of both loops. Therefore, the complete time complexity for this phase of the algorithm is O(2·E·log V). Dropping the constant yields O(E log V).

Given that the total time complexity of the algorithm is O(V log V + E log V), we can simplify it to O((V + E) log V). In a connected graph E ≥ V − 1, and in a dense graph E >> V, so we can conclude O(E log V).

Austin Cory Bart
Anshu Shahi

Actually, as you are saying, since the for loop is nested inside the while loop, a naive asymptotic analysis gives V·E·lg V. But in Cormen the cost is counted in aggregate over all adjacency lists (and amortized for the Fibonacci-heap version), which is why it comes out to O(E lg V).

Rakhi