On Wikipedia, under the heading "Running time", the running time stated for Dijkstra's algorithm (with the simple array implementation) is
O(|V|^2 + |E| * decreaseKey) = O(|V|^2)
This is my first time analyzing the time complexity of an algorithm, but in my opinion it should be:
- The outer loop (while Q is not empty) runs O(|V|) times
- Finding the minimum in an array takes O(|V|)
- Once the minimum is found, examining all of its neighbors can take O(|E|)
- Updating the keys of these neighbors is done in constant time C
so we will have
O(|V|) * ( O(|V|) + O(|E|) * C ) = O(|V|^2 + |V| * |E| * C)
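For reference, this is a minimal sketch of the array-based implementation I have in mind (Python just for illustration; the function name and the graph format, a dict of vertex -> list of (neighbor, weight) pairs, are my own assumptions), with comments marking where I think each cost comes from:

```python
import math

def dijkstra_array(graph, source):
    # graph: dict mapping each vertex to a list of (neighbor, weight) pairs
    dist = {v: math.inf for v in graph}
    dist[source] = 0
    unvisited = set(graph)                      # the queue Q

    while unvisited:                            # outer loop: runs |V| times
        # find-min by scanning the whole "array": O(|V|) per iteration
        u = min(unvisited, key=lambda v: dist[v])
        unvisited.remove(u)

        # examine u's neighbors and update (decrease) their keys: constant time per edge
        for neighbor, weight in graph[u]:
            if dist[u] + weight < dist[neighbor]:
                dist[neighbor] = dist[u] + weight

    return dist
```

For example, `dijkstra_array({'a': [('b', 1)], 'b': []}, 'a')` should return `{'a': 0, 'b': 1}`.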
My question is about the term:
O(|V| * |E| * C)
Wikipedia does not include the |V| factor on the |E| term at all. Why? Is my analysis wrong?