I'm having trouble understanding Big-O notation. Here is an algorithm I wrote; it is meant to be an alternative to the C++ stack's size() member function, and I need to determine its running time assuming there are n elements in the stack when it is invoked.
Algorithm size():
Input: none
Output: The number of elements, n, currently in the stack.
Let V be an empty vector of the stack's element type.
Let S be the stack this function operates on.
K ← 0
while !S.empty() do
    V.push_back(S.top()) //Keep track of elements in V
    S.pop() //Remove element from stack
    K ← K + 1 //Count the size of the stack
for i ← K − 1, i ≥ 0, i−− do
    S.push(V[i]) //Restore the initial contents of the stack
return K //Return the size of the stack
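In case it helps, here is a minimal C++ sketch of the pseudocode above. The function name `size_by_popping` and the use of `std::stack<int>` are my own choices for illustration, not part of the standard library:

```cpp
#include <stack>
#include <vector>

// Sketch of the algorithm: pop every element into a vector,
// count them, then push them back in reverse order so the
// stack ends up with its original contents.
std::size_t size_by_popping(std::stack<int>& s) {
    std::vector<int> v;          // V: holds the popped elements
    std::size_t k = 0;           // K: running count
    while (!s.empty()) {
        v.push_back(s.top());    // keep track of elements in V
        s.pop();                 // remove element from stack
        ++k;                     // count the size of the stack
    }
    for (std::size_t i = k; i-- > 0; ) {
        s.push(v[i]);            // restore the initial contents
    }
    return k;                    // return the size of the stack
}
```

Note the restore loop pushes `v[k-1]` (the old bottom) first and `v[0]` (the old top) last, which leaves the stack in its original order.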
Please correct me where I'm wrong:
In the line K ← 0, I think that is an O(1) operation.
Creating the vector V is also an O(1) operation.
The while loop is an O(n) operation, since it runs once per element until the stack of n elements is empty.
Pushing the values into V is an O(n) operation in total across the loop.
Popping the contents off the stack S is an O(n) operation in total across the loop.
Returning K is an O(1) operation.
The for loop is an O(n) operation, since it runs K = n times.
Pushing the contents back into S is an O(n) operation in total across the loop.
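To sanity-check my reasoning that the total work is linear, I instrumented the algorithm with a counter for the push_back/pop/push steps. The helper name `size_counting_ops` is made up for this experiment:

```cpp
#include <stack>
#include <vector>

// Instrumented version of the algorithm: `ops` counts every
// push_back, pop, and push performed, so the caller can check
// that the count grows linearly with n (the O(n) part); the
// K ← 0 setup and the return are the O(1) parts.
std::size_t size_counting_ops(std::stack<int>& s, std::size_t& ops) {
    std::vector<int> v;
    std::size_t k = 0;
    while (!s.empty()) {
        v.push_back(s.top()); ++ops;   // one push_back per element
        s.pop();              ++ops;   // one pop per element
        ++k;
    }
    for (std::size_t i = k; i-- > 0; ) {
        s.push(v[i]);         ++ops;   // one push per element
    }
    return k;
}
```

For a stack of n elements this performs exactly 3n counted operations (n push_backs, n pops, n pushes), which is O(n) overall.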