No, elapsed run time is not a standard measure of efficiency, since it varies from platform to platform -- saying "my algorithm ran in 10 seconds" gives little to no information about the algorithm itself. On top of that, you would need to list the entire environment's specs, the other processes running at the same time, and so on -- a huge mess. Hence the development of the order notations (Big Oh, Little Oh, Big Omega, etc.).
Efficiency is typically branched into two subsections:
- Time efficiency.
- Space efficiency.
... where one algorithm may be extremely time efficient but very inefficient space-wise; vice-versa applies as well. Algorithms are analyzed based on their asymptotic behaviour: how the number of instructions they execute scales as the input size n grows. This is a very high-level explanation of a field that is meticulously studied by PhD computer scientists -- I suggest you read more about it here for the best low-level explanation that you will find.
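To make the time/space trade-off concrete, here is a small sketch of my own in Python (the function names are hypothetical, not from any particular library): two ways to detect duplicates in a list, one cheap in space but quadratic in time, one linear in time but paying for it in extra memory.

```python
def has_duplicates_low_space(items):
    # O(n^2) time, O(1) extra space: compare every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_low_time(items):
    # O(n) average time, O(n) extra space: remember everything seen so far.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Neither is "better" in the abstract; which one you pick depends on whether time or memory is the scarcer resource.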
Note, I am attaching the link for Big Oh notation -- the sister notations can all be found off of that Wikipedia page, and it's typically a good place to start. It will get into the difference between space and time efficiency as well.
Small Application of Time Efficiency using Big Oh:
Consider the following recursive function in Racket (it would be in Python if I knew it -- this is the best pseudo-code I can do):
    (define (fn_a input_a)
      (cond
        [(empty? input_a) empty]
        [(empty? (rest input_a)) input_a]
        ;; fn_a returns a list, so take its first element before comparing
        [(> (first input_a) (first (fn_a (rest input_a)))) (cons (first input_a) empty)]
        [else (fn_a (rest input_a))]))
... we see that `empty?`, `rest`, `>` and `first` are all O(1). We also notice that in the worst case, a call is made to `fn_a` in both the third and the fourth condition, on the `rest` of `input_a`. We can then write our recurrence relation as T(n) = O(1) + 2T(n - 1). Looking this up on a recurrence relation chart, we see that `fn_a` is of order O(2^n), because in the worst case two recursive calls are made.
It's also important to note that, by the formal definition of Big Oh, it is also correct (however useless) to state that `fn_a` is O(3^n). Lots of algorithms, when analyzed, are stated using Big Oh; however, it would often be more appropriate to use Big Theta to tighten the bound -- essentially, the lowest, most accurate order for a given algorithm.
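For reference, the formal definition behind that remark (my restatement of the standard one):

```latex
% f is O(g) iff some constant multiple of g eventually dominates f:
f(n) \in O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \ge 0 \ \text{such that} \ f(n) \le c \cdot g(n) \ \text{for all } n \ge n_0
```

Since 2^n <= 3^n for every n (take c = 1 and n_0 = 0), anything that is O(2^n) is automatically O(3^n) too. Big Theta rules out such loose bounds by additionally requiring a matching lower bound (Big Omega).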
Be careful, read the formal definitions!