Regardless of how the multiplication (or division) operation is implemented (i.e. whether it is a software function or a hardware instruction), it cannot be done in O(1) time: for big n values, the processor cannot even compute it in a single instruction.
In algorithms like the one below, why are these operations treated as constant and not dependent on n?
for (i = 1; i <= n; i++) {
    j = n;
    while (j > 1)
        j = j / 3;  // constant operation
}
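For context, here is a small self-contained sketch (the divisions counter and the sample n are my own additions, not part of the original snippet) that just counts how many j = j / 3 steps the loops perform. Under the usual assumption that each division costs O(1), the inner loop runs about log base 3 of n times, so the nest performs roughly n * log3(n) divisions in total; the question is whether counting each division as a single unit of cost is justified.

#include <stdio.h>

int main(void) {
    long long n = 1000000;      /* example input size, chosen for illustration */
    long long divisions = 0;    /* hypothetical counter, not in the original code */

    for (long long i = 1; i <= n; i++) {
        long long j = n;
        while (j > 1) {
            j = j / 3;          /* the operation whose cost is in question */
            divisions++;
        }
    }

    printf("n = %lld, total divisions = %lld\n", n, divisions);
    return 0;
}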