
I tried looking around to see if my question had already been answered, but I haven't stumbled upon anything that could help me.

When dealing with run-time complexity, do you account for the operands? From my understanding, each different operation can take some x amount of time, so counting only the loops would just give you a lower bound? If this is incorrect, can you please explain where my logic is wrong.

for example:

            for (i=0;i<n;i++)
               for (j=0;j<n;j++)
                 a[i,j]=b[i,j]+c[i,j]

Would this just be O(n^2)? Or would it be O(a*n^2) because of the addition operation? And you usually use "O" for run time, correct?

for example:

            for (i=0;i<n;i++)
               for (j=0;j<n;j++)
                 a[i,j] -= b[i,j] * c[i,j]

Would this just be O(n^2) again? Or would it be O(a^2*n^2) because of the subtraction and multiplication operations?

Thanks Stack!

Conor
  • I would recommend reading a bit more on what the `O(x)` notation really means - it's more intended to evaluate the *growth* of time for a given algorithm given increasing `n`. As such, constant factors are usually ignored, which means that `O(n^2) == O(k*n^2) == k*O(n^2) == (k^k)*O(n^2)` for any constant value `k`. Both of your above loops are simply `O(n^2)`, unless `a`, `b`, and `c` are some user-defined type for which addition and subtraction are not `O(1)` or constant time operations. – twalberg Jun 24 '13 at 14:54

1 Answer


I suggest you read up on what the O notation means, but let me present a brief overview:

When we say f(x)=O(g(x)), we mean that for some constant c independent of input size,

f(x) <= c·g(x) for all x >= k

in other words, beyond a certain point k, the curve f(x) is always bounded above by the curve c·g(x).


Now in the case you have considered, addition, subtraction and multiplication are all primitive operations that take constant time (O(1)). Let's say adding two numbers takes time 'a' and assigning the result takes time 'b'.

So for this code:

  for (i=0;i<n;i++)
   for (j=0;j<n;j++)
     a[i,j]=b[i,j]+c[i,j]

Let's be sloppy and ignore the loop's own assignment and update operations. The running time is T(n) = (a+b)·n^2.
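As a sanity check, here is a small sketch (Python here purely for illustration) that counts the additions and assignments the nested loop performs:

```python
def count_ops(n):
    """Count the primitive operations in the nested loop
    a[i][j] = b[i][j] + c[i][j], ignoring loop bookkeeping."""
    additions = 0
    assignments = 0
    for i in range(n):
        for j in range(n):
            additions += 1    # one addition: b[i][j] + c[i][j]
            assignments += 1  # one store into a[i][j]
    return additions, assignments

# Each of the n^2 iterations does one addition and one assignment,
# so the total cost is (a + b) * n^2 for per-operation costs a and b.
adds, stores = count_ops(10)
print(adds, stores)  # 100 100
```

Both counts grow as n^2; the per-operation costs a and b only scale that curve by a constant.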

Notice that this is just O(n^2). Why?

As per the definition, this means we can identify some point k beyond which, for some constant c, the curve T(n) is always bounded above by c·n^2.

Realize that this is indeed true: we can always pick a sufficiently large constant c so that the curve c·n^2 bounds the given curve from above.
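To make this concrete, here is a quick numeric check (the constants 2, 3, 1 below are made up, as if we had also counted the loop bookkeeping): with T(n) = 2n^2 + 3n + 1, the choice c = 3, k = 4 satisfies T(n) <= c·n^2 for all n >= k.

```python
def T(n):
    # hypothetical total cost, including loop overhead terms
    return 2 * n * n + 3 * n + 1

c, k = 3, 4

# Beyond the point k, T(n) never exceeds c * n^2.
assert all(T(n) <= c * n * n for n in range(k, 1000))

# Below k the bound can fail, which is why the definition
# only requires it to hold "for all n >= k".
print(T(3), c * 3 * 3)  # 28 27 -- the bound fails at n = 3
```

Any lower-order terms and constant factors get absorbed into the choice of c and k, which is exactly why they are dropped from the O-notation.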

This is why people say: drop the constants!

Bottom line: f(n) = O(g(n)) means that, to the right of some vertical line, the curve f(n) is always bounded above by c·g(n).

Aravind