

What counts as a step? - orthebox

Hello all! Learning Big O notation now, and was wondering: what counts as a step?

For example, in pseudocode:

    function a(x):
      return x + x

How many steps are there? 2? Or 1? Basically, what counts as a 'step' in a programming language: is it just any statement, or are there certain exceptions?

Or maybe I'm just overcomplicating the issue?

Thanks!
======
bartonfink
It really depends, to be honest. Donald Knuth uses a fictional assembly
language in his Art of Computer Programming because he believes that any
abstraction provides a place for bad analysis to hide under the covers. For
instance, in your example, you use an addition operator whose implementation
is undefined. It might be implemented as a string of increments and run in
linear time. It might use a scheme of bit shifts and run in some logarithmic
time. It might be implemented to run as a look-up table in constant time. The
point is that you don't know what the processor's actually going to do, and
this is what Knuth was getting at.

However, in nearly every analysis I've done, mathematical operations are
assumed to run in constant time, because accounting for their implementation
would cloud the overall picture of an algorithm's complexity with an
implementation detail. Unless you're specifically working on the
implementation of those operations, you're probably safe to assume that as
well.

~~~
orthebox
I always thought that mathematical operations run in constant time ... how can
they not?

~~~
bartonfink
Look at the algorithm you use for adding two integers by hand, for example.
You add the ones digits together and possibly have to carry one over. You then
add the tens digits and possibly carry over. You repeat as long as the numbers
"share" digit places.

This isn't constant time because the number of individual digit additions and
carries changes based upon the length of the inputs. In fact, this particular
algorithm is linear in the number of digits.
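To make that concrete, here's a minimal sketch of the grade-school algorithm (the function name and digit-list representation are my own choices), with a counter showing that the work grows with the number of digits:

    def add_by_digits(a, b):
        """Grade-school addition on digit lists, least significant digit first.
        Returns (sum_digits, steps), where steps counts single-digit additions."""
        result = []
        carry = 0
        steps = 0
        for i in range(max(len(a), len(b))):
            da = a[i] if i < len(a) else 0
            db = b[i] if i < len(b) else 0
            total = da + db + carry   # one single-digit addition (plus carry)
            carry = total // 10
            result.append(total % 10)
            steps += 1
        if carry:
            result.append(carry)
        return result, steps

    # 123 + 989 = 1112; digits stored least-significant first
    digits, steps = add_by_digits([3, 2, 1], [9, 8, 9])
    # steps == 3: one pass per shared digit place

Double the number of digits in the inputs and `steps` doubles too, which is exactly what "linear in the number of digits" means.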

You could also implement addition as a series of increments, as I said above.
Translate "a + b" into code looking something like

    
      for (int i = 0; i < b; i++) {
        a += 1;
      }
    

This obviously runs in time linear in the value of b, and it's how we teach addition at first.
It's not an unreasonable thing to assume at first, especially since your
professors will almost certainly tell you to assume that, but there can be
hidden complexity under the covers. You just don't need to care about it in
most cases.

------
ericn
In your example, that is O(1). A computer can add two numbers in constant
time. How could that be two steps? It is obviously only one operation.

In general, you can count steps as whatever you like. For instance, you could
say it takes O(n) CPU operations or O(n^2) file seeks. But then people use the
word "time" as a proxy for those. You'll hear "constant time" or "order-n
time".
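To make "count steps as whatever you like" concrete, here's a small sketch (my own example, not from the thread) that picks element comparisons as the step being counted, in a linear search:

    def linear_search(items, target):
        """Returns (index, comparisons). Here we choose element
        comparisons as the 'step' we count; index is -1 if not found."""
        comparisons = 0
        for i, item in enumerate(items):
            comparisons += 1
            if item == target:
                return i, comparisons
        return -1, comparisons

    idx, comps = linear_search([5, 3, 8, 1], 8)
    # comps == 3: the search looked at three elements before finding 8

The loop bookkeeping (incrementing `i`, the loop test) isn't counted at all; that's a modeling choice, and in the worst case `comps` is n, which is what "O(n) comparisons" summarizes.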

~~~
orthebox
So basically every computation counts as a step?

But how about control structures? Like in Python:

"for i in range(4)", how many steps are there?

