

Is there a way to calculate approximate running time of an algorithm - curiouskevin

Given an algorithm and an input size, is it possible to calculate how long it would take for a typical personal computer to produce the output?

If there are any techniques, please help me out.
======
NickPollard
This is, generally, undecidable. In fact, it's one of the most famous results
in Computer Science, known as The Halting Problem[1].

Alan Turing showed that, for a program run on a particular definition of a
computer (a 'Turing Machine') that is equivalent to modern computers, there
is no _general_ solution for determining whether a program, run with a given
input, will halt or run forever. Your question about how long an algorithm
will run is a harder version of the same problem.

If you confine yourself to particular instances of certain algorithms, then
you can do what you're asking. The field of computational complexity is focused
on the performance of algorithms, though the common measures[2] - big/little
O/Omega/Theta - capture asymptotic complexity; that is, how the cost grows
as inputs become arbitrarily large.

[1]
[https://en.wikipedia.org/wiki/Halting_problem](https://en.wikipedia.org/wiki/Halting_problem)
[2]
[https://en.wikipedia.org/wiki/Big_O_notation](https://en.wikipedia.org/wiki/Big_O_notation)

------
varjag
You can express the upper bound performance limit via Big-O notation:

[https://en.wikipedia.org/wiki/Big_O_notation](https://en.wikipedia.org/wiki/Big_O_notation)

E.g. a list traversal, where n is the length of the list, would be an O(n)
operation, an array reference a constant-time O(1) operation, and so on.
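To make that distinction concrete, here's a minimal Python sketch (my own illustration, not from the parent comment) that times a full traversal against a single index lookup as n grows; the traversal time scales with n while the lookup stays roughly flat:

```python
# Time an O(n) traversal vs. an O(1) index lookup at growing sizes.
import time

for n in (10**5, 10**6, 10**7):
    xs = list(range(n))

    t0 = time.perf_counter()
    total = sum(xs)            # O(n): touches every element
    traverse = time.perf_counter() - t0

    t0 = time.perf_counter()
    mid = xs[n // 2]           # O(1): single array reference
    lookup = time.perf_counter() - t0

    print(f"n={n:>9}  traverse={traverse:.4f}s  lookup={lookup:.2e}s")
```

On my machine the traversal column grows by roughly 10x per row while the lookup column barely moves, which is the asymptotic behavior the notation describes.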

This gives you just the upper bound, however; there is no method to time the
execution of a program on a given input other than running it. Also note
that in the general case, this reduces to the halting problem:

[https://en.wikipedia.org/wiki/Halting_problem](https://en.wikipedia.org/wiki/Halting_problem)

~~~
curiouskevin
I am aware of Big-O notation. For me it's just about the execution time of an
algorithm. I thought it might be possible to calculate it, maybe by
multiplying a bunch of constants.

For instance, an algorithm has a complexity of O(n) and an input size of 10^6.

Maybe, with the number of operations a typical personal computer can perform,
we could calculate it in terms of seconds/milliseconds for a given input.

Well, kind of very vague numbers, I guess... Not trying to get exact numbers.
Just curious.
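The multiply-a-bunch-of-constants idea I have in mind looks something like this sketch, where the 10^9 ops/second figure is just an assumed ballpark for a typical machine, not a measured constant:

```python
# Back-of-the-envelope runtime estimate: time ~= operations / throughput.
OPS_PER_SECOND = 1e9  # assumed throughput; varies a lot by machine and language

def estimate_seconds(n, k=1, constant=1.0):
    """Estimate the runtime of an O(n^k) algorithm with a guessed constant."""
    return constant * (n ** k) / OPS_PER_SECOND

print(estimate_seconds(10**6))        # O(n),   n = 10^6 -> 0.001
print(estimate_seconds(10**4, k=2))   # O(n^2), n = 10^4 -> 0.1
```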

~~~
gus_massa
It's almost impossible, but you can get some back-of-the-envelope estimates
that are not accurate enough for any serious use.

First you need to know the constant that is in front of the O(n^k).

If you are optimistic, the run time is T ~= A * n^k + B * n^(k-1) + ...

For simplicity, just drop the B and all the terms of smaller order, so
T ~= A * n^k.

As a rule of thumb, I assume that a program where n^k = 10^9 will run in a few
seconds, so A = 10^-9. (Ten years ago, I used to estimate that A ~= 10^-6, but
now the computers are faster.)

This depends heavily on the computer, the language, and the actual operation
that the program is repeating. So it's useful to run a few tests with small
inputs, get a runtime of a few seconds, and estimate the value of A for each
case.
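A minimal sketch of that calibration step (the `work` function below is a stand-in O(n) algorithm, not anything specific): time a few small sizes, average the per-operation cost A, then extrapolate T ~= A * n for a bigger n.

```python
# Estimate A empirically from small runs, then extrapolate.
import time

def work(n):
    # Stand-in O(n) algorithm: a simple accumulation loop.
    s = 0
    for i in range(n):
        s += i
    return s

samples = []
for n in (10**5, 2 * 10**5, 4 * 10**5):
    t0 = time.perf_counter()
    work(n)
    elapsed = time.perf_counter() - t0
    samples.append(elapsed / n)       # per-element cost, i.e. A

A = sum(samples) / len(samples)
big_n = 10**8
print(f"estimated A = {A:.2e} s/op, predicted T({big_n}) ~= {A * big_n:.2f} s")
```

As the rest of the comment explains, an A measured this way is only valid for this machine, language, and inner loop; caches and allocators will bend the curve once n gets big enough.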

In particular, my netbook is 4 times slower than my desktop, and you can get
similar constant ratios between your favorite languages. So remember that A is
only an approximation that changes from case to case.

But there are some other problems; for example, as the numbers increase you
begin to have problems with the garbage collector, malloc, the memory cache,
the disk cache, ...

Sometimes small cases fit in the microprocessor's cache but big cases need
to be swapped to disk. If you plot T vs. n you will see that the curve is not
smooth but has a few jumps. (I like log-log graphs.)

And there are weird cases; for example, the time constant can be different for
even and odd n, or under other unexpected conditions.

For example, I remember a Fortran program for something like matrix
multiplication where the time was much bigger when n was a power of 2, like 64
or 128 (i.e. the run time for 65 was smaller than the run time for 64). I
guess it was bad for the cache. But the problem disappeared with -O3, because
it automatically transposed the matrix, or padded it to avoid cache alignment
issues, or some other kind of magic.

