

Ask HN: A tool to analyse the orders of growth - morphir

Does anyone know of any code profiling tool that can do asymptotic analysis of source code?
======
sqrt17
You may be starting from wrong assumptions. Asymptotic analysis of algorithms
is useful when you want to extrapolate to larger datasets, since it helps you
figure out what, if the data grew enough, would become the bottleneck in your
source code. A profiling tool is useful for getting at the _actual_ time
consumption. So what you'd want is to run your code under profiling, using a
realistic amount of data. (Use a sampling profiler if the instrumented source
code is too slow.) Anything that looks bad at the scale that you're aiming at
is something that you'd eventually want to address. Another method would be
to run the profiler twice, once on a small-ish dataset and once on a large-ish
dataset (where large-ish doesn't have to be the full size) and then subtract
the small-ish times (cumulative or self) from the large-ish times so you get
rid of the startup time and see more clearly where the non-startup portion is
spending its time.
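A minimal sketch of that subtraction idea, using Python's cProfile and a hypothetical `workload` function standing in for the real code (the sizes and threshold are arbitrary assumptions, not measured values):

```python
# Profile a stand-in workload at two sizes and subtract per-function
# cumulative times, so fixed startup cost drops out of the comparison.
import cProfile
import io
import pstats

def workload(n):
    # hypothetical stand-in for real code: an O(n^2) inner loop
    data = list(range(n))
    total = 0
    for x in data:
        for y in data:
            total += x * y
    return total

def cumulative_times(n):
    # returns {(file, line, funcname): cumulative_seconds}
    pr = cProfile.Profile()
    pr.enable()
    workload(n)
    pr.disable()
    stats = pstats.Stats(pr, stream=io.StringIO())
    return {func: ct
            for func, (cc, nc, tt, ct, callers) in stats.stats.items()}

small = cumulative_times(100)   # small-ish dataset
large = cumulative_times(400)   # large-ish dataset
for func in large:
    delta = large[func] - small.get(func, 0.0)
    if delta > 0.001:  # ignore noise
        print(func, round(delta, 4))
```

The functions that survive the subtraction with a large delta are the ones whose cost actually grows with the data, which is what you care about when extrapolating.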

------
jheriko
This is a non-trivial problem, generally complexity analysis is done on paper,
based on some idealised version of an algorithm. In practice implementation
issues will make it difficult to extrapolate asymptotic behaviour for small n.

For instance, an algorithm with O(log n) multiplications and O(n^2) additions
may be dominated by the multiplications for small n, and may mislead an
analysis tool into believing the asymptotic time complexity of the algorithm
to be O(log n) rather than O(n^2). The additions will dominate for very
large n, simply because at some point a*n^2 > m*log n, where m is the
run-time of a multiplication and a is the run-time of an addition.

This becomes even more difficult if addition and multiplication have non-
constant run-time as well (e.g. multi-precision ints).
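The crossover is easy to see numerically. A sketch with assumed per-operation costs (a = 1 per addition, m = 1000 per multiplication, both made up for illustration):

```python
# Find the first n where the O(n^2) addition cost overtakes the
# O(log n) multiplication cost, despite multiplications being 1000x
# more expensive per operation.
import math

a, m = 1.0, 1000.0  # assumed per-operation costs, not measured values

def additions_cost(n):
    return a * n * n        # O(n^2) additions

def multiplications_cost(n):
    return m * math.log(n)  # O(log n) multiplications

n = 2
while additions_cost(n) <= multiplications_cost(n):
    n += 1
print("additions dominate from n =", n)
```

Even with a 1000x per-operation disadvantage, the additions take over at a fairly small n, so any tool that only observes small inputs would report the wrong asymptotic class.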

