

Using static analysis to derive the Big-O notation of a program at compile time? - aerlinger

There are various tools that use static analysis to validate and extract information about a program without running it. This can happen at/during compilation (e.g. generating documentation) or after the code has been compiled and linked (as a virus scanner does). However, I haven't heard of static analysis being used to infer the computational complexity of a program or to estimate how long it will take to run. Of course, there are always branching conditions within a program, but machines are strictly causal, so given some initial state and an input it should be possible to estimate a program's runtime.

Perhaps I'm being naive, but does a tool like this exist? Is it even possible to use this sort of analysis to estimate, even if imprecisely, the computational complexity of a program?
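(Aside: the closest practical cousin is *dynamic* estimation rather than static analysis — run the program on growing inputs, then fit the slope of log(time) vs. log(n), which approximates k for O(n^k). A minimal sketch of that idea; the function name and sizes are illustrative, not from any existing tool:)

```python
import math
import timeit

def estimate_growth_exponent(func, sizes):
    """Empirically estimate k in O(n^k) for func(list_of_size_n).

    Times func on inputs of each size, then does a least-squares fit
    of log(time) against log(n); the slope approximates the exponent.
    """
    xs, ys = [], []
    for n in sizes:
        data = list(range(n))
        t = timeit.timeit(lambda: func(data), number=3)
        xs.append(math.log(n))
        ys.append(math.log(t))
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    # slope of the log-log regression line
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

# Usage: a deliberately quadratic function should fit a slope near 2.
def quadratic(data):
    total = 0
    for a in data:
        for b in data:
            total += 1
    return total

slope = estimate_growth_exponent(quadratic, [128, 256, 512])
```

This only characterizes behavior on the inputs you try, so it can be fooled by constant factors, caching, or input-dependent branches — which is exactly the gap a static tool would have to close.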
======
zimpenfish
> it should be possible to estimate the runtime of a program

Isn't this an analog of the Halting Problem?
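(A concrete illustration of this point — a hypothetical example, not from the thread: the loop below terminates for every n anyone has tried, but proving it terminates for *all* n is the open Collatz conjecture. A static tool that could bound this loop's runtime for arbitrary n would settle that conjecture, which is why fully general runtime inference runs into Halting-Problem territory.)

```python
def collatz_steps(n):
    """Count iterations of the Collatz map until n reaches 1.

    Nobody can currently give a static bound on this count for
    arbitrary n > 0; termination itself is an open problem.
    """
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

# e.g. collatz_steps(6) walks 6 → 3 → 10 → 5 → 16 → 8 → 4 → 2 → 1
```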

