

Progress In Algorithms Beats Moore's Law - yarapavan
http://agtb.wordpress.com/2010/12/23/progress-in-algorithms-beats-moore%E2%80%99s-law/

======
ced
I'm a fan of Proebsting's Law:

 _Proebsting's Law: Compiler Advances Double Computing Power Every 18 Years

I claim the following simple experiment supports this depressing claim. Run
your favorite set of benchmarks with your favorite state-of-the-art optimizing
compiler. Run the benchmarks both with and without optimizations enabled. The
ratio of those numbers represents the entirety of the contribution of
compiler optimizations to speeding up those benchmarks. Let's assume that this
ratio is about 4X for typical real-world applications, and let's further
assume that compiler optimization work has been going on for about 36 years.
These assumptions lead to the conclusion that compiler optimization advances
double computing power every 18 years. QED.

This means that while hardware computing horsepower increases at roughly
60%/year, compiler optimizations contribute only 4%. Basically, compiler
optimization work makes only marginal contributions._
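The arithmetic behind those numbers can be checked directly. A minimal sketch, assuming the quote's figures (a ~4x total speedup accumulated over ~36 years of compiler work, versus hardware at ~60%/year):

```python
import math

# Assumed inputs from the quote: optimizations buy ~4x over ~36 years.
speedup = 4.0
years = 36.0

# 4x is two doublings, so one doubling per 18 years.
doublings = math.log2(speedup)          # 2.0
years_per_doubling = years / doublings  # 18.0

# Equivalent annual growth rate from compiler optimizations: ~4%/year.
annual_rate = speedup ** (1 / years) - 1

# Hardware at ~60%/year doubles in under 1.5 years by comparison.
hw_doubling_years = math.log(2) / math.log(1.60)

print(years_per_doubling)               # 18.0
print(round(annual_rate * 100, 1))      # ~3.9 (percent per year)
print(round(hw_doubling_years, 2))      # ~1.47 (years)
```

So an 18-year doubling time is the same claim as roughly 4% annual improvement, which is what makes the comparison with hardware so lopsided.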

------
pohl
Moore's Law doesn't just bring processor speed, but also vast quantities of
RAM. How much of the improvement credited to algorithms came from
newly-affordable time/space tradeoffs brought by Moore's Law? I like the question
this post is raising, but would like to see it approached in a more careful
manner.

~~~
lsb
Also with more RAM, more and more datasets are fitting in memory. You can have
all of Wikipedia in memory on a machine that costs as much as a Venti Latte
per day.

------
iwwr
Perhaps Moore's law is a consequence of people's ability to design algorithms.
Hardware design needs similar resources of the mind. Circuit routing,
simulations, benchmarking, design re-use, HDL code bases (
<http://en.wikipedia.org/wiki/Hardware_description_language> )...

~~~
sharednothing
Digging deeper, you will hit the characteristic contrast between the
platonic and the physical. Chips are built from matter, and the limiting
factors are ultimately physical. Algorithms are embodied on physical machines,
but the bottlenecks are primarily conceptual. In that sense, the takeaway from
this for me is a verification of the intuition that conceptual models offer
greater degrees of freedom. (duh.)

------
philwelch
This is really interesting, since a lot of the fundamental algorithms you
learn about in school (sorting, Dijkstra's, etc.) haven't been improved for
decades, and in some cases are even theoretically impossible to improve. What
kinds of algorithms _have_ steadily improved over time?

~~~
uriel
> What kinds of algorithms have steadily improved over time?

What? Are you requesting evidence for their claim? How dare you!

