Hacker News

There was a Slashdot post many years ago that spoke of a 100,000x general/global algorithmic improvement over the decades. The point was that it absolutely dwarfed the hardware improvements over the same time span.

To be clear, this was a post, not a comment. So it wasn't just some rando spouting off. Nevertheless, I was never able to find any backing for the claim. But it's always fascinated me.

It reminds me of something from a Frank McSherry comment about scaling down (algorithmically) being a third option, in addition to scaling up (a bigger server) and scaling out (more servers).
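A toy sketch of what "scaling down" means in practice (my own illustration, not from the post or from McSherry): the same problem solved with a worse and a better algorithm, where the algorithmic change buys far more than any plausible hardware upgrade would.

```python
import time

def has_duplicate_quadratic(xs):
    # Naive O(n^2) pairwise scan -- the kind of code that makes you
    # reach for a bigger server (scale up) or more servers (scale out).
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                return True
    return False

def has_duplicate_linear(xs):
    # O(n) with a hash set -- "scaling down" the work itself.
    seen = set()
    for x in xs:
        if x in seen:
            return True
        seen.add(x)
    return False

if __name__ == "__main__":
    data = list(range(5000))  # worst case: no duplicates, full scan needed

    t0 = time.perf_counter()
    r1 = has_duplicate_quadratic(data)
    t1 = time.perf_counter()
    r2 = has_duplicate_linear(data)
    t2 = time.perf_counter()

    assert r1 == r2 == False
    print(f"quadratic: {t1 - t0:.4f}s  linear: {t2 - t1:.4f}s")
```

Even at n = 5000 the gap is orders of magnitude, and it widens quadratically with n, which is the sense in which algorithmic gains can dwarf hardware gains.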



