
Wirth's law - frostmatthew
https://en.wikipedia.org/wiki/Wirth%27s_law
======
znpy
A relative of mine asked a very similar question a while ago: "But this
computer I bought in 1999 was super fast at the time; how come it is so
slow now?"

While the computer is the same (actually it was upgraded with more RAM and a
much bigger hard disk), the real difference is the amount of data we process:
in 1999 we did not take dozens of 5MB 12-megapixel photos, we did not have
rich internet applications in our browsers (Facebook?), and we did not have
full-HD movies.

I feel that software often does get slower, but more importantly we use the
same computer to perform computations on much bigger data (sometimes an order
of magnitude bigger).

~~~
vmorgulis
> ... we use the same computer to perform computation on bigger-sized data
> (sometimes even an order of magnitude bigger).

Yes for the size of the data (in video, for example), but the look and feel
of today's UIs is not a million times better than a Mac from 1984:

[https://jamesfriend.com.au/pce-js/](https://jamesfriend.com.au/pce-js/)

Maybe as Moore's law comes to an end we will see improvements in software
(with superoptimization, why not), but that's not certain if we look at how
cars have evolved.

------
hyperpallium
Optimise only if needed; then only where needed.

The JVM and JavaScript engines have increased markedly in performance
(clawing back a little of the performance they squandered in the first
place...).

Perhaps iOS is so performant because it had to be, especially on the
original iPhone.

