By "computer" I mean: desktop computer, phone, tablet, VR headset. I'm not so sure that the problem happens with game consoles -- but it also doesn't seem to affect PC games. (The machine annoying me the most right now is an 8th generation iPad.)
So far as I can tell the problem started around the time Win95 came out, and classic Mac OS caught it at about the same time. It did not seem to affect minicomputers like the PDP-8, PDP-11, and VAX. It did not seem to affect home computers like the Apple ][, Commodore 64, and such. I don't think it affected PC compatibles running DOS. I don't think it affected Sun or AIX workstations, or Linux machines running X11, until KDE and GNOME came along -- now it does.
(I think it happened around the time that most GUI systems started to incorporate distributed object models and RPC mechanisms like COM, XPC services, DBus, Binder, etc. Certainly all the systems that annoy me like this have something like that.)
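(To make that concrete, here's a toy sketch of my own -- not code from any of these systems -- of why a synchronous cross-process call is a plausible culprit: while the caller waits on the round trip, it can't respond to anything else.)

    import time

    def slow_service_call():
        # Stand-in for a blocking IPC round trip (COM, XPC, D-Bus, Binder, ...)
        # whose other end happens to be busy, paging, or waiting on yet
        # another service.
        time.sleep(2.0)
        return "icon metadata"

    def handle_click():
        print("click received")
        reply = slow_service_call()   # the "UI thread" blocks for the whole round trip
        print("click handled:", reply)

    handle_click()   # any input arriving during those 2 seconds just queues up

Nothing here is specific to any one OS; the point is just that a single stalled synchronous dependency shows up as a visible freeze.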
Reinstalling the OS seems to help for a while, but as time goes by the honeymoon gets shorter and the machine gets slow again faster.
Back in the day it helped to defragment hard drives, but we're told we don't need to do that in the SSD era.
If I had to describe it, it's not that the machine gets "slower" so much as it has periods of unresponsiveness that become longer and more frequent. If it were a person or an animal, I'd say it was inattentive, distracted, or paying attention to something else.
If you benchmarked in the ordinary way you might not see the problem, because it's a disease of interactive performance. A benchmark might report 10 seconds on either an old or a new machine; the benchmark might take 0.2 sec to launch on the new machine and 2.0 sec on the old one, but it doesn't start the clock until after that. You might need to take a video of the screen and input devices to really capture the experience.
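(One rough way to capture it without a camera: time the whole thing from the outside instead of trusting the benchmark's own stopwatch. This is just a sketch; "my_benchmark" is a placeholder command, not a real tool.)

    import subprocess
    import time

    t0 = time.monotonic()
    result = subprocess.run(["my_benchmark", "--run"], capture_output=True, text=True)
    t1 = time.monotonic()

    # The benchmark's own clock starts after launch; the outer clock includes
    # the launch latency, which is closer to what you actually feel.
    print("self-reported:", result.stdout.strip())
    print(f"wall time incl. launch: {t1 - t0:.2f} s")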
Any thoughts?
Computers were slow in the '80s and '90s, so if you wanted your program to run fast, you had to know how the cache and RAM worked and utilise what little power there was to the max. As computers have gotten faster, that knowledge has become less relevant to the average developer.
There are some fields where that knowledge is useful, like embedded systems, HPC or games, but they are a minority.
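To be concrete about the kind of knowledge I mean, here's a quick sketch of mine (using NumPy only to make the memory layout visible): the same data and the same arithmetic, but one traversal follows the contiguous axis and the other strides across it, thrashing the cache.

    import time
    import numpy as np

    a = np.random.rand(4000, 4000)   # ~128 MB, C-order: rows are contiguous

    t0 = time.perf_counter()
    by_rows = sum(a[i, :].sum() for i in range(a.shape[0]))   # contiguous reads
    t1 = time.perf_counter()
    by_cols = sum(a[:, j].sum() for j in range(a.shape[1]))   # strided reads
    t2 = time.perf_counter()

    print(f"by rows:    {t1 - t0:.3f} s")
    print(f"by columns: {t2 - t1:.3f} s   (typically noticeably slower)")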
I won't deny there are some benefits to not needing to care - you can write programs faster (even if they run slower), and some classes of bugs become less common (e.g. memory bugs). These are good things, but fundamentally our programs still move memory around and interface with the real world.
As more people forget (or aren't taught) to keep this in mind, we'll lose the ability to make things fast when we need to, and our software will get slower and slower until it's unusable.
On a related note, we're also not allergic to complexity the way we should be. The average developer doesn't have a problem pulling hundreds of libraries into a single project if it makes their job easier. Unfortunately that also makes programs slower and harder to debug, and it increases the attack surface for vulnerabilities.