Ask HN: Why do computers/tablets/etc "freeze up" as they get older?
79 points by PaulHoule on Jan 14, 2024 | 103 comments
By "computer" I mean: desktop computer, phone, tablet, VR headset. I'm not so sure that the problem happens with game consoles -- but it also doesn't seem to affect PC games. (The machine annoying me the most right now is an 8th generation iPad.)

So far as I can tell the problem started around the time Win95 came out and also affected MacOS Classic at around the same time. It did not seem to affect minicomputers like the PDP-8, PDP-11 and VAX. It did not seem to affect home computers like the Apple ][, Commodore 64, and such. I don't think it affected PC compatible computers running DOS. I don't think it affected Sun or AIX workstations or Linux machines running X Windows until KDE and Gnome came along, now it does.

(I think it happened around the time that most GUI systems started to incorporate distributed object models and RPC mechanisms like COM, XPC services, DBus, Binder, etc. Certainly all the systems that annoy me like this have something like that.)

Reinstalling the OS seems to help for a short time, but as time goes by the honeymoon gets shorter and the machine gets slow again faster.

Back in the day it helped to defragment hard drives but we're told we don't need to do this in the SSD era.

If I had to describe it, it is not that the machine gets "slower", but instead it has periods of being unresponsive that become longer and more frequent. If it was a person or an animal I'd imagine that it was inattentive, distracted, or paying attention to something else.

If you benchmarked in the ordinary way you might not see the problem, because it's a disease of interactive performance: a benchmark might report 10 seconds on either an old or a new machine, but it might take 0.2 sec for the benchmark to launch on the new machine and 2.0 sec on the old one, and the benchmark starts the clock after this. You might need to take a video of the screen + input devices to really capture your experience.
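
(For anyone who wants a number instead of a video: here's a rough little Python sketch, purely illustrative and my own idea, that asks to be woken every 50 ms and logs whenever the wakeup comes back much later than requested. The long oversleeps roughly line up with the freezes you feel, even when a throughput benchmark reports the same score on both machines.)

    import time

    TICK = 0.05        # ask to be woken every 50 ms
    THRESHOLD = 0.25   # report pauses longer than 250 ms

    while True:
        t0 = time.perf_counter()
        time.sleep(TICK)
        late = time.perf_counter() - t0 - TICK
        if late > THRESHOLD:
            print(f"{time.strftime('%H:%M:%S')} stalled for ~{late:.2f}s extra")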

Any thoughts?




The average developer doesn't put the same care into what they're developing as they had to in the past.

Computers were slow in the 80s/90s, so if you wanted your program to run fast, you had to know how cache and RAM worked, and utilise what little power was there to the max. As computers have gotten faster, that knowledge has gotten less relevant to the average developer.

There are some fields where that knowledge is useful, like embedded systems, HPC or games, but they are a minority.

I won't deny there are some benefits in not needing to care - you can write programs faster (even if they perform slower), and some classes of bugs can be reduced (e.g. memory bugs). These are good things, but fundamentally our programs still move memory around and interface with the real world.

As more people forget (or aren't taught) to keep this in mind, we'll lose the ability to make things fast when we need to, and our software will get slower and slower until it's unusable.

On a related note, we're also not allergic to complexity the way we should be. The average developer doesn't have a problem using hundreds of libraries in a single project if it makes their job easier. Unfortunately this also makes programs slower, harder to debug, and increases the surface area for a vulnerability.


> we're also not allergic to complexity the way we should be.

Very well put. I think the majority of why performance isn't a goal anymore can be attributed to this. So much software today is written like it's running on a supercomputer (and in comparison to computers even just 10 years ago, it is). Libraries have become crutches rather than tools.


> Computers were slow in the 80s/90s, so if you wanted your program to run fast, you had to know how cache and RAM worked, and utilise what little power was there to the max. As computers have gotten faster, that knowledge has gotten less relevant to the average developer.

Yes and no here.

I fully remember the same thing happening on slower computers back in the day. My Amiga 500, a computer released in 1987, was this way as I recall, until I finally upgraded to a PC circa 1996/7 ish.

There was a point where MS was releasing slow software on purpose with the idea that the processors (which were still following Moore's law) would catch up and make it fast. Though one has to wonder if it was secretly Microsoft's inability to write good software that drove Moore's law.


> average developer

I don't know about you but "performance upgrades on ancient hardware" isn't exactly the sort of thing my manager would be terribly happy I was focusing my time on.


Our operating systems are not made by average developers.


Programmer from the 80s: Cache? What's that? People have room for Cache?


Sounds like excessive paging, aka Thrashing[1]. Newer software uses more memory, and when the OS has no more physical memory to allocate, it moves some data from RAM to disk and back until the process frees those pages. This destroys performance, and the more memory is over allocated, the longer it takes to finish the operation and go back to normal.

Unlike high CPU usage or network problems, this can potentially affect any memory operation, so even carefully designed GUIs will freeze.

This effect will not happen in systems with sufficient RAM, or that don't swap memory to disk. In that case it'll either crash the program trying to allocate, or a random process[2].

[1] https://en.wikipedia.org/wiki/Thrashing_(computer_science)

[2] https://en.wikipedia.org/wiki/Out_of_memory
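
If anyone wants to check whether this is what's happening on their own box, here's a rough sketch (my own, assuming Linux and the third-party psutil package, where the swap counters are cumulative bytes) you can leave running while the machine feels frozen:

    import time
    import psutil

    prev = psutil.swap_memory()
    while True:
        time.sleep(2)
        cur = psutil.swap_memory()
        mem = psutil.virtual_memory()
        print(f"RAM {mem.percent:.0f}%  swap {cur.percent:.0f}%  "
              f"swapped in {cur.sin - prev.sin} B / out {cur.sout - prev.sout} B over last 2s")
        prev = cur

Steadily rising swap-in while you interact with the GUI is the thrashing signature; a busy CPU with flat swap counters points somewhere else.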


100%. RAM is always the culprit IMHO, every release has a few more features that need a persistent daemon.


RAM upgrade was always the first thing to do back in the day, and my personal computers all have expansion slots. But when I’m involved in choosing new computers at my work, I’m struck by how many are not upgradable.


Demonstrably not for me, and many others I know who have 32/64GB RAM


How big of an upgrade is it from 16 gigs to 32? I don't use Android Studio or any JetBrains software but for me 4 GB isn't too low. In my eyes 16 GB is almost too much, so I get shocked when I hear of people using 32/64 GB.


Same, last time I changed PC I got 32GB of RAM just to avoid these issues, but they still happen nonetheless. I don't even know what to debug to find out the cause.


I'm discussing it in terms of mobile (I worked on Android)


There's a few things we have on newer computers that we didn't have on older computers:

* regular software updates -- more features and larger codebases running on the same machine

* larger quantities of software running at the same time -- on older machines multitasking either wasn't as common or wasn't even possible, and as time went on more software became available as did the ability to run more of it at the same time.

* the internet -- pages on the internet get larger and more demanding all of the time, while the capability of any given piece of hardware stays constant.

* active cooling and thermal throttling -- hardware can actually run slower if it gets dirty


... of course retrieving content from the internet can always be a little slow on a bad day, contributing to tail latency, whether that content is being fetched for a good or a bad reason.


I've always attributed it to the fact that developers typically have really good hardware all the time. When you develop on a great machine, you don't give the performance of older machines as much consideration. Software now typically has a mandatory upgrade path, so even if there is a version that works well with your hardware, you may not be allowed to use it.


Mostly, it's the software upgrades. Programs almost always get more bloated with each update: bigger executable sizes, more features that need more cruft initialized on startup and such.

Silicon aging contributes some: https://en.wikipedia.org/wiki/Transistor_aging

This wouldn't have affected your Apple II and PDP-8s, due to the huge transistors in those things (relative to modern integration scales), and not running at anywhere near modern clock speeds.


There definitely are rumors that device longevity is about to take a nosedive in 5nm and future process generations.


Maybe this is why the computers are so low-tech in the "Silo" series.


Where did you get this from?


I searched the text of their comment with Google and the third hit was this article (the first two being their comment): https://semiengineering.com/aging-problems-at-5nm-and-below/


from the article posted above:

> The biggest factor is heat. “Higher speeds tends to produce higher temperatures and temperature is the biggest killer,” says Rita Horner, senior product marketing manager for 3D-IC at Synopsys. “Temperature exacerbates electron migration. The expected life can exponentially change from a tiny delta in temperature.”

basically smaller nm builds are more susceptible to tiny variations in heat/cold, physical impacts, electrostatic shock, etc.


Substitute “bloated” with “more capable”. So much of the stuff we consider basic essentials was impossible not too long ago.


Like?

My personal experience is that general productivity apps, even office suites haven't changed that much over 30 years, but their system requirements keep jumping.

If you can identify differences between then and now, how many of those aren't bloat?


We now expect instant messaging apps to support:

* Video and voice calls, including group calls

* Inline images and video

* Screen sharing

* Clients on every OS and the web browser, they should all look pretty much exactly the same and be updated at the same time

* and a whole bunch of other random stuff like in app image editing, payment processing, etc

I'd say the biggest hit to memory usage and performance is just all the inline images and videos. IRC may have been super lightweight and fast, but people expect more of programs now. Electron/web tech was also pretty heavy, but releasing a native Windows app and maybe later a half done MacOS version is no longer acceptable. You _have_ to support windows, mac, linux, android, ios, and web. All of these versions _must_ have the full feature set and look the same.


Skype had group video calls, screen sharing and instant messaging almost twenty years ago. It ran fine on the machines of that time. It had good UI for all these things.


1) software grows and evolves in complexity over time while hardware remains constant until it is upgraded; this explains the macro or low-frequency component of the slowdowns (i.e., things that reinstalling the OS won't totally solve)

2) local state grows and evolves in complexity. configuration databases grow, new files are added to the filesystem, software upgrades leave unused state around (because getting software upgrades right is still a not-totally-solved problem), indexes (or similar) of mutable state grow but sometimes don't get rebuilt causing locality to suffer affecting cache performance.

3) psychology.


> software grows and evolves in complexity over time

AKA software bloat. The amount of layers of garbage in an OS today is horrendous.


I don’t miss each program coming with its own set of printer drivers. Or only having one type of storage: a directly attached disk of one of a few specifically supported models. Or each program having to decode images in its own code. Or having to POKE specific memory addresses to put a dot on the screen. Each of those layers provide baseline functionality that lots of programs can use.

Most of the time, “bloat” is another way to say “the features I don’t use”. Yes, sometimes people push bad code to production. Other times it’s new inherently more complex code that makes computers nicer to use.


Which layers do you think are not useful?


Apple has admitted to reducing clock speeds of devices during updates to increase battery life.

As for android, I have experienced this a million times also, but do not know the cause. But I do usually see an accumulation of new and stupid bugs as I get software updates. Incorrect app scaling and positioning, etc which were not the case for the first six months I had the phone.


> Apple has admitted to reducing clock speeds of devices during updates to increase battery life.

Every single modern device with a CPU does this dynamically. You are mixing up what batterygate was about.

Apple released an update that would permanently throttle a handset if it rebooted (browned out) due to a degraded battery. Replacing the battery would bring it back to full speed. The feature still exists today. It’s a bit like the “limp home” mode in your car.


It’s not to increase battery life, it’s that when batteries degrade, the voltage can drop below spec when the current gets too high. So when the CPU hits its peak frequency, the battery voltage drops and the chip glitches, causing a reset/reboot.

Apple added an update that could detect these power-failure reboots and cap the CPU frequency so your phone doesn’t crash all the time. Around the same time, a lot of Android phones like the Nexus 6P were plagued with this issue where the phone would just randomly shut down.


Older Android devices had terrible storage performance. Any time a background process is using storage, the device gets very slow.


I think that this comes from people using cheap and nasty SD cards; some vendors automatically moved app data to the SD card, which really sucked if the SD card was slow/faulty.


One of the biggest differences was on an nvidia Shield tablet. They cut a ton of corners to fit the GPU and still hit a $200 price point. It didn’t even have fast charging, so sometimes the battery would drain during a game even when it was plugged in.


I'm going to hell for saying this... But I never had this with my Android phones.

Galaxy S2, S3, S4, S5, OnePlus 6 and more


One aspect sometimes overlooked is that the physical device degrades over time. Blocks of memory corrupt, connections erode, humidity or dust slows down cooling mechanisms. We tend to think of computers as either working or not but they can and do break down over time.


Most of those effects you mention manifest themselves as obvious crashes.

If cooling is insufficient so as to cause thermal throttling, the effects are usually very obvious too, and not gradual. Neither will reinstalling software change that.


Most computer users never monitor their cooling effectiveness at all. They get used to the fans spinning up when the computer is doing anything that THEY think is non-trivial, even if it actually should be trivial. Then there's a smooth transition from "fans spin up but things perform normally" (the ailing thermal solution is able to hit targets without throttling) and "fans spin up and things perform awfully" (the cooling can no longer hit targets without throttling).

This can be explained just by thermal paste drying out and caking up, no more mystery needed. I've had 12 year old "workstation" laptops (the copper cooling alone was heavier than some entire laptops are these days) which started spinning up their fans for seemingly no reason. I dismantled them, cleaned off the old paste, put on a fresh paste, reassembled, and haven't heard the fans once since then. The cooling was good on day one, it got worse for unavoidable physical reasons, and it's easy to fix if you know how.

The problem is most users do not know this is happening and wouldn't know how to fix it even if they did. If they ask someone for help, 99% of the time they'll be told it's too old and they should upgrade.
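
If you'd rather check before opening the laptop up, a quick sanity test is to watch CPU frequency against temperature under load. A rough sketch of my own, assuming Linux and the third-party psutil package (sensor names and availability vary a lot by machine):

    import time
    import psutil

    while True:
        freq = psutil.cpu_freq()               # current and max MHz (may be None on some systems)
        temps = psutil.sensors_temperatures()  # empty dict where unsupported
        hottest = max((t.current for readings in temps.values() for t in readings),
                      default=float("nan"))
        print(f"cpu {freq.current:.0f}/{freq.max:.0f} MHz   hottest sensor {hottest:.0f} C")
        time.sleep(2)

Frequency pinned well below max while the temperature sits at its trip point is throttling; fresh paste and a dusting usually fix it.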


Surely if a connection erodes it is either still connected and working or not connected and not working? It wouldn't get slower.


Systems are built to be fault tolerant because hardware errors are inevitable. Bit-flips in RAM and disks, line noise in transmission circuits, voltage fluctuations, etc. And these problems are statistical. In a new machine the chance of error may be 1%, then as it ages the error rate increases to 10%. Repeat something a billion times and those percentages mean many millions of errors. Not only in time but density; a chip with a thousand transistors is going to have fewer problems than one with 10 million. That's why a much older system can be faster than an old but more recent one. But also because the older system will be less fault tolerant. Back then, the cost of handling a bit-flip or a low-voltage event wasn't considered worth it, so the system ignored the errors and continued to run at full speed as it corrupted your data.


The connection itself wouldn't get slower, but the outer mechanism using that connection may retry, and the retry may be successful. The appearance to the user is one of slowness.

For example, one of the symptoms of network packet loss is slowness. Assume 2% of packets are lost. A packet is lost, the receiver re-requests it, and the retry has a 98% chance of being successful. End result: the transmission completes, but at a slower pace.
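
To put illustrative numbers on that (my assumptions, not measurements): with 2% loss and a typical retransmission timeout the average delay barely moves, but roughly one packet in fifty stalls for the whole timeout, and those stalls are what reads as "slowness":

    loss = 0.02    # assumed: 2% of packets lost
    rtt  = 0.020   # assumed: 20 ms normal round trip
    rto  = 0.200   # assumed: 200 ms retransmission timeout

    avg = (1 - loss) * rtt + loss * (rto + rtt)   # ignoring double losses
    print(f"average delay {avg*1000:.1f} ms vs {rtt*1000:.0f} ms with no loss")
    print(f"but ~1 in {int(1/loss)} packets takes ~{(rto + rtt)*1000:.0f} ms")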


The process of recognizing that a subset of a component has failed, recovering from the failure, and resuming would result in a slow-down. I'm not sure that's a full explanation for the experience though.


For something like an Ethernet connection a cable can certainly be half bad, good enough that you can transfer some data but with speed and reliability problems particularly involving weirder protocols that involve broadcast on the LAN and such.


There are a few things going on that I’m aware of. It’s usually related to disk.

- Windows Update can take a lot of blame for this. Basically the longer the Windows release has been out, the more Windows updates there are. After every boot one of the first things Windows does is check for updates. It then scans the local folder where the updates are saved and does some sort of check to make sure everything is good. Processing this folder can take hours on an older machine with a spinning disk.

- The Windows registry, where almost all Windows settings and quite often app settings are stored, has terrible performance and can become bloated over the years as more and more stuff is written to it.

- An SSD can suffer bad write performance if you are dealing with a somewhat full disk and it can’t trim space quick enough to deal with new writes coming in.

- In Linux you’re not supposed to mount with discard any more but instead rely on a cron job that runs in the background once a day. I think various manufacturers are supposed to handle the trim in firmware too but then it’s a black box.

- Filesystems generally perform worse as they fill up. In my experience NTFS seems to have a lower threshold than ext4 or XFS.

- Your disk might be failing. Look at the SMART stats to see if you are hitting unacceptable error thresholds (a quick way to check is sketched below, after this list).

- In Linux anything that uses snaps will take forever to start the first time.
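
For the SMART point above, a minimal way to check from a script (assuming smartmontools is installed and you can run it as root; /dev/sda is just a placeholder for your actual drive):

    import subprocess

    drive = "/dev/sda"   # placeholder: substitute your disk
    subprocess.run(["smartctl", "-H", drive], check=False)   # overall health verdict
    subprocess.run(["smartctl", "-A", drive], check=False)
    # in the attribute table, watch Reallocated_Sector_Ct and Current_Pending_Sector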


There are a lot of different factors at play. What you're seeing looks similar but has many different causes in different cases, like having a runny nose. Lots of different things cause it, same symptom.

Software can decay, hardware can decay, both can exacerbate each other. The registry gets filled up, the filesystem gets fragmented, the software accumulates in memory, upgrades of the same software accumulate hacks and poor design leads to worse fixes to problems, and slowly buggier versions are shipped. The hard drive slowly fails, the SSD loses viable blocks, the fan and heatsink gets clogged with dust and less efficiently cools the system, the battery loses capacity which causes the system to throttle itself, the power supply components slowly degrade causing voltage inconsistency, tin whiskers develop.

The simplest explanation I can give is entropy. Everything in the universe is in a state of decay. Eventually the thing decays so much it breaks down.


I definitely had a desktop replacement laptop which got terribly clogged with dust in my dusty old house, and revived it with a spray can of some F-gas that has terrible global warming potential. I was hoping we'd get a fluoroketone-based replacement, but now we know fluoroketones are dangerous PFAS (makes me wonder what to do with the expensive 3M contact cleaner I bought, now that I think of it).

I had two mac minis that operated in the same conditions and seemed well designed enough that dust got separated from the air that went through the computer so it never got clogged up.


I haven't experienced this for nearly 15 years now. If something starts acting up I always find there is either hardware going bad that will completely fail before too long, a bunch of bloatware running in the background, a drive is completely full and there isn't enough room for page file, or it has uptime measuring in weeks or months.


Most people aren't sophisticated enough to identify those issues, practically speaking, sadly.

E.g., I'm one of the only people on my team at work whose computer just works consistently, since I understand these things and sidestep well-intentioned but poorly conceived IT policies, such as untested updates that nearly brick laptops often...


I use a lot of 10-20 year old computers and never experience those. Right now I am writing from a single-core P4 with 2 GB RAM, and of course it is 32-bit. I can still open any website from HN except those requiring me to have a top-notch web browser.

The only problem I have is that I can neither update Chrome to a modern version (understandable) nor downgrade it to the version with Adobe Flash support (not understandable). Not supporting Flash is kind of moronic despite the notorious bugginess of Flash, because if you (Google) don't support a browser for me anyway and my browser is buggy a priori, why do you bother me by disallowing Adobe Flash?


For a program to freeze, it must do i/o, usually. E.g. my image viewer has a convenient thumbnail cache which speeds up things. But when I start it in a folder of many folders of many images, it freezes for several seconds in i/o. Sadly it has features that stop me from migrating away.

Another vector is uncontrolled synchronicity. E.g. I knew programs that could install an Explorer context menu item that could delay presenting it for two seconds when you right-click. I don’t remember the names, but remember hunting them down and removing from my context menu.

So I think that mostly all of this is because of (1) problematic implementations of caches and (2) plugins that break assumptions on how long something should take.
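
A toy illustration of (2), using nothing beyond the Python standard library: the same slow job freezes the whole window when it runs on the UI thread and doesn't when it's pushed to a worker thread, which is roughly the difference between a well-behaved and a badly-behaved plugin.

    import threading
    import time
    import tkinter as tk

    def slow_job():
        time.sleep(3)   # stands in for slow I/O, a network call, a cold cache...

    root = tk.Tk()
    # Runs on the UI thread: the window stops repainting for 3 seconds.
    tk.Button(root, text="Freeze the UI", command=slow_job).pack()
    # Same work on a worker thread: the window stays responsive.
    tk.Button(root, text="Stay responsive",
              command=lambda: threading.Thread(target=slow_job, daemon=True).start()).pack()
    root.mainloop()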


Right, I'm strongly skeptical about plugins of any kind for GUI applications.

I've met people who claim they can install 30 plugins into Eclipse and still get things done, but when I try it, Eclipse seems to get "pluginitis" and become unreliable. I've found distros of tools like Eclipse and the JetBrains IDEs that have a curated collection of plugins that seem to work, but I find installing anything, even something little like a plugin for bash files because it saw I was editing a ".sh" file, is like playing Russian roulette.

I feel the same way about web browser extensions. I think it's OK to install something like uBlock Origin and certain other anti-tracking tools, but you can make the case that those improve system performance whereas performance can only go in one way installing other browser extensions.

The "problematic cache" rings true for me. Any developer has had the experience of making something that performs 100x or more better with a very simple cache. However it is by no mean a given that adding a cache layer really improves performance in a system, particularly when you can have problems with locks being held, invalidation, etc. Particularly if worst-case latency is your concern, for instance (and I think it is the concern when your computer is "freezing up") the cache will sometimes waste time doing its cache thing and still have to go to the source.

Those games might do better because of a different performance culture... People are very concerned about worst-case latency for games but seem to be completely unconcerned when it is a spreadsheet or a graphics editor or the GUI shell of your desktop.


Eclipse is using an effectively 'cooperating multitasking' scheme where it just hooks lots of events from the main application. If you just install 1 single bad plugin that tries to do synchronous network access it will undoubtedly kill your responsiveness. Lots of plugin based systems have evolved a more resilient way of hosting those modules. DAW applications host hundreds of audio plugins and even browser plugins are sandboxed nowadays.


> It did not seem to affect home computers like the Apple ][, Commodore 64, and such.

Your software most likely did not run off a hard disk. So it was slow to load anyways.. and after that it ran within the memory it had. One program at a time, no memory swapping.

> I don't think it affected PC compatible computers running DOS.

Umm.. I remember being mesmerized by disk defrag programs. There were also TSR programs and other ways to run more than one program at a time.


Circa 1987 I switched from a TRS-80 Color Computer 3 which had two floppy drives to a 286-based IBM AT clone with an HDD that almost exclusively ran MS-DOS or DR-DOS.

In general though if you have more state there is more possibility that the state goes bad (e.g. malware counts as "bad state", as does a configuration database growing without bound, or something like the XP-era updating mechanism in Windows that was O(N^2) in terms of the number of previous updates)

I could swear I didn't notice this rot with that 286 machine, the 486 machine I replaced it with that ran Linux, the Sun 3 and Sun SPARC workstations at my undergrad school, AIX workstations at grad school, etc. (There was the SGI machine that always struggled to get out of its own way: a professor bought it, never bothered to set a root password, never got anything done with it, but left it plugged into the Ethernet and power.)

Once we got into modern Linux distributions like Fedora that had Gnome or KDE I definitely had this problem though.


none of the other comments seem to mention memory swapping. this is the correct answer


HDDs wear out. Cheap eMMCs wear out too. I believe these were a big reason for performance degradation on computers. With decent SSDs it shouldn't happen (within the computer's lifetime), but some computers' performance still degrades a bit, so there should be other reasons too, like bloating Chromium.


This is obviously not the case in general. There are a few things that may be correlated with the age of a computer though -- full SSDs are often slower; newer programs/sites require more memory and CPU; thermal throttling due to dried up thermal paste or dust.


I've had laptops slow down like this due to dust on the fans. They weren't able to cool effectively, and presumably the CPU got throttled.

Not sure why it happens with phones -- I assumed the OSes shipped more expensive animations & apps to encourage people to upgrade.


In at least one case (I've most clearly observed this in cheapish Android tablets): inexpensive flash storage.

I've been able to observe, most obviously with my first gen Nexus 7 but also with other devices, that pure CPU benchmarks are still fine, but "disk" benchmarks show terribly low (~100kbps) throughput. Lots of modern software assumes flash/SSD is fast, but inexpensive flash wears out to the point that this assumption fails.

At least once I've seen a thorough wipe very temporarily restore some performance in such a case.
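
A rough way to reproduce that kind of "disk" number on anything that can run Python (the path and sizes below are just my assumptions; small synced writes are what worn flash is worst at):

    import os, time

    path = "/tmp/flashtest.bin"   # put this on the storage you want to test
    block = os.urandom(4096)
    n = 256                       # 1 MiB total, written as 4 KiB fsync'd blocks

    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
    t0 = time.perf_counter()
    for _ in range(n):
        os.write(fd, block)
        os.fsync(fd)
    elapsed = time.perf_counter() - t0
    os.close(fd)
    os.remove(path)
    print(f"{n * 4096 / elapsed / 1024:.0f} KiB/s for small synced writes")

Healthy flash should be orders of magnitude above the ~100 KB/s range; numbers down near that level fit the worn-out-flash explanation.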


FWIW, I don't personally think this affects most Linux distros. My Arch Linux laptop from 2014 still runs just as well as it did on day 1. The only reason I don't use it is because I wanted to use a gen4 SSD (at gen4 speeds). I verified this when I revived it recently to run a Win95 emulator so my son could play an old PC game.


I believe it's attributable to "wider" vs "taller" software development. As in, software using more cores, running more background processes, intermittently accessing I/O more frequently, vs software that does a single task more intensively like a game.

Games can be parallelized, but in a relatively tight way: the game just has to coordinate the physics/AI stuff with the rendering stuff. The rendering is defined in a way that lets it consume most of the resources. Games on old CPUs do hit thresholds where they stop being playable, but they are fairly concrete boundaries where that number of cores and clocks just won't hit the pacing needed for that level of detail.

But the general operating system layer has made a point of pushing both power and responsibility onto userspace applications, and allows them to be helpful and do things in the background. The applications in turn are doing a lot of redundant, overlapping work - no unified methods of serializing and communicating, polling for events, contra-user telemetry and the like - so the end result is a pileup of background load that can kill the experience.

The switch to the SSD era cured one aspect of this - background storage accesses - but just brought on different bottlenecks with memory, core, and network utilization.

The way it might be addressed is to give the OS relatively greater, higher-level responsibility and supersede more of what the applications are doing. The open-source/free-software space has a better chance of accomplishing this because, while it usually develops at a slower pace than a commercial company, it's more "sticky" in terms of accumulation at the base layer and fixing root causes instead of hacking around them. And people generally do report that installing Linux is a speed-up for their old device; when you run a contemporary Linux distro you get a lot of software that has been properly domesticated to the environment, and therefore doesn't turn into a burden.


my old Compaq w/ Windows 95 boots up like a charm, and I can still play games like X-Com w/o a problem. My early generation iPad can barely run the home screen.


My early iPad also seems unbearably slow. Not sure as to why.


Same here. Add to that the fact that very few apps can be installed anymore with the old version of iOS and it's of surprisingly little value. Even letting the kids watch YouTube has become impossible...


Yes, basically the only app I could use was a browser. I ended up giving it to my mother-in-law. It’s more her speed.

I bet Apple IIs run better today.


that old ipad probably has been updated to an iOS that really isn't designed for it

or, if it hasn't, you're trying to view websites with heavy javascript which requires too much memory/cpu to view effectively


My iPhone does this, and I have no idea why. It’ll just randomly have huge issues switching from one program to another. It doesn’t have an activity indicator on individual programs, so I never know what the issue is.

My mac basically never has it, unless Intellij is running in the background and decides to re-index.


The two most impactful things to try would be factory resetting it and having the battery replaced.


Has anyone had any luck breathing life into old tablets with PostmarketOS or anything similar? I'm talking about Android tablets that are now so slow as to be literally unusable. Some comments here mention hardware degradation, so can a lightweight alternative OS/distro make a difference?


A reason not mentioned so far: lots of software “phones home.” If home doesn’t exist, or network changes take place, the software can get confused.

Most software that phones home is designed to work regardless. But the functionality can degrade and performance takes a hit.

Obviously, updates become impossible.


Yes, I'd say this is one major reason. Software these days likes to talk to the internet and if that internet is slow or unreachable, the software is respectively slow or frozen.


Capacitors wear out.

It's hard to identify exactly which one blew unless it is obvious.

Toss the motherboard and upgrade.


Those operating systems have non-kernel long running software and more and more long running UI software. Heap fragmentation and memory garbage collection seem like probable causes.


... but the memory gets cleared when I reboot it, doesn't it? Rebooting doesn't fix the problem, at least not with the iPad.


Possibly, as newer OS and software updates are installed, they take more space. This could reduce the amount of time before memory becomes constrained. I have a similar problem with an Android tablet where there is no slow-down when I'm using an application or game, but switching apps or other OS UI activities is terribly slow.


They don't. Yes they might get slower as newer software imposes higher demands on less-capable hardware. But they shouldn't freeze.

That sounds like a faulty hardware issue, or, more likely, a configuration issue.

Once upon a time, the saying was "If you want to speed up a slow machine, throw more RAM at it" as more RAM meant less slow swapping/caching to slow hard-drives.


I had an iPhone and Google tablet that both had this problem. I stopped buying hardware from either. I switched to Galaxy phones and haven't had a problem since. My current S20 is 3 years old and runs like the day I got it. My previous galaxy phone still ran perfectly well after 4 years. I have no idea what accounts for it.

On a desktop, I think it's more about hygiene and expertise with the OS.


It does happen with consoles, but much later. That helps you understand why.

It comes from various electrical components; for example, electrolytic capacitors dry out.

The reason consoles don't use them is that they don't ever want the bad reputation of dying consoles.

Why do 'computers' use them? Planned obsolescence. You will go buy that new computer after a certain number of years.


Some manufacturers, such as Apple, have been intentionally making devices slower to drive sales of new products. Potentially Microsoft does the same with Windows. For instance, they recently announced that computers lacking a certain hardware component will not be able to run new versions of the OS, making hundreds of millions of perfectly running computers obsolete.


I personally haven't seen this effect on my machines, but then again, I don't change my configuration much and keep a constant watch over resource consumption. I suspect it occurs to those who gradually install and run lots of applications over time, and leave them open, without realising the resources they're using.


I had a Nexus 7 tablet that ran like a dream when it was new. Within a year it was unusably slow. That's because Google had shipped it with a very low quality SSD. It degraded quickly, resulting in slower and slower speeds. So at least that's one way computers could get slower or keep locking up after some time.


I feel so much of speed is perception. When I first got the internet I had no problem waiting 3-5 mins for webpages to load. Now I expect 3-5 ms. The current context of your experience of speed might influence your expectations when using older equipment.


Batteryless devices are superior to whatever we had in the 90s. Blue screen of death was a daily occurrence in the past. I don't remember having any device issues in the last 10 years. Devices with batteries are having issues with voltage drops.


If the system is otherwise stable, then the first thing that springs to mind for me is storage going bad. Bad sector reads/writes will cause wait times while errors are corrected or mitigated etc but otherwise cause no real issues.


I only experience "slow down/freeze up" with Windows and to a lesser extent, Mac based operating systems. I have never felt that way with a Linux install.


Sounds like a Windows thing in my experience


Memory overflows.

Defragging freed up space for virtual memory


> Ask HN: Why do computers/tablets/etc "freeze up" as they get older?

Because HW gets older. Your access time for RAM will not be the same as when it was new; resistors, capacitors, etc. will change value due to aging. Your semiconductors will experience migration of ions.

The SW is not designed with this in mind, and if some operation does not work or does not finish in a period of time, it will mostly crash.


Resistors drift, that's true. I've seen them drift 10-20% across 70 years. I've seen transistors from the 1960s work just fine.

It's not the hardware. If it was, you'd get random crashes, not slower performance. Everything is locked to the clock signal, and there is plenty of wiggle room, relatively speaking.

It's the OS, and how software gets installed/removed.


Doesn't happen to my computer, which runs FreeBSD and is fairly old by now. X1 Carbon Gen 6.

FreeBSD+XFCE for the win.


My Android phone is doing this now.

I thought this was a bug in firefox, but now it's in the chrome based browser too.


Your iPad has flash memory and a battery. Both are consumable items that wear out over time.


"Wirth's law is an adage on computer performance which states that software is getting slower more rapidly than hardware is becoming faster."

https://en.wikipedia.org/wiki/Wirth%27s_law



I think the OP was hoping for a technical answer


In particular I believe the problem emerged circa 1995 so there must have been a technological and/or social change.

My impression was that the upgrade treadmill for computers started to get established in the late 1980s.

That is, even though they produced a huge number of computers in the Apple ][ line over about 13 years

https://en.wikipedia.org/wiki/Apple_II_series

you couldn't say any of the newer computers were that much better, particularly in terms of raw performance. It was the PC clones that decoupled the system clock from the video clock making it possible to release a machine that was clocked substantially higher than what you could get six months ago, enough that you could buy a new computer in two years that was more than twice as powerful as your old computer.

Circa 2006

https://en.wikipedia.org/wiki/Dennard_scaling

broke down and the industry has had to add multiple cores, GPU acceleration, etc. to be able to make the case that today's computer is better than the one you had before.


Doesn't happen with my Mac. Just make sure you have minimum 16GB.

I still use an iPhone 6A and it works fine.

Stuff like this is one of the key reasons I dumped Windows.


Answers like these are why I can't take Apple fanboys seriously.


I was a Microsoft true believer for a very long time. I started working on DOS in 1987, with Windows 2.0 through Windows 286, Windows 3, 3.11, WFW, OS/2, Windows 95 and Windows NT. I knew DOS and Windows in great depth and spent years resolving the toughest problems with them.

I’ve earned the right to say the truth about Windows.

And yes, MacOS is a far better operating system….. I’ve earned the right to say that too.


broken fan -> overheating, sometimes, maybe


check the ram usage





