I think I do a fresh macOS install, with a full drive format, monthly. I'm kind of obsessed...
Why? Because it takes me weeks to months to get the system configured the same way again - there are hundreds of points to tweak, and so many programs requiring all kinds of packages, files and customisations when you do full-stack development and have a few hobbies.
I often see the advice "just do a clean install", even on the Apple forums when people have problems, and it always makes me laugh. Surely those people aren't professionals, because "just reinstalling" is insanely expensive in hours wasted on configuration, and you will often still be chasing problems half a year later - especially with non-App-Store installs, kext files, packages installed via the terminal, and dotfiles, but also just regular programs behaving differently.
Windows I agree, and I avoid reinstalls whenever I can. Linux, I could do one fresh install every six months (Ubuntu release cycle) because restoring my home directory and installing the software I need is extremely easy. I can do that from zero in a couple of hours. Anyway, I settled for Ubuntu LTS, so a fresh install is probably only every few years when I change computers, or the version upgrade fails.
Once SSDs became the norm, Windows rot practically disappeared. Boot time now differs from boot time on a fresh install by seconds, not minutes, and the machine is usable the moment the desktop loads.
Windows: Every year until Windows 10, then nothing until Windows 11 (updates are stable and work fine). Everything of consequence is on OneDrive, so no data to migrate.
Linux on Intel: Comparatively, in between both. Maybe I will nuke & pave 50% of my machines when getting a new Ubuntu LTS, but most I just upgrade (Ubuntu has had a pretty flawless upgrade record for me since... 12.04?). Obviously I have not had a lot of hardware last me that long. Containers greatly reduced the churn of config and services, data is just storage I mount.
Raspberry Pis: about once a year, maybe once every two years. I have had only 2 SD card failures since the Pi 2 came about, and I only upgrade my Raspbian installs - I now use Ubuntu for everything I can.
For the Mac, I haven't in a very long time. Again, last time was when I added an SSD for the boot partition (and, accidentally, borked my root folder, but iCloud restored everything in a couple hours).
Windows, oh well... I install it fresh every time because I use the machine for experiments.
1 - I had to install software as .pkg, and I don't like that because I feel I have no control over what's been installed, and I don't trust software that can't be easily uninstalled
2 - I just like to keep things tidy and clean and after a year of use things can get messy when you run shitty/buggy scripts
3a - It's very easy for me to do (no struggle at all), I have a nice script and dotfiles that set everything up for me, so in an hour my system is ready to use
3b - Everything that's worth keeping is stored either on my NAS or in the cloud, so I don't have to worry about losing stuff when I fully wipe my machine
4 - Over the years I've noticed slightly better performance when clean-installing macOS instead of just upgrading to the new version and potentially bringing over buggy stuff
Other than that, I didn't need a fresh new install (Debian, Ubuntu) for as long as I can remember.
Today my OS installations live through generations of HDD/SSD drives, and I always have a roughly quarterly full-system backup in case my system drive fails. All other important data is in the cloud (Steam, GitLab, email, Bitwarden, etc.), so it's just a matter of re-downloading. My latest reinstall was when I built a completely new PC in 2018 and decided to build everything from scratch consciously, just for tech fun (I probably had to do that anyway, because Windows doesn't like massive hardware changes). Before 2018, another system lived on a laptop, which I gifted to a friend. It has around 6 years of install time today, iirc.
With all respect, it looks like a sign of an anxiety-related issue, like niche OCD of some sort. I'd try to understand the driving feeling and deal with it on a subjective level.
* verify the checksums of everything installed
* find packages that no longer exist in any enabled repo
* find packages (not explicitly installed) that have no reverse dependencies
* find files no longer owned by any package
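The checks above can be sketched roughly as follows, assuming Arch's pacman (the comment doesn't name the distro; dpkg and rpm have rough equivalents). The `/tmp` paths and the directory argument are illustrative:

```shell
#!/bin/sh
# Sketch of the package-hygiene checks described above, assuming pacman.

verify_checksums() {
    # compare every installed file against the package database
    # (size, permissions, checksums)
    pacman -Qkk
}

list_foreign() {
    # packages installed locally but absent from every enabled repo
    pacman -Qm
}

list_orphans() {
    # packages pulled in as dependencies that nothing depends on anymore
    pacman -Qdtq
}

list_unowned() {
    # files under a directory that no package claims ownership of
    find "$1" -type f | sort > /tmp/all-files
    pacman -Qlq | sort > /tmp/owned-files
    comm -23 /tmp/all-files /tmp/owned-files
}
```

Orphans found by `list_orphans` can then be removed with `pacman -Rns $(pacman -Qdtq)` after a manual sanity check.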
As a result, I haven't had too much trouble cleaning up things that might be left behind. My longest running install is going on 11 years now, having been migrated between computers and to/from a VM 8 times.
For Windows 10, I used to reinstall whenever explorer.exe would get stuck with a persistent bug where the metadata gatherer (eg. video thumbnails, mp3 ID3 tags, etc.) would take 10+ seconds per file and Explorer would freeze until it loaded the metadata for every file. I probably reinstalled 6 times before I got frustrated and threw explorer.exe in a debugger to find the cause. Turns out clearing a few registry keys fixes the problem: https://github.com/chenxiaolong/random-scripts/blob/master/c...
Nowadays, I reinstall Windows whenever my VR headset starts to drop frames for seemingly no reason (no Windows, SteamVR, or Nvidia driver updates).
I find myself semi-regularly setting up a fresh ubuntu install on a VPS though (although infrequently enough that I forget half of what I'm supposed to do). I agree with the poster that said it takes months to get things configured the way you want. The number of basic utilities and configurations, plus work specific software that doesn't come with a fresh install is huge.
1. My personal laptop: I did a Windows 10 erase-and-install when I purchased it to get rid of the junk, and split the hard drive for a dual boot. I also installed Fedora initially, but a month later shifted to Debian 11, and have not touched that since. It has been 6 months.
2. My wife's laptop: It is very old - maybe 10 years. It used to run Windows till 2017. Then it fell into disuse due to a hard drive failure. I revived it with an SSD around the time Linux Mint 20 released, installed it, and it has never been erased since.
3. My father's and father-in-law's computers: These aren't in my home, but I manage them. They are also a few years old. They run Debian. Started with 9 I think. I updated them to 10 and then to 11. They work flawlessly too.
4. Phones: I erased my phone (Oneplus 6) only once after Oneplus stopped updates to install LineageOS. Others haven't been erased since they've been bought. All are over 3.5 years old.
With Ubuntu, ca. 2004-2014: every six months to a year, because almost every single dist-upgrade would result in a completely broken system (wouldn't boot or wouldn't start X).
With Arch Linux, ca. 2014-2021: never, presumably because each change to the system was small and was installed by a large user base semi-simultaneously, so it was easy to figure out what had gone wrong the few times it did.
With NixOS, 2021-now: none yet, but I'm worried that going back to six-monthly release cycles is going to make each upgrade as unreliable as Ubuntu was. NixOS is in many ways easier to grok than Ubuntu, but I don't yet know the most common failure modes.
In the same way that a server should be treated like cattle, not a pet, I treat my PC and phone OS installs the same way. I don't spend so much time configuring them that wiping them hurts my feelings. I prefer the speedup and cleanup benefits over preserving my (non-existent) deep configuration. Just as an example, on my phone, I routinely have the OS alphabetize my app icons. Trying to keep them in some kind of curated order just seems like more trouble than it's worth.
My configuration is around 95+% scripted, so, while still slow to execute, it's deterministic, and I just need to copy/paste.
Generally speaking, I've found that in recent years, the amount of configuration I need to update (there is always something) keeps shrinking. My guess is that, because Linux is getting more mature, and my own setup is getting more mature as well, there are fewer changes in general.
This is desktop, however. I also have a mini-server, server version of the same desktop distro, and I dread updating it; I find that updating configuration for server services is considerably more complex.
Regarding the other questions.
Why: in some cases it's a necessity. I always have modern-ish hardware, and older distros may not ship, out of the box, a kernel that supports it. Out-of-the-box support also makes O/S updates easier in the long term.
Apps: the set of programs I install and use are stable. I may add something, but the basic programs I use are the same. This is a big advantage (see "how" section).
How: super-simple: I have a script to copy the configuration files to an external disk. I copy the files, format the disk(s), install the O/S, and execute the installation script (which also copies the files back). Important clarification: I copy/paste the script manually; this is necessary because a new O/S version may cause minor breakages.
Regarding the "how", an important thing is that I update the script when I add a new program or there is a configuration to update. This considerably reduces the work I need to do on O/S updates.
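The copy-out/reinstall/copy-back flow described above might look something like this minimal sketch (the backup path and file list are made-up assumptions, not the actual setup):

```shell
#!/bin/sh
# Sketch of the "copy configs out, reinstall, copy back" flow.
# BACKUP and CONFIGS are illustrative defaults; override via environment.
BACKUP="${BACKUP:-/mnt/external/config-backup}"
CONFIGS="${CONFIGS:-.bashrc .gitconfig .vimrc}"

backup_configs() {
    # run before wiping the disk, with the external drive mounted
    mkdir -p "$BACKUP"
    for f in $CONFIGS; do
        if [ -e "$HOME/$f" ]; then cp -a "$HOME/$f" "$BACKUP/"; fi
    done
}

restore_configs() {
    # run after the fresh install, once the external drive is mounted again
    for f in $CONFIGS; do
        if [ -e "$BACKUP/$f" ]; then cp -a "$BACKUP/$f" "$HOME/"; fi
    done
}
```

Keeping the file list in one place is what makes "update the script when I add a new program" cheap: adding a path to CONFIGS covers both directions.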
Screw centralized configuration databases like Gconf, which some programs use "just because" (plaintext is fine; speed of configuration access is something rarely required, if ever). And screw O/Ss that don't allow this level of customization ;)
For desktop computers: typically i use either Windows, or some Ubuntu derivative (typically with XFCE), so both are pretty long term options with regular updates. Might ditch Windows 10 after it hits EOL and move over to just Linux, or maybe just use Windows 10 without networking for older games (in lieu of finding a better solution, like a VM with GPU passthrough because the old games that i like refuse to run on Linux without significant effort).
For servers: same reasoning, previously i ran CentOS (RIP), but nowadays it's also either Debian or Ubuntu. Right now i'm thinking of trying out Debian LTS (community effort over at https://www.debian.org/lts/) and afterwards just migrating over to Ubuntu across all of my nodes because the longer release cycle is more favourable for my needs.
Then again, i primarily use my servers as just hosts for Docker containers, so i can have clear separation between the software and the versions that i need, as well as more flexibility for a compartmentalized setup where i might want 10 different versions of MySQL with resource limits for each, for all sorts of stable software. Thus, if i ever need to use something old/EOL as well, i can just throw it into a container and (slightly) mitigate the possible fallout.
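A sketch of that kind of compartmentalized setup, as a Compose file pinning two MySQL versions with their own memory caps (service names, limits, and the password are made up; under Swarm the limits would move to `deploy.resources` instead):

```yaml
# Illustrative docker-compose.yml: two isolated MySQL versions,
# each with its own memory cap.
services:
  mysql57:
    image: mysql:5.7
    mem_limit: 512m
    environment:
      MYSQL_ROOT_PASSWORD: example
    volumes:
      - mysql57-data:/var/lib/mysql
  mysql80:
    image: mysql:8.0
    mem_limit: 512m
    environment:
      MYSQL_ROOT_PASSWORD: example
    volumes:
      - mysql80-data:/var/lib/mysql

volumes:
  mysql57-data:
  mysql80-data:
```

Because each version gets its own named volume, wiping or upgrading one instance never touches the others' data.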
Relatively little automation to speak of currently because of how simple everything is: fail2ban, Zabbix, Docker Swarm/Kubernetes cluster join, sometimes ddclient and a bunch of tools, sometimes additional setup for backups and additional HDDs (in homelab, not for cloud VPSes). Would introduce Ansible otherwise, like i have at my workplace.
In short: whatever the EOL is for the device (currently ~5 years for most), though i focus on a stable and vaguely secure baseline over bringing over my exact configuration, the latter doesn't matter to me much. Then again, i don't rice my desktop either and treat everything as a boring set of tools to solve problems with.
I'm booting it into RAM from a USB keychain or SD card, do updates in RAM, and occasionally 'remaster' that back onto the (removable) boot media.
Benefits: everything is always f...ing FAST!
Disadvantages: limited by available RAM. Starts being practical with 8GB, where 'OS' can take anything from about 700MB to 1.8GB, depending on my needs.
Good for uptimes of 3 to 4 months on 'Desktop', after that a kernel-update seems prudent anyways. Anything else can be handled by stopping everything and logging out, then logging back in. Takes a minute, or so.
Spoken from Antix- & MX-Linux.
I honestly don't know how you can live with a PC that you need to format every month.
Back 20 years ago, I used to reinstall Windows every twelve to eighteen months. There's just no need any more.
Of course, I don't install warez any more, that probably helps. But mostly Windows just turned into a real operating system about 10 years ago.
I admit I've been starting to think about a reinstall, though. My WSL-1 install is so old that the instructions on how to upgrade to WSL-2 don't work, and I can't figure out how to make them work. So maybe a reinstall would do it.
For Linux I keep data on a separate partition so I can mount it via whatever distro I installed elsewhere, then I add the partition to /etc/fstab and mount over /home. Been doing that since 2000s.
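The shared-/home setup described above boils down to a single fstab line per distro, something like the following (the UUID is a placeholder; the real one comes from `blkid` on the data partition):

```
# /etc/fstab entry mounting the shared data partition over /home
# (placeholder UUID and filesystem type)
UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx  /home  ext4  defaults  0  2
```

After adding the line, `mount /home` (or a reboot) picks it up, and every distro that carries the same entry sees the same home data.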
Windows I only run in a VM or on the work computer, so it's a fresh install for the former, and the latter is not my problem.
I recently had to install Linux four times, because I had some trouble with my Nvidia GPU + AMD onboard graphics card setup.
Windows seems to be working just fine without ever setting it up afresh. But this could also be related to the fact that I'm using it for gaming only.
I'm also not planning to upgrade to Windows 11, because my laptop's Ryzen CPU is not supported by Microsoft.
Keep in mind that a fresh install normally means you don't quite know what's wrong. With some experience you figure it out and fix it, instead of doing a fresh install and not knowing whether the problem will recur.
I have everything I need in an installation script (a separate script each for NixOS, Windows & Mac) that I just need to run after a reinstall to get up and running. Big changes I make on the system are added to the script.
But more interesting is my brother's story. My little brother used to be on Windows, constantly installing and uninstalling software, so every 2-3 months he would do a fresh install with a drive format.
Then I showed him Linux (Ubuntu, TBH) and he joined the "years without a reinstall" club.
I've got two 32 GB cards to move to, should I find the need to do so.
5/6 years ago Migration Assistant was barely usable compared to now!
Windows: at least a dozen times (and I haven't used Windows for five years).
This is the problematic part.