Hacker News
Ask HN: How often do you make a fresh OS install?
22 points by kkotic 13 days ago | 81 comments
How often do you do it? For what reasons? Do you format the drive, keep your files, apps?

I think I do a fresh macOS install with a drive format monthly; I'm kind of obsessed...






Never, unless something is corrupted or it's absolutely needed.

Why? Because it takes me weeks to months to get the system configured the same way again - there are hundreds of points to tweak, and so many programs requiring all kinds of packages, files, and customisations when you do full-stack development and have a few hobbies.

I often see the advice "just do a clean install", even on the Apple forums when people have problems, and it always makes me laugh. Surely those people aren't professionals, as "just reinstalling" is insanely expensive in terms of hours wasted on configuration, and you will often still be dealing with problems half a year on, especially with non-App-Store installs, kext files, packages installed via the terminal, and dotfiles, but also just regular programs behaving differently.


This is so true. Every time I get a new (used) laptop, I think "I should do a clean install". Then I start thinking about all the configuration, packages, applications, little scripts, .bashrc, etc. and retreat to cloning my last hard drive over. I think my system is going on 13 years now. It's collected some barnacles for sure, but the price of starting fresh is too high.

> Why? Because it takes me weeks to months to get the system configured in the same way again

Windows I agree, and I avoid reinstalls whenever I can. Linux, I could do one fresh install every six months (Ubuntu release cycle) because restoring my home directory and installing the software I need is extremely easy. I can do that from zero in a couple of hours. Anyway, I settled for Ubuntu LTS, so a fresh install is probably only every few years when I change computers, or the version upgrade fails.


Usually once, when I get the computer, because it probably doesn't have Linux on it.

And even if it does, probably not the version I want

Pretty sure I'm running a Windows 10 install (MBR, not even UEFI) from 2016 or so, making this the longest OS installation I've had. Not sure if OS swaps, like Windows' new update mechanism likes to do, count as a fresh install.

Once SSDs became the norm, Windows rot practically disappeared. Boot time now differs from the boot time right after install by seconds, not minutes, and usability is immediate when the desktop loads.


Yes. Monthly reinstalls used to be somewhat necessary back in the Win95 era, and even into 7 there were some long-running corruption issues (Windows Update itself was a particular offender: as the list of updates grew, it got less efficient), but 10 seems to have finally fixed this.

Mac OS: Every new machine. I have done maybe _five_ reinstalls in around 20 years of using modern (post-9) MacOS, and three were on the same machine while debugging a hardware problem. I stopped using Migration Assistant after moving to High Sierra and just migrate my documents.

Windows: Every year until Windows 10, then nothing until Windows 11 (updates are stable and work fine). Everything of consequence is on OneDrive, so no data to migrate.

Linux on Intel: Comparatively, in between both. Maybe I will nuke & pave 50% of my machines when getting a new Ubuntu LTS, but most I just upgrade (Ubuntu has had a pretty flawless upgrade record for me since... 12.04?). Obviously I have not had a lot of hardware last me that long. Containers greatly reduced the churn of config and services, data is just storage I mount.

Raspberry Pis: about once a year, maybe once every two years. I have had only 2 SD card failures since the Pi 2 came about, and I only upgrade my Raspbian installs - I now use Ubuntu for everything I can.


Since I moved the interesting parts of my home folder into a separate partition, it's pretty painless to do a full install when the time comes. I've been doing clean installs every year or so, for every two Ubuntu releases, more or less, or when I upgrade my laptop's storage (because it has only one device - next one should have two NVMe sockets).

For the Mac, I haven't in a very long time. Again, last time was when I added an SSD for the boot partition (and, accidentally, borked my root folder, but iCloud restored everything in a couple hours).

Windows, oh well... I install it fresh every time because I use the machine for experiments.


How do you deal with system configuration then, like in /etc?

I avoid changing it. I keep my desktops as close as possible to a standard install.

My friends say that I'm paranoid but every time a major macOS version comes out I do a full wipe installation.

I got into that habit back in the day, after a failed in-place upgrade forced it on me. I stopped, as upgrades seem to be more reliable lately, but I also learned to wait for at least one update after the initial release of a new macOS version before I make the jump.

Can you elaborate on your reasoning behind this? I look at the amount of customization made to my system over the years that never made it into a config control system, and I quiver at the thought of starting over every year or so.

Mainly because of these:

1 - I had to install software as .pkg, and I don't like that because I feel like I have no control over what's been installed, and I don't trust software that can't be easily uninstalled

2 - I just like to keep things tidy and clean and after a year of use things can get messy when you run shitty/buggy scripts

3a - It's very easy for me to do (no struggle at all); I have a nice script and dotfiles that set everything up for me, so in an hour my system is ready to use

3b - Everything that's worth keeping is stored either on my NAS or in the cloud, so I don't have to worry about losing stuff when I full-wipe my machine

4 - Over the years I've noticed slightly better performance when clean-installing macOS instead of just upgrading to the new version and potentially bringing over buggy stuff


Because you never know whether your system is compromised. See also: https://www.qubes-os.org/news/2017/04/26/qubes-compromise-re....

Whenever I have new hardware.

Other than that, I didn't need a fresh new install (Debian, Ubuntu) for as long as I can remember.


I did it weekly in the days of Win95-98, but when I began to work with clients circa 2004, I stopped, because it was always more beneficial to restore a broken system than to reinstall it and lose all of the apps and settings. Clients could even ask for a reinstall, only to call me a few days later remembering that thing they forgot to mention/re-check. Reinstalling must be the last option, for when an OS turns FUBAR.

Today my OS installations live through generations of HDD/SSD drives, and I always have a quarterly or so full-system backup in case my system drive fails. All other important data is in the cloud (Steam, GitLab, email, Bitwarden, etc.), so it's a matter of re-downloading. My latest reinstall was when I built a completely new PC in 2018, when I decided to build everything from scratch consciously, just for tech fun (I probably had to do that anyway, because Windows doesn't like massive hardware changes). Before 2018, another system lived on a laptop, which I gifted to my friend. It has around 6 years of install time today, iirc.

> I think I make a fresh macOS with drive format install monthly, I'm kind of obsessed...

With all respect, it looks like a sign of an anxiety-related issue, like niche OCD of some sort. I'd try to understand the driving feeling and deal with it on a subjective level.


For Linux, I almost never reinstall. Most of the package managers I've used don't make it too difficult to:

* verify the checksums of everything installed

* find packages that no longer exist in any enabled repo

* find packages (not explicitly installed) that have no reverse dependencies

* find files no longer owned by any package
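On Arch-like systems (assuming pacman; other package managers have equivalents), the four checks might look roughly like this. The last one has no single flag, so one common approach is a sorted diff; the helper function name here is hypothetical:

```shell
# Hypothetical sketch of the four checks on an Arch-like system:
#
#   pacman -Qkk     # verify installed files against package metadata
#   pacman -Qm      # packages not found in any enabled repo ("foreign")
#   pacman -Qdtq    # dependency-installed packages nothing depends on
#
# "Files owned by no package" can be approximated by diffing a sorted
# filesystem listing against pacman's sorted owned-file list (pacman -Qlq):
list_unowned() {
  # $1 = sorted file listing, $2 = sorted list of package-owned files
  comm -23 "$1" "$2"   # print lines appearing only in the first file
}

# e.g.: find /etc -type f | sort > fs.txt
#       pacman -Qlq | sort > owned.txt
#       list_unowned fs.txt owned.txt
```

comm(1) requires both inputs sorted, hence the explicit sort of each listing.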

As a result, I haven't had too much trouble cleaning up things that might be left behind. My longest running install is going on 11 years now, having been migrated between computers and to/from a VM 8 times.

For Windows 10, I used to reinstall whenever explorer.exe would get stuck with a persistent bug where the metadata gatherer (eg. video thumbnails, mp3 ID3 tags, etc.) would take 10+ seconds per file and Explorer would freeze until it loaded the metadata for every file. I probably reinstalled 6 times before I got frustrated and threw explorer.exe in a debugger to find the cause. Turns out clearing a few registry keys fixes the problem: https://github.com/chenxiaolong/random-scripts/blob/master/c...

Nowadays, I reinstall Windows whenever my VR headset starts to drop frames for seemingly no reason (no Windows, SteamVR, or Nvidia driver updates).


On a physical computer, only when I get a new one.

I find myself semi-regularly setting up a fresh Ubuntu install on a VPS, though (although infrequently enough that I forget half of what I'm supposed to do). I agree with the poster who said it takes months to get things configured the way you want. The number of basic utilities and configurations, plus work-specific software, that doesn't come with a fresh install is huge.


When I change laptops, or when I swap the motherboard in the desktop. In the past, reinstalling Windows was a yearly rite because it had "gone slow" or "corrupt" or whatever. Now I reinstall only if it's absolutely necessary. There is nothing that magically becomes much snappier just by reinstalling, as in the old days; at least not so much as to be worth doing.

Weeks to a couple of months from the last time; I quite enjoy the work of setting everything up as I want and dealing with installation problems. (I use Linux-based systems, often on testing versions.) I like Scheme and would like to try GNU Guix System when I can, but I have non-free network drivers and it's not practical to build it on my device.

Usually never. I'll list a few of the systems in my home:

1. My personal laptop: I did a Windows 10 erase-and-install when I purchased it to get rid of the junk, and split the hard drive for a dual boot. I also installed Fedora initially, but a month later shifted to Debian 11, and haven't touched that since. It has been 6 months.

2. My wife's laptop: It is very old - maybe 10 years. It used to run Windows till 2017. Then it fell into disuse due to a hard drive failure. I revived it with an SSD around the time Linux Mint 20 was released, installed that, and it has never been erased since.

3. My father's and father-in-law's computers: These aren't in my home, but I manage them. They are also a few years old. They run Debian. Started with 9 I think. I updated them to 10 and then to 11. They work flawlessly too.

4. Phones: I erased my phone (OnePlus 6) only once, after OnePlus stopped updates, to install LineageOS. The others haven't been erased since they were bought. All are over 3.5 years old.


With Windows, back in the 2000s: every few days to months, whenever something broke sufficiently that I couldn't figure out how to get the system back to normal. To be fair, I was trying out various tools and techniques to get it "just right", and these would often break things in inscrutable ways.

With Ubuntu, ca. 2004-2014: every six months to a year, because almost every single dist-upgrade would result in a completely broken system (wouldn't boot or wouldn't start X).

With Arch Linux, ca. 2014-2021: never, presumably because each change to the system was small and was installed by a large user base semi-simultaneously, so it was easy to figure out what had gone wrong the few times it did.

With NixOS, 2021-now: none yet, but I'm worried that going back to six-monthly release cycles is going to make each upgrade as unreliable as Ubuntu was. NixOS is in many ways easier to grok than Ubuntu, but I don't yet know the most common failure modes.


On my PC I never migrate between major OS versions; I always wipe and reinstall the OS at least once a year. On my phone I never restore a backup to a new phone; I always begin from a factory reset. My last phone lasted 3 years, so at some point in there I completely wiped it just to keep it cleaned up. I'm old enough to remember installing Windows 3.11 from dozens of 3.5" floppy disks, so a modern PC wipe-and-reinstall barely registers on the drama meter for me.

In the same way that a server should be treated like cattle, not a pet, I treat my PC and phone OS installs. I don't spend so much time configuring them that wiping them hurts my feelings. I prefer the speedup and cleanup benefits over preserving my (non-existent) deep configuration. Just as an example, on my phone I routinely have the OS alphabetize my app icons; trying to keep them arranged in some kind of order just seems like more trouble than it's worth.


Every two years, as I follow a "famous distro" long-term release schedule.

My configuration is around 95+% scripted, so, while still slow to execute, it's deterministic, and I just need to copy/paste.

Generally speaking, I've found that in recent years the amount of configuration I need to update (there is always something) has been shrinking. My guess is that, because Linux is getting more mature, and my own setup is getting more mature as well, there are fewer changes in general.

This is the desktop, however. I also have a mini-server running the server version of the same desktop distro, and I dread updating it; I find that updating configuration for server services is considerably more complex.

Regarding the other questions.

Why: in some cases it's a necessity. I always have modern-ish hardware, and older distros may not support the required kernel out of the box. Out-of-the-box support also helps with doing O/S updates in the long term.

Apps: the set of programs I install and use is stable. I may add something, but the basic programs I use are the same. This is a big advantage (see the "how" section).

How: super simple: I have a script that copies the configuration files to an external disk. I copy the files, format the disk(s), install the O/S, and execute the installation script (which also copies the files back). Important clarification: I run the script by copy/pasting it manually; this is necessary because a new O/S version may cause minor breakages.
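A minimal sketch of what the backup half of such a script might look like - the function name and the file paths are hypothetical examples, not the commenter's actual setup:

```shell
# Hypothetical sketch: copy a fixed list of config files to a backup
# destination (e.g. an external disk). Paths passed in are examples only.
backup_configs() {
  dest="$1"; shift        # first argument: backup destination directory
  mkdir -p "$dest"
  for f in "$@"; do       # remaining arguments: files to back up
    if [ -e "$f" ]; then
      cp -a "$f" "$dest/" # -a preserves permissions and timestamps
    fi
  done
}

# Example usage (illustrative paths):
#   backup_configs /media/external/config-backup ~/.bashrc ~/.gitconfig
```

The restore half would be the same copy in reverse, run right after the fresh install.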

Regarding the "how", an important thing is that I update the script when I add a new program or there is a configuration to update. This considerably reduces the work I need to do on O/S updates.

Screw centralized configuration databases like Gconf, which some programs use "just because" (plaintext is fine; speed of configuration access is something rarely required, if ever). And screw O/Ss that don't allow this level of customization ;)


Whenever my OS hits its EOL.

For desktop computers: typically I use either Windows or some Ubuntu derivative (usually with XFCE), so both are pretty long-term options with regular updates. I might ditch Windows 10 after it hits EOL and move over to just Linux, or maybe keep using Windows 10 without networking for older games (in lieu of finding a better solution, like a VM with GPU passthrough, because the old games that I like refuse to run on Linux without significant effort).

For servers: same reasoning. Previously I ran CentOS (RIP), but nowadays it's also either Debian or Ubuntu. Right now I'm thinking of trying out Debian LTS (a community effort over at https://www.debian.org/lts/) and afterwards just migrating over to Ubuntu across all of my nodes, because the longer release cycle is more favourable for my needs.

Then again, I primarily use my servers as just hosts for Docker containers, so I can have clear separation between the software and the versions that I need, as well as more flexibility for a compartmentalized setup where I might want 10 different versions of MySQL with resource limits for each, for all sorts of stable software. Thus, if I ever need to use something old/EOL as well, I can just throw it into a container and (slightly) mitigate the possible fallout.

Relatively little automation to speak of currently, because of how simple everything is: fail2ban, Zabbix, Docker Swarm/Kubernetes cluster join, sometimes ddclient and a bunch of tools, sometimes additional setup for backups and additional HDDs (in the homelab, not for cloud VPSes). I would introduce Ansible otherwise, like I have at my workplace.

In short: whatever the EOL is for the device (currently ~5 years for most), though I focus on a stable and vaguely secure baseline over bringing over my exact configuration; the latter doesn't matter much to me. Then again, I don't rice my desktop either, and I treat everything as a boring set of tools to solve problems with.


I almost never do, as it's a bit of a pain to get everything set up again. I've been working a little bit each day on building a NixOS config file with everything I need, though, both to learn it (I like the declarative model) and to quickly have my system set up the way I like it anywhere, including my laptop, which I use only rarely, and any VPS I set up for myself. I have nightmares about losing my main SSD and having to rewrite all my scripts and track down all those obscure dconf settings again, so it seems smart to get it right while I still have a system that is set up the way I like it, rather than waiting until I don't.

If you mean installing it onto an HDD/SSD/NVMe, then I haven't for a few years now.

I boot it into RAM from a USB keychain or SD card, do updates in RAM, and occasionally 'remaster' that back onto the (removable) boot media.

Benefits: everything is always f...ing FAST!

Disadvantages: limited by available RAM. Starts being practical with 8GB, where 'OS' can take anything from about 700MB to 1.8GB, depending on my needs.

Good for uptimes of 3 to 4 months on the 'desktop'; after that a kernel update seems prudent anyway. Anything else can be handled by stopping everything and logging out, then logging back in. Takes a minute or so.

Spoken from antiX & MX Linux.


Never. In 2008 I switched to Mac (OS X Leopard on Core2Duo iMac) and I have done in-place upgrades and when buying new hardware migrated that install to new machines. Currently running Monterey on an M1 MacBook Air.

I had my Debian running (updated) for almost 10 years. The PC shut down only a few times, mostly when there were power outages. One day I was gifted a new SSD (I was still using an HDD), so that day I decided I would install the new disk and make a fresh installation too. I guess I could have cloned the old HDD to the new SSD, but I've been bitten a few times by those clone tools, so I try not to use them when cloning different disk models.

I honestly don't know how you can live with a PC that you need to format every month.


Never. On my Windows PC, my last fresh install was on Vista, and since then, I have made in-place upgrades and image restores when switching to new system drives. On macOS, I also never needed to clean install.

Yeah, same.

Back 20 years ago, I used to reinstall Windows every twelve to eighteen months. There's just no need any more.

Of course, I don't install warez any more, that probably helps. But mostly Windows just turned into a real operating system about 10 years ago.

I admit I've been starting to think about a reinstall, though. My WSL-1 install is so old that the instructions on how to upgrade to WSL-2 don't work, and I can't figure out how to make them work. So maybe a reinstall would do it.


What's your preferred way to make image archives on Windows?

Initially Acronis, but then they dumbed down their home version and started making marginal paid upgrades every year. I am currently on Macrium Reflect, and so far I have had no issues with it.

thank you!

Never; I version my dotfiles and use aconfmgr (an Arch-specific semi-declarative config manager) for system stuff. So (the goal is that) a 'fresh OS install' would be the same anyway.

I've been doing an in-place install of each new macOS release since the 2010s, and a CCC clone of the filesystem when moving to a new Intel Mac. An OS reinstall on a Mac is pointless if there are no issues.

For Linux, I keep my data on a separate partition so I can mount it from whatever distro I've installed elsewhere; I just add the partition to /etc/fstab and mount it over /home. I've been doing that since the 2000s.
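For reference, that setup boils down to a single fstab entry; a hypothetical sketch, where the UUID is a placeholder and ext4 is just an example filesystem:

```
# /etc/fstab: shared data partition mounted over /home, survives reinstalls
# <device>                                  <mount>  <type>  <options>  <dump> <pass>
UUID=00000000-0000-0000-0000-000000000000   /home    ext4    defaults   0      2
```

The pass value of 2 tells fsck to check this partition after the root filesystem.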

Windows I only run in a VM or on the work computer, so it's a fresh install for the former, and the latter is not my problem.


As infrequently as possible. I usually reinstall my main OS (Manjaro atm) every two years or so, mostly to upgrade the desktop environment (this sometimes doesn't work properly, even though it is a rolling-release distro).

I recently had to install Linux four times, because I had some trouble with my Nvidia GPU + AMD onboard graphics card setup.

Windows seems to be working just fine without ever setting it up afresh. But this could also be related to the fact that I'm using it for gaming only.


Not since 2011 when I upgraded from Debian i386 to amd64. These days a reinstall wouldn't have been necessary for that, since we have cross-grading now. So I could probably buy an ARM machine, install qemu-user-static, cross-grade to ARM, swap over the drive and boot Debian arm64 from the same drive.

https://wiki.debian.org/CrossGrading


Far less frequently than before. In the '90s I did a Windows reinstall at least once every 6 months. I bought my most recent laptop about 3 years ago and have done only one fresh install, to upgrade the Home version to Pro. I have never done a Windows upgrade from one major version to another.

I'm also not planning to upgrade to Windows 11, because my laptop's Ryzen CPU is not supported by Microsoft.


I only do fresh installs for actually new systems (not replacing an existing system, which just gets the disk moved or copied), and it might happen if I lost a drive that wasn't backed up. Or if a system switches operating systems, or if I break things bad enough and can't restore.

I do a new VM now and again to test things, but in terms of my daily driver OS, I never do a reinstall. No real issues that would require it.

Keep in mind that a fresh install normally means you don't quite know what's wrong. With some experience you figure that out and fix it instead, rather than doing a fresh install and not knowing if the problem will recur.


I reinstall every time I feel there is too much bloat on my machine. At least 2 times a year.

I have everything I need in an installation script (a separate script each for NixOS, Windows & Mac) that I just need to run after a reinstall to get up and running. Big changes I make to the system are added to the script.


I do it only if I really have to, so every few years. When I was young I spent weeks compiling my custom Linux kernel for my Slackware installation, and I really enjoyed it, but now I'm 38, with a full-time job, a house, and a family, and I don't have time for this anymore.

Probably over a decade...

But my brother's story is more interesting. My little bro used to be on Windows and frequently installed and uninstalled software, so every 2-3 months he would do a fresh install with a drive format. Then I showed him Linux (Ubuntu, TBH) and he joined the "years without a reinstall" club.


Roughly once every new computer. Although I will often take that chance to try 3 or 4 different Linux distros, just to see what works best and whether any of them are doing anything new and interesting. Once I pick one, though, I will generally keep it until the next major hardware upgrade.

Every time a new Qubes OS is released. 4.1 is out soon: https://www.qubes-os.org/news/2021/12/21/qubes-4-1-rc3/.

Never, unless I accidentally hit rm -rf on my system. Even though it is easy to recover, as I have backups automated to my external drive as well as the cloud, it would take me a few hours to return my system configuration to its original state, and that is not fun anymore.

Only if necessary to set up a new machine. The most recent was getting Raspbian running, then installing Lazarus/Free Pascal, on a Raspberry Pi Zero W (the old/cheaper one), all on a single 16 GB microSD card.

I've got two 32 GB cards to move to should I find the need to do so.


Rarely. I've had my macOS installation since the first Intel Mac; since then I've always cloned the drive, and now (a surprise for me) I used Migration Assistant for the ARM Mac and it worked perfectly.

Five or six years ago, Migration Assistant was barely usable compared to now!


Almost never. I try not to install anything outside of Docker, either. I feel that the future is a completely defined dev environment, although with Docker the experience isn't necessarily great, especially as regards giving access to hardware devices.

I used to do a fresh Windows install once or twice a year for various reasons. I switched to Mac about 10 years ago and I rarely mess with the OS aside from the usual updates. I have done fresh installs on older MacBooks before I sell them, though.

Now that I have archlinux on both home and work PCs, basically never. Rolling releases FTW!

I've never needed to reinstall an OS. Current is macOS (2 years), and before that I used Xubuntu (for 2 years) and Windows 10 (started using it when it was Insider only). I don't remember needing to reinstall Windows 7 either.

At work: once a year for the PCs used by students and running Windows - faster and more reliable this way; the PCs running Linux only when changing the HDD. At home: almost never.

Never. I settled on Arch almost ten years ago when I got the laptop I've used every day since then. Never formatted; what would the benefits be? I back up my data and dotfiles on an external drive.

When I build a new PC or upgrade to a larger SSD (though lately I've just been cloning). Modern operating systems are quite resilient; I haven't had any need to reinstall in many years.

Whenever I buy a new computer. It just seems like a good time to start afresh and only copy over the personal files I want to hang on to from the old machine.

On my main device, almost never, unless and until the hard disk crashes and I am left in the dust. On my servers, every time I crash them accidentally.

Bought the Mac in 2018 and not a single reinstall since. The system is almost always on. Essentially never, unless the laptop goes in for a service.

Linux: only once, when there was no upgrade path to the newer distro version.

MacOS: never.

Windows: at least a dozen times (and I haven't used Windows for five years).


I make a lot of VMs sometimes, to try things out. Resetting my main OS? Never, but I think it's needed now as it runs slow af.

Almost never. Since I am on NixOS, every system configuration 'edition' is reproducible and easy to roll back to.

With NixOS, you can even reinstall your system on every boot: https://grahamc.com/blog/erase-your-darlings (I don't have this setup myself)

My Ubuntu was installed at 10.x; currently it's on 22.04. It has seen 4 or 5 machines.

On Windows, about once a year. Now that I'm on Pop!_OS: 1.5 years and counting.

Every other LTS, one year after it is released (usually after .1 is out)

There’s little need now with macOS as the system volume is readonly

That’s a very good point, and relevant too! I wonder if that implementation makes macOS development easier or harder.

What benefit do you believe you gain from such frequency?

Absolutely nothing; I believe it's a hard obsession with making things as clean as possible.

"Making things clean" implies that reinstalling removes something that has built up. Computers are not broom closets, there's no dirt or grime that accumulates. Anything on the install is something you put there directly or indirectly, and can be removed just the same.

> or indirectly

This is the problematic part.


Every time I find a cool-looking distro/derivative

Every year with Windows; never since switching to Macs.

Pfft, monthly is not bad but you could take advantage of the weekends and do them on a weekly basis. It's what I do and it has served me well for the last decade or so.

Once a year, when a new MacOS comes out.

Never, I like rolling releases.

About once a year

Yearly


