> This is where many people will tell me “Okay but not everything works outside the box”, true! but which OS works outside the box these days anyway? Windows is still a nightmare, setting up macOS took me 3 days the last time, Linux takes way more if you’re building it from scratch. Setting up FreeBSD took me 3 days, however this meant that I will NOT need to change it again for a very, very, VERY long time.
Why is Windows a nightmare? I use Windows 10 on a PC I built and it's fine. I also use it for dev using the Linux subsystem with WSL2 and Powershell.
Why did setting up macOS take 3 days? I don't even think it would take 1 day if I set up a new machine from scratch.
Why are you building Linux from scratch for comparison??? It took me two hours to set up Debian by hand yesterday.
And then they say FreeBSD also took 3 days of setup (?!), but that's alright because they won't have to change it going forward. Leaving aside my skepticism of that last part, the same probably applies to macOS.
EDIT: I think the author is actually better served by FreeBSD, given that they were hand-modifying persistent packet-filter rules. That is not a thing I would suggest someone do on a macOS machine. But the head-scratcher for me is why they would try to do this on macOS (or Windows) in the first place, and why setup times took so long besides the breaking updates.
I paid $200 for a Windows 10 pro license because I needed to run a Windows application in a virtual machine. I was very unsatisfied with my purchase. I would describe Windows as “a nightmare”. Here are my top three complaints about Windows.
* There are advertisements built into the operating system.
* The operating system often restarts itself without the user's permission. It will restart itself even if a user-launched application is running.
* There is still no central repository for useful applications. Managing and keeping all software updated takes a lot of manual work.
This was a huge problem for me. Some multiaxis CNC toolpaths require days to generate. I missed a critical deadline as a result of Windows 10 deciding to update itself, which finally prompted me to install ShutUp10 from O&O.
I use the maximum shit-disabling mode in ShutUp10. It works. Once or twice a year, I run Windows Update and get updates, then I run ShutUp10 again and disable all the trash Microsoft re-enables.
I'd rather describe this as a mail-in-rebate for the costs of running Windows 10.
We sort of implicitly know there is this need to make the Windows lifestyle more sane, and can rely on tools like these existing.
But the time it takes to research this, track down the best binary for the job (examining the source code is not always easy), makes me conclude that there's a real hidden cost to being a Windows user.
The most impressive thing about Microsoft as a company is the way it single-handedly lowered user expectations to the point where ads, security issues, and critical time-wasting failures are somehow considered an acceptable price of entry, and not evidence of an unacceptably shoddy, incompetent, and user-hostile product culture.
The relevant dichotomy is more like: people who run Windows aren't the buyers. One-off personal licenses for home PCs are more than a rounding error but are certainly not what made Microsoft what it is.
Governments and F500 companies buy Windows and Office for X00,000 machines for X0 years of support at a time. Enterprise procurement teams are the actual buyers whose opinions matter to product managers.
Does MS sell their telemetry data?
> I paid $200 for a Windows 10 pro license because I needed to run a Windows application in a virtual machine.
I think if someone is willing to cheat on the licensing cost they might just as well go with a pirated XP?
Remember that time Microsoft let FTDI brick a bunch of knockoffs through the first party update mechanism? Remember when they locked up a bunch of embedded devices with Windows 7 support nags? Remember when they dropped Candy Crush in your start menu, when they decided local accounts now had to be cloud linked, and when they enabled Cortana by default and made it increasingly difficult to opt-out? When they decided to take 30 minutes of your morning without asking (hope you weren't planning on using the computer for anything important)?
When it comes to high-reliability embedded OSes, Microsoft is a case study in inept paternalism. Updates regularly cause problems. Between updates and malware spreading behind a NAT, I'm not at all convinced updates are the lesser of two evils. Ideally, these applications wouldn't run windows, but since they often do, IMO the best approach is to isolate them to the greatest degree possible which includes blocking auto-update (note: not turning it off, blocking it, along with everything else you can get away with).
I've also rebooted to find out about Microsoft Edge, the new browser that's now pinned to my taskbar, even after removing Edge several times.
MS finished making Windows into a stable, usable OS and then promptly began turning it into adware.
I just installed a new Windows 10 VM last week and there was no option to use a local account anymore. None. No dark-pattern menu link hidden somewhere in a corner of the screen with low contrast, only the choice between logging in or creating a new Microsoft account. Which took me longer than the whole rest of the installation, because as it turns out generic addresses like "email@example.com" are already taken and Microsoft seems very anal about certain choice words.
But at the same time they don't seem to mind if I enter a date from 2018 as my birthdate, and the calendar dialog includes decades of future dates, but at least they got the forced online
> Between updates and malware spreading behind a NAT, I'm not at all convinced updates are the lesser of two evils.
You can get your answer looking back to events like Nimda and Code Red.
I've run a rolling distro since 2018. Development got a whole lot simpler, everything is more enjoyable, and to date there's been exactly one noteworthy issue, which was resolved after a quick search and 10 minutes.
And since then I've never seen a "new update available" popup dialog, no ads, no Cortana, no "smart" features, and most definitely no need to install third-party software to actually have a usable file search. I even update much more frequently than Windows 10 ever forced me to, because for some reason I just never have any issues and I can just do it in the
I wish Windows would let me treat it like a car.
* I log in.
* I start setting things up so I can start work.
* I'm notified about critical updates. There will be a forced reboot soon...
* I watch out in case something pops up while I'm typing and whatever key I was about to press answers the 'reboot' question for me - then I wait several minutes and lose flow entirely.
* I start setting things up again as they were so I can start work again.
* There are more critical updates...
Turning off auto-update is not an option, because there are so many security holes and I don't want to end up a victim. I also don't like having to fight to keep the OS from doing something it will push back against. Creeping around it - letting it do updates when I'm not busy, and reboots when I want them rather than when it decides - is hard work and stressful.
Windows is an amazing piece of tech and gets better all the time, but I find myself much more able to stay in the 'flow' and avoid stress in MacOS or Linux.
I think I get it why it wants to reboot: not only is it probably somewhat safer in terms of not screwing anything up in running session, but you can't be sure some running application or service isn't still using an outdated version of a library until you've restarted it. (Not to mention that the Linux kernel updates every week or so nowadays, but those are definitely not the only ones that trigger an update [edit: by which I of course mean "trigger a reboot"].)
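The stale-library point is easy to demonstrate on a Unix-like system: a process that already has a file open keeps seeing the pre-update content even after the file is replaced on disk. A minimal Python sketch of the mechanism (a stand-in file, not a real shared library):

```python
import os
import tempfile

# A stand-in for a shared library file on disk.
path = os.path.join(tempfile.mkdtemp(), "libdemo.so.txt")
with open(path, "w") as f:
    f.write("old version")

# A "running process" holds the file open, like a loaded library.
handle = open(path)

# An update replaces the file on disk (new inode, same path).
os.remove(path)
with open(path, "w") as f:
    f.write("new version")

# The already-open handle still sees the pre-update content; only a
# restart (i.e. reopening the file) picks up the new version.
old_view = handle.read()
new_view = open(path).read()
print(old_view)  # -> old version
print(new_view)  # -> new version
```

That's why the reboot exists: until every process reopens its libraries, you can't be sure the patched code is actually in use.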
I tend to not use the graphical software updater and just ignore its notifications instead, and just update from the command line and reboot when it suits me. That does allow me to not have my workflow interrupted but it doesn't change the fact that I still do get notified of updates that require some kind of action pretty often.
On the other hand, checking for those updates doesn't burn minutes on end of CPU time every time the OS is booted, as it seems to do on Windows 10.
But no, obviously Fedora doesn't force a reboot, nor does it give a prompt you can accidentally reboot through.
However, a part of the complaints (which I fully understand) seemed to be about the frequency of the updates, and in the name of honesty I just wanted to point out that's not really just a Windows issue.
I have said it before, but around half of the machines in my estate are macs, and I have many more reported update problems from them. I think a lot of it is caused by inconsistent updates.
I've been running one at home since the first TP in 2006, and I don't spend any time thinking about or maintaining them (running on Debian)
I wish that was the case.
It may not be a perfect situation but I don't find it to be quite as dreadful as many make it out to be. FreeBSD is my other OS and I'll be honest, I dread doing those updates much more.
Windows 10 hasn't always warned you about an update. Once I lost half a day's work while visiting family. I had been doing some genealogy work when some small kids came over, so I put my laptop to sleep (shut the lid) and went about my day, not thinking to save. The next day, I opened my laptop and was greeted with the "Hi! We are setting things up for you" screen. I was very upset to have to redo the previous day's work.
You may not always be at the computer if you're waiting for some kind of a long-term process to finish.
No store, no advertisements, no telemetry, Edge optional, all Enterprise features enabled, security updates only (defer fully configurable), out of the box.
I use it at home and with my family members.
Of course, you're not the only one with this issue. There used to be an option in "gpedit" to disable this, but I can't find it anymore. I suspect that this is because there is now an option in the "advanced options" of the "Windows update" menu, called "restart this device as soon as possible".
It's very SV-bubble to assume that Microsoft's security needs are more important than him getting his work done.
As he stated, he missed a critical deadline. What if the client cancelled the contract because of that, causing him to go out of business? How does trying to sympathize with Microsoft feed his family?
No, the tech bubble is not more important than things that happen in the real world. Windows has security problems? Sure, all operating systems do. But Microsoft shouldn't shift its problems onto the users. That's just bad business, and bad ethics.
I don't like the automatic reboots, and I'd agree it's not a good solution and I probably wouldn't have implemented it if I were making the decision, but I don't really see how finding reasons for why MS might reasonably want to do it is a SV bubble thing.
The bubble isn't a geography, it's a mindset.
Once a month I update my work PC and three home PCs, all on the same day. Easy to remember, saves surprise updates and reboots.
Similar to how getting a vaccination is not only good for you, but also for the rest of society.
Mind you, I'm saying this from a perspective of someone who uses his machine to perform nightly jobs (learning models, ontology alignment, etc) So it's not like I haven't shared your pain =)
Thanks for the explanation, but I'm well aware of malware. I wrote my own boot sector viruses back in 8-bit days.
But the point still stands: Microsoft's practices stop him from doing his work, and could cost him his livelihood because it assumes that fixing its failures are more important than his work. What's the point of even having the computer if it can't be relied upon to do complex tasks?
I tend to agree, but I have difficulty when you scale the problem. No individual person benefits from having their computer rebooted enough to counteract the issues, but might everyone collectively? The utilitarian calculus becomes difficult.
Though parallel solutions (better security testing, or a capacity for live updates without a reboot) may be an answer.
Same as you, I periodically turn off the metered connection toggle and then run updates. It saves me a heap of angst.
And this is why I tend to ignore most commentary on Operating Systems. You know the moment that Windows releases a centralized repository, there will be a thousand cries of "walled garden!" and "embrace, extend, and extinguish". It's literally impossible to satisfy everyone, given how many mutually exclusive needs exist out there.
- You can still install applications manually, and there are no special permissions lost for doing so.
- You can access it using your choice of application, instead of only the MS-approved tool
- You can add 3rd party repositories
If that's what the "Windows store" was, I'd absolutely use it (if I still used Windows). If it comes along with a bunch of other crap, then yeah, there will be pitchforks.
What does seem to work is the certificate and code reputation. It does actually prevent users from running things but is easy enough to click through if you really know better.
Yes, that's the "other crap" I was referring to [edit: specifically "special permissions lost for doing so". I was a little vague because I knew there were some additional restrictions for using the Windows store but wasn't sure what they were.]
I didn't spend enough time to verify if it satisfies the "can use other clients" and "can add 3rd party repositories" criteria. But assuming it does, I bet tons of people would happily use it, if Microsoft blessed it as the official standard for managing Windows applications / installed it by default.
I like scoop though personally.
This is only true when the centralized store is the only way of installing software. That's why iOS is called a walled garden, but not Android.
I think, though, I've come to realize why: backwards compatibility. I've seen videos where people have installed Windows 3.1 and then upgraded the OS from there and pretty much everything continues to work. I think in one video I saw, Quake or Doom stopped working after one upgrade (probably Vista) and then started working again after a patch release or something.
I have to hand it to Microsoft that they honor backwards compatibility as well as they do. However, at some point, you have to just make a clean break. You have to acknowledge that what you have is bad and broken and should no longer exist in this world. Microsoft finally did just that with Internet Explorer and then (to a lesser degree) with replacing Edge's rendering engine.
For Apple, backwards compatibility isn't a sacred cow. They're more than willing to throw something out that is old and/or doesn't work as well as it should. They handle breaking that fairly well, all things considered. They give plenty of notice (sometimes not as much as they probably should have -- https://arstechnica.com/gadgets/2018/01/apple-prepares-macos...) so average and power users alike have the time to transition. Sure, it's never as smooth as we would like but what transition ever is? Still, Apple users don't have to deal with things like Windows users do.
Microsoft needs to start making a clean cut with Windows, probably starting with the Explorer shell. It's old. It's obnoxious. It actively gets in the way and makes me less productive. They can do this if they just put their shoulders into it a little. (Christ almighty, why does the scroll bar have to jump back to where it was if my mouse veers away from it while scrolling? "Because that's how we've always done it!" is easily the worst answer to any question ever asked)
Also, for the love of all that is holy they need to reverse the decision to have ads built into the OS. How can they not see how bad an ad-centric model is becoming for Google and Facebook? Having a whole OS where the user is the product? That's a privacy nightmare that makes Google and Facebook look like little league.
But Microsoft then throws a bunch of stupid crap on top of it like Candy Crush ads for a piece of software that costs $200 and kinda ruins it. Or with all the deep hooks for Windows Defender & friends that just cripple process launching & file creation performance.
Can't the same be said of Linux too? Major distros ship with the KVM kernel module built in. I never had to do any rocket science to run KVM, for example.
I use it to model and test code that will take hours to fold into a build for our embedded target. On the last feature I completed, I wrote 2500 lines of pthread / Unix domain socket server code, and it ran on my WSL/Ubuntu target with only lightweight text file based shims to replace drivers on the embedded target. When I slid it over into the embedded build flow, it ran on the first try, no changes.
Compiled as a unit test harness in about 3 seconds on my WSL instance. Our full embedded target takes 30 minutes to build. So the WSL/Ubuntu instance is a huge time saver.
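As a rough illustration of that kind of harness (sketched in Python rather than the commenter's C/pthreads code; names and message contents are made up), a Unix domain socket server on one thread exercised by a client on the main thread runs identically under WSL and on a native Unix target:

```python
import os
import socket
import tempfile
import threading

# Socket path in a throwaway directory (illustrative, not from the post).
sock_path = os.path.join(tempfile.mkdtemp(), "demo.sock")

# Bind and listen before starting the server thread so the client
# can't race ahead of the listener.
srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
srv.bind(sock_path)
srv.listen(1)

def serve_once():
    # Server thread: accept one client, echo its message back with a tag.
    conn, _ = srv.accept()
    conn.sendall(b"ack:" + conn.recv(1024))
    conn.close()

t = threading.Thread(target=serve_once)
t.start()

# "Client" side, standing in for the lightweight test shims.
cli = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
cli.connect(sock_path)
cli.sendall(b"ping")
reply = cli.recv(1024)
cli.close()
t.join()
srv.close()
print(reply.decode())  # -> ack:ping
```

The same AF_UNIX/threading primitives exist on the embedded side, which is what makes the "develop on WSL, drop into the embedded build" workflow pay off.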
Linux is a different and somewhat distant version of UNIX.
- On Windows Home, however, it comes with Candy Crush and a bunch of nonsense that shows up in the Start menu tiles. I just uninstalled them and called it a day.
- Yeah, this is my biggest issue.
- Indeed, but I have not even thought about it.
You can turn off advertising during installation and set up privacy measures, like not giving up your hardware ID for advertising.
I had no problem with Windows restarting itself during work hours. After I installed all the drivers, it would only restart during nighttime while I was sleeping.
About the central repository, it's true. You have the Windows Store, but it's very incomplete. The best tool I found to help me was Chocolatey, and it saved me a few hours.
What? Where? How do you tell it not to advertise the new Edge or Candy Crush in the Start Menu during installation?
> and set up privacy measures like don't give up hardware id for advertising.
some privacy measures. For others, you still need to manually turn them off via gpedit or a third-party tool like OO ShutUp.
Source: Have installed win10 more times than is healthy.
ShutUp10 is probably your best bet, or something like this: https://gist.github.com/alirobe/7f3b34ad89a159e6daa1
Edit: If you don't mind a little up front prep time, you could probably create install media for windows that would run one of the above programs immediately after finishing the install. If you have to do it multiple times quickly that may save some time.
The fact that these are required at all is ridiculous, and (part of) why I run Linux/BSD these days.
On something you paid $200 for? WHAT? What is this world coming to?
Use https://www.microsoft.com/en-us/solution-providers/ and pick one of the companies in your area. It largely doesn't matter.
Shoot them an email explaining what you want and they'll have you meet the 5-license minimum for volume licensing by buying a Windows Pro license, a Windows Enterprise Upgrade license, and 3 user CALs, which are cheap. They'll put your name down as the business and then you're done.
If the OP indeed meant "Windows LTSC", then I'm not following along with their point.
You can't. It asks you if you want to not share data, but warns you that this will result in less personal ads.
So you can opt for non-personalized ads, but you can't turn off ads altogether during the install.
- half of the things are for stuff you shouldn't have to opt out of in the first place in any reasonable OS
- that's if you can even opt out of them
- the settings/ways to do so are often obscure/obscured
- come next update, your settings are never safe, and you can't have faith they didn't add more bullshit that you have to research and find ways of turning off
Server does require initial tweaks to run smoothly on a desktop machine. Some security and telemetry policies need to be relaxed, the audio service doesn't run by default, Internet Explorer Enhanced Security Configuration needs to be disabled so you can grab an installer for your preferred browser, and there are other minor things like disabling Ctrl+Alt+Del on the login screen.
The result, though, is that you get an OS that acts like regular old Windows 10, with _zero_ ads and (I think?) less telemetry. I have not seen anything Pro can do that Server cannot, even gaming (Steam, AMD graphics drivers) works fine. As a bonus, you're not subject to any of the restrictions that Pro has for advanced workloads like virtualization. Server Manager along with all the extra utilities not found on Pro are also nice to have.
Updates are still downloaded/installed automatically, but servers can't reboot whenever they feel like it, so I'm not ever interrupted by unwanted reboots like on Pro.
Just set 'active hours', turn off everything under 'Advanced', and if you really want to not do any updates for a while, you can select a date 35 days in the future to restart updates.
Three very simple things built into the update tool.
If you are on a corporate machine, they may limit what you can block based on their internal rules, but that is an IT problem, not Microsoft's.
Windows 10 never updates or reboots during my active hours.
I always defer updates to a long time out, then manually update at a point where I want to. Problem solved.
2. There's an option to disable automatic restart for updates
3. There's the winget package manager, but it's still in preview. There's also chocolatey and scoop, which are community run, but that's similar to homebrew on macOS.
Where? I've been able to defer reboot, but not disable it completely. ShutUp10 seems to (mostly) work, but automatic updates/reboots somehow get re-enabled sometimes.
With Group Policy: if you are logged in, Windows will pop a modal dialog in your face and you can preempt it there, but just leaving an application running and being AFK will not prevent reboot if Windows has decided it has waited long enough. The amount of time I can ignore the "restart" request in Windows Updates varies from days to weeks for me.
I hate the behavior with a passion, but I can see why Microsoft wants to force reboots eventually. Having un-updated Windows is not good for 90% of the people out there.
Honestly speaking, Windows 10 is as far from the spirit of the "I'm a PC" and "Windows 7 was my idea" ad campaigns as you can get. Now it's just "You will bend over and take what we give you, for the greater good"
If you want full control of your OS, you need to switch to a BSD or Linux, with appropriate expectations about the applications you will or won't find there.
Edit: fix grammar/punctuation a bit
Stop spreading lies. It does work. (I also have not had ads on any of the Windows machines I own. But this is a Microsoft product, which always raises these long lists of myths.)
Instead of name-calling, maybe post the group policy here that allows you to never reboot after updating.
So the solution, as I said, is to not install these updates. (Or reboot, but you say you don't want reboots.)
Sounds like you have problems with reading or clicking links, it's a:
Windows 10 Enterprise
>Likely nix23 upgraded to Win10 Pro from Win10 Home for this reason.
WHAT? I run FreeBSD and sometimes develop/maintain Windows "apps" in a virtual machine.
This works for my web testing purposes, but probably isn't practical for a CNC machine.
Since the CNC machine/controller should not be connected to the internet anyway, it's not important... and even then, you still get security updates.
> Get a Windows 10 development environment
> Start building Windows applications quickly by using a virtual machine with the latest versions of Windows, the developer tools, SDKs, and samples ready to go
So i say thank you very much Microsoft.
> You may use the software in the virtual hard disk image only to demonstrate and internally evaluate it. You may not use the software for commercial purposes. You may not use the software in a live operating environment.
I think the heading Get a Windows 10 development environment is referring to downloading the image, rather than to licensing. I agree it's not very clear.
I've also had forced restarts while a DirectX application was in full screen and I didn't see the Windows popup.
I think the one thing he said that resonated is: "Doing forensics is almost impossible." Holy hell, yes. I use a lot of weird USB hardware, and debugging USB serial-port issues is impossible (even with a Beagle USB debugger, it only gets you halfway; system logs on macOS are not particularly helpful).
Why FreeBSD over Linux?
This is the most perplexing decision in the article. Why would OP do this? I have FreeBSD running on an old laptop, alongside an Ubuntu laptop. There's more software ported for Ubuntu than FreeBSD and I don't want to rebuild everything for BSD; I have too much work to do. Nobody targets BSD but BSD devs.
I'd like to know though, OP aside, besides the BSD networking speed and extensions, why FreeBSD over Linux when the latter is more widely supported for GP dev?
I feel the author's frustration, though. macOS is really diverging from BSD and becoming even more obfuscated, and as a power user who spent over two decades working in Linux, it is very hard for me to treat macOS and Win10 seriously as development platforms (to my hireability's detriment, I suppose) for anything other than Node/Electron and playing around for my own projects.
> I use Windows 10 on a PC I built and it's fine.
Because you probably don't use Win10 the way other people use it. If you've spent any time developing in Linux, Windows is a pathetic development system once you step outside Visual Studio. The Unix command line is one of the most insanely flexible tools ever built, which is why the philosophy has changed so little in 40+ years. (Plus I don't like ads popping up in the toolbar or start menu when I'm trying to work.)
Case in point: I tried setting up iPerf on a new MBP today. port/brew both failed to install correctly with broken lib dependencies. `iperf`? Seriously? /smh/
EDIT: I meant despite OP's decision in Q above.
From the post:
> Linux has systemd, not my favorite thing out there, Windows is privacy nightmare. That left me with 2 major options: Linuxes without systemd (Gentoo, in my case) or BSDs.
> Since I run FreeBSD servers anyway, I just migrated to FreeBSD.
Basically because the author hates systemd (not an invalid point, but that ship has sailed) and has familiarity with FreeBSD. This isn’t really an article that’s trying to convince people to run FreeBSD. It’s trying to explain why they left macOS. Where they ended up is less consequential, IMO.
The larger question of “if not macOS, then what” is almost a throw away in this post. Which, I don’t blame the author for at all... it’s a prickly issue and includes a lot of background context that dilutes their primary point of macOS issues.
The ship has only sailed for the mainstream Linux OSes. With Ubuntu, for example, other things bother me too, like snap; I don't want all this overhead. But I see more and more people moving away from those things.
"Giving in and going with the flow" is not necessary with open OSes. It's exactly why I'm moving off macOS too, because I'll no longer be locked into some company that decides what's best for them (and not me)
Considering MacOS and Windows have much more software and driver support than Linux, it's only natural to choose to give up a little more for better docs, userland, and all the other benefits of FreeBSD.
It's the exact same argument the "works for me" Linux folks use, taken to its full conclusion.
Sorry, I know it's nit-picking, I just had to say something.
Yeah, especially the "over Linux" part.
> This is the most perplexing decision in the article. Why would OP do this? I have FreeBSD running on an old laptop, along side an Ubuntu laptop. There's more software ported for Ubuntu than FreeBSD and I don't want to rebuild everything for BSD, I have too much work to do. Nobody targets BSD but BSD devs.
For instance, I've used Arch+KDE for years on the same laptop and I never had problems with updates, thanks to Arch's rolling release approach. That's why, for example, I don't get FreeBSD over Linux in that particular regard.
He wrote about it though. OP dislikes systemd's approach to things
(No beef, just thought it was funny)
I should be more specific, I meant serial-port: lots of USB devices communicate through a USB serial-port driver, but not every USB device appears as a serial-port (e.g., all USB devices expose endpoints for bulk or serial transfers, but whether that maps to a COM: or /dev/tty* depends on the driver).
I used to develop windows shareware, all the way back to Windows 95. Back then you could write to the installation folder "c:\program files\myapp" and life was simple.
At some point MS figured this was bad security, and you needed to write to the Windows Registry (HKLM / HKCU) to store your app settings, and to the C:\Users\<username>\AppData\ folder to store your data files.
So all of this resulted in 3 different places you needed to write your data to. But then it gets even more split, because in some cases you needed to run as an admin during installation, and your app data would be stored under the admin account.
Also, at some point c:\program files\ was split: they added c:\program files (x86), and I believe c:\program files\ is virtualized. The Windows registry is also split into local-user, machine, and admin settings.
Two issues I ran into:
1. c:\program files\myapp\ - I could not trust what I wrote there. I believe it's virtualized, so I ran into issues where I would delete a file in my installation folder and an old version would be put back by Windows in the background.
2. When a regular user installed my app and it needed to write to HKLM, the app had to be elevated to admin to do that. This also resulted in another split of settings, because there are per-user settings and per-machine settings.
In conclusion, app installation became a confusing mess!
Linux (XDG):
- Config: XDG_CONFIG_HOME (~/.config)
- Runtime data: XDG_DATA_HOME (~/.local/share)

(Note: CLI apps often follow the XDG conventions rather than the Mac conventions on OS X)

macOS:
- Config: ~/Library/Application Support/
- Runtime data: ~/Library/Application Support/
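The XDG side follows a simple lookup rule: use the environment variable if it's set and non-empty, otherwise fall back to the spec's default path. A quick sketch (real applications usually lean on a platform library for this; the function name here is made up):

```python
import os

def xdg_dir(env_var, default):
    """Resolve an XDG base directory: the env var if set, else the default."""
    value = os.environ.get(env_var)
    return value if value else os.path.expanduser(default)

# The two directories discussed above, with their spec defaults.
config_home = xdg_dir("XDG_CONFIG_HOME", "~/.config")
data_home = xdg_dir("XDG_DATA_HOME", "~/.local/share")
print(config_home)
print(data_home)
```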
No, app installation was a confusing mess and now it's been cleaned up. There's nothing wrong with an app separating user-specific data from system-global data. Running an app as a regular user shouldn't require any writes to HKLM!
Buy any consumer machine with Windows 10 and it comes with so much pre-loaded doodoo that it takes a couple of hours to purge it down to the basics. In fact, so much so that there are a myriad of scripts to help do this.
Another good thing about modern Windows is that it downloads all the necessary drivers after the first boot and Internet connection. Gamers might want to install latest GPU drivers, but that's not necessary for most people. And those drivers usually do not come with bloatware.
At least that was my experience with a few computers. Reinstalling takes 3-5 minutes and downloading drivers takes 5-10 minutes depending on hardware and connection speed. Definitely faster than a couple of hours.
The only thing that needed to be uninstalled is ads from Microsoft, but that's another 5 minutes: just navigate Settings/Apps and uninstall those unwanted games. They do not reappear in my experience.
I own 2 Windows 10 laptops, and I've only had to click the button 2 times in my life.
It's like windows realized hardware companies were going to do this and wanted to make it easy to defeat them.
It's also blazing fast. These modern computers are incredible. Not sure why I'd ever upgrade.
I think it might depend on what "setting up" means. For example, "logging into my main day-to-day accounts" might take a couple of hours, max, allowing for even an annoying system-level issue.
But installing, configuring, and gaining access to all one's apps might add anywhere from another few hours to another few days, depending on the number of apps, the location of the files, the version delta between new and old systems, and so on. If you have to do some online research and you hit StackOverflow more than a couple of times as you search around, you may be looking at a serious time bump.
And if you have to contact the software vendor...well, yikes, in a lot of cases. Days, understandable. Weeks of suffering--possible.
This is not even getting into things like working with local network services and local network utilities, configuring old hardware on the new machine, or even changing or disabling new Mac OS or Mac features to suit one's working style.
> It took me two hours to set up Debian by hand yesterday.
Congrats, :) but this does complicate the question further because now it depends on what is involved in this "by hand" thing. It took me 20 minutes to set up Haiku OS by hand...10 minutes to set up Puppy Linux by hand...what does it mean? That I used my hands? That I plugged in a USB drive and hit "Next" a bunch of times? That I configured and compiled all sources from scratch?
Homebrew manages 90+% of installations, with the few that it doesn't covered by the App Store or a manual download and install. The rest of the work is simply restoring my config files and copying over data, all of which is done via a shell script.
Windows and Linux (Fedora in my case) are mostly the same, with just a little extra work on Windows to manually change some settings that never seem to stick if done another way.
Not saying switching to BSD is the wrong thing for this person, but having used BSD in the past I wouldn't say it is any quicker to set up than macOS or pretty much any other OS these days. In fact, with the hardware issues you are likely to run into, it can take a lot longer to get an equivalently working system. They even say how WiFi and Bluetooth still aren't as functional or performant as they are elsewhere, so who knows how much time they have spent trying to sort that.
But as long as the author finds their environment works better for them, that is all that really matters. However, I feel that when people throw out things like 'macOS took 3 days to set up' it weakens their argument, as I know that is not "normal", so I wonder if perhaps they are just making things up to justify (to themselves) the switch. I see it a lot when people switch from Windows to macOS or Linux. People come up with all kinds of "reasons" for doing it, and a lot of the time it is more emotional than factual, which is fine providing you admit that rather than come up with nonsense claims.
Linux distros probably come in second, with the caveat that it's easy to get sucked into a time-burning black hole of fine-grained tweaks if I don't make a point of keeping things close to default.
Windows is downright frustrating because even after spending considerable time on setup, there are things that just don't work correctly — for example, changing virtual desktops independent of monitor is not possible with the built-in implementation.
The BSDs are often great if you stick to the base system, but once you start adding software, things break way more often than with Debian, Fedora, Ubuntu, etc.
This is especially true if your hardware is not-a-Thinkpad.
I always wanted to run OpenBSD as my main OS, but for "production" desktop use, I can't deny how much easier/faster it is to get and stay running with Debian, Ubuntu or Fedora.
Sounds like a good choice for the author's needs.
Telemetry, the window manager, BT support, Ads, Cortana for a start.
It's fine, for you. I spent a day convincing Windows 10 to install on my NVME drive. It didn't like something or other. Brand new, blank drive.
Linux installed and worked flawlessly. I actually used it to troubleshoot and mess with partition schemes until Windows was happy. The Windows 'repair' and diagnostic tools are laughable.
It is running ok so far. The Linux subsystem and the new shell make it not entirely terrible.
> Most importantly, it’s Free and Open Source.
"Most importantly" ?
Separately, the source code for FreeBSD is pretty clean and easy to read. I needed to modify the behaviour of one of the core utils a few years ago, and I found it very easy to do. The C library that it uses is also pretty well written. It won't be quite as clean as OpenBSD, but it is a lot simpler than trying to figure out glibc (which was possibly intentionally obfuscated to avoid lawsuits).
The build system for FreeBSD is also pretty simple to figure out. You can download the source, modify the kernel or userland, and rebuild world in at most 30 or so commands. I haven't tried with Debian or any Linux distro, but I would imagine it would be harder given the separation of the kernel and userland.
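For reference, a hedged sketch of that workflow (the branch name is an example; build(7) and the FreeBSD Handbook are the authoritative sources, and several of these steps require root on an actual FreeBSD box):

```shell
# Sketch of a FreeBSD source build/upgrade -- branch is illustrative.
git clone -b releng/13.2 https://git.freebsd.org/src.git /usr/src
cd /usr/src
make -j"$(sysctl -n hw.ncpu)" buildworld
make -j"$(sysctl -n hw.ncpu)" buildkernel
make installkernel
# reboot into the new kernel, then:
cd /usr/src
etcupdate -p        # merge config changes needed before installworld
make installworld
etcupdate           # merge the remaining /etc changes
reboot
```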
like for instance:
> Every time Apple pushed an update, my pf.conf and automount configs got broken on macOS. They either got deleted or moved somewhere. Well, the last 2 times they just got deleted.
I mean, wtf? With open source, there is at least discussion and education about something like this.
You'd be surprised -- unless you follow dev forums for various elements of your DE/distro.
macOS removes them because each update rolls out a clean system, as far as those configs are concerned.
In any case, it takes like 2 minutes to write a shell script to deploy your pf.conf/automount config after every such upgrade, and 1 second to use it. Or to look for the proper, permanent way to do it: https://blog.scottlowe.org/2013/05/15/using-pf-on-os-x-mount...
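A minimal sketch of such a script, assuming your canonical pf.conf lives in a dotfiles repo (the helper name and paths are illustrative; the launchd approach in the linked post is the proper permanent fix, and the real /etc/pf.conf copy needs sudo):

```shell
#!/bin/sh
# Sketch: re-apply a personal pf.conf after an OS upgrade wipes it.
deploy_pf() {
    src="$1"; dst="$2"
    cp "$src" "$dst"
    # Reload the ruleset only where pfctl exists (macOS / the BSDs)
    if command -v pfctl >/dev/null 2>&1; then
        pfctl -f "$dst"
    fi
}
# Typical invocation (with sudo for the real file):
#   deploy_pf "$HOME/dotfiles/pf.conf" /etc/pf.conf
```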
With brew and mas you can also automate command line and GUI app installations, even custom font installs, among other things.
I can get up and running on a new machine in ~2 hours from a clean install with an automated shell script, and get all my apps and CLI tools, custom shell aliases/scripts, configuration for dev tools like VSCode and IntelliJ, documents (from TM/Dropbox), and so on.
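As a sketch, most of the installs can be driven by `brew bundle` replaying a Brewfile like this (every entry here is an example, not my actual list, and the App Store id is a placeholder to look up with `mas search`):

```conf
# Brewfile sketch -- restored in one `brew bundle` run
brew "git"
brew "mas"                     # App Store CLI, needed for the `mas` lines
cask "firefox"
cask "visual-studio-code"
mas "SomeApp", id: 123456789   # placeholder id; find real ones via `mas search`
```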
I mean Gnome3 has removed features pretty regularly that people liked without any real discussion. I don't think that is a closed/open source thing, more of a project communication thing.
I know there are issues upgrading applications on every platform, but I think pf.conf should be treated in a very conservative manner.
I won't call Windows a "nightmare" overall, but I had a nightmarish experience trying to install it last night.
It turns out that creating a bootable USB stick that can install Windows 10 is absurdly difficult. You can download an ISO from Microsoft, but there is literally no way I could find to successfully write that ISO to a USB stick from an OS other than Windows.
If you Google "write windows 10 iso to usb linux" and click on a result like https://itsfoss.com/bootable-windows-usb-linux/, the instructions will create a USB stick that will successfully boot to the installer, but will fail shortly thereafter with "A media driver your computer needs is missing".
If you Google this error and try to find a solution, it feels like you just asked a witch doctor for medical advice. You'll find advice to try plugging the USB drive into a different USB slot, try using a USB 2.0 drive instead of 3.0, etc. This Super User answer swears that the cause of this error is that Microsoft's servers corrupt the download of the ISO file in transit by aborting the transfer early: https://superuser.com/a/964362 (I verified the SHA of my file and it was fine).
Many answers suggest that using the Windows Media Creation Tool from Windows is the most reliable way to do it. I managed to resurrect my old Windows partition and boot into it, and indeed the Windows Media Creation Tool burned a USB stick that successfully booted the installer. But when I tried to select the partition I wanted to install onto, I got the error "Windows cannot be installed to this disk. The selected disk is of the gpt partition style."
If you Google this error, it sounds like when you create the bootable USB, you create it as either a BIOS-booting or a UEFI-booting variety of boot disk, and once created, it can only install onto a MBR or GPT partition, respectively. The Windows Media Creation Tool did not ask me which kind I wanted, it (apparently) just creates a BIOS-booting USB by default.
The only way I was ultimately able to install onto my GPT partition was to use Rufus, which explicitly lets you choose between UEFI and BIOS when you create the USB. It is amusing that Rufus runs on Windows only (so you need to have Windows to install Windows). Luckily I was able to use the VirtualBox images provided by Microsoft for Microsoft Edge testing to run Rufus and create the USB.
Once Windows is installed it's pretty reliable. But it is a nightmare to install.
I think it is interesting that you can still buy old Windows 7 hardware, and just stick the Windows 10 install USB into it, reboot, and be on your way.
The second option. Easy as pie.
- it only runs on Windows (I was trying to install without having Windows already).
- it creates an installer that cannot install to a GPT partition.
If you're building linux from scratch, you're probably not a customer who would consider MacOS or Windows in the first place.
This just happened. (It’s the reason I’m on HN rather than working right now.)
- I get up at 5am to do some early work.
- Work laptop (Win10) is still connected to the VPN from yesterday. Cool.
- Try to connect to drive. No dice. Try same drive in SharePoint. Failed auth. Reconnect VPN. Same. Oh well, guess I’ll reboot.
- Reboot. Go to make coffee.
- Come back to black screen. Ah yes, that’s right. If I reboot and McAfee drive encryption has kicked in, and I’m plugged in to my USB-C monitor, all screens are just black. The solution is to...
- ...pull the monitor cable, power down the laptop, power it back up, authenticate with McAfee, plug the monitor in, let Windows start, check HN, reply to this post.
I know this latter half isn’t Windows’ fault, strictly speaking. But it’s the Wintel laptop’s fault and that’s all the same to me. I hate using Windows. It’s easily the thing I dislike most about my job.
Let’s just try my Mac over here, hang on. I’ll open the lid. Yep, worked.
She has similar issues with the VPN too.
- Sincerely, usually the only Mac-proficient IT professional on the team.
Everything blends together now. I do so much command line work in various languages that learning and googling is just part of the job.
Heck, despite using Ubuntu Server for 13 years and MySQL for 7 years, I still look up syntax and examples.
Even as a lifelong nerd, nothing on computers is intuitive; everything must be learned.
One thing that keeps me somewhat sane in windows are those heroes who spend hundreds of hours figuring out what each system service does, whether it’s needed, and how to disable it if it’s not.
I will answer most of the questions below :)
1) 3 days to setup macOS?
Yes, it took me at least 3 days. Keep in mind that a setup is not just installing software; it's also dotfiles, shell environment, automount (I use NFS a lot), PGP/GPG keychains, the OS keychain, firewall (pf in my case), privacy settings, company-related software, etc. So yes, it takes time, which I am okay with. My problem with macOS is the fact that updating/upgrading the system clobbers a lot of configuration.
2) Why FreeBSD?
Because I love it :) My company's product is based on FreeBSD, my servers are FreeBSD, and my operating system of choice for teaching is FreeBSD. The handbook is there, all the man pages are well written, pkg is easy to use, and it's a whole system. Also: ZFS and DTrace make your life easier. Sure, I could have ZFS on Linux and eBPF, but why learn a new technology when DTrace is rock-solid? FreeBSD is not "just" an OS, it's a complete self-hosted development ecosystem.
Yes, WiFi is not the best, but not everyone needs a 100Mbps connection. I have a wired connection at home to use when streaming movies to my PS4 (also a FreeBSD-based system), but other than that, it's fine. I will still donate every year so the devs can improve it.
Apologies for the bad English, it's not my native language.
Thanks for posting and reading!
I do get frustrated when other people jump in to in effect say "jeeze, three days, what are you doing wrong?"
I sent my work Macbook into Apple for a keyboard replacement, which naturally means they have to wipe the SSD, as one does with keyboard replacements. Setting it up again meant replicating three years of cruft that I had long since forgotten about. It's been a month since I did this, and I'm still not up to the level it was before.
Password manager, check. Both the native 1Password and browser extensions. Speaking of browsers, need to install both Firefox and Chrome for testing. Brew? Ok. AWS, gotta configure new access credentials there, now let's install the aws-cli, oh, it's not available in a package manager, cool. Node, Go, Rust, Elixir, ok now maybe my git repositories? Oh, git isn't installed, let's install Xcode, and there's a system update, that'll take about 25 minutes. Didn't I have a command to quickly switch kubernetes contexts? Let's see if I snippeted that somewhere; actually I guess I need to install eksctl and kubectl now. Don't forget email sign-in, calendar sign-in, gotta install Slack, iTerm, VSCode, jeeze I remember VSCode being a lot more productive, yup, I'm missing about twenty extensions.
This stuff is really, really hard to automate; not because it's technically hard to automate, though in some cases it is, but because it's shit I do, like, once every three years. No one automates things they do three times a decade. Cloud or local server system image backups can help, but I'm not giving Apple a full system image for Time Machine to use; there's too much sensitive data on this machine. It's just hard! And that's ok.
1) Before sending your Mac in for repair, use Carbon Copy Cloner or SuperDuper! to make a clone of your system drive to a spare SSD.
2) When your Mac is returned to you, if the system drive has been wiped, then use the same software to restore your backup.
Both these programs are free (gratis) for the described use, and have a reputation for reliability. The spare SSD drive will cost about $80 (how much is your time worth?).
Why is logging into things an issue if you have a password manager to auto-fill things?
On Linux I just make a backup of the Firefox folder so I don't have to reinstall the extensions on my new computer.
All the settings files I want to keep are kept in a Syncthing folder, then I have a bash script to create symlinks where those settings are supposed to be.
Doing all that manually every 3 years would be my idea of "really, really hard". With a script you just add the instructions once and then you can keep reusing it without any effort.
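A minimal sketch of that kind of script (the helper name and paths are examples, not my actual setup):

```shell
#!/bin/sh
# Sketch: settings live in a synced folder; this links them into place.
link_dotfile() {
    src="$1"; dst="$2"
    mkdir -p "$(dirname "$dst")"
    # -s: symlink, -f: replace a stale link, -n: don't descend into dirs
    ln -sfn "$src" "$dst"
}
# Typical invocations:
#   link_dotfile "$HOME/Sync/dotfiles/bashrc"  "$HOME/.bashrc"
#   link_dotfile "$HOME/Sync/dotfiles/mozilla" "$HOME/.mozilla"
```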
It's clear you're non-native, but it's not "bad English". :) Thanks for sharing it with us! I like hearing about the range of experiences that highly technical people have with macOS. I'm still trying to use it as my daily driver, even though it takes a few days of setup (compared to the old days of building a preconfigured image), because my alternate option (Linux, for me) also takes a similar amount of time. You're right about Windows being a trash fire.
PS: "outside of the box" is the end of the idiom referring to "thinking outside of the box" (thinking differently about a problem). I think the one you meant to use in your article was "out of the box", which means the first experience with a product when it's opened or unwrapped (think: "taken out of the box"). I hadn't even noticed how similar these two are until today.
I understand his concerns, and it seems like everyone has their "why I left macOS" hot-take cocked and loaded these days, but this sort of sounds like "I left macOS and the only thing I miss are the things that makes a laptop useful in 2020."
Anecdata: I don't use wifi on my work computer. Ethernet is faster and more reliable, and since my desk/monitors/mouse/keyboard stay in the same place it's easy to drag another wire over there too.
When do you 'need it'?
I've now got a requirement for solid bluetooth (5) in every pc I own. I don't "need" it, but I also don't need most of the things that make my life more pleasant.
One important thing to be aware of: because of their size, you can only fit a single battery cell inside each bud; as I understand it, this prevents any ability to specially manage the lifetime of the battery. I remember hearing something like “expect about 380 full recharges until the battery is no good.”
Anyway. Switching between devices is so seamlessly awesome. “Transparency” mode works well and is easy to switch in and out of. They do a great job of detecting if you just pulled them out of your ear and manage stop/start. Combined with a cellular Apple Watch, it totally rocks, giving you complete communications hook-in and letting you keep your phone at home. I assume they make you buy the phone not only because it’s technically easier to implement the full experience, but also because the Watch would totally cannibalize iPhone sales. The iPhone suddenly feels like a niche device when you have the AirPod/Watch combo.
My primary complaint with the AirPod Pros is that no matter which size insert I use or how tightly I secure them, they tend to fall out of my ears when I smile. I wish you could get them in the “peg” form-factor.
This has been my mini-review.
Ah, this so much. I was fine with cheap Sony earbuds with cables until I fell for the meme and bought a pair of Bluetooth Sony headphones.
That's a gamechanger, really.
Counter-Anecdata: I currently couldn't use my laptop for work without wifi.
I recently moved and my current place has no suitable desk (or chair) near to the internet gateway.
If I had no wifi I could as well have a brick instead of a laptop.
If I had poor wifi (e.g. FreeBSD not supporting 802.11ac), work would be torture and video-conferencing with colleagues would be unusable.
If you're going to run an OS with bad or lack of wireless support, might as well grab a cheap PC tower.
I think most people today would look at a laptop with bad wifi and wonder what its purpose was.
I just ordered a desktop and was a bit surprised and also relieved to find that it comes with wifi built in.
The idea of a wired network sounds great to me, but the reality is my house isn't set up for one.
So IME it's not really a big deal.
Until you need that USB port back...
I've since switched away from Mac hardware and don't have this problem anymore, along with a myriad of other problems that Apple created for me (I can actually upgrade my RAM to be >16 GB, I don't need to carry around a bag of dongles, etc.).
You throw one of those little "7 in 1" USB-C hubs in your bag. That's it. That's all you need.
Not sure why you would consider that to be disingenuous, I'm merely sharing my personal experience.
And it's just not been the case for me with the recent MacBook Pros. Like your experience, with past laptops I've used I've always had to carry around different adapters, regardless of whether the machine was Windows or Mac. No matter how many ports the vendor tries to shoe-horn into the laptop, they can never quite get them all in there, and so you end up with a dongle or series of dongles.
With the MacBook implementation of USB-C, I carry one small USB-C hub that gives me pretty much everything from ethernet to microSD. The infamous "dongle book pro" actually has made my life less dongle-y.
And that's where I see the whole "bag of dongles" as a disingenuous line of attack from people who don't actually use the machines for their real lives. In practice, it's the least dongle-y laptop I've ever used.
(thanks for coming to my ted talk)
As I use a laptop for my main machine, I don’t use any Bluetooth devices with it and I connect it to a thunderbolt dock with Ethernet. The Lenovo one. I find wired devices more reliable and wireless devices more expensive, heavier and usually slower.
I love my little Apple pen and ear pods for my phone and tablet though. It’s just a different "workflow".
Why this obsession with Wifi? If I can I always use a wired connection. It's faster, more reliable and removes a major failure mode.
If wireless is so awesome, how come WiFi routers have wires inside them!
I'm only being a little hyperbolic here--there are a lot of reasons people prefer or even require WiFi.
If I have gigabit ethernet at every place I use a computer, then sure, I'm not saying no. But even in a work from home environment where I have a little more control, it's not always an option.
We had a once-in-a-century ice storm a few years ago. We were without power for four days in the winter.
They re-established power gradually and I still remember the crowds at Starbucks. It was all people waiting to use the outlets to recharge their phones.
Meanwhile, while the power was out and we had no heat, my wall phone worked fine throughout.
You know that they have an independent power supply, and if you don't use a cordless handset, they work fine even with the electricity out.
I also have the last version of the phone book that was printed in my city, and I was making calls all week and finding people and resources to help me deal with the emergency.
I also prefer wires because they're duplex and not shared, and because of the reliability issues with radio transmission/reception.
But yes, I also use Wi-Fi when wires are infeasible. It's better than nothing, if I need internet.
But if you were on 10base2, you had to ensure that you had your 50Ohm terminators.
Doesn't everybody already know that networking performance has greatly increased over the decades?
That being said, the reason I don't go 100% all-in on it and instead use Ubuntu is the upgrade process. The article says they were successful upgrading minor versions, but major version upgrades are a real pain. The last time I did one, it showed me diffs of system scripts and asked me to make calls on what changes to accept; I chose "use latest" for everything and ended up breaking "sudo", effectively losing control of my cloud instance.
If there's a better way to perform system upgrades, I'd love to hear it, because I think the OS is beautifully minimalistic and closer to Unix philosophy than Ubuntu/OSX.
That isn't to say that the current freebsd-update isn't any good, just that it needs someone who knows what they are doing. And even experienced people make mistakes. Case in point: you tried to upgrade, you were given a locally-modified file under /etc/ to merge with the new version, and you made a small mistake; that shouldn't result in losing the system.
If I'm not mistaken, when Debian is confronted with the same problem it usually asks whether to keep the old file or replace it with the new one, which basically removes the ability to merge the two together in many cases. FreeBSD is actually slightly smarter there, in the sense that it can even detect when the comment header of the file is the only thing that changed and merge it on its own. But if there are real differences (e.g. you modified /etc/ssh/sshd_config), it asks you to merge the incoming version with the current one. In any case, I think FreeBSD is extremely stable when it comes to what goes into /etc and how stable the file formats of the individual files there are, so I think it would be perfectly reasonable to simplify the choices presented to the user.
In my previous job we actually had an exhaustive list of IGNOREd files for freebsd-update, just to make sure that nothing would get clobbered by a major or even minor upgrade, and that freebsd-update would practically run to completion without any interaction of this sort.
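For anyone curious, the mechanism for this is /etc/freebsd-update.conf. A hedged sketch of what such a list can look like (these paths are examples, not the actual list from that job):

```conf
# /etc/freebsd-update.conf (excerpt) -- illustrative entries only

# Never let freebsd-update touch these locally-managed files:
IgnorePaths /etc/ssh/sshd_config /etc/pf.conf

# Only update files whose contents still match what the release shipped:
UpdateIfUnmodified /etc /var
```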
FreeBSD is more complicated and less consistent, but also faster, and it integrates novelties such as ZFS, Wayland, BT...
I personally prefer openbsd to FreeBSD, but still use Debian Linux for my laptops and servers.
The regular 6-month base system updates have been automated as well with "sysupgrade". It doesn't work with whole disk encryption if you have required networking firmware that can't be distributed (I'm looking at you, Intel). For that particular case you have to copy the new ramdisk kernel (bsd.rd) to the root of the disk. Regular updates are super easy now.
But beyond that, I do love FreeBSD
If you install those, software built for older versions of FreeBSD will mostly continue to work (lsof seems to need a recompile almost always, though).
Fresh rebuilds are probably 'better', but there are lots of reasons you might need to run something built on an earlier version, and it should work (if not, and there's no clear reason, it's a bug).
For a box you have full control of, the good news is I can almost always fix it in single user mode.