Back in my youth, when I was time rich and cash poor, this kind of tinkering was fun and a good way to improve the machine I was using.
Now that I have more disposable cash, but waaay less time, I couldn't imagine "wasting my time" doing this sort of thing. These days I want to -use- the computer, not spend time trying to convince it to work.
Incidentally it's the exact same journey with my cars. 35 years ago I was fixing something on my car most weekends. Now I just want to turn the key and go somewhere.
Hackintosh served the purpose for its time. It'll be fondly remembered. But I think the next generation of tinkerers will find some other thing to capture the imagination.
People have been making this argument to me about Linux for more than 25 years. The most cutting version that I ran across was:
> Linux is only free if your time is worthless!
Something never quite sat right with me about this argument, and your comment finally made me understand what it is: the understanding you gain from tinkering is priceless, and it's exactly the experience that you use to help everyone around you: it turns you into an expert.
So yes, I may just want to turn the key and have my car work. But when it doesn't, I often wish I was that guy that had tinkered with my car, so I can better understand what went wrong, and whether I can fix it myself or need a professional.
I run Linux on all my machines, and my family generally uses Macs (both sides), but after all those years of tinkering with Linux, they still come to me for help with the Mac machines that they insisted would Just Work.
All that out of the way, I agree with your fundamental premise: hackintosh is likely in the rear view mirror for the next generation of tinkerers.
I think there's a difference with Linux, because it's something you own and control and can dive into and see every part of. I hate investing time in proprietary technologies, because I know I can be stopped or locked out. With open source software, simple electronics, old cars, fabrication and woodworking, the time I spend learning feels worthwhile.
Even this "I hate investing time in proprietary technologies, because I know I can be stopped or locked out" is a hard-gained insight. Hackintosh is one of those things that made me understand this. Nothing like spending weeks to get your hackintosh working smoothly with all the hardware just to find out that the next update breaks everything. I've come to see it as a necessary part of the journey
This is my current state of thought. Proprietary software perceives me as an enemy who needs to be locked out of as many features as possible to allow for more money to be extracted out of me while also investing the least amount possible back into the product. The only timeframe where proprietary software is groundbreaking and at the forefront of technology is when they have not yet captured and locked in a large market share.
In my experience, doing a hackintosh actually teaches you that Apple hardware is not that special and macOS works only because they make it easy for themselves.
Then it becomes clear that if you don't really have an absolute need for macOS, it is not worth the trouble, since Windows/Linux actually make better use of the hardware with little trouble in comparison. By extension you develop a feeling that desktop Macs are really overpriced and don't have much of an advantage in the Apple Silicon age, since efficiency doesn't get you much on a desktop but the performance delta for a given price is insane.
In fact, buying a PC that is equivalent to a base Mac Studio will cost you 1k euros less, even if you go with "nice but not that necessary" things like 10G networking (especially for a personal computer).
But yeah, you also learn that it's better not to waste time trying to conform to Apple's agenda, but that's also true for real Macs in my opinion.
This is a great point. I sort of detest becoming an expert at proprietary stuff, because I know they'll just change it before long. I've lamented about this elsewhere as modern software creating "permanent amateurs". Even those that want to invest in expertise often find their knowledge outdated in a handful of years, and those that don't want to invest can easily justify it by pointing out this effect.
Meanwhile, the article is clear about how proprietary code absolutely prevented the author from understanding why the Wifi and Bluetooth failed with specific apps.
I know plenty of people with stamps who don't care to fiddle with their OS or change their own oil. People who work on putting things in orbit and beyond, people who build bridges, people who design undersea robots and airplanes. They're most definitely engineers.
This is the reason I still buy older cars. I can't stand owning a car only to find out that I can't work on it myself. Even if I don't have the time or tools needed for a specific job, if it's something I could do on my own, it means the job should be that much easier and cheaper to have a mechanic do.
I fully empathize - and yet, there are benefits from tinkerers/hackers messing around on proprietary hardware/software. Hackintosh - and similar communities - led to projects like Asahi Linux, Nouveau, Panfrost, etc.
> I think there's a difference with Linux, because it's something you own and control and can dive into and see every part of. I hate investing time in proprietary technologies, because I know I can be stopped or locked out.
The problem with this approach is then you get a generation of engineers with tunnel vision thinking the One True Way to achieve your goal is the same way your GNU (or whatever) software did it.
Invest time in learning your technologies, whatever they are.
There's valuable knowledge in proprietary stuff just as there is in OSS.
I agree with your point in principle, and yet I installed Ubuntu on my work laptop this January after using Windows professionally for my entire (5 year) career. I've found myself moving in the opposite direction from the person in the root comment, because I find that it's getting harder and harder to find tolerable proprietary software. It feels like everything is glacially slow, laden with ads and tracking, reliant on an internet connection for basic functionality, or some combination of the above.
"There is valuable knowledge worth learning in the technology" != "this is strictly better software on every axis and you should switch to it for your daily work"
As someone that learned to program on BSD and shortly thereafter, Mac OS X and Linux....
I honestly don't know how people use Windows machines as a dev environment 24/7. It would drive me mad. Everything's so wonky and weird. Everything from symlinks to file permissions is just backwards and fucky.
Back in the day it was alright because Microsoft gave you a fairly good dev environment in the form of Visual Studio, with the focus of it being squarely on desktop application development instead of tinkering with the system or running web services. It didn't stop people from doing it anyways but it's part of the reason why everything is so janky. Then the web took over and Microsoft tried for ages to make .net and Windows Server work until they realised they can't tune an OS that was never meant for backend development and just put all their focus on WSL. In the year 2024 there is almost no reason to be doing any non-desktop dev in a Windows environment unless it's on WSL. And you get the benefit of having an actually sane window management system and external display handling unlike MacOS, not to mention how nice PowerToys is.
I mean this in the nicest possible way: 5 years is likely not long enough for the “just work, stupid” desire to really, really, really set in. Nor is a couple of months enough time for the potential rough edges of desktop Linux to set in.
Given that I've been using Ubuntu on the desktop since I was 11, I'm not worried.
The reason I switched was because Windows didn't work. Win11's desktop makes early-2010s KDE look like a smooth, bug-free experience. My laptop (a 10th gen X1 ThinkPad) was plagued with driver problems. At least twice a month, I'd have to reboot when I joined a meeting and discovered my mic wouldn't un-mute. Switching to Ubuntu solved both of these problems, and I don't have to deal with an awkwardly bifurcated environment where a couple of my CLI tools run in WSL while everything else is native. Oh, and my Zephyr build times are a good 25% faster now.
After 17 years of using Linux I realized that I was tired of tinkering with shit, so I caved and bought a macbook air. Not even two years later I was back on Linux, because I realized that the amount of tinkering I do on Linux is actually very small; the experience I already paid my time for means that Linux is simply easy for me to use, while MacOS is a pain in the ass in innumerable small unexpected ways. The path of least resistance, for me, is to continue with Linux.
I work in IT, so I’m paid for my time to solve all kinds of issues with Windows. At home, such issues are unpaid work. Linux has the advantage of having issues be mostly of my own choosing. Stick to the golden path and you’ll hardly ever have them. And the easy configuration and recovery options allow you to jump into a new install with minimal hassle.
Everyone will have the same headaches with Windows as Microsoft’s choices are required these days. Millions of people have quite lucrative jobs solving them. I’d rather not bring work home so I run Linux.
I’ve been using Windows throughout my childhood and start of my CS career - now I use Windows for specific software (audio/music) and Linux for developing (about 8 years I guess). I had a 1-year stint with macOS because I was developing an iOS app, and have been the troubleshooter for people with macs at my previous job, so I consider myself somewhat ‘multilingual’ when it concerns OSs.
As a power user, Linux is just so much nicer. I constantly get frustrated, especially with macOS, about stuff that I can’t easily change. In Linux my stuff works, and if it doesn’t it can be made to work (usually). In Windows/Mac it’ll often take considerable effort to make the system work the way I want, or it’s just not possible.
I think with proprietary software ‘it just works’ is only a thing if you’re happy with the basic experience that is tuned to the average person. If you have more complex needs, you should be using Linux (and if you know your stuff or use the right distro, things will likely also ‘just work’).
It is _stable_, not outdated. You are practically guaranteed that if you’re running Debian Stable, and live only within the official apt ecosystem, you will not have software-based instability.
Debian Sid makes a better desktop distro than Ubuntu. The drivers are up to date, the instability is greatly exaggerated and installing nonfree codecs is easy (so easy with virtually any distro that it shouldn't even enter into the equation...)
This said, I prefer OpenSUSE Tumbleweed, which is rolling release yet more stable than Sid. Rolling release + extensive testing + automatic snapshotting gets you the best of all worlds.
Yeah, that’s why when I update my Arch MacBook Air once every year or two it works well, but Debian dies and needs a reinstall for some unknown reason. Before that, I believed Debian was so very stable. My experience shows the opposite.
Frankly there is no value in learning user-hostile proprietary technologies in a way that the owner of said technologies actively wants to discourage and prevent.
Like learn the proprietary tech in the environments it's intended to be used in but if you can't use it in that environment I personally wouldn't waste my time with it. With FOSS tech at least you can make the argument that you can learn stuff by maintaining it properly but with a proprietary stack in an unsupported and actively user hostile environment the best you are going to do is learn how to maintain a fragile truce with the software gods.
Peel away all the politics/idealism from your comment and the value proposition between these two options is basically the same, with the difference being that on a proprietary stack there’s a higher chance of things breaking in a way that you have low/no likelihood of fixing. It’s all good and well that this seems to make you personally want to throw up in your mouth a bit or whatever, but you are claiming an objectivity that clearly isn’t there.
That's not a good way to make money. It's not how FAANG pays people, and if it is how your employer pays people then you should always be learning so you can change to better jobs.
A funny thing about "never work for free" advice is that a lot of highly paid jobs (investment banking, high end escorts) are about doing tons of client work for free in a way that eventually gets them to pay you way too much when they do pay you.
I learn the interesting stuff, I just don't learn proprietary tech that I really don't ever want to be dependent on for my wages.
In fact most of the essential skills for my job I've learnt in my own time, and continue to learn. I invest my own money in equipment and training courses. I love learning. But only when it's interesting to me, not because it'll make more money to somebody else. If it'll make you more money, pay me.
> Frankly there is no value in learning user-hostile proprietary technologies in a way that the owner of said technologies actively wants to discourage and prevent.
This is a gross misunderstanding of the GP's point though. It's not that they are against doing any of these things. In fact, they said they were more than happy to do it in their youth. I am in full agreement with the GP's sentiment as well.
Mucking about and tinkering with things while one has the time, desire, and stuff to learn is a young "man's" game. I did all of that and absolutely learned a helluva lot. It did everything I needed from it. I got a cheaper/better computer than what I could otherwise afford. I learned a hell of a lot, not just about the hardware pieces I chose, but also about why/how certain things in the OS work that I never would have learned otherwise.
But now, I too just don't care. It was interesting, but I'm not that interested about maintaining an OS or how it works. I just want it to work. So for all of those that are willing to do all of that today, I'm all for it.
your comment came across to me as just another one of those "if you don't feel the same way i do, you're wrong". that's not true. people can just be in different places in their life. been there, done that does not mean you can't go there and do it too. we're just focused on different things now
There’s another perspective: even if OP is done, if we shut the door (or let it be shut by companies like Apple) then the currently-young won’t be able to tinker and won’t grow to gain the same knowledge.
They are free to continue that kind of work, it just gets harder. Look at Asahi Linux. While it might not be Hackintosh in the same sense, it is the same spirit. Hackintosh worked because the systems were built on commodity hardware. Now that Apple is using custom chips, they've definitely made it a bit more difficult, but in my experience that just brings out the really talented that step up to the plate to take a swing.
I agree that tinkering is a side effect of curiosity, and that curiosity leads to expertise, which has value.
I parlayed my curiosity in hardware into my first job. (My car-fixing skills alas didn't take me anywhere.) Hardware was fun for the first 10 years of my career, but now, well, it's just not interesting.
I played with Linux as well along the way, but I confess that too has dulled. Building your first machine is fun, building your 10th is less so.
The past couple years I've gone down the solar energy rabbit hole, and I'd love a wind turbine (but I just can't make the economic argument for having one.) If I do end up getting one, it'll be to prove to myself that it was a dumb idea all along.
In some ways we never stop tinkering. But the focus moves on to the next challenge.
> Building your first machine is fun, building your 10th is less so.
Building a Linux box led me back to Apple.
I had been using UNIX at home, school and work for several years, and decided it was time to build my 3rd Linux box. Went to CompUSA out of idle curiosity to see what equipment they had, and the only computer in the store with Internet access was a Mac.
I hadn't used a Mac since the SE/30 days, and I suddenly realized that the NeXT acquisition which I'd mostly ignored had changed everything. Why build a Linux box and be locked out of tools like Photoshop when I could have a UNIX workstation that ran commercial software (for, admittedly, significantly more money)?
> Why build a Linux box and be locked out of tools like Photoshop
That's what VMs are for. You're never really locked out. It may not make sense to go that way if Photoshop is THE thing you work with of course.
> when I could have UNIX workstation that ran commercial software
Because for lots of software MacOS is a second class system. Partially because there's just no way to test things on it without investing lots of money in hardware, so many people don't.
If you're doing lots of sysadmin / software maintenance style work, MacOS just provides unnecessary pain.
I'm working on packaging things for the Darwin platform, and helping people deal with homebrew/compilation issues. I'm painfully aware how much is developed by people with no access to or interest in MacOS. And unless something targets Windows explicitly (not WSL), you can basically expect issues going in. In a twisted way, I'm one of the enablers of the current situation where things are usable on a Mac.
Sometimes you can tell by the simple fact that the git repo contains files in one directory that conflict in naming. Linux has no issues with "Foo" and "foo" coexisting.
> > I'm painfully aware how much is developed by people with no access to or interest in MacOS. [...] Sometimes you can tell by the simple fact that the git repo contains files in one directory that conflict in naming. Linux has no issues with "Foo" and "foo" coexisting.
> Nothing to do with Linux, and everything to do with case sensitive filesystems common on UNIX.
Of the three major desktop operating systems (Microsoft Windows, Apple macOS, and the Linux family), only Linux has case sensitive filesystems by default. Therefore, it's likely that someone who didn't care about filename case conflicts was running Linux.
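The difference is easy to demonstrate (a minimal sketch, assuming the default filesystem on each OS):

    touch Foo foo
    ls
    # On Linux ext4 (case sensitive): two files, "Foo" and "foo".
    # On default macOS APFS or Windows NTFS (case insensitive but case
    # preserving): the second touch hits the same file, so only "Foo" exists.

So a repo containing both names checks out cleanly on Linux and collides on the other two.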
If that works for you, great. My default works better with Linux with only occasional other system. Makes me least angry. (Also because Linux is the only system that handles sleep/hibernation for me without issues, ironically...)
I think the awkward part of your first post is that you appear to start with a value judgement that tinkering is for poor people whose time is worthless. That's not remotely fair to either poor people, or rich people who like to tinker. No one's time is worthless. Not your time. Not mine. It's all just time.
Fair enough, and no, I didn't mean to imply that time is worthless. It's not the value of time that changes, but the amount of it you have.
In a work context a shortage of time (more customers than you can handle) means you need to discriminate, which means you can't make everyone happy. Which usually means differentiating based on value. (Aka, you get more expensive.)
For personal time you also become more discerning. Spend time with spouse, or build another computer, or lie under a car etc. Life has more demands, so there are more choices.
Incidentally, one of those choices is to work less.
The tinkering never goes away, but I prefer to tinker in profitable areas now. (I get to tinker for work.)
All my PCs and servers run Linux, and it's certainly not out of some idealism or anything. I'm fundamentally lazy, but I have a high standard for how things should be. As a result, I tend towards the highest quality at the lowest cost (time, money, etc.), and that's Linux for me. Specifically, the setup I run on almost all my machines is the most optimal way I have found to write and run software, and play games.
If Windows were easier to use, more stable, less of a hassle, easier to fix, I would use it, but it's none of those (for me). When I have a Windows problem, I can either try magical incantations to fix it, reinstall, or give up, and each of those takes much longer than most things I could possibly do on my Linux systems. Even if my Linux box fails to boot, the drivers break, and my SSD doesn't mount, all those fixes together take less time and effort than finding a fix for the most trivial of Windows problems.
The most trivial problem on Windows has been that the right-click menu doesn't fully populate on first right click. I reported the issue, and that's all I can do. It's been a year and nothing has changed.
On Linux, a less trivial problem (a calculator crashing on a series of very weird inputs) was solved by me opening it in gdb, fixing the code, making a PR, and having it merged.
I guarantee a lot of people are on Linux because it's easier, and for no other reason. I don't need it to "just work", because I will break it. I need any possible fix to be possible in bounded time.
Windows has been disconnected from user needs for a very long time. Any logical person would've put a "right click" icon in the Control Panel that would give the user full control of what does and doesn't appear in the menu, their order, etc.
I also use Linux on all my machines but that's because (perhaps after years of tinkering) it is currently the most turn-key laptop/desktop OS. Things just work, they don't break without a good reason, and weird limitations don't randomly pop up.
Windows at work, despite being maintained by professional helpdesk staff, or Macs my family have, with all the ease of use designed by Apple in California, are not like that.
Just the other day I tried to download an mkv file over https on a Mac and I couldn't get it to exceed 2.5 MB/s. Same network, same server, my laptop breezed at over 20 MB/s and Apple took out that walker for a stroll at a very leisurely pace. It didn't come with `wget` either.
If you sincerely believe this, you've tinkered enough that the massive knowledge barrier that is Linux seems like nothing to you.
I would never sit my 70 year old mother down in front of a Linux machine. We're not at "caring that video files download too slowly" - we're at "how do I put a file on a USB".
Put USB stick into computer, click on "Files" in the program chooser, select the USB drive (helpfully listed as "USB drive" even), drag your files there?
Same as on Windows and MacOS really. I don't dispute that Linux has rough edges, but putting files on a USB stick is not one of them tbh.
I have very little Linux sysadmin knowledge and have been using it on my home notebook for 5 years and my work one for two years now.
Really no issues with the OS.
I was using the very excellent 2015 MacBook Pro before, but despite hardware that isn't quite as nice (not bad though), I can't go back to Mac OS. I know I pay a premium to get Linux preinstalled over Windows, but it's not bad.
> Something never quite sat right with me about this argument, and your comment finally made me understand what it is: the understanding you gain from tinkering is priceless, and it's exactly the experience that you use to help everyone around you: it turns you into an expert.
I have plenty of other things I’d rather tinker with and become an expert on, though. My computer is a tool to let me work with those things. It’s not fun when I have to debug and fix the tool for hours or days before I can even start working on the things I want to work on.
This is me. The range of things I want to tinker with has grown. Various house projects, jiu-jitsu, cooking, etc... are all things I tinker with and learn from. Building computers, I've done and don't feel the need to do again. I even built a Gentoo install long ago when I was learning the nuts and bolts of linux.
This argument is quite out of date. You'll lose a whole lot more time on forced Windows 10/11 updates than you'd spend managing a reasonable Linux installation. ("Reasonable" meaning avoid things like Arch or Ubuntu, and pick decent, natively supported hardware.)
That argument doesn’t sound very convincing to me. How would I know that avoiding Ubuntu is reasonable? That still seems to be the go-to distro for many people I know that like to use Linux but aren’t Linux experts. How do I know which hardware is natively supported?
With Windows 10/11 I’ve never had any problems, either with pre-built computers or my home-built PC. Hell, running Ubuntu in WSL has been relatively smooth as well.
My experience with Linux as an OS has been fairly good for many years, regardless of the distro. It’s the applications that could be an issue. It feels like it’s only very recently (post Steam Deck in particular) that gaming seems to be viable at all. And it’s hard to beat the MS Office package for work. I recently got the idea to have two user accounts on my home computer, one dedicated to working from home and logged into my Office 365 account from work... and it was honestly amazing how suddenly everything was just perfectly synced between my work and home computers.
If you have recently endured Windows Update for Patch Tuesday, you know that you are forced to reboot during this process. This activity will deny you "the five 9s," i.e., 99.999% availability in uptime.
If you have recently performed the analog activity on a Linux distribution, which is likely either apt update/upgrade or yum update, you will notice that a reboot is not required. These update approaches cannot alter the running kernel, but ksplice and kernelcare offer either free or low-cost options to address that.
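For concreteness, the whole no-reboot cycle is a couple of commands (a minimal sketch; the dnf spelling assumes a current Red Hat-family release, where it replaced yum):

    # Debian/Ubuntu family
    sudo apt update && sudo apt upgrade
    # Red Hat family
    sudo dnf upgrade
    # Only a new kernel needs a reboot to take effect; a patched service
    # just needs a restart, e.g.:
    sudo systemctl restart nginx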
Windows update is enormously painful compared to Linux. There can be no argument of this fact.
> This activity will deny you "the five 9s," i.e., 99.999% availability in uptime.
Which is something 99% of personal computers don’t care about even slightly. These days restarting your machine is a very inconsequential event, your browser can effortlessly reopen all the tabs you had active, macOS will even reopen all the windows for your native apps.
I don’t mean to defend Windows Update, I just think “you have to restart your computer!” is not a particularly good reason to damn it.
A complete patch Tuesday session is twenty minutes of reduced performance, followed by a "don't reboot your computer" wait of unknown length both before and after the reboot.
Anything is better than that, especially when some updates either reboot immediately or kindly give you five minutes to close everything down (was tmux made precisely for Windows update?).
Exposure to apt/yum really makes Windows intolerable, just for this alone.
> especially when some updates either reboot immediately or kindly give you five minutes to close everything down
I have been a Windows user since XP. Never, not even once did Windows decide to reboot without asking first. Never.
The only way this could've happened is if Windows kept asking you over the span of a week or two to restart to apply the updates and you kept postponing it.
Either way, "Hot Patching" will soon be a thing on Windows so restart won't be required every month [1].
That would be a firing offense at my company. Company files stay on company hardware. Personal files stay on personal hardware, and never should the two meet.
Not the OP, but personal files are not just vacation pictures. I work in R&D and I have my org-mode/roam on various scientific and technical topics going back 15 years or so. I use these for work to benefit my current company, and maintaining two parallel versions of these is rather inconvenient.
Isn’t that exactly what a cloud drive is for? There’s a difference between using your personal notes for business purposes on the one hand, and keeping company property and data on a machine totally outside IT control. That’s just a massive lawsuit waiting to happen, and it’s bad for the employee too - why would you want the liability?
I wouldn't store company data or code outside of approved services, but one might say that my notes, including notes on the people I meet and the projects I work on, can constitute proprietary information - so yeah, it is a bit of a grey area still.
That may be sensible if you want or need stronger security and isolation.
However, many companies do support BYOD, especially on mobile where it's a pain to carry two phones around.
There is some support for this. For example, Apple supports dual Apple IDs and separate encrypted volumes for personal and corporate data. Microsoft apps (Outlook) also have some support for separating personal and corporate data.
The benefits of BYOD can include lower equipment costs, lower friction, and potentially higher employee happiness and productivity.
Yeah, preinstalled. And I never had issues with Ubuntu breaking in the ways Arch or Gentoo do. Breaking includes trying to install some new thing or upgrade, and then having to google random other stuff that broke.
That is patently wrong. I run Fedora on my Framework because it is the most supported and recommended distro for it, and I mostly just need a web browser for most of the things I do on it. I've had kernel upgrades break wifi completely, the fingerprint reader doesn't work properly out of the box, 6 GHz wifi isn't supported (though neither is it in Windows 10), VLC (which I hate using) is the only media player that supports playing from SMB shares on Linux, Wayland isn't compatible with Synergy-type software (and my web browser doesn't work well with xorg), etc.
Most of these things worked without any fuss in Windows and I can't think of any notable Windows issues I had to deal with on the laptop before I installed Fedora.
I ran Ubuntu and then Arch as my daily driver from 2004 to 2017. When I started as a consultant working for Western companies, I thought they would care about me being clean copyright-wise, so I went 100% Linux. This was obviously not so, but what did I know? I deeply regret doing this now. (I was dual-booting before.)
With Ubuntu, upgrades every six months or so meant you were better off reinstalling and reconfiguring -- no matter which way you went, it was 2-3 days of work lost to tinkering with the system. With Arch, the whole system doesn't shatter; it's just that this and that don't work, and it's frustrating. Bluetooth and multifunction scanner-printers were at the forefront. In fact, I needed to sell a perfectly working Samsung MFC at one point because Samsung ceased to make drivers, the old ones didn't work with newer Linux, and while open source drivers eventually surfaced, that only happened years later. Let's not even talk multimedia. https://xkcd.com/619/ is ancient but the priorities are still the same.
Neither system was great at connecting to weird enterprise networks, be it enterprise wifi or strange VPNs. At one point I was running an older Firefox as root (!) to be able to connect to the F5 VPN of my client, because the only thing supporting 2FA for that VPN was a classic extension -- and the binary helper disappeared in the mists of time. The only Linux-related discussion was... the IT head of my client asking how to connect Linux to his VPN now that he'd turned 2FA on, and being told it doesn't work. https://community.f5.com/discussions/technicalforum/linux-ed... well, I made it work, but faugh.
I have been running Windows 10 + WSL since January 2018 and all is well. It reboots sometimes while I am asleep and that's about it. You need to run O&O ShutUp like once in a blue moon. Right now I am on Win 11 as my primary laptop is being repaired; you need to run ExplorerPatcher, but that's it. It has indeed been six years and there was never an update where the OS just didn't start up or a hardware driver decided to call it quits after an upgrade.
Also, updates are not forced; I control my machine, thanks much, via Group Policy.
Bluetooth mouse, keyboard, headphones, controller: all work. Intel iGPU works, including hardware-accelerated video in browsers. VPN: Pritunl worked without issues; Perimeter 81 initially failed, but works after an update.
Wayland, Pipewire, Wine, Proton - the Steam Deck is a wildly successful multimedia device. The priorities are the same; NVK has joined the open source drivers.
Linux does not connect to "enterprise wifi or strange VPN" - ok.
Well I'm just a rando, and you didn't ask me, but I agree with the sentiment, so: Fedora. Or openSUSE. I'd be more comfortable giving a newbie Fedora.
I was a Debian devotee for nearly 25 years, but I've found it to be less foolproof and fault-free lately, and it has always lagged behind current package versions in Stable, forcing you to run Testing (or -backports) or even Unstable to get newer versions -- with corresponding potential for breakage.
Debian Stable was very out of date 25 years ago, but ever since the mid '00s (after Ubuntu got popular) it has improved by miles. Debian Stable is akin to Ubuntu Stable LTS. Ubuntu Stable non-LTS is a 6-month snapshot from Debian Testing and does not get supported for long. If you run Debian Unstable, you're probably running something akin to a rolling distribution. What is best all depends on your goal and the purpose of the task. Personally, I very much like the Debian ecosystem and would prefer any Debian(-based) OS. However these days, Docker can trivialize a lot (and also mitigates your mentioned issue), ZFS and other filesystems allow rollback in case of issues (useful on a rolling distribution, but also on Debian Unstable), and hypervisors allow snapshotting, multiple OSes, and all that, too.
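As a sketch of that ZFS safety net (the dataset name here is made up; yours will differ):

    # take a snapshot before a risky upgrade on a ZFS root
    zfs snapshot rpool/ROOT/debian@pre-upgrade
    # ...upgrade; if the system is now broken, roll it back:
    zfs rollback rpool/ROOT/debian@pre-upgrade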
For a server I'd recommend Proxmox (especially since ESXi is now only for enterprise). From there, have fun with different OSes.
Proxmox on a desktop is a bit meh, but possible. There's a lot of useful Linux desktop OSes out there. For example if you want to perform pentesting you can use Kali Linux. The one which interests me most from a security standpoint however, is Qubes OS (Fedora-based, sadly, but you can run anything on top of it). For gaming, SteamOS is neat (Arch-based, these days) and could even be fun to have a kid play around with Linux, too.
As for macOS, I played around with Hackintosh a couple of times in the past with success. But I never liked it much because you'd lag behind on security patches, and every new update meant praying it'd work. I did get it to work under Proxmox though; that was fun, but I had to install a second (dedicated) GPU for it. The latest M-series ARM-based Macs work very well; the only disadvantage is the fat price upgrade for RAM and SSD (often even soldered!). That part is terribly sad.
This is absolutely false. I run dual-boot Windows and Linux on hardware that has 100% Linux support. Windows just works, the same cannot be said for Linux unless all you do is use a browser and listen to Spotify.
There are pain points on both. Audio on Linux is still annoying if your system isn't very vanilla, while Windows sucks at bluetooth, configurability, and has a lot of annoying anti-user "features".
Windows does not “just work”. On my work computer my programs randomly rearrange themselves after lunch, windows always has trouble switching between my audio devices, random slowdowns. Windows is pretty shit these days tbh. It’s pretty much like Linux was 10 years ago.
However, I rarely have issues on Linux anymore, mostly because if something is broken on Linux, I can fix it.
Frankly, I hate that I’m forced to use Windows at work. I feel like I need to constantly deal with BS Windows annoyances. When I go home and work on Linux it’s like breathing a sigh of relief. My desktop actually feels fast and efficient.
> On my work computer my programs randomly rearrange themselves after lunch, windows always has trouble switching between my audio devices, random slowdowns
> I rarely have issues on Linux anymore, mostly because if something is broken on Linux, I can fix it.
Perhaps your Windows knowledge is not up to the level of your Linux knowledge? It might be that a Windows expert could fix every issue you’ve listed and more.
I've worked daily in a Windows enterprise environment for 15+ years (which means that when it won't work I usually "just have" to get help from a colleague).
I've been in charge of a Debian/PostgreSQL cluster for 10+ years, which I managed to keep upgraded on a reasonable schedule.
And yet, since Windows updates on my home gaming PC stopped working two months ago for some utterly opaque random reason, I feel totally clueless about how to even begin to debug this crap.
There seems to be absolutely no clear working procedure out there to fix that, only people with the same problem shouting out to the void. All them poor souls trying byzantine procedures that have been duplicated ad nauseam from stack overflow to windows help forums through reddit and back.
The consensus seems to be to reinstall Windows from scratch (choosing amongst a handful of ways whose risks/benefits look unclear).
That really pisses me off, but I guess it's the user's fault because "my Windows knowledge is not up to the level..."
That’s very possible, but I don’t want to invest time gaining knowledge in a proprietary platform. Microsoft already owns most of the default stack programmers use these days. I don’t want to contribute my energy to entrenching them further.
These have been pain points for me. Not saying they're impossible to solve on Linux, but it's nontrivial especially compared to Windows
Change trackpad scrolling speed
Set up suspend-then-hibernate
GPU drivers (I have a box with an AMD APU and no idea how to actually utilize it)
Many games (Proton is amazing and a huge leap forward, but on average it's still more work than gaming on Windows, e.g. fiddling with different versions of Proton or finding out that a game's anti-cheat will ban you for using Linux)
Higher res video streaming (I think this is usually a DRM issue?)
Full disclosure: I'm posting this list because I'm hoping that someone will tell me I'm wrong and that Gnome actually has an easy way to set the trackpad scroll speed
If you're on X11, I think you'll have to use xinput to set it manually.
If you're on Wayland, in KDE at least this is available in the standard settings application.
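For the X11 route, something like this should work, assuming a libinput-driven touchpad and an xf86-input-libinput recent enough to expose the scrolling property (the device name below is an example; check yours with the first command):

    xinput list                                # find the touchpad's name/id
    xinput list-props "SynPS/2 Synaptics TouchPad"
    # look for "libinput Scrolling Pixel Distance"; a larger distance
    # means slower scrolling, a smaller one means faster
    xinput set-prop "SynPS/2 Synaptics TouchPad" "libinput Scrolling Pixel Distance" 30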
> Set up suspend-then-hibernate
On KDE at least that's just one of the options in the power settings ("When sleeping, enter:" has "Standby", "Hybrid Sleep" and "Standby, then hibernate").
> GPU drivers (I have a box with an AMD APU and no idea how to actually utilize it)
Worked OOTB for me, do you have amdgpu drivers installed? What exactly isn't working?
> Many games (Proton is amazing and a huge leap forward, but on average it's still more work than gaming on Windows, e.g. fiddling with different versions of Proton or finding out that a game's anti-cheat will ban you for using Linux)
I find that Proton mostly just works for me, but indeed EAC is a problem that I don't know how to solve (and also don't really care about since I'm not into playing public multiplayer games).
> Higher res video streaming (I think this is usually a DRM issue?)
You should check if HW Acceleration is enabled in your browser, but IIUC Netflix will indeed refuse to provide higher quality streams to Linux (and also Windows depending on your browser), you might be able to resolve it by googling a bit, maybe using a browser with DRM support and switching out your user-agent?
> I'm hoping that someone will tell me I'm wrong and that Gnome actually has an easy way to set the trackpad scroll speed
Gnome is notorious for removing user choices, so I wouldn't be surprised if this was impossible on Gnome/Wayland. Xinput might work on Gnome/X11. Switching to KDE should work on Wayland ;)
Alas, I'm using Gnome. There's a setting for changing scroll speed with a USB mouse but not for a laptop's track pad. I don't see anything for standby-then-hibernate either.
>Worked OOTB for me, do you have amdgpu drivers installed? What exactly isn't working?
Based on their compatibility list[1], it doesn't look like amdgpu supports my hardware (Richland chipset). Most distros I've tried don't even boot unless I add "amdgpu.dpm=0" in GRUB.
> it doesn't look like amdgpu supports my hardware (Richland chipset)
I see, looks like your card is too old for the official open source amdgpu support, meaning you should either install the unofficial open source ati drivers as per https://wiki.archlinux.org/title/ATI or try the official proprietary drivers from AMD (which I assume will be too outdated to function on a modern kernel?).
Not OP, but the fact that I have an easily accessible text file on my desktop with the exact commands to run in my terminal to recompile the graphics driver when upgrading packages breaks graphics again should speak volumes. I don't really mind, because running 3 commands in the terminal a few times per year is not particularly difficult for me. I could see it being difficult for non-devs though.
What does get annoying is when such an OS upgrade breaks the wifi drivers and I have to setup a bluetooth hotspot on my phone to access the github repo and fetch the latest driver version for the wifi dongle.
At this point I feel like Linux may be more likely to just work than a windows machine. I just had the unfortunate experience of setting up windows 11, and the number of ‘please wait while we get things ready for you’ was truly astonishing.
It's not. You can go and pick up any computer currently on the market, doesn't matter if it's 300 or 3000 dollars, and as long as it is an (IBM) PC it will run Windows.
Will it always be flawless? No. Will it always work perfectly out of the box? No. But it will work and generally you have a good chance of it working as you wish assuming you are fine with Windows and what MS does with it.
I bought an Asus Zephyrus G15 (2022) specifically because it was recommended to me because it is supposed to be great for Linux and it's probably the worst Linux experience I have ever had. As the first piece of hardware that I specifically picked for Linux support.
Because most DEs don't do fractional scaling but all high end laptops have too much DPI to not have fractional scaling.
Nvidia is still not providing proper Linux drivers.
Asus can't program to save their lives but the tools that replace the Asus stuff on Windows are still better than the stuff that is replacing the Asus stuff on Linux (asusctl/ supergfxctl vs G-Helper).
I once had a machine where the nvme drive was simply not working. That was when Kernel 5 came out. It broke on Fedora but worked in Mint until Mint got Kernel 5.
During my last Linux adventure, KDE just died when using WhatsApp as a PWA (where I live, WhatsApp is essential software to have a social life).
And even after years of Wayland being around, it's still impossible to have apps that aren't blurry in most DEs because X11 is still around.
You're complaining about software updates and user friendly loading screens. The issues that drive people away from Linux and to Windows are literally unfixable to 99% of the techies that try Linux. I'm not fixing an nvme driver in the Kernel. That's not my area of expertise. But I still need my machine to work and on Windows, it does.
Rufus lets you create an ISO that skips most of the Windows 11 nonsense, btw.
I think that everyone knows that's a pretty ridiculous statement. Installing Windows 11 is basically putting in a USB stick, waiting about 8 minutes, clicking a few things and typing out your login and password. I love Linux, first started playing with it about 20 years ago now. There's not a single dist I've ever seen that is that simple. Just a basic fact, sorry.
Now, that is a ridiculous statement. Installing Windows has never once been the smooth experience you describe. It's been long wait times, dozens of reboots, and never-ending cycles of Windows Updates. Always has been for the last 20+ years.
Today, it's even made worse by the fact that MS is intentionally driving Windows UX into the ground in exchange for short-term profits. Installing Windows isn't "clicking a few things." It's going out of your way to disable piles upon piles of anti-features MS throws at you, whether it be spyware, bloatware, or the hyper-aggressive nags pushing you to act against your will. The lengths die-hard Windows users go to to "de-bloatify" their Windows installations these days are absurd.
It's true that Windows had a superior end user UX over Linux 2 decades ago. But that has changed with improvements on the Linux side and poor, poor decisions on behalf of MS.
You're greatly over-exaggerating how much effort it takes for a power user to set up Windows. I had to do it the other day on a Dell MiniPC (sadly couldn't use Linux since I needed HDR) and it's just the following.
1. Set up USB stick in Rufus with all the setup skips enabled
2. Select install options, skip key, next next next
3. Wait for it to install
4. Say no to MS account, put in username, password, and security questions
5. Wait for a reboot and setup
6. Connect to internet, run Windows update, reboot when done
7. Uninstall the few bloatware apps in the start menu, most of them are UWP so the uninstall button does it immediately, takes no more than 5 minutes
8. Disable web search from group policy
9. Install Windows Terminal, Powertoys, and another web browser.
I could easily automate steps 7, 8, and 9 through powershell and winget if I wanted to. The total install time was less than 10 minutes plus the time it took for Windows Updates to install and I have a pretty clean environment.
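Roughly like this, for instance (a hypothetical sketch; the app name and package IDs are examples you'd verify with `winget search`):

    # Step 7: remove a preinstalled UWP app for the current user
    Get-AppxPackage *solitaire* | Remove-AppxPackage
    # Step 8: disable web results in Start menu search (the registry key behind the GPO)
    reg add "HKCU\Software\Policies\Microsoft\Windows\Explorer" /v DisableSearchBoxSuggestions /t REG_DWORD /d 1 /f
    # Step 9: install the usual tools
    winget install --id Microsoft.WindowsTerminal
    winget install --id Microsoft.PowerToys
    winget install --id Mozilla.Firefox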
In comparison, with Fedora running on Gnome I'd have to spend a solid amount of time messing with dconf settings to get fractional scaling to show up and for my touchpad to scroll at the correct speed + installing extensions to get a UX as good as Powertoys has by default, and on KDE I would need to spend the same amount of time messing with settings and installing KWin scripts to get functional tiling (although that might have got better since I last tried it).
Oh and on MacOS I would be up and running in almost no time, because there's no way to fix the absolute dumpsterfire of a UX it has so I don't even bother.
So all options kinda suck, Windows just sucks in its own ways.
> There's not a single dist I've ever seen that is that simple. Just a basic fact, sorry.
You having that experience does not make it a basic fact.
I didn’t even have to do the actual installation, as it was a prebuild machine. The only thing I had to do was the ‘clicking a few things and typing out username and password’ part.
Comparing the two between Ubuntu and Windows, I’m forced to conclude that Ubuntu has the easier version, or at least faster. And windows has the advantage/disadvantage of needing my MS account to set up an operating system.
Windows installer images have some files too large for most tools to handle, and I believe the USB stick needs to be exFAT formatted too. Virtually any tool for making a USB stick would fail in various ways on macOS.
> Windows installer images have some files too large … I believe the USB stick needs to be exFAT formatted
That’s true. I forgot!
While it’s not part of the UEFI spec many (most?) consumer BIOSes will be able to UEFI boot from NTFS as well, so formatting as that might also be an option.
Both that and exfat should be easy to do on Linux. No idea about MacOS.
Which brings me to
> too large for most tools to understand
This I don’t understand. What tools? What tools do you need beside “cp”?
I don’t know if you can just copy the files over. It seems you also need to make the USB stick bootable?
There’s dozens of guides on doing this on a Mac, they all seem outdated. I found a tool called WinDiskWriter on GitHub and it was the only GUI tool that worked.
Suspect there’s more to it than just cp. There’s wimlib too for handling the larger files on the install ISO.
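For what it's worth, the manual route on Linux is roughly this (a sketch; the device and ISO paths are examples, and wimlib must be installed):

    # FAT32, so UEFI firmware can read the stick
    mkfs.vfat -F 32 /dev/sdX1
    mount /dev/sdX1 /mnt/usb
    mount -o loop Win11.iso /mnt/iso
    # copy everything except install.wim, which is usually >4 GB and
    # won't fit on FAT32
    rsync -a --exclude=sources/install.wim /mnt/iso/ /mnt/usb/
    # split it instead; Windows setup picks up the .swm parts automatically
    wimlib-imagex split /mnt/iso/sources/install.wim /mnt/usb/sources/install.swm 3800

No boot-sector magic is needed on a UEFI machine: the firmware just looks for \EFI\BOOT\bootx64.efi on a FAT volume, which the plain copy already provides.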
TLDR: UEFI just checks NVRAM for pre-existing boot configs (stored paths to EFI executables for installed operating systems), and when doing "dynamic" booting from some random media, it checks whether the volume has an EFI executable for the given architecture (for instance in \EFI\BOOT\bootx64.efi for Intel x64), and if it does, it loads that file.
UEFI usually boots straight into native long mode without any weird 8086 compatibility modes being employed (which the OS would otherwise have to unroll), so for the OS it's simpler to deal with.
It can also serve as a multi-boot menu on machines which has several OSes installed.
It often comes with an MS-DOS-like "UEFI Shell" you can boot into... to manage UEFI itself. So if something doesn't boot, you can just boot into the shell instead and try to fix things from there.
It may sound complex, but once you get into it, it's really much easier to work with than legacy MBR boot and all the "magic" things you have to do there to get things booting.
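On Linux you can poke at those NVRAM boot configs directly with efibootmgr (a minimal example; the entry numbers are made up):

    efibootmgr -v             # list boot entries and the EFI executable each one points to
    efibootmgr -o 0002,0001   # change the boot order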
> hackintosh is likely in the rear view mirror for the next generation of tinkerers.
Part of this might be that making Hackintoshes is so much harder now, but part of it might also be that OOTB desktop Linux is luxuriously good these days compared to where it used to be. Ubuntu and Pop!_OS are absolutely on par with MacOS for a user who meets the (admittedly higher) entry requirements for using Linux.
Your comment makes me think of my 3d printing journey. A lot of printers require maintenance and tinkering just to keep them functional. To an extent, since they are targeted towards “makers” who like to play with these things, that’s fine.
But sometimes the thing you’re trying to build is of central importance, and you want the machine to stay out of your way. Tinkering with the machine takes away time you could be exploring your ideas with a machine that’s already fully functional.
This argument makes a lot of sense. I get more upset than I probably should about car issues, likely because I never spent the time to tinker with them, so I feel rather helpless… and I don’t like feeling helpless.
In my youth I did a lot of tinkering with computers and it has paid dividends. It gave me a career.
These days though, I want to be able to tinker on my own schedule. I want my primary computer, phone, and car to “just work”. That means any low level tinkering needs a second thing. That can work fine for computers, because they’re small and relatively cheap. The idea of having a project car isn’t something I ever see myself doing, as it’s big and expensive.
I can still tinker on some things with my primary computer without it being a problem. Tinkering on writing software, running servers, or whatever, isn’t going to kill my ability to do other things on the computer. A lot of tinkering can be done without tinkering with the OS itself.
> the understanding you gain from tinkering is priceless
You pay with time. It's only priceless if you are a romantic or lack foresight (because what you did with your total time will be way more important than what is left). Otherwise it will always be the most expensive thing you have (and we must still be able to spend it without care, because what would life be otherwise).
> But when it doesn't, I often wish I was that guy that had tinkered with my car
Don't. Instead build a network of experts you trust and make more money doing what you do best to pay them with. Trying to solve the world on your own is increasingly going to fail you. It's too complicated.
Disclaimer: This became more of a rant than I intended. I've become pretty unhappy with the general quality of the "professionals" I've interacted with lately.
I just can't agree with this take. It sounds that simple, but it's not.
I happen to enjoy learning and fixing.
It would take me a long time to build that trust. Nobody cares about my things and my family's safety like I do.
Most people are a long way from making as much money as an expert would charge them.
In the last couple of years, I have had some terrible times when I call for help.
When the dealership is charging $200/hr to have a kid plug in the car and follow a flowchart, I'll just take a look myself.
Plus one time they left my fuel pump loose and I had to pay (in time and money) for an extra round trip with Uber, and the fuel it sprayed onto the road. They didn't fix the original problem, which cost me another round trip.
Another time, I had technicians (experts) out to look at my leaking hot water tank 4 times before they decided it was time to replace it. I wasted the time calling, babysitting, coordinating, figuring out how to shower without hot water, etc.
If this is the average "expert" count me out. I'll do it myself. Plus, throwing money at a problem isn't near as fun.
> When the dealership is charging $200/hr to have a kid plug in the car and follow a flowchart, I'll just take a look myself.
Regrets about not investing the time to become an intuitive handyman are a very different category from "let's see if there's a video on yt to help me fix that in 5 minutes". My message is definitely not "don't get your hands dirty" but "be practical". Doing the yt/google/chatgpt thing to get an idea is mostly practical.
> If this is the average "expert" count me out.
You disclaimed, no problem — but I did write "build a network of experts you trust". Just calling someone and being annoyed that they are not good (and I agree, most of them are not) is not that. It's going to take time and money, but decidedly less so, because you get into the habit of doing it, you learn, you see red flags, network effects are real (people know people), and relationships on average last long enough. That is my experience, at least, but I have no reason to believe I would be special here.
> Plus, throwing money at a problem isn't near as fun.
That's true, in my case, only for very few problems. Most problems I would rather not solve myself.
I'll admit: All of this is a concession to reality, at least my perception of it. Learning is fun. I would really love to be good at a great many things. It's just increasingly unreasonable to invest the time necessary, because things get more complicated and change more quickly.
Staying good at a few things, learning whatever is most important next, and getting better at throwing money at the rest, will have to do.
I'm enjoying this thread. I want to add that building a network of experts has other costs too.
Sticking to a network will limit the variety of people you get to meet, everything else the same. Local maxima.
It also isn't practical in some circumstances; if I travel for work or move cities every few years, the local network for mechanics gets lost. The cost of keeping the network would be staying in one place.
>> The cost of keeping the network would be staying in one place.
One man's bug is another man's feature:). You describe staying as a bug, I've lived in the same house for 24 years, and, for me, it's definitely a feature. I'd positively hate moving to another suburb, never mind city.
And yes, I've developed relationships with local service providers. My plumber, my electrician, my mechanic, all know me by name. I've found the people I can trust and they eliminate those hassles from my life.
But, and this is my point, I'm not you. My context, my goals, my desires are all different to yours, and that's fine. We're all in different places, being different people, and that's OK. It doesn't have to be "us versus them". We might enjoy different things, and have different perspectives, but that's OK.
There's levels to tinkering though. When I was running Ubuntu, a lot of the tinkering came down to searching for what config files to update. Sure, that freedom is nice if you care to use it, but it's mostly just searching, configuring, experimenting. This is hardly fun or instructive.
A deeper form of tinkering is actually working on the code. I think an instructive example is writing your own X window manager with xmonad. You get to see exactly how a whole window manager works.
These days, if you have the skills and tools to swap a transmission you have to tow it into a dealership and beg them to flash the transmission so it will work in your truck. If you want to avoid that you better know where to find the strategy code and match it up before purchasing another transmission. Same goes for touch screens and a whole slew of essential parts. While we weren't looking the rug was completely pulled out from underneath us. Now your family mechanic is beholden to the dealership.
> People have been making this argument to me about Linux for more than 25 years. The most cutting version that I ran across was:
> > Linux is only free if your time is worthless!
But it is exactly why I quit Linux and returned to macOS. I used to run Linux on cheap 2nd-hand ThinkPads and for 3 years on a MacBook as my main system. But after another upgrade once again destroyed all network connectivity, I quit.
macOS isn't perfect, but it works in the most important areas, and I can tinker with small stuff when I feel like it.
Ubuntu is so easy to use. I enjoyed using Arch before, but got to a point where I also just wanted my PC to work without any tinkering. Ubuntu is very good at that.
Your argument is excellent and made me evolve my point of view about Mac. I use Mac for efficiency, and yet, I was wrong about what kind of efficiency I’ve been developing. Tinkering is so important, even if just for the fun of it.
Do you not have any hobbies to "waste time" with? I would assume that most Hackintosh enthusiasts do this as a hobby, not for a living or even to save money on hardware.
You need to understand the bias of many HN commenters. They are running businesses, aspire to run businesses or employed by businesses that are monetizing the work of tinkerers and packaging it for a mass market where they can sell higher volumes or mine more personal data. There are a lot of people who will recommend spending massive amounts of time and money learning and renting proprietary services over learning fundamental concepts and owning your own stack. I just ignore them along with the crypto bros before them and the AI pumpers now. Renting proprietary closed services to people who don't know better is their bread and butter.
>Something never quite sat right with me about this argument, and your comment finally made me understand what it is: the understanding you gain from tinkering is priceless, and it's exactly the experience that you use to help everyone around you: it turns you into an expert.
Yeah, but an expert in what? There are only so many hours in a day. If you care about learning about some rando sound driver, or why all your photos come out pink under Linux but not Windows[], that's great. Go knock yourself out.
But if you want to do something that's not rando debugging, then maybe it's not for you. Like, I like Unix. It lets me do my work with the least amount of effort. What I don't like is being a sysadmin. Some folks do, and that's awesome. But that's the reason why I got rid of desktop Linux 20 years ago.
[] Both of these are actual lived experiences. I do not care about you chiming in about either of these.
I don't know about you, but for me it was never about the money. I did this stuff (and still do) because I find it fun, not because I can't afford to buy it. I have my desktop, and I want that to just work, and I have a bunch of computers, hardware, 3D printers, etc etc that I constantly tinker with, because I like it.
I suspect it's the same for you, and it may be the lack of time, but not so much the access to money.
As a teen in the mid-aughts, I played heavily with the OSx86 project/Hackintosh. I learnt about writing kexts and kernel patches, and I fondly remember getting a Linksys USB-to-Ethernet adapter working on an HP workstation running Tiger.
My financial circumstances have improved somewhat in the intervening years. Today, I own quite a bit of Apple hardware; a recent Vision purchase overton-shifted my definition of “disposable” into very unfamiliar territory. Even still, about once a year I make sure I can still “triple-boot” - just now I do it with Proxmox and virtual passthrough. The first iMessage sent from my virtualized “iMac Pro” at 2AM was almost as gratifying as the first Apple boot screen on a Sony Vaio.
Spot on for me, but there's a different argument at play: At the beginning of the OSX on x86 times, Apple had an OS with a stellar user experience, but the hardware was just completely overpriced, so Hackintosh made complete sense.
Fast forward to today and I think Apple has managed to pivot this almost to the complete opposite end. I think the hardware is incredible value (that's debatable for sure, but my aluminum-machined M1 MacBook with Apple Silicon is blazing fast, completely silent, super sturdy and runs forever — I wouldn't trade it for any other laptop I could buy with money), while the operating system has really taken a backseat, with hugely annoying bugs left unfixed for 10 years: https://news.ycombinator.com/item?id=39367460
To me, in a world like that, Hackintosh simply doesn't make much sense anymore. Asahi Linux is really the star on the horizon, doing exactly the opposite: letting a free and better-maintained operating system run on strictly awesome hardware.
Value was the driving factor for some of my early hackintoshing (Core 2 Duo era), but what pushed me to do it from 2015-2020 was the abysmal state of higher powered Mac hardware.
The Mac Pro was still the un-updated 2013 trash can, the 15” MBPs were too thin for the CPUs they housed (perhaps Intel’s fault for getting stuck on 14nm for so long, but still) and were hot with terrible battery life, and while the 27” iMacs weren’t terrible and probably the best of the lineup, they still weren’t cooled quite as well as they should’ve been. My 6700k + 980Ti tower in a Fractal Define case with a big quiet Noctua cooler was just flat better and made a far better Mac than anything Apple sold at the time.
That said, I did eventually grow tired of the tinkery-ness of it all and in 2020 picked up a refurbed base model iMac Pro, one of the few Macs in that timespan that wasn’t a mistake, for about half its MSRP. It was about as powerful as that tower, surprisingly even more quiet, and of course just worked without the tinkering.
This is the classic "money is time, time is money" conundrum. A teenager doesn't have the money to buy a fancy car or computer but they have the time to tweak and experiment to get the most out of it. Meanwhile an adult has the money but not the time, assuming they have a full time job, kids, etc. So they're willing to spend the money to get products that work and would rather spend their limited time with their family instead.
In my teens I had a group of friends who loved to tinker, from hackintoshes to custom ROMs to homelabbing to electronics repair. Now I'm like the only one left who does this stuff :(
The only way out of this is an early retirement in an LCOL area or a job with a very good WLB (which is likely pretty rare for most HNers in the tech industry). Even ignoring overtime, I'm typically tired when I get home from work and have other commitments alongside my hobbies and tinkering.
> Incidentally it's the exact same journey with my cars. 35 years ago I was fixing something on my car most weekends. Now I just want to turn the key and go somewhere.
This resonates with me as well. As a teenager with my first car I spent a lot of time tweaking its appearance, sound, performance, etc., buying what little I could from local auto parts stores. I couldn't wait to get older to have more money so I could do more mods and really make the vehicle how I wanted it.
In the back of my head I wondered why older folks didn't do this though. They have these nice vehicles but they're bone stock! Why not new wheels, tint, a tasteful lower, etc.?
Then I myself got older and found it just isn't as important as it used to be. I still have a slightly modified car, but I'm not rooting around inside the dash with a soldering iron like I once did, haha.
Haha that is a very similar mindset I had when I bought my first house. I was excited about all of the nice improvements I could make and wondered why so many people I knew who were well off never really put much work into their home.
Then I quickly realized that it's such a big hassle, and also that you almost instantly get used to things how they are.
I don't understand. Do you imagine there isn't a young generation of time rich cash poor tinkerers now? Why would the idea of a hackintosh suddenly become obsolete because you can afford one now and don't have time? Nothing about your statement logically follows.
He’s just parroting a usual HN-ism of ignoring the topic and talking about themselves. I’ve seen the “I used to tinker but now I don’t” line a hundred times as well as the “this doesn’t apply to me so I don’t care - let me tell you how”.
Isn't that the truth. For a site with the word "hacker" in it there seem to be so few of them. I can't imagine letting all that curiosity die out of me like the parent comment implies.
I don't have the amount of time I used to for that stuff either, but the curiosity has never died, and if I had more time I'd still do it.
If I ever lost that drive I think I'd rather be dead.
The funny thing about growing older is that we change, and the things that were once "I'd rather be dead than not do this" just naturally fade away, and other new exciting things take their place.
I say this not to dampen your enthusiasm, but rather to encourage you to enjoy it to the maximum while it lasts.
Everything has a season and in that season it can seem terribly important. Perhaps an activity, or a favorite sports team, or a group of friends.
Some of that remains forever, some of it gets deferred as other things happen. It's part of life, we grow, we change, the world around us changes.
It's not that the drive is lost, it's just that it manifests in different ways, different activities, different challenges.
When you see a post like yours in 30 years time, remember this moment, and raise a glass :)
I’m going to gently pile on to the sibling comment here, and note that the “hacking” we find interesting should and does change over time. I used to spend time hacking PDP-11 assembly code to make games. That got old, and if I play a game now it’s purchased. The stuff I hack on now is more like applied math.
This is all good and natural: if it's organic and not growing, it's probably not alive.
Ever since blogs have had comments sections, the set of people who are too lazy to make their own blogs, have been holding forth (writing, essentially, their own blog posts) in other people's blogs' comment sections.
Heck, I'm sure people were doing it on Usenet and all-subscribers-can-post mailing lists, too — using the "Reply" button on a message to mean "I want to create a new top-level discussion that quotes/references this existing discussion" rather than "I want to post something that the people already participating in this existing discussion will understand as contributing to that discussion."
In all these cases, the person doing this thinks that a comment/reply is better than a new top-level post, because
the statement they're making requires context, and that context is only provided by reading the posts the statement is replying to / commenting on.
Of course, this being the internet, there is a thing called a hyperlink that could be used to add context just as well... but what there is not, is any kind of established etiquette that encourages people to do that. (Remember at some point in elementary school, learning the etiquette around writing a letter? Why don't schools teach the equivalent for writing a blog post/comment? It'd be far more relevant these days...)
Also, for some reason, social networks all have "reply" / "quote" actions (intended for engaging with the post/comment, and so showing up as "reactions" to the post/comment, or with your reply nested under the post/comment, etc); but no social network AFAIK has a "go off on a tangent" action (which would give you a message composer for a new top-level post, pre-filled with a cited quote of the post you were just looking at, but without your post being linked to that post on the response-tree level.) Instead, you always have to manually dig out the URL of the thing you want to cite, and manually cite it in your new post. I wonder why...
"...but no social network AFAIK has a "go off on a tangent" action (which would give you a message composer for a new top-level post, pre-filled with a cited quote of the post you were just looking at, but without your post being linked to that post on the response-tree level.) ... "
On Usenet, if you were changing the general subject of a thread, you'd reply but prepend a new title/summary to the previous title, joining them with "Was:", to signal that you had changed the subject, e.g. "Hackintosh is Almost Dead" => "My Changing Hobby Habits (Was: Hackintosh is Almost Dead)".
On the contrary, I was relating the article to my own experience. The thrust of the article was explaining the end of an age.
I was merely saying that we shouldn't see this as bad, it is the natural way of things. Everything that has a beginning has an end. Raise a glass to remember hackintosh, but don't mourn it.
People are asking how the fact that you make more money now is evidence of that. That's your natural ending, but it's not evidence of a natural ending.
HN community selects for these kinds of posts, in the same way that subreddits like /r/amitheasshole love overwrought girlfriend-is-evil stories.
Most often the highest rated posts on HN are from 40+ year olds who don't discuss the post at hand, they'll post a hyper-specific nostalgic story from their youth on something that is tangentially related to the post.
In fact, the older the better. If your childhood anecdote is from the 70s or 80s you're a god.
There are other things that are more interesting to build and make now than a hackintosh (with the added difficulty that making an Apple Silicon-compatible device may not be feasible).
Combine this with the fact that a Mac mini, which might be the price target for a hackintosh build, is $600 USD ... and has the advantage that it isn't hacked together, and so has better support.
The part of me that wanted to tinker with a hackintosh in my younger days is more satisfied by Raspberry Pi and Arduino projects. I've even got an Onion IO over there that could use some love.
It's not that people don't want to tinker; rather, the utility one gets from hacking together a Mac (again, note the silicon transition) is less than one gets from hacking on single-board computers.
As I said in my post, the next generation will find something new to tinker on.
The idea of a hackintosh is obsolete because there are new worlds to conquer, the time of hackintoshes has come and gone. The new generation will find their own challenges, not re-hash challenges of the past.
I guess the commercial success of the platform has increased the supply on the second-hand market.
Also, the macOS desktop has pretty much stagnated and is behind the competition. What is strong is the seamless integration of the whole Apple ecosystem, so it makes sense to run macOS if you already own iOS devices. I doubt people using iPhones and iPads are struggling to finance the purchase of a Mac.
The primary demographic of people interested in Hackintoshing are people who, like the GP in their youth, couldn't afford to just buy "hardware with full compatibility", let alone buy the equivalent-specced Mac.
The secondary demographic of people interested in Hackintoshing are people who have an existing PC (or enough extra parts to build a second PC) and want to figure out how to "make something that can run macOS" out of it, while spending as little money replacing/upgrading parts as possible.
People who buy parts, to build machines from scratch, just to run macOS on them, are a very tiny fraction of the Hackintosh community. (Which is why you so rarely hear stories of Hackintosh builds working the first time with no added tinkering — they can, if you do this, but ~nobody does this.)
I have a need to run macOS binaries and Xcode from time to time, and it used to be non-trivial to run macOS in an unsanctioned VM, so I kept a Mac laptop around.
But these days you can spin up a QEMU macOS VM without too much effort, and that's my virtual hackintosh.
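For anyone curious what "without too much effort" looks like: below is a minimal sketch of the kind of QEMU invocation involved, wrapped in Python. It assumes a Linux host with KVM, an OpenCore boot image and a macOS disk image you've already prepared (projects like OSX-KVM document that part), and OVMF firmware on hand; every file name here is a placeholder, and you have to supply the AppleSMC OSK string yourself.

    #!/usr/bin/env python3
    # Sketch of launching a macOS guest under QEMU/KVM. Not a turnkey recipe:
    # the image paths, firmware paths, and OSK value below are placeholders.
    import subprocess

    OSK = "PUT-THE-APPLESMC-OSK-STRING-HERE"  # required by macOS guests

    cmd = [
        "qemu-system-x86_64",
        "-enable-kvm", "-machine", "q35",
        "-m", "8G", "-smp", "4",
        # macOS is picky about CPU models; this Penryn combination is the
        # one commonly used by OSX-KVM-style setups.
        "-cpu", "Penryn,kvm=on,vendor=GenuineIntel,+invtsc,vmware-cpuid-freq=on",
        "-device", "isa-applesmc,osk=" + OSK,
        # UEFI firmware (copied from your OVMF install)
        "-drive", "if=pflash,format=raw,readonly=on,file=OVMF_CODE.fd",
        "-drive", "if=pflash,format=raw,file=OVMF_VARS.fd",
        # OpenCore bootloader and the macOS system disk on an AHCI controller
        "-device", "ich9-ahci,id=sata",
        "-drive", "id=boot,if=none,format=qcow2,file=OpenCore.qcow2",
        "-device", "ide-hd,bus=sata.2,drive=boot",
        "-drive", "id=sys,if=none,format=qcow2,file=macos.qcow2",
        "-device", "ide-hd,bus=sata.3,drive=sys",
        # Display, input, and networking, using device models macOS can drive
        "-device", "vmware-svga",
        "-usb", "-device", "usb-kbd", "-device", "usb-tablet",
        "-netdev", "user,id=net0", "-device", "vmxnet3,netdev=net0",
    ]
    subprocess.run(cmd, check=True)

The real work is in preparing OpenCore.qcow2 and installing macOS into macos.qcow2; once that's done, the VM boots like any other, which is part of why it feels lower-maintenance than a bare-metal hackintosh.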
I started my developer career on Hackintoshes many years ago.
No matter how much time I invested into building my desktop, it never "just worked". There were always inevitable problems with software updates, which often meant you had to re-image the system from scratch to install a new OS version. And that happened quite often, since you needed new OS versions to run the latest Xcode.
Then there were a lot of minor annoyances over the years, like crashes and graphical glitches with certain apps, like Photos or Preview, problems with monitor resolutions and refresh rates, and many, many others.
Ultimately, they were a useful tool for a time, but they suffered from death by a thousand cuts in terms of practical usability. So, I bought a basic Mac Mini as soon as I was able to, and never looked back.
The first hackintosh I built back around 2008 I was able to get working actually perfectly. Somehow the hardware and software bits all aligned and everything worked great. It’d run for months on end without issue.
Nothing since that one was quite as good. Had a Dell laptop for a while that was almost perfect, but it would lock up and require a reboot once every couple of weeks. A tower I built in 2016 was also almost perfect, except I never could get USB working 100% right, and later on the Nvidia drivers got flaky.
I built a hackintosh in 2016: bought the right mobo with the right driver sets, used the buyer's guides on tonymacx86.com, purchased the exact hardware, downloaded the drivers, flashed things, etc. It was far from "just working". I had a stable and solid system for about 18 months (after a weekend of tweaking), and then it needed to be reconfigured, and I didn't have the time to spend another weekend getting it to work again... so that machine went back to Windows. Even with the properly supported Nvidia card, I had issues, and went through some pains with the wifi.
I wouldn't count on 10 years of real-world life from a non-upgradeable and 'repair-resistant' device with a glued in battery, even if the hardware specs are good enough to last that long.
> Monthly pricing is available when you select Apple Card Monthly Installments (ACMI) as payment type at checkout at Apple, and is subject to credit approval and credit limit. Financing terms vary by product. Taxes and shipping are not included in ACMI and are subject to your card’s variable APR. See the Apple Card Customer Agreement for more information. ACMI is not available for purchases made online at special storefronts. The last month’s payment for each product will be the product’s purchase price, less all other payments at the monthly payment amount. ACMI financing is subject to change at any time for any reason, including but not limited to, installment term lengths and eligible products. See https://support.apple.com/kb/HT211204 for information about upcoming changes to ACMI financing.
----
You do not have to drop $2000 on a new laptop up front.
And that's how you know someone has not used the 8GB MBA model. 8GB is more than enough for the light usage you'd buy an 8GB model for in the first place. Which means not running 3 IDEs and 5 VMs at the same time.
That's what most laptops used to cost back in the 1990s or so (after adjusting for inflation). If you look further back in time, hardware was even more expensive - and it couldn't even do 10% of what a modern MacBook does. Modern hardware is ridiculously cheap.
In the 90s most people didn't have a laptop for that very reason. They just owned desktops which were way cheaper.
I studied computer science then and I knew 1 student out of 50 that had an actual laptop. Even at the uni we had to use their computer rooms full of desktops and X terminals.
I had a Game Boy; my parents had a PC. Friends all had a game console: NES, SNES, Sega, Amiga, C=64, etc. PCs went booming in the 90s here though, because you could buy a PC tax-deductibly via a law called 'pc-privé'. This was to stimulate citizens to learn to use a computer in their private time. Still, even with the tax deduction, a PC was very expensive. Not like a car, but expensive still.
Not really sure they were much cheaper in the 90s. My first PC was a Dell P90 in 1994, IIRC it cost about $2500. There was kind of a mantra at the time that no matter the improvements, you'd always spend about $2500. And adjusted for inflation, that "way cheaper" desktop was over $5K in today's dollars.
An inflation calculator says the Core 2 Duo MacBook Pro I bought in 2007 cost something like $4400 in today's dollars. I also bought a small Intel SSD for the SATA port in it, for probably another $500 in today's money.
Entry-level price for a new Mac, right now, is $700 (M1 Macbook Air at Walmart). It doesn't get you the best or the fastest, but it's a perfectly usable laptop. Or, if you're okay with something lightly used, a refurbished M1 Mac Mini is ~$500.
The M1 MBAs being sold by Best Buy and Walmart are perfectly fine for 99% of the computing world. Maybe not for gamers (most laptops suck for this), or for someone needing to crunch large datasets, but when this machine first came out, tons of developers were perfectly happy using it, even with the small storage. Hell, the iMac I used up until buying a Mac Studio only had 256GB.
Yep. 8 GB RAM isn't great, but for basic use -- web browsing, word processing, some light photo/video editing, etc -- it'll be perfectly adequate. Not everyone needs a supercomputer.
Depends on how many browser tabs they dare to keep open, and how many pictures and videos they want to keep around on their computer in high resolution.
As far as the processor goes, the M1 is in active production (e.g. for the iPad Air), and is still a very capable CPU. It may not be the fastest laptop CPU on the market anymore, but it's hardly slow.
I don't understand how this relates to paid warranty or insurance. Your TCO for a non-Apple laptop can also include a protection fee or cost to repair with no insurance.
They have lots of monthly payment plan options here in Canada, and probably in the US too. It even used to be zero interest. Not sure about the rest of the world.
There are also many no-interest options in India, but the prices are higher here: somewhat so for Macs, but significantly higher for the iPhone, as it is such a social-status thing here in the north.
> It's really not much of a time commitment. You can just lookup hardware with full compatibility and build a desktop that "just works".
Oh JFC. This canard has been floating around about Linux for 30 years, and it's always been a half-truth at best.
Inevitably, it always comes down to "Cards with 2361YSE rev 5 chipset" or some other nonsense. Like that makes total sense for a kernel developer, but most people don't know what chipset they have in some peripheral.
So now you're left with assholes saying, "WeLL yOu ShOuLd gEt InFoRmEd. JuSt GoOgLe iT!", and it ain't that easy. If you can even find a brand name to chipset list, it's going to be out of date, or it's going to be something that says "2361Y" or "2361YSE rev 3" or something. Is that close enough to "2361YSE rev 5"? Who knows!
Then the best part? Even when you lookup the hardware with "full compatibility", you'll find that it actually isn't. Then when you ask about it, you'll get, "I just don't use that feature, and you shouldn't use it either."
You guys who got old forget that there are still younger people in this *world* (not just the US). It's analogous to saying: "I worked 9-to-5 in the 2000s (when wages were acceptable). But now that I have way less energy to work, and have made millions off my retirement fund, I don't see why this generation shouldn't work just as hard today."
Tinkering shouldn't be nostalgia, it should be a right. I'm sure you used to fix the rusty old generational family car with your dad on the weekends. He probably used to do the same with your grandfather on the weekends. I don't think there'll be a car to fix for the next generation.
Just like ramen or office chairs can measure a recession, fixing cars with a father figure could be used as an indicator for the prevalence of greed in society.
I was also that kid. I remember an OSX upgrade breaking my mouse and I couldn't figure out how to get it working again. I was desperate for a Mac, but it was financially unattainable.
I feel the same way with phones. I pre-ordered a Nexus One the day it opened, installed a dozen custom ROMs, etc etc. Upgraded to a Nexus 4, 5. These days I use an iPhone. Don’t miss it, though I’m nostalgic for the excess free time!
Same for me. I spent countless hours recompiling my kernel in slackware, configuring enlightenment window manager. These days I don't even change the desktop wallpaper.
Yeah I spent literally dozens of hours of my life compiling different kernels with OSS and ALSA variations to get my sound card working lol. Really a 'you had to be there' thing.
I have one running in a virtual machine (on hardware that would natively support a Hackintosh) which I use only for testing Mac distributions. It's too old to use now, but when I built it you could buy Mac OS at Best Buy.
While tinkering for the sake of tinkering is as good a hobby as any other, the process of tailoring your OSes does not have to be infinite. Maybe it is different for others, but while I did spend a lot of time writing my Linux dotfiles until they were nearly perfect, for the last 5 years or so, whenever I have a fresh OS install, it's really just 'git clone; chezmoi apply' and I get a system where every keybind is exactly where it needs to be.
When work banned Linux machines and I had to transition to OS X, I had to do just as much, if not more, tinkering to make it work for me. Perhaps it does 'just work' for those who think exactly like Steve Jobs - but if you want it your way and on your terms, there's a lot of tinkering to do, from yabai configs to Karabiner JSON configs and custom plists, to replacing most of the gelded Apple apps.
I had a similar revelation a few years ago. My giant PC gaming rig blew up again (specifically, my 3080 shit itself), a year after I had to replace the power supply and rewire the whole thing.
I was just done with faffing around with that kind of thing.
So I bought a (then fairly recently released) Mac Studio, just the plain-jane base 32GB model, and couldn't be happier. So nice to have something virtually silent and energy-efficient, instead of a jet turbine that drew about 300W at idle.
I 100% do not want a laptop for my primary personal machine, but the big workstation towers are too much.
The Studio is that wonderful Goldilocks zone - performant, bring-your-own input devices, but merely “a bit pricey” and not extravagantly so.
Agree. I used tweaked BlackBerry ROMs for a couple years before getting my first Android device, an HTC One M7 with Android 4.4 KitKat. Spent loads of time getting all the tools working to modify ROMs, bootloaders, recovery/TWRP, and squeeze every drop of performance out of that phone. Then went "backwards" to an iPhone 4S and have been rocking stock iPhones ever since.
The Steam Deck sort of occupies this space today. I'm not in the scene myself but I've read about users modding them, running unsupported OSs, liquid cooling them, etc. Seems like any sufficiently broad technology will garner a community of hackers and modders around it.
I'm not into modding, but I got the SD because of its openness and all sorts of things I can make with it (also to support gaming on Linux, and kudos to Valve for the work on pushing it ;) )
I played with Jolla/Sailfish for a while and the device was awesome, but I couldn't get myself to like gesture/swipe navigation (I hate it on the current crop of iOS/Android with the same passion)...
As for the device, I'm pondering a new OP, which is more open than the rest, but still, as you said - it's mostly the same OS, and the changes are not significant enough to spend all that time on flashing...
Macs are very expensive in some parts of the world, where other computer brands are affordable. A hackintosh could be a good option, and when somebody learns to do it well they could do it for others for money. Not only installing MacOS on PCs, but also installing newer versions of MacOS on Macs that are officially not supported anymore.
> A hackintosh could be a good option, and when somebody learns to do it well they could do it for others for money
Apple so thoroughly screwed over Mac developers that the only compelling software exclusive to macOS is developed by Apple themselves[1], IMO. Even those packages have equivalent (or better) alternatives on Windows. Macs used to be the platform for DTP, audio, and video production - now the 3rd-party developers have pivoted away to other operating systems. One of the reasons professionals resorted to Hackintoshes in the past was that Apple had periods of neglecting the Mac Pro hardware on and off. Why would anyone go through the pain of setting up a Hackintosh in 2024, outside of being a fan of macOS aesthetics?
1. Logic Pro likely has the biggest pull; Final Cut isn't the halo app it once was.
I use many third-party apps on MacOS that are top of the line in their niche, regardless of OS. People have many different uses for computers and workflows that you are unfamiliar with.
When you discover how programs on MacOS can connect and interact with each other and with the OS as a whole, it becomes a completely different experience.
Nice strawman! You completely demolished a market share argument I never made. My point is that audio and video professionals now have viable alternatives to Apple software. Running MacOS is now optional, which wasn't the case in the past, so there's less of an impetus for running MacOS on an non-Apple hardware.
As for making stuff up - I don't know if you remember the years of neglecting Mac Pros, or the clusterfuck that was Final Cut Pro X. I do. I remember a lot of dyed-in-the-wool Apple users switching to Adobe on Windows. How many 3rd party DTP, audio or video production packages are still exclusively available on Apple?
Only in the countries where the Apple brand dominates; the remaining 80% of the desktop market share makes do with Windows and, in some cases, custom Linux distros supported by the hardware vendors themselves.
VFX Reference platform includes Windows and Linux for a reason.
I couldn't agree more. I started my life buying/using Macs but nowadays I wonder why people make the financial effort considering the very large premium even though Macs don't do anything much better anymore.
You are right that the main driver is probably Logic. Final Cut is being largely replaced by Blackmagic and Adobe software (because they work everywhere and integrate better with the other things people care about). Avid software works just as well on a PC.
As for the desktop publishing stuff, this is a use case so trivial (and in some ways displaced by web tools) and so dominated by Adobe that it feels like Adobe is really doing Apple a favor in keeping their software updated and optimized on time.
In my opinion/experience, they keep selling them because it is very hard for people to change. Most keep using them because this is what they are used to, and for similar reasons.
This is where Apple is very shortsighted and acting pretty stupid. You can see the young largely ignoring Apple computers because they are way too expensive. I believe Apple has permanently eroded its dominance even in the media industries: even though those companies have the money, it won't take long for them to notice they can keep doing the same quality work on hardware half as expensive, because it makes no difference to their young workers.
Apple's premium pricing was a historical artifact of always being on the bleeding edge and one step ahead of the competition in many things. Now, aside from the silicon (which has some advantage for laptops in the form of battery life, but not really any for desktops), it doesn't feel like they are ahead in anything.
In fact, when you take a hard look, you realize they are selling updates of stuff that was designed 10 to 15 years ago, and not much has changed while the PC industry as a whole has evolved quite a lot.
What is strong, though, is the delusion of people defending those extortionate prices for whatever ridiculous reason they can think of at the moment.
Don't get me wrong, I think Macs are ok for the most part, just not at their current price, especially in Europe (France).
Why is it a hacker thing to denounce everybody with a different opinion or preference as "delusional" or brain washed?
For many people it is nothing to pay a couple of hundred dollars more for a device that, in their experience, is easier to use or that they prefer looks-wise, even if the geek specs are worse. Especially for a device they'll be using for many years.
> In fact, when you take a hard look, you realize they are selling update of stuff that were designed 10 to 15 years ago and not much has changed when the PC industry as a whole has evolved quite a lot.
Then why has the PC industry not been able to catch up with Apple in computer design in all this time? A 2014 MacBook still looks better than any 2024 PC laptop. It's beyond weird to me by now. Apple made a great leap in design with the unibody MacBooks, but more than a decade later nobody else has followed, even though customers would love it.
Imagine if there was only one car manufacturer that could design a good looking car...
Where did you find me saying it was a hacker thing?
And it's not simply about having a different opinion, those are perfectly fine.
It's about saying things that range from deceptive statements to outright lies, designed to quiet the cognitive dissonance created by making a somewhat less-than-optimal (sometimes irrational) choice. It's all about projections and insecurities.
A couple hundred more was the case in the 2000s and 2010s, but nowadays the Apple hardware difference very quickly runs to thousands of euros for a device capable of the same workloads.
The RAM controversy started precisely because of that fact; on paper, entry-level MacBooks bench very well, but as soon as you level up your workload the truth appears, and suddenly a much cheaper device, less powerful on paper, gets more done. There are plenty of youtubers who have demonstrated that, even self-proclaimed Apple fanboys.
This is exactly what I mean by delusional: you can be happy or content with your purchase, but trying to downplay the stupid compromises one has to make at a given price point is complete foolery, no matter what perceived advantages a MacBook may bring.
Trying to rationalize/minimise those facts to others in order to justify Apple's extortionate pricing is very close to religious beliefs which is delusional by definition.
My first job was working for an Apple service provider (very high end), so I know firsthand the various reasons that may be valid for buying a Mac; but with Apple's current business practices it has become less and less advisable, and pretending otherwise doesn't get you bonus points from Apple...
Then you say that somehow a 2014 MacBook looks better than any PC laptop. Now I wonder if you live under a rock or just haven't checked any modern laptop worth a damn recently (to be clear, I am generally talking about laptops in the 1K-euro range at least).
Have you seen a modern Huawei laptop? It looks pretty much like a MacBook until you start looking at it closely, and in my country it starts at half the price of a MacBook Air. In fact, as a Mac user, the first time someone handed one to me I couldn't believe the price they paid for it and the assembly quality for said price.
But even then, looks are a rather subjective thing: you can say it looks better to you, but that is not an exact truth. I find their latest design bulky and inelegant; well made, sure, but not necessarily nice or interesting looking.
In fact, Surface devices look very nice as well (even though they are overpriced, they still manage to be better value with less volume); Asus has some pretty good-looking laptops, HP as well, and even Dell manages pretty well with their XPS line.
Trying to justify a price difference of multiple hundreds of euros (at minimum) with a subjectively better look is not exactly rational behavior.
It would be like someone trying to sell me on his Louis Vuitton handbag purchase that is "worth the price" because it looks good; even though the common LV pattern is a toilet room tile pattern that should have stayed there. You could have paid a master craftsman to make an even better bag to your specifications that would look just as good (I live close to one of their factories so I know that for a fact).
But few people are delusional enough to say shit like that, because it was not a rational purchase; this is all about emotions and the emotions it stirs in other people.
The problem I have with fanboys (Apple's or any other for that matter) is that they try to pretend otherwise and that's just delusional. Apple is a company that mastered marketing and knows very well how to create desire that will inflate the value of their products that should be tools first and foremost.
Some people like this fact and I find it a bit sad (considering the supposed use case of the device), others begrudgingly pay the price because of various reasons (that very often boils down to limited self-imposed choices).
As for the unibody, I don't think you understand why exactly Apple is doing that. It has a lot more to do with saving on device assembly (making stuff more complicated to repair in the process) and ROI on the billions they invested in custom CNCs than with any real benefit for the user. It doesn't make the laptops last longer (not that it matters, considering how fast Apple is obsoleting their stuff nowadays), and it can render a device useless from just a dent in the wrong place, because you can't swap chassis. On top of that, modern premium laptops have moved on to magnesium alloys that noticeably save weight without compromising strength, and Apple hardware is often heavier for a given form factor.
I could go on for days, so I will leave it at that. Just to be clear, I am not saying all Apple stuff is bad or whatever, just that it is tiring to see people defending their stuff tooth and nail when, considering the price, their customers should be asking for much more; and the behavior that consists in defending all the very real compromises of their gear is, yes, delusional.
With a little preparation setting up a Hackintosh is not much more difficult than setting up an actual Mac. What you're describing is a myth or just out of date.
It really depends if you're tinkering because you have no choice versus tinkering because you like it. People will absolutely tinker on cars for their entire life, but upgrade to more interesting jobs like doing engine swaps rather than just changing their oil.
But it sounds like you were tinkering because you had no other choice, not because you enjoyed it.
To me it was about having just one powerful, upgradable desktop computer with Windows and macOS, so I didn't have to have multiple devices on my desk.
Now I have solved it with a PC desktop, a MacBook Air, and an Apple Display. The PC also has USB-C display output, so I can just switch which cable connects to the display.
The downside is still that the M1 is not as fast as the PC I have, especially at anything GPU-intensive.
> this kind of tinkering was fun and a good way to improve the machine I was using.
You know you’re alive when you log into iCloud from a hackintosh and then Apple notices, you get the ‘unauthorised hardware’ message (I can’t remember the exact phrasing) and your various iCloud services begin to cease functioning. It’s not often your OS is exciting.
I had the same experience with windows 98/2k and my franken pc of randomly upgraded parts. I used to have to reinstall win98 every other month or so because it was so unstable. I had the installer on a separate partition, so I could just wipe the system disk and have a clean install up in 12 minutes.
I find that these days the Steam Deck has become a great device for tinkering. I've seen people do some nice unexpected stuff with it, for example making an opening in the back to connect a dedicated GPU, or using it to pilot drones in Ukraine.
From around 2008 to 2012 I ran a hackintosh on a desktop, and it was great and fun. In 2012 I bought my first MacBook. The good experience on the hackintosh made me get the MacBook, so I like to think hackintosh helped Apple.
> 35 years ago I was fixing something on my car most weekends. Now I just want to turn the key and go somewhere.
See, that 35 years for me didn't make me stop working on my cars, it just allowed me to have enough money to have a reliable car as well as "toy" cars that I can still tinker with. I drive the Audi, but I still wrench on the Triumph. I used to tinker with Hackintosh stuff as well, and I haven't stopped tinkering, just moved on to other things, like this Rubidium frequency source I just bought to build a high accuracy NTP server from a Raspberry Pi. (Yes, of course, there are already cheap and easy solutions for this, but I want to tinker).
I wouldn’t go as far as “sad”. Free time is always a finite resource you have to prioritise. I used to tinker, these days I’d much rather spend time with my kids. I’m definitely not sad about it. I’ll have plenty of time for tinkering in the future.
Of course. And people enjoy spending their free time on various things, not necessarily due to some restriction. For those people, time spent on those things isn't wasted. For example, you can have fun fixing cars even if you have the money to have a mechanic do it.
There was a time when hackintosh was practical for everyone who needs macOS and/or can't stand other desktop OSes. It was the tail end of Apple's Intel hardware. It was pathetic in terms of performance (underpowered CPUs and buggy GPU drivers), quality (butterfly keyboards) and thermal design (things would overheat all the time), yet expensive.
I myself was contemplating building a ridiculously overpowered hackintosh machine around 2019. Then the ARM transition was announced. And then the M1 came out with overwhelmingly good reviews from literally everyone. So I decided to wait for the beefed up "professional" version, which did come later, so here I am, typing this on an M1 Max MBP, the best computer I've owned so far.
Also, for me personally, hackintosh was an introduction to macOS. I was a poor student at the time and couldn't afford a real Mac. Of course I bought one about as soon as I could.
As a long time lover of hackintoshes (couldn’t afford a real Mac as a youth but tried to make the netbook macOS dream come true), I’m quite sad to read this. The author has a very valid point that drivers are going to become increasingly complicated and difficult.
I appreciate the call out that Apple (the engineering) isn’t explicitly trying to kill hackintoshes.
As an Apple engineer who deals with ACPI bugs, hackintoshes are a unique source of frustration. I'll spend hours digging through crash logs only for things to not add up. It says it is an i7 MacBook Pro, but it has way too many cores. It has way more memory than it should. The kext versions are a weird mishmash that shouldn't be possible. The firmware is a version that we never released. Etc., etc.
I do my best to fix these sorts of issues, but hackintoshes make it hard to reproduce the crash conditions. Which makes it hard to be confident about a root cause, and hard to verify that I've fixed it.
Now I’ve spent hours chasing something and I can’t help.
I had a Hackintosh and felt that any crash was 99% my fault and probably an edge case for macOS. But in my defense, CrashReporter is way too permissive and will send a report even when the user doesn't want it sent. After an app crash or a hard crash I'll get the window saying a bug report was sent, and I know damn well some engineer is going to look at it and it won't make any sense that a MacBook has this particular GPU.
I've been using a Hackintosh as my daily driver for nearly 15 years and they have always been rock solid, with months of uptime consistently. It's just a matter of starting with the right hardware.
People are free to try to support the hardware they have, but I've always thought it's stupid not to buy well-supported hardware in the first place, of which there is plenty.
Nothing, I'll just stick with the last usable version, just like I'm stubbornly sticking to 10.14.6. I'll fight moving up each and every version by narrowing the software I use. It's already happening, and it's fine.
Most days I hope that we (as a planet) get past all those obfuscated vendor lock-ins, but over the last decade we've watched them get worse and tighter as they get backed up by lobbied laws. And now I'm worried that you're 100% right.
But we can still be saved by big ideas. We (modern humans) have done it before. I just hope the seeds of it are already planted.
It’s not as if the OP can’t install a different OS that does support x64 processors. The OP is describing a Hackintosh, so there is no vendor lock in. For that matter, installing alternatives on x64 based Macs is also reasonably trivial and as Apple silicon matures, so does the reverse-engineering knowledge (much like it did with the ‘commodity’ PC components used for Linux today), and the vendor lock in issue disappears.
I’m in a similar position, and I’m learning the KVM/QEMU way. So I hope I’ll manage to stay on Linux and add virtualised macOS and Windows apps to my stack. If it works as I expect, there’s no real need to even update those virtual systems, since I need them only for some apps that aren’t on Linux (yet?). In my case those are Adobe After Effects on macOS or Windows, and Pixelmator Pro on macOS. Other apps have decent counterparts on Linux for me.
I'll add this to reasons why I'm opposed to always-on telemetry. A hackintosh should know that its crash reports will be unhelpful to Apple and not bother sending them. It's a waste of your time to deal with data coming from unsupported configurations.
I installed Snow Leopard on a 2009 MB for kicks and sent in a crash report when Safari died due to something on the modern web. I would love to know whether these still arrive at the fruit company.
I gotta say, 10 or 15 years ago, Mac OS X would have been worth the effort and it was basically what justified buying Apple hardware for me.
Nowadays, if anything, it's the hardware that justifies buying Apple, and the operating systems are something I can live with. I don't see any compelling reason to use macOS on non-Apple hardware today (except hacking for hacking's sake).
Just in case anyone is on the fence: Apple’s developer network effects are real and impressive. People updated… nearly everything… so quickly! Plus, Rosetta’s under-the-hood translation of x86 binaries seamlessly expands what you can run.
I thought I would miss Boot Camp, but I don’t. I have yet to install ARM Windows, but I’m hearing praise about it now from relatively casual users.
I’m fairly unaware of the current state of Linux on Apple M hardware, but I’d want a Linux partition on Apple hardware more than a Mac partition on x86. These days I have two devices though.
Writing to you from NixOS on an M2 Air. Aside from a handful of missing packages that aren't available for the architecture, it's shockingly good. My battery is reporting 19 hours remaining, and "setup" took about 20 minutes (not counting the brief time writing a new machine definition in my NixOS config). I don't have any fundamental issues with macOS, but it's nice to have a consistent environment across machines, and this hardware is glorious.
I used one of their releases rather than building my own image. It’s a guide that merits careful reading, as some key steps are not specifically bulleted. Oh, and it’s not the NixOS graphical installer.
But it was dead simple, and 99% of the heavy lifting is from the Asahi team. The biggest downside is that updating the support files is a manual process, but NixOS of course makes it a breeze to rebuild into a new environment—and back out if it doesn’t work.
Macs do keep their resale value better, though. Personally I run Linux on my own machines, but I do enjoy the build quality of the MBP I get from work. It's a whole other universe from my pretty good laptop that creaks, is made of plastic, and has a screen that bends when I move it... and the screen quality is not even comparable.
Yep, Linux will make mid hardware punch well above its weight class so for the right person the value/$ ratio is unbeatable. But if the highest value at any price within reason is your goal then Apple is the go-to.
Which is typically why your employer who could not give a shit about a $4k expense every five years for an employee costing them a quarter mil annually in total comp opts for them.
Untrue. At least in the US you can either sign up for an annual plan that continues until you cancel it or you can extend the 3 year plan in annual increments.
It’s about the only viable option for professionally working with audio, in either a studio or a live setting. That's the biggest group of hackintosh users I'm personally familiar with, anyway.
You can work professionally with audio in Windows, you’ll probably even get better performance out of the same hardware you’d be using for a Hackintosh.
You can, theoretically. In practice, a lot of tools like Logic and specific VSTs are macOS-only, and CoreAudio actually "just works" out of the box, without having to manually install and set up all kinds of alternative low-latency drivers.
That is, if Apple isn't messing up their USB subsystem like they did with Ventura, where people basically had to wait a whole major release, until Sonoma, to get stable performance again.
Or if they don’t break iLok copy protection, like they just did with 14.4
And for professional hardware, you don’t have to install “all kinds of alternative drivers”. You just install the one official driver from the hardware vendor and that’s it.
Having used all three major OSes for music at different times, macOS is still my go-to. There just isn’t the futzing I had to do on Windows. Linux was actually a contender for me back in the day - I feel like MIDI routing on Linux is just easier[1], although older OS X routing was also very good imo. I kept my G4 tower going for a looooong time because I had just the right setup of tools and hacks to do everything I needed, where Windows wouldn’t cut it and no one wrote the drivers I used for Linux.
These days I don’t tinker like I used to and mostly just need simple MIDI routing and something that can copy renders from my mixer, so anything will work. I still prefer the ease of MacOS though I can do everything I need to on any OS
[1] the main drawback for Linux was really a lack of good docs for things like PD, and not having access to Live. If I’d had Reaper back then it might have been my daily driver.
Live works fine through Wine nowadays. If I'm not mistaken, the Ableton installer works fine too - you just download the Windows copy and it installs like it's native. Very odd stuff, but it works fine from what I remember (I use Bitwig now).
CoreAudio was definitely the better choice when Linux only had JACK, ALSA and PulseAudio, but now that PipeWire exists it's very close to a CoreAudio-esque experience. You can record out of other apps, route processed audio into a voice chat, or manage the world's largest DAC without issue.
I can understand having personal preferences of course. I have been using macOS for audio production for more than 20 years.
I personally just feel that the "futzing" you mention is worse on macOS nowadays than it is on Windows.
And while Windows certainly has its share of issues as well, it's usually transparent and open enough so I can actually fix them. Whereas on macOS, the only course of action is often to just hope and pray Apple will fix what they broke in a future update.
Same here. Of course, now that we have the M1 and the likely eventual deprecation of x86 for new versions of macOS, things like Hackintosh are a bit doomed. If you have an M1, you'd be running macOS (or Linux) and would have no need for it. If you don't, macOS is not going to run great on your machine, and very soon it won't run at all. So, it's one of those things that you wouldn't use unless you really needed to. And finally, support for running macOS in virtual machines is kind of getting there. It's still made hard by Apple and only supported on their own hardware, but it's not impossible, and there are some legitimate use cases (e.g. building iOS apps in the cloud). Maybe eventually somebody will figure out emulation on non-Apple hardware. I think there are some efforts in QEMU. Of course, the issue will be emulating enough of Apple Silicon properly.
Apple profitability by lock-in is all that matters in the Apple C-Suite now. Juice that stock price, get free stock options, buy new yacht to show off to your friends.
macOS supports running as a paravirtualised guest OS (officially, on an apple-hardware host also running macOS). If there is to be a "next gen" of hackintoshing, I think it'd be based on a cut-down linux host acting as a shim between the real hardware and the paravirt interface.
See also: https://lore.kernel.org/all/20230830161425.91946-1-graf@amaz... "This patch set introduces a new ARM and HVF specific machine type called "vmapple". It mimicks the device model that Apple's proprietary Virtualization.Framework exposes, but implements it in QEMU."
One thing I've been on the hunt for is a good server hypervisor solution for Apple Silicon. I would like to put an Apple Silicon Mac Mini in my rack, and it would be really nice to have a minimally bootstrapped host OS running a handful of macOS VMs for various purposes.
The best I'm aware of currently is UTM which has some scripting capabilities. But that is very far off from the experience with Proxmox.
It would be interesting to see how feasible it is to run that on an Arm workstation, e.g. an Ampere Altra. Or going the other way and trying to squeeze macOS on to the latest Raspberry Pi.
Lucky you; that's how much they used to charge when one could actually, no kidding, buy Snow Leopard. I paid it so that the OS on my hackintosh (an HP Mini) wasn't stolen, and I still have the CD-ROM, which shows you how long ago it was that they cared about making the OS a purchasable thing.
- There is less alternative hardware I want to use. I want Apple Silicon processors and materials, and there just isn't much high-quality competition.
- Because of inflation the Apple premium isn't as high as it used to be. You get a Mac mini or MacBook Air at very competitive prices (RAM is still painful).
- Linux Desktop software is more competitive and fills some of the roles that needed macOS before.
Why? Soldered drives are awful. If your motherboard dies you can kiss goodbye to your data. Watching Louis Rossmann’s repair videos disillusioned me from ever wanting to upgrade internal storage beyond 1TB.
Externals are cheap, fast, and safe. It’s win/win/win, the only downside is that they’re inconvenient if you want to use the laptop on a non-flat surface (such as a lap), but I’m not sure I would pay the Apple tax just for that.
> Soldered drives are awful. If your motherboard dies you can kiss goodbye to your data.
You have backups though, so who cares? Your motherboard died, you need a new computer anyway. Do a restore. And it’s not like external drives can’t fail; they’re just tiny computers.
> It’s win/win/win, the only downside is that they’re inconvenient if you want to use the laptop on a non-flat surface (such as a lap)
Hard disagree, friend. Example: my editing software has a ‘cache’ folder. If I set it on the external drive and that drive isn’t mounted for whatever reason, the software defaults back to ~/Movies. The first you know of this is the system notifying you that you have 20MB of hard drive space left.
There’s no substitute for an always-there-no-matter-what internal drive.
Nothing beats the convenience of unplugging a drive and plugging it to a next computer and it just works, within seconds you have everything ready for work.
Is there a good file system for this? For macOS I’d want APFS, for Linux ext4, Windows NTFS (maybe ReFS?). You can typically get read only access between them, and I know Linux can do RW with NTFS - but I’m not aware of a good option.
I'm living this right now. I just switched laptops with my partner as she's doing a bunch of video editing. So she got my Pro and I'm using her Air. (I know, I'm an amazing boyfriend.)
So yeah, great, I have all my files. But none of my apps! And I'm not signed in to any of my websites. And even if the apps I use were on this laptop I'm not licensed for them. Oh and I have to get my password manager before I can do any of this. And configure my email.
Conservatively, it takes a couple of hours to restore minimal state. I know, I just did it!
In that time I could have connected an Ethernet cable and copied all of my data between the two internal SSDs. :-)
Edit: oh yeah, and I just command-strip-attached the SanDisk SSD to the back of this laptop's monitor. Because otherwise when I go and work somewhere else I'm trailing this umbilical around with me. This setup is absurd and I hate it!
Edit2: oh yeah again, now this external drive is constantly occupying fully 50% of my USB ports. Terrific.
Apple silicon is unfortunately still at a premium. I was shopping around for used M1 mac minis and a 16gb model (minimum acceptable imo) was like $500-600 second hand. You can get an 8th or 9th gen off lease computer for like $100-200 that is close enough to the m1, sometimes with 16gb already in there with the option to add in 128gb, multiple drive bays, pcie, etc.
> I was shopping around for used M1 mac minis and a 16gb model (minimum acceptable imo) was like $500-600 second hand.
That's because an M1 mini with 16gb still has a lot of utility that hasn't depreciated significantly in the 2 years since its release. My M1 (8gb) is still happily serving the media-consumption part of my day and is not showing any signs of age. I would be surprised if, for the role that it has, it becomes outdated in another 2 or 3 years... and wouldn't be surprised if it lasts another 2 or 3 beyond that.
If you spent $700 on it in 2020 at release, it is still working as well as it did on the day you bought it.
You may be seeing the premium from when it was bought carried over into the current used price (and there are less expensive ones available now), but the device is still providing the value I bought it for, and selling it used would mean getting a new one... at a similar price to what I'd sell it for.
> sometimes with 16gb already in there with the option to add in 128gb, multiple drive bays, pcie, etc.
I will note that for me, in the spot where it is, the "multiple drive bays, pcie, etc." represents a worse device as it doesn't sit nicely under a monitor on a small desk. Part of the choice of the Mac mini for me for that role was its form factor and quiet running.
There's still lots of utility in 9th-gen Intel PCs, and it's cheap to get extra utility. The resale value of the Mac mini is down to the brand cachet and the fact that there isn't a glut of them on the secondhand market from businesses upgrading over time.
You can also get PCs in a similar form factor to the Mac mini that still have RAM slots, M.2 slots, and sometimes a 2.5" drive bay. You could get one that's powered off the monitor's USB-C PD, requiring only a single cable, and that can be mounted on the back of the monitor (mount included!).
On a mini PC like a Lenovo you can have an M.2 SSD, a SATA drive, and whatever you hook into the PCIe expansion slot. The HP mini PCs actually have 2 M.2 slots along with the SATA bay, so three internal drives and two SODIMM slots in a 6.97 x 6.89 x 1.35 inch box; it's technically smaller than the Mac mini.
9th-gen Intel? According to Geekbench 6[0], the M1 is between 2x and 5x faster with better battery life, so depending on your workload it's quite possibly not close enough.
For example, compare a 9th-gen i5 vs the M1 here. The 9th gen has 2 fewer cores (although the 9700 has the same count, and 10th gen doubled the threads) and nearly the same clock speed. Graphically the M1 has more compute units, but in the 8gb model the GPU is likely to be starved anyhow, as it can only take 8gb of the system RAM; the i5 listed here can give its iGPU up to 64gb. The 9th gen has better encoding abilities. Battery life, sure, maybe, but I am talking about the Mac mini here compared to, e.g., an HP ProDesk mini PC; the 9th-gen chip, depending on what you have connected to the rest of the system, could idle at around 6w.
- The iCloud webapps are now pretty good, and iCloud file storage and password management work well on Windows these days.
- Windows is also pretty good these days, and obviously if you want to run games, it's the easiest option.
Although I enjoy my MacBook and iPhone, I don't have a compelling reason to run macOS on my desktop instead of Windows. I think the only things I would miss are clipboard sharing and Universal Control (sharing mouse and keyboard with Macs and iPads), but there are cross-platform software solutions that are good enough.
I daily drove a hackintosh for years until I recently pivoted to Apple Silicon. It was a very enjoyable experience for me. The success and reliability of a hackintosh is really dependent on your hardware configuration. I lucked out in that the desktop tower I had built years prior happened to coincide almost 1:1 with the hardware requirements for a golden build (6700k, 64gb ram, Vega 64, compatible wifi/bluetooth pcie, compatible m.2 controllers, a z170 motherboard which is well known in the hackintosh community, etc.).
Being able to have a modular Mac was really something, and I exploited that to tailor my machine to my use case (television/video production). I never had issues with Bluetooth or WiFi, nor did I ever have an issue with Apple's services like iMessage/FaceTime.
What sucked about the process was staying current with system updates. Updates within a macOS release went without a hitch, but my hardware aged out of newer macOS versions, which made upgrading a bit too much like surgery; and since this hackintosh was my production device, that wasn't something I wanted to roll the dice on.
Having switched to Apple Silicon, I do kind of miss that freedom, but I've found much the same freedom by doing things a little less hacky: instead of a board I can add drives to, I just set up a NAS; instead of using an old PCIe HDMI capture card, I got a more modern USB one, etc.
For a long time, Hackintosh was an opportunity to do things my way, and that led to learning experiences that have improved my day to day and that I otherwise may not have had. It was a freeing experience. Today I still do things my way, but these days my way is more focused on convenience for the things that should "just work", so I can put my attention on things that matter rather than things that shouldn't, such as modifying my EFI before a macOS update to trick macOS into thinking I have the iGPU of a newer chipset because Apple dropped support for Skylake in a new release.
Good times, the headaches were worth it in hindsight.
I’ve got a machine pretty similar to what you’re describing in my closet (6700k, mobo reasonably well known in the community, 5700XT GPU) which used to be a hackintosh. Might be worth reviving and trying to find a use for.
Well, I can run macOS in Proxmox nowadays, so that's killed the "bare metal" Hackintosh for me. Yes, it takes time to set up (once!), but I much prefer KVM now to fiddling with hardware: https://www.youtube.com/watch?v=68R2SdbFj-8
Yep, this is exactly what I do. It's nice to be able to use Messages from my desktop, plus it fulfills an edge case of 2-way backup with Synology Drive and iCloud for me
Does anybody know how/why FaceTime/iMessage are coupled so tightly to the Wifi drivers? I'm assuming this is for communication with local iPhones for handoff purposes but I'm still surprised this requires special interaction with the hardware and doesn't gracefully fall back to just talking to the backend if driver features are unavailable.
I'm not a specialist, but there's a chain of trust you need to maintain to have the full set of features. If it's ever broken, you're sent to the gulag. I broke the chain when reinstalling like an old-timer on my M1 Macbook Air and was then forced to enter my password twice to unlock the Mac.
I had to reflash with a second Mac to restore the chain.
I remember reading (perhaps here on HN) that Apple does weird/nonstandard things to wifi packets to enable Continuity/Handoff features, so it could be related.
The application can see that the kernel is tainted and refuse to run. Similarly, some kernel-related functionality may be disabled. None of this requires iMessage to run in the kernel or have a module of its own.
Now it's the other way around: you hack Linux onto Apple Silicon (Asahi Linux).
For me buying a new Macbook Air or Pro is definitely made more palatable knowing I could turn it into a Ubuntu laptop down the line.
And if the Asahi team keeps kicking ass, we may even use Asahi to run Windows games with Proton inside Linux Steam, replacing the Boot Camp partition for those who dual-booted to play Windows games.
I don't miss Hackintosh. The one build I made looked like genuine macOS, but you could feel it wasn't the same. Photoshop felt more laggy. Unless you had used an equivalent iMac beforehand, as I did, you might not notice that something was off. That said, this was like ten years ago.
Even if it works well, "hacking" isn't worth it imho. Whenever there is the slightest lag or issue, you just never know: is it because of the hackintosh? Is it a genuine bug in macOS? Is it my hardware? Too many unknowns.
Some of the first Hackintoshes, and the origin of the term "Hackintosh", came from the practice of putting Mac motherboards into non-Mac cases, often alongside non-Mac hardware. For instance, "Macintosh Repair & Upgrade Secrets" showed me how to use a TTL / Hercules monitor with a motherboard that came out of a Mac Classic with a broken picture tube. I used that machine for years.
> While I knew about and even tried various very early attempts to run macOS on non-Apple hardware [...]
Running System 6 / System 7 / Mac OS 8 on Amigas was also popular back then, both legally (by buying Mac Plus ROMs) and not necessarily legally (by loading ROM images from disk). If you had an Amiga with a PowerPC accelerator or a PowerPC BeBox, you could run PowerPC Mac OS, too. Early attempts to run macOS on non-Apple Intel / AMD hardware had plenty of precedent ;)
Will Hackintoshes be made using ARM computers running modern macOS? It's hard to say for sure, but considering how clever Opencore and other communities are, and considering how much can be done to virtualize / emulate hardware presented to virtual machines, I'd be willing to bet we'll see macOS VMs running on ARM systems at some point.
Have you read Bob Brant's Build Your Own Macintosh and Save a Bundle [1]? The man was a proponent of having the only Apple part in your build be the logic board, and using as many commodity PC parts as possible – I remember one of the builds started with a Macintosh SE logic board that ran "headless" with a (Radius?) video card in a generic AT/XT case, with everything else (drives, PSU, etc.) commodity PC. Pure wizardry, and he somehow managed to make the numbers add up cheaper than first-party Apple even when adding accelerators to your build.
It can still be useful for those looking to run older versions of macOS for one reason or another. If there are PPC applications one wants to run, for example, you can piece together a Snow Leopard hackintosh from used parts that will run PPC apps through Rosetta faster than any real PPC Mac ever could, while also being easier and cheaper to maintain.
From time to time I’ll consider building such a box as a time-frozen “zen machine” that runs OS X 10.6 or 10.9, is disconnected from modern distractions, and will never be subject to the disruptions that software updates can bring.
Now that their MacBooks come with 120Hz screens with acceptable response times (unlike their early 120Hz screens), the value proposition of a hackintosh isn't as alluring for me. Previously, I'd been worried about the T2 chip and the trend of Apple locking down macOS, which also turned out to be less of an issue than I thought. The only area that saw a significant retreat in macOS is gaming.
> The only area that saw a significant retreat in macOS is gaming.
Mac gaming is probably getting better thanks to wine, crossover, GPTK and Whisky [1]. I am not a gamer but I have seen others playing serious Windows games like FF7 remake (not sure if that counts) on mac.
The problem is, a significant portion of "real games" used to run on macOS, and all PC games used to run under Boot Camp. Now native Mac games are all but extinct, and the cross-platform toolkits seem very hit and miss depending on the game (for now).
Sure, nothing beats Boot Camp, but that is not strictly macOS. Apple's GPTK, released last year, seems to have greatly advanced gaming compatibility. Probably lots of games still don't work, but it looks promising and is getting better. I hope Apple can continue to put resources into that.
I do hope they steer some of their resources from Apple Arcade into cross-platform porting toolkits.
I think the fundamental problem remains that games, unlike other software, are media and cannot be substituted with equivalents. By pushing their proprietary tech and neglecting the native macOS ecosystem over the years, Apple has willingly pushed themselves into the same corner as the console makers: they cannot compete with the value proposition of the PC, given the overwhelming majority of titles that only run on PC. It's all or nothing in terms of game coverage, because that's what ultimately lets consumers "buy one device for (mostly) everything" for a hobby that, unlike Netflix or Hulu, takes a significant upfront investment.
I used to tinker with building nforce4.kexts for OSX Leopard. I got everything working including the Realtek HD Audio thanks to a pcid injection. Snow Leopard was the last time I was able to build for nforce4 boards and we moved onto Intel gen 6 LGA1151.
This was back when Nvidia and Apple got along. GeForce 900 days. SLI was a thing, and it worked on my drivers. Sadly, I had kids and grew out of it, got old(er), and now only use Linux because arm64 killed hackintosh.
> Many will tell you that buying Intel-based hardware from Apple is buying obsolete models.
A strength of Intel-based Macs is that they can run Wine/CrossOver. This is very good for people who really need a Windows app for their job and also need to minimize the risk of a ransomware attack (which is way higher on Windows).
Intel Macs running macOS 17 are also great for retro gaming, running Win32 games through CrossOver/Wine and platformer games with OpenEMU.
Bad news for the indie music scene using these, where they just want to be able to run their VSTs and DAWs. Some will switch back to Macs; worth it for Apple, I guess.
This is the one 'creative' use case I know of where Macs actually are better - the lag and latency issues on Windows still seem pretty bad. I spent the last week shopping around for an audio interface and every single model had people complaining about driver issues and latency and "random loud static" or other crazy things when using Windows. Always Windows. Crazy stuff.
> This is the one 'creative' use case I know of where Macs actually are better - the lag and latency issues on Windows still seem pretty bad.
I don't know where this is coming from. It certainly doesn't apply to the actual driver latency of higher-end and professional interfaces. Most of these actually have slightly lower latency for the same buffer size with ASIO than they do with CoreAudio.
The one thing that can cause latency issues are GPU drivers, but there's ways to fix that.
One of the things that really annoys me about Windows is that there doesn't seem to be a way to capture audio from a single application, even though the audio mixer can clearly change its volume. If you want to capture, you only get the final mix output for everything (so you'd better hope there's no sudden notification sound from anything), or you have to use a virtual sound card like VB-Audio's Cable/Voicemeeter, which requires the application to be able to select a specific sound card.
Arguably this isn't relevant for audio production, but it does make capturing applications a huge pain. (And if I did miss something and there is a way to, e.g., have OBS capture the audio of a specific window/application, I'm all ears!)
Oh, nice, thank you! That is pretty recent, so I missed it when I set up my stuff a year ago, but that'll teach me to read the patch notes in the future :)
Linux ships with a realtime kernel patchset these days, so the only issues there have to do with hardware- and proprietary VST support. With the newer Pipewire audio server you don't even need to set up JACK.
Except that Apple breaks audio applications with every new release of macOS and has for years. And, it's just a matter of time before Apple kills basically every DAW and VST when they finally kill off OpenGL on macOS.
Edit: What I have said is true here, so I'm unsure why the downvotes.
Curious, why would OpenGL break VSTs? (And yes, I get that AU is the preferred way over VST2 or even VST3, which is why most modern plugins are available in AU as well. I just don't see how OpenGL plays into this.)
I learned this running them on Windows in a VM. I would have expected it to fall back to software rendering like the old days, but no. I ended up installing some build of Mesa for Windows.
I used hackintoshes for a long time. I'm done with dealing with kexts and other such nonsense; life is short, and I don't want to spend it all on silly technical administration. Windows is good enough for every piece of software I want to run, and I have an old MacBook for the odd thing Windows doesn't support. My Windows machine dual-boots into Linux for the extra-exceptional thing that I can't do with Windows.
I'm sure Apple doesn't love the fact that Hackintosh exists, but I'll tell you what: after installing it on my gaming PC 15 years ago, I loved it and bought my first Mac shortly after. It was a previously owned Mac because I was still getting my feet wet and didn't want to spend a ton. But then I bought a MacBook Air. Then an iMac. Then an iPhone. Then 5 more Macs. And now I'm hooked. Hackintosh was a great gateway drug.
For me it's just that macOS isn't a desirable OS anymore. Over the years Apple kept changing things that I preferred the way they were. So I'm done with it. I use KDE now.
No denying that Windows these days is stable. Indeed it is. My biggest issue is all the third-party crap I never asked for that gets installed with it, not to mention all the Microsoft services that I don't want to use but still manage to be there. Like OneDrive. Sure, one can uninstall it, but then see the mess it leaves with the way files are saved in the Documents directory.
Even when setting up a Windows 11 VM, I usually have to spend an hour just removing stuff, disabling things, and doing multiple reboots just to trim things down.
Yeah, I've used plenty of community-made tools to de-bloat Windows. But that's not the point: we shouldn't have to do that. Especially with a paid Windows license, I shouldn't have to spend time dealing with Microsoft's efforts to squeeze more revenue out of their OS platform.
It was even worse in the Win95/98 era. I remember reinstalling the entire network stack multiple times in one day just to get TCP/IP working. The operating system was an extremely broken piece of shit and wasted days of my life.
Shoulda this. Shoulda that. This is just the reality of Microsoft that we have to accept, and it will never change. There are tools to deal with it. So I use the tools and move on with my life.
If you set your language to 'English (World)' during the install, none of the crapware is installed.
Or at least it wasn't last year when I installed W11 (I still occasionally need Affinity Photo and Capture One). Microsoft might have realised they're missing out on a few pennies and plugged the gap.
Windows 2000 was the pinnacle of Windows. Rock solid, and that was before they broke the search function (when it still actually searched inside files rather than an incomplete index; thankfully, grepWin can be installed) and before they dumbed down the Control Panel.
Yes! And even if you search for foo.txt, it will also display foo_something.txt even though you didn't search with a wildcard (foo*.txt).
And then you have that "fantastic" UI that helpfully tells you the file is in "C:\Users\something\Documents\..." regardless of how large you make the window. Whose brilliant decision was it to truncate the containing folder's path without any way to resize the column and actually see the full path?
Anyway, just giving a shoutout to grepWin again: it's one of the first things I install on any Windows box, while hoping that everyone involved in the Windows Search "experience" steps on a Lego brick every day of their lives.
I also wholeheartedly agree: Windows 2000 was the pinnacle of the Windows NT line before Microsoft merged the consumer line (3.1/95/98/Me) with the professional line (NT) beginning with Windows XP, which unfortunately added all sorts of annoyances to Windows. The underpinnings of Windows are fine and are quite a formidable alternative to Unix. WSL has also been a major game changer, allowing me to have a Unix workflow without loading up a separate VM. It’s just a shame the upper layers get in the way, though the Pro and Education versions of Windows are less in-your-face with these annoyances than the home versions. I’d love to have a Windows 2000 UI (with a search bar, introduced in Vista) on top of Windows 11.
I really don't understand where the nostalgia for XP comes from. Well, actually I think I do because a lot of home users probably didn't use NT4 and 2k and went straight from 98/ME to XP. But I remember all the ridicule that XP got for being such a terrible bubblegum OS X imitation, with required activation, and a bunch of stability and driver issues that were eventually ironed out. XP after Service Pack 2 was also rock solid, and probably still the best choice for a retro gaming PC because it's got decent hardware and software support and the activation has been worked around.
But yeah, I used every Windows version since 3.11 full-time, and 2k was perfection - literally can't think of any downside to it.
2k is pretty great, but fully patched XP is too. It’s totally subjective but as much as I loved 2k I’d give an edge to end-state XP mainly for its ability to be customized with third party .msstyle themes, of which there were many that were well made and good looking.
Fully patched 7 is a bit better yet for me though, because its theming engine added support for full depth alpha which really opened up possibilities for theme designers. It was a massive disappointment when Windows 8 came along and gutted the engine, regressing it to being barely more capable than Windows 1.x with all the flat squares.
Imo, it's much more of a proper Linux experience than macOS: all the filesystem stuff like /proc is there, and you don't have to deal with BSD/Linux differences, zsh/bash compat issues, etc.
macOS Unix compatibility is an oversold feature, and it's unlikely for anything made for Linux to work on it unless it's specifically ported.
That being said, a lot of the dev community owns Macs, so this support usually exists.
I mean, with Hyper-V, why even use WSL instead of just running VMs of whatever other OSes you want? Tinkering with the OS these days is just so different from what it was in the past. Trying to dual-boot Windows/Linux back in the day was an interesting challenge that might leave your disk corrupt; now the question is why do that at all? Hacking smaller platforms like the Pi that are cheap seems to get more attention than PCs these days.
When I first delved into programming, I was under the impression that OSX was necessary because most programming video tutorials were recorded on a Mac. This led to a minor obsession with acquiring one. Unfortunately, financial constraints were a significant barrier, leading me to explore Hackintosh as my sole option. Countless days and nights were invested in making it work properly. Despite the challenges, the learning experience and the satisfaction of eventually getting everything to function smoothly made the entire process immensely rewarding.
10/10 would do again, if I were 14. Now I am way older, 3 macbooks around and wish my job would let me use Linux.
I feel like hackintosh virtualization is a better investment of time. Currently it's onerous to run Apple OSes on anything but Apple hardware. Being able to spin up a hackintosh VM in any cloud provider would be pretty sweet. Of course that probably violates Apple OS terms of use so not sure if AWS would shut you down if they discovered people were doing that.
But actually, as others have pointed out, I'm much more interested in Linux running on M chips than macOS running on non-Apple hardware. There's nothing particularly compelling about Apple's OSes (except maybe their new VR, sorry, "spatial computing" OS).
Possible they’ve gotten some “out of spec” capabilities wired in? Trying to imagine what for… Securely bridging local adhoc networks with the internet?
Apple’s always done some unique things with WiFi/Bluetooth. Macs have long been able to keep Apple-branded bluetooth keyboards and mice usable even before the OS initializes for example, and if you use one of a few Broadcom chipset BT/Wifi cards that were used in real Macs in a hackintosh, that capability extends to those too. It feels weird being able to navigate BIOS/UEFI with a Bluetooth keyboard.
While I never jumped on the Hackintosh bandwagon, I had many friends who did for almost a decade. They built systems that ran macOS for considerably less when it came to the price/performance ratio.
Nowadays, those same friends are all using Mac Studios because the price/performance ratio for running macOS is better there. I believe this is one of the major factors in why the Hackintosh community is dying today (not just the changes to drivers and the macOS codebase that the author suggests).
Running macOS on Apple Silicon is very fast compared to running it on Intel. This is due to the RISC architecture and the tight integration of the components in the SoC. To run it as fast on Intel would cost more than the price of a Mac Studio.
Can you point to any source supporting that point? All the benchmarks I’ve seen say the opposite, a Mac Studio wins on power consumption but is way more expensive compared to PC parts
In the olden times this used to be called the Jade plan.
I don't really complain. I had a good run, which helped me skip over the worst price/performance Mac lineup that I remember. There are now plenty of good choices within the current crop of M1 / M2 / M3 machines, and I'll be following eBay closely for good used Mac mini / Studio models. Or maybe I'll even splurge on something new.
I've been doing it for ages as Apple hardware holds its resale value exceptionally well. You can use the often exaggerated price premium to your advantage - buy brand new default config at an opportune moment in the upgrade cycle, sell just in time for a coveted new release.
This works even better if you happen to be traveling somewhere where Apple devices are unusually expensive.
I'm still on the M1 Air and will likely sell it just before a M4 release 12-18 months from now. Cost of ownership averages out to <$0.25/day.
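For anyone checking that figure, the arithmetic holds with plausible numbers (assumed and illustrative, not the poster's actual prices):

    # Back-of-the-envelope only; both prices are assumed for illustration.
    purchase_price = 1000   # assumed: new M1 Air at launch, USD
    resale_price = 750      # assumed: resale just before the next release
    years_owned = 3

    cost_per_day = (purchase_price - resale_price) / (years_owned * 365)
    print(f"${cost_per_day:.2f}/day")  # -> $0.23/day, under the $0.25 claim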
For a lot of use cases except hardware dev, a VM is sufficient.
Arguably you could (indeed, already can) run macOS in a VM on generic hardware, the same way retro systems are emulated (which could be partly in hardware, e.g. an FPGA performing some functions).
This would only be bad from a performance perspective; from a maintenance perspective it may make life a lot easier, as you've moved a hardware compatibility problem into software.
You're no longer having to add compatibility workarounds for hundreds of pieces of hardware, just target one VM.
If you really needed the performance, you'd then have 3 options as I see it: pay for a real Mac, pay for a machine with 2x the performance of a real Mac (which limits you to mid-range machines at the moment), or spin up extra VMs as required on hardware you have lying around that previously you couldn't use.
I think the machine code would run (maybe not Rosetta 2, since it uses weird extensions), but the issue is the weird boot sequence. You'd basically have to do the inverse of what Asahi Linux is doing to get it to boot on a machine with ACPI or another bespoke boot system (à la the Raspberry Pi).
Perhaps the author should have followed the OpenCore guides more closely, especially the hardware buyer's guide; the experience could have been vastly different. There are a lot of (cheap!) wireless PCIe cards that work on both Windows and macOS (they are refurbished Mac Pro parts). I used one for years, and iServices always worked aside from the first hiccup when configuring the system for the first time. Eventually the setup died as I moved to a MacBook Pro and cleared out the NVMe storage, but until then the experience was flawless (that is, after figuring it all out).
The Hackintosh will eventually die with Apple moving away from x86_64 completely.
But that day is not today. Eventually, someone might even manage to continue the tradition with upcoming ARM64 PC platforms.
I think the switch to their own ARM-based chips should have been the writing on the wall for the end of the hackintosh.
I think we're at a point where we are going to wind up needing Linux more and more. My kid is running an (IIRC) 2014 rMBP that I passed on to her a few years ago. While it still runs, it's long past its last supported update, and now even forced unsupported updates are going to stop working. I'm not sure where gesture support is in Linux at this point. Google Maps in Chrome is probably the best demo experience.
I'm looking forward to Cosmic in Pop!OS from System76. I've preferred their take on UX for a while.
It'll be sad when it dies. I remember running Snow Leopard on my netbook and taking it to WWDC. It was sacrilegious but quite the conversation piece; I remember lots of Apple engineers being amazed it worked so well.
I can't imagine why anyone would want to install macOS on their hardware unless they have to. macOS's built-in software is mediocre at best, and most of it is utter crap; take Finder or Preview as examples.
I think this article reaches a valid conclusion based on the wrong evidence. It claims hackintosh is dead because Apple is dropping WiFi drivers. Hackintosh is dead because the number of major versions that will still support x86 is 2-3 at best, and after that the OS will only run on Apple Silicon. Maybe if we're lucky a similar-enough ARM chip will come along, but I don't see many motherboards socketed for ARM chips. It is dead, but not because of WiFi driver support.
I was building hackintoshes from 2011, mostly Intel-based; my i3 h370 is still going strong. I ran macOS until the "M" series came out, and the next day I swapped it out and gifted the hackintosh to my designer friend. I do miss the days when I had time to figure out each and every bit of what was needed. Nowadays I'm running Windows.
Edit: I stopped building hackintoshes in 2019, before covid hit. I had been building them for friends and family.
This article is nonsense. If "I've had compatibility issues with my hardware" is proof of Hackintosh dying then it's been dying from day one.
If there are genuine issues across the board, Hackintosh software is generally updated to patch them; it's always been like that, and it's only improving over time.
Personally I'm still on 10.14.6 and will probably never upgrade to 11 since it and its successors suck so hard.
Apple never licensed their OS to run on non-Apple hardware, with the exception of when the company was on its deathbed shortly before the NeXT reverse acquisition, after which Steve Jobs killed all the agreements with clone makers.
I don't get why you would ever want to do a Hackintosh this way... Apple has great hardware but ABYSMAL software. I prefer Windows or any of the top 5 Linux distros on any day that ends with y. They are being carried hard by their top-tier CPUs right now.
Something drastic needs to happen to the software side - as it is, it is almost an unusably bad experience to simply browse the web and move files around.
Now if we could have Windows running on an M3 chip with the nice touchpad and battery, that would be really nice.
I spend most of my time in a shell, so MacOS being POSIX compliant is a huge draw for me.
What difficulty do you have browsing the web? I just click Safari and it works. Though I usually have Firefox and ungoogled Chromium running as well... and they work just fine.
I generally use shell commands to manage files, but, dragging works just fine for copying and moving them. Certainly as well as it does in Windows.
Truly, I can't imagine what you experienced that was "unusably bad".
MacOS has some quirks for sure, it's far from perfect. And I'm not a huge fan of a lot of the changes they've made over the years. But I am a big fan of some of them.
On the other hand, despite massive improvements to Windows security and stability over the years, I still don't like using Windows. (And yes, I realize things like WSL exist.)
> it is almost an unusably bad experience to simply browse the web and move files around
Cannot relate at all. "Move files around" is essentially the same on Windows and Mac, except on the Mac I have a UNIX shell. Browsers also behave exactly the same on every platform, and Safari is snappy and the least memory-hungry of all. What is it about?
Have you ever needed to perform a reliable recursive directory copy between two drives on Windows? I have, and it turned out to be a comically complicated task. Robocopy helps but also has edge cases you need to handle, and long path names become problematic (MAX_PATH etc).
By default there is a path character limit of 260 characters, although there is a method to increase it. So if you try to copy something with a long filename that is many nested folders deep, it will fail. In one office, I had a coworker who used very descriptive folder/subfolder names for everything, and he constantly had this issue.
The problem is that lots of older software allocates fixed size path buffers (mostly on the stack) that uses the MAX_PATH macro (which is set to 260). Fixing this requires recompilation.
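If you're scripting the copy yourself rather than leaning on robocopy, one workaround is the extended-length \\?\ path prefix, which bypasses MAX_PATH for software that calls the Unicode file APIs (Python 3 does). A sketch with made-up paths:

    # Sketch: copy a deeply nested tree on Windows using extended-length
    # paths. The \\?\ prefix lifts the 260-char MAX_PATH limit for code
    # using the Unicode file APIs; it wants absolute, normalized paths.
    import os
    import shutil

    def to_extended(path: str) -> str:
        """Prefix an absolute Windows path so it bypasses MAX_PATH."""
        if path.startswith("\\\\?\\"):
            return path                      # already extended-length
        path = os.path.abspath(path)
        if path.startswith("\\\\"):          # UNC share: \\server\share\...
            return "\\\\?\\UNC" + path[1:]
        return "\\\\?\\" + path

    src = to_extended(r"D:\projects\very\deeply\nested\tree")  # made up
    dst = to_extended(r"E:\backup\tree")                       # made up
    shutil.copytree(src, dst)  # long individual entries no longer blow up

(Windows 10 also has the opt-in LongPathsEnabled setting the parent alludes to, but the prefix works without touching system policy.)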
No it doesn't. Try setting up Windows CI for a large code base; good luck. I couldn't believe this was an actual problem in 2024 either, but unfortunately it is, depending on what tools you use (Visual Studio and most Microsoft dev tooling work great; anything cross-platform is hit and miss).
Finder is the same crap software it's always been. Windows Explorer has always been better. Nobody except the nerdiest of nerds would want to use the terminal for "moving files around".
>Safari is snappy and the least memory-hungry of all. What is it about?
MacOS is a memory hog in itself. Safari is the laughingstock of browsers, so behind the times and purposely crippled by Apple.
I used to love it when it was still a capable unix with a good UI. At the same time Linux was a horrible mess, none of the desktop environments were passable.
I loved it until Tiger and Snow Leopard. After that it started going downhill. More and more features I really wanted were being deprecated (like the ability to have virtual desktops in a multi-dimensional grid). This was the first big thing that really broke my workflow and I have regretted it ever since. More and more UI things were pushed through I didn't like. The fullscreen mode became (and still is) horribly incompetent. Apps were becoming more iOS-like, dumbed down.
I put up with it, but instead of looking forward to every exciting new OS update, I started worrying about which feature I used would next be removed or mangled beyond recognition. Eventually the negatives added up so much that I left the platform entirely. I went to KDE, because it had become a powerful and configurable desktop environment over the years. I finally have my virtual desktop grid back, and things are so much better for it. I found that opinionated software doesn't work for me (for this reason GNOME won't ever do either). The only reason I thought it worked for me was that OSX's designers had roughly the same opinions as me. But over time this changed.
This was not a coincidence. At the same time Apple changed from a computer company to a lifestyle brand. It started appealing to the masses which started with the iPod but really kicked off in full gear with the iPhone. The Mac is really just an iPhone accessory now. Microsoft has been making attempts at becoming a lifestyle brand too, with hilarious incompetence :') Only their xbox division gets a tiny bit of the way but their main marketeers are such business suits that will never understand 'cool' even if it bites them in the ass.
Oh well.. I still use it for work because it's slightly better than Windows. And our company's Windows desktops are locked down too much.
I strongly disagree re:Apple software. We must have drastically different usage scenarios, since I find it a pleasure. What software do you have issue with, and why?
Homebrew is a nightmare. Nearly every development tool on macOS requires some sort of workaround, usually found in the depths of forums or StackOverflow. Apple has also positioned macOS as the absolute worst platform for graphics libraries: they only support Metal and an outdated version of OpenGL, which they'll remove entirely at some point. Windows directly supports DirectX, Vulkan, and OpenGL.
Go ahead and try to rename iTunes because there's no other way to keep it from opening when your non-Apple Bluetooth headphones connect. Good luck.
There's not even a built-in way to uninstall programs in macOS. It's bizarre.
Or the fact that macOS doesn't implement basic protocols for external monitors, making macOS work terribly with non-Apple monitors.
> There's not even a built-in way to uninstall programs in macOS. It's bizarre.
You literally just drag the app to the trash can. Properly sandboxed Mac apps are a delight to uninstall.
Yes, some apps are more difficult, but those are usually Windows apps that are crudely ported to MacOS and that's on the developers for not creating proper MacOS apps.
Yes. Everyone knows that. But that doesn't uninstall the application; it just deletes the app bundle. It doesn't remove any caches or configuration or other files in other parts of the system the way a Windows uninstaller does. To do that on macOS, there are third-party apps that provide the functionality.
This is by design. Mac apps don't leave tons of trash around like their windows counterparts. Only some config files, always in a standard location. So when you reinstall, everything just works. Your data lives in iCloud or the documents folder, and is not meant to be deleted when you uninstall.
Not in my experience, the majority of the time. Those that don't are usually ported Linux apps that don't use the install functionality properly. The point is that the OS provides the tools to do so.
Not removing settings is in fact the standard and expected behavior for MSI-based installers (i.e. those using OS-provided tools). The framework is very paranoid about tracking the origin of every artifact (files, registry keys etc), and removing something in the uninstaller that installer did not add is considered a huge no-no. It is also what users generally expect - if they uninstall and later reinstall, they want their settings to be there.
Providing the ability to remove settings is something that has to be explicitly implemented (because the installer infrastructure cannot track files created outside of the installer), so relatively few apps actually do that. Those that do pretty much always do it as an opt-in. In my 30 years of Windows use, I don't recall a single example to the contrary.
Installing is varied on macOS, but there's certainly a default way to uninstall applications - just drag the application to the Trash. That said I have some sympathy for your complaint, since things that have you run an installer can sprinkle themselves all over the filesystem, and though they leave a trail, there isn't a standard way to reverse that (I find AppCleaner pretty handy for removing all the parts in those cases).
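Those standard locations are predictable enough that the AppCleaner idea fits in a few lines. Here's a rough sketch that only checks the common per-user spots; the bundle identifier is a made-up example, and some apps key their folders on the app name instead, so treat hits as candidates rather than a definitive list:

    # Sketch: list likely per-user leftovers for an uninstalled Mac app.
    # The bundle ID is hypothetical; adjust for the app you care about.
    from pathlib import Path

    BUNDLE_ID = "com.example.SomeApp"  # hypothetical
    HOME = Path.home()

    candidates = [
        HOME / "Library" / "Preferences" / f"{BUNDLE_ID}.plist",
        HOME / "Library" / "Caches" / BUNDLE_ID,
        HOME / "Library" / "Application Support" / BUNDLE_ID,
        HOME / "Library" / "Saved Application State" / f"{BUNDLE_ID}.savedState",
        HOME / "Library" / "Containers" / BUNDLE_ID,  # sandboxed apps
    ]

    for path in candidates:
        if path.exists():
            print(f"leftover: {path}")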
Well that's fine, but Windows isn't Linux but everybody treats it like such, hating on it for not being Linux while people often praise macOS for being "Unix". For macOS, it doesn't have the installer system that Windows has, so solutions like Homebrew are created to try and graft Linux things on it. Usually, the trouble is that Apple has made some asinine decision with the default tooling installed or some other strange limitation.
And because Apple is constantly breaking software, it creates a lot of churn on macOS. From PowerPC to Intel to Arm, from ObjectiveC to Swift, from Cocoa to Metal, etc., they're constantly upheaving the ecosystem and OS. Meanwhile, it provides very little to the end user other than normally increasing the size of Apple's walled garden.
> Well that's fine, but Windows isn't Linux but everybody treats it like such, hating on it for not being Linux while people often praise macOS for being "Unix".
Because even being partly Unix (or technically, full Unix) is easier to deal with than Windows' entirely different stack. ;P
> For macOS, it doesn't have the installer system that Windows has, so solutions like Homebrew are created to try and graft Linux things on it.
No? Homebrew was just the creation of someone who didn't like the way MacPorts did it back in the day. MacPorts is literally the FreeBSD ports tree cloned to work on macOS, and back in the day was partly funded by Apple.
(MacPorts arguably worked around the macOS-isms better than Homebrew does, but people chased Homebrew and here we are.)
> Usually, the trouble is that Apple has made some asinine decision with the default tooling installed or some other strange limitation.
Unless you can actually note one of those choices or limitations, this isn't much of a point.
> And because Apple is constantly breaking software, it creates a lot of churn on macOS. From PowerPC to Intel to Arm, from ObjectiveC to Swift, from Cocoa to Metal, etc., they're constantly upheaving the ecosystem and OS.
PowerPC to Intel to Arm happened over the space of multiple decades, and "Cocoa to Metal" doesn't even make sense in the context of tech stacks.
Your entire comment is just complaining because an alternative solution doesn't fit your built-in mental model of how things should work.
> Go ahead and try to rename iTunes because there's no other way to keep it from opening when your non-Apple Bluetooth headphones connect. Good luck.
I have non-Apple bluetooth headphones (Sennheisers) and this isn't something that happens on my M1 MacBook Pro. Is this a common issue for other people?
Huh. Weird. I've had my sennheisers for years, through multiple OS updates, and I've always been surprised by just how well using them works on my macbook pro. Until this thread, I had never even heard of the problem.
Could you elaborate? I find web browsing and moving files around to be practically an equivalent experience between macOS, Windows, and most Linux distros.
I see sentiments very similar to this on Reddit and some other message boards. Generally the user posting them cut their teeth on Linux or Windows, and has an affinity toward the ux conventions you'd see there. Macs have different ux conventions, not bad ones, just different, and it's not what the poster is expecting.
Some call it baby dick syndrome, The user has imprinted the conventions of their first operating system on themselves, and assumes that they are universally considered "best"
I meant to say baby DUCK syndrome, as in how baby DUCKS imprint on whatever the first thing they see as their mother. Probably too late to edit it to reflect that.
> Something drastic needs to happen to the software side - as it is, it is almost an unusably bad experience to simply browse the web and move files around.
I used both Windows and Mac regularly for years, and I have no idea what you're talking about.
Apple released the last major revision of the cheese-grater Mac Pro in 2010. If you wanted a Mac with exotic features like a new CPU and more than one internal hard drive in ~2013, then Hackintosh was the way to do it.
I have yet to be convinced there is a single use case, other than iOS development/publishing, for wanting a Hackintosh rather than simply installing a well-supported distro like Ubuntu or Fedora.
I tried it a few times and it was always a painful and substandard experience.
Other than trying out macOS for the first time to learn how bad it is, why would anyone make a hackintosh? Windows and Linux are infinitely better operating systems: more open, with better backwards compatibility, more hardware support, independence from vendor servers, and more available software.
A reminder that with macOS you need an internet connection in order to re-install the OS, as it requires activation just like iPads and iPhones. Imagine that one day Apple stops supporting your MacBook model and shuts down its activation server; your computer turns into a brick after something goes wrong and it requires a factory reset.
One thing to consider is that a lot of what some consider “bad” about macOS is purely subjective and varies depending on the user’s background and mental models. It’s not uncommon for people who grew up on Macs to find operating systems with Windows-esque desktop environments as “bad” as some find macOS.
macOS installs don’t require an internet connection or activation, I’m not sure where that came from. Macs registered with iCloud can be remotely bricked with Find My but that’s completely separate and fully optional.
Yes, Tim Cook could flip a switch and my Mac would become activation-locked. But considering that Windows 11 has been working really hard to sneak remote attestation in under our noses (among other things), I think it's safe to cross out Windows as well.
As long as Microsoft wants to keep Windows compatible with user-controllable hardware (like computers that let you disable secure boot and TPM or enroll custom keys), there should always be a way to debloat Windows.
Microsoft doesn't care that much about user-controllable hardware, not as much as they used to. Their partnerships with OEMs have grown very deep and they managed to push Pluton for any device that wants to be certified for W11. They could go a few steps past this in a few short years.
True, Windows will never be as locked down as macOS that only runs on Apple designed custom ARM hardware. I guess my skepticism comes from my expectation that my Windows computer should be able to run games (unlike my macbook which holds personal data and work), and remote attestation is going to be used first in anticheats.
I used to love macOS in the 2000s and 2010s. I never made a Hackintosh but I was always intrigued by them. Before WSL was introduced, the Mac was the best platform for people who needed to use proprietary software packages such as Microsoft Office and the Adobe Creative Suite while running Unix. There was (and still is) a lot of native software on the Mac that is well-polished, such as OmniGraffle and Keynote.
Times have changed, though. While macOS still provides a more consistent user experience, IMO, than Windows or Linux, Windows with WSL means I can run Microsoft Office and other proprietary apps alongside a seamlessly integrated Linux environment without needing to SSH into a VM. The popularity of Electron apps undercuts the Mac’s consistency while also making Linux a more viable option since Linux can run the same Electron apps macOS and Windows do. Microsoft Office is now available as a Web app via Microsoft 365; while I prefer the macOS and Windows versions to the in-browser version, the in-browser version gives Linux users access to Office. I also believe macOS’s Unix environment has not kept up with advances made in the BSD and Linux world. Windows can be quite annoying with its notifications, but unfortunately the Mac in recent years also has annoying notifications; I know this because I use a work-issued MacBook Pro regularly.
In my opinion, the most compelling reason for a Hackintosh in 2024 is for Intel Mac users reliant on Mac software tools to still use them without being restricted to Apple’s hardware. The 2019 Mac Pro is still very expensive, and Apple’s ARM lineup requires paying substantial sums of money for RAM upgrades with no workaround since there are no DIMMs.
Even if you try to install it using a flash drive, it still asks you to connect to the internet to "verify" the installation.
https://sneak.berlin/20201204/on-trusting-macintosh-hardware... explains how it's ensured on the firmware level that you really connect with Apple servers.
I see your points, but I don't want to make compromises for my daily work based on a scenario that's unlikely to ever occur. If the apocalypse comes, I'll gladly use Ubuntu, but in the meantime I'm ok with not reinstalling my OS when I'm somewhere without internet.
Hardware support, sure. Backwards compatibility is a double edged sword though. While it's awesome to have it's also the reason why parts of Windows feel so dated and inconsistent.
> While it's awesome to have it's also the reason why parts of Windows feel so dated and inconsistent.
I'm not convinced.
What Windows could do is make the old components available for old software, while directing all new software to use new components. Old software will feel dated and inconsistent, but the alternative is that this software would not work at all. If you don't install old software, you'll still have a perfectly seamless experience.
I understand that backwards compatibility is the reason Windows still has two control panels. However, if it were up to me, the legacy control panel would be completely hidden from the UI until the user installs some software that uses a custom control panel (or something).
I mostly don't understand why this hasn't happened.