Hackintosh is almost dead (aplus.rs)
549 points by ingve 59 days ago | hide | past | favorite | 527 comments



Back in my youth, when I was time rich and cash poor, this kind of tinkering was fun and a good way to improve the machine I was using.

Now that I have more disposable cash, but waaay less time, I couldn't imagine "wasting my time" doing this sort of thing. These days I want to -use- the computer, not spend time trying to convince it to work.

Incidentally it's the exact same journey with my cars. 35 years ago I was fixing something on my car most weekends. Now I just want to turn the key and go somewhere.

Hackintosh served the purpose for its time. It'll be fondly remembered. But I think the next generation of tinkerers will find some other thing to capture the imagination.


People have been making this argument to me about Linux for more than 25 years. The most cutting version that I ran across was:

> Linux is only free if your time is worthless!

Something never quite sat right with me about this argument, and your comment finally made me understand what it is: the understanding you gain from tinkering is priceless, and it's exactly the experience that you use to help everyone around you: it turns you into an expert.

So yes, I may just want to turn the key and have my car work. But when it doesn't, I often wish I was that guy that had tinkered with my car, so I can better understand what was wrong, and whether I can fix it myself or if I needed a professional.

I run Linux on all my machines, and my family generally uses Mac (both sides), but all those years tinkering with Linux, they still come to me for help with their Mac machines that they insisted would Just Work.

All that out of the way, I agree with your fundamental premise: hackintosh is likely in the rear view mirror for the next generation of tinkerers.


I think there's a difference with Linux, because it's something you own and control and can dive into and see every part of. I hate investing time in proprietary technologies, because I know I can be stopped or locked out. With open source software, simple electronics, old cars, fabrication and woodworking, the time I spend learning feels worthwhile.


Even this "I hate investing time in proprietary technologies, because I know I can be stopped or locked out" is a hard-gained insight. Hackintosh is one of those things that made me understand this. Nothing like spending weeks to get your hackintosh working smoothly with all the hardware just to find out that the next update breaks everything. I've come to see it as a necessary part of the journey


This is my current state of thought. Proprietary software perceives me as an enemy who needs to be locked out of as many features as possible to allow for more money to be extracted out of me while also investing the least amount possible back into the product. The only timeframe where proprietary software is groundbreaking and at the forefront of technology is when they have not yet captured and locked in a large market share.


In my experience, doing a hackintosh actually teaches you that Apple hardware is not that special and macOS works only because they make it easy for themselves.

Then it becomes clear that if you don't really have an absolute need for macOS, it is not worth the trouble, since Windows/Linux actually make better use of the hardware with little trouble in comparison. By extension, you develop a feeling that desktop Macs are really overpriced and don't have much of an advantage in the Apple Silicon age, since efficiency doesn't get you much but the performance delta for a given price is insane.

In fact, buying a PC that is equivalent to a base Mac Studio will cost you 1k euros less, even if you go with "nice but not that necessary" things (especially for a personal computer, like 10G networking).

But yeah, you also learn that it's better not to waste time trying to conform to Apple's agenda, but that's also true for real Macs in my opinion.


This is a great point. I sort of detest becoming an expert at proprietary stuff, because I know they'll just change it before long. I've lamented about this elsewhere as modern software creating "permanent amateurs". Even those that want to invest in expertise often find their knowledge outdated in a handful of years, and those that don't want to invest can easily justify it by pointing out this effect.


Microsoft, at least before Cloud happened, supported their tech stacks with backward compatibility for decades.


Proprietary or not, tinkering helps you develop an intuition of what might be wrong.


Meanwhile, the article is clear about how proprietary code absolutely prevented the author from understanding why Wi-Fi and Bluetooth failed with specific apps.


Yeah I mean, whoever made the original statement is just not an OS engineer.


> just not an OS engineer.

Or not just an engineer


I know plenty of people with stamps who don't care to fiddle with their OS or change their own oil. People who work on putting things in orbit and beyond, people who build bridges, people who design undersea robots and airplanes. They're most definitely engineers.


Yeah fair.


Nah I can believe they'd be a chemical engineer or even a software developer that writes iOS apps or something like that.


I wanted to say steam/power engineer, but even they understand the value of tinkering.


This is the reason I still buy older cars. I can't stand owning a car only to find out that I can't work on it myself. Even if I don't have the time or tools needed for a specific job, if it's something I could do on my own it means the job should be that much easier and cheaper to have a mechanic do.


I fully empathize - and yet, there are benefits from tinkerers/hackers messing around on proprietary hardware/software. Hackintosh - and similar communities - led to projects like Asahi Linux, Nouveau, Panfrost, etc.


> I think there's a difference with Linux, because it's something you own and control and can dive into and see every part of. I hate investing time in proprietary technologies, because I know I can be stopped or locked out.

The problem with this approach is then you get a generation of engineers with tunnel vision thinking the One True Way to achieve your goal is the same way your GNU (or whatever) software did it.

Invest time in learning your technologies, whatever they are. There's valuable knowledge in proprietary stuff just as there is in OSS.


I agree with your point in principle, and yet I installed Ubuntu on my work laptop this January after using Windows professionally for my entire (5 year) career. I've found myself moving in the opposite direction from the person in the root comment, because I find that it's getting harder and harder to find tolerable proprietary software. It feels like everything is glacially slow, laden with ads and tracking, reliant on an internet connection for basic functionality, or some combination of the above.


"There is valuable knowledge worth learning in the technology" != "this is strictly better software on every axis and you should switch to it for your daily work"


As someone that learned to program on BSD and shortly thereafter, Mac OS X and Linux....

I honestly don't know how people use Windows machines as a dev environment 24/7. It would drive me mad. Everything's so wonky and weird. Everything from symlinks to file permissions is just backwards and fucky.


Back in the day it was alright because Microsoft gave you a fairly good dev environment in the form of Visual Studio, with the focus of it being squarely on desktop application development instead of tinkering with the system or running web services. It didn't stop people from doing it anyways but it's part of the reason why everything is so janky. Then the web took over and Microsoft tried for ages to make .net and Windows Server work until they realised they can't tune an OS that was never meant for backend development and just put all their focus on WSL. In the year 2024 there is almost no reason to be doing any non-desktop dev in a Windows environment unless it's on WSL. And you get the benefit of having an actually sane window management system and external display handling unlike MacOS, not to mention how nice PowerToys is.


I mean this in the nicest possible way: 5 years is likely not long enough for the “just work, stupid” desire to really, really, really set in. Nor is a couple of months enough time for the potential rough edges of desktop Linux to set in.


Given that I've been using Ubuntu on the desktop since I was 11, I'm not worried.

The reason I switched was because Windows didn't work. Win11's desktop makes early-2010s KDE look like a smooth, bug-free experience. My laptop (a 10th-gen ThinkPad X1) was plagued with driver problems. At least twice a month, I'd have to reboot when I joined a meeting and discovered my mic wouldn't un-mute. Switching to Ubuntu solved both of these problems, and I don't have to deal with an awkwardly bifurcated environment where a couple of my CLI tools run in WSL while everything else is native. Oh, and my Zephyr build times are a good 25% faster now.


After 17 years of using Linux I realized that I was tired of tinkering with shit, so I caved and bought a macbook air. Not even two years later I was back on Linux, because I realized that the amount of tinkering I do on Linux is actually very small; the experience I already paid my time for means that Linux is simply easy for me to use, while MacOS is a pain in the ass in innumerable small unexpected ways. The path of least resistance, for me, is to continue with Linux.


I work in IT, so I’m paid for my time to solve all kinds of issues with Windows. At home, such issues are unpaid work. Linux has the advantage of having issues be mostly of my own choosing. Stick to the golden path and you’ll hardly ever have them. And the easy configuration and recovery options allow you to jump into a new install with minimal hassle.

Everyone will have the same headaches with Windows as Microsoft’s choices are required these days. Millions of people have quite lucrative jobs solving them. I’d rather not bring work home so I run Linux.


I’ve been using Windows throughout my childhood and start of my CS career - now I use Windows for specific software (audio/music) and Linux for developing (about 8 years I guess). I had a 1-year stint with macOS because I was developing an iOS app, and have been the troubleshooter for people with macs at my previous job, so I consider myself somewhat ‘multilingual’ when it concerns OSs.

As a power user, Linux is just so much nicer. I constantly get frustrated, especially with macOS, about stuff that I can't easily change. In Linux my stuff works, and if it doesn't it can be made to work (usually). In Windows/Mac it'll often take considerable effort to make the system work the way I want, or it's just not possible.

I think with proprietary software ‘it just works’ is only a thing if you’re happy with the basic experience that is tuned to the average person. If you have more complex needs, you should be using Linux (and if you know your stuff or use the right distro, things will likely also ‘just work’).


FYI, Ubuntu is a heavily advertised distro. It's pretty bottom-of-the-barrel for quality.

If you want a modern Linux distro, try Fedora Cinnamon or something that isn't on the Debian branch.


It is not surprising that you posted this flame bait from a throwaway account.

What is wrong with Debian?


[flagged]


It is _stable_, not outdated. You are practically guaranteed that if you’re running Debian Stable, and live only within the official apt ecosystem, you will not have software-based instability.


Grandma doesn't care about this. They just want their screen to work.


Debian Sid makes a better desktop distro than Ubuntu. The drivers are up to date, the instability is greatly exaggerated and installing nonfree codecs is easy (so easy with virtually any distro that it shouldn't even enter into the equation...)

This said, I prefer OpenSUSE Tumbleweed, which is rolling release yet more stable than Sid. Rolling release + extensive testing + automatic snapshotting gets you the best of all worlds.


Bruh. I use Debian and Arch interchangeably, barely notice the difference.


Yeah, that’s why when I update my Arch MacBook Air once every year or two it works well, but Debian dies and needs a reinstall for some unknown reason. Before that, I believed Debian was very stable. My experience shows the opposite.


Frankly there is no value in learning user-hostile proprietary technologies in a way that the owner of said technologies actively wants to discourage and prevent.

Like learn the proprietary tech in the environments it's intended to be used in but if you can't use it in that environment I personally wouldn't waste my time with it. With FOSS tech at least you can make the argument that you can learn stuff by maintaining it properly but with a proprietary stack in an unsupported and actively user hostile environment the best you are going to do is learn how to maintain a fragile truce with the software gods.


Peeling away all the politics/idealism from your comment, the value proposition between these two options is basically the same, with the difference being that on a proprietary stack there’s a higher chance of things breaking in a way that you have low/no likelihood of fixing. It’s all good and well that this seems to make you personally want to throw up in your mouth a bit or whatever, but you are claiming objectivity that clearly isn’t there.


Yeah I'll learn as much as I absolutely have to in order to get my paycheck. Any more and you need to give me a raise.


That's not a good way to make money. It's not how FAANG pays people, and if it is how your employer pays people then you should always be learning so you can change to better jobs.

A funny thing about "never work for free" advice is that a lot of highly paid jobs (investment banking, high end escorts) are about doing tons of client work for free in a way that eventually gets them to pay you way too much when they do pay you.


I learn the interesting stuff, I just don't learn proprietary tech that I really don't ever want to be dependent on for my wages.

In fact most of the essential skills for my job I've learnt in my own time, and continue to learn. I invest my own money in equipment and training courses. I love learning. But only when it's interesting to me, not because it'll make more money to somebody else. If it'll make you more money, pay me.


> Frankly there is no value in learning user-hostile proprietary technologies in a way that the owner of said technologies actively wants to discourage and prevent.

Security research. And, uh, applied security research.


This is a gross misunderstanding of the GP's point though. It's not that they are against doing any of these things. In fact, they said they were more than happy to do it in their youth. I am in full agreement with the GP's sentiment as well.

Mucking about and tinkering with things while one has the time, desire, and stuff to learn is a young "man's" game. I did all of that and absolutely learned a helluva lot. It did everything I needed from it. I got a cheaper/better computer than what I could afford. I learned a hell of a lot about not just the hardware pieces I chose, but also why/how certain things about the OS work, things I never would have learned otherwise.

But now, I too just don't care. It was interesting, but I'm not that interested about maintaining an OS or how it works. I just want it to work. So for all of those that are willing to do all of that today, I'm all for it.

your comment came across to me as just another one of those "if you don't feel the same way i do, you're wrong". that's not true. people can just be in different places in their life. been there, done that does not mean you can't go there and do it too. we're just focused on different things now


There’s another perspective: even if OP is done, if we shut the door (or let it be shut by companies like Apple) then the currently-young won’t be able to tinker and won’t grow to gain the same knowledge.


They are free to continue that kind of work, it just gets harder. Look at Asahi Linux. While it might not be Hackintosh in the same sense, it is the same spirit. Hackintosh worked because the systems were built on commodity hardware. Now that Apple is using custom chips, they've definitely made it a bit more difficult, but in my experience that just brings out the really talented that step up to the plate to take a swing.


I agree that tinkering is a side effect of curiosity, and that curiosity leads to expertise, which has value.

I parlayed my curiosity in hardware into my first job. (My car-fixing skills alas didn't take me anywhere.) Hardware was fun for the first 10 years of my career, but now, well, it's just not interesting.

I played with Linux as well along the way, but I confess that too has dulled. Building your first machine is fun, building your 10th is less so.

The past couple years I've gone down the solar energy rabbit hole, and I'd love a wind turbine (but I just can't make the economic argument for having one.) If I do end up getting one, it'll be to prove to myself that it was a dumb idea all along.

In some ways we never stop tinkering. But the focus moves on to the next challenge.


> Building your first machine is fun, building your 10th is less so.

Building a Linux box led me back to Apple.

I had been using UNIX at home, school and work for several years, and decided it was time to build my 3rd Linux box. Went to CompUSA out of idle curiosity to see what equipment they had, and the only computer in the store with Internet access was a Mac.

I hadn't used a Mac since the SE/30 days, and I suddenly realized that the NeXT acquisition, which I'd mostly ignored, had changed everything. Why build a Linux box and be locked out of tools like Photoshop when I could have a UNIX workstation that ran commercial software (for, admittedly, significantly more money)?

Never looked back.


> Why build a Linux box and be locked out of tools like Photoshop

That's what VMs are for. You're never really locked out. It may not make sense to go that way if Photoshop is THE thing you work with of course.

> when I could have UNIX workstation that ran commercial software

Because for lots of software MacOS is a second class system. Partially because there's just no way to test things on it without investing lots of money in hardware, so many people don't.

If you're doing lots of sysadmin / software maintenance style work, MacOS just provides unnecessary pain.


> That's what VMs are for.

For me, the psychic angst of using Windows is much, much worse than any Mac-related inconvenience.


> If you're doing lots of sysadmin / software maintenance style work, MacOS just provides unnecessary pain.

Amazingly a significant amount of the software that you use on a daily basis, perhaps unwittingly, is developed and maintained with macOS and Windows!


I'm working on packaging things for the Darwin platform, and helping people deal with homebrew/compilation issues. I'm painfully aware of how much is developed by people with no access to or interest in macOS. And unless something targets Windows explicitly (not WSL), you can basically expect issues going in. In a twisted way, I'm one of the enablers of the current situation where things are usable on a Mac.

Sometimes you can tell by the simple fact that the git repo contains files in one directory that conflict in naming. Linux has no issues with "Foo" and "foo" coexisting.


Nothing to do with Linux, and everything to do with the case-sensitive filesystems common on UNIX.

macOS uses a case-insensitive filesystem by default for backwards compatibility with HFS+.

You can turn case sensitivity on for HFS+ and APFS if so desired, for the "Linux" experience, via Disk Utility or the equivalent CLI tool.

And if you're looking to have some fun on an ecosystem that doesn't expect it, you can equally turn it on for NTFS, via fsutil.
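A minimal shell sketch of the difference, assuming you run it on a case-sensitive filesystem (the Linux default):

```shell
# Work in a throwaway directory.
cd "$(mktemp -d)"

# On a case-sensitive filesystem, these two redirects create two files:
echo a > Foo
echo b > foo

ls -1       # lists both Foo and foo
cat Foo     # still contains "a"
```

On a default (case-insensitive) macOS volume, the second redirect would instead overwrite the first file, which is why checking out a git repo containing both names fails there.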


> > I'm painfully aware how much is developed by people with no access to or interest in MacOS. [...] Sometimes you can tell by the simple fact that the git repo contains files in one directory that conflict in naming. Linux has no issues with "Foo" and "foo" coexisting.

> Nothing to do with Linux, and everything to do with case sensite filesystems common on UNIX.

Of the three major desktop operating systems (Microsoft Windows, Apple macOS, and the Linux family), only Linux has case sensitive filesystems by default. Therefore, it's likely that someone who didn't care about filename case conflicts was running Linux.


Indeed, why get a lesser experience with Linux laptops, when I can use Apple and Microsoft platforms, and use Linux in a VM when I really need to.

The Year of Linux Desktop is delivered on a desktop VM.


If that works for you, great. My default works better with Linux with only occasional other system. Makes me least angry. (Also because Linux is the only system that handles sleep/hibernation for me without issues, ironically...)


I think the awkward part of your first post is that you appear to start with a value judgement that tinkering is for poor people whose time is worthless. That's not remotely fair to either poor people, or rich people who like to tinker. No one's time is worthless. Not your time. Not mine. It's all just time.


Fair enough, and no, I didn't mean to imply time is worthless. It's not the value of time that changes, but the amount of it you have.

In a work context a shortage of time (more customers than you can handle) means you need to discriminate, which means you can't make everyone happy. Which usually means differentiating based on value. (Aka, you get more expensive.)

For personal time you also become more discerning. Spend time with spouse, or build another computer, or lie under a car etc. Life has more demands, so there are more choices.

Incidentally, one of those choices is to work less.

The tinkering never goes away, but I prefer to tinker in profitable areas now. (I get to tinker for work.)


All my PCs and servers run Linux, and it's certainly not out of some idealism or anything. I'm fundamentally lazy, but I have a high standard for how things should be. As a result, I tend towards the highest quality and lowest cost (time, money, etc.), and that's Linux for me. Specifically, the setup I run on almost all my machines, which is the most optimal way I have found to write and run software, and play games.

If Windows were easier to use, more stable, less of a hassle, easier to fix, I would use it, but it's none of those (for me). When I have a Windows problem, I can either try magical incantations to fix it, reinstall, or give up, and each of those takes much longer than most things I could possibly do on my Linux systems. Even if my Linux box fails to boot, the drivers break, and my SSD doesn't mount, all those fixes together take less time and effort than finding a fix for the most trivial of Windows problems.

The most trivial problem on Windows has been that the right-click menu doesn't fully populate on first right click. I reported the issue, and that's all I can do. It's been a year and nothing has changed.

On linux, a less trivial problem (a calculator crashing with a series of very weird inputs) was solved by me opening it in gdb and fixing the code, making a PR and having it merged.

I guarantee a lot of people are on Linux because it's easier, and for no other reason. I don't need it to "just work", because I will break it. I need any possible fix to be possible in bounded time.


Windows has been disconnected from user needs for a very long time. Any logical person would've put a "right click" icon in the Control Panel that would give the user full control of what does and doesn't appear in the menu, their order, etc.


I also use Linux on all my machines but that's because (perhaps after years of tinkering) it is currently the most turn-key laptop/desktop OS. Things just work, they don't break without a good reason, and weird limitations don't randomly pop up.

Windows at work, despite being maintained by professional helpdesk staff, or Macs my family have, with all the ease of use designed by Apple in California, are not like that.

Just the other day I tried to download an mkv file over https on a Mac and I couldn't get it to exceed 2.5 MB/s. Same network, same server, my laptop breezed at over 20 MB/s and Apple took out that walker for a stroll at a very leisurely pace. It didn't come with `wget` either.
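(For what it's worth, macOS does preinstall curl; a rough wget equivalent for such a download, using a placeholder URL, would be:)

```shell
# curl is preinstalled on macOS even though wget is not.
# The URL below is a placeholder, not the commenter's actual server.
curl -L -O https://example.com/video.mkv        # like: wget <url> (follow redirects, keep remote name)
curl -L -C - -O https://example.com/video.mkv   # resume a partial download, like: wget -c <url>
```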


If you sincerely believe this, you've tinkered enough that the massive knowledge barrier that is Linux seems like nothing to you.

I would never sit my 70 year old mother down in front of a Linux machine. We're not at "caring that video files download too slowly" - we're at "how do I put a file on a USB".


Put USB stick into computer, click on "Files" in the program chooser, select the USB drive (helpfully listed as "USB drive" even), drag your files there?

Same as on Windows and MacOS really. I don't dispute that Linux has rough edges, but putting files on a USB stick is not one of them tbh.


It really works very well for my father-in-law and he's over 75 now. Debian gives me a peace of mind I would never have with him using Windows.


I have very little Linux sysadmin knowledge and have been using it on my home notebook for 5 years, and on my work one for two years now.

Really no issues with the OS.

I was using the very excellent 2015 MacBook Pro before, and despite my current hardware not being quite as nice (not bad though), I can’t go back to Mac OS. I know I pay a premium to get it pre-installed over Windows, but it’s not bad.


I do sit my 75-y.o. mother in front of a Linux machine, and it's fine.


> Something never quite sat right with me about this argument, and your comment finally made me understand what it is: the understanding you gain from tinkering is priceless, and it's exactly the experience that you use to help everyone around you: it turns you into an expert.

I have plenty of other things I’d rather tinker with and become an expert on, though. My computer is a tool to let me work with those things. It’s not fun when I have to debug and fix the tool for hours or days before I can even start working on the things I want to work on.


This is me. The range of things I want to tinker with has grown. Various house projects, jiu-jitsu, cooking, etc... are all things I tinker with and learn from. Building computers, I've done and don't feel the need to do again. I even built a Gentoo install long ago when I was learning the nuts and bolts of linux.


Exactly. Why do I want to be neck deep in some XML config hell when I could be playing music?


> Linux is only free if your time is worthless!

This argument is quite out of date. You'll lose a whole lot more time on forced Windows 10/11 updates than you'd spend managing a reasonable Linux installation. ("Reasonable" meaning avoid things like Arch or Ubuntu, and pick decent, natively supported hardware.)


That argument doesn’t sound very convincing to me. How would I know that avoiding Ubuntu is reasonable? That still seems to be the go-to distro for many people I know who like to use Linux but aren’t Linux experts. How do I know which hardware is natively supported?

With Windows 10/11 I’ve never had any problems, either with pre-built computers or my home-built PC. Hell, running Ubuntu in WSL has been relatively smooth as well.

My experience with Linux as an OS has been fairly good for many years, regardless of the distro. It’s the applications that could be an issue. Feels like it’s only very recently (post Steam deck in particular) that gaming seems to be viable at all. And it’s hard to beat the MS Office package for work. I recently got the idea to have two user accounts on my home computer where I have an account dedicated to working from home, logged into my office 365 account from work.. and it was honestly amazing how suddenly everything was just perfectly synced between my work and home computer.


If you have recently endured Windows Update for Patch Tuesday, you know that you are forced to reboot during this process. This activity will deny you "the five 9s," i.e., 99.999% availability in uptime.

If you have recently performed the analogous activity on a Linux distribution, which is likely either apt update/upgrade or yum update, you will notice that a reboot is not required. These update approaches cannot alter the running kernel, but ksplice and kernelcare offer either free or low-cost options to address that.

Windows update is enormously painful compared to Linux. There can be no argument about this fact.
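The Linux side of that comparison is short enough to sketch (Debian/Ubuntu assumed; the reboot-required marker file is a Debian/Ubuntu convention):

```shell
# Refresh package lists and apply all pending updates (Debian/Ubuntu).
sudo apt update && sudo apt full-upgrade -y

# Nothing forces a reboot; the system only suggests one when a kernel
# or core library was replaced, via a marker file:
if [ -f /var/run/reboot-required ]; then
    echo "Reboot suggested for:"
    cat /var/run/reboot-required.pkgs 2>/dev/null
else
    echo "No reboot needed."
fi
```

Until you choose to reboot, the old kernel simply keeps running; everything in userspace is already updated.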


> This activity will deny you "the five 9s," i.e., 99.999% availability in uptime.

Which is something 99% of personal computers don’t care about even slightly. These days restarting your machine is a very inconsequential event, your browser can effortlessly reopen all the tabs you had active, macOS will even reopen all the windows for your native apps.

I don’t mean to defend Windows Update, I just think “you have to restart your computer!” is not a particularly good reason to damn it.


Windows update is agony compared to apt/yum.

A complete patch Tuesday session is twenty minutes of reduced performance, followed by a "don't reboot your computer" of unknown time both before and after the reboot.

Anything is better than that, especially when some updates either reboot immediately or kindly give you five minutes to close everything down (was tmux made precisely for Windows update?).

Exposure to apt/yum really makes Windows intolerable, just for this alone.


> especially when some updates either reboot immediately or kindly give you five minutes to close everything down

I have been a Windows user since XP. Never, not even once, did Windows decide to reboot without asking first. Never.

The only way this could've happened is if Windows kept asking you over the span of a week or two to restart to apply the updates and you kept postponing it.

Either way, "Hot Patching" will soon be a thing on Windows, so a restart won't be required every month [1].

[1] https://www.windowscentral.com/software-apps/windows-11/micr...


I'm on a corporate desk/laptop, and I'm guessing that happens about three times per year.

That puts you in a tmux habit.


> Which is something 99% of personal computers don’t care about even slightly

to the point that I know people that still turn their computers off when they are not using them.


Let's get this out of the way first: "the five 9's" is not a requirement for personal computers. That argument therefore is invalid.

But even then, Microsoft is testing "Hot Patching" windows installation so critical updates install without requiring a reboot [1].

When that comes out, I wonder where the goalposts will be?

[1] https://www.windowscentral.com/software-apps/windows-11/micr...


What are you doing on a desktop computer that can only be off for five minutes a year?

A laptop is even dumber to complain about, because they're (supposed to be) suspended every time you close them.


That would be a firing offense at my company. Company files stay on company hardware. Personal files stay on personal hardware, and never should the two meet.


I would never work at your company. I use my own tools, thank you.


My personal vim config on a company laptop? No problem whatsoever, neither for me, nor the company.

A bittorrent client without preauthorization with IT and security? It's basically asking to get fired.

My vacation photos on a company laptop? Tricky - not a huge deal but not recommended. Better upload them to your cloud backup quickly.


Yup, you get it, exactly! It's not a surveillance state, but don't be stupid, and certainly don't LEAN into it.


Your own tools are your own personal files?

Interesting. How do your vacation pictures help you do your job?


Not the OP, but personal files are not just vacation pictures. I work in R&D and I have my org-mode/roam notes on various scientific and technical topics going back 15 years or so. I use these for work to benefit my current company, and maintaining two parallel versions of them is rather inconvenient.


Isn’t that exactly what a cloud drive is for? There’s a difference between using your personal notes for business purposes on the one hand, and keeping company property and data on a machine totally outside IT control. That’s just a massive lawsuit waiting to happen, and it’s bad for the employee too - why would you want the liability?


I wouldn't store company data or code outside of approved services, but one might say that my notes, including notes on the people I meet and the projects I work on, constitute proprietary information - so yeah, it is a bit of a grey area still.


I am speaking of company property in only the narrowest sense, i.e. physical objects and IP (and I guess real property, but I've been WFH for a decade, so...).


I don't want or accept IT control of my personal machine.


> How do your vacation pictures help you do your job?

This question is why I don't want the company laptop.


Are you required to maintain PCI compliance? Do you touch customer personal info?


That may be sensible if you want or need stronger security and isolation.

However, many companies do support BYOD, especially on mobile where it's a pain to carry two phones around.

There is some support for this. For example, Apple supports dual Apple IDs and separate encrypted volumes for personal and corporate data. Microsoft apps (Outlook) also have some support for separating personal and corporate data.

The benefits of BYOD can include lower equipment costs, lower friction, and potentially higher employee happiness and productivity.


Mobile is a totally different story, to me. The security model allows them to be compartmentalized in the way a desktop never could be.


> How do I know which hardware is natively supported?

You buy preinstalled. Works for me.


Yeah, preinstalled. And I never had issues with Ubuntu breaking in the ways Arch or Gentoo do. Breaking includes trying to install something new or upgrade, and then having to google whatever random other stuff broke along with it.


That is patently wrong. I run Fedora on my Framework because it is the most supported and recommended distro for it and I mostly just need a web browser for most of the things I do on it. I've had kernel upgrades break wifi completely, the fingerprint reader doesn't work properly out of the box, 6GHz Wifi isn't supported (though neither is it supported in Windows 10), VLC (which I hate using) is the only media player that supports playing from SMB shares on Linux, Wayland isn't compatible with Synergy type software (and my web browser doesn't work well with xorg), etc.

Most of these things worked without any fuss in Windows and I can't think of any notable Windows issues I had to deal with on the laptop before I installed Fedora.


This is a great linux post because while taking the time to type out distros to avoid is worth it, saying what distros to try is not.


This is 100% false.

I ran Ubuntu and then Arch as my daily driver from 2004 to 2017. When I started as a consultant working for Western companies, I thought they would care about me being clean copyright-wise, so I went 100% Linux. Obviously they didn't, but what did I know? I deeply regret doing this now. (I was dual booting before.)

With Ubuntu, upgrades every six months or so meant you were better off reinstalling and reconfiguring -- no matter which way you went, it was 2-3 days of work lost to tinkering with the system. With Arch, the whole system doesn't shatter; it's just that this and that don't work, and it's frustrating. Bluetooth and multifunction scanner-printers were at the forefront. In fact, I needed to sell a perfectly working Samsung MFC at one point because Samsung ceased to make drivers, the old ones didn't work with newer Linux, and while open source drivers eventually surfaced, that only happened years later. Let's not even talk multimedia. https://xkcd.com/619/ is ancient but the priorities are still the same.

Neither system was great at connecting to weird enterprise networks, be it enterprise wifi or strange VPNs. At one point I was running an older Firefox as root (!) to be able to connect to the F5 VPN of my client, because the only thing supporting 2FA for that VPN was a classic extension -- and the binary helper disappeared in the mists of time. The only Linux-related discussion was... the IT head of my client asking how to connect Linux to his VPN now that he had turned 2FA on, and being told it doesn't work. https://community.f5.com/discussions/technicalforum/linux-ed... Well, I made it work, but faugh.

I have been running Windows 10 + WSL since January 2018 and all is well. It reboots sometimes while I am asleep and that's about it. You need to run O&O ShutUp like once in a blue moon. Right now I am on Win 11 as my primary laptop is being repaired; you need to run ExplorerPatcher, but that's it. It's indeed been six years, and there was never an update where the OS just didn't start up, or where a hardware driver decided to call it quits after an upgrade.

Also, updates are not forced; I control my machine, thanks much, via Group Policy.



I have been a Linux user since 2006, Ubuntu then Arch.

Bluetooth mouse, keyboard, headphones, and controller all work. Intel iGPU works, including hardware-accelerated video in browsers. VPN: Pritunl worked without issues; Perimeter 81 initially failed, but works after an update.

Wayland, PipeWire, Wine, Proton -- the Steam Deck is a wildly successful multimedia device. The priorities are the same; NVK has joined the open source drivers.

Linux does not connect to "enterprise wifi or strange VPN" - ok.


> avoid things like Arch or Ubuntu

Which one would you recommend?


Well I'm just a rando, and you didn't ask me, but I agree with the sentiment, so: Fedora. Or openSUSE. I'd be more comfortable giving a newbie Fedora.

I was a Debian devotee for nearly 25 years, but I've found it to be less foolproof and fault-free lately, and it has always lagged behind current package versions in Stable, forcing you to run Testing (or -backports) or even Unstable to get newer versions-- with corresponding potential for breakage.


Debian Stable was very out of date 25 years ago, but ever since the mid '00s (after Ubuntu got popular) it has improved by miles. Debian Stable is akin to Ubuntu LTS; Ubuntu non-LTS is a 6-month snapshot of Debian Testing and does not get supported for long. If you run Debian Unstable, you're effectively running a rolling distribution.

What is best depends entirely on your goal and the task at hand. Personally, I very much like the Debian ecosystem and would prefer any Debian(-based) OS. These days, though, Docker can trivialize a lot (and also mitigates the issue you mentioned), ZFS and other filesystems allow rollbacks in case of trouble (useful on a rolling distribution, but also on Debian Unstable), and hypervisors give you snapshotting, multiple OSes, and all that, too.

For a server I'd recommend Proxmox (especially since ESXi is now only for enterprise). From there, have fun with different OSes.

Proxmox on a desktop is a bit meh, but possible. There's a lot of useful Linux desktop OSes out there. For example if you want to perform pentesting you can use Kali Linux. The one which interests me most from a security standpoint however, is Qubes OS (Fedora-based, sadly, but you can run anything on top of it). For gaming, SteamOS is neat (Arch-based, these days) and could even be fun to have a kid play around with Linux, too.

As for macOS, I played around with Hackintosh a couple of times in the past with success. But I never liked it much because you'd lag behind on security patches, and every new update meant praying it'd work. I did get it to work under Proxmox though; that was fun, but I had to install a second (dedicated) GPU for it. The latest M-series ARM-based Macs work very well; the only disadvantage is the fat price premium for RAM and SSD upgrades (often even soldered!). That part is terribly sad.


This is absolutely false. I run dual-boot Windows and Linux on hardware that has 100% Linux support. Windows just works, the same cannot be said for Linux unless all you do is use a browser and listen to Spotify.


There are pain points on both. Audio on Linux is still annoying if your system isn't very vanilla, while Windows sucks at bluetooth, configurability, and has a lot of annoying anti-user "features".


Windows does not "just work". On my work computer my programs randomly rearrange themselves after lunch, Windows always has trouble switching between my audio devices, and there are random slowdowns. Windows is pretty shit these days tbh. It's pretty much like Linux was 10 years ago.

However, I rarely have issues on Linux anymore, mostly because if something is broken on Linux, I can fix it.

Frankly, I hate that I'm forced to use Windows at work. I feel like I constantly have to deal with BS Windows annoyances. When I go home and work on Linux it's like breathing a sigh of relief. My desktop actually feels fast and efficient.


> On my work computer my programs randomly rearrange themselves after lunch, windows always has trouble switching between my audio devices, random slowdowns

> I rarely have issues on Linux anymore, mostly because if something is broken on Linux, I can fix it.

Perhaps your Windows knowledge is not up to the level of your Linux knowledge? It might be that a Windows expert could fix every issue you’ve listed and more.


I'm a long time macOs user at home (pre-X).

I've worked daily in a Windows enterprise environment for 15+ years (which means that when it doesn't work I usually "just have" to get help from a colleague).

I've been in charge of a Debian/PostgreSQL cluster for 10+ years, which I managed to keep upgraded on a reasonable schedule.

And yet, since Windows updates on my home gaming PC stopped working two months ago for some utterly opaque reason, I feel totally clueless about how to even begin to debug this crap.

There seems to be absolutely no clear working procedure out there to fix that, only people with the same problem shouting out to the void. All them poor souls trying byzantine procedures that have been duplicated ad nauseam from stack overflow to windows help forums through reddit and back.

The consensus seems to be to reinstall Windows from scratch (choosing among a handful of ways whose risk/benefit trade-offs look unclear).

That really pisses me off, but I guess it's the user's fault because "my Windows knowledge is not up to the level..."


That’s very possible, but I don’t want to invest time gaining knowledge in a proprietary platform. Microsoft already owns most of the default stack programmers use these days. I don’t want to contribute my energy to entrenching them further.


> It might be that a Windows expert could fix every issue you’ve listed and more.

So in other words, it doesn't "just work."


Wasn’t that Apple’s tagline?


The original comment claimed that Windows "just works" while Linux doesn't. Which can't be more false in 2024.


Let me take a guess:

You have exclusively used Debian-family distros.

Try a desktop distro like Fedora. The Debian family is a server-oriented lineage that got famous after Canonical/Ubuntu did marketing really hard.

Ubuntu is the Apple of Linux: they are famous for marketing, not quality.


I have used all distributions. They all have their own pain points. Debian-based distributions are actually the most painless in my experience.


> unless all you do is use a browser and listen to Spotify

So what exactly isn't working?


These have been pain points for me. Not saying they're impossible to solve on Linux, but it's nontrivial especially compared to Windows

Change trackpad scrolling speed

Set up suspend-then-hibernate

GPU drivers (I have a box with an AMD APU and no idea how to actually utilize it)

Many games (Proton is amazing and a huge leap forward, but on average it's still more work than gaming on Windows, e.g. fiddling with different versions of Proton or finding out that a game's anti-cheat will ban you for using Linux)

Higher res video streaming (I think this is usually a DRM issue?)

Full disclosure: I'm posting this list because I'm hoping that someone will tell me I'm wrong and that Gnome actually has an easy way to set the trackpad scroll speed


> Change trackpad scrolling speed

If you're on X11, I think you'll have to use xinput to set it manually.

If you're on Wayland, in KDE at least this is available in the standard settings application.
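To make the X11 route concrete, here's a hedged sketch. The device name below is a placeholder, and whether the scroll-distance property exists depends on your xf86-input-libinput version, so list the properties first and adapt:

```shell
# List input devices to find your touchpad's exact name or id
xinput list
# Show which properties the driver exposes for it (name is a placeholder)
xinput list-props "Hypothetical Touchpad"
# Recent libinput drivers expose a scroll-distance property; a larger
# value means more finger travel per scroll event, i.e. slower scrolling
xinput set-prop "Hypothetical Touchpad" "libinput Scrolling Pixel Distance" 30
```

Note that changes made this way don't persist across sessions; you'd need an xorg.conf.d snippet or a startup script to make them stick.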

> Set up suspend-then-hibernate

On KDE at least that's just one of the options in the power settings ("When sleeping, enter:" has "Standby", "Hybrid Sleep" and "Standby, then hibernate").

> GPU drivers (I have a box with an AMD APU and no idea how to actually utilize it)

Worked OOTB for me, do you have amdgpu drivers installed? What exactly isn't working?

> Many games (Proton is amazing and a huge leap forward, but om average it's still more work than gaming on Windows. eg fiddling with different versions of Proton or finding out that a game's anti cheat will ban you for using Linux)

I find that Proton mostly just works for me, but indeed EAC is a problem that I don't know how to solve (and also don't really care about since I'm not into playing public multiplayer games).

> Higher res video streaming (I think this is usually a DRM issue?)

You should check if HW Acceleration is enabled in your browser, but IIUC Netflix will indeed refuse to provide higher quality streams to Linux (and also Windows depending on your browser), you might be able to resolve it by googling a bit, maybe using a browser with DRM support and switching out your user-agent?

> I'm hoping that someone will tell me I'm wrong and that Gnome actually has an easy way to set the trackpad scroll speed

Gnome is notorious for removing user choices, so I wouldn't be surprised if this was impossible on Gnome/Wayland. Xinput might work on Gnome/X11. Switching to KDE should work on Wayland ;)


Alas, I'm using Gnome. There's a setting for changing scroll speed with a USB mouse but not for a laptop's track pad. I don't see anything for standby-then-hibernate either.

>Worked OOTB for me, do you have amdgpu drivers installed? What exactly isn't working?

Based on their compatibility list[1], it doesn't look like amdgpu supports my hardware (Richland chipset). Most distros I've tried don't even boot unless I add "amdgpu.dpm=0" in GRUB.

[1]: https://wiki.gentoo.org/wiki/AMDGPU#Hardware_detection


> Alas, I'm using Gnome.

That is unfortunate, but at least that's not a difficult problem to fix ;)

On Wayland at least input stuff is IIUC solely on the compositor, so if they don't want you to control scroll speed, you won't.

For power, maybe the instructions on the Arch Wiki can help? https://wiki.archlinux.org/title/Power_management/Suspend_an...

> it doesn't look like amdgpu supports my hardware (Richland chipset)

I see, looks like your card is too old for the official open source amdgpu support, meaning you should either install the unofficial open source ati drivers as per https://wiki.archlinux.org/title/ATI or try the official proprietary drivers from AMD (which I assume will be too outdated to function on a modern kernel?).


Thanks! Turns out, I don't really need those things.


Not OP, but the fact that I have an easily accessible text file on my desktop with the exact commands to run in my terminal to recompile the graphics driver when upgrading packages breaks graphics again should speak volumes. I don't really mind, because running 3 commands in the terminal a few times per year is not particularly difficult for me. I could see it being difficult for non-devs though.

What does get annoying is when such an OS upgrade breaks the wifi drivers and I have to setup a bluetooth hotspot on my phone to access the github repo and fetch the latest driver version for the wifi dongle.


> You'll lose a whole lot more time on forced Windows 10/11 updates

Utter fantasy.

They complete whilst I sleep, taking zero of my time at all.


At this point I feel like Linux may be more likely to just work than a windows machine. I just had the unfortunate experience of setting up windows 11, and the number of ‘please wait while we get things ready for you’ was truly astonishing.


It's not. You can go and pick up any computer that is currently on the market, doesn't matter if it's 300 or 3000 dollars as long as it is a (n IBM) PC and it will run Windows.

Will it always be flawless? No. Will it always work perfectly out of the box? No. But it will work and generally you have a good chance of it working as you wish assuming you are fine with Windows and what MS does with it.

I bought an Asus Zephyrus G15 (2022) specifically because it was recommended to me because it is supposed to be great for Linux and it's probably the worst Linux experience I have ever had. As the first piece of hardware that I specifically picked for Linux support.

Because most DEs don't do fractional scaling but all high end laptops have too much DPI to not have fractional scaling.

Nvidia is still not providing proper Linux drivers.

Asus can't program to save their lives but the tools that replace the Asus stuff on Windows are still better than the stuff that is replacing the Asus stuff on Linux (asusctl/ supergfxctl vs G-Helper).

I once had a machine where the nvme drive was simply not working. That was when Kernel 5 came out. It broke on Fedora but worked in Mint until Mint got Kernel 5.

During my last Linux adventure, KDE just died when using WhatsApp as a PWA (where I live, WhatsApp is essential software to have a social life).

And even after years of Wayland being around, it's still impossible to have apps that aren't blurry in most DEs because X11 is still around.

You're complaining about software updates and user friendly loading screens. The issues that drive people away from Linux and to Windows are literally unfixable to 99% of the techies that try Linux. I'm not fixing an nvme driver in the Kernel. That's not my area of expertise. But I still need my machine to work and on Windows, it does.

Rufus lets you create an install USB that skips most of the Windows 11 nonsense, btw.


Good for you.

I’ve had literally zero of any of those issues in my past 4 years of using Ubuntu.

I had a hell of a lot in the past, so I trust I can judge when it’s reached “better than windows” level.


I think that everyone knows that's a pretty ridiculous statement. Installing Windows 11 is basically putting in a USB stick, waiting about 8 minutes, clicking a few things and typing out your login and password. I love Linux, first started playing with it about 20 years ago now. There's not a single dist I've ever seen that is that simple. Just a basic fact, sorry.


Now, that is a ridiculous statement. Installing Windows has never once been the smooth experience you describe. It's been long wait times, dozens of reboots, and never-ending cycles of Windows Updates. Always has been for the last 20+ years.

Today, it's made even worse by the fact that MS is intentionally driving the Windows UX into the ground in exchange for short-term profits. Installing Windows isn't "clicking a few things." It's going out of your way to disable piles upon piles of anti-features MS throws at you, whether it be spyware, bloatware, or hyper-aggressive nags to get you to act against your will. The lengths die-hard Windows users go to to "de-bloatify" their Windows installations these days are absurd.

It's true that Windows had a superior end user UX over Linux 2 decades ago. But that has changed with improvements on the Linux side and poor, poor decisions on behalf of MS.


You're greatly over-exaggerating how much effort it takes for a power user to set up Windows. I had to do it the other day on a Dell MiniPC (sadly couldn't use Linux since I needed HDR) and it's just the following.

1. Set up a USB stick in Rufus with all the setup skips enabled
2. Select install options, skip key, next next next
3. Wait for it to install
4. Say no to an MS account; put in username, password, and security questions
5. Wait for a reboot and setup
6. Connect to the internet, run Windows Update, reboot when done
7. Uninstall the few bloatware apps in the start menu (most of them are UWP so the uninstall button does it immediately; takes no more than 5 minutes)
8. Disable web search from Group Policy
9. Install Windows Terminal, PowerToys, and another web browser

I could easily automate steps 7, 8, and 9 through powershell and winget if I wanted to. The total install time was less than 10 minutes plus the time it took for Windows Updates to install and I have a pretty clean environment.

In comparison, with Fedora running on Gnome I'd have to spend a solid amount of time messing with dconf settings to get fractional scaling to show up and for my touchpad to scroll at the correct speed + installing extensions to get a UX as good as Powertoys has by default, and on KDE I would need to spend the same amount of time messing with settings and installing KWin scripts to get functional tiling (although that might have got better since I last tried it).

Oh and on MacOS I would be up and running in almost no time, because there's no way to fix the absolute dumpsterfire of a UX it has so I don't even bother.

So all options kinda suck, Windows just sucks in its own ways.


From your comment it sounds like you affirmed GP's claims...


> There's not a single dist I've ever seen that is that simple. Just a basic fact, sorry.

You having that experience does not make it a basic fact.

I didn’t even have to do the actual installation, as it was a prebuilt machine. The only thing I had to do was the ‘clicking a few things and typing out username and password’ part.

Comparing the two between Ubuntu and Windows, I’m forced to conclude that Ubuntu has the easier version, or at least faster. And windows has the advantage/disadvantage of needing my MS account to set up an operating system.


> There's not a single dist I've ever seen that is that simple.

It is that simple with Ubuntu and similar distros. It has been that simple for many years.


> Just a basic fact, sorry.

Ubuntu, Linux Mint and Elementary OS and I guess a few others will beg to differ. And it takes way less than 8 minutes.


As I found out this week, making the Windows 11 USB stick is far harder than it ought to be if you don’t have Windows already.


If you use UEFI, all you need is to copy the files from the ISO over to the USB stick.

Am I missing something?

(And the same applies to UEFI capable Linux-distros)


Windows installer images have some files too large for most tools to understand and also I believe the USB stick needs to be exFAT formatted too. Virtually any tool for making a USB stick would fail in various ways on macOS.


> Windows installer images have some files too large … I believe the USB stick needs to be exFAT formatted

That’s true. I forgot!

While it’s not part of the UEFI spec many (most?) consumer BIOSes will be able to UEFI boot from NTFS as well, so formatting as that might also be an option.

Both that and exfat should be easy to do on Linux. No idea about MacOS.

Which brings me to

> too large for most tools to understand

This I don’t understand. What tools? What tools do you need beside “cp”?


I don’t know if you can just copy the files over. It seems you also need to make the USB stick bootable?

There’s dozens of guides on doing this on a Mac, they all seem outdated. I found a tool called WinDiskWriter on GitHub and it was the only GUI tool that worked.

Suspect there’s more to it than just cp. There’s wimlib too for handling the larger files on the install ISO.
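For what it's worth, the manual route on Linux can be sketched roughly like this. Everything here is a placeholder to adapt (device node, mount points, ISO name); the split step exists because install.wim on recent ISOs exceeds FAT32's 4 GiB file-size limit, while UEFI firmware reliably boots only FAT:

```shell
# Placeholders throughout: adjust /dev/sdX1 and the ISO filename.
mkfs.vfat -F 32 /dev/sdX1                  # FAT32 for UEFI boot
mkdir -p /mnt/usb /mnt/iso
mount /dev/sdX1 /mnt/usb
mount -o loop Win11.iso /mnt/iso
# Copy everything except the oversized image file...
rsync -a --exclude=sources/install.wim /mnt/iso/ /mnt/usb/
# ...then split it into sub-4GiB .swm parts, which Windows Setup
# picks up automatically in place of install.wim
wimlib-imagex split /mnt/iso/sources/install.wim \
    /mnt/usb/sources/install.swm 3800
umount /mnt/usb /mnt/iso
```

The same idea should work on macOS with wimlib from Homebrew; this is roughly what the GUI tools automate.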


> It seems you also need to make the USB stick bootable?

That’s for legacy MBR boot. It has no function with UEFI boot.

Same for Windows as it is for Linux.


Interesting. This is the first traditional UEFI machine I’ve had. Anything prior to this has been Seabios/Coreboot in recent times.


TLDR: UEFI just checks NVRAM for pre-existing boot configs (stored paths to EFI executables for installed operating systems), and when doing "dynamic" booting from some random media, it checks whether the volume has an EFI executable for the given architecture (for instance at \EFI\BOOT\BOOTX64.EFI for Intel x64); if it does, it loads that file.

UEFI usually boots straight into native long mode without any weird 8086 compatibility modes being employed (which the OS would otherwise have to unroll), so for the OS it's simpler to deal with.

It can also serve as a multi-boot menu on machines which has several OSes installed.

It often comes with an MS-DOS-like "UEFI Shell" you can boot into... to manage UEFI itself. So if something doesn't boot, you can just boot into the shell instead and try to fix things from there.

It may sound complex, but once you get into it, it's really much easier to work with than legacy MBR boot and all the "magic" things you have to do there to get things booting.

I definitely recommend reading up on it.


> hackintosh is likely in the rear view mirror for the next generation of tinkerers.

Part of this might be that making Hackintoshes is so much harder now, but part of it might also be that OOTB desktop Linux is luxuriously good these days compared to where it used to be. Ubuntu and Pop!_OS are absolutely on par with macOS for a user who meets the (admittedly higher) entry requirements of Linux.


Your comment makes me think of my 3d printing journey. A lot of printers require maintenance and tinkering just to keep them functional. To an extent, since they are targeted towards “makers” who like to play with these things, that’s fine.

But sometimes the thing you're trying to build is of central importance, and you want the machine to stay out of your way. Tinkering with the machine takes away time you could be exploring your ideas with a machine that's already fully functional.


Sometimes the holiday is the destination. Sometimes the fun is in the getting there, not being there.

Tinkering can be fun. But these days I mostly want results, achievements, etc. I want to tinker toward a successful goal, not just tinker for tinkering's sake.


This argument makes a lot of sense. I get more upset than I probably should about car issues, likely because I never spent the time to tinker with them, so I feel rather helpless… and I don’t like feeling helpless.

In my youth I did a lot of tinkering with computers and it has paid dividends. It gave me a career.

These days though, I want to be able to tinker on my own schedule. I want my primary computer, phone, and car to “just work”. That means any low level tinkering needs a second thing. That can work fine for computers, because they’re small and relatively cheap. The idea of having a project car isn’t something I ever see myself doing, as it’s big and expensive.

I can still tinker on some things with my primary computer without it being a problem. Tinkering on writing software, running servers, or whatever, isn’t going to kill my ability to do other things on the computer. A lot of tinkering can be done without tinkering with the OS itself.


To be honest, none of that stuff has been true for 15+ years anyway.

Linux just works now. You put in the Ubuntu/Debian/Arch/whatever USB, you install it, it just works.

I can't remember the last time anything broke on any of my desktop machines and it wasn't my fault for intentionally doing breaky things.


> the understanding you gain from tinkering is priceless

You pay with time. It's priceless if you are a romantic or lack foresight (because what you did with your time in total will be way more important than what is left of it). Otherwise it will always be the most expensive thing you have (and we must still be able to spend it without care, because what would life be otherwise).

> But when it doesn't, I often wish I was that guy that had tinkered with my car

Don't. Instead build a network of experts you trust and make more money doing what you do best to pay them with. Trying to solve the world on your own is increasingly going to fail you. It's too complicated.


Disclaimer: This became more of a rant than I intended. I've become pretty unhappy with the general quality of the "professionals" I've interacted with lately.

I just can't agree with this take. It sounds that simple, but it's not.

I happen to enjoy learning and fixing.

It would take me a long time to build that trust. Nobody cares about my things and my family's safety like I do.

Most people are a long way from making as much money as an expert would charge them.

In the last couple of years, I have had some terrible times when I call for help.

When the dealership is charging $200/hr to have a kid plug in the car and follow a flowchart, I'll just take a look myself.

Plus one time they left my fuel pump loose and I had to pay (in time and money) for an extra round trip with Uber, and the fuel it sprayed onto the road. They didn't fix the original problem, which cost me another round trip.

Another time, I had technicians (experts) out to look at my leaking hot water tank 4 times before they decided it was time to replace it. I wasted the time calling, babysitting, coordinating, figuring out how to shower without hot water, etc.

If this is the average "expert" count me out. I'll do it myself. Plus, throwing money at a problem isn't near as fun.


> When the dealership is charging $200/hr to have a kid plug in the car and follow a flowchart, I'll just take a look myself.

Regret about not investing the time to become an intuitive handyman is a very different category from "let's see if there's a video on yt to help me fix that in 5 minutes". My message is definitely not "don't get your hands dirty" but "be practical". Doing the yt/google/chatgpt thing to get an idea is mostly practical.

> If this is the average "expert" count me out.

You disclaimed, no problem -- but I did write "build a network of experts you trust". Just calling someone and being annoyed that they are not good (and I agree, most of them are not) is not that. It's going to take time and money, but decidedly less so, because you get into the habit of doing it, you learn, you see red flags, network effects are real (people know people), and relationships on average last long enough. That is my experience, at least, but I have no reason to believe I would be special here.

> Plus, throwing money at a problem isn't near as fun.

That's true, in my case, only for very few problems. Most problems I would rather not solve myself.

I'll admit: All of this is a concession to reality, at least my perception of it. Learning is fun. I would really love to be good at a great many things. It's just increasingly unreasonable to invest the time necessary, because things get more complicated and change more quickly.

Staying good at a few things, learning whatever is most important next, and getting better at throwing money at the rest, will have to do.


I'm enjoying this thread. I want to add that building a network of experts has other costs too.

Sticking to a network will limit the variety of people you get to meet, everything else the same. Local maxima.

It also isn't practical in some circumstances; if I travel for work or move cities every few years, the local network for mechanics gets lost. The cost of keeping the network would be staying in one place.

So, these are all options.


>> The cost of keeping the network would be staying in one place.

One man's bug is another man's feature :). You describe staying as a bug; I've lived in the same house for 24 years, and, for me, it's definitely a feature. I'd positively hate moving to another suburb, never mind another city.

And yes, I've developed relationships with local service providers. My plumber, my electrician, my mechanic, all know me by name. I've found the people I can trust and they eliminate those hassles from my life.

But, and this is my point, I'm not you. My context, my goals, my desires, are all different to yours, and that's fine. We're all in different places, being different people, and that's OK. It doesn't have to be "us versus them". We might enjoy different things, and have different perspectives, but that's OK.


There's levels to tinkering though. When I was running Ubuntu, a lot of the tinkering came down to searching for what config files to update. Sure, that freedom is nice if you care to use it, but it's mostly just searching, configuring, experimenting. This is hardly fun or instructive.

A deeper form of tinkering is actually working on the code. I think an instructive example is writing your own X window manager with xmonad. You get to see exactly how a whole window manager works.


These days, if you have the skills and tools to swap a transmission you have to tow it into a dealership and beg them to flash the transmission so it will work in your truck. If you want to avoid that you better know where to find the strategy code and match it up before purchasing another transmission. Same goes for touch screens and a whole slew of essential parts. While we weren't looking the rug was completely pulled out from underneath us. Now your family mechanic is beholden to the dealership.


> People have been making this argument to me about Linux for more than 25 years. The most cutting version that I ran across was:

> > Linux is only free if your time is worthless!

But it is exactly why I quit Linux and returned to macOS. I used to run Linux on cheap 2nd-hand ThinkPads, and for 3 years on a MacBook as my main system. But after yet another upgrade destroyed all network connectivity, I quit.

macOS isn't perfect but it works in the most important areas, and I can tinker with small stuff when I feel like it.


Ubuntu is so easy to use. I enjoyed using Arch before, but got to a point where I also just wanted my PC to work without any tinkering. Ubuntu is very good at that.


Your argument is excellent and made me evolve my point of view about Mac. I use Mac for efficiency, and yet, I was wrong about what kind of efficiency I’ve been developing. Tinkering is so important, even if just for the fun of it.


Do you not have any hobbies to "waste time" with? I would assume that most Hackintosh enthusiasts do this as a hobby, not for a living or even to save money on hardware.


In my day we used to tinker with proprietary 8- and 16-bit home computer systems. The Way of Linux (TM) is not the only path to enlightenment.


Linux is for work. I wouldn’t consider running anything else (Windows, MacOS, FreeBSD, etc.) for my services.


Consider FreeBSD, because it’s great.


Just for my router


You need to understand the bias of many HN commenters. They are running businesses, aspire to run businesses or employed by businesses that are monetizing the work of tinkerers and packaging it for a mass market where they can sell higher volumes or mine more personal data. There are a lot of people who will recommend spending massive amounts of time and money learning and renting proprietary services over learning fundamental concepts and owning your own stack. I just ignore them along with the crypto bros before them and the AI pumpers now. Renting proprietary closed services to people who don't know better is their bread and butter.


I really like the way you put that.


That was a beautiful analogy.


a frustrating freely accessible experience being priceless is not mutually exclusive with your time being worthless

but I’m sure your point will inspire someone


For you, me, other people on HN who generally make a living by understanding computers, definitely.

For a layman who just needs to connect to WiFi, edit some documents and print them without having to update a kernel? No.


> For a layman who just needs to connect to WiFi, edit some documents and print them without having to update a kernel? No.

when was the last time that was actually needed, in a way more troublesome than Windows system updates?


Even as a dev with 3 environments, I've not had to tinker with my kernel since I left Gentoo something like 15 years ago; Ubuntu takes care of it.


>Something never quite sat right with me about this argument, and your comment finally made me understand what it is: the understanding you gain from tinkering is priceless, and it's exactly the experience that you use to help everyone around you: it turns you into an expert.

Yeah, but an expert in what? There are only so many hours in a day. Like if you care about learning about some rando sound driver, or why all your photos come out pink under Linux but not Windows[], that’s great. Go knock yourself out.

But if you want to do something that’s not rando debugging, then maybe it’s not for you. Like, I like Unix. It lets me do my work with the least amount of effort. What I don’t like is being a sysadmin. Some folks do, and that’s awesome. But that’s the reason why I got rid of desktop Linux 20 years ago.

[] Both of these are actual lived experiences. I do not care about you chiming in about either of these.


I don't know about you, but for me it was never about the money. I did this stuff (and still do) because I find it fun, not because I can't afford to buy it. I have my desktop, and I want that to just work, and I have a bunch of computers, hardware, 3D printers, etc etc that I constantly tinker with, because I like it.

I suspect it's the same for you, and it may be the lack of time, but not so much the access to money.


As a teen in the mid-oughties, I played heavily with the OSx86 project/Hackintosh. I learnt about writing kexts and kernel patches, and I fondly remember getting a Linksys USB-to-Ethernet adapter working on an HP workstation running Tiger.

My financial circumstances have improved somewhat in the intervening years. Today I own quite a bit of Apple hardware; my most recent Vision purchase overton-shifted my definition of “disposable” into very unfamiliar territory. Even still, about once a year I ensure I can still “triple-boot” - just now I do it with Proxmox and virtual passthrough. The first iMessage sent from my virtualized “iMac Pro” at 2AM was almost as gratifying as the first Apple boot screen on a Sony Vaio.

May we never lose whatever that is.


Spot on for me, but there's a different argument at play: At the beginning of the OSX on x86 times, Apple had an OS with a stellar user experience, but the hardware was just completely overpriced, so Hackintosh made complete sense.

Fast forward to today and I think Apple managed to pivot this almost to the complete opposite end. I think the hardware is incredible value (that's debatable for sure, but my M1 aluminum machined Macbook with Apple Silicon is blazing fast, completely silent, super sturdy and runs forever — I wouldn't trade it for any other laptop I could buy with money), while the Operating System has really taken a backseat, with hugely annoying bugs unfixed since 10 years: https://news.ycombinator.com/item?id=39367460

To me, in a world like that, Hackintosh simply doesn't make much sense anymore. Asahi Linux is really the star on the horizon, by doing exactly the opposite: letting a free and better-maintained operating system run on strictly awesome hardware.


Value was the driving factor for some of my early hackintoshing (Core 2 Duo era), but what pushed me to do it from 2015-2020 was the abysmal state of higher powered Mac hardware.

The Mac Pro was still the un-updated 2012 trash can, the 15” MBPs were too thin for the CPUs they housed (perhaps Intel’s fault for getting stuck on 14nm for so long, but still) and were hot with terrible battery life, and while the 27” iMacs weren’t terrible and probably the best of the lineup, they still weren’t cooled quite as well as they should’ve been. My 6700k + 980Ti tower in a Fractal Define case with a big quiet Noctua cooler was just flat better and made a far better Mac than anything Apple sold at the time.

That said, I did eventually grow tired of the tinkery-ness of it all and in 2020 picked up a refurbed base model iMac Pro, one of the few Macs in that timespan that wasn’t a mistake, for about half its MSRP. It was about as powerful as that tower, surprisingly even more quiet, and of course just worked without the tinkering.


This is the classic "money is time, time is money" conundrum. A teenager doesn't have the money to buy a fancy car or computer but they have the time to tweak and experiment to get the most out of it. Meanwhile an adult has the money but not the time, assuming they have a full time job, kids, etc. So they're willing to spend the money to get products that work and would rather spend their limited time with their family instead.

In my teens I had a group of friends who loved to tinker, from hackintoshes to custom ROMs to homelabbing to electronics repair. Now I'm like the only one left who does this stuff :(


When you're young you have all the time, all the energy, but none of the money.

When you're an adult you have all the energy, all the money, but none of the time.

When you're a retiree you have all of the money, all of the time, but none of the energy.

A generalisation of course, but quite apt!


The only way out of this is an early retirement in a LCOL area or a job with a very good WLB (which is likely pretty rare for most HNers in the tech industry). Even ignoring overtime I'm typically tired when I get home from work and have other commitments alongside my hobbies and tinkering.


Technically, when you're retired you don't have all of the time. Most people only have 1/5 or less of their time left.


> Incidentally it's the exact same journey with my cars. 35 years ago I was fixing something on my car most weekends. Now I just want to turn the key and go somewhere.

This resonates with me as well. As a teenager with my first car I spent a lot of time tweaking its appearance, sound, performance, etc., buying what little I could from local auto parts stores. I couldn't wait to get older to have more money so I could do more mods and really make the vehicle how I wanted it.

In the back of my head I wondered why older folks didn't do this though. They have these nice vehicles but they're bone stock! Why not new wheels, tint, a tasteful lower, etc.?

Then I myself got older and found it just isn't as important as it used to be. I still have a slightly modified car, but I'm not rooting around inside the dash with a soldering iron like I once did, haha.


Haha that is a very similar mindset I had when I bought my first house. I was excited about all of the nice improvements I could make and wondered why so many people I knew who were well off never really put much work into their home.

Then I quickly realized that its such a big hassle and also you almost instantly get used to things how they are.


I don't understand. Do you imagine there isn't a young generation of time rich cash poor tinkerers now? Why would the idea of a hackintosh suddenly become obsolete because you can afford one now and don't have time? Nothing about your statement logically follows.


He’s just parroting a usual HN-ism of ignoring the topic and talking about themselves. I’ve seen the “I used to tinker but now I don’t” line a hundred times as well as the “this doesn’t apply to me so I don’t care - let me tell you how”.


Isn't that the truth. For a site with the word "hacker" in it there seem to be so few of them. I can't imagine letting all that curiosity die out of me like the parent comment implies.

I don't have the amount of time I used to to do that stuff either but the curiosity of it has never died and if I had more time I'd still do it.

If I ever lost that drive I think I'd rather be dead.


The funny thing about growing older is that we change, and the things that were once "I'd rather be dead than not do this" just naturally fade away, and other new exciting things take their place.

I say this not to dampen your enthusiasm, but rather to encourage you to enjoy it to the maximum while it lasts.

Everything has a season and in that season it can seem terribly important. Perhaps an activity, or a favorite sports team, or a group of friends.

Some of that remains forever, some of it gets deferred as other things happen. It's part of life, we grow, we change, the world around us changes.

It's not that the drive is lost, it's just that it manifests in different ways, different activities, different challenges.

When you see a post like yours in 30 years time, remember this moment, and raise a glass :)


> If I ever lost that drive I think I'd rather be dead.

I wonder how many others had this exact same thought, before they lost their "hacker" drive while also preferring to continue living.

This may shock you, but people's interests and desires can evolve over time, even when those people don't expect them to evolve.


I’m going to gently pile on to the sibling comment here, and note that the “hacking” we find interesting should and does change over time. I used to spend time hacking PDP-11 assembly code to make games. That got old, and if I play a game now it’s purchased. The stuff I hack on now is more like applied math.

This is all good and natural, if it’s organic and not growing it’s probably not alive.


In what sense is this an "HNism"?

Ever since blogs have had comments sections, the set of people who are too lazy to make their own blogs, have been holding forth (writing, essentially, their own blog posts) in other people's blogs' comment sections.

Heck, I'm sure people were doing it on Usenet and all-subscribers-can-post mailing lists, too — using the "Reply" button on a message to mean "I want to create a new top-level discussion that quotes/references this existing discussion" rather than "I want to post something that the people already participating in this existing discussion will understand as contributing to that discussion."

In all these cases, the person doing this thinks that a comment/reply is better than a new top-level post, because the statement they're making requires context, and that context is only provided by reading the posts the statement is replying to / commenting on.

Of course, this being the internet, there is a thing called a hyperlink that could be used to add context just as well... but what there is not, is any kind of established etiquette that encourages people to do that. (Remember at some point in elementary school, learning the etiquette around writing a letter? Why don't schools teach the equivalent for writing a blog post/comment? It'd be far more relevant these days...)

Also, for some reason, social networks all have "reply" / "quote" actions (intended for engaging with the post/comment, and so showing up as "reactions" to the post/comment, or with your reply nested under the post/comment, etc); but no social network AFAIK has a "go off on a tangent" action (which would give you a message composer for a new top-level post, pre-filled with a cited quote of the post you were just looking at, but without your post being linked to that post on the response-tree level.) Instead, you always have to manually dig out the URL of the thing you want to cite, and manually cite it in your new post. I wonder why...


"...but no social network AFAIK has a "go off on a tangent" action (which would give you a message composer for a new top-level post, pre-filled with a cited quote of the post you were just looking at, but without your post being linked to that post on the response-tree level.) ... "

On Usenet, if you were altering the general SUBJECT of a post, you'd reply to a comment BY PREPENDING the NEW TITLE/SUMMARY of your post to the PREVIOUS TITLE of the post to indicate that you HAD changed the GENERAL SUBJECT of the post to something else AND end your NEW TITLE with "Was..." to prefix the previous title, e.g. "Hackintosh is Almost Dead" => "My Changing Hobby Habits Was: Hackintosh is Almost Dead"


On the contrary, I was relating the article to my own experience. The thrust of the article was explaining the end of an age.

I was merely saying that we shouldn't see this as bad, it is the natural way of things. Everything that has a beginning has an end. Raise a glass to remember hackintosh, but don't mourn it.


People are asking how the fact that you make more money now is evidence of that. That's your natural ending, but it's not evidence of a natural ending.


They’re not that far off topic - the site would be far less interesting if we didn’t have tangential discussions in the comments.

They are also, as you noted, expressing a very common opinion.

Now I’m off to spend my Saturday not tinkering, because there’s a bigger world out there and I’ve done my time.


HN community selects for these kinds of posts, in the same way that subreddits like /r/amitheasshole love overwrought girlfriend-is-evil stories.

Most often the highest rated posts on HN are from 40+ year olds who don't discuss the post at hand, they'll post a hyper-specific nostalgic story from their youth on something that is tangentially related to the post.

In fact, the older the better. If your childhood anecdote is from the 70s or 80s you're a god.


There are other things that are more interesting to build and make now than a hackintosh (with the added difficulty that trying to make a silicon compatible device may not be feasible).

Combine this with the fact that a Mac mini that might be the target for a hackintosh build is $600 USD ... and has the advantage that it isn't hacked together and so has better support.

The part of me that wanted to tinker with a hackintosh in my younger days is more satisfied by Raspberry Pi and Arduino projects. I've even got an Onion IO over there that could use some love.

It's not that people don't want to tinker, but rather that the utility one gets from hacking together a Mac (again, note the silicon transition) is less than one gets from hacking on single-board computers.


As I said in my post, the next generation will find something new to tinker on.

The idea of a hackintosh is obsolete because there are new worlds to conquer, the time of hackintoshes has come and gone. The new generation will find their own challenges, not re-hash challenges of the past.


I guess the commercial success of the platform has increased the offering in the second hand market.

Also, the macOS desktop has pretty much stagnated and is behind the competition. What is strong is the seamless integration of the whole Apple ecosystem, so it makes sense to run macOS if you already own iOS devices. I doubt people using iPhones and iPads are struggling to finance the purchase of a Mac.


It's really not much of a time commitment. You can just lookup hardware with full compatibility and build a desktop that "just works".


The primary demographic of people interested in Hackintoshing are people who, like the GP in their youth, couldn't afford to just buy "hardware with full compatibility", let alone buy the equivalent-specced Mac.

The secondary demographic of people interested in Hackintoshing are people who have an existing PC (or enough extra parts to build a second PC) and want to figure out how to "make something that can run macOS" out of it, while spending as little money replacing/upgrading parts as possible.

People who buy parts, to build machines from scratch, just to run macOS on them, are a very tiny fraction of the Hackintosh community. (Which is why you so rarely hear stories of Hackintosh builds working the first time with no added tinkering — they can, if you do this, but ~nobody does this.)


I have a need to run macOS binaries and Xcode from time to time, and it used to be non-trivial to run macOS in an unsanctioned VM, so I kept a Mac laptop around.

But these days you can spin up a qemu macos vm without too much effort and that's my virtual hackintosh.
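For anyone curious, a macOS guest under QEMU/KVM looks roughly like the hedged sketch below. The file names (OVMF firmware images, an OpenCore boot image, the macOS disk image) are placeholders I've assumed for illustration; in practice the community OSX-KVM project ships ready-made scripts and images that handle these details.

```shell
# Hypothetical sketch, not a turnkey recipe: boot macOS in QEMU via KVM,
# using OVMF (UEFI) firmware and an OpenCore bootloader disk. All file
# names below are assumptions.
qemu-system-x86_64 \
  -enable-kvm -machine q35 -m 8G -smp 4 \
  -cpu Penryn,kvm=on,vendor=GenuineIntel \
  -drive if=pflash,format=raw,readonly=on,file=OVMF_CODE.fd \
  -drive if=pflash,format=raw,file=OVMF_VARS.fd \
  -device ich9-ahci,id=sata \
  -drive id=opencore,if=none,format=qcow2,file=OpenCore.qcow2 \
  -device ide-hd,bus=sata.2,drive=opencore \
  -drive id=macos,if=none,format=qcow2,file=macOS.qcow2 \
  -device ide-hd,bus=sata.3,drive=macos
```

The key trick is the spoofed `-cpu` vendor string plus the OpenCore image doing the boot-time patching that a bare-metal hackintosh would need.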


I remember needing a Hackintosh to build an iPhone app on my PC. You must possess a Mac to make apps for iPhones; I don't know why.


This really brings me back. My first hackintosh as a kid was on a 1.4GHz Pentium 4 with an ATI Radeon 9600.


I started my developer career on Hackintoshes many years ago.

No matter how much time I invested into building my desktop, it never "just worked". There were always inevitable problems with software updates, which often meant you had to re-image the system from scratch to install a new OS version. Which happened quite often, when you needed it to run the latest Xcode.

Then there were a lot of minor annoyances over the years, like crashes and graphical glitches with certain apps, like Photos or Preview, problems with monitor resolutions and refresh rates, and many, many others.

Ultimately, they were a useful tool for a time, but they suffered from death by a thousand cuts in terms of practical usability. So, I bought a basic Mac Mini as soon as I was able to, and never looked back.


The first hackintosh I built back around 2008 I was able to get working actually perfectly. Somehow the hardware and software bits all aligned and everything worked great. It’d run for months on end without issue.

Nothing since that one was quite as good. Had a Dell laptop for a while that was almost perfect, but it would lock up and require a reboot once every couple of weeks. A tower I built in 2016 was also almost perfect, except I never could get USB working 100% right and later on the Nvidia drivers got flaky.


I built a hackintosh in 2016: bought the right mobo with the right driver sets, etc. Used the buyer's guides on tonymacx86.com and purchased the exact hardware, downloaded the drivers, flashed things, etc. It was far from "just working". I had a stable and solid system for about 18 months (after a weekend of tweaking), and then it needed to be reconfigured, and I didn't have the time to spend another weekend getting it to work again... so that machine went back to Windows. Even with the properly supported Nvidia card, I had issues, and went through some pains with the wifi.


Cost of ownership of an M3 MacBook Pro is like $300-400 a year. Even if you have the time, it's just not worth it anymore like it used to be.


I wouldn't count on 10 years of real-world life from a non-upgradeable and 'repair-resistant' device with a glued in battery, even if the hardware specs are good enough to last that long.


That's 5 years of ownership. $2500 USD for a 16" then you get something back on the second hand market.
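The per-year figure upthread checks out if you assume some resale value. A rough sketch (the resale number below is a hypothetical figure for illustration; only the $2,500 price and 5-year horizon come from the comments):

```python
# Rough cost-of-ownership check. `resale` is an assumed second-hand value
# after 5 years, not a figure from the thread.
purchase = 2500
years = 5
resale = 800  # hypothetical resale value

per_year = (purchase - resale) / years
print(per_year)  # 340.0 -- inside the $300-400/yr range quoted upthread
```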


I'm happy with the M1 MacBook Air for $700. Could use 256GB more SSD, but it's not worth the $500 extra that Apple wants to charge.


It seems reasonable when you consider TCO, but you still have to pay 100% up front. Not everyone can drop $2k on a laptop, most people don’t need to.


Buying a new MacBook Air https://www.apple.com/shop/buy-mac/macbook-air/13-inch-m2

    $999.00
    or
    $83.25/mo. for 12 mo.
With a footnote:

> Monthly pricing is available when you select Apple Card Monthly Installments (ACMI) as payment type at checkout at Apple, and is subject to credit approval and credit limit. Financing terms vary by product. Taxes and shipping are not included in ACMI and are subject to your card’s variable APR. See the Apple Card Customer Agreement for more information. ACMI is not available for purchases made online at special storefronts. The last month’s payment for each product will be the product’s purchase price, less all other payments at the monthly payment amount. ACMI financing is subject to change at any time for any reason, including but not limited to, installment term lengths and eligible products. See https://support.apple.com/kb/HT211204 for information about upcoming changes to ACMI financing.
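The "last month's payment" clause in that footnote just means the schedule can never over- or under-charge due to rounding. For the $999 example it happens to split evenly:

```python
# Sketch of the ACMI schedule described in the footnote: equal monthly
# installments, with the final payment defined as the purchase price
# less all prior payments.
price = 999.00
months = 12

monthly = round(price / months, 2)         # the advertised $83.25/mo.
paid_before_last = monthly * (months - 1)  # first 11 payments
last_payment = round(price - paid_before_last, 2)

print(monthly, last_payment)  # 83.25 83.25 -- an even split for this price
```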

----

You do not have to drop $2000 on a new laptop up front.


Don't forget the $700 Walmart special: https://www.walmart.com/ip/609040889


The 8GB model doesn't count as a usable computer. It's absurd that it's even being offered in 2024.


And that's how you know someone has not used the 8GB MBA model. 8GB is more than enough for the light usage you'd buy an 8GB model for to begin with. Which means not running 3 IDEs and 5 VMs at the same time.


Sure, then just add $100 each for a decent amount of RAM and storage.

Oh, wait... forgot that these are designed to be landfill fodder.


> Not everyone can drop $2k on a laptop

That's what most laptops used to cost back in the 1990s or so (after adjusting for inflation). If you look further back in time, hardware was even more expensive - and it couldn't even do 10% of what a modern MacBook does. Modern hardware is ridiculously cheap.


In the 90s most people didn't have a laptop for that very reason. They just owned desktops which were way cheaper.

I studied computer science then and I knew 1 student out of 50 that had an actual laptop. Even at the uni we had to use their computer rooms full of desktops and X terminals.


Cheaper for sure but not "way cheaper", at least nowhere near as cheap as desktop hardware is today.


In the 90s no one I knew had a computer at home. Nintendo/Sega maybe, and rich guys had a PlayStation, but no one had a PC.


Oh here people did. Internet was booming and I set up so many PCs. It was great business.


I had a Gameboy, parents had a PC. Friends all had a game console. NES, SNES, SEGA, Amiga, C=64, etc. PC went booming in 90s though. Because here you could buy a PC tax deductible via a law called 'pc-privé'. This was to stimulate citizens to learn to use a computer in their private time. Still, even with tax deduction a PC was very expensive. Not like a car, but expensive still.


Not really sure they were much cheaper in the 90s. My first PC was a Dell P90 in 1994, IIRC it cost about $2500. There was kind of a mantra at the time that no matter the improvements, you'd always spend about $2500. And adjusted for inflation, that "way cheaper" desktop was over $5K in today's dollars.


Inflation calculator says the Core 2 Duo MacBook Pro I bought in 2007 is something like $4400 now. Bought a small intel ssd for the sata3 port in it too for probably another $500 now


Entry-level price for a new Mac, right now, is $700 (M1 Macbook Air at Walmart). It doesn't get you the best or the fastest, but it's a perfectly usable laptop. Or, if you're okay with something lightly used, a refurbished M1 Mac Mini is ~$500.


But Apple's entry-level Macbooks aren't intended to be bought. They have almost comically low amounts of storage and RAM for 2024 (8GB/256GB).

It's all about the upsell on those non-upgradeable parts.


The M1 MBA being sold by Best Buy and Walmart are perfectly fine for 99% of the computing world. Maybe not for gamers (most laptops suck for this), or for someone needing to crunch large datasets, but when this first came out, tons of developers were perfectly happy using it, even with small storage. Hell, my iMac I used up until buying a Mac Studio only had 256GB.


Yep. 8 GB RAM isn't great, but for basic use -- web browsing, word processing, some light photo/video editing, etc -- it'll be perfectly adequate. Not everyone needs a supercomputer.


Depends on how many browser tabs they dare to keep open, and how many pictures and videos they want to keep around on their computer stored in high resolution.


VSCode remote to my Linux desktop on LAN. The upsell is obvious, but I'm not gonna drop $500 on 256GB of disk space.


That's a pretty time limited sale. I doubt the stock of unsold M1s is huge. It's also $700 for a new but essentially 3+ year old machine.


I think this is a longer-term sales deal, not just a clearance sale of old stock:

https://corporate.walmart.com/news/2024/03/15/walmart-brings...

As far as the processor goes, the M1 is in active production (e.g. for the iPad Air), and is still a very capable CPU. It may not be the fastest laptop CPU on the market anymore, but it's hardly slow.


And you can drop it and lose the $2k in one single second. You can insure against that but it costs another small fortune.

TCOs are great calculations for companies but don't work for individuals.


AppleCare+ on the $2k 14" MBP is $279 or +14% for 3 years of coverage. That seems pretty reasonable to me.


So now you made it a $2.3k MBP, didn't you? Funny how the $999 MacBooks tend to explode in price when you configure them to make them useful.


I don't understand how this relates to paid warranty or insurance. Your TCO for a non-Apple laptop can also include a protection fee or cost to repair with no insurance.


Correct. It’s an insurance product.

Whether you take it is all about your risk tolerance and in no way impacts the usefulness of the machine.

I believe 14% is worth it.


Build a Hackintosh out of a rugged laptop case, problem solved.


They have lots of monthly payment plan options here in Canada, and probably in the US too. It even used to be zero interest. Not sure about the rest of the world.


Also many no-interest options in India, but the prices are higher here: somewhat so for the Macs, but significantly higher for the iPhone, as it is such a social status thing here in the north.


Apple Card offer this in the US, but you have to get the card. I buy all my Apple gear this way. 0% loans are great.


But more people can certainly afford a second hand mac mini which doesn't cost more than the sum of the parts of a typical hackintosh.


Lower specced macs don’t cost 2k. Go for a generation or two old refurb and you’re looking at 600-800~


I assume people would do this to get a powerful machine and not comparable with one of the cheaper Macbooks?


Did you factor in repair costs?

A 14in MBP with an M3 Pro/36GB/1TB is $2800. Add 10% sales tax, and that's about 3k.


I've never needed to actually repair a Mac.


> It's really not much of a time commitment. You can just lookup hardware with full compatibility and build a desktop that "just works".

Oh JFC. This canard has been floating around about Linux for 30 years, and it's always been a half-truth at best.

Inevitably, it always comes down to "Cards with 2361YSE rev 5 chipset" or some other nonsense. Like that makes total sense for a kernel developer, but most people don't know what chipset they have in some peripheral.

So now you're left with assholes saying, "WeLL yOu ShOuLd gEt InFoRmEd. JuSt GoOgLe iT!", and it ain't that easy. If you can even find a brand name to chipset list, it's going to be out of date, or it's going to be something that says "2361Y" or "2361YSE rev 3" or something. Is that close enough to "2361YSE rev 5"? Who knows!

Then the best part? Even when you lookup the hardware with "full compatibility", you'll find that it actually isn't. Then when you ask about it, you'll get, "I just don't use that feature, and you shouldn't use it either."


You guys that grew old forget that there are still younger people in this *world* (not just the US). It's analogous to saying "I worked 9-to-5 in the 2000s (when wages were acceptable), but now that I have way less energy to work, and made millions off my retirement fund, I don't see why this generation shouldn't work equally hard today."

Tinkering shouldn't be nostalgia, it should be a right. I'm sure you used to fix the rusty old generational family car with your dad on the weekends. He probably used to do the same with your grandfather on the weekends. I don't think there'll be a car to fix for the next generation.

Just like ramen or office chairs can measure a recession, fixing cars with a father figure could be used as an indicator for the prevalence of greed in society.


I was also that kid. I remember an OSX upgrade breaking my mouse and I couldn't figure out how to get it working again. I was desperate for a Mac, but it was financially unattainable.


I feel the same way with phones. I pre-ordered a Nexus One the day it opened, installed a dozen custom ROMs, etc etc. Upgraded to a Nexus 4, 5. These days I use an iPhone. Don’t miss it, though I’m nostalgic for the excess free time!


Same for me. I spent countless hours recompiling my kernel in slackware, configuring enlightenment window manager. These days I don't even change the desktop wallpaper.


Yeah I spent literally dozens of hours of my life compiling different kernels with OSS and ALSA variations to get my sound card working lol. Really a 'you had to be there' thing.


I have one running in a virtual machine, but on hardware that would natively support a Hackintosh, which I use only for testing Mac distributions. It's too old to use now, but when I built it you could buy Mac OS at Best Buy.


While tinkering for the sake of tinkering is as good a hobby as any other, the process of tailoring your OS does not have to be infinite. Maybe it is different for others, but while I did spend a lot of time writing my Linux dotfiles until they were nearly perfect, for the last 5 years or so, when I have a fresh OS install, it's really just 'git clone; chezmoi apply' and I get a system where every keybind is exactly where it needs to be.

When work banned Linux machines and I had to transition to OS X, I had to do just as much, if not more, tinkering to make it work for me. Perhaps it does 'just work' for those who think exactly like Steve Jobs - but if you want it your way and on your terms, there's a lot of tinkering to do, from yabai configs to Karabiner JSON configs and custom plists, to replacing most of the gelded Apple apps.


I had a similar revelation a few years ago. My giant PC gaming rig blew up again (specifically, my 3080 shit itself), a year after having to replace the power supply and rewire the whole thing.

I was just done with faffing around with that kind of thing.

So I bought a (then fairly recently released) Mac Studio, just the plain-Jane base 32GB model, and couldn't be happier. So nice to have something virtually silent and energy efficient, instead of a jet turbine that drew about 300W at idle.

I 100% do not want a laptop for my primary personal machine, but the big workstation towers are too much.

The Studio is that wonderful Goldilocks zone - performant, bring-your-own input devices, but merely “a bit pricey” and not extravagantly so.


Same story but with custom Android ROMs


Agree. I used tweaked BlackBerry ROMs for a couple years before getting my first Android device, an HTC One M7 with Android 4.4 KitKat. Spent loads of time getting all the tools working to modify ROMs, bootloaders, recovery/TWRP, and squeeze every drop of performance out of that phone. Then went "backwards" to an iPhone 4S and have been rocking stock iPhones ever since.


Those were the days. Nowadays it all feels same-ish and boring. Can't wait for a new kind of device where not everything has been figured out yet.


The Steam Deck sort of occupies this space today. I'm not in the scene myself but I've read about users modding them, running unsupported OSs, liquid cooling them, etc. Seems like any sufficiently broad technology will garner a community of hackers and modders around it.


This!

I'm not into modding, but I got the SD because of its openness and all sorts of things I can make with it (also to support gaming on Linux - kudos to Valve for their work pushing it ;) )


I played with Jolla/Sailfish for a while and the device was awesome, but I couldn't get myself to like gesture/swipe navigation (I hate it on the current crop of iOS/Android with the same passion)...

For the device, I'm pondering the new OP, which is more open than the rest, but still, as you said - it's mostly the same OS and the changes are not significant enough to spend all that time on flashing...


Yeah, but that's how you learn.

These iPad kids don't know anything because it all just works now for games and Netflix. No need for drivers, Windows installs, etc.


Macs are very expensive in some parts of the world, where other computer brands are affordable. A hackintosh could be a good option, and when somebody learns to do it well they could do it for others for money. Not only installing MacOS on PCs, but also installing newer versions of MacOS on Macs that are officially not supported anymore.


> A hackintosh could be a good option, and when somebody learns to do it well they could do it for others for money

Apple so thoroughly screwed over Mac developers that the only compelling software exclusive to MacOS is developed by Apple themselves[1], IMO. Even those packages have equivalent (or better) alternatives on Windows. Macs used to be the platform for DTP, audio and video production - now all the 3rd party developers have pivoted away to other operating systems. One of the reasons professionals resorted to Hackintoshes in the past was because Apple had periods of neglecting the Mac Pro hardware on and off. Why would anyone go through the pain of setting up a Hackintosh in 2024, outside of being a fan of MacOS aesthetics?

1. Logic Pro likely has the biggest pull; Final Cut isn't the halo app it once was.


I use many third-party apps on MacOS that are top of the line in their niche, regardless of OS. People have many different uses for computers and workflows that you are unfamiliar with.

When you discover how programs on MacOS can connect and interact with each other and with the OS as a whole, it becomes a completely different experience.


Macs continue to absolutely dominate audio and video production, and desktop publishing. You're just making stuff up.


Nice strawman! You completely demolished a market share argument I never made. My point is that audio and video professionals now have viable alternatives to Apple software. Running MacOS is now optional, which wasn't the case in the past, so there's less of an impetus for running MacOS on non-Apple hardware.

As for making stuff up - I don't know if you remember the years of neglecting Mac Pros, or the clusterfuck that was Final Cut Pro X. I do. I remember a lot of dyed-in-the-wool Apple users switching to Adobe on Windows. How many 3rd party DTP, audio or video production packages are still exclusively available on Apple?


Only in the countries where the Apple brand dominates; the remaining 80% of the desktop market makes do with Windows and occasionally some custom Linux distros supported by the hardware vendors themselves.

VFX Reference platform includes Windows and Linux for a reason.

https://vfxplatform.com/


I'd run a Linux desktop if it wasn't for audio production. Mac's Core Audio / Core Midi are the top of the heap.


I couldn't agree more. I started my life buying/using Macs, but nowadays I wonder why people make the financial effort, considering the very large premium, even though Macs don't do anything much better anymore.

You are right that the main driver is probably Logic. Final Cut is being largely replaced by Blackmagic and Adobe software (because they work everywhere and integrate better with the other things people care about). Avid software works just as well on a PC. As for desktop publishing, it's a use case so trivial (and in some ways displaced by web tools) and so dominated by Adobe that it feels like Adobe is really doing Apple a favor by keeping their software updated/optimized on time.

In my opinion/experience, they keep selling because it is very hard for people to change. Most keep using them because it's what they're used to, and for similar reasons.

This is where Apple is very shortsighted and acting pretty stupid: you can see the young largely ignoring Apple computers because they are way too expensive. I believe Apple has permanently eroded its dominance even in the media industries, because even though those industries have the money, it won't take long for them to notice they can keep doing the same quality work on hardware half as expensive, since it makes no difference to their young workers.

Apple's premium pricing was a historical artifact of always being on the bleeding edge and one step ahead of the competition in many things. Now, aside from the silicon (which has some advantage for laptops in the form of battery life, but not really any for desktops), it doesn't feel like they are ahead in anything.

In fact, when you take a hard look, you realize they are selling updates of stuff that was designed 10 to 15 years ago, and not much has changed while the PC industry as a whole has evolved quite a lot.

What is strong, though, is the delusion of people defending these extortionate prices for whatever ridiculous reason they can think of at the moment.

Don't get me wrong, I think Macs are ok for the most part, just not at their current price, especially in Europe (France).


Why is it a hacker thing to denounce everybody with a different opinion or preference as "delusional" or brainwashed?

For many people it is nothing to pay a couple of hundred dollars more for a device that, in their experience, is easier to use or that they prefer looks-wise, even if the geek specs are worse. Especially for a device they'll be using for many years.

> In fact, when you take a hard look, you realize they are selling update of stuff that were designed 10 to 15 years ago and not much has changed when the PC industry as a whole has evolved quite a lot.

Then why has the PC industry not been able to catch up with Apple in computer design in all this time? A 2014 MacBook still looks better than any 2024 PC laptop. It's beyond weird to me by now. Apple made a great leap in design with the unibody MacBooks, but more than a decade later nobody else has followed, even though customers would love it.

Imagine if there was only one car manufacturer that could design a good looking car...


Where did you find me saying it was a hacker thing? And it's not simply about having a different opinion, those are perfectly fine.

It's about saying things that range from deceptive statements to outright lies, designed to resolve the cognitive dissonance created by making a somewhat suboptimal (sometimes irrational) choice. It's all about projection and insecurity.

A couple hundred more was the case in the 2000s-2010s, but nowadays the Apple hardware difference quickly climbs into the thousands of euros for a device capable of the same workloads.

The RAM controversy started precisely because of that fact: on paper, entry-level MacBooks bench very well, but as soon as you level up your workload the truth appears, and suddenly a much cheaper device, less powerful on paper, gets more done. Plenty of YouTubers have demonstrated this, even self-proclaimed Apple fanboys.

This is exactly what I mean by delusional: you can be happy/content with your purchase, but trying to downplay the stupid compromises one has to make at a given price point is complete foolery, no matter what perceived advantages a MacBook may bring. Trying to rationalize/minimise those facts to others in order to justify Apple's extortionate pricing is very close to religious belief, which is delusional by definition.

My first job was working for an Apple service provider (very high end), so I know firsthand the various reasons that may be valid for buying a Mac; but with Apple's current business practices it has become less and less advisable, and pretending otherwise doesn't get you bonus points from Apple...

Then you say that somehow a 2014 MacBook looks better than any PC laptop. Now I wonder if you live under a rock or just haven't checked any modern laptop worth a damn recently (to be clear, I am generally talking about laptops in the 1K-euro range at least). Have you seen a modern Huawei laptop? It looks pretty much like a MacBook until you start looking at it closely, and in my country it starts at half the price of a MacBook Air; in fact, as a Mac user, the first time someone handed one to me I couldn't believe the price they paid for it and the assembly quality for said price.

But even then, looks are a rather subjective thing; you can say it looks better to you, but that is not an exact truth. I find their latest design bulky and inelegant - well made, sure, but not necessarily nice or interesting looking. In fact, Surface devices look very nice as well (even though they are overpriced, they still manage to be better value with less volume); Asus has some pretty good-looking laptops, HP as well, and even Dell manages pretty well with their XPS line.

Trying to justify a price difference of multiple hundreds of euros (at minimum) with a subjectively better look is not exactly rational behavior. It would be like someone trying to sell me on his Louis Vuitton handbag purchase as "worth the price" because it looks good - even though the common LV pattern is a toilet-tile pattern that should have stayed there. You could have paid a master craftsman to make an even better bag to your specifications that would look just as good (I live close to one of their factories, so I know that for a fact). But few people are delusional enough to say things like that, because it was not a rational purchase; this is all about emotions, and the emotions it brings out in other people.

The problem I have with fanboys (Apple's or any other's, for that matter) is that they try to pretend otherwise, and that's just delusional. Apple is a company that mastered marketing and knows very well how to create desire that inflates the value of products that should be tools first and foremost. Some people like this, and I find it a bit sad (considering the supposed use case of the device); others begrudgingly pay the price for various reasons (which very often boil down to self-imposed limits on their choices).

As for the unibody, I don't think you understand why exactly Apple does it. It has a lot more to do with saving on device assembly (making things harder to repair in the process) and ROI on the billions they invested in custom CNC machines than with any real benefit for the user. It doesn't make the laptops last longer (not that it matters, considering how fast Apple obsoletes their stuff nowadays), and it can render a device useless from just a dent in the wrong place, because you can't swap the chassis. On top of that, modern premium laptops have moved on to magnesium alloys that noticeably save weight without compromising strength, and Apple hardware is often heavier for a given form factor.

I could go on for days, so I will leave it at that. Just to be clear, I am not saying all Apple stuff is bad or whatever; it's just tiring to see people defending their gear tooth and nail when, considering the price, their customers should be asking for much more. Defending all the very real compromises of that gear is, yes, delusional.


With a little preparation setting up a Hackintosh is not much more difficult than setting up an actual Mac. What you're describing is a myth or just out of date.


It really depends if you're tinkering because you have no choice versus tinkering because you like it. People will absolutely tinker on cars for their entire life, but upgrade to more interesting jobs like doing engine swaps rather than just changing their oil.

But it sounds like you were tinkering because you had no other choice, not because you enjoyed it.


To me it was about having just one powerful, upgradable desktop computer with both Windows and MacOS, so I didn't have to have multiple devices on my desk.

Now I have solved it with a PC desktop, a MacBook Air, and an Apple display. The PC also has USB-C display output, so I can just switch which cable connects to the display.

The downside is still that the M1 is not as fast as the PC I have, especially at anything GPU-intensive.


> this kind of tinkering was fun and a good way to improve the machine I was using.

You know you’re alive when you log into iCloud from a hackintosh and then Apple notices, you get the ‘unauthorised hardware’ message (I can’t remember the exact phrasing) and your various iCloud services begin to cease functioning. It’s not often your OS is exciting.


I had the same experience with Windows 98/2k and my franken-PC of randomly upgraded parts. I used to have to reinstall Win98 every other month or so because it was so unstable. I had the installer on a separate partition, so I could just wipe the system disk and have a clean install up in 12 minutes.


> These days I want to -use- the computer, not spend time trying to convince it to work

Me too. But it seems I'm out of luck. Every day I have to fix something. Today, "quality" means how much user data we steal, not whether a system works.


I find that these days the Steam Deck has become a great device for tinkering. I've seen people do some nice unexpected stuff with it, for example making an opening in the back to connect a dedicated GPU, or using it to pilot drones in Ukraine.


There are new generations of cash poor tinkerers though, including the 3rd world.


From around 2008 to 2012 I ran a hackintosh on a desktop; it was great and fun. In 2012 I bought my first MacBook. The good experience on the hackintosh made me get the MacBook, so I like to think hackintosh helped Apple.


> These days I want to -use- the computer, not spend time trying to convince it to work.

I have said this same thing about Android vs iPhone. Also, if I cannot tinker with Android then I might as well have an iPhone.


I agree with your first three paragraphs, but why wouldn't the next generation want to continue to hack?

It was my obsession with worthless endeavors that got me the kind of job that made my time valuable in the first place.


> 35 years ago I was fixing something on my car most weekends. Now I just want to turn the key and go somewhere.

See, that 35 years for me didn't make me stop working on my cars, it just allowed me to have enough money to have a reliable car as well as "toy" cars that I can still tinker with. I drive the Audi, but I still wrench on the Triumph. I used to tinker with Hackintosh stuff as well, and I haven't stopped tinkering, just moved on to other things, like this Rubidium frequency source I just bought to build a high accuracy NTP server from a Raspberry Pi. (Yes, of course, there are already cheap and easy solutions for this, but I want to tinker).
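For what it's worth, the chrony side of that Pi build can be only a few lines. This is just a sketch under assumptions the comment doesn't spell out: that the Rb source's 1PPS output is wired to a GPIO pin exposed as /dev/pps0 (e.g. via the pps-gpio device tree overlay), and that an ordinary NTP server is reachable so chrony can number the otherwise anonymous pulses:

```
# /etc/chrony/chrony.conf - sketch; device path, refid and subnet are assumptions
refclock PPS /dev/pps0 refid RB poll 2   # precise second edges from the Rb 1PPS
server pool.ntp.org iburst               # coarse time to label each pulse
allow 192.168.0.0/16                     # serve time to the local network
```

`chronyc sources` should then show the PPS refclock selected once it locks, with the NTP server relegated to sanity-checking.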


Either OP doesn't consider tinkering an enjoyable pastime or they have no free time to do something they enjoy. Both quite sad, to be honest.


I wouldn’t go as far as “sad”. Free time is always a finite resource you have to prioritise. I used to tinker, these days I’d much rather spend time with my kids. I’m definitely not sad about it. I’ll have plenty of time for tinkering in the future.


Yes, time is limited, and these days I have new hobbies. For cars, and computers, it's a bit of been-there-done-that.


Can people simply have priorities?


Of course. And people enjoy spending their free time on various things, not necessarily due to some restriction; for those people, time spent on those things isn't wasted. For example, you can have fun fixing cars even if you have the money to have a mechanic do it.


>These days I want to -use- the computer, not spend time trying to convince it to work.

This but with LLMs lol.


Same here with Emacs. Tinkering-vs-doing. Opportunity costs too high.


There was a time when hackintosh was practical for everyone who needs macOS and/or can't stand other desktop OSes. It was the tail end of Apple's Intel hardware. It was pathetic in terms of performance (underpowered CPUs and buggy GPU drivers), quality (butterfly keyboards) and thermal design (things would overheat all the time), yet expensive.

I myself was contemplating building a ridiculously overpowered hackintosh machine around 2019. Then the ARM transition was announced. And then the M1 came out with overwhelmingly good reviews from literally everyone. So I decided to wait for the beefed up "professional" version, which did come later, so here I am, typing this on an M1 Max MBP, the best computer I've owned so far.

Also, for me personally, hackintosh was an introduction to macOS. I was a poor student at the time and couldn't afford a real Mac. Of course I bought one about as soon as I could.


Went from hackintosh to Mac; never had enough to afford a car (I think).

