ThinkPad X220 MacOS High Sierra Installation (mcdonnelltech.com)
248 points by walterbell on Aug 4, 2018 | hide | past | favorite | 238 comments



I used MacOS for years and loved it. But I have now been on Linux (on both a Mac and an X220) exclusively for a few years and I'm honestly confused as to why a developer in this community would even think of not running Linux full time.

I have seen people here address this before, but it's always like a list someone from Lebanon would write after moving to Costa Rica - never really that things are wrong on the other side of the fence, just that it's different. Stick it out a bit and the pain of those differences melts away.

If I had to shift to MacOS now it would be far far more painful than the shift from Mac to Linux.


No distributions with truly great out of the box defaults, too much time tied up in configuration, too few Linux app developers with a user-oriented mindset and obsession with getting all the details right. Also, just generally death by a thousand cuts with ridiculous numbers of small annoyances.

With a Mac I can have it set up to my liking and be working in little more than an hour. Under Linux (and to a lesser extent Windows) it’s a multi day project.


Also, just generally death by a thousand cuts with ridiculous numbers of small annoyances.

This used to be the difference.

But between Preview.app bugs (crashes all the time when previewing Beamer slides, slow rendering, jumping through slides in non-linear order), sticky or stuck keys on the butterfly keyboard, an ancient UNIX userland (yes, you can install newer versions with Homebrew or Nix), weird bugs when running XQuartz, etc., the gap is narrowing pretty quickly.

Also, I have been using NixOS on my Linux workstation at home. I like it so much that I want fully declarative system configurations everywhere ;).


I can't agree. I work in enterprise Linux. We essentially make distributions, build systems, customized kernels, etc. Linux, desktop Linux especially, is the worst of the big operating systems. It certainly has qualities, but if you make a table with all the different things to expect from an operating system and look into each section without being too subjective, Linux will come last overall. And I don't think that is really changing. Ever wondered why Ubuntu doesn't really seem to be getting better, mostly just shifting technologies around? Well, that is because it has essentially reached the level that desktop Linux is at. Anything else is going to go in the direction of something like Android, and even that isn't necessarily up to the standard of the alternatives.

I would like to elaborate, but I think it is pretty futile to do so online, which is pretty telling in itself. One example that should challenge people's view would be the SSH defaults story that is on the front page right now. Everything in Linux is pretty much like that if you dig deep enough. And while its competitors aren't perfect either, they are just far more consistent, because consistency is a major selling point for them.


> I work in enterprise Linux.

I don't see how you could even compare enterprise linux to home linux. Enterprise linux not only has a different userland (i.e. it's designed for LTS setups -- stability, not usability), but almost all enterprise places run 20 versions behind anyway. Not only that, but you're facing peculiarities of your setup that no sane home user is going to face in a billion years. The entire experience is different.

Go pick up Linux Mint's Cinnamon distribution, then see if you have the same complaints.


Linux Mint 18.3 is amazing. People should really try it out... I recently got a new Lenovo 120S, replaced its Windows 10 S with Linux Mint, and everything worked out of the box. CPU at 2-3%, RAM < 550 MB on an 11.6" laptop for £129 with zero issues restored my faith in Linux as a viable desktop OS.


Of course my experience is different. If you want to browse the web, do some development, and stay within those confines, it can work. But that has been true for probably at least a decade and is true for any other mainstream operating system as well. If that is all you want, you should probably go with Chrome OS. But people usually expect to be able to do more than that.

Quality isn't an absolute measure. If you were to plot the security holes in the Linux kernel on a timeline, you would most likely find that there is at least one open the majority of the time. Security is essentially a function of how hard something is to discover and exploit. And you don't know how hard that is until someone goes looking.

The same is essentially true for the rest of the operating system. Your average user can be perfectly happy, e.g. clicking on random links and executing random code without apparent consequences. They will rave about Linux until reality catches up with them, whether that is software conflicts, (lack of) backups, security, or just maintenance in general. Usually at that point they get disillusioned and switch to something else. That is why we have the perpetual "year of the Linux desktop": people leave at about the same rate as they join.

By working in enterprise Linux I, involuntarily, get to see all this at different points in time. Not just from my own perspective as someone who has largely accepted these problems, or from the perspective of a user that is excited about Linux until they aren't. It is from this perspective, and because I know how hard it can be to do things that are outside of the default, that I say that the other big operating systems are better overall.


Ever wondered why Ubuntu doesn't really seem to be getting better, mostly just shifting technologies around?

Lack of either a near complete monopoly or a track record with dedicated, quality hardware. Throw in some questionable leadership decisions.

I don't think there's anything inherent to Linux that's unable to level up on the desktop, but the people best placed to shake things up are probably manufacturers who'd rather take a cut of a Windows license.

Windows has certainly been consistent in not offering an SSH client at all for a very long time.


I think there are things inherent to Linux that make it unable to ever reach the level of usability that MacOS provides.

On MacOS there is one way to do everything. Want to make a desktop GUI app? Xcode has a project template for you. If you deviate from the APIs it suggests, you'll get nothing but pain. On Linux, there is no default window API. You have to choose between Qt and GTK and <a billion other things>. That leads to fragmentation at the windowing-system level. And just think, that fragmentation happens at all the other levels too.

Ubuntu has removed some of the fragmentation, since it’s so popular that you can just target Ubuntu and let your application fail on different configurations (or worse than failure - technically work, but require tons of tweaking in config files. Ugh).

However, Ubuntu has reached the point, IMO, where to provide better UX, they would need to start making Ubuntu actually incompatible with other Linux distros, or push the Linux community to adopt new standards that are better for desktop linux. A perfect example is Wayland: supposedly Wayland is a lot better than X11 for desktop purposes. It’s just too bad that Ubuntu can’t just break compat and blaze their own path here, or we would probably have a great Linux UX story already.


In my opinion most of the distributions start from the wrong end. They start with the kernel, add package management, decide which services to run and which toolkit to use, and then make some utility applications. Sometimes this is spread over a few different distributions. This means not only that it is hard to convince people why they should change things, but also that once you have made the improvements you want, you end up stalling, since you haven't actually tackled the things that are lacking.

They should do the opposite: start with the toolkit, the network infrastructure, the management utilities, or whatever else defines the use case. Once you have, say, a solid toolkit that is attractive to developers and people start making good applications, the rest is a matter of time. There is of course a reason why people don't do this: it is hard. But hard is good; it is complexity that will kill you.

This is of course essentially why Android and Chrome OS are successful in their own right.


Interesting point. Although I can't say I ever managed to get on with anything in Xcode, I've certainly had some Linux native desktop dev struggles in the distant past.

I'm a Java dev mostly, so perhaps I'm insulated from such things by the JVM. So long as I can run Chrom[e|ium], Eclipse, Docker and a terminal with a bash shell I'm pretty content. Even with that small list I've got issues with keyboard input lagging in Chrome on MacOS and Docker performance is poor relative to Linux. iTerm is a thing of beauty though.

I've got a box running Wayland on Arch. Can't say I've noticed any differences as a user from X11, I'll have to look them up and see what I'd missed.


Windows has certainly been consistent in not offering an SSH client at all for a very long time

That has changed. One of the Windows 10 previews offers a client and server (Ed25519 only). I've been experimenting with it on WSL and it seems functional... time will tell whether or not it is secure.

Back to the point of the original discussion: having worked using OSX/MacOS, Windows 10 and GNU/Linux in various combinations for the last couple of decades, I honestly prefer the desktop experience of GNU/Linux. As long as you avoid the dodgy hardware (Nvidia cards, an ever-decreasing number of wireless chips), everything is painless.

At this point really the only thing distinguishing these operating systems as general desktops (as opposed to gaming systems) is whether or not they are Free Software. That is something that is becoming more and more important to me in order to avoid being compromised by criminals (governmental or not).


And on macOS, when I hit them it's generally hard to find a solution. On Linux, there's an Arch wiki page, or something.


NixOS made a big difference for me too.


I've had the opposite experience. Stock Kubuntu 18.04 provides pretty much the experience I want with just a few tweaks to fine tune the touchpad and font DPI. Once the base system is installed it only takes a few more minutes to set up my dev tools using Ansible.

On Mac OS, I'm also pretty happy with the stock interface, but I spend quite a bit more time downloading and installing dev tools. Then I end up running out of memory because Docker for Mac and Minikube both end up running Linux virtual machines in the background.
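For anyone curious what "set up my dev tools using Ansible" looks like in practice, here is a minimal sketch of such a playbook. The file name and package list are assumptions for illustration, not the commenter's actual setup:

```yaml
# Hypothetical playbook (dev-tools.yml): install a basic dev toolchain
# on a fresh Kubuntu install. Run with: ansible-playbook dev-tools.yml
- hosts: localhost
  connection: local
  become: true
  tasks:
    - name: Install development packages
      apt:
        name:
          - git
          - build-essential
          - docker.io
        state: present
        update_cache: true
```

Because each task is idempotent, re-running the playbook on an already-configured machine is a cheap no-op, which is what makes this approach attractive for rebuilding a base system in minutes.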


You can drop Minikube now that Docker for Mac includes Kubernetes.


> With a Mac I can have it set up to my liking and be working in little more than an hour. Under Linux (and to a lesser extent Windows) it’s a multi day project.

With a Mac I set it up as much as I can in little more than an hour, with no hope to achieve what I really want. Under Linux (and to a lesser extent Windows) it's a multi day project but I can get whatever I want.


It can be a bit of a rabbit hole. I wasn't happy with any of the window managers or desktop environments for Linux, so I wrote my own. I didn't like the way any of them handled multiple monitors, but it ends up being more than a multi-day project. I still don't have what I want, so now I'm trying to find a way to get rid of X, or at least fix XRandR, because it doesn't support multiple GPUs the way I would like (Wayland is not a solution either).


> With a Mac I can have it set up to my liking and be working in little more than an hour.

How much of that is due to being accustomed to using the Mac's defaults? I mean, Debian's default setup is pretty much my ideal setup, but that's because throughout the years I've got accustomed to it, not because Debian's UI guidelines are awesome.


That is incorrect. I really urge you to try Fedora 28 today (you can liveboot).

I have had Mac users use my laptop for a little while and love the usability. Everything simply works. I use Skype Web and Zoom to do videoconferencing ...and obviously all the Slack-like tools work great.

And you have the choice to buy spectacular developer focused hardware like XPS or ThinkPad laptops.


I have been using Linux/Unix for 25+ years, but the desktop just never works for me. I'd echo that it's just death by a thousand cuts.

The usual suspects: fonts, video drivers, sound drivers, support for various USB devices, sleep/wake issues.

And a lack of basic apps, like a good multi-tabbed SSH/RDP terminal.


I honestly can't believe what I'm reading here. Is it still 1999 where you live!?


I started using Linux around '97ish and agree with the parent too. The problem for me wasn't investing the time in tweaking a new installation, but that apt-get dist-upgrade (or whatever it's called these days) would eventually break my system... in very bad ways.


I hear this a lot. And mostly from Ubuntu victims. I have not reinstalled my Fedora XPS 13 laptop for over 4 years. I have clean upgraded 3 times.

Fedora is spectacular and has been spectacular for many years now. The driver support is brilliant. It was one of the first distros to have absolutely seamless integration with RAID-mode NVMe (which is what the XPS shipped set to).

I will be very surprised if you have to "tweak" your laptop. Everything that you have in OSX is already there - including nightmode, etc... the works.

And here's the cool part - customizing Fedora is a browser extension away! http://extensions.gnome.org/


Breaking upgrades are not distro-specific, but mostly user specific.

It usually breaks for users who do not respect the package manager and what it does, who break their installation with misc convenience scripts and tweaks run as root, and then wonder what went wrong.


To be totally fair upgrading macOS or Windows can be a minefield too.


I had exactly one issue with a macOS upgrade - when they introduced APFS. The upgrade process failed in the middle with an FS-related error, but AFAIR it resolved itself after rebooting and retrying. For comparison, Windows 10 and various popular Linux distributions have caused me a lot more after-update trouble: input devices ceasing to work, graphics drivers breaking and falling back to the built-in generic ones (with an 800x600 resolution, on Windows), graphics drivers breaking and not falling back to anything, just causing X to fail to start (on Linux). Once (a few years back, I admit) upgrading Debian testing brought in a new version of the kernel that turned out to have some ACPI-related bug on my machine and consistently froze it about 30-60 seconds after booting. This is of course just personal experience, but I think I'm not the only person with similar stories.


I agree with the parent. I’ve been using Linux since before 1999, sometimes as my main desktop, but often not.

This comes up somewhat regularly, and probably always will - it’s the nature of the beast.

I’ve found myself very happy using a Mac as my main desktop. It’s not perfect either, but good enough.


For the most part, I think Linux has moved from a thousand cuts to just a hundred or so, and those hundred will persist. A Fedora installation covers a lot of the hardware out there, but there will always be new machines where support is incomplete (Surface, MacBook, etc.).

It's tempting to compare Linux with MacOS, but also (in my opinion) unfair. Apple appears to be making conscious decisions about what goes in their machines and not simply picking the cheapest components. I think Windows is a fairer comparison (they have to try and support everything out there), and I have had no more hassle under Linux than I've had with Windows.


It's not unfair. There's nothing preventing, say, Red Hat, from offering its own hardware for sale.

EDIT: Rather than downvoting, perhaps those who disagree with me could tell me why other companies aren't capable of "making conscious decisions about what goes in their machines"?

Apple is, after all, far from the only Unix vendor to do so. Whether it's HP-UX or Solaris, it's fairly common for Unix vendors to tightly couple their software to supported hardware. What is preventing Red Hat from offering a fully Red Hat compatible laptop that works near-perfectly out of the box?


There are machines certified for RHEL, Ubuntu or SuSE.

If you do your purchase outside this list and something doesn't work, it was your decision and is now your problem. It is not Red Hat's or Canonical's problem. They are not in the hardware business; their partners are, and the information about which models were tested and what the results were is publicly available.


Comically enough, I tried not so long ago: Fedora didn't boot. Like, doesn't even reach the login manager.

Everyone has their own stories and use cases, so why are people so hell-bent on insisting their use case and solution is The One True Way-and-I-can't-understand-why-you're-using-anything-else?


See other comment here on the Ikea effect. That's part of it anyway. I think the other part is humans can get fetishistic about almost anything. It seems to be part of our nature. Unsurprisingly, it's shaped around computers for HN types. Add into the mix that the organisations which make the alternatives (Apple, Microsoft) really are bloody horrible, so that adds an ethical angle. If there's anything people like more than fondling their fetishes, it's finger-pointing at others for doing wrong.

So yes it's silly, but it's fairly strongly motivated silliness.


I had Fedora 26 or 27 installed on a secondary partition on my tower for a while and wasn’t particularly impressed. All of the things mentioned in my original comment applied to some degree... I could make it work and I’d use it over Windows any day but it wouldn’t be my first choice.


I don't necessarily doubt that Fedora 28 is a passable day-to-day platform, but I did think it was a little surprising that you mention Skype as your first example of what should be a list of critical apps that bridge the transition. I would be curious to hear if others do use Skype, but for me it's literally been years. It came up as a joke in conversation recently, as we waxed nostalgic — yes, I'm actually using that phrase — about chat apps from days gone by.


I dual boot Win10 and Kubuntu 18.04, plus I run Kubuntu on my laptop. I use Skype in both Windows and Kubuntu. Under Windows, almost unbelievably, it never works right. It either can't find the camera or it can't find the mic, but if it can find the camera it still won't transmit from it. In Linux, I had a little mucking about to get the sound going, but now it just goes.


Here's where I think the difference lies between the western world and the NBU - The Next Billion Users.

Skype and Whatsapp video are the only two ones that work reasonably well in India with shaky mobile bandwidth. I mentioned Zoom as well.

But here's the thing - have you used Skype Web? There's no installation needed.


Interesting point. I wouldn't have assumed that, because I've never had a good experience with Skype's handling of unstable connections here in Canada. That said, no I haven't tried Skype for web.

Regarding the rest of the world, WhatsApp definitely comes to mind. For chat, video, and audio.


> And you have the choice to buy spectacular developer focused hardware like XPS or ThinkPad laptops.

I'm using an XPS 15. A recent purchase - I'm an OS X user by choice, but Apple currently makes no laptop hardware I'm interested in buying.

Here's the relevant archwiki page: https://wiki.archlinux.org/index.php/Dell_XPS_15_9560

It would take me days to work through that horror. I'd still end up with an only partially-working machine. Yes, it's the fault of Dell & chipset manufacturers for not supporting Linux, but if I'm going to spend effort apportioning blame for the world's ills, I'll do that on things that matter more (torture, ecosystem collapse, social isolation).

In practical terms, I can spend time setting up & configuring Linux, or I can use Windows today and plant some vegetables, or walk up a hill, read a new volume of history or philosophy, talk to a friend. Windows is kind of nasty, but all the applications I need are available for it, it can launch them, it can give me access to files, and I don't spend any time messing with it. Good enough for me.


(I'm not claiming that Windows is as undemanding as OS X by the way. The impedance mismatch between WSL environments and the rest of the system, and Windows' relative instability, make it more time consuming to set up and maintain. The order for me is OS X < Windows < Linux, with the Windows-Linux gap the larger of the two)


> too few Linux app developers with a user-oriented mindset

By "user-oriented", do you mean kitsunesoba-oriented? Or do you think all users have the same preferences?

> With a Mac I can have it set up to my liking and be working in little more than an hour.

macOS doesn't even include tiling window management out of the box. Apple clearly optimizes macOS for the average user. If you're an average user, that's cool. But for people with sophisticated workflows who aren't average users, the out-of-the-box experience provided by macOS (and Windows, GNOME, etc.) may be grossly inadequate.


The argument that using a tiling window manager is a prerequisite of being a non-average user is a highly arrogant generalization of your use case.


There is a general rule that the more time people spend futzing with some particular aspect of their shit, the more they fetishise it. It's a kind of cognitive sunk-costs fallacy. I've used i3, and it's great. I wish something like it were available on Windows or macOS. But it's just a trivial convenience; hardly something to develop an emotional attachment to. Let's face it, humans are odd creatures and will attach to almost anything.


IKEA effect! An example of which is those prepared doughs where you have to add eggs... the egg could have been (and at some point was) included, but marketing discovered that requiring the add-your-own-egg step made people love the end product more because they somehow made it themselves.

https://en.m.wikipedia.org/wiki/IKEA_effect


I've heard of that before, but it always made me wonder - are we just supposed to accept unquestioningly that powdered eggs are indistinguishable from fresh? Sure, there's no question you can make something edible with dehydrated eggs, but that isn't proof there's absolutely no difference.

In general, the existence of a bias is insufficient proof that you should reverse your conclusion.


Nice - I hadn't come across that. Spot on.


i3 allows the user to maximize screen real estate, and also to minimize time spent adjusting window borders and navigating with a mouse. It’s a huge time saver for anyone who learns a few simple controls. It’s like tmux on steroids.

Sure it’s a convenience, but so is basically everything we do to alter a computer after unboxing it.


I've never spent any appreciable time 'navigating with a mouse' and 'adjusting window borders' on other OSs. Sounds like those thundering imprecations that we all must use vim or emacs because IDEs require mouse use (which they don't).

I agree it's nice - but is it essential enough to merit the time sink of running Linux? Not for me.


I have to agree. I don’t use a tiling window manager and I don’t consider myself an average user. The comment above was pretty insulting.

Maybe I’m just a Luddite that doesn’t know any better, but I’ve been quite happy without tiling windows for many years. Then again, I spend most of my time in a terminal in a tmux session. Maybe that counts.

A common thread here is: your use case is not my use case. Many people are assuming that their workflow is the ‘one true way’. That's not a healthy way to have a discussion.


It might've been, but I agree with the core point: make window tiling an option. For the amount of time that it would take to implement, and the non-zero number of people that would benefit from it, it seems a glaring omission. Every OS should let you line up two windows side-by-side out of the box: this should be a minimum requirement of any window manager that lets you have windows at all.


Full-screen mode is macOS's native tiling mode: you can put two windows side by side and use as much real estate as possible (no menu bar or title bars).


Eh, that’s pretty darned far from an i3 or Xmonad experience. It’s roughly as wrong as saying that the Gnome frameworks are just as good as Cocoa :-)


I have used i3, awesomewm and Xmonad extensively for years so rest assured I wasn't implying anything like that ;)

I was more like pointing out that macOS does allow for the following in some way out of the box:

> Every OS should let you line up two windows side-by-side out of the box


Sorry, that really was only an example. I also didn't say, or mean to imply, full-screen only. Every OS should also let you line up windows top-to-bottom out of the box. Etc. macOS's approach of allowing one very specific window arrangement, then trying to make a feature of it by calling it 'fullscreen' or 'side-by-side' (or whatever they call it), just doesn't really cut it.


The problem is that there is no good solution for tiling window management on OS X, even with third-party software. The parent may not have stated it that way, but you’re being over sensitive if you’re actually offended by their comment.

Personally I feel like I’m using a toy when I have to use a non-tiling window manager. It may not be for everyone, but the display server or desktop environment should at least provide an interface to allow someone else to implement it.


This can be amortized though. You spend one good weekend setting things up to your liking and dump your dotfiles and other config properly in a git repo and you don't have to do it every single time.

A one time xmonad + arch setup that I did a couple of years ago has lasted me across 3 machines with only minor tweaking.
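A minimal sketch of that dotfiles-in-git pattern, with hypothetical paths and file names (the commenter's actual layout is not shown): keep configs in one version-controlled directory and symlink each file into $HOME as a hidden file.

```shell
# Hypothetical dotfiles repo layout: ~/dotfiles/vimrc, ~/dotfiles/tmux.conf, ...
DOTFILES="$HOME/dotfiles"          # in practice: git clone <your-repo> "$DOTFILES"
mkdir -p "$DOTFILES"
printf 'set number\n' > "$DOTFILES/vimrc"   # stand-in for a real config file

# Link each repo file into $HOME with a leading dot, e.g. dotfiles/vimrc -> ~/.vimrc.
# -f replaces stale links; -n avoids descending into an existing symlinked directory.
for f in "$DOTFILES"/*; do
  ln -sfn "$f" "$HOME/.$(basename "$f")"
done
```

On a new machine the whole setup then reduces to one clone plus one loop, and later edits propagate to every machine with an ordinary git pull.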


Dotfiles are like code, and need to be maintained. Some upgrade will break your custom configuration, and then you have to dive into docs again.

I try to make few customizations to the default macOS install, and just roll with the changes coming from Apple. I do have some dotfiles, containing mostly just SSH and Vim configs. But I've already had to fight Vim several times because of them.


Maintenance is amortizable as well. Every time you have to do a deep dive to find out what's wrong, you learn a little more about the state of the art of that particular part of the world. Then you can make a change that is more robust.


No MS Office, no Keynote, no Photoshop / Illustrator / ..


Linux distributions have become a lot nicer. I find myself not changing much at all anymore - it just works.


I'd love to switch back to running Linux on my laptop. Here's some things that aren't "but it's just different" that are stopping me (for now):

- can't run apps I depend on (e.g., Sketch.app only runs on macOS, I have a lot of existing files in it, and there's a lot more support for Sketch than, say, Inkscape). Other examples: Things.app, 1Password... hell, even Spotify barely works on Linux (does hi-DPI work yet?)

- hardware quality & support (if my Apple laptop has an issue, I can take it into an Apple store and get it fixed same-day in many cases. Also, recent keyboard issues notwithstanding, Mac laptops are generally seen as having best-in-class build quality and durability)

- hardware support & power management (much better than it used to be, but depending on which laptop you choose, many people still have issues getting hardware edge cases working: suspending and resuming, adjusting the backlight based on ambient light, applying optimal power management, etc.)

- touchpad drivers (libinput and Wayland are making progress on this front, but the experience is still not on par with Windows Precision and macOS)

- mobile device integration (unlock my computer with my watch, get phone texts forwarded to my laptop, make and receive calls via my phone, etc)

Trust me when I say, I really wish I could move back to Linux. I spent many years helping develop a desktop linux distro. And certainly things have improved substantially since a few years ago — hiDPI support in particular is really excellent these days. But there's still a number of barriers for even technical users. At the end of the day, I want to get work done with my laptop. I don't want to fuss around with configuring and worrying what happens if an update upsets my driver compatibility (yes, I know about btrfs snapshots and kernel image caching).

Just my $0.02. Sorry to be a downer.


+1 that is how I see it also

Having integration between iPhone/iWatch/MacBook is great and I have come to accept Apple’s UI experience as definitely good enough for me.

I love Linux, and do much of my day to day coding on Linux servers with far more horsepower than my laptops. Working in macOS for email, writing, research, small code experiments, etc. is fine, as is working in a few SSH shells on a server with a nice Emacs/tmux/etc. setup for development. I find that I can live without the nice JetBrains IDEs, VS Code, etc.

Edit: I am nearing ‘retirement’ when I will do less software development and more writing. I am seriously considering switching to a really nice Chromebook since that would also meet my needs - so I am not permanently married to Apple.


> hardware quality & support (if my Apple laptop has an issue, I can take it into an Apple store and get it fixed same-day in many cases)

That kind of depends on the country you are in, I guess. Here in the Netherlands you first need to make an appointment (~1 week), then they order the parts (~2-3 days), then they fix the issue (~1 week). All in all you are without your laptop for 1.5 weeks. That is really lousy support compared to something like Dell's next-business-day on-site support.


I used to be really happy with AppleCare, which I've always paid for. Lately though it feels a lot more adversarial interacting with the genius bar people, convincing them there's a real problem that needs to be fixed. It used to be they'd quickly fix a problem without much fuss but now they seem cagey and more arrogant than I remember them being. My last encounter with a new model MacBook Pro, they claimed it was my fault the keys were sticking! I had to escalate and consume several hours just to get them to fix the badly designed keyboard. I hear great things about Microsoft extended warranties and find myself wishing Apple was more like they used to be.

I'm in the USA.


(Also in The Netherlands.) That depends on what you want, where you have it done, and how you plan it. I usually just iMessage or call the nearest APSP or Apple Business thing, whatever its name is, tell them what the problem is, and we'll have a same-day contact at their location to run it through ASD2 and order the parts in GSX; we take the hardware back, and as soon as the parts arrive, which is anywhere between 1 and 4 days, we get a message or call for a same-day repair. And that's the process for one-off issues; for business use we just migrate the user to a replacement laptop (i.e. using Migration Assistant or Time Machine and a TB2 or TB3 cable), which takes about 1 hour, and then the user is done. On our side, we then mail it to whatever service provider we have under contract and we'll get it back once it's done.

I suppose the difference in handling is that other vendors often have a wide open parts chain, and Apple doesn't. You can't buy them as a consumer or user, and as an APSP or AASP you can only get parts when ASD2 says there is a reason to, and you can only order them via GSX. In some cases, for example when it's a big item like a screen assembly or a mainboard, you have to send the item in first. All of that stuff takes time, and for many users, that is a reason to get next-day AppleCare support so you don't have that problem. You just get a replacement sent or someone at your door. Yes, that exists, just like with Dell and HP.


> to run it to ASD2 and order the parts at GSX, we take the hardware back and as soon as the parts arrive

The AASP I contacted could not return the hardware until the repair was done, so bringing it in when the parts arrived was impossible. The Apple stores need 1 week for repairs nowadays.

> a reason to get next-day AppleCare support (...) Yes, that exists, just like with Dell and HP

Only for enterprise users, and it sets you back 400 EUR/yr. Dell charges 190 EUR for 3 years (in total). Quite a difference.


- Figma works very well as an online replacement for Sketch
- Spotify has worked perfectly in Linux for a long time
- ThinkPads have the best support ever. They even came to my home to fix it the same day

I used to have Macs but made my way back to Linux (Manjaro) and happiest ever


Spotify doesn't work "perfectly" in Linux when it comes to high-DPI displays - fonts in some parts of the UI scale correctly while others don't.


Spotify doesn't work perfectly in Linux; it's always lagging behind and does weird things. Take for example a locked screen with Spotify in the background. When you wake the screen and log in, Spotify has often decided to curl up in a corner of the screen, because apparently it tried to get the screen size from a locked, sleeping screen, which is 0. Or maybe it was one of the 10 APIs between Spotify and the screen data it wanted...


If you have 1-2 rarely used apps that are keeping you, you can run OS X in a VM. Legally, if you're on a Mac.

In the hardware dept, people are always bringing up the issue that Linux doesn't work well on all the random laptop hardware. It's true but it's not exactly a disadvantage compared to OS X, which works well on a much smaller set of hardware.


Except if you want to do any pro audio work. That won't ever happen in a VM because of the added latency. Even using Photoshop efficiently in a VM is tiresome, because VirtualBox, at least, breaks key/mouse combos.


> you can run OS X in a VM. Legally, if you're on a Mac.

This is the infuriating part. You can buy an os x license - but only to run on Apple hw. So you can't legally run it on other hw.

Not sure if this is really enforceable here (Norway) vs consumers due to somewhat strong consumer protection - but it's probably enforceable against businesses, which is bad enough.


Your mobile integration comment is incorrect. That's exactly what KDE Connect does nowadays, and it works perfectly with Android.
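For anyone curious, that integration is scriptable too: the KDE Connect package ships a kdeconnect-cli tool. A quick sketch (the `<device-id>` below is a placeholder you get from the device list):

```shell
kdeconnect-cli --list-devices                       # show known devices and their IDs
kdeconnect-cli --pair -d <device-id>                # pair with a phone on the same network
kdeconnect-cli --ping -d <device-id>                # pop a test notification on the phone
kdeconnect-cli --share ./photo.jpg -d <device-id>   # push a file to the phone
```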


> touchpad drivers (libinput and wayland are making progress on this front, but the experience is still not on par with Windows Precision and macOS

This is the only thing I need before I move to Linux on my laptop. Until then I'm sticking with Linux on my desktop and MacOS on my Macbook.


Spotify works perfectly on Arch.

I have been using 1password with wine for over 3 years now.


There’s also the 1Password X browser extension.


You can also bring any business-class laptop to an authorized repair place if you want to do bring-in (which would be what I do for an Apple too, since the next official Apple store is quite a few hours away), and Apple not offering on-site support is a negative point for them IMHO. But these things depend on the country/how well run the respective service network is.


> hardware quality & support > hardware support > drivers

These aren't Linux issues, they're hardware issues. Pick up a decent-quality laptop which lots of developers use and support is rock-solid. Pick up a cheapo Win10 plastic thing, and yes, nothing will run great on that.


In this article, one of the first steps was replacing the WiFi card. This is high-quality hardware with good developer support - exactly the type of hardware you'd want to use.

And the first step was to open up the laptop to swap out a part!?!?!


Unable to open the article at the moment, but likely it got replaced with a Broadcom. macOS has a very limited set of supported wlan drivers. It is of course not ideal, but understandable (and related to the issue of drivers for macOS in general).


You’re right... it did.

I’m sure Linux probably supports the built in WiFi card on this laptop too. But I remember having to do just this same thing with an older laptop to get Linux to support the WiFi chip.


I have a lot of reasons but I'll share the one that always comes first. I have never had a Linux laptop that would both reliably run three monitors and proper power saving when solo on battery.

Every response I ever get is always so disappointing too because it's usually people who think your time is worth nothing and they casually want you to spend hours screwing with Bumblebee or whatever. And when I do build up the patience to actually try it ends up breaking something else about my graphics.

Sorry this is a sore spot for me. I do robotics so using a Mac for years was painful when I had to run ROS, but everything else about it stayed completely out of my way.

Ubuntu is fantastic only when I low-pass my actions, avoiding anything slightly edgy, unusual, out there.

Oh my god I had to compile mouse drivers for my Logitech MX to make the wheel stop screwing up. It's a mouse!!! They were invented in the 20th century!

My Bluetooth driver just crashes once a week. Silently. My mouse just stops working. I have to power it off fully because a reboot won't do it.

I lived in an apartment surrounded by 50 or so wifi networks. I would have to toggle wifi on and off over and over the first time connecting, because the damn list wouldn't show them all and mine wasn't on the list!

I've worked in product and while I still think so much about product is just utter nonsense, some of it isn't, and it's those practical parts that seem lost on so many engineers.

Why can't I make the damn Ubuntu software update thing not appear every day? I've tried to kill it with Stack Overflow answers a few times.

Why do I have to pick between chrome having tiny controls and the rest of my UI having massive controls on my 4k monitor?

I don't care about what brand it is. If it's cool. If it's open source or free. I just want it to all just quietly work and stay well out of my way.

No actually I'm not done yet. Why do I have to tell it every time what kind of headphones I just plugged in? And when I leave sleep it forgets and starts blasting 80s eurodisco in the office even though they're still plugged in?

Okay I think I'm done for good this time. Just off the top of my head the things that make me feel like I'm trying to become a master musician by practicing 10 000 hours on a kazoo.


I'm running Manjaro (Arch) on a nine year old X220 with almost the same specs as the author's. I'm getting 4-6 hours on a 4 cell battery (70% remaining lifetime capacity), which is almost two hours more than what I was getting with Windows 10.

Aside from needing to write a quick bash script to get the settings for my trackpoint to stick with systemd after reboots, it's been absolutely rock solid.
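For reference, the usual shape of such a fix is a small script that writes the TrackPoint's sysfs knobs, run from a oneshot systemd unit at boot. This is an illustrative sketch, not the commenter's actual script; the serio path and values vary per machine (find yours with `find /sys -name sensitivity`):

```shell
# /usr/local/bin/trackpoint.sh  (illustrative values)
echo 220 > /sys/devices/platform/i8042/serio1/serio2/sensitivity
echo 120 > /sys/devices/platform/i8042/serio1/serio2/speed

# /etc/systemd/system/trackpoint.service
# [Unit]
# Description=Persist TrackPoint settings across reboots
#
# [Service]
# Type=oneshot
# ExecStart=/usr/local/bin/trackpoint.sh
#
# [Install]
# WantedBy=multi-user.target
```

Enable it once with `systemctl enable trackpoint.service` and the settings stick after reboots.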

I don't know which laptop you're having issues with, but a $120 used Thinkpad (X/T/W430+ for dual external monitor support) and any modern linux distro will work out of the box with little to no fooling around and will keep up with almost any new laptop in terms of performance, reliability, and battery life.


Almost all of these problems are coming from using either Ubuntu or Unity/Gnome. Don't use that stuff. Try Manjaro, Fedora or Mint and try it with KDE, Cinnamon or whatever else. The difference is huge.


I tried a different distribution, but one that had the option to install with KDE or Gnome, and I concur with your opinion of Gnome, because I tried it first and eventually had problems.

But one of the problems I attribute to Linux more generally, is that if you Google for information on something like KDE vs. Gnome, you tend to find discussions where people get into the weeds on things very removed from "which one works and lets me get things done?". The threads are more like sports fans debating the merits of their teams while trying hard to be civilized about it. Which is how I ended up not trying KDE first even though that was the obvious default.


I switched from Linux to Mac about a year ago and something that is underrated in MacOS is the _consistency_ of the UI.

Not just the visual appearance, but for example keyboard shortcuts are relatively universal; Cmd + , is the shortcut for Preferences/Settings in Safari, Chrome, iTerm, Slack, NewsReader etc. I don't know if that would even be possible on the leading Linux DEs.

I love the customizability of linux, but I've decided that a good user experience goes beyond the appearance, and what's required is putting serious thought into user workflows. Those sorts of usability studies don't seem to be a priority for most FOSS projects.

There are things about Linux that I miss on Mac, but overall I prefer a system that's consistent to one that ticks every single box.


> Not just the visual appearance, but for example keyboard shortcuts are relatively universal; Cmd + , is the shortcut for Preferences/Settings in Safari, Chrome, iTerm, Slack, NewsReader etc. I don't know if that would even be possible on the leading Linux DEs.

Kinda funny, but the inverse is true for me. I am forced to use a Macbook at work and shortcuts are a frustrating thing. Do I have to use Cmd + <x> or Ctrl+ <x>?

Considering both Windows and Linux follow almost the same shortcuts, Mac is the odd one out here.


> Kinda funny, but the inverse is true for me. I am forced to use a Macbook at work and shortcuts are a frustrating thing. Do I have to use Cmd + <x> or Ctrl+ <x>?

No reason to be confused. You need to use Cmd + <x>, because off the top of my head Ctrl is involved only in text-editing shortcuts (what GTK+ calls the "emacs theme") and some obscure stuff.


Switching between CMD/Ctrl is really weird for someone who has used Win/Linux all their life, where most of the major shortcuts use Ctrl.

To be fair I wouldn't really count this as a negative, this is probably very far down in my list of gripes I have against Apple devices.

It's just that it's equally pointless when a Mac user complains about shortcuts in Win/Linux. Even in this thread, you can see most of the complaints boil down to "It doesn't work like the way it does in <Insert the system you are used to>"


As a cli user, this is a big positive for me. I can use ctrl-c for "terminate program", while cmd-c continues to work for copy.


This is a regular frustration for me under Linux. The shortcuts for Cut/Copy/Paste feel so standardized that changing them in one particular instance feels like having the functions of the brake pedal and trunk lever being swapped only while driving on a particular road.


I switched to MacOS last year for work and one of the first things I did was remap keys to swap CTRL and CMD. Now I press CTRL-C to copy on Windows or Mac. It makes it a lot easier to switch between machines
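One way to do such a swap from the command line is macOS's hidutil key remapping (documented in Apple's TN2450); the commenter may equally have used System Preferences > Keyboard > Modifier Keys. This sketch swaps Left Control (HID usage 0xE0) and Left Command (0xE3):

```shell
hidutil property --set '{"UserKeyMapping":[
  {"HIDKeyboardModifierMappingSrc":0x7000000E0,"HIDKeyboardModifierMappingDst":0x7000000E3},
  {"HIDKeyboardModifierMappingSrc":0x7000000E3,"HIDKeyboardModifierMappingDst":0x7000000E0}
]}'
```

Note the mapping resets on reboot unless you wrap it in a LaunchAgent.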


This reminds me of all of the discussions people had about keyboard shortcuts when Windows 8 first came out.

It was hard to consider it a valid selling point then, so I'm not really sure why people would consider it to be a valid selling point now for MacOS (although I do get why people would find them useful).


Talking about keyboard shortcut consistency on macOS, what do you think about Home/End/PageUp/PageDown (or Cmd/Fn + Left/Right/Up/Down), especially combined with Ctrl/Shift? The behavior varies dramatically across different apps. It's a nightmare compared to Windows/Linux.


For the most part, the only variance comes from non-native (usually Electron) apps. Keyboard navigation in text fields is perfectly consistent between native Cocoa apps because there it's implemented at the toolkit level, so devs get it for "free". If some cross-platform app doesn't follow suit, it's because their developers took the easy way out and ignored platform conventions.
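Those toolkit-level bindings are also user-configurable: Cocoa's text system reads a key-bindings dictionary, so for example Home/End can be made to jump within the line in every native text field. A sketch (the selectors are standard NSResponder actions; this only affects apps using the Cocoa text machinery):

```
/* ~/Library/KeyBindings/DefaultKeyBinding.dict */
{
    "\UF729"  = moveToBeginningOfLine:;                    /* Home       */
    "\UF72B"  = moveToEndOfLine:;                          /* End        */
    "$\UF729" = moveToBeginningOfLineAndModifySelection:;  /* Shift-Home */
    "$\UF72B" = moveToEndOfLineAndModifySelection:;        /* Shift-End  */
}
```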


Not just non-native apps. Try Sublime Text vs. TextEdit.app, iTerm2 vs. Terminal.app, ForkLift vs. Finder.app. The functionality of Home/End/PageUp/PageDown goes well beyond navigation inside text fields, and unlike text-field navigation, the OS generally cannot enforce application behavior in these cases. Developers on Windows/Linux all follow the same conventions, while Mac developers each go their own way.

And why excluding non-native apps? They count for a large portion of app ecosystem nowadays and they don't have the same problem on Windows/Linux.


My bad, I thought you were speaking only in the context of text fields. I don’t really use those shortcuts outside of text manipulation.

As for why exclude non native apps... it’s a personal preference thing, and I don’t enjoy using them. I avoid them whenever I can and will opt for a native alternative where possible, even if it means shelling out some cash.

In fact, I’d say that’s one of my gripes with desktop Linux — for some things, an Electron app is the only option and nobody cares to develop a GTK+ or Qt alternative. I’m a big believer in using the native UI toolkit, regardless of platform.


I use Linux on a desktop at work, and like it just fine. I don't feel strongly about it, it's a tool, and it works plenty well. If someone installed macOS on it, I think that would be fine too.

But laptops are a whole different story. The last time I tried using Linux (and Windows too, actually) on a laptop, I couldn't get through a day without wishing for the touchpad from my (2015 vintage) macbook pro. Most touchpad hardware sucks to begin with, and installing Linux on the MBP itself still sucked due to drivers. My computer is sadly getting long in the tooth and I definitely won't be getting one of the new MacBooks after reading people's experiences with them, so I think I'll have an opportunity to try out a Linux laptop again soon, but I'm more dreading than looking forward to it. Maybe I'll be pleasantly surprised.


>so I think I'll have an opportunity to try out a Linux laptop again soon, but I'm more dreading than looking forward to it. Maybe I'll be pleasantly surprised.

The current crop of laptops isn't particularly good. They are all plagued by various thermal issues due to the quad-core CPUs in ultrabook form factors. Even ThinkPads have suspend/resume issues right now. Avoid the thin ones. Avoid Nvidia. Save some money, and go with something a bit older.


I dread touchpads in general. Now that I've gotten used to a TrackPoint (specifically the IBM/Lenovo variety), it's far more efficient and comfortable particularly over longer periods of use for me.


I hate the ergonomics of portable computers in general. I always marvel at people who hunch over a little laptop when they don’t need to. I know some people truly need to do portable computing, but I view a desktop as the ultimate luxury.

If I need to suffer a laptop, then the TrackPoint is head and shoulders better than any giant Apple glass slab. For all other use, a $20 Logitech Marble Mouse is supreme bliss.


Yeah maybe I'll try that out. It just seems so different. But maybe different is good.


I hear you. I did the switch from MacOS to Linux about ten years ago and currently, if an employer forces me out from Linux it's time to look for a new job.

What would be painful for me is to lose my i3 environment with the perfectly tuned keyboard setup, all configuration stored in a version control, including the system software through the declarative NixOS configuration.

Basically getting a new laptop for me is to boot from NixOS USB stick with my configuration, run the installer and boot to the familiar i3 desktop I've been using for years. All the software including the right configuration will be there. Nothing will change through a whimsy update that introduces some new UI feature that will break my workflow.
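For a flavor of what "declarative" means here, a minimal configuration.nix fragment might look like this (the packages and user are illustrative, not the commenter's actual config):

```nix
# /etc/nixos/configuration.nix
{ pkgs, ... }:
{
  services.xserver.enable = true;
  services.xserver.windowManager.i3.enable = true;  # the i3 setup mentioned above

  environment.systemPackages = with pkgs; [ git neovim firefox ];

  users.users.alice = {          # hypothetical user
    isNormalUser = true;
    extraGroups = [ "wheel" ];
  };
}
```

`nixos-rebuild switch` applies it, and keeping this file in version control is what makes the "boot installer, point at config, done" flow possible.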

I also really like the ThinkPad hardware, especially the perfect keyboards.


That whole commercial software thing.

Why would I futz around with Linux when I can get a Mac that has an operating system tailor made for it and a true Unix environment when needed? Only on the Mac can I develop for every mainstream desktop and mobile OS.


You're literally supporting a company with bullshit policies that every developer would take issue with if it were Microsoft.


Which “bullshit” policy is that?


Presumably the policy that iOS apps can only be built on Macs. It is actually against ToS, it is not just that Apple doesn't provide devtools on other operating systems.


That’s not true either. Again there is an existence proof - Xamarin.

You can develop on Windows.

https://docs.microsoft.com/en-us/xamarin/ios/get-started/ins...

And use Microsoft’s hosted build environment for up to 40 hours a month for free.

https://docs.microsoft.com/en-us/vsts/pipelines/agents/hoste...

The whole kerfuffle you are referring to is about 7 years out of date.


So either:

> Only on the Mac can I develop for every mainstream desktop and mobile OS.

is not true, or it's a bullshit policy. Pick your poison.


How is it “Apple’s policy”? Apple chooses not to write a tool chain for Windows or Linux. If a third party wants to make one, they are free to do so.

The post I was responding to said “it was against Apple’s TOS”, probably referring to a policy that was rescinded back in 2011.

There is a big difference between Apple not allowing something and Apple not supporting something. I choose to write native apps for the platforms I care about. If I wanted to use Xamarin I could.


To build a Xamarin app for publishing to the iOS App Store, don't you need a Mac for at least one stage? It looks like you do, under the "Archive for Publishing" section here:

https://docs.microsoft.com/en-us/xamarin/ios/deploy-test/app...

It's also against ToS to virtualize OSX on non-Mac hardware as far as I know.

I don't think they are just doing that for fun.


>Only on the Mac can I develop for every mainstream desktop and mobile OS.


How is that a “bullshit policy”?


Because it's by design. It's not that it is difficult to add an aarch64/iOS backend to your favourite compiler. Making the artifact provisionable to an emulator (which you would get on macOS only anyway, because what's inside is copyrighted and thus non-distributable) or to a physical device is another game entirely.

Apple is locking you in and you reward them for that.


There is no such thing as an “iOS emulator”. The iOS simulator uses an x86-compiled version of your code and links it against an x86 build of the iOS frameworks.

Again there is an existence proof you can deploy code directly from Windows to iOS devices....

https://techcrunch.com/2017/05/11/microsoft-now-lets-ios-dev...

Microsoft tells us that it talked to Apple about this functionality and that it has its rival’s blessing and that the Live Player application complies with all of Apple’s usual rules.


> There is no such thing as an “iOS emulator”. The iOS simulator uses an x86 compiled version of your code and links to it an x86 build of the iOS framework.

Now you are playing word games, just to avoid the point.


> What matters is, that you cannot run it on another OS,

But as Microsoft has just shown, you don’t need to run “their” emulator to develop on another platform.

> because Apple does not supply one. You cannot supply your own,

Yet Microsoft did just that....

> because the emulator/simulator/whatever has to run Apple code inside, and you don't have permission to distribute it, even if you had the inclination to make your own emulator/simulator/whatever.

Microsoft had the inclination to do just that....

> So OK, Microsoft made an agreement and can now provision iOS packages. They had to find a way to test-run them somehow, and their solution won't allow you to write against native frameworks, only against theirs, which they can run on their platform.

So first you said that there was no way to run on any OS, you couldn’t write your own emulator, etc., but now that I’ve shown you that you can do all that, it’s that you can’t write against the native frameworks.

Guess what? You can’t write the same binary against different versions of Linux.


> Yet Microsoft did just that....

No, you run the .net code on the .net runtime. If you use UIKit namespace, for example, you will run with the dotnet shim.

> Microsoft had the inclination to do just that....

Because Microsoft doesn't have emulator or simulator. Not anything comparable with what Apple has, or what Google has (and what Microsoft is shipping for Android).

> So first you said that thier was no way to run on any OS, you couldn’t write your own emulator, etc. but now that I’ve shown you that you can do all that, now it’s that you can’t write against the native frameworks.

It is not a first-class solution.

> Guess what? You can’t write the same binary against different versions of Linux.

Actually, you can.


> No, you run the .net code on the .net runtime. If you use UIKit namespace, for example, you will run with the dotnet shim.

And when you run your code on the iOS simulator, you are also not running against the same code that’s running on your phone...

You aren’t running ARM code.

> Because Microsoft doesn't have emulator or simulator. Not anything comparable with what Apple has, or what Google has (and what Microsoft is shipping for Android).

When you run your code on your desktop, you run in a simulated environment - the Xamarin runtime on top of Windows. When you run your code on macOS, you run on an x86 version of the iOS runtime. By definition, you are running a simulator.

> It is not a first-class solution.

So in your experience using both, what capabilities were you missing?

> Actually, you can.

You haven’t had to ship C binaries to various Linux distributions....

For instance, the official MySQL driver for Python is not pure Python. Even if you are on Linux, you can’t just do a pip install from your box, create a zip file and guarantee it will work. You need to use the package built for whatever version Amazon Linux is based on....

https://stackoverflow.com/questions/35763380/problems-using-...

There are plenty of compatibility issues distributing binary builds for different versions of Linux.

Not a big deal, I use AWS CodeBuild to create my distribution anyway.


> And when you run your code on the iOS simulator you are also not running against the same code that’s is running on your phone...

> You aren’t running ARM code.

You are running the same framework by the same author, which guarantees that they behave the same to some extent. In the same way, GNUstep is not an adequate replacement for Cocoa.

> When you run on your code on your desktop you run on a simulated environment - the Xamarin runtime on top of Windows. When you run your code on MacOS, you run on x86 version of the iOS runtime. By definition, you are running a simulator.

Are you still interested in arguing in circles? I'm not, it makes no sense and is a waste of time. You demonstrated that you are not willing to understand the point, you just want to win an argument on the internet. Go find someone else for that.

> You haven’t had to ship C binaries to various Linux distributions....

Now it is C binaries... It was just binaries before. You can move Go binaries with no problem, the cross-compiled ones too, because they do not import symbols willy-nilly.

You can do that with C binaries too, but you have to take care. Just make sure you know what you are linking against, bundle it if it is not present on the target, or just link statically. It's no problem for a competent developer who knows what a linker is and can maintain ABI hygiene.

If you think that is something complicated, go set up Apache with mod_wsgi on Windows and you will run into the same problem. There you have to align the right Visual Studio version (because each msvcrt version has its own ABI) with the Apache version and the Python version, otherwise all hell breaks loose. Of course, all the native Python extensions you are going to use will have to be compiled with the same Visual Studio version. Yet nobody says it is impossible to distribute binaries that run on all Windows releases, mainly because it has nothing to do with the specific system, but with the apps you choose to run and how they are built - it all boils down to ABI hygiene.


> You are running the same framework by the same author, which certifies that they behave the same to some extent. It's the same way, that Gnustep is not an adequate replacement for Cocoa.

Have you done mobile development?

If you run on top of the iOS simulator, you still don’t have any “guarantees” that things will run the same way. I’ve been doing mobile development since before the iOS App Store ever existed (Windows Mobile); you always had to run on the real device, and the behaviors and the performance characteristics were nowhere near the same.

When you are developing with Xamarin, you smoke test on the simulator at first and then you run on the device using the Xamarin player - the same way you are going to deploy your code.

> Now it is C binaries... It was just binaries before. You can move Go binaries with no problem, the crosscompiled ones too, because they do not have problem with importing symbols nilly-willy. You can do that with C binaries too, but you have to take care. Just make sure you know what are you linking against, bundle it if it is not present on the target or just link statically. It no problem for a competent developer, who knows what a linker is and can maintain ABI hygiene.

Excuse me for referring to the most popular language for writing native code (and the language that Linux itself was written in) instead of specifying that I’m not talking about Go of all things. Almost all non-C languages that I’m aware of have a way to call native C code.

Your “solution” for distributing C code is “statically linking”? So if I have four python modules that all have native dependencies they should all ship with statically linked libraries? Not only will my code be larger, so will the memory requirements. If you have four processes all using the same dynamically linked library, the library is only in memory once, with statically linked libraries, you are increasing your code size and your memory footprint.

How well do you think that will work with the 128MB VMs that you get to run Lambdas? Or do you think I should just spend more on higher-capacity Lambdas? If I wanted to use heavyweight VMs and Docker containers, I might as well use Windows.

But throughout this whole thread you’ve shown that, when it comes to developing mobile applications, you’re speaking hypothetically while I’m speaking from the perspective of doing mobile development for over 10 years.

I’ve even played around with J2ME development.

The largest issue is that you’re over-indexing on the value of the simulator. Mobile simulators have never been a good way to validate the functionality, user interaction, performance, etc. of a mobile app. They are pretty good for development, and okay for negative testing (if it doesn’t work on the emulator it probably won’t work on the device), but you can’t depend on them for positive testing.


> Now you are playing word games, just to avoid the point.

It’s not a word game. Both of your assertions were incorrect. You don't need Apple's proprietary “emulator” to develop apps for iOS - Xamarin is proof of that - nor is there a “policy” against developing or deploying iOS apps without a Mac - again, Xamarin is proof of that.

Calling Apple's simulator an emulator is just as technically incorrect as calling Safari with its window size set to match a mobile screen an “emulator”.


In the context of this debate it does not matter, whether it is emulator or simulator underneath.

What matters is, that you cannot run it on another OS, because Apple does not supply one. You cannot supply your own, because the emulator/simulator/whatever has to run Apple code inside, and you don't have permission to distribute it, even if you had the inclination to make your own emulator/simulator/whatever.

So OK, Microsoft made an agreement and can now provision iOS packages. They had to find a way to test-run them somehow, and their solution won't allow you to write against native frameworks, only against theirs, which they can run on their platform.


And you think that developing on Android is better? The emulator - x86 code emulating ARM, running a Java virtual machine - is that giving you any “guarantees” (your words in a previous post) on how it will work on the device?


The answer is simple: neither the Adobe suite nor Cubase runs on Linux, both of which I require for the side gig that keeps me from going crazy. Additionally, color accuracy and configuration are broken beyond repair in both X and Wayland, especially in a multi-monitor setup.


Every couple of years Apple does something that makes me think "maybe it's time to try Linux". I'm competent with Linux. I use it daily at work and can manage the vast majority of the time. I have, however, on occasion dealt with a catastrophic failure after some normal suggested upgrade. Catastrophic as in I can't recover from it, or the best suggestion is to roll the kernel back, which is a very foreign concept.

Still, I might try the next time I'm up for getting a new laptop. I'm not pleased with the way Apple is going with their laptops, and I'm not interested in Windows.


> and I'm not interested in Windows

That seems to be the general consensus around here. If you're a dev, you either run MacOS or Linux. As a dev who runs Windows and MacOS I don't really get this. Windows 10 is excellent for dev work. I understand there are privacy issues that make people not want to use MS products, but from a purely functional perspective it's a great user experience.


I use Windows 10 exclusively at home and it's indeed pretty good but there are unfortunately a few things that would stop me at work:

* There's now a major upgrade every six months and Microsoft still hasn't figured out how to make the process reliable.

* Chocolatey is good but the available packages are sometimes lacking. Linux distros have always been miles ahead in this regard, and nowadays with snap I can even find third party software that wasn't packaged before.


Actually I find choco to be miles better than apt-get. With apt-get I often have to add repos before I can install some app (usually something which fixes some basic part of Linux). This, for me, negates the whole point of having a package manager. What good is it if I have to tell it where to look?


If it is in third-party repo, it means that the distribution maintainers do not maintain/test it. A third party does.

When that third-party package replaces something that is a part of the system, you are going to have a difficult upgrade next time the new distro release is out, unless the third party ensures smooth transition. Do you have that trust in them?


Nope. I'd rather use windows than worry that I'll break my OS by installing a third party app


You can break Windows by installing third-party apps too, or with a well-aimed registry edit.

All systems can be broken by an incompetent user with the root or administrator password.


In what distro did you experience the upgrade problems?


My solution is combining the best of three worlds. I am running ChromeOS on a MacBook Pro 13, so I have the Mac's touchpad, keyboard, Retina screen and durability. It gets a little hotter with ChromeOS, but battery life is generally as good as under macOS.

The graphical environment is ChromeOS, which is basically just Chrome and extensions. Yes - HBO Go, Spotify, Inbox and Photos can be used as apps, but they still run in a browser.

And with crouton I have a Debian stretch CLI, which is just that - Debian: solid, stable, and with a solution to most problems.

This setup is amazingly portable. I make regular backups with crouton and keep a USB stick with the ChromeOS installer on my keychain. I am currently on summer vacation and left my Mac at home, so I just installed ChromeOS on a 10-year-old Acer laptop, Google downloaded all my documents and extensions in 15 minutes, and I installed crouton and recreated Debian from a backup.

In 15 minutes I had been up and running in my home environment. Of course - touchpad on Acer sucks compared to Mac, laptop gets hot and screen estate is 1/4 of Mac's Retina but every single setting in environment is 1:1 with my Macbook.

I have never been heavy user of macOS native apps and I have slowly moved from Apple/iCloud walled garden to Google (Photos, Drive etc.) so jumping ships had been painless for me. My memories of macOS is of never fixed annoying glitches, heavy 'obligatory' updates and tons of unnecessary files cluttering my drive.


How do you install ChromeOS on any old laptop? Last I looked into that it wasn't really possible.


It's really easy using one of Neverware's builds (but you won't have dual boot; they have removed that option):

https://www.neverware.com/freedownload


One word: HiDPI.

Details: try a HiDPI display and a non-HiDPI display both connected to the same PC. Nothing but macOS handles this well. Even Win10 struggles with windows that straddle the boundary or things dragged across it.


HiDPI works well on the latest GNOME with Wayland. I use a 4k display at home with the amdgpu driver and it works great. I have also hooked up a non-4k Dell Ultrasharp display and with Wayland, both displays can use different DPIs.

Qt and Tk (since 8.6) applications also scale nicely under GNOME 3.

The only glitch is that currently only integer scaling works well.


Do you have a write-up or a link on how you managed to do this? I'm currently struggling with a similar configuration: an XPS 9570 with an integrated 15" 4K screen and a non-4K external screen. This is literally driving me crazy, as each app on GNOME seems to handle scaling slightly differently!


Not much to write up. This was on Arch, I just installed GNOME (on Nix now, so this is from memory):

    sudo pacman -S gnome        # install the GNOME group
    sudo systemctl enable gdm   # enable the display manager at boot
    sudo systemctl start gdm
GNOME Wayland is the default session. Log in, press the Super key to open Settings, go to Displays and set scaling to 200% on the 4k screen. After that, everything worked fine. Since I use the non-4k screen vertically, I had to rotate the second screen in the settings.

The last time I used Fedora it worked without extra work (besides setting scaling to 200% in the Display settings).

Things only started working well with GNOME 3.26 and very well with 3.28.

In any case, you have to make sure that you are using Wayland, support for multi-DPI screens on X.org has been terrible for me. Even a single 4k screen is not very consistent in X.org (e.g. the mouse cursor sometimes becomes tiny when moving between windows).
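If things still look inconsistent, a quick sanity check and an experimental knob may help (the gsettings key below is the mutter flag as of GNOME 3.26+ — treat this as a sketch and verify against your version):

```shell
# Per-monitor DPI only works under Wayland, so confirm the session type first
echo "$XDG_SESSION_TYPE"    # a working setup reports: wayland

# Experimental: allow fractional (non-integer) per-monitor scaling in mutter.
# This flag is known to be rough around the edges; integer scaling is safer.
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
```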


It really depends on what you do and what you use your machine for. If you are using Vim to develop C CLI applications, and you use your iPad for Facebook/messaging/etc., then Linux will work for you.

If you use your Mac to manage email, calendars, Skype, Chrome, etc., then you are better off with a Mac and OS X.


It takes two is what I've found. I build software on a System76 laptop running 18.04. That's pretty much a perfect dev environment with the advantage of physical function keys -- kind of essential if you spend lots of time remoting using byobu. I keep a MacBook for Keynote, business docs, conferencing, and to carry to customer sites for presentations.

When I developed on a Mac, it used to be nice to have an integrated environment. I really liked having the iPhone/iPad/Mac ecosystem available even right on my working dev laptop. But dropping function keys, along with longer update cycles, indicates my workflow isn't Apple's priority. I've found the transition really easy, and I now think of Apple devices as appliances, not computers I can program, which is what Apple wants me to think anyway.


> I'm honestly confused as to why a developer in this community would even think of not running Linux full time

Well I'm a developer (if not 'in this community'). I don't run Linux on the desktop because when I have I've just found it too much trouble in so many ways. It took up too much of my time. Working full-time as a developer already consumes as much of my life as I'm willing to spend confined to a screen and keyboard. I know a dozen folk will chime in with the automatic responses informing me that it takes no more time to run than macOS or Windows. But in my experience that's just not true, and, frankly, the OS I use isn't of much concern to me. It wouldn't get anywhere near my "1000 things I'm most concerned about in life" list.


I've watched many of my coworkers spend huge amounts of time (many hours a month) fiddling with the complexities of managing software packages on OSX systems. Everything was so frighteningly manual and impossible to reproduce. They easily each used 50x as much time doing that as I spent fiddling with my Linux installation. Those are levels of effort that definitely matter enough to me that I put importance in the OS I use.


As both a Linux and macOS user, on both the desktop (I even dual boot Linux and macOS on the Mac Mini) and the laptop, I find Homebrew satisfies my needs for third party software package installation.


There goes the automatic response. I really don't care what your coworkers do, because I'm not amassing evidence for a general proposition (if I were interested enough in that I'd go and do a PhD on it). I have used OSX/macOS for several years. It has used close to zero of my limited time to maintain. That's what I want from an OS.


Can we also mention those hacky package managers, brew and MacPorts?

Things are installed all over the place, and it's hard to keep consistent versions across different machines.

I think that a developer OS MUST have a proper package manager.


IMHO there is no better package manager than Homebrew, even compared to those on Linux. It's so flexible and so much less in your way. The one thing that sucks is that something often breaks when Apple does a major release. So on this point I agree; I would love it if Apple shipped it directly. But it's understandable why they don't, I guess.
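On the earlier point about keeping versions consistent across machines, Homebrew's bundle subcommand helps (`brew bundle` is a real subcommand — on older Homebrew you may need `brew tap homebrew/bundle` first; the Brewfile path is just a convention):

```shell
# Snapshot everything currently installed into a Brewfile
brew bundle dump --file=./Brewfile

# Replay that Brewfile on another machine to install the same set
brew bundle --file=./Brewfile
```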


I wonder whose decision it was back in the very early days of Mac OS X to not include a package manager—not even when Xcode first came out or with the Mac App Store.

I'm imagining now some grey-bearded dev at Apple getting immediately shot down for suggesting they build the Mac App Store on an open package-manager foundation.


To each their own. I switched from Mac to Linux last year and did some experimenting. Now I’m using Ubuntu 18.04 after committing time to both Fedora and Mint. I had good experiences with all of them but this latest Ubuntu with Gnome 3 has been great.

I’d also have a hard time going back to OSX now. At the same time, it’s not for everybody. This was something I wanted to do for me and I was determined to see it through. If somebody had pushed me into it I probably would have spent more time focusing on deficiencies.


Did you switch hardware too, or end up running ubuntu on a macbook pro?


I committed and bought a Dell Developer Edition laptop. I wanted support in case I had a problem and I wanted to know all of the hardware was built to work smoothly with Linux without having to research each item. I was torn between Dell and System76 but ended up settling on Dell because of previous great hardware experiences with them.

I’d held on to my 17” MBP until last January when it was clear they were sticking with their current direction on laptops.


For me it's mainly the touchpad interaction and UI performance. I never got satisfactory results for both when setting up a laptop with Linux (the Windows/Linux dual-boot laptop I'm using has even slightly more powerful hardware than the MBP). I need to switch between all 3 desktop OSes all the time though and wouldn't mind switching to Linux as the main OS, but all in all MacOS is the most enjoyable.


> If I had to shift to MacOS now it would be far far more painful than the shift from Mac to Linux.

Wholeheartedly agree.

I've been slowly moving from Linux (mostly Ubuntu 14.04) to High Sierra, and there are a few things that make absolutely no sense about OSX:

- There's a difference, in almost every piece of software, between closing the last window and quitting. In a browser, just pressing the "close" button would lose all your tabs, whereas "quit" will preserve them. That makes no sense to someone coming from Linux or Windows; I'm sure OSX users have gotten used to this difference, but ... it's clunky.

- OSX puts one small dot next to the icon of an app that's open. Unity puts one dot for every open window the app has (but no more than 3, even if you have more), _and_ a marker that tells you which app has focus (and whether it is on the current workspace or on a different one); also, middle-click on the icon universally opens a new window. I never realized how convenient and informative this is until I lost this feature with OSX.

- OSX litters all the (local) directories you visit with Finder preview files. It does it for remote ones too, but at least that can be turned off - not so for local ones. There used to be 3rd-party kexts to stop that, but not anymore. Gnome/Unity keeps the previews in one directory; Windows is almost as bad as macOS (except it can be turned off universally).

- Having used locally integrated menus for the past ~30 years or so, I find OSX's global menus uncomfortable. I've spent a while with Unity's global menus to try them out, and after two months of use gave up and went back to local menus. OSX doesn't give you that choice. (And yes, I know of Apple's studies that resulted in the global menu - I suspect the originals are no longer relevant, due to higher-resolution screens, a larger number of open apps on average, and familiarity with window systems. Does anybody know if these studies were repeated any time in the last 5 years?)

- "apt install" (and many of its GUI frontends) are years ahead of Mac OS app installations, whether app store or DMG, both in ease of use and in breadth covered; "brew" comes close, but no cigar yet; Haven't tried Nix, Fink or MacPorts yet.

- A kernel update that requires restart is uncommon on either system; less than once a month. Yet, on the Mac it's 5-30 minutes of downtime to apply, depending on the update; on Linux I have my desktop back in less than 30 seconds.

- keyboard shortcuts. "End" goes to end of document, "Home" goes to beginning; low frequency of use (for anyone I know). Beginning/end of line: Option+Left, Option+Right; high frequency of use. It's a "different strokes" thing, for sure, but even though the Mac is consistent, it is very weird compared to everything else going back 50 years, and not optimized.

Yes, OSX looks more polished, and even small utility apps are really gorgeous. But so far it feels that, functionality-wise, for a non-artist engineer like me, OSX is trailing.


> There's a difference, in almost every piece of software, between closing the last window and quitting. In a browser, just pressing the "close" button would lose all your tabs, whereas "quit" will preserve them. That makes no sense to someone coming from Linux or Windows;

MacOS distinguishes between closing an application's window and closing the whole application. Therefore, closing the last window will not automatically close the application as well. Consequently, in a browser, pressing the close button results in all tabs getting closed, with the browser still running. Quit, however, closes the whole browser and, hence, preserves the tabs. This is the action triggered by pressing the close button on Windows and Linux.

I can understand that this is counterintuitive coming from Linux or Windows. But it is a design-choice coherently implemented throughout MacOS.


I understand where it comes from, and in the days of floppy disks and 128KB memory, it made sense (I know, I'm that old). But it's been making progressively less sense over the last 30 years, with the advent of hard drives, faster hard drives, SSDs and GBs of memory.

It used to be that terminating a process you were going to reuse would cost you tens of seconds. That's no longer the case. The fact that it was a rational, consistent design decision made 35 years ago doesn't mean it still makes sense. Apple killed the headphone jack and replaceable batteries, two things which are even older than that - things change.


I've been bitten by this when Firefox once didn't give me back all my tabs on reopening. For earlier versions of macOS there's RedQuits, but it doesn't work now :(


Some of your issues make me wonder... i.e.:

>Beginning/end of line: Option+Left, Option+Right; high frequency of use.

on macOS you can just use the 30+ year old Unix convention of Ctrl+A and Ctrl+E to get to the beginning and end of every line, everywhere in the OS.

> "apt install"

That is because no generic consumer in the world actually uses a text-based package manager. That's only CLI users, Linux/Unix/BSD users and the like. At the same time, software distribution isn't a one-size-fits-all approach. In line with apt, on macOS there is a packaging format much like .deb, and it's called .pkg. It is compressed, has a manifest, some pre/post-install scripts and the actual files. The .app distribution method is much more like a snap or container, which is what the world has been moving to for decades. In one way, it's even more comparable to Nix, since applications and their dependencies live in their own location-independent namespace. On top of that, there used to be a package manager that actually provided apt and dpkg for macOS, but almost nobody wanted to use it because that method doesn't actually fix anything.

> There's a difference, in almost every piece of software,[..]

> OSX puts one small dot next to an icon of the app that's open. [..]

> OSX litters all the (local) directories you visit with finder preview files. [..] (<-- actually nothing to do with previewing anything)

Those three are pretty much all the same thing: preservation of state. By default, the concept is that it doesn't really matter if a piece of software is 'open' or 'running'; what matters is that when a user takes an action (i.e. starts browsing the web), it continues from the last state the user left it in. That is a really hard problem to solve, and something Linux will probably never be able to solve due to the differences in even basic GUI SDKs like GTK, Qt, wxWidgets etc., and that's not even considering X11, which is a whole different mess by itself (data exchange via atoms on window objects? wtf?). There is a setting that will undoubtedly become default where the dock will no longer show a difference between 'open' and 'closed', since the difference will pretty much be too small to be relevant (i.e. whether it is in the process table or not will be about as much of a difference as is left). Currently, there are a number of frameworks in place that do almost all of this: some are purely for managing the state of the UI, some are for document-based programs that offload revisions, storage etc. to the OS, some are for managing idle time (i.e. AppNap). The idea that an application is just a binary that hooks into the UI thread to dump some pixels on the screen hasn't been a reflection of reality on macOS for about 15 years. Having this information in mind might make all of your observations a lot more sensible.

Regarding .DS_Store files: they are data files containing the settings you set for a directory. So if you don't change anything for a directory, then no .DS_Store file is made. But when you say, change to icon view, move stuff around, go to list view, and setup custom sorting orders, then the OS will store your settings inside that directory. If you then open that directory on another Mac, you will get the exact same representation (be it via network, mass storage or target mode).
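For what it's worth, the network side is controlled by a documented preference (this is the setting Apple documents for network volumes; as noted upthread, there's no supported equivalent for local directories):

```shell
# Stop Finder from writing .DS_Store files to network shares
defaults write com.apple.desktopservices DSDontWriteNetworkStores -bool true

# Newer macOS releases also accept a USB variant -- verify on your version
defaults write com.apple.desktopservices DSDontWriteUSBStores -bool true
```

Log out and back in (or restart Finder) for it to take effect.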

> Global menus

This often boils down to taste, but regarding those studies: they were about 'how hard is it to aim', which is relevant today, even (to an extent) for touch screens. The idea is that the global menu is at the top, and if you throw your mouse pointer at the top of the screen, you are always at the global menu, no aiming required. You basically completely remove an axis of aiming from the process. You can't overshoot either, as the screen simply ends there. For 99% of users, this is much easier than trying to aim for a strip of pixels at a random location somewhere in the top half of your screen. On top of that, multiple menus make little sense: you can't use more than one, as you can only focus one; you only have one pointer; and you can't (on any implementation I've seen) use them without losing focus of your current window. Maybe this background information alleviates some irritation ;-)

> A kernel update

There are no stand-alone kernel updates on macOS, at least not by default and not planned. The type of updates that require a reboot are pretty much always those system updates, which are run as transactions instead of copy+replace. They aren't small or as insignificant as a single package either; most of them are comparable to complete distro upgrades like FC27 to FC28. And yes, I'm talking about those point releases (i.e. 10.13.3 to 10.13.4). Just because the semver delta isn't that high doesn't mean that on the scale of operations the content doesn't have a big delta either ;-) Most of them are a few GB in size and comparable to a reinstall of the complete OS (because again, it is a single, reliable transaction). Keeping that in mind, isn't it great that this can be done in under an hour, only breaks if you have made extreme system modifications, and basically doesn't touch your own user data or programs? While not-touching-anything-it-doesn't-need-to-touch is standard, just like it is on Linux packaging systems (well, most of them), doing a complete distro upgrade on the fly isn't always that easy and reliable on other operating systems. Windows mostly just fails or takes forever; on Linux distros you have to do things like 'sudo apt-get update; sudo apt-get upgrade; sudo apt-get dist-upgrade; sudo reboot;' and hope nothing broke. Same with yum and dnf: I've had simple upgrades (bare FC install upgrades) fail and leave it in an unworkable state, mostly when libc was upgraded, or kernels (and the modules failed to update in the initramfs), or when the packaging tools themselves were upgraded (when they break, good luck fixing anything).


Somehow your notes about global menus and kernel updates were not there when I first replied... either a late edit or a network glitch of some sort.

Re: global menus, as I mentioned, I'm well aware of the original Apple studies. I admit I'm a power user, but many professional (not necessarily power) users these days have more than one app open on the screen, and I've observed the pattern of "click app to get the right menu, then locate menu item" with a few co-workers -- whereas on Windows (and linux), they only have to locate the menu inside the right window.

When Apple did their tests, they focused on use in a single app, IIRC -- which, back then, was justified. That's why I wonder if anyone repeated that with a modern workload.

Re kernel updates: No, it isn't great that it can be done in under an hour. I often install ubuntu or debian from a very old DVD, and then, within just a few minutes, it is all up-to-date, kernel and all. Never failed once.

I haven't used dnf or yum in a while, so can't comment; I know Windows breaks. But debian-stable is rock solid, and even ubuntu is (with the exception of the OOM kernel that did bite -- but it still took just 5 minutes to update, and left you a copy of the older kernel to boot into if you wanted - which, once you got bitten by the OOM bug, you obviously did).

I agree Apple updates are miraculous if you are coming from Windows. But compared to debian ... not so much.


> on macOS you can just use the 30+ years old Unix convention of Ctrl+A and Ctrl+E to get to the beginning and end of every line everywhere in the OS.

... which appears nowhere in the keyboard help or shortcuts. I actually used this in shells, from muscle memory, but never actually tried in editors or input fields.

> > "apt install"

> That is because no generic consumer in the world actually uses a text-based package manager.

I did mention the variety of GUIs; I like the CLI better, but I know people who prefer Synaptic. Regardless of the interface, once you've decided to install a piece of software, modern Linux does it much, much better than macOS. It's possible that your use case requires packages unavailable in the repositories, in which case it doesn't matter - but the breadth of packages in the debian/ubuntu/fedora repositories covers 99% of the users 99% of the time. No app store or repository on the Mac does as well.

> Those three are pretty much all the same thing: preservation of state.

No, they aren't - two dots for two windows vs. one dot for one window is purely informative but hugely helpful. And while it might at some point converge to "whether the binary is loaded doesn't matter", it's not there yet by a long shot - and likely won't be for a decade.

The "quit vs. close" difference on OSX goes directly against the "preservation of state" concept you are promoting, in my opinion. In Ubuntu, whichever way I closed my Firefox (closing the window, shutting down the computer, upgrading the kernel, killall firefox, etc.) I get my state back. On the Mac, if I press the red button it doesn't, but if I choose "quit" on the menu it does ?!?

I mentioned them as things in which the MacOS fails compared to modern Linux, whereas somehow you took it as if the Mac does better.

Linux does preservation of directory view state as well, except it is centralized and not all over the place. Windows does most of it in the registry (but puts previews in the directory). These make much more sense -- and I think they come from Linux's Unix legacy of serving multiple users and often having a lot of remote, read-only file systems mounted. OSX is Unix, but has been almost entirely single-user-at-a-time-and-they-own-everything-they-see from the day Apple adopted it.

Personally, I have configured my Linux to automount every USB or network share read-only unless I specify otherwise - I hate accidentally modifying files or directories, and many programs allow themselves to drop swap/lock/backup files even when you're in read-only mode. macOS is a huge offender from this perspective.
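(For anyone wanting the same behaviour, one low-tech way is an fstab entry per known volume — the label and mount point below are made up, and desktop automounters like udisks need their own configuration:)

```
# /etc/fstab -- mount a known USB stick read-only by label
LABEL=USBSTICK  /mnt/usb  vfat  ro,noauto,user  0  0
```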


Preservation of state from the user perspective, not from the application-functional perspective. If I as a user quit an application, I'm not telling it to lose my state, but if I close the state (i.e. one or more tabs/windows), that's what I want it to do. Closing/quitting an application isn't analogous to killing a process. It is/was on some (but not all) other platforms, but not on macOS.

Quit and Close mean two different things. The state that is preserved is the state you put it in. So if the state is: I have a window open with a site loaded, then the next time I visit this task, I want that same state. If I tell a program that I wish to not see a window or site, then that is the state I put it in, and the next time I visit that same task/program, I want it to return to that state as well. While you might not agree with that logic, that is what the HIG describes and how it is implemented. The logic is there, and it is documented and followed, both on the desktop and on the ~1 billion mobile devices where much of the same HIG applies.

Let me add a car-analogy here:

Say you are in your car with the doors open and the engine running. If I turn off the engine, I don't expect my doors to close. And when I close my doors, I don't want my engine to turn off by itself. And when I open my doors, turn the engine off, go for a walk and come back, I expect my doors to remain open when I turn the engine back on.

Regarding the view state, Linux and Windows do this for the local machine, whereas macOS does it for the directory. From a UX perspective both ways have their own good and bad elements, but if you use or manage multiple machines and share view states across machines and accounts, this is the way to do it (short of exchanging view state via other methods like a network or integration into the filesystem).

Regarding your read-only mounting of mass storage and remote filesystems, that makes total sense to me, but makes zero sense for the general user. While it is nice to reconfigure your local system to do this (and macOS has this as well via autofs if you want it -- it's built in), this would make your environment completely unportable to anyone else.


> It is/was on some (but not all) other platforms, but not on macOS.

I know why it's there. As I wrote in another post, this was a great choice in the 128KB/floppy days of the early macs. It no longer is - for the general user, on a computer (unlike a car) it feels like a distinction without a difference. The fact that it is consistent and based on consistent logic (albeit one that was prescribed 30 years ago) does not matter, IMO.

> Regarding the view state, Linux and Windows do this for the local machine, where as macOS does it for the directory.

I think you're rationalizing a poor design choice. For local files, the end result is mostly the same (except Linux preserves the directory's mtime) -- as local files are rarely if ever accessed across the network by someone with write permissions for that directory. And those permissions often make it seem dysfunctional on a shared directory - you can't store settings, so it doesn't stay the same between visits.

How does your explanation sit with the fact that there's a way to disable this for network/remote directories, but not for local ones?

> and macOS has this as well via autofs if you want it -- it's built in

I'm aware that's a power-user preference; I couldn't figure out how to do it on the Mac. I'll look up autofs permissions - any recommended documentation?


> ... which appears nowhere in the keyboard help or shortcuts.

This is a weird omission in OS X. For what it's worth, they are documented here: https://support.apple.com/en-us/HT201236
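And for anyone who wants Home/End to behave like on Linux: the Cocoa text system reads ~/Library/KeyBindings/DefaultKeyBinding.dict. A sketch — the \UF729/\UF72B codes are the standard Home/End function-key constants and the selectors come from NSResponder, but test it in your apps, since non-Cocoa software ignores this file:

```
{
    /* Home / End move within the current line, Linux/Windows style */
    "\UF729"  = moveToBeginningOfLine:;                      /* Home */
    "\UF72B"  = moveToEndOfLine:;                            /* End */
    "$\UF729" = moveToBeginningOfLineAndModifySelection:;    /* Shift-Home */
    "$\UF72B" = moveToEndOfLineAndModifySelection:;          /* Shift-End */
}
```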


> No app store or repostiory on the mac does as well.

Homebrew?


Maybe the most recent homebrew is better, but in my experience it is brittle, breaks often with underlying OS updates, occasionally has weird permission issues (heard that's solved now); I mentioned in another post it is close, but no cigar.

Part of ubuntu's and debian's "apt" magic is not just in the code; It's in the meticulous curation of the main repositories - which brew can't really compete with.


Graphics drivers.

My System76 laptop, pre-installed with customized Ubuntu, needs to be rebooted just to switch between Nvidia and Intel GPU if you want power savings (external monitors are directly wired to Nvidia chip).


That's an Ubuntu 18.04 thing that's being worked on. You can either stick with 16.04, or use this repo https://github.com/matthieugras/Prime-Ubuntu-18.04 until it is properly fixed. More importantly, the first rule of running Linux well (which I assume you wanted to, since you got a System76) is to never buy Nvidia.


I've used Linux since 2004 but because I needed to do design, I had to switch to macOS sadly. For developers, it really makes no sense to stay on macOS.


It is hard to develop for iOS and tvOS solely from Linux.


Drivers for laptop hardware and battery life are the two most important reasons.


Because even developers want computers that Just Work. Because Macs are, hands down, the best modern desktop experience.


> Because even developers want computers that Just Work

Which is why I've used Linux for nearly 20 years. It Just Works.

YMMV, the plural of anecdote is not data, etc.

I'm just happy that nowadays it's a genuine choice.


Why is the plural of anecdote not data? Everyone I know who is using Linux has some problems, with either WiFi, updates breaking things, ever changing interfaces (Iproute2, Networkmanager, systemd-networkd). I am saying this as someone with 25 years of Linux experience.


I used Linux as my general desktop through college. I love Linux, really. But when I'm not working and just want to chill at home I don't need the technical overhead. I don't really care about win vs OSX anymore, just give me something that doesn't have any edge cases.



10 years ago I dualbooted OS X 10.4 and Windows on a Thinkpad X60, and I remember searching for (and in some cases, even writing) drivers was the most difficult part. These days, if you need macOS for whatever reason and don't have a Mac, it's probably easiest to use a VM --- VirtualBox can boot, install, and run an unmodified High Sierra with only a few minor configuration changes to the VM's "hardware" to make it more Mac-like.


> with only a few minor configuration changes to the VM's "hardware" to make it more Mac-like.

Any link on how to do this? I've tried a number of times and it still didn't work :/


https://www.insanelymac.com/forum/topic/309654-run-vanilla-o...

I believe SMCDeviceKey and the product names/serial numbers are the critical part. The OS doesn't mind the rest of the virtual hardware.
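For reference, the commonly circulated incantation looks roughly like this (the VM name and values are examples pulled from community guides, not something I can vouch for verbatim — check a current guide for your macOS version):

```shell
VM="macOS-HS"   # hypothetical VM name

# Report Mac-like identifiers to the guest
VBoxManage modifyvm "$VM" --cpuidset 00000001 000106e5 00100800 0098e3fd bfebfbff
VBoxManage setextradata "$VM" "VBoxInternal/Devices/efi/0/Config/DmiSystemProduct" "iMac11,3"
VBoxManage setextradata "$VM" "VBoxInternal/Devices/efi/0/Config/DmiBoardProduct" "Iloveapple"

# The SMC device key the OS checks for
VBoxManage setextradata "$VM" "VBoxInternal/Devices/smc/0/Config/DeviceKey" "ourhardworkbythesewordsguardedpleasedontsteal(c)AppleComputerInc"
VBoxManage setextradata "$VM" "VBoxInternal/Devices/smc/0/Config/GetKeyFromRealSMC" 1
```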


Wow. Cool, thanks!


Can confirm macOS 10.13.3 working in VirtualBox. Getting USB to work required a proprietary VirtualHere product.


I think you'll find that, ten years later, driver issues aren't completely eradicated but the situation is better.


I've tried running macOS in VirtualBox with no luck. What guide did you follow?


With the right machine and the right guide, it's totally doable without the help of a VM. The community can be tough to navigate at first but if you're willing to put in the time it can be done without too much pain.

The only real headache on the laptop front, beyond drivers, is that MacOS only supports a small number of WiFi cards, so you need to do your research and find a laptop where the wireless card is user-replaceable.


The only real headache on the laptop front, beyond drivers, is that MacOS only supports a small number of WiFi cards, so you need to do your research and find a laptop where the wireless card is user-replaceable.

Or at least one with existing open-source drivers that you can port, as has happened with at least one popular model:

https://github.com/kemenaran/iwidarwin

Hackintosh is, after all, a learning experience. Including kernelmode programming. ;-)

(Or at least it was --- the whole driver signing thing has probably put off quite a few who would've otherwise given it a try...)


For anyone looking for PC guide: https://www.tonymacx86.com/threads/unibeast-install-macos-hi...

I recently built one and it's super easy. My build: Z370M + i5-8400 + 240GB SSD + 16GB DDR4. Benchmarks are near the 2018 MacBook Pro 15-inch.

Total cost: $550


I love it when a minor version update breaks almost all basic Homebrew packages on macOS. I really wonder why anybody who claims to be a developer would run macOS for work! You can buy an even better-looking and better-spec'd HP for a fraction of the MacBook price, run Linux on it, and be happy. Even Windows 10 is now a better developer platform! The arrogance of Apple is not something we should put up with anyway!


I'm a full time Linux user and a big fan but come on - claiming that minor updates don't randomly break things is pretty dishonest. It's true for all 3 major OSes.


Sure, it can happen on all systems. But Homebrew is not a good package manager.

How about major breakage because of the installation of a new program? Like having installed programs that use readline version 6 and then installing a program that needs readline version 7?

Homebrew will happily upgrade readline to version 7 and install the new program, thus silently breaking the programs that use readline version 6.

Edit: Every program and library involved has been installed with Homebrew.
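One partial workaround, for what it's worth: Homebrew can pin a formula so plain `brew upgrade` skips it (`brew pin`/`brew unpin` are real subcommands; this doesn't stop a fresh install from pulling in a newer dependency, so it's only a band-aid):

```shell
# Keep readline at its currently installed version during upgrades
brew pin readline

# Undo it later when you're ready to rebuild dependents
brew unpin readline
```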


It's not Homebrew that breaks but Apple for not caring about backward-compatibility. But honestly, Homebrew is a joke of a package manager, which, unfortunately, is the best on macOS.


This time it's not Apple's fault.

Every program and library involved has been installed with Homebrew.


It actually is. Apple should embrace Homebrew or at least offer an alternative.


Apple's alternative is using DMG.

I'm the first to blame Apple for various stuff, but you can't blame Apple for problems that come from using OS X in a way Apple hasn't intended and doesn't support.

If you're not happy with the pre-installed unixy stuff you are (still) free to add your own.

Which Homebrew allows you to do in a very Apple way: it works mostly as intended, almost out of the box, but if you are a power user you will soon run into various problems.


This is why I really don’t get why people use Homebrew over MacPorts. MacPorts hasn’t given me that trouble and is pretty solid.


I've always wondered why there are 3 different efforts on macOS: Homebrew, MacPorts, Fink (http://www.finkproject.org/), instead of consolidating the effort.


i'd downvote you if my last Arch update didn't break my touchpad taps, again /s


As someone who still loves macOS but is getting really turned off by the rising costs of owning a Macbook and design compromises you have to put up with, I really hope solutions like these continue to develop and grow.

If I could slap macOS on a Matebook X Pro, I'd switch tomorrow.


Agreed. It's not even the cost I take issue with, it's the design compromises. I want a professional device Apple, not a toy fashion product.

I like macOS, but Apple's making it incredibly difficult to take their product line seriously lately.


If I could put MacOS on a surface pro..... :)


“If you encrypt your boot drive with FileVault it will be necessary to connect a USB keyboard when booting to input your password. An EFI driver for the built-in X220 keyboard is not yet available.”

Stuff like this is why Hackintoshes are sadly still super frustrating.


Look on the bright side. At least it's possible to encrypt the filesystem now.

