MacOS Catalina: Slow by Design? (macromates.com)
1982 points by jrk 2 days ago | 984 comments

> This is not just for files downloaded from the internet, nor is it only when you launch them via Finder, this is everything. So even if you write a one line shell script and run it in a terminal, you will get a delay!

> Apple’s most recent OS where it appears that low-level system API such as exec and getxattr now do synchronous network activity before returning to the caller.

Can anyone confirm this? Because honestly this is terrifying. I don't think even Windows authorises every process against a server. This doesn't sound good for either privacy or speed.

There are two new Security/Privacy Settings that I just noticed last night.

"Full Disk Access" to allow a program to access any place on your computer without a warning. A few programs requested this, so it looks like it's been around for a while.

The other one is "Developer Tools" and it looks pretty new. The only application requesting it is "Terminal". This "allows apps to run software locally that does not meet the system's security policy". So my reading is that in Terminal, you could run scripts that are unsigned and not be penalized speed-wise.

I don't see it on macOS 10.15.4 (19E287). The full list of categories on my Privacy tab:

  - Location Services
  - Contacts
  - Calendars
  - Reminders
  - Photos
  - Camera
  - Microphone
  - Speech Recognition
  - Accessibility
  - Input Monitoring
  - Full Disk Access
  - Files and Folders
  - Screen Recording
  - Automation
  - Advertising
  - Analytics & Improvements
Granted I don't typically use Terminal.app (iTerm 2 user), so I launched terminal and did some privileged stuff. Had to grant Full Disk Access to, say, `ls ~/Library/Mail`, but "Developer Tools" never popped up.

Are you running a beta build or something?


Update: Okay, I checked on my other machine and that one does have it (Terminal is listed but disabled by default). What in the actual fuck?!?

You can make the category appear and put Terminal in it with this command:

    sudo spctl developer-mode enable-terminal

It'd be nice if this was documented somewhere :/

I was going to be that guy and say “man spctl”, but that usage isn’t listed there. If you run spctl with no arguments, however, it will tell you. The man pages on macOS really do leave something to be desired.

This does not make the "Developer Tools" panel show up on my machine :( I've tried everything already.

I don't see it on my machine. Do you happen to have System Integrity Protection disabled?

No, SIP is fully enabled on both the machine with the Developer Tools category and the one without.

Interestingly, I rebooted the machine without the category after some benchmarking and experimentation with syspolicyd (see https://news.ycombinator.com/item?id=23274903), and after the reboot the category mysteriously surfaced... Not sure what triggered it. Launching Xcode? Xcode and the CLT were both installed on the machine, but I'm not sure when I last launched Xcode on it. Another possible difference I can think of: the machine without the category was an in-place upgrade, while the other one IIRC was a clean install of 10.15.

In the worst case scenario, you can probably insert into the TCC database (just a SQLite3 database, located at ~/Library/Application Support/com.apple.TCC/TCC.db) directly:

  INSERT INTO access VALUES('kTCCServiceDeveloperTool','com.apple.Terminal',0,1,1,NULL,NULL,NULL,'UNUSED',NULL,0,1590165238);
  INSERT INTO access VALUES('kTCCServiceDeveloperTool','com.googlecode.iterm2',0,1,1,NULL,NULL,NULL,'UNUSED',NULL,0,1590168367);
(Should be pretty self-explanatory. The first entry is for Terminal.app, the second entry is for iTerm 2.)

Back up, obviously. I'm not on the hook for any data loss or system bricking.
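For anyone wanting to try the INSERTs above with the sqlite3 CLI, here is a sketch. The real database lives at ~/Library/Application Support/com.apple.TCC/TCC.db and writing to it generally requires SIP to be disabled, so this demonstrates against a scratch database instead; the column names here are approximations, not the exact Catalina schema:

```shell
# Scratch database standing in for ~/Library/Application Support/com.apple.TCC/TCC.db
db=$(mktemp /tmp/tcc-demo.XXXXXX.db)

# Approximate 12-column layout matching the VALUES(...) tuples quoted above
sqlite3 "$db" "CREATE TABLE access(service,client,client_type,allowed,prompt_count,csreq,policy_id,indirect_object_identifier_type,indirect_object_identifier,indirect_object_code_identity,flags,last_modified);"

# The same insert as in the comment, with the timestamp generated on the fly
sqlite3 "$db" "INSERT INTO access VALUES('kTCCServiceDeveloperTool','com.apple.Terminal',0,1,1,NULL,NULL,NULL,'UNUSED',NULL,0,strftime('%s','now'));"

sqlite3 "$db" "SELECT service, client FROM access;"
rm -f "$db"
```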

> In the worst case scenario, you can probably insert into the TCC database

Does this not require disabling SIP?

Yes. I got mine to appear through mysterious yet fully SIP-enabled means, but if all else fails for you you can temporarily disable SIP to change this.

Maybe you need Xcode, try running "mkdir /Applications/Xcode.app"

As mentioned in a reply to a sibling, Xcode has been installed (for like five years) on this machine, and launching it doesn't help. The next step would be to compile and run an application with it, which I haven't bothered to do.

I would expect checks for Xcode to go through xcselect rather than a simple directory check. Installing the command line tools (sudo xcode-select --install) might actually be a better idea to test this.

I thought the same, but actually this method worked for me when I wanted the Spotlight "Developer" option to show up (the CLT were already installed). I have the Developer panel under "Privacy" as well, even though I never installed Xcode on my machine.

Maybe if you ran Terminal.app once it would work?

(I'm also on 10.15.4 (19E287))

No, I played around with Terminal.app for quite a while already. Actually the category does show up on another machine of mine (see edit)... I suspected that maybe I never ran Xcode on the first machine since I upgraded to Catalina, so I launched Xcode, but again, no luck. I'm at a complete loss now.

Terminal actually gives an error if you poke into the top-level Library folder with Full Disk Access disabled; there's no prompt to change it, and I had to look on Stack Overflow for the solution.

via https://lapcatsoftware.com/articles/catalina-executables.htm..., I've added entries to my /etc/hosts to block requests to api.apple-cloudkit.com: api.apple-cloudkit.com *.api.apple-cloudkit.com
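For reference, a hosts-file entry normally pairs a sink address with the hostname, and /etc/hosts does not support wildcards, so the `*.api.apple-cloudkit.com` form would have no effect there. A sketch of the entry, using `0.0.0.0` as the sink address (`127.0.0.1` also works):

```
# /etc/hosts
0.0.0.0 api.apple-cloudkit.com
```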

I wonder what "Developer Tools" grants in practice. Clicking the (?) for viewing built-in help does not mention this particular setting, it skips right over it going from "Automation" above it to "Advertising" below it.

I believe it means the process will no longer check for the Quarantine xattr.

But the quarantine xattr has nothing to do with checking notarization?
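For context, the quarantine flag is just an ordinary extended attribute that you can inspect and remove from the shell (macOS-only commands; the file path here is a placeholder):

```shell
# Print the quarantine attribute, if present (it is set when a file is
# downloaded by a quarantine-aware app such as a browser):
xattr -p com.apple.quarantine ~/Downloads/example.sh

# Remove it, which stops Gatekeeper treating the file as quarantined:
xattr -d com.apple.quarantine ~/Downloads/example.sh
```

Note that notarization assessment is a separate mechanism handled by syspolicyd, which is why the two can be conflated in this thread.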

Full Disk Access was added in 10.14 (2018), so it's relatively new.

I'm using the Kitty terminal, and observed the script launch delay described in the blog post. After adding Kitty to "Developer Tools", the delay disappeared. Thanks!

Making this about speed is burying the lede. From a privacy and user-freedom perspective, it's horrifying.

Don't think so? Apple now theoretically has a centralized database of every Mac user who's ever used youtube-dl. Or Tor. Or TrueCrypt.

Richard Stallman's ideals have become a bit less crazy for me now...

Either you have the ability to control the software, or it controls you

I think coming to this realisation about Stallman's ideas (not the man, mind) is something that most rational computer users are bound to do. It happens at different times for different people, but I think people very rarely go back after that "Hang on a second ....??" moment.

I remember once he said "proprietary software subjugates people" and I just sort of blinked a bit. It seemed sort of over the top. And over time I started to understand that the way things end up working out, it is very true.

I always wonder why people usually choose to neglect privacy issues about Apple.

First, there was Apple scanning photos to check for child abuse[0] (that obviously got no attention on this site), then there was this one - Apple uploading hashes of all unsigned executables you run.

Do people really accept that company's "privacy" selling point?

[0] https://news.ycombinator.com/item?id=21180019, https://news.ycombinator.com/item?id=22008855

Is it even legal that Apple is retrieving this information?

Apple already has every iPhone user's photos, messages, browsing history, keychains etc.

Not sure how a list of installed apps is going to be worse than that.

Not if you choose to not sync them.

Yup, you can choose to not use iCloud backup and back up offline in an encrypted way (even over wifi) if you’d like.

How could this possibly not be absolutely awful for projects that run hundreds of executables during their execution (e.g. some shell wrappers like oh-my-zsh call out to a large number of different scripts every time they run)?

It looks like it is done once per executable's lifetime. Changing the content doesn't cause it to rerun.
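A quick way to sanity-check that claim (a sketch; the delay itself only manifests on Catalina, but the script runs anywhere):

```shell
# Create a fresh executable script and time it twice, then edit the
# contents and time it again to see whether the first-run delay returns.
f=$(mktemp /tmp/cache-test.XXXXXX)
printf '#!/bin/sh\necho Hello\n' > "$f"
chmod +x "$f"
time "$f"   # first run: this is where the one-time delay shows up
time "$f"   # second run: should be fast
printf '#!/bin/sh\necho Hello again\n' > "$f"
time "$f"   # per the comment above, still fast despite the new content
rm -f "$f"
```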

If you don’t trust Apple, don’t run a multi Gigabyte closed source OS they provide.

I can confirm that executing a trivial script takes 20-200ms longer on the first run. Using 10.15.

not sure if I'm lucky or somehow I disabled something but the trivial script problem isn't affecting me on any of my machines. I am using Homebrew for a large % of command line/scripting so maybe that's why?

Privacy-wise it may be a plus, since in theory notarization provides some protection.

Speed, definitely not, this is going to make things slowwwww

> provides some protection.

That's security, not privacy...

Although insecurity leads to less privacy as well.

Insecurity leads to loss of privacy, but security does not lead to privacy. Things can be secure and non-private by design.

Sometimes, but sometimes security measures lead to less privacy. Say, if executing local programs sends information to a remote server.

If that information can’t be used to identify anyone then it retains privacy while being secure. Being slow would still be an issue.

But you can't be 100% sure that the server the information is sent to is not putting your IP, the app you ran, and whatever else into a database. As a power user I would prefer a prompt before anything is sent.

I experienced this one day while tethering in the train. I was coding and running `go build` multiple times.

I could not for the life of me understand why go build would take upwards of 30 seconds to run and sometimes 100ms. I finally realized it was related to my internet connection being extremely spotty. I went online and searched whether anybody had the same experience with `go build` but couldn't find anything.

I finally know what happened. This is a pretty intolerable "feature".

Does it work at all when unconnected?

There seems to be a delay of about 5 seconds, then it "gives up" trying to notarize your program.

I don't remember if it did or not, but I'm fairly certain it did. (otherwise I'd probably remember it, I think...)

As someone living in China, this is my result when I connected to my VPN (this is my normal life, thus I can visit sites like HN):

> Hello

> /tmp/test.sh 0.00s user 0.00s system 0% cpu 5.746 total

> Hello

> /tmp/test.sh 0.00s user 0.00s system 79% cpu 0.006 total

And even if I didn't connect to my VPN:

> Hello

> /tmp/test2.sh 0.00s user 0.00s system 0% cpu 1.936 total

> Hello

> /tmp/test2.sh 0.00s user 0.00s system 78% cpu 0.005 total

That's just ridiculous and unbearable.

Apple should provide a way to disable this notarization thing, and the user should still be able to enable SIP while disabling it.

additional information:

- macOS version: 10.15.4

- terminal: iTerm2 3.3.9

- didn't install any "security" software

I'm curious what your results would be with the stock Terminal. Do you have the settings that others have talked about under "Security > Privacy > Developer Tools" with Terminal.app listed? If so, and the results are better with Terminal, then it'd be interesting to see if the issue is fixed when you add iTerm2 to the list of exempted apps as well.

I have tried what you suggested. Granting "Developer Tools" access definitely FIXED THIS ISSUE for the specific application.

Here is the new result (I only run once for each case):

    │          │ default     │ +"Developer Tools" access │
    │ terminal │ 1.448/0.004 │ 0.016/0.004               │
    │ iTerm2   │ 1.240/0.006 │ 0.024/0.007               │
`1.448/0.004` means the first time it is `1.448 total`, and the second time it is `0.004 total`.

(It seems I have "good" VPN/internet connection condition at this time)

Upvoted for ASCII table alone

Is HN blocked in China?

HN has been blocked in China for about 9 months.


It doesn't work when there's no network connection. I wonder if it would be possible to filter out and automatically block notarization traffic, or if it's all encrypted with certificate pinning to prevent this type of MITM+filter.

Dropping packets when there is an otherwise working connection could potentially make the delay even worse depending on timeout or retry strategy used by Apple code. I assume that in the fast case without network connection it checks the network status flag and doesn't try to do any network connection at all.

I'm still on 10.14, but I guess it will show up on Little Snitch. Unless they bundle it with some other more essential traffic.

Okay, I've tried this test on my MacBook Air 2020 several times, first by saving the "echo Hello" shell script in an editor and then, because I wasn't getting the results the author experienced, trying again exactly as he wrote it. Essentially the same result:

    airyote% echo $'#!/bin/sh\necho Hello' > /tmp/test.sh
    airyote% chmod a+x /tmp/test.sh
    airyote% time /tmp/test.sh && time /tmp/test.sh
    /tmp/test.sh  0.00s user 0.00s system 74% cpu 0.009 total
    /tmp/test.sh  0.00s user 0.00s system 75% cpu 0.007 total
Is it possible that Allan Odgaard, as good a programmer as he unquestionably is, has something configured suboptimally on his end? Because it just strikes me as super unlikely that Apple has modified all the Unix shells on macOS to send shell scripts off to be notarized. (From what I've read, while shell scripts can be signed, they can't be notarized, and Gatekeeper is not invoked when you run a shell script in Terminal -- although it is invoked if you launch a "quarantined" shell script from Finder on the first run, but it treats the shell script as an "executable document." This is the way this has worked for years, as I can find references to it in books from 2014.)

I have my complaints with macOS Catalina, and I know that Apple's "tighten all the screws" approach to security is anathema to a lot of developers (and if there was a big switch that I could click to disable it all, I probably would), but I'm using Macs running Catalina every day and I gotta admit, they just don't seem to be the dystopian, unlivable hellscape HN keeps telling me they are. At least off the top of my head, I can't think of anything I was doing on my Macs ten years ago that I can't do on my Macs today. ("Yes, but doing it today requires an extra step on the first run that it didn't used to" may be inconvenient, but that's not the same thing as an inability to perform a function -- and an awful lot of complaints about modern Macs seem to be "the security makes this less convenient." There's an argument to be had about whether Catalina's security model strikes the right balance, of course.)

I don't experience a delay in Terminal.app either, but I've tried running the script with a fresh install of iTerm2 while capturing with Wireshark, and it does look like the script triggers a connection to an Apple server.

I initially saw the delay in Terminal.app, but then it went away! I've made sure Terminal doesn't have the "Developers Tools" permission but the network request delay is still missing.

However, I was able to reproduce this by downloading a whole new terminal app, Alacritty. With the random script and file path I can always reproduce the delay in Alacritty. My guess is Terminal.app might have some special case behavior?

See my comment above on some shell script that does the random file name stuff for you.

I just ran the same script on iTerm2 and had no delay.

I had no delay either until I reinstalled iTerm2; I have no idea why.

Obviously I can't say that's impossible, it would just be... very weird, and would seem to contradict what Apple Developer Relations was saying on Apple's devrel forums as recently as this year.

So it's an actual documented fact that it happens. I agree that overall macOS still has a very nice UX and I'll never go back to Windows... But it's very clear Apple is walling off their OS the way they have iOS. It's not weird that it's happening, it's real life...

> and if there was a big switch that I could click to disable it all, I probably would

First, disable SIP to allow yourself to modify the system. Then, disable AMFI, the component responsible for code signature checking, entitlement enforcement and all that very useful stuff, with a kernel argument:

    nvram boot-args="amfi_get_out_of_my_way=0x1"
Then you should be done.

That argument reads to me like the implementer knew this stuff was obtrusive.

I might be wrong about this, but if you're running a shebang'd script directly as an executable, they wouldn't need to modify the behavior of the shell itself but rather the executable loader. It would be interesting to see whether, e.g., `bash test.sh` doesn't phone home where `./test.sh` does.
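That's easy to try (a sketch; on a non-Catalina system both invocations will be equally fast):

```shell
# Same script, two invocation styles: running the path directly goes
# through the exec loader with the script as the target, while
# `sh test.sh` execs the (already-approved) interpreter and merely
# passes the script to it as data.
f=$(mktemp /tmp/exec-test.XXXXXX)
printf '#!/bin/sh\necho Hello\n' > "$f"
chmod +x "$f"
time sh "$f"   # interpreter invocation
time "$f"      # direct exec of the script path
rm -f "$f"
```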

10 to one says this is because you've run something calling /bin/sh before.

If he switched the /bin/sh out for /bin/zsh or /bin/bash, whichever his default shell was, he wouldn't have seen the first delay.

That's plausible -- but I'd be (mildly?) surprised if Apple hadn't pre-okayed binaries they supply with the OS. Even if you flip the Super Paranoia switches in privacy settings, you don't need to give macOS explicit permission to launch Apple-supplied binaries from the Finder.

Most vendors have separate engines for detecting malicious scripts. I'd assume notarizing is more about executables, in which case it would be checking the signatures around the shell binary.

Also worth noting: "echo" doesn't spawn a process but is a routine in the shell itself. If you replaced echo with something that does spawn a process (like scp) it would be interesting to see the results. And if that doesn't introduce latency then I'd try it with some hello-world programs with a UUIDv4 in the binary to ensure they haven't seen the hash before.

> Also worth noting "echo" doesn't spawn a process but is a routine in the shell itself.

In Bash echo is a builtin but /bin/echo also exists if you do actually want to spawn a process.
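The difference is easy to see in a shell (portable sketch):

```shell
type echo          # reports that echo is a shell builtin
command -v echo    # shows what "echo" resolves to in this shell
/bin/echo Hello    # explicitly spawns a separate process
```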

Maybe OP edited a few times but it doesn't look like they are doing that to me

I'm not sure I understand?

try again with a randomized filename
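For anyone following along, a sketch of the randomized test: a fresh path (and a random comment line to vary the content hash, just in case) each run, so the system cannot have seen this exact executable before:

```shell
f=$(mktemp /tmp/fresh-XXXXXXXX)       # unique path every invocation
{
  printf '#!/bin/sh\n'
  printf '# salt: %s\n' "$RANDOM$$"   # vary the content hash too (bash/zsh)
  printf 'echo Hello\n'
} > "$f"
chmod +x "$f"
time "$f" && time "$f"                # first vs second run
rm -f "$f"
```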

There was a thread on the almost-forgotten Cocoa-dev list about this: https://lists.apple.com/archives/cocoa-dev/2020/Apr/msg00008...

Catalina has a huge number of things that synchronously block application launch, and if any of them fail you get nothing but a hung app. A friend and I have a running discussion of the many ways an application would just hang, and we’d send samples and spindumps to each other, trying to figure out the right daemon or agent to kill to get the process to start responding again. It’s madness.

I still love macOS, a lot. Since moving over after the disaster that was Windows 8 (and by then I was already using MacBook hardware), I've become a loving power user e.g. with AppleScript and setting up hotkeys or other ways to do absolutely anything I want on the screen. It really is still as powerfully customisable as Linux. Turn off SIP if need be.

My only problem in moving to Linux software is that I prefer Apple's hardware. I'm on the 2019 16-inch MBP. Linux's compatibility with all the T2 and SSD hardware isn't there yet, but apparently it almost is.

If Linux on the T2 MBP becomes solid and stable in the next 1-2 years, after extensive testing I may move over permanently. I already use Linux on secondary computers, and I love and value its privacy. Same with my phone. I just love my privacy.

My needs are a high bar though. Productivity must be held back by nothing. I use macOS Notes extensively and it syncs with my iPhone, which is an extremely useful tool for me to note things down both in audio and in text. It needs to be reliable and - heh - 'just work'. I just discovered the cross-platform 'Standard Notes' app; with a bit more money paid out to Linux-compatible services like that, maybe it can all work. Casual Photoshop can be taken care of via a VM.

Surprisingly, macOS Catalina is itself a disrupter to my productivity. It seems buggy as hell - glitchy, and weirdly slow for many extremely basic things - all since Catalina. I just don't get it. Is it caused by this article's observation? Something's definitely going on.

Maybe Apple will fix this in the next release? Like how they fixed the keyboard?

Either way, I still want to move to Linux on this fabulous (fixed) hardware that is the 16-inch MBP. (T2 issues aside.)

I have a 2019 MacBook Pro 16in and I hate it. It runs exceptionally hot (leading to massive performance problems), doesn't get enough power from the adapter to start with no battery, doesn't play nicely with my display, needs restarting every couple of days so Chrome doesn't crash, and takes forever to boot.

That's just the technical problems. I'm willing to give the UI a break, since it's probably as much me adjusting as it being bad.

This is my first Apple anything, and if this is what "just works" looks like, I don't want it. I could be more productive on an Android tablet at this point.

Actually, I do agree with you with some of those observations. Apple's been trying to fix their terrible T2 issue and I suspect some of the problems lately have been them trying to prevent the T2 reboot crash, while ruining other parts of the experience in the process as a necessary compromise. It may get worse (or better) as they move to all-Arm architecture.

I also am sick of the touch bar now - after 2 years living with it. I have to press it twice to actually pause my media, because it's an LCD screen and it has to auto turn off to prevent burn-in. That's a regression from the old hard media button in the Fn row which was both instant and far easier to press. At least we got 'Esc' back.

But man, their trackpad...nothing beats it. Still.

> it's an LCD screen


I hear OLED can be just as bad if not worse. So same diff.

Much worse. Just explaining why that would be a problem.

Mine starts spinning up the fan (there's kind of a pattern to when), heating up the entire computer. The computer previously had been fine.

I usually have to restart and reset the "SMC" to stop the fan from nuking the computer.

I can let the computer drop to 5% battery life and the fan will turn off and the computer will cool down. Which is the opposite of what you want if it was actually overheating.

Counterpoint, I also have the 16 inch 2020 MBP as my first Mac work laptop and absolutely love it. No issues, it works perfectly, and I’m 2x as productive on it as I was on my previous Ubuntu setup.

Do you write anywhere online about your workflow setup using AppleScript? It sounds interesting. I’d like to configure my macOS experience more.

Oh it's not like I have a Cmd+<X> for every single possible task you can imagine, it's a very tailored and customised set of sometimes complicated scripts for my weird personal needs that I've built up over the years.

Each time I want to do something, I goddamn will spend 8 hours figuring it out if I have to. E.g. this: https://apple.stackexchange.com/a/381441/163629 - one hotkey to change macOS Notes text into a specific hex colour (and/or bold etc). It took me a day but I worked it out. Where there's a will there's, 99 times out of 100, a way.

You can seemingly do almost anything with AppleScript. Emphasis on almost.

Here's another example: Right after I plug in my iPhone via USB, I have one hotkey to automate a little-known feature of macOS where you can turn your Mac into a speaker dock for the iPhone. Awesome thing when you have the dramatically improved 16-inch MBP speakers. Here's my AppleScript for that, just customise according to your iPhone name near the bottom and try it out: https://pastebin.com/raw/9BY710Y6

YMMV; if you have additional audio devices in Sound prefs you may need to change the code a bit.

AppleScript can also run Unix shell commands (via `do shell script`), so with Homebrew able to install most common Linux packages, you can go wild if you want.

I'm definitely not 'advanced' applescript level, I'm intermediate. Hundreds of HN readers would know more than me. I just google and think until I find a way. I'm not a programmer.

I have other shortcuts e.g. to control the MPV media player even if it's not the currently active window. Again, weird personal needs, but awesome. AppleScript to the rescue.

FastScripts is how I assign universal hotkeys to any of my applescripts.

Would be great if you could write about the scripts you hack to optimize your workflow

I tested whether running a script you just wrote really contacts Apple to “notarize” it. It does.

I first used the author’s timing method. First runs are consistently about 300 ms, subsequent runs consistently about 3 ms. Something is happening at first run.

Some in the comments are saying it’s “local stuff”, so I tested timing again with internet off. First runs go to about 30 ms, subsequent remain the same. So there is “local stuff”, but it doesn’t explain the delay.

Just to be entirely sure, I installed Little Snitch and got clear confirmation: running a script you just wrote results in syspolicyd connecting to api.apple-cloudkit.com. syspolicyd is the Gatekeeper daemon.

I don’t know what exactly is being sent. Maybe somebody else can do a proper packet analysis.
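One way to observe this without Little Snitch is to watch syspolicyd (the Gatekeeper daemon) in the unified log while running a freshly created script in another terminal (macOS-only; `log stream --predicate` filters by process):

```shell
# Stream log messages from the Gatekeeper daemon; run the new script
# in a second terminal and watch for assessment and network activity.
sudo log stream --predicate 'process == "syspolicyd"' --info
```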

I hope Apple currently has a team focused on macOS perf.

I worked on the team in charge of improving iOS (13) perf at Apple and IIRC there was no dedicated macOS “task force” like the one on iOS.

Luckily some iOS changes permeated into macOS thanks to some shared codebases.

I agree. This kind of behavior certainly smells like teams doing their development work on high-capacity low-latency networks without much performance oversight.

> I hope Apple currently has a team focused on macOS perf.

Apple doesn't give a fuck about macOS since 2015.

I wonder what % of their users are developers only begrudgingly sticking around for iOS builds.

> IIRC there was no dedicated macOS “task force” like the one on iOS

It's not surprising. Macs are less than 10% of Apple's revenue.


Except all of Apple's other devices are built on macOS. Apple's clear de-prioritization of macOS based on revenue numbers is so insane I can barely believe it's happening. If developers, who use Macs in large numbers today, go to another platform, there's very real risk that their entire empire starts to come apart at the seams. And, this may just be me being naive, but it doesn't seem like that much work to keep macOS going, all they have to do is stop trying to turn it into iOS. They are literally doing a tremendous amount of active engineering work that drives developers away from their platforms.

They are risking their entire empire because (apparently) someone at Apple has an axe to grind with macOS's Unix underpinnings. And until they start getting real consequences (developers leaving in huge numbers), it doesn't seem like it's going to stop. The tragedy is, if they ever do reach that point, where developers are leaving in huge numbers, it'll be too late. Platforms are a momentum game: you're either going up, or you're going down. And once you're going down, you're as good as dead.

Agree. That's probably also one reason why more and more people want to use cross-platform app frameworks instead of developing for iOS natively. That way, you can do most of the dev work on Windows and Android, and you'll only need a Mac & Xcode for compiling the iOS binary.

And I'd wager that some iOS games are released without the developer ever touching Xcode: https://docs.unity3d.com/Manual/UnityCloudBuildiOS.html

Signing and submitting apps to Apple is fairly annoying to do without Xcode.

Unity has a service where they do it for you.

Where you give them your key?

Yes. The procedure is explained in the link that I posted.

I'm not sure I'd be entirely comfortable with that, to be honest.

100% agree! If more people understood this, I hope this narrative would gain some traction and eventually reach Apple management.

To me, the idea that an OS is mostly finished is completely bananas. There's so much room for improvement and hardly any of that potential was tapped into in what's starting to feel like a decade.

And if Apple had invested into a successor for Cocoa, there might be a larger gap between native apps and (Electron) web apps, leading to some lock-in. Instead most new stuff is not native and for good reasons (and I do dislike the way they don't adhere to Mac conventions, but still).

I think ultimately the problem is Tim Cook. He's too attached to Apple's stock price. I think that's the one metric that he believes rates his performance. But inertia is a bitch. Like in politics, the effects might hit hard only once he's out and it could be too late to fix by then.

If I think about how much this impacts the economy overall (i.e. make millions of knowledge workers a little bit less efficient) then I can only hope that I'll see more sophisticated organizational structures in my lifetime that prevent such erosion.

Tim Cook is Apple’s Ballmer, who is their Nadella?

I was thinking exactly this, 8 years ago. I moved from an imac + mbpro to linux only.

It took longer than expected. I even intended to buy put options, but someone I trust told me otherwise and to invest in equity instead, which I did, because I know that most buy decisions are not made rationally.

But it looks like the time has come now? On the other hand, I have been off by several years before. People are crazier than you think, especially when it comes to status and association with brands and self-confirmation of past decisions. They might well put up with Apple's moves for a few more years.

But at Apple scale: 9% of $58 billion = $5.2 billion Mac revenue last quarter.

Yes, that is what drives me crazy whenever people say Mac is only 9% of revenue and they don't care about it.

If the Mac revenue was separated out on its own, it would be about Fortune 120, that is higher than Kraft Heinz. With plenty more space for growth. Apple only has 100M Active Mac users. There are 1.4B Windows PC.

OTOH when Apple was a much smaller company the mac was much more important to them and it showed.

Maybe it's not related to revenue per se, but clearly since iOS became their main thing the Mac has suffered tremendously.

Apple's Macintosh division is the most profitable PC company in the world and has been for at least a decade. In fact, Macintosh is likely more profitable than all other PC companies combined.

Less than 10% is no excuse.

Like I said in another comment, it's not about the revenue per se, but it's undeniable that the more popular iOS is, the less Apple cares about the Mac.

Do you have a source for that claim?

Just an estimate. Revenues about $25B a year with net margins around 15-20% works out to $4B to $5B in profits a year. It’s possible margins are slightly lower than that but historically they were 20%+.

The rest of the market is roughly $100B, and has net profit margins of 2-3%.

Actually other PC company revenues are at least $200B so it’s probable roughly even.

> It's not surprising. Macs are less than 10% of Apple's revenue.

Without Macs for developers and other content creators that other 90% doesn’t exist.

Exactly. Especially given the Xcode lock-in nonsense.

It's surprising that they don't improve the developer experience for their own developers using their own tools, including hardware.

Apple uses the same tools you do. They just might not be using it like you are; you can find a lot of features that clearly have no reason to exist outside of Apple nonetheless shipping with their software.

> Apple uses the same tools you do.

No. A special directory can be created at the root of the file system called /AppleInternal. Then, if you work at Apple, you can put some special files there that do stuff. I've read somewhere that they are able to easily disable all of this privacy protection crap and other annoying stuff.

There's nothing really special about /AppleInternal, it's just a fairly normal directory that a couple of tools change in order to do things like offer more detailed diagnostics or the option to create a Radar. On a normal internal install there are some internal utilities, many of which are listed here: https://www.theiphonewiki.com/wiki/Category:Apple_Internal_A.... But their code is all Xcode projects and stuff, it's not like they're really using special tools for themselves except in certain cases. There are a couple of internal tools that possess entitlements to bypass security, but more often than not engineers just run with the security features disabled, which you can do yourself.
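As a point of reference, the security features mentioned above can be inspected from any normal install; a minimal sketch, assuming macOS 10.15's stock spctl and csrutil tools (on other systems it just reports that they're absent):

```shell
# Print the state of Gatekeeper and System Integrity Protection.
# macOS-only tools, so we guard for their existence first.
if command -v spctl >/dev/null 2>&1; then
  spctl --status     # Gatekeeper: "assessments enabled" or "assessments disabled"
  csrutil status     # SIP state (changing it requires booting into Recovery)
else
  echo "spctl/csrutil not present (not macOS)"
fi
```

Actually turning SIP off has to be done from Recovery mode (csrutil disable), which is presumably what "running with the security features disabled" amounts to.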

That's kind of my point - it's surprising to me that they're shipping slow hardware and software, when they're used to develop that same hardware and software. Developer time is expensive.

I would actually be quite happy if the engineers were forced to work on four-year-old MacBook Pros and develop against Display Zoomed iPhone 7 and the second generation Apple Watch, using the toolchain and software they push to their developers.

Is there a list somewhere of Apple's in house dev environments or workflows? I wonder what cool tricks they use internally that could be pretty useful generally.

Nothing special that can really be talked about without internal context. You can get a hint at how they use their own tools, though (which are available externally), if you pay careful attention to their public appearances and presentations.

Very messy internally, every team has their own.

I wouldn't be surprised if they've determined that developers will generally put up with a bad experience in order to have access to the massive iOS market.

There isn't much incentive to improve because they know that people will buy their hardware regardless.

Not to mention people defend and market their products for free.

Maybe internally they are using a different version of macOS?

It’s basically the same ones you’re running, possibly a couple builds ahead and with all the security features turned off.


I find it funny how people are downvoting your innocent comment pointing out a fact... out of anger and hate for the actual fact :D

What changes permeated into macOS? What did your team do to improve iOS perf?

So many of the frameworks have shared code between macOS and iOS (e.g. MapKit, Foundation, Contacts etc..), so a perf fix in iOS pays dividends on macOS too.

Perf changes are too numerous to mention, I’d recommend watching last year’s WWDC keynote describing the iOS 12 v/s 13 perf advancements.

They set "fast = true" as a global constant variable.

I would give anything to have my Mac be fast again. I have no idea what changed but even 10.14 feels a whole lot slower than it was earlier. Haven't upgraded to 10.15 seeing all the negative reviews it is getting when it comes to perf. Apple needs to seriously give perf a priority for Mac. Do they really expect developers to use a Mac to develop Apps when it is slow as molasses? I shudder to think what will happen to the Apple ecosystem if developers migrate to another OS for development. Apple will come crashing down. I don't wish for that to happen but looks like there is absolutely no one at Apple focused on making it better.

Remember, people don’t write blog posts saying nothing changes. The negative reviews tend to be one of two things: spotlight reindexing shortly afterwards, or attribution error where every new thing is blamed on the OS upgrade and similar old behavior is mentally discounted. App development didn’t suddenly get “slow as molasses” and for most users the install was a reboot and back to work.
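One way to settle this is to measure rather than attribute: time the first and second run of a brand-new script. A rough sketch (the /tmp path and filename are arbitrary); on Catalina the first run reportedly includes a synchronous security check, while the second should hit a cached verdict:

```shell
# Create a trivial one-line script that has never been executed before.
cat > /tmp/exec_probe.sh <<'EOF'
#!/bin/sh
true
EOF
chmod +x /tmp/exec_probe.sh

time /tmp/exec_probe.sh   # first run: may include the online check
time /tmp/exec_probe.sh   # second run: verdict should be cached
```

On anything other than Catalina both runs should take a few milliseconds; a large gap between the two would support the article's claim.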

This is completely insane. I am so glad I decided years ago to leave closed operating systems behind.

This design seems to cement the trend at Apple to position their products as consumer appliances, not platforms useful for development.

> I am so glad I decided years ago to leave closed operating systems behind.

The problem is, there's nothing else out there. Everything is going to shit in one way or another. Windows is now a disaster, Linux was always a disaster in terms of user experience and isn't improving.

Mac OS was the last bastion of somewhat good, thoughtful design, user experience and attention to detail and now they've gone to shit too.

>Linux was always a disaster in terms of user experience and isn't improving

I'm honestly pretty baffled as to what keeps this meme alive, as KDE and GNOME are both very popular and provide simple, intuitive interfaces for the typical user. Plasma is only complex if you're the type that really wants to customize, but there its complexity is (mostly) necessary for its wide range of possible configuration. People have this idea that desktop Linux users are all a bunch of dorks playing around with Arch and tiling window managers all day and then posting their anime wallpaper setups on /r/unixporn, but that hasn't actually been true for a long time.

Yeah Linux is awesome. I don't get the hate either. I have like 5 apps I use in Linux Mint, and they look exactly the same way they do in MacOS (Spotify, Discord, Firefox, Godot, Sublime, VSCodium, Terminal)...

The settings UIs in Mint are easily way better than in Windows and Mac.

If you add "unfixable" to "disaster" the problem becomes more clear.

Windows is an unfixable disaster; you can't fix it, sorry.

Mac OS is now an unfixable disaster; you can't fix it either, sorry.

Linux may be a UX disaster, but you can, uniquely, modify it. You can change your UI. You can attempt to fix the problem, and have a real shot at doing so.

Linux is the only one where you can do something about the problem - which is a strong reason to prefer it.

Not only can you modify Linux in theory, it is actually getting _easy_ to do so.

The biggest reason I enjoy elementary OS as a distro is that everything lives on GitHub, package releases happen through GitHub Actions, etc. Fixing a bug can be faster than merely filing a radar in the Apple ecosystem.

> Linux was always a disaster in terms of user experience and isn't improving.

Nonsense, 'Linux' can be what you make it. You can have it as sleek as something straight out of the fruit factory or as spartan as a VT100 and anything in between. If you're new to the game the pre-packaged 'consumer' distributions might be a good starting point but for those with a bit of nix savvy - of which I assume there to be many on this board - those bells and whistles probably just get in the way.

If my 8yo daughter and my 82yo mother can use Linux - the latter through a remote X2go session from her kitchen table in the Netherlands to my server under the stairs in Sweden - I'd say people around here can be assumed to be able to handle it. The nice thing about 'Linux' is that you can change out those parts which you find disagreeable for whatever reason for those you like better, this in contrast to that 'last bastion of somewhat good, thoughtful design, user experience and attention to detail' which by your own statement has been changed into excrement. Just take out the shitty bits and replace them with something better... oh, no, not possible...

That is why the parent poster is right in this sense, things in 'Linux' land might not be perfect - and can never be 'perfect' since one person's perfection is another's nightmare - but at least you get to do something about it.

Linux was always a disaster in terms of user experience and isn't improving.

Curious: what have you tried? People who use "Linux" as a catch-all in terms of UX usually have only tried a single distribution with a single desktop environment.

I feel like people still have in mind what Linux desktop was 15 / 20 years ago. It improved a lot in the past years, battery life improved on laptops, Ubuntu that was already very stable and feature complete also got a lot of things with previous releases and I've personally been running Arch on my main computers now for 5+ years and haven't got any major issues while upgrading.

Try using the latest version of software that has a more frequent release cycle than arch. If you have an incompatibility there goes your install.

Have yet to see a distro do multi-monitor HiDPI that results in readable fonts out of the box.

This gets updated yearly - https://itvision.altervista.org/why.linux.is.not.ready.for.t...

This list is quite comprehensive, but also quite boring. It's just a list of bugs and things that are suboptimal on Linux. You could write one about any operating system. Some of the items like 'such-and-such needs to be configured using a text file' are also not even real problems.

What do you mean by 'there goes your install'? There are multiple ways you could run bleeding-edge software before it's packaged for Arch. See for example every 'xxx-git' package in the AUR. Or Flatpak.

Arch does not have a release cycle, sorry.

People who have used ubuntu might want to just once try arch linux.

I had an Ubuntu machine that took a while to boot even with an SSD. Later I installed Arch Linux on the same machine and boom! It would be at the desktop in seconds. It was night and day.
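On systemd-based distros (both Ubuntu and Arch qualify) the boot difference can be quantified rather than eyeballed; a sketch, guarded so it degrades gracefully where systemd isn't running (e.g. inside containers):

```shell
# Show total boot time and the slowest units, if systemd is available.
if command -v systemd-analyze >/dev/null 2>&1; then
  systemd-analyze time 2>/dev/null || echo "systemd not running here (e.g. container)"
  systemd-analyze blame 2>/dev/null | head -n 5   # five slowest units
else
  echo "systemd-analyze not available"
fi
```

Comparing the "blame" output between two installs usually points at the actual culprit (often a single misbehaving unit, not the distro itself).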

Debian is just as quick, and does not have the problematic "rolling" updates of Arch. (It does have the "testing" and "unstable" channels which are roughly comparable, but the Debian folks won't tell you to use them in production.)

> problematic "rolling" updates

Rolling updates for me have not been problematic.

I've had a few updates that gave an error message, and they were easily fixed in one minute after searching the arch website.

I think one was a key expired - I had to manually update it and redo the update process.

The other I can recall was a package that had become obsolete/conflicting and a question had to be answered.

In general rolling updates are a tiny blip every few months.

In comparison, the several debian based distributions I've run have been a "lost weekend" type of upgrade for major updates.

Debian is not just as quick (significantly slower and higher resource usage), but Arch isn't all that fast nowadays, either.

Debian - or Devuan if you don't want systemd - can be made as spartan as you want. It boots in those mentioned few seconds on my 15yo T42p (Pentium M@1.8GHz, 2GB). Use Sid/Unstable if you want more up-to-date software with the accompanying larger flow of updates.

> Debian is not just as quick (significantly slower and higher resource usage)

In which respects? Are you talking about apt vs pacman or something? Default DEs?

Default install; a default Debian install has about 3x as much running.
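That "how much is running" claim is easy to check on any install; a minimal sketch, assuming a procps-style ps:

```shell
# Count processes currently running (tail strips the ps header line).
count=$( (ps -e 2>/dev/null || true) | tail -n +2 | wc -l )
echo "processes running: $count"
```

Running this on a fresh Debian default install versus a fresh Arch install would substantiate (or refute) the 3x figure.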

Moreover, I've been running Linux for decades now, both in my personal laptop and at work, and Ubuntu has been (mostly) frictionless for me. I'm not an average user, of course, but for most users a friendly distro would work just as well as Windows (browsing the internet, using whatsapp web, watching movies). In some cases I've had a better user experience with Ubuntu than with Windows or OS X, namely seamlessly installing a wireless HP laser printer.

I only tried Ubuntu, a few months ago. For the day or two spent with it:

- multi-language support requires a lot of work to get to the same point as macos.

In particular I use third-party shortcut mappers to get language switching on the left and right command keys (mimicking the JIS keyboards, but with an English international layout). That looks like something I'd have to give up on or code myself.

- printer support is not at the same level.

Using a Xerox printer, some options that appear by default on macOS were not there on Ubuntu. I'm sure there must be drivers somewhere, or I could hunt down more settings. But then my work office has two other printers. It would be a PITA to hunt down drivers every time I want to use another printer.

- Hi DPI support is still flagged as experimental, and there’s a bunch of hoops to jump through to get a good setting in multi-monitor mode. Sure it’s doable, but still arcane.

- sleep/wake was weird. It would work most of the time, but it randomly stayed awake after closing the lid, or didn't wake when opening it. Not critical, but still not good (I'd hate to have the battery depleted while traveling)

Overall if I had no choice that would be a fine environment. But as it is now, with all its quirks, I feel macos is still a smoother environment.
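On the layout-switching point above, stock xkb options cover many setups without custom code; a sketch assuming an X11 session and US + Japanese layouts (the exact Command-key-style per-key switching may still need a custom keymap, and option names can vary with xkeyboard-config versions):

```shell
# Configure two layouts with Alt+Shift toggling between them (X11 only).
if command -v setxkbmap >/dev/null 2>&1 && [ -n "$DISPLAY" ]; then
  setxkbmap -layout us,jp -option grp:alt_shift_toggle
  setxkbmap -query   # show the resulting layout/option state
else
  echo "no X session available; skipping"
fi
```

The full catalogue of grp:* switching options is listed in xkeyboard-config's documentation, which is worth scanning before writing a custom mapper.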

Fair enough. I'm not a Mac OS X user so I don't know how it would compare. I can only compare it with my past experience with Windows, and I think it's superior (for me) to Windows circa 7 -- I stopped using Windows entirely at that point, so I wouldn't know how later versions of Windows fare.

Portability is also a fair issue to raise, but it's simply not a problem for me. When I say Linux "on the desktop", I literally mean it: to me a laptop is simply a slightly more portable desktop computer. I sometimes take my work laptop to/from the office, and the battery lasts long enough for that. I'm not worried about longer trips, since I don't use laptops for that. Again, if you do care about this (which is completely fair), I'm aware many Linux distros still have issues with battery life. You certainly can't compete with a Macbook Pro, that's for sure!

I do note that my experience with printers is opposite to yours. Like I said, when trying to connect to an HP wireless printer, Ubuntu autodetected and self-downloaded the necessary drivers; however, it took a lot of patience to get it to work with a Macbook Pro. Today, now that I have it configured for my Ubuntu laptop and my wife's Macbook Pro, the Mac will sometimes fail to print (the print job simply gets stuck in limbo) while my laptop prints reliably. Who knows?

And like I said in another comment, I game (or used to, anyway) a lot with Ubuntu, and many games are even AAA (though they tend to arrive later than on Windows).

So I really have a hard time believing Linux is not "ready for the desktop". It is, and has been for many years now.

edit: one last thing. You mentioned HiDPI modes, multimonitor, multilanguage... none of those are for average users. My mom would be comfortable browsing the net, reading mail and watching movies on Ubuntu. She doesn't even know what HiDPI is, nor does she want external monitors. (Spoiler: she still uses Windows because she can't learn anything else at this point... I've thought of tricking her by theming Ubuntu to look like Windows, but that would just be mean).

Without HiDPI support lots of applications become useless when you use a HiDPI display. Even Steam does not respect HiDPI settings in Gnome 3 even when setting custom environment variables.

It's probably a case of "I don't miss what I don't use" then. I'm a power user, I cut my teeth with MS-DOS and I've been using Linux for work and gaming for more than a decade (and less intensive usage before that) and I really never noticed anything about HiDPI. That has to mean something :)

Thanks for the additional details.

For the printers, you are right in that it’s far from being a solved problem on macos. I had an EPSON all in one before, and it was also a pain to get everything working. If I remember correctly the generic driver could print, but we didn’t get “advanced” options without going through the EPSON pkg installer and all the garbage coming with it. I’d totally imagine the linux driver being done cleaner than that.

For the record I’ve worked with a decent number of devs using linux workstations, so I totally vouch for your use case. I’d just temper the niche nature of multi-language support; that’s an everyday need for basically all Asia. Granted my use of shortcuts is niche (I wouldn’t need them if I had enough keys), but looking at maintenance projects annual reports there seem to be a sizeable amount of quality of life fixes still on the way.

Right. I forgot about Asia. In that case it must be painful, agreed!

With Linux you have to pay for proper support. HP is by far the best company in terms of supporting Linux printers. It isn't the Linux ecosystem's fault that other printer companies do not care.

Interesting. I regularly use RHEL (server/CLI only) but have not tried desktop Linux in a while.

I get a fair bit of weekly exposure to Windows 10 and well, it's not like heaps of fun, UX wise.

I'm reluctant to drop Apple mainly because I'm so 'tied up' with the rest of the ecosystem, iphone, Apple Music, iCloud etc.. They are not irreplaceable (for sure) but it always feels like moving away will cost way too much effort and be a pain... Well played, Apple.

> I'm reluctant to drop Apple mainly because I'm so 'tied up' with the rest of the ecosystem, iphone, Apple Music, iCloud etc.. They are not irreplaceable (for sure) but it always feels like moving away will cost way too much effort and be a pain... Well played, Apple.

This is why I don't want anything by Apple.

This is a good point.

It's really hard for me to use non i3wm supporting OSes now, even though I have to use Windows from work, and have used Macs for the better part of the last 2 decades personally and in college.

I use Linux everyday, and it's a UX disaster. I have tried Gnome, Xfce, Cinnamon, KDE, I like none of them. The only DE that I somewhat liked (Unity) was discontinued.

Linux sucks, but I use it because it sucks less than Windows, for programming at least.

How interesting, I like Cinnamon and Gnome and KDE, but didn't like Unity. Instead, for me, the problem is poor printer support.

> Curious: what have you tried? People who use "Linux" as a catch-all in terms of UX usually have only tried a single distribution with a single desktop environment.

Yup. You've just described a disaster. How many permutations of <hundreds of distros> x <dozens of DMs> must a user try before finding a good UX?

Mac is a BSD. OpenBSD exists. FreeBSD exists. NetBSD exists.

Because there are at least four BSDs, Mac therefore isn't good.

Do you see how ridiculous applying that logic to any operating system is?

Linux isn't a disaster. It's a kernel. There are Linux distributions with great user interfaces and great UX, developed by people who are great at it. There are also distributions that aren't.

> There are Linux distributions with great user interfaces and great UX

Could you name some? No sarcasm, actually interested!

It sort of depends on what really fascinates you, right? I'll avoid naming some of the most popular ones, because it's likely that you've already tried them. If you haven't, I'd really recommend giving them a try. Many people seem to really love them.

In terms of defaults:

I've heard really good things about Solus, and its use of AppArmor seems really cool. Never touched its package manager, so I won't recommend it, but it might be worth checking out. Its desktop environment is really snappy and has an interesting design philosophy.

Elementary is really cool as a boutique distribution; I don't personally feel any urge to use it seriously (I dislike apt as a package manager), but I always keep its live environment on a flash drive, because it works without any setup on basically anything I throw it at, painlessly, and without error. It's got a cool indie app store full of curated Elementary-centric free software, and overall just feels great. Using it, you'll probably notice a few areas that it clones Mac on, and a few that feel delightfully different.

Clear Linux (Intel's desktop distribution) is pretty popular right now because of how simple it is & how Intel seems to be going to great lengths to optimize it and make it a serious contender, but I don't like its desktop environment (vanilla GNOME 3 as far as I'm aware) all that much.

ChromiumOS is probably the best-designed desktop operating system on the planet right now technically, and I say that as a person who really hates Google. UI-wise it's so-so, but UX-wise it's really something special.

But more interesting are desktop environments in general, since they can be used with any variant of Linux you feel the urge to use. There's an exception there, though, in that Elementary's DE and Deepin's DE tend to not work so well or nicely on platforms that aren't Elementary or Deepin.

There are modern environments:

Plasma has hands-down the best UX of any sort of desktop operating system assuming you've got an Android smartphone; you say you're coming from Apple's environment, so imagine the interop between your Mac and your iPhone, but going both ways instead of just Mac -> iPhone. Texting, handling calls, taking advantage of the computing resources of connected devices, using your phone as an extra trackpad, notifications, unlocking your PC, painless file-sharing, pretty much anything you'd like. There are a bunch of distributions that ship with Plasma by default.

Solus's Budgie is kind of neat in that it takes the main benefit of GNOME 3 (ecosystem) with far fewer downsides.

There are also retro environments, if those are your thing. There's a pretty much perfect NeXTSTEP clone (including the programming environment, not just the UI), amiwm is still pretty interesting, there are clones of basically every UNIX UI under the sun, so on.

I'm not the best person to answer your question, because for the most part I don't go out of my way to use new desktop environments and distributions, and nothing above is my first choice. (In terms of window management, I usually stick with 9wm & E just because I have ridiculous ADHD and 9wm forces me to focus while E allows me to tile painlessly if I ever need it. I use three distributions overall, none of which are very popular at the moment, pretty much solely because I'm really picky with package managers & design philosophies.) That's a "me" issue rather than a Linux issue, though.

This is excellent and indeed largely novel information, thank you.

It sounds like the finding right combination of DE and package management solution plays a big role here. I don't remember much of my experience with Gentoo's package manager in the early 2000's other than finding it generally did its job (if a bit slowly)... Experience with package managers on Mac (brew, macports) hasn't been great so I'm eager to play around with modern ones on Linux. Same goes for the DE actually: stock, out-of-the-box, macOS is essentially unusable for me until I get my customization (scroll, trackpad, KeyboardMaestro) done exactly right, I can't imagine this not being better on Linux, if anything for the ability to switch among the various DE's.

I'm starting to contemplate this (fully untested) strategy: trying out a few distros and installing the one I like best on VMWare Fusion and then try to use it as much as possible, falling back to macOS if I get stuck or I'm short on time but gradually replacing Mac-specific stuff as I find suitable replacements.. TextMate, the masterpiece of Allan Odgaard (author of the article being discussed here) probably going to be the toughest one. If I'm successful, I should eventually be able to let Linux 'out of the box' and run it on real hardware..

PS: amiwm! This is going to be a must. I do miss the Amiga, a fair bit..

My favorite package managers, personally:


apk (terrible interface; wonderful technically)

pacman (wonderful interface; so-so technically; dislike the distro that uses it because of technical choices)

InstallPackage (GoboLinux is kind of cheating, because InstallPackage isn't a "real" package manager, but that's kind of the point)

I love TextMate, too! Something you might find nice is how easy it is to run Mac in a VM on Linux; there are scripts that manage the entire thing for you, and it's pretty painless (and so fast; I was surprised). Useful if you have a few packages you can't find replacements for.

You mention Apple Music elsewhere, which you might be interested to know has an Android client and a web client, and you can probably get a native client on Linux, though I'm not immediately aware of one.

> I love TextMate, too! Something you might find nice is how easy it is to run Mac in a VM on Linux; there are scripts that manage the entire thing for you, and it's pretty painless (and so fast; I was surprised).

That would be excellent! I like the idea of swapping host and guest with this VM strategy, sort of evolutionary platform switching.

Take a look at this! It's pretty simple; it just fetches macOS and then gives you a shell script that launches qemu with a few flags:


Really, really fast, and fairly painless.

...and it works; High Sierra is back!

It's fetching the disk image right now. Gold... Thank you!

Thank you for writing this overview of interesting Linux distributions, their UX and package managers, such good info.

The last few years I've run Linux VMs on a Macbook, but I'm transitioning to a Linux desktop probably running a macOS VM, which you mentioned in another comment - didn't know there was a practical solution.

It sounds like distros like Elementary and PopOS might suit me as a gentle transition from Macs.

Stable distributions (Fedora, Manjaro, Ubuntu) with GNOME, KDE, or Xfce UIs all work.

macOS is actually kind of mediocre at being a BSD these days ;)

> Do you see how ridiculous applying that logic to any operating system is?

Somehow, when you ask a person about PC or a Mac, the answer is: Windows or MacOS, and then the discussion is about their quirks, or advantages, or deficiencies.

You ask about Linux, and this is what you get:

> Linux isn't a disaster. It's a kernel. There are Linux distributions with great user interfaces and great UX

So, once again: which one of the hundreds of permutations of <distro> x <DM> has a great UX?

Ask a person about UNIX, they'll list Mac, Solaris, whatever. All UNIX distributions! I listed a bunch elsewhere in this subthread. Feel free to check them out, but for some reason I'm beginning to suspect that you're probably not going to.

Ubuntu pretty much works out of the box for a lot of "regular" users (I'm excluding gaming, which also works but is not as easy).

I'm sure there are other user-friendly distros that similarly let average users browse the internet, write documents, listen to music and watch movies painlessly.

I'd say gaming on Ubuntu LTS (if not Linux in general) is quite easy provided you stay in the safe haven of games that natively support the OS, which to be fair is a pretty solid selection of games these days albeit one which is pretty much a strict subset of the games on Windows. As soon as you go outside that area and start messing with Wine or whatever all bets are off, though.

Agreed! I play a lot of games on Linux, bought via Steam or GOG, occasionally with help of WINE but mostly without. I excluded gaming because if one thing is likely to cause more problems than on Windows, it's games. But yes, I use Ubuntu even for gaming.

The fact I can install Steam and play an AAA like Mad Max or Shadow of Mordor mostly seamlessly makes me wonder why people still claim Linux on the desktop is a no-go.

>The fact I can install Steam and play an AAA like Mad Max or Shadow of Mordor mostly seamlessly makes me wonder why people still claim Linux on the desktop is a no-go.

Because they and a few others are exceptions? Can you play the latest CoD? GTA V? Assassin's Creed, maybe?

I think you're missing the point. I'm not arguing that Linux is the best platform if your use case is primarily gaming. Nothing beats Windows -- or a console! -- if gaming is the most important thing to you.

> GTA V?

I honestly don't know, but it wouldn't surprise me if I could using WINE. A huge library of Windows AAA games work on WINE.

> Assassin's Creed

I don't know, but Mad Max and Shadow of Mordor are pretty much the same kind of game as Assassin's Creed, following the same kind of gameplay and using the same kind and complexity of 3D graphics/engine.

In any case, these are not exceptions. I forgot to mention the XCOM remake, Alien: Isolation (this is interesting because it has tons of graphics effects, including chroma aberration -- it looks awesome on Linux), SOMA, Victor Vran, Warhammer 40K Dawn of War II, L4D2, and many others. There are tons of Linux games on GOG and Steam, many of them AAA games. If you count indie games or 2D platformers there are literally thousands of them, but I guess that's not what you're after.

My questions were mostly rhetorical.

My point is that you can't run most AAA games, actually, and many of those you can run will give you enough problems (like frame drops or some graphical features being unavailable).

And I really don't understand what's the point of being able to run some games. I want to play the games I'm interested in, not the ones that 'are playable'.

>I don't know, but Mad Max and Shadow of Mordor are pretty much the same kind of game as Assassin's Creed, following the same kind of gameplay and using the same kind and complexity of 3D graphics/engine.

Not sure what your point is here. You can't replace one with another just because they have similar mechanics.

Steam/GOG have many games that run on Linux and macOS (by the way), but most of them are indie platformers or things like that. People don't play random games just to kill some time (well, some do), they play TITLES.

> I forgot to mention

More exceptions. They will stop being exceptions when you can run 80% of titles without any issues, and not sooner than that.

Gaming is not important to me; I've been a PS4 guy ever since my macOS switch. Just pointing out that gaming still has little to do with Linux unless we're talking about rare AAA titles and the indie scene.

My point is that Linux is a valid gaming platform with many AAA titles and tons of indie games, not that it's the best or ideal gaming platform. Of course Windows is better for gaming.

> And I really don't understand what's the point of being able to run some games. I want to play the games I'm interested in, not the ones that 'are playable'.

With this definition neither Windows nor the PS4 are valid gaming platforms, since not every game can be played on them.

> They will stop being exceptions when you will be able to run 80% of titles without any issues and not sooner than that.

So now it's 80% when before it was "a few exceptions"? Sorry, I'm uninterested in discussing your arbitrary definitions with you. Nice try moving the goalpost.

PS: re: "without any issues", back when I used Windows for gaming, there was always some issue. The graphics card, drivers, config issues. I guess Windows is not a gaming platform either then?

> Yup. You've just described a disaster.

Hardly. The existence of a distro I don't like doesn't degrade my experience using a distro I do like. You may as well be upset at an ice cream shop for having dozens of flavors when you only like strawberry. Choose the one you like and ignore the ones you don't. It's not rocket science, even children can figure that out.

> The existence of a distro I don't like doesn't degrade my experience using a distro I do like.

The problem under discussion here is not that of using a distro you like, but finding a distro that you like.

If an ice cream shop only has one flavor, I might get lucky and discover it's the flavor I like. But more likely, I'll just be screwed and have to settle for something I don't like. Only an ice cream shop with variety can hope to give the most people an optimal experience.

Unless the ice cream shop provides you with hundreds of flavours, 90% of which are nearly indistinguishable from each other. And hardly anyone on this planet can answer a straight question of "Which flavour is good".

If they're 90% indistinguishable, how is that distinguishable from an ice cream shop that simply has fewer flavors?

Linux has been a delight to use for me. Things were rough 10-15 years ago, but it's pretty amazing now.

Any distro in particular you'd recommend?

Fedora 32 Workstation is pretty good if you want to see the best of what Linux can offer. It may not be the lightest and fastest distribution but it is easy to install and everything works. You'll get to experience Gnome which is the most original Linux desktop environment and the best one in terms of user experience in my opinion.

If you want something more traditional, with a start menu or dock or desktop icons, perhaps something like KDE Neon is a better place to start. It might feel more familiar, and will be lighter/faster too.

Put each of them on a USB stick, run them live on your machine for a few minutes each, and see which one makes more sense to you.

Ubuntu, Pop!_OS, Fedora...

Each of them does something better than the others, but all of them are a delight to use.

not him but same experience, from my previous comment:

I would recommend: Ubuntu, Linux Mint, Elementary OS, Pop!_OS

if you want: nice experience out of the box

I would recommend: Arch, Gentoo, Debian Net inst, Void

if you want a base system and install things you want on top of it

Thank you @all for the suggestions! I'm going to set aside some time to experiment with these and see how far I get.

Nice, I would like to hear your experience with it once you do that

Well, my head is spinning, but I've made a bit of progress. I thought I'd start by trying out a few of the ones you and others have characterized as user-friendly as well as one of the more bare-bones ones.

The (hopelessly unscientific) test plan was:

Challenge 1 - write live system ISO to USB drive and boot it on my 2015 MacBook Air (which, though old, still counts as exotic, I guess.)

Challenge 2 - make sure display, network, trackpad and keyboard (+ intl. layout) work correctly. Be able to SFTP to my Mac

Challenge 3 - with little to no docs reading (how is the package manager invoked from CLI?), use the terminal to set up the right environment for a couple of relatively portable hobby projects I've been recently working on (on Mac), compile and test them. This includes, among other things, installing clang or g++, SDL2, Wine (to run an ancient ARM assembler) and finding a usable GBA emulator.


   A: 8GB RAM. More ambitious stuff (KVM macOS, VisualStudio Code) will have to wait for an actual install.
   B: Deliberately avoiding exposure to the docs is silly but I thought 
      such an approach would give me an indication as to whether 
      there exists a distro that uhm, "thinks like me".

Candidates: Ubuntu, Mint, Fedora, KDE Neon (which, if I'm not wrong, is Ubuntu LTS preconfigured as the latest KDE) and Void.


Challenge 1: unremarkable. All worked right off the bat except for Void, which made it as far as showing the mouse pointer but then froze.

Challenge 2: well, boring ;) All distros were pretty much ready to use and required minimal tweaking. With the tweaking part ranging from effortless (Mint) to minor headscratching (Neon). Not sure whether /etc/X11/XF86Config still exists but I did not miss editing it today.

Challenge 3: more interesting:

Neon: all worked as expected, except some trial and error was required to get Wine working: wine32 was needed but apparently wasn't installed by default. (Not a whole lot easier on Mac anyway, with separate downloads & installs for Wine and XQuartz.)

Ubuntu: I failed, as apt refused to acknowledge the existence of the packages I needed. This is weird, as I believe Neon relies on the same package database. Though it's undoubtedly my fault for not reading the manual, it is perhaps a bit interesting that I could not readily find my way around the problem.

Fedora: everything worked except for Wine, as the live system ran out of memory (disk space) while installing it. Not a big deal; everything else worked very well. Aside: I'm an avid runner, and "DNF" is not the most likeable of names for a program I have to use very frequently! j/k..

Mint: everything worked at take one.

I know this isn't even scratching the surface of the surface, but I think for now I'm going to go ahead and play more with Mint and Fedora after installing them on the MB Air hardware or in VMware on the MB Pro.... with a mind to getting back to KDE/Neon eventually.

interesting! Thanks for posting your feedback. I think Mint is really great; I'm an ArchLinux user, but I like having Mint installed on some laptops. The installation is very straightforward and it feels way less bloated than Ubuntu, for example. And pretty much everything worked out of the box with the laptops I've installed it on (mostly Dell laptops).

I haven't used Ubuntu much lately, but I remember always having to add a community repository to get some package I needed. (Also one of the reasons I love Arch: a lot of packages there are updated more quickly than on most distros, plus the AUR for everything not present in the official repos.)

Aye, very happy to have found what look like really viable alternatives; this is promising. And if I manage to make the transition, I will eventually want to try out more sophisticated distros like Arch, I am quite sure of that.

I shall post my findings.

Gentoo needs vastly better documentation to be useful.

I would say that the Arch Wiki covers a lot of things for a lot of distros, but yeah, I would only recommend Gentoo to 'advanced' users, or if you really want to get into it the hard way.

IMO Fedora or Ubuntu. I've used Fedora for the last few years on ThinkPads (currently a Carbon X1 6th gen) and it has pretty much "just worked".

The trick is to go all-in on KDE if you want that Windows feeling where things just work.

And in that case the distro choice should be KDE Neon.

...added to the list.

Fedora or Ubuntu

I hate bloated OSes, and unfortunately macOS is one of them. I know everyone wants everything to work out of the box, and I know it's very natural to want that, but I cringe if I find out my OS is doing something behind my back. That's why I'd never use Windows, macOS, Ubuntu, etc. They all violate my privacy and slow my system down to do so.

I use Debian, I like Debian. When I run Wireshark I don't see unknown requests destined for debian.org. That is the definition of simplicity for me. And yes, it doesn't always work out of the box; you have to install some drivers and change configurations, but it's getting better and easier. Then again, I'm a software developer, so I understand and like that stuff.

> Linux was always a disaster in terms of user experience and isn't improving.

No, you can't define it as a disaster; it's not. If you're an end user who understands nothing about computers, maybe you can, but otherwise it's not a disaster. It's just harder, and getting easier by the day.

I think the fact is there simply isn't a solution that works for both the "layperson" and highly technical people who want to do development. Laypeople cannot be trusted to admin their machines, but experts need access to those bits. Leaving a backdoor to real admin access for the experts just means laypeople will abuse those backdoors and mess up their machines again, with dire consequences for the entire planet. You see the same problem with power user UI features vs dumbing down for phones and average users. People keep trying to bridge this divide and I'm just not sure it can be done.

> Laypeople cannot be trusted to admin their machines

Yeah, but they're the ones who paid for their machines. So... you're saying they're not allowed to use them how they wish?

> Leaving a backdoor to real admin access for the experts just means laypeople will abuse those backdoors and mess up their machines again

Remembering the last 20 years of computer history, most of the critical failures weren't caused by "laypeople abusing backdoors" but by horrible security holes in popular, widely used software packages: Outlook, Flash, Acrobat Reader, Internet Explorer. Apple/Microsoft are not locking down their OSes to protect users from themselves, but rather from other developers. We, software engineers, seem to have completely failed our users as a profession.

Someone being tricked into installing malware doesn't usually make the news.

> Linux was always a disaster in terms of user experience and isn't improving.

This is as true today as saying Java is slow. Why not just try it? You might be pleasantly surprised.

I've tried it recently and still find it true. Death by a million paper cuts.

What did you try recently? Java or Linux?

Chrome OS?

I happen to enjoy using Linux on my laptop. In fact, I think it’s pretty great. But that’s because I can customize it to work the way I want—something that I found hard or impossible to do back when I was using macOS.

>> Linux was always a disaster in terms of user experience

Try Pop!_OS. I switched from macOS and it's been a relatively painless experience, with some tweaks.

The funny thing is, Linux has amazing User Experience if you go all-in on the latest KDE and its associated tooling.

I set my Mac-loving girlfriend up with Kubuntu for this reason.

I’m pretty sure that you have never used Linux... Just try it.

Buy a Mac and put ElementaryOS on it to avoid the slowdown and have a slick experience.


Might want to make it a used/refurbished Mac. Newer Macs don't run Linux well (at least as of yet); the whole T2-chip based stuff on newer machines is especially problematic.

From the comments on the post, roughly: are you running third-party "security" tools?

> Is there any "security" software running on your Mac? I've seen this sort of thing caused by that, but not in general.

> I ran the two line test and it had no delay at all. The Mac doesn't check for notarization on shell scripts or any non-bundle executable. I just did it again with a new test2.sh and Wireshark capture and there is nothing.

> I do a lot of Keychain code and I've also never seen those delays. The reason I suspect they told you not to use that API is that it's in the "legacy" macOS keychain. They really want everyone to move to the modern keychain but lots of people, myself included, still need the older macOS specific features.

> I'm not saying you are crazy, but all of these things though are the trademark reek of kernel level security software that is intercepting and scanning every exec and file read on the system. We had an issue with Cisco AMP once that took Xcode builds from under 10 seconds to over 5 minutes until we were able to get it fixed.

The only kernel-level security software on my systems is Little Snitch, and I’m pretty sure it doesn’t do anything unless there’s network activity, so it doesn’t explain anything.

Reminds me of the terrible delay I faced after having Sophos installed on my Mac.

Having to wait 5-10 seconds for a new terminal tab as Sophos churns (checking autocomplete scripts, rbenv, etc.) was infuriating. Oddly, there was fate-sharing with the Internet interception, so there was a good chance the browser was getting dragged down too, and vice versa.

Convincing corporate IT of how bad the problem was proved maddening. Based on what this author says, 10.15 on rural internet sounds like hell.

The funny thing is it's not transitive. There's no slowdown if you invoke the script via bash explicitly:

    % rm /tmp/test.sh ; echo $'#!/bin/sh\necho Hello' > /tmp/test.sh && chmod a+x /tmp/test.sh
    % time bash /tmp/test.sh && time bash /tmp/test.sh
    bash /tmp/test.sh  0.00s user 0.00s system 83% cpu 0.004 total
    bash /tmp/test.sh  0.00s user 0.00s system 77% cpu 0.003 total

vs the one from the article:

    % rm /tmp/test.sh ; echo $'#!/bin/sh\necho Hello' > /tmp/test.sh && chmod a+x /tmp/test.sh
    % time /tmp/test.sh && time /tmp/test.sh
    /tmp/test.sh  0.00s user 0.00s system 2% cpu 0.134 total
    /tmp/test.sh  0.00s user 0.00s system 73% cpu 0.004 total

(edited for formatting)

When you run "bash hello" you are calling exec() on bash, passing "hello" as an argument, which bash then reads; when you run "./hello" you are calling exec() on hello: the kernel then treats "hello" as an executable, but notes that "hello" starts with "#!" and then will run the specified interpreter for you, passing "./hello" as an argument. The kernel doesn't think of "hello" as a program when you run "bash hello".
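A quick way to see the difference for yourself (a minimal sketch; the /tmp/noexec.sh path is just a throwaway name I picked): an interpreter will happily read a script passed to it as an argument even when the file has no execute bit, while invoking the file directly goes through the kernel's exec() and gets refused.

```shell
# Hypothetical demo file; deliberately NOT marked executable (no chmod +x).
rm -f /tmp/noexec.sh
printf '#!/bin/sh\necho hello\n' > /tmp/noexec.sh

# The interpreter just opens and reads the file -- no exec() on the script:
sh /tmp/noexec.sh                                   # prints "hello"

# Direct invocation asks the kernel to exec() the file itself,
# which fails because the execute bit is missing:
/tmp/noexec.sh 2>/dev/null || echo "exec refused"   # prints "exec refused"
```

So `bash /tmp/test.sh` never presents the script to the kernel as a program, which is presumably why it dodges the assessment.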

Are you sure it's just not cached from the prior result? If I run the article's commands twice in a row, the 2nd time is faster.

I am using Ubuntu 20.04 on a ThinkPad X1 Extreme Gen 2, and you would be surprised how "normal" it feels as a development machine. Sure, there are some little annoyances: the touchpad behaves a little worse than on Windows, and sound is a little worse. But the most important things, keyboard and screen, are excellent. The system in general does not feel like the horror stories people keep telling about Linux on the desktop (or notebook).

Now that WSL2 is getting CUDA, even Windows looks workable, and their new terminal app is amazing. After a decade of Mac notebooks it was quite liberating, and I would not switch back even if the flaws in macOS were fixed. macOS is for sure the nicest of the big three operating systems, but for development work Ubuntu is hard to beat for me. YMMV, but it won't hurt to look around at what else is out there.

I've been seeing the trajectory of Windows (pre-2012 or so) -> Mac (2012 - ~2019 or so) -> Linux (~2018 - now) play out with quite a few people without any issues.

And I don't mean developers. They're all pretty educated people but it's taken me by surprise. They come to me in frustration over Mac, they don't want to return to Windows and they really, really, really want linux. I've been using linux since about 1997 so they come to me. I usually push back, thinking "do you really want a unix workstation?!" but they insist.

My strategy has been some x2xx Lenovo (like an X230 or so) for about $300 from eBay, 8/16 GB of RAM or so with an SSD, the extended battery pack, putting Mint on it, and then just handing it over. Everyone, much to my continued surprise, has loved it and is really happy with it.

It's happened 4 times now and I'm still shocked every time. They've told me they use youtube to figure things out.

They're fine with libreoffice, gimp does what they need, supposedly spotify works on it fine, they don't know what bash or the kernel is and it's all fine. Incredible.

Adding to the anecdata: same trajectory for me, for web development. Really happy with Manjaro on a Razer Blade 15 for a year now.

I recently _really_ tried adopting Linux on a hobby development machine that I built back in 2016 (hardly new hardware -- and desktop, not laptop). Sleep never worked, graphics sometimes borked, the UI felt janky and inconsistent, icons are super fugly and often too theme-y to the point of being undifferentiated at a glance, HiDPI support is a giant mixed bag (in 2020), the machine would randomly freeze (mostly elementaryOS; Ubuntu didn't freeze as much), Hauppauge drivers rarely worked consistently and often required reboots, I hated the mouse acceleration curves and was horrified to learn they were effectively hardcoded in X (I'm not talking about speed, which is tweakable), gstreamer was a nightmare to develop for, the Ubuntu & elementaryOS stores are a joke, and the mix of apt/snap/nix was very frustrating and the opposite of user-friendly.

I switched back to my 2012 MBP and it's predictably gone well since, plus I get iMessage integration with my iPhone.


Yeah - the hardware really has to be curated. I haven't tried using a machine cobbled together from various parts (custom desktop), but off-the-shelf quality laptops have worked fine for me for the last 2 years or so, with none of the issues you mentioned. Emphasis on quality - not cheapo models. I think if you treat Linux the same as OSX and run it on known-good hardware that is well supported by Linux, you are fine today, IME.

> HiDPI support is a giant mixed bag

I will say that this is still a thing, although with experimental Gnome fractional scaling support it works pretty well now.

Honestly, I have a 2019 MacBook Pro 15" and have more problems with it than I do with my ThinkPad X1 Carbon 6th gen running Fedora 32.

See, that's the response I was used to and the one I expected to get from everyone.

The crazy thing is that I haven't heard it yet from the people I helped. Times may actually be changing now, just not swiftly. Perhaps it's the "decade" of desktop linux.

It's also not because linux is so great but because windows and apple are constantly stumbling over their own shoelaces and shooing customers away.

True. Amusingly, I was always trying to make Windows behave more like Unix, but now I'm trying to make Linux behave more like Mac (just a few things, like the global keyboard bindings).

The major pain points are nearly all related to lack of integration with my iPhone (with Messages being the big one, followed by Notes).

Not associated with it at all, but I love it, so since you mentioned Gimp I wanted to share PhotoPea.


try this:

    $ google-chrome --app=https://www.photopea.com

Seconded. I used to work on a Mac laptop for years, then started using a beefy Linux desktop tower on the side for some work that benefited from higher hardware resources. A few months later I realized that I had slowly grown into doing all my work on Linux, even when I didn't need the hardware, mostly because i3 and apt were so much better than the Mac equivalents, and that I was only opening my Mac laptop to walk into meetings. After realizing that I ditched the Mac laptop for a Linux laptop and haven't looked back.

I still use a Mac at home for entertainment (I'm typing this comment on one), and I have to say it works much better used that way. I don't have to worry anymore about random Mac OS upgrades breaking functionality that Apple doesn't care about because it's not part of their vanilla out-of-the-Apple-Store experience, but is vital to me as a developer such as 3rd party window management, dock improvements, keyboard tweaks, or not delaying every new execution by phoning home (LMAO).

Yup. Ubuntu 20 is the first desktop linux OS that just worked. Every other Linux desktop before it has had suspend/resume issues, wifi issues, sound issues, 3d issues, ratchet settings (things that can be set but never unset without some arcane magic), weird desktop behaviors, buggy software that crashes all the time, etc etc. Yes, I've tried ALL of them, including pop os and deepin.

This year marks the first year that I can just use linux without having to debug it.

These things are highly hardware-dependent. Typically it takes a few years until support for new hardware devices, features or platforms stabilizes. But it can even take way more than that, and some less common and lower-quality hardware may fail to get support altogether.

But macOS is very hardware dependent too.

Been putting off upgrading from 16.04; I finally got it working a while back and was afraid to touch it.

Might give 20 a shot.

Longtime Linux user (Manjaro) and I never thought I'd see the day when I could pitch it as noticeably superior to MacOS, considering Apple's once-legendary attention to user interfaces. It seems like those days are behind us, now.

Linux as an actually better experience, without gigantic embarrassing flubs like this, is looking better by the day.

A slowdown when you run an app for the first time, for security reasons -- I wouldn't categorize that as a "gigantic embarrassing flub". I haven't noticed it, actually. But I don't run new apps every day.

I think you're misunderstanding the problem, respectfully. This is not a problem for end users. This is a problem for developers - and a gigantic, embarrassing flub is justified for something as bad as this.

Think that's hyperbole? Look at this, from the link:

> The first time a user runs a new executable, Apple delays execution while waiting for a reply from their server. This check for me takes close to a second.

> This is not just for files downloaded from the internet... this is everything. So even if you write a one line shell script and run it in a terminal, you will get a delay!

Consider a developer in this situation.

If your job involves lots of scripting - not unusual for a dev - and you create dozens of scripts a day, or more, then every single one will take about a second, and up to 7 seconds (!), to run that first time. And that could easily happen upwards of a dozen times a day, because it happens for each script you create.

That's pretty terrible, for a developer. I don't think you can normalize startup times, for some hacky script, of 1 second as pretty okay or not noticeable. Certainly not if you're talking about a high end work machine.

Times that bad are associated with some junk laptop that's 15 years old - that's not supposed to be Apple.

Even if you build apps (I do), you might have the need to create scripts now and then, possibly even a lot of them (I do, for testing). I don't consider it acceptable to wait 1 sec+ each time I run one. It really does suggest that Apple has gotten extremely careless about their developer audience.

So, yeah - compared to that, Linux performs way better, and looks like a premium work machine by comparison.

I never intended to switch away from Mac OS; it just sort of... happened. As Mac OS has grown more paternalistic over the years without adding any notable capabilities that I care about, it's felt steadily easier to just go use Linux instead. It has its own frustrations, but it can always be made to do what I want, and then it just behaves. Starting around Ubuntu 16.04, I found that the balance of frustration was tipping; these days I don't really bother to use my personal Mac any more. I still have one for work, but I'd certainly rather use Linux there too if I had the option.

For touchpad issues in Ubuntu uninstall xserver-xorg-input-synaptics and keep only xserver-xorg-input-libinput installed.
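Spelled out as commands (a sketch for Ubuntu/Debian-based systems, using the package names from the tip above):

```shell
# Remove the legacy synaptics driver and keep only libinput:
sudo apt remove xserver-xorg-input-synaptics
sudo apt install xserver-xorg-input-libinput
# Log out and back in (or restart the X server) for the change to take effect.
```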

Isn't Ubuntu much worse than this with the push for Snap packages? It can take 10-30 seconds to open software installed through it.

From what I hear, the Snap package complaints are a lot of FUD; Ubuntu is still using normal packages for everything except the Application Store application. You can always use Debian or Kubuntu if you prefer function over form.

I have a ThinkPad with Ubuntu 19. I'm very happy with it; it's nice to have apt, and to be able to eg use minikube with docker driver rather than a VM.

It's also true that the trackpad isn't as good as Windows. (It used to be that Mac had the best, but Catalina managed somehow to screw up the trackpad and make it laggy. Catalina has not been good for me!)

Windows is still very much subpar, even with support for CUDA in WSL2. Loading packages is terribly slow in Windows, for some reason. Also don't get me started on package management (no, Anaconda doesn't cut it).

I got pretty good results with chocolatey.

But I agree that even WSL2 didn't cut the mustard, and I doubt GPU support will fix it. MS is advancing too slowly, I think.

I've gone full circle. Went from desktop linux (mostly Arch) to OSX ~7 or so years ago, and now due to a combination of frustration with the butterfly keyboards and then a slew of issues with macOS itself, I'm back to linux desktop for my dev machine.

From my perspective as a quote-unquote power user, it feels like Apple just constantly insists on shooting themselves in the foot with unnecessary and ill conceived innovations. Either way, I'm happy with my new setup and probably won't go back to macbooks anytime soon.

Many of us who have been using Linux just fine on desktops and laptops for decades find those horror stories to be overstated...

I would love to switch back to Linux but Apple's Retina displays are absolutely beautiful and there is no way I could enjoy going back to anything with noticeably lower pixel density on a laptop. I'd like to be told I'm wrong, but as far as I know it's not really possible to recreate a comparable high pixel density experience under Linux on a laptop.

Well, it is. However, it's much easier with resolutions suited to 2x scaling, so 4K on a 15" XPS works great. As for fractional scaling (needed for 4K on 13"/14"), it's still a work in progress; I think it will be ready when Chromium on Wayland finally lands (I expect at least 1 more year). If you don't use Electron/Chrome, you can use it right now.

Obviously you can use less elegant solutions like changing fonts but it won't work with multiple displays with different resolutions.

Two years ago, I helped a friend install Ubuntu Linux on a Retina Macbook Pro, and it worked like a charm. If you're looking for a new laptop entirely, there are loads of 4K+ Linux-compatible laptops out there (ThinkPads are probably your best bet).

Thanks. What do you think about this post? The author sounds knowledgable and I think it contradicts what you said to some degree (in that the experience and app support is not good even though Linux is installed on a machine with a high dpi display):


I don't know about Ubuntu, but my experience with Gnome on Arch Linux and Arch-derived distributions has been pretty good as far as high-DPI displays go. I've only had to make minor tweaks to a few configurations here and there depending on the application.

If you want to avoid tweaking, stick to native applications, and perhaps more importantly, go for a manufacturer with proper firmware support for high-DPI screens like System76 (Adder WS), Dell (XPS 13), or Lenovo (ThinkPad P1/P53/X1).

It seems the new Dell XPS finally have a touchpad which is close to the ones on the MacBooks. The touchpad and display are the two things which hold me back from switching away from Apple.

I would definitely consider moving to Linux for my next laptop - unfortunately I do a decent amount of iOS development, which I realize isn't impossible to do on Linux, but I can't imagine it'd be worth the hassle. :/

When I switched, I just made the MacBook not suspend on lid close, plugged it in, and left it running 24/7. Then I just screen-shared or ssh'd in whenever I needed to do something iOS-related.
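If anyone wants to replicate that setup: the stock macOS tools for this are pmset and caffeinate, along these lines (a sketch; the hostname, username, and Xcode scheme name are all placeholders, and pmset's disablesleep flag, while real, isn't in the man page):

```shell
# On the Mac: keep it awake even with the lid closed, while plugged in.
sudo pmset -a sleep 0 disablesleep 1

# Or, per-session, hold off sleep only while a long task runs:
caffeinate -s xcodebuild -scheme MyApp build   # "MyApp" is a made-up scheme

# From the Linux machine, when something iOS-related comes up:
ssh me@my-macbook.local
```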

The dual GPU is a pain in the butt since Nvidia still doesn't support Optimus on Linux (and probably never will).

Have you tried 19.10 or 20.04? Before that I had a lot of issues with my Dell XPS 9560 because of optimus, but it got a lot better in those versions. YMMV but it actually worked out of the box with nary a hint of manual configuration when I installed 20.04 recently.

Edit: should note, when I say work I mean you can switch between GPUs/launch an app on the dedicated GPU with ease.

I've tried 19.10 and Arch Linux and the only option still was to statically choose only one GPU and reboot. How does the offloading work now? I haven't heard anything about it

19.10 added the "NVIDIA On-Demand" profile in Nvidia Settings. It needs the driver version 435 or newer.

It works okay, but you have to launch processes with a specific set of env variables to use the Nvidia card.
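For the curious, those are NVIDIA's documented PRIME render offload variables (assuming driver 435+ and glxinfo from mesa-utils; only works on machines with the hybrid-graphics setup discussed here):

```shell
# Run a single program on the NVIDIA GPU while the desktop stays on the iGPU.
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"
```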

That is not true anymore. With 20.04 it supports hybrid graphics just fine. The only issue I had was sharing CUDA and OpenGL contexts, since GL ran on the Intel card. This should not be a concern for most people, I assume.

Can you run everything on the iGPU and only activate the Nvidia GPU to do the render offloading on single apps? If you can, I should try 20.04 on a laptop

Yes exactly. This way you have all the GPU memory available for accelerated apps. Not sure if it works for all use cases but worked for me.
